Description
Extension sample
chat-sample
VS Code version
1.97.1
What went wrong?
```ts
const libResult = chatUtils.sendChatParticipantRequest(
    request,
    chatContext,
    {
        prompt: 'You are a cat! Answer as a cat.',
        responseStreamOptions: {
            stream,
            references: true,
            responseText: true
        },
        tools
    },
    token
);
return await libResult.result;
```
Hello, I'm trying out the chatUtilsSample.ts example and I want to access the final LLM response as a string.
I tried logging the `libResult.result` object, but it only shows some metadata with tool-calling information.
I can see the response in the chat window, but I'd like to access that text programmatically. Is there a way to do that?
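For reference, this is roughly what I'm hoping to do. It is only a sketch of my idea, not something from the sample: it assumes the response text can be captured by wrapping the `ChatResponseStream` that gets passed in `responseStreamOptions`, and it guesses that `vscode.lm.tools` is a reasonable stand-in for the `tools` variable used in the snippet above.

```ts
import * as vscode from 'vscode';
import * as chatUtils from '@vscode/chat-extension-utils';

async function handleRequest(
    request: vscode.ChatRequest,
    chatContext: vscode.ChatContext,
    stream: vscode.ChatResponseStream,
    token: vscode.CancellationToken
): Promise<vscode.ChatResult> {
    // Collect every markdown chunk that gets written to the response stream.
    const chunks: string[] = [];

    // Wrap the real stream so markdown() calls are recorded and then forwarded.
    // (My own workaround idea, not an API of @vscode/chat-extension-utils.)
    const recordingStream: vscode.ChatResponseStream = new Proxy(stream, {
        get(target, prop) {
            if (prop === 'markdown') {
                return (value: string | vscode.MarkdownString) => {
                    chunks.push(typeof value === 'string' ? value : value.value);
                    return target.markdown(value);
                };
            }
            const member = Reflect.get(target, prop);
            return typeof member === 'function' ? member.bind(target) : member;
        }
    });

    // Placeholder: all registered tools; the sample may filter these by tag.
    const tools = vscode.lm.tools;

    const libResult = chatUtils.sendChatParticipantRequest(
        request,
        chatContext,
        {
            prompt: 'You are a cat! Answer as a cat.',
            responseStreamOptions: {
                stream: recordingStream,
                references: true,
                responseText: true
            },
            tools
        },
        token
    );

    const result = await libResult.result;

    // The full response text, reassembled from the streamed chunks.
    const fullResponse = chunks.join('');
    console.log(fullResponse);

    return result;
}
```

Is something like this the intended way, or does the library expose the final response text directly?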
Thanks.
BR