How to properly parse a chat response when using Ollama provider? #150
Unanswered
robiningelbrecht
asked this question in Q&A
Replies: 1 comment 7 replies
No one? 😭
Ok, I feel really stupid. I'm using the Ollama provider with the model llama3.2. When I call `$response->getContent()`, it contains the following info:

How am I supposed to parse this properly? Sorry if this is very obvious...
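The payload from the discussion isn't reproduced above, so as a minimal sketch only: if `$response->getContent()` returns a JSON-encoded string (an assumption, not confirmed by the thread), one common approach is to decode it with `json_decode` and index into the result. The `$content` value and the `message`/`content` key names below are hypothetical stand-ins:

```php
<?php
// Hypothetical stand-in for the string returned by $response->getContent();
// the actual payload from the discussion is not shown in the thread.
$content = '{"message": {"role": "assistant", "content": "Hello!"}}';

// If the provider returns raw JSON, decode it into an associative array.
// JSON_THROW_ON_ERROR (PHP 7.3+) raises a JsonException on malformed input.
$data = json_decode($content, true, 512, JSON_THROW_ON_ERROR);

// Drill into the decoded structure; these key names are assumptions.
echo $data['message']['content'], PHP_EOL;
```

If the content is already plain text rather than JSON, no decoding step would be needed at all; which case applies depends on how the Ollama provider serializes its responses.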