Support streaming LLM response #848
Replies: 3 comments
-
Looping in @johnsonr.
-
I was coming here to ask the same thing. I guess Embabel doesn't support streaming token responses because its focus is not chat?
-
Please see the recent post in the "general" channel on the Embabel Discord: LLM streaming support is now available in the agent snapshot build 0.3.1-SNAPSHOT. For usage, refer to the user guide: embabel-agent/embabel-agent-docs/src/main/asciidoc/reference/streaming/page.adoc.
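For anyone landing here before the docs, here is a minimal sketch of what consuming the streamed response could look like. It assumes a `Flux<String>`-returning method along the lines of the `streamText` signature proposed in the original post below; the actual method names are defined in the user guide page above, so treat this as illustrative only.

```java
import reactor.core.publisher.Flux;

// Illustrative only: `streamText` follows the shape proposed in this
// thread, not a confirmed Embabel signature. See the linked user guide
// for the real API.
Flux<String> tokens = this.ai.withLlm(options).streamText(prompt);

// Standard Project Reactor subscription: handle each chunk as it
// arrives instead of waiting for the full completion.
tokens.subscribe(
        System.out::print,                                   // each streamed chunk
        err -> System.err.println("stream failed: " + err),  // upstream error
        () -> System.out.println("\n[done]"));               // stream completed
```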
-
Currently, the prompt runner call is blocking:
For example:

String response = this.ai.withLlm(options).generateText(prompt);

So for long-running prompts, the call fails with a timeout.
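For context, the only recourse with a blocking call is to offload it and enforce a deadline yourself, which still buffers the entire completion. A rough sketch in plain Java, with the `generateText` call as above and the 120-second deadline an arbitrary example:

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.TimeUnit;

// Workaround sketch, not Embabel API: offload the blocking call and
// enforce our own deadline. The whole completion is still buffered,
// so nothing reaches the caller until generation finishes.
CompletableFuture<String> future = CompletableFuture
        .supplyAsync(() -> this.ai.withLlm(options).generateText(prompt))
        .orTimeout(120, TimeUnit.SECONDS); // arbitrary example deadline

String response = future.join(); // still blocks; throws once the deadline passes
```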
It would be very helpful to stream the response the way Python frameworks (CrewAI, for example) do. If we could have something like:

Flux<String> response = this.ai.withLlm(options).streamText(prompt);
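To make the use case concrete, a stream like that could be piped straight to a client as server-sent events with standard Spring WebFlux. A sketch under the same assumption, where `Ai`, `LlmOptions`, and `streamText` are hypothetical stand-ins for whatever the framework actually exposes:

```java
import org.springframework.http.MediaType;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import reactor.core.publisher.Flux;

@RestController
public class ChatController {

    // Stand-ins from the snippet above: `ai` is whatever handle exposes
    // withLlm(...), and `options` is its LLM configuration. `streamText`
    // is the proposed, not confirmed, streaming method.
    private final Ai ai;
    private final LlmOptions options;

    public ChatController(Ai ai, LlmOptions options) {
        this.ai = ai;
        this.options = options;
    }

    // Each token chunk is flushed to the client as a server-sent event,
    // so long generations never sit behind one request-level timeout.
    @GetMapping(value = "/chat", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
    public Flux<String> chat(@RequestParam String prompt) {
        return ai.withLlm(options).streamText(prompt);
    }
}
```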