bug: Missing response parsing for N8N AI Agent responses #8433
Conversation
All contributors have signed the CLA ✍️ ✅
1 issue found across 1 file
Prompt for AI agents (1 issue)
Understand the root cause of the following issue and fix it.
<file name="core/llm/llms/Ollama.ts">
<violation number="1" location="core/llm/llms/Ollama.ts:162">
Storing the thinking state in the static _isThinking flag makes it global across all chat streams, so one request entering a <think> block forces other concurrent streams to emit thinking messages instead of assistant replies. Track the thinking state per stream instead of on the class.</violation>
</file>
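To make the reviewer's suggestion concrete, here is a minimal sketch of tracking the thinking state per stream instead of on the class. The names (parseThinkBlocks, ParsedChunk) are hypothetical and not the actual Ollama.ts API, and the sketch ignores tags split across chunk boundaries for brevity.

```typescript
// Hypothetical sketch of per-stream thinking state; not the actual Ollama.ts code.
interface ParsedChunk {
  role: "assistant" | "thinking";
  content: string;
}

function* parseThinkBlocks(chunks: Iterable<string>): Generator<ParsedChunk> {
  // The flag lives in this generator's scope, so each stream keeps its own
  // state and concurrent requests cannot flip each other into thinking mode.
  let isThinking = false;

  for (const chunk of chunks) {
    let rest = chunk;
    while (rest.length > 0) {
      const tag = isThinking ? "</think>" : "<think>";
      const idx = rest.indexOf(tag);
      if (idx === -1) {
        yield { role: isThinking ? "thinking" : "assistant", content: rest };
        rest = "";
      } else {
        if (idx > 0) {
          yield {
            role: isThinking ? "thinking" : "assistant",
            content: rest.slice(0, idx),
          };
        }
        isThinking = !isThinking;
        rest = rest.slice(idx + tag.length);
      }
    }
  }
}

// Example:
// [...parseThinkBlocks(["Hello <think>plan", " steps</think> done"])]
// => assistant "Hello ", thinking "plan", thinking " steps", assistant " done"
```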
I have read the CLA Document and I hereby sign the CLA
recheck
recheck
Description
[#8423]
I added a new response model, N8nChatResponse, so the N8N AI Agent's JSON response can be parsed.
I had to extend convertchatmessage so it handles the different data structure correctly.
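For readers without the diff, here is a rough sketch of what that parsing could look like. The N8nChatResponse shape (a JSON body with an output field) and the helper name are assumptions for illustration, not the actual code in this PR.

```typescript
// Hypothetical illustration only; the real model and the convertchatmessage
// extension live in core/llm/llms/Ollama.ts in this PR.
// Assumes the N8N AI Agent answers with a JSON body like {"output": "..."}.
interface N8nChatResponse {
  output: string;
}

interface ChatMessage {
  role: "assistant";
  content: string;
}

function tryParseN8nResponse(raw: string): ChatMessage | undefined {
  try {
    const parsed = JSON.parse(raw) as Partial<N8nChatResponse>;
    if (typeof parsed.output === "string") {
      return { role: "assistant", content: parsed.output };
    }
  } catch {
    // Not JSON at all: fall back to the regular Ollama chat parsing.
  }
  return undefined;
}
```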
AI Code Review
@continue-review
Checklist
Screen recording or screenshot
I only tested manually; I couldn't find where you store example streams for testing.
Summary by cubic
Adds N8N JSON response parsing to Ollama chat streaming so the N8N AI Agent returns proper messages. Also adds support for streaming <think> blocks as separate thinking messages.