Invalid json output: #809

Open
ekmekovski opened this issue Nov 19, 2024 · 2 comments

Comments

@ekmekovski

Describe the bug
I believe the output parser is not working correctly with the newest version and the following versions (1.31.0b1, 1.28.0b4, 1.27.0); even a simple inference ends up with an Invalid json output error. I have my own JSON output parser; is there any way to disable parsing in the library?
Thanks in advance for the great repo.

To Reproduce
Steps to reproduce the behavior:
1. I just use SmartScraperGraph.

Expected behavior
An Invalid json output error is thrown instead of the scraped result.

@VinciGit00
Collaborator

Ok, show me the config.

@ekmekovski
Author

ekmekovski commented Nov 19, 2024

graph_config = {
    "llm": {
        "api_key": "MyKey",
        "model": "openai/mymodel",
        "base_url": inference_endpoint
    },
    "verbose": True,
    "headless": False
}

I keep verbose True and headless False for debugging purposes. As a temporary solution, I take the error message and parse it myself.
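
For reference, this is roughly how I invoke it (a minimal sketch; the prompt and source URL here are placeholders, not my real values):

    from scrapegraphai.graphs import SmartScraperGraph

    smart_scraper = SmartScraperGraph(
        prompt="Extract the desired fields as JSON",  # placeholder prompt
        source="https://example.com",                 # placeholder source URL
        config=graph_config,                          # the config shown above
    )

    result = smart_scraper.run()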

A sample log:
"| scrape_graph_inference | CONTENT : Invalid json output: Here is the output in JSON format:
{"desired_json_key": "........."} | TYPE: <class 'langchain_core.exceptions.OutputParserException'>"

  • I checked that the JSON response is actually coming back. I am not using guided decoding, etc.
  • I am not using OpenAI; I am using a vLLM OpenAI-compatible server.
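
The temporary workaround looks roughly like this (just a sketch; it assumes the OutputParserException reaches the caller and that the raw model output is embedded in its message after the "Invalid json output:" prefix, as in the log above):

    import json
    import re

    from langchain_core.exceptions import OutputParserException

    # smart_scraper is the SmartScraperGraph instance from the snippet above
    try:
        result = smart_scraper.run()
    except OutputParserException as exc:
        # Pull the first {...} block out of the exception message and parse it manually.
        match = re.search(r"\{.*\}", str(exc), re.DOTALL)
        if match is None:
            raise
        result = json.loads(match.group(0))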
