Open ekmekovski opened 3 days ago
ok show me the conf
```python
graph_config = {
    "llm": {
        "api_key": "MyKey",
        "model": "openai/mymodel",
        "base_url": inference_endpoint,
    },
    "verbose": True,
    "headless": False,
}
```
I keep `verbose: True` and `headless: False` for debugging purposes. As a temporary workaround, I catch the error message and parse it myself.
A sample log:

```
| scrape_graph_inference | CONTENT : Invalid json output: Here is the output in JSON format: {"desired_json_key": "........."} | TYPE: <class 'langchain_core.exceptions.OutputParserException'>
```
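The manual workaround described above can be sketched roughly like this: pull the first `{...}` object out of the exception text and load it yourself. The helper name and the regex approach are illustrative assumptions, not part of the library.

```python
import json
import re


def extract_json_from_error(message: str) -> dict:
    """Hypothetical helper: recover the JSON object embedded in an
    OutputParserException message like the log line above."""
    # Greedily grab everything between the first '{' and the last '}'
    match = re.search(r"\{.*\}", message, re.DOTALL)
    if match is None:
        raise ValueError("no JSON object found in error message")
    return json.loads(match.group(0))


# Example shaped like the sample log
msg = 'Invalid json output: Here is the output in JSON format: {"desired_json_key": "value"}'
print(extract_json_from_error(msg))  # → {'desired_json_key': 'value'}
```

This is brittle (it assumes a single well-formed JSON object in the message), which is why a way to disable the library's own parsing would be preferable.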
Describe the bug I believe the output parser is not working correctly with the newest version and the following versions (1.31.0b1, 1.28.0b4, 1.27.0); even a simple inference ends up with `Invalid json output`. I have my own JSON output parser. Is there any way to disable parsing in the library? Thanks in advance for the great repo.
To Reproduce Steps to reproduce the behavior: 1. Just run a SmartScraperGraph pipeline.
Expected behavior An `Invalid json output` error will be thrown.