Describe the bug
I'm getting an error when running the Groq example from the repo. I confirmed my Groq key works when making a normal request.
Code
from scrapegraphai.graphs import SearchGraph
import os

groq_key = os.getenv("GROQ_API_KEY")

# Define the configuration for the graph
graph_config = {
    "llm": {
        "model": "groq/gemma-7b-it",
        "api_key": groq_key,
        "temperature": 0
    },
    "embeddings": {
        "model": "ollama/nomic-embed-text",
        "base_url": "http://localhost:11434",  # set Ollama URL arbitrarily
    },
    "max_results": 5,
}

# Create the SearchGraph instance
search_graph = SearchGraph(
    prompt="List me all the traditional recipes from Chioggia",
    config=graph_config
)

# Run the graph
result = search_graph.run()
print(result)
Error
groq.BadRequestError: Error code: 400 - {'error': {'message': 'Please reduce the length of the messages or completion.', 'type': 'invalid_request_error', 'param': 'messages', 'code': 'context_length_exceeded'}}
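The error suggests the prompt plus the scraped page content exceeds gemma-7b-it's context window. As a possible mitigation (not verified to fix this), the llm config may accept a token cap — the `model_tokens` key below is an assumption about the config schema, and 8192 is gemma-7b-it's advertised context size:

```python
import os

groq_key = os.getenv("GROQ_API_KEY")

# Same config as above, with an assumed "model_tokens" cap so the
# graph chunks scraped content to fit the model's context window.
graph_config = {
    "llm": {
        "model": "groq/gemma-7b-it",
        "api_key": groq_key,
        "temperature": 0,
        "model_tokens": 8192,  # hypothetical option; value = gemma-7b-it context size
    },
    "embeddings": {
        "model": "ollama/nomic-embed-text",
        "base_url": "http://localhost:11434",
    },
    "max_results": 5,
}
```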
Desktop (please complete the following information):
There's a similar problem in issue #400. We need to debug the request chunking system, since these problems have been getting frequent lately. Hopefully I'll be able to work on it next week.
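For anyone digging into the chunking side: a minimal sketch of what token-budgeted chunking has to guarantee, using a rough words-to-tokens heuristic (all names here are hypothetical; the library's actual splitter and tokenizer will differ):

```python
def chunk_text(text: str, max_tokens: int, tokens_per_word: float = 1.3) -> list[str]:
    """Split text into chunks that each fit (approximately) within max_tokens.

    Uses a crude estimate of ~1.3 tokens per whitespace-separated word;
    a real implementation would count tokens with the model's tokenizer.
    """
    words = text.split()
    words_per_chunk = max(1, int(max_tokens / tokens_per_word))
    return [
        " ".join(words[i:i + words_per_chunk])
        for i in range(0, len(words), words_per_chunk)
    ]
```

The key invariant to check when debugging is that no chunk's token count exceeds the model's context budget once the prompt template is added on top.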