ScrapeGraphAI / Scrapegraph-ai

Python scraper based on AI
https://scrapegraphai.com
MIT License

Groq example results in context_length_exceeded error #411

Open yngwz opened 1 week ago

yngwz commented 1 week ago

Describe the bug
I'm getting an error when running the Groq example from the repo. I confirmed that my Groq key works when making a normal request.

Code

from scrapegraphai.graphs import SearchGraph
import os

groq_key = os.getenv("GROQ_API_KEY")

# Define the configuration for the graph
graph_config = {
    "llm": {
        "model": "groq/gemma-7b-it",
        "api_key": groq_key,
        "temperature": 0
    },
    "embeddings": {
        "model": "ollama/nomic-embed-text",
        "base_url": "http://localhost:11434",  # set ollama URL arbitrarily
    },
    "max_results": 5,
}

# Create the SearchGraph instance
search_graph = SearchGraph(
    prompt="List me all the traditional recipes from Chioggia",
    config=graph_config
)

# Run the graph
result = search_graph.run()
print(result)

Error

groq.BadRequestError: Error code: 400 - {'error': {'message': 'Please reduce the length of the messages or completion.', 'type': 'invalid_request_error', 'param': 'messages', 'code': 'context_length_exceeded'}}
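
A possible mitigation while the chunking issue is investigated (a sketch only, not a confirmed fix): gemma-7b-it has a comparatively small context window on Groq, so fetching fewer search results and declaring the model's token limit may keep each request under the cap. The "model_tokens" key and the 8192 value are assumptions here; whether the Groq configuration honors them should be verified.

import os
from scrapegraphai.graphs import SearchGraph

groq_key = os.getenv("GROQ_API_KEY")

# Sketch of a possible mitigation, not a confirmed fix:
# fetch fewer results and declare the model's (assumed) context size so the
# library can split pages into smaller chunks before sending them to Groq.
graph_config = {
    "llm": {
        "model": "groq/gemma-7b-it",
        "api_key": groq_key,
        "temperature": 0,
        "model_tokens": 8192,  # assumed context size for gemma-7b-it
    },
    "embeddings": {
        "model": "ollama/nomic-embed-text",
        "base_url": "http://localhost:11434",
    },
    "max_results": 2,  # fewer pages means fewer tokens per request
}

search_graph = SearchGraph(
    prompt="List me all the traditional recipes from Chioggia",
    config=graph_config,
)

result = search_graph.run()
print(result)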


f-aguzzi commented 1 week ago

There's a similar problem in issue #400. We need to debug the request chunking system, because these problems have become frequent lately. Hopefully I'll be able to work on it next week.
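
For context on what request chunking has to handle, here is an illustrative sketch of token-budget chunking. It is not ScrapeGraphAI's actual implementation; it uses a crude characters-per-token heuristic in place of a real tokenizer, and the function name and parameters are made up for the example.

# Illustrative sketch only; not ScrapeGraphAI's chunker.
def chunk_text(text: str, max_tokens: int = 8192, prompt_overhead: int = 1024) -> list[str]:
    """Split text into pieces that, together with the prompt, should fit the context window."""
    budget_chars = (max_tokens - prompt_overhead) * 4  # rough 4-chars-per-token estimate
    chunks = []
    while text:
        chunks.append(text[:budget_chars])
        text = text[budget_chars:]
    return chunks

# A 100k-character page becomes several sub-requests, each below the limit.
pieces = chunk_text("x" * 100_000)
print(len(pieces), [len(p) for p in pieces])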