ScrapeGraphAI / Scrapegraph-ai

Python scraper based on AI
https://scrapegraphai.com
MIT License

fix: try to infer possible provider from the model name, resolves #805 #806

Closed by Levyathanus 4 days ago

Levyathanus commented 4 days ago

If the model provider is not specified in the graph configuration, abstract_graph tries to infer it from models_tokens.py and warns the user with an info message. For example, the following configuration:

graph_config = {
    "llm": {
        "api_key": YOUR_GEMINI_API_KEY,
        "model": "gemini-pro",
    },
    "verbose": True,
    "headless": False,
}

is treated the same way as:

graph_config = {
    "llm": {
        "api_key": YOUR_GEMINI_API_KEY,
        "model": "google_genai/gemini-pro",
    },
    "verbose": True,
    "headless": False,
}
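
The same inference should apply to any other provider whose models are listed in models_tokens.py. As an illustrative sketch (not taken from the PR itself, and assuming OpenAI models such as gpt-4o appear under the openai key in models_tokens.py), an OpenAI configuration without an explicit provider prefix:

graph_config = {
    "llm": {
        "api_key": YOUR_OPENAI_API_KEY,
        # provider omitted: expected to be inferred as "openai/gpt-4o"
        "model": "gpt-4o",
    },
    "verbose": True,
    "headless": False,
}

would be treated like one with the explicit "openai/gpt-4o" model string.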
VinciGit00 commented 4 days ago

Ok, please show me what I should write for OpenAI.

github-actions[bot] commented 4 days ago

:tada: This PR is included in version 1.30.0-beta.5 :tada:

The release is available on:

Your semantic-release bot :package::rocket:

github-actions[bot] commented 3 days ago

:tada: This PR is included in version 1.31.0 :tada:

The release is available on:

Your semantic-release bot :package::rocket:

github-actions[bot] commented 3 days ago

:tada: This PR is included in version 1.31.0-beta.1 :tada:

The release is available on:

Your semantic-release bot :package::rocket: