Closed waqaskhan137 closed 4 months ago
Hi, we fixed it on the branch, just wait till the deployment is done. The configuration is the following:

```python
"""
Basic example of a scraping pipeline using SmartScraper
"""
from scrapegraphai.graphs import SmartScraperGraph
from scrapegraphai.utils import prettify_exec_info

graph_config = {
    "llm": {
        "model": "ollama/gemma",
        "temperature": 0,
        "format": "json",  # Ollama needs the format to be specified explicitly
        "base_url": "http://localhost:11434",  # set the Ollama URL as needed
    },
    "embeddings": {
        "model": "ollama/nomic-embed-text",
        "temperature": 0,
        "base_url": "http://localhost:11434",  # set the Ollama URL as needed
    },
}

smart_scraper_graph = SmartScraperGraph(
    prompt="List me all the news with their description.",
    source="https://perinim.github.io/projects",
    config=graph_config,
)

result = smart_scraper_graph.run()
print(result)

graph_exec_info = smart_scraper_graph.get_execution_info()
print(prettify_exec_info(graph_exec_info))
```

When we make the next deploy you can use it, just wait.
@VinciGit00 Thanks for the prompt response. When will the next deployment or version release be?
If you are referring to scrapegraphai==0.4.1, this version still has the issue.
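When retesting against a new release, it helps to confirm which version is actually installed in the environment. A minimal sketch using only the standard library (`installed_version` is a hypothetical helper, not part of scrapegraphai):

```python
from importlib.metadata import version, PackageNotFoundError

def installed_version(package: str) -> str:
    """Return the installed version of a package, or 'not installed'."""
    try:
        return version(package)
    except PackageNotFoundError:
        return "not installed"

# e.g. installed_version("scrapegraphai") -> "0.4.1" if that release is installed
print(installed_version("scrapegraphai"))
```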
Describe the bug
I just followed the instructions, ran the example, and got the following error.
I have verified that Ollama is running and I can use the models through ollama-webui.
I am using the following models; both are downloaded and working in the webui:
ollama/gemma:2b
ollama/nomic-embed-text
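A quick way to double-check that the Ollama server is reachable and the models are pulled is to query its `/api/tags` endpoint, which returns the locally available models as JSON. A sketch assuming the default port (the `models_from_tags` helper is hypothetical, added here for illustration):

```python
import json
from urllib.request import urlopen

def models_from_tags(payload: str) -> list[str]:
    """Extract model names from an Ollama /api/tags JSON response."""
    data = json.loads(payload)
    return [m.get("name", "") for m in data.get("models", [])]

# Live check against a local Ollama instance (uncomment to run):
# with urlopen("http://localhost:11434/api/tags") as r:
#     print(models_from_tags(r.read().decode()))
```

If the expected names (e.g. `gemma:2b`, `nomic-embed-text`) do not appear in the output, the error is likely a missing model rather than a library bug.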
To Reproduce
Steps to reproduce the behavior:
Expected behavior
It should work seamlessly.