ScrapeGraphAI / Scrapegraph-ai

Python scraper based on AI
https://scrapegraphai.com
MIT License

RuntimeError: asyncio.run() cannot be called from a running event loop #771

Closed: boral closed this issue 2 weeks ago

boral commented 3 weeks ago

While running the README example, I'm facing this error:

```
result = smart_scraper_graph.run()
--- Executing Fetch Node ---
--- (Fetching HTML from: https://scrapegraphai.com/) ---
Traceback (most recent call last):

  Cell In[5], line 1
    result = smart_scraper_graph.run()

  File ~\anaconda3\envs\test\Lib\site-packages\scrapegraphai\graphs\smart_scraper_graph.py:114 in run
    self.final_state, self.execution_info = self.graph.execute(inputs)

  File ~\anaconda3\envs\test\Lib\site-packages\scrapegraphai\graphs\base_graph.py:263 in execute
    return self._execute_standard(initial_state)

  File ~\anaconda3\envs\test\Lib\site-packages\scrapegraphai\graphs\base_graph.py:184 in _execute_standard
    raise e

  File ~\anaconda3\envs\test\Lib\site-packages\scrapegraphai\graphs\base_graph.py:168 in _execute_standard
    result = current_node.execute(state)

  File ~\anaconda3\envs\test\Lib\site-packages\scrapegraphai\nodes\fetch_node.py:126 in execute
    return self.handle_web_source(state, source)

  File ~\anaconda3\envs\test\Lib\site-packages\scrapegraphai\nodes\fetch_node.py:284 in handle_web_source
    document = loader.load()

  File ~\anaconda3\envs\test\Lib\site-packages\langchain_core\document_loaders\base.py:31 in load
    return list(self.lazy_load())

  File ~\anaconda3\envs\test\Lib\site-packages\scrapegraphai\docloaders\chromium.py:111 in lazy_load
    html_content = asyncio.run(scraping_fn(url))

  File ~\anaconda3\envs\test\Lib\asyncio\runners.py:186 in run
    raise RuntimeError(

RuntimeError: asyncio.run() cannot be called from a running event loop
```
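
For context, here is a minimal sketch of the README-style setup that produces this traceback. The prompt and config values below are illustrative placeholders, not the exact values from my run:

```python
from scrapegraphai.graphs import SmartScraperGraph

# Illustrative config; the README example uses your own LLM settings.
graph_config = {
    "llm": {
        "api_key": "YOUR_OPENAI_API_KEY",   # placeholder
        "model": "openai/gpt-4o-mini",      # placeholder model name
    },
}

smart_scraper_graph = SmartScraperGraph(
    prompt="List me all the projects with their descriptions",
    source="https://scrapegraphai.com/",
    config=graph_config,
)

# Raises the RuntimeError above when called from Spyder's IPython
# console (or any environment with a running event loop).
result = smart_scraper_graph.run()
```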

VinciGit00 commented 3 weeks ago

Are you using Colab?

boral commented 3 weeks ago

No, Spyder.

f-aguzzi commented 3 weeks ago

Take a look at the first two code blocks in the Colab example, the ones about nest_asyncio. Even though you're using Spyder, its IPython console also runs inside an event loop, just like a Jupyter notebook, so the fix should be the same (see the sketch below).
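
For anyone landing here, the fix should be along these lines (you may need to `pip install nest_asyncio` first); apply the patch once before calling `run()`:

```python
# nest_asyncio patches the already-running event loop (Spyder's IPython
# console, Jupyter, Colab) so that asyncio.run() can be nested inside it.
import nest_asyncio

nest_asyncio.apply()

# Now the ChromiumLoader's internal asyncio.run() call succeeds.
result = smart_scraper_graph.run()
```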