Describe the bug
I have run the llama3 and nomic-embed-text models locally with Ollama successfully, but when I run python main.py to execute the scrapegraphai program, I get the following error.
To Reproduce
(scrapegraphai) ~/Scrapegraph-ai/ [main] python main.py
/Users/xxx/Techspace/Scrapegraph-ai/scrapegraphai/utils/remover.py:26: MarkupResemblesLocatorWarning: The input looks more like a filename than markup. You may want to open this file and pass the filehandle into Beautiful Soup.
soup = BeautifulSoup(html_content, 'html.parser')
Traceback (most recent call last):
File "/Users/xxx/Techspace/Scrapegraph-ai/main.py", line 27, in <module>
result = smart_scraper_graph.run()
File "/Users/xxx/Techspace/Scrapegraph-ai/scrapegraphai/graphs/smart_scraper_graph.py", line 116, in run
self.final_state, self.execution_info = self.graph.execute(inputs)
File "/Users/xxx/Techspace/Scrapegraph-ai/scrapegraphai/graphs/base_graph.py", line 107, in execute
result = current_node.execute(state)
File "/Users/xxx/Techspace/Scrapegraph-ai/scrapegraphai/nodes/rag_node.py", line 89, in execute
retriever = FAISS.from_documents(
File "/opt/homebrew/anaconda3/envs/scrapegraphai/lib/python3.9/site-packages/langchain_core/vectorstores.py", line 550, in from_documents
return cls.from_texts(texts, embedding, metadatas=metadatas, **kwargs)
File "/opt/homebrew/anaconda3/envs/scrapegraphai/lib/python3.9/site-packages/langchain_community/vectorstores/faiss.py", line 930, in from_texts
embeddings = embedding.embed_documents(texts)
File "/opt/homebrew/anaconda3/envs/scrapegraphai/lib/python3.9/site-packages/langchain_community/embeddings/ollama.py", line 211, in embed_documents
embeddings = self._embed(instruction_pairs)
File "/opt/homebrew/anaconda3/envs/scrapegraphai/lib/python3.9/site-packages/langchain_community/embeddings/ollama.py", line 199, in _embed
return [self._process_emb_response(prompt) for prompt in iter_]
File "/opt/homebrew/anaconda3/envs/scrapegraphai/lib/python3.9/site-packages/langchain_community/embeddings/ollama.py", line 199, in <listcomp>
return [self._process_emb_response(prompt) for prompt in iter_]
File "/opt/homebrew/anaconda3/envs/scrapegraphai/lib/python3.9/site-packages/langchain_community/embeddings/ollama.py", line 173, in _process_emb_response
raise ValueError(
ValueError: Error raised by inference API HTTP code: 503,
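The 503 is raised inside langchain_community's Ollama embeddings wrapper, which means the Ollama HTTP API answered but refused the embedding request (server busy or model not loaded). A minimal probe of the embeddings endpoint outside scrapegraphai can confirm this; this is a diagnostic sketch, assuming the default Ollama address http://localhost:11434 and the nomic-embed-text model:

```python
import json
import urllib.request
import urllib.error

# Default Ollama embeddings endpoint (assumption; adjust if you changed OLLAMA_HOST)
OLLAMA_URL = "http://localhost:11434/api/embeddings"

def build_embed_request(model: str, prompt: str) -> urllib.request.Request:
    """Build the JSON POST request that Ollama's /api/embeddings endpoint expects."""
    payload = json.dumps({"model": model, "prompt": prompt}).encode("utf-8")
    return urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )

def probe_embeddings(model: str = "nomic-embed-text") -> str:
    """Return a short diagnosis instead of raising, so a 503 is visible directly."""
    req = build_embed_request(model, "ping")
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return f"OK: HTTP {resp.status}"
    except urllib.error.HTTPError as e:
        # 503 here reproduces the failure seen in the traceback above
        return f"HTTP error {e.code}: model not loaded or server busy"
    except urllib.error.URLError as e:
        return f"Connection failed: {e.reason} (is `ollama serve` running?)"

if __name__ == "__main__":
    print(probe_embeddings())
```

If this probe also returns a 503, the problem is on the Ollama side (e.g. the embedding model was pulled but the server is overloaded or restarting), not in scrapegraphai; running `ollama pull nomic-embed-text` and restarting `ollama serve` before re-running main.py is a reasonable first step.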
Expected behavior
It should run successfully.
Desktop (please complete the following information):