ScrapeGraphAI / Scrapegraph-ai

Python scraper based on AI
https://scrapegraphai.com
MIT License

nomic-embed-text ValueError: Error raised by inference API HTTP code: 503 #166

Closed: dwgeneral closed this issue 6 months ago

dwgeneral commented 6 months ago

Describe the bug
I have run the llama3 and nomic-embed-text models locally with Ollama successfully, but when I run python main.py to execute the scrapegraphai program, I get the error below.
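For reference, the reporter's main.py is not shown, but a typical local-Ollama setup for SmartScraperGraph looks roughly like the sketch below (the prompt, source URL, and config values are placeholders based on the project README, not the reporter's actual script):

from scrapegraphai.graphs import SmartScraperGraph

# Hypothetical configuration: llama3 for generation, nomic-embed-text for
# embeddings, both served by a local Ollama instance on the default port.
graph_config = {
    "llm": {
        "model": "ollama/llama3",
        "temperature": 0,
        "format": "json",        # Ollama needs the output format spelled out
        "base_url": "http://localhost:11434",
    },
    "embeddings": {
        "model": "ollama/nomic-embed-text",
        "base_url": "http://localhost:11434",
    },
    "verbose": True,
}

smart_scraper_graph = SmartScraperGraph(
    prompt="List me all the projects with their descriptions",  # placeholder prompt
    source="https://example.com/projects",                      # placeholder URL
    config=graph_config,
)

result = smart_scraper_graph.run()
print(result)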

To Reproduce

(scrapegraphai) ~/Scrapegraph-ai/ [main] python main.py
/Users/xxx/Techspace/Scrapegraph-ai/scrapegraphai/utils/remover.py:26: MarkupResemblesLocatorWarning: The input looks more like a filename than markup. You may want to open this file and pass the filehandle into Beautiful Soup.
  soup = BeautifulSoup(html_content, 'html.parser')
Traceback (most recent call last):
  File "/Users/xxx/Techspace/Scrapegraph-ai/main.py", line 27, in <module>
    result = smart_scraper_graph.run()
  File "/Users/xxx/Techspace/Scrapegraph-ai/scrapegraphai/graphs/smart_scraper_graph.py", line 116, in run
    self.final_state, self.execution_info = self.graph.execute(inputs)
  File "/Users/xxx/Techspace/Scrapegraph-ai/scrapegraphai/graphs/base_graph.py", line 107, in execute
    result = current_node.execute(state)
  File "/Users/xxx/Techspace/Scrapegraph-ai/scrapegraphai/nodes/rag_node.py", line 89, in execute
    retriever = FAISS.from_documents(
  File "/opt/homebrew/anaconda3/envs/scrapegraphai/lib/python3.9/site-packages/langchain_core/vectorstores.py", line 550, in from_documents
    return cls.from_texts(texts, embedding, metadatas=metadatas, **kwargs)
  File "/opt/homebrew/anaconda3/envs/scrapegraphai/lib/python3.9/site-packages/langchain_community/vectorstores/faiss.py", line 930, in from_texts
    embeddings = embedding.embed_documents(texts)
  File "/opt/homebrew/anaconda3/envs/scrapegraphai/lib/python3.9/site-packages/langchain_community/embeddings/ollama.py", line 211, in embed_documents
    embeddings = self._embed(instruction_pairs)
  File "/opt/homebrew/anaconda3/envs/scrapegraphai/lib/python3.9/site-packages/langchain_community/embeddings/ollama.py", line 199, in _embed
    return [self._process_emb_response(prompt) for prompt in iter_]
  File "/opt/homebrew/anaconda3/envs/scrapegraphai/lib/python3.9/site-packages/langchain_community/embeddings/ollama.py", line 199, in <listcomp>
    return [self._process_emb_response(prompt) for prompt in iter_]
  File "/opt/homebrew/anaconda3/envs/scrapegraphai/lib/python3.9/site-packages/langchain_community/embeddings/ollama.py", line 173, in _process_emb_response
    raise ValueError(
ValueError: Error raised by inference API HTTP code: 503,

[Screenshot 2024-05-07 15:08:39]
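Since it is the embedding call that fails with a 503, one way to narrow the problem down is to hit the local Ollama embeddings endpoint directly, outside scrapegraphai and langchain. A diagnostic sketch (the model name matches the one above; the prompt is arbitrary):

import requests

# Call the local Ollama embeddings endpoint directly, ignoring any proxy
# settings inherited from the environment, to check whether the 503 comes
# from Ollama itself or from something in between (e.g. an HTTP proxy).
session = requests.Session()
session.trust_env = False  # ignore http_proxy / https_proxy environment variables

resp = session.post(
    "http://localhost:11434/api/embeddings",
    json={"model": "nomic-embed-text", "prompt": "hello world"},
    timeout=60,
)
print(resp.status_code)                       # 200 if Ollama is reachable
print(len(resp.json().get("embedding", [])))  # size of the returned vector

If this direct call succeeds while the scraper run still fails, the problem is likely in the environment between the client and Ollama rather than in the model itself.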

Expected behavior
It should run successfully.


VinciGit00 commented 6 months ago

Give me the code

dwgeneral commented 6 months ago

I have fixed it. I didn't notice that a proxy server was enabled in my terminal, so the request to the local Ollama server could not get through. After clearing the proxy settings:

export https_proxy=""
export http_proxy=""

the issue was resolved.
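If the proxy is still needed for other traffic, an alternative (an assumption, not something tested in this thread) is to exempt the local Ollama address instead of clearing the proxy entirely:

import os

# Assumption: the proxy is needed for other traffic, so instead of clearing it
# we exempt the local Ollama server. The HTTP clients used under the hood
# honor the standard no_proxy / NO_PROXY convention.
os.environ["no_proxy"] = "localhost,127.0.0.1"
os.environ["NO_PROXY"] = "localhost,127.0.0.1"

Set these (or the equivalent shell exports) before running main.py so that requests to http://localhost:11434 bypass the proxy.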