TheAiSingularity / graphrag-local-ollama

Local model support for Microsoft's GraphRAG using Ollama (llama3, mistral, gemma2, phi3) - LLM & embedding extraction
MIT License

embedding = ollama.embeddings(model="nomic-embed-text", prompt=inp) AttributeError: module 'ollama' has no attribute 'embeddings' #26

Open myyourgit opened 2 months ago

myyourgit commented 2 months ago

I followed the test steps and settings.yaml exactly, but I still hit the error below. I did install nomic-embed-text with the command:

ollama pull nomic-embed-text

Error traceback:

File "C:\Users\xxx\AppData\Local\Programs\Ollama\graphrag-local-ollama\graphrag\llm\openai\openai_embeddings_llm.py", line 31, in _execute_llm embedding = ollama.embeddings(model="nomic-embed-text", prompt=inp) ^^^^^^^^^^^^^^^^^ AttributeError: module 'ollama' has no attribute 'embeddings' 22:53:43,676 graphrag.index.reporting.file_workflow_callbacks INFO Error running pipeline! details=None