Closed dipakmeher closed 1 month ago
Issue resolved: the problem was my input, which I believe was too long for the local models I was using (Mistral for entity extraction and Nomic Embed Text for embeddings). The input was a book from Project Gutenberg; the code ran perfectly fine with OpenAI models for both entity extraction and embeddings, but not with Mistral and Nomic Embed Text. Shortening the input resolved the error.
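For anyone hitting the same thing, a minimal sketch of splitting a long document into overlapping pieces before indexing, so each piece stays within a local model's context budget. The character budget and overlap here are illustrative values, not GraphRAG defaults:

```python
def chunk_text(text: str, max_chars: int = 4000, overlap: int = 200) -> list[str]:
    """Split text into overlapping chunks small enough for a local model."""
    chunks = []
    start = 0
    while start < len(text):
        end = min(start + max_chars, len(text))
        chunks.append(text[start:end])
        if end == len(text):
            break
        # Step back by `overlap` so entities spanning a boundary appear in both chunks.
        start = end - overlap
    return chunks

book = "x" * 10_000  # stand-in for a long Gutenberg text
parts = chunk_text(book)
print(len(parts), all(len(p) <= 4000 for p in parts))  # → 3 True
```

Feeding the chunks to the model one at a time (instead of the whole book in one prompt) is what "shortening the input" amounted to in my case.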
Currently, I am facing issue #1234. Any help with this would be appreciated.
Do you need to file an issue?
Describe the issue
I am getting an error that says 'Error Invoking LLM' in my code. I’ve tried a few tweaks, but nothing has worked. Any help would be appreciated.
Error: [Issue]:
Steps to reproduce
You can reproduce this issue by using Mistral as the LLM, served via Ollama.
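To separate a GraphRAG problem from an Ollama problem, one option is to send the same long input straight to Ollama's `/api/generate` endpoint and see whether the model itself fails. A sketch, assuming a local Ollama server on the default port (the `long_prompt` placeholder stands in for the actual book text):

```python
import json
import urllib.request

def build_request(model: str, prompt: str,
                  host: str = "http://localhost:11434") -> urllib.request.Request:
    """Build a one-shot (non-streaming) Ollama /api/generate request."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    return urllib.request.Request(
        f"{host}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )

long_prompt = "..."  # stand-in for the long Gutenberg input
req = build_request("mistral", long_prompt)
print(req.full_url)  # → http://localhost:11434/api/generate
# response = urllib.request.urlopen(req)  # uncomment with Ollama running locally
```

If this direct call also fails on the long input but succeeds on a short one, the issue is with the model/context length rather than the GraphRAG pipeline.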
GraphRAG Config Used
Logs and screenshots
Additional Information