Closed NanaXiong00 closed 5 months ago
@NanaXiong00 Thanks for the report! Could you check that Ollama is working correctly? I frequently have to restart it manually after updates. You can check whether it works by running the command:
ollama run mistral
If the model starts and you can chat with it, it should be good. If not, try restarting Ollama.
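If you want to verify that the Ollama server is up without starting an interactive chat, you can also query its local HTTP API. This is a sketch assuming a default Ollama install listening on port 11434:

```shell
# Check that the Ollama server responds (default port 11434)
curl http://localhost:11434/api/tags

# If the server is running, this returns a JSON list of locally
# pulled models, which should include mistral and all-minilm:l6-v2.
# If the request fails, restart Ollama and run it again.
```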
From the logs, it also seems that you tried testing the chat before the documents were ingested (by running npm run upload:docs). It looks like the ingestion worked; did you try the chat again after that?
I followed the above reply and tested again. The problem can no longer be reproduced, so I'm closing this issue.
Describe the issue: When running the example locally with Ollama and opening http://localhost:8000 in the browser, the chat fails with the following error:
Repro Steps:
git clone https://github.com/Azure-Samples/serverless-chat-langchainjs.git
ollama pull mistral
ollama pull all-minilm:l6-v2
npm install
npm start
npm run upload:docs
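For reference, the full sequence in order, with document ingestion done before testing the chat (the cd step is an assumption; it is not listed in the steps above but is needed between cloning and installing):

```shell
git clone https://github.com/Azure-Samples/serverless-chat-langchainjs.git
cd serverless-chat-langchainjs   # assumed: enter the cloned repo first
ollama pull mistral
ollama pull all-minilm:l6-v2
npm install
npm start
# In a separate terminal, ingest the sample documents BEFORE testing the chat:
npm run upload:docs
# Then open http://localhost:8000 in the browser and try chatting.
```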
Environments:
Expected behavior: Open the URL http://localhost:8000 in the browser and successfully start chatting with the bot
@sinedied for notification.