Azure-Samples / serverless-chat-langchainjs

Build your own serverless AI Chat with Retrieval-Augmented-Generation using LangChain.js, TypeScript and Azure
MIT License
704 stars · 335 forks

Error running example locally using Ollama #50

Closed · NanaXiong00 closed 5 months ago

NanaXiong00 commented 5 months ago

Describe the issue: When running the example locally with Ollama, opening http://localhost:8000 and sending a chat message fails with an error (screenshots attached).

Repro Steps:

  1. Run command git clone https://github.com/Azure-Samples/serverless-chat-langchainjs.git
  2. Install and run Ollama
  3. Run command ollama pull mistral and ollama pull all-minilm:l6-v2 (see the check after these steps)
  4. Run command npm install
  5. Run command npm start
  6. Run command npm run upload:docs
  7. Open the URL http://localhost:8000
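
As a quick sanity check before opening the URL, you can confirm that both models were pulled:

ollama list

Both mistral and all-minilm:l6-v2 should appear in the output.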

Environments:

Expected behavior: Opening http://localhost:8000 in the browser lets you start chatting with the bot.

cc @sinedied.

sinedied commented 5 months ago

@NanaXiong00 Thanks for the report! Could you check that Ollama is working correctly? I often have to restart it manually after updates. You can check that it works by running:

ollama run mistral

If the model starts and you can chat with it, it should be good. If not, try restarting Ollama.
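
If you prefer a non-interactive check, Ollama also exposes a local HTTP API (on port 11434 by default) that you can query directly. The first call below lists the models available locally; the second sends a one-off prompt to mistral and should return a JSON response with the generated text:

curl http://localhost:11434/api/tags

curl http://localhost:11434/api/generate -d '{"model": "mistral", "prompt": "Hello", "stream": false}'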

From the logs, it also seems that you tried the chat before the documents were ingested (by running npm run upload:docs). It looks like the ingestion worked, so did you try the chat again after that?
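
For reference, the intended sequence is roughly the following, assuming the default local setup where npm start serves both the API and the web app:

ollama pull mistral
ollama pull all-minilm:l6-v2
npm install
npm start
# in a second terminal, once the services are up:
npm run upload:docs

Only after the ingestion completes should the chat at http://localhost:8000 work.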

NanaXiong00 commented 5 months ago

I followed the advice above and tested again. The problem can no longer be reproduced, so I am closing this issue.