jacoblee93 / fully-local-pdf-chatbot

Yes, it's another chat over documents implementation... but this one is entirely local!
https://webml-demo.vercel.app
MIT License

Request to Ollama server failed: 404 Not Found. Make sure you are running Ollama #22

Closed: kalends closed this 6 months ago

kalends commented 6 months ago

I have run llama3. After uploading a PDF, this error message pops up. What do I need to fix? Looking forward to your reply.

kalends commented 6 months ago

```
yarn run v1.22.22
$ next dev
▲ Next.js 13.5.6

⚠ ./app/lib/chat_models/webllm.ts
Attempted import error: 'ChatModule' is not exported from '@mlc-ai/web-llm' (imported as 'ChatModule').

Import trace for requested module:
./app/lib/chat_models/webllm.ts
./app/worker.ts
./components/ChatWindow.tsx

○ Compiling /page ...
```

(The import warning above repeats several times during compilation; only one instance is shown.)

jacoblee93 commented 6 months ago

Hey, sorry for the delay. By default, the model name is hard-coded to Mistral. You'll need to change it at this location in the code to point to Llama 3:

https://github.com/jacoblee93/fully-local-pdf-chatbot/blob/main/components/ChatWindow.tsx#L48
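
As a minimal sketch, assuming the app instantiates LangChain's `ChatOllama` around that line (the variable name here is illustrative, not the repo's actual code):

```typescript
import { ChatOllama } from "@langchain/community/chat_models/ollama";

// Sketch only: swap the hard-coded Ollama model name for the tag you
// actually pulled. The name must match an entry in `ollama list`;
// requesting a model the server doesn't have is what produces the 404.
const ollamaClient = new ChatOllama({
  baseUrl: "http://localhost:11434", // default Ollama endpoint
  model: "llama3", // was "mistral"
});
```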

For the second issue: I bumped the dependency version yesterday, which should fix this. A minor version of WebLLM introduced a breaking change.
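
For context, the breaking change removed the `ChatModule` export; newer `@mlc-ai/web-llm` releases expose an engine-based, OpenAI-style API instead. A rough sketch of the newer usage (the model ID below is an assumption for illustration, not necessarily the one this repo uses):

```typescript
import { CreateMLCEngine } from "@mlc-ai/web-llm";

// Newer WebLLM API: ChatModule is gone; CreateMLCEngine loads a model
// and returns an engine with an OpenAI-style chat completions interface.
// The model ID is illustrative; pick one from WebLLM's prebuilt model list.
async function demo() {
  const engine = await CreateMLCEngine("Llama-3-8B-Instruct-q4f16_1-MLC");
  const reply = await engine.chat.completions.create({
    messages: [{ role: "user", content: "Hello!" }],
  });
  console.log(reply.choices[0]?.message.content);
}
```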