Closed: kalends closed this issue 6 months ago
yarn run v1.22.22
$ next dev
▲ Next.js 13.5.6
- Environments: .env

✓ Ready in 2.8s
⚠ ./app/lib/chat_models/webllm.ts
Attempted import error: 'ChatModule' is not exported from '@mlc-ai/web-llm' (imported as 'ChatModule').

Import trace for requested module:
./app/lib/chat_models/webllm.ts
./app/worker.ts
./components/ChatWindow.tsx

○ Compiling /page ...
Hey, sorry for the delay. By default, the model is hard-coded to Mistral. You'll need to change the hard-coded model in the code to point to Llama 3:
https://github.com/jacoblee93/fully-local-pdf-chatbot/blob/main/components/ChatWindow.tsx#L48
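For anyone else hitting this, a minimal sketch of the change, assuming the model preset in ChatWindow.tsx is selected by a single model-id string. The exact identifiers below are assumptions; check the prebuilt model list that ships with your installed @mlc-ai/web-llm version:

```typescript
// components/ChatWindow.tsx (sketch): swap the hard-coded Mistral preset
// for a Llama 3 one. Both model ids below are illustrative assumptions --
// the exact strings depend on your @mlc-ai/web-llm version's model list.

// Before (the default):
const mistralModelId = "Mistral-7B-Instruct-v0.2-q4f16_1";

// After: point at a Llama 3 build instead.
const llamaModelId = "Llama-3-8B-Instruct-q4f16_1-MLC";
```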
For the second thing - I bumped the dependency version yesterday, which should fix this. There was a breaking change in a minor version of WebLLM.
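The breaking change explains the "'ChatModule' is not exported" warning above. A sketch of the before/after shape of the API, assuming the rename happened in web-llm's 0.2.x engine rework; the exact export names and the version number below should be verified against your installed package:

```typescript
// Sketch of the WebLLM export change behind the warning
// "'ChatModule' is not exported from '@mlc-ai/web-llm'".
// Assumption: the ChatModule class was replaced by an engine-style API;
// check your installed version's exports for the exact names.
//
// Old usage (what webllm.ts was written against):
//   import { ChatModule } from "@mlc-ai/web-llm";
//   const chat = new ChatModule();
//   await chat.reload(modelId);
//
// Newer engine-style usage:
//   import { CreateMLCEngine } from "@mlc-ai/web-llm";
//   const engine = await CreateMLCEngine(modelId);
//   const reply = await engine.chat.completions.create({
//     messages: [{ role: "user", content: "Hello" }],
//   });

// Practical takeaway: pin the exact package version in package.json so a
// minor bump cannot change the exported surface underneath you.
const pinnedDependency = { "@mlc-ai/web-llm": "0.2.35" }; // version is illustrative
```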
I have run Llama 3, but after uploading a PDF the error message still pops up. What do I need to fix? Looking forward to your reply.