cocktailpeanut / dalai

The simplest way to run LLaMA on your local machine
https://cocktailpeanut.github.io/dalai

Dalai server doesn't start without internet connection #449

Open Gary6780 opened 1 year ago

Gary6780 commented 1 year ago

Hello!

I managed to run LLaMA and Alpaca on Windows 11. But when I run "npx dalai serve" without an internet connection (WLAN off), the npx command tries to start a server and hangs. When I turn the WLAN connection back on and enter "npx dalai serve" again, the local server starts with "Server running on http://localhost:3000/". So both starting the server and opening the address in the browser need an internet connection. After that I can cut the network and use the AI.
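For what it's worth, one thing worth ruling out is npx itself: npx may contact the npm registry to resolve the "dalai" package before it runs anything, which would block without a network. A rough sketch of two things to test (this assumes the dalai package installs a `dalai` binary and that the hang really comes from the registry lookup; I haven't verified either):

```sh
# Assumption: the hang is npx resolving "dalai" against the npm registry,
# not the dalai server itself.

# Option 1: install once while online, then run the installed binary offline
npm install -g dalai
dalai serve

# Option 2: ask npx to prefer the local npm cache instead of hitting the registry
npx --prefer-offline dalai serve
```

If the server still hangs with a local install, the network dependency is probably in dalai or its web UI rather than in npx.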

Am I missing something?

Thanks!

Gary6780 commented 1 year ago

I see it's the same as issue #157.