InternLM / Tutorial

LLM&VLM Tutorial

Process killed after "load model begin" #1239

Open cainiaozp opened 1 month ago

cainiaozp commented 1 month ago

```shell
(internlm-demo) root@intern-studio-50104806:~/code/InternLM# streamlit run web_demo.py --server.address 127.0.0.1 --server.port 6006

  Collecting usage statistics. To deactivate, set browser.gatherUsageStats to false.

  You can now view your Streamlit app in your browser.

  URL: http://127.0.0.1:6006

load model begin. Killed
```
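For context, a 7B-parameter model in fp16 needs roughly 14 GB for the weights alone (7B parameters × 2 bytes), before any runtime overhead. A quick sketch for checking available memory before relaunching the demo (assumes a Linux host; paths like `/proc/meminfo` are Linux-specific):

```shell
# Show a human-readable memory summary (total / used / available).
free -h

# MemAvailable is the kernel's estimate of memory that can be claimed
# without swapping; compare it against the model's expected footprint.
grep MemAvailable /proc/meminfo
```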

liuwake commented 1 month ago

That "Killed" is the Linux kernel terminating the process. Use a tool such as `top`, `htop`, or `btop` to check whether you are running out of memory.
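Besides interactive tools like `top` and `htop`, the kernel logs can confirm whether the OOM killer was responsible after the fact. A non-interactive sketch (reading `dmesg` may require root on some systems; `journalctl -k` is an alternative on systemd hosts):

```shell
# Search the kernel ring buffer for OOM-killer activity; the log line
# names the killed process and its memory usage at the time.
# `|| true` keeps the exit code clean when no match is found or when
# dmesg is not readable by the current user.
dmesg 2>/dev/null | grep -iE "out of memory|oom-killer|killed process" | tail -n 5 || true
```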