Open mishav78 opened 1 year ago
I'm running the 30B alpaca and my memory usage is roughly 78% of my 32GB RAM while in use.
cpu or gpu ram?
My PC memory/RAM. It also uses your CPU. As far as I know there are no current configurable settings to use GPU.
will it work faster with a gpu?
Please see my edited comment above.
very strange. Don't these models usually use gpus?
> very strange. Don't these models usually use gpus?
This project is using llama.cpp/alpaca.cpp which "Runs on the CPU"
https://github.com/antimatter15/alpaca.cpp#getting-started-30b
> very strange. Don't these models usually use gpus?

To train them, not to run them.
does it work as well as ChatGPT? Or close?
I'd say 30B is closing in at about 80% of ChatGPT 3.5; 7B/13B maybe 60%+.
> I'd say 30B is closing in at about 80% of ChatGPT 3.5; 7B/13B maybe 60%+.

I'd be interested to know what prompts you've tried and what parameter values (temperature, etc.) you're using.
For me even 30B feels like 10% of what I see with ChatGPT 3.5.
maybe I'm an idiot but I have to ask: are the memory requirements below for CPU or GPU RAM?
Runs on most modern computers. Unless your computer is very very old, it should work.
According to https://github.com/ggerganov/llama.cpp/issues/13, here are the memory requirements:
7B => ~4 GB
13B => ~8 GB
30B => ~16 GB
65B => ~32 GB
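Those figures are CPU RAM, and they line up with 4-bit (q4) quantization: roughly half a byte per weight. A minimal back-of-the-envelope sketch (the `bits_per_weight` default and the function name are my own assumptions, not part of llama.cpp; real usage is a bit higher due to quantization metadata, the KV cache, and context buffers):

```python
def approx_ram_gb(n_params_billion: float, bits_per_weight: float = 4.0) -> float:
    """Rough RAM estimate in GiB for a quantized model.

    Assumes 4 bits per weight (q4 quantization). Actual resident memory
    is somewhat larger: per-block scale factors, the KV cache, and
    activation buffers all add overhead on top of the raw weights.
    """
    n_bytes = n_params_billion * 1e9 * bits_per_weight / 8
    return n_bytes / 1024**3

# Compare the estimate against the figures quoted in the thread.
for size in (7, 13, 30, 65):
    print(f"{size}B => ~{approx_ram_gb(size):.1f} GB (weights only)")
```

The weights-only estimates come out slightly under the quoted numbers, which is consistent with the extra overhead mentioned above.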