withcatai / catai

Run AI ✨ assistant locally! with simple API for Node.js 🚀
https://withcatai.github.io/catai/
MIT License

Feedback: Model discovery and installation #7

Closed: os6sense closed this issue 1 year ago

os6sense commented 1 year ago

Running install with an unrecognised model gives the following output:

➜  models catai install gpt4all                                                                                                                                                                  
$ cd /usr/local/lib/node_modules/catai                                                                                                                                                                     
Model unknown, we will download with template URL. You can also try one of thous:7B, 13B, 30B, Vicuna-7B, Vicuna-7B-Uncensored, Vicuna-13B, Stable-Vicuna-13B, Wizard-Vicuna-7B, Wizard-Vicuna-7B-Uncensored, Wizard-Vicuna-13B, OpenAssistant-30B

Outputting a list of available models is excellent, but perhaps it would also be worth adding a catai install --list or similar command?
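
For illustration, here is a minimal sketch of what such a --list handler could look like on the Node.js side. The models.json catalog, its name-to-URL shape, and the listModels function are assumptions made up for this example, not catai's actual internals:

// Hypothetical sketch of a `catai install --list` handler, not actual catai code.
// Assumes the CLI keeps its known models in a models.json file mapping
// model name -> download URL; the real catalog format may differ.
import { readFileSync } from 'node:fs';

export function listModels(catalogPath = new URL('./models.json', import.meta.url)): void {
  const models: Record<string, string> = JSON.parse(readFileSync(catalogPath, 'utf8'));
  console.log('Available models:');
  for (const [name, url] of Object.entries(models)) {
    console.log(`  ${name.padEnd(32)} ${url}`);
  }
}

Wired up to a --list flag (or a dedicated list subcommand), something like that would let people discover model names without digging through the source.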

Also note that the output is a bit confusing ... thous:7B? And what are the 7B, 13B, 30B models? (edit) Ahhh, the original LLaMA models, doh!

ido-pluto commented 1 year ago

Yes, naming a model just "7B" is a bit confusing. This is planned to be changed in the future.

It is a good idea to list all the available models in a formatted way, with a simple command.

Thanks for sharing this :)

os6sense commented 1 year ago

No worries, and thanks for this project. Obviously there is a lot of work going on in the LLM area, and having something simple that doesn't take days of reading to understand is bound to increase interest and make it easier for people to experiment.

Just downloading the models is an adventure, given that it is not obvious that Hugging Face models are just git repos, nor is it obvious exactly which formats will work. catai install is bang on where the complexity should lie.
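
Since a Hugging Face model repo is a plain git repo, individual files can also be fetched directly over HTTPS via the repo's resolve/ path, which is roughly what catai install automates with its template URL. A minimal sketch of doing it by hand, assuming Node 18+ for the global fetch and a placeholder repo/file name:

// Minimal sketch of downloading one model file from a Hugging Face repo by hand.
// The repo path and file name are placeholders; substitute a ggml-format model
// that your catai version actually supports.
import { createWriteStream } from 'node:fs';
import { Readable } from 'node:stream';
import { pipeline } from 'node:stream/promises';

const url = 'https://huggingface.co/SOME-USER/SOME-MODEL/resolve/main/model.bin'; // placeholder
const res = await fetch(url);
if (!res.ok || !res.body) throw new Error(`Download failed: ${res.status}`);
// Stream the body to disk rather than buffering a multi-gigabyte file in memory.
await pipeline(Readable.fromWeb(res.body as any), createWriteStream('model.bin'));
console.log('Saved model.bin');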