alexrozanski / LlamaChat

Chat with your favourite LLaMA models in a native macOS app
https://llamachat.app
MIT License

Selected Model is Unsupported #3

Closed. xavierallem closed this issue 1 year ago

xavierallem commented 1 year ago

I'm using Alpaca and GPT4All. After selecting the model path, LlamaChat shows "Selected model is of an unsupported version".

alexrozanski commented 1 year ago

Hey @xavierallem, sorry - the docs and UI aren't super clear on this yet. It looks like you have one of the slightly older model versions. I'm going to add native flows for these cases, but for now you can use the conversion scripts from the llama.cpp repo to upgrade the models to the new format required by llama.cpp (which LlamaChat uses internally). Lmk if this works!
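
For context, here's a minimal sketch of what that upgrade step looked like around this time, assuming a local llama.cpp checkout and the migrate-ggml-2023-03-30-pr613.py script it shipped with then; the script name, its arguments, and all paths below are assumptions/examples and may differ in newer llama.cpp versions:

```python
# Hedged sketch: upgrade an old-format GGML model to the newer format that
# llama.cpp (and therefore LlamaChat) expects, by shelling out to the
# migration script assumed to live in a local llama.cpp checkout.
import subprocess
from pathlib import Path

llama_cpp = Path("~/code/llama.cpp").expanduser()                     # assumed checkout location
old_model = Path("~/models/gpt4all-lora-quantized.bin").expanduser()  # example old-format model
new_model = old_model.with_name(old_model.stem + "-upgraded.bin")     # output path for the new format

subprocess.run(
    [
        "python3",
        str(llama_cpp / "migrate-ggml-2023-03-30-pr613.py"),  # assumed script name
        str(old_model),
        str(new_model),
    ],
    check=True,  # raise CalledProcessError if the migration fails
)
print(f"Upgraded model written to {new_model}")
```

The upgraded file is what you'd then point LlamaChat at instead of the original.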

alexrozanski commented 1 year ago

Added some details to the README for now.

danieldunderfelt commented 1 year ago

The instructions are quite obtuse. It took me a while to figure out that the tokenizer.model is found among the Llama files.
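
To make the tokenizer.model point concrete, here's a minimal sketch assuming the standard layout of the original LLaMA weight download; the directory names are examples, not paths LlamaChat requires:

```python
# Minimal sketch: locate tokenizer.model in a LLaMA download, assuming the
# standard release layout (illustrated below); paths are examples only.
#
#   LLaMA/
#   ├── tokenizer.model           <- the file the conversion step asks for
#   ├── tokenizer_checklist.chk
#   └── 7B/
#       ├── consolidated.00.pth
#       └── params.json
from pathlib import Path

llama_dir = Path("~/models/LLaMA").expanduser()   # assumed download location
tokenizer = llama_dir / "tokenizer.model"

if tokenizer.exists():
    print(f"Found tokenizer at {tokenizer}; pass this path to the conversion script")
else:
    print(f"tokenizer.model not found under {llama_dir}")
```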

alexrozanski commented 1 year ago

@danieldunderfelt good feedback. I added some more notes in https://github.com/alexrozanski/LlamaChat/blob/main/README.md#-models, but this should really be better supported/documented in the conversion flow.

danieldunderfelt commented 1 year ago

Epic, thanks!

alexrozanski commented 1 year ago

This is fixed in v1.2.0 -- older models are now supported directly in LlamaChat.