Closed by simonw 4 months ago
To test this:

```shell
rm ~/Library/Application\ Support/io.datasette.llm/llm-anyscale-endpoints.json
```
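That JSON file is the plugin's cached model list, so deleting it forces a fresh state. As a rough illustration of the caching pattern involved (names and cache format here are hypothetical, not the plugin's actual internals), a fetch-once-then-cache helper might look like:

```python
import json
from pathlib import Path


def cached_models(cache_path, fetch):
    """Return a list of model IDs, fetching once and caching to disk.

    Illustrative sketch only: the real plugin's cache location,
    format, and refresh logic may differ.
    """
    path = Path(cache_path)
    if path.exists():
        # Serve the previously fetched list from the JSON cache.
        return json.loads(path.read_text())
    # No cache yet: fetch the list and write it out for next time.
    models = fetch()
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps(models))
    return models
```

With a cache like this, deleting the JSON file is exactly what makes the next invocation re-fetch the model list.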
Then:

```shell
$ llm models list | grep Anysca
AnyscaleEndpoints: codellama/CodeLlama-70b-Instruct-hf
AnyscaleEndpoints: google/gemma-7b-it
AnyscaleEndpoints: meta-llama/Llama-2-13b-chat-hf
AnyscaleEndpoints: meta-llama/Llama-2-70b-chat-hf (aliases: llama70b)
AnyscaleEndpoints: meta-llama/Llama-2-7b-chat-hf
AnyscaleEndpoints: mistralai/Mistral-7B-Instruct-v0.1
AnyscaleEndpoints: mistralai/Mixtral-8x22B-Instruct-v0.1 (aliases: mix22b)
AnyscaleEndpoints: mistralai/Mixtral-8x7B-Instruct-v0.1 (aliases: mixtral)
AnyscaleEndpoints: mlabonne/NeuralHermes-2.5-Mistral-7B
```
```shell
$ llm install -U llm-anyscale-endpoints
...
Installing collected packages: llm-anyscale-endpoints
  Attempting uninstall: llm-anyscale-endpoints
    Found existing installation: llm-anyscale-endpoints 0.5
    Uninstalling llm-anyscale-endpoints-0.5:
      Successfully uninstalled llm-anyscale-endpoints-0.5
Successfully installed llm-anyscale-endpoints-0.6
```
```shell
$ llm models list | grep Anysca
AnyscaleEndpoints: meta-llama/Llama-2-7b-chat-hf
AnyscaleEndpoints: meta-llama/Llama-2-13b-chat-hf
AnyscaleEndpoints: mistralai/Mixtral-8x7B-Instruct-v0.1 (aliases: mixtral)
AnyscaleEndpoints: mistralai/Mistral-7B-Instruct-v0.1
AnyscaleEndpoints: meta-llama/Llama-2-70b-chat-hf (aliases: llama70b)
AnyscaleEndpoints: meta-llama/Llama-3-8b-chat-hf
AnyscaleEndpoints: meta-llama/Llama-3-70b-chat-hf
AnyscaleEndpoints: codellama/CodeLlama-70b-Instruct-hf
AnyscaleEndpoints: mistralai/Mixtral-8x22B-Instruct-v0.1 (aliases: mix22b)
AnyscaleEndpoints: mlabonne/NeuralHermes-2.5-Mistral-7B
AnyscaleEndpoints: google/gemma-7b-it
$ llm -m meta-llama/Llama-3-70b-chat-hf 'say something nice about llamas'
I love llamas! They're so cute and gentle, a great companion animal. Did you know that llamas are also very intelligent and can learn to do tricks and tasks? They're also great at carrying heavy loads, which makes them super useful on hikes and treks. Plus, their soft fur is just irresistible!
```
These models are currently available only if you know to run:

```shell
llm anyscale-endpoints refresh
```

but it would be better if they worked out of the box.
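One way to get out-of-the-box behavior is to bundle a default model list that is used whenever no cached copy exists yet. This is only a sketch of that idea, with hypothetical names and an illustrative model list, not the plugin's actual implementation:

```python
import json
from pathlib import Path

# Hypothetical defaults shipped with the plugin; the model IDs here
# are illustrative, taken from the listing above.
DEFAULT_MODELS = [
    "meta-llama/Llama-3-8b-chat-hf",
    "meta-llama/Llama-3-70b-chat-hf",
]


def models_with_default(cache_path):
    """Prefer the refreshed cache on disk, falling back to the
    bundled defaults so new installs work without a manual refresh."""
    path = Path(cache_path)
    if path.exists():
        return json.loads(path.read_text())
    return DEFAULT_MODELS
```

With this shape, `llm anyscale-endpoints refresh` would still update the cache, but a fresh install would see a sensible model list immediately.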