simonw closed this 9 months ago
Got this working:
```
llm -m mistral-tiny '1 short reason to get a pet pelican'
```
> A pelican makes for an intriguing and unique pet due to its distinctive appearance and interesting behaviors. They are social animals that form tight communities, which can provide companionship for their owners. Additionally, pelicans have a gentle and inquisitive nature, making them an enjoyable addition to any home. However, it is important to note that owning a pelican as a pet is not feasible for most people due to their large size, specialized dietary needs, and legal restrictions.

```
llm -m mistral-small '1 short reason to get a pet pelican'
```
> One potential reason to get a pet pelican is that they are highly intelligent and can be trained to perform tricks or tasks. However, it's important to note that pelicans are wild animals and may not be suitable as pets in all circumstances. They require a lot of space, a specialized diet, and specific care that may be difficult for some people to provide. Additionally, in many places it is illegal to keep pelicans as pets without a permit. It's always best to research and consider the needs and welfare of the animal before deciding to keep any wild animal as a pet.

```
llm -m mistral-medium '1 short reason to get a pet pelican'
```
> One short reason to get a pet pelican is that they are unique and fascinating birds with distinct personalities and can provide companionship and entertainment. However, it is important to note that pelicans are wild animals and require specialized care, housing, and permits to keep legally as pets in most places. They are not suitable for everyone and should only be considered by experienced and dedicated caretakers.
I added conversation support, so `llm chat -m mistral-tiny` now works:
```
llm chat -m mistral-tiny
Chatting with mistral-tiny
Type 'exit' or 'quit' to exit
Type '!multi' to enter multiple lines, then '!end' to finish
> hello who are you
Hello there! I'm an artificial intelligence designed to assist with various tasks and answer questions. How can I help you today?
> what is a pelican?
A pelican is a large water bird that belongs to the order Pelecaniformes and the family Pelecanidae. Pelicans have a distinctive pouch under their beak which they use to scoop up fish and other prey from the water. They are found in various parts of the world, including North and South America, Europe, Africa, and Asia. Pelicans are known for their graceful flight and their social behavior, often seen in large groups, called colonies. There are eight different species of pelicans, with sizes ranging from 1.5 to 5 feet in length.
> bird or mammal?
A pelican is a bird, not a mammal. Birds are a group of animals characterized by their feathers, laying hard-shelled eggs, a beak without teeth, and a high metabolic rate to maintain a body temperature above their environment. Mammals, on the other hand, are a group of animals that have mammary glands for producing milk to feed their young, fur or hair covering their bodies, and three middle ear bones. Pelicans are obviously not mammals, as they do not have mammary glands or fur.
```
System prompts work too:
```
cat llm_mistral.py | llm -m mistral-medium -s 'explain this code'
```
This code appears to be for a language model (LLM) interface that uses the Mistral AI API to generate text responses. The Mistral AI API is a service that allows you to use pre-trained language models to generate text based on a prompt.
Here is a breakdown of the code:
- The `httpx` and `httpx_sse` libraries are imported to make HTTP requests and handle server-sent events (SSE), respectively.
- The `llm` library is imported to use the LLM interface for interacting with language models.
- The `register_models` function is a hook that registers the available models with the LLM interface. In this case, three instances of the `Mistral` model are registered with different model IDs: "mistral-tiny", "mistral-small", and "mistral-medium".
- The `Mistral` class is defined, which inherits from the `llm.Model` class. This class represents a language model that can be used through the LLM interface.
- The `Mistral` class has a `can_stream` attribute set to `True`, which indicates that it supports server-sent events for streaming responses.
- The `__init__` method of the `Mistral` class takes a `model_id` argument, which is used to specify the ID of the model to use when generating text.
- The `build_messages` method takes a `prompt` object and a `conversation` object as arguments. It constructs a list of messages to send to the Mistral AI API based on the `prompt` and `conversation` objects.
- The `execute` method takes `prompt`, `stream`, `response`, and `conversation` objects as arguments. It sends an HTTP POST request to the Mistral AI API with the messages constructed by the `build_messages` method. The API request includes an authorization header with a bearer token, which is obtained from the `llm.get_key()` function.
- The `execute` method uses the `connect_sse()` function from the `httpx_sse` library to handle server-sent events. If the API request is successful,

Error: peer closed connection without sending complete message body (incomplete chunked read)
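The `build_messages` step described above is what makes both system prompts and `llm chat` work: each earlier exchange is replayed as a user/assistant pair ahead of the new prompt. A minimal self-contained sketch of that logic (the real method works with `llm`'s `prompt` and `conversation` objects; the plain-argument signature here is my simplification):

```python
def build_messages(prompt, system=None, history=None):
    """Build a Mistral chat-completion messages list.

    history is a list of (user_text, assistant_text) pairs from
    earlier turns in the conversation.
    """
    messages = []
    if system:
        # System prompt goes first, before any conversation turns
        messages.append({"role": "system", "content": system})
    for user_text, assistant_text in history or []:
        # Replay each earlier exchange so the model sees the context
        messages.append({"role": "user", "content": user_text})
        messages.append({"role": "assistant", "content": assistant_text})
    # The new prompt is always the final message
    messages.append({"role": "user", "content": prompt})
    return messages
```

So the third turn of the chat session above sends five messages: the two prior exchanges plus the new `bird or mammal?` question.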
Docs here: https://docs.mistral.ai/
I'm going to use https://pypi.org/project/httpx-sse/
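For reference, a sketch of how the streaming path can be wired together with httpx-sse. The `connect_sse()` / `iter_sse()` calls are that library's documented interface; the chunk shape parsed in `extract_content` (OpenAI-style `choices[0].delta.content` with a `[DONE]` sentinel) and the endpoint URL are my assumptions about the Mistral API, not something confirmed above.

```python
import json


def extract_content(data):
    # Pull the text delta out of one SSE "data" payload.
    # Assumes OpenAI-style chunks: {"choices": [{"delta": {"content": ...}}]}
    # and a "[DONE]" sentinel marking the end of the stream.
    if data == "[DONE]":
        return ""
    chunk = json.loads(data)
    return chunk["choices"][0].get("delta", {}).get("content") or ""


def stream_completion(api_key, messages, model="mistral-tiny"):
    # Yield response text fragments as they arrive over SSE.
    # httpx/httpx_sse are imported lazily so extract_content stays
    # importable without the HTTP dependencies installed.
    import httpx
    from httpx_sse import connect_sse

    with httpx.Client() as client:
        with connect_sse(
            client,
            "POST",
            "https://api.mistral.ai/v1/chat/completions",
            headers={"Authorization": f"Bearer {api_key}"},
            json={"model": model, "messages": messages, "stream": True},
            timeout=None,
        ) as event_source:
            for sse in event_source.iter_sse():
                text = extract_content(sse.data)
                if text:
                    yield text
```

Keeping the chunk parsing in a separate pure function also makes truncated-stream errors like the one above easier to isolate from the transport layer.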