ollama / ollama-python

Ollama Python library
https://ollama.com
MIT License
4.09k stars 344 forks

Suggestion: Use models to encapsulate request/responses #33

Open sachinsachdeva opened 8 months ago

sachinsachdeva commented 8 months ago

Consider using models to properly encapsulate requests and responses.

For example, this:

import ollama
response = ollama.chat(model='llama2', messages=[
  {
    'role': 'user',
    'content': 'Why is the sky blue?',
  },
])
print(response['message']['content'])

would become something like:

import ollama
from ollama.models import ChatMessageRequest, ChatMessageResponse

messages: list[ChatMessageRequest] = [ChatMessageRequest('user', 'Why is the sky blue?')]
response: ChatMessageResponse = ollama.chat(model='llama2', messages=messages)

print(response.content)
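For illustration, a minimal sketch of what such models could look like using standard-library dataclasses. Note that `ChatMessageRequest`, `ChatMessageResponse`, and their methods are hypothetical names from this suggestion, not part of the ollama package; the dict shapes mirror the existing `ollama.chat` request/response format shown above.

```python
# Hypothetical sketch of the proposed models (not part of ollama).
# The dict layouts match the untyped request/response format that
# ollama.chat currently uses: {'role': ..., 'content': ...} for
# messages, and {'message': {...}} for the response.
from dataclasses import dataclass


@dataclass
class ChatMessageRequest:
    role: str
    content: str

    def to_dict(self) -> dict:
        # Serialize back to the plain-dict shape ollama.chat expects.
        return {'role': self.role, 'content': self.content}


@dataclass
class ChatMessageResponse:
    role: str
    content: str

    @classmethod
    def from_dict(cls, data: dict) -> 'ChatMessageResponse':
        # Parse the {'message': {'role': ..., 'content': ...}} response dict.
        msg = data['message']
        return cls(role=msg['role'], content=msg['content'])
```

A thin wrapper could then convert to and from dicts at the API boundary, so the rest of user code works with attributes (`response.content`) instead of string keys.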
adriens commented 8 months ago

:heavy_plus_sign: :one: