Closed samuelrince closed 5 months ago
Started working on this in `feat/cohere`, works with:

```python
from ecologits import EcoLogits
import cohere

EcoLogits.init()

client = cohere.Client()
chat = client.chat(
    message="Hello!",
    max_tokens=100,
    model="command-light",
)
print(chat.text)
print(chat.impacts)
```
Note: streams work as well:

```python
from ecologits import EcoLogits
import cohere

EcoLogits.init()

co = cohere.Client()
stream = co.chat_stream(
    message="Tell me a short story",
    model="command-light",
    max_tokens=50,
)
for event in stream:
    if event.event_type == "text-generation":
        print(event.text, end="")
    if event.event_type == "stream-end":
        print("\n")
        print(event.impacts)
```
As with Anthropic, output tokens are only returned at the end of the stream.
Description
Add Cohere LLM provider.
Solution
Cohere has its own API and a Python client.
Available models: https://docs.cohere.com/docs/models
Code example:
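The intended example is presumably a usage snippet like the one in the comment above. As a runnable stand-in that needs no API key, here is a minimal sketch of the wrapping idea behind the provider integration; `DummyClient`, `Impacts`, and `instrument` are all hypothetical names, not the actual EcoLogits internals:

```python
from dataclasses import dataclass
from typing import Optional


# Stand-in response/impacts types; the real ones come from ecologits/cohere.
@dataclass
class Impacts:
    energy_kwh: float


@dataclass
class ChatResponse:
    text: str
    impacts: Optional[Impacts] = None


class DummyClient:
    """Stand-in for cohere.Client; returns a canned response."""

    def chat(self, message: str, model: str, max_tokens: int) -> ChatResponse:
        return ChatResponse(text=f"echo: {message}")


def instrument(client):
    """Wrap client.chat so every response carries an `impacts` attribute,
    mimicking the effect EcoLogits.init() has on the real Cohere client."""
    original = client.chat

    def chat(*args, **kwargs):
        response = original(*args, **kwargs)
        # The real provider would estimate impacts from token counts and
        # model size; here we attach a placeholder value.
        response.impacts = Impacts(energy_kwh=0.0)
        return response

    client.chat = chat
    return client


client = instrument(DummyClient())
chat = client.chat(message="Hello!", model="command-light", max_tokens=100)
print(chat.text)     # echo: Hello!
print(chat.impacts)  # Impacts(energy_kwh=0.0)
```

The same monkey-patching shape applies to `chat_stream`, with the impacts attached only to the final stream-end event, matching the note above.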