A Ruby gem for seamlessly and uniformly interacting with large language and vision model (LLM) API's served by numerous services, including those of OpenAI, Anthropic, Google and others.
MIT License
[together ai] restore :end_sequence_encountered support #1
The Together AI API, when serving the Llama 3.1 Turbo models, returns an :eos finish_reason in addition to :stop, which would let us support :end_sequence_encountered.
Unfortunately, other models, such as the Llama 3.1 vision models, do not return :eos, so I've removed :end_sequence_encountered support for now pending clarification from Together AI.
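A minimal sketch of the workaround described above. The constant and method names here are hypothetical, not the gem's actual API: it normalizes a raw finish_reason string from the Together AI response, folding :eos into :stop until all models report it consistently.

```ruby
# Hypothetical mapping (names assumed, not the gem's real API) from Together AI
# finish_reason strings to a normalized termination symbol. Llama 3.1 Turbo
# models can return "eos" when an end-of-sequence token is hit, but other
# models (e.g. the Llama 3.1 vision models) only ever return "stop", so "eos"
# is folded into :stop rather than mapped to :end_sequence_encountered.
FINISH_REASON_MAP = {
  "stop"   => :stop,
  "length" => :length,
  "eos"    => :stop, # would be :end_sequence_encountered if universally supported
}.freeze

def normalize_finish_reason(raw)
  # Unknown reasons fall back to :other rather than raising.
  FINISH_REASON_MAP.fetch(raw, :other)
end
```

If Together AI later reports :eos uniformly across models, restoring :end_sequence_encountered would be a one-line change to the map.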