
Command-R - A low-latency LLM with long context, optimized for RAG and tool use. #729

Open irthomasthomas opened 8 months ago

irthomasthomas commented 8 months ago

Command-R

Description: Command-R is a Large Language Model optimized for conversational interaction and long context tasks. It targets the “scalable” category of models that balance high performance with strong accuracy, enabling companies to move beyond proof of concept and into production. Command-R boasts high precision on retrieval augmented generation (RAG) and tool use tasks, low latency and high throughput, a long 128k context, and strong capabilities across 10 key languages.

Model Details

| LATEST MODEL | DESCRIPTION | MAX INPUT TOKENS (INPUT CONTEXT WINDOW) | MAX OUTPUT TOKENS (OUTPUT CONTEXT WINDOW) | ENDPOINTS |
|---|---|---|---|---|
| command-r | Command-R is an instruction-following conversational model that performs language tasks at a higher quality, more reliably, and with a longer context than previous models. It can be used for complex workflows like code generation, retrieval augmented generation (RAG), tool use, and agents. | 128k | 4096 | Chat |

Unique Command-R Model Capabilities

Command-R has been trained on a massive corpus of diverse texts in multiple languages and can perform a wide array of text-generation tasks. Moreover, Command-R has been trained to excel at some of the most critical functionalities for business use cases.

Multilingual Capabilities

We want Command-R to serve as many people, organizations, and markets as possible, so it is capable of interacting in many languages with a fairly high degree of accuracy.

The model is optimized to perform well in the following languages: English, French, Spanish, Italian, German, Brazilian Portuguese, Japanese, Korean, Simplified Chinese, and Arabic.

Additionally, pre-training data has been included for the following 13 languages: Russian, Polish, Turkish, Vietnamese, Dutch, Czech, Indonesian, Ukrainian, Romanian, Greek, Hindi, Hebrew, Persian.

The model has been trained to respond in the language of the user. Here's an example:

```python
import cohere

co = cohere.Client("YOUR_API_KEY")  # assumes the Cohere Python SDK; the key is a placeholder
response = co.chat(
    model="command-r",
    message="Écris une description de produit pour une voiture électrique en 50 à 75 mots",
)
```

And here's what the response might look like:

Découvrez la voiture électrique qui va révolutionner votre façon de conduire. Avec son design élégant, cette voiture offre une expérience de conduite unique avec une accélération puissante et une autonomie impressionnante. Sa technologie avancée vous garantit une charge rapide et une fiabilité inégalée. Avec sa conception innovante et durable, cette voiture est parfaite pour les trajets urbains et les longues distances. Profitez d'une conduite silencieuse et vivez l'expérience de la voiture électrique!

*(English translation: Discover the electric car that will revolutionize the way you drive. With its elegant design, this car offers a unique driving experience with powerful acceleration and impressive range. Its advanced technology guarantees fast charging and unmatched reliability. With its innovative and durable design, this car is perfect for city trips and long distances alike. Enjoy silent driving and experience the electric car!)*

Command-R can not only be used to generate text in several languages but can also perform cross-lingual tasks such as translation or answering questions about content in other languages.
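
For example, a cross-lingual translation request might look like the following sketch (the prompt wording and API key are our own illustrative placeholders, not taken from the Command-R docs):

```python
import cohere

co = cohere.Client("YOUR_API_KEY")  # placeholder API key

# A cross-lingual task: translate English product copy into Japanese.
response = co.chat(
    model="command-r",
    message="Translate into Japanese: 'Discover the electric car that will revolutionize the way you drive.'",
)
print(response.text)
```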

Retrieval Augmented Generation

Command-R has been trained with the ability to ground its generations. This means that it can generate responses based on a list of supplied document snippets, and it will include citations in its response indicating the source of the information.
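
As a brief sketch of what grounded generation can look like with the Cohere Python SDK (the document snippets, question, and API key below are our own illustrative placeholders):

```python
import cohere

co = cohere.Client("YOUR_API_KEY")  # placeholder API key

# Supply document snippets for the model to ground on; it cites them in its answer.
response = co.chat(
    model="command-r",
    message="What context window does Command-R support?",
    documents=[
        {"title": "Model details", "snippet": "Command-R supports a 128k token context window."},
        {"title": "Endpoints", "snippet": "Command-R is available through the Chat endpoint."},
    ],
)

print(response.text)       # grounded answer
print(response.citations)  # citation spans pointing back to the supplied documents
```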

For more information, check out our dedicated guide on retrieval augmented generation.

Tool Use

Command-R has been trained with conversational tool use capabilities. Its tool use functionality takes a conversation as input (with an optional user-system preamble), along with a list of available tools. The model will then generate a JSON-formatted list of actions to execute on a subset of those tools. For more information, check out our dedicated tool use guide.
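
A minimal sketch of this flow with the Cohere Python SDK might look like the following (the tool definition, message, and API key are our own illustrative placeholders; see the tool use guide for the authoritative schema):

```python
import cohere

co = cohere.Client("YOUR_API_KEY")  # placeholder API key

# One illustrative tool definition: name, description, and typed parameters.
tools = [
    cohere.Tool(
        name="query_daily_sales_report",
        description="Retrieves the sales report for a given day.",
        parameter_definitions={
            "day": cohere.ToolParameterDefinitionsValue(
                description="The day to retrieve the report for, in YYYY-MM-DD format.",
                type="str",
                required=True,
            )
        },
    )
]

response = co.chat(
    model="command-r",
    message="Can you give me the sales report for 29 September 2023?",
    tools=tools,
)

# The model returns structured tool calls rather than executing anything itself.
for call in response.tool_calls or []:
    print(call.name, call.parameters)
```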

URL: https://docs.cohere.com/docs/command-r

Suggested labels

irthomasthomas commented 8 months ago

Related content

- #553 (Similarity score: 0.89)
- #706 (Similarity score: 0.89)
- #681 (Similarity score: 0.88)
- #418 (Similarity score: 0.88)
- #715 (Similarity score: 0.87)
- #684 (Similarity score: 0.87)