opensearch-project / documentation-website

The documentation for OpenSearch, OpenSearch Dashboards, and their associated plugins.
https://opensearch.org/docs
Apache License 2.0

[DOC] Add support for Bedrock Converse API #8195

austintlee opened this issue 2 months ago

austintlee commented 2 months ago

What do you want to do?

Tell us about your request. Provide a summary of the request.

ml-commons is adding support for the Bedrock Converse API, which gives RAG users access to the latest LLMs from Anthropic, for example, Claude 3.5 Sonnet. This will also work with OpenAI GPT-4o, allowing images to be included in chats.

Version: List the OpenSearch version to which this issue applies, e.g. 2.14, 2.12-2.14, or all.

What other resources are available? Provide links to related issues, POCs, steps for testing, etc.

austintlee commented 2 months ago

We are introducing a new RAG search processor parameter called "llm_messages". It is intended to match, one to one, the "messages" parameter of the Bedrock Converse API (https://docs.aws.amazon.com/bedrock/latest/APIReference/API_runtime_Converse.html); therefore, an array of messages (llm_messages) passed to the RAG search processor will be converted to an array of Message objects and passed to Bedrock.
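
As a rough sketch (index name, pipeline name, and field names are placeholders, and the placement of llm_messages inside generative_qa_parameters is assumed from the description above), the parameter would be passed in a RAG search request like this:

GET /my_index/_search?search_pipeline=rag_pipeline
{
  "query": { "match": { "text": "tallest building" } },
  "ext": {
    "generative_qa_parameters": {
      "llm_question": "what is the tallest building in the world?",
      "llm_messages": [
        {
          "role": "user",
          "content": [
            { "type": "text", "text": "what is the tallest building in the world?" }
          ]
        }
      ]
    }
  }
}

Other generative_qa_parameters fields (llm_model, memory_id, and so on) are omitted for brevity.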

llm_messages   = [ MessageBlock ]        ([ ] denotes an array)
MessageBlock   = { "role": <role>, "content": [ ContentBlock ] }
role           = "user" | "assistant"    (typically "user", but "user" and "assistant" can be interleaved for few-shot prompting)
ContentBlock   = { "type": <content_type>, <content_type>: <content_detail> }

content_type   = "text" | "image" | "document"
For "text":     content_detail is a string
For "image":    content_detail = { "format": ..., "type": ..., "data": ... }
For "document": content_detail = { "format": ..., "name": ..., "data": ... }

Example 1: text

llm_messages:
[
  {
    "role": "user",
    "content": [
      { "type": "text", "text": "what is the tallest building in the world?" }
    ]
  }
]

Example 2: image

llm_messages:
[
  {
    "role": "user",
    "content": [
      { "type": "text", "text": "use the image to answer the following question: how many apples are in the tree?" },
      { "type": "image", "image": { "format": "jpeg", "type": "data", "data": .... } }
    ]
  }
]

Example 3: document

llm_messages:
[
  {
    "role": "user",
    "content": [
      { "type": "text", "text": "what is the average of the third column in the table on page 1 of the attached document?" },
      { "type": "document", "document": { "format": "pdf", "name": "many_tables", "data": .... } }
    ]
  }
]

austintlee commented 2 months ago

The llm_model prefix must be "bedrock-converse/", e.g. llm_model: bedrock-converse/anthropic.claude-3-sonnet-20240229-v1:0
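
In a request, that would look roughly like the following sketch (other parameters elided):

"ext": {
  "generative_qa_parameters": {
    "llm_model": "bedrock-converse/anthropic.claude-3-sonnet-20240229-v1:0",
    "llm_messages": [ ... ]
  }
}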

austintlee commented 2 months ago

Example 4: text + image + document
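
A combined payload, following the structure of Examples 1-3 (the question text is illustrative and the data values are placeholders), might look like this:

llm_messages:
[
  {
    "role": "user",
    "content": [
      { "type": "text", "text": "using the image and the attached document, how many apples are in the tree and what is the average of the third column in the table on page 1?" },
      { "type": "image", "image": { "format": "jpeg", "type": "data", "data": .... } },
      { "type": "document", "document": { "format": "pdf", "name": "many_tables", "data": .... } }
    ]
  }
]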

Naarcha-AWS commented 2 months ago

@austintlee: Can you submit a PR with these updates?

austintlee commented 2 months ago

I'll do that today.

We already have a page that explains all the parameters, but I don't know if this needs a dedicated/separate page.

Zhangxunmt commented 2 months ago

@Naarcha-AWS, please remove the 2.17 tag since this feature is no longer included in 2.17.