Instrument BedrockRuntimeClient.prototype.send when the command is InvokeModelWithResponseStreamCommand. When the model is not amazon.titan-embed, create the span Llm/completion/Bedrock/InvokeModelWithResponseStreamCommand and the relevant LlmChatCompletionSummary and LlmChatCompletionMessage events. When the model is amazon.titan-embed, create the span Llm/embedding/Bedrock/InvokeModelWithResponseStreamCommand and the relevant LlmEmbedding event.
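A minimal sketch of the naming and event-selection rules above. The helpers (isTitanEmbedModel, getBedrockSpanName, getLlmEventTypes) are hypothetical illustrations, not existing agent or AWS SDK APIs:

```js
// Hypothetical helpers illustrating the span/event rules; not actual agent APIs.
const COMMAND = 'InvokeModelWithResponseStreamCommand'

function isTitanEmbedModel(modelId) {
  // Assumption: titan embed model ids start with 'amazon.titan-embed' (e.g. amazon.titan-embed-text-v1)
  return typeof modelId === 'string' && modelId.startsWith('amazon.titan-embed')
}

function getBedrockSpanName(modelId) {
  return isTitanEmbedModel(modelId)
    ? `Llm/embedding/Bedrock/${COMMAND}`
    : `Llm/completion/Bedrock/${COMMAND}`
}

function getLlmEventTypes(modelId) {
  // Embedding models emit LlmEmbedding; all other models emit the chat completion events.
  return isTitanEmbedModel(modelId)
    ? ['LlmEmbedding']
    : ['LlmChatCompletionSummary', 'LlmChatCompletionMessage']
}
```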
NOTES:
This instrumentation should only register when ai_monitoring.enabled and feature_flag.aws_bedrock_instrumentation are both true.
The AWS SDK v3 instrumentation is middleware based, so this is not a traditional span.wrap/record effort (see the sketch below).
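A minimal sketch of how the middleware hook and the config gate might fit together, assuming the instrumentation adds a deserialize-step middleware to the client's middlewareStack. agentConfig and recordBedrockSpan are illustrative placeholders, not real agent APIs:

```js
'use strict'

// Sketch only: register the middleware when both flags are true, then record
// the span/events from the command context and result.
function instrumentBedrockClient(client, agentConfig, recordBedrockSpan) {
  const enabled =
    agentConfig.ai_monitoring.enabled &&
    agentConfig.feature_flag.aws_bedrock_instrumentation
  if (!enabled) {
    return // do not register when either flag is false
  }

  client.middlewareStack.add(
    (next, context) => async (args) => {
      const result = await next(args)
      if (context.commandName === 'InvokeModelWithResponseStreamCommand') {
        // args.input.modelId drives the span name and LLM event types (see helpers above)
        recordBedrockSpan(context.commandName, args.input.modelId, args.input, result)
      }
      return result
    },
    { step: 'deserialize', name: 'newrelicBedrockMiddleware' }
  )
}
```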