i-am-bee / bee-agent-framework

The framework for building scalable agentic applications.
https://i-am-bee.github.io/bee-agent-framework/
Apache License 2.0

Support Amazon Bedrock as inference provider #122

Open mmurad2 opened 2 weeks ago

mmurad2 commented 2 weeks ago

Description
Add Amazon Bedrock as a supported inference provider.

Tasks

Additional context
3x requests from LI. See comment.

Tomas2D commented 2 weeks ago

Relevant documentation page: https://i-am-bee.github.io/bee-agent-framework/#/llms?id=adding-a-new-provider-adapter

abughali commented 2 weeks ago

@mmurad2 please assign it to me

Tomas2D commented 1 day ago

Any progress @abughali?

Tomas2D commented 1 day ago

A temporary workaround is to go through the LangChain adapter. cc @MartinRistov

import { BaseMessage } from "bee-agent-framework/llms/primitives/message";
import { LangChainChatLLM } from "bee-agent-framework/adapters/langchain/llms/chat";

// requires the @langchain/community package to be installed first
import { BedrockChat } from "@langchain/community/chat_models/bedrock";

console.info("===CHAT===");
const llm = new LangChainChatLLM(
  new BedrockChat({
    model: "anthropic.claude-v2",
    region: process.env.BEDROCK_AWS_REGION ?? "us-east-1",
    // endpointUrl: "custom.amazonaws.com",
    credentials: {
      accessKeyId: process.env.BEDROCK_AWS_ACCESS_KEY_ID!,
      secretAccessKey: process.env.BEDROCK_AWS_SECRET_ACCESS_KEY!,
    },
    temperature: 0,
    maxTokens: undefined,
    maxRetries: 2,
    // other params...
  }),
);

const response = await llm.generate([
  BaseMessage.of({
    role: "user",
    text: "Hello world!",
  }),
]);
console.info(response.getTextContent());
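
The wrapped model can then be passed to the agent like any other chat LLM. Below is a minimal sketch; it assumes the BeeAgent and UnconstrainedMemory entry points shown in the project README, and the empty tool list and prompt are placeholders:

import { BeeAgent } from "bee-agent-framework/agents/bee/agent";
import { UnconstrainedMemory } from "bee-agent-framework/memory/unconstrainedMemory";

// Reuse the LangChain-wrapped Bedrock model from the snippet above as the agent's LLM.
const agent = new BeeAgent({
  llm,
  memory: new UnconstrainedMemory(),
  tools: [], // add tools as needed
});

const reply = await agent.run({ prompt: "What is the capital of France?" });
console.info(reply.result.text);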

Related code: https://github.com/i-am-bee/bee-agent-framework/blob/main/examples/llms/providers/langchain.ts
LangChain documentation: https://js.langchain.com/docs/integrations/llms/bedrock#instantiation