[Open] mmurad2 opened 2 weeks ago
Relevant documentation page: https://i-am-bee.github.io/bee-agent-framework/#/llms?id=adding-a-new-provider-adapter
@mmurad2 please assign it to me
Any progress @abughali?
@MartinRistov here is a temporary workaround that uses the LangChain adapter:
```ts
import { BaseMessage } from "bee-agent-framework/llms/primitives/message";
import { LangChainChatLLM } from "bee-agent-framework/adapters/langchain/llms/chat";
// you need to install the @langchain/community package first
import { BedrockChat } from "@langchain/community/chat_models/bedrock";

console.info("===CHAT===");
const llm = new LangChainChatLLM(
  new BedrockChat({
    model: "anthropic.claude-v2",
    region: process.env.BEDROCK_AWS_REGION ?? "us-east-1",
    // endpointUrl: "custom.amazonaws.com",
    credentials: {
      accessKeyId: process.env.BEDROCK_AWS_ACCESS_KEY_ID,
      secretAccessKey: process.env.BEDROCK_AWS_SECRET_ACCESS_KEY,
    },
    temperature: 0,
    maxTokens: undefined,
    maxRetries: 2,
    // other params...
  }),
);
const response = await llm.generate([
  BaseMessage.of({
    role: "user",
    text: "Hello world!",
  }),
]);
```
Related code: https://github.com/i-am-bee/bee-agent-framework/blob/main/examples/llms/providers/langchain.ts
LangChain documentation: https://js.langchain.com/docs/integrations/llms/bedrock#instantiation
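Before running the workaround above, the `@langchain/community` package has to be installed (as noted in the snippet's comment). The LangChain Bedrock docs also list the AWS SDK Bedrock runtime client as a dependency; a typical install, assuming npm:

```shell
# LangChain community integrations plus the Bedrock runtime client
# (package names per the LangChain JS Bedrock integration docs)
npm install @langchain/community @aws-sdk/client-bedrock-runtime
```

The snippet additionally expects `BEDROCK_AWS_ACCESS_KEY_ID` and `BEDROCK_AWS_SECRET_ACCESS_KEY` (and optionally `BEDROCK_AWS_REGION`) to be set in the environment.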
**Description**
Add Amazon Bedrock as a supported inference provider.

**Tasks**

**Additional context**
3x requests from LI; see comment.