tryAGI / LangChain

C# implementation of LangChain. We try to be as close to the original as possible in terms of abstractions, but are open to new entities.
https://tryagi.gitbook.io/langchain/
MIT License
466 stars · 71 forks

Ability to Provide Configuration for MaxTokens, etc to Bedrock #310

Closed ericgreenmix closed 1 month ago

ericgreenmix commented 2 months ago

Add a Configuration class, similar to what AzureOpenAI and other providers have, to be able to set MaxTokens, Temperature, etc on the Bedrock provider. Right now there is no way to change the MaxTokens on the Model or Provider level.

@curlyfro It appears you are actively working on the Bedrock Provider. If I were to add this in a PR, would it be stepping on you at all?

HavenDV commented 2 months ago

Let's try an AI-generated PR first; maybe it will be a good foundation.

P.S. The results were bad.

HavenDV commented 2 months ago

@ericgreenmix To get answers faster, it would be better to ask this in Discord. I can add you to the dev channel; just join it and tell me your nickname.

curlyfro commented 2 months ago

@ericgreenmix, MaxTokens is already supported. BedrockChatSettings is the default, but all of the model providers have their own custom ChatSettings classes, like MistralInstructChatSettings, AmazonTitanChatSettings, etc., with default values for that model. You can change it like this:

var provider = new BedrockProvider();
var llm = new Claude3SonnetModel(provider)
{
    Settings = new BedrockChatSettings
    {
        MaxTokens = 200_000,
        Temperature = 0,
        UseStreaming = true
    }
};
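For the model-specific settings classes mentioned above, the usage would presumably follow the same pattern. A minimal sketch, assuming a MistralInstructModel class and that MistralInstructChatSettings exposes the same MaxTokens/Temperature properties (class names and defaults here are illustrative, not confirmed against the actual API):

var provider = new BedrockProvider();
// Assumed model class name; substitute the actual Bedrock model wrapper you use.
var llm = new MistralInstructModel(provider)
{
    // Model-specific settings override the BedrockChatSettings defaults.
    Settings = new MistralInstructChatSettings
    {
        MaxTokens = 4_096,
        Temperature = 0.7
    }
};

If no Settings object is assigned, the model falls back to the defaults baked into its own ChatSettings class.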

I tried to document this in the wiki.

ericgreenmix commented 1 month ago

Ah okay, thank you for updating the documentation. That is working.