aws-samples / serverless-pdf-chat

LLM-powered document chat using Amazon Bedrock and AWS Serverless
https://aws.amazon.com/blogs/compute/building-a-serverless-document-chat-with-aws-lambda-and-amazon-bedrock/
MIT No Attribution
245 stars 222 forks

model parameters #12

Closed catalin-hanga closed 1 year ago

catalin-hanga commented 1 year ago

For a particular Bedrock LLM, can we see the default values of the model parameters (such as temperature, top-k, and max tokens) that the app uses, and can we change these values?

pbv0 commented 1 year ago

Yes, you can check the default parameters for each Bedrock model here (by default this sample uses Claude v2): https://docs.aws.amazon.com/bedrock/latest/userguide/model-parameters.html

You can adjust these values by passing `model_kwargs` to the LangChain `Bedrock` class: https://api.python.langchain.com/en/latest/llms/langchain.llms.bedrock.Bedrock.html
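A minimal sketch of what that could look like, assuming the Claude v2 parameter names from the Bedrock model-parameters page linked above (the specific values here are illustrative, not the sample's defaults):

```python
# Inference parameters for Claude v2, keyed by the names Bedrock expects
# (see the Bedrock model-parameters documentation).
model_kwargs = {
    "temperature": 0.3,           # lower = more deterministic output
    "top_k": 250,                 # sample only from the 250 most likely tokens
    "max_tokens_to_sample": 1024, # cap on generated tokens
}

# These kwargs would then be passed to the LangChain Bedrock wrapper, e.g.:
# from langchain.llms.bedrock import Bedrock
# llm = Bedrock(model_id="anthropic.claude-v2", model_kwargs=model_kwargs)
```

Parameters not present in `model_kwargs` keep the model's documented defaults.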