Closed liyunrui closed 11 months ago
Hi @liyunrui, you are passing the Anthropic configuration parameters to an AI21 Labs model; see the per-model parameter details here: https://catalog.workshops.aws/building-with-amazon-bedrock/en-US/foundation/bedrock-inference-parameters. In addition, please make sure you use the new model IDs introduced last week (Bedrock generally available): ai21.j2-mid-v1 or ai21.j2-ultra-v1.
```python
from langchain.llms.bedrock import Bedrock

# These are Anthropic Claude parameter names, which AI21's
# Jurassic-2 models do not accept:
inference_modifier = {
    "max_tokens_to_sample": 4096,
    "temperature": 0.5,
    "top_k": 250,
    "top_p": 1,
    "stop_sequences": ["\n\nHuman"],
}

textgen_llm = Bedrock(
    model_id="ai21.j2-grande-instruct",  # pre-GA model ID
    client=boto3_bedrock,
    model_kwargs=inference_modifier,
)
```

This raises:

```
ValueError: Error raised by bedrock service: An error occurred (ValidationException) when calling the InvokeModel operation: The provided inference configurations are invalid
```
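As a sketch of the fix, the Jurassic-2 models expect AI21's own parameter names (`maxTokens`, `topP`, `stopSequences`) rather than Anthropic's; the values below are illustrative, not recommendations:

```python
# AI21 Jurassic-2 parameter names (Anthropic's max_tokens_to_sample /
# top_p / stop_sequences are rejected with a ValidationException;
# top_k is an Anthropic parameter and is simply dropped here):
inference_modifier = {
    "maxTokens": 4096,               # replaces max_tokens_to_sample
    "temperature": 0.5,
    "topP": 1,                       # replaces top_p
    "stopSequences": ["\n\nHuman"],  # replaces stop_sequences
}

# With the AI21-style kwargs, the call from the question becomes:
# from langchain.llms.bedrock import Bedrock
# textgen_llm = Bedrock(model_id="ai21.j2-ultra-v1",
#                       client=boto3_bedrock,
#                       model_kwargs=inference_modifier)
```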