[Open] tahpot opened this issue 1 week ago
could #72 address your requirements?
That PR added support for Llama 3.1, is a similar change planned for 3.2?
No, that didn't work for me; however, I have since moved to using Bedrock natively.
Describe the bug
I am trying to use Meta Llama 3.1 and 3.2, which require inference profile support.
I am getting this error:
Unsupported model us.meta.llama3-1-70b-instruct-v1:0, please use models API to get a list of supported models
I have tried using the ARN format as well (arn:aws:bedrock:us-east-1:&lt;accountId&gt;:inference-profile/us.meta.llama3-1-70b-instruct-v1:0), but get the same error.

When using the model meta.llama3-1-70b-instruct-v1:0 I get the error:

An error occurred (ValidationException) when calling the Converse operation: Invocation of model ID meta.llama3-1-70b-instruct-v1:0 with on-demand throughput isn't supported. Retry your request with the ID or ARN of an inference profile that contains this model.
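For reference, the ValidationException is asking for a cross-region inference profile ID rather than the bare model ID. A minimal sketch of that ID rewrite — the helper name and the `us.` prefix convention are my assumptions, not something the gateway exposes:

```python
def to_inference_profile_id(model_id: str, region_prefix: str = "us") -> str:
    """Prefix a bare Bedrock model ID with a cross-region inference
    profile prefix (hypothetical helper; the prefix scheme is an assumption)."""
    if model_id.split(".", 1)[0] in ("us", "eu", "apac"):
        return model_id  # already looks like an inference profile ID
    return f"{region_prefix}.{model_id}"

# The profile ID is what would be passed as modelId to the Converse API, e.g.:
# client.converse(modelId=to_inference_profile_id("meta.llama3-1-70b-instruct-v1:0"), ...)
print(to_inference_profile_id("meta.llama3-1-70b-instruct-v1:0"))
# -> us.meta.llama3-1-70b-instruct-v1:0
```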
I thought it might be because I'm using an older version of this gateway, so I followed the instructions to update to the latest image, but that made no difference.
I'm getting this error for both Llama 3.1 and 3.2.
Please complete the following information:
/chat/completions