JudiniLabs / code-gpt-docs


Bedrock: Supported Imported Models. #315

Open mikhail-khodorovskiy opened 2 months ago

mikhail-khodorovskiy commented 2 months ago

Bedrock supports importing a custom model, as documented at https://docs.aws.amazon.com/bedrock/latest/userguide/model-customization-import-model.html#model-customization-import-model-job.
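
For context, creating such an import job looks roughly like the sketch below. This is only an illustration: the job name, model name, role ARN, and S3 URI are placeholders, and the exact flags should be confirmed against the linked documentation.

    # Hypothetical example: start a model import job from model artifacts in S3.
    aws bedrock create-model-import-job \
        --job-name my-import-job \
        --imported-model-name my-imported-model \
        --role-arn arn:aws:iam::XXXX:role/MyBedrockImportRole \
        --model-data-source '{"s3DataSource": {"s3Uri": "s3://my-bucket/model/"}}'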

Once imported, the model can be invoked using its model ARN as the model ID. AWS CLI example of invoking the model:

    # Invoke the imported model by its ARN; the last argument is the output file.
    aws bedrock-runtime invoke-model \
        --model-id arn:aws:bedrock:us-west-2:XXXX:imported-model/XXX \
        --body '{"prompt": "\n\nHuman: story of two dogs\n\nAssistant:", "max_tokens_to_sample": 300}' \
        --cli-binary-format raw-in-base64-out \
        invoke-model-output.txt

Would the CodeGPT team consider adding the ability to select and use a custom imported model, in addition to the hardcoded list of foundation models available in Bedrock?

mikhail-khodorovskiy commented 1 month ago

We are hosting deepseek-coder as a model imported into Bedrock. Can AI autocompletion also be configured to connect to AWS Bedrock using the same profile as the chat model, with the imported model selected as the autocomplete model?
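
For reference, a minimal sketch of what an autocomplete-style invocation of the imported model could look like from the CLI, reusing the ARN-based invocation above. The prompt and request-body fields are placeholders; the actual body schema depends on how the model was imported.

    # Hypothetical completion request against the imported deepseek-coder model.
    aws bedrock-runtime invoke-model \
        --model-id arn:aws:bedrock:us-west-2:XXXX:imported-model/XXX \
        --body '{"prompt": "def fibonacci(n):", "max_tokens": 128}' \
        --cli-binary-format raw-in-base64-out \
        completion-output.txt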