adampingel opened this issue 4 months ago
Meeting with AP: Jul 16th, 2024
- Investigate Bedrock to stand these up (AA did this very easily)
- Can we use Hugging Face directly?
- What is the host name? Do we need CloudFront for protection?
https://python.langchain.com/v0.1/docs/integrations/llms/bedrock/
`conversation.predict(input="Hi there!")`
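For context, a minimal sketch of that conversation call, assuming the LangChain v0.1 Bedrock integration from the link above; the `model_id` (an imported-model ARN) and the AWS profile name are placeholders, not real resources:

```python
# Minimal sketch, assuming the LangChain v0.1 Bedrock integration linked above.
# The model_id and credentials_profile_name values are placeholders.
from langchain_community.llms import Bedrock
from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory

llm = Bedrock(
    credentials_profile_name="default",  # AWS profile with Bedrock access (assumption)
    model_id="arn:aws:bedrock:us-east-1:123456789012:imported-model/EXAMPLE",  # placeholder ARN
)

conversation = ConversationChain(llm=llm, memory=ConversationBufferMemory(), verbose=True)
print(conversation.predict(input="Hi there!"))
```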
Observed: it looks like we can only import `model_type = llama` into Bedrock.
The model type we have is `"model_type": "gpt_bigcode"`.
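A quick way to confirm this is to read the `model_type` declared in each repo's `config.json`; a sketch using Hugging Face `transformers` (`AutoConfig` only fetches the config, not the weights):

```python
# Sketch: inspect the model_type declared in config.json for the Granite code models.
from transformers import AutoConfig

for repo in ("ibm-granite/granite-34b-code-instruct",
             "ibm-granite/granite-20b-code-instruct"):
    cfg = AutoConfig.from_pretrained(repo)
    print(repo, "->", cfg.model_type, cfg.architectures)

# Per the observation above, these report "gpt_bigcode",
# which is not in Bedrock Custom Model Import's supported list.
```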
Also requested a quota increase:
https://support.console.aws.amazon.com/support/home#/case/?displayId=172202349600681
Limit name: Custom models per account
New limit value: 250.0
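The same increase could also be requested programmatically via the Service Quotas API; a hedged sketch below, where the quota code is a placeholder (look up the real code with `list_service_quotas` first):

```python
# Sketch: request the "Custom models per account" increase via the Service Quotas API.
# The QuotaCode below is a placeholder; find the real code with list_service_quotas.
import boto3

sq = boto3.client("service-quotas", region_name="us-east-1")

# Find the quota code for "Custom models per account" under Amazon Bedrock.
for quota in sq.list_service_quotas(ServiceCode="bedrock")["Quotas"]:
    if quota["QuotaName"] == "Custom models per account":
        print(quota["QuotaCode"], quota["Value"])

sq.request_service_quota_increase(
    ServiceCode="bedrock",
    QuotaCode="L-XXXXXXXX",   # placeholder: use the code printed above
    DesiredValue=250.0,
)
```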
Custom Model Import supports a range of model architectures, including Mistral, Flan, and Llama 2 and 3.
This flexibility ensures that a wide variety of models can be integrated into the Bedrock ecosystem.
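For reference, importing a model into Bedrock goes through a model import job; a minimal sketch with the boto3 `bedrock` client, where the S3 URI, IAM role ARN, and names are placeholders:

```python
# Sketch: start a Bedrock Custom Model Import job for weights staged in S3.
# The S3 URI, IAM role ARN, and names are placeholders, not real resources.
import boto3

bedrock = boto3.client("bedrock", region_name="us-east-1")

response = bedrock.create_model_import_job(
    jobName="granite-20b-code-instruct-import",
    importedModelName="granite-20b-code-instruct",
    roleArn="arn:aws:iam::123456789012:role/BedrockModelImportRole",  # placeholder
    modelDataSource={
        "s3DataSource": {"s3Uri": "s3://my-bucket/granite-20b-code-instruct/"}  # placeholder
    },
)
print(response["jobArn"])
```

Note that, per the observations above, this job would currently fail for the `gpt_bigcode` Granite code models until Bedrock supports that architecture.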
Yesterday, @kpsrikanthibm mentioned that AWS is "going to provide more examples". I'm going to move this to "blocked" until we get those examples.
After talking with Vincent, I understood that a big team is working on enabling Granite models in AWS Bedrock. Current progress is shared on Slack: https://ibm-research.slack.com/archives/C072HAGFTE0/p1724796077343029. Progress is currently blocked because the necessary tokenizer class and model architecture are not supported in AWS Bedrock.
Make these models available:
https://huggingface.co/ibm-granite/granite-34b-code-instruct
https://huggingface.co/ibm-granite/granite-20b-code-instruct
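If/when the architecture is supported, a rough sketch of staging these Hugging Face repos into S3 for Custom Model Import (the bucket name is a placeholder):

```python
# Sketch: pull the Granite code model repos from the Hugging Face Hub and stage them
# in S3 so Bedrock Custom Model Import can read them. The bucket name is a placeholder.
import os
import boto3
from huggingface_hub import snapshot_download

s3 = boto3.client("s3")
bucket = "my-model-staging-bucket"  # placeholder

for repo in ("ibm-granite/granite-34b-code-instruct",
             "ibm-granite/granite-20b-code-instruct"):
    local_dir = snapshot_download(repo_id=repo)  # downloads config, tokenizer, weights
    prefix = repo.split("/")[-1]
    for root, _, files in os.walk(local_dir):
        for name in files:
            path = os.path.join(root, name)
            key = f"{prefix}/{os.path.relpath(path, local_dir)}"
            s3.upload_file(path, bucket, key)
```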