google-deepmind / opro

Official code for "Large Language Models as Optimizers"
https://arxiv.org/abs/2309.03409
Apache License 2.0

feat: Amazon Bedrock support #7

Open JGalego opened 3 months ago

JGalego commented 3 months ago

This PR adds support for Amazon Bedrock models, which can be used as both the scorer and the optimizer.

Example:

python optimize_instructions.py --optimizer="bedrock/mistral.mixtral-8x7b-instruct-v0:1" \
                                --scorer="bedrock/anthropic.claude-3-sonnet-20240229-v1:0" \
                                --instruction_pos="Q_begin" \
                                --dataset="gsm8k" \
                                --task="train"
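For reference, a Bedrock scorer or optimizer call typically goes through the boto3 "bedrock-runtime" client. The sketch below uses the Converse API as one generic way to serve both model families in the example; the function name, defaults, and exact request shape are illustrative and may differ from what this PR actually implements (e.g. it may use invoke_model with model-specific payloads instead).

import boto3

# Minimal sketch, assuming the boto3 bedrock-runtime Converse API.
# call_bedrock_model is a hypothetical helper, not necessarily the PR's function name.
def call_bedrock_model(prompt, model_id, temperature=0.8, max_tokens=1024):
    client = boto3.client("bedrock-runtime")
    response = client.converse(
        modelId=model_id,  # e.g. "anthropic.claude-3-sonnet-20240229-v1:0"
        messages=[{"role": "user", "content": [{"text": prompt}]}],
        inferenceConfig={"temperature": temperature, "maxTokens": max_tokens},
    )
    # The generated text is returned under output.message.content.
    return response["output"]["message"]["content"][0]["text"]

Running the example above requires valid AWS credentials and model access enabled for the chosen model IDs in the target region.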