Closed: austinmw closed this issue 10 months ago
Sorry, found the answer. It seems I can just do this:
models:
  - type: main
    engine: amazon_bedrock
    model: anthropic
    parameters:
      model_id: anthropic.claude-instant-v1
      model_kwargs:
        max_tokens_to_sample: 1000
        temperature: 0.0
        stop_sequences: ["\n\nHuman:", "\nUser:"]
  - type: generate_bot_message
    engine: amazon_bedrock
    model: anthropic
    parameters:
      model_id: anthropic.claude-v2
      model_kwargs:
        max_tokens_to_sample: 10000
        temperature: 0.3
        stop_sequences: ["\n\nHuman:", "\nUser:"]
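The effective behavior of a config like this is a per-task lookup with a fallback: a task that has its own entry (here `generate_bot_message`) gets that model, and every other task falls back to the `main` model. A minimal sketch of that lookup logic, with hypothetical names and not the actual NeMo Guardrails internals:

```python
# Hypothetical sketch of per-task model selection for a config like the
# one above. Not the real NeMo Guardrails resolution code; the structure
# mirrors the YAML "models" list.
MODELS = [
    {"type": "main", "engine": "amazon_bedrock",
     "model_id": "anthropic.claude-instant-v1"},
    {"type": "generate_bot_message", "engine": "amazon_bedrock",
     "model_id": "anthropic.claude-v2"},
]

def model_for_task(task: str) -> dict:
    """Return the model entry for a task, falling back to 'main'."""
    by_type = {m["type"]: m for m in MODELS}
    return by_type.get(task, by_type["main"])

print(model_for_task("generate_bot_message")["model_id"])
# anthropic.claude-v2
print(model_for_task("generate_user_intent")["model_id"])
# anthropic.claude-instant-v1 (no task-specific entry, so 'main' is used)
```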
Hi, is it possible to specify a specific LLM for each task? For example, I'd like to use Claude v2 for `generate_bot_message` and Claude Instant for all other tasks.