AbanteAI / rawdog

Generate and auto-execute Python scripts in the cli

litellm error when model not listed? #28

Closed: iplayfast closed this issue 7 months ago

iplayfast commented 8 months ago

A strange one. I'm using ollama to run a local model. If config.yaml is:

llm_api_key: no need
llm_base_url: http://localhost:11434
llm_custom_provider: null
llm_model: ollama/mistral

rawdog works perfectly. But when I update config.yaml to:

llm_api_key: no need
llm_base_url: http://localhost:11434
llm_custom_provider: null
llm_model: ollama/mixtral

so that it uses the mixtral model, rawdog always gives back an error:

What can I do for you? (Ctrl-C to exit)
> list files    

Error:
 {'model': 'ollama/mixtral', 'prompt': 'PROMPT: list files', 'response': " ```python\nimport os\n\nfiles = [f for f in os.listdir('.') if os.path.isfile(f)]\nfor file in files:\n    print(file)\n```", 'cost': None, 'error': 'Model not in model_prices_and_context_window.json. You passed model=ollama/mixtral\n'}
Error: Execution error: Model not in model_prices_and_context_window.json. You passed model=ollama/mixtral

What can I do for you? (Ctrl-C to exit)

This error seems to originate from litellm's https://github.com/BerriAI/litellm/blob/main/model_prices_and_context_window.json, where indeed mixtral isn't listed and mistral is.

Is it possible to get around this?
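
For anyone else debugging this: the completion itself clearly succeeds (the generated script comes back in the error dict), so the failure looks like it happens afterwards, in litellm's cost lookup. Here is a minimal sketch of that failure mode, assuming the `cost` field is computed with litellm's `completion_cost` helper; I haven't checked rawdog's source, so that part is a guess:

```python
# Hypothetical reproduction, not rawdog's actual code.
# Assumes the 'cost' field comes from litellm.completion_cost(), which needs
# the model to have an entry in model_prices_and_context_window.json.
import litellm

response = litellm.completion(
    model="ollama/mixtral",
    api_base="http://localhost:11434",
    messages=[{"role": "user", "content": "list files"}],
)
print(response.choices[0].message.content)  # the generated script comes back fine

# This lookup is what can raise "Model not in model_prices_and_context_window.json"
# for models litellm has no pricing data for:
cost = litellm.completion_cost(completion_response=response)
```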

iplayfast commented 8 months ago

Found the secret sauce. Updated config.yaml:

llm_api_key: no need
llm_base_url: http://localhost:11434
llm_custom_provider: ollama
llm_model: mixtral
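
In litellm terms, I believe this config now maps to something like the call below (the kwarg names are litellm's public `completion` parameters; exactly how rawdog forwards the config is my assumption):

```python
# Sketch of how the fixed config presumably reaches litellm; not rawdog's actual code.
import litellm

response = litellm.completion(
    model="mixtral",                    # llm_model
    custom_llm_provider="ollama",       # llm_custom_provider routes the call to ollama
    api_base="http://localhost:11434",  # llm_base_url
    api_key="no need",                  # llm_api_key (ollama doesn't check it)
    messages=[{"role": "user", "content": "list files"}],
)
```

Note that the provider moves out of the model string and into `llm_custom_provider`, and the model name loses its `ollama/` prefix.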

Please update the readme to show how to use models that litellm doesn't know about.

kvaky commented 8 months ago

I think the config documentation could be better too. It seems the maintainers are open to useful PRs, so feel free to create one.

jakethekoenig commented 7 months ago

I updated the readme. Thanks for reporting!