Open zhanpengjie opened 1 month ago
@zhanpengjie which local model you are trying to use? Does that have vision capability?
I also have a problem using Llama-3.2-90B-Vision with vLLM. The error says an environment variable is missing.
@torrischen refer the model and api_base params here and pass it accordingly in zerox: https://docs.litellm.ai/docs/providers/vllm
Also refer #65
Thanks, that's helpful.
How do I set base_url and model in the Python SDK?
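A minimal sketch, assuming the `py-zerox` package is installed and that zerox forwards extra keyword arguments (such as `api_base`) to litellm, as the vLLM provider docs linked above describe. The model name, port, and file path below are placeholders for illustration:

```python
import asyncio
from pyzerox import zerox  # assumes py-zerox is installed

async def main():
    # model uses litellm's provider prefix for vLLM's OpenAI-compatible server;
    # api_base points at your local vLLM endpoint (placeholder values)
    result = await zerox(
        file_path="path/to/document.pdf",
        model="hosted_vllm/meta-llama/Llama-3.2-90B-Vision-Instruct",
        api_base="http://localhost:8000/v1",
    )
    print(result)

asyncio.run(main())
```

If zerox does not forward `api_base` directly in your version, litellm also reads it from the `HOSTED_VLLM_API_BASE` environment variable, per the linked provider docs.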