meta-llama / llama-stack-client-python

Python SDK for Llama Stack
Apache License 2.0

Can't run inference examples #18

Closed heyjustinai closed 2 weeks ago

heyjustinai commented 2 weeks ago

I get the error below whenever I try to run the inference example.

    raise ValueError(
ValueError: Model Llama3.2-11B-Vision-Instruct not served by any of the providers: meta-reference-0, meta-reference-1. Make sure there is an Inference provider serving this model.
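For context, the error comes from a routing step that maps the requested model ID to a configured inference provider. The sketch below is an illustration of that kind of check, not the actual Llama Stack source; the provider names come from the error message, and the model lists are assumed for the example.

```python
# Illustrative sketch of a model-to-provider routing check that raises
# the error seen above. NOT the actual Llama Stack implementation.

def resolve_provider(model: str, providers: dict[str, list[str]]) -> str:
    """Return the ID of the first provider serving `model`, or raise ValueError."""
    for provider_id, served_models in providers.items():
        if model in served_models:
            return provider_id
    raise ValueError(
        f"Model {model} not served by any of the providers: "
        f"{', '.join(providers)}. "
        "Make sure there is an Inference provider serving this model."
    )

# Hypothetical setup mirroring the report: two meta-reference providers,
# neither configured to serve the requested vision model.
providers = {
    "meta-reference-0": ["Llama3.1-8B-Instruct"],
    "meta-reference-1": ["Llama3.1-70B-Instruct"],
}
```

Under this reading, the fix is a configuration change: the run configuration must list `Llama3.2-11B-Vision-Instruct` under one of the inference providers (or the request must use a model that is actually configured).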