JonathanBechtel opened this issue 1 year ago
For any of the hosted models on Hugging Face, the Hosted Inference API widget only shows a message in the box instead of running inference.
I've looked through quite a few examples and have not found an exception. Does anyone know why?
Here's an example: https://huggingface.co/chrisociepa/alpaca-lora-7b-pl
Being able to use one of these models via Huggingface for inference would be extremely helpful.
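For context, this is roughly how I'd expect to call one of these models through the Hosted Inference API (just a sketch — the token and prompt are placeholders, and this is the kind of request that currently returns the message instead of a generation):

```python
import requests

# Standard Hosted Inference API endpoint for a model repo
API_URL = "https://api-inference.huggingface.co/models/chrisociepa/alpaca-lora-7b-pl"
headers = {"Authorization": "Bearer hf_xxx"}  # placeholder: use your own HF token

def query(payload):
    response = requests.post(API_URL, headers=headers, json=payload)
    return response.json()

# Example prompt; any text-generation style input would do
print(query({"inputs": "Write a short poem about spring."}))
```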
Thank you.
Best, Jonathan
Facing the same issue