tloen / alpaca-lora

Instruct-tune LLaMA on consumer hardware
Apache License 2.0

Unable to determine this model’s pipeline type. Check the docs -- Huggingface Inference #579


JonathanBechtel commented 1 year ago

For any of these models hosted on Hugging Face, the Hosted Inference API box shows the message in the title ("Unable to determine this model's pipeline type. Check the docs").

I've looked through quite a few of these models and haven't found an exception. Does anyone know why?

Here's an example: https://huggingface.co/chrisociepa/alpaca-lora-7b-pl

Being able to run inference on one of these models via Hugging Face would be extremely helpful.

Thank you.

Best, Jonathan
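
These repos hold PEFT LoRA adapter weights rather than a full model, which is likely why the Hub widget cannot infer a pipeline type. In the meantime the adapters can still be run locally by loading a base model and applying the adapter with peft. Below is a minimal sketch, assuming the adapter was trained on the decapoda-research/llama-7b-hf base that alpaca-lora uses by default (the base model and the prompt format are assumptions, not stated in the linked repo):

```python
import torch
from transformers import LlamaForCausalLM, LlamaTokenizer
from peft import PeftModel

base_model_id = "decapoda-research/llama-7b-hf"  # assumed base model
adapter_id = "chrisociepa/alpaca-lora-7b-pl"     # adapter repo from the link above

tokenizer = LlamaTokenizer.from_pretrained(base_model_id)
model = LlamaForCausalLM.from_pretrained(
    base_model_id, torch_dtype=torch.float16, device_map="auto"
)
# Apply the LoRA adapter weights on top of the base model
model = PeftModel.from_pretrained(model, adapter_id)
model.eval()

prompt = "### Instruction:\nExplain what a LoRA adapter is.\n\n### Response:\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Separately, adding a `pipeline_tag: text-generation` entry to the adapter repo's model card metadata should make the Hub display a pipeline type, though the hosted widget still cannot run an adapter without its base model.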

JeffMboya commented 1 month ago

Facing the same issue