Closed Mahgozar closed 1 year ago
I'm not sure whether the HF pipelines fully support LLaMA models yet. Try the medAlpaca Inferer for now:
```python
from medalpaca.inferer import Inferer

medalpaca = Inferer("medalpaca/medalpaca-7b", "prompt_templates/medalpaca.json")
response = medalpaca(input="What is Amoxicillin?")
```
This should allow prompting the model and also ensures the correct prompt template is used.
I get the following error when I try to use the sample code provided in the model card in Google Colab.

The code:

The error: