Closed lingluodlut closed 1 year ago
Unfortunately, the pipeline wrapper does not work. I've written a simple inferer, which should work, but it has its limitations, as you can see in #31.
@kbressem Can you give a working example of how to call the inferer? The one in the comments in the file doesn't work.
@kbressem I would also like to see a working example of the inferer. Thanks!
@qvks77 I think I may have found the cause: it's the tokenizer. https://github.com/huggingface/transformers/issues/22762
It doesn't feel like the proper fix, but when I swap in a working tokenizer such as huggyllama/llama-7b, I no longer get this error.
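For anyone hitting the same recursion error, here is a minimal sketch of the workaround: load the medalpaca-7b weights but pair them with the huggyllama/llama-7b tokenizer instead of the one bundled with the outdated config. The model/tokenizer names come from this thread; the prompt and generation settings below are illustrative assumptions, not an official example.

```python
# Sketch of the tokenizer-swap workaround discussed above.
# Assumes a recent transformers version with LLaMA support installed.
from transformers import AutoModelForCausalLM, LlamaTokenizer


def load_medalpaca(model_name="medalpaca/medalpaca-7b",
                   tokenizer_name="huggyllama/llama-7b"):
    """Return (model, tokenizer), with the tokenizer swapped for a
    known-working LLaMA tokenizer to avoid the recursion error."""
    tokenizer = LlamaTokenizer.from_pretrained(tokenizer_name)
    model = AutoModelForCausalLM.from_pretrained(model_name)
    return model, tokenizer


if __name__ == "__main__":
    model, tokenizer = load_medalpaca()
    # Hypothetical prompt, just to show end-to-end usage.
    prompt = "Question: What are the symptoms of anemia? Answer:"
    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=128)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Note this downloads several GB of weights on first run; the key point is only that the tokenizer and model repos differ.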
Sorry for the late response. The rapid changes in how LLaMA is implemented in HF, together with changes in the libraries we depend on, are likely the issue here.
It is very likely that the tokenizer config I used (the one from decapoda-research/llama) is outdated.
https://huggingface.co/abhipn implemented a solution that avoids this recursion error. We also plan to update the models soon, but we are currently figuring out funding, so unfortunately the repo is on hold for now.
Hi, I am trying to run the Hugging Face example for medalpaca-7b:
but I get the following errors:
My transformers version is 4.30.0. Could the error be caused by the transformers version? Thanks!