Closed · aamir-gmail closed this issue 2 years ago
When I ran the following code with the latest version of Hugging Face's Transformers library from PyPI (4.10.3), I got an exception:
from transformers import AutoModelForCausalLM
model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-j-6B")
So, although the model is available on their model hub, I don't think it's compatible with the current PyPI release of Transformers, and thus it cannot be used with Happy Transformer. I expect GPT-J-6B to work with Happy Transformer as soon as they make it compatible with their package.
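For anyone hitting the same error, here is a rough sketch of the direct Transformers route once a release that includes GPT-J lands on PyPI (the upgrade command and the tokenizer step are my assumptions about the usual workflow, not tied to a specific version):
# First upgrade to a Transformers release that includes GPT-J support:
#   pip install --upgrade transformers
# Note: loading the full 6B checkpoint needs a lot of RAM/VRAM.
from transformers import AutoModelForCausalLM, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-j-6B")
model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-j-6B")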
Thank you, Eric. I encountered the same problem and thought I must be doing something wrong.
I just saw a tweet from Hugging Face that GPT-J 6B is now supported (https://twitter.com/huggingface/status/1443246197779664903?s=09). What can be done from your end to support this?
Thanks for letting me know! I'll check it out tomorrow. At first glance, Happy Transformer should support it, provided your system satisfies the heavy hardware requirements. But I'll test this shortly.
Thank you.
I got it working on an A6000 notebook instance offered by Lambda Labs. You'll have to restart the kernel after pip installing Happy Transformer.
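Here is a minimal sketch of the flow that worked for me (the "GPT-J" model-type string and the generation settings below are assumptions on my part rather than documented defaults, so adjust as needed):
# pip install happytransformer   # then restart the kernel before running this
# The 6B checkpoint needs a large GPU (it ran on an A6000 with 48 GB of VRAM).
from happytransformer import HappyGeneration, GENSettings
happy_gen = HappyGeneration("GPT-J", "EleutherAI/gpt-j-6B")
args = GENSettings(max_length=50, do_sample=True, top_k=50, temperature=0.7)
result = happy_gen.generate_text("Artificial intelligence is", args=args)
print(result.text)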
Please see whether you can provide support for GPT-J-6B (six billion parameters) in text generation.