Canahmetozguven closed this issue 1 year ago
Hi, thanks. I'm new to this myself and learning things as I build these libraries :)
I created a new library which supports more models and has more features: https://github.com/marella/ctransformers
If you are using gpt4all-j, I highly recommend migrating to this new library. Currently it supports GPT-2, GPT-J (GPT4All-J), GPT-NeoX (StableLM), Dolly V2, and StarCoder, and I will continue to add more models.
It provides a unified interface for all models:
```python
from ctransformers import AutoModelForCausalLM

llm = AutoModelForCausalLM.from_pretrained('/path/to/ggml-gpt-2.bin', model_type='gpt2')
print(llm('AI is going to'))
```
It can be used with models hosted on the Hugging Face Hub:
```python
llm = AutoModelForCausalLM.from_pretrained('marella/gpt-2-ggml')
```
It also has built-in support for LangChain. Please see README for more details.
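For reference, a minimal sketch of the LangChain integration, assuming the `langchain` package (with its `CTransformers` wrapper) and `ctransformers` are installed, and using the same Hub model repo shown above:

```python
# Sketch only: requires `pip install langchain ctransformers`, and the model
# weights are downloaded from the Hugging Face Hub on first use.
from langchain.llms import CTransformers

llm = CTransformers(model='marella/gpt-2-ggml')
print(llm('AI is going to'))
```

See the ctransformers README for the full list of supported models and parameters.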
Hi dear author, I have been trying to do what you are doing right now, and it's difficult for me to find learning resources for this kind of stuff. Finally, your library and you are perfect. Thank you for your effort, and please make a course or suggest a place for learning this kind of stuff. Thank you in advance!