Closed by kennethleungty 1 year ago
It works without any changes. The model architecture might be the same as LLaMA 1:
```python
from ctransformers import AutoModelForCausalLM

llm = AutoModelForCausalLM.from_pretrained("TheBloke/Llama-2-7B-GGML", model_type="llama")
print(llm("AI is going to"))
```
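For anyone who wants a bit more control over generation, here is a rough sketch using the usual ctransformers keyword arguments (`max_new_tokens`, `temperature`, `stream`); treat the parameter values as placeholders, not a recommendation:

```python
from ctransformers import AutoModelForCausalLM

# Load the GGML weights from the Hugging Face Hub; model_type="llama" works
# because Llama 2 uses the same architecture as LLaMA 1.
llm = AutoModelForCausalLM.from_pretrained("TheBloke/Llama-2-7B-GGML", model_type="llama")

# Stream tokens as they are generated; max_new_tokens and temperature
# values here are illustrative only.
for token in llm("AI is going to", max_new_tokens=64, temperature=0.8, stream=True):
    print(token, end="", flush=True)
```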
Thanks!
The GGML version from TheBloke is coming soon: https://huggingface.co/TheBloke/Llama-2-7B-GGML
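Once the quantized files land in that repo, you should be able to pick a specific quantization with the `model_file` argument. The filename below is only a guess based on TheBloke's usual naming and may differ from what actually gets uploaded:

```python
from ctransformers import AutoModelForCausalLM

# Select one quantization from the repo via model_file; the filename is an
# assumption and should be checked against the files in the repo.
llm = AutoModelForCausalLM.from_pretrained(
    "TheBloke/Llama-2-7B-GGML",
    model_type="llama",
    model_file="llama-2-7b.ggmlv3.q4_0.bin",  # assumed filename
)
print(llm("AI is going to"))
```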
Thanks!