Closed: mrseanryan closed this issue 1 month ago
Probably obsolete with #8
Use a pipeline as a high-level helper
from transformers import pipeline
pipe = pipeline("text-generation", model="TheBloke/Llama-2-13B-chat-GGML")
---
Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("TheBloke/Llama-2-13B-chat-GGML")
ref https://huggingface.co/TheBloke/Llama-2-13B-chat-GGML
---
https://github.com/huggingface/transformers
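Note that the snippets above will not work as written: GGML files are llama.cpp-format quantized weights, and the transformers library does not load the GGML format, which is likely why this issue was closed. A minimal sketch of an alternative, assuming the third-party ctransformers library (`pip install ctransformers`); the `model_file` name is a guess at one of the quantized files in the repo, and `llama2_chat_prompt` is a hypothetical helper reflecting the Llama-2 chat template described on the model card:

```python
def llama2_chat_prompt(user_message: str,
                       system: str = "You are a helpful assistant.") -> str:
    """Wrap a user message in the Llama-2 chat template
    ([INST] <<SYS>> ... <</SYS>> ... [/INST])."""
    return f"[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{user_message} [/INST]"

# Sketch of loading the GGML weights via ctransformers (downloads several GB,
# so it is left commented out; model_file is an assumed quant choice):
#
# from ctransformers import AutoModelForCausalLM
# llm = AutoModelForCausalLM.from_pretrained(
#     "TheBloke/Llama-2-13B-chat-GGML",
#     model_file="llama-2-13b-chat.ggmlv3.q4_0.bin",
#     model_type="llama",
# )
# print(llm(llama2_chat_prompt("What is GGML?")))

print(llama2_chat_prompt("What is GGML?"))
```

The helper only builds the prompt string, so it can be exercised without downloading the model.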