Open songsh opened 6 months ago
Hi @songsh,
It appears there was an interruption while loading the model. Could you please try to load the model normally using the following code?
```python
from transformers import AutoConfig, AutoModelForCausalLM, AutoTokenizer

model_name = "NousResearch/Llama-2-7b-hf"
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype="auto",
    device_map="cuda",
    ignore_mismatched_sizes=True,
)
```
Can the model support Chinese? When I use Chinese input, the answer contains garbled text.
Hi @songsh, this issue does currently occur (#4). We plan to fix it in the future. For now, we recommend using a small Chinese language model, such as Skyword.
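As a side note, garbled Chinese output is sometimes a text-encoding problem on the terminal or in post-processing rather than the model itself. A minimal, model-independent sketch of how mojibake arises (decoding UTF-8 bytes with the wrong codec; the strings here are just an illustration, not output from this model):

```python
# Correct Chinese text, encoded as UTF-8 bytes.
text = "你好"
raw = text.encode("utf-8")

# Decoding those bytes with the wrong codec produces garbled characters.
garbled = raw.decode("latin-1")
print(garbled)  # mojibake, not real Chinese

# The damage is reversible if the bytes themselves are intact.
recovered = garbled.encode("latin-1").decode("utf-8")
print(recovered)
```

If your garbled output can be round-tripped back to readable Chinese this way, the problem is in your console or pipeline encoding; if not, it is the model genuinely failing on Chinese, as described above.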
I run it locally, and the error is: `Loading checkpoint shards:`