becxer closed this issue 3 months ago
I managed to get past this by converting the .pth files to safetensors using the script from https://gist.githubusercontent.com/epicfilemcnulty/1f55fd96b08f8d4d6693293e37b4c55e/raw/3e099f23cdda9c38104de83d2108fe891f16d8ca/2safetensors.py
This was resolved by updating the transformers version to 3.50 (the version mentioned in config.json).
Hi, I'm trying to use the checkpoint with frozen. However, it fails to load the rotary embedding parameters with the following code:
model = AutoModelForCausalLM.from_pretrained("TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T")
and it keeps showing a warning message.