AI4Finance-Foundation / FinGPT

FinGPT: Open-Source Financial Large Language Models! Revolutionize 🔥 We release the trained model on HuggingFace.
https://ai4finance.org
MIT License

trying to run forecaster and I get this error: 'base_model.model.model.model.embed_tokens' #186

Open · rumcode opened 3 months ago

rumcode commented 3 months ago

I slightly modified the code that I copied from the forecaster page, and I ran into an error. Any suggestions? Thanks in advance. The code is:

"""
from datasets import load_dataset
from transformers import AutoTokenizer, AutoModelForCausalLM
from peft import PeftModel
import torch

base_model = AutoModelForCausalLM.from_pretrained(
    'meta-llama/Llama-2-7b-chat-hf',
    trust_remote_code=True,
    device_map="auto",
    torch_dtype=torch.float16,
    token='mytoken # optional if you have enough VRAM
)

tokenizer = AutoTokenizer.from_pretrained('meta-llama/Llama-2-7b-chat-hf',token='mytoken')
print("hi")
model = PeftModel.from_pretrained(base_model, 'FinGPT/fingpt-forecaster_dow30_llama2-7b_lora',token='mytoken')
print("hi2")
model = model.eval()

The error messages are:


```
C:\Users\xx\AppData\Roaming\Python\Python311\site-packages\torch\nn\modules\module.py:2047: UserWarning: for base_model.model.model.layers.31.mlp.down_proj.lora_B.default.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
  warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
Traceback (most recent call last):

  File c:\ProgramData\Anaconda3\Lib\site-packages\spyder_kernels\py3compat.py:356 in compat_exec
    exec(code, globals, locals)

  File c:\users\rruffley 3677\downloads\fingpt20240715try2.py:23
    model = PeftModel.from_pretrained(base_model, 'FinGPT/fingpt-forecaster_dow30_llama2-7b_lora',token='mytoken')

  File ~\AppData\Roaming\Python\Python311\site-packages\peft\peft_model.py:430 in from_pretrained
    model.load_adapter(model_id, adapter_name, is_trainable=is_trainable, **kwargs)

  File ~\AppData\Roaming\Python\Python311\site-packages\peft\peft_model.py:1022 in load_adapter
    self._update_offload(offload_index, adapters_weights)

  File ~\AppData\Roaming\Python\Python311\site-packages\peft\peft_model.py:908 in _update_offload
    safe_module = dict(self.named_modules())[extended_prefix]

KeyError: 'base_model.model.model.model.embed_tokens'
```
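The key that PEFT looks up here is one `model.` deeper than any module the wrapped model actually registers: a `PeftModel` around `LlamaForCausalLM` exposes `base_model.model.model.embed_tokens` (wrapper → causal LM → inner `LlamaModel`), while the offload path in `_update_offload` builds the prefix `base_model.model.model.model.embed_tokens`. A stdlib-only toy mimic of `torch.nn.Module.named_modules()` (the `Node` class below is hypothetical, not part of PEFT) illustrates the mismatch:

```python
# Toy stand-in for named_modules(): each Node yields its own dotted name,
# then recurses into its children, exactly like PyTorch's key layout.
class Node:
    def __init__(self, **children):
        self.children = children

    def named_modules(self, prefix=""):
        yield prefix, self
        for name, child in self.children.items():
            child_prefix = f"{prefix}.{name}" if prefix else name
            yield from child.named_modules(child_prefix)

# Nesting that mirrors PeftModel(LlamaForCausalLM):
#   base_model -> (LoRA wrapper).model -> (causal LM).model -> embed_tokens
embed_tokens = Node()
llama_model = Node(embed_tokens=embed_tokens)
causal_lm = Node(model=llama_model)
lora_wrapper = Node(model=causal_lm)
peft_model = Node(base_model=lora_wrapper)

keys = {name for name, _ in peft_model.named_modules()}
print('base_model.model.model.embed_tokens' in keys)        # True: real key
print('base_model.model.model.model.embed_tokens' in keys)  # False: PEFT's lookup
```

The second lookup is exactly the `KeyError` string in the traceback, which suggests the failure is a prefix-construction issue in PEFT's offload handling rather than a problem with the checkpoint itself.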
pratikm778 commented 1 week ago

Same error. Did you find any workarounds?
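One thing worth trying (a sketch, untested against this exact setup): the failing frame is PEFT's `_update_offload`, which only matters when `device_map="auto"` has offloaded part of the base model to CPU or disk. If you have enough VRAM, keeping the entire model on a single device should avoid the offload index altogether. `{"": 0}` is the Transformers/Accelerate shorthand for "place every module on GPU 0". The `from_pretrained` call is left commented out here because it downloads the full ~13 GB fp16 checkpoint; the model ID and `token` value are the ones from the original snippet.

```python
# Assumption: the KeyError is hit only on the adapter-offload path, so keeping
# the whole base model on one device should sidestep it.
device_map = {"": 0}  # everything on GPU 0; use {"": "cpu"} if no GPU (slow)

# base_model = AutoModelForCausalLM.from_pretrained(
#     'meta-llama/Llama-2-7b-chat-hf',
#     trust_remote_code=True,
#     device_map=device_map,
#     torch_dtype=torch.float16,
#     token='mytoken',
# )
```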