Closed kily-wmz closed 5 months ago
Hi,
Thanks for your interest in SongComposer. I would recommend deleting your local copy of modeling_internlm2.py and re-downloading the newest model file from Hugging Face. We implement the function at L1243 of modeling_internlm2.py.
Best, Shuangrui
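For reference, Hugging Face caches downloaded repo files in a per-model directory, and deleting that directory forces `from_pretrained` to fetch a fresh copy of the remote code. A minimal sketch for locating it, assuming the default cache layout (`models--{org}--{name}` under `~/.cache/huggingface/hub`); the helper name is my own:

```python
from pathlib import Path

def hf_model_cache_dir(repo_id: str, cache_root: str = "~/.cache/huggingface/hub") -> Path:
    """Locate the local cache directory for a Hub repo.

    Sketch assuming the default layout: models--{org}--{name}.
    """
    return Path(cache_root).expanduser() / ("models--" + repo_id.replace("/", "--"))

# Removing this directory (e.g. with shutil.rmtree) makes from_pretrained
# re-download modeling_internlm2.py instead of reusing a stale cached copy.
print(hf_model_cache_dir("Mar2Ding/songcomposer_pretrain"))
```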
Hi! I have re-downloaded modeling_internlm2.py from Hugging Face and run the code, but I am still encountering the same problem. Do you have any other suggestions that might work? Looking forward to your reply.
Thanks!
Can I know your version of transformers? I am using 4.31.0. Maybe you can try AutoModelForCausalLM from the transformers library:
from transformers import AutoTokenizer, AutoModelForCausalLM
ckpt_path = "Mar2Ding/songcomposer_pretrain"
tokenizer = AutoTokenizer.from_pretrained(ckpt_path, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(ckpt_path, trust_remote_code=True).cuda().half()
Btw, the final inference_pretrain function does not need the model argument; I have fixed the example in the model card.
Best, Shuangrui
Thank you very much! I have successfully implemented it. Looking forward to more projects and papers from you in the future.
Notably, part of the model is loaded onto the CPU and only 14 GiB are in GPU VRAM.
I still have this issue with the latest git clone (April 15), and I tried the AutoModelForCausalLM workaround:
...
AttributeError Traceback (most recent call last)
Cell In[2], line 10
8 prompt = '<bol> Total 10 lines. The first line:This is Song Composer locally producing music\n'
9 ###### Inference function would generation a three-shot answer. Find the best fit one.##########
---> 10 model.inference_pretrain(prompt, tokenizer)
File ~/.local/lib/python3.10/site-packages/torch/nn/modules/module.py:1709, in Module.__getattr__(self, name)
1707 if name in modules:
1708 return modules[name]
-> 1709 raise AttributeError(f"'{type(self).__name__}' object has no attribute '{name}'")
AttributeError: 'InternLM2ForCausalLM' object has no attribute 'inference_pretrain'
Thanks for any help.
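One way to confirm whether a stale cached file is the cause: check whether the dynamically loaded class actually defines the helper, and which file the class was loaded from. A small diagnostic sketch (the function name and return shape are my own, not from the repository):

```python
import inspect

def check_remote_method(model, method: str = "inference_pretrain"):
    """Return (defined, source_file) for the model's class.

    With trust_remote_code=True, the class is built from the cached
    modeling_internlm2.py, so if that file is stale the method is missing
    and torch.nn.Module.__getattr__ raises the AttributeError seen above.
    """
    cls = type(model)
    defined = callable(getattr(cls, method, None))
    try:
        source = inspect.getfile(cls)
    except TypeError:  # e.g. classes defined interactively
        source = "<unknown>"
    return defined, source
```

If `defined` is False, the `source` path shows which cached copy of the file needs to be replaced.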
I recommend commenting out the 'inference_pretrain' function at link and implementing the function outside of the model instead.
Best, Shuangrui
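If you go that route, a standalone version could look roughly like this. This is a hypothetical sketch, not the repository's actual implementation: it draws three samples through the model's standard generate method and keeps the candidate a scoring function rates best (here simply the longest; the real selection criterion may differ):

```python
def best_of(candidates, score):
    """Pick the highest-scoring candidate (hypothetical helper)."""
    return max(candidates, key=score)

def inference_pretrain(prompt, tokenizer, model, n_samples=3, max_new_tokens=512):
    """Standalone sketch of the helper, living outside the model class.

    Assumes a standard transformers tokenizer/model pair; the selection
    criterion (length) is a placeholder, not the repository's logic.
    """
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    candidates = []
    for _ in range(n_samples):
        # do_sample=True so the three candidates actually differ.
        ids = model.generate(**inputs, do_sample=True, max_new_tokens=max_new_tokens)
        candidates.append(tokenizer.decode(ids[0], skip_special_tokens=True))
    return best_of(candidates, score=len)
```

With the method commented out in modeling_internlm2.py as suggested, you would call this module-level function directly instead of `model.inference_pretrain(...)`.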
Hi! When I run your example code from Hugging Face as follows:
from transformers import AutoTokenizer, AutoModel
ckpt_path = "Mar2Ding/songcomposer_pretrain"
tokenizer = AutoTokenizer.from_pretrained(ckpt_path, trust_remote_code=True)
model = AutoModel.from_pretrained(ckpt_path, trust_remote_code=True).cuda().half()
prompt = ' Total 7 lines. The first line:可,,<137>,<79>|惜,<D#4>,<137>,<79>|这,,<137>,<88>|是,,<121>,<79>|属,,<121>,<79>|于,<D#4>,<214>,<88>|你,<D#4>,<141>,<79>|的,,<130>,<79>|风,,<151>,<79>|景,<A#3> ,<181><137>,<79>\n'
model.inference_pretrain(prompt, tokenizer, model)
the model encounters an error during inference; the error message is as follows: AttributeError: 'InternLM2ForCausalLM' object has no attribute 'inference_pretrain'
Have you encountered this issue before? Thanks!