ghdtjr / A-LLMRec


Issue with Input Length Error during Inference #9

Closed haorongchen1015 closed 2 weeks ago

haorongchen1015 commented 2 months ago

Hello,

I have been using the Amazon Magazine_Subscriptions dataset (a smaller dataset compared to Movies_and_TV) for inference with your A-LLMRec model. However, I encountered the following error during execution:

ValueError: Input length of input_ids is 0, but max_length is set to 0. This can lead to unexpected behavior. You should consider increasing max_length or, better yet, setting max_new_tokens.

It seems the input_ids length is 0, which results in the error during the generation step. Could you provide any guidance on how to resolve this issue or adjust the model configuration accordingly?
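For context, here is a toy sketch of how this error can arise. In Hugging Face `transformers`-style generation, `max_length` caps the *total* sequence (prompt + generated tokens), while `max_new_tokens` budgets only the generated part. The function below is a hypothetical illustration of that budgeting, not the library's actual implementation:

```python
def effective_new_tokens(prompt_len, max_length=None, max_new_tokens=None):
    """Illustrative model of how a generate() call budgets output tokens.

    max_new_tokens: budget for generated tokens only.
    max_length: cap on the TOTAL length (prompt + generated), so the
    remaining budget is max_length - prompt_len.
    """
    if max_new_tokens is not None:
        return max_new_tokens
    if max_length is not None:
        return max(0, max_length - prompt_len)
    return 0

# If the prompt tokenizes to 0 tokens and max_length is also 0,
# the generation budget collapses to 0 -> the reported ValueError.
assert effective_new_tokens(prompt_len=0, max_length=0) == 0

# Passing max_new_tokens instead gives a positive budget
# regardless of the prompt length.
assert effective_new_tokens(prompt_len=0, max_new_tokens=64) == 64
```

So the error message's suggestion amounts to passing `max_new_tokens=...` to the generation call instead of (or in addition to) `max_length`, though the root cause here may also be that the prompt itself is being tokenized to an empty sequence.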

Thank you for your time and assistance.

Best regards, Haorong

haorongchen1015 commented 1 month ago

The Movies_and_TV dataset also has this issue.

betoobusy commented 1 month ago

me too!

joyjiuyi commented 1 month ago

Maybe you can solve this problem by referring to https://github.com/boheumd/MA-LMM/issues/16.