HappyGu0524 / MultiControl

MIT License

Transformers and other packages version used? #3

Closed AmazoniteSora closed 9 months ago

AmazoniteSora commented 9 months ago

During our reproduction of your code, an error occurred because the `generate` method of GPT2Model in recent transformers versions no longer accepts `past` as an argument:

ValueError: The following model_kwargs are not used by the model: ['past'] (note: typos in the generate arguments will also show up in this list)
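For context, newer transformers releases renamed GPT-2's legacy `past` generation argument to `past_key_values`, and `generate` now raises a ValueError on unrecognized kwargs. A minimal sketch of a workaround, assuming the calling code builds a kwargs dict for `model.generate` (the helper name `adapt_generate_kwargs` is hypothetical, not part of the repository):

```python
def adapt_generate_kwargs(kwargs):
    """Rename the legacy `past` keyword to `past_key_values` if present.

    Hypothetical shim for running older generation code against newer
    transformers versions, which reject unknown `generate` kwargs.
    """
    if "past" in kwargs and "past_key_values" not in kwargs:
        kwargs = dict(kwargs)  # avoid mutating the caller's dict
        kwargs["past_key_values"] = kwargs.pop("past")
    return kwargs
```

The simpler alternative, of course, is to pin the transformers version the authors used rather than patching call sites.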

Another error occurred in your code:

File "xxxxxxxx/lib/python3.7/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 848, in forward
    hidden_states = inputs_embeds + position_embeds
RuntimeError: The size of tensor a (5) must match the size of tensor b (55) at non-singleton dimension 0

Could you specify a proper environment in which to replicate your implementation, or provide fixes so the code runs with current package versions?

HappyGu0524 commented 9 months ago

We use transformers==4.17.0 as the base architecture. You can refer to priorcontrol/requirements.txt and multicontrol/requirements.txt for more details of used packages.
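A sketch of the environment setup described above (the transformers pin comes from the author's comment; the requirements paths are the files named in the repository):

```shell
# Pin the transformers version the authors report using
pip install transformers==4.17.0

# Install the remaining pinned packages for whichever setup you run
pip install -r priorcontrol/requirements.txt
pip install -r multicontrol/requirements.txt
```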

If there are still errors, please provide a more detailed description.

AmazoniteSora commented 9 months ago

> We use transformers==4.17.0 as the base architecture. You can refer to priorcontrol/requirements.txt and multicontrol/requirements.txt for more details of used packages.
>
> If there are still some errors, please provide more detailed descriptions

Thank you for your prompt and effective response. With the suggested configuration, and after downgrading protobuf, the code now runs correctly. If I encounter further serious errors, I will report them in this issue. Wishing you a pleasant day at work!