jiaweizzhao / GaLore

GaLore: Memory-Efficient LLM Training by Gradient Low-Rank Projection
Apache License 2.0

ValueError: some parameters appear in more than one parameter group #41

Open jiaohuix opened 6 months ago

jiaohuix commented 6 months ago

I encountered the following error. How should I resolve it?

```
[WARNING|trainer.py:1272] 2024-04-27 12:04:25,428 >> Activated GaLoRE fine-tuning, depending on your model size and hardware, the training might take a while before starting. Please be patient !
/home/jiahui/anaconda3/envs/llm/lib/python3.10/site-packages/galore_torch/adamw.py:48: FutureWarning: This implementation of AdamW is deprecated and will be removed in a future version. Use the PyTorch implementation torch.optim.AdamW instead, or set `no_deprecation_warning=True` to disable this warning
  warnings.warn(
Traceback (most recent call last):
  File "/home/jiahui/workspace/nmt/thesis_nmt/mnmt/multi/scripts/run_translation.py", line 618, in <module>
    main()
  File "/home/jiahui/workspace/nmt/thesis_nmt/mnmt/multi/scripts/run_translation.py", line 534, in main
    train_result = trainer.train(resume_from_checkpoint=checkpoint)
  File "/home/jiahui/workspace/nmt/thesis_nmt/transformers/src/transformers/trainer.py", line 1848, in train
    return inner_training_loop(
  File "/home/jiahui/workspace/nmt/thesis_nmt/transformers/src/transformers/trainer.py", line 1949, in _inner_training_loop
    self.create_optimizer_and_scheduler(num_training_steps=max_steps)
  File "/home/jiahui/workspace/nmt/thesis_nmt/transformers/src/transformers/trainer.py", line 981, in create_optimizer_and_scheduler
    self.create_optimizer()
  File "/home/jiahui/workspace/nmt/thesis_nmt/transformers/src/transformers/trainer.py", line 1038, in create_optimizer
    self.optimizer = optimizer_cls(optimizer_grouped_parameters, **optimizer_kwargs)
  File "/home/jiahui/anaconda3/envs/llm/lib/python3.10/site-packages/galore_torch/adamw.py", line 64, in __init__
    super().__init__(params, defaults)
  File "/home/jiahui/anaconda3/envs/llm/lib/python3.10/site-packages/torch/optim/optimizer.py", line 192, in __init__
    self.add_param_group(param_group)
  File "/home/jiahui/anaconda3/envs/llm/lib/python3.10/site-packages/torch/optim/optimizer.py", line 535, in add_param_group
    raise ValueError("some parameters appear in more than one parameter group")
ValueError: some parameters appear in more than one parameter group
```
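For context: PyTorch's `Optimizer.add_param_group` raises this error whenever the same tensor object appears in two parameter groups. Below is a minimal sketch (not taken from the issue's `run_translation.py`; the variable names are illustrative) of how this can happen when the GaLore-targeted parameters are not excluded from the regular group, and one way to deduplicate by object identity:

```python
import torch
from torch import nn

model = nn.Linear(10, 10)

# Suppose 2-D weight matrices are routed to a GaLore-specific group.
galore_params = [model.weight]
regular_params = list(model.parameters())  # still contains model.weight!

try:
    torch.optim.AdamW([
        {"params": regular_params, "lr": 1e-4},
        {"params": galore_params, "lr": 1e-4},
    ])
except ValueError as e:
    print(e)  # some parameters appear in more than one parameter group

# One way to remove the overlap: filter GaLore-targeted tensors out of the
# regular group by object identity before constructing the optimizer.
galore_ids = {id(p) for p in galore_params}
regular_params = [p for p in model.parameters() if id(p) not in galore_ids]
optimizer = torch.optim.AdamW([
    {"params": regular_params, "lr": 1e-4},
    {"params": galore_params, "lr": 1e-4},
])
```

In this trace the groups are built inside `transformers`' `Trainer.create_optimizer`, so the overlap presumably comes from how the trainer partitions the model's parameters into GaLore and non-GaLore groups, rather than from user code.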