ECNU-ICALK / MELO

[AAAI 2024] MELO: Enhancing Model Editing with Neuron-indexed Dynamic LoRA

GPU Memory Requirement on Training MELO #2

Open junsiknss opened 9 months ago

junsiknss commented 9 months ago

Hello. I just read your paper. In the paper, it is mentioned that the extra parameters amount to only ~0.2% (0.12M) of the original model (T5-small: 60M) at inference time, but I couldn't find anything about memory usage when training MELO.

Is it possible to get a rough idea of how much GPU memory is required to train MELO? Or, if I'm misunderstanding the paper, please let me know.
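For context, here is a minimal sketch of how one can estimate LoRA parameter overhead on T5-small with a vanilla peft LoRA wrap. The rank and target modules below are hypothetical, and MELO's neuron-indexed dynamic LoRA blocks are organized differently, so this only illustrates how small the trainable fraction is relative to the base model, not the paper's exact 0.12M figure.

```python
# Rough estimate of LoRA parameter overhead on T5-small.
# Assumes a plain peft LoRA wrap; MELO's neuron-indexed dynamic LoRA
# differs, so treat this only as a ballpark check.
from transformers import AutoModelForSeq2SeqLM
from peft import LoraConfig, get_peft_model

base = AutoModelForSeq2SeqLM.from_pretrained("t5-small")

lora_config = LoraConfig(
    r=8,                        # hypothetical rank, not MELO's setting
    lora_alpha=16,
    target_modules=["q", "v"],  # T5 attention projections
    lora_dropout=0.0,
)
model = get_peft_model(base, lora_config)

trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
total = sum(p.numel() for p in model.parameters())
print(f"trainable: {trainable / 1e6:.2f}M / total: {total / 1e6:.2f}M "
      f"({100 * trainable / total:.2f}%)")
```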

Thanks.

BruthYU commented 8 months ago

All experiments can be conducted on a single RTX 3090. I'll release some running logs that record the correct training and inference processes.
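If you want to verify the footprint on your own GPU, a minimal, generic PyTorch snippet to record peak memory around a training run might look like the following. This is plain CUDA instrumentation, not part of the MELO codebase, and `trainer.train` in the usage comment is a hypothetical entry point.

```python
# Minimal peak-GPU-memory check around a training run.
# Assumes PyTorch with CUDA available; generic instrumentation only.
import torch

def report_peak_memory(train_fn):
    """Run train_fn() and print the peak GPU memory it allocated."""
    torch.cuda.reset_peak_memory_stats()
    train_fn()
    peak_gib = torch.cuda.max_memory_allocated() / 1024 ** 3
    print(f"peak GPU memory: {peak_gib:.2f} GiB")

# Usage (hypothetical): report_peak_memory(lambda: trainer.train())
```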

BruthYU commented 8 months ago

The logs can be downloaded from Google Drive. Thanks for your attention, and please feel free to contact us whenever you have other questions.