kmeng01 / memit

Mass-editing thousands of facts into a transformer memory (ICLR 2023)
https://memit.baulab.info
MIT License

GPU not big enough? I'm using A5500 24GB RAM #15

Open imessien opened 8 months ago

imessien commented 8 months ago

The paper uses an A6000 GPU with 48GB of RAM, but my workstation has 4 A5500 GPUs with 24GB of RAM each. Can I use the method suggested in the paper by separating model editing from model running? Or is there a way to run it in parallel across my GPUs? My current idea is to use a library called transformer-utils, which works with a smaller model. I'm getting an out-of-memory message when running the model editing.
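One common way to fit a model that exceeds a single 24 GB card is to shard its layers across all four GPUs with Hugging Face's `device_map` mechanism. The sketch below is only an illustration, not part of MEMIT itself: `make_device_map` is a hypothetical helper that spreads GPT-J-style transformer blocks evenly over the available GPUs, and the layer names assume the Hugging Face GPT-J module naming (`transformer.h.{i}`, `transformer.wte`, `lm_head`); other architectures use different names.

```python
def make_device_map(n_layers: int = 28, n_gpus: int = 4) -> dict:
    """Assign each transformer block to a GPU index, splitting evenly.

    Embeddings go on GPU 0 (where the input ids arrive) and the final
    layer norm / LM head on the last GPU (where the last hidden state
    lives), so activations flow forward through the GPUs in order.
    """
    device_map = {
        "transformer.wte": 0,            # token embeddings
        "transformer.ln_f": n_gpus - 1,  # final layer norm
        "lm_head": n_gpus - 1,           # output projection
    }
    per_gpu = -(-n_layers // n_gpus)  # ceiling division
    for layer in range(n_layers):
        device_map[f"transformer.h.{layer}"] = layer // per_gpu
    return device_map


if __name__ == "__main__":
    dm = make_device_map()
    print(dm["transformer.h.0"], dm["transformer.h.27"], dm["lm_head"])
```

You could then pass the result to `AutoModelForCausalLM.from_pretrained(..., device_map=make_device_map(), torch_dtype=torch.float16)`, or simply try `device_map="auto"` and let Accelerate balance the layers for you. Note this helps with inference memory; the editing step's gradient/covariance computations may still need extra headroom.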