Closed: NickAnastasoff closed this issue 1 year ago
The inference code requires about 53 GB of GPU memory for the MotionGPT-13B model and about half that for the MotionGPT-7B model.
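That figure is consistent with a rough back-of-the-envelope estimate (an assumption on my part, not something stated in the repo): 13B parameters stored as 32-bit floats, ignoring activation and KV-cache overhead.

```python
# Rough memory estimate for weights only, assuming fp32 storage.
# Activation memory and caches would add to this, so treat it as a floor.
params_13b = 13e9
params_7b = 7e9
bytes_per_param = 4  # 32-bit float

gb_13b = params_13b * bytes_per_param / 1e9
gb_7b = params_7b * bytes_per_param / 1e9

print(f"13B model weights: ~{gb_13b:.0f} GB")  # ~52 GB, close to the reported 53 GB
print(f"7B model weights:  ~{gb_7b:.0f} GB")   # ~28 GB, roughly half
```

Loading the weights in fp16 would halve these numbers, which is one common way to fit such models on smaller GPUs.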
Oh wow! Thanks for the reply. I guess it will be a while before I can run this on my Mac haha
> The inference code requires about 53 GB of GPU memory for the MotionGPT-13B model and about half that for the MotionGPT-7B model.
Is there currently a way in your code to distribute inference and/or training across multiple GPUs? I don't see one, though I may have missed it. Thanks in advance for your response, and kudos on your great work! :)
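For what it's worth, one generic way to spread a model that doesn't fit on a single GPU is naive pipeline-style model parallelism: place different stages of the network on different devices and move activations between them in `forward`. This is only a sketch of the general technique, not MotionGPT's actual loading code; the `TwoStageModel` below is a hypothetical stand-in, and it falls back to CPU when two GPUs are not available.

```python
# Sketch of naive model parallelism in PyTorch (hypothetical example,
# not MotionGPT code): split a model across two devices and shuttle
# activations between them during the forward pass.
import torch
import torch.nn as nn


def pick_devices():
    """Use two GPUs if available, otherwise fall back to CPU for both stages."""
    if torch.cuda.device_count() >= 2:
        return torch.device("cuda:0"), torch.device("cuda:1")
    return torch.device("cpu"), torch.device("cpu")


class TwoStageModel(nn.Module):
    def __init__(self, dev0, dev1):
        super().__init__()
        self.dev0, self.dev1 = dev0, dev1
        self.stage0 = nn.Linear(16, 32).to(dev0)  # first half of the network
        self.stage1 = nn.Linear(32, 4).to(dev1)   # second half

    def forward(self, x):
        x = self.stage0(x.to(self.dev0))
        x = x.to(self.dev1)  # move activations to the second device
        return self.stage1(x)


dev0, dev1 = pick_devices()
model = TwoStageModel(dev0, dev1)
out = model(torch.randn(2, 16))
print(out.shape)  # torch.Size([2, 4])
```

For HuggingFace-based checkpoints, `accelerate`'s `device_map="auto"` option in `from_pretrained` automates this kind of sharding, but whether that applies here depends on how this repo loads its weights.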
Hi, I had to hack in some code to get this far, but now when I run the demo (without the render flag) I get this error.
Any ideas on what I would need to change to fix this? Thanks!