gafniguy / 4D-Facial-Avatars

Dynamic Neural Radiance Fields for Monocular 4D Facial Avatar Reconstruction

can't allocate memory: you tried to allocate 20484980736 bytes #23

Closed suyuan945 closed 2 years ago

suyuan945 commented 2 years ago


This error occurred when I used the dataset you provided. What hardware environment does the code require? My PC: GPU: 2080 Ti (16 GB), CPU memory: 32 GB.

gafniguy commented 2 years ago

This is most likely an out-of-memory issue, since the script tries to load all of the training data into RAM; I was typically allocating around 70 GB.
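For a sense of scale, the failed 20,484,980,736-byte allocation is exactly what stacking the whole image set into a single float32 array would cost. Assuming 512×512 RGBA frames (an assumption about this dataset, not something stated in the thread), that corresponds to 4884 frames:

```python
# Back-of-envelope check (assumed layout: 512x512 RGBA frames, float32).
frames, height, width, channels, bytes_per_float32 = 4884, 512, 512, 4, 4
total = frames * height * width * channels * bytes_per_float32
print(total)  # 20484980736 bytes, i.e. the ~20.5 GB from the error message
```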

You can bypass it by loading one sample at a time instead of loading everything at the start, in a fashion more similar to traditional DL pipelines on large datasets.

Here is some code to replace the training, eval, and loading scripts (not thoroughly tested): https://gist.github.com/gafniguy/5a66f471ad227d022aed96944432adea
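In case the gist link rots, the idea is roughly a torch `Dataset` that parses the metadata JSON up front and defers all image I/O to `__getitem__`. The following is a minimal sketch under assumptions about the NeRFace data layout (a `transforms_{split}.json` with per-frame `file_path`, `transform_matrix`, and `expression` entries); it is not the gist's actual code.

```python
import json
import os

import imageio
import numpy as np
import torch
from torch.utils.data import Dataset


class LazyFaceDataset(Dataset):
    """Parses the metadata JSON once at init; loads each image only on demand,
    so peak RAM stays at one sample instead of the whole training set."""

    def __init__(self, basedir, split="train"):
        with open(os.path.join(basedir, f"transforms_{split}.json")) as f:
            meta = json.load(f)
        self.basedir = basedir
        self.frames = meta["frames"]  # assumed: list of per-frame dicts

    def __len__(self):
        return len(self.frames)

    def __getitem__(self, idx):
        frame = self.frames[idx]
        # Image I/O happens here, not in __init__.
        img_path = os.path.join(self.basedir, frame["file_path"] + ".png")
        img = imageio.imread(img_path).astype(np.float32) / 255.0
        pose = torch.tensor(frame["transform_matrix"], dtype=torch.float32)
        expression = torch.tensor(frame["expression"], dtype=torch.float32)
        return torch.from_numpy(img), pose, expression
```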

suyuan945 commented 2 years ago

Thank you for your reply. I'll try it.

e4s2022 commented 2 years ago

I found that data loading takes ~1 hr on my Linux machine. The available RAM is 80 GB. I tried reducing the number of images in the training and test sets, but unfortunately the error still occurs. Any ideas?

gafniguy commented 2 years ago

Did you try the torch dataset class I attached here? Instead of hogging everything in RAM, it loads the images one at a time. It should be quite quick to start (it just reads the JSONs, not the images).
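For anyone else landing here, wiring a lazy dataset like the sketch above into a standard `DataLoader` might look like this (illustrative only; `LazyFaceDataset` and the path are hypothetical, not necessarily the gist's interface):

```python
from torch.utils.data import DataLoader

# Startup only parses the JSON, so training begins in seconds rather than ~1 hr.
dataset = LazyFaceDataset("path/to/dataset", split="train")
loader = DataLoader(dataset, batch_size=1, shuffle=True, num_workers=4)

for img, pose, expression in loader:
    pass  # sample rays from this single frame and run one training step
```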


e4s2022 commented 2 years ago

Will try it later. Thank you.