SamsungLabs / NeuralHaircut

Neural Haircut: Prior-Guided Strand-Based Hair Reconstruction. ICCV 2023

numpy.core._exceptions._ArrayMemoryError: Unable to allocate #10

Closed MilesTheProwler closed 1 year ago

MilesTheProwler commented 1 year ago

Hi, can someone help me solve this error?

I ran this command:

    python run_geometry_reconstruction.py --case person_0 --conf ./configs/example_config/neural_strands-monocular.yaml --exp_name first_stage_person_0

Error

Hello Wooden
False
upload transform {'translation': array([0.37335169, 2.34675772, 2.03221262]), 'scale': 2.368383848250081} ./implicit-hair-data/data/monocular/person_0/scale.pickle
Number of views: 66
Traceback (most recent call last):
  File "C:\Users\ADMIN\Desktop\NeuralHaircut\run_geometry_reconstruction.py", line 842, in <module>
    runner = Runner(args.conf, args.mode, args.case, args.is_continue, checkpoint_name=args.checkpoint, exp_name=args.exp_name,  train_cameras=args.train_cameras)
  File "C:\Users\ADMIN\Desktop\NeuralHaircut\run_geometry_reconstruction.py", line 72, in __init__
    self.dataset = MonocularDataset(self.conf['dataset'])
  File "C:\Users\ADMIN\Desktop\NeuralHaircut\src\models\dataset.py", line 361, in __init__
    self.orientations_np = np.stack([cv.imread(im_name) for im_name in self.orientations_lis]) / float(self.num_bins) * math.pi
numpy.core._exceptions._ArrayMemoryError: Unable to allocate 6.88 GiB for an array with shape (66, 2160, 2160, 3) and data type float64
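
For context, the size in the message follows directly from the shape and dtype: cv.imread returns uint8 pixels, but dividing by float(self.num_bins) promotes the stacked array to float64, i.e. 8 bytes per element. A quick check of the arithmetic:

    # 66 views x 2160 x 2160 x 3 channels, 8 bytes each as float64
    print(66 * 2160 * 2160 * 3 * 8 / 2**30)  # ~6.88 (GiB)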
BingEdison commented 1 year ago

Hi, I had the same error due to not enough VRAM on my GPU. Vanessa's response to that issue solved it: "In this case, during the first stage, you could decrease the memory consumption by first controlling the batch_size (https://github.com/SamsungLabs/NeuralHaircut/blob/main/configs/example_config/neural_strands-monocular.yaml#L35), then n_images_sampling (https://github.com/SamsungLabs/NeuralHaircut/blob/main/configs/example_config/neural_strands-monocular.yaml#L26C5-L26C24) and bs_sampling (https://github.com/SamsungLabs/NeuralHaircut/blob/main/configs/example_config/neural_strands-monocular.yaml#L27). I propose to set batch_size=512, n_images_sampling=16, bs_sampling=32. If it still takes more memory than you have, just reduce batch_size further until it fits."
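
For reference, those three keys could be set in configs/example_config/neural_strands-monocular.yaml roughly as below. Only the key names and values come from the quoted advice; their placement and nesting in the file is an assumption here, so match the lines linked above rather than copying this verbatim.

    # configs/example_config/neural_strands-monocular.yaml (sketch, nesting omitted)
    n_images_sampling: 16   # linked as L26 of the config
    bs_sampling: 32         # linked as L27
    batch_size: 512         # linked as L35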

MilesTheProwler commented 1 year ago

@BingEdison Thank you for your kind reply. But even though I reduced the batch size to 16, I still get that error. Am I doing something wrong? It takes 99% of my RAM. Maybe I have to change the float size to 32. Do you know how to change it?

(screenshot attached)
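
For what it's worth, a minimal sketch of the float32 change being asked about, applied to the failing line from the traceback (src/models/dataset.py, line 361). Casting each image before stacking keeps the array in float32 and halves the allocation from ~6.88 GiB to ~3.44 GiB; this particular fix is a suggestion, not something proposed by the maintainers in this thread.

    # src/models/dataset.py, around line 361 (sketch): cast each image to
    # float32 before stacking so the result never materializes as float64.
    self.orientations_np = np.stack(
        [cv.imread(im_name).astype(np.float32) for im_name in self.orientations_lis]
    ) / np.float32(self.num_bins) * np.float32(math.pi)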

Vanessik commented 1 year ago

@NaToh5 I think you would be able to run the code within your memory restrictions if you either use fewer views (images) from the monocular video or use a scene with smaller resolution (H3DS multiview data, https://github.com/CrisalixSA/h3ds).

For a fast check of the first option, I propose adding [:2] at the end of each line from 330-340 in https://github.com/SamsungLabs/NeuralHaircut/blob/main/src/models/dataset.py#L330 so that only 2 views are used.
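
Concretely, that suggestion might look like the sketch below. The exact list names and glob paths around lines 330-340 are assumptions (modeled on NeuS-style dataset loaders); only self.orientations_lis is confirmed by the traceback above.

    # src/models/dataset.py, lines ~330-340 (sketch): append [:2] to each
    # per-view file-list assignment so only the first 2 views get loaded.
    self.images_lis = sorted(glob(os.path.join(self.data_dir, 'image', '*.png')))[:2]
    self.masks_lis = sorted(glob(os.path.join(self.data_dir, 'mask', '*.png')))[:2]
    self.orientations_lis = sorted(glob(os.path.join(self.data_dir, 'orientation_maps', '*.png')))[:2]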

MilesTheProwler commented 1 year ago

Thank you for your reply @Vanessik