DekuLiuTesla / CityGaussian

[ECCV2024] CityGaussian: Real-time High-quality Large-Scale Scene Rendering with Gaussians
https://dekuliutesla.github.io/citygs/

Errors in UrbanScene3D examples, Need help. #22

Open GeoVectorMatrix opened 2 weeks ago

GeoVectorMatrix commented 2 weeks ago

I have prepared the data as illustrated and run `bash scripts/run_citygs.sh`. It outputs:
```
GPU 0 is available.
Optimizing Output folder: ./output/residence_coarse [29/08 18:36:12]
Reading camera 1007/2561
Traceback (most recent call last):
  File "CityGaussian/train_large.py", line 309, in <module>
    training(lp, op, pp, args.test_iterations, args.save_iterations, args.refilter_iterations, args.checkpoint_iterations, args.start_checkpoint, args.debug_from)
  File "CityGaussian/train_large.py", line 43, in training
    scene = LargeScene(dataset, gaussians)
  File "CityGaussian/scene/__init__.py", line 131, in __init__
  File "CityGaussian/scene/dataset_readers.py", line 145, in readColmapSceneInfo
  File "CityGaussian/scene/dataset_readers.py", line 99, in readColmapCameras
  File "anaconda3/envs/citygs/lib/python3.9/site-packages/PIL/Image.py", line 3431, in open
OSError: [Errno 24] Too many open files: 'CityGaussian/data/urban_scene_3d/residence-pixsfm/train/images/000148.JPG'
Error in atexit._run_exitfuncs:
Traceback (most recent call last):
  File "anaconda3/envs/citygs/lib/python3.9/shutil.py", line 727, in rmtree
OSError: [Errno 24] Too many open files: '/tmp/tmpy4j1ld6pwandb-media'
Error in atexit._run_exitfuncs:
Traceback (most recent call last):
  File "anaconda3/envs/citygs/lib/python3.9/shutil.py", line 727, in rmtree
OSError: [Errno 24] Too many open files: '/tmp/tmp6j9c48awwandb-artifacts'
Traceback (most recent call last):
  File "anaconda3/envs/citygs/lib/python3.9/weakref.py", line 667, in _exitfunc
  File "anaconda3/envs/citygs/lib/python3.9/weakref.py", line 591, in __call__
  File "anaconda3/envs/citygs/lib/python3.9/tempfile.py", line 829, in _cleanup
  File "anaconda3/envs/citygs/lib/python3.9/tempfile.py", line 825, in _rmtree
  File "anaconda3/envs/citygs/lib/python3.9/shutil.py", line 730, in rmtree
  File "anaconda3/envs/citygs/lib/python3.9/shutil.py", line 727, in rmtree
OSError: [Errno 24] Too many open files: '/tmp/tmpuixxgyq1'

GPU 0 is available.
Output folder: ./output/residence_c20_r4 [29/08 18:36:20]
Reading camera 1008/2561
Traceback (most recent call last):
  File "/CityGaussian/data_partition.py", line 151, in <module>
  File "/CityGaussian/scene/__init__.py", line 131, in __init__
  File "/CityGaussian/scene/dataset_readers.py", line 145, in readColmapSceneInfo
  File "/CityGaussian/scene/dataset_readers.py", line 99, in readColmapCameras
  File "anaconda3/envs/citygs/lib/python3.9/site-packages/PIL/Image.py", line 3431, in open
OSError: [Errno 24] Too many open files: '/CityGaussian/data/urban_scene_3d/residence-pixsfm/train/images/000484.JPG'

GPU 0 is available.
Starting training block '0'
Optimizing Output folder: ./output/residence_c20_r4/cells/cell0 [29/08 18:36:30]
Traceback (most recent call last):
  File "/CityGaussian/train_large.py", line 309, in <module>
    training(lp, op, pp, args.test_iterations, args.save_iterations, args.refilter_iterations, args.checkpoint_iterations, args.start_checkpoint, args.debug_from)
  File "/CityGaussian/train_large.py", line 43, in training
    scene = LargeScene(dataset, gaussians)
  File "/CityGaussian/scene/__init__.py", line 123, in __init__
    partition = np.load(os.path.join(args.source_path, "data_partitions", f"{args.partition_name}.npy"))[:, args.block_id]
  File "anaconda3/envs/citygs/lib/python3.9/site-packages/numpy/lib/npyio.py", line 427, in load
    fid = stack.enter_context(open(os_fspath(file), "rb"))
FileNotFoundError: [Errno 2] No such file or directory: 'data/urban_scene_3d/residence-pixsfm/train/data_partitions/residence_c20_r4.npy'
```
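For what it's worth, the first failure happens inside `PIL.Image.open`, and `[Errno 24]` (EMFILE) usually means the process has exhausted its file-descriptor limit rather than that a single file is broken. `Image.open` is lazy: it keeps the underlying file descriptor open until the image is fully decoded or explicitly closed, so a loop that opens thousands of images can leak descriptors. The following is only a hypothetical sketch of that pattern, not the repo's actual loader code:

```python
import numpy as np
from PIL import Image

def load_image_leaky(path):
    # Image.open() is lazy: the file descriptor stays open until the
    # Image object is decoded, closed, or garbage-collected.  Doing this
    # ~1000+ times can exceed the default per-process limit (often 1024).
    return Image.open(path)

def load_image_safe(path):
    # Decoding eagerly inside a context manager releases the descriptor
    # immediately, so only one file is ever open at a time.
    with Image.open(path) as img:
        return np.asarray(img.convert("RGB"))
```

The FileNotFoundError in the second run looks like a downstream effect: the partition step died on the same EMFILE error before it could write `residence_c20_r4.npy`.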

Could you please give me some advice?

DekuLiuTesla commented 2 weeks ago

It seems the image is missing. Perhaps something went wrong when downloading the dataset?

valenrach commented 2 days ago

I faced a similar issue; running `ulimit -n 65535` in the shell before the command solved it.
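If you'd rather not depend on everyone remembering the shell command, the same limit can be raised from inside the process with the standard-library `resource` module (Linux/macOS only). This is a hypothetical helper, not part of CityGaussian; note that an unprivileged process can only raise its soft limit up to the hard limit:

```python
import resource

def raise_fd_limit(target=65535):
    """Raise the soft open-file limit, the in-process analogue of `ulimit -n`."""
    soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
    # The soft limit may not exceed the hard limit for unprivileged processes.
    new_soft = target if hard == resource.RLIM_INFINITY else min(target, hard)
    if new_soft > soft:
        resource.setrlimit(resource.RLIMIT_NOFILE, (new_soft, hard))
    return resource.getrlimit(resource.RLIMIT_NOFILE)[0]
```

Calling something like this at the top of `train_large.py` would have the same effect as the `ulimit` workaround.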

GeoVectorMatrix commented 14 hours ago

> I faced a similar issue; running `ulimit -n 65535` in the shell before the command solved it.

Many thanks. I will try.