adkAurora closed this issue 11 months ago
Hi, could you please paste a sample output image here? There are several other things to check if the results do not look good.
You can run `scripts/tools/visualize_cameras`, which should output a `.ply` file to give a rough visualization of the camera parameters. You could also try tuning `near`, `far`, and `bounds` to see if the result gets better. Another option is to train on a single frame first by adding `configs/specs/static.yaml` to your experiment configuration (located in `exps`), or by appending `dataloader_cfg.dataset_cfg.frame_sample=0,1,1 val_dataloader_cfg.dataset_cfg.frame_sample=0,1,1` to any command line regarding the dataset.

Hi~ thanks for your reply. I checked `cameras.ply`, and everything looks okay there. Attached is the `error.png` of frame one.
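(For context on the `frame_sample=0,1,1` override suggested above: such frame sampling specs typically follow a `(begin, end, step)` convention, so `0,1,1` keeps only frame 0. A minimal sketch of that slicing logic; the helper name is illustrative, not EasyVolcap's actual code:)

```python
def select_frames(n_frames, frame_sample):
    # frame_sample = (begin, end, step), interpreted like a Python slice
    begin, end, step = frame_sample
    return list(range(n_frames))[begin:end:step]

# 0,1,1 -> only the first frame, matching the intent of configs/specs/static.yaml
print(select_frames(300, (0, 1, 1)))    # [0]
# 0,300,30 -> every 30th frame (illustrative)
print(select_frames(300, (0, 300, 30)))
```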
Sorry for the late reply!
I double-checked the pre-processing script and found that we internally used a different conversion path (neural3dv -> nerfstudio -> easyvolcap), thus the `neural3dv_to_easyvolcap` script was not thoroughly tested.
In my latest commit this issue should be fixed, and you should be able to train an `l3mhet` model on the dataset correctly after converting with `neural3dv_to_easyvolcap`.
I recommend checking the implementation by training on a single frame first:
```shell
# Train on the first frame
evc -c configs/exps/l3mhet/l3mhet_sear_steak.yaml,configs/specs/static.yaml exp_name=l3mhet_sear_steak_static runner_cfg.save_latest_ep=1 runner_cfg.eval_ep=1 runner_cfg.resume=False

# Render spiral path
evc -t test -c configs/exps/l3mhet/l3mhet_sear_steak.yaml,configs/specs/static.yaml,configs/specs/spiral.yaml exp_name=l3mhet_sear_steak_static val_dataloader_cfg.dataset_cfg.render_size=540,960

# Fuse depth maps for visualization
python scripts/tools/volume_fusion.py -- -c configs/exps/l3mhet/l3mhet_sear_steak.yaml,configs/specs/static.yaml exp_name=l3mhet_sear_steak_static val_dataloader_cfg.dataset_cfg.ratio=0.05
```
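(The `ratio=0.05` in the fusion command aggressively downscales images, which keeps memory use in check. Assuming `ratio` is a uniform image-resize factor and Neural3DV's native 2704x2028 resolution, the effective working resolution would be roughly:)

```python
# Effect of dataset_cfg.ratio on image resolution (illustrative numbers,
# assuming Neural3DV's native 2704x2028 frames).
width, height, ratio = 2704, 2028, 0.05
print(round(width * ratio), round(height * ratio))  # 135 101
```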
Another recommended way to check the camera parameters is to render an `enerfi` model on the dataset:
```shell
# Construct the experiment manually and render in the GUI
evc -t gui -c configs/base.yaml,configs/models/enerfi.yaml,configs/datasets/neural3dv/sear_steak.yaml,configs/specs/vf0.yaml exp_name=enerfi_dtu model_cfg.sampler_cfg.n_planes=32,8 model_cfg.sampler_cfg.n_samples=4,1 viewer_cfg.window_size=540,960
```
Could you please check whether the issue has also been fixed on your end?
Thank you for your attention and effort~ I have tried the new code, and there are some new problems.
`l3mhet_sear_steak_static` can get a reasonable result with a mean PSNR of about 35, but I have a new question about the validation render result below: what are the strange vertical lines inside the green box?
```
EasyVolcap# evc -c configs/exps/l3mhet/l3mhet_sear_steak.yaml
2023-12-20 12:55:11.475942 easyvolcap.scripts.main -> preflight: Starting experiment: l3mhet_sear_steak, command: train main.py:80
2023-12-20 13:09:42.… easyvolca… -> load_resi… Loading imgs bytes for neural3dv/sear_steak/images TRAIN 100% ━━━━━━━━━━ 6,300/6,3… 0:14:30 < 0:00:00 8.316 p… it/s
2023-12-20 13:16:36.386… easyvolcap.da… -> load_bytes: Caching imgs for neural3dv/sear_steak TRAIN 22% ━━╸━━━━━━━━━━ 1,414/6,300 0:06:53 < 7:30:38 0.181 it/s v…
Killed
```
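(The trailing `Killed` is the typical signature of the Linux OOM killer terminating the process while it caches decoded images in RAM. A rough back-of-envelope estimate, assuming Neural3DV's native 2704x2028 resolution and uint8 RGB; the actual cache layout may differ, especially with a smaller `dataset_cfg.ratio`:)

```python
# Estimate memory needed to cache all decoded training images.
# Assumptions (not from the log): native 2704x2028 Neural3DV frames, uint8 RGB.
n_images = 6_300                 # matches the 6,300 items in the progress bar
height, width, channels = 2028, 2704, 3

bytes_total = n_images * height * width * channels
print(f"{bytes_total / 1024**3:.1f} GiB")  # ~96.5 GiB, well beyond typical RAM
```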
Hi @adkAurora, thanks for the follow-up! You could also try tuning `bounds` inside `configs/datasets/neural3dv/neural3dv.yaml`.

Problem solved, thanks~
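(For anyone hitting a similar issue: `bounds` is the axis-aligned box that should enclose the scene. As a generic sanity check — this is an illustrative sketch with a hypothetical `estimate_bounds` helper, not EasyVolcap's own heuristic — you can compare the configured bounds against a box spanned by the converted camera centers:)

```python
import numpy as np

def estimate_bounds(cam_centers, padding=1.0):
    # Axis-aligned box covering all camera centers, padded outward.
    lo = cam_centers.min(axis=0) - padding
    hi = cam_centers.max(axis=0) + padding
    return np.stack([lo, hi])  # shape (2, 3): [[min_xyz], [max_xyz]]

centers = np.array([[0.0, 0.0, 0.0],
                    [1.0, 0.5, 0.2],
                    [-0.5, 1.0, 0.1]])
print(estimate_bounds(centers, padding=0.5))
```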
I use the Neural3DV dataset `sear_steak`, followed the dataset conversion script `neural3dv_to_easyvolcap.py` to generate the yml files, and trained l3mhet using the config in `configs/exps/l3mhet/l3mhet_sear_steak.yaml`. The result is very bad: the val PSNR is only 6.36.
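(For reference on the numbers in this thread: PSNR is 10·log10(MAX²/MSE), so a value around 6 dB means the renders are essentially uncorrelated with the ground truth, while ~35 dB indicates a good fit. A quick sketch, assuming images normalized to [0, 1]:)

```python
import numpy as np

def psnr(pred, gt, max_val=1.0):
    # Peak signal-to-noise ratio in dB for images in [0, max_val].
    mse = np.mean((pred - gt) ** 2)
    return 10 * np.log10(max_val ** 2 / mse)

gt = np.full((8, 8), 0.5)
print(psnr(np.full((8, 8), 0.49), gt))  # small per-pixel error -> ~40 dB
print(psnr(np.zeros((8, 8)), gt))       # constant black output -> ~6 dB
```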