isl-org / StableViewSynthesis


Cannot find counts.npy #7

Closed: phongnhhn92 closed this issue 3 years ago

phongnhhn92 commented 3 years ago

Hello, I am testing your model, but I cannot find the file counts.npy. I downloaded the preprocessed data from FVS, and the file is missing for every scene there as well. Please help!

python exp.py --net resunet3.16_penone.dirs.avg.seq+9+1+unet+5+2+16.single+mlpdir+mean+3+64+16 --cmd eval --iter last --eval-dsets tat-subseq
[2021-03-18/19:06/INFO/mytorch] Set seed to 42
[2021-03-18/19:06/INFO/mytorch] ================================================================================
[2021-03-18/19:06/INFO/mytorch] Start cmd "eval": tat-wo-val_bs1_nbs3_rpointdir_s0.25_resunet3.16_penone.dirs.avg.seq+9+1+unet+5+2+16.single+mlpdir+mean+3+64+16_vgg
[2021-03-18/19:06/INFO/mytorch] 2021-03-18 19:06:41
[2021-03-18/19:06/INFO/mytorch] host: phong-Server
[2021-03-18/19:06/INFO/mytorch] --------------------------------------------------------------------------------
[2021-03-18/19:06/INFO/mytorch] worker env:
    experiments_root: experiments
    experiment_name: tat-wo-val_bs1_nbs3_rpointdir_s0.25_resunet3.16_penone.dirs.avg.seq+9+1+unet+5+2+16.single+mlpdir+mean+3+64+16_vgg
    n_train_iters: -65536
    seed: 42
    train_batch_size: 1
    train_batch_acc_steps: 1
    eval_batch_size: 1
    num_workers: 6
    save_frequency: <co.mytorch.Frequency object at 0x7fd0cb475970>
    eval_frequency: <co.mytorch.Frequency object at 0x7fd0cb4755e0>
    train_device: cuda:0
    eval_device: cuda:0
    clip_gradient_value: None
    clip_gradient_norm: None
    empty_cache_per_batch: False
    log_debug: []
    train_iter_messages: []
    stopwatch: 
    train_dsets: ['tat-wo-val']
    eval_dsets: ['tat-subseq']
    train_n_nbs: 3
    train_src_mode: image
    train_nbs_mode: argmax
    train_scale: 0.25
    eval_scale: 0.5
    invalid_depth: 1000000000.0
    point_aux_data: ['dirs']
    point_edges_mode: penone
    eval_n_max_sources: 5
    train_rank_mode: pointdir
    eval_rank_mode: pointdir
    train_loss: VGGPerceptualLoss(
  (vgg): Sequential(
    (0): Conv2d(3, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    (1): ReLU(inplace=True)
    (2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    (3): ReLU(inplace=True)
    (4): MaxPool2d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    (5): Conv2d(64, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    (6): ReLU(inplace=True)
    (7): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    (8): ReLU(inplace=True)
    (9): MaxPool2d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    (10): Conv2d(128, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    (11): ReLU(inplace=True)
    (12): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    (13): ReLU(inplace=True)
    (14): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    (15): ReLU(inplace=True)
    (16): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    (17): ReLU(inplace=True)
    (18): MaxPool2d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    (19): Conv2d(256, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    (20): ReLU(inplace=True)
    (21): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    (22): ReLU(inplace=True)
    (23): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    (24): ReLU(inplace=True)
    (25): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    (26): ReLU(inplace=True)
    (27): MaxPool2d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    (28): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    (29): ReLU(inplace=True)
    (30): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    (31): ReLU(inplace=True)
    (32): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    (33): ReLU(inplace=True)
    (34): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    (35): ReLU(inplace=True)
    (36): MaxPool2d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
  )
)
    eval_loss: L1Loss()
    exp_out_root: experiments/tat-wo-val_bs1_nbs3_rpointdir_s0.25_resunet3.16_penone.dirs.avg.seq+9+1+unet+5+2+16.single+mlpdir+mean+3+64+16_vgg
    db_path: experiments/tat-wo-val_bs1_nbs3_rpointdir_s0.25_resunet3.16_penone.dirs.avg.seq+9+1+unet+5+2+16.single+mlpdir+mean+3+64+16_vgg/exp.phong-Server.db
    db_logger: <co.sqlite.Logger object at 0x7fd0cb475910>
[2021-03-18/19:06/INFO/mytorch] ================================================================================
[2021-03-18/19:06/INFO/exp] Create eval datasets
[2021-03-18/19:06/INFO/exp]   create dataset for tat_subseq_training_Truck
Traceback (most recent call last):
  File "exp.py", line 945, in <module>
    worker.do(args, worker_objects)
  File "../co/mytorch.py", line 442, in do
    self.do_cmd(args, worker_objects)
  File "../co/mytorch.py", line 429, in do_cmd
    self.eval_iters(
  File "../co/mytorch.py", line 604, in eval_iters
    eval_sets = self.get_eval_sets()
  File "exp.py", line 327, in get_eval_sets
    self.get_eval_set_tat(
  File "exp.py", line 252, in get_eval_set_tat
    dset = self.get_dataset(
  File "exp.py", line 133, in get_dataset
    counts = np.load(ibr_dir / "counts.npy")
  File "/home/phong/miniconda3/envs/deep/lib/python3.8/site-packages/numpy/lib/npyio.py", line 416, in load
    fid = stack.enter_context(open(os_fspath(file), "rb"))
FileNotFoundError: [Errno 2] No such file or directory: '/home/phong/data/Work/Paper3/Code/FreeViewSynthesis/ibr3d_tat/training/Truck/dense/ibr3d_pw_0.50/counts.npy'
griegler commented 3 years ago

If counts.npy is missing from the given directory but the individual count_*.npy files are present, you can run the following Python code (change root_dir to your dataset directory):

    import numpy as np
    from pathlib import Path

    # Scene directory that holds the individual count_*.npy files.
    root_dir = Path('/your/path/to/training/Truck/dense/ibr3d_pw_0.50/')

    # Load the per-image count arrays in sorted order and stack them into
    # the single counts.npy array that exp.py loads for each scene.
    count_paths = sorted(root_dir.glob("count_*.npy"))
    counts = np.array([np.load(count_path) for count_path in count_paths])
    np.save(root_dir / "counts.npy", counts)
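
Since the file can be missing from every scene in the download, a loop over all preprocessed scene directories can regenerate each counts.npy in one pass. This is a minimal sketch, assuming the <split>/<scene>/dense/ibr3d_pw_<scale> layout visible in the traceback above; dataset_root is a placeholder for your local path:

    import numpy as np
    from pathlib import Path

    # Placeholder for the local FVS download root, e.g. .../ibr3d_tat.
    dataset_root = Path('/path/to/ibr3d_tat')

    # Matches <split>/<scene>/dense/ibr3d_pw_<scale>, as in the traceback.
    for scene_dir in sorted(dataset_root.glob('*/*/dense/ibr3d_pw_*')):
        out_path = scene_dir / 'counts.npy'
        if out_path.exists():
            continue  # counts.npy already present for this scene
        count_paths = sorted(scene_dir.glob('count_*.npy'))
        if not count_paths:
            continue  # no per-image counts to stack; skip this scene
        np.save(out_path, np.array([np.load(p) for p in count_paths]))
        print(f'wrote {out_path} from {len(count_paths)} count files')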
phongnhhn92 commented 3 years ago

It works! Thanks
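
For anyone who hits this later, a quick sanity check that the regenerated file is consistent with the per-image counts (a minimal sketch; root_dir as in the snippets above):

    import numpy as np
    from pathlib import Path

    # Hypothetical local path; same directory the file was written to.
    root_dir = Path('/your/path/to/training/Truck/dense/ibr3d_pw_0.50/')
    counts = np.load(root_dir / 'counts.npy')
    n_files = len(list(root_dir.glob('count_*.npy')))

    # The stacked array should have one row per count_*.npy file.
    assert counts.shape[0] == n_files, (counts.shape, n_files)
    print('counts.npy OK, shape:', counts.shape)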