svip-lab / PlanarReconstruction

[CVPR'19] Single-Image Piece-wise Planar 3D Reconstruction via Associative Embedding
MIT License

How to run eval on custom dataset? #31

Closed: kharyal closed this issue 4 years ago

kharyal commented 4 years ago

I want to run evaluation on a custom dataset. I converted all the images into a single .npz file and ran the eval command python main.py eval with dataset.root_dir=/path/to/save/processd/data resume_dir=/path/to/pretrained.pt dataset.batch_size=1 with the path to my .npz file, but I got the following error:

WARNING - main - No observers have been added to this run
INFO - main - Running command 'eval'
INFO - main - Started
ERROR - main - Failed after 0:00:10!
Traceback (most recent calls WITHOUT Sacred internals):
  File "main.py", line 392, in eval
    data_loader = load_dataset('val', cfg.dataset)
  File "main.py", line 174, in load_dataset
    PlaneDataset(subset=subset, transform=transforms, root_dir=cfg.root_dir),
  File "main.py", line 42, in __init__
    self.data_list = [line.strip() for line in open(self.txt_file, 'r').readlines()]
NotADirectoryError: [Errno 20] Not a directory: '../all_images.npz/val.txt'

niujinshuchong commented 4 years ago

@kharyal I think you should create the .npz files under a folder such as data/val/ (e.g. data/val/1.npz) and create a val.txt file (data/val.txt) that lists the names of all the .npz files, one per line (e.g. 1\n2\n).

Or you could change the code in https://github.com/svip-lab/PlanarReconstruction/blob/master/main.py#L34
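
For example, a rough sketch of that layout (the paths are only placeholders, not the repository's exact conventions) plus a small script that writes val.txt from whatever .npz files sit in data/val:

import os

val_dir = 'data/val'   # folder holding 0.npz, 1.npz, ...
names = sorted(os.path.splitext(f)[0] for f in os.listdir(val_dir) if f.endswith('.npz'))
with open('data/val.txt', 'w') as f:
    # one file name per line, without the .npz extension
    f.write('\n'.join(names) + '\n')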

kharyal commented 4 years ago

I have done that. It doesn't matter what I put in val.txt; it always tries to find '0.npz' first. Now I am facing another error:

WARNING - main - No observers have been added to this run
INFO - main - Running command 'eval'
INFO - main - Started
ERROR - main - Failed after 0:00:04!
Traceback (most recent calls WITHOUT Sacred internals):
  File "main.py", line 403, in eval
    for iter, sample in enumerate(data_loader):
  File "/home/chaitanyakharyal/.local/lib/python3.6/site-packages/torch/utils/data/dataloader.py", line 345, in __next__
    data = self._next_data()
  File "/home/chaitanyakharyal/.local/lib/python3.6/site-packages/torch/utils/data/dataloader.py", line 856, in _next_data
    return self._process_data(data)
  File "/home/chaitanyakharyal/.local/lib/python3.6/site-packages/torch/utils/data/dataloader.py", line 881, in _process_data
    data.reraise()
  File "/home/chaitanyakharyal/.local/lib/python3.6/site-packages/torch/_utils.py", line 395, in reraise
    raise self.exc_type(msg)
TypeError: Caught TypeError in DataLoader worker process 0.
Original Traceback (most recent call last):
  File "/home/chaitanyakharyal/.local/lib/python3.6/site-packages/PIL/Image.py", line 2749, in fromarray
    mode, rawmode = _fromarray_typemap[typekey]
KeyError: ((1, 1, 3), '<f4')

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/chaitanyakharyal/.local/lib/python3.6/site-packages/torch/utils/data/_utils/worker.py", line 178, in _worker_loop
    data = fetcher.fetch(index)
  File "/home/chaitanyakharyal/.local/lib/python3.6/site-packages/torch/utils/data/_utils/fetch.py", line 44, in fetch
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/home/chaitanyakharyal/.local/lib/python3.6/site-packages/torch/utils/data/_utils/fetch.py", line 44, in <listcomp>
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "main.py", line 117, in __getitem__
    image = Image.fromarray(image)
  File "/home/chaitanyakharyal/.local/lib/python3.6/site-packages/PIL/Image.py", line 2751, in fromarray
    raise TypeError("Cannot handle this data type: %s, %s" % typekey) from e
TypeError: Cannot handle this data type: (1, 1, 3), <f4

I have created my .npz files using this code:

import os
import numpy as np
import cv2

path_to_files = "./imgs/"
savepath = './npz_dir/'
valfilepath = savepath + 'val.txt'
valfile = open(valfilepath, 'a')

# save every image as <index>.npz under npz_dir/val/ and record its index in val.txt
for i, file_ in enumerate(os.listdir(path_to_files)):
    print('processing file', file_)
    img_path = path_to_files + file_
    img = cv2.imread(img_path)
    img = img.astype('float32')   # stored as float32 in [0, 1], which later trips up PIL (see below)
    img /= 255
    np.savez(savepath + 'val/' + str(i), img)
    valfile.write(str(i) + '\n')
valfile.close()

kharyal commented 4 years ago

It was just an error of using float32 instead of uint8. It was resolved by adding

image = (image*255).astype(np.uint8)

where the image is read in main.py.
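
For reference, a minimal sketch (not the repository's exact code) of the fix in context, assuming image is an HxWx3 float32 array in [0, 1] loaded from the .npz:

import numpy as np
from PIL import Image

image = np.random.rand(192, 256, 3).astype(np.float32)   # stand-in for the array loaded from the .npz
image = (image * 255).astype(np.uint8)                    # PIL cannot build an image from a float32 array
image = Image.fromarray(image)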

KirillHiddleston commented 3 years ago

@kharyal Can you show the data structure and arrays [image, plane, depth, segmentation] you are using for the conversion to npz, and perhaps the dataset you used? When I convert my data to npz and run python main.py eval, I get an error at gt_segmentation = gt_segmentation.reshape((192, 256)): cannot reshape array of size 1572864 into shape (192,256).

niujinshuchong commented 3 years ago

@KirillHiddleston It seems that you are using a one-hot encoded segmentation map (the shape is 32x192x256), while we use an index map (1x192x256). We convert the index map to a one-hot segmentation map here: https://github.com/svip-lab/PlanarReconstruction/blob/872a1b0f895abb37bfb119b7c4f814a78d659008/main.py#L130.
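
If it helps, a rough sketch of going the other way, from a one-hot map back to an index map, using argmax over the class axis (seg_onehot is a hypothetical stand-in array, not the repository's data):

import numpy as np

seg_onehot = np.zeros((32, 192, 256), dtype=np.uint8)   # hypothetical one-hot segmentation map
seg_onehot[0] = 1                                        # give every pixel some class
index_map = seg_onehot.argmax(axis=0)                    # shape (192, 256), values in [0, 31]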

KirillHiddleston commented 3 years ago

I use a 512x1024x3 segmentation map, but the index map is only converted to a one-hot segmentation map after the reshape, which is where I hit the array-size error. I am trying to convert my MxNx3 segmentation map into a 1xMxN index map.
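
In case it helps, a rough sketch (not the repository's code) of collapsing a color-coded MxNx3 segmentation map into a 1xMxN index map, assuming each plane has its own unique color; seg_rgb is just a placeholder array here, and the final nearest-neighbour resize is only relevant if the stored map has to match the 192x256 reshape in main.py:

import cv2
import numpy as np

seg_rgb = np.zeros((512, 1024, 3), dtype=np.uint8)               # hypothetical color-coded segmentation
flat = seg_rgb.reshape(-1, 3)
_, index_flat = np.unique(flat, axis=0, return_inverse=True)     # one integer label per unique color
index_map = index_flat.reshape(seg_rgb.shape[:2])[np.newaxis]    # shape (1, 512, 1024)

# nearest-neighbour keeps labels intact when downscaling to 192x256
index_small = cv2.resize(index_map[0].astype(np.uint8), (256, 192), interpolation=cv2.INTER_NEAREST)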