jmiller656 / EDSR-Tensorflow

Tensorflow implementation of Enhanced Deep Residual Networks for Single Image Super-Resolution
MIT License
330 stars 107 forks

test_size in load_dataset #42

Open Atwaman opened 5 years ago

Atwaman commented 5 years ago

First of all, in `load_dataset` in data.py I had to change `coords = [(q, r) for q in range(coords_x) for r in range(coords_y)]` to `coords = [(q, r) for q in range(int(coords_x)) for r in range(int(coords_y))]`; otherwise the `except` triggers for every image.
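For context, `range()` in Python 3 rejects float arguments, which is presumably why the `except` fires. A minimal sketch of the failure and the fix; the `coords_x`/`coords_y` values here are hypothetical stand-ins for whatever data.py actually computes:

```python
# coords_x / coords_y end up as floats if they come from a division,
# e.g. image_width / patch_size -- hypothetical stand-in values:
coords_x, coords_y = 4.0, 3.0

# range() in Python 3 raises TypeError on float arguments:
try:
    coords = [(q, r) for q in range(coords_x) for r in range(coords_y)]
except TypeError:
    coords = None
assert coords is None  # the comprehension never succeeds with floats

# Casting to int, as in the change above, makes it work:
coords = [(q, r) for q in range(int(coords_x)) for r in range(int(coords_y))]
assert len(coords) == 12  # 4 * 3 coordinate pairs
```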

You go on to define `test_size = min(10, int(len(imgs)*0.2))`, where `imgs` are images with coordinates (`[0,0]`, `[0,1]`, ... `[1,0]`, `[1,1]`, ...) appended to them.

But this doesn't reserve 20 % of the images for testing: `len(imgs)` is 1448, so `int(len(imgs)*0.2)` is 289, and the min of 289 and 10 is always 10. You then define `test_set = imgs[:test_size]` and `train_set = imgs[test_size:][:200]`.
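To make the arithmetic concrete (assuming `len(imgs) == 1448` as above; the list contents are a stand-in):

```python
imgs = list(range(1448))  # stand-in for the image/coordinate list

test_size = min(10, int(len(imgs) * 0.2))  # min(10, 289) -> always 10
test_set = imgs[:test_size]                # first 10 entries
train_set = imgs[test_size:][:200]         # next 200 entries

assert test_size == 10
assert len(test_set) == 10
assert len(train_set) == 200
# 1448 - 10 - 200 = 1238 entries are never used at all
```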

So you have a `test_set` containing 10 image/coordinate combinations, and a `train_set` 20 times larger containing 200 image/coordinate combinations.

I thought the idea was to set aside 20 % of the images for the test set and 80 % for training, and additionally obtain the coordinates of the central 100 pixels in all of them?
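A minimal sketch of what a genuine 80/20 split might look like (the names and the shuffle seed are hypothetical choices, not the repo's code; a real fix would shuffle before splitting so the test set isn't biased by file order):

```python
import random

imgs = list(range(1448))  # stand-in for the image/coordinate list

random.seed(0)            # reproducible shuffle (hypothetical choice)
random.shuffle(imgs)

test_size = int(len(imgs) * 0.2)  # 20% for testing -> 289
test_set = imgs[:test_size]
train_set = imgs[test_size:]      # remaining 80% -> 1159

assert len(test_set) == 289
assert len(train_set) == 1159
```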