rgeirhos / generalisation-humans-DNNs

Data, code & materials from the paper "Generalisation in humans and deep neural networks" (NeurIPS 2018)
http://papers.nips.cc/paper/7982-generalisation-in-humans-and-deep-neural-networks.pdf

Question with regards to rendering Eidolon Distortions [reach parameter] #3

Closed ArturoDeza closed 5 years ago

ArturoDeza commented 5 years ago

Hi Robert, I've rendered some Eidolon distortions, but I used the MATLAB version instead of the Python one (which it seems you used to render the stimuli). I have all the hyperparameters set, but for some reason I usually get a dimensionality error when I set the reach parameter to 128.0 for a 256x256 image. I noticed, however, that if I upsampled the image to 512x512 and then applied the transform, the MATLAB code would not crash. My question is: did the Python implementation you used do the same thing? It seems there is a line in the Python code that upsamples the input image to 512x512 before computing the Eidolon transform. My assumption is that you downscaled the output back to 256x256 after the transform was computed. Is this what you did? Or did you change the input hyperparameter to 256, and the Python code worked directly without any problems? Thanks! Arturo
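For reference, the workaround described above (upsample to 512x512, transform, downsample back to 256x256) can be sketched as follows. This is only an illustration: `eidolon_transform_stub` is a hypothetical placeholder for the actual Eidolon transform, and the nearest-neighbour/block-average resampling is an assumption, not necessarily what the toolbox does internally.

```python
import numpy as np

def eidolon_transform_stub(img, reach):
    # Hypothetical placeholder: the real Eidolon toolbox applies a
    # coherent local disarray whose magnitude is controlled by `reach`.
    return img

def transform_with_upsampling(img, reach, work_size=512):
    """Upsample a square image to work_size, apply the transform at that
    resolution, then downsample back to the original size."""
    h, w = img.shape
    factor = work_size // h
    # Nearest-neighbour upsample to the resolution that avoids the error.
    big = np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)
    out = eidolon_transform_stub(big, reach)
    # Block-average back down to the original resolution.
    return out.reshape(h, factor, w, factor).mean(axis=(1, 3))

img = np.random.rand(256, 256)
result = transform_with_upsampling(img, reach=128.0)
print(result.shape)  # (256, 256)
```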

rgeirhos commented 5 years ago

Hi Arturo, We set SZ=256 for our experiments (https://github.com/rgeirhos/generalisation-humans-DNNs/blob/master/code/wrapper.py#L37), but I'm not sure whether this setting then gets overridden somewhere else inside the Eidolon toolbox... Are you interested in generating exactly the same images we used in our experiments with human observers & CNNs? If so, let me know and I'd be happy to share the stimuli directly to save you the trouble.

ArturoDeza commented 5 years ago

Hi Robert, that works! I will write to you via another medium with regards to sharing the stimuli! Thanks!