Open dvdplm opened 3 years ago
MNIST dataset: automatically downloaded by the torchvision data utils. You really only have to run the program with `data_specs['stream']` set to `'create'` and `data_specs['save']` set to `True`, and you'll end up with the `torch.Tensor` dataset in the `dataset` folder.
SZ: Go to the linked page. The relevant paragraph: "In order to try unsupervised deep learning on the prototypical cognitive modeling problem of visual numerosity perception investigated by Stoianov & Zorzi (2012), you can download the complete dataset of visual images here and follow the instructions provided inside the archive." Click on the "here" link, unzip the archive, and you'll find `SZ_data.mat` and `SZ_data_test.mat`. Save these two files in the `dataset` directory. Run the program with `data_specs['stream']` set to `'create'` and it will build the `torch.Tensor` dataset. No need to save it: in this case, creating the tensor dataset is not as expensive as it is for MNIST.
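For reference, reading the two `.mat` files into arrays is straightforward with SciPy (the variable name stored inside the SZ files is an assumption here; you can inspect it with `loadmat(path).keys()`):

```python
# Generic loader for a MATLAB .mat file (v7 or earlier, which SciPy reads).
# The key you pass depends on what the SZ archive actually stores.
import numpy as np
from scipy.io import loadmat

def load_mat_variable(path, key):
    contents = loadmat(path)  # dict: variable name -> ndarray
    return np.asarray(contents[key])
```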
Hope it helps. Do not hesitate to write for further clarifications.
> Click on the "here" link, unzip the archive, and you'll find `SZ_data.mat` and `SZ_data_test.mat`.

That link (http://ccnl.psy.unipd.it/research/visual-number-sense-dataset) downloads `stoianovzorzi2012.tar`, which contains the following files:
```
➜ ~ ll Downloads/stoianovzorzi2012
total 10400
-rw-------@ 1 aggron staff 1.4K Apr 12 2013 README.txt
-rw-------@ 1 aggron staff 5.1M Apr 12 2013 StoianovZorzi2012_data.mat
-rw-------@ 1 aggron staff 3.7K Apr 12 2013 stoianovzorzi2012_converter.m
```
Is running `stoianovzorzi2012_converter.m` necessary, or is the `SZ_data_test.mat` file sufficient?

As a side note: the `.m` file extension is usually associated with Objective-C code, so it might be helpful to add a note explaining that users need MATLAB/Octave to run it.
Using GNU Octave, version 5.2.0, how do I run the converter script? When I run `octave --traditional --verbose stoianovzorzi2012_converter.m` I get a window showing a square with yellow squares on it, but no `.mat` files are created in the folder. What am I doing wrong?

Note: without the `--traditional` flag I get a segfault.
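In case the converter does need to run, a headless invocation would look like the following (these are standard Octave options; whether the script writes `.mat` files or only plots depends on its contents):

```shell
# --no-gui skips the GUI; --no-window-system suppresses the plot window
# that otherwise pops up when the script draws figures.
octave --no-gui --no-window-system stoianovzorzi2012_converter.m
```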
No, it should be the `.mat` files already; let me search for them. Indeed, the link I gave does not point to the data, I am sorry.
Updated. Now the link points to an OSF repo in which the supporting code and data for another work are saved.

The Readme refers to "the MNIST dataset and the SZ dataset" without further specification other than a link to a webpage with many download links. Exactly what files are users expected to download?