Closed: gngdb closed this issue 9 years ago.
Given up on the clever way of doing this. Going back to just using the DenseDesignMatrix class. Should be much simpler, but it will limit our room for augmentation and fill up our RAM.
First version of this is working, in `dense_dataset.py`, called `dense_dataset.DensePNGDataset`. However, it requires run settings, which don't exist yet, so you have to hand in a dictionary of settings. Can probably put this in YAML for now. Example run settings:
```python
run_settings = {"preprocessing": {"resize": (48, 48)},
                "final_shape": (48, 48),
                "augmentation_factor": 1}
```
`final_shape` is the expected final size of the images. `augmentation_factor` is how many images you're getting back for each image going into preprocessing.
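Since the plan is to keep these in YAML for now, here is a minimal round-trip sketch; PyYAML and the `run_settings.yaml` filename are assumptions, nothing that exists in the repo yet:

```python
# A rough sketch of keeping the run settings in YAML for now; PyYAML and the
# run_settings.yaml filename are assumptions, not anything in the repo yet.
import yaml

run_settings = {"preprocessing": {"resize": (48, 48)},
                "final_shape": (48, 48),
                "augmentation_factor": 1}

# Write the settings out; safe_dump serialises the tuples as plain YAML lists.
with open("run_settings.yaml", "w") as f:
    yaml.safe_dump(run_settings, f, default_flow_style=False)

# Read them back; note that (48, 48) comes back as [48, 48], so anything
# that insists on tuples will need to convert.
with open("run_settings.yaml") as f:
    loaded = yaml.safe_load(f)
```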
Using some kind of dataset class: it should inherit from a standard pylearn2 dataset class, and it should also be possible to reuse some of the data-loading code we already have.
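As a rough illustration of that, here is a minimal sketch of what a DenseDesignMatrix-backed `DensePNGDataset` could look like; the real class in `dense_dataset.py` may differ, and `image_dir`, the greyscale conversion, and the omission of labels below are assumptions for the sketch:

```python
# Minimal sketch of a DenseDesignMatrix-backed PNG dataset, consuming the
# run_settings dictionary above. The actual dense_dataset.DensePNGDataset
# may load files and labels differently; image_dir and the greyscale
# conversion are illustrative assumptions.
import glob
import os

import numpy as np
from PIL import Image
from pylearn2.datasets.dense_design_matrix import DenseDesignMatrix


class DensePNGDataset(DenseDesignMatrix):
    """Loads every PNG in a directory into one design matrix in memory."""

    def __init__(self, image_dir, run_settings):
        # Target size comes straight from the run settings; tuple() guards
        # against the list you get back if the settings were loaded from YAML.
        size = tuple(run_settings["preprocessing"]["resize"])
        paths = sorted(glob.glob(os.path.join(image_dir, "*.png")))

        rows = []
        for path in paths:
            # Load each PNG as greyscale, resize it, and flatten it to a row.
            img = Image.open(path).convert("L").resize(size)
            rows.append(np.asarray(img, dtype="float32").ravel() / 255.0)

        # Everything sits in one big matrix in RAM, which is the simplicity
        # vs. memory trade-off mentioned above. Labels (y) are left out here
        # because the labelling scheme lives elsewhere in our loading code.
        X = np.vstack(rows)
        super(DensePNGDataset, self).__init__(X=X)
```

Constructing it would then just be `DensePNGDataset("train/", run_settings)`, and pylearn2 sees an ordinary DenseDesignMatrix. With an `augmentation_factor` greater than 1, each source image would contribute that many rows, which is where the RAM concern above comes in; this sketch only covers a factor of 1.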