dicarlolab / archconvnets

Architecturally optimized neural networks trained with regularized backpropagation

Mysterious options parser behaviour #15

Closed · ardila closed this 10 years ago

ardila commented 10 years ago

I'm stuck debugging the options parser, since my usual approach (IPython and %pdb) doesn't seem applicable to it. I'm writing the issue out here so that I gain some understanding in the process, and so that maybe @yamins81 or @daseibert can give me some pointers on what to try.

I made some simple modifications to the options parser, following what seems to be the correct syntax (see branch feature/write-db). But when I try to parse these options, --dp-params suddenly does not get parsed. Immediately after calling parse on the options, I print them; here is the result:

ardila@munctional1:~/src/archconvnets/archconvnets/convnet$ python extractnet.py --test-range=0-44 --train-range=0 --data-provider=general-cropped --load-query='{"experiment_data.experiment_id": "mytestrun2"}' —layer=pool3_11b --feature-path=test_feats_c --data-path=/export/imgnet_storage_full/ardila/hvm_batches --dp-params='{"meta_attribute": "obj", "preproc": {"normalize": false, "dtype": "float32", "resize_to": [128, 128], "mode": "RGB"}, "batch_size": 128, "dataset_name": ["dldata.stimulus_sets.hvm", "HvMWithDiscfade"]}' --write-db=true —write-disk=true

Data batch range: testing                                             : 0-44                  
Data batch range: training                                            : 0-0                   
Data path                                                             :                       
Data provider                                                         : general-cropped       
Data provider paramers                                                :                       [DEFAULT]
GPU override                                                          : [-1] * num_gpus       [DEFAULT]
Host for Saving Checkpoints to DB                                     : localhost             [DEFAULT]
Image Size                                                            : 0                     [DEFAULT]
Load file                                                             :                       
Name for gridfs FS for saved checkpoints                              : convnet_checkpoint_fs [DEFAULT]
Name for mongodb database for saved checkpoints                       : convnet_checkpoint_db [DEFAULT]
Port for Saving Checkpoints to DB                                     : 27017                 [DEFAULT]
Query for loading checkpoint from database                            : {u'experiment_data.experiment_id': u'mytestrun2'} 
Write all data features from the dataset to mongodb in standard format: False                 [DEFAULT]
Write test data features from --layer to --feature-path)              : True                  [DEFAULT]
Write test data features from given layer                             :                       [DEFAULT]
Write test data features to this path (to be used with --layer)       :                       [DEFAULT]

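One way to see exactly what the parser receives is to dump a repr() of every argv token before parsing. A minimal standalone sketch (plain sys.argv inspection; nothing here is specific to the options parser):

# -*- coding: utf-8 -*-
# Standalone sketch: print a repr of each argv token so that characters
# which merely look like "--" become visible (an em dash shows up as
# '\u2014' instead of silently resembling two hyphens).
import sys
import unicodedata

for i, tok in enumerate(sys.argv[1:], start=1):
    if isinstance(tok, bytes):  # Python 2: argv tokens are byte strings
        tok = tok.decode('utf-8')
    print('%2d %r' % (i, tok))
    for ch in tok:
        # flag any dash-punctuation character other than plain '-'
        if unicodedata.category(ch) == 'Pd' and ch != '-':
            print('    warning: non-ASCII dash %r in this token' % ch)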

ardila commented 10 years ago

I was typing em dashes (—) instead of double hyphens (--) for some of the options, e.g. —layer instead of --layer. One hazard of a complex command-line interface, but solved for now.
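For anyone who hits the same thing: the failure mode is easy to reproduce with a toy long-option parser (an illustration only, not the actual archconvnets options code). A token that begins with an em dash (U+2014) never matches a "--" prefix test, so the parser skips it and the option silently keeps its default, which is exactly the [DEFAULT] pattern in the output above:

# -*- coding: utf-8 -*-
# Toy illustration, NOT the archconvnets options parser: a parser that
# only treats tokens starting with "--" as options silently skips a
# token that starts with an em dash (U+2014).

def parse_long_options(argv, defaults):
    opts = dict(defaults)
    for tok in argv:
        if tok.startswith('--') and '=' in tok:
            name, _, value = tok[2:].partition('=')
            if name in opts:
                opts[name] = value
        # any other token (including u'\u2014layer=...') is ignored
    return opts

defaults = {'layer': None, 'write-disk': 'false'}

good = parse_long_options(['--layer=pool3_11b', '--write-disk=true'], defaults)
bad = parse_long_options([u'\u2014layer=pool3_11b', u'\u2014write-disk=true'], defaults)

print(good)  # {'layer': 'pool3_11b', 'write-disk': 'true'}
print(bad)   # {'layer': None, 'write-disk': 'false'}  (both still defaults)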