KamitaniLab / GenericObjectDecoding

Demo code for Horikawa and Kamitani (2017), "Generic decoding of seen and imagined objects using hierarchical visual features," Nat Commun. https://www.nature.com/articles/ncomms15037

question: It shows error "Undefined function or variable 'repadd'" while trying analysis_FeaturePrediction.m #9

Closed (miyucka closed this 5 years ago)

miyucka commented 5 years ago

I am trying to run the Generic Decoding demo program. It shows the error "Undefined function or variable 'repadd'" when running analysis_FeaturePrediction.m. I searched for "repadd.m" in the GenericObjectDecoding folder and the BrainDecoderToolbox2 folder, but I couldn't find it (I found "repadd.c" instead). Any suggestions on how to fix this error? I appreciate your help.

ShuntaroAoki commented 5 years ago

The function repadd is included in lib/mex_prog_2010-06-21 as a mex program, so please check and try the following (see the sketch after this list):

  1. Make sure lib/mex_prog_2010-06-21 is on your Matlab path.
  2. Try re-building repadd (as well as the other mex functions) in your Matlab.
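
A minimal sketch of both steps, assuming the repository root is your current directory and the mex sources live in lib/mex_prog_2010-06-21 as mentioned above (adjust the path to your own checkout); this is not an official setup script:

```matlab
% Sketch: add the mex directory to the path and rebuild repadd for this platform.
mexDir = fullfile(pwd, 'lib', 'mex_prog_2010-06-21');
addpath(mexDir);                                     % step 1: put the mex programs on the path

% Step 2: rebuild repadd.c if the shipped binary does not match your platform/Matlab version
mex('-outdir', mexDir, fullfile(mexDir, 'repadd.c'));
```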
miyucka commented 5 years ago

Thank you for your reply. The problem was solved when I followed your advice. I did not know I needed to build "repadd.c". Thank you very much.

miyucka commented 5 years ago

question: I'm sorry to bother you, but I have another question about analysis_FeaturePrediction.m. It shows the errors "Error using weight_out_delay_time" and "Requested 7516192768001x1750 (98000000.0GB) array exceeds maximum array size preference. ~" when I try to run analysis_FeaturePrediction.m. Is this normal? (The data size seems too big to me...) If this is normal, could you tell me the system requirements? Thank you for your support.

ShuntaroAoki commented 5 years ago

The demo script, analysis_FeaturePrediction.m, is designed to predict a relatively small number of units (e.g., 1000 per layer); it does not support something like predicting all units in a DNN. When we handle such large data, we divide the units into subgroups and run the prediction for each group to avoid memory errors (a rough sketch of this idea is below). We are going to add such practical code in the future. Sorry for the inconvenience.
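
A minimal sketch of that subgrouping idea, using synthetic data and plain least squares as a stand-in for the demo's actual regression; all variable names and sizes here are illustrative assumptions, not part of the repository:

```matlab
% Synthetic data standing in for fMRI responses (X) and DNN unit features (Y)
numTrain = 200;  numTest = 50;  numVoxels = 100;  numUnits = 5000;
Xtrain = randn(numTrain, numVoxels);  Ytrain = randn(numTrain, numUnits);
Xtest  = randn(numTest,  numVoxels);

groupSize = 1000;                      % units predicted per pass (tune to your memory)
Ypred = zeros(numTest, numUnits);
for startIdx = 1:groupSize:numUnits
    idx = startIdx:min(startIdx + groupSize - 1, numUnits);
    % Fit and predict only this subgroup of units, keeping memory bounded
    W = Xtrain \ Ytrain(:, idx);       % least-squares fit as a placeholder model
    Ypred(:, idx) = Xtest * W;
end
```

The same loop structure applies to the demo's own prediction routine: replace the placeholder fit/predict lines with the repository's feature-prediction call restricted to the current subgroup of units.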