brain-score / model-tools

Helper functions to extract model activations and translate from Machine Learning to Neuroscience
MIT License

check_submission not working with Stochastic models #37

Closed: tiagogmarques closed this issue 3 years ago

tiagogmarques commented 3 years ago

I'm getting an error when testing a stochastic model for model submission. From what I could infer, the MockBenchmark does not average over trial presentations and then hits a conflicting size error. Check the log below.

Traceback (most recent call last):
  File "brain_models.py", line 50, in <module>
    check_models.check_brain_models(name)
  File "/braintree/home/tmarques/brainscore/model-tools/model_tools/check_submission/check_models.py", line 24, in check_brain_models
    check_brain_model_processing(model)
  File "/braintree/home/tmarques/brainscore/model-tools/model_tools/check_submission/check_models.py", line 30, in check_brain_model_processing
    score = benchmark(model, do_behavior=True)
  File "/braintree/home/tmarques/brainscore/model-tools/model_tools/check_submission/check_models.py", line 88, in __call__
    candidate.look_at(self.assembly.stimulus_set)
  File "/braintree/home/tmarques/brainscore/model-tools/model_tools/brain_transformation/__init__.py", line 53, in look_at
    return self.behavior_model.look_at(stimuli)
  File "/braintree/home/tmarques/brainscore/model-tools/model_tools/brain_transformation/behavior.py", line 22, in look_at
    return self.current_executor.look_at(stimuli, *args, **kwargs)
  File "/braintree/home/tmarques/brainscore/model-tools/model_tools/brain_transformation/behavior.py", line 50, in look_at
    dims=['choice', 'presentation'])
  File "/braintree/home/tmarques/anaconda3/envs/model-submission/lib/python3.6/site-packages/brainio_base/assemblies.py", line 24, in __init__
    super(DataAssembly, self).__init__(*args, **kwargs)
  File "/braintree/home/tmarques/anaconda3/envs/model-submission/lib/python3.6/site-packages/xarray/core/dataarray.py", line 230, in __init__
    coords, dims = _infer_coords_and_dims(data.shape, coords, dims)
  File "/braintree/home/tmarques/anaconda3/envs/model-submission/lib/python3.6/site-packages/xarray/core/dataarray.py", line 81, in _infer_coords_and_dims
    'coordinate %r' % (d, sizes[d], s, k))
ValueError: conflicting sizes for dimension 'choice': length 1 on the data but length 20 on coordinate 'synset'
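For context, the final ValueError is a plain xarray shape mismatch and can be reproduced in isolation: the 'synset' coordinate is declared on the 'choice' dimension with 20 values while the data carries only a single choice per presentation. The sketch below is not the model-tools code path; the synset ids, sizes, and the stimulus_id coordinate are made up for illustration.

```python
import numpy as np
import xarray as xr

num_choices, num_presentations = 20, 5
synsets = ['n%08d' % idx for idx in range(num_choices)]  # placeholder synset ids

# only one entry along 'choice' (e.g. a single sampled choice per stimulus),
# while the 'synset' coordinate below declares 20 choices
data = np.random.rand(1, num_presentations)

try:
    assembly = xr.DataArray(
        data,
        coords={'synset': ('choice', synsets),
                'stimulus_id': ('presentation', np.arange(num_presentations))},
        dims=['choice', 'presentation'])
except ValueError as error:
    # e.g.: conflicting sizes for dimension 'choice': length 1 on the data
    #       but length 20 on coordinate 'synset'
    print(error)
```

This only reproduces the mismatch that xarray complains about; it says nothing about where in the check_submission pipeline the single-choice data comes from.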

mschrimpf commented 3 years ago

Is this with the most recent version? @dapello had the same issue, I believe, but it was fixed after pulling the latest changes.

dapello commented 3 years ago

The hotfix-logits branch of model-tools resolved this for me, I think -- it doesn't look like it has been merged yet?

mschrimpf commented 3 years ago

Merged now (#36).

tiagogmarques commented 3 years ago

Test successful!