YerevaNN / mimic3-benchmarks

Python suite to construct benchmark machine learning datasets from the MIMIC-III 💊 clinical database.
https://arxiv.org/abs/1703.07771
MIT License

AttributeError: 'int' object has no attribute 'ndim' #55

Closed wuzy361 closed 6 years ago

wuzy361 commented 6 years ago

Hi, thanks for sharing the code. When I use Python 3 to run this model:

python3 -um mimic3models.in_hospital_mortality.main --network mimic3models/keras_models/lstm.py --dim 16 --timestep 1.0 --depth 2 --dropout 0.3 --mode train --batch_size 8 --output_dir mimic3models/in_hospital_mortality/

the following error happened:

==> training
Traceback (most recent call last):
  File "/usr/lib/python3.5/runpy.py", line 184, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.5/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/media/bigdate/software/MIMIC-III/mimic3-benchmarks/mimic3models/in_hospital_mortality/main.py", line 154, in <module>
    batch_size=args.batch_size)
  File "/usr/local/lib/python3.5/dist-packages/keras/engine/training.py", line 1593, in fit
    batch_size=batch_size)
  File "/usr/local/lib/python3.5/dist-packages/keras/engine/training.py", line 1430, in _standardize_user_data
    exception_prefix='target')
  File "/usr/local/lib/python3.5/dist-packages/keras/engine/training.py", line 70, in _standardize_input_data
    data = [np.expand_dims(x, 1) if x is not None and x.ndim == 1 else x for x in data]
  File "/usr/local/lib/python3.5/dist-packages/keras/engine/training.py", line 70, in <listcomp>
    data = [np.expand_dims(x, 1) if x is not None and x.ndim == 1 else x for x in data]
AttributeError: 'int' object has no attribute 'ndim'
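
For context on the last frame: the list comprehension in Keras's `_standardize_input_data` calls `.ndim` on each element of the target data before any conversion to a NumPy array, and a plain Python int has no such attribute, while an ndarray does. A minimal illustration:

```python
import numpy as np

# A bare Python int, as apparently reached this check, has no .ndim
print(hasattr(1, "ndim"))                # False

# A NumPy array exposes .ndim, so the same check passes
print(hasattr(np.asarray([1]), "ndim"))  # True
```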

Maybe the preprocessed data had some mistake, so I reran the preprocessing scripts:

python3 -m mimic3benchmark.scripts.create_in_hospital_mortality data/root/ data/in-hospital-mortality/
python3 -m mimic3models.split_train_val data/in-hospital-mortality

but the same error occurred again.

wuzy361 commented 6 years ago

It's a bug with Keras 2.1.3 specifically. I rolled back to version 2.1.2 and everything worked fine!
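
Rolling back with `pip install keras==2.1.2` is the fix above. An alternative workaround that should also sidestep the 2.1.3 check, sketched here with hypothetical stand-in labels (the real ones come from the benchmark's data readers), is to cast the targets to a NumPy array before calling `fit`:

```python
import numpy as np

# Hypothetical stand-in for the in-hospital-mortality labels; as a plain
# Python list of ints, its elements lack the .ndim attribute that
# Keras 2.1.3 inspects before converting targets to arrays.
train_y = [0, 1, 1, 0]

# Casting to an ndarray gives the targets a .ndim, so the check passes.
train_y = np.asarray(train_y, dtype="float32")
print(train_y.ndim)  # 1

# model.fit(train_X, train_y, batch_size=8, ...)  # proceeds past the error
```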