openvax / mhcflurry

Peptide-MHC I binding affinity prediction
http://openvax.github.io/mhcflurry/
Apache License 2.0

missing args.num_jobs in `train_allele_specific_models_command` #149

Closed: WesleyyC closed this issue 4 years ago

WesleyyC commented 4 years ago

Hi,

In train_allele_specific_models_command.py, we are missing a num_jobs argument, so in serial mode the predictor is not updated correctly within train_model(). There used to be a num_jobs argument within the file, but it was probably removed during refactoring.
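For illustration, here is a minimal argparse sketch of the failure mode (a generic example, not mhcflurry's actual parser): reading an attribute that the parser never defined raises an AttributeError.

import argparse

# Parser that never defines --num-jobs, as in the refactored module.
parser = argparse.ArgumentParser()
args = parser.parse_args([])

# Any later read of the undefined attribute fails immediately:
args.num_jobs  # AttributeError: 'Namespace' object has no attribute 'num_jobs'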

Thanks, Wes

timodonnell commented 4 years ago

Thanks for reporting this. I'm not sure I follow though - can you give more detail or a test code snippet showing the incorrect behavior you're seeing?

There is a --num-jobs argument to the mhcflurry-class1-train-allele-specific-models command:

$ mhcflurry-class1-train-allele-specific-models -h
...
Local parallelism:
  --num-jobs N          Number of local processes to parallelize training
                        over. Set to 0 for serial run. Default: 0.
  --backend {tensorflow-gpu,tensorflow-cpu,tensorflow-default}
                        Keras backend. If not specified will use system
                        default.
  --gpus N              Number of GPUs to attempt to parallelize across.
                        Requires running in parallel.
...
WesleyyC commented 4 years ago

In your source code here:

https://github.com/openvax/mhcflurry/blob/c8976b0bc61f08b6dc50194b62abb4d82ef70301/mhcflurry/train_allele_specific_models_command.py#L196

args.num_jobs is not assigned, and running the code in serial mode will cause this line to break:

https://github.com/openvax/mhcflurry/blob/c8976b0bc61f08b6dc50194b62abb4d82ef70301/mhcflurry/train_allele_specific_models_command.py#L309

In addition, since you are checking args.num_jobs == 1, would it make more sense to set the default to 1 instead?
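To make that concrete, here is a hypothetical sketch (invented defaults and messages, not the actual mhcflurry code) of the dispatch as I understand it: with the serial branch keyed on args.num_jobs == 1 but the documented default of 0, a default run never takes the serial path.

import argparse

parser = argparse.ArgumentParser()
# The help text above documents "Default: 0".
parser.add_argument("--num-jobs", type=int, default=0, metavar="N")
args = parser.parse_args([])

# Dispatch as discussed in this thread: serial only when num_jobs == 1.
if args.num_jobs == 1:
    print("serial path")       # unreachable with the default of 0
else:
    print("worker-pool path")  # the default lands here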

timodonnell commented 4 years ago

Thanks @WesleyyC. This should be fixed in #153, which should be merged soon.

WesleyyC commented 4 years ago

Awesome, thanks!