jaredleekatzman / DeepSurv

DeepSurv is a deep learning approach to survival analysis.
MIT License

Exact training and validation procedure #44

Closed. CanPhi closed this 6 years ago.

CanPhi commented 6 years ago

Hi,

thank you very much for this GitHub repo! I was wondering if you could provide a little more detail on how exactly you achieved the c-index reported in the paper. You mention doing a hyperparameter search with 3-fold cross-validation. I would like to benchmark your publication against one that I am soon going to publish.

What did you do once the optimal hyperparameters were obtained? Did you still use a validation set for early stopping, or train until the mean early stopping epoch on the full training set? Or is the prediction an ensemble of the three models from the hyperparameter search?

If I understand your tutorials and code correctly, I cannot just use your Docker files, since I am modifying the code (adding a couple of lines that require another hyperparameter).

Also, in your paper you describe the WHAS data set as consisting of 5 features, but the downloaded HDF5 data set seems to have 6 features. Am I getting something wrong, or is that a typo in the paper?
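
For reference, this is how I counted the features directly from the file (the file name and the 'train'/'x' group layout are guesses based on the experiments folder, so adjust if yours differs):

```python
import h5py

# Count the covariates straight from the downloaded WHAS file.
# File name and group layout are assumptions, not confirmed paths.
with h5py.File('whas_train_test.h5', 'r') as f:
    x = f['train']['x']
    print('covariate matrix shape:', x.shape)  # second dim = feature count
```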

Also, did you simply use the default values in DeepSurv.train() for n_epochs, patience etc.? If not, where can I get those values from?

Thanks! Philippe

jaredleekatzman commented 6 years ago

Hi Philippe,

Once I have the optimal training parameters, I train the network and do early stopping if the validation accuracy does not improve by a certain percentage.

I use a separate validation set for that early stopping and for validating, post training, that the tuned hyperparameters worked.
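
In rough Python, the idea is something like this (a minimal sketch of percentage-improvement early stopping, not the actual DeepSurv internals; the callables and default values here are placeholders):

```python
import copy

def train_with_early_stopping(model, fit_epoch, score_on_validation,
                              n_epochs=500, patience=50,
                              improvement_threshold=1e-4):
    """fit_epoch(model) trains for one epoch; score_on_validation(model)
    returns a validation metric to maximize, e.g. the c-index."""
    best_score = float('-inf')
    best_model = None
    epochs_without_improvement = 0
    for _ in range(n_epochs):
        fit_epoch(model)
        current = score_on_validation(model)
        # Only count it as progress if it beats the best score by more
        # than the relative improvement threshold.
        if current > best_score * (1.0 + improvement_threshold):
            best_score = current
            best_model = copy.deepcopy(model)
            epochs_without_improvement = 0
        else:
            epochs_without_improvement += 1
            if epochs_without_improvement >= patience:
                break  # no meaningful improvement for `patience` epochs
    return best_model
```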

If you modify DeepSurv to add hyperparameters, the Docker files should still work (or at most need a little modification). What do your changes look like such that that isn't the case?

Jared

CanPhi commented 6 years ago

Hi Jared,

thanks for your response! Since I have no experience with Docker, I assumed it ran some compiled version of your code, so that modifications wouldn't apply, but from what you are saying I gather that this was a misconception. I will try running it with Docker then.

I basically just need to pass an additional hyperparameter to the initialization of the model, so it is a really minor change. But I can also run the hyperparameter search from my own script, that should be no trouble. I found all settings for the current hyperparameters in this file: https://github.com/jaredleekatzman/DeepSurv/blob/master/experiments/deepsurv/models/whas_model_selu_revision.0.json

The only exceptions are "validation_frequency", "patience", and "improvement_threshold", which are passed to the model's train method. Do you know where I can get these from?
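
In case it helps anyone else, this is roughly how I'm wiring it up. Everything here is an assumption on my part: the import path, the file names, that the JSON keys map straight onto the constructor kwargs, and the train() values, which are placeholders rather than the paper's settings:

```python
import json
import h5py
from deepsurv import deep_surv

def load_split(f, split):
    # DeepSurv expects dicts with covariates 'x', times 't', events 'e'
    return {k: f[split][k][()] for k in ('x', 't', 'e')}

with h5py.File('whas_train_test.h5', 'r') as f:
    train_data = load_split(f, 'train')
    valid_data = load_split(f, 'test')  # test split as a stand-in here

with open('whas_model_selu_revision.0.json') as fp:
    hyperparams = json.load(fp)

network = deep_surv.DeepSurv(**hyperparams)  # plus my extra hyperparameter
log = network.train(
    train_data, valid_data,
    n_epochs=500,                 # placeholder values, not the paper's
    validation_frequency=250,
    patience=2000,
    improvement_threshold=0.99999,
)
```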

I will keep you updated in case I run into something that I cannot figure out, and otherwise link the arXiv preprint here once a first version is ready for public scrutiny.

Thanks again! Philippe

jaredleekatzman commented 6 years ago

Hi Philippe,

I actually copy over the entire contents of the DeepSurv directory and access the module locally. So as long as your changes are made in the directory referenced by the Dockerfile, it should work!

As for those additional parameters, they should just use the default values defined in the package. I can't remember whether they are configurable from the hyperparameter configuration.
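
If you just want to read those defaults off without digging through the source, something like this works (assuming Python 3 and that the module imports as `deepsurv.deep_surv`):

```python
import inspect
from deepsurv import deep_surv

# Print every keyword argument of DeepSurv.train with its default value.
sig = inspect.signature(deep_surv.DeepSurv.train)
for name, param in sig.parameters.items():
    if param.default is not inspect.Parameter.empty:
        print(name, '=', param.default)
```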

Good luck!

Jared

CanPhi commented 6 years ago

Thanks, the code is up and running and producing the expected results! It turns out I vastly overestimated the importance of these parameters; at least on WHAS, training is really robust. Thanks again!