liupei101 / TFDeepSurv

Cox proportional hazards model and survival analysis implemented with TensorFlow.
MIT License

hpopt issues #6

Open XPeriment2 opened 4 years ago

XPeriment2 commented 4 years ago

Hyperopt tuning - I can't seem to change the hidden layer sizes to anything other than [6, 3, 1]; any other configuration throws an error about the hidden layers. Also, is there a way to tune the hidden layers as part of the hyper-parameters?

With [6, 3, 1], the run aborts after about 16% of the trials, saying there are NaNs (the dataset has no NaNs).

liupei101 commented 4 years ago

Hi @XPeriment2 , thanks for your feedback.

The hyper-parameter tuning tools have been updated. We tested the script hpopt.py on the dataset simulated_data_train.csv (which is accessible to anyone) and did not encounter the issues you described.

Please try again with the updated hpopt.py script!

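For reference, the traceback posted later in this thread shows that hpopt.py drives the tuning with hyperopt's fmin (hpt.fmin(train_dsl_by_vd, space, algo=hpt.tpe.suggest, ...)). A minimal, self-contained sketch of that pattern is below; the search space values and the stub objective are placeholders for illustration, not the script's actual defaults:

    from hyperopt import fmin, tpe, hp

    # Placeholder search space; the real one in hpopt.py defines the model's
    # hyper-parameters, e.g. learning rate and number of training rounds.
    space = {
        'learning_rate': hp.loguniform('learning_rate', -6, -2),
        'num_rounds': hp.quniform('num_rounds', 500, 3000, 100),
    }

    def train_dsl_by_vd(params):
        # Stub standing in for the real objective in hpopt.py, which trains a
        # tfdeepsurv model and returns the per-trial loss to minimize.
        return params['learning_rate']

    best = fmin(train_dsl_by_vd, space, algo=tpe.suggest, max_evals=20)
    print(best)
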
XPeriment2 commented 4 years ago

Has it been uploaded since yesterday? I installed TFDeepSurv yesterday and still can't change the size of the hidden layers... thanks

liupei101 commented 4 years ago

Sorry, it was uploaded just a few minutes ago.

Just download the new hpopt.py and replace the old one with it.

XPeriment2 commented 4 years ago

Thank you, but I'm still getting the same error:

Traceback (most recent call last):
  File "hpopt.py", line 169, in <module>
    main(os.path.join(WORK_DIR, DATA_PATH))
  File "hpopt.py", line 166, in main
    search_params(max_evals=MAX_EVALS)
  File "hpopt.py", line 118, in search_params
    best = hpt.fmin(train_dsl_by_vd, space, algo=hpt.tpe.suggest, max_evals=max_evals)
  File "/home/amir/anaconda3/lib/python3.7/site-packages/hyperopt/fmin.py", line 509, in fmin
    rval.exhaust()
  File "/home/amir/anaconda3/lib/python3.7/site-packages/hyperopt/fmin.py", line 330, in exhaust
    self.run(self.max_evals - n_done, block_until_done=self.asynchronous)
  File "/home/amir/anaconda3/lib/python3.7/site-packages/hyperopt/fmin.py", line 286, in run
    self.serial_evaluate()
  File "/home/amir/anaconda3/lib/python3.7/site-packages/hyperopt/fmin.py", line 165, in serial_evaluate
    result = self.domain.evaluate(spec, ctrl)
  File "/home/amir/anaconda3/lib/python3.7/site-packages/hyperopt/base.py", line 894, in evaluate
    rval = self.fn(pyll_rval)
  File "hpopt.py", line 96, in train_dsl_by_vd
    ds.train(train_X, train_y, num_steps=params['num_rounds'], silent=True)
  File "/home/amir/anaconda3/lib/python3.7/site-packages/tfdeepsurv/dsl.py", line 238, in train
    watch_list['metrics'].append(concordance_index(self.train_data_y.values, -y_hat))
  File "/home/amir/anaconda3/lib/python3.7/site-packages/tfdeepsurv/utils.py", line 106, in concordance_index
    ci_value = ci(t, y_pred, e)
  File "/home/amir/anaconda3/lib/python3.7/site-packages/lifelines/utils/concordance.py", line 54, in concordance_index
    event_times, predicted_scores, event_observed
  File "/home/amir/anaconda3/lib/python3.7/site-packages/lifelines/utils/concordance.py", line 269, in _preprocess_scoring_data
    raise ValueError("NaNs detected in inputs, please correct or drop.")
ValueError: NaNs detected in inputs, please correct or drop.

liupei101 commented 4 years ago

It seems that NaNs occurred in the predictions (given that you are sure there are no NaNs in your dataset).

You can try normalizing the dataset by setting IS_NORM = True.

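A quick way to double-check where the NaNs come from, and what normalization corresponds to, is sketched below. The file name and the time/event column names are placeholders for your own dataset, and the standardization shown is only an assumption about what IS_NORM = True does (scaling the features so that large raw values do not blow up into NaN risk scores):

    import pandas as pd

    df = pd.read_csv("my_survival_data.csv")
    print(df.isna().sum())  # should be all zeros if the dataset is fully imputed

    # Standardize the feature columns (zero mean, unit variance), leaving the
    # survival-time and event-indicator columns untouched.
    feature_cols = [c for c in df.columns if c not in ("t", "e")]
    df[feature_cols] = (df[feature_cols] - df[feature_cols].mean()) / df[feature_cols].std()
    print(df[feature_cols].describe())
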
XPeriment2 commented 4 years ago

My dataset is fully imputed, so there are no NaNs in it. I'll try normalizing and see if it helps.

Also - how would you add the number of layers (instead of the current [7, 3, 1]) to the set of parameters being optimized?

XPeriment2 commented 4 years ago

OK, so normalizing seems to solve it - sometimes the search makes it through the 20th trial, and sometimes all the way.

However, I want the number of hidden layers to be optimized too, from 1 layer up to 50 layers. How can I do that (without doing it manually, of course)?

thank you amir

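One way to expose the hidden-layer shape to hyperopt is sketched below. The 'hidden_layers' key and the candidate shapes are assumptions for illustration; the objective in hpopt.py (train_dsl_by_vd) would have to read this entry from params and pass it to the model constructor in place of a fixed [6, 3, 1]:

    from hyperopt import hp

    # Option 1: choose among a fixed list of candidate architectures.
    candidate_layers = [[6, 3, 1], [16, 8, 1], [32, 16, 8, 1], [64, 32, 16, 8, 1]]
    space = {
        'hidden_layers': hp.choice('hidden_layers', candidate_layers),
        'learning_rate': hp.loguniform('learning_rate', -6, -2),
    }

    # Option 2: sample a depth and a width, and build the shape inside the objective.
    space_v2 = {
        'n_layers': hp.quniform('n_layers', 1, 10, 1),
        'n_units': hp.quniform('n_units', 4, 64, 4),
    }

    def build_hidden_layers(params):
        # Halve the width at each successive layer and end with one output unit.
        n_layers, n_units = int(params['n_layers']), int(params['n_units'])
        return [max(n_units // (2 ** i), 2) for i in range(n_layers)] + [1]

As a design note, very deep fully connected networks (e.g. 50 layers) are hard to train on tabular survival data, so searching over a handful of depths is usually a more practical range.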

XPeriment2 commented 4 years ago

No, sorry - it still fails after about 30% of the trials, even with normalization. Really strange.

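If some trials still produce NaN predictions, one possible workaround (not part of the repository, just a common hyperopt pattern) is to have the objective report a failed trial instead of letting the exception abort the whole search:

    import numpy as np
    from hyperopt import STATUS_OK, STATUS_FAIL

    def safe_objective(params):
        try:
            # run_one_trial is a hypothetical helper: train the model for this
            # parameter set and return the validation concordance index.
            ci = run_one_trial(params)
            if np.isnan(ci):
                return {'status': STATUS_FAIL}
            return {'loss': -ci, 'status': STATUS_OK}
        except ValueError:  # e.g. "NaNs detected in inputs" raised by lifelines
            return {'status': STATUS_FAIL}

Passing safe_objective to fmin in place of the original objective lets the search skip the bad configurations and continue.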