Open XPeriment2 opened 4 years ago
Hi @XPeriment2, thanks for your feedback.
The hyper-parameter tuning tools have been updated. We tested the script hpopt.py with the dataset simulated_data_train.csv (which is publicly available) and did not encounter the issue you described.
Please try again with the updated hpopt.py!
Has it been uploaded since yesterday? I installed TFDeepSurv yesterday and still can't change the size of the hidden layers... thanks
Sorry, it was uploaded just a few minutes ago.
Just download hpopt.py and replace the old one with it.
Thank you, but I'm still getting the same error:
Traceback (most recent call last):
  File "hpopt.py", line 169, in <module>
    watch_list['metrics'].append(concordance_index(self.train_data_y.values, -y_hat))
  File "/home/amir/anaconda3/lib/python3.7/site-packages/tfdeepsurv/utils.py", line 106, in concordance_index
    ci_value = ci(t, y_pred, e)
  File "/home/amir/anaconda3/lib/python3.7/site-packages/lifelines/utils/concordance.py", line 54, in concordance_index
    event_times, predicted_scores, event_observed
  File "/home/amir/anaconda3/lib/python3.7/site-packages/lifelines/utils/concordance.py", line 269, in _preprocess_scoring_data
    raise ValueError("NaNs detected in inputs, please correct or drop.")
ValueError: NaNs detected in inputs, please correct or drop.
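The ValueError above is raised by lifelines' input validation, which rejects NaNs in the event times or predicted scores. A quick way to locate offending entries before calling concordance_index (a minimal sketch; the `y_hat` values here are stand-ins, not real model output):

```python
import math

def find_nans(values):
    """Return the indices of NaN entries in a sequence of floats."""
    return [i for i, v in enumerate(values)
            if isinstance(v, float) and math.isnan(v)]

# Stand-in predictions with one NaN, mimicking what the traceback suggests.
y_hat = [0.3, float("nan"), 1.2, 0.7]
print(find_nans(y_hat))  # indices of the NaN predictions
```

If the dataset itself is clean, NaNs at this point usually mean the network's output diverged during training (e.g. exploding loss), which is why normalization can help.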
It seems the NaNs occurred in the predictions (assuming there are no NaNs in your dataset).
You can try normalizing the dataset by setting IS_NORM = True.
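IS_NORM is a flag in TFDeepSurv's hpopt.py; the thread does not show its implementation, but a plain per-column z-score standardization, which such a flag typically enables, can be sketched as follows (the `ages` column is a hypothetical example):

```python
def z_score(column):
    """Standardize a feature column to zero mean and unit variance."""
    n = len(column)
    mean = sum(column) / n
    var = sum((x - mean) ** 2 for x in column) / n
    std = var ** 0.5 or 1.0  # guard against zero-variance columns
    return [(x - mean) / std for x in column]

ages = [40.0, 50.0, 60.0]
print(z_score(ages))  # centered around 0, unit spread
```

Bringing all features onto a comparable scale keeps the network's weighted sums small, which makes divergence to NaN during training less likely.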
All of my dataset is imputed, so there are no NaNs there... I'll try normalizing and see if it helps.
Also - how would you add the number of layers (instead of the current 7, 3, 1) to be optimized?
OK, so normalizing seems to solve it - sometimes it runs through the 20th search, and sometimes all the way.
However, I want the number of hidden layers to be optimized too... from 1 layer up to 50 layers. How can I do that (without doing it manually, of course)?
Thank you, Amir
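The thread never answers this, but a common approach with hyperopt-style tuners (a sketch only - it assumes hpopt.py uses hyperopt, and the helper `make_candidates` is hypothetical) is to make the depth itself a categorical choice: enumerate one candidate layer list per depth, then let the tuner pick among them, e.g. via hyperopt's hp.choice.

```python
def make_candidates(max_depth, width=8):
    """Build one candidate hidden-layer configuration per depth.

    Each candidate tapers from `width` units down toward 1, so a
    depth-3 entry resembles the default tapering [7, 3, 1] shape.
    """
    candidates = []
    for depth in range(1, max_depth + 1):
        # Linearly spaced widths from `width` down toward 1, one per layer.
        layers = [max(1, round(width * (depth - i) / depth))
                  for i in range(depth)]
        candidates.append(layers)
    return candidates

space = make_candidates(max_depth=5)
print(space)
# With hyperopt, this list could then be wrapped as:
#   hidden_layers = hp.choice('hidden_layers', space)
```

For the "1 up to 50 layers" case, `make_candidates(50)` would give 50 options; note that very deep candidates make the search much slower and each trial harder to train.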
No, sorry - it still fails after 30% of the trials, even with normalization. Really strange.
Hpopt tuning - I can't seem to change the hidden-layer sizes to anything other than [6, 3, 1]; any other configuration raises a hidden-layers error. Also, is there any way of tuning this as part of the hyper-parameters?
With 6, 3, 1, the run is aborted after 16% saying there are NaNs? (The dataset has no NaNs.)