Closed aditipanda closed 4 months ago
Could you tell me which metric you are trying to compute?
@jonasricker I got the table. Finally. :+1:
Now I have added a confusion matrix to the list of metrics, using confusion_matrix from sklearn.metrics, just like the roc_auc, accuracy, etc. used in the original code. This is the new error:
File "C:\Users\hp\miniconda3\envs\tf\lib\inspect.py", line 3015, in _bind
raise TypeError('missing a required argument: {arg!r}'.
TypeError: missing a required argument: 'y_pred'
Any idea why? I think the y_score=y_score in the apply_metrics function is the reason, but then how did it work for auroc and pd@1%?
Follow-up:
Sorry, my bad. sklearn.metrics.confusion_matrix takes y_pred, like accuracy, and unlike roc-auc or avg-precision, which take y_score. I called confusion_matrix the same way accuracy is called in the code, and managed to compute and dump it using np.save. To get the final results, though, I have to change combine_columns and a few other things, so I'm going to work on those and close this for now.
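For anyone landing here, the fix described above can be sketched like this: confusion_matrix compares discrete labels, so continuous scores must be thresholded into hard predictions first (0.5 below is an assumed cutoff, not something from the original code):

```python
import numpy as np
from sklearn.metrics import confusion_matrix

y_true = np.array([0, 0, 1, 1])
y_score = np.array([0.1, 0.6, 0.8, 0.3])

# Binarize the continuous scores into hard labels before calling the metric
# (0.5 is an assumed threshold; pick one appropriate for your detector)
y_pred = (y_score >= 0.5).astype(int)

cm = confusion_matrix(y_true, y_pred)  # rows: true class, cols: predicted class
np.save("confusion_matrix.npy", cm)   # dump for later aggregation
```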
I'm trying to evaluate the pre-trained models on my own dataset. The .pth files are getting created, but just before the final table I'm getting this error:
ValueError: Classification metrics can't handle a mix of binary and continuous targets
Any idea how to solve this? I didn't change anything in the code, except in the apply_metrics_to_df function:
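For reference, this ValueError typically means a label-based classification metric received raw continuous scores instead of class labels. A minimal reproduction, assuming binary ground-truth labels and probability-like scores (names here are illustrative, not from the repo):

```python
from sklearn.metrics import accuracy_score

y_true = [0, 1, 1, 0]
y_score = [0.2, 0.9, 0.7, 0.4]  # continuous scores, not class labels

# Passing raw scores to a label-based metric reproduces the error
try:
    accuracy_score(y_true, y_score)
except ValueError as e:
    print(e)  # "Classification metrics can't handle a mix of binary and continuous targets"

# The fix: threshold the scores into hard labels first (0.5 is an assumed cutoff)
y_pred = [int(s >= 0.5) for s in y_score]
accuracy_score(y_true, y_pred)
```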
Please help!