jonasricker / diffusion-model-deepfake-detection

[VISAPP2024] Towards the Detection of Diffusion Model Deepfakes
https://arxiv.org/abs/2210.14571
MIT License

ValueError: Classification metrics can't handle a mix of binary and continuous targets #8

Closed. aditipanda closed this issue 4 months ago.

aditipanda commented 4 months ago

I'm trying to evaluate the pre-trained models on my own dataset. The .pth files are created, but just before the final table I get this error:

ValueError: Classification metrics can't handle a mix of binary and continuous targets

Any idea how to solve this? I didn't change anything in the code except in the apply_metrics_to_df function:

for metric, func in metrics.items():
    if metric == "Acc":
        out[metric] = func(y_true=y_true, y_pred=y_score > THRESHOLDS[predictor.split("_")[0]])
    else:
        out[metric] = func(y_true=y_true, y_score=y_score)  # Here I changed y_score to y_pred as per sklearn.metrics

Please help!
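
A minimal sketch of the likely cause (illustrative values, not from the repo): sklearn metrics that expect hard predictions, such as accuracy_score, reject continuous scores, while ranking metrics such as roc_auc_score require them. That is why the original code thresholds y_score only for the "Acc" branch:

import numpy as np
from sklearn.metrics import accuracy_score, roc_auc_score

# Illustrative values, not from the repo; 0.5 is an assumed threshold.
y_true = np.array([0, 1, 1, 0])
y_score = np.array([0.2, 0.9, 0.6, 0.4])  # continuous model outputs

roc_auc_score(y_true, y_score)         # OK: ranking metric, takes scores
# accuracy_score(y_true, y_score)      # ValueError: Classification metrics
#                                      # can't handle a mix of binary and
#                                      # continuous targets
accuracy_score(y_true, y_score > 0.5)  # OK: threshold into hard predictions first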

jonasricker commented 4 months ago

Could you tell me which metric you are trying to compute?

aditipanda commented 4 months ago

@jonasricker I got the table. Finally. 👍

Now I have added the confusion matrix to the list of metrics, using confusion_matrix from sklearn.metrics, in the same way roc_auc, accuracy, etc. are used in the original code. This is the new error:

File "C:\Users\hp\miniconda3\envs\tf\lib\inspect.py", line 3015, in _bind
    raise TypeError('missing a required argument: {arg!r}'. 
TypeError: missing a required argument: 'y_pred'

Any idea why? I think the y_score=y_score in the apply_metrics function is the reason, but then how did it work for AUROC and PD@1%?
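
A sketch of what is probably going on (illustrative call, not the repo's code): the metrics loop above passes every non-"Acc" metric as func(y_true=..., y_score=...), but confusion_matrix names its second parameter y_pred, so the keyword binding fails. Depending on the sklearn version, the failure may surface from Signature.bind inside inspect (as in the traceback above) or as an unexpected-keyword TypeError.

from sklearn.metrics import confusion_matrix, roc_auc_score

# Illustrative values, not from the repo.
y_true = [0, 1, 1, 0]
y_score = [0.2, 0.9, 0.6, 0.4]

roc_auc_score(y_true=y_true, y_score=y_score)  # OK: the parameter is named y_score
# confusion_matrix(y_true=y_true, y_score=y_score)
#   -> TypeError: missing a required argument: 'y_pred'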

aditipanda commented 4 months ago

Follow-up:

Sorry, my bad. sklearn.metrics.confusion_matrix takes y_pred, like accuracy, and unlike roc_auc or average_precision. I called confusion_matrix the same way as accuracy in the code and managed to compute and dump it using np.save. To get the final results, though, I have to change combine_columns and a few other things, so I'm going to work on those and am closing this for now.
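
A hedged sketch of that workaround (THRESHOLDS and predictor are assumed stand-ins for the repo's objects, and the output file name is made up): call confusion_matrix with thresholded predictions, just like the "Acc" branch, and dump the resulting array with np.save, since a 2x2 matrix does not fit into the scalar-metric table.

import numpy as np
from sklearn.metrics import confusion_matrix

# Assumed stand-ins for the repo's THRESHOLDS dict and predictor name.
THRESHOLDS = {"model": 0.5}
predictor = "model_variantA"

# Illustrative values, not from the repo.
y_true = np.array([0, 1, 1, 0])
y_score = np.array([0.2, 0.9, 0.6, 0.4])

# Threshold the scores into hard predictions, like the "Acc" branch does.
y_pred = y_score > THRESHOLDS[predictor.split("_")[0]]

cm = confusion_matrix(y_true=y_true, y_pred=y_pred)  # 2x2 array for binary labels
np.save("confusion_matrix.npy", cm)  # arrays can't go into the scalar results table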
