sadaf92 closed this issue 4 years ago.
Since I don't know about your application area, here's general ML advice: If there's evidence of overfitting, try more model regularization.
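For example, a minimal sketch of adding more regularization in a generic PyTorch setup could look like the following. The model, optimizer, and values are placeholders for illustration only, not this repo's configuration:

```python
# Minimal sketch: add weight decay (L2 regularization) to the optimizer.
# The network and hyperparameter values below are illustrative placeholders.
import torch

model = torch.nn.Conv2d(3, 19, kernel_size=3, padding=1)  # stand-in for the real segmentation network

optimizer = torch.optim.SGD(
    model.parameters(),
    lr=0.01,
    momentum=0.9,
    weight_decay=1e-4,  # increase this (or add dropout / augmentation) if overfitting persists
)
```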
Thanks for your response. Maybe it's better to explain my issue in another way. As far as I understand, the anomalous object in your dataset is labeled as "13". I looked through your training dataset and found images that contain label "13". On the other hand, I expected that anomalous objects would not appear in the training set. Am I misunderstanding something? Best, Sadaf
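For reference, this is roughly how such a check of the training masks for label 13 could be done; the directory path and PNG mask format are assumptions about the dataset layout, not the actual structure of this repo:

```python
# Hypothetical check: scan training label masks for the anomaly label (13 here)
# to see whether anomalous objects appear in the training split.
from pathlib import Path

import numpy as np
from PIL import Image

ANOMALY_ID = 13
label_dir = Path("train/labels")  # assumed location of the ground-truth masks

images_with_anomaly = []
for mask_path in sorted(label_dir.glob("*.png")):
    mask = np.array(Image.open(mask_path))
    if (mask == ANOMALY_ID).any():
        images_with_anomaly.append(mask_path.name)

print(f"{len(images_with_anomaly)} training masks contain label {ANOMALY_ID}")
```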
Thanks for your response! I am wondering how the model would work on a binary classification problem. In this scenario, I would like to consider only two classes and treat any other object as anomalous.
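As a rough illustration of what I mean, the labels could be remapped along these lines; the class ids and ignore index below are hypothetical, not taken from your dataset:

```python
# Hypothetical remapping of a multi-class label mask to a binary problem:
# the two known classes become background (0) and foreground (1), and every
# other label is marked as ignore so it does not contribute to the loss.
import numpy as np

KNOWN_BACKGROUND = {0}   # class id(s) treated as the negative class (assumed)
KNOWN_FOREGROUND = {1}   # class id(s) treated as the positive class (assumed)
IGNORE_ID = 255          # common ignore index for segmentation losses

def to_binary(mask: np.ndarray) -> np.ndarray:
    out = np.full_like(mask, IGNORE_ID)
    out[np.isin(mask, list(KNOWN_BACKGROUND))] = 0
    out[np.isin(mask, list(KNOWN_FOREGROUND))] = 1
    return out
```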
There are perhaps better models than PSPNet for a problem such as NVIDIA segmentation, but we have not tried them.
We also have not tried PSPNet on a binary classification problem. It might be helpful to look at foreground/background segmentation papers to see what has been done in that area.
Hope that helps.
Yeah, it makes sense :) Thank you so much!
Hi, I was wondering if there are any specific considerations for training the model on a new dataset with different classes. I want to apply your code to an agricultural dataset; I got 100% accuracy with the basic configuration on my dataset, but the test results show nothing useful. I suspect the reason might be how the classes are defined. Do you have any suggestions?
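One sanity check I was considering is to look at the per-class pixel frequencies of my masks, to see whether the 100% accuracy simply reflects one class dominating the images. The path and PNG mask format below are only assumptions about my dataset, not part of your code:

```python
# Hypothetical sanity check: compute per-class pixel frequencies over the
# training masks to detect a heavily dominant class.
from collections import Counter
from pathlib import Path

import numpy as np
from PIL import Image

label_dir = Path("dataset/train/labels")  # assumed dataset layout

counts = Counter()
for mask_path in label_dir.glob("*.png"):
    mask = np.array(Image.open(mask_path))
    ids, freqs = np.unique(mask, return_counts=True)
    counts.update(dict(zip(ids.tolist(), freqs.tolist())))

total = sum(counts.values())
for class_id, freq in sorted(counts.items()):
    print(f"class {class_id}: {100.0 * freq / total:.2f}% of pixels")
```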