Closed akidway closed 2 months ago
However, when I used CluSiam trained for 20 epochs, the trivial solution seems to disappear.
And when these features (extracted by the 20-epoch CluSiam) were used to train a MIL model, the resulting AUC also looks normal: on the validation split of the training data, the AUC reaches about 0.9.
Hi @akidway ,
I used CLAM-preprocessed datasets recently as well and observed that CluSiam wasn't as stable as it was on the datasets used previously. I suspect this instability is caused by redundant background affecting the pre-training, such as non-tissue regions (e.g., black bars or other artifacts at the edges of WSIs) produced by the CLAM pipeline.
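If background tiles are the culprit, one option is to filter them out before pre-training. The sketch below is my own illustration, not part of the CluSiam or CLAM code: it drops tiles whose grayscale pixels are mostly near-black or near-white, with thresholds that are purely illustrative.

```python
# Hypothetical pre-filter for CLAM-extracted tiles: drop patches that are
# mostly non-tissue (near-black bars or near-white background) before
# self-supervised pre-training. All thresholds here are illustrative.

def background_fraction(pixels, low=20, high=235):
    """Fraction of grayscale pixel values that are near-black or near-white."""
    n_bg = sum(1 for p in pixels if p <= low or p >= high)
    return n_bg / len(pixels)

def is_background_tile(pixels, max_bg=0.75):
    """Treat a tile as background if >= 75% of its pixels are non-tissue."""
    return background_fraction(pixels) >= max_bg

# Toy examples: a black-bar artifact vs. a mostly mid-gray "tissue" tile.
black_bar = [5] * 100
tissue = [120] * 90 + [10] * 10

print(is_background_tile(black_bar))  # True
print(is_background_tile(tissue))     # False
```

In practice you would compute the grayscale histogram per .jpeg tile (e.g., with PIL) and skip tiles that fail the check when building the pre-training file list.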
Increasing the alpha value might address the issue you mentioned. In my subsequent experiments, it further improved MIL classification accuracy on the original datasets. More importantly, could you try the CluBYOL training method? I have found it to be more stable than CluSiam and highly recommend using it.
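For readers unfamiliar with the alpha hyperparameter: the sketch below is schematic only and assumes (my assumption, not a statement about CluSiam's actual API) that alpha weights a clustering term against the Siamese invariance loss, so a larger alpha pushes harder against collapsed representations. The names `siamese_loss` and `cluster_loss` are placeholders.

```python
# Schematic only: how an alpha hyperparameter typically trades off a
# clustering objective against a Siamese invariance loss. The function
# and argument names are placeholders, not CluSiam's actual interface.

def total_loss(siamese_loss, cluster_loss, alpha=1.0):
    """Larger alpha puts more weight on the clustering term."""
    return siamese_loss + alpha * cluster_loss

print(total_loss(0.5, 0.2, alpha=1.0))  # 0.7
print(total_loss(0.5, 0.2, alpha=2.0))  # 0.9
```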
Thank you for your prompt response. Your advice is valuable to me. I'll give CluBYOL a try later on. Best wishes.
Hi, @wwyi1828
I'm trying to reproduce your result.
But when I use CluSiam (trained for 50 epochs) to extract features, I found that all tile representations are almost the same.![image](https://github.com/wwyi1828/CluSiam/assets/93326359/f888d89e-80af-4089-89ae-bd832bd0cfe4)
When I used these representations to train a MIL model, the resulting AUC was always around 0.5.
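The "all tiles look identical" symptom can be quantified instead of eyeballed. This is my own quick check, not from the CluSiam repo: if the mean pairwise cosine similarity of the extracted features is close to 1.0, the encoder has collapsed to a trivial solution.

```python
# Quick collapse check: mean pairwise cosine similarity over extracted
# feature vectors. Values near 1.0 indicate a collapsed encoder.
import math

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def mean_pairwise_cosine(feats):
    pairs = [(i, j) for i in range(len(feats)) for j in range(i + 1, len(feats))]
    return sum(cosine(feats[i], feats[j]) for i, j in pairs) / len(pairs)

collapsed = [[1.0, 1.0], [1.01, 0.99], [0.99, 1.01]]  # near-identical features
healthy = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]]        # spread-out features

print(round(mean_pairwise_cosine(collapsed), 3))  # ~1.0 -> collapse
print(round(mean_pairwise_cosine(healthy), 3))    # much lower -> healthy
```

Running this on a few hundred randomly sampled tile features (e.g., via `torch.nn.functional.cosine_similarity` at scale) makes it easy to compare the 20-epoch and 50-epoch checkpoints directly.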
To train the CluSiam model, I used the commands below:
The training data are 2.5 million .jpeg files (C16 processed with CLAM).
Is there any trick I missed? Could you please give some advice?