isyangshu / MambaMIL

[MICCAI 2024] Official Code for "MambaMIL: Enhancing Long Sequence Modeling with Sequence Reordering in Computational Pathology"

Why is there a slight gap in the results across multiple training runs, even though I fixed all possible random seeds? #17

Open JiuyangDong opened 2 weeks ago

JiuyangDong commented 2 weeks ago

[attached screenshots showing slightly different metrics across runs]

JiuyangDong commented 2 weeks ago

Is this due to randomness introduced by Mamba itself?

isyangshu commented 5 days ago

Sorry for the late reply.

Regarding Mamba's instability, see https://github.com/state-spaces/mamba/issues/137. You can try a smaller learning rate, which may allow repeated runs to reach the same local optimum before the non-determinism in backpropagation causes them to diverge.