nomewang / M3DM

MIT License

Multiple memory bank problem #29

Open Masahiro1002 opened 11 months ago

Masahiro1002 commented 11 months ago

Thank you for sharing your code. I have a question because there is a discrepancy between the results in Tables 1 and 2 of the paper and my own results.

For 3D-only and RGB-only, I get results similar to those in the paper. However, for RGB+3D, the AUPRO score is significantly lower than reported, so the multiple memory banks don't seem to work well together. I suspect there might be an issue with how the function add_sample_to_late_fusion_mem_bank calculates s_map, but I haven't been able to resolve it. Do you have any idea what the problem might be?
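To make the suspected failure mode concrete, here is a toy sketch (not the repository's actual code; the function names and the min-max normalization step are my own assumptions): each modality's score map is the nearest-neighbor distance to its memory bank, and if the raw distance maps are fused without rescaling, the modality with the larger distance scale can dominate and drag AUPRO down.

```python
# Hypothetical sketch of memory-bank scoring and late fusion.
# NOT the authors' add_sample_to_late_fusion_mem_bank implementation;
# the per-map min-max normalization before fusion is an assumption.
import numpy as np

def score_map(features, memory_bank):
    """Per-patch anomaly score = distance to the nearest memory-bank entry.

    features:    (n_patches, d) patch features of one test sample
    memory_bank: (n_mem, d) stored nominal patch features
    """
    # Pairwise squared Euclidean distances, shape (n_patches, n_mem)
    d2 = ((features[:, None, :] - memory_bank[None, :, :]) ** 2).sum(-1)
    return np.sqrt(d2.min(axis=1))  # (n_patches,)

def fuse_score_maps(maps):
    """Fuse per-modality score maps after per-map min-max normalization.

    Skipping this normalization is one way a combined RGB+3D score map
    can end up worse than either single-modality map.
    """
    normed = []
    for m in maps:
        lo, hi = m.min(), m.max()
        normed.append((m - lo) / (hi - lo + 1e-8))
    return np.mean(normed, axis=0)
```

Comparing `fuse_score_maps([rgb_map, pc_map])` with a plain `np.mean` of the raw maps is a quick way to check whether a scale mismatch between modalities explains the drop.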

**Point_MAE**

| Metric | Bagel | Gland | Carrot | Cookie | Dowel | Foam | Peach | Potato | Rope | Tire | Mean |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| I-AUROC | 0.945 | 0.648 | 0.971 | 0.983 | 0.832 | 0.766 | 0.87 | 0.919 | 0.869 | 0.824 | 0.863 |
| P-AUROC | 0.981 | 0.949 | 0.996 | 0.934 | 0.959 | 0.927 | 0.988 | 0.994 | 0.994 | 0.983 | 0.971 |
| AUPRO | 0.948 | 0.821 | 0.977 | 0.883 | 0.881 | 0.746 | 0.954 | 0.973 | 0.948 | 0.936 | 0.907 |

**DINO**

| Metric | Bagel | Gland | Carrot | Cookie | Dowel | Foam | Peach | Potato | Rope | Tire | Mean |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| I-AUROC | 0.942 | 0.918 | 0.896 | 0.749 | 0.959 | 0.767 | 0.919 | 0.644 | 0.942 | 0.767 | 0.85 |
| P-AUROC | 0.992 | 0.993 | 0.994 | 0.977 | 0.983 | 0.955 | 0.994 | 0.99 | 0.995 | 0.994 | 0.987 |
| AUPRO | 0.951 | 0.972 | 0.973 | 0.891 | 0.932 | 0.843 | 0.97 | 0.956 | 0.968 | 0.966 | 0.942 |

**Fusion**

| Metric | Bagel | Gland | Carrot | Cookie | Dowel | Foam | Peach | Potato | Rope | Tire | Mean |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| I-AUROC | 0.975 | 0.899 | 0.915 | 0.937 | 0.949 | 0.882 | 0.979 | 0.688 | 0.957 | 0.834 | 0.902 |
| P-AUROC | 0.994 | 0.99 | 0.997 | 0.985 | 0.986 | 0.972 | 0.995 | 0.994 | 0.997 | 0.995 | 0.99 |
| AUPRO | 0.963 | 0.965 | 0.978 | 0.942 | 0.944 | 0.889 | 0.974 | 0.967 | 0.973 | 0.973 | 0.957 |

**DINO+Point_MAE+Fusion**

| Metric | Bagel | Gland | Carrot | Cookie | Dowel | Foam | Peach | Potato | Rope | Tire | Mean |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| I-AUROC | 0.988 | 0.903 | 0.97 | 0.971 | 0.941 | 0.925 | 0.964 | 0.909 | 0.971 | 0.86 | 0.94 |
| P-AUROC | 0.698 | 0.683 | 0.729 | 0.651 | 0.663 | 0.638 | 0.684 | 0.696 | 0.701 | 0.708 | 0.685 |
| AUPRO | 0.287 | 0.27 | 0.37 | 0.212 | 0.31 | 0.213 | 0.351 | 0.589 | 0.314 | 0.497 | 0.341 |
lijing0901 commented 8 months ago


Hi, friends! I only tried the DINO method, but my results are much worse than yours. May I ask whether you made any improvements to the RGB-only method, or what your device environment is?

BanquetLee commented 7 months ago

My DINO results are similar to those in the paper, but the Point_MAE method didn't perform well on some classes of MVTec 3D-AD.

VegetableChicken504 commented 4 months ago

Hi, when I ran the code, only the I-AUROC metric did not match the paper: I got 0.936 versus 0.945 reported. I saw that your I-AUROC result was 0.94, so may I ask what your graphics card configuration is? The UFF training weights I used are the best weights provided by the authors (the UFF Module under checkpoints).

Masahiro1002 commented 4 months ago

@VegetableChicken504 I used an NVIDIA V100 at that time. I think a result 0.009 points lower is plausible due to random seeds and other conditions.
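For anyone comparing numbers across machines, one way to reduce the run-to-run variance mentioned above is to pin the random seeds before training and evaluation. A minimal sketch (the `set_seed` helper is my own, not part of the repository; the PyTorch lines are shown commented out):

```python
# Hypothetical seeding helper to make repeated runs comparable.
import os
import random
import numpy as np

def set_seed(seed: int = 42):
    random.seed(seed)
    np.random.seed(seed)
    os.environ["PYTHONHASHSEED"] = str(seed)
    # If using PyTorch, additionally:
    # torch.manual_seed(seed)
    # torch.cuda.manual_seed_all(seed)
    # torch.backends.cudnn.deterministic = True

# With the same seed, random draws are reproducible across runs.
set_seed(123)
a = np.random.rand(3)
set_seed(123)
b = np.random.rand(3)
```

Note that even with fixed seeds, some GPU kernels are non-deterministic, so small metric differences (on the order of the 0.936 vs. 0.945 gap discussed here) can remain across hardware.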