bruceyo / MMNet


Accuracy of this method: `python main_rgb_fused.py recognition -c config/ntu60_xsub/test_rgb_fused.yaml` #5

Open katahiyu opened 2 years ago

katahiyu commented 2 years ago

Dear author, I ran this code with the provided trained model and got the following results: why is the Top-1 accuracy only 4.83%? Please let me know if you have a solution. [screenshot]

bruceyo commented 1 year ago

The current trained models are from the conference version, based on ST-GCN's skeleton data (dating back to May 2020). The journal version's implementation (around Jan 2021) is based on MS-G3D's skeleton data, whose joint order is inconsistent with ST-GCN's. Sorry about that. I will upload the newly retrained model. The trained models for the other datasets should be correct.

You may try to retrain the model to get the proper results.
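The joint-order mismatch above can be handled by permuting the joint axis of the skeleton tensor before feeding it to the model. A minimal sketch (the actual ST-GCN vs. MS-G3D index maps are not given in this thread, so the permutation below is illustrative only):

```python
import numpy as np

# Hypothetical permutation mapping one preprocessing's joint order to the
# other's. Replace with the real index map for your data pipeline.
JOINT_PERM = list(range(25))  # identity placeholder for NTU's 25 joints

def remap_joints(skeleton, perm=JOINT_PERM):
    """Reorder the joint axis (V) of a (C, T, V, M) skeleton tensor."""
    return skeleton[:, :, perm, :]

# Example: NTU RGB+D skeletons are (channels, frames, 25 joints, bodies)
x = np.random.randn(3, 300, 25, 2)
y = remap_joints(x)
assert y.shape == x.shape
```

Evaluating a model trained on one joint order against data in the other order produces near-chance accuracy, which is consistent with the 4.83% reported above.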

katahiyu commented 1 year ago

Thank you for your reply. I understand the situation with the trained models. We would appreciate it if you could upload the retrained model.

I'm also having trouble retraining the model. When I run `python main_rgb_fused.py recognition -c config//train_rgb_fused.yaml`, it returns low accuracy, as in the image below. Is there any solution for this? [screenshot]

I really appreciate your kind response.

bruceyo commented 1 year ago

Retrained models `rgb_fused_model_new.pt` are uploaded to Google Drive: NTU RGB+D X-Sub, NTU RGB+D X-View.

katahiyu commented 1 year ago

Thanks for the upload. Why does the newly trained model result in lower accuracy? [screenshot]

bruceyo commented 1 year ago

[screenshot: 20221017102013] Above are my results. I do not know your exact configuration; the discrepancy is probably due to improper preparation of the skeleton data.

katahiyu commented 1 year ago

Is the `best_model.pt` the same as the `rgb_fused_model_new.pt` you recently uploaded?

bruceyo commented 1 year ago

> Is the `best_model.pt` the same as the `rgb_fused_model_new.pt` you recently uploaded?

Yes, I downloaded and tested it again. I used the same skeleton data preparation as MS-G3D.

katahiyu commented 1 year ago

Thanks for the reply. Understood.

This is my `feeder_rgb_fused.py`. Is this path the correct place to put the processed ROI data downloaded from Google Drive? This may be a rudimentary question, but thank you in advance. [screenshot]
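A quick way to rule out a wrong data path is to fail fast before training starts. A minimal sketch (the helper name and path are assumptions, not taken from `feeder_rgb_fused.py`):

```python
import os

def check_data_path(roi_path):
    """Hypothetical sanity check: raise immediately if the processed ROI
    file is missing, rather than silently training on the wrong data."""
    if not os.path.isfile(roi_path):
        raise FileNotFoundError(
            f"Processed ROI data not found at {roi_path}; "
            "download it from the Google Drive link in the README.")
    return roi_path
```

Calling this at the top of the feeder's `__init__` makes a misconfigured path show up as an immediate error instead of the low-accuracy symptom described below.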

bruceyo commented 1 year ago

> This is my `feeder_rgb_fused.py`. Is this path the correct place to put the processed ROI data downloaded from Google Drive?

Yes, you can use it to reproduce the result. But it is not recommended for training, as it is implemented with random sampling (see line 131 of `feeder_rgb_fused_ntu.py`).

katahiyu commented 1 year ago

I see!

It is still inaccurate and does not run well. Is there any part of the provided code that needs to be changed?

katahiyu commented 1 year ago

> Yes, you can use it to reproduce the result. But it is not recommended for training, as it is implemented with random sampling (see line 131 of `feeder_rgb_fused_ntu.py`).

This is my `feeder_rgb_fused.py` file. If I comment out the path to the processed ROI data, I get the same result. Is this evidence that I am not importing the ROI data properly? [screenshot]

bruceyo commented 1 year ago

Can you try changing line 131 of `feeder_rgb_fused_ntu.py` to `if not self.evaluation:`?
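The suggested change gates the random sampling on a training/evaluation flag, so that evaluation uses a deterministic frame selection. A minimal sketch of the idea (hypothetical helper, not the repo's actual feeder code; assumes `num_frames >= window`):

```python
import random

def sample_frame_indices(num_frames, window, evaluation=False):
    """Pick `window` frame indices from a clip of `num_frames` frames.

    Training: a random contiguous window, for data augmentation.
    Evaluation: a fixed uniform stride, so results are reproducible.
    """
    if not evaluation:
        # training branch: random start position each epoch
        start = random.randint(0, max(0, num_frames - window))
        return list(range(start, start + window))
    # evaluation branch: deterministic uniform sampling
    stride = max(1, num_frames // window)
    return list(range(0, num_frames, stride))[:window]
```

With random sampling left on during evaluation, two runs over the same test set can score differently, which would explain the inconsistent accuracy reported above.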

katahiyu commented 1 year ago

Thanks for the reply.

I changed line 131 of `feeder_rgb_fused_ntu.py` to `if not self.evaluation:` and got the following result. Thank you! [screenshot]

bruceyo commented 1 year ago

Great. The slight difference in results might be caused by the calculation of the ST-ROI; the provided ST-ROI is from the conference version.

katahiyu commented 1 year ago

Thank you so much for your help, and for the prompt solution you provided. I would be happy to learn more about your training setup.

2233950316 commented 1 year ago

@katahiyu Hello, I have been reproducing MMNet recently and have some basic questions I would like to ask you. May I get your contact information? My WeChat (vx) ID is meng2233950316. Thank you very much.