Justin900429 / Facial-Graph-Representation-Learning

PyTorch version for the "Micro-expression Recognition Based on Facial Graph Representation Learning and Facial Action Unit Fusion"
MIT License
58 stars 2 forks

I ran into some problems running your code. Could you please help me? #5

Closed qufy6 closed 1 year ago

qufy6 commented 1 year ago

First, I used `pip install` to build a new venv. Some of the pinned versions no longer exist, so I installed the latest ones instead. Then I ran train.py, but something seems to be wrong: `train.py", line 184, in parser.add_argument("--npz_file", ...... argparse.ArgumentError: argument --npz_file: conflicting option string: --npz_file`

I don't know how to fix it. Is it a version issue?
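(For anyone hitting the same traceback: this error is not version-related. It is what argparse raises when the same option string is registered twice. A minimal, repo-independent reproduction:)

```python
import argparse

# Registering the same option string twice raises argparse.ArgumentError.
# Removing one of the two duplicate add_argument("--npz_file", ...) calls
# in train.py fixes the error quoted above.
parser = argparse.ArgumentParser()
parser.add_argument("--npz_file", type=str)
try:
    parser.add_argument("--npz_file", type=str)  # duplicate registration
except argparse.ArgumentError as e:
    print(e)  # argument --npz_file: conflicting option string: --npz_file
```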

Justin900429 commented 1 year ago

The npz file can be made with compute_adj.py. The adjacency matrix generated by that script is required for the graph convolution.

qufy6 commented 1 year ago

Thanks for answering my question. I found that there are two `parser.add_argument("--npz_file", ...)` calls; when I comment out the first one, the code runs. I don't know whether that is the correct fix... Also, could you tell me how to run the whole project? Should I just enter `python train.py`, or are other parameters needed?

Justin900429 commented 1 year ago

Hi, one of the npz arguments has been removed. You should run `python train.py` with additional parameters. Some of them are required: `--csv_path`, `--image_root`, `--npz_file`, `--catego`. You can find their descriptions in the README.
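(For reference, a full invocation with the four required flags looks roughly like the following; the paths and category name are hypothetical placeholders, not values from this repo:)

```shell
# Placeholder paths -- substitute your own dataset locations.
python train.py \
    --csv_path data/casme.csv \
    --image_root images/casme \
    --npz_file weight/adj.npz \
    --catego CASME
```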

qufy6 commented 1 year ago

Yeah, but unfortunately I've read it and still don't know the correct process from scratch (QAQ)... Where can I get `--csv_path` and `--image_root`? Do I need to download the dataset first for `--catego`? And what is the npz file? Lastly, should I enter something like `python train.py csv_path=**/**/** image_root=**/**/** npz_file=**/**/** catego=**/**/**`? Thanks very much.

Justin900429 commented 1 year ago
  1. The csv files can be converted from the original .xlsx files. Note that the column names need to be changed properly; you can find the expected column names in dataset.py.
  2. The image root is where your MER images are located. For example, if your images are under images/casme, then you should use images/casme as the image root.
  3. Instead of `*=*`, you should use `--csv_path <param>`. The others follow the same pattern.
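(Step 1 above can be sketched with pandas. The file name and column names below are hypothetical stand-ins; the real column names are defined in dataset.py:)

```python
import pandas as pd

# The dataset labels ship as .xlsx; train.py expects a csv whose column
# names match those used in dataset.py. Names below are placeholders.
# df = pd.read_excel("CASME2-coding.xlsx")        # real starting point
df = pd.DataFrame({"Subject": ["s01"], "Estimated Emotion": ["happiness"]})
df = df.rename(columns={"Estimated Emotion": "label"})  # match dataset.py
df.to_csv("casme.csv", index=False)               # pass this path as --csv_path
print(list(df.columns))
```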
qufy6 commented 1 year ago

Hi, thanks for replying. I've scrutinized the paper and the code these days, but I still have some critical questions.

  1. I ran compute_adj.py first and got the .npz files. But when I tried to run train.py, I found that dataset.py line 105, `self.magnet.load_state_dict(torch.load("weight/magnet.pt", map_location=device))`, needs a .pt file that I couldn't find. Is this a mistake, or do I need to obtain the .pt file first? (If so, could you please tell me how?)
  2. In the README you cite an official code; what is it?
Justin900429 commented 1 year ago
  1. Refer to here.
  2. Just click the link; it will take you to the official implementation. This repo is just a reimplementation.
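(Until the pretrained checkpoint from point 1 is downloaded, train.py fails inside dataset.py with a file-not-found error. A small hedged guard can fail earlier with a clearer message; the helper name below is hypothetical, the path matches the error in the question:)

```python
import os
import torch

def load_magnet_weights(weight_path="weight/magnet.pt", device="cpu"):
    """Fail early with a clear message when the pretrained checkpoint is missing."""
    if not os.path.exists(weight_path):
        raise FileNotFoundError(
            f"{weight_path} not found; download the pretrained MagNet weights first"
        )
    return torch.load(weight_path, map_location=device)
```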
qufy6 commented 1 year ago

Hi, thanks a lot! I've run your code successfully. But another question: the accuracy is about 37%. I checked the model and found that the predicted labels are all 3. Furthermore, when I lower the learning rate, the labels become almost all 0 or some other label. It's strange that almost all the labels change together. Also, the loss fluctuates instead of going down. I tried changing the loss function and the optimizer, but that does not work... So I wonder whether there is something wrong in the backward pass? And do we need more layers in some part of the model, such as the GCN or the transformer encoder? Or, if you know where the mistakes or differences from the article are, please tell me. I appreciate it a lot!
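(The collapse described above, where every sample gets the same label, is easy to confirm by histogramming the argmax of the logits over a batch. Shapes here are hypothetical, a batch of 64 with 5 emotion classes; a collapsed model puts nearly the whole batch in one bin:)

```python
import torch

# Random logits stand in for a batch of model outputs. With a collapsed
# classifier, torch.bincount(preds) concentrates the count in one class.
logits = torch.randn(64, 5)           # (batch, num_classes), placeholders
preds = logits.argmax(dim=1)          # predicted label per sample
counts = torch.bincount(preds, minlength=5)
print(counts)                         # per-class prediction counts
```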

Justin900429 commented 1 year ago

Hi, this issue is also mentioned in https://github.com/Justin900429/Facial-Graph-Representation-Learning/issues/4. I still haven't solved this problem. The setup is the same as the author's. Maybe the adjacency matrix is the problem, but I haven't explored it further.

qufy6 commented 1 year ago

Hi. I have tried to find the mistake, but failed in the end. However, I plan to keep looking. Anyway, I appreciate your help with my questions!!!