hysts / pl_gaze_estimation

MIT License

eth-xgaze training and implementation on demo #2

Closed jasony93 closed 2 years ago

jasony93 commented 2 years ago

I appreciate your great work. I trained eth-xgaze and used the resulting weight file for visualization with your pytorch_mpiigaze_demo repository. It seems to run, but the gaze estimation model does not give satisfactory results; the output looks essentially random. Based on the training log, the training process is in good shape, reaching about 1–2 degrees of angular error. I was wondering if you have any idea what might have gone wrong.

I think the only change I made was copying your PyTorch Lightning model from the training code. Your pytorch_mpiigaze_demo does not use PyTorch Lightning, but your training process uses a PyTorch Lightning model.

jasony93 commented 2 years ago

The pretrained eth-xgaze model works perfectly fine, by the way.

hysts commented 2 years ago

Hi, @jasony93 Hmm, I have no idea. You trained the same model as the pretrained one and only replaced the weight file, right? If the demo app ran without errors, the conversion of the weight file would seem to have been done correctly, so I'm not sure what went wrong.

Is it possible to share the weight file you trained? I'm not sure I can find the cause even with it, but it may help.

jasony93 commented 2 years ago

Thanks for the quick response. Below is the link for downloading the weight file. https://drive.google.com/drive/folders/1dgiQXDl_KtsFJqZm3XIhNbkwNbMenenY?usp=sharing

hysts commented 2 years ago

Thanks, but I don't have access to the file. Could you make a shareable link with "Anyone with the link" access?

jasony93 commented 2 years ago

I apologize. Here is the shareable link. https://drive.google.com/drive/folders/1dgiQXDl_KtsFJqZm3XIhNbkwNbMenenY?usp=sharing

hysts commented 2 years ago

No problem, and thanks for the new link. I'll look into it.

hysts commented 2 years ago

@jasony93 Your model seems to be working fine. I think the problem lies in the model conversion process. You can use the following snippet to convert your .ckpt file to a .pth file that can be used with the pytorch_mpiigaze_demo repo.

import torch

# Load the PyTorch Lightning checkpoint on the CPU.
ckpt = torch.load('epoch=0014.ckpt', map_location='cpu')
state_dict = ckpt['state_dict']

# Lightning prefixes every parameter name with 'model.'; strip the
# first 6 characters so the keys match the plain PyTorch model.
for key in list(state_dict.keys()):
    state_dict[key[6:]] = state_dict.pop(key)

# Save in the format expected by pytorch_mpiigaze_demo.
torch.save({'model': state_dict}, 'model.pth')
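The key-renaming step above only manipulates dictionary keys, so it can be sanity-checked without a real checkpoint. A minimal sketch, using dummy values in place of tensors (the parameter names below are illustrative, not from the actual model):

```python
# Mimic a Lightning state dict, where every key carries a 'model.' prefix.
state_dict = {'model.conv1.weight': 0, 'model.fc.bias': 1}

# Apply the same 6-character prefix strip as in the conversion snippet.
for key in list(state_dict.keys()):
    state_dict[key[6:]] = state_dict.pop(key)

# The keys now match what a plain PyTorch model's load_state_dict expects.
print(sorted(state_dict))  # → ['conv1.weight', 'fc.bias']
```

If the demo still produced random output after conversion, checking the converted keys this way would quickly reveal a prefix mismatch.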