yfeng95 / PRNet

Joint 3D Face Reconstruction and Dense Alignment with Position Map Regression Network (ECCV 2018)
http://openaccess.thecvf.com/content_ECCV_2018/papers/Yao_Feng_Joint_3D_Face_ECCV_2018_paper.pdf
MIT License
4.96k stars 944 forks

Please see here. #74

Open yfeng95 opened 6 years ago

yfeng95 commented 6 years ago

Hi, thanks for stopping by. Please forgive me: I have been so busy these days that I could not resolve issues or reply to your emails. Unfortunately, I will continue to be very busy (applying for a PhD, writing my master's thesis, and many other things) until Jan 2019. This issue gives brief answers to some questions and outlines my plan, on which I would like your suggestions.

My plan: In the beginning, I released PRNet just to share my research work, which focuses only on reconstruction and dense alignment. Other applications, like pose estimation and face swapping, were implemented casually to show the usefulness of the research results. The problem is that the project is not well organized and many parts need optimization (which is why I don't want to add functions on top of the current framework). So I plan to make a BIG UPDATE in Jan 2019. I have just updated another project of mine, face3d, which implements some basic functions for 3D faces. The new version is expected to replace much of the current code with functions from face3d, which should make it faster and more extensible. Furthermore, I am very happy that this project has been useful to you, and I want to add more useful functions. I would appreciate it if you could write down some suggestions below.

After the update, if you still have problems, I will help you solve them. For now, here are brief answers to some questions about:

  1. applications based on PRNet: I strongly recommend looking at the face3d repo, especially the transformation and render parts; then you should be able to build applications on top of PRNet quickly and solve many problems easily. From my perspective, PRNet and face3d contain almost all the core pieces and enough examples that a specific application can be implemented in just a few lines (of course, optimization takes more time). If you have built one and released your code, please send me the link and I will add it to the readme. Thanks in advance for sharing!
  2. expressions: Sorry, PRNet cannot handle expressions for now; this is a limitation of the algorithm. I am working on a new algorithm that solves this.
  3. relationship with 3DMM: Of course we can map the position map back to a 3D morphable model; I will add this function in the new version.
  4. training code: I am sorry that the training code cannot be released, for various reasons. However, I know from emails that some people have re-implemented the training part well and are willing to release it; maybe you can wait for that.
  5. python version: PRNet is currently only tested on Python 2. I will modify the code so it runs on Python 3 and on Windows. You can also run it now with small changes; see #23.
  6. ideas about 3D faces: You are very welcome to discuss new ideas with me via my email, fengyao@sjtu.edu.cn.
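On point 3 above, here is a minimal sketch of the position-map-to-3DMM direction, assuming a generic PCA shape model (shape = mean + basis × coefficients, as in BFM-style models): once the position map's vertices are aligned to the model's coordinate frame, the model coefficients can be recovered by linear least squares. All names and array sizes below are illustrative assumptions, not PRNet's actual interface.

```python
import numpy as np

# Illustrative sizes: n vertices, k shape-basis components.
n, k = 500, 10
rng = np.random.default_rng(0)

# A toy PCA morphable model: S = mean_shape + basis @ coeffs
mean_shape = rng.normal(size=3 * n)    # flattened (x, y, z) per vertex
basis = rng.normal(size=(3 * n, k))    # shape basis (e.g. from a BFM-style model)

# Pretend these vertices were sampled from a predicted position map,
# already rigidly aligned to the model's coordinate frame.
true_coeffs = rng.normal(size=k)
vertices = mean_shape + basis @ true_coeffs

# Recover the 3DMM coefficients by linear least squares.
coeffs, *_ = np.linalg.lstsq(basis, vertices - mean_shape, rcond=None)
# With noise-free, aligned data, `coeffs` matches `true_coeffs`.
```

In practice one would first estimate a rigid transform (pose) between the position map and the model, and regularize the coefficients (e.g. by the PCA eigenvalues) so noisy vertices do not produce implausible shapes.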

Thanks for your understanding!

Best, Yao Feng

RaineyWu commented 6 years ago

Hope you come back soon.

taylorlu commented 5 years ago

Dear YadiraF, I have a C++ implementation that can run on iOS (only for pose estimation based on dense landmarks, and for face swap): https://github.com/taylorlu/FaceConverter The app reaches 3-5 fps for face swap on an iPhone SE. But it seems the render part takes more time when the face area is bigger; is that because of the Poisson blending? Also, the quality of the blended face is not adjustable; how can we enhance the face quality when using a high-resolution face picture? The last question: the 3D landmarks in video tend to shake frame by frame. Is there any method to keep them stable, such as landmark tracking? Hope you come back.
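On the jitter question above: a lightweight option, independent of PRNet or FaceConverter, is to temporally smooth the per-frame landmarks, e.g. with an exponential moving average (the one-euro filter is a popular adaptive variant of the same idea). A minimal sketch; the `alpha` value and array layout are illustrative choices:

```python
import numpy as np

class LandmarkSmoother:
    """Exponential moving average over per-frame landmark arrays."""

    def __init__(self, alpha=0.4):
        self.alpha = alpha   # closer to 0 = smoother but laggier; 1 = no smoothing
        self.state = None

    def __call__(self, landmarks):
        pts = np.asarray(landmarks, dtype=np.float64)
        if self.state is None:
            self.state = pts             # first frame: nothing to blend with
        else:
            self.state = self.alpha * pts + (1 - self.alpha) * self.state
        return self.state

# Simulate a static landmark jittered by per-frame noise.
rng = np.random.default_rng(1)
noisy_frames = [np.array([[10.0, 20.0]]) + rng.normal(size=(1, 2))
                for _ in range(30)]

smoother = LandmarkSmoother(alpha=0.4)
smoothed = [smoother(f) for f in noisy_frames]
# Frame-to-frame jitter of `smoothed` is lower than that of the raw frames.
```

The trade-off is latency: heavier smoothing lags fast head motion, which is why adaptive filters vary the cutoff with landmark velocity.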

lxzkeyan commented 5 years ago

Hi YadiraF: Will the training source code be opened? Could you please send your PRNet training code to my email? 1261775243@qq.com

Thank you very much!