dongwu92 / AutoPortraitMatting

TensorFlow implementation of Automatic Portrait Matting, based on the paper "Automatic Portrait Segmentation for Image Stylization"
Apache License 2.0

What are the .mat files in data/images_tracker folder? #25

Open michaelhuang74 opened 7 years ago

michaelhuang74 commented 7 years ago

For each image in the data/images folder, there is a corresponding .mat file in the data/images_tracker folder. I have two questions.

(1) What is the purpose of these .mat files? (2) For a new image, how do I generate the corresponding .mat file?

Many thanks.

MattKleinsmith commented 6 years ago

In the data readme:

Structures of the Files

The code includes four folders:

- Caffe-portraitseg: implementations of our Caffe.
- data: scripts for downloading data.
- data/images_mask: all our labeled portrait masks.
- data/images_tracker: face tracker points of all portraits.
- data/trainlist.mat: the image IDs of the training set.
- data/testlist.mat: the image IDs of the test set.
- data/alldata_urls.txt: the URLs of all images on Flickr.
- data/crop.txt: the cropping parameters of each downloaded portrait.
- training: scripts and models for training.
- training/FCN8s_models: the initial models from FCN.
- training/model_files: network and solver files for our models.
- testing: scripts and models for testing.
- testing/our_models: our pre-trained models.

MattKleinsmith commented 6 years ago

In short: to use the technique from this paper, you need a particular face detector, and the authors did not include the face detector code in the repository.


These tracker points are used to create the shape channel and the position channels of the input to the network. To obtain tracker points for images outside of the dataset, you'll need to use the face detector the authors used. The authors link to a paper that likely introduced that particular face detector. If I find an implementation of the face detector, I'll link to it here.

See get_warped_xy_mmask.m in the testing folder for how the tracker points are used. Note that when the testing code (demo_portraitFCNplus.m) calls the get_warped_xy_mmask function, it passes tracker points loaded from a .mat file. The authors generated, stored, and provided tracker points for every image in the dataset, but did not provide the code that generates them.
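
If it helps, here is a minimal Python sketch (using scipy) for inspecting one of the provided tracker files. The example file name and the expectation of 49 (x, y) rows are assumptions based on the discussion in this thread, so check the printed keys and shape against your copy of the data.

```python
# Minimal sketch: inspect one of the provided tracker .mat files.
# The file name is an example and the variable name inside the .mat is not
# documented here, so we just print the keys and look at the stored array.
import scipy.io as sio
import numpy as np

mat = sio.loadmat('data/images_tracker/00001.mat')   # example file name
keys = [k for k in mat if not k.startswith('__')]
print(keys)                                          # the stored variable name(s)

points = np.asarray(mat[keys[0]])
print(points.shape, points[:3])                      # expect 49 rows of (x, y)
```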

KESR007 commented 6 years ago

I don't know how to generate the corresponding .mat files, but I still want to use the paper's method to process my own images. Do you have any feasible idea? Thanks a lot!

super-ruilei commented 6 years ago
[Screenshot: the tracker points from one of the provided .mat files, plotted on a portrait]

In the picture, we plot the tracker .mat file provided by the authors. The most widely used face landmark tool, dlib, provides 68 landmarks; the authors use 49 instead. You can remove the 17 landmarks on the face contour and 2 landmarks on the mouth.
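
Below is a hedged Python sketch of that conversion with dlib's standard 68-landmark predictor (this is not the authors' original tracker). Which two mouth points to drop, and whether the resulting ordering matches the provided tracker files exactly, is an assumption; dropping the two inner mouth corners (dlib indices 60 and 64) is a common choice in 49-point layouts.

```python
# Hedged sketch: derive a 49-point layout from dlib's 68-landmark predictor by
# dropping the 17 jaw-contour points and two mouth points. Dropping the inner
# mouth corners (dlib indices 60 and 64) is an assumption; verify the count and
# ordering against the provided data/images_tracker files.
import dlib
import numpy as np

detector = dlib.get_frontal_face_detector()
# Standard 68-landmark model, downloaded separately from the dlib site.
predictor = dlib.shape_predictor('shape_predictor_68_face_landmarks.dat')

def landmarks_49(image):
    """Return a (49, 2) array of (x, y) landmarks, or None if no face is found.

    `image` is an RGB numpy array, e.g. from dlib.load_rgb_image(path).
    """
    faces = detector(image, 1)            # upsample once to help with small faces
    if len(faces) == 0:
        return None
    shape = predictor(image, faces[0])
    pts68 = np.array([[p.x, p.y] for p in shape.parts()], dtype=np.float64)
    # Keep points 17..67 (51 points) minus the two inner mouth corners -> 49 points.
    keep = [i for i in range(17, 68) if i not in (60, 64)]
    return pts68[keep]
```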

super-ruilei commented 6 years ago

Or you can use dlib to generate your own trackers for the training and test images. Then you need to modify get_warped_xy_mmask.m a little bit.
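
For example, something like the following could write the detected points to a .mat file alongside the provided ones. The variable name 'tracker' and the (49, 2) [x, y] layout are assumptions, so match them to whatever get_warped_xy_mmask.m and the files in data/images_tracker actually use.

```python
# Sketch: save 49 detected landmarks in a .mat file for the MATLAB testing code.
# The variable name 'tracker' and the (49, 2) [x, y] layout are assumptions.
import scipy.io as sio

def save_tracker(points49, out_path):
    sio.savemat(out_path, {'tracker': points49})

# e.g. save_tracker(landmarks_49(dlib.load_rgb_image('my_photo.jpg')),
#                   'data/images_tracker/my_photo.mat')
```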