yucornetto / MGMatting

This repository includes the official project of Mask Guided (MG) Matting, presented in our paper: Mask Guided Matting via Progressive Refinement Network

for dataset & matting result #15

Open guoyt35 opened 3 years ago

guoyt35 commented 3 years ago

Hi, I use `comp = alpha * img + (1 - alpha) * [10, 255, 15]` to composite the 'real-world portrait dataset' matting foreground onto a green background, but I found that the boundary area is not very soft. (attached images: comp, comp1, comp2)

I have also used your pre-trained model to get matting results on my own portrait dataset. The boundary area performed well, but it missed a lot of the inner foreground. My masks were obtained with the U^2-Net demo. (attached images: 0003, 0005, 0007)
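For reference, here is roughly the compositing step as a minimal NumPy/OpenCV sketch; the file names and array handling are illustrative assumptions, not code from this repo:

```python
import numpy as np
import cv2

# Load the image and its predicted alpha matte (assumed paths).
image = cv2.imread("image.png").astype(np.float32)        # H x W x 3, BGR
alpha = cv2.imread("alpha.png", cv2.IMREAD_GRAYSCALE)     # H x W, values 0-255
alpha = alpha.astype(np.float32)[..., None] / 255.0       # H x W x 1 in [0, 1]

# comp = alpha * img + (1 - alpha) * green
green = np.array([15, 255, 10], dtype=np.float32)         # [10, 255, 15] RGB, reversed for BGR
comp = alpha * image + (1.0 - alpha) * green

cv2.imwrite("comp.png", np.clip(comp, 0, 255).astype(np.uint8))
```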

yucornetto commented 3 years ago

Hi,

  1. As mentioned in the paper and this GitHub repo, using the original image as the foreground usually does not lead to satisfactory results. You can try training a foreground model using this repo with random alpha blending, which should not be hard to implement, or use a traditional method (e.g., closed-form matting) to obtain the foreground color (see the sketch after this list). I have been busy recently, but I will work on releasing the foreground code/model when I get time.

  2. The results look weird to me. Could you please confirm that the pretrained weight you are using is MGMatting-RWP-100k instead of MGMatting-DIM-100k?
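For the traditional route mentioned in point 1, one off-the-shelf option is the multilevel foreground estimation in the pymatting package, which is close in spirit to closed-form foreground estimation; this is only a hedged sketch with assumed paths, not the method from the paper:

```python
import numpy as np
from pymatting import load_image, save_image, estimate_foreground_ml

# Load the input image and its alpha matte as floats in [0, 1] (assumed paths).
image = load_image("image.png", "RGB")        # H x W x 3
alpha = load_image("alpha.png", "GRAY")       # H x W

# Estimate per-pixel foreground colors instead of reusing the raw image,
# then composite onto the green background from the thread.
fg = estimate_foreground_ml(image, alpha)
green = np.array([10, 255, 15]) / 255.0
comp = alpha[..., None] * fg + (1.0 - alpha[..., None]) * green

save_image("comp.png", comp)
```

Compositing the estimated foreground rather than the original image should remove most of the background color bleeding along the soft boundary.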

guoyt35 commented 3 years ago

I used the MGMatting-RWP-100k pretrained weight to test it again; it performed a lot better in the inner area. Thanks a lot!!!

hackkhai commented 2 years ago

I tried MGMatting-RWP-100k and still got weird results like the ones above :(