Open guoyt35 opened 3 years ago
Hi,
As mentioned in the paper and this GitHub repo, using the original image as the foreground usually cannot lead to satisfying results. You can try training one with this repo using random alpha blending, which should not be hard to implement, or use some other traditional method (e.g., closed-form matting) to obtain the foreground color. I am occupied recently but will work on releasing the foreground code/model when I get time.
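As a rough illustration of the "random alpha blending" idea suggested above: synthesize training pairs by compositing a known foreground over a background with a randomly generated smooth alpha map, so a network can learn to recover the foreground color. This is only a minimal NumPy sketch under my own assumptions (the function name, the coarse-noise alpha, and the 8x8 upsampling factor are all hypothetical, not from the MGMatting repo):

```python
import numpy as np

def random_alpha_blend(fg, bg, rng=None):
    """Composite fg over bg with a randomly generated smooth alpha map.

    fg, bg: float arrays in [0, 1], shape (H, W, 3).
    Returns (composite, alpha) as a synthetic training pair for
    foreground-color prediction. The alpha here is coarse random noise
    upsampled by repetition -- a placeholder for whatever distribution
    the actual training pipeline uses.
    """
    if rng is None:
        rng = np.random.default_rng()
    h, w = fg.shape[:2]
    # Low-resolution random noise, repeated 8x in each direction and
    # cropped, gives a blocky-but-smooth alpha in [0, 1].
    coarse = rng.random((h // 8 + 1, w // 8 + 1))
    alpha = np.kron(coarse, np.ones((8, 8)))[:h, :w, None]
    comp = alpha * fg + (1.0 - alpha) * bg
    return comp, alpha[..., 0]
```

The composite and alpha would then serve as the input/supervision pair; the ground-truth foreground is `fg` itself.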
The results look weird to me. Could you please confirm that the pretrained weight you are using is MGMatting-RWP-100k instead of MGMatting-DIM-100k?
I used the MGMatting-RWP-100k pretrained weight to test it again, and it performed a lot better in the inner area. Thanks a lot!!!
I tried MGMatting-RWP-100k and still got weird results like he did :(
Hi, I used `com = alpha * img + (1 - alpha) * [10, 255, 15]` to composite the 'real-world portrait dataset' matting foreground onto a green background, but I found the boundary area is not very soft. I have also used your pre-trained model to get my own portrait dataset's matting result, and I found that the boundary area performed well, but it missed a lot of the inner foreground. My mask was obtained with the u^2 net demo.
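For reference, the compositing formula quoted in this thread can be sketched in NumPy as below. This is just my reading of the formula, not code from the repo; using the original image as `img` here is exactly the "original image as foreground" shortcut that the maintainer says causes unsatisfying (hard, color-bleeding) boundaries, since background colors mixed into soft-edge pixels get carried into the composite:

```python
import numpy as np

def composite_on_green(img, alpha, green=(10, 255, 15)):
    """Alpha-composite an RGB image onto a solid green background.

    img:   uint8 array, shape (H, W, 3).
    alpha: float array in [0, 1], shape (H, W).
    Implements com = alpha * img + (1 - alpha) * green per channel.
    """
    bg = np.asarray(green, dtype=np.float32)
    a = alpha[..., None].astype(np.float32)  # (H, W, 1) for broadcasting
    com = a * img.astype(np.float32) + (1.0 - a) * bg
    return com.round().astype(np.uint8)
```

Replacing `img` with a properly estimated foreground (e.g., from closed-form matting) in the same call should give a softer boundary.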