senguptaumd / Background-Matting

Background Matting: The World is Your Green Screen
https://grail.cs.washington.edu/projects/background-matting/

Why does the hand-held sample video result not look good? #53

Open daodaoawaker opened 4 years ago

daodaoawaker commented 4 years ago

First of all, great work!! I want to reproduce the results on the sample videos by following the procedure in the repo, but I cannot get the same effect as yours. I'm confused... I found that the problem is that the _back.png obtained by running test_pre_process_video.py looks incorrect, like the following pictures in sample_video/input.

This is one of the frames extracted from teaser.mov: 0001_img

This is the corresponding background generated by running test_pre_process_video.py: 0001_back

Obviously something is wrong. As the README.md says, "If there are significant exposure changes between the captured image and the captured background, use bias-gain adjustment to account for that." Should I turn on the bias-gain adjustment part of test_pre_process_video.py? CAPTURE_202065_160347

Is that correct? Thanks very much!!

senguptaumd commented 4 years ago

There is alternate Matlab code that seems to be more robust to misalignment and exposure changes. You can try that if you have access to Matlab. I think the Python code for bias-gain adjustment has a bug, which is why I turned it off. The Python code was developed later, shortly before publishing the code; internally I was using the Matlab code for alignment and exposure adjustment.
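
For anyone hitting the same issue, here is a minimal sketch of what a per-channel bias-gain (linear exposure) correction could look like in Python. This is not the repo's actual implementation, only an illustration of the idea: the function name, the optional foreground mask, and the file names in the commented usage are assumptions.

```python
import cv2          # only needed for the commented usage example below
import numpy as np


def bias_gain_adjust(bg, img, fg_mask=None):
    """Correct bg by a per-channel gain/bias so it matches img's exposure.

    bg, img : HxWx3 uint8 images (captured background and captured frame).
    fg_mask : optional HxW bool array marking the subject; those pixels are
              excluded from the fit so the person does not skew the estimate.
    """
    bg_f = bg.astype(np.float64)
    img_f = img.astype(np.float64)

    # Use only non-subject pixels for the fit.
    valid = np.ones(bg.shape[:2], dtype=bool) if fg_mask is None else ~fg_mask

    adjusted = np.empty_like(bg_f)
    for c in range(3):
        x = bg_f[..., c][valid].ravel()
        y = img_f[..., c][valid].ravel()
        # Least-squares fit of y ≈ gain * x + bias for this channel.
        A = np.stack([x, np.ones_like(x)], axis=1)
        (gain, bias), *_ = np.linalg.lstsq(A, y, rcond=None)
        adjusted[..., c] = gain * bg_f[..., c] + bias

    return np.clip(adjusted, 0, 255).astype(np.uint8)


# Hypothetical usage with the sample-video frames:
# bg  = cv2.imread('sample_video/input/0001_back.png')
# img = cv2.imread('sample_video/input/0001_img.png')
# bg_adj = bias_gain_adjust(bg, img)
```

A mask of the subject (e.g. from the segmentation step) matters in practice, because fitting the gain and bias on pixels covered by the person would bias the exposure estimate.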