wba.py currently relies on the assumption that, in a full video, a background pixel will not have a ball over it the majority of the time. Thus, you can use the mode of each pixel's color over the length of the video to recover the background. However, this does not work if the balls do not move enough over the course of the video. The current workaround is to use the experimental background flag to supply a background JPEG image.
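
For reference, here is a minimal sketch of the mode-of-pixel-color idea described above. It is not wba.py's actual implementation; it assumes OpenCV, NumPy, and SciPy are available, and the function name, sampling parameters, and file paths are placeholders.

```python
import cv2
import numpy as np
from scipy import stats


def estimate_background(video_path, sample_every=10, max_samples=200):
    """Estimate a static background as the per-pixel mode over sampled frames."""
    cap = cv2.VideoCapture(video_path)
    frames = []
    index = 0
    while len(frames) < max_samples:
        ok, frame = cap.read()
        if not ok:
            break
        if index % sample_every == 0:
            frames.append(frame)          # BGR uint8 frame, shape (H, W, 3)
        index += 1
    cap.release()

    stack = np.stack(frames, axis=0)      # shape (N, H, W, 3)
    # Per-pixel, per-channel mode over the sampled frames: a ball only "wins"
    # a pixel if it covers that pixel in most of the samples, which is exactly
    # the assumption that breaks down when the balls barely move.
    result = stats.mode(stack, axis=0)
    return np.asarray(result.mode, dtype=np.uint8).reshape(stack.shape[1:])


if __name__ == "__main__":
    background = estimate_background("video.mp4")   # placeholder path
    cv2.imwrite("background.jpg", background)
```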
Can you document the background flag option in the openCV README? Then I think we can close this issue for now, since that seems like the cleanest workaround.