fiblan / webcam-virtual-background

Enjoying web conference with virtual backgrounds on linux

Background compatibility #5

Open ianni67 opened 4 years ago

ianni67 commented 4 years ago

If the mask does not fit your head well, try choosing a background picture that matches the amount of light in your real room. In particular, if the room is well lit and your real background is light, look for a light picture; if your real background is a bit darker, pick a darker picture.
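
A quick way to check whether a candidate picture is in the right ballpark is to compare its mean brightness with a frame grabbed from the webcam. The helper below is only an illustration (the 0.15 threshold and the background.jpg file name are made up, and none of this is part of fake.py):

import cv2
import numpy as np

def mean_brightness(img):
    # Mean pixel value in [0, 1], averaged over all pixels and channels.
    return float(np.mean(img)) / 255.0

# Grab one frame from the default webcam and compare it with the picture.
cap = cv2.VideoCapture(0)
_, frame = cap.read()
cap.release()
background = cv2.imread("background.jpg")

if abs(mean_brightness(frame) - mean_brightness(background)) > 0.15:
    print("background is much brighter/darker than the room, consider another picture")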

ianni67 commented 4 years ago

I investigated the background compatibility a bit more. The problem is that the brightness of the "real frame" (let's call it the foreground) is often very different from the brightness of the background. As a consequence, the border between foreground and background becomes very visible and may even flicker a bit. Adjusting the histograms of two images to make them "similar" is neither easy nor straightforward; some suggestions can be found on this MATLAB page. A naive but simpler tentative solution is to adjust the brightness of both the background and the foreground so that they "meet" halfway, i.e. we adjust both so that they fit each other. This does not adjust the whole histograms, only the overall brightness. Here is a snippet for the get_frame function in fake.py that implements a sort of weighted brightness adjustment:

def get_frame(cap, background, sf):
    # Uses the module-level globals from fake.py for mask caching.
    global rem_mask
    global rem
    _, frame = cap.read()
    rows, cols, _ = background.shape

    # Overall brightness of each image (sum over all three channels,
    # normalized by 255 * pixels). Only the ratios matter below, so the
    # exact normalization constant cancels out.
    b_brightness = np.sum(background) / (255 * cols * rows)
    brightness = np.sum(frame) / (255 * cols * rows)

    # Weighted "meeting point": keep most of the foreground brightness and
    # pull the background most of the way towards it.
    target_brightness = (b_brightness + 8 * brightness) / 9
    ratio_f = brightness / target_brightness
    ratio_b = b_brightness / target_brightness

    # Scale both images towards the target brightness.
    frame = cv2.convertScaleAbs(frame, alpha=1 / ratio_f, beta=0)
    background = cv2.convertScaleAbs(background, alpha=1 / ratio_b, beta=0)

    # fetch the mask with retries (the app needs to warmup and we're lazy)
    # e v e n t u a l l y c o n s i s t e n t
    mask = None
    if rem_mask is None:
        rem = 0
        while mask is None:
            try:
                mask = get_mask(frame, sf)
                rem_mask = mask
            except Exception:
                print("mask request failed, retrying")
    else:
        # Reuse the cached mask for one extra frame before refreshing it.
        mask = rem_mask
        rem += 1
        if rem > 1:
            rem_mask = None

    # composite the background
    for c in range(frame.shape[2]):
        frame[:, :, c] = frame[:, :, c] * mask + background[:, :, c] * (1 - mask)

    return frame

Note the calls to convertScaleAbs. Please keep in mind that this is just a proof of concept and that a proper histogram adjustment should be done differently (again, see the MATLAB page linked above for some examples).
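
For a fuller histogram adjustment along the lines of that MATLAB page, something like scikit-image's match_histograms could be used to remap the background's per-channel histogram onto the frame's. A minimal sketch, assuming scikit-image (>= 0.19, for the channel_axis argument) is installed; this is not wired into fake.py, just an idea of what a proper adjustment might look like:

import numpy as np
from skimage.exposure import match_histograms

def match_background_to_frame(background, frame):
    # Remap the background's per-channel histogram onto the frame's.
    # Both inputs are expected to be uint8 BGR images of the same size.
    matched = match_histograms(background, frame, channel_axis=-1)
    # match_histograms returns floats, so clip and convert back to uint8.
    return np.clip(matched, 0, 255).astype(np.uint8)

Doing this on every frame would be heavier than the convertScaleAbs trick, so in practice it might be enough to match the background once against a reference frame at startup.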