tub-rip / event_based_optical_flow

The official implementation of "Secrets of Event-based Optical Flow" (ECCV2022 Oral and IEEE T-PAMI 2024)
GNU General Public License v3.0

clarification about implementation of "motion_to_dense_flow" and the objective #33

Open senecobis opened 1 day ago

senecobis commented 1 day ago

Hi @shiba24, thanks for the great work. I have some questions about the implementation of contrast maximization for optical flow. Inside self.objective_scipy(x, events, coarser_motion):

1) I don't understand why you normalize the timestamps twice: once in the main loop with batch[..., 2] -= np.min(batch[..., 2]) and then again inside the optimization.

2) I don't understand why you upscale the optical flow from the motion parameters to obtain dense_flow. Why do you use it as the warp for the actual cost? Couldn't you warp the current event stack directly with the current motion parameters, as implemented in warp.py inside class Warp(object)?

3) You implemented multiple losses but only use the gradient-based one, which makes sense for optical flow. Here you use a loss weight of 1. Which value would you use for multi_focal_normalized_image_variance for the same estimation problem?

shiba24 commented 5 hours ago

Hi @senecobis, thank you for your interest and the questions.

1>

That's a legacy of my codebase; I haven't simplified it because applying the normalization twice does no harm.
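To illustrate (a rough sketch with made-up values, not code from this repo): subtracting the minimum timestamp is idempotent, so doing it once in the batching loop and again inside the objective leaves the values unchanged.

```python
import numpy as np

# Toy event batch, columns: (x, y, t, polarity). Values are made up.
batch = np.array([[5.0, 7.0, 100.0,  1.0],
                  [3.0, 1.0, 100.5, -1.0],
                  [9.0, 4.0, 101.0,  1.0]])

# First normalization (in the main loop): timestamps now start at 0.
batch[..., 2] -= np.min(batch[..., 2])

# Second normalization (inside the objective): the minimum is already 0,
# so this subtraction is a no-op.
batch[..., 2] -= np.min(batch[..., 2])

print(batch[..., 2])  # [0.  0.5 1. ]
```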

2>

I could not understand your question; could you explain a bit more?

3>

I would use the same value, 1. If you check multi_focal_normalized_gradient_magnitude in detail, the value returned is around 4 (when you use forward_iwe, backward_iwe, and middle_iwe), and this is similar to multi_focal_normalized_image_variance. Since the numerical ranges are similar, I think it's fine to use a similar weight for multi_focal_normalized_image_variance.
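As a rough sketch of what I mean by similar numerical ranges (the normalization here is only schematic, not the repo's exact implementation): both measures are accumulated over several IWEs, so their sums end up within the same order of magnitude and a weight of 1 is a sensible starting point for either.

```python
import numpy as np

def normalized_image_variance(iwe):
    # Schematic contrast measure: variance normalized by the mean intensity.
    return np.var(iwe) / (np.mean(iwe) + 1e-9)

def normalized_gradient_magnitude(iwe):
    # Schematic gradient-based measure on the same normalized scale.
    gy, gx = np.gradient(iwe)
    return np.mean(np.sqrt(gx ** 2 + gy ** 2)) / (np.mean(iwe) + 1e-9)

# "Multi-focal" style: sum the measure over several IWEs
# (e.g. forward, backward, and middle warps), faked here with random images.
iwes = [np.random.rand(64, 64) for _ in range(3)]
var_term = sum(normalized_image_variance(i) for i in iwes)
grad_term = sum(normalized_gradient_magnitude(i) for i in iwes)

# Both sums live within the same order of magnitude, so the same
# weight (1.0) is a reasonable default for either term in the objective.
weight = 1.0
cost = weight * var_term
```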

senecobis commented 4 hours ago

Thank you for your reply @shiba24.

2) Basically I don't understand this line: self.motion_to_dense_flow(pyramidal_motion, t_scale) * t_scale

Btw, do you by any chance have contrast maximization for rotation estimation (instead of optical flow), either in this code-base or another one?

shiba24 commented 3 hours ago

do you by any chance have contrast maximization for rotation estimation (instead of optical flow), either in this code-base or another one?

Yes, I do have one, though not included in this public repo.

self.motion_to_dense_flow(pyramidal_motion, t_scale) * t_scale

Please ignore t_scale; it's the time scale, and the reason I have it here is a bit technical. For tile-based flow estimation, the estimated flow is 2*N_tile, not 2*N_pixel. Raw event coordinates (x, y) are in the full pixel resolution (of course), so I need to interpolate (upscale) the 2*N_tile flow to a dense (2*N_pixel) flow. That's why I have interpolate_dense_flow_from_patch_tensor and interpolate_dense_flow_from_patch_numpy. Does that answer your question?
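In case it helps, here is a rough sketch of the upscaling idea (the helper name dense_flow_from_tile_flow is made up for illustration; the actual code is interpolate_dense_flow_from_patch_numpy / interpolate_dense_flow_from_patch_tensor, which may use a different interpolation scheme):

```python
import numpy as np
from scipy.ndimage import zoom

def dense_flow_from_tile_flow(tile_flow, image_shape):
    """Upscale a per-tile flow of shape (2, n_tiles_y, n_tiles_x)
    to a dense per-pixel flow of shape (2, H, W)."""
    h, w = image_shape
    factor_y = h / tile_flow.shape[1]
    factor_x = w / tile_flow.shape[2]
    # Linear interpolation of each flow channel up to full resolution.
    return zoom(tile_flow, (1, factor_y, factor_x), order=1)

# Example: a 4x4 grid of tile-wise (u, v) vectors upscaled to a 180x240 sensor,
# so the flow can be evaluated at raw event coordinates (x, y).
tile_flow = np.random.randn(2, 4, 4)
dense_flow = dense_flow_from_tile_flow(tile_flow, (180, 240))
print(dense_flow.shape)  # (2, 180, 240)
```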