RealSense disparity refinement with CUDA
I made this disparity refinement project for the Intel RealSense D400 series. It is based on C++ and CUDA.
It re-computes the disparity from the IR pair (left & right) of an Intel RealSense D435.
I referred to "SOS: Stereo Matching in O(1) with Slanted Support Windows" and "HITNet: Hierarchical Iterative Tile Refinement Network for Real-time Stereo Matching", both of which are based on tile-level disparity.
The testing environment is listed below.
- Intel RealSense D435 camera ([848 x 480], [1280 x 720])
- OS : Windows 10
- IDE : Visual Studio 2015 Community
- CPU : Intel(R) Core(TM) i7-9700K (3.60GHz)
- GPU : GeForce RTX 2080 Ti
- RAM : 64 GB
Dependencies for testing
The method consists of the following modules; an illustrative sketch of each step follows the list.
- (1) Census transform.
- (2) Disparity to tile conversion.
- (3) Tile disparity refinement using parabola fitting.
- (4) Tile slant estimation using eigenvalue decomposition.
- (5) Per-pixel estimation based on the tile.
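
As an illustration of step (1), below is a minimal sketch of a census transform kernel in CUDA. The 5x5 window, 8-bit grayscale IR input, row-major layout, and the kernel name `censusTransform5x5` are assumptions of this sketch, not necessarily the project's actual code.

```cuda
#include <cstdint>

// Minimal 5x5 census transform over an 8-bit grayscale IR image.
// Each pixel is encoded as a 24-bit mask: one bit per neighbor that is
// darker than the center pixel. Border pixels are left untouched.
__global__ void censusTransform5x5(const uint8_t* img, uint32_t* census,
                                   int width, int height)
{
    const int x = blockIdx.x * blockDim.x + threadIdx.x;
    const int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x < 2 || y < 2 || x >= width - 2 || y >= height - 2) return;

    const uint8_t center = img[y * width + x];
    uint32_t descriptor = 0;

    // Compare every neighbor in the 5x5 window (center excluded) against
    // the center intensity and pack the results into a bit string.
    for (int dy = -2; dy <= 2; ++dy) {
        for (int dx = -2; dx <= 2; ++dx) {
            if (dx == 0 && dy == 0) continue;
            descriptor <<= 1;
            descriptor |= (img[(y + dy) * width + (x + dx)] < center) ? 1u : 0u;
        }
    }
    census[y * width + x] = descriptor;
}
```

Matching costs between the left and right census images can then be taken as the Hamming distance between descriptors, e.g. `__popc(censusL ^ censusR)` on the GPU.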
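
For step (2), the per-pixel disparity delivered by the D435 (or by an initial matcher) can be reduced to one hypothesis per tile. The sketch below averages the valid pixels of each tile; the 4x4 tile size follows HITNet's convention and, like the choice of the mean over the median, is an assumption here.

```cuda
// One thread per tile: average the valid pixel disparities inside a
// TILE x TILE block to get the tile's initial disparity hypothesis.
// TILE = 4 and "zero means invalid" are assumptions of this sketch.
constexpr int TILE = 4;

__global__ void disparityToTile(const float* disp, float* tileDisp,
                                int width, int height,
                                int tilesX, int tilesY)
{
    const int tx = blockIdx.x * blockDim.x + threadIdx.x;
    const int ty = blockIdx.y * blockDim.y + threadIdx.y;
    if (tx >= tilesX || ty >= tilesY) return;

    float sum   = 0.0f;
    int   count = 0;
    for (int dy = 0; dy < TILE; ++dy) {
        for (int dx = 0; dx < TILE; ++dx) {
            const int x = tx * TILE + dx;
            const int y = ty * TILE + dy;
            if (x >= width || y >= height) continue;
            const float d = disp[y * width + x];
            if (d > 0.0f) { sum += d; ++count; }   // skip invalid (zero) pixels
        }
    }
    tileDisp[ty * tilesX + tx] = (count > 0) ? sum / count : 0.0f;
}
```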
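
Step (3) pushes each tile's integer disparity to sub-pixel precision by fitting a parabola through the matching cost at d-1, d, d+1 and moving to its vertex: offset = (C(d-1) - C(d+1)) / (2 (C(d-1) - 2 C(d) + C(d+1))). The helper below only shows that formula; how the three costs are aggregated over the tile (e.g. summed census Hamming costs) is assumed.

```cuda
#include <math.h>

// Sub-pixel refinement by parabola fitting: given the aggregated matching
// costs at disparities d-1, d and d+1, the vertex of the interpolating
// parabola gives the sub-pixel offset.
__host__ __device__ inline float refineParabola(float costM1, float cost0,
                                                float costP1, float d)
{
    const float denom = costM1 - 2.0f * cost0 + costP1;
    if (denom <= 1e-6f) return d;              // flat or degenerate cost curve
    const float offset = 0.5f * (costM1 - costP1) / denom;
    // Clamp to half a pixel so a bad fit cannot move the hypothesis too far.
    return d + fminf(fmaxf(offset, -0.5f), 0.5f);
}
```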
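
Step (4) fits a slanted plane to each tile: gather the (x, y, d) samples of the tile's support region, build their 3x3 covariance matrix, and take the eigenvector of the smallest eigenvalue as the plane normal, from which the disparity gradients follow. The sketch below uses a shifted power iteration to obtain that eigenvector; the solver choice, the support region, and the invalid-pixel handling are assumptions of this sketch, not necessarily of the project.

```cuda
#include <math.h>

// Slanted-plane parameters of one tile: disparity at the tile center and
// the disparity gradients in x and y.
struct TilePlane { float d0, dzdx, dzdy; };

__host__ __device__ inline TilePlane fitTilePlane(const float* disp, int width,
                                                  int x0, int y0, int size)
{
    // --- accumulate mean and covariance of the (x, y, d) samples ----------
    float mx = 0.f, my = 0.f, md = 0.f; int n = 0;
    for (int y = y0; y < y0 + size; ++y)
        for (int x = x0; x < x0 + size; ++x) {
            const float d = disp[y * width + x];
            if (d <= 0.f) continue;                 // skip invalid pixels
            mx += x; my += y; md += d; ++n;
        }
    TilePlane plane = {0.f, 0.f, 0.f};
    if (n < 3) return plane;                        // not enough samples
    mx /= n; my /= n; md /= n;

    float cxx = 0, cxy = 0, cxd = 0, cyy = 0, cyd = 0, cdd = 0;
    for (int y = y0; y < y0 + size; ++y)
        for (int x = x0; x < x0 + size; ++x) {
            const float d = disp[y * width + x];
            if (d <= 0.f) continue;
            const float ax = x - mx, ay = y - my, ad = d - md;
            cxx += ax*ax; cxy += ax*ay; cxd += ax*ad;
            cyy += ay*ay; cyd += ay*ad; cdd += ad*ad;
        }

    // --- smallest eigenvector of C via power iteration on (trace*I - C) ---
    // C is positive semi-definite, so trace(C) bounds its largest eigenvalue
    // and the dominant eigenvector of (trace*I - C) is the plane normal.
    const float t = cxx + cyy + cdd;
    const float m00 = t - cxx, m01 = -cxy, m02 = -cxd;
    const float m11 = t - cyy, m12 = -cyd, m22 = t - cdd;
    float nx = 0.f, ny = 0.f, nz = 1.f;             // start near (0, 0, 1)
    for (int it = 0; it < 32; ++it) {
        const float px = m00*nx + m01*ny + m02*nz;
        const float py = m01*nx + m11*ny + m12*nz;
        const float pz = m02*nx + m12*ny + m22*nz;
        const float len = sqrtf(px*px + py*py + pz*pz);
        if (len < 1e-12f) break;
        nx = px/len; ny = py/len; nz = pz/len;
    }
    if (fabsf(nz) < 1e-6f) return plane;            // degenerate (vertical) plane

    // n . (p - mean) = 0  =>  d(x, y) = md - nx/nz*(x - mx) - ny/nz*(y - my).
    // Re-anchor the plane at the tile center so d0 is the disparity there.
    const float cx = x0 + 0.5f * (size - 1);
    const float cy = y0 + 0.5f * (size - 1);
    plane.dzdx = -nx / nz;
    plane.dzdy = -ny / nz;
    plane.d0   = md + plane.dzdx * (cx - mx) + plane.dzdy * (cy - my);
    return plane;
}
```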
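
Step (5) expands the tile planes back into a dense map by evaluating each pixel against its tile's plane, d(x, y) = d0 + dzdx (x - cx) + dzdy (y - cy) with (cx, cy) the tile center. A full implementation would also select or blend between neighboring tile hypotheses (as HITNet does); the sketch below shows only the plain evaluation and reuses the 4x4 `TILE` constant assumed above.

```cuda
// Expand tile planes back to a dense disparity map: each pixel takes the
// disparity predicted by its own tile's slanted plane, anchored at the
// tile center.  Selection/blending between neighboring tiles is omitted.
__global__ void tileToPixelDisparity(const float* tileD0, const float* tileDx,
                                     const float* tileDy, float* disp,
                                     int width, int height, int tilesX)
{
    const int x = blockIdx.x * blockDim.x + threadIdx.x;
    const int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height) return;

    const int tx = x / TILE;                          // TILE = 4, see above
    const int ty = y / TILE;
    const int t  = ty * tilesX + tx;

    const float cx = tx * TILE + 0.5f * (TILE - 1);   // tile center
    const float cy = ty * TILE + 0.5f * (TILE - 1);

    disp[y * width + x] = tileD0[t] + tileDx[t] * (x - cx)
                                    + tileDy[t] * (y - cy);
}
```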
Tile slant visualization
Disparity refinement before/after
- Before
- After
TODO
- I'm trying to change the build type to CMake.

Please try this method and let me know what you think.