4-fisheye stitching is still WIP. It also needs extrinsic/intrinsic calibration data for each camera to do surround view. `set_stitch_info` is also necessary for quality tuning.
Hi, how can I generate the extrinsic/intrinsic parameters?
For extrinsic parameters, there are lots of tools; you can try OpenCV calibration or search for MATLAB code. For intrinsic parameters, you can try https://sites.google.com/site/scarabotix/ocamcalib-toolbox . Actually, we are going to support different intrinsic algorithms such as OpenCV's. This feature is still WIP; you can keep watching the patches.
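For what it's worth, here is a minimal C++ sketch of what the intrinsic step could look like with OpenCV's `cv::fisheye` model (my assumptions, not libxcam code: a checkerboard target, ~20 captured views, and hypothetical `viewNN.png` file names; OCamCalib uses its own polynomial model instead):

```cpp
// Minimal sketch of fisheye intrinsic calibration with OpenCV.
// Assumption: a 9x6 checkerboard target with 25 mm squares.
#include <opencv2/calib3d.hpp>
#include <opencv2/imgcodecs.hpp>
#include <opencv2/imgproc.hpp>
#include <vector>

int main () {
    const cv::Size board (9, 6);   // inner corners of the checkerboard
    const float square = 0.025f;   // square size in meters

    // Build the board's 3D corner grid once (Z = 0 plane).
    std::vector<cv::Point3f> board_pts;
    for (int y = 0; y < board.height; ++y)
        for (int x = 0; x < board.width; ++x)
            board_pts.emplace_back (x * square, y * square, 0.f);

    std::vector<std::vector<cv::Point3f>> obj_pts;
    std::vector<std::vector<cv::Point2f>> img_pts;
    cv::Size image_size;
    for (int i = 0; i < 20; ++i) { // hypothetical file names view00.png ...
        cv::Mat img = cv::imread (cv::format ("view%02d.png", i), cv::IMREAD_GRAYSCALE);
        if (img.empty ()) continue;
        image_size = img.size ();
        std::vector<cv::Point2f> corners;
        if (cv::findChessboardCorners (img, board, corners)) {
            cv::cornerSubPix (img, corners, cv::Size (5, 5), cv::Size (-1, -1),
                cv::TermCriteria (cv::TermCriteria::EPS + cv::TermCriteria::COUNT, 30, 0.01));
            img_pts.push_back (corners);
            obj_pts.push_back (board_pts);
        }
    }

    cv::Mat K, D;                      // intrinsic matrix and distortion coeffs
    std::vector<cv::Mat> rvecs, tvecs; // per-view board->camera extrinsics
    cv::fisheye::calibrate (obj_pts, img_pts, image_size, K, D, rvecs, tvecs,
        cv::fisheye::CALIB_RECOMPUTE_EXTRINSIC | cv::fisheye::CALIB_FIX_SKEW);
    return 0;
}
```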
Hi, any news? I see some new commits like the soft stitcher, but no documentation on this :/
Is it possible to stitch my sample hugin_4096x2048 images?
news?
Hi @arpu, we have the 4-camera stitching, which is presently working for car surround view. There are both CL and CPU supports there. I'm not sure this can work for your case. Anyway, you can give it a try later, after our tests next week. @liuyinhangx can give you some instructions after our tests have passed. We are planning to cut a release soon.
Thanks, Wind
Hi @arpu, for the test command line, please refer to: https://github.com/intel/libxcam/wiki/Tests#2-test-image-stitching The 4-camera mode needs calibration parameters; you can try the OpenCV tools and the ocamcalib-toolbox, as @windyuan has mentioned.
Hi @liuyinhangx, could you upload the 4-camera stitching test data (input0.nv12 input1.nv12 input2.nv12 input3.nv12, plus the intrinsic and extrinsic parameters)? Thanks.
@bruce-l, we can't share a client's sample data externally. If you need any support, please mail us privately. Thanks.
@windyuan Thanks.
Hi @windyuan, @liuyinhangx, I wonder how I can get the extrinsic calibration parameters for the 4-camera stitching test. Since the toolboxes usually provide calibration for only 2 cameras (for example, the MATLAB version of the OpenCV toolbox: http://www.vision.caltech.edu/bouguetj/calib_doc/htmls/example5.html), should we calibrate the 4 cameras 2 by 2 and always use one of the 4 cameras as the world coordinate frame? But if we set the front camera as the world, how can we calibrate the rear camera? Thanks a lot!
@GUO-W, that's not how vehicle extrinsic data is obtained. You can search for "surround view calibration" to get more details; here's a YouTube video introduction: https://www.youtube.com/watch?v=_QaM-aArETU The canvas can be replaced with others. You need to find a flat enough area, lay out the canvas, and label the 3D coordinate positions (e.g. mark the ground at the car's center as (0,0,0)). Bring the corners into the MATLAB or OpenCV tools (we haven't tried that before; we don't have the resources/budget to set up this environment) to get each camera's position and pose. Theoretically, this can work. Thanks, Wind
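To make that last step concrete, here is a minimal sketch (my assumption of how the tools would be used, not libxcam code) of recovering one camera's pose with `cv::solvePnP` from the labeled ground markers, after the fisheye image points have been undistorted with the intrinsics; all coordinates below are made-up placeholders:

```cpp
// Minimal sketch: one camera's extrinsics from known ground markers.
// Assumption: image points are already undistorted, so a pinhole model applies.
#include <opencv2/calib3d.hpp>
#include <opencv2/core.hpp>
#include <vector>

int main () {
    // 3D marker positions measured on the ground, in meters,
    // with the car-center ground point as the world origin (0,0,0).
    std::vector<cv::Point3f> world_pts = {
        {2.0f, 1.0f, 0.0f}, {2.0f, -1.0f, 0.0f},
        {4.0f, 1.0f, 0.0f}, {4.0f, -1.0f, 0.0f}
    };
    // The same markers located in this camera's (undistorted) image.
    std::vector<cv::Point2f> image_pts = {
        {812.f, 640.f}, {1210.f, 648.f}, {905.f, 512.f}, {1120.f, 516.f}
    };
    // Placeholder intrinsics from the intrinsic calibration step.
    cv::Mat K = (cv::Mat_<double>(3, 3) <<
        800, 0, 960,
        0, 800, 540,
        0, 0, 1);
    cv::Mat dist = cv::Mat::zeros (5, 1, CV_64F); // already undistorted

    cv::Mat rvec, tvec;
    cv::solvePnP (world_pts, image_pts, K, dist, rvec, tvec);

    // rvec/tvec map world coordinates into this camera's frame;
    // the camera position in world coordinates is -R^T * t.
    cv::Mat R;
    cv::Rodrigues (rvec, R);
    cv::Mat cam_pos = -R.t () * tvec;
    return 0;
}
```

Repeating this per camera, against the same world frame, gives one consistent pose set for all four views.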
@windyuan: I have a short query.
I am getting the intrinsic parameters correctly using OCamCalib. For the extrinsic parameters, I am getting n sets of parameters for the n images used for calibration. As far as I can see, the calibration parser expects only one set of extrinsic parameters for a specific view. May I know where I am going wrong and how to tackle this?
@zongwave, how do I know which image's RRfin should be used for the extrinsic parameters?
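Not an official answer, but one common reading: each RRfin slice is the pose of the calibration pattern in that particular image, so a single extrinsic per view has to come from the one image whose pattern placement in the world (car) frame you actually measured. A hedged sketch of the composition, assuming the chosen RRfin gives pattern-to-camera (R_pc, t_pc) and you know world-to-pattern (R_wp, t_wp):

```cpp
// Hedged sketch: compose one image's pattern pose with the pattern's
// measured placement in the world (car) frame to get a single extrinsic.
// Assumption: RRfin of the chosen image ~ pattern->camera (R_pc, t_pc);
// (R_wp, t_wp) is world->pattern, measured when laying the pattern out.
#include <opencv2/core.hpp>

int main () {
    cv::Matx33d R_pc = cv::Matx33d::eye (); // placeholder, from chosen RRfin
    cv::Vec3d   t_pc (0.0, 0.0, 1.5);       // placeholder, from chosen RRfin
    cv::Matx33d R_wp = cv::Matx33d::eye (); // placeholder, measured pattern pose
    cv::Vec3d   t_wp (2.0, 0.0, 0.0);       // placeholder, measured pattern pose

    // world->camera = (pattern->camera) composed with (world->pattern)
    cv::Matx33d R_wc = R_pc * R_wp;
    cv::Vec3d   t_wc = R_pc * t_wp + t_pc;
    (void) R_wc; (void) t_wc; // the one extrinsic set the parser expects
    return 0;
}
```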
Hello, how can I stitch 4 fisheye images with a Hugin view settings file? The images are in an OpenCL buffer (UMat). I've attached the Hugin image sample and .pto settings file:
hugin_4096x2048.zip