OpenDroneMap / ODM

A command line toolkit to generate maps, point clouds, 3D models and DEMs from drone, balloon or kite images. 📷
https://opendronemap.org
GNU Affero General Public License v3.0

Image correction with lens correction before processing #538

Closed kikislater closed 5 years ago

kikislater commented 7 years ago

Hi, I made a test with this dataset : https://support.pix4d.com/hc/en-us/articles/204845195-Example-Datasets-Available-for-Download-DJI-Demo

The first test was with the images straight from the downloaded dataset. For the second test, I first applied lens calibration in darktable, which uses the open-source lensfun library. Results are below:

[images: ply_lensfun, ortho_lensfun]

So why not integrate lensfun into the processing pipeline?

pulquero commented 7 years ago

That is exactly what I do - use darktable to preprocess the images before running them through opendronemap.

kikislater commented 7 years ago

Yes, but keep in mind that I use Darktable (and therefore lensfun) to process raw files from the camera. In most cases, dealing with raw files for photogrammetry is tedious and adds processing time. This is a good example of lensfun working on JPEG files, but it is definitely a tool aimed at raw files. Users with cameras like the NEX-5 cannot benefit from lensfun because the correction has already been applied in-camera. Sometimes the JPEG is still distorted (unknown lens or poor manufacturer corrections), photogrammetry users need to undistort it, and lensfun cannot help there. I think it would be more helpful to build a database of corrections, or to help users calibrate their own lenses and create a lensfun profile, for example with hugin.

pulquero commented 7 years ago

You can also get lenses calibrated here: http://lensfun.sourceforge.net/calibration/

pierotofy commented 7 years ago

They have a python lib here: https://github.com/letmaik/lensfunpy
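
For reference, here is a minimal sketch of undistorting a single image with lensfunpy, following its documented usage; the camera/lens names, focal length, aperture, and distance below are placeholders, not values from this dataset:

```python
import cv2
import lensfunpy

# Placeholder make/model strings; in practice these come from the image's
# EXIF tags and must match entries in the lensfun database.
cam_maker, cam_model = 'NIKON CORPORATION', 'NIKON D3S'
lens_maker, lens_model = 'Nikon', 'Nikkor 28mm f/2.8D AF'

db = lensfunpy.Database()
cam = db.find_cameras(cam_maker, cam_model)[0]
lens = db.find_lenses(cam, lens_maker, lens_model)[0]

im = cv2.imread('IMG_0001.JPG')
height, width = im.shape[0], im.shape[1]

mod = lensfunpy.Modifier(lens, cam.crop_factor, width, height)
mod.initialize(28.0, 2.8, 10.0)  # focal length (mm), aperture, subject distance (m)

# Compute the per-pixel remapping that removes the lens distortion, then apply it.
undist_coords = mod.apply_geometry_distortion()
im_undistorted = cv2.remap(im, undist_coords, None, cv2.INTER_LANCZOS4)
cv2.imwrite('IMG_0001_undistorted.JPG', im_undistorted)
```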

@dakotabenjamin doesn't opensfm already do some sort of lens calibration?

smathermather commented 7 years ago

@pierotofy -- for sure OpenSfM does lens calibration, but if you have known parameters, it's always better to calibrate before feeding into SfM. It ensures that:

  1. your noise is minimized and
  2. all your lens parameters are accounted for.

So, the question is -- pull in lensfunpy or build our own? Or do we pull in lensfunpy and then build our own when we have time?

pierotofy commented 7 years ago

As long as the proper exif tags are embedded in the images, OpenSfM already takes care of things: https://github.com/mapillary/OpenSfM/blob/8ed1f366eedb8a6ce23a7182069e791bdf986d0d/opensfm/commands/undistort.py

Maybe it should be a matter of adding the proper exif tags if they are missing. Or am I missing something? Is there some extra step that is not being performed by OpenSfM that should be included in the process?
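
If missing tags are indeed the problem, here is a hedged sketch of patching them in with the piexif library; the make, model, and focal length values below are placeholders for whatever the camera spec sheet says:

```python
import piexif

# Load the existing EXIF data, fill in the missing tags, and write it back.
# The values below are placeholders, not real calibration data.
exif_dict = piexif.load("IMG_0001.JPG")
exif_dict["0th"][piexif.ImageIFD.Make] = b"DJI"
exif_dict["0th"][piexif.ImageIFD.Model] = b"FC200"
exif_dict["Exif"][piexif.ExifIFD.FocalLength] = (500, 100)    # 5.0 mm as a rational
exif_dict["Exif"][piexif.ExifIFD.FocalLengthIn35mmFilm] = 20  # placeholder equivalent
piexif.insert(piexif.dump(exif_dict), "IMG_0001.JPG")
```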

pulquero commented 7 years ago

Afaict these come from config.yaml: https://github.com/mapillary/OpenSfM/blob/8ed1f366eedb8a6ce23a7182069e791bdf986d0d/opensfm/reconstruction.py

But there are a few things to consider: lensfun also handles chromatic aberration; the distortion parameters can be looked up in its database based on the make/model EXIF tags; and the above is only for OpenSfM's benefit, since afaik the original images are still used for texturing and the orthophoto. Maybe we can skip OpenSfM's undistort altogether if we undistort upstream.

pierotofy commented 7 years ago

I can confirm the undistorted images are used for texturing (in turn those are then used for the orthophoto).

pierotofy commented 7 years ago

The reason this particular dataset does not work well out of the box is that OpenSfM's undistort command does not account for the fisheye lens of the Phantom 2.

Original:

[image]

After OpenSfM undistort:

[image]

Notice the building is still far from straight.

I also noticed that we're using an older version of OpenSfM from Oct 2016, which doesn't have all of the latest fisheye undistort changes: https://github.com/mapillary/OpenSfM/commit/62a4db07d975f75be52509a5c5dd9c9803517de0

paulinus commented 7 years ago

OpenSfM estimates the radial distortion from the images themselves when aligning them. This works well only for small amounts of radial distortion. For strong distortions such as the one above, it is better to either provide the radial distortion parameters to OpenSfM or to undistort the images with an external tool beforehand.

It is also possible to use a fisheye model, but the right parameters (focal and radial distortion) are still required.
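
For reference, one hedged way to hand OpenSfM known parameters is to write a camera_models_overrides.json next to the camera_models.json it generates; the sketch below assumes that layout, and the projection type and distortion numbers are placeholders:

```python
import json

# Sketch: after OpenSfM's metadata extraction has produced camera_models.json,
# copy it, override the model and distortion values with the known calibration,
# and save it as camera_models_overrides.json for the next run.
# The path and the numeric values are placeholders.
with open("dataset/camera_models.json") as f:
    cameras = json.load(f)

for cam in cameras.values():
    cam["projection_type"] = "fisheye"  # assumption: a fisheye model fits this lens
    cam["focal"] = 0.85                 # placeholder normalized focal length
    cam["k1"] = -0.05                   # placeholder radial distortion coefficients
    cam["k2"] = 0.01

with open("dataset/camera_models_overrides.json", "w") as f:
    json.dump(cameras, f, indent=4)
```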

Peketronics commented 6 years ago

Hello, I really need to get reliable DSMs from WebODM so I can sell accurate work and be able to contribute to WebODM. I could run into a lot of problems if I charge a customer and the results turn out to be off. Please make a tutorial on how to undistort the images before feeding them to WebODM; my computing knowledge is very limited, and the lensfun workflow seems a bit confusing to me, especially since lensfun is not straightforward and has to be used through another program.

kikislater commented 6 years ago

What you are asking is amusing for open-source software: you make money with it and you want us to guarantee the quality of your product. ^_^ I don't think that's the proper way to ask; furthermore, the developers work on this in their spare time. If you want your problem solved, the good ways are: be patient until someone has time to look at your data, or put up some funding to improve this part...

Your issue is not similar to this one, because the Phantom 2 has heavy barrel distortion compared to the Mavic Pro. Your DJI Mavic Pro is not really suitable for professional work; it has a ton of issues for mapping purposes, and your problem could come down to multiple factors.

Note: I have nothing to do with this project, I'm just a curious user, and I don't use ODM for professional work (for reasons other than reliability: I collect a ton of data when I fly a mission and I need the speed of GPU computing).

Peketronics commented 6 years ago

@kikislater how unfortunate I find your comment. I use open-source software to fly racing drones that easily outperform any DJI drone racing system; that's why I have faith in open-source tools. If you read my comment, I clearly said I want to contribute to WebODM as soon as I make a profit with it. But I need help with the workflow to correct the images before WebODM so I get good results, so that I can at least make a bit of profit and start contributing some money to the project.

smathermather commented 6 years ago

The help we need at this time, @Peketronics, is testing of datasets that aren't working well now in ODM -- preprocess them in Lensfun, then process them in ODM and determine whether that fixes the issues. That helps us point the project in the direction of productive fixes.

LucasMM86 commented 6 years ago

Hello, I get good results undistorting images with https://github.com/dakotabenjamin/CameraCalibration . I found it pretty easy to use. In my case I didn't need the calibration chessboard, because I'm using the MAPIR cameras and took the distortion parameters from https://www.mapir.camera/blogs/guide/90677894-adding-camera-profiles-to-pix4d-to-improve-processing-results . Regards
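
For anyone curious, the same idea can be sketched directly with OpenCV: undistort each image using known intrinsics. The camera matrix and distortion coefficients below are placeholders, not the actual MAPIR profile:

```python
import cv2
import numpy as np

# Placeholder intrinsics and distortion coefficients; substitute the values
# from the published camera profile or your own chessboard calibration.
K = np.array([[2250.0,    0.0, 2000.0],
              [   0.0, 2250.0, 1500.0],
              [   0.0,    0.0,    1.0]])
dist = np.array([-0.30, 0.10, 0.0, 0.0, 0.0])  # k1, k2, p1, p2, k3

img = cv2.imread("IMG_0001.JPG")
h, w = img.shape[:2]
new_K, _ = cv2.getOptimalNewCameraMatrix(K, dist, (w, h), 1, (w, h))
undistorted = cv2.undistort(img, K, dist, None, new_K)
cv2.imwrite("IMG_0001_undistorted.JPG", undistorted)
```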

smathermather commented 5 years ago

Re-titled this as lensfun focuses on RAW images only, and we're typically dealing with JPGs.

smathermather commented 5 years ago

Fixed as of https://github.com/OpenDroneMap/ODM/pull/1015. Please reopen if there are still problems.