microsoft / AirSim

Open source simulator for autonomous vehicles built on Unreal Engine / Unity, from Microsoft AI & Research
https://microsoft.github.io/AirSim/

Q: Omnidirectional Camera / FOV #1769

Open mxf8bv opened 5 years ago

mxf8bv commented 5 years ago

Hello,

I need to simulate an omnidirectional camera in AirSim, which should also be of interest to other people. I use a pre-cooked scenario under Linux. In the past (e.g., in Blender), I used either

  1. a mirror surface or
  2. "sliced rendering": Small horizontal FOV images stitched together (e.g., 36*10 deg).

The exact geometry of the omnidirectional projection is not important to me, but a view from the same position with different yaw angles should yield almost identical images after shifting them (or, similarly, result in very similar Fourier amplitudes along pixel rows).

I tried approach 2 in Airsim and found that

Questions:

  a. Can AirSim simulate non-flat reflective surfaces?
  b. Can I switch off camera aberrations from the API or settings file?
  c. Is there a way to fix the camera FOV settings?
  d. Is there a better way to get omnidirectional imagery from AirSim?

Thanks for any help! Mathias

madratman commented 5 years ago

I am not sure which is the best way, but this video highlights two options for capturing 360 images in Unreal Engine, the scene capture cube and the NVIDIA Ansel plugin: https://www.youtube.com/watch?v=dbNUPCHxFNg Another option is shown here: https://www.youtube.com/watch?v=hewxvHJ6GCE

mxf8bv commented 5 years ago

Thanks for the suggestions. The capture cube idea looks promising; however, I need to generate thousands of views, and I have no idea how to do that from the editor instead of using AirSim directly. IMO, scripting via the API in AirSim seems to be the only way that will work for me.
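
For illustration, a rough sketch of the "sliced rendering" idea scripted through the AirSim Python API might look like the following (a ComputerVision-mode client is assumed; the camera name "0", the pose, and the per-slice FOV, which would be configured via CaptureSettings in settings.json, are placeholders to adapt):

```python
import numpy as np
import airsim  # AirSim Python client (pip install airsim)

client = airsim.VehicleClient()
client.confirmConnection()

num_slices = 36                         # e.g. 36 slices of ~10 degrees each
camera_pos = airsim.Vector3r(0, 0, -2)  # placeholder position
slices = []
for i in range(num_slices):
    yaw = 2.0 * np.pi * i / num_slices
    pose = airsim.Pose(camera_pos, airsim.to_quaternion(0, 0, yaw))  # pitch, roll, yaw
    client.simSetVehiclePose(pose, True)
    resp = client.simGetImages([
        airsim.ImageRequest("0", airsim.ImageType.Scene, False, False)])[0]
    img = np.frombuffer(resp.image_data_uint8, dtype=np.uint8)
    channels = len(img) // (resp.height * resp.width)  # 3 or 4 depending on AirSim version
    slices.append(img.reshape(resp.height, resp.width, channels))

# Naive panorama: concatenate the narrow slices side by side.
panorama = np.concatenate(slices, axis=1)
```

A real panorama would of course need the per-slice FOV to match the yaw step, plus some blending at the seams.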

madratman commented 5 years ago

@mxf8bv The AirSim camera API is just a wrapper around the Unreal camera. I am first trying to figure out what the appropriate Unreal camera is for your need. Another thing worth trying is the stereo panoramic capture tool here: https://docs.unrealengine.com/en-US/Platforms/VR/StereoPanoramicCapture/QuickStart They have a movie capture plugin which lets you create videos, aka a stream of images.

abhimanyuchadha96 commented 5 years ago

> I am not sure which is the best way - but this video highlights two options to capture 360 images in unreal engine - scene capture cube and the nvidia ansel plugin - https://www.youtube.com/watch?v=dbNUPCHxFNg Another option is here - https://www.youtube.com/watch?v=hewxvHJ6GCE

I am trying to simulate a 360 camera view in the Neighborhood environment, but as mentioned in other threads, the environment isn't open source. Looking for suggestions or a workaround.

csuwp commented 4 years ago

Have you found a solution to this?

botkevin commented 3 years ago

My solution to this problem is just to capture images in the ±x, ±y, and ±z directions and stitch them into a cubemap, which I then transform into the desired equirectangular image. This method avoids the vignette/stitching inaccuracy problem and seems to work fine. However, there are some weird aberrations from the vignetting, which I do my best to avoid by capturing a larger FOV and cropping down.
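
For reference, a minimal sketch of such a cubemap-to-equirectangular warp (an illustration, not the code referenced below; the face naming and orientation conventions are assumptions and must match how the six views were captured):

```python
import numpy as np

def cubemap_to_equirect(faces, out_h, out_w):
    """faces: dict of square HxWx3 uint8 arrays keyed by 'front','back','left','right','up','down'."""
    # Spherical direction (x right, y up, z forward) for every output pixel.
    lon = (np.arange(out_w) / out_w - 0.5) * 2.0 * np.pi   # longitude in [-pi, pi)
    lat = (0.5 - np.arange(out_h) / out_h) * np.pi         # latitude in (pi/2, -pi/2]
    lon, lat = np.meshgrid(lon, lat)
    x = np.cos(lat) * np.sin(lon)
    y = np.sin(lat)
    z = np.cos(lat) * np.cos(lon)
    out = np.zeros((out_h, out_w, 3), dtype=np.uint8)
    abs_max = np.max(np.stack([np.abs(x), np.abs(y), np.abs(z)]), axis=0)
    # (face, dominant axis, sign of that axis, horizontal coord, vertical coord)
    table = [('front',  z,  1,  x, -y), ('back',  z, -1, -x, -y),
             ('right',  x,  1, -z, -y), ('left',  x, -1,  z, -y),
             ('up',     y,  1,  x,  z), ('down',  y, -1,  x, -z)]
    for name, axis, sign, u, v in table:
        face = faces[name]
        h, w = face.shape[:2]
        # Pixels whose dominant direction points at this face.
        mask = (np.abs(axis) >= abs_max - 1e-9) & (np.sign(axis) == sign)
        # Project onto the face plane and map [-1, 1] to pixel indices (nearest neighbour).
        uu = ((u[mask] / np.abs(axis[mask]) + 1.0) * 0.5 * (w - 1)).round().astype(int)
        vv = ((v[mask] / np.abs(axis[mask]) + 1.0) * 0.5 * (h - 1)).round().astype(int)
        out[mask] = face[vv, uu]
    return out
```

Nearest-neighbour sampling keeps the sketch short; bilinear interpolation (e.g. cv2.remap) would give smoother seams.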

Here are my results in the Blocks environment. The color shift is just because of BGR vs RGB.

Let me know if you want the code. I can link it...

saihv commented 3 years ago

@botkevin This looks cool! It'd be great if you can open a pull request with this feature.

RinR0000 commented 3 years ago

@botkevin

That's great! I would love to learn from you!

yugitw commented 3 years ago

@botkevin That's very helpful. Could you post a link for the code?

botkevin commented 3 years ago

The code uses someone else's code for the equirectangular-to-perspective transformations, which is quite slow; it can probably be improved a lot. Currently the code is buried in another project, so let me extract it and make a repository. Sorry for the delay, I am a little busy at this time. For now I will just detail the steps/pseudocode in case you want to replicate it before I have the chance to post the link.

This code is specifically for the car object, but I see no reason why the results would not be reproducible for a drone or your respective vehicle.

  1. Open CarPawn.cpp (or whatever your vehicle's .cpp code is).
  2. Add a base where you want the 360 camera to be.
  3. Add 6 cameras, each pointing in one of the six directions.
  4. Compile AirSim again.
  5. Collect images (a rough sketch of this step follows the list).
  6. Crop down to 90 FOV (capture a larger FOV, ~110, and crop, for the reasons detailed in my previous post).
  7. Create a cubemap.
  8. Convert the cubemap to equirectangular by warping.
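
A rough sketch of the image-collection step (the camera names below are hypothetical and must match whatever names the six cameras are registered under in the pawn code):

```python
import numpy as np
import airsim

client = airsim.CarClient()
client.confirmConnection()

# Hypothetical camera names; adapt to the names used in the modified pawn code.
face_names = ["front", "back", "left", "right", "up", "down"]
requests = [airsim.ImageRequest(name, airsim.ImageType.Scene, False, False)
            for name in face_names]
responses = client.simGetImages(requests)

faces = {}
for name, resp in zip(face_names, responses):
    img = np.frombuffer(resp.image_data_uint8, dtype=np.uint8)
    channels = len(img) // (resp.height * resp.width)  # 3 or 4 depending on version
    faces[name] = img.reshape(resp.height, resp.width, channels)
    # Crop here if you captured a wider FOV than 90 degrees.

# 'faces' can then be warped to an equirectangular image
# (e.g. with the cubemap_to_equirect sketch earlier in this thread).
```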

If you want to wait, I will probably do this by next week and submit a pull request/have code by then.

yugitw commented 3 years ago

For those who want a quick hack, I modified the ComputerVisionPawn.cpp to capture 6 images that can form a cubemap. It is easy to find a tool online that allows you to convert a cubemap to an equirectangular image.

Usage:

  1. Simply replace the CVPawn.h and CVPawn.cpp files and recompile the project.
  2. Modify your settings.json to enable CV mode and make sure you enable all 6 cameras.
  3. Open the Unreal editor, find the PostProcessVolume in your project -> Lens -> Image Effects -> check the Vignette Intensity box and set the value to 0. This will allow you to use FOV 90 without worrying about cropping and stitching problems.
  4. Stitch the output cubemap using whatever tool you want.
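
A minimal settings.json along these lines might look like the sketch below; the exact entries are an assumption and depend on how the modified pawn registers its cameras, but the 90-degree FOV itself is set through CaptureSettings:

```json
{
  "SettingsVersion": 1.2,
  "SimMode": "ComputerVision",
  "CameraDefaults": {
    "CaptureSettings": [
      { "ImageType": 0, "Width": 1024, "Height": 1024, "FOV_Degrees": 90 }
    ]
  }
}
```
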
stale[bot] commented 2 years ago

This issue has been automatically marked as stale because it has not had activity from the community in the last year. It will be closed if no further activity occurs within 20 days.