robotology / yarp

YARP - Yet Another Robot Platform
http://www.yarp.it

openni2 module not taking noMirror parameter into account #683

Closed jgvictores closed 8 years ago

jgvictores commented 8 years ago
kt10aan commented 8 years ago

Thanks for the catch!!

I fixed it in my YARP fork (care had to be taken for mirroring to be set on/off only when playback is off): https://github.com/kt10aan/yarp/commit/234b42f644b1c8d7c3d2676e9dccedfc9b0ef8de

I will merge it with robotology/yarp after I have finished testing it today :)

kt10aan commented 8 years ago

@barbalberto, you commented on a commit in kt10aan/yarp that was later squashed. In any case, @barbalberto pointed out that setting mirroring on is no guarantee that mirroring will work, giving as an example that RGB-mode mirroring is always on for the Asus Xtion.

Actually, setting mirroring on/off for the RGB mode is a bit pointless, since it's a, well, RGB camera! The point of the mirroring option is for the depth stream to be mirrored.

As for whether the sensors support mirroring for the depthStream: Kinect supports it, and the Asus Xtion series supports it as well. So I do not think there is any concern about handling the case where mirroring is not supported.

As with the other options, if the user sets an unsupported hardware option, the driver proceeds to open as usual but prints an informative message about the requested feature being turned off.

Come to think of it, I should add a message for turning mirroring on too!

barbalberto commented 8 years ago

I do believe it matters, because the device will stream both RGB and depth images, but they will be mirror images of each other and therefore cannot be used together, for example to generate a point cloud or to match the colour of each pixel.

My worry about having a single warning message if the hardware does not support mirroring is that, a lot of the time, executables are run remotely through GUIs and the terminal output is not visible. If an automatic system has to rely on the stream, like a navigation algorithm, it must be able to trust that the stream it receives is configured the way it was requested.

Imagine your navigation algorithm reads images from this device, but they are mirrored: it will turn in the wrong direction.

kt10aan commented 8 years ago

While I agree with what may happen in the test cases you mention, I am assuming that everyone who uses the driver will go through the documentation first. The audience for YARP is researchers, not consumers, so certain assumptions are made about the technical knowledge of the user. Having said that, your test case makes it obvious that the documentation has to be clearer about what the mirroring option changes and what it does not (i.e., only the depth stream and not the RGB stream). My point is that the test case of generating a point cloud, or the navigation test case, should be addressed in the documentation and not in the driver's behaviour.

Regarding the driver having an option for the RGB stream to be de-mirrored (if that is a valid term!), while it would not be difficult to add, I think such conversions are the responsibility of the user's module. The philosophy behind the driver is exactly that of a driver: exposing the functionality of, and providing an interface to, what is available on the physical device, and just streaming the data (or replaying it from an *.oni file) in a YARP-friendly format. This is more in line with the modular paradigm. Citing your example, an automatic system, like a navigation algorithm, has to take into account what data it receives, so the configuration has to be done on the "client" side and not on the driver side. Otherwise, if every possible use case is taken into account, the driver won't be a simple driver any more; it will grow into something bloated.

An analogy could be made with, e.g., the opencvgrabber device: the driver mainly streams or plays a video in YARP image format; it doesn't do any image processing. That is the responsibility of the user's module that accepts the output of the opencvgrabber, depending on the needs of the user.

P.S. All devices I know of support mirroring of the depth stream. However, you are correct that it has to be confirmed that the mirroring option was correctly applied, so the informative message will take into account the actual state of the depthStream, not just the parameter passed (i.e., it will print on or off based not on the value of the parameter, but on the value of depthStream.getMirroringEnabled(), checking that the mirroring option was correctly applied).

gsaponaro commented 8 years ago

I updated the documentation on http://wiki.icub.org/wiki/OpenNI2 to clarify that the mirroring option affects only the depth stream, not the RGB one. This behaviour, as counter-intuitive as it may be, comes from a lower level than the yarp driver, which, being just a driver, currently exposes only the sensor hardware & openni2 functionality with some yarp wrappers.

I verified that the --noMirror option in @kt10aan's fork works with Linux & Asus Xtion Pro, for me it's fine to merge it upstream.

barbalberto commented 8 years ago

:+1:

jgvictores commented 8 years ago

@kt10aan I tried your code at https://github.com/kt10aan/yarp/commit/234b42f644b1c8d7c3d2676e9dccedfc9b0ef8de on the Xtion PRO LIVE sensor I use and it actually worked for both depth AND rgb. We usually update the firmware on the sensor, not sure if that's related.

Based on your code and @barbalberto's concerns, I forked and branched off the latest YARP master into https://github.com/jgvictores/yarp/commit/31a964cdaebd6b6cb907f9e66e3b3d30e767dfa9, including "yError"s but not exiting (that would be way too draconian) for each of the two activations. While it is not in line with the "printf"s of the rest of the code (and it's my first time using YARP's logging capabilities), I believe it will help expose a potential error even when running remotely.

I am attaching some screenshots of the fork/branch working (wrote HI with keyboards :smiley:, tested at different resolutions), and will now issue a PR.

[screenshots: openni2-default, openni2-nomirror]

kt10aan commented 8 years ago

@jgvictores, thanks for testing that! Yes, capabilities between Xtion sensors can vary a lot (that is what originally led to the printVideoModes option).

Based on your updated information, I think the best course of action is to have two different parameters, one for setting depth mirroring and one for setting RGB mirroring. This will also give the greatest flexibility to the user. I will also include some testing of the mirroring capabilities of the sensor.

Regarding the YARP logging, I would prefer to keep things uniform, i.e., either translate all printfs to YARP log or use only YARP log. Well, the best option would of course be to convert everything to YARP log.

I will do the necessary changes and test them with my xtions.

jgvictores commented 8 years ago

@kt10aan Ok that sounds great, thanks!

kt10aan commented 8 years ago

By the way, totally irrelevant: are you using a tiling window manager with your Ubuntu, just the Compiz quarter-sized windows, or did you rearrange them manually? :)

jgvictores commented 8 years ago

Ctrl-Alt-TopLeft (numeric pad 7), Ctrl-Alt-TopRight (numeric pad 9), Ctrl-Alt-Down (numeric pad 2) on a non-tuned (Compiz, I guess) Ubuntu 14.04.3 LTS. I have played with window managers, but I stick to the standard stuff because I work on too many machines... :smile:

kt10aan commented 8 years ago

:D I will stick with my i3 wm setup :D

kt10aan commented 8 years ago

I have pushed a commit to https://github.com/kt10aan/yarp/commit/b748d75149ca539e0d4a937a144dc7822090c451

This commit:

Again, mirroring is not changed for playback of .oni files. However, any existing scripts would have to be modified according to the new behaviour.

The next step would be to replace the couts with yLog, which would also address @barbalberto's concerns regarding remote operation of the driver. However, that requires a bit more thinking and planning as to what to change to yLog and what not, so I will do it later in the week.

@jgvictores cc @gsaponaro @barbalberto Could you please test it with your own sensors? Ideally with: a) no depth and no RGB mirroring; b) depth mirroring and no RGB mirroring; c) no depth mirroring and RGB mirroring; and d) both depth and RGB mirroring. Also, could you comment on the new behaviour of the mirroring parameters (OFF by default)?

jgvictores commented 8 years ago

Had to fork off https://github.com/kt10aan/yarp/commit/b748d75149ca539e0d4a937a144dc7822090c451 to get it compiling and working correctly, 4 commits:

The last commit corresponds to https://github.com/jgvictores/yarp/commit/23bebeae2dad17a65705a41d00ae301675d786cb and has been tested and working in the a)-d) scenarios you proposed.

Regarding the mirroring parameters defaulting to OFF, it worries me a bit, because the default sensor behaviour is usually ON. I'd vote for keeping the new parameter names but defaulting to ON, documenting it in the wiki as @gsaponaro did and also around https://github.com/jgvictores/yarp/blob/23bebeae2dad17a65705a41d00ae301675d786cb/src/modules/openni2/OpenNI2DeviceDriverServer.cpp#L132-L142. cc @kt10aan @barbalberto I'd like to know your opinion!

kt10aan commented 8 years ago

Thanks for the corrections @jgvictores, I committed my changes while at a conference :D I will integrate the corrections.

Regarding the default on/off behaviour:

It makes me a bit uncomfortable too, and I would prefer default-on behaviour for both streams. However, what happens in the case where RGB cannot be mirrored, as in @barbalberto's example above? I don't want a user to end up with one stream mirrored and the other not when using the default options.

This is what led me to the default-off behaviour. But as I said, I would like to hear all opinions on the subject. Default on certainly makes more sense to me as well, but let's hear the others.

jgvictores commented 8 years ago

@kt10aan 4am is always a dangerous time for a commit :smile: I believe the sensor is mirrored by default (with my sensor, this actually happens when commenting out the setMirroringEnabled lines), and there are certain devices that cannot be "un-mirrored". Hence the old --noMirror parameter proposed in https://sourceforge.net/p/robotcub/mailman/message/29583795/ regarding the original Kinect. @barbalberto Could you check this behaviour on your sensor before we confirm the new parameter names and defaults?

kt10aan commented 8 years ago

It was not 4am, it was morning Japan time! :) Yeah, we have to investigate a bit before finalizing parameter names and behaviour. Ideally, I would like consistent mirroring behaviour by default for the depth and RGB streams, i.e., both streams exhibit the same behaviour when used with no parameters.

barbalberto commented 8 years ago

I'd prefer the default to be off, but since this results in an unfeasible default for some devices, I think it is better to have the default on. In that case, I'd substitute the check:

if (config.check("rgbMirror", "enable RGB mirroring"))
{
    rgbMirrorON = true;
} else {
    rgbMirrorON = false;
}

with a boolean to turn it off, like:

if (config.check("rgbMirror", "enable RGB mirroring"))
{
    rgbMirror = config.find("rgbMirror").asBool();
} else {
    rgbMirror = true;
}

just to avoid a default configuration that turns out not to be feasible.

barbalberto commented 8 years ago

OK, I see there is good will for improvement and I like it :+1: so here is the 'long answer'. I hope you guys can read it all and provide feedback.

My interest in the OpenNI device came from issue #643, which I'm in charge of working on.

I examined all drivers related to Kinect-like devices (at IIT we need one for a new robot) and I recently implemented the same in the Gazebo simulator (https://github.com/robotology/gazebo-yarp-plugins/issues/100), so we can work in the simulator and then move to the real robot.

What I found is that there is no real 'standard' in YARP for this kind of device, and the situation is even worse for point clouds. So I'd like to bring some order to this.

I found there are 4 Kinect-like YARP drivers, but all of them lack some key features or have some disadvantage. Please think of the following points as ways to improve the YARP software related to these awesome depth-camera devices. My observations are:

A) Drivers in YARP are meant to be split into 2 different parts (modules/classes/objects):

This is what is done for motion control, so we use the same controlBoardWrapper2 over CAN devices, ETH devices, simulators (iCub, Gazebo, Robotran), EtherCAT devices and so on. This allows code to work on different robots by abstracting the hardware level.
All the Kinect-like YARP drivers I saw do the two things (driver and wrapper) together in a single module, and they cannot be separated. This means, for example, there was no 'pure wrapper' I could use with the simulators, so I had to write yet another one.

B) Following from the lack of a unified wrapper, each device opens its own ports and sends data using its own format:

This makes the devices incompatible with each other even though they basically provide the same data, so software written to work with one driver cannot work with another. Using a unified wrapper/interface would make it possible to test the same software with different sensors, or to switch the sensor, for example from Kinect to Xtion, effortlessly.

C) Each device is a bit customized in the way it works. For example, both OpenNI2 and kinect-wrapper have a port for the skeleton, but their definitions of a skeleton are not the same, while the other 2 devices do not provide a skeleton at all. This again works against code reuse. Following the comment from @kt10aan:

The philosophy behind the driver is exactly that, of a driver: exposing the functionality and providing an interface to what is available on the physical device and just streaming the data

to which I agree: since skeletons and point clouds are not data provided by the sensor, but a software elaboration on top, this feature should be extracted and implemented in a separate module. Again, this approach will allow users to test the same skeleton-extraction algorithm (library) on different hardware, or different skeleton-extraction algorithms (libraries) on the same hardware. Those modules could run on a more powerful remote machine. If I have (and I will have) the sensor installed on board the robot, computing skeleton extraction or point-cloud generation on board could be too heavy.

Another advantage is removing compilation dependencies. I'm not able to install NiTE2 on my machine and, at the end of the day, I don't need it either, but I got compilation issues because it is required.

So, to cut a long story short, as discussed internally with @lornat75, @drdanz and others, the idea in the long run is to have an 'infrastructure' similar to the one we currently have for motor control, with:

What has been done so far:

The definition of the interface is tentative, so features can be changed or added. For example, mirroring is not present in the interface right now. We could add it so that software can remotely know how mirroring is set and comply accordingly.

Anyway, right now the client/server just exchange the image streams; the RPC calls are not yet implemented. Probably the first step will be to create a new OpenNI2-compatible device to be used with the RGBDWrapper and then help the migration toward it.

What do you guys think? Since it is a fair amount of work, can you spare some time to help in this process?

I do believe the result will make things more usable and clear, and improve the reuse of code and the sharing of results and modules.

kt10aan commented 8 years ago

@barbalberto, I took the liberty to move your post to a new issue #690 (marked with the Discussion label) and I will post my reply there in order to keep this issue specific about the mirroring options :)

kt10aan commented 8 years ago

@jgvictores , I integrated the changes to kt10aan fork. @barbalberto , @gsaponaro, could you test the four cases? :)

jgvictores commented 8 years ago

@barbalberto I'll be glad to help (especially after IROS deadline next week :smiley:). I'll drop my comments on corresponding issues #643 and #690.

@kt10aan I tested https://github.com/kt10aan/yarp/commit/3ad6c8175882cb28495ca25c37b007ffdc3d15e4 and it works fine, but I had to change https://github.com/kt10aan/yarp/blob/3ad6c8175882cb28495ca25c37b007ffdc3d15e4/src/modules/openni2/OpenNI2DeviceDriverServer.h#L86 "rgbMirronON -> rgbMirrorON" to get it compiling (I usually set up Travis CI on my fork to check that kind of stuff xD).

kt10aan commented 8 years ago

@jgvictores, I seriously have to stop committing code during conferences :D thanks! I will push the fix.

Now, regarding the default behaviour...

I am inclined to vote for OFF, despite it not being my preference, just to be on the safe side.

We can revisit the point when I convert the driver to the RGBD wrapper.

By the way, and since people are being so nice volunteering to help: my experience developing the driver was that the code was not difficult to fix, but it needed a lot of testing with many different devices, so checking out the code before it is integrated into YARP, testing it, and making suggestions are most welcome.

@gsaponaro's help in that respect during the development of the OpenNI2 driver was invaluable.

jgvictores commented 8 years ago

Summing up, we need to decide: default behavior, and error treatment. These are my thoughts:

kt10aan commented 8 years ago

After testing with quite a few different devices, I defaulted the mirroring behaviour to ON, as it seemed to be the most wanted option.

So, now there are two options: --noRGBMirror and --noDepthMirror.

For now, the only error handling is to print a WARNING. However, since I am moving the whole thing to YARP log, I will handle the main error (the RGB and depth channels not having matching mirroring behaviour) there, preferably by exiting the driver.

The PR is created, and I will merge it when Travis finishes.

@barbalberto, could you wait a bit and not yet merge your changes so I can test them today?

gsaponaro commented 8 years ago

Hi @kt10aan, hi all, I just tested the new individual mirroring option with the Asus Xtion Pro here, it works perfectly, thanks. :thumbsup: I also updated the documentation at http://wiki.icub.org/wiki/OpenNI2.

barbalberto commented 8 years ago

I'd like to have all changes (yours and mine) merged by the end of next week if possible. Thanks for the work!

kt10aan commented 8 years ago

I am closing the issue since the mirroring problem has been solved. Also, it gets a bit confusing cross-posting between two issues, so let's move the general RGBD discussion to #690.