IntelRealSense / hand_tracking_samples

:wave: :ok_hand: research codebase for depth-based hand pose estimation using dynamics based tracking and CNNs
https://realsense.intel.com/
Apache License 2.0

Rift and DirectX #3

Closed hinmanj closed 7 years ago

hinmanj commented 7 years ago

Can you provide an Oculus version of the VR sample - preferably using Direct3D instead of OpenGL?

melax commented 7 years ago

I've provided some sample code that should get you started with this HMD. At this time, it's only been tested with Visual Studio 2015, the June version of LibOVR, and a Win32 (32-bit) Release build. Consequently, this sample, ovr-hand-tracker, currently resides only in the libovr branch of this repo; switch to that branch to find the project.

The developer will need to do some additional work to download the SDK. You can drop a copy into the ./third_party/ subfolder, or edit the header file to point to a different folder if you want to reference a copy in another location.
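Assuming the SDK unzips to a LibOVR folder (this layout is illustrative, not prescribed by the repo), the drop-in placement would look something like:

```text
hand_tracking_samples/
└── third_party/
    └── LibOVR/          <- copy of the Oculus SDK's LibOVR folder
        ├── Include/     <- OVR_CAPI.h and friends
        └── Lib/         <- prebuilt static libs per platform/toolset
```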

Importantly, there may be a mismatch between the compilation settings used to build LibOVR and those used by this project and librealsense. Because everything is statically linked, you may hit linker errors unless some settings are changed. There are instructions in readme.md, the .cpp file, and dx_ovr.h. One easy fix is to open the Oculus samples solution in Visual Studio 2015 and change the LibOVR project settings to use Multi-threaded DLL (/MD).
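The same change can also be made directly in LibOVR's project file. A sketch of the relevant MSBuild fragment (the `RuntimeLibrary` property and `MultiThreadedDLL` value are standard vcxproj settings; the surrounding configuration condition is illustrative):

```xml
<!-- LibOVR.vcxproj, Win32 Release configuration (illustrative context) -->
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Release|Win32'">
  <ClCompile>
    <!-- Match the /MD runtime used by this project and librealsense -->
    <RuntimeLibrary>MultiThreadedDLL</RuntimeLibrary>
  </ClCompile>
</ItemDefinitionGroup>
```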

[screenshot: libovr_settings]

Hopefully this will get things up and running on the VR headset.


Enjoy.

hinmanj commented 7 years ago

That helps a lot, thanks for the quick response!

melax commented 7 years ago

No problem. Wasn't much effort since we already had been doing tests with this sort of use case.

jgabes commented 7 years ago

This is awesome! Can you add the ability to record annotations in VR too?

melax commented 7 years ago

Admittedly, it can sometimes be hard to judge depth accurately and ensure a correct fit when annotating datasets while looking at only one 3D rendered perspective in an OpenGL window. Whereas in the VR sample it's very obvious where and how the hand is posed in 3D and how it lines up with the incoming depth stream. Stand by...

melax commented 7 years ago

OK, I've added, in a separate project, the ability to use a VR headset while capturing ground-truth datasets. See the project ovr-annotator. Again, this has only been tested in a Win32 Release build, and the developer must download LibOVR separately.

A nice feature is that the head can now be used to move the camera around to get some more background variety. This is much better than having to hold and move the depth camera with the other hand, or rotating a tripod.
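The core of this kind of head-driven camera motion is just applying the tracked head pose (orientation + position) to points seen by the head-mounted depth camera. A minimal self-contained sketch of that transform is below; the `Quat`/`Vec3` types and `head_to_world` helper are illustrative names, not the repo's actual API (in the real sample the pose would come from LibOVR's tracking state):

```cpp
#include <cmath>

struct Quat { float x, y, z, w; };   // unit quaternion (orientation)
struct Vec3 { float x, y, z; };      // position / point

// Rotate a point by a unit quaternion: p' = q * p * q^-1,
// using the common t = 2*cross(q.xyz, p) shortcut.
inline Vec3 rotate(const Quat& q, const Vec3& p) {
    Vec3 t{2*(q.y*p.z - q.z*p.y),
           2*(q.z*p.x - q.x*p.z),
           2*(q.x*p.y - q.y*p.x)};
    return {p.x + q.w*t.x + (q.y*t.z - q.z*t.y),
            p.y + q.w*t.y + (q.z*t.x - q.x*t.z),
            p.z + q.w*t.z + (q.x*t.y - q.y*t.x)};
}

// Transform a depth point from head-local space into world space
// using the tracked head pose (orientation + position).
inline Vec3 head_to_world(const Quat& orient, const Vec3& head_pos, const Vec3& p) {
    Vec3 r = rotate(orient, p);
    return {r.x + head_pos.x, r.y + head_pos.y, r.z + head_pos.z};
}
```

So as the annotator looks around, the same hand-in-front-of-camera data lands at different world positions against different backgrounds, which is what gives the extra background variety.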

[screenshot: ovr_annotator]

Enjoy