IntelRealSense / librealsense

Intel® RealSense™ SDK
https://www.intelrealsense.com/
Apache License 2.0

OSX and Linux builds. Newbie samples/examples? #152

Closed. prussiap closed this issue 8 years ago

prussiap commented 8 years ago
Required Info
Camera Model: R200
Firmware Version:
Operating System & Version: OSX Yosemite and Raspbian Jessie
Kernel Version (Linux Only): Jessie
Build System: VS2013, Xcode, or Makefile

Hey guys. I'm excited to see the opportunity to use this R200 with my Raspberry Pi or my OSX box, but I'm at a total loss for how to:

  1. Build for my respective OSes, OSX or Ubuntu Jessie (albeit ARM for the Raspberry Pi).
  2. Find any sample code after it's built, or documentation for how to incorporate or use this. I'd like to use it in an OSX app and/or Unity, even if it's just a point cloud.
  3. I'm sorry if I missed it in the docs. I'm good with Linux but not so much with C#, Objective-C, and direct Xcode builds. Where is the Xcode build file? I got as far as installing GLFW3 on the Mac.
  4. I'm still waiting for my RasPi box, though I might try a QEMU Docker box to test this. I'm assuming it's just the usual make and make install per the directions, though I might hit ARM-specific issues.

I have a Kinect One and a 360 working fine on my Mac and PC, and working well with point clouds in Unity at a basic level.

I can't seem to find any examples or sample code for integrating on Mac or Linux. I may have missed it, since I'm a newbie. I saw the ROS and Arch bits but don't know how to use those.

A sample of OpenNI2 or other code, just to init the camera, show some available methods, and indicate whether there is any access to the higher-level gesture options, would help. I don't think those are implemented yet in librealsense, as per the README; I'm guessing the basic point clouds and so on are there.

Any help, samples, and examples would be a huge help, and I don't mind writing a tutorial or build documentation for OSX (best platform) and the RasPi platform when I get them working.

We are hoping to use these as the basis of augmented reality apps with Unity but without windows if possible. Thanks, David

ddiakopoulos commented 8 years ago

Hi @prussiap. Wow! So many questions! We don't really have the infrastructure to support "Getting Started"-type tutorials for librealsense yet, nor any off-the-shelf sample code for game engine integration. We do provide Xcode projects for building on OSX and both Makefile and Qt Creator projects for building on Linux. ARM support has been provided on other forks of this repository, if you search the list of forks or dig into closed issues on the tracker. librealsense will not work on the RPi because all of our cameras are USB 3 devices and no RPi board currently has a USB 3 host controller.

The RealSense SDK is the place to get started on Windows for augmented reality apps. I must say -- unfortunately -- that this library might be a lower-level API than you are looking for.

prussiap commented 8 years ago

Hi @ddiakopoulos, thanks for the responses :). Yep, lots of questions, but I couldn't find answers in the issues, the README, or the Intel SDK site, so I thought I'd ask here and see where that got us. As for a "Getting Started" post, I can write something introductory once I figure it out myself, but right now I have zero idea where to begin. Maybe this sucker for Xcode? https://github.com/IntelRealSense/librealsense/tree/master/librealsense.xc/c-tutorial-1-depth.xcodeproj AFTER the Xcode librealsense binaries are made.

This library seems to be linked from the Intel site (but not supported by them), and according to the README it builds on all the OSes as long as you install the dependency packages or files. I saw the Makefile, I think, but could I get a filename for an Xcode starter? I am an Xcode/Xamarin/depth noob here. My guess is these first: https://github.com/IntelRealSense/librealsense/tree/master/librealsense.xc BUT how do I import and build them, and what should I expect to see when done? Either using Xamarin or Xcode as necessary.

I don't mind working my way toward what a tutorial would cover if I can get the installs working. So assume we build this on Linux or OSX and the library is compiled and made (also make installed?). Then what? I realize this is alpha-ish, but I'm stuck there. Even if I have to just get point clouds and write my own code, I need to know what methods are exposed and just some of the basics! Again, sorry for the dumb and ignorant questions here, but this is new to me. The RealSense R200 camera seems to be at a fantastic price point, and if we could get past the Windows-only SDK and drivers, I believe this product would make waves.

We've been making "holograms" with Kinect 2 and Unity, as well as some face-specific stuff with Processing. The other camera I'm looking at is the Structure, whose SDK seems to have OpenNI2 code and some low-level libraries for non-Windows as well. I do need to get this working so we can explore what is available, and whether I have to return the RealSense for lack of software developer community help. It's not this library's fault; it seems that Intel is keen on Windows-only support, and even that seems roundabout, with lots of circular linking :)

In any case, if anybody has sample code you've worked on that can be shared as an example, even the super basic (init camera, load point cloud, etc.), I would be GREATLY appreciative. I hope it will be possible to build something with this library, especially at the $100 price point for a USB-powered depth camera, which is unheard of.

ddiakopoulos commented 8 years ago

Hi @prussiap, this repository already has everything you need! You seem a bit confused about what this library does: librealsense lets you use all current generations of RealSense cameras on Windows, OSX, and Linux with nearly no dependencies outside of libusb (OSX and Linux only) and GLFW (all platforms, for showing GL windows).

On OSX, Xcode will happily open this workspace: https://github.com/IntelRealSense/librealsense/tree/master/librealsense.xc/librealsense.xcworkspace

It contains the reusable library (librealsense proper) and a smattering of example applications. You can use the dropdown menu at the top of the Xcode window to select, for example, the "cpp-pointcloud" application. Hitting build on this will automatically build and link librealsense into the app and immediately show you the camera data on launch!

The entire API is documented in a single header file here: https://github.com/IntelRealSense/librealsense/blob/master/include/librealsense/rs.hpp

After following the short installation instructions for using brew to install the libusb and GLFW dependencies, literally the only required action on your part is to hit build in the OSX project to see the camera data.
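To give a flavor of what rs.hpp exposes, here is a minimal, untested sketch (not one of the bundled samples; the preset and the pixel chosen are just for illustration) that opens the first camera, grabs a depth frame, and deprojects a single pixel into a 3D point:

```cpp
// Minimal sketch of the librealsense C++ API from rs.hpp.
// Assumes a camera is plugged in; error handling kept to a bare minimum.
#include <librealsense/rs.hpp>
#include <cstdint>
#include <cstdio>

int main() try
{
    rs::context ctx;
    if (ctx.get_device_count() == 0) { std::printf("No device detected\n"); return 1; }

    rs::device * dev = ctx.get_device(0);
    dev->enable_stream(rs::stream::depth, rs::preset::best_quality);
    dev->start();

    dev->wait_for_frames();
    const uint16_t * depth_image = (const uint16_t *)dev->get_frame_data(rs::stream::depth);

    // Depth units and camera intrinsics let you turn a pixel + depth into a 3D point.
    const float scale = dev->get_depth_scale();
    const rs::intrinsics intrin = dev->get_stream_intrinsics(rs::stream::depth);

    // Deproject the center pixel by hand using the pinhole model (fx, fy, ppx, ppy).
    int x = intrin.width / 2, y = intrin.height / 2;
    float d = depth_image[y * intrin.width + x] * scale;   // depth in meters
    float px = (x - intrin.ppx) / intrin.fx * d;
    float py = (y - intrin.ppy) / intrin.fy * d;
    std::printf("Center pixel maps to (%.3f, %.3f, %.3f) m\n", px, py, d);

    return 0;
}
catch (const rs::error & e)
{
    std::printf("librealsense error: %s\n", e.what());
    return 1;
}
```

The bundled cpp-pointcloud sample effectively does this for every depth pixel and draws the result in a GLFW window.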

ddiakopoulos commented 8 years ago

Closing issue; will still respond if you are having problems.

kazoo-kmt commented 7 years ago

Hi @ddiakopoulos, I tried to run one of the sample projects, for example "cpp-pointcloud", as you mentioned above. I could build the project in Xcode (8.2.1), but when I ran it, it didn't show anything (e.g. the RGB image display), nor any error messages. Could you let me know how to run the project? Is there anything I can check (e.g. the USB connection, or the installation of libusb and glfw3)?

martinthomas commented 7 years ago

If you installed glfw3 recently from Homebrew, there may be a problem: brew has deprecated glfw3, so it outputs a warning and installs glfw (without the 3) instead. The cpp-pointcloud project wants glfw3 and fails at the linker step unless it is modified to expect glfw, or a symlink is made to point glfw3 at glfw.

kazoo-kmt commented 7 years ago

Thank you. When I ran brew install homebrew/versions/glfw3, the output said the following:

Warning: Use glfw instead of deprecated homebrew/versions/glfw3
Warning: glfw-3.2.1 already installed

So, do I need to run brew install homebrew/versions/glfw3 as well as unlink glfw3?

martinthomas commented 7 years ago

On my machine, after receiving the same message that you have, I cannot build the cpp-pointcloud project because it is looking for glfw3, which is not installed; I get a failure at the link step. If you can build successfully, then your setup is different from mine.

Do you have an up-to-date checkout of librealsense? If you clean and then build the cpp-pointcloud project, do you get a binary? Do you have /usr/local/lib/libglfw3.dylib?

kazoo-kmt commented 7 years ago

Thank you very much.

Do you have an up-to-date checkout of librealsense?

I git cloned librealsense a few days ago.

If you clean and then build the cpp-pointcloud project, do you get a binary?

When I clean and build, the build succeeds and doesn't show any error messages. However, it doesn't show the point cloud display either.

Do you have /usr/local/lib/libglfw3.dylib?

Not libglfw3.dylib, but I do have libglfw.3.2.dylib, libglfw.3.dylib, and libglfw.dylib under /usr/local/lib/.