eric-schleicher opened 4 years ago
You don't need to copy anything if you just install the SDK with the .msi on Windows or via APT on Ubuntu. The described procedure with the ext folder is just for cases when you can't install the SDK.
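For reference, the Ubuntu install is just the Microsoft package feed plus a couple of packages. A sketch (the repo URL shown is for 18.04 and the package name is versioned, so adjust both to your release and the SDK version you want):

```shell
# Add the Microsoft package repository (Ubuntu 18.04 shown; adjust for your release)
curl -sSL https://packages.microsoft.com/keys/microsoft.asc | sudo apt-key add -
sudo apt-add-repository https://packages.microsoft.com/ubuntu/18.04/prod
sudo apt update

# SDK runtime + headers, plus the tools (viewer, recorder).
# The library package name is versioned -- 1.3 shown here.
sudo apt install libk4a1.3 libk4a1.3-dev k4a-tools
```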
Haven't tried it out myself yet, but I would guess you need to copy the "bin" and "lib" folders from, on Windows for example, "C:\Program Files\Azure Kinect SDK v1.3.0\sdk\windows-desktop\amd64\release" into the "ext\sdk\" folder of your Azure Kinect ROS driver package. The depth engine is already in the "bin" folder. Additionally, copy the "include" folder from "C:\Program Files\Azure Kinect SDK v1.3.0\sdk" into the same "ext\sdk\" folder.
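To make the mapping concrete, here is a sketch of that copy step. I'm using a throwaway temp directory with mock files to stand in for both the SDK install and the driver checkout, so the layout is visible end to end; substitute your real paths and skip the mock setup:

```shell
set -e

# Stand-ins for the real locations -- replace with e.g.
#   SDK_DIR="C:/Program Files/Azure Kinect SDK v1.3.0/sdk"
#   DRIVER_DIR=".../Azure_Kinect_ROS_Driver"
WORK=$(mktemp -d)
SDK_DIR="$WORK/sdk"
DRIVER_DIR="$WORK/Azure_Kinect_ROS_Driver"

# Mock SDK install tree (mirrors the Windows layout described above)
mkdir -p "$SDK_DIR/windows-desktop/amd64/release/bin" \
         "$SDK_DIR/windows-desktop/amd64/release/lib" \
         "$SDK_DIR/include/k4a"
touch "$SDK_DIR/windows-desktop/amd64/release/bin/k4a.dll" \
      "$SDK_DIR/windows-desktop/amd64/release/lib/k4a.lib" \
      "$SDK_DIR/include/k4a/k4a.h"

# The actual copy step: bin, lib, and include all land under ext/sdk/
mkdir -p "$DRIVER_DIR/ext/sdk"
cp -r "$SDK_DIR/windows-desktop/amd64/release/bin" "$DRIVER_DIR/ext/sdk/"
cp -r "$SDK_DIR/windows-desktop/amd64/release/lib" "$DRIVER_DIR/ext/sdk/"
cp -r "$SDK_DIR/include" "$DRIVER_DIR/ext/sdk/"

ls -R "$DRIVER_DIR/ext/sdk"
```

The end result is `ext/sdk/bin`, `ext/sdk/lib`, and `ext/sdk/include` sitting side by side inside the driver package.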
Sorry my original question wasn't clearer (edited): I have to install the SDK manually (I can't use the MSI/binaries). I've done that completely and can run the viewer from build/bin.
Since there isn't a clear mapping between the two folder structures, I'm still not sure what to copy over.
The SDK's licensing agreement refers to REDIST.txt, which describes the files that may be redistributed. We don't want to replicate that list here because the SDK is the source of truth.
I hope that helps.
Thanks for the info, but it didn't.
I was hoping for a step-by-step explanation, since the current documentation doesn't clearly outline whether the whole SDK should be included or just certain files. I'm inferring from your response that once the SDK is built (elsewhere), the files listed in REDIST.txt can be copied into the /ext folder structure... but it's still ambiguous.
I tried to bring in the files as best I understood, but the workspace won't build; among the halting errors are references that I believe are related to the SDK version (1.4 vs. 1.3).
My objective was to run the ROS node on ARM64. Giving up for the time being :( unless we can improve the documentation steps or confirm whether version 1.4 of the SDK and depth engine can be used with the node.
@eric-schleicher I have the ROS node running on arm64 on a Jetson Nano. It requires rebuilding ROS completely from source against the system OpenCV and making a few source changes along the way.
I added this comment on a similar thread. Hopefully this helps.
> "I have the ROS node running on arm64 on a Jetson Nano. It requires rebuilding ROS completely from source against the system OpenCV and making a few source changes along the way."
@tkircher are the steps documented anywhere? I'm at a perfect point to start over on a fresh Nano install.
@eric-schleicher I just added Kinect support to RTAB-Map instead, for the standalone application. I assumed Microsoft just didn't take this issue seriously.
Thanks for the response. That's super interesting, and I have a number of questions!
So two things:
1) I'm encouraged that arm64 happened at all, so I'm hopeful.
2) Regarding adding support to RTAB-Map: is that in @matlabbe's mainline RTAB-Map, or just for you locally?
The source changes were all in support libraries for ROS as well as ROS modules. I didn't make changes to the Azure SDK or ROS node. As I recall, one of the big issues was updating ros-perception to work with OpenCV 4.
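The rebuild itself follows the standard ROS Melodic from-source procedure, something like the sketch below. This is the generic recipe from the Melodic source-install instructions, not the exact commands used on the Nano; the dependency-resolution step is where the system OpenCV gets picked up:

```shell
# Bootstrap tooling for a ROS Melodic source build
sudo apt install python-rosdep python-rosinstall-generator python-wstool build-essential
sudo rosdep init
rosdep update

# Generate and fetch the source workspace (variant is adjustable;
# "perception" pulls in the ros-perception stacks mentioned above)
mkdir -p ~/ros_catkin_ws && cd ~/ros_catkin_ws
rosinstall_generator ros_base perception --rosdistro melodic --deps --tar > melodic.rosinstall
wstool init -j8 src melodic.rosinstall

# Resolve dependencies against system packages -- this is where the
# system OpenCV 4 is used instead of a bundled copy
rosdep install --from-paths src --ignore-src --rosdistro melodic -y

# Build and install the workspace
./src/catkin/bin/catkin_make_isolated --install -DCMAKE_BUILD_TYPE=Release
```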
The Kinect Azure DK support for RTAB-Map is in mainline.
@ooeygui I'm sure there's excellent support on Windows. But there doesn't appear to be support for the Kinect ROS driver on JetPack.
@tkircher That's correct. We currently do not support ARM64 in the Azure Kinect ROS node. Due to our extremely limited resources, we have not prioritized introducing a new platform. Consequently, we do not have a timeframe for supporting ARM64.
@ooeygui I do understand some of the confusion about this, though. Microsoft does officially support JetPack in the SDK, and provides fully functional binaries for that platform. The ROS node can also be built and made to work with some effort. And ROS support on JetPack is generally good. So it's surprising to find out that the Kinect ROS node is not supported.
Regarding the following instructions for building the ROS driver: after reading them, it's still not clear which parts of the AzureKinectSensorSDK need to be copied over, and in which state (built or not).
(edit) I can't use the MSI/binaries and have to build the SDK locally...
https://github.com/microsoft/Azure_Kinect_ROS_Driver/blob/melodic/docs/building.md
says:
At this point I have the AzureKinectSensorSDK built, using the Ninja scripts, in its own folder...
Does this mean that the SDK needs to be copied over in its entirety in an unbuilt state, or just select pieces?
The folder structure outlined in the alternate SDK installation isn't the same as what is present in the AzureKinectSensorSDK project structure, so it's not clear what to do.
I understand the part about copying in the depth engine, but I'm not sure which elements of the SDK to bring over. Any help appreciated.