shams3049 closed this issue 3 months ago.
Hi @shams3049,
We are very much looking to enable compatibility between Jetsons and UE5 via Live Link shortly (and, more generally, to unify the "Live Link" process, whether it targets Unity, UE5, or any other app). The way we envision it, the sender would be little more than a modified body tracking sample that pushes whatever data you need to a receiver, and the conversion to the target format would be delegated to the receiver side, depending on the desired target.
It is definitely high on our list of things to tackle, but if you want to get an idea of how to do this yourself, you can take a look at the conversion code for the UE5 Live Link format and at the Unity sender. The main idea is to send the data the way the Unity sender does and to do the conversion on the receiver side.
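As a rough, hypothetical illustration of that idea (a sketch, not our actual sample code), a Jetson-side sender built with the ZED SDK 4.x C++ API could grab bodies and push the id and keypoints over a plain UDP socket; the destination IP, port, and text framing below are placeholders, not the Live Link wire format:

```cpp
#include <sl/Camera.hpp>
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <sstream>
#include <string>

int main() {
    sl::Camera zed;
    if (zed.open() != sl::ERROR_CODE::SUCCESS) return 1;

    // Body tracking needs positional tracking enabled first.
    zed.enablePositionalTracking();
    sl::BodyTrackingParameters bt_params;
    bt_params.enable_tracking = true;
    zed.enableBodyTracking(bt_params);

    // Plain UDP socket towards the receiver (IP and port are placeholders).
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    sockaddr_in dst{};
    dst.sin_family = AF_INET;
    dst.sin_port = htons(20001);
    inet_pton(AF_INET, "192.168.1.50", &dst.sin_addr);

    sl::BodyTrackingRuntimeParameters rt_params;
    sl::Bodies bodies;
    while (true) {
        if (zed.grab() != sl::ERROR_CODE::SUCCESS) continue;
        zed.retrieveBodies(bodies, rt_params);

        // Serialize id + keypoints per body; the receiver converts this
        // into whatever its target (UE5, Unity, ...) expects.
        std::ostringstream msg;
        for (const auto &body : bodies.body_list) {
            msg << body.id;
            for (const auto &kp : body.keypoint)
                msg << ' ' << kp.x << ' ' << kp.y << ' ' << kp.z;
            msg << '\n';
        }
        const std::string payload = msg.str();
        sendto(sock, payload.data(), payload.size(), 0,
               reinterpret_cast<sockaddr *>(&dst), sizeof(dst));
    }
}
```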
In any case, thanks for reaching out; this feedback is important.
Jean-Loup
I am definitely interested in ZED Live Link support on Jetson. We have a number of use cases for 5G-connected cameras for body tracking and have started experimenting with Jetson and ZED 2 cameras. We would like to get to the point of sending body tracking data to our edge compute experiences for real-time usage.
@astaikos316 Maybe you've already tested this, but an option for now is to stream the ZED feed from the Jetson to your main computing unit using something like this sample; the drawback is that you'd have to run the ZED SDK on that unit.
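For reference, a minimal sketch of the Jetson-side sender for that workaround, assuming the ZED SDK 4.x streaming API and placeholder codec/port values:

```cpp
#include <sl/Camera.hpp>

int main() {
    sl::Camera zed;
    if (zed.open() != sl::ERROR_CODE::SUCCESS) return 1;

    // Encode the camera feed and publish it on the network.
    sl::StreamingParameters stream_params;
    stream_params.codec = sl::STREAMING_CODEC::H264;
    stream_params.port = 30000;                 // placeholder port
    if (zed.enableStreaming(stream_params) != sl::ERROR_CODE::SUCCESS) return 1;

    // Keep grabbing so frames keep being streamed.
    while (true)
        zed.grab();
}
```

On the receiving unit, something like `init_params.input.setFromStream("<jetson-ip>", 30000)` should let the SDK open the stream as if the camera were local, which is why the ZED SDK also has to run there.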
@astaikos316 Hi, we are doing exactly what you described; if you like, we can discuss it together and maybe come up with a better solution. We are publicly funded, so all our work can stay public. If this motivates you, please feel free to write me back.
@SLJLMacarit I will take a look. Thanks.
@shams3049 This definitely motivates me. Would be very interested in setting up a discussion where we can go in depth.
@astaikos316 Ready when you are: just add @gmail.com to my username and send your contact details so we can set up a call.
Currently our team is working on limiting the number of people tracked and restricting the area in which they are tracked (see the sketch below). We have already worked out the avatar retargeting on MetaHuman.
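As an illustration of that kind of filtering (a sketch, not our actual code), a post-processing pass with the ZED SDK 4.x C++ API could cap the number of bodies and keep only those inside a camera-space zone before the data is retargeted or sent:

```cpp
#include <sl/Camera.hpp>
#include <algorithm>
#include <vector>

// Keep at most `max_people` bodies, and only those whose camera-space
// position falls inside the given x/z bounds (all values in metres).
std::vector<sl::BodyData> filterBodies(const sl::Bodies &bodies,
                                       size_t max_people,
                                       float x_min, float x_max,
                                       float z_min, float z_max) {
    std::vector<sl::BodyData> kept;
    for (const auto &body : bodies.body_list) {
        const auto &p = body.position;
        if (p.x >= x_min && p.x <= x_max && p.z >= z_min && p.z <= z_max)
            kept.push_back(body);
    }
    // If more people than allowed are in the zone, keep the ones closest
    // to the camera (smallest distance from the origin).
    std::sort(kept.begin(), kept.end(),
              [](const sl::BodyData &a, const sl::BodyData &b) {
                  auto d2 = [](const sl::float3 &p) {
                      return p.x * p.x + p.y * p.y + p.z * p.z;
                  };
                  return d2(a.position) < d2(b.position);
              });
    if (kept.size() > max_people)
        kept.resize(max_people);
    return kept;
}
```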
Our goal is to build enough confidence to actually use it as a ZED Hub application (but that's a discussion for another day).
Hi,
Now that the new Live Link version no longer requires the UE engine source to compile, the sender part (not the UE editor) should be launchable from a Jetson module.
Therefore, I'm closing this issue.
Preliminary Checks
Proposal
We would like ZED Live Link to run on a Jetson device (specifically the Jetson Orin).
Use-Case
Distributed architecture: ZED camera skeleton tracking could run on a compact, high-performance subsystem :) which would allow a range of end-user configurations.
Anything else?
If the Live Link sender is built so that it can be extended with further logic, such as additional processing of the tracking data so that only refined, needed data is transmitted, that would be quite helpful for expanding on it (a sketch of such a hook follows below).
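For example, one way such an extension point could look is a small processing hook that the sender runs over the retrieved bodies before transmitting anything; the names below are purely illustrative and not part of the current zed-livelink code:

```cpp
#include <sl/Camera.hpp>
#include <algorithm>
#include <functional>
#include <vector>

// A stage mutates the list of bodies in place; only what is left afterwards
// gets serialized and transmitted by the sender.
using BodyFilter = std::function<void(std::vector<sl::BodyData> &)>;

struct SenderPipeline {
    std::vector<BodyFilter> stages;

    std::vector<sl::BodyData> process(const sl::Bodies &raw) const {
        std::vector<sl::BodyData> bodies(raw.body_list.begin(), raw.body_list.end());
        for (const auto &stage : stages)
            stage(bodies);
        return bodies;
    }
};

int main() {
    SenderPipeline pipeline;

    // Example stage: drop low-confidence detections so only refined data
    // goes over the network (the threshold is an arbitrary placeholder).
    pipeline.stages.push_back([](std::vector<sl::BodyData> &bodies) {
        bodies.erase(std::remove_if(bodies.begin(), bodies.end(),
                                    [](const sl::BodyData &b) {
                                        return b.confidence < 50.f;
                                    }),
                     bodies.end());
    });

    // In the real sender loop this would look like:
    //   zed.retrieveBodies(raw_bodies, runtime_params);
    //   auto refined = pipeline.process(raw_bodies);
    //   sendToReceiver(refined);
    return 0;
}
```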