google-ai-edge / mediapipe

Cross-platform, customizable ML solutions for live and streaming media.
https://mediapipe.dev
Apache License 2.0

Unity Support #36

Closed nadir500 closed 5 years ago

nadir500 commented 5 years ago

Can we see a Unity Engine port of this? It would be so great if we could use it inside the engine.

mgyong commented 5 years ago

@nadir500 We have gotten several requests to support MediaPipe in the Unity engine. Can you share your use cases with us? We would love to work with some teams on contributions that bring MediaPipe into Unity. @mcclanahoochie @chuoling

LogonDev commented 5 years ago

Some potential use cases:

  • VR hand tracking (similar to devices like Leap Motion). Some VR headsets, such as the HTC Vive Pro and Valve Index, already have RGB cameras built in
  • AR hand tracking
  • Object detection/segmentation using a cell phone camera, HoloLens camera, or PC camera

MahmoudAshraf-CIS commented 5 years ago

Actually, there is an SDK from Viveport that supports hand tracking in Unity, for both Android and the HTC Vive. I didn't try it on Android, but it worked on the Vive just fine.

The SDK is still in early access: https://developer.viveport.com/documents/sdk/en/vivehandtracking_index.html

I don't know the difference between the two implementations, but I guess it would be great if there were some sort of cooperation between the two teams to boost the process.

And for some use cases, check the videos here: https://developer.vive.com/resources/knowledgebase/vive-hand-tracking-sdk/

mgyong commented 5 years ago

+other folks

On Wed, Aug 21, 2019 at 6:34 PM Logon13 notifications@github.com wrote:

Some potential use cases:

  • VR Hand tracking (similar to devices like leap motion). Some VR headsets such as the HTC Vive Pro and Valve Index already have RGB cameras built in
  • AR hand tracking
  • Object detection/segmentation using cell phone camera/hololens camera/pc camera


RafaelSPBarbosa commented 5 years ago

I also support this! We are a Unity VR and AR advergame studio; our clients would lose their minds over this. Initially, we would like to be able to do normal image tracking or ARCore tracking to position objects in the scene; then we would use this to track hands and enable our users to interact directly with the game. This would be phenomenal!

KDevS commented 5 years ago

Would love to get a Unity port as well. There aren't any open-source options for hand tracking in AR, especially for mobile devices, and this works perfectly on my phone. There are some options, like ManoMotion, which support hand tracking in 2D, but they are paid and in closed beta. If this could be used with Unity, it would help a lot of developers who are looking to integrate more natural interaction into their Augmented Reality experiences. The use case for VR is even more obvious.

iBicha commented 5 years ago

Maybe this could be in the style of the C API from TFLite?

And the string containing the definition of the graph can be passed from the .Net runtime to the native API with PInvoke calls.

I would say it may even be possible to create custom calculators in C#, with the managed methods (GetContract(), Open(), and Process()) passed to the C API as function pointers to be invoked from there.

The incentive would be to make it possible to use this alongside arcore-unity-sdk (with ARCore passing the camera image to MediaPipe through CameraImage), and maybe ARFoundation as well (which also has an API for retrieving the camera image), so it would take the form of a subsystem. This is where most AR creation is happening, so it would let a lot of devs expand on their existing projects.

These are only ideas, as I haven't dived into MediaPipe enough to have a solid opinion.
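For illustration, a TFLite-style C surface might look roughly like the stub below. Every name here (MpGraph, MpGraphCreate, and so on) is hypothetical, not part of any real MediaPipe API; the bodies are placeholders where a real build would wrap a mediapipe::CalculatorGraph. The point is only the shape: an opaque handle plus flat extern "C" functions that P/Invoke can bind.

```cpp
#include <string>

// Hypothetical opaque handle, in the spirit of TfLiteInterpreter from
// tflite's c_api.h. A real version would own a mediapipe::CalculatorGraph.
struct MpGraph {
  std::string config;   // graph definition text passed in from the .NET side
  bool running = false; // placeholder for real graph state
};

// Creates a graph from its text config (the string marshalled over P/Invoke).
extern "C" MpGraph* MpGraphCreate(const char* graph_config_text) {
  MpGraph* graph = new MpGraph;
  graph->config = graph_config_text;
  return graph;
}

// Starts the graph; returns 0 on success, mirroring tflite's status-code style.
extern "C" int MpGraphStartRun(MpGraph* graph) {
  graph->running = true;
  return 0;
}

// Releases the graph.
extern "C" void MpGraphDelete(MpGraph* graph) { delete graph; }
```

On the C# side, each of these would then be declared with [DllImport] and called through P/Invoke, with the graph config passed as an ordinary string, as described above.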

mgyong commented 5 years ago

+Chris McClanahan cmcclanahan@google.com +Chuo-Ling Chang chuoling@google.com +Matthias Grundmann grundman@google.com

On Thu, Aug 22, 2019 at 9:10 PM Brahim Hadriche notifications@github.com wrote:

Maybe this can be in the style of c api from tflite? https://github.com/tensorflow/tensorflow/blob/master/tensorflow/lite/experimental/c/c_api.h

And the string containing the definition of the graph can be passed from the .Net runtime to the native API with PInvoke calls.

I would say it can be even possible to create custom calculators in C#, and the managed methods (GetContract(), Open(), and Process()) can be passed to the C API as a function pointer to be invoked from there.

The incentive would be to make it possible to use alongside arcore-unity-sdk https://github.com/google-ar/arcore-unity-sdk (in a fashion where ARCore will be passing the camera image to mediapipe through the CameraImage https://developers.google.com/ar/reference/unity/class/GoogleARCore/Frame/CameraImage), and maybe ARFoundation https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@1.0/manual/index.html as well (which also has an API for retrieving the camera image)

These are only ideas as I didn't dive into mediapipe enough to have a solid opinion


TheBricktop commented 5 years ago

The Aristo API from HTC is currently much inferior to Google's approach: it relies on the stereo cameras of the Vive Pro, works really badly on the original Vive's mono camera (as tested), and the Android version is very limited and drains the battery quickly, which limits its use alongside other computationally heavy tasks like XR. That's why we look forward to seeing a port of MediaPipe to Unity.

mcclanahoochie commented 5 years ago

Hi! We definitely welcome contributions to MediaPipe, and Unity support would be great to have. I like reading about the use cases, and @iBicha has some nice ideas on approaches.

I can add a few more ideas, and would say there is a spectrum (as always in programming) of how to solve this (i.e. adding Unity support):

       quick/easier/faster <--> more-involved/harder/tedious 
less-flexible/less-generic <--> more-flexible/more-generic

On the right side, there is the ARCore approach, where the majority of the ARCore API is mapped into C#, including all the C/C++/C# wrappers. This obviously requires a lot of C# code (and C interfaces) to be written, but provides the greatest flexibility in developer use cases and in how tightly you can integrate into the C# application.

On the left side, there is the option of minimizing the amount of C#/C wrapper that needs to be written, by writing a native class (similar to hello_world.cc, or FrameProcessor.java) to handle all the graph initialization/running. In the simplest case, there could be just a few functions of a custom graph runner exposed to C#: InitAndStartGraph, AddPacket, RetrieveResultPacket, ShutdownGraph. This would be more of a black-box approach, treating running a graph like calling a function.
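As a sketch only, the black-box runner with those four calls might look like this. The internals are stubs (an int stands in for a packet, a plain queue stands in for graph output); a real version would wrap mediapipe::CalculatorGraph and its packet types, but the four-function interface shape is the point:

```cpp
#include <queue>
#include <string>

// Stubbed black-box graph runner exposing only the four calls named above.
// Everything inside is a placeholder for the real CalculatorGraph plumbing.
class GraphRunner {
 public:
  // Parses and starts the graph; returns false on an empty/invalid config.
  bool InitAndStartGraph(const std::string& config_text) {
    started_ = !config_text.empty();
    return started_;
  }

  // Feeds one input packet (here just a frame id) into the graph.
  void AddPacket(int frame_id) {
    if (started_) pending_.push(frame_id);
  }

  // Pops the next result packet if one is ready; returns false otherwise.
  bool RetrieveResultPacket(int* out) {
    if (pending_.empty()) return false;
    *out = pending_.front();
    pending_.pop();
    return true;
  }

  // Tears the graph down.
  void ShutdownGraph() { started_ = false; }

 private:
  bool started_ = false;
  std::queue<int> pending_;
};
```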

I think depending on the application, one approach may be more fitting than the other (considering amount of effort involved), or some hybrid of the two. Hopefully this discussion can get people going in the right direction for them.

A side note for future reference: to link the OpenGL context between Unity and MediaPipe (where Unity is the parent context), you would need to follow something similar to what is done for Android in nativeSetParentGlContext, and connect it with Unity's graphics device event callback:

OnGraphicsDeviceEvent(UnityGfxDeviceEventType eventType) {
  switch (eventType) {
    case kUnityGfxDeviceEventInitialize: {
#if HAS_EGL
      external_context = eglGetCurrentContext();
#else
      // ... other graphics APIs ...
#endif
      break;
    }
    // ...
  }
}
ahmadpi commented 5 years ago

I have been looking for this solution as well. I fully support this effort!

jmartinho commented 5 years ago

The HTC Vive solution is not good enough compared with this MediaPipe hand tracking. I tested both, and even though the Vive solution runs on a desktop, it does not compare with MediaPipe running on a slow Android smartphone. MediaPipe hand tracking is stunningly good and will be a game changer for UI in the near future. I hope MediaPipe will support Unity very soon. A good starting point is TensorFlowSharp, already implemented in Unity: Using TensorFlowSharp in Unity (Experimental) https://github.com/Unity-Technologies/ml-agents/blob/develop/docs/Using-TensorFlow-Sharp-in-Unity.md Here it is as a Unity package: https://s3.amazonaws.com/unity-ml-agents/0.4/TFSharpPlugin.unitypackage

Ericbing commented 5 years ago

VR hand tracking for the Oculus Quest/Rift S is so important for a better user experience with more natural interaction input; finger tracking is definitely at the top of the list. Also, more and more VR HMDs use inside-out tracking, meaning they all have cameras onboard.

lqz commented 5 years ago

I hope for this feature too. It is very important for us.

GilbertoBitt commented 5 years ago

I fully support this. Imagine being able to interact with the game/objects themselves using hand tracking.

lukamilosevic commented 4 years ago

A step in the right direction would be headless hand tracking and using UnityPlayer.UnitySendMessage to send the coordinates from Android to Unity Activity. Then just position the hand on the camera viewport according to received coordinates.

First step could be just adding some code to the current "Hand Tracking GPU" app to send data to a server running in Unity Editor during play mode. That itself could become a great development tool. I might try to do this myself but if someone does something like this, please post your results!

TheBricktop commented 4 years ago

There are other models for hand tracking, but probably not as performant as Google's version: https://github.com/lmb-freiburg/hand3d

seberta commented 4 years ago

Hi, just wondering why this issue is closed now? Is the Unity plugin available?

DBrown12 commented 4 years ago

Like Seberta, I'd like to know. Has any headway been made on this endeavor?

martdob commented 4 years ago

Unity support would be an excellent approach. Especially for a variety of Augmented Reality apps and solutions, a MediaPipe plugin for Unity would be a great help.

TheBricktop commented 4 years ago

I kinda find it weird that this feature request was closed without any answer. Intel's GitHub does that too.

romaindebraize commented 4 years ago

I also support this. When will it be available?

mgyong commented 4 years ago

We are looking into this and will update the thread when we have something more definite. We welcome contributions from folks in the thread

asierras commented 4 years ago

I am also looking forward to this. A lot of clients are asking me for hand and finger tracking in AR apps on iOS and Android.

Any news about this?

Thank you!

HooliganLabs commented 4 years ago

Also interested in support of this.

boehm-e commented 4 years ago

Interested too

moemenYmoemen commented 4 years ago

Interested in Unity support for it

rtrn1337 commented 4 years ago

I am also interested in a Unity implementation!

justinduynguyen commented 4 years ago

Has anyone implemented MediaPipe in Unity?

ostryhub commented 4 years ago

Hey, the issue is closed, but does anyone know if there are plans to support Unity integration for iOS and Android?

huangshenlong commented 4 years ago

Is anyone working on this? I want to do it; does anyone have the skills to help? I'll be streaming my efforts on https://twitch.tv/huangshenlong

huangshenlong commented 4 years ago

@lukamilosevic I'm going to try your approach. I'm a Unity noob though so it might take me a while.

You said "A step in the right direction would be headless hand tracking and using UnityPlayer.UnitySendMessage to send the coordinates from Android to Unity Activity. Then just position the hand on the camera viewport according to received coordinates.

First step could be just adding some code to the current "Hand Tracking GPU" app to send data to a server running in Unity Editor during play mode. That itself could become a great development tool. I might try to do this myself but if someone does something like this, please post your results!"

lukamilosevic commented 4 years ago

@lukamilosevic I'm going to try your approach. I'm a Unity noob though so it might take me a while.

You said "A step in the right direction would be headless hand tracking and using UnityPlayer.UnitySendMessage to send the coordinates from Android to Unity Activity. Then just position the hand on the camera viewport according to received coordinates.

First step could be just adding some code to the current "Hand Tracking GPU" app to send data to a server running in Unity Editor during play mode. That itself could become a great development tool. I might try to do this myself but if someone does something like this, please post your results!"

Looking back at what I wrote, it's not a good solution. An actually good step would be a server running on the phone. What the app needs is a basic server that serves its clients the hand-tracking data (the same data shown on the screen).

This way, a separate phone can be dedicated to hand tracking. The idea is, instead of showing the data on the screen as the app currently does, to serve it from a server.

Here's what I would do if I had the time and the patience:

From here, writing a simple UDP client in Unity should do the trick to get the data.

Also from here, making both run on the same phone means either merging the Unity Android build project with this Android project, or editing this project to run in the background (maybe as a service).

But doing only the first part, where hand tracking is done on a separate phone, basically makes a hand-tracking product just like any other out there, except everyone already has the hardware at home.
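To make the client side of that UDP idea concrete, here is a small parsing sketch. The wire format is an assumption invented for illustration (semicolon-separated "x,y,z" triples, e.g. 21 of them for one hand); MediaPipe itself emits NormalizedLandmarkList protobuf packets, which the phone-side server would flatten into whatever text or binary format the Unity client expects. The receiving socket code is omitted; this only shows turning one datagram payload into landmarks:

```cpp
#include <cstdio>
#include <sstream>
#include <string>
#include <vector>

struct Landmark { float x, y, z; };

// Parses one hypothetical datagram payload of the form "x,y,z;x,y,z;..."
// into a list of landmarks. Malformed triples are skipped. In Unity, the
// client would run this (or its C# equivalent) on each received datagram
// and position the hand in the camera viewport from the result.
std::vector<Landmark> ParseLandmarkDatagram(const std::string& payload) {
  std::vector<Landmark> landmarks;
  std::stringstream triples(payload);
  std::string triple;
  while (std::getline(triples, triple, ';')) {
    Landmark lm{};
    if (std::sscanf(triple.c_str(), "%f,%f,%f", &lm.x, &lm.y, &lm.z) == 3) {
      landmarks.push_back(lm);
    }
  }
  return landmarks;
}
```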

b29b commented 4 years ago

Not sure if it's official or anything, but this one works: https://gitlab.com/thangnh.sas/mediapipe-unity-hand-tracking https://www.youtube.com/watch?v=nNL7zOq3fmo

vivi90 commented 3 years ago

Yes, Unity- or Qt-Integration would be great! Need this for my studies.

ROBYER1 commented 3 years ago

We are looking into this and will update the thread when we have something more definite. We welcome contributions from folks in the thread

Is this thread closed for a reason? I am hoping to use this with Unity for easier cross-platform motion tracking, object detection/objectron support also.

Currently I am constrained by what devices support ARCore/ARKit so the Instant Motion Tracking would be a fantastic alternative with broader reach

midopooler commented 3 years ago

Any conclusion to this thread? I actually needed a human segmentation feature in unity (but for mobile applications)

vivi90 commented 3 years ago

@midopooler Nothing so far, it seems.

Thaina commented 3 years ago

Why was this issue closed, though? We should have a Unity integration of this library.

vivi90 commented 3 years ago

Why was this issue closed, though? We should have a Unity integration of this library.

Same opinion.

ostryhub commented 3 years ago

Why was this issue closed, though? We should have a Unity integration of this library.

Same here.

ROBYER1 commented 3 years ago

@nadir500 We have gotten several request to support MediaPipe in Unity engine. Can you share with us your use cases? We love to work with some teams to get contribution to MediaPipe for getting MediaPipe into Unity @mcclanahoochie @chuoling

It's been two years; I could write an essay of use cases if you need one. Many current projects would benefit from this.

rtrn1337 commented 3 years ago

Maybe this could help? https://github.com/homuler/MediaPipeUnityPlugin I didn't test it, but it's on my list.

nadir500 commented 3 years ago

The main challenge in Unity currently is making hand tracking with depth perception an option for VR/AR interactions using the mobile camera, instead of tracking devices such as Leap Motion, which has no Android SDK. Of course, it all has to run in real time, since the slightest lag would hurt the experience in some cases. The projects I make in Unity often involve merging research with VR environments using different interaction tools. Other features, like gesture detection, would be useful for various AR concepts.

KDevS commented 3 years ago

Unity support for MediaPipe would be really welcome. Anyone who has worked on AR/VR/MR applications would love a stable hand-tracking option that doesn't depend on expensive and/or walled-off hardware. I have seen some really good work on hand tracking with just a single RGB camera from some university teams a few years back, but none of it was opened to the public, and I won't be surprised if most of it ended up in Facebook's or Microsoft's patent collections.

OpenCV seems to be the only open option available at the moment, but its hand-tracking options are not polished enough to be used commercially. Options like ManoMotion are way too expensive for individual developers.

vivi90 commented 3 years ago

Yes, we really need this stuff for Unity!

nadir500 commented 3 years ago

Support for Unity Barracuda would benefit developers even more.

FerLuisxd commented 3 years ago

Even just an API would be really neat to have for making really cool 3D applications.

yugosaito4 commented 3 years ago

@rtrn1337 have you tried the plugin?

rtrn1337 commented 3 years ago

@rtrn1337 have you tried the plugin?

Yes, I did. Some features work in the Unity Editor, but I get an error in Xcode when I try to build for a device. I haven't had time to test it more closely yet.