61315 / mediapipe-prebuilt

Prebuilt mediapipe packages and demos ready to deploy on your device at one go. 💨
MIT License
32 stars 9 forks

How to Get Face Landmarks, Hand Landmarks, and Body Landmarks? #4

Open Epoch2022-iOS opened 1 year ago

Epoch2022-iOS commented 1 year ago

Hello, I'm an iOS application developer. I need to use MediaPipe to track the human body, face, and hands. I tried running the example demo and got the realtime pixelBuffer, but I can't retrieve the face landmarks, hand landmarks, or body landmarks from the code.

I need your help to get those landmarks.

Language: Swift/Objective-C, Xcode version: 14.0.1, Device: iPhone 12 Pro Max

61315 commented 1 year ago

@Epoch2022-iOS Have you tried the Holistic example from the official repo? They provide examples for JavaScript, Android, and iOS, and I highly advise you to try them.

Please provide a complete description of the issue as well. Otherwise, I honestly don't know how to help you.

If this is a request for adding a prebuilt version of the holistic example, please say so.

I'm also willing to change the current iOS playground project so that it includes all of the examples in the mediapipe iOS example package. This would make it one big universal project where you can play around with everything.

Best of luck.

Epoch2022-iOS commented 1 year ago

Thanks. We want to use face and hand tracking together in one project via MediaPipe, not Holistic. Here's my framework demo: https://github.com/Epoch2022-iOS/DoubleFrameworkTest-iOS

61315 commented 1 year ago

I get the gist.

Apparently, the interface shown in both the hand and face examples does not provide access to the landmark data. I'll see to it that these projects have appropriate member functions so that you can get the landmark data.

Thank you.

Epoch2022-iOS commented 1 year ago

Thanks. I'll wait for your response!

61315 commented 1 year ago

Check out the new delegate.

https://github.com/61315/mediapipe-prebuilt/blob/0e71edd4872289e9c22d1c88f34d8bbad7e2cc6b/src/ios/facegeometry/MPPBFaceGeometry.h#L12

Also see the implementation of how to consume the landmark data.

https://github.com/61315/mediapipe-prebuilt/blob/0e71edd4872289e9c22d1c88f34d8bbad7e2cc6b/examples/ios/facegeometry/mppb-ios-facegeometry/ViewController.swift#L161-L166

I believe this answers your problem. Use it in good health, and happy new year.

https://user-images.githubusercontent.com/46559594/209550324-90d72b0d-d1f3-42a3-998d-7943f77d2df5.mp4
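For anyone landing here, consuming such a delegate from Swift might look roughly like the following. This is a minimal sketch, not the project's actual API: the protocol name, method signature, and landmark type below are assumptions modeled on the linked MPPBFaceGeometry.h and ViewController.swift; check those files for the exact declarations.

```swift
import UIKit
import simd

// Hypothetical sketch: MPPBFaceGeometryDelegate and its callback name are
// assumptions; consult the linked MPPBFaceGeometry.h for the real signatures.
class LandmarkViewController: UIViewController, MPPBFaceGeometryDelegate {
    let tracker = MPPBFaceGeometry()

    override func viewDidLoad() {
        super.viewDidLoad()
        tracker.delegate = self   // receive landmark callbacks per frame
    }

    // Invoked for every processed frame with the detected face landmarks.
    func faceGeometry(_ geometry: MPPBFaceGeometry,
                      didOutputLandmarks landmarks: [SIMD3<Float>]) {
        // Face Mesh emits 468 normalized (x, y, z) points per face.
        if let firstPoint = landmarks.first {
            print("landmark 0:", firstPoint)
        }
    }
}
```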

Epoch2022-iOS commented 1 year ago

Thanks for your reply! From your example video, I can see you used MediaPipe's face tracking with MPPBFaceGeometry.h, but we want to use face and hand tracking together. When I export the face and hand tracking SDKs into the same project, I get an error: `Class CaptureDelegate is implemented in both...`. This is my project; you can download it, run the target, and you will see the error.

Thanks again!

Epoch2022-iOS commented 1 year ago

Thank you very much for your patient answer! Let me describe my problem again: we need to obtain hand landmarks as well as face landmarks. After referring to the demo you provided, I did not find an effective solution. I also consulted Google's official MediaPipe engineers about the same problem, and they told me I needed to customize a .pbtxt file and then create a new BUILD file for it. Here is my exchange with Google: https://github.com/google/mediapipe/issues/3936#event-8092845671
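For context, the approach the MediaPipe engineers describe, a single custom graph that runs the face and hand subgraphs side by side on the same camera stream, might be sketched roughly like this. This is a hedged illustration only, not a verified graph: the subgraph names follow upstream MediaPipe conventions but have not been checked against a build, and the real subgraphs may require additional side packets.

```
# Hypothetical combined graph (.pbtxt): face mesh and hand tracking
# fed from the same input stream. Names are unverified assumptions.
input_stream: "input_video"
output_stream: "face_landmarks"
output_stream: "hand_landmarks"

node {
  calculator: "FaceLandmarkFrontGpu"
  input_stream: "IMAGE:input_video"
  output_stream: "LANDMARKS:face_landmarks"
}

node {
  calculator: "HandLandmarkTrackingGpu"
  input_stream: "IMAGE:input_video"
  output_stream: "LANDMARKS:hand_landmarks"
}
```

Building one combined graph like this also sidesteps the `Class CaptureDelegate is implemented in both` error, since only a single framework (with one copy of the OpenCV capture symbols) ends up linked into the app.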

movtac commented 1 year ago

Hello! The demo you made is amazing, and the texture material fits the face very well! I am an iOS AR developer. In ARKit, I can use the leftEyeTransform and rightEyeTransform data for funny expressions and eye tracking.

My question is how to obtain the equivalent of leftEyeTransform and rightEyeTransform from MediaPipe's Face Mesh: a simd_float4x4 transform matrix indicating the position and orientation of each eye.