ShonubiSamuel closed this issue 5 months ago.
No registered object with name: FaceLandmarkFrontCpu; Unable to find Calculator "FaceLandmarkFrontCpu"
See https://github.com/homuler/MediaPipeUnityPlugin/issues/870. For now, you can also use the FaceLandmarker API. cf. https://github.com/homuler/MediaPipeUnityPlugin/blob/359fb8b4e8ff7b5c7f794c37a023cb53b69d0fc6/Assets/MediaPipeUnity/Samples/Scenes/Tasks/Face%20Landmark%20Detection/FaceLandmarkerRunner.cs
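For reference, a minimal sketch of what the FaceLandmarker route looks like. The names below follow the linked FaceLandmarkerRunner.cs, but the option and parameter names are approximate and may differ between plugin versions, so treat the sample file itself as the source of truth:

```csharp
// Rough sketch only: option/parameter names are best-effort guesses
// based on the linked FaceLandmarkerRunner.cs and may vary by version.
using Mediapipe.Tasks.Core;
using Mediapipe.Tasks.Vision.Core;
using Mediapipe.Tasks.Vision.FaceLandmarker;

var options = new FaceLandmarkerOptions(
    baseOptions: new BaseOptions(modelAssetPath: "face_landmarker_v2_with_blendshapes.bytes"),
    runningMode: RunningMode.VIDEO,
    numFaces: 1);

using var landmarker = FaceLandmarker.CreateFromOptions(options);
// Per camera frame: wrap the frame in an Image and call something like
//   var result = landmarker.DetectForVideo(image, timestampMillisec);
// then draw result with an annotation controller, as in the sample scene.
```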
Thanks a lot, it works now, but I have a few things to ask:
Mine gives me a series of GPU-related errors like:
EntryPointNotFoundException: mp_GpuResources_Create__Pv assembly:
node: {
  calculator: "ImageTransformationCalculator"
  input_stream: "IMAGE:throttled_input_video"
  output_stream: "IMAGE:transformed_input_video"
  node_options: {
    [type.googleapis.com/mediapipe.ImageTransformationCalculatorOptions] {
      flip_vertically: true
    }
  }
}
What I did was rotate the Annotation Layer GameObject by 90 degrees on the z axis, which sort of fixed the rotation issue, but I'm just wondering if there is a better way of going about it.
I don't know if you have any recommendation on how to go about making the app run a bit smoother. My code is down below. Thanks for your time:
using System;
using System.Collections;
using TMPro;
using Unity.Collections;
using Unity.Collections.LowLevel.Unsafe;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;
using UnityEngine.UI;
using Stopwatch = System.Diagnostics.Stopwatch;

namespace Mediapipe.Unity.Tutorial
{
  public class FaceMeshGpu : MonoBehaviour
  {
    [SerializeField] private ARCameraManager _cameraManager;
    [SerializeField] private TextAsset _configAsset;
    private int _width;
    private int _height;
    [SerializeField] private MultiFaceLandmarkListAnnotationController _multiFaceLandmarksAnnotationController;

    private CalculatorGraph _graph;
    private OutputStream _multiFaceLandmarksStream;
    private ResourceManager _resourceManager;
    private Texture2D _texture;
    private NativeArray<byte> _rawTextureData;

    private IEnumerator Start()
    {
      _cameraManager.frameReceived += OnCameraFrameReceived;
      yield return new WaitUntil(() => _texture != null);

      yield return GpuManager.Initialize();
      if (!GpuManager.IsInitialized)
      {
        throw new Exception("Failed to initialize GPU resources");
      }

      _width = _texture.width;
      _height = _texture.height;

      _resourceManager = new StreamingAssetsResourceManager();
      yield return _resourceManager.PrepareAssetAsync("face_detection_short_range.bytes");
      yield return _resourceManager.PrepareAssetAsync("face_landmark_with_attention.bytes");

      var stopwatch = new Stopwatch();

      _graph = new CalculatorGraph(_configAsset.text);
      _graph.SetGpuResources(GpuManager.GpuResources);
      _multiFaceLandmarksStream = new OutputStream(_graph, "multi_face_landmarks");
      _multiFaceLandmarksStream.StartPolling();
      _graph.StartRun();
      stopwatch.Start();

      while (true)
      {
        var imageFrame = new ImageFrame(ImageFormat.Types.Format.Srgba, _width, _height, _width * 4, _rawTextureData);
        var currentTimestamp = stopwatch.ElapsedTicks / (TimeSpan.TicksPerMillisecond / 1000);
        _graph.AddPacketToInputStream("input_video", Packet.CreateImageFrameAt(imageFrame, currentTimestamp));

        var task = _multiFaceLandmarksStream.WaitNextAsync();
        yield return new WaitUntil(() => task.IsCompleted);

        var result = task.Result;
        if (!result.ok)
        {
          throw new Exception("Something went wrong");
        }

        var multiFaceLandmarksPacket = result.packet;
        if (multiFaceLandmarksPacket != null)
        {
          var multiFaceLandmarks = multiFaceLandmarksPacket.GetProtoList(NormalizedLandmarkList.Parser);
          _multiFaceLandmarksAnnotationController.DrawNow(multiFaceLandmarks);
        }
        else
        {
          _multiFaceLandmarksAnnotationController.DrawNow(null);
        }
      }
    }

    private unsafe void OnCameraFrameReceived(ARCameraFrameEventArgs eventArgs)
    {
      if (!_cameraManager.TryAcquireLatestCpuImage(out XRCpuImage image))
      {
        return;
      }

      if (_texture == null || _texture.width != image.width || _texture.height != image.height)
      {
        _texture = new Texture2D(image.width, image.height, TextureFormat.RGBA32, false);
      }

      var conversionParams = new XRCpuImage.ConversionParams(image, TextureFormat.RGBA32, m_Transformation);
      _rawTextureData = _texture.GetRawTextureData<byte>();
      try
      {
        image.Convert(conversionParams, new IntPtr(_rawTextureData.GetUnsafePtr()), _rawTextureData.Length);
      }
      finally
      {
        image.Dispose();
      }
      _texture.Apply();
    }

    [SerializeField]
    Button m_TransformationButton;

    public Button transformationButton
    {
      get => m_TransformationButton;
      set => m_TransformationButton = value;
    }

    XRCpuImage.Transformation m_Transformation = XRCpuImage.Transformation.MirrorX | XRCpuImage.Transformation.MirrorY;

    public void CycleTransformation()
    {
      m_Transformation = m_Transformation switch
      {
        XRCpuImage.Transformation.None => XRCpuImage.Transformation.MirrorX,
        XRCpuImage.Transformation.MirrorX => XRCpuImage.Transformation.MirrorY,
        XRCpuImage.Transformation.MirrorY => XRCpuImage.Transformation.MirrorX | XRCpuImage.Transformation.MirrorY,
        _ => XRCpuImage.Transformation.None
      };

      if (m_TransformationButton)
      {
        m_TransformationButton.GetComponentInChildren<TextMeshProUGUI>().text = m_Transformation.ToString();
      }
    }

    private void OnDestroy()
    {
      _multiFaceLandmarksStream?.Dispose();
      _multiFaceLandmarksStream = null;

      if (_graph != null)
      {
        try
        {
          _graph.CloseInputStream("input_video");
          _graph.WaitUntilDone();
        }
        finally
        {
          _graph.Dispose();
          _graph = null;
        }
        GpuManager.Shutdown();
      }
    }
  }
}
How were you able to preview yours in the Editor?
I usually run UnityEditor on Linux.
I'm just wondering if there is a better way of going about it.
Check the orientation of the camera. If the input image is not rotated, then you need to flip it vertically. If it's rotated 90 degrees, then you need to un-rotate it and flip it vertically.
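For what it's worth, the un-rotation can often be done inside the graph itself rather than by rotating the annotation GameObject. The ImageTransformationCalculatorOptions proto has a rotation_mode field (per MediaPipe's image_transformation_calculator.proto), so the ImageTransformationCalculator node from the config above could hypothetically be extended like this (the exact enum value depends on the camera's actual orientation, so verify against the proto definition):

```
node: {
  calculator: "ImageTransformationCalculator"
  input_stream: "IMAGE:throttled_input_video"
  output_stream: "IMAGE:transformed_input_video"
  node_options: {
    [type.googleapis.com/mediapipe.ImageTransformationCalculatorOptions] {
      rotation_mode: ROTATION_90  # un-rotate a 90-degree-rotated camera image
      flip_vertically: true
    }
  }
}
```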
My application lags too much; I really don't know what to do.
If the ARFoundation API takes a long time, then please ask elsewhere. I suspect the resolution is too high, but if it's not, you may want to copy the image on the GPU, not the CPU.
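As an illustration of the "copy on the GPU" suggestion, one hypothetical approach is to blit the ARCameraBackground material into a RenderTexture instead of calling XRCpuImage.Convert on the CPU. This is only a sketch under the assumption that the camera background material is accessible; getting the result into MediaPipe still needs extra plumbing (e.g. AsyncGPUReadback or a GPU buffer path):

```csharp
// Sketch: GPU-side copy of the AR camera frame. Assumes an
// ARCameraBackground reference is wired up in the Inspector.
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class GpuCameraCopy : MonoBehaviour
{
  [SerializeField] private ARCameraBackground _cameraBackground;
  private RenderTexture _renderTexture;

  private void CopyCameraFrameOnGpu(int width, int height)
  {
    if (_renderTexture == null)
    {
      _renderTexture = new RenderTexture(width, height, 0, RenderTextureFormat.ARGB32);
    }
    // Graphics.Blit with the background material samples the external
    // camera texture directly on the GPU; no CPU-side pixel copy occurs.
    Graphics.Blit(null, _renderTexture, _cameraBackground.material);
    // From here, read asynchronously (AsyncGPUReadback.Request) or feed
    // the texture to the plugin via a GPU buffer, depending on the setup.
  }
}
```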
Thanks for the reply; I really appreciate it. I have now successfully integrated the hand tracking solution with AR Foundation. However, I had to revert to v0.12 because I still suck at programming, and v0.12 seems a bit easier to navigate.
I acknowledge that I have a lot to learn, especially in understanding the techniques you use to write your code. I would like to know how I can create my own custom calculator, such as for edge detection or hand gestures in Mediapipe with Unity. Where should I get started?
Additionally, if you have any materials that are helpful for learning how to write code like you do, I would greatly appreciate it. Thanks.
I would like to know how I can create my own custom calculator, such as for edge detection or hand gestures in Mediapipe with Unity. Where should I get started?
First and foremost, I recommend reading the official MediaPipe documentation. If you wish to implement your own Calculator, you will need to write C++ code, so please refer to the MediaPipe documentation and code.
However, I guess what you really want to do is to implement a custom CalculatorGraph by combining existing MediaPipe Calculators. In that case, please read the Tutorial and explore how to use CalculatorGraph.
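To make that concrete, a custom graph can be driven from C# the same way the FaceMeshGpu code above drives its config. The sketch below wires two calculators from MediaPipe's GPU edge-detection example; note that LuminanceCalculator and SobelEdgesCalculator are only usable if they were compiled into the plugin's native library, so this is an assumption to verify, not a guarantee:

```csharp
// Sketch: a hand-written CalculatorGraph config combining existing
// calculators, driven with the same pattern as the code above.
// Calculator names come from MediaPipe's edge_detection example and
// must exist in the plugin's native library for this to run.
var configText = @"
input_stream: ""input_video""
output_stream: ""output_video""
node: {
  calculator: ""LuminanceCalculator""
  input_stream: ""input_video""
  output_stream: ""luma_video""
}
node: {
  calculator: ""SobelEdgesCalculator""
  input_stream: ""luma_video""
  output_stream: ""output_video""
}";

var graph = new CalculatorGraph(configText);
var outputStream = new OutputStream(graph, "output_video");
outputStream.StartPolling();
graph.StartRun();
// ...then feed frames with graph.AddPacketToInputStream, as above.
```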
On the other hand, if you want to use a Task API which has not yet been ported to plugins, it is quicker to read the pull request (PR) where Task API was implemented (cf. #997).
Additionally, if you have any materials that are helpful for learning how to write code like you do, I would greatly appreciate it.
For general advice on learning programming, I believe you can obtain more useful information by asking in other places.
@ShonubiSamuel Hi! I see that you managed to implement hand tracking with AR Foundation. Can you share your project? It would really help me because I'm trying to solve the same problem. Thank you.
Plugin Version or Commit ID
v0.13.1
Unity Version
2022.3.4f1
Your Host OS
macOS Ventura 13.6.1
Target Platform
Android
Description
(Just in case, please note: this current issue is a bit different from the one I had in v0.12.0.) After struggling with v0.12.0, I switched to v0.13.1, which is a lot better because the Official Solution Project (from the Getting Started page) works. Thanks for the new update.
So, I was able to modify the Official Solution Project to work with AR Foundation and tested it in the editor with my phone using "AR Foundation Remote 2," which works well without errors. However, when I tried out the build on my Android phone, it crashes my phone.
I tried running Android logcat and saw something like the following (note: the full version is further down below):
Code to Reproduce the issue
Additional Context
I ran a logcat using adb -s [device_id] logcat Unity:V native:V tflite:V CRASH:E AndroidRuntime:E "*:S", so I copied out what I felt was the main log:
But here is the full log: