Unity-Technologies / arfoundation-samples

Example content for Unity projects based on AR Foundation

Setting automaticImageScaleEstimation to enabled on iOS #1065

Closed camnewnham closed 7 months ago

camnewnham commented 1 year ago

ARCore allows leaving the image size blank and will automatically estimate image size. ARKit has had this as an option since iOS 13.

I can't find an option to enable this in the XR Plugin configuration (or anywhere else!). Consequently, I get an error at build time when not specifying a size. If I try to add one at runtime, I get an exception:

InvalidOperationException: ARKit requires physical dimensions for all reference images.
at UnityEngine.XR.ARKit.ARKitImageDatabase.ScheduleAddImageWithValidationJobImpl 
...

I can see in the logs:

UnityARKit: Updating ARSession configuration with <ARWorldTrackingConfiguration: 0x2805cdd00 worldAlignment=Gravity lightEstimation=Disabled frameSemantics=None videoFormat=<ARVideoFormat: 0x281a5ca50 imageResolution=(1920, 1440) pixelFormat=(420f) framesPerSecond=(60) captureDeviceType=AVCaptureDeviceTypeBuiltInWideAngleCamera captureDevicePosition=(1)> autoFocus=Enabled environmentTexturing=None wantsHDREnvironmentTextures=Enabled planeDetection=None collaboration=Disabled userFaceTracking=Disabled sceneReconstruction=None detectionImages=[<ARReferenceImage: 0x281d98600 name="123_D54BA2D1-D49F-CA40-AA5C-124390EAF0D7" physicalSize=(0.200, 0.200)>] maximumNumberOfTrackedImages=4 automaticImageScaleEstimation=Disabled appClipCodeTracking=Disabled>

which mentions specifically automaticImageScaleEstimation=Disabled.

There's some brief discussion of how to customize this here, but since this doesn't seem to be listed in the Feature enum, I'm not quite sure how to go about it.

I figure that even if I can set this flag correctly, I'll still get the exception from ARKitImageDatabase, however I was hoping that the flag might override the physical size (and hence I would specify an arbitrary size to satisfy the database).
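As a sketch of that idea (hypothetical: `trackedImageManager` and `texture` stand in for your own references, and the 0.2 m width is an arbitrary placeholder, not a value from this thread):

```csharp
// Minimal sketch: add a runtime image with a placeholder physical width so
// that ARKit's reference-image validation passes. If automaticImageScaleEstimation
// could be enabled, the hope is that ARKit would override this width at runtime.
var mutableLibrary = trackedImageManager.CreateRuntimeLibrary() as MutableRuntimeReferenceImageLibrary;
trackedImageManager.referenceLibrary = mutableLibrary;

// 0.2 m is arbitrary; ARKit only requires that some physical size be provided.
var job = mutableLibrary.ScheduleAddImageWithValidationJob(texture, "placeholder-image", 0.2f);
```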

Also, is there somewhere else I should be mentioning this to get the feature "officially" added?

camnewnham commented 1 year ago

There's also some stale discussion here but with no developer chime-in: https://forum.unity.com/threads/artrackedimagemanager-automatic-estimation-of-image-target-sizes.1319811/

andyb-unity commented 1 year ago

Hi @camnewnham,

I read your post back in August. Sorry for not responding! I've tagged this issue as a feature request. I agree that this would be a nice feature to support in AR Foundation.

camnewnham commented 1 year ago

> Hi @camnewnham,
>
> I read your post back in August. Sorry for not responding! I've tagged this issue as a feature request. I agree that this would be a nice feature to support in AR Foundation.

Thanks @andyb-unity. Please let me know if there's any way I can contribute.

camnewnham commented 1 year ago

Here's a workaround to enable this behaviour, based on this discussion. It doesn't have any error handling and would fail if the configuration is not an ARWorldTrackingConfiguration, but otherwise works as expected. Just add the behaviour to the ARSession object.

// EnableSizeEstimation.cs
using System;
using System.Runtime.InteropServices;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARKit;

[RequireComponent(typeof(ARSession))]
public class EnableSizeEstimation : MonoBehaviour
{
    // See https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@4.0/manual/extensions.html
    public struct NativePtrData
    {
        public int version;
        public IntPtr sessionPtr;
    }

    private void Update()
    {
        if (GetComponent<ARSession>().subsystem is ARKitSessionSubsystem subsystem)
        {
            // Make sure we have a native ptr
            if (subsystem.nativePtr == IntPtr.Zero)
            {
                return;
            }

            // Get the session ptr from the native ptr data
            IntPtr session = Marshal.PtrToStructure<NativePtrData>(subsystem.nativePtr).sessionPtr;
            if (session == IntPtr.Zero)
            {
                return;
            }

            ARKitEnableSizeEstimation(session);
        }
    }

    [DllImport("__Internal", EntryPoint = "ARKit_EnableImageSizeEstimation")]
    private static extern void ARKitEnableSizeEstimation(IntPtr sessionPtr);
}
// ARWorldTrackingNativeInterop.m
#import <ARKit/ARKit.h>

bool ARKit_IsiOS13OrLater() {
    if (@available(iOS 13, *)) {
        return true;
    }
    return false;
}

void ARKit_EnableImageSizeEstimation(void* sessionPtr) {
    if (@available(iOS 13, *)) {
        ARSession* session = (__bridge ARSession*)sessionPtr;
        ((ARWorldTrackingConfiguration *)session.configuration).automaticImageScaleEstimationEnabled = true;
    }
}
BANZBOWINKEL commented 1 year ago

Dear @camnewnham,

thanks a lot for sharing your workaround! Could you describe how exactly you set this up? I created the ARWorldTrackingNativeInterop.m script and put it in the Plugins/iOS folder, and also attached EnableSizeEstimation.cs to the ARSession, but my log says image scale estimation is still disabled. The configuration is an ARWorldTrackingConfiguration. Do you have any idea why this might not be working? Help would be greatly appreciated! Thanks and best!

[screenshot: EnableSizeEstimation]

[screenshot: ARWorldTrackingNativeInterop]

camnewnham commented 1 year ago

Hi @BANZBOWINKEL, that looks like it should work to me. Specifically, what I see in the logs is:

UnityARKit: Updating ARSession configuration with <ARWorldTrackingConfiguration: 0x281c0c200 worldAlignment=Gravity lightEstimation=Disabled frameSemantics=None videoFormat=<ARVideoFormat: 0x280035270 imageResolution=(1920, 1440) pixelFormat=(420f) framesPerSecond=(60) captureDeviceType=AVCaptureDeviceTypeBuiltInWideAngleCamera captureDevicePosition=(1)> autoFocus=Enabled environmentTexturing=None wantsHDREnvironmentTextures=Enabled planeDetection=Horizontal|Vertical collaboration=Disabled userFaceTracking=Disabled sceneReconstruction=None appClipCodeTracking=Disabled>

i.e. no mention of this feature at all. Can you post your log output?

The current code I am running is as follows:

// ARKitSizeEstimation.m
#import <ARKit/ARKit.h>

bool ARKit_IsiOS13OrLater() {
    if (@available(iOS 13, *)) {
        return true;
    }
    return false;
}

void ARKit_EnableImageSizeEstimation(void* sessionPtr) {
    if (@available(iOS 13, *)) {
        ARSession* session = (__bridge ARSession*)sessionPtr;
        @try {
            ((ARWorldTrackingConfiguration *)session.configuration).automaticImageScaleEstimationEnabled = true;
        }
        @catch (NSException *exception) {
            NSLog(@"%@", exception.reason);
        }
    }
}
// EnableImageSizeEstimationIOS.cs
using UnityEngine;
using UnityEngine.XR.ARFoundation;
#if UNITY_IOS
using System;
using System.Runtime.InteropServices;
using UnityEngine.XR.ARKit;
#endif

namespace MyApp
{
    /// <summary>
        /// Workaround to enable image size estimation on iOS, as this feature is not yet exposed by Unity's Apple ARKit XR Plugin.
    /// </summary>
    public class EnableImageSizeEstimationIOS : MonoBehaviour
    {
        [SerializeField] private ARSession session;

#if UNITY_IOS
        public struct NativePtrData
        {
            public int version;
            public IntPtr sessionPtr;
        }

        private void Update()
        {
            if (session.subsystem is ARKitSessionSubsystem subsystem)
            {
                // Make sure we have a native ptr
                if (subsystem.nativePtr == IntPtr.Zero)
                {
                    return;
                }

                // Get the session ptr from the native ptr data
                IntPtr ptr = Marshal.PtrToStructure<NativePtrData>(subsystem.nativePtr).sessionPtr;
                if (ptr == IntPtr.Zero)
                {
                    return;
                }

                ARKitEnableSizeEstimation(ptr);
            }
        }

        [DllImport("__Internal", EntryPoint = "ARKit_EnableImageSizeEstimation")]
        private static extern void ARKitEnableSizeEstimation(IntPtr sessionPtr);
#endif
    }
}

In this case, the setup is:

Otherwise, I'd note:

In my case, I use a runtime image library so the image in the library is just a placeholder. When I add an image to the library, I specify an arbitrary size which is not used in practice as size estimation takes over.

lookoutking commented 1 year ago

Hi @camnewnham,

I appreciate you sharing this workaround. I have utilized the code you provided to enable automaticImageScaleEstimationEnabled successfully. Additionally, I have employed the following code to automatically include the reference image in the mutable library, and have assigned its physical width. However, the size estimation function does not seem to be functioning as intended, as it is using the width assigned by ScheduleAddImageWithValidationJob.

I would appreciate it if you could offer any insights on what might be causing this issue. In addition, could you please let me know which version of ARFoundation you are using? Many thanks!

public async UniTask AddImageTargetToLibrary(Texture2D image, string name, float width)
{
    var library = _trackedImageManager.CreateRuntimeLibrary();
    _trackedImageManager.referenceLibrary = library;
    if (library is MutableRuntimeReferenceImageLibrary mutableLibrary)
    {
#if UNITY_ANDROID
        Debug.Log($"Android::Add Image Target: {name}");
        var jobState = mutableLibrary.ScheduleAddImageWithValidationJob(
            image,
            name,
            null);
#elif UNITY_IOS
        Debug.Log($"iOS::Add Image Target: {name}");
        var jobState = mutableLibrary.ScheduleAddImageWithValidationJob(
            image,
            name,
            1);
#endif
        await UniTask.WaitUntil(() => jobState.jobHandle.IsCompleted);
        Debug.Log($"CreateImageLibrary Success: {name}");
    }
}
camnewnham commented 1 year ago

> I would appreciate it if you could offer any insights on what might be causing this issue. In addition, could you please let me know which version of ARFoundation you are using? Many thanks!

I'm using ARFoundation 5.0.6 with Apple ARKit XR Plugin 5.0.6 in Unity 2022.1.3f1 and my testing has all been on an iPhone X with iOS 16.x

Your code for adding the image looks fine. Here is mine for context:

private IEnumerator BeginTrackingImage(Texture2D img, string imgName)
{
    while (ARSession.state != ARSessionState.SessionTracking)
    {
        yield return null;
    }

    if (!trackableManager.subsystem.running)
    {
        Debug.Log($"Starting {nameof(XRImageTrackingSubsystem)}");
        trackableManager.subsystem.Start();
        trackableManager.subsystem.requestedMaxNumberOfMovingImages = 4;
    }

    while (!trackableManager.subsystem.running)
    {
        yield return null;
    }

    if (trackableManager.referenceLibrary is MutableRuntimeReferenceImageLibrary lib)
    {
        // Library already exists and is mutable; 'lib' is bound by the pattern above.
    }
    else if (trackableManager.descriptor.supportsMutableLibrary)
    {
        lib = trackableManager.CreateRuntimeLibrary() as MutableRuntimeReferenceImageLibrary;
        trackableManager.referenceLibrary = lib;
        Debug.Log("Created mutable reference image library.");
    }
    else
    {
        Debug.LogError("Platform does not support runtime images");
        yield break;
    }

#if UNITY_IOS
    float? width = 0.2f;
#elif UNITY_ANDROID
    float? width = null;
#endif

    AddReferenceImageJobState job = (trackableManager.referenceLibrary as MutableRuntimeReferenceImageLibrary).ScheduleAddImageWithValidationJob(img, imgName, width);

    while (job.status == AddReferenceImageJobStatus.None || job.status == AddReferenceImageJobStatus.Pending)
    {
        yield return null;
    }

    Debug.Log($"Added image \"{imgName}\" to tracking library: {job.status}. Tracked images: {trackableManager.referenceLibrary.count}");
}
lookoutking commented 12 months ago

Hi @camnewnham,

After conducting a thorough investigation, I discovered that my automaticImageScaleEstimationEnabled is not correctly set to true, which is why the auto determination is not being triggered. I understand that it can be quite challenging to pinpoint the exact reason for this issue. If you have any insights or suggestions, please feel free to share them with me.

My setup is ARFoundation 5.0.6 with ARKit XR Plugin 5.0.6 in Unity 2021.3.19f1. I modified the code to log the value set for automaticImageScaleEstimationEnabled, which always indicates that it is false.

char* ARKit_EnableImageSizeEstimation(void* sessionPtr) {
    NSString *result = @"";
    if (@available(iOS 13, *)) {
        ARSession* session = (__bridge ARSession*)sessionPtr;
        @try {
            ((ARWorldTrackingConfiguration *)session.configuration).automaticImageScaleEstimationEnabled = true;
            result = [NSString stringWithFormat:@"%d", ((ARWorldTrackingConfiguration *)session.configuration).automaticImageScaleEstimationEnabled];
        }
        @catch (NSException *exception) {
            NSLog(@"%@", exception.reason);
            result =  exception.reason;
        }
    }
    return cStringCopy([result UTF8String]); // cStringCopy: user-defined helper that copies into a malloc'd buffer (e.g. via strdup)
}
camnewnham commented 12 months ago

> which always indicates that automaticImageScaleEstimationEnabled is false.

Hmm, that is strange. I'm inexperienced on the Apple side of this so I can't really say why that would be the case (sorry!). Are you calling ARKitEnableSizeEstimation(ptr) in the Update loop? Since the feature demands change, the session pointer can also change.
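To illustrate the point about the pointer changing (a hypothetical anti-pattern, not code from this thread; `_cachedSession` and the `Start()` caching are made up for the example, and `subsystem` stands in for the resolved ARKitSessionSubsystem):

```csharp
// Anti-pattern sketch: caching the session pointer once and reusing it.
// ARKit can swap the underlying session/configuration when requested features
// change, so a pointer read once in Start() may go stale. Re-reading
// subsystem.nativePtr every Update() avoids acting on a stale pointer.
private IntPtr _cachedSession;

private void Start()
{
    // BAD: this value may no longer be valid after the session reconfigures.
    _cachedSession = Marshal.PtrToStructure<NativePtrData>(subsystem.nativePtr).sessionPtr;
}
```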

lookoutking commented 12 months ago

> > which always indicates that automaticImageScaleEstimationEnabled is false.
>
> Hmm, that is strange. I'm inexperienced on the Apple side of this so I can't really say why that would be the case (sorry!). Are you calling ARKitEnableSizeEstimation(ptr) in the Update loop? Since the feature demands change, the session pointer can also change.

No worries, I'm not very experienced with iOS either, haha. Thanks for discussing it with me. I used the provided code to call ARKitEnableSizeEstimation(ptr) in the Update loop.

I'm curious, how did you find out that the automaticImageScaleEstimationEnabled property is effective when ARKit doesn't use widthInMeters to determine the size of the image target?

camnewnham commented 12 months ago

@lookoutking I'm glad you looked into this - I had definitely overlooked a critical element. My previous code did not work - my use case / app just made it difficult to tell that depth was wrong (because 2D content still appeared correct, just further away).

We need to call runWithConfiguration to update the session configuration.

Give this one a try:

#import <ARKit/ARKit.h>

char* ARKit_EnableImageSizeEstimation(void* sessionPtr) {
    NSString *result = @"";
    if (@available(iOS 13, *)) {
        ARSession* session = (__bridge ARSession*)sessionPtr;
        @try {
            ARWorldTrackingConfiguration * config = ((ARWorldTrackingConfiguration *)session.configuration);
            if (!config.automaticImageScaleEstimationEnabled)
            {
                config.automaticImageScaleEstimationEnabled = true;
                result = [NSString stringWithFormat:@"%d", config.automaticImageScaleEstimationEnabled];
                [session runWithConfiguration:config];
            }
            else {
                result = @"OK";
            }
        }
        @catch (NSException *exception) {
            NSLog(@"%@", exception.reason);
            result =  exception.reason;
        }
    }
    return strdup([result UTF8String]);
}

double ARKit_GetImageAnchorEstimatedScaleFactor(void* imageAnchorPtr) {
    if (@available(iOS 13, *)) {
        ARImageAnchor* anchor = (__bridge ARImageAnchor*)imageAnchorPtr;
        return anchor.estimatedScaleFactor;
    }
    return (double)1.0;
}

and


using UnityEngine;
using UnityEngine.XR.ARFoundation;
#if UNITY_IOS
using System;
using System.Runtime.InteropServices;
#endif

public class EnableSizeEstimation : MonoBehaviour
{
    [SerializeField] private ARSession session;

#if UNITY_IOS
    public struct NativePtrData
    {
        public int version;
        public IntPtr sessionPtr;
    }

    private void Update()
    {
        if (session.subsystem is UnityEngine.XR.ARKit.ARKitSessionSubsystem subsystem)
        {
            // Make sure we have a native ptr
            if (subsystem.nativePtr == IntPtr.Zero)
            {
                return;
            }

            // Get the session ptr from the native ptr data
            IntPtr ptr = Marshal.PtrToStructure<NativePtrData>(subsystem.nativePtr).sessionPtr;
            if (ptr == IntPtr.Zero)
            {
                return;
            }

            string result = ARKitEnableSizeEstimation(ptr);
            Debug.Log("Result: " + result);
        }
    }

    [DllImport("__Internal", EntryPoint = "ARKit_EnableImageSizeEstimation")]
    private static extern string ARKitEnableSizeEstimation(IntPtr sessionPtr);

    [DllImport("__Internal", EntryPoint = "ARKit_GetImageAnchorEstimatedScaleFactor")]
    private static extern double ARKitGetImageAnchorEstimatedScaleFactor(IntPtr imageAnchorPtr);

    public static float GetEstimatedSize(ARTrackedImage image)
    {
        // Read the ARImageAnchor pointer from the tracked image's native ptr data
        // (same { version, ptr } layout as NativePtrData; here the field holds the anchor pointer, not the session pointer)
        IntPtr ptr = Marshal.PtrToStructure<NativePtrData>(image.nativePtr).sessionPtr;
        if (ptr == IntPtr.Zero)
        {
            return 1;
        }

        return (float)ARKitGetImageAnchorEstimatedScaleFactor(ptr);
    }

#endif
}

And an example of using the result:

private void OnImageUpdated(object sender, ARTrackedImage e)
{
    Vector2 size = e.size;
#if UNITY_IOS
    size *= EnableSizeEstimation.GetEstimatedSize(e);
#endif

    visual.GetComponentInChildren<TMPro.TextMeshPro>().text = e.referenceImage.name;
    visual.transform.SetPositionAndRotation(e.transform.position, e.transform.rotation);
    visual.transform.localScale = new Vector3(size.x, size.x, size.x);
}
andyb-unity commented 7 months ago

Hi all,

I'm happy to share that Unity has revamped our XR roadmaps: https://unity.com/roadmap/unity-platform/arvr.

You can create new items on the roadmap by clicking on the AR Foundation tab, then scrolling down to Submit a New Idea. The new roadmap site is our official feature request tool, so I'm closing this issue accordingly.

Moving forward, please direct feature requests to the roadmap site.

As an aside, I also want to say that I think this is a good idea; we just haven't had the priority necessary to assign this work. The roadmap has a built-in tool for upvoting feature requests that is useful for our product team as they look to prioritize work across all AR Foundation-supported platforms.