Unity-Technologies / arfoundation-samples

Example content for Unity projects based on AR Foundation

Open source the ARKit3 portion so the community can help ARFoundation stay up to date #190

Closed yosun closed 3 years ago

yosun commented 4 years ago

Hi, the iOS 13 public beta - with ARKit 3 - has been out to the general public for several weeks now. However, ARFoundation has still not been updated to support many basic ARKit 3 features - from 3D human segmentation to (by some accounts) even basic plane tracking.

Instead of keeping it closed source behind DLLs, is it possible to open source the ARKit portion so that the community can help push these fixes?

sam598 commented 4 years ago

Unity did a great job of updating ARFoundation with human segmentation and depth estimation the same week it was released by Apple.

With that said, I do strongly agree that the iOS portion of ARFoundation should be open source. The original ARKit for Unity plugin on Bitbucket was great because it was a straightforward interface for developers, but it could also be extended and fine-tuned as needed. With the closed-source library we have to wait on the ARFoundation release cycle, and some more niche features, like accessing the TrueDepth camera depth map, may never become available.

mdurand42 commented 4 years ago

@yosun We released a sample demonstrating 3D human segmentation 15 days ago. It was supported in AR Foundation 3 days after it was announced at WWDC.

https://github.com/Unity-Technologies/arfoundation-samples/tree/master/Assets/Scenes/HumanSegmentation

Please elaborate on how exactly we are failing to support plane tracking. Please see the sample that demonstrates the plane tracking available since ARKit 1.0. https://github.com/Unity-Technologies/arfoundation-samples/tree/master/Assets/Scenes/Plane%20Detection

-Thanks, Mike

yosun commented 4 years ago

@yosun We released a sample demonstrating 3D human segmentation 15 days ago. It was supported in AR Foundation 3 days after it was announced at WWDC.

https://github.com/Unity-Technologies/arfoundation-samples/tree/master/Assets/Scenes/HumanSegmentation

Please elaborate on how exactly we are failing to support plane tracking. Please see the sample that demonstrates the plane tracking available since ARKit 1.0. https://github.com/Unity-Technologies/arfoundation-samples/tree/master/Assets/Scenes/Plane%20Detection

-Thanks, Mike

Hi Mike, mainly we are concerned with 3D human segmentation not working in the iOS 13 public beta. The issues stem from iOS beta changes made after WWDC, which ARFoundation has not supported for almost a month now. (That is a very significant gap in a beta timeline.)

@mdurand42 Please see this thread regarding Human Segmentation 3D issues for details - while it worked in beta 1, it doesn't work in beta 2 or the public beta. https://github.com/Unity-Technologies/arfoundation-samples/issues/170#issuecomment-504813112

Several in the thread have offered to help keep these fixes up to date, but since ARFoundation is not open source like the ARKit plugin was in the past, we have to keep waiting for you guys. (And that Human Segmentation 3D thread keeps getting longer.)

Hope you can fix the Human Segmentation 3D issue ASAP, and please consider open sourcing the plugin, or otherwise allowing the community to help you fix these things!

gpt3-bot commented 4 years ago

It might have worked with Beta 1, but it doesn't work with Beta 3.

The issue is that a number of us are polyglot programmers who have been doing every possible AR thing since long before Vuforia was a thing and cross-platform AR was easy. We love everything Unity is doing to make our lives easier, but when we have to sit around and wait for one guy on one team to fix one of hundreds of bugs so that we can even use the latest stuff with the latest stuff at all, and those fixes don't come for weeks or months while projects sit collecting dust, it's basically the worst and most frustrating experience.

It's cute that the ARFoundation team is working hard on this, but there are a LOT of us whose entire life is AR, and who find stupid bugs in unnecessary DLLs extremely frustrating, to say the least.

ARFoundation has been essentially unused and useless for almost two years now because it has never simultaneously been at feature parity and offered a stable working environment. Please just open source it already.

yosun commented 4 years ago

It might have worked with Beta 1, but it doesn't work with Beta 3.

The issue is that a number of us are polyglot programmers who have been doing every possible AR thing since long before Vuforia was a thing and cross-platform AR was easy. We love everything Unity is doing to make our lives easier, but when we have to sit around and wait for one guy on one team to fix one of hundreds of bugs so that we can even use the latest stuff with the latest stuff at all, and those fixes don't come for weeks or months while projects sit collecting dust, it's basically the worst and most frustrating experience.

It's cute that the ARFoundation team is working hard on this, but there are a LOT of us whose entire life is AR, and who find stupid bugs in unnecessary DLLs extremely frustrating, to say the least.

ARFoundation has been essentially unused and useless for almost two years now because it has never simultaneously been at feature parity and offered a stable working environment. Please just open source it already.

I hope that there's more than one person committing to ARFoundation. If it is just @Jimmy Alamparambil again... Previously, it seems the community was able to help the ARKit Unity plugin quite a bit when it was open source on Bitbucket. So why not let us help again?

n.b. I believe @shawmakesmusic meant ARKit (formerly Metaio SLAM 2013) and not Vuforia (QCAR/Qualcomm Augmented Reality 2009).

gpt3-bot commented 4 years ago

Well, Vuforia was the first SDK that mere mortals could get our hands on that made cross-platform development in Unity painless (and I think they released v1 in 2012?) -- definitely not making the claim that Vuforia was the first AR SDK by a long shot. (And I'm sure there will be a line drawn between keypoint matching/homography and SLAM, but for now it's all AR-ish :P )

Point being, I literally get paid to research and argue and glue stuff like this together. I'd be more than happy to put significant hours into pull requests if ARFoundation weren't a bunch of managed code unnecessarily wrapped up in DLLs. Even if it were just mirrored on Git... I mean, what are we going to do with that code other than improve it? It's literally not useful in any ecosystem outside of Unity :P

yosun commented 4 years ago

@mdurand42 Human Segmentation 3D in the ARFoundation implementation still does not work on iOS 13 beta release 3. And: This repo hasn't been updated for a month now...

Please consider open sourcing ARFoundation so that the community can help you guys keep things up to date!

@shawmakesmusic Although perhaps it would be easier just to write a new custom plugin for a better implementation of ARKit in Unity, with proper drag-and-drop prefabs, Vuforia-style.

lyaunzbe commented 4 years ago

As far as I'm aware, there's also no easy way to downgrade to Beta 1 if you've already enrolled in the beta program, so the plugin becomes pretty much unusable :(

yosun commented 4 years ago

Haven't heard back - for those w

@yosun We released a sample demonstrating 3D human segmentation 15 days ago. It was supported in AR Foundation 3 days after it was announced at WWDC.

https://github.com/Unity-Technologies/arfoundation-samples/tree/master/Assets/Scenes/HumanSegmentation

Please elaborate on how exactly we are failing to support plane tracking. Please see the sample that demonstrates the plane tracking available since ARKit 1.0. https://github.com/Unity-Technologies/arfoundation-samples/tree/master/Assets/Scenes/Plane%20Detection

-Thanks, Mike

Hi Mike, we appreciate that you had support right right after WWDC - I don't mean to criticize your work. However, many iOS 13 beta releases have come out since then, as is the way with betas... It must be a hassle to keep up with all this, and we want to help! Hope that's a better perspective from which to consider things!

(Mainly, I just want to see ARFoundation's ARKit 3 Human Segmentation 3D support kept up to date with the many fixes Apple is releasing in the iOS 13 betas this summer. It has been over a month now since it broke on the current iOS 13 beta...)

jBachalo commented 4 years ago

I 100% agree with the open source suggestion. Not only body tracking, but depth and occlusion masking are absolute game changers in the world of AR, and ARFoundation needs to be at parity with ARKit 3 in supporting these features. I hope this critical feedback is taken in a positive and constructive fashion. Many developers, myself included, are over-the-top passionate about both Unity and AR and desperately want to showcase what is possible!

tdmowrer commented 4 years ago

ARKit XR Plugin version 2.2.0-preview.2 was released yesterday and supports iOS 13 beta 3.

2.2.0-preview.3 will be available shortly and will support iOS 13 beta 4.

sam598 commented 4 years ago

Thank you for updating the plugin. However the core issue remains. Is there a technical reason or limitation why the ARKit library cannot be open source?

If the only issue for the past month was that it needs to be built with the latest version of Xcode, what happens when Beta 5 is released in a few weeks? This can be easily avoided by open sourcing the library.

There are also additional ARKit 3 features that have not been implemented yet:

These are all very niche features that are not cross-platform, so they may never be implemented in ARFoundation. Open sourcing the ARKit library in ARFoundation would allow developers to create their own branches when necessary.

tdmowrer commented 4 years ago

Thank you for updating the plugin. However the core issue remains. Is there a technical reason or limitation why the ARKit library cannot be open source?

I don't believe there is a technical reason. We are discussing this possibility.

If the only issue for the past month was that it needs to be built with the latest version of Xcode...

That was not the only issue. Beta 3 contained breaking changes and required that we add additional APIs to support it.

...what happens when Beta 5 is released in a few weeks?

Betas 2 & 3 were released while most of us were out for a company-wide, weeklong event which was followed immediately by US Independence Day, so we were effectively gone for a couple weeks. The timing was rather unfortunate, but now that we are back, future updates should be much closer to the beta releases from Apple. I think we had beta 4 support the day after Apple released it, for example.

There are also additional ARKit 3 features that have not been implemented yet...These are all very niche features, that are not cross platform, so they may never be implemented in ARFoundation.

For platform-specific features, we still don't necessarily add them to ARFoundation but we often expose them in the platform-specific packages. For instance, we support world maps and collaborative sessions in the ARKit XR Plugin. The features you mentioned are on our backlog, but if you really wanted to get going yourself you could get some of those features from the native pointers exposed on the ARSession and other ARFoundation components.
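For readers wondering what "native pointers" means in practice, here is a minimal sketch of handing the session's native pointer to a custom native plugin. The struct layout behind `nativePtr` and the `MyPlugin_SetARSession` entry point are assumptions for illustration only; check the ARKit XR Plugin documentation for the layout your installed version actually guarantees.

```csharp
using System;
using System.Runtime.InteropServices;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class NativeSessionBridge : MonoBehaviour
{
#if UNITY_IOS && !UNITY_EDITOR
    // Hypothetical function, implemented in your own .m/.mm file
    // added to the generated Xcode project.
    [DllImport("__Internal")]
    static extern void MyPlugin_SetARSession(IntPtr nativeSessionStruct);
#endif

    [SerializeField] ARSession m_Session;

    void Start()
    {
#if UNITY_IOS && !UNITY_EDITOR
        if (m_Session.subsystem != null)
        {
            // nativePtr points to a plugin-defined struct through which the
            // ARKit ARSession* is reachable (layout is version-specific).
            MyPlugin_SetARSession(m_Session.subsystem.nativePtr);
        }
#endif
    }
}
```

On the native side, the plugin would read the ARSession pointer out of that struct and call whatever ARKit API Unity does not yet surface.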

sam598 commented 4 years ago

Thanks Tim, I have no doubt the Unity team is working to get these brand new features out as fast as they can. I'm very glad to hear that open sourcing the plugin is under consideration.

gpt3-bot commented 4 years ago

Tim,

Your transparency here is much appreciated.

Ya'll are doing good work, but it's a lot of work and the ARF ecosystem still needs a lot of features to thrive (Remote, abstracted spatial anchors, etc) -- if the package is open sourced the community can help by fixing bugs with pull requests so that you guys can stay focused on the new and fun stuff and not have to drop everything every time Apple drops a new beta :)

Best,

Shaw

On Jul 22, 2019, at 9:09 PM, Tim Mowrer notifications@github.com wrote:

Thank you for updating the plugin. However the core issue remains. Is there a technical reason or limitation why the ARKit library cannot be open source?

I don't believe there is a technical reason. We are discussing this possibility.

If the only issue for the past month was that it needs to be built with the latest version of XCode...

That was not the only issue. Beta 3 contained breaking changes and required that we add additional APIs to support it.

...what happens when Beta 5 is released in a few weeks?

Betas 2 & 3 were released while most of us were out for a company-wide, weeklong event which was followed immediately by US Independence Day, so we were effectively gone for a couple weeks. The timing was rather unfortunate, but now that we are back, future updates should be much closer to the beta releases from Apple. I think we had beta 4 support the day after Apple released it, for example.

There are also additional ARKit 3 features that have not been implemented yet...These are all very niche features, that are not cross platform, so they may never be implemented in ARFoundation.

For platform-specific features, we still don't necessarily add them to ARFoundation but we often expose them in the platform-specific packages. For instance, we support world maps and collaborative sessions in the ARKit XR Plugin. The features you mentioned are on our backlog, but if you really wanted to get going yourself you could get some of those features from the native pointers exposed on the ARSession and other ARFoundation components.

— You are receiving this because you were mentioned. Reply to this email directly, view it on GitHub, or mute the thread.

thibautvdu commented 4 years ago

+1 for the need to open source it. Because of this, I'm seriously considering migrating the deprecated Unity ARKit plugin to ARKit 3 for my current project. It's research-oriented and we need to get our hands dirty with the full ARKit 3 API, so this closed-source library doesn't give us enough confidence or guarantees.

Blackclaws commented 4 years ago

Any news on this? I don't really see what is gained on Unity's side by making the ARKit integration closed source.

gpt3-bot commented 4 years ago

They'd rather we all migrate to WebXR now that HitTest API is coming out so they don't have to keep supporting it.

On Fri, May 29, 2020 at 4:08 AM Blackclaws notifications@github.com wrote:

Any news on this? I don't really see what is gained on Unity's side by making the ARKit integration closed source.


gpt3-bot commented 4 years ago

(This is a joke, but it's the effective reality of putting 2 engineers on the most important emerging technology in the whole space.)

On Fri, May 29, 2020 at 4:08 AM Blackclaws notifications@github.com wrote:

Any news on this? I don't really see what is gained on Unity's side by making the ARKit integration closed source.


StephenHodgson commented 4 years ago

I'm also not a fan of how many of the XR plugins are going closed source (not to mention the XR Plugin architecture is a bit clunky and cumbersome), so we're doing everything out in the open with the Mixed Reality Toolkit.

We're also trying to solve other big problems in Unity as well, such as breaking away from MonoBehaviours except for UI/UX components, keeping things as plain old C# objects (POCO) as much as possible, and ensuring that people can implement their own versions of common classes via interfaces. https://medium.com/@stephen_hodgson/the-mixed-reality-framework-6fdb5c11feb2
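As a rough illustration of the interface-first, POCO-style approach described above (the names here are invented for the example, not MRTK APIs):

```csharp
// A plain C# service contract: no MonoBehaviour, no engine lifecycle.
public interface IAnchorService
{
    void CreateAnchor(UnityEngine.Vector3 position);
}

// One interchangeable implementation. A platform-specific version
// (ARKit, ARCore, ...) could be swapped in behind the same interface
// without touching the code that consumes IAnchorService.
public class LoggingAnchorService : IAnchorService
{
    public void CreateAnchor(UnityEngine.Vector3 position)
    {
        UnityEngine.Debug.Log($"Anchor requested at {position}");
    }
}
```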

We haven't started our initial work on the ARCore/ARKit projects, but that's high on our priority list.

I'm generally frustrated that something I could fix in 10 min usually takes 3-4 weeks while I wait for Unity to update. If that's something people in this thread would like to help with, we're ready to start forging in this direction as needed.

orangeagain commented 3 years ago

Any news on this? I don't really see what is gained on Unity's side by making the ARKit integration closed source.

Unity can earn money, like Microsoft.

stale[bot] commented 3 years ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

sam598 commented 3 years ago

Bumping as the issue is still open.

mdurand42 commented 3 years ago

Is there a need for community access to the native source code of ARKit at the moment? Respectfully, we have been and are continuing to be very responsive to the addition of ARKit features so we feel as though the community is being well supported even without an open-sourced ARKit plugin. Please note that none of the other platform plugins used with AR Foundation (ARCore, WindowsMR, Magic Leap) is open-sourced.

The issue with iOS beta updates sometimes breaking our support for new ARKit features is definitely real. But we have been pleased to see that Apple has maintained backward compatibility during the iOS 13 beta cycle. Our changelog always includes the version of Xcode we used to compile the plugin so that developers understand which version of iOS they should be using. We will continue to try to respond to any breaking changes with updated versions of the plugin.

Thanks very much for your input, and we hope you can understand our point of view on this matter.

-Mike

gpt3-bot commented 3 years ago

Everyone is on this thread because they tried solving their own problem and ran into a wall of DLL they couldn't break through. I think if you FOSS it you won't reduce your issue backlog to 0, but a lot of the more self-help types won't need to bother you with their problem, and it could be good for morale, community and reducing the time you guys need to support us.

On Tue, Sep 15, 2020 at 7:45 PM Mike Durand notifications@github.com wrote:

Is there a need for community access to the native source code of ARKit at the moment? Respectfully, we have been and are continuing to be very responsive to the addition of ARKit features so we feel as though the community is being well supported even without an open-sourced ARKit plugin. Please note that none of the other platform plugins used with AR Foundation (ARCore, WindowsMR, Magic Leap) is open-sourced.

The issue with iOS beta updates sometimes breaking our support for new ARKit features is definitely real. But we have been pleased to see that Apple has maintained backward compatibility during the iOS 13 beta cycle. Our changelog always includes the version of Xcode we used to compile the plugin so that developers understand which version of iOS they should be using. We will continue to try to respond to any breaking changes with updated versions of the plugin.

Thanks very much for your input, and we hope you can understand our point of view on this matter.

-Mike


tdmowrer commented 3 years ago

The origin of this request had to do with a rapidly changing Apple beta cycle that we (Unity) had trouble keeping up with due to US holidays around early July last year. I would like to better understand the current need for this.

ran into a wall of DLL they couldn't break through

What, exactly, couldn't you do? Maybe there is an in-between option that would help without full source. I would like to understand the types of problems people are running into that an open source plugin would solve.

What are your expectations around native source changes? If open sourced, does the native code now become part of the public API? Does a refactor of native source on our part become a breaking change? Presumably, if you are making modifications at that level, you are already straying outside of what is officially supported, but what are your expectations here?

Blackclaws commented 3 years ago

I think there are some use cases where having access at that level would simplify matters. One example (and this isn't ARKit-specific but ARCore instead): according to the ARCore specification there is a way to set up the image streams so that there is a still-image channel together with a video stream. I did not find a way to actually set up the image streams through the ARCore interface exposed within Unity.

In general, it would allow us to debug deeper into the plugins themselves to see whether an issue lies inside them or outside.

For example I'm currently thinking about opening an issue for the ARCore plugin because feature points keep disappearing there when the device is not in motion. I'm not sure whether this is a Unity Plugin issue or whether ARCore itself is at fault.

My first instinct would be to go into the source code to see what exactly is being done, however I cannot as I do not have access to the source code beyond the C# part.

Let me ask the other way around, what are you losing by open sourcing these plugins that are only designed to translate ARCore/ARKit to your own custom ARFoundation C# code?

The code itself could still very well be under the Unity Companion License (though a more open license would certainly be preferable), in which case you don't even risk anything there.

Is there any sort of secret-sauce postprocessing happening within those libraries that makes you reluctant to open source them?

thibautvdu commented 3 years ago

@tdmowrer

I personally see two sides to this issue:

gpt3-bot commented 3 years ago

Just now I copied and pasted a solution from 2017 to fix the TextMeshPro UI shader to work with the new HoloLens 2. That is, a three-year-old bug that has supposedly been fixed is still there, and the solution is for me to copy the shader built into the package and rewrite it with single-pass stereo instancing. It's a great example of where I can fix my own problem because the shader code is available and overridable. I have to do this kind of thing all the time, but many times when I run into a wall I have to explain to my employer that I can't fix it because Unity is a closed-source product and that's just how it is until the next version. I mean, how many years was there no cloud anchor or spatial anchor support in ARFoundation?

Here's a link to that bug if you don't believe me ;) https://forum.unity.com/threads/hololens-text-only-visible-for-one-eye.493673/

On Wed, Sep 16, 2020 at 12:33 AM Thibaut Dumont notifications@github.com wrote:

@tdmowrer https://github.com/tdmowrer

I personally see two sides to this issue :

  • First, the actual walls we can run into. I hit this twice in the past two years, the one I remember being #535 https://github.com/Unity-Technologies/arfoundation-samples/issues/535, which would have been quickly solved on my side by removing or tweaking the configuration enforcement we were discussing. I ended up making an editor script that retrieves the .arobjects at build time, and my client has to rebuild the app every time they add an object to their environment, which is only acceptable because they are an AI research lab and thus power users.
  • Secondly, the confidence an open-source solution gives: knowing that, whatever happens, we will actually be able to achieve everything we could achieve with ARKit directly. When my last client approached me, they were looking for a Unity + ARKit solution and I wasn't 100% confident using ARFoundation. The project's requirements for the AR demo were expected to evolve during development and after the arrival of the LiDAR iPad.


sam598 commented 3 years ago

@tdmowrer

My understanding is that these ARKit features are currently missing from ARFoundation:

Some of these features may be accessible through the native session API, but there is very little documentation on how that works. My assumption is that in most cases it would be much easier to build upon the existing structure of the plugin than to try to rebuild it externally.

To your concern about breaking changes, I've always seen this working the same way that the original ARKit for Unity plugin worked. The code for the library is available for modification, but the vast majority of developers only ever interface with the Unity plugin API.

If a developer is intrepid enough to make changes to the native code, they do so with the understanding that they are "creating their own branch" and that a future update can and will break their modifications. One of the best things about ARFoundation is that it provides a single wrapper API for so many AR APIs. Developers who need to customize that do so at their own risk.

tdmowrer commented 3 years ago

Great answers; thanks for the discussion points everyone. My responses are below. I noticed a couple of feature requests that I have not seen before -- if these are important to you, please create a feature request issue.

@Blackclaws

One example (and this isn't ARKit-specific but ARCore instead): according to the ARCore specification there is a way to set up the image streams so that there is a still-image channel together with a video stream. I did not find a way to actually set up the image streams through the ARCore interface exposed within Unity.

I wasn't aware of that. Can you link me to the documentation for this? Maybe it's something that we could add.

This thread was originally about open sourcing the ARKit plugin, which is only even possible because of the way Unity builds for iOS -- that is, it creates an Xcode project that you then compile. Raw Objective-C files in the project (or a package) would automatically become part of that build.
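To make that concrete: because the iOS build output is an Xcode project, a raw `.m`/`.mm` file dropped into the project (or shipped inside a package) is compiled along with everything else, and C# binds to it with `DllImport("__Internal")`. The function name below is hypothetical, purely to show the shape of such a binding:

```csharp
using System.Runtime.InteropServices;

public static class ExampleNativeBinding
{
#if UNITY_IOS && !UNITY_EDITOR
    // Matches an extern "C" function defined in an Objective-C source
    // file that Unity copies into the generated Xcode project.
    [DllImport("__Internal")]
    public static extern int Example_GetTrackingQuality();
#else
    // Stub so the same C# code compiles in the editor and on other platforms.
    public static int Example_GetTrackingQuality() => 0;
#endif
}
```

This is why open sourcing is comparatively simple on iOS: the native sources would just ride along in the normal build, with no extra tooling from Unity.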

However, the situation is quite different for ARCore, and we would have to expose parts of our build system to help you produce the final binary. Not impossible, but a very different ask. This is also the first time I have heard any request to open source the ARCore plugin.

In general it would allow us to debug deeper into the plugins themselves to see whether an issues lies within there or without.

For example I'm currently thinking about opening an issue for the ARCore plugin because feature points keep disappearing there when the device is not in motion. I'm not sure whether this is a Unity Plugin issue or whether ARCore itself is at fault.

That is most certainly an ARCore issue, but that is also why we have this issues page -- please do not hesitate to open issues for questions like that.

Let me ask the other way around, what are you losing by open sourcing these plugins...Is there any sort of secret-sauce postprocessing happening within those libraries that makes you reluctant to open source them?

I am not the right person to answer that (I'm "just an engineer"), so while your question is valid, my goal is to understand the technical needs and expectations.

@thibautvdu

I had the occurrence two times those past two years, the one I remind being #535 that would have been quickly solved on my side by removing or tweaking this configuration enforcement we were discussing about.

That's a good example, thank you. That's also something we should probably find a way to support properly (without requiring source).

When my last client approached me, they were looking for a Unity + ARKit solution and I wasn't 100% confident using ARFoundation.

Good to know, too; thanks.

@shawmakesmusic

I appreciate your comments, but you are making general arguments about open source. I'm trying to understand what problems are solved by open sourcing the ARKit XR Plugin in particular.

...how many years was there no cloud anchor or spatial anchor support...

ARKit only added geo anchors in the iOS 14 beta. Is this what you mean? We've had world map support for a while.

@sam598

My understanding is that these ARKit features are currently missing from ARFoundation:

  • Front facing "True Depth" depth map

This would make a great feature request. Added to our backlog.

  • AR Geo Anchors

Fair point; we are still investigating this one.

  • 60fps environment lighting

I'm not sure what you mean by this. Can you elaborate/point me to some docs?

  • smoothedSceneDepth depth maps

This is still a beta feature, but we do a fair amount of smoothing on the depth image already. I'll look into this one too.

If a developer is intrepid enough to make changes to the native code, they do so with the understanding that they are "creating their own branch" and that a future update can and will break their modifications. One of the best things about ARFoundation is that it provides a single wrapper API for so many AR APIs. Developers who need to customize that do so at their own risk.

I'm glad we agree on this point, but my guess is that many people would not. In general, anytime an API is publicly accessible, the expectation is that it will be supported and breaking changes will generally cause headaches. As an example, many Unity APIs have historically been marked as "experimental", "preview", or put in an "internal" namespace, but as long as they are publicly accessible, people will build on them and not be happy when they change unexpectedly.

sam598 commented 3 years ago

Thanks @tdmowrer for the detailed responses.

In general, anytime an API is publicly accessible, the expectation is that it will be supported and breaking changes will generally cause headaches. As an example, many Unity APIs have historically been marked as "experimental", "preview", or put in an "internal" namespace, but as long as they are publicly accessible, people will build on them and not be happy when they change unexpectedly.

I understand the concern here, but what I would say is that a large portion of the C# source code for the AR Foundation Package is already open source and can be modified.

Also, isn't this the entire purpose behind the package system? My understanding is that a package is treated like an API for a Unity project, but the source code for that package can be upgraded, or brought into the actual project and modified as needed. All we are asking for is that the raw Objective-C files be included in that package.

we do a fair amount of smoothing on the depth image already

Is AR Foundation doing additional processing on the depth images provided by the underlying API? If so this could be very undesirable in some cases. Could you provide more information on what depth images are coming in "raw" and which have additional processing on them?

60fps environment lighting

I'm not sure what you mean by this. Can you elaborate/point me to some docs?

Sorry for the confusion, apparently it is not officially documented. Some people have been posting demos from the lidar equipped iPad Pros showing environment probes updating at 60fps. If I can find more info or documentation I will post it in a feature request. https://twitter.com/danmonaghanz/status/1303461402737127424

tdmowrer commented 3 years ago

Also, isn't this the entire purpose behind the package system? My understanding is that a package is treated like an API for a Unity project, but the source code for that package can be upgraded, or brought into the actual project and modified as needed. All we are asking is that the raw Objective-C files be included in that package.

Semantic versioning in packages is meant to solve this problem, yes. My concern is that the native source now becomes a public API surface subject to semantic versioning (and our tooling doesn't look for native source changes, so it would be easy for us to make a breaking change accidentally).

If you change the package source directly, then you're on your own, sure. I'm thinking of a scenario where someone, say, puts an asset on the asset store that calls one of the Objective-C functions in a particular version of the ARKit package. If that happens, then we either break that asset whenever we change our native code OR breaking changes to native code require a major version bump, and we have no automated way to catch that.
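For context, the Package Manager resolves these version contracts from the project manifest; a project (or an asset's installation instructions) might pin a specific plugin version like this (the version number is illustrative):

```json
{
  "dependencies": {
    "com.unity.xr.arkit": "4.1.7"
  }
}
```

Under semantic versioning, a breaking change to anything treated as public in that package would require a major version bump, which is exactly the contract that shipping native source would extend to the Objective-C layer.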

Is AR Foundation doing additional processing on the depth images provided by the underlying API? If so, this could be very undesirable in some cases. Could you provide more information on which depth images come in "raw" and which have additional processing applied?

Yes, and it is configurable. It's what these EnvironmentDepthModes are for (settable on the AROcclusionManager). You can disable it entirely if you want.
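For example, a minimal sketch of the configuration (AR Foundation 4.x API names; check them against your installed version):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Sketch: choose how much processing Unity applies to environment depth.
public class DepthModeConfig : MonoBehaviour
{
    [SerializeField] AROcclusionManager occlusionManager;

    void Start()
    {
        // Fastest applies the least additional processing to the depth image,
        // Best applies the most smoothing, and Disabled turns environment
        // depth off entirely.
        occlusionManager.requestedEnvironmentDepthMode = EnvironmentDepthMode.Fastest;
    }
}
```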

sam598 commented 3 years ago

I'm thinking of a scenario where someone, say, puts an asset on the asset store that calls one of the Objective-C functions in a particular version of the ARKit package. If that happens, then we either break that asset whenever we change our native code OR breaking changes to native code require a major version bump, and we have no automated way to catch that.

I might be misunderstanding something here. Do the existing Objective-C functions become "more public" if the source code is included? Doesn't the ARKit XR Plugin package already make calls to those Objective-C functions? Or is the package/plugin structured in a way that those functions are all private, and including the library source would make it public?

Yes, and it is configurable. It's what these EnvironmentDepthModes are for (settable on the AROcclusionManager). You can disable it entirely if you want.

The documentation doesn't explain how the depth maps are being modified by Unity. Depth map quality can degrade very quickly when smoothing is applied, especially if multiple passes are done by both iOS and Unity. Not having that information clearly documented is a big concern for use cases involving custom mesh reconstruction and custom occlusion/background-removal post-processing.

Thanks again, really appreciate the responses and feedback.

tdmowrer commented 3 years ago

Do the existing Objective-C functions become "more public" if the source code is included?

Technically speaking: no. Anyone can run objdump or whatever to see all the public methods in any binary. But if source is included, then yes, I think the contract changes. As a consumer of this API, I would expect those methods to be subject to semantic versioning.

The documentation doesn't explain how the depth maps are being modified by Unity. Depth map quality can degrade very quickly when smoothing is applied, especially if multiple passes are done by both iOS and Unity. Not having that information clearly documented is a big concern for use cases involving custom mesh reconstruction and custom occlusion/background-removal post-processing.

I think this is a bit off topic for an "open source the ARKit plugin" discussion. If you are interested in the specifics, please create a separate issue.

stale[bot] commented 3 years ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

orangeagain commented 3 years ago

They can scan a high-quality mesh and texture, but Unity can't. ARMeshManager density and colors do not work. This will put Unity behind the AR competition. Will you strengthen mesh quality and texturing? https://youtu.be/YizMalWigkI

tdmowrer commented 3 years ago

The ARMeshManager is platform agnostic and includes some options that ARKit does not support, such as colors. This is not a missing implementation or something that could be solved by open sourcing the plugin.

The Unity mesh generated by the mesh manager is exactly what ARKit gives us. Can you be more specific as to what you believe the differences are?

orangeagain commented 3 years ago

The ARMeshManager is platform agnostic and includes some options that ARKit does not support, such as colors. This is not a missing implementation or something that could be solved by open sourcing the plugin.

The Unity mesh generated by the mesh manager is exactly what ARKit gives us. Can you be more specific as to what you believe the differences are?

https://youtu.be/YizMalWigkI Their app shows a very high-quality mesh and texture. Do they use low-level ARKit APIs to implement it? I don't know, but I found a way to generate a texture: use https://github.com/kitbashery/xatlas-for-Unity to create UVs, then map the camera photos onto the texture.

orangeagain commented 3 years ago

A "get mesh with texture" function should be a basic API for AR. If AR Foundation implements it, you will be leading the world.

tdmowrer commented 3 years ago

That is an impressive-looking demo; I do not know what they used to build it, but xatlas seems to be unrelated to ARKit:

xatlas is a small C++11 library with no external dependencies that generates unique texture coordinates suitable for baking lightmaps or texture painting.

A "get mesh with texture" function should be a basic API for AR. If AR Foundation implements it, you will be leading the world.

ARKit does not provide texture coordinates for their meshes (correct me if I am wrong). You seem to be asking for Unity to provide additional functionality to what ARKit provides natively. While I appreciate the suggestion, the topic of this thread is open sourcing the ARKit plugin, and I do not see how doing so would make a difference.

stale[bot] commented 3 years ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

sam598 commented 3 years ago

Technically speaking: no. Anyone can run objdump or whatever to see all the public methods in any binary. But if source is included, then yes, I think the contract changes. As a consumer of this API, I would expect those methods to be subject to semantic versioning.

If that is the case, I am still not sure how including Objective-C code in the package would be different from the open C# code already included in the package. To use your scenario, someone can already make a custom version of AR Foundation based on a specific version and put it on the asset store. Isn't it the responsibility of that developer to keep their asset up to date?

our tooling doesn't look for native source changes, so it would be easy for us to make a breaking change accidentally.

If including the native source in the package makes development harder for Unity, is there a world where the native source for the library is a separate undocumented "use at your own risk" repository?

stale[bot] commented 3 years ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

sam598 commented 3 years ago

See https://github.com/Unity-Technologies/arfoundation-samples/issues/627

tdmowrer commented 3 years ago

We really appreciate all the feedback and discussion around this. We do not plan to provide the native source code for either the ARCore or ARKit plugins at this time, so I will close this issue.