needle-tools / ar-simulation

AR Simulation for Unity • Right in the Editor • Minimally Invasive

AR Simulation Cover Image

AR Simulation

Build AR apps with confidence.
Iterate fast, right in Editor.
Non-invasive, drop-in solution.
Fair pricing.

Quick Start ⚡ • License & Pricing 💸 • Documentation 📜 • Troubleshooting ☂️

Technical Details 🔎 • Comparison to MARS 🚀 • Related Solutions 👪 • Say hi ✍️

What is this?

This package allows you to fly around in the Editor and test your AR app, without having to change any code or structure. Iterate faster, test out more ideas, build better apps.
ARSimulation is a custom XR backend, built on top of the XR plugin architecture.

Zero Setup
This scene only uses ARFoundation features.

Because it's just another XR Plugin, and because we took great care to simulate the important features, it works with your existing app, ARFoundation, and the XR Interaction Toolkit, with zero changes to your code or setup needed! ✨
And if you need more control, there are plenty of knobs to turn.
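To make the "zero changes" claim concrete, here's a minimal, hypothetical tap-to-place script written against plain ARFoundation APIs (it requires the Unity runtime and is a sketch, not part of this package; the `placedPrefab` field is our own invention). Nothing in it references AR Simulation, which is exactly the point: the same script runs on device and, with AR Simulation enabled, in the Editor.

```csharp
// Hypothetical tap-to-place sketch using only standard ARFoundation APIs.
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

[RequireComponent(typeof(ARRaycastManager))]
public class TapToPlace : MonoBehaviour
{
    // Prefab to spawn on tapped planes (assumed to be assigned in the Inspector).
    public GameObject placedPrefab;

    ARRaycastManager raycastManager;
    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Awake() => raycastManager = GetComponent<ARRaycastManager>();

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Raycast against detected planes; in the Editor this hits
        // simulated planes through the same subsystem interface.
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            Pose hitPose = hits[0].pose;
            Instantiate(placedPrefab, hitPose.position, hitPose.rotation);
        }
    }
}
```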

Quick Start ⚑

Slow Start 🐒

Here's a step-by-step walkthrough (it's worth the time!). We'll use AR Simulation with the arfoundation-samples project, which gives a good overview of what AR Simulation can and can't do.

**First, we'll download the samples and take a look at what we got.**

- Clone or download Unity's [arfoundation-samples project from GitHub](https://github.com/Unity-Technologies/arfoundation-samples).
- Open the project (at the time of writing it uses Unity 2019.4.1f1, but it should work with any 2019.3+).
- Open the menu scene `Scenes/ARFoundationMenu/Menu.unity`.
- Press Play. *"Hey wait, we didn't import ARSimulation yet!" "Yes indeed. We want to show you how lonely it is here without it."*
- The samples scene looks somewhat like this (more or less pixelated depending on your Game window settings):
  ![arfoundation-samples: first play](../../wiki/images/slow-start-01-arfoundation-samples-first-play.png)
  ***Q:** "Wow, what's up with the font?!"*
  ***A:** "Unfortunately, it seems Unity thinks this will only ever be looked at on a device, in a build. AR in Editor?! Hm."*

**Let's give it a chance and install the Device Simulator package to make this look better.**

- Stop Play Mode.
- Open Package Manager.
- Make sure "All Packages" and "Preview Packages" are enabled. (*Note: on 2020.1/2020.2, you'll need to enable preview packages in Project Settings first.*)
- Install Device Simulator 2.2.2+.
  ![arfoundation-samples: install Device Simulator](../../wiki/images/slow-start-02-install-device-simulator.png)
- In your Game View, select the new little dropdown and then `Simulator`.
  ![arfoundation-samples: set up Device Simulator](../../wiki/images/slow-start-03-switch-to-device-simulator.png)
- Press Play.
- Note that this looks more like a device now:
  ![arfoundation-samples: Device Simulator View](../../wiki/images/slow-start-04-device-sim-view.png)
- Also note that this tells you that **no AR features are available in the Editor**. Luckily, AR Simulation is here to change that!

**Next up, the fun stuff happens. We'll install AR Simulation.**

- Stop Play Mode.
- Download the [📦 ARSimulation Installer](https://github.com/needle-tools/ar-simulation/releases/latest/download/ARSimulationInstaller.unitypackage).
- Drop the installer into your project.
- You should be greeted by our Getting Started window, and have documentation and sample scenes ready:
  ![arsimulation: Getting Started](../../wiki/images/slow-start-05-installation.png)
  You can close this window for now; you can always get back to it via `Window/AR Simulation/Getting Started`.
- Press Play.
  ![arsimulation: first play](../../wiki/images/slow-start-06-arsim-first-play.png)
  ***Q:** "Oh, suddenly a lot of the sample buttons have turned white?"*
  ***A:** "You probably guessed it: that means we can now simulate those scenes!"*

**Let's play with the samples a bit.**

- Click on the `Simple AR` button to load that scene.
- Press and hold the Right Mouse Button and use the WASDQE keys to move around the scene. Move the mouse to look around.
- Note that there's an orange plane being detected and tracked, together with some matching point cloud visuals. These visuals come straight from ARFoundation; this is _exactly_ what happens on device, including everything going on in the Unity hierarchy.
- Release the Right Mouse Button and click with the Left Mouse Button to spawn something.
  ![arsimulation: Simple AR](../../wiki/images/slow-start-07-arsim-sample-01.png)
- Press the `Return` button.
- Click on the `Interaction` button to load that scene. This scene uses the XR Interaction Toolkit for interactivity, another thing that is notoriously hard to test in Editor.
  ![arsimulation: Interaction](../../wiki/images/slow-start-08-arfoundation-interaction.gif)
- Press the `Return` button.
- Last one here: let's try `Sample UX`. Click it!
  ![arsimulation: Sample UX](../../wiki/images/slow-start-12-sample-ux.gif)

***Q:** "OK, I got it, the arfoundation-samples work. Is there more?"*
***A:** "Well, happy that you ask, of course!"*

- Make sure you have the AR Simulation samples installed at `Samples/AR Simulation`. If not, open `Window > AR Simulation > Getting Started` and click `Install Samples`.
- From `Samples/AR Simulation/someversion/Getting Started`, open the scene `RaycastPlanes.unity`. This scene uses a `Simulated AR Environment` to provide a more complex testing scenario.
- Press Play.
- Move around the scene using the Right Mouse Button and WASDQE.
- While moving around, notice that planes are detected as you go:
  ![arsimulation: simulated environment 1](../../wiki/images/slow-start-09-arsimulation-planes.gif)
- Click the Left Mouse Button to spawn little guys on all planes.
  ![arsimulation: simulated environment 2](../../wiki/images/slow-start-11-sample-tracked-planes.png)

***Q:** "Wow. This makes testing AR applications so much easier! How can I thank you guys?"*
***A:** "Well, please don't forget to buy a license!"*

License & Pricing πŸ’Έ

Using ARSimulation requires you to buy a per-seat license;
please buy seats for your team through the Unity Asset Store.
This is a one-time $60 payment, not a monthly subscription.

You can use it for 7 days for evaluation purposes only,
without buying a license. 🧐

Troubleshooting ☂️

Input does not work / I cannot move around

Go to Edit/Project Settings/XR-Plugin-Management and make sure that AR Simulation is checked βœ”οΈ in the PC, Mac and Linux Standalone Settings tab.
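If you'd rather verify this from code, a small runtime sketch using the XR Plug-in Management API (`XRGeneralSettings`, part of the `UnityEngine.XR.Management` namespace) can log which loader is active. This is an assumption-laden convenience, not part of AR Simulation itself:

```csharp
// Sketch: log the currently active XR loader at runtime.
// Requires the XR Plugin Management package (UnityEngine.XR.Management).
using UnityEngine;
using UnityEngine.XR.Management;

public class LogActiveXRLoader : MonoBehaviour
{
    void Start()
    {
        var manager = XRGeneralSettings.Instance != null
            ? XRGeneralSettings.Instance.Manager
            : null;

        if (manager == null || manager.activeLoader == null)
        {
            // No loader was initialized: check XR Plug-in Management settings.
            Debug.LogWarning("No active XR loader. Is AR Simulation enabled for Standalone?");
            return;
        }

        Debug.Log($"Active XR loader: {manager.activeLoader.name}");
    }
}
```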

Documentation πŸ“œ

We are working on improving the docs right now and making some nice "Getting Started" videos. Stay tuned; until then, here are some things you might need:

Found a bug? πŸ˜… Missing a feature?

Please open an issue and tell us about it! We want this to be as useful as possible to speed up your workflow.

Need more tracked planes? ✈

Video: Custom Planes
Video: Runtime Adjustments

The same works for Point Clouds.
(Tracked 3D Objects Coming Soonβ„’)

Working with Image Tracking? πŸ–Ό

Want to test against a more complicated scenery? 🏰

Video: Complex Environment Simulation

Here's a preview of a nicely dressed apartment sample: 🏡 sample: Example Apartment

URP example πŸ”¨

URP Sample scene as Environment
Click preview to watch video

Works great with

πŸ“± Device Simulator (but works without)

👆 Input System (works with old, new, or both)

Supported Configurations

| Unity Version | Input System: Old | Input System: Both | Input System: New | ARFoundation 3.1.5 | ARFoundation 4.0 | Game View | Device Simulator¹ |
| --- | --- | --- | --- | --- | --- | --- | --- |
|  | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ |
|  | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ |
|  | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ |

| Unity Version | Built-in | URP | HDRP² | Editor | iOS/Android Build³ | Desktop Build⁴ |
| --- | --- | --- | --- | --- | --- | --- |
|  | ✔️ | ✔️ | — | ✔️ | ✔️ | untested |
|  | ✔️ | ✔️ | — | ✔️ | ✔️ | untested |
|  | ✔️ | ✔️ | — | ✔️ | ✔️ | untested |

¹ Recommended. Feels very nice to use, and gives correct sizes for UI etc.
² HDRP is currently not supported by Unity on iOS/Android.
³ "Support" here means: ARSimulation does not affect your builds; it is purely for Editor simulation.
⁴ We haven't tested this as extensively as the others yet. Making desktop builds with ARSimulation is very useful for testing multiplayer scenarios without having to deploy to multiple mobile devices.
⁵ There is a known bug in XR Plugin Management 3.2.13 and earlier. Please use 3.2.15 or newer.

AR Simulation running in Device Simulator
ARSimulation running in Device Simulator.

Technical Details πŸ”¬

ARSimulation is an XR Plugin that works with Unity's XR SDK infrastructure and thus plugs right into ARFoundation and other systems in the VR/AR realm inside Unity.
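Because the provider plugs in through the XR SDK, it shows up via the same subsystem registry any XR provider uses. A hedged sketch (assuming the XR SDK-era `SubsystemManager` API and the Unity runtime) that lists the registered plane subsystem descriptors:

```csharp
// Sketch: list the XR plane subsystem descriptors registered by
// XR SDK providers. With AR Simulation enabled on desktop, its
// descriptor should appear here alongside any others.
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARSubsystems;

public class ListPlaneProviders : MonoBehaviour
{
    void Start()
    {
        var descriptors = new List<XRPlaneSubsystemDescriptor>();
        SubsystemManager.GetSubsystemDescriptors(descriptors);

        foreach (var descriptor in descriptors)
            Debug.Log($"Plane subsystem provider: {descriptor.id}");
    }
}
```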

XR Architecture - ARSimulation
Currently supported features are marked orange.

This architecture has some advantages: most importantly, existing ARFoundation-based projects work without modification.

Known Issues 🚧

But there is also MARS now! πŸ”­

Long story short:

MARS: A Framework for Simplified, Flexible AR Authoring

Unity describes MARS (Mixed and Augmented Reality Studio) as "a framework for simplified, flexible AR authoring". We were active alpha testers, trying to use it for our own AR applications, and started developing our own solution in parallel. After a while we stopped using MARS (apart from testing new releases and giving feedback, of course).

MARS is very ambitious and future-facing. It tries to anticipate many new types of devices and sensors, and to do that, it reinvents the wheel (namely: basic ARFoundation features) in many places.
It wraps around ARFoundation instead of extending it, which is great for some use cases but makes it very heavy for others.
A core concept of MARS is Functionality Injection, which at its base feels pretty similar to what the XR SDK system is trying to achieve (note: FI might allow for more complex scenarios, but it solves a similar problem of device-specific implementations).

XR Architecture - MARS

ARSimulation: A non-invasive Editor Simulation Backend

Our goal is fast iteration times in the Editor for the range of AR applications that we and partner companies build. These usually consist of placing and interacting with objects from different angles. We just needed a way to "simulate" an AR device in the Editor, not a full-blown additional framework!

Fortunately, Unity provides the ability to build exactly that using the XR plugin architecture: a custom XR provider that works in the Editor and Desktop builds.
There were quite a few challenges, especially around Input System support (we now support the old, new, and both modes) and touch injection (there's a private Input.SimulateTouch API that is also used by the Device Simulator package).
Plus the usual amount of Unity bugs and crashes; we're fairly confident that we worked around most of them and sent bug reports for the rest.

Comparison between MARS and ARSimulation βš”

| ⚔ | ARSimulation | MARS |
| --- | --- | --- |
| Claim | Non-invasive editor simulation backend for ARFoundation | Framework for simplified, flexible AR authoring |
| Functionality | XR SDK plugin for Desktop: positional tracking simulation, touch input simulation, image tracking, ... | Wrapper around ARFoundation with added functionality: custom simulation window, object constraints and forces, editor simulation (including most of what ARSimulation can do), file system watchers, custom Editor handles, codegen, ... |
| Complexity | 1 package • no additional files in project, only XR SDK configuration • < 80 types | 6 packages • 5 new top-level folders in your project • > 800 types and classes • 27 different ScriptableObjects with settings • 18 code-generated scripts with defines etc. |
| Changes to project | none |  |
| Required changes | none | ARFoundation components need to be replaced with their MARS counterparts |

The following table compares ARSimulation and MARS with respect to in-Editor simulation of features available in ARFoundation.
Note that MARS has a lot of additional tools and features (functionality injection, proxies, recordings, automatic placement of objects, constraints, ...) not mentioned here that might be relevant to your use case. See the MARS docs for additional features.

| ⚔ Simulation Features | ARSimulation | MARS |
| --- | --- | --- |
| Plane Tracking | ✔️ | ✔️ |
| Touch Input | ✔️ | ❌¹ |
| Simulated Environments | (✔️)² | ✔️ |
| Device Simulator | ✔️ | ❌³ |
| Point Clouds | ✔️ | ✔️ |
| Image Tracking | ✔️ | ✔️ |
| Light Estimation / Spherical Harmonics | ✔️ | ❌ |
| Anchors | ✔️ | ❌ |
| Meshing | (✔️) | ✔️ |
| Face Tracking | ❌ | (✔️)⁴ |
| Object Tracking | ✔️ | ❌ |
| Human Segmentation | ❌ | ❌ |

¹ MARS uses Input.GetMouseButtonDown for Editor input AND on-device input. This means: no testing of XR Interaction Toolkit features, no multitouch. You can see the (somewhat embarrassing) MARS input example at this Unity Forum link. ARSimulation supports full single-touch simulation in the Game View and Device Simulator.
² ARSimulation's plane shader doesn't support occlusion right now, which matches what ARFoundation's shaders currently do (no occlusion). You can still use your own occlusion-capable shaders (see the AR Foundation samples' PlaneOcclusion scene).
³ MARS uses a custom "Device View" but doesn't support the Unity-provided Device Simulator package. This means you can't test your UIs with proper DPI settings in MARS (e.g. the typical use of Canvas: Physical Size).
⁴ MARS has a concept of Landmarks that are created from ARKit blendshapes and ARCore raw meshes, but no direct support for either.
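The practical upshot of footnote 1: input code written against Unity's touch API keeps working with simulated touches, while mouse-only code does not translate to device or to multitouch. A minimal sketch of the portable pattern (requires the Unity runtime; the logging is illustrative only):

```csharp
// Sketch: handle taps via the touch API rather than mouse buttons.
// On device this reads real touches; in the Editor, ARSimulation's
// touch injection feeds the same API, so the code stays identical,
// and iterating over all touches keeps multitouch working.
using UnityEngine;

public class TapLogger : MonoBehaviour
{
    void Update()
    {
        for (int i = 0; i < Input.touchCount; i++)
        {
            Touch touch = Input.GetTouch(i);
            if (touch.phase == TouchPhase.Began)
                Debug.Log($"Tap {touch.fingerId} at {touch.position}");
        }
    }
}
```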

Open Issues on Unity's end πŸ›

Unfortunately, it seems nobody at Unity anticipated anyone building custom XR providers in C# that are actually supposed to work in the Editor. The XR SDK is advertised only as a "way to build custom C++ plugins".

This has led to funny situations where we reported bugs about Editor usage (e.g. in the ARFoundation Samples, the XR Interaction Toolkit, and others), and Unity told us that these "don't matter since you can't use them in the Editor anyway". Well, we hope you now see why we were asking.

Related solutions πŸ‘ͺ

Since Unity still hasn't provided a viable solution for testing AR projects without building to devices, a number of interesting projects have arisen to fill the gap, especially around remoting.
For our own projects, we found that device remoting is still too slow for continuous testing and experimentation, so we made ARSimulation.

Kirill Kuzyk recently released a great tool called AR Foundation Editor Remote which uses a similar approach and creates a custom, editor-only XR SDK backend that injects data received from remote Android and iOS devices.
Here's the AR Foundation Editor Remote forum thread.

Koki Ibukuro has also experimented with remoting data into the Unity editor for ARFoundation development. His plugin also supports background rendering.
It's available on GitHub: asus4/ARKitStreamer.

Unity Technologies is of course also experimenting with remoting, and currently has an internal alpha version that is undergoing user testing.
Here's the forum thread for the upcoming Unity AR Remoting.

And of course there's MARS, the newly released $600/seat/year framework for simplified and flexible AR authoring. It's probably a great solution for enterprises, and it has a ton of additional tooling that goes way beyond what ARFoundation provides. We were alpha testers of MARS, and early on it became clear that it was not what many people believed it to be: a simple way to test your app without building to device. Here's the Forum section for MARS.

Contact ✍️

Forum Thread β€” ARSimulation

needle — tools for unity • @NeedleTools • @marcel_wiessler • @hybridherbst • Say hi!

Discord Link Downloads GitHub Issues