Build AR apps with confidence.
Iterate fast, right in Editor.
Non-invasive, drop-in solution.
Fair pricing.
Quick Start β‘ β’ License & Pricing πΈ β’ Documentation π β’ Troubleshooting βοΈ
Technical Details π β’ Comparison to MARS π β’ Related Solutions πͺ β’ Say hi βοΈ
This package allows you to fly around in the Editor and test your AR app, without having to change any code or structure. Iterate faster, test out more ideas, build better apps.
ARSimulation is a custom XR backend, built on top of the XR plugin architecture.
This scene only uses ARFoundation features.
Because it's just another XR Plugin and we took great care to simulate important features, it works with your existing app, ARFoundation, XR Interaction Toolkit β zero changes to your code or setup needed! β¨
And if you need more control, there are plenty of knobs to turn.
`Tools/AR Simulation/Convert to AR Scene`
Using ARSimulation requires you to buy a per-seat license; please buy seats for your team through the Unity Asset Store. This is a single $60 payment, not a monthly subscription. You can use it for 7 days for evaluation purposes only, without buying a license.
Go to `Edit/Project Settings/XR-Plugin-Management` and make sure that AR Simulation is checked ✔️ in the PC, Mac and Linux Standalone settings tab.
We are working on improving the docs right now and making some nice "Getting Started" videos. Stay tuned! Until then, here are some things you might need:
Please open an issue and tell us about it! We want this to be as useful as possible to speed up your workflow.
- Drag the `SimulatedPlane` prefab into the scene in Edit or Play Mode (or just add another `SimulatedARPlane` component). Video: Custom Planes
- Video: Runtime Adjustments
- The same works for Point Clouds.
- (Tracked 3D Objects Coming Soon™)
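If you prefer doing this from script rather than dragging a prefab, a minimal sketch looks like the following. `SimulatedARPlane` is the component name used by ARSimulation; the position and object name here are just illustrative:

```csharp
using UnityEngine;

// Illustrative sketch: spawn an extra simulated plane while in Play Mode.
// `SimulatedARPlane` comes from ARSimulation; any additional fields it
// exposes (size, orientation, ...) can be set via the Inspector.
public class SpawnSimulatedPlane : MonoBehaviour
{
    void Start()
    {
        var go = new GameObject("My Simulated Plane");
        go.transform.position = new Vector3(0f, 0f, 2f);
        go.AddComponent<SimulatedARPlane>();
    }
}
```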
- A `Simulated Tracked Image` is generated for you: an empty GameObject + `SimulatedARTrackedImage` + an image of your choice (which needs to be in a `ReferenceImageLibrary`, of course).
- Add a `SimulatedEnvironment` component to it. Video: Complex Environment Simulation
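The tracked-image setup above can also be scripted. A hedged sketch, assuming the `SimulatedARTrackedImage` component name from this package; how the reference image is actually assigned is not shown (check the component in the Inspector for the real fields):

```csharp
using UnityEngine;

// Illustrative sketch: set up a simulated tracked image from script.
// Remember: the image you simulate must also be part of your
// ReferenceImageLibrary, or ARFoundation won't report it.
public class SpawnSimulatedTrackedImage : MonoBehaviour
{
    void Start()
    {
        var go = new GameObject("Simulated Tracked Image");
        go.transform.SetPositionAndRotation(new Vector3(0f, 0f, 1f), Quaternion.identity);
        go.AddComponent<SimulatedARTrackedImage>();
    }
}
```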
Here's a preview of a nicely dressed apartment sample: 🏡
π± Device Simulator (but works without)
π Input System: both (but works with old/new/both)
| Unity Version | Input System: Old | Input System: Both | Input System: New | ARFoundation 3.15 | ARFoundation 4.0 | Game View | Device Simulator¹ |
|---|---|---|---|---|---|---|---|
|  | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ |
|  | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ |
|  | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ |
| Unity Version | Built-in | URP | HDRP² | Editor | iOS/Android Build³ | Desktop Build⁴ |
|---|---|---|---|---|---|---|
|  | ✔️ | ✔️ | ❌ | ✔️ | ✔️ | untested |
|  | ✔️ | ✔️ | ❌ | ✔️ | ✔️ | untested |
|  | ✔️ | ✔️ | ❌ | ✔️ | ✔️ | untested |
1 Recommended. Feels very nice to use, and gives correct sizes for UI etc.
2 HDRP is not supported by Unity on iOS/Android currently.
3 "Support" here means: ARSimulation does not affect your builds, it is purely for Editor simulation.
4 We haven't done as extensive testing as with the others yet. Making Desktop builds with ARSimulation is very useful for testing multiplayer scenarios without the need to deploy to multiple mobile devices.
5 There is a known bug in XR Plugin Management 3.2.13 and earlier. Please update to 3.2.15 or newer.
ARSimulation running in Device Simulator.
ARSimulation is an XR Plugin that works with Unity's XR SDK infrastructure and thus plugs right into ARFoundation and other systems in the VR/AR realm inside Unity.
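To make the "custom XR backend" idea concrete, a loader in the XR plugin architecture derives from `XRLoaderHelper` and creates/destroys subsystems. This is only an illustration of the mechanism; ARSimulation's actual loader class name and the full set of subsystems it registers are not shown here:

```csharp
using System.Collections.Generic;
using UnityEngine.XR.ARSubsystems;
using UnityEngine.XR.Management;

// Illustrative only: a minimal XR SDK loader. "Example-Session" is a
// placeholder subsystem id, not the id ARSimulation actually registers.
public class ExampleSimulationLoader : XRLoaderHelper
{
    static readonly List<XRSessionSubsystemDescriptor> s_SessionDescriptors =
        new List<XRSessionSubsystemDescriptor>();

    public override bool Initialize()
    {
        // Look up and create the session subsystem by its registered id.
        CreateSubsystem<XRSessionSubsystemDescriptor, XRSessionSubsystem>(
            s_SessionDescriptors, "Example-Session");
        return GetLoadedSubsystem<XRSessionSubsystem>() != null;
    }

    public override bool Deinitialize()
    {
        DestroySubsystem<XRSessionSubsystem>();
        return true;
    }
}
```

Because the loader is selected in XR Plug-in Management, ARFoundation talks to it exactly as it would to the ARKit or ARCore providers.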
Currently supported features are marked orange.
This architecture has some advantages:
- Planes don't occlude? You can use your own occlusion shader (see the `AR Foundation samples/Plane Occlusion` scene).
- Lighting looks off? Make sure `Auto Generate` is on in the Lighting Window, or bake light data.
- `No active UnityEngine.XR.XRInputSubsystem is available. Please ensure that a valid loader configuration exists in the XR project settings.` We have no idea what that means: Link to Forum Thread
- Long story short: there is a dependency on `XRLegacyInputHelpers` that isn't needed in all cases; we will remove that dependency in a future release.
Unity describes MARS (Mixed and Augmented Reality Studio) as "a framework for simplified, flexible AR authoring". We were active alpha testers, trying to use it for our own AR applications, and started developing our own solution in parallel. After a while, we stopped using MARS (besides of course testing and giving feedback to new releases).
MARS is very ambitious and future-facing. It tries to anticipate many new types of devices and sensors, and to do that, reinvents the wheel (namely: basic ARFoundation features) in many places.
It wraps around ARFoundation instead of extending it, which is great for some use cases but makes it very heavy for others.
A core concept of MARS is Functionality Injection, which at its base feels pretty similar to what the XR SDK system is trying to achieve (note: FI might allow for more complex scenarios, but it solves a similar problem of device-specific implementations).
Our goal is fast iteration times in the Editor for a range of AR applications we and partner companies build. These usually consist of placing and interacting with objects from different angles. We just needed a way to "simulate" an AR device in the Editor, not a full-blown additional framework!
Fortunately, Unity provides the ability to build exactly that using the XR plugin architecture: a custom XR provider that works in the Editor and Desktop builds.
There were quite a few challenges, especially around Input System support (we support old/new/both modes now) and touch injection (there's a private Input.SimulateTouch API that is also used by the DeviceSimulator package).
Plus, the usual amount of Unity bugs and crashes; we are pretty confident that we worked around most of them and sent bug reports for the others.
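Since `Input.SimulateTouch` is private, calling it from outside the engine means going through reflection. A hedged sketch of that idea; the exact signature is internal, may differ between Unity versions, and the parameter list assumed here is `(long, Vector3, TouchPhase)`:

```csharp
using System.Reflection;
using UnityEngine;

// Illustrative sketch: invoke the private Input.SimulateTouch API via
// reflection. The method is internal, so this can break on any Unity
// update — treat it as a demonstration of the technique, not a stable API.
static class TouchInjection
{
    static readonly MethodInfo s_SimulateTouch = typeof(Input).GetMethod(
        "SimulateTouch",
        BindingFlags.NonPublic | BindingFlags.Static,
        null,
        new[] { typeof(long), typeof(Vector3), typeof(TouchPhase) },
        null);

    public static void Simulate(long fingerId, Vector3 position, TouchPhase phase)
    {
        // Silently no-ops if the method doesn't exist in this Unity version.
        s_SimulateTouch?.Invoke(null, new object[] { fingerId, position, phase });
    }
}
```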
|  | ARSimulation | MARS |
|---|---|---|
| Claim | Non-invasive editor simulation backend for ARFoundation | Framework for simplified, flexible AR Authoring |
| Functionality | XR SDK plugin for Desktop: positional tracking simulation, touch input simulation, image tracking, ... | Wrapper around ARFoundation with added functionality: custom simulation window, object constraints and forces, editor simulation (including most of what ARSimulation can do), file system watchers, custom Editor handles, codegen, ... |
| Complexity |  |  |
| Changes to project | none |  |
| Required changes | none | ARFoundation components need to be replaced with their MARS counterparts |
The following table compares ARSimulation and MARS with respect to in-editor simulation of features available in ARFoundation.
Note that MARS has a lot of additional tools and features (functionality injection, proxies, recordings, automatic placement of objects, constraints, ...) not mentioned here that might be relevant to your use case. See the MARS docs for additional features.
|  | ARSimulation Simulation Features | MARS Simulation Features |
|---|---|---|
| Plane Tracking | ✔️ | ✔️ |
| Touch Input | ✔️ | ❌¹ |
| Simulated Environments | (✔️)² | ✔️ |
| Device Simulator | ✔️ | ❌³ |
| Point Clouds | ✔️ | ✔️ |
| Image Tracking | ✔️ | ✔️ |
| Light Estimation / Spherical Harmonics | ✔️ | ❌ |
| Anchors | ✔️ | ❌ |
| Meshing | (✔️) | ✔️ |
| Face Tracking | ❌ | (✔️)⁴ |
| Object Tracking | ✔️ | ❌ |
| Human Segmentation | ❌ | ❌ |
1 MARS uses `Input.GetMouseButtonDown` for editor input AND on-device input. This means: no testing of XR Interaction Toolkit features, no multitouch. You can see the (somewhat embarrassing) MARS input example at this Unity Forum link. ARSimulation supports full single-touch simulation in GameView and DeviceSimulator.
2 ARSimulation's plane shader doesn't support occlusion right now, which matches what ARFoundation shaders currently do (no occlusion). You can still use your own shaders that support occlusion (see the `AR Foundation samples/PlaneOcclusion` scene).
3 MARS uses a custom "Device View", but doesn't support the Unity-provided Device Simulator package. This means you can't test your UIs with MARS with proper DPI settings (e.g. the typical use of Canvas: Physical Size).
4 MARS has a concept of Landmarks that are created from ARKit blendshapes and ARCore raw meshes, but no direct support for either.
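As footnote 1 above points out, ARSimulation injects real touches, so ordinary ARFoundation touch-handling code runs unchanged in the Editor with no mouse-specific branches. A minimal sketch using standard ARFoundation APIs (the component and field names here are just for illustration):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Standard ARFoundation tap-to-place: raycast against tracked planes from
// the first touch. This same script works on device and, with ARSimulation,
// in the Editor's Game View or Device Simulator.
public class PlaceOnPlane : MonoBehaviour
{
    public ARRaycastManager raycastManager; // assign in the Inspector
    public GameObject prefab;               // object to place on tap

    static readonly List<ARRaycastHit> s_Hits = new List<ARRaycastHit>();

    void Update()
    {
        if (Input.touchCount == 0) return;
        var touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        if (raycastManager.Raycast(touch.position, s_Hits, TrackableType.PlaneWithinPolygon))
        {
            var pose = s_Hits[0].pose;
            Instantiate(prefab, pose.position, pose.rotation);
        }
    }
}
```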
Unfortunately it seems nobody at Unity anticipated someone building custom XR providers in C# that are actually supposed to work in the Editor. It's advertised as a "way to build custom C++ plugins" only.
This has led to funny situations where we reported bugs around Editor usage (e.g. of the ARFoundation Samples, XR Interaction Toolkit, and others), and Unity told us that these "don't matter since you can't use them in Editor anyways". Well, we hope you now see why we were asking.
Since Unity still hasn't provided a viable solution for testing AR projects without building to devices, a number of interesting projects arose to overcome that, especially for remoting.
For our own projects, we found that device remoting is still too slow for continuous testing and experimentation, so we made ARSimulation.
Kirill Kuzyk recently released a great tool called AR Foundation Editor Remote which uses a similar approach and creates a custom, editor-only XR SDK backend that injects data received from remote Android and iOS devices.
Here's the AR Foundation Editor Remote forum thread.
Koki Ibukuro has also experimented with remoting data into the Unity editor for ARFoundation development. His plugin also supports background rendering.
It's available on GitHub: asus4/ARKitStreamer.
Unity Technologies is of course also experimenting with remoting, and currently has an internal Alpha version that is undergoing user testing.
Here's the forum thread for the upcoming Unity AR Remoting.
And of course there's MARS, the newly released, $600/seat/year framework for simplified and flexible AR Authoring. It's probably a great solution for enterprises, and has a ton of additional tooling that goes way beyond what ARFoundation provides. We were Alpha testers of MARS, and early on it became clear that it was not what many people believed it to be: a simple way to test your app without building to device. Here's the Forum section for MARS.
needle β tools for unity β’ @NeedleTools β’ @marcel_wiessler β’ @hybridherbst β’ Say hi!