Closed: jiungerich closed this issue 4 years ago
The sample in this repo uses a similar technique to overlay a copy of the reference image on a virtual plane representing the detected image. In the ImageInfoPrefab
prefab, the plane is rotated 180 degrees around the Y axis. Could you do something similar?
OK, thanks - that does work, but it seems unintuitive. It seems that as long as I create an empty object as the base of the prefab, I can transform or move anything I want inside it. It also seems odd that I need to first rotate everything 180 degrees from where I think it should go, or that rotation and positioning on the prefab and the main object are broken.
It seems that to make positioning and rotation adjustments for the whole prefab, one has to create an empty object and then a base transparent plane that all the other objects are children of, then do scaling on the empty base object but rotation and positioning on the plane. It works, but honestly, I spent weeks trying to figure out how to get objects to position correctly before I came here (part of it was issues with 2D objects in 2019.2 that seem to be fixed now that I have updated). I know there were several other people on the forum having similar issues.
It would be good to at least have notes in the main Image Tracking documentation on how a prefab gets positioned and what is required to transform the whole prefab, unless this behavior is going to change. It would be simpler if the transform fields just worked.
So I have been working on this more, specifically on spawning a different prefab for each image. I know of two published methods for this on YouTube (and on GitHub) and I have tried both. They are here: https://www.youtube.com/watch?v=I9j3MD7gS5Y and https://www.youtube.com/watch?v=iM0ghkvsRos
These work well with 3D objects. Basically, both get the ARTrackedImage from the event args and then set the object's position to trackedImage.transform.position, with no further modification.
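For reference, the pattern both videos use looks roughly like this - a minimal sketch, assuming one prefab per reference image matched by name (SpawnOnImage, prefabs, spawned and FindPrefab are placeholder names of mine, not code from either video):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Position-only approach from the videos: spawn a prefab per reference image
// and move it to the tracked image's position on every update.
[RequireComponent(typeof(ARTrackedImageManager))]
public class SpawnOnImage : MonoBehaviour
{
    [SerializeField] GameObject[] prefabs;   // one prefab per reference image, matched by name
    readonly Dictionary<string, GameObject> spawned = new Dictionary<string, GameObject>();
    ARTrackedImageManager trackedImageManager;

    void Awake()     => trackedImageManager = GetComponent<ARTrackedImageManager>();
    void OnEnable()  => trackedImageManager.trackedImagesChanged += OnTrackedImagesChanged;
    void OnDisable() => trackedImageManager.trackedImagesChanged -= OnTrackedImagesChanged;

    void OnTrackedImagesChanged(ARTrackedImagesChangedEventArgs eventArgs)
    {
        foreach (var trackedImage in eventArgs.added)
        {
            var prefab = FindPrefab(trackedImage.referenceImage.name);
            if (prefab != null)
                spawned[trackedImage.referenceImage.name] = Instantiate(
                    prefab, trackedImage.transform.position, Quaternion.identity);
        }

        foreach (var trackedImage in eventArgs.updated)
        {
            if (spawned.TryGetValue(trackedImage.referenceImage.name, out var instance))
                instance.transform.position = trackedImage.transform.position;  // position only, rotation ignored
        }
    }

    GameObject FindPrefab(string imageName)
    {
        foreach (var prefab in prefabs)
            if (prefab.name == imageName)
                return prefab;
        return null;
    }
}
```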
For the 3D objects, the position is offset from the centroid without any extra code, the same as with other 3D objects.
However, right now I am trying to just add a note, exactly like the example prefab provided in the AR Foundation samples. When I use the exact prefab from the AR Foundation samples, it always shows up 90 degrees from where it should be, so the thin edge of the note plane faces the viewer. I am not sure why the behavior is different, or why the object is instantiated not at the image position but at a position offset by the canvas centroid and turned 90 degrees. It doesn't seem to matter if I manipulate the canvas rotation in the prefab.
What I would really like to do is spawn whole cool animated scenes, each procedurally generated and related to the specific image. I could build these scenes in a prefab and have them spawn accordingly if I could get them to position reliably. I just want more than simple static objects - because, well, it would just be much, much cooler to release the entire power of the Unity Engine on an image.
I have been using Unity 2019.3.12f1 with all the latest AR packages available.
So, I figured out the issue in my post above. The YouTube videos I was following only set the position of the object, not its rotation. Following the AR Foundation demo code for multiple objects solves it; a sketch of that approach is below for anyone else looking at this:
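A minimal sketch of that fix, reusing the hypothetical spawned dictionary and FindPrefab helper from the earlier sketch - the key change is copying the rotation as well as the position (alternatively, Instantiate(prefab, trackedImage.transform) parents the instance to the tracked image so it inherits the pose automatically, which is essentially what the samples' multi-prefab example does):

```csharp
// Same handler as before, but matching the image's orientation, not just its position.
void OnTrackedImagesChanged(ARTrackedImagesChangedEventArgs eventArgs)
{
    foreach (var trackedImage in eventArgs.added)
    {
        var prefab = FindPrefab(trackedImage.referenceImage.name);
        if (prefab != null)
            spawned[trackedImage.referenceImage.name] = Instantiate(
                prefab,
                trackedImage.transform.position,
                trackedImage.transform.rotation);   // copy the rotation too
    }

    foreach (var trackedImage in eventArgs.updated)
    {
        if (spawned.TryGetValue(trackedImage.referenceImage.name, out var instance))
            instance.transform.SetPositionAndRotation(
                trackedImage.transform.position,
                trackedImage.transform.rotation);
    }
}
```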
However, I would still like an easier way to set up the prefabs, as described in the main part of the thread - it would be nice.
Yes, this really is an issue for AR Foundation image tracking. We normally use Vuforia, and with their SDK it is possible to position objects on the image in the Unity Editor; then you wrap everything in a prefab and instantiate it. It would be very helpful if this kind of functionality were also added to AR Foundation.
We will look at improving the workflow and documentation around this.
I have been working in Unity 2019.3.7f1 with AR Foundation 3.0.1.
I have been able to get image recognition to generate a standard cube and a plane with a video player (whose video always appears upside-down).
It seems that when spawning an object on an image I have no control over the object's transform beyond scale. If I add a rotation or position offset, I have no control over where, or at what rotation, the object spawns. It looks like the AR Tracked Image Manager just uses a single calculation from the centroid of the prefab to the centroid of the image.
I have basically been trying to replicate in AR Foundation a project I made last summer using Vuforia and Unity, which was almost codeless. That included an offset video player and an animated robot spawned on a business card. Here I have been resorting to coded workarounds, but I have no clear or easy way to place a spawned prefab off-center or rotated. A video of that past project is here:
https://youtu.be/szThPWMD6Ik
It would be good to have some transform coordinates in the AR Tracked Image Manager component, or even just a toggle that lets the prefab's own transform be used to help set its position. Even better would be the ability to build a prefab in the scene, with an image plane and a centroid to position from.
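In the meantime, one workaround sketch - assuming the prefab is instantiated as a child of the ARTrackedImage, as in the parenting variant above; ImageLocalOffset, localOffset and localEuler are hypothetical names for illustration - is to express the desired placement as a local offset on the prefab's root:

```csharp
using UnityEngine;

// Hypothetical helper: place on the root of a prefab that is instantiated as a
// child of an ARTrackedImage. It shifts and rotates the whole prefab relative
// to the detected image without touching the individual children.
public class ImageLocalOffset : MonoBehaviour
{
    [SerializeField] Vector3 localOffset = new Vector3(0.05f, 0f, 0f); // metres, in the image's local space
    [SerializeField] Vector3 localEuler  = new Vector3(0f, 180f, 0f);  // e.g. the 180-degree Y flip used by the sample prefab

    void Start()
    {
        // For a tracked image, X and Z lie in the image plane and Y is its normal,
        // so this offsets along the image and spins the content around its normal.
        transform.localPosition = localOffset;
        transform.localRotation = Quaternion.Euler(localEuler);
    }
}
```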