[Closed] cgathergood closed this issue 6 years ago.
If you didn't already know, there is an Augmented Images example in the ARCore v1.2 SDK samples, but it doesn't use sceneform.
I've been thinking that it'd be really nice to have some form of integration between these projects. For example, the documentation for Augmented Images states the following:
To get the matched images, poll for updated AugmentedImages in your frame update loop.
However, when using Sceneform, there isn't a direct API for tapping into the frame update loop. Perhaps a listener could be created for this purpose?
It would also be handy to have a way to add to the session configuration at the appropriate time. Again, a simple listener might be appropriate here.
I've read through the source code and am wondering if the only recourse would be to subclass ArFragment. Is that the way we should go about solving these problems?
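For later readers: Sceneform's Scene class also exposes a per-frame update listener, so if your Sceneform version has Scene.addOnUpdateListener you can get a frame hook without subclassing ArFragment. A minimal sketch, assuming you already hold a reference to your fragment in arFragment:

// Sketch: poll for updated AugmentedImages from a Scene update listener
// (assumes Scene.addOnUpdateListener is available in your Sceneform version).
arFragment.getArSceneView().getScene().addOnUpdateListener(frameTime -> {
    Frame frame = arFragment.getArSceneView().getArFrame();
    if (frame == null) {
        return;
    }
    for (AugmentedImage image : frame.getUpdatedTrackables(AugmentedImage.class)) {
        // React to images whose tracking state is TRACKING here.
    }
});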
@cgathergood and @nealsanche - Thanks for asking! We're working on what's the best way to provide samples for both Sceneform and the ARCore Android samples. This conversation is definitely helpful.
@nealsanche - Yes, subclassing the ArFragment class is the intended approach for changing the session configuration. What I do is override getSessionConfiguration() to return a Config object with the desired settings. The ArFragment class then always sets the update mode to LATEST_CAMERA_IMAGE so Sceneform does not block on the UI thread.
Completely unofficial, but I took a stab at an Augmented Images sample in my personal repo: https://github.com/claywilkinson/arcore-android-sdk/pull/1/files
Hopefully we can get it merged into the official repository in the near future.
Thanks @claywilkinson this is a great starting point, I'll fork your PR and take a look for myself. Cheers!
I guess just for some info for future discoverers of this, I did some super raw exploration by modifying the hello sceneform application slightly:
Made a simple ArFragment subclass like so:
package com.google.ar.sceneform.samples.hellosceneform;

import com.google.ar.core.Frame;
import com.google.ar.sceneform.FrameTime;
import com.google.ar.sceneform.ux.ArFragment;

public class MyArFragment extends ArFragment {
    FrameListener listener = null;

    public interface FrameListener {
        void onFrame(FrameTime frameTime, Frame frame);
    }

    @Override
    public void onUpdate(FrameTime frameTime) {
        super.onUpdate(frameTime);
        Frame arFrame = getArSceneView().getArFrame();
        if (listener != null) {
            listener.onFrame(frameTime, arFrame);
        }
    }

    public void setOnFrameListener(FrameListener listener) {
        this.listener = listener;
    }
}
Then, updated the layout:
<fragment android:name="com.google.ar.sceneform.samples.hellosceneform.MyArFragment"
android:id="@+id/ux_fragment"
android:layout_width="match_parent"
android:layout_height="match_parent" />
And updated the activity to make use of that:
private MyArFragment arFragment;
...
arFragment = (MyArFragment) getSupportFragmentManager().findFragmentById(R.id.ux_fragment);
Then I made a listener:
arFragment.setOnFrameListener((frameTime, frame) -> {
    Collection<AugmentedImage> updatedAugmentedImages =
        frame.getUpdatedTrackables(AugmentedImage.class);
    if (updatedAugmentedImages.size() == 1) {
        for (AugmentedImage image : updatedAugmentedImages) {
            if (image.getTrackingState() == TrackingState.TRACKING) {
                if (image.getName().equals("book")) {
                    Anchor anchor = image.createAnchor(image.getCenterPose());
                    if (andy == null) {
                        makeAndy(anchor);
                    }
                }
            }
        }
    }
});
And refactored the 3d model anchoring code into a method:
private void makeAndy(Anchor anchor) {
    AnchorNode anchorNode = new AnchorNode(anchor);
    anchorNode.setParent(arFragment.getArSceneView().getScene());

    // Create the transformable andy and add it to the anchor.
    TransformableNode andy = new TransformableNode(arFragment.getTransformationSystem());
    andy.setParent(anchorNode);
    andy.setRenderable(andyRenderable);
    andy.select();
    this.andy = andy;
}
This next part is probably not what you'd want to do, but it was enough to get it working for me, although it halts the app for a while:
@Override
protected void onResume() {
    super.onResume();
    Session session = arFragment.getArSceneView().getSession();
    if (session != null) {
        Log.e(TAG, "Got a session!");
        addImageTracker();
    }
}
And some rough code to add the image tracker and train on an image I put in the assets folder:
private void addImageTracker() {
    try {
        session = arFragment.getArSceneView().getSession();
        if (session == null) {
            arFragment.getArSceneView().setupSession(new Session(this));
            session = arFragment.getArSceneView().getSession();
        }
        AugmentedImageDatabase imageDatabase = new AugmentedImageDatabase(session);
        Bitmap bitmap = null;
        try (InputStream inputStream = getAssets().open("laptop.jpg")) {
            bitmap = BitmapFactory.decodeStream(inputStream);
        } catch (IOException e) {
            Log.e(TAG, "I/O exception loading augmented image bitmap.", e);
        }
        if (bitmap != null) {
            // 0.5f is the expected physical width of the image in meters.
            int index = imageDatabase.addImage("book", bitmap, 0.5f);
        }
        Config config = new Config(session);
        config.setAugmentedImageDatabase(imageDatabase);
        session.configure(config);
    } catch (Throwable ex) {
        Log.e(TAG, "Something bad happened", ex);
    }
}
This ends up recognizing the lid of my laptop, and paints Andy floating above it.
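As an aside, if you pre-build the image database with the arcoreimg tool you can skip decoding and indexing the bitmap at runtime and load the database directly. A rough sketch, assuming an asset named images.imgdb (the file name is just a placeholder):

// Sketch: load a database pre-built with the arcoreimg tool.
private AugmentedImageDatabase loadPrebuiltDatabase(Session session) {
    try (InputStream inputStream = getAssets().open("images.imgdb")) {
        return AugmentedImageDatabase.deserialize(session, inputStream);
    } catch (IOException e) {
        Log.e(TAG, "Could not load the pre-built image database.", e);
        return new AugmentedImageDatabase(session);
    }
}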
@nealsanche - That looks reasonable, thanks for sharing! You're correct that the onResume() bit is a little awkward. If you override getSessionConfiguration() in your subclassed fragment like this, you don't need the onResume() part.
getSessionConfiguration() is called by BaseArFragment right after the session is created. It is done this way so the fragment can then set the update mode to LATEST_CAMERA_IMAGE and avoid janky updating, since Sceneform runs this on the UI thread.
public class ArFragmentWithAugmentedImages extends ArFragment {
    static final String TAG = "ArFragmentWithAugmentedImages";

    @Override
    protected Config getSessionConfiguration(Session session) {
        AugmentedImageDatabase imageDatabase = new AugmentedImageDatabase(session);
        Bitmap bitmap = null;
        try (InputStream inputStream = getActivity().getAssets().open("laptop.jpg")) {
            bitmap = BitmapFactory.decodeStream(inputStream);
        } catch (IOException e) {
            Log.e(TAG, "I/O exception loading augmented image bitmap.", e);
        }
        if (bitmap != null) {
            // 0.5f is the expected physical width of the image in meters.
            int index = imageDatabase.addImage("book", bitmap, 0.5f);
        }
        Config config = new Config(session);
        config.setAugmentedImageDatabase(imageDatabase);
        return config;
    }
}
@claywilkinson how do I rotate my ViewRenderable node in front of my target image? I want to make the node overlay the image exactly.
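Not an official answer, but one approach that may work: the anchor created from getCenterPose() has its +Y axis pointing out of the image, while a ViewRenderable faces +Z, so rotating the node about X should lay the view flat on the image. A hedged sketch, where anchorNode and viewRenderable are placeholders from your own code:

// Sketch: lay a ViewRenderable flat over the detected image.
Node viewNode = new Node();
viewNode.setParent(anchorNode);
viewNode.setRenderable(viewRenderable);
// Rotate 90 degrees about the X axis so the view lies on the image plane;
// you may need +90f instead of -90f depending on which way it should face.
viewNode.setLocalRotation(Quaternion.axisAngle(new Vector3(1f, 0f, 0f), -90f));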
Thanks for the help everyone - I did a quick write up of what I learned here: https://medium.com/@cgathergood/augmented-images-with-arcore-and-sceneform-4c3fe774b5bd
Let me know what you think! Great work, Sceneform team!
[Edit by fredsa@] The samples directory has moved. Find link to all Sceneform examples here: developers.google.com/ar/develop/java/sceneform/samples
Sceneform augmented Image sample up at https://github.com/google-ar/sceneform-android-sdk/tree/master/augmentedimage
Hello. Can you help me with a code example for multi-image tracking in ARCore? With the ARCore Augmented Images example I can create a database with, say, 5 images, but only one object is augmented for all of them. How can I have 5 images and assign a different object to each image? Thank you.
@AlexandruStrujac Sure thing. In your onUpdateFrame you can detect the image and add a new model as follows (it would be nicer to use a switch here, but you get the idea):

if (augmentedImage.getTrackingState() == TrackingState.TRACKING) {
    if (augmentedImage.getName().equals("apple")) {
        AugmentedImageNode node = new AugmentedImageNode(this, "apple.sfb");
        node.setImage(augmentedImage);
        arSceneView.getScene().addChild(node);
    } else if (augmentedImage.getName().equals("orange")) {
        AugmentedImageNode node = new AugmentedImageNode(this, "orange.sfb");
        node.setImage(augmentedImage);
        arSceneView.getScene().addChild(node);
    }
}
Make sure the name of the image corresponds to your model file.
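If you have more than a couple of images, a map from image name to model file keeps that if/else chain manageable. A rough sketch (the image names and .sfb files are placeholders, and arSceneView and AugmentedImageNode are the same ones used above):

// Sketch: look up the model for each detected image by name
// (uses java.util.Map / java.util.HashMap).
private static final Map<String, String> MODELS_BY_IMAGE = new HashMap<>();
static {
    MODELS_BY_IMAGE.put("apple", "apple.sfb");
    MODELS_BY_IMAGE.put("orange", "orange.sfb");
}

private void onAugmentedImage(AugmentedImage augmentedImage) {
    if (augmentedImage.getTrackingState() != TrackingState.TRACKING) {
        return;
    }
    String model = MODELS_BY_IMAGE.get(augmentedImage.getName());
    if (model != null) {
        AugmentedImageNode node = new AugmentedImageNode(this, model);
        node.setImage(augmentedImage);
        arSceneView.getScene().addChild(node);
    }
}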
AugmentedImageNode looks like this
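A minimal sketch of what such a node can look like, assuming it simply anchors a ModelRenderable loaded from the named .sfb asset (this is not the exact code from the repo):

import android.content.Context;
import android.net.Uri;
import com.google.ar.core.AugmentedImage;
import com.google.ar.sceneform.AnchorNode;
import com.google.ar.sceneform.Node;
import com.google.ar.sceneform.rendering.ModelRenderable;
import java.util.concurrent.CompletableFuture;

// Sketch: an AnchorNode that loads a model and attaches it at the image's center.
public class AugmentedImageNode extends AnchorNode {

    private final CompletableFuture<ModelRenderable> modelFuture;

    public AugmentedImageNode(Context context, String modelName) {
        modelFuture = ModelRenderable.builder()
            .setSource(context, Uri.parse(modelName))
            .build();
    }

    public void setImage(AugmentedImage image) {
        // Wait for the model to finish loading before anchoring.
        if (!modelFuture.isDone()) {
            modelFuture.thenAccept(renderable -> setImage(image));
            return;
        }
        setAnchor(image.createAnchor(image.getCenterPose()));
        Node modelNode = new Node();
        modelNode.setParent(this);
        modelNode.setRenderable(modelFuture.getNow(null));
    }
}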
And here's my MainActivity for reference
Let me know if that works for you!
@cgathergood Is onUpdateFrame in the augmented image controller? What should I do with AugmentedImageNode? Sorry, but I am a beginner in ARCore and coding... Also, the images in your code are apple and orange, right? Are apple.sfb and orange.sfb the 3D objects?
Thank you so much for helping me.
Or should I put that code in the AugmentedImage script?
My project is based on the Google ARCore Augmented Image example.
I did most of my work before the official Augmented Images Sample so I'm not too familiar with the code.
I'd like to think my project is relatively easy to follow along with - https://github.com/cgathergood/Sceneform-Augmented-Images
I looked at the non-sceneform example first then worked from there to utilize Sceneform. All my example does is simply match a 3D object to a stored image.
I wrote a blog post on this which should hopefully explain my approach. If you just want the model to appear in front of an image then I'd recommend this approach.
@nealsanche @claywilkinson
Whenever I try to launch the ArFragment in my app I get the crash given below. For more about my issue you can go to my issue here.
java.lang.NoSuchMethodError: No virtual method requireActivity()Landroid/support/v4/app/FragmentActivity; in class Lcom/google/ar/sceneform/ux/BaseArFragment; or its super classes (declaration of 'com.google.ar.sceneform.ux.BaseArFragment' appears in /data/app/com.tipestrygo.lenovo.tipestry-1/base.apk:classes7.dex)
    at com.google.ar.sceneform.ux.BaseArFragment.requestDangerousPermissions(BaseArFragment.java:264)
    at com.google.ar.sceneform.ux.BaseArFragment.onCreateView(BaseArFragment.java:213)
    at android.support.v4.app.Fragment.performCreateView(Fragment.java:2261)
    at android.support.v4.app.FragmentManagerImpl.ensureInflatedFragmentView(FragmentManager.java:1655)
    at android.support.v4.app.FragmentManagerImpl.moveToState(FragmentManager.java:1390)
    at android.support.v4.app.FragmentManagerImpl.moveToState(FragmentManager.java:1650)
    at android.support.v4.app.FragmentManagerImpl.addFragment(FragmentManager.java:1906)
    at android.support.v4.app.FragmentManagerImpl.onCreateView(FragmentManager.java:3698)
    at android.support.v4.app.FragmentController.onCreateView(FragmentController.java:111)
    at android.support.v4.app.FragmentActivity.dispatchFragmentsOnCreateView(FragmentActivity.java:350)
    at android.support.v4.app.BaseFragmentActivityApi14.onCreateView(BaseFragmentActivityApi14.java:39)
Hi,
I tried the approach @nealsanche described above, but I end up getting updatedAugmentedImages.size() == 0. Please help.
Regards, Seema
Hey team! Nice work with Sceneform; it's been great to read through the source code for the solar system sample to get an understanding of the SDK.
Is there any chance of getting an Augmented Images example? I'd love to get an understanding of how to get started with this feature. Thanks!