Closed: tellypresence closed this issue 3 years ago
@romainguy Thank you for following up, but without actual sample code this is likely to remain a difficult problem to surmount.
Surely, surely, surely Google isn't killing Sceneform without replacing it with some other high-ish-level 3D scene renderer? My various iOS and Android codebases, despite obvious differences (Swift/Kotlin, ARKit/ARCore, SceneKit/Sceneform), are very similar in structure and operation. Now I have a big chunk of functionality missing going forward, unless I use an archived library (and we all know how long those stay well maintained and easy to use, no matter how enthusiastic the community).
A direct equivalent of Apple's SceneKit that can be used cleanly with and without ARCore would be the ideal solution. Unity or Unreal Engine are not realistic options in a lot of cases, for any number of reasons.
I don't think Google is working on a replacement library. They have officially deprecated Sceneform and are focused solely on ARCore and SceneViewer. I am hopeful that the community is able to fix the gaps in Sceneform and allow it to continue to be used.
Our current blocker to adoption of 1.16 is support for video textures, aka getting ExternalTexture to work again as @Sergiioh mentioned. Has anyone made any progress on that front?
Hi @hshapley, you can use the ExternalTexture by adding this code.
Code:

    Material.builder()
        .setSource(
            context,
            RenderingResources.GetSceneformResource(
                context, RenderingResources.Resource.VIEW_RENDERABLE_MATERIAL))
        .build()
        .thenAccept(
            material -> {
              // Build a quad facing +Z whose UVs cover the whole texture.
              ArrayList<Vertex> vertices = new ArrayList<>();
              vertices.add(
                  Vertex.builder()
                      .setPosition(new Vector3(-0.5f, 0.0f, 0.0f))
                      .setNormal(new Vector3(0.0f, 0.0f, 1.0f))
                      .setUvCoordinate(new Vertex.UvCoordinate(0.0f, 0.0f))
                      .build());
              vertices.add(
                  Vertex.builder()
                      .setPosition(new Vector3(0.5f, 0.0f, 0.0f))
                      .setNormal(new Vector3(0.0f, 0.0f, 1.0f))
                      .setUvCoordinate(new Vertex.UvCoordinate(1.0f, 0.0f))
                      .build());
              vertices.add(
                  Vertex.builder()
                      .setPosition(new Vector3(-0.5f, 1.0f, 0.0f))
                      .setNormal(new Vector3(0.0f, 0.0f, 1.0f))
                      .setUvCoordinate(new Vertex.UvCoordinate(0.0f, 1.0f))
                      .build());
              vertices.add(
                  Vertex.builder()
                      .setPosition(new Vector3(0.5f, 1.0f, 0.0f))
                      .setNormal(new Vector3(0.0f, 0.0f, 1.0f))
                      .setUvCoordinate(new Vertex.UvCoordinate(1.0f, 1.0f))
                      .build());
              // Two counter-clockwise triangles make up the quad.
              ArrayList<Integer> triangleIndices = new ArrayList<>();
              triangleIndices.add(0);
              triangleIndices.add(1);
              triangleIndices.add(2);
              triangleIndices.add(1);
              triangleIndices.add(3);
              triangleIndices.add(2);
              RenderableDefinition.Submesh submesh =
                  RenderableDefinition.Submesh.builder()
                      .setTriangleIndices(triangleIndices)
                      .setMaterial(material)
                      .build();
              // Note: the future must be declared here inside the lambda; a local
              // declared outside the lambda cannot be assigned from within it.
              CompletableFuture<ModelRenderable> ulCorner =
                  ModelRenderable.builder()
                      .setSource(
                          RenderableDefinition.builder()
                              .setVertices(vertices)
                              .setSubmeshes(Arrays.asList(submesh))
                              .build())
                      .build();
              // Bind the ExternalTexture once the renderable is ready.
              ulCorner
                  .thenAccept(
                      renderable ->
                          renderable.getMaterial().setExternalTexture("viewTexture", texture))
                  .exceptionally(
                      throwable -> {
                        // Toast toast = Toast.makeText(
                        //     context, "Unable to load video renderable", Toast.LENGTH_LONG);
                        // toast.setGravity(Gravity.CENTER, 0, 0);
                        // toast.show();
                        return null;
                      });
            });
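A side note on the index list above: the order matters because counter-clockwise triangles are typically treated as front-facing (the OpenGL convention that Filament follows), so both triangles of the quad are wound counter-clockwise. Here is a tiny standalone check of that winding using the vertex positions from the snippet (plain JVM Java, no Android or Sceneform dependencies):

```java
public class WindingCheck {
    // Vertex positions (x, y) from the quad above.
    static final float[][] VERTS = {{-0.5f, 0f}, {0.5f, 0f}, {-0.5f, 1f}, {0.5f, 1f}};
    // Triangle indices from the snippet above.
    static final int[][] TRIS = {{0, 1, 2}, {1, 3, 2}};

    // z-component of the cross product of two edges; positive means counter-clockwise.
    static float windingZ(int[] t) {
        float[] a = VERTS[t[0]], b = VERTS[t[1]], c = VERTS[t[2]];
        return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0]);
    }

    public static void main(String[] args) {
        for (int[] t : TRIS) {
            System.out.println("winding z = " + windingZ(t)); // prints 1.0 for both triangles
        }
    }
}
```

Reversing any triangle's index order would flip the sign and make that face point away from the camera.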
If you have problems with the RenderingResources class, you can find it inside the Sceneform sources in the com.google.ar.sceneform.rendering package.
The RenderingResources class:
    import android.content.Context;
    import com.google.ar.sceneform.utilities.LoadHelper;

    final class RenderingResources {
      public static enum Resource {
        CAMERA_MATERIAL,
        OPAQUE_COLORED_MATERIAL,
        TRANSPARENT_COLORED_MATERIAL,
        OPAQUE_TEXTURED_MATERIAL,
        TRANSPARENT_TEXTURED_MATERIAL,
        PLANE_SHADOW_MATERIAL,
        PLANE_MATERIAL,
        PLANE,
        VIEW_RENDERABLE_MATERIAL,
      };

      private static int GetSceneformSourceResource(Context context, Resource resource) {
        switch (resource) {
          case CAMERA_MATERIAL:
            return LoadHelper.rawResourceNameToIdentifier(context, "sceneform_camera_material");
          case OPAQUE_COLORED_MATERIAL:
            return LoadHelper.rawResourceNameToIdentifier(
                context, "sceneform_opaque_colored_material");
          case TRANSPARENT_COLORED_MATERIAL:
            return LoadHelper.rawResourceNameToIdentifier(
                context, "sceneform_transparent_colored_material");
          case OPAQUE_TEXTURED_MATERIAL:
            return LoadHelper.rawResourceNameToIdentifier(
                context, "sceneform_opaque_textured_material");
          case TRANSPARENT_TEXTURED_MATERIAL:
            return LoadHelper.rawResourceNameToIdentifier(
                context, "sceneform_transparent_textured_material");
          case PLANE_SHADOW_MATERIAL:
            return LoadHelper.rawResourceNameToIdentifier(
                context, "sceneform_plane_shadow_material");
          case PLANE_MATERIAL:
            return LoadHelper.rawResourceNameToIdentifier(context, "sceneform_plane_material");
          case PLANE:
            return LoadHelper.drawableResourceNameToIdentifier(context, "sceneform_plane");
          case VIEW_RENDERABLE_MATERIAL:
            return LoadHelper.rawResourceNameToIdentifier(context, "sceneform_view_material");
        }
        return 0;
      }

      private static int GetMaterialFactoryBlazeResource(Resource resource) { return 0; }
      private static int GetViewRenderableBlazeResource(Resource resource) { return 0; }
      private static int GetSceneformBlazeResource(Resource resource) { return 0; }

      public static int GetSceneformResource(Context context, Resource resource) {
        int blazeResource = GetSceneformBlazeResource(resource);
        if (blazeResource != 0) {
          return blazeResource;
        }
        return GetSceneformSourceResource(context, resource);
      }

      private RenderingResources() {}
    }
Hello @tpsiaki, sorry to bother you again. I found a problem while using SceneView: it duplicates the model and has problems rendering the background. Do you know how to fix this?
Source code (XML):
    <com.google.ar.sceneform.SceneView
        android:id="@+id/scene_view"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        android:background="@color/gray" />
Java code:
    mSceneView = (SceneView) findViewById(R.id.scene_view);
    mScene = mSceneView.getScene();
    andyRenderable = renderable;

    // Setting the model for the SceneView
    TransformationSystem system = arFragment.getTransformationSystem();
    mSceneNode = new TransformableNode(system);
    mSceneNode.setParent(mScene);
    mSceneNode.setRenderable(andyRenderable);
    mSceneNode.getScaleController().setMinScale(0.01f);
    mSceneNode.getScaleController().setMaxScale(1f);
    mSceneNode.select();
    mSceneNode.setLocalPosition(new Vector3(0f, -0.5f, -1f));
    mScene.addChild(mSceneNode);
    onPlayAnimation(mSceneNode);
Greetings
This happens if you don't clear/render the entire surface.
Thanks for the answer @romainguy. This was working in version 1.15.0, so I thought it might be a bug in SceneView, just like the ExternalTexture render error I commented on before. Note: the idea is to show a model at the center of the scene. Something like this.
Same code, tested on an S10 with Android Q; the earlier phone was an S8 with Android 7 (Nougat).
This time the model seems fine, but the background color doesn't change.
Here is an example app that loads a GLB file and renders it with Filament. All Sceneform dependencies are removed. This is not a framework. I hope this helps anyone who wants to get started. example
Thanks for your effort. Does that mean that if I want to replace Sceneform, or just a part of it, I have to create my own engine based on Filament?
Sceneform was a framework that hid the details of working with ARCore and Filament directly. Filament is Google's open-source, cross-platform 3D rendering library. Since Sceneform was archived and hasn't been updated to work with the latest versions of Filament, you will likely need to use ARCore and some graphics library directly. Most of the ARCore example projects use OpenGL directly and you could do the same, but it's a lot of code to maintain. The advantage of using Filament is that it handles loading of GLB/glTF files for you and does most of the material-based rendering for you. You will probably still need to write some shaders yourself, but if all you need is to load 3D models, it will do all the heavy work for you. This project will at least give you a starting point that you can modify for your own purposes. I dislike the word engine because it sounds like a motor that takes fuel and might break down. ;-)
Actually, this issue's comments are a gold mine. Really the only place to clarify Sceneform's real current state in 2020 and what the real alternatives are (there are none). Guess we are all in this boat together.
I started an implementation for 3D .glb models using a Sceneform fork inside React Native as a native UI component. I forked v0.1.16 and updated it to AndroidX. I used the commit that solved the conflict Sceneform v0.1.16 had with Filament around TextView: basically v0.1.16 used Filament 1.45.0, which was crashing on ViewRenderable.builder(). Using this commit (link) allowed me to update Filament to v1.7.0, which solved the issue.
Also hoping for a solid solution on Google's side for this; the state of things on this front is kind of shocking compared to Apple's solid stack.
Hope it will help someone in the future.
This happens if you don't clear/render the entire surface.
Having the same issue @romainguy. How can you achieve this? Can't really find anything in the docs.
@zirman can you please tell us how we can achieve the Depth API functionality as well?
@HemanParbhakar I have not used the depth API. From what I know it seems to just return a texture that contains depth information. I would recommend looking at the docs here: https://developers.google.com/ar/develop/java/depth/overview
@zirman that's for Unity; I need it for Java. Also, what about occlusion in Filament?
@HemanParbhakar Sorry, in my rush to respond I posted the wrong link. I've updated my response.
@zirman but in Java it's using OpenGL.
@HemanParbhakar Sorry, but that's the best I can do. You'll have to figure out how to get it working with Filament.
OK @zirman, I will have a look.
Hi @romainguy.
Regarding your statement:
Passing 0 to beginFrame will work but it's better to pass the time supplied by Choreographer instead. It will allow much better frame skipping/dynamic resolution computations on the Filament side.
How can the SceneView instance be reached from inside the Renderer in order to use the Choreographer-supplied timestamp in the call to renderer.beginFrame()?
By the way, what exact API must be called?
Best regards.
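For reference, the usual pattern on Android is to drive rendering from a Choreographer.FrameCallback and forward its frameTimeNanos into Filament's Renderer.beginFrame, which accepts the frame time as its second argument in recent versions. A minimal sketch, assuming you already hold your own renderer, swapChain and view Filament objects (this is Android-only code and cannot run standalone):

```java
// Sketch only: renderer, swapChain and view are your existing Filament objects.
Choreographer choreographer = Choreographer.getInstance();
Choreographer.FrameCallback frameCallback = new Choreographer.FrameCallback() {
    @Override
    public void doFrame(long frameTimeNanos) {
        // Re-post first so we keep receiving vsync callbacks.
        choreographer.postFrameCallback(this);
        // Pass the Choreographer timestamp instead of 0 so Filament can do
        // better frame skipping / dynamic resolution computations.
        if (renderer.beginFrame(swapChain, frameTimeNanos)) {
            renderer.render(view);
            renderer.endFrame();
        }
    }
};
choreographer.postFrameCallback(frameCallback);
```

In other words, you don't reach into SceneView for the timestamp; the Choreographer callback that schedules the frame already receives it and hands it straight to beginFrame.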
Hi @vortice3D, can you share your latest work on this? Did you have luck with Filament 1.7.0 or any later version?
@HemanParbhakar I've updated the example app to do depth occlusion when it's available. https://github.com/zirman/arcore-filament-example-app
@zirman I have tried it, but it does not work. Shall I share a screenshot as well?
@HemanParbhakar Is that using the latest master branch? Not all devices support the Depth API. I would check to see if it's supported.
@zirman Yes, I have the latest branch. My device also supports the Depth API.
https://github.com/zirman/arcore-filament-example-app/blob/master/app/src/main/materials/depth.mat Should it not contain depthWrite: true, as per this: https://github.com/google/filament/issues/3100 ?
@HemanParbhakar depthWrite is true because blending is opaque. https://google.github.io/filament/Materials.html#materialdefinitions/materialblock/rasterization:depthwrite
I have placed the AR object behind the table but it's still visible in the foreground. The device is a OnePlus running Android 10.
@HemanParbhakar It appears to be working correctly. The Depth API isn't accurate all the time; it improves the more you move your camera around. It seems to be occluding his feet, so I would just move the camera some more. You can also try rendering the depth as greyscale to see what the Depth API is giving back. Edit: also, it only gives depth information for stationary objects.
@zirman but why is the object not stationary? I have placed an object somewhere and there is a lot of drifting in the object's position when I turn the camera back to it. This was not happening in Sceneform.
@HemanParbhakar If you are asking how to render wireframes, I'm not sure if there is way to enable wireframe rendering without major changes to the rendering code.
No no. Like a line connecting two anchors.
I have no plans to add lines connecting anchors. You can build a renderable using line primitives: https://github.com/google/filament/blob/56682794d398236c4caa5be40d80acdb73a13bc8/android/filament-android/src/main/java/com/google/android/filament/RenderableManager.java
You can take a look at how detected planes are rendered here as a base: https://github.com/zirman/arcore-filament-example-app/blob/c5fa1bf1c02fcace7489395adc95b74ced09cd30/app/src/main/java/com/example/app/renderer/PlaneRenderer.kt
@zirman thanks for such a great example. But I am confused now, as you have taken a totally different approach. The approach @romainguy mentioned was: first make a .mat file with depthWrite set to true and writeColor set to false, then set the priority like this: Camera > Occluder renderables > Other renderables. You have used RenderableManager but have not set any priority (except the plane, which is 0; but what about the camera). May I know why?
@HemanParbhakar I had not seen that thread before I started. The camera texture is tessellated into a grid of 160x120 squares that are given a position in world space so that the correct depth is written to the depth buffer. Without depth information, the camera texture was just a single triangle that would be on the very edge of clip space. https://github.com/zirman/arcore-filament-example-app/blob/master/app/src/main/materials/depth.mat https://github.com/zirman/arcore-filament-example-app/blob/master/app/src/main/materials/flat.mat
Unless there is a way to specify the value that is written to the depth buffer explicitly in the pixel shader, I don't know how else to do object occlusion.
There may be a way to simplify the equations in the vertex shader by changing the vertexDomain. https://google.github.io/filament/Materials.html#materialdefinitions/materialblock/vertexandattributes:vertexdomain
Edit: Priority doesn't need to be set in this case because the color and depth are written in a single pass.
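For anyone who instead wants to try the occluder-material approach discussed above (depthWrite on, color writes off), a depth-only material in Filament's material definition language could look roughly like this. This is a sketch based on the Filament materials documentation, not code from either project; the material name is made up:

```
material {
    name : depth_occluder,
    shadingModel : unlit,
    blending : opaque,
    // Write depth but no color, so real-world geometry hides virtual objects
    // rendered after it.
    depthWrite : true,
    colorWrite : false
}

fragment {
    void material(inout MaterialInputs material) {
        prepareMaterial(material);
    }
}
```

With a material like this on the occluder geometry, render order (camera, then occluders, then everything else) is what makes the occlusion work.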
@zirman now I get it: you have totally removed Sceneform; you are only using Filament and ARCore.
@zirman so this will only render models? No View (Android layout), no Augmented Images, right?
I am not currently planning on adding any more examples. :-)
@zirman Thank you so much for your awesome engagement. I really appreciate it :)
Wow! This thread is on its way to becoming a real top hit.
Maybe discontinuing Sceneform wasn't such a good idea after all.
Hi @romainguy. Regarding your statement:
Passing 0 to beginFrame will work but it's better to pass the time supplied by Choreographer instead. It will allow much better frame skipping/dynamic resolution computations on the Filament side.
How can the SceneView instance be reached from inside the Renderer in order to use the Choreographer-supplied timestamp in the call to renderer.beginFrame()? By the way, what exact API must be called? Best regards.
Hi @vortice3D, can you share your latest work on this? Did you have luck with Filament 1.7.0 or any later version?
Hi there @nguyenbs:
About the more-than-barebones support for animation under Filament: from here it seems to me that the good people of Google are considering adding a "more complex" approach.
Anyway, I can't see anything about it in the gltfio improvements of the last Filament releases (the 1.9.x series), so I suppose it is still in the oven.
Best regards.
@romainguy: "Seriously?"
From what I can see after 2 weeks trying to integrate only simple things like Instant Placement and Augmented Images in my native Android app (maybe 80% of development cases), the only 2 solutions I have right now are:
My client asked us, "Could you integrate a simple AR section in the existing app?" My answer was, "I've never worked on it, but from what I have read about Google AR technologies, that will be easy."
My client follows Google AR news and told me, "Please integrate the Depth API in the app," but the problem is that I can't even tell him if it's possible.
Developing every possible AR API is great, but now it's time to make them stable and easily usable for us, because Apple did this a long time ago and we have to face the fact that nothing is really ready to put in front of our clients/companies.
Thanks @romainguy, and "great respect for your tremendous work at Google over all these years, even if I'm not yet convinced by Jetpack Compose."
One more little thing.
Beyond the fact that everything is based on very old Java code, when I see things like this in the ARCore samples:
<menu xmlns:android="http://schemas.android.com/apk/res/android" android:layout_width="match_parent" android:layout_height="match_parent">
or
<android.opengl.GLSurfaceView android:id="@+id/surfaceview" android:layout_width="fill_parent" android:layout_height="fill_parent" android:layout_gravity="top"/>
I have serious doubts about the quality of the sample I'm working from.
Where should we begin to work from? Could you give us even a little information about the future of ARCore development?
With the help of @zirman's sample (thank you for the great work) I built a more generic sample which allows placing an object in the air (in front of the camera) and includes Android sensor pose handling to place an object aligned to the north pose.
You basically need two things.
Provide a renderable
    data class Dancer(override val positioningPose: PositioningPose) : Renderable() {
        override val scaling: Float
            get() = 1.0f
        override val assetFileName: String
            get() = "eren-hiphop-dance.glb"
    }
Publish the renderable
SimpleEventBus.publish(Dancer(AirPositioning()))
Since nothing seems to have changed about the status of Sceneform for months, maybe it's time to create a fork. I made one here: https://github.com/ThomasGorisse/sceneform-android-sdk
Anyone who wants to help can contact me. The first goal of this fork is to make Sceneform compatible with the latest versions of ARCore and Filament.
After this, @vortice3D, I've invited you to this repo. Could you try to integrate your work concerning the Filament material part of Renderable, maybe directly inside the Node.java class?
If anyone wants to participate, there's a lot of work to do on glTF file usage, because right now only .glb files are supported.
Hi everybody, I have a problem with Sceneform 1.16 with filament-android:1.7.0 and gltfio-android:1.7.0: when loading a large glTF model (the DamagedHelmet model), it blocks my UI thread until the model finishes loading.
I guess this problem may come from the createFilamentAssetModelInstance() method. How can I fix that?
    void createFilamentAssetModelInstance() {
      if (renderable.getRenderableData() instanceof RenderableInternalFilamentAssetData) {
        Log.d(TAG, "Start Create Filament Asset Model Instance");
        RenderableInternalFilamentAssetData renderableData =
            (RenderableInternalFilamentAssetData) renderable.getRenderableData();
        Engine engine = EngineInstance.getEngine().getFilamentEngine();
        AssetLoader loader =
            new AssetLoader(
                engine,
                RenderableInternalFilamentAssetData.getMaterialProvider(),
                EntityManager.get());
        FilamentAsset createdAsset =
            renderableData.isGltfBinary
                ? loader.createAssetFromBinary(renderableData.gltfByteBuffer)
                : loader.createAssetFromJson(renderableData.gltfByteBuffer);
        if (createdAsset == null) {
          throw new IllegalStateException("Failed to load gltf");
        }
        if (renderable.collisionShape == null) {
          com.google.android.filament.Box box = createdAsset.getBoundingBox();
          float[] halfExtent = box.getHalfExtent();
          float[] center = box.getCenter();
          renderable.collisionShape =
              new Box(
                  new Vector3(halfExtent[0], halfExtent[1], halfExtent[2]).scaled(2.0f),
                  new Vector3(center[0], center[1], center[2]));
        }
        Function<String, Uri> urlResolver = renderableData.urlResolver;
        for (String uri : createdAsset.getResourceUris()) {
          if (urlResolver == null) {
            Log.e(TAG, "Failed to download uri " + uri + " no url resolver.");
            continue;
          }
          Uri dataUri = urlResolver.apply(uri);
          try {
            Callable<InputStream> callable = LoadHelper.fromUri(renderableData.context, dataUri);
            renderableData.resourceLoader.addResourceData(
                uri, ByteBuffer.wrap(SceneformBufferUtils.inputStreamCallableToByteArray(callable)));
          } catch (Exception e) {
            Log.e(TAG, "Failed to download data uri " + dataUri, e);
          }
        }
        renderableData.resourceLoader.loadResources(createdAsset);
        TransformManager transformManager = EngineInstance.getEngine().getTransformManager();
        @EntityInstance int rootInstance = transformManager.getInstance(createdAsset.getRoot());
        @EntityInstance
        int parentInstance = transformManager.getInstance(childEntity == 0 ? entity : childEntity);
        transformManager.setParent(rootInstance, parentInstance);
        filamentAsset = createdAsset;
        Log.d(TAG, "Complete Create !");
      }
    }
Best regards.
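One general way to attack this while the library itself is unfixed: move the blocking byte-shuffling (the resource-download loop above) onto a background executor and only touch the engine once the bytes are in memory. A plain-JVM sketch of the pattern; readAsset is a stand-in for the expensive read, not a Sceneform API:

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class AsyncLoadSketch {
    // Placeholder for the expensive part: reading/inflating the glTF buffers
    // (SceneformBufferUtils.inputStreamCallableToByteArray in the real code).
    static byte[] readAsset() {
        return new byte[1024];
    }

    public static void main(String[] args) {
        ExecutorService io = Executors.newSingleThreadExecutor();

        // Run the blocking I/O off the calling thread.
        CompletableFuture<byte[]> bytes =
            CompletableFuture.supplyAsync(AsyncLoadSketch::readAsset, io);

        // The continuation is where the cheap engine calls would go; the
        // blocking read above never ran on the caller's thread.
        bytes.thenAccept(b -> System.out.println("loaded " + b.length + " bytes"));

        bytes.join();
        io.shutdown();
    }
}
```

In the Sceneform case the Filament calls would still have to happen on the thread that owns the engine, so only the I/O portion moves off the UI thread.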
@phuhuynhh The problem comes from the fact that Filament assets are currently loaded outside of the CompletableFuture of the original Renderable .sfb loading, because they need a Node context in order to be instantiated.
I'm currently working on excluding all .sfb loading from Sceneform and replacing it with CompletableFuture/coroutine loading exclusively through Filament's gltfio.
Instead of working on personal Sceneform evolutions in your own project, if you want, we really need help to improve the Sceneform 2 repo: https://github.com/ThomasGorisse/sceneform-android-sdk
I have currently migrated 80% of the Sceneform Java classes to Kotlin.
Tell me if you want to help and I will invite you to the repo.
@tpsiaki @romainguy I really need the original LightingDef from sceneform_default_light_probe.sfb, as we don't have the .sfa file available in the archived repo. It seems to be the last .sfb needed to run an empty Sceneform project.
Thanks
Is the SFB required? You could create your own Filament IndirectLight instead.
@romainguy True, that's what I'm doing, but I'm missing all the IndirectLight default properties defined in sceneform_default_light_probe.sfb.
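For anyone else going this route: Filament's Java API can build an IndirectLight from a prefiltered specular cubemap plus spherical-harmonics irradiance. A hedged sketch, where engine, scene, reflectionsTexture and sh are your own objects and the intensity is just a common starting value, not the Sceneform default being asked about:

```java
// Sketch: a hand-built IndirectLight replacing the default light probe.
IndirectLight indirectLight = new IndirectLight.Builder()
    .reflections(reflectionsTexture)   // prefiltered specular cubemap
    .irradiance(3, sh)                 // 3 SH bands = 9 RGB coefficients
    .intensity(30_000.0f)              // in lux; tune to match your scene
    .build(engine);
scene.setIndirectLight(indirectLight);
```

The cubemap and SH coefficients are typically generated offline with Filament's cmgen tool from an HDR environment.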
Now that Sceneform has been abandoned*, will ARCore be modified to be easier to adopt for users without a 3D background?
*SceneForm repo now archived (read only); users unable to