Open luislukas opened 4 years ago
The issue seems to be solved by setting this property rather than using the default priority:

```kotlin
sceneView.cameraStreamRenderPriority = Renderable.RENDER_PRIORITY_FIRST
```

I guess this can be closed. It'd be good to know what the property does. Is it just allocating more resources to render images/nodes, etc.?
UPDATE: I think I was too optimistic; it happened again as soon as I made the ImageView 100x100 pixels in size. If I keep them around 50x50 it can handle more.
I'm inflating a basic layout with an `ImageView` using the `ViewRenderable.builder().setView(...)` approach. My use case needs to fetch the images from a server, so I'm using `Glide` for this purpose. After that, I can successfully place the view in a `Node` and see it in the `SceneView`. The problem comes when doing the same with multiple views - I always end up with a low frame rate and eventually the following crash:

I'm using the `ArFragment` approach and Sceneform version `1.13.0`, in case it helps. I've tested with different image sizes, and the bigger the image, the sooner the crash happens. Also, if the images are less heavy - a few hundred KB - then I need to add more of them for the crash to happen (probably around 10). I found the following issue, now closed, that I think could be related (although it seems we have different use cases): https://github.com/google-ar/sceneform-android-sdk/issues/181

Basically, my question then would be: are we meant to just use textures etc. and not images (PNG, JPG, etc.)?