SceneView / sceneform-android

Sceneform Maintained is an ARCore Android SDK that uses Google Filament as its 3D engine. It is the continuation of the archived Sceneform SDK.
https://sceneview.github.io/sceneform-android/
Apache License 2.0

I called the reclaimReleasedResources() method, but the memory is not released. Help me! #75

Closed steven-gao closed 2 years ago

steven-gao commented 3 years ago

I added a line of code in SceneView.destroy() as follows:

/**
   * Required to exit Sceneform.
   *
   * <p>Typically called from onDestroy().
   */
  public void destroy() {
    if (renderer != null) {
      // Added: try to reclaim released rendering resources before disposing the renderer.
      reclaimReleasedResources();
      renderer.dispose();
      renderer = null;
    }
  }

But in the Android Studio Profiler window I see that the memory is not released. Any suggestions? Thanks for your time.

RGregat commented 3 years ago

I noticed an increase in my memory footprint on multiple starts and stops as well. Currently I call mArFragment.getArSceneView().getSession().close(); in my onDestroy function, because I noticed that it takes a while for the Session to get destroyed if I let the system do it.

Do you have a graph to show? I also used the new performance debug monitor to get some vitals during a session. What I noticed there is a constant increase of my Java memory until it drops and increases again. The native memory is pretty stable.

ThomasGorisse commented 3 years ago

Hi,

I noticed an increase in my memory footprint on multiple starts and stops as well. Currently I call mArFragment.getArSceneView().getSession().close(); in my onDestroy function, because I noticed that it takes a while for the Session to get destroyed if I let the system do it.

You are right: session.close() isn't called yet inside ArFragment.onDestroy(), even though session.pause() is handled. I'll double-check that there is no issue when starting a session again. @RGregat, did you encounter any issue with this in your tests?

And it's true that the native and Java memory increase while using the SDK without moving the phone. I don't know yet whether it comes from ARCore collecting data or from something else like the plane renderer. (screenshot)
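The pause-then-close teardown order discussed in this thread can be sketched without Android dependencies. A minimal sketch: the ArSession interface below is a stand-in for illustration only, not the real com.google.ar.core.Session API; in the app the actual call chain is the mArFragment.getArSceneView().getSession().close() quoted above.

```java
import java.util.ArrayList;
import java.util.List;

public class TeardownOrder {
    // Stand-in for the session surface used in this thread;
    // illustration only, not the real ARCore Session class.
    interface ArSession {
        void pause();
        void close();
    }

    // Pause first (stops camera/GL work), then close (frees native
    // resources) -- the order discussed in this thread.
    static void tearDown(ArSession session) {
        if (session == null) {
            return;
        }
        session.pause();
        session.close();
    }

    public static void main(String[] args) {
        List<String> calls = new ArrayList<>();
        tearDown(new ArSession() {
            @Override public void pause() { calls.add("pause"); }
            @Override public void close() { calls.add("close"); }
        });
        System.out.println(calls); // prints [pause, close]
    }
}
```

Pausing before closing mirrors what ArFragment already does for pause; the explicit close is the part that is currently missing.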

steven-gao commented 3 years ago

Hi,

I noticed an increase in my memory footprint on multiple starts and stops as well. Currently I call mArFragment.getArSceneView().getSession().close(); in my onDestroy function, because I noticed that it takes a while for the Session to get destroyed if I let the system do it.

You are right: session.close() isn't called yet inside ArFragment.onDestroy(), even though session.pause() is handled. I'll double-check that there is no issue when starting a session again. @RGregat, did you encounter any issue with this in your tests?

And it's true that the native and Java memory increase while using the SDK without moving the phone. I don't know yet whether it comes from ARCore collecting data or from something else like the plane renderer. (screenshot)

Do you mean that the problem is not a Sceneform problem but an ARCore one?

steven-gao commented 3 years ago

@ThomasGorisse My current problem is how to release the memory immediately when I close the AR page, so that the next time I display the AR model the memory will not overflow. Do you have any good ideas?

steven-gao commented 3 years ago

I noticed an increase in my memory footprint on multiple starts and stops as well. Currently I call mArFragment.getArSceneView().getSession().close(); in my onDestroy function, because I noticed that it takes a while for the Session to get destroyed if I let the system do it. Do you have a graph to show? I also used the new performance debug monitor to get some vitals during a session. What I noticed there is a constant increase of my Java memory until it drops and increases again. The native memory is pretty stable.

I called getArSceneView().getSession().close() in my onDestroy() function, but the memory is not released. (screenshot: memory not released)

ThomasGorisse commented 3 years ago

Hi,

I noticed an increase in my memory footprint on multiple starts and stops as well. Currently I call mArFragment.getArSceneView().getSession().close(); in my onDestroy function, because I noticed that it takes a while for the Session to get destroyed if I let the system do it.

You are right: session.close() isn't called yet inside ArFragment.onDestroy(), even though session.pause() is handled. I'll double-check that there is no issue when starting a session again. @RGregat, did you encounter any issue with this in your tests?

And it's true that the native and Java memory increase while using the SDK without moving the phone. I don't know yet whether it comes from ARCore collecting data or from something else like the plane renderer. (screenshot)

Do you mean that the problem is not a Sceneform problem but an ARCore one?

No, I don't know where the problem comes from.

RGregat commented 3 years ago

This is how I currently close the session:

@Override
protected void onPause() {
     super.onPause();
     if (isFinishing()) {
         LoggerHelper.showLog(Log.DEBUG, "ArActivity", "finish");
         onStop();
         mController.onDestroy();
     }
 }

And in my Controller I have the following Code

void onDestroy() {
    Optional.ofNullable(mArFragment.getArSceneView().getSession())
            .ifPresent(Session::close);
}

The reason for using the onPause function is that the onDestroy function is called with a delay; it can take a few seconds after the Activity is already closed. At least that is what I noticed for my app and use case. So far I have had no trouble doing it that way. I only noticed a warning when the original process tries to close the session, but it doesn't break anything on my side, and I have noticed fewer crashes on multiple starts.
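Because onDestroy can arrive seconds later (and the manual onStop()/onDestroy calls above mean the close may be attempted twice), making the close idempotent avoids double-closing the session. A small sketch assuming nothing beyond the JDK; CloseGuard is a hypothetical helper, not a Sceneform or ARCore class:

```java
import java.util.concurrent.atomic.AtomicBoolean;

// Hypothetical helper (not a Sceneform or ARCore class): runs a close
// action exactly once, however many lifecycle callbacks invoke it.
public class CloseGuard {
    private final AtomicBoolean closed = new AtomicBoolean(false);
    private final Runnable onClose;

    public CloseGuard(Runnable onClose) {
        this.onClose = onClose;
    }

    public void close() {
        // compareAndSet ensures the action runs only on the first call,
        // even if onPause and onDestroy race on different threads.
        if (closed.compareAndSet(false, true)) {
            onClose.run();
        }
    }
}
```

With this, the Activity would wrap session.close() in one guard and call guard.close() from both onPause (when isFinishing()) and onDestroy; the second call becomes a no-op instead of a double close.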

To monitor the performance, including memory, I'm using the new Performance Overlay: https://developers.google.com/ar/develop/c/debugging-tools/performance-overlay

What I noticed here is a relatively constant native memory, but the Java memory looks strange to me: it starts very low, climbs constantly, and then after a while falls back to a very low value. That process repeats through the whole session. Maybe it is normal and part of the Depth API?

For the Memory Profiler graph you showed: on my end I have the same issue, that on every start the footprint is a little higher than before. Only hard-closing the app and reopening it solves the problem. But I haven't investigated this enough for now.

ThomasGorisse commented 3 years ago

I noticed an increase in my memory footprint on multiple starts and stops as well. Currently I call mArFragment.getArSceneView().getSession().close(); in my onDestroy function, because I noticed that it takes a while for the Session to get destroyed if I let the system do it. Do you have a graph to show? I also used the new performance debug monitor to get some vitals during a session. What I noticed there is a constant increase of my Java memory until it drops and increases again. The native memory is pretty stable.

I called getArSceneView().getSession().close() in my onDestroy() function, but the memory is not released. (screenshot: memory not released)

You can also ask the question on the ARCore repo.

ThomasGorisse commented 3 years ago

@ThomasGorisse My current problem is how to release the memory immediately when I close the AR page, so that the next time I display the AR model the memory will not overflow. Do you have any good ideas?

Please put your code here so we can help.

steven-gao commented 3 years ago

@ThomasGorisse My current problem is how to release the memory immediately when I close the AR page, so that the next time I display the AR model the memory will not overflow. Do you have any good ideas?

Please put your code here so we can help.

OK. My code mainly consists of an Activity and a Nodes subclass of TransformableNode. Sorry the code is a bit long, but the structure is clear. Please help me check whether the code is leaking memory. My AR page is named ARSceneActivity:

class ARSceneActivity : BaseActivity(), OnClickListener {

var arFragment: ArFragment? = null
var arSceneView: ArSceneView? = null

private val coordinator by lazy { Coordinator(this, ::onArTap, ::onNodeSelected, ::onNodeFocused) }

var modelList: ArrayList<ArModel> = arrayListOf()
var bannerList: ArrayList<ArBanner> = arrayListOf()
var videoNodeList = arrayListOf<ArVideo>()
var videoBeanList = arrayListOf<ArModel>()
var phoenixList = arrayListOf<PhoenixModel>()
var nodeMap = hashMapOf<String, Nodes>()

var videoLocalPathMap = hashMapOf<String, String>()

var modelLocalPathMap = hashMapOf<String, String>()

val mHandler: Handler = object : Handler() {
    override fun handleMessage(msg: Message) {

        if (msg.what == AR_INIT_COMPLETE) {
            WHQuicmoToARworld.getInstance().setMC0(arSceneView!!.arFrame!!.camera)
            arSceneView?.let {
                /** start to request server data */
                requestData()
            }
            /** when frame is null ,then go on */
        } else if (msg.what == AR_FRAME_NULL) {
            postDelayed(runnable, 500)
            /** when init not finish ,then go on */
        } else if (msg.what == AR_INITING) {
            postDelayed(runnable, 500)

        }else ….
    }
}

override fun initWidget(savedInstanceState: Bundle?) {
    arFragment = supportFragmentManager.findFragmentById(R.id.ux_fragment) as ArFragment?
    arFragment?.getArSceneView()?.planeRenderer?.isVisible = false
    arSceneView = arFragment?.arSceneView
    arSceneView!!.setZOrderMediaOverlay(true)
    …
    initAr()

mHandler.post(runnable)
…
}

override fun getContentView(): Int {
    return R.layout.activity_scene
}

override fun onCreate(savedInstanceState: Bundle?) {
    super.onCreate(savedInstanceState)

initWidget(savedInstanceState)
initData(savedInstanceState)

  }

override fun initData(savedInstanceState: Bundle?) {
    var bundle = intent.extras

}

/**
 * get server data
 */
fun requestData() {

    if (sceneState == 1 || sceneState == 2) {
        bindScene()
    } else {
        getDefaultScene()
    }
}

/**
 *  get scene info
 * @param sceneId String
 */
fun getSceneInfo(sceneId: String) {
    …..
            showModels(modelList)
         …… 

}

/**
 * poll the AR frame until ARCore tracking is ready
 */
var runnable = Runnable {
    var frame = arSceneView?.arFrame
    if (frame == null) {
        mHandler.sendEmptyMessage(AR_FRAME_NULL)
    } else {
        if (frame.camera.trackingState == TrackingState.TRACKING) {
            mHandler.sendEmptyMessage(AR_INIT_COMPLETE)
        } else {
            mHandler.sendEmptyMessage(AR_INITING)
        }
    }
}

/**
 * down video file
 * @param url String
 * @param videoModel ArModel
 */
fun downVideo(url: String, videoModel: ArModel) {
    ...
            showVideo(task?.targetFilePath!!, videoModel)
   … 

}

/**
 * down glb  file
 * @param url String
 * @param videoModel ArModel
 */
fun downArModel(url: String, arModel: ArModel) {
…
              showAnimals(arModel)
   …
}

/**
 * get default scene 
 */
fun getDefaultScene() {
…
            showModels(responseData?.data.get(0).arModels)
   …

        }

/**
 * show all model in my arSceneview
 * @param list List<ArModel>
 */
fun showModels(list: List<ArModel>) {

    showAxiesAtOriginal()

       list.forEach {
        when (it.materialType) {
            ShareData.TYPE_BRAND -> {
                showBrand(it)

            }
            ShareData.TYPE_BANNER -> {
                showLoopPic(it)
                refreshProgress()
            }
            ShareData.TYPE_BALL -> {
                showARBall(it)

            }
            ShareData.TYPE_ARMODEL -> {
                armodelPath = it.androidDownloadUrl
            }
            ShareData.TYPE_VIDEO -> {
                videoBeanList.add(it)
                videoPath = it.urlArr.get(0)
            }
            ShareData.TYPE_VIDEO_GREEN -> {
                videoPath = it.androidDownloadUrl
                videoBeanList.add(it)
            }
        }

        if (!videoPath.isNullOrEmpty()) {
            downVideo(videoPath!!, it)
            videoPath = ""
        }

        if (!armodelPath.isNullOrEmpty()) {
            //Log.d("steven", "armodelPath:" + armodelPath)
            downArModel(armodelPath!!, it)
            armodelPath = ""
        }
    }
    coordinator.selectNode(null)
}

/**
 * every frame listener
 */
private fun onArUpdate() {

    if (hasPickUp) {
        calculateMoveMatrix(selectedNode!!)
    }
    for (phoenix in phoenixList) {
        phoenix.doGlbAnimation()
    }
}

/**
 * show ar model
 * @param model ArModel
 */
fun showAnimals(model: ArModel) {
    val anchorNode = AnchorNode()
    var matrix = getArModelMatrix(model)
    anchorNode.worldPosition = getModelShowPosition(matrix)
    var phoenixModel = PhoenixModel(mHandler, this, coordinator, model)
    nodeMap.put(model.arModelId, phoenixModel)

    phoenixList.add(phoenixModel)
    phoenixModel?.let {
        val anchorNode = AnchorNode()
        var matrix = getArModelMatrix(model)
        anchorNode.worldPosition = getModelShowPosition(matrix)

        it.attach(anchorNode, arSceneView?.scene!!)
        it.worldScale = getModelShowScale(matrix)
        it.worldRotation = getModelShowRotation(it.worldScale, matrix)

    }
}

/**
 * show video model 
 * @param path String
 * @param videoBean ArModel
 */
fun showVideo(path: String, videoBean: ArModel) {

   …
    val anchorNode = AnchorNode()
    var destMatrxi = getArModelMatrix(videoBean!!)
    anchorNode.worldPosition = getModelShowPosition(destMatrxi)
    var videoNode = ArVideo(this, coordinator, videoBean!!, path)
    videoNodeList.add(videoNode)
    nodeMap.put(videoBean!!.arModelId, videoNode!!)
    var scale = getModelShowScale(destMatrxi)

    videoNode?.let {
        it.attach(anchorNode, arSceneView?.scene!!)

       it.worldRotation = getModelShowRotation(scale, destMatrxi)

        it.worldScale = Vector3(videoWidth.toFloat() * scale.x, videoHeight.toFloat() * scale.y, 1.0f)
    }
}

/**
 * show the big ball
 * @param model ArModel
 */
fun showARBall(model: ArModel) {
    detailDialog = BallFaceDialog(this)
    ball = model
    config!!.objCylinderAdapter = ObjCylinderAdapter(this, initBallFaceData(model))
    val msg = mHandler.obtainMessage()
    msg.what = SHOW_BALL
    mHandler.sendMessageDelayed(msg, 500)
}

/**
 * show the axis model at the origin
 */
fun showAxiesAtOriginal() {

        val anchorNode = AnchorNode()
        var matrix = WHQuicmoToARworld.getInstance().quicmoToARworld(trans2SceneFormMatrix(matrix2Transform(Matrix())))
        anchorNode.worldPosition = getModelShowPosition(matrix)
        var axiesModel = AxisModel(this, coordinator, ArModel())
        axiesModel?.let {
            it.attach(anchorNode, arSceneView?.scene!!)
            it.worldScale = Vector3.one().scaled(0.1f)
            it.worldRotation = getModelShowRotation(it.worldScale, matrix)
        }

}

/**
 * show looper pic model
 * @param model ArModel
 */
fun showLoopPic(model: ArModel) {
    val anchorNode = AnchorNode()
    var matrix = getArModelMatrix(model)
    anchorNode.worldPosition = getModelShowPosition(matrix)

    var pic = ArBanner(this, coordinator, model)
    nodeMap.put(model.arModelId, pic)

    pic?.let {
        it.attach(anchorNode, arSceneView?.scene!!)
        it.worldScale = getModelShowScale(matrix)

        if (!model.isHorizontal) {
            it.worldRotation = getModelShowRotation(it.worldScale, matrix)

        }
    }
    bannerList.add(pic)
}

/**
 * show brand model
 * @param model ArModel
 */
fun showBrand(model: ArModel) {
    val anchorNode = AnchorNode()
    var matrix = getArModelMatrix(model)
    anchorNode.worldPosition = getModelShowPosition(matrix)
    var brand = ShopBrand(this, coordinator, model)
    nodeMap.put(model.arModelId, brand)
    brand?.let {
        it.attach(anchorNode, arSceneView?.scene!!)
        it.worldScale = getModelShowScale(matrix)

        if (!model.isHorizontal) {
            it.worldRotation = getModelShowRotation(it.worldScale, matrix)

        }
    }
}

override fun onResume() {
    super.onResume()
    try {
        arSceneView?.resume()

        ballNode?.let {
            it.startAnimation()
        }
    } catch (ex: CameraNotAvailableException) {
    }
}

override fun onPause() {
    super.onPause()
    arSceneView?.pause()
    ballNode?.let {
        it.stopAnimation()
    }
}

override fun onDestroy() {
    super.onDestroy()
    if (isPlay) {
        detail_player.getCurrentPlayer().release()
    }
    if (orientationUtils != null) orientationUtils!!.releaseListener()

    bannerList.forEach {
        it.releaseBanner()
    }

    for (videoNode in videoNodeList) {
        videoNode?.let {
            it.releaseVideo()
        }
    }

    ballNode?.let {
        it.stopAnimation()
    }
    for (armodel in phoenixList) {
        armodel?.let {
            it.stopAnimation()
        }
    }

    nodeMap.forEach {
        it.value.detach()
    }
    arSceneView?.destroy()

    // ArSceneView.destroyAllResources()
    // ArSceneView.destroyAllResourceExceptEngine()
}

/**
 * init AR 
 */
private fun initAr() {
    arSceneView?.scene?.let {
        it.addOnUpdateListener {
            onArUpdate()
        }
        it.addOnPeekTouchListener { hitTestResult, motionEvent ->
            coordinator.onTouch(hitTestResult, motionEvent)
        }
    }
}

/**
 * 
 * @param node Nodes?
 */
fun onNodeFocused(node: Nodes?) {
….
….

}

fun onNodeSelected(old: Nodes? = coordinator.selectedNode, new: Nodes?) {

}

private fun onArTap(motionEvent: MotionEvent) {
…..        
}

fun checkIsSupportedDeviceOrFinish(activity: BaseActivity): Boolean {
    if (Build.VERSION.SDK_INT < Build.VERSION_CODES.N) {
     ….
        return false
    }
    val openGlVersionString = (activity.getSystemService(Context.ACTIVITY_SERVICE) as ActivityManager).deviceConfigurationInfo.glEsVersion
    if (java.lang.Double.parseDouble(openGlVersionString) < 3.0) {
       ….
        return false
    }

    var available = ArCoreApk.getInstance().checkAvailability(this);
    //not support AR
    if (available != ArCoreApk.Availability.SUPPORTED_INSTALLED) {
        ….
             }

    return true
}

}

package com.whyhow.sucailib.ar

sealed class Nodes(activity: ARSceneActivity, name: String, coordinator: Coordinator, model: ArModel) : TransformableNode(coordinator), ViewSizer {

var model: ArModel
var activity: ARSceneActivity
var axiesModel: AxisModel? = null

var anchor: AnchorNode? = null

var animatorPosition: Vector3 = Vector3.zero()
    set(value) {
        field = value
        worldPosition = value
    }
    get() = field

companion object {
    private val IDS: MutableMap<KClass<*>, AtomicLong> = mutableMapOf()
    fun Any.newId(): Long = IDS.getOrElse(this::class, { AtomicLong().also { IDS[this::class] = it } }).incrementAndGet()
}

fun getWidthMeterMap(): Int {
    var displayMetrics = Resources.getSystem().getDisplayMetrics();
    return (model.materialWidth * (displayMetrics.xdpi / 160.0f) * 250).toInt()
}

fun getHeightMeterMap(): Int {
    var displayMetrics = Resources.getSystem().getDisplayMetrics();
    return (model.materialHeight * (displayMetrics.xdpi / 160.0f) * 250).toInt()
}

override fun getSize(view: View?): Vector3 {
    return Vector3(model.materialWidth.toFloat(), model.materialHeight.toFloat(), 0.01f)
}

init {
    this.activity = activity
    this.name = name
    this.model = model
    rotationController.isEnabled = false
    scaleController.isEnabled = false
    translationController.isEnabled = false
    @Suppress("LeakingThis")
    model?.let {
        if (model.isLookCamera) rotationController.isEnabled = false
    }

    addTransformChangedListener(object : TransformChangedListener {
        override fun onTransformChanged(p0: Node?, p1: Node?) {
            var isVideo = model.materialType == ShareData.TYPE_VIDEO || model.materialType == ShareData.TYPE_VIDEO_GREEN
            if (!isVideo) {
                syncAxis()
            }
        }
    })
}

var onNodeUpdate: ((Nodes) -> Any)? = null

override fun getTransformationSystem(): Coordinator = super.getTransformationSystem() as Coordinator

override fun setRenderable(renderable: Renderable?) {
    if(renderable == null) return
    super.setRenderable(renderable?.apply {

        isShadowCaster = false
        isShadowReceiver = false
    })
}

override fun onUpdate(frameTime: FrameTime) {
    onNodeUpdate?.invoke(this)
    model?.let {
        if (model.isLookCamera) {
            facingCamera()
        }
    }
}

open fun facingCamera() {

        if (isTransforming) return /*Prevent infinite loop*/
        val camera = scene?.camera ?: return
        var direction: Vector3? = null
        var isVideo = model.materialType == ShareData.TYPE_VIDEO || model.materialType == ShareData.TYPE_VIDEO_GREEN
        var localVideoRotation = getModelVideoRotation(model, activity)
        if (isVideo && localVideoRotation % 180 != 0) {

            direction = Vector3.subtract(worldPosition, camera.worldPosition)
            direction.y = 0f
            var yawPhone = Utils.Quaternion2Eular(Quaternion.lookRotation(direction, Vector3.up())).y
            Log.d("steven", "yawPhone: " + yawPhone)
            if(this is AxisModel){
                direction = Vector3.subtract(worldPosition, camera.worldPosition)

                direction.y = 0f
                worldRotation = Quaternion.lookRotation(direction, Vector3.up())

            }else{
                direction = Vector3.subtract(worldPosition, camera.worldPosition)

                direction.y = 0f
                worldRotation = Quaternion.lookRotation(direction, Vector3.right())
            }
        } else {
            direction = Vector3.subtract(worldPosition, camera.worldPosition)

            direction.y = 0f
            worldRotation = Quaternion.lookRotation(direction, Vector3.up())
        }

}

open fun attach(anchor: AnchorNode, scene: Scene, focus: Boolean = false) {
    this.anchor = anchor
    setParent(anchor.apply { setParent(scene) })

    if (focus) {
        transformationSystem.focusNode(this)
    }
}

open fun detach() {
    if (this == transformationSystem.selectedNode) {
        transformationSystem.selectNode(null)
    }
    (parent as? AnchorNode)?.anchor?.detach()
    setParent(null)
    renderable = null
}

fun addAxis() {
    if (model.materialType != ShareData.TYPE_BALL) {
        if (axiesModel == null) {
            axiesModel = AxisModel(activity, getTransformationSystem(), model)
        }
        axiesModel?.attach(anchor!!, activity.arSceneView!!.scene)

        if (model.materialType == ShareData.TYPE_BALL || model.materialType == ShareData.TYPE_ARMODEL) {

        } else {
            var widthScale = model.materialWidth / 6f
            var heightScale = model.materialHeight / 6f
            var localVideoRotation = getModelVideoRotation(model, activity)
            if (localVideoRotation % 180 != 0) {
                axiesModel!!.worldScale = Vector3((heightScale + 0.05).toFloat(), (widthScale + 0.05).toFloat(), 0.05f)
            } else {
                axiesModel!!.worldScale = Vector3((widthScale + 0.05).toFloat(), (heightScale + 0.05).toFloat(), 0.05f)
            }
        }
        syncAxis()
    }
}

fun syncAxis() {
    if (model.materialType != ShareData.TYPE_BALL) {
        axiesModel?.let {
            if (model.materialType == ShareData.TYPE_VIDEO || model.materialType == ShareData.TYPE_VIDEO_GREEN) {
                var localVideoRotation = getModelVideoRotation(model, activity)
                if (localVideoRotation % 180 != 0) {
                    var videoOffsetRotation = Quaternion.axisAngle(Vector3(0f, 0f, 1f), localVideoRotation.toFloat())
                    it.worldRotation = Quaternion.multiply(worldRotation, videoOffsetRotation)
                    it.worldPosition = worldPosition
                } else {
                    it.worldRotation = worldRotation
                    it.worldPosition = worldPosition
                }
            } else {
                it.worldRotation = worldRotation
                it.worldPosition = worldPosition
            }
        }
    }
}

fun removeAxis() {
    axiesModel?.detach()
    axiesModel = null
}

override fun onTap(hitTestResult: HitTestResult?, motionEvent: MotionEvent?) {
    super.onTap(hitTestResult, motionEvent)
    if (isTransforming) return // ignored when dragging over a small distance
    if (name != "AxisModel") {
        transformationSystem.focusNode(this)
    }
}

override fun equals(other: Any?): Boolean {
    if (this === other) return true
    if (javaClass != other?.javaClass) return false

    other as Nodes

    if (model != other.model) return false

    return true
}

override fun hashCode(): Int {
    return model.hashCode()
}

}

class AxisModel(context: ARSceneActivity, coordinator: Coordinator, model: ArModel) : Nodes(context, "AxisModel", coordinator, model), Footprint.Invisible {

    init {
        activity = context
        ModelRenderable.builder()
                .setSource(activity, R.raw.axis)
                .setIsFilamentGltf(true)
                .build()
                .thenAccept { renderable -> this.renderable = renderable }
        collisionShape = null
    }
}

class ShopBrand( context: ARSceneActivity, coordinator: Coordinator, model: ArModel ) : Nodes(context, "ShopBrand", coordinator, model), Footprint.Invisible {

init {
    this.name = "ShopBrand"
    var view = LayoutInflater.from(context).inflate(R.layout.renderable_ar_brand, null)
    var logo = view.findViewById<ImageView>(R.id.logo)

    GlideLoadUtils.getInstance().load(context, model.materialThumbnail, logo)
    ViewRenderable.builder()
            .setView(context, view)
            .setVerticalAlignment(ViewRenderable.VerticalAlignment.CENTER)
            .build()
            .thenAccept {
                renderable = it
                it.view.layoutParams.width = getWidthMeterMap()
                it.view.layoutParams.height = getHeightMeterMap()
            }
}

}

class ArBanner(context: ARSceneActivity, coordinator: Coordinator, model: ArModel) : Nodes(context, "ArBanner", coordinator, model), Footprint.Invisible {

    var banner: Banner<ArModel, BannerImageAdapter>

init {
    this.model = model
    var view = LayoutInflater.from(context).inflate(R.layout.renderable_ar_looppic, null)
    banner = view.findViewById(R.id.banner)
    banner.removeIndicator()
    banner.setLoopTime(4000)

    var adapter = BannerImageAdapter(model.urlArr)
    banner?.let {
        it.addBannerLifecycleObserver(activity)
        it.setUserInputEnabled(false)
        it.setAdapter(adapter, true)
        if (model.urlArr.size == 1) {
            it.isAutoLoop(false)
        }
    }

    ViewRenderable.builder()
            .setVerticalAlignment(ViewRenderable.VerticalAlignment.CENTER)
            .setView(context, view)
            .build()
            .thenAccept {
                renderable = it
                it.view.layoutParams.width = getWidthMeterMap()
                it.view.layoutParams.height = getHeightMeterMap()
            }
}

fun releaseBanner() {
    banner.stop()
}

}

@RequiresApi(Build.VERSION_CODES.O)
class ArVideo(context: ARSceneActivity, coordinator: Coordinator, model: ArModel, localPath: String) : Nodes(context, "Video", coordinator, model), Footprint.Invisible {

    var mediaPlayer: MediaPlayer? = null
    val texture = ExternalTexture()
    var uri: Uri
    var context: ARSceneActivity

    init {
        this.context = context
        uri = getUriForFileKotlin(context, BuildConfig.APPLICATION_ID, File(localPath))
        if (localPath.isNullOrEmpty()) {
            context.showToast("Video path is empty")
            context.whenLoadVideoFail()
        } else {
            try {
                mediaPlayer = MediaPlayer.create(context, uri)
                if (mediaPlayer != null) {
                    mediaPlayer!!.isLooping = true
                    mediaPlayer!!.setSurface(texture.surface)

                    Material.builder()
                            .setSource(context, R.raw.sceneform_chroma_key_material)
                            .build()
                            .thenAccept { material: Material? ->
                                val cube = ShapeFactory.makePlane(1f, 1f, material)
                                cube.material?.setExternalTexture("videoTexture", texture)    
                                cube.material?.setFloat4("keyColor", Color(0.1843f, 1.0f, 0.098f))       
                                if (model.materialType == ShareData.TYPE_VIDEO) {
                                    cube.material.setBoolean("disableChromaKey", true);
                                } else {
                                    cube.material.setBoolean("disableChromaKey", false);
                                }

                                if (!mediaPlayer!!.isPlaying) {
                                    if(TDevice.getAndroidSDKVersion() >= android.os.Build.VERSION_CODES.O){
                                        mediaPlayer!!.seekTo(500,MediaPlayer.SEEK_CLOSEST)
                                    }else{
                                        mediaPlayer!!.seekTo(500)
                                    }

                                    texture.surfaceTexture.setOnFrameAvailableListener {
                                        renderable = cube
                                        texture.surfaceTexture.setOnFrameAvailableListener(null)
                                    }
                                } else {
                                    renderable = cube
                                }
                            }
                } else {
                    context.whenLoadVideoFail()
                    ToastUtils.showLongToast(activity, "Video format not supported")
                }
            }catch (e: Exception){
                context.whenLoadVideoFail()
                e.printStackTrace()
                ToastUtils.showLongToast(activity, "Video format not supported")
            }
    }
}

fun releaseVideo() {
    try{
        mediaPlayer?.stop()
        mediaPlayer?.release()
    }catch (e:Exception){
        e.printStackTrace()
    }

}

fun isPlaying():Boolean{
    if(mediaPlayer !=null){
        try{
            return mediaPlayer!!.isPlaying
        }catch (e:Exception){
            return false
        }
    }else{
        return false
    }
}

fun pause() {
    mediaPlayer?.let {
        if (it.isPlaying) {
            it.pause()
        }
    }
}

fun startPlay() {
    mediaPlayer?.let {
        it.start()
    }
}

}

class ArBall(activity: ARSceneActivity, coordinator: Coordinator, model: ArModel) : Nodes(activity, "ArBall", coordinator, model), Footprint.Invisible {

    var rotationAnimation: ObjectAnimator? = null
    var degreesPerSecond = 10.0f
    val speedMultiplier = 1.0f

private fun getAnimationDur(): Long {
    return (1000 * 360 / (degreesPerSecond * speedMultiplier)).toLong()
}

fun startAnimation() {
    if (rotationAnimation != null) {
        return
    }
    rotationAnimation = createAnimator()
    rotationAnimation!!.target = this
    rotationAnimation!!.duration = getAnimationDur()
    rotationAnimation!!.start()
}

fun stopAnimation() {
    if (rotationAnimation == null) {
        return
    }
    rotationAnimation!!.cancel()
    rotationAnimation = null
}

private fun createAnimator(): ObjectAnimator {

    val orientation1 = Quaternion.axisAngle(Vector3(0.0f, 1.0f, 0.0f), 0f)
    val orientation2 = Quaternion.axisAngle(Vector3(0.0f, 1.0f, 0.0f), 120f)
    val orientation3 = Quaternion.axisAngle(Vector3(0.0f, 1.0f, 0.0f), 240f)
    val orientation4 = Quaternion.axisAngle(Vector3(0.0f, 1.0f, 0.0f), 360f)
    val rotationAnimation = ObjectAnimator()
    rotationAnimation.setObjectValues(orientation1, orientation2, orientation3, orientation4)

    rotationAnimation.setPropertyName("localRotation")

    rotationAnimation.setEvaluator(QuaternionEvaluator())

    rotationAnimation.repeatCount = ObjectAnimator.INFINITE
    rotationAnimation.repeatMode = ObjectAnimator.RESTART
    rotationAnimation.interpolator = LinearInterpolator()
    rotationAnimation.setAutoCancel(true)
    return rotationAnimation
}

}

class PhoenixModel(handler: Handler, activity: ARSceneActivity, coordinator: Coordinator, model: ArModel) : Nodes(activity, "PhoenixModel", coordinator, model) {

var modelAnimation: ObjectAnimator? = null
var repeatCount = 0
var handler:Handler
var animCount:Int = 0

private var animators: HashSet<AnimationInstance> = hashSetOf()

val colors = Arrays.asList(
        Color(0f, 0f, 0f, 1f),
        Color(1f, 0f, 0f, 1f),
        Color(0f, 1f, 0f, 1f),
        Color(0f, 0f, 1f, 1f),
        Color(1f, 1f, 0f, 1f),
        Color(0f, 1f, 1f, 1f),
        Color(1f, 0f, 1f, 1f),
        Color(1f, 1f, 1f, 1f))
var nextColor = 0

class AnimationInstance internal constructor(animator: Animator, index: Int, startTime: Long) {
    var animator: Animator
    var startTime: Long
    var duration: Float
    var index: Int

    init {
        this.animator = animator
        this.startTime = startTime
        duration = this.animator.getAnimationDuration(index)
        this.index = index
    }
}

init {
    this.handler = handler

    var localPath = activity.modelLocalPathMap.get(model.androidDownloadUrl)
    var pathUri  = Uri.parse(localPath)
    ModelRenderable.builder().setSource(activity, pathUri)
            .setIsFilamentGltf(true).build().thenAccept(
                    { modelRenderable ->
                        renderable = modelRenderable
                        Log.d("steven", "renderable.submeshCount:" + renderable!!.submeshCount)
                        val filamentAsset = renderableInstance!!.filamentAsset
                        animCount = filamentAsset!!.animator.animationCount
                        if (animCount > 0) {
                            for (i in 0 until animCount) {
                                animators.add(AnimationInstance(filamentAsset.animator, i, System.nanoTime()))
                            }
                        }
                        val color: Color = colors.get(nextColor)
                        nextColor++
                        for (i in 0 until renderable!!.getSubmeshCount()) {
                            val material: Material = renderable!!.getMaterial(i)
                            material.setFloat4("baseColorFactor", color)
                        }
                    })
            .exceptionally(
                    { throwable ->
                        val toast = Toast.makeText(activity, "Unable to load the model file, please check that the format is correct", Toast.LENGTH_LONG)
                        toast.setGravity(Gravity.CENTER, 0, 0)
                        toast.show()
                        null
                    })
}

fun startAnimation(startPosition: Vector3) {
    Log.d("steven", "PhoenixModel start position ......")
    var endPosition = Vector3(model!!.moveX.toFloat(), startPosition.y, model!!.moveZ.toFloat())
    if (modelAnimation != null) {
        return
    }

    modelAnimation = createGlbAnimator(this, startPosition, endPosition)

    modelAnimation!!.addListener(object : android.animation.Animator.AnimatorListener {

        override fun onAnimationRepeat(animation: android.animation.Animator?) {
            repeatCount++
            worldRotation = Quaternion.multiply(worldRotation, Quaternion.axisAngle(Vector3(0f, 1f, 0f), 180f))
        }

        override fun onAnimationEnd(animation: android.animation.Animator?) {            
        }

        override fun onAnimationCancel(animation: android.animation.Animator?) {

        }

        override fun onAnimationStart(animation: android.animation.Animator?) {

        }
    })

    modelAnimation!!.target = this
    modelAnimation!!.duration = model.cycle.toLong() * 1000
    modelAnimation!!.start()
}

fun stopAnimation() {
    if (modelAnimation == null) {
        return
    }
    modelAnimation!!.cancel()
    modelAnimation = null
}

fun doGlbAnimation() {
    if(animCount >0){
        val time = System.nanoTime()
        for (animator in animators) {
            animator.animator.applyAnimation(animator.index, ((time - animator.startTime) / TimeUnit.SECONDS.toNanos(1).toDouble()).toFloat()
                    % animator.duration)
            animator.animator.updateBoneMatrices()
        }
    }
}

}

class Cube(size: Vector3, center: Vector3, color: Color, context: ARSceneActivity, coordinator: Coordinator, model: ArModel) : Nodes(context, "Sphere", coordinator, model) {

var size: Vector3
var center: Vector3
var color: Color
var context: Context

init {
    this.context = context
    this.size = size
    this.center = center
    this.color = color
    loadDebugPointModel()
}

fun loadDebugPointModel() {
    Texture.builder().setSource(context, R.drawable.ic_quicmo).build()
            .thenAccept(
                    { texture ->
                        MaterialFactory.makeOpaqueWithColor(context, color)
                                .thenAccept { material ->
                                    renderable = ShapeFactory.makeCube(size, center, material)
                                    localPosition = Vector3.zero()
                                }
                    }
            )
}

}

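As a side note on the pasted code, the rotation-duration math in `ArBall.getAnimationDur()` can be sanity-checked in plain Kotlin, with no Android dependencies (this is just a JVM reproduction of that one expression, not part of the project):

```kotlin
// Plain-JVM reproduction of ArBall.getAnimationDur():
// duration in milliseconds for one full 360° turn at the given angular speed.
fun animationDurationMs(degreesPerSecond: Float, speedMultiplier: Float): Long =
    (1000 * 360 / (degreesPerSecond * speedMultiplier)).toLong()

fun main() {
    // 10°/s at 1x speed -> 36 seconds per revolution
    println(animationDurationMs(10.0f, 1.0f)) // -> 36000
}
```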

ThomasGorisse commented 3 years ago

@steven-gao First of all, remove all the arSceneView.resume(), arSceneView.destroy(), ... calls, since they are already handled by the ArFragment. After that, make sure to only run very light operations in onArUpdate(), since it happens on every frame callback. I don't know what your calculateMoveMatrix() is doing, but make sure it doesn't cost too much. I also don't know what your

WHQuicmoToARworld.getInstance().setMC0(arSceneView!!.arFrame!!.camera)
arSceneView?.let {
    /** start to request server data */
    requestData()
}

is doing, so I cannot help you with that.

Finally, from what I see in your profiler, it looks like you are using 600 MB of graphics memory. So you should check that all your models and media players are properly released.
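For the media players, the usual pattern is release-then-null, so that nothing keeps the native handle or the wrapper object alive. Below is a minimal, JVM-only sketch of that pattern; `FakePlayer` and `PlayerHolder` are hypothetical stand-ins for Android's `MediaPlayer` (whose real teardown is `stop()` followed by `release()`) and the class holding it:

```kotlin
// Stand-in for android.media.MediaPlayer, to keep the sketch runnable on a plain JVM.
class FakePlayer {
    var released = false
        private set
    fun stop() { /* stop playback */ }
    fun release() { released = true } // frees the native resources
}

class PlayerHolder {
    var player: FakePlayer? = FakePlayer()

    // Call from onDestroy(): release the native resources,
    // then drop the reference so the GC can reclaim the wrapper too.
    fun destroy() {
        player?.let {
            it.stop()
            it.release()
        }
        player = null
    }
}

fun main() {
    val holder = PlayerHolder()
    holder.destroy()
    println(holder.player == null) // -> true
}
```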

My advice is to comment out parts of your code and check, step by step, where the 600 MB comes from, in order to release the corresponding memory on destroy.

AND PLEASE CLICK ON THE PREVIEW TAB BEFORE SENDING A COMMENT.

ThomasGorisse commented 3 years ago

OK, I think I figured out one of the biggest consuming parts, and why Bitmap is one of the highest-allocated classes. I forgot the last SFB asset still in use for the default LightProbe and, most importantly, the one for the Skybox, at a 2K resolution! That part was waiting for Filament to get a KTXLoader, I think.

ThomasGorisse commented 3 years ago

Great performance/rendering improvements in perspective, but also a bit of work to do.

steven-gao commented 3 years ago


Thank you for your patient guidance, first of all. Yesterday I added a call to RenderableInstance.destroyAsset() in RenderableInstance.detachFromRenderer(), which now looks like this:

public void detachFromRenderer() {
    Renderer rendererToDetach = attachedRenderer;
    if (rendererToDetach != null) {
        detachFilamentAssetFromRenderer();
        destroyAsset();
        rendererToDetach.removeInstance(this);
        renderable.detatchFromRenderer();
    }
}

I tested it several times; it has some effect.

ThomasGorisse commented 3 years ago

If you want to totally clear the rendering memory, you should destroy the Filament Engine and the GL context in your onDestroy().

Since you seem to use a lot of 3D assets, the gain on your memory leaks will be quite big.

The counterpart, which I'm working on, is that you will currently have issues creating the rendering context again.

You will also get some "already destroyed entity" exceptions, since those entities are destroyed when the objects' finalize() is called.
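The key point in that teardown is ordering: scene resources (renderables, materials, textures) must be destroyed before the Engine that owns them, otherwise you get exactly those "already destroyed" surprises when finalizers run later. A JVM-only sketch of that ordering constraint; every name here (`FakeEngine`, `Resource`) is a hypothetical stand-in, not the Filament or Sceneform API:

```kotlin
// Hypothetical stand-ins modelling the required teardown order:
// destroy scene resources first, then the engine that owns them.
class Resource(val name: String)

class FakeEngine {
    private val live = mutableListOf<Resource>()
    var destroyed = false
        private set

    fun create(name: String) = Resource(name).also { live += it }
    fun destroyResource(r: Resource) { live -= r }

    fun destroy() {
        // Tearing down the engine while resources are still alive is what
        // later surfaces as "already destroyed entity" errors in finalize().
        check(live.isEmpty()) { "destroy all resources before the engine" }
        destroyed = true
    }
}

fun main() {
    val engine = FakeEngine()
    val renderable = engine.create("renderable")
    engine.destroyResource(renderable) // 1. resources first
    engine.destroy()                   // 2. then the engine
    println(engine.destroyed) // -> true
}
```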

ThomasGorisse commented 2 years ago

Please move to https://github.com/SceneView for well-handled, lifecycle-aware memory management.

kustraslawomir commented 1 year ago

you should destroy the Filament Engine and the glContext on your onDestroy().

how to do that?

kustraslawomir commented 1 year ago

by EngineInstance.destroyFilamentEngine?