airsdk / Adobe-Runtime-Support

Report, track and discuss issues in Adobe AIR. Monitored by Adobe - and HARMAN - and maintained by the AIR community.

Video performance regression on Harman vs. Adobe Runtime #2503

Open xpojosh opened 1 year ago

xpojosh commented 1 year ago

Problem Description

Since moving from the Adobe 31.0 SDK to the Harman 50.1.1 SDK, we've seen severe degradation in Windows video performance. This first surfaced on a Windows 10 device that could no longer smoothly play 4K video.

Upon further investigation, it became evident that there was a significant difference in resource usage between the two runtimes:

- Adobe 31.0: CPU utilization 2-4%, GPU utilization 46%
- Harman 50.1: CPU utilization 70-85%, GPU utilization 65%

The primary test system was an 11th-gen Intel NUC with an i3-1115G4 @ 3.00GHz, integrated graphics, and 8 GB RAM (dual 4 GB sticks); however, I saw the same issues on a Ryzen Embedded V1605B system with 8 GB of RAM.

The primary class tested was a simple VideoTexture object; however, I saw the same performance issues when using StageVideo (see the sketch below).
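
For reference, the StageVideo variant followed this general shape (a minimal sketch rather than the exact code tested; it assumes the same local 4k.mp4 file and the stage setup from the test below):

import flash.events.StageVideoAvailabilityEvent;
import flash.filesystem.File;
import flash.geom.Rectangle;
import flash.media.StageVideo;
import flash.media.StageVideoAvailability;
import flash.net.NetConnection;
import flash.net.NetStream;

// wait for StageVideo availability, then attach a NetStream
// playing the same local 4K file
private function initStageVideo() : void {
    stage.addEventListener(StageVideoAvailabilityEvent.STAGE_VIDEO_AVAILABILITY, onAvailability);
}

private function onAvailability(e : StageVideoAvailabilityEvent) : void {
    if (e.availability != StageVideoAvailability.AVAILABLE || stage.stageVideos.length == 0) return;

    var sv : StageVideo = stage.stageVideos[0];
    sv.viewPort = new Rectangle(0, 0, 3840, 2160);

    var nc : NetConnection = new NetConnection();
    nc.connect(null);
    var ns : NetStream = new NetStream(nc);
    ns.client = { onMetaData : function(md : Object) : void {} };
    sv.attachNetStream(ns);
    ns.play(new File("C:\\Users\\xpodigital\\4k.mp4").url);
}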

Steps to Reproduce

I've created the following basic test, which requires no additional frameworks, only a pre-positioned video file. It uses a simple logger class (PipeLogger) that is not shown here and can be ignored.

package com.xpodigital.video {
    import com.adobe.utils.AGALMiniAssembler;
    import com.adobe.utils.PerspectiveMatrix3D;

    import flash.desktop.NativeApplication;
    import flash.display.*;
    import flash.display3D.Context3D;
    import flash.display3D.Context3DProgramType;
    import flash.display3D.Context3DVertexBufferFormat;
    import flash.display3D.IndexBuffer3D;
    import flash.display3D.Program3D;
    import flash.display3D.VertexBuffer3D;
    import flash.display3D.textures.VideoTexture;
    import flash.events.*;
    import flash.filesystem.File;
    import flash.geom.Matrix3D;
    import flash.geom.Vector3D;
    import flash.net.NetConnection;
    import flash.net.NetStream;

    /**
     * 
     * @author xpodigital
     */
    [SWF(backgroundColor="#FFFFFF", frameRate="30", width="3840", height="2160")]
    public class VideotextureTest extends Sprite {
        // constants used during inits
        private const swfWidth : int = 3840;
        private const swfHeight : int = 2160;
        // the 3d graphics window on the stage
        private var context3D : Context3D;
        // the compiled shader used to render our mesh
        private var shaderProgram : Program3D;
        // the uploaded vertices used by our mesh
        private var vertexBuffer : VertexBuffer3D;
        // the uploaded indices of each vertex of the mesh
        private var indexBuffer : IndexBuffer3D;
        // the data that defines our 3d mesh model
        private var meshVertexData : Vector.<Number>;
        // the indices that define what data is used by each vertex
        private var meshIndexData : Vector.<uint>;
        // matrices that affect the mesh location and camera angles
        private var projectionMatrix : PerspectiveMatrix3D = new PerspectiveMatrix3D();
        private var modelMatrix : Matrix3D = new Matrix3D();
        private var viewMatrix : Matrix3D = new Matrix3D();
        private var modelViewProjection : Matrix3D = new Matrix3D();
        private var videoFile : File;
        private var vTexture : VideoTexture;
        private var netStream : NetStream;
        private var videoConnection : NetConnection;
        private var logger : PipeLogger;

        public function VideotextureTest() {
            this.loaderInfo.uncaughtErrorEvents.addEventListener(UncaughtErrorEvent.UNCAUGHT_ERROR, onUncaughtError);
            NativeApplication.nativeApplication.addEventListener(InvokeEvent.INVOKE, onInvoke);

            logger = new PipeLogger();
            logger.localLog("VideoTexure test instantiated.");

            videoFile = new File("C:\\Users\\xpodigital\\");
            videoFile = videoFile.resolvePath("4k.mp4");

            if (stage != null) {
                init();
            } else {
                addEventListener(Event.ADDED_TO_STAGE, init);
            }
        }

        private function onUncaughtError(event : UncaughtErrorEvent) : void {
            var theError : Error = event.error as Error;

            logger.localLog("Uncaught exception= " + theError.getStackTrace());
        }

        protected function onInvoke(event : InvokeEvent) : void {
            NativeApplication.nativeApplication.activeWindow.width = 3840;
            NativeApplication.nativeApplication.activeWindow.height = 2160;
            NativeApplication.nativeApplication.activeWindow.x = 0;
            NativeApplication.nativeApplication.activeWindow.y = 0;
        }

        private function init(e : Event = null) : void {
            logger.localLog(NativeApplication.nativeApplication.runtimeVersion);

            if (hasEventListener(Event.ADDED_TO_STAGE)) {
                removeEventListener(Event.ADDED_TO_STAGE, init);
            }
            stage.scaleMode = StageScaleMode.NO_SCALE;
            stage.align = StageAlign.TOP_LEFT;

            stage.stage3Ds[0].addEventListener(Event.CONTEXT3D_CREATE, onContext3DCreate);
            stage.stage3Ds[0].requestContext3D();
        }

        private function onContext3DCreate(event : Event) : void {
            logger.localLog("Context3D created!!");

            var t : Stage3D = event.target as Stage3D;
            context3D = t.context3D;

            if (context3D == null) {
                return;
            }

            logger.localLog("Do we support videotextures = " + Context3D.supportsVideoTexture);

            context3D.enableErrorChecking = true;
            initData();

            context3D.configureBackBuffer(swfWidth, swfHeight, 0, true);

            // A simple vertex shader which does a 3D transformation
            var vertexShaderAssembler : AGALMiniAssembler = new AGALMiniAssembler();
            vertexShaderAssembler.assemble(Context3DProgramType.VERTEX,
            // 4x4 matrix multiply to get camera angle 
            "m44 op, va0, vc0\n" +
            // tell fragment shader about XYZ 
            "mov v0, va0\n" +
            // tell fragment shader about UV 
            "mov v1, va1\n");

            // A simple fragment shader which will use the vertex position as a color
            var fragmentShaderAssembler : AGALMiniAssembler = new AGALMiniAssembler();
            fragmentShaderAssembler.assemble(Context3DProgramType.FRAGMENT, 
            // grab the texture color from texture fs0
            // using the UV coordinates stored in v1
            // (alternative samplers tried:
            //  "tex ft0, v1, fs0 <2d,repeat,miplinear>\n" for mipmapping,
            //  "tex ft0, v1, fs0 <2d,repeat,clamp>\n" also works)
            "tex ft0, v1, fs0 <2d,rgba>\n" +
            // move this value to the output color 
            "mov oc, ft0\n");

            // combine shaders into a program which we then upload to the GPU
            shaderProgram = context3D.createProgram();
            shaderProgram.upload(vertexShaderAssembler.agalcode, fragmentShaderAssembler.agalcode);

            // upload the mesh indexes
            indexBuffer = context3D.createIndexBuffer(meshIndexData.length);
            indexBuffer.uploadFromVector(meshIndexData, 0, meshIndexData.length);

            // upload the mesh vertex data
            // since our particular data is
            // x, y, z, u, v, nx, ny, nz
            // each vertex uses 8 array elements
            vertexBuffer = context3D.createVertexBuffer(meshVertexData.length / 8, 8);
            vertexBuffer.uploadFromVector(meshVertexData, 0, meshVertexData.length / 8);

            // create projection matrix for our 3D scene
            projectionMatrix.identity();
            // 45 degree FOV, swfWidth/swfHeight aspect ratio, 0.01=near, 100=far
            projectionMatrix.perspectiveFieldOfViewRH(45, swfWidth / swfHeight, 0.01, 100.0);

            // create a matrix that defines the camera location
            viewMatrix.identity();
            videoConnection = new NetConnection();
            videoConnection.connect(null);
            netStream = new NetStream(videoConnection);
            netStream.bufferTime = 5;
            var metaListener : Object = new Object();
            metaListener.onMetaData = onMetaData;

            netStream.client = metaListener;
            netStream.addEventListener(NetStatusEvent.NET_STATUS, netStreamStatus);

            vTexture = context3D.createVideoTexture();

            vTexture.attachNetStream(netStream);
            vTexture.addEventListener(VideoTextureEvent.RENDER_STATE, renderFrame);
            vTexture.addEventListener(Event.TEXTURE_READY, textureReady);
            netStream.play(videoFile.url);
        }

        private function renderFrame(e : VideoTextureEvent) : void {
            logger.localLog("Video texture event heard = " + e.status);
        }

        private function onMetaData(metadata : Object) : void {
            logger.localLog("onMetaData called. Video started. Total time = " + metadata.duration);
        }

        private function netStreamStatus(event : NetStatusEvent) : void {
            logger.localLog("Netstream status: " + event.info.code);

            if (event.info.code == "NetStream.Play.Stop") {
                netStream.seek(0);
            }
        }

        private function textureReady(e : Event) : void {
            // clearing the scene before each render is mandatory

            context3D.clear(0, 0, 0);

            context3D.setProgram(shaderProgram);

            // create the various transformation matrices
            modelMatrix.identity();

            modelMatrix.appendRotation(0, Vector3D.Y_AXIS);
            modelMatrix.appendRotation(0, Vector3D.X_AXIS);
            modelMatrix.appendRotation(0, Vector3D.Z_AXIS);
            modelMatrix.appendTranslation(0.0, 0.0, 0.0);
            modelMatrix.appendRotation(180, Vector3D.X_AXIS);

            // clear the matrix and append new angles
            modelViewProjection.identity();
            modelViewProjection.append(modelMatrix);
            modelViewProjection.append(viewMatrix);
            modelViewProjection.append(projectionMatrix);

            // pass our matrix data to the shader program
            context3D.setProgramConstantsFromMatrix(Context3DProgramType.VERTEX, 0, modelViewProjection, true);

            // associate the vertex data with current shader program
            // position
            context3D.setVertexBufferAt(0, vertexBuffer, 0, Context3DVertexBufferFormat.FLOAT_3);
            // tex coord
            context3D.setVertexBufferAt(1, vertexBuffer, 3, Context3DVertexBufferFormat.FLOAT_3);

            context3D.setTextureAt(0, vTexture);

            // finally draw the triangles
            context3D.drawTriangles(indexBuffer, 0, meshIndexData.length / 3);

            // present/flip back buffer
            context3D.present();
        }

        private function initData() : void {
            logger.localLog("Initialize meshindexData");

            meshIndexData = Vector.<uint>([0, 1, 2, 0, 2, 3]);
            meshVertexData = Vector.<Number>([
            // X, Y, Z, U, V, nX, nY, nZ (one row per vertex)
            -1.77, -1, 1, 0, 0, 0, 0, 1,
             1.77, -1, 1, 1, 0, 0, 0, 1,
             1.77,  1, 1, 1, 1, 0, 0, 1,
            -1.77,  1, 1, 0, 1, 0, 0, 1]);
        }
    }
}

Known Workarounds

Unfortunately, our workaround has been to downgrade the runtime to 31.0; however, that doesn't seem like a sustainable approach, as that SDK is no longer available on the Internet. (The last Adobe runtime, 32.0, had a bug on Windows where it would not play desktop video hosted on the local filesystem.)

xpojosh commented 1 year ago

We've done a little more work that may provide some insight here. We ran DXVA Checker while running the two different runtimes.

With Adobe 31, we can see that video decoding is in fact being performed by the GPU. With the Harman runtime, the GPU is never called for decoding, so it must be falling back to software (even though an "accelerated" event is reported by the video texture).
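
For reference, that "accelerated" report comes through the VideoTextureEvent.RENDER_STATE handler in the test code above; here is a sketch of that handler expanded to compare e.status against the flash.media.VideoStatus constants (reusing the logger from the test):

import flash.events.VideoTextureEvent;
import flash.media.VideoStatus;

// Harman reports "accelerated" here even though DXVA Checker
// shows no GPU decode activity
private function renderFrame(e : VideoTextureEvent) : void {
    switch (e.status) {
        case VideoStatus.ACCELERATED:
            logger.localLog("hardware-accelerated decoding reported");
            break;
        case VideoStatus.SOFTWARE:
            logger.localLog("software decoding reported");
            break;
        case VideoStatus.UNAVAILABLE:
            logger.localLog("video decoding unavailable");
            break;
    }
}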

See attached DXVA screenshots.

Adobe Traces

[Screenshot: adobeDXVA]

[Screenshot: adobeTraceCrop]

Harman Traces

[Screenshot: harmanDecodeDXVA]

ajwfrost commented 1 year ago

Okay thanks .. so it looks like I'm seeing the same thing here, although I also see a "video processing" entry with ADL, just not a "video decoding" one. With AIR SDK 31 it has the decoding too, per your view there. I'll see if we can find out what's changed then; we may need to dig through some of the Adobe history.

So I'm suspecting this is to do with the changes that Adobe originally made, and that we then fixed to get some of the video working again - but maybe the internal routing/decision logic is still a bit screwed up...

thanks

ajwfrost commented 1 year ago

Ah .. so it looks like the video decoder logic in AIR needs to do some initial processing of the H.264 stream, after which slices are passed to the DirectX hardware decoders to accelerate decoding and display. But that first bit of processing is something that was removed when Adobe pulled out the software codecs that required patent licensing.

I recall a few years ago we were hooking up a VDPAU decoder on an embedded Linux platform, and I think there were some similar challenges with that... let me talk with the person who did that and see whether we can pull together some code from there. Alternatively, we can see whether there is more flexibility in the DirectX stuff to hook this in at an earlier point - I'd have hoped that this is possible..!

thanks

xpojosh commented 1 year ago

Andrew --

This is all good news! Thanks so much for the frequent updates. They are much appreciated and help us plan our strategy going forward.

Cheers!

Josh

ajwfrost commented 1 year ago

FYI I've also just used that DXVA Checker to see what happens with our new multimedia framework - still a work in progress, but currently it can output to a NativeWindow... and that does use the hardware video decoder component. Hopefully it will still do that once we add the custom presenter logic that lets us redirect the output to textures, StageVideo and also (via software copying of the buffers) a normal Video object...

xpojosh commented 1 year ago

That's awesome news!

ajwfrost commented 1 year ago

In case this helps (available/working from 50.2):

package {
    import air.media.FileSource;
    import air.media.MediaElement;
    import air.media.Pipeline;
    import air.media.States;
    import air.media.VideoOutput;

    import flash.display.Sprite;
    import flash.display.StageDisplayState;
    import flash.events.Event;
    import flash.filesystem.File;

    public class Main extends Sprite {
        public function Main() {
            stage.displayState = StageDisplayState.FULL_SCREEN_INTERACTIVE;

            // source: a local MP4 file
            var fileSource : FileSource = new FileSource();
            fileSource.file = new File("c:\\adobe\\videos\\bbb4k.mp4");

            // sink: render directly into this window
            var viOp : VideoOutput = new VideoOutput(this.stage.nativeWindow);

            var pipeline : Pipeline = Pipeline.createPipeline(fileSource, viOp);
            pipeline.addEventListener(MediaElement.STATE_CHANGE, onPipelineStateChange);
        }

        private function onPipelineStateChange(e : Event) : void {
            var pipeline : Pipeline = e.target as Pipeline;
            trace("Pipeline state change -> " + pipeline.readyState);
            // start playback once the pipeline reports it is ready
            if (pipeline.readyState == States.READY) pipeline.play();
        }
    }
}

xpojosh commented 1 year ago

Andrew --

This is so exciting! We are living in the future.

Here's our confirmation that we are getting GPU video decoding with the new Pipeline class:

[Screenshot: pipelineCrop]

We'd also be interested if that VDPAU code could be revived for AIR on Linux, unless you are already pursuing another path there.

itlancer commented 1 year ago

@ajwfrost The new Pipeline looks very promising. I hope we can get documentation for it and run some experiments. On which platforms is it available right now? Only Windows, or all desktops?

Regarding hardware/software decoding, we also have the same issue right now with AIR 33+. A possibly related issue: https://github.com/airsdk/Adobe-Runtime-Support/issues/155. NetStream::useHardwareDecoder seems to have been designed to control that, but it is not working now...
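
(For reference, that property is a plain Boolean hint on NetStream, set before play() - a minimal sketch; the file path here is hypothetical:)

import flash.net.NetConnection;
import flash.net.NetStream;

var nc : NetConnection = new NetConnection();
nc.connect(null);
var ns : NetStream = new NetStream(nc);
// documented as a hint requesting hardware decoding; per this
// issue it currently appears to have no effect
ns.useHardwareDecoder = true;
ns.play("file:///c:/videos/sample.mp4"); // hypothetical path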

ajwfrost commented 1 year ago

@itlancer It's still a bit early for a roll-out; the only thing that's hooked up so far is on Windows. Audio outputs are working (though not through any internal audio mixing component, just to the available sound output devices), and video output is only available to a NativeWindow; we're looking to add support for the other options (Video, StageVideo, VideoTexture). We're focusing first on the standard cases where someone has an external file/URL, or an embedded ByteArray of an audio clip etc., and just wants to play it in a simple way...

ged-mc commented 1 year ago

Great to hear this is progressing. Thanks

henke37 commented 1 year ago

I have my doubts about new classes. Is this really something that a developer needs to concern themselves with?

Also, does this work with RTMP connections?

ajwfrost commented 1 year ago

It was partly because we needed a fresh start; the NetStream implementation is so complex (C++ code that started in pre-AS3 days and has had so many different mechanisms come and go), and the idea of a more pipeline-based API set is fairly common. But we're also trying to make the common use cases as straightforward to code as possible.

No reason not to support RTMP; I know they're also looking to adapt/improve the spec - see https://github.com/veovera/enhanced-rtmp. The goal we have here, though, is to allow other folk to also create components/filters that can be plugged into the pipeline, such that the RTMP protocol could be handled by a separate/open-source component controlled by the AS3 application and displaying the video content within a Stage3D texture or whatever..

esidegallery commented 1 year ago

It's such a relief to see that improvements and fixes to video playback haven't been dropped. From my perspective, video performance and stability issues are the one thing I'm worried about in terms of future-proofing my (Windows) apps - they're too large for porting to a new platform to be feasible. A VideoTexture implementation of this new Pipeline which simply plays from local files is all I would need, personally speaking. Is this still on the agenda? And would it make HEVC decoding possible too?

esidegallery commented 11 months ago

Well, I'm getting flawless, smooth playback on a 4K video with Pipeline. The same video freezes only a second after playback starts using the current system and VideoTexture.

itlancer commented 4 months ago

@ajwfrost When is the new pipeline planned to be released? We need smooth 4K video playback, at least for Windows with VideoTexture or Video.

esidegallery commented 4 months ago

This is such a big issue for me that I would purchase an additional Enterprise Tier membership if it would help push it up the priority list.

ajwfrost commented 4 months ago

It's moving forwards, albeit slowly. A fair bit more complicated than we had first thought, to get this working consistently across different platforms/operating systems with a fairly low-level API...

So we had been thinking to also provide some higher-level utility mechanisms - partly to simplify usage, and partly to provide an interim way of doing things (which lets us use some simpler OS-specific mechanisms for these short-cut cases whilst building up the ability to provide the same things using the 'pipeline' behind the scenes).

Taking an example: audio playback on Android is something we have working via the pipeline classes, but it doesn't quite fit into our current pipeline APIs because of how the Android native audio mechanisms work. So we will need to update the APIs again (which is a reason we've not officially published them!). But there is no reason why we couldn't provide a simpler API for playing back sound files or video files, which we would hope covers 80%+ of use cases. And with that approach I would hope we can get something more usable and complete released a bit sooner...

itlancer commented 1 month ago

@ajwfrost When will the new pipeline be released for Windows? Right now there are a lot of performance issues with every type of video playback (VideoTexture, Video, StageVideo):

https://github.com/airsdk/Adobe-Runtime-Support/issues/2162
https://github.com/airsdk/Adobe-Runtime-Support/issues/1963
https://github.com/airsdk/Adobe-Runtime-Support/issues/646
https://github.com/airsdk/Adobe-Runtime-Support/issues/2163