jean343 / Node-OpenMAX

Node wrapper for the OpenMAX library
https://www.npmjs.com/package/openmax
MIT License

RGB888 to VideoRender? #11

Open digi-chris opened 6 years ago

digi-chris commented 6 years ago

Hi, this code is fantastic! Thanks for building it.

I'm wondering if it's possible to use the VideoDecoder with a raw RGB888 format and then push this to the renderer? Looking at the docs for OpenMax, I think it is possible - it looks like it should be just a case of changing the flag in the following example:

VideoDecode.setVideoPortFormat(omx.Video.OMX_VIDEO_CODINGTYPE.OMX_VIDEO_CodingAVC);

But, I can't find anywhere that defines the flag for RGB888, or even BGR888, ARGB, etc. What should I set it to, or isn't this possible?

As an alternative, what could I send directly to the VideoRender object and bypass the VideoDecoder? Is it YUY2?

Thanks,

Chris.

jean343 commented 6 years ago

Hi, thanks so much for your interest!

You would need the VideoDecode to convert from h.264 to buffers. I have not tested it, but I found the following info:

From http://home.nouwen.name/RaspberryPi/documentation/ilcomponents/video_decode.html

The output format may be set to RGB565 using OMX_IndexParamPortDefinition; in this case proprietary communication will not be used, and the image data will be converted on the fly.

You could try:

var format = VideoDecode.getParameter(VideoDecode.out_port, omx.INDEXTYPE.IndexParamPortDefinition);
format.video.eColorFormat = omx.COLOR_FORMATTYPE.COLOR_Format24bitRGB888;
VideoDecode.setParameter(VideoDecode.out_port, omx.INDEXTYPE.IndexParamPortDefinition, format);

Again, this is not tested; you can find the flags here: https://github.com/jean343/Node-OpenMAX/blob/master/lib/flags/IVCommon.ts

digi-chris commented 6 years ago

Hi, thanks for the response! This looks really promising, but unfortunately I can't get it to work. Basic code as follows:

var fs = require('fs');
var omx = require('openmax');

var VideoDecode = new omx.VideoDecode();
var VideoRender = new omx.VideoRender();

omx.Component.initAll([VideoDecode, VideoRender])
.then(function () {
    var format = VideoDecode.getParameter(VideoDecode.out_port, omx.INDEXTYPE.IndexParamPortDefinition);
    format.video.eColorFormat = omx.COLOR_FORMATTYPE.COLOR_Format24bitBGR888;
    VideoDecode.setParameter(VideoDecode.out_port, omx.INDEXTYPE.IndexParamPortDefinition, format);

    fs.createReadStream("t.rgb")
        .pipe(VideoDecode)
        .tunnel(VideoRender)
        .on('finish', function () {
            console.log("Video playing done.");
            process.exit();
          });
})
.catch((err) => {
    console.log(err);
});

The 'Video playing done.' message never appears in the console, although it does appear to open the file - there is a delay before it finally responds with simply:

Quit on exit 0

I tried changing the VideoDecode.out_port to VideoDecode.in_port (thinking that maybe I needed to state my RGB888 frames are on the input side, not the output), but it made no difference.

I also tried setting my frame width and height (since they won't be defined in the raw data stream) using the following properties:

    format.video.nFrameWidth = 1920;
    format.video.nFrameHeight = 1080;
    format.video.nStride = 1920;
    format.video.nSliceHeight = 1080;

The screen stays blank throughout, and I can't get any further error messages - I tried adding a try...catch block around the ReadStream section, but it made no difference.

Any ideas? Thanks again for your help.

digi-chris commented 6 years ago

I should add, my file, 't.rgb', is a test card image in raw 24-bit BGR888 format, 1920 pixels wide, 1080 pixels high, with 10 frames in the stream.
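(At 3 bytes per pixel, that's 1920 × 1080 × 3 = 6,220,800 bytes per frame, so the 10 frames come to roughly 62 MB.)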

jean343 commented 6 years ago

Hi,

The VideoDecode component is made to read encoded streams such as h.264, mpeg-4, etc.; the entire list of types is here: https://github.com/jean343/Node-OpenMAX/blob/cd7120a478a941360ea0d305a13d8009c6550310/lib/flags/Video.ts#L2

The VideoDecode component does not read raw frames. You could possibly skip the VideoDecode and send your buffers directly to the VideoRender, or use an ImageDecode to read JPEGs...
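If you go the direct route, this is roughly what I have in mind - completely untested; it assumes the renderer exposes an in_port and accepts piped writes the same way VideoDecode does, and that nStride is in bytes for 24-bit data:

var fs = require('fs');
var omx = require('openmax');

var VideoRender = new omx.VideoRender();

omx.Component.initAll([VideoRender])
  .then(function () {
    // Untested assumption: describe the raw frames on the renderer's input port,
    // mirroring how the decoder's output port was configured above.
    var def = VideoRender.getParameter(VideoRender.in_port, omx.INDEXTYPE.IndexParamPortDefinition);
    def.video.nFrameWidth = 1920;
    def.video.nFrameHeight = 1080;
    def.video.nStride = 1920 * 3; // assuming stride is in bytes for 24-bit data
    def.video.nSliceHeight = 1080;
    def.video.eColorFormat = omx.COLOR_FORMATTYPE.COLOR_Format24bitBGR888;
    VideoRender.setParameter(VideoRender.in_port, omx.INDEXTYPE.IndexParamPortDefinition, def);

    // Untested assumption: the renderer can be written to like any stream.
    fs.createReadStream("t.rgb").pipe(VideoRender);
  });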

What is your use case? Displaying raw images on the Pi does not need the complexity of OpenMax :)

JP

digi-chris commented 6 years ago

Thanks, my use case is some new software I'm working on where I can share data streams between multiple systems, and one main type of data I want to push through the system would be video. It's still a very early prototype at the minute and quite experimental, but this gives more info:

http://hector.direct/

I wrote a C# version and now I'm building a Node JS version with the aim that I'll open source it and you'd be able to send data between Windows, Mac and Linux systems (especially the Raspberry Pi) just by connecting wires on a web-based interface.

So, basically I have a datastream coming out of the PC with uncompressed YUV and/or RGB data, and I want to be able to connect it to the Raspberry Pi and it just instantly starts outputting to the Pi's HDMI port. I have the underlying TCP stack already working.

Of course, the added complexity is that I will also want to be able to send compressed streams (such as H.264 and MJPEG) through to the Pi as well, and I will be adding that later. So I was hoping I could use OpenMax to display my uncompressed streams: first off, it means a very minimal install on the Pi (no need for X11 or any sort of GUI), and secondly, it means I'll have an easier time adding compressed streams later without needing different libraries.

Hope that makes sense! Back to the question at hand...

So, if I could push the raw data directly to the VideoRender, any ideas how I would set that up? I'm thinking maybe I could have a PassThrough stream and send buffers to it which are piped to the VideoRender object, but I've tried sending buffers full of 0xFF hoping I'll get a white image on screen, and I just get an OMX_ErrorBadParameter error.
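For reference, this is roughly what I'm attempting (simplified; the 0xFF buffers were meant to produce a white 1920x1080 frame, and the write is where I get OMX_ErrorBadParameter):

var stream = require('stream');
var omx = require('openmax');

var VideoRender = new omx.VideoRender();
var input = new stream.PassThrough();

omx.Component.initAll([VideoRender])
  .then(function () {
    input.pipe(VideoRender);

    // One frame of solid 0xFF bytes, hoping for a white image.
    var frame = Buffer.alloc(1920 * 1080 * 3, 0xff);
    input.write(frame);
    input.end();
  });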

Does the data sent to the VideoRender object need some per-frame info added to the stream somehow? Looking at your SimpleVideoDecoderRenderBuffer example, it looks to me like it's just raw data being piped through, but I must be missing something important in the stream.

Thanks,

Chris.

jean343 commented 6 years ago

Hi,

I have built pretty much what you are looking for, but it’s in C++ :(

I don’t recommend sending raw frames, as the Pi is limited to 100 Mbit/s; at 1920 × 1080 with 24-bit color, a single frame is about 50 Mbit, so you will get 2-3 fps and saturate the network. Besides, you will have to implement something complex.

I always wanted to make a sample for your use case, but I never got the time.

I would start from the server component I shared in my link above, and I would connect it through TCP to a Node process running this code. You can pick any sample that reads an h.264 file, and read from TCP instead.
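As an untested sketch, assuming a raw h.264 elementary stream arrives on the socket (the port number is only an example):

var net = require('net');
var omx = require('openmax');

var VideoDecode = new omx.VideoDecode();
var VideoRender = new omx.VideoRender();

omx.Component.initAll([VideoDecode, VideoRender])
  .then(function () {
    VideoDecode.setVideoPortFormat(omx.VIDEO_CODINGTYPE.VIDEO_CodingAVC);

    // Pipe whatever arrives on the socket straight into the decoder,
    // just like the file-based samples do with createReadStream.
    net.createServer(function (socket) {
      socket
        .pipe(VideoDecode)
        .tunnel(VideoRender);
    }).listen(8080); // example port
  });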

Have fun!

digi-chris commented 6 years ago

Hi,

Thanks - yes I do agree raw data isn't exactly very network friendly, and in a production environment is probably out of the question, but for me it feels like a good place to start and I would like to support it - and it makes it easier for me to debug my compressed streams if I know the uncompressed stream works OK.

I can't seem to find the server component you mentioned - which link are you referring to?

Cheers,

Chris.

jean343 commented 6 years ago

I understand your urge to get uncompressed data, but in this case it will slow you down, trust me :)

You could use an array of JPEGs if you really want to debug something.

As for the server, it’s here: https://github.com/jean343/RPI-GPU-rdpClient/tree/master/Server

It’s a complex piece of code which allows the creation of an h.264 stream using the NVIDIA GPU.

J-P

digi-chris commented 6 years ago

Cool, thanks... I was just thinking I could try with an MJPEG stream. I've been using Intel Quick Sync here with very good results, but I know for completeness I will want to support uncompressed data as well. There must be a way to do it firing directly into the VideoRender class, so I'll let you know if I figure it out.

But, MJPEG would be a good alternative (and likely more useful) as a proof-of-concept :)

The RDP client looks very cool, I'll check it out! I've done some work on capturing the screen at very fast rates via DWM on Windows, so if I get a chance I'll compare what we're doing to see if I can share anything helpful (although likely you're already doing it in a very efficient way).

Cheers,

Chris.

jean343 commented 6 years ago

Yes, it’s using WDDM. If you are comfortable with that and Intel Quick Sync, I would start directly with it. The Pi is highly optimized for displaying h.264 video; you will likely get a better result than with MJPEG.

digi-chris commented 6 years ago

Cool, thanks. If I find a solution to the raw issue, I'll let you know... but, for the purposes of getting this working sooner rather than later, I'll take your advice and start with MJPEG and H.264.

digi-chris commented 6 years ago

Sorry to keep this going, but I'm having trouble even getting MJPEG to work.

I'm assuming I should just be able to take one of your H.264 examples and switch it to MJPEG, for example:

"use strict";
var fs = require('fs');
var omx = require('openmax');
var VideoDecode = new omx.VideoDecode();
var VideoRender = new omx.VideoRender();
omx.Component.initAll([VideoDecode, VideoRender])
        .then(function () {
          VideoDecode.setVideoPortFormat(omx.VIDEO_CODINGTYPE.VIDEO_CodingMJPEG);

          fs.createReadStream("t.mjpeg")
                  .pipe(VideoDecode)
                  .tunnel(VideoRender)
                  .on('finish', function () {
                    console.log("Done");
                    process.exit();
                  });
        });

But it just hangs somewhere after creating the ReadStream. I've used omxplayer to play my test mjpeg file, and it works fine. Am I doing something wrong?

Thanks,

Chris.

jean343 commented 6 years ago

Hmm, this is a good question. Omxplayer does a little more than OpenMax.

MJPEG streams are strange, as they are a bunch of JPEGs with multipart/form-data encoding; they can be encoded and read differently, and I'm not 100% sure OpenMax supports them natively. You could have more luck going the path of splitting out the JPEGs and reading them with the ImageDecode. Or, you could try h.264!
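Untested, but the splitting itself could be as simple as scanning for the JPEG start-of-image (FF D8) and end-of-image (FF D9) markers; whether ImageDecode then accepts each frame the way VideoDecode does is the part I have not verified:

// Untested sketch: split a concatenated-JPEG stream on the SOI (FF D8)
// and EOI (FF D9) markers. Ignores edge cases such as embedded EXIF thumbnails.
function splitJpegs(buffer, onFrame) {
  var start = -1;
  for (var i = 0; i < buffer.length - 1; i++) {
    if (start < 0 && buffer[i] === 0xFF && buffer[i + 1] === 0xD8) {
      start = i;
    } else if (start >= 0 && buffer[i] === 0xFF && buffer[i + 1] === 0xD9) {
      onFrame(buffer.slice(start, i + 2)); // one complete JPEG
      start = -1;
    }
  }
}

Hope it helps! JP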

digi-chris commented 6 years ago

Thanks, maybe I don't have the stream set up right... do you have an example of piping the ImageDecode to the VideoRender? Is it the same as using the VideoDecode?

Thanks,

Chris.

jean343 commented 6 years ago

It's relatively similar to the VideoDecode example. I do remember trying; I got a single frame, then it failed, which is why I did not make an example. You could open a defect around reading a JPEG file, and I will take a look.

digi-chris commented 6 years ago

Thanks, I'll do that. Thanks for taking the time to look into this :)