Closed: julianscheel closed this issue 8 years ago
There does appear to be a resize mmal component (MMAL_RIL_COMPONENT_TYPE_VIDEO_RESIZE). I've not used it myself, but perhaps @6by9 or @luked99 can confirm that it should work.
Resize is there and works - the PiCamera Python library makes use of it, for one.
A quick check says it won't work with MMAL_ENCODING_OPAQUE at the moment, but it could be made to work on the input side, as it appears that only some plumbing is missing. OPAQUE on the output is not supported, and wouldn't provide much benefit anyway, as video_encode has to do a format conversion and subsample.
It's VPU code only, not using a dedicated hardware block. In theory the ISP hardware could do the resize purely in hardware, but there's no support code for it at present.
Thanks, I'll take a look at it. Having opaque input might be a great improvement if it's easy to fix anyway.
@6by9 @popcornmix Any chance that one of you guys finds some time to enable opaque input?
Not me as yet. What formats did you want to convert from/to? If transcoding decoded video I'd guess YUV in and out.
@popcornmix would need to give me access to the code and a compiler. If that's possible I guess I could have a go.
@6by9 Input would be either mmal_decode (mpeg2 or h264) or mmal_image_fx (deinterlacing). Output should ideally be YUV, so that it can be fed either to the ffmpeg mpeg2video encoder or to the mmal h264 encoder. @luked99 This would be awesome!
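For reference, the transcode pipeline discussed here could be wired up roughly as below. This is an untested pseudocode sketch against the MMAL C API as shipped in the userland headers; the `vc.ril.*` component names are the standard ones, and all error handling and buffer callbacks are omitted:

```c
/* Sketch only: assumes the Raspberry Pi userland MMAL headers; not compiled or tested. */
#include "interface/mmal/mmal.h"
#include "interface/mmal/util/mmal_connection.h"

MMAL_COMPONENT_T *decode, *deint, *resize, *encode;
MMAL_CONNECTION_T *c1, *c2, *c3;

mmal_component_create("vc.ril.video_decode", &decode);
mmal_component_create("vc.ril.image_fx", &deint);     /* deinterlace */
mmal_component_create("vc.ril.resize", &resize);
mmal_component_create("vc.ril.video_encode", &encode);

/* Opaque buffers between decode, image_fx and the resize input avoid
 * copies on the GPU side; the resize output stays a regular YUV format
 * (e.g. I420) so video_encode or ffmpeg can consume it. */
decode->output[0]->format->encoding = MMAL_ENCODING_OPAQUE;
mmal_port_format_commit(decode->output[0]);

mmal_connection_create(&c1, decode->output[0], deint->input[0],
                       MMAL_CONNECTION_FLAG_TUNNELLING);
mmal_connection_create(&c2, deint->output[0], resize->input[0],
                       MMAL_CONNECTION_FLAG_TUNNELLING);
mmal_connection_create(&c3, resize->output[0], encode->input[0],
                       MMAL_CONNECTION_FLAG_TUNNELLING);
```

Tunnelled connections keep the buffers entirely on the VideoCore side, which is the point of the opaque/zero-copy chain.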
Thinking about it this morning I had reservations as to whether resize would handle the proprietary image format that video decode spits out when in opaque mode.
Checking the code, it looks like it does, and then I found I had a commit on my local tree dated 17th June: "resize: add support for opaque images on the input. Not tested as yet." Looks like I did half the work after your last comment but never got around to testing it. It is mixed up with other stuff I was working on at the time, so it needs a little attention - I'll try to find half an hour to do that and produce a test firmware.
@6by9 Did you happen to have a chance to look into that code? I'd be happy to test it :)
Code is done, but I haven't done even a basic test. That's compounded by swapping to a new PC for firmware development. I'll sync my RPiTest repo later today and push the firmware as a totally untested thing.
@6by9 Great, I should be able to test it at the end of the week then.
Test firmware image pushed to https://github.com/6by9/RPiTest/blob/master/resize_opaque/start.elf It should actually be called start_x.elf, and it does NOT support altering the gpu_mem setting - gpu_mem will be fixed at 128M.
popcornmix has tagged the latest commit as including my commit, so hopefully it should address this issue - resize should now accept opaque buffers at least on the input. Sorry it's taken so long.
I haven't been as thorough on the testing as I could have been, so please report back if you do find anything funny. Opaque buffers on the output of resize feeding into a video encode component are not recommended, and I don't think the firmware will allow it.
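With that firmware, taking advantage of opaque input on resize should just be a matter of committing the format on its input port before connecting it. An untested sketch (the zero-copy parameter is only needed when the client manages the port's buffers itself, rather than using a tunnelled connection):

```c
/* Sketch only, assuming `resize` is an already-created vc.ril.resize
 * component; not compiled or tested. */
MMAL_PORT_T *in = resize->input[0];

in->format->encoding = MMAL_ENCODING_OPAQUE;
mmal_port_parameter_set_boolean(in, MMAL_PARAMETER_ZERO_COPY, MMAL_TRUE);

if (mmal_port_format_commit(in) != MMAL_SUCCESS) {
    /* Older firmware without the opaque-input commit will reject this. */
}
```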
Something else you might want to try - there's a new component, "vc.ril.isp". It's a wrapper around the ISP hardware, so it can do resize and format conversion with minimal VPU overhead. Currently it takes RGB888, BGR888, or various YUV flavours on the input (not opaque at the moment), and produces I420 on the output. This is my current spare time project, so please let me know if you find problems. I will be adding opaque input support, but opaque output is more involved and probably doesn't help much.
Thanks for the update. Unfortunately the last few weeks got unexpectedly troublesome, so I haven't been able to test it yet. But it's right at the top of my agenda.
The vc.ril.isp stuff sounds great. So that one should be able to do the scaling at even lower cost than the resize component, at least once it can handle opaque input?
Yes, with vc.ril.isp the scaling is done by the hardware block normally used by the camera. There's a small setup phase, and then the VPU just has to give the hardware a pointer to the new buffer for each frame :-)
@julianscheel is this still an issue?
@popcornmix Unfortunately I never got around to actually implementing this. But I still have at least one project pending where I'll need it; it's just not clear when that will happen. If you prefer, we can close this bug and I'll reopen it in case I run into issues when using the feature.
Yes. I'll close for now as we believe the issue is resolved. Feel free to reopen if you discover problems when actually using the feature.
Is there any option to scale MMAL opaque images on the GPU? I am thinking about a transcoding use case where a pipeline video_decode->imagefx(deinterlace)->scale->video_encode would be required. But so far the only hardware-accelerated scaling options I am aware of are dispmanx and GLES, and I don't see a way for either of them to be used as part of an opaque/zero-copy chain.