techyian / MMALSharp

C# wrapper to Broadcom's MMAL with an API to the Raspberry Pi camera.

Problem using InMemoryCaptureHandler to Capture "Stream" #109

Closed kenssamson closed 4 years ago

kenssamson commented 4 years ago

The project I'm working on uses the InMemoryCaptureHandler so I can overwrite the same output image multiple times. This is done in a continuous loop.

service.cs // Background Worker running on separate thread

//setup camera
private MMALCamera mmal = null;
private ILoggerFactory loggerFactory;  // set in constructor

// use /dev/shm - special "temp" folder on pi that resides in memory instead of on SD card
private const string tempFile = "/dev/shm/mmalsharp/preview.jpg";

private async Task CameraSetup() {

    MMALLog.LoggerFactory = loggerFactory;
    mmal = MMALCamera.Instance;
    MMALCameraConfig.StillResolution = new Resolution(640, 480);
    mmal.ConfigureCameraSettings();
    await Task.Delay(2000).ConfigureAwait(false);
}

private async Task<List<byte>> TakePicture() {

    List<byte> data = null;

    using (var imgHandler = new InMemoryCaptureHandler())
    using (var imgEncoder = new MMALImageEncoder())
    using (var nullSink = new MMALNullSinkComponent()) {
        var portConfig = new MMALPortConfig(MMALEncoding.JPEG, MMALEncoding.I420, 90);
        imgEncoder.ConfigureOutputPort(portConfig, imgHandler);

        mmal.Camera.PreviewPort.ConnectTo(nullSink);
        mmal.Camera.StillPort.ConnectTo(imgEncoder);

        await mmal.ProcessAsync(mmal.Camera.StillPort).ConfigureAwait(false);
        data = new List<byte>(imgHandler.WorkingData);
    }
    return data;
}

// use static for easier access from main application
public static bool IsStreaming = true;

public async Task ExecuteAsync(CancellationToken stoppingToken) {

    try {
        await CameraSetup();
        while (!stoppingToken.IsCancellationRequested) {
            if (IsStreaming) {
                var data = await TakePicture();
                await System.IO.File.WriteAllBytesAsync(tempFile, data.ToArray());
                await Task.Delay(20);   // allows for 50 fps by client
            } else {
                await Task.Delay(50);    // when streaming is paused
            }
        }
    }
    catch (Exception ex) {
        logger.LogError(ex, "Exception thrown while streaming");
    }
}

HomeController.cs

/* ui code */
public async Task<IActionResult> GetImageAsync() {

    if (System.IO.File.Exists(tempFile)) {
        var fileData = await System.IO.File.ReadAllBytesAsync(tempFile);
        return Json(new { success = true, data = Convert.ToBase64String(fileData) });
    }
    return Json(new { success = false, error = $"File not found - '{tempFile}'" });
}

site.ts


var processID = 0;
var isStreaming = false;
var img = document.getElementById("preview") as HTMLImageElement;   // img tag

var showImage = () => {
    if (isStreaming) {
        $.ajax({ url: "/home/GetImageAsync", dataType: "json", cache: false, method: "get" })
            .done((result) => {
                if (result.success) {
                    img.src = `data:image/jpeg;base64,${result.data}`;
                } else {
                    console.log(result.error);
                    isStreaming = false;
                }
            }).fail((jqXHR, status, error) => {
                console.log(`Ajax Error - ${error}`);
                isStreaming = false;
            });
    }
}

var onStartStream = (event) => {
    event.preventDefault();
    isStreaming = true;
    processID = window.setInterval(showImage, 1000 / 25);
}

var onStopStream = (event) => {
    event.preventDefault();
    isStreaming = false;
    window.clearInterval(processID);
    processID = 0;
}

$('#streamStart').on('click', onStartStream);
$('#streamStop').on('click', onStopStream);

The code above is only part of the process, but it contains the main logic. Basically, the background worker keeps updating the image in the temp folder, and the front end then uses AJAX to reload the image in the browser.

The problem is that the background process appears to freeze after a few seconds and stops updating the picture. MMALSharp reports the following:

warn: MMALSharp[0] Port not enabled.
info: MMALSharp[0] Successfully processed 0.22mb.
warn: MMALSharp[0] Port not enabled.
info: MMALSharp[0] Successfully processed 0.22mb.
warn: MMALSharp[0] Port not enabled.
info: MMALSharp[0] Successfully processed 0.22mb.
warn: MMALSharp[0] Port not enabled.
info: MMALSharp[0] Successfully processed 0.22mb.

I thought about combining the TakePicture and ExecuteAsync procedures, but it's not clear whether the using statements should be outside or inside the while loop. Basically, I am trying to do the following:

  1. Open the camera with the necessary warm-up delay before starting the stream
  2. Take a Picture every X milliseconds, replacing the last picture taken with new one.
  3. When done streaming, close the camera but be able to start the camera again if requested.

Let me know if you need additional information.

Thanks, Ken

techyian commented 4 years ago

Hi Ken,

You should be able to have your while loop inside of the using blocks so you're not having to dispose and re-create the native components on each iteration. Please see the Timeout and Timelapse helper examples in MMALCamera.cs.

I have noticed something very similar happening whilst running the unit tests over a longer period of time, but that isn't specifically tied to the InMemoryCaptureHandler being used. I haven't been able to determine whether that's a fault of this library or more a general problem with the native MMAL framework being used.

Please let me know whether having the while loop within your using blocks helps at all. If it does, it'll help me narrow down the potential cause.

Thanks,

Ian

techyian commented 4 years ago

Also, do you see any difference between v0.5.1 and v0.6 (dev)? In v0.6, I have moved over to using TaskCompletionSource for asynchronous checking of when a component port has finished processing. It's a much nicer way of doing the event handling, but I'm unsure of its potential performance benefits yet until I run some benchmarks.
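
For anyone unfamiliar, the general shape of that pattern looks roughly like this - an illustrative sketch only, not the library's actual implementation:

using System.Threading.Tasks;

// Sketch of the TaskCompletionSource pattern: a callback signals completion,
// and the processing loop awaits the task instead of blocking on an event handle.
class PortCompletionExample
{
    private readonly TaskCompletionSource<bool> _tcs = new TaskCompletionSource<bool>();

    // Called from the buffer callback when the end-of-stream flag is seen.
    public void SignalEndOfStream() => _tcs.TrySetResult(true);

    // Awaited by the caller to know when the port has finished processing.
    public Task WaitForCompletionAsync() => _tcs.Task;
}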

kenssamson commented 4 years ago

Is the WorkingData reset after each call to ProcessAsync? It's not clear in the documentation.

techyian commented 4 years ago

Currently no, but it should be. The PostProcess method is called on each EOS event received via the Default callback handler. You will see in the InMemoryCaptureHandler override of PostProcess that the clearing of the WorkingData property is enclosed in the if statement which relies on you configuring additional image processing.

For now, please clear the WorkingData list yourself. I will get this fixed this evening in the dev branch.

Thanks for pointing me to it.

techyian commented 4 years ago

Raised #110.

techyian commented 4 years ago

Sorry, my last message was incorrect; I had misread what I was re-initialising the list with, and I can see now that it's the data from the image processing pipeline. Please continue to clear the WorkingData property yourself after you've retrieved the data from it.

kenssamson commented 4 years ago

So I've updated my code and moved the while loop inside the using block. It still fails after various amounts of time.

Basic code...

public async Task TakePictureStream(CancellationToken stoppingToken) {

    var portConfig = new MMALPortConfig(MMALEncoding.JPEG, MMALEncoding.I420, 90);

    using (var imgHandler = new InMemoryCaptureHandler())
    using (var imgEncoder = new MMALImageEncoder())
    using (var nullSink = new MMALNullSinkComponent()) {

        imgEncoder.ConfigureOutputPort(portConfig, imgHandler);
        mmal.Camera.PreviewPort.ConnectTo(nullSink);
        mmal.Camera.StillPort.ConnectTo(imgEncoder);

        long counter = 0;
        while (!stoppingToken.IsCancellationRequested && IsStreaming) {
            counter++;
            logger.LogInformation($"Start Taking Picture [{counter}] at {DateTime.Now.Ticks}");
            await mmal.ProcessAsync(mmal.Camera.StillPort).ConfigureAwait(false);
            await data.SaveFileAsync(imgHandler.WorkingData).ConfigureAwait(false);
            imgHandler.WorkingData.Clear();
            await Task.Delay(WaitTime).ConfigureAwait(false);
            logger.LogInformation($"Stop Taking Picture [{counter}] at {DateTime.Now.Ticks}");
            if (counter == long.MaxValue) counter = 0;
        }
    }
}

WaitTime is 40 milliseconds (25 FPS). data is a helper class for saving and loading shared files.
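
For reference, data is roughly the following shape. This is a hypothetical sketch rather than the real class, which I haven't included here:

using System.Collections.Generic;
using System.Threading.Tasks;

// Hypothetical helper - overwrites a shared file (e.g. under /dev/shm) with the latest frame.
public class ShareFileHelper
{
    private readonly string path;

    public ShareFileHelper(string path) => this.path = path;

    public Task SaveFileAsync(List<byte> imageData) =>
        System.IO.File.WriteAllBytesAsync(path, imageData.ToArray());
}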

The process runs for a while before freezing. The freeze occurs after a varying amount of time; I've seen it freeze after as little as 30 pictures or as many as 160. Even though the loop is inside the using statement, the process still goes through several steps for each picture.

Log Entries for a single picture

info: PiService.CameraService[0]
      Start Taking Picture [127] at 637137670082606432
dbug: MMALSharp[0]
      Starting output port vc.ril.image_encode:out:0(JPEG)
dbug: MMALSharp[0]
      Enabling port.
dbug: MMALSharp[0]
      Initialising buffer pool.
dbug: MMALSharp[0]
      Creating buffer pool with 1 buffers of size 81920
dbug: MMALSharp[0]
      Sending buffer to output port: Length 0
dbug: MMALSharp[0]
      Setting parameter MMAL_PARAMETER_CAPTURE
dbug: MMALSharp[0]
      vc.ril.image_encode vc.ril.image_encode:out:0(JPEG) End of stream. Signaling completion...
dbug: MMALSharp[0]
      Setting parameter MMAL_PARAMETER_CAPTURE
dbug: MMALSharp[0]
      Disabling port vc.ril.image_encode:out:0(JPEG)
dbug: MMALSharp[0]
      vc.ril.image_encode vc.ril.image_encode:out:0(JPEG) End of stream. Signaling completion...
warn: MMALSharp[0]
      Port not enabled.
dbug: MMALSharp[0]
      Destroying output port pool.
dbug: MMALSharp[0]
      Releasing active buffers for port vc.ril.image_encode:out:0(JPEG).
dbug: MMALSharp[0]
      Disposing buffer pool.
info: PiService.CameraService[0]
      Stop Taking Picture [127] at 637137670088284662

The freeze always occurs at the exact same step: dbug: MMALSharp[0] Releasing active buffers for port vc.ril.image_encode:out:0(JPEG).

Also, if you look at the tick difference between start and stop, it's around 5.7 million ticks; at 10,000 ticks per millisecond, that equates to roughly 570 milliseconds per picture. This means the maximum rate of capturing images is between 1 and 2 fps. Ideally, I would want to capture images at a faster rate.

Let me know if you need any additional information.

techyian commented 4 years ago

Thanks @kenssamson, I will do some testing and get back to you.

techyian commented 4 years ago

Hi @kenssamson,

I've just committed a change to use mmal_queue_timedwait instead of mmal_queue_wait. Both functions are used to wait for an MMAL buffer to be returned to the queue (blocking); however, the former can have a timeout value set against it, which I'm hoping will help resolve this. I've set the timeout to 1s.

Regarding capturing images at a faster rate, I've added the global config MMALCameraConfig.StillBurstMode which you can set before calling ConfigureCameraSettings(). This should increase the speed of images from the camera's still port, but not by much. You could see if that fits your use case?
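
For example, something along these lines before the camera is configured, using the same mmal instance as in your earlier snippet:

// Enable burst mode for the still port before configuring the camera.
MMALCameraConfig.StillBurstMode = true;
mmal.ConfigureCameraSettings();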

There is another way of capturing rapid images using the camera's video port, please see the example here. This will produce images at the framerate you have set in MMALCameraConfig.VideoFramerate. There is a slight caveat because you're using the InMemoryCaptureHandler: unless you extend that class with your own and override the PostProcess method, there's currently no way of capturing the image frames as they're created. When doing operations with the video port, the MMALCamera.ProcessAsync method will only return once a timeout value has been exceeded. Is this a major issue?
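
As a rough illustration of extending the handler (a sketch only - the class name is hypothetical and it assumes a parameterless PostProcess override as discussed above, so treat it as the idea rather than final code):

using MMALSharp.Handlers;

// Hypothetical sketch: writes each completed frame over the same file, then resets the buffer.
public class FrameFileCaptureHandler : InMemoryCaptureHandler
{
    private readonly string _filePath;

    public FrameFileCaptureHandler(string filePath)
    {
        _filePath = filePath;
    }

    public override void PostProcess()
    {
        // Overwrite the previous frame with the one that has just finished processing.
        System.IO.File.WriteAllBytes(_filePath, this.WorkingData.ToArray());

        // Clear the working data so the next frame starts from an empty list.
        this.WorkingData.Clear();
    }
}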

Please let me know how you get on :)

Ian

kenssamson commented 4 years ago

Hi Ian,

Actually, I have been looking at that example and using a custom CaptureHandler. I did one where I capture the images to a Queue and then send them to the web view when done; however, that setup does not allow for real-time viewing. I'm working on a slightly different version that instead overwrites a file during PostProcess. It's similar to what I was originally doing, but moving the save-file portion inside the handler should allow me to use the continuousCapture: true option. I'll let you know what I find out.

It looks like MMALCameraConfig.VideoFramerate is only used by MMALVideoEncoder. The example and my setup use the MMALImageEncoder. To be honest, setting the frame rate on the server side is not that important; I just want it to be fast enough so the image appears fairly smooth. I actually control the frame rate on the client side, but even that is a little out of my hands since I make AJAX calls to get the image from the server.

I did find another possible solution using UV4L (https://www.linux-projects.org/uv4l/installation/), but I would have less control over the video. I noticed you use FFmpeg and RTMP in some of your examples. The issue with those is that they are not as web-friendly as other protocols.

Ken

techyian commented 4 years ago

Hi Ken,

The MMALCameraConfig.VideoFramerate is also used when initialising the camera component to specify the framerate the video port should operate at:

https://github.com/techyian/MMALSharp/blob/dev/src/MMALSharp/Components/MMALCameraComponent.cs#L191

Please let me know whether the change to the native function being called has made any difference.

Ian

kenssamson commented 4 years ago

Hi Ian,

I'm happy to report that the latest changes seem to have fixed my issues. I created a new output capture handler to handle capturing and saving the image to a file that is overwritten with each capture. I've attached a copy of the file; if you wish, feel free to include it with the project. It basically uses the same process as the InMemoryCaptureHandler, with the addition of a NewFile() procedure to handle saving the image when it's done being processed. It also has a flag that controls whether or not the counter is used. Since my default setup is to have this process run continuously from a background service, I didn't want to worry about the counter overflowing or causing other issues, so I disabled it by default. I've also attached two other files from my in-progress project that demonstrate how I use your library to access the camera.

Camera.cs - Contains the camera logic, including logic for other parts that I have not implemented yet.
Worker.cs - Background service that runs the camera in the background.

The main idea is that the background worker initializes the camera and runs the process in a continuous loop until the cancellation token is triggered by the user or another process. The camera service itself has another cancellation token which can be triggered to "pause" the capture process so other tasks can be performed, like taking a single picture or modifying the camera configuration. I'm still working on that part of the project. When ready, I plan to share the project on GitHub.

Please let me know if you have any questions and thanks again for all your help.

Ken

OverwriteFileCaptureHandler.cs.txt

Camera.cs.txt

Worker.cs.txt

kenssamson commented 4 years ago

Just noticed one item in my code: it appears I was wrong about the resolution. Only the video resolution is taken into account, so it's sending the larger image. I thought it was the smaller image based on the file size, but I guess that's simply part of the JPEG encoder. If I wanted to "save" small images for faster read/write, could I use the video splitter component and a resizer component? If so, could you provide an example?

The process does run fairly well right now, so I'm not sure whether there would be a net gain from adding the resizer. More processing but a smaller file size might be faster, slower, or about the same.

Just Curious. Ken

kenssamson commented 4 years ago

Meant to reopen the question.

techyian commented 4 years ago

Hi Ken,

Glad you're nearly there with your project. I'm not 100% clear on the resolution issue you're having - are these the images which are processed using your PictureStream method? I can see that method is using the rapid capture feature, so it will be using the MMALCameraConfig.VideoResolution value. As you're setting that to 1640x1232, I assume you're using the Sony IMX219 camera in order to get the 40fps framerate you're requesting. Can you confirm that resolution and framerate are being sent from the camera? What resolution are your image stills coming out at?

If the camera isn't automatically picking up which "mode" to use then you can force it onto a particular mode by setting MMALCameraConfig.SensorMode prior to calling ConfigureCameraSettings.

Thanks, Ian

techyian commented 4 years ago

Hi @kenssamson,

Are you happy for this ticket to be closed now as the fix to https://github.com/techyian/MMALSharp/commit/1e245e0283146ae4219c7ec0286568cda11bc4ed resolved the blocking issue?

kenssamson commented 4 years ago

The main issue appears to be resolved. My follow up question was on how to use the SplitterComponent and the ResizerComponent together. Basically, I want to capture at the higher resolution but save the images at the lower resolution so they can load quicker on the client side.

Thanks for all your time and effort.

Ken

techyian commented 4 years ago

No problem, thanks for replying.

Using your previous code as an example, could you see if the following produces the output you're expecting? I have added a resizer component to your pipeline and set the resolution to 800x600. I can't remember off the top of my head whether MMAL scales or crops to this resolution though.

public async Task<CameraStatus> PictureStream(CancellationToken stoppingToken)
{
    // if called after user cancel requested
    if (stoppingToken.IsCancellationRequested)
    {
        status = CameraStatus.Closed;
    }

    if (!IsReady) return status;

    logger.LogInformation("Initializing Stream");

    var portConfig = new MMALPortConfig(MMALEncoding.JPEG, MMALEncoding.I420, 75);

    ctsStream = new CancellationTokenSource();
    var stopStreamToken = ctsStream.Token;

    using (var imgHandler = new Handlers.OverwriteFileCaptureHandler("/dev/shm/mmalsharp/preview.jpg"))
    using (var splitter = new MMALSplitterComponent())
    using (var resizer = new MMALResizerComponent())
    using (var imgEncoder = new MMALImageEncoder(continuousCapture: true))
    using (var nullSink = new MMALNullSinkComponent())
    {
        var resizerInputConfig = new MMALPortConfig(MMALEncoding.OPAQUE, MMALEncoding.I420);
        var resizerOutputConfig = new MMALPortConfig(MMALEncoding.I420, MMALEncoding.I420, 800, 600, 0, 0, 0, false, null);

        resizer.ConfigureInputPort(resizerInputConfig, splitter.Outputs[0], null)
               .ConfigureOutputPort(resizerOutputConfig, null);

        imgEncoder.ConfigureOutputPort(portConfig, imgHandler);

        mmal.Camera.PreviewPort.ConnectTo(nullSink);
        mmal.Camera.VideoPort.ConnectTo(splitter);
        splitter.Outputs[0].ConnectTo(resizer);
        resizer.Outputs[0].ConnectTo(imgEncoder);

        using (var cts = CancellationTokenSource.CreateLinkedTokenSource(stoppingToken, stopStreamToken))
        {
            logger.LogInformation("Starting Stream");
            // this should run until the camera is 'paused' or stoppingToken is triggered
            await mmal.ProcessAsync(mmal.Camera.VideoPort, cts.Token).ConfigureAwait(false);

            logger.LogInformation("Stream Stopped");

            if (stopStreamToken.IsCancellationRequested)
            {
                logger.LogInformation("Stream Paused");
                status = CameraStatus.Paused;
            }
            else if (stoppingToken.IsCancellationRequested)
            {
                logger.LogInformation("Stream Cancelled by User");
                status = CameraStatus.Closed;

                // perform clean-up if exiting process
                ctsStream.Dispose();
                ctsStream = null;
            }
        }
    }

    return status;
}

If you're still having issues I'll test on my Pi later.

Ian

kenssamson commented 4 years ago

I believe at 800x600 it crops. I've been using 820x616, which is exactly half the desired capture resolution, and that seems to scale the image. Also, I would think that the resizer would only scale and that cropping would depend on the capture resolution. That's why I was thinking of trying to use it.

I’m on holiday now and don’t have access to the device. I will see if your recommendation works when I return to work at the end of next week.

Thanks again for all your help.

Ken


techyian commented 4 years ago

Ok, no problem. Internally MMAL likes to keep widths to multiples of 32 and heights to multiples of 16. There's a method on the Resolution class called Pad to help with this.
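
For example, something like this (assuming Pad defaults to those multiples - check the overload if you need different values):

// 820x616 padded to the next width multiple of 32 and height multiple of 16, i.e. 832x624.
var padded = new Resolution(820, 616).Pad();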

Enjoy your holiday!