Closed MV10 closed 4 years ago
I've just hacked together an async `Channel<string>` buffering setup, and performance is equivalent to suppressing console output. I did a quick-and-dirty test by "buffering" to a public string in Program.cs and only dumping that after the recording stopped. With buffering, ffmpeg settled at a comfortable 1.23x speed across just about any duration of recording. The channel approach shown below yields the same results, but you still get realtime console output during recording.
With Console.WriteLine in the mix, ffmpeg often dropped to about 1.13x on a static scene and dipped below 1.0x with a lot of motion (which is when you lose recording time versus the requested duration). It also dropped frames a lot more often, probably thanks to Console.WriteLine's blocking behavior.
Here's what I did in a local copy of the ffmpeg handler. There are probably architecturally better ways to fit this into your project, but this works as a proof of concept.
First, a helper function to create the channel:
```csharp
/// <summary>
/// Convenience method for creating the Channel used to buffer console output. Pass
/// this to the constructor, then await Task.WhenAll for both MMALCamera.ProcessAsync
/// and EmitStdOutBuffer.
/// </summary>
/// <returns>A new unbounded Channel object.</returns>
public static Channel<string> CreateStdOutBuffer()
    => Channel.CreateUnbounded<string>(new UnboundedChannelOptions { SingleReader = true, SingleWriter = true });
```
Modify the constructor and save the channel to an instance variable. If the buffer argument is null, output is completely suppressed. The output events are wired up to a helper method:
```csharp
private Channel<string> outputBuffer;

public BasicFFMpegCaptureHandler(string argument, Channel<string> stdoutBuffer = null)
{
    outputBuffer = stdoutBuffer;
    // ...etc
    _process.EnableRaisingEvents = true;
    _process.OutputDataReceived += WriteToStdOut;
    _process.ErrorDataReceived += WriteToStdOut;
    // ...etc
}
```
```csharp
// Using "async void" is ok for an event handler. The purpose of a Task is to communicate the
// result of an operation to some external "observer" -- but by definition the caller that fires
// an event doesn't care about the result of the operation. The only caveat is that you get no
// exception handling; an unhandled exception would terminate the process.
private async void WriteToStdOut(object sendingProcess, DataReceivedEventArgs e)
{
    try
    {
        if (outputBuffer != null && e.Data != null)
            await outputBuffer.Writer.WriteAsync(e.Data);
    }
    catch
    { }
}
```
Another helper method to process the buffer:
```csharp
/// <summary>
/// When console output is buffered, this asynchronously reads/writes the buffer contents
/// without blocking ffmpeg, as a raw Console.WriteLine would in response to process output.
/// </summary>
public static async Task EmitStdOutBuffer(Channel<string> stdoutBuffer, CancellationToken cancellationToken)
{
    try
    {
        while (await stdoutBuffer.Reader.WaitToReadAsync(cancellationToken))
            Console.WriteLine(await stdoutBuffer.Reader.ReadAsync());
    }
    catch (OperationCanceledException)
    {
        // token cancellation, disregard
    }
}
```
Then in the application code:
```csharp
static async Task mp4(int seconds)
{
    // routine
    MMALCameraConfig.SensorMode = MMALSensorMode.Mode4; // 1296 x 972 with 2x2 binning
    var cam = MMALCamera.Instance;
    var pathname = "/media/ramdisk/video.mp4";
    File.Delete(pathname);
    cam.ConfigureCameraSettings();

    // get a buffer and pass it to the capture handler
    var stdout = BasicFFMpegCaptureHandler.CreateStdOutBuffer();
    var ffmpeg = new FFMpegCaptureHandler($"-framerate 24 -i - -b:v 2500k -c copy {pathname}", stdout);

    // all of this is routine
    var portCfg = new MMALPortConfig(MMALEncoding.H264, MMALEncoding.I420, quality: 10, bitrate: 0, timeout: null);
    using var encoder = new MMALVideoEncoder();
    using var renderer = new MMALVideoRenderer();
    encoder.ConfigureOutputPort(portCfg, ffmpeg);
    cam.Camera.VideoPort.ConnectTo(encoder);
    cam.Camera.PreviewPort.ConnectTo(renderer);
    await Task.Delay(2000);

    var token = new CancellationTokenSource(TimeSpan.FromSeconds(seconds));

    // concurrently process the video capture and buffered console output
    await Task.WhenAll(
        cam.ProcessAsync(cam.Camera.VideoPort, token.Token),
        BasicFFMpegCaptureHandler.EmitStdOutBuffer(stdout, token.Token));
}
```
It turns out my testing was still running at 1080p. Apparently it isn't enough to just set the sensor mode; the resolution also has to be set to match (or maybe the mode setting doesn't matter at all, I haven't tried it without). Still, MP4 recording at 1080p 24FPS with a 10% to 15% speed improvement is pretty good, I think.
```csharp
MMALCameraConfig.VideoResolution = new MMALSharp.Common.Utility.Resolution(1296, 972);
```
Anyway, after messing around with various MP4 captures for a while, I eventually get an MMAL error, `mmal: mmal_vc_component_enable: failed to enable component: ENOSPC`, followed by an error from MMALSharp itself, `Out of resources. Unable to enable component`. (Edit: side effect of not Disposing the capture handler, apparently.) I'd moved on from MP4 recording to messing with MJPEG streaming via cvlc, which works -- once, then I get those error messages. Apparently at that point MMAL is dead until I reboot.
I wondered if it could be related to my suspicion that Dispose isn't being called, but I realized I wasn't calling Cleanup before exit. However, calling Cleanup yields a different error: `Argument is invalid. Unable to destroy component`. The only time Cleanup works is when I run a trivial `TakePicture` test.
I'm going to have a look at your package's verbose logging, but if you have some time to kill, here's the current state of my simple experiments:
https://github.com/MV10/pi-cam-test/blob/master/Program.cs
Edit: Directly cutting-and-pasting your motion-detection sample from the wiki also throws the `Argument is invalid. Unable to destroy component` exception when calling Cleanup.
Dispose is definitely not being called. I set up Serilog and, excluding all the frame stuff, there's nothing unusual-looking during startup, but after recording ends, this is all we get. The exception entry at the end is from my Program.cs in response to calling cam.Cleanup:
```
2020-07-05 15:05:26.638 -04:00 [DBG] Releasing buffer.
2020-07-05 15:05:26.638 -04:00 [DBG] Port not enabled.
2020-07-05 15:05:26.643 -04:00 [DBG] Destroying output port pool.
2020-07-05 15:05:26.649 -04:00 [DBG] Releasing active buffers for port vc.ril.video_encode:out:0(H264).
2020-07-05 15:05:26.650 -04:00 [DBG] Disposing buffer pool.
2020-07-05 15:05:26.657 -04:00 [DBG] Destroying final components
2020-07-05 15:05:26.668 -04:00 [DBG] Removing Component connection - Upstream component: vc.ril.camera on port vc.ril.camera:out:1(OPQV) Downstream component: vc.ril.video_encode on port vc.ril.video_encode:in:0(OPQV)
2020-07-05 15:05:26.670 -04:00 [DBG] Disposing connection.
2020-07-05 15:05:26.672 -04:00 [DBG] Destroying output port pool.
2020-07-05 15:05:26.672 -04:00 [DBG] Buffer pool already null or disposed of.
2020-07-05 15:05:26.721 -04:00 [DBG] Remaining components in pipeline: 1
2020-07-05 15:05:26.727 -04:00 [DBG] Disposing component vc.ril.video_encode.
2020-07-05 15:05:26.727 -04:00 [DBG] Destroying port pool
2020-07-05 15:05:26.727 -04:00 [DBG] Buffer pool already null or disposed of.
2020-07-05 15:05:26.733 -04:00 [DBG] Completed disposal...
2020-07-05 15:05:26.735 -04:00 [DBG] Removing Component connection - Upstream component: vc.ril.camera on port vc.ril.camera:out:0(OPQV) Downstream component: vc.ril.video_render on port vc.ril.video_render:in:0(OPQV)
2020-07-05 15:05:26.735 -04:00 [DBG] Disposing connection.
2020-07-05 15:05:26.746 -04:00 [DBG] Remaining components in pipeline: 0
2020-07-05 15:05:26.746 -04:00 [DBG] Disposing component vc.ril.video_render.
2020-07-05 15:05:26.749 -04:00 [DBG] Completed disposal...
2020-07-05 15:05:26.750 -04:00 [DBG] Disposing component vc.ril.camera.
2020-07-05 15:05:26.793 -04:00 [DBG] Completed disposal...
2020-07-05 15:05:26.793 -04:00 [DBG] Remaining components in pipeline: 0
2020-07-05 15:05:26.793 -04:00 [DBG] Disposing component vc.ril.video_render.
2020-07-05 15:05:26.795 -04:00 [DBG] Remaining components in pipeline: 0
2020-07-05 15:05:26.795 -04:00 [DBG] Disposing component vc.ril.video_encode.
2020-07-05 15:05:26.795 -04:00 [DBG] Destroying port pool
2020-07-05 15:05:26.795 -04:00 [DBG] Buffer pool already null or disposed of.
2020-07-05 15:05:26.803 -04:00 [ERR] Exception: Argument is invalid. Unable to destroy component
```
My Dispose looks like this, so that should have shown up in the log somewhere:
```csharp
public void Dispose()
{
    if (Common.Utility.MMALLog.Logger != null)
        Common.Utility.MMALLog.Logger.LogTrace("Disposing ExternalProcessCaptureHandler");

    if (!_process.HasExited)
    {
        _process.Kill();
    }

    _process.Close();
    _process.Dispose();
}
```
As an aside, Serilog doesn't normally work with `LoggerFactory` the way older loggers do (I used to be a frequent Serilog contributor). If you want to update your wiki to show how to wire up Serilog, this is the least-effort option. It's kind of hard to track down because one of the packages isn't commonly used any more. Dependencies are Microsoft.Extensions.Logging, Serilog.Extensions.Logging, and Serilog.Sinks.File.
```csharp
if (useDebug)
{
    File.Delete("debug.log"); // log output will append
    MMALCameraConfig.Debug = true;
    MMALLog.LoggerFactory = new LoggerFactory()
        .AddSerilog(new LoggerConfiguration()
            .MinimumLevel.Verbose()
            .WriteTo.File("debug.log")
            .CreateLogger());
}
```
Moments after shutting down I realized the call to ffmpeg (whether it's your examples using your real ffmpeg class or my modified generic-process version) should also be wrapped in a `using` statement. I still think something elsewhere in the pipeline probably ought to call Dispose, but this at least results in the `Process` object being cleanly released eventually. Unfortunately it doesn't address the Cleanup exception. ("VRB" = Verbose, Serilog's equivalent of the Trace log level.)
Last bits of the log after adding the `using`:
```
2020-07-05 15:28:27.281 -04:00 [DBG] Disposing component vc.ril.camera.
2020-07-05 15:28:27.304 -04:00 [DBG] Completed disposal...
2020-07-05 15:28:27.305 -04:00 [DBG] Remaining components in pipeline: 0
2020-07-05 15:28:27.305 -04:00 [DBG] Disposing component vc.ril.video_render.
2020-07-05 15:28:27.305 -04:00 [DBG] Remaining components in pipeline: 0
2020-07-05 15:28:27.305 -04:00 [DBG] Disposing component vc.ril.video_encode.
2020-07-05 15:28:27.306 -04:00 [DBG] Destroying port pool
2020-07-05 15:28:27.306 -04:00 [DBG] Buffer pool already null or disposed of.
2020-07-05 15:28:27.307 -04:00 [VRB] Disposing ExternalProcessCaptureHandler
2020-07-05 15:28:27.312 -04:00 [ERR] Exception: Argument is invalid. Unable to destroy component
2020-07-05 15:28:27.313 -04:00 [DBG] Application exit
```
I'll close this and open a new better-focused issue specific to buffering the stdout/stderr streams.
After a bunch of research, it actually isn't possible to generate a valid MP4 on the fly using ffmpeg as a child process. The short version is that ffmpeg will only write the trailing header (MOOV atom) when it cleanly terminates. Supposedly you can do this by sending SIGINT twice, but as a child process that actually kills it immediately. Stack Overflow is awash in questions about this, and on the ffmpeg mailing list I found discussions of the problem going back ten years (this guy tested every signal, and this person even proposed a fix which seems to have been ignored).
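(For anyone who does need on-the-fly MP4: ffmpeg's fragmented-MP4 mode is often suggested as a workaround, since `-movflags frag_keyframe+empty_moov` makes it write `moov` data incrementally instead of as a trailer. I haven't tested this here; a sketch against the capture-handler argument string from earlier:)

```csharp
// Untested sketch: frag_keyframe+empty_moov asks ffmpeg to emit a fragmented MP4,
// which stays playable even if the process dies without writing a trailer, at the
// cost of a less widely compatible file layout.
var ffmpeg = new FFMpegCaptureHandler(
    $"-framerate 24 -i - -b:v 2500k -c copy -movflags frag_keyframe+empty_moov {pathname}",
    stdout);
```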
I did get some exit output with SIGINT then SIGQUIT (via `Mono.Unix.Native.Syscall`) but the file was still trashed:
```
Error writing trailer of /media/ramdisk/video.mp4: Immediate exit requested
frame=  238 fps= 26 q=-1.0 Lsize=   58880kB time=00:00:09.87 bitrate=48844.7kbits/s speed=1.08x
video:59134kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown
Exiting normally, received signal 3.
```
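For reference, the signalling looked roughly like this (the method name is mine, `Syscall.kill` comes from the Mono.Posix package, and `_process` is the handler's child-process field; timeouts are arbitrary):

```csharp
using Mono.Unix.Native; // Mono.Posix package

// Rough sketch of the SIGINT-then-SIGQUIT attempt described above.
// _process is the System.Diagnostics.Process wrapping ffmpeg.
private void TrySignalExit()
{
    if (_process.HasExited)
        return;

    Syscall.kill(_process.Id, Signum.SIGINT);      // ask ffmpeg to finish up
    if (!_process.WaitForExit(2000))
        Syscall.kill(_process.Id, Signum.SIGQUIT); // still running: quit
    _process.WaitForExit(2000);
}
```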
Fortunately I don't actually need direct-to-MP4 output, I really only need h264 files which can be processed later, so this was mostly me refusing to give up. Anyway, I'll write a cleaner external process manager and start a new discussion.
If you saw the noise of my earlier posts to this topic -- my apologies. I went from playing with raspivid command lines to using your library (which is excellent, by the way) and forgot to set a binning mode, so I thought I was seeing hard-to-believe overhead. But it did lead me to the console output issue, which is still a small one.
I've done quite a bit of testing, and calling `Console.WriteLine` measurably degrades ffmpeg performance in the FFmpegCaptureHandler. Not surprisingly, this gets even worse if you're watching that output on a remote terminal. I propose a simple constructor switch to disable that output for running headless, where it wouldn't ever be seen or be useful anyway.
Additionally, I don't think Dispose is being called in the handler. For one of my tests I simply buffered the output to a string, then wrote that to the console in the Dispose. It was never output. I backtracked through ProcessAsync and I see where it ought to be called in all the output component code, but it isn't. Edit: This ended up being my fault (forgot the `using` statement).
Also, in addition to `Process.Kill`, outside that if-block Dispose should also call `Process.Close` then `Process.Dispose`. And the catch-block for the `Process` method in the ffmpeg handler should call Dispose rather than directly killing the process. I'd be happy to PR some of this, but I'm not sure how to track down the Dispose problem yet.