mkj-stonehill opened 3 years ago
In order to debug this more easily, I created the following console application. When run, MMAL throws an exception on the ConfigureCameraSettings() call:
Unhandled Exception:
MMALSharp.MMALNoSpaceException: Out of resources. Unable to enable component occurred
If I comment out the line which sets the resolution, it runs successfully to completion, and I get 5 BMP files, all at the default resolution of 1280 x 720. How do I acquire still images at the camera's maximum resolution? This is a Pi Camera V2.1, so it should be capable of 3280 x 2464, which is what I get when I take a picture using raspistill. Setting the resolution to anything larger than the default 1280 x 720 (e.g. I tried Resolution.As1080p) causes this exception.
using MMALSharp;
using MMALSharp.Common;
using MMALSharp.Common.Utility;
using MMALSharp.Components;
using MMALSharp.Handlers;
using MMALSharp.Native;
using MMALSharp.Ports;
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Drawing;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
namespace cap
{
    class Program
    {
        async static Task Main(string[] args)
        {
            Stopwatch sw = new Stopwatch();

            Console.WriteLine("Configuring camera pipeline...");
            sw.Start();

            // Configure various settings
            MMALCameraConfig.Resolution = new Resolution(3280, 2464);

            // Get reference to the (one and only) camera object
            MMALCamera cam = MMALCamera.Instance;

            using (var imgCaptureHandler = new MemoryStreamCaptureHandler())
            using (var imgEncoder = new MMALImageEncoder())
            using (var nullSink = new MMALNullSinkComponent())
            {
                cam.ConfigureCameraSettings();

                var portConfig = new MMALPortConfig(MMALEncoding.JPEG, MMALEncoding.I420, quality: 90);
                imgEncoder.ConfigureOutputPort(portConfig, imgCaptureHandler);

                cam.Camera.StillPort.ConnectTo(imgEncoder);
                cam.Camera.PreviewPort.ConnectTo(nullSink);

                // Camera warm up time
                sw.Stop();
                Console.WriteLine($"{sw.ElapsedMilliseconds}ms; Warming up camera...");
                sw.Restart();
                await Task.Delay(2000);
                sw.Stop();
                Console.WriteLine($"{sw.ElapsedMilliseconds}ms;");

                for (int n = 1; n <= 5; n++)
                {
#if false
                    Console.WriteLine("\nHit any key to take a picture (ESC to exit)...");
                    var k = Console.ReadKey();
                    if (k.Key == ConsoleKey.Escape)
                        break;
#endif
                    // Wait for the next picture
                    Console.WriteLine($"\nAcquiring picture {n}...");
                    sw.Restart();
                    await cam.ProcessAsync(cam.Camera.StillPort);
                    sw.Stop();

                    Console.WriteLine($"{sw.ElapsedMilliseconds}ms; saving to cap{n}.bmp...");
                    sw.Restart();
                    using (var bitmap = Bitmap.FromStream(imgCaptureHandler.CurrentStream))
                    {
                        bitmap.Save($"cap{n}.bmp");
                    }
                    sw.Stop();
                    Console.WriteLine($"{sw.ElapsedMilliseconds}ms;");
                }

                Console.WriteLine("\nExiting.");
            }
        }
    }
}
I've received an MMALNoSpaceException before when I've had two processes trying to access the camera at the same time. You could try rebooting the Pi, and check that you don't accidentally have two instances of your program running, and that raspistill isn't running at the same time.
If that doesn't solve the issue, the first thing I'd try is to change:
cam.ConfigureCameraSettings();
to:
cam.ConfigureCameraSettings(imgCaptureHandler);
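In the context of your program, the change would look something like this (just a sketch; the rest of the pipeline stays as you have it):

using (var imgCaptureHandler = new MemoryStreamCaptureHandler())
using (var imgEncoder = new MMALImageEncoder())
using (var nullSink = new MMALNullSinkComponent())
{
    // Pass the capture handler into the camera configuration as well as the encoder.
    cam.ConfigureCameraSettings(imgCaptureHandler);

    var portConfig = new MMALPortConfig(MMALEncoding.JPEG, MMALEncoding.I420, quality: 90);
    imgEncoder.ConfigureOutputPort(portConfig, imgCaptureHandler);

    cam.Camera.StillPort.ConnectTo(imgEncoder);
    cam.Camera.PreviewPort.ConnectTo(nullSink);

    // ... warm-up and capture loop unchanged
}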
Failing that, you could try removing the MMALImageEncoder and using this method to convert the raw bytes to a bitmap instead:
// Requires: using System.Drawing; using System.Drawing.Imaging; using System.Runtime.InteropServices;
public static Bitmap BitmapFromRawData(byte[] data, int width, int height)
{
    var bmp = new Bitmap(width, height, PixelFormat.Format24bppRgb);

    // Lock the whole bitmap for writing and copy the raw 24-bit pixel data straight into it.
    var bmpData = bmp.LockBits(new Rectangle(0, 0, bmp.Width, bmp.Height),
                               ImageLockMode.WriteOnly,
                               bmp.PixelFormat);

    var pNative = bmpData.Scan0;
    Marshal.Copy(data, 0, pNative, width * height * 3);

    bmp.UnlockBits(bmpData);
    return bmp;
}
If you use the method above, ensure you have your encoding set to:
MMALCameraConfig.StillEncoding = MMALEncoding.BGR24;
MMALCameraConfig.StillSubFormat = MMALEncoding.BGR24;
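For illustration, the raw-capture variant might look roughly like this. This is only a sketch: it assumes that passing the handler into ConfigureCameraSettings is what routes the raw still frames into its stream, and note that MMAL may pad frames (width to a multiple of 32, height to a multiple of 16), in which case the buffer will be larger than width * height * 3 and the dimensions passed to BitmapFromRawData will need adjusting.

MMALCameraConfig.StillEncoding = MMALEncoding.BGR24;
MMALCameraConfig.StillSubFormat = MMALEncoding.BGR24;
MMALCameraConfig.Resolution = new Resolution(3280, 2464);

MMALCamera cam = MMALCamera.Instance;

using (var rawHandler = new MemoryStreamCaptureHandler())
using (var nullSink = new MMALNullSinkComponent())
{
    // No image encoder in the pipeline; the raw frames should land in the handler's stream.
    cam.ConfigureCameraSettings(rawHandler);
    cam.Camera.PreviewPort.ConnectTo(nullSink);

    // Camera warm-up
    await Task.Delay(2000);

    await cam.ProcessAsync(cam.Camera.StillPort);

    // Convert the raw BGR24 bytes to a Bitmap using the helper above.
    var raw = rawHandler.CurrentStream.ToArray();
    using (var bmp = BitmapFromRawData(raw, 3280, 2464))
    {
        bmp.Save("raw-capture.bmp");
    }
}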
I wanted to add dual camera support to MMALSharp, so I cloned the repo and built it into my application (it was using version 0.6 from NuGet before, successfully capturing images from camera 0).
There were a few changes required in my app with the latest version of MMALSharp (mostly the merged properties in MMALCameraConfig), but they seemed straightforward. However, now when I run it, the very first
await this.Cam.ProcessAsync(this.Cam.Camera.StillPort);
never completes. I've looked at Issue #198 (which is very similar), but I'm never even getting the StillPort.NativeOutputPortCallback call.
I've even looked through all of the diffs since the 0.6 release, but nothing obvious (to me) jumps out. It's probably something stupid I'm doing (or not doing) that just happened to work on 0.6 but no longer does. Can you point me in the right direction?
I've added logging (using NLog), and here's the log I get: