Does it work if you use picam-1.x?
I reviewed the code, I don't see anything materially wrong.
There was a problem with setting the FPS ranges for high exposure times, see https://github.com/caprica/picam-native/commit/ebcac5aaf7bae1edcfa4d9ab500f15bc108ec0b9.
But I don't think it makes any difference.
To be frank, it's unlikely I will find any time to look deeper into this. This project is a very low priority at the moment. If a problem can be identified, I'll look again.
I can't get picam 1.1.0 to work. The debug report (on instantiating the camera object) shows:

10:58:13.595 [main] DEBUG uk.co.caprica.picam.CameraParameterUtils - getCameraCapturePort()
mmal: mmal_port_parameter_set: param not supplied

Does it work with the new HQ camera? Or maybe I'm being useless, as I'm not using Maven and am copying and compiling all the dependencies by hand.
As for Picam 2 and raspistill, I did some tests with identical configs for both (-ISO 100 -ex verylong -awb sun -mm average -hf -vf -ss 10000000 -drc off -w 4056 -h 3040 -q 100 -o raspistill.jpg -t 10000 -e jpg), the intent being a 10-second exposure with a 10-second timeout:

raspistill took 72 seconds to execute
Picam 2 took 3.5 seconds to execute
Conscious it's just a side project, so if you do get around to looking at it again, hopefully the above might be of use.
Thanks
There's clearly a problem, but where it is I don't know. Your 72 seconds vs 3.5 seconds test is of course convincing.
I never saw any problem like you mentioned with picam-1.x, but I never played around with shutter speed tbh.
At some point I will do a new release of picam-2.x with the change I made yesterday; it is in the area of shutter speed, but I am sceptical that it is materially related to this problem.
I'd like to resolve this, I really just don't have spare time right now. The problem is the Pi dev process is currently a bit painful and time-consuming. I don't work on the Pi at all.
And as for your other question, I don't have the HQ camera, but I see no reason why it wouldn't work.
Just to add - I would like to help but have never touched C. Happy to put in the effort to learn (though it could take quite some time) - any pointers to a good resource for starting out?
My view is anybody who's a decent developer should be able to develop in pretty much any language, but learning C from scratch is kinda hard. Going from Java to C should be very easy though.
I would just advise looking at the source code in the picam-native project, and then looking at the code in RaspiStill.c, and seeing if you can spot a problem. In picam-native it's Camera.c, Configuration.c and Port.c IIRC.
I already did a quick code review just like that but couldn't find anything obviously wrong.
I know for sure that the picam-native code is receiving the correct shutter value from the picam Java side, so the problem is not there.
I've taken a look, focusing just on the flow of the shutterSpeed attribute, and have read up on JNI (and a little C).
1. The Java side all looks consistent.
2. I don't understand how shutterSpeed is passed between Java and C, as I couldn't find any "native" declarations in the Java code, so I assume I've read about a different way of using JNI (if you could point me to an explanation of the method you've used, I can read up further on it).
3. As for C -> MMAL, RaspiStill.c has this (lines 1908 to 1910) within the frame loop:

   // There is a possibility that shutter needs to be set each loop.
   if (mmal_status_to_int(mmal_port_parameter_set_uint32(state.camera_component->control, MMAL_PARAMETER_SHUTTER_SPEED, state.camera_parameters.shutter_speed)) != MMAL_SUCCESS)
      vcos_log_error("Unable to set shutter speed");

   The comment may be the key. I think you may be setting the value just once in the applyCameraConfiguration method in Camera.c, whereas it may actually need to be set in Encoder.c; however, I don't understand how JNI passes the collection of image frames back to the Java code, else I might better understand what is happening in Encoder.c and what change may be necessary (see the sketch after this list).
4. Also, I'm conscious you changed the fps_low numerator for shutter speed > 6000000, but the fps_low numerator for shutter speed > 1000000 should also be 166, not 167.
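If it does turn out that the value needs re-applying per capture, a minimal sketch of that idea might be the following (it only uses MMAL calls that RaspiStill itself uses; the function, port and variable names are placeholders, not picam-native's actual Encoder.c code):

#include "interface/mmal/mmal.h"
#include "interface/mmal/util/mmal_util_params.h"

/* Sketch only: re-apply the shutter speed on the camera control port
   immediately before triggering each capture, mirroring the RaspiStill
   comment quoted above. */
static MMAL_STATUS_T capture_with_shutter(MMAL_PORT_T *controlPort, MMAL_PORT_T *stillPort, uint32_t shutterSpeed) {
    MMAL_STATUS_T status = mmal_port_parameter_set_uint32(controlPort, MMAL_PARAMETER_SHUTTER_SPEED, shutterSpeed);
    if (status != MMAL_SUCCESS)
        return status;
    /* start the actual capture on the still port */
    return mmal_port_parameter_set_boolean(stillPort, MMAL_PARAMETER_CAPTURE, 1);
}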
Hope this is of help
I know with absolute certainty that the JNI layer is receiving the correct shutter speed value, because I printed it out in the JNI layer.
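For anyone following along, that sort of check is just a printf at the native entry point; the snippet below is a generic illustration only (the real picam-native function names and signatures differ):

#include <jni.h>
#include <stdio.h>

/* Illustration only: print the shutter speed a JNI entry point receives from the Java side. */
JNIEXPORT void JNICALL Java_example_Camera_setShutterSpeed(JNIEnv *env, jobject self, jint shutterSpeed) {
    printf("native layer received shutter speed: %d\n", (int) shutterSpeed);
    /* ...then hand the value on to the MMAL control port as usual... */
}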
re: 3. Maybe, but then why does it fail on the first capture?
I'm not following; what fails on first capture?
The shutter speed fails to operate correctly on the first picture you take, doesn't it? The piece of code you highlighted in RaspiStill sets the shutter speed each time a capture is taken when RaspiStill operates in loop mode.
So if what you found was the only problem, then surely when taking the first picture the shutter speed should function as expected?
Because picam does that shutter speed code when setting the camera configuration before the first capture.
I thought it worked differently. I believe the image chip is not a stills chip but a video chip; I believe it gets frames from the chip and overlays them on top of each other to build an image. I assume that is what is happening in the frame loop of RaspiStill.c (unless it is handled in the MMAL layer).
That loop in RaspiStill.c is simply there to capture multiple individual pictures, is it not?
There is a separate period where you prepare the capture then let the sensor "settle" before actually performing the capture.
But I'm not a hardware person, I don't know much about these things.
No, I don't think so. Check out page 108 of https://magazines-static.raspberrypi.org/books/full_pdfs/000/000/036/original/Camera-Guide.pdf?1588180275
or as an extract: In addition, the HQ Camera or Camera Module acts more like a video camera than a stills camera, as it is rarely idle. Once initialised, it is constantly streaming rows of frames down the ribbon cable to Raspberry Pi for processing. Numerous background tasks include automatic gain control, exposure time, and white balance. That’s why it’s best to give it a couple of seconds or more once activated, to adjust the exposure levels and gains before capturing an image.
Or all the gory details direct from Pi are here: https://picamera.readthedocs.io/en/latest/fov.html
So the frame loop is where I think it overlays frames upon each other for the requisite period. Because of this, being able to amend the analogue and digital gains is helpful, as it would potentially enable manual optimisation of noise reduction (although this is just a hypothesis).
Look at wait_for_next_frame and FRAME_NEXT_SINGLE. That's what my library does.
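My reading of that path, paraphrased from memory rather than the literal RaspiStill source, is that FRAME_NEXT_SINGLE just sleeps out the timeout and then signals that only the one frame should be captured, so any "per loop" parameter setting still runs exactly once:

#include <stdint.h>
#include "interface/vcos/vcos.h"

/* Sketch of the single-capture behaviour: wait out the timeout, then indicate
   that no further frames should be captured after this one. */
static int wait_for_next_frame_single(uint32_t timeout_ms) {
    vcos_sleep(timeout_ms);   /* wait out the timeout */
    return 0;                 /* 0 => no further frames: a single capture */
}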
Okay, if I understand and have read the code correctly, your code would only perform the equivalent of a single iteration of the while(keep_looping) loop, as you enforce a sleep (or something equivalent) for the remainder of the Timeout attribute, so shutterSpeed is effectively set once per capture anyway. On that basis, I have gone through the detailed technical info on the Picamera and this paragraph raised another possibility:
The maximum exposure time is currently 6 seconds on the V1 camera module, and 10 seconds on the V2 camera module. Remember that exposure time is limited by framerate, so you need to set an extremely slow framerate before setting shutter_speed.
This may play into the fps value and when it is set. On the HQ camera the maximum exposure time is 200 seconds, which gives a minimum framerate of 0.005 (i.e. 5/1000) - which is probably now represented by:

param.fps_low.num = 5;
param.fps_low.den = 1000;

in your Port.c file - so you may have fixed it yesterday.
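Just to spell the arithmetic out (a trivial standalone check; the 200-second figure is the HQ camera maximum quoted above):

#include <stdio.h>
#include <stdint.h>

int main(void) {
    uint32_t shutter_us = 200000000;             /* 200 s, the HQ camera maximum exposure */
    double min_fps = 1000000.0 / shutter_us;     /* the frame rate cannot exceed 1/exposure */
    uint32_t num = 1000000000u / shutter_us;     /* the same value expressed as a rational over 1000 */
    printf("minimum frame rate = %.3f fps = %u/1000\n", min_fps, num);   /* prints 0.005 fps = 5/1000 */
    return 0;
}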
I need to learn how to generate the .so file from your .c code to test it.
To build the .so, clone the repo to your Pi and run the pi.sh file.
And yes, your understanding of what my code does is correct.
Interesting that it says you need to set the frame rate before the shutter speed.
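For reference, the order that implies - and that RaspiStill follows, FPS range on the still port first, then the shutter speed on the control port - would look roughly like this. A sketch with placeholder port variables, not picam-native's actual code:

#include "interface/mmal/mmal.h"
#include "interface/mmal/util/mmal_util_params.h"

/* Sketch: constrain the frame rate before requesting a long exposure. */
static MMAL_STATUS_T set_long_exposure(MMAL_PORT_T *stillPort, MMAL_PORT_T *controlPort, uint32_t shutterSpeed) {
    MMAL_PARAMETER_FPS_RANGE_T fps_range = {{MMAL_PARAMETER_FPS_RANGE, sizeof(fps_range)},
                                            {5, 1000}, {166, 1000}};   /* 0.005 - 0.166 fps, the > 6s range */
    MMAL_STATUS_T status = mmal_port_parameter_set(stillPort, &fps_range.hdr);
    if (status != MMAL_SUCCESS)
        return status;
    /* only now set the exposure time itself */
    return mmal_port_parameter_set_uint32(controlPort, MMAL_PARAMETER_SHUTTER_SPEED, shutterSpeed);
}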
Tried with your latest Port.c file; it made no difference (still a 3.5-second execution duration for a 10-second shutterSpeed with a 10-second timeout). I validated I wasn't using the old .so file by amending pi.sh to build with a different name and changing the System.load() call in my Java test file.
So I confirmed in RaspiStill.c that mmal_port_parameter_set(still_port, &fps_range.hdr) is called before MMAL_PARAMETER_SHUTTER_SPEED is set.
So I thought I'd try to amend your code accordingly and moved

setFpsRange(capturePort, control->shutterSpeed) &&

from the bottom to near the top of the return statement of Camera.applyCameraConfiguration(PicamContext *context), then rebuilt the .so file and re-tested: no change.
As I'm only understanding a small proportion of the code (C is much harder to decipher than Java), I do feel I'm scratching around in the dark somewhat.
Having said that, your version of setFpsRange() only differs from the corresponding RaspiStill code around mmal_port_parameter_set() by your use of:

param.fps_low.num = 5;
param.fps_low.den = 1000;
param.fps_high.num = 166;
param.fps_high.den = 1000;

and their use of:

{ 5, 1000 }, {166, 1000}

and I assume that makes no difference (although I don't understand what is driving the variation, nor how these get turned into two rationals).
Any guidance on how to debug this? (I assume using fprintf() in the .so file would output to the console that the .jar file which System.load()s the .so is executed from.)
printf is the way to debug here! Unless you want to set up a native debugger on the Pi - stuff like gdb is possible, but it's not trivial.
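One cheap thing printf gives you here is visibility of the MMAL return codes, so a silently failing parameter set shows up. A small sketch you could wrap around whichever calls you want to watch:

#include <stdio.h>
#include "interface/mmal/mmal.h"

/* Sketch: log the status of an MMAL call instead of discarding it. */
static MMAL_STATUS_T checked(const char *what, MMAL_STATUS_T status) {
    if (status != MMAL_SUCCESS)
        printf("%s failed, MMAL status %d\n", what, (int) status);
    else
        printf("%s ok\n", what);
    return status;
}

/* e.g. checked("set shutter speed",
                mmal_port_parameter_set_uint32(controlPort, MMAL_PARAMETER_SHUTTER_SPEED, shutterSpeed)); */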
The structure packing of those fraction initialisations is equivalent as far as I can remember, as long as the field ordering is the same. But just looking at it, the numerators and denominators look right, and "low" is less than "high". Expressing integer fractions just means you can do your multiplies and divides as integers, only ending up with a float at the latest possible moment, which reduces rounding errors; I expect that happens deep in the MMAL code somewhere. I prefer not to use the brace notation since I like things to be explicitly clear.
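i.e. something like this, where both forms should end up with identical values (assuming num is declared before den, which it is in mmal_types.h):

#include "interface/mmal/mmal.h"

/* Both of these build the same MMAL_RATIONAL_T (num = 5, den = 1000). */
MMAL_RATIONAL_T positional = { 5, 1000 };   /* brace notation, as in RaspiStill */

static MMAL_RATIONAL_T field_by_field(void) {
    MMAL_RATIONAL_T r;
    r.num = 5;                              /* explicit assignment, as in Port.c */
    r.den = 1000;
    return r;
}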
What you're doing is helpful, but clearly this is something subtle that was missed when I wrote this code. I just can't myself spend hours poring over this code looking for it.
Played some more. Added

int rtnValue = setFpsRange(capturePort, control->shutterSpeed);
rtnValue = setUInt32(controlPort, MMAL_PARAMETER_SHUTTER_SPEED, control->shutterSpeed);

to the body of Camera.applyCameraConfiguration(). I added get functions too, to return the values and printf them. They all work - the FPS range and shutterSpeed are set as expected. There is no change to the execution time for a shutter speed setting of 10 seconds (shutterSpeed value of 10000000).
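For anyone repeating this, the read-back can be done with the matching get variants, e.g. (a sketch with a placeholder port variable):

#include <stdio.h>
#include "interface/mmal/mmal.h"
#include "interface/mmal/util/mmal_util_params.h"

/* Sketch: read the shutter speed back from the control port after setting it,
   to confirm the firmware accepted the value. */
static void dump_shutter_speed(MMAL_PORT_T *controlPort) {
    uint32_t value = 0;
    if (mmal_port_parameter_get_uint32(controlPort, MMAL_PARAMETER_SHUTTER_SPEED, &value) == MMAL_SUCCESS)
        printf("shutter speed reported by firmware: %u us\n", value);
    else
        printf("failed to read shutter speed back\n");
}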
In testing the >6000000 and >1000000 variances in the FPS range, it was clear that something does happen as shutterSpeed is changed. The following data was compiled by measuring just the duration of the .takePicture() call in the Java app (rather than the whole Java main() method) and varying just the shutterSpeed:

20000000 (20 seconds) takes 1.679 seconds to take the picture
2000000 (2 seconds) takes 1.680 seconds to take the picture
200000 (200 milliseconds) takes 869 milliseconds to take the picture
20000 (20 milliseconds) takes 685 milliseconds to take the picture
2000 (2 milliseconds) takes 656 milliseconds to take the picture

During all of these, the FPS range and shutterSpeed printf's reported the values I was expecting.
Clearly there is an overhead (which may include default disposal of frames to enable the algorithms to set some basic levels of gain etc.), but in the data above, even though I am increasing shutterSpeed by orders of magnitude each time, the largest single increase is only a doubling in duration (and that only occurs once).
Maybe there is another setting which is overriding the shutterSpeed?
I don't know. If there is some variance, then either that could be some random corruption somewhere, or maybe it IS related to a problem with the FPS ranges being set. Seems fishy.
Some very brief testing...
Shutter speed -> capture duration (including whatever delays are intentionally introduced by the test application; I think there's around 5s of extra delay by default):

1,000 -> 5.4s
10,000 -> 5.4s
50,000 -> 5.4s
80,000 -> 5.4s
1,000,000 -> 6.3s
1,000,001 -> 6.9s
5,000,000 -> 6.9s
6,000,000 -> 6.9s
6,000,001 -> 11.9s
100,000,000 -> 11.9s
This actually tallies with the breakpoints in the code:
int setFpsRange(MMAL_PORT_T *port, uint32_t shutterSpeed) {
    MMAL_PARAMETER_FPS_RANGE_T param = {{MMAL_PARAMETER_FPS_RANGE, sizeof(param)}};
    printf("setFpsRange for shutter speed %d\n", shutterSpeed);
    if (shutterSpeed > 6000000) { // <---- 6,000,000
        param.fps_low.num = 5;
        param.fps_low.den = 1000;
        param.fps_high.num = 166;
        param.fps_high.den = 1000;
    } else if (shutterSpeed > 1000000) { // <--- 1,000,000
        param.fps_low.num = 166;
        param.fps_low.den = 1000;
        param.fps_high.num = 999;
        param.fps_high.den = 1000;
    }
    return mmal_port_parameter_set(port, &param.hdr) == MMAL_SUCCESS ? 1 : 0;
}
Of note is that closing/cleaning up the camera seems to take significantly longer with the high values.
Interesting that for values less than or equal to 1,000,000 the FPS settings are zeros. That presumably means something.
These results show that the shutter speed settings have some effect, at least according to these breakpoints for the FPS ranges.
But it does not explain the problem of course.
From browsing the native code I wondered if the camera must be in burst capture mode:
mmal_port_parameter_set_boolean(state.camera_component->control, MMAL_PARAMETER_CAMERA_BURST_CAPTURE, 1)
But adding this makes no difference.
Using the Pi HQ camera with picam-2.0.2.jar as the library and picam-2.0.1.so as the native library (I couldn't find a 2.0.2 version) - everything appears to work as expected except shutter speed.
I would like to play with long exposures up to the longest shutter speed (200 seconds for the HQ camera), but whatever value I use, the duration of Camera.takePicture() is typically very quick (less than 1 second). I can make it work with raspistill - I have to make sure -timeout is also commensurately long (e.g. -t 200000 -ss 200000000) - but entering these values in a CameraConfiguration does not affect the exposure.