dridri / OpenMaxIL-cpp

OpenMax IL C++ wrapper for RaspberryPi
MIT License

[OMX.broadcom.camera]Wating state to be 3 #1

Closed cedricve closed 7 years ago

cedricve commented 7 years ago

Trying your API out, but when running the samples it blocks on waiting for state 3.

  [OMX.broadcom.camera] SetConfig 100000B failed : 8000101A
  ===> types : 8[200] | 8[71]
  Port 71: out 1/1 30720 16 disabled,not pop.,not cont. 1280x720 1280x16 @0fps 20
  Port 200: in 1/1 15360 16 disabled,not pop.,not cont. 160x64 160x64 @30fps 20
  Wating port 71 to be 1
  Port 71: out 0/0 30720 16 enabled,populated,not cont. 1280x720 1280x16 @0fps 20
  [OMX.broadcom.camera]Wating state to be 3

dridri commented 7 years ago

Commit 929adde should solve the problem

cedricve commented 7 years ago

By the speed of light @dridri, awesome!

cedricve commented 7 years ago

Also, I'm wondering about using this library in the Kerberos.io project. I'll need to capture images from the preview port 70 (and convert them to cv::Mat), while having the ability to record (simultaneously) on the video port 71. Is this possible with the library, or will it require more development?

dridri commented 7 years ago

Just saw there was also a missing argument in the Camera() call, solved by 78c9a44.

Yep, you can use all three ports of the camera simultaneously, by using:

OMX_ERRORTYPE Camera::SetupTunnelPreview( Component* next, uint8_t port_input = 0 );
OMX_ERRORTYPE Camera::SetupTunnelVideo( Component* next, uint8_t port_input = 0 );
OMX_ERRORTYPE Camera::SetupTunnelImage( Component* next, uint8_t port_input = 0 );

Calling Camera::SetCapturing() on port 70 or 71 will start both of them. To take still captures you have to call SetCapturing( true, 72 ) each time you need one.
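
For illustration, a rough sketch of that flow; the header names, constructor arguments and the encoder class name below are only placeholders (the samples directory has the real API):

// Rough sketch only: headers, constructor arguments and the encoder class
// name are assumptions, see the samples directory for the actual API.
#include "Camera.h"
#include "VideoEncode.h"   // assumed encoder component of the wrapper

int main()
{
    Camera* camera = new Camera( 1280, 720 );        // placeholder arguments
    VideoEncode* encoder = new VideoEncode( 4096 );  // placeholder bitrate

    // Tunnel the video port (71) to the encoder; preview (70) and still (72)
    // stay available for other components.
    camera->SetupTunnelVideo( encoder );

    // Starting capture on port 70 or 71 starts both of them.
    camera->SetCapturing( true, 71 );

    // Still captures (port 72) must be requested explicitly each time.
    camera->SetCapturing( true, 72 );
    return 0;
}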

cedricve commented 7 years ago

Crazy, I'll try this out and reference your repo on kerberos.io once I have this working. Would you be interested in providing some guidance, or in integrating an example into your samples directory? Anyway, this is awesome!

dridri commented 7 years ago

Of course, I can add a typical example for what you want to achieve. Do not hesitate to ask me if you have any questions :)

cedricve commented 7 years ago

So this is my use case, and I think this might be applicable for many others.

  1. I currently use raspicam (an MMAL wrapper), from which I grab images decoded to cv::Mat to execute some image processing, e.g. motion detection. https://github.com/kerberos-io/machinery/blob/master/src/kerberos/capture/RaspiCamera.cpp#L73-L74

  2. Next to processing those images, Kerberos.io will fire events when motion is detected; one of the events is video recording. At the moment I use the built-in VideoWriter of OpenCV with x264 encoding, which is slow. This should be replaced by your library. https://github.com/kerberos-io/machinery/blob/master/src/kerberos/machinery/io/IoVideo.cpp#L239

The whole idea is that points 1 and 2 run in parallel; this means that motion detection is also executed while a video is being encoded. If no more motion is detected, the video encoding thread is ended.

So my ideal example would have a main thread capturing frames from the camera (port 70), plus the ability to run video encoding in a separate thread.

Let me know if this makes sense @dridri

dridri commented 7 years ago

I was just looking at your source code.

Since my library is a wrapper and raspicam is threaded by design, there is no drawback to using it in different threads. I think you probably guessed it already, but you will not be able to use both your MMAL implementation and mine at the same time.

You have two possibilities to achieve what you want:

  1. Start the camera with ports 70 and 71 enabled, start the two threads, and just tell the encoder to consume the data but not use it while recording is disabled (simply by calling getOutputData() and doing nothing with the data). You are forced to do this, otherwise the camera will stall after a few seconds (see the sketch below).
  2. Start the camera with only port 70 and start only the processing thread. At the moment you need recording, stop the camera, tunnel the video port (71) to the encoder, then restart the camera and start the encoder (the whole process takes less than 100 milliseconds). To stop recording, the same process applies: stop the camera, destroy the tunnel to the encoder, and restart the camera.

In my opinion, the 1st option is easier and less prone to bugs (stopping and starting the camera continuously may lead to memory leaks), and the thread will not consume much CPU since Encoder::getOutputData() is a blocking call.
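
A minimal sketch of option 1, assuming the encoder class name and the getOutputData() signature roughly match the samples (buffer handling simplified):

// Sketch of option 1: keep draining the encoder so the camera never stalls,
// but only use the data while recording. Class name and getOutputData()
// signature are assumptions.
#include <atomic>
#include <cstdint>

std::atomic<bool> recording( false );

void encoderThread( VideoEncode* encoder )
{
    uint8_t buffer[ 65536 ];
    while ( true ) {
        // Blocking call: the thread sleeps until the encoder produced data,
        // so it does not consume CPU while idle.
        int32_t len = encoder->getOutputData( buffer );
        if ( len > 0 && recording ) {
            // append 'buffer' (len bytes) to the .h264 file here
        }
        // otherwise the data is simply discarded and the pipeline keeps flowing
    }
}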

I'm going to write some example code, then. Just to be sure, the video stream is not displayed/sent anywhere if there is no motion detected?

cedricve commented 7 years ago

@dridri thanks for your quick response, makes sense! Well, I forgot one thing: we also send an MJPEG stream, which should also use the images from port 70.

Thus to be complete:

  1. Main thread that processes images.
  2. Additional thread that listens to clients, and sends them an encoded JPEG. https://github.com/kerberos-io/machinery/blob/master/src/kerberos/capture/Stream.cpp#L151
  3. Start a video thread that encodes video.

How I solve 1, 2 and 3 is by sharing a capture (camera) object which uses locking when grabbing an image.

I also prefer scenario 1, as it separates the video recording and the image processing, and will give us the best performance for encoding. I think we'll need a shared image object for the processing and the MJPEG streaming. What do you think @dridri?

dridri commented 7 years ago

There is an OMX component called "video_splitter" which allows using multiple encoders from the same source image, but for an unknown reason I can't get it to work. Using my implementation, though, you could take advantage of the hardware-accelerated MJPEG encoder, so you will not have to use locking anymore.

Here's roughly how it works:

The only drawback is that the hardware processing produces YUV420 images, but I just checked and it seems to be easy to load YUV420 into OpenCV.
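
For the OpenCV side, converting an I420 (planar YUV420) buffer is a single cvtColor call; only the data/width/height parameters below are placeholders:

// Standard OpenCV conversion of an I420 (planar YUV420) buffer to BGR.
#include <cstdint>
#include <opencv2/imgproc.hpp>

cv::Mat yuv420ToBgr( uint8_t* data, int width, int height )
{
    // I420 layout: a full-resolution Y plane followed by quarter-resolution
    // U and V planes, i.e. height * 3 / 2 rows of 'width' bytes.
    cv::Mat yuv( height + height / 2, width, CV_8UC1, data );
    cv::Mat bgr;
    cv::cvtColor( yuv, bgr, cv::COLOR_YUV2BGR_I420 );
    return bgr;
}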

cedricve commented 7 years ago

Sounds like my perfect situation; where have you been all this time @dridri, haha? I've been struggling with the raspicam library for 3 years now, it's a shame I didn't find your library sooner.

Will you be able to contribute an example this week or in the near future, just so we know we're spending our time well? Otherwise I'll deep dive into this; let me know what you think is best.

What do you mean by this?

The only drawback is that the hardware processing produces YUV420 images, but I just checked and it seems to be easy to load YUV420 into OpenCV.

For the recording, we can just write the raw h264 to a file, right? At the moment we don't need any intermediate image processing (e.g. an overlay).

Thanks @dridri

dridri commented 7 years ago

I mean that in this situation it is not only the h264 port that produces YUV420 images, but also the preview port.

I'm writing an example right now

dridri commented 7 years ago

Here you go! https://github.com/dridri/OpenMaxIL-cpp/blob/master/samples/camera_threaded.cpp (tested and working)

cedricve commented 7 years ago

Awesome, dude. I've sent you a personal message.

cedricve commented 7 years ago

@dridri looks great and works! I was wondering why the preview_thread is taking so much CPU?

dridri commented 7 years ago

I was wondering the same. It's surely caused by this memcpy: https://github.com/dridri/OpenMaxIL-cpp/blob/master/src/Component.cpp#L401. I can expose the "pBuffer" pointer to avoid this.

cedricve commented 7 years ago

Hmm, indeed, that memcpy will be the issue. Exposing the pointer might be a good idea, because I'll also convert the buffer to a cv::Mat, which means a double copy operation.
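
If the pointer gets exposed, one copy could disappear entirely, since a cv::Mat can be built as a header over an existing buffer without copying (a sketch; the Mat is only valid while the underlying buffer is):

// A cv::Mat constructed over external data does not copy it; it is just a
// header pointing at 'buffer', so it must not outlive the OMX buffer.
#include <cstdint>
#include <opencv2/core.hpp>

cv::Mat wrapYuvWithoutCopy( uint8_t* buffer, int width, int height )
{
    return cv::Mat( height + height / 2, width, CV_8UC1, buffer );
}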

cedricve commented 7 years ago

FYI: Started integration in Kerberos.io in a separate branch: https://github.com/kerberos-io/machinery/commits/openmax-il.

dridri commented 7 years ago

c0c2846 solves part of the problem; I can maybe reduce it even more.

cedricve commented 7 years ago

Looks good! Can we make a stable version after I've tested your library for some time? I just want to make sure that future changes will not affect other people compiling kerberos.io, as it will now point to your master branch. Just a question, not necessary until I release a new version of kerberos.io.

cedricve commented 7 years ago

@dridri nice performance improvement, from 50% to 15% on a Pi2. Great work; what other sections could you improve?

cedricve commented 7 years ago

Made some progress last night by hooking your library into the MJPEG streamer: https://github.com/kerberos-io/machinery/tree/openmax-il. The code I've produced probably isn't as performant as it will have to be, but let's improve it later on.

What I have now:

  1. A lock thread that copies the data from your mjpeg_data buffer to a class property (see the sketch after this list). https://github.com/kerberos-io/machinery/blob/openmax-il/src/kerberos/capture/RaspiCamera.cpp#L81-L87 and https://github.com/kerberos-io/machinery/blob/openmax-il/src/kerberos/capture/RaspiCamera.cpp#L142-L159

  2. A stream thread that will read the mjpeg_data buffer. https://github.com/kerberos-io/machinery/blob/openmax-il/src/kerberos/Kerberos.cpp#L281-L283

  3. Write the data buffer to the client. https://github.com/kerberos-io/machinery/blob/openmax-il/src/kerberos/capture/Stream.cpp#L151-L183
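
Roughly, the locking part looks like this (a simplified sketch with hypothetical names, not the actual Kerberos.io code):

// Simplified version of the locking described above; names are hypothetical.
#include <cstddef>
#include <cstdint>
#include <mutex>
#include <vector>

std::mutex mjpeg_lock;
std::vector<uint8_t> mjpeg_data;   // the shared "class property"

// Capture side: copy the latest encoder output while holding the lock.
void updateFrame( const uint8_t* data, std::size_t length )
{
    std::lock_guard<std::mutex> guard( mjpeg_lock );
    mjpeg_data.assign( data, data + length );
}

// Stream side: take a private copy so the socket write happens outside the lock.
std::vector<uint8_t> latestFrame()
{
    std::lock_guard<std::mutex> guard( mjpeg_lock );
    return mjpeg_data;
}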

The code seems to work, however when opening it with a browser a black image is shown. If I print out the lengths of the buffers I see them differ, so the data flow looks OK. Do you have any ideas, or isn't MJPEG encoding the proper one for this?

[screenshot: screen shot 2017-05-30 at 11 21 24]

dridri commented 7 years ago

In your previous implementation, your stream seemed to be a continuous flow of JPEG images, which is different from MJPEG. You could try setting the Content-Type to "video/x-motion-jpeg". If it still doesn't work, does software like VLC open the stream correctly?
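
For reference, what browsers usually expect for MJPEG is a multipart/x-mixed-replace response with one part per JPEG; a generic sketch (not the Kerberos.io code):

// Generic MJPEG-over-HTTP framing: one multipart response, one part per JPEG.
#include <cstddef>
#include <string>

const std::string kStreamHeader =
    "HTTP/1.1 200 OK\r\n"
    "Content-Type: multipart/x-mixed-replace; boundary=frame\r\n\r\n";

// Header written before each JPEG; the JPEG bytes and a trailing "\r\n" follow.
std::string framePartHeader( std::size_t jpegLength )
{
    return "--frame\r\n"
           "Content-Type: image/jpeg\r\n"
           "Content-Length: " + std::to_string( jpegLength ) + "\r\n\r\n";
}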

Edit: I think you should put back a usleep in your streamContinuously() loop.

Edit 2: oh, I just found you probably mis-typed retrieveRAW; you do not pass data by reference or double pointer, so the code data = mjpeg_data_buffer; just sets data locally.
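
To illustrate the second edit (hypothetical signatures, not the actual retrieveRAW prototype):

// Hypothetical illustration of the pass-by-reference issue described above.
#include <cstdint>

static uint8_t* mjpeg_data_buffer = nullptr;   // placeholder shared buffer

// 'data' is a copy of the caller's pointer, so this assignment is lost.
void retrieveRAW_broken( uint8_t* data )
{
    data = mjpeg_data_buffer;
}

// Passing the pointer by reference makes the assignment visible to the caller.
void retrieveRAW_fixed( uint8_t*& data )
{
    data = mjpeg_data_buffer;
}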

cedricve commented 7 years ago

@dridri, I updated the flow before I saw your message, and I managed to get it working with a sleep indeed; it was trying to send null data. I also pass pointers around now, like you suggested.

[screenshot: screen shot 2017-05-30 at 12 21 31]

dridri commented 7 years ago

Yay, seems perfect!

cedricve commented 7 years ago

This is amazingly fast, what a difference 👍 👍 I'll work on the motion detection now. What I've been thinking about is to use the MJPEG image for the motion detection as well. I think we can simply use cv::imdecode instead of copying the YUV data.

dridri commented 7 years ago

You should probably compare CPU usage between decoding MJPEG and converting YUV.

cedricve commented 7 years ago

Good point, let's give it a try. I'll keep you posted; can't wait to share this with our community. HD video surveillance on a Raspberry Pi Zero, oh man.

dridri commented 7 years ago

What was your previous resolution and framerate?

cedricve commented 7 years ago

A short benchmark without too many details:

We have multiple Io devices (https://github.com/kerberos-io/machinery/tree/openmax-il/src/kerberos/machinery/io); one of them is the IoVideo class (used to record mp4s). There is a clock built in to make sure the video is written at the right FPS. This is needed because with the OpenCV VideoWriter you have to define the FPS in advance, while e.g. with IP cameras you can have frame drops, so you need to keep a frame in memory to fall back on.
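
Roughly, the clock works like this (a sketch with hypothetical names, not the actual IoVideo code):

// Sketch of the fixed-FPS "clock": write one frame per tick and fall back to
// the previous frame when the source dropped one. Names are hypothetical.
#include <atomic>
#include <chrono>
#include <functional>
#include <thread>
#include <opencv2/videoio.hpp>

void recordAtFixedFps( cv::VideoWriter& writer, double fps,
                       const std::function<cv::Mat()>& grabLatest,
                       const std::atomic<bool>& recording )
{
    const auto interval = std::chrono::microseconds( (long long)( 1e6 / fps ) );
    auto next = std::chrono::steady_clock::now();
    cv::Mat lastFrame;
    while ( recording ) {
        cv::Mat frame = grabLatest();      // may be empty when a frame dropped
        if ( !frame.empty() ) {
            lastFrame = frame;
        }
        if ( !lastFrame.empty() ) {
            writer.write( lastFrame );     // reuse the previous frame on drops
        }
        next += interval;
        std::this_thread::sleep_until( next );
    }
}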

cedricve commented 7 years ago

Also, I'm coding the POC on a Pi2, and previously streaming a 640x480 image at 3 FPS took around 30-50% of the CPU (this was caused by cv::imencode, which encodes the image on the CPU). With your code, it streams 1280x720 at 17 FPS using 15% of the CPU. Crazy 👍

dridri commented 7 years ago

Only 17fps? Did you change it manually or is there a lag somewhere?

cedricve commented 7 years ago

@dridri I need to confirm, but it looks like it sometimes slows down, not sure why. Maybe due to network connectivity. https://github.com/kerberos-io/machinery/blob/openmax-il/src/kerberos/capture/Stream.cpp#L151-L199

What you see below is the length of each packet that is written to the socket. So from my understanding, the new image isn't written fast enough.

 30/05/2017 10:53:54.095 INFO  [trivial] streaming: 34723
 30/05/2017 10:53:54.096 INFO  [trivial] streaming: 34723
 30/05/2017 10:53:54.096 INFO  [trivial] streaming: 34723
 30/05/2017 10:53:54.096 INFO  [trivial] streaming: 34723
 30/05/2017 10:53:54.097 INFO  [trivial] streaming: 34723
 30/05/2017 10:53:54.097 INFO  [trivial] streaming: 34723
 30/05/2017 10:53:54.097 INFO  [trivial] streaming: 34723
 30/05/2017 10:53:54.097 INFO  [trivial] streaming: 34723
 30/05/2017 10:53:54.098 INFO  [trivial] streaming: 34723
 30/05/2017 10:53:54.098 INFO  [trivial] streaming: 34723
 30/05/2017 10:53:54.098 INFO  [trivial] streaming: 34723
 30/05/2017 10:53:54.098 INFO  [trivial] streaming: 34723
 30/05/2017 10:53:54.101 INFO  [trivial] streaming: 34662
 30/05/2017 10:53:54.102 INFO  [trivial] streaming: 34662
 30/05/2017 10:53:54.102 INFO  [trivial] streaming: 34662
 30/05/2017 10:53:54.103 INFO  [trivial] streaming: 34662
 30/05/2017 10:53:54.103 INFO  [trivial] streaming: 34662
 30/05/2017 10:53:54.103 INFO  [trivial] streaming: 34662
 30/05/2017 10:53:54.103 INFO  [trivial] streaming: 34662

When I resize the viewport of the stream (e.g. with an IP camera viewer), it suddenly runs faster.

 30/05/2017 12:09:40.987 INFO  [trivial] streaming: 34907
 30/05/2017 12:09:40.987 INFO  [trivial] streaming: 34907
 30/05/2017 12:09:41.118 INFO  [trivial] streaming: 34983
 30/05/2017 12:09:41.119 INFO  [trivial] streaming: 34983
 30/05/2017 12:09:41.341 INFO  [trivial] streaming: 35032
 30/05/2017 12:09:41.342 INFO  [trivial] streaming: 35032
 30/05/2017 12:09:41.343 INFO  [trivial] streaming: 35032
 30/05/2017 12:09:42.012 INFO  [trivial] streaming: 34931
 30/05/2017 12:09:42.013 INFO  [trivial] streaming: 34931
 30/05/2017 12:09:42.263 INFO  [trivial] streaming: 35113
 30/05/2017 12:09:42.264 INFO  [trivial] streaming: 35113
 30/05/2017 12:09:42.531 INFO  [trivial] streaming: 34905
 30/05/2017 12:09:42.531 INFO  [trivial] streaming: 34905
 30/05/2017 12:09:42.532 INFO  [trivial] streaming: 34905
 30/05/2017 12:09:42.648 INFO  [trivial] streaming: 34947
 30/05/2017 12:09:42.649 INFO  [trivial] streaming: 34947
 30/05/2017 12:09:42.668 INFO  [trivial] streaming: 34866
 30/05/2017 12:09:42.668 INFO  [trivial] streaming: 34866
 30/05/2017 12:09:43.216 INFO  [trivial] streaming: 35033
 30/05/2017 12:09:43.218 INFO  [trivial] streaming: 35033
 30/05/2017 12:09:43.321 INFO  [trivial] streaming: 34984
 30/05/2017 12:09:43.322 INFO  [trivial] streaming: 34984
 30/05/2017 12:09:43.324 INFO  [trivial] streaming: 34992
 30/05/2017 12:09:43.426 INFO  [trivial] streaming: 34946
 30/05/2017 12:09:43.427 INFO  [trivial] streaming: 3494

cedricve commented 7 years ago

Edit: What I find strange is that it writes the same image multiple times. I was first thinking this was caused by multiple clients, but that's not the case. Might this be an issue with not getting a free lock from the preview thread?

30/05/2017 12:24:28.915 INFO  [trivial] streaming (32352) to 0
30/05/2017 12:24:29.**403** INFO  [trivial] streaming (**32488**) to 0
30/05/2017 12:24:29.**405** INFO  [trivial] streaming (**32488**) to 0
30/05/2017 12:24:29.**407** INFO  [trivial] streaming (**32488**) to 0
30/05/2017 12:24:29.**408** INFO  [trivial] streaming (**32488**) to 0
30/05/2017 12:24:29.**409** INFO  [trivial] streaming (**32488**) to 0
30/05/2017 12:24:29.**410** INFO  [trivial] streaming (**32488**) to 0
30/05/2017 12:24:29.**411** INFO  [trivial] streaming (**32488**) to 0

dridri commented 7 years ago

This is probably caused by your MJPEG streaming thread running too fast compared to the preview_thread; try increasing the usleep.
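
Something along these lines, i.e. pacing the loop at roughly the camera framerate instead of spinning (values are examples):

// Pace the streaming loop to roughly the camera framerate instead of spinning.
#include <unistd.h>

void streamLoop()
{
    const int fps = 30;
    const useconds_t frameIntervalUs = 1000000 / fps;   // ~33 ms per iteration
    while ( true ) {
        // send the latest MJPEG frame to the connected clients here
        usleep( frameIntervalUs );
    }
}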

cedricve commented 7 years ago

You're right, introducing a delay (as a function of an FPS parameter) works. You can see the grabbing and streaming alternate, which is perfect! Sometimes it reaches 24 fps. However, due to network drops there are some gaps (you can see it when it's grabbing more than streaming).

30/05/2017 12:37:09.970 INFO  [trivial] streaming (31911) to 0
30/05/2017 12:37:09.989 INFO  [trivial] grab new image
30/05/2017 12:37:10.005 INFO  [trivial] streaming (31976) to 0
30/05/2017 12:37:10.024 INFO  [trivial] grab new image
30/05/2017 12:37:10.039 INFO  [trivial] streaming (31996) to 0
30/05/2017 12:37:10.057 INFO  [trivial] grab new image
30/05/2017 12:37:10.073 INFO  [trivial] streaming (32041) to 0
30/05/2017 12:37:10.089 INFO  [trivial] grab new image
30/05/2017 12:37:10.107 INFO  [trivial] streaming (32016) to 0
30/05/2017 12:37:10.122 INFO  [trivial] grab new image
30/05/2017 12:37:10.156 INFO  [trivial] grab new image
30/05/2017 12:37:10.190 INFO  [trivial] grab new image
30/05/2017 12:37:10.211 INFO  [trivial] streaming (32014) to 0
30/05/2017 12:37:10.226 INFO  [trivial] grab new image
30/05/2017 12:37:10.246 INFO  [trivial] streaming (31977) to 0
30/05/2017 12:37:10.253 INFO  [trivial] grab new image
30/05/2017 12:37:10.281 INFO  [trivial] streaming (31901) to 0
30/05/2017 12:37:10.287 INFO  [trivial] grab new image
30/05/2017 12:37:10.315 INFO  [trivial] streaming (31883) to 0
30/05/2017 12:37:10.322 INFO  [trivial] grab new image
30/05/2017 12:37:10.350 INFO  [trivial] streaming (32021) to 0
30/05/2017 12:37:10.357 INFO  [trivial] grab new image
30/05/2017 12:37:10.384 INFO  [trivial] streaming (31905) to 0
30/05/2017 12:37:10.389 INFO  [trivial] grab new image
30/05/2017 12:37:10.419 INFO  [trivial] streaming (32019) to 0
30/05/2017 12:37:10.424 INFO  [trivial] grab new image
30/05/2017 12:37:10.453 INFO  [trivial] streaming (31937) to 0
30/05/2017 12:37:10.457 INFO  [trivial] grab new image
30/05/2017 12:37:10.488 INFO  [trivial] streaming (31988) to 0
30/05/2017 12:37:10.491 INFO  [trivial] grab new image
30/05/2017 12:37:10.522 INFO  [trivial] grab new image
30/05/2017 12:37:10.556 INFO  [trivial] grab new image
30/05/2017 12:37:10.590 INFO  [trivial] grab new image
30/05/2017 12:37:10.597 INFO  [trivial] streaming (31995) to 0
30/05/2017 12:37:10.624 INFO  [trivial] grab new image
30/05/2017 12:37:10.631 INFO  [trivial] streaming (31991) to 0
30/05/2017 12:37:10.655 INFO  [trivial] grab new image
30/05/2017 12:37:10.665 INFO  [trivial] streaming (31929) to 0
30/05/2017 12:37:10.689 INFO  [trivial] grab new image
30/05/2017 12:37:10.699 INFO  [trivial] streaming (32012) to 0
30/05/2017 12:37:10.722 INFO  [trivial] grab new image
30/05/2017 12:37:10.734 INFO  [trivial] streaming (31982) to 0
30/05/2017 12:37:10.756 INFO  [trivial] grab new image
30/05/2017 12:37:10.768 INFO  [trivial] streaming (31865) to 0
30/05/2017 12:37:10.789 INFO  [trivial] grab new image
30/05/2017 12:37:10.824 INFO  [trivial] grab new image
30/05/2017 12:37:10.857 INFO  [trivial] grab new image
30/05/2017 12:37:10.891 INFO  [trivial] grab new image
30/05/2017 12:37:10.906 INFO  [trivial] streaming (31957) to 0
30/05/2017 12:37:10.924 INFO  [trivial] grab new image
30/05/2017 12:37:10.940 INFO  [trivial] streaming (32049) to 0
30/05/2017 12:37:10.958 INFO  [trivial] grab new image
30/05/2017 12:37:10.974 INFO  [trivial] streaming (31896) to 0
30/05/2017 12:37:10.991 INFO  [trivial] grab new image

cedricve commented 7 years ago

Did some more testing @dridri; overall it looks good, but sometimes the socket takes too much time to send the frame over HTTP to reach full speed. However, when using an incognito session in the browser it runs at full speed (30 FPS it is). Strange..

Another thing I've noticed is the following: when I was streaming and I switched off the lights (environment) and put them on again (dark environment -> light environment), I started receiving black images. When I try to open a new stream from another client, I get a black screen as well, although I can see from the console that it keeps getting frames.

What I've tried is to stop the binary and restart it, but then it looks like the initialization is stalled. The funny thing is that I'm able to reproduce this behaviour just by switching the lights on/off.

EDIT: You can simply simulate it by putting something on the lens and removing it (fast). Abrupt changes in intensity make the camera crash. Apparently the camera doesn't like complete darkness.. Funny bug.

30/05/2017 21:05:52.473 INFO  [trivial] Logging is set to verbose
30/05/2017 21:05:52.474 INFO  [trivial] Starting cloud service: S3
30/05/2017 21:05:52.475 INFO  [trivial] Starting capture device: RaspiCamera
CopyPort( OMX.broadcom.camera->70, OMX.broadcom.video_encode->200
Port 200: in 1/1 1382400 16 disabled,not pop.,not cont. 1280x720 1280x720 @30fps 20
===> types : 8[200] | 8[71]
Port 71: out 1/1 1382400 16 disabled,not pop.,not cont. 1280x720 1280x720 @30fps 20
Port 200: in 1/1 15360 16 disabled,not pop.,not cont. 160x64 160x64 @30fps 20
[OMX.broadcom.camera]Wating state to be 2

dridri commented 7 years ago

Wow, that's weird, I've never had this problem in months. What is strange is that the port 200 (encoder input) values have totally changed (it looks like it has been reset to its defaults, in 160x64 mode). Try reducing the bitrate a little bit. I'm going to give it a try to see if I can reproduce the problem.

cedricve commented 7 years ago

@dridri well, I think this is the initial state, but it hangs at [OMX.broadcom.camera]Wating state to be 2. From what I see, you reset it after this statement.

cedricve commented 7 years ago

By lowering the bitrate to 4k it's indeed harder to reproduce; so far I've only had it once, while with 8k it was pretty easy.

dridri commented 7 years ago

Try to run it using gdb, there is maybe a hidden segfault/bus error.

cedricve commented 7 years ago

Hmm, I'm not that familiar with gdb for finding bus errors. I'll give it a try.

dridri commented 7 years ago

Just start it using the command gdb ./your_program, then in the GDB console type run [args...]; gdb will then tell you if it detects anything.

cedricve commented 7 years ago

Hmm, well, it does throw an error in libcrypto, but that doesn't make sense..

Starting program: /home/pi/machinery/bin/kerberosio 
[Thread debugging using libthread_db enabled]
Using host libthread_db library "/lib/arm-linux-gnueabihf/libthread_db.so.1".
Cannot access memory at address 0x0
Program received signal SIGILL, Illegal instruction.
0x75a33de8 in ?? () from /usr/lib/arm-linux-gnueabihf/libcrypto.so.1.0.0
(gdb) 

dridri commented 7 years ago

At this point you can type bt to see a backtrace. (You may have to add "-ggdb3" to the CXX flags to get more info.)

cedricve commented 7 years ago

Thanks, I had to enter the following command before entering run:

handle SIGILL nostop

Well, with GDB it looks like the camera is still crashing, but I don't get any errors.

dridri commented 7 years ago

Okay, so it is not a memcpy leak (I had some before). I will take a look at it.

cedricve commented 7 years ago

Thanks. What I still find strange is that after it crashes the first time and I try to load the binary a second time, it's locked on "Waiting state to be 2", and I need to reboot the Pi to make it work again.

   30/05/2017 22:44:07.917 INFO  [trivial] Logging is set to verbose
   30/05/2017 22:44:07.918 INFO  [trivial] Starting cloud service: S3
   30/05/2017 22:44:07.919 INFO  [trivial] Starting capture device: RaspiCamera
   CopyPort( OMX.broadcom.camera->70, OMX.broadcom.video_encode->200
   Port 200: in 1/1 1382400 16 disabled,not pop.,not cont. 1280x720 1280x720 @30fps 20
   ===> types : 8[200] | 8[71]
   Port 71: out 1/1 1382400 16 disabled,not pop.,not cont. 1280x720 1280x720 @30fps 20
   Port 200: in 1/1 15360 16 disabled,not pop.,not cont. 160x64 160x64 @30fps 20
   [OMX.broadcom.camera]Wating state to be 2
   ^CComponent::onexit()
   Stopping Camera OMX.broadcom.camera
   Camera stopped
   Stopping component OMX.broadcom.camera
   Stopping component OMX.broadcom.video_encode
   Stopping component OMX.broadcom.video_encode
   Deleting component OMX.broadcom.video_encode
   Component delete ok
   Deleting component OMX.broadcom.video_encode
   Component delete ok
   Deleting component OMX.broadcom.camera
   Deleting Camera...
   Deleting Camera ok
   [OMX.broadcom.camera]Wating state to be 2

cedricve commented 7 years ago

I'll dig into it further today; let me know if you want me to test something @dridri. Thanks!

cedricve commented 7 years ago

Hey hey, sorry I'm spamming this issue. I just want to keep you posted about my tests, so we can hopefully resolve this issue.

  1. I tested my Kerberos binary with several Pi cameras; all show the same result as described above.
  2. Because I want to be sure it's not my code breaking your library, I tested with your examples. A. camera_encode doesn't have any issues: when I change the intensity quickly the camera doesn't fail, and the generated recording is OK. B. With the camera_threaded example I can trigger the same error, so there might be something wrong with this setup. The good news is that I can see the behaviour happening when outputting the MJPEG data to std::cout (please see below).

So what I receive is the following while not doing anything (which is ok):

????
????
????
????
????

However when I do the dark/light switch I get some strange output:

????
????
????
????
????
h?)m_
h?)m_
)I??O???'=?;
)I??O???'=?;
yf??;SI;q?S9nn?u(?x??aJ??1ǽ(?ª?Xi ??Q?F
yf??;SI;q?S9nn?u(?x??aJ??1ǽ(?ª?Xi ??Q?F
>Ԍ?b??*&b??H?"?s??0?ǚi=?ø?
???TU?28?>??W*N(q?2cN
S????LdEw|?c?/<񚍋n-?=(Fy?
L??'$?H`?ϥ5?q$??Sq???y?`?6[h????
??V0#?zvppi??
??V0#?zvppi??
????
????
????
????
????
????

The first time it goes back to the normal output; the second time, however, it stalls and again sends wrong output. At this point the camera is locked, and I need to reboot the Pi to recover.

????
????
????
????
????
????
?W8׭/?b1?9??2?~s@?ƀҞ????ML???Rƻ??*?F????щ?zT/?8٩??????,?vqǭ<8?8?ڢFN;R??j???@$u?MYO??T,??R?`?)0'Y??3??OYI?O???L?ڞX*?9?08?8>??sӊi?c?Olђ[?[???z???OS?S8<??N;?%y<qN?#

EDIT: if I modify the camera_threaded example to not use the MJPEG encoding, I don't get the error either.
EDIT 2: when lowering the resolution to 640x480 at 15 FPS with a 4k bitrate, I can't make the camera crash anymore.. this is so weird.