azsde opened this issue 7 months ago
Hi, thanks for the report. Does a simple "configure - start-recording - stop-recording" script (probably just half a dozen lines or so) show the same problem? If so, could you maybe post that? Please also confirm what Pi you have, what camera you are using and that the problem occurs with the latest fully-updated OS release with no further modifications. Thanks!
(Just checking that you realised that the 20.0 in that 2nd little code snippet needs to be the frame time in milliseconds? So for 30fps, you need to use 33.3.)
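For reference, a minimal sketch of how that constant relates to the frame rate (this is just the interval arithmetic, not the full checker script):

```python
# Expected gap between consecutive timestamps, in milliseconds, for a given frame rate.
fps = 30
expected_interval_ms = 1000.0 / fps  # 33.3 ms at 30 fps, 20.0 ms at 50 fps
print(f"Divide timestamp differences by {expected_interval_ms:.1f} ms when checking at {fps} fps")
```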
Hi @davidplowman, thank you for your input. I updated the timestamp checker script to use 33.3 instead of 20.0 and indeed there are no lost frames at 30 FPS, sorry I missed that.
I am using a Raspberry Pi 4 with 8 GB of RAM, along with a Raspberry Pi Camera Module 3, which should support 1080p @ 50 FPS according to the documentation.
When doing the first recordings, there seem to be only a few lost frames, but as I keep doing more recordings, there are more and more lost frames until the footage is really choppy / unusable.
And I am trying to figure out why.
I'm going to try a simple script that will perform several successive recordings and confirm that the frame loss gets worse and worse.
In the meantime if you have any leads on how to pinpoint the issue, I'd be interested.
1080p50 encode is achievable with rpicam-vid; with Python there are some extra overheads which mean you may not achieve that. You are also doing an extra MJPEG encode, which will cost you as well. In general there is no guarantee that you can encode at the rate that a camera can deliver, and with Python you certainly have to be more careful.
The first thing to do is to change your configuration so that the "main" stream is in "YUV420" format. This will save an expensive software conversion and should perform better. Also experiment without the MJPEG encode to see what effect that has. If you can post a simplified script as I described, then I would be able to try it.
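For example, a minimal sketch of that configuration change (the resolution and frame rate simply mirror the values used elsewhere in this thread):

```python
from picamera2 import Picamera2

picam2 = Picamera2()
# Request YUV420 for the main stream so the H.264 encoder can consume it directly,
# without an expensive software format conversion.
config = picam2.create_video_configuration(
    main={"size": (1920, 1080), "format": "YUV420"},
    controls={"FrameRate": 50})
picam2.configure(config)
```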
Hi @davidplowman
Without doing the "extra MJPEG encode", and with a "simple" script I achieve a decent result:
import time
from datetime import datetime

from libcamera import controls, Transform
from picamera2 import Picamera2
from picamera2.encoders import H264Encoder, Quality

picam2 = Picamera2()
config = picam2.create_video_configuration(main={"size": (1920, 1080)},
                                           controls={
                                               'FrameRate': 50,
                                               'NoiseReductionMode': 1,
                                               'AfMode': controls.AfModeEnum.Continuous},
                                           transform=Transform(hflip=0, vflip=0))
picam2.configure(config)

h264encoder = H264Encoder(framerate=50, enable_sps_framerate=True)

for i in range(0, 5):
    base_filename = f"recording-{i}"
    h264_file = base_filename + ".h264"
    timestamps_file = "timestamps-" + base_filename + ".txt"

    current_time = datetime.now()
    formatted_datetime = current_time.strftime("%Y-%m-%d %H:%M:%S")
    print(f"{formatted_datetime} - Start recording {h264_file} for 10 minutes")
    picam2.start_recording(h264encoder, h264_file, quality=Quality.HIGH, pts=timestamps_file)
    time.sleep(600)
    picam2.stop_recording()

    current_time = datetime.now()
    formatted_datetime = current_time.strftime("%Y-%m-%d %H:%M:%S")
    print(f"{formatted_datetime} - Stopped recording {h264_file} after 10 minutes")

    time.sleep(5)

    # Instantiate a new H264Encoder in order to reset the timestamps
    h264encoder = H264Encoder(framerate=50, enable_sps_framerate=True)
Analyzing the loss of frames using:
import os

# List all .txt files in the current directory
txt_files = [file for file in os.listdir() if file.endswith('.txt')]

# Iterate over each .txt file
for file in txt_files:
    with open(file, 'r') as f:
        next(f)  # skip the header line
        timestamps = [float(line.rstrip()) for line in f.readlines()]
        # 20.0 is the expected frame interval in milliseconds at 50 fps
        diffs = [round((nxt - now) / 20.0) - 1 for now, nxt in zip(timestamps, timestamps[1:])]
        frames_received = len(timestamps)
        lost_frames = int(sum(diffs))
        lost_frames_percentage = "{:.2f}".format((lost_frames * 100.00) / (frames_received + lost_frames))
        print(f"{file} - Received frames: {frames_received} - lost: {lost_frames} ({lost_frames_percentage}%)")
This gives me:
timestamps-recording-0.txt - Received frames: 29694 - lost: 315 (1.05%)
timestamps-recording-1.txt - Received frames: 29640 - lost: 369 (1.23%)
timestamps-recording-2.txt - Received frames: 29754 - lost: 255 (0.85%)
timestamps-recording-3.txt - Received frames: 29706 - lost: 303 (1.01%)
timestamps-recording-4.txt - Received frames: 29797 - lost: 213 (0.71%)
So around 1% of lost frames is quite acceptable at 50 FPS.
The issue is that when I add the MJPEG encoder, it starts out "ok-ish" but then gets worse over time:
import io
import threading
from datetime import datetime
import time

from libcamera import controls, Transform
from picamera2 import Picamera2
from picamera2.encoders import MJPEGEncoder, H264Encoder, Quality
from picamera2.outputs import FileOutput


class StreamingOutput(io.BufferedIOBase):
    def __init__(self):
        self.frame = None
        self.condition = threading.Condition()

    def write(self, buf):
        with self.condition:
            self.frame = buf
            self.condition.notify_all()


if __name__ == '__main__':
    # Create camera
    picam2 = Picamera2()

    # Create config
    config = picam2.create_video_configuration(main={"size": (1920, 1080)},
                                               lores={"size": (740, 480)},
                                               controls={
                                                   'FrameRate': 50,
                                                   'NoiseReductionMode': 1,
                                                   'AfMode': controls.AfModeEnum.Continuous},
                                               transform=Transform(hflip=0, vflip=0))

    # Configure camera
    picam2.configure(config)

    # Instantiate MJPEG encoder & streaming output (required to stream the camera in a web browser, for instance)
    mjpegEncoder = MJPEGEncoder()
    streaming_output = StreamingOutput()
    mjpegEncoder.output = [FileOutput(streaming_output)]

    # Start the encoder
    picam2.start_encoder(mjpegEncoder, name="lores")

    # Instantiate H264 encoder to store recordings to files
    h264encoder = H264Encoder(framerate=50, enable_sps_framerate=True)

    # Start the camera
    picam2.start()

    for i in range(0, 5):
        # Define filenames
        base_filename = f"recording-{i}"
        h264_file = base_filename + ".h264"
        timestamps_file = "timestamps-" + base_filename + ".txt"

        # Get current time
        current_time = datetime.now()
        formatted_datetime = current_time.strftime("%Y-%m-%d %H:%M:%S")

        # Start recording for 10 minutes
        print(f"{formatted_datetime} - Start recording {h264_file} for 10 minutes")
        picam2.start_recording(h264encoder, h264_file, quality=Quality.HIGH, pts=timestamps_file)
        time.sleep(600)

        # Get current time
        current_time = datetime.now()
        formatted_datetime = current_time.strftime("%Y-%m-%d %H:%M:%S")

        # Stop recording (I do not use stop_recording as it would stop all encoders, thus interrupting the web stream)
        print(f"{formatted_datetime} - Stopped recording {h264_file} after 10 minutes")
        picam2.stop_encoder(h264encoder)

        # Wait for things to settle down (just in case?)
        time.sleep(5)

        # Instantiate a new H264Encoder in order to reset the timestamps
        h264encoder = H264Encoder(framerate=50, enable_sps_framerate=True)
timestamps-recording-0.txt - Received frames: 25438 - lost: 4580 (15.26%)
timestamps-recording-1.txt - Received frames: 17336 - lost: 12683 (42.25%)
timestamps-recording-2.txt - Received frames: 7978 - lost: 22037 (73.42%)
timestamps-recording-3.txt - Received frames: 7760 - lost: 22259 (74.15%)
timestamps-recording-4.txt - Received frames: 7760 - lost: 22256 (74.15%)
What is puzzling me is that the initial loss is not that bad, but then performance gets really bad, and this issue doesn't happen at 30 FPS.
Do you think the Pi might be thermal throttling, resulting in quite bad performance? Or maybe the HW encoder cannot keep up?
Also, when trying to turn the .h264 file and the timestamps .txt file into a .mkv file:
mkvmerge -o output.mkv --timecodes 0:timestamps-recording-4.txt recording-4.h264
I get a lot of lines like this:
[null @ 0x5559ef8020] Application provided invalid, non monotonically increasing dts to muxer in stream 0: 6784 >= 6784
So I'm assuming this is because of "invalid" timestamps?
1080p50 is considerably beyond what the Pi 4 is designed for, though people generally report that 1080p50 works "with luck and a following wind". This is less true of Python than our C++ applications, but generally people get "close". In this case you have lots of extra work going through the h/w conversion and encoder blocks on the GPU, so it's possible that you're simply asking too much.
Having said that, here are a few more things to try...
Try putting force_turbo=1 into your /boot/firmware/config.txt file. It's possible that the Arm isn't realising how busy the GPU is, so let's force all the clocks to maximum.
I wonder if the filesystem is having an effect? Maybe try writing the video streams to /dev/null. You can still check the timestamp files. Maybe consider writing the timestamp files to /dev/shm (memory file system). That will rule out any SD card or SSD effects completely.
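For instance, a minimal sketch of that variant, assuming the same encoder settings as in your script (the timestamp filename under /dev/shm is just an example):

```python
import time

from picamera2 import Picamera2
from picamera2.encoders import H264Encoder, Quality

picam2 = Picamera2()
picam2.configure(picam2.create_video_configuration(
    main={"size": (1920, 1080), "format": "YUV420"},
    controls={"FrameRate": 50}))

encoder = H264Encoder(framerate=50, enable_sps_framerate=True)
# Discard the encoded video and keep only the timestamps, written to a
# RAM-backed filesystem, to take the SD card / SSD out of the equation.
picam2.start_recording(encoder, "/dev/null", quality=Quality.HIGH,
                       pts="/dev/shm/timestamps.txt")
time.sleep(600)
picam2.stop_recording()
```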
You can check for temperature and throttling with things like: vcgencmd get_throttled, vcgencmd measure_temp, vcgencmd measure_clock arm, vcgencmd measure_clock core. Obviously you can try fans and heatsinks, but it's probably worth measuring the numbers first.
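If it helps, here is a rough sketch of how those readings could be polled from Python while a recording is running (the ten-minute duration and one-minute interval are arbitrary):

```python
import subprocess
import time

def vcgencmd(*args):
    # Thin wrapper around the vcgencmd utility; returns its output as a string.
    return subprocess.check_output(["vcgencmd", *args], text=True).strip()

# Poll temperature, throttling state and clocks once a minute for ten minutes.
for _ in range(10):
    print(vcgencmd("measure_temp"),
          vcgencmd("get_throttled"),
          vcgencmd("measure_clock", "arm"),
          vcgencmd("measure_clock", "core"))
    time.sleep(60)
```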
As regards the timestamps, are you seeing lots of consecutive values in those timestamp files that are identical? Strictly speaking that shouldn't happen. But if the system is really struggling and different bits of processing are getting clumped together, then I suppose it might be possible.
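A quick way to check for that, assuming the timestamp files have the layout your analysis script already expects (one value per line after a header line):

```python
# Count non-increasing (e.g. repeated) timestamps in one of the pts files.
with open("timestamps-recording-4.txt") as f:
    next(f)  # skip the header line
    timestamps = [float(line) for line in f]

bad = sum(1 for a, b in zip(timestamps, timestamps[1:]) if b <= a)
print(f"{bad} non-increasing timestamps out of {len(timestamps)}")
```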
There is an outside chance that using the JpegEncoder might work better than the MJPEGEncoder. The former is a software encoder on the Arm, the latter is a GPU h/w encoder. But if it's the GPU that can't cope, then moving the JPEG encode onto the Arm might actually help. (Edit - that won't work. On a Pi 4 the lores stream has to be YUV420, and Python generally doesn't like YUV420, so the JpegEncoder would require RGB.)
I see, thank you for your answer.
I have made some changes:
Before adding force_turbo=1, I have these values:
throttled=0x0
temp=46.7'C
frequency(48)=1800457088
frequency(1)=319992192
Adding force_turbo=1 does seem to improve performance, as I don't lose as many frames as before:
timestamps-recording-0.txt - Received frames: 20012 - lost: 10006 (33.33%)
timestamps-recording-1.txt - Received frames: 18520 - lost: 11497 (38.30%)
timestamps-recording-2.txt - Received frames: 18597 - lost: 11421 (38.05%)
timestamps-recording-3.txt - Received frames: 18609 - lost: 11409 (38.01%)
timestamps-recording-4.txt - Received frames: 18626 - lost: 11392 (37.95%)
timestamps-recording-5.txt - Received frames: 18580 - lost: 11437 (38.10%)
The temperature and frequency are also higher:
throttled=0x0
temp=53.5'C
frequency(48)=1800404352
frequency(1)=499987808
Having force_turbo=1 and redirecting the video recordings to /dev/null and timestamps to /dev/shm gives me:
timestamps-recording-0.txt - Received frames: 20012 - lost: 10005 (33.33%)
timestamps-recording-1.txt - Received frames: 20011 - lost: 10006 (33.33%)
timestamps-recording-2.txt - Received frames: 20011 - lost: 10005 (33.33%)
timestamps-recording-3.txt - Received frames: 20009 - lost: 10007 (33.34%)
timestamps-recording-4.txt - Received frames: 20011 - lost: 10005 (33.33%)
timestamps-recording-5.txt - Received frames: 20011 - lost: 10005 (33.33%)
So there's a slight gain, but it still doesn't allow me to reach a full 50 FPS.
I do have access to a Pi 5, but do not have the proper ribbon cable to perform the tests just yet.
If I switch to a Pi 5, are there specific considerations to keep in mind?
Do you think active cooling will be needed on the Pi 5 to reach the target 50 FPS?
Also, I have read that on the Pi 5, H264 and MJPEG encoding are done in software rather than hardware? I thought hardware was the way to go to get the most performance.
I thank you once again for your answers.
Regards,
Azsde.
As you say, on a Pi 5 all the encoding will be done in software. Most things on a Pi 5 perform much better, but I am not convinced that you will get more encode performance. I think 1080p50 is about the best we've achieved there too, so once you start adding MJPEG I think you'll see frame drops. FWIW, the quality of the encoded bitstream is considerably better on a Pi 5 for the same bit rate (so maybe some scope to lower that if it drops fewer frames).
You'll be running all cores flat out for sure with these kinds of workloads, so I think you would probably find that you need active cooling.
On a Pi 5, make very sure you're asking for "YUV420" for the 1080p stream that you want to encode, otherwise you'll get a slow software conversion into YUV420. I also said that I thought the JpegEncoder might be better than the MJPEGEncoder; you'd have to try it and see. I think you'll need to ask for "RGB888" (or maybe "BGR888") on the low res stream for that to work. Again, force_turbo=1 is likely to be a good idea.
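A rough sketch of what that stream setup might look like, following the suggestions above (untested; the lores size is carried over from the earlier script, the output filename is a placeholder, and the H.264 recording on the main stream would be started exactly as before):

```python
from picamera2 import Picamera2
from picamera2.encoders import JpegEncoder
from picamera2.outputs import FileOutput

picam2 = Picamera2()
# YUV420 main stream for the H.264 encode, RGB888 lores stream for the
# software JpegEncoder running on the Arm.
config = picam2.create_video_configuration(
    main={"size": (1920, 1080), "format": "YUV420"},
    lores={"size": (740, 480), "format": "RGB888"},
    controls={"FrameRate": 50})
picam2.configure(config)

jpeg_encoder = JpegEncoder()
jpeg_encoder.output = [FileOutput("stream.mjpeg")]  # placeholder output
picam2.start_encoder(jpeg_encoder, name="lores")
picam2.start()
```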
Hello,
I am developing a program that allows me both to view the video feed in real time in a web browser and to manually trigger recordings.
Here is the code I use
When checking the video stream on a webpage (not implemented in this example), for some reason the stream begins to stutter after a few recordings. When checking the recorded h264 file, it sometimes stutters for a while before resuming after a few seconds, resulting in lost frames; other times there are so many lost frames that the video playback is quite choppy, as if it had been recorded at 10 FPS or so.
I tried to investigate using this script by @davidplowman:
Running this gives me:
Total frames: 838 lost: 1571, but I am not sure how to interpret the results. When setting the parameters to 30 FPS instead of 50, I do not seem to reproduce the issue, but running the same script as above returns Total frames: 17970 lost: 18019, which doesn't seem right as the video plays back nicely. I don't know where to begin looking in order to fix this issue; can someone point me in the right direction?
I thank you in advance.
Regards,
Azsde.