Closed - schillingderek closed this issue 1 month ago
Hi, thanks for the question. I think the easiest way to do this is to pass the encoder, the output and the stream name all together to the start_encoder function. When you start the encoder it should pick up the resolution, format and image stream from the name supplied - in your case it was probably overwriting whatever you'd tried to configure!
import time
from picamera2 import Picamera2
from picamera2.outputs import FileOutput
from picamera2.encoders import H264Encoder

with Picamera2() as picam2:
    # Two streams: 'main' for the occasional high-quality recording,
    # 'lores' for the continuous low-resolution stream.
    main = {'size': (960, 540), 'format': 'YUV420'}
    lores = {'size': (640, 360), 'format': 'YUV420'}
    config = picam2.create_video_configuration(main, lores=lores)
    picam2.configure(config)

    streaming_encoder = H264Encoder()
    streaming_output = FileOutput("stream.h264")
    file_encoder = H264Encoder()
    file_output = FileOutput("file.h264")

    picam2.start()
    # Each encoder is tied to a stream by name, so it picks up that
    # stream's size and format.
    picam2.start_encoder(streaming_encoder, streaming_output, name='lores')
    time.sleep(2)

    picam2.start_encoder(file_encoder, file_output, name='main')
    time.sleep(5)
    picam2.stop_encoder(file_encoder)

    time.sleep(2)
    picam2.stop_encoder(streaming_encoder)
When you run this, you should get a ~5 second file at the larger resolution, and a ~9 second file at the lower resolution (you could play them back with ffplay).
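For the use-case described further down the thread (a continuous lores stream with only occasional recordings from the main stream), the same start_encoder/stop_encoder calls can be wrapped in a small helper. This is only a rough sketch based on the snippet above - the record_clip helper, the file names and the clip lengths are illustrative placeholders, not part of the Picamera2 API:

import time
from picamera2 import Picamera2
from picamera2.outputs import FileOutput
from picamera2.encoders import H264Encoder

def record_clip(picam2, filename, duration):
    # Hypothetical helper: start a second encoder on the 'main' stream just
    # for this clip, then stop it again; the lores streaming encoder keeps
    # running the whole time.
    encoder = H264Encoder()
    picam2.start_encoder(encoder, FileOutput(filename), name='main')
    time.sleep(duration)
    picam2.stop_encoder(encoder)

with Picamera2() as picam2:
    main = {'size': (960, 540), 'format': 'YUV420'}
    lores = {'size': (640, 360), 'format': 'YUV420'}
    picam2.configure(picam2.create_video_configuration(main, lores=lores))

    streaming_encoder = H264Encoder()
    picam2.start()
    picam2.start_encoder(streaming_encoder, FileOutput("stream.h264"), name='lores')

    record_clip(picam2, "clip1.h264", 5)    # occasional high-quality capture
    time.sleep(10)                          # streaming continues in between
    record_clip(picam2, "clip2.h264", 5)

    picam2.stop_encoder(streaming_encoder)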
Brilliant - works perfectly :) Thanks @davidplowman
Describe what it is that you want to accomplish
Raspberry Pi Zero 2 W, DietPi v9.6.1 - Debian GNU/Linux 12 (bookworm), Arducam OV5647
I am attempting to continuously stream the lores config, which I'm using for streaming video, and then occasionally capture the main config to a file.

The issue I'm having is that I can't understand how to get the streaming encoder to honor the size and format being passed in - it instead appears to just be using the main configuration. I can tell this because when the streaming video is decoded, it's clearly at the main size - changing the lores size has no effect on the stream, but changing the main size most certainly does.

I have been trying to follow the example in dual_encode, but am not understanding how it works - specifically the part where a request is captured and then passed to the encoders.
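The behaviour described above is consistent with the accepted answer earlier in the thread: when start_encoder is called without a name argument, the encoder appears to pick up the main stream here, which is why only the main size had any visible effect on the decoded video. As a minimal contrast, reusing the streaming_encoder and streaming_output objects from that answer's snippet:

# Without a stream name the encoder appears to fall back to the main stream:
picam2.start_encoder(streaming_encoder, streaming_output)

# Passing the name explicitly ties the encoder to the lores size and format:
picam2.start_encoder(streaming_encoder, streaming_output, name='lores')

With the name argument, the per-request encoding loop from dual_encode isn't needed for this use-case at all.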
When I read about the capture_request method in the docs, it seems like this is something that should only be done occasionally to avoid creating problems. But as I want to stream the lores config continuously, I'm not sure how to apply the idea to my situation.

I'm aware of the existing mjpeg_streaming_server example, and was trying to combine that with the idea of the dual_encode example to create high-quality video captures with a lower-quality stream. I found the MJPEG stream to be far too slow/choppy on my Pi Zero 2 W - but the H264 stream over websocket (based on this older Code Inside Out example, but updated to work with Picamera2) seems to work great.

Describe alternatives you've considered
This is my current attempt: https://github.com/schillingderek/birdCam/tree/main/streamingServer/stream_picamera_h264
The Picamera code in server.py I'll copy here for convenience: