pikvm / ustreamer

µStreamer - Lightweight and fast MJPEG-HTTP streamer
https://pikvm.org
GNU General Public License v3.0

How do I access H264 image quality on the /stream page? #132

Closed. TelDragon closed this issue 3 years ago.

TelDragon commented 3 years ago

How do I access H264 image quality on the /stream page?

Dear developer, first of all, thank you very much for your open-source ustreamer program.

Based on ustreamer's help output and the startup example from your project, I tried to enable the H264 feature.

My startup command:

sudo /usr/local/ustreamer/ustreamer \
--host 0.0.0.0 \
--port 8001 \
--device /dev/video0 \
--encoder OMX \
--format UYVY \
--desired-fps 15 \
--persistent \
--dv-timings \
--workers 3 \
--quality 80 \
--drop-same-frames 30 \
--sink=ustreamerjpeg \
--h264-sink=streamerh264 \
--h264-bitrate=100 \
--h264-gop=10 

cli

pi@raspberrypi:/usr/local/ustreamer $ sudo /usr/local/ustreamer/ustreamer \
> --host 0.0.0.0 \
> --port 8001 \
> --device /dev/video0 \
> --encoder OMX \
> --format UYVY \
> --desired-fps 15 \
> --persistent \
> --dv-timings \
> --workers 3 \
> --quality 80 \
> --drop-same-frames 30 \
> --sink=jpeg \
> --h264-sink=h264 \
> --h264-bitrate=100 \
> --h264-gop=10 
-- INFO  [18482.862      main] -- Using internal blank placeholder
-- INFO  [18482.862      main] -- Using JPEG-sink: jpeg
-- INFO  [18482.862      main] -- Using H264-sink: h264
-- INFO  [18482.865      main] -- Listening HTTP on [0.0.0.0]:8001
-- INFO  [18482.866    stream] -- Using V4L2 device: /dev/video0
-- INFO  [18482.866    stream] -- Using desired FPS: 15
-- INFO  [18482.866      http] -- Starting HTTP eventloop ...
-- INFO  [18482.866    stream] -- H264: Initializing MMAL encoder ...
-- INFO  [18482.866    stream] -- H264: Using bitrate: 100 Kbps
-- INFO  [18482.866    stream] -- H264: Using GOP: 10
-- INFO  [18482.889    stream] -- H264: Configuring MMAL encoder: zero_copy=0 ...
================================================================================
-- INFO  [18482.928    stream] -- Device fd=13 opened
-- INFO  [18482.928    stream] -- Using input channel: 0
-- INFO  [18482.935    stream] -- Got new DV timings: resolution=1920x1080, pixclk=148500000
-- INFO  [18482.956    stream] -- Using resolution: 1920x1080
-- INFO  [18482.956    stream] -- Using pixelformat: UYVY
-- INFO  [18482.956    stream] -- Querying HW FPS changing is not supported
-- INFO  [18482.956    stream] -- Using IO method: MMAP
-- INFO  [18482.966    stream] -- Requested 5 device buffers, got 5
-- INFO  [18482.973    stream] -- Capturing started
-- INFO  [18482.973    stream] -- Initializing OMX encoder ...
-- INFO  [18482.975    stream] -- Initializing OMX encoder ...
-- INFO  [18482.976    stream] -- Initializing OMX encoder ...
-- INFO  [18482.984    stream] -- Using JPEG quality: 80%
-- INFO  [18482.984    stream] -- Creating pool JPEG with 3 workers ...
-- INFO  [18482.985    stream] -- Capturing ...
-- INFO  [18482.985    stream] -- H264: Configuring MMAL encoder: zero_copy=1 ...
-- INFO  [18500.495      http] -- HTTP: Registered client: [192.168.1.51]:48400, id=4c0df888422cd629; clients now: 1

/state

{"ok": true, "result": { "encoder": {"type": "OMX", "quality": 80}, "h264": {"bitrate": 100, "gop": 10, "online": true}, "sinks": {"jpeg": {"has_clients": false}, "h264": {"has_clients": false}}, "source": {"resolution": {"width": 1920, "height": 1080}, "online": true, "desired_fps": 15, "captured_fps": 12}, "stream": {"queued_fps": 2, "clients": 1, "clients_stat": {"4c0df888422cd629": {"fps": 2, "extra_headers": false, "advance_headers": true, "dual_final_frames": false, "zero_data": false}}}}}

The same endpoint pretty-printed (a later poll, now with two clients connected):

{
  "ok": true,
  "result": {
    "encoder": {"type": "OMX", "quality": 80},
    "h264": {"bitrate": 100, "gop": 10, "online": true},
    "sinks": {
      "jpeg": {"has_clients": false},
      "h264": {"has_clients": false}
    },
    "source": {
      "resolution": {"width": 1920, "height": 1080},
      "online": true,
      "desired_fps": 15,
      "captured_fps": 13
    },
    "stream": {
      "queued_fps": 2,
      "clients": 2,
      "clients_stat": {
        "ee65075a19130b2a": {"fps": 2, "extra_headers": false, "advance_headers": true, "dual_final_frames": false, "zero_data": false},
        "2509eda7400987d0": {"fps": 2, "extra_headers": false, "advance_headers": false, "dual_final_frames": false, "zero_data": false}
      }
    }
  }
}

When I access it normally over HTTP:

http://ip:8001/stream

I am not sure whether I am accessing the JPEG stream or the H264 stream. Are extra parameters needed when opening the stream? Something like:

http://ip:8001/stream?advance_headers=1

The main motivation is that the default mode is JPEG; I want to enable H264, together with OMX and the bitrate setting, to reduce the streaming bandwidth.
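
One way to answer this without guessing is to look at what the server itself reports. Below is a minimal editorial sketch (standard library only; "ip" is the same placeholder host as in the URLs above):

python

import json
import urllib.request

HOST = "http://ip:8001"  # placeholder host, as in the question above

# /state reports whether the H264 encoder is running ("h264": {"online": ...})
# and whether any memsink reader is attached (the "sinks" section).
with urllib.request.urlopen(HOST + "/state") as resp:
    state = json.load(resp)
print(state["result"]["h264"])

# /stream is the MJPEG endpoint; an MJPEG stream typically announces itself
# with a multipart Content-Type, so the response header alone answers the question.
with urllib.request.urlopen(HOST + "/stream") as resp:
    print(resp.headers.get("Content-Type"))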

TelDragon commented 3 years ago

The same H264 question.

If the UNIX socket is enabled, HTTP listening is dropped. How does the backend connect then?

The help output says the stream is stored in the socket's shared memory.

My startup command:

/usr/local/ustreamer/ustreamer \
--host=0.0.0.0 \
--port=8001 \
--device=/dev/video0 \
--persistent \
--dv-timings \
--format=uyvy \
--encoder=omx \
--workers=3 \
--quality=80 \
--desired-fps=30 \
--drop-same-frames=30 \
--last-as-blank=0 \
--unix=/run/ustreamer.sock \
--unix-rm \
--unix-mode=0660 \
--exit-on-parent-death \
--process-name-prefix=ustreamer \
--notify-parent \
--no-log-colors \
--sink=jpeg \
--sink-mode=0660 \
--h264-sink=h264 \
--h264-sink-mode=0660 \
--h264-bitrate=5000 \
--h264-gop=30 

cli

ls /run/

ustreamer.sock

mdevaev commented 3 years ago

H264 stream is not available over HTTP because there is no real-time video transmission protocol for HTTP except for MJPEG. H264 is only available via shared memory.

When you use a UNIX socket, /state is also available only via that socket.
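
As a side note, with --unix the HTTP API itself does not go away, it is just bound to the socket, so something like curl --unix-socket /run/ustreamer.sock http://localhost/state should still work. A dependency-free Python sketch of the same request, using the socket path from the startup command above:

python

import json
import socket

# Speak plain HTTP/1.0 over the UNIX socket created by --unix=/run/ustreamer.sock.
# With HTTP/1.0 the server closes the connection after the response,
# so reading until EOF is enough.
sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
sock.connect("/run/ustreamer.sock")
sock.sendall(b"GET /state HTTP/1.0\r\nHost: localhost\r\n\r\n")

raw = b""
while True:
    chunk = sock.recv(4096)
    if not chunk:
        break
    raw += chunk
sock.close()

_, _, body = raw.partition(b"\r\n\r\n")
print(json.loads(body)["result"])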

TelDragon commented 3 years ago

Thank you for your reply. It is now clear that if H264 is used, it must exist as a shared-memory sink, and HTTP does not carry it.

We are still laymen when it comes to reading shared memory. Do you have a related example (in Python)? How would one read the data stream? There should be some path in, even if it is ustreamer.sock, or some other entry point?

mdevaev commented 3 years ago

Compile ustreamer with make WITH_PYTHON=1 and use the ustreamer library to read the stream.
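
For orientation, a reader built on that library might look roughly like the sketch below. The Memsink class name, its constructor argument, and wait_frame() are assumptions about the current Python binding; check help(ustreamer) on your own build for the exact API.

python

import ustreamer

# Hypothetical sketch: attach to the H264 memsink declared with --h264-sink=h264
# and append raw frames to a file. Names here are assumptions, not a confirmed API.
sink = ustreamer.Memsink("h264")
with open("capture.h264", "wb") as out:
    while True:
        frame = sink.wait_frame()     # blocks until a new frame (or returns None on timeout)
        if frame is not None:
            out.write(frame["data"])  # raw frame payload as bytes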

TelDragon commented 3 years ago

Thank you very much. I have built the .so library file following your instructions.

cli

make WITH_PYTHON=1 OMX=1

make apps
make[1]: Entering directory '/usr/local/ustreamer'
make -C src
make[2]: Entering directory '/usr/local/ustreamer/src'
-- CC ustreamer/encoders/omx/component.c
-- CC ustreamer/encoders/omx/encoder.c
-- CC ustreamer/encoders/omx/formatters.c
-- CC ustreamer/h264/encoder.c
-- CC ustreamer/h264/stream.c
== LD ustreamer.bin
make[2]: Leaving directory '/usr/local/ustreamer/src'
make[1]: Leaving directory '/usr/local/ustreamer'
make python
make[1]: Entering directory '/usr/local/ustreamer'
make -C python
make[2]: Entering directory '/usr/local/ustreamer/python'
== PY_BUILD ustreamer-*.so
running build
running build_ext
building 'ustreamer' extension
creating build
creating build/temp.linux-armv7l-3.7
creating build/temp.linux-armv7l-3.7/src
arm-linux-gnueabihf-gcc -pthread -DNDEBUG -g -fwrapv -O2 -Wall -O3 -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -UNDEBUG -I/usr/include/python3.7m -c src/ustreamer.c -o build/temp.linux-armv7l-3.7/src/ustreamer.o
arm-linux-gnueabihf-gcc -pthread -DNDEBUG -g -fwrapv -O2 -Wall -O3 -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -UNDEBUG -I/usr/include/python3.7m -c src/frame.c -o build/temp.linux-armv7l-3.7/src/frame.o
creating build/lib.linux-armv7l-3.7
arm-linux-gnueabihf-gcc -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-z,relro -O3 -Wdate-time -D_FORTIFY_SOURCE=2 build/temp.linux-armv7l-3.7/src/ustreamer.o build/temp.linux-armv7l-3.7/src/frame.o -lrt -lm -lpthread -o build/lib.linux-armv7l-3.7/ustreamer.cpython-37m-arm-linux-gnueabihf.so
make[2]: Leaving directory '/usr/local/ustreamer/python'
make[1]: Leaving directory '/usr/local/ustreamer'
ls -ll
drwxrwxr-x 3 root root  4096 Nov  1 12:51 janus
-rw-rw-r-- 1 root root 35147 Jun 11 00:32 LICENSE
drwxrwxr-x 2 root root  4096 Nov  1 12:51 linters
-rw-rw-r-- 1 root root  2266 Jun 11 00:32 Makefile
drwxrwxr-x 2 root root  4096 Nov  1 12:51 man
drwxrwxr-x 6 root root  4096 Nov  1 12:51 pkg
drwxrwxr-x 4 root root  4096 Nov 16 14:40 python
-rw-rw-r-- 1 root root  9219 Jun 11 00:32 README.md
-rw-rw-r-- 1 root root 14265 Jun 11 00:32 README.ru.md
drwxrwxr-x 6 root root  4096 Nov 16 14:40 src
-r--r--r-- 1 root root   512 Jun 25 10:32 tc358743-edid.hex
drwxrwxr-x 2 root root  4096 Nov  1 12:51 tools
lrwxrwxrwx 1 root root    17 Nov 16 14:40 ustreamer -> src/ustreamer.bin
lrwxrwxrwx 1 root root    78 Nov 16 14:40 ustreamer.cpython-37m-arm-linux-gnueabihf.so -> python/build/lib.linux-armv7l-3.7/ustreamer.cpython-37m-arm-linux-gnueabihf.so
lrwxrwxrwx 1 root root    22 Nov 16 14:40 ustreamer-dump -> src/ustreamer-dump.bin

Then I copied the compiled library file to the corresponding location on the ARM system:

/lib/
/usr/lib/
/usr/local/lib/

Then, from Python, it can be imported directly:

import ustreamer 
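
If the module is not found after copying the .so into those directories, an alternative is to point Python at the build output directly; the path below matches the PY_BUILD output shown earlier.

python

import sys

# Make the freshly built extension importable without copying it anywhere;
# the directory matches the build log above.
sys.path.insert(0, "/usr/local/ustreamer/python/build/lib.linux-armv7l-3.7")

import ustreamer
print(ustreamer.__file__)  # confirm which copy of the module was loaded
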
mdevaev commented 3 years ago

Welcome