BreeeZe / rpos

Raspberry Pi Onvif Server
http://breeeze.github.io/rpos
MIT License

ffmpeg failing #110

Open rjsdotorg opened 3 years ago

rjsdotorg commented 3 years ago

The ONVIF Device Manager shows ~1 second of live streaming after selecting Live video, then "NO SIGNAL", and the command-line output reads ffmpeg exec error: Error: Command failed: Full output:

pi@raspberrypi:~/rpos $ sudo modprobe bcm2835-v4l2
pi@raspberrypi:~/rpos $ node rpos.js
Read IP address 192.168.0.105 from wlan0
Manufacturer : RPOS Raspberry Pi
Model : Model_B+_PI_3
HardwareId : 
SerialNumber : 00000000297ed551
FirmwareVersion : 2.0.6
Starting camera settings webserver on http://192.168.0.105:8081/
Binding DeviceService to http://192.168.0.105:8081/onvif/device_service
Binding MediaService to http://192.168.0.105:8081/onvif/media_service
Binding PTZService to http://192.168.0.105:8081/onvif/ptz_service
Binding ImagingService to http://192.168.0.105:8081/onvif/imaging_service
discovery_service started
imaging_service started
Starting Live555 rtsp server
media_service started
ptz_service started
device_service started
rtspServer: Streaming on URL "rtsp://192.168.0.105:8554/h264"

(RTSP-over-HTTP tunneling is not available.)

(node:4254) [DEP0005] DeprecationWarning: Buffer() is deprecated due to security and usability issues. Please use the Buffer.alloc(), Buffer.allocUnsafe(), or Buffer.from() methods instead.
(Use `node --trace-deprecation ...` to show where the warning was created)
No Username/Password (ws-security) supplied for GetCapabilities
No Username/Password (ws-security) supplied for GetDeviceInformation
No Username/Password (ws-security) supplied for GetNetworkInterfaces
No Username/Password (ws-security) supplied for GetScopes
No Username/Password (ws-security) supplied for GetDNS
ffmpeg - starting
ffmpeg - finished
ffmpeg exec error: Error: Command failed: ffmpeg -fflags nobuffer -probesize 256 -rtsp_transport tcp -i rtsp://127.0.0.1:8554/h264 -vframes 1  -r 1 -s 640x360 -y /dev/shm/snapshot.jpg
ffmpeg version 3.2.12-1~deb9u1+rpt1 Copyright (c) 2000-2018 the FFmpeg developers
  built with gcc 6.3.0 (Raspbian 6.3.0-18+rpi1+deb9u1) 20170516
  configuration: --prefix=/usr --extra-version='1~deb9u1+rpt1' --toolchain=hardened --libdir=/usr/lib/arm-linux-gnueabihf --incdir=/usr/include/arm-linux-gnueabihf --enable-gpl --disable-stripping --enable-avresample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libebur128 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmp3lame --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-omx --enable-omx-rpi --enable-mmal --enable-openal --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libiec61883 --arch=armhf --enable-chromaprint --enable-frei0r --enable-libopencv --enable-libx264 --enable-shared
  libavutil      55. 34.101 / 55. 34.101
  libavcodec     57. 64.101 / 57. 64.101
  libavformat    57. 56.101 / 57. 56.101
  libavdevice    57.  1.100 / 57.  1.100
  libavfilter     6. 65.100 /  6. 65.100
  libavresample   3.  1.  0 /  3.  1.  0
  libswscale      4.  2.100 /  4.  2.100
  libswresample   2.  3.100 /  2.  3.100
  libpostproc    54.  1.100 / 54.  1.100
[rtsp @ 0x1e18620] Stream #0: not enough frames to estimate rate; consider increasing probesize
[rtsp @ 0x1e18620] decoding for stream 0 failed
Input #0, rtsp, from 'rtsp://127.0.0.1:8554/h264':
  Metadata:
    title           : Session streamed by "testRTSPServer"
    comment         : h264
  Duration: N/A, start: 0.000000, bitrate: N/A
    Stream #0:0: Video: h264 (High), yuv420p(progressive), 1280x720, 90k tbr, 90k tbn, 180k tbc
[swscaler @ 0x1e2a3f0] deprecated pixel format used, make sure you did set range correctly
Output #0, image2, to '/dev/shm/snapshot.jpg':
  Metadata:
    title           : Session streamed by "testRTSPServer"
    comment         : h264
    encoder         : Lavf57.56.101
    Stream #0:0: Video: mjpeg, yuvj420p(pc), 640x360, q=2-31, 200 kb/s, 1 fps, 1 tbn, 1 tbc
    Metadata:
      encoder         : Lavc57.64.101 mjpeg
    Side data:
      cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: -1
Stream mapping:
  Stream #0:0 -> #0:0 (h264 (native) -> mjpeg (native))
Press [q] to stop, [?] for help
frame=    0 fps=0.0 q=0.0 size=N/A time=00:00:00.00 bitrate=N/A speed=   0x    
    at ChildProcess.exithandler (node:child_process:326:12)
    at ChildProcess.emit (node:events:365:28)
    at maybeClose (node:internal/child_process:1067:16)
    at Socket.<anonymous> (node:internal/child_process:453:11)
    at Socket.emit (node:events:365:28)
    at Pipe.<anonymous> (node:net:661:12) {
  killed: true,
  code: null,
  signal: 'SIGTERM',
  cmd: 'ffmpeg -fflags nobuffer -probesize 256 -rtsp_transport tcp -i rtsp://127.0.0.1:8554/h264 -vframes 1  -r 1 -s 640x360 -y /dev/shm/snapshot.jpg'
rjsdotorg commented 3 years ago

More info (posting this from the Pi3, and GitHub editing is partially broken): Raspbian Linux 9 Stretch on a Pi 3 B+ with a standard Pi camera; raspistill and raspivid work fine. The install is fresh. If I run

on its own, I get

  built with gcc 6.3.0 (Raspbian 6.3.0-18+rpi1+deb9u1) 20170516
  configuration: --prefix=/usr --extra-version='1~deb9u1+rpt1' --toolchain=hardened --libdir=/usr/lib/arm-linux-gnueabihf --incdir=/usr/include/arm-linux-gnueabihf --enable-gpl --disable-stripping --enable-avresample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libebur128 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmp3lame --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-omx --enable-omx-rpi --enable-mmal --enable-openal --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libiec61883 --arch=armhf --enable-chromaprint --enable-frei0r --enable-libopencv --enable-libx264 --enable-shared
  libavutil      55. 34.101 / 55. 34.101
  libavcodec     57. 64.101 / 57. 64.101
  libavformat    57. 56.101 / 57. 56.101
  libavdevice    57.  1.100 / 57.  1.100
  libavfilter     6. 65.100 /  6. 65.100
  libavresample   3.  1.  0 /  3.  1.  0
  libswscale      4.  2.100 /  4.  2.100
  libswresample   2.  3.100 /  2.  3.100
  libpostproc    54.  1.100 / 54.  1.100
[rtsp @ 0x1831620] Stream #0: not enough frames to estimate rate; consider increasing probesize
[rtsp @ 0x1831620] decoding for stream 0 failed
Input #0, rtsp, from 'rtsp://127.0.0.1:8554/h264':
  Metadata:
    title           : Session streamed by "testRTSPServer"
    comment         : h264
  Duration: N/A, start: 0.000000, bitrate: N/A
    Stream #0:0: Video: h264 (High), yuv420p(progressive), 1280x720, 90k tbr, 90k tbn, 180k tbc
[swscaler @ 0x1844a60] deprecated pixel format used, make sure you did set range correctly
Output #0, image2, to '/dev/shm/snapshot.jpg':
  Metadata:
    title           : Session streamed by "testRTSPServer"
    comment         : h264
    encoder         : Lavf57.56.101
    Stream #0:0: Video: mjpeg, yuvj420p(pc), 640x360, q=2-31, 200 kb/s, 1 fps, 1 tbn, 1 tbc
    Metadata:
      encoder         : Lavc57.64.101 mjpeg
    Side data:
      cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: -1
Stream mapping:
  Stream #0:0 -> #0:0 (h264 (native) -> mjpeg (native))
Press [q] to stop, [?] for help
frame=    0 fps=0.0 q=0.0 size=N/A time=00:00:00.00 bitrate=N/A speed=   0x   
rjsdotorg commented 3 years ago

Also, for Step 2 I had to do

pi@raspberrypi:~ $ git clone https://github.com/BreeeZe/rpos.git
pi@raspberrypi:~/rpos $ nvm install 12
pi@raspberrypi:~/rpos $ nvm use node
pi@raspberrypi:~/rpos $ nvm install-latest-npm
pi@raspberrypi:~/rpos $ npm install
pi@raspberrypi:~/rpos $ npm audit fix
pi@raspberrypi:~/rpos $ npx gulp
pi@raspberrypi:~/rpos $ sudo apt-get install liblivemedia-dev
pi@raspberrypi:~/rpos $ sh setup_v4l2rtspserver.sh
RogerHardiman commented 3 years ago

Sorry for the delay in noticing this was posted on GitHub. It was school vacation week last week so I was offline (we call it half-term week).

I've just been making changes that use the newer versions of Node, so you are right, we do need to mention the Node 12 instructions now. Prior to my recent commits it was all Node 8, I think.

The ffmpeg errors you posted come from an ugly hack to get a JPEG image from the Pi camera. I use ffmpeg to connect to the local RTSP stream, wait for 1 frame of video and then exit, so it needs ffmpeg to be on your PATH. I've just wrapped some of that code in a try/catch as it was causing me woes on a machine that did not have ffmpeg installed (actually it was running RPOS on a Windows box).

I keep meaning to add a 'tee' to the gstreamer RTSP option to save out the JPEG to shmem, but when we are using mpromonet's RTSP server it takes over the Pi camera and I've no way to get a JPEG out of it, so I came up with this terrible hack using ffmpeg with the vframes=1 flag.
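The snapshot hack described above could be sketched roughly like this (a minimal sketch, not the actual rpos code; `buildSnapshotCmd` and `grabSnapshot` are hypothetical helper names):

```typescript
// Sketch of the ffmpeg snapshot hack: build the one-frame grab command,
// run it with exec(), and swallow any failure so a missing or broken
// ffmpeg cannot crash the server.
import { exec } from "child_process";

export function buildSnapshotCmd(rtspUrl: string, outPath: string): string {
  // -vframes 1 makes ffmpeg exit after capturing a single frame
  return `ffmpeg -fflags nobuffer -probesize 256 -rtsp_transport tcp ` +
         `-i ${rtspUrl} -vframes 1 -r 1 -s 640x360 -y ${outPath}`;
}

export function grabSnapshot(rtspUrl: string, outPath: string,
                             done: (ok: boolean) => void): void {
  try {
    // the timeout kills ffmpeg with SIGTERM if the RTSP server never
    // delivers a frame - which matches the SIGTERM seen in the log above
    exec(buildSnapshotCmd(rtspUrl, outPath), { timeout: 15000 }, (err) => {
      done(!err); // err is set if ffmpeg is absent, times out, or exits non-zero
    });
  } catch (err) {
    done(false); // exec itself threw
  }
}
```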

rjsdotorg commented 3 years ago

I'll try running through the install with Node 12 again - there was a step I needed that I did not list above. Should I try gstreamer instead of mpromonet?

RogerHardiman commented 3 years ago

Hi Ray, grab the latest source from GitHub. I added a try/catch so the ffmpeg snapshot failure would not crash the software. Roger

rjsdotorg commented 3 years ago

OK, finally back to this - I got the 2.1.0 and the mpromonet RTSP is working, very nice!

We do have a Pimoroni PTZ to test, but those are I2C and we are looking to control P-T stepper drivers directly via Pi GPIOs, so I am looking at PTZDriver.ts next.

RogerHardiman commented 3 years ago

In the file ./lib/PTZDriver.ts there is a block of code with the Pan/Tilt/Zoom actions for each supported PTZ device. Looks like you have found it. The data passed into this function is a Pan value, a Tilt value and a Zoom value, each in the range -1.0 to +1.0 (the range is defined in other ONVIF messages where we tell the VMS software what the valid ranges are). So a slow Pan may have a Pan value of 0.15; a fast Pan would have a value of 0.99 or 1.00.

So there are back-end drivers for the Pan-Tilt HAT, Pelco, Visca and a few others. You'd add to this code to control the GPIO.

So it could be

    if Pan > 0:  turn on GPIO 1,  turn off GPIO 2
    if Pan < 0:  turn off GPIO 1, turn on GPIO 2
    if Pan == 0: turn off GPIO 1, turn off GPIO 2

Same for your Tilt and your Zoom.

This would only be a single speed though.
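That single-speed mapping could be sketched like this (a hypothetical helper, not rpos code; `GpioWrite` stands in for whatever GPIO library you use, e.g. onoff or pigpio):

```typescript
// Hypothetical sketch of the single-speed GPIO mapping (not rpos code):
// the sign of the ONVIF pan value picks which of two direction pins is
// driven, and zero stops the motor.
type GpioWrite = (pin: number, on: boolean) => void;

export function panToGpio(pan: number, setGpio: GpioWrite): void {
  if (pan > 0)      { setGpio(1, true);  setGpio(2, false); } // pan one way
  else if (pan < 0) { setGpio(1, false); setGpio(2, true);  } // pan the other
  else              { setGpio(1, false); setGpio(2, false); } // stop
}
```

The same pattern would repeat for Tilt and Zoom with their own pin pairs.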

Here is the section of code you'd be working in

    else if (command==='ptz') {
      console.log("Continuous PTZ "+ data.pan + ' ' + data.tilt + ' ' + data.zoom);
      var p=0.0;
      var t=0.0;
      var z=0.0;
      try {p = parseFloat(data.pan)} catch (err) {}
      try {t = parseFloat(data.tilt)} catch (err) {}
      try {z = parseFloat(data.zoom)} catch (err) {}
      if (this.tenx) {
        if      (p < -0.1 && t >  0.1) this.tenx.upleft();
        else if (p >  0.1 && t >  0.1) this.tenx.upright();
        else if (p < -0.1 && t < -0.1) this.tenx.downleft();
        else if (p >  0.1 && t < -0.1) this.tenx.downright();
        else if (p >  0.1) this.tenx.right();
        else if (p < -0.1) this.tenx.left();
        else if (t >  0.1) this.tenx.up();
        else if (t < -0.1) this.tenx.down()
        else this.tenx.stop();
      }
      if (this.pelcod) {
        this.pelcod.up(false).down(false).left(false).right(false);
        if      (p < 0 && t > 0) this.pelcod.up(true).left(true);
        else if (p > 0 && t > 0) this.pelcod.up(true).right(true);
        else if (p < 0 && t < 0) this.pelcod.down(true).left(true);
        else if (p > 0 && t < 0) this.pelcod.down(true).right(true);
        else if (p > 0) this.pelcod.right(true);
        else if (p < 0) this.pelcod.left(true);
        else if (t > 0) this.pelcod.up(true);
        else if (t < 0) this.pelcod.down(true);

        // Set Pan/Tilt speed
        // scale speeds from 0..1 to 0..63
        var pan_speed = Math.round(Math.abs(p) * 63.0 );
        var tilt_speed = Math.round(Math.abs(t) * 63.0 );

        this.pelcod.setPanSpeed(pan_speed);
        this.pelcod.setTiltSpeed(tilt_speed);

        this.pelcod.zoomIn(false).zoomOut(false);
        if (z>0) this.pelcod.zoomIn(true);
        if (z<0) this.pelcod.zoomOut(true);

        // Set Zoom speed
        // scale speeds from 0..1 to 0 (slow), 1 (low med), 2 (high med), 3 (fast)
        var abs_z = Math.abs(z);
        var zoom_speed = 0;
        if (abs_z > 0.75) zoom_speed = 3;
        else if (abs_z > 0.5) zoom_speed = 2;
        else if (abs_z > 0.25) zoom_speed = 1;
        else zoom_speed = 0;

        // sendSetZoomSpeed is not in node-pelcod yet so wrap with try/catch
        try {
          if (z != 0) this.pelcod.sendSetZoomSpeed(zoom_speed);
        } catch (err) {}

        this.pelcod.send();
      }
      if (this.visca) {
        // Map ONVIF Pan and Tilt Speed 0 to 1 to VISCA Speed 1 to 0x18
        // Map ONVIF Zoom Speed (0 to 1) to VISCA Speed 0 to 7
        let visca_pan_speed = ( Math.abs(p) * 0x18) / 1.0;
        let visca_tilt_speed = ( Math.abs(t) * 0x18) / 1.0;
        let visca_zoom_speed = ( Math.abs(z) * 0x07) / 1.0;

        // rounding check. Visca Pan/Tilt to be in range 0x01 .. 0x18
        if (visca_pan_speed === 0) visca_pan_speed = 1;
        if (visca_tilt_speed === 0) visca_tilt_speed = 1;

        if (this.config.PTZDriver === 'visca') {
          let data: number[] = [];
          if      (p < 0 && t > 0) { // upleft
            data.push(0x81,0x01,0x06,0x01,visca_pan_speed,visca_zoom_speed,0x01,0x01,0xff);
          }
          else if (p > 0 && t > 0) { // upright
            data.push(0x81,0x01,0x06,0x01,visca_pan_speed,visca_zoom_speed,0x02,0x01,0xff);
          }
          else if (p < 0 && t < 0) { // downleft;
            data.push(0x81,0x01,0x06,0x01,visca_pan_speed,visca_zoom_speed,0x01,0x02,0xff);
          }
          else if (p >  0 && t < 0) { // downright;
            data.push(0x81,0x01,0x06,0x01,visca_pan_speed,visca_zoom_speed,0x02,0x02,0xff);
          }
          else if (p > 0) { // right
            data.push(0x81,0x01,0x06,0x01,visca_pan_speed,0x00,0x02,0x03,0xff);
          }
          else if (p < 0) { // left
            data.push(0x81,0x01,0x06,0x01,visca_pan_speed,0x00,0x01,0x03,0xff);
          }
          else if (t > 0) { // up
            data.push(0x81,0x01,0x06,0x01,0x00,visca_tilt_speed,0x03,0x01,0xff);
          }
          else if (t < 0) { // down
            data.push(0x81,0x01,0x06,0x01,0x00,visca_tilt_speed,0x03,0x02,0xff);
          }
          else { // stop 
            data.push(0x81,0x01,0x06,0x01,0x00,0x00,0x03,0x03,0xff);
          }

          // Zoom
          if (z < 0) { // zoom out
            data.push(0x81,0x01,0x04,0x07,(0x30 + visca_zoom_speed),0xff);
          }
          else if (z > 0) { // zoom in
            data.push(0x81,0x01,0x04,0x07,(0x20 + visca_zoom_speed),0xff);
          } else { // zoom stop
            data.push(0x81,0x01,0x04,0x07,0x00,0xff);
          }

          this.stream.write(new Buffer(data));
        }
      }
      if (this.pan_tilt_hat) {
        // Map ONVIF Pan and Tilt Speed 0 to 1 to Speed 0 to 15
        let pan_speed  = ( Math.abs(p) * 15) / 1.0;
        let tilt_speed = ( Math.abs(t) * 15) / 1.0;

        // rounding check.
        if (pan_speed > 15) pan_speed = 15;
        if (tilt_speed > 15) tilt_speed = 15;
        if (pan_speed < 0) pan_speed = 0;
        if (tilt_speed < 0) tilt_speed = 0;

        if (p < 0)  this.pan_tilt_hat.pan_left(pan_speed);
        if (p > 0)  this.pan_tilt_hat.pan_right(pan_speed);
        if (p == 0) this.pan_tilt_hat.pan_right(0); // stop
        if (t < 0)  this.pan_tilt_hat.tilt_down(tilt_speed);
        if (t > 0)  this.pan_tilt_hat.tilt_up(tilt_speed);
        if (t == 0) this.pan_tilt_hat.tilt_down(0); // stop
      }
    }
RogerHardiman commented 3 years ago

You mentioned they were stepper motors. I was thinking it was GPIO On or Off to start and stop the motor. But you are probably wanting to output pulses and change the square wave to control the speed.
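A minimal sketch of that idea, assuming a stepper driver like the EasyDriver where the pulse rate sets the speed (hypothetical names, not rpos code):

```typescript
// Hypothetical mapping (not rpos code) from the ONVIF continuous-move
// value (-1.0 .. +1.0) to a step-pulse rate for a stepper driver such as
// the EasyDriver: the sign gives the direction, the magnitude the speed.
export function panToStepRate(pan: number, maxStepsPerSec = 400):
    { dir: 1 | -1; stepsPerSec: number } {
  const clamped = Math.min(Math.abs(pan), 1.0);
  return {
    dir: pan < 0 ? -1 : 1,
    stepsPerSec: Math.round(clamped * maxStepsPerSec), // 0 means stop
  };
}
```

A pulse loop (or hardware PWM) would then toggle the STEP pin at `stepsPerSec` Hz with the DIR pin set from `dir`.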

rjsdotorg commented 3 years ago

At 12:27 AM 6/23/2021, you wrote:

You mentioned they were stepper motors. I was thinking it was GPIO On or Off to start and stop the motor. But you are probably wanting to output pulses and change the square wave to control the speed.

Yes, loops with variable delays pulsing the EasyDriver are what's in the existing Arduino code. I also use JMScheduler to control any regularly programmed actions. With the generalized stepper driver we can also translate X-Y instead of tilt, say for inspection systems. Thanks for the input - I'll let you know what I come up with.

rjsdotorg commented 2 years ago

Still working on the generic PTZ use case: camera.ts only allows usbcam or picam (the default), so without either plugged in constructor() exits immediately; I therefore set "RTSPServer": 0 in rposConfig.json. I get ~30 "Could not retrieve Controlvalue ..." lines at init, but PTZ commands from ONVIF Device Manager are coming through OK. I'd actually like to serve a still image via ffmpeg -i in media_service.ts, but is that even necessary? Can I have it just use deliver_jpg() somehow? Having it stream a canned MPEG via ffmpeg would also be useful, but I'm not seeing what really should be modified to do that.

RogerHardiman commented 2 years ago

RTSPServer set to 0 is fine. I use that method myself when I don't want to use the mpromonet or gstreamer RTSP servers. What I'd done a few years back was run ffmpeg's RTSP server (part of ffserver), but that was dropped with ffmpeg 4.x, so you have to go and build an older ffmpeg 3.x. Or VLC has a mode where it can be an RTSP server, and then you can feed in a file with a loop. Or I've set up gstreamer to read an image from a file by editing the gst launch pipeline in the python subdirectory.

rjsdotorg commented 2 years ago

Re:

in the file ./lib/PTZDriver.ts is a block of code the Pan/Tilt/Zoom actions for each supported PTZ device.

Line 218: console.log("Continuous PTZ "+ data.pan ... appears to write to the terminal with every PTZ request. However, I've tried modifying that line of code to console.log("Continuous PTZ CMD recv"+ data.pan ... in /home/pi/rpos/lib/PTZDriver.ts, and it does not change the terminal output (I even rebooted).

Upon further investigation, there is a PTZDriver.js which is what actually runs - if I modify that, then the output changes. What is the method used to generate the .js file from the .ts file, and how is it used in rpos? I can rename/delete the .ts file and it seems fine. ?

I have a nice new Sparkfun Stepper driver version of pantilt.py I'd like to test, and not sure how this is organized.

RogerHardiman commented 2 years ago

When you installed RPOS you would have followed a step to 'compile' .ts (typescript) into .js (javascript). When you modify the .ts files, you need to recompile rpos.

rjsdotorg commented 2 years ago

Ah, npx gulp is needed to generate the .js. I've written a modified PanTilt class, substituting the Sparkfun stepper driver for the Pimoroni I2C one, and will be testing it. It might be nice to start some Wiki entries - maybe you can get an outline going? Some of these "issues" really would be best as wiki recipes once settled.

rjsdotorg commented 2 years ago

OK, I see that PTZDriver.ts loads pan-tilt-hat.js, which spawns node-pantilthat.py: spawn('python', [path.join(__dirname, '/bin/node-pantilthat.py')]), which reads command-line calls to that daemonic instance. I can get the pid from this.python once spawned, so I know it was started, but I don't see any output from it in the terminal, like from the lines

print('Valid commands are');
print('  pan angle');
print('  tilt angle');
print('  goto angle angle');
print('  get_pan');
print('  get_tilt');

Also, oddly, I see node start, but no pid I see in TaskManager matches what the JavaScript spawn() returns. I tried adding flush=True to the print()s but still nothing visible. I also tried switching from the default Python 2.7 to Python 3.5, to no avail.
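One possible explanation for the invisible print() output: spawn() defaults to piped stdio, and if nothing in the parent reads the child's stdout, the lines just sit in the pipe and never reach the terminal. A hypothetical wrapper (not rpos code) that forwards the child's output line by line:

```typescript
// Hypothetical helper (not part of rpos): spawn a child process and
// forward every line of its stdout/stderr to a callback, so the
// Python print() lines become visible in the parent's terminal.
import { spawn } from "child_process";

export function spawnWithLogging(cmd: string, args: string[],
                                 onLine: (line: string) => void) {
  const child = spawn(cmd, args); // default stdio is "pipe"
  const forward = (chunk: Buffer) => {
    chunk.toString().split("\n")
      .filter((l) => l.length > 0)
      .forEach(onLine);
  };
  child.stdout.on("data", forward);
  child.stderr.on("data", forward);
  return child;
}

// e.g. spawnWithLogging("python", ["bin/node-pantilthat.py"],
//                       (l) => console.log("pantilthat:", l));
```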

My design idea was to create a new node-modules/sparkfun-driver/pan-tilt.js, based on pan-tilt-hat.js, which would spawn node-sparkfundriver.py, which would import a clone of the Pimoroni pantilthat package where I've modified pantilt.py. However, debug print()s I add to node-pantilthat.py and pantilt.py do not appear, so I'm misunderstanding something...

If you have a better design idea, please suggest.

Karlo318 commented 2 years ago

Hi,

I am dealing with the same problem: I get 1 second of live stream and then I get the messagge NO SIGNAL. Here is the error I get:

ffmpeg exec error: Error: Command failed: ffmpeg -fflags nobuffer -probesize 256 -rtsp_transport tcp -i rtsp://127.0.0.1:8554/h264 -vframes 1 -r 1 -s 640x360 -y /dev/shm/snapshot.jpg
ffmpeg version 4.3.3-0+rpt2+deb11u1 Copyright (c) 2000-2021 the FFmpeg developers
  built with gcc 10 (Raspbian 10.2.1-6+rpi1)
  configuration: --prefix=/usr --extra-version=0+rpt2+deb11u1 --toolchain=hardened --incdir=/usr/include/arm-linux-gnueabihf --enable-gpl --disable-stripping --enable-avresample --disable-filter=resample --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libdav1d --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librabbitmq --enable-librsvg --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libsrt --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opencl --enable-opengl --enable-sdl2 --enable-mmal --enable-neon --enable-rpi --enable-v4l2-request --enable-libudev --enable-epoxy --enable-pocketsphinx --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libx264 --enable-shared --libdir=/usr/lib/arm-linux-gnueabihf --cpu=arm1176jzf-s --arch=arm
  libavutil      56. 51.100 / 56. 51.100
  libavcodec     58. 91.100 / 58. 91.100
  libavformat    58. 45.100 / 58. 45.100
  libavdevice    58. 10.100 / 58. 10.100
  libavfilter     7. 85.100 /  7. 85.100
  libavresample   4.  0.  0 /  4.  0.  0
  libswscale      5.  7.100 /  5.  7.100
  libswresample   3.  7.100 /  3.  7.100
  libpostproc    55.  7.100 / 55.  7.100
[rtsp @ 0xf28940] Stream #0: not enough frames to estimate rate; consider increasing probesize
[rtsp @ 0xf28940] decoding for stream 0 failed
Input #0, rtsp, from 'rtsp://127.0.0.1:8554/h264':
  Metadata:
    title           : Session streamed by "testRTSPServer"
    comment         : h264
  Duration: N/A, start: 0.000000, bitrate: N/A
    Stream #0:0: Video: h264 (High), yuv420p(progressive), 1280x720, 90k tbr, 90k tbn, 180k tbc
Stream mapping:
  Stream #0:0 -> #0:0 (h264 (native) -> mjpeg (native))
Press [q] to stop, [?] for help
frame=    0 fps=0.0 q=0.0 size=N/A time=-577014:32:22.77 bitrate=N/A speed=N/A

Also, here is my config file:

{
  "NetworkAdapters" : ["awdl0", "eth0", "wlan0", "en0"],
  "IpAddress" : "myipaddress",
  "ServicePort" : 8081,
  "Username" : "user",
  "Password" : "pass",
  "CameraType" : "picam",
  "RTSPAddress" : "",
  "//" : "Normally left blank. Used to set RTSP Server Address",
  "RTSPPort" : 8554,
  "RTSPName" : "h264",
  "MulticastEnabled" : false,
  "RTSPMulticastName" : "h264m",
  "MulticastAddress" : "224.0.0.1",
  "MulticastPort" : "10001",
  "RTSPServer" : 1,
  "RtspServerComment" : "## Select RTSP Server > 1:RPOS RTSP Server 2:V4L2 RTSP Server by mpromonet (auto se>
  "PTZDriver" : "none",
  "PTZDriverComment" : "## valid values are none,tenx,pelcod,visca and pan-tilt-hat",
  "PTZOutput" : "none",
  "PTZOutputComment" : "## values are none (eg Tenx), serial and tcp",
  "PTZSerialPort" : "/dev/ttyUSB0",
  "PTZSerialPortSettings" : { "baudRate":2400, "dataBits":8, "parity":"none", "stopBits":1 },
  "PTZOutputURL" : "127.0.0.1:9999",
  "PTZCameraAddress" : 1,
  "DeviceInformation" : { "Manufacturer" : "Raspberry Pi", "Model" : "Zero W", "HardwareId" : "" },
  "logLevel" : 3,
  "" : "## LogLevels are > 1:Error 2:Warning 3:Info 4:Debug",
  "logSoapCalls" : false
}

I am using Raspberry Pi Zero W and official PiCam v.2. Do you have any idea why is the stream failing?

rjsdotorg commented 2 years ago

@Karlo318 Look above in the thread - I installed and compiled 2.1.0 and it worked as described there.

Karlo318 commented 2 years ago

@rjsdotorg I already installed and compiled version 2.1.0. I even tried different RTSP server options, but nothing helped. When I choose RPOS RTSP server I get at least one second of streaming. On other options, I get NO SIGNAL immediately.

rjsdotorg commented 2 years ago

I see you had "RTSPServer" : 1, whereas I was using 2 (mpromonet). I've now changed to 0 for other reasons, but could test the options soon.

Karlo318 commented 2 years ago

This is the error I get when I set the RTSP server to option 2:

rtspServer: VIDIOC_REQBUFS: Inappropriate ioctl for device VIDIOC_STREAMOFF: Inappropriate ioctl for device VIDIOC_REQBUFS: Inappropriate ioctl for device

rtspServer: VIDIOC_REQBUFS: Inappropriate ioctl for device VIDIOC_STREAMOFF: Inappropriate ioctl for device VIDIOC_REQBUFS: Inappropriate ioctl for device

Also, I get the ffmpeg error previously described. The only difference is that in this case I don't get even one second on screen before the NO SIGNAL message.

rjsdotorg commented 2 years ago

Seems like that might be a Node version issue: https://github.com/BreeeZe/rpos/issues/106

You do run sudo modprobe bcm2835-v4l2 first, right?

Karlo318 commented 2 years ago

I did everything as it says in the readme. My current node version is 10.24.0. I could try again with 8.x.

cdaher78 commented 2 years ago

@Karlo318 it only works with Node v8 and NPM v6.

node -v (8.17.3) npm -v (6.13.4)

rjsdotorg commented 2 years ago

As to the RPOS instructions: the newest RPi OS 11 Bullseye installs Node.js v12, whereas RPi OS 10 Buster installs Node v10, and the latest Node is now v14. I think RogerHardiman is still working on these node version issues.

Karlo318 commented 2 years ago

I have been trying everything, but I always end up with the same problem. Here is what I did:

On my first attempt I had enabled legacy camera support in raspi-config, so I thought that was the problem. I had to enable Glamor in raspi-config in order to get the camera working on the Raspberry Pi Zero. Here is the link to the documentation: https://www.raspberrypi.com/documentation/accessories/camera.html. There it says:

On Pi3 and earlier devices running Bullseye you need to re-enable Glamor in order to make the X-Windows hardware accelerated preview window work. To do this enter sudo raspi-config at a terminal window and then choose Advanced Options, Glamor and Yes. Finally quit raspi-config and let it reboot your Pi.

So I did that and I can confirm that camera is working. Then I followed the steps in readme. My current versions are:

node -v - 8.17.0
npm -v - 6.13.4
nodejs -v - 12.22.5

I tried RTSP server option 1 and option 2, but either way I get the NO SIGNAL message. The only difference now is that I don't get that one second of stream before the NO SIGNAL message. I don't know what else to do.

cdaher78 commented 2 years ago

@Karlo318 have you tried on Buster? I deployed two RPi 4 2GB boards on the Pi OS Buster version.

One using the server 2 option and another one using the server 3 option. Both are running, but server 2 has the best image rendering.

I went through what you are going through and I had to begin from scratch several times to understand what was going on. Attached are some instructions I made for myself to guide me through a working process.

Onvif Pi Camera Deploy Instructions.txt

I hope it could help you to make it work.

Karlo318 commented 2 years ago

@cdaher78 Thank you for the instructions. I did everything as you said and hoped for the best. However, I got the same result as in the previous case. I still get the ffmpeg error, and I can get one second of stream only on stream option 1. On option 2 I don't even get that.

Maybe the problem is in the node and npm versions. The Pi Zero W runs on the ARMv6 architecture, so I cannot install node version 8.17.3; I can only get version 8.17.0. The npm version is 6.13.4, so that is OK.

Do you have any other ideas? Does anyone run rpos on Pi Zero W?

RogerHardiman commented 2 years ago

I don't have a Pi Zero so I don't know what issues it may have. There are issues with the latest Raspberry Pi OS releases and camera drivers, and last time I checked (2 months ago) the best plan was to look for the Raspberry Pi OS LEGACY release and use that, as the Pi Foundation made loads of changes to the Pi camera drivers which cause all sorts of compatibility problems.

The node version is not a general issue. I use Node version 12. But it may be an issue on the Pi Zero due to the older CPU model.

Can I ask you to open a new thread, as this thread is for issues with the background ffmpeg worker process that is used to make the thumbnail JPEG image (part of the ONVIF standard)?

I've noticed today the background ffmpeg is not working, so I'm off to investigate. Thanks, Roger

RogerHardiman commented 2 years ago

The background ffmpeg makes a connection to the local RTSP server, so if the RTSP server has failed (eg NO SIGNAL in ONVIF Device Manager) then ffmpeg will time out and report an error too.

RogerHardiman commented 2 years ago

Looks like the gstreamer RTSP server fails when there are multiple RTSP connections. When video software asks for a JPEG thumbnail, we start a local copy of ffmpeg to connect to the local RTSP server and capture 1 frame of video. This then breaks the live video stream. That is unfortunate.

RogerHardiman commented 2 years ago

Seems that the gstreamer RTSP server in gstreamer 1.18 works fine with multiple viewers, but the gstreamer 1.14 in old Buster had an issue. I have not checked yet to see what the v4l2rtspserver from mpromonet does.