elsampsa / websocket-mse-demo

Stream H264 to browsers with websocket and w3 media source extensions

Offset is outside the bounds of the DataView #2

Closed: nahueltaibo closed this issue 3 years ago

nahueltaibo commented 3 years ago

Hi elsampsa

I am trying to get this working on Ubuntu, and after preparing the environment for it I am getting the following error:

Uncaught RangeError: Offset is outside the bounds of the DataView
    at DataView.getInt32 ()
    at toInt (ws_client_new.html:71)
    at getBox (ws_client_new.html:80)
    at putPacket (ws_client_new.html:134)
    at WebSocket.ws.onmessage (ws_client_new.html:222)
toInt @ ws_client_new.html:71
getBox @ ws_client_new.html:80
putPacket @ ws_client_new.html:134
ws.onmessage @ ws_client_new.html:222

This error repeats constantly and the stream is never visible. Any idea what could be causing this?

Thanks

elsampsa commented 3 years ago

hmm.. never seen this before.

Btw, this whole example is a bit outdated.. I should substitute the whole Apache thing with nginx and the muxing part with libValkka (instead of a simple ffmpeg command). Let's see if I get motivated today.

Just for the sake of it.. can you tell me a bit more about your camera & video stream? What resolution?

Run this on the command line & paste the first few lines of the output here:

ffmpeg -r 5 -i rtsp://user:passwd@ip-address -c:v copy -an -movflags frag_keyframe+empty_moov -f mp4 kokkelis.mp4

nahueltaibo commented 3 years ago

So, when running the original ffmpeg command from the example: st="ffmpeg -r 5 -i rtsp://192.168.86.46:5554 -c:v copy -an -movflags frag_keyframe+empty_moov -f mp4 pipe:1"

I get this output on the server:

python3 ws_serve.py
ffmpeg version 4.2.4-1ubuntu0.1 Copyright (c) 2000-2020 the FFmpeg developers
  built with gcc 9 (Ubuntu 9.3.0-10ubuntu2)
  configuration: --prefix=/usr --extra-version=1ubuntu0.1 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --arch=amd64 --enable-gpl --disable-stripping --enable-avresample --disable-filter=resample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librsvg --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opencl --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-nvenc --enable-chromaprint --enable-frei0r --enable-libx264 --enable-shared
  libavutil 56. 31.100 / 56. 31.100
  libavcodec 58. 54.100 / 58. 54.100
  libavformat 58. 29.100 / 58. 29.100
  libavdevice 58. 8.100 / 58. 8.100
  libavfilter 7. 57.100 / 7. 57.100
  libavresample 4. 0. 0 / 4. 0. 0
  libswscale 5. 5.100 / 5. 5.100
  libswresample 3. 5.100 / 3. 5.100
  libpostproc 55. 5.100 / 55. 5.100
[rtsp @ 0x564383fb1700] UDP timeout, retrying with TCP
[rtsp @ 0x564383fb1700] Nonmatching transport in server reply
[rtsp @ 0x564383fb1700] Could not find codec parameters for stream 0 (Video: h264, none): unspecified size
Consider increasing the value for the 'analyzeduration' and 'probesize' options
Input #0, rtsp, from 'rtsp://192.168.86.46:5554':
  Metadata:
    title : RTSP_CAMERA
    comment : N/A
  Duration: N/A, bitrate: N/A
    Stream #0:0: Video: h264, none, 90k tbr, 90k tbn, 180k tbc
    Stream #0:1: Audio: aac, 22050 Hz, mono, fltp
Output #0, mp4, to 'pipe:1':
Output file #0 does not contain any stream
Error in connection handler
Traceback (most recent call last):
  File "/home/now/.local/lib/python3.8/site-packages/websockets/server.py", line 191, in handler
    await self.ws_handler(self, path)
  File "ws_serve.py", line 36, in hello
    await websocket.send(packet)
  File "/home/now/.local/lib/python3.8/site-packages/websockets/protocol.py", line 567, in send
    await self.write_frame(True, opcode, data)
  File "/home/now/.local/lib/python3.8/site-packages/websockets/protocol.py", line 1083, in write_frame
    await self.ensure_open()
  File "/home/now/.local/lib/python3.8/site-packages/websockets/protocol.py", line 803, in ensure_open
    raise self.connection_closed_exc()
websockets.exceptions.ConnectionClosedError: code = 1006 (connection closed abnormally [internal]), no reason

When running the version you asked for the output of (ffmpeg -r 5 -i rtsp://user:passwd@ip-address -c:v copy -an -movflags frag_keyframe+empty_moov -f mp4 kokkelis.mp4), I get:

ffmpeg -r 5 -i rtsp://192.168.86.46:5554 -c:v copy -an -movflags frag_keyframe+empty_moov -f mp4 kokkelis.mp4
ffmpeg version 4.2.4-1ubuntu0.1 Copyright (c) 2000-2020 the FFmpeg developers
  built with gcc 9 (Ubuntu 9.3.0-10ubuntu2)
  configuration: --prefix=/usr --extra-version=1ubuntu0.1 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --arch=amd64 --enable-gpl --disable-stripping --enable-avresample --disable-filter=resample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librsvg --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opencl --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-nvenc --enable-chromaprint --enable-frei0r --enable-libx264 --enable-shared
  libavutil 56. 31.100 / 56. 31.100
  libavcodec 58. 54.100 / 58. 54.100
  libavformat 58. 29.100 / 58. 29.100
  libavdevice 58. 8.100 / 58. 8.100
  libavfilter 7. 57.100 / 7. 57.100
  libavresample 4. 0. 0 / 4. 0. 0
  libswscale 5. 5.100 / 5. 5.100
  libswresample 3. 5.100 / 3. 5.100
  libpostproc 55. 5.100 / 55. 5.100
[rtsp @ 0x55a2320a1700] UDP timeout, retrying with TCP
[rtsp @ 0x55a2320a1700] Nonmatching transport in server reply
[rtsp @ 0x55a2320a1700] Could not find codec parameters for stream 0 (Video: h264, none): unspecified size
Consider increasing the value for the 'analyzeduration' and 'probesize' options
Input #0, rtsp, from 'rtsp://192.168.86.46:5554':
  Metadata:
    title : RTSP_CAMERA
    comment : N/A
  Duration: N/A, bitrate: N/A
    Stream #0:0: Video: h264, none, 90k tbr, 90k tbn, 180k tbc
    Stream #0:1: Audio: aac, 22050 Hz, mono, fltp

As I now see it, it seems like the app I'm using to create the RTSP stream is not setting the codec parameters, is that correct? The app is this Android app: https://play.google.com/store/apps/details?id=com.miv.rtspcamera (it creates an RTSP stream from the phone camera and tells you the URL to connect to it).
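One way to check what the stream actually advertises would be ffprobe (it ships with FFmpeg); e.g. something like:

ffprobe -rtsp_transport tcp -show_streams rtsp://192.168.86.46:5554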

elsampsa commented 3 years ago

You should scrap the audio.
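For what it's worth, ffmpeg itself suggests raising 'analyzeduration' and 'probesize' in the log above; combined with forcing TCP for the RTSP transport and mapping only the video stream, a variant to try could look something like this (the 10M values are just an illustrative guess, not taken from the project):

ffmpeg -rtsp_transport tcp -probesize 10M -analyzeduration 10M -r 5 -i rtsp://192.168.86.46:5554 -map 0:v:0 -c:v copy -an -movflags frag_keyframe+empty_moov -f mp4 pipe:1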

But never mind! The example has now been completely rewritten, so I'll close this ticket. Please try the rewritten example. :)

nahueltaibo commented 3 years ago

Oh, this was the only example I could find that was really basic and didn't use other libraries. My main idea is to achieve this in C# for the server. No worries, I'll try to get it working based on other examples. Thanks for your help.

elsampsa commented 3 years ago

Not .. using .. other .. libraries..? Good luck with that :) I recommend that you start by creating your own rtsp client library (like live555) and then a muxing library (like the one we have in ffmpeg). Both of them have hundreds of man-years invested.

nahueltaibo commented 3 years ago

Lol, I think I didn't explain myself right. I'm trying to understand the internals of the mp4 structure (as you did when you created the previous example) and live video streaming in general. My main goal is to get live streaming from an RTSP camera into an HTML5 video element. I'm OK with using ffmpeg to produce the fragmented mp4; I just want to understand the issues I am having streaming that video feed through a websocket into the video element using MSE.

elsampsa commented 3 years ago

Ok, got it :) If you want to use ffmpeg, the old example is in the "legacy" folder. Please be sure to remove any audio from the stream. The problem with reading the ffmpeg process's output is that it doesn't give you complete mp4 "boxes"; you have to "manually" reconstruct the boxes (moov, ftyp, mdat) from the fragments. For understanding fragmented mp4, please see the references/links in the main readme page.
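To illustrate the box reconstruction (a minimal sketch, not code from this repo; the helper name feed is just for illustration): an mp4/ISO-BMFF box starts with a 4-byte big-endian size field (which counts the 8-byte header itself) followed by a 4-byte type, so you can buffer the bytes coming from ffmpeg's stdout and only pass a box on once all of its size bytes have arrived.

import struct

buf = b""  # bytes from ffmpeg's stdout that do not yet form a complete box

def feed(chunk: bytes):
    """Append raw bytes and return any complete boxes collected so far.

    Sketch only: does not handle size == 0 ("box extends to end of file")
    or size == 1 (64-bit largesize) headers.
    """
    global buf
    buf += chunk
    boxes = []
    while len(buf) >= 8:
        size, = struct.unpack_from(">I", buf, 0)  # box size, header included
        if size < 8:
            raise ValueError("corrupt box header")
        if len(buf) < size:
            break  # box still incomplete: wait for more data
        box_type = buf[4:8].decode("ascii", errors="replace")  # e.g. ftyp, moov, moof, mdat
        boxes.append((box_type, buf[:size]))
        buf = buf[size:]
    return boxes

Sending only complete boxes over the websocket should keep the client-side parser from reading a box header past the end of a message, which is presumably what triggers the DataView RangeError reported above.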