Closed: francisuk1989 closed this issue 2 years ago.
And what could we do to fix this?
I blacklisted h264_omx by default because ffmpeg has zero copy enabled on h264_omx by default, and it WILL cause the GPU to lock up. Disable zero copy on your ffmpeg, recompile and reinstall, then comment out h264_omx from the blacklist in motion source code to re-enable it. All patches to do this are available in motionEyeOS.
@jasaw thanks for providing the patches! I would like to try this out, as CPU usage is just too high on my XU4. Could you kindly point to where in the MotionEyeOS repo the requisite patches are located?
ffmpeg disable zero copy patch: https://github.com/ccrisan/motioneyeos/blob/master/package/ffmpeg/disable-rpi-omx-input-zerocopy.patch
ffmpeg fix handling of fragmented buffers patch: https://trac.ffmpeg.org/raw-attachment/ticket/7687/0001-avcodec-omx-Fix-handling-of-fragmented-buffers.patch
motion undo h264_omx blacklist patch: https://github.com/ccrisan/motioneyeos/blob/master/package/motion/enable-h264-omx-codec.patch
Are these patches still necessary with the latest version of ffmpeg?
@gururise Did the patch help you? I also have an XU4 and CPU problems saving the videos.
@roger- I believe the patches are still necessary.
@maffi91 @gururise For XU4 hardware video encoding, you need to use the h264_v4l2m2m encoder. Make sure your ffmpeg supports h264_v4l2m2m, then change the movie_codec parameter in the camera-1.conf file to select the h264_v4l2m2m encoder: movie_codec mp4:h264_v4l2m2m
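For reference, the whole change is a one-line edit to the camera's config file (on a typical motionEye install this is /etc/motioneye/camera-1.conf, but the path can vary by setup):

```
# camera-1.conf: select the XU4's hardware H.264 encoder.
# Requires an ffmpeg build that lists h264_v4l2m2m under `ffmpeg -encoders`.
movie_codec mp4:h264_v4l2m2m
```

Restart motionEye (or motion) after editing so the new codec is picked up.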
Are there instructions for applying the patches somewhere I can reference?
Thanks!
# Compile and install FFmpeg from source
sudo apt-get update -qq && sudo apt-get -y install \
autoconf \
automake \
build-essential \
cmake \
git-core \
libass-dev \
libfreetype6-dev \
libsdl2-dev \
libtool \
libva-dev \
libvdpau-dev \
libvorbis-dev \
libxcb1-dev \
libxcb-shm0-dev \
libxcb-xfixes0-dev \
pkg-config \
texinfo \
wget \
zlib1g-dev \
libx264-dev
sudo apt-get install libavformat-dev libavcodec-dev libavutil-dev libswscale-dev libavdevice-dev
wget https://ffmpeg.org/releases/ffmpeg-4.2.tar.bz2
tar xf ffmpeg-4.2.tar.bz2
wget https://raw.githubusercontent.com/ccrisan/motioneyeos/master/package/ffmpeg/disable-rpi-omx-input-zerocopy.patch
wget https://trac.ffmpeg.org/raw-attachment/ticket/7687/0001-avcodec-omx-Fix-handling-of-fragmented-buffers.patch
patch -p1 -d ffmpeg-4.2 < 0001-avcodec-omx-Fix-handling-of-fragmented-buffers.patch
patch -p1 -d ffmpeg-4.2 < disable-rpi-omx-input-zerocopy.patch
cd ffmpeg-4.2
./configure --enable-mmal --enable-omx --enable-omx-rpi --extra-cflags=-I/opt/vc/include/IL
make
sudo make install
cd -
# Compile and install motion from source
sudo apt-get install autoconf automake pkgconf libtool libjpeg8-dev build-essential libzip-dev gettext libmicrohttpd-dev
git clone https://github.com/Motion-Project/motion.git
cd motion
git checkout ff903de2e3d6320f13841a91e4080c1220c38059
wget https://raw.githubusercontent.com/jasaw/motioneyeos/motion-disable-omx-zerocopy/package/motion/2000-disable-zerocopy-via-avoptions.patch
patch -p1 -d . < 2000-disable-zerocopy-via-avoptions.patch
wget https://raw.githubusercontent.com/jasaw/motioneyeos/motion-disable-omx-zerocopy/package/motion/enable-h264-omx-codec.patch
patch -p1 -d . < enable-h264-omx-codec.patch
autoreconf -fiv
./configure
make
sudo make install
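After the build steps above finish, it's worth a quick sanity check before touching motionEye. A minimal sketch, assuming both packages went to the default /usr/local prefix:

```shell
# Confirm the ffmpeg on PATH is the patched build and that it has the
# OMX encoder compiled in; print a hint if something is missing.
if ! command -v ffmpeg >/dev/null 2>&1; then
    echo "ffmpeg not found on PATH"
elif ffmpeg -encoders 2>/dev/null | grep -q h264_omx; then
    echo "h264_omx encoder available"
else
    echo "h264_omx missing: re-check the --enable-omx configure flags"
fi
```

If the last line is what you get, re-run the configure step with the OMX flags and rebuild.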
The above is great, thanks @jasaw, I just wonder how long it takes to pull that through on an RPi... I once tried to set up a cross-compiling toolchain to recompile ffmpeg on a Linux desktop but failed at it (I don't exactly remember what it was that failed; I recall I managed to produce a simple ARM test executable, so it might have been some ffmpeg dependencies or required tools or just configuration, since I didn't really know what I was doing). I'd be interested in trying it out once more if someone could point out (or share) instructions for it.
RPi 3 and above are pretty fast, shouldn't take too long. I compile things on my RPi all the time. Replace make with make -j4 to use 4 cores to compile. FFmpeg may take a while to compile, but you can always start the compilation and leave it overnight.
Cross compiling ffmpeg is painful because it has A LOT of dependencies. You'll need to set up the whole cross compilation environment properly, and cross compile all the dependencies too. This is why we have MotionEyeOS project that deals with all the cross compilation. If you still want to cross compile, check out MotionEyeOS.
# Compile and install FFmpeg from source
sudo apt-get update -qq && sudo apt-get -y install \
autoconf \
automake \
build-essential \
cmake \
git-core \
libass-dev \
libfreetype6-dev \
libsdl2-dev \
libtool \
libva-dev \
libvdpau-dev \
libvorbis-dev \
libxcb1-dev \
libxcb-shm0-dev \
libxcb-xfixes0-dev \
pkg-config \
texinfo \
wget \
zlib1g-dev \
libx264-dev
sudo apt-get install libavformat-dev libavcodec-dev libavutil-dev libswscale-dev libavdevice-dev
git clone https://github.com/FFmpeg/FFmpeg.git
patch -p1 -d FFmpeg < disable-rpi-omx-input-zerocopy.patch
patch -p1 -d FFmpeg < 0001-avcodec-omx-Fix-handling-of-fragmented-buffers.patch
cd FFmpeg
./configure
make
make install
cd -
# Compile and install motion from source
sudo apt-get install autoconf automake pkgconf libtool libjpeg8-dev build-essential libzip-dev gettext libmicrohttpd-dev
git clone https://github.com/Motion-Project/motion.git
patch -p1 -d motion < enable-h264-omx-codec.patch
cd motion
autoreconf -fiv
./configure
make
make install
pi@raspberrypi:~$ patch -p1 -d FFmpeg < disable-rpi-omx-input-zerocopy.patch
-bash: disable-rpi-omx-input-zerocopy.patch: No such file or directory
pi@raspberrypi:~$ patch -p1 -d FFmpeg < 0001-avcodec-omx-Fix-handling-of-fragmented-buffers.patch
-bash: 0001-avcodec-omx-Fix-handling-of-fragmented-buffers.patch: No such file or directory
Do you have a full explanation please?
@Saku241 Did you download the patches (links given by @jasaw a couple of comments up)? You need to have those present to be able to apply them :)
@zagrim Where am I supposed to place them? Does anyone have full instructions for Raspberry Pi 3?
@Saku241 If you want to use the given instructions 1:1 (copy-paste), then you need to place the patch files in the parent directory of the cloned repos, which would seem to mean the home dir of the "pi" user.
In case the command line is not your friend, you should execute these commands after running the first git clone command (note the raw.githubusercontent.com URLs; the regular github.com blob pages download an HTML page, not the patch itself):
wget https://raw.githubusercontent.com/ccrisan/motioneyeos/master/package/ffmpeg/disable-rpi-omx-input-zerocopy.patch
wget https://trac.ffmpeg.org/raw-attachment/ticket/7687/0001-avcodec-omx-Fix-handling-of-fragmented-buffers.patch
wget https://raw.githubusercontent.com/ccrisan/motioneyeos/master/package/motion/enable-h264-omx-codec.patch
@zagrim Thank you for everything, it is all working, but now the h264_omx option is gone :/
Edit: I think it broke my motionEye, because now when it detects motion it starts recording but doesn't stop. Here's my log: https://pastebin.com/raw/Um7BcRVQ
@Saku241 Sorry, my bad, I left out --enable-mmal --enable-omx --enable-omx-rpi in the ffmpeg configure stage. I've also updated my post above to include the enable flags. I was expecting the configure script to pick it up automatically, but it obviously did not. Let's hope it can pick up the dependencies. Let us know if it doesn't.
After you have run configure, compiled and installed ffmpeg, check whether h264_omx is supported by running this command: ffmpeg -encoders | grep omx. You should get the line below if it's supported.
V..... h264_omx OpenMAX IL H.264 video encoder (codec h264)
pi@raspberrypi:~/FFmpeg $ ./configure --enable-mmal --enable-omx --enable-omx-rpi
ERROR: OMX_Core.h not found
If you think configure made a mistake, make sure you are using the latest version from Git. If the latest version fails, report the problem to the ffmpeg-user@ffmpeg.org mailing list or IRC #ffmpeg on irc.freenode.net. Include the log file "ffbuild/config.log" produced by configure as this will help solve the problem.
Edit: fixed with sudo apt-get install libomxil-bellagio-dev, but now I got a strange green line on my videos:
Just a heads up: after 2 years, zerocopy problem has finally been fixed in ffmpeg. Now it doesn't force enable zerocopy on RPi builds anymore, instead allows applications to disable zerocopy. See https://github.com/FFmpeg/FFmpeg/commit/23a3e1460a7a609651bfe75b7b4c428eaa8f3902
What this means is, ffmpeg master is in a mess right now, and in a broken state that has the green line across the screen. I've updated my instructions above, which works for me.
Alright, everything seems to be working now, no more green line, but I have a problem when a file is recorded. It creates the video and a 262-byte file here:
that contains this:
I got this from motion.log:
aoû 29 15:36:29 raspberrypi motion[310]: Not starting motion daemon, disabled via /etc/default/motion ... (warning).
@Saku241 Very odd. Make sure you don't have 2 instances of motion or motioneye running? I don't see this problem on my Pi.
@jasaw Working fine now. This happens only when continuous recording is on; it doesn't happen with motion detection.
@jasaw thanks for your instructions, I got my motionEye framerate up to pretty decent levels (20-30 fps or so). I am running motioneye on the latest Raspbian (Raspberry Pi 4, 4 GB). I had only 2-3 fps before, and the same with my Raspberry Pi 3B.
The only thing I did differently (regarding your instructions): I installed ffmpeg 4.2.1 instead of 4.2.
But now there's another problem --> those green lines appear in nearly every video. Is there any solution to this?
As far as I know, the latest working ffmpeg version is 4.2. Anything newer has the green line over the video. My suggestion is to stick to version 4.2.
These instructions are truly wonderful. I was struggling with low fps on my RPi 3B, and even bought an RPi 4 because I understood from some of the comments (in other issues regarding low fps) that earlier Pis are simply too underpowered for recording HD video at anything above 2 fps or so. But then I found out that the issue remains despite running motionEye on the RPi 4. A friend of mine has the same problem with his RPi 3, and I believe there are several others who just think that the RPi can't handle the video.
What I'm suggesting is: shouldn't this solution be more easily found and "advertised" in the wiki or somewhere, so that people can actually find it? Or even better, help the ffmpeg guys fix it?
I had to register on here to thank jasaw for that detailed guide on compiling motioneye and ffmpeg. It had been 15 years since I patched code on linux so was really lost until I found this. Now my humble Pi 3 B does a great job of recording from my cameras, before enabling omx I would get high CPU usage and mostly corrupt video files.
Thanks for this.
With my limited knowledge, I cannot get this right. Everything appears to work, but at the end of the script, at 'make' and then 'make install', I get an error: make: *** No rule to make target 'install'. Stop.
Are you using sudo for make install? Because you need to.
Thanks for the reply. Alas, I tried it again, slowly, and it worked. The installation is done.
Now that the custom installation of ffmpeg and motion is done, how do I get motioneye (i.e. the web interface)?
UPDATE: I installed motioneye and it is working now. I added one RTSP camera and the CPU is right up to 120% again while motion is running.
UPDATE 2: I started the Pi from a fresh install .img and installed exactly as per the instructions in this thread. Then I set up motion.conf and ran motion from the command line. The stream is working from the RTSP network camera at 1280x720 @ 2 fps, but CPU is 125% (exactly the same as motioneye was in every test I've done before this).
Any suggestions?
@seanocaster123 Follow the instructions here, but exclude ffmpeg in step 1: https://github.com/ccrisan/motioneye/wiki/Install-On-Raspbian
Thanks, I will try that. The challenge is that I'm using ffmpeg to do some other timelapses... does this mean I cannot run motion and ffmpeg together?
I have to say, I've been trying to get RTSP -> HTTP working with low CPU usage for a few weeks now and I can't believe it is so difficult. I have the RPi 4 1GB and it should be able to handle at least one camera transcoding to HTTP.
You can run motion and ffmpeg simultaneously, as long as both can access the same video source simultaneously (presuming you're working with one video source only).
Low latency, low CPU RTSP is difficult! Maybe you could describe your setup and what you are trying to do; someone here might be able to give you some pointers?
Thanks for the reply. I really appreciate it.
I am a surfer and I live in Dubai on the west coast. Meanwhile, we sometimes get waves on the east coast, which is 100 km away. So I have partnered with a hotel to host a Hikvision camera, an AcuRite 5-in-1 weather station and an RPi 4 for me. The station has been online for 2 weeks and it's working reliably. I have an ffmpeg script taking one photo every 5 minutes and placing it inside the /var/www/html/images/ folder, and then via Apache, people who browse to the RPi can see some basic info via a simple HTML website here: http://www.seanssurfreport.com. The purpose of this whole process is to help the community in the UAE get an idea of the conditions, to avoid the long drive if it is flat.
The camera settings are 1920x1080 at 20 fps, and that is being broadcast as an rtsp:// feed which the Pi accesses via the local network. If I VNC into the Pi from Dubai and open the stream via VLC, it works well through VNC Viewer.
VLC and ffmpeg are very tricky to work with; the syntax examples available online do not always translate exactly to the Linux version, so it took lots of fiddling to get it working. I can provide examples of all the scripts that are working to take individual pictures or even short video clips if you like.
The ultimate goal is to provide a 1 fps HTTP stream, because that is the best way to see the true movement of the ocean. A picture every 1, 2 or 5 minutes often misses the larger waves.
Thanks,
I would try this first.
Use gstreamer to receive the RTSP stream from your Hikvision camera and pipe it to streameye. Streameye is an HTTP server that serves MJPEG only. You can then embed this MJPEG stream in your webpage that is exposed to the public. From personal experience, gstreamer may introduce high latency, but that may not be an issue for your purpose.
Something like this may be enough to get you going:
gst-launch-1.0 -v rtspsrc location=rtsp://<IP_ADDR>:<RTSP_PORT>/<URI> latency=0 drop-on-latency=1 ! rtph264depay ! h264parse ! omxh264dec ! videorate ! video/x-raw,framerate=20/1 ! jpegenc ! filesink location=/dev/stdout | streameye
This assumes that you have omxh264dec enabled in your gstreamer, which offloads H.264 video decoding to the GPU (assuming your camera outputs h264 video). Framerate can be whatever you like (set to 20 here).
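Since the stated goal is roughly 1 fps, here is a sketch of the same pipeline with the rate capped at one frame per second. The address, port and URI are placeholders, so the command is echoed here for you to edit before running it on the Pi:

```shell
# Sketch: the pipeline above adjusted to 1 fps via the videorate caps
# filter. Placeholders (<IP_ADDR> etc.) must be filled in first, so this
# block just prints the command for review.
cat <<'EOF'
gst-launch-1.0 -v rtspsrc location=rtsp://<IP_ADDR>:<RTSP_PORT>/<URI> latency=0 drop-on-latency=1 \
  ! rtph264depay ! h264parse ! omxh264dec \
  ! videorate ! video/x-raw,framerate=1/1 \
  ! jpegenc ! filesink location=/dev/stdout | streameye
EOF
```

The only change from the 20 fps version is framerate=1/1; videorate drops the extra frames before the JPEG encoder, which also cuts CPU use.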
Thanks, latency is not a big deal at all. And honestly, I would be happy with 1 fps or even less, like 1 frame every 2 seconds.
I spent hours trying to install gstreamer and it gets close, but is never 100%... when I try to run a script to do a test video it just gives errors:
2019-10-25 08:03:29: ERROR: bind() failed: Address already in use
2019-10-25 08:03:29: ERROR: failed to start server
I also tried to use the script you sent with my IP camera details and it gives an error:
gst-launch-1.0 -v rtspsrc location=rtsp://192.168.1.65:554 latency=0 drop-on-latency=1 ! rtph264depay ! h264parse ! omxh264dec ! videorate ! video/x-raw,framerate=20/1 ! jpegenc ! filesink location=/dev/stdout | streameye
The fact that the test video didn't work tells me gstreamer is not working properly.
Have you seen this error before?
Good day users and devs! For the last month I've been using motionEye on Raspbian Buster (Raspberry Pi 4) with an RTSP camera. The framerate is terrible, although I enabled hardware acceleration! My idea is to use ffmpeg to decode the RTSP stream of the camera and encode it into MJPEG, in a separate script.
For the last few days, I've learned a great deal about ffmpeg and motion, but cannot get it to work for some reason (missing ffserver binary). Should I pursue this path? At the moment, the motion process is eating 350% of my CPU on 2 full HD cameras!
Another question for the motionEye devs: is there a way to use the direct stream in the browser for a preview, and to use ffmpeg just for recording the original stream, bypassing motion altogether? Or can I use a substream of the camera at lower quality, but record the high quality one? Regards!
I'm trying to do that at the moment, using ffmpeg to convert the stream to MJPEG and using streameye to broadcast it, then trying to feed it as an MJPEG camera into motionEye. At the moment, I can't figure out the URL streameye is outputting, but it seems to work with minimal CPU usage (50%) per camera!
I feel like I have read every article on the internet trying to solve this video streaming mystery. I've tried motion, motioneye, gstreamer, ffmpeg, vlc and uv4l to load the IP camera as a virtual (local) camera. Some of them work, but kill the CPU; others don't even work. I feel like the coyote who keeps getting defeated by the road runner...
If 30 fps is killing the CPU and there is no workable alternative, I'm starting to wonder if it may be better to run an ffmpeg script that takes one photo every 5 seconds as a continuous sequence, and then simultaneously run another script to convert that ongoing sequence into some kind of live feed. You could even use a PHP or JavaScript plugin to watch the folder and update the image every time it sees a new jpg in there.
Sadly, I think you are right. RTSP camera streams are killing the CPU, since there is no GPU decoding in motion! Sadly my knowledge of HTML is limited, or I would code the same page as motionEye (which I think is brilliant) just for preview and navigating through the recordings; but for recording the actual stream, ffmpeg with a simple date and time script would suffice.
At the moment I get about one frame every 2 seconds with some frames dropped, with motion detection disabled. It works well enough for my family.
Hope they fix the motion high CPU usage in the future.
Regards
@seanocaster123
2019-10-25 08:03:29: ERROR: bind() failed: Address already in use
2019-10-25 08:03:29: ERROR: failed to start server
Make sure your motionEye and motion programs are not running. You don't need anything else except gstreamer piping stdout to streameye.
To troubleshoot your error message, try running gstreamer without streameye first:
gst-launch-1.0 -v rtspsrc location=rtsp://192.168.1.65:554 latency=0 drop-on-latency=1 ! rtph264depay ! h264parse ! omxh264dec ! videorate ! video/x-raw,framerate=20/1 ! jpegenc ! filesink location=/dev/null
Make sure you set the GPU memory to something like 256MB in /boot/config.txt.
If gstreamer works, then you can run the previous command that pipes the output to streameye. You can then watch streameye via a web browser by pointing your browser to http://<RPi_IP_Address>:8080.
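For the GPU memory setting mentioned above, the relevant /boot/config.txt line is just the following (256 MB as suggested; reboot afterwards for it to take effect):

```
# /boot/config.txt: reserve enough GPU memory for the OMX decoder
gpu_mem=256
```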
@lolren I think there's currently no plans in motion to support hardware accelerated video decoding. Last time I looked, ffmpeg lacked any sensible API to specify which video decoder to use. I had another idea on enabling hw accel decoding in motion, but it's low priority at the moment.
Update: good news, motion now supports hardware decoding: https://github.com/Motion-Project/motion/pull/1047
ffserver has been deprecated. You are better off trying the gstreamer option that I suggested. omxh264dec in the command line means offload decoding to the GPU.
Yeyyy! Will test it now! Should I add omxh264dec to extra motion commands in motionEye? Edit: ignore this line, I just saw that's a gstreamer option!
Edit: From what I can see, that commit is not yet in master! I will try to see how I can apply it! Edit: never mind, I just found out how to get that branch... the zero copy patch and the enable h264_omx patch don't work.. should I try without them?
Just installed it now, CPU still spiked, but, if it's not placebo, I see a slight increase in frame rate and fewer dropped frames!
@lolren Thanks for the replies. At my house I run software on my iMac called SecuritySpy. It's a paid app but it monitors a network IP camera I have pointing at my front yard. I have motion detection enabled for a certain region and it works reliably to capture short video clips. I guess this app works well because the iMac has a lot more power.
@jasaw Thanks. I will do a fresh install of Raspbian and try gstreamer again. Does it matter if I use Stretch or Buster?
Update: good news, motion now supports hardware decoding: Motion-Project/motion#1047
@jasaw Sorry I'm asking such basic questions, but can you please point me in the right direction to enable hardware decoding / GPU acceleration etc.? I don't know if this is enabled with a stock install and I haven't been able to find a straightforward tutorial for it. Thanks.
On a different note, getting back to my idea to have ffmpeg take a picture every 30 seconds and put it into a folder: then use a second piece of code to constantly watch that folder and view it like a gallery. This would be awesome because it would have history, unlike a live video stream where you can only watch the current moment. Thanks for all the support.
@seanocaster123 Allow me to share the knowledge I've gained today by experimenting with ffmpeg. Let's suppose we are talking about the h264 encoder/decoder: by typing ffmpeg -codecs | grep 264 you get the following output:
ffmpeg version 4.2 Copyright (c) 2000-2019 the FFmpeg developers
built with gcc 8 (Raspbian 8.3.0-6+rpi1)
configuration: --enable-mmal --enable-omx --enable-omx-rpi --enable-gpl --enable-libx264 --enable-nonfree --extra-cflags=-I/opt/vc/include/IL
libavutil 56. 31.100 / 56. 31.100
libavcodec 58. 54.100 / 58. 54.100
libavformat 58. 29.100 / 58. 29.100
libavdevice 58. 8.100 / 58. 8.100
libavfilter 7. 57.100 / 7. 57.100
libswscale 5. 5.100 / 5. 5.100
libswresample 3. 5.100 / 3. 5.100
libpostproc 55. 5.100 / 55. 5.100
DEV.LS h264 H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10 (decoders: h264 h264_v4l2m2m h264_mmal ) (encoders: libx264 libx264rgb h264_omx h264_v4l2m2m h264_vaapi )
Let's say you want to use h264_mmal, which is a GPU decoder, and h264_omx as a GPU encoder. I have made a simple bash script to get a stream from my camera (Hikvision) and use ffmpeg to decode/encode the stream:
stream="rtsp://admin:pass@192.168.1.101:554/Streaming/Channels/101"
ffmpeg -c:v h264_mmal -i $stream -threads 2 -c:v h264_omx -an output.mp4
Here, we use h264_mmal to decode the h264 stream using the GPU, and h264_omx to encode it using the GPU and save it to the HDD. Replacing the encoders and decoders here with others from the list will give you different results/encoding times!
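Building on that script, here is a small sketch of the "date and time" recording idea mentioned earlier: generate a timestamped output name first, then hand it to the same ffmpeg command. The ffmpeg line is commented out here because it needs the camera and the Pi's hardware codecs to actually run:

```shell
# Hypothetical wrapper: record to a file named after the current time.
stream="rtsp://admin:pass@192.168.1.101:554/Streaming/Channels/101"
out="rec-$(date +%Y%m%d-%H%M%S).mp4"
echo "recording to: $out"
# ffmpeg -c:v h264_mmal -i "$stream" -threads 2 -c:v h264_omx -an "$out"
```

Run it from cron (or a loop with a -t duration limit on ffmpeg) and each invocation produces a uniquely named clip instead of overwriting output.mp4.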
The enable h264_omx patch instructions that I posted earlier won't work with the latest motion because a lot has changed in motion code. I haven't got time to update the instructions.
@lolren You're right, the hw accel patch hasn't been merged into master. Sorry, I didn't see it properly.
@seanocaster123 If you want a live stream at 1 or 2 fps, and to store historical snapshots once every 30 seconds, you could use gstreamer + streameye to give you an MJPEG stream at low CPU cost. You could then serve the same MJPEG stream to multiple clients: one client is your public facing web server, and another is motioneye, where motioneye could be configured to take snapshots and auto-delete images.
                                          +--> public facing web server
hikvision cam --> gstreamer --> streameye +
                                          +--> motioneye (taking snapshots and auto-deleting)
@jasaw
The enable h264_omx patch instructions that I posted earlier won't work with the latest motion because a lot has changed in motion code. I haven't got time to update the instructions.
It seems that your patches are merged into the latest commits of motion & ffmpeg. There is no need to patch them anymore?
@noliveleger It depends on what version of ffmpeg you want to run.
1. Latest ffmpeg version (as of today) & latest motion (as of today): you still need this patch to remove h264_omx from the blacklist: https://github.com/ccrisan/motioneyeos/blob/dev/package/motion/0002-enable-h264-omx-codec.patch The reason being that the ffmpeg version number has not changed yet, so there is no way for motion to detect whether the installed ffmpeg version has been fixed or not. When the ffmpeg version number changes, there will be no need to apply the above patch anymore. However, there is one problem with the latest ffmpeg: the top quarter of the encoded video is green.
2. ffmpeg version 4.2 is the latest stable version as of my testing. Any newer version has the green video mentioned in point 1. With ffmpeg version 4.2, you need both the zerocopy and fragmented-buffer patches. On top of that, if you run the latest motion (as of today), you also need the remove-h264_omx-blacklist patch.
@lolren thanks for the info and sorry for the late reply. Work has been crazy the last week. Please give me your email address; I think you and I are on similar missions and can really help each other.
I have still not had much success with bringing down CPU usage in motion, ffmpeg or VLC for live video streaming. However, I have been playing with the parameters and some PHP code as well, so although I don't have a live video feed working efficiently, I think I am going in a direction that may be even better, with thumbnails of all the pictures that have been taken. Check it out: http://86.98.158.68:1051/index-thumbs.php
I plan to make it easier to browse by date and even have a play button, plus stop the script from displaying pictures during certain times (i.e. night).
I also have an ffmpeg script that I run to record 60 seconds of video and save it to the /var/www/html/ folder, and it works well. But when it runs the CPU goes up to 180+%, though at least it's only like that while it runs and not continuously.
I might have discovered an issue in motionEye that results in libx264 being used instead of h264_omx, while using the h264/omx option under Movies, with Motion 4.1.1.1 installed and motioneye v0.39.2, on Ubuntu 18.04 Bionic Beaver x64:
ffmpeg libavcodec version 57.107.100 libavformat version 57.83.100
[1:ml1] [WRN] [ENC] ffmpeg_set_codec: Preferred codec h264_omx has been blacklisted
[1:ml1] [NTC] [ENC] ffmpeg_set_codec: Using codec libx264
squidpop@xpsdc:lap~$ ffmpeg -v
ffmpeg version 3.4.2-2 Copyright (c) 2000-2018 the FFmpeg developers
built with gcc 7 (Ubuntu 7.3.0-16ubuntu2)
configuration: --prefix=/usr --extra-version=2 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --enable-gpl --disable-stripping --enable-avresample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librubberband --enable-librsvg --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-omx --enable-openal --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libopencv --enable-libx264 --enable-shared
libavutil 55. 78.100 / 55. 78.100
libavcodec 57.107.100 / 57.107.100
libavformat 57. 83.100 / 57. 83.100
libavdevice 57. 10.100 / 57. 10.100
libavfilter 6.107.100 / 6.107.100
libavresample 3. 7. 0 / 3. 7. 0
libswscale 4. 8.100 / 4. 8.100
libswresample 2. 9.100 / 2. 9.100
libpostproc 54. 7.100 / 54. 7.100