Closed: rpoxo closed this issue 10 months ago
Hi - Thanks for using the software!
I'll add a way to configure the timeout, or detect the data rate or something. However.. networking and in particular SMB/CIFS settings can have a huge impact on performance. Out of curiosity, when copying a large file to and from the remote server, are you able to saturate the link? You should be able to get almost 100% of the link speed if everything is working OK.
Also, I'd be really interested to know what you are using the software for, if you are happy to share - no worries at all if not!
An extended timeout is available in v0.111.0 - should be on PyPI now.
saturating link
Yes, if I'm copying big files, atop reports ~98% usage (~975 Mbps) and rsync reports ~110 MB/s average, which I believe should be on par with linear read for regular HDDs.
And here's the usage pattern when launching and running the container. I haven't tested it on a 10G link yet, but I believe the result would be similar due to low IOPS over networked storage.
use case
Long-ass boring GoPro videos of me traveling by bike :) My friends were interested in seeing speed/cadence/heart-rate stats during hard sections, as well as the map track on the video, rather than comparing tracks from Strava and then guessing where they are in the video. Previously I used VIRB Edit for the same purpose, but it's no longer supported; manually merging the data track with the video was a pain, and drift over anything longer than 5 minutes was less than desirable. My current process is much easier; I just run 2 commands and can do something else meanwhile:
ffmpeg.exe -f concat -i mylist.txt -c copy -map 0:v -map 0:a -map 0:3 -copy_unknown -tag:2 gpmd concatenated.mp4
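For anyone reproducing this: ffmpeg's concat demuxer reads the clip list from mylist.txt, one `file` directive per line in playback order. A minimal example (filenames here are illustrative GoPro chapter names, not from the original post):

```
file 'GH010001.MP4'
file 'GH020001.MP4'
file 'GH030001.MP4'
```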
docker run -it -d --rm --name gopro-overlay -v "$(pwd):/work" dev/gopro-dashboard-overlay:0.108.1 gopro-dashboard.py --units-speed kph --units-altitude meter --units-distance km --gpx Afternoon_Gravel_Ride.gpx concatenated.mp4 concatenated_.mp4
I haven't gotten around to hardware acceleration yet, though I do know it should be possible via VAAPI by passing the render device to the docker container. Software encoding, even on the veryfast preset, takes ~4 hours for 3h30m of 4K@30fps 100 Mbps footage on a 5950X at ~75% core utilization.
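For reference, a rough sketch of what VAAPI passthrough could look like. This is untested here and assumes an Intel/AMD render node at /dev/dri/renderD128 and an ffmpeg build with VAAPI support; the flags are standard ffmpeg VAAPI options, but the device path and image name are illustrative:

```shell
# Expose the host render node to the container (image name as above)
docker run -it --rm --device /dev/dri -v "$(pwd):/work" dev/gopro-dashboard-overlay:0.108.1 ...

# Plain ffmpeg VAAPI encode for comparison: upload frames to the GPU
# and use the h264_vaapi hardware encoder instead of libx264
ffmpeg -vaapi_device /dev/dri/renderD128 -i input.mp4 \
       -vf 'format=nv12,hwupload' -c:v h264_vaapi -b:v 100M output.mp4
```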
Thanks for this info! There is a fix for this in v0.111.0, improved hopefully in v0.112.0 - now timeout is longer, and it also shows speed of loading data. Note that the speed indicated is for the data track, not necessarily the speed of reading the underlying file.
Some minor thoughts... Docker can give worse performance than running natively. ffmpeg in docker doesn't, AFAIK, have GPU extensions, and may not be as optimised for your particular machine. Have you tried gopro_join.py? I'd be interested to know if it works for you.
Hi @rpoxo was this better?
Is it possible to increase the timeout for the gpmd ffmpeg subprocess run, or make it configurable via environment variables? I'm using a separate machine for applying the overlay, with the source directory mounted from a fast SSD (barely ~5% activity during the actual render) via CIFS over gigabit LAN (~10% activity during the actual render). The source file is concatenated ~150 GB 4K 3-hour footage, so I suspect the file size is the issue here. Increasing the timeout to 20-30 seconds prevents the error, and the scripts do their job just fine.
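A minimal sketch of the env-variable approach being requested, not the project's actual implementation: read the ffmpeg subprocess timeout from an environment variable with a fallback default, and pass it to `subprocess.run`. The variable name `GOPRO_FFMPEG_TIMEOUT` and the default are hypothetical:

```python
import os
import subprocess

# Hypothetical default; the real tool's value may differ
DEFAULT_TIMEOUT_SECONDS = 10.0

def ffmpeg_timeout():
    # GOPRO_FFMPEG_TIMEOUT is an illustrative variable name, not an
    # actual option of gopro-dashboard-overlay
    return float(os.environ.get("GOPRO_FFMPEG_TIMEOUT", DEFAULT_TIMEOUT_SECONDS))

def run_ffmpeg(args):
    # subprocess.run raises subprocess.TimeoutExpired if the command
    # does not finish within the given timeout
    return subprocess.run(args, capture_output=True, timeout=ffmpeg_timeout())
```

With this shape, `GOPRO_FFMPEG_TIMEOUT=30 gopro-dashboard.py ...` would give the 20-30 second window described above without a code change.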