Closed eugenwalcher closed 2 years ago
There are two queues. The cleanup command currently only clears the remote one, but it might as well clear the local one as well. The quick workaround would be:
    $ gopro cleanup
    $ sqlite3 gopro.db
    sqlite> delete from upload_parts;
    sqlite> delete from uploads;
    sqlite> .quit
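The same cleanup can be done non-interactively in one shot. This is just a sketch assuming the table names shown above (upload_parts, uploads) and that gopro.db is in the current directory:

```shell
# Non-interactive equivalent of the sqlite3 session above.
# upload_parts is cleared first, in the same order as the interactive session.
sqlite3 gopro.db 'delete from upload_parts; delete from uploads;'
```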
I'm adding that to the cleanup command now, though.
It's probably not that hard to get the createupload command to recurse if you give it a directory -- I've already got some decent code, but I'll probably not do that tonight. You can do something like:
    find /Volumes/WD/2019 -type f -print0 | xargs -0 gopro createupload
It should work as long as a multipart group doesn't land at just the wrong place in the batching, where it can't detect the grouping. Otherwise, I'd just write a script like:
    #!/bin/sh
    for d in /Volumes/WD/2019/*
    do
        for s in "$d"/*
        do
            gopro createupload "$s"/*
        done
    done
That should maintain any multi-chapter groupings as it's creating uploads.
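If the footage isn't always exactly two directory levels deep, a find-based variant of the same idea can feed each leaf directory to createupload as one invocation, which keeps chapters in the same directory grouped. A sketch only: the -mindepth 2 -maxdepth 2 depth and the /Volumes/WD/2019 path are assumptions matching the layout above.

```shell
# Run createupload once per session directory (two levels down), so
# multi-chapter files that share a directory are submitted together.
find /Volumes/WD/2019 -mindepth 2 -maxdepth 2 -type d \
  -exec sh -c 'gopro createupload "$1"/*' _ {} \;
```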
Actually, I'm pretty sure this change will work with:
    gopro cleanup
    gopro createupload /Volumes/WD/2019/
    gopro upload
Or you can just do it with two commands, but if you're uploading a lot, you'll probably end up running the gopro upload command more than once.
perfect, thanks!
First of all, thanks for the effort to build this tool. I have a back catalog of originals I want to upload to the GoPro cloud, so I've been trying to get my head around all the commands on my Mac.
I tried using the createupload command to queue up some uploads, but now when I run upload there is an error. Even if I use the upload command with a different path of files, it still fails with the same error.
Is there a way to clear the queue and try createupload again?
Lastly, is there a way to use either upload or createupload and include subdirectories (recursively)? My footage is in folders like:
It seems as though I have to enter the path name of the actual GoPro MP4 files.
I was hoping for something like: