Closed: the-lay closed this issue 1 year ago
Try replacing lines 80-94 with this code and let me know if that fixes it for you:
# batch files into groups of 10000
for ((i=0; i<${#fullpaths[@]}; i+=10000)); do
    batch=("${fullpaths[@]:i:10000}")
    # run exiftool on the current batch
    exiftool \
        -api LargeFileSupport=1 \
        -d %s \
        -tagsfromfile "%d%f.%e.json" \
        "-GPSAltitude<GeoDataAltitude" \
        "-GPSLatitude<GeoDataLatitude" \
        "-GPSLatitudeRef<GeoDataLatitude" \
        "-GPSLongitude<GeoDataLongitude" \
        "-GPSLongitudeRef<GeoDataLongitude" \
        "-DateTimeOriginal<PhotoTakenTimeTimestamp" \
        -overwrite_original \
        -preserve \
        -progress \
        "${batch[@]}" || true
done
This will basically batch the processing into groups of 10k.
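For context on why batching helps: the "argument list too long" failure comes from the kernel's ARG_MAX limit on the combined byte size of the argument list plus the environment, not from a fixed file-count limit in exiftool. A quick way to inspect that limit on your own system (a minimal sketch; `getconf` is POSIX, while `xargs --show-limits` is specific to GNU findutils):

```shell
# ARG_MAX caps argv + environment in bytes, so how many files fit
# depends on your path lengths, not on a fixed count like 66215.
getconf ARG_MAX

# GNU xargs can report the effective command-line limits it would use
# (the redirect/head keep this harmless on systems without that flag):
xargs --show-limits </dev/null 2>&1 | head -n 6
```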
I am almost done with a more proper solution for this issue (<30 mins if everything goes well).
I pushed a new commit which uses a temporary file to store the file paths: https://github.com/Zaczero/google-photos-takeout.sh/commit/9d7a78162702808b3bf1668b825ff0b68753f3b8. exiftool supports reading arguments from such a file via the -@ FILE option, which resolves the issue of the argument list being too long.
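The argfile approach can be sketched roughly like this (a simplified illustration, not the commit's exact code: the `fullpaths` array and the sample file names are stand-ins, and the tag-copying flags from the batch above are omitted for brevity):

```shell
# Stand-in for the array of photo paths the script builds earlier.
fullpaths=(photo1.jpg photo2.jpg photo3.jpg)

# Write one path per line to a temporary file...
argfile=$(mktemp)
printf '%s\n' "${fullpaths[@]}" > "$argfile"

# ...then let exiftool read the list itself with -@, so the shell
# never has to pass tens of thousands of paths as command arguments.
# (Guarded so this sketch still runs on machines without exiftool.)
if command -v exiftool >/dev/null 2>&1; then
    exiftool -api LargeFileSupport=1 -progress -@ "$argfile" || true
fi
rm -f "$argfile"
```

Because the paths travel through a file rather than through execve's argument list, ARG_MAX never comes into play, regardless of how many photos the takeout contains.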
It did work now! Thank you, great job :)
I was hoping to use your script to process a Google Photos takeout of 66215 photos. The script crashes on Ubuntu 22.04 with jq 1.6 and exiftool 11.88-1ubuntu0.1. I've looked at the exiftool forum and found this thread; it suggests that there is a limit, and I guess it's less than 66.2k.