Hi. When pyexiv2 modifies an image, it needs to read the whole file into memory, so the speed of modifying metadata is inversely proportional to the size of the image. When you read the same images repeatedly, the operating system may keep those files in its cache, so they do not have to be read from disk again. In short, pyexiv2 simply has to read the file; there is no code to optimize on its side. If the image is a small file and the disk is a fast SSD, pyexiv2 will process it faster.
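As a rough way to observe the caching effect described above, one could time two consecutive reads of the same file (the path './test.jpg' below is just a placeholder):

import time

path = './test.jpg'  # placeholder; replace with a real image path

def timed_read(p):
    # read the whole file and return the elapsed time
    t0 = time.time()
    with open(p, 'rb') as f:
        f.read()
    return time.time() - t0

print(f'first read (cold cache):\t {timed_read(path)}')
print(f'second read (likely cached):\t {timed_read(path)}')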
Hey, thanks for the reply. Speed being inversely proportional to size is absolutely right. But the issue is that sometimes a 2-3 MB jpg file takes more time than a 30 MB dng file, even though both are on the same device. Also, updating a 2 MB jpg sometimes takes roughly 0.25 s, even when the file is on an SSD.
I think the time consumed to modify an image is divided into three parts: reading the image from disk, modifying the metadata in memory, and writing the image back to disk.
You can measure the time with the following code:
import pyexiv2
import time

path = './test.jpg'
x, y = 5, 'Red'   # example Rating and Label values

t0 = time.time()
with open(path, 'rb') as f:
    data = f.read()
t1 = time.time()
print(f'time to read image from disk:\t {t1-t0}')

t0 = time.time()
with pyexiv2.ImageData(data) as img:
    img.modify_xmp({"Xmp.xmp.Rating": x, "Xmp.xmp.Label": y})
    new_data = img.get_bytes()   # grab the modified bytes before the buffer is closed
t1 = time.time()
print(f'time to modify image in memory:\t {t1-t0}')

t0 = time.time()
with open(path, 'wb') as f:
    f.write(new_data)
t1 = time.time()
print(f'time to write image into disk:\t {t1-t0}')
Hey, I am updating jpg files (around 100, each roughly 2-3 MB in size). My code is:
I run this in a loop for each of the 100 files.
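(The actual snippet isn't shown above, but a minimal sketch of what such a loop might look like, assuming the same Rating/Label tags as in the measurement code and a hypothetical ./photos folder, is:)

import glob
import pyexiv2

# hypothetical folder and tag values, only to illustrate the loop structure
for path in glob.glob('./photos/*.jpg'):
    with open(path, 'rb') as f:
        data = f.read()
    with pyexiv2.ImageData(data) as img:
        img.modify_xmp({"Xmp.xmp.Rating": 5, "Xmp.xmp.Label": 'Red'})
        new_data = img.get_bytes()
    with open(path, 'wb') as f:
        f.write(new_data)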
Sometimes the speed is 20 iterations/sec, and sometimes it is 2 sec/iteration. Also, for another set of dng files, each roughly 30 MB, the speed is about the same as for the 2-3 MB jpgs.
Am I doing something unoptimized? Are there any possible improvements? Thanks in advance. Python 3.8.