Lou1sM / meaningful_image_complexity


long computation time #2

Open bakachan19 opened 1 month ago

bakachan19 commented 1 month ago

Hi, I am interested in using your work to compute the complexity of some images. I tried computing the complexity of an image using the following code:

from measure_complexity import ComplexityMeasurer
import numpy as np

comp_meas = ComplexityMeasurer(ncs_to_check=8,
                               n_cluster_inits=1,
                               nz=2,
                               num_levels=4,
                               cluster_model='GMM',
                               info_subsample=0.3,
                               )

img = np.load(<path-to-img-file>)

complexity_of_img_at_each_level = comp_meas.interpret(img)

However, after more than 3 hours, execution on a single image had not finished. Is this expected, or is there a particular configuration of the function that I should use? I ran the code on Google Colab with the CPU backend.

Thank you.

Lou1sM commented 1 month ago

Hi, it definitely shouldn't take that long. What's the resolution of your images? The experiments in the paper used the ImageNet size of 224*224 and took around 5-10s per image.

bakachan19 commented 1 month ago

Hi @Lou1sM, thanks for the quick reply. My images are much bigger than 224*224. I will try rerunning the script after resizing. Thank you.
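For reference, the resize step could be sketched like this. This is a numpy-only nearest-neighbor downsample, assuming the loaded image is a NumPy array as in the snippet above; a proper library resize (e.g. Pillow or OpenCV) would give smoother results:

```python
import numpy as np

def downsample(img, target=224):
    """Nearest-neighbor downsample of the first two axes to target x target.
    A simple numpy-only sketch; not part of the measure_complexity package."""
    h, w = img.shape[:2]
    # Pick `target` evenly spaced row/column indices from the original grid.
    ys = np.linspace(0, h - 1, target).astype(int)
    xs = np.linspace(0, w - 1, target).astype(int)
    return img[np.ix_(ys, xs)]

# Example: shrink a large random RGB image to the 224x224 used in the paper.
big = np.random.rand(1024, 1536, 3)
small = downsample(big)
print(small.shape)  # (224, 224, 3)
```

The resized array can then be passed to `comp_meas.interpret` in place of the full-resolution image.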

Lou1sM commented 1 month ago

Yeah, I think resizing is the best option. If you really want to stick with high resolution, you could reduce the number of levels to 2, but I doubt resizing the images will make much difference to the scores anyway.
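For completeness, the reduced-levels option mentioned above would just change `num_levels` in the constructor call from the original snippet (all other arguments kept the same):

```python
from measure_complexity import ComplexityMeasurer

# Same configuration as in the issue's snippet, but with num_levels
# reduced from 4 to 2 to cut computation on high-resolution images.
comp_meas = ComplexityMeasurer(ncs_to_check=8,
                               n_cluster_inits=1,
                               nz=2,
                               num_levels=2,
                               cluster_model='GMM',
                               info_subsample=0.3,
                               )
```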