amazon-science / patchcore-inspection

Apache License 2.0

Size of embeddings #37

Open

silvanobianchi commented 2 years ago

At the end of the `_embed(self, images, detach=True, provide_patch_shapes=False)` method, why is the size of the embeddings different from the sum of the channel counts of the chosen layers?

As an example: if I'm using a WideResNet-50-2 with layer2 and layer3 as feature extractors, why is the embedding size 1024 (the channel count of layer3) and not 1536 (1024 + 512, the layer3 + layer2 channels)?
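My guess is that the per-layer features are not concatenated channel-wise, but are each pooled to a common target dimension and then averaged across layers. Here is a minimal plain-Python sketch of that idea (the pooling function below is a hypothetical stand-in for something like `torch.nn.AdaptiveAvgPool1d`; the constant 1024 stands in for a configured target embedding dimension, not a value I verified in this repo):

```python
def adaptive_avg_pool_1d(vec, out_dim):
    """Plain-Python stand-in for adaptive average pooling:
    splits `vec` into `out_dim` roughly equal bins and averages each bin."""
    n = len(vec)
    out = []
    for i in range(out_dim):
        start = (i * n) // out_dim
        end = ((i + 1) * n + out_dim - 1) // out_dim  # ceil division
        out.append(sum(vec[start:end]) / (end - start))
    return out

layer2_feat = [0.5] * 512    # one patch feature from layer2 (512 channels)
layer3_feat = [1.5] * 1024   # the matching patch feature from layer3 (1024 channels)

# Pool each layer's feature to the same target dimension (assumed 1024 here) ...
pooled = [adaptive_avg_pool_1d(f, 1024) for f in (layer2_feat, layer3_feat)]

# ... then average across layers, yielding one 1024-dim patch embedding.
embedding = [sum(vals) / len(vals) for vals in zip(*pooled)]
print(len(embedding))  # → 1024, not 512 + 1024
```

Under that reading, the output dimension is fixed by the pooling target, so it matches layer3's channel count only by coincidence.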