facebookresearch / NeuralCompression

A collection of tools for neural compression enthusiasts.
MIT License

Metrics refactor #197

Closed mmuckley closed 1 year ago

mmuckley commented 1 year ago

Refactors metrics implementations in the repository.

We no longer include all metrics in one big `distortion.py` file. Instead, most metrics are upstreamed to torchmetrics.

However, torchmetrics has some issues, the main one affecting FID and KID calculations: tensors are stored or accumulated on-GPU, which can lead to memory exhaustion or numerical precision problems for very large feature counts.

In neuralcompression, we instead provide minimalist wrappers around the final calculations in torch-fidelity. The intermediate inception features are stored on CPU, where the memory pressure is much less severe.
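To make the split concrete, here is a minimal sketch of the two-stage pattern described above: features are accumulated on CPU, and only the final Fréchet distance is computed from their Gaussian statistics. This is not NeuralCompression's actual wrapper or the torch-fidelity API; `fid_from_features` is a hypothetical helper implementing the standard FID formula.

```python
import numpy as np
from scipy import linalg


def fid_from_features(feats_a: np.ndarray, feats_b: np.ndarray) -> float:
    """Fréchet distance between Gaussians fit to two feature sets.

    FID = ||mu_a - mu_b||^2 + Tr(C_a + C_b - 2 (C_a C_b)^{1/2})

    In a real pipeline, ``feats_a`` / ``feats_b`` would be inception
    features moved to CPU (e.g. via ``tensor.cpu().numpy()``) and
    accumulated batch by batch before this final calculation.
    """
    mu_a, mu_b = feats_a.mean(axis=0), feats_b.mean(axis=0)
    cov_a = np.cov(feats_a, rowvar=False)
    cov_b = np.cov(feats_b, rowvar=False)

    # Matrix square root of the covariance product; small imaginary
    # components from numerical error are discarded.
    covmean = linalg.sqrtm(cov_a @ cov_b)
    if np.iscomplexobj(covmean):
        covmean = covmean.real

    diff = mu_a - mu_b
    return float(diff @ diff + np.trace(cov_a + cov_b - 2.0 * covmean))
```

Because everything here lives in NumPy arrays on the host, the feature store scales with system RAM rather than GPU memory, which is the point of keeping the intermediate features on CPU.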

The PR also adds a function for measuring the in-memory size of pickled objects, as well as a wrapper around FID and KID to compute the FID/256 metric introduced in the HiFiC paper.
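Measuring pickle size in memory can be done without ever touching disk. The sketch below is a hypothetical stand-in (`pickled_size` is not the PR's actual function name), showing the basic idea using only the standard library.

```python
import pickle


def pickled_size(obj) -> int:
    """Return the size in bytes of an object's pickle serialization,
    computed entirely in memory (no temporary file is written)."""
    return len(pickle.dumps(obj, protocol=pickle.HIGHEST_PROTOCOL))
```

This is useful for reporting compressed-model or payload sizes in benchmarks without the filesystem overhead of writing and `stat`-ing a temporary file.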

mmuckley commented 1 year ago

Thanks so much for the positive comments @marksibrahim!

  1. I think I would prefer absolute imports, but I haven't structured the repo well enough for that: they can lead to circular imports that break everything. At some point we may want to work around that and switch to absolute imports to keep things a little more stable.
  2. This is something introduced by Allen to help guide linters and autocompletion. At the moment we're just keeping with this established standard.