aai-institute / pyDVL

pyDVL is a library of stable implementations of algorithms for data valuation and influence function computation
https://pydvl.org
GNU Lesser General Public License v3.0

Simplify caching #600

Open · janosg opened this issue 3 months ago

janosg commented 3 months ago

The new design of data valuation methods avoids repeated computations of the utility function without relying on caching. We could therefore get rid of our current caching implementation based on memcached, which seems overpowered. This would close several caching-related issues (e.g. #517, #475, #464 and #459) and could also solve the problems caused by the many files the current caching solution creates.

The only situation where caching is still really important is when benchmarking multiple algorithms: caching keeps randomness as constant as possible across algorithms and saves runtime in the benchmark. We should therefore create an entry point for benchmarking frameworks to enable caching. I see two possible solutions:

  1. Use a simple shared-memory cache to store all utility evaluations and return them as part of the ValuationResult. A benchmarking library could then use these evaluations to build up a cache. All logic to wrap Utility with a cached version would be in the benchmarking library.
  2. We could keep the cache_backend abstraction in the Utility but only implement a much simpler shared-memory backend in pyDVL. Users with advanced caching needs could then build their own backends (see the sketch after this list).
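
To make option 2 more concrete, here is a rough sketch of what a minimal in-memory backend wrapping a utility could look like. The class and function names (`InMemoryCacheBackend`, `cached_utility`) and the keying by hashable subsets are assumptions for illustration, not pyDVL's actual `CacheBackend`/`Utility` interface; a real shared-memory variant would back the dict with something like `multiprocessing.Manager().dict()`.

```python
from __future__ import annotations

from typing import Callable, Dict, Hashable, Optional


class InMemoryCacheBackend:
    """Hypothetical minimal backend: a plain dict keyed by the hashable
    sample passed to the utility. A true shared-memory variant would use
    e.g. multiprocessing.Manager().dict() as the store instead."""

    def __init__(self) -> None:
        self._store: Dict[Hashable, float] = {}

    def get(self, key: Hashable) -> Optional[float]:
        return self._store.get(key)

    def set(self, key: Hashable, value: float) -> None:
        self._store[key] = value


def cached_utility(
    utility: Callable[[frozenset], float], backend: InMemoryCacheBackend
) -> Callable[[frozenset], float]:
    """Wrap a utility function so repeated evaluations of the same subset
    are served from the backend instead of being recomputed."""

    def wrapper(subset: frozenset) -> float:
        cached = backend.get(subset)
        if cached is not None:
            return cached
        value = utility(subset)
        backend.set(subset, value)
        return value

    return wrapper


if __name__ == "__main__":
    backend = InMemoryCacheBackend()
    u = cached_utility(lambda s: float(len(s)), backend)  # dummy utility
    print(u(frozenset({1, 2})))  # computed
    print(u(frozenset({1, 2})))  # served from the cache
```

A benchmarking library could either use such a backend directly (option 2) or, following option 1, read the stored evaluations back out and attach them to the ValuationResult.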
AnesBenmerzoug commented 2 months ago

Now that we only use joblib for the parallelization of data valuation algorithms, we could also leverage its caching mechanism through the Memory class and perhaps offer only one extension to support caching in a distributed setting.
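
For reference, this is roughly how joblib's `Memory` class is used; the utility function and cache location below are made up for illustration:

```python
from joblib import Memory

# joblib.Memory caches results on disk, which is why it does not map
# cleanly onto a memcached-style backend.
memory = Memory(location=".joblib_cache", verbose=0)


@memory.cache
def utility(subset: tuple) -> float:
    # Stand-in for an expensive utility evaluation (e.g. retraining a model).
    return float(len(subset))


if __name__ == "__main__":
    print(utility((1, 2, 3)))  # computed and stored on disk
    print(utility((1, 2, 3)))  # loaded from the cache
```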

I tried using it when I refactored the caching backends but couldn't really make it work with memcached, because joblib's Memory is implemented as file-based caching. So I gave up on basing our code on it, though I still took heavy inspiration from its interface, so perhaps we could consider it again.