aai-institute / nnbench

A small framework for benchmarking machine learning models.
https://aai-institute.github.io/nnbench/
Apache License 2.0

Add `nnbench.State` object holding current benchmark information, inject into setup/teardown tasks #125

Closed nicholasjng closed 6 months ago

nicholasjng commented 6 months ago

We want to experiment with a global memoization cache that is explicitly cleared in a teardown task after a family of benchmarks for a model (say NER on distilbert) has run.

This means that the setup and teardown tasks need to know which benchmark they are currently applied in.

For a single benchmark, say

@nnbench.benchmark
def echo(s: str) -> str:
    print(s)
    return s

the corresponding State should look something like this:

def tearDown(state, **params): # <- or maybe a mappingproxy of the params?
    print(state)

# name: "echo"
# family: "echo" <- for parametrized benchmarks; the family name equals the function name.
# family_size: 1 <- how many members are in the current benchmark family?
# family_index: 0 <- the index of this benchmark within its family.
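As a concrete illustration, the proposed object could be a small frozen dataclass carrying exactly these fields. This is a hypothetical sketch, not an existing nnbench API; the field names simply mirror the comments above:

```python
from dataclasses import dataclass

# Hypothetical State sketch; field names follow the comments above
# and are not part of the current nnbench API.
@dataclass(frozen=True)
class State:
    name: str          # name of the current benchmark
    family: str        # family name (equals the function name for parametrized benchmarks)
    family_size: int   # number of members in the current benchmark family
    family_index: int  # index of this benchmark within its family

# The state for the single `echo` benchmark above:
state = State(name="echo", family="echo", family_size=1, family_index=0)
```

A frozen dataclass (or a mapping proxy, as floated above) keeps setup/teardown tasks from mutating the runner's bookkeeping.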

More metadata suggestions welcome. After this, we can try evicting a memo from the cache when family_index == family_size - 1, i.e. once the last member of a family has run.