Deeplite / deeplite-torch-zoo

Collection of SOTA efficient computer vision models for embedded applications, with pre-trained weights and training recipes
Apache License 2.0

Add user-API method to get evaluator objects and add a corresponding registry #132

Closed lzrwch closed 2 years ago

lzrwch commented 2 years ago

The task is to add a high-level API function:

from deeplite_torch_zoo import get_evaluation_fn

eval_fn = get_evaluation_fn(dataset_name=..., model_name=...)

metrics_dict = eval_fn(model, test_dataloader, **kwargs)

Refer to these tests for a reference on how the eval functions are currently used.

The implementation of get_evaluation_fn should be based on a registry object, as is done for the other API methods.
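A minimal sketch of what such a registry-based implementation might look like, following the decorator-registration pattern used by the other API methods (all names below are hypothetical placeholders, not the zoo's actual API):

```python
class EvaluatorRegistry:
    """Hypothetical registry mapping a task type to an evaluation function."""

    def __init__(self):
        self._fns = {}

    def register(self, task_type):
        # Decorator that stores the evaluation function under `task_type`
        def decorator(fn):
            self._fns[task_type] = fn
            return fn
        return decorator

    def get(self, task_type):
        try:
            return self._fns[task_type]
        except KeyError:
            raise KeyError(f"No evaluator registered for task '{task_type}'")


EVALUATOR_REGISTRY = EvaluatorRegistry()


@EVALUATOR_REGISTRY.register("classification")
def classification_eval(model, dataloader, **kwargs):
    # Placeholder: a real evaluator would iterate the dataloader
    # and compute metrics such as top-1 accuracy
    return {"top1": 0.0}


def get_evaluation_fn(task_type):
    # High-level API entry point: resolve the evaluator via the registry
    return EVALUATOR_REGISTRY.get(task_type)
```

Registering evaluators through a decorator keeps the registry consistent with how model wrappers are registered elsewhere in the zoo.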

lzrwch commented 2 years ago

From our discussion: for the model wrapper registry, we can add another container that maps (model_name, dataset_name) -> task_type (e.g. classification, object_detection, semantic_segmentation). Then, if we have an evaluator registry, we use the task type (looked up in the model registry) as the key: e.g. classification -> return the classification evaluator.
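The two-step lookup described above could be tied together as follows (the model and dataset names here are illustrative placeholders, and the evaluators are stubs):

```python
# Illustrative task-type container, keyed by (model_name, dataset_name),
# maintained alongside the model wrapper registry
MODEL_TASK_MAP = {
    ("resnet18", "imagenet"): "classification",
    ("yolo4s", "voc"): "object_detection",
}

# Evaluator registry keyed by task type
EVALUATOR_REGISTRY = {
    "classification": lambda model, loader, **kw: {"top1": 0.0},
    "object_detection": lambda model, loader, **kw: {"mAP": 0.0},
}


def get_evaluation_fn(dataset_name, model_name):
    # Step 1: resolve the task type from the model registry container
    task_type = MODEL_TASK_MAP[(model_name, dataset_name)]
    # Step 2: use the task type as the key into the evaluator registry
    return EVALUATOR_REGISTRY[task_type]
```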

Another point: evaluators might depend only on the task (e.g. classification) or on the task plus the model name and/or dataset name (e.g. the YOLO VOC evaluator). This needs to be addressed when designing the evaluator registry.
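One way to address the mixed-specificity problem is a most-specific-first key resolution: the registry is keyed by (task, model, dataset) tuples with None as a wildcard, and lookup falls back from the most specific key to the task-only key. A hedged sketch:

```python
def resolve_evaluator(registry, task_type, model_name=None, dataset_name=None):
    """Return the most specific evaluator registered for the given keys.

    `registry` maps (task_type, model_name, dataset_name) tuples to
    evaluation functions; None acts as a wildcard entry.
    """
    # Candidate keys ordered from most specific to task-only fallback
    candidates = [
        (task_type, model_name, dataset_name),
        (task_type, model_name, None),
        (task_type, None, dataset_name),
        (task_type, None, None),
    ]
    for key in candidates:
        if key in registry:
            return registry[key]
    raise KeyError(f"No evaluator found for task '{task_type}'")
```

With this scheme a special-cased entry like (object_detection, yolo4s, voc) shadows the generic (object_detection, None, None) evaluator only for that model/dataset pair.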