SebChw / Actually-Robust-Training

Actually Robust Training - a tool inspired by Andrej Karpathy's "Recipe for Training Neural Networks". It allows you to decompose your deep learning pipeline into modular and insightful "Steps". Additionally, it has many features for testing and debugging neural nets.
MIT License

Add decorators closes #178 #191

Closed SebChw closed 12 months ago

SebChw commented 12 months ago

Previously we had to implement decorators ourselves, and we decorated classes:

import logging

def first_logging_func(data):
    logging.warning(data)

# Patches the predict method on the MNISTModel class itself
art_decorate([(MNISTModel, "predict")], first_logging_func)

Decorating the class is an awful idea because the method stays decorated across all runs. For scripts that's not a big deal, but for notebooks it's a much bigger one.
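A minimal sketch of why this leaks (hypothetical stand-in names, not the real art_decorate implementation): patching a method on the class affects every instance created afterwards, so in a long-lived notebook session the patch survives into every later run.

```python
class MNISTModelSketch:
    """Stand-in for a model class; only the method name matters here."""
    def predict(self, data):
        return data * 2

def art_decorate_sketch(targets, log_func):
    """Patch each (cls, method_name) pair so log_func sees the input first."""
    for cls, method_name in targets:
        original = getattr(cls, method_name)
        def wrapped(self, data, _original=original):
            log_func(data)
            return _original(self, data)
        setattr(cls, method_name, wrapped)

calls = []
art_decorate_sketch([(MNISTModelSketch, "predict")], calls.append)

# Every instance created after patching is affected, in every later "run":
m1 = MNISTModelSketch()
m2 = MNISTModelSketch()
m1.predict(1)
m2.predict(2)
assert calls == [1, 2]  # the patch persists on the class itself
```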

Now decorating a model works like this:

from art.decorators import ModelDecorator, LogInputStats
project.run_all(model_decorators=[ModelDecorator("predict", LogInputStats())])

So:

  1. Some decorators are provided to the users
  2. Decorators persist only within specific project run
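One way the run-scoped behaviour can be pictured (hypothetical names; the real ModelDecorator API may differ) is wrapping the method for the duration of the run and restoring the original afterwards:

```python
from contextlib import contextmanager

@contextmanager
def decorate_for_run(model, method_name, hook):
    """Wrap model.<method_name> only inside the `with` block."""
    original = getattr(model, method_name)
    def wrapped(data):
        hook(data)              # e.g. log input stats
        return original(data)
    setattr(model, method_name, wrapped)
    try:
        yield model
    finally:
        # Restore the original method: nothing leaks past the run
        setattr(model, method_name, original)

class Model:
    def predict(self, data):
        return data + 1

seen = []
m = Model()
with decorate_for_run(m, "predict", seen.append):
    m.predict(10)   # hook fires inside the run
m.predict(20)       # hook no longer fires after the run
assert seen == [10]
```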
trebacz626 commented 12 months ago

Am I right that log inputs stats just prints inputs?

kordc commented 12 months ago

The link from the title didn't apply, so just for reference - #178

SebChw commented 12 months ago

Am I right that log inputs stats just prints inputs?

Yes, but first it turns on lovely tensors, so the mean, min-max, and std are printed.
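For reference, the kind of one-line summary lovely-tensors produces can be sketched in plain Python (hypothetical formatting; the real library prints richer tensor info):

```python
import statistics

def summarize(xs):
    """One-line stats summary in the spirit of lovely-tensors output."""
    return (f"n={len(xs)} mean={statistics.fmean(xs):.3f} "
            f"min={min(xs):.3f} max={max(xs):.3f} "
            f"std={statistics.pstdev(xs):.3f}")

print(summarize([1.0, 2.0, 3.0, 4.0]))
# → n=4 mean=2.500 min=1.000 max=4.000 std=1.118
```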