Closed berniel closed 4 years ago
in readme: Additional logging
If you have additional information to be logged, in `_train_epoch()` of your trainer class, merge them with `log` as shown below before returning:

```python
additional_log = {"gradient_norm": g, "sensitivity": s}
log = log.update(additional_log)
return log
```
Should it be `log.update(additional_log)` instead of `log = log.update(additional_log)`?
Yes it should... Thank you for reporting this.
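For anyone hitting the same confusion: Python's `dict.update()` mutates the dictionary in place and returns `None`, so the assignment form silently replaces the log with `None`. A minimal sketch (the key names here are illustrative, not from the project):

```python
# Illustrative log dicts; keys are hypothetical.
log = {"loss": 0.5}
additional_log = {"gradient_norm": 1.0, "sensitivity": 2.0}

# Buggy pattern from the old README: update() returns None,
# so this assignment would throw away the merged log.
result = log.update(additional_log)
assert result is None

# Correct pattern: call update() for its side effect, then return log.
assert log == {"loss": 0.5, "gradient_norm": 1.0, "sensitivity": 2.0}
```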