Open chrissype opened 5 years ago
Hi @chrissype, thank you for reporting the problem. Yes, it would definitely be great to see what you've come up with; perhaps we can work towards a better solution eventually.
Sure thing, I have opened a pull request so you can see what I did. The tests all pass on my machine.
The alternative I could think of was to store the running total outside of a TensorFlow tensor, but it looks like you'd have to .eval() the result of the metric calculation, and I don't know how expensive that would be.
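Roughly what that alternative might look like, as a sketch only (it assumes TF 1.x graph mode where `K.get_session()` is available, and the names are illustrative, not code from the PR):

```python
import keras.backend as K

# Illustrative sketch: keep the running total in plain Python instead
# of in a tf variable, which means pulling every batch result out of
# the graph with .eval() — the cost being discussed above.
running_total = 0.0
batches_seen = 0

def accumulate(batch_metric_tensor):
    """Evaluate the per-batch metric tensor and fold it into a
    Python-side running average."""
    global running_total, batches_seen
    value = batch_metric_tensor.eval(session=K.get_session())
    running_total += float(value)
    batches_seen += 1
    return running_total / batches_seen
```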
@chrissype, in the end it looks like the effort of fixing this issue is unnecessary, since Keras already provides stateful metrics in its API: https://github.com/keras-team/keras/blob/master/keras/metrics.py
It would probably be better to deprecate this repository, since its functionality is now completely integrated into Keras.
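For reference, the built-in stateful metrics can be used roughly like this (a sketch assuming a recent Keras/tf.keras version where `keras.metrics.Precision` and `keras.backend.eval` are available):

```python
import numpy as np
from keras import metrics, backend as K

# Stateful metric from the Keras API: state accumulates across
# update_state() calls and is read back with result().
precision = metrics.Precision()
precision.update_state(np.array([0, 1, 1, 1]), np.array([1, 0, 1, 1]))
print(K.eval(precision.result()))   # running precision so far

precision.update_state(np.array([1, 1]), np.array([1, 1]))
print(K.eval(precision.result()))   # now includes both batches

precision.reset_states()            # start accumulating from scratch
```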
Well spotted, can't believe I'd missed that. I did always find it very weird that they took them out.
Instantiated metrics are not pickleable, as `__init__` instantiates tf variables, which cannot be pickled. This means that Keras models trained with one of these metrics and then saved cannot be opened without knowing the dict structure for the 'custom_objects' argument when loading the model. Without using dill, one cannot pickle said dict, and being able to pickle it is what would make delivering a complete model possible. Tools like mlflow are able to track and return complete Keras models, but only if the model and custom_objects are both pickleable.
I've come up with a moderately ugly solution that waits until a metric is called before instantiating any tf variables; let me know if you want to see it.
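Roughly the idea, as a sketch rather than the actual code from the pull request (the class name, attribute names, and the precision formula are all just illustrative):

```python
import keras.backend as K

# Sketch of the lazy-instantiation pattern: no tf variables are created
# in __init__, so the instance stays pickleable. The variables are built
# on first call, once the metric is actually used inside a graph.
class LazyRunningPrecision:
    def __init__(self):
        self.__name__ = "lazy_running_precision"
        self.tp = None   # deliberately not created here
        self.pp = None

    def _build(self):
        # Deferred variable creation, triggered by the first __call__.
        self.tp = K.variable(0.0)
        self.pp = K.variable(0.0)

    def __call__(self, y_true, y_pred):
        if self.tp is None:
            self._build()
        y_pred_pos = K.round(K.clip(y_pred, 0, 1))
        tp = K.update_add(self.tp, K.sum(y_true * y_pred_pos))
        pp = K.update_add(self.pp, K.sum(y_pred_pos))
        return tp / (pp + K.epsilon())
```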