Closed frauzufall closed 5 years ago
OK, this one-liner is fixed and ready to be merged. Should I wait for anyone to review it?
:+1: LGTM.
As an aside: we might want to consider adding some kind of cache-clearing API to the TensorFlowService. As things stand, the only way to free all that memory is to dispose the entire service. IIUC, the loaded models take a substantial amount of memory.
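For illustration, a cache-clearing API of the kind suggested here could look roughly like the following. This is only a sketch: the names (`CachedModels`, `put`, `clear`) are hypothetical and not the actual TensorFlowService API, and real TensorFlow `SavedModelBundle` instances hold native memory that must be released via `close()`, which is why clearing iterates before dropping the map.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of a model cache with an explicit clearing method,
// so memory can be freed without disposing the entire service.
public class CachedModels {
    private final Map<String, AutoCloseable> models = new HashMap<>();

    public void put(String key, AutoCloseable model) {
        models.put(key, model);
    }

    public int size() {
        return models.size();
    }

    // Closes every cached model (releasing its native resources)
    // and then empties the cache.
    public void clear() {
        for (AutoCloseable m : models.values()) {
            try {
                m.close();
            } catch (Exception e) {
                // In a real service this would be logged; continue clearing.
            }
        }
        models.clear();
    }
}
```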
Right! I opened an issue https://github.com/imagej/imagej-tensorflow/issues/21.
This adds a loaded model to a HashMap so it can be loaded faster the next time. We already do the same for graphs and labels; I think this line just got lost?
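The caching pattern described here (look up the map first, only do the expensive load on a miss) can be sketched as below. The names (`ModelCache`, `getModel`) and the string stand-in for a loaded model are illustrative assumptions, not the real imagej-tensorflow code.

```java
import java.util.HashMap;
import java.util.Map;

// Minimal sketch of caching loaded models by their source location,
// mirroring what the PR describes for graphs and labels.
public class ModelCache {
    private final Map<String, String> models = new HashMap<>();
    private int loads = 0; // counts expensive loads, for demonstration only

    // Returns the cached model for this source, performing the
    // (simulated) expensive load only on a cache miss.
    public String getModel(String source) {
        return models.computeIfAbsent(source, s -> {
            loads++;
            return "model-from-" + s; // stand-in for the real model object
        });
    }

    public int loadCount() {
        return loads;
    }
}
```

With this pattern, requesting the same model twice performs the load only once; the second call is a plain map lookup.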