ANTsX / ANTsPyNet

Pre-trained models and utilities for deep learning on medical images in Python
https://antspynet.readthedocs.io
Apache License 2.0

Tensorflow: unnecessary retracing #138

Open neverix opened 1 month ago

neverix commented 1 month ago

I am running ANTsPyNet functions (`brain_extraction` and `deep_atropos`) in a loop and getting this warning:

```
WARNING:tensorflow:6 out of the last 6 calls to <function TensorFlowTrainer.make_predict_function.<locals>.one_step_on_data_distributed at 0x2ff13e8c0> triggered tf.function retracing. Tracing is expensive and the excessive number of tracings could be due to (1) creating @tf.function repeatedly in a loop, (2) passing tensors with different shapes, (3) passing Python objects instead of tensors. For (1), please define your @tf.function outside of the loop. For (2), @tf.function has reduce_retracing=True option that can avoid unnecessary retracing. For (3), please refer to https://www.tensorflow.org/guide/function#controlling_retracing and https://www.tensorflow.org/api_docs/python/tf/function for more details.
```

Is there a way to create an object instance so the tf.functions can be reused?
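For context, cause (2) in the warning is the likely culprit when looping over medical images, since each image can have different dimensions and each new shape triggers a fresh trace. A minimal, self-contained sketch of the `reduce_retracing=True` option the warning suggests (plain TensorFlow, not ANTsPyNet code):

```python
import tensorflow as tf

# With reduce_retracing=True, TensorFlow relaxes the traced input
# signature (e.g. to an unknown-length dimension) so a single trace
# can serve inputs of many shapes, instead of re-tracing per shape.
@tf.function(reduce_retracing=True)
def predict_step(x):
    return tf.reduce_sum(x)

# Different input shapes would normally each trigger a new trace;
# here they reuse a relaxed trace after the first couple of calls.
for n in (8, 16, 32):
    predict_step(tf.zeros([n]))
```

Wrapping the internal predict call this way (or reusing one model instance across calls) is what the question above is asking for.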

ntustison commented 1 month ago

This issue has been around since initial development. I've looked into it a couple of times without finding an easy solution, although maybe I missed something. The actual cost of the retracing has never been high enough for me to prioritize a fix, so I just ignore the warning.

If you have a way of removing this warning, a pull request would certainly be welcome.

neverix commented 1 month ago

One way to solve it while keeping the current interface would be to create a global cache of model objects (used by the most commonly called functions) and wrap each model's `.predict` in a `tf.function` with an argument that reduces recompilation. If you've measured the compilation time and it's insignificant, I have no reason to implement this. Still, I think it's good to have an issue open for a common warning in case someone does.
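The caching idea could be sketched as below. All names here are hypothetical: `build_model` stands in for ANTsPyNet's internal model construction (weight download plus Keras model creation), not an actual API.

```python
# Hypothetical sketch of a module-level model cache: repeated calls
# reuse the same model instance (and hence the same traced predict
# function) instead of rebuilding the model inside each call.

_MODEL_CACHE = {}

def build_model(name):
    # Placeholder for ANTsPyNet's internal model construction step
    # (e.g. fetching pretrained weights and building the network).
    return {"name": name}

def get_cached_model(name):
    # Return a shared instance; repeated calls with the same key hit
    # the cache, so TensorFlow would not re-trace its predict function.
    if name not in _MODEL_CACHE:
        _MODEL_CACHE[name] = build_model(name)
    return _MODEL_CACHE[name]
```

Functions like `brain_extraction` could then look the model up by key on every call rather than constructing it anew, at the cost of keeping cached models resident in memory.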