microsoft / onnxruntime

ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator
https://onnxruntime.ai
MIT License

memory keeps increasing with dynamic input shape of network #5796

Open yflv-yanxia opened 3 years ago

yflv-yanxia commented 3 years ago

The input shape of our network is dynamic. When we run the network with onnxruntime's default CPU backend, everything is fine. With the DNNL backend, however, memory keeps increasing as the input shape varies; if the input shape is fixed, memory does not grow. We therefore suspect that onnxruntime+DNNL always allocates new memory for each new input shape of the network. Is that correct?
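A common mitigation for per-shape allocation (a sketch, not part of ONNX Runtime itself) is to pad variable-sized inputs up to a small set of fixed "bucket" shapes before inference, so the backend only ever sees a handful of shapes. The bucket sizes and the `pad_to_bucket` helper below are hypothetical:

```python
import numpy as np

def pad_to_bucket(x, buckets=(64, 128, 256)):
    """Pad the last axis of `x` up to the smallest bucket that fits it.

    Hypothetical helper: limits the set of distinct input shapes a
    session sees, at the cost of some wasted computation on padding.
    """
    length = x.shape[-1]
    target = next(b for b in buckets if b >= length)
    pad = [(0, 0)] * (x.ndim - 1) + [(0, target - length)]
    return np.pad(x, pad)

x = np.ones((1, 3, 100), dtype=np.float32)
padded = pad_to_bucket(x)
print(padded.shape)  # (1, 3, 128)
```

Whether padding is acceptable depends on the model; masking may be needed so the padded region does not affect the output.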

jywu-msft commented 3 years ago

yes, this is expected behavior for the implementation.
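To illustrate why memory grows without bound under dynamic shapes (a toy sketch, not ONNX Runtime's or DNNL's actual code): if a backend keys its scratch buffers on the exact input shape, every previously unseen shape allocates a new buffer, while a fixed shape reuses one buffer forever. The `ShapeKeyedAllocator` class is invented for illustration:

```python
import math

class ShapeKeyedAllocator:
    """Toy model of a backend that caches one buffer per exact shape."""

    def __init__(self):
        self.buffers = {}  # shape tuple -> buffer

    def get(self, shape):
        if shape not in self.buffers:
            # Allocate 4 bytes per element (fp32) for a new shape.
            self.buffers[shape] = bytearray(math.prod(shape) * 4)
        return self.buffers[shape]

dynamic = ShapeKeyedAllocator()
for n in range(1, 101):
    dynamic.get((1, 3, n))       # 100 distinct shapes -> 100 buffers

fixed = ShapeKeyedAllocator()
for _ in range(100):
    fixed.get((1, 3, 128))       # one fixed shape -> one buffer

print(len(dynamic.buffers), len(fixed.buffers))  # 100 1
```

This matches the observed behavior: fixed input shapes keep memory flat, while a stream of new shapes grows the cache monotonically.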