microsoft / onnxruntime

ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator
https://onnxruntime.ai
MIT License

how to release gpu memory after session.run #20517

Open ZTurboX opened 3 months ago

ZTurboX commented 3 months ago

Describe the issue

How can I release GPU memory after session.run?

To reproduce

How can I release GPU memory after session.run?

Urgency

No response

Platform

Linux

OS Version

Ubuntu

ONNX Runtime Installation

Released Package

ONNX Runtime Version or Commit ID

1.17.1

ONNX Runtime API

Python

Architecture

X86

Execution Provider

CUDA

Execution Provider Library Version

CUDA 11.7

pranavsharma commented 3 months ago

Just let the session destruct if you don't intend to use the session anymore. All GPU memory held by the session is released when the session's destructor is called. If you intend to use the session again, you can configure this run option: https://github.com/microsoft/onnxruntime/blob/72ce4de07df91b43d36d5c475a609095bde50a53/include/onnxruntime/core/session/onnxruntime_run_options_config_keys.h#L27
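
A minimal Python sketch of both approaches, assuming the run config key "memory.enable_memory_arena_shrinkage" from the header linked above and the RunOptions.add_run_config_entry API; the model path, input name, and input shape are placeholders for illustration:

```python
import numpy as np
import onnxruntime as ort

# Create a session on the CUDA execution provider.
# "model.onnx" and the input name/shape below are hypothetical.
session = ort.InferenceSession("model.onnx", providers=["CUDAExecutionProvider"])

# Option 1: keep the session for later runs, but ask the memory arena to
# shrink back after this run. The key comes from
# onnxruntime_run_options_config_keys.h; "gpu:0" assumes CUDA device 0.
run_options = ort.RunOptions()
run_options.add_run_config_entry("memory.enable_memory_arena_shrinkage", "gpu:0")
dummy_input = np.zeros((1, 3, 224, 224), dtype=np.float32)
outputs = session.run(None, {"input": dummy_input}, run_options)

# Option 2: the session is no longer needed, so drop the last reference.
# When the session object is destroyed, the GPU memory it holds is released.
del session
```

Note that arena shrinkage only releases memory the arena allocated beyond its initial chunk, so some GPU memory remains reserved as long as the session is alive; fully releasing it requires destroying the session as in option 2.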

ZTurboX commented 3 months ago

Just let the session destruct if you don't intend to use the session anymore. All GPU memory held by the session is released when the session's destructor is called. If you intend to use the session again, you can configure this run option:

How do I set this run option from the Python API?

github-actions[bot] commented 2 months ago

This issue has been automatically marked as stale due to inactivity and will be closed in 30 days if no further activity occurs. If further support is needed, please provide an update and/or more details.