google-research / deeplab2

DeepLab2 is a TensorFlow library for deep labeling, aiming to provide a unified and state-of-the-art TensorFlow codebase for dense pixel labeling tasks.
Apache License 2.0

How to identify whether the model is using the GPU? #163

Open PritiDrishtic opened 1 year ago

PritiDrishtic commented 1 year ago

I exported the kMaX model using export_model.py on a GPU (Tesla T4).

Please advise how I can determine whether this model is actually using the GPU, since its inference performance is nearly identical to that of the pretrained model running in CPU mode.
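
One way to check this is to list the GPUs TensorFlow can see and enable device-placement logging before running inference. The sketch below makes assumptions: the SavedModel path `exported_kmax/` is a placeholder for your own export directory, and the model is assumed to be callable on a single uint8 image tensor, matching the input shape shown in the logs below.

```python
import numpy as np
import tensorflow as tf

# Log the device every op runs on; must be enabled before the model executes.
tf.debugging.set_log_device_placement(True)

# Confirm TensorFlow can see the Tesla T4 at all.
print("Visible GPUs:", tf.config.list_physical_devices('GPU'))

# Load the exported SavedModel (path is a placeholder).
model = tf.saved_model.load('exported_kmax/')

# Run one inference; the placement log should show ops on /device:GPU:0.
image = tf.convert_to_tensor(np.zeros((1530, 2720, 3), dtype=np.uint8))
outputs = model(image)
```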

I am executing inference on an image; the TensorFlow logs from the script are attached below:

```
I tensorflow/core/common_runtime/executor.cc:1197] [/device:CPU:0] (DEBUG INFO) Executor start aborting (this does not indicate an error and you can ignore this message): INVALID_ARGUMENT: You must feed a value for placeholder tensor 'input_tensor' with dtype uint8 and shape [1530,2720,3]
	 [[{{node input_tensor}}]]
2023-05-17 05:35:42.665202: I tensorflow/compiler/xla/stream_executor/cuda/cuda_dnn.cc:424] Loaded cuDNN version 8800
2023-05-17 05:35:43.606321: I tensorflow/compiler/xla/service/service.cc:169] XLA service 0x7f29702d1dc0 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:
2023-05-17 05:35:43.606368: I tensorflow/compiler/xla/service/service.cc:177]   StreamExecutor device (0): Tesla T4, Compute Capability 7.5
2023-05-17 05:35:44.473843: I ./tensorflow/compiler/jit/device_compiler.h:180] Compiled cluster using XLA!  This line is logged at most once for the lifetime of the process.
```
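
Since the symptom is that CPU-mode and GPU timings look nearly identical, another check is to time the same inference call twice, once normally and once with the GPU hidden via CUDA_VISIBLE_DEVICES set before TensorFlow is imported. This is a rough sketch; the model path and the single-image calling convention are assumptions based on the snippet above, and the warm-up call keeps graph tracing and the one-time XLA compilation out of the measurement.

```python
import os
import time

# Uncomment to force a CPU-only run for comparison; must be set
# before TensorFlow is imported.
# os.environ['CUDA_VISIBLE_DEVICES'] = '-1'

import numpy as np
import tensorflow as tf

model = tf.saved_model.load('exported_kmax/')  # placeholder path
image = tf.convert_to_tensor(np.zeros((1530, 2720, 3), dtype=np.uint8))

# Warm-up call so tracing / compilation is excluded from the timing.
_ = model(image)

start = time.perf_counter()
for _ in range(10):
    _ = model(image)
print('mean latency: %.3f s' % ((time.perf_counter() - start) / 10))
```

If the two runs report roughly the same latency, the model is most likely not benefiting from the GPU; a clear gap confirms GPU execution.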