Project-MONAI / MONAI

AI Toolkit for Healthcare Imaging
https://monai.io/
Apache License 2.0

AttributeError: 'torch.device' object has no attribute 'gpu_id' #8017

Closed · KumoLiu closed this issue 1 month ago

KumoLiu commented 1 month ago
======================================================================
ERROR: test_value_0_fp32 (tests.test_convert_to_trt.TestConvertToTRT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/parameterized/parameterized.py", line 620, in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
  File "/tmp/MONAI/tests/test_convert_to_trt.py", line 64, in test_value
    torchscript_model = convert_to_trt(
  File "/tmp/MONAI/monai/networks/utils.py", line 985, in convert_to_trt
    trt_model = torch_tensorrt.compile(
  File "/usr/local/lib/python3.10/dist-packages/torch_tensorrt/_compile.py", line 208, in compile
    compiled_ts_module: torch.jit.ScriptModule = torchscript_compile(
  File "/usr/local/lib/python3.10/dist-packages/torch_tensorrt/ts/_compiler.py", line 154, in compile
    compiled_cpp_mod = _C.compile_graph(module._c, _parse_compile_spec(spec))
  File "/usr/local/lib/python3.10/dist-packages/torch_tensorrt/ts/_compile_spec.py", line 267, in _parse_compile_spec
    info.device = _parse_device(compile_spec["device"])
  File "/usr/local/lib/python3.10/dist-packages/torch_tensorrt/ts/_compile_spec.py", line 90, in _parse_device
    return TorchScriptDevice._from(device_info)._to_internal()
  File "/usr/local/lib/python3.10/dist-packages/torch_tensorrt/ts/_Device.py", line 66, in _from
    gpu_id=d.gpu_id,
AttributeError: 'torch.device' object has no attribute 'gpu_id'

This does not work in the PyTorch base image 24.08.
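The failure happens because the torch_tensorrt TorchScript frontend reads a gpu_id attribute that torch.device does not define. Below is a minimal, hedged sketch of one way to avoid that code path (not necessarily the approach taken in #8019): wrap the target GPU in torch_tensorrt.Device instead of handing torch_tensorrt.compile a raw torch.device. The toy model and input shape are illustrative only.

import torch
import torch_tensorrt

# Illustrative scripted model and example input.
net = torch.nn.Sequential(torch.nn.Conv2d(1, 1, 3)).eval().cuda()
ts_model = torch.jit.script(net)
inputs = [torch.randn(1, 1, 32, 32, device="cuda")]

# Passing a raw torch.device reproduces the AttributeError, because the
# TorchScript frontend's _parse_device reads `gpu_id` from the device object:
#   torch_tensorrt.compile(ts_model, ir="torchscript", inputs=inputs,
#                          device=torch.device("cuda:0"))

# Wrapping the target GPU in torch_tensorrt.Device avoids that code path.
trt_model = torch_tensorrt.compile(
    ts_model,
    ir="torchscript",
    inputs=inputs,
    device=torch_tensorrt.Device("cuda:0"),
)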

KumoLiu commented 1 month ago
======================================================================
ERROR: test_onnx_trt_export_0_fp32 (tests.test_bundle_trt_export.TestTRTExport)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/tmp/MONAI/tests/utils.py", line 810, in command_line_tests
    normal_out = subprocess.run(cmd, env=test_env, check=True, capture_output=True)
  File "/usr/lib/python3.10/subprocess.py", line 526, in run
    raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['python', '-m', 'monai.bundle', 'trt_export', 'network_def', '--filepath', '/tmp/tmpwmbqcva5/model_trt_fp32.ts', '--meta_file', '/tmp/MONAI/tests/testing_data/metadata.json', '--config_file', "['/tmp/MONAI/tests/testing_data/inference.json','/tmp/tmpwmbqcva5/def_args.yaml']", '--ckpt_file', '/tmp/tmpwmbqcva5/model.pt', '--args_file', '/tmp/tmpwmbqcva5/def_args.yaml', '--precision', 'fp32', '--use_onnx', 'True', '--input_shape', '[1, 1, 96, 96, 96]', '--dynamic_batch', '[1, 4, 8]']' returned non-zero exit status 1.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/parameterized/parameterized.py", line 620, in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
  File "/tmp/MONAI/tests/test_bundle_trt_export.py", line 126, in test_onnx_trt_export
    command_line_tests(cmd)
  File "/tmp/MONAI/tests/utils.py", line 816, in command_line_tests
    raise RuntimeError(f"subprocess call error {e.returncode}: {errors}, {output}") from e
RuntimeError: subprocess call error 1: b'/usr/local/lib/python3.10/dist-packages/ignite/handlers/checkpoint.py:17: DeprecationWarning: `TorchScript` support for functional optimizers is deprecated and will be removed in a future PyTorch release. Consider using the `torch.compile` optimizer instead.
  from torch.distributed.optim import ZeroRedundancyOptimizer
You are using `torch.load` with `weights_only=False` (the current default value), which uses the default pickle module implicitly. It is possible to construct malicious pickle data which will execute arbitrary code during unpickling (See https://github.com/pytorch/pytorch/blob/main/SECURITY.md#untrusted-models for more details). In a future release, the default value for `weights_only` will be flipped to `True`. This limits the functions that could be executed during unpickling. Arbitrary objects will no longer be allowed to be loaded via this mode unless they are explicitly allowlisted by the user via `torch.serialization.add_safe_globals`. We recommend you start setting `weights_only=True` for any use case where you don\'t have full control of the loaded file. Please open an issue on GitHub for any issues related to this experimental feature.
`torch.library.impl_abstract` was renamed to `torch.library.register_fake`. Please use that instead; we will remove `torch.library.impl_abstract` in a future version of PyTorch.
WARNING:py.warnings:There is no dynamic batch range. The converted model only takes [1, 1, 96, 96, 96] shape input.

Traceback (most recent call last):
  File "/usr/lib/python3.10/runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/lib/python3.10/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/tmp/MONAI/monai/bundle/__main__.py", line 31, in <module>
    fire.Fire()
  File "/usr/local/lib/python3.10/dist-packages/fire/core.py", line 143, in Fire
    component_trace = _Fire(component, args, parsed_flag_args, context, name)
  File "/usr/local/lib/python3.10/dist-packages/fire/core.py", line 477, in _Fire
    component, remaining_args = _CallAndUpdateTrace(
  File "/usr/local/lib/python3.10/dist-packages/fire/core.py", line 693, in _CallAndUpdateTrace
    component = fn(*varargs, **kwargs)
  File "/tmp/MONAI/monai/bundle/scripts.py", line 1718, in trt_export
    _export(
  File "/tmp/MONAI/monai/bundle/scripts.py", line 1293, in _export
    net = converter(model=net, **kwargs)
  File "/tmp/MONAI/monai/networks/utils.py", line 965, in convert_to_trt
    trt_model = _onnx_trt_compile(
  File "/tmp/MONAI/monai/networks/utils.py", line 825, in _onnx_trt_compile
    torch_tensorrt.set_device(device)
AttributeError: module \'torch_tensorrt\' has no attribute \'set_device\'
WARNING:py.warnings:Implicitly cleaning up <TemporaryDirectory \'/tmp/tmpsq0sg0pp\'>

', b"2024-08-13 12:55:34,968 - INFO - --- input summary of monai.bundle.scripts.trt_export ---
2024-08-13 12:55:34,969 - INFO - > meta_file: '/tmp/MONAI/tests/testing_data/metadata.json'
2024-08-13 12:55:34,969 - INFO - > net_id: 'network_def'
2024-08-13 12:55:34,969 - INFO - > filepath: '/tmp/tmpwmbqcva5/model_trt_fp32.ts'
2024-08-13 12:55:34,969 - INFO - > config_file: ['/tmp/MONAI/tests/testing_data/inference.json',
 '/tmp/tmpwmbqcva5/def_args.yaml']
2024-08-13 12:55:34,969 - INFO - > ckpt_file: '/tmp/tmpwmbqcva5/model.pt'
2024-08-13 12:55:34,969 - INFO - > precision: 'fp32'
2024-08-13 12:55:34,969 - INFO - > input_shape: [1, 1, 96, 96, 96]
2024-08-13 12:55:34,969 - INFO - > use_onnx: True
2024-08-13 12:55:34,969 - INFO - > dynamic_batch: [1, 4, 8]
2024-08-13 12:55:34,969 - INFO - ---
KumoLiu commented 1 month ago

For the first one, I'm trying to fix it in #8019. For the second one, I reported it on the torch-tensorrt repo: https://github.com/pytorch/TensorRT/issues/3084

Also cc @binliunls

KumoLiu commented 1 month ago

For the second one, as the PyTorch team suggested, we should use the torch API instead.
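A minimal sketch of what that suggestion could look like where _onnx_trt_compile in monai/networks/utils.py currently calls torch_tensorrt.set_device; the device index used here is illustrative, not the value from the actual code.

import torch

device_id = 0  # GPU index passed to _onnx_trt_compile (illustrative value)

# Before (fails on recent torch-tensorrt releases):
#   torch_tensorrt.set_device(device_id)
#   AttributeError: module 'torch_tensorrt' has no attribute 'set_device'

# After, using the plain torch CUDA API as suggested by the PyTorch team:
torch.cuda.set_device(device_id)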