Closed: ChenglongWang closed this issue 1 year ago
Hi @ChenglongWang, a workaround can be `writer.add_graph(model, tensor(inputs), verbose=True)`. Thanks.
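The workaround above works because it hands `torch.jit.trace` a plain `torch.Tensor` instead of a `MetaTensor` subclass. A minimal sketch of the same idea (`to_plain_tensor` is a hypothetical helper name; MONAI's `MetaTensor` also exposes `as_tensor()` for this purpose):

```python
import torch

def to_plain_tensor(x):
    # Strip any torch.Tensor subclass (e.g. MONAI's MetaTensor) back to a
    # base torch.Tensor, so that the clone(memory_format=...) call made by
    # torch.jit.trace during add_graph tracing hits the stock implementation.
    if isinstance(x, torch.Tensor) and type(x) is not torch.Tensor:
        return x.as_subclass(torch.Tensor)
    return x

# Stand-in for the real batch from the training loop.
inputs = torch.zeros(10, 1, 96, 96)
plain = to_plain_tensor(inputs)
assert type(plain) is torch.Tensor
# writer.add_graph(model, plain, verbose=True)  # the workaround call
```

Converting only the inputs passed to `add_graph` leaves the rest of the training loop (and the metadata carried by `MetaTensor`) untouched.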
Thanks! It really helps!
Describe the bug
TL;DR: an error is raised when calling SummaryWriter.add_graph().
To Reproduce
Steps to reproduce the behavior: call

writer.add_graph(model, inputs, verbose=True)

in the training loop. Both torch.utils.tensorboard and tensorboardX fail.

Environment (please complete the following information):
Optional dependencies:
- PyTorch Ignite version: 0.4.10
- Nibabel version: 5.1.0
- scikit-image version: 0.20.0
- Pillow version: 9.4.0
- Tensorboard version: 2.12.1
- gdown version: 4.7.1
- TorchVision version: 0.13.1
- tqdm version: 4.65.0
- lmdb version: 1.4.1
- psutil version: 5.9.4
- pandas version: 2.0.0
- einops version: 0.6.0
- transformers version: 4.21.3
- mlflow version: 2.2.2
generating synthetic data to /tmp/tmp9810tcm9 (this may take a while)
torch.Size([10, 1, 96, 96]) torch.Size([10, 1, 96, 96])
epoch 1/10
1/5, train_loss: 0.3956
Traceback (most recent call last):
  File "/homes/clwang/Code/tests/add_graph_test.py", line 174, in <module>
    main(tempdir)
  File "/homes/clwang/Code/tests/add_graph_test.py", line 127, in main
    writer.add_graph(model, inputs, verbose=True)
  File "/homes/clwang/.miniconda3/envs/monai/lib/python3.10/site-packages/tensorboardX/writer.py", line 937, in add_graph
    self._get_file_writer().add_graph(graph(model, input_to_model, verbose))
  File "/homes/clwang/.miniconda3/envs/monai/lib/python3.10/site-packages/torch/utils/tensorboard/_pytorch_graph.py", line 338, in graph
    trace = torch.jit.trace(model, args, strict=use_strict_trace)
  File "/homes/clwang/.miniconda3/envs/monai/lib/python3.10/site-packages/torch/jit/_trace.py", line 750, in trace
    return trace_module(
  File "/homes/clwang/.miniconda3/envs/monai/lib/python3.10/site-packages/torch/jit/_trace.py", line 992, in trace_module
    _check_trace(
  File "/homes/clwang/.miniconda3/envs/monai/lib/python3.10/site-packages/torch/autograd/grad_mode.py", line 27, in decorate_context
    return func(*args, **kwargs)
  File "/homes/clwang/.miniconda3/envs/monai/lib/python3.10/site-packages/torch/jit/_trace.py", line 329, in _check_trace
    copied_dict[name] = _clone_inputs(data)
  File "/homes/clwang/.miniconda3/envs/monai/lib/python3.10/site-packages/torch/jit/_trace.py", line 160, in _clone_inputs
    return function._nested_map(
  File "/homes/clwang/.miniconda3/envs/monai/lib/python3.10/site-packages/torch/autograd/function.py", line 463, in _map
    return fn(obj)
  File "/homes/clwang/.miniconda3/envs/monai/lib/python3.10/site-packages/torch/jit/_trace.py", line 150, in clone_input
    a.detach()
TypeError: MetaTensor.clone() got an unexpected keyword argument 'memory_format'
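The final TypeError pinpoints the root cause: while tracing the model for add_graph, torch.jit clones each input with a `memory_format` keyword, but this MONAI version's `MetaTensor.clone()` override does not accept that keyword. A torch-free mimic of the failure mode (`Tensorish` and `LegacyMetaTensor` are hypothetical stand-ins, not real classes):

```python
class Tensorish:
    # Mimics torch.Tensor.clone(), which accepts a memory_format keyword.
    def clone(self, *, memory_format=None):
        return Tensorish()

class LegacyMetaTensor(Tensorish):
    # Mimics the MetaTensor.clone() override in the report: it takes no
    # keyword arguments, so tracing's clone(memory_format=...) call fails.
    def clone(self):
        return LegacyMetaTensor()

Tensorish().clone(memory_format="preserve")  # fine: the base accepts the kwarg
try:
    LegacyMetaTensor().clone(memory_format="preserve")
except TypeError as e:
    print(e)  # ...got an unexpected keyword argument 'memory_format'
```

This is why the workaround of passing a plain tensor helps: the clone then dispatches to the base implementation, which understands `memory_format`.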