When running the code, I got this error:

/anaconda3/envs/df116/lib/python3.9/site-packages/pytorch_lightning/utilities/data.py:98: UserWarning: Trying to infer the batch_size from an ambiguous collection. The batch size we found is 64. To avoid any miscalculations, use self.log(..., batch_size=batch_size).
  warning_cache.warn(
/anaconda3/envs/df116/lib/python3.9/site-packages/torchmetrics/functional/classification/precision_recall_curve.py:70: UserWarning: cumsum_cuda_kernel does not have a deterministic implementation, but you set 'torch.use_deterministic_algorithms(True, warn_only=True)'. You can file an issue at https://github.com/pytorch/pytorch/issues to help us prioritize adding deterministic support for this operation. (Triggered internally at ../aten/src/ATen/Context.cpp:82.)
  tps = torch.cumsum(target * weight, dim=0)[threshold_idxs]
terminate called after throwing an instance of 'std::runtime_error'
  what():  tensorflow/compiler/xla/xla_client/computation_client.cc:280 : Missing XLA configuration
Aborted (core dumped)