Operations on scalar named tensors (tensors with no named dimensions), such as the values returned by loss functions, crash when _force_order is called internally. A minimal example:
import torch
from namedtensor import NamedTensor

a = NamedTensor(torch.tensor(0.0), ())
b = NamedTensor(torch.tensor(1.0), ())
a + b  # raises TypeError inside _force_order
The relevant part of the trace is:
...
/usr/local/lib/python3.6/dist-packages/namedtensor/core.py in _force_order(self, names)
195 trans.append(d)
196 return self.__class__(
--> 197 self.transpose(*trans)._tensor.contiguous().view(*view), ex
198 )
199
/usr/local/lib/python3.6/dist-packages/namedtensor/core.py in transpose(self, *dims)
132 )
133 indices = [self._schema.get(d) for d in to_dims]
--> 134 tensor = self._tensor.permute(*indices)
135 return self.__class__(tensor, to_dims)
136
TypeError: permute() missing 1 required positional arguments: "dims"
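The underlying problem is that transpose builds an empty indices list for a zero-dimensional tensor, so self._tensor.permute(*indices) is called with no arguments and PyTorch rejects it. A possible fix, sketched below only against the lines visible in the traceback (the rest of transpose, including how to_dims is computed, is not shown in the trace), would be to short-circuit when there are no dimensions to permute:

# Hypothetical guard inside transpose(); inferred from the traceback, not from the library source.
indices = [self._schema.get(d) for d in to_dims]
if not indices:
    # Zero-dimensional tensor: nothing to permute, return it unchanged.
    return self.__class__(self._tensor, to_dims)
tensor = self._tensor.permute(*indices)
return self.__class__(tensor, to_dims)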
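For comparison, the same addition appears to go through once the tensors carry at least one named dimension, which points at the zero-dimensional case specifically. This is an untested sketch assuming the constructor accepts a one-element name tuple as in the example above:

import torch
from namedtensor import NamedTensor

x = NamedTensor(torch.tensor([0.0]), ("batch",))
y = NamedTensor(torch.tensor([1.0]), ("batch",))
x + y  # permute() receives an index here, so no TypeError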