Hi all,
This line here: https://github.com/wfondrie/depthcharge/blob/main/depthcharge/transformers/analytes.py#L375
causes trouble on my side when `tokens` is `None`, probably because the token encoder expects integer tensors, but in that line the tensor is initialized with the default float dtype. An easy fix is probably something like this:
```python
tokens = torch.zeros(batch, 0, dtype=torch.int64)
```
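A minimal sketch of the problem (using a plain `nn.Embedding` as a stand-in for the token encoder, which is an assumption on my part):

```python
import torch
import torch.nn as nn

batch = 2

# torch.zeros defaults to a floating-point dtype, which
# integer-indexing layers like nn.Embedding reject.
tokens = torch.zeros(batch, 0)
assert tokens.dtype == torch.float32

# Proposed fix: request an integer dtype explicitly.
tokens = torch.zeros(batch, 0, dtype=torch.int64)

# A stand-in encoder: integer indices are accepted,
# even for a zero-length sequence.
embed = nn.Embedding(num_embeddings=10, embedding_dim=4)
out = embed(tokens)  # shape (2, 0, 4)
```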
Best, Daniela