Hi, since HIPT in the 3rd stage is trained only with `train_batch_size=1`, I tried padding the sequences to the largest number of regions in the mini-batch, as follows:
**`collate_features` function**

```python
from torch.nn.utils.rnn import pad_sequence
import torch


def collate_features(batch, label_type: str = "int"):
    # each item in the batch is a (idx, feature_sequence, label) tuple
    idx = torch.LongTensor([item[0] for item in batch])
    # pad mini-batch-wise: every sequence in the current mini-batch is padded
    # to the largest sequence length found in that mini-batch
    feature = pad_sequence([item[1] for item in batch], batch_first=True, padding_value=0)
    if label_type == "float":
        label = torch.FloatTensor([item[2] for item in batch])
    elif label_type == "int":
        label = torch.LongTensor([item[2] for item in batch])
    return [idx, feature, label]
```
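For context, this is how the collate function could be plugged into a `DataLoader`; the `region_dataset` name here is a placeholder for whatever dataset yields `(idx, features, label)` tuples:

```python
from torch.utils.data import DataLoader

# `region_dataset` is hypothetical: any dataset returning (idx, features, label)
loader = DataLoader(
    region_dataset,
    batch_size=8,            # train_batch_size > 1 now works
    shuffle=True,
    collate_fn=collate_features,
)
```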
**`GlobalHIPT` forward function**

```python
def forward(self, x):
    ...
```
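Since the body of the batched `forward` did not make it into the post above, here is a minimal sketch of the idea. Module names such as `global_phi`, `global_transformer`, and `global_rho` mirror the HIPT reference code, but the layer definitions and the simple (non-gated) attention pooling below are placeholder assumptions, not the exact implementation. The key point is masking the all-zero padded rows so they contribute neither to self-attention nor to attention pooling:

```python
import torch
import torch.nn as nn


class BatchedGlobalAggregator(nn.Module):
    """Hypothetical batch-aware stand-in for GlobalHIPT's forward pass.

    Assumes x has shape [B, M, embed_dim] after collate_features padding,
    where M is the largest number of regions in the mini-batch.
    """

    def __init__(self, embed_dim: int = 192, hidden_dim: int = 192, num_classes: int = 2):
        super().__init__()
        self.global_phi = nn.Sequential(nn.Linear(embed_dim, hidden_dim), nn.ReLU())
        layer = nn.TransformerEncoderLayer(
            d_model=hidden_dim, nhead=3, dim_feedforward=hidden_dim, batch_first=True
        )
        self.global_transformer = nn.TransformerEncoder(layer, num_layers=2)
        self.attn = nn.Linear(hidden_dim, 1)  # simple attention scorer (placeholder)
        self.global_rho = nn.Sequential(nn.Linear(hidden_dim, hidden_dim), nn.ReLU())
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, x):
        # x: [B, M, embed_dim]; rows padded by pad_sequence are all zeros,
        # which this mask assumption relies on to identify padding
        pad_mask = x.abs().sum(dim=-1) == 0               # [B, M], True where padded
        x = self.global_phi(x)                            # [B, M, hidden_dim]
        # exclude padded positions from self-attention
        x = self.global_transformer(x, src_key_padding_mask=pad_mask)
        scores = self.attn(x).squeeze(-1)                 # [B, M]
        # exclude padded positions from attention pooling
        scores = scores.masked_fill(pad_mask, float("-inf"))
        att = torch.softmax(scores, dim=1).unsqueeze(1)   # [B, 1, M]
        x = torch.bmm(att, x).squeeze(1)                  # [B, hidden_dim]
        x = self.global_rho(x)
        return self.classifier(x)                         # [B, num_classes]
```

Without the masking, the zero padding would still receive attention weight and shift the pooled slide-level representation, so results would differ between `train_batch_size=1` and larger batches.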
I have tested this and it works with `train_batch_size > 1`. Hope it helps, and please let me know if you find any errors in my implementation.