alvinwan / neural-backed-decision-trees

Making decision trees competitive with neural networks on CIFAR10, CIFAR100, TinyImagenet200, Imagenet
https://nbdt.aaalv.in
MIT License
606 stars · 130 forks

Using softTreeLoss error #11

Open Muzijiajian opened 3 years ago

Muzijiajian commented 3 years ago

Hello, I am trying to use `SoftTreeSupLoss` with the following code:

```python
from nbdt.loss import SoftTreeSupLoss

train_loss_fn = nn.CrossEntropyLoss().cuda()
criterion = SoftTreeSupLoss(
    criterion=train_loss_fn,
    dataset='Imagenet1000',
    tree_supervision_weight=1.0,
    hierarchy='induced-efficientnet_b7b',
)
...
for i, (input, targets) in enumerate(train_loader):
    targets = targets.cuda(async=True)
    input_var = torch.autograd.Variable(input).cuda()
    targets_var = torch.autograd.Variable(targets).cuda()
    scores = model(input_var)
    loss = criterion(scores, targets_var)
```

It then raises the following error:

```
File "/gruntdata/semantic-hierarchy-master/neural-backed-decision-trees/nbdt/model.py", line 240, in forward
    wnid_to_outputs = self.forward_nodes(outputs)
  File "/gruntdata/semantic-hierarchy-master/neural-backed-decision-trees/nbdt/model.py", line 101, in forward_nodes
    return self.get_all_node_outputs(outputs, self.nodes)
  File "/gruntdata/semantic-hierarchy-master/neural-backed-decision-trees/nbdt/model.py", line 90, in get_all_node_outputs
    node_logits = cls.get_node_logits(outputs, node)
  File "/gruntdata/semantic-hierarchy-master/neural-backed-decision-trees/nbdt/model.py", line 79, in get_node_logits
    for new_label in range(node.num_classes)
  File "/gruntdata/semantic-hierarchy-master/neural-backed-decision-trees/nbdt/model.py", line 79, in <listcomp>
    for new_label in range(node.num_classes)
AttributeError: 'Tensor' object has no attribute 'T'
```

alvinwan commented 3 years ago

@Muzijiajian Hm, are you on PyTorch 1.4? https://github.com/alvinwan/neural-backed-decision-trees/blob/master/requirements.txt#L2. Your code looks right, and tensors should definitely support .T for transpose.
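For readers hitting the same error: older PyTorch releases did not define a `.T` property on tensors, so accessing it raises exactly the `AttributeError` in the traceback above, while the version pinned in the repo's requirements.txt supports it. The sketch below illustrates the mechanism with plain-Python stand-in classes (`OldTensor` and `NewTensor` are illustrative names, not real PyTorch):

```python
class OldTensor:
    """Stand-in for a tensor class that lacks the `.T` property."""
    def __init__(self, data):
        self.data = data

class NewTensor(OldTensor):
    """Stand-in for a tensor class that does define `.T` as a property."""
    @property
    def T(self):
        # Transpose a 2-D nested list by zipping its rows into columns.
        return NewTensor([list(row) for row in zip(*self.data)])

old = OldTensor([[1, 2], [3, 4]])
try:
    old.T  # no such attribute on the old class
except AttributeError as e:
    print("old:", e)   # old: 'OldTensor' object has no attribute 'T'

new = NewTensor([[1, 2], [3, 4]])
print("new:", new.T.data)  # new: [[1, 3], [2, 4]]
```

In other words, the fix is not in the user's training loop but in the environment: upgrading PyTorch to the pinned version makes `tensor.T` available and the traceback goes away.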