cshjin / swmp_ml


NoneType of `gen` for opf model in HGT model #17

Closed. cshjin closed this issue 1 year ago.

cshjin commented 1 year ago

Error raised when running `demo_train_opf.py`:

Traceback (most recent call last):
  File "demo_train_opf.py", line 117, in <module>
    out = model(data.x_dict, data.edge_index_dict)
  File "/home/jinh/miniconda3/envs/swmp/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
  File "demo_train_opf.py", line 45, in forward
    return self.lin(x_dict['gen'])
  File "/home/jinh/miniconda3/envs/swmp/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/jinh/miniconda3/envs/swmp/lib/python3.8/site-packages/torch_geometric/nn/dense/linear.py", line 118, in forward
    return F.linear(x, self.weight, self.bias)
TypeError: linear(): argument 'input' (position 1) must be Tensor, not NoneType
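
For context, a minimal sketch of the kind of model that hits this error. The class, layer names, and dimensions below are my assumptions, not the actual code in `demo_train_opf.py`; only the final `self.lin(x_dict['gen'])` call is taken from the traceback.

import torch
from torch_geometric.nn import HGTConv, Linear


class HGT(torch.nn.Module):
    # hypothetical reconstruction of the failing model
    def __init__(self, metadata, hidden_channels=64, out_channels=4, heads=2):
        super().__init__()
        # lazy in_channels (-1) lets HGTConv infer the feature size per node type
        self.conv = HGTConv(-1, hidden_channels, metadata, heads=heads)
        self.lin = Linear(hidden_channels, out_channels)

    def forward(self, x_dict, edge_index_dict):
        x_dict = self.conv(x_dict, edge_index_dict)
        # x_dict['gen'] comes back as None if no edge type has 'gen' as its
        # destination, which is what triggers the TypeError in F.linear
        return self.lin(x_dict['gen'])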
cshjin commented 1 year ago

Does the forward in the HGTConv layer only update `dst_type` nodes?

# Iterate over edge-types:
for edge_type, edge_index in edge_index_dict.items():
    src_type, _, dst_type = edge_type
    edge_type = '__'.join(edge_type)

    a_rel = self.a_rel[edge_type]
    k = (k_dict[src_type].transpose(0, 1) @ a_rel).transpose(1, 0)

    m_rel = self.m_rel[edge_type]
    v = (v_dict[src_type].transpose(0, 1) @ m_rel).transpose(1, 0)

    # propagate_type: (k: Tensor, q: Tensor, v: Tensor, rel: Tensor)
    out = self.propagate(edge_index, k=k, q=q_dict[dst_type], v=v,
                         rel=self.p_rel[edge_type], size=None)
    out_dict[dst_type].append(out)
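
As the snippet shows, only `out_dict[dst_type]` receives the propagated messages, so a node type that never appears as a destination in any edge type gets no update and its entry in the returned dict ends up as `None`. A minimal check of this (toy graph, the relation name `conn` is made up; this is the behaviour of the PyG version in the traceback as far as I can tell, newer releases may omit the key instead of storing `None`):

import torch
from torch_geometric.data import HeteroData
from torch_geometric.nn import HGTConv

data = HeteroData()
data['gen'].x = torch.randn(3, 8)
data['bus'].x = torch.randn(5, 8)
# edges only point gen -> bus, so 'gen' is never a dst_type
data['gen', 'conn', 'bus'].edge_index = torch.tensor([[0, 1, 2], [0, 2, 4]])

conv = HGTConv(in_channels=8, out_channels=16, metadata=data.metadata(), heads=2)
out = conv(data.x_dict, data.edge_index_dict)
print(out.get('gen'))  # None, while out['bus'] is a [5, 16] tensor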
cshjin commented 1 year ago

Resolved by adding bidirectional links so that every node type (including `gen`) appears as a destination; with that, this is no longer a problem in the conv layers.
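
A generic way to add those reverse links, for anyone hitting the same issue, is PyG's `ToUndirected` transform, which inserts a `rev_*` edge type for every relation so every node type also appears as a destination (sketch, reusing the toy graph from the previous comment):

import torch
import torch_geometric.transforms as T
from torch_geometric.data import HeteroData

data = HeteroData()
data['gen'].x = torch.randn(3, 8)
data['bus'].x = torch.randn(5, 8)
data['gen', 'conn', 'bus'].edge_index = torch.tensor([[0, 1, 2], [0, 2, 4]])

# Adds ('bus', 'rev_conn', 'gen') with the reversed edge_index, so 'gen'
# now receives messages and HGTConv returns a tensor for it.
data = T.ToUndirected()(data)
print(data.edge_types)  # [('gen', 'conn', 'bus'), ('bus', 'rev_conn', 'gen')]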