DeepLearnPhysics / lartpc_mlreco3d


Full Chain Grappa Track breaks when a batch is too large #195

Open bear-is-asleep opened 3 months ago

bear-is-asleep commented 3 months ago

If the batch is too large, training breaks:

The complete graph is too large, must skip batch
Traceback (most recent call last):
...
  File "/sdf/home/b/bearc/lartpc_mlreco3d_fd/mlreco/models/layers/common/gnn_full_chain.py", line 107, in run_gnn
    gnn_output['edge_index'][0][b],
KeyError: 'edge_index'

The error is caused by skipping the oversized graph and then landing in code it isn't supposed to reach: the full chain still tries to read `gnn_output['edge_index']` even though the GNN never produced it for that batch. The temporary workaround is to train on smaller batches. In the standalone module this case is handled by simply skipping the batch.
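
For reference, a minimal sketch of the kind of guard that would avoid the KeyError: check for `'edge_index'` before indexing and skip the batch, the way the standalone module does. The function name and signature below are hypothetical illustrations, not the actual `run_gnn` code in `gnn_full_chain.py`.

```python
# Hypothetical sketch (not the real gnn_full_chain.py code): guard the
# 'edge_index' access so an oversized batch is skipped instead of
# raising a KeyError, mirroring the standalone module's behavior.

def run_gnn_guarded(gnn_output, batch_id):
    """Return the edge index for one batch entry, or None if the GNN
    skipped the batch because the complete graph was too large."""
    if 'edge_index' not in gnn_output:
        # The GNN produced no graph output for this batch; skip it
        # rather than indexing a key that does not exist.
        return None
    return gnn_output['edge_index'][0][batch_id]


# Example: a skipped batch yields an output dict without 'edge_index'.
gnn_output = {}
assert run_gnn_guarded(gnn_output, 0) is None
```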