NLeSC / Machine_Learning_SIG

Topics discussed in the Machine Learning SIG.

[2024/02/08 14:30] Yue Zhao - Transformer-inspired architectures for particle track reconstruction #73


APJansen commented 1 year ago

recording

We address the track reconstruction problem in the ATLAS experiment at the Large Hadron Collider (LHC). When protons collide inside the detector, the collision produces a multitude of secondary particles. Each secondary particle passes through a series of detectors, leaving behind a 3D point cloud of signals called “hits”. Reconstructing particle tracks from these hits is a necessary step before scientists can further analyse and identify the generated particles. The first step in track reconstruction is to associate hits that likely originated from the same particle.
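
To make the association task concrete, here is a minimal sketch (not the actual ATLAS pipeline or data format): hits are treated as 3D points, and in simulation each hit carries a particle label, so grouping hits by label yields the track candidates the models aim to recover. All array names and values below are illustrative assumptions.

```python
import numpy as np

# Hypothetical toy event: each hit is a 3D point (x, y, z) recorded by the detector.
hits = np.array([
    [0.1, 0.2, 0.0],
    [0.2, 0.4, 0.5],
    [0.3, 0.6, 1.0],
    [-0.1, 0.1, 0.0],
    [-0.2, 0.2, 0.5],
    [-0.3, 0.3, 1.0],
])

# Ground-truth particle id per hit (available in simulation only); associating hits
# that share a particle is the clustering/classification task the models learn.
particle_id = np.array([0, 0, 0, 1, 1, 1])

# A track candidate is then simply the set of hits sharing a label.
tracks = {pid: hits[particle_id == pid] for pid in np.unique(particle_id)}
print({pid: trk.shape for pid, trk in tracks.items()})  # {0: (3, 3), 1: (3, 3)}
```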

In our study, we assess the feasibility of using three Transformer-inspired architectures for hit clustering/classification, leveraging the attention mechanism of Transformer models. Preliminary studies on a simplified dataset show high success rates for all three models. However, the real challenge lies in the size of the realistic data. I’ll discuss our ideas for adapting the models to maximise the sequence length we can process within the limited memory of the available hardware, and share some thoughts on how to enable high-throughput inference once training is done.
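
As a rough illustration of the general approach (not the three specific architectures from the talk), the sketch below treats an event’s hits as a sequence, applies a standard PyTorch Transformer encoder, and predicts a label per hit. The hyperparameters and class count are placeholder assumptions; the closing comment notes the quadratic memory scaling of self-attention that makes sequence length the practical bottleneck on realistic data.

```python
import torch
import torch.nn as nn


class HitTransformer(nn.Module):
    """Illustrative sketch: embed each hit's (x, y, z) coordinates, let the hits of an
    event attend to each other, and classify every hit (e.g. into a cluster label).
    All sizes below are placeholder assumptions, not values from the study."""

    def __init__(self, d_model=64, n_heads=4, n_layers=2, n_classes=10):
        super().__init__()
        self.embed = nn.Linear(3, d_model)          # (x, y, z) -> d_model features
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, n_classes)   # per-hit class logits

    def forward(self, hits, padding_mask=None):
        # hits: (batch, n_hits, 3); padding_mask: (batch, n_hits), True where padded.
        x = self.embed(hits)
        x = self.encoder(x, src_key_padding_mask=padding_mask)
        return self.head(x)


# Self-attention cost and memory grow quadratically with the number of hits per event,
# which is why fitting realistic sequence lengths into GPU memory is the hard part.
model = HitTransformer()
events = torch.randn(2, 128, 3)    # 2 toy events with 128 hits each
logits = model(events)             # shape: (2, 128, n_classes)
print(logits.shape)
```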