SciPost Submission Page

TrackFormers Part 2: Enhanced Transformer-Based Models for High-Energy Physics Track Reconstruction

by Sascha Caron, Nadezhda Dobreva, Maarten Kimpel, Uraz Odyurt, Slav Pshenov, Roberto Ruiz de Austri Bazan, Eugene Shalugin, Zef Wolffs, Yue Zhao

Submission summary

Authors (as registered SciPost users): Uraz Odyurt · Yue Zhao
Submission information
Preprint Link: scipost_202509_00062v1 (pdf)
Date submitted: Sept. 30, 2025, 5:21 p.m.
Submitted by: Yue Zhao
Submitted to: SciPost Physics Proceedings
Proceedings issue: The 2nd European AI for Fundamental Physics Conference (EuCAIFCon2025)
Ontological classification
Academic field: Physics
Specialties:
  • High-Energy Physics - Experiment

Abstract

High-Energy Physics experiments are rapidly escalating in generated data volume, a trend that will intensify with the upcoming High-Luminosity LHC upgrade. This surge in data necessitates critical revisions across the data processing pipeline, with particle track reconstruction being a prime candidate for improvement. In our previous work, we introduced "TrackFormers", a collection of Transformer-based one-shot encoder-only models that effectively associate hits with expected tracks. In this study, we extend our earlier efforts by incorporating loss functions that account for inter-hit correlations, conducting detailed investigations into various Transformer attention mechanisms, and studying the reconstruction of higher-level objects. Furthermore, we discuss new datasets that allow training at the hit level for a range of physics processes. These developments collectively aim to boost both the accuracy and, potentially, the efficiency of our tracking models, offering a robust solution to meet the demands of next-generation High-Energy Physics experiments.

Current status:
Voting in preparation
