A Lorentz-equivariant transformer for all of the LHC
Johann Brehmer, Víctor Bresó, Pim de Haan, Tilman Plehn, Huilin Qu, Jonas Spinner, Jesse Thaler
SciPost Phys. 19, 108 (2025) · published 23 October 2025
- doi: 10.21468/SciPostPhys.19.4.108
Abstract
We show that the Lorentz-Equivariant Geometric Algebra Transformer (L-GATr) yields state-of-the-art performance for a wide range of machine learning tasks at the Large Hadron Collider. L-GATr represents data in a geometric algebra over space-time and is equivariant under Lorentz transformations. The underlying architecture is a versatile and scalable transformer, which is able to break symmetries if needed. We demonstrate the power of L-GATr for amplitude regression and jet classification, and then benchmark it as the first Lorentz-equivariant generative network. For all three LHC tasks, we find significant improvements over previous architectures.
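The abstract's central claim, that the network processes space-time data equivariantly under Lorentz transformations, rests on the fact that Minkowski inner products of four-vectors are invariant under boosts and rotations. The snippet below is a minimal numerical sketch of that property only, not the authors' L-GATr implementation; the helpers `boost_x` and `minkowski` are illustrative names introduced here.

```python
# Minimal sketch (not the authors' code) of the Lorentz-invariance property
# that an equivariant architecture such as L-GATr can exploit: the Minkowski
# inner product of four-momenta is unchanged by a boost, so features built
# from such invariants respect the symmetry by construction.
import numpy as np

ETA = np.diag([1.0, -1.0, -1.0, -1.0])  # Minkowski metric, signature (+,-,-,-)

def boost_x(rapidity: float) -> np.ndarray:
    """Lorentz boost along the x-axis with the given rapidity."""
    ch, sh = np.cosh(rapidity), np.sinh(rapidity)
    L = np.eye(4)
    L[0, 0] = L[1, 1] = ch
    L[0, 1] = L[1, 0] = sh
    return L

def minkowski(p: np.ndarray, q: np.ndarray) -> float:
    """Minkowski inner product p.q = p^T eta q."""
    return float(p @ ETA @ q)

# Two example four-momenta (E, px, py, pz); the values are arbitrary.
p = np.array([10.0, 3.0, 4.0, 5.0])
q = np.array([8.0, 1.0, 2.0, 6.0])

L = boost_x(0.7)
# Boosting both momenta leaves their Minkowski inner product unchanged,
# since L^T eta L = eta for any Lorentz transformation L.
assert np.isclose(minkowski(p, q), minkowski(L @ p, L @ q))
print("p.q is invariant under the boost:", minkowski(p, q))
```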
Authors / Affiliations: mappings to Contributors and Organizations
- 1 Johann Brehmer,
- 2 Víctor Bresó,
- 1 Pim de Haan,
- 2 Tilman Plehn,
- 3 Huilin Qu,
- 2 Jonas Spinner,
- 4 5 Jesse Thaler
- 1 CuspAI
- 2 Ruprecht-Karls-Universität Heidelberg / Heidelberg University
- 3 Organisation européenne pour la recherche nucléaire / European Organization for Nuclear Research [CERN]
- 4 Massachusetts Institute of Technology [MIT]
- 5 The NSF AI Institute for Artificial Intelligence and Fundamental Interactions [IAIFI]
Funders:
- Bundesministerium für Bildung und Forschung / Federal Ministry of Education and Research [BMBF]
- Carl-Zeiss-Stiftung / Carl Zeiss Foundation
- Deutsche Forschungsgemeinschaft / German Research Foundation [DFG]
- Ministerio de Ciencia, Innovación y Universidades / Ministry of Science, Innovation and Universities
- National Science Foundation [NSF]
- NextGenerationEU
- Simons Foundation
- United States Department of Energy [DOE]
