
Numerical aspects of large deviations

Alexander K. Hartmann

SciPost Phys. Lect. Notes 100 (2025) · published 30 September 2025

Part of the 2024-07: Theory of Large Deviations and Applications Collection in the Les Houches Summer School Lecture Notes Series.

Abstract

An introduction to numerical large-deviation sampling is provided. First, direct biasing with a known distribution is explained. As a simple example, the Bernoulli process is used throughout the text. Next, Markov chain Monte Carlo (MCMC) simulations are introduced; in particular, the Metropolis-Hastings algorithm is explained. As a first implementation of MCMC, sampling of the plain Bernoulli model is shown. Next, an exponential bias is used for the same model, which allows one to obtain the tails of the distribution of a measurable quantity. This approach is then generalized to MCMC simulations in which the states are vectors of $U(0,1)$ random entries. This allows one to use the exponential or any other bias to access the large-deviation properties of rather arbitrary random processes. Finally, some recent research applications to more complex models are discussed.
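The generalized approach sketched in the abstract can be illustrated with a short, self-contained program. This is not the author's reference implementation, only a minimal sketch under stated assumptions: the Markov-chain state is a vector of $n$ numbers drawn from $U(0,1)$, an entry $x_i < p$ counts as a Bernoulli "success", the measurable quantity $S$ is the number of successes, and a Metropolis-Hastings acceptance step with bias $e^{-S/\Theta}$ shifts sampling into a chosen tail of the distribution of $S$. All names and parameter values (`n`, `p`, `theta`, `n_sweeps`) are illustrative.

```python
import math
import random

def n_successes(x, p):
    """Measurable quantity S: number of entries below p (Bernoulli successes)."""
    return sum(1 for xi in x if xi < p)

def mcmc_biased_bernoulli(n=100, p=0.5, theta=-2.0, n_sweeps=500, seed=1):
    """Metropolis-Hastings sampling of a length-n Bernoulli process,
    biased by exp(-S/theta); negative theta favors large S (right tail)."""
    rng = random.Random(seed)
    x = [rng.random() for _ in range(n)]   # state: vector of U(0,1) entries
    S = n_successes(x, p)
    samples = []
    for sweep in range(n_sweeps):
        for _ in range(n):                 # one sweep = n single-entry moves
            i = rng.randrange(n)
            old = x[i]
            x[i] = rng.random()            # proposal: redraw one U(0,1) entry
            S_new = n_successes(x, p)
            # Metropolis-Hastings acceptance for the bias exp(-S/theta)
            if rng.random() < min(1.0, math.exp(-(S_new - S) / theta)):
                S = S_new                  # accept the move
            else:
                x[i] = old                 # reject: restore the old entry
        samples.append(S)
    return samples
```

Because the randomness lives entirely in the $U(0,1)$ vector, the same scheme carries over to other random processes: only `n_successes` needs to be replaced by the measurable quantity of interest. Recomputing $S$ from scratch after each move keeps the sketch simple; an efficient implementation would update $S$ incrementally.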

Ontology / Topics


Monte-Carlo simulations
