Past Awards
The 2010 John von Neumann Theory Prize is awarded by the Institute for Operations Research and the Management Sciences to Søren Asmussen and Peter Glynn for their outstanding contributions in applied probability and the theory of stochastic simulation.
Søren Asmussen has made fundamental contributions in many areas of applied probability and stochastic operations research, including queueing systems, large deviations and rare events, heavy-tailed phenomena, insurance-risk models, matrix-analytic algorithms and the theory of stochastic simulation. In a 1982 paper, Asmussen described the dynamics of random walks and queues conditional on rare events, thus laying the foundation for simulating these rare events. His 1987 book, Applied Probability and Queues, introduced a generation of researchers to the use of exponential changes of measure to analyze rare events and remains required reading for new researchers in queueing theory. In his 1992 paper, “Queueing Simulation in Heavy Traffic,” Asmussen proved results quantifying the variance and bias of moment estimators when simulating a single-server queue in heavy traffic. This paper, together with contemporary work by other researchers, placed on a sound footing what had until then been folklore about the difficulty of obtaining accurate estimates from simulations of queues in heavy traffic. A 2000 paper co-authored with K. Binswanger and B. Højgaard was the first to provide a rigorous analysis of a rare-event simulation problem in the heavy-tailed setting and to demonstrate that this setting requires an entirely different analytical approach. This pioneering paper has inspired a generation of research on rare-event simulation with heavy-tailed distributions and opened up applications in telecommunications, reliability, insurance and finance.
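The key idea of this heavy-tailed line of work is that the rare event is typically caused by a single large increment, so that increment can be integrated out analytically instead of being tilted exponentially. The sketch below illustrates conditional Monte Carlo in this spirit for the tail of a sum of Pareto increments; it uses the later Asmussen–Kroese form of the estimator, which conditions on the last increment being the maximum, and all parameter values are illustrative rather than taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Pareto tail with unit scale: P(X > x) = x**(-ALPHA) for x >= 1.
ALPHA = 1.5

def pareto_tail(x):
    return np.where(x <= 1.0, 1.0, x ** -ALPHA)

def conditional_mc(u, n, reps):
    """Conditional Monte Carlo estimate of P(X_1 + ... + X_n > u)."""
    # Draw only n-1 increments per replication; the n-th is handled
    # analytically, conditional on it being the largest.
    x = rng.pareto(ALPHA, size=(reps, n - 1)) + 1.0  # Pareto on [1, inf)
    s = x.sum(axis=1)                                # S_{n-1}
    m = x.max(axis=1)                                # M_{n-1}
    # P(X_n > max(M_{n-1}, u - S_{n-1})) in closed form, times n.
    z = n * pareto_tail(np.maximum(m, u - s))
    return z.mean(), z.std(ddof=1) / np.sqrt(reps)

est, se = conditional_mc(u=100.0, n=10, reps=100_000)
print(f"P(S_10 > 100) ~ {est:.3e} +/- {se:.1e}")
```

Unbiasedness follows from exchangeability: the maximum is equally likely to be any of the n increments, so P(S_n > u) = n·P(S_n > u, X_n = M_n), and the last probability is computed in closed form given the first n−1 increments. Unlike the crude indicator estimator, estimators of this type can be shown to have far better relative-error behavior as u grows.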
Peter Glynn has made sustained and important contributions to stochastic simulation theory over the last 30 years. He has authored or co-authored landmark papers on rare-event simulation, the regenerative method of output analysis, standardized time series for output analysis and gradient estimation. His contributions have deepened the theory and broadened the applicability of stochastic simulation and have revealed rich connections with applied probability. Glynn’s 1989 paper with D.L. Iglehart, “Importance Sampling for Stochastic Simulations,” explained how to extend importance sampling from finite-dimensional settings to problems involving stochastic processes over finite horizons, thereby strengthening the theory and popularizing the approach. Much of Glynn’s work on steady-state estimation builds on and develops the regenerative method, which is intimately linked to the modern theory of Harris recurrence for general state-space Markov processes. Here, too, Glynn has written a crucial paper, “Simulation Output Analysis Using Standardized Time Series,” co-authored with D.L. Iglehart, as well as multiple papers on the widely used method of batch means. Research on eliminating initial-transient bias in steady-state simulation has been revolutionized by the development of methods for exact sampling from steady-state distributions. A key early reference on this topic is the 1992 paper “Stationarity Detection in the Initial Transient Problem,” co-authored with Asmussen and H. Thorisson, which introduced the first rigorous simulation-based algorithm for generating samples from the stationary distribution of a Markov chain. Glynn has also been highly influential through supervising over 30 doctoral students, who have themselves made numerous significant contributions at the interface between stochastic simulation and applied probability.
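Of the output-analysis techniques named above, batch means is the simplest to state: cut a long post-warm-up run into contiguous batches and treat the batch means as approximately i.i.d. normal. Below is a minimal numpy sketch; the batch count and the AR(1) test process are illustrative choices of this example, not of the papers.

```python
import numpy as np

def batch_means_ci(y, n_batches=20, crit=2.09):
    """Batch-means confidence interval for the steady-state mean of y.

    Splits the output series into n_batches contiguous batches; for a
    long run the batch means are approximately i.i.d. normal, so their
    sample variance yields an asymptotically valid interval.  crit=2.09
    is roughly the 97.5% t-quantile with 19 degrees of freedom.
    """
    y = np.asarray(y, dtype=float)
    b = len(y) // n_batches                          # batch length
    means = y[: b * n_batches].reshape(n_batches, b).mean(axis=1)
    half = crit * means.std(ddof=1) / np.sqrt(n_batches)
    return means.mean(), half

# Illustrative output process: a stationary AR(1) with known mean 0.
rng = np.random.default_rng(1)
x, out = 0.0, []
for _ in range(200_000):
    x = 0.9 * x + rng.normal()
    out.append(x)
center, half = batch_means_ci(out[10_000:])          # discard warm-up
print(f"95% CI for the steady-state mean: {center:.3f} +/- {half:.3f}")
```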
Asmussen and Glynn have not simply solved a series of difficult problems; they have helped create a modern theory of what might be called computational probability. Key questions for any computational method are efficiency, convergence and error analysis; in a stochastic setting, these are grounded in probabilistic bounds and limit theorems. Their 2007 joint book, Stochastic Simulation: Algorithms and Analysis, has set the standard for doctoral-level study in the field and will undoubtedly influence its direction for many years to come.
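For concreteness, the efficiency notions that organize much of this theory can be stated for a family of unbiased estimators Z_u of vanishing probabilities z(u) = E[Z_u]. The formulation below is the standard one from this literature, sketched here rather than quoted from the book:

```latex
% Standard efficiency criteria for rare-event estimators Z_u with
% z(u) = E[Z_u] -> 0 as u -> infinity.
\[
  \textbf{Bounded relative error:}\qquad
  \limsup_{u \to \infty} \frac{\operatorname{Var}(Z_u)}{z(u)^2} < \infty ,
\]
\[
  \textbf{Logarithmic efficiency:}\qquad
  \lim_{u \to \infty} \frac{\log \mathbb{E}[Z_u^2]}{2 \log z(u)} = 1 ,
\]
% i.e. the second moment decays at essentially the best rate
% E[Z_u^2] >= z(u)^2 permitted by Jensen's inequality.
```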
The challenge of efficient rare-event simulation arises in many areas of applied probability and has motivated the development of theory and algorithms for over thirty years. Most advances in the field have addressed models built from light-tailed distributions and have relied on importance sampling through an exponential change of measure.
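To illustrate this light-tailed baseline, the sketch below estimates P(S_n > na) for a Gaussian random walk by sampling from the exponentially tilted distribution, which for N(0,1) increments is simply N(θ,1); the choice θ = a centers the new measure on the rare event. The Gaussian model and all parameter values are illustrative assumptions of this example.

```python
import numpy as np

rng = np.random.default_rng(2)

def tilted_tail_estimate(n=100, a=0.5, reps=100_000):
    """Importance-sampling estimate of P(S_n > n*a) for a N(0,1) walk.

    Sampling the increments from N(theta, 1) with theta = a makes the
    rare event typical, while reweighting each path by the likelihood
    ratio exp(-theta*S_n + n*theta**2/2) keeps the estimator unbiased.
    """
    theta = a
    x = rng.normal(loc=theta, size=(reps, n))        # tilted increments
    s = x.sum(axis=1)
    lr = np.exp(-theta * s + n * theta**2 / 2.0)     # likelihood ratio
    z = lr * (s > n * a)
    return z.mean(), z.std(ddof=1) / np.sqrt(reps)

est, se = tilted_tail_estimate()
print(f"P(S_100 > 50) ~ {est:.3e} +/- {se:.1e}")
# Exact value for comparison: P(N(0,1) > 5) ~ 2.87e-7.
```

Crude Monte Carlo would need on the order of millions of replications per occurrence of this event; under the tilted measure the event occurs about half the time, and the likelihood ratio restores unbiasedness.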
This series of papers marks a major advance in the design and analysis of provably efficient simulation algorithms. The papers contain new algorithms for simulating rare-event probabilities in multi-dimensional random walks, multiple-server queues, and counting problems in combinatorial optimization and statistics. In particular, they address queueing systems with heavy tails, a setting that has attracted much attention from researchers.
A unifying theme of the papers is an innovative analysis through Lyapunov functions. The papers show that this approach can be used in problems as diverse as multi-dimensional queueing systems and combinatorial counting problems. By introducing a systematic approach to the development and analysis of algorithms, the papers have opened new opportunities for research in the field.
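A hedged sketch of how such a Lyapunov argument typically runs (a generic formulation, not the statement from any one of the papers): for a state-dependent importance-sampling estimator of a hitting probability, a supermartingale condition on a nonnegative function h bounds the estimator's second moment.

```latex
% Generic Lyapunov bound for a state-dependent importance-sampling
% estimator of u(x) = P_x(X_tau in A), simulated under a kernel q with
% one-step likelihood ratio r(x,y) = p(x,y)/q(x,y) and estimator
%   Z = prod_{k=0}^{tau-1} r(X_k, X_{k+1}) * 1{X_tau in A}.
\[
  \mathbb{E}^{q}_{x}\!\bigl[\, r(x, X_1)^2\, h(X_1) \bigr] \le h(x)
  \quad (x \notin A),
  \qquad
  h(x) \ge 1 \quad (x \in A)
\]
\[
  \Longrightarrow\qquad
  \mathbb{E}^{q}_{x}\!\bigl[ Z^2 \bigr] \le h(x).
\]
% Efficiency then follows by exhibiting an h with h(x) = O(u(x)^2).
```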