The Past, Present and Future of Statistical Methodology for Computer Experiments

Chair: Murali Haran, Penn State University
Organizer: Murali Haran, Penn State University
Organizer: Bruno Sanso, University of California-Santa Cruz
Organizer: Derek Bingham, Simon Fraser University
Organizer: Daniel Williamson, University of Exeter
 
Tuesday, Aug 5: 10:30 AM - 12:20 PM
Session 0284: Invited Paper Session
Music City Center, Room CC-209B

Applied: Yes

Main Sponsor

Section on Physical and Engineering Sciences

Co-Sponsors

History of Statistics Interest Group
Section on Statistical Computing
Uncertainty Quantification in Complex Systems Interest Group

Presentations

Four decades in statistics of Uncertainty Quantification for computer modeling

Nearly four decades ago, the design and analysis of computer experiments began to emerge as a distinct subdiscipline of statistics. That emergence, and the explosive growth in methodology that followed, has paralleled the rise of virtual science and engineering: field and lab studies have been complemented, or even supplanted, by experiments with data from mathematical models. Statistics has proven crucial in Uncertainty Quantification (UQ) for such models, allowing assessment of the accuracy of model predictions and of the uncertainties associated with all aspects of computer modeling. This presentation will cover the highlights of these four decades of history.

Speaker

James Berger, Duke University

Multivariate and functional output emulation

The output of almost any computational model is multivariate, even after postprocessing. This multivariate output is typically produced on a fixed support space; the support might be indexed by a lattice of spatial coordinates, by wavenumber k from a computed power spectrum, or by time in weeks of an epidemic. In such cases the output is supported on a well-defined, ordered space where concepts such as interpolation, refinement, and coarsening are sensible operations, and the multivariate output can also be described as functional. This talk will focus on computationally efficient strategies for constructing Gaussian process-based multivariate/functional emulators.
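One widely used strategy for emulating functional output on a fixed grid (a minimal sketch only, not necessarily the approach of this talk) is to project the simulator runs onto a principal-component basis and fit an independent GP to each basis coefficient. The toy "simulator" and all names below are illustrative assumptions, using numpy and scikit-learn:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
n_runs, n_grid = 30, 50
t = np.linspace(0, 1, n_grid)            # common output support (e.g., time)
X = rng.uniform(0, 1, (n_runs, 2))       # simulator inputs
# toy "simulator": each run produces a curve over t
Y = np.sin(2 * np.pi * np.outer(X[:, 0], t)) + X[:, 1][:, None] * t

# basis reduction: center the runs, take a truncated SVD
mean = Y.mean(axis=0)
U, s, Vt = np.linalg.svd(Y - mean, full_matrices=False)
k = 3                                    # retained basis size
coeffs = U[:, :k] * s[:k]                # per-run basis coefficients

# one independent GP per basis coefficient
gps = [GaussianProcessRegressor(kernel=RBF(0.3), normalize_y=True).fit(X, coeffs[:, j])
       for j in range(k)]

def emulate(x_new):
    """Predict full output curves at new input settings."""
    c = np.column_stack([gp.predict(x_new) for gp in gps])
    return mean + c @ Vt[:k]

pred = emulate(np.array([[0.5, 0.5]]))
print(pred.shape)  # (1, 50): one emulated curve on the original grid
```

Working with k coefficient GPs instead of n_grid output GPs is what makes this computationally efficient when the output grid is large.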

Co-Author(s)

Maike Holthuijzen, Sandia National Labs
Sierra Merkes, Virginia Tech Statistics Department

Speaker

David Higdon, Virginia Tech

Non-stationary Gaussian Process Surrogates

We provide a survey of non-stationary surrogate models that utilize Gaussian processes (GPs) or variations thereof, including non-stationary kernel adaptations, partition and local GPs, and spatial warpings through deep Gaussian processes. We also review publicly available software implementations and conclude with a bake-off involving an 8-dimensional satellite drag computer experiment. Code for this example is provided in a public git repository.
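The partition-GP idea mentioned above can be sketched in a few lines (an illustrative toy, not any of the surveyed packages): fit independent GPs on subregions of the input space so that each can learn its own lengthscale, accommodating a response that is smooth in one region and wiggly in another. The split point and model choices below are assumptions for illustration:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, (80, 1))
# piecewise behavior: linear for x < 0.5, rapidly oscillating for x >= 0.5
y = np.where(X[:, 0] < 0.5, X[:, 0], np.sin(25 * X[:, 0]))

# fit a separate GP on each side of a fixed partition at x = 0.5
regions = [X[:, 0] < 0.5, X[:, 0] >= 0.5]
gps = [GaussianProcessRegressor(kernel=RBF(0.1), normalize_y=True).fit(X[m], y[m])
       for m in regions]

def predict(x_new):
    """Route each prediction point to the GP for its region."""
    out = np.empty(len(x_new))
    left = x_new[:, 0] < 0.5
    out[left] = gps[0].predict(x_new[left])
    out[~left] = gps[1].predict(x_new[~left])
    return out

x_test = np.linspace(0, 1, 9).reshape(-1, 1)
print(predict(x_test))
```

A single stationary GP would be forced to compromise on one lengthscale for both regimes; the partition lets each fitted kernel adapt locally, at the cost of discontinuities at region boundaries.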

Keywords

deep Gaussian process 

Speaker

Annie Booth, NC State University

Design of Experiments for Emulations: A Review

Space-filling designs are crucial for efficient computer experiments, enabling accurate surrogate modeling and uncertainty quantification in many scientific and engineering applications, such as digital twin and cyber-physical systems.
In this work, we provide a comprehensive review of key design methodologies, including maximin/minimax designs, Latin hypercubes, and projection-based designs. Moreover, we connect space-filling design criteria, such as the fill distance, to Gaussian process performance.
Numerical studies are conducted to investigate the practical trade-offs among the various design types, with discussion of emerging challenges in high-dimensional and constrained settings.
The paper concludes with future directions in adaptive sampling and machine-learning integration, providing guidance for improving computational experiments.
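As a small illustration of two of the design families above (a sketch under assumed settings, not the paper's numerical study), one can draw a Latin hypercube with scipy and compare its maximin criterion, the minimum pairwise distance between design points, against a plain random design:

```python
import numpy as np
from scipy.stats import qmc
from scipy.spatial.distance import pdist

rng = np.random.default_rng(2)
n, d = 20, 2

# Latin hypercube: each 1-D projection is stratified into n equal bins
lhs = qmc.LatinHypercube(d=d, seed=2).random(n)
# baseline: i.i.d. uniform points in [0, 1]^2
rand = rng.uniform(size=(n, d))

# maximin criterion: larger minimum pairwise distance = better spread
print("LHS maximin criterion:   ", round(pdist(lhs).min(), 3))
print("random maximin criterion:", round(pdist(rand).min(), 3))
```

The stratified 1-D projections are what make Latin hypercubes attractive for GP emulation: no input dimension is left with clustered or empty regions, which matters when only a few inputs turn out to be active.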

Keywords

computer experiment

design of experiment

emulation

surrogate models 

Speaker

Lulu Kang, University of Massachusetts Amherst

Sensitivity Analysis in Practice

The modern practitioner of uncertainty quantification (UQ) for complex models is fortunate to have at their disposal developments from more than three decades of research in the field of computer experiments. Major developments include efficient design schemes, elegant approaches to combining data sources, and innovative techniques for sensitivity analysis. Sensitivity analysis seeks to explain uncertainty in model outputs in terms of uncertainty in model inputs. In this talk, we describe various sensitivity analysis approaches with an eye toward their practical use. We cover approaches that require many model runs, such as Sobol', Shapley, and delta sensitivity indices, as well as approaches that require fewer model runs, such as derivative-based global sensitivity measures and design-based approaches. We also discuss the practical utility of emulators for sensitivity analysis and a few open questions.
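To make the Sobol' idea concrete (a hypothetical pick-freeze Monte Carlo sketch, not the talk's material), consider the toy additive model f(x) = x1 + 2*x2 with independent U(0,1) inputs, for which the exact first-order indices are S1 = 1/5 and S2 = 4/5:

```python
import numpy as np

rng = np.random.default_rng(3)
f = lambda x: x[:, 0] + 2 * x[:, 1]
n, d = 200_000, 2

A = rng.uniform(size=(n, d))   # base sample
B = rng.uniform(size=(n, d))   # independent resample
fA = f(A)

S = []
for i in range(d):
    ABi = B.copy()
    ABi[:, i] = A[:, i]        # "freeze" input i, resample all others
    # first-order index: Cov(f(A), f(AB_i)) / Var(f)
    S.append(np.cov(fA, f(ABi))[0, 1] / fA.var())

print([round(s, 2) for s in S])  # close to [0.2, 0.8]
```

The many-model-runs character of this approach is plain: each index costs a fresh set of n evaluations, which is exactly where emulators become useful when the simulator itself is expensive.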

Keywords

Sobol sensitivity

emulator

computer experiment

uncertainty quantification 

Speaker

Devin Francom, Los Alamos National Laboratory