Advances in Statistical Learning and Uncertainty Quantification: Theory and Computation

Abstract Number:

1648 

Submission Type:

Topic-Contributed Paper Session 

Participants:

Mengyang Gu (1), Weining Shen (2), Yun Yang (3), Chih-Li Sung (4), Jiaoyang Huang (5), Mengyang Gu (1), Omar Al-Ghattas (6)

Institutions:

(1) University of California-Santa Barbara, (2) University of California, Irvine, (3) University of Illinois Urbana-Champaign, (4) Michigan State University, (5) University of Pennsylvania, (6) University of Chicago

Chair:

Weining Shen  
University of California, Irvine

Session Organizer:

Mengyang Gu  
University of California-Santa Barbara

Speaker(s):

Yun Yang  
University of Illinois Urbana-Champaign
Chih-Li Sung  
Michigan State University
Jiaoyang Huang  
University of Pennsylvania
Mengyang Gu  
University of California-Santa Barbara
Omar Al-Ghattas  
University of Chicago

Session Description:

Statistical learning approaches are indispensable for accelerating scientific discovery in the age of artificial intelligence. Statistical theory and methods are increasingly used to predict costly computer simulations of integral and differential equations, and to inversely estimate system parameters from correlated experimental or field observations, including images, spatio-temporal data, and functional data. A key advantage of a statistical or probabilistic model is its internal uncertainty quantification, which has become the basis of various modern machine learning algorithms, such as adaptive designs for active learning and Bayesian optimization.
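As a concrete illustration of this point (a minimal sketch, not drawn from any speaker's work: the Gaussian-process surrogate, squared-exponential kernel, toy objective, and expected-improvement acquisition are all illustrative assumptions), the code below shows how a probabilistic model's predictive variance can drive the choice of the next design point in a simple Bayesian optimization loop.

# Sketch: predictive uncertainty of a GP surrogate drives an expected-improvement
# acquisition, illustrating uncertainty quantification as the basis of Bayesian
# optimization. Kernel, noise level, and toy objective are illustrative assumptions.
import numpy as np
from scipy.stats import norm

def rbf_kernel(X1, X2, lengthscale=0.3, variance=1.0):
    # Squared-exponential covariance between two sets of 1-D inputs.
    d = X1[:, None] - X2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior(X_train, y_train, X_test, noise=1e-6):
    # GP predictive mean and variance at X_test given noisy observations.
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    Ks = rbf_kernel(X_train, X_test)
    Kss = rbf_kernel(X_test, X_test)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(Kss) - np.sum(v ** 2, axis=0)
    return mean, np.maximum(var, 1e-12)

def expected_improvement(mean, var, best_y):
    # Acquisition trading off predictive mean against predictive uncertainty
    # (minimization convention).
    sd = np.sqrt(var)
    z = (best_y - mean) / sd
    return sd * (z * norm.cdf(z) + norm.pdf(z))

# Toy loop: each new design point is chosen where expected improvement is largest.
rng = np.random.default_rng(0)
f = lambda x: np.sin(3 * x) + 0.1 * x          # stand-in for an expensive simulator
X = rng.uniform(0, 2, size=3)
y = f(X)
grid = np.linspace(0, 2, 200)
for _ in range(5):
    mu, s2 = gp_posterior(X, y, grid)
    x_next = grid[np.argmax(expected_improvement(mu, s2, y.min()))]
    X, y = np.append(X, x_next), np.append(y, f(x_next))
print("best observed value:", y.min())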

Various challenging problems, including high-dimensional parameter spaces, complex data structures, and highly nonlinear maps, emerge from scientific theories and mathematical models. These problems call for more efficient and scalable statistical approaches to prediction and estimation. In recent years, substantial progress has been made on these challenges through new statistical theory and algorithms for dynamical systems, graphs, functions, and manifolds. These advances enable a wide range of applications in science and engineering, from image analysis to climate modeling. This session will bring together experts to share their latest findings in theory and computation and to outline future directions for statistical learning and uncertainty quantification in complex systems, making a timely contribution to the 2024 JSM program.

The tentative titles of the five talks are as follows.
1. Yun Yang (University of Illinois Urbana-Champaign, Associate Professor): Adaptivity of Diffusion Models to Manifold Structures
2. Chih-Li Sung (Michigan State University, Assistant Professor): Stacking designs: designing multi-fidelity computer experiments with target predictive accuracy
3. Jiaoyang Huang (University of Pennsylvania, Assistant Professor): Efficient derivative-free Bayesian inference for large-scale inverse problems
4. Mengyang Gu (University of California, Santa Barbara, Assistant Professor): Fast ab initio uncertainty quantification and Bayesian inference for dynamical systems
5. Omar Al-Ghattas (University of Chicago, PhD Student): High- and infinite-dimensional analysis of ensemble Kalman methods

Sponsors:

Section on Bayesian Statistical Science 3
Section on Physical and Engineering Sciences 2
Uncertainty Quantification in Complex Systems Interest Group 1

Theme: Statistics and Data Science: Informing Policy and Countering Misinformation

Yes

Applied

Yes

Estimated Audience Size:

Small (<80)

I have read and understand that JSM participants must abide by the Participant Guidelines.

Yes

I understand and have communicated to my proposed speakers that JSM participants must register and pay the appropriate registration fee by June 1, 2024. The registration fee is nonrefundable.

I understand