Advances in Inference and Theory for Bayesian Neural Networks

Abstract Number:

1213 

Submission Type:

Invited Paper Session 

Participants:

Natalie Klein (1), Giosue Migliorini (2), Eric Nalisnick (3), Babak Shahbaba (4), Beau Coker (5), Alexander Immer (6), Andrew Wilson (7), Maurizio Filippone (8)

Institutions:

(1) Los Alamos National Laboratory, Los Alamos, NM, (2) University of California Irvine, Irvine, CA, (3) University of Amsterdam, Amsterdam, Netherlands, (4) UCI, Irvine, CA, (5) Harvard University, Cambridge, MA, (6) ETH Zurich, Zurich, Switzerland, (7) NYU, New York, NY, (8) EURECOM, Biot, France

Chair:

Giosue Migliorini  
University of California Irvine

Co-Organizer:

Giosue Migliorini  
University of California Irvine

Session Organizer:

Natalie Klein  
Los Alamos National Laboratory

Speaker(s):

Eric Nalisnick  
University of Amsterdam
Babak Shahbaba  
UCI
Beau Coker  
Harvard University
Alexander Immer  
ETH Zurich
Andrew Wilson  
NYU
Maurizio Filippone  
EURECOM

Session Description:

This session focuses on recent advances in inference and theory for Bayesian neural networks. Neural networks exhibit remarkable flexibility as parametric models and have recently been used to achieve impressive results in tasks as varied as realistic image generation, automation (e.g., self-driving cars), and natural language comprehension. Despite the empirical success of these models, the statistical treatment of neural networks remains an active area of research. As neural network models are increasingly applied to high-consequence areas such as medical and physical sciences, it is critical to better understand the behavior of neural network models and to interrogate the utility of prevailing approximate uncertainty quantification techniques for neural networks.

The talks in this session focus on recent advances in methodology and theory for Bayesian neural networks. A Bayesian treatment offers a principled approach to uncertainty quantification and model selection, as well as a lens through which to understand functional properties of stochastic neural networks (for example, connections between Bayesian neural networks and more familiar statistical models such as Gaussian processes). Conventionally, Bayesian approaches to neural networks posit a prior distribution on the neural network parameters, but it is difficult to make well-informed prior choices, and the high-dimensional nature of modern neural networks typically results in challenging posterior inference. The invited speakers are leaders in the field of scalable inference methods for Bayesian neural networks, and their work truly lies at the intersection of statistics and machine learning, with recent work appearing in high-profile machine learning venues. Thus, this session offers not only the opportunity for statisticians to learn about the latest advances in the Bayesian treatment of neural network models, but also the opportunity for leading machine learning researchers to connect more deeply with the statistics community. We anticipate a timely and exciting session with broad appeal to JSM attendees.

The six confirmed speakers come from a variety of institutions and contribute a very appealing mixture of statistics and machine learning expertise. The tentative talk titles reveal a diversity of approaches to understanding and inference for Bayesian neural networks, including recent advances in approximate inference techniques ("The Boons of Being Less Bayesian: A Study of Partially Stochastic Neural Networks" - Eric Nalisnick, University of Amsterdam; "Approximate Inference in Bayesian Neural Networks" - Babak Shahbaba, University of California Irvine), model selection ("Bayesian Hyperparameter Selection for Neural Networks" - Alexander Immer, ETH Zurich; "Is Bayesian Model Selection Aligned with Model Generalization?" - Andrew Gordon Wilson, NYU), and theory ("Wide Mean-Field Bayesian Neural Networks Ignore the Data" - Beau Coker, Harvard University; "Imposing Functional Priors on Bayesian Neural Networks" - Maurizio Filippone, EURECOM).

Sponsors:

IMS 3
Section on Bayesian Statistical Science 2
Section on Physical and Engineering Sciences 1

Theme: Statistics and Data Science: Informing Policy and Countering Misinformation

No

Applied

No

Estimated Audience Size

Large (150-275)

I have read and understand that JSM participants must abide by the Participant Guidelines.

Yes

I understand and have communicated to my proposed speakers that JSM participants must register and pay the appropriate registration fee by June 1, 2024. The registration fee is nonrefundable.

I understand