Probabilistic Symmetry, Variable Exchangeability, and Deep Network Learning Invariance and Equivariance
Ivo Dinov (Co-Author), Statistics Online Computational Resource
Tuesday, Aug 5, 2:00 PM – 3:50 PM
Invited Paper Session
This talk will first describe a mathematical-statistics framework for representing, modeling, and utilizing the invariance and equivariance properties of deep neural networks. By drawing direct parallels among invariance and equivariance principles, probabilistic symmetry, and statistical inference, we explore the foundational properties underpinning reliability in deep learning models. We examine group-theoretic invariance in several deep neural network architectures, including multilayer perceptrons, convolutional networks, transformers, variational autoencoders, and steerable neural networks.
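For intuition (an illustration, not material from the talk): a map f is G-invariant when f(g·x) = f(x), and G-equivariant when f(g·x) = g·f(x), for every group element g. The minimal NumPy sketch below, using hypothetical helpers circular_conv and translate and a 1-D toy signal, numerically checks that a circular convolution (a stand-in for a CNN layer) is equivariant under the cyclic translation group, and that composing it with global max pooling yields translation invariance:

```python
import numpy as np

def circular_conv(x, w):
    """Circular cross-correlation of 1-D signal x with kernel w:
    a toy stand-in for a CNN layer acting on the translation group Z_n."""
    n = len(x)
    return np.array([np.dot(w, np.roll(x, -i)[:len(w)]) for i in range(n)])

def translate(x, s):
    """Action of the cyclic translation group: shift the signal by s."""
    return np.roll(x, s)

rng = np.random.default_rng(0)
x = rng.normal(size=16)   # toy input signal
w = rng.normal(size=5)    # toy convolution kernel
s = 3                     # group element: translate by 3

# Equivariance: f(g . x) == g . f(x) -- convolving a shifted signal
# equals shifting the convolved signal.
assert np.allclose(circular_conv(translate(x, s), w),
                   translate(circular_conv(x, w), s))

# Invariance: composing the equivariant layer with a symmetric pooling
# map (global max) removes the translation action entirely.
assert np.isclose(circular_conv(translate(x, s), w).max(),
                  circular_conv(x, w).max())
print("translation equivariance and pooled invariance verified")
```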
Understanding the theoretical foundations underpinning deep neural network invariance is critical for reliable estimation of prior-predictive distributions, accurate posterior inference, and consistent AI prediction, classification, and forecasting. Two relevant data studies will be presented: one on a theoretical physics dataset and the other on an fMRI music dataset. Biomedical and imaging applications are discussed at the end.
Keywords: invariance, equivariance, probabilistic symmetry, (Lie) group representations, statistical inference.