A Generalized Extension of Finite Mixture Models
Monday, Aug 4: 2:20 PM - 2:35 PM
2465
Contributed Papers
Music City Center
Finite mixture models are widely used in settings where a random variable is assumed to follow some distribution conditioned on a categorical latent variable. Under the standard finite mixture model regime, each observation is associated with a single latent variable. More complex regimes exist, such as semi-supervised finite mixture models, in which either some of the latent variables are known or multiple observations depend on the same latent variable. Further variations include covariance clustering mixture models, finite mixture model discriminant analysis, and parsimonious mixture models, among many others. We propose a generalized extension of finite mixture models in which each observation follows some distribution conditioned on an arbitrary number of latent and/or known variables sampled from an arbitrary number of categorical distributions. It can be shown that a vast number of existing models can be expressed in this form. We derive the Expectation-Maximization (EM) algorithm for this generalized extension, which allows for the rapid development, implementation, and estimation of new models that follow this form.
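For readers unfamiliar with the baseline being generalized, the following is a minimal illustrative sketch (not the authors' code) of EM for a standard two-component univariate Gaussian finite mixture, the single-latent-variable regime described above. The synthetic data, initial values, and variable names are all assumptions chosen for demonstration.

```python
import numpy as np

# Synthetic data: two well-separated Gaussian components (an assumption
# for illustration; the abstract does not specify any data set).
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2.0, 1.0, 200),
                    rng.normal(3.0, 1.0, 200)])

# Initial parameter guesses
pi = np.array([0.5, 0.5])      # mixing weights of the categorical latent variable
mu = np.array([-1.0, 1.0])     # component means
sigma = np.array([1.0, 1.0])   # component standard deviations

def normal_pdf(v, m, s):
    """Univariate Gaussian density, evaluated elementwise."""
    return np.exp(-0.5 * ((v - m) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

for _ in range(50):
    # E-step: responsibilities, i.e. the posterior probability of each
    # observation's latent component given the current parameters.
    dens = pi * normal_pdf(x[:, None], mu, sigma)   # shape (n, 2)
    resp = dens / dens.sum(axis=1, keepdims=True)

    # M-step: responsibility-weighted maximum-likelihood updates.
    nk = resp.sum(axis=0)
    pi = nk / len(x)
    mu = (resp * x[:, None]).sum(axis=0) / nk
    sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)

print(np.round(pi, 2), np.round(mu, 2))
```

The generalized extension proposed in the talk replaces the single categorical latent variable here with an arbitrary number of latent and/or known categorical variables, so the E-step posterior and M-step weights range over that larger latent structure.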
Clustering
Classification
Few-Shot Learning
Main Sponsor
Section on Statistical Computing