Mallows Ranking Models with Learned Distance Metrics: Sampling and Maximum Likelihood Estimation

Kiana Asgari, Co-Author (Stanford)

Amin Saberi, Co-Author (Stanford)

Yeganeh Alimohammadi, First Author and Presenting Author
 
Wednesday, Aug 6: 10:50 AM - 11:05 AM
Session 2349, Contributed Papers
Music City Center
We study the problem of rank elicitation within the Mallows model, where a permutation $\pi$ is sampled with probability proportional to $\exp(-\beta\, d(\pi, \sigma))$, with $\sigma$ a central ranking and $\beta > 0$ a dispersion parameter. Specifically, we study a general class of $L_\alpha$ distances $d_\alpha(\pi, \sigma) = \sum_{i=1}^n |\pi(i) - \sigma(i)|^\alpha$. For any $\alpha \geq 1$ and any $\beta > 0$, we present a Polynomial-Time Approximation Scheme (PTAS) that achieves two key objectives: (i) estimating the partition function $Z_n(\beta, \alpha)$ within a multiplicative factor of $1 \pm \epsilon$, and (ii) generating samples that are $\epsilon$-close, in total variation distance, to the true distribution. Leveraging this approximation, we design an efficient Maximum Likelihood Estimation (MLE) algorithm that jointly infers the underlying central ranking $\sigma$, the dispersion parameter $\beta$, and the distance parameter $\alpha$. This work is the first to study metric learning alongside preference learning in the context of Mallows models.
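The model above can be made concrete with a small brute-force sketch: for tiny $n$, the partition function $Z_n(\beta, \alpha)$ and exact samples can be obtained by enumerating all $n!$ permutations. This is purely illustrative (all function names here are hypothetical, and the enumeration is exactly what the paper's PTAS is designed to avoid at scale):

```python
import itertools
import math
import random

def l_alpha_distance(pi, sigma, alpha):
    # d_alpha(pi, sigma) = sum_i |pi(i) - sigma(i)|^alpha
    return sum(abs(p - s) ** alpha for p, s in zip(pi, sigma))

def mallows_brute_force(sigma, beta, alpha):
    """Exact partition function and one exact sample, by enumerating
    all n! permutations. Feasible only for small n; the paper's PTAS
    replaces this enumeration with a (1 +/- eps) approximation."""
    n = len(sigma)
    perms = list(itertools.permutations(range(1, n + 1)))
    # Unnormalized Mallows weights exp(-beta * d_alpha(pi, sigma))
    weights = [math.exp(-beta * l_alpha_distance(p, sigma, alpha)) for p in perms]
    Z = sum(weights)  # partition function Z_n(beta, alpha)
    sample = random.choices(perms, weights=weights, k=1)[0]
    return Z, sample

# Example: central ranking sigma = identity on 4 items, L_2 distance
sigma = (1, 2, 3, 4)
Z, pi = mallows_brute_force(sigma, beta=0.5, alpha=2)
```

As a sanity check, at $\beta = 0$ the distribution is uniform over permutations and $Z_n$ equals $n!$; as $\beta$ grows, mass concentrates on $\sigma$ itself.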

Keywords

Rank Elicitation

Mallows Model

MLE

Preference Elicitation 

Main Sponsor

Business and Economic Statistics Section