Wednesday, Aug 6: 8:30 AM - 10:20 AM
7001
Invited Paper Session
Music City Center
Room: CC-101A
Main Sponsor
Memorial
Co-Sponsors
History of Statistics Interest Group
Presentations
Dr. Bill Strawderman's contributions to statistical theory have profoundly shaped modern perspectives on estimation and decision theory. I was fortunate to be his colleague and to benefit from his guidance, both academically and personally. In this memorial session, I focus on the framework of confidence distributions, which we co-developed with Kesar Singh. This talk revisits the problem of combining information from multiple sources—frequentist, Bayesian, or otherwise—within the confidence distribution framework, and sheds light on the paradox of discrepant posterior distributions. By connecting these themes, we honor Strawderman's deep commitment to principled inference and methodological rigor, while also presenting recent developments that carry his legacy forward. Particular attention will be given to the foundational and practical implications of these ideas for unifying Bayesian and frequentist approaches to inference, especially in the context of modern statistical science, machine learning, and artificial intelligence.
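As background for the combining problem mentioned above, the normal-quantile combination rule of Singh, Xie, and Strawderman can be sketched concretely. The snippet below is a minimal illustration, assuming k independent sources each supplying a confidence distribution value H_i(theta) at the parameter value of interest; it illustrates the general framework, not the new results of the talk.

```python
import math
from statistics import NormalDist

def combine_cds(cd_values):
    """Combine independent confidence distribution values H_1(theta), ...,
    H_k(theta), each Uniform(0,1) at the true parameter, via the
    normal-quantile rule:  Hc(theta) = Phi( sum_i Phi^{-1}(H_i(theta)) / sqrt(k) ).
    Returns the combined confidence distribution evaluated at theta."""
    nd = NormalDist()
    z = sum(nd.inv_cdf(u) for u in cd_values)
    return nd.cdf(z / math.sqrt(len(cd_values)))
```

For example, two unit-variance normal-mean confidence distributions H_i(theta) = Phi(theta - x_i) with observations x_1 = 1 and x_2 = 3 combine, at theta = 2, to exactly 0.5: the combined distribution puts its median at the pooled estimate, as one would hope.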
We study the construction of prior distributions that yield Bayes minimax estimators of a normal mean vector. Particular attention is paid to priors that are not scale mixtures of normal distributions.
For the canonical problem of estimating a multivariate normal mean under squared error loss, minimax multiple shrinkage estimators adaptively shrink estimates towards multiple points and subspaces, thereby enhancing the scope of potential risk reduction while maintaining the safety guarantee of minimaxity. Motivated by a Bayesian point of view, the construction of such minimax estimators has relied, up to now, on mixtures of improper priors yielding superharmonic marginals. Indeed, even the existence of proper Bayes minimax multiple shrinkage estimators has been a challenging open problem, one that Bill and I worked on for the last 30 years. Happily, Bill ultimately came up with a novel unbiased-estimate-of-risk argument to demonstrate, for the first time, the existence of such estimators, including proper Bayes minimax multiple shrinkage estimators based on mixtures of the Strawderman-type priors he pioneered in 1971. Not only are such multiple shrinkage estimators automatically admissible, but they also allow their adaptive mixture weights to be interpreted as valid posterior probabilities. (This work is joint with Pankaj Bhagwat and Bill Strawderman.)
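The adaptive-mixture idea behind multiple shrinkage can be sketched in a few lines. The following is an illustrative simplification, not the construction of the talk: it combines positive-part James–Stein estimates toward several targets, weighting each target by a unit-variance Gaussian likelihood of the data (a stand-in for the marginal densities under Strawderman-type priors). The mixture weights play the role of the posterior probabilities mentioned above.

```python
import math

def james_stein(x, target):
    """Positive-part James-Stein estimate shrinking x toward a fixed target
    (dimension p >= 3, unit-variance normal observations assumed)."""
    p = len(x)
    d = [xi - ti for xi, ti in zip(x, target)]
    ss = sum(di * di for di in d)
    shrink = max(0.0, 1.0 - (p - 2) / ss)
    return [ti + shrink * di for ti, di in zip(target, d)]

def multiple_shrinkage(x, targets):
    """Adaptively mix single-target shrinkage estimates. Each target is
    weighted by a unit-variance normal likelihood of x centered at it
    (an illustrative choice); the weights act like posterior probabilities
    over the candidate shrinkage targets."""
    logw = [-0.5 * sum((xi - ti) ** 2 for xi, ti in zip(x, t))
            for t in targets]
    m = max(logw)
    w = [math.exp(lw - m) for lw in logw]
    s = sum(w)
    w = [wi / s for wi in w]
    ests = [james_stein(x, t) for t in targets]
    combined = [sum(wi * e[j] for wi, e in zip(w, ests))
                for j in range(len(x))]
    return combined, w
```

When the data sit near one of the targets, nearly all the weight concentrates there and the estimate shrinks hard toward that target, while remaining sensible when no target fits, which is the risk-reduction behavior the abstract describes.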
Consider a set of observations from a common exponential family governed by potentially different univariate parameters. We wish to test a separate pair of null and alternative hypotheses for each parameter: the null hypothesis is that the parameter equals zero, and the alternative is that it exceeds a common threshold, which in most applications is called the signal strength. Suppose we know an upper bound on the number of alternatives that can be true. Common losses are based on the number of mistakes and often involve the False Discovery Rate and the False Non-discovery Rate. Most existing results establish only asymptotic optimality of procedures. Here we investigate, at a fixed sample size, the impact of knowing the maximal number of true alternatives.
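For readers less familiar with the error rates mentioned above, False Discovery Rate control is classically achieved by the Benjamini–Hochberg step-up procedure. The sketch below shows that standard procedure only, as context; it does not use the bound on the number of true alternatives that the talk investigates.

```python
def benjamini_hochberg(pvals, q=0.05):
    """Classical BH step-up procedure: with sorted p-values p_(1) <= ... <= p_(m),
    find the largest k with p_(k) <= k * q / m and reject the k hypotheses
    with the smallest p-values. Controls FDR at level q under independence.
    Returns the (sorted) indices of rejected hypotheses."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank * q / m:
            k = rank
    return sorted(order[:k])
```

For example, with p-values [0.01, 0.02, 0.03, 0.9] and q = 0.05, the step-up thresholds are 0.0125, 0.025, 0.0375, and 0.05, so the first three hypotheses are rejected.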