Scalable piecewise smoothing with BART
Monday, Aug 4: 11:55 AM - 12:15 PM
Topic-Contributed Paper Session
Music City Center
Although it is an extremely effective, easy-to-use, and increasingly popular tool for nonparametric regression, the Bayesian Additive Regression Trees (BART) model is limited by the fact that it can only produce discontinuous, piecewise-constant output. Initial attempts to overcome this limitation were based on regression trees that output Gaussian processes instead of constants. Unfortunately, implementations of these extensions cannot scale to large datasets. We propose ridgeBART, an extension of BART built with trees that output linear combinations of ridge functions (i.e., compositions of an affine transformation of the inputs with a non-linearity); that is, we build a Bayesian ensemble of localized neural networks with a single hidden layer. We develop a new MCMC sampler that updates trees in linear time and establish nearly minimax-optimal posterior contraction rates when the underlying function is piecewise smooth. We demonstrate ridgeBART's effectiveness on synthetic data and use it to estimate the probability that a professional basketball player makes a shot from any location on the court in a spatially smooth fashion.
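
For intuition, the short Python sketch below illustrates what a single leaf of such a tree computes, namely a linear combination of ridge functions applied to the inputs; the function name, parameter names, and the tanh non-linearity are illustrative assumptions rather than the authors' implementation.

import numpy as np

# A ridge-function leaf (conceptual sketch): it returns
# sum_j beta[j] * activation(Omega[:, j] @ x + b[j]),
# i.e. a non-linearity composed with affine maps of the inputs.
def ridge_leaf_output(x, Omega, b, beta, activation=np.tanh):
    # x: (d,) input; Omega: (d, J) affine directions; b: (J,) offsets;
    # beta: (J,) combination weights for the J ridge functions.
    return float(activation(x @ Omega + b) @ beta)

# Illustrative use: one leaf with J = 3 ridge functions on d = 2 inputs.
rng = np.random.default_rng(0)
x = rng.normal(size=2)
Omega = rng.normal(size=(2, 3))
b = rng.normal(size=3)
beta = rng.normal(size=3)
print(ridge_leaf_output(x, Omega, b, beta))

Summing such leaf outputs across the trees of the ensemble yields the "localized single-hidden-layer neural network" view described in the abstract.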
Keywords: BART, nonparametrics, regression trees, ensemble learning, neural networks