Convergence Rates of Oblique Regression Trees for Flexible Function Libraries
Monday, Aug 4: 2:55 PM - 3:20 PM
Invited Paper Session
Music City Center
Decision trees and neural networks are conventionally seen as two contrasting approaches to learning. The popular belief is that decision trees trade accuracy for ease of use and interpretability, whereas neural networks are more accurate but less transparent. In this talk, we challenge the status quo by showing that, under suitable conditions, decision trees that recursively place splits along linear combinations of the covariates (so-called oblique trees) achieve modeling power and predictive accuracy comparable to those of single-hidden-layer neural networks. Importantly, the analytical framework presented here accommodates many existing computational tools in the literature, such as those based on randomization, dimensionality reduction, and mixed-integer optimization.
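To make the idea of oblique splits concrete, the following is a minimal sketch, not the speaker's method: a greedy oblique regression tree that, at each node, samples random directions (one of the randomization-based strategies the abstract alludes to), projects the covariates onto each direction, and keeps the projection-threshold pair that most reduces squared error. All names here (best_split, fit_oblique_tree, n_directions) are illustrative assumptions, not from the talk.

```python
import numpy as np

def best_split(X, y, n_directions=50, rng=None):
    """Search random linear combinations w of the covariates and return
    the (w, threshold, sse) triple with the smallest squared error."""
    rng = np.random.default_rng() if rng is None else rng
    n, d = X.shape
    best = (None, None, np.inf)
    for _ in range(n_directions):
        w = rng.normal(size=d)
        w /= np.linalg.norm(w)          # unit-norm direction
        z = X @ w                       # project points onto the direction
        for t in np.quantile(z, np.linspace(0.1, 0.9, 9)):
            left, right = y[z <= t], y[z > t]
            if len(left) == 0 or len(right) == 0:
                continue
            sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
            if sse < best[2]:
                best = (w, t, sse)
    return best

def fit_oblique_tree(X, y, depth=3, min_leaf=10, rng=None):
    """Greedily grow an oblique regression tree, stored as nested dicts."""
    if depth == 0 or len(y) < 2 * min_leaf:
        return {"leaf": y.mean()}
    w, t, _ = best_split(X, y, rng=rng)
    if w is None:
        return {"leaf": y.mean()}
    mask = X @ w <= t
    return {"w": w, "t": t,
            "left": fit_oblique_tree(X[mask], y[mask], depth - 1, min_leaf, rng),
            "right": fit_oblique_tree(X[~mask], y[~mask], depth - 1, min_leaf, rng)}

def predict(node, x):
    """Route a single point down the tree to its leaf mean."""
    while "leaf" not in node:
        node = node["left"] if x @ node["w"] <= node["t"] else node["right"]
    return node["leaf"]
```

Whereas an axis-aligned tree can only split on one covariate at a time, each internal node above partitions the space with a hyperplane X @ w <= t, which is the structural feature the abstract's comparison to single-hidden-layer networks rests on.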
decision trees, neural networks, greedy algorithms