14. ProGO: Probabilistic Global Optimizer
Conference: Conference on Statistical Practice (CSP) 2024
02/27/2024: 5:30 PM - 7:00 PM CST
Posters
Addressing the limitations of current global optimization algorithms, especially when tackling non-convex functions and when gradient information is computationally intensive or absent, we introduce a novel approach. Our proposed Probabilistic Global Optimizer (ProGO) is based on a sequence of multidimensional integrations that converge to global optima under specific regularity conditions. This gradient-free method benefits from a robust convergence framework built on the properties of distributions that concentrate around the optima. We have also developed a latent slice sampler with a geometric convergence rate for sampling from these distributions, which allows global optima to be approximated effectively. ProGO is designed as a versatile framework that scales to approximate global optima for continuous functions in any dimension. Our empirical tests on well-known non-convex functions demonstrate ProGO's superior performance over many established algorithms in terms of regret value and convergence speed.
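To illustrate the general idea described above, the following is a minimal, hypothetical Python sketch of a probabilistic, gradient-free optimizer: it samples from densities proportional to exp(-k f(x)) for an increasing schedule of k, so that the sample mean concentrates near the global minimizer. It uses a plain random-walk Metropolis kernel as a stand-in for the latent slice sampler referenced in the abstract; all function names, the schedule of k values, and the test function are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: probabilistic global optimization via samples from
# densities proportional to exp(-k * f(x)) with an increasing schedule of k.
# A random-walk Metropolis kernel stands in for the latent slice sampler.
import numpy as np


def metropolis_samples(log_density, x0, n_samples, step=0.1, rng=None):
    """Draw samples with a random-walk Metropolis kernel (stand-in sampler)."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    logp = log_density(x)
    draws = []
    for _ in range(n_samples):
        proposal = x + step * rng.standard_normal(x.shape)
        logp_prop = log_density(proposal)
        if np.log(rng.uniform()) < logp_prop - logp:
            x, logp = proposal, logp_prop
        draws.append(x.copy())
    return np.array(draws)


def approximate_minimizer(f, x0, ks=(1.0, 5.0, 25.0, 125.0), n_samples=2000):
    """Approximate the global minimizer of f by averaging samples drawn from
    densities proportional to exp(-k * f(x)) for increasing k."""
    x = np.asarray(x0, dtype=float)
    for k in ks:
        draws = metropolis_samples(lambda z: -k * f(z), x, n_samples)
        # For a unique global minimizer, the mean under exp(-k f) concentrates
        # near that minimizer as k grows; discard a burn-in half, then average.
        x = draws[n_samples // 2:].mean(axis=0)
    return x


if __name__ == "__main__":
    # Example: a non-convex Ackley-style test function with minimum at the origin.
    def ackley_like(z):
        z = np.atleast_1d(z)
        return (-20 * np.exp(-0.2 * np.sqrt(np.mean(z ** 2)))
                - np.exp(np.mean(np.cos(2 * np.pi * z))) + 20 + np.e)

    x_star = approximate_minimizer(ackley_like, x0=np.array([2.5, -1.5]))
    print("approximate minimizer:", x_star)
```

The increasing schedule of k mirrors the role of the sequence of multidimensional integrations in the abstract: each stage reuses the previous estimate as a starting point, and larger k sharpens the sampling density around the global optimum.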
Time series forecasting
Machine learning
Subsampling
Pareto efficiency
Autoregressive processes
Ensemble methods
Presenting Author
Xinyu Zhang, North Carolina State University
First Author
Xinyu Zhang, North Carolina State University
Co-Author
Sujit Ghosh, North Carolina State University