36: The Logistic Orthogonal Polynomial System, Logistic Kernels, and Their Uses in Machine Learning

Alexander Jefferson, First Author and Presenting Author

Ebenezer George, Co-Author
The University of Memphis

Felix Havugimana, Co-Author
The University of Memphis

Deepak Venugopal, Co-Author
The University of Memphis

Mohammed Yeasin, Co-Author
The University of Memphis
 
Wednesday, Aug 6: 10:30 AM - 12:20 PM
2021 
Contributed Posters 
Music City Center 
Abstract

Orthogonal polynomials play important roles in deriving asymptotic distributions of high-dimensional parametric and nonparametric statistics, the Edgeworth series expansion being the most notable example. In recent years, they have also emerged as natural bases for constructing kernel functions in machine learning. In this study, we introduce the Logistic Orthogonal Polynomial System (LOPS), derived by orthogonalizing the Logistic polynomials using the Gram-Schmidt process. We establish that LOPS satisfies the hypergeometric differential equation and explore its connections to the Legendre polynomials. Motivated by these theoretical properties, we investigate LOPS-based Logistic kernels in Support Vector Machines (SVMs). Empirical studies on several widely used benchmark datasets, evaluated on predictive accuracy, precision, recall, and F1 score, show that the LOPS-based kernel performs on par with or better than traditional SVM kernels. This study contributes to the growing intersection of orthogonal polynomials and machine learning, with potential implications for novel kernel-based deep learning models.
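To make the construction concrete, the sketch below (Python, not the authors' code) illustrates the two ingredients described above under stated assumptions: Gram-Schmidt orthogonalization of the monomial basis with respect to the standard logistic density, and a finite-rank kernel built from the resulting polynomials and passed to scikit-learn's SVC as a callable kernel. The weight function, normalization, and kernel form used for LOPS in the study may differ; names such as logistic_weight and lops_kernel are placeholders introduced here for illustration.

import numpy as np
from numpy.polynomial import polynomial as P
from scipy.integrate import quad
from sklearn.svm import SVC

def logistic_weight(x):
    # Standard logistic density exp(-x) / (1 + exp(-x))^2, written with |x| to avoid overflow
    e = np.exp(-np.abs(x))
    return e / (1.0 + e) ** 2

def inner(p, q):
    # <p, q> = integral of p(x) q(x) w(x) dx over the real line (truncated for quadrature)
    f = lambda x: P.polyval(x, p) * P.polyval(x, q) * logistic_weight(x)
    val, _ = quad(f, -50.0, 50.0, limit=200)
    return val

def gram_schmidt(degree):
    # Orthonormalize 1, x, ..., x^degree with respect to the logistic-weighted inner product
    basis = []
    for n in range(degree + 1):
        p = np.zeros(n + 1)
        p[n] = 1.0                              # start from the monomial x^n
        for q in basis:                         # remove projections onto earlier polynomials
            p = P.polysub(p, inner(p, q) * q)
        basis.append(p / np.sqrt(inner(p, p)))  # normalize
    return basis

def lops_kernel(X, Y, basis):
    # Finite-rank kernel k(x, y) = sum_n phi_n(x) phi_n(y), applied per feature and summed;
    # one plausible way (among several) to turn an orthogonal system into an SVM kernel
    K = np.zeros((X.shape[0], Y.shape[0]))
    for j in range(X.shape[1]):
        PX = np.column_stack([P.polyval(X[:, j], p) for p in basis])
        PY = np.column_stack([P.polyval(Y[:, j], p) for p in basis])
        K += PX @ PY.T
    return K

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((60, 2))
    y = (X[:, 0] * X[:, 1] > 0).astype(int)     # toy labels for a quick smoke test
    basis = gram_schmidt(degree=4)
    clf = SVC(kernel=lambda A, B: lops_kernel(A, B, basis)).fit(X, y)
    print("training accuracy:", clf.score(X, y))

Passing a callable keeps the example short; for larger datasets one would typically precompute the Gram matrix once and fit SVC with kernel="precomputed" instead.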

Keywords

Logistic Distribution Function

Orthogonal Polynomials

Kernel Methods

Support Vector Machines

Neural Networks

Hypergeometric Functions 

Main Sponsor

Quantitative Communication Interest Group