Contributed Poster Presentations: Quantitative Communication Interest Group
Wednesday, Aug 6: 10:30 AM - 12:20 PM
4170
Contributed Posters
Music City Center
Room: CC-Hall B
Main Sponsor
Quantitative Communication Interest Group
Presentations
Orthogonal polynomials play important roles in deriving the asymptotic distributions of high-dimensional parametric and nonparametric statistics, the Edgeworth series expansion being the most notable example. In recent years, they have also emerged as natural bases for constructing kernel functions in machine learning. In this study, we introduce the Logistic Orthogonal Polynomial System (LOPS), derived by orthogonalizing the Logistic polynomials via the Gram-Schmidt process. We establish that LOPS satisfies the hypergeometric differential equation and explore its connections to the Legendre polynomials. Motivated by these theoretical properties, we investigate LOPS-based Logistic kernels in Support Vector Machines (SVMs). Empirical studies on several widely used benchmark datasets, evaluating predictive accuracy, precision, recall, and F1 score, show that the LOPS-based kernel performs on par with or better than traditional SVM kernels. This study contributes to the growing intersection of orthogonal polynomials and machine learning, with potential implications for novel kernel-based deep learning models.
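The abstract does not specify the orthogonality weight, the polynomial degree, or the exact kernel form, so the following is only a minimal Python sketch of the general workflow it describes: Gram-Schmidt orthogonalization with respect to a logistic weight, followed by use of the resulting polynomials in a custom SVM kernel. The names logistic_pdf, gram_schmidt_polys, and lops_kernel are hypothetical, and the sum-of-products kernel is an illustrative assumption rather than the paper's LOPS kernel.

```python
# Hypothetical sketch: orthogonalize monomials w.r.t. the standard logistic
# density via Gram-Schmidt, then plug the resulting polynomials into a custom
# scikit-learn SVM kernel. Weight, degree, and kernel form are assumptions.
import numpy as np
from numpy.polynomial.polynomial import polyval
from scipy.integrate import quad
from sklearn.svm import SVC

def logistic_pdf(x):
    """Standard logistic density, used here as the orthogonality weight (assumed)."""
    return np.exp(-x) / (1.0 + np.exp(-x)) ** 2

def gram_schmidt_polys(degree, lo=-20.0, hi=20.0):
    """Orthonormalize 1, x, x^2, ... against the logistic weight by Gram-Schmidt.

    The integration range is truncated to [lo, hi] for numerical convenience.
    Returns coefficient arrays (constant term first) usable with polyval.
    """
    def inner(p, q):
        f = lambda x: polyval(x, p) * polyval(x, q) * logistic_pdf(x)
        val, _ = quad(f, lo, hi)
        return val

    basis = []
    for n in range(degree + 1):
        p = np.zeros(n + 1)
        p[n] = 1.0  # start from the monomial x^n
        for q in basis:
            # subtract the projection onto each already-orthonormalized polynomial
            p[:len(q)] -= inner(p, q) * q
        p = p / np.sqrt(inner(p, p))  # normalize
        basis.append(p)
    return basis

def lops_kernel(X, Y, polys):
    """Toy kernel K(x, y) = sum_n phi_n(x) phi_n(y), with phi_n summing P_n over features."""
    K = np.zeros((X.shape[0], Y.shape[0]))
    for p in polys:
        PX = polyval(X, p).sum(axis=1)
        PY = polyval(Y, p).sum(axis=1)
        K += np.outer(PX, PY)
    return K

polys = gram_schmidt_polys(degree=3)
# X_train, y_train stand in for any standardized feature matrix and label vector:
# clf = SVC(kernel=lambda A, B: lops_kernel(A, B, polys)).fit(X_train, y_train)
```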
Keywords
Logistic Distribution Function
Orthogonal Polynomials
Kernel Methods
Support Vector Machines
Neural Networks
Hypergeometric Functions