Approximate regularized least squares algorithm for classification

Jing Peng, Alex J. Aved

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

In machine learning, a good predictive model is one that generalizes well to future, unseen data. In general, this problem is ill-posed. To mitigate it, a predictive model can be constructed by simultaneously minimizing an empirical error over the training samples and controlling the complexity of the model; this gives rise to regularized least squares (RLS). RLS requires a matrix inversion, which is expensive, and its "big data" applications can therefore be adversely affected. To address this issue, we have developed an efficient machine learning algorithm for pattern recognition that approximates RLS. The algorithm does not require matrix inversion and achieves competitive performance against the RLS algorithm. It has been shown mathematically that RLS is a sound learning algorithm; therefore, a definitive statement about the relationship between the new algorithm and RLS lays a solid theoretical foundation for the new algorithm. A recent study shows that the spectral norm of the kernel matrix in RLS is tightly bounded above by the size of the matrix. This spectral norm becomes a constant when the training samples have independent, centered sub-Gaussian coordinates; typical sub-Gaussian random vectors such as the standard normal and Bernoulli satisfy this assumption. Essentially, each sample is drawn from a product distribution formed from centered univariate sub-Gaussian distributions. These results allow us to establish a finite-sample bound between the new algorithm and RLS, and to show that the new algorithm converges to RLS in the limit. Experimental results validate the theoretical analysis and demonstrate that the new algorithm is very promising for solving "big data" classification problems.
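The paper's own inversion-free update is not reproduced in this record. As an illustrative sketch only, the snippet below contrasts closed-form RLS (whose matrix solve is the cost the abstract refers to) with one generic inversion-free alternative: plain gradient descent on the same regularized objective. The function names `rls_fit` and `rls_gd` and all parameter choices are hypothetical, not the authors' algorithm.

```python
import numpy as np

def rls_fit(X, y, lam=1.0):
    """Closed-form regularized least squares (ridge regression):
        w = (X^T X + lam*I)^{-1} X^T y
    The O(d^3) matrix solve is the bottleneck in "big data" settings."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def rls_gd(X, y, lam=1.0, lr=None, steps=2000):
    """Inversion-free sketch: gradient descent on
        min_w (1/2)||Xw - y||^2 + (lam/2)||w||^2
    which has the same minimizer as the closed form above."""
    n, d = X.shape
    w = np.zeros(d)
    if lr is None:
        # Step size from an upper bound on the Hessian's spectral norm,
        # ||X||_2^2 + lam, guaranteeing convergence for this convex objective.
        lr = 1.0 / (np.linalg.norm(X, 2) ** 2 + lam)
    for _ in range(steps):
        grad = X.T @ (X @ w - y) + lam * w
        w -= lr * grad
    return w
```

For well-conditioned data, both routines return essentially the same weight vector; the iterative version trades the cubic-cost solve for matrix-vector products, which is the general flavor of inversion-free RLS approximations.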

Original language: English
Title of host publication: Pattern Recognition and Tracking XXIX
Editors: Mohammad S. Alam
Publisher: SPIE
ISBN (Electronic): 9781510618091
DOI: https://doi.org/10.1117/12.2305075
Publication status: Published - 1 Jan 2018
Event: Pattern Recognition and Tracking XXIX 2018 - Orlando, United States
Duration: 18 Apr 2018 - 19 Apr 2018

Publication series

Name: Proceedings of SPIE - The International Society for Optical Engineering
Volume: 10649
ISSN (Print): 0277-786X
ISSN (Electronic): 1996-756X



Keywords

  • classification
  • regularized least squares
  • ridge regression

Cite this

Peng, J., & Aved, A. J. (2018). Approximate regularized least squares algorithm for classification. In M. S. Alam (Ed.), Pattern Recognition and Tracking XXIX [106490S] (Proceedings of SPIE - The International Society for Optical Engineering; Vol. 10649). SPIE. https://doi.org/10.1117/12.2305075