Efficient regularized least squares classification

Peng Zhang, Jing Peng

Research output: Contribution to journal › Conference article › peer-review


Abstract

Kernel-based regularized least squares (RLS) algorithms are a promising technique for classification. RLS minimizes a regularized functional directly in a reproducing kernel Hilbert space defined by a kernel. In contrast, support vector machines (SVMs) implement the structural risk minimization principle and use the kernel trick to extend it to the nonlinear case. While both have a sound mathematical foundation, RLS is strikingly simple. On the other hand, SVMs in general have a sparse representation of the solution. In this paper, we introduce a very fast version of the RLS algorithm while maintaining the achievable level of performance. The proposed new algorithm computes solutions in O(m) time and O(1) space, where m is the number of training points. We demonstrate the efficacy of our very fast RLS algorithm using a number of (both real and simulated) data sets.
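For context, kernel RLS classification amounts to solving a regularized linear system for the coefficients of a kernel expansion. The sketch below shows the standard baseline formulation (with an RBF kernel, labels in {-1, +1}, and an illustrative regularization parameter); it is not the paper's fast O(m) algorithm, only the conventional solver whose solutions that algorithm computes more efficiently.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma=1.0):
    """Gaussian (RBF) kernel matrix between two sets of points."""
    sq_dists = (
        np.sum(X1**2, axis=1)[:, None]
        + np.sum(X2**2, axis=1)[None, :]
        - 2 * X1 @ X2.T
    )
    return np.exp(-gamma * sq_dists)

def rls_fit(X, y, lam=0.1, gamma=1.0):
    """Standard kernel RLS: solve (K + lam*m*I) alpha = y for the
    expansion coefficients (O(m^3) with a direct solver)."""
    m = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * m * np.eye(m), y)

def rls_predict(X_train, alpha, X_test, gamma=1.0):
    """Decision values f(x) = sum_i alpha_i k(x_i, x); the sign gives the class."""
    K_test = rbf_kernel(X_test, X_train, gamma)
    return np.sign(K_test @ alpha)
```

The kernel choice and the value of lam here are assumptions for illustration; the paper's contribution is avoiding the cubic cost of this direct solve.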

Original language: English
Article number: 1384892
Journal: IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops
Volume: 2004-January
Issue number: January
DOIs
State: Published - 2004
Event: 2004 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2004 - Washington, United States
Duration: 27 Jun 2004 – 2 Jul 2004
