Efficient regularized least squares classification

Peng Zhang, Jing Peng

Research output: Contribution to journal › Conference article › Research › peer-review

2 Citations (Scopus)

Abstract

Kernel-based regularized least squares (RLS) algorithms are a promising technique for classification. RLS minimizes a regularized functional directly in a reproducing kernel Hilbert space defined by a kernel. In contrast, support vector machines (SVMs) implement the structural risk minimization principle and use the kernel trick to extend it to the nonlinear case. While both have a sound mathematical foundation, RLS is strikingly simple. On the other hand, SVMs in general have a sparse representation of the solution. In this paper, we introduce a very fast version of the RLS algorithm while maintaining the achievable level of performance. The proposed new algorithm computes solutions in O(m) time and O(1) space, where m is the number of training points. We demonstrate the efficacy of our very fast RLS algorithm using a number of (both real and simulated) data sets.
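For readers unfamiliar with the baseline method the abstract contrasts against, the following is a minimal sketch of *standard* kernel RLS classification (not the paper's fast O(m)-time variant): the coefficient vector is obtained by solving the regularized linear system (K + λmI)c = y. The RBF kernel choice and all function names here are illustrative, not taken from the paper.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma=1.0):
    # Gaussian (RBF) kernel matrix: k(x, z) = exp(-gamma * ||x - z||^2)
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def rls_fit(X, y, lam=0.1, gamma=1.0):
    # Standard kernel RLS: solve (K + lam * m * I) c = y for the
    # coefficients c of the kernel expansion. Costs O(m^3) time and
    # O(m^2) space -- the bottleneck the paper's fast variant removes.
    m = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * m * np.eye(m), y)

def rls_predict(X_train, X_test, c, gamma=1.0):
    # Decision function f(x) = sum_i c_i k(x, x_i); its sign is the label.
    return np.sign(rbf_kernel(X_test, X_train, gamma) @ c)
```

Note the contrast the abstract draws: unlike an SVM, every training point receives a (generally nonzero) coefficient, so the solution is dense.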

Original language: English
Article number: 1384892
Journal: IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops
Volume: 2004-January
Issue number: January
DOIs: 10.1109/CVPR.2004.331
State: Published - 1 Jan 2004
Event: 2004 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2004 - Washington, United States
Duration: 27 Jun 2004 - 2 Jul 2004


Cite this

@article{cb876d83e47448efbffff7f975c3470c,
title = "Efficient regularized least squares classification",
abstract = "Kernel-based regularized least squares (RLS) algorithms are a promising technique for classification. RLS minimizes a regularized functional directly in a reproducing kernel Hilbert space defined by a kernel. In contrast, support vector machines (SVMs) implement the structural risk minimization principle and use the kernel trick to extend it to the nonlinear case. While both have a sound mathematical foundation, RLS is strikingly simple. On the other hand, SVMs in general have a sparse representation of the solution. In this paper, we introduce a very fast version of the RLS algorithm while maintaining the achievable level of performance. The proposed new algorithm computes solutions in O(m) time and O(1) space, where m is the number of training points. We demonstrate the efficacy of our very fast RLS algorithm using a number of (both real and simulated) data sets.",
author = "Peng Zhang and Jing Peng",
year = "2004",
month = "1",
day = "1",
doi = "10.1109/CVPR.2004.331",
language = "English",
volume = "2004-January",
journal = "IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops",
issn = "2160-7508",
number = "January",
}

Efficient regularized least squares classification. / Zhang, Peng; Peng, Jing.

In: IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops, Vol. 2004-January, No. January, 1384892, 01.01.2004.


TY - JOUR

T1 - Efficient regularized least squares classification

AU - Zhang, Peng

AU - Peng, Jing

PY - 2004/1/1

Y1 - 2004/1/1

N2 - Kernel-based regularized least squares (RLS) algorithms are a promising technique for classification. RLS minimizes a regularized functional directly in a reproducing kernel Hilbert space defined by a kernel. In contrast, support vector machines (SVMs) implement the structural risk minimization principle and use the kernel trick to extend it to the nonlinear case. While both have a sound mathematical foundation, RLS is strikingly simple. On the other hand, SVMs in general have a sparse representation of the solution. In this paper, we introduce a very fast version of the RLS algorithm while maintaining the achievable level of performance. The proposed new algorithm computes solutions in O(m) time and O(1) space, where m is the number of training points. We demonstrate the efficacy of our very fast RLS algorithm using a number of (both real and simulated) data sets.

AB - Kernel-based regularized least squares (RLS) algorithms are a promising technique for classification. RLS minimizes a regularized functional directly in a reproducing kernel Hilbert space defined by a kernel. In contrast, support vector machines (SVMs) implement the structural risk minimization principle and use the kernel trick to extend it to the nonlinear case. While both have a sound mathematical foundation, RLS is strikingly simple. On the other hand, SVMs in general have a sparse representation of the solution. In this paper, we introduce a very fast version of the RLS algorithm while maintaining the achievable level of performance. The proposed new algorithm computes solutions in O(m) time and O(1) space, where m is the number of training points. We demonstrate the efficacy of our very fast RLS algorithm using a number of (both real and simulated) data sets.

UR - http://www.scopus.com/inward/record.url?scp=84932622260&partnerID=8YFLogxK

U2 - 10.1109/CVPR.2004.331

DO - 10.1109/CVPR.2004.331

M3 - Conference article

VL - 2004-January

JO - IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops

JF - IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops

SN - 2160-7508

IS - January

M1 - 1384892

ER -