Discriminant learning analysis

Jing Peng, Peng Zhang, Norbert Riedel

Research output: Contribution to journal › Article › peer-review

27 Scopus citations


Linear discriminant analysis (LDA) as a dimension reduction method is widely used in classification tasks such as face recognition. However, it suffers from the small sample size (SSS) problem when the data dimensionality is greater than the sample size, as in images, where features are high dimensional and correlated. In this paper, we propose to address the SSS problem in the framework of statistical learning theory. We compute linear discriminants by regularized least squares regression, whereby the singularity problem is resolved. The resulting discriminants are complete in that they include both regular and irregular discriminant information. We show that our proposal and its nonlinear extension belong to the same framework in which powerful classifiers such as support vector machines are formulated. In addition, our approach allows us to establish an error bound for LDA. Finally, our experiments validate our theoretical analysis.
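The core idea in the abstract, computing a discriminant by regularized least squares so that the small-sample-size singularity disappears, can be illustrated for the two-class case. The sketch below is an assumption-laden illustration, not the paper's exact algorithm: it uses a ±1 target encoding and a ridge penalty `lam`, under which the regression weight vector is well defined even when the feature dimension exceeds the number of samples.

```python
import numpy as np

def rls_discriminant(X, y, lam=1e-2):
    """Two-class discriminant via regularized least squares (ridge) regression.

    Regressing centered +/-1 class labels on centered features yields a weight
    vector that serves as a linear discriminant direction. The ridge term
    lam * I keeps the normal equations nonsingular even when the number of
    features d exceeds the number of samples n (the SSS regime).
    Note: the +/-1 encoding and this exact formulation are illustrative
    assumptions, not taken verbatim from the paper.
    """
    n, d = X.shape
    t = np.where(y == 1, 1.0, -1.0)          # +/-1 target encoding (assumed)
    Xc = X - X.mean(axis=0)                  # center features
    tc = t - t.mean()                        # center targets
    # Solve (Xc^T Xc + lam * I) w = Xc^T tc; invertible for any lam > 0.
    w = np.linalg.solve(Xc.T @ Xc + lam * np.eye(d), Xc.T @ tc)
    return w
```

With unregularized least squares, `Xc.T @ Xc` is rank deficient whenever `d > n` and the solve would fail; the regularizer is precisely what resolves that singularity.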

Original language: English
Pages (from-to): 1614-1625
Number of pages: 12
Journal: IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
Issue number: 6
State: Published - 2008


Keywords

  • Dimensionality reduction
  • Discriminant learning analysis (DLA)
  • Feature extraction
  • Linear discriminant analysis (LDA)
  • Regularized least squares (RLS)
  • Small sample size (SSS) problem


