TY - GEN
T1 - Discriminant analysis: A least squares approximation view
T2 - 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, CVPR 2005 - Workshops
AU - Zhang, Peng
AU - Peng, Jing
AU - Riedel, Norbert
N1 - Publisher Copyright:
© 2005 IEEE Computer Society. All rights reserved.
PY - 2005
Y1 - 2005
N2 - Linear discriminant analysis (LDA) is an important approach to feature selection in classification tasks such as face recognition. However, it suffers from the small sample size (SSS) problem, in which LDA cannot be solved numerically. The SSS problem occurs when the number of training samples is smaller than the number of dimensions, which is often the case in practice. Researchers have proposed several modified versions of LDA to deal with this problem, but a solid theoretical analysis is missing. In this paper, we analyze LDA and the SSS problem from the standpoint of learning theory. LDA is derived from Fisher's criterion; however, when formulated as a least-squares approximation problem, LDA has a direct connection to regularization network (RN) algorithms. Many learning algorithms, such as support vector machines (SVMs), can be viewed as regularization networks. LDA turns out to be an RN without the regularization term, which is in general an ill-posed problem. This explains why LDA suffers from the SSS problem. To transform the ill-posed problem into a well-posed one, the regularization term is necessary. Thus, based on statistical learning theory, we derive a new approach to discriminant analysis, which we call discriminant learning analysis (DLA). DLA is well-posed and behaves well in the SSS situation. Experimental results are presented to validate our proposal.
AB - Linear discriminant analysis (LDA) is an important approach to feature selection in classification tasks such as face recognition. However, it suffers from the small sample size (SSS) problem, in which LDA cannot be solved numerically. The SSS problem occurs when the number of training samples is smaller than the number of dimensions, which is often the case in practice. Researchers have proposed several modified versions of LDA to deal with this problem, but a solid theoretical analysis is missing. In this paper, we analyze LDA and the SSS problem from the standpoint of learning theory. LDA is derived from Fisher's criterion; however, when formulated as a least-squares approximation problem, LDA has a direct connection to regularization network (RN) algorithms. Many learning algorithms, such as support vector machines (SVMs), can be viewed as regularization networks. LDA turns out to be an RN without the regularization term, which is in general an ill-posed problem. This explains why LDA suffers from the SSS problem. To transform the ill-posed problem into a well-posed one, the regularization term is necessary. Thus, based on statistical learning theory, we derive a new approach to discriminant analysis, which we call discriminant learning analysis (DLA). DLA is well-posed and behaves well in the SSS situation. Experimental results are presented to validate our proposal.
KW - Error bound
KW - Kernel trick
KW - Least squares
KW - Linear discriminant analysis
KW - Regularization
KW - Small sample size problem
UR - http://www.scopus.com/inward/record.url?scp=80053260141&partnerID=8YFLogxK
U2 - 10.1109/CVPR.2005.444
DO - 10.1109/CVPR.2005.444
M3 - Conference contribution
AN - SCOPUS:80053260141
T3 - IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops
BT - 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, CVPR 2005 - Workshops
PB - IEEE Computer Society
Y2 - 21 September 2005 through 23 September 2005
ER -