TY - GEN
T1 - Information preserving discriminant projections
AU - Peng, Jing
AU - Aved, Alex J.
N1 - Publisher Copyright:
© 2020 by SCITEPRESS - Science and Technology Publications, Lda. All rights reserved.
PY - 2020
Y1 - 2020
N2 - In classification, a large number of features often makes the design of a classifier difficult and degrades its performance. This is particularly pronounced when the number of examples is small relative to the number of features, due to the curse of dimensionality. There are many dimensionality reduction techniques in the literature. However, most of these techniques are either informative (i.e., minimize information loss), as in principal component analysis (PCA), or discriminant, as in linear discriminant analysis (LDA). Each type of technique has its strengths and weaknesses. Motivated by Gaussian Process Latent Variable Models, we propose a simple linear projection technique that exploits the characteristics of both PCA and LDA in latent representations. The proposed technique optimizes a regularized information-preserving objective, where the regularizer is an LDA-based criterion. As such, it prefers a latent space that is both informative and discriminant, thereby providing better generalization performance. Experimental results based on a variety of data sets are provided to validate the proposed technique.
AB - In classification, a large number of features often makes the design of a classifier difficult and degrades its performance. This is particularly pronounced when the number of examples is small relative to the number of features, due to the curse of dimensionality. There are many dimensionality reduction techniques in the literature. However, most of these techniques are either informative (i.e., minimize information loss), as in principal component analysis (PCA), or discriminant, as in linear discriminant analysis (LDA). Each type of technique has its strengths and weaknesses. Motivated by Gaussian Process Latent Variable Models, we propose a simple linear projection technique that exploits the characteristics of both PCA and LDA in latent representations. The proposed technique optimizes a regularized information-preserving objective, where the regularizer is an LDA-based criterion. As such, it prefers a latent space that is both informative and discriminant, thereby providing better generalization performance. Experimental results based on a variety of data sets are provided to validate the proposed technique.
KW - Classification
KW - Dimensionality Reduction
KW - Feature Selection
UR - https://www.scopus.com/pages/publications/85083091427
U2 - 10.5220/0008967601620171
DO - 10.5220/0008967601620171
M3 - Conference contribution
AN - SCOPUS:85083091427
T3 - ICAART 2020 - Proceedings of the 12th International Conference on Agents and Artificial Intelligence
SP - 162
EP - 171
BT - ICAART 2020 - Proceedings of the 12th International Conference on Agents and Artificial Intelligence
A2 - Rocha, Ana
A2 - Steels, Luc
A2 - van den Herik, Jaap
PB - SciTePress
T2 - 12th International Conference on Agents and Artificial Intelligence, ICAART 2020
Y2 - 22 February 2020 through 24 February 2020
ER -