TY - GEN
T1 - Learning Latent Variable Models with Discriminant Regularization
AU - Peng, Jing
AU - Aved, Alex J.
N1 - Publisher Copyright:
© 2021, This is a U.S. government work and not under copyright protection in the United States; foreign copyright protection may apply.
PY - 2021
Y1 - 2021
N2 - In many machine learning applications, data are often described by a large number of features or attributes. However, too many features can result in overfitting, particularly when the number of examples is smaller than the number of features. The problem can be mitigated by learning latent variable models in which the data are described by a smaller number of latent dimensions. There are many techniques for learning latent variable models in the literature. Most of these techniques fall into two classes: techniques that are informative, represented by principal component analysis (PCA), and techniques that are discriminant, represented by linear discriminant analysis (LDA). Each class of techniques has its advantages. In this work, we introduce a technique for learning latent variable models with discriminant regularization that combines the characteristics of both classes. An empirical evaluation on a variety of data sets is presented to verify the performance of the proposed technique.
AB - In many machine learning applications, data are often described by a large number of features or attributes. However, too many features can result in overfitting, particularly when the number of examples is smaller than the number of features. The problem can be mitigated by learning latent variable models in which the data are described by a smaller number of latent dimensions. There are many techniques for learning latent variable models in the literature. Most of these techniques fall into two classes: techniques that are informative, represented by principal component analysis (PCA), and techniques that are discriminant, represented by linear discriminant analysis (LDA). Each class of techniques has its advantages. In this work, we introduce a technique for learning latent variable models with discriminant regularization that combines the characteristics of both classes. An empirical evaluation on a variety of data sets is presented to verify the performance of the proposed technique.
KW - Classification
KW - Dimensionality reduction
KW - Latent variable models
UR - http://www.scopus.com/inward/record.url?scp=85103482401&partnerID=8YFLogxK
U2 - 10.1007/978-3-030-71158-0_18
DO - 10.1007/978-3-030-71158-0_18
M3 - Conference contribution
AN - SCOPUS:85103482401
SN - 9783030711573
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 378
EP - 398
BT - Agents and Artificial Intelligence - 12th International Conference, ICAART 2020, Revised Selected Papers
A2 - Rocha, Ana Paula
A2 - Steels, Luc
A2 - van den Herik, Jaap
PB - Springer Science and Business Media Deutschland GmbH
T2 - 12th International Conference on Agents and Artificial Intelligence, ICAART 2020
Y2 - 22 February 2020 through 24 February 2020
ER -