TY - GEN
T1 - Learning Latent Variable Models with Regularization
AU - Peng, Jing
N1 - Publisher Copyright:
© 2021 IEEE.
PY - 2021
Y1 - 2021
N2 - Learning from data requires a good representation of the data. In many applications, data are represented by many features, which makes learning more difficult. This becomes particularly challenging when there are more features than available training examples, which can cause overfitting. Many techniques have been proposed in the literature for dimensionality reduction by learning latent variable models. However, many of these techniques generally require a large amount of data to avoid overfitting. In this paper, we introduce a regularization technique for learning latent variable models that addresses the overfitting problem. The proposed regularization technique can take into account the local curvatures described by the row and column variance matrices. Furthermore, when the precision matrices are the identity, the proposed technique is related to learning factor models with trace penalization. We provide empirical evidence that validates the proposed technique.
AB - Learning from data requires a good representation of the data. In many applications, data are represented by many features, which makes learning more difficult. This becomes particularly challenging when there are more features than available training examples, which can cause overfitting. Many techniques have been proposed in the literature for dimensionality reduction by learning latent variable models. However, many of these techniques generally require a large amount of data to avoid overfitting. In this paper, we introduce a regularization technique for learning latent variable models that addresses the overfitting problem. The proposed regularization technique can take into account the local curvatures described by the row and column variance matrices. Furthermore, when the precision matrices are the identity, the proposed technique is related to learning factor models with trace penalization. We provide empirical evidence that validates the proposed technique.
KW - Dimension reduction
KW - latent models
KW - regularization
UR - http://www.scopus.com/inward/record.url?scp=85127057331&partnerID=8YFLogxK
U2 - 10.1109/ICECET52533.2021.9698660
DO - 10.1109/ICECET52533.2021.9698660
M3 - Conference contribution
AN - SCOPUS:85127057331
T3 - International Conference on Electrical, Computer, and Energy Technologies, ICECET 2021
BT - International Conference on Electrical, Computer, and Energy Technologies, ICECET 2021
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2021 International Conference on Electrical, Computer, and Energy Technologies, ICECET 2021
Y2 - 9 December 2021 through 10 December 2021
ER -