Learning from data requires a good representation of the data. In many applications, data are represented by a large number of features, which makes learning more difficult; this becomes particularly challenging when there are more features than available training examples, a regime prone to overfitting. Many dimensionality-reduction techniques based on learning latent variable models have been proposed in the literature, but most require large amounts of data to avoid overfitting. In this paper, we introduce a regularization technique for learning latent variable models that addresses this overfitting problem. The proposed regularizer can take into account the local curvature described by row and column covariance matrices. Furthermore, when the corresponding precision matrices are the identity, the proposed technique is related to learning factor models with trace penalization. We provide empirical evidence that validates the proposed technique.
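
The trace-penalization connection mentioned above can be sketched as follows. This is a hypothetical illustration, not the paper's actual algorithm: it fits a factor model X ≈ UVᵀ by gradient descent with Frobenius-norm penalties on both factors, which is the standard variational form of trace-norm regularization (the penalty (‖U‖²_F + ‖V‖²_F)/2 upper-bounds ‖UVᵀ‖_*). All names and parameter values here are assumptions for the sketch.

```python
import numpy as np

def factor_model_trace_penalized(X, rank, lam=0.01, lr=0.01, n_iter=500, seed=0):
    """Sketch: fit X ~ U @ V.T by gradient descent on the objective
        ||X - U V^T||_F^2 + lam * (||U||_F^2 + ||V||_F^2),
    whose penalty term is the variational form of the trace norm of U V^T.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Small random initialization of the two factor matrices.
    U = 0.1 * rng.standard_normal((n, rank))
    V = 0.1 * rng.standard_normal((d, rank))
    for _ in range(n_iter):
        R = U @ V.T - X                 # reconstruction residual
        gU = 2 * R @ V + 2 * lam * U    # gradient w.r.t. U
        gV = 2 * R.T @ U + 2 * lam * V  # gradient w.r.t. V
        U -= lr * gU
        V -= lr * gV
    return U, V
```

For example, on data generated from an exact rank-3 model, the fitted factorization recovers most of the signal despite the shrinkage induced by the penalty.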