TY - GEN
T1 - Analysis of Chernoff criterion for linear dimensionality reduction
AU - Peng, Jing
AU - Robila, Stefan
AU - Fan, Wei
AU - Seetharaman, Guna
PY - 2010
Y1 - 2010
N2 - Well-known linear discriminant analysis (LDA) based on the Fisher criterion is incapable of dealing with heteroscedasticity in data. However, in many practical applications we often encounter heteroscedastic data, i.e., the within-class scatter matrices cannot be expected to be equal. A technique based on the Chernoff criterion for linear dimensionality reduction has been proposed recently. The technique extends the well-known Fisher's LDA and is capable of exploiting information about heteroscedasticity in the data. While the Chernoff criterion has been shown to outperform the Fisher criterion, a clear understanding of its exact behavior is lacking. In addition, the criterion, as introduced, is rather complex, which makes it difficult to state clearly its relationship to other linear dimensionality reduction techniques. In this paper, we show precisely what can be expected from the Chernoff criterion and its relations to the Fisher criterion and the Fukunaga-Koontz transform. Furthermore, we show that a recently proposed decomposition of the data space into four subspaces is incomplete. We provide arguments on how best to enrich the decomposition of the data space in order to account for heteroscedasticity in the data.
AB - Well-known linear discriminant analysis (LDA) based on the Fisher criterion is incapable of dealing with heteroscedasticity in data. However, in many practical applications we often encounter heteroscedastic data, i.e., the within-class scatter matrices cannot be expected to be equal. A technique based on the Chernoff criterion for linear dimensionality reduction has been proposed recently. The technique extends the well-known Fisher's LDA and is capable of exploiting information about heteroscedasticity in the data. While the Chernoff criterion has been shown to outperform the Fisher criterion, a clear understanding of its exact behavior is lacking. In addition, the criterion, as introduced, is rather complex, which makes it difficult to state clearly its relationship to other linear dimensionality reduction techniques. In this paper, we show precisely what can be expected from the Chernoff criterion and its relations to the Fisher criterion and the Fukunaga-Koontz transform. Furthermore, we show that a recently proposed decomposition of the data space into four subspaces is incomplete. We provide arguments on how best to enrich the decomposition of the data space in order to account for heteroscedasticity in the data.
KW - Chernoff distance
KW - Dimensionality reduction
KW - Linear discriminant analysis
UR - http://www.scopus.com/inward/record.url?scp=78751554089&partnerID=8YFLogxK
U2 - 10.1109/ICSMC.2010.5641971
DO - 10.1109/ICSMC.2010.5641971
M3 - Conference contribution
AN - SCOPUS:78751554089
SN - 9781424465880
T3 - Conference Proceedings - IEEE International Conference on Systems, Man and Cybernetics
SP - 3014
EP - 3021
BT - 2010 IEEE International Conference on Systems, Man and Cybernetics, SMC 2010
T2 - 2010 IEEE International Conference on Systems, Man and Cybernetics, SMC 2010
Y2 - 10 October 2010 through 13 October 2010
ER -
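For context on the abstract above, a minimal LaTeX sketch of the two standard quantities it contrasts, the multi-class Fisher criterion and the two-class Chernoff distance (cf. the keyword "Chernoff distance"), is appended below. The notation (projection matrix W, class means m_i, class covariances Sigma_i, priors p_i, overall mean m, mixing parameter alpha) is introduced here purely for illustration, using textbook formulations; the exact heteroscedastic criterion analyzed in the paper is not reproduced.

% Minimal sketch, assuming standard textbook formulations (not the paper's exact criterion).
\documentclass{article}
\usepackage{amsmath}
\begin{document}

% Fisher criterion for a linear map W: it sees the class covariances only
% through the pooled within-class scatter S_W, hence it cannot use
% heteroscedasticity (differences among the Sigma_i).
\[
  J_F(W) = \operatorname{tr}\!\left[ (W^{\top} S_W W)^{-1} \, W^{\top} S_B W \right],
  \qquad
  S_W = \sum_i p_i \Sigma_i,
  \quad
  S_B = \sum_i p_i (m_i - m)(m_i - m)^{\top}.
\]

% Two-class Chernoff distance between Gaussian class-conditional densities
% N(m_1, Sigma_1) and N(m_2, Sigma_2), with mixing parameter alpha in (0,1):
% the log-determinant term is what brings in the difference between the
% two class covariances, which the Fisher criterion ignores.
\[
  d_C(\alpha) = \frac{\alpha(1-\alpha)}{2}\,
    (m_1 - m_2)^{\top} \Sigma_{\alpha}^{-1} (m_1 - m_2)
  + \frac{1}{2} \ln \frac{|\Sigma_{\alpha}|}{|\Sigma_1|^{\alpha}\,|\Sigma_2|^{1-\alpha}},
  \qquad
  \Sigma_{\alpha} = \alpha \Sigma_1 + (1-\alpha) \Sigma_2 .
\]

\end{document}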