LDA/SVM driven nearest neighbor classification

Jing Peng, Douglas R. Heisterkamp, H. K. Dai

Research output: Contribution to journal › Article › peer-review

46 Citations (Scopus)

Abstract

Nearest neighbor (NN) classification relies on the assumption that class conditional probabilities are locally constant. This assumption becomes false in high dimensions with finite samples due to the curse of dimensionality. The NN rule introduces severe bias under these conditions. We propose a locally adaptive neighborhood morphing classification method to try to minimize bias. We use local support vector machine learning to estimate an effective metric for producing neighborhoods that are elongated along less discriminant feature dimensions and constricted along most discriminant ones. As a result, the class conditional probabilities can be expected to be approximately constant in the modified neighborhoods, whereby better classification performance can be achieved. The efficacy of our method is validated and compared against other competing techniques using a number of datasets.
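The abstract describes the approach only at a high level. As a rough illustrative sketch (not the authors' published algorithm), the idea of using a locally trained linear SVM to morph a nearest-neighbor metric can be expressed as: fit an SVM on the points nearest the query, treat the magnitude of each SVM weight as a per-feature discriminance score, then constrict the neighborhood along high-weight (discriminant) features and leave it elongated along low-weight ones by running weighted k-NN. The function name and parameters (`svm_adaptive_knn_predict`, `n_local`, `k`) are assumptions for illustration only.

```python
import numpy as np
from sklearn.svm import LinearSVC

def svm_adaptive_knn_predict(X, y, query, n_local=60, k=5):
    """Classify `query` with a neighborhood morphed by a local linear SVM.

    Sketch of the idea only: large |w_j| shrinks (constricts) the
    neighborhood along discriminant features; small |w_j| stretches
    (elongates) it along the less discriminant ones.
    """
    d0 = np.linalg.norm(X - query, axis=1)           # plain Euclidean pass
    local = np.argsort(d0)[:n_local]                 # initial neighborhood
    if np.unique(y[local]).size == 1:                # degenerate: one class only
        return int(y[local][0])
    svm = LinearSVC(dual=False).fit(X[local], y[local])
    w = np.abs(svm.coef_).mean(axis=0)               # per-feature discriminance
    w /= w.sum()                                     # normalize to weights
    d = np.sqrt(((X - query) ** 2 * w).sum(axis=1))  # morphed (weighted) metric
    nn = np.argsort(d)[:k]
    return int(np.bincount(y[nn]).argmax())          # majority vote

# Toy demo: feature 0 separates the classes, feature 1 is high-variance noise.
rng = np.random.default_rng(0)
X = np.vstack([np.c_[rng.normal(-2, 1, 100), rng.normal(0, 5, 100)],
               np.c_[rng.normal(+2, 1, 100), rng.normal(0, 5, 100)]])
y = np.repeat([0, 1], 100)
pred_neg = svm_adaptive_knn_predict(X, y, np.array([-2.0, 0.0]))
pred_pos = svm_adaptive_knn_predict(X, y, np.array([+2.0, 0.0]))
```

On this toy data the reweighted metric is dominated by the discriminant first feature, so the noisy second feature no longer pulls wrong-class points into the neighborhood, which is exactly the bias-reduction effect the abstract describes.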

Original language: English
Pages (from-to): 940-942
Number of pages: 3
Journal: IEEE Transactions on Neural Networks
Volume: 14
Issue number: 4
DOI: 10.1109/TNN.2003.813835
State: Published - 1 Jul 2003

Keywords

  • Classification
  • Linear discriminant analysis (LDA)
  • Nearest neighbor
  • Support vector machine (SVM)

Cite this

Peng, Jing; Heisterkamp, Douglas R.; Dai, H. K. LDA/SVM driven nearest neighbor classification. IEEE Transactions on Neural Networks, Vol. 14, No. 4, 2003, pp. 940-942.
@article{3ff0fdf86c104ae39a9cb3d27f1d3892,
  title = "LDA/SVM driven nearest neighbor classification",
  abstract = "Nearest neighbor (NN) classification relies on the assumption that class conditional probabilities are locally constant. This assumption becomes false in high dimensions with finite samples due to the curse of dimensionality. The NN rule introduces severe bias under these conditions. We propose a locally adaptive neighborhood morphing classification method to try to minimize bias. We use local support vector machine learning to estimate an effective metric for producing neighborhoods that are elongated along less discriminant feature dimensions and constricted along most discriminant ones. As a result, the class conditional probabilities can be expected to be approximately constant in the modified neighborhoods, whereby better classification performance can be achieved. The efficacy of our method is validated and compared against other competing techniques using a number of datasets.",
  keywords = "Classification, Linear discriminant analysis (LDA), Nearest neighbor, Support vector machine (SVM)",
  author = "Jing Peng and Heisterkamp, {Douglas R.} and Dai, {H. K.}",
  year = "2003",
  month = "7",
  day = "1",
  doi = "10.1109/TNN.2003.813835",
  language = "English",
  volume = "14",
  pages = "940--942",
  journal = "IEEE Transactions on Neural Networks",
  issn = "1045-9227",
  publisher = "Institute of Electrical and Electronics Engineers Inc.",
  number = "4",
}

Scopus record: http://www.scopus.com/inward/record.url?scp=0042024924&partnerID=8YFLogxK