Adaptive quasiconformal kernel nearest neighbor classification

Jing Peng, Douglas R. Heisterkamp, H. K. Dai

Research output: Contribution to journal › Article

49 Citations (Scopus)

Abstract

Nearest neighbor classification assumes locally constant class conditional probabilities. This assumption becomes invalid in high dimensions due to the curse-of-dimensionality. Severe bias can be introduced under these conditions when using the nearest neighbor rule. We propose an adaptive nearest neighbor classification method to try to minimize bias. We use quasiconformal transformed kernels to compute neighborhoods over which the class probabilities tend to be more homogeneous. As a result, better classification performance can be expected. The efficacy of our method is validated and compared against other competing techniques using a variety of data sets.
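The core idea in the abstract — a quasiconformal transformation k̃(x, y) = c(x)·c(y)·k(x, y) of a base kernel, with nearest neighbors computed in the feature-space metric induced by the transformed kernel — can be sketched as follows. This is a minimal illustration, not the paper's method: it assumes an RBF base kernel and takes the positive weighting function c as given (the paper estimates c adaptively from local class-probability estimates, which this sketch does not reproduce). All function and variable names here are illustrative.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Base RBF kernel k(x, y) = exp(-gamma * ||x - y||^2).
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def quasiconformal_kernel(X, Y, c_X, c_Y, gamma=1.0):
    # Quasiconformal transformation of the kernel:
    #   k~(x, y) = c(x) * c(y) * k(x, y),
    # where c(.) is a positive weighting function supplied by the caller.
    return np.outer(c_X, c_Y) * rbf_kernel(X, Y, gamma)

def transformed_distances(x, X_train, c_x, c_train, gamma=1.0):
    # Squared feature-space distance induced by the transformed kernel:
    #   d^2(x, y) = k~(x, x) - 2 k~(x, y) + k~(y, y).
    x = np.atleast_2d(x)
    kxx = quasiconformal_kernel(x, x, [c_x], [c_x], gamma)[0, 0]
    kxy = quasiconformal_kernel(x, X_train, [c_x], c_train, gamma)[0]
    kyy = c_train ** 2  # k(y, y) = 1 for the RBF kernel, so k~(y, y) = c(y)^2
    return kxx - 2.0 * kxy + kyy

def knn_predict(x, X_train, y_train, c_x, c_train, k=3, gamma=1.0):
    # Majority vote among the k training points nearest to x
    # under the metric induced by the transformed kernel.
    d2 = transformed_distances(x, X_train, c_x, c_train, gamma)
    nearest = np.argsort(d2)[:k]
    return np.bincount(y_train[nearest]).argmax()
```

With c ≡ 1 the transformation is the identity and the rule reduces to plain kernel nearest neighbors; a data-dependent c reshapes neighborhoods so that, as the abstract describes, class probabilities within them tend to be more homogeneous.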

Original language: English
Pages (from-to): 656-661
Number of pages: 6
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
Volume: 26
Issue number: 5
DOIs: 10.1109/TPAMI.2004.1273978
State: Published - 1 May 2004

Keywords

  • Classification
  • Feature space
  • Kernel methods
  • Nearest neighbors
  • Quasiconformal mapping

Cite this

@article{43553c29531e4632a497c59a1bb018d3,
title = "Adaptive quasiconformal kernel nearest neighbor classification",
abstract = "Nearest neighbor classification assumes locally constant class conditional probabilities. This assumption becomes invalid in high dimensions due to the curse-of-dimensionality. Severe bias can be introduced under these conditions when using the nearest neighbor rule. We propose an adaptive nearest neighbor classification method to try to minimize bias. We use quasiconformal transformed kernels to compute neighborhoods over which the class probabilities tend to be more homogeneous. As a result, better classification performance can be expected. The efficacy of our method is validated and compared against other competing techniques using a variety of data sets.",
keywords = "Classification, Feature space, Kernel methods, Nearest neighbors, Quasiconformal mapping",
author = "Jing Peng and Heisterkamp, {Douglas R.} and Dai, {H. K.}",
year = "2004",
month = "5",
day = "1",
doi = "10.1109/TPAMI.2004.1273978",
language = "English",
volume = "26",
pages = "656--661",
journal = "IEEE Transactions on Pattern Analysis and Machine Intelligence",
issn = "0162-8828",
publisher = "IEEE Computer Society",
number = "5",
}

Adaptive quasiconformal kernel nearest neighbor classification. / Peng, Jing; Heisterkamp, Douglas R.; Dai, H. K.

In: IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 26, No. 5, 01.05.2004, p. 656-661.

TY - JOUR
T1 - Adaptive quasiconformal kernel nearest neighbor classification
AU - Peng, Jing
AU - Heisterkamp, Douglas R.
AU - Dai, H. K.
PY - 2004/5/1
Y1 - 2004/5/1
N2 - Nearest neighbor classification assumes locally constant class conditional probabilities. This assumption becomes invalid in high dimensions due to the curse-of-dimensionality. Severe bias can be introduced under these conditions when using the nearest neighbor rule. We propose an adaptive nearest neighbor classification method to try to minimize bias. We use quasiconformal transformed kernels to compute neighborhoods over which the class probabilities tend to be more homogeneous. As a result, better classification performance can be expected. The efficacy of our method is validated and compared against other competing techniques using a variety of data sets.
AB - Nearest neighbor classification assumes locally constant class conditional probabilities. This assumption becomes invalid in high dimensions due to the curse-of-dimensionality. Severe bias can be introduced under these conditions when using the nearest neighbor rule. We propose an adaptive nearest neighbor classification method to try to minimize bias. We use quasiconformal transformed kernels to compute neighborhoods over which the class probabilities tend to be more homogeneous. As a result, better classification performance can be expected. The efficacy of our method is validated and compared against other competing techniques using a variety of data sets.
KW - Classification
KW - Feature space
KW - Kernel methods
KW - Nearest neighbors
KW - Quasiconformal mapping
UR - http://www.scopus.com/inward/record.url?scp=3042573893&partnerID=8YFLogxK
U2 - 10.1109/TPAMI.2004.1273978
DO - 10.1109/TPAMI.2004.1273978
M3 - Article
C2 - 15460287
AN - SCOPUS:3042573893
VL - 26
SP - 656
EP - 661
JO - IEEE Transactions on Pattern Analysis and Machine Intelligence
JF - IEEE Transactions on Pattern Analysis and Machine Intelligence
SN - 0162-8828
IS - 5
ER -