Adaptive quasiconformal kernel nearest neighbor classification

Jing Peng, Douglas R. Heisterkamp, H. K. Dai

Research output: Contribution to journal › Article › peer-review

54 Scopus citations


Nearest neighbor classification assumes locally constant class conditional probabilities. This assumption becomes invalid in high dimensions due to the curse-of-dimensionality. Severe bias can be introduced under these conditions when using the nearest neighbor rule. We propose an adaptive nearest neighbor classification method to try to minimize bias. We use quasiconformal transformed kernels to compute neighborhoods over which the class probabilities tend to be more homogeneous. As a result, better classification performance can be expected. The efficacy of our method is validated and compared against other competing techniques using a variety of data sets.
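The core idea described in the abstract can be sketched in a few lines: a base kernel is reweighted by a quasiconformal transformation, K̃(x, y) = c(x)·c(y)·K(x, y), and neighborhoods are then formed using the distance this transformed kernel induces in feature space. The sketch below is illustrative, not the authors' exact algorithm; it assumes a Gaussian RBF base kernel and leaves the positive weight function `c` (which the paper adapts to make class probabilities more homogeneous) as a user-supplied argument.

```python
import numpy as np

def rbf_kernel(x, y, gamma=1.0):
    # Base kernel K(x, y) = exp(-gamma * ||x - y||^2)
    return np.exp(-gamma * np.sum((x - y) ** 2))

def quasiconformal_kernel(x, y, c, gamma=1.0):
    # Quasiconformal transformation of the base kernel:
    # K~(x, y) = c(x) * c(y) * K(x, y), with c a positive weight function.
    return c(x) * c(y) * rbf_kernel(x, y, gamma)

def kernel_distance_sq(x, y, c, gamma=1.0):
    # Squared distance induced in the feature space of the transformed kernel:
    # d^2(x, y) = K~(x, x) - 2 K~(x, y) + K~(y, y)
    return (quasiconformal_kernel(x, x, c, gamma)
            - 2.0 * quasiconformal_kernel(x, y, c, gamma)
            + quasiconformal_kernel(y, y, c, gamma))

def knn_predict(query, X, labels, c, k=3, gamma=1.0):
    # Classify `query` by majority vote among the k training points
    # nearest under the transformed-kernel distance.
    d = np.array([kernel_distance_sq(query, xi, c, gamma) for xi in X])
    nearest = np.argsort(d)[:k]
    vals, counts = np.unique(labels[nearest], return_counts=True)
    return vals[np.argmax(counts)]
```

With the trivial weight c(x) = 1 the transformed distance reduces to the ordinary kernel distance; the paper's contribution lies in choosing c adaptively so that the resulting neighborhoods are less biased in high dimensions.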

Original language: English
Pages (from-to): 656-661
Number of pages: 6
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
Issue number: 5
State: Published - May 2004


  • Classification
  • Feature space
  • Kernel methods
  • Nearest neighbors
  • Quasiconformal mapping


