Local discriminative learning for pattern recognition

Jing Peng, Bir Bhanu

Research output: Contribution to journal › Article

6 Citations (Scopus)

Abstract

Local discriminative learning methods approximate a target function (a posteriori class probability function) directly by partitioning the feature space into a set of local regions, and appropriately modeling a simple input-output relationship (function) in each one. This paper presents a new method for judiciously partitioning the input feature space in order to accurately represent the target function. The method accomplishes this by approximating not only the target function itself but also its derivatives. As such, the method partitions the input feature space along those dimensions for which the class probability function changes most rapidly, thus minimizing bias. The efficacy of the method is validated using a variety of simulated and real-world data.
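
The method itself is only summarized above, but the underlying idea, recursively splitting the feature space along the dimension in which an estimate of the posterior class probability changes most rapidly, can be illustrated with a short sketch. The following Python code is a minimal illustration under stated assumptions, not the authors' published algorithm: it assumes a binary-classification setting, uses the coefficients of a locally fitted logistic model as a rough stand-in for the derivatives of the class-probability function, and splits at the median of the chosen dimension. All function names (split_dimension, build_partition, predict_proba) and parameter defaults are hypothetical.

import numpy as np
from sklearn.linear_model import LogisticRegression

def split_dimension(X, y):
    # Fit a local logistic model; the largest-magnitude coefficient marks the
    # dimension along which the estimated class probability changes fastest.
    model = LogisticRegression().fit(X, y)
    return int(np.argmax(np.abs(model.coef_[0])))

def build_partition(X, y, depth=0, max_depth=3, min_samples=20):
    # Recursively partition the feature space; each leaf stores a simple
    # local estimate of the posterior class probability.
    if depth == max_depth or len(y) < min_samples or len(np.unique(y)) < 2:
        # Leaf: a constant local estimate (0.5 if the region is empty).
        return {"leaf": True, "p": float(np.mean(y)) if len(y) else 0.5}
    d = split_dimension(X, y)       # dimension of fastest probability change
    t = float(np.median(X[:, d]))   # simple split point: the median
    left = X[:, d] <= t
    return {
        "leaf": False, "dim": d, "thr": t,
        "lo": build_partition(X[left], y[left], depth + 1, max_depth, min_samples),
        "hi": build_partition(X[~left], y[~left], depth + 1, max_depth, min_samples),
    }

def predict_proba(node, x):
    # Route a sample to its local region and return the stored estimate.
    while not node["leaf"]:
        node = node["lo"] if x[node["dim"]] <= node["thr"] else node["hi"]
    return node["p"]

Usage sketch: with X an (n, d) array of features and y a 0/1 label vector, tree = build_partition(X, y) builds the partition, and predict_proba(tree, x) returns the local probability estimate for a new sample x.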

Original language: English
Pages (from-to): 139-150
Number of pages: 12
Journal: Pattern Recognition
Volume: 34
Issue number: 1
ISSN: 0031-3203
DOI: 10.1016/S0031-3203(99)00209-5
State: Published - 1 Jan 2001


Keywords

  • Discriminative learning
  • Feature relevance
  • Local learning
  • Recursive partitioning
  • Statistical pattern classification

Cite this

Peng, J., & Bhanu, B. (2001). Local discriminative learning for pattern recognition. Pattern Recognition, 34(1), 139-150. https://doi.org/10.1016/S0031-3203(99)00209-5
