Classifier fusion using shared sampling distribution for boosting

Costin Barbu, Raja Iqbal, Jing Peng

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Research › peer-review

4 Citations (Scopus)

Abstract

We present a new framework for classifier fusion that uses a shared sampling distribution to obtain a weighted classifier ensemble. The weight-update process is self-regularizing: subsequent classifiers trained on the disjoint views rectify the bias introduced by any classifier in preceding iterations. We provide theoretical guarantees that our approach yields better results than boosting performed separately on different views. The results are shown to outperform other classifier fusion strategies on a well-known texture image database.
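The abstract describes boosting in which weak learners trained on disjoint feature views all draw from, and update, a single shared sampling distribution. A minimal sketch of that idea (not the authors' exact algorithm; the stump learner, round count, and best-view selection rule are assumptions for illustration):

```python
# Hypothetical sketch of boosting with a shared sampling distribution across
# disjoint feature views. Each round, every view fits a weak learner against
# the SAME example weights; the lowest-error learner is kept, and its
# AdaBoost-style update to the shared weights is seen by all views.
import numpy as np

def fit_stump(X, y, D):
    """Best D-weighted threshold stump over the features of one view."""
    best_err, best = np.inf, None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for sign in (1, -1):
                pred = sign * np.where(X[:, j] <= t, 1, -1)
                err = D[pred != y].sum()
                if err < best_err:
                    best_err, best = err, ((j, t, sign), pred)
    return best  # ((feature, threshold, sign), predictions on X)

def shared_boost(views, y, rounds=10):
    """views: list of (n, d_i) arrays (disjoint views); y: labels in {-1, +1}."""
    n = len(y)
    D = np.full(n, 1.0 / n)          # the shared sampling distribution
    ensemble = []                    # (alpha, view index, stump) triples
    for _ in range(rounds):
        # Train one weak learner per view on the shared weights; keep the best.
        best = None
        for v, X in enumerate(views):
            stump, pred = fit_stump(X, y, D)
            err = D[pred != y].sum()
            if best is None or err < best[0]:
                best = (err, v, stump, pred)
        err, v, stump, pred = best
        err = min(max(err, 1e-10), 1 - 1e-10)   # guard the log
        alpha = 0.5 * np.log((1 - err) / err)
        ensemble.append((alpha, v, stump))
        D *= np.exp(-alpha * y * pred)          # one update, shared by all views
        D /= D.sum()
    return ensemble

def predict(ensemble, views):
    """Weighted vote of the ensemble over views (same ordering as training)."""
    score = np.zeros(len(views[0]))
    for alpha, v, (j, t, sign) in ensemble:
        score += alpha * sign * np.where(views[v][:, j] <= t, 1, -1)
    return np.sign(score)
```

The shared distribution is what makes the scheme self-regularizing in the sense the abstract claims: when one view's classifier is biased, the reweighting concentrates mass on its mistakes, and learners on the other views see those hard examples in the very next rounds.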

Original language: English
Title of host publication: Proceedings - Fifth IEEE International Conference on Data Mining, ICDM 2005
Pages: 34-41
Number of pages: 8
DOI: 10.1109/ICDM.2005.40
State: Published - 1 Dec 2005
Event: 5th IEEE International Conference on Data Mining, ICDM 2005 - Houston, TX, United States
Duration: 27 Nov 2005 - 30 Nov 2005



Cite this

Barbu, C., Iqbal, R., & Peng, J. (2005). Classifier fusion using shared sampling distribution for boosting. In Proceedings - Fifth IEEE International Conference on Data Mining, ICDM 2005 (pp. 34-41). [1565659] https://doi.org/10.1109/ICDM.2005.40
@inproceedings{7fd1c67012c04b389ac3cd36f8afacc1,
  title     = "Classifier fusion using shared sampling distribution for boosting",
  author    = "Costin Barbu and Raja Iqbal and Jing Peng",
  year      = "2005",
  month     = "12",
  day       = "1",
  doi       = "10.1109/ICDM.2005.40",
  language  = "English",
  isbn      = "0769522785",
  pages     = "34--41",
  booktitle = "Proceedings - Fifth IEEE International Conference on Data Mining, ICDM 2005",
}


