Microbagging Estimators: An Ensemble Approach to Distance-weighted Classifiers

Blaine Nelson, Battista Biggio, Pavel Laskov
Proceedings of the Asian Conference on Machine Learning, PMLR 20:63-79, 2011.

Abstract

Support vector machines (SVMs) have been the predominant approach to kernel-based classification. While SVMs have demonstrated excellent performance in many application domains, they are known to be sensitive to noise in their training dataset. Motivated by the equalizing effect of bagging classifiers, we present a novel approach to kernel-based classification that we call microbagging. This method bags all possible maximal-margin estimators between pairs of training points to create a novel linear kernel classifier with weights defined directly as functions of the pairwise distance matrix induced by the kernel function. We derive relationships between linear and distance-based classifiers and empirically compare microbagging to SVMs and robust SVMs on several datasets.
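To make the idea concrete, the core construction can be sketched in a few lines. The snippet below is an illustrative approximation in the linear (input-space) case, not the authors' exact estimator: it assumes each pairwise maximal-margin estimator is the hyperplane perpendicular to the difference of one positive and one negative training point, passing through their midpoint, and "bags" these estimators by simple averaging. The names `microbag_fit` and `microbag_predict` are hypothetical.

```python
import numpy as np

def microbag_fit(X, y):
    """Average the maximal-margin hyperplanes between every opposite-class
    pair (x_i, x_j): each has normal w = x_i - x_j and passes through the
    midpoint (x_i + x_j) / 2, giving offset b = -(|x_i|^2 - |x_j|^2) / 2."""
    pos = X[y == 1]
    neg = X[y == -1]
    w = np.zeros(X.shape[1])
    b = 0.0
    for xi in pos:
        for xj in neg:
            w += xi - xj
            b += -0.5 * (xi @ xi - xj @ xj)
    n_pairs = len(pos) * len(neg)
    return w / n_pairs, b / n_pairs

def microbag_predict(X, w, b):
    """Classify by the sign of the averaged linear decision function."""
    return np.sign(X @ w + b)

# Toy usage: two linearly separable classes in the plane.
X = np.array([[2.0, 0.0], [3.0, 1.0], [-2.0, 0.0], [-3.0, -1.0]])
y = np.array([1, 1, -1, -1])
w, b = microbag_fit(X, y)
```

In the kernelized setting described by the paper, the same averaging yields a classifier whose weights are functions of the pairwise distance matrix induced by the kernel; the input-space version above is only meant to convey the bagging-of-pairwise-estimators idea.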

Cite this Paper


BibTeX
@InProceedings{pmlr-v20-nelson11,
  title = {Microbagging Estimators: An Ensemble Approach to Distance-weighted Classifiers},
  author = {Nelson, Blaine and Biggio, Battista and Laskov, Pavel},
  booktitle = {Proceedings of the Asian Conference on Machine Learning},
  pages = {63--79},
  year = {2011},
  editor = {Hsu, Chun-Nan and Lee, Wee Sun},
  volume = {20},
  series = {Proceedings of Machine Learning Research},
  address = {South Garden Hotels and Resorts, Taoyuan, Taiwan},
  month = {14--15 Nov},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v20/nelson11/nelson11.pdf},
  url = {https://proceedings.mlr.press/v20/nelson11.html},
  abstract = {Support vector machines (SVMs) have been the predominant approach to kernel-based classification. While SVMs have demonstrated excellent performance in many application domains, they are known to be sensitive to noise in their training dataset. Motivated by the equalizing effect of bagging classifiers, we present a novel approach to kernel-based classification that we call microbagging. This method bags all possible maximal-margin estimators between pairs of training points to create a novel linear kernel classifier with weights defined directly as functions of the pairwise distance matrix induced by the kernel function. We derive relationships between linear and distance-based classifiers and empirically compare microbagging to SVMs and robust SVMs on several datasets.}
}
Endnote
%0 Conference Paper
%T Microbagging Estimators: An Ensemble Approach to Distance-weighted Classifiers
%A Blaine Nelson
%A Battista Biggio
%A Pavel Laskov
%B Proceedings of the Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2011
%E Chun-Nan Hsu
%E Wee Sun Lee
%F pmlr-v20-nelson11
%I PMLR
%P 63--79
%U https://proceedings.mlr.press/v20/nelson11.html
%V 20
%X Support vector machines (SVMs) have been the predominant approach to kernel-based classification. While SVMs have demonstrated excellent performance in many application domains, they are known to be sensitive to noise in their training dataset. Motivated by the equalizing effect of bagging classifiers, we present a novel approach to kernel-based classification that we call microbagging. This method bags all possible maximal-margin estimators between pairs of training points to create a novel linear kernel classifier with weights defined directly as functions of the pairwise distance matrix induced by the kernel function. We derive relationships between linear and distance-based classifiers and empirically compare microbagging to SVMs and robust SVMs on several datasets.
RIS
TY  - CPAPER
TI  - Microbagging Estimators: An Ensemble Approach to Distance-weighted Classifiers
AU  - Blaine Nelson
AU  - Battista Biggio
AU  - Pavel Laskov
BT  - Proceedings of the Asian Conference on Machine Learning
DA  - 2011/11/17
ED  - Chun-Nan Hsu
ED  - Wee Sun Lee
ID  - pmlr-v20-nelson11
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 20
SP  - 63
EP  - 79
L1  - http://proceedings.mlr.press/v20/nelson11/nelson11.pdf
UR  - https://proceedings.mlr.press/v20/nelson11.html
AB  - Support vector machines (SVMs) have been the predominant approach to kernel-based classification. While SVMs have demonstrated excellent performance in many application domains, they are known to be sensitive to noise in their training dataset. Motivated by the equalizing effect of bagging classifiers, we present a novel approach to kernel-based classification that we call microbagging. This method bags all possible maximal-margin estimators between pairs of training points to create a novel linear kernel classifier with weights defined directly as functions of the pairwise distance matrix induced by the kernel function. We derive relationships between linear and distance-based classifiers and empirically compare microbagging to SVMs and robust SVMs on several datasets.
ER  -
APA
Nelson, B., Biggio, B. & Laskov, P. (2011). Microbagging Estimators: An Ensemble Approach to Distance-weighted Classifiers. Proceedings of the Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 20:63-79. Available from https://proceedings.mlr.press/v20/nelson11.html.