KDD Cup 2009 @ Budapest: feature partitioning and boosting

Miklós Kurucz, Dávid Siklósi, István Bíró, Péter Csizsek, Zsolt Fekete, Róbert Iwatt, Tamás Kiss, Adrienn Szabó
Proceedings of KDD-Cup 2009 Competition, PMLR 7:65-75, 2009.

Abstract

We describe the method used in our final submission to KDD Cup 2009, as well as a selection of promising directions that are generally believed to work well but did not live up to our expectations. Our final method consists of a combination of a LogitBoost and an ADTree classifier with a feature selection method that, as shaped by the experiments we conducted, has turned out to be very different from those described in some well-cited surveys. Among the methods that failed were distance, information, and dependence measures for feature selection, as well as combinations of classifiers over a partitioned feature set. As another main lesson learned, alternating decision trees and LogitBoost outperformed most classifiers for most feature subsets of the KDD Cup 2009 data.
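
The final submission combines a LogitBoost and an ADTree classifier over a selected feature subset. As a rough illustration only, the sketch below averages the scores of two boosted-tree models after a simple univariate feature-selection step; scikit-learn has no LogitBoost or ADTree implementation, so GradientBoostingClassifier with stump and depth-3 base learners stands in for both, and SelectKBest is merely a placeholder for the paper's experiment-driven feature selection. The data loading is assumed (X is the feature matrix, y a binary target).

```python
# Minimal sketch (not the authors' pipeline): select features, train two
# boosted-tree models, average their scores, and report AUC.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split


def combined_auc(X, y, n_features=200, seed=0):
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=seed)

    # Univariate feature selection: a placeholder for the feature selection
    # shaped by the paper's experiments.
    selector = SelectKBest(f_classif, k=min(n_features, X.shape[1])).fit(X_tr, y_tr)
    X_tr, X_te = selector.transform(X_tr), selector.transform(X_te)

    # Two boosted models with different base-learner depths, standing in for
    # LogitBoost and ADTree; their predicted scores are simply averaged.
    models = [
        GradientBoostingClassifier(max_depth=1, n_estimators=200, random_state=seed),
        GradientBoostingClassifier(max_depth=3, n_estimators=200, random_state=seed),
    ]
    scores = np.mean(
        [m.fit(X_tr, y_tr).predict_proba(X_te)[:, 1] for m in models], axis=0
    )
    return roc_auc_score(y_te, scores)
```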

Cite this Paper


BibTeX
@InProceedings{pmlr-v7-kurucz09,
  title     = {KDD Cup 2009 @ Budapest: feature partitioning and boosting},
  author    = {Kurucz, Miklós and Siklósi, Dávid and Bíró, István and Csizsek, Péter and Fekete, Zsolt and Iwatt, Róbert and Kiss, Tamás and Szabó, Adrienn},
  booktitle = {Proceedings of KDD-Cup 2009 Competition},
  pages     = {65--75},
  year      = {2009},
  editor    = {Dror, Gideon and Boullé, Marc and Guyon, Isabelle and Lemaire, Vincent and Vogel, David},
  volume    = {7},
  series    = {Proceedings of Machine Learning Research},
  address   = {New York, New York, USA},
  month     = {28 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v7/kurucz09/kurucz09.pdf},
  url       = {https://proceedings.mlr.press/v7/kurucz09.html},
  abstract  = {We describe the method used in our final submission to KDD Cup 2009, as well as a selection of promising directions that are generally believed to work well but did not live up to our expectations. Our final method consists of a combination of a LogitBoost and an ADTree classifier with a feature selection method that, as shaped by the experiments we conducted, has turned out to be very different from those described in some well-cited surveys. Among the methods that failed were distance, information, and dependence measures for feature selection, as well as combinations of classifiers over a partitioned feature set. As another main lesson learned, alternating decision trees and LogitBoost outperformed most classifiers for most feature subsets of the KDD Cup 2009 data.}
}
APA
Kurucz, M., Siklósi, D., Bíró, I., Csizsek, P., Fekete, Z., Iwatt, R., Kiss, T. & Szabó, A. (2009). KDD Cup 2009 @ Budapest: feature partitioning and boosting. Proceedings of KDD-Cup 2009 Competition, in Proceedings of Machine Learning Research 7:65-75. Available from https://proceedings.mlr.press/v7/kurucz09.html.

Related Material

Download PDF: http://proceedings.mlr.press/v7/kurucz09/kurucz09.pdf