Machine Learning & Knowledge Discovery Group

Publication Details

  Author(s): G. Tsoumakas, I. Vlahavas.

Title: “Distributed Data Mining of Large Classifier Ensembles”.

Availability: PDF (6 pages).


Appeared in: Proc. (companion volume) 2nd Hellenic Conference on AI (SETN '02), I. Vlahavas, C. Spyropoulos (Eds.), pp. 249-255, Thessaloniki, Greece, 2002.

Abstract: Nowadays, classifier ensembles are often used for distributed data mining in order to discover knowledge from inherently distributed information sources and to scale up learning to very large databases. One of the most successful methods for combining multiple classifiers is Stacking. However, this method suffers from very high computational cost when the number of distributed nodes is large. This paper presents a new classifier combination strategy that scales up efficiently, achieving high predictive accuracy while keeping highly complex problems tractable. It induces a global model by learning from the averages of the local classifiers' output. In this way, fast and effective combination of a large number of classifiers is achieved.
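The scalability argument in the abstract can be illustrated with a small sketch. The code below is a hypothetical illustration (not the paper's implementation): it contrasts the meta-level feature width of plain Stacking, which concatenates every local classifier's class-probability output, with the averaging strategy, whose meta-level input width stays constant regardless of the number of distributed nodes. The variable names and the use of random Dirichlet vectors as stand-ins for local classifier outputs are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

n_nodes, n_samples, n_classes = 50, 200, 3

# Stand-in for per-node class-probability predictions on a shared
# evaluation set (in the method these would come from classifiers
# trained locally at each distributed node).
local_probs = rng.dirichlet(np.ones(n_classes), size=(n_nodes, n_samples))
# shape: (n_nodes, n_samples, n_classes)

# Plain Stacking concatenates every node's output, so the meta-level
# feature width grows linearly with the number of nodes.
stacking_features = local_probs.transpose(1, 0, 2).reshape(n_samples, -1)
print(stacking_features.shape)  # (200, 150) = n_nodes * n_classes columns

# The averaging strategy instead trains the global model on the
# per-class mean over nodes, so the width is independent of n_nodes.
averaged_features = local_probs.mean(axis=0)
print(averaged_features.shape)  # (200, 3) = n_classes columns
```

With 50 nodes and 3 classes, the meta-level learner sees 150 input features under concatenation but only 3 under averaging, which is the source of the claimed speed-up when combining a large number of classifiers.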
