Jmax-pruning: a facility for the information theoretic pruning of modular classification rules

F. Stahl, Max Bramer

Research output: Contribution to journal › Article › peer-review


Abstract

The Prism family of algorithms induces modular classification rules, in contrast to the Top Down Induction of Decision Trees (TDIDT) approach, which induces classification rules in the intermediate form of a tree structure. Both approaches achieve comparable classification accuracy; however, in some cases Prism outperforms TDIDT. For both approaches, pre-pruning facilities have been developed in order to prevent the induced classifiers from overfitting on noisy datasets, by cutting rule terms or whole rules, or by truncating decision trees according to certain metrics. Many pre-pruning mechanisms have been developed for the TDIDT approach, but for the Prism family the only existing pre-pruning facility is J-pruning. J-pruning works not only on Prism algorithms but also on TDIDT. Although it has been shown that J-pruning produces good results, this work points out that J-pruning does not use its full potential. The original J-pruning facility is examined, and the use of a new pre-pruning facility, called Jmax-pruning, is proposed and evaluated empirically. A possible pre-pruning facility for TDIDT based on Jmax-pruning is also discussed.
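As background to the abstract: J-pruning is based on the J-measure of Smyth and Goodman, which quantifies the information content of a rule "IF y THEN x". The sketch below is an illustrative Python implementation of that measure under the standard definition; it is not code from the paper, and the function name and parameter names are our own.

```python
import math

def j_measure(p_y, p_x_given_y, p_x):
    """J-measure of a rule 'IF y THEN x' (Smyth & Goodman).

    p_y         -- probability that the rule's antecedent y fires
    p_x_given_y -- probability of class x given the antecedent
    p_x         -- prior probability of class x

    Returns p(y) * j(X; Y=y), where j is the cross-entropy between
    the posterior and prior class distributions.
    """
    def term(a, b):
        # contribution a * log2(a/b), taken as 0 when a == 0
        return a * math.log2(a / b) if a > 0 else 0.0

    j = term(p_x_given_y, p_x) + term(1.0 - p_x_given_y, 1.0 - p_x)
    return p_y * j
```

A rule whose posterior class distribution equals the prior carries no information (J = 0), while a rule that perfectly predicts a rare class scores highly; J-pruning stops specialising a rule when adding further terms would decrease this value.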
Original language: English
Pages (from-to): 12-19
Number of pages: 8
Journal: Knowledge-Based Systems
Volume: 29
DOIs
Publication status: Published - 2012
