Induction of modular classification rules: using Jmax-pruning

F. Stahl, Max Bramer

Research output: Contribution to conference › Paper › peer-review


Abstract

The Prism family of algorithms induces modular classification rules which, in contrast to decision tree induction algorithms, do not necessarily fit together into a decision tree structure. Classifiers induced by Prism algorithms achieve accuracy comparable to that of decision trees and in some cases even outperform them. Both kinds of algorithms tend to overfit on large and noisy datasets, which has led to the development of pruning methods. Pruning methods use various metrics to truncate decision trees or to eliminate whole rules or single rule terms from a Prism rule set. For decision trees many pre-pruning and post-pruning methods exist; for Prism algorithms, however, only one pre-pruning method has been developed: J-pruning. Recent work with Prism algorithms examined J-pruning in the context of very large datasets and found that the current method does not use its full potential. This paper revisits the J-pruning method for the Prism family of algorithms, develops a new pruning method, Jmax-pruning, discusses it in theoretical terms and evaluates it empirically.
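The pruning metric underlying both methods is Smyth and Goodman's J-measure, which scores the information content of a rule "IF body THEN class". A minimal illustrative sketch follows, with a Jmax-style truncation step that keeps the rule prefix attaining the highest J value; the function names and the per-term statistics format are assumptions for illustration, not the paper's implementation.

```python
import math

def j_measure(p_x, p_y_given_x, p_y):
    """J-measure for a rule IF X THEN Y (Smyth & Goodman).
    p_x: probability the rule body fires (coverage),
    p_y_given_x: rule confidence, p_y: prior of the target class."""
    def term(a, b):
        # contribution a*log2(a/b), with the convention 0*log(0) = 0
        return a * math.log2(a / b) if a > 0 else 0.0
    j_inner = term(p_y_given_x, p_y) + term(1 - p_y_given_x, 1 - p_y)
    return p_x * j_inner

def jmax_truncate(term_stats):
    """Jmax-style truncation (sketch): term_stats[i] holds the
    (p_x, p_y_given_x, p_y) statistics of the rule after its first
    i+1 terms; keep the prefix with the maximum J value."""
    j_values = [j_measure(*s) for s in term_stats]
    best = max(range(len(j_values)), key=lambda i: j_values[i])
    return best + 1, j_values  # number of rule terms to keep
```

For example, if adding a second term raises the J value but a third, overly specific term lowers it again, `jmax_truncate` keeps the two-term rule, whereas a forward-only pre-pruning stop could halt earlier or later.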
Original language: English
Publication status: Published - 2011
Event: Thirtieth SGAI International Conference on Innovative Techniques and Applications of Artificial Intelligence - Cambridge
Duration: 14 Dec 2010 - 16 Dec 2010

Conference

Conference: Thirtieth SGAI International Conference on Innovative Techniques and Applications of Artificial Intelligence
City: Cambridge
Period: 14/12/10 - 16/12/10
