Abstract
This paper proposes a new decision tree generator, MEC, which uses the difference of multi-base entropies as a single consistent criterion for both the discretization and the selection of attributes. To evaluate its performance, the proposed generator is compared with other generators that use entropy-based criteria and adopt different discretization styles. The experimental results show that the proposed generator produces the most efficient classifiers, i.e., those with the fewest leaves at a given error rate, regardless of whether the attribute values in the training set are discrete or continuous.