Manwani, N and Sastry, PS (2012) Geometric Decision Tree. In: IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, 42 (1). pp. 181-192.
In this paper, we present a new algorithm for learning oblique decision trees. Most current decision tree algorithms rely on impurity measures to assess the goodness of hyperplanes at each node while learning the tree in a top-down fashion. These impurity measures do not properly capture the geometric structure in the data. Motivated by this, our algorithm assesses candidate hyperplanes in a way that takes the geometric structure of the data into account. At each node of the decision tree, we find the clustering hyperplanes for both classes and use their angle bisectors as the split rule at that node. We show through empirical studies that this idea leads to small decision trees and better performance. We also present some analysis to show that the angle bisectors of clustering hyperplanes that we use as split rules at each node are solutions of an interesting optimization problem, and hence argue that this is a principled method of learning a decision tree.
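The node-splitting idea from the abstract can be sketched in code. This is a minimal illustration under my own assumptions, not the authors' implementation: each class's clustering hyperplane is taken as the smallest generalized eigenvector of the two classes' scatter matrices (the paper's keywords mention a generalized eigenvalue problem), and the better of the two angle bisectors is chosen by Gini impurity. The function names, the regularization term, and the impurity-based tie-break are all my choices; degenerate cases such as parallel clustering hyperplanes are not handled here.

```python
import numpy as np
from scipy.linalg import eigh

def clustering_hyperplane(A, B, reg=1e-6):
    """Augmented hyperplane w = (normal, bias) that is close to the rows
    of A and far from the rows of B: minimize ||[A 1]w||^2 / ||[B 1]w||^2.
    Solved as a symmetric generalized eigenvalue problem (smallest pair).
    The small ridge term `reg` is an assumption, added for stability."""
    A1 = np.hstack([A, np.ones((A.shape[0], 1))])
    B1 = np.hstack([B, np.ones((B.shape[0], 1))])
    G = A1.T @ A1 + reg * np.eye(A1.shape[1])
    H = B1.T @ B1 + reg * np.eye(B1.shape[1])
    _, vecs = eigh(G, H)   # eigenvalues in ascending order
    return vecs[:, 0]      # eigenvector of the smallest eigenvalue

def angle_bisectors(w1, w2):
    """Both angle bisectors of two hyperplanes (normals normalized first)."""
    u1 = w1 / np.linalg.norm(w1[:-1])
    u2 = w2 / np.linalg.norm(w2[:-1])
    return u1 + u2, u1 - u2

def gini_of_split(X, y, w):
    """Weighted Gini impurity of splitting (X, y) by the sign of w."""
    X1 = np.hstack([X, np.ones((X.shape[0], 1))])
    side = X1 @ w >= 0
    total = 0.0
    for mask in (side, ~side):
        if mask.sum() == 0:
            continue
        p = np.mean(y[mask])                  # fraction of class 1 on this side
        total += mask.sum() / len(y) * 2 * p * (1 - p)
    return total

def best_bisector(X, y):
    """Split rule at one node: the angle bisector of the two per-class
    clustering hyperplanes with the lower Gini impurity (binary labels 0/1)."""
    A, B = X[y == 1], X[y == 0]
    w1 = clustering_hyperplane(A, B)
    w2 = clustering_hyperplane(B, A)
    b1, b2 = angle_bisectors(w1, w2)
    return min((b1, b2), key=lambda w: gini_of_split(X, y, w))
```

On two well-separated clusters the chosen bisector lies between the per-class clustering hyperplanes and yields a near-pure split, which matches the abstract's claim that geometry-aware splits lead to small trees.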
Item Type: Journal Article
Additional Information: Copyright 2012 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.
Keywords: Decision trees; generalized eigenvalue problem; multiclass classification; oblique decision tree.
Department/Centre: Division of Electrical Sciences > Electrical Engineering
Date Deposited: 16 Mar 2012 08:13
Last Modified: 16 Mar 2012 08:13