
Using Mutual Information For Selecting Continuous-valued Attribute In Decision Tree Learning

Posted on: 2005-02-16
Degree: Master
Type: Thesis
Country: China
Candidate: H Li
Full Text: PDF
GTID: 2120360125454791
Subject: Basic mathematics
Abstract/Summary:
Fayyad's decision tree learning algorithm selects attributes with an entropy-based measure. However, that measure considers only the relation between condition attributes and the decision attribute, so an attribute that has already been used may be selected again for branching, and such repeated selection cannot yield the greatest decrease in information entropy. Here we use mutual information to avoid this limitation, proposing a learning algorithm that combines the information entropy minimization heuristic with a mutual information heuristic to select expanded attributes. Our test results show that this method achieves good training and testing accuracy.

In addition, we define a total ordering on the space of one-side triangular fuzzy numbers and propose a learning algorithm that generates decision trees for data sets whose attribute values are one-side triangular fuzzy numbers. By restricting partitioning to nonstationary cut-points, the computational load of partitioning is reduced.
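To illustrate the kind of selection heuristic described above, the sketch below computes Shannon entropy and mutual information for discrete-valued columns, then prefers the candidate attribute with the highest mutual information with the class while skipping candidates that are largely redundant with already-selected attributes. This is a minimal sketch under stated assumptions, not the thesis's algorithm: the function names, the dict-of-rows data layout, and the redundancy_limit threshold (in bits) are illustrative, and the thesis's handling of continuous-valued attributes via cut-points is not reproduced here.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy H(Y), in bits, of a sequence of discrete labels."""
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def mutual_information(xs, ys):
    """I(X; Y) = H(Y) - H(Y | X) for two parallel sequences of discrete values."""
    n = len(xs)
    h_y_given_x = 0.0
    for x in set(xs):
        subset = [y for xi, y in zip(xs, ys) if xi == x]
        h_y_given_x += len(subset) / n * entropy(subset)
    return entropy(ys) - h_y_given_x

def select_attribute(data, candidates, target, selected, redundancy_limit=0.5):
    """Pick the candidate with maximal I(attribute; class).

    Candidates sharing more than redundancy_limit bits of mutual information
    with an already-selected attribute are skipped (hypothetical threshold).
    """
    best, best_score = None, -1.0
    ys = [row[target] for row in data]
    for a in candidates:
        xs = [row[a] for row in data]
        if any(mutual_information(xs, [row[s] for row in data]) > redundancy_limit
               for s in selected):
            continue
        score = mutual_information(xs, ys)
        if score > best_score:
            best, best_score = a, score
    return best

# Usage with a toy data set (attribute and class names are made up):
rows = [
    {"outlook": "sunny", "windy": "no",  "play": "yes"},
    {"outlook": "sunny", "windy": "yes", "play": "no"},
    {"outlook": "rain",  "windy": "no",  "play": "yes"},
    {"outlook": "rain",  "windy": "yes", "play": "no"},
]
print(select_attribute(rows, ["outlook", "windy"], "play", selected=[]))
```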
Keywords/Search Tags: Inductive learning, Machine learning, Decision tree, Information entropy minimization, Mutual information, One-side triangular fuzzy number