
The Construction Of Decision Tree Based On The Covering Rough Set Theory

Posted on: 2017-03-20    Degree: Master    Type: Thesis
Country: China    Candidate: D L Ma    Full Text: PDF
GTID: 2348330488486022    Subject: Computational Mathematics
Abstract/Summary:
Rough set theory is a mathematical method for dealing with incomplete and imprecise knowledge and has been applied in many fields such as machine learning, data mining, and pattern recognition. Rough sets discover knowledge hidden in data sets, and attribute reduction is the core of rough set theory: it aims to delete superfluous attributes, improve efficiency, save storage space, and reduce the complexity of systems. Classical rough set theory is based on an equivalence relation, i.e. a partition, and can only handle data sets in which every object takes a unique discrete value for every attribute. In many practical problems, however, an object may take set values or missing values rather than a unique discrete value for an attribute. Covering rough sets are developed by replacing partitions of the universe with coverings. Decision tree learning is a classification method for approximating discrete-valued functions; its core task is to assign samples to the possible discrete values. Decision trees have been successfully applied to medical diagnosis, assessment of credit risk, and classification of celestial bodies.

The main work of this thesis is summarized as follows:

1. In Section 3, for set-valued data sets, ?-reduction with covering rough sets keeps the confidence of the possible rules from falling below a prescribed threshold. In the study of ?-reduction with covering rough sets, the minimal elements of the discernibility matrix are sufficient to find a ?-reduction. ?-reduction of inconsistent decision systems aims to delete superfluous attributes, improve efficiency, and cope with noise and inconsistency. (See the first sketch after this abstract.)

2. In Section 4, for the problem of constructing decision trees from inconsistent systems, information gain and confidence are combined to build the tree: information gain selects the splitting node, and confidence determines the rules so as to avoid overfitting while the tree is generated. The algorithm not only guarantees that the confidence of the generated rules is not less than a specified threshold, but also eliminates the pruning step, and the confidence degree can be used to describe an inconsistent leaf. A decision tree built with this method is simple in structure, easy to comprehend, and describes the inconsistency of a decision system efficiently. Several experiments demonstrate that the proposed decision tree method is effective. (See the second sketch after this abstract.)
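To make the discernibility-matrix idea in item 1 concrete, the sketch below builds a classical single-valued discernibility matrix for a toy decision table, keeps only its minimal elements, and searches for a minimal hitting set (a reduct). The toy data and the restriction to single-valued attributes are illustrative assumptions; the thesis works with set-valued data and ?-reduction under covering rough sets, which this sketch does not implement.

from itertools import combinations

# Hypothetical single-valued decision table: each row is (attribute values, decision).
data = [
    ((0, 1, 1), 'yes'),
    ((1, 1, 0), 'yes'),
    ((1, 0, 0), 'no'),
    ((0, 0, 1), 'no'),
]
attrs = [0, 1, 2]

def discernibility_matrix(table):
    """Attribute sets that distinguish object pairs with different decisions."""
    entries = []
    for (x, dx), (y, dy) in combinations(table, 2):
        if dx != dy:
            diff = frozenset(a for a in attrs if x[a] != y[a])
            if diff:
                entries.append(diff)
    return entries

def minimal_elements(entries):
    """Only the minimal entries are needed to determine reducts."""
    return [e for e in entries if not any(f < e for f in entries)]

def hits_all(subset, entries):
    """A reduct must intersect every (minimal) discernibility entry."""
    return all(subset & e for e in entries)

entries = minimal_elements(discernibility_matrix(data))
for size in range(1, len(attrs) + 1):
    reducts = [set(c) for c in combinations(attrs, size) if hits_all(set(c), entries)]
    if reducts:
        print('smallest reducts:', reducts)
        break

Restricting attention to the minimal entries shrinks the search without changing which attribute subsets qualify, which mirrors the role the minimal elements play for ?-reduction in item 1.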
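The second sketch illustrates item 2 with a minimal ID3-style construction: information gain picks the splitting attribute, and a confidence threshold (named beta here, an assumed name and value) stops splitting instead of a separate pruning step, while the stored confidence describes inconsistent leaves. The toy data are assumptions for illustration; this is not the thesis's algorithm for inconsistent covering decision systems.

import math
from collections import Counter

def entropy(rows):
    counts = Counter(r['decision'] for r in rows)
    total = len(rows)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def info_gain(rows, attr):
    total = len(rows)
    remainder = 0.0
    for v in set(r[attr] for r in rows):
        subset = [r for r in rows if r[attr] == v]
        remainder += len(subset) / total * entropy(subset)
    return entropy(rows) - remainder

def build_tree(rows, attrs, beta=0.8):
    counts = Counter(r['decision'] for r in rows)
    label, freq = counts.most_common(1)[0]
    confidence = freq / len(rows)
    # Stop when the majority rule is confident enough (>= beta) or no
    # attributes remain; the stored confidence labels inconsistent leaves.
    if confidence >= beta or not attrs:
        return {'leaf': label, 'confidence': round(confidence, 3)}
    best = max(attrs, key=lambda a: info_gain(rows, a))  # information gain selects the node
    children = {v: build_tree([r for r in rows if r[best] == v],
                              [a for a in attrs if a != best], beta)
                for v in set(r[best] for r in rows)}
    return {'split': best, 'children': children}

# Hypothetical data containing one inconsistent pair of objects.
rows = [
    {'outlook': 'sunny', 'windy': 'no',  'decision': 'play'},
    {'outlook': 'sunny', 'windy': 'yes', 'decision': 'stay'},
    {'outlook': 'rain',  'windy': 'no',  'decision': 'play'},
    {'outlook': 'rain',  'windy': 'no',  'decision': 'stay'},
    {'outlook': 'rain',  'windy': 'no',  'decision': 'play'},
]
print(build_tree(rows, ['outlook', 'windy'], beta=0.8))

Because a leaf keeps its confidence, an inconsistent leaf (confidence below 1) still yields a usable rule rather than forcing further splits or a post-hoc pruning pass.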
Keywords/Search Tags: covering rough set, ?-reduction, discernibility matrix, inconsistent decision tree