
Probabilistic Tolerance Rough Set

Posted on: 2015-01-13
Degree: Master
Type: Thesis
Country: China
Candidate: B Wan
Full Text: PDF
GTID: 2268330422469450
Subject: Computer application technology
Abstract/Summary:
Classical rough set theory, proposed by Professor Pawlak in 1982, is a data mining method that can handle uncertain data, such as inconsistent or incomplete data. The core idea of rough sets is to approximate a target concept with a pair of lower and upper approximation operators. Together with classical set theory and fuzzy set theory, rough set theory is regarded as one of the three main set theories. Because the rough set method requires no prior knowledge and is easy to understand and implement, it has been widely applied to data mining, pattern classification, intelligent information processing, and related areas, and has become a hot research topic in recent years. To overcome the drawback that the classical rough set model has no tolerance for misclassification, probabilistic rough set models with adjustable parameters, such as decision-theoretic rough sets and variable precision rough sets, have been proposed by different researchers. However, these probabilistic models are still based on equivalence relations and cannot be applied directly to real-valued data. To address this problem, this thesis extends the equivalence relation of the probabilistic rough set to a tolerance relation and proposes a probabilistic tolerance rough set model; the basic properties of its upper and lower approximation operators, including their numerical characteristics and reduction, are investigated. In addition, by considering the minimum Bayes decision risk, the thesis transforms the threshold selection problem into a parameter optimization problem and employs a genetic algorithm to solve it.
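For context, the probabilistic approximations can be adapted from equivalence classes to tolerance classes roughly as follows; this is a sketch of the usual (alpha, beta)-threshold definitions with a tolerance class T(x) in place of an equivalence class, and is not necessarily the exact formulation used in the thesis.

% Sketch: probabilistic approximations over a tolerance relation T
% (reflexive and symmetric), with thresholds 1 >= alpha > beta >= 0;
% T(x) is the tolerance class of x, X is the target concept, U the universe.
\[
  P(X \mid T(x)) = \frac{|X \cap T(x)|}{|T(x)|}
\]
\[
  \underline{apr}_{(\alpha,\beta)}(X) = \{\, x \in U : P(X \mid T(x)) \ge \alpha \,\}
\]
\[
  \overline{apr}_{(\alpha,\beta)}(X) = \{\, x \in U : P(X \mid T(x)) > \beta \,\}
\]

Under this reading, the pair of thresholds (alpha, beta) is the parameter that the genetic algorithm would search for when minimizing the Bayes decision risk, as described in the abstract.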
Keywords/Search Tags: Rough set, Probabilistic rough set, Tolerance rough set, Probabilistic tolerance rough set, Genetic algorithm