
Gray Adaptive Resonance Theory And Its Performance Assessment

Posted on: 2004-12-03
Degree: Master
Type: Thesis
Country: China
Candidate: Z G Tan
Full Text: PDF
GTID: 2208360092976023
Subject: Computer software and theory
Abstract/Summary:
In general, neural network learning methods fall into three types: supervised learning, unsupervised learning, and reinforcement learning. Adaptive Resonance Theory (ART), proposed by S. Grossberg and G. A. Carpenter, is an important class of competitive neural network model based on unsupervised learning. It has a memory mode similar to that of humans, and its storage capacity grows as new patterns are learned. Moreover, the model can learn online, which makes it well suited to dynamic environments and gives it good adaptability.

According to the symmetry of the similarity functions used, ART models can be divided into two classes: the traditional ART and the simplified ART (SART). The similarity function of the former is asymmetric, while that of the latter is symmetric. Both have found broad application owing to their simple learning rules and dynamic online learning behavior. However, their pivotal weight adjustments are determined only by the learning rate and the difference between the input pattern and the winning neuron's weights. The traditional ART and SART thus appear to ignore implicit correlative relationships that actually exist during learning between the input patterns and the weights of all the other nodes participating in the competition.

In this paper, Deng's grey relational coefficients (GRCs), which characterize and stress these correlative relationships, are explicitly introduced into the learning rules of the traditional ART and SART. These combinations produce a family of Grey ART (GART) and Grey SART (GSART) models, whose grey relational functions include polynomial, exponential, and tangential forms. Further analysis shows that the GRCs defined above neglect the whole characteristics of the relationship between the input patterns and the weights; this observation leads to another family of GRCs that incorporate these whole characteristics, from which the Improved GART (IGART) and Improved GSART (IGSART) models are derived. Finally, experiments on the Iris and Wine benchmark datasets confirm the validity and feasibility of the two proposed model families and their advantages over the traditional ART and SART.
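The abstract does not reproduce the learning rules themselves, but the idea can be sketched. Below is a minimal illustration, assuming the standard Deng GRC definition with distinguishing coefficient rho = 0.5 and assuming the coefficient enters the winner's weight update multiplicatively; the function names (grey_relational_coefficients, gart_update), the simple distance-based competition step, and the form of the combination are hypothetical and are not the exact GART/IGART formulation of the thesis.

```python
import numpy as np

def grey_relational_coefficients(x, W, rho=0.5):
    """Deng's grey relational coefficients between an input pattern x
    and every weight vector (row) of W.

    x   : (d,)   input pattern
    W   : (m, d) weight matrix of all competing nodes
    rho : distinguishing coefficient, commonly 0.5
    Returns an (m, d) array of component-wise coefficients.
    """
    delta = np.abs(x - W)                       # absolute differences to every node
    d_min, d_max = delta.min(), delta.max()     # extremes over ALL nodes and components
    return (d_min + rho * d_max) / (delta + rho * d_max + 1e-12)

def gart_update(x, W, winner, eta=0.1, rho=0.5):
    """One GRC-modulated weight update for the winning node.

    The classical rule  w_J += eta * (x - w_J)  is scaled component-wise by the
    grey relational coefficient between x and w_J (an assumed way of combining
    the two), so the step size also reflects the relational information.
    """
    gamma = grey_relational_coefficients(x, W, rho)[winner]
    W = W.copy()
    W[winner] += eta * gamma * (x - W[winner])
    return W

# Toy usage: three category nodes, four-dimensional input.
W = np.array([[0.2, 0.4, 0.1, 0.9],
              [0.7, 0.1, 0.5, 0.3],
              [0.4, 0.8, 0.6, 0.2]])
x = np.array([0.6, 0.2, 0.5, 0.4])
winner = int(np.argmin(np.linalg.norm(W - x, axis=1)))  # stand-in for ART competition
W_new = gart_update(x, W, winner)
```

Because the minimum and maximum differences are taken over all competing nodes, the winner's coefficient depends on the whole weight matrix, which is precisely the correlative information the abstract argues the classical ART and SART updates discard.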
Keywords/Search Tags:Neural Network, Pattern Recognition, Adaptive Resonance Theory (ART), Grey Relation Coefficient (GRC), Grey Relation Function, Whole Characteristic, Clustering, Unsupervised Learning