
Mixture Of Experts Research And Its Applications

Posted on: 2008-09-21    Degree: Master    Type: Thesis
Country: China    Candidate: Y Pan    Full Text: PDF
GTID: 2178360212990714    Subject: Computer application technology
Abstract/Summary:
With the rapid development of the natural and social sciences, the amount of data in every field is growing geometrically. It is becoming increasingly important to extract useful information from these large data sets, to discover the relationships among them, and to exploit them effectively. To make better use of these data for training and forecasting, researchers have devised many training and forecasting methods based on different theories. These methods have achieved good forecasting results in some fields, but each approach still has its limitations, and researchers continue to work on improvements.

Mixture of Experts is a hybrid system that combines several strong data-training methods, such as neural networks and various clustering algorithms. In essence, it uses a clustering algorithm as a gating network to combine the outputs of multiple neural networks. As a multi-level expert network, it should produce better results than any single training method.

Most previous forecasting systems suffer from the following shortcomings. Because of the limitations of a single system, the error of a single forecasting system is not stable across different data. Manually defining the weights in the expert network is ineffective and poorly targeted: the data are trained by different expert networks and their outputs are combined with weights to form the final result, but the weights must be set by hand and bear no relation to the sample data. If the weight values are too similar, or the combination method is unreasonable, the result will be weak.

To address these problems, this paper proposes a Mixture of Experts (MoE) approach. For each set of sample data, the MoE approach uses a gating network to automatically generate several clusters, find their centers, and then compute the weight of every sample for every cluster. The expert networks then train on the weighted samples and are used for testing.

Based on the MoE approach, this paper designs and implements a modular fuzzy mixture of experts (MFMoE) and a modular support vector mixture of experts (MSVMoE). For the same sample data, MFMoE and MSVMoE implement the gating network through Fuzzy C-Means (FCM) and Support Vector Clustering (SVC), respectively. MFMoE depends on the number of samples and on the initial values; the number of clusters must be defined in advance; the cluster shape is a hypersphere; and the sizes of the clusters are nearly equal. This paper therefore tries to use MSVMoE to overcome these limitations and improve the system.
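To illustrate the gating idea described above, the following is a minimal sketch (not the thesis implementation) of an FCM-gated mixture of experts in Python with NumPy: fuzzy c-means acts as the gating network, assigning each sample a membership weight for every cluster, and each cluster owns a simple weighted least-squares linear expert whose outputs are mixed by those memberships. The function names, the choice of linear experts, and the toy data are illustrative assumptions, not the thesis code.

import numpy as np

def fcm(X, n_clusters, m=2.0, n_iter=100, tol=1e-5, seed=0):
    # Fuzzy C-Means: returns cluster centers and the membership matrix U (n_samples x n_clusters).
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], n_clusters))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U_new = 1.0 / (dist ** (2.0 / (m - 1.0)))
        U_new /= U_new.sum(axis=1, keepdims=True)
        if np.abs(U_new - U).max() < tol:
            U = U_new
            break
        U = U_new
    return centers, U

def fit_experts(X, y, U):
    # One weighted least-squares linear expert per cluster, weighted by the FCM memberships.
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # add bias column
    experts = []
    for k in range(U.shape[1]):
        w = np.sqrt(U[:, k])[:, None]
        beta, *_ = np.linalg.lstsq(Xb * w, y * w.ravel(), rcond=None)
        experts.append(beta)
    return experts

def predict(X, centers, experts, m=2.0):
    # Gate new samples by their memberships to the learned centers, then mix the expert outputs.
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
    U = 1.0 / (dist ** (2.0 / (m - 1.0)))
    U /= U.sum(axis=1, keepdims=True)
    preds = np.stack([Xb @ beta for beta in experts], axis=1)  # (n_samples, n_clusters)
    return (U * preds).sum(axis=1)

# Toy usage: two local linear regimes, each handled mainly by one expert.
X = np.random.default_rng(1).uniform(-3, 3, size=(200, 1))
y = np.where(X[:, 0] < 0, 2 * X[:, 0] + 1, -X[:, 0] + 4)
centers, U = fcm(X, n_clusters=2)
experts = fit_experts(X, y, U)
print(predict(np.array([[-2.0], [2.0]]), centers, experts))

An SVC-gated variant (MSVMoE) would replace the fcm step with support vector clustering, which does not fix the number or shape of clusters in advance; the expert fitting and weighted combination would follow the same pattern.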
Keywords/Search Tags:Mixture of Experts, Committee Machines, Neural Network, Gating Network, Fuzzy C-Means, Support Vector Clustering