
Convergence Of Learning And Dispensability Of Bias For Fuzzy Neural Networks

Posted on: 2007-01-02    Degree: Doctor    Type: Dissertation
Country: China    Candidate: J Yang    Full Text: PDF
GTID: 1118360182482427    Subject: Computational Mathematics
Abstract/Summary:
Fuzzy sets and neuro-computation theory play important roles in information processing. Fuzzy logic and neural networks (NNs) are different computational models that supply different capabilities to information processing systems. Since the late 1980s, when Prof. Kosko integrated them into a unified framework called fuzzy neural networks (FNNs) by introducing the fuzzy operators max (∨) and min (∧) into the operations of conventional NNs, FNNs have been a very active research area in intelligent computation.

A famous result in neural network theory, when no fuzzy logic is involved, is the finite convergence theorem for the perceptron rule: when the training examples are linearly separable, the learning procedure stops (converges) after finitely many iterations. Does the same hold for fuzzy perceptrons? We propose a new training algorithm for a fuzzy perceptron and try to answer this question. When the dimension of the input vectors is two, we can prove finite convergence. When the dimension is greater than two, stronger conditions are needed to guarantee finite convergence.

The other question addressed in this thesis is whether the bias is dispensable for FNNs. It is well known that a conventional feedforward NN always has bias (threshold) terms, which are necessary for it to solve classification or approximation problems. For FNNs, however, it is not yet clear whether the bias is dispensable: some authors introduce the bias, while others do not. Here, "the bias is dispensable" means the following: if an FNN with bias can perform a fuzzy mapping, then an FNN without bias can perform the same mapping. We first show that the bias is basically dispensable for a fuzzy perceptron used for classification problems. On the other hand, for max-min FNNs used for approximation problems, the bias is dispensable if and only if a special condition is satisfied. This condition is generally not valid, or not easy to verify in practice. Therefore, the bias is generally indispensable for max-min FNNs.
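For reference, in one standard max-min formulation (the exact network model used in the thesis may differ in details), a single max-min neuron with inputs x_1, …, x_n, weights w_1, …, w_n and bias θ computes

    y = (w_1 ∧ x_1) ∨ (w_2 ∧ x_2) ∨ … ∨ (w_n ∧ x_n) ∨ θ,

and the same neuron without bias simply omits the trailing "∨ θ" term. In this notation, "the bias is dispensable" means that every fuzzy mapping realizable by networks of the first form is also realizable by networks of the second form.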
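As a point of comparison, the classical (crisp) perceptron rule referred to above can be sketched as follows. This is only the standard non-fuzzy algorithm whose finite convergence theorem motivates the question, not the new fuzzy training algorithm proposed in the thesis; the function and variable names are illustrative.

```python
import numpy as np

def perceptron_train(X, y, max_epochs=1000):
    """Classical perceptron rule. X: (n, d) real inputs; y: labels in {-1, +1}.
    If the examples are linearly separable, only finitely many updates occur
    before the loop exits (the finite convergence theorem)."""
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0                                # bias (threshold) term
    for _ in range(max_epochs):
        updated = False
        for xi, yi in zip(X, y):
            if yi * (w @ xi + b) <= 0:     # misclassified example
                w += yi * xi               # perceptron update
                b += yi
                updated = True
        if not updated:                    # a full pass with no errors:
            return w, b                    # training has converged
    return w, b
```

In the crisp setting the bias b is needed to shift the separating hyperplane away from the origin; the thesis asks whether the analogous θ term can be dropped in the fuzzy setting.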
Keywords/Search Tags: Fuzzy neural network, Finite convergence, Bias, Fuzzy perceptron, Max-min fuzzy neural network