In any real industrial control system, faults inevitably occur over its life cycle. Fault detection technology, as an important guarantee of system stability and reliability, has therefore long received wide attention, and a series of fault detection methods have been proposed and developed for traditional centralized control systems. However, with the rapid development of control and network communication technologies, modern industrial control systems are no longer confined to centralized local architectures; they are often connected through a shared digital network to many control units spread across different regions, forming large distributed systems. The introduction of the network brings flexibility and reduced cost, but it also brings adverse factors such as delay, packet loss, and quantization errors caused by the ADC (analog-to-digital converter). These effects place higher requirements on the fault detection system.

Firstly, this thesis briefly introduces several benchmark models for simulation, the basic behavior of a fault detection system, and the assessment of its performance. Secondly, we study the influence of network-induced effects on traditional fault detection methods through simulations on the benchmark models under different network conditions, and summarize the results. Thirdly, we investigate fault detection for networked control systems (NCSs) subject to quantization and random packet dropout, and derive a sufficient condition under which the designed filter satisfies a prescribed H-infinity disturbance attenuation level. Comparative simulations against the traditional method and adaptability simulations of the proposed method are then carried out. Finally, simulations of the fault detection method for nonlinear NCSs are performed on the benchmark models, and the adaptability results are presented.
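To make the network-induced effects mentioned above concrete, the following minimal sketch models two of them, quantization and random packet dropout, acting on a measurement signal. All names and parameters (`quantize`, `transmit`, the quantization step, the dropout probability) are hypothetical illustrations, not the thesis's actual simulation setup; the dropout is modeled as an independent Bernoulli loss per sample, with the receiver holding the last successfully received value.

```python
import random

def quantize(x, step=0.1):
    """Uniform quantizer modeling the ADC quantization error."""
    return round(x / step) * step

def transmit(samples, drop_prob=0.5, seed=0):
    """Model transmission over a shared network: each quantized sample
    is lost independently with probability drop_prob (Bernoulli packet
    dropout); on loss, the receiver holds the last received value
    (zero before any packet arrives)."""
    rng = random.Random(seed)
    received = []
    last = 0.0
    for x in samples:
        q = quantize(x)
        if rng.random() >= drop_prob:  # packet arrives
            last = q
        received.append(last)         # held value on dropout
    return received

signal = [0.05 * k for k in range(10)]  # hypothetical measurement ramp
out = transmit(signal)
```

A fault detection filter fed with `out` instead of `signal` sees a distorted residual even without any fault, which is exactly why the traditional methods must be re-examined under these network conditions.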