
Dissertation Template For Doctoral Degree Of Engineering In Shanghai Jiao Tong University

Posted on: 2008-04-20 | Degree: Doctor | Type: Dissertation
Country: China | Candidate: Q Gao | Full Text: PDF
GTID: 1118360242476031 | Subject: Control theory and control engineering
Abstract/Summary:
Reliable process data is the foundation of process monitoring, control performance evaluation, process control, optimization, and statistical quality control. However, owing to sources such as measurement irreproducibility, instrument degradation and malfunction, human error, process-related errors, and other unmeasured effects, measurements can be contaminated with random and/or gross errors. Rational use of the large volume of data generated by chemical plants requires suitable techniques to improve its accuracy. Data reconciliation (DR) is a procedure for optimally adjusting measured data so that the adjusted values obey conservation laws and other constraints. Research on data rectification to date falls into three categories: conventional DR; simultaneous data reconciliation and gross error detection; and DR based on estimation of the error probability density function (PDF). Conventional DR assumes normally distributed errors: gross errors are first detected and eliminated, and DR is then performed. Simultaneous strategies assume that measurement errors follow normal-like distributions with heavy tails, or combine two distributions to account for contamination by outliers (gross errors), and take generalized maximum likelihood objective functions as the reconciliation objectives. PDF-estimation-based DR estimates the actual error distribution and then performs DR using that distribution.

This dissertation presents a further study of robust data reconciliation, with particular attention to sensitivity analysis, redundancy calculation, and improved robust least squares algorithms for linear, nonlinear, and dynamic systems. It treats robust data reconciliation systematically and makes progress in the following aspects:
1. The quantitative and qualitative contributions of different measurement variables to estimation precision, variable local redundancy, and gross error detection in robust data rectification based on the contaminated normal distribution were investigated. Based on this information, practical decisions can be made: adding weight factors to suspicious measurements reduces the sensitivity of the estimates to gross errors and unnecessary measurements, enhances variable local redundancy and gross error detectability, and significantly improves the quality of process validation.

2. The conventional iterative robust least squares data rectification algorithm suffers from an inability to detect gross errors in weakly redundant variables and from false detections. An improved iterative algorithm is proposed that addresses these problems by taking local redundancy and inequality constraints into account. During iteration, the suspicious measurement with the largest test statistic is eliminated, or the measurement with the second-largest statistic when the estimate violates its bounds; the method yields a significant improvement in estimation and gross error detection.

3. Solving the generalized maximum likelihood DR problem with a regularized objective function effectively mitigates the singularity of the iterative solution of nonlinear data reconciliation. Adding an influence function matrix to the regularized objective function leads to a more practical iterative robust nonlinear data reconciliation algorithm, with SQP used to solve the nonlinear optimization in each iteration. Simulation results validate the efficiency of the proposed approaches.

4. The contaminated Gaussian distribution based method is robust for data rectification because it accounts for the probability distributions of random errors and gross errors simultaneously.
However, its application is limited because the estimation precision depends on the selection of a priori model parameters, which are difficult to obtain in practice. To avoid supplying these parameters, a robust adaptive data rectification approach is proposed. First, a robust adaptive probability distribution model of the errors is constructed; then the Lagrange method is used to obtain an iterative algebraic solution. Application to a process with bilinear constraints validates the efficiency of the proposed robust adaptive data rectification method.

5. A novel robust method is proposed by introducing a trust function matrix into the conventional least squares objective function of nonlinear dynamic data reconciliation (NDDR). To avoid loss of data information, an element-wise Mahalanobis distance is proposed, as an improvement on the vector-wise distance, to construct a penalty function matrix; this is called the two-step method. However, these methods, like NDDR itself, show a significant delay in the input estimates when a step change occurs, which in turn delays the estimates of the output variables. A continuous error test was therefore included as part of the gross error detection (GED) logic to detect set-point changes. When a set-point change is sensed, a new historical data window is started to track the new set-point. The correlation of measurement errors is also considered.

6. A general and robust classification algorithm is given to analyze the redundancy and observability of process variables for the large-scale bilinear balances with splitters found in industrial processes when data reconciliation is applied. In the algorithm, the degrees of freedom are first used to examine redundancy and observability; a coefficient matrix of the constraints, linearized by Taylor expansion at the initial estimates, is then used to further examine observability for the entire process based on Crowe's method and QR factorization.
This classification method is suitable not only for bilinear processes with intensive constraints but also for trilinear constraints with multiple split fractions.

7. For a quasi-static, multi-component system with bilinear constraints in the Menthol Joint-production Plant of Shanghai Coke Production Factory, a practical robust data rectification algorithm flowsheet was introduced and the rectification results were analyzed.

The dissertation concludes with a summary and an outlook on future robust data reconciliation research.
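The core ideas above — classical least squares reconciliation subject to linear balances, and its robust variant in which an influence function down-weights suspected gross errors instead of eliminating them — can be sketched as follows. This is an illustrative example only, not the dissertation's algorithms: the toy flow network, the equal measurement variances, and the Huber tuning constant c = 1.345 are all assumptions made for demonstration.

```python
import numpy as np

def reconcile_wls(y, A, sigma):
    """Classical data reconciliation: minimize (y - x)' S^{-1} (y - x)
    subject to the linear balances A x = 0, with S = diag(sigma^2).
    The Lagrange-multiplier solution is x = y - S A'(A S A')^{-1} A y."""
    S = np.diag(sigma ** 2)
    G = A @ S @ A.T
    return y - S @ A.T @ np.linalg.solve(G, A @ y)

def huber_weight(r, c=1.345):
    """Huber influence-function weight psi(r)/r for a standardized residual."""
    a = np.abs(r)
    return np.where(a <= c, 1.0, c / np.maximum(a, 1e-12))

def reconcile_huber(y, A, sigma, c=1.345, tol=1e-10, max_iter=100):
    """Robust reconciliation by iteratively reweighted least squares:
    measurements with large standardized adjustments (suspected gross
    errors) are down-weighted rather than eliminated outright."""
    x = reconcile_wls(y, A, sigma)
    for _ in range(max_iter):
        r = (y - x) / sigma                 # standardized adjustments
        w = huber_weight(r, c)
        S = np.diag(sigma ** 2 / w)         # inflate variance of suspects
        G = A @ S @ A.T
        x_new = y - S @ A.T @ np.linalg.solve(G, A @ y)
        if np.max(np.abs(x_new - x)) < tol:
            return x_new
        x = x_new
    return x

# Toy flow network (assumed for illustration): stream 1 splits into
# streams 2 and 3, and stream 3 feeds stream 4.
A = np.array([[1.0, -1.0, -1.0,  0.0],    # x1 - x2 - x3 = 0
              [0.0,  0.0,  1.0, -1.0]])   # x3 - x4 = 0
sigma = np.array([0.2, 0.2, 0.2, 0.2])
true_x = np.array([10.0, 6.0, 4.0, 4.0])
y = true_x + np.array([0.1, -0.15, 0.05, 2.0])  # stream 4 has a gross error

x_wls = reconcile_wls(y, A, sigma)
x_rob = reconcile_huber(y, A, sigma)
# Both estimates satisfy the balances exactly; the Huber estimate is much
# less distorted by the gross error on stream 4 than the WLS estimate.
```

The design choice illustrated here is the one the abstract emphasizes: a bounded influence function keeps a gross error from smearing across all reconciled values, which is what happens in the plain weighted least squares solution.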
Keywords/Search Tags: robust estimation, influence function, data reconciliation, bilinear, redundancy, data classification, Huber estimation, Hampel estimation