
A Study of Biased Estimators for the Multicollinearity Problem

Posted on: 2020-10-19
Degree: Master
Type: Thesis
Country: China
Candidate: J Xu
Full Text: PDF
GTID: 2370330575486600
Subject: Applied Mathematics
Abstract/Summary:
In the linear regression model, least squares estimation has many desirable properties for parameter estimation; it is the most influential and most widely used estimation method and plays a central role in fitting linear regression models. As computing has developed rapidly, linear regression models have been applied to problems in many fields, where defects in variable selection cannot always be avoided: too few variables may be chosen, or superfluous and redundant ones included. These defects give rise to problems such as serial correlation, heteroscedasticity, and multicollinearity in the selected variables or in the random error terms.

To deal with multicollinearity among the independent variables in multiple linear regression, several biased estimators are commonly used: ridge estimation, principal component estimation, partial least squares estimation, and the universal (pan) ridge estimator. After surveying the literature, this thesis discusses multicollinearity in the linear regression model. The common remedies differ in both their theory and their effect. In regression modeling, principal component estimation and partial least squares estimation both work by extracting components; because partial least squares also takes the relationship with the dependent variable into account, it generally outperforms principal component estimation.
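The contrast between the unstable least squares solution and the ridge estimator described above can be sketched numerically. The snippet below uses simulated data (all variable names and the choice of ridge parameter k are illustrative assumptions, not part of the thesis): two predictors are made nearly identical, so the cross-product matrix X'X is close to singular, and the ridge estimator (X'X + kI)^{-1}X'y trades a small bias for a large reduction in variance.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical simulated design: x2 is almost a copy of x1, producing
# severe multicollinearity between the two predictors.
n = 100
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=1e-3, size=n)
X = np.column_stack([x1, x2])
beta_true = np.array([1.0, 2.0])
y = X @ beta_true + rng.normal(scale=0.1, size=n)

# Ordinary least squares: (X'X)^{-1} X'y — unstable when X'X is near-singular.
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Ridge estimator: (X'X + kI)^{-1} X'y — biased, but far lower variance.
k = 1.0
beta_ridge = np.linalg.solve(X.T @ X + k * np.eye(2), X.T @ y)

print("condition number of X'X:", np.linalg.cond(X.T @ X))
print("OLS coefficients:  ", beta_ols)
print("Ridge coefficients:", beta_ridge)
```

Because the two columns are nearly interchangeable, ridge splits the total effect (about 3) roughly evenly between them, whereas the individual OLS coefficients are essentially arbitrary along the collinear direction.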
Keywords/Search Tags: Ridge estimation, Principal component estimation, Partial least squares estimation, Pan ridge estimation, C_L criterion, M(C) criterion