
A Study On The Fast Algorithms Of Large Scale Ordinal Regression

Posted on: 2021-01-02
Degree: Master
Type: Thesis
Country: China
Candidate: X Geng
Full Text: PDF
GTID: 2428330647452814
Subject: Computer Science and Technology
Abstract/Summary:
Ordinal regression is an important class of machine learning problems. Its goal is to predict discrete but ordered categories, which distinguishes it from both multi-class classification and regression. Ordinal regression methods have been applied successfully in many real-world scenarios, such as age estimation, credit evaluation, and information retrieval. In the era of big data, however, machine learning problems routinely involve hundreds of thousands or even tens of millions of samples, and although researchers have proposed many ordinal regression methods, these methods still cannot process large-scale data efficiently. This study therefore investigates fast algorithms for large-scale ordinal regression; the main contents are an asynchronous parallel coordinate descent algorithm, doubly stochastic gradient descent algorithms, and a deep learning algorithm. The main contributions are as follows (illustrative sketches of the threshold model and of each algorithmic idea are given after the abstract):

(1) For the support vector ordinal regression model, this study proposes two new asynchronous greedy coordinate descent algorithms. The first uses an active-set technique to further accelerate the state-of-the-art asynchronous parallel greedy coordinate descent algorithm. The second is specially designed to keep the thresholds ordered as much as possible during training, so that it reaches good prediction performance faster. More importantly, this study analyzes the time complexity of several parallel coordinate descent algorithms. Finally, experiments on multiple large-scale data sets verify the speed-ups of the proposed algorithms.

(2) For the general kernel ordinal regression threshold model, this study proposes a new doubly stochastic gradient descent algorithm. Because the threshold model uses multiple thresholds to separate the ordered categories, the current state-of-the-art doubly stochastic gradient descent algorithm and its theoretical analysis cannot be applied to this model directly. To solve this problem, the proposed algorithm updates the hyperplane and the multiple thresholds separately. In theory, we prove that it attains the same O(1/t) convergence rate as the standard stochastic gradient method. Finally, large-scale experiments confirm that the algorithm is faster than existing methods.

(3) Targeting the characteristics of the ordinal regression problem, this study proposes a new deep ordinal regression algorithm. Ordinal regression combines characteristics of classification and regression, but existing methods usually adopt only one of these two perspectives; moreover, most of them ignore the ordered noise that occurs naturally in ordinal regression data sets, which degrades generalization performance. Given these observations, we propose a new deep ordinal regression objective function that combines the characteristics of classification and regression losses and is more robust to such noise. To further accelerate training on problems with a large number of ranks r, this study uses binary coding to reduce the problem size from O(rn) to O(n log r). Compared with multiple deep ordinal regression algorithms on large-scale data, the proposed method is both faster and more effective.
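All three contributions build on the threshold model: a single score function f(x) together with ordered thresholds b_1 <= ... <= b_{r-1} that cut the real line into r ordered segments, one per rank. Below is a minimal sketch of the prediction rule for a linear score f(x) = w.x; the function and variable names are illustrative, not from the thesis.

```python
import numpy as np

def predict_rank(w, thresholds, x):
    """Threshold-model prediction: rank k is assigned when the score
    f(x) = w @ x falls between thresholds b_{k-1} and b_k."""
    score = w @ x
    # 'thresholds' holds b_1 <= ... <= b_{r-1} in sorted order.
    return int(np.searchsorted(thresholds, score)) + 1  # ranks 1..r
```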
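Contribution (1) accelerates greedy coordinate descent on the dual problem of support vector ordinal regression. The asynchronous parallelization, active-set filtering, and threshold-ordering mechanism of the thesis are beyond a short example, so the sketch below shows only the underlying serial greedy rule on a generic box-constrained quadratic dual (an assumption about the problem form): pick the coordinate with the largest projected gradient and solve its one-dimensional subproblem exactly.

```python
import numpy as np

def greedy_cd(Q, p, C, max_iter=1000, tol=1e-8):
    """Serial greedy coordinate descent for
    min_a 0.5 * a^T Q a - p^T a   s.t.  0 <= a_i <= C."""
    n = Q.shape[0]
    a = np.zeros(n)
    grad = -p.astype(float).copy()  # gradient Q a - p at a = 0
    for _ in range(max_iter):
        # Projected gradient: zero out components blocked by the box.
        pg = grad.copy()
        pg[(a <= 0.0) & (grad > 0.0)] = 0.0
        pg[(a >= C) & (grad < 0.0)] = 0.0
        i = int(np.argmax(np.abs(pg)))
        if abs(pg[i]) < tol:
            break  # (approximately) optimal
        # Exact minimizer along coordinate i, clipped to [0, C].
        new_ai = np.clip(a[i] - grad[i] / Q[i, i], 0.0, C)
        grad += Q[:, i] * (new_ai - a[i])  # incremental gradient update
        a[i] = new_ai
    return a
```

In published asynchronous greedy coordinate descent methods, each thread typically runs this greedy selection within its own block of coordinates; the active-set variant described in the abstract would additionally skip coordinates expected to remain at the box boundary.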
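Contribution (2) extends doubly stochastic functional gradients, where each step draws two sources of randomness: a random training example and a random feature that approximates the kernel. The sketch below pairs random Fourier features for an RBF kernel with an immediate-threshold hinge loss and updates the function part and the thresholds separately; the specific loss, step sizes, and regularization used in the thesis are assumptions here.

```python
import numpy as np

rng = np.random.default_rng(0)

def dsg_ordinal(X, y, r, sigma=1.0, T=2000, eta=0.5):
    """Doubly stochastic gradients for a kernel threshold model.
    Each step samples a data point AND a fresh random Fourier feature,
    then makes separate stochastic updates to the function part and to
    the thresholds."""
    d = X.shape[1]
    omegas, phases, alphas = [], [], []      # random features and weights
    b = np.arange(1, r, dtype=float)         # thresholds b_1 .. b_{r-1}

    def f(x):
        """Current function value, summed over all features drawn so far."""
        if not alphas:
            return 0.0
        feats = np.sqrt(2.0) * np.cos(np.stack(omegas) @ x + np.array(phases))
        return float(np.array(alphas) @ feats)

    for t in range(1, T + 1):
        i = rng.integers(len(X))
        x, k = X[i], int(y[i])                    # rank k in 1..r
        omega = rng.normal(0.0, 1.0 / sigma, d)   # RFF draw for an RBF kernel
        phase = rng.uniform(0.0, 2.0 * np.pi)
        fx = f(x)
        # Immediate-threshold hinge loss: penalize violated margins to the
        # two thresholds adjacent to the true rank k.
        lo = k > 1 and fx - b[k - 2] < 1.0   # want f(x) >= b_{k-1} + 1
        hi = k < r and b[k - 1] - fx < 1.0   # want f(x) <= b_k - 1
        g = (hi - lo) * 1.0                  # dL/df(x)
        step = eta / np.sqrt(t)
        # Function update: append one new random-feature coefficient.
        omegas.append(omega); phases.append(phase)
        alphas.append(-step * g * np.sqrt(2.0) * np.cos(omega @ x + phase))
        # Threshold update: separate stochastic gradient step.
        if lo: b[k - 2] -= step              # dL/db_{k-1} = +1 when violated
        if hi: b[k - 1] += step              # dL/db_k = -1 when violated
    return f, np.sort(b)                     # crude re-sort keeps b ordered
```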
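Contribution (3) shrinks the output layer of the deep model from r rank-related outputs to ceil(log2(r)) by encoding ranks as binary codewords, which is how the problem size drops from O(rn) to O(n log r). The abstract does not specify the exact code, so the sketch below uses a plain binary encoding as an illustrative assumption.

```python
import numpy as np

def encode_ranks(y, r):
    """Map ranks 1..r to ceil(log2(r)) binary targets: the network then
    learns O(log r) bits per example instead of O(r) outputs."""
    n_bits = max(1, int(np.ceil(np.log2(r))))
    return (((y[:, None] - 1) >> np.arange(n_bits)) & 1).astype(np.float32)

def decode_logits(logits, r):
    """Threshold each predicted bit and reassemble the rank."""
    bits = (logits > 0).astype(int)            # sigmoid-logit sign test
    ranks = bits @ (1 << np.arange(bits.shape[1])) + 1
    return np.clip(ranks, 1, r)                # codewords beyond r are clipped
```

For example, with r = 100 ranks the network needs only 7 binary outputs instead of 100.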
Keywords/Search Tags: Ordinal Regression, Parallel, Coordinate Descent, Doubly Stochastic Gradients, Deep Learning