
Averaging Projected Stochastic Gradient Descent for Large-Scale Least Squares Problems

Posted on: 2013-12-14
Degree: M.S.
Type: Thesis
University: University of Massachusetts Boston
Candidate: Mu, Yang
Full Text: PDF
GTID: 2450390008981741
Subject: Artificial Intelligence
Abstract/Summary:
The least squares problem is one of the most important regression problems in statistics and machine learning. In this thesis, we present an Averaging Projected Stochastic Gradient Descent (APSGD) algorithm for solving large-scale least squares problems. APSGD improves on Stochastic Gradient Descent (SGD) by exploiting the constraint that the linear regression line passes through the mean point of all the data points. This yields a regret bound of O(log T) and the fastest convergence rate among first-order approaches. Empirical studies confirm the effectiveness of APSGD in comparisons with state-of-the-art methods.
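The full algorithm is in the thesis itself, but the two ideas named in the abstract (iterate averaging, and projecting onto the "regression line passes through the mean point" constraint) can be sketched as follows. This is a hypothetical illustration, not the thesis's exact method: the step-size schedule `eta0 / sqrt(t)`, the initialization at zero, and the function name `apsgd_sketch` are all assumptions. The projection step maps the current weight vector onto the hyperplane {w : x̄ᵀw = ȳ}, which encodes the mean-point constraint for a linear model.

```python
import numpy as np

def apsgd_sketch(X, y, T=10000, eta0=0.05, seed=0):
    """Illustrative averaged projected SGD for least squares.

    Assumptions (not from the thesis): zero init, eta0/sqrt(t) steps.
    """
    n, d = X.shape
    x_mean = X.mean(axis=0)   # mean data point x-bar
    y_mean = y.mean()         # mean response y-bar
    w = np.zeros(d)
    w_avg = np.zeros(d)
    rng = np.random.default_rng(seed)
    for t in range(1, T + 1):
        i = rng.integers(n)
        # stochastic gradient of 0.5 * (x_i . w - y_i)^2
        grad = (X[i] @ w - y[i]) * X[i]
        w = w - (eta0 / np.sqrt(t)) * grad
        # project onto {w : x_mean . w = y_mean}:
        # the fitted line then passes through the mean point
        w = w - ((x_mean @ w - y_mean) / (x_mean @ x_mean)) * x_mean
        # running average of the projected iterates
        w_avg += (w - w_avg) / t
    return w_avg
```

Because the constraint is linear, the average of iterates that each satisfy it satisfies it as well, so the returned `w_avg` also passes through the mean point.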
Keywords/Search Tags:Stochastic gradient descent, APSGD