In this dissertation, we propose two stochastic alternating optimization methods for solving structured regularization problems, which are widely used in machine learning and data mining. The first algorithm is called Stochastic Alternating Linearization (SALIN), a stochastic extension of the Alternating Linearization (ALIN) method for solving convex optimization problems with a complex non-smooth regularization term. At each iteration, SALIN alternately linearizes the loss function and the penalty function, based on stochastic approximations of sub-gradients. By applying a special update test at each iteration, together with a carefully designed sub-gradient update scheme, the algorithm achieves fast and stable convergence. The update test relies only on a fixed, pre-defined set, and we show that the choice of this test set has little influence on the overall performance of the algorithm; SALIN is therefore a robust method. The other algorithm is a preconditioned stochastic Alternating Direction Method of Multipliers, specially designed to handle structured regularized regression problems, such as fused LASSO, in which the design matrix is ill-conditioned. We prove its O(1/√t) convergence rate for general convex functions and O(log t / t) rate for strongly convex functions, and show that the constant depends only on the smaller dimension of the data matrix. We present results of extensive numerical experiments for structured regularization problems such as fused LASSO and graph-guided SVM, with both synthetic and real-world datasets. The numerical results demonstrate the efficacy and accuracy of our methods.
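The alternating-linearization idea described above can be illustrated with a minimal sketch. This is not the dissertation's SALIN implementation; it is a simplified toy (all names and parameter values are hypothetical) that keeps the key ingredients from the abstract: alternately linearizing the smooth loss and the non-smooth penalty around a stability center, estimating the loss gradient from random minibatches, and advancing the center only when an update test on a fixed, pre-defined test set succeeds. For simplicity the penalty here is a plain l1 norm (so its proximal step is a closed-form soft-thresholding) rather than a structured fused-LASSO term.

```python
# Toy sketch of stochastic alternating linearization (simplified; hypothetical
# parameters) for:  minimize f(x) + h(x),
#   f(x) = (1/2n)*||A x - b||^2  (smooth loss, minibatch gradients),
#   h(x) = lam*||x||_1           (non-smooth penalty, closed-form prox).
import numpy as np

rng = np.random.default_rng(0)
n, d, lam, rho = 200, 20, 0.1, 1.0
A = rng.standard_normal((n, d))
x_true = np.zeros(d); x_true[:3] = [2.0, -1.5, 1.0]
b = A @ x_true + 0.1 * rng.standard_normal(n)
test_idx = rng.choice(n, 50, replace=False)   # fixed pre-defined test set

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def objective(x, idx):
    r = A[idx] @ x - b[idx]
    return 0.5 * (r @ r) / len(idx) + lam * np.abs(x).sum()

x_hat = np.zeros(d)                           # stability center
for it in range(300):
    # h-step: linearize f at x_hat using a stochastic minibatch gradient,
    # keep h exact; the minimizer is a soft-thresholding (prox of h).
    batch = rng.choice(n, 20, replace=False)
    g_f = A[batch].T @ (A[batch] @ x_hat - b[batch]) / len(batch)
    x_h = soft_threshold(x_hat - g_f / rho, lam / rho)
    # Optimality of the h-step yields a sub-gradient of h at x_h:
    g_h = rho * (x_hat - x_h) - g_f

    # f-step: keep f exact (closed-form regularized solve on this small
    # problem), linearize h at x_h using the sub-gradient g_h.
    x_f = np.linalg.solve(A.T @ A / n + rho * np.eye(d),
                          A.T @ b / n - g_h + rho * x_hat)

    # Update test on the fixed test set: advance the stability center only
    # if the candidate point improves the estimated objective.
    if objective(x_f, test_idx) < objective(x_hat, test_idx):
        x_hat = x_f

print("leading coefficients:", np.round(x_hat[:4], 2))
```

The fixed test set plays the role of the pre-defined set in the abstract's update test: because it never changes, the accept/reject decision is deterministic and stabilizes the iteration against minibatch noise.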
Subject (authority = RUETD)
Topic
Management
RelatedItem (type = host)
TitleInfo
Title
Rutgers University Electronic Theses and Dissertations
Identifier (type = RULIB)
ETD
Identifier
ETD_8968
PhysicalDescription
Form (authority = gmd)
electronic resource
InternetMediaType
application/pdf
InternetMediaType
text/xml
Extent
1 online resource (xi, 62 p. : ill.)
Note (type = degree)
Ph.D.
Note (type = bibliography)
Includes bibliographical references
Subject (authority = ETD-LCSH)
Topic
Machine learning
Subject (authority = ETD-LCSH)
Topic
Mathematical optimization
Note (type = statement of responsibility)
by Kaicheng Wu
RelatedItem (type = host)
TitleInfo
Title
Graduate School - Newark Electronic Theses and Dissertations
Identifier (type = local)
rucore10002600001
Location
PhysicalLocation (authority = marcorg); (displayLabel = Rutgers, The State University of New Jersey)
I hereby grant to the Rutgers University Libraries and to my school the non-exclusive right to archive, reproduce and distribute my thesis or dissertation, in whole or in part, and/or my abstract, in whole or in part, in and from an electronic format, subject to the release date subsequently stipulated in this submittal form and approved by my school. I represent and stipulate that the thesis or dissertation and its abstract are my original work, that they do not infringe or violate any rights of others, and that I make these grants as the sole owner of the rights to my thesis or dissertation and its abstract. I represent that I have obtained written permissions, when necessary, from the owner(s) of each third party copyrighted matter to be included in my thesis or dissertation and will supply copies of such upon request by my school. I acknowledge that RU ETD and my school will not distribute my thesis or dissertation or its abstract if, in their reasonable judgment, they believe all such rights have not been secured. I acknowledge that I retain ownership rights to the copyright of my work. I also retain the right to use all or part of this thesis or dissertation in future works, such as articles or books.