Staff View
Optimization in sparse learning

Descriptive

TitleInfo
Title
Optimization in sparse learning
SubTitle
from convexity to non-convexity
Name (type = personal)
NamePart (type = family)
Liu
NamePart (type = given)
Bo
NamePart (type = date)
1984-
DisplayForm
Bo Liu
Role
RoleTerm (authority = RULIB)
author
Name (type = personal)
NamePart (type = family)
Metaxas
NamePart (type = given)
Dimitris N.
DisplayForm
Dimitris N. Metaxas
Affiliation
Advisory Committee
Role
RoleTerm (authority = RULIB)
chair
Name (type = personal)
NamePart (type = family)
Pavlovic
NamePart (type = given)
Vladimir
DisplayForm
Vladimir Pavlovic
Affiliation
Advisory Committee
Role
RoleTerm (authority = RULIB)
internal member
Name (type = personal)
NamePart (type = family)
Elgammal
NamePart (type = given)
Ahmed
DisplayForm
Ahmed Elgammal
Affiliation
Advisory Committee
Role
RoleTerm (authority = RULIB)
internal member
Name (type = personal)
NamePart (type = family)
Katehakis
NamePart (type = given)
Michael
DisplayForm
Michael Katehakis
Affiliation
Advisory Committee
Role
RoleTerm (authority = RULIB)
outside member
Name (type = corporate)
NamePart
Rutgers University
Role
RoleTerm (authority = RULIB)
degree grantor
Name (type = corporate)
NamePart
School of Graduate Studies
Role
RoleTerm (authority = RULIB)
school
TypeOfResource
Text
Genre (authority = marcgt)
theses
OriginInfo
DateCreated (qualifier = exact)
2019
DateOther (qualifier = exact); (type = degree)
2019-01
CopyrightDate (encoding = w3cdtf)
2019
Place
PlaceTerm (type = code)
xx
Language
LanguageTerm (authority = ISO639-2b); (type = code)
eng
Abstract (type = abstract)
Nowadays, the explosive growth of data provides an unprecedented opportunity to apply machine learning methods in various application domains. At the same time, high-dimensional data representations pose the curse-of-dimensionality challenge to machine learning models. Sparse models offer a tool that can alleviate this challenge by learning low-dimensional feature and model representations. Traditionally, a sparse model is learned by penalizing the $\ell_1$-norm of the model parameter in the optimization objective. Recent sparse learning research studies more accurate ways of encoding the sparsity level as prior knowledge; representative work includes $k$-support-norm regularized minimization and $\ell_0$-constrained minimization.
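The $\ell_1$ penalty mentioned above is classically handled with the soft-thresholding proximal operator inside a proximal gradient loop. A minimal NumPy sketch of that standard technique (function names are illustrative; this is not code from the dissertation):

```python
import numpy as np

def soft_threshold(w, lam):
    """Proximal operator of lam * ||w||_1: shrink each coordinate toward zero."""
    return np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)

def prox_grad_step(w, X, y, lam, step):
    """One proximal-gradient step for (1/2)||Xw - y||^2 + lam * ||w||_1."""
    grad = X.T @ (X @ w - y)          # gradient of the smooth least-squares term
    return soft_threshold(w - step * grad, step * lam)
```

Coordinates whose magnitude stays below the threshold are set exactly to zero, which is what makes the learned parameter vector sparse.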

If the training loss is convex, minimizing it with a $k$-support-norm regularizer on the model parameter is still a convex optimization problem. In Chapter 2 we introduce the proposed fully corrective Frank-Wolfe type algorithm, called $k$-FCFW, for $k$-support-norm regularized sparse model learning.
We reformulate the regularized minimization as a constrained minimization task, and the proposed algorithm is then applied to solve the reformulated problem. In this work we compare the per-iteration complexity of the proposed $k$-FCFW algorithm with that of the proximal gradient algorithm, which is conventionally used to solve the original problem. One theoretical contribution is that we establish a linear convergence rate for the proposed algorithm under standard assumptions.
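The fully corrective $k$-FCFW variant is specific to this work, but the Frank-Wolfe template it builds on is standard: each iteration solves a linear subproblem over the constraint set and moves toward its vertex solution. A minimal sketch of the classical Frank-Wolfe iteration on an $\ell_1$-ball-constrained least-squares problem (an illustrative stand-in, not the dissertation's algorithm):

```python
import numpy as np

def frank_wolfe_l1(X, y, radius, n_iter=200):
    """Classical Frank-Wolfe for min (1/2)||Xw - y||^2  s.t.  ||w||_1 <= radius."""
    n_features = X.shape[1]
    w = np.zeros(n_features)
    for t in range(n_iter):
        grad = X.T @ (X @ w - y)
        i = np.argmax(np.abs(grad))          # linear oracle: best vertex of the l1 ball
        s = np.zeros(n_features)
        s[i] = -radius * np.sign(grad[i])
        gamma = 2.0 / (t + 2.0)              # classical diminishing step size
        w = (1 - gamma) * w + gamma * s      # convex combination stays feasible
    return w
```

Because each update mixes in a single signed coordinate vertex, the iterate is feasible by construction and its support grows by at most one coordinate per iteration.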

The $\ell_0$-norm of the model parameter can be used directly as a constraint in the learning objective.
However, in this case the problem is non-convex and NP-hard because of the $\ell_0$-norm constraint, even if the training loss is convex. To trade off solution accuracy against efficiency, several primal-domain greedy algorithms have been proposed, and their properties, such as upper bounds on the model estimation error and support recovery guarantees, have been analyzed in the literature. In Chapter 3 we introduce the proposed dual-space algorithm for the sparsity-constrained $\ell_2$-norm regularized finite-sum loss minimization problem. The sparse duality theory established in this work gives the sufficient and necessary conditions under which the original non-convex problem can be equivalently solved via a concave dual formulation. The dual iterative hard thresholding (DIHT) algorithm and its stochastic variant are proved to recover the parameter support without the Restricted Isometry Property (RIP) condition.
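The hard-thresholding operator at the core of IHT-style methods keeps only the $k$ largest-magnitude coordinates. A minimal sketch of the classical primal iterative hard thresholding baseline for the $\ell_0$-constrained problem (the chapter's DIHT instead runs the iteration in the dual; names here are illustrative):

```python
import numpy as np

def hard_threshold(w, k):
    """Keep the k largest-magnitude entries of w, zero out the rest."""
    out = np.zeros_like(w)
    idx = np.argpartition(np.abs(w), -k)[-k:]
    out[idx] = w[idx]
    return out

def iht(X, y, k, step, n_iter=200):
    """Primal IHT for min (1/2)||Xw - y||^2  s.t.  ||w||_0 <= k."""
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        w = hard_threshold(w - step * X.T @ (X @ w - y), k)
    return w
```

Unlike the soft-thresholding proximal step, hard thresholding is a (non-convex) projection onto the sparsity constraint set, which is why analyses of such methods typically need extra conditions such as RIP; the chapter's contribution is to avoid that requirement via the dual formulation.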

Distributed optimization algorithms are used to learn a globally optimal model when training samples reside on different machines. Communication efficiency is an important concern in the design of distributed model training algorithms. Chapter 4 elaborates the proposed Newton-type inexact pursuit algorithm for the $\ell_0$-constrained empirical risk minimization problem. The proposed algorithm alternates between inexactly solving a local sparse learning problem with existing single-machine algorithms, and communicating gradients and model parameters between the master machine and the worker machines.
The analysis shows that the upper bound on the model estimation error converges at a linear rate.
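The alternate-and-communicate pattern can be simulated in a single process: workers send local gradients, the master aggregates them, updates the model, and broadcasts it back. A generic one-round sketch with a gradient step followed by hard thresholding (a simplified stand-in for illustration only, not the Newton-type inexact pursuit algorithm; all names are assumptions):

```python
import numpy as np

def worker_gradient(w, X_local, y_local):
    """Each worker computes the gradient of its local least-squares loss."""
    return X_local.T @ (X_local @ w - y_local) / len(y_local)

def distributed_round(w, shards, k, step):
    """One communication round, simulated in-process: workers -> master gradients,
    master gradient step plus hard thresholding to enforce ||w||_0 <= k,
    then master -> workers broadcast of the updated parameter."""
    grads = [worker_gradient(w, X, y) for X, y in shards]  # worker -> master
    g = np.mean(grads, axis=0)                             # master aggregates
    w_new = w - step * g
    out = np.zeros_like(w_new)
    idx = np.argpartition(np.abs(w_new), -k)[-k:]
    out[idx] = w_new[idx]                                  # enforce sparsity
    return out                                             # master -> workers
```

Each round costs one gradient upload per worker and one parameter broadcast, which is the communication budget that algorithms in this setting try to minimize.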

In the last part, we conclude this dissertation and present future directions for sparse model learning research.
Subject (authority = RUETD)
Topic
Computer Science
Subject (authority = ETD-LCSH)
Topic
Machine learning
RelatedItem (type = host)
TitleInfo
Title
Rutgers University Electronic Theses and Dissertations
Identifier (type = RULIB)
ETD
Identifier
ETD_9494
PhysicalDescription
Form (authority = gmd)
electronic resource
InternetMediaType
application/pdf
InternetMediaType
text/xml
Extent
1 online resource (100 pages) : illustrations
Note (type = degree)
Ph.D.
Note (type = bibliography)
Includes bibliographical references
Note (type = statement of responsibility)
by Bo Liu
RelatedItem (type = host)
TitleInfo
Title
School of Graduate Studies Electronic Theses and Dissertations
Identifier (type = local)
rucore10001600001
Location
PhysicalLocation (authority = marcorg); (displayLabel = Rutgers, The State University of New Jersey)
NjNbRU
Identifier (type = doi)
doi:10.7282/t3-y3br-8236
Genre (authority = ExL-Esploro)
ETD doctoral

Rights

RightsDeclaration (ID = rulibRdec0006)
The author owns the copyright to this work.
RightsHolder (type = personal)
Name
FamilyName
Liu
GivenName
Bo
Role
Copyright Holder
RightsEvent
Type
Permission or license
DateTime (encoding = w3cdtf); (qualifier = exact); (point = start)
2019-01-06 17:43:28
AssociatedEntity
Name
Bo Liu
Role
Copyright holder
Affiliation
Rutgers University. School of Graduate Studies
AssociatedObject
Type
License
Name
Author Agreement License
Detail
I hereby grant to the Rutgers University Libraries and to my school the non-exclusive right to archive, reproduce and distribute my thesis or dissertation, in whole or in part, and/or my abstract, in whole or in part, in and from an electronic format, subject to the release date subsequently stipulated in this submittal form and approved by my school. I represent and stipulate that the thesis or dissertation and its abstract are my original work, that they do not infringe or violate any rights of others, and that I make these grants as the sole owner of the rights to my thesis or dissertation and its abstract. I represent that I have obtained written permissions, when necessary, from the owner(s) of each third party copyrighted matter to be included in my thesis or dissertation and will supply copies of such upon request by my school. I acknowledge that RU ETD and my school will not distribute my thesis or dissertation or its abstract if, in their reasonable judgment, they believe all such rights have not been secured. I acknowledge that I retain ownership rights to the copyright of my work. I also retain the right to use all or part of this thesis or dissertation in future works, such as articles or books.
RightsEvent
Type
Embargo
DateTime (encoding = w3cdtf); (qualifier = exact); (point = start)
2019-01-31
DateTime (encoding = w3cdtf); (qualifier = exact); (point = end)
2019-08-02
Detail
Access to this PDF has been restricted at the author's request. It will be publicly available after August 2nd, 2019.
Copyright
Status
Copyright protected
Availability
Status
Open
Reason
Permission or license

Technical

RULTechMD (ID = TECHNICAL1)
ContentModel
ETD
OperatingSystem (VERSION = 5.1)
windows xp
CreatingApplication
Version
1.5
ApplicationName
pdfTeX-1.40.12
DateCreated (point = end); (encoding = w3cdtf); (qualifier = exact)
2019-01-08T09:30:24