The work presented in this thesis aims to study the conditions essential for reliable dictionary recovery based on the maximal response criterion and to explore the application of dictionary learning to the classification of distributed data.

The first part of this thesis revisits the problem of recovering an overcomplete dictionary in a local neighborhood from training samples using the so-called maximal response criterion. While it is known in the literature that the maximal response criterion can be used for asymptotic exact recovery of a dictionary in a local neighborhood, those results do not allow for linear (in the ambient dimension) scaling of the sparsity levels in signal representations. The first contribution of this work is the introduction of a new condition on the sparse representation of signals and a new proof technique establishing that the maximal response criterion can in fact handle linear sparsity (modulo a logarithmic factor) of signal representations. While the focus of this work is on asymptotic exact recovery, the same ideas can be used in a straightforward manner to strengthen the original maximal response criterion-based results involving noisy observations and a finite number of training samples.

The second part of this thesis addresses the problem of collaborative training of nonlinear classifiers using big, distributed training data. The proposed supervised learning strategy corresponds to data-driven joint learning of a nonlinear transformation that maps the (training) data to a higher-dimensional feature space and of a ridge regression-based linear classifier in that feature space. The key aspect of this work, which distinguishes it from related prior work, is that it assumes:
1. The training data are distributed across a number of interconnected sites.
2. The sizes of the local training data, as well as privacy concerns, prohibit the exchange of individual training samples between sites.
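To make the recovery criterion concrete: the maximal response criterion scores a candidate dictionary by how strongly its unit-norm atoms respond to the training samples, summing only the largest responses per sample. The following is a minimal sketch, not the thesis's analysis; the function name and the fixed sparsity level `S` are illustrative assumptions.

```python
import numpy as np

def maximal_response_value(D, Y, S):
    """Average, over the training samples (columns of Y), of the sum of the
    S largest absolute responses |<d_k, y>| to the unit-norm atoms d_k
    (columns of the candidate dictionary D). Dictionaries that locally
    match the data's sparse structure score higher."""
    R = np.abs(D.T @ Y)                 # |<d_k, y_n>| for every atom/sample pair
    top_S = np.sort(R, axis=0)[-S:, :]  # keep the S largest responses per sample
    return float(top_S.sum(axis=0).mean())

# With an orthonormal dictionary and exactly S-sparse signals, the responses
# recover the coefficients, so the criterion equals the average l1-norm of
# the S-sparse codes.
D = np.eye(4)                           # trivially unit-norm (orthonormal) atoms
Y = np.array([[3.0, 0.0],
              [-1.0, 2.0],
              [0.0, 0.0],
              [0.0, -5.0]])             # each column is 2-sparse
print(maximal_response_value(D, Y, S=2))  # (|3| + |-1| + |2| + |-5|) / 2 = 5.5
```

In the overcomplete setting the dictionary is no longer orthonormal, and the thesis's contribution concerns how large `S` may grow (linearly in the ambient dimension, up to a logarithmic factor) while a local maximizer of this criterion still recovers the generating dictionary.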
The main contribution is the formulation of an algorithm, termed cloud D-KSVD, that reliably, efficiently, and collaboratively learns both the nonlinear map and the linear classifier under these constraints. To demonstrate the effectiveness of cloud D-KSVD, a number of numerical experiments on the MNIST dataset are also reported.
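The classification step described above can be pictured as ordinary ridge regression on one-hot label targets in the lifted feature space. The sketch below is a centralized toy, assuming a hypothetical fixed random-feature lift in place of the D-KSVD-learned transformation and illustrative names throughout; it is not the distributed algorithm itself.

```python
import numpy as np

def fit_ridge_classifier(F, labels, n_classes, lam=1e-2):
    """Ridge regression on one-hot targets: W minimizes
    ||W F - H||_F^2 + lam ||W||_F^2, where the columns of F are feature
    vectors and H holds one-hot label columns. Closed form:
    W = H F^T (F F^T + lam I)^{-1}."""
    d = F.shape[0]
    H = np.eye(n_classes)[:, labels]  # one-hot targets, one column per sample
    return H @ F.T @ np.linalg.inv(F @ F.T + lam * np.eye(d))

def predict(W, F):
    """Assign each feature column to the class with the largest score."""
    return np.argmax(W @ F, axis=0)

# Toy usage: two well-separated Gaussian classes, lifted by a hypothetical
# fixed nonlinear map (random tanh features). In cloud D-KSVD the lift is
# instead learned jointly with the classifier across the sites.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))]).T
y = np.array([0] * 50 + [1] * 50)
P = rng.standard_normal((8, 2))
F = np.tanh(P @ X)                    # 8-dimensional lifted features
W = fit_ridge_classifier(F, y, n_classes=2)
print((predict(W, F) == y).mean())    # training accuracy
```

The distributed setting replaces the single closed-form solve with collaborative updates, since no site may ship its raw samples to the others.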
Subject (authority = RUETD)
Topic
Electrical and Computer Engineering
Subject (authority = ETD-LCSH)
Topic
Machine learning
RelatedItem (type = host)
TitleInfo
Title
Rutgers University Electronic Theses and Dissertations
Identifier (type = RULIB)
ETD
Identifier
ETD_7270
PhysicalDescription
Form (authority = gmd)
electronic resource
InternetMediaType
application/pdf
InternetMediaType
text/xml
Extent
1 online resource (vii, 36 p. : ill.)
Note (type = degree)
M.S.
Note (type = bibliography)
Includes bibliographical references
Note (type = statement of responsibility)
by Zahra Shakeri
RelatedItem (type = host)
TitleInfo
Title
Graduate School - New Brunswick Electronic Theses and Dissertations
Identifier (type = local)
rucore19991600001
Location
PhysicalLocation (authority = marcorg); (displayLabel = Rutgers, The State University of New Jersey)
Rutgers University. Graduate School - New Brunswick
AssociatedObject
Type
License
Name
Author Agreement License
Detail
I hereby grant to the Rutgers University Libraries and to my school the non-exclusive right to archive, reproduce and distribute my thesis or dissertation, in whole or in part, and/or my abstract, in whole or in part, in and from an electronic format, subject to the release date subsequently stipulated in this submittal form and approved by my school. I represent and stipulate that the thesis or dissertation and its abstract are my original work, that they do not infringe or violate any rights of others, and that I make these grants as the sole owner of the rights to my thesis or dissertation and its abstract. I represent that I have obtained written permissions, when necessary, from the owner(s) of each third party copyrighted matter to be included in my thesis or dissertation and will supply copies of such upon request by my school. I acknowledge that RU ETD and my school will not distribute my thesis or dissertation or its abstract if, in their reasonable judgment, they believe all such rights have not been secured. I acknowledge that I retain ownership rights to the copyright of my work. I also retain the right to use all or part of this thesis or dissertation in future works, such as articles or books.