AUGMENTED LAGRANGIAN BASED ALGORITHMS FOR NONCONVEX OPTIMIZATION WITH APPLICATIONS IN SUBSPACE CLUSTERING

dc.contributor.advisor: Vidal, Rene
dc.contributor.committeeMember: Robinson, Daniel P.
dc.contributor.committeeMember: Curtis, Frank E.
dc.contributor.committeeMember: Basu, Amitabh
dc.creator: Jiang, Hao
dc.date.accessioned: 2018-01-09T03:29:56Z
dc.date.available: 2018-01-09T03:29:56Z
dc.date.created: 2016-08
dc.date.issued: 2016-06-03
dc.date.submitted: August 2016
dc.date.updated: 2018-01-09T03:29:56Z
dc.description.abstract: This thesis considers the design, analysis, and implementation of algorithms for nonconvex optimization that utilize the augmented Lagrangian function. In the first part of the thesis, we address general nonconvex optimization problems with smooth objective and constraint functions. Observing a potential drawback of the traditional augmented Lagrangian (AL) method, we propose adaptive trust-region and linesearch AL algorithms that share the same novel feature, namely, an adaptive update for the penalty parameter. As with a traditional AL algorithm, the adaptive methods are matrix-free (i.e., they do not need to form or factorize problem matrices) and thus represent a viable option for solving large-scale problems. We prove global convergence for our adaptive AL algorithms and illustrate through extensive numerical experiments that our methods outperform traditional AL methods in efficiency and reliability. In the second part of the thesis, we focus on a structured nonconvex nonsmooth problem arising from the machine learning application of subspace clustering. We show that the alternating direction method of multipliers (ADMM), which may roughly be described as block-wise coordinate minimization of the AL function, is well suited to this task. Moreover, we establish a global convergence result for the algorithm, which was previously unknown. Numerical results show that the chosen optimization modeling formulation, together with ADMM, achieves subspace clustering accuracy on par with other state-of-the-art methods.
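The abstract describes ADMM as, roughly, block-wise coordinate minimization of the augmented Lagrangian followed by a dual (multiplier) update. The sketch below illustrates that iteration pattern on a standard textbook problem (lasso with the splitting x = z), not on the thesis's subspace clustering formulation; the function name `admm_lasso` and all parameter choices are illustrative assumptions.

```python
import numpy as np

def admm_lasso(A, b, lam, rho=1.0, n_iter=200):
    """Illustrative ADMM for min 0.5*||Ax - b||^2 + lam*||z||_1 s.t. x = z.

    Each iteration minimizes the augmented Lagrangian block-wise
    (x-step, then z-step), then updates the scaled dual variable u,
    mirroring the structure described in the abstract.
    """
    m, n = A.shape
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)  # scaled multiplier for the constraint x = z
    Atb = A.T @ b
    # Factor (A^T A + rho I) once; reused every x-step.
    L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))
    for _ in range(n_iter):
        # x-step: quadratic subproblem, solved in closed form via the factor.
        x = np.linalg.solve(L.T, np.linalg.solve(L, Atb + rho * (z - u)))
        # z-step: separable l1 subproblem, solved by soft-thresholding.
        v = x + u
        z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)
        # Dual update: gradient ascent on the multiplier.
        u += x - z
    return z
```

The key design point this sketch shows is that neither block subproblem requires forming the full AL Hessian: the x-step is a linear solve and the z-step is an elementwise shrinkage, which is why ADMM scales well on structured problems.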
dc.format.mimetype: application/pdf
dc.identifier.uri: http://jhir.library.jhu.edu/handle/1774.2/44631
dc.language.iso: en_US
dc.publisher: Johns Hopkins University
dc.publisher.country: USA
dc.subject: nonlinear optimization
dc.subject: augmented Lagrangian
dc.subject: ADMM
dc.subject: subspace clustering
dc.title: AUGMENTED LAGRANGIAN BASED ALGORITHMS FOR NONCONVEX OPTIMIZATION WITH APPLICATIONS IN SUBSPACE CLUSTERING
dc.type: Thesis
dc.type.material: text
thesis.degree.department: Applied Mathematics and Statistics
thesis.degree.discipline: Applied Mathematics & Statistics
thesis.degree.grantor: Johns Hopkins University
thesis.degree.grantor: Whiting School of Engineering
thesis.degree.level: Doctoral
thesis.degree.name: Ph.D.
Files

Original bundle
Name: JIANG-DISSERTATION-2016.pdf
Size: 948.15 KB
Format: Adobe Portable Document Format

License bundle
Name: LICENSE.txt
Size: 2.67 KB
Format: Plain Text