    Augmented Lagrangian Based Algorithms for Nonconvex Optimization with Applications in Subspace Clustering

    View/Open
    JIANG-DISSERTATION-2016.pdf (948.1Kb)
    Date
    2016-06-03
    Author
    Jiang, Hao
    Abstract
    This thesis considers the design, analysis, and implementation of algorithms for nonconvex optimization that utilize the augmented Lagrangian function. In the first part of the thesis, we address general nonconvex optimization problems that have smooth objective and constraint functions. Observing a potential drawback of a traditional augmented Lagrangian (AL) method, we propose adaptive trust-region and linesearch AL algorithms that share the same novel feature, namely, an adaptive update for the penalty parameter. As with a traditional AL algorithm, the adaptive methods are matrix-free (i.e., they do not need to form or factorize problem matrices) and thus represent a viable option for solving large-scale problems. We prove global convergence for our adaptive AL algorithms and illustrate through extensive numerical experiments that our methods outperform traditional AL methods in terms of efficiency and reliability. In the latter part of the thesis, we focus on a structured nonconvex nonsmooth problem arising from a machine learning application called subspace clustering. We show that the alternating direction method of multipliers (ADMM), which may roughly be described as block-wise coordinate minimization of the AL function, is well suited for this machine learning task. Moreover, we establish a global convergence result for the algorithm, a result that was previously unknown. Numerical results show that the chosen optimization formulation, together with ADMM, achieves subspace clustering accuracy on par with other state-of-the-art methods.
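
    For readers skimming the record, the following is a generic textbook sketch, not taken from the dissertation itself, of the two objects the abstract refers to: the augmented Lagrangian (AL) function for an equality-constrained problem, and the ADMM iteration viewed as block-wise minimization of the AL function followed by a multiplier update. The notation (f, g, c, A, B, b, multiplier y, penalty parameter rho) is an assumed common convention, not the thesis's own.

    % Augmented Lagrangian for  min_x f(x)  subject to  c(x) = 0,
    % with multiplier estimate y and penalty parameter rho > 0
    % (rho is the quantity the adaptive AL methods of the first part update):
    \[
      \mathcal{L}_{\rho}(x, y) = f(x) + y^{\top} c(x) + \frac{\rho}{2}\,\| c(x) \|_2^2 .
    \]
    % ADMM for the two-block problem  min_{x,z} f(x) + g(z)  subject to  Ax + Bz = b,
    % using the corresponding two-block AL function
    \[
      \mathcal{L}_{\rho}(x, z, y) = f(x) + g(z) + y^{\top}(Ax + Bz - b) + \frac{\rho}{2}\,\| Ax + Bz - b \|_2^2 ,
    \]
    % alternately minimized over each block, followed by a dual (multiplier) update:
    \begin{align*}
      x^{k+1} &\in \operatorname*{arg\,min}_{x} \; \mathcal{L}_{\rho}(x, z^{k}, y^{k}), \\
      z^{k+1} &\in \operatorname*{arg\,min}_{z} \; \mathcal{L}_{\rho}(x^{k+1}, z, y^{k}), \\
      y^{k+1} &= y^{k} + \rho \left( A x^{k+1} + B z^{k+1} - b \right).
    \end{align*}

    In this reading, the abstract's "block-wise coordinate minimization of the AL function" corresponds to the x- and z-minimization steps above, while the penalty parameter rho plays the role that the adaptive AL algorithms of the first part update on the fly.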
    URI
    http://jhir.library.jhu.edu/handle/1774.2/44631
    Collections
    • ETD -- Doctoral Dissertations
