Publication Date
2024
Document Type
Dissertation/Thesis
First Advisor
Bello-Cruz, Yunier
Degree Name
Ph.D. (Doctor of Philosophy)
Legacy Department
Department of Mathematical Sciences
Abstract
Convex optimization problems are central to numerous fields, including machine learning, signal processing, and image reconstruction. The development and analysis of splitting algorithms for solving such problems constitutes a significant area of research. In particular, we investigate a proximal gradient splitting method (PGM) with an explicit linesearch for solving nonsmooth optimization problems in which the objective function is the sum of two convex functions.
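To fix notation, here is a minimal Python sketch of one proximal gradient step with the classical backtracking rule. The callables f, grad_f, and prox_g are hypothetical placeholders, and the sufficient-decrease test shown is the textbook rule, which re-evaluates the proximal operator on every trial stepsize; the explicit linesearch studied in the dissertation is designed to avoid exactly that repeated evaluation.

```python
import numpy as np

def prox_grad_step(x, f, grad_f, prox_g, alpha0=1.0, beta=0.5):
    """One proximal gradient step with classical backtracking.

    prox_g(y, a) should return argmin_u g(u) + ||u - y||^2 / (2*a).
    This is the textbook sufficient-decrease rule, shown only for
    orientation; it is not the dissertation's explicit linesearch.
    """
    alpha, fx, gx = alpha0, f(x), grad_f(x)
    while True:
        x_new = prox_g(x - alpha * gx, alpha)  # prox recomputed per trial
        d = x_new - x
        # accept alpha once the quadratic model upper-bounds f at x_new
        if f(x_new) <= fx + gx @ d + (d @ d) / (2.0 * alpha):
            return x_new, alpha
        alpha *= beta  # shrink the trial stepsize and try again
```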
We focus on the case where one of the functions is differentiable, without imposing any Lipschitz continuity assumption on its gradient. We show that the linesearch is well defined and that this version of PGM retains the usual O(1/k) complexity. Furthermore, we propose an accelerated explicit version of PGM (M-APGME), inspired by the classical FISTA iteration. M-APGME has a lower computational cost per iteration than FISTA because it evaluates the proximal operator only once at each step of the backtracking process. Notably, M-APGME attains the optimal O(1/k^2) complexity in our experiments and performs comparably to FISTA on LASSO, TV-Huber ROF Denoising, and CUR-Like Factorization problems.
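For reference, the classical FISTA iteration that inspires M-APGME can be sketched as follows; the fixed 1/L stepsize assumes a known Lipschitz constant for the gradient, which is precisely the assumption the dissertation's explicit linesearch removes. The function names and the LASSO demo below are illustrative, not the dissertation's code.

```python
def fista(x0, grad_f, prox_g, L, n_iter=500):
    """Textbook FISTA with a known gradient Lipschitz constant L.
    M-APGME replaces the fixed 1/L step by an explicit linesearch;
    this sketch only shows the momentum structure being accelerated."""
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(n_iter):
        x_new = prox_g(y - grad_f(y) / L, 1.0 / L)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)  # extrapolation
        x, t = x_new, t_new
    return x

# Hypothetical LASSO instance: f(x) = 0.5*||Ax - b||^2, g(x) = lam*||x||_1,
# whose prox is componentwise soft-thresholding.
rng = np.random.default_rng(0)
A, b, lam = rng.standard_normal((40, 100)), rng.standard_normal(40), 0.1
grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda y, a: np.sign(y) * np.maximum(np.abs(y) - lam * a, 0.0)
x_hat = fista(np.zeros(100), grad_f, prox_g, L=np.linalg.norm(A, 2) ** 2)
```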
This dissertation also presents and analyzes an inexact proximal gradient method for solving such composite convex optimization problems. We introduce an explicit linesearch strategy that requires, at each iteration, only an inexact solution of the proximal subproblem under a relative error criterion. We prove the convergence of the sequence generated by our scheme and establish its iteration complexity, in terms of both the functional values and a residual associated with first-order stationarity. Additionally, we provide numerical experiments on CUR-Like Factorization and Total Variation Regularization problems to illustrate the practical efficacy of our method.
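The need for inexactness is easiest to see when the proximal operator has no closed form, as with total variation. Below is a schematic sketch in which a 1D TV prox is computed by a Chambolle-style dual projected gradient and the inner tolerance is tied to the step length ||u - x||; this illustrates the flavor of a relative error criterion, not the dissertation's precise residual or acceptance test.

```python
def prox_tv1d(y, t, tol=1e-6, max_iter=500):
    """Approximate prox of t*TV(u) in 1D via projected gradient on the
    dual (stepsize 1/4 <= 1/||D||^2 for the difference operator D).
    Stopping when the dual iterates stall is a heuristic accuracy
    proxy, not a certified residual bound."""
    p = np.zeros(y.size - 1)                                 # dual variable
    for _ in range(max_iter):
        u = y + np.diff(np.concatenate(([0.0], p, [0.0])))   # u = y - D^T p
        p_new = np.clip(p + 0.25 * np.diff(u), -t, t)        # projected step
        if np.max(np.abs(p_new - p)) < tol:
            p = p_new
            break
        p = p_new
    return y + np.diff(np.concatenate(([0.0], p, [0.0])))

def inexact_prox_grad_step_tv(x, grad_f, alpha, lam, sigma=0.1):
    """One forward step plus an inexact TV prox whose inner tolerance
    scales with the step length ||u - x||: a schematic relative rule in
    the spirit of, but not identical to, the dissertation's criterion."""
    y = x - alpha * grad_f(x)
    u = prox_tv1d(y, alpha * lam, tol=1e-3)                  # coarse solve
    return prox_tv1d(y, alpha * lam,
                     tol=sigma * np.linalg.norm(u - x))      # relative refine
```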
Recommended Citation
Mohr, Cassandra, "Explicit Proximal Gradient Methods: Bridging Theory and Practice in Optimization" (2024). Graduate Research Theses & Dissertations. 7976.
https://huskiecommons.lib.niu.edu/allgraduate-thesesdissertations/7976
Extent
164 pages
Language
en
Publisher
Northern Illinois University
Rights Statement
In Copyright
Rights Statement 2
NIU theses are protected by copyright. They may be viewed from Huskie Commons for any purpose, but reproduction or distribution in any format is prohibited without the written permission of the authors.
Media Type
Text
