Files
Dissertation-Final.pdf (13 MB)
First-Order Algorithms for Continuous Optimization With Inexact Oracles
Author
Liu, Yin
Permalink: http://rave.ohiolink.edu/etdc/view?acc_num=osu1721330362439766
Abstract Details
Year and Degree
2024, Doctor of Philosophy, Ohio State University, Industrial and Systems Engineering.
Abstract
First-order deterministic and stochastic optimization algorithms have gained significant importance over the past two decades, primarily due to applications in data science and machine learning. In many recent problems, however, obtaining the exact gradient (in deterministic settings) or an unbiased gradient estimator (in stochastic settings) is computationally challenging. A naive implementation of the classical algorithms for such problems leads to sub-optimal performance. To address this gap, this research investigates the properties of these problems, designs new optimization algorithms, and establishes their theoretical convergence guarantees. Along this path, the dissertation makes three main contributions: 1) algorithmic development and in-depth analysis for a specific problem, namely stochastic composition optimization; 2) exploration of three biased stochastic approximation algorithms for the general setup and their theoretical analysis in the nonconvex setting; 3) investigation of the accelerated gradient descent method for convex problems with inexact gradient oracles and derivation of a new upper bound for the accumulated error.
The first work focuses on the stochastic composition optimization problem. It explores scenarios where either the inner or the outer function lacks Lipschitz-continuous gradients. To generalize the assumption of Lipschitz continuity of the gradients, the notion of relative smoothness is introduced. The properties of composition gradients for three non-trivial combinations are examined, leading to a discussion of their corresponding smoothness properties. For each type of composition problem, first-order methods are proposed and their convergence analyses are conducted. Furthermore, sample complexities for the proposed algorithms are established, and the theoretical findings are validated through numerical experiments.
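Why naive single-sample estimators fail for composition problems can be seen in a minimal sketch (not taken from the dissertation; the choices of inner map g and outer function f below are hypothetical, picked only to make the bias visible): for F(x) = f(E[g(x, ξ)]) with a nonlinear f, applying the chain rule to one sample of g yields a gradient estimator whose expectation differs from ∇F.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative composition objective F(x) = f(E[g(x, xi)]) with
#   g(x, xi) = x + xi,  xi ~ N(0, 1)   (stochastic inner map)
#   f(y)     = y**3                    (nonlinear outer function)
# Then F(x) = x**3 and the true gradient is dF/dx = 3 x**2.

def naive_grad(x, n):
    """Average of the per-sample chain rule d/dx f(g(x, xi))."""
    xi = rng.standard_normal(n)
    # E[3 (x + xi)^2] = 3 (x^2 + 1): biased upward by 3 for any n.
    return np.mean(3.0 * (x + xi) ** 2)

x = 2.0
true_grad = 3 * x**2                 # 12.0
est = naive_grad(x, 1_000_000)
print(true_grad, est)                # estimate concentrates near 15, not 12
```

The bias does not vanish as the sample size grows, which is why composition problems need dedicated algorithms rather than plain stochastic gradient descent.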
The second work presents a unified framework for stochastic nonconvex problems that encompasses several classes of problems characterized by biased gradients. The relationship between the bias-control parameter and the convergence guarantees is explored. Based on these findings, an adaptively biased stochastic algorithm is developed that sets the bias level according to the algorithm's trajectory. The impact of bias on a variance-reduced extension of the proposed algorithm is also investigated.
The third work focuses on accelerated first-order methods with inexact gradient estimates applied to deterministic convex problems. The Performance Estimation Problem (PEP) is utilized as an analysis tool to study the performance of the proposed algorithm. Moreover, a novel bound on the convergence rate is derived in which the accumulated error term is independent of the initial condition and the trajectory of the algorithm. To the best of the author's knowledge, this is the first such bound under the absolute inexactness assumption on the error.
To summarize, this dissertation contributes to first-order methods in settings where exact or unbiased gradient estimates are unavailable. Under both deterministic and stochastic optimization scenarios, the three works develop novel algorithms and convergence analysis techniques to address the challenges posed by gradient error.
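The absolute-inexactness setting above can be illustrated with a small sketch (this is plain gradient descent on a toy quadratic, not the dissertation's accelerated method; the objective, step size, and error magnitude are hypothetical): each oracle call returns the true gradient plus an error of norm at most δ, and the iterates stall at an error floor governed by the accumulated oracle error rather than by the initial condition.

```python
import numpy as np

rng = np.random.default_rng(1)

# Gradient descent on the convex quadratic F(x) = 0.5 * ||x||^2
# with an inexact oracle under the absolute-error model ||e_k|| <= delta:
#     x_{k+1} = x_k - eta * (grad F(x_k) + e_k),   grad F(x) = x.

def inexact_gd(x0, eta, delta, steps):
    x = x0.copy()
    for _ in range(steps):
        e = rng.standard_normal(x.shape)
        e *= delta / np.linalg.norm(e)   # scale error to norm exactly delta
        x -= eta * (x + e)
    return x

x0 = np.full(10, 5.0)
x_exact = inexact_gd(x0, eta=0.5, delta=0.0, steps=100)
x_noisy = inexact_gd(x0, eta=0.5, delta=0.1, steps=100)
print(np.linalg.norm(x_exact))   # exact oracle: distance to optimum -> 0
print(np.linalg.norm(x_noisy))   # inexact oracle: stalls at a floor ~ delta
```

With an exact oracle the contraction drives the iterate to the minimizer; with δ > 0 the residual error persists no matter how long the method runs, which motivates bounds on the accumulated error term.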
Committee
Sam Davanloo Tajbakhsh (Advisor)
Guzin Bayraksan (Committee Member)
Jia (Kevin) Liu (Committee Member)
Pages
171 p.
Subject Headings
Industrial Engineering; Operations Research
Recommended Citations
APA Style (7th edition)
Liu, Y. (2024). First-Order Algorithms for Continuous Optimization With Inexact Oracles [Doctoral dissertation, Ohio State University]. OhioLINK Electronic Theses and Dissertations Center. http://rave.ohiolink.edu/etdc/view?acc_num=osu1721330362439766
MLA Style (8th edition)
Liu, Yin. First-Order Algorithms for Continuous Optimization With Inexact Oracles. 2024. Ohio State University, Doctoral dissertation. OhioLINK Electronic Theses and Dissertations Center, http://rave.ohiolink.edu/etdc/view?acc_num=osu1721330362439766.
Chicago Manual of Style (17th edition)
Liu, Yin. "First-Order Algorithms for Continuous Optimization With Inexact Oracles." Doctoral dissertation, Ohio State University, 2024. http://rave.ohiolink.edu/etdc/view?acc_num=osu1721330362439766
Document number: osu1721330362439766
Download Count: 138
Copyright Info
© 2024, all rights reserved.
This open access ETD is published by The Ohio State University and OhioLINK.