NTU Theses and Dissertations Repository › College of Engineering (工學院) › Institute of Industrial Engineering (工業工程學研究所)
Please use this identifier to cite or link to this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/98941
Title: 樂觀梯度下降上升法於相對利普希茨條件下求解最佳化問題之研究
A Research on Relative Lipschitzness in Optimistic Gradient Descent Ascent Method for Solving Optimization Problems
Authors: 潘宇泰
Yu-Tai Pan
Advisor: 黃道宏
Kevin Dowhon Huang
Keywords: convex optimization, Fenchel transformation, finite-sum optimization, stochastic optimization, acceleration
Publication Year: 2025
Degree: Master's (碩士)
Abstract: Starting from the relative-Lipschitzness perspective proposed in [10], this research proposes two algorithms based on the Optimistic Gradient Descent Ascent (OGDA) method, addressing general optimization problems and finite-sum optimization problems, respectively.

The first algorithm targets smooth, strongly convex objective functions: the original problem is rewritten as a saddle-point problem via the Fenchel conjugate transformation, and OGDA is then interpreted as a variant of the proximal point method. We prove that this method attains O(√(L/μ)log(1/ε)), the optimal iteration complexity among first-order methods.
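The page does not spell out the exact splitting used in the thesis; as a generic illustration, for a smooth, strongly convex function h (and any convex g, possibly zero), the minimization problem can be rewritten through the Fenchel conjugate as a convex-concave saddle-point problem:

```latex
% Generic Fenchel saddle-point reformulation (illustrative sketch only;
% the thesis's exact splitting is not shown on this page):
\min_{x} \; g(x) + h(x)
\;=\;
\min_{x} \max_{y} \; g(x) + \langle x, y \rangle - h^{*}(y),
\qquad
h^{*}(y) := \sup_{x} \left\{ \langle x, y \rangle - h(x) \right\}.
```

The equality uses biconjugation, h(x) = sup_y {⟨x, y⟩ − h*(y)}, which holds for closed convex h; strong convexity of h makes h* smooth, which is what gives the inner maximization good structure.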

The second algorithm is designed for optimization problems in finite-sum form, where each iteration randomly samples a single smooth component function to update. We construct a stochastic version of OGDA and prove that, even when the component functions have different smoothness constants L_i, it achieves a convergence rate of O((n+(Σ_{i=1}^n √(L_i))/√(nμ))*log((γε_0)/ε)), where γ:=1+(Σ_{i=1}^n L_i)/(nμ).

Finally, we design test problems for numerical validation. The results show that the proposed OGDA variants exhibit accelerated convergence.
In this research, we develop and analyze two variants of the Optimistic Gradient Descent Ascent (OGDA) algorithm through the lens of relative Lipschitzness, a structural condition introduced in [10]. We first consider smooth (Lipschitz-gradient), strongly convex minimization problems and reformulate them as structured convex-concave saddle-point problems using Fenchel duality. Within this formulation, we show that OGDA achieves the optimal iteration complexity of O(√(L/μ)log(1/ε)).
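To illustrate the optimistic dynamics themselves (a textbook sketch, not the thesis's exact algorithm), the OGDA update corrects the current gradient with the previous one. On the classic bilinear toy problem f(x, y) = x·y, plain gradient descent ascent spirals outward while OGDA converges to the saddle point (0, 0):

```python
import numpy as np

def ogda_bilinear(eta=0.1, steps=3000):
    """Optimistic GDA on f(x, y) = x * y, whose saddle point is (0, 0).

    Standard OGDA form:
        x_{k+1} = x_k - eta * (2 * g_x(k) - g_x(k-1))
        y_{k+1} = y_k + eta * (2 * g_y(k) - g_y(k-1))
    with gradients g_x = df/dx = y and g_y = df/dy = x.
    """
    x, y = 1.0, 1.0
    gx_prev, gy_prev = y, x              # previous gradients (init = current)
    for _ in range(steps):
        gx, gy = y, x                    # gradients of f(x, y) = x * y
        x, y = (x - eta * (2 * gx - gx_prev),
                y + eta * (2 * gy - gy_prev))
        gx_prev, gy_prev = gx, gy
    return np.hypot(x, y)                # distance to the saddle point

def gda_bilinear(eta=0.1, steps=3000):
    """Plain GDA on the same problem: the iterates spiral outward."""
    x, y = 1.0, 1.0
    for _ in range(steps):
        x, y = x - eta * y, y + eta * x
    return np.hypot(x, y)

print(ogda_bilinear())   # small: OGDA converges to the saddle point
print(gda_bilinear())    # large: GDA diverges on bilinear problems
```

The extra "minus previous gradient" term is what makes OGDA behave like an approximate proximal point step, which is the interpretation the abstract refers to.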

Inspired by the finite-sum framework studied in [16], the second algorithm targets the finite-sum setting, where the objective function is expressed as an average of n convex component functions with Lipschitz continuous gradients. We develop a new algorithm based on OGDA, mirroring the structure used in the single-function setting. Our analysis demonstrates that in the finite-sum case, the proposed method achieves a convergence rate of O((n+(Σ_{i=1}^n √(L_i))/√(nμ))*log((γε_0)/ε)), where γ:=1+(Σ_{i=1}^n L_i)/(nμ).
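To see when the finite-sum bound pays off, one can compare the gradient-evaluation counts implied by the two rates for an illustrative (made-up) spread of component smoothness constants L_i; the full-batch baseline runs √(L̄/μ) iterations, each touching all n components:

```python
import numpy as np

n, mu = 1000, 1.0
L = np.geomspace(1.0, 1e4, n)   # illustrative heterogeneous smoothness constants

# Stochastic finite-sum bound (per log factor): n + (sum_i sqrt(L_i)) / sqrt(n * mu)
stochastic = n + np.sqrt(L).sum() / np.sqrt(n * mu)

# Full-batch baseline: sqrt(L_bar / mu) iterations, each costing n gradients
full_batch = n * np.sqrt(L.mean() / mu)

print(f"stochastic bound: ~{stochastic:.0f} gradient evaluations per log factor")
print(f"full-batch bound: ~{full_batch:.0f} gradient evaluations per log factor")
```

The gap comes from the bound depending on Σ√L_i rather than on the mean (or maximum) constant, so the more heterogeneous the L_i, the larger the advantage of the per-component sampling scheme.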

Finally, we validate the effectiveness of our approach through numerical experiments.
URI: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/98941
DOI: 10.6342/NTU202504117
Fulltext Rights: Consent granted (publicly available worldwide)
Embargo Lift Date: 2025-08-21
Appears in Collections: Institute of Industrial Engineering (工業工程學研究所)

Files in This Item:
ntu-113-2.pdf (895.64 kB, Adobe PDF)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
