Deep Learning Theory Team Seminar (Talk by Taoli Zheng, The Chinese University of Hong Kong)
【Team】Deep Learning Theory Team
【Date】2025/May/22 (Thursday) 10:00-11:00 (JST)
【Speaker】Taoli Zheng (The Chinese University of Hong Kong)
【Title】Universal Algorithms for Smooth Minimax Optimization: From Theory to Practice
【Abstract】
Smooth minimax optimization problems play a central role in a wide range of applications, including machine learning, game theory, and operations research. However, the diversity of problem structures—spanning convex-concave, nonconvex-concave, convex-nonconcave, and nonconvex-nonconcave with additional regularity conditions—has traditionally required problem-specific algorithmic designs and careful hyperparameter tuning. In this talk, I will present a unified approach through two recent algorithms: Doubly Smoothed Gradient Descent Ascent (DS-GDA) and Doubly Smoothed Optimistic GDA (DS-OGDA). These single-loop methods require no prior structural knowledge and work effectively across a broad class of smooth minimax problems, achieving optimal or best-known convergence guarantees while demonstrating strong empirical performance on notoriously difficult nonconvex-nonconcave examples.
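As background for the single-loop methods mentioned above, here is a minimal sketch of plain gradient descent ascent (GDA) on a toy strongly-convex-strongly-concave problem. This is illustration only: the DS-GDA and DS-OGDA updates presented in the talk add smoothing (and, for DS-OGDA, optimism) terms that are not reproduced here, and the objective `f(x, y) = x^2 + 2xy - y^2` is an assumed example, not one from the talk.

```python
# Plain single-loop gradient descent ascent (GDA) on the toy saddle problem
#   f(x, y) = x^2 + 2xy - y^2,
# which is strongly convex in x and strongly concave in y, with the unique
# saddle point at (0, 0). NOTE: illustrative background only; DS-GDA/DS-OGDA
# from the talk modify these updates with additional smoothing terms.

def gda(x0, y0, lr=0.05, steps=300):
    x, y = x0, y0
    for _ in range(steps):
        gx = 2 * x + 2 * y               # partial derivative of f w.r.t. x
        gy = 2 * x - 2 * y               # partial derivative of f w.r.t. y
        # simultaneous update: descend in x, ascend in y
        x, y = x - lr * gx, y + lr * gy
    return x, y

x, y = gda(1.0, 1.0)
# the iterates spiral in toward the saddle point (0, 0)
```

On this strongly-convex-strongly-concave example plain GDA with a small step size converges; the point of the talk's doubly smoothed methods is to retain single-loop simplicity while also handling the harder nonconvex-nonconcave regimes, where plain GDA can cycle or diverge.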