EE ZOOM Seminar: The Algorithmic Bias of 1st Order Methods

30 June 2025, 15:00
 

https://tau-ac-il.zoom.us/j/82194207072?pwd=7rKWjTpzhk5r5xCA65pgGGO9rieapJ.1
Meeting ID: 821 9420 7072
Passcode: 040024

 

Electrical Engineering Systems ZOOM Seminar

 

Speaker: Idan Amir

Ph.D. student under the supervision of Prof. Roi Livni

 

Monday, 30th June 2025, at 15:00

 

The Algorithmic Bias of 1st Order Methods

Abstract

We explore the interplay between optimization and generalization in stochastic convex optimization and overparameterized settings. Our results establish a separation between Stochastic Gradient Descent (SGD) and full-batch Gradient Descent (GD), showing that SGD generalizes efficiently with O(1/ϵ^2) iterations, while GD requires Ω(1/ϵ^4), even with regularization. Further, we examine full-batch optimization methods, revealing dimension-dependent inefficiencies that limit their generalization performance compared to stochastic methods. Finally, we demonstrate that early-stopped GD can achieve optimal generalization in Generalized Linear Models (GLMs) with O(1/ϵ^2) complexity by leveraging implicit regularization and problem-specific geometry. Together, these works advance the understanding of efficient learning in high-dimensional and overparameterized regimes.
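For readers unfamiliar with the two algorithms compared in the talk, the following minimal sketch contrasts their update rules on a least-squares problem. This is an illustrative toy (the data, step size, and loss are arbitrary assumptions, not from the talk): SGD updates with the gradient of a single fresh sample per iteration, while full-batch GD uses the gradient of the entire sample at every step.

```python
import numpy as np

# Toy least-squares instance (all parameters here are illustrative assumptions).
rng = np.random.default_rng(0)
n, d = 100, 10
X = rng.normal(size=(n, d))
y = X @ rng.normal(size=d) + 0.1 * rng.normal(size=n)

def grad(w, Xb, yb):
    # Gradient of the average squared loss 0.5 * ||Xb @ w - yb||^2 / len(yb)
    return Xb.T @ (Xb @ w - yb) / len(yb)

eta, T = 0.05, 300
w_sgd = np.zeros(d)
w_gd = np.zeros(d)
for t in range(T):
    i = rng.integers(n)                              # SGD: one sample per step
    w_sgd -= eta * grad(w_sgd, X[i:i+1], y[i:i+1])
    w_gd -= eta * grad(w_gd, X, y)                   # GD: full batch every step

print("SGD loss:", float(np.mean((X @ w_sgd - y) ** 2)))
print("GD loss: ", float(np.mean((X @ w_gd - y) ** 2)))
```

The talk's separation result concerns generalization (test error) in stochastic convex optimization, not training loss on a fixed sample as above; the sketch only shows the mechanical difference between the two first-order methods being compared.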

Tel Aviv University makes every effort to respect copyright. If you hold copyright in the content found here and/or believe that the use made of this content infringes your rights, please contact us promptly at the address here >>