Stochastic Optimization Methods

Date:

Event: Continuous Optimization Seminar (Seminário de Otimização Contínua)

Location: Campinas, Brazil

Official Event Website: Seminários de Otimização Contínua

Slides on Overleaf

Summary: We will address the fundamentals of, and recent advances in, stochastic optimization methods, which are essential techniques for large-scale problems. Starting from basic Stochastic Gradient Descent (SGD), we will explore its advanced variants, such as Momentum and Nesterov Accelerated Gradient, as well as adaptive methods (Adagrad, RMSProp, and Adam). We will discuss efficient strategies for mini-batch selection and provide practical guidelines for implementation and for choosing the appropriate method for different applications.
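
To make the update rules mentioned above concrete, here is a minimal, illustrative sketch (not part of the talk material) comparing SGD with momentum and Adam on a toy mini-batch least-squares problem. The toy objective, function names, and hyperparameter values are assumptions chosen only for illustration.

```python
import numpy as np

# Toy least-squares objective f(w) = (1/2n) * ||X w - y||^2  (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
w_true = rng.normal(size=5)
y = X @ w_true + 0.1 * rng.normal(size=200)

def minibatch_grad(w, batch_size=32):
    # Stochastic gradient estimated on a randomly sampled mini-batch.
    idx = rng.choice(len(y), size=batch_size, replace=False)
    Xb, yb = X[idx], y[idx]
    return Xb.T @ (Xb @ w - yb) / batch_size

def sgd_momentum(steps=500, lr=0.05, beta=0.9):
    # Classical (heavy-ball) momentum: v <- beta*v + g;  w <- w - lr*v.
    w, v = np.zeros(5), np.zeros(5)
    for _ in range(steps):
        g = minibatch_grad(w)
        v = beta * v + g
        w = w - lr * v
    return w

def adam(steps=500, lr=0.05, beta1=0.9, beta2=0.999, eps=1e-8):
    # Adam: exponential moving averages of the gradient and its square,
    # with bias correction, yield a per-coordinate adaptive step size.
    w = np.zeros(5)
    m, s = np.zeros(5), np.zeros(5)
    for t in range(1, steps + 1):
        g = minibatch_grad(w)
        m = beta1 * m + (1 - beta1) * g
        s = beta2 * s + (1 - beta2) * g**2
        m_hat = m / (1 - beta1**t)
        s_hat = s / (1 - beta2**t)
        w = w - lr * m_hat / (np.sqrt(s_hat) + eps)
    return w

print("momentum estimate error:", np.linalg.norm(sgd_momentum() - w_true))
print("adam estimate error:    ", np.linalg.norm(adam() - w_true))
```

The sketch also illustrates the role of mini-batch selection: each update uses a gradient estimated from a small random subset of the data, which is what makes these methods practical at large scale.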