16:00 - 17:00
5161.0165 (Bernoulliborg) and online
Title: A restart scheme for a dynamic with Hessian damping
Abstract: The study of the convergence of optimization algorithms is directly linked to the study of certain differential equations. In [1], it is shown that Nesterov's accelerated gradient method can be modeled by a second-order ordinary differential equation involving the gradient of a convex function. A restart scheme is also proposed there that accelerates the convergence of the function values along the solutions of this dynamic. It is known that adding a damping term involving the Hessian of the function stabilizes the convergence, so the main objective of this talk is to present results on the convergence of a new restart scheme for the dynamic with the Hessian term.
[1] W. Su, S. Boyd, and E. J. Candès, "A differential equation for modeling Nesterov's accelerated gradient method: theory and insights," Advances in Neural Information Processing Systems, 2014.
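For context (a brief sketch, not part of the speaker's abstract): the dynamic derived in [1] for a convex differentiable function $f$ is
$$\ddot{X}(t) + \frac{3}{t}\,\dot{X}(t) + \nabla f(X(t)) = 0,$$
and the Hessian-damped variant studied in the literature on inertial dynamics is often written as
$$\ddot{X}(t) + \frac{\alpha}{t}\,\dot{X}(t) + \beta\,\nabla^2 f(X(t))\,\dot{X}(t) + \nabla f(X(t)) = 0,$$
with damping parameters $\alpha, \beta > 0$; the exact dynamic and parameters considered in the talk may differ.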
The colloquium will also take place online via Google Meet. You can email the organizer for a link to the meeting.