CS 201: Derivative-Free Optimization of Noisy Functions, JORGE NOCEDAL, Northwestern University

Speaker: Jorge Nocedal
Affiliation: Northwestern University

ABSTRACT:

The most successful method for derivative-free optimization developed by the optimization community in the last two decades was proposed by Powell. It creates quadratic models using a minimum-change principle in conjunction with function interpolation, and computes a step by minimizing the model inside a trust region. This approach works surprisingly well when functions contain noise. Nevertheless, we argue that a more flexible approach is to approximate the derivatives of noisy functions (using smoothing techniques) and to use them within a quasi-Newton method. To ensure stability, we propose a noise-tolerant quasi-Newton updating technique and provide its supporting theory. The efficiency of the method is demonstrated on unconstrained, nonlinear least-squares, and constrained optimization problems with noise.
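To make the general idea concrete, here is a minimal Python sketch (not the speaker's actual method, which includes a noise-tolerant quasi-Newton update not shown here): estimate the gradient of a noisy function with finite differences whose step size is scaled to the noise level, then hand that gradient to a standard BFGS solver. The noise level eps_f, the test function f, and the step rule h ~ eps_f**(1/3) for central differences are illustrative assumptions.

import numpy as np
from scipy.optimize import minimize

eps_f = 1e-6  # assumed standard deviation of the additive function noise

def f(x):
    # Smooth quadratic corrupted by additive uniform noise (illustrative).
    return 0.5 * np.dot(x, x) + eps_f * np.random.uniform(-1.0, 1.0)

def noisy_grad(x):
    # Central-difference gradient with a noise-aware step h ~ eps_f**(1/3),
    # chosen to balance truncation error against noise amplification.
    h = eps_f ** (1.0 / 3.0)
    g = np.empty_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g

x0 = np.array([2.0, -1.5])
res = minimize(f, x0, jac=noisy_grad, method="BFGS")
print(res.x)  # should land close to the true minimizer at the origin

With too small a difference step the noise dominates the gradient estimate and BFGS stalls; the noise-aware step keeps the estimates usable, which is the flexibility the abstract contrasts with interpolation-based model methods.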

BIO:

Jorge Nocedal is an applied mathematician and computer scientist, and the Walter P. Murphy Professor in the Department of Industrial Engineering and Management Sciences in the McCormick School of Engineering at Northwestern University.

Jorge Nocedal’s research interests are in optimization and its applications in machine learning and in disciplines involving differential equations. He specializes in nonlinear optimization, both convex and nonconvex, deterministic and stochastic.

Hosted by Professor Cho-Jui Hsieh

Date/Time:
April 8, 2021
4:00 pm - 5:45 pm

Location:
Zoom Webinar
404 Westwood Plaza, Los Angeles