Improving the Flexibility and Robustness of Derivative-Free Optimization Solvers
With Lindon Roberts
Classical nonlinear optimisation algorithms require gradient evaluations to construct local approximations to the objective and to test for convergence. In settings where the objective is expensive to evaluate or noisy, gradient evaluations may be too costly or too inaccurate to be useful, so we must turn to optimisation methods which do not require gradient information, so-called derivative-free optimisation (DFO). This has applications in areas such as climate modelling, hyperparameter tuning and generating adversarial examples in deep learning. In this talk, I will introduce DFO and discuss two software packages for DFO: one for nonlinear least-squares problems and one for general minimisation problems. I will describe their novel features aimed at expensive and/or noisy problems, and show their state-of-the-art performance. Time permitting, I will also present a heuristic which improves the ability of these methods to escape local minima, and demonstrate its favourable performance on global optimisation problems.
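To illustrate the kind of problem the talk addresses, here is a minimal sketch (not from the talk, and not using the speaker's packages) of applying a classical derivative-free method, Nelder–Mead via SciPy, to a noisy nonlinear least-squares fit; the noise term and problem data are assumptions for illustration only:

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative data: fit y = a*exp(b*t) by minimising a sum of squared residuals.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(-1.5 * t)

def objective(x):
    a, b = x
    residuals = a * np.exp(b * t) - y
    # Simulated evaluation noise: finite-difference gradients of this
    # function would be unreliable, motivating derivative-free methods.
    return np.sum(residuals**2) + 1e-6 * rng.standard_normal()

# Nelder-Mead uses only function values (no gradients), so it can be
# applied when gradient evaluations are unavailable or too noisy.
result = minimize(objective, x0=np.array([1.0, 0.0]), method="Nelder-Mead")
print(result.x)  # expected to be close to (2.0, -1.5)
```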
- Speaker: Lindon Roberts
- Friday 08 March 2019, 14:00–15:00
- Venue: MR11, Centre for Mathematical Sciences, Wilberforce Road, Cambridge.
- Series: CMIH seminar series; organiser: J.W.Stevens.