Hvass Laboratories
Magnus Erik Hvass Pedersen




Thesis

Tuning & Simplifying Heuristical Optimization

Abstract

This thesis is about the tuning and simplification of black-box (direct-search, derivative-free) optimization methods, which by definition do not use gradient information to guide their search for an optimum but merely require a fitness (cost, error, objective) measure for each candidate solution to the optimization problem. Such optimization methods often have parameters that influence their behaviour and efficacy. A Meta-Optimization technique is presented here for tuning the behavioural parameters of an optimization method by employing an additional layer of optimization. This is used in a number of experiments on two popular optimization methods, Differential Evolution and Particle Swarm Optimization, and unveils the true performance capabilities of an optimizer in different usage scenarios. It is found that state-of-the-art optimizer variants with their supposedly adaptive behavioural parameters do not have a general and consistent performance advantage, and are outperformed in several cases by simplified optimizers, provided the behavioural parameters of those simplified optimizers are tuned properly.
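The meta-optimization idea described above can be illustrated with a minimal sketch: an outer search loop tunes the behavioural parameters (here F and CR) of an inner Differential Evolution run, scoring each parameter setting by the fitness the tuned optimizer achieves. This is a hypothetical toy example on a sphere benchmark, not the SwarmOps implementation used in the thesis; the function names and parameter ranges are illustrative assumptions.

```python
import random

def sphere(x):
    # Benchmark fitness to minimize: sum of squares, optimum 0 at the origin.
    return sum(xi * xi for xi in x)

def differential_evolution(F, CR, dim=5, pop_size=20, iters=100, seed=0):
    # Minimal DE/rand/1/bin with behavioural parameters F (differential
    # weight) and CR (crossover probability). Returns the best fitness found.
    rng = random.Random(seed)
    pop = [[rng.uniform(-5.0, 5.0) for _ in range(dim)] for _ in range(pop_size)]
    fit = [sphere(x) for x in pop]
    for _ in range(iters):
        for i in range(pop_size):
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            jrand = rng.randrange(dim)  # ensure at least one mutated dimension
            trial = [pop[a][k] + F * (pop[b][k] - pop[c][k])
                     if (rng.random() < CR or k == jrand) else pop[i][k]
                     for k in range(dim)]
            tf = sphere(trial)
            if tf < fit[i]:
                pop[i], fit[i] = trial, tf  # greedy selection
    return min(fit)

def meta_optimize(trials=30, seed=1):
    # Outer optimization layer: here plain random search over (F, CR).
    # Each candidate parameter setting is rated by running the inner DE
    # and taking the fitness it achieves on the benchmark problem.
    rng = random.Random(seed)
    best = None
    for _ in range(trials):
        F, CR = rng.uniform(0.1, 2.0), rng.uniform(0.0, 1.0)
        score = differential_evolution(F, CR)
        if best is None or score < best[0]:
            best = (score, F, CR)
    return best  # (best fitness, tuned F, tuned CR)
```

In the thesis the outer layer is itself a proper optimizer rather than random search, and the inner optimizer is rated over multiple problems and repeated runs to reduce noise; the layered structure, however, is the same as in this sketch.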

Download

  • Thesis (PDF document, 4.4 MB)
  • SwarmOps is the source-code library used for the computational experiments in the thesis.
