Optimization Algorithms
• Optimization algorithms find a solution by minimizing a cost function.
• Virtually all algorithms use gradient information for the downhill search.
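The downhill idea can be sketched with plain gradient descent on a toy quadratic cost (the two-parameter cost, target point, and step size below are illustrative assumptions, not an actual IMRT cost function):

```python
import numpy as np

def cost(x, target):
    """Toy quadratic cost: squared distance to a target point."""
    return np.sum((x - target) ** 2)

def gradient(x, target):
    """Analytic gradient of the toy cost."""
    return 2.0 * (x - target)

target = np.array([1.0, 2.0])   # hypothetical optimum (illustrative)
x = np.zeros(2)                 # starting guess
step = 0.1                      # fixed step size (illustrative)

for _ in range(100):
    x -= step * gradient(x, target)   # move downhill along -gradient
```

Each iteration steps against the gradient, so for a sufficiently small step size the cost shrinks monotonically toward the minimum.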
• The size of the optimization problem limits the available algorithms to quasi-Newton methods and Conjugate Gradient search. If implemented well, neither should be superior to the other.
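As a sketch, SciPy's generic minimizer exposes both families; on a small convex quadratic (an assumed toy problem, not an IMRT cost) a quasi-Newton method (BFGS) and Conjugate Gradient converge to the same unique minimum:

```python
import numpy as np
from scipy.optimize import minimize

# Toy convex quadratic: f(x) = 0.5 x^T A x - b^T x (illustrative, not an IMRT cost)
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])

f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b           # analytic gradient, used by both methods

x0 = np.zeros(2)
res_bfgs = minimize(f, x0, jac=grad, method="BFGS")  # quasi-Newton
res_cg = minimize(f, x0, jac=grad, method="CG")      # Conjugate Gradient
```

Both runs reach the analytic solution of the linear system A x = b; on a well-conditioned problem like this, neither method is clearly superior, matching the bullet above.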
• These algorithms are deterministic, so that, given infinite time, they find a minimum, which is unique in the absence of delivery constraints.
ESTRO IMRT course 2016
IMRT optimization: algorithms & cost functions
London, April 4th, 2016
• Some delivery constraints make the problem virtually unsolvable.
• If nothing else helps: stochastic algorithms (such as Simulated Annealing), which perform a sparse, random search of the entire solution space (which is very, very big). They are the last resort when the problem is otherwise intractable.
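A minimal Simulated Annealing sketch in Python (the one-dimensional multimodal cost, cooling schedule, and proposal width are all illustrative assumptions, far simpler than a real IMRT solution space):

```python
import math
import random

def cost(x):
    """Toy multimodal cost with a global and a local minimum."""
    return x * x + 10.0 * math.sin(x)

random.seed(0)              # reproducible run
x = 5.0                     # start far from the global minimum
best_x, best_f = x, cost(x)
temperature = 10.0          # initial temperature (illustrative)

for _ in range(20000):
    candidate = x + random.gauss(0.0, 1.0)   # random step in solution space
    delta = cost(candidate) - cost(x)
    # Accept downhill moves always; accept uphill moves with Boltzmann
    # probability, which lets the search escape local minima while hot.
    if delta < 0 or random.random() < math.exp(-delta / temperature):
        x = candidate
    if cost(x) < best_f:
        best_x, best_f = x, cost(x)
    temperature *= 0.9995                    # geometric cooling
```

A run like this typically ends near the global minimum around x ≈ -1.3 rather than in the shallower local minimum near x ≈ 3.8; the random uphill acceptances are what distinguish it from the deterministic downhill searches above.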