Optimization
Note
Optimization can be minimization or maximization, but only minimization is implemented in this library to unify the usage of classes across all algorithms.
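To maximize a function with these minimizers, minimize its negation instead. The following is a minimal sketch of a sign-flipping wrapper; the evaluate_on / value member names mirror the usage pattern shown later on this page and are illustrative assumptions, not a fixed API of this library.

#include <utility>

// Sketch: maximizing f is equivalent to minimizing -f.
// (The wrapped interface below is an assumption for illustration.)
template <typename ObjectiveFunction>
class negated_objective_function {
public:
    using variable_type = typename ObjectiveFunction::variable_type;
    using value_type = typename ObjectiveFunction::value_type;

    explicit negated_objective_function(ObjectiveFunction function)
        : function_(std::move(function)) {}

    // Evaluate the wrapped function and flip the sign of its value.
    void evaluate_on(const variable_type& variable) {
        function_.evaluate_on(variable);
        value_ = -function_.value();
    }

    [[nodiscard]] auto value() const -> const value_type& { return value_; }

private:
    ObjectiveFunction function_;
    value_type value_{};
};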
Unconstrained Optimization
Note
The twice_differentiable_objective_function concept in the model will be used in the Newton method, which is a descent_method_optimizer but additionally requires the Hessian.
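As a concrete illustration, the following sketches a twice-differentiable objective function for f(x) = x^T x / 2; the evaluate_on / value / gradient / hessian member names are assumptions for illustration, chosen to match the accessor style used elsewhere on this page.

#include <Eigen/Core>

// Sketch of a twice-differentiable objective: f(x) = x^T x / 2,
// with gradient x and Hessian equal to the identity matrix.
// (Member names are illustrative assumptions.)
class quadratic_objective_function {
public:
    using variable_type = Eigen::VectorXd;
    using value_type = double;

    // Evaluate the value, gradient, and Hessian at the given variable.
    void evaluate_on(const variable_type& variable) {
        value_ = 0.5 * variable.squaredNorm();
        gradient_ = variable;
        hessian_ =
            Eigen::MatrixXd::Identity(variable.size(), variable.size());
    }

    [[nodiscard]] auto value() const -> const value_type& { return value_; }
    [[nodiscard]] auto gradient() const -> const variable_type& {
        return gradient_;
    }
    [[nodiscard]] auto hessian() const -> const Eigen::MatrixXd& {
        return hessian_;
    }

private:
    value_type value_{};
    variable_type gradient_{};
    Eigen::MatrixXd hessian_{};
};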
Note
In actual implementations, template classes and the Curiously Recurring Template Pattern (CRTP) will be applied to avoid the overhead of virtual functions.
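The pattern itself can be sketched as follows; this is a generic illustration of CRTP, not the library's actual class hierarchy.

#include <cstddef>

// Generic CRTP sketch: the base class calls into the derived class via
// static_cast, so no virtual dispatch is involved.
template <typename Derived>
class iterative_solver_base {
public:
    // Statically dispatches to Derived::iterate().
    void solve(std::size_t max_iterations) {
        for (std::size_t i = 0; i < max_iterations; ++i) {
            derived().iterate();
        }
    }

private:
    auto derived() -> Derived& { return static_cast<Derived&>(*this); }
};

class my_optimizer : public iterative_solver_base<my_optimizer> {
public:
    void iterate() { /* one iteration of the algorithm */ }
};

The following example shows the general usage of an optimizer in this library.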
#include <cstddef>
#include <iostream>

// The header path below is an assumption; check the library's
// documentation for the actual include.
#include <num_collect/opt/steepest_descent.h>

auto obj_fun = objective_function();
// create an optimizer with an objective function
auto optimizer = num_collect::opt::steepest_descent<objective_function>(obj_fun);
// set constants in algorithms if necessary
optimizer.line_searcher().armijo_coeff(0.3);
// initialize with the initial variable
optimizer.init(initial_variable);
// call iterate() multiple times or call solve() to minimize the objective function
if (use_iterate) {
    for (std::size_t i = 0; i < max_iteration; ++i) {
        optimizer.iterate();
    }
} else {
    optimizer.solve();
}
// check results
std::cout << optimizer.opt_variable() << std::endl;
std::cout << optimizer.opt_value() << std::endl;
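The same init / iterate / solve workflow is intended to apply to other descent methods. As a sketch, a Newton-type optimizer could be driven the same way with a twice-differentiable objective such as the quadratic_objective_function above; the class name newton_optimizer and its usage here are assumptions following the naming pattern of steepest_descent.

// Hedged sketch: the same workflow with a Newton-type optimizer.
// (newton_optimizer is an assumption, not a confirmed class name.)
auto newton_obj_fun = quadratic_objective_function();
auto newton_opt =
    num_collect::opt::newton_optimizer<quadratic_objective_function>(
        newton_obj_fun);
newton_opt.init(initial_variable);
newton_opt.solve();
std::cout << newton_opt.opt_variable() << std::endl;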