ad-optim
06/04/21
Inspired by the roptim package, I've put together a C++ wrapper around R's optimisation routines. However, unlike roptim, which computes gradients numerically, this wrapper evaluates gradients via automatic differentiation.
For example, we can define the Rosenbrock function $$ f(x_1, x_2) = 100 (x_2 - x_1^2)^2 + (1-x_1)^2 $$
as follows:
// C++
class Func : public AD_func {
public:
  // define the objective; x[0] and x[1] correspond to x_1 and x_2
  ad_double fn(const ad_vec_double &x) {
    return 100 * pow(x[1] - x[0]*x[0], 2) + pow(1 - x[0], 2);
  }
};
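Because fn is written in terms of the wrapper's ad_double type rather than plain doubles, the gradient comes out of auto-differentiation; for this example it should match the analytic gradient
$$ \nabla f(x_1, x_2) = \begin{pmatrix} -400\, x_1 (x_2 - x_1^2) - 2(1 - x_1) \\ 200\, (x_2 - x_1^2) \end{pmatrix} $$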
We can then have it optimised by calling:
// C++
// ...
Func func;
std::vector<double> par = {0, 0}; // starting parameter values
func.set_grad(par);               // initialise the gradient function
optim(func, par);                 // optimise
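Starting from (0, 0), the optimiser should end up at (or very close to) the function's global minimum at (1, 1), where f = 0.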
The repo can be found here.