lsst::meas::modelfit::Optimizer Class Reference
A numerical optimizer customized for least-squares problems with Bayesian priors.
#include <optimizer.h>
Public Types

enum StateFlags {
    CONVERGED_GRADZERO = 0x0001,
    CONVERGED_TR_SMALL = 0x0002,
    CONVERGED = CONVERGED_GRADZERO | CONVERGED_TR_SMALL,
    FAILED_MAX_INNER_ITERATIONS = 0x0010,
    FAILED_MAX_OUTER_ITERATIONS = 0x0020,
    FAILED_MAX_ITERATIONS = 0x0030,
    FAILED_EXCEPTION = 0x0040,
    FAILED_NAN = 0x0080,
    FAILED = FAILED_MAX_INNER_ITERATIONS | FAILED_MAX_OUTER_ITERATIONS | FAILED_EXCEPTION | FAILED_NAN,
    STATUS_STEP_REJECTED = 0x0100,
    STATUS_STEP_ACCEPTED = 0x0200,
    STATUS_STEP = STATUS_STEP_REJECTED | STATUS_STEP_ACCEPTED,
    STATUS_TR_UNCHANGED = 0x1000,
    STATUS_TR_DECREASED = 0x2000,
    STATUS_TR_INCREASED = 0x4000,
    STATUS_TR = STATUS_TR_UNCHANGED | STATUS_TR_DECREASED | STATUS_TR_INCREASED,
    STATUS = STATUS_STEP | STATUS_TR
}

typedef OptimizerObjective Objective
typedef OptimizerControl Control
typedef OptimizerHistoryRecorder HistoryRecorder

Public Member Functions

Optimizer(std::shared_ptr<Objective const> objective, ndarray::Array<Scalar const, 1, 1> const &parameters, Control const &ctrl)
std::shared_ptr<Objective const> getObjective() const
Control const &getControl() const
bool step()
bool step(HistoryRecorder const &recorder, afw::table::BaseCatalog &history)
int run()
int run(HistoryRecorder const &recorder, afw::table::BaseCatalog &history)
int getState() const
Scalar getObjectiveValue() const
ndarray::Array<Scalar const, 1, 1> getParameters() const
ndarray::Array<Scalar const, 1, 1> getResiduals() const
ndarray::Array<Scalar const, 1, 1> getGradient() const
ndarray::Array<Scalar const, 2, 2> getHessian() const
void removeSR1Term()
    Remove the symmetric-rank-1 secant term from the Hessian, making it just (J^T J).

Friends

class OptimizerHistoryRecorder
Detailed Description

A numerical optimizer customized for least-squares problems with Bayesian priors.
The algorithm used by Optimizer combines the Gauss-Newton approach, which approximates the second-derivative (Hessian) matrix by the inner product of the residuals' Jacobian with itself, with a correction matrix that accounts for large residuals and is updated using a symmetric rank-1 (SR1) secant formula. We assume the prior has analytic first and second derivatives, but we use numerical derivatives to compute the Jacobian of the residuals at every step. A trust region approach is used to ensure global convergence.
We consider the function \(f(x)\) we wish to optimize to have two terms, which correspond to the negative log likelihood ( \(\chi^2/2=\frac{1}{2}\|r(x)\|^2\), where \(r(x)\) is the vector of residuals at \(x\)) and the negative log prior \(q(x)=-\ln P(x)\):
\[ f(x) = \frac{1}{2}\|r(x)\|^2 + q(x) \]
At each iteration \(k\), we expand \(f(x)\) in a Taylor series in \(s=x_{k+1}-x_{k}\):
\[ f(x) \approx m(s) = f(x_k) + g_k^T s + \frac{1}{2}s^T H_k s \]
where
\[ g_k \equiv \left.\frac{\partial f}{\partial x}\right|_{x_k} = J_k^T r_k + \nabla q_k;\quad\quad J_k \equiv \left.\frac{\partial r}{\partial x}\right|_{x_k} \]
\[ H_k = J_k^T J_k + \nabla^2 q_k + B_k \]
Here, \(B_k\) is the SR1 approximation to the second-derivative term:
\[ B_k \approx \sum_i \frac{\partial^2 r^{(i)}_k}{\partial x^2}r^{(i)}_k \]
which we initialize to zero and then update with the following formula:
\[ B_{k+1} = B_{k} + \frac{v v^T}{v^T s};\quad\quad v\equiv J^T_{k+1} r_{k+1} - J^T_k r_k \]
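The quantities above map directly onto a few dense linear-algebra operations. The following is a minimal illustrative sketch (assuming Eigen, and not the Optimizer's actual internal code) of how the gradient \(g_k\), Hessian \(H_k\), and SR1 update could be written; all names here (buildModel, updateSR1, priorGrad, priorHess) are hypothetical:

#include <cmath>
#include <Eigen/Core>

// Illustrative only: assemble the gradient and Hessian of the quadratic model
// described above, then apply the SR1 secant update.
struct QuadraticModel {
    Eigen::VectorXd g;  // g_k = J_k^T r_k + grad q_k
    Eigen::MatrixXd H;  // H_k = J_k^T J_k + hess q_k + B_k
};

QuadraticModel buildModel(Eigen::MatrixXd const &J, Eigen::VectorXd const &r,
                          Eigen::VectorXd const &priorGrad,
                          Eigen::MatrixXd const &priorHess,
                          Eigen::MatrixXd const &B) {
    QuadraticModel m;
    m.g = J.transpose() * r + priorGrad;
    m.H = J.transpose() * J + priorHess + B;
    return m;
}

// SR1 update: B_{k+1} = B_k + (v v^T) / (v^T s), with
// v = J_{k+1}^T r_{k+1} - J_k^T r_k and s = x_{k+1} - x_k.
// Practical implementations skip the update when |v^T s| is very small.
void updateSR1(Eigen::MatrixXd &B, Eigen::VectorXd const &v, Eigen::VectorXd const &s) {
    double denom = v.dot(s);
    if (std::abs(denom) > 1e-12 * v.norm() * s.norm()) {
        B += (v * v.transpose()) / denom;
    }
}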
Unlike the more common rank-2 BFGS update formula, SR1 updates are not guaranteed to produce a positive definite Hessian. This can result in more accurate approximations of the Hessian (and hence more accurate covariance matrices), but it rules out line-search methods and the simple dog-leg approach to the trust region problem. As a result, we should require fewer steps to converge, but spend more time computing each step; this is ideal when we expect the time spent in function evaluation to dominate the time per step anyway.
Definition at line 399 of file optimizer.h.
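As a hedged usage sketch based only on the members listed above: construct the optimizer from an objective, a starting parameter vector, and a control object, call run(), and inspect the returned StateFlags. The header path, the default-constructibility of the control struct, and the fit() wrapper are assumptions made for illustration, not taken from the source.

#include <memory>
#include "ndarray.h"
#include "lsst/meas/modelfit/optimizer.h"

using lsst::meas::modelfit::Optimizer;
using lsst::meas::modelfit::Scalar;

// Sketch only: `objective` is assumed to be a concrete OptimizerObjective built
// elsewhere, and `initial` the starting parameter vector.
void fit(std::shared_ptr<Optimizer::Objective const> objective,
         ndarray::Array<Scalar const, 1, 1> const &initial) {
    Optimizer::Control ctrl;             // control settings (defaults assumed)
    Optimizer optimizer(objective, initial, ctrl);

    int state = optimizer.run();         // iterate until convergence or failure

    if (state & Optimizer::CONVERGED) {
        ndarray::Array<Scalar const, 1, 1> best = optimizer.getParameters();
        Scalar value = optimizer.getObjectiveValue();
        (void)best;                      // use the best-fit parameters ...
        (void)value;                     // ... and the objective value here
    } else if (state & Optimizer::FAILED) {
        // Inspect the individual FAILED_* bits of `state` to see what went wrong.
    }
}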
Constructor & Destructor Documentation

lsst::meas::modelfit::Optimizer::Optimizer(std::shared_ptr<Objective const> objective, ndarray::Array<Scalar const, 1, 1> const &parameters, Control const &ctrl)
Member Function Documentation

void lsst::meas::modelfit::Optimizer::removeSR1Term()

Remove the symmetric-rank-1 secant term from the Hessian, making it just (J^T J).
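A hedged sketch of one way this could be used (the scenario is assumed, not stated in the source): after a fit, drop the SR1 correction so that getHessian() returns the plain (J^T J) approximation described above.

#include "ndarray.h"
#include "lsst/meas/modelfit/optimizer.h"

using lsst::meas::modelfit::Optimizer;
using lsst::meas::modelfit::Scalar;

// Assumes `optimizer` has already been run; strips the SR1 term before
// reading out the Hessian.
ndarray::Array<Scalar const, 2, 2> gaussNewtonHessian(Optimizer &optimizer) {
    optimizer.removeSR1Term();
    return optimizer.getHessian();
}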
Friends And Related Function Documentation

friend class OptimizerHistoryRecorder

Definition at line 476 of file optimizer.h.