lsst::meas::modelfit::Optimizer Class Reference
A numerical optimizer customized for least-squares problems with Bayesian priors. More...
#include <optimizer.h>
Public Types

enum StateFlags {
  CONVERGED_GRADZERO = 0x0001, CONVERGED_TR_SMALL = 0x0002,
  CONVERGED = CONVERGED_GRADZERO | CONVERGED_TR_SMALL,
  FAILED_MAX_INNER_ITERATIONS = 0x0010, FAILED_MAX_OUTER_ITERATIONS = 0x0020,
  FAILED_MAX_ITERATIONS = 0x0030, FAILED_EXCEPTION = 0x0040, FAILED_NAN = 0x0080,
  FAILED = FAILED_MAX_INNER_ITERATIONS | FAILED_MAX_OUTER_ITERATIONS | FAILED_EXCEPTION | FAILED_NAN,
  STATUS_STEP_REJECTED = 0x0100, STATUS_STEP_ACCEPTED = 0x0200,
  STATUS_STEP = STATUS_STEP_REJECTED | STATUS_STEP_ACCEPTED,
  STATUS_TR_UNCHANGED = 0x1000, STATUS_TR_DECREASED = 0x2000, STATUS_TR_INCREASED = 0x4000,
  STATUS_TR = STATUS_TR_UNCHANGED | STATUS_TR_DECREASED | STATUS_TR_INCREASED,
  STATUS = STATUS_STEP | STATUS_TR
}

typedef OptimizerObjective Objective
typedef OptimizerControl Control
typedef OptimizerHistoryRecorder HistoryRecorder
Public Member Functions

Optimizer (std::shared_ptr< Objective const > objective, ndarray::Array< Scalar const, 1, 1 > const &parameters, Control const &ctrl)
std::shared_ptr< Objective const > getObjective () const
Control const & getControl () const
bool step ()
bool step (HistoryRecorder const &recorder, afw::table::BaseCatalog &history)
int run ()
int run (HistoryRecorder const &recorder, afw::table::BaseCatalog &history)
int getState () const
Scalar getObjectiveValue () const
ndarray::Array< Scalar const, 1, 1 > getParameters () const
ndarray::Array< Scalar const, 1, 1 > getResiduals () const
ndarray::Array< Scalar const, 1, 1 > getGradient () const
ndarray::Array< Scalar const, 2, 2 > getHessian () const
void removeSR1Term ()
    Remove the symmetric-rank-1 secant term from the Hessian, making it just (J^T J).

Friends

class OptimizerHistoryRecorder
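The members above are typically used together as in the following sketch. This is illustrative only, not an official example: it assumes the caller already has an OptimizerObjective and an initial parameter vector, that a default-constructed OptimizerControl is acceptable, and that the header lives at lsst/meas/modelfit/optimizer.h; only members documented on this page are called.

#include <cstdio>
#include <memory>

#include "ndarray.h"
#include "lsst/meas/modelfit/optimizer.h"

namespace mf = lsst::meas::modelfit;

// Run a fit and report the result, using only the interface listed above.
void runFit(std::shared_ptr<mf::OptimizerObjective const> objective,
            ndarray::Array<mf::Scalar const, 1, 1> const & initialParameters) {
    mf::OptimizerControl ctrl;  // default control settings (an assumption)
    mf::Optimizer optimizer(objective, initialParameters, ctrl);

    optimizer.run();  // iterate until a CONVERGED or FAILED flag is set

    int const state = optimizer.getState();  // bitmask of Optimizer::StateFlags
    if (state & mf::Optimizer::CONVERGED) {
        ndarray::Array<mf::Scalar const, 1, 1> best = optimizer.getParameters();
        std::printf("converged: %d parameters, objective = %g\n",
                    static_cast<int>(best.getSize<0>()), optimizer.getObjectiveValue());
    } else if (state & mf::Optimizer::FAILED) {
        std::printf("fit failed (state = 0x%x)\n", static_cast<unsigned int>(state));
    }
}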
Detailed Description

A numerical optimizer customized for least-squares problems with Bayesian priors.
The algorithm used by Optimizer is a Gauss-Newton method with a secant correction: the second-derivative (Hessian) matrix is approximated by the inner product of the Jacobian of the residuals with itself, plus a correction matrix that accounts for large residuals and is updated using a symmetric rank-1 (SR1) secant formula. We assume the prior has analytic first and second derivatives, but the Jacobian of the residuals is computed numerically at every step. A trust-region approach is used to ensure global convergence.
We consider the function \(f(x)\) we wish to optimize to have two terms, corresponding to the negative log likelihood ( \(\chi^2/2=\frac{1}{2}\|r(x)\|^2\), where \(r(x)\) is the vector of residuals at \(x\)) and the negative log prior \(q(x)=-\ln P(x)\):
\[ f(x) = \frac{1}{2}\|r(x)\|^2 + q(x) \]
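As a concrete illustration (an assumption for this example, not something the class requires), a Gaussian prior with mean \(\mu\) and covariance \(\Sigma\) satisfies the analytic-derivative assumption above:
\[ q(x) = \frac{1}{2}(x-\mu)^T \Sigma^{-1} (x-\mu) + \text{const};\quad\quad \nabla q = \Sigma^{-1}(x-\mu);\quad\quad \nabla^2 q = \Sigma^{-1} \]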
At each iteration \(k\), we expand \(f(x)\) in a Taylor series in \(s=x_{k+1}-x_{k}\):
\[ f(x) \approx m(s) = f(x_k) + g_k^T s + \frac{1}{2}s^T H_k s \]
where
\[ g_k \equiv \left.\frac{\partial f}{\partial x}\right|_{x_k} = J_k^T r_k + \nabla q_k;\quad\quad J_k \equiv \left.\frac{\partial r}{\partial x}\right|_{x_k} \]
\[ H_k = J_k^T J_k + \nabla^2 q_k + B_k \]
Here, \(B_k\) is the SR1 approximation term to the second derivative term:
\[ B_k \approx \sum_i \frac{\partial^2 r^{(i)}_k}{\partial x^2}r^{(i)}_k \]
which we initialize to zero and then update with the following formula:
\[ B_{k+1} = B_{k} + \frac{v v^T}{v^T s};\quad\quad v\equiv J^T_{k+1} r_{k+1} - J^T_k r_k \]
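The sketch below shows this update in isolation, using Eigen. It illustrates the formula above rather than reproducing the class's internal implementation, and the skip test on the denominator is a standard SR1 safeguard assumed here, not documented behavior.

#include <cmath>

#include <Eigen/Dense>

// B is the current SR1 correction term; s = x_{k+1} - x_k and
// v = J_{k+1}^T r_{k+1} - J_k^T r_k, as defined above.
void sr1Update(Eigen::MatrixXd & B, Eigen::VectorXd const & s, Eigen::VectorXd const & v) {
    double const denom = v.dot(s);
    // Skip the update when v^T s is tiny; the rank-1 term would blow up otherwise.
    if (std::abs(denom) > 1e-8 * v.norm() * s.norm()) {
        B += (v * v.transpose()) / denom;
    }
}

// The model Hessian used above: H_k = J^T J + (second derivative of the prior) + B_k.
Eigen::MatrixXd modelHessian(Eigen::MatrixXd const & J, Eigen::MatrixXd const & d2q,
                             Eigen::MatrixXd const & B) {
    return J.transpose() * J + d2q + B;
}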
Unlike the more common rank-2 BFGS update formula, the SR1 update is not guaranteed to produce a positive-definite Hessian. This can result in more accurate approximations of the Hessian (and hence more accurate covariance matrices), but it rules out line-search methods and the simple dog-leg approach to the trust-region problem. As a result, the optimizer should require fewer steps to converge but spends more time computing each step; this trade-off is ideal when the time spent in function evaluation is expected to dominate the time per step anyway.
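Concretely, each iteration chooses the step \(s\) by approximately minimizing the quadratic model within the current trust region, in the standard formulation (the specific subproblem solver is not described on this page):
\[ \min_{s}\; m(s) = f(x_k) + g_k^T s + \frac{1}{2} s^T H_k s \quad\quad \text{subject to} \quad \|s\| \le \Delta_k \]
where \(\Delta_k\) is the trust-region radius. The step is accepted or rejected, and \(\Delta_k\) increased or decreased, according to how well \(m(s)\) predicted the actual change in \(f\); these decisions are what the STATUS_STEP and STATUS_TR flags report.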
Definition at line 399 of file optimizer.h.

Member Typedef Documentation

typedef OptimizerControl lsst::meas::modelfit::Optimizer::Control
Definition at line 403 of file optimizer.h.

typedef OptimizerHistoryRecorder lsst::meas::modelfit::Optimizer::HistoryRecorder
Definition at line 404 of file optimizer.h.

typedef OptimizerObjective lsst::meas::modelfit::Optimizer::Objective
Definition at line 402 of file optimizer.h.

Member Enumeration Documentation

enum lsst::meas::modelfit::Optimizer::StateFlags
Definition at line 406 of file optimizer.h.
Constructor & Destructor Documentation

lsst::meas::modelfit::Optimizer::Optimizer (std::shared_ptr< Objective const > objective, ndarray::Array< Scalar const, 1, 1 > const &parameters, Control const &ctrl)

Member Function Documentation

Control const & lsst::meas::modelfit::Optimizer::getControl ( ) const [inline]
Definition at line 434 of file optimizer.h.
ndarray::Array< Scalar const, 1, 1 > lsst::meas::modelfit::Optimizer::getGradient ( ) const [inline]
Definition at line 456 of file optimizer.h.

ndarray::Array< Scalar const, 2, 2 > lsst::meas::modelfit::Optimizer::getHessian ( ) const [inline]
Definition at line 458 of file optimizer.h.

std::shared_ptr< Objective const > lsst::meas::modelfit::Optimizer::getObjective ( ) const [inline]
Definition at line 432 of file optimizer.h.

Scalar lsst::meas::modelfit::Optimizer::getObjectiveValue ( ) const [inline]
Definition at line 450 of file optimizer.h.

ndarray::Array< Scalar const, 1, 1 > lsst::meas::modelfit::Optimizer::getParameters ( ) const [inline]
Definition at line 452 of file optimizer.h.

ndarray::Array< Scalar const, 1, 1 > lsst::meas::modelfit::Optimizer::getResiduals ( ) const [inline]
Definition at line 454 of file optimizer.h.

int lsst::meas::modelfit::Optimizer::getState ( ) const [inline]
Definition at line 448 of file optimizer.h.
void lsst::meas::modelfit::Optimizer::removeSR1Term ( )
Remove the symmetric-rank-1 secant term from the Hessian, making it just (J^T J).
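A plausible use (illustrative only; the documentation does not prescribe a workflow) is to call this after fitting so that getHessian() returns the plain Gauss-Newton matrix:

#include "ndarray.h"
#include "lsst/meas/modelfit/optimizer.h"

// Illustrative helper: strip the SR1 correction, then read back the (J^T J) Hessian.
ndarray::Array<lsst::meas::modelfit::Scalar const, 2, 2>
gaussNewtonHessian(lsst::meas::modelfit::Optimizer & optimizer) {
    optimizer.removeSR1Term();
    return optimizer.getHessian();
}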
int lsst::meas::modelfit::Optimizer::run ( ) [inline]
Definition at line 442 of file optimizer.h.

int lsst::meas::modelfit::Optimizer::run (HistoryRecorder const &recorder, afw::table::BaseCatalog &history) [inline]
Definition at line 444 of file optimizer.h.

bool lsst::meas::modelfit::Optimizer::step ( ) [inline]
Definition at line 436 of file optimizer.h.

bool lsst::meas::modelfit::Optimizer::step (HistoryRecorder const &recorder, afw::table::BaseCatalog &history) [inline]
Definition at line 438 of file optimizer.h.
Friends And Related Symbol Documentation

friend class OptimizerHistoryRecorder [friend]
Definition at line 476 of file optimizer.h.