lsst::meas::modelfit::Optimizer Class Reference
A numerical optimizer customized for least-squares problems with Bayesian priors.
#include <optimizer.h>
Public Types

enum StateFlags {
    CONVERGED_GRADZERO = 0x0001,
    CONVERGED_TR_SMALL = 0x0002,
    CONVERGED = CONVERGED_GRADZERO | CONVERGED_TR_SMALL,
    FAILED_MAX_INNER_ITERATIONS = 0x0010,
    FAILED_MAX_OUTER_ITERATIONS = 0x0020,
    FAILED_MAX_ITERATIONS = 0x0030,
    FAILED_EXCEPTION = 0x0040,
    FAILED_NAN = 0x0080,
    FAILED = FAILED_MAX_INNER_ITERATIONS | FAILED_MAX_OUTER_ITERATIONS | FAILED_EXCEPTION | FAILED_NAN,
    STATUS_STEP_REJECTED = 0x0100,
    STATUS_STEP_ACCEPTED = 0x0200,
    STATUS_STEP = STATUS_STEP_REJECTED | STATUS_STEP_ACCEPTED,
    STATUS_TR_UNCHANGED = 0x1000,
    STATUS_TR_DECREASED = 0x2000,
    STATUS_TR_INCREASED = 0x4000,
    STATUS_TR = STATUS_TR_UNCHANGED | STATUS_TR_DECREASED | STATUS_TR_INCREASED,
    STATUS = STATUS_STEP | STATUS_TR
}

typedef OptimizerObjective Objective
typedef OptimizerControl Control
typedef OptimizerHistoryRecorder HistoryRecorder
Public Member Functions

Optimizer(std::shared_ptr<Objective const> objective, ndarray::Array<Scalar const, 1, 1> const &parameters, Control const &ctrl)
std::shared_ptr<Objective const> getObjective() const
Control const &getControl() const
bool step()
bool step(HistoryRecorder const &recorder, afw::table::BaseCatalog &history)
int run()
int run(HistoryRecorder const &recorder, afw::table::BaseCatalog &history)
int getState() const
Scalar getObjectiveValue() const
ndarray::Array<Scalar const, 1, 1> getParameters() const
ndarray::Array<Scalar const, 1, 1> getResiduals() const
ndarray::Array<Scalar const, 1, 1> getGradient() const
ndarray::Array<Scalar const, 2, 2> getHessian() const
void removeSR1Term()
    Remove the symmetric-rank-1 secant term from the Hessian, making it just \(J^T J\).

Friends

class OptimizerHistoryRecorder
A numerical optimizer customized for least-squares problems with Bayesian priors.
The algorithm used by Optimizer combines the Gauss-Newton approach, which approximates the second-derivative (Hessian) matrix by the inner product of the residual Jacobian with itself, with a matrix of corrections that accounts for large residuals and is updated using a symmetric rank-1 (SR1) secant formula. We assume the prior has analytic first and second derivatives, but use numerical derivatives to compute the Jacobian of the residuals at every step. A trust region approach is used to ensure global convergence.
We consider the function \(f(x)\) we wish to optimize to have two terms, corresponding to the negative log likelihood ( \(\chi^2/2=\frac{1}{2}\|r(x)\|^2\), where \(r(x)\) is the vector of residuals at \(x\)) and the negative log prior \(q(x)=-\ln P(x)\):
\[ f(x) = \frac{1}{2}\|r(x)\|^2 + q(x) \]
At each iteration \(k\), we expand \(f(x)\) in a Taylor series in \(s=x_{k+1}-x_{k}\):
\[ f(x) \approx m(s) = f(x_k) + g_k^T s + \frac{1}{2}s^T H_k s \]
where
\[ g_k \equiv \left.\frac{\partial f}{\partial x}\right|_{x_k} = J_k^T r_k + \nabla q_k;\quad\quad J_k \equiv \left.\frac{\partial r}{\partial x}\right|_{x_k} \]
\[ H_k = J_k^T J_k + \nabla^2 q_k + B_k \]
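For concreteness, a minimal standalone sketch (using Eigen; the function names and inputs are illustrative, not part of the Optimizer interface) of how these two quantities are assembled from the residual Jacobian \(J_k\), the residuals \(r_k\), the prior derivatives, and the SR1 correction \(B_k\) described just below:

#include <Eigen/Dense>

// g_k = J_k^T r_k + \nabla q_k   (gradient of the objective)
Eigen::VectorXd computeGradient(Eigen::MatrixXd const &J, Eigen::VectorXd const &r,
                                Eigen::VectorXd const &priorGradient) {
    return J.transpose() * r + priorGradient;
}

// H_k = J_k^T J_k + \nabla^2 q_k + B_k   (approximate Hessian of the objective)
Eigen::MatrixXd computeHessian(Eigen::MatrixXd const &J, Eigen::MatrixXd const &priorHessian,
                               Eigen::MatrixXd const &B) {
    return J.transpose() * J + priorHessian + B;
}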
Here, \(B_k\) is the SR1 approximation to the second-derivative term:
\[ B_k \approx \sum_i \frac{\partial^2 r^{(i)}_k}{\partial x^2}r^{(i)}_k \]
which we initialize to zero and then update with the following formula:
\[ B_{k+1} = B_{k} + \frac{v v^T}{v^T s};\quad\quad v\equiv J^T_{k+1} r_{k+1} - J^T_k r_k \]
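A minimal Eigen sketch of this update (standalone and illustrative only; the guard against a tiny denominator \(v^T s\) is an assumption of the sketch, a common safeguard in SR1 implementations rather than something this page specifies):

#include <Eigen/Dense>
#include <cmath>

// B_{k+1} = B_k + (v v^T) / (v^T s), with s = x_{k+1} - x_k and
// v = J_{k+1}^T r_{k+1} - J_k^T r_k.
void applySR1Update(Eigen::MatrixXd &B, Eigen::VectorXd const &s, Eigen::VectorXd const &v) {
    double denom = v.dot(s);
    // Skip the update when v^T s is too small to be numerically trustworthy.
    if (std::abs(denom) > 1e-8 * v.norm() * s.norm()) {
        B += (v * v.transpose()) / denom;
    }
}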
Unlike the more common rank-2 BFGS update formula, SR1 updates are not guaranteed to produce a positive definite Hessian. This can result in more accurate approximations of the Hessian (and hence more accurate covariance matrices), but it rules out line-search methods and the simple dog-leg approach to the trust region problem. As a result, we should require fewer steps to converge, but spend more time computing each step; this is ideal when we expect the time spent in function evaluation to dominate the time per step anyway.
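To make the trust-region control concrete, here is a generic textbook-style acceptance test; the thresholds and the radius-update policy below are assumptions for illustration, since the actual behaviour of Optimizer is governed by the OptimizerControl settings:

#include <Eigen/Dense>

// Compare the actual reduction in f against the reduction predicted by the
// quadratic model m(s), then accept/reject the step and adjust the radius.
bool acceptTrustRegionStep(double fOld, double fNew, Eigen::VectorXd const &s,
                           Eigen::VectorXd const &g, Eigen::MatrixXd const &H,
                           double &radius) {
    double predicted = -(g.dot(s) + 0.5 * s.dot(H * s));  // m(0) - m(s)
    double rho = (fOld - fNew) / predicted;
    if (rho < 0.25) {
        radius *= 0.5;                                     // poor model agreement: shrink
    } else if (rho > 0.75 && s.norm() > 0.99 * radius) {
        radius *= 2.0;                                     // good agreement at the edge: grow
    }
    return rho > 0.0;                                      // accept any real reduction
}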
Definition at line 399 of file optimizer.h.
lsst::meas::modelfit::Optimizer::Optimizer(std::shared_ptr<Objective const> objective, ndarray::Array<Scalar const, 1, 1> const &parameters, Control const &ctrl)

Definition at line 434 of file optimizer.h (inline).
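A hypothetical end-to-end usage sketch; the include path, the default-constructed OptimizerControl, and the way the objective and initial parameters are obtained are assumptions, while the Optimizer calls themselves are the ones listed above:

#include "lsst/meas/modelfit/optimizer.h"

namespace mf = lsst::meas::modelfit;

void fitModel(std::shared_ptr<mf::OptimizerObjective const> objective,
              ndarray::Array<mf::Scalar const, 1, 1> const &initialParameters) {
    mf::OptimizerControl ctrl;                       // assumed: default control settings
    mf::Optimizer optimizer(objective, initialParameters, ctrl);
    int nSteps = optimizer.run();                    // iterate until convergence or failure
    if (optimizer.getState() & mf::Optimizer::FAILED) {
        return;                                      // inspect the StateFlags bits for the cause
    }
    ndarray::Array<mf::Scalar const, 1, 1> best = optimizer.getParameters();
    mf::Scalar bestValue = optimizer.getObjectiveValue();
    (void)nSteps; (void)best; (void)bestValue;       // results would be used by the caller
}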
void lsst::meas::modelfit::Optimizer::removeSR1Term()

Remove the symmetric-rank-1 secant term from the Hessian, making it just \(J^T J\).

Definition at line 442 of file optimizer.h (inline).
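A short, hypothetical follow-up to the fitting sketch above: dropping the SR1 correction after a fit so that getHessian() returns only the Gauss-Newton term (whether that is the right choice for a given covariance estimate is an assumption, not something this page states):

#include "lsst/meas/modelfit/optimizer.h"

namespace mf = lsst::meas::modelfit;

// After a converged fit, reduce the stored Hessian to J^T J before reading it out.
ndarray::Array<mf::Scalar const, 2, 2> gaussNewtonHessian(mf::Optimizer &optimizer) {
    optimizer.removeSR1Term();
    return optimizer.getHessian();
}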