lsst.meas.algorithms.measureApCorr.MeasureApCorrTask Class Reference
Inheritance diagram for lsst.meas.algorithms.measureApCorr.MeasureApCorrTask:
lsst.pipe.base.task.Task

Public Member Functions

def __init__ (self, schema, **kwds)
 
def run (self, exposure, catalog)
 
def emptyMetadata (self)
 
def getSchemaCatalogs (self)
 
def getAllSchemaCatalogs (self)
 
def getFullMetadata (self)
 
def getFullName (self)
 
def getName (self)
 
def getTaskDict (self)
 
def makeSubtask (self, name, **keyArgs)
 
def timer (self, name, logLevel=Log.DEBUG)
 
def makeField (cls, doc)
 
def __reduce__ (self)
 

Public Attributes

 refFluxKeys
 
 toCorrect
 
 metadata
 
 log
 
 config
 

Static Public Attributes

 ConfigClass = MeasureApCorrConfig
 

Detailed Description

Task to measure aperture correction

Definition at line 110 of file measureApCorr.py.

Constructor & Destructor Documentation

◆ __init__()

def lsst.meas.algorithms.measureApCorr.MeasureApCorrTask.__init__ (   self,
  schema,
**  kwds 
)
Construct a MeasureApCorrTask

For every name in lsst.meas.base.getApCorrNameSet():
- If the corresponding flux fields exist in the schema:
    - Add a new field apcorr_{name}_used
    - Add an entry to the self.toCorrect dict
- Otherwise silently skip the name

Definition at line 116 of file measureApCorr.py.

116  def __init__(self, schema, **kwds):
117  """Construct a MeasureApCorrTask
118 
119  For every name in lsst.meas.base.getApCorrNameSet():
120  - If the corresponding flux fields exist in the schema:
121  - Add a new field apcorr_{name}_used
122  - Add an entry to the self.toCorrect dict
123  - Otherwise silently skip the name
124  """
125  Task.__init__(self, **kwds)
126  self.refFluxKeys = FluxKeys(self.config.refFluxName, schema)
127  self.toCorrect = {} # dict of flux field name prefix: FluxKeys instance
128  for name in sorted(getApCorrNameSet()):
129  try:
130  self.toCorrect[name] = FluxKeys(name, schema)
131  except KeyError:
132  # if a field in the registry is missing, just ignore it.
133  pass
134  self.makeSubtask("sourceSelector")
135 
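
A minimal construction sketch, not taken from the source: the field prefixes, the hand-built schema, and the ``refFluxName`` override below are illustrative assumptions. In a real pipeline the schema is produced by the measurement framework and already contains the ``_instFlux``, ``_instFluxErr`` and ``_flag`` fields for every registered algorithm.

.. code-block:: python

    import lsst.afw.table as afwTable
    from lsst.meas.algorithms.measureApCorr import MeasureApCorrTask

    # Toy schema containing the fields FluxKeys looks up (names are illustrative).
    schema = afwTable.SourceTable.makeMinimalSchema()
    for prefix in ("base_PsfFlux", "base_CircularApertureFlux_12_0"):
        schema.addField(prefix + "_instFlux", type="D", doc="instrumental flux")
        schema.addField(prefix + "_instFluxErr", type="D", doc="instrumental flux error")
        schema.addField(prefix + "_flag", type="Flag", doc="measurement failed")

    config = MeasureApCorrTask.ConfigClass()
    config.refFluxName = "base_CircularApertureFlux_12_0"  # reference flux fields must exist in the schema
    # name= may be omitted if the task defines _DefaultName
    task = MeasureApCorrTask(schema=schema, config=config, name="measureApCorr")
    # Names from getApCorrNameSet() whose flux fields exist in the schema are now
    # registered in task.toCorrect; all other names were silently skipped.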

Member Function Documentation

◆ __reduce__()

def lsst.pipe.base.task.Task.__reduce__ (   self)
inherited
Pickler.

Reimplemented in lsst.pipe.drivers.multiBandDriver.MultiBandDriverTask, and lsst.pipe.drivers.coaddDriver.CoaddDriverTask.

Definition at line 432 of file task.py.

432  def __reduce__(self):
433  """Pickler.
434  """
435  return self._unpickle_via_factory, (self.__class__, [], self._reduce_kwargs())

◆ emptyMetadata()

def lsst.pipe.base.task.Task.emptyMetadata (   self)
inherited
Empty (clear) the metadata for this Task and all sub-Tasks.

Definition at line 166 of file task.py.

166  def emptyMetadata(self):
167  """Empty (clear) the metadata for this Task and all sub-Tasks.
168  """
169  for subtask in self._taskDict.values():
170  subtask.metadata = dafBase.PropertyList()
171 

◆ getAllSchemaCatalogs()

def lsst.pipe.base.task.Task.getAllSchemaCatalogs (   self)
inherited
Get schema catalogs for all tasks in the hierarchy, combining the
results into a single dict.

Returns
-------
schemacatalogs : `dict`
    Keys are butler dataset type, values are an empty catalog (an
    instance of the appropriate `lsst.afw.table` Catalog type) for all
    tasks in the hierarchy, from the top-level task down
    through all subtasks.

Notes
-----
This method may be called on any task in the hierarchy; it will return
the same answer, regardless.

The default implementation should always suffice. If your subtask uses
schemas, override `Task.getSchemaCatalogs`, not this method.

Definition at line 204 of file task.py.

204  def getAllSchemaCatalogs(self):
205  """Get schema catalogs for all tasks in the hierarchy, combining the
206  results into a single dict.
207 
208  Returns
209  -------
210  schemacatalogs : `dict`
211  Keys are butler dataset type, values are an empty catalog (an
212  instance of the appropriate `lsst.afw.table` Catalog type) for all
213  tasks in the hierarchy, from the top-level task down
214  through all subtasks.
215 
216  Notes
217  -----
218  This method may be called on any task in the hierarchy; it will return
219  the same answer, regardless.
220 
221  The default implementation should always suffice. If your subtask uses
222  schemas, override `Task.getSchemaCatalogs`, not this method.
223  """
224  schemaDict = self.getSchemaCatalogs()
225  for subtask in self._taskDict.values():
226  schemaDict.update(subtask.getSchemaCatalogs())
227  return schemaDict
228 

◆ getFullMetadata()

def lsst.pipe.base.task.Task.getFullMetadata (   self)
inherited
Get metadata for all tasks.

Returns
-------
metadata : `lsst.daf.base.PropertySet`
    The `~lsst.daf.base.PropertySet` keys are the full task name.
    Values are metadata for the top-level task and all subtasks,
    sub-subtasks, etc.

Notes
-----
The returned metadata includes timing information (if
``@timer.timeMethod`` is used) and any metadata set by the task. The
name of each item consists of the full task name with ``.`` replaced
by ``:``, followed by ``.`` and the name of the item, e.g.::

    topLevelTaskName:subtaskName:subsubtaskName.itemName

using ``:`` in the full task name disambiguates the rare situation
that a task has a subtask and a metadata item with the same name.

Definition at line 229 of file task.py.

229  def getFullMetadata(self):
230  """Get metadata for all tasks.
231 
232  Returns
233  -------
234  metadata : `lsst.daf.base.PropertySet`
235  The `~lsst.daf.base.PropertySet` keys are the full task name.
236  Values are metadata for the top-level task and all subtasks,
237  sub-subtasks, etc.
238 
239  Notes
240  -----
241  The returned metadata includes timing information (if
242  ``@timer.timeMethod`` is used) and any metadata set by the task. The
243  name of each item consists of the full task name with ``.`` replaced
244  by ``:``, followed by ``.`` and the name of the item, e.g.::
245 
246  topLevelTaskName:subtaskName:subsubtaskName.itemName
247 
248  using ``:`` in the full task name disambiguates the rare situation
249  that a task has a subtask and a metadata item with the same name.
250  """
251  fullMetadata = dafBase.PropertySet()
252  for fullName, task in self.getTaskDict().items():
253  fullMetadata.set(fullName.replace(".", ":"), task.metadata)
254  return fullMetadata
255 
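
A short illustration of the key layout; ``someTopLevelTask`` and the task names in the comments are hypothetical.

.. code-block:: python

    fullMetadata = someTopLevelTask.getFullMetadata()
    # Top-level keys are full task names with "." replaced by ":", for example
    # "calibrate" and "calibrate:measureApCorr"; metadata items hang off those
    # keys, e.g. "calibrate:measureApCorr.runStartCpuTime" if timing was recorded.
    print(fullMetadata.names())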

◆ getFullName()

def lsst.pipe.base.task.Task.getFullName (   self)
inherited
Get the task name as a hierarchical name including parent task
names.

Returns
-------
fullName : `str`
    The full name consists of the name of the parent task and each
    subtask separated by periods. For example:

    - The full name of top-level task "top" is simply "top".
    - The full name of subtask "sub" of top-level task "top" is
      "top.sub".
    - The full name of subtask "sub2" of subtask "sub" of top-level
      task "top" is "top.sub.sub2".

Definition at line 256 of file task.py.

256  def getFullName(self):
257  """Get the task name as a hierarchical name including parent task
258  names.
259 
260  Returns
261  -------
262  fullName : `str`
263  The full name consists of the name of the parent task and each
264  subtask separated by periods. For example:
265 
266  - The full name of top-level task "top" is simply "top".
267  - The full name of subtask "sub" of top-level task "top" is
268  "top.sub".
269  - The full name of subtask "sub2" of subtask "sub" of top-level
270  task "top" is "top.sub.sub2".
271  """
272  return self._fullName
273 

◆ getName()

def lsst.pipe.base.task.Task.getName (   self)
inherited
Get the name of the task.

Returns
-------
taskName : `str`
    Name of the task.

See also
--------
getFullName

Definition at line 274 of file task.py.

274  def getName(self):
275  """Get the name of the task.
276 
277  Returns
278  -------
279  taskName : `str`
280  Name of the task.
281 
282  See also
283  --------
284  getFullName
285  """
286  return self._name
287 

◆ getSchemaCatalogs()

def lsst.pipe.base.task.Task.getSchemaCatalogs (   self)
inherited
Get the schemas generated by this task.

Returns
-------
schemaCatalogs : `dict`
    Keys are butler dataset type, values are an empty catalog (an
    instance of the appropriate `lsst.afw.table` Catalog type) for
    this task.

Notes
-----

.. warning::

   Subclasses that use schemas must override this method. The default
   implementation returns an empty dict.

This method may be called at any time after the Task is constructed,
which means that all task schemas should be computed at construction
time, *not* when data is actually processed. This reflects the
philosophy that the schema should not depend on the data.

Returning catalogs rather than just schemas allows us to save e.g.
slots for SourceCatalog as well.

See also
--------
Task.getAllSchemaCatalogs

Definition at line 172 of file task.py.

172  def getSchemaCatalogs(self):
173  """Get the schemas generated by this task.
174 
175  Returns
176  -------
177  schemaCatalogs : `dict`
178  Keys are butler dataset type, values are an empty catalog (an
179  instance of the appropriate `lsst.afw.table` Catalog type) for
180  this task.
181 
182  Notes
183  -----
184 
185  .. warning::
186 
187  Subclasses that use schemas must override this method. The default
188  implementation returns an empty dict.
189 
190  This method may be called at any time after the Task is constructed,
191  which means that all task schemas should be computed at construction
192  time, *not* when data is actually processed. This reflects the
193  philosophy that the schema should not depend on the data.
194 
195  Returning catalogs rather than just schemas allows us to save e.g.
196  slots for SourceCatalog as well.
197 
198  See also
199  --------
200  Task.getAllSchemaCatalogs
201  """
202  return {}
203 
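
A hedged sketch of the override pattern described in the warning above; ``ExampleConfig``, ``ExampleSchemaTask`` and the ``example_src`` dataset type are invented for illustration.

.. code-block:: python

    import lsst.afw.table as afwTable
    import lsst.pex.config as pexConfig
    from lsst.pipe.base import Task


    class ExampleConfig(pexConfig.Config):
        """Empty config for the illustrative task below."""
        pass


    class ExampleSchemaTask(Task):
        """Hypothetical task that writes a source-like catalog."""
        ConfigClass = ExampleConfig
        _DefaultName = "exampleSchema"

        def __init__(self, **kwds):
            Task.__init__(self, **kwds)
            # Build the schema at construction time, not when data is processed.
            self.schema = afwTable.SourceTable.makeMinimalSchema()

        def getSchemaCatalogs(self):
            # Return an empty catalog rather than a bare schema so that slot
            # definitions and aliases are preserved alongside it.
            return {"example_src": afwTable.SourceCatalog(self.schema)}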

◆ getTaskDict()

def lsst.pipe.base.task.Task.getTaskDict (   self)
inherited
Get a dictionary of all tasks as a shallow copy.

Returns
-------
taskDict : `dict`
    Dictionary containing full task name: task object for the top-level
    task and all subtasks, sub-subtasks, etc.

Definition at line 288 of file task.py.

288  def getTaskDict(self):
289  """Get a dictionary of all tasks as a shallow copy.
290 
291  Returns
292  -------
293  taskDict : `dict`
294  Dictionary containing full task name: task object for the top-level
295  task and all subtasks, sub-subtasks, etc.
296  """
297  return self._taskDict.copy()
298 

◆ makeField()

def lsst.pipe.base.task.Task.makeField (   cls,
  doc 
)
inherited
Make a `lsst.pex.config.ConfigurableField` for this task.

Parameters
----------
doc : `str`
    Help text for the field.

Returns
-------
configurableField : `lsst.pex.config.ConfigurableField`
    A `~ConfigurableField` for this task.

Examples
--------
Provides a convenient way to specify this task is a subtask of another
task.

Here is an example of use:

.. code-block:: python

    class OtherTaskConfig(lsst.pex.config.Config):
        aSubtask = ATaskClass.makeField("brief description of task")

Definition at line 359 of file task.py.

359  def makeField(cls, doc):
360  """Make a `lsst.pex.config.ConfigurableField` for this task.
361 
362  Parameters
363  ----------
364  doc : `str`
365  Help text for the field.
366 
367  Returns
368  -------
369  configurableField : `lsst.pex.config.ConfigurableField`
370  A `~ConfigurableField` for this task.
371 
372  Examples
373  --------
374  Provides a convenient way to specify this task is a subtask of another
375  task.
376 
377  Here is an example of use:
378 
379  .. code-block:: python
380 
381  class OtherTaskConfig(lsst.pex.config.Config):
382  aSubtask = ATaskClass.makeField("brief description of task")
383  """
384  return ConfigurableField(doc=doc, target=cls)
385 

◆ makeSubtask()

def lsst.pipe.base.task.Task.makeSubtask (   self,
  name,
**  keyArgs 
)
inherited
Create a subtask as a new instance and assign it as the ``name``
attribute of this task.

Parameters
----------
name : `str`
    Brief name of the subtask.
keyArgs
    Extra keyword arguments used to construct the task. The following
    arguments are automatically provided and cannot be overridden:

    - "config".
    - "parentTask".

Notes
-----
The subtask must be defined by ``Task.config.name``, an instance of
`~lsst.pex.config.ConfigurableField` or
`~lsst.pex.config.RegistryField`.

Definition at line 299 of file task.py.

299  def makeSubtask(self, name, **keyArgs):
300  """Create a subtask as a new instance and assign it as the ``name``
301  attribute of this task.
302 
303  Parameters
304  ----------
305  name : `str`
306  Brief name of the subtask.
307  keyArgs
308  Extra keyword arguments used to construct the task. The following
309  arguments are automatically provided and cannot be overridden:
310 
311  - "config".
312  - "parentTask".
313 
314  Notes
315  -----
316  The subtask must be defined by ``Task.config.name``, an instance of
317  `~lsst.pex.config.ConfigurableField` or
318  `~lsst.pex.config.RegistryField`.
319  """
320  taskField = getattr(self.config, name, None)
321  if taskField is None:
322  raise KeyError(f"{self.getFullName()}'s config does not have field {name!r}")
323  subtask = taskField.apply(name=name, parentTask=self, **keyArgs)
324  setattr(self, name, subtask)
325 
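
A hedged sketch of the usual pattern, using MeasureApCorrTask as the target subtask; ``ParentConfig`` and ``ParentTask`` are invented for illustration, and the config field name must match the name passed to ``makeSubtask``.

.. code-block:: python

    import lsst.pex.config as pexConfig
    from lsst.pipe.base import Task
    from lsst.meas.algorithms.measureApCorr import MeasureApCorrTask


    class ParentConfig(pexConfig.Config):
        measureApCorr = MeasureApCorrTask.makeField("subtask that measures aperture corrections")


    class ParentTask(Task):
        """Hypothetical parent task, for illustration only."""
        ConfigClass = ParentConfig
        _DefaultName = "parent"

        def __init__(self, schema, **kwds):
            Task.__init__(self, **kwds)
            # "config" and "parentTask" are supplied automatically; the extra
            # keyword argument (schema) is forwarded to MeasureApCorrTask.__init__.
            self.makeSubtask("measureApCorr", schema=schema)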

◆ run()

def lsst.meas.algorithms.measureApCorr.MeasureApCorrTask.run (   self,
  exposure,
  catalog 
)
Measure aperture correction

Parameters
----------
exposure : `lsst.afw.image.Exposure`
    Exposure aperture corrections are being measured on. The
    bounding box is retrieved from it, and it is passed to the
    sourceSelector. The output aperture correction map is *not*
    added to the exposure; this is left to the caller.
catalog : `lsst.afw.table.SourceCatalog`
    SourceCatalog containing measurements to be used to
    compute aperture corrections.

Returns
-------
Struct : `lsst.pipe.base.Struct`
    Contains the following:

    ``apCorrMap``
        aperture correction map (`lsst.afw.image.ApCorrMap`)
        that contains two entries for each flux field:
        - flux field (e.g. base_PsfFlux_instFlux): 2d model
        - flux error field (e.g. base_PsfFlux_instFluxErr): 2d model of error

Definition at line 136 of file measureApCorr.py.

136  def run(self, exposure, catalog):
137  """Measure aperture correction
138 
139  Parameters
140  ----------
141  exposure : `lsst.afw.image.Exposure`
142  Exposure aperture corrections are being measured on. The
143  bounding box is retrieved from it, and it is passed to the
144  sourceSelector. The output aperture correction map is *not*
145  added to the exposure; this is left to the caller.
146  catalog : `lsst.afw.table.SourceCatalog`
147  SourceCatalog containing measurements to be used to
148  compute aperture corrections.
149 
150  Returns
151  -------
152  Struct : `lsst.pipe.base.Struct`
153  Contains the following:
154 
155  ``apCorrMap``
156  aperture correction map (`lsst.afw.image.ApCorrMap`)
157  that contains two entries for each flux field:
158  - flux field (e.g. base_PsfFlux_instFlux): 2d model
159  - flux error field (e.g. base_PsfFlux_instFluxErr): 2d model of error
160  """
161  bbox = exposure.getBBox()
162  import lsstDebug
163  display = lsstDebug.Info(__name__).display
164  doPause = lsstDebug.Info(__name__).doPause
165 
166  self.log.info("Measuring aperture corrections for %d flux fields" % (len(self.toCorrect),))
167  # First, create a subset of the catalog that contains only selected stars
168  # with non-flagged reference fluxes.
169  subset1 = [record for record in self.sourceSelector.run(catalog, exposure=exposure).sourceCat
170  if (not record.get(self.refFluxKeys.flag)
171  and numpy.isfinite(record.get(self.refFluxKeys.flux)))]
172 
173  apCorrMap = ApCorrMap()
174 
175  # Outer loop over the fields we want to correct
176  for name, keys in self.toCorrect.items():
177  fluxName = name + "_instFlux"
178  fluxErrName = name + "_instFluxErr"
179 
180  # Create a more restricted subset with only the objects whose to-be-corrected flux
181  # is not flagged.
182  fluxes = numpy.fromiter((record.get(keys.flux) for record in subset1), float)
183  with numpy.errstate(invalid="ignore"): # suppress NAN warnings
184  isGood = numpy.logical_and.reduce([
185  numpy.fromiter((not record.get(keys.flag) for record in subset1), bool),
186  numpy.isfinite(fluxes),
187  fluxes > 0.0,
188  ])
189  subset2 = [record for record, good in zip(subset1, isGood) if good]
190 
191  # Check that we have enough data points to provide at least the minimum number of degrees
192  # of freedom specified in the config.
193  if len(subset2) - 1 < self.config.minDegreesOfFreedom:
194  if name in self.config.allowFailure:
195  self.log.warn("Unable to measure aperture correction for '%s': "
196  "only %d sources, but require at least %d." %
197  (name, len(subset2), self.config.minDegreesOfFreedom+1))
198  continue
199  raise RuntimeError("Unable to measure aperture correction for required algorithm '%s': "
200  "only %d sources, but require at least %d." %
201  (name, len(subset2), self.config.minDegreesOfFreedom+1))
202 
203  # If we don't have enough data points to constrain the fit, reduce the order until we do
204  ctrl = self.config.fitConfig.makeControl()
205  while len(subset2) - ctrl.computeSize() < self.config.minDegreesOfFreedom:
206  if ctrl.orderX > 0:
207  ctrl.orderX -= 1
208  if ctrl.orderY > 0:
209  ctrl.orderY -= 1
210 
211  # Fill numpy arrays with positions and the ratio of the reference flux to the to-correct flux
212  x = numpy.zeros(len(subset2), dtype=float)
213  y = numpy.zeros(len(subset2), dtype=float)
214  apCorrData = numpy.zeros(len(subset2), dtype=float)
215  indices = numpy.arange(len(subset2), dtype=int)
216  for n, record in enumerate(subset2):
217  x[n] = record.getX()
218  y[n] = record.getY()
219  apCorrData[n] = record.get(self.refFluxKeys.flux)/record.get(keys.flux)
220 
221  for _i in range(self.config.numIter):
222 
223  # Do the fit, save it in the output map
224  apCorrField = ChebyshevBoundedField.fit(bbox, x, y, apCorrData, ctrl)
225 
226  if display:
227  plotApCorr(bbox, x, y, apCorrData, apCorrField, "%s, iteration %d" % (name, _i), doPause)
228 
229  # Compute the error empirically, as the RMS difference between the fitted aperture correction
230  # and the measured aperture-correction ratios.
231  apCorrDiffs = apCorrField.evaluate(x, y)
232  apCorrDiffs -= apCorrData
233  apCorrErr = numpy.mean(apCorrDiffs**2)**0.5
234 
235  # Clip bad data points
236  apCorrDiffLim = self.config.numSigmaClip * apCorrErr
237  with numpy.errstate(invalid="ignore"): # suppress NAN warning
238  keep = numpy.fabs(apCorrDiffs) <= apCorrDiffLim
239  x = x[keep]
240  y = y[keep]
241  apCorrData = apCorrData[keep]
242  indices = indices[keep]
243 
244  # Final fit after clipping
245  apCorrField = ChebyshevBoundedField.fit(bbox, x, y, apCorrData, ctrl)
246 
247  self.log.info("Aperture correction for %s: RMS %f from %d" %
248  (name, numpy.mean((apCorrField.evaluate(x, y) - apCorrData)**2)**0.5, len(indices)))
249 
250  if display:
251  plotApCorr(bbox, x, y, apCorrData, apCorrField, "%s, final" % (name,), doPause)
252 
253  # Save the result in the output map
254  # The error is constant spatially (we could imagine being
255  # more clever, but we're not yet sure if it's worth the effort).
256  # We save the errors as a 0th-order ChebyshevBoundedField
257  apCorrMap[fluxName] = apCorrField
258  apCorrErrCoefficients = numpy.array([[apCorrErr]], dtype=float)
259  apCorrMap[fluxErrName] = ChebyshevBoundedField(bbox, apCorrErrCoefficients)
260 
261  # Record which sources were used
262  for i in indices:
263  subset2[i].set(keys.used, True)
264 
265  return Struct(
266  apCorrMap=apCorrMap,
267  )
268 
269 
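
A hedged usage sketch, not taken from the source: ``task``, ``exposure`` and ``catalog`` are assumed to be an already-constructed MeasureApCorrTask, a calibrated `lsst.afw.image.Exposure`, and a `lsst.afw.table.SourceCatalog` built with the schema passed to the task constructor; ``base_PsfFlux`` is just an example of a corrected algorithm.

.. code-block:: python

    result = task.run(exposure=exposure, catalog=catalog)
    apCorrMap = result.apCorrMap

    # run() deliberately does not attach the map; that is left to the caller, e.g.:
    exposure.getInfo().setApCorrMap(apCorrMap)

    # Each corrected algorithm contributes two BoundedField entries:
    fluxModel = apCorrMap["base_PsfFlux_instFlux"]      # 2-d correction model
    errModel = apCorrMap["base_PsfFlux_instFluxErr"]    # spatially constant error model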

◆ timer()

def lsst.pipe.base.task.Task.timer (   self,
  name,
  logLevel = Log.DEBUG 
)
inherited
Context manager to log performance data for an arbitrary block of
code.

Parameters
----------
name : `str`
    Name of the code block being timed; timing data will be logged using the
    item name prefixes ``<name>Start`` and ``<name>End``.
logLevel
    A `lsst.log` level constant.

Examples
--------
Creating a timer context:

.. code-block:: python

    with self.timer("someCodeToTime"):
        pass  # code to time

See also
--------
timer.logInfo

Definition at line 327 of file task.py.

327  def timer(self, name, logLevel=Log.DEBUG):
328  """Context manager to log performance data for an arbitrary block of
329  code.
330 
331  Parameters
332  ----------
333  name : `str`
334  Name of the code block being timed; timing data will be logged using the
335  item name prefixes ``<name>Start`` and ``<name>End``.
336  logLevel
337  A `lsst.log` level constant.
338 
339  Examples
340  --------
341  Creating a timer context:
342 
343  .. code-block:: python
344 
345  with self.timer("someCodeToTime"):
346  pass # code to time
347 
348  See also
349  --------
350  timer.logInfo
351  """
352  logInfo(obj=self, prefix=name + "Start", logLevel=logLevel)
353  try:
354  yield
355  finally:
356  logInfo(obj=self, prefix=name + "End", logLevel=logLevel)
357 

Member Data Documentation

◆ config

lsst.pipe.base.task.Task.config
inherited

Definition at line 162 of file task.py.

◆ ConfigClass

lsst.meas.algorithms.measureApCorr.MeasureApCorrTask.ConfigClass = MeasureApCorrConfig
static

Definition at line 113 of file measureApCorr.py.

◆ log

lsst.pipe.base.task.Task.log
inherited

Definition at line 161 of file task.py.

◆ metadata

lsst.pipe.base.task.Task.metadata
inherited

Definition at line 134 of file task.py.

◆ refFluxKeys

lsst.meas.algorithms.measureApCorr.MeasureApCorrTask.refFluxKeys

Definition at line 126 of file measureApCorr.py.

◆ toCorrect

lsst.meas.algorithms.measureApCorr.MeasureApCorrTask.toCorrect

Definition at line 127 of file measureApCorr.py.


The documentation for this class was generated from the following file: