LSSTApplications w.2020.01
LSSTDataManagementBasePackage
lsst.pipe.drivers.constructCalibs.FringeTask Class Reference
Inheritance diagram for lsst.pipe.drivers.constructCalibs.FringeTask (most derived to base):
lsst.pipe.drivers.constructCalibs.FringeTask → lsst.pipe.drivers.constructCalibs.CalibTask → lsst.ctrl.pool.parallel.BatchPoolTask → lsst.ctrl.pool.parallel.BatchCmdLineTask → lsst.pipe.base.cmdLineTask.CmdLineTask → lsst.pipe.base.task.Task

Public Member Functions

def applyOverrides (cls, config)
 
def __init__ (self, *args, **kwargs)
 
def processSingle (self, sensorRef)
 
def batchWallTime (cls, time, parsedCmd, numCores)
 
def runDataRef (self, expRefList, butler, calibId)
 Construct a calib from a list of exposure references. More...
 
def getOutputId (self, expRefList, calibId)
 Generate the data identifier for the output calib. More...
 
def getMjd (self, butler, dataId, timescale=dafBase.DateTime.UTC)
 
def getFilter (self, butler, dataId)
 
def addMissingKeys (self, dataId, butler, missingKeys=None, calibName=None)
 
def updateMetadata (self, calibImage, exposureTime, darkTime=None, **kwargs)
 Update the metadata from the VisitInfo. More...
 
def scatterProcess (self, pool, ccdIdLists)
 Scatter the processing among the nodes. More...
 
def process (self, cache, ccdId, outputName="postISRCCD", **kwargs)
 Process a CCD, specified by a data identifier. More...
 
def processWrite (self, dataRef, exposure, outputName="postISRCCD")
 Write the processed CCD. More...
 
def processResult (self, exposure)
 
def scale (self, ccdIdLists, data)
 Determine scaling across CCDs and exposures. More...
 
def scatterCombine (self, pool, outputId, ccdIdLists, scales)
 Scatter the combination of exposures across multiple nodes. More...
 
def getFullyQualifiedOutputId (self, ccdName, butler, outputId)
 
def combine (self, cache, struct, outputId)
 Combine multiple exposures of a particular CCD and write the output. More...
 
def calculateOutputHeaderFromRaws (self, butler, calib, dataIdList, outputId)
 Calculate the output header from the raw headers. More...
 
def recordCalibInputs (self, butler, calib, dataIdList, outputId)
 Record metadata including the inputs and creation details. More...
 
def interpolateNans (self, image)
 
def write (self, butler, exposure, dataId)
 Write the final combined calib. More...
 
def makeCameraImage (self, camera, dataId, calibs)
 Create and write an image of the entire camera. More...
 
def checkCcdIdLists (self, ccdIdLists)
 
def parseAndRun (cls, *args, **kwargs)
 
def parseAndRun (cls, args=None, config=None, log=None, doReturnResults=False)
 
def parseAndSubmit (cls, args=None, **kwargs)
 
def batchCommand (cls, args)
 Return command to run CmdLineTask. More...
 
def logOperation (self, operation, catch=False, trace=True)
 Provide a context manager for logging an operation. More...
 
def writeConfig (self, butler, clobber=False, doBackup=True)
 
def writeSchemas (self, butler, clobber=False, doBackup=True)
 
def writeMetadata (self, dataRef)
 
def writePackageVersions (self, butler, clobber=False, doBackup=True, dataset="packages")
 
def emptyMetadata (self)
 
def getSchemaCatalogs (self)
 
def getAllSchemaCatalogs (self)
 
def getFullMetadata (self)
 
def getFullName (self)
 
def getName (self)
 
def getTaskDict (self)
 
def makeSubtask (self, name, **keyArgs)
 
def timer (self, name, logLevel=Log.DEBUG)
 
def makeField (cls, doc)
 
def __reduce__ (self)
 

Public Attributes

 metadata
 
 log
 
 config
 

Static Public Attributes

 ConfigClass = FringeConfig
 
string calibName = "fringe"
 
 RunnerClass = CalibTaskRunner
 
 filterName = None
 
float exposureTime = 1.0
 
bool canMultiprocess = True
 

Detailed Description

Fringe construction task

The principal change from the base class is that the images are
background-subtracted and rescaled by the background.

XXX This is probably not right for a straight-up combination, as we
are currently doing, since the fringe amplitudes need not scale with
the continuum.

XXX Would like to have this do PCA and generate multiple images, but
that will take a bit of work with the persistence code.

Definition at line 1123 of file constructCalibs.py.
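The normalization described above could be sketched in plain NumPy. This is a hypothetical illustration only: the formula (image - background) / background is an assumption based on the docstring's wording, and the real task delegates to its configured subtractBackground and stats subtasks rather than doing this arithmetic directly.

```python
import numpy as np

def normalize_fringe(image, background):
    """Background-subtract an image and rescale it by the background.

    Hypothetical sketch of the normalization described in the class
    docstring: remove the smooth background, then express the fringe
    pattern as a fraction of that background.
    """
    return (image - background) / background

# A flat image with a small additive fringe on a background of 100 counts
background = np.full((4, 4), 100.0)
image = background + 5.0  # fringe amplitude of 5 counts
fringe = normalize_fringe(image, background)
print(fringe[0, 0])  # 0.05: fringe amplitude as a fraction of background
```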

Constructor & Destructor Documentation

◆ __init__()

def lsst.pipe.drivers.constructCalibs.FringeTask.__init__ (   self,
  *args,
  **kwargs 
)

Definition at line 1145 of file constructCalibs.py.

1145  def __init__(self, *args, **kwargs):
1146  CalibTask.__init__(self, *args, **kwargs)
1147  self.makeSubtask("detection")
1148  self.makeSubtask("stats")
1149  self.makeSubtask("subtractBackground")
1150 

Member Function Documentation

◆ __reduce__()

def lsst.pipe.base.task.Task.__reduce__ (   self)
inherited
Pickler.

Definition at line 373 of file task.py.

373  def __reduce__(self):
374  """Pickler.
375  """
376  return self.__class__, (self.config, self._name, self._parentTask, None)
377 
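The same pattern can be seen in a minimal standalone class (a simplified sketch, not the LSST Task class itself): `__reduce__` returns the class together with the constructor arguments needed to rebuild the object, which is what lets tasks travel through pickle to worker processes.

```python
import pickle

class MiniTask:
    """Minimal sketch of the Task.__reduce__ pattern."""

    def __init__(self, config, name):
        self.config = config
        self.name = name

    def __reduce__(self):
        # Rebuild via the constructor with the same arguments.
        return self.__class__, (self.config, self.name)

task = MiniTask({"binning": 8}, "fringe")
clone = pickle.loads(pickle.dumps(task))
print(clone.name, clone.config)  # fringe {'binning': 8}
```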

◆ addMissingKeys()

def lsst.pipe.drivers.constructCalibs.CalibTask.addMissingKeys (   self,
  dataId,
  butler,
  missingKeys = None,
  calibName = None 
)
inherited

Definition at line 549 of file constructCalibs.py.

549  def addMissingKeys(self, dataId, butler, missingKeys=None, calibName=None):
550  if calibName is None:
551  calibName = self.calibName
552 
553  if missingKeys is None:
554  missingKeys = set(butler.getKeys(calibName).keys()) - set(dataId.keys())
555 
556  for k in missingKeys:
557  try:
558  v = butler.queryMetadata('raw', [k], dataId) # n.b. --id refers to 'raw'
559  except Exception:
560  continue
561 
562  if len(v) == 0: # failed to lookup value
563  continue
564 
565  if len(v) == 1:
566  dataId[k] = v[0]
567  else:
568  raise RuntimeError("No unique lookup for %s: %s" % (k, v))
569 
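The key-filling logic can be sketched without a butler. Here a hypothetical `lookup` callable stands in for `butler.queryMetadata`: no candidates means the key stays missing, one candidate is adopted, and several candidates are an error.

```python
def add_missing_keys(data_id, all_keys, lookup):
    """Fill keys missing from data_id using a metadata lookup.

    lookup(key, data_id) returns a list of candidate values, mimicking
    butler.queryMetadata: zero candidates are skipped, one is adopted,
    and more than one raises because the lookup is ambiguous.
    """
    missing = set(all_keys) - set(data_id)
    for k in missing:
        values = lookup(k, data_id)
        if len(values) == 0:
            continue
        if len(values) == 1:
            data_id[k] = values[0]
        else:
            raise RuntimeError("No unique lookup for %s: %s" % (k, values))
    return data_id

metadata = {"filter": ["g"], "ccd": []}  # hypothetical per-key candidates
data_id = add_missing_keys({"visit": 903334}, {"visit", "filter", "ccd"},
                           lambda k, d: metadata.get(k, []))
print(data_id)  # {'visit': 903334, 'filter': 'g'}; 'ccd' had no candidates
```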

◆ applyOverrides()

def lsst.pipe.drivers.constructCalibs.FringeTask.applyOverrides (   cls,
  config 
)
Overrides for fringe construction

Definition at line 1141 of file constructCalibs.py.

1141  def applyOverrides(cls, config):
1142  """Overrides for fringe construction"""
1143  config.isr.doFringe = False
1144 

◆ batchCommand()

def lsst.ctrl.pool.parallel.BatchCmdLineTask.batchCommand (   cls,
  args 
)
inherited

Return command to run CmdLineTask.

Parameters
cls: Class
args: Parsed batch job arguments (from BatchArgumentParser)

Definition at line 471 of file parallel.py.

471  def batchCommand(cls, args):
472  """!Return command to run CmdLineTask
473 
474  @param cls: Class
475  @param args: Parsed batch job arguments (from BatchArgumentParser)
476  """
477  job = args.job if args.job is not None else "job"
478  module = cls.__module__
479  script = ("import os; os.umask(%#05o); " +
480  "import lsst.base; lsst.base.disableImplicitThreading(); " +
481  "import lsst.ctrl.pool.log; lsst.ctrl.pool.log.jobLog(\"%s\"); ") % (UMASK, job)
482 
483  if args.batchStats:
484  script += ("import lsst.ctrl.pool.parallel; import atexit; " +
485  "atexit.register(lsst.ctrl.pool.parallel.printProcessStats); ")
486 
487  script += "import %s; %s.%s.parseAndRun();" % (module, module, cls.__name__)
488 
489  profilePre = "import cProfile; import os; cProfile.run(\"\"\""
490  profilePost = "\"\"\", filename=\"profile-" + job + "-%s-%d.dat\" % (os.uname()[1], os.getpid()))"
491 
492  return ("python -c '" + (profilePre if args.batchProfile else "") + script +
493  (profilePost if args.batchProfile else "") + "' " + shCommandFromArgs(args.leftover) +
494  " --noExit")
495 
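The command assembly is plain string building; a simplified stand-in is shown below (the real method also sets the umask, configures job logging, disables implicit threading, and can register batch statistics at exit).

```python
def batch_command(module, class_name, leftover_args, profile=False, job="job"):
    """Build a 'python -c' command line that runs Cls.parseAndRun().

    Simplified sketch of BatchCmdLineTask.batchCommand: the generated
    script imports the task's module and calls parseAndRun, optionally
    wrapped in cProfile when profiling is requested.
    """
    script = "import %s; %s.%s.parseAndRun();" % (module, module, class_name)
    if profile:
        script = ('import cProfile; cProfile.run("""%s""", '
                  'filename="profile-%s.dat")' % (script, job))
    return "python -c '" + script + "' " + " ".join(leftover_args) + " --noExit"

cmd = batch_command("lsst.pipe.drivers.constructCalibs", "FringeTask",
                    ["--id", "visit=903334"])
print(cmd)
```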

◆ batchWallTime()

def lsst.pipe.drivers.constructCalibs.CalibTask.batchWallTime (   cls,
  time,
  parsedCmd,
  numCores 
)
inherited

Definition at line 410 of file constructCalibs.py.

410  def batchWallTime(cls, time, parsedCmd, numCores):
411  numCcds = len(parsedCmd.butler.get("camera"))
412  numExps = len(cls.RunnerClass.getTargetList(
413  parsedCmd)[0]['expRefList'])
414  numCycles = int(numCcds/float(numCores) + 0.5)
415  return time*numExps*numCycles
416 
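The wall-time estimate is simple arithmetic: the number of combine cycles is the CCD count divided by the core count, rounded to nearest, and the total is the per-exposure time multiplied by the exposure count and cycle count. A plain-Python sketch:

```python
def batch_wall_time(time_per_exp, num_exps, num_ccds, num_cores):
    """Estimate wall time for a batch calib run.

    Mirrors the arithmetic of CalibTask.batchWallTime: each cycle
    processes one CCD per core, so the CCD list takes roughly
    round(numCcds / numCores) cycles.
    """
    num_cycles = int(num_ccds / float(num_cores) + 0.5)
    return time_per_exp * num_exps * num_cycles

# 10 CCDs on 4 cores -> 3 cycles; 5 exposures at 100 s each -> 1500 s
print(batch_wall_time(100, 5, 10, 4))  # 1500
```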

◆ calculateOutputHeaderFromRaws()

def lsst.pipe.drivers.constructCalibs.CalibTask.calculateOutputHeaderFromRaws (   self,
  butler,
  calib,
  dataIdList,
  outputId 
)
inherited

Calculate the output header from the raw headers.

This metadata will go into the output FITS header. It will include all headers that are identical in all inputs.

Parameters
butler: Data butler
calib: Combined calib exposure
dataIdList: List of data identifiers for calibration inputs
outputId: Data identifier for output

Definition at line 766 of file constructCalibs.py.

766  def calculateOutputHeaderFromRaws(self, butler, calib, dataIdList, outputId):
767  """!Calculate the output header from the raw headers.
768 
769  This metadata will go into the output FITS header. It will include all
770  headers that are identical in all inputs.
771 
772  @param butler Data butler
773  @param calib Combined calib exposure.
774  @param dataIdList List of data identifiers for calibration inputs
775  @param outputId Data identifier for output
776  """
777  header = calib.getMetadata()
778 
779  rawmd = [butler.get("raw_md", dataId) for dataId in dataIdList if
780  dataId is not None]
781 
782  merged = merge_headers(rawmd, mode="drop")
783 
784  # Place merged set into the PropertyList if a value is not
785  # present already
786  # Comments are not present in the merged version so copy them across
787  for k, v in merged.items():
788  if k not in header:
789  comment = rawmd[0].getComment(k) if k in rawmd[0] else None
790  header.set(k, v, comment=comment)
791 
792  # Create an observation group so we can add some standard headers
793  # independent of the form in the input files.
794  # Use try block in case we are dealing with unexpected data headers
795  try:
796  group = ObservationGroup(rawmd, pedantic=False)
797  except Exception:
798  group = None
799 
800  comments = {"TIMESYS": "Time scale for all dates",
801  "DATE-OBS": "Start date of earliest input observation",
802  "MJD-OBS": "[d] Start MJD of earliest input observation",
803  "DATE-END": "End date of oldest input observation",
804  "MJD-END": "[d] End MJD of oldest input observation",
805  "MJD-AVG": "[d] MJD midpoint of all input observations",
806  "DATE-AVG": "Midpoint date of all input observations"}
807 
808  if group is not None:
809  oldest, newest = group.extremes()
810  dateCards = dates_to_fits(oldest.datetime_begin, newest.datetime_end)
811  else:
812  # Fall back to setting a DATE-OBS from the calibDate
813  dateCards = {"DATE-OBS": "{}T00:00:00.00".format(outputId[self.config.dateCalib])}
814  comments["DATE-OBS"] = "Date of start of day of calibration midpoint"
815 
816  for k, v in dateCards.items():
817  header.set(k, v, comment=comments.get(k, None))
818 
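The "drop" merge mode keeps only the headers that are identical in every input. The real code uses `merge_headers` from astro_metadata_translator; a plain-dict sketch of that behavior:

```python
def merge_drop(headers):
    """Keep only the keys whose values agree across every input header.

    Simplified stand-in for merge_headers(..., mode="drop"): any key
    that is missing from some input, or whose values differ, is dropped.
    """
    if not headers:
        return {}
    merged = dict(headers[0])
    for hdr in headers[1:]:
        for k in list(merged):
            if k not in hdr or hdr[k] != merged[k]:
                del merged[k]
    return merged

raw = [{"TELESCOP": "Subaru", "EXPTIME": 30.0},
       {"TELESCOP": "Subaru", "EXPTIME": 15.0}]
print(merge_drop(raw))  # {'TELESCOP': 'Subaru'}: EXPTIME differs, so it is dropped
```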

◆ checkCcdIdLists()

def lsst.pipe.drivers.constructCalibs.CalibTask.checkCcdIdLists (   self,
  ccdIdLists 
)
inherited
Check that the list of CCD dataIds is consistent

@param ccdIdLists  Dict of data identifier lists for each CCD name
@return Number of exposures, number of CCDs

Definition at line 886 of file constructCalibs.py.

886  def checkCcdIdLists(self, ccdIdLists):
887  """Check that the list of CCD dataIds is consistent
888 
889  @param ccdIdLists Dict of data identifier lists for each CCD name
890  @return Number of exposures, number of CCDs
891  """
892  visitIdLists = collections.defaultdict(list)
893  for ccdName in ccdIdLists:
894  for dataId in ccdIdLists[ccdName]:
895  visitName = dictToTuple(dataId, self.config.visitKeys)
896  visitIdLists[visitName].append(dataId)
897 
898  numExps = set(len(expList) for expList in ccdIdLists.values())
899  numCcds = set(len(ccdList) for ccdList in visitIdLists.values())
900 
901  if len(numExps) != 1 or len(numCcds) != 1:
902  # Presumably a visit somewhere doesn't have the full complement available.
903  # Dump the information so the user can figure it out.
904  self.log.warn("Number of visits for each CCD: %s",
905  {ccdName: len(ccdIdLists[ccdName]) for ccdName in ccdIdLists})
906  self.log.warn("Number of CCDs for each visit: %s",
907  {vv: len(visitIdLists[vv]) for vv in visitIdLists})
908  raise RuntimeError("Inconsistent number of exposures/CCDs")
909 
910  return numExps.pop(), numCcds.pop()
911 
912 
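The consistency check pivots the per-CCD lists into per-visit lists and requires both counts to be uniform. A plain-Python sketch, using a single visit key instead of the configurable `visitKeys` tuple:

```python
import collections

def check_ccd_id_lists(ccd_id_lists, visit_key="visit"):
    """Check that every CCD saw the same exposures and vice versa.

    Simplified sketch of CalibTask.checkCcdIdLists: pivot the per-CCD
    dataId lists into per-visit lists, then demand that every CCD has
    the same number of exposures and every visit the same number of CCDs.
    """
    visit_id_lists = collections.defaultdict(list)
    for ccd_name, data_ids in ccd_id_lists.items():
        for data_id in data_ids:
            visit_id_lists[data_id[visit_key]].append(data_id)

    num_exps = set(len(v) for v in ccd_id_lists.values())
    num_ccds = set(len(v) for v in visit_id_lists.values())
    if len(num_exps) != 1 or len(num_ccds) != 1:
        raise RuntimeError("Inconsistent number of exposures/CCDs")
    return num_exps.pop(), num_ccds.pop()

ccd_lists = {0: [{"visit": 1, "ccd": 0}, {"visit": 2, "ccd": 0}],
             1: [{"visit": 1, "ccd": 1}, {"visit": 2, "ccd": 1}]}
print(check_ccd_id_lists(ccd_lists))  # (2, 2): 2 exposures, 2 CCDs
```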

◆ combine()

def lsst.pipe.drivers.constructCalibs.CalibTask.combine (   self,
  cache,
  struct,
  outputId 
)
inherited

Combine multiple exposures of a particular CCD and write the output.

Only the slave nodes execute this method.

Parameters
cache: Process pool cache
struct: Parameters for the combination, which has the following components:
  • ccdName: Name tuple for CCD
  • ccdIdList: List of data identifiers for combination
  • scales: Scales to apply (expScales are scalings for each exposure; ccdScale is the final scale for the combined image)
outputId: Data identifier for combined image (exposure part only)
Returns
binned calib image

Definition at line 726 of file constructCalibs.py.

726  def combine(self, cache, struct, outputId):
727  """!Combine multiple exposures of a particular CCD and write the output
728 
729  Only the slave nodes execute this method.
730 
731  @param cache Process pool cache
732  @param struct Parameters for the combination, which has the following components:
733  * ccdName Name tuple for CCD
734  * ccdIdList List of data identifiers for combination
735  * scales Scales to apply (expScales are scalings for each exposure,
736  ccdScale is final scale for combined image)
737  @param outputId Data identifier for combined image (exposure part only)
738  @return binned calib image
739  """
740  outputId = self.getFullyQualifiedOutputId(struct.ccdName, cache.butler, outputId)
741  dataRefList = [getDataRef(cache.butler, dataId) if dataId is not None else None for
742  dataId in struct.ccdIdList]
743  self.log.info("Combining %s on %s" % (outputId, NODE))
744  calib = self.combination.run(dataRefList, expScales=struct.scales.expScales,
745  finalScale=struct.scales.ccdScale)
746 
747  if not hasattr(calib, "getMetadata"):
748  if hasattr(calib, "getVariance"):
749  calib = afwImage.makeExposure(calib)
750  else:
751  calib = afwImage.DecoratedImageF(calib.getImage()) # n.b. hardwires "F" for the output type
752 
753  self.calculateOutputHeaderFromRaws(cache.butler, calib, struct.ccdIdList, outputId)
754 
755  self.updateMetadata(calib, self.exposureTime)
756 
757  self.recordCalibInputs(cache.butler, calib,
758  struct.ccdIdList, outputId)
759 
760  self.interpolateNans(calib)
761 
762  self.write(cache.butler, calib, outputId)
763 
764  return afwMath.binImage(calib.getImage(), self.config.binning)
765 

◆ emptyMetadata()

def lsst.pipe.base.task.Task.emptyMetadata (   self)
inherited
Empty (clear) the metadata for this Task and all sub-Tasks.

Definition at line 153 of file task.py.

153  def emptyMetadata(self):
154  """Empty (clear) the metadata for this Task and all sub-Tasks.
155  """
156  for subtask in self._taskDict.values():
157  subtask.metadata = dafBase.PropertyList()
158 

◆ getAllSchemaCatalogs()

def lsst.pipe.base.task.Task.getAllSchemaCatalogs (   self)
inherited
Get schema catalogs for all tasks in the hierarchy, combining the results into a single dict.

Returns
-------
schemacatalogs : `dict`
    Keys are butler dataset type, values are an empty catalog (an instance of the appropriate
    lsst.afw.table Catalog type) for all tasks in the hierarchy, from the top-level task down
    through all subtasks.

Notes
-----
This method may be called on any task in the hierarchy; it will return the same answer, regardless.

The default implementation should always suffice. If your subtask uses schemas, then override
`Task.getSchemaCatalogs`, not this method.

Definition at line 188 of file task.py.

188  def getAllSchemaCatalogs(self):
189  """Get schema catalogs for all tasks in the hierarchy, combining the results into a single dict.
190 
191  Returns
192  -------
193  schemacatalogs : `dict`
194  Keys are butler dataset type, values are a empty catalog (an instance of the appropriate
195  lsst.afw.table Catalog type) for all tasks in the hierarchy, from the top-level task down
196  through all subtasks.
197 
198  Notes
199  -----
200  This method may be called on any task in the hierarchy; it will return the same answer, regardless.
201 
202  The default implementation should always suffice. If your subtask uses schemas the override
203  `Task.getSchemaCatalogs`, not this method.
204  """
205  schemaDict = self.getSchemaCatalogs()
206  for subtask in self._taskDict.values():
207  schemaDict.update(subtask.getSchemaCatalogs())
208  return schemaDict
209 

◆ getFilter()

def lsst.pipe.drivers.constructCalibs.CalibTask.getFilter (   self,
  butler,
  dataId 
)
inherited
Determine the filter from a data identifier

Definition at line 544 of file constructCalibs.py.

544  def getFilter(self, butler, dataId):
545  """Determine the filter from a data identifier"""
546  filt = butler.queryMetadata('raw', [self.config.filter], dataId)[0]
547  return filt
548 

◆ getFullMetadata()

def lsst.pipe.base.task.Task.getFullMetadata (   self)
inherited
Get metadata for all tasks.

Returns
-------
metadata : `lsst.daf.base.PropertySet`
    The `~lsst.daf.base.PropertySet` keys are the full task name. Values are metadata
    for the top-level task and all subtasks, sub-subtasks, etc..

Notes
-----
The returned metadata includes timing information (if ``@timer.timeMethod`` is used)
and any metadata set by the task. The name of each item consists of the full task name
with ``.`` replaced by ``:``, followed by ``.`` and the name of the item, e.g.::

    topLevelTaskName:subtaskName:subsubtaskName.itemName

Using ``:`` in the full task name disambiguates the rare situation that a task has a subtask
and a metadata item with the same name.

Definition at line 210 of file task.py.

210  def getFullMetadata(self):
211  """Get metadata for all tasks.
212 
213  Returns
214  -------
215  metadata : `lsst.daf.base.PropertySet`
216  The `~lsst.daf.base.PropertySet` keys are the full task name. Values are metadata
217  for the top-level task and all subtasks, sub-subtasks, etc..
218 
219  Notes
220  -----
221  The returned metadata includes timing information (if ``@timer.timeMethod`` is used)
222  and any metadata set by the task. The name of each item consists of the full task name
223  with ``.`` replaced by ``:``, followed by ``.`` and the name of the item, e.g.::
224 
225  topLevelTaskName:subtaskName:subsubtaskName.itemName
226 
227  using ``:`` in the full task name disambiguates the rare situation that a task has a subtask
228  and a metadata item with the same name.
229  """
230  fullMetadata = dafBase.PropertySet()
231  for fullName, task in self.getTaskDict().items():
232  fullMetadata.set(fullName.replace(".", ":"), task.metadata)
233  return fullMetadata
234 

◆ getFullName()

def lsst.pipe.base.task.Task.getFullName (   self)
inherited
Get the task name as a hierarchical name including parent task names.

Returns
-------
fullName : `str`
    The full name consists of the name of the parent task and each subtask separated by periods.
    For example:

    - The full name of top-level task "top" is simply "top".
    - The full name of subtask "sub" of top-level task "top" is "top.sub".
    - The full name of subtask "sub2" of subtask "sub" of top-level task "top" is "top.sub.sub2".

Definition at line 235 of file task.py.

235  def getFullName(self):
236  """Get the task name as a hierarchical name including parent task names.
237 
238  Returns
239  -------
240  fullName : `str`
241  The full name consists of the name of the parent task and each subtask separated by periods.
242  For example:
243 
244  - The full name of top-level task "top" is simply "top".
245  - The full name of subtask "sub" of top-level task "top" is "top.sub".
246  - The full name of subtask "sub2" of subtask "sub" of top-level task "top" is "top.sub.sub2".
247  """
248  return self._fullName
249 

◆ getFullyQualifiedOutputId()

def lsst.pipe.drivers.constructCalibs.CalibTask.getFullyQualifiedOutputId (   self,
  ccdName,
  butler,
  outputId 
)
inherited
Get fully-qualified output data identifier

We may need to look up keys that aren't in the output dataId.

@param ccdName  Name tuple for CCD
@param butler  Data butler
@param outputId  Data identifier for combined image (exposure part only)
@return fully-qualified output dataId

Definition at line 710 of file constructCalibs.py.

710  def getFullyQualifiedOutputId(self, ccdName, butler, outputId):
711  """Get fully-qualified output data identifier
712 
713  We may need to look up keys that aren't in the output dataId.
714 
715  @param ccdName Name tuple for CCD
716  @param butler Data butler
717  @param outputId Data identifier for combined image (exposure part only)
718  @return fully-qualified output dataId
719  """
720  fullOutputId = {k: ccdName[i] for i, k in enumerate(self.config.ccdKeys)}
721  fullOutputId.update(outputId)
722  self.addMissingKeys(fullOutputId, butler)
723  fullOutputId.update(outputId) # must be after the call to queryMetadata in 'addMissingKeys'
724  return fullOutputId
725 
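The dataId assembly is a dict merge; a sketch with hypothetical keys is shown below. The trailing `update` matters: the key lookup may overwrite caller-supplied values, so they are re-asserted afterwards.

```python
def fully_qualified_output_id(ccd_name, ccd_keys, output_id, fill_missing):
    """Build a full output dataId from the CCD name tuple and output id.

    Sketch of CalibTask.getFullyQualifiedOutputId: fill_missing(data_id)
    stands in for addMissingKeys, and the second update guarantees the
    caller-supplied outputId values win over anything the lookup set.
    """
    full = {k: ccd_name[i] for i, k in enumerate(ccd_keys)}
    full.update(output_id)
    fill_missing(full)
    full.update(output_id)  # caller's values take precedence
    return full

def fill(data_id):
    # Hypothetical stand-in for addMissingKeys: supply one missing key.
    data_id.setdefault("taiObs", "2020-01-01")

full_id = fully_qualified_output_id(
    (1,), ["ccd"], {"filter": "g", "calibDate": "2020-01-01"}, fill)
print(full_id)
# {'ccd': 1, 'filter': 'g', 'calibDate': '2020-01-01', 'taiObs': '2020-01-01'}
```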

◆ getMjd()

def lsst.pipe.drivers.constructCalibs.CalibTask.getMjd (   self,
  butler,
  dataId,
  timescale = dafBase.DateTime.UTC 
)
inherited
Determine the Modified Julian Date (MJD; in TAI) from a data identifier

Definition at line 531 of file constructCalibs.py.

531  def getMjd(self, butler, dataId, timescale=dafBase.DateTime.UTC):
532  """Determine the Modified Julian Date (MJD; in TAI) from a data identifier"""
533  if self.config.dateObs in dataId:
534  dateObs = dataId[self.config.dateObs]
535  else:
536  dateObs = butler.queryMetadata('raw', [self.config.dateObs], dataId)[0]
537  if "T" not in dateObs:
538  dateObs = dateObs + "T12:00:00.0Z"
539  elif not dateObs.endswith("Z"):
540  dateObs += "Z"
541 
542  return dafBase.DateTime(dateObs, timescale).get(dafBase.DateTime.MJD)
543 
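The date-string handling (assume noon when only a date is given; force an explicit 'Z' suffix) can be sketched without the LSST DateTime class:

```python
def normalize_date_obs(date_obs):
    """Normalize a dateObs string before MJD conversion.

    Mirrors the string handling in CalibTask.getMjd: a bare date is
    taken to mean noon, and a trailing 'Z' is appended so the timezone
    is explicit.
    """
    if "T" not in date_obs:
        date_obs = date_obs + "T12:00:00.0Z"
    elif not date_obs.endswith("Z"):
        date_obs += "Z"
    return date_obs

print(normalize_date_obs("2020-01-01"))           # 2020-01-01T12:00:00.0Z
print(normalize_date_obs("2020-01-01T06:00:00"))  # 2020-01-01T06:00:00Z
```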

◆ getName()

def lsst.pipe.base.task.Task.getName (   self)
inherited
Get the name of the task.

Returns
-------
taskName : `str`
    Name of the task.

See also
--------
getFullName

Definition at line 250 of file task.py.

250  def getName(self):
251  """Get the name of the task.
252 
253  Returns
254  -------
255  taskName : `str`
256  Name of the task.
257 
258  See also
259  --------
260  getFullName
261  """
262  return self._name
263 

◆ getOutputId()

def lsst.pipe.drivers.constructCalibs.CalibTask.getOutputId (   self,
  expRefList,
  calibId 
)
inherited

Generate the data identifier for the output calib.

The mean date and the common filter are included, using keywords from the configuration. The CCD-specific part is not included in the data identifier.

Parameters
expRefList: List of data references at exposure level
calibId: Data identifier elements for the calib provided by the user
Returns
data identifier

Definition at line 496 of file constructCalibs.py.

496  def getOutputId(self, expRefList, calibId):
497  """!Generate the data identifier for the output calib
498 
499  The mean date and the common filter are included, using keywords
500  from the configuration. The CCD-specific part is not included
501  in the data identifier.
502 
503  @param expRefList List of data references at exposure level
504  @param calibId Data identifier elements for the calib provided by the user
505  @return data identifier
506  """
507  midTime = 0
508  filterName = None
509  for expRef in expRefList:
510  butler = expRef.getButler()
511  dataId = expRef.dataId
512 
513  midTime += self.getMjd(butler, dataId)
514  thisFilter = self.getFilter(
515  butler, dataId) if self.filterName is None else self.filterName
516  if filterName is None:
517  filterName = thisFilter
518  elif filterName != thisFilter:
519  raise RuntimeError("Filter mismatch for %s: %s vs %s" % (
520  dataId, thisFilter, filterName))
521 
522  midTime /= len(expRefList)
523  date = str(dafBase.DateTime(
524  midTime, dafBase.DateTime.MJD).toPython().date())
525 
526  outputId = {self.config.filter: filterName,
527  self.config.dateCalib: date}
528  outputId.update(calibId)
529  return outputId
530 
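The core of the method is averaging the input MJDs and insisting on a single common filter. A simplified sketch over (mjd, filter) pairs, leaving the average as a raw MJD rather than converting it to a date string as the real code does:

```python
def output_id(exposures, filter_key="filter", date_key="calibDate"):
    """Build the output calib dataId from (mjd, filter) exposure pairs.

    Simplified sketch of CalibTask.getOutputId: average the MJDs and
    require a single common filter across all inputs.
    """
    mid_time = 0.0
    filter_name = None
    for mjd, this_filter in exposures:
        mid_time += mjd
        if filter_name is None:
            filter_name = this_filter
        elif filter_name != this_filter:
            raise RuntimeError("Filter mismatch: %s vs %s" % (this_filter, filter_name))
    mid_time /= len(exposures)
    return {filter_key: filter_name, date_key: mid_time}

print(output_id([(58000.0, "g"), (58002.0, "g")]))
# {'filter': 'g', 'calibDate': 58001.0}
```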

◆ getSchemaCatalogs()

def lsst.pipe.base.task.Task.getSchemaCatalogs (   self)
inherited
Get the schemas generated by this task.

Returns
-------
schemaCatalogs : `dict`
    Keys are butler dataset type, values are an empty catalog (an instance of the appropriate
    `lsst.afw.table` Catalog type) for this task.

Notes
-----

.. warning::

   Subclasses that use schemas must override this method. The default implementation returns
   an empty dict.

This method may be called at any time after the Task is constructed, which means that all task
schemas should be computed at construction time, *not* when data is actually processed. This
reflects the philosophy that the schema should not depend on the data.

Returning catalogs rather than just schemas allows us to save e.g. slots for SourceCatalog as well.

See also
--------
Task.getAllSchemaCatalogs

Definition at line 159 of file task.py.

159  def getSchemaCatalogs(self):
160  """Get the schemas generated by this task.
161 
162  Returns
163  -------
164  schemaCatalogs : `dict`
165  Keys are butler dataset type, values are an empty catalog (an instance of the appropriate
166  `lsst.afw.table` Catalog type) for this task.
167 
168  Notes
169  -----
170 
171  .. warning::
172 
173  Subclasses that use schemas must override this method. The default implemenation returns
174  an empty dict.
175 
176  This method may be called at any time after the Task is constructed, which means that all task
177  schemas should be computed at construction time, *not* when data is actually processed. This
178  reflects the philosophy that the schema should not depend on the data.
179 
180  Returning catalogs rather than just schemas allows us to save e.g. slots for SourceCatalog as well.
181 
182  See also
183  --------
184  Task.getAllSchemaCatalogs
185  """
186  return {}
187 

◆ getTaskDict()

def lsst.pipe.base.task.Task.getTaskDict (   self)
inherited
Get a dictionary of all tasks as a shallow copy.

Returns
-------
taskDict : `dict`
    Dictionary containing full task name: task object for the top-level task and all subtasks,
    sub-subtasks, etc..

Definition at line 264 of file task.py.

264  def getTaskDict(self):
265  """Get a dictionary of all tasks as a shallow copy.
266 
267  Returns
268  -------
269  taskDict : `dict`
270  Dictionary containing full task name: task object for the top-level task and all subtasks,
271  sub-subtasks, etc..
272  """
273  return self._taskDict.copy()
274 

◆ interpolateNans()

def lsst.pipe.drivers.constructCalibs.CalibTask.interpolateNans (   self,
  image 
)
inherited
Interpolate over NANs in the combined image

NANs can result from masked areas on the CCD.  We don't want them getting
into our science images, so we replace them with the median of the image.

Definition at line 847 of file constructCalibs.py.

847  def interpolateNans(self, image):
848  """Interpolate over NANs in the combined image
849 
850  NANs can result from masked areas on the CCD. We don't want them getting
851  into our science images, so we replace them with the median of the image.
852  """
853  if hasattr(image, "getMaskedImage"): # Deal with Exposure vs Image
854  self.interpolateNans(image.getMaskedImage().getVariance())
855  image = image.getMaskedImage().getImage()
856  if hasattr(image, "getImage"): # Deal with DecoratedImage or MaskedImage vs Image
857  image = image.getImage()
858  array = image.getArray()
859  bad = np.isnan(array)
860  array[bad] = np.median(array[np.logical_not(bad)])
861 
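The NaN replacement is a two-line NumPy operation; a standalone sketch, minus the Exposure/MaskedImage unwrapping:

```python
import numpy as np

def interpolate_nans(array):
    """Replace NaN pixels with the median of the finite pixels, in place.

    Mirrors the core of CalibTask.interpolateNans: mask the NaNs, then
    fill them with the median of the remaining pixels.
    """
    bad = np.isnan(array)
    array[bad] = np.median(array[np.logical_not(bad)])
    return array

pixels = np.array([1.0, np.nan, 3.0, 5.0])
print(interpolate_nans(pixels))  # [1. 3. 3. 5.]: the NaN becomes median(1, 3, 5) = 3
```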

◆ logOperation()

def lsst.ctrl.pool.parallel.BatchCmdLineTask.logOperation (   self,
  operation,
  catch = False,
  trace = True 
)
inherited

Provide a context manager for logging an operation.

Parameters
operation: Description of the operation (string)
catch: Catch all exceptions?
trace: Log a traceback of caught exception?

Note that if 'catch' is True, all exceptions are swallowed, but there may be other side-effects such as undefined variables.

Definition at line 497 of file parallel.py.

497  def logOperation(self, operation, catch=False, trace=True):
498  """!Provide a context manager for logging an operation
499 
500  @param operation: description of operation (string)
501  @param catch: Catch all exceptions?
502  @param trace: Log a traceback of caught exception?
503 
504  Note that if 'catch' is True, all exceptions are swallowed, but there may
505  be other side-effects such as undefined variables.
506  """
507  self.log.info("%s: Start %s" % (NODE, operation))
508  try:
509  yield
510  except Exception:
511  if catch:
512  cls, e, _ = sys.exc_info()
513  self.log.warn("%s: Caught %s while %s: %s" % (NODE, cls.__name__, operation, e))
514  if trace:
515  self.log.info("%s: Traceback:\n%s" % (NODE, traceback.format_exc()))
516  return
517  raise
518  finally:
519  self.log.info("%s: Finished %s" % (NODE, operation))
520 
521 
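The same pattern can be reproduced with the standard library (a simplified sketch using `logging` instead of the LSST logger, and without the NODE prefix): because the generator swallows the exception when `catch=True`, the context manager suppresses it for the caller.

```python
import contextlib
import logging
import traceback

log = logging.getLogger("batch")

@contextlib.contextmanager
def log_operation(operation, catch=False, trace=True):
    """Context manager that brackets an operation with log messages.

    Simplified sketch of BatchCmdLineTask.logOperation: when catch=True
    an exception is logged and swallowed instead of propagating.
    """
    log.info("Start %s", operation)
    try:
        yield
    except Exception as e:
        if catch:
            log.warning("Caught %s while %s: %s", type(e).__name__, operation, e)
            if trace:
                log.info("Traceback:\n%s", traceback.format_exc())
        else:
            raise
    finally:
        log.info("Finished %s", operation)

# The exception is swallowed because catch=True:
with log_operation("combining CCD 3", catch=True):
    raise ValueError("simulated failure")
print("still running")  # reached: the error was logged, not raised
```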

◆ makeCameraImage()

def lsst.pipe.drivers.constructCalibs.CalibTask.makeCameraImage (   self,
  camera,
  dataId,
  calibs 
)
inherited

Create and write an image of the entire camera.

This is useful for judging the quality or getting an overview of the features of the calib.

Parameters
camera: Camera object
dataId: Data identifier for output
calibs: Dict mapping CCD detector ID to calib image

Definition at line 874 of file constructCalibs.py.

874  def makeCameraImage(self, camera, dataId, calibs):
875  """!Create and write an image of the entire camera
876 
877  This is useful for judging the quality or getting an overview of
878  the features of the calib.
879 
880  @param camera Camera object
881  @param dataId Data identifier for output
882  @param calibs Dict mapping CCD detector ID to calib image
883  """
884  return makeCameraImage(camera, calibs, self.config.binning)
885 

◆ makeField()

def lsst.pipe.base.task.Task.makeField (   cls,
  doc 
)
inherited
Make a `lsst.pex.config.ConfigurableField` for this task.

Parameters
----------
doc : `str`
    Help text for the field.

Returns
-------
configurableField : `lsst.pex.config.ConfigurableField`
    A `~ConfigurableField` for this task.

Examples
--------
Provides a convenient way to specify that this task is a subtask of another task.

Here is an example of use::

    class OtherTaskConfig(lsst.pex.config.Config):
        aSubtask = ATaskClass.makeField("a brief description of what this task does")

Definition at line 329 of file task.py.

329  def makeField(cls, doc):
330  """Make a `lsst.pex.config.ConfigurableField` for this task.
331 
332  Parameters
333  ----------
334  doc : `str`
335  Help text for the field.
336 
337  Returns
338  -------
339  configurableField : `lsst.pex.config.ConfigurableField`
340  A `~ConfigurableField` for this task.
341 
342  Examples
343  --------
344  Provides a convenient way to specify this task is a subtask of another task.
345 
346  Here is an example of use::
347 
348  class OtherTaskConfig(lsst.pex.config.Config)
349  aSubtask = ATaskClass.makeField("a brief description of what this task does")
350  """
351  return ConfigurableField(doc=doc, target=cls)
352 

◆ makeSubtask()

def lsst.pipe.base.task.Task.makeSubtask (   self,
  name,
  keyArgs 
)
inherited
Create a subtask as a new instance, assigned as the ``name`` attribute of this task.

Parameters
----------
name : `str`
    Brief name of the subtask.
keyArgs
    Extra keyword arguments used to construct the task. The following arguments are automatically
    provided and cannot be overridden:

    - "config".
    - "parentTask".

Notes
-----
The subtask must be defined by ``Task.config.name``, an instance of pex_config ConfigurableField
or RegistryField.

Definition at line 275 of file task.py.

275  def makeSubtask(self, name, **keyArgs):
276  """Create a subtask as a new instance as the ``name`` attribute of this task.
277 
278  Parameters
279  ----------
280  name : `str`
281  Brief name of the subtask.
282  keyArgs
283  Extra keyword arguments used to construct the task. The following arguments are automatically
284  provided and cannot be overridden:
285 
286  - "config".
287  - "parentTask".
288 
289  Notes
290  -----
291  The subtask must be defined by ``Task.config.name``, an instance of pex_config ConfigurableField
292  or RegistryField.
293  """
294  taskField = getattr(self.config, name, None)
295  if taskField is None:
296  raise KeyError("%s's config does not have field %r" % (self.getFullName(), name))
297  subtask = taskField.apply(name=name, parentTask=self, **keyArgs)
298  setattr(self, name, subtask)
299 
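The lookup-apply-setattr pattern in `makeSubtask` can be illustrated with plain objects. Everything below (`FakeField`, `SubTask`, `Parent`) is a hypothetical stand-in, not the pex_config API:

```python
class FakeField:
    """Stand-in for a ConfigurableField: apply() constructs the target task."""
    def __init__(self, target):
        self.target = target

    def apply(self, name=None, parentTask=None, **kwargs):
        return self.target(name=name, parentTask=parentTask, **kwargs)

class SubTask:
    def __init__(self, name=None, parentTask=None, level=0):
        self.name, self.parentTask, self.level = name, parentTask, level

class Parent:
    class Config:
        detection = FakeField(SubTask)

    def __init__(self):
        self.config = self.Config()

    def makeSubtask(self, name, **keyArgs):
        # Look up the field on the config, construct the subtask, and
        # attach it as an attribute of this task — mirroring the real method.
        field = getattr(self.config, name, None)
        if field is None:
            raise KeyError("config does not have field %r" % name)
        subtask = field.apply(name=name, parentTask=self, **keyArgs)
        setattr(self, name, subtask)

parent = Parent()
parent.makeSubtask("detection", level=3)
```

The extra keyword arguments (`level` here) pass straight through to the subtask constructor, while `name` and `parentTask` are supplied automatically and cannot be overridden.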

◆ parseAndRun() [1/2]

def lsst.ctrl.pool.parallel.BatchPoolTask.parseAndRun (   cls,
  args,
  kwargs 
)
inherited
Run with an MPI process pool.

Definition at line 529 of file parallel.py.

529  def parseAndRun(cls, *args, **kwargs):
530  """Run with a MPI process pool"""
531  pool = startPool()
532  super(BatchPoolTask, cls).parseAndRun(*args, **kwargs)
533  pool.exit()
534 
535 

◆ parseAndRun() [2/2]

def lsst.pipe.base.cmdLineTask.CmdLineTask.parseAndRun (   cls,
  args = None,
  config = None,
  log = None,
  doReturnResults = False 
)
inherited
Parse an argument list and run the command.

Parameters
----------
args : `list`, optional
    List of command-line arguments; if `None` use `sys.argv`.
config : `lsst.pex.config.Config`-type, optional
    Config for task. If `None` use `Task.ConfigClass`.
log : `lsst.log.Log`-type, optional
    Log. If `None` use the default log.
doReturnResults : `bool`, optional
    If `True`, return the results of this task. Default is `False`. This is only intended for
    unit tests and similar use. It can easily exhaust memory (if the task returns enough data and you
    call it enough times) and it will fail when using multiprocessing if the returned data cannot be
    pickled.

Returns
-------
struct : `lsst.pipe.base.Struct`
    Fields are:

    ``argumentParser``
        the argument parser (`lsst.pipe.base.ArgumentParser`).
    ``parsedCmd``
        the parsed command returned by the argument parser's
        `~lsst.pipe.base.ArgumentParser.parse_args` method (`argparse.Namespace`).
    ``taskRunner``
        the task runner used to run the task (an instance of `Task.RunnerClass`).
    ``resultList``
        results returned by the task runner's ``run`` method, one entry per
        invocation (`list`). This will typically be a list of `Struct`, each
        containing at least an ``exitStatus`` integer (0 or 1); see
        `Task.RunnerClass` (`TaskRunner` by default) for more details.

Notes
-----
Calling this method with no arguments specified is the standard way to run a command-line task
from the command-line. For an example see ``pipe_tasks`` ``bin/makeSkyMap.py`` or almost any other
file in that directory.

If one or more of the dataIds fails then this routine will exit (with a status giving the
number of failed dataIds) rather than returning this struct;  this behaviour can be
overridden by specifying the ``--noExit`` command-line option.

Definition at line 549 of file cmdLineTask.py.

549  def parseAndRun(cls, args=None, config=None, log=None, doReturnResults=False):
550  """Parse an argument list and run the command.
551 
552  Parameters
553  ----------
554  args : `list`, optional
555  List of command-line arguments; if `None` use `sys.argv`.
556  config : `lsst.pex.config.Config`-type, optional
557  Config for task. If `None` use `Task.ConfigClass`.
558  log : `lsst.log.Log`-type, optional
559  Log. If `None` use the default log.
560  doReturnResults : `bool`, optional
561  If `True`, return the results of this task. Default is `False`. This is only intended for
562  unit tests and similar use. It can easily exhaust memory (if the task returns enough data and you
563  call it enough times) and it will fail when using multiprocessing if the returned data cannot be
564  pickled.
565 
566  Returns
567  -------
568  struct : `lsst.pipe.base.Struct`
569  Fields are:
570 
571  ``argumentParser``
572  the argument parser (`lsst.pipe.base.ArgumentParser`).
573  ``parsedCmd``
574  the parsed command returned by the argument parser's
575  `~lsst.pipe.base.ArgumentParser.parse_args` method
576  (`argparse.Namespace`).
577  ``taskRunner``
578  the task runner used to run the task (an instance of `Task.RunnerClass`).
579  ``resultList``
580  results returned by the task runner's ``run`` method, one entry
581  per invocation (`list`). This will typically be a list of
582  `Struct`, each containing at least an ``exitStatus`` integer
583  (0 or 1); see `Task.RunnerClass` (`TaskRunner` by default) for
584  more details.
585 
586  Notes
587  -----
588  Calling this method with no arguments specified is the standard way to run a command-line task
589  from the command-line. For an example see ``pipe_tasks`` ``bin/makeSkyMap.py`` or almost any other
590  file in that directory.
591 
592  If one or more of the dataIds fails then this routine will exit (with a status giving the
593  number of failed dataIds) rather than returning this struct; this behaviour can be
594  overridden by specifying the ``--noExit`` command-line option.
595  """
596  if args is None:
597  commandAsStr = " ".join(sys.argv)
598  args = sys.argv[1:]
599  else:
600  commandAsStr = "{}{}".format(lsst.utils.get_caller_name(skip=1), tuple(args))
601 
602  argumentParser = cls._makeArgumentParser()
603  if config is None:
604  config = cls.ConfigClass()
605  parsedCmd = argumentParser.parse_args(config=config, args=args, log=log, override=cls.applyOverrides)
606  # print this message after parsing the command so the log is fully configured
607  parsedCmd.log.info("Running: %s", commandAsStr)
608 
609  taskRunner = cls.RunnerClass(TaskClass=cls, parsedCmd=parsedCmd, doReturnResults=doReturnResults)
610  resultList = taskRunner.run(parsedCmd)
611 
612  try:
613  nFailed = sum(((res.exitStatus != 0) for res in resultList))
614  except (TypeError, AttributeError) as e:
615  # NOTE: TypeError if resultList is None, AttributeError if it doesn't have exitStatus.
616  parsedCmd.log.warn("Unable to retrieve exit status (%s); assuming success", e)
617  nFailed = 0
618 
619  if nFailed > 0:
620  if parsedCmd.noExit:
621  parsedCmd.log.error("%d dataRefs failed; not exiting as --noExit was set", nFailed)
622  else:
623  sys.exit(nFailed)
624 
625  return Struct(
626  argumentParser=argumentParser,
627  parsedCmd=parsedCmd,
628  taskRunner=taskRunner,
629  resultList=resultList,
630  )
631 
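The exit-status bookkeeping at the end of `parseAndRun` is worth pulling out: it tolerates a runner that returns `None`, or results that lack an ``exitStatus`` attribute, and assumes success in those cases. A self-contained sketch of just that logic (using `types.SimpleNamespace` as an illustrative result object):

```python
from types import SimpleNamespace

def count_failures(resultList):
    """Count results with a nonzero exitStatus; assume success if unavailable."""
    try:
        return sum((res.exitStatus != 0) for res in resultList)
    except (TypeError, AttributeError):
        # TypeError if resultList is None; AttributeError if a result
        # has no exitStatus attribute.
        return 0

results = [SimpleNamespace(exitStatus=0), SimpleNamespace(exitStatus=1)]
nFailed = count_failures(results)  # 1 failed dataRef
```

In the real method, a nonzero count becomes the process exit status via `sys.exit(nFailed)` unless ``--noExit`` was given.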

◆ parseAndSubmit()

def lsst.ctrl.pool.parallel.BatchCmdLineTask.parseAndSubmit (   cls,
  args = None,
  kwargs 
)
inherited

Definition at line 433 of file parallel.py.

433  def parseAndSubmit(cls, args=None, **kwargs):
434  taskParser = cls._makeArgumentParser(doBatch=True, add_help=False)
435  batchParser = BatchArgumentParser(parent=taskParser)
436  batchArgs = batchParser.parse_args(config=cls.ConfigClass(), args=args, override=cls.applyOverrides,
437  **kwargs)
438 
439  if not cls.RunnerClass(cls, batchArgs.parent).precall(batchArgs.parent): # Write config, schema
440  taskParser.error("Error in task preparation")
441 
442  setBatchType(batchArgs.batch)
443 
444  if batchArgs.batch is None: # don't use a batch system
445  sys.argv = [sys.argv[0]] + batchArgs.leftover # Remove all batch arguments
446 
447  return cls.parseAndRun()
448  else:
449  numCores = batchArgs.cores if batchArgs.cores > 0 else batchArgs.nodes*batchArgs.procs
450  walltime = cls.batchWallTime(batchArgs.time, batchArgs.parent, numCores)
451 
452  command = cls.batchCommand(batchArgs)
453  batchArgs.batch.run(command, walltime=walltime)
454 

◆ process()

def lsst.pipe.drivers.constructCalibs.CalibTask.process (   self,
  cache,
  ccdId,
  outputName = "postISRCCD",
  kwargs 
)
inherited

Process a CCD, specified by a data identifier.

After processing, optionally returns a result (produced by the 'processResult' method) calculated from the processed exposure. These results will be gathered by the master node, and are a means of coordinating the scaling of all CCDs for flats, etc.

Only slave nodes execute this method.

Parameters
    cache       Process pool cache
    ccdId       Data identifier for CCD
    outputName  Output dataset name for butler

Returns
    Result from 'processResult'

Definition at line 602 of file constructCalibs.py.

602  def process(self, cache, ccdId, outputName="postISRCCD", **kwargs):
603  """!Process a CCD, specified by a data identifier
604 
605  After processing, optionally returns a result (produced by
606  the 'processResult' method) calculated from the processed
607  exposure. These results will be gathered by the master node,
608  and is a means for coordinated scaling of all CCDs for flats,
609  etc.
610 
611  Only slave nodes execute this method.
612 
613  @param cache Process pool cache
614  @param ccdId Data identifier for CCD
615  @param outputName Output dataset name for butler
616  @return result from 'processResult'
617  """
618  if ccdId is None:
619  self.log.warn("Null identifier received on %s" % NODE)
620  return None
621  sensorRef = getDataRef(cache.butler, ccdId)
622  if self.config.clobber or not sensorRef.datasetExists(outputName):
623  self.log.info("Processing %s on %s" % (ccdId, NODE))
624  try:
625  exposure = self.processSingle(sensorRef, **kwargs)
626  except Exception as e:
627  self.log.warn("Unable to process %s: %s" % (ccdId, e))
628  raise
630  self.processWrite(sensorRef, exposure)
631  else:
632  self.log.info(
633  "Using previously persisted processed exposure for %s" % (sensorRef.dataId,))
634  exposure = sensorRef.get(outputName)
635  return self.processResult(exposure)
636 

◆ processResult()

def lsst.pipe.drivers.constructCalibs.CalibTask.processResult (   self,
  exposure 
)
inherited
Extract processing results from a processed exposure

This method generates what is gathered by the master node.
This can be a background measurement or similar for scaling
flat-fields.  It must be picklable!

Only slave nodes execute this method.

Definition at line 660 of file constructCalibs.py.

660  def processResult(self, exposure):
661  """Extract processing results from a processed exposure
662 
663  This method generates what is gathered by the master node.
664  This can be a background measurement or similar for scaling
665  flat-fields. It must be picklable!
666 
667  Only slave nodes execute this method.
668  """
669  return None
670 

◆ processSingle()

def lsst.pipe.drivers.constructCalibs.FringeTask.processSingle (   self,
  sensorRef 
)
Subtract the background and normalise by the background level

Definition at line 1151 of file constructCalibs.py.

1151  def processSingle(self, sensorRef):
1152  """Subtract the background and normalise by the background level"""
1153  exposure = CalibTask.processSingle(self, sensorRef)
1154  bgLevel = self.stats.run(exposure)
1155  self.subtractBackground.run(exposure)
1156  mi = exposure.getMaskedImage()
1157  mi /= bgLevel
1158  footprintSets = self.detection.detectFootprints(
1159  exposure, sigma=self.config.detectSigma)
1160  mask = exposure.getMaskedImage().getMask()
1161  detected = 1 << mask.addMaskPlane("DETECTED")
1162  for fpSet in (footprintSets.positive, footprintSets.negative):
1163  if fpSet is not None:
1164  afwDet.setMaskFromFootprintList(
1165  mask, fpSet.getFootprints(), detected)
1166  return exposure
1167 
1168 

◆ processWrite()

def lsst.pipe.drivers.constructCalibs.CalibTask.processWrite (   self,
  dataRef,
  exposure,
  outputName = "postISRCCD" 
)
inherited

Write the processed CCD.

We need to write these out because we can't hold them all in memory at once.

Only slave nodes execute this method.

Parameters
    dataRef     Data reference
    exposure    CCD exposure to write
    outputName  Output dataset name for butler.

Definition at line 646 of file constructCalibs.py.

646  def processWrite(self, dataRef, exposure, outputName="postISRCCD"):
647  """!Write the processed CCD
648 
649  We need to write these out because we can't hold them all in
650  memory at once.
651 
652  Only slave nodes execute this method.
653 
654  @param dataRef Data reference
655  @param exposure CCD exposure to write
656  @param outputName Output dataset name for butler.
657  """
658  dataRef.put(exposure, outputName)
659 

◆ recordCalibInputs()

def lsst.pipe.drivers.constructCalibs.CalibTask.recordCalibInputs (   self,
  butler,
  calib,
  dataIdList,
  outputId 
)
inherited

Record metadata including the inputs and creation details.

This metadata will go into the FITS header.

Parameters
    butler      Data butler
    calib       Combined calib exposure.
    dataIdList  List of data identifiers for calibration inputs
    outputId    Data identifier for output

Definition at line 819 of file constructCalibs.py.

819  def recordCalibInputs(self, butler, calib, dataIdList, outputId):
820  """!Record metadata including the inputs and creation details
821 
822  This metadata will go into the FITS header.
823 
824  @param butler Data butler
825  @param calib Combined calib exposure.
826  @param dataIdList List of data identifiers for calibration inputs
827  @param outputId Data identifier for output
828  """
829  header = calib.getMetadata()
830  header.set("OBSTYPE", self.calibName) # Used by ingestCalibs.py
831 
832  # date, time, host, and root
833  now = time.localtime()
834  header.set("CALIB_CREATION_DATE", time.strftime("%Y-%m-%d", now))
835  header.set("CALIB_CREATION_TIME", time.strftime("%X %Z", now))
836 
837  # Inputs
838  visits = [str(dictToTuple(dataId, self.config.visitKeys)) for dataId in dataIdList if
839  dataId is not None]
840  for i, v in enumerate(sorted(set(visits))):
841  header.set("CALIB_INPUT_%d" % (i,), v)
842 
843  header.set("CALIB_ID", " ".join("%s=%s" % (key, value)
844  for key, value in outputId.items()))
845  checksum(calib, header)
846 
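The header-building logic above can be sketched with a plain dict standing in for the FITS metadata. `dict_to_tuple` is reimplemented here under the assumption that it simply extracts the listed keys in order; all names are illustrative:

```python
import time

def dict_to_tuple(d, keys):
    # Assumed behaviour of dictToTuple: pull out the given keys in order.
    return tuple(d[k] for k in keys)

def record_calib_inputs(header, calib_name, data_id_list, output_id,
                        visit_keys=("visit",)):
    """Populate calib metadata: type, creation time, inputs, and output ID."""
    header["OBSTYPE"] = calib_name  # used by ingestCalibs.py
    now = time.localtime()
    header["CALIB_CREATION_DATE"] = time.strftime("%Y-%m-%d", now)
    header["CALIB_CREATION_TIME"] = time.strftime("%X %Z", now)
    # One CALIB_INPUT_<n> card per distinct input visit, sorted, skipping Nones.
    visits = [str(dict_to_tuple(d, visit_keys)) for d in data_id_list
              if d is not None]
    for i, v in enumerate(sorted(set(visits))):
        header["CALIB_INPUT_%d" % i] = v
    header["CALIB_ID"] = " ".join("%s=%s" % (k, val)
                                  for k, val in output_id.items())
    return header

hdr = record_calib_inputs({}, "flat",
                          [{"visit": 12}, {"visit": 11}, None, {"visit": 12}],
                          {"filter": "g", "calibDate": "2020-01-01"})
```

Duplicate and `None` inputs collapse away, so the header records each contributing visit exactly once.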

◆ runDataRef()

def lsst.pipe.drivers.constructCalibs.CalibTask.runDataRef (   self,
  expRefList,
  butler,
  calibId 
)
inherited

Construct a calib from a list of exposure references.

This is the entry point, called by ``TaskRunner.__call__``.

Only the master node executes this method.

Parameters
    expRefList  List of data references at the exposure level
    butler      Data butler
    calibId     Identifier dict for calib

Definition at line 422 of file constructCalibs.py.

422  def runDataRef(self, expRefList, butler, calibId):
423  """!Construct a calib from a list of exposure references
424 
425  This is the entry point, called by the TaskRunner.__call__
426 
427  Only the master node executes this method.
428 
429  @param expRefList List of data references at the exposure level
430  @param butler Data butler
431  @param calibId Identifier dict for calib
432  """
433  if len(expRefList) < 1:
434  raise RuntimeError("No valid input data")
435 
436  for expRef in expRefList:
437  self.addMissingKeys(expRef.dataId, butler, self.config.ccdKeys, 'raw')
438 
439  outputId = self.getOutputId(expRefList, calibId)
440  ccdIdLists = getCcdIdListFromExposures(
441  expRefList, level="sensor", ccdKeys=self.config.ccdKeys)
442  self.checkCcdIdLists(ccdIdLists)
443 
444  # Ensure we can generate filenames for each output
445  outputIdItemList = list(outputId.items())
446  for ccdName in ccdIdLists:
447  dataId = dict([(k, ccdName[i]) for i, k in enumerate(self.config.ccdKeys)])
448  dataId.update(outputIdItemList)
449  self.addMissingKeys(dataId, butler)
450  dataId.update(outputIdItemList)
451 
452  try:
453  butler.get(self.calibName + "_filename", dataId)
454  except Exception as e:
455  raise RuntimeError(
456  "Unable to determine output filename \"%s_filename\" from %s: %s" %
457  (self.calibName, dataId, e))
458 
459  processPool = Pool("process")
460  processPool.storeSet(butler=butler)
461 
462  # Scatter: process CCDs independently
463  data = self.scatterProcess(processPool, ccdIdLists)
464 
465  # Gather: determine scalings
466  scales = self.scale(ccdIdLists, data)
467 
468  combinePool = Pool("combine")
469  combinePool.storeSet(butler=butler)
470 
471  # Scatter: combine
472  calibs = self.scatterCombine(combinePool, outputId, ccdIdLists, scales)
473 
474  if self.config.doCameraImage:
475  camera = butler.get("camera")
476  # Convert indexing of calibs from "ccdName" to detector ID (as used by makeImageFromCamera)
477  calibs = {butler.get("postISRCCD_detector",
478  dict(zip(self.config.ccdKeys, ccdName))).getId(): calibs[ccdName]
479  for ccdName in ccdIdLists}
480 
481  try:
482  cameraImage = self.makeCameraImage(camera, outputId, calibs)
483  butler.put(cameraImage, self.calibName + "_camera", dataId)
484  except Exception as exc:
485  self.log.warn("Unable to create camera image: %s" % (exc,))
486 
487  return Struct(
488  outputId=outputId,
489  ccdIdLists=ccdIdLists,
490  scales=scales,
491  calibs=calibs,
492  processPool=processPool,
493  combinePool=combinePool,
494  )
495 

◆ scale()

def lsst.pipe.drivers.constructCalibs.CalibTask.scale (   self,
  ccdIdLists,
  data 
)
inherited

Determine scaling across CCDs and exposures.

This is necessary mainly for flats, so as to determine a consistent scaling across the entire focal plane. This implementation is simply a placeholder.

Only the master node executes this method.

Parameters
    ccdIdLists  Dict of data identifier lists for each CCD tuple
    data        Dict of lists of returned data for each CCD tuple

Returns
    Dict of Struct(ccdScale: scaling for CCD, expScales: scaling for each exposure) for each CCD tuple

Definition at line 671 of file constructCalibs.py.

671  def scale(self, ccdIdLists, data):
672  """!Determine scaling across CCDs and exposures
673 
674  This is necessary mainly for flats, so as to determine a
675  consistent scaling across the entire focal plane. This
676  implementation is simply a placeholder.
677 
678  Only the master node executes this method.
679 
680  @param ccdIdLists Dict of data identifier lists for each CCD tuple
681  @param data Dict of lists of returned data for each CCD tuple
682  @return dict of Struct(ccdScale: scaling for CCD,
683  expScales: scaling for each exposure
684  ) for each CCD tuple
685  """
686  self.log.info("Scale on %s" % NODE)
687  return dict((name, Struct(ccdScale=None, expScales=[None] * len(ccdIdLists[name])))
688  for name in ccdIdLists)
689 
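The placeholder implementation simply builds a no-op scaling for every CCD. A stand-alone sketch, using `types.SimpleNamespace` in place of `lsst.pipe.base.Struct`:

```python
from types import SimpleNamespace as Struct

def scale(ccdIdLists, data):
    """Placeholder scaling: no scale for any CCD or any of its exposures."""
    return dict((name, Struct(ccdScale=None,
                              expScales=[None] * len(ccdIdLists[name])))
                for name in ccdIdLists)

# One expScales entry per input exposure of each CCD:
scales = scale({"ccd1": ["id1", "id2"], "ccd2": ["id3"]}, data=None)
```

A real flat-field implementation would replace the `None` values with per-CCD and per-exposure scale factors derived from the gathered `data`.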

◆ scatterCombine()

def lsst.pipe.drivers.constructCalibs.CalibTask.scatterCombine (   self,
  pool,
  outputId,
  ccdIdLists,
  scales 
)
inherited

Scatter the combination of exposures across multiple nodes.

In this case, we can only scatter across as many nodes as there are CCDs.

Only the master node executes this method.

Parameters
    pool        Process pool
    outputId    Output identifier (exposure part only)
    ccdIdLists  Dict of data identifier lists for each CCD name
    scales      Dict of structs with scales, for each CCD name

Returns
    Dict of binned images

Definition at line 690 of file constructCalibs.py.

690  def scatterCombine(self, pool, outputId, ccdIdLists, scales):
691  """!Scatter the combination of exposures across multiple nodes
692 
693  In this case, we can only scatter across as many nodes as
694  there are CCDs.
695 
696  Only the master node executes this method.
697 
698  @param pool Process pool
699  @param outputId Output identifier (exposure part only)
700  @param ccdIdLists Dict of data identifier lists for each CCD name
701  @param scales Dict of structs with scales, for each CCD name
702  @param dict of binned images
703  """
704  self.log.info("Scatter combination")
705  data = [Struct(ccdName=ccdName, ccdIdList=ccdIdLists[ccdName], scales=scales[ccdName]) for
706  ccdName in ccdIdLists]
707  images = pool.map(self.combine, data, outputId)
708  return dict(zip(ccdIdLists.keys(), images))
709 

◆ scatterProcess()

def lsst.pipe.drivers.constructCalibs.CalibTask.scatterProcess (   self,
  pool,
  ccdIdLists 
)
inherited

Scatter the processing among the nodes.

We scatter each CCD independently (exposures aren't grouped together), to make full use of all available processors. This necessitates piecing everything back together in the same format as ccdIdLists afterwards.

Only the master node executes this method.

Parameters
    pool        Process pool
    ccdIdLists  Dict of data identifier lists for each CCD name

Returns
    Dict of lists of returned data for each CCD name

Definition at line 586 of file constructCalibs.py.

586  def scatterProcess(self, pool, ccdIdLists):
587  """!Scatter the processing among the nodes
588 
589  We scatter each CCD independently (exposures aren't grouped together),
590  to make full use of all available processors. This necessitates piecing
591  everything back together in the same format as ccdIdLists afterwards.
592 
593  Only the master node executes this method.
594 
595  @param pool Process pool
596  @param ccdIdLists Dict of data identifier lists for each CCD name
597  @return Dict of lists of returned data for each CCD name
598  """
599  self.log.info("Scatter processing")
600  return mapToMatrix(pool, self.process, ccdIdLists)
601 
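`mapToMatrix` itself is not shown here, but the reassembly the docstring describes — scatter every CCD/exposure pair independently, then regroup the results into the same shape as `ccdIdLists` — can be sketched serially. `SerialPool` is a hypothetical stand-in for the process pool, not the ctrl_pool API:

```python
class SerialPool:
    """Hypothetical stand-in for the process pool: runs the map serially."""
    def map(self, func, items):
        return [func(item) for item in items]

def map_to_matrix(pool, func, ccdIdLists):
    # Flatten to (ccdName, ccdId) pairs so every CCD is scattered
    # independently, then piece the results back together keyed by CCD name.
    pairs = [(name, ccdId)
             for name, ids in ccdIdLists.items() for ccdId in ids]
    results = pool.map(lambda pair: func(pair[1]), pairs)
    matrix = {name: [] for name in ccdIdLists}
    for (name, _), result in zip(pairs, results):
        matrix[name].append(result)
    return matrix

out = map_to_matrix(SerialPool(), lambda ccdId: ccdId * 10,
                    {"ccd1": [1, 2], "ccd2": [3]})
```

Flattening before mapping is what lets the scatter use every available processor rather than being limited to one worker per CCD.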

◆ timer()

def lsst.pipe.base.task.Task.timer (   self,
  name,
  logLevel = Log.DEBUG 
)
inherited
Context manager to log performance data for an arbitrary block of code.

Parameters
----------
name : `str`
    Name of code being timed; data will be logged using item name: ``Start`` and ``End``.
logLevel
    A `lsst.log` level constant.

Examples
--------
Creating a timer context::

    with self.timer("someCodeToTime"):
        pass  # code to time

See also
--------
timer.logInfo

Definition at line 301 of file task.py.

301  def timer(self, name, logLevel=Log.DEBUG):
302  """Context manager to log performance data for an arbitrary block of code.
303 
304  Parameters
305  ----------
306  name : `str`
307  Name of code being timed; data will be logged using item name: ``Start`` and ``End``.
308  logLevel
309  A `lsst.log` level constant.
310 
311  Examples
312  --------
313  Creating a timer context::
314 
315  with self.timer("someCodeToTime"):
316  pass # code to time
317 
318  See also
319  --------
320  timer.logInfo
321  """
322  logInfo(obj=self, prefix=name + "Start", logLevel=logLevel)
323  try:
324  yield
325  finally:
326  logInfo(obj=self, prefix=name + "End", logLevel=logLevel)
327 

◆ updateMetadata()

def lsst.pipe.drivers.constructCalibs.CalibTask.updateMetadata (   self,
  calibImage,
  exposureTime,
  darkTime = None,
  kwargs 
)
inherited

Update the metadata from the VisitInfo.

Parameters
    calibImage    The image whose metadata is to be set
    exposureTime  The exposure time for the image
    darkTime      The time since the last read (default: exposureTime)

Definition at line 570 of file constructCalibs.py.

570  def updateMetadata(self, calibImage, exposureTime, darkTime=None, **kwargs):
571  """!Update the metadata from the VisitInfo
572 
573  @param calibImage The image whose metadata is to be set
574  @param exposureTime The exposure time for the image
575  @param darkTime The time since the last read (default: exposureTime)
576  """
577 
578  if darkTime is None:
579  darkTime = exposureTime # avoid warning messages when using calibration products
580 
581  visitInfo = afwImage.VisitInfo(exposureTime=exposureTime, darkTime=darkTime, **kwargs)
582  md = calibImage.getMetadata()
583 
584  afwImage.setVisitInfoMetadata(md, visitInfo)
585 

◆ write()

def lsst.pipe.drivers.constructCalibs.CalibTask.write (   self,
  butler,
  exposure,
  dataId 
)
inherited

Write the final combined calib.

Only the slave nodes execute this method

Parameters
    butler      Data butler
    exposure    CCD exposure to write
    dataId      Data identifier for output

Definition at line 862 of file constructCalibs.py.

862  def write(self, butler, exposure, dataId):
863  """!Write the final combined calib
864 
865  Only the slave nodes execute this method
866 
867  @param butler Data butler
868  @param exposure CCD exposure to write
869  @param dataId Data identifier for output
870  """
871  self.log.info("Writing %s on %s" % (dataId, NODE))
872  butler.put(exposure, self.calibName, dataId)
873 

◆ writeConfig()

def lsst.pipe.base.cmdLineTask.CmdLineTask.writeConfig (   self,
  butler,
  clobber = False,
  doBackup = True 
)
inherited
Write the configuration used for processing the data, or check that an existing
one is equal to the new one if present.

Parameters
----------
butler : `lsst.daf.persistence.Butler`
    Data butler used to write the config. The config is written to dataset type
    `CmdLineTask._getConfigName`.
clobber : `bool`, optional
    A boolean flag that controls what happens if a config already has been saved:
    - `True`: overwrite or rename the existing config, depending on ``doBackup``.
    - `False`: raise `TaskError` if this config does not match the existing config.
doBackup : bool, optional
    Set to `True` to backup the config files if clobbering.

Definition at line 656 of file cmdLineTask.py.

656  def writeConfig(self, butler, clobber=False, doBackup=True):
657  """Write the configuration used for processing the data, or check that an existing
658  one is equal to the new one if present.
659 
660  Parameters
661  ----------
662  butler : `lsst.daf.persistence.Butler`
663  Data butler used to write the config. The config is written to dataset type
664  `CmdLineTask._getConfigName`.
665  clobber : `bool`, optional
666  A boolean flag that controls what happens if a config already has been saved:
667  - `True`: overwrite or rename the existing config, depending on ``doBackup``.
668  - `False`: raise `TaskError` if this config does not match the existing config.
669  doBackup : bool, optional
670  Set to `True` to backup the config files if clobbering.
671  """
672  configName = self._getConfigName()
673  if configName is None:
674  return
675  if clobber:
676  butler.put(self.config, configName, doBackup=doBackup)
677  elif butler.datasetExists(configName, write=True):
678  # this may be subject to a race condition; see #2789
679  try:
680  oldConfig = butler.get(configName, immediate=True)
681  except Exception as exc:
682  raise type(exc)("Unable to read stored config file %s (%s); consider using --clobber-config" %
683  (configName, exc))
684 
685  def logConfigMismatch(msg):
686  self.log.fatal("Comparing configuration: %s", msg)
687 
688  if not self.config.compare(oldConfig, shortcut=False, output=logConfigMismatch):
689  raise TaskError(
690  ("Config does not match existing task config %r on disk; tasks configurations " +
691  "must be consistent within the same output repo (override with --clobber-config)") %
692  (configName,))
693  else:
694  butler.put(self.config, configName)
695 

◆ writeMetadata()

def lsst.pipe.base.cmdLineTask.CmdLineTask.writeMetadata (   self,
  dataRef 
)
inherited
Write the metadata produced from processing the data.

Parameters
----------
dataRef
    Butler data reference used to write the metadata.
    The metadata is written to dataset type `CmdLineTask._getMetadataName`.

Definition at line 731 of file cmdLineTask.py.

731  def writeMetadata(self, dataRef):
732  """Write the metadata produced from processing the data.
733 
734  Parameters
735  ----------
736  dataRef
737  Butler data reference used to write the metadata.
738  The metadata is written to dataset type `CmdLineTask._getMetadataName`.
739  """
740  try:
741  metadataName = self._getMetadataName()
742  if metadataName is not None:
743  dataRef.put(self.getFullMetadata(), metadataName)
744  except Exception as e:
745  self.log.warn("Could not persist metadata for dataId=%s: %s", dataRef.dataId, e)
746 

◆ writePackageVersions()

def lsst.pipe.base.cmdLineTask.CmdLineTask.writePackageVersions (   self,
  butler,
  clobber = False,
  doBackup = True,
  dataset = "packages" 
)
inherited
Compare and write package versions.

Parameters
----------
butler : `lsst.daf.persistence.Butler`
    Data butler used to read/write the package versions.
clobber : `bool`, optional
    A boolean flag that controls what happens if versions already have been saved:
    - `True`: overwrite or rename the existing version info, depending on ``doBackup``.
    - `False`: raise `TaskError` if this version info does not match the existing.
doBackup : `bool`, optional
    If `True` and clobbering, old package version files are backed up.
dataset : `str`, optional
    Name of dataset to read/write.

Raises
------
TaskError
    Raised if there is a version mismatch with current and persisted lists of package versions.

Notes
-----
Note that this operation is subject to a race condition.

Definition at line 747 of file cmdLineTask.py.

def writePackageVersions(self, butler, clobber=False, doBackup=True, dataset="packages"):
    """Compare and write package versions.

    Parameters
    ----------
    butler : `lsst.daf.persistence.Butler`
        Data butler used to read/write the package versions.
    clobber : `bool`, optional
        Controls what happens if versions have already been saved:
        - `True`: overwrite or rename the existing version info, depending on ``doBackup``.
        - `False`: raise `TaskError` if the current version info does not match the existing info.
    doBackup : `bool`, optional
        If `True` and clobbering, old package version files are backed up.
    dataset : `str`, optional
        Name of dataset to read/write.

    Raises
    ------
    TaskError
        Raised if there is a version mismatch between the current and persisted lists of package versions.

    Notes
    -----
    This operation is subject to a race condition.
    """
    packages = Packages.fromSystem()

    if clobber:
        return butler.put(packages, dataset, doBackup=doBackup)
    if not butler.datasetExists(dataset, write=True):
        return butler.put(packages, dataset)

    try:
        old = butler.get(dataset, immediate=True)
    except Exception as exc:
        raise type(exc)("Unable to read stored version dataset %s (%s); "
                        "consider using --clobber-versions or --no-versions" %
                        (dataset, exc))
    # Because we can only detect python modules that have been imported, the stored
    # list of products may be more or less complete than what we have now. What's
    # important is that the products that are in common have the same version.
    diff = packages.difference(old)
    if diff:
        raise TaskError(
            "Version mismatch (" +
            "; ".join("%s: %s vs %s" % (pkg, diff[pkg][1], diff[pkg][0]) for pkg in diff) +
            "); consider using --clobber-versions or --no-versions")
    # Update the old set of packages in case we have more packages that haven't been persisted.
    extra = packages.extra(old)
    if extra:
        old.update(packages)
        butler.put(old, dataset, doBackup=doBackup)

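The comparison step can be illustrated with plain dicts. The two helpers below are hypothetical mimics of what `Packages.difference` and `Packages.extra` appear to provide in the listing above (the exact tuple ordering is an assumption): version conflicts among packages present in both lists, and packages present only in the current environment.

```python
def version_diff(current, stored):
    # Packages in both lists whose versions disagree,
    # mapped to (stored_version, current_version) -- ordering assumed.
    return {pkg: (stored[pkg], current[pkg])
            for pkg in current.keys() & stored.keys()
            if current[pkg] != stored[pkg]}


def version_extra(current, stored):
    # Packages present now that were never persisted.
    return {pkg: ver for pkg, ver in current.items() if pkg not in stored}


stored = {"numpy": "1.17.0", "astropy": "3.2"}
current = {"numpy": "1.18.0", "astropy": "3.2", "scipy": "1.4"}

diff = version_diff(current, stored)    # {'numpy': ('1.17.0', '1.18.0')}
extra = version_extra(current, stored)  # {'scipy': '1.4'}

# A mismatch would be reported as "numpy: 1.18.0 vs 1.17.0" before
# suggesting --clobber-versions or --no-versions.
message = "; ".join("%s: %s vs %s" % (pkg, cur, old)
                    for pkg, (old, cur) in diff.items())
```

Because only imported Python modules are detected, `extra` is expected to be non-empty in practice, which is why the listing merges the two sets and re-persists.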

◆ writeSchemas()

def lsst.pipe.base.cmdLineTask.CmdLineTask.writeSchemas (   self,
  butler,
  clobber = False,
  doBackup = True 
)
inherited
Write the schemas returned by `lsst.pipe.base.Task.getAllSchemaCatalogs`.

Parameters
----------
butler : `lsst.daf.persistence.Butler`
    Data butler used to write the schema. Each schema is written to the dataset type specified as the
    key in the dict returned by `~lsst.pipe.base.Task.getAllSchemaCatalogs`.
clobber : `bool`, optional
    Controls what happens if a schema has already been saved:
    - `True`: overwrite or rename the existing schema, depending on ``doBackup``.
    - `False`: raise `TaskError` if this schema does not match the existing schema.
doBackup : `bool`, optional
    Set to `True` to back up the schema files if clobbering.

Notes
-----
If ``clobber`` is `False` and an existing schema does not match a current schema,
then some schemas may have been saved successfully and others may not, and there is no easy way to
tell which is which.

Definition at line 696 of file cmdLineTask.py.

def writeSchemas(self, butler, clobber=False, doBackup=True):
    """Write the schemas returned by `lsst.pipe.base.Task.getAllSchemaCatalogs`.

    Parameters
    ----------
    butler : `lsst.daf.persistence.Butler`
        Data butler used to write the schema. Each schema is written to the dataset type specified as the
        key in the dict returned by `~lsst.pipe.base.Task.getAllSchemaCatalogs`.
    clobber : `bool`, optional
        Controls what happens if a schema has already been saved:
        - `True`: overwrite or rename the existing schema, depending on ``doBackup``.
        - `False`: raise `TaskError` if this schema does not match the existing schema.
    doBackup : `bool`, optional
        Set to `True` to back up the schema files if clobbering.

    Notes
    -----
    If ``clobber`` is `False` and an existing schema does not match a current schema,
    then some schemas may have been saved successfully and others may not, and there is no easy way to
    tell which is which.
    """
    for dataset, catalog in self.getAllSchemaCatalogs().items():
        schemaDataset = dataset + "_schema"
        if clobber:
            butler.put(catalog, schemaDataset, doBackup=doBackup)
        elif butler.datasetExists(schemaDataset, write=True):
            oldSchema = butler.get(schemaDataset, immediate=True).getSchema()
            if not oldSchema.compare(catalog.getSchema(), afwTable.Schema.IDENTICAL):
                raise TaskError(
                    ("New schema does not match schema %r on disk; schemas must be "
                     "consistent within the same output repo (override with --clobber-config)") %
                    (dataset,))
        else:
            butler.put(catalog, schemaDataset)

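Per dataset, the branch structure above reduces to a three-way decision. The helper below is a hypothetical sketch of that control flow only (not the real API), with the afwTable schema comparison reduced to a boolean.

```python
class SchemaMismatchError(RuntimeError):
    """Stand-in for lsst.pipe.base.TaskError in this sketch."""


def schema_write_action(clobber, exists, matches):
    # Decide what writeSchemas() does for one "<dataset>_schema" entry.
    if clobber:
        return "put"    # overwrite (backed up when doBackup=True)
    if exists:
        if not matches:
            raise SchemaMismatchError("override with --clobber-config")
        return "skip"   # existing schema is already identical
    return "put"        # first time this schema is persisted
```

With `clobber=False`, a pre-existing but non-identical schema raises, matching the `TaskError` path in the listing; a matching schema is simply left in place.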

Member Data Documentation

◆ calibName

string lsst.pipe.drivers.constructCalibs.FringeTask.calibName = "fringe"
static

Definition at line 1138 of file constructCalibs.py.

◆ canMultiprocess

bool lsst.pipe.base.cmdLineTask.CmdLineTask.canMultiprocess = True
staticinherited

Definition at line 524 of file cmdLineTask.py.

◆ config

lsst.pipe.base.task.Task.config
inherited

Definition at line 149 of file task.py.

◆ ConfigClass

lsst.pipe.drivers.constructCalibs.FringeTask.ConfigClass = FringeConfig
static

Definition at line 1136 of file constructCalibs.py.

◆ exposureTime

float lsst.pipe.drivers.constructCalibs.CalibTask.exposureTime = 1.0
staticinherited

Definition at line 401 of file constructCalibs.py.

◆ filterName

lsst.pipe.drivers.constructCalibs.CalibTask.filterName = None
staticinherited

Definition at line 399 of file constructCalibs.py.

◆ log

lsst.pipe.base.task.Task.log
inherited

Definition at line 148 of file task.py.

◆ metadata

lsst.pipe.base.task.Task.metadata
inherited

Definition at line 121 of file task.py.

◆ RunnerClass

lsst.pipe.drivers.constructCalibs.CalibTask.RunnerClass = CalibTaskRunner
staticinherited

Definition at line 398 of file constructCalibs.py.


The documentation for this class was generated from the following file: