LSST Data Management Base Package
cameraMapper.py
# This file is part of obs_base.
#
# Developed for the LSST Data Management System.
# This product includes software developed by the LSST Project
# (https://www.lsst.org).
# See the COPYRIGHT file at the top-level directory of this distribution
# for details of code ownership.
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.

import copy
import os
import re
import traceback
import warnings
import weakref
from deprecated.sphinx import deprecated

from astro_metadata_translator import fix_header
from lsst.utils import doImport
import lsst.daf.persistence as dafPersist
from . import ImageMapping, ExposureMapping, CalibrationMapping, DatasetMapping
import lsst.daf.base as dafBase
import lsst.afw.geom as afwGeom
import lsst.afw.image as afwImage
import lsst.afw.table as afwTable
from lsst.afw.fits import readMetadata
import lsst.afw.cameraGeom as afwCameraGeom
import lsst.log as lsstLog
import lsst.pex.exceptions as pexExcept
from .exposureIdInfo import ExposureIdInfo
from .makeRawVisitInfo import MakeRawVisitInfo
from .utils import createInitialSkyWcs, InitialSkyWcsError
from lsst.utils import getPackageDir
from ._instrument import Instrument

__all__ = ["CameraMapper", "exposureFromImage"]


class CameraMapper(dafPersist.Mapper):

    """CameraMapper is a base class for mappers that handle images from a
    camera and products derived from them. This provides an abstraction layer
    between the data on disk and the code.

    Public methods: keys, queryMetadata, getDatasetTypes, map,
    canStandardize, standardize

    Mappers for specific data sources (e.g., CFHT Megacam, LSST
    simulations, etc.) should inherit this class.

    The CameraMapper manages datasets within a "root" directory. Note that
    writing to a dataset present in the input root will hide the existing
    dataset but not overwrite it. See #2160 for design discussion.

    A camera is assumed to consist of one or more rafts, each composed of
    multiple CCDs. Each CCD is in turn composed of one or more amplifiers
    (amps). A camera is also assumed to have a camera geometry description
    (CameraGeom object) as a policy file and a filter description (Filter
    class static configuration) as another policy file.

    Information from the camera geometry and defects is inserted into all
    Exposure objects returned.

    The mapper uses one or two registries to retrieve metadata about the
    images. The first is a registry of all raw exposures. This must contain
    the time of the observation. One or more tables (or the equivalent)
    within the registry are used to look up data identifier components that
    are not specified by the user (e.g. filter) and to return results for
    metadata queries. The second is an optional registry of all calibration
    data. This should contain validity start and end entries for each
    calibration dataset in the same timescale as the observation time.

    Subclasses will typically set MakeRawVisitInfoClass and optionally the
    metadata translator class:

    MakeRawVisitInfoClass: a class variable that points to a subclass of
    MakeRawVisitInfo, a functor that creates an
    lsst.afw.image.VisitInfo from the FITS metadata of a raw image.

    translatorClass: The `~astro_metadata_translator.MetadataTranslator`
    class to use for fixing metadata values. If it is not set, an attempt
    will be made to infer the class from ``MakeRawVisitInfoClass``; failing
    that, the metadata fixup will try to infer the translator class from the
    header itself.

    Subclasses must provide the following methods:

    _extractDetectorName(self, dataId): returns the detector name for a CCD
    (e.g., "CFHT 21", "R:1,2 S:3,4") as used in the AFW CameraGeom class given
    a dataset identifier referring to that CCD or a subcomponent of it.

    _computeCcdExposureId(self, dataId): see below

    _computeCoaddExposureId(self, dataId, singleFilter): see below

    Subclasses may also need to override the following methods:

    _transformId(self, dataId): transformation of a data identifier
    from colloquial usage (e.g., "ccdname") to proper/actual usage
    (e.g., "ccd"), including making it suitable for path expansion (e.g.
    removing commas). The default implementation does nothing. Note that this
    method should not modify its input parameter.

    getShortCcdName(self, ccdName): a static method that returns a shortened
    name suitable for use as a filename. The default version converts spaces
    to underscores.

    _mapActualToPath(self, template, actualId): convert a template path to an
    actual path, using the actual dataset identifier.

    The mapper's behaviors are largely specified by the policy file.
    See the MapperDictionary.paf for descriptions of the available items.

    The 'exposures', 'calibrations', and 'datasets' subpolicies configure
    mappings (see Mappings class).

    Common default mappings for all subclasses can be specified in the
    "policy/{images,exposures,calibrations,datasets}.yaml" files. This
    provides a simple way to add a product to all camera mappers.

    Functions to map (provide a path to the data given a dataset
    identifier dictionary) and standardize (convert data into some standard
    format or type) may be provided in the subclass as "map_{dataset type}"
    and "std_{dataset type}", respectively.

    If non-Exposure datasets cannot be retrieved using standard
    daf_persistence methods alone, a "bypass_{dataset type}" function may be
    provided in the subclass to return the dataset instead of using the
    "datasets" subpolicy.

    Implementations of map_camera and bypass_camera that should typically be
    sufficient are provided in this base class.

    Notes
    -----
    .. todo::

        Instead of auto-loading the camera at construction time, load it from
        the calibration registry

    Parameters
    ----------
    policy : daf_persistence.Policy
        Policy with per-camera defaults already merged.
    repositoryDir : string
        Policy repository for the subclassing module (obtained with
        getRepositoryPath() on the per-camera default dictionary).
    root : string, optional
        Path to the root directory for data.
    registry : string, optional
        Path to registry with data's metadata.
    calibRoot : string, optional
        Root directory for calibrations.
    calibRegistry : string, optional
        Path to registry with calibrations' metadata.
    provided : list of string, optional
        Keys provided by the mapper.
    parentRegistry : Registry subclass, optional
        Registry from a parent repository that may be used to look up
        data's metadata.
    repositoryCfg : daf_persistence.RepositoryCfg or None, optional
        The configuration information for the repository this mapper is
        being used with.
    """
    packageName = None

    # a class or subclass of MakeRawVisitInfo, a functor that makes an
    # lsst.afw.image.VisitInfo from the FITS metadata of a raw image
    MakeRawVisitInfoClass = MakeRawVisitInfo

    # a class or subclass of PupilFactory
    PupilFactoryClass = afwCameraGeom.PupilFactory

    # Class to use for metadata translations
    translatorClass = None

    # Gen3 instrument corresponding to this mapper
    # Can be a class or a string with the full name of the class
    _gen3instrument = None

    def __init__(self, policy, repositoryDir,
                 root=None, registry=None, calibRoot=None, calibRegistry=None,
                 provided=None, parentRegistry=None, repositoryCfg=None):

        dafPersist.Mapper.__init__(self)

        self.log = lsstLog.Log.getLogger("CameraMapper")

        if root:
            self.root = root
        elif repositoryCfg:
            self.root = repositoryCfg.root
        else:
            self.root = None

        repoPolicy = repositoryCfg.policy if repositoryCfg else None
        if repoPolicy is not None:
            policy.update(repoPolicy)

        # Levels
        self.levels = dict()
        if 'levels' in policy:
            levelsPolicy = policy['levels']
            for key in levelsPolicy.names(True):
                self.levels[key] = set(levelsPolicy.asArray(key))
        self.defaultLevel = policy['defaultLevel']
        self.defaultSubLevels = dict()
        if 'defaultSubLevels' in policy:
            self.defaultSubLevels = policy['defaultSubLevels']

        # Root directories
        if root is None:
            root = "."
        root = dafPersist.LogicalLocation(root).locString()

        self.rootStorage = dafPersist.Storage.makeFromURI(uri=root)

        # If the calibRoot is passed in, use that. If not and it's indicated in
        # the policy, use that. And otherwise, the calibs are in the regular
        # root.
        # If the location indicated by the calib root does not exist, do not
        # create it.
        calibStorage = None
        if calibRoot is not None:
            calibRoot = dafPersist.Storage.absolutePath(root, calibRoot)
            calibStorage = dafPersist.Storage.makeFromURI(uri=calibRoot,
                                                          create=False)
        else:
            calibRoot = policy.get('calibRoot', None)
            if calibRoot:
                calibStorage = dafPersist.Storage.makeFromURI(uri=calibRoot,
                                                              create=False)
        if calibStorage is None:
            calibStorage = self.rootStorage

        self.root = root

        # Registries
        self.registry = self._setupRegistry("registry", "exposure", registry, policy, "registryPath",
                                            self.rootStorage, searchParents=False,
                                            posixIfNoSql=(not parentRegistry))
        if not self.registry:
            self.registry = parentRegistry
        needCalibRegistry = policy.get('needCalibRegistry', None)
        if needCalibRegistry:
            if calibStorage:
                self.calibRegistry = self._setupRegistry("calibRegistry", "calib", calibRegistry, policy,
                                                         "calibRegistryPath", calibStorage,
                                                         posixIfNoSql=False)  # NB never use posix for calibs
            else:
                raise RuntimeError(
                    "'needCalibRegistry' is true in Policy, but was unable to locate a repo at "
                    f"calibRoot ivar:{calibRoot} or policy['calibRoot']:{policy.get('calibRoot', None)}")
        else:
            self.calibRegistry = None

        # Dict of valid keys and their value types
        self.keyDict = dict()

        self._initMappings(policy, self.rootStorage, calibStorage, provided=None)
        self._initWriteRecipes()

        # Camera geometry
        self.cameraDataLocation = None  # path to camera geometry config file
        self.camera = self._makeCamera(policy=policy, repositoryDir=repositoryDir)

        # Filter translation table
        self.filters = None

        # verify that the class variable packageName is set before attempting
        # to instantiate an instance
        if self.packageName is None:
            raise ValueError('class variable packageName must not be None')

        self.makeRawVisitInfo = self.MakeRawVisitInfoClass(log=self.log)

        # Assign a metadata translator if one has not been defined by
        # subclass. We can sometimes infer one from the RawVisitInfo
        # class.
        if self.translatorClass is None and hasattr(self.makeRawVisitInfo, "metadataTranslator"):
            self.translatorClass = self.makeRawVisitInfo.metadataTranslator

    def _initMappings(self, policy, rootStorage=None, calibStorage=None, provided=None):
        """Initialize mappings.

        For each of the dataset types that we want to be able to read, there
        are methods that can be created to support them:

        * map_<dataset> : determine the path for dataset
        * std_<dataset> : standardize the retrieved dataset
        * bypass_<dataset> : retrieve the dataset (bypassing the usual
          retrieval machinery)
        * query_<dataset> : query the registry

        Besides the dataset types explicitly listed in the policy, we create
        additional, derived datasets for additional conveniences,
        e.g., reading the header of an image, retrieving only the size of a
        catalog.

        Parameters
        ----------
        policy : `lsst.daf.persistence.Policy`
            Policy with per-camera defaults already merged.
        rootStorage : `Storage` subclass instance
            Interface to persisted repository data.
        calibStorage : `Storage` subclass instance
            Interface to persisted calib repository data.
        provided : `list` of `str`
            Keys provided by the mapper.
        """
        # Sub-dictionaries (for exposure/calibration/dataset types)
        imgMappingPolicy = dafPersist.Policy(dafPersist.Policy.defaultPolicyFile(
            "obs_base", "ImageMappingDefaults.yaml", "policy"))
        expMappingPolicy = dafPersist.Policy(dafPersist.Policy.defaultPolicyFile(
            "obs_base", "ExposureMappingDefaults.yaml", "policy"))
        calMappingPolicy = dafPersist.Policy(dafPersist.Policy.defaultPolicyFile(
            "obs_base", "CalibrationMappingDefaults.yaml", "policy"))
        dsMappingPolicy = dafPersist.Policy()

        # Mappings
        mappingList = (
            ("images", imgMappingPolicy, ImageMapping),
            ("exposures", expMappingPolicy, ExposureMapping),
            ("calibrations", calMappingPolicy, CalibrationMapping),
            ("datasets", dsMappingPolicy, DatasetMapping)
        )
        self.mappings = dict()
        for name, defPolicy, cls in mappingList:
            if name in policy:
                datasets = policy[name]

                # Centrally-defined datasets
                defaultsPath = os.path.join(getPackageDir("obs_base"), "policy", name + ".yaml")
                if os.path.exists(defaultsPath):
                    datasets.merge(dafPersist.Policy(defaultsPath))

                mappings = dict()
                setattr(self, name, mappings)
                for datasetType in datasets.names(True):
                    subPolicy = datasets[datasetType]
                    subPolicy.merge(defPolicy)

                    if not hasattr(self, "map_" + datasetType) and 'composite' in subPolicy:
                        def compositeClosure(dataId, write=False, mapper=None, mapping=None,
                                             subPolicy=subPolicy):
                            components = subPolicy.get('composite')
                            assembler = subPolicy['assembler'] if 'assembler' in subPolicy else None
                            disassembler = subPolicy['disassembler'] if 'disassembler' in subPolicy else None
                            python = subPolicy['python']
                            butlerComposite = dafPersist.ButlerComposite(assembler=assembler,
                                                                         disassembler=disassembler,
                                                                         python=python,
                                                                         dataId=dataId,
                                                                         mapper=self)
                            for name, component in components.items():
                                butlerComposite.add(id=name,
                                                    datasetType=component.get('datasetType'),
                                                    setter=component.get('setter', None),
                                                    getter=component.get('getter', None),
                                                    subset=component.get('subset', False),
                                                    inputOnly=component.get('inputOnly', False))
                            return butlerComposite
                        setattr(self, "map_" + datasetType, compositeClosure)
                        # for now at least, don't set up any other handling for
                        # this dataset type.
                        continue

                    if name == "calibrations":
                        mapping = cls(datasetType, subPolicy, self.registry, self.calibRegistry, calibStorage,
                                      provided=provided, dataRoot=rootStorage)
                    else:
                        mapping = cls(datasetType, subPolicy, self.registry, rootStorage, provided=provided)

                    if datasetType in self.mappings:
                        raise ValueError(f"Duplicate mapping policy for dataset type {datasetType}")
                    self.keyDict.update(mapping.keys())
                    mappings[datasetType] = mapping
                    self.mappings[datasetType] = mapping
                    if not hasattr(self, "map_" + datasetType):
                        def mapClosure(dataId, write=False, mapper=weakref.proxy(self), mapping=mapping):
                            return mapping.map(mapper, dataId, write)
                        setattr(self, "map_" + datasetType, mapClosure)
                    if not hasattr(self, "query_" + datasetType):
                        def queryClosure(format, dataId, mapping=mapping):
                            return mapping.lookup(format, dataId)
                        setattr(self, "query_" + datasetType, queryClosure)
                    if hasattr(mapping, "standardize") and not hasattr(self, "std_" + datasetType):
                        def stdClosure(item, dataId, mapper=weakref.proxy(self), mapping=mapping):
                            return mapping.standardize(mapper, item, dataId)
                        setattr(self, "std_" + datasetType, stdClosure)

                    def setMethods(suffix, mapImpl=None, bypassImpl=None, queryImpl=None):
                        """Set convenience methods on CameraMapper"""
                        mapName = "map_" + datasetType + "_" + suffix
                        bypassName = "bypass_" + datasetType + "_" + suffix
                        queryName = "query_" + datasetType + "_" + suffix
                        if not hasattr(self, mapName):
                            setattr(self, mapName, mapImpl or getattr(self, "map_" + datasetType))
                        if not hasattr(self, bypassName):
                            if bypassImpl is None and hasattr(self, "bypass_" + datasetType):
                                bypassImpl = getattr(self, "bypass_" + datasetType)
                            if bypassImpl is not None:
                                setattr(self, bypassName, bypassImpl)
                        if not hasattr(self, queryName):
                            setattr(self, queryName, queryImpl or getattr(self, "query_" + datasetType))

                    # Filename of dataset
                    setMethods("filename", bypassImpl=lambda datasetType, pythonType, location, dataId:
                               [os.path.join(location.getStorage().root, p) for p in location.getLocations()])
                    # Metadata from FITS file
                    if subPolicy["storage"] == "FitsStorage":  # a FITS image
                        def getMetadata(datasetType, pythonType, location, dataId):
                            md = readMetadata(location.getLocationsWithRoot()[0])
                            fix_header(md, translator_class=self.translatorClass)
                            return md

                        setMethods("md", bypassImpl=getMetadata)

                        # Add support for configuring FITS compression
                        addName = "add_" + datasetType
                        if not hasattr(self, addName):
                            setattr(self, addName, self.getImageCompressionSettings)

                        if name == "exposures":
                            def getSkyWcs(datasetType, pythonType, location, dataId):
                                fitsReader = afwImage.ExposureFitsReader(location.getLocationsWithRoot()[0])
                                return fitsReader.readWcs()

                            setMethods("wcs", bypassImpl=getSkyWcs)

                            def getRawHeaderWcs(datasetType, pythonType, location, dataId):
                                """Create a SkyWcs from the un-modified raw
                                FITS WCS header keys."""
                                if datasetType[:3] != "raw":
                                    raise dafPersist.NoResults("Can only get header WCS for raw exposures.",
                                                               datasetType, dataId)
                                return afwGeom.makeSkyWcs(readMetadata(location.getLocationsWithRoot()[0]))

                            setMethods("header_wcs", bypassImpl=getRawHeaderWcs)

                            def getPhotoCalib(datasetType, pythonType, location, dataId):
                                fitsReader = afwImage.ExposureFitsReader(location.getLocationsWithRoot()[0])
                                return fitsReader.readPhotoCalib()

                            setMethods("photoCalib", bypassImpl=getPhotoCalib)

                            def getVisitInfo(datasetType, pythonType, location, dataId):
                                fitsReader = afwImage.ExposureFitsReader(location.getLocationsWithRoot()[0])
                                return fitsReader.readVisitInfo()

                            setMethods("visitInfo", bypassImpl=getVisitInfo)

                            # TODO: remove in DM-27177
                            @deprecated(reason="Replaced with getFilterLabel. Will be removed after v22.",
                                        category=FutureWarning)
                            def getFilter(datasetType, pythonType, location, dataId):
                                fitsReader = afwImage.ExposureFitsReader(location.getLocationsWithRoot()[0])
                                return fitsReader.readFilter()

                            setMethods("filter", bypassImpl=getFilter)

                            # TODO: deprecate in DM-27177, remove in DM-27811
                            def getFilterLabel(datasetType, pythonType, location, dataId):
                                fitsReader = afwImage.ExposureFitsReader(location.getLocationsWithRoot()[0])
                                storedFilter = fitsReader.readFilterLabel()

                                # Apply standardization used by full Exposure
                                try:
                                    # mapping is local to enclosing scope
                                    idFilter = mapping.need(['filter'], dataId)['filter']
                                except dafPersist.NoResults:
                                    idFilter = None
                                bestFilter = self._getBestFilter(storedFilter, idFilter)
                                if bestFilter is not None:
                                    return bestFilter
                                else:
                                    return storedFilter

                            setMethods("filterLabel", bypassImpl=getFilterLabel)

                            setMethods("detector",
                                       mapImpl=lambda dataId, write=False:
                                           dafPersist.ButlerLocation(
                                               pythonType="lsst.afw.cameraGeom.CameraConfig",
                                               cppType="Config",
                                               storageName="Internal",
                                               locationList="ignored",
                                               dataId=dataId,
                                               mapper=self,
                                               storage=None,
                                           ),
                                       bypassImpl=lambda datasetType, pythonType, location, dataId:
                                           self.camera[self._extractDetectorName(dataId)]
                                       )

                            def getBBox(datasetType, pythonType, location, dataId):
                                md = readMetadata(location.getLocationsWithRoot()[0], hdu=1)
                                fix_header(md, translator_class=self.translatorClass)
                                return afwImage.bboxFromMetadata(md)

                            setMethods("bbox", bypassImpl=getBBox)

                        elif name == "images":
                            def getBBox(datasetType, pythonType, location, dataId):
                                md = readMetadata(location.getLocationsWithRoot()[0])
                                fix_header(md, translator_class=self.translatorClass)
                                return afwImage.bboxFromMetadata(md)
                            setMethods("bbox", bypassImpl=getBBox)

                    if subPolicy["storage"] == "FitsCatalogStorage":  # a FITS catalog

                        def getMetadata(datasetType, pythonType, location, dataId):
                            md = readMetadata(os.path.join(location.getStorage().root,
                                                           location.getLocations()[0]), hdu=1)
                            fix_header(md, translator_class=self.translatorClass)
                            return md

                        setMethods("md", bypassImpl=getMetadata)

                    # Sub-images
                    if subPolicy["storage"] == "FitsStorage":
                        def mapSubClosure(dataId, write=False, mapper=weakref.proxy(self), mapping=mapping):
                            subId = dataId.copy()
                            del subId['bbox']
                            loc = mapping.map(mapper, subId, write)
                            bbox = dataId['bbox']
                            llcX = bbox.getMinX()
                            llcY = bbox.getMinY()
                            width = bbox.getWidth()
                            height = bbox.getHeight()
                            loc.additionalData.set('llcX', llcX)
                            loc.additionalData.set('llcY', llcY)
                            loc.additionalData.set('width', width)
                            loc.additionalData.set('height', height)
                            if 'imageOrigin' in dataId:
                                loc.additionalData.set('imageOrigin',
                                                       dataId['imageOrigin'])
                            return loc

                        def querySubClosure(key, format, dataId, mapping=mapping):
                            subId = dataId.copy()
                            del subId['bbox']
                            return mapping.lookup(format, subId)
                        setMethods("sub", mapImpl=mapSubClosure, queryImpl=querySubClosure)

                    if subPolicy["storage"] == "FitsCatalogStorage":
                        # Length of catalog

                        def getLen(datasetType, pythonType, location, dataId):
                            md = readMetadata(os.path.join(location.getStorage().root,
                                                           location.getLocations()[0]), hdu=1)
                            fix_header(md, translator_class=self.translatorClass)
                            return md["NAXIS2"]

                        setMethods("len", bypassImpl=getLen)

                        # Schema of catalog
                        if not datasetType.endswith("_schema") and datasetType + "_schema" not in datasets:
                            setMethods("schema", bypassImpl=lambda datasetType, pythonType, location, dataId:
                                       afwTable.Schema.readFits(os.path.join(location.getStorage().root,
                                                                             location.getLocations()[0])))
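The `_initMappings` method above attaches `map_<type>`, `query_<type>`, and `std_<type>` methods at runtime by building closures over each `mapping` and `setattr`-ing them onto the instance, using `weakref.proxy(self)` as a default argument to avoid a reference cycle. A minimal, self-contained sketch of that pattern (the `MiniMapping`/`MiniMapper` classes and the path templates are illustrative stand-ins, not obs_base API):

```python
import weakref


class MiniMapping:
    """Illustrative stand-in for an obs_base Mapping: turns a dataId
    into a path via a template."""

    def __init__(self, template):
        self.template = template

    def map(self, mapper, dataId):
        return self.template % dataId


class MiniMapper:
    """Attaches one map_<type> method per dataset type, the way
    CameraMapper._initMappings does."""

    def __init__(self, templates):
        for datasetType, template in templates.items():
            mapping = MiniMapping(template)

            # Bind `mapping` as a default argument so each closure keeps
            # its own mapping; `weakref.proxy(self)` avoids a reference
            # cycle between the instance and its generated methods.
            def mapClosure(dataId, mapper=weakref.proxy(self), mapping=mapping):
                return mapping.map(mapper, dataId)

            setattr(self, "map_" + datasetType, mapClosure)


mapper = MiniMapper({"raw": "raw/v%(visit)d.fits",
                     "calexp": "calexp/v%(visit)d.fits"})
print(mapper.map_raw({"visit": 42}))  # raw/v42.fits
```

Binding `mapping=mapping` as a default argument is what gives each generated method its own mapping; without it, every closure would see the loop variable's final value.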

    def _computeCcdExposureId(self, dataId):
        """Compute the 64-bit (long) identifier for a CCD exposure.

        Subclasses must override.

        Parameters
        ----------
        dataId : `dict`
            Data identifier with visit, ccd.
        """
        raise NotImplementedError()

    def _computeCoaddExposureId(self, dataId, singleFilter):
        """Compute the 64-bit (long) identifier for a coadd.

        Subclasses must override.

        Parameters
        ----------
        dataId : `dict`
            Data identifier with tract and patch.
        singleFilter : `bool`
            True means the desired ID is for a single-filter coadd, in which
            case dataId must contain filter.
        """
        raise NotImplementedError()
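Concrete mappers override `_computeCcdExposureId` to pack the data identifier into a single 64-bit integer. The bit layout is camera-specific; as a rough illustration only (the 6-bit `ccdBits` constant here is made up, not any real instrument's layout):

```python
def computeCcdExposureId(dataId, ccdBits=6):
    """Illustrative packing of (visit, ccd) into one 64-bit integer.
    Real cameras choose their own bit layout; ccdBits=6 is an assumed
    value, not any actual instrument's."""
    visit, ccd = dataId["visit"], dataId["ccd"]
    if ccd >= 1 << ccdBits:
        raise ValueError(f"ccd {ccd} does not fit in {ccdBits} bits")
    # Visit occupies the high bits, detector number the low bits.
    return (visit << ccdBits) | ccd


print(computeCcdExposureId({"visit": 1234, "ccd": 7}))  # 78983
```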

    def _search(self, path):
        """Search for path in the associated repository's storage.

        Parameters
        ----------
        path : string
            Path that describes an object in the repository associated with
            this mapper.
            Path may contain an HDU indicator, e.g. 'foo.fits[1]'. The
            indicator will be stripped when searching and so will match
            filenames without the HDU indicator, e.g. 'foo.fits'. The path
            returned WILL contain the indicator though, e.g. ['foo.fits[1]'].

        Returns
        -------
        string
            The path for this object in the repository. Will return None if the
            object can't be found. If the input argument path contained an HDU
            indicator, the returned path will also contain the HDU indicator.
        """
        return self.rootStorage.search(path)

    def backup(self, datasetType, dataId):
        """Rename any existing object with the given type and dataId.

        The CameraMapper implementation saves objects in a sequence of e.g.:

        - foo.fits
        - foo.fits~1
        - foo.fits~2

        All of the backups will be placed in the output repo, however, and will
        not be removed if they are found elsewhere in the _parent chain. This
        means that the same file will be stored twice if the previous version
        was found in an input repo.
        """

        # Calling PosixStorage directly is not the long term solution in this
        # function, this is work-in-progress on epic DM-6225. The plan is for
        # parentSearch to be changed to 'search', and search only the storage
        # associated with this mapper. All searching of parents will be handled
        # by traversing the container of repositories in Butler.

        def firstElement(list):
            """Get the first element in the list, or None if that can't be
            done.
            """
            return list[0] if list is not None and len(list) else None

        n = 0
        newLocation = self.map(datasetType, dataId, write=True)
        newPath = newLocation.getLocations()[0]
        path = dafPersist.PosixStorage.search(self.root, newPath, searchParents=True)
        path = firstElement(path)
        oldPaths = []
        while path is not None:
            n += 1
            oldPaths.append((n, path))
            path = dafPersist.PosixStorage.search(self.root, "%s~%d" % (newPath, n), searchParents=True)
            path = firstElement(path)
        for n, oldPath in reversed(oldPaths):
            self.rootStorage.copyFile(oldPath, "%s~%d" % (newPath, n))

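The `foo.fits`, `foo.fits~1`, `foo.fits~2` sequence described in `backup` can be mimicked with the standard library alone. This simplified stand-in renames files within one directory rather than copying across a repository chain as the real method does:

```python
import os
import tempfile


def backupFile(path):
    """Simplified version of the backup naming scheme: shift existing
    backups up one slot (foo.fits~1 -> foo.fits~2, ...) and move the
    current file to foo.fits~1."""
    n = 0
    while os.path.exists("%s~%d" % (path, n + 1)):
        n += 1
    # Work from the highest version down so nothing is overwritten.
    for i in range(n, 0, -1):
        os.replace("%s~%d" % (path, i), "%s~%d" % (path, i + 1))
    if os.path.exists(path):
        os.replace(path, path + "~1")


with tempfile.TemporaryDirectory() as d:
    target = os.path.join(d, "foo.fits")
    for content in ("first", "second", "third"):
        with open(target, "w") as f:
            f.write(content)
        backupFile(target)
    print(sorted(os.listdir(d)))  # ['foo.fits~1', 'foo.fits~2', 'foo.fits~3']
```

As in `backup`, the most recent backup is `~1` and older versions get higher numbers.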
    def keys(self):
        """Return supported keys.

        Returns
        -------
        iterable
            List of keys usable in a dataset identifier.
        """
        return iter(self.keyDict.keys())

    def getKeys(self, datasetType, level):
        """Return a dict of supported keys and their value types for a given
        dataset type at a given level of the key hierarchy.

        Parameters
        ----------
        datasetType : `str`
            Dataset type or None for all dataset types.
        level : `str` or None
            Level or None for all levels or '' for the default level for the
            camera.

        Returns
        -------
        `dict`
            Keys are strings usable in a dataset identifier, values are their
            value types.
        """

        # not sure if this is how we want to do this. what if None was
        # intended?
        if level == '':
            level = self.getDefaultLevel()

        if datasetType is None:
            keyDict = copy.copy(self.keyDict)
        else:
            keyDict = self.mappings[datasetType].keys()
        if level is not None and level in self.levels:
            keyDict = copy.copy(keyDict)
            for lev in self.levels[level]:
                if lev in keyDict:
                    del keyDict[lev]
        return keyDict

    def getDefaultLevel(self):
        return self.defaultLevel

    def getDefaultSubLevel(self, level):
        if level in self.defaultSubLevels:
            return self.defaultSubLevels[level]
        return None

    @classmethod
    def getCameraName(cls):
        """Return the name of the camera that this CameraMapper is for."""
        className = str(cls)
        className = className[className.find('.'):-1]
        m = re.search(r'(\w+)Mapper', className)
        if m is None:
            m = re.search(r"class '[\w.]*?(\w+)'", className)
        name = m.group(1)
        return name[:1].lower() + name[1:] if name else ''
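`getCameraName` derives the camera name from the mapper's class name string, stripping the `Mapper` suffix and lower-casing the first letter. The same string manipulation can be exercised stand-alone (the `HscMapper` class string used as input here is a hypothetical example):

```python
import re


def cameraNameFromClassName(className):
    """Mimic CameraMapper.getCameraName: pull 'Hsc' out of
    "<class 'lsst.obs.hsc.HscMapper'>" and lower-case the first letter."""
    # Drop everything up to the first '.' and the trailing '>'.
    className = className[className.find('.'):-1]
    m = re.search(r'(\w+)Mapper', className)
    if m is None:
        # Fall back to the last component of the dotted class path.
        m = re.search(r"class '[\w.]*?(\w+)'", className)
    name = m.group(1)
    return name[:1].lower() + name[1:] if name else ''


print(cameraNameFromClassName("<class 'lsst.obs.hsc.HscMapper'>"))  # hsc
```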

    @classmethod
    def getPackageName(cls):
        """Return the name of the package containing this CameraMapper."""
        if cls.packageName is None:
            raise ValueError('class variable packageName must not be None')
        return cls.packageName

    @classmethod
    def getGen3Instrument(cls):
        """Return the gen3 Instrument class equivalent for this gen2 Mapper.

        Returns
        -------
        instr : `type`
            A `~lsst.obs.base.Instrument` class.
        """
        if cls._gen3instrument is None:
            raise NotImplementedError("Please provide a specific implementation for your instrument"
                                      " to enable conversion of this gen2 repository to gen3")
        if isinstance(cls._gen3instrument, str):
            # Given a string to convert to an instrument class
            cls._gen3instrument = doImport(cls._gen3instrument)
        if not issubclass(cls._gen3instrument, Instrument):
            raise ValueError(f"Mapper {cls} has declared a gen3 instrument class of {cls._gen3instrument}"
                             " but that is not an lsst.obs.base.Instrument")
        return cls._gen3instrument

    @classmethod
    def getPackageDir(cls):
        """Return the base directory of this package"""
        return getPackageDir(cls.getPackageName())
758 
    def map_camera(self, dataId, write=False):
        """Map a camera dataset."""
        if self.camera is None:
            raise RuntimeError("No camera dataset available.")
        actualId = self._transformId(dataId)
        return dafPersist.ButlerLocation(
            pythonType="lsst.afw.cameraGeom.CameraConfig",
            cppType="Config",
            storageName="ConfigStorage",
            locationList=self.cameraDataLocation or "ignored",
            dataId=actualId,
            mapper=self,
            storage=self.rootStorage
        )

    def bypass_camera(self, datasetType, pythonType, butlerLocation, dataId):
        """Return the (preloaded) camera object."""
        if self.camera is None:
            raise RuntimeError("No camera dataset available.")
        return self.camera

    def map_expIdInfo(self, dataId, write=False):
        return dafPersist.ButlerLocation(
            pythonType="lsst.obs.base.ExposureIdInfo",
            cppType=None,
            storageName="Internal",
            locationList="ignored",
            dataId=dataId,
            mapper=self,
            storage=self.rootStorage
        )

    def bypass_expIdInfo(self, datasetType, pythonType, location, dataId):
        """Hook to retrieve an lsst.obs.base.ExposureIdInfo for an exposure."""
        expId = self.bypass_ccdExposureId(datasetType, pythonType, location, dataId)
        expBits = self.bypass_ccdExposureId_bits(datasetType, pythonType, location, dataId)
        return ExposureIdInfo(expId=expId, expBits=expBits)

    def std_bfKernel(self, item, dataId):
        """Disable standardization for bfKernel.

        bfKernel is a calibration product that is a numpy array, unlike
        other calibration products, which are all images; all calibration
        images are sent through _standardizeExposure due to
        CalibrationMapping, but we don't want that to happen to bfKernel.
        """
        return item

    def std_raw(self, item, dataId):
        """Standardize a raw dataset by converting it to an Exposure instead
        of an Image."""
        return self._standardizeExposure(self.exposures['raw'], item, dataId,
                                         trimmed=False, setVisitInfo=True)

    def map_skypolicy(self, dataId):
        """Map a sky policy."""
        return dafPersist.ButlerLocation("lsst.pex.policy.Policy", "Policy",
                                         "Internal", None, None, self,
                                         storage=self.rootStorage)

    def std_skypolicy(self, item, dataId):
        """Standardize a sky policy by returning the one we use."""
        return self.skypolicy

    def _setupRegistry(self, name, description, path, policy, policyKey, storage, searchParents=True,
                       posixIfNoSql=True):
        """Set up a registry (usually SQLite3), trying a number of possible
        paths.

        Parameters
        ----------
        name : `str`
            Name of registry.
        description : `str`
            Description of registry (for log messages).
        path : `str`
            Path for registry.
        policy : `str`
            Policy that contains the registry name, used if path is None.
        policyKey : `str`
            Key in policy for registry path.
        storage : `Storage` subclass
            Repository Storage to look in.
        searchParents : `bool`, optional
            True if the search for a registry should follow any Butler v1
            _parent symlinks.
        posixIfNoSql : `bool`, optional
            If an sqlite registry is not found, will create a posix registry
            if this is True.

        Returns
        -------
        `lsst.daf.persistence.Registry`
            Registry object.
        """
        if path is None and policyKey in policy:
            path = dafPersist.LogicalLocation(policy[policyKey]).locString()
            if os.path.isabs(path):
                raise RuntimeError("Policy should not indicate an absolute path for registry.")
            if not storage.exists(path):
                newPath = storage.instanceSearch(path)

                newPath = newPath[0] if newPath is not None and len(newPath) else None
                if newPath is None:
                    self.log.warn("Unable to locate registry at policy path (also looked in root): %s",
                                  path)
                path = newPath
        else:
            self.log.warn("Unable to locate registry at policy path: %s", path)
            path = None

        # Old Butler API was to indicate the registry WITH the repo folder,
        # New Butler expects the registry to be in the repo folder. To support
        # the old API, check to see if path starts with root, and if so, strip
        # root from path. Currently only works with PosixStorage.
        try:
            root = storage.root
            if path and (path.startswith(root)):
                path = path[len(root + '/'):]
        except AttributeError:
            pass

        # Determine if there is an sqlite registry and if not, try the posix
        # registry.
        registry = None

        def search(filename, description):
            """Search for file in storage.

            Parameters
            ----------
            filename : `str`
                Filename to search for.
            description : `str`
                Description of file, for error message.

            Returns
            -------
            path : `str` or `None`
                Path to file, or None.
            """
            result = storage.instanceSearch(filename)
            if result:
                return result[0]
            self.log.debug("Unable to locate %s: %s", description, filename)
            return None

        # Search for a suitable registry database
        if path is None:
            path = search("%s.pgsql" % name, "%s in root" % description)
        if path is None:
            path = search("%s.sqlite3" % name, "%s in root" % description)
        if path is None:
            path = search(os.path.join(".", "%s.sqlite3" % name), "%s in current dir" % description)

        if path is not None:
            if not storage.exists(path):
                newPath = storage.instanceSearch(path)
                newPath = newPath[0] if newPath is not None and len(newPath) else None
                if newPath is not None:
                    path = newPath
            localFileObj = storage.getLocalFile(path)
            self.log.info("Loading %s registry from %s", description, localFileObj.name)
            registry = dafPersist.Registry.create(localFileObj.name)
            localFileObj.close()
        elif not registry and posixIfNoSql:
            try:
                self.log.info("Loading Posix %s registry from %s", description, storage.root)
                registry = dafPersist.PosixRegistry(storage.root)
            except Exception:
                registry = None

        return registry

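The fallback order used above (`<name>.pgsql`, then `<name>.sqlite3`, then `./<name>.sqlite3`) can be sketched in isolation; `find_registry` and the `exists` callable are hypothetical stand-ins for the nested `search()` helper and `storage.instanceSearch`:

```python
import os


def find_registry(name, exists):
    """Return the first registry candidate that `exists` reports present."""
    candidates = [
        "%s.pgsql" % name,
        "%s.sqlite3" % name,
        os.path.join(".", "%s.sqlite3" % name),
    ]
    for path in candidates:
        if exists(path):
            return path
    return None


present = {"registry.sqlite3"}
print(find_registry("registry", present.__contains__))  # registry.sqlite3
```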
    def _transformId(self, dataId):
        """Generate a standard ID dict from a camera-specific ID dict.

        Canonical keys include:
        - amp: amplifier name
        - ccd: CCD name (in LSST this is a combination of raft and sensor)
        The default implementation returns a copy of its input.

        Parameters
        ----------
        dataId : `dict`
            Dataset identifier; this must not be modified.

        Returns
        -------
        `dict`
            Transformed dataset identifier.
        """
        return dataId.copy()

    def _mapActualToPath(self, template, actualId):
        """Convert a template path to an actual path, using the actual data
        identifier. This implementation is usually sufficient but can be
        overridden by the subclass.

        Parameters
        ----------
        template : `str`
            Template path.
        actualId : `dict`
            Dataset identifier.

        Returns
        -------
        `str`
            Pathname.
        """
        try:
            transformedId = self._transformId(actualId)
            return template % transformedId
        except Exception as e:
            # Report actualId: transformedId may be unbound if _transformId
            # itself raised.
            raise RuntimeError("Failed to format %r with data %r: %s" % (template, actualId, e))

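The substitution above relies on printf-style mapping keys (`template % dict`); a standalone sketch, with an illustrative template rather than a real obs package policy entry:

```python
# Each %(key)format placeholder is looked up in the data-ID dict.
template = "raw/v%(visit)07d/ccd%(ccd)02d.fits"
dataId = {"visit": 903334, "ccd": 5}
path = template % dataId
print(path)  # raw/v0903334/ccd05.fits
```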
    @staticmethod
    def getShortCcdName(ccdName):
        """Convert a CCD name to a form useful as a filename.

        The default implementation converts spaces to underscores.
        """
        return ccdName.replace(" ", "_")

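The default transform above only replaces spaces with underscores, so an LSST-style raft/sensor name (the sample name here is illustrative) becomes filesystem-safe:

```python
def short_ccd_name(ccdName):
    # Mirror of the default getShortCcdName behaviour.
    return ccdName.replace(" ", "_")


print(short_ccd_name("R22 S11"))  # R22_S11
```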
    def _extractDetectorName(self, dataId):
        """Extract the detector (CCD) name from the dataset identifier.

        The name in question is the detector name used by lsst.afw.cameraGeom.

        Parameters
        ----------
        dataId : `dict`
            Dataset identifier.

        Returns
        -------
        `str`
            Detector name.
        """
        raise NotImplementedError("No _extractDetectorName() function specified")

    def _setAmpDetector(self, item, dataId, trimmed=True):
        """Set the detector object in an Exposure for an amplifier.

        Defects are also added to the Exposure based on the detector object.

        Parameters
        ----------
        item : `lsst.afw.image.Exposure`
            Exposure to set the detector in.
        dataId : `dict`
            Dataset identifier.
        trimmed : `bool`
            Should detector be marked as trimmed? (ignored)
        """
        return self._setCcdDetector(item=item, dataId=dataId, trimmed=trimmed)

    def _setCcdDetector(self, item, dataId, trimmed=True):
        """Set the detector object in an Exposure for a CCD.

        Parameters
        ----------
        item : `lsst.afw.image.Exposure`
            Exposure to set the detector in.
        dataId : `dict`
            Dataset identifier.
        trimmed : `bool`
            Should detector be marked as trimmed? (ignored)
        """
        if item.getDetector() is not None:
            return

        detectorName = self._extractDetectorName(dataId)
        detector = self.camera[detectorName]
        item.setDetector(detector)

    @staticmethod
    def _resolveFilters(definitions, idFilter, filterLabel):
        """Identify the filter(s) consistent with partial filter information.

        Parameters
        ----------
        definitions : `lsst.obs.base.FilterDefinitionCollection`
            The filter definitions in which to search for filters.
        idFilter : `str` or `None`
            The filter information provided in a data ID.
        filterLabel : `lsst.afw.image.FilterLabel` or `None`
            The filter information provided by an exposure; may be incomplete.

        Returns
        -------
        filters : `set` [`lsst.obs.base.FilterDefinition`]
            The set of filters consistent with ``idFilter``
            and ``filterLabel``.
        """
        # Assume none of the filter constraints are actually wrong or
        # contradictory. Then taking the intersection of all constraints
        # will give a unique result if one exists.
        matches = set(definitions)
        if idFilter is not None:
            matches.intersection_update(definitions.findAll(idFilter))
        if filterLabel is not None and filterLabel.hasPhysicalLabel():
            matches.intersection_update(definitions.findAll(filterLabel.physicalLabel))
        if filterLabel is not None and filterLabel.hasBandLabel():
            matches.intersection_update(definitions.findAll(filterLabel.bandLabel))
        return matches

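The intersection logic above can be sketched with plain sets; `FilterDef`, the sample definitions, and `find_all` are toy stand-ins for `FilterDefinition` and `findAll`, not lsst.obs.base classes:

```python
from collections import namedtuple

FilterDef = namedtuple("FilterDef", ["physical", "band"])

definitions = [
    FilterDef("HSC-G", "g"),
    FilterDef("HSC-R", "r"),
    FilterDef("HSC-R2", "r"),
]


def find_all(key):
    # Match either the physical or the band name, loosely like findAll().
    return {d for d in definitions if key in (d.physical, d.band)}


def resolve(id_filter=None, physical=None, band=None):
    # Each non-None constraint narrows the candidate set by intersection.
    matches = set(definitions)
    for key in (id_filter, physical, band):
        if key is not None:
            matches &= find_all(key)
    return matches


print(resolve(id_filter="r", physical="HSC-R2"))  # the single HSC-R2 definition
```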
    def _getBestFilter(self, storedLabel, idFilter):
        """Estimate the most complete filter information consistent with the
        file or registry.

        Parameters
        ----------
        storedLabel : `lsst.afw.image.FilterLabel` or `None`
            The filter previously stored in the file.
        idFilter : `str` or `None`
            The filter implied by the data ID, if any.

        Returns
        -------
        bestFilter : `lsst.afw.image.FilterLabel` or `None`
            The complete filter to describe the dataset. May be equal to
            ``storedLabel``. `None` if no recommendation can be generated.
        """
        try:
            # getGen3Instrument returns a class; need to construct it.
            filterDefinitions = self.getGen3Instrument()().filterDefinitions
        except NotImplementedError:
            filterDefinitions = None

        if filterDefinitions is not None:
            definitions = self._resolveFilters(filterDefinitions, idFilter, storedLabel)
            self.log.debug("Matching filters for id=%r and label=%r are %s.",
                           idFilter, storedLabel, definitions)
            if len(definitions) == 1:
                newLabel = list(definitions)[0].makeFilterLabel()
                return newLabel
            elif definitions:
                self.log.warn("Multiple matches for filter %r with data ID %r.", storedLabel, idFilter)
                # Can we at least add a band?
                # Never expect multiple definitions with same physical filter.
                bands = {d.band for d in definitions}  # None counts as a separate result!
                if len(bands) == 1 and storedLabel is None:
                    band = list(bands)[0]
                    return afwImage.FilterLabel(band=band)
                else:
                    return None
            else:
                # Unknown filter, nothing to be done.
                self.log.warn("Cannot reconcile filter %r with data ID %r.", storedLabel, idFilter)
                return None

        # Not practical to recommend a FilterLabel without filterDefinitions

        return None

    def _setFilter(self, mapping, item, dataId):
        """Set the filter information in an Exposure.

        The Exposure should already have had a filter loaded, but the reader
        (in ``afw``) had to act on incomplete information. This method
        cross-checks the filter against the data ID and the standard list
        of filters.

        Parameters
        ----------
        mapping : `lsst.obs.base.Mapping`
            Where to get the data ID filter from.
        item : `lsst.afw.image.Exposure`
            Exposure to set the filter in.
        dataId : `dict`
            Dataset identifier.
        """
        if not isinstance(item, (afwImage.ExposureU, afwImage.ExposureI,
                                 afwImage.ExposureF, afwImage.ExposureD)):
            return

        itemFilter = item.getFilterLabel()  # may be None
        try:
            idFilter = mapping.need(['filter'], dataId)['filter']
        except dafPersist.NoResults:
            idFilter = None

        bestFilter = self._getBestFilter(itemFilter, idFilter)
        if bestFilter is not None:
            if bestFilter != itemFilter:
                item.setFilterLabel(bestFilter)
            # Already using bestFilter, avoid unnecessary edits
        elif itemFilter is None:
            # Old Filter cleanup, without the benefit of FilterDefinition
            if self.filters is not None and idFilter in self.filters:
                idFilter = self.filters[idFilter]
            try:
                # TODO: remove in DM-27177; at that point may not be able
                # to process IDs without FilterDefinition.
                with warnings.catch_warnings():
                    warnings.filterwarnings("ignore", category=FutureWarning)
                    item.setFilter(afwImage.Filter(idFilter))
            except pexExcept.NotFoundError:
                self.log.warn("Filter %s not defined. Set to UNKNOWN.", idFilter)

    def _standardizeExposure(self, mapping, item, dataId, filter=True,
                             trimmed=True, setVisitInfo=True):
        """Default standardization function for images.

        This sets the Detector from the camera geometry and optionally sets
        the Filter. In both cases this saves having to persist some data in
        each exposure (or image).

        Parameters
        ----------
        mapping : `lsst.obs.base.Mapping`
            Where to get the values from.
        item : image-like object
            Can be any of lsst.afw.image.Exposure,
            lsst.afw.image.DecoratedImage, lsst.afw.image.Image
            or lsst.afw.image.MaskedImage.
        dataId : `dict`
            Dataset identifier.
        filter : `bool`
            Set filter? Ignored if item is already an exposure.
        trimmed : `bool`
            Should detector be marked as trimmed?
        setVisitInfo : `bool`
            Should Exposure have its VisitInfo filled out from the metadata?

        Returns
        -------
        `lsst.afw.image.Exposure`
            The standardized Exposure.
        """
        try:
            exposure = exposureFromImage(item, dataId, mapper=self, logger=self.log,
                                         setVisitInfo=setVisitInfo)
        except Exception as e:
            self.log.error("Could not turn item=%r into an exposure: %s", item, e)
            raise

        if mapping.level.lower() == "amp":
            self._setAmpDetector(exposure, dataId, trimmed)
        elif mapping.level.lower() == "ccd":
            self._setCcdDetector(exposure, dataId, trimmed)

        # We can only create a WCS if it doesn't already have one and
        # we have either a VisitInfo or exposure metadata.
        # Do not calculate a WCS if this is an amplifier exposure.
        if mapping.level.lower() != "amp" and exposure.getWcs() is None and \
                (exposure.getInfo().getVisitInfo() is not None or exposure.getMetadata().toDict()):
            self._createInitialSkyWcs(exposure)

        if filter:
            self._setFilter(mapping, exposure, dataId)

        return exposure

    def _createSkyWcsFromMetadata(self, exposure):
        """Create a SkyWcs from the FITS header metadata in an Exposure.

        Parameters
        ----------
        exposure : `lsst.afw.image.Exposure`
            The exposure to get metadata from, and attach the SkyWcs to.
        """
        metadata = exposure.getMetadata()
        fix_header(metadata, translator_class=self.translatorClass)
        try:
            wcs = afwGeom.makeSkyWcs(metadata, strip=True)
            exposure.setWcs(wcs)
        except pexExcept.TypeError as e:
            # See DM-14372 for why this is debug and not warn (e.g. calib
            # files without wcs metadata).
            self.log.debug("wcs set to None; insufficient information in metadata to create a valid"
                           " wcs: %s", e.args[0])
        # Ensure any WCS values stripped from the metadata are removed in the
        # exposure.
        exposure.setMetadata(metadata)

    def _createInitialSkyWcs(self, exposure):
        """Create a SkyWcs from the boresight and camera geometry.

        If the boresight or camera geometry do not support this method of
        WCS creation, this falls back on the header metadata-based version
        (typically a purely linear FITS crval/crpix/cdmatrix WCS).

        Parameters
        ----------
        exposure : `lsst.afw.image.Exposure`
            The exposure to get data from, and attach the SkyWcs to.
        """
        # Always try to use the metadata first, to strip WCS keys from it.
        self._createSkyWcsFromMetadata(exposure)

        if exposure.getInfo().getVisitInfo() is None:
            msg = "No VisitInfo; cannot access boresight information. Defaulting to metadata-based SkyWcs."
            self.log.warn(msg)
            return
        try:
            newSkyWcs = createInitialSkyWcs(exposure.getInfo().getVisitInfo(), exposure.getDetector())
            exposure.setWcs(newSkyWcs)
        except InitialSkyWcsError as e:
            msg = "Cannot create SkyWcs using VisitInfo and Detector, using metadata-based SkyWcs: %s"
            self.log.warn(msg, e)
            self.log.debug("Exception was: %s", traceback.TracebackException.from_exception(e))
            if e.__context__ is not None:
                self.log.debug("Root-cause Exception was: %s",
                               traceback.TracebackException.from_exception(e.__context__))

    def _makeCamera(self, policy, repositoryDir):
        """Make a camera (instance of lsst.afw.cameraGeom.Camera) describing
        the camera geometry.

        Also set self.cameraDataLocation, if relevant (else it can be left
        None).

        This implementation assumes that policy contains an entry "camera"
        that points to the subdirectory in this package of camera data;
        specifically, that subdirectory must contain:
        - a file named `camera.py` that contains persisted camera config
        - ampInfo table FITS files, as required by
          lsst.afw.cameraGeom.makeCameraFromPath

        Parameters
        ----------
        policy : `lsst.daf.persistence.Policy`
            Policy with per-camera defaults already merged
            (PexPolicy only for backward compatibility).
        repositoryDir : `str`
            Policy repository for the subclassing module (obtained with
            getRepositoryPath() on the per-camera default dictionary).
        """
        if 'camera' not in policy:
            raise RuntimeError("Cannot find 'camera' in policy; cannot construct a camera")
        cameraDataSubdir = policy['camera']
        self.cameraDataLocation = os.path.normpath(
            os.path.join(repositoryDir, cameraDataSubdir, "camera.py"))
        cameraConfig = afwCameraGeom.CameraConfig()
        cameraConfig.load(self.cameraDataLocation)
        ampInfoPath = os.path.dirname(self.cameraDataLocation)
        return afwCameraGeom.makeCameraFromPath(
            cameraConfig=cameraConfig,
            ampInfoPath=ampInfoPath,
            shortNameFunc=self.getShortCcdName,
            pupilFactoryClass=self.PupilFactoryClass
        )

    def getRegistry(self):
        """Get the registry used by this mapper.

        Returns
        -------
        `Registry` or `None`
            The registry used by this mapper for this mapper's repository.
        """
        return self.registry

    def getImageCompressionSettings(self, datasetType, dataId):
        """Stuff image compression settings into a daf.base.PropertySet.

        This goes into the ButlerLocation's "additionalData", which gets
        passed into the boost::persistence framework.

        Parameters
        ----------
        datasetType : `str`
            Type of dataset for which to get the image compression settings.
        dataId : `dict`
            Dataset identifier.

        Returns
        -------
        additionalData : `lsst.daf.base.PropertySet`
            Image compression settings.
        """
        mapping = self.mappings[datasetType]
        recipeName = mapping.recipe
        storageType = mapping.storage
        if storageType not in self._writeRecipes:
            return dafBase.PropertySet()
        if recipeName not in self._writeRecipes[storageType]:
            raise RuntimeError("Unrecognized write recipe for datasetType %s (storage type %s): %s" %
                               (datasetType, storageType, recipeName))
        recipe = self._writeRecipes[storageType][recipeName].deepCopy()
        seed = hash(tuple(dataId.items())) % 2**31
        for plane in ("image", "mask", "variance"):
            if recipe.exists(plane + ".scaling.seed") and recipe.getScalar(plane + ".scaling.seed") == 0:
                recipe.set(plane + ".scaling.seed", seed)
        return recipe

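The per-dataId seed computed above can be reproduced standalone (the sample data ID is illustrative). Note that with `PYTHONHASHSEED` unset, Python randomizes string hashing per process, so the value is only stable within one run:

```python
# Hash the (key, value) pairs of the data ID and fold into 31 bits,
# exactly as getImageCompressionSettings does for the fuzzing seed.
dataId = {"visit": 903334, "ccd": 23}
seed = hash(tuple(dataId.items())) % 2**31

# Deterministic within the process: recomputing gives the same seed.
assert seed == hash(tuple(dataId.items())) % 2**31
assert 0 <= seed < 2**31
```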
    def _initWriteRecipes(self):
        """Read the recipes for writing files.

        These recipes are currently used for configuring FITS compression,
        but they could have wider uses for configuring different flavors
        of the storage types. A recipe is referred to by a symbolic name,
        which has associated settings. These settings are stored as a
        `PropertySet` so they can easily be passed down to the
        boost::persistence framework as the "additionalData" parameter.

        The list of recipes is written in YAML. A default recipe and
        some other convenient recipes are in obs_base/policy/writeRecipes.yaml
        and these may be overridden or supplemented by the individual obs_*
        packages' own policy/writeRecipes.yaml files.

        Recipes are grouped by the storage type. Currently, only the
        ``FitsStorage`` storage type uses recipes, to configure FITS image
        compression.

        Each ``FitsStorage`` recipe for FITS compression should define
        "image", "mask" and "variance" entries, each of which may contain
        "compression" and "scaling" entries. Defaults will be provided for
        any missing elements under "compression" and "scaling".

        The allowed entries under "compression" are:

        * algorithm (string): compression algorithm to use
        * rows (int): number of rows per tile (0 = entire dimension)
        * columns (int): number of columns per tile (0 = entire dimension)
        * quantizeLevel (float): cfitsio quantization level

        The allowed entries under "scaling" are:

        * algorithm (string): scaling algorithm to use
        * bitpix (int): bits per pixel (0,8,16,32,64,-32,-64)
        * fuzz (bool): fuzz the values when quantising floating-point values?
        * seed (long): seed for random number generator when fuzzing
        * maskPlanes (list of string): mask planes to ignore when doing
          statistics
        * quantizeLevel: divisor of the standard deviation for STDEV_* scaling
        * quantizePad: number of stdev to allow on the low side (for
          STDEV_POSITIVE/NEGATIVE)
        * bscale: manually specified BSCALE (for MANUAL scaling)
        * bzero: manually specified BZERO (for MANUAL scaling)

        A very simple example YAML recipe:

            FitsStorage:
              default:
                image: &default
                  compression:
                    algorithm: GZIP_SHUFFLE
                mask: *default
                variance: *default
        """
        recipesFile = os.path.join(getPackageDir("obs_base"), "policy", "writeRecipes.yaml")
        recipes = dafPersist.Policy(recipesFile)
        supplementsFile = os.path.join(self.getPackageDir(), "policy", "writeRecipes.yaml")
        validationMenu = {'FitsStorage': validateRecipeFitsStorage, }
        if os.path.exists(supplementsFile) and supplementsFile != recipesFile:
            supplements = dafPersist.Policy(supplementsFile)
            # Don't allow overrides, only supplements
            for entry in validationMenu:
                intersection = set(recipes[entry].names()).intersection(set(supplements.names()))
                if intersection:
                    raise RuntimeError("Recipes provided in %s section %s may not override those in %s: %s" %
                                       (supplementsFile, entry, recipesFile, intersection))
            recipes.update(supplements)

        self._writeRecipes = {}
        for storageType in recipes.names(True):
            if "default" not in recipes[storageType]:
                raise RuntimeError("No 'default' recipe defined for storage type %s in %s" %
                                   (storageType, recipesFile))
            self._writeRecipes[storageType] = validationMenu[storageType](recipes[storageType])


def exposureFromImage(image, dataId=None, mapper=None, logger=None, setVisitInfo=True):
    """Generate an Exposure from an image-like object.

    If the image is a DecoratedImage then also set its metadata
    (Image and MaskedImage are missing the necessary metadata
    and Exposure already has those set).

    Parameters
    ----------
    image : image-like object
        Can be one of lsst.afw.image.DecoratedImage, Image, MaskedImage or
        Exposure.

    Returns
    -------
    `lsst.afw.image.Exposure`
        Exposure containing the input image.
    """
    translatorClass = None
    if mapper is not None:
        translatorClass = mapper.translatorClass

    metadata = None
    if isinstance(image, afwImage.MaskedImage):
        exposure = afwImage.makeExposure(image)
    elif isinstance(image, afwImage.DecoratedImage):
        exposure = afwImage.makeExposure(afwImage.makeMaskedImage(image.getImage()))
        metadata = image.getMetadata()
        fix_header(metadata, translator_class=translatorClass)
        exposure.setMetadata(metadata)
    elif isinstance(image, afwImage.Exposure):
        exposure = image
        metadata = exposure.getMetadata()
        fix_header(metadata, translator_class=translatorClass)
    else:  # Image
        exposure = afwImage.makeExposure(afwImage.makeMaskedImage(image))

    # Set VisitInfo if we can
    if setVisitInfo and exposure.getInfo().getVisitInfo() is None:
        if metadata is not None:
            if mapper is None:
                if not logger:
                    logger = lsstLog.Log.getLogger("CameraMapper")
                logger.warn("I can only set the VisitInfo if you provide a mapper")
            else:
                exposureId = mapper._computeCcdExposureId(dataId)
                visitInfo = mapper.makeRawVisitInfo(md=metadata, exposureId=exposureId)

                exposure.getInfo().setVisitInfo(visitInfo)

    return exposure


def validateRecipeFitsStorage(recipes):
    """Validate recipes for FitsStorage.

    The recipes are supplemented with default values where appropriate.

    TODO: replace this custom validation code with Cerberus (DM-11846)

    Parameters
    ----------
    recipes : `lsst.daf.persistence.Policy`
        FitsStorage recipes to validate.

    Returns
    -------
    validated : `lsst.daf.base.PropertySet`
        Validated FitsStorage recipe.

    Raises
    ------
    RuntimeError
        If validation fails.
    """
    # Schemas define what should be there, and the default values (and by the
    # default value, the expected type).
    compressionSchema = {
        "algorithm": "NONE",
        "rows": 1,
        "columns": 0,
        "quantizeLevel": 0.0,
    }
    scalingSchema = {
        "algorithm": "NONE",
        "bitpix": 0,
        "maskPlanes": ["NO_DATA"],
        "seed": 0,
        "quantizeLevel": 4.0,
        "quantizePad": 5.0,
        "fuzz": True,
        "bscale": 1.0,
        "bzero": 0.0,
    }

    def checkUnrecognized(entry, allowed, description):
        """Check to see if the entry contains unrecognized keywords."""
        unrecognized = set(entry.keys()) - set(allowed)
        if unrecognized:
            raise RuntimeError(
                "Unrecognized entries when parsing image compression recipe %s: %s" %
                (description, unrecognized))

    validated = {}
    for name in recipes.names(True):
        checkUnrecognized(recipes[name], ["image", "mask", "variance"], name)
        rr = dafBase.PropertySet()
        validated[name] = rr
        for plane in ("image", "mask", "variance"):
            checkUnrecognized(recipes[name][plane], ["compression", "scaling"],
                              name + "->" + plane)

            for settings, schema in (("compression", compressionSchema),
                                     ("scaling", scalingSchema)):
                prefix = plane + "." + settings
                if settings not in recipes[name][plane]:
                    for key in schema:
                        rr.set(prefix + "." + key, schema[key])
                    continue
                entry = recipes[name][plane][settings]
                checkUnrecognized(entry, schema.keys(), name + "->" + plane + "->" + settings)
                for key in schema:
                    value = type(schema[key])(entry[key]) if key in entry else schema[key]
                    rr.set(prefix + "." + key, value)
    return validated

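The schema-defaulting rule applied above can be sketched with plain dicts standing in for Policy/PropertySet; `apply_defaults` is an illustrative helper, not part of obs_base:

```python
# A recipe value is coerced to the schema entry's type when present,
# otherwise the schema default is taken; unknown keys are rejected.
compressionSchema = {"algorithm": "NONE", "rows": 1, "columns": 0, "quantizeLevel": 0.0}


def apply_defaults(entry, schema):
    unrecognized = set(entry) - set(schema)
    if unrecognized:
        raise RuntimeError("Unrecognized entries: %s" % unrecognized)
    return {key: type(schema[key])(entry[key]) if key in entry else schema[key]
            for key in schema}


out = apply_defaults({"algorithm": "GZIP_SHUFFLE", "rows": 0}, compressionSchema)
print(out)  # {'algorithm': 'GZIP_SHUFFLE', 'rows': 0, 'columns': 0, 'quantizeLevel': 0.0}
```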