# cameraMapper.py
# This file is part of obs_base.
#
# Developed for the LSST Data Management System.
# This product includes software developed by the LSST Project
# (https://www.lsst.org).
# See the COPYRIGHT file at the top-level directory of this distribution
# for details of code ownership.
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.

import copy
import os
import re
import traceback
import weakref

from deprecated.sphinx import deprecated

from astro_metadata_translator import fix_header
import lsst.daf.persistence as dafPersist
from . import ImageMapping, ExposureMapping, CalibrationMapping, DatasetMapping
import lsst.daf.base as dafBase
import lsst.afw.geom as afwGeom
import lsst.afw.image as afwImage
import lsst.afw.table as afwTable
from lsst.afw.fits import readMetadata
import lsst.afw.cameraGeom as afwCameraGeom
import lsst.log as lsstLog
import lsst.pex.exceptions as pexExcept
from .exposureIdInfo import ExposureIdInfo
from .makeRawVisitInfo import MakeRawVisitInfo
from .utils import createInitialSkyWcs, InitialSkyWcsError
from lsst.utils import getPackageDir

__all__ = ["CameraMapper", "exposureFromImage"]


class CameraMapper(dafPersist.Mapper):

    """CameraMapper is a base class for mappers that handle images from a
    camera and products derived from them. This provides an abstraction layer
    between the data on disk and the code.

    Public methods: keys, queryMetadata, getDatasetTypes, map,
    canStandardize, standardize

    Mappers for specific data sources (e.g., CFHT Megacam, LSST
    simulations, etc.) should inherit this class.

    The CameraMapper manages datasets within a "root" directory. Note that
    writing to a dataset present in the input root will hide the existing
    dataset but not overwrite it. See #2160 for design discussion.

    A camera is assumed to consist of one or more rafts, each composed of
    multiple CCDs. Each CCD is in turn composed of one or more amplifiers
    (amps). A camera is also assumed to have a camera geometry description
    (CameraGeom object) as a policy file and a filter description (Filter
    class static configuration) as another policy file.

    Information from the camera geometry and defects are inserted into all
    Exposure objects returned.

    The mapper uses one or two registries to retrieve metadata about the
    images. The first is a registry of all raw exposures. This must contain
    the time of the observation. One or more tables (or the equivalent)
    within the registry are used to look up data identifier components that
    are not specified by the user (e.g. filter) and to return results for
    metadata queries. The second is an optional registry of all calibration
    data. This should contain validity start and end entries for each
    calibration dataset in the same timescale as the observation time.

    Subclasses will typically set MakeRawVisitInfoClass and optionally the
    metadata translator class:

    MakeRawVisitInfoClass: a class variable that points to a subclass of
    MakeRawVisitInfo, a functor that creates an
    lsst.afw.image.VisitInfo from the FITS metadata of a raw image.

    translatorClass: The `~astro_metadata_translator.MetadataTranslator`
    class to use for fixing metadata values. If it is not set an attempt
    will be made to infer the class from ``MakeRawVisitInfoClass``; failing
    that, the metadata fixup will try to infer the translator class from the
    header itself.

    Subclasses must provide the following methods:

    _extractDetectorName(self, dataId): returns the detector name for a CCD
    (e.g., "CFHT 21", "R:1,2 S:3,4") as used in the AFW CameraGeom class given
    a dataset identifier referring to that CCD or a subcomponent of it.

    _computeCcdExposureId(self, dataId): see below

    _computeCoaddExposureId(self, dataId, singleFilter): see below

    Subclasses may also need to override the following methods:

    _transformId(self, dataId): transformation of a data identifier
    from colloquial usage (e.g., "ccdname") to proper/actual usage
    (e.g., "ccd"), including making it suitable for path expansion
    (e.g. removing commas). The default implementation does nothing. Note
    that this method should not modify its input parameter.

    getShortCcdName(self, ccdName): a static method that returns a shortened
    name suitable for use as a filename. The default version converts spaces
    to underscores.

    _mapActualToPath(self, template, actualId): convert a template path to an
    actual path, using the actual dataset identifier.

    The mapper's behaviors are largely specified by the policy file.
    See the MapperDictionary.paf for descriptions of the available items.

    The 'exposures', 'calibrations', and 'datasets' subpolicies configure
    mappings (see Mappings class).

    Common default mappings for all subclasses can be specified in the
    "policy/{images,exposures,calibrations,datasets}.yaml" files. This
    provides a simple way to add a product to all camera mappers.

    Functions to map (provide a path to the data given a dataset
    identifier dictionary) and standardize (convert data into some standard
    format or type) may be provided in the subclass as "map_{dataset type}"
    and "std_{dataset type}", respectively.

    If non-Exposure datasets cannot be retrieved using standard
    daf_persistence methods alone, a "bypass_{dataset type}" function may be
    provided in the subclass to return the dataset instead of using the
    "datasets" subpolicy.

    Implementations of map_camera and bypass_camera that should typically be
    sufficient are provided in this base class.

    Notes
    -----
    .. todo::

        Instead of auto-loading the camera at construction time, load it from
        the calibration registry

    Parameters
    ----------
    policy : daf_persistence.Policy
        Policy with per-camera defaults already merged.
    repositoryDir : string
        Policy repository for the subclassing module (obtained with
        getRepositoryPath() on the per-camera default dictionary).
    root : string, optional
        Path to the root directory for data.
    registry : string, optional
        Path to registry with data's metadata.
    calibRoot : string, optional
        Root directory for calibrations.
    calibRegistry : string, optional
        Path to registry with calibrations' metadata.
    provided : list of string, optional
        Keys provided by the mapper.
    parentRegistry : Registry subclass, optional
        Registry from a parent repository that may be used to look up
        data's metadata.
    repositoryCfg : daf_persistence.RepositoryCfg or None, optional
        The configuration information for the repository this mapper is
        being used with.
    """
    packageName = None

    # a class or subclass of MakeRawVisitInfo, a functor that makes an
    # lsst.afw.image.VisitInfo from the FITS metadata of a raw image
    MakeRawVisitInfoClass = MakeRawVisitInfo

    # a class or subclass of PupilFactory
    PupilFactoryClass = afwCameraGeom.PupilFactory

    # Class to use for metadata translations
    translatorClass = None

    def __init__(self, policy, repositoryDir,
                 root=None, registry=None, calibRoot=None, calibRegistry=None,
                 provided=None, parentRegistry=None, repositoryCfg=None):

        dafPersist.Mapper.__init__(self)

        self.log = lsstLog.Log.getLogger("CameraMapper")

        if root:
            self.root = root
        elif repositoryCfg:
            self.root = repositoryCfg.root
        else:
            self.root = None

        repoPolicy = repositoryCfg.policy if repositoryCfg else None
        if repoPolicy is not None:
            policy.update(repoPolicy)

        # Levels
        self.levels = dict()
        if 'levels' in policy:
            levelsPolicy = policy['levels']
            for key in levelsPolicy.names(True):
                self.levels[key] = set(levelsPolicy.asArray(key))
        self.defaultLevel = policy['defaultLevel']
        self.defaultSubLevels = dict()
        if 'defaultSubLevels' in policy:
            self.defaultSubLevels = policy['defaultSubLevels']

        # Root directories
        if root is None:
            root = "."
        root = dafPersist.LogicalLocation(root).locString()

        self.rootStorage = dafPersist.Storage.makeFromURI(uri=root)

        # If the calibRoot is passed in, use that. If not and it's indicated in
        # the policy, use that. And otherwise, the calibs are in the regular
        # root.
        # If the location indicated by the calib root does not exist, do not
        # create it.
        calibStorage = None
        if calibRoot is not None:
            calibRoot = dafPersist.Storage.absolutePath(root, calibRoot)
            calibStorage = dafPersist.Storage.makeFromURI(uri=calibRoot,
                                                          create=False)
        else:
            calibRoot = policy.get('calibRoot', None)
            if calibRoot:
                calibStorage = dafPersist.Storage.makeFromURI(uri=calibRoot,
                                                              create=False)
        if calibStorage is None:
            calibStorage = self.rootStorage

        self.root = root

        # Registries
        self.registry = self._setupRegistry("registry", "exposure", registry, policy, "registryPath",
                                            self.rootStorage, searchParents=False,
                                            posixIfNoSql=(not parentRegistry))
        if not self.registry:
            self.registry = parentRegistry
        needCalibRegistry = policy.get('needCalibRegistry', None)
        if needCalibRegistry:
            if calibStorage:
                self.calibRegistry = self._setupRegistry("calibRegistry", "calib", calibRegistry, policy,
                                                         "calibRegistryPath", calibStorage,
                                                         posixIfNoSql=False)  # NB never use posix for calibs
            else:
                raise RuntimeError(
                    "'needCalibRegistry' is true in Policy, but was unable to locate a repo at " +
                    "calibRoot ivar:%s or policy['calibRoot']:%s" %
                    (calibRoot, policy.get('calibRoot', None)))
        else:
            self.calibRegistry = None

        # Dict of valid keys and their value types
        self.keyDict = dict()

        self._initMappings(policy, self.rootStorage, calibStorage, provided=None)
        self._initWriteRecipes()

        # Camera geometry
        self.cameraDataLocation = None  # path to camera geometry config file
        self.camera = self._makeCamera(policy=policy, repositoryDir=repositoryDir)

        # Filter translation table
        self.filters = None

        # verify that the class variable packageName is set before attempting
        # to instantiate an instance
        if self.packageName is None:
            raise ValueError('class variable packageName must not be None')

        self.makeRawVisitInfo = self.MakeRawVisitInfoClass(log=self.log)

        # Assign a metadata translator if one has not been defined by
        # subclass. We can sometimes infer one from the RawVisitInfo
        # class.
        if self.translatorClass is None and hasattr(self.makeRawVisitInfo, "metadataTranslator"):
            self.translatorClass = self.makeRawVisitInfo.metadataTranslator

    def _initMappings(self, policy, rootStorage=None, calibStorage=None, provided=None):
        """Initialize mappings

        For each of the dataset types that we want to be able to read, there
        are methods that can be created to support them:

        * map_<dataset> : determine the path for dataset
        * std_<dataset> : standardize the retrieved dataset
        * bypass_<dataset> : retrieve the dataset (bypassing the usual
          retrieval machinery)
        * query_<dataset> : query the registry

        Besides the dataset types explicitly listed in the policy, we create
        additional, derived datasets for additional conveniences,
        e.g., reading the header of an image, retrieving only the size of a
        catalog.

        Parameters
        ----------
        policy : `lsst.daf.persistence.Policy`
            Policy with per-camera defaults already merged.
        rootStorage : `Storage subclass instance`
            Interface to persisted repository data.
        calibStorage : `Storage subclass instance`
            Interface to persisted calib repository data.
        provided : `list` of `str`
            Keys provided by the mapper.
        """
        # Sub-dictionaries (for exposure/calibration/dataset types)
        imgMappingPolicy = dafPersist.Policy(dafPersist.Policy.defaultPolicyFile(
            "obs_base", "ImageMappingDefaults.yaml", "policy"))
        expMappingPolicy = dafPersist.Policy(dafPersist.Policy.defaultPolicyFile(
            "obs_base", "ExposureMappingDefaults.yaml", "policy"))
        calMappingPolicy = dafPersist.Policy(dafPersist.Policy.defaultPolicyFile(
            "obs_base", "CalibrationMappingDefaults.yaml", "policy"))
        dsMappingPolicy = dafPersist.Policy()

        # Mappings
        mappingList = (
            ("images", imgMappingPolicy, ImageMapping),
            ("exposures", expMappingPolicy, ExposureMapping),
            ("calibrations", calMappingPolicy, CalibrationMapping),
            ("datasets", dsMappingPolicy, DatasetMapping)
        )
        self.mappings = dict()
        for name, defPolicy, cls in mappingList:
            if name in policy:
                datasets = policy[name]

                # Centrally-defined datasets
                defaultsPath = os.path.join(getPackageDir("obs_base"), "policy", name + ".yaml")
                if os.path.exists(defaultsPath):
                    datasets.merge(dafPersist.Policy(defaultsPath))

                mappings = dict()
                setattr(self, name, mappings)
                for datasetType in datasets.names(True):
                    subPolicy = datasets[datasetType]
                    subPolicy.merge(defPolicy)

                    if not hasattr(self, "map_" + datasetType) and 'composite' in subPolicy:
                        def compositeClosure(dataId, write=False, mapper=None, mapping=None,
                                             subPolicy=subPolicy):
                            components = subPolicy.get('composite')
                            assembler = subPolicy['assembler'] if 'assembler' in subPolicy else None
                            disassembler = subPolicy['disassembler'] if 'disassembler' in subPolicy else None
                            python = subPolicy['python']
                            butlerComposite = dafPersist.ButlerComposite(assembler=assembler,
                                                                         disassembler=disassembler,
                                                                         python=python,
                                                                         dataId=dataId,
                                                                         mapper=self)
                            for name, component in components.items():
                                butlerComposite.add(id=name,
                                                    datasetType=component.get('datasetType'),
                                                    setter=component.get('setter', None),
                                                    getter=component.get('getter', None),
                                                    subset=component.get('subset', False),
                                                    inputOnly=component.get('inputOnly', False))
                            return butlerComposite
                        setattr(self, "map_" + datasetType, compositeClosure)
                        # for now at least, don't set up any other handling for this dataset type.
                        continue

                    if name == "calibrations":
                        mapping = cls(datasetType, subPolicy, self.registry, self.calibRegistry, calibStorage,
                                      provided=provided, dataRoot=rootStorage)
                    else:
                        mapping = cls(datasetType, subPolicy, self.registry, rootStorage, provided=provided)

                    if datasetType in self.mappings:
                        raise ValueError(f"Duplicate mapping policy for dataset type {datasetType}")
                    self.keyDict.update(mapping.keys())
                    mappings[datasetType] = mapping
                    self.mappings[datasetType] = mapping
                    if not hasattr(self, "map_" + datasetType):
                        def mapClosure(dataId, write=False, mapper=weakref.proxy(self), mapping=mapping):
                            return mapping.map(mapper, dataId, write)
                        setattr(self, "map_" + datasetType, mapClosure)
                    if not hasattr(self, "query_" + datasetType):
                        def queryClosure(format, dataId, mapping=mapping):
                            return mapping.lookup(format, dataId)
                        setattr(self, "query_" + datasetType, queryClosure)
                    if hasattr(mapping, "standardize") and not hasattr(self, "std_" + datasetType):
                        def stdClosure(item, dataId, mapper=weakref.proxy(self), mapping=mapping):
                            return mapping.standardize(mapper, item, dataId)
                        setattr(self, "std_" + datasetType, stdClosure)

                    def setMethods(suffix, mapImpl=None, bypassImpl=None, queryImpl=None):
                        """Set convenience methods on CameraMapper"""
                        mapName = "map_" + datasetType + "_" + suffix
                        bypassName = "bypass_" + datasetType + "_" + suffix
                        queryName = "query_" + datasetType + "_" + suffix
                        if not hasattr(self, mapName):
                            setattr(self, mapName, mapImpl or getattr(self, "map_" + datasetType))
                        if not hasattr(self, bypassName):
                            if bypassImpl is None and hasattr(self, "bypass_" + datasetType):
                                bypassImpl = getattr(self, "bypass_" + datasetType)
                            if bypassImpl is not None:
                                setattr(self, bypassName, bypassImpl)
                        if not hasattr(self, queryName):
                            setattr(self, queryName, queryImpl or getattr(self, "query_" + datasetType))

                    # Filename of dataset
                    setMethods("filename", bypassImpl=lambda datasetType, pythonType, location, dataId:
                               [os.path.join(location.getStorage().root, p) for p in location.getLocations()])
                    # Metadata from FITS file
                    if subPolicy["storage"] == "FitsStorage":  # a FITS image
                        def getMetadata(datasetType, pythonType, location, dataId):
                            md = readMetadata(location.getLocationsWithRoot()[0])
                            fix_header(md, translator_class=self.translatorClass)
                            return md

                        setMethods("md", bypassImpl=getMetadata)

                        # Add support for configuring FITS compression
                        addName = "add_" + datasetType
                        if not hasattr(self, addName):
                            setattr(self, addName, self.getImageCompressionSettings)

                        if name == "exposures":
                            def getSkyWcs(datasetType, pythonType, location, dataId):
                                fitsReader = afwImage.ExposureFitsReader(location.getLocationsWithRoot()[0])
                                return fitsReader.readWcs()

                            setMethods("wcs", bypassImpl=getSkyWcs)

                            def getRawHeaderWcs(datasetType, pythonType, location, dataId):
                                """Create a SkyWcs from the un-modified raw FITS WCS header keys."""
                                if datasetType[:3] != "raw":
                                    raise dafPersist.NoResults("Can only get header WCS for raw exposures.",
                                                               datasetType, dataId)
                                return afwGeom.makeSkyWcs(readMetadata(location.getLocationsWithRoot()[0]))

                            setMethods("header_wcs", bypassImpl=getRawHeaderWcs)

                            def getPhotoCalib(datasetType, pythonType, location, dataId):
                                fitsReader = afwImage.ExposureFitsReader(location.getLocationsWithRoot()[0])
                                return fitsReader.readPhotoCalib()

                            setMethods("photoCalib", bypassImpl=getPhotoCalib)

                            def getVisitInfo(datasetType, pythonType, location, dataId):
                                fitsReader = afwImage.ExposureFitsReader(location.getLocationsWithRoot()[0])
                                return fitsReader.readVisitInfo()

                            setMethods("visitInfo", bypassImpl=getVisitInfo)

                            def getFilter(datasetType, pythonType, location, dataId):
                                fitsReader = afwImage.ExposureFitsReader(location.getLocationsWithRoot()[0])
                                return fitsReader.readFilter()

                            setMethods("filter", bypassImpl=getFilter)

                            setMethods("detector",
                                       mapImpl=lambda dataId, write=False:
                                       dafPersist.ButlerLocation(
                                           pythonType="lsst.afw.cameraGeom.CameraConfig",
                                           cppType="Config",
                                           storageName="Internal",
                                           locationList="ignored",
                                           dataId=dataId,
                                           mapper=self,
                                           storage=None,
                                       ),
                                       bypassImpl=lambda datasetType, pythonType, location, dataId:
                                       self.camera[self._extractDetectorName(dataId)]
                                       )

                            def getBBox(datasetType, pythonType, location, dataId):
                                md = readMetadata(location.getLocationsWithRoot()[0], hdu=1)
                                fix_header(md, translator_class=self.translatorClass)
                                return afwImage.bboxFromMetadata(md)

                            setMethods("bbox", bypassImpl=getBBox)

                        elif name == "images":
                            def getBBox(datasetType, pythonType, location, dataId):
                                md = readMetadata(location.getLocationsWithRoot()[0])
                                fix_header(md, translator_class=self.translatorClass)
                                return afwImage.bboxFromMetadata(md)
                            setMethods("bbox", bypassImpl=getBBox)

                    if subPolicy["storage"] == "FitsCatalogStorage":  # a FITS catalog

                        def getMetadata(datasetType, pythonType, location, dataId):
                            md = readMetadata(os.path.join(location.getStorage().root,
                                                           location.getLocations()[0]), hdu=1)
                            fix_header(md, translator_class=self.translatorClass)
                            return md

                        setMethods("md", bypassImpl=getMetadata)

                    # Sub-images
                    if subPolicy["storage"] == "FitsStorage":
                        def mapSubClosure(dataId, write=False, mapper=weakref.proxy(self), mapping=mapping):
                            subId = dataId.copy()
                            del subId['bbox']
                            loc = mapping.map(mapper, subId, write)
                            bbox = dataId['bbox']
                            llcX = bbox.getMinX()
                            llcY = bbox.getMinY()
                            width = bbox.getWidth()
                            height = bbox.getHeight()
                            loc.additionalData.set('llcX', llcX)
                            loc.additionalData.set('llcY', llcY)
                            loc.additionalData.set('width', width)
                            loc.additionalData.set('height', height)
                            if 'imageOrigin' in dataId:
                                loc.additionalData.set('imageOrigin',
                                                       dataId['imageOrigin'])
                            return loc

                        def querySubClosure(key, format, dataId, mapping=mapping):
                            subId = dataId.copy()
                            del subId['bbox']
                            return mapping.lookup(format, subId)
                        setMethods("sub", mapImpl=mapSubClosure, queryImpl=querySubClosure)

                    if subPolicy["storage"] == "FitsCatalogStorage":
                        # Length of catalog

                        def getLen(datasetType, pythonType, location, dataId):
                            md = readMetadata(os.path.join(location.getStorage().root,
                                                           location.getLocations()[0]), hdu=1)
                            fix_header(md, translator_class=self.translatorClass)
                            return md["NAXIS2"]

                        setMethods("len", bypassImpl=getLen)

                        # Schema of catalog
                        if not datasetType.endswith("_schema") and datasetType + "_schema" not in datasets:
                            setMethods("schema", bypassImpl=lambda datasetType, pythonType, location, dataId:
                                       afwTable.Schema.readFits(os.path.join(location.getStorage().root,
                                                                             location.getLocations()[0])))

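The per-dataset closures registered above (`mapClosure`, `queryClosure`, `stdClosure`) all bind the loop's current `mapping` through a default argument (`mapping=mapping`). A minimal, stack-independent sketch of why that default is needed (toy functions only, nothing from the LSST stack):

```python
# Sketch of the closure idiom used in _initMappings: a default argument
# freezes the loop variable's value per iteration, while a plain closure
# sees only the variable's final value (late binding).
def make_closures_wrong(values):
    # Late binding: every closure refers to the same variable `v`.
    return [lambda: v for v in values]


def make_closures_right(values):
    # Default argument: each closure keeps its own copy of `v`.
    return [lambda v=v: v for v in values]


wrong = [f() for f in make_closures_wrong([1, 2, 3])]
right = [f() for f in make_closures_right([1, 2, 3])]
# wrong == [3, 3, 3]; right == [1, 2, 3]
```

Without the `mapping=mapping` default, every generated `map_<dataset>` method would silently use the last mapping created by the loop.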
    def _computeCcdExposureId(self, dataId):
        """Compute the 64-bit (long) identifier for a CCD exposure.

        Subclasses must override.

        Parameters
        ----------
        dataId : `dict`
            Data identifier with visit, ccd.
        """
        raise NotImplementedError()

    def _computeCoaddExposureId(self, dataId, singleFilter):
        """Compute the 64-bit (long) identifier for a coadd.

        Subclasses must override.

        Parameters
        ----------
        dataId : `dict`
            Data identifier with tract and patch.
        singleFilter : `bool`
            True means the desired ID is for a single-filter coadd, in which
            case ``dataId`` must contain filter.
        """
        raise NotImplementedError()

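Subclasses commonly implement these ID methods by packing the data-identifier fields into disjoint bit ranges of a single 64-bit integer. A hypothetical sketch of the idea (the field width and key names here are illustrative only, not any real camera's scheme):

```python
# Hypothetical bit-packing scheme: high bits carry the visit number,
# low bits carry the CCD number. Real mappers choose their own widths.
CCD_BITS = 6  # illustrative width, enough for 64 CCDs


def compute_ccd_exposure_id(dataId):
    """Pack visit and ccd into one integer (toy version of the hook)."""
    return (dataId["visit"] << CCD_BITS) + dataId["ccd"]


packed = compute_ccd_exposure_id({"visit": 1234, "ccd": 5})
# The pieces can be recovered by shifting and masking.
visit = packed >> CCD_BITS
ccd = packed & ((1 << CCD_BITS) - 1)
```

The corresponding `bypass_ccdExposureId_bits` hook would then report the total number of significant bits so downstream code can reserve ID space safely.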
    def _search(self, path):
        """Search for path in the associated repository's storage.

        Parameters
        ----------
        path : string
            Path that describes an object in the repository associated with
            this mapper.
            Path may contain an HDU indicator, e.g. 'foo.fits[1]'. The
            indicator will be stripped when searching and so will match
            filenames without the HDU indicator, e.g. 'foo.fits'. The path
            returned WILL contain the indicator though, e.g. ['foo.fits[1]'].

        Returns
        -------
        string
            The path for this object in the repository. Will return None if the
            object can't be found. If the input argument path contained an HDU
            indicator, the returned path will also contain the HDU indicator.
        """
        return self.rootStorage.search(path)

    def backup(self, datasetType, dataId):
        """Rename any existing object with the given type and dataId.

        The CameraMapper implementation saves objects in a sequence of e.g.:

        - foo.fits
        - foo.fits~1
        - foo.fits~2

        All of the backups will be placed in the output repo, however, and will
        not be removed if they are found elsewhere in the _parent chain. This
        means that the same file will be stored twice if the previous version
        was found in an input repo.
        """

        # Calling PosixStorage directly is not the long term solution in this
        # function, this is work-in-progress on epic DM-6225. The plan is for
        # parentSearch to be changed to 'search', and search only the storage
        # associated with this mapper. All searching of parents will be handled
        # by traversing the container of repositories in Butler.

        def firstElement(list):
            """Get the first element in the list, or None if that can't be
            done.
            """
            return list[0] if list is not None and len(list) else None

        n = 0
        newLocation = self.map(datasetType, dataId, write=True)
        newPath = newLocation.getLocations()[0]
        path = dafPersist.PosixStorage.search(self.root, newPath, searchParents=True)
        path = firstElement(path)
        oldPaths = []
        while path is not None:
            n += 1
            oldPaths.append((n, path))
            path = dafPersist.PosixStorage.search(self.root, "%s~%d" % (newPath, n), searchParents=True)
            path = firstElement(path)
        for n, oldPath in reversed(oldPaths):
            self.rootStorage.copyFile(oldPath, "%s~%d" % (newPath, n))

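The `foo.fits`, `foo.fits~1`, `foo.fits~2` naming used by `backup` can be sketched independently of the storage classes (the helper name below is hypothetical, not part of the mapper API):

```python
# Sketch of the backup naming scheme: the live file plus n numbered
# backups, where a larger suffix means an older generation was displaced
# more recently.
def backup_names(path, n):
    return [path] + ["%s~%d" % (path, i) for i in range(1, n + 1)]


names = backup_names("foo.fits", 2)
# names == ["foo.fits", "foo.fits~1", "foo.fits~2"]
```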
    def keys(self):
        """Return supported keys.

        Returns
        -------
        iterable
            List of keys usable in a dataset identifier
        """
        return iter(self.keyDict.keys())

    def getKeys(self, datasetType, level):
        """Return a dict of supported keys and their value types for a given
        dataset type at a given level of the key hierarchy.

        Parameters
        ----------
        datasetType : `str`
            Dataset type or None for all dataset types.
        level : `str` or None
            Level or None for all levels or '' for the default level for the
            camera.

        Returns
        -------
        `dict`
            Keys are strings usable in a dataset identifier, values are their
            value types.
        """

        # not sure if this is how we want to do this. what if None was intended?
        if level == '':
            level = self.getDefaultLevel()

        if datasetType is None:
            keyDict = copy.copy(self.keyDict)
        else:
            keyDict = self.mappings[datasetType].keys()
        if level is not None and level in self.levels:
            keyDict = copy.copy(keyDict)
            for lev in self.levels[level]:
                if lev in keyDict:
                    del keyDict[lev]
        return keyDict

    def getDefaultLevel(self):
        return self.defaultLevel

    def getDefaultSubLevel(self, level):
        if level in self.defaultSubLevels:
            return self.defaultSubLevels[level]
        return None

    @classmethod
    def getCameraName(cls):
        """Return the name of the camera that this CameraMapper is for."""
        className = str(cls)
        className = className[className.find('.'):-1]
        m = re.search(r'(\w+)Mapper', className)
        if m is None:
            m = re.search(r"class '[\w.]*?(\w+)'", className)
        name = m.group(1)
        return name[:1].lower() + name[1:] if name else ''

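The name derivation in `getCameraName` can be exercised with plain strings; this sketch mirrors the regex logic outside the class (the class strings shown are illustrative examples):

```python
import re


# Mirror of getCameraName's core logic: find "<Name>Mapper" in the class's
# string representation and lower-case the first letter of <Name>.
def camera_name(class_str):
    m = re.search(r'(\w+)Mapper', class_str)
    if m is None:
        # Fallback used when the class name does not end in "Mapper".
        m = re.search(r"class '[\w.]*?(\w+)'", class_str)
    name = m.group(1)
    return name[:1].lower() + name[1:] if name else ''


name = camera_name("<class 'lsst.obs.hsc.HscMapper'>")
# name == "hsc"
```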
    @classmethod
    def getPackageName(cls):
        """Return the name of the package containing this CameraMapper."""
        if cls.packageName is None:
            raise ValueError('class variable packageName must not be None')
        return cls.packageName

    @classmethod
    def getPackageDir(cls):
        """Return the base directory of this package"""
        return getPackageDir(cls.getPackageName())

    def map_camera(self, dataId, write=False):
        """Map a camera dataset."""
        if self.camera is None:
            raise RuntimeError("No camera dataset available.")
        actualId = self._transformId(dataId)
        return dafPersist.ButlerLocation(
            pythonType="lsst.afw.cameraGeom.CameraConfig",
            cppType="Config",
            storageName="ConfigStorage",
            locationList=self.cameraDataLocation or "ignored",
            dataId=actualId,
            mapper=self,
            storage=self.rootStorage
        )

    def bypass_camera(self, datasetType, pythonType, butlerLocation, dataId):
        """Return the (preloaded) camera object.
        """
        if self.camera is None:
            raise RuntimeError("No camera dataset available.")
        return self.camera

    def map_expIdInfo(self, dataId, write=False):
        return dafPersist.ButlerLocation(
            pythonType="lsst.obs.base.ExposureIdInfo",
            cppType=None,
            storageName="Internal",
            locationList="ignored",
            dataId=dataId,
            mapper=self,
            storage=self.rootStorage
        )

    def bypass_expIdInfo(self, datasetType, pythonType, location, dataId):
        """Hook to retrieve an lsst.obs.base.ExposureIdInfo for an exposure"""
        expId = self.bypass_ccdExposureId(datasetType, pythonType, location, dataId)
        expBits = self.bypass_ccdExposureId_bits(datasetType, pythonType, location, dataId)
        return ExposureIdInfo(expId=expId, expBits=expBits)

    def std_bfKernel(self, item, dataId):
        """Disable standardization for bfKernel

        bfKernel is a calibration product that is a numpy array,
        unlike other calibration products that are all images;
        all calibration images are sent through _standardizeExposure
        due to CalibrationMapping, but we don't want that to happen to bfKernel
        """
        return item

    def std_raw(self, item, dataId):
        """Standardize a raw dataset by converting it to an Exposure instead
        of an Image"""
        return self._standardizeExposure(self.exposures['raw'], item, dataId,
                                         trimmed=False, setVisitInfo=True)

    def map_skypolicy(self, dataId):
        """Map a sky policy."""
        return dafPersist.ButlerLocation("lsst.pex.policy.Policy", "Policy",
                                         "Internal", None, None, self,
                                         storage=self.rootStorage)

    def std_skypolicy(self, item, dataId):
        """Standardize a sky policy by returning the one we use."""
        return self.skypolicy

781  def _setupRegistry(self, name, description, path, policy, policyKey, storage, searchParents=True,
782  posixIfNoSql=True):
783  """Set up a registry (usually SQLite3), trying a number of possible
784  paths.
785 
786  Parameters
787  ----------
788  name : string
789  Name of registry.
790  description: `str`
791  Description of registry (for log messages)
792  path : string
793  Path for registry.
794  policy : string
795  Policy that contains the registry name, used if path is None.
796  policyKey : string
797  Key in policy for registry path.
798  storage : Storage subclass
799  Repository Storage to look in.
800  searchParents : bool, optional
801  True if the search for a registry should follow any Butler v1
802  _parent symlinks.
803  posixIfNoSql : bool, optional
804  If an sqlite registry is not found, will create a posix registry if
805  this is True.
806 
807  Returns
808  -------
809  lsst.daf.persistence.Registry
810  Registry object
811  """
812  if path is None and policyKey in policy:
813  path = dafPersist.LogicalLocation(policy[policyKey]).locString()
814  if os.path.isabs(path):
815  raise RuntimeError("Policy should not indicate an absolute path for registry.")
816  if not storage.exists(path):
817  newPath = storage.instanceSearch(path)
818 
819  newPath = newPath[0] if newPath is not None and len(newPath) else None
820  if newPath is None:
821  self.log.warn("Unable to locate registry at policy path (also looked in root): %s",
822  path)
823  path = newPath
824  else:
825  self.log.warn("Unable to locate registry at policy path: %s", path)
826  path = None
827 
828  # The old Butler API included the repo folder in the registry path; the new Butler
829  # expects the registry to be inside the repo folder. To support the old API, strip
830  # root from the path if it starts with root. This currently only works with PosixStorage.
831  try:
832  root = storage.root
833  if path and (path.startswith(root)):
834  path = path[len(root + '/'):]
835  except AttributeError:
836  pass
837 
838  # determine if there is an sqlite registry and if not, try the posix registry.
839  registry = None
840 
841  def search(filename, description):
842  """Search for file in storage
843 
844  Parameters
845  ----------
846  filename : `str`
847  Filename to search for
848  description : `str`
849  Description of file, for error message.
850 
851  Returns
852  -------
853  path : `str` or `None`
854  Path to file, or None
855  """
856  result = storage.instanceSearch(filename)
857  if result:
858  return result[0]
859  self.log.debug("Unable to locate %s: %s", description, filename)
860  return None
861 
862  # Search for a suitable registry database
863  if path is None:
864  path = search("%s.pgsql" % name, "%s in root" % description)
865  if path is None:
866  path = search("%s.sqlite3" % name, "%s in root" % description)
867  if path is None:
868  path = search(os.path.join(".", "%s.sqlite3" % name), "%s in current dir" % description)
869 
870  if path is not None:
871  if not storage.exists(path):
872  newPath = storage.instanceSearch(path)
873  newPath = newPath[0] if newPath is not None and len(newPath) else None
874  if newPath is not None:
875  path = newPath
876  localFileObj = storage.getLocalFile(path)
877  self.log.info("Loading %s registry from %s", description, localFileObj.name)
878  registry = dafPersist.Registry.create(localFileObj.name)
879  localFileObj.close()
880  elif not registry and posixIfNoSql:
881  try:
882  self.log.info("Loading Posix %s registry from %s", description, storage.root)
883  registry = dafPersist.PosixRegistry(storage.root)
884  except Exception:
885  registry = None
886 
887  return registry
888 
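The fallback order implemented above can be illustrated in isolation. This is a minimal sketch, not the real Storage API: `exists` is a hypothetical stand-in for `storage.instanceSearch`, and the candidate names mirror the `%s.pgsql` / `%s.sqlite3` patterns tried by `_setupRegistry`:

```python
import os

def find_registry(name, exists):
    """Mimic _setupRegistry's search order: a PostgreSQL dump, then an
    SQLite registry in the repo root, then one in the current directory."""
    for candidate in ("%s.pgsql" % name,
                      "%s.sqlite3" % name,
                      os.path.join(".", "%s.sqlite3" % name)):
        if exists(candidate):
            return candidate
    return None  # _setupRegistry would then fall back to a PosixRegistry

# A repository that only ships an SQLite registry:
present = {"registry.sqlite3"}
print(find_registry("registry", lambda p: p in present))  # registry.sqlite3
```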
889  def _transformId(self, dataId):
890  """Generate a standard ID dict from a camera-specific ID dict.
891 
892  Canonical keys include:
893  - amp: amplifier name
894  - ccd: CCD name (in LSST this is a combination of raft and sensor)
895  The default implementation returns a copy of its input.
896 
897  Parameters
898  ----------
899  dataId : `dict`
900  Dataset identifier; this must not be modified
901 
902  Returns
903  -------
904  `dict`
905  Transformed dataset identifier.
906  """
907 
908  return dataId.copy()
909 
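The default implementation is a plain copy; a camera package would override it to translate its own keys into the canonical ones. A hypothetical sketch (the `raft`/`sensor` keys and the joining convention are illustrative, not from any real mapper):

```python
def transform_id(data_id):
    """Hypothetical _transformId override: derive the canonical 'ccd'
    key from camera-specific raft/sensor keys, without mutating input."""
    out = data_id.copy()
    if "raft" in out and "sensor" in out:
        out["ccd"] = "%s_%s" % (out["raft"], out["sensor"])
    return out

original = {"raft": "R22", "sensor": "S11", "visit": 42}
print(transform_id(original)["ccd"])  # R22_S11
print("ccd" in original)              # False -- the input is not modified
```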
910  def _mapActualToPath(self, template, actualId):
911  """Convert a template path to an actual path, using the actual data
912  identifier. This implementation is usually sufficient but can be
913  overridden by the subclass.
914 
915  Parameters
916  ----------
917  template : `str`
918  Template path
919  actualId : `dict`
920  Dataset identifier
921 
922  Returns
923  -------
924  `str`
925  Pathname
926  """
927 
928  try:
929  transformedId = self._transformId(actualId)
930  return template % transformedId
931  except Exception as e:
932  raise RuntimeError("Failed to format %r with data %r: %s" % (template, actualId, e))
933 
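The substitution itself is plain %-formatting against the transformed identifier. A sketch with an illustrative template (not from a real policy file); a key missing from the identifier raises `KeyError`, which the method above converts into a `RuntimeError`:

```python
# Illustrative template and identifier, not taken from a real policy.
template = "raw/v%(visit)d/%(ccd)s.fits"
data_id = {"visit": 123, "ccd": "1_1"}
print(template % data_id)  # raw/v123/1_1.fits
```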
934  @staticmethod
935  def getShortCcdName(ccdName):
936  """Convert a CCD name to a form useful as a filename
937 
938  The default implementation converts spaces to underscores.
939  """
940  return ccdName.replace(" ", "_")
941 
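For example, with an LSST-style detector name containing spaces:

```python
# Same transformation as the default getShortCcdName: spaces become underscores.
print("R:2,2 S:1,1".replace(" ", "_"))  # R:2,2_S:1,1
```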
942  def _extractDetectorName(self, dataId):
943  """Extract the detector (CCD) name from the dataset identifier.
944 
945  The name in question is the detector name used by lsst.afw.cameraGeom.
946 
947  Parameters
948  ----------
949  dataId : `dict`
950  Dataset identifier.
951 
952  Returns
953  -------
954  `str`
955  Detector name
956  """
957  raise NotImplementedError("No _extractDetectorName() function specified")
958 
959  @deprecated("This method is no longer used for ISR (will be removed after v11)", category=FutureWarning)
960  def _extractAmpId(self, dataId):
961  """Extract the amplifier identifer from a dataset identifier.
962 
963  .. note:: Deprecated in 11_0
964 
965  amplifier identifier has two parts: the detector name for the CCD
966  containing the amplifier and index of the amplifier in the detector.
967 
968  Parameters
969  ----------
970  dataId : `dict`
971  Dataset identifier
972 
973  Returns
974  -------
975  `tuple`
976  Amplifier identifier
977  """
978 
979  trDataId = self._transformId(dataId)
980  return (trDataId["ccd"], int(trDataId['amp']))
981 
982  def _setAmpDetector(self, item, dataId, trimmed=True):
983  """Set the detector object in an Exposure for an amplifier.
984 
985  Defects are also added to the Exposure based on the detector object.
986 
987  Parameters
988  ----------
989  item : `lsst.afw.image.Exposure`
990  Exposure to set the detector in.
991  dataId : `dict`
992  Dataset identifier
993  trimmed : `bool`
994  Should detector be marked as trimmed? (ignored)
995  """
996 
997  return self._setCcdDetector(item=item, dataId=dataId, trimmed=trimmed)
998 
999  def _setCcdDetector(self, item, dataId, trimmed=True):
1000  """Set the detector object in an Exposure for a CCD.
1001 
1002  Parameters
1003  ----------
1004  item : `lsst.afw.image.Exposure`
1005  Exposure to set the detector in.
1006  dataId : `dict`
1007  Dataset identifier
1008  trimmed : `bool`
1009  Should detector be marked as trimmed? (ignored)
1010  """
1011  if item.getDetector() is not None:
1012  return
1013 
1014  detectorName = self._extractDetectorName(dataId)
1015  detector = self.camera[detectorName]
1016  item.setDetector(detector)
1017 
1018  def _setFilter(self, mapping, item, dataId):
1019  """Set the filter object in an Exposure. If the Exposure had a FILTER
1020  keyword, this was already processed during load. But if it didn't,
1021  use the filter from the registry.
1022 
1023  Parameters
1024  ----------
1025  mapping : `lsst.obs.base.Mapping`
1026  Where to get the filter from.
1027  item : `lsst.afw.image.Exposure`
1028  Exposure to set the filter in.
1029  dataId : `dict`
1030  Dataset identifier.
1031  """
1032 
1033  if not isinstance(item, (afwImage.ExposureU, afwImage.ExposureI,
1034  afwImage.ExposureF, afwImage.ExposureD)):
1035  return
1036 
1037  if item.getFilter().getId() != afwImage.Filter.UNKNOWN:
1038  return
1039 
1040  actualId = mapping.need(['filter'], dataId)
1041  filterName = actualId['filter']
1042  if self.filters is not None and filterName in self.filters:
1043  filterName = self.filters[filterName]
1044  try:
1045  item.setFilter(afwImage.Filter(filterName))
1046  except pexExcept.NotFoundError:
1047  self.log.warn("Filter %s not defined. Set to UNKNOWN.", filterName)
1048 
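The alias lookup above is a plain dictionary indirection. A self-contained sketch (the alias table is hypothetical; in the mapper, `self.filters` holds the real mapping):

```python
def resolve_filter(name, aliases):
    """Map a camera-specific filter name to its canonical name, as
    _setFilter does before constructing the afw Filter object."""
    if aliases is not None and name in aliases:
        return aliases[name]
    return name

aliases = {"r2": "r"}                 # hypothetical alias table
print(resolve_filter("r2", aliases))  # r
print(resolve_filter("g", aliases))   # g -- unmapped names pass through
```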
1049  def _standardizeExposure(self, mapping, item, dataId, filter=True,
1050  trimmed=True, setVisitInfo=True):
1051  """Default standardization function for images.
1052 
1053  This sets the Detector from the camera geometry
1054  and optionally sets the Filter. In both cases this saves
1055  having to persist some data in each exposure (or image).
1056 
1057  Parameters
1058  ----------
1059  mapping : `lsst.obs.base.Mapping`
1060  Where to get the values from.
1061  item : image-like object
1062  Can be any of lsst.afw.image.Exposure,
1063  lsst.afw.image.DecoratedImage, lsst.afw.image.Image
1064  or lsst.afw.image.MaskedImage
1065 
1066  dataId : `dict`
1067  Dataset identifier
1068  filter : `bool`
1069  Set filter? Ignored if item is already an exposure
1070  trimmed : `bool`
1071  Should detector be marked as trimmed?
1072  setVisitInfo : `bool`
1073  Should Exposure have its VisitInfo filled out from the metadata?
1074 
1075  Returns
1076  -------
1077  `lsst.afw.image.Exposure`
1078  The standardized Exposure.
1079  """
1080  try:
1081  exposure = exposureFromImage(item, dataId, mapper=self, logger=self.log,
1082  setVisitInfo=setVisitInfo)
1083  except Exception as e:
1084  self.log.error("Could not turn item=%r into an exposure: %s", item, e)
1085  raise
1086 
1087  if mapping.level.lower() == "amp":
1088  self._setAmpDetector(exposure, dataId, trimmed)
1089  elif mapping.level.lower() == "ccd":
1090  self._setCcdDetector(exposure, dataId, trimmed)
1091 
1092  # We can only create a WCS if it doesn't already have one and
1093  # we have either a VisitInfo or exposure metadata.
1094  # Do not calculate a WCS if this is an amplifier exposure
1095  if mapping.level.lower() != "amp" and exposure.getWcs() is None and \
1096  (exposure.getInfo().getVisitInfo() is not None or exposure.getMetadata().toDict()):
1097  self._createInitialSkyWcs(exposure)
1098 
1099  if filter:
1100  self._setFilter(mapping, exposure, dataId)
1101 
1102  return exposure
1103 
1104  def _createSkyWcsFromMetadata(self, exposure):
1105  """Create a SkyWcs from the FITS header metadata in an Exposure.
1106 
1107  Parameters
1108  ----------
1109  exposure : `lsst.afw.image.Exposure`
1110  The exposure to get metadata from, and attach the SkyWcs to.
1111  """
1112  metadata = exposure.getMetadata()
1113  try:
1114  wcs = afwGeom.makeSkyWcs(metadata, strip=True)
1115  exposure.setWcs(wcs)
1116  except pexExcept.TypeError as e:
1117  # See DM-14372 for why this is debug and not warn (e.g. calib files without wcs metadata).
1118  self.log.debug("wcs set to None; missing information found in metadata to create a valid wcs:"
1119  " %s", e.args[0])
1120  # ensure any WCS values stripped from the metadata are removed in the exposure
1121  exposure.setMetadata(metadata)
1122 
1123  def _createInitialSkyWcs(self, exposure):
1124  """Create a SkyWcs from the boresight and camera geometry.
1125 
1126  If the boresight or camera geometry do not support this method of
1127  WCS creation, this falls back on the header metadata-based version
1128  (typically a purely linear FITS crval/crpix/cdmatrix WCS).
1129 
1130  Parameters
1131  ----------
1132  exposure : `lsst.afw.image.Exposure`
1133  The exposure to get data from, and attach the SkyWcs to.
1134  """
1135  # Always try to use the metadata first, so that WCS keys are stripped from it.
1136  self._createSkyWcsFromMetadata(exposure)
1137 
1138  if exposure.getInfo().getVisitInfo() is None:
1139  msg = "No VisitInfo; cannot access boresight information. Defaulting to metadata-based SkyWcs."
1140  self.log.warn(msg)
1141  return
1142  try:
1143  newSkyWcs = createInitialSkyWcs(exposure.getInfo().getVisitInfo(), exposure.getDetector())
1144  exposure.setWcs(newSkyWcs)
1145  except InitialSkyWcsError as e:
1146  msg = "Cannot create SkyWcs using VisitInfo and Detector, using metadata-based SkyWcs: %s"
1147  self.log.warn(msg, e)
1148  self.log.debug("Exception was: %s", traceback.TracebackException.from_exception(e))
1149  if e.__context__ is not None:
1150  self.log.debug("Root-cause Exception was: %s",
1151  traceback.TracebackException.from_exception(e.__context__))
1152 
1153  def _makeCamera(self, policy, repositoryDir):
1154  """Make a camera (instance of lsst.afw.cameraGeom.Camera) describing
1155  the camera geometry
1156 
1157  Also set self.cameraDataLocation, if relevant (else it can be left
1158  None).
1159 
1160  This implementation assumes that policy contains an entry "camera"
1161  that points to the subdirectory in this package of camera data;
1162  specifically, that subdirectory must contain:
1163  - a file named `camera.py` that contains persisted camera config
1164  - ampInfo table FITS files, as required by
1165  lsst.afw.cameraGeom.makeCameraFromPath
1166 
1167  Parameters
1168  ----------
1169  policy : `lsst.daf.persistence.Policy`
1170  Policy with per-camera defaults already merged
1171  (PexPolicy only for backward compatibility).
1172  repositoryDir : `str`
1173  Policy repository for the subclassing module (obtained with
1174  getRepositoryPath() on the per-camera default dictionary).
1175  """
1176  if 'camera' not in policy:
1177  raise RuntimeError("Cannot find 'camera' in policy; cannot construct a camera")
1178  cameraDataSubdir = policy['camera']
1179  self.cameraDataLocation = os.path.normpath(
1180  os.path.join(repositoryDir, cameraDataSubdir, "camera.py"))
1181  cameraConfig = afwCameraGeom.CameraConfig()
1182  cameraConfig.load(self.cameraDataLocation)
1183  ampInfoPath = os.path.dirname(self.cameraDataLocation)
1184  return afwCameraGeom.makeCameraFromPath(
1185  cameraConfig=cameraConfig,
1186  ampInfoPath=ampInfoPath,
1187  shortNameFunc=self.getShortCcdName,
1188  pupilFactoryClass=self.PupilFactoryClass
1189  )
1190 
1191  def getRegistry(self):
1192  """Get the registry used by this mapper.
1193 
1194  Returns
1195  -------
1196  Registry or None
1197  The registry used by this mapper for this mapper's repository.
1198  """
1199  return self.registry
1200 
1201  def getImageCompressionSettings(self, datasetType, dataId):
1202  """Stuff image compression settings into a daf.base.PropertySet
1203 
1204  This goes into the ButlerLocation's "additionalData", which gets
1205  passed into the boost::persistence framework.
1206 
1207  Parameters
1208  ----------
1209  datasetType : `str`
1210  Type of dataset for which to get the image compression settings.
1211  dataId : `dict`
1212  Dataset identifier.
1213 
1214  Returns
1215  -------
1216  additionalData : `lsst.daf.base.PropertySet`
1217  Image compression settings.
1218  """
1219  mapping = self.mappings[datasetType]
1220  recipeName = mapping.recipe
1221  storageType = mapping.storage
1222  if storageType not in self._writeRecipes:
1223  return dafBase.PropertySet()
1224  if recipeName not in self._writeRecipes[storageType]:
1225  raise RuntimeError("Unrecognized write recipe for datasetType %s (storage type %s): %s" %
1226  (datasetType, storageType, recipeName))
1227  recipe = self._writeRecipes[storageType][recipeName].deepCopy()
1228  seed = hash(tuple(dataId.items())) % 2**31
1229  for plane in ("image", "mask", "variance"):
1230  if recipe.exists(plane + ".scaling.seed") and recipe.getScalar(plane + ".scaling.seed") == 0:
1231  recipe.set(plane + ".scaling.seed", seed)
1232  return recipe
1233 
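The per-dataset seed above is derived from the identifier so that compression fuzzing is reproducible for a given dataId within one process. A sketch of that computation (note that `hash` of strings is salted per interpreter run unless `PYTHONHASHSEED` is fixed, so the value is not stable across runs):

```python
def compression_seed(data_id):
    """Same derivation as getImageCompressionSettings: hash the dataId
    items and fold the result into a positive 31-bit integer."""
    return hash(tuple(data_id.items())) % 2**31

seed = compression_seed({"visit": 123, "ccd": "1_1"})
print(0 <= seed < 2**31)  # True: always a valid positive seed
```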
1234  def _initWriteRecipes(self):
1235  """Read the recipes for writing files
1236 
1237  These recipes are currently used for configuring FITS compression,
1238  but they could have wider uses for configuring different flavors
1239  of the storage types. A recipe is referred to by a symbolic name,
1240  which has associated settings. These settings are stored as a
1241  `PropertySet` so they can easily be passed down to the
1242  boost::persistence framework as the "additionalData" parameter.
1243 
1244  The list of recipes is written in YAML. A default recipe and
1245  some other convenient recipes are in obs_base/policy/writeRecipes.yaml
1246  and these may be overridden or supplemented by the individual obs_*
1247  packages' own policy/writeRecipes.yaml files.
1248 
1249  Recipes are grouped by the storage type. Currently only the
1250  ``FitsStorage`` storage type uses recipes, which it uses to
1251  configure FITS image compression.
1252 
1253  Each ``FitsStorage`` recipe for FITS compression should define
1254  "image", "mask" and "variance" entries, each of which may contain
1255  "compression" and "scaling" entries. Defaults will be provided for
1256  any missing elements under "compression" and "scaling".
1257 
1258  The allowed entries under "compression" are:
1259 
1260  * algorithm (string): compression algorithm to use
1261  * rows (int): number of rows per tile (0 = entire dimension)
1262  * columns (int): number of columns per tile (0 = entire dimension)
1263  * quantizeLevel (float): cfitsio quantization level
1264 
1265  The allowed entries under "scaling" are:
1266 
1267  * algorithm (string): scaling algorithm to use
1268  * bitpix (int): bits per pixel (0,8,16,32,64,-32,-64)
1269  * fuzz (bool): fuzz the values when quantizing floating-point values?
1270  * seed (long): seed for random number generator when fuzzing
1271  * maskPlanes (list of string): mask planes to ignore when doing
1272  statistics
1273  * quantizeLevel: divisor of the standard deviation for STDEV_* scaling
1274  * quantizePad: number of stdev to allow on the low side (for
1275  STDEV_POSITIVE/NEGATIVE)
1276  * bscale: manually specified BSCALE (for MANUAL scaling)
1277  * bzero: manually specified BZERO (for MANUAL scaling)
1278 
1279  A very simple example YAML recipe:
1280 
1281  FitsStorage:
1282  default:
1283  image: &default
1284  compression:
1285  algorithm: GZIP_SHUFFLE
1286  mask: *default
1287  variance: *default
1288  """
1289  recipesFile = os.path.join(getPackageDir("obs_base"), "policy", "writeRecipes.yaml")
1290  recipes = dafPersist.Policy(recipesFile)
1291  supplementsFile = os.path.join(self.getPackageDir(), "policy", "writeRecipes.yaml")
1292  validationMenu = {'FitsStorage': validateRecipeFitsStorage, }
1293  if os.path.exists(supplementsFile) and supplementsFile != recipesFile:
1294  supplements = dafPersist.Policy(supplementsFile)
1295  # Don't allow overrides, only supplements
1296  for entry in validationMenu:
1297  intersection = set(recipes[entry].names()).intersection(set(supplements.names()))
1298  if intersection:
1299  raise RuntimeError("Recipes provided in %s section %s may not override those in %s: %s" %
1300  (supplementsFile, entry, recipesFile, intersection))
1301  recipes.update(supplements)
1302 
1303  self._writeRecipes = {}
1304  for storageType in recipes.names(True):
1305  if "default" not in recipes[storageType]:
1306  raise RuntimeError("No 'default' recipe defined for storage type %s in %s" %
1307  (storageType, recipesFile))
1308  self._writeRecipes[storageType] = validationMenu[storageType](recipes[storageType])
1309 
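The supplement-versus-override check above is a set intersection over recipe names. A minimal sketch with hypothetical recipe names:

```python
base = {"default", "lossless"}       # recipe names shipped with obs_base (illustrative)
supplement = {"lossless", "custom"}  # names from an obs_* package (illustrative)
clash = base & supplement
if clash:
    # _initWriteRecipes raises RuntimeError here: a package may add new
    # recipes, but must not silently replace the shipped ones.
    print("override rejected:", sorted(clash))  # override rejected: ['lossless']
```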
1310 
1311 def exposureFromImage(image, dataId=None, mapper=None, logger=None, setVisitInfo=True):
1312  """Generate an Exposure from an image-like object
1313 
1314  If the image is a DecoratedImage then also set its WCS and metadata
1315  (Image and MaskedImage are missing the necessary metadata
1316  and Exposure already has those set)
1317 
1318  Parameters
1319  ----------
1320  image : Image-like object
1321  Can be one of lsst.afw.image.DecoratedImage, Image, MaskedImage or
1322  Exposure.
1323 
1324  Returns
1325  -------
1326  `lsst.afw.image.Exposure`
1327  Exposure containing input image.
1328  """
1329  metadata = None
1330  if isinstance(image, afwImage.MaskedImage):
1331  exposure = afwImage.makeExposure(image)
1332  elif isinstance(image, afwImage.DecoratedImage):
1333  exposure = afwImage.makeExposure(afwImage.makeMaskedImage(image.getImage()))
1334  metadata = image.getMetadata()
1335  exposure.setMetadata(metadata)
1336  elif isinstance(image, afwImage.Exposure):
1337  exposure = image
1338  metadata = exposure.getMetadata()
1339  else: # Image
1340  exposure = afwImage.makeExposure(afwImage.makeMaskedImage(image))
1341 
1342  # set VisitInfo if we can
1343  if setVisitInfo and exposure.getInfo().getVisitInfo() is None:
1344  if metadata is not None:
1345  if mapper is None:
1346  if not logger:
1347  logger = lsstLog.Log.getLogger("CameraMapper")
1348  logger.warn("I can only set the VisitInfo if you provide a mapper")
1349  else:
1350  exposureId = mapper._computeCcdExposureId(dataId)
1351  visitInfo = mapper.makeRawVisitInfo(md=metadata, exposureId=exposureId)
1352 
1353  exposure.getInfo().setVisitInfo(visitInfo)
1354 
1355  return exposure
1356 
1357 
1358 def validateRecipeFitsStorage(recipes):
1359  """Validate recipes for FitsStorage
1360 
1361  The recipes are supplemented with default values where appropriate.
1362 
1363  TODO: replace this custom validation code with Cerberus (DM-11846)
1364 
1365  Parameters
1366  ----------
1367  recipes : `lsst.daf.persistence.Policy`
1368  FitsStorage recipes to validate.
1369 
1370  Returns
1371  -------
1372  validated : `lsst.daf.base.PropertySet`
1373  Validated FitsStorage recipe.
1374 
1375  Raises
1376  ------
1377  `RuntimeError`
1378  If validation fails.
1379  """
1380  # Schemas define what should be there, and the default values (and by the default
1381  # value, the expected type).
1382  compressionSchema = {
1383  "algorithm": "NONE",
1384  "rows": 1,
1385  "columns": 0,
1386  "quantizeLevel": 0.0,
1387  }
1388  scalingSchema = {
1389  "algorithm": "NONE",
1390  "bitpix": 0,
1391  "maskPlanes": ["NO_DATA"],
1392  "seed": 0,
1393  "quantizeLevel": 4.0,
1394  "quantizePad": 5.0,
1395  "fuzz": True,
1396  "bscale": 1.0,
1397  "bzero": 0.0,
1398  }
1399 
1400  def checkUnrecognized(entry, allowed, description):
1401  """Check to see if the entry contains unrecognised keywords"""
1402  unrecognized = set(entry.keys()) - set(allowed)
1403  if unrecognized:
1404  raise RuntimeError(
1405  "Unrecognized entries when parsing image compression recipe %s: %s" %
1406  (description, unrecognized))
1407 
1408  validated = {}
1409  for name in recipes.names(True):
1410  checkUnrecognized(recipes[name], ["image", "mask", "variance"], name)
1411  rr = dafBase.PropertySet()
1412  validated[name] = rr
1413  for plane in ("image", "mask", "variance"):
1414  checkUnrecognized(recipes[name][plane], ["compression", "scaling"],
1415  name + "->" + plane)
1416 
1417  for settings, schema in (("compression", compressionSchema),
1418  ("scaling", scalingSchema)):
1419  prefix = plane + "." + settings
1420  if settings not in recipes[name][plane]:
1421  for key in schema:
1422  rr.set(prefix + "." + key, schema[key])
1423  continue
1424  entry = recipes[name][plane][settings]
1425  checkUnrecognized(entry, schema.keys(), name + "->" + plane + "->" + settings)
1426  for key in schema:
1427  value = type(schema[key])(entry[key]) if key in entry else schema[key]
1428  rr.set(prefix + "." + key, value)
1429  return validated
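The default-filling loop at the end can be expressed over plain dicts. A sketch of the same merge-and-coerce behavior (dict-based, whereas the real code writes into a `PropertySet`):

```python
def apply_schema(entry, schema):
    """Merge a recipe entry with schema defaults, coercing each supplied
    value to the type of its default, as validateRecipeFitsStorage does."""
    return {key: type(schema[key])(entry[key]) if key in entry else schema[key]
            for key in schema}

compression_schema = {"algorithm": "NONE", "rows": 1, "columns": 0, "quantizeLevel": 0.0}
merged = apply_schema({"algorithm": "GZIP_SHUFFLE", "rows": "0"}, compression_schema)
print(merged["rows"])           # 0 -- the string "0" was coerced to int
print(merged["quantizeLevel"])  # 0.0 -- default filled in
```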