cameraMapper.py
1 #
2 # LSST Data Management System
3 # Copyright 2008, 2009, 2010 LSST Corporation.
4 #
5 # This product includes software developed by the
6 # LSST Project (http://www.lsst.org/).
7 #
8 # This program is free software: you can redistribute it and/or modify
9 # it under the terms of the GNU General Public License as published by
10 # the Free Software Foundation, either version 3 of the License, or
11 # (at your option) any later version.
12 #
13 # This program is distributed in the hope that it will be useful,
14 # but WITHOUT ANY WARRANTY; without even the implied warranty of
15 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
16 # GNU General Public License for more details.
17 #
18 # You should have received a copy of the LSST License Statement and
19 # the GNU General Public License along with this program. If not,
20 # see <http://www.lsstcorp.org/LegalNotices/>.
21 #
22 
23 import copy
24 import os
25 import re
26 import traceback
27 import weakref
28 
29 from deprecated.sphinx import deprecated
30 
31 from astro_metadata_translator import fix_header
32 import lsst.daf.persistence as dafPersist
33 from . import ImageMapping, ExposureMapping, CalibrationMapping, DatasetMapping
34 import lsst.daf.base as dafBase
35 import lsst.afw.geom as afwGeom
36 import lsst.afw.image as afwImage
37 import lsst.afw.table as afwTable
38 from lsst.afw.fits import readMetadata
39 import lsst.afw.cameraGeom as afwCameraGeom
40 import lsst.log as lsstLog
41 import lsst.pex.exceptions as pexExcept
42 from .exposureIdInfo import ExposureIdInfo
43 from .makeRawVisitInfo import MakeRawVisitInfo
44 from .utils import createInitialSkyWcs, InitialSkyWcsError
45 from lsst.utils import getPackageDir
46 
47 __all__ = ["CameraMapper", "exposureFromImage"]
48 
49 
50 class CameraMapper(dafPersist.Mapper):
51 
52  """CameraMapper is a base class for mappers that handle images from a
53  camera and products derived from them. This provides an abstraction layer
54  between the data on disk and the code.
55 
56  Public methods: keys, queryMetadata, getDatasetTypes, map,
57  canStandardize, standardize
58 
59  Mappers for specific data sources (e.g., CFHT Megacam, LSST
60  simulations, etc.) should inherit this class.
61 
62  The CameraMapper manages datasets within a "root" directory. Note that
63  writing to a dataset present in the input root will hide the existing
64  dataset but not overwrite it. See #2160 for design discussion.
65 
66  A camera is assumed to consist of one or more rafts, each composed of
67  multiple CCDs. Each CCD is in turn composed of one or more amplifiers
68  (amps). A camera is also assumed to have a camera geometry description
69  (CameraGeom object) as a policy file, and a filter description (Filter class
70  static configuration) as another policy file.
71 
72  Information from the camera geometry and defects are inserted into all
73  Exposure objects returned.
74 
75  The mapper uses one or two registries to retrieve metadata about the
76  images. The first is a registry of all raw exposures. This must contain
77  the time of the observation. One or more tables (or the equivalent)
78  within the registry are used to look up data identifier components that
79  are not specified by the user (e.g. filter) and to return results for
80  metadata queries. The second is an optional registry of all calibration
81  data. This should contain validity start and end entries for each
82  calibration dataset in the same timescale as the observation time.
83 
84  Subclasses will typically set MakeRawVisitInfoClass and optionally the
85  metadata translator class:
86 
87  MakeRawVisitInfoClass: a class variable that points to a subclass of
88  MakeRawVisitInfo, a functor that creates an
89  lsst.afw.image.VisitInfo from the FITS metadata of a raw image.
90 
91  translatorClass: The `~astro_metadata_translator.MetadataTranslator`
92  class to use for fixing metadata values. If it is not set, an attempt
93  will be made to infer the class from ``MakeRawVisitInfoClass``; failing
94  that, the metadata fixup will try to infer the translator class from
95  the header itself.
96 
97  Subclasses must provide the following methods:
98 
99  _extractDetectorName(self, dataId): returns the detector name for a CCD
100  (e.g., "CFHT 21", "R:1,2 S:3,4") as used in the AFW CameraGeom class given
101  a dataset identifier referring to that CCD or a subcomponent of it.
102 
103  _computeCcdExposureId(self, dataId): see below
104 
105  _computeCoaddExposureId(self, dataId, singleFilter): see below
106 
107  Subclasses may also need to override the following methods:
108 
109  _transformId(self, dataId): transformation of a data identifier
110  from colloquial usage (e.g., "ccdname") to proper/actual usage
111  (e.g., "ccd"), including making suitable for path expansion (e.g. removing
112  commas). The default implementation does nothing. Note that this
113  method should not modify its input parameter.
114 
115  getShortCcdName(self, ccdName): a static method that returns a shortened
116  name suitable for use as a filename. The default version converts spaces
117  to underscores.
118 
119  _mapActualToPath(self, template, actualId): convert a template path to an
120  actual path, using the actual dataset identifier.
121 
122  The mapper's behaviors are largely specified by the policy file.
123  See the MapperDictionary.paf for descriptions of the available items.
124 
125  The 'exposures', 'calibrations', and 'datasets' subpolicies configure
126  mappings (see Mappings class).
127 
128  Common default mappings for all subclasses can be specified in the
129  "policy/{images,exposures,calibrations,datasets}.yaml" files. This
130  provides a simple way to add a product to all camera mappers.
131 
132  Functions to map (provide a path to the data given a dataset
133  identifier dictionary) and standardize (convert data into some standard
134  format or type) may be provided in the subclass as "map_{dataset type}"
135  and "std_{dataset type}", respectively.
136 
137  If non-Exposure datasets cannot be retrieved using standard
138  daf_persistence methods alone, a "bypass_{dataset type}" function may be
139  provided in the subclass to return the dataset instead of using the
140  "datasets" subpolicy.
141 
142  Implementations of map_camera and bypass_camera that should typically be
143  sufficient are provided in this base class.
144 
145  Notes
146  -----
147  .. todo::
148 
149  Instead of auto-loading the camera at construction time, load it from
150  the calibration registry
151 
152  Parameters
153  ----------
154  policy : daf_persistence.Policy
155  Policy with per-camera defaults already merged.
156  repositoryDir : string
157  Policy repository for the subclassing module (obtained with
158  getRepositoryPath() on the per-camera default dictionary).
159  root : string, optional
160  Path to the root directory for data.
161  registry : string, optional
162  Path to registry with data's metadata.
163  calibRoot : string, optional
164  Root directory for calibrations.
165  calibRegistry : string, optional
166  Path to registry with calibrations' metadata.
167  provided : list of string, optional
168  Keys provided by the mapper.
169  parentRegistry : Registry subclass, optional
170  Registry from a parent repository that may be used to look up
171  data's metadata.
172  repositoryCfg : daf_persistence.RepositoryCfg or None, optional
173  The configuration information for the repository this mapper is
174  being used with.
175  """
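The subclass contract described in the docstring above can be sketched without any LSST imports. This is a hypothetical illustration only: the camera name "MyCam", the raft/sensor encoding, and the id-packing scheme are invented, not taken from a real `obs_*` package.

```python
# Hypothetical sketch of the methods a CameraMapper subclass must provide;
# "MyCam" and its packing/naming schemes are invented for illustration.
class MyCamMapperSketch:
    packageName = "obs_mycam"  # must be set before instantiation (see __init__)

    def _extractDetectorName(self, dataId):
        # Turn a numeric ccd id into a CameraGeom-style name, e.g. 34 -> "R:3,4"
        raft, sensor = divmod(dataId["ccd"], 10)
        return "R:%d,%d" % (raft, sensor)

    def _computeCcdExposureId(self, dataId):
        # Pack visit and ccd into one 64-bit integer (8 bits for the ccd here)
        return dataId["visit"] * 256 + dataId["ccd"]
```

A real subclass would inherit from `CameraMapper` and gain the `map_*`/`std_*`/`query_*` machinery built in `_initMappings`.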
176  packageName = None
177 
178  # a class or subclass of MakeRawVisitInfo, a functor that makes an
179  # lsst.afw.image.VisitInfo from the FITS metadata of a raw image
180  MakeRawVisitInfoClass = MakeRawVisitInfo
181 
182  # a class or subclass of PupilFactory
183  PupilFactoryClass = afwCameraGeom.PupilFactory
184 
185  # Class to use for metadata translations
186  translatorClass = None
187 
188  def __init__(self, policy, repositoryDir,
189  root=None, registry=None, calibRoot=None, calibRegistry=None,
190  provided=None, parentRegistry=None, repositoryCfg=None):
191 
192  dafPersist.Mapper.__init__(self)
193 
194  self.log = lsstLog.Log.getLogger("CameraMapper")
195 
196  if root:
197  self.root = root
198  elif repositoryCfg:
199  self.root = repositoryCfg.root
200  else:
201  self.root = None
202 
203  repoPolicy = repositoryCfg.policy if repositoryCfg else None
204  if repoPolicy is not None:
205  policy.update(repoPolicy)
206 
207  # Levels
208  self.levels = dict()
209  if 'levels' in policy:
210  levelsPolicy = policy['levels']
211  for key in levelsPolicy.names(True):
212  self.levels[key] = set(levelsPolicy.asArray(key))
213  self.defaultLevel = policy['defaultLevel']
214  self.defaultSubLevels = dict()
215  if 'defaultSubLevels' in policy:
216  self.defaultSubLevels = policy['defaultSubLevels']
217 
218  # Root directories
219  if root is None:
220  root = "."
221  root = dafPersist.LogicalLocation(root).locString()
222 
223  self.rootStorage = dafPersist.Storage.makeFromURI(uri=root)
224 
225  # If calibRoot is passed in, use that. If not, and it is indicated in
226  # the policy, use that. Otherwise, the calibs are in the regular
227  # root.
228  # If the location indicated by the calib root does not exist, do not
229  # create it.
230  calibStorage = None
231  if calibRoot is not None:
232  calibRoot = dafPersist.Storage.absolutePath(root, calibRoot)
233  calibStorage = dafPersist.Storage.makeFromURI(uri=calibRoot,
234  create=False)
235  else:
236  calibRoot = policy.get('calibRoot', None)
237  if calibRoot:
238  calibStorage = dafPersist.Storage.makeFromURI(uri=calibRoot,
239  create=False)
240  if calibStorage is None:
241  calibStorage = self.rootStorage
242 
243  self.root = root
244 
245  # Registries
246  self.registry = self._setupRegistry("registry", "exposure", registry, policy, "registryPath",
247  self.rootStorage, searchParents=False,
248  posixIfNoSql=(not parentRegistry))
249  if not self.registry:
250  self.registry = parentRegistry
251  needCalibRegistry = policy.get('needCalibRegistry', None)
252  if needCalibRegistry:
253  if calibStorage:
254  self.calibRegistry = self._setupRegistry("calibRegistry", "calib", calibRegistry, policy,
255  "calibRegistryPath", calibStorage,
256  posixIfNoSql=False) # NB never use posix for calibs
257  else:
258  raise RuntimeError(
259  "'needCalibRegistry' is true in Policy, but was unable to locate a repo at " +
260  "calibRoot ivar:%s or policy['calibRoot']:%s" %
261  (calibRoot, policy.get('calibRoot', None)))
262  else:
263  self.calibRegistry = None
264 
265  # Dict of valid keys and their value types
266  self.keyDict = dict()
267 
268  self._initMappings(policy, self.rootStorage, calibStorage, provided=None)
269  self._initWriteRecipes()
270 
271  # Camera geometry
272  self.cameraDataLocation = None # path to camera geometry config file
273  self.camera = self._makeCamera(policy=policy, repositoryDir=repositoryDir)
274 
275  # Filter translation table
276  self.filters = None
277 
278  # verify that the class variable packageName is set before attempting
279  # to instantiate an instance
280  if self.packageName is None:
281  raise ValueError('class variable packageName must not be None')
282 
283  self.makeRawVisitInfo = self.MakeRawVisitInfoClass(log=self.log)
284 
285  # Assign a metadata translator if one has not been defined by
286  # subclass. We can sometimes infer one from the RawVisitInfo
287  # class.
288  if self.translatorClass is None and hasattr(self.makeRawVisitInfo, "metadataTranslator"):
289  self.translatorClass = self.makeRawVisitInfo.metadataTranslator
290 
291  def _initMappings(self, policy, rootStorage=None, calibStorage=None, provided=None):
292  """Initialize mappings
293 
294  For each of the dataset types that we want to be able to read, there
295  are methods that can be created to support them:
296  * map_<dataset> : determine the path for dataset
297  * std_<dataset> : standardize the retrieved dataset
298  * bypass_<dataset> : retrieve the dataset (bypassing the usual
299  retrieval machinery)
300  * query_<dataset> : query the registry
301 
302  Besides the dataset types explicitly listed in the policy, we create
303  additional, derived datasets for additional conveniences,
304  e.g., reading the header of an image, retrieving only the size of a
305  catalog.
306 
307  Parameters
308  ----------
309  policy : `lsst.daf.persistence.Policy`
310  Policy with per-camera defaults already merged
311  rootStorage : `Storage subclass instance`
312  Interface to persisted repository data.
313  calibStorage : `Storage subclass instance`
314  Interface to persisted calib repository data.
315  provided : `list` of `str`
316  Keys provided by the mapper
317  """
318  # Sub-dictionaries (for exposure/calibration/dataset types)
319  imgMappingPolicy = dafPersist.Policy(dafPersist.Policy.defaultPolicyFile(
320  "obs_base", "ImageMappingDefaults.yaml", "policy"))
321  expMappingPolicy = dafPersist.Policy(dafPersist.Policy.defaultPolicyFile(
322  "obs_base", "ExposureMappingDefaults.yaml", "policy"))
323  calMappingPolicy = dafPersist.Policy(dafPersist.Policy.defaultPolicyFile(
324  "obs_base", "CalibrationMappingDefaults.yaml", "policy"))
325  dsMappingPolicy = dafPersist.Policy()
326 
327  # Mappings
328  mappingList = (
329  ("images", imgMappingPolicy, ImageMapping),
330  ("exposures", expMappingPolicy, ExposureMapping),
331  ("calibrations", calMappingPolicy, CalibrationMapping),
332  ("datasets", dsMappingPolicy, DatasetMapping)
333  )
334  self.mappings = dict()
335  for name, defPolicy, cls in mappingList:
336  if name in policy:
337  datasets = policy[name]
338 
339  # Centrally-defined datasets
340  defaultsPath = os.path.join(getPackageDir("obs_base"), "policy", name + ".yaml")
341  if os.path.exists(defaultsPath):
342  datasets.merge(dafPersist.Policy(defaultsPath))
343 
344  mappings = dict()
345  setattr(self, name, mappings)
346  for datasetType in datasets.names(True):
347  subPolicy = datasets[datasetType]
348  subPolicy.merge(defPolicy)
349 
350  if not hasattr(self, "map_" + datasetType) and 'composite' in subPolicy:
351  def compositeClosure(dataId, write=False, mapper=None, mapping=None,
352  subPolicy=subPolicy):
353  components = subPolicy.get('composite')
354  assembler = subPolicy['assembler'] if 'assembler' in subPolicy else None
355  disassembler = subPolicy['disassembler'] if 'disassembler' in subPolicy else None
356  python = subPolicy['python']
357  butlerComposite = dafPersist.ButlerComposite(assembler=assembler,
358  disassembler=disassembler,
359  python=python,
360  dataId=dataId,
361  mapper=self)
362  for name, component in components.items():
363  butlerComposite.add(id=name,
364  datasetType=component.get('datasetType'),
365  setter=component.get('setter', None),
366  getter=component.get('getter', None),
367  subset=component.get('subset', False),
368  inputOnly=component.get('inputOnly', False))
369  return butlerComposite
370  setattr(self, "map_" + datasetType, compositeClosure)
371  # for now at least, don't set up any other handling for this dataset type.
372  continue
373 
374  if name == "calibrations":
375  mapping = cls(datasetType, subPolicy, self.registry, self.calibRegistry, calibStorage,
376  provided=provided, dataRoot=rootStorage)
377  else:
378  mapping = cls(datasetType, subPolicy, self.registry, rootStorage, provided=provided)
379 
380  if datasetType in self.mappings:
381  raise ValueError(f"Duplicate mapping policy for dataset type {datasetType}")
382  self.keyDict.update(mapping.keys())
383  mappings[datasetType] = mapping
384  self.mappings[datasetType] = mapping
385  if not hasattr(self, "map_" + datasetType):
386  def mapClosure(dataId, write=False, mapper=weakref.proxy(self), mapping=mapping):
387  return mapping.map(mapper, dataId, write)
388  setattr(self, "map_" + datasetType, mapClosure)
389  if not hasattr(self, "query_" + datasetType):
390  def queryClosure(format, dataId, mapping=mapping):
391  return mapping.lookup(format, dataId)
392  setattr(self, "query_" + datasetType, queryClosure)
393  if hasattr(mapping, "standardize") and not hasattr(self, "std_" + datasetType):
394  def stdClosure(item, dataId, mapper=weakref.proxy(self), mapping=mapping):
395  return mapping.standardize(mapper, item, dataId)
396  setattr(self, "std_" + datasetType, stdClosure)
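The map/query/std closures above all freeze the loop's current `mapping` (or a weakref proxy of `self`) through a default argument. A minimal stand-alone illustration of that pattern, with toy dataset types instead of real mappings:

```python
# Default arguments bind per-iteration values; a closure over the bare loop
# variable would see only the *last* dataset type once the loop has finished.
methods = {}
for datasetType in ("raw", "calexp"):
    def mapClosure(dataId, datasetType=datasetType):  # freezes this iteration
        return "%s-%06d.fits" % (datasetType, dataId["visit"])
    methods["map_" + datasetType] = mapClosure
```

The `setattr(self, "map_" + datasetType, mapClosure)` calls above use the same trick to hang one method per dataset type off the mapper instance.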
397 
398  def setMethods(suffix, mapImpl=None, bypassImpl=None, queryImpl=None):
399  """Set convenience methods on CameraMapper"""
400  mapName = "map_" + datasetType + "_" + suffix
401  bypassName = "bypass_" + datasetType + "_" + suffix
402  queryName = "query_" + datasetType + "_" + suffix
403  if not hasattr(self, mapName):
404  setattr(self, mapName, mapImpl or getattr(self, "map_" + datasetType))
405  if not hasattr(self, bypassName):
406  if bypassImpl is None and hasattr(self, "bypass_" + datasetType):
407  bypassImpl = getattr(self, "bypass_" + datasetType)
408  if bypassImpl is not None:
409  setattr(self, bypassName, bypassImpl)
410  if not hasattr(self, queryName):
411  setattr(self, queryName, queryImpl or getattr(self, "query_" + datasetType))
412 
413  # Filename of dataset
414  setMethods("filename", bypassImpl=lambda datasetType, pythonType, location, dataId:
415  [os.path.join(location.getStorage().root, p) for p in location.getLocations()])
416  # Metadata from FITS file
417  if subPolicy["storage"] == "FitsStorage": # a FITS image
418  def getMetadata(datasetType, pythonType, location, dataId):
419  md = readMetadata(location.getLocationsWithRoot()[0])
420  fix_header(md, translator_class=self.translatorClass)
421  return md
422 
423  setMethods("md", bypassImpl=getMetadata)
424 
425  # Add support for configuring FITS compression
426  addName = "add_" + datasetType
427  if not hasattr(self, addName):
428  setattr(self, addName, self.getImageCompressionSettings)
429 
430  if name == "exposures":
431  def getSkyWcs(datasetType, pythonType, location, dataId):
432  fitsReader = afwImage.ExposureFitsReader(location.getLocationsWithRoot()[0])
433  return fitsReader.readWcs()
434 
435  setMethods("wcs", bypassImpl=getSkyWcs)
436 
437  def getRawHeaderWcs(datasetType, pythonType, location, dataId):
438  """Create a SkyWcs from the un-modified raw FITS WCS header keys."""
439  if datasetType[:3] != "raw":
440  raise dafPersist.NoResults("Can only get header WCS for raw exposures.",
441  datasetType, dataId)
442  return afwGeom.makeSkyWcs(readMetadata(location.getLocationsWithRoot()[0]))
443 
444  setMethods("header_wcs", bypassImpl=getRawHeaderWcs)
445 
446  def getPhotoCalib(datasetType, pythonType, location, dataId):
447  fitsReader = afwImage.ExposureFitsReader(location.getLocationsWithRoot()[0])
448  return fitsReader.readPhotoCalib()
449 
450  setMethods("photoCalib", bypassImpl=getPhotoCalib)
451 
452  def getVisitInfo(datasetType, pythonType, location, dataId):
453  fitsReader = afwImage.ExposureFitsReader(location.getLocationsWithRoot()[0])
454  return fitsReader.readVisitInfo()
455 
456  setMethods("visitInfo", bypassImpl=getVisitInfo)
457 
458  def getFilter(datasetType, pythonType, location, dataId):
459  fitsReader = afwImage.ExposureFitsReader(location.getLocationsWithRoot()[0])
460  return fitsReader.readFilter()
461 
462  setMethods("filter", bypassImpl=getFilter)
463 
464  setMethods("detector",
465  mapImpl=lambda dataId, write=False:
466  dafPersist.ButlerLocation(
467  pythonType="lsst.afw.cameraGeom.CameraConfig",
468  cppType="Config",
469  storageName="Internal",
470  locationList="ignored",
471  dataId=dataId,
472  mapper=self,
473  storage=None,
474  ),
475  bypassImpl=lambda datasetType, pythonType, location, dataId:
476  self.camera[self._extractDetectorName(dataId)]
477  )
478 
479  def getBBox(datasetType, pythonType, location, dataId):
480  md = readMetadata(location.getLocationsWithRoot()[0], hdu=1)
481  fix_header(md, translator_class=self.translatorClass)
482  return afwImage.bboxFromMetadata(md)
483 
484  setMethods("bbox", bypassImpl=getBBox)
485 
486  elif name == "images":
487  def getBBox(datasetType, pythonType, location, dataId):
488  md = readMetadata(location.getLocationsWithRoot()[0])
489  fix_header(md, translator_class=self.translatorClass)
490  return afwImage.bboxFromMetadata(md)
491  setMethods("bbox", bypassImpl=getBBox)
492 
493  if subPolicy["storage"] == "FitsCatalogStorage": # a FITS catalog
494 
495  def getMetadata(datasetType, pythonType, location, dataId):
496  md = readMetadata(os.path.join(location.getStorage().root,
497  location.getLocations()[0]), hdu=1)
498  fix_header(md, translator_class=self.translatorClass)
499  return md
500 
501  setMethods("md", bypassImpl=getMetadata)
502 
503  # Sub-images
504  if subPolicy["storage"] == "FitsStorage":
505  def mapSubClosure(dataId, write=False, mapper=weakref.proxy(self), mapping=mapping):
506  subId = dataId.copy()
507  del subId['bbox']
508  loc = mapping.map(mapper, subId, write)
509  bbox = dataId['bbox']
510  llcX = bbox.getMinX()
511  llcY = bbox.getMinY()
512  width = bbox.getWidth()
513  height = bbox.getHeight()
514  loc.additionalData.set('llcX', llcX)
515  loc.additionalData.set('llcY', llcY)
516  loc.additionalData.set('width', width)
517  loc.additionalData.set('height', height)
518  if 'imageOrigin' in dataId:
519  loc.additionalData.set('imageOrigin',
520  dataId['imageOrigin'])
521  return loc
522 
523  def querySubClosure(key, format, dataId, mapping=mapping):
524  subId = dataId.copy()
525  del subId['bbox']
526  return mapping.lookup(format, subId)
527  setMethods("sub", mapImpl=mapSubClosure, queryImpl=querySubClosure)
528 
529  if subPolicy["storage"] == "FitsCatalogStorage":
530  # Length of catalog
531 
532  def getLen(datasetType, pythonType, location, dataId):
533  md = readMetadata(os.path.join(location.getStorage().root,
534  location.getLocations()[0]), hdu=1)
535  fix_header(md, translator_class=self.translatorClass)
536  return md["NAXIS2"]
537 
538  setMethods("len", bypassImpl=getLen)
539 
540  # Schema of catalog
541  if not datasetType.endswith("_schema") and datasetType + "_schema" not in datasets:
542  setMethods("schema", bypassImpl=lambda datasetType, pythonType, location, dataId:
543  afwTable.Schema.readFits(os.path.join(location.getStorage().root,
544  location.getLocations()[0])))
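For a policy dataset type such as `raw`, `_initMappings` thus grows a family of suffixed convenience methods (`map_raw_filename`, `bypass_raw_md`, `query_raw_wcs`, and so on) via `setMethods`. A toy reproduction of the naming scheme only; the suffix list here is illustrative, and the real implementations are closures installed with `setattr`:

```python
# Generate the derived method names setMethods would install for one dataset
# type (name generation only; suffixes shown are a representative subset).
def derived_names(datasetType, suffixes=("filename", "md", "wcs", "bbox")):
    return ["%s_%s_%s" % (kind, datasetType, suffix)
            for kind in ("map", "bypass", "query")
            for suffix in suffixes]
```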
545 
546  def _computeCcdExposureId(self, dataId):
547  """Compute the 64-bit (long) identifier for a CCD exposure.
548 
549  Subclasses must override.
550 
551  Parameters
552  ----------
553  dataId : `dict`
554  Data identifier with visit, ccd.
555  """
556  raise NotImplementedError()
557 
558  def _computeCoaddExposureId(self, dataId, singleFilter):
559  """Compute the 64-bit (long) identifier for a coadd.
560 
561  Subclasses must override.
562 
563  Parameters
564  ----------
565  dataId : `dict`
566  Data identifier with tract and patch.
567  singleFilter : `bool`
568  True means the desired ID is for a single-filter coadd, in which
569  case dataId must contain filter.
570  """
571  raise NotImplementedError()
572 
573  def _search(self, path):
574  """Search for path in the associated repository's storage.
575 
576  Parameters
577  ----------
578  path : string
579  Path that describes an object in the repository associated with
580  this mapper.
581  Path may contain an HDU indicator, e.g. 'foo.fits[1]'. The
582  indicator will be stripped when searching and so will match
583  filenames without the HDU indicator, e.g. 'foo.fits'. The path
584  returned WILL contain the indicator though, e.g. ['foo.fits[1]'].
585 
586  Returns
587  -------
588  string
589  The path for this object in the repository. Will return None if the
590  object can't be found. If the input argument path contained an HDU
591  indicator, the returned path will also contain the HDU indicator.
592  """
593  return self.rootStorage.search(path)
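The HDU-indicator convention described in the docstring ('foo.fits[1]' matches 'foo.fits' on disk, but the returned path keeps the indicator) hinges on splitting the indicator off before matching. A sketch of that split, not the actual storage implementation:

```python
import re

# Split a path into the bare filename and an optional trailing HDU indicator,
# as a storage search must do before comparing against files on disk.
def split_hdu(path):
    m = re.match(r"(.*?)(\[\d+\])?$", path)
    return m.group(1), m.group(2) or ""
```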
594 
595  def backup(self, datasetType, dataId):
596  """Rename any existing object with the given type and dataId.
597 
598  The CameraMapper implementation saves objects in a sequence of e.g.:
599 
600  - foo.fits
601  - foo.fits~1
602  - foo.fits~2
603 
604  All of the backups will be placed in the output repo, however, and will
605  not be removed if they are found elsewhere in the _parent chain. This
606  means that the same file will be stored twice if the previous version
607  was found in an input repo.
608  """
609 
610  # Calling PosixStorage directly is not the long term solution in this
611  # function, this is work-in-progress on epic DM-6225. The plan is for
612  # parentSearch to be changed to 'search', and search only the storage
613  # associated with this mapper. All searching of parents will be handled
614  # by traversing the container of repositories in Butler.
615 
616  def firstElement(list):
617  """Get the first element in the list, or None if that can't be
618  done.
619  """
620  return list[0] if list is not None and len(list) else None
621 
622  n = 0
623  newLocation = self.map(datasetType, dataId, write=True)
624  newPath = newLocation.getLocations()[0]
625  path = dafPersist.PosixStorage.search(self.root, newPath, searchParents=True)
626  path = firstElement(path)
627  oldPaths = []
628  while path is not None:
629  n += 1
630  oldPaths.append((n, path))
631  path = dafPersist.PosixStorage.search(self.root, "%s~%d" % (newPath, n), searchParents=True)
632  path = firstElement(path)
633  for n, oldPath in reversed(oldPaths):
634  self.rootStorage.copyFile(oldPath, "%s~%d" % (newPath, n))
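The backup sequence walked by the loop above keeps the bare name for the newest file and gives older copies `~n` suffixes; the probed names, in probe order, can be enumerated as:

```python
# Enumerate the names backup() probes for: path, path~1, path~2, ...
def backup_names(path, n):
    return [path] + ["%s~%d" % (path, i) for i in range(1, n + 1)]
```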
635 
636  def keys(self):
637  """Return supported keys.
638 
639  Returns
640  -------
641  iterable
642  List of keys usable in a dataset identifier
643  """
644  return iter(self.keyDict.keys())
645 
646  def getKeys(self, datasetType, level):
647  """Return a dict of supported keys and their value types for a given
648  dataset type at a given level of the key hierarchy.
649 
650  Parameters
651  ----------
652  datasetType : `str`
653  Dataset type or None for all dataset types.
654  level : `str` or None
655  Level or None for all levels or '' for the default level for the
656  camera.
657 
658  Returns
659  -------
660  `dict`
661  Keys are strings usable in a dataset identifier, values are their
662  value types.
663  """
664 
665  # not sure if this is how we want to do this. what if None was intended?
666  if level == '':
667  level = self.getDefaultLevel()
668 
669  if datasetType is None:
670  keyDict = copy.copy(self.keyDict)
671  else:
672  keyDict = self.mappings[datasetType].keys()
673  if level is not None and level in self.levels:
674  keyDict = copy.copy(keyDict)
675  for l in self.levels[level]:
676  if l in keyDict:
677  del keyDict[l]
678  return keyDict
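The level filtering at the end of `getKeys` simply removes the keys registered under that level from the candidate dict. A toy reproduction with an invented levels table:

```python
# Keys listed under a level are dropped from the returned key dict
# (values are the key value-types, as in self.keyDict).
levels = {"sensor": {"amp"}}
keyDict = {"visit": int, "ccd": int, "amp": int}
for key in levels["sensor"]:
    keyDict.pop(key, None)
```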
679 
680  def getDefaultLevel(self):
681  return self.defaultLevel
682 
683  def getDefaultSubLevel(self, level):
684  if level in self.defaultSubLevels:
685  return self.defaultSubLevels[level]
686  return None
687 
688  @classmethod
689  def getCameraName(cls):
690  """Return the name of the camera that this CameraMapper is for."""
691  className = str(cls)
692  className = className[className.find('.'):-1]
693  m = re.search(r'(\w+)Mapper', className)
694  if m is None:
695  m = re.search(r"class '[\w.]*?(\w+)'", className)
696  name = m.group(1)
697  return name[:1].lower() + name[1:] if name else ''
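`getCameraName` derives the camera name from the mapper class's own name. The regex step in isolation, applied to a made-up class name:

```python
import re

# Strip the "Mapper" suffix and lower-case the first letter, mirroring the
# lookup getCameraName performs on the stringified class (input is invented).
def camera_name(className):
    m = re.search(r"(\w+)Mapper", className)
    name = m.group(1)
    return name[:1].lower() + name[1:]
```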
698 
699  @classmethod
700  def getPackageName(cls):
701  """Return the name of the package containing this CameraMapper."""
702  if cls.packageName is None:
703  raise ValueError('class variable packageName must not be None')
704  return cls.packageName
705 
706  @classmethod
707  def getPackageDir(cls):
708  """Return the base directory of this package"""
709  return getPackageDir(cls.getPackageName())
710 
711  def map_camera(self, dataId, write=False):
712  """Map a camera dataset."""
713  if self.camera is None:
714  raise RuntimeError("No camera dataset available.")
715  actualId = self._transformId(dataId)
716  return dafPersist.ButlerLocation(
717  pythonType="lsst.afw.cameraGeom.CameraConfig",
718  cppType="Config",
719  storageName="ConfigStorage",
720  locationList=self.cameraDataLocation or "ignored",
721  dataId=actualId,
722  mapper=self,
723  storage=self.rootStorage
724  )
725 
726  def bypass_camera(self, datasetType, pythonType, butlerLocation, dataId):
727  """Return the (preloaded) camera object.
728  """
729  if self.camera is None:
730  raise RuntimeError("No camera dataset available.")
731  return self.camera
732 
733  def map_expIdInfo(self, dataId, write=False):
734  return dafPersist.ButlerLocation(
735  pythonType="lsst.obs.base.ExposureIdInfo",
736  cppType=None,
737  storageName="Internal",
738  locationList="ignored",
739  dataId=dataId,
740  mapper=self,
741  storage=self.rootStorage
742  )
743 
744  def bypass_expIdInfo(self, datasetType, pythonType, location, dataId):
745  """Hook to retrieve an lsst.obs.base.ExposureIdInfo for an exposure"""
746  expId = self.bypass_ccdExposureId(datasetType, pythonType, location, dataId)
747  expBits = self.bypass_ccdExposureId_bits(datasetType, pythonType, location, dataId)
748  return ExposureIdInfo(expId=expId, expBits=expBits)
749 
750  def std_bfKernel(self, item, dataId):
751  """Disable standardization for bfKernel
752 
753  bfKernel is a calibration product that is a numpy array,
754  unlike other calibration products that are all images;
755  all calibration images are sent through _standardizeExposure
756  due to CalibrationMapping, but we don't want that to happen to bfKernel
757  """
758  return item
759 
760  def std_raw(self, item, dataId):
761  """Standardize a raw dataset by converting it to an Exposure instead
762  of an Image"""
763  return self._standardizeExposure(self.exposures['raw'], item, dataId,
764  trimmed=False, setVisitInfo=True)
765 
766  def map_skypolicy(self, dataId):
767  """Map a sky policy."""
768  return dafPersist.ButlerLocation("lsst.pex.policy.Policy", "Policy",
769  "Internal", None, None, self,
770  storage=self.rootStorage)
771 
772  def std_skypolicy(self, item, dataId):
773  """Standardize a sky policy by returning the one we use."""
774  return self.skypolicy
775 
776 
781 
782  def _setupRegistry(self, name, description, path, policy, policyKey, storage, searchParents=True,
783  posixIfNoSql=True):
784  """Set up a registry (usually SQLite3), trying a number of possible
785  paths.
786 
787  Parameters
788  ----------
789  name : string
790  Name of registry.
791  description : `str`
792  Description of registry (for log messages)
793  path : string
794  Path for registry.
795  policy : string
796  Policy that contains the registry name, used if path is None.
797  policyKey : string
798  Key in policy for registry path.
799  storage : Storage subclass
800  Repository Storage to look in.
801  searchParents : bool, optional
802  True if the search for a registry should follow any Butler v1
803  _parent symlinks.
804  posixIfNoSql : bool, optional
805  If an sqlite registry is not found, will create a posix registry if
806  this is True.
807 
808  Returns
809  -------
810  lsst.daf.persistence.Registry
811  Registry object
812  """
813  if path is None and policyKey in policy:
814  path = dafPersist.LogicalLocation(policy[policyKey]).locString()
815  if os.path.isabs(path):
816  raise RuntimeError("Policy should not indicate an absolute path for registry.")
817  if not storage.exists(path):
818  newPath = storage.instanceSearch(path)
819 
820  newPath = newPath[0] if newPath is not None and len(newPath) else None
821  if newPath is None:
822  self.log.warn("Unable to locate registry at policy path (also looked in root): %s",
823  path)
824  path = newPath
825  else:
826  self.log.warn("Unable to locate registry at policy path: %s", path)
827  path = None
828 
 829  # The old Butler API indicated the registry path including the repo folder; the new Butler
 830  # expects the registry to be inside the repo folder. To support the old API, strip root from
 831  # path if path starts with root. Currently this only works with PosixStorage.
832  try:
833  root = storage.root
834  if path and (path.startswith(root)):
835  path = path[len(root + '/'):]
836  except AttributeError:
837  pass
838 
839  # determine if there is an sqlite registry and if not, try the posix registry.
840  registry = None
841 
842  def search(filename, description):
843  """Search for file in storage
844 
845  Parameters
846  ----------
847  filename : `str`
848  Filename to search for
849  description : `str`
850  Description of file, for error message.
851 
852  Returns
853  -------
854  path : `str` or `None`
855  Path to file, or None
856  """
857  result = storage.instanceSearch(filename)
858  if result:
859  return result[0]
860  self.log.debug("Unable to locate %s: %s", description, filename)
861  return None
862 
863  # Search for a suitable registry database
864  if path is None:
865  path = search("%s.pgsql" % name, "%s in root" % description)
866  if path is None:
867  path = search("%s.sqlite3" % name, "%s in root" % description)
868  if path is None:
869  path = search(os.path.join(".", "%s.sqlite3" % name), "%s in current dir" % description)
870 
871  if path is not None:
872  if not storage.exists(path):
873  newPath = storage.instanceSearch(path)
874  newPath = newPath[0] if newPath is not None and len(newPath) else None
875  if newPath is not None:
876  path = newPath
877  localFileObj = storage.getLocalFile(path)
878  self.log.info("Loading %s registry from %s", description, localFileObj.name)
879  registry = dafPersist.Registry.create(localFileObj.name)
880  localFileObj.close()
881  elif not registry and posixIfNoSql:
882  try:
883  self.log.info("Loading Posix %s registry from %s", description, storage.root)
884  registry = dafPersist.PosixRegistry(storage.root)
885  except Exception:
886  registry = None
887 
888  return registry
889 
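The registry lookup above searches storage for a `.pgsql` file, then a `.sqlite3` file in the repository root, then a `.sqlite3` file in the current directory, before optionally falling back to a posix registry. A minimal stdlib sketch of that fallback order (`find_registry` and the plain-directory storage are simplified stand-ins for the `Storage.instanceSearch` machinery):

```python
import os
import tempfile

def find_registry(root, name="registry"):
    """Return the first registry file found under root, mirroring the
    pgsql -> sqlite3 -> ./sqlite3 search order used by _setupRegistry."""
    candidates = [
        "%s.pgsql" % name,
        "%s.sqlite3" % name,
        os.path.join(".", "%s.sqlite3" % name),
    ]
    for filename in candidates:
        if os.path.exists(os.path.join(root, filename)):
            return filename
    return None  # caller may then fall back to a posix registry

with tempfile.TemporaryDirectory() as root:
    open(os.path.join(root, "registry.sqlite3"), "w").close()
    print(find_registry(root))  # registry.sqlite3
```

Because the `.pgsql` candidate is checked first, a PostgreSQL registry file takes precedence over an SQLite one in the same root.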
890  def _transformId(self, dataId):
891  """Generate a standard ID dict from a camera-specific ID dict.
892 
893  Canonical keys include:
894  - amp: amplifier name
895  - ccd: CCD name (in LSST this is a combination of raft and sensor)
896  The default implementation returns a copy of its input.
897 
898  Parameters
899  ----------
900  dataId : `dict`
901  Dataset identifier; this must not be modified
902 
903  Returns
904  -------
905  `dict`
906  Transformed dataset identifier.
907  """
908 
909  return dataId.copy()
910 
911  def _mapActualToPath(self, template, actualId):
912  """Convert a template path to an actual path, using the actual data
913  identifier. This implementation is usually sufficient but can be
914  overridden by the subclass.
915 
916  Parameters
917  ----------
918  template : `str`
919  Template path
920  actualId : `dict`
921  Dataset identifier
922 
923  Returns
924  -------
925  `str`
926  Pathname
927  """
928 
 929  try:
 930  transformedId = self._transformId(actualId)
 931  return template % transformedId
 932  except Exception as e:
 933  raise RuntimeError("Failed to format %r with data %r: %s" % (template, actualId, e))
934 
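`_mapActualToPath` relies on Python's %-style mapping substitution: fields like `%(visit)07d` in the template are filled from the transformed data id. A standalone illustration (the template and data id here are invented for the example, not taken from any obs_* policy):

```python
# Hypothetical template and data id, for illustration only.
template = "raw/v%(visit)07d/%(filter)s/R%(ccd)s.fits"
dataId = {"visit": 903334, "filter": "r", "ccd": "22"}

path = template % dataId
print(path)  # raw/v0903334/r/R22.fits

# A key missing from the data id raises KeyError, which
# _mapActualToPath wraps in a RuntimeError.
try:
    template % {"visit": 903334}
except KeyError as e:
    print("missing key:", e)
```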
935  @staticmethod
936  def getShortCcdName(ccdName):
937  """Convert a CCD name to a form useful as a filename
938 
939  The default implementation converts spaces to underscores.
940  """
941  return ccdName.replace(" ", "_")
942 
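As a quick illustration of the default `getShortCcdName` behaviour (the raft/sensor name below is just an example of a detector name containing spaces):

```python
def getShortCcdName(ccdName):
    # default implementation: spaces become underscores
    return ccdName.replace(" ", "_")

print(getShortCcdName("R:2,2 S:1,1"))  # R:2,2_S:1,1
```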
943  def _extractDetectorName(self, dataId):
944  """Extract the detector (CCD) name from the dataset identifier.
945 
946  The name in question is the detector name used by lsst.afw.cameraGeom.
947 
948  Parameters
949  ----------
950  dataId : `dict`
951  Dataset identifier.
952 
953  Returns
954  -------
955  `str`
956  Detector name
957  """
958  raise NotImplementedError("No _extractDetectorName() function specified")
959 
960  @deprecated("This method is no longer used for ISR (will be removed after v11)", category=FutureWarning)
961  def _extractAmpId(self, dataId):
962  """Extract the amplifier identifer from a dataset identifier.
963 
964  .. note:: Deprecated in 11_0
965 
966  amplifier identifier has two parts: the detector name for the CCD
967  containing the amplifier and index of the amplifier in the detector.
968 
969  Parameters
970  ----------
971  dataId : `dict`
972  Dataset identifer
973 
974  Returns
975  -------
976  `tuple`
977  Amplifier identifier
978  """
979 
980  trDataId = self._transformId(dataId)
981  return (trDataId["ccd"], int(trDataId['amp']))
982 
983  def _setAmpDetector(self, item, dataId, trimmed=True):
984  """Set the detector object in an Exposure for an amplifier.
985 
986  Defects are also added to the Exposure based on the detector object.
987 
988  Parameters
989  ----------
990  item : `lsst.afw.image.Exposure`
991  Exposure to set the detector in.
992  dataId : `dict`
993  Dataset identifier
994  trimmed : `bool`
995  Should detector be marked as trimmed? (ignored)
996  """
997 
998  return self._setCcdDetector(item=item, dataId=dataId, trimmed=trimmed)
999 
1000  def _setCcdDetector(self, item, dataId, trimmed=True):
1001  """Set the detector object in an Exposure for a CCD.
1002 
1003  Parameters
1004  ----------
1005  item : `lsst.afw.image.Exposure`
1006  Exposure to set the detector in.
1007  dataId : `dict`
1008  Dataset identifier
1009  trimmed : `bool`
1010  Should detector be marked as trimmed? (ignored)
1011  """
1012  if item.getDetector() is not None:
1013  return
1014 
1015  detectorName = self._extractDetectorName(dataId)
1016  detector = self.camera[detectorName]
1017  item.setDetector(detector)
1018 
1019  def _setFilter(self, mapping, item, dataId):
1020  """Set the filter object in an Exposure. If the Exposure had a FILTER
1021  keyword, this was already processed during load. But if it didn't,
1022  use the filter from the registry.
1023 
1024  Parameters
1025  ----------
1026  mapping : `lsst.obs.base.Mapping`
1027  Where to get the filter from.
1028  item : `lsst.afw.image.Exposure`
1029  Exposure to set the filter in.
1030  dataId : `dict`
1031  Dataset identifier.
1032  """
1033 
 1034  if not isinstance(item, (afwImage.ExposureU, afwImage.ExposureI,
 1035  afwImage.ExposureF, afwImage.ExposureD)):
 1036  return
1037 
1038  if item.getFilter().getId() != afwImage.Filter.UNKNOWN:
1039  return
1040 
1041  actualId = mapping.need(['filter'], dataId)
1042  filterName = actualId['filter']
1043  if self.filters is not None and filterName in self.filters:
1044  filterName = self.filters[filterName]
1045  try:
1046  item.setFilter(afwImage.Filter(filterName))
1047  except pexExcept.NotFoundError:
1048  self.log.warn("Filter %s not defined. Set to UNKNOWN." % (filterName))
1049 
1050  def _standardizeExposure(self, mapping, item, dataId, filter=True,
1051  trimmed=True, setVisitInfo=True):
1052  """Default standardization function for images.
1053 
 1054  This sets the Detector from the camera geometry
 1055  and optionally sets the Filter. In both cases this saves
 1056  having to persist some data in each exposure (or image).
1057 
1058  Parameters
1059  ----------
1060  mapping : `lsst.obs.base.Mapping`
1061  Where to get the values from.
 1062  item : image-like object
 1063  Can be any of lsst.afw.image.Exposure,
 1064  lsst.afw.image.DecoratedImage, lsst.afw.image.Image
 1065  or lsst.afw.image.MaskedImage.
1067  dataId : `dict`
1068  Dataset identifier
1069  filter : `bool`
1070  Set filter? Ignored if item is already an exposure
1071  trimmed : `bool`
1072  Should detector be marked as trimmed?
1073  setVisitInfo : `bool`
1074  Should Exposure have its VisitInfo filled out from the metadata?
1075 
1076  Returns
1077  -------
1078  `lsst.afw.image.Exposure`
1079  The standardized Exposure.
1080  """
1081  try:
1082  exposure = exposureFromImage(item, dataId, mapper=self, logger=self.log,
1083  setVisitInfo=setVisitInfo)
1084  except Exception as e:
1085  self.log.error("Could not turn item=%r into an exposure: %s" % (repr(item), e))
1086  raise
1087 
1088  if mapping.level.lower() == "amp":
1089  self._setAmpDetector(exposure, dataId, trimmed)
1090  elif mapping.level.lower() == "ccd":
1091  self._setCcdDetector(exposure, dataId, trimmed)
1092 
1093  # We can only create a WCS if it doesn't already have one and
1094  # we have either a VisitInfo or exposure metadata.
1095  # Do not calculate a WCS if this is an amplifier exposure
1096  if mapping.level.lower() != "amp" and exposure.getWcs() is None and \
1097  (exposure.getInfo().getVisitInfo() is not None or exposure.getMetadata().toDict()):
1098  self._createInitialSkyWcs(exposure)
1099 
1100  if filter:
1101  self._setFilter(mapping, exposure, dataId)
1102 
1103  return exposure
1104 
1105  def _createSkyWcsFromMetadata(self, exposure):
1106  """Create a SkyWcs from the FITS header metadata in an Exposure.
1107 
1108  Parameters
1109  ----------
1110  exposure : `lsst.afw.image.Exposure`
1111  The exposure to get metadata from, and attach the SkyWcs to.
1112  """
1113  metadata = exposure.getMetadata()
1114  try:
1115  wcs = afwGeom.makeSkyWcs(metadata, strip=True)
1116  exposure.setWcs(wcs)
1117  except pexExcept.TypeError as e:
1118  # See DM-14372 for why this is debug and not warn (e.g. calib files without wcs metadata).
1119  self.log.debug("wcs set to None; missing information found in metadata to create a valid wcs:"
1120  " %s", e.args[0])
1121  # ensure any WCS values stripped from the metadata are removed in the exposure
1122  exposure.setMetadata(metadata)
1123 
1124  def _createInitialSkyWcs(self, exposure):
1125  """Create a SkyWcs from the boresight and camera geometry.
1126 
1127  If the boresight or camera geometry do not support this method of
1128  WCS creation, this falls back on the header metadata-based version
1129  (typically a purely linear FITS crval/crpix/cdmatrix WCS).
1130 
1131  Parameters
1132  ----------
1133  exposure : `lsst.afw.image.Exposure`
1134  The exposure to get data from, and attach the SkyWcs to.
1135  """
 1136  # Always try to use metadata first, to strip WCS keys from it.
1137  self._createSkyWcsFromMetadata(exposure)
1138 
1139  if exposure.getInfo().getVisitInfo() is None:
1140  msg = "No VisitInfo; cannot access boresight information. Defaulting to metadata-based SkyWcs."
1141  self.log.warn(msg)
1142  return
1143  try:
1144  newSkyWcs = createInitialSkyWcs(exposure.getInfo().getVisitInfo(), exposure.getDetector())
1145  exposure.setWcs(newSkyWcs)
1146  except InitialSkyWcsError as e:
1147  msg = "Cannot create SkyWcs using VisitInfo and Detector, using metadata-based SkyWcs: %s"
1148  self.log.warn(msg, e)
1149  self.log.debug("Exception was: %s", traceback.TracebackException.from_exception(e))
1150  if e.__context__ is not None:
1151  self.log.debug("Root-cause Exception was: %s",
1152  traceback.TracebackException.from_exception(e.__context__))
1153 
1154  def _makeCamera(self, policy, repositoryDir):
1155  """Make a camera (instance of lsst.afw.cameraGeom.Camera) describing
1156  the camera geometry
1157 
1158  Also set self.cameraDataLocation, if relevant (else it can be left
1159  None).
1160 
1161  This implementation assumes that policy contains an entry "camera"
1162  that points to the subdirectory in this package of camera data;
1163  specifically, that subdirectory must contain:
1164  - a file named `camera.py` that contains persisted camera config
1165  - ampInfo table FITS files, as required by
1166  lsst.afw.cameraGeom.makeCameraFromPath
1167 
1168  Parameters
1169  ----------
1170  policy : `lsst.daf.persistence.Policy`
1171  Policy with per-camera defaults already merged
1172  (PexPolicy only for backward compatibility).
1173  repositoryDir : `str`
1174  Policy repository for the subclassing module (obtained with
1175  getRepositoryPath() on the per-camera default dictionary).
1176  """
1177  if 'camera' not in policy:
1178  raise RuntimeError("Cannot find 'camera' in policy; cannot construct a camera")
1179  cameraDataSubdir = policy['camera']
1180  self.cameraDataLocation = os.path.normpath(
1181  os.path.join(repositoryDir, cameraDataSubdir, "camera.py"))
1182  cameraConfig = afwCameraGeom.CameraConfig()
1183  cameraConfig.load(self.cameraDataLocation)
1184  ampInfoPath = os.path.dirname(self.cameraDataLocation)
1185  return afwCameraGeom.makeCameraFromPath(
1186  cameraConfig=cameraConfig,
1187  ampInfoPath=ampInfoPath,
1188  shortNameFunc=self.getShortCcdName,
1189  pupilFactoryClass=self.PupilFactoryClass
1190  )
1191 
1192  def getRegistry(self):
1193  """Get the registry used by this mapper.
1194 
1195  Returns
1196  -------
1197  Registry or None
1198  The registry used by this mapper for this mapper's repository.
1199  """
1200  return self.registry
1201 
1202  def getImageCompressionSettings(self, datasetType, dataId):
1203  """Stuff image compression settings into a daf.base.PropertySet
1204 
1205  This goes into the ButlerLocation's "additionalData", which gets
1206  passed into the boost::persistence framework.
1207 
1208  Parameters
1209  ----------
1210  datasetType : `str`
1211  Type of dataset for which to get the image compression settings.
1212  dataId : `dict`
1213  Dataset identifier.
1214 
1215  Returns
1216  -------
1217  additionalData : `lsst.daf.base.PropertySet`
1218  Image compression settings.
1219  """
1220  mapping = self.mappings[datasetType]
1221  recipeName = mapping.recipe
1222  storageType = mapping.storage
1223  if storageType not in self._writeRecipes:
1224  return dafBase.PropertySet()
1225  if recipeName not in self._writeRecipes[storageType]:
1226  raise RuntimeError("Unrecognized write recipe for datasetType %s (storage type %s): %s" %
1227  (datasetType, storageType, recipeName))
1228  recipe = self._writeRecipes[storageType][recipeName].deepCopy()
1229  seed = hash(tuple(dataId.items())) % 2**31
1230  for plane in ("image", "mask", "variance"):
1231  if recipe.exists(plane + ".scaling.seed") and recipe.getScalar(plane + ".scaling.seed") == 0:
1232  recipe.set(plane + ".scaling.seed", seed)
1233  return recipe
1234 
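The seeding logic above derives a deterministic per-dataset seed from the data id and substitutes it only where a plane's `scaling.seed` was left at 0, so explicit seeds are preserved. A self-contained sketch, with a plain dict standing in for the `PropertySet` recipe:

```python
dataId = {"visit": 903334, "ccd": 22}

# Derive the seed exactly as above; note Python 3 salts str hashes per
# process (set PYTHONHASHSEED for cross-process reproducibility).
seed = hash(tuple(dataId.items())) % 2**31

# Only plane seeds left at 0 are replaced; explicit seeds are kept.
recipe = {"image.scaling.seed": 0, "mask.scaling.seed": 12345}
for key, value in recipe.items():
    if value == 0:
        recipe[key] = seed

print(recipe["image.scaling.seed"] == seed)  # True
```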
1235  def _initWriteRecipes(self):
1236  """Read the recipes for writing files
1237 
1238  These recipes are currently used for configuring FITS compression,
1239  but they could have wider uses for configuring different flavors
1240  of the storage types. A recipe is referred to by a symbolic name,
1241  which has associated settings. These settings are stored as a
1242  `PropertySet` so they can easily be passed down to the
1243  boost::persistence framework as the "additionalData" parameter.
1244 
1245  The list of recipes is written in YAML. A default recipe and
1246  some other convenient recipes are in obs_base/policy/writeRecipes.yaml
1247  and these may be overridden or supplemented by the individual obs_*
1248  packages' own policy/writeRecipes.yaml files.
1249 
 1250  Recipes are grouped by the storage type. Currently, only the
 1251  ``FitsStorage`` storage type uses recipes, to configure FITS image
 1252  compression.
1253 
1254  Each ``FitsStorage`` recipe for FITS compression should define
1255  "image", "mask" and "variance" entries, each of which may contain
1256  "compression" and "scaling" entries. Defaults will be provided for
1257  any missing elements under "compression" and "scaling".
1258 
1259  The allowed entries under "compression" are:
1260 
1261  * algorithm (string): compression algorithm to use
1262  * rows (int): number of rows per tile (0 = entire dimension)
1263  * columns (int): number of columns per tile (0 = entire dimension)
1264  * quantizeLevel (float): cfitsio quantization level
1265 
1266  The allowed entries under "scaling" are:
1267 
1268  * algorithm (string): scaling algorithm to use
1269  * bitpix (int): bits per pixel (0,8,16,32,64,-32,-64)
1270  * fuzz (bool): fuzz the values when quantising floating-point values?
1271  * seed (long): seed for random number generator when fuzzing
1272  * maskPlanes (list of string): mask planes to ignore when doing
1273  statistics
1274  * quantizeLevel: divisor of the standard deviation for STDEV_* scaling
1275  * quantizePad: number of stdev to allow on the low side (for
1276  STDEV_POSITIVE/NEGATIVE)
1277  * bscale: manually specified BSCALE (for MANUAL scaling)
 1278  * bzero: manually specified BZERO (for MANUAL scaling)
1279 
1280  A very simple example YAML recipe:
1281 
1282  FitsStorage:
1283  default:
1284  image: &default
1285  compression:
1286  algorithm: GZIP_SHUFFLE
1287  mask: *default
1288  variance: *default
1289  """
1290  recipesFile = os.path.join(getPackageDir("obs_base"), "policy", "writeRecipes.yaml")
1291  recipes = dafPersist.Policy(recipesFile)
1292  supplementsFile = os.path.join(self.getPackageDir(), "policy", "writeRecipes.yaml")
1293  validationMenu = {'FitsStorage': validateRecipeFitsStorage, }
1294  if os.path.exists(supplementsFile) and supplementsFile != recipesFile:
1295  supplements = dafPersist.Policy(supplementsFile)
1296  # Don't allow overrides, only supplements
1297  for entry in validationMenu:
1298  intersection = set(recipes[entry].names()).intersection(set(supplements.names()))
1299  if intersection:
1300  raise RuntimeError("Recipes provided in %s section %s may not override those in %s: %s" %
1301  (supplementsFile, entry, recipesFile, intersection))
1302  recipes.update(supplements)
1303 
1304  self._writeRecipes = {}
1305  for storageType in recipes.names(True):
1306  if "default" not in recipes[storageType]:
1307  raise RuntimeError("No 'default' recipe defined for storage type %s in %s" %
1308  (storageType, recipesFile))
1309  self._writeRecipes[storageType] = validationMenu[storageType](recipes[storageType])
1310 
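`_initWriteRecipes` merges each package's supplements into the obs_base defaults, refusing any recipe name that would override a default. The set-intersection check, sketched with plain dicts in place of `Policy` objects (`merge_recipes` is a hypothetical helper, not part of this module):

```python
def merge_recipes(defaults, supplements):
    """Merge supplement recipes into defaults, refusing overrides."""
    overridden = set(defaults).intersection(supplements)
    if overridden:
        raise RuntimeError(
            "Recipes may not override those in defaults: %s" % sorted(overridden))
    merged = dict(defaults)
    merged.update(supplements)
    return merged

print(sorted(merge_recipes({"default": {}}, {"myCustom": {}})))
# ['default', 'myCustom']
```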
1311 
1312 def exposureFromImage(image, dataId=None, mapper=None, logger=None, setVisitInfo=True):
1313  """Generate an Exposure from an image-like object
1314 
1315  If the image is a DecoratedImage then also set its WCS and metadata
1316  (Image and MaskedImage are missing the necessary metadata
1317  and Exposure already has those set)
1318 
1319  Parameters
1320  ----------
1321  image : Image-like object
1322  Can be one of lsst.afw.image.DecoratedImage, Image, MaskedImage or
1323  Exposure.
1324 
1325  Returns
1326  -------
1327  `lsst.afw.image.Exposure`
1328  Exposure containing input image.
1329  """
1330  metadata = None
1331  if isinstance(image, afwImage.MaskedImage):
1332  exposure = afwImage.makeExposure(image)
1333  elif isinstance(image, afwImage.DecoratedImage):
1334  exposure = afwImage.makeExposure(afwImage.makeMaskedImage(image.getImage()))
1335  metadata = image.getMetadata()
1336  exposure.setMetadata(metadata)
1337  elif isinstance(image, afwImage.Exposure):
1338  exposure = image
1339  metadata = exposure.getMetadata()
 1340  else: # Image
 1341  exposure = afwImage.makeExposure(afwImage.makeMaskedImage(image))
 1342 
1343  # set VisitInfo if we can
1344  if setVisitInfo and exposure.getInfo().getVisitInfo() is None:
1345  if metadata is not None:
1346  if mapper is None:
1347  if not logger:
1348  logger = lsstLog.Log.getLogger("CameraMapper")
1349  logger.warn("I can only set the VisitInfo if you provide a mapper")
1350  else:
1351  exposureId = mapper._computeCcdExposureId(dataId)
1352  visitInfo = mapper.makeRawVisitInfo(md=metadata, exposureId=exposureId)
1353 
1354  exposure.getInfo().setVisitInfo(visitInfo)
1355 
1356  return exposure
1357 
1358 
1360  """Validate recipes for FitsStorage
1361 
1362  The recipes are supplemented with default values where appropriate.
1363 
1364  TODO: replace this custom validation code with Cerberus (DM-11846)
1365 
1366  Parameters
1367  ----------
1368  recipes : `lsst.daf.persistence.Policy`
1369  FitsStorage recipes to validate.
1370 
1371  Returns
1372  -------
1373  validated : `lsst.daf.base.PropertySet`
1374  Validated FitsStorage recipe.
1375 
1376  Raises
1377  ------
1378  `RuntimeError`
1379  If validation fails.
1380  """
1381  # Schemas define what should be there, and the default values (and by the default
1382  # value, the expected type).
1383  compressionSchema = {
1384  "algorithm": "NONE",
1385  "rows": 1,
1386  "columns": 0,
1387  "quantizeLevel": 0.0,
1388  }
1389  scalingSchema = {
1390  "algorithm": "NONE",
1391  "bitpix": 0,
1392  "maskPlanes": ["NO_DATA"],
1393  "seed": 0,
1394  "quantizeLevel": 4.0,
1395  "quantizePad": 5.0,
1396  "fuzz": True,
1397  "bscale": 1.0,
1398  "bzero": 0.0,
1399  }
1400 
1401  def checkUnrecognized(entry, allowed, description):
1402  """Check to see if the entry contains unrecognised keywords"""
1403  unrecognized = set(entry.keys()) - set(allowed)
1404  if unrecognized:
1405  raise RuntimeError(
1406  "Unrecognized entries when parsing image compression recipe %s: %s" %
1407  (description, unrecognized))
1408 
1409  validated = {}
1410  for name in recipes.names(True):
1411  checkUnrecognized(recipes[name], ["image", "mask", "variance"], name)
1412  rr = dafBase.PropertySet()
1413  validated[name] = rr
1414  for plane in ("image", "mask", "variance"):
1415  checkUnrecognized(recipes[name][plane], ["compression", "scaling"],
1416  name + "->" + plane)
1417 
1418  for settings, schema in (("compression", compressionSchema),
1419  ("scaling", scalingSchema)):
1420  prefix = plane + "." + settings
1421  if settings not in recipes[name][plane]:
1422  for key in schema:
1423  rr.set(prefix + "." + key, schema[key])
1424  continue
1425  entry = recipes[name][plane][settings]
1426  checkUnrecognized(entry, schema.keys(), name + "->" + plane + "->" + settings)
1427  for key in schema:
1428  value = type(schema[key])(entry[key]) if key in entry else schema[key]
1429  rr.set(prefix + "." + key, value)
1430  return validated
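The per-plane validation above follows a fill-in-defaults pattern: each schema key takes the user's value coerced to the default value's type, otherwise the default itself, and unknown keys are rejected. A simplified standalone sketch with plain dicts in place of `Policy`/`PropertySet` (`apply_schema` is a hypothetical helper covering a single settings group, not the full per-plane loop):

```python
compressionSchema = {
    "algorithm": "NONE",
    "rows": 1,
    "columns": 0,
    "quantizeLevel": 0.0,
}

def apply_schema(entry, schema):
    """Fill in schema defaults, coercing user values to the default's type."""
    unrecognized = set(entry) - set(schema)
    if unrecognized:
        raise RuntimeError("Unrecognized entries: %s" % sorted(unrecognized))
    return {key: type(schema[key])(entry[key]) if key in entry else schema[key]
            for key in schema}

settings = apply_schema({"algorithm": "GZIP_SHUFFLE", "quantizeLevel": 16},
                        compressionSchema)
print(settings["quantizeLevel"])  # 16.0 (int coerced to float by the schema)
```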