cameraMapper.py
#
# LSST Data Management System
# Copyright 2008, 2009, 2010 LSST Corporation.
#
# This product includes software developed by the
# LSST Project (http://www.lsst.org/).
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the LSST License Statement and
# the GNU General Public License along with this program. If not,
# see <http://www.lsstcorp.org/LegalNotices/>.
#

import copy
import os
from astropy.io import fits  # required by _makeDefectsDict until defects are written as AFW tables
import re
import weakref
import lsst.daf.persistence as dafPersist
from . import ImageMapping, ExposureMapping, CalibrationMapping, DatasetMapping
import lsst.daf.base as dafBase
import lsst.afw.geom as afwGeom
import lsst.afw.image as afwImage
import lsst.afw.table as afwTable
from lsst.afw.fits import readMetadata
import lsst.afw.cameraGeom as afwCameraGeom
import lsst.log as lsstLog
import lsst.pex.exceptions as pexExcept
from .exposureIdInfo import ExposureIdInfo
from .makeRawVisitInfo import MakeRawVisitInfo
from lsst.utils import getPackageDir

__all__ = ["CameraMapper", "exposureFromImage"]

class CameraMapper(dafPersist.Mapper):

    """CameraMapper is a base class for mappers that handle images from a
    camera and products derived from them. This provides an abstraction layer
    between the data on disk and the code.

    Public methods: keys, queryMetadata, getDatasetTypes, map,
    canStandardize, standardize

    Mappers for specific data sources (e.g., CFHT Megacam, LSST
    simulations, etc.) should inherit this class.

    The CameraMapper manages datasets within a "root" directory. Note that
    writing to a dataset present in the input root will hide the existing
    dataset but not overwrite it. See #2160 for design discussion.

    A camera is assumed to consist of one or more rafts, each composed of
    multiple CCDs. Each CCD is in turn composed of one or more amplifiers
    (amps). A camera is also assumed to have a camera geometry description
    (CameraGeom object) as a policy file, a filter description (Filter class
    static configuration) as another policy file, and an optional defects
    description directory.

    Information from the camera geometry and defects is inserted into all
    Exposure objects returned.

    The mapper uses one or two registries to retrieve metadata about the
    images. The first is a registry of all raw exposures. This must contain
    the time of the observation. One or more tables (or the equivalent)
    within the registry are used to look up data identifier components that
    are not specified by the user (e.g. filter) and to return results for
    metadata queries. The second is an optional registry of all calibration
    data. This should contain validity start and end entries for each
    calibration dataset in the same timescale as the observation time.

    Subclasses will typically set MakeRawVisitInfoClass:

    MakeRawVisitInfoClass: a class variable that points to a subclass of
    MakeRawVisitInfo, a functor that creates an
    lsst.afw.image.VisitInfo from the FITS metadata of a raw image.

    Subclasses must provide the following methods:

    _extractDetectorName(self, dataId): returns the detector name for a CCD
    (e.g., "CFHT 21", "R:1,2 S:3,4") as used in the AFW CameraGeom class given
    a dataset identifier referring to that CCD or a subcomponent of it.

    _computeCcdExposureId(self, dataId): see below

    _computeCoaddExposureId(self, dataId, singleFilter): see below

    Subclasses may also need to override the following methods:

    _transformId(self, dataId): transformation of a data identifier
    from colloquial usage (e.g., "ccdname") to proper/actual usage
    (e.g., "ccd"), including making it suitable for path expansion
    (e.g. removing commas). The default implementation does nothing. Note
    that this method should not modify its input parameter.

    getShortCcdName(self, ccdName): a static method that returns a shortened
    name suitable for use as a filename. The default version converts spaces
    to underscores.

    _getCcdKeyVal(self, dataId): return a CCD key and value
    by which to look up defects in the defects registry.
    The default value returns ("ccd", detector name)

    _mapActualToPath(self, template, actualId): convert a template path to an
    actual path, using the actual dataset identifier.

    The mapper's behaviors are largely specified by the policy file.
    See the MapperDictionary.paf for descriptions of the available items.

    The 'exposures', 'calibrations', and 'datasets' subpolicies configure
    mappings (see Mappings class).

    Common default mappings for all subclasses can be specified in the
    "policy/{images,exposures,calibrations,datasets}.yaml" files. This
    provides a simple way to add a product to all camera mappers.

    Functions to map (provide a path to the data given a dataset
    identifier dictionary) and standardize (convert data into some standard
    format or type) may be provided in the subclass as "map_{dataset type}"
    and "std_{dataset type}", respectively.

    If non-Exposure datasets cannot be retrieved using standard
    daf_persistence methods alone, a "bypass_{dataset type}" function may be
    provided in the subclass to return the dataset instead of using the
    "datasets" subpolicy.

    Implementations of map_camera and bypass_camera that should typically be
    sufficient are provided in this base class.

    Notes
    -----
    TODO:

    - Handle defects the same way as all other calibration products, using
      the calibration registry
    - Instead of auto-loading the camera at construction time, load it from
      the calibration registry
    - Rewrite defects as AFW tables so we don't need astropy.io.fits to
      unpersist them; then remove all mention of astropy.io.fits from this
      package.
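
    Examples
    --------
    A minimal sketch of a concrete subclass (illustrative only: the class
    name, package name, detector naming, and ID packing below are all
    hypothetical, and a real obs package also supplies its own policy and
    registry files):

    >>> class MyCameraMapper(CameraMapper):  # doctest: +SKIP
    ...     packageName = "obs_mycamera"  # hypothetical obs package
    ...
    ...     def _extractDetectorName(self, dataId):
    ...         # name as used by this camera's lsst.afw.cameraGeom.Camera
    ...         return "ccd%02d" % dataId["ccd"]
    ...
    ...     def _computeCcdExposureId(self, dataId):
    ...         # pack visit and ccd into one integer (camera-specific)
    ...         return dataId["visit"] * 100 + dataId["ccd"]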
    """
    packageName = None

    # a class or subclass of MakeRawVisitInfo, a functor that makes an
    # lsst.afw.image.VisitInfo from the FITS metadata of a raw image
    MakeRawVisitInfoClass = MakeRawVisitInfo

    # a class or subclass of PupilFactory
    PupilFactoryClass = afwCameraGeom.PupilFactory

    def __init__(self, policy, repositoryDir,
                 root=None, registry=None, calibRoot=None, calibRegistry=None,
                 provided=None, parentRegistry=None, repositoryCfg=None):
        """Initialize the CameraMapper.

        Parameters
        ----------
        policy : daf_persistence.Policy
            Policy with per-camera defaults already merged.
        repositoryDir : string
            Policy repository for the subclassing module (obtained with
            getRepositoryPath() on the per-camera default dictionary).
        root : string, optional
            Path to the root directory for data.
        registry : string, optional
            Path to registry with data's metadata.
        calibRoot : string, optional
            Root directory for calibrations.
        calibRegistry : string, optional
            Path to registry with calibrations' metadata.
        provided : list of string, optional
            Keys provided by the mapper.
        parentRegistry : Registry subclass, optional
            Registry from a parent repository that may be used to look up
            data's metadata.
        repositoryCfg : daf_persistence.RepositoryCfg or None, optional
            The configuration information for the repository this mapper is
            being used with.
        """

        dafPersist.Mapper.__init__(self)

        self.log = lsstLog.Log.getLogger("CameraMapper")

        if root:
            self.root = root
        elif repositoryCfg:
            self.root = repositoryCfg.root
        else:
            self.root = None

        repoPolicy = repositoryCfg.policy if repositoryCfg else None
        if repoPolicy is not None:
            policy.update(repoPolicy)

        # Levels
        self.levels = dict()
        if 'levels' in policy:
            levelsPolicy = policy['levels']
            for key in levelsPolicy.names(True):
                self.levels[key] = set(levelsPolicy.asArray(key))
        self.defaultLevel = policy['defaultLevel']
        self.defaultSubLevels = dict()
        if 'defaultSubLevels' in policy:
            self.defaultSubLevels = policy['defaultSubLevels']

        # Root directories
        if root is None:
            root = "."
        root = dafPersist.LogicalLocation(root).locString()

        self.rootStorage = dafPersist.Storage.makeFromURI(uri=root)

        # If the calibRoot is passed in, use that. If not and it's indicated
        # in the policy, use that. And otherwise, the calibs are in the
        # regular root.
        # If the location indicated by the calib root does not exist, do not
        # create it.
        calibStorage = None
        if calibRoot is not None:
            calibRoot = dafPersist.Storage.absolutePath(root, calibRoot)
            calibStorage = dafPersist.Storage.makeFromURI(uri=calibRoot,
                                                          create=False)
        else:
            calibRoot = policy.get('calibRoot', None)
            if calibRoot:
                calibStorage = dafPersist.Storage.makeFromURI(uri=calibRoot,
                                                              create=False)
        if calibStorage is None:
            calibStorage = self.rootStorage

        self.root = root

        # Registries
        self.registry = self._setupRegistry("registry", "exposure", registry, policy, "registryPath",
                                            self.rootStorage, searchParents=False,
                                            posixIfNoSql=(not parentRegistry))
        if not self.registry:
            self.registry = parentRegistry
        needCalibRegistry = policy.get('needCalibRegistry', None)
        if needCalibRegistry:
            if calibStorage:
                self.calibRegistry = self._setupRegistry("calibRegistry", "calib", calibRegistry, policy,
                                                         "calibRegistryPath", calibStorage,
                                                         posixIfNoSql=False)  # NB never use posix for calibs
            else:
                raise RuntimeError(
                    "'needCalibRegistry' is true in Policy, but was unable to locate a repo at " +
                    "calibRoot ivar:%s or policy['calibRoot']:%s" %
                    (calibRoot, policy.get('calibRoot', None)))
        else:
            self.calibRegistry = None

        # Dict of valid keys and their value types
        self.keyDict = dict()

        self._initMappings(policy, self.rootStorage, calibStorage, provided=None)
        self._initWriteRecipes()

        # Camera geometry
        self.cameraDataLocation = None  # path to camera geometry config file
        self.camera = self._makeCamera(policy=policy, repositoryDir=repositoryDir)

        # Defect registry and root. Defects are stored with the camera and
        # the registry is loaded from the camera package, which is on the
        # local filesystem.
        self.defectRegistry = None
        if 'defects' in policy:
            self.defectPath = os.path.join(repositoryDir, policy['defects'])
            defectRegistryLocation = os.path.join(self.defectPath, "defectRegistry.sqlite3")
            self.defectRegistry = dafPersist.Registry.create(defectRegistryLocation)

        # Filter translation table
        self.filters = None

        # verify that the class variable packageName is set before attempting
        # to instantiate an instance
        if self.packageName is None:
            raise ValueError('class variable packageName must not be None')

        self.makeRawVisitInfo = self.MakeRawVisitInfoClass(log=self.log)

    def _initMappings(self, policy, rootStorage=None, calibStorage=None, provided=None):
        """Initialize mappings

        For each of the dataset types that we want to be able to read, there
        are methods that can be created to support them:

        * map_<dataset> : determine the path for dataset
        * std_<dataset> : standardize the retrieved dataset
        * bypass_<dataset> : retrieve the dataset (bypassing the usual
          retrieval machinery)
        * query_<dataset> : query the registry

        Besides the dataset types explicitly listed in the policy, we create
        additional, derived dataset types for convenience, e.g., reading the
        header of an image, or retrieving only the size of a catalog.

        Parameters
        ----------
        policy : `lsst.daf.persistence.Policy`
            Policy with per-camera defaults already merged
        rootStorage : `Storage subclass instance`
            Interface to persisted repository data.
        calibStorage : `Storage subclass instance`
            Interface to persisted calib repository data
        provided : `list` of `str`
            Keys provided by the mapper
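
        For example, a dataset type ``calexp`` stored via FitsStorage gets
        derived types such as ``calexp_md``, ``calexp_filename``,
        ``calexp_wcs``, ``calexp_sub``, and ``calexp_bbox``. A usage sketch,
        assuming a Butler pointing at such a repository (the data ID keys
        here are hypothetical):

        >>> md = butler.get("calexp_md", visit=1, ccd=0)    # doctest: +SKIP
        >>> wcs = butler.get("calexp_wcs", visit=1, ccd=0)  # doctest: +SKIP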
        """
        # Sub-dictionaries (for exposure/calibration/dataset types)
        imgMappingPolicy = dafPersist.Policy(dafPersist.Policy.defaultPolicyFile(
            "obs_base", "ImageMappingDefaults.yaml", "policy"))
        expMappingPolicy = dafPersist.Policy(dafPersist.Policy.defaultPolicyFile(
            "obs_base", "ExposureMappingDefaults.yaml", "policy"))
        calMappingPolicy = dafPersist.Policy(dafPersist.Policy.defaultPolicyFile(
            "obs_base", "CalibrationMappingDefaults.yaml", "policy"))
        dsMappingPolicy = dafPersist.Policy()

        # Mappings
        mappingList = (
            ("images", imgMappingPolicy, ImageMapping),
            ("exposures", expMappingPolicy, ExposureMapping),
            ("calibrations", calMappingPolicy, CalibrationMapping),
            ("datasets", dsMappingPolicy, DatasetMapping)
        )
        self.mappings = dict()
        for name, defPolicy, cls in mappingList:
            if name in policy:
                datasets = policy[name]

                # Centrally-defined datasets
                defaultsPath = os.path.join(getPackageDir("obs_base"), "policy", name + ".yaml")
                if os.path.exists(defaultsPath):
                    datasets.merge(dafPersist.Policy(defaultsPath))

                mappings = dict()
                setattr(self, name, mappings)
                for datasetType in datasets.names(True):
                    subPolicy = datasets[datasetType]
                    subPolicy.merge(defPolicy)

                    if not hasattr(self, "map_" + datasetType) and 'composite' in subPolicy:
                        def compositeClosure(dataId, write=False, mapper=None, mapping=None,
                                             subPolicy=subPolicy):
                            components = subPolicy.get('composite')
                            assembler = subPolicy['assembler'] if 'assembler' in subPolicy else None
                            disassembler = subPolicy['disassembler'] if 'disassembler' in subPolicy else None
                            python = subPolicy['python']
                            butlerComposite = dafPersist.ButlerComposite(assembler=assembler,
                                                                         disassembler=disassembler,
                                                                         python=python,
                                                                         dataId=dataId,
                                                                         mapper=self)
                            for name, component in components.items():
                                butlerComposite.add(id=name,
                                                    datasetType=component.get('datasetType'),
                                                    setter=component.get('setter', None),
                                                    getter=component.get('getter', None),
                                                    subset=component.get('subset', False),
                                                    inputOnly=component.get('inputOnly', False))
                            return butlerComposite
                        setattr(self, "map_" + datasetType, compositeClosure)
                        # for now at least, don't set up any other handling for this dataset type.
                        continue

                    if name == "calibrations":
                        mapping = cls(datasetType, subPolicy, self.registry, self.calibRegistry,
                                      calibStorage, provided=provided, dataRoot=rootStorage)
                    else:
                        mapping = cls(datasetType, subPolicy, self.registry, rootStorage, provided=provided)

                    if datasetType in self.mappings:
                        raise ValueError(f"Duplicate mapping policy for dataset type {datasetType}")
                    self.keyDict.update(mapping.keys())
                    mappings[datasetType] = mapping
                    self.mappings[datasetType] = mapping
                    if not hasattr(self, "map_" + datasetType):
                        def mapClosure(dataId, write=False, mapper=weakref.proxy(self), mapping=mapping):
                            return mapping.map(mapper, dataId, write)
                        setattr(self, "map_" + datasetType, mapClosure)
                    if not hasattr(self, "query_" + datasetType):
                        def queryClosure(format, dataId, mapping=mapping):
                            return mapping.lookup(format, dataId)
                        setattr(self, "query_" + datasetType, queryClosure)
                    if hasattr(mapping, "standardize") and not hasattr(self, "std_" + datasetType):
                        def stdClosure(item, dataId, mapper=weakref.proxy(self), mapping=mapping):
                            return mapping.standardize(mapper, item, dataId)
                        setattr(self, "std_" + datasetType, stdClosure)

                    def setMethods(suffix, mapImpl=None, bypassImpl=None, queryImpl=None):
                        """Set convenience methods on CameraMapper"""
                        mapName = "map_" + datasetType + "_" + suffix
                        bypassName = "bypass_" + datasetType + "_" + suffix
                        queryName = "query_" + datasetType + "_" + suffix
                        if not hasattr(self, mapName):
                            setattr(self, mapName, mapImpl or getattr(self, "map_" + datasetType))
                        if not hasattr(self, bypassName):
                            if bypassImpl is None and hasattr(self, "bypass_" + datasetType):
                                bypassImpl = getattr(self, "bypass_" + datasetType)
                            if bypassImpl is not None:
                                setattr(self, bypassName, bypassImpl)
                        if not hasattr(self, queryName):
                            setattr(self, queryName, queryImpl or getattr(self, "query_" + datasetType))

                    # Filename of dataset
                    setMethods("filename", bypassImpl=lambda datasetType, pythonType, location, dataId:
                               [os.path.join(location.getStorage().root, p) for p in location.getLocations()])
                    # Metadata from FITS file
                    if subPolicy["storage"] == "FitsStorage":  # a FITS image
                        setMethods("md", bypassImpl=lambda datasetType, pythonType, location, dataId:
                                   readMetadata(location.getLocationsWithRoot()[0]))

                        # Add support for configuring FITS compression
                        addName = "add_" + datasetType
                        if not hasattr(self, addName):
                            setattr(self, addName, self.getImageCompressionSettings)

                        if name == "exposures":
                            def getSkyWcs(datasetType, pythonType, location, dataId):
                                fitsReader = afwImage.ExposureFitsReader(location.getLocationsWithRoot()[0])
                                return fitsReader.readWcs()

                            setMethods("wcs", bypassImpl=getSkyWcs)

                            def getPhotoCalib(datasetType, pythonType, location, dataId):
                                fitsReader = afwImage.ExposureFitsReader(location.getLocationsWithRoot()[0])
                                return fitsReader.readPhotoCalib()

                            setMethods("photoCalib", bypassImpl=getPhotoCalib)

                            def getVisitInfo(datasetType, pythonType, location, dataId):
                                fitsReader = afwImage.ExposureFitsReader(location.getLocationsWithRoot()[0])
                                return fitsReader.readVisitInfo()

                            setMethods("visitInfo", bypassImpl=getVisitInfo)

                            def getFilter(datasetType, pythonType, location, dataId):
                                fitsReader = afwImage.ExposureFitsReader(location.getLocationsWithRoot()[0])
                                return fitsReader.readFilter()

                            setMethods("filter", bypassImpl=getFilter)

                            setMethods("detector",
                                       mapImpl=lambda dataId, write=False:
                                           dafPersist.ButlerLocation(
                                               pythonType="lsst.afw.cameraGeom.CameraConfig",
                                               cppType="Config",
                                               storageName="Internal",
                                               locationList="ignored",
                                               dataId=dataId,
                                               mapper=self,
                                               storage=None,
                                           ),
                                       bypassImpl=lambda datasetType, pythonType, location, dataId:
                                           self.camera[self._extractDetectorName(dataId)]
                                       )
                            setMethods("bbox", bypassImpl=lambda dsType, pyType, location, dataId:
                                       afwImage.bboxFromMetadata(
                                           readMetadata(location.getLocationsWithRoot()[0], hdu=1)))

                        elif name == "images":
                            setMethods("bbox", bypassImpl=lambda dsType, pyType, location, dataId:
                                       afwImage.bboxFromMetadata(
                                           readMetadata(location.getLocationsWithRoot()[0])))

                    if subPolicy["storage"] == "FitsCatalogStorage":  # a FITS catalog
                        setMethods("md", bypassImpl=lambda datasetType, pythonType, location, dataId:
                                   readMetadata(os.path.join(location.getStorage().root,
                                                             location.getLocations()[0]), hdu=1))

                    # Sub-images
                    if subPolicy["storage"] == "FitsStorage":
                        def mapSubClosure(dataId, write=False, mapper=weakref.proxy(self), mapping=mapping):
                            subId = dataId.copy()
                            del subId['bbox']
                            loc = mapping.map(mapper, subId, write)
                            bbox = dataId['bbox']
                            llcX = bbox.getMinX()
                            llcY = bbox.getMinY()
                            width = bbox.getWidth()
                            height = bbox.getHeight()
                            loc.additionalData.set('llcX', llcX)
                            loc.additionalData.set('llcY', llcY)
                            loc.additionalData.set('width', width)
                            loc.additionalData.set('height', height)
                            if 'imageOrigin' in dataId:
                                loc.additionalData.set('imageOrigin',
                                                       dataId['imageOrigin'])
                            return loc

                        def querySubClosure(key, format, dataId, mapping=mapping):
                            subId = dataId.copy()
                            del subId['bbox']
                            return mapping.lookup(format, subId)
                        setMethods("sub", mapImpl=mapSubClosure, queryImpl=querySubClosure)

                    if subPolicy["storage"] == "FitsCatalogStorage":
                        # Length of catalog
                        setMethods("len", bypassImpl=lambda datasetType, pythonType, location, dataId:
                                   readMetadata(os.path.join(location.getStorage().root,
                                                             location.getLocations()[0]),
                                                hdu=1).getScalar("NAXIS2"))

                        # Schema of catalog
                        if not datasetType.endswith("_schema") and datasetType + "_schema" not in datasets:
                            setMethods("schema", bypassImpl=lambda datasetType, pythonType, location, dataId:
                                       afwTable.Schema.readFits(os.path.join(location.getStorage().root,
                                                                             location.getLocations()[0])))

    def _computeCcdExposureId(self, dataId):
        """Compute the 64-bit (long) identifier for a CCD exposure.

        Subclasses must override

        Parameters
        ----------
        dataId : `dict`
            Data identifier with visit, ccd.
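
        Examples
        --------
        Purely illustrative sketch of an override; the packing factor is a
        per-camera choice and not part of this base class:

        >>> def _computeCcdExposureId(self, dataId):  # doctest: +SKIP
        ...     # hypothetical: ccd occupies the two low decimal digits
        ...     return dataId["visit"] * 100 + dataId["ccd"]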
        """
        raise NotImplementedError()

    def _computeCoaddExposureId(self, dataId, singleFilter):
        """Compute the 64-bit (long) identifier for a coadd.

        Subclasses must override

        Parameters
        ----------
        dataId : `dict`
            Data identifier with tract and patch.
        singleFilter : `bool`
            True means the desired ID is for a single-filter coadd, in which
            case dataId must contain filter.
        """
        raise NotImplementedError()

    def _search(self, path):
        """Search for path in the associated repository's storage.

        Parameters
        ----------
        path : string
            Path that describes an object in the repository associated with
            this mapper.
            Path may contain an HDU indicator, e.g. 'foo.fits[1]'. The
            indicator will be stripped when searching and so will match
            filenames without the HDU indicator, e.g. 'foo.fits'. The path
            returned WILL contain the indicator though, e.g. ['foo.fits[1]'].

        Returns
        -------
        string
            The path for this object in the repository. Will return None if
            the object can't be found. If the input argument path contained
            an HDU indicator, the returned path will also contain the HDU
            indicator.
        """
        return self.rootStorage.search(path)

    def backup(self, datasetType, dataId):
        """Rename any existing object with the given type and dataId.

        The CameraMapper implementation saves objects in a sequence of e.g.:

        - foo.fits
        - foo.fits~1
        - foo.fits~2

        All of the backups will be placed in the output repo, however, and
        will not be removed if they are found elsewhere in the _parent chain.
        This means that the same file will be stored twice if the previous
        version was found in an input repo.
        """

        # Calling PosixStorage directly is not the long term solution in this
        # function, this is work-in-progress on epic DM-6225. The plan is for
        # parentSearch to be changed to 'search', and search only the storage
        # associated with this mapper. All searching of parents will be
        # handled by traversing the container of repositories in Butler.

        def firstElement(list):
            """Get the first element in the list, or None if that can't be
            done.
            """
            return list[0] if list is not None and len(list) else None

        n = 0
        newLocation = self.map(datasetType, dataId, write=True)
        newPath = newLocation.getLocations()[0]
        path = dafPersist.PosixStorage.search(self.root, newPath, searchParents=True)
        path = firstElement(path)
        oldPaths = []
        while path is not None:
            n += 1
            oldPaths.append((n, path))
            path = dafPersist.PosixStorage.search(self.root, "%s~%d" % (newPath, n), searchParents=True)
            path = firstElement(path)
        for n, oldPath in reversed(oldPaths):
            self.rootStorage.copyFile(oldPath, "%s~%d" % (newPath, n))

    def keys(self):
        """Return supported keys.

        Returns
        -------
        iterable
            List of keys usable in a dataset identifier
        """
        return iter(self.keyDict.keys())

    def getKeys(self, datasetType, level):
        """Return a dict of supported keys and their value types for a given
        dataset type at a given level of the key hierarchy.

        Parameters
        ----------
        datasetType : `str`
            Dataset type or None for all dataset types.
        level : `str` or None
            Level or None for all levels or '' for the default level for the
            camera.

        Returns
        -------
        `dict`
            Keys are strings usable in a dataset identifier, values are their
            value types.
        """

        # not sure if this is how we want to do this. what if None was intended?
        if level == '':
            level = self.getDefaultLevel()

        if datasetType is None:
            keyDict = copy.copy(self.keyDict)
        else:
            keyDict = self.mappings[datasetType].keys()
        if level is not None and level in self.levels:
            keyDict = copy.copy(keyDict)
            for lev in self.levels[level]:
                if lev in keyDict:
                    del keyDict[lev]
        return keyDict

    def getDefaultLevel(self):
        return self.defaultLevel

    def getDefaultSubLevel(self, level):
        if level in self.defaultSubLevels:
            return self.defaultSubLevels[level]
        return None

    @classmethod
    def getCameraName(cls):
        """Return the name of the camera that this CameraMapper is for."""
        className = str(cls)
        className = className[className.find('.'):-1]
        m = re.search(r'(\w+)Mapper', className)
        if m is None:
            m = re.search(r"class '[\w.]*?(\w+)'", className)
        name = m.group(1)
        return name[:1].lower() + name[1:] if name else ''

    @classmethod
    def getPackageName(cls):
        """Return the name of the package containing this CameraMapper."""
        if cls.packageName is None:
            raise ValueError('class variable packageName must not be None')
        return cls.packageName

    @classmethod
    def getPackageDir(cls):
        """Return the base directory of this package"""
        return getPackageDir(cls.getPackageName())

    def map_camera(self, dataId, write=False):
        """Map a camera dataset."""
        if self.camera is None:
            raise RuntimeError("No camera dataset available.")
        actualId = self._transformId(dataId)
        return dafPersist.ButlerLocation(
            pythonType="lsst.afw.cameraGeom.CameraConfig",
            cppType="Config",
            storageName="ConfigStorage",
            locationList=self.cameraDataLocation or "ignored",
            dataId=actualId,
            mapper=self,
            storage=self.rootStorage
        )

    def bypass_camera(self, datasetType, pythonType, butlerLocation, dataId):
        """Return the (preloaded) camera object.
        """
        if self.camera is None:
            raise RuntimeError("No camera dataset available.")
        return self.camera

    def map_defects(self, dataId, write=False):
        """Map defects dataset.

        Returns
        -------
        `lsst.daf.persistence.ButlerLocation`
            Minimal ButlerLocation containing just the locationList field
            (just enough information that bypass_defects can use it).
        """
        defectFitsPath = self._defectLookup(dataId=dataId)
        if defectFitsPath is None:
            raise RuntimeError("No defects available for dataId=%s" % (dataId,))

        return dafPersist.ButlerLocation(None, None, None, defectFitsPath,
                                         dataId, self,
                                         storage=self.rootStorage)

    def bypass_defects(self, datasetType, pythonType, butlerLocation, dataId):
        """Return a defect based on the butler location returned by map_defects

        Parameters
        ----------
        butlerLocation : `lsst.daf.persistence.ButlerLocation`
            locationList = path to defects FITS file
        dataId : `dict`
            Butler data ID; "ccd" must be set.

        Note: the name "bypass_XXX" means the butler makes no attempt to
        convert the ButlerLocation into an object, which is what we want for
        now, since that conversion is a bit tricky.
        """
        detectorName = self._extractDetectorName(dataId)
        defectsFitsPath = butlerLocation.locationList[0]

        with fits.open(defectsFitsPath) as hduList:
            for hdu in hduList[1:]:
                if hdu.header["name"] != detectorName:
                    continue

                defectList = []
                for data in hdu.data:
                    bbox = afwGeom.Box2I(
                        afwGeom.Point2I(int(data['x0']), int(data['y0'])),
                        afwGeom.Extent2I(int(data['width']), int(data['height'])),
                    )
                    defectList.append(afwImage.DefectBase(bbox))
                return defectList

        raise RuntimeError("No defects for ccd %s in %s" % (detectorName, defectsFitsPath))

    def map_expIdInfo(self, dataId, write=False):
        return dafPersist.ButlerLocation(
            pythonType="lsst.obs.base.ExposureIdInfo",
            cppType=None,
            storageName="Internal",
            locationList="ignored",
            dataId=dataId,
            mapper=self,
            storage=self.rootStorage
        )

    def bypass_expIdInfo(self, datasetType, pythonType, location, dataId):
        """Hook to retrieve an lsst.obs.base.ExposureIdInfo for an exposure"""
        expId = self.bypass_ccdExposureId(datasetType, pythonType, location, dataId)
        expBits = self.bypass_ccdExposureId_bits(datasetType, pythonType, location, dataId)
        return ExposureIdInfo(expId=expId, expBits=expBits)

    def std_bfKernel(self, item, dataId):
        """Disable standardization for bfKernel

        bfKernel is a calibration product that is a numpy array, unlike other
        calibration products, which are all images; all calibration images
        are sent through _standardizeExposure due to CalibrationMapping, but
        we don't want that to happen to bfKernel.
        """
        return item

    def std_raw(self, item, dataId):
        """Standardize a raw dataset by converting it to an Exposure instead
        of an Image"""
        return self._standardizeExposure(self.exposures['raw'], item, dataId,
                                         trimmed=False, setVisitInfo=True)

    def map_skypolicy(self, dataId):
        """Map a sky policy."""
        return dafPersist.ButlerLocation("lsst.pex.policy.Policy", "Policy",
                                         "Internal", None, None, self,
                                         storage=self.rootStorage)

    def std_skypolicy(self, item, dataId):
        """Standardize a sky policy by returning the one we use."""
        return self.skypolicy

###############################################################################
#
# Utility functions
#
###############################################################################

    def _getCcdKeyVal(self, dataId):
        """Return the CCD key and value used to look up a defect in the
        defect registry.

        The default implementation simply returns ("ccd", full detector name)
        """
        return ("ccd", self._extractDetectorName(dataId))

    def _setupRegistry(self, name, description, path, policy, policyKey, storage, searchParents=True,
                       posixIfNoSql=True):
        """Set up a registry (usually SQLite3), trying a number of possible
        paths.

        Parameters
        ----------
        name : string
            Name of registry.
        description : `str`
            Description of registry (for log messages)
        path : string
            Path for registry.
        policy : string
            Policy that contains the registry name, used if path is None.
        policyKey : string
            Key in policy for registry path.
        storage : Storage subclass
            Repository Storage to look in.
        searchParents : bool, optional
            True if the search for a registry should follow any Butler v1
            _parent symlinks.
        posixIfNoSql : bool, optional
            If an sqlite registry is not found, will create a posix registry
            if this is True.

        Returns
        -------
        lsst.daf.persistence.Registry
            Registry object
        """
        if path is None and policyKey in policy:
            path = dafPersist.LogicalLocation(policy[policyKey]).locString()
            if os.path.isabs(path):
                raise RuntimeError("Policy should not indicate an absolute path for registry.")
            if not storage.exists(path):
                newPath = storage.instanceSearch(path)

                newPath = newPath[0] if newPath is not None and len(newPath) else None
                if newPath is None:
                    self.log.warn("Unable to locate registry at policy path (also looked in root): %s",
                                  path)
                path = newPath
            else:
                self.log.warn("Unable to locate registry at policy path: %s", path)
                path = None

        # Old Butler API was to indicate the registry WITH the repo folder,
        # New Butler expects the registry to be in the repo folder. To support
        # Old API, check to see if path starts with root, and if so, strip
        # root from path. Currently only works with PosixStorage
        try:
            root = storage.root
            if path and (path.startswith(root)):
                path = path[len(root + '/'):]
        except AttributeError:
            pass

        # determine if there is an sqlite registry and if not, try the posix registry.
        registry = None

        def search(filename, description):
            """Search for file in storage

            Parameters
            ----------
            filename : `str`
                Filename to search for
            description : `str`
                Description of file, for error message.

            Returns
            -------
            path : `str` or `None`
                Path to file, or None
            """
            result = storage.instanceSearch(filename)
            if result:
                return result[0]
            self.log.debug("Unable to locate %s: %s", description, filename)
            return None

        # Search for a suitable registry database
        if path is None:
            path = search("%s.pgsql" % name, "%s in root" % description)
        if path is None:
            path = search("%s.sqlite3" % name, "%s in root" % description)
        if path is None:
            path = search(os.path.join(".", "%s.sqlite3" % name), "%s in current dir" % description)

        if path is not None:
            if not storage.exists(path):
                newPath = storage.instanceSearch(path)
                newPath = newPath[0] if newPath is not None and len(newPath) else None
                if newPath is not None:
                    path = newPath
            localFileObj = storage.getLocalFile(path)
            self.log.info("Loading %s registry from %s", description, localFileObj.name)
            registry = dafPersist.Registry.create(localFileObj.name)
            localFileObj.close()
        elif not registry and posixIfNoSql:
            try:
                self.log.info("Loading Posix %s registry from %s", description, storage.root)
                registry = dafPersist.PosixRegistry(storage.root)
            except Exception:
                registry = None

        return registry

    def _transformId(self, dataId):
        """Generate a standard ID dict from a camera-specific ID dict.

        Canonical keys include:

        - amp: amplifier name
        - ccd: CCD name (in LSST this is a combination of raft and sensor)

        The default implementation returns a copy of its input.

        Parameters
        ----------
        dataId : `dict`
            Dataset identifier; this must not be modified

        Returns
        -------
        `dict`
            Transformed dataset identifier.
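
        Examples
        --------
        An illustrative override for a camera whose users say ``ccdname``
        instead of ``ccd`` (both the key and the mapping are hypothetical):

        >>> def _transformId(self, dataId):  # doctest: +SKIP
        ...     actualId = dataId.copy()
        ...     if "ccdname" in actualId:
        ...         actualId["ccd"] = actualId.pop("ccdname")
        ...     return actualId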
        """

        return dataId.copy()

    def _mapActualToPath(self, template, actualId):
        """Convert a template path to an actual path, using the actual data
        identifier. This implementation is usually sufficient but can be
        overridden by the subclass.

        Parameters
        ----------
        template : `str`
            Template path
        actualId : `dict`
            Dataset identifier

        Returns
        -------
        `str`
            Pathname
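
        Examples
        --------
        Templates use Python %-formatting with named keys; the template and
        data ID below are illustrative only:

        >>> "raw/v%(visit)d/ccd%(ccd)02d.fits" % {"visit": 1, "ccd": 3}
        'raw/v1/ccd03.fits'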
        """

        try:
            transformedId = self._transformId(actualId)
            return template % transformedId
        except Exception as e:
            raise RuntimeError("Failed to format %r with data %r: %s" % (template, transformedId, e))

    @staticmethod
    def getShortCcdName(ccdName):
        """Convert a CCD name to a form useful as a filename

        The default implementation converts spaces to underscores.
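
        For example, with an LSST-style raft/sensor name:

        >>> CameraMapper.getShortCcdName("R:1,2 S:3,4")
        'R:1,2_S:3,4'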
        """
        return ccdName.replace(" ", "_")

    def _extractDetectorName(self, dataId):
        """Extract the detector (CCD) name from the dataset identifier.

        The name in question is the detector name used by lsst.afw.cameraGeom.

        Parameters
        ----------
        dataId : `dict`
            Dataset identifier.

        Returns
        -------
        `str`
            Detector name
        """
        raise NotImplementedError("No _extractDetectorName() function specified")

    def _extractAmpId(self, dataId):
        """Extract the amplifier identifier from a dataset identifier.

        .. note:: Deprecated in 11_0

        The amplifier identifier has two parts: the detector name for the
        CCD containing the amplifier and the index of the amplifier in the
        detector.

        Parameters
        ----------
        dataId : `dict`
            Dataset identifier

        Returns
        -------
        `tuple`
            Amplifier identifier
        """

        trDataId = self._transformId(dataId)
        return (trDataId["ccd"], int(trDataId['amp']))

    def _setAmpDetector(self, item, dataId, trimmed=True):
        """Set the detector object in an Exposure for an amplifier.

        Defects are also added to the Exposure based on the detector object.

        Parameters
        ----------
        item : `lsst.afw.image.Exposure`
            Exposure to set the detector in.
        dataId : `dict`
            Dataset identifier
        trimmed : `bool`
            Should detector be marked as trimmed? (ignored)
        """

        return self._setCcdDetector(item=item, dataId=dataId, trimmed=trimmed)

    def _setCcdDetector(self, item, dataId, trimmed=True):
        """Set the detector object in an Exposure for a CCD.

        Parameters
        ----------
        item : `lsst.afw.image.Exposure`
            Exposure to set the detector in.
        dataId : `dict`
            Dataset identifier
        trimmed : `bool`
            Should detector be marked as trimmed? (ignored)
        """
        if item.getDetector() is not None:
            return

        detectorName = self._extractDetectorName(dataId)
        detector = self.camera[detectorName]
        item.setDetector(detector)

    def _setFilter(self, mapping, item, dataId):
        """Set the filter object in an Exposure. If the Exposure had a FILTER
        keyword, this was already processed during load. But if it didn't,
        use the filter from the registry.

        Parameters
        ----------
        mapping : `lsst.obs.base.Mapping`
            Where to get the filter from.
        item : `lsst.afw.image.Exposure`
            Exposure to set the filter in.
        dataId : `dict`
            Dataset identifier.
        """

        if not (isinstance(item, afwImage.ExposureU) or isinstance(item, afwImage.ExposureI) or
                isinstance(item, afwImage.ExposureF) or isinstance(item, afwImage.ExposureD)):
            return

        if item.getFilter().getId() != afwImage.Filter.UNKNOWN:
            return

        actualId = mapping.need(['filter'], dataId)
        filterName = actualId['filter']
        if self.filters is not None and filterName in self.filters:
            filterName = self.filters[filterName]
        item.setFilter(afwImage.Filter(filterName))

    # Default standardization function for exposures
    def _standardizeExposure(self, mapping, item, dataId, filter=True,
                             trimmed=True, setVisitInfo=True):
        """Default standardization function for images.

        This sets the Detector from the camera geometry and optionally sets
        the Filter. In both cases this saves having to persist some data in
        each exposure (or image).

        Parameters
        ----------
        mapping : `lsst.obs.base.Mapping`
            Where to get the values from.
        item : image-like object
            Can be any of lsst.afw.image.Exposure,
            lsst.afw.image.DecoratedImage, lsst.afw.image.Image
            or lsst.afw.image.MaskedImage

        dataId : `dict`
            Dataset identifier
        filter : `bool`
            Set filter? Ignored if item is already an exposure
        trimmed : `bool`
            Should detector be marked as trimmed?
        setVisitInfo : `bool`
            Should Exposure have its VisitInfo filled out from the metadata?

        Returns
        -------
        `lsst.afw.image.Exposure`
            The standardized Exposure.
        """
        try:
            item = exposureFromImage(item, dataId, mapper=self, logger=self.log, setVisitInfo=setVisitInfo)
        except Exception as e:
            self.log.error("Could not turn item=%r into an exposure: %s" % (repr(item), e))
            raise

        if mapping.level.lower() == "amp":
            self._setAmpDetector(item, dataId, trimmed)
        elif mapping.level.lower() == "ccd":
            self._setCcdDetector(item, dataId, trimmed)

        if filter:
            self._setFilter(mapping, item, dataId)

        return item

    def _defectLookup(self, dataId, dateKey='taiObs'):
        """Find the defects for a given CCD.

        Parameters
        ----------
        dataId : `dict`
            Dataset identifier
        dateKey : `str`
            Name of the registry column holding the observation time.

        Returns
        -------
        `str`
            Path to the defects file or None if not available.
        """
        if self.defectRegistry is None:
            return None
        if self.registry is None:
            raise RuntimeError("No registry for defect lookup")

        ccdKey, ccdVal = self._getCcdKeyVal(dataId)

        dataIdForLookup = {'visit': dataId['visit']}
        # .lookup will fail in a posix registry because there is no template to provide.
        rows = self.registry.lookup((dateKey), ('raw_visit'), dataIdForLookup)
        if len(rows) == 0:
            return None
        assert len(rows) == 1
        dayObs = rows[0][0]

        # Lookup the defects for this CCD serial number that are valid at the exposure midpoint.
        rows = self.defectRegistry.executeQuery(("path",), ("defect",),
                                                [(ccdKey, "?")],
                                                ("DATETIME(?)", "DATETIME(validStart)",
                                                 "DATETIME(validEnd)"),
                                                (ccdVal, dayObs))
        if not rows or len(rows) == 0:
            return None
        if len(rows) == 1:
            return os.path.join(self.defectPath, rows[0][0])
        else:
            raise RuntimeError("Querying for defects (%s, %s) returns %d files: %s" %
                               (ccdVal, dayObs, len(rows), ", ".join([_[0] for _ in rows])))

    def _makeCamera(self, policy, repositoryDir):
        """Make a camera (instance of lsst.afw.cameraGeom.Camera) describing
        the camera geometry.

        Also set self.cameraDataLocation, if relevant (else it can be left
        None).

        This implementation assumes that policy contains an entry "camera"
        that points to the subdirectory in this package of camera data;
        specifically, that subdirectory must contain:

        - a file named `camera.py` that contains persisted camera config
        - ampInfo table FITS files, as required by
          lsst.afw.cameraGeom.makeCameraFromPath

        Parameters
        ----------
        policy : `lsst.daf.persistence.Policy`
            Policy with per-camera defaults already merged
            (PexPolicy only for backward compatibility).
        repositoryDir : `str`
            Policy repository for the subclassing module (obtained with
            getRepositoryPath() on the per-camera default dictionary).
        """
        if 'camera' not in policy:
            raise RuntimeError("Cannot find 'camera' in policy; cannot construct a camera")
        cameraDataSubdir = policy['camera']
        self.cameraDataLocation = os.path.normpath(
            os.path.join(repositoryDir, cameraDataSubdir, "camera.py"))
        cameraConfig = afwCameraGeom.CameraConfig()
        cameraConfig.load(self.cameraDataLocation)
        ampInfoPath = os.path.dirname(self.cameraDataLocation)
        return afwCameraGeom.makeCameraFromPath(
            cameraConfig=cameraConfig,
            ampInfoPath=ampInfoPath,
            shortNameFunc=self.getShortCcdName,
            pupilFactoryClass=self.PupilFactoryClass
        )

    def getRegistry(self):
        """Get the registry used by this mapper.

        Returns
        -------
        Registry or None
            The registry used by this mapper for this mapper's repository.
        """
        return self.registry

    def getImageCompressionSettings(self, datasetType, dataId):
        """Stuff image compression settings into a daf.base.PropertySet

        This goes into the ButlerLocation's "additionalData", which gets
        passed into the boost::persistence framework.

        Parameters
        ----------
        datasetType : `str`
            Type of dataset for which to get the image compression settings.
        dataId : `dict`
            Dataset identifier.

        Returns
        -------
        additionalData : `lsst.daf.base.PropertySet`
            Image compression settings.
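
        Examples
        --------
        A usage sketch (the dataset type and data ID keys are hypothetical;
        the recipe applied is whichever one the mapping policy names for
        that dataset type):

        >>> settings = mapper.getImageCompressionSettings(  # doctest: +SKIP
        ...     "calexp", {"visit": 1, "ccd": 0})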
        """
        mapping = self.mappings[datasetType]
        recipeName = mapping.recipe
        storageType = mapping.storage
        if storageType not in self._writeRecipes:
            return dafBase.PropertySet()
        if recipeName not in self._writeRecipes[storageType]:
            raise RuntimeError("Unrecognized write recipe for datasetType %s (storage type %s): %s" %
                               (datasetType, storageType, recipeName))
        recipe = self._writeRecipes[storageType][recipeName].deepCopy()
        seed = hash(tuple(dataId.items())) % 2**31
        for plane in ("image", "mask", "variance"):
            if recipe.exists(plane + ".scaling.seed") and recipe.getScalar(plane + ".scaling.seed") == 0:
                recipe.set(plane + ".scaling.seed", seed)
        return recipe

    def _initWriteRecipes(self):
        """Read the recipes for writing files

        These recipes are currently used for configuring FITS compression,
        but they could have wider uses for configuring different flavors
        of the storage types. A recipe is referred to by a symbolic name,
        which has associated settings. These settings are stored as a
        `PropertySet` so they can easily be passed down to the
        boost::persistence framework as the "additionalData" parameter.

        The list of recipes is written in YAML. A default recipe and
        some other convenient recipes are in obs_base/policy/writeRecipes.yaml
        and these may be overridden or supplemented by the individual obs_*
        packages' own policy/writeRecipes.yaml files.

        Recipes are grouped by the storage type. Currently, only the
        ``FitsStorage`` storage type uses recipes, to configure FITS image
        compression.

        Each ``FitsStorage`` recipe for FITS compression should define
        "image", "mask" and "variance" entries, each of which may contain
        "compression" and "scaling" entries. Defaults will be provided for
        any missing elements under "compression" and "scaling".

        The allowed entries under "compression" are:

        * algorithm (string): compression algorithm to use
        * rows (int): number of rows per tile (0 = entire dimension)
        * columns (int): number of columns per tile (0 = entire dimension)
        * quantizeLevel (float): cfitsio quantization level

        The allowed entries under "scaling" are:

        * algorithm (string): scaling algorithm to use
        * bitpix (int): bits per pixel (0,8,16,32,64,-32,-64)
        * fuzz (bool): fuzz the values when quantising floating-point values?
        * seed (long): seed for random number generator when fuzzing
        * maskPlanes (list of string): mask planes to ignore when doing
          statistics
        * quantizeLevel: divisor of the standard deviation for STDEV_* scaling
        * quantizePad: number of stdev to allow on the low side (for
          STDEV_POSITIVE/NEGATIVE)
        * bscale: manually specified BSCALE (for MANUAL scaling)
        * bzero: manually specified BZERO (for MANUAL scaling)

        A very simple example YAML recipe:

            FitsStorage:
              default:
                image: &default
                  compression:
                    algorithm: GZIP_SHUFFLE
                mask: *default
                variance: *default
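
        A hypothetical supplementary recipe in an obs package's own
        policy/writeRecipes.yaml (the recipe name and settings are
        illustrative only; a dataset selects a recipe via a ``recipe``
        entry in its mapping policy):

            FitsStorage:
              myLossyRecipe:
                image: &myLossyRecipe
                  compression:
                    algorithm: GZIP_SHUFFLE
                    quantizeLevel: 16.0
                mask: *myLossyRecipe
                variance: *myLossyRecipe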
        """
        recipesFile = os.path.join(getPackageDir("obs_base"), "policy", "writeRecipes.yaml")
        recipes = dafPersist.Policy(recipesFile)
        supplementsFile = os.path.join(self.getPackageDir(), "policy", "writeRecipes.yaml")
        validationMenu = {'FitsStorage': validateRecipeFitsStorage, }
        if os.path.exists(supplementsFile) and supplementsFile != recipesFile:
            supplements = dafPersist.Policy(supplementsFile)
            # Don't allow overrides, only supplements
            for entry in validationMenu:
                intersection = set(recipes[entry].names()).intersection(set(supplements.names()))
                if intersection:
                    raise RuntimeError("Recipes provided in %s section %s may not override those in %s: %s" %
                                       (supplementsFile, entry, recipesFile, intersection))
            recipes.update(supplements)

        self._writeRecipes = {}
        for storageType in recipes.names(True):
            if "default" not in recipes[storageType]:
                raise RuntimeError("No 'default' recipe defined for storage type %s in %s" %
                                   (storageType, recipesFile))
            self._writeRecipes[storageType] = validationMenu[storageType](recipes[storageType])


def exposureFromImage(image, dataId=None, mapper=None, logger=None, setVisitInfo=True):
    """Generate an Exposure from an image-like object

    If the image is a DecoratedImage then also set its WCS and metadata
    (Image and MaskedImage are missing the necessary metadata
    and Exposure already has those set)

    Parameters
    ----------
    image : Image-like object
        Can be one of lsst.afw.image.DecoratedImage, Image, MaskedImage or
        Exposure.

    Returns
    -------
    `lsst.afw.image.Exposure`
        Exposure containing input image.
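
    Examples
    --------
    A minimal sketch; a bare Image carries no WCS or metadata, so the
    result is a plain Exposure wrapping it:

    >>> import lsst.afw.image as afwImage              # doctest: +SKIP
    >>> exp = exposureFromImage(afwImage.ImageF(16, 16))  # doctest: +SKIP
    >>> isinstance(exp, afwImage.ExposureF)            # doctest: +SKIP
    True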
    """
    metadata = None
    if isinstance(image, afwImage.MaskedImage):
        exposure = afwImage.makeExposure(image)
    elif isinstance(image, afwImage.DecoratedImage):
        exposure = afwImage.makeExposure(afwImage.makeMaskedImage(image.getImage()))
        metadata = image.getMetadata()
        try:
            wcs = afwGeom.makeSkyWcs(metadata, strip=True)
            exposure.setWcs(wcs)
        except pexExcept.TypeError as e:
            # raised on failure to create a wcs (and possibly others)
            if logger is None:
                logger = lsstLog.Log.getLogger("CameraMapper")
            logger.debug("wcs set to None; insufficient information found in metadata to create a valid"
                         " wcs: %s", e.args[0])

        exposure.setMetadata(metadata)
    elif isinstance(image, afwImage.Exposure):
        # Exposure
        exposure = image
        metadata = exposure.getMetadata()
    else:
        # Image
        exposure = afwImage.makeExposure(afwImage.makeMaskedImage(image))
    #
    # set VisitInfo if we can
    #
    if setVisitInfo and exposure.getInfo().getVisitInfo() is None:
        if metadata is not None:
            if mapper is None:
                if not logger:
                    logger = lsstLog.Log.getLogger("CameraMapper")
                logger.warn("I can only set the VisitInfo if you provide a mapper")
            else:
                exposureId = mapper._computeCcdExposureId(dataId)
                visitInfo = mapper.makeRawVisitInfo(md=metadata, exposureId=exposureId)

                exposure.getInfo().setVisitInfo(visitInfo)

    return exposure


def validateRecipeFitsStorage(recipes):
    """Validate recipes for FitsStorage

    The recipes are supplemented with default values where appropriate.

    TODO: replace this custom validation code with Cerberus (DM-11846)

    Parameters
    ----------
    recipes : `lsst.daf.persistence.Policy`
        FitsStorage recipes to validate.

    Returns
    -------
    validated : `lsst.daf.base.PropertySet`
        Validated FitsStorage recipe.

    Raises
    ------
    `RuntimeError`
        If validation fails.
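
    Examples
    --------
    Illustrative sketch of the validated layout: for a recipe named
    ``default``, the returned mapping holds a `PropertySet` under
    ``"default"`` with one entry per plane and schema key, e.g.::

        image.compression.algorithm = "NONE"
        image.scaling.bitpix = 0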
    """
    # Schemas define what should be there, and the default values (and by the
    # default value, the expected type).
    compressionSchema = {
        "algorithm": "NONE",
        "rows": 1,
        "columns": 0,
        "quantizeLevel": 0.0,
    }
    scalingSchema = {
        "algorithm": "NONE",
        "bitpix": 0,
        "maskPlanes": ["NO_DATA"],
        "seed": 0,
        "quantizeLevel": 4.0,
        "quantizePad": 5.0,
        "fuzz": True,
        "bscale": 1.0,
        "bzero": 0.0,
    }

    def checkUnrecognized(entry, allowed, description):
        """Check to see if the entry contains unrecognised keywords"""
        unrecognized = set(entry.keys()) - set(allowed)
        if unrecognized:
            raise RuntimeError(
                "Unrecognized entries when parsing image compression recipe %s: %s" %
                (description, unrecognized))

    validated = {}
    for name in recipes.names(True):
        checkUnrecognized(recipes[name], ["image", "mask", "variance"], name)
        rr = dafBase.PropertySet()
        validated[name] = rr
        for plane in ("image", "mask", "variance"):
            checkUnrecognized(recipes[name][plane], ["compression", "scaling"],
                              name + "->" + plane)

            for settings, schema in (("compression", compressionSchema),
                                     ("scaling", scalingSchema)):
                prefix = plane + "." + settings
                if settings not in recipes[name][plane]:
                    for key in schema:
                        rr.set(prefix + "." + key, schema[key])
                    continue
                entry = recipes[name][plane][settings]
                checkUnrecognized(entry, schema.keys(), name + "->" + plane + "->" + settings)
                for key in schema:
                    value = type(schema[key])(entry[key]) if key in entry else schema[key]
                    rr.set(prefix + "." + key, value)
    return validated