cameraMapper.py
#
# LSST Data Management System
# Copyright 2008, 2009, 2010 LSST Corporation.
#
# This product includes software developed by the
# LSST Project (http://www.lsst.org/).
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the LSST License Statement and
# the GNU General Public License along with this program. If not,
# see <http://www.lsstcorp.org/LegalNotices/>.
#

import copy
import os
from astropy.io import fits  # required by _makeDefectsDict until defects are written as AFW tables
import re
import weakref
import lsst.daf.persistence as dafPersist
from . import ImageMapping, ExposureMapping, CalibrationMapping, DatasetMapping
import lsst.daf.base as dafBase
import lsst.afw.geom as afwGeom
import lsst.afw.image as afwImage
import lsst.afw.table as afwTable
from lsst.afw.fits import readMetadata
import lsst.afw.cameraGeom as afwCameraGeom
import lsst.log as lsstLog
import lsst.pex.exceptions as pexExcept
from .exposureIdInfo import ExposureIdInfo
from .makeRawVisitInfo import MakeRawVisitInfo
from lsst.utils import getPackageDir

__all__ = ["CameraMapper", "exposureFromImage"]



class CameraMapper(dafPersist.Mapper):

    """CameraMapper is a base class for mappers that handle images from a
    camera and products derived from them. This provides an abstraction layer
    between the data on disk and the code.

    Public methods: keys, queryMetadata, getDatasetTypes, map,
    canStandardize, standardize

    Mappers for specific data sources (e.g., CFHT Megacam, LSST
    simulations, etc.) should inherit this class.

    The CameraMapper manages datasets within a "root" directory. Note that
    writing to a dataset present in the input root will hide the existing
    dataset but not overwrite it. See #2160 for design discussion.

    A camera is assumed to consist of one or more rafts, each composed of
    multiple CCDs. Each CCD is in turn composed of one or more amplifiers
    (amps). A camera is also assumed to have a camera geometry description
    (CameraGeom object) as a policy file, a filter description (Filter class
    static configuration) as another policy file, and an optional defects
    description directory.

    Information from the camera geometry and defects are inserted into all
    Exposure objects returned.

    The mapper uses one or two registries to retrieve metadata about the
    images. The first is a registry of all raw exposures. This must contain
    the time of the observation. One or more tables (or the equivalent)
    within the registry are used to look up data identifier components that
    are not specified by the user (e.g. filter) and to return results for
    metadata queries. The second is an optional registry of all calibration
    data. This should contain validity start and end entries for each
    calibration dataset in the same timescale as the observation time.

    Subclasses will typically set MakeRawVisitInfoClass:

    MakeRawVisitInfoClass: a class variable that points to a subclass of
    MakeRawVisitInfo, a functor that creates an
    lsst.afw.image.VisitInfo from the FITS metadata of a raw image.

    Subclasses must provide the following methods:

    _extractDetectorName(self, dataId): returns the detector name for a CCD
    (e.g., "CFHT 21", "R:1,2 S:3,4") as used in the AFW CameraGeom class given
    a dataset identifier referring to that CCD or a subcomponent of it.

    _computeCcdExposureId(self, dataId): see below

    _computeCoaddExposureId(self, dataId, singleFilter): see below

    Subclasses may also need to override the following methods:

    _transformId(self, dataId): transformation of a data identifier
    from colloquial usage (e.g., "ccdname") to proper/actual usage
    (e.g., "ccd"), including making suitable for path expansion (e.g. removing
    commas). The default implementation does nothing. Note that this
    method should not modify its input parameter.

    getShortCcdName(self, ccdName): a static method that returns a shortened
    name suitable for use as a filename. The default version converts spaces
    to underscores.

    _getCcdKeyVal(self, dataId): return a CCD key and value
    by which to look up defects in the defects registry.
    The default value returns ("ccd", detector name)

    _mapActualToPath(self, template, actualId): convert a template path to an
    actual path, using the actual dataset identifier.

    The mapper's behaviors are largely specified by the policy file.
    See the MapperDictionary.paf for descriptions of the available items.

    The 'exposures', 'calibrations', and 'datasets' subpolicies configure
    mappings (see Mappings class).

    Common default mappings for all subclasses can be specified in the
    "policy/{images,exposures,calibrations,datasets}.yaml" files. This
    provides a simple way to add a product to all camera mappers.

    Functions to map (provide a path to the data given a dataset
    identifier dictionary) and standardize (convert data into some standard
    format or type) may be provided in the subclass as "map_{dataset type}"
    and "std_{dataset type}", respectively.

    If non-Exposure datasets cannot be retrieved using standard
    daf_persistence methods alone, a "bypass_{dataset type}" function may be
    provided in the subclass to return the dataset instead of using the
    "datasets" subpolicy.

    Implementations of map_camera and bypass_camera that should typically be
    sufficient are provided in this base class.

    Notes
    -----
    TODO:

    - Handle defects the same way as all other calibration products, using
      the calibration registry
    - Instead of auto-loading the camera at construction time, load it from
      the calibration registry
    - Rewrite defects as AFW tables so we don't need astropy.io.fits to
      unpersist them; then remove all mention of astropy.io.fits from this
      package.
    """
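
    # Illustrative sketch (not part of this module, all names hypothetical):
    # a minimal concrete subclass only needs to set packageName and supply
    # the required overrides described above, e.g.
    #
    #     class MyCameraMapper(CameraMapper):
    #         packageName = "obs_mycamera"
    #
    #         def _extractDetectorName(self, dataId):
    #             return "R:%(raft)s S:%(sensor)s" % dataId
    #
    #         def _computeCcdExposureId(self, dataId):
    #             return dataId["visit"] * 100 + dataId["ccd"]
    #
    #         def _computeCoaddExposureId(self, dataId, singleFilter):
    #             raise NotImplementedError()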
    packageName = None

    # a class or subclass of MakeRawVisitInfo, a functor that makes an
    # lsst.afw.image.VisitInfo from the FITS metadata of a raw image
    MakeRawVisitInfoClass = MakeRawVisitInfo

    # a class or subclass of PupilFactory
    PupilFactoryClass = afwCameraGeom.PupilFactory

    def __init__(self, policy, repositoryDir,
                 root=None, registry=None, calibRoot=None, calibRegistry=None,
                 provided=None, parentRegistry=None, repositoryCfg=None):
        """Initialize the CameraMapper.

        Parameters
        ----------
        policy : daf_persistence.Policy
            Policy with per-camera defaults already merged.
        repositoryDir : string
            Policy repository for the subclassing module (obtained with
            getRepositoryPath() on the per-camera default dictionary).
        root : string, optional
            Path to the root directory for data.
        registry : string, optional
            Path to registry with data's metadata.
        calibRoot : string, optional
            Root directory for calibrations.
        calibRegistry : string, optional
            Path to registry with calibrations' metadata.
        provided : list of string, optional
            Keys provided by the mapper.
        parentRegistry : Registry subclass, optional
            Registry from a parent repository that may be used to look up
            data's metadata.
        repositoryCfg : daf_persistence.RepositoryCfg or None, optional
            The configuration information for the repository this mapper is
            being used with.
        """

        dafPersist.Mapper.__init__(self)

        self.log = lsstLog.Log.getLogger("CameraMapper")

        if root:
            self.root = root
        elif repositoryCfg:
            self.root = repositoryCfg.root
        else:
            self.root = None

        repoPolicy = repositoryCfg.policy if repositoryCfg else None
        if repoPolicy is not None:
            policy.update(repoPolicy)

        # Levels
        self.levels = dict()
        if 'levels' in policy:
            levelsPolicy = policy['levels']
            for key in levelsPolicy.names(True):
                self.levels[key] = set(levelsPolicy.asArray(key))
        self.defaultLevel = policy['defaultLevel']
        self.defaultSubLevels = dict()
        if 'defaultSubLevels' in policy:
            self.defaultSubLevels = policy['defaultSubLevels']

        # Root directories
        if root is None:
            root = "."
        root = dafPersist.LogicalLocation(root).locString()

        self.rootStorage = dafPersist.Storage.makeFromURI(uri=root)

        # If the calibRoot is passed in, use that. If not and it's indicated in
        # the policy, use that. And otherwise, the calibs are in the regular
        # root.
        # If the location indicated by the calib root does not exist, do not
        # create it.
        calibStorage = None
        if calibRoot is not None:
            calibRoot = dafPersist.Storage.absolutePath(root, calibRoot)
            calibStorage = dafPersist.Storage.makeFromURI(uri=calibRoot,
                                                          create=False)
        else:
            calibRoot = policy.get('calibRoot', None)
            if calibRoot:
                calibStorage = dafPersist.Storage.makeFromURI(uri=calibRoot,
                                                              create=False)
        if calibStorage is None:
            calibStorage = self.rootStorage

        self.root = root

        # Registries
        self.registry = self._setupRegistry("registry", "exposure", registry, policy, "registryPath",
                                            self.rootStorage, searchParents=False,
                                            posixIfNoSql=(not parentRegistry))
        if not self.registry:
            self.registry = parentRegistry
        needCalibRegistry = policy.get('needCalibRegistry', None)
        if needCalibRegistry:
            if calibStorage:
                self.calibRegistry = self._setupRegistry("calibRegistry", "calib", calibRegistry, policy,
                                                         "calibRegistryPath", calibStorage,
                                                         posixIfNoSql=False)  # NB never use posix for calibs
            else:
                raise RuntimeError(
                    "'needCalibRegistry' is true in Policy, but was unable to locate a repo at " +
                    "calibRoot ivar:%s or policy['calibRoot']:%s" %
                    (calibRoot, policy.get('calibRoot', None)))
        else:
            self.calibRegistry = None

        # Dict of valid keys and their value types
        self.keyDict = dict()

        self._initMappings(policy, self.rootStorage, calibStorage, provided=None)
        self._initWriteRecipes()

        # Camera geometry
        self.cameraDataLocation = None  # path to camera geometry config file
        self.camera = self._makeCamera(policy=policy, repositoryDir=repositoryDir)

        # Defect registry and root. Defects are stored with the camera and the registry is loaded from the
        # camera package, which is on the local filesystem.
        self.defectRegistry = None
        if 'defects' in policy:
            self.defectPath = os.path.join(repositoryDir, policy['defects'])
            defectRegistryLocation = os.path.join(self.defectPath, "defectRegistry.sqlite3")
            self.defectRegistry = dafPersist.Registry.create(defectRegistryLocation)

        # Filter translation table
        self.filters = None

        # verify that the class variable packageName is set before attempting
        # to instantiate an instance
        if self.packageName is None:
            raise ValueError('class variable packageName must not be None')

        self.makeRawVisitInfo = self.MakeRawVisitInfoClass(log=self.log)

    def _initMappings(self, policy, rootStorage=None, calibStorage=None, provided=None):
        """Initialize mappings

        For each of the dataset types that we want to be able to read, there
        are methods that can be created to support them:
        * map_<dataset> : determine the path for dataset
        * std_<dataset> : standardize the retrieved dataset
        * bypass_<dataset> : retrieve the dataset (bypassing the usual
          retrieval machinery)
        * query_<dataset> : query the registry

        Besides the dataset types explicitly listed in the policy, we create
        additional, derived datasets for additional conveniences,
        e.g., reading the header of an image, retrieving only the size of a
        catalog.

        Parameters
        ----------
        policy : `lsst.daf.persistence.Policy`
            Policy with per-camera defaults already merged
        rootStorage : `Storage subclass instance`
            Interface to persisted repository data.
        calibStorage : `Storage subclass instance`
            Interface to persisted calib repository data
        provided : `list` of `str`
            Keys provided by the mapper
        """
        # Sub-dictionaries (for exposure/calibration/dataset types)
        imgMappingPolicy = dafPersist.Policy(dafPersist.Policy.defaultPolicyFile(
            "obs_base", "ImageMappingDefaults.yaml", "policy"))
        expMappingPolicy = dafPersist.Policy(dafPersist.Policy.defaultPolicyFile(
            "obs_base", "ExposureMappingDefaults.yaml", "policy"))
        calMappingPolicy = dafPersist.Policy(dafPersist.Policy.defaultPolicyFile(
            "obs_base", "CalibrationMappingDefaults.yaml", "policy"))
        dsMappingPolicy = dafPersist.Policy()

        # Mappings
        mappingList = (
            ("images", imgMappingPolicy, ImageMapping),
            ("exposures", expMappingPolicy, ExposureMapping),
            ("calibrations", calMappingPolicy, CalibrationMapping),
            ("datasets", dsMappingPolicy, DatasetMapping)
        )
        self.mappings = dict()
        for name, defPolicy, cls in mappingList:
            if name in policy:
                datasets = policy[name]

                # Centrally-defined datasets
                defaultsPath = os.path.join(getPackageDir("obs_base"), "policy", name + ".yaml")
                if os.path.exists(defaultsPath):
                    datasets.merge(dafPersist.Policy(defaultsPath))

                mappings = dict()
                setattr(self, name, mappings)
                for datasetType in datasets.names(True):
                    subPolicy = datasets[datasetType]
                    subPolicy.merge(defPolicy)

                    if not hasattr(self, "map_" + datasetType) and 'composite' in subPolicy:
                        def compositeClosure(dataId, write=False, mapper=None, mapping=None,
                                             subPolicy=subPolicy):
                            components = subPolicy.get('composite')
                            assembler = subPolicy['assembler'] if 'assembler' in subPolicy else None
                            disassembler = subPolicy['disassembler'] if 'disassembler' in subPolicy else None
                            python = subPolicy['python']
                            butlerComposite = dafPersist.ButlerComposite(assembler=assembler,
                                                                         disassembler=disassembler,
                                                                         python=python,
                                                                         dataId=dataId,
                                                                         mapper=self)
                            for name, component in components.items():
                                butlerComposite.add(id=name,
                                                    datasetType=component.get('datasetType'),
                                                    setter=component.get('setter', None),
                                                    getter=component.get('getter', None),
                                                    subset=component.get('subset', False),
                                                    inputOnly=component.get('inputOnly', False))
                            return butlerComposite
                        setattr(self, "map_" + datasetType, compositeClosure)
                        # for now at least, don't set up any other handling for this dataset type.
                        continue

                    if name == "calibrations":
                        mapping = cls(datasetType, subPolicy, self.registry, self.calibRegistry, calibStorage,
                                      provided=provided, dataRoot=rootStorage)
                    else:
                        mapping = cls(datasetType, subPolicy, self.registry, rootStorage, provided=provided)
                    self.keyDict.update(mapping.keys())
                    mappings[datasetType] = mapping
                    self.mappings[datasetType] = mapping
                    if not hasattr(self, "map_" + datasetType):
                        def mapClosure(dataId, write=False, mapper=weakref.proxy(self), mapping=mapping):
                            return mapping.map(mapper, dataId, write)
                        setattr(self, "map_" + datasetType, mapClosure)
                    if not hasattr(self, "query_" + datasetType):
                        def queryClosure(format, dataId, mapping=mapping):
                            return mapping.lookup(format, dataId)
                        setattr(self, "query_" + datasetType, queryClosure)
                    if hasattr(mapping, "standardize") and not hasattr(self, "std_" + datasetType):
                        def stdClosure(item, dataId, mapper=weakref.proxy(self), mapping=mapping):
                            return mapping.standardize(mapper, item, dataId)
                        setattr(self, "std_" + datasetType, stdClosure)

                    def setMethods(suffix, mapImpl=None, bypassImpl=None, queryImpl=None):
                        """Set convenience methods on CameraMapper"""
                        mapName = "map_" + datasetType + "_" + suffix
                        bypassName = "bypass_" + datasetType + "_" + suffix
                        queryName = "query_" + datasetType + "_" + suffix
                        if not hasattr(self, mapName):
                            setattr(self, mapName, mapImpl or getattr(self, "map_" + datasetType))
                        if not hasattr(self, bypassName):
                            if bypassImpl is None and hasattr(self, "bypass_" + datasetType):
                                bypassImpl = getattr(self, "bypass_" + datasetType)
                            if bypassImpl is not None:
                                setattr(self, bypassName, bypassImpl)
                        if not hasattr(self, queryName):
                            setattr(self, queryName, queryImpl or getattr(self, "query_" + datasetType))

                    # Filename of dataset
                    setMethods("filename", bypassImpl=lambda datasetType, pythonType, location, dataId:
                               [os.path.join(location.getStorage().root, p) for p in location.getLocations()])
                    # Metadata from FITS file
                    if subPolicy["storage"] == "FitsStorage":  # a FITS image
                        setMethods("md", bypassImpl=lambda datasetType, pythonType, location, dataId:
                                   readMetadata(location.getLocationsWithRoot()[0]))

                        # Add support for configuring FITS compression
                        addName = "add_" + datasetType
                        if not hasattr(self, addName):
                            setattr(self, addName, self.getImageCompressionSettings)

                        if name == "exposures":
                            setMethods("wcs", bypassImpl=lambda datasetType, pythonType, location, dataId:
                                       afwGeom.makeSkyWcs(readMetadata(location.getLocationsWithRoot()[0])))
                            setMethods("calib", bypassImpl=lambda datasetType, pythonType, location, dataId:
                                       afwImage.Calib(readMetadata(location.getLocationsWithRoot()[0])))
                            setMethods("visitInfo",
                                       bypassImpl=lambda datasetType, pythonType, location, dataId:
                                       afwImage.VisitInfo(readMetadata(location.getLocationsWithRoot()[0])))
                            setMethods("filter",
                                       bypassImpl=lambda datasetType, pythonType, location, dataId:
                                       afwImage.Filter(readMetadata(location.getLocationsWithRoot()[0])))
                            setMethods("detector",
                                       mapImpl=lambda dataId, write=False:
                                       dafPersist.ButlerLocation(
                                           pythonType="lsst.afw.cameraGeom.CameraConfig",
                                           cppType="Config",
                                           storageName="Internal",
                                           locationList="ignored",
                                           dataId=dataId,
                                           mapper=self,
                                           storage=None,
                                       ),
                                       bypassImpl=lambda datasetType, pythonType, location, dataId:
                                       self.camera[self._extractDetectorName(dataId)]
                                       )
                            setMethods("bbox", bypassImpl=lambda dsType, pyType, location, dataId:
                                       afwImage.bboxFromMetadata(
                                           readMetadata(location.getLocationsWithRoot()[0], hdu=1)))

                        elif name == "images":
                            setMethods("bbox", bypassImpl=lambda dsType, pyType, location, dataId:
                                       afwImage.bboxFromMetadata(
                                           readMetadata(location.getLocationsWithRoot()[0])))

                    if subPolicy["storage"] == "FitsCatalogStorage":  # a FITS catalog
                        setMethods("md", bypassImpl=lambda datasetType, pythonType, location, dataId:
                                   readMetadata(os.path.join(location.getStorage().root,
                                                             location.getLocations()[0]), hdu=1))

                    # Sub-images
                    if subPolicy["storage"] == "FitsStorage":
                        def mapSubClosure(dataId, write=False, mapper=weakref.proxy(self), mapping=mapping):
                            subId = dataId.copy()
                            del subId['bbox']
                            loc = mapping.map(mapper, subId, write)
                            bbox = dataId['bbox']
                            llcX = bbox.getMinX()
                            llcY = bbox.getMinY()
                            width = bbox.getWidth()
                            height = bbox.getHeight()
                            loc.additionalData.set('llcX', llcX)
                            loc.additionalData.set('llcY', llcY)
                            loc.additionalData.set('width', width)
                            loc.additionalData.set('height', height)
                            if 'imageOrigin' in dataId:
                                loc.additionalData.set('imageOrigin',
                                                       dataId['imageOrigin'])
                            return loc

                        def querySubClosure(key, format, dataId, mapping=mapping):
                            subId = dataId.copy()
                            del subId['bbox']
                            return mapping.lookup(format, subId)
                        setMethods("sub", mapImpl=mapSubClosure, queryImpl=querySubClosure)

                    if subPolicy["storage"] == "FitsCatalogStorage":
                        # Length of catalog
                        setMethods("len", bypassImpl=lambda datasetType, pythonType, location, dataId:
                                   readMetadata(os.path.join(location.getStorage().root,
                                                             location.getLocations()[0]),
                                                hdu=1).getScalar("NAXIS2"))

                        # Schema of catalog
                        if not datasetType.endswith("_schema") and datasetType + "_schema" not in datasets:
                            setMethods("schema", bypassImpl=lambda datasetType, pythonType, location, dataId:
                                       afwTable.Schema.readFits(os.path.join(location.getStorage().root,
                                                                             location.getLocations()[0])))

    def _computeCcdExposureId(self, dataId):
        """Compute the 64-bit (long) identifier for a CCD exposure.

        Subclasses must override

        Parameters
        ----------
        dataId : `dict`
            Data identifier with visit, ccd.
        """
        raise NotImplementedError()

    def _computeCoaddExposureId(self, dataId, singleFilter):
        """Compute the 64-bit (long) identifier for a coadd.

        Subclasses must override

        Parameters
        ----------
        dataId : `dict`
            Data identifier with tract and patch.
        singleFilter : `bool`
            True means the desired ID is for a single-filter coadd, in which
            case dataId must contain filter.
        """
        raise NotImplementedError()

526 
527  def _search(self, path):
528  """Search for path in the associated repository's storage.
529 
530  Parameters
531  ----------
532  path : string
533  Path that describes an object in the repository associated with
534  this mapper.
535  Path may contain an HDU indicator, e.g. 'foo.fits[1]'. The
536  indicator will be stripped when searching and so will match
537  filenames without the HDU indicator, e.g. 'foo.fits'. The path
538  returned WILL contain the indicator though, e.g. ['foo.fits[1]'].
539 
540  Returns
541  -------
542  string
543  The path for this object in the repository. Will return None if the
544  object can't be found. If the input argument path contained an HDU
545  indicator, the returned path will also contain the HDU indicator.
546  """
547  return self.rootStorage.search(path)
548 
    def backup(self, datasetType, dataId):
        """Rename any existing object with the given type and dataId.

        The CameraMapper implementation saves objects in a sequence of e.g.:

        - foo.fits
        - foo.fits~1
        - foo.fits~2

        All of the backups will be placed in the output repo, however, and will
        not be removed if they are found elsewhere in the _parent chain. This
        means that the same file will be stored twice if the previous version
        was found in an input repo.
        """

        # Calling PosixStorage directly is not the long term solution in this
        # function, this is work-in-progress on epic DM-6225. The plan is for
        # parentSearch to be changed to 'search', and search only the storage
        # associated with this mapper. All searching of parents will be handled
        # by traversing the container of repositories in Butler.

        def firstElement(list):
            """Get the first element in the list, or None if that can't be
            done.
            """
            return list[0] if list is not None and len(list) else None

        n = 0
        newLocation = self.map(datasetType, dataId, write=True)
        newPath = newLocation.getLocations()[0]
        path = dafPersist.PosixStorage.search(self.root, newPath, searchParents=True)
        path = firstElement(path)
        oldPaths = []
        while path is not None:
            n += 1
            oldPaths.append((n, path))
            path = dafPersist.PosixStorage.search(self.root, "%s~%d" % (newPath, n), searchParents=True)
            path = firstElement(path)
        for n, oldPath in reversed(oldPaths):
            self.rootStorage.copyFile(oldPath, "%s~%d" % (newPath, n))

    def keys(self):
        """Return supported keys.

        Returns
        -------
        iterable
            List of keys usable in a dataset identifier
        """
        return iter(self.keyDict.keys())

    def getKeys(self, datasetType, level):
        """Return a dict of supported keys and their value types for a given
        dataset type at a given level of the key hierarchy.

        Parameters
        ----------
        datasetType : `str`
            Dataset type or None for all dataset types.
        level : `str` or None
            Level or None for all levels or '' for the default level for the
            camera.

        Returns
        -------
        `dict`
            Keys are strings usable in a dataset identifier, values are their
            value types.
        """

        # not sure if this is how we want to do this. what if None was intended?
        if level == '':
            level = self.getDefaultLevel()

        if datasetType is None:
            keyDict = copy.copy(self.keyDict)
        else:
            keyDict = self.mappings[datasetType].keys()
        if level is not None and level in self.levels:
            keyDict = copy.copy(keyDict)
            for l in self.levels[level]:
                if l in keyDict:
                    del keyDict[l]
        return keyDict

    def getDefaultLevel(self):
        return self.defaultLevel

    def getDefaultSubLevel(self, level):
        if level in self.defaultSubLevels:
            return self.defaultSubLevels[level]
        return None

641 
642  @classmethod
643  def getCameraName(cls):
644  """Return the name of the camera that this CameraMapper is for."""
645  className = str(cls)
646  className = className[className.find('.'):-1]
647  m = re.search(r'(\w+)Mapper', className)
648  if m is None:
649  m = re.search(r"class '[\w.]*?(\w+)'", className)
650  name = m.group(1)
651  return name[:1].lower() + name[1:] if name else ''
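    # Illustrative note (examples assumed, not taken from this file): for a
    # subclass named "HscMapper" the regex captures "Hsc" and this method
    # returns "hsc"; a "CfhtMapper" would similarly yield "cfht".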

    @classmethod
    def getPackageName(cls):
        """Return the name of the package containing this CameraMapper."""
        if cls.packageName is None:
            raise ValueError('class variable packageName must not be None')
        return cls.packageName

    @classmethod
    def getPackageDir(cls):
        """Return the base directory of this package"""
        return getPackageDir(cls.getPackageName())

664 
665  def map_camera(self, dataId, write=False):
666  """Map a camera dataset."""
667  if self.camera is None:
668  raise RuntimeError("No camera dataset available.")
669  actualId = self._transformId(dataId)
671  pythonType="lsst.afw.cameraGeom.CameraConfig",
672  cppType="Config",
673  storageName="ConfigStorage",
674  locationList=self.cameraDataLocation or "ignored",
675  dataId=actualId,
676  mapper=self,
677  storage=self.rootStorage
678  )
679 
680  def bypass_camera(self, datasetType, pythonType, butlerLocation, dataId):
681  """Return the (preloaded) camera object.
682  """
683  if self.camera is None:
684  raise RuntimeError("No camera dataset available.")
685  return self.camera
686 
    def map_defects(self, dataId, write=False):
        """Map defects dataset.

        Returns
        -------
        `lsst.daf.persistence.ButlerLocation`
            Minimal ButlerLocation containing just the locationList field
            (just enough information that bypass_defects can use it).
        """
        defectFitsPath = self._defectLookup(dataId=dataId)
        if defectFitsPath is None:
            raise RuntimeError("No defects available for dataId=%s" % (dataId,))

        return dafPersist.ButlerLocation(None, None, None, defectFitsPath,
                                         dataId, self,
                                         storage=self.rootStorage)

    def bypass_defects(self, datasetType, pythonType, butlerLocation, dataId):
        """Return a defect based on the butler location returned by map_defects

        Parameters
        ----------
        butlerLocation : `lsst.daf.persistence.ButlerLocation`
            locationList = path to defects FITS file
        dataId : `dict`
            Butler data ID; "ccd" must be set.

        Note: the name "bypass_XXX" means the butler makes no attempt to
        convert the ButlerLocation into an object, which is what we want for
        now, since that conversion is a bit tricky.
        """
        detectorName = self._extractDetectorName(dataId)
        defectsFitsPath = butlerLocation.locationList[0]
        with fits.open(defectsFitsPath) as hduList:
            for hdu in hduList[1:]:
                if hdu.header["name"] != detectorName:
                    continue

                defectList = []
                for data in hdu.data:
                    bbox = afwGeom.Box2I(
                        afwGeom.Point2I(int(data['x0']), int(data['y0'])),
                        afwGeom.Extent2I(int(data['width']), int(data['height'])),
                    )
                    defectList.append(afwImage.DefectBase(bbox))
                return defectList

        raise RuntimeError("No defects for ccd %s in %s" % (detectorName, defectsFitsPath))

    def map_expIdInfo(self, dataId, write=False):
        return dafPersist.ButlerLocation(
            pythonType="lsst.obs.base.ExposureIdInfo",
            cppType=None,
            storageName="Internal",
            locationList="ignored",
            dataId=dataId,
            mapper=self,
            storage=self.rootStorage
        )

    def bypass_expIdInfo(self, datasetType, pythonType, location, dataId):
        """Hook to retrieve an lsst.obs.base.ExposureIdInfo for an exposure"""
        expId = self.bypass_ccdExposureId(datasetType, pythonType, location, dataId)
        expBits = self.bypass_ccdExposureId_bits(datasetType, pythonType, location, dataId)
        return ExposureIdInfo(expId=expId, expBits=expBits)

    def std_bfKernel(self, item, dataId):
        """Disable standardization for bfKernel

        bfKernel is a calibration product that is a numpy array,
        unlike other calibration products that are all images;
        all calibration images are sent through _standardizeExposure
        due to CalibrationMapping, but we don't want that to happen to bfKernel
        """
        return item

    def std_raw(self, item, dataId):
        """Standardize a raw dataset by converting it to an Exposure instead
        of an Image"""
        return self._standardizeExposure(self.exposures['raw'], item, dataId,
                                         trimmed=False, setVisitInfo=True)

    def map_skypolicy(self, dataId):
        """Map a sky policy."""
        return dafPersist.ButlerLocation("lsst.pex.policy.Policy", "Policy",
                                         "Internal", None, None, self,
                                         storage=self.rootStorage)

    def std_skypolicy(self, item, dataId):
        """Standardize a sky policy by returning the one we use."""
        return self.skypolicy

    def _getCcdKeyVal(self, dataId):
        """Return CCD key and value used to look up a defect in the defect
        registry

        The default implementation simply returns ("ccd", full detector name)
        """
        return ("ccd", self._extractDetectorName(dataId))

    def _setupRegistry(self, name, description, path, policy, policyKey, storage, searchParents=True,
                       posixIfNoSql=True):
        """Set up a registry (usually SQLite3), trying a number of possible
        paths.

        Parameters
        ----------
        name : string
            Name of registry.
        description : `str`
            Description of registry (for log messages)
        path : string
            Path for registry.
        policy : string
            Policy that contains the registry name, used if path is None.
        policyKey : string
            Key in policy for registry path.
        storage : Storage subclass
            Repository Storage to look in.
        searchParents : bool, optional
            True if the search for a registry should follow any Butler v1
            _parent symlinks.
        posixIfNoSql : bool, optional
            If an sqlite registry is not found, will create a posix registry
            if this is True.

        Returns
        -------
        lsst.daf.persistence.Registry
            Registry object
        """
824  if path is None and policyKey in policy:
825  path = dafPersist.LogicalLocation(policy[policyKey]).locString()
826  if os.path.isabs(path):
827  raise RuntimeError("Policy should not indicate an absolute path for registry.")
828  if not storage.exists(path):
829  newPath = storage.instanceSearch(path)
830 
831  newPath = newPath[0] if newPath is not None and len(newPath) else None
832  if newPath is None:
833  self.log.warn("Unable to locate registry at policy path (also looked in root): %s",
834  path)
835  path = newPath
836  else:
837  self.log.warn("Unable to locate registry at policy path: %s", path)
838  path = None
839 
840  # Old Butler API was to indicate the registry WITH the repo folder, New Butler expects the registry to
841  # be in the repo folder. To support Old API, check to see if path starts with root, and if so, strip
842  # root from path. Currently only works with PosixStorage
843  try:
844  root = storage.root
845  if path and (path.startswith(root)):
846  path = path[len(root + '/'):]
847  except AttributeError:
848  pass
849 
850  # determine if there is an sqlite registry and if not, try the posix registry.
851  registry = None
852 
853  def search(filename, description):
854  """Search for file in storage
855 
856  Parameters
857  ----------
858  filename : `str`
859  Filename to search for
860  description : `str`
861  Description of file, for error message.
862 
863  Returns
864  -------
865  path : `str` or `None`
866  Path to file, or None
867  """
868  result = storage.instanceSearch(filename)
869  if result:
870  return result[0]
871  self.log.debug("Unable to locate %s: %s", description, filename)
872  return None
873 
874  # Search for a suitable registry database
875  if path is None:
876  path = search("%s.pgsql" % name, "%s in root" % description)
877  if path is None:
878  path = search("%s.sqlite3" % name, "%s in root" % description)
879  if path is None:
880  path = search(os.path.join(".", "%s.sqlite3" % name), "%s in current dir" % description)
881 
882  if path is not None:
883  if not storage.exists(path):
884  newPath = storage.instanceSearch(path)
885  newPath = newPath[0] if newPath is not None and len(newPath) else None
886  if newPath is not None:
887  path = newPath
888  localFileObj = storage.getLocalFile(path)
889  self.log.info("Loading %s registry from %s", description, localFileObj.name)
890  registry = dafPersist.Registry.create(localFileObj.name)
891  localFileObj.close()
892  elif not registry and posixIfNoSql:
893  try:
894  self.log.info("Loading Posix %s registry from %s", description, storage.root)
895  registry = dafPersist.PosixRegistry(storage.root)
896  except Exception:
897  registry = None
898 
899  return registry
900 
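The search order above (a `.pgsql` registry, then `.sqlite3` in the root, then `.sqlite3` in the current directory, with a posix registry as a last resort) can be sketched with plain `os` calls. `find_registry` and the filenames here are illustrative only, not part of the mapper API:

```python
import os
import tempfile

def find_registry(root, name="registry"):
    """Return the first existing registry file among the candidate
    names, mirroring the pgsql -> sqlite3 search order above."""
    for candidate in ("%s.pgsql" % name, "%s.sqlite3" % name):
        path = os.path.join(root, candidate)
        if os.path.exists(path):
            return path
    return None  # caller would fall back to a posix registry here

with tempfile.TemporaryDirectory() as root:
    # Only an sqlite3 registry is present, so the pgsql candidate is skipped
    open(os.path.join(root, "registry.sqlite3"), "w").close()
    print(os.path.basename(find_registry(root)))  # registry.sqlite3
```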
901  def _transformId(self, dataId):
902  """Generate a standard ID dict from a camera-specific ID dict.
903 
904  Canonical keys include:
905  - amp: amplifier name
906  - ccd: CCD name (in LSST this is a combination of raft and sensor)
907  The default implementation returns a copy of its input.
908 
909  Parameters
910  ----------
911  dataId : `dict`
912  Dataset identifier; this must not be modified
913 
914  Returns
915  -------
916  `dict`
917  Transformed dataset identifier.
918  """
919 
920  return dataId.copy()
921 
922  def _mapActualToPath(self, template, actualId):
923  """Convert a template path to an actual path, using the actual data
924  identifier. This implementation is usually sufficient but can be
925  overridden by the subclass.
926 
927  Parameters
928  ----------
929  template : `str`
930  Template path
931  actualId : `dict`
932  Dataset identifier
933 
934  Returns
935  -------
936  `str`
937  Pathname
938  """
939 
940  try:
941  transformedId = self._transformId(actualId)
942  return template % transformedId
943  except Exception as e:
944  raise RuntimeError("Failed to format %r with data %r: %s" % (template, actualId, e))
945 
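`_mapActualToPath` relies on ordinary %-style substitution with mapping keys, where the dict is the (transformed) data identifier. A minimal sketch, with a hypothetical template and dataId:

```python
# Hypothetical template and data identifier, showing the substitution
# that _mapActualToPath performs via `template % transformedId`
template = "raw/v%(visit)d_f%(filter)s/c%(ccd)02d.fits"
dataId = {"visit": 1234, "filter": "r", "ccd": 5}
print(template % dataId)  # raw/v1234_fr/c05.fits
```

A missing key (or a value of the wrong type for its format code) raises, which is why the method wraps the substitution in a try/except and re-raises with context.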
946  @staticmethod
947  def getShortCcdName(ccdName):
948  """Convert a CCD name to a form useful as a filename
949 
950  The default implementation converts spaces to underscores.
951  """
952  return ccdName.replace(" ", "_")
953 
954  def _extractDetectorName(self, dataId):
955  """Extract the detector (CCD) name from the dataset identifier.
956 
957  The name in question is the detector name used by lsst.afw.cameraGeom.
958 
959  Parameters
960  ----------
961  dataId : `dict`
962  Dataset identifier.
963 
964  Returns
965  -------
966  `str`
967  Detector name
968  """
969  raise NotImplementedError("No _extractDetectorName() function specified")
970 
971  def _extractAmpId(self, dataId):
972  """Extract the amplifier identifier from a dataset identifier.
973 
974  .. note:: Deprecated in 11_0
975 
976  The amplifier identifier has two parts: the detector name for the CCD
977  containing the amplifier, and the index of the amplifier in the detector.
978 
979  Parameters
980  ----------
981  dataId : `dict`
982  Dataset identifier
983 
984  Returns
985  -------
986  `tuple`
987  Amplifier identifier
988  """
989 
990  trDataId = self._transformId(dataId)
991  return (trDataId["ccd"], int(trDataId['amp']))
992 
993  def _setAmpDetector(self, item, dataId, trimmed=True):
994  """Set the detector object in an Exposure for an amplifier.
995 
996  Defects are also added to the Exposure based on the detector object.
997 
998  Parameters
999  ----------
1000  item : `lsst.afw.image.Exposure`
1001  Exposure to set the detector in.
1002  dataId : `dict`
1003  Dataset identifier
1004  trimmed : `bool`
1005  Should detector be marked as trimmed? (ignored)
1006  """
1007 
1008  return self._setCcdDetector(item=item, dataId=dataId, trimmed=trimmed)
1009 
1010  def _setCcdDetector(self, item, dataId, trimmed=True):
1011  """Set the detector object in an Exposure for a CCD.
1012 
1013  Parameters
1014  ----------
1015  item : `lsst.afw.image.Exposure`
1016  Exposure to set the detector in.
1017  dataId : `dict`
1018  Dataset identifier
1019  trimmed : `bool`
1020  Should detector be marked as trimmed? (ignored)
1021  """
1022  if item.getDetector() is not None:
1023  return
1024 
1025  detectorName = self._extractDetectorName(dataId)
1026  detector = self.camera[detectorName]
1027  item.setDetector(detector)
1028 
1029  def _setFilter(self, mapping, item, dataId):
1030  """Set the filter object in an Exposure. If the Exposure had a FILTER
1031  keyword, this was already processed during load. But if it didn't,
1032  use the filter from the registry.
1033 
1034  Parameters
1035  ----------
1036  mapping : `lsst.obs.base.Mapping`
1037  Where to get the filter from.
1038  item : `lsst.afw.image.Exposure`
1039  Exposure to set the filter in.
1040  dataId : `dict`
1041  Dataset identifier.
1042  """
1043 
1044  if not (isinstance(item, afwImage.ExposureU) or isinstance(item, afwImage.ExposureI) or
1045  isinstance(item, afwImage.ExposureF) or isinstance(item, afwImage.ExposureD)):
1046  return
1047 
1048  if item.getFilter().getId() != afwImage.Filter.UNKNOWN:
1049  return
1050 
1051  actualId = mapping.need(['filter'], dataId)
1052  filterName = actualId['filter']
1053  if self.filters is not None and filterName in self.filters:
1054  filterName = self.filters[filterName]
1055  item.setFilter(afwImage.Filter(filterName))
1056 
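The alias lookup in `_setFilter` (`self.filters[filterName]`) maps a camera-specific filter name onto a canonical one before constructing the `afwImage.Filter`. The mapping itself is just a dict lookup with a pass-through default; the filter names below are hypothetical:

```python
# Hypothetical alias table mapping camera-specific to canonical names
filter_aliases = {"HSC-G": "g", "HSC-R": "r"}

def resolve_filter(name, aliases):
    # Use the canonical name when an alias exists, else keep the name
    if aliases is not None and name in aliases:
        return aliases[name]
    return name

print(resolve_filter("HSC-G", filter_aliases))   # g
print(resolve_filter("NB0921", filter_aliases))  # NB0921
```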
1057  # Default standardization function for exposures
1058  def _standardizeExposure(self, mapping, item, dataId, filter=True,
1059  trimmed=True, setVisitInfo=True):
1060  """Default standardization function for images.
1061 
1062  This sets the Detector from the camera geometry
1063  and optionally sets the Filter. In both cases this saves
1064  having to persist some data in each exposure (or image).
1065 
1066  Parameters
1067  ----------
1068  mapping : `lsst.obs.base.Mapping`
1069  Where to get the values from.
1070  item : image-like object
1071  Can be any of lsst.afw.image.Exposure,
1072  lsst.afw.image.DecoratedImage, lsst.afw.image.Image
1073  or lsst.afw.image.MaskedImage
1075  dataId : `dict`
1076  Dataset identifier
1077  filter : `bool`
1078  Set filter? Ignored if item is already an exposure
1079  trimmed : `bool`
1080  Should detector be marked as trimmed?
1081  setVisitInfo : `bool`
1082  Should Exposure have its VisitInfo filled out from the metadata?
1083 
1084  Returns
1085  -------
1086  `lsst.afw.image.Exposure`
1087  The standardized Exposure.
1088  """
1089  try:
1090  item = exposureFromImage(item, dataId, mapper=self, logger=self.log, setVisitInfo=setVisitInfo)
1091  except Exception as e:
1092  self.log.error("Could not turn item=%r into an exposure: %s", item, e)
1093  raise
1094 
1095  if mapping.level.lower() == "amp":
1096  self._setAmpDetector(item, dataId, trimmed)
1097  elif mapping.level.lower() == "ccd":
1098  self._setCcdDetector(item, dataId, trimmed)
1099 
1100  if filter:
1101  self._setFilter(mapping, item, dataId)
1102 
1103  return item
1104 
1105  def _defectLookup(self, dataId):
1106  """Find the defects for a given CCD.
1107 
1108  Parameters
1109  ----------
1110  dataId : `dict`
1111  Dataset identifier
1112 
1113  Returns
1114  -------
1115  `str`
1116  Path to the defects file or None if not available.
1117  """
1118  if self.defectRegistry is None:
1119  return None
1120  if self.registry is None:
1121  raise RuntimeError("No registry for defect lookup")
1122 
1123  ccdKey, ccdVal = self._getCcdKeyVal(dataId)
1124 
1125  dataIdForLookup = {'visit': dataId['visit']}
1126  # .lookup will fail in a posix registry because there is no template to provide.
1127  rows = self.registry.lookup(('taiObs'), ('raw_visit'), dataIdForLookup)
1128  if len(rows) == 0:
1129  return None
1130  assert len(rows) == 1
1131  taiObs = rows[0][0]
1132 
1133  # Lookup the defects for this CCD serial number that are valid at the exposure midpoint.
1134  rows = self.defectRegistry.executeQuery(("path",), ("defect",),
1135  [(ccdKey, "?")],
1136  ("DATETIME(?)", "DATETIME(validStart)", "DATETIME(validEnd)"),
1137  (ccdVal, taiObs))
1138  if not rows:
1139  return None
1140  if len(rows) == 1:
1141  return os.path.join(self.defectPath, rows[0][0])
1142  else:
1143  raise RuntimeError("Querying for defects (%s, %s) returns %d files: %s" %
1144  (ccdVal, taiObs, len(rows), ", ".join([_[0] for _ in rows])))
1145 
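The defect query above selects the single defect file whose `[validStart, validEnd]` range covers the exposure time, and treats multiple matches as an error. The same selection logic, sketched with `datetime` objects and hypothetical paths in place of the SQL query:

```python
from datetime import datetime

# Hypothetical defect registry rows: (path, validStart, validEnd)
rows = [
    ("defects/d1.fits", datetime(2010, 1, 1), datetime(2012, 1, 1)),
    ("defects/d2.fits", datetime(2012, 1, 1), datetime(2015, 1, 1)),
]

def defects_for(taiObs, rows):
    # Keep rows whose validity range covers the observation time,
    # mirroring DATETIME(validStart) <= t <= DATETIME(validEnd)
    matches = [path for path, start, end in rows if start <= taiObs <= end]
    if len(matches) > 1:
        raise RuntimeError("Multiple defect files match: %s" % matches)
    return matches[0] if matches else None

print(defects_for(datetime(2013, 6, 1), rows))  # defects/d2.fits
```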
1146  def _makeCamera(self, policy, repositoryDir):
1147  """Make a camera (instance of lsst.afw.cameraGeom.Camera) describing
1148  the camera geometry
1149 
1150  Also set self.cameraDataLocation, if relevant (else it can be left
1151  None).
1152 
1153  This implementation assumes that policy contains an entry "camera"
1154  that points to the subdirectory in this package of camera data;
1155  specifically, that subdirectory must contain:
1156  - a file named `camera.py` that contains persisted camera config
1157  - ampInfo table FITS files, as required by
1158  lsst.afw.cameraGeom.makeCameraFromPath
1159 
1160  Parameters
1161  ----------
1162  policy : `lsst.daf.persistence.Policy`
1163  Policy with per-camera defaults already merged
1164  (PexPolicy only for backward compatibility).
1165  repositoryDir : `str`
1166  Policy repository for the subclassing module (obtained with
1167  getRepositoryPath() on the per-camera default dictionary).
1168  """
1169  if 'camera' not in policy:
1170  raise RuntimeError("Cannot find 'camera' in policy; cannot construct a camera")
1171  cameraDataSubdir = policy['camera']
1172  self.cameraDataLocation = os.path.normpath(
1173  os.path.join(repositoryDir, cameraDataSubdir, "camera.py"))
1174  cameraConfig = afwCameraGeom.CameraConfig()
1175  cameraConfig.load(self.cameraDataLocation)
1176  ampInfoPath = os.path.dirname(self.cameraDataLocation)
1177  return afwCameraGeom.makeCameraFromPath(
1178  cameraConfig=cameraConfig,
1179  ampInfoPath=ampInfoPath,
1180  shortNameFunc=self.getShortCcdName,
1181  pupilFactoryClass=self.PupilFactoryClass
1182  )
1183 
1184  def getRegistry(self):
1185  """Get the registry used by this mapper.
1186 
1187  Returns
1188  -------
1189  Registry or None
1190  The registry used by this mapper for this mapper's repository.
1191  """
1192  return self.registry
1193 
1194  def getImageCompressionSettings(self, datasetType, dataId):
1195  """Stuff image compression settings into a daf.base.PropertySet
1196 
1197  This goes into the ButlerLocation's "additionalData", which gets
1198  passed into the boost::persistence framework.
1199 
1200  Parameters
1201  ----------
1202  datasetType : `str`
1203  Type of dataset for which to get the image compression settings.
1204  dataId : `dict`
1205  Dataset identifier.
1206 
1207  Returns
1208  -------
1209  additionalData : `lsst.daf.base.PropertySet`
1210  Image compression settings.
1211  """
1212  mapping = self.mappings[datasetType]
1213  recipeName = mapping.recipe
1214  storageType = mapping.storage
1215  if storageType not in self._writeRecipes:
1216  return dafBase.PropertySet()
1217  if recipeName not in self._writeRecipes[storageType]:
1218  raise RuntimeError("Unrecognized write recipe for datasetType %s (storage type %s): %s" %
1219  (datasetType, storageType, recipeName))
1220  recipe = self._writeRecipes[storageType][recipeName].deepCopy()
1221  seed = hash(tuple(dataId.items())) % 2**31
1222  for plane in ("image", "mask", "variance"):
1223  if recipe.exists(plane + ".scaling.seed") and recipe.getScalar(plane + ".scaling.seed") == 0:
1224  recipe.set(plane + ".scaling.seed", seed)
1225  return recipe
1226 
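The per-dataId seed above is derived by hashing the identifier's items, so the same dataId reuses the same fuzzing seed for all planes. Note that with Python 3 hash randomization the value is stable only within a single process unless `PYTHONHASHSEED` is fixed. The dataId here is hypothetical:

```python
dataId = {"visit": 1234, "ccd": 5}  # hypothetical identifier
seed = hash(tuple(dataId.items())) % 2**31
# The modulus keeps the result in the non-negative signed 32-bit range
print(0 <= seed < 2**31)  # True
```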
1227  def _initWriteRecipes(self):
1228  """Read the recipes for writing files
1229 
1230  These recipes are currently used for configuring FITS compression,
1231  but they could have wider uses for configuring different flavors
1232  of the storage types. A recipe is referred to by a symbolic name,
1233  which has associated settings. These settings are stored as a
1234  `PropertySet` so they can easily be passed down to the
1235  boost::persistence framework as the "additionalData" parameter.
1236 
1237  The list of recipes is written in YAML. A default recipe and
1238  some other convenient recipes are in obs_base/policy/writeRecipes.yaml
1239  and these may be overridden or supplemented by the individual obs_*
1240  packages' own policy/writeRecipes.yaml files.
1241 
1242  Recipes are grouped by the storage type. Currently, only the
1243  ``FitsStorage`` storage type uses recipes, which it uses to
1244  configure FITS image compression.
1245 
1246  Each ``FitsStorage`` recipe for FITS compression should define
1247  "image", "mask" and "variance" entries, each of which may contain
1248  "compression" and "scaling" entries. Defaults will be provided for
1249  any missing elements under "compression" and "scaling".
1250 
1251  The allowed entries under "compression" are:
1252 
1253  * algorithm (string): compression algorithm to use
1254  * rows (int): number of rows per tile (0 = entire dimension)
1255  * columns (int): number of columns per tile (0 = entire dimension)
1256  * quantizeLevel (float): cfitsio quantization level
1257 
1258  The allowed entries under "scaling" are:
1259 
1260  * algorithm (string): scaling algorithm to use
1261  * bitpix (int): bits per pixel (0,8,16,32,64,-32,-64)
1262  * fuzz (bool): fuzz the values when quantising floating-point values?
1263  * seed (long): seed for random number generator when fuzzing
1264  * maskPlanes (list of string): mask planes to ignore when doing
1265  statistics
1266  * quantizeLevel: divisor of the standard deviation for STDEV_* scaling
1267  * quantizePad: number of stdev to allow on the low side (for
1268  STDEV_POSITIVE/NEGATIVE)
1269  * bscale: manually specified BSCALE (for MANUAL scaling)
1270  * bzero: manually specified BZERO (for MANUAL scaling)
1271 
1272  A very simple example YAML recipe:
1273 
1274  FitsStorage:
1275    default:
1276      image: &default
1277        compression:
1278          algorithm: GZIP_SHUFFLE
1279      mask: *default
1280      variance: *default
1281  """
1282  recipesFile = os.path.join(getPackageDir("obs_base"), "policy", "writeRecipes.yaml")
1283  recipes = dafPersist.Policy(recipesFile)
1284  supplementsFile = os.path.join(self.getPackageDir(), "policy", "writeRecipes.yaml")
1285  validationMenu = {'FitsStorage': validateRecipeFitsStorage, }
1286  if os.path.exists(supplementsFile) and supplementsFile != recipesFile:
1287  supplements = dafPersist.Policy(supplementsFile)
1288  # Don't allow overrides, only supplements
1289  for entry in validationMenu:
1290  intersection = set(recipes[entry].names()).intersection(set(supplements.names()))
1291  if intersection:
1292  raise RuntimeError("Recipes provided in %s section %s may not override those in %s: %s" %
1293  (supplementsFile, entry, recipesFile, intersection))
1294  recipes.update(supplements)
1295 
1296  self._writeRecipes = {}
1297  for storageType in recipes.names(True):
1298  if "default" not in recipes[storageType]:
1299  raise RuntimeError("No 'default' recipe defined for storage type %s in %s" %
1300  (storageType, recipesFile))
1301  self._writeRecipes[storageType] = validationMenu[storageType](recipes[storageType])
1302 
1303 
1304 def exposureFromImage(image, dataId=None, mapper=None, logger=None, setVisitInfo=True):
1305  """Generate an Exposure from an image-like object
1306 
1307  If the image is a DecoratedImage then also set its WCS and metadata
1308  (Image and MaskedImage are missing the necessary metadata
1309  and Exposure already has those set)
1310 
1311  Parameters
1312  ----------
1313  image : Image-like object
1314  Can be one of lsst.afw.image.DecoratedImage, Image, MaskedImage or
1315  Exposure.
1316 
1317  Returns
1318  -------
1319  `lsst.afw.image.Exposure`
1320  Exposure containing input image.
1321  """
1322  metadata = None
1323  if isinstance(image, afwImage.MaskedImage):
1324  exposure = afwImage.makeExposure(image)
1325  elif isinstance(image, afwImage.DecoratedImage):
1326  exposure = afwImage.makeExposure(afwImage.makeMaskedImage(image.getImage()))
1327  metadata = image.getMetadata()
1328  try:
1329  wcs = afwGeom.makeSkyWcs(metadata, strip=True)
1330  exposure.setWcs(wcs)
1331  except pexExcept.TypeError as e:
1332  # raised on failure to create a wcs (and possibly others)
1333  if logger is None:
1334  logger = lsstLog.Log.getLogger("CameraMapper")
1335  logger.debug("wcs set to None; insufficient information found in metadata to create a valid wcs:"
1336  " %s", e.args[0])
1337 
1338  exposure.setMetadata(metadata)
1339  elif isinstance(image, afwImage.Exposure):
1340  # Exposure
1341  exposure = image
1342  metadata = exposure.getMetadata()
1343  else:
1344  # Image
1345  exposure = afwImage.makeExposure(afwImage.makeMaskedImage(image))
1346  #
1347  # set VisitInfo if we can
1348  #
1349  if setVisitInfo and exposure.getInfo().getVisitInfo() is None:
1350  if metadata is not None:
1351  if mapper is None:
1352  if not logger:
1353  logger = lsstLog.Log.getLogger("CameraMapper")
1354  logger.warn("I can only set the VisitInfo if you provide a mapper")
1355  else:
1356  exposureId = mapper._computeCcdExposureId(dataId)
1357  visitInfo = mapper.makeRawVisitInfo(md=metadata, exposureId=exposureId)
1358 
1359  exposure.getInfo().setVisitInfo(visitInfo)
1360 
1361  return exposure
1362 
1363 
1364 def validateRecipeFitsStorage(recipes):
1365  """Validate recipes for FitsStorage
1366 
1367  The recipes are supplemented with default values where appropriate.
1368 
1369  TODO: replace this custom validation code with Cerberus (DM-11846)
1370 
1371  Parameters
1372  ----------
1373  recipes : `lsst.daf.persistence.Policy`
1374  FitsStorage recipes to validate.
1375 
1376  Returns
1377  -------
1378  validated : `lsst.daf.base.PropertySet`
1379  Validated FitsStorage recipe.
1380 
1381  Raises
1382  ------
1383  `RuntimeError`
1384  If validation fails.
1385  """
1386  # Schemas define what should be there, and the default values (and by the default
1387  # value, the expected type).
1388  compressionSchema = {
1389  "algorithm": "NONE",
1390  "rows": 1,
1391  "columns": 0,
1392  "quantizeLevel": 0.0,
1393  }
1394  scalingSchema = {
1395  "algorithm": "NONE",
1396  "bitpix": 0,
1397  "maskPlanes": ["NO_DATA"],
1398  "seed": 0,
1399  "quantizeLevel": 4.0,
1400  "quantizePad": 5.0,
1401  "fuzz": True,
1402  "bscale": 1.0,
1403  "bzero": 0.0,
1404  }
1405 
1406  def checkUnrecognized(entry, allowed, description):
1407  """Check to see if the entry contains unrecognized keywords"""
1408  unrecognized = set(entry.keys()) - set(allowed)
1409  if unrecognized:
1410  raise RuntimeError(
1411  "Unrecognized entries when parsing image compression recipe %s: %s" %
1412  (description, unrecognized))
1413 
1414  validated = {}
1415  for name in recipes.names(True):
1416  checkUnrecognized(recipes[name], ["image", "mask", "variance"], name)
1417  rr = dafBase.PropertySet()
1418  validated[name] = rr
1419  for plane in ("image", "mask", "variance"):
1420  checkUnrecognized(recipes[name][plane], ["compression", "scaling"],
1421  name + "->" + plane)
1422 
1423  for settings, schema in (("compression", compressionSchema),
1424  ("scaling", scalingSchema)):
1425  prefix = plane + "." + settings
1426  if settings not in recipes[name][plane]:
1427  for key in schema:
1428  rr.set(prefix + "." + key, schema[key])
1429  continue
1430  entry = recipes[name][plane][settings]
1431  checkUnrecognized(entry, schema.keys(), name + "->" + plane + "->" + settings)
1432  for key in schema:
1433  value = type(schema[key])(entry[key]) if key in entry else schema[key]
1434  rr.set(prefix + "." + key, value)
1435  return validated
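The validation loop above rejects unknown keys, fills missing keys from the schema, and coerces supplied values to the schema's types (the default value doubles as the type spec). A standalone sketch of the same merge, using plain dicts instead of `PropertySet`:

```python
compression_schema = {
    "algorithm": "NONE",   # the default value also fixes the expected type
    "rows": 1,
    "columns": 0,
    "quantizeLevel": 0.0,
}

def apply_defaults(entry, schema):
    """Reject unknown keys, then fill missing keys from the schema,
    coercing supplied values to the schema's types."""
    unrecognized = set(entry) - set(schema)
    if unrecognized:
        raise RuntimeError("Unrecognized entries: %s" % sorted(unrecognized))
    return {key: type(schema[key])(entry[key]) if key in entry else schema[key]
            for key in schema}

merged = apply_defaults({"algorithm": "GZIP_SHUFFLE", "rows": 0}, compression_schema)
print(merged["algorithm"], merged["quantizeLevel"])  # GZIP_SHUFFLE 0.0
```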