from deprecated.sphinx import deprecated
from astro_metadata_translator import fix_header

from . import ImageMapping, ExposureMapping, CalibrationMapping, DatasetMapping
from .exposureIdInfo import ExposureIdInfo
from .makeRawVisitInfo import MakeRawVisitInfo
from .utils import createInitialSkyWcs, InitialSkyWcsError
from ._instrument import Instrument
__all__ = ["CameraMapper", "exposureFromImage"]
"""CameraMapper is a base class for mappers that handle images from a
camera and products derived from them. This provides an abstraction layer
between the data on disk and the code.

Public methods: keys, queryMetadata, getDatasetTypes, map,
canStandardize, standardize

Mappers for specific data sources (e.g., CFHT Megacam, LSST
simulations, etc.) should inherit this class.

The CameraMapper manages datasets within a "root" directory. Note that
writing to a dataset present in the input root will hide the existing
dataset but not overwrite it. See #2160 for design discussion.

A camera is assumed to consist of one or more rafts, each composed of
multiple CCDs. Each CCD is in turn composed of one or more amplifiers
(amps). A camera is also assumed to have a camera geometry description
(CameraGeom object) as a policy file and a filter description (Filter
class static configuration) as another policy file.

Information from the camera geometry and defects is inserted into all
Exposure objects returned.

The mapper uses one or two registries to retrieve metadata about the
images. The first is a registry of all raw exposures. This must contain
the time of the observation. One or more tables (or the equivalent)
within the registry are used to look up data identifier components that
are not specified by the user (e.g. filter) and to return results for
metadata queries. The second is an optional registry of all calibration
data. This should contain validity start and end entries for each
calibration dataset in the same timescale as the observation time.

Subclasses will typically set MakeRawVisitInfoClass and optionally the
metadata translator class:

MakeRawVisitInfoClass: a class variable that points to a subclass of
MakeRawVisitInfo, a functor that creates an
lsst.afw.image.VisitInfo from the FITS metadata of a raw image.

translatorClass: The `~astro_metadata_translator.MetadataTranslator`
class to use for fixing metadata values. If it is not set, an attempt
will be made to infer the class from ``MakeRawVisitInfoClass``; failing
that, the metadata fixup will try to infer the translator class from the
header itself.

Subclasses must provide the following methods:

_extractDetectorName(self, dataId): returns the detector name for a CCD
(e.g., "CFHT 21", "R:1,2 S:3,4") as used in the AFW CameraGeom class given
a dataset identifier referring to that CCD or a subcomponent of it.

_computeCcdExposureId(self, dataId): see below

_computeCoaddExposureId(self, dataId, singleFilter): see below

Subclasses may also need to override the following methods:

_transformId(self, dataId): transformation of a data identifier
from colloquial usage (e.g., "ccdname") to proper/actual usage
(e.g., "ccd"), including making it suitable for path expansion (e.g. by
removing commas). The default implementation does nothing. Note that this
method should not modify its input parameter.

getShortCcdName(self, ccdName): a static method that returns a shortened
name suitable for use as a filename. The default version converts spaces
to underscores.

_mapActualToPath(self, template, actualId): convert a template path to an
actual path, using the actual dataset identifier.

The mapper's behaviors are largely specified by the policy file.
See the MapperDictionary.paf for descriptions of the available items.

The 'exposures', 'calibrations', and 'datasets' subpolicies configure
mappings (see Mappings class).

Common default mappings for all subclasses can be specified in the
"policy/{images,exposures,calibrations,datasets}.yaml" files. This
provides a simple way to add a product to all camera mappers.

Functions to map (provide a path to the data given a dataset
identifier dictionary) and standardize (convert data into some standard
format or type) may be provided in the subclass as "map_{dataset type}"
and "std_{dataset type}", respectively.

If non-Exposure datasets cannot be retrieved using standard
daf_persistence methods alone, a "bypass_{dataset type}" function may be
provided in the subclass to return the dataset instead of using the
"datasets" subpolicy.

Implementations of map_camera and bypass_camera that should typically be
sufficient are provided in this base class.
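The subclassing contract described above can be sketched in stand-alone form. ``MiniMapper`` and its base class below are hypothetical stand-ins (the real base class is ``lsst.obs.base.CameraMapper`` and requires the LSST stack); the detector-name format and the ID packing are illustrative only.

```python
# Stand-alone sketch of the CameraMapper subclassing contract.
# MiniCameraMapperBase/MiniMapper are hypothetical stand-ins.

class MiniCameraMapperBase:
    """Mimics the abstract interface: subclasses must override these."""

    def _extractDetectorName(self, dataId):
        raise NotImplementedError()

    def _computeCcdExposureId(self, dataId):
        raise NotImplementedError()


class MiniMapper(MiniCameraMapperBase):
    """Example subclass for a hypothetical camera."""

    def _extractDetectorName(self, dataId):
        # e.g. {"raft": "1,2", "sensor": "3,4"} -> "R:1,2 S:3,4"
        return "R:%s S:%s" % (dataId["raft"], dataId["sensor"])

    def _computeCcdExposureId(self, dataId):
        # Pack visit and ccd into a single integer (illustrative scheme).
        return dataId["visit"] * 1000 + dataId["ccd"]


mapper = MiniMapper()
print(mapper._extractDetectorName({"raft": "1,2", "sensor": "3,4"}))  # R:1,2 S:3,4
print(mapper._computeCcdExposureId({"visit": 1234, "ccd": 56}))      # 1234056
```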
Instead of auto-loading the camera at construction time, load it from
the calibration registry.

Parameters
----------
policy : daf_persistence.Policy
    Policy with per-camera defaults already merged.
repositoryDir : string
    Policy repository for the subclassing module (obtained with
    getRepositoryPath() on the per-camera default dictionary).
root : string, optional
    Path to the root directory for data.
registry : string, optional
    Path to registry with data's metadata.
calibRoot : string, optional
    Root directory for calibrations.
calibRegistry : string, optional
    Path to registry with calibrations' metadata.
provided : list of string, optional
    Keys provided by the mapper.
parentRegistry : Registry subclass, optional
    Registry from a parent repository that may be used to look up
    data's metadata.
repositoryCfg : daf_persistence.RepositoryCfg or None, optional
    The configuration information for the repository this mapper is
    being used with.
MakeRawVisitInfoClass = MakeRawVisitInfo

PupilFactoryClass = afwCameraGeom.PupilFactory

translatorClass = None

_gen3instrument = None

def __init__(self, policy, repositoryDir,
             root=None, registry=None, calibRoot=None, calibRegistry=None,
             provided=None, parentRegistry=None, repositoryCfg=None):
    dafPersist.Mapper.__init__(self)

    self.log = lsstLog.Log.getLogger("CameraMapper")

    self.root = repositoryCfg.root

    repoPolicy = repositoryCfg.policy if repositoryCfg else None
    if repoPolicy is not None:
        policy.update(repoPolicy)
    if 'levels' in policy:
        levelsPolicy = policy['levels']
        for key in levelsPolicy.names(True):
            self.levels[key] = set(levelsPolicy.asArray(key))
    if 'defaultSubLevels' in policy:
    self.rootStorage = dafPersist.Storage.makeFromURI(uri=root)
    if calibRoot is not None:
        calibRoot = dafPersist.Storage.absolutePath(root, calibRoot)
        calibStorage = dafPersist.Storage.makeFromURI(uri=calibRoot,
                                                      create=False)
    else:
        calibRoot = policy.get('calibRoot', None)
        calibStorage = dafPersist.Storage.makeFromURI(uri=calibRoot,
                                                      create=False)
    if calibStorage is None:
        calibStorage = self.rootStorage

    self.registry = self._setupRegistry("registry", "exposure", registry, policy, "registryPath",
                                        self.rootStorage, searchParents=False,
                                        posixIfNoSql=(not parentRegistry))
    self.registry = parentRegistry
    needCalibRegistry = policy.get('needCalibRegistry', None)
    if needCalibRegistry:
        if calibStorage:
            self.calibRegistry = self._setupRegistry("calibRegistry", "calib", calibRegistry, policy,
                                                     "calibRegistryPath", calibStorage,
                                                     posixIfNoSql=False)
        else:
            raise RuntimeError(
                "'needCalibRegistry' is true in Policy, but was unable to locate a repo at "
                f"calibRoot ivar:{calibRoot} or policy['calibRoot']:{policy.get('calibRoot', None)}")

    if self.packageName is None:
        raise ValueError('class variable packageName must not be None')
def _initMappings(self, policy, rootStorage=None, calibStorage=None, provided=None):
    """Initialize mappings.

    For each of the dataset types that we want to be able to read, there
    are methods that can be created to support them:

    * map_<dataset> : determine the path for dataset
    * std_<dataset> : standardize the retrieved dataset
    * bypass_<dataset> : retrieve the dataset (bypassing the usual
      retrieval machinery)
    * query_<dataset> : query the registry

    Besides the dataset types explicitly listed in the policy, we create
    additional, derived datasets for additional conveniences,
    e.g., reading the header of an image, retrieving only the size of a
    catalog.

    Parameters
    ----------
    policy : `lsst.daf.persistence.Policy`
        Policy with per-camera defaults already merged.
    rootStorage : `Storage subclass instance`
        Interface to persisted repository data.
    calibStorage : `Storage subclass instance`
        Interface to persisted calib repository data.
    provided : `list` of `str`
        Keys provided by the mapper.
    """
    imgMappingPolicy = dafPersist.Policy(dafPersist.Policy.defaultPolicyFile(
        "obs_base", "ImageMappingDefaults.yaml", "policy"))
    expMappingPolicy = dafPersist.Policy(dafPersist.Policy.defaultPolicyFile(
        "obs_base", "ExposureMappingDefaults.yaml", "policy"))
    calMappingPolicy = dafPersist.Policy(dafPersist.Policy.defaultPolicyFile(
        "obs_base", "CalibrationMappingDefaults.yaml", "policy"))
    dsMappingPolicy = dafPersist.Policy()

    mappingList = (
        ("images", imgMappingPolicy, ImageMapping),
        ("exposures", expMappingPolicy, ExposureMapping),
        ("calibrations", calMappingPolicy, CalibrationMapping),
        ("datasets", dsMappingPolicy, DatasetMapping),
    )
    for name, defPolicy, cls in mappingList:
        datasets = policy[name]

        defaultsPath = os.path.join(getPackageDir("obs_base"), "policy", name + ".yaml")
        if os.path.exists(defaultsPath):
            datasets.merge(dafPersist.Policy(defaultsPath))

        mappings = {}
        setattr(self, name, mappings)
        for datasetType in datasets.names(True):
            subPolicy = datasets[datasetType]
            subPolicy.merge(defPolicy)

            if not hasattr(self, "map_" + datasetType) and 'composite' in subPolicy:
                def compositeClosure(dataId, write=False, mapper=None, mapping=None,
                                     subPolicy=subPolicy):
                    components = subPolicy.get('composite')
                    assembler = subPolicy['assembler'] if 'assembler' in subPolicy else None
                    disassembler = subPolicy['disassembler'] if 'disassembler' in subPolicy else None
                    python = subPolicy['python']
                    butlerComposite = dafPersist.ButlerComposite(assembler=assembler,
                                                                 disassembler=disassembler,
                                                                 python=python,
                                                                 dataId=dataId,
                                                                 mapper=mapper)
                    for name, component in components.items():
                        butlerComposite.add(id=name,
                                            datasetType=component.get('datasetType'),
                                            setter=component.get('setter', None),
                                            getter=component.get('getter', None),
                                            subset=component.get('subset', False),
                                            inputOnly=component.get('inputOnly', False))
                    return butlerComposite
                setattr(self, "map_" + datasetType, compositeClosure)
            if name == "calibrations":
                mapping = cls(datasetType, subPolicy, self.registry, self.calibRegistry, calibStorage,
                              provided=provided, dataRoot=rootStorage)
            else:
                mapping = cls(datasetType, subPolicy, self.registry, rootStorage, provided=provided)
            if datasetType in self.mappings:
                raise ValueError(f"Duplicate mapping policy for dataset type {datasetType}")
            self.keyDict.update(mapping.keys())
            mappings[datasetType] = mapping
            self.mappings[datasetType] = mapping
            if not hasattr(self, "map_" + datasetType):
                def mapClosure(dataId, write=False, mapper=weakref.proxy(self), mapping=mapping):
                    return mapping.map(mapper, dataId, write)
                setattr(self, "map_" + datasetType, mapClosure)
            if not hasattr(self, "query_" + datasetType):
                def queryClosure(format, dataId, mapping=mapping):
                    return mapping.lookup(format, dataId)
                setattr(self, "query_" + datasetType, queryClosure)
            if hasattr(mapping, "standardize") and not hasattr(self, "std_" + datasetType):
                def stdClosure(item, dataId, mapper=weakref.proxy(self), mapping=mapping):
                    return mapping.standardize(mapper, item, dataId)
                setattr(self, "std_" + datasetType, stdClosure)
            def setMethods(suffix, mapImpl=None, bypassImpl=None, queryImpl=None):
                """Set convenience methods on CameraMapper"""
                mapName = "map_" + datasetType + "_" + suffix
                bypassName = "bypass_" + datasetType + "_" + suffix
                queryName = "query_" + datasetType + "_" + suffix
                if not hasattr(self, mapName):
                    setattr(self, mapName, mapImpl or getattr(self, "map_" + datasetType))
                if not hasattr(self, bypassName):
                    if bypassImpl is None and hasattr(self, "bypass_" + datasetType):
                        bypassImpl = getattr(self, "bypass_" + datasetType)
                    if bypassImpl is not None:
                        setattr(self, bypassName, bypassImpl)
                if not hasattr(self, queryName):
                    setattr(self, queryName, queryImpl or getattr(self, "query_" + datasetType))
            setMethods("filename", bypassImpl=lambda datasetType, pythonType, location, dataId:
                       [os.path.join(location.getStorage().root, p)
                        for p in location.getLocations()])
            if subPolicy["storage"] == "FitsStorage":
                def getMetadata(datasetType, pythonType, location, dataId):

                setMethods("md", bypassImpl=getMetadata)

            addName = "add_" + datasetType
            if not hasattr(self, addName):
            if name == "exposures":
                def getSkyWcs(datasetType, pythonType, location, dataId):
                    fitsReader = afwImage.ExposureFitsReader(location.getLocationsWithRoot()[0])
                    return fitsReader.readWcs()

                setMethods("wcs", bypassImpl=getSkyWcs)

                def getRawHeaderWcs(datasetType, pythonType, location, dataId):
                    """Create a SkyWcs from the un-modified raw
                    FITS WCS header keys."""
                    if datasetType[:3] != "raw":

                setMethods("header_wcs", bypassImpl=getRawHeaderWcs)

                def getPhotoCalib(datasetType, pythonType, location, dataId):
                    fitsReader = afwImage.ExposureFitsReader(location.getLocationsWithRoot()[0])
                    return fitsReader.readPhotoCalib()

                setMethods("photoCalib", bypassImpl=getPhotoCalib)

                def getVisitInfo(datasetType, pythonType, location, dataId):
                    fitsReader = afwImage.ExposureFitsReader(location.getLocationsWithRoot()[0])
                    return fitsReader.readVisitInfo()

                setMethods("visitInfo", bypassImpl=getVisitInfo)

                @deprecated(reason="Replaced with getFilterLabel. Will be removed after v22.",
                            category=FutureWarning)
                def getFilter(datasetType, pythonType, location, dataId):
                    fitsReader = afwImage.ExposureFitsReader(location.getLocationsWithRoot()[0])
                    return fitsReader.readFilter()

                setMethods("filter", bypassImpl=getFilter)
                def getFilterLabel(datasetType, pythonType, location, dataId):
                    fitsReader = afwImage.ExposureFitsReader(location.getLocationsWithRoot()[0])
                    storedFilter = fitsReader.readFilterLabel()

                    idFilter = mapping.need(['filter'], dataId)['filter']

                    bestFilter = self._getBestFilter(storedFilter, idFilter)
                    if bestFilter is not None:
                        return bestFilter

                setMethods("filterLabel", bypassImpl=getFilterLabel)
                setMethods("detector",
                           mapImpl=lambda dataId, write=False:
                               pythonType="lsst.afw.cameraGeom.CameraConfig",
                               storageName="Internal",
                               locationList="ignored",
                           bypassImpl=lambda datasetType, pythonType, location, dataId:

                def getBBox(datasetType, pythonType, location, dataId):
                    md = readMetadata(location.getLocationsWithRoot()[0], hdu=1)

                setMethods("bbox", bypassImpl=getBBox)
            elif name == "images":
                def getBBox(datasetType, pythonType, location, dataId):

                setMethods("bbox", bypassImpl=getBBox)

            if subPolicy["storage"] == "FitsCatalogStorage":
                def getMetadata(datasetType, pythonType, location, dataId):
                    md = readMetadata(os.path.join(location.getStorage().root,
                                                   location.getLocations()[0]), hdu=1)

                setMethods("md", bypassImpl=getMetadata)
            if subPolicy["storage"] == "FitsStorage":
                def mapSubClosure(dataId, write=False, mapper=weakref.proxy(self), mapping=mapping):
                    subId = dataId.copy()
                    del subId['bbox']
                    loc = mapping.map(mapper, subId, write)
                    bbox = dataId['bbox']
                    llcX = bbox.getMinX()
                    llcY = bbox.getMinY()
                    width = bbox.getWidth()
                    height = bbox.getHeight()
                    loc.additionalData.set('llcX', llcX)
                    loc.additionalData.set('llcY', llcY)
                    loc.additionalData.set('width', width)
                    loc.additionalData.set('height', height)
                    if 'imageOrigin' in dataId:
                        loc.additionalData.set('imageOrigin',
                                               dataId['imageOrigin'])
                    return loc

                def querySubClosure(key, format, dataId, mapping=mapping):
                    subId = dataId.copy()
                    del subId['bbox']
                    return mapping.lookup(format, subId)
                setMethods("sub", mapImpl=mapSubClosure, queryImpl=querySubClosure)
            if subPolicy["storage"] == "FitsCatalogStorage":
                def getLen(datasetType, pythonType, location, dataId):
                    md = readMetadata(os.path.join(location.getStorage().root,
                                                   location.getLocations()[0]), hdu=1)

                setMethods("len", bypassImpl=getLen)

                if not datasetType.endswith("_schema") and datasetType + "_schema" not in datasets:
                    setMethods("schema", bypassImpl=lambda datasetType, pythonType, location, dataId:
                               afwTable.Schema.readFits(os.path.join(location.getStorage().root,
                                                                     location.getLocations()[0])))
def _computeCcdExposureId(self, dataId):
    """Compute the 64-bit (long) identifier for a CCD exposure.

    Subclasses must override.

    Parameters
    ----------
    dataId : `dict`
        Data identifier with visit, ccd.
    """
    raise NotImplementedError()
def _computeCoaddExposureId(self, dataId, singleFilter):
    """Compute the 64-bit (long) identifier for a coadd.

    Subclasses must override.

    Parameters
    ----------
    dataId : `dict`
        Data identifier with tract and patch.
    singleFilter : `bool`
        True means the desired ID is for a single-filter coadd, in which
        case ``dataId`` must contain filter.
    """
    raise NotImplementedError()
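Implementations of ``_computeCcdExposureId`` typically pack the data ID fields into disjoint bit ranges of one integer. A hedged sketch follows; the field widths here are invented for illustration, and every camera chooses its own.

```python
# Hypothetical bit-packing scheme: low 9 bits for ccd, the rest for
# visit. The widths are illustrative only, not any real camera's.

CCD_BITS = 9  # supports up to 512 detectors

def computeCcdExposureId(dataId):
    """Pack visit and ccd into one 64-bit integer."""
    return (dataId["visit"] << CCD_BITS) | dataId["ccd"]

def unpackCcdExposureId(expId):
    """Invert the packing above."""
    return {"visit": expId >> CCD_BITS, "ccd": expId & ((1 << CCD_BITS) - 1)}

expId = computeCcdExposureId({"visit": 987654, "ccd": 42})
print(unpackCcdExposureId(expId))  # {'visit': 987654, 'ccd': 42}
```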
def _search(self, path):
    """Search for path in the associated repository's storage.

    Parameters
    ----------
    path : string
        Path that describes an object in the repository associated with
        this mapper.
        Path may contain an HDU indicator, e.g. 'foo.fits[1]'. The
        indicator will be stripped when searching and so will match
        filenames without the HDU indicator, e.g. 'foo.fits'. The path
        returned WILL contain the indicator though, e.g. ['foo.fits[1]'].

    Returns
    -------
    string
        The path for this object in the repository. Will return None if the
        object can't be found. If the input argument path contained an HDU
        indicator, the returned path will also contain the HDU indicator.
    """
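The HDU-indicator handling documented above can be sketched in isolation: split the indicator off for searching, then re-attach it to the result. ``splitHduIndicator`` is a hypothetical helper, not part of the real class.

```python
# Sketch of the HDU-indicator convention: search with the indicator
# stripped, return the path with the indicator restored.
import re

def splitHduIndicator(path):
    """Split 'foo.fits[1]' into ('foo.fits', '[1]'); no indicator -> ('foo.fits', '')."""
    m = re.match(r"(.*?)(\[\d+\])?$", path)
    return m.group(1), m.group(2) or ""

strippedPath, hdu = splitHduIndicator("foo.fits[1]")
print(strippedPath, hdu)              # foo.fits [1]
print(splitHduIndicator("bar.fits"))  # ('bar.fits', '')
```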
def backup(self, datasetType, dataId):
    """Rename any existing object with the given type and dataId.

    The CameraMapper implementation saves objects in a sequence of e.g.:

    All of the backups will be placed in the output repo, however, and will
    not be removed if they are found elsewhere in the _parent chain. This
    means that the same file will be stored twice if the previous version
    was found in an input repo.
    """
    def firstElement(list):
        """Get the first element in the list, or None if that can't be
        done.
        """
        return list[0] if list is not None and len(list) else None

    n = 0
    newLocation = self.map(datasetType, dataId, write=True)
    newPath = newLocation.getLocations()[0]
    path = dafPersist.PosixStorage.search(self.root, newPath, searchParents=True)
    path = firstElement(path)
    oldPaths = []
    while path is not None:
        n += 1
        oldPaths.append((n, path))
        path = dafPersist.PosixStorage.search(self.root, "%s~%d" % (newPath, n), searchParents=True)
        path = firstElement(path)
    for n, oldPath in reversed(oldPaths):
        self.rootStorage.copyFile(oldPath, "%s~%d" % (newPath, n))
    """Return supported keys.

    Returns
    -------
    iterable
        List of keys usable in a dataset identifier.
    """
def getKeys(self, datasetType, level):
    """Return a dict of supported keys and their value types for a given
    dataset type at a given level of the key hierarchy.

    Parameters
    ----------
    datasetType : `str`
        Dataset type or None for all dataset types.
    level : `str` or None
        Level or None for all levels or '' for the default level for the
        camera.

    Returns
    -------
    `dict`
        Keys are strings usable in a dataset identifier, values are their
        value types.
    """

    if datasetType is None:
        keyDict = copy.copy(self.keyDict)

    if level is not None and level in self.levels:
        keyDict = copy.copy(keyDict)
        for lev in self.levels[level]:
    """Return the name of the camera that this CameraMapper is for."""
    className = className[className.find('.'):-1]
    m = re.search(r'(\w+)Mapper', className)
    if m is None:
        m = re.search(r"class '[\w.]*?(\w+)'", className)
    name = m.group(1)
    return name[:1].lower() + name[1:] if name else ''
    """Return the name of the package containing this CameraMapper."""
    if cls.packageName is None:
        raise ValueError('class variable packageName must not be None')
    """Return the gen3 Instrument class equivalent for this gen2 Mapper.

    Returns
    -------
    instr : `type`
        A `~lsst.obs.base.Instrument` class.
    """
    raise NotImplementedError("Please provide a specific implementation for your instrument"
                              " to enable conversion of this gen2 repository to gen3")

    raise ValueError(f"Mapper {cls} has declared a gen3 instrument class of {cls._gen3instrument}"
                     " but that is not an lsst.obs.base.Instrument")
    """Return the base directory of this package."""
def map_camera(self, dataId, write=False):
    """Map a camera dataset."""
    if self.camera is None:
        raise RuntimeError("No camera dataset available.")
    return dafPersist.ButlerLocation(
        pythonType="lsst.afw.cameraGeom.CameraConfig",
        storageName="ConfigStorage",
def bypass_camera(self, datasetType, pythonType, butlerLocation, dataId):
    """Return the (preloaded) camera object.
    """
    if self.camera is None:
        raise RuntimeError("No camera dataset available.")
    return self.camera

        pythonType="lsst.obs.base.ExposureIdInfo",
        storageName="Internal",
        locationList="ignored",
    """Hook to retrieve an lsst.obs.base.ExposureIdInfo for an exposure"""
    expId = self.bypass_ccdExposureId(datasetType, pythonType, location, dataId)
    expBits = self.bypass_ccdExposureId_bits(datasetType, pythonType, location, dataId)
    return ExposureIdInfo(expId=expId, expBits=expBits)
    """Disable standardization for bfKernel.

    bfKernel is a calibration product that is a numpy array, unlike other
    calibration products, which are all images; all calibration images are
    sent through _standardizeExposure due to CalibrationMapping, but we
    don't want that to happen to bfKernel.
    """
def std_raw(self, item, dataId):
    """Standardize a raw dataset by converting it to an Exposure instead
    of an Image."""
    return self._standardizeExposure(self.exposures['raw'], item, dataId,
                                     trimmed=False, setVisitInfo=True)
    """Map a sky policy."""
    return dafPersist.ButlerLocation("lsst.pex.policy.Policy", "Policy",
                                     "Internal", None, None, self,
                                     storage=self.rootStorage)

    """Standardize a sky policy by returning the one we use."""
    return self.skypolicy
def _setupRegistry(self, name, description, path, policy, policyKey, storage, searchParents=True,
                   posixIfNoSql=True):
    """Set up a registry (usually SQLite3), trying a number of possible
    paths.

    Parameters
    ----------
    description : `str`
        Description of registry (for log messages)
    policy : string
        Policy that contains the registry name, used if path is None.
    policyKey : string
        Key in policy for registry path.
    storage : Storage subclass
        Repository Storage to look in.
    searchParents : bool, optional
        True if the search for a registry should follow any Butler v1
        _parent symlinks.
    posixIfNoSql : bool, optional
        If an sqlite registry is not found, will create a posix registry if
        this is True.

    Returns
    -------
    lsst.daf.persistence.Registry
        Registry object
    """
    if path is None and policyKey in policy:
        if os.path.isabs(path):
            raise RuntimeError("Policy should not indicate an absolute path for registry.")
        if not storage.exists(path):
            newPath = storage.instanceSearch(path)
            newPath = newPath[0] if newPath is not None and len(newPath) else None
            self.log.warn("Unable to locate registry at policy path (also looked in root): %s",
                          path)

        self.log.warn("Unable to locate registry at policy path: %s", path)
        if path and (path.startswith(root)):
            path = path[len(root + '/'):]
    except AttributeError:
        pass
    def search(filename, description):
        """Search for file in storage

        Parameters
        ----------
        filename : `str`
            Filename to search for
        description : `str`
            Description of file, for error message.

        Returns
        -------
        path : `str` or `None`
            Path to file, or None
        """
        result = storage.instanceSearch(filename)
        if result:
            return result[0]
        self.log.debug("Unable to locate %s: %s", description, filename)
        return None
    path = search("%s.pgsql" % name, "%s in root" % description)
    if not path:
        path = search("%s.sqlite3" % name, "%s in root" % description)
    if not path:
        path = search(os.path.join(".", "%s.sqlite3" % name), "%s in current dir" % description)
    if not storage.exists(path):
        newPath = storage.instanceSearch(path)
        newPath = newPath[0] if newPath is not None and len(newPath) else None
        if newPath is not None:
            path = newPath
    localFileObj = storage.getLocalFile(path)
    self.log.info("Loading %s registry from %s", description, localFileObj.name)
    registry = dafPersist.Registry.create(localFileObj.name)

    elif not registry and posixIfNoSql:
        self.log.info("Loading Posix %s registry from %s", description, storage.root)
def _transformId(self, dataId):
    """Generate a standard ID dict from a camera-specific ID dict.

    Canonical keys include:
    - amp: amplifier name
    - ccd: CCD name (in LSST this is a combination of raft and sensor)
    The default implementation returns a copy of its input.

    Parameters
    ----------
    dataId : `dict`
        Dataset identifier; this must not be modified

    Returns
    -------
    `dict`
        Transformed dataset identifier.
    """
    return dataId.copy()
def _mapActualToPath(self, template, actualId):
    """Convert a template path to an actual path, using the actual data
    identifier. This implementation is usually sufficient but can be
    overridden by the subclass.
    """
    try:
        transformedId = self._transformId(actualId)
        return template % transformedId
    except Exception as e:
        raise RuntimeError("Failed to format %r with data %r: %s" % (template, transformedId, e))
    """Convert a CCD name to a form useful as a filename

    The default implementation converts spaces to underscores.
    """
    return ccdName.replace(" ", "_")
def _extractDetectorName(self, dataId):
    """Extract the detector (CCD) name from the dataset identifier.

    The name in question is the detector name used by lsst.afw.cameraGeom.
    """
    raise NotImplementedError("No _extractDetectorName() function specified")
def _setAmpDetector(self, item, dataId, trimmed=True):
    """Set the detector object in an Exposure for an amplifier.

    Defects are also added to the Exposure based on the detector object.

    Parameters
    ----------
    item : `lsst.afw.image.Exposure`
        Exposure to set the detector in.
    dataId : `dict`
        Dataset identifier.
    trimmed : `bool`
        Should detector be marked as trimmed? (ignored)
    """
    return self._setCcdDetector(item=item, dataId=dataId, trimmed=trimmed)
def _setCcdDetector(self, item, dataId, trimmed=True):
    """Set the detector object in an Exposure for a CCD.

    Parameters
    ----------
    item : `lsst.afw.image.Exposure`
        Exposure to set the detector in.
    dataId : `dict`
        Dataset identifier.
    trimmed : `bool`
        Should detector be marked as trimmed? (ignored)
    """
    if item.getDetector() is not None:
        return

    detectorName = self._extractDetectorName(dataId)
    detector = self.camera[detectorName]
    item.setDetector(detector)
@staticmethod
def _resolveFilters(definitions, idFilter, filterLabel):
    """Identify the filter(s) consistent with partial filter information.

    Parameters
    ----------
    definitions : `lsst.obs.base.FilterDefinitionCollection`
        The filter definitions in which to search for filters.
    idFilter : `str` or `None`
        The filter information provided in a data ID.
    filterLabel : `lsst.afw.image.FilterLabel` or `None`
        The filter information provided by an exposure; may be incomplete.

    Returns
    -------
    filters : `set` [`lsst.obs.base.FilterDefinition`]
        The set of filters consistent with ``idFilter``
        and ``filterLabel``.
    """
    matches = set(definitions)
    if idFilter is not None:
        matches.intersection_update(definitions.findAll(idFilter))
    if filterLabel is not None and filterLabel.hasPhysicalLabel():
        matches.intersection_update(definitions.findAll(filterLabel.physicalLabel))
    if filterLabel is not None and filterLabel.hasBandLabel():
        matches.intersection_update(definitions.findAll(filterLabel.bandLabel))
    return matches
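The narrowing logic above starts from all filter definitions and intersects with each piece of partial information. A pure-Python sketch, where ``FilterDef``, ``findAll``, and ``resolveFilters`` are hypothetical stand-ins for the real `lsst.obs.base` classes, and the HSC-style filter names are only examples:

```python
# Set-intersection filter resolution, mirroring _resolveFilters above.
from collections import namedtuple

FilterDef = namedtuple("FilterDef", ["physical", "band"])

definitions = {
    FilterDef("HSC-G", "g"),
    FilterDef("HSC-R", "r"),
    FilterDef("HSC-R2", "r"),
}

def findAll(definitions, name):
    """All definitions matching a physical or band name."""
    return {d for d in definitions if name in (d.physical, d.band)}

def resolveFilters(definitions, idFilter=None, physical=None, band=None):
    matches = set(definitions)
    for key in (idFilter, physical, band):
        if key is not None:
            matches &= findAll(definitions, key)
    return matches

print(sorted(d.physical for d in resolveFilters(definitions, band="r")))
# ['HSC-R', 'HSC-R2']
print(sorted(d.physical for d in resolveFilters(definitions, idFilter="r", physical="HSC-R2")))
# ['HSC-R2']
```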
def _getBestFilter(self, storedLabel, idFilter):
    """Estimate the most complete filter information consistent with the
    available data.

    Parameters
    ----------
    storedLabel : `lsst.afw.image.FilterLabel` or `None`
        The filter previously stored in the file.
    idFilter : `str` or `None`
        The filter implied by the data ID, if any.

    Returns
    -------
    bestFilter : `lsst.afw.image.FilterLabel` or `None`
        The complete filter to describe the dataset. May be equal to
        ``storedLabel``. `None` if no recommendation can be generated.
    """
    try:
        filterDefinitions = self.getGen3Instrument()().filterDefinitions
    except NotImplementedError:
        filterDefinitions = None

    if filterDefinitions is not None:
        definitions = self._resolveFilters(filterDefinitions, idFilter, storedLabel)
        self.log.debug("Matching filters for id=%r and label=%r are %s.",
                       idFilter, storedLabel, definitions)
        if len(definitions) == 1:

        self.log.warn("Multiple matches for filter %r with data ID %r.", storedLabel, idFilter)
        bands = {d.band for d in definitions}
        if len(bands) == 1 and storedLabel is None:
            band = list(bands)[0]

    self.log.warn("Cannot reconcile filter %r with data ID %r.", storedLabel, idFilter)
def _setFilter(self, mapping, item, dataId):
    """Set the filter information in an Exposure.

    The Exposure should already have had a filter loaded, but the reader
    (in ``afw``) had to act on incomplete information. This method
    cross-checks the filter against the data ID and the standard list
    of filters.

    Parameters
    ----------
    mapping : `lsst.obs.base.Mapping`
        Where to get the data ID filter from.
    item : `lsst.afw.image.Exposure`
        Exposure to set the filter in.
    dataId : `dict`
        Dataset identifier.
    """
    if not (isinstance(item, afwImage.ExposureU) or isinstance(item, afwImage.ExposureI)
            or isinstance(item, afwImage.ExposureF) or isinstance(item, afwImage.ExposureD)):
        return

    itemFilter = item.getFilterLabel()
    idFilter = mapping.need(['filter'], dataId)['filter']

    bestFilter = self._getBestFilter(itemFilter, idFilter)
    if bestFilter is not None:
        if bestFilter != itemFilter:
            item.setFilterLabel(bestFilter)
    elif itemFilter is None:
        if self.filters is not None and idFilter in self.filters:
            idFilter = self.filters[idFilter]

        with warnings.catch_warnings():
            warnings.filterwarnings("ignore", category=FutureWarning)

        self.log.warn("Filter %s not defined. Set to UNKNOWN.", idFilter)
def _standardizeExposure(self, mapping, item, dataId, filter=True,
                         trimmed=True, setVisitInfo=True):
    """Default standardization function for images.

    This sets the Detector from the camera geometry
    and optionally sets the Filter. In both cases this saves
    having to persist some data in each exposure (or image).

    Parameters
    ----------
    mapping : `lsst.obs.base.Mapping`
        Where to get the values from.
    item : image-like object
        Can be any of lsst.afw.image.Exposure,
        lsst.afw.image.DecoratedImage, lsst.afw.image.Image
        or lsst.afw.image.MaskedImage
    dataId : `dict`
        Dataset identifier.
    filter : `bool`
        Set filter? Ignored if item is already an exposure.
    trimmed : `bool`
        Should detector be marked as trimmed?
    setVisitInfo : `bool`
        Should Exposure have its VisitInfo filled out from the metadata?

    Returns
    -------
    `lsst.afw.image.Exposure`
        The standardized Exposure.
    """
    try:
        exposure = exposureFromImage(item, dataId, mapper=self, logger=self.log,
                                     setVisitInfo=setVisitInfo)
    except Exception as e:
        self.log.error("Could not turn item=%r into an exposure: %s" % (repr(item), e))
        raise

    if mapping.level.lower() == "amp":
        self._setAmpDetector(exposure, dataId, trimmed)
    elif mapping.level.lower() == "ccd":
        self._setCcdDetector(exposure, dataId, trimmed)

    if mapping.level.lower() != "amp" and exposure.getWcs() is None and \
            (exposure.getInfo().getVisitInfo() is not None or exposure.getMetadata().toDict()):
        self._createInitialSkyWcs(exposure)

    if filter:
        self._setFilter(mapping, exposure, dataId)

    return exposure
def _createSkyWcsFromMetadata(self, exposure):
    """Create a SkyWcs from the FITS header metadata in an Exposure.

    Parameters
    ----------
    exposure : `lsst.afw.image.Exposure`
        The exposure to get metadata from, and attach the SkyWcs to.
    """
    metadata = exposure.getMetadata()
    fix_header(metadata, translator_class=self.translatorClass)
    try:
        wcs = afwGeom.makeSkyWcs(metadata, strip=True)
        exposure.setWcs(wcs)
    except pexExcept.TypeError as e:
        self.log.debug("wcs set to None; missing information found in metadata to create a valid wcs:"
                       " %s", e.args[0])

    exposure.setMetadata(metadata)
def _createInitialSkyWcs(self, exposure):
    """Create a SkyWcs from the boresight and camera geometry.

    If the boresight or camera geometry do not support this method of
    WCS creation, this falls back on the header metadata-based version
    (typically a purely linear FITS crval/crpix/cdmatrix WCS).

    Parameters
    ----------
    exposure : `lsst.afw.image.Exposure`
        The exposure to get data from, and attach the SkyWcs to.
    """
    self._createSkyWcsFromMetadata(exposure)

    if exposure.getInfo().getVisitInfo() is None:
        msg = "No VisitInfo; cannot access boresight information. Defaulting to metadata-based SkyWcs."
        self.log.warn(msg)
        return
    try:
        newSkyWcs = createInitialSkyWcs(exposure.getInfo().getVisitInfo(), exposure.getDetector())
        exposure.setWcs(newSkyWcs)
    except InitialSkyWcsError as e:
        msg = "Cannot create SkyWcs using VisitInfo and Detector, using metadata-based SkyWcs: %s"
        self.log.warn(msg, e)
        self.log.debug("Exception was: %s", traceback.TracebackException.from_exception(e))
        if e.__context__ is not None:
            self.log.debug("Root-cause Exception was: %s",
                           traceback.TracebackException.from_exception(e.__context__))
def _makeCamera(self, policy, repositoryDir):
    """Make a camera (instance of lsst.afw.cameraGeom.Camera) describing
    the camera geometry.

    Also set self.cameraDataLocation, if relevant (else it can be left
    None).

    This implementation assumes that policy contains an entry "camera"
    that points to the subdirectory in this package of camera data;
    specifically, that subdirectory must contain:
    - a file named `camera.py` that contains persisted camera config
    - ampInfo table FITS files, as required by
      lsst.afw.cameraGeom.makeCameraFromPath

    Parameters
    ----------
    policy : `lsst.daf.persistence.Policy`
        Policy with per-camera defaults already merged
        (PexPolicy only for backward compatibility).
    repositoryDir : `str`
        Policy repository for the subclassing module (obtained with
        getRepositoryPath() on the per-camera default dictionary).
    """
    if 'camera' not in policy:
        raise RuntimeError("Cannot find 'camera' in policy; cannot construct a camera")
    cameraDataSubdir = policy['camera']
    self.cameraDataLocation = os.path.normpath(
        os.path.join(repositoryDir, cameraDataSubdir, "camera.py"))
    cameraConfig = afwCameraGeom.CameraConfig()
    cameraConfig.load(self.cameraDataLocation)
    ampInfoPath = os.path.dirname(self.cameraDataLocation)
    return afwCameraGeom.makeCameraFromPath(
        cameraConfig=cameraConfig,
        ampInfoPath=ampInfoPath,
    def getRegistry(self):
        """Get the registry used by this mapper.

        Returns
        -------
        Registry
            The registry used by this mapper for this mapper's repository.
        """
        return self.registry
    def getImageCompressionSettings(self, datasetType, dataId):
        """Stuff image compression settings into a daf.base.PropertySet

        This goes into the ButlerLocation's "additionalData", which gets
        passed into the boost::persistence framework.

        Parameters
        ----------
        datasetType : `str`
            Type of dataset for which to get the image compression settings.
        dataId : `dict`
            Dataset identifier.

        Returns
        -------
        additionalData : `lsst.daf.base.PropertySet`
            Image compression settings.
        """
        mapping = self.mappings[datasetType]
        recipeName = mapping.recipe
        storageType = mapping.storage
        if storageType not in self._writeRecipes:
            return dafBase.PropertySet()
        if recipeName not in self._writeRecipes[storageType]:
            raise RuntimeError("Unrecognized write recipe for datasetType %s (storage type %s): %s" %
                               (datasetType, storageType, recipeName))
        recipe = self._writeRecipes[storageType][recipeName].deepCopy()
        seed = hash(tuple(dataId.items())) % 2**31
        for plane in ("image", "mask", "variance"):
            if recipe.exists(plane + ".scaling.seed") and recipe.getScalar(plane + ".scaling.seed") == 0:
                recipe.set(plane + ".scaling.seed", seed)
        return recipe
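The seed filled in above is derived from the data identifier so that a recipe with `seed: 0` gets a deterministic, dataset-specific value. A minimal stand-alone sketch of that derivation (the `compressionSeed` name is hypothetical; the expression mirrors the one above):

```python
def compressionSeed(dataId):
    """Derive a fuzzing seed from a data identifier, mirroring the
    hash(tuple(dataId.items())) % 2**31 expression above.

    Python randomizes str hashes per process (PYTHONHASHSEED), so the
    seed is stable within a run but not necessarily across runs.
    """
    return hash(tuple(dataId.items())) % 2**31


seed = compressionSeed({"visit": 903334, "ccd": 23})
```

The modulus keeps the value inside the positive range of a 32-bit signed integer, which is what a FITS-writing backend can safely accept for a seed.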
    def _initWriteRecipes(self):
        """Read the recipes for writing files.

        These recipes are currently used for configuring FITS compression,
        but they could have wider uses for configuring different flavors
        of the storage types. A recipe is referred to by a symbolic name,
        which has associated settings. These settings are stored as a
        `PropertySet` so they can easily be passed down to the
        boost::persistence framework as the "additionalData" parameter.

        The list of recipes is written in YAML. A default recipe and
        some other convenient recipes are in obs_base/policy/writeRecipes.yaml
        and these may be overridden or supplemented by the individual obs_*
        packages' own policy/writeRecipes.yaml files.

        Recipes are grouped by the storage type. Currently, only the
        ``FitsStorage`` storage type uses recipes, which it uses to
        configure FITS image compression.

        Each ``FitsStorage`` recipe for FITS compression should define
        "image", "mask" and "variance" entries, each of which may contain
        "compression" and "scaling" entries. Defaults will be provided for
        any missing elements under "compression" and "scaling".

        The allowed entries under "compression" are:

        * algorithm (string): compression algorithm to use
        * rows (int): number of rows per tile (0 = entire dimension)
        * columns (int): number of columns per tile (0 = entire dimension)
        * quantizeLevel (float): cfitsio quantization level

        The allowed entries under "scaling" are:

        * algorithm (string): scaling algorithm to use
        * bitpix (int): bits per pixel (0, 8, 16, 32, 64, -32, -64)
        * fuzz (bool): fuzz the values when quantising floating-point values?
        * seed (long): seed for random number generator when fuzzing
        * maskPlanes (list of string): mask planes to ignore when doing
          statistics
        * quantizeLevel: divisor of the standard deviation for STDEV_* scaling
        * quantizePad: number of stdev to allow on the low side (for
          STDEV_POSITIVE/NEGATIVE)
        * bscale: manually specified BSCALE (for MANUAL scaling)
        * bzero: manually specified BZERO (for MANUAL scaling)

        A very simple example YAML recipe:

            FitsStorage:
              default:
                image: &default
                  compression:
                    algorithm: GZIP_SHUFFLE
                mask: *default
                variance: *default
        """
        recipesFile = os.path.join(getPackageDir("obs_base"), "policy", "writeRecipes.yaml")
        recipes = dafPersist.Policy(recipesFile)
        supplementsFile = os.path.join(self.getPackageDir(), "policy", "writeRecipes.yaml")
        validationMenu = {'FitsStorage': validateRecipeFitsStorage, }
        if os.path.exists(supplementsFile) and supplementsFile != recipesFile:
            supplements = dafPersist.Policy(supplementsFile)
            # Don't allow supplements to override the default recipes.
            for entry in validationMenu:
                intersection = set(recipes[entry].names()).intersection(set(supplements.names()))
                if intersection:
                    raise RuntimeError("Recipes provided in %s section %s may not override those in %s: %s" %
                                       (supplementsFile, entry, recipesFile, intersection))
            recipes.update(supplements)

        self._writeRecipes = {}
        for storageType in recipes.names(True):
            if "default" not in recipes[storageType]:
                raise RuntimeError("No 'default' recipe defined for storage type %s in %s" %
                                   (storageType, recipesFile))
            self._writeRecipes[storageType] = validationMenu[storageType](recipes[storageType])
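The supplement-merging step above refuses to let an obs_* package silently override a default recipe name. A plain-dict sketch of that check, under the assumption that recipes are nested mappings of section name to recipe name to settings (the real code operates on `Policy` objects, and `mergeRecipes` is a hypothetical helper):

```python
def mergeRecipes(defaults, supplements, sections=("FitsStorage",)):
    """Merge supplemental write recipes into the defaults, raising if a
    supplement would override an existing recipe name in a validated
    section, as the intersection check above does."""
    for section in sections:
        overlap = set(defaults.get(section, {})) & set(supplements.get(section, {}))
        if overlap:
            raise RuntimeError("Supplements may not override recipes in section %s: %s"
                               % (section, sorted(overlap)))
    # Shallow-copy the defaults, then fold in the supplements.
    merged = {section: dict(entries) for section, entries in defaults.items()}
    for section, entries in supplements.items():
        merged.setdefault(section, {}).update(entries)
    return merged


base = {"FitsStorage": {"default": {"image": {}}}}
extra = {"FitsStorage": {"lossless": {"image": {}}}}
merged = mergeRecipes(base, extra)
```

Failing loudly here is the safer design: an obs_* package that shadowed `default` would otherwise change compression behavior for every camera using the shared recipe name.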
def exposureFromImage(image, dataId=None, mapper=None, logger=None, setVisitInfo=True):
    """Generate an Exposure from an image-like object

    If the image is a DecoratedImage then also set its WCS and metadata
    (Image and MaskedImage are missing the necessary metadata
    and Exposure already has those set)

    Parameters
    ----------
    image : Image-like object
        Can be one of lsst.afw.image.DecoratedImage, Image, MaskedImage or
        Exposure.

    Returns
    -------
    `lsst.afw.image.Exposure`
        Exposure containing input image.
    """
    translatorClass = None
    if mapper is not None:
        translatorClass = mapper.translatorClass
    metadata = None
    if isinstance(image, afwImage.MaskedImage):
        exposure = afwImage.makeExposure(image)
    elif isinstance(image, afwImage.DecoratedImage):
        exposure = afwImage.makeExposure(afwImage.makeMaskedImage(image.getImage()))
        metadata = image.getMetadata()
        fix_header(metadata, translator_class=translatorClass)
        exposure.setMetadata(metadata)
    elif isinstance(image, afwImage.Exposure):
        exposure = image
        metadata = exposure.getMetadata()
        fix_header(metadata, translator_class=translatorClass)
    else:  # Image
        exposure = afwImage.makeExposure(afwImage.makeMaskedImage(image))

    # Set VisitInfo if we can.
    if setVisitInfo and exposure.getInfo().getVisitInfo() is None:
        if metadata is not None:
            if mapper is None:
                if not logger:
                    logger = lsstLog.Log.getLogger("CameraMapper")
                logger.warn("I can only set the VisitInfo if you provide a mapper")
            else:
                exposureId = mapper._computeCcdExposureId(dataId)
                visitInfo = mapper.makeRawVisitInfo(md=metadata, exposureId=exposureId)
                exposure.getInfo().setVisitInfo(visitInfo)

    return exposure
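`exposureFromImage` dispatches on the input type with a chain of isinstance checks. The same dispatch can be expressed with `functools.singledispatch`; the classes below are hypothetical stand-ins for the `lsst.afw.image` types, used only to make the pattern self-contained.

```python
from functools import singledispatch


# Hypothetical stand-ins for the afw image classes; the real function
# dispatches on lsst.afw.image types with isinstance checks instead.
class Image:
    pass


class MaskedImage(Image):
    pass


class DecoratedImage(Image):
    metadata = {"FILTER": "g"}


class Exposure:
    metadata = None


@singledispatch
def toExposure(image):
    # Plain Image or MaskedImage: wrap with no metadata to carry over.
    return Exposure()


@toExposure.register
def _(image: DecoratedImage):
    # DecoratedImage also carries metadata that must be copied across.
    exposure = Exposure()
    exposure.metadata = image.metadata
    return exposure


@toExposure.register
def _(image: Exposure):
    # Already an Exposure: pass through unchanged.
    return image
```

`singledispatch` resolves on the runtime type of the first argument, so subclasses without their own registration (here `MaskedImage`) fall back to the base handler, matching the `else` branch in the original chain.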
def validateRecipeFitsStorage(recipes):
    """Validate recipes for FitsStorage

    The recipes are supplemented with default values where appropriate.

    TODO: replace this custom validation code with Cerberus (DM-11846)

    Parameters
    ----------
    recipes : `lsst.daf.persistence.Policy`
        FitsStorage recipes to validate.

    Returns
    -------
    validated : `lsst.daf.base.PropertySet`
        Validated FitsStorage recipe.

    Raises
    ------
    RuntimeError
        If validation fails.
    """
    # The schemas define the allowed entries and their default values (and,
    # through the default value, the expected type).
    compressionSchema = {
        "algorithm": "NONE",
        "rows": 1,
        "columns": 0,
        "quantizeLevel": 0.0,
    }
    scalingSchema = {
        "algorithm": "NONE",
        "bitpix": 0,
        "maskPlanes": ["NO_DATA"],
        "seed": 0,
        "quantizeLevel": 4.0,
        "quantizePad": 5.0,
        "fuzz": True,
        "bscale": 1.0,
        "bzero": 0.0,
    }

    def checkUnrecognized(entry, allowed, description):
        """Check to see if the entry contains unrecognised keywords"""
        unrecognized = set(entry.keys()) - set(allowed)
        if unrecognized:
            raise RuntimeError(
                "Unrecognized entries when parsing image compression recipe %s: %s" %
                (description, unrecognized))

    validated = {}
    for name in recipes.names(True):
        checkUnrecognized(recipes[name], ["image", "mask", "variance"], name)
        rr = dafBase.PropertySet()
        validated[name] = rr
        for plane in ("image", "mask", "variance"):
            checkUnrecognized(recipes[name][plane], ["compression", "scaling"],
                              name + "->" + plane)

            for settings, schema in (("compression", compressionSchema),
                                     ("scaling", scalingSchema)):
                prefix = plane + "." + settings
                if settings not in recipes[name][plane]:
                    for key in schema:
                        rr.set(prefix + "." + key, schema[key])
                    continue
                entry = recipes[name][plane][settings]
                checkUnrecognized(entry, schema.keys(), name + "->" + plane + "->" + settings)
                for key in schema:
                    value = type(schema[key])(entry[key]) if key in entry else schema[key]
                    rr.set(prefix + "." + key, value)
    return validated
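The per-key loop above both rejects unknown keys and fills defaults, coercing supplied values to the type of the schema default. A dict-based sketch of that pattern (`applyDefaults` is a hypothetical helper name; the real code writes into a `PropertySet`):

```python
def applyDefaults(entry, schema):
    """Reject unrecognized keys, fill missing keys from the schema's
    defaults, and coerce supplied values to each default's type, as the
    "compression"/"scaling" validation loop above does."""
    unrecognized = set(entry) - set(schema)
    if unrecognized:
        raise RuntimeError("Unrecognized entries: %s" % sorted(unrecognized))
    return {key: type(schema[key])(entry[key]) if key in entry else schema[key]
            for key in schema}


# An int quantizeLevel from YAML is coerced to the schema's float type.
settings = applyDefaults({"quantizeLevel": 8},
                         {"algorithm": "NONE", "quantizeLevel": 0.0})
```

The type coercion matters because YAML readers may parse `8` as an int even where the consumer expects a float; deriving the target type from the default keeps the schema in one place.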