from deprecated.sphinx import deprecated

from astro_metadata_translator import fix_header

from . import ImageMapping, ExposureMapping, CalibrationMapping, DatasetMapping
from .exposureIdInfo import ExposureIdInfo
from .makeRawVisitInfo import MakeRawVisitInfo
from .utils import createInitialSkyWcs, InitialSkyWcsError

__all__ = ["CameraMapper", "exposureFromImage"]
51 """CameraMapper is a base class for mappers that handle images from a 52 camera and products derived from them. This provides an abstraction layer 53 between the data on disk and the code. 55 Public methods: keys, queryMetadata, getDatasetTypes, map, 56 canStandardize, standardize 58 Mappers for specific data sources (e.g., CFHT Megacam, LSST 59 simulations, etc.) should inherit this class. 61 The CameraMapper manages datasets within a "root" directory. Note that 62 writing to a dataset present in the input root will hide the existing 63 dataset but not overwrite it. See #2160 for design discussion. 65 A camera is assumed to consist of one or more rafts, each composed of 66 multiple CCDs. Each CCD is in turn composed of one or more amplifiers 67 (amps). A camera is also assumed to have a camera geometry description 68 (CameraGeom object) as a policy file, a filter description (Filter class 69 static configuration) as another policy file. 71 Information from the camera geometry and defects are inserted into all 72 Exposure objects returned. 74 The mapper uses one or two registries to retrieve metadata about the 75 images. The first is a registry of all raw exposures. This must contain 76 the time of the observation. One or more tables (or the equivalent) 77 within the registry are used to look up data identifier components that 78 are not specified by the user (e.g. filter) and to return results for 79 metadata queries. The second is an optional registry of all calibration 80 data. This should contain validity start and end entries for each 81 calibration dataset in the same timescale as the observation time. 83 Subclasses will typically set MakeRawVisitInfoClass and optionally the 84 metadata translator class: 86 MakeRawVisitInfoClass: a class variable that points to a subclass of 87 MakeRawVisitInfo, a functor that creates an 88 lsst.afw.image.VisitInfo from the FITS metadata of a raw image. 90 translatorClass: The `~astro_metadata_translator.MetadataTranslator` 91 class to use for fixing metadata values. If it is not set an attempt 92 will be made to infer the class from ``MakeRawVisitInfoClass``, failing 93 that the metadata fixup will try to infer the translator class from the 96 Subclasses must provide the following methods: 98 _extractDetectorName(self, dataId): returns the detector name for a CCD 99 (e.g., "CFHT 21", "R:1,2 S:3,4") as used in the AFW CameraGeom class given 100 a dataset identifier referring to that CCD or a subcomponent of it. 102 _computeCcdExposureId(self, dataId): see below 104 _computeCoaddExposureId(self, dataId, singleFilter): see below 106 Subclasses may also need to override the following methods: 108 _transformId(self, dataId): transformation of a data identifier 109 from colloquial usage (e.g., "ccdname") to proper/actual usage 110 (e.g., "ccd"), including making suitable for path expansion (e.g. removing 111 commas). The default implementation does nothing. Note that this 112 method should not modify its input parameter. 114 getShortCcdName(self, ccdName): a static method that returns a shortened 115 name suitable for use as a filename. The default version converts spaces 118 _mapActualToPath(self, template, actualId): convert a template path to an 119 actual path, using the actual dataset identifier. 121 The mapper's behaviors are largely specified by the policy file. 122 See the MapperDictionary.paf for descriptions of the available items. 124 The 'exposures', 'calibrations', and 'datasets' subpolicies configure 125 mappings (see Mappings class). 
127 Common default mappings for all subclasses can be specified in the 128 "policy/{images,exposures,calibrations,datasets}.yaml" files. This 129 provides a simple way to add a product to all camera mappers. 131 Functions to map (provide a path to the data given a dataset 132 identifier dictionary) and standardize (convert data into some standard 133 format or type) may be provided in the subclass as "map_{dataset type}" 134 and "std_{dataset type}", respectively. 136 If non-Exposure datasets cannot be retrieved using standard 137 daf_persistence methods alone, a "bypass_{dataset type}" function may be 138 provided in the subclass to return the dataset instead of using the 139 "datasets" subpolicy. 141 Implementations of map_camera and bypass_camera that should typically be 142 sufficient are provided in this base class. 148 Instead of auto-loading the camera at construction time, load it from 149 the calibration registry 153 policy : daf_persistence.Policy, 154 Policy with per-camera defaults already merged. 155 repositoryDir : string 156 Policy repository for the subclassing module (obtained with 157 getRepositoryPath() on the per-camera default dictionary). 158 root : string, optional 159 Path to the root directory for data. 160 registry : string, optional 161 Path to registry with data's metadata. 162 calibRoot : string, optional 163 Root directory for calibrations. 164 calibRegistry : string, optional 165 Path to registry with calibrations' metadata. 166 provided : list of string, optional 167 Keys provided by the mapper. 168 parentRegistry : Registry subclass, optional 169 Registry from a parent repository that may be used to look up 171 repositoryCfg : daf_persistence.RepositoryCfg or None, optional 172 The configuration information for the repository this mapper is 179 MakeRawVisitInfoClass = MakeRawVisitInfo
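    # Illustrative sketch (not part of the original source): a concrete obs
    # package would typically subclass CameraMapper and override the class
    # variables described in the docstring above.  The names MyCamMapper,
    # MyMakeRawVisitInfo and MyCamTranslator are hypothetical placeholders.
    #
    #     class MyCamMapper(CameraMapper):
    #         packageName = "obs_mycam"
    #         MakeRawVisitInfoClass = MyMakeRawVisitInfo   # subclass of MakeRawVisitInfo
    #         translatorClass = MyCamTranslator            # astro_metadata_translator translator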
    PupilFactoryClass = afwCameraGeom.PupilFactory

    translatorClass = None

    def __init__(self, policy, repositoryDir,
                 root=None, registry=None, calibRoot=None, calibRegistry=None,
                 provided=None, parentRegistry=None, repositoryCfg=None):
        dafPersist.Mapper.__init__(self)

        self.log = lsstLog.Log.getLogger("CameraMapper")

        self.root = repositoryCfg.root

        repoPolicy = repositoryCfg.policy if repositoryCfg else None
        if repoPolicy is not None:
            policy.update(repoPolicy)

        if 'levels' in policy:
            levelsPolicy = policy['levels']
            for key in levelsPolicy.names(True):
                self.levels[key] = set(levelsPolicy.asArray(key))
        if 'defaultSubLevels' in policy:
        if calibRoot is not None:
            calibRoot = dafPersist.Storage.absolutePath(root, calibRoot)
            calibStorage = dafPersist.Storage.makeFromURI(uri=calibRoot,
            calibRoot = policy.get('calibRoot', None)
            calibStorage = dafPersist.Storage.makeFromURI(uri=calibRoot,
        if calibStorage is None:

                                            posixIfNoSql=(not parentRegistry))

        needCalibRegistry = policy.get('needCalibRegistry', None)
        if needCalibRegistry:
                                                         "calibRegistryPath", calibStorage,
                    "'needCalibRegistry' is true in Policy, but was unable to locate a repo at " +
                    "calibRoot ivar:%s or policy['calibRoot']:%s" %
                    (calibRoot, policy.get('calibRoot', None)))

        raise ValueError('class variable packageName must not be None')
    def _initMappings(self, policy, rootStorage=None, calibStorage=None, provided=None):
        """Initialize mappings

        For each of the dataset types that we want to be able to read, there
        are methods that can be created to support them:

        * map_<dataset> : determine the path for dataset
        * std_<dataset> : standardize the retrieved dataset
        * bypass_<dataset> : retrieve the dataset (bypassing the usual
          retrieval machinery)
        * query_<dataset> : query the registry

        Besides the dataset types explicitly listed in the policy, we create
        additional, derived datasets for additional conveniences,
        e.g., reading the header of an image, retrieving only the size of a
        catalog.

        Parameters
        ----------
        policy : `lsst.daf.persistence.Policy`
            Policy with per-camera defaults already merged
        rootStorage : `Storage subclass instance`
            Interface to persisted repository data.
        calibRoot : `Storage subclass instance`
            Interface to persisted calib repository data
        provided : `list` of `str`
            Keys provided by the mapper
        """
        imgMappingPolicy = dafPersist.Policy(dafPersist.Policy.defaultPolicyFile(
            "obs_base", "ImageMappingDefaults.yaml", "policy"))
        expMappingPolicy = dafPersist.Policy(dafPersist.Policy.defaultPolicyFile(
            "obs_base", "ExposureMappingDefaults.yaml", "policy"))
        calMappingPolicy = dafPersist.Policy(dafPersist.Policy.defaultPolicyFile(
            "obs_base", "CalibrationMappingDefaults.yaml", "policy"))
        mappingList = (
            ("images", imgMappingPolicy, ImageMapping),
            ("exposures", expMappingPolicy, ExposureMapping),
            ("calibrations", calMappingPolicy, CalibrationMapping),
            ("datasets", dsMappingPolicy, DatasetMapping),
        )

        for name, defPolicy, cls in mappingList:
            datasets = policy[name]

            defaultsPath = os.path.join(getPackageDir("obs_base"), "policy", name + ".yaml")
            if os.path.exists(defaultsPath):

            setattr(self, name, mappings)
            for datasetType in datasets.names(True):
                subPolicy = datasets[datasetType]
                subPolicy.merge(defPolicy)
                if not hasattr(self, "map_" + datasetType) and 'composite' in subPolicy:
                    def compositeClosure(dataId, write=False, mapper=None, mapping=None,
                                         subPolicy=subPolicy):
                        components = subPolicy.get('composite')
                        assembler = subPolicy['assembler'] if 'assembler' in subPolicy else None
                        disassembler = subPolicy['disassembler'] if 'disassembler' in subPolicy else None
                        python = subPolicy['python']
                            disassembler=disassembler,
                        for name, component in components.items():
                            butlerComposite.add(id=name,
                                                datasetType=component.get('datasetType'),
                                                setter=component.get('setter', None),
                                                getter=component.get('getter', None),
                                                subset=component.get('subset', False),
                                                inputOnly=component.get('inputOnly', False))
                        return butlerComposite
                    setattr(self, "map_" + datasetType, compositeClosure)
                if name == "calibrations":
                        provided=provided, dataRoot=rootStorage)
                else:
                    mapping = cls(datasetType, subPolicy, self.registry, rootStorage, provided=provided)
                if datasetType in self.mappings:
                    raise ValueError(f"Duplicate mapping policy for dataset type {datasetType}")
                self.keyDict.update(mapping.keys())
                mappings[datasetType] = mapping
                self.mappings[datasetType] = mapping
                if not hasattr(self, "map_" + datasetType):
                    def mapClosure(dataId, write=False, mapper=weakref.proxy(self), mapping=mapping):
                        return mapping.map(mapper, dataId, write)
                    setattr(self, "map_" + datasetType, mapClosure)
                if not hasattr(self, "query_" + datasetType):
                    def queryClosure(format, dataId, mapping=mapping):
                        return mapping.lookup(format, dataId)
                    setattr(self, "query_" + datasetType, queryClosure)
                if hasattr(mapping, "standardize") and not hasattr(self, "std_" + datasetType):
                    def stdClosure(item, dataId, mapper=weakref.proxy(self), mapping=mapping):
                        return mapping.standardize(mapper, item, dataId)
                    setattr(self, "std_" + datasetType, stdClosure)
                def setMethods(suffix, mapImpl=None, bypassImpl=None, queryImpl=None):
                    """Set convenience methods on CameraMapper"""
                    mapName = "map_" + datasetType + "_" + suffix
                    bypassName = "bypass_" + datasetType + "_" + suffix
                    queryName = "query_" + datasetType + "_" + suffix
                    if not hasattr(self, mapName):
                        setattr(self, mapName, mapImpl or getattr(self, "map_" + datasetType))
                    if not hasattr(self, bypassName):
                        if bypassImpl is None and hasattr(self, "bypass_" + datasetType):
                            bypassImpl = getattr(self, "bypass_" + datasetType)
                        if bypassImpl is not None:
                            setattr(self, bypassName, bypassImpl)
                    if not hasattr(self, queryName):
                        setattr(self, queryName, queryImpl or getattr(self, "query_" + datasetType))
                setMethods("filename", bypassImpl=lambda datasetType, pythonType, location, dataId:
                           [os.path.join(location.getStorage().root, p) for p in location.getLocations()])

                if subPolicy["storage"] == "FitsStorage":
                    def getMetadata(datasetType, pythonType, location, dataId):

                    setMethods("md", bypassImpl=getMetadata)

                    addName = "add_" + datasetType
                    if not hasattr(self, addName):
                    if name == "exposures":
                        def getSkyWcs(datasetType, pythonType, location, dataId):
                            return fitsReader.readWcs()

                        setMethods("wcs", bypassImpl=getSkyWcs)

                        def getRawHeaderWcs(datasetType, pythonType, location, dataId):
                            """Create a SkyWcs from the un-modified raw FITS WCS header keys."""
                            if datasetType[:3] != "raw":

                        setMethods("header_wcs", bypassImpl=getRawHeaderWcs)

                        def getPhotoCalib(datasetType, pythonType, location, dataId):
                            return fitsReader.readPhotoCalib()

                        setMethods("photoCalib", bypassImpl=getPhotoCalib)

                        def getVisitInfo(datasetType, pythonType, location, dataId):
                            return fitsReader.readVisitInfo()

                        setMethods("visitInfo", bypassImpl=getVisitInfo)

                        def getFilter(datasetType, pythonType, location, dataId):
                            return fitsReader.readFilter()

                        setMethods("filter", bypassImpl=getFilter)
                        setMethods("detector",
                                   mapImpl=lambda dataId, write=False:
                                       pythonType="lsst.afw.cameraGeom.CameraConfig",
                                       storageName="Internal",
                                       locationList="ignored",
                                   bypassImpl=lambda datasetType, pythonType, location, dataId:

                        def getBBox(datasetType, pythonType, location, dataId):
                            md = readMetadata(location.getLocationsWithRoot()[0], hdu=1)

                        setMethods("bbox", bypassImpl=getBBox)

                    elif name == "images":
                        def getBBox(datasetType, pythonType, location, dataId):

                        setMethods("bbox", bypassImpl=getBBox)
                if subPolicy["storage"] == "FitsCatalogStorage":
                    def getMetadata(datasetType, pythonType, location, dataId):
                        md = readMetadata(os.path.join(location.getStorage().root,
                                                       location.getLocations()[0]), hdu=1)

                    setMethods("md", bypassImpl=getMetadata)
                if subPolicy["storage"] == "FitsStorage":
                    def mapSubClosure(dataId, write=False, mapper=weakref.proxy(self), mapping=mapping):
                        subId = dataId.copy()
                        loc = mapping.map(mapper, subId, write)
                        bbox = dataId['bbox']
                        llcX = bbox.getMinX()
                        llcY = bbox.getMinY()
                        width = bbox.getWidth()
                        height = bbox.getHeight()
                        loc.additionalData.set('llcX', llcX)
                        loc.additionalData.set('llcY', llcY)
                        loc.additionalData.set('width', width)
                        loc.additionalData.set('height', height)
                        if 'imageOrigin' in dataId:
                            loc.additionalData.set('imageOrigin',
                                                   dataId['imageOrigin'])

                    def querySubClosure(key, format, dataId, mapping=mapping):
                        subId = dataId.copy()
                        return mapping.lookup(format, subId)
                    setMethods("sub", mapImpl=mapSubClosure, queryImpl=querySubClosure)
                if subPolicy["storage"] == "FitsCatalogStorage":
                    def getLen(datasetType, pythonType, location, dataId):
                        md = readMetadata(os.path.join(location.getStorage().root,
                                                       location.getLocations()[0]), hdu=1)

                    setMethods("len", bypassImpl=getLen)

                    if not datasetType.endswith("_schema") and datasetType + "_schema" not in datasets:
                        setMethods("schema", bypassImpl=lambda datasetType, pythonType, location, dataId:
                                   afwTable.Schema.readFits(os.path.join(location.getStorage().root,
                                                                         location.getLocations()[0])))
    def _computeCcdExposureId(self, dataId):
        """Compute the 64-bit (long) identifier for a CCD exposure.

        Subclasses must override

        Parameters
        ----------
        dataId : `dict`
            Data identifier with visit, ccd.
        """
        raise NotImplementedError()
    def _computeCoaddExposureId(self, dataId, singleFilter):
        """Compute the 64-bit (long) identifier for a coadd.

        Subclasses must override

        Parameters
        ----------
        dataId : `dict`
            Data identifier with tract and patch.
        singleFilter : `bool`
            True means the desired ID is for a single-filter coadd, in which
            case ``dataId`` must contain filter.
        """
        raise NotImplementedError()
    def _search(self, path):
        """Search for path in the associated repository's storage.

        Parameters
        ----------
        path : string
            Path that describes an object in the repository associated with
            this mapper.
            Path may contain an HDU indicator, e.g. 'foo.fits[1]'. The
            indicator will be stripped when searching and so will match
            filenames without the HDU indicator, e.g. 'foo.fits'. The path
            returned WILL contain the indicator though, e.g. ['foo.fits[1]'].

        Returns
        -------
        string
            The path for this object in the repository. Will return None if the
            object can't be found. If the input argument path contained an HDU
            indicator, the returned path will also contain the HDU indicator.
        """

    def backup(self, datasetType, dataId):
        """Rename any existing object with the given type and dataId.

        The CameraMapper implementation saves objects in a sequence of e.g.:

        All of the backups will be placed in the output repo, however, and will
        not be removed if they are found elsewhere in the _parent chain. This
        means that the same file will be stored twice if the previous version
        was found in an input repo.
        """

        def firstElement(list):
            """Get the first element in the list, or None if that can't be
            done.
            """
            return list[0] if list is not None and len(list) else None

        newLocation = self.map(datasetType, dataId, write=True)
        newPath = newLocation.getLocations()[0]
        path = dafPersist.PosixStorage.search(self.root, newPath, searchParents=True)
        path = firstElement(path)
        while path is not None:
            oldPaths.append((n, path))
            path = dafPersist.PosixStorage.search(self.root, "%s~%d" % (newPath, n), searchParents=True)
            path = firstElement(path)
        for n, oldPath in reversed(oldPaths):
            self.rootStorage.copyFile(oldPath, "%s~%d" % (newPath, n))
636 """Return supported keys. 641 List of keys usable in a dataset identifier 646 """Return a dict of supported keys and their value types for a given 647 dataset type at a given level of the key hierarchy. 652 Dataset type or None for all dataset types. 653 level : `str` or None 654 Level or None for all levels or '' for the default level for the 660 Keys are strings usable in a dataset identifier, values are their 668 if datasetType
is None:
669 keyDict = copy.copy(self.
keyDict)
672 if level
is not None and level
in self.
levels:
673 keyDict = copy.copy(keyDict)
674 for l
in self.
levels[level]:
689 """Return the name of the camera that this CameraMapper is for.""" 691 className = className[className.find(
'.'):-1]
692 m = re.search(
r'(\w+)Mapper', className)
694 m = re.search(
r"class '[\w.]*?(\w+)'", className)
696 return name[:1].lower() + name[1:]
if name
else '' 700 """Return the name of the package containing this CameraMapper.""" 702 raise ValueError(
'class variable packageName must not be None')
707 """Return the base directory of this package""" 711 """Map a camera dataset.""" 713 raise RuntimeError(
"No camera dataset available.")
716 pythonType=
"lsst.afw.cameraGeom.CameraConfig",
718 storageName=
"ConfigStorage",
726 """Return the (preloaded) camera object. 729 raise RuntimeError(
"No camera dataset available.")
734 pythonType=
"lsst.obs.base.ExposureIdInfo",
736 storageName=
"Internal",
737 locationList=
"ignored",
744 """Hook to retrieve an lsst.obs.base.ExposureIdInfo for an exposure""" 745 expId = self.bypass_ccdExposureId(datasetType, pythonType, location, dataId)
746 expBits = self.bypass_ccdExposureId_bits(datasetType, pythonType, location, dataId)
750 """Disable standardization for bfKernel 752 bfKernel is a calibration product that is numpy array, 753 unlike other calibration products that are all images; 754 all calibration images are sent through _standardizeExposure 755 due to CalibrationMapping, but we don't want that to happen to bfKernel 760 """Standardize a raw dataset by converting it to an Exposure instead 763 trimmed=
False, setVisitInfo=
True)
766 """Map a sky policy.""" 768 "Internal",
None,
None, self,
772 """Standardize a sky policy by returning the one we use.""" 773 return self.skypolicy
    def _setupRegistry(self, name, description, path, policy, policyKey, storage, searchParents=True,
                       posixIfNoSql=True):
        """Set up a registry (usually SQLite3), trying a number of possible
        paths.

        Parameters
        ----------
            Description of registry (for log messages)
            Policy that contains the registry name, used if path is None.
            Key in policy for registry path.
        storage : Storage subclass
            Repository Storage to look in.
        searchParents : bool, optional
            True if the search for a registry should follow any Butler v1
        posixIfNoSql : bool, optional
            If an sqlite registry is not found, will create a posix registry if

        Returns
        -------
        lsst.daf.persistence.Registry
        """
        if path is None and policyKey in policy:
            if os.path.isabs(path):
                raise RuntimeError("Policy should not indicate an absolute path for registry.")
            if not storage.exists(path):
                newPath = storage.instanceSearch(path)
                newPath = newPath[0] if newPath is not None and len(newPath) else None
                self.log.warn("Unable to locate registry at policy path (also looked in root): %s",
                self.log.warn("Unable to locate registry at policy path: %s", path)

            if path and (path.startswith(root)):
                path = path[len(root + '/'):]
        except AttributeError:
        def search(filename, description):
            """Search for file in storage

            Parameters
            ----------
            filename : `str`
                Filename to search for
            description : `str`
                Description of file, for error message.

            Returns
            -------
            path : `str` or `None`
                Path to file, or None
            """
            result = storage.instanceSearch(filename)
            self.log.debug("Unable to locate %s: %s", description, filename)

        path = search("%s.pgsql" % name, "%s in root" % description)
        path = search("%s.sqlite3" % name, "%s in root" % description)
        path = search(os.path.join(".", "%s.sqlite3" % name), "%s in current dir" % description)

        if not storage.exists(path):
            newPath = storage.instanceSearch(path)
            newPath = newPath[0] if newPath is not None and len(newPath) else None
            if newPath is not None:
            localFileObj = storage.getLocalFile(path)
            self.log.info("Loading %s registry from %s", description, localFileObj.name)
            registry = dafPersist.Registry.create(localFileObj.name)
        elif not registry and posixIfNoSql:
            self.log.info("Loading Posix %s registry from %s", description, storage.root)
    def _transformId(self, dataId):
        """Generate a standard ID dict from a camera-specific ID dict.

        Canonical keys include:
        - amp: amplifier name
        - ccd: CCD name (in LSST this is a combination of raft and sensor)
        The default implementation returns a copy of its input.

        Parameters
        ----------
        dataId : `dict`
            Dataset identifier; this must not be modified

        Returns
        -------
        `dict`
            Transformed dataset identifier.
        """

    def _mapActualToPath(self, template, actualId):
        """Convert a template path to an actual path, using the actual data
        identifier.  This implementation is usually sufficient but can be
        overridden by the subclass.
        """
            return template % transformedId
        except Exception as e:
            raise RuntimeError("Failed to format %r with data %r: %s" % (template, transformedId, e))

    @staticmethod
    def getShortCcdName(ccdName):
        """Convert a CCD name to a form useful as a filename

        The default implementation converts spaces to underscores.
        """
        return ccdName.replace(" ", "_")
    def _extractDetectorName(self, dataId):
        """Extract the detector (CCD) name from the dataset identifier.

        The name in question is the detector name used by lsst.afw.cameraGeom.
        """
        raise NotImplementedError("No _extractDetectorName() function specified")
    @deprecated("This method is no longer used for ISR (will be removed after v11)", category=FutureWarning)
    def _extractAmpId(self, dataId):
        """Extract the amplifier identifier from a dataset identifier.

        .. note:: Deprecated in 11_0

        amplifier identifier has two parts: the detector name for the CCD
        containing the amplifier and index of the amplifier in the detector.
        """
        return (trDataId["ccd"], int(trDataId['amp']))
    def _setAmpDetector(self, item, dataId, trimmed=True):
        """Set the detector object in an Exposure for an amplifier.

        Defects are also added to the Exposure based on the detector object.

        Parameters
        ----------
        item : `lsst.afw.image.Exposure`
            Exposure to set the detector in.
        dataId : `dict`
            Dataset identifier.
        trimmed : `bool`
            Should detector be marked as trimmed? (ignored)
        """

    def _setCcdDetector(self, item, dataId, trimmed=True):
        """Set the detector object in an Exposure for a CCD.

        Parameters
        ----------
        item : `lsst.afw.image.Exposure`
            Exposure to set the detector in.
        dataId : `dict`
            Dataset identifier.
        trimmed : `bool`
            Should detector be marked as trimmed? (ignored)
        """
        if item.getDetector() is not None:

        detector = self.camera[detectorName]
        item.setDetector(detector)
1019 """Set the filter object in an Exposure. If the Exposure had a FILTER 1020 keyword, this was already processed during load. But if it didn't, 1021 use the filter from the registry. 1025 mapping : `lsst.obs.base.Mapping` 1026 Where to get the filter from. 1027 item : `lsst.afw.image.Exposure` 1028 Exposure to set the filter in. 1033 if not (isinstance(item, afwImage.ExposureU)
or isinstance(item, afwImage.ExposureI)
or 1034 isinstance(item, afwImage.ExposureF)
or isinstance(item, afwImage.ExposureD)):
1037 if item.getFilter().getId() != afwImage.Filter.UNKNOWN:
1040 actualId = mapping.need([
'filter'], dataId)
1041 filterName = actualId[
'filter']
1043 filterName = self.
filters[filterName]
1047 self.
log.
warn(
"Filter %s not defined. Set to UNKNOWN." % (filterName))
    def _standardizeExposure(self, mapping, item, dataId, filter=True,
                             trimmed=True, setVisitInfo=True):
        """Default standardization function for images.

        This sets the Detector from the camera geometry
        and optionally sets the Filter.  In both cases this saves
        having to persist some data in each exposure (or image).

        Parameters
        ----------
        mapping : `lsst.obs.base.Mapping`
            Where to get the values from.
        item : image-like object
            Can be any of lsst.afw.image.Exposure,
            lsst.afw.image.DecoratedImage, lsst.afw.image.Image
            or lsst.afw.image.MaskedImage
        dataId : `dict`
            Dataset identifier.
        filter : `bool`
            Set filter?  Ignored if item is already an exposure
        trimmed : `bool`
            Should detector be marked as trimmed?
        setVisitInfo : `bool`
            Should Exposure have its VisitInfo filled out from the metadata?

        Returns
        -------
        `lsst.afw.image.Exposure`
            The standardized Exposure.
        """
                                          setVisitInfo=setVisitInfo)
        except Exception as e:
            self.log.error("Could not turn item=%r into an exposure: %s" % (repr(item), e))

        if mapping.level.lower() == "amp":
        elif mapping.level.lower() == "ccd":

        if mapping.level.lower() != "amp" and exposure.getWcs() is None and \
                (exposure.getInfo().getVisitInfo() is not None or exposure.getMetadata().toDict()):
    def _createSkyWcsFromMetadata(self, exposure):
        """Create a SkyWcs from the FITS header metadata in an Exposure.

        Parameters
        ----------
        exposure : `lsst.afw.image.Exposure`
            The exposure to get metadata from, and attach the SkyWcs to.
        """
        metadata = exposure.getMetadata()
        exposure.setWcs(wcs)
        self.log.debug("wcs set to None; missing information found in metadata to create a valid wcs:"
        exposure.setMetadata(metadata)
    def _createInitialSkyWcs(self, exposure):
        """Create a SkyWcs from the boresight and camera geometry.

        If the boresight or camera geometry do not support this method of
        WCS creation, this falls back on the header metadata-based version
        (typically a purely linear FITS crval/crpix/cdmatrix WCS).

        Parameters
        ----------
        exposure : `lsst.afw.image.Exposure`
            The exposure to get data from, and attach the SkyWcs to.
        """
        if exposure.getInfo().getVisitInfo() is None:
            msg = "No VisitInfo; cannot access boresight information. Defaulting to metadata-based SkyWcs."
        try:
            newSkyWcs = createInitialSkyWcs(exposure.getInfo().getVisitInfo(), exposure.getDetector())
            exposure.setWcs(newSkyWcs)
        except InitialSkyWcsError as e:
            msg = "Cannot create SkyWcs using VisitInfo and Detector, using metadata-based SkyWcs: %s"
            self.log.debug("Exception was: %s", traceback.TracebackException.from_exception(e))
            if e.__context__ is not None:
                self.log.debug("Root-cause Exception was: %s",
                               traceback.TracebackException.from_exception(e.__context__))
    def _makeCamera(self, policy, repositoryDir):
        """Make a camera (instance of lsst.afw.cameraGeom.Camera) describing
        the camera geometry.

        Also set self.cameraDataLocation, if relevant (else it can be left
        None).

        This implementation assumes that policy contains an entry "camera"
        that points to the subdirectory in this package of camera data;
        specifically, that subdirectory must contain:
        - a file named `camera.py` that contains persisted camera config
        - ampInfo table FITS files, as required by
          lsst.afw.cameraGeom.makeCameraFromPath

        Parameters
        ----------
        policy : `lsst.daf.persistence.Policy`
            Policy with per-camera defaults already merged
            (PexPolicy only for backward compatibility).
        repositoryDir : `str`
            Policy repository for the subclassing module (obtained with
            getRepositoryPath() on the per-camera default dictionary).
        """
        if 'camera' not in policy:
            raise RuntimeError("Cannot find 'camera' in policy; cannot construct a camera")
        cameraDataSubdir = policy['camera']
            os.path.join(repositoryDir, cameraDataSubdir, "camera.py"))
        cameraConfig = afwCameraGeom.CameraConfig()
        return afwCameraGeom.makeCameraFromPath(
            cameraConfig=cameraConfig,
            ampInfoPath=ampInfoPath,
1192 """Get the registry used by this mapper. 1197 The registry used by this mapper for this mapper's repository. 1202 """Stuff image compression settings into a daf.base.PropertySet 1204 This goes into the ButlerLocation's "additionalData", which gets 1205 passed into the boost::persistence framework. 1210 Type of dataset for which to get the image compression settings. 1216 additionalData : `lsst.daf.base.PropertySet` 1217 Image compression settings. 1219 mapping = self.
mappings[datasetType]
1220 recipeName = mapping.recipe
1221 storageType = mapping.storage
1225 raise RuntimeError(
"Unrecognized write recipe for datasetType %s (storage type %s): %s" %
1226 (datasetType, storageType, recipeName))
1227 recipe = self.
_writeRecipes[storageType][recipeName].deepCopy()
        seed = hash(tuple(dataId.items())) % 2**31
        for plane in ("image", "mask", "variance"):
            if recipe.exists(plane + ".scaling.seed") and recipe.getScalar(plane + ".scaling.seed") == 0:
                recipe.set(plane + ".scaling.seed", seed)
    def _initWriteRecipes(self):
        """Read the recipes for writing files

        These recipes are currently used for configuring FITS compression,
        but they could have wider uses for configuring different flavors
        of the storage types.  A recipe is referred to by a symbolic name,
        which has associated settings.  These settings are stored as a
        `PropertySet` so they can easily be passed down to the
        boost::persistence framework as the "additionalData" parameter.

        The list of recipes is written in YAML.  A default recipe and
        some other convenient recipes are in obs_base/policy/writeRecipes.yaml
        and these may be overridden or supplemented by the individual obs_*
        packages' own policy/writeRecipes.yaml files.

        Recipes are grouped by the storage type.  Currently, only the
        ``FitsStorage`` storage type uses recipes, which it uses to
        configure FITS image compression.

        Each ``FitsStorage`` recipe for FITS compression should define
        "image", "mask" and "variance" entries, each of which may contain
        "compression" and "scaling" entries.  Defaults will be provided for
        any missing elements under "compression" and "scaling".

        The allowed entries under "compression" are:

        * algorithm (string): compression algorithm to use
        * rows (int): number of rows per tile (0 = entire dimension)
        * columns (int): number of columns per tile (0 = entire dimension)
        * quantizeLevel (float): cfitsio quantization level

        The allowed entries under "scaling" are:

        * algorithm (string): scaling algorithm to use
        * bitpix (int): bits per pixel (0, 8, 16, 32, 64, -32, -64)
        * fuzz (bool): fuzz the values when quantising floating-point values?
        * seed (long): seed for random number generator when fuzzing
        * maskPlanes (list of string): mask planes to ignore when doing
          statistics
        * quantizeLevel: divisor of the standard deviation for STDEV_* scaling
        * quantizePad: number of stdev to allow on the low side (for
          STDEV_POSITIVE/NEGATIVE)
        * bscale: manually specified BSCALE (for MANUAL scaling)
        * bzero: manually specified BZERO (for MANUAL scaling)

        A very simple example YAML recipe:

                    algorithm: GZIP_SHUFFLE
        """
        recipesFile = os.path.join(getPackageDir("obs_base"), "policy", "writeRecipes.yaml")
        supplementsFile = os.path.join(self.getPackageDir(), "policy", "writeRecipes.yaml")
        validationMenu = {'FitsStorage': validateRecipeFitsStorage, }
        if os.path.exists(supplementsFile) and supplementsFile != recipesFile:
            for entry in validationMenu:
                intersection = set(recipes[entry].names()).intersection(set(supplements.names()))
                    raise RuntimeError("Recipes provided in %s section %s may not override those in %s: %s" %
                                       (supplementsFile, entry, recipesFile, intersection))
            recipes.update(supplements)

        for storageType in recipes.names(True):
            if "default" not in recipes[storageType]:
                raise RuntimeError("No 'default' recipe defined for storage type %s in %s" %
                                   (storageType, recipesFile))
            self._writeRecipes[storageType] = validationMenu[storageType](recipes[storageType])
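        # Illustrative sketch (added commentary, not in the original source): a
        # minimal writeRecipes.yaml along the lines described in the docstring
        # above; the exact contents of the shipped obs_base policy file may
        # differ.  Recipes are grouped by storage type, then by recipe name,
        # with per-plane "compression"/"scaling" settings:
        #
        #     FitsStorage:
        #       default:
        #         image:
        #           compression:
        #             algorithm: GZIP_SHUFFLE
        #         mask:
        #           compression:
        #             algorithm: GZIP_SHUFFLE
        #         variance:
        #           compression:
        #             algorithm: GZIP_SHUFFLE
        #             quantizeLevel: 0.0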
1312 """Generate an Exposure from an image-like object 1314 If the image is a DecoratedImage then also set its WCS and metadata 1315 (Image and MaskedImage are missing the necessary metadata 1316 and Exposure already has those set) 1320 image : Image-like object 1321 Can be one of lsst.afw.image.DecoratedImage, Image, MaskedImage or 1326 `lsst.afw.image.Exposure` 1327 Exposure containing input image. 1334 metadata = image.getMetadata()
1335 exposure.setMetadata(metadata)
1338 metadata = exposure.getMetadata()
1343 if setVisitInfo
and exposure.getInfo().getVisitInfo()
is None:
1344 if metadata
is not None:
1347 logger = lsstLog.Log.getLogger(
"CameraMapper")
1348 logger.warn(
"I can only set the VisitInfo if you provide a mapper")
1350 exposureId = mapper._computeCcdExposureId(dataId)
1351 visitInfo = mapper.makeRawVisitInfo(md=metadata, exposureId=exposureId)
1353 exposure.getInfo().setVisitInfo(visitInfo)
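# Illustrative usage sketch (not in the original source): standardizing a raw
# DecoratedImage into an Exposure.  "decoratedImage", "dataId" and "mapper" are
# assumed to be provided by the surrounding retrieval code.
#
#     exposure = exposureFromImage(decoratedImage, dataId, mapper=mapper,
#                                  logger=mapper.log, setVisitInfo=True)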
1359 """Validate recipes for FitsStorage 1361 The recipes are supplemented with default values where appropriate. 1363 TODO: replace this custom validation code with Cerberus (DM-11846) 1367 recipes : `lsst.daf.persistence.Policy` 1368 FitsStorage recipes to validate. 1372 validated : `lsst.daf.base.PropertySet` 1373 Validated FitsStorage recipe. 1378 If validation fails. 1382 compressionSchema = {
1383 "algorithm":
"NONE",
1386 "quantizeLevel": 0.0,
1389 "algorithm":
"NONE",
1391 "maskPlanes": [
"NO_DATA"],
1393 "quantizeLevel": 4.0,
1400 def checkUnrecognized(entry, allowed, description):
1401 """Check to see if the entry contains unrecognised keywords""" 1402 unrecognized =
set(entry.keys()) -
set(allowed)
1405 "Unrecognized entries when parsing image compression recipe %s: %s" %
1406 (description, unrecognized))
1409 for name
in recipes.names(
True):
1410 checkUnrecognized(recipes[name], [
"image",
"mask",
"variance"], name)
1412 validated[name] = rr
1413 for plane
in (
"image",
"mask",
"variance"):
1414 checkUnrecognized(recipes[name][plane], [
"compression",
"scaling"],
1415 name +
"->" + plane)
1417 for settings, schema
in ((
"compression", compressionSchema),
1418 (
"scaling", scalingSchema)):
1419 prefix = plane +
"." + settings
1420 if settings
not in recipes[name][plane]:
1422 rr.set(prefix +
"." + key, schema[key])
1424 entry = recipes[name][plane][settings]
1425 checkUnrecognized(entry, schema.keys(), name +
"->" + plane +
"->" + settings)
1427 value =
type(schema[key])(entry[key])
if key
in entry
else schema[key]
1428 rr.set(prefix +
"." + key, value)