25 from astropy.io
import fits
29 from .
import ImageMapping, ExposureMapping, CalibrationMapping, DatasetMapping
38 from .exposureIdInfo
import ExposureIdInfo
39 from .makeRawVisitInfo
import MakeRawVisitInfo
42 __all__ = [
"CameraMapper",
"exposureFromImage"]
47 """CameraMapper is a base class for mappers that handle images from a 48 camera and products derived from them. This provides an abstraction layer 49 between the data on disk and the code. 51 Public methods: keys, queryMetadata, getDatasetTypes, map, 52 canStandardize, standardize 54 Mappers for specific data sources (e.g., CFHT Megacam, LSST 55 simulations, etc.) should inherit this class. 57 The CameraMapper manages datasets within a "root" directory. Note that 58 writing to a dataset present in the input root will hide the existing 59 dataset but not overwrite it. See #2160 for design discussion. 61 A camera is assumed to consist of one or more rafts, each composed of 62 multiple CCDs. Each CCD is in turn composed of one or more amplifiers 63 (amps). A camera is also assumed to have a camera geometry description 64 (CameraGeom object) as a policy file, a filter description (Filter class 65 static configuration) as another policy file, and an optional defects 66 description directory. 68 Information from the camera geometry and defects are inserted into all 69 Exposure objects returned. 71 The mapper uses one or two registries to retrieve metadata about the 72 images. The first is a registry of all raw exposures. This must contain 73 the time of the observation. One or more tables (or the equivalent) 74 within the registry are used to look up data identifier components that 75 are not specified by the user (e.g. filter) and to return results for 76 metadata queries. The second is an optional registry of all calibration 77 data. This should contain validity start and end entries for each 78 calibration dataset in the same timescale as the observation time. 80 Subclasses will typically set MakeRawVisitInfoClass: 82 MakeRawVisitInfoClass: a class variable that points to a subclass of 83 MakeRawVisitInfo, a functor that creates an 84 lsst.afw.image.VisitInfo from the FITS metadata of a raw image. 86 Subclasses must provide the following methods: 88 _extractDetectorName(self, dataId): returns the detector name for a CCD 89 (e.g., "CFHT 21", "R:1,2 S:3,4") as used in the AFW CameraGeom class given 90 a dataset identifier referring to that CCD or a subcomponent of it. 92 _computeCcdExposureId(self, dataId): see below 94 _computeCoaddExposureId(self, dataId, singleFilter): see below 96 Subclasses may also need to override the following methods: 98 _transformId(self, dataId): transformation of a data identifier 99 from colloquial usage (e.g., "ccdname") to proper/actual usage 100 (e.g., "ccd"), including making suitable for path expansion (e.g. removing 101 commas). The default implementation does nothing. Note that this 102 method should not modify its input parameter. 104 getShortCcdName(self, ccdName): a static method that returns a shortened 105 name suitable for use as a filename. The default version converts spaces 108 _getCcdKeyVal(self, dataId): return a CCD key and value 109 by which to look up defects in the defects registry. 110 The default value returns ("ccd", detector name) 112 _mapActualToPath(self, template, actualId): convert a template path to an 113 actual path, using the actual dataset identifier. 115 The mapper's behaviors are largely specified by the policy file. 116 See the MapperDictionary.paf for descriptions of the available items. 118 The 'exposures', 'calibrations', and 'datasets' subpolicies configure 119 mappings (see Mappings class). 121 Common default mappings for all subclasses can be specified in the 122 "policy/{images,exposures,calibrations,datasets}.yaml" files. This 123 provides a simple way to add a product to all camera mappers. 125 Functions to map (provide a path to the data given a dataset 126 identifier dictionary) and standardize (convert data into some standard 127 format or type) may be provided in the subclass as "map_{dataset type}" 128 and "std_{dataset type}", respectively. 130 If non-Exposure datasets cannot be retrieved using standard 131 daf_persistence methods alone, a "bypass_{dataset type}" function may be 132 provided in the subclass to return the dataset instead of using the 133 "datasets" subpolicy. 135 Implementations of map_camera and bypass_camera that should typically be 136 sufficient are provided in this base class. 142 - Handle defects the same was as all other calibration products, using the 144 - Instead of auto-loading the camera at construction time, load it from 145 the calibration registry 146 - Rewrite defects as AFW tables so we don't need astropy.io.fits to 147 unpersist them; then remove all mention of astropy.io.fits from this 154 MakeRawVisitInfoClass = MakeRawVisitInfo
157 PupilFactoryClass = afwCameraGeom.PupilFactory
159 def __init__(self, policy, repositoryDir,
160 root=None, registry=None, calibRoot=None, calibRegistry=None,
161 provided=None, parentRegistry=None, repositoryCfg=None):
162 """Initialize the CameraMapper. 166 policy : daf_persistence.Policy, 167 Policy with per-camera defaults already merged. 168 repositoryDir : string 169 Policy repository for the subclassing module (obtained with 170 getRepositoryPath() on the per-camera default dictionary). 171 root : string, optional 172 Path to the root directory for data. 173 registry : string, optional 174 Path to registry with data's metadata. 175 calibRoot : string, optional 176 Root directory for calibrations. 177 calibRegistry : string, optional 178 Path to registry with calibrations' metadata. 179 provided : list of string, optional 180 Keys provided by the mapper. 181 parentRegistry : Registry subclass, optional 182 Registry from a parent repository that may be used to look up 184 repositoryCfg : daf_persistence.RepositoryCfg or None, optional 185 The configuration information for the repository this mapper is 189 dafPersist.Mapper.__init__(self)
191 self.
log = lsstLog.Log.getLogger(
"CameraMapper")
196 self.
root = repositoryCfg.root
200 repoPolicy = repositoryCfg.policy
if repositoryCfg
else None 201 if repoPolicy
is not None:
202 policy.update(repoPolicy)
206 if 'levels' in policy:
207 levelsPolicy = policy[
'levels']
208 for key
in levelsPolicy.names(
True):
209 self.
levels[key] =
set(levelsPolicy.asArray(key))
212 if 'defaultSubLevels' in policy:
228 if calibRoot
is not None:
229 calibRoot = dafPersist.Storage.absolutePath(root, calibRoot)
230 calibStorage = dafPersist.Storage.makeFromURI(uri=calibRoot,
233 calibRoot = policy.get(
'calibRoot',
None)
235 calibStorage = dafPersist.Storage.makeFromURI(uri=calibRoot,
237 if calibStorage
is None:
245 posixIfNoSql=(
not parentRegistry))
248 needCalibRegistry = policy.get(
'needCalibRegistry',
None)
249 if needCalibRegistry:
252 "calibRegistryPath", calibStorage,
256 "'needCalibRegistry' is true in Policy, but was unable to locate a repo at " +
257 "calibRoot ivar:%s or policy['calibRoot']:%s" %
258 (calibRoot, policy.get(
'calibRoot',
None)))
275 if 'defects' in policy:
276 self.
defectPath = os.path.join(repositoryDir, policy[
'defects'])
277 defectRegistryLocation = os.path.join(self.
defectPath,
"defectRegistry.sqlite3")
278 self.
defectRegistry = dafPersist.Registry.create(defectRegistryLocation)
286 raise ValueError(
'class variable packageName must not be None')
290 def _initMappings(self, policy, rootStorage=None, calibStorage=None, provided=None):
291 """Initialize mappings 293 For each of the dataset types that we want to be able to read, there 294 are methods that can be created to support them: 295 * map_<dataset> : determine the path for dataset 296 * std_<dataset> : standardize the retrieved dataset 297 * bypass_<dataset> : retrieve the dataset (bypassing the usual 299 * query_<dataset> : query the registry 301 Besides the dataset types explicitly listed in the policy, we create 302 additional, derived datasets for additional conveniences, 303 e.g., reading the header of an image, retrieving only the size of a 308 policy : `lsst.daf.persistence.Policy` 309 Policy with per-camera defaults already merged 310 rootStorage : `Storage subclass instance` 311 Interface to persisted repository data. 312 calibRoot : `Storage subclass instance` 313 Interface to persisted calib repository data 314 provided : `list` of `str` 315 Keys provided by the mapper 319 "obs_base",
"ImageMappingDefaults.yaml",
"policy"))
321 "obs_base",
"ExposureMappingDefaults.yaml",
"policy"))
323 "obs_base",
"CalibrationMappingDefaults.yaml",
"policy"))
328 (
"images", imgMappingPolicy, ImageMapping),
329 (
"exposures", expMappingPolicy, ExposureMapping),
330 (
"calibrations", calMappingPolicy, CalibrationMapping),
331 (
"datasets", dsMappingPolicy, DatasetMapping)
334 for name, defPolicy, cls
in mappingList:
336 datasets = policy[name]
339 defaultsPath = os.path.join(
getPackageDir(
"obs_base"),
"policy", name +
".yaml")
340 if os.path.exists(defaultsPath):
344 setattr(self, name, mappings)
345 for datasetType
in datasets.names(
True):
346 subPolicy = datasets[datasetType]
347 subPolicy.merge(defPolicy)
349 if not hasattr(self,
"map_" + datasetType)
and 'composite' in subPolicy:
350 def compositeClosure(dataId, write=False, mapper=None, mapping=None,
351 subPolicy=subPolicy):
352 components = subPolicy.get(
'composite')
353 assembler = subPolicy[
'assembler']
if 'assembler' in subPolicy
else None 354 disassembler = subPolicy[
'disassembler']
if 'disassembler' in subPolicy
else None 355 python = subPolicy[
'python']
357 disassembler=disassembler,
361 for name, component
in components.items():
362 butlerComposite.add(id=name,
363 datasetType=component.get(
'datasetType'),
364 setter=component.get(
'setter',
None),
365 getter=component.get(
'getter',
None),
366 subset=component.get(
'subset',
False),
367 inputOnly=component.get(
'inputOnly',
False))
368 return butlerComposite
369 setattr(self,
"map_" + datasetType, compositeClosure)
373 if name ==
"calibrations":
375 provided=provided, dataRoot=rootStorage)
377 mapping =
cls(datasetType, subPolicy, self.
registry, rootStorage, provided=provided)
380 raise ValueError(f
"Duplicate mapping policy for dataset type {datasetType}")
381 self.
keyDict.update(mapping.keys())
382 mappings[datasetType] = mapping
383 self.
mappings[datasetType] = mapping
384 if not hasattr(self,
"map_" + datasetType):
385 def mapClosure(dataId, write=False, mapper=weakref.proxy(self), mapping=mapping):
386 return mapping.map(mapper, dataId, write)
387 setattr(self,
"map_" + datasetType, mapClosure)
388 if not hasattr(self,
"query_" + datasetType):
389 def queryClosure(format, dataId, mapping=mapping):
390 return mapping.lookup(format, dataId)
391 setattr(self,
"query_" + datasetType, queryClosure)
392 if hasattr(mapping,
"standardize")
and not hasattr(self,
"std_" + datasetType):
393 def stdClosure(item, dataId, mapper=weakref.proxy(self), mapping=mapping):
394 return mapping.standardize(mapper, item, dataId)
395 setattr(self,
"std_" + datasetType, stdClosure)
397 def setMethods(suffix, mapImpl=None, bypassImpl=None, queryImpl=None):
398 """Set convenience methods on CameraMapper""" 399 mapName =
"map_" + datasetType +
"_" + suffix
400 bypassName =
"bypass_" + datasetType +
"_" + suffix
401 queryName =
"query_" + datasetType +
"_" + suffix
402 if not hasattr(self, mapName):
403 setattr(self, mapName, mapImpl
or getattr(self,
"map_" + datasetType))
404 if not hasattr(self, bypassName):
405 if bypassImpl
is None and hasattr(self,
"bypass_" + datasetType):
406 bypassImpl = getattr(self,
"bypass_" + datasetType)
407 if bypassImpl
is not None:
408 setattr(self, bypassName, bypassImpl)
409 if not hasattr(self, queryName):
410 setattr(self, queryName, queryImpl
or getattr(self,
"query_" + datasetType))
413 setMethods(
"filename", bypassImpl=
lambda datasetType, pythonType, location, dataId:
414 [os.path.join(location.getStorage().root, p)
for p
in location.getLocations()])
416 if subPolicy[
"storage"] ==
"FitsStorage":
417 setMethods(
"md", bypassImpl=
lambda datasetType, pythonType, location, dataId:
421 addName =
"add_" + datasetType
422 if not hasattr(self, addName):
425 if name ==
"exposures":
426 def getSkyWcs(datasetType, pythonType, location, dataId):
428 return fitsReader.readWcs()
430 setMethods(
"wcs", bypassImpl=getSkyWcs)
432 def getPhotoCalib(datasetType, pythonType, location, dataId):
434 return fitsReader.readPhotoCalib()
436 setMethods(
"photoCalib", bypassImpl=getPhotoCalib)
438 def getVisitInfo(datasetType, pythonType, location, dataId):
440 return fitsReader.readVisitInfo()
442 setMethods(
"visitInfo", bypassImpl=getVisitInfo)
444 def getFilter(datasetType, pythonType, location, dataId):
446 return fitsReader.readFilter()
448 setMethods(
"filter", bypassImpl=getFilter)
450 setMethods(
"detector",
451 mapImpl=
lambda dataId, write=
False:
453 pythonType=
"lsst.afw.cameraGeom.CameraConfig",
455 storageName=
"Internal",
456 locationList=
"ignored",
461 bypassImpl=
lambda datasetType, pythonType, location, dataId:
464 setMethods(
"bbox", bypassImpl=
lambda dsType, pyType, location, dataId:
466 readMetadata(location.getLocationsWithRoot()[0], hdu=1)))
468 elif name ==
"images":
469 setMethods(
"bbox", bypassImpl=
lambda dsType, pyType, location, dataId:
473 if subPolicy[
"storage"] ==
"FitsCatalogStorage":
474 setMethods(
"md", bypassImpl=
lambda datasetType, pythonType, location, dataId:
476 location.getLocations()[0]), hdu=1))
479 if subPolicy[
"storage"] ==
"FitsStorage":
480 def mapSubClosure(dataId, write=False, mapper=weakref.proxy(self), mapping=mapping):
481 subId = dataId.copy()
483 loc = mapping.map(mapper, subId, write)
484 bbox = dataId[
'bbox']
485 llcX = bbox.getMinX()
486 llcY = bbox.getMinY()
487 width = bbox.getWidth()
488 height = bbox.getHeight()
489 loc.additionalData.set(
'llcX', llcX)
490 loc.additionalData.set(
'llcY', llcY)
491 loc.additionalData.set(
'width', width)
492 loc.additionalData.set(
'height', height)
493 if 'imageOrigin' in dataId:
494 loc.additionalData.set(
'imageOrigin',
495 dataId[
'imageOrigin'])
498 def querySubClosure(key, format, dataId, mapping=mapping):
499 subId = dataId.copy()
501 return mapping.lookup(format, subId)
502 setMethods(
"sub", mapImpl=mapSubClosure, queryImpl=querySubClosure)
504 if subPolicy[
"storage"] ==
"FitsCatalogStorage":
506 setMethods(
"len", bypassImpl=
lambda datasetType, pythonType, location, dataId:
508 location.getLocations()[0]),
509 hdu=1).getScalar(
"NAXIS2"))
512 if not datasetType.endswith(
"_schema")
and datasetType +
"_schema" not in datasets:
513 setMethods(
"schema", bypassImpl=
lambda datasetType, pythonType, location, dataId:
514 afwTable.Schema.readFits(os.path.join(location.getStorage().root,
515 location.getLocations()[0])))
517 def _computeCcdExposureId(self, dataId):
518 """Compute the 64-bit (long) identifier for a CCD exposure. 520 Subclasses must override 525 Data identifier with visit, ccd. 527 raise NotImplementedError()
529 def _computeCoaddExposureId(self, dataId, singleFilter):
530 """Compute the 64-bit (long) identifier for a coadd. 532 Subclasses must override 537 Data identifier with tract and patch. 538 singleFilter : `bool` 539 True means the desired ID is for a single-filter coadd, in which 540 case dataIdmust contain filter. 542 raise NotImplementedError()
544 def _search(self, path):
545 """Search for path in the associated repository's storage. 550 Path that describes an object in the repository associated with 552 Path may contain an HDU indicator, e.g. 'foo.fits[1]'. The 553 indicator will be stripped when searching and so will match 554 filenames without the HDU indicator, e.g. 'foo.fits'. The path 555 returned WILL contain the indicator though, e.g. ['foo.fits[1]']. 560 The path for this object in the repository. Will return None if the 561 object can't be found. If the input argument path contained an HDU 562 indicator, the returned path will also contain the HDU indicator. 567 """Rename any existing object with the given type and dataId. 569 The CameraMapper implementation saves objects in a sequence of e.g.: 575 All of the backups will be placed in the output repo, however, and will 576 not be removed if they are found elsewhere in the _parent chain. This 577 means that the same file will be stored twice if the previous version 578 was found in an input repo. 587 def firstElement(list):
588 """Get the first element in the list, or None if that can't be 591 return list[0]
if list
is not None and len(list)
else None 594 newLocation = self.
map(datasetType, dataId, write=
True)
595 newPath = newLocation.getLocations()[0]
596 path = dafPersist.PosixStorage.search(self.
root, newPath, searchParents=
True)
597 path = firstElement(path)
599 while path
is not None:
601 oldPaths.append((n, path))
602 path = dafPersist.PosixStorage.search(self.
root,
"%s~%d" % (newPath, n), searchParents=
True)
603 path = firstElement(path)
604 for n, oldPath
in reversed(oldPaths):
605 self.
rootStorage.copyFile(oldPath,
"%s~%d" % (newPath, n))
608 """Return supported keys. 613 List of keys usable in a dataset identifier 618 """Return a dict of supported keys and their value types for a given 619 dataset type at a given level of the key hierarchy. 624 Dataset type or None for all dataset types. 625 level : `str` or None 626 Level or None for all levels or '' for the default level for the 632 Keys are strings usable in a dataset identifier, values are their 640 if datasetType
is None:
641 keyDict = copy.copy(self.
keyDict)
644 if level
is not None and level
in self.
levels:
645 keyDict = copy.copy(keyDict)
646 for l
in self.
levels[level]:
661 """Return the name of the camera that this CameraMapper is for.""" 663 className = className[className.find(
'.'):-1]
664 m = re.search(
r'(\w+)Mapper', className)
666 m = re.search(
r"class '[\w.]*?(\w+)'", className)
668 return name[:1].lower() + name[1:]
if name
else '' 672 """Return the name of the package containing this CameraMapper.""" 674 raise ValueError(
'class variable packageName must not be None')
679 """Return the base directory of this package""" 683 """Map a camera dataset.""" 685 raise RuntimeError(
"No camera dataset available.")
688 pythonType=
"lsst.afw.cameraGeom.CameraConfig",
690 storageName=
"ConfigStorage",
698 """Return the (preloaded) camera object. 701 raise RuntimeError(
"No camera dataset available.")
705 """Map defects dataset. 709 `lsst.daf.butler.ButlerLocation` 710 Minimal ButlerLocation containing just the locationList field 711 (just enough information that bypass_defects can use it). 714 if defectFitsPath
is None:
715 raise RuntimeError(
"No defects available for dataId=%s" % (dataId,))
722 """Return a defect based on the butler location returned by map_defects 726 butlerLocation : `lsst.daf.persistence.ButlerLocation` 727 locationList = path to defects FITS file 729 Butler data ID; "ccd" must be set. 731 Note: the name "bypass_XXX" means the butler makes no attempt to 732 convert the ButlerLocation into an object, which is what we want for 733 now, since that conversion is a bit tricky. 736 defectsFitsPath = butlerLocation.locationList[0]
738 with fits.open(defectsFitsPath)
as hduList:
739 for hdu
in hduList[1:]:
740 if hdu.header[
"name"] != detectorName:
744 for data
in hdu.data:
752 raise RuntimeError(
"No defects for ccd %s in %s" % (detectorName, defectsFitsPath))
756 pythonType=
"lsst.obs.base.ExposureIdInfo",
758 storageName=
"Internal",
759 locationList=
"ignored",
766 """Hook to retrieve an lsst.obs.base.ExposureIdInfo for an exposure""" 767 expId = self.bypass_ccdExposureId(datasetType, pythonType, location, dataId)
768 expBits = self.bypass_ccdExposureId_bits(datasetType, pythonType, location, dataId)
772 """Disable standardization for bfKernel 774 bfKernel is a calibration product that is numpy array, 775 unlike other calibration products that are all images; 776 all calibration images are sent through _standardizeExposure 777 due to CalibrationMapping, but we don't want that to happen to bfKernel 782 """Standardize a raw dataset by converting it to an Exposure instead 785 trimmed=
False, setVisitInfo=
True)
788 """Map a sky policy.""" 790 "Internal",
None,
None, self,
794 """Standardize a sky policy by returning the one we use.""" 795 return self.skypolicy
803 def _getCcdKeyVal(self, dataId):
804 """Return CCD key and value used to look a defect in the defect 807 The default implementation simply returns ("ccd", full detector name) 811 def _setupRegistry(self, name, description, path, policy, policyKey, storage, searchParents=True,
813 """Set up a registry (usually SQLite3), trying a number of possible 821 Description of registry (for log messages) 825 Policy that contains the registry name, used if path is None. 827 Key in policy for registry path. 828 storage : Storage subclass 829 Repository Storage to look in. 830 searchParents : bool, optional 831 True if the search for a registry should follow any Butler v1 833 posixIfNoSql : bool, optional 834 If an sqlite registry is not found, will create a posix registry if 839 lsst.daf.persistence.Registry 842 if path
is None and policyKey
in policy:
844 if os.path.isabs(path):
845 raise RuntimeError(
"Policy should not indicate an absolute path for registry.")
846 if not storage.exists(path):
847 newPath = storage.instanceSearch(path)
849 newPath = newPath[0]
if newPath
is not None and len(newPath)
else None 851 self.
log.
warn(
"Unable to locate registry at policy path (also looked in root): %s",
855 self.
log.
warn(
"Unable to locate registry at policy path: %s", path)
863 if path
and (path.startswith(root)):
864 path = path[len(root +
'/'):]
865 except AttributeError:
871 def search(filename, description):
872 """Search for file in storage 877 Filename to search for 879 Description of file, for error message. 883 path : `str` or `None` 884 Path to file, or None 886 result = storage.instanceSearch(filename)
889 self.
log.
debug(
"Unable to locate %s: %s", description, filename)
894 path = search(
"%s.pgsql" % name,
"%s in root" % description)
896 path = search(
"%s.sqlite3" % name,
"%s in root" % description)
898 path = search(os.path.join(
".",
"%s.sqlite3" % name),
"%s in current dir" % description)
901 if not storage.exists(path):
902 newPath = storage.instanceSearch(path)
903 newPath = newPath[0]
if newPath
is not None and len(newPath)
else None 904 if newPath
is not None:
906 localFileObj = storage.getLocalFile(path)
907 self.
log.
info(
"Loading %s registry from %s", description, localFileObj.name)
908 registry = dafPersist.Registry.create(localFileObj.name)
910 elif not registry
and posixIfNoSql:
912 self.
log.
info(
"Loading Posix %s registry from %s", description, storage.root)
919 def _transformId(self, dataId):
920 """Generate a standard ID dict from a camera-specific ID dict. 922 Canonical keys include: 923 - amp: amplifier name 924 - ccd: CCD name (in LSST this is a combination of raft and sensor) 925 The default implementation returns a copy of its input. 930 Dataset identifier; this must not be modified 935 Transformed dataset identifier. 940 def _mapActualToPath(self, template, actualId):
941 """Convert a template path to an actual path, using the actual data 942 identifier. This implementation is usually sufficient but can be 943 overridden by the subclass. 960 return template % transformedId
961 except Exception
as e:
962 raise RuntimeError(
"Failed to format %r with data %r: %s" % (template, transformedId, e))
966 """Convert a CCD name to a form useful as a filename 968 The default implementation converts spaces to underscores. 970 return ccdName.replace(
" ",
"_")
972 def _extractDetectorName(self, dataId):
973 """Extract the detector (CCD) name from the dataset identifier. 975 The name in question is the detector name used by lsst.afw.cameraGeom. 987 raise NotImplementedError(
"No _extractDetectorName() function specified")
989 def _extractAmpId(self, dataId):
990 """Extract the amplifier identifer from a dataset identifier. 992 .. note:: Deprecated in 11_0 994 amplifier identifier has two parts: the detector name for the CCD 995 containing the amplifier and index of the amplifier in the detector. 1005 Amplifier identifier 1009 return (trDataId[
"ccd"],
int(trDataId[
'amp']))
1011 def _setAmpDetector(self, item, dataId, trimmed=True):
1012 """Set the detector object in an Exposure for an amplifier. 1014 Defects are also added to the Exposure based on the detector object. 1018 item : `lsst.afw.image.Exposure` 1019 Exposure to set the detector in. 1023 Should detector be marked as trimmed? (ignored) 1026 return self.
_setCcdDetector(item=item, dataId=dataId, trimmed=trimmed)
1028 def _setCcdDetector(self, item, dataId, trimmed=True):
1029 """Set the detector object in an Exposure for a CCD. 1033 item : `lsst.afw.image.Exposure` 1034 Exposure to set the detector in. 1038 Should detector be marked as trimmed? (ignored) 1040 if item.getDetector()
is not None:
1044 detector = self.
camera[detectorName]
1045 item.setDetector(detector)
1047 def _setFilter(self, mapping, item, dataId):
1048 """Set the filter object in an Exposure. If the Exposure had a FILTER 1049 keyword, this was already processed during load. But if it didn't, 1050 use the filter from the registry. 1054 mapping : `lsst.obs.base.Mapping` 1055 Where to get the filter from. 1056 item : `lsst.afw.image.Exposure` 1057 Exposure to set the filter in. 1062 if not (isinstance(item, afwImage.ExposureU)
or isinstance(item, afwImage.ExposureI)
or 1063 isinstance(item, afwImage.ExposureF)
or isinstance(item, afwImage.ExposureD)):
1066 if item.getFilter().getId() != afwImage.Filter.UNKNOWN:
1069 actualId = mapping.need([
'filter'], dataId)
1070 filterName = actualId[
'filter']
1072 filterName = self.
filters[filterName]
1076 def _standardizeExposure(self, mapping, item, dataId, filter=True,
1077 trimmed=True, setVisitInfo=True):
1078 """Default standardization function for images. 1080 This sets the Detector from the camera geometry 1081 and optionally set the Fiter. In both cases this saves 1082 having to persist some data in each exposure (or image). 1086 mapping : `lsst.obs.base.Mapping` 1087 Where to get the values from. 1088 item : image-like object 1089 Can be any of lsst.afw.image.Exposure, 1090 lsst.afw.image.DecoratedImage, lsst.afw.image.Image 1091 or lsst.afw.image.MaskedImage 1096 Set filter? Ignored if item is already an exposure 1098 Should detector be marked as trimmed? 1099 setVisitInfo : `bool` 1100 Should Exposure have its VisitInfo filled out from the metadata? 1104 `lsst.afw.image.Exposure` 1105 The standardized Exposure. 1108 item =
exposureFromImage(item, dataId, mapper=self, logger=self.
log, setVisitInfo=setVisitInfo)
1109 except Exception
as e:
1110 self.
log.
error(
"Could not turn item=%r into an exposure: %s" % (repr(item), e))
1113 if mapping.level.lower() ==
"amp":
1115 elif mapping.level.lower() ==
"ccd":
1123 def _defectLookup(self, dataId, dateKey='taiObs'):
1124 """Find the defects for a given CCD. 1134 Path to the defects file or None if not available. 1139 raise RuntimeError(
"No registry for defect lookup")
1143 dataIdForLookup = {
'visit': dataId[
'visit']}
1145 rows = self.
registry.lookup((dateKey), (
'raw_visit'), dataIdForLookup)
1148 assert len(rows) == 1
1154 (
"DATETIME(?)",
"DATETIME(validStart)",
"DATETIME(validEnd)"),
1156 if not rows
or len(rows) == 0:
1159 return os.path.join(self.
defectPath, rows[0][0])
1161 raise RuntimeError(
"Querying for defects (%s, %s) returns %d files: %s" %
1162 (ccdVal, dayObs, len(rows),
", ".join([_[0]
for _
in rows])))
1164 def _makeCamera(self, policy, repositoryDir):
1165 """Make a camera (instance of lsst.afw.cameraGeom.Camera) describing 1168 Also set self.cameraDataLocation, if relevant (else it can be left 1171 This implementation assumes that policy contains an entry "camera" 1172 that points to the subdirectory in this package of camera data; 1173 specifically, that subdirectory must contain: 1174 - a file named `camera.py` that contains persisted camera config 1175 - ampInfo table FITS files, as required by 1176 lsst.afw.cameraGeom.makeCameraFromPath 1180 policy : `lsst.daf.persistence.Policy` 1181 Policy with per-camera defaults already merged 1182 (PexPolicy only for backward compatibility). 1183 repositoryDir : `str` 1184 Policy repository for the subclassing module (obtained with 1185 getRepositoryPath() on the per-camera default dictionary). 1187 if 'camera' not in policy:
1188 raise RuntimeError(
"Cannot find 'camera' in policy; cannot construct a camera")
1189 cameraDataSubdir = policy[
'camera']
1191 os.path.join(repositoryDir, cameraDataSubdir,
"camera.py"))
1192 cameraConfig = afwCameraGeom.CameraConfig()
1195 return afwCameraGeom.makeCameraFromPath(
1196 cameraConfig=cameraConfig,
1197 ampInfoPath=ampInfoPath,
1203 """Get the registry used by this mapper. 1208 The registry used by this mapper for this mapper's repository. 1213 """Stuff image compression settings into a daf.base.PropertySet 1215 This goes into the ButlerLocation's "additionalData", which gets 1216 passed into the boost::persistence framework. 1221 Type of dataset for which to get the image compression settings. 1227 additionalData : `lsst.daf.base.PropertySet` 1228 Image compression settings. 1230 mapping = self.
mappings[datasetType]
1231 recipeName = mapping.recipe
1232 storageType = mapping.storage
1236 raise RuntimeError(
"Unrecognized write recipe for datasetType %s (storage type %s): %s" %
1237 (datasetType, storageType, recipeName))
1238 recipe = self.
_writeRecipes[storageType][recipeName].deepCopy()
1239 seed = hash(tuple(dataId.items())) % 2**31
1240 for plane
in (
"image",
"mask",
"variance"):
1241 if recipe.exists(plane +
".scaling.seed")
and recipe.getScalar(plane +
".scaling.seed") == 0:
1242 recipe.set(plane +
".scaling.seed", seed)
1245 def _initWriteRecipes(self):
1246 """Read the recipes for writing files 1248 These recipes are currently used for configuring FITS compression, 1249 but they could have wider uses for configuring different flavors 1250 of the storage types. A recipe is referred to by a symbolic name, 1251 which has associated settings. These settings are stored as a 1252 `PropertySet` so they can easily be passed down to the 1253 boost::persistence framework as the "additionalData" parameter. 1255 The list of recipes is written in YAML. A default recipe and 1256 some other convenient recipes are in obs_base/policy/writeRecipes.yaml 1257 and these may be overridden or supplemented by the individual obs_* 1258 packages' own policy/writeRecipes.yaml files. 1260 Recipes are grouped by the storage type. Currently, only the 1261 ``FitsStorage`` storage type uses recipes, which uses it to 1262 configure FITS image compression. 1264 Each ``FitsStorage`` recipe for FITS compression should define 1265 "image", "mask" and "variance" entries, each of which may contain 1266 "compression" and "scaling" entries. Defaults will be provided for 1267 any missing elements under "compression" and "scaling". 1269 The allowed entries under "compression" are: 1271 * algorithm (string): compression algorithm to use 1272 * rows (int): number of rows per tile (0 = entire dimension) 1273 * columns (int): number of columns per tile (0 = entire dimension) 1274 * quantizeLevel (float): cfitsio quantization level 1276 The allowed entries under "scaling" are: 1278 * algorithm (string): scaling algorithm to use 1279 * bitpix (int): bits per pixel (0,8,16,32,64,-32,-64) 1280 * fuzz (bool): fuzz the values when quantising floating-point values? 1281 * seed (long): seed for random number generator when fuzzing 1282 * maskPlanes (list of string): mask planes to ignore when doing 1284 * quantizeLevel: divisor of the standard deviation for STDEV_* scaling 1285 * quantizePad: number of stdev to allow on the low side (for 1286 STDEV_POSITIVE/NEGATIVE) 1287 * bscale: manually specified BSCALE (for MANUAL scaling) 1288 * bzero: manually specified BSCALE (for MANUAL scaling) 1290 A very simple example YAML recipe: 1296 algorithm: GZIP_SHUFFLE 1300 recipesFile = os.path.join(
getPackageDir(
"obs_base"),
"policy",
"writeRecipes.yaml")
1302 supplementsFile = os.path.join(self.
getPackageDir(),
"policy",
"writeRecipes.yaml")
1303 validationMenu = {
'FitsStorage': validateRecipeFitsStorage, }
1304 if os.path.exists(supplementsFile)
and supplementsFile != recipesFile:
1307 for entry
in validationMenu:
1308 intersection =
set(recipes[entry].names()).intersection(
set(supplements.names()))
1310 raise RuntimeError(
"Recipes provided in %s section %s may not override those in %s: %s" %
1311 (supplementsFile, entry, recipesFile, intersection))
1312 recipes.update(supplements)
1315 for storageType
in recipes.names(
True):
1316 if "default" not in recipes[storageType]:
1317 raise RuntimeError(
"No 'default' recipe defined for storage type %s in %s" %
1318 (storageType, recipesFile))
1319 self.
_writeRecipes[storageType] = validationMenu[storageType](recipes[storageType])
1323 """Generate an Exposure from an image-like object 1325 If the image is a DecoratedImage then also set its WCS and metadata 1326 (Image and MaskedImage are missing the necessary metadata 1327 and Exposure already has those set) 1331 image : Image-like object 1332 Can be one of lsst.afw.image.DecoratedImage, Image, MaskedImage or 1337 `lsst.afw.image.Exposure` 1338 Exposure containing input image. 1345 metadata = image.getMetadata()
1348 exposure.setWcs(wcs)
1352 logger = lsstLog.Log.getLogger(
"CameraMapper")
1353 logger.debug(
"wcs set to None; insufficient information found in metadata to create a valid wcs:" 1356 exposure.setMetadata(metadata)
1360 metadata = exposure.getMetadata()
1367 if setVisitInfo
and exposure.getInfo().getVisitInfo()
is None:
1368 if metadata
is not None:
1371 logger = lsstLog.Log.getLogger(
"CameraMapper")
1372 logger.warn(
"I can only set the VisitInfo if you provide a mapper")
1374 exposureId = mapper._computeCcdExposureId(dataId)
1375 visitInfo = mapper.makeRawVisitInfo(md=metadata, exposureId=exposureId)
1377 exposure.getInfo().setVisitInfo(visitInfo)
1383 """Validate recipes for FitsStorage 1385 The recipes are supplemented with default values where appropriate. 1387 TODO: replace this custom validation code with Cerberus (DM-11846) 1391 recipes : `lsst.daf.persistence.Policy` 1392 FitsStorage recipes to validate. 1396 validated : `lsst.daf.base.PropertySet` 1397 Validated FitsStorage recipe. 1402 If validation fails. 1406 compressionSchema = {
1407 "algorithm":
"NONE",
1410 "quantizeLevel": 0.0,
1413 "algorithm":
"NONE",
1415 "maskPlanes": [
"NO_DATA"],
1417 "quantizeLevel": 4.0,
1424 def checkUnrecognized(entry, allowed, description):
1425 """Check to see if the entry contains unrecognised keywords""" 1426 unrecognized =
set(entry.keys()) -
set(allowed)
1429 "Unrecognized entries when parsing image compression recipe %s: %s" %
1430 (description, unrecognized))
1433 for name
in recipes.names(
True):
1434 checkUnrecognized(recipes[name], [
"image",
"mask",
"variance"], name)
1436 validated[name] = rr
1437 for plane
in (
"image",
"mask",
"variance"):
1438 checkUnrecognized(recipes[name][plane], [
"compression",
"scaling"],
1439 name +
"->" + plane)
1441 for settings, schema
in ((
"compression", compressionSchema),
1442 (
"scaling", scalingSchema)):
1443 prefix = plane +
"." + settings
1444 if settings
not in recipes[name][plane]:
1446 rr.set(prefix +
"." + key, schema[key])
1448 entry = recipes[name][plane][settings]
1449 checkUnrecognized(entry, schema.keys(), name +
"->" + plane +
"->" + settings)
1451 value =
type(schema[key])(entry[key])
if key
in entry
else schema[key]
1452 rr.set(prefix +
"." + key, value)
def _makeCamera(self, policy, repositoryDir)
def map_expIdInfo(self, dataId, write=False)
def _setAmpDetector(self, item, dataId, trimmed=True)
Encapsulate information about a bad portion of a detector.
def validateRecipeFitsStorage(recipes)
def _standardizeExposure(self, mapping, item, dataId, filter=True, trimmed=True, setVisitInfo=True)
Class for logical location of a persisted Persistable instance.
def _extractDetectorName(self, dataId)
def _setFilter(self, mapping, item, dataId)
A class to contain the data, WCS, and other information needed to describe an image of the sky...
def _setCcdDetector(self, item, dataId, trimmed=True)
def bypass_defects(self, datasetType, pythonType, butlerLocation, dataId)
daf::base::PropertySet * set
def std_bfKernel(self, item, dataId)
def getKeys(self, datasetType, level)
MaskedImage< ImagePixelT, MaskPixelT, VariancePixelT > * makeMaskedImage(typename std::shared_ptr< Image< ImagePixelT >> image, typename std::shared_ptr< Mask< MaskPixelT >> mask=Mask< MaskPixelT >(), typename std::shared_ptr< Image< VariancePixelT >> variance=Image< VariancePixelT >())
A function to return a MaskedImage of the correct type (cf.
def _defectLookup(self, dataId, dateKey='taiObs')
std::shared_ptr< Exposure< ImagePixelT, MaskPixelT, VariancePixelT > > makeExposure(MaskedImage< ImagePixelT, MaskPixelT, VariancePixelT > &mimage, std::shared_ptr< geom::SkyWcs const > wcs=std::shared_ptr< geom::SkyWcs const >())
A function to return an Exposure of the correct type (cf.
def getImageCompressionSettings(self, datasetType, dataId)
def map_defects(self, dataId, write=False)
def map_camera(self, dataId, write=False)
def map(self, datasetType, dataId, write=False)
A class to manipulate images, masks, and variance as a single object.
def std_raw(self, item, dataId)
def backup(self, datasetType, dataId)
def _setupRegistry(self, name, description, path, policy, policyKey, storage, searchParents=True, posixIfNoSql=True)
def map_skypolicy(self, dataId)
Holds an integer identifier for an LSST filter.
def std_skypolicy(self, item, dataId)
def bypass_camera(self, datasetType, pythonType, butlerLocation, dataId)
def _initMappings(self, policy, rootStorage=None, calibStorage=None, provided=None)
std::shared_ptr< SkyWcs > makeSkyWcs(TransformPoint2ToPoint2 const &pixelsToFieldAngle, lsst::geom::Angle const &orientation, bool flipX, lsst::geom::SpherePoint const &boresight, std::string const &projection="TAN")
Construct a FITS SkyWcs from camera geometry.
def getDefaultSubLevel(self, level)
Class for storing generic metadata.
def _transformId(self, dataId)
A FITS reader class for Exposures and their components.
def getDefaultLevel(self)
def __init__(self, policy, repositoryDir, root=None, registry=None, calibRoot=None, calibRegistry=None, provided=None, parentRegistry=None, repositoryCfg=None)
Backwards-compatibility support for depersisting the old Calib (FluxMag0/FluxMag0Err) objects...
Reports errors from accepting an object of an unexpected or inappropriate type.
def bypass_expIdInfo(self, datasetType, pythonType, location, dataId)
def exposureFromImage(image, dataId=None, mapper=None, logger=None, setVisitInfo=True)
def _initWriteRecipes(self)
def getShortCcdName(ccdName)
An integer coordinate rectangle.
lsst::geom::Box2I bboxFromMetadata(daf::base::PropertySet &metadata)
Determine the image bounding box from its metadata (FITS header)
A container for an Image and its associated metadata.
def _getCcdKeyVal(self, dataId)
Utility functions.