from builtins import str
from past.builtins import long
from . import ImageMapping, ExposureMapping, CalibrationMapping, DatasetMapping
from .exposureIdInfo import ExposureIdInfo
from .makeRawVisitInfo import MakeRawVisitInfo
46 """This module defines the CameraMapper base class."""
51 """CameraMapper is a base class for mappers that handle images from a
52 camera and products derived from them. This provides an abstraction layer
53 between the data on disk and the code.
55 Public methods: keys, queryMetadata, getDatasetTypes, map,
56 canStandardize, standardize
58 Mappers for specific data sources (e.g., CFHT Megacam, LSST
59 simulations, etc.) should inherit this class.
The CameraMapper manages datasets within a "root" directory.  It can also
be given an "outputRoot".  If so, the input root is linked into the
outputRoot directory using a symlink named "_parent"; writes go into the
outputRoot while reads can come from either the root or outputRoot.  As
outputRoots are used as inputs for further processing, the chain of
"_parent" links allows any dataset to be retrieved.  Note that writing to
a dataset present in the input root will hide the existing dataset but not
overwrite it.  See #2160 for design discussion.
A camera is assumed to consist of one or more rafts, each composed of
multiple CCDs.  Each CCD is in turn composed of one or more amplifiers
(amps).  A camera is also assumed to have a camera geometry description
(CameraGeom object) as a policy file, a filter description (Filter class
static configuration) as another policy file, and an optional defects
description directory.

Information from the camera geometry and defects is inserted into all
Exposure objects returned.
The mapper uses one or two registries to retrieve metadata about the
images.  The first is a registry of all raw exposures.  This must contain
the time of the observation.  One or more tables (or the equivalent)
within the registry are used to look up data identifier components that
are not specified by the user (e.g. filter) and to return results for
metadata queries.  The second is an optional registry of all calibration
data.  This should contain validity start and end entries for each
calibration dataset in the same timescale as the observation time.
Subclasses will typically set MakeRawVisitInfoClass:

MakeRawVisitInfoClass: a class variable that points to a subclass of
MakeRawVisitInfo, a functor that creates an lsst.afw.image.VisitInfo
from the FITS metadata of a raw image.

Subclasses must provide the following methods (see the illustrative
subclass sketch after the class attributes below):

_extractDetectorName(self, dataId): returns the detector name for a CCD
(e.g., "CFHT 21", "R:1,2 S:3,4") as used in the AFW CameraGeom class given
a dataset identifier referring to that CCD or a subcomponent of it.

_computeCcdExposureId(self, dataId): see below

_computeCoaddExposureId(self, dataId, singleFilter): see below
Subclasses may also need to override the following methods:

_transformId(self, dataId): transformation of a data identifier
from colloquial usage (e.g., "ccdname") to proper/actual usage (e.g., "ccd"),
including making it suitable for path expansion (e.g. removing commas).
The default implementation does nothing.  Note that this
method should not modify its input parameter.

getShortCcdName(self, ccdName): a static method that returns a shortened name
suitable for use as a filename.  The default version converts spaces to underscores.

_getCcdKeyVal(self, dataId): return a CCD key and value
by which to look up defects in the defects registry.
The default value returns ("ccd", detector name).

_mapActualToPath(self, template, actualId): convert a template path to an
actual path, using the actual dataset identifier.
The mapper's behaviors are largely specified by the policy file.
See the MapperDictionary.paf for descriptions of the available items.

The 'exposures', 'calibrations', and 'datasets' subpolicies configure
mappings (see Mappings class).

Common default mappings for all subclasses can be specified in the
"policy/{images,exposures,calibrations,datasets}.yaml" files.  This provides
a simple way to add a product to all camera mappers.

Functions to map (provide a path to the data given a dataset
identifier dictionary) and standardize (convert data into some standard
format or type) may be provided in the subclass as "map_{dataset type}"
and "std_{dataset type}", respectively.

If non-Exposure datasets cannot be retrieved using standard
daf_persistence methods alone, a "bypass_{dataset type}" function may be
provided in the subclass to return the dataset instead of using the
"datasets" subpolicy.

Implementations of map_camera and bypass_camera that should typically be
sufficient are provided in this base class.
To do:

* Handle defects the same way as all other calibration products, using the calibration registry
* Instead of auto-loading the camera at construction time, load it from the calibration registry
* Rewrite defects as AFW tables so we don't need pyfits to unpersist them; then remove all mention
  of pyfits from this package.
"""
MakeRawVisitInfoClass = MakeRawVisitInfo
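
# Illustrative only (not part of obs_base): a minimal sketch of what a concrete
# subclass typically provides, per the class docstring above.  The camera name,
# key names, and bit layout below are hypothetical.
#
# class MyCamMapper(CameraMapper):
#     packageName = "obs_mycam"                  # package holding the policy/camera data
#     MakeRawVisitInfoClass = MakeRawVisitInfo   # or a camera-specific subclass
#
#     def _extractDetectorName(self, dataId):
#         # Detector name as known to lsst.afw.cameraGeom, e.g. "MyCam 12"
#         return "MyCam %(ccd)d" % dataId
#
#     def _computeCcdExposureId(self, dataId):
#         # Pack visit and ccd into one 64-bit integer; the layout is camera-specific
#         return long(dataId["visit"]) * 100 + dataId["ccd"]
#
#     def _computeCoaddExposureId(self, dataId, singleFilter):
#         # Pack tract, patch (and filter when singleFilter is True) similarly
#         raise NotImplementedError()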
def __init__(self, policy, repositoryDir,
             root=None, registry=None, calibRoot=None, calibRegistry=None,
             provided=None, outputRoot=None):
161 """Initialize the CameraMapper.
162 @param policy (daf_persistence.Policy, or pexPolicy.Policy (only for backward compatibility))
163 Policy with per-camera defaults already merged
164 @param repositoryDir (string) Policy repository for the subclassing
165 module (obtained with getRepositoryPath() on the
166 per-camera default dictionary)
167 @param root (string) Root directory for data
168 @param registry (string) Path to registry with data's metadata
169 @param calibRoot (string) Root directory for calibrations
170 @param calibRegistry (string) Path to registry with calibrations'
172 @param provided (list of strings) Keys provided by the mapper
173 @param outputRoot (string) Root directory for output data
    dafPersist.Mapper.__init__(self)

    self.log = lsstLog.Log.getLogger("CameraMapper")

    policy = dafPersist.Policy(policy)

    repoPolicy = CameraMapper.getRepoPolicy(self.root, self.root)
    if repoPolicy is not None:
        policy.update(repoPolicy)
    defaultPolicyFile = dafPersist.Policy.defaultPolicyFile("obs_base",
                                                            "MapperDictionary.paf",
                                                            "policy")
    dictPolicy = dafPersist.Policy(defaultPolicyFile)
    policy.merge(dictPolicy)
    if 'levels' in policy:
        levelsPolicy = policy['levels']
        for key in levelsPolicy.names(True):
            self.levels[key] = set(levelsPolicy.asArray(key))
    if 'defaultSubLevels' in policy:
    if outputRoot is not None and os.path.abspath(outputRoot) != os.path.abspath(root):
        if not os.path.exists(outputRoot):
            try:
                os.makedirs(outputRoot)
            except OSError as e:
                if not e.errno == errno.EEXIST:
                    raise
            if not os.path.exists(outputRoot):
                raise RuntimeError("Unable to create output repository '%s'" % (outputRoot,))
        if os.path.exists(root):
            # Symlink the input root into the output repository as "_parent"
            src = os.path.abspath(root)
            dst = os.path.join(outputRoot, "_parent")
            if not os.path.exists(dst):
                try:
                    os.symlink(src, dst)
                except OSError:
                    pass
            if os.path.exists(dst):
                if os.path.realpath(dst) != os.path.realpath(src):
                    raise RuntimeError("Output repository path "
                                       "'%s' already exists and differs from "
                                       "input repository path '%s'" % (dst, src))
            else:
                raise RuntimeError("Unable to symlink from input "
                                   "repository path '%s' to output repository "
                                   "path '%s'" % (src, dst))
    if calibRoot is None:
        if 'calibRoot' in policy:
            calibRoot = policy['calibRoot']

    if not os.path.exists(root):
        self.log.warn("Root directory not found: %s", root)
    if not os.path.exists(calibRoot):
        self.log.warn("Calibration root directory not found: %s", calibRoot)
    if 'needCalibRegistry' in policy and policy['needCalibRegistry']:
        calibRegistry = self._setupRegistry("calibRegistry", calibRegistry, policy,
                                            "calibRegistryPath", calibRoot)

    self._initMappings(policy, root, calibRoot, calibRegistry, provided=None)
    if 'defects' in policy:
        self.defectPath = os.path.join(repositoryDir, policy['defects'])
        defectRegistryLocation = os.path.join(self.defectPath, "defectRegistry.sqlite3")
        self.defectRegistry = dafPersist.Registry.create(defectRegistryLocation)

    if self.packageName is None:
        raise ValueError('class variable packageName must not be None')
def _initMappings(self, policy, root=None, calibRoot=None, calibRegistry=None, provided=None):
    """Initialize mappings

    For each of the dataset types that we want to be able to read, there are
    methods that can be created to support them:
    * map_<dataset> : determine the path for dataset
    * std_<dataset> : standardize the retrieved dataset
    * bypass_<dataset> : retrieve the dataset (bypassing the usual retrieval machinery)
    * query_<dataset> : query the registry

    Besides the dataset types explicitly listed in the policy, we create
    additional, derived datasets for additional conveniences, e.g., reading
    the header of an image, retrieving only the size of a catalog.

    @param policy        (Policy) Policy with per-camera defaults already merged
    @param root          (string) Root directory for data
    @param calibRoot     (string) Root directory for calibrations
    @param calibRegistry (string) Path to registry with calibrations' metadata
    @param provided      (list of strings) Keys provided by the mapper
    """
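    # Illustrative only: for a hypothetical dataset type "calexp" declared under
    # the "exposures" policy, the loop below generates map_calexp, query_calexp
    # and std_calexp, plus derived convenience datasets such as map_calexp_filename,
    # map_calexp_md, map_calexp_wcs and map_calexp_sub, which a butler then exposes
    # as dataset types "calexp_filename", "calexp_md", and so on.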
    imgMappingPolicy = dafPersist.Policy(dafPersist.Policy.defaultPolicyFile(
        "obs_base", "ImageMappingDictionary.paf", "policy"))
    expMappingPolicy = dafPersist.Policy(dafPersist.Policy.defaultPolicyFile(
        "obs_base", "ExposureMappingDictionary.paf", "policy"))
    calMappingPolicy = dafPersist.Policy(dafPersist.Policy.defaultPolicyFile(
        "obs_base", "CalibrationMappingDictionary.paf", "policy"))
    dsMappingPolicy = dafPersist.Policy(dafPersist.Policy.defaultPolicyFile(
        "obs_base", "DatasetMappingDictionary.paf", "policy"))
    mappingList = (
        ("images", imgMappingPolicy, ImageMapping),
        ("exposures", expMappingPolicy, ExposureMapping),
        ("calibrations", calMappingPolicy, CalibrationMapping),
        ("datasets", dsMappingPolicy, DatasetMapping),
    )
    for name, defPolicy, cls in mappingList:
        datasets = policy[name]

        defaultsPath = os.path.join(getPackageDir("obs_base"), "policy", name + ".yaml")
        if os.path.exists(defaultsPath):
            datasets.merge(dafPersist.Policy(defaultsPath))

        mappings = dict()
        setattr(self, name, mappings)
        for datasetType in datasets.names(True):
            subPolicy = datasets[datasetType]
            subPolicy.merge(defPolicy)

            if not hasattr(self, "map_" + datasetType) and 'composite' in subPolicy:
                def compositeClosure(dataId, write=False, mapper=None, mapping=None, subPolicy=subPolicy):
                    components = subPolicy.get('composite')
                    assembler = subPolicy['assembler'] if 'assembler' in subPolicy else None
                    disassembler = subPolicy['disassembler'] if 'disassembler' in subPolicy else None
                    python = subPolicy['python']
                    butlerComposite = dafPersist.ButlerComposite(assembler=assembler,
                                                                 disassembler=disassembler,
                                                                 python=python,
                                                                 dataId=dataId,
                                                                 mapper=self)
                    for name, component in components.items():
                        butlerComposite.add(id=name,
                                            datasetType=component.get('datasetType'),
                                            setter=component.get('setter', None),
                                            getter=component.get('getter', None),
                                            subset=component.get('subset', False),
                                            inputOnly=component.get('inputOnly', False))
                    return butlerComposite
                setattr(self, "map_" + datasetType, compositeClosure)
            if name == "calibrations":
                mapping = cls(datasetType, subPolicy, self.registry, calibRegistry, calibRoot,
                              provided=provided)
            else:
                mapping = cls(datasetType, subPolicy, self.registry, root, provided=provided)
            self.keyDict.update(mapping.keys())
            mappings[datasetType] = mapping
            self.mappings[datasetType] = mapping
            if not hasattr(self, "map_" + datasetType):
                def mapClosure(dataId, write=False, mapper=weakref.proxy(self), mapping=mapping):
                    return mapping.map(mapper, dataId, write)
                setattr(self, "map_" + datasetType, mapClosure)
            if not hasattr(self, "query_" + datasetType):
                def queryClosure(format, dataId, mapping=mapping):
                    return mapping.lookup(format, dataId)
                setattr(self, "query_" + datasetType, queryClosure)
            if hasattr(mapping, "standardize") and not hasattr(self, "std_" + datasetType):
                def stdClosure(item, dataId, mapper=weakref.proxy(self), mapping=mapping):
                    return mapping.standardize(mapper, item, dataId)
                setattr(self, "std_" + datasetType, stdClosure)
            def setMethods(suffix, mapImpl=None, bypassImpl=None, queryImpl=None):
                """Set convenience methods on CameraMapper"""
                mapName = "map_" + datasetType + "_" + suffix
                bypassName = "bypass_" + datasetType + "_" + suffix
                queryName = "query_" + datasetType + "_" + suffix
                if not hasattr(self, mapName):
                    setattr(self, mapName, mapImpl or getattr(self, "map_" + datasetType))
                if not hasattr(self, bypassName):
                    if bypassImpl is None and hasattr(self, "bypass_" + datasetType):
                        bypassImpl = getattr(self, "bypass_" + datasetType)
                    if bypassImpl is not None:
                        setattr(self, bypassName, bypassImpl)
                if not hasattr(self, queryName):
                    setattr(self, queryName, queryImpl or getattr(self, "query_" + datasetType))

            setMethods("filename", bypassImpl=lambda datasetType, pythonType, location, dataId:
                       location.getLocations())
            if subPolicy["storage"] == "FitsStorage":
                setMethods("md", bypassImpl=lambda datasetType, pythonType, location, dataId:
                           afwImage.readMetadata(location.getLocations()[0]))
            if name == "exposures":
                setMethods("wcs", bypassImpl=lambda datasetType, pythonType, location, dataId:
                           afwImage.makeWcs(afwImage.readMetadata(location.getLocations()[0])))
                setMethods("calib", bypassImpl=lambda datasetType, pythonType, location, dataId:
                           afwImage.Calib(afwImage.readMetadata(location.getLocations()[0])))
                setMethods("visitInfo",
                           bypassImpl=lambda datasetType, pythonType, location, dataId:
                           afwImage.VisitInfo(afwImage.readMetadata(location.getLocations()[0])))
            if subPolicy["storage"] == "FitsCatalogStorage":
                setMethods("md", bypassImpl=lambda datasetType, pythonType, location, dataId:
            if subPolicy["storage"] == "FitsStorage":
                def mapSubClosure(dataId, write=False, mapper=weakref.proxy(self), mapping=mapping):
                    subId = dataId.copy()
                    del subId['bbox']
                    loc = mapping.map(mapper, subId, write)
                    bbox = dataId['bbox']
                    llcX = bbox.getMinX()
                    llcY = bbox.getMinY()
                    width = bbox.getWidth()
                    height = bbox.getHeight()
                    loc.additionalData.set('llcX', llcX)
                    loc.additionalData.set('llcY', llcY)
                    loc.additionalData.set('width', width)
                    loc.additionalData.set('height', height)
                    if 'imageOrigin' in dataId:
                        loc.additionalData.set('imageOrigin',
                                               dataId['imageOrigin'])
                    return loc

                def querySubClosure(key, format, dataId, mapping=mapping):
                    subId = dataId.copy()
                    del subId['bbox']
                    return mapping.lookup(format, subId)
                setMethods("sub", mapImpl=mapSubClosure, queryImpl=querySubClosure)
            if subPolicy["storage"] == "FitsCatalogStorage":
                # Length of catalog
                setMethods("len", bypassImpl=lambda datasetType, pythonType, location, dataId:
                           afwImage.readMetadata(location.getLocations()[0], hdu=1).get("NAXIS2"))
                # Schema of catalog
                if not datasetType.endswith("_schema") and datasetType + "_schema" not in datasets:
                    setMethods("schema", bypassImpl=lambda datasetType, pythonType, location, dataId:
                               afwTable.Schema.readFits(location.getLocations()[0]))
462 """Compute the 64-bit (long) identifier for a CCD exposure.
464 Subclasses must override
466 @param dataId (dict) Data identifier with visit, ccd
468 raise NotImplementedError()
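
# Illustrative only: one way a hypothetical camera with at most 100 CCDs per
# visit might implement the packing described above; real mappers define their
# own bit layout and also provide bypass_ccdExposureId_bits to report its width.
#
#     def _computeCcdExposureId(self, dataId):
#         return long(dataId["visit"]) * 100 + int(dataId["ccd"])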
471 """Compute the 64-bit (long) identifier for a coadd.
473 Subclasses must override
475 @param dataId (dict) Data identifier with tract and patch.
476 @param singleFilter (bool) True means the desired ID is for a single-
477 filter coadd, in which case dataId
480 raise NotImplementedError()
484 """Get the policy stored in a repo (specified by 'root'), if there is one.
486 @param root (string) path to the root location of the repository
487 @param repos (string) path from the root of the repo to the folder containing a file named
488 _policy.paf or _policy.yaml
489 @return (lsst.daf.persistence.Policy or None) A Policy instantiated with the policy found according to
490 input variables, or None if a policy file was not found.
494 paths = CameraMapper.parentSearch(root, os.path.join(repos,
'_policy.*'))
495 if paths
is not None:
496 for postfix
in (
'.yaml',
'.paf'):
497 matches = [path
for path
in paths
if (os.path.splitext(path))[1] == postfix]
499 raise RuntimeError(
"More than 1 policy possibility for root:%s" % root)
500 elif len(matches) == 1:
501 policy = dafPersist.Policy(matches[0])
    return CameraMapper.parentSearch(self.root, path)
510 """Look for the given path in the current root or any of its parents
511 by following "_parent" symlinks; return None if it can't be found. A
512 little tricky because the path may be in an alias of the root (e.g.
513 ".") and because the "_parent" links go between the root and the rest
515 If the path contains an HDU indicator (a number in brackets before the
516 dot, e.g. 'foo.fits[1]', this will be stripped when searching and so
517 will match filenames without the HDU indicator, e.g. 'foo.fits'. The
518 path returned WILL contain the indicator though, e.g. ['foo.fits[1]'].
    rootDir = root
    # First remove trailing slashes
    while len(rootDir) > 1 and rootDir[-1] == '/':
        rootDir = rootDir[:-1]

    if path.startswith(rootDir + "/"):
        # Common case: the path starts with the root prefix string
        path = path[len(rootDir) + 1:]
    elif rootDir == "/" and path.startswith("/"):
        path = path[1:]
    else:
        # Search for a prefix that is the same as the root
        pathPrefix = os.path.dirname(path)
        while pathPrefix != "" and pathPrefix != "/":
            if os.path.realpath(pathPrefix) == os.path.realpath(root):
                break
            pathPrefix = os.path.dirname(pathPrefix)
        if os.path.realpath(pathPrefix) != os.path.realpath(root):
            # No prefix matching root; don't search for parents
            paths = glob.glob(path)
        if pathPrefix == "/":
            path = path[1:]
        elif pathPrefix != "":
            path = path[len(pathPrefix) + 1:]

    # Strip off any HDU indicator (e.g. "[1]") before searching
    strippedPath = path
    pathStripped = None
    firstBracket = path.find("[")
    if firstBracket != -1:
        strippedPath = path[:firstBracket]
        pathStripped = path[firstBracket:]

    # Search the root and then each "_parent" in turn
    dir = rootDir
    while True:
        paths = glob.glob(os.path.join(dir, strippedPath))
        if len(paths) > 0:
            if pathStripped is not None:
                paths = [p + pathStripped for p in paths]
            return paths
        dir = os.path.join(dir, "_parent")
        if not os.path.exists(dir):
            return None
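
    # Illustrative only: for a hypothetical path "raw/v1/c03.fits[1]", the loop
    # above globs for "raw/v1/c03.fits" in the root, then root/_parent,
    # root/_parent/_parent, ..., and any match is returned with the HDU
    # indicator re-appended, e.g. ["<root>/_parent/raw/v1/c03.fits[1]"].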
582 """Rename any existing object with the given type and dataId.
584 The CameraMapper implementation saves objects in a sequence of e.g.:
588 All of the backups will be placed in the output repo, however, and will
589 not be removed if they are found elsewhere in the _parent chain. This
590 means that the same file will be stored twice if the previous version was
591 found in an input repo.
593 def firstElement(list):
594 """Get the first element in the list, or None if that can't be done.
596 return list[0]
if list
is not None and len(list)
else None
    newLocation = self.map(datasetType, dataId, write=True)
    newPath = newLocation.getLocations()[0]
    path = firstElement(path)
    while path is not None:
        oldPaths.append((n, path))
        path = firstElement(path)
    for n, oldPath in reversed(oldPaths):
        newDir, newFile = os.path.split(newPath)
        if not os.path.exists(newDir):
            os.makedirs(newDir)
        shutil.copy(oldPath, "%s~%d" % (newPath, n))
616 """Return supported keys.
617 @return (iterable) List of keys usable in a dataset identifier"""
618 return iter(self.keyDict.keys())
621 """Return supported keys and their value types for a given dataset
622 type at a given level of the key hierarchy.
624 @param datasetType (str) dataset type or None for all keys
625 @param level (str) level or None for all levels
626 @return (iterable) Set of keys usable in a dataset identifier"""
632 if datasetType
is None:
633 keyDict = copy.copy(self.
keyDict)
636 if level
is not None and level
in self.
levels:
637 keyDict = copy.copy(keyDict)
638 for l
in self.
levels[level]:
653 """Return the name of the camera that this CameraMapper is for."""
655 className = className[className.find(
'.'):-1]
656 m = re.search(
r'(\w+)Mapper', className)
658 m = re.search(
r"class '[\w.]*?(\w+)'", className)
660 return name[:1].lower() + name[1:]
if name
else ''
664 """Return the name of the package containing this CameraMapper."""
665 if cls.packageName
is None:
666 raise ValueError(
'class variable packageName must not be None')
667 return cls.packageName
670 """Map a camera dataset."""
672 raise RuntimeError(
"No camera dataset available.")
674 return dafPersist.ButlerLocation(
675 pythonType=
"lsst.afw.cameraGeom.CameraConfig",
677 storageName=
"ConfigStorage",
684 """Return the (preloaded) camera object.
687 raise RuntimeError(
"No camera dataset available.")
691 """Map defects dataset.
693 @return a very minimal ButlerLocation containing just the locationList field
694 (just enough information that bypass_defects can use it).
697 if defectFitsPath
is None:
698 raise RuntimeError(
"No defects available for dataId=%s" % (dataId,))
700 return dafPersist.ButlerLocation(
None,
None,
None, defectFitsPath, dataId, self)
703 """Return a defect based on the butler location returned by map_defects
705 @param[in] butlerLocation: a ButlerLocation with locationList = path to defects FITS file
706 @param[in] dataId: the usual data ID; "ccd" must be set
708 Note: the name "bypass_XXX" means the butler makes no attempt to convert the ButlerLocation
709 into an object, which is what we want for now, since that conversion is a bit tricky.
712 defectsFitsPath = butlerLocation.locationList[0]
713 with pyfits.open(defectsFitsPath)
as hduList:
714 for hdu
in hduList[1:]:
715 if hdu.header[
"name"] != detectorName:
719 for data
in hdu.data:
727 raise RuntimeError(
"No defects for ccd %s in %s" % (detectorName, defectsFitsPath))
    return dafPersist.ButlerLocation(
        pythonType="lsst.obs.base.ExposureIdInfo",
        storageName="Internal",
        locationList="ignored",
740 """Hook to retrieve an lsst.obs.base.ExposureIdInfo for an exposure"""
741 expId = self.bypass_ccdExposureId(datasetType, pythonType, location, dataId)
742 expBits = self.bypass_ccdExposureId_bits(datasetType, pythonType, location, dataId)
743 return ExposureIdInfo(expId=expId, expBits=expBits)
746 """Disable standardization for bfKernel
748 bfKernel is a calibration product that is numpy array,
749 unlike other calibration products that are all images;
750 all calibration images are sent through _standardizeExposure
751 due to CalibrationMapping, but we don't want that to happen to bfKernel
756 """Standardize a raw dataset by converting it to an Exposure instead of an Image"""
759 md = exposure.getMetadata()
761 exposure.getInfo().setVisitInfo(visitInfo)
766 """Map a sky policy."""
767 return dafPersist.ButlerLocation(
"lsst.pex.policy.Policy",
"Policy",
768 "Internal",
None,
None, self)
771 """Standardize a sky policy by returning the one we use."""
781 """Return CCD key and value used to look a defect in the defect registry
783 The default implementation simply returns ("ccd", full detector name)
788 """Set up a registry (usually SQLite3), trying a number of possible
790 @param name (string) Name of registry
791 @param path (string) Path for registry
792 @param policyKey (string) Key in policy for registry path
793 @param root (string) Root directory to look in
794 @return (lsst.daf.persistence.Registry) Registry object"""
796 if path
is None and policyKey
in policy:
798 if not os.path.exists(path):
799 if not os.path.isabs(path)
and root
is not None:
801 newPath = newPath[0]
if newPath
is not None and len(newPath)
else None
803 self.log.warn(
"Unable to locate registry at policy path (also looked in root): %s",
807 self.log.warn(
"Unable to locate registry at policy path: %s", path)
813 if path
is None and root
is not None:
814 path = os.path.join(root,
"%s.sqlite3" % name)
816 newPath = newPath[0]
if newPath
is not None and len(newPath)
else None
818 self.log.info(
"Unable to locate %s registry in root: %s", name, path)
821 path = os.path.join(
".",
"%s.sqlite3" % name)
823 newPath = newPath[0]
if newPath
is not None and len(newPath)
else None
825 self.log.info(
"Unable to locate %s registry in current dir: %s", name, path)
828 if not os.path.exists(path):
830 newPath = newPath[0]
if newPath
is not None and len(newPath)
else None
831 if newPath
is not None:
833 self.log.debug(
"Loading %s registry from %s", name, path)
834 registry = dafPersist.Registry.create(path)
835 elif not registry
and os.path.exists(root):
836 self.log.info(
"Loading Posix registry from %s", root)
837 registry = dafPersist.PosixRegistry(root)
842 """Generate a standard ID dict from a camera-specific ID dict.
844 Canonical keys include:
845 - amp: amplifier name
846 - ccd: CCD name (in LSST this is a combination of raft and sensor)
847 The default implementation returns a copy of its input.
849 @param dataId[in] (dict) Dataset identifier; this must not be modified
850 @return (dict) Transformed dataset identifier"""
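
# Illustrative only: a hypothetical subclass override of _transformId that maps a
# colloquial "ccdname" key to the actual "ccd" key and strips commas so the value
# is safe for path expansion; the input is copied, not modified.
#
#     def _transformId(self, dataId):
#         actualId = dataId.copy()
#         if "ccdname" in actualId:
#             actualId["ccd"] = actualId.pop("ccdname").replace(",", "")
#         return actualId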
855 """Convert a template path to an actual path, using the actual data
856 identifier. This implementation is usually sufficient but can be
857 overridden by the subclass.
858 @param template (string) Template path
859 @param actualId (dict) Dataset identifier
860 @return (string) Pathname"""
864 return template % transformedId
865 except Exception
as e:
866 raise RuntimeError(
"Failed to format %r with data %r: %s" % (template, transformedId, e))
870 """Convert a CCD name to a form useful as a filename
872 The default implementation converts spaces to underscores.
874 return ccdName.replace(
" ",
"_")
877 """Extract the detector (CCD) name from the dataset identifier.
879 The name in question is the detector name used by lsst.afw.cameraGeom.
881 @param dataId (dict) Dataset identifier
882 @return (string) Detector name
884 raise NotImplementedError(
"No _extractDetectorName() function specified")
887 """Extract the amplifier identifer from a dataset identifier.
889 @warning this is deprecated; DO NOT USE IT
891 amplifier identifier has two parts: the detector name for the CCD
892 containing the amplifier and index of the amplifier in the detector.
893 @param dataId (dict) Dataset identifer
894 @return (tuple) Amplifier identifier"""
897 return (trDataId[
"ccd"], int(trDataId[
'amp']))
900 """Set the detector object in an Exposure for an amplifier.
901 Defects are also added to the Exposure based on the detector object.
902 @param[in,out] item (lsst.afw.image.Exposure)
903 @param dataId (dict) Dataset identifier
904 @param trimmed (bool) Should detector be marked as trimmed? (ignored)"""
909 """Set the detector object in an Exposure for a CCD.
910 @param[in,out] item (lsst.afw.image.Exposure)
911 @param dataId (dict) Dataset identifier
912 @param trimmed (bool) Should detector be marked as trimmed? (ignored)"""
915 detector = self.
camera[detectorName]
916 item.setDetector(detector)
919 """Set the filter object in an Exposure. If the Exposure had a FILTER
920 keyword, this was already processed during load. But if it didn't,
921 use the filter from the registry.
922 @param mapping (lsst.obs.base.Mapping)
923 @param[in,out] item (lsst.afw.image.Exposure)
924 @param dataId (dict) Dataset identifier"""
926 if not (isinstance(item, afwImage.ExposureU)
or isinstance(item, afwImage.ExposureI)
or
927 isinstance(item, afwImage.ExposureF)
or isinstance(item, afwImage.ExposureD)):
930 actualId = mapping.need([
'filter'], dataId)
931 filterName = actualId[
'filter']
933 filterName = self.
filters[filterName]
939 """Default standardization function for images.
941 This sets the Detector from the camera geometry
942 and optionally set the Fiter. In both cases this saves
943 having to persist some data in each exposure (or image).
945 @param mapping (lsst.obs.base.Mapping)
946 @param[in,out] item image-like object; any of lsst.afw.image.Exposure,
947 lsst.afw.image.DecoratedImage, lsst.afw.image.Image
948 or lsst.afw.image.MaskedImage
949 @param dataId (dict) Dataset identifier
950 @param filter (bool) Set filter? Ignored if item is already an exposure
951 @param trimmed (bool) Should detector be marked as trimmed?
952 @return (lsst.afw.image.Exposure) the standardized Exposure"""
953 if not hasattr(item,
"getMaskedImage"):
956 except Exception
as e:
957 self.log.error(
"Could not turn item=%r into an exposure: %s" % (repr(item), e))
960 if mapping.level.lower() ==
"amp":
962 elif mapping.level.lower() ==
"ccd":
971 """Find the defects for a given CCD.
972 @param dataId (dict) Dataset identifier
973 @return (string) path to the defects file or None if not available"""
977 raise RuntimeError(
"No registry for defect lookup")
981 dataIdForLookup = {
'visit': dataId[
'visit']}
983 rows = self.registry.lookup((
'taiObs'), (
'raw_visit'), dataIdForLookup)
986 assert len(rows) == 1
990 rows = self.defectRegistry.executeQuery((
"path",), (
"defect",),
992 (
"DATETIME(?)",
"DATETIME(validStart)",
"DATETIME(validEnd)"),
994 if not rows
or len(rows) == 0:
997 return os.path.join(self.
defectPath, rows[0][0])
999 raise RuntimeError(
"Querying for defects (%s, %s) returns %d files: %s" %
1000 (ccdVal, taiObs, len(rows),
", ".join([_[0]
for _
in rows])))
1003 """Make a camera (instance of lsst.afw.cameraGeom.Camera) describing the camera geometry
1005 Also set self.cameraDataLocation, if relevant (else it can be left None).
1007 This implementation assumes that policy contains an entry "camera" that points to the
1008 subdirectory in this package of camera data; specifically, that subdirectory must contain:
1009 - a file named `camera.py` that contains persisted camera config
1010 - ampInfo table FITS files, as required by lsst.afw.cameraGeom.makeCameraFromPath
1012 @param policy (daf_persistence.Policy, or pexPolicy.Policy (only for backward compatibility))
1013 Policy with per-camera defaults already merged
1014 @param repositoryDir (string) Policy repository for the subclassing
1015 module (obtained with getRepositoryPath() on the
1016 per-camera default dictionary)
1019 policy = dafPersist.Policy(pexPolicy=policy)
1020 if 'camera' not in policy:
1021 raise RuntimeError(
"Cannot find 'camera' in policy; cannot construct a camera")
1022 cameraDataSubdir = policy[
'camera']
1024 os.path.join(repositoryDir, cameraDataSubdir,
"camera.py"))
1025 cameraConfig = afwCameraGeom.CameraConfig()
1028 return afwCameraGeom.makeCameraFromPath(
1029 cameraConfig=cameraConfig,
1030 ampInfoPath=ampInfoPath,
1036 """Generate an Exposure from an image-like object
1038 If the image is a DecoratedImage then also set its WCS and metadata
1039 (Image and MaskedImage are missing the necessary metadata
1040 and Exposure already has those set)
1042 @param[in] image Image-like object (lsst.afw.image.DecoratedImage, Image, MaskedImage or Exposure)
1043 @return (lsst.afw.image.Exposure) Exposure containing input image
1045 if hasattr(image,
"getVariance"):
1048 elif hasattr(image,
"getImage"):
1051 metadata = image.getMetadata()
1053 exposure.setWcs(wcs)
1054 exposure.setMetadata(metadata)
1055 elif hasattr(image,
"getMaskedImage"):