from builtins import str
from past.builtins import long

from lsst.daf.butlerUtils import ImageMapping, ExposureMapping, CalibrationMapping, DatasetMapping
from .exposureIdInfo import ExposureIdInfo
44 """This module defines the CameraMapper base class."""
49 """CameraMapper is a base class for mappers that handle images from a
50 camera and products derived from them. This provides an abstraction layer
51 between the data on disk and the code.
53 Public methods: keys, queryMetadata, getDatasetTypes, map,
54 canStandardize, standardize
56 Mappers for specific data sources (e.g., CFHT Megacam, LSST
57 simulations, etc.) should inherit this class.
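
    A minimal subclass sketch (the camera name, package name, and policy
    path here are hypothetical, not defined by this module):

        class MyCameraMapper(CameraMapper):
            packageName = "obs_mycamera"

            def __init__(self, **kwargs):
                policy = dafPersist.Policy("/path/to/MyCameraMapper.yaml")
                repositoryDir = "/path/to/policy"
                CameraMapper.__init__(self, policy, repositoryDir, **kwargs)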

    The CameraMapper manages datasets within a "root" directory. It can also
    be given an "outputRoot". If so, the input root is linked into the
    outputRoot directory using a symlink named "_parent"; writes go into the
    outputRoot while reads can come from either the root or outputRoot. As
    outputRoots are used as inputs for further processing, the chain of
    _parent links allows any dataset to be retrieved. Note that writing to a
    dataset present in the input root will hide the existing dataset but not
    overwrite it. See #2160 for design discussion.
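
    For example (hypothetical paths), chaining an output repository to an
    input repository amounts to:

        os.symlink("/data/input", os.path.join("/data/output", "_parent"))

    after which datasets not yet written to "/data/output" are still found by
    following the "_parent" link back to "/data/input".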

    A camera is assumed to consist of one or more rafts, each composed of
    multiple CCDs. Each CCD is in turn composed of one or more amplifiers
    (amps). A camera is also assumed to have a camera geometry description
    (CameraGeom object) as a policy file, a filter description (Filter class
    static configuration) as another policy file, and an optional defects
    description directory.

    Information from the camera geometry and defects is inserted into all
    Exposure objects returned.

    The mapper uses one or two registries to retrieve metadata about the
    images. The first is a registry of all raw exposures. This must contain
    the time of the observation. One or more tables (or the equivalent)
    within the registry are used to look up data identifier components that
    are not specified by the user (e.g. filter) and to return results for
    metadata queries. The second is an optional registry of all calibration
    data. This should contain validity start and end entries for each
    calibration dataset in the same timescale as the observation time.

    The following method must be provided by the subclass:

    _extractDetectorName(self, dataId): returns the detector name for a CCD
    (e.g., "CFHT 21", "R:1,2 S:3,4") as used in the AFW CameraGeom class given
    a dataset identifier referring to that CCD or a subcomponent of it.
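
    For example, a minimal sketch (the "raft" and "sensor" keys are
    hypothetical; use whatever keys the camera's registry provides):

        def _extractDetectorName(self, dataId):
            return "R:%(raft)s S:%(sensor)s" % dataId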

    Other methods that the subclass may wish to override include:

    _transformId(self, dataId): transformation of a data identifier
    from colloquial usage (e.g., "ccdname") to proper/actual usage (e.g., "ccd"),
    including making it suitable for path expansion (e.g. removing commas).
    The default implementation does nothing. Note that this
    method should not modify its input parameter.
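
    A sketch of such an override (the colloquial "ccdname" key is hypothetical):

        def _transformId(self, dataId):
            copyId = dict(dataId)
            if "ccdname" in copyId:
                copyId["ccd"] = copyId.pop("ccdname")
            return copyId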

    getShortCcdName(self, ccdName): a static method that returns a shortened name
    suitable for use as a filename. The default version converts spaces to underscores.

    _getCcdKeyVal(self, dataId): return a CCD key and value
    by which to look up defects in the defects registry.
    The default implementation returns ("ccd", detector name).

    _mapActualToPath(self, template, actualId): convert a template path to an
    actual path, using the actual dataset identifier.
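
    The expansion itself amounts to applying the (transformed) identifier to
    the template, for example (hypothetical template and identifier):

        template = "raw/v%(visit)d/%(ccd)s.fits"
        actualId = {"visit": 1234, "ccd": "R12_S34"}
        template % actualId  # -> "raw/v1234/R12_S34.fits"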

    The mapper's behaviors are largely specified by the policy file.
    See the MapperDictionary.paf for descriptions of the available items.

    The 'exposures', 'calibrations', and 'datasets' subpolicies configure
    mappings (see Mappings class).

    Common default mappings for all subclasses can be specified in the
    "policy/{images,exposures,calibrations,datasets}.yaml" files. This provides
    a simple way to add a product to all camera mappers.

    Functions to map (provide a path to the data given a dataset
    identifier dictionary) and standardize (convert data into some standard
    format or type) may be provided in the subclass as "map_{dataset type}"
    and "std_{dataset type}", respectively.

    If non-Exposure datasets cannot be retrieved using standard
    daf_persistence methods alone, a "bypass_{dataset type}" function may be
    provided in the subclass to return the dataset instead of using the
    "datasets" subpolicy.

    Implementations of map_camera and bypass_camera that should typically be
    sufficient are provided in this base class.

    @todo
    * Handle defects the same way as all other calibration products, using the calibration registry
    * Instead of auto-loading the camera at construction time, load it from the calibration registry
    * Rewrite defects as AFW tables so we don't need pyfits to unpersist them; then remove all mention
      of pyfits from this package.
    """

    def __init__(self, policy, repositoryDir,
                 root=None, registry=None, calibRoot=None, calibRegistry=None,
                 provided=None, outputRoot=None):
145 """Initialize the CameraMapper.
146 @param policy (daf_persistence.Policy, or pexPolicy.Policy (only for backward compatibility))
147 Policy with per-camera defaults already merged
148 @param repositoryDir (string) Policy repository for the subclassing
149 module (obtained with getRepositoryPath() on the
150 per-camera default dictionary)
151 @param root (string) Root directory for data
152 @param registry (string) Path to registry with data's metadata
153 @param calibRoot (string) Root directory for calibrations
154 @param calibRegistry (string) Path to registry with calibrations'
156 @param provided (list of strings) Keys provided by the mapper
157 @param outputRoot (string) Root directory for output data

        dafPersist.Mapper.__init__(self)

        self.log = lsstLog.Log.getLogger("CameraMapper")

        policy = dafPersist.Policy(policy)

        repoPolicy = CameraMapper.getRepoPolicy(self.root, self.root)
        if repoPolicy is not None:
            policy.update(repoPolicy)

        defaultPolicyFile = dafPersist.Policy.defaultPolicyFile("daf_butlerUtils",
                                                                "MapperDictionary.paf",
                                                                "policy")
        dictPolicy = dafPersist.Policy(defaultPolicyFile)
        policy.merge(dictPolicy)

        if 'levels' in policy:
            levelsPolicy = policy['levels']
            for key in levelsPolicy.names(True):
                self.levels[key] = set(levelsPolicy.asArray(key))
        if 'defaultSubLevels' in policy:

        if outputRoot is not None and os.path.abspath(outputRoot) != os.path.abspath(root):
            if not os.path.exists(outputRoot):
                try:
                    os.makedirs(outputRoot)
                except OSError as e:
                    if not e.errno == errno.EEXIST:
                        raise
                if not os.path.exists(outputRoot):
                    raise RuntimeError("Unable to create output repository '%s'" % (outputRoot,))
            if os.path.exists(root):
                # Symlink the input root into the output repository as "_parent"
                src = os.path.abspath(root)
                dst = os.path.join(outputRoot, "_parent")
                if not os.path.exists(dst):
                    try:
                        os.symlink(src, dst)
                    except OSError:
                        pass
                if os.path.exists(dst):
                    if os.path.realpath(dst) != os.path.realpath(src):
                        raise RuntimeError("Output repository path "
                                           "'%s' already exists and differs from "
                                           "input repository path '%s'" % (dst, src))
                else:
                    raise RuntimeError("Unable to symlink from input "
                                       "repository path '%s' to output repository "
                                       "path '%s'" % (src, dst))

        if calibRoot is None:
            if 'calibRoot' in policy:
                calibRoot = policy['calibRoot']

        if not os.path.exists(root):
            self.log.warn("Root directory not found: %s", root)
        if not os.path.exists(calibRoot):
            self.log.warn("Calibration root directory not found: %s", calibRoot)

        if 'needCalibRegistry' in policy and policy['needCalibRegistry']:
            calibRegistry = self._setupRegistry("calibRegistry", calibRegistry, policy,
                                                "calibRegistryPath", calibRoot)

        imgMappingPolicy = dafPersist.Policy(dafPersist.Policy.defaultPolicyFile(
            "daf_butlerUtils", "ImageMappingDictionary.paf", "policy"))
        expMappingPolicy = dafPersist.Policy(dafPersist.Policy.defaultPolicyFile(
            "daf_butlerUtils", "ExposureMappingDictionary.paf", "policy"))
        calMappingPolicy = dafPersist.Policy(dafPersist.Policy.defaultPolicyFile(
            "daf_butlerUtils", "CalibrationMappingDictionary.paf", "policy"))
        dsMappingPolicy = dafPersist.Policy(dafPersist.Policy.defaultPolicyFile(
            "daf_butlerUtils", "DatasetMappingDictionary.paf", "policy"))

        mappingList = (
            ("images", imgMappingPolicy, ImageMapping),
            ("exposures", expMappingPolicy, ExposureMapping),
            ("calibrations", calMappingPolicy, CalibrationMapping),
            ("datasets", dsMappingPolicy, DatasetMapping),
        )
        for name, defPolicy, cls in mappingList:
            datasets = policy[name]

            # Centrally-defined default datasets for this mapping class
            defaultsPath = os.path.join(getPackageDir("daf_butlerUtils"), "policy", name + ".yaml")
            if os.path.exists(defaultsPath):
                datasets.merge(dafPersist.Policy(defaultsPath))

            mappings = dict()
            setattr(self, name, mappings)
            for datasetType in datasets.names(True):
                subPolicy = datasets[datasetType]
                subPolicy.merge(defPolicy)
                if name == "calibrations":
                    mapping = cls(datasetType, subPolicy, self.registry, calibRegistry, calibRoot,
                                  provided=provided)
                else:
                    mapping = cls(datasetType, subPolicy, self.registry, root, provided=provided)
                self.keyDict.update(mapping.keys())
                mappings[datasetType] = mapping
                self.mappings[datasetType] = mapping
                if not hasattr(self, "map_" + datasetType):
                    def mapClosure(dataId, write=False,
                                   mapper=weakref.proxy(self), mapping=mapping):
                        return mapping.map(mapper, dataId, write)
                    setattr(self, "map_" + datasetType, mapClosure)
                if not hasattr(self, "query_" + datasetType):
                    def queryClosure(format, dataId, mapping=mapping):
                        return mapping.lookup(format, dataId)
                    setattr(self, "query_" + datasetType, queryClosure)
                if hasattr(mapping, "standardize") and \
                        not hasattr(self, "std_" + datasetType):
                    def stdClosure(item, dataId,
                                   mapper=weakref.proxy(self), mapping=mapping):
                        return mapping.standardize(mapper, item, dataId)
                    setattr(self, "std_" + datasetType, stdClosure)

                mapFunc = "map_" + datasetType + "_filename"
                bypassFunc = "bypass_" + datasetType + "_filename"
                if not hasattr(self, mapFunc):
                    setattr(self, mapFunc, getattr(self, "map_" + datasetType))
                if not hasattr(self, bypassFunc):
                    setattr(self, bypassFunc,
                            lambda datasetType, pythonType, location, dataId: location.getLocations())

                if name == "exposures" or name == "images":
                    expFunc = "map_" + datasetType
                    mdFunc = expFunc + "_md"
                    bypassFunc = "bypass_" + datasetType + "_md"
                    if not hasattr(self, mdFunc):
                        setattr(self, mdFunc, getattr(self, expFunc))
                    if not hasattr(self, bypassFunc):
                        setattr(self, bypassFunc,
                                lambda datasetType, pythonType, location, dataId:
                                afwImage.readMetadata(location.getLocations()[0]))
                    if not hasattr(self, "query_" + datasetType + "_md"):
                        setattr(self, "query_" + datasetType + "_md",
                                getattr(self, "query_" + datasetType))

                    subFunc = expFunc + "_sub"
                    if not hasattr(self, subFunc):
                        def mapSubClosure(dataId, write=False, mapper=weakref.proxy(self),
                                          mapping=mapping):
                            subId = dataId.copy()
                            del subId['bbox']
                            loc = mapping.map(mapper, subId, write)
                            bbox = dataId['bbox']
                            llcX = bbox.getMinX()
                            llcY = bbox.getMinY()
                            width = bbox.getWidth()
                            height = bbox.getHeight()
                            loc.additionalData.set('llcX', llcX)
                            loc.additionalData.set('llcY', llcY)
                            loc.additionalData.set('width', width)
                            loc.additionalData.set('height', height)
                            if 'imageOrigin' in dataId:
                                loc.additionalData.set('imageOrigin', dataId['imageOrigin'])
                            return loc
                        setattr(self, subFunc, mapSubClosure)
                    if not hasattr(self, "query_" + datasetType + "_sub"):
                        def querySubClosure(key, format, dataId, mapping=mapping):
                            subId = dataId.copy()
                            del subId['bbox']
                            return mapping.lookup(format, subId)
                        setattr(self, "query_" + datasetType + "_sub", querySubClosure)

        if 'defects' in policy:
            self.defectPath = os.path.join(repositoryDir, policy['defects'])
            defectRegistryLocation = os.path.join(self.defectPath, "defectRegistry.sqlite3")
            self.defectRegistry = dafPersist.Registry.create(defectRegistryLocation)

            raise ValueError('class variable packageName must not be None')
382 """Get the policy stored in a repo (specified by 'root'), if there is one.
384 @param root (string) path to the root location of the repository
385 @param repos (string) path from the root of the repo to the folder containing a file named
386 _policy.paf or _policy.yaml
387 @return (lsst.daf.persistence.Policy or None) A Policy instantiated with the policy found according to
388 input variables, or None if a policy file was not found.
        paths = CameraMapper.parentSearch(root, os.path.join(repos, '_policy.*'))
        if paths is not None:
            for postfix in ('.yaml', '.paf'):
                matches = [path for path in paths if (os.path.splitext(path))[1] == postfix]
                if len(matches) > 1:
                    raise RuntimeError("More than 1 policy possibility for root:%s" % root)
                elif len(matches) == 1:
                    policy = dafPersist.Policy(matches[0])
        return CameraMapper.parentSearch(self.root, path)
408 """Look for the given path in the current root or any of its parents
409 by following "_parent" symlinks; return None if it can't be found. A
410 little tricky because the path may be in an alias of the root (e.g.
411 ".") and because the "_parent" links go between the root and the rest
413 If the path contains an HDU indicator (a number in brackets before the
414 dot, e.g. 'foo.fits[1]', this will be stripped when searching and so
415 will match filenames without the HDU indicator, e.g. 'foo.fits'. The
416 path returned WILL contain the indicator though, e.g. ['foo.fits[1]'].
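
        For example (hypothetical paths; a sketch only):

            CameraMapper.parentSearch("/data/output", "/data/output/raw/foo.fits[1]")

        globs for "raw/foo.fits" first in "/data/output", then in
        "/data/output/_parent", "/data/output/_parent/_parent", and so on,
        returning any matches with the "[1]" suffix re-appended.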
        while len(rootDir) > 1 and rootDir[-1] == '/':
            rootDir = rootDir[:-1]

        if path.startswith(rootDir + "/"):
            # Common case: the path shares the root prefix string
            path = path[len(rootDir)+1:]
        elif rootDir == "/" and path.startswith("/"):
            path = path[1:]
        else:
            # Search for a prefix that resolves to the same place as root
            pathPrefix = os.path.dirname(path)
            while pathPrefix != "" and pathPrefix != "/":
                if os.path.realpath(pathPrefix) == os.path.realpath(root):
                    break
                pathPrefix = os.path.dirname(pathPrefix)
            if os.path.realpath(pathPrefix) != os.path.realpath(root):
                # No prefix matching root, so don't search parents
                paths = glob.glob(path)
            if pathPrefix == "/":
                path = path[1:]
            elif pathPrefix != "":
                path = path[len(pathPrefix)+1:]

        # Strip off any HDU indicator (e.g. "[1]") before searching, but keep
        # it so it can be re-appended to any matches
        strippedPath = path
        pathStripped = None
        firstBracket = path.find("[")
        if firstBracket != -1:
            strippedPath = path[:firstBracket]
            pathStripped = path[firstBracket:]

        dir = rootDir
        while True:
            paths = glob.glob(os.path.join(dir, strippedPath))
            if len(paths) > 0:
                if pathStripped is not None:
                    paths = [p + pathStripped for p in paths]
                return paths
            dir = os.path.join(dir, "_parent")
            if not os.path.exists(dir):
                return None
480 """Rename any existing object with the given type and dataId.
482 The CameraMapper implementation saves objects in a sequence of e.g.:
486 All of the backups will be placed in the output repo, however, and will
487 not be removed if they are found elsewhere in the _parent chain. This
488 means that the same file will be stored twice if the previous version was
489 found in an input repo.
        def firstElement(list):
            """Get the first element in the list, or None if that can't be done.
            """
            return list[0] if list is not None and len(list) else None

        newLocation = self.map(datasetType, dataId, write=True)
        newPath = newLocation.getLocations()[0]
        path = firstElement(path)
        while path is not None:
            oldPaths.append((n, path))
            path = firstElement(path)
        for n, oldPath in reversed(oldPaths):
            newDir, newFile = os.path.split(newPath)
            if not os.path.exists(newDir):
                os.makedirs(newDir)
            shutil.copy(oldPath, "%s~%d" % (newPath, n))
514 """Return supported keys.
515 @return (iterable) List of keys usable in a dataset identifier"""
516 return iter(self.keyDict.keys())
519 """Return supported keys and their value types for a given dataset
520 type at a given level of the key hierarchy.
522 @param datasetType (str) dataset type or None for all keys
523 @param level (str) level or None for all levels
524 @return (iterable) Set of keys usable in a dataset identifier"""
        if datasetType is None:
            keyDict = copy.copy(self.keyDict)
        if level is not None and level in self.levels:
            keyDict = copy.copy(keyDict)
            for l in self.levels[level]:
551 """Return the name of the camera that this CameraMapper is for."""
553 className = className[className.find(
'.'):-1]
554 m = re.search(
r'(\w+)Mapper', className)
556 m = re.search(
r"class '[\w.]*?(\w+)'", className)
558 return name[:1].lower() + name[1:]
if name
else ''
562 """Return the name of the package containing this CameraMapper."""
563 if cls.packageName
is None:
564 raise ValueError(
'class variable packageName must not be None')
565 return cls.packageName
568 """Map a camera dataset."""
570 raise RuntimeError(
"No camera dataset available.")
572 return dafPersist.ButlerLocation(
573 pythonType=
"lsst.afw.cameraGeom.CameraConfig",
575 storageName=
"ConfigStorage",
582 """Return the (preloaded) camera object.
585 raise RuntimeError(
"No camera dataset available.")
589 """Map defects dataset.
591 @return a very minimal ButlerLocation containing just the locationList field
592 (just enough information that bypass_defects can use it).
595 if defectFitsPath
is None:
596 raise RuntimeError(
"No defects available for dataId=%s" % (dataId,))
598 return dafPersist.ButlerLocation(
None,
None,
None, defectFitsPath, dataId, self)
601 """Return a defect based on the butler location returned by map_defects
603 @param[in] butlerLocation: a ButlerLocation with locationList = path to defects FITS file
604 @param[in] dataId: the usual data ID; "ccd" must be set
606 Note: the name "bypass_XXX" means the butler makes no attempt to convert the ButlerLocation
607 into an object, which is what we want for now, since that conversion is a bit tricky.
610 defectsFitsPath = butlerLocation.locationList[0]
611 with pyfits.open(defectsFitsPath)
as hduList:
612 for hdu
in hduList[1:]:
613 if hdu.header[
"name"] != detectorName:
617 for data
in hdu.data:
625 raise RuntimeError(
"No defects for ccd %s in %s" % (detectorName, defectsFitsPath))

        return dafPersist.ButlerLocation(
            pythonType="lsst.daf.butlerUtils.ExposureIdInfo",
            storageName="Internal",
            locationList="ignored",
638 """Hook to retrieve an lsst.daf.butlerUtils.ExposureIdInfo for an exposure"""
639 expId = self.bypass_ccdExposureId(datasetType, pythonType, location, dataId)
640 expBits = self.bypass_ccdExposureId_bits(datasetType, pythonType, location, dataId)
641 return ExposureIdInfo(expId=expId, expBits=expBits)
644 """Standardize a raw dataset by converting it to an Exposure instead of an Image"""
650 """Map a sky policy."""
651 return dafPersist.ButlerLocation(
"lsst.pex.policy.Policy",
"Policy",
652 "Internal",
None,
None, self)
655 """Standardize a sky policy by returning the one we use."""
665 """Return CCD key and value used to look a defect in the defect registry
667 The default implementation simply returns ("ccd", full detector name)
672 """Set up a registry (usually SQLite3), trying a number of possible
674 @param name (string) Name of registry
675 @param path (string) Path for registry
676 @param policyKey (string) Key in policy for registry path
677 @param root (string) Root directory to look in
678 @return (lsst.daf.persistence.Registry) Registry object"""
        if path is None and policyKey in policy:
            path = dafPersist.LogicalLocation(policy[policyKey]).locString()
            if not os.path.exists(path):
                if not os.path.isabs(path) and root is not None:
                    newPath = newPath[0] if newPath is not None and len(newPath) else None
                    if newPath is None:
                        self.log.warn("Unable to locate registry at policy path (also looked in root): %s",
                                      path)
                else:
                    self.log.warn("Unable to locate registry at policy path: %s", path)
        if path is None and root is not None:
            path = os.path.join(root, "%s.sqlite3" % name)
            newPath = newPath[0] if newPath is not None and len(newPath) else None
            if newPath is None:
                self.log.info("Unable to locate %s registry in root: %s", name, path)
        if path is None:
            path = os.path.join(".", "%s.sqlite3" % name)
            newPath = newPath[0] if newPath is not None and len(newPath) else None
            if newPath is None:
                self.log.info("Unable to locate %s registry in current dir: %s", name, path)
        if path is not None:
            if not os.path.exists(path):
                newPath = newPath[0] if newPath is not None and len(newPath) else None
                if newPath is not None:
                    path = newPath
            self.log.info("Loading %s registry from %s", name, path)
            registry = dafPersist.Registry.create(path)
        elif not registry and os.path.exists(root):
            self.log.info("Loading Posix registry from %s", root)
            registry = dafPersist.PosixRegistry(root)
        return registry
726 """Generate a standard ID dict from a camera-specific ID dict.
728 Canonical keys include:
729 - amp: amplifier name
730 - ccd: CCD name (in LSST this is a combination of raft and sensor)
731 The default implementation returns a copy of its input.
733 @param dataId[in] (dict) Dataset identifier; this must not be modified
734 @return (dict) Transformed dataset identifier"""
739 """Convert a template path to an actual path, using the actual data
740 identifier. This implementation is usually sufficient but can be
741 overridden by the subclass.
742 @param template (string) Template path
743 @param actualId (dict) Dataset identifier
744 @return (string) Pathname"""
748 return template % transformedId
749 except Exception
as e:
750 raise RuntimeError(
"Failed to format %r with data %r: %s" % (template, transformedId, e))
754 """Convert a CCD name to a form useful as a filename
756 The default implementation converts spaces to underscores.
758 return ccdName.replace(
" ",
"_")
761 """Extract the detector (CCD) name from the dataset identifier.
763 The name in question is the detector name used by lsst.afw.cameraGeom.
765 @param dataId (dict) Dataset identifier
766 @return (string) Detector name
768 raise NotImplementedError(
"No _extractDetectorName() function specified")
771 """Extract the amplifier identifer from a dataset identifier.
773 @warning this is deprecated; DO NOT USE IT
775 amplifier identifier has two parts: the detector name for the CCD
776 containing the amplifier and index of the amplifier in the detector.
777 @param dataId (dict) Dataset identifer
778 @return (tuple) Amplifier identifier"""
781 return (trDataId[
"ccd"], int(trDataId[
'amp']))
784 """Set the detector object in an Exposure for an amplifier.
785 Defects are also added to the Exposure based on the detector object.
786 @param[in,out] item (lsst.afw.image.Exposure)
787 @param dataId (dict) Dataset identifier
788 @param trimmed (bool) Should detector be marked as trimmed? (ignored)"""
793 """Set the detector object in an Exposure for a CCD.
794 @param[in,out] item (lsst.afw.image.Exposure)
795 @param dataId (dict) Dataset identifier
796 @param trimmed (bool) Should detector be marked as trimmed? (ignored)"""
799 detector = self.
camera[detectorName]
800 item.setDetector(detector)
803 """Set the filter object in an Exposure. If the Exposure had a FILTER
804 keyword, this was already processed during load. But if it didn't,
805 use the filter from the registry.
806 @param mapping (lsst.daf.butlerUtils.Mapping)
807 @param[in,out] item (lsst.afw.image.Exposure)
808 @param dataId (dict) Dataset identifier"""
810 if not (isinstance(item, afwImage.ExposureU)
or isinstance(item, afwImage.ExposureI)
or
811 isinstance(item, afwImage.ExposureF)
or isinstance(item, afwImage.ExposureD)):
814 actualId = mapping.need([
'filter'], dataId)
815 filterName = actualId[
'filter']
817 filterName = self.
filters[filterName]
821 """Set the exposure time and exposure midpoint in the calib object in
822 an Exposure. Use the EXPTIME and MJD-OBS keywords (and strip out
824 @param mapping (lsst.daf.butlerUtils.Mapping)
825 @param[in,out] item (lsst.afw.image.Exposure)
826 @param dataId (dict) Dataset identifier"""
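        # A worked example of the midpoint arithmetic below (hypothetical
        # values): with EXPTIME = 30 s, the midpoint is
        # obsStart.nsecs() + 15000000000 ns, i.e. half the exposure time
        # after the start of the observation.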
        md = item.getMetadata()
        calib = item.getCalib()
        if md.exists("EXPTIME"):
            expTime = md.get("EXPTIME")
            calib.setExptime(expTime)
        else:
            expTime = calib.getExptime()
        if md.exists("MJD-OBS"):
            obsStart = dafBase.DateTime(md.get("MJD-OBS"),
                                        dafBase.DateTime.MJD, dafBase.DateTime.UTC)
            obsMidpoint = obsStart.nsecs() + long(expTime * 1000000000 / 2)
            calib.setMidTime(dafBase.DateTime(obsMidpoint))
845 """Default standardization function for images.
846 @param mapping (lsst.daf.butlerUtils.Mapping)
847 @param[in,out] item (lsst.afw.image.Exposure)
848 @param dataId (dict) Dataset identifier
849 @param filter (bool) Set filter?
850 @param trimmed (bool) Should detector be marked as trimmed?
851 @return (lsst.afw.image.Exposure) the standardized Exposure"""
853 if (re.search(
r'Exposure', mapping.python)
and re.search(
r'Image', mapping.persistable)):
856 if mapping.level.lower() ==
"amp":
858 elif mapping.level.lower() ==
"ccd":
863 if not isinstance(mapping, CalibrationMapping):
869 """Find the defects for a given CCD.
870 @param dataId (dict) Dataset identifier
871 @return (string) path to the defects file or None if not available"""
875 raise RuntimeError(
"No registry for defect lookup")
879 dataIdForLookup = {
'visit': dataId[
'visit']}
881 rows = self.registry.lookup((
'taiObs'), (
'raw_visit'), dataIdForLookup)
884 assert len(rows) == 1

        # Look up the defects for this CCD whose validity range contains the
        # observation time
        rows = self.defectRegistry.executeQuery(("path",), ("defect",),
                                                [(ccdKey, "?")],
                                                ("DATETIME(?)", "DATETIME(validStart)", "DATETIME(validEnd)"),
                                                (ccdVal, taiObs))
        if not rows or len(rows) == 0:
            return None
        if len(rows) == 1:
            return os.path.join(self.defectPath, rows[0][0])
        else:
            raise RuntimeError("Querying for defects (%s, %s) returns %d files: %s" %
                               (ccdVal, taiObs, len(rows), ", ".join([_[0] for _ in rows])))
901 """Make a camera (instance of lsst.afw.cameraGeom.Camera) describing the camera geometry
903 Also set self.cameraDataLocation, if relevant (else it can be left None).
905 This implementation assumes that policy contains an entry "camera" that points to the
906 subdirectory in this package of camera data; specifically, that subdirectory must contain:
907 - a file named `camera.py` that contains persisted camera config
908 - ampInfo table FITS files, as required by lsst.afw.cameraGeom.makeCameraFromPath
910 @param policy (daf_persistence.Policy, or pexPolicy.Policy (only for backward compatibility))
911 Policy with per-camera defaults already merged
912 @param repositoryDir (string) Policy repository for the subclassing
913 module (obtained with getRepositoryPath() on the
914 per-camera default dictionary)
917 policy = dafPersist.Policy(pexPolicy=policy)
918 if 'camera' not in policy:
919 raise RuntimeError(
"Cannot find 'camera' in policy; cannot construct a camera")
920 cameraDataSubdir = policy[
'camera']
922 os.path.join(repositoryDir, cameraDataSubdir,
"camera.py"))
923 cameraConfig = afwCameraGeom.CameraConfig()
926 return afwCameraGeom.makeCameraFromPath(
927 cameraConfig=cameraConfig,
928 ampInfoPath=ampInfoPath,
934 """Generate an exposure from a DecoratedImage or similar
935 @param[in] image Image of interest
936 @return (lsst.afw.image.Exposure) Exposure containing input image
938 if isinstance(image, afwImage.DecoratedImageU)
or isinstance(image, afwImage.DecoratedImageI)
or \
939 isinstance(image, afwImage.DecoratedImageF)
or isinstance(image, afwImage.DecoratedImageD):
943 md = image.getMetadata()
944 exposure.setMetadata(md)