cameraMapper.py
1 #!/usr/bin/env python
2 #
3 # LSST Data Management System
4 # Copyright 2008, 2009, 2010 LSST Corporation.
5 #
6 # This product includes software developed by the
7 # LSST Project (http://www.lsst.org/).
8 #
9 # This program is free software: you can redistribute it and/or modify
10 # it under the terms of the GNU General Public License as published by
11 # the Free Software Foundation, either version 3 of the License, or
12 # (at your option) any later version.
13 #
14 # This program is distributed in the hope that it will be useful,
15 # but WITHOUT ANY WARRANTY; without even the implied warranty of
16 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
17 # GNU General Public License for more details.
18 #
19 # You should have received a copy of the LSST License Statement and
20 # the GNU General Public License along with this program. If not,
21 # see <http://www.lsstcorp.org/LegalNotices/>.
22 #
23 
24 import os
25 import errno
26 import re
27 import sys
28 import shutil
29 import pyfits # required by _makeDefectsDict until defects are written as AFW tables
30 import eups
31 import lsst.daf.persistence as dafPersist
32 from lsst.daf.butlerUtils import ImageMapping, ExposureMapping, CalibrationMapping, DatasetMapping, Registry
33 import lsst.daf.base as dafBase
34 import lsst.afw.geom as afwGeom
35 import lsst.afw.image as afwImage
36 import lsst.afw.cameraGeom as afwCameraGeom
37 import lsst.pex.logging as pexLog
38 import lsst.pex.policy as pexPolicy
39 
40 """This module defines the CameraMapper base class."""
41 
42 class CameraMapper(dafPersist.Mapper):
43 
44  """CameraMapper is a base class for mappers that handle images from a
45  camera and products derived from them. This provides an abstraction layer
46  between the data on disk and the code.
47 
48  Public methods: keys, queryMetadata, getDatasetTypes, map,
49  canStandardize, standardize
50 
51  Mappers for specific data sources (e.g., CFHT Megacam, LSST
52  simulations, etc.) should inherit this class.
53 
54  The CameraMapper manages datasets within a "root" directory. It can also
55  be given an "outputRoot". If so, the input root is linked into the
56  outputRoot directory using a symlink named "_parent"; writes go into the
57  outputRoot while reads can come from either the root or outputRoot. As
58  outputRoots are used as inputs for further processing, the chain of
59  _parent links allows any dataset to be retrieved. Note that writing to a
60  dataset present in the input root will hide the existing dataset but not
61  overwrite it. See #2160 for design discussion.
62 
63  A camera is assumed to consist of one or more rafts, each composed of
64  multiple CCDs. Each CCD is in turn composed of one or more amplifiers
65  (amps). A camera is also assumed to have a camera geometry description
66  (CameraGeom object) as a policy file, a filter description (Filter class
67  static configuration) as another policy file, and an optional defects
68  description directory.
69 
70  Information from the camera geometry and defects is inserted into all
71  Exposure objects returned.
72 
73  The mapper uses one or two registries to retrieve metadata about the
74  images. The first is a registry of all raw exposures. This must contain
75  the time of the observation. One or more tables (or the equivalent)
76  within the registry are used to look up data identifier components that
77  are not specified by the user (e.g. filter) and to return results for
78  metadata queries. The second is an optional registry of all calibration
79  data. This should contain validity start and end entries for each
80  calibration dataset in the same timescale as the observation time.
81 
82  The following method must be provided by the subclass:
83 
84  _extractDetectorName(self, dataId): returns the detector name for a CCD
85  (e.g., "CFHT 21", "R:1,2 S:3,4") as used in the AFW CameraGeom class given
86  a dataset identifier referring to that CCD or a subcomponent of it.
87 
88  Other methods that the subclass may wish to override include:
89 
90  _transformId(self, dataId): transformation of a data identifier
91  from colloquial usage (e.g., "ccdname") to proper/actual usage (e.g., "ccd"),
92  including making it suitable for path expansion (e.g. removing commas).
93  The default implementation does nothing. Note that this
94  method should not modify its input parameter.
95 
96  getShortCcdName(self, ccdName): a static method that returns a shortened name
97  suitable for use as a filename. The default version converts spaces to underscores.
98 
99  _getCcdKeyVal(self, dataId): return a CCD key and value
100  by which to look up defects in the defects registry.
101  The default implementation returns ("ccd", detector name).
102 
103  _mapActualToPath(self, template, actualId): convert a template path to an
104  actual path, using the actual dataset identifier.
105 
106  The mapper's behaviors are largely specified by the policy file.
107  See the MapperDictionary.paf for descriptions of the available items.
108 
109  The 'exposures', 'calibrations', and 'datasets' subpolicies configure
110  mappings (see Mappings class).
111 
112  Functions to map (provide a path to the data given a dataset
113  identifier dictionary) and standardize (convert data into some standard
114  format or type) may be provided in the subclass as "map_{dataset type}"
115  and "std_{dataset type}", respectively.
116 
117  If non-Exposure datasets cannot be retrieved using standard
118  daf_persistence methods alone, a "bypass_{dataset type}" function may be
119  provided in the subclass to return the dataset instead of using the
120  "datasets" subpolicy.
121 
122  Implementations of map_camera and std_camera that should typically be
123  sufficient are provided in this base class.
124 
125  @todo
126  * Handle defects the same way as all other calibration products, using the calibration registry
127  * Instead of auto-loading the camera at construction time, load it from the calibration registry
128  * Rewrite defects as AFW tables so we don't need pyfits to unpersist them; then remove all mention
129  of pyfits from this package.
130  """
131 
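 # A minimal subclass sketch (hypothetical camera and package names; the policy
 # file "MyCamMapper.paf", the "obs_mycam" product, and the detector naming
 # scheme are assumptions, not part of this module):
 #
 # class MyCamMapper(CameraMapper):
 #     def __init__(self, **kwargs):
 #         policyFile = pexPolicy.DefaultPolicyFile("obs_mycam", "MyCamMapper.paf", "policy")
 #         policy = pexPolicy.Policy.createPolicy(policyFile, policyFile.getRepositoryPath())
 #         CameraMapper.__init__(self, policy, policyFile.getRepositoryPath(), **kwargs)
 #
 #     def _extractDetectorName(self, dataId):
 #         # Return the lsst.afw.cameraGeom detector name for the CCD in dataId.
 #         return "ccd%(ccd)02d" % dataId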
132  def __init__(self, policy, repositoryDir,
133  root=None, registry=None, calibRoot=None, calibRegistry=None,
134  provided=None, outputRoot=None):
135  """Initialize the CameraMapper.
136  @param policy (pexPolicy.Policy) Policy with per-camera defaults
137  already merged
138  @param repositoryDir (string) Policy repository for the subclassing
139  module (obtained with getRepositoryPath() on the
140  per-camera default dictionary)
141  @param root (string) Root directory for data
142  @param registry (string) Path to registry with data's metadata
143  @param calibRoot (string) Root directory for calibrations
144  @param calibRegistry (string) Path to registry with calibrations'
145  metadata
146  @param provided (list of strings) Keys provided by the mapper
147  @param outputRoot (string) Root directory for output data
148  """
149 
150  dafPersist.Mapper.__init__(self)
151 
152  self.log = pexLog.Log(pexLog.getDefaultLog(), "CameraMapper")
153 
154  # Dictionary
155  dictFile = pexPolicy.DefaultPolicyFile("daf_butlerUtils",
156  "MapperDictionary.paf", "policy")
157  dictPolicy = pexPolicy.Policy.createPolicy(dictFile,
158  dictFile.getRepositoryPath())
159  policy.mergeDefaults(dictPolicy)
160 
161  # Levels
162  self.levels = dict()
163  if policy.exists("levels"):
164  levelsPolicy = policy.getPolicy("levels")
165  for key in levelsPolicy.names(True):
166  self.levels[key] = set(levelsPolicy.getStringArray(key))
167  self.defaultLevel = policy.getString("defaultLevel")
168  self.defaultSubLevels = dict()
169  if policy.exists("defaultSubLevels"):
170  defaultSubLevelsPolicy = policy.getPolicy("defaultSubLevels")
171  for key in defaultSubLevelsPolicy.names(True):
172  self.defaultSubLevels[key] = defaultSubLevelsPolicy.getString(key)
173 
174  # Root directories
175  if root is None:
176  root = "."
177  root = dafPersist.LogicalLocation(root).locString()
178 
179  if outputRoot is not None and os.path.abspath(outputRoot) != os.path.abspath(root):
180  # Path manipulations are subject to race condition
181  if not os.path.exists(outputRoot):
182  try:
183  os.makedirs(outputRoot)
184  except OSError, e:
185  if not e.errno == errno.EEXIST:
186  raise
187  if not os.path.exists(outputRoot):
188  raise RuntimeError, "Unable to create output " \
189  "repository '%s'" % (outputRoot,)
190  if os.path.exists(root):
191  # Symlink existing input root to "_parent" in outputRoot.
192  src = os.path.abspath(root)
193  dst = os.path.join(outputRoot, "_parent")
194  if not os.path.exists(dst):
195  try:
196  os.symlink(src, dst)
197  except OSError:
198  pass
199  if os.path.exists(dst):
200  if os.path.realpath(dst) != os.path.realpath(src):
201  raise RuntimeError, "Output repository path " \
202  "'%s' already exists and differs from " \
203  "input repository path '%s'" % (dst, src)
204  else:
205  raise RuntimeError, "Unable to symlink from input " \
206  "repository path '%s' to output repository " \
207  "path '%s'" % (src, dst)
208  # We now use the outputRoot as the main root with access to the
209  # input via "_parent".
210  root = outputRoot
211 
212  if calibRoot is None:
213  if policy.exists('calibRoot'):
214  calibRoot = policy.getString('calibRoot')
215  calibRoot = dafPersist.LogicalLocation(calibRoot).locString()
216  else:
217  calibRoot = root
218 
219  if not os.path.exists(root):
220  self.log.log(pexLog.Log.WARN,
221  "Root directory not found: %s" % (root,))
222  if not os.path.exists(calibRoot):
223  self.log.log(pexLog.Log.WARN,
224  "Calibration root directory not found: %s" % (calibRoot,))
225  self.root = root
226 
227  # Registries
228  self.registry = self._setupRegistry(
229  "registry", registry, policy, "registryPath", root)
230  if policy.exists('needCalibRegistry') and \
231  policy.getBool('needCalibRegistry'):
232  calibRegistry = self._setupRegistry(
233  "calibRegistry", calibRegistry,
234  policy, "calibRegistryPath", calibRoot)
235  else:
236  calibRegistry = None
237 
238  # Sub-dictionaries (for exposure/calibration/dataset types)
239  imgMappingFile = pexPolicy.DefaultPolicyFile("daf_butlerUtils",
240  "ImageMappingDictionary.paf", "policy")
241  imgMappingPolicy = pexPolicy.Policy.createPolicy(imgMappingFile,
242  imgMappingFile.getRepositoryPath())
243  expMappingFile = pexPolicy.DefaultPolicyFile("daf_butlerUtils",
244  "ExposureMappingDictionary.paf", "policy")
245  expMappingPolicy = pexPolicy.Policy.createPolicy(expMappingFile,
246  expMappingFile.getRepositoryPath())
247  calMappingFile = pexPolicy.DefaultPolicyFile("daf_butlerUtils",
248  "CalibrationMappingDictionary.paf", "policy")
249  calMappingPolicy = pexPolicy.Policy.createPolicy(calMappingFile,
250  calMappingFile.getRepositoryPath())
251  dsMappingFile = pexPolicy.DefaultPolicyFile("daf_butlerUtils",
252  "DatasetMappingDictionary.paf", "policy")
253  dsMappingPolicy = pexPolicy.Policy.createPolicy(dsMappingFile,
254  dsMappingFile.getRepositoryPath())
255 
256  # Dict of valid keys and their value types
257  self.keyDict = dict()
258 
259  # Mappings
260  mappingList = (
261  ("images", imgMappingPolicy, ImageMapping),
262  ("exposures", expMappingPolicy, ExposureMapping),
263  ("calibrations", calMappingPolicy, CalibrationMapping),
264  ("datasets", dsMappingPolicy, DatasetMapping)
265  )
266  self.mappings = dict()
267  for name, defPolicy, cls in mappingList:
268  if policy.exists(name):
269  datasets = policy.getPolicy(name)
270  mappings = dict()
271  setattr(self, name, mappings)
272  for datasetType in datasets.names(True):
273  subPolicy = datasets.getPolicy(datasetType)
274  subPolicy.mergeDefaults(defPolicy)
275  if name == "calibrations":
276  mapping = cls(datasetType, subPolicy,
277  self.registry, calibRegistry, calibRoot, provided=provided)
278  else:
279  mapping = cls(datasetType, subPolicy,
280  self.registry, root, provided=provided)
281  self.keyDict.update(mapping.keys())
282  mappings[datasetType] = mapping
283  self.mappings[datasetType] = mapping
284  if not hasattr(self, "map_" + datasetType):
285  def mapClosure(dataId, write=False,
286  mapper=self, mapping=mapping):
287  return mapping.map(mapper, dataId, write)
288  setattr(self, "map_" + datasetType, mapClosure)
289  if not hasattr(self, "query_" + datasetType):
290  def queryClosure(key, format, dataId, mapping=mapping):
291  return mapping.lookup(format, dataId)
292  setattr(self, "query_" + datasetType, queryClosure)
293  if hasattr(mapping, "standardize") and \
294  not hasattr(self, "std_" + datasetType):
295  def stdClosure(item, dataId,
296  mapper=self, mapping=mapping):
297  return mapping.standardize(mapper, item, dataId)
298  setattr(self, "std_" + datasetType, stdClosure)
299 
300  mapFunc = "map_" + datasetType + "_filename"
301  bypassFunc = "bypass_" + datasetType + "_filename"
302  if not hasattr(self, mapFunc):
303  setattr(self, mapFunc, getattr(self, "map_" + datasetType))
304  if not hasattr(self, bypassFunc):
305  setattr(self, bypassFunc,
306  lambda datasetType, pythonType, location, dataId: location.getLocations())
307 
308  # Set up metadata versions
309  if name == "exposures" or name == "images":
310  expFunc = "map_" + datasetType # Function name to map exposure
311  mdFunc = expFunc + "_md" # Function name to map metadata
312  bypassFunc = "bypass_" + datasetType + "_md" # Function name to bypass daf_persistence
313  if not hasattr(self, mdFunc):
314  setattr(self, mdFunc, getattr(self, expFunc))
315  if not hasattr(self, bypassFunc):
316  setattr(self, bypassFunc,
317  lambda datasetType, pythonType, location, dataId:
318  afwImage.readMetadata(location.getLocations()[0]))
319  if not hasattr(self, "query_" + datasetType + "_md"):
320  setattr(self, "query_" + datasetType + "_md",
321  getattr(self, "query_" + datasetType))
322 
323  subFunc = expFunc + "_sub" # Function name to map subimage
324  if not hasattr(self, subFunc):
325  def mapSubClosure(dataId, write=False, mapper=self, mapping=mapping):
326  subId = dataId.copy()
327  del subId['bbox']
328  loc = mapping.map(mapper, subId, write)
329  bbox = dataId['bbox']
330  llcX = bbox.getMinX()
331  llcY = bbox.getMinY()
332  width = bbox.getWidth()
333  height = bbox.getHeight()
334  loc.additionalData.set('llcX', llcX)
335  loc.additionalData.set('llcY', llcY)
336  loc.additionalData.set('width', width)
337  loc.additionalData.set('height', height)
338  if 'imageOrigin' in dataId:
339  loc.additionalData.set('imageOrigin',
340  dataId['imageOrigin'])
341  return loc
342  setattr(self, subFunc, mapSubClosure)
343  if not hasattr(self, "query_" + datasetType + "_sub"):
344  def querySubClosure(key, format, dataId, mapping=mapping):
345  subId = dataId.copy()
346  del subId['bbox']
347  return mapping.lookup(format, subId)
348  setattr(self, "query_" + datasetType + "_sub", querySubClosure)
349 
350  # Camera geometry
351  self.cameraDataLocation = None # path to camera geometry config file
352  self.camera = None
353  if policy.exists('camera'):
354  cameraDataSubdir = policy.getString('camera')
355  self.cameraDataLocation = os.path.normpath(
356  os.path.join(repositoryDir, cameraDataSubdir, "camera.py"))
357  cameraConfig = afwCameraGeom.CameraConfig()
358  cameraConfig.load(self.cameraDataLocation)
359  self.camera = self.std_camera(cameraConfig, dataId=dict())
360 
361  # Defect registry and root
362  self.defectRegistry = None
363  if policy.exists('defects'):
364  self.defectPath = os.path.join(
365  repositoryDir, policy.getString('defects'))
366  defectRegistryLocation = os.path.join(
367  self.defectPath, "defectRegistry.sqlite3")
368  self.defectRegistry = \
369  Registry.create(defectRegistryLocation)
370 
371  # Filter translation table
372  self.filters = None
373 
374  # Skytile policy
375  self.skypolicy = policy.getPolicy("skytiles")
376 
377  def _parentSearch(self, path):
378  """Look for the given path in the current root or any of its parents
379  by following "_parent" symlinks; return None if it can't be found. A
380  little tricky because the path may be in an alias of the root (e.g.
381  ".") and because the "_parent" links go between the root and the rest
382  of the path.
383  """
384 
385  # Separate path into a root-equivalent prefix (in dir) and the rest
386  # (left in path)
387  rootDir = self.root
388 
389  # First remove trailing slashes (#2527)
390  while len(rootDir) > 1 and rootDir[-1] == '/':
391  rootDir = rootDir[:-1]
392 
393  if path.startswith(rootDir + "/"):
394  # Common case; we have the same root prefix string
395  path = path[len(rootDir)+1:]
396  dir = rootDir
397  elif rootDir == "/" and path.startswith("/"):
398  path = path[1:]
399  dir = rootDir
400  else:
401  # Search for prefix that is the same as root
402  pathPrefix = os.path.dirname(path)
403  while pathPrefix != "" and pathPrefix != "/":
404  if os.path.realpath(pathPrefix) == os.path.realpath(self.root):
405  break
406  pathPrefix = os.path.dirname(pathPrefix)
407  if os.path.realpath(pathPrefix) != os.path.realpath(self.root):
408  # No prefix matching root, don't search for parents
409  if os.path.exists(path):
410  return path
411  return None
412  if pathPrefix == "/":
413  path = path[1:]
414  elif pathPrefix != "":
415  path = path[len(pathPrefix)+1:]
416  # If pathPrefix == "", then the current directory is the root
417  dir = pathPrefix
418 
419  # Now search for the path in the root or its parents
420  # Strip off any cfitsio bracketed extension if present
421  strippedPath = path
422  firstBracket = path.find("[")
423  if firstBracket != -1:
424  strippedPath = path[:firstBracket]
425  while not os.path.exists(os.path.join(dir, strippedPath)):
426  dir = os.path.join(dir, "_parent")
427  if not os.path.exists(dir):
428  return None
429  return os.path.join(dir, path)
430 
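 # Illustrative sketch (hypothetical paths): with self.root == "outputRepo" and a
 # symlink outputRepo/_parent -> inputRepo, a call like
 #     self._parentSearch("outputRepo/raw/v1234.fits")
 # returns "outputRepo/raw/v1234.fits" if that file exists, otherwise
 # "outputRepo/_parent/raw/v1234.fits" if the file exists in the input
 # repository, and None if it exists in neither.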
431  def backup(self, datasetType, dataId):
432  """Rename any existing object with the given type and dataId.
433 
434  The CameraMapper implementation saves objects in a sequence of e.g.:
435  foo.fits
436  foo.fits~1
437  foo.fits~2
438  Note, however, that all backups are placed in the output repo; backups
439  found elsewhere in the _parent chain are not removed. This means that the
440  same file may be stored twice if the previous version was found in an
441  input repo.
442  """
443  n = 0
444  newLocation = self.map(datasetType, dataId, write=True)
445  newPath = newLocation.getLocations()[0]
446  path = self._parentSearch(newPath)
447  oldPaths = []
448  while path is not None:
449  n += 1
450  oldPaths.append((n, path))
451  path = self._parentSearch("%s~%d" % (newPath, n))
452  for n, oldPath in reversed(oldPaths):
453  newDir, newFile = os.path.split(newPath)
454  if not os.path.exists(newDir):
455  os.makedirs(newDir)
456  shutil.copy(oldPath, "%s~%d" % (newPath, n))
457 
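 # Illustrative sketch (hypothetical dataset type and data ID): calling
 #     mapper.backup("calexp", dict(visit=1234, ccd="R12_S01"))
 # before rewriting the calexp shifts any existing backups up one suffix
 # ("...~1" is copied to "...~2", etc.) and then copies the current file to
 # "...~1", so "~1" always holds the most recently replaced version.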
458  def keys(self):
459  """Return supported keys.
460  @return (iterable) List of keys usable in a dataset identifier"""
461  return self.keyDict.iterkeys()
462 
463  def getKeys(self, datasetType, level):
464  """Return supported keys and their value types for a given dataset
465  type at a given level of the key hierarchy.
466 
467  @param datasetType (str) dataset type or None for all keys
468  @param level (str) level or None for all levels
469  @return (iterable) Set of keys usable in a dataset identifier"""
470  if datasetType is None:
471  keyDict = self.keyDict
472  else:
473  keyDict = self.mappings[datasetType].keys()
474  if level is not None and level in self.levels:
475  keyDict = dict(keyDict)
476  for l in self.levels[level]:
477  if l in keyDict:
478  del keyDict[l]
479  return keyDict
480 
481  def getDefaultLevel(self):
482  return self.defaultLevel
483 
484  def getDefaultSubLevel(self, level):
485  if self.defaultSubLevels.has_key(level):
486  return self.defaultSubLevels[level]
487  return None
488 
489  @classmethod
490  def getCameraName(cls):
491  """Return the name of the camera that this CameraMapper is for."""
492  className = str(cls)
493  m = re.search(r'(\w+)Mapper', className)
494  if m is None:
495  m = re.search(r"class '[\w.]*?(\w+)'", className)
496  name = m.group(1)
497  return name[:1].lower() + name[1:] if name else ''
498 
499  @classmethod
500  def getEupsProductName(cls):
501  """Return the name of the EUPS product containing this CameraMapper."""
502  modPath = os.path.realpath(sys.modules[cls.__module__].__file__)
503  bestPathLen = 0
504  bestName = None
505  for prod in eups.Eups().findProducts(tags=["setup"]):
506  path = os.path.realpath(prod.dir)
507  if modPath.startswith(path) and len(path) > bestPathLen:
508  bestName = prod.name
509  bestPathLen = len(path)
510  if bestName is None:
511  raise NotImplementedError(
512  "%s did not provide an eups product name, and one could not be discovered." %
513  (str(cls),))
514  return bestName
515 
516  def map_camera(self, dataId, write=False):
517  """Map a camera dataset."""
518  if self.cameraDataLocation is None:
519  raise RuntimeError("No camera dataset available.")
520  actualId = self._transformId(dataId)
521  return dafPersist.ButlerLocation(
522  pythonType = "lsst.afw.cameraGeom.CameraConfig",
523  cppType = "Config",
524  storageName = "ConfigStorage",
525  locationList = self.cameraDataLocation,
526  dataId = actualId,
527  )
528 
529  def std_camera(self, item, dataId):
530  """Standardize a camera dataset by converting it to a camera object.
531 
532  @param[in] item: camera info (an lsst.afw.cameraGeom.CameraConfig)
533  @param[in] dataId: data ID dict
534  """
535  if self.cameraDataLocation is None:
536  raise RuntimeError("No camera dataset available.")
537  ampInfoPath = os.path.dirname(self.cameraDataLocation)
538  return afwCameraGeom.makeCameraFromPath(
539  cameraConfig = item,
540  ampInfoPath = ampInfoPath,
541  shortNameFunc = self.getShortCcdName,
542  )
543 
544  def map_defects(self, dataId, write=False):
545  """Map defects dataset.
546 
547  @return a very minimal ButlerLocation containing just the locationList field
548  (just enough information that bypass_defects can use it).
549  """
550  defectFitsPath = self._defectLookup(dataId=dataId)
551  if defectFitsPath is None:
552  raise RuntimeError("No defects available for dataId=%s" % (dataId,))
553 
554  return dafPersist.ButlerLocation(None, None, None, defectFitsPath, dataId)
555 
556  def bypass_defects(self, datasetType, pythonType, butlerLocation, dataId):
557  """Return a defect based on the butler location returned by map_defects
558 
559  @param[in] butlerLocation: a ButlerLocation with locationList = path to defects FITS file
560  @param[in] dataId: the usual data ID; "ccd" must be set
561 
562  Note: the name "bypass_XXX" means the butler makes no attempt to convert the ButlerLocation
563  into an object, which is what we want for now, since that conversion is a bit tricky.
564  """
565  detectorName = self._extractDetectorName(dataId)
566  defectsFitsPath = butlerLocation.locationList[0]
567  with pyfits.open(defectsFitsPath) as hduList:
568  for hdu in hduList[1:]:
569  if hdu.header["name"] != detectorName:
570  continue
571 
572  defectList = []
573  for data in hdu.data:
574  bbox = afwGeom.Box2I(
575  afwGeom.Point2I(int(data['x0']), int(data['y0'])),
576  afwGeom.Extent2I(int(data['width']), int(data['height'])),
577  )
578  defectList.append(afwImage.DefectBase(bbox))
579  return defectList
580 
581  raise RuntimeError("No defects for ccd %s in %s" % (detectorName, defectsFitsPath))
582 
583  def std_raw(self, item, dataId):
584  """Standardize a raw dataset by converting it to an Exposure instead of an Image"""
585  item = exposureFromImage(item)
586  return self._standardizeExposure(self.exposures['raw'], item, dataId,
587  trimmed=False)
588 
589  def map_skypolicy(self, dataId):
590  """Map a sky policy."""
591  return dafPersist.ButlerLocation("lsst.pex.policy.Policy", "Policy",
592  "Internal", None, None)
593 
594  def std_skypolicy(self, item, dataId):
595  """Standardize a sky policy by returning the one we use."""
596  return self.skypolicy
597 
598 ###############################################################################
599 #
600 # Utility functions
601 #
602 ###############################################################################
603 
604  def _getCcdKeyVal(self, dataId):
605  """Return the CCD key and value used to look up a defect in the defect registry
606 
607  The default implementation simply returns ("ccd", full detector name)
608  """
609  return ("ccd", self._extractDetectorName(dataId))
610 
611  def _setupRegistry(self, name, path, policy, policyKey, root):
612  """Set up a registry (usually SQLite3), trying a number of possible
613  paths.
614  @param name (string) Name of registry
615  @param path (string) Path for registry
616  @param policyKey (string) Key in policy for registry path
617  @param root (string) Root directory to look in
618  @return (lsst.daf.butlerUtils.Registry) Registry object"""
619 
620  if path is None and policy.exists(policyKey):
621  path = dafPersist.LogicalLocation(
622  policy.getString(policyKey)).locString()
623  if not os.path.exists(path):
624  if not os.path.isabs(path) and root is not None:
625  newPath = self._parentSearch(os.path.join(root, path))
626  if newPath is None:
627  self.log.log(pexLog.Log.WARN,
628  "Unable to locate registry at policy path (also looked in root): %s" % path)
629  path = newPath
630  else:
631  self.log.log(pexLog.Log.WARN,
632  "Unable to locate registry at policy path: %s" % path)
633  path = None
634  if path is None and root is not None:
635  path = os.path.join(root, "%s.sqlite3" % name)
636  newPath = self._parentSearch(path)
637  if newPath is None:
638  self.log.log(pexLog.Log.WARN,
639  "Unable to locate %s registry in root: %s" % (name, path))
640  path = newPath
641  if path is None:
642  path = os.path.join(".", "%s.sqlite3" % name)
643  newPath = self._parentSearch(path)
644  if newPath is None:
645  self.log.log(pexLog.Log.WARN,
646  "Unable to locate %s registry in current dir: %s" % (name, path))
647  path = newPath
648  if path is not None:
649  if not os.path.exists(path):
650  newPath = self._parentSearch(path)
651  if newPath is not None:
652  path = newPath
653  self.log.log(pexLog.Log.INFO,
654  "Loading %s registry from %s" % (name, path))
655  registry = Registry.create(path)
656  if registry is None:
657  raise RuntimeError, "Unable to load %s registry from %s" % (name, path)
658  return registry
659  else:
660  # TODO Try a FsRegistry(root)
661  self.log.log(pexLog.Log.WARN,
662  "No registry loaded; proceeding without one")
663  return None
664 
665  def _transformId(self, dataId):
666  """Generate a standard ID dict from a camera-specific ID dict.
667 
668  Canonical keys include:
669  - amp: amplifier name
670  - ccd: CCD name (in LSST this is a combination of raft and sensor)
671  The default implementation returns a copy of its input.
672 
673  @param dataId[in] (dict) Dataset identifier; this must not be modified
674  @return (dict) Transformed dataset identifier"""
675 
676  return dataId.copy()
677 
678  def _mapActualToPath(self, template, actualId):
679  """Convert a template path to an actual path, using the actual data
680  identifier. This implementation is usually sufficient but can be
681  overridden by the subclass.
682  @param template (string) Template path
683  @param actualId (dict) Dataset identifier
684  @return (string) Pathname"""
685 
686  return template % self._transformId(actualId)
687 
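 # Illustrative sketch (hypothetical template and data ID): a policy template of
 #     "raw/v%(visit)d/%(ccd)s.fits"
 # combined with actualId = {"visit": 1234, "ccd": "R12_S01"} expands to
 #     "raw/v1234/R12_S01.fits"
 # after the data ID has been passed through _transformId().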
688  @staticmethod
689  def getShortCcdName(ccdName):
690  """Convert a CCD name to a form useful as a filename
691 
692  The default implementation converts spaces to underscores.
693  """
694  return ccdName.replace(" ", "_")
695 
696  def _extractDetectorName(self, dataId):
697  """Extract the detector (CCD) name from the dataset identifier.
698 
699  The name in question is the detector name used by lsst.afw.cameraGeom.
700 
701  @param dataId (dict) Dataset identifier
702  @return (string) Detector name
703  """
704  raise NotImplementedError("No _extractDetectorName() function specified")
705 
706  def _extractAmpId(self, dataId):
707  """Extract the amplifier identifier from a dataset identifier.
708 
709  @warning this is deprecated; DO NOT USE IT
710 
711  An amplifier identifier has two parts: the detector name of the CCD
712  containing the amplifier and the index of the amplifier within the detector.
713  @param dataId (dict) Dataset identifier
714  @return (tuple) Amplifier identifier"""
715 
716  trDataId = self._transformId(dataId)
717  return (trDataId["ccd"], int(trDataId['amp']))
718 
719  def _setAmpDetector(self, item, dataId, trimmed=True):
720  """Set the detector object in an Exposure for an amplifier.
721  Defects are also added to the Exposure based on the detector object.
722  @param[in,out] item (lsst.afw.image.Exposure)
723  @param dataId (dict) Dataset identifier
724  @param trimmed (bool) Should detector be marked as trimmed? (ignored)"""
725 
726  return self._setCcdDetector(item=item, dataId=dataId, trimmed=trimmed)
727 
728  def _setCcdDetector(self, item, dataId, trimmed=True):
729  """Set the detector object in an Exposure for a CCD.
730  @param[in,out] item (lsst.afw.image.Exposure)
731  @param dataId (dict) Dataset identifier
732  @param trimmed (bool) Should detector be marked as trimmed? (ignored)"""
733 
734  detectorName = self._extractDetectorName(dataId)
735  detector = self.camera[detectorName]
736  item.setDetector(detector)
737 
738  def _setFilter(self, mapping, item, dataId):
739  """Set the filter object in an Exposure. If the Exposure had a FILTER
740  keyword, this was already processed during load. But if it didn't,
741  use the filter from the registry.
742  @param mapping (lsst.daf.butlerUtils.Mapping)
743  @param[in,out] item (lsst.afw.image.Exposure)
744  @param dataId (dict) Dataset identifier"""
745 
746  if not (isinstance(item, afwImage.ExposureU) or isinstance(item, afwImage.ExposureI) or
747  isinstance(item, afwImage.ExposureF) or isinstance(item, afwImage.ExposureD)):
748  return
749 
750  actualId = mapping.need(['filter'], dataId)
751  filterName = actualId['filter']
752  if self.filters is not None and self.filters.has_key(filterName):
753  filterName = self.filters[filterName]
754  item.setFilter(afwImage.Filter(filterName))
755 
756  def _setTimes(self, mapping, item, dataId):
757  """Set the exposure time and exposure midpoint in the calib object in
758  an Exposure. Use the EXPTIME and MJD-OBS keywords (and strip out
759  EXPTIME).
760  @param mapping (lsst.daf.butlerUtils.Mapping)
761  @param[in,out] item (lsst.afw.image.Exposure)
762  @param dataId (dict) Dataset identifier"""
763 
764  md = item.getMetadata()
765  calib = item.getCalib()
766  if md.exists("EXPTIME"):
767  expTime = md.get("EXPTIME")
768  calib.setExptime(expTime)
769  md.remove("EXPTIME")
770  else:
771  expTime = calib.getExptime()
772  if md.exists("MJD-OBS"):
773  obsStart = dafBase.DateTime(md.get("MJD-OBS"),
774  dafBase.DateTime.MJD, dafBase.DateTime.UTC)
775  obsMidpoint = obsStart.nsecs() + long(expTime * 1000000000L / 2)
776  calib.setMidTime(dafBase.DateTime(obsMidpoint))
777 
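 # Worked example of the midpoint arithmetic (assumed header values): with
 # EXPTIME = 30 s and MJD-OBS marking the start of the exposure, the calib
 # midpoint is obsStart.nsecs() + long(30 * 1000000000L / 2), i.e. 15 s
 # (in nanoseconds) after the start of the exposure.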
778 
779  # Default standardization function for exposures
780  def _standardizeExposure(self, mapping, item, dataId, filter=True,
781  trimmed=True):
782  """Default standardization function for images.
783  @param mapping (lsst.daf.butlerUtils.Mapping)
784  @param[in,out] item (lsst.afw.image.Exposure)
785  @param dataId (dict) Dataset identifier
786  @param filter (bool) Set filter?
787  @param trimmed (bool) Should detector be marked as trimmed?
788  @return (lsst.afw.image.Exposure) the standardized Exposure"""
789 
790  if (re.search(r'Exposure', mapping.python) and re.search(r'Image',mapping.persistable)):
791  item = exposureFromImage(item)
792 
793  if mapping.level.lower() == "amp":
794  self._setAmpDetector(item, dataId, trimmed)
795  elif mapping.level.lower() == "ccd":
796  self._setCcdDetector(item, dataId, trimmed)
797 
798  if filter:
799  self._setFilter(mapping, item, dataId)
800  if not isinstance(mapping, CalibrationMapping):
801  self._setTimes(mapping, item, dataId)
802 
803  return item
804 
805  def _defectLookup(self, dataId):
806  """Find the defects for a given CCD.
807  @param dataId (dict) Dataset identifier
808  @return (string) path to the defects file or None if not available"""
809  if self.defectRegistry is None:
810  return None
811  if self.registry is None:
812  raise RuntimeError, "No registry for defect lookup"
813 
814  ccdKey, ccdVal = self._getCcdKeyVal(dataId)
815 
816  rows = self.registry.executeQuery(("taiObs",), ("raw_visit",),
817  [("visit", "?")], None, (dataId['visit'],))
818  if len(rows) == 0:
819  return None
820  assert len(rows) == 1
821  taiObs = rows[0][0]
822 
823  # Lookup the defects for this CCD serial number that are valid at the
824  # exposure midpoint.
825  rows = self.defectRegistry.executeQuery(("path",), ("defect",),
826  [(ccdKey, "?")],
827  ("DATETIME(?)", "DATETIME(validStart)", "DATETIME(validEnd)"),
828  (ccdVal, taiObs))
829  if not rows or len(rows) == 0:
830  return None
831  if len(rows) == 1:
832  return os.path.join(self.defectPath, rows[0][0])
833  else:
834  raise RuntimeError("Querying for defects (%s, %s) returns %d files: %s" %
835  (ccdVal, taiObs, len(rows), ", ".join([_[0] for _ in rows])))
836 
837 
838 def exposureFromImage(image):
839  """Generate an exposure from a DecoratedImage or similar
840  @param[in] image Image of interest
841  @return (lsst.afw.image.Exposure) Exposure containing input image
842  """
843  if isinstance(image, afwImage.DecoratedImageU) or isinstance(image, afwImage.DecoratedImageI) or \
844  isinstance(image, afwImage.DecoratedImageF) or isinstance(image, afwImage.DecoratedImageD):
845  exposure = afwImage.makeExposure(afwImage.makeMaskedImage(image.getImage()))
846  else:
847  exposure = image
848  md = image.getMetadata()
849  exposure.setMetadata(md)
850  wcs = afwImage.makeWcs(md)
851  exposure.setWcs(wcs)
852  wcsMetadata = wcs.getFitsMetadata()
853  for kw in wcsMetadata.paramNames():
854  md.remove(kw)
855 
856  return exposure
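# Minimal usage sketch (the file name "raw.fits" is a placeholder; assumes the
# file can be read as a DecoratedImageF and carries a valid FITS WCS):
#
# di = afwImage.DecoratedImageF("raw.fits")
# exposure = exposureFromImage(di)
# assert exposure.getWcs() is not None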