cameraMapper.py
1 #
2 # LSST Data Management System
3 # Copyright 2008, 2009, 2010 LSST Corporation.
4 #
5 # This product includes software developed by the
6 # LSST Project (http://www.lsst.org/).
7 #
8 # This program is free software: you can redistribute it and/or modify
9 # it under the terms of the GNU General Public License as published by
10 # the Free Software Foundation, either version 3 of the License, or
11 # (at your option) any later version.
12 #
13 # This program is distributed in the hope that it will be useful,
14 # but WITHOUT ANY WARRANTY; without even the implied warranty of
15 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
16 # GNU General Public License for more details.
17 #
18 # You should have received a copy of the LSST License Statement and
19 # the GNU General Public License along with this program. If not,
20 # see <http://www.lsstcorp.org/LegalNotices/>.
21 #
22 
23 from builtins import str
24 from past.builtins import long
25 import copy
26 import errno
27 import glob
28 import os
29 import pyfits # required by bypass_defects until defects are written as AFW tables
30 import re
31 import shutil
32 import weakref
33 import lsst.daf.persistence as dafPersist
34 from lsst.daf.butlerUtils import ImageMapping, ExposureMapping, CalibrationMapping, DatasetMapping
35 import lsst.daf.base as dafBase
36 import lsst.afw.geom as afwGeom
37 import lsst.afw.image as afwImage
38 import lsst.afw.cameraGeom as afwCameraGeom
39 import lsst.log as lsstLog
40 import lsst.pex.policy as pexPolicy
41 from .exposureIdInfo import ExposureIdInfo
42 from lsst.utils import getPackageDir
43 
44 """This module defines the CameraMapper base class."""
45 
46 
47 class CameraMapper(dafPersist.Mapper):
48 
49  """CameraMapper is a base class for mappers that handle images from a
50  camera and products derived from them. This provides an abstraction layer
51  between the data on disk and the code.
52 
53  Public methods: keys, queryMetadata, getDatasetTypes, map,
54  canStandardize, standardize
55 
56  Mappers for specific data sources (e.g., CFHT Megacam, LSST
57  simulations, etc.) should inherit this class.
58 
59  The CameraMapper manages datasets within a "root" directory. It can also
60  be given an "outputRoot". If so, the input root is linked into the
61  outputRoot directory using a symlink named "_parent"; writes go into the
62  outputRoot while reads can come from either the root or outputRoot. As
63  outputRoots are used as inputs for further processing, the chain of
64  _parent links allows any dataset to be retrieved. Note that writing to a
65  dataset present in the input root will hide the existing dataset but not
66  overwrite it. See #2160 for design discussion.
67 
68  A camera is assumed to consist of one or more rafts, each composed of
69  multiple CCDs. Each CCD is in turn composed of one or more amplifiers
70  (amps). A camera is also assumed to have a camera geometry description
71  (CameraGeom object) as a policy file, a filter description (Filter class
72  static configuration) as another policy file, and an optional defects
73  description directory.
74 
75  Information from the camera geometry and defects is inserted into all
76  Exposure objects returned.
77 
78  The mapper uses one or two registries to retrieve metadata about the
79  images. The first is a registry of all raw exposures. This must contain
80  the time of the observation. One or more tables (or the equivalent)
81  within the registry are used to look up data identifier components that
82  are not specified by the user (e.g. filter) and to return results for
83  metadata queries. The second is an optional registry of all calibration
84  data. This should contain validity start and end entries for each
85  calibration dataset in the same timescale as the observation time.
86 
87  The following method must be provided by the subclass:
88 
89  _extractDetectorName(self, dataId): returns the detector name for a CCD
90  (e.g., "CFHT 21", "R:1,2 S:3,4") as used in the AFW CameraGeom class given
91  a dataset identifier referring to that CCD or a subcomponent of it.
92 
93  Other methods that the subclass may wish to override include:
94 
95  _transformId(self, dataId): transformation of a data identifier
96  from colloquial usage (e.g., "ccdname") to proper/actual usage (e.g., "ccd"),
97  including making it suitable for path expansion (e.g., removing commas).
98  The default implementation does nothing. Note that this
99  method should not modify its input parameter.
100 
101  getShortCcdName(ccdName): a static method that returns a shortened name
102  suitable for use as a filename. The default version converts spaces to underscores.
103 
104  _getCcdKeyVal(self, dataId): return a CCD key and value
105  by which to look up defects in the defects registry.
106  The default implementation returns ("ccd", detector name).
107 
108  _mapActualToPath(self, template, actualId): convert a template path to an
109  actual path, using the actual dataset identifier.
110 
111  The mapper's behaviors are largely specified by the policy file.
112  See the MapperDictionary.paf for descriptions of the available items.
113 
114  The 'exposures', 'calibrations', and 'datasets' subpolicies configure
115  mappings (see Mappings class).
116 
117  Common default mappings for all subclasses can be specified in the
118  "policy/{images,exposures,calibrations,datasets}.yaml" files. This provides
119  a simple way to add a product to all camera mappers.
120 
121  Functions to map (provide a path to the data given a dataset
122  identifier dictionary) and standardize (convert data into some standard
123  format or type) may be provided in the subclass as "map_{dataset type}"
124  and "std_{dataset type}", respectively.
125 
126  If non-Exposure datasets cannot be retrieved using standard
127  daf_persistence methods alone, a "bypass_{dataset type}" function may be
128  provided in the subclass to return the dataset instead of using the
129  "datasets" subpolicy.
130 
131  Implementations of map_camera and bypass_camera that should typically be
132  sufficient are provided in this base class.
133 
134  @todo
135  * Handle defects the same way as all other calibration products, using the calibration registry
136  * Instead of auto-loading the camera at construction time, load it from the calibration registry
137  * Rewrite defects as AFW tables so we don't need pyfits to unpersist them; then remove all mention
138  of pyfits from this package.
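
 An illustrative minimal subclass (the "obs_example" package, policy file, and
 data ID keys below are hypothetical) might look like:

     class ExampleMapper(CameraMapper):
         packageName = "obs_example"  # must be set by every subclass

         def __init__(self, **kwargs):
             policyFile = dafPersist.Policy.defaultPolicyFile(
                 "obs_example", "ExampleMapper.yaml", "policy")
             policy = dafPersist.Policy(policyFile)
             repositoryDir = os.path.join(getPackageDir("obs_example"), "policy")
             CameraMapper.__init__(self, policy, repositoryDir, **kwargs)

         def _extractDetectorName(self, dataId):
             # return the lsst.afw.cameraGeom detector name for this CCD
             return "ccd%02d" % (dataId["ccd"],)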
139  """
140  packageName = None
141 
142  def __init__(self, policy, repositoryDir,
143  root=None, registry=None, calibRoot=None, calibRegistry=None,
144  provided=None, outputRoot=None):
145  """Initialize the CameraMapper.
146  @param policy (daf_persistence.Policy, or pexPolicy.Policy (only for backward compatibility))
147  Policy with per-camera defaults already merged
148  @param repositoryDir (string) Policy repository for the subclassing
149  module (obtained with getRepositoryPath() on the
150  per-camera default dictionary)
151  @param root (string) Root directory for data
152  @param registry (string) Path to registry with data's metadata
153  @param calibRoot (string) Root directory for calibrations
154  @param calibRegistry (string) Path to registry with calibrations'
155  metadata
156  @param provided (list of strings) Keys provided by the mapper
157  @param outputRoot (string) Root directory for output data
158  """
159 
160  dafPersist.Mapper.__init__(self)
161 
162  self.log = lsstLog.Log.getLogger("CameraMapper")
163 
164  self.root = root
165  if isinstance(policy, pexPolicy.Policy):
166  policy = dafPersist.Policy(policy)
167 
168  repoPolicy = CameraMapper.getRepoPolicy(self.root, self.root)
169  if repoPolicy is not None:
170  policy.update(repoPolicy)
171 
172  defaultPolicyFile = dafPersist.Policy.defaultPolicyFile("daf_butlerUtils",
173  "MapperDictionary.paf",
174  "policy")
175  dictPolicy = dafPersist.Policy(defaultPolicyFile)
176  policy.merge(dictPolicy)
177 
178  # Levels
179  self.levels = dict()
180  if 'levels' in policy:
181  levelsPolicy = policy['levels']
182  for key in levelsPolicy.names(True):
183  self.levels[key] = set(levelsPolicy.asArray(key))
184  self.defaultLevel = policy['defaultLevel']
185  self.defaultSubLevels = dict()
186  if 'defaultSubLevels' in policy:
187  self.defaultSubLevels = policy['defaultSubLevels']
188 
189  # Root directories
190  if root is None:
191  root = "."
192  root = dafPersist.LogicalLocation(root).locString()
193 
194  if outputRoot is not None and os.path.abspath(outputRoot) != os.path.abspath(root):
195  # Path manipulations are subject to race condition
196  if not os.path.exists(outputRoot):
197  try:
198  os.makedirs(outputRoot)
199  except OSError as e:
200  if not e.errno == errno.EEXIST:
201  raise
202  if not os.path.exists(outputRoot):
203  raise RuntimeError("Unable to create output repository '%s'" % (outputRoot,))
204  if os.path.exists(root):
205  # Symlink existing input root to "_parent" in outputRoot.
206  src = os.path.abspath(root)
207  dst = os.path.join(outputRoot, "_parent")
208  if not os.path.exists(dst):
209  try:
210  os.symlink(src, dst)
211  except OSError:
212  pass
213  if os.path.exists(dst):
214  if os.path.realpath(dst) != os.path.realpath(src):
215  raise RuntimeError("Output repository path "
216  "'%s' already exists and differs from "
217  "input repository path '%s'" % (dst, src))
218  else:
219  raise RuntimeError("Unable to symlink from input "
220  "repository path '%s' to output repository "
221  "path '%s'" % (src, dst))
222  # We now use the outputRoot as the main root with access to the
223  # input via "_parent".
224  root = outputRoot
225 
226  if calibRoot is None:
227  if 'calibRoot' in policy:
228  calibRoot = policy['calibRoot']
229  calibRoot = dafPersist.LogicalLocation(calibRoot).locString()
230  else:
231  calibRoot = root
232 
233  if not os.path.exists(root):
234  self.log.warn("Root directory not found: %s", root)
235  if not os.path.exists(calibRoot):
236  self.log.warn("Calibration root directory not found: %s", calibRoot)
237 
238  self.root = root
239 
240  # Registries
241  self.registry = self._setupRegistry("registry", registry, policy, "registryPath", root)
242  if 'needCalibRegistry' in policy and policy['needCalibRegistry']:
243  calibRegistry = self._setupRegistry("calibRegistry", calibRegistry, policy,
244  "calibRegistryPath", calibRoot)
245  else:
246  calibRegistry = None
247 
248  # Sub-dictionaries (for exposure/calibration/dataset types)
249  imgMappingPolicy = dafPersist.Policy(dafPersist.Policy.defaultPolicyFile(
250  "daf_butlerUtils", "ImageMappingDictionary.paf", "policy"))
251  expMappingPolicy = dafPersist.Policy(dafPersist.Policy.defaultPolicyFile(
252  "daf_butlerUtils", "ExposureMappingDictionary.paf", "policy"))
253  calMappingPolicy = dafPersist.Policy(dafPersist.Policy.defaultPolicyFile(
254  "daf_butlerUtils", "CalibrationMappingDictionary.paf", "policy"))
255  dsMappingPolicy = dafPersist.Policy(dafPersist.Policy.defaultPolicyFile(
256  "daf_butlerUtils", "DatasetMappingDictionary.paf", "policy"))
257 
258  # Dict of valid keys and their value types
259  self.keyDict = dict()
260 
261  # Mappings
262  mappingList = (
263  ("images", imgMappingPolicy, ImageMapping),
264  ("exposures", expMappingPolicy, ExposureMapping),
265  ("calibrations", calMappingPolicy, CalibrationMapping),
266  ("datasets", dsMappingPolicy, DatasetMapping)
267  )
268  self.mappings = dict()
269  for name, defPolicy, cls in mappingList:
270  if name in policy:
271  datasets = policy[name]
272 
273  # Centrally-defined datasets
274  defaultsPath = os.path.join(getPackageDir("daf_butlerUtils"), "policy", name + ".yaml")
275  if os.path.exists(defaultsPath):
276  datasets.merge(dafPersist.Policy(defaultsPath))
277 
278  mappings = dict()
279  setattr(self, name, mappings)
280  for datasetType in datasets.names(True):
281  subPolicy = datasets[datasetType]
282  subPolicy.merge(defPolicy)
283  if name == "calibrations":
284  mapping = cls(datasetType, subPolicy, self.registry, calibRegistry, calibRoot,
285  provided=provided)
286  else:
287  mapping = cls(datasetType, subPolicy, self.registry, root, provided=provided)
288  self.keyDict.update(mapping.keys())
289  mappings[datasetType] = mapping
290  self.mappings[datasetType] = mapping
291  if not hasattr(self, "map_" + datasetType):
292  def mapClosure(dataId, write=False,
293  mapper=weakref.proxy(self), mapping=mapping):
294  return mapping.map(mapper, dataId, write)
295  setattr(self, "map_" + datasetType, mapClosure)
296  if not hasattr(self, "query_" + datasetType):
297  def queryClosure(format, dataId, mapping=mapping):
298  return mapping.lookup(format, dataId)
299  setattr(self, "query_" + datasetType, queryClosure)
300  if hasattr(mapping, "standardize") and \
301  not hasattr(self, "std_" + datasetType):
302  def stdClosure(item, dataId,
303  mapper=weakref.proxy(self), mapping=mapping):
304  return mapping.standardize(mapper, item, dataId)
305  setattr(self, "std_" + datasetType, stdClosure)
306 
307  mapFunc = "map_" + datasetType + "_filename"
308  bypassFunc = "bypass_" + datasetType + "_filename"
309  if not hasattr(self, mapFunc):
310  setattr(self, mapFunc, getattr(self, "map_" + datasetType))
311  if not hasattr(self, bypassFunc):
312  setattr(self, bypassFunc,
313  lambda datasetType, pythonType, location, dataId: location.getLocations())
314 
315  # Set up metadata versions
316  if name == "exposures" or name == "images":
317  expFunc = "map_" + datasetType # Function name to map exposure
318  mdFunc = expFunc + "_md" # Function name to map metadata
319  bypassFunc = "bypass_" + datasetType + "_md" # Func name to bypass daf_persistence
320  if not hasattr(self, mdFunc):
321  setattr(self, mdFunc, getattr(self, expFunc))
322  if not hasattr(self, bypassFunc):
323  setattr(self, bypassFunc,
324  lambda datasetType, pythonType, location, dataId:
325  afwImage.readMetadata(location.getLocations()[0]))
326  if not hasattr(self, "query_" + datasetType + "_md"):
327  setattr(self, "query_" + datasetType + "_md",
328  getattr(self, "query_" + datasetType))
329 
330  subFunc = expFunc + "_sub" # Function name to map subimage
331  if not hasattr(self, subFunc):
332  def mapSubClosure(dataId, write=False, mapper=weakref.proxy(self),
333  mapping=mapping):
334  subId = dataId.copy()
335  del subId['bbox']
336  loc = mapping.map(mapper, subId, write)
337  bbox = dataId['bbox']
338  llcX = bbox.getMinX()
339  llcY = bbox.getMinY()
340  width = bbox.getWidth()
341  height = bbox.getHeight()
342  loc.additionalData.set('llcX', llcX)
343  loc.additionalData.set('llcY', llcY)
344  loc.additionalData.set('width', width)
345  loc.additionalData.set('height', height)
346  if 'imageOrigin' in dataId:
347  loc.additionalData.set('imageOrigin',
348  dataId['imageOrigin'])
349  return loc
350  setattr(self, subFunc, mapSubClosure)
351  if not hasattr(self, "query_" + datasetType + "_sub"):
352  def querySubClosure(key, format, dataId, mapping=mapping):
353  subId = dataId.copy()
354  del subId['bbox']
355  return mapping.lookup(format, subId)
356  setattr(self, "query_" + datasetType + "_sub", querySubClosure)
357 
358  # Camera geometry
359  self.cameraDataLocation = None # path to camera geometry config file
360  self.camera = self._makeCamera(policy=policy, repositoryDir=repositoryDir)
361 
362  # Defect registry and root
363  self.defectRegistry = None
364  if 'defects' in policy:
365  self.defectPath = os.path.join(repositoryDir, policy['defects'])
366  defectRegistryLocation = os.path.join(self.defectPath, "defectRegistry.sqlite3")
367  self.defectRegistry = dafPersist.Registry.create(defectRegistryLocation)
368 
369  # Filter translation table
370  self.filters = None
371 
372  # Skytile policy
373  self.skypolicy = policy['skytiles']
374 
375  # verify that the class variable packageName is set before attempting
376  # to instantiate an instance
377  if self.packageName is None:
378  raise ValueError('class variable packageName must not be None')
379 
380  @staticmethod
381  def getRepoPolicy(root, repos):
382  """Get the policy stored in a repo (specified by 'root'), if there is one.
383 
384  @param root (string) path to the root location of the repository
385  @param repos (string) path from the root of the repo to the folder containing a file named
386  _policy.paf or _policy.yaml
387  @return (lsst.daf.persistence.Policy or None) A Policy instantiated with the policy found according to
388  input variables, or None if a policy file was not found.
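
 For example, the constructor calls this with the repository root for both
 arguments; for a hypothetical root "/data/repo" containing "_policy.yaml":

     repoPolicy = CameraMapper.getRepoPolicy("/data/repo", "/data/repo")

 loads and returns the policy from "/data/repo/_policy.yaml" (a "_policy.paf"
 file is used only if no .yaml file is present), or None if neither exists.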
389  """
390  policy = None
391  if root is not None:
392  paths = CameraMapper.parentSearch(root, os.path.join(repos, '_policy.*'))
393  if paths is not None:
394  for postfix in ('.yaml', '.paf'):
395  matches = [path for path in paths if (os.path.splitext(path))[1] == postfix]
396  if len(matches) > 1:
397  raise RuntimeError("More than 1 policy possibility for root:%s" % root)
398  elif len(matches) == 1:
399  policy = dafPersist.Policy(matches[0])
400  break
401  return policy
402 
403  def _parentSearch(self, path):
404  return CameraMapper.parentSearch(self.root, path)
405 
406  @staticmethod
407  def parentSearch(root, path):
408  """Look for the given path in the current root or any of its parents
409  by following "_parent" symlinks; return None if it can't be found. A
410  little tricky because the path may be in an alias of the root (e.g.
411  ".") and because the "_parent" links go between the root and the rest
412  of the path.
413  If the path contains an HDU indicator (a number in brackets, e.g.
414  'foo.fits[1]'), the indicator is stripped when searching, so the search
415  will match filenames without it, e.g. 'foo.fits'. The path returned
416  WILL contain the indicator though, e.g. ['foo.fits[1]'].
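
 For example (hypothetical paths), if "raw/foo.fits" exists only in the input
 repository linked from the output repository as "_parent":

     CameraMapper.parentSearch("/out", "/out/raw/foo.fits[1]")
     # returns ["/out/_parent/raw/foo.fits[1]"]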
417  """
418  # Separate path into a root-equivalent prefix (in dir) and the rest
419  # (left in path)
420 
421  rootDir = root
422  # First remove trailing slashes (#2527)
423  while len(rootDir) > 1 and rootDir[-1] == '/':
424  rootDir = rootDir[:-1]
425 
426  if path.startswith(rootDir + "/"):
427  # Common case; we have the same root prefix string
428  path = path[len(rootDir)+1:]
429  dir = rootDir
430  elif rootDir == "/" and path.startswith("/"):
431  path = path[1:]
432  dir = rootDir
433  else:
434  # Search for prefix that is the same as root
435  pathPrefix = os.path.dirname(path)
436  while pathPrefix != "" and pathPrefix != "/":
437  if os.path.realpath(pathPrefix) == os.path.realpath(root):
438  break
439  pathPrefix = os.path.dirname(pathPrefix)
440  if os.path.realpath(pathPrefix) != os.path.realpath(root):
441  # No prefix matching root, don't search for parents
442  paths = glob.glob(path)
443 
444  # The contract states that `None` will be returned
445  # if no matches are found.
446  # Thus we explicitly set up this if/else to return `None`
447  # if `not paths` instead of `[]`.
448  # An argument could be made that the contract should be changed
449  if paths:
450  return paths
451  else:
452  return None
453  if pathPrefix == "/":
454  path = path[1:]
455  elif pathPrefix != "":
456  path = path[len(pathPrefix)+1:]
457  # If pathPrefix == "", then the current directory is the root
458  dir = pathPrefix
459 
460  # Now search for the path in the root or its parents
461  # Strip off any cfitsio bracketed extension if present
462  strippedPath = path
463  pathStripped = None
464  firstBracket = path.find("[")
465  if firstBracket != -1:
466  strippedPath = path[:firstBracket]
467  pathStripped = path[firstBracket:]
468 
469  while True:
470  paths = glob.glob(os.path.join(dir, strippedPath))
471  if len(paths) > 0:
472  if pathStripped is not None:
473  paths = [p + pathStripped for p in paths]
474  return paths
475  dir = os.path.join(dir, "_parent")
476  if not os.path.exists(dir):
477  return None
478 
479  def backup(self, datasetType, dataId):
480  """Rename any existing object with the given type and dataId.
481 
482  The CameraMapper implementation saves objects in a sequence of e.g.:
483  foo.fits
484  foo.fits~1
485  foo.fits~2
486  All of the backups will be placed in the output repo, however, and will
487  not be removed if they are found elsewhere in the _parent chain. This
488  means that the same file will be stored twice if the previous version was
489  found in an input repo.
490  """
491  def firstElement(list):
492  """Get the first element in the list, or None if that can't be done.
493  """
494  return list[0] if list is not None and len(list) else None
495 
496  n = 0
497  newLocation = self.map(datasetType, dataId, write=True)
498  newPath = newLocation.getLocations()[0]
499  path = self._parentSearch(newPath)
500  path = firstElement(path)
501  oldPaths = []
502  while path is not None:
503  n += 1
504  oldPaths.append((n, path))
505  path = self._parentSearch("%s~%d" % (newPath, n))
506  path = firstElement(path)
507  for n, oldPath in reversed(oldPaths):
508  newDir, newFile = os.path.split(newPath)
509  if not os.path.exists(newDir):
510  os.makedirs(newDir)
511  shutil.copy(oldPath, "%s~%d" % (newPath, n))
512 
513  def keys(self):
514  """Return supported keys.
515  @return (iterable) List of keys usable in a dataset identifier"""
516  return iter(self.keyDict.keys())
517 
518  def getKeys(self, datasetType, level):
519  """Return supported keys and their value types for a given dataset
520  type at a given level of the key hierarchy.
521 
522  @param datasetType (str) dataset type or None for all keys
523  @param level (str) level or None for all levels
524  @return (iterable) Set of keys usable in a dataset identifier"""
525 
526  # not sure if this is how we want to do this. what if None was intended?
527  if level == '':
528  level = self.getDefaultLevel()
529 
530  if datasetType is None:
531  keyDict = copy.copy(self.keyDict)
532  else:
533  keyDict = self.mappings[datasetType].keys()
534  if level is not None and level in self.levels:
535  keyDict = copy.copy(keyDict)
536  for l in self.levels[level]:
537  if l in keyDict:
538  del keyDict[l]
539  return keyDict
540 
541  def getDefaultLevel(self):
542  return self.defaultLevel
543 
544  def getDefaultSubLevel(self, level):
545  if level in self.defaultSubLevels:
546  return self.defaultSubLevels[level]
547  return None
548 
549  @classmethod
550  def getCameraName(cls):
551  """Return the name of the camera that this CameraMapper is for."""
552  className = str(cls)
553  className = className[className.find('.'):-1]
554  m = re.search(r'(\w+)Mapper', className)
555  if m is None:
556  m = re.search(r"class '[\w.]*?(\w+)'", className)
557  name = m.group(1)
558  return name[:1].lower() + name[1:] if name else ''
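 # For example, a hypothetical subclass lsst.obs.example.exampleMapper.ExampleMapper
 # yields getCameraName() == "example".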
559 
560  @classmethod
561  def getPackageName(cls):
562  """Return the name of the package containing this CameraMapper."""
563  if cls.packageName is None:
564  raise ValueError('class variable packageName must not be None')
565  return cls.packageName
566 
567  def map_camera(self, dataId, write=False):
568  """Map a camera dataset."""
569  if self.camera is None:
570  raise RuntimeError("No camera dataset available.")
571  actualId = self._transformId(dataId)
572  return dafPersist.ButlerLocation(
573  pythonType="lsst.afw.cameraGeom.CameraConfig",
574  cppType="Config",
575  storageName="ConfigStorage",
576  locationList=self.cameraDataLocation or "ignored",
577  dataId=actualId,
578  mapper=self
579  )
580 
581  def bypass_camera(self, datasetType, pythonType, butlerLocation, dataId):
582  """Return the (preloaded) camera object.
583  """
584  if self.camera is None:
585  raise RuntimeError("No camera dataset available.")
586  return self.camera
587 
588  def map_defects(self, dataId, write=False):
589  """Map defects dataset.
590 
591  @return a very minimal ButlerLocation containing just the locationList field
592  (just enough information that bypass_defects can use it).
593  """
594  defectFitsPath = self._defectLookup(dataId=dataId)
595  if defectFitsPath is None:
596  raise RuntimeError("No defects available for dataId=%s" % (dataId,))
597 
598  return dafPersist.ButlerLocation(None, None, None, defectFitsPath, dataId, self)
599 
600  def bypass_defects(self, datasetType, pythonType, butlerLocation, dataId):
601  """Return a defect based on the butler location returned by map_defects
602 
603  @param[in] butlerLocation: a ButlerLocation with locationList = path to defects FITS file
604  @param[in] dataId: the usual data ID; "ccd" must be set
605 
606  Note: the name "bypass_XXX" means the butler makes no attempt to convert the ButlerLocation
607  into an object, which is what we want for now, since that conversion is a bit tricky.
608  """
609  detectorName = self._extractDetectorName(dataId)
610  defectsFitsPath = butlerLocation.locationList[0]
611  with pyfits.open(defectsFitsPath) as hduList:
612  for hdu in hduList[1:]:
613  if hdu.header["name"] != detectorName:
614  continue
615 
616  defectList = []
617  for data in hdu.data:
618  bbox = afwGeom.Box2I(
619  afwGeom.Point2I(int(data['x0']), int(data['y0'])),
620  afwGeom.Extent2I(int(data['width']), int(data['height'])),
621  )
622  defectList.append(afwImage.DefectBase(bbox))
623  return defectList
624 
625  raise RuntimeError("No defects for ccd %s in %s" % (detectorName, defectsFitsPath))
626 
627  def map_expIdInfo(self, dataId, write=False):
628  return dafPersist.ButlerLocation(
629  pythonType="lsst.daf.butlerUtils.ExposureIdInfo",
630  cppType=None,
631  storageName="Internal",
632  locationList="ignored",
633  dataId=dataId,
634  mapper=self,
635  )
636 
637  def bypass_expIdInfo(self, datasetType, pythonType, location, dataId):
638  """Hook to retrieve an lsst.daf.butlerUtils.ExposureIdInfo for an exposure"""
639  expId = self.bypass_ccdExposureId(datasetType, pythonType, location, dataId)
640  expBits = self.bypass_ccdExposureId_bits(datasetType, pythonType, location, dataId)
641  return ExposureIdInfo(expId=expId, expBits=expBits)
642 
643  def std_raw(self, item, dataId):
644  """Standardize a raw dataset by converting it to an Exposure instead of an Image"""
645  item = exposureFromImage(item)
646  return self._standardizeExposure(self.exposures['raw'], item, dataId,
647  trimmed=False)
648 
649  def map_skypolicy(self, dataId):
650  """Map a sky policy."""
651  return dafPersist.ButlerLocation("lsst.pex.policy.Policy", "Policy",
652  "Internal", None, None, self)
653 
654  def std_skypolicy(self, item, dataId):
655  """Standardize a sky policy by returning the one we use."""
656  return self.skypolicy
657 
658 ###############################################################################
659 #
660 # Utility functions
661 #
662 ###############################################################################
663 
664  def _getCcdKeyVal(self, dataId):
665  """Return CCD key and value used to look up a defect in the defect registry
666 
667  The default implementation simply returns ("ccd", full detector name)
668  """
669  return ("ccd", self._extractDetectorName(dataId))
670 
671  def _setupRegistry(self, name, path, policy, policyKey, root):
672  """Set up a registry (usually SQLite3), trying a number of possible
673  paths.
674  @param name (string) Name of registry
675  @param path (string) Path for registry
676  @param policyKey (string) Key in policy for registry path
677  @param root (string) Root directory to look in
678  @return (lsst.daf.persistence.Registry) Registry object"""
679 
680  if path is None and policyKey in policy:
681  path = dafPersist.LogicalLocation(policy[policyKey]).locString()
682  if not os.path.exists(path):
683  if not os.path.isabs(path) and root is not None:
684  newPath = self._parentSearch(os.path.join(root, path))
685  newPath = newPath[0] if newPath is not None and len(newPath) else None
686  if newPath is None:
687  self.log.warn("Unable to locate registry at policy path (also looked in root): %s",
688  path)
689  path = newPath
690  else:
691  self.log.warn("Unable to locate registry at policy path: %s", path)
692  path = None
693 
694  # determine if there is an sqlite registry and if not, try the posix registry.
695  registry = None
696 
697  if path is None and root is not None:
698  path = os.path.join(root, "%s.sqlite3" % name)
699  newPath = self._parentSearch(path)
700  newPath = newPath[0] if newPath is not None and len(newPath) else None
701  if newPath is None:
702  self.log.info("Unable to locate %s registry in root: %s", name, path)
703  path = newPath
704  if path is None:
705  path = os.path.join(".", "%s.sqlite3" % name)
706  newPath = self._parentSearch(path)
707  newPath = newPath[0] if newPath is not None and len(newPath) else None
708  if newPath is None:
709  self.log.info("Unable to locate %s registry in current dir: %s", name, path)
710  path = newPath
711  if path is not None:
712  if not os.path.exists(path):
713  newPath = self._parentSearch(path)
714  newPath = newPath[0] if newPath is not None and len(newPath) else None
715  if newPath is not None:
716  path = newPath
717  self.log.info("Loading %s registry from %s", name, path)
718  registry = dafPersist.Registry.create(path)
719  elif not registry and os.path.exists(root):
720  self.log.info("Loading Posix registry from %s", root)
721  registry = dafPersist.PosixRegistry(root)
722 
723  return registry
724 
725  def _transformId(self, dataId):
726  """Generate a standard ID dict from a camera-specific ID dict.
727 
728  Canonical keys include:
729  - amp: amplifier name
730  - ccd: CCD name (in LSST this is a combination of raft and sensor)
731  The default implementation returns a copy of its input.
732 
733  @param dataId[in] (dict) Dataset identifier; this must not be modified
734  @return (dict) Transformed dataset identifier"""
735 
736  return dataId.copy()
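 # An illustrative subclass override (the "ccdName" key is hypothetical) that
 # maps a colloquial key onto the canonical "ccd" key and strips commas so the
 # value is safe for path expansion:
 #
 #     def _transformId(self, dataId):
 #         actualId = dataId.copy()
 #         if "ccdName" in actualId:
 #             actualId["ccd"] = actualId.pop("ccdName").replace(",", "")
 #         return actualId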
737 
738  def _mapActualToPath(self, template, actualId):
739  """Convert a template path to an actual path, using the actual data
740  identifier. This implementation is usually sufficient but can be
741  overridden by the subclass.
742  @param template (string) Template path
743  @param actualId (dict) Dataset identifier
744  @return (string) Pathname"""
745 
746  try:
747  transformedId = self._transformId(actualId)
748  return template % transformedId
749  except Exception as e:
750  raise RuntimeError("Failed to format %r with data %r: %s" % (template, transformedId, e))
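 # For example (hypothetical template and data ID):
 #
 #     self._mapActualToPath("raw/v%(visit)d/%(ccd)s.fits",
 #                           {"visit": 123, "ccd": "R12_S01"})
 #     # -> "raw/v123/R12_S01.fits"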
751 
752  @staticmethod
753  def getShortCcdName(ccdName):
754  """Convert a CCD name to a form useful as a filename
755 
756  The default implementation converts spaces to underscores.
757  """
758  return ccdName.replace(" ", "_")
759 
760  def _extractDetectorName(self, dataId):
761  """Extract the detector (CCD) name from the dataset identifier.
762 
763  The name in question is the detector name used by lsst.afw.cameraGeom.
764 
765  @param dataId (dict) Dataset identifier
766  @return (string) Detector name
767  """
768  raise NotImplementedError("No _extractDetectorName() function specified")
769 
770  def _extractAmpId(self, dataId):
771  """Extract the amplifier identifier from a dataset identifier.
772 
773  @warning this is deprecated; DO NOT USE IT
774 
775  The amplifier identifier has two parts: the detector name of the CCD
776  containing the amplifier and the index of the amplifier within the detector.
777  @param dataId (dict) Dataset identifier
778  @return (tuple) Amplifier identifier"""
779 
780  trDataId = self._transformId(dataId)
781  return (trDataId["ccd"], int(trDataId['amp']))
782 
783  def _setAmpDetector(self, item, dataId, trimmed=True):
784  """Set the detector object in an Exposure for an amplifier.
785  Defects are also added to the Exposure based on the detector object.
786  @param[in,out] item (lsst.afw.image.Exposure)
787  @param dataId (dict) Dataset identifier
788  @param trimmed (bool) Should detector be marked as trimmed? (ignored)"""
789 
790  return self._setCcdDetector(item=item, dataId=dataId, trimmed=trimmed)
791 
792  def _setCcdDetector(self, item, dataId, trimmed=True):
793  """Set the detector object in an Exposure for a CCD.
794  @param[in,out] item (lsst.afw.image.Exposure)
795  @param dataId (dict) Dataset identifier
796  @param trimmed (bool) Should detector be marked as trimmed? (ignored)"""
797 
798  detectorName = self._extractDetectorName(dataId)
799  detector = self.camera[detectorName]
800  item.setDetector(detector)
801 
802  def _setFilter(self, mapping, item, dataId):
803  """Set the filter object in an Exposure. If the Exposure had a FILTER
804  keyword, this was already processed during load. But if it didn't,
805  use the filter from the registry.
806  @param mapping (lsst.daf.butlerUtils.Mapping)
807  @param[in,out] item (lsst.afw.image.Exposure)
808  @param dataId (dict) Dataset identifier"""
809 
810  if not (isinstance(item, afwImage.ExposureU) or isinstance(item, afwImage.ExposureI) or
811  isinstance(item, afwImage.ExposureF) or isinstance(item, afwImage.ExposureD)):
812  return
813 
814  actualId = mapping.need(['filter'], dataId)
815  filterName = actualId['filter']
816  if self.filters is not None and filterName in self.filters:
817  filterName = self.filters[filterName]
818  item.setFilter(afwImage.Filter(filterName))
819 
820  def _setTimes(self, mapping, item, dataId):
821  """Set the exposure time and exposure midpoint in the calib object in
822  an Exposure. Use the EXPTIME and MJD-OBS keywords (and strip out
823  EXPTIME).
824  @param mapping (lsst.daf.butlerUtils.Mapping)
825  @param[in,out] item (lsst.afw.image.Exposure)
826  @param dataId (dict) Dataset identifier"""
827 
828  md = item.getMetadata()
829  calib = item.getCalib()
830  if md.exists("EXPTIME"):
831  expTime = md.get("EXPTIME")
832  calib.setExptime(expTime)
833  md.remove("EXPTIME")
834  else:
835  expTime = calib.getExptime()
836  if md.exists("MJD-OBS"):
837  obsStart = dafBase.DateTime(md.get("MJD-OBS"),
838  dafBase.DateTime.MJD, dafBase.DateTime.UTC)
839  obsMidpoint = obsStart.nsecs() + long(expTime * 1000000000 / 2)
840  calib.setMidTime(dafBase.DateTime(obsMidpoint))
841 
842  # Default standardization function for exposures
843  def _standardizeExposure(self, mapping, item, dataId, filter=True,
844  trimmed=True):
845  """Default standardization function for images.
846  @param mapping (lsst.daf.butlerUtils.Mapping)
847  @param[in,out] item (lsst.afw.image.Exposure)
848  @param dataId (dict) Dataset identifier
849  @param filter (bool) Set filter?
850  @param trimmed (bool) Should detector be marked as trimmed?
851  @return (lsst.afw.image.Exposure) the standardized Exposure"""
852 
853  if (re.search(r'Exposure', mapping.python) and re.search(r'Image', mapping.persistable)):
854  item = exposureFromImage(item)
855 
856  if mapping.level.lower() == "amp":
857  self._setAmpDetector(item, dataId, trimmed)
858  elif mapping.level.lower() == "ccd":
859  self._setCcdDetector(item, dataId, trimmed)
860 
861  if filter:
862  self._setFilter(mapping, item, dataId)
863  if not isinstance(mapping, CalibrationMapping):
864  self._setTimes(mapping, item, dataId)
865 
866  return item
867 
868  def _defectLookup(self, dataId):
869  """Find the defects for a given CCD.
870  @param dataId (dict) Dataset identifier
871  @return (string) path to the defects file or None if not available"""
872  if self.defectRegistry is None:
873  return None
874  if self.registry is None:
875  raise RuntimeError("No registry for defect lookup")
876 
877  ccdKey, ccdVal = self._getCcdKeyVal(dataId)
878 
879  dataIdForLookup = {'visit': dataId['visit']}
880  # .lookup will fail in a posix registry because there is no template to provide.
881  rows = self.registry.lookup(('taiObs'), ('raw_visit'), dataIdForLookup)
882  if len(rows) == 0:
883  return None
884  assert len(rows) == 1
885  taiObs = rows[0][0]
886 
887  # Look up the defects for this CCD serial number that are valid at the exposure midpoint.
888  rows = self.defectRegistry.executeQuery(("path",), ("defect",),
889  [(ccdKey, "?")],
890  ("DATETIME(?)", "DATETIME(validStart)", "DATETIME(validEnd)"),
891  (ccdVal, taiObs))
892  if not rows or len(rows) == 0:
893  return None
894  if len(rows) == 1:
895  return os.path.join(self.defectPath, rows[0][0])
896  else:
897  raise RuntimeError("Querying for defects (%s, %s) returns %d files: %s" %
898  (ccdVal, taiObs, len(rows), ", ".join([_[0] for _ in rows])))
899 
900  def _makeCamera(self, policy, repositoryDir):
901  """Make a camera (instance of lsst.afw.cameraGeom.Camera) describing the camera geometry
902 
903  Also set self.cameraDataLocation, if relevant (else it can be left None).
904 
905  This implementation assumes that policy contains an entry "camera" that points to the
906  subdirectory in this package of camera data; specifically, that subdirectory must contain:
907  - a file named `camera.py` that contains persisted camera config
908  - ampInfo table FITS files, as required by lsst.afw.cameraGeom.makeCameraFromPath
909 
910  @param policy (daf_persistence.Policy, or pexPolicy.Policy (only for backward compatibility))
911  Policy with per-camera defaults already merged
912  @param repositoryDir (string) Policy repository for the subclassing
913  module (obtained with getRepositoryPath() on the
914  per-camera default dictionary)
915  """
916  if isinstance(policy, pexPolicy.Policy):
917  policy = dafPersist.Policy(pexPolicy=policy)
918  if 'camera' not in policy:
919  raise RuntimeError("Cannot find 'camera' in policy; cannot construct a camera")
920  cameraDataSubdir = policy['camera']
921  self.cameraDataLocation = os.path.normpath(
922  os.path.join(repositoryDir, cameraDataSubdir, "camera.py"))
923  cameraConfig = afwCameraGeom.CameraConfig()
924  cameraConfig.load(self.cameraDataLocation)
925  ampInfoPath = os.path.dirname(self.cameraDataLocation)
926  return afwCameraGeom.makeCameraFromPath(
927  cameraConfig=cameraConfig,
928  ampInfoPath=ampInfoPath,
929  shortNameFunc=self.getShortCcdName
930  )
931 
932 
933 def exposureFromImage(image):
934  """Generate an exposure from a DecoratedImage or similar
935  @param[in] image Image of interest
936  @return (lsst.afw.image.Exposure) Exposure containing input image
937  """
938  if isinstance(image, afwImage.DecoratedImageU) or isinstance(image, afwImage.DecoratedImageI) or \
939  isinstance(image, afwImage.DecoratedImageF) or isinstance(image, afwImage.DecoratedImageD):
940  exposure = afwImage.makeExposure(afwImage.makeMaskedImage(image.getImage()))
941  else:
942  exposure = image
943  md = image.getMetadata()
944  exposure.setMetadata(md)
945  wcs = afwImage.makeWcs(md, True)
946  exposure.setWcs(wcs)
947 
948  return exposure
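
# Illustrative use of exposureFromImage (the FITS path is hypothetical): a
# DecoratedImage read from disk is wrapped in an Exposure, with its metadata
# and WCS attached:
#
#     dimg = afwImage.DecoratedImageF("raw/v123/R12_S01.fits")
#     exposure = exposureFromImage(dimg)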