lsst.daf.persistence.butler.Butler Class Reference
Inheritance diagram for lsst.daf.persistence.butler.Butler:

Public Member Functions

def __init__
 
def defineAlias
 
def getKeys
 
def queryMetadata
 
def datasetExists
 
def get
 
def put
 
def subset
 
def dataRef
 
def __reduce__
 

Static Public Member Functions

def getMapperClass
 

Public Attributes

 datasetTypeAliasDict
 
 mapper
 
 persistence
 
 log
 

Private Member Functions

def _combineDicts
 
def _read
 
def _resolveDatasetTypeAlias
 

Detailed Description

Butler provides a generic mechanism for persisting and retrieving data using mappers.

A Butler manages a collection of datasets known as a repository.  Each
dataset has a type representing its intended usage and a location.  Note
that the dataset type is not the same as the C++ or Python type of the
object containing the data.  For example, an ExposureF object might be
used to hold the data for a raw image, a post-ISR image, a calibrated
science image, or a difference image.  These would all be different
dataset types.

A Butler can produce a collection of possible values for a key (or tuples
of values for multiple keys) if given a partial data identifier.  It can
check for the existence of a file containing a dataset given its type and
data identifier.  The Butler can then retrieve the dataset.  Similarly, it
can persist an object to an appropriate location when given its associated
data identifier.

Note that the Butler has two more advanced features when retrieving a
dataset.  First, the retrieval is lazy.  Input does not occur until the
dataset is actually accessed.  This allows datasets to be retrieved and
placed on a clipboard prospectively with little cost, even if the
algorithm of a stage ends up not using them.  Second, the Butler will call
a standardization hook upon retrieval of the dataset.  This function,
contained in the input mapper object, must perform any necessary
manipulations to force the retrieved object to conform to standards,
including translating metadata.

Public methods:

__init__(self, root, mapper=None, **mapperArgs)

getKeys(self, datasetType=None, level=None)

queryMetadata(self, datasetType, key, format=None, dataId={}, **rest)

datasetExists(self, datasetType, dataId={}, **rest)

get(self, datasetType, dataId={}, immediate=False, **rest)

put(self, obj, datasetType, dataId={}, **rest)

subset(self, datasetType, level=None, dataId={}, **rest)

Definition at line 38 of file butler.py.
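
For orientation, a minimal usage sketch follows.  The repository path, the
"calexp" dataset type, and the visit/ccd data id keys are placeholders (the
valid keys depend on the mapper; see getKeys()); Butler is assumed to be
importable from the lsst.daf.persistence package as in the standard stack build.

    import lsst.daf.persistence as dafPersist

    butler = dafPersist.Butler("/path/to/repo")      # mapper named by the repo's _mapper file
    proxy = butler.get("calexp", visit=1234, ccd=5)  # lazy: no I/O until the object is used
    calexp = butler.get("calexp", dict(visit=1234, ccd=5), immediate=True)  # read immediately
    butler.put(calexp, "calexp", visit=1234, ccd=5)  # persist under the same data id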

Constructor & Destructor Documentation

def lsst.daf.persistence.butler.Butler.__init__(self, root, mapper=None, **mapperArgs)
Construct the Butler.  If no mapper class is provided, then a file
named "_mapper" is expected to be found in the repository, which
must be a filesystem path.  The first line in that file is read and
must contain the fully-qualified name of a Mapper subclass, which is
then imported and instantiated using the root and the mapperArgs.

@param root (str)       the repository to be managed (at least
                initially).  May be None if a mapper is
                provided.
@param mapper (Mapper)  if present, the Mapper subclass instance
                to be used as the butler's mapper.
@param **mapperArgs     arguments to be passed to the mapper's
                __init__ method, in addition to the root.

Definition at line 112 of file butler.py.

113      def __init__(self, root, mapper=None, **mapperArgs):
114          """Construct the Butler. If no mapper class is provided, then a file
115          named "_mapper" is expected to be found in the repository, which
116          must be a filesystem path. The first line in that file is read and
117          must contain the fully-qualified name of a Mapper subclass, which is
118          then imported and instantiated using the root and the mapperArgs.
119
120          @param root (str) the repository to be managed (at least
121          initially). May be None if a mapper is
122          provided.
123          @param mapper (Mapper) if present, the Mapper subclass instance
124          to be used as the butler's mapper.
125          @param **mapperArgs arguments to be passed to the mapper's
126          __init__ method, in addition to the root.
127          """
129          self.datasetTypeAliasDict = {}
130
131          if mapper is not None:
132              self.mapper = mapper
133          else:
134              cls = Butler.getMapperClass(root)
135              self.mapper = cls(root=root, **mapperArgs)
136
137          # Always use an empty Persistence policy until we can get rid of it
138          persistencePolicy = pexPolicy.Policy()
139          self.persistence = Persistence.getPersistence(persistencePolicy)
140          self.log = pexLog.Log(pexLog.Log.getDefaultLog(),
141                                "daf.persistence.butler")
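
The two construction modes described above can be sketched as follows;
MyMapper and the calibRoot argument are hypothetical and used only for illustration.

    # Root-only: the first line of the repository's "_mapper" file names the Mapper
    # subclass, which is imported and instantiated with the root and any extra mapperArgs.
    butler = Butler("/path/to/repo", calibRoot="/path/to/calib")

    # Explicit mapper: root may be None because the mapper instance already knows its repository.
    butler = Butler(None, mapper=MyMapper(root="/path/to/repo"))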

Member Function Documentation

def lsst.daf.persistence.butler.Butler.__reduce__(self)

Definition at line 477 of file butler.py.

478      def __reduce__(self):
479          return (_unreduce, (self.mapper, self.datasetTypeAliasDict))
def lsst.daf.persistence.butler.Butler._combineDicts(self, dataId, **rest)
private

Definition at line 419 of file butler.py.

420      def _combineDicts(self, dataId, **rest):
421          finalId = {}
422          finalId.update(dataId)
423          finalId.update(rest)
424          return finalId
def lsst.daf.persistence.butler.Butler._read(self, pythonType, location)
private

Definition at line 425 of file butler.py.

426      def _read(self, pythonType, location):
427          trace = pexLog.BlockTimingLog(self.log, "read",
428                                        pexLog.BlockTimingLog.INSTRUM+1)
429
430          additionalData = location.getAdditionalData()
431          # Create a list of Storages for the item.
432          storageName = location.getStorageName()
433          results = []
434          locations = location.getLocations()
435          returnList = True
436          if len(locations) == 1:
437              returnList = False
438
439          for locationString in locations:
440              logLoc = LogicalLocation(locationString, additionalData)
441              trace.start("read from %s(%s)" % (storageName, logLoc.locString()))
442
443              if storageName == "PafStorage":
444                  finalItem = pexPolicy.Policy.createPolicy(logLoc.locString())
445              elif storageName == "PickleStorage":
446                  if not os.path.exists(logLoc.locString()):
447                      raise RuntimeError, \
448                              "No such pickle file: " + logLoc.locString()
449                  with open(logLoc.locString(), "rb") as infile:
450                      finalItem = cPickle.load(infile)
451              elif storageName == "FitsCatalogStorage":
452                  if not os.path.exists(logLoc.locString()):
453                      raise RuntimeError, \
454                              "No such FITS catalog file: " + logLoc.locString()
455                  hdu = additionalData.getInt("hdu", 0)
456                  flags = additionalData.getInt("flags", 0)
457                  finalItem = pythonType.readFits(logLoc.locString(), hdu, flags)
458              elif storageName == "ConfigStorage":
459                  if not os.path.exists(logLoc.locString()):
460                      raise RuntimeError, \
461                              "No such config file: " + logLoc.locString()
462                  finalItem = pythonType()
463                  finalItem.load(logLoc.locString())
464              else:
465                  storageList = StorageList()
466                  storage = self.persistence.getRetrieveStorage(storageName, logLoc)
467                  storageList.append(storage)
468                  itemData = self.persistence.unsafeRetrieve(
469                          location.getCppType(), storageList, additionalData)
470                  finalItem = pythonType.swigConvert(itemData)
471              trace.done()
472              results.append(finalItem)
473
474          if not returnList:
475              results = results[0]
476          return results
def lsst.daf.persistence.butler.Butler._resolveDatasetTypeAlias(self, datasetType)
private
Replaces all the known alias keywords in the given string with the alias value.
@param datasetType (str)  the dataset type string, possibly containing alias keywords.
@return (str) the de-aliased string

Definition at line 480 of file butler.py.

481      def _resolveDatasetTypeAlias(self, datasetType):
482          """ Replaces all the known alias keywords in the given string with the alias value.
483          @param (str)datasetType
484          @return (str) the de-aliased string
485          """
486
487          for key in self.datasetTypeAliasDict:
488              # if all aliases have been replaced, bail out
489              if datasetType.find('@') == -1:
490                  break
491              datasetType = datasetType.replace(key, self.datasetTypeAliasDict[key])
492
493          # If an alias specifier can not be resolved then throw.
494          if datasetType.find('@') != -1:
495              raise RuntimeError("Unresolvable alias specifier in datasetType: %s" %(datasetType))
496
497          return datasetType
def lsst.daf.persistence.butler.Butler.dataRef(self, datasetType, level=None, dataId={}, **rest)
Returns a single ButlerDataRef.

Given a complete dataId specified in dataId and **rest, find the
unique dataset at the given level specified by a dataId key (e.g.
visit or sensor or amp for a camera) and return a ButlerDataRef.

@param datasetType (str)  the type of dataset collection to reference
@param level (str)        the level of dataId at which to reference
@param dataId (dict)      the data id.
@param **rest             keyword arguments for the data id.
@returns (ButlerDataRef) ButlerDataRef for dataset matching the data id

Definition at line 395 of file butler.py.

396      def dataRef(self, datasetType, level=None, dataId={}, **rest):
397          """Returns a single ButlerDataRef.
398
399          Given a complete dataId specified in dataId and **rest, find the
400          unique dataset at the given level specified by a dataId key (e.g.
401          visit or sensor or amp for a camera) and return a ButlerDataRef.
402
403          @param datasetType (str) the type of dataset collection to reference
404          @param level (str) the level of dataId at which to reference
405          @param dataId (dict) the data id.
406          @param **rest keyword arguments for the data id.
407          @returns (ButlerDataRef) ButlerDataRef for dataset matching the data id
408          """
409
410          datasetType = self._resolveDatasetTypeAlias(datasetType)
411          subset = self.subset(datasetType, level, dataId, **rest)
412          if len(subset) != 1:
413              raise RuntimeError, """No unique dataset for:
414      Dataset type = %s
415      Level = %s
416      Data ID = %s
417      Keywords = %s""" % (str(datasetType), str(level), str(dataId), str(rest))
418          return ButlerDataRef(subset, subset.cache[0])
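
A short sketch of dataRef in use; the dataset type and data id keys are
placeholders, and the ref.get() call assumes the usual ButlerDataRef behavior
of reusing its stored data id for retrieval (see the ButlerDataRef documentation).

    # The data id must select exactly one dataset at the requested level,
    # otherwise a RuntimeError is raised.
    ref = butler.dataRef("raw", visit=1234, ccd=5)
    raw = ref.get("raw")    # retrieve via the data ref instead of repeating the data id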
def lsst.daf.persistence.butler.Butler.datasetExists(self, datasetType, dataId={}, **rest)
Determines if a dataset file exists.

@param datasetType (str)   the type of dataset to inquire about.
@param dataId (dict)       the data id of the dataset.
@param **rest              keyword arguments for the data id.
@returns (bool) True if the dataset exists or is non-file-based.

Definition at line 211 of file butler.py.

212      def datasetExists(self, datasetType, dataId={}, **rest):
213          """Determines if a dataset file exists.
214
215          @param datasetType (str) the type of dataset to inquire about.
216          @param dataId (dict) the data id of the dataset.
217          @param **rest keyword arguments for the data id.
218          @returns (bool) True if the dataset exists or is non-file-based.
219          """
220
221          datasetType = self._resolveDatasetTypeAlias(datasetType)
222          dataId = self._combineDicts(dataId, **rest)
223          location = self.mapper.map(datasetType, dataId)
224          additionalData = location.getAdditionalData()
225          storageName = location.getStorageName()
226          if storageName in ('BoostStorage', 'FitsStorage', 'PafStorage',
227                  'PickleStorage', 'ConfigStorage', 'FitsCatalogStorage'):
228              locations = location.getLocations()
229              for locationString in locations:
230                  logLoc = LogicalLocation(locationString, additionalData).locString()
231                  if storageName == 'FitsStorage':
232                      # Strip off directives for cfitsio (in square brackets, e.g., extension name)
233                      bracket = logLoc.find('[')
234                      if bracket > 0:
235                          logLoc = logLoc[:bracket]
236                  if not os.path.exists(logLoc):
237                      return False
238              return True
239          self.log.log(pexLog.Log.WARN,
240                  "datasetExists() for non-file storage %s, dataset type=%s, keys=%s" %
241                  (storageName, datasetType, str(dataId)))
242          return True
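
A guarded-retrieval sketch; the dataset type and data id keys are placeholders.

    dataId = dict(visit=1234, ccd=5)             # use getKeys() to discover the real keys
    if butler.datasetExists("calexp", dataId):
        calexp = butler.get("calexp", dataId)    # safe: the underlying file is known to exist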
def lsst.daf.persistence.butler.Butler.defineAlias(self, alias, datasetType)
Register an alias that will be substituted in datasetTypes.

@param alias (str)        the alias keyword. It may or may not start with '@'; it may not
           contain '@' except as the first character.
@param datasetType (str)  the string that will be substituted when '@alias' appears in a
           datasetType. It may not contain '@'.

Definition at line 142 of file butler.py.

143      def defineAlias(self, alias, datasetType):
144          """Register an alias that will be substituted in datasetTypes.
145
146          @param alias (str) the alias keyword. it may start with @ or not. It may not contain @ except as the
147          first character.
148          @param datasetType (str) the string that will be substituted when @alias is passed into datasetType. It may
149          not contain '@'
150          """
151
152          #verify formatting of alias:
153          # it can have '@' as the first character (if not it's okay, we will add it) or not at all.
154          atLoc = alias.rfind('@')
155          if atLoc is -1:
156              alias = "@" + str(alias)
157          elif atLoc > 0:
158              raise RuntimeError("Badly formatted alias string: %s" %(alias,))
159
160          # verify that datasetType does not contain '@'
161          if datasetType.count('@') != 0:
162              raise RuntimeError("Badly formatted type string: %s" %(datasetType))
163
164          # verify that the alias keyword does not start with another alias keyword,
165          # and vice versa
166          for key in self.datasetTypeAliasDict:
167              if key.startswith(alias) or alias.startswith(key):
168                  raise RuntimeError("Alias: %s overlaps with existing alias: %s" %(alias, key))
169
170          self.datasetTypeAliasDict[alias] = datasetType
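
An aliasing sketch; the alias name and dataset type are placeholders. The leading
'@' is added automatically when the alias is registered, and '@cal' is replaced by
the registered value wherever it appears in a datasetType string passed to the
other methods.

    butler.defineAlias("cal", "calexp")               # stored internally as "@cal"
    calexp = butler.get("@cal", visit=1234, ccd=5)    # "@cal" resolves to "calexp" before mapping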
def lsst.daf.persistence.butler.Butler.get(self, datasetType, dataId={}, immediate=False, **rest)
Retrieves a dataset given an input collection data id.

@param datasetType (str)   the type of dataset to retrieve.
@param dataId (dict)       the data id.
@param immediate (bool)    don't use a proxy for delayed loading.
@param **rest              keyword arguments for the data id.
@returns an object retrieved from the dataset (or a proxy for one).

Definition at line 243 of file butler.py.

244      def get(self, datasetType, dataId={}, immediate=False, **rest):
245          """Retrieves a dataset given an input collection data id.
246
247          @param datasetType (str) the type of dataset to retrieve.
248          @param dataId (dict) the data id.
249          @param immediate (bool) don't use a proxy for delayed loading.
250          @param **rest keyword arguments for the data id.
251          @returns an object retrieved from the dataset (or a proxy for one).
252          """
253
254          datasetType = self._resolveDatasetTypeAlias(datasetType)
255          dataId = self._combineDicts(dataId, **rest)
256          location = self.mapper.map(datasetType, dataId)
257          self.log.log(pexLog.Log.DEBUG, "Get type=%s keys=%s from %s" %
258                  (datasetType, dataId, str(location)))
259
260          if location.getPythonType() is not None:
261              # import this pythonType dynamically
262              pythonTypeTokenList = location.getPythonType().split('.')
263              importClassString = pythonTypeTokenList.pop()
264              importClassString = importClassString.strip()
265              importPackage = ".".join(pythonTypeTokenList)
266              importType = __import__(importPackage, globals(), locals(), \
267                          [importClassString], -1)
268              pythonType = getattr(importType, importClassString)
269          else:
270              pythonType = None
271          if hasattr(self.mapper, "bypass_" + datasetType):
272              bypassFunc = getattr(self.mapper, "bypass_" + datasetType)
273              callback = lambda: bypassFunc(datasetType, pythonType,
274                                            location, dataId)
275          else:
276              callback = lambda: self._read(pythonType, location)
277          if self.mapper.canStandardize(datasetType):
278              innerCallback = callback
279              callback = lambda: self.mapper.standardize(datasetType,
280                      innerCallback(), dataId)
281          if immediate:
282              return callback()
283          return ReadProxy(callback)
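
A sketch of lazy versus immediate retrieval; "calexp", the data id keys, and the
getMaskedImage() call (which assumes an Exposure-like return type) are placeholders.

    proxy = butler.get("calexp", visit=1234, ccd=5)    # a ReadProxy; nothing is read yet
    image = proxy.getMaskedImage()                     # first use of the proxy triggers the read
    calexp = butler.get("calexp", visit=1234, ccd=5, immediate=True)  # read during the call itself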
def lsst.daf.persistence.butler.Butler.getKeys(self, datasetType=None, level=None)
Returns a dict.  The dict keys are the valid data id keys at or
above the given level of hierarchy for the dataset type or the entire
collection if None.  The dict values are the basic Python types
corresponding to the keys (int, float, str).

@param datasetType (str)  the type of dataset to get keys for, entire
                  collection if None.
@param level (str)        the hierarchy level to descend to or None.
@returns (dict) valid data id keys; values are corresponding types.

Definition at line 171 of file butler.py.

172      def getKeys(self, datasetType=None, level=None):
173          """Returns a dict. The dict keys are the valid data id keys at or
174          above the given level of hierarchy for the dataset type or the entire
175          collection if None. The dict values are the basic Python types
176          corresponding to the keys (int, float, str).
177
178          @param datasetType (str) the type of dataset to get keys for, entire
179          collection if None.
180          @param level (str) the hierarchy level to descend to or None.
181          @returns (dict) valid data id keys; values are corresponding types.
182          """
183
184          datasetType = self._resolveDatasetTypeAlias(datasetType)
185          return self.mapper.getKeys(datasetType, level)
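
A key-discovery sketch; the "raw" dataset type and the resulting key names are
mapper-dependent examples.

    keys = butler.getKeys("raw")    # e.g. {'visit': int, 'ccd': int, 'filter': str} for some camera
    idKeys = sorted(keys)           # the keyword names accepted by get()/put()/queryMetadata() for "raw"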
def lsst.daf.persistence.butler.Butler.getMapperClass(root)
static
Return the mapper class associated with a repository root.

Definition at line 85 of file butler.py.

85
86      def getMapperClass(root):
87          """Return the mapper class associated with a repository root."""
88
89          # Find a "_mapper" file containing the mapper class name
90          basePath = root
91          mapperFile = "_mapper"
92          globals = {}
93          while not os.path.exists(os.path.join(basePath, mapperFile)):
94              # Break abstraction by following _parent links from CameraMapper
95              if os.path.exists(os.path.join(basePath, "_parent")):
96                  basePath = os.path.join(basePath, "_parent")
97              else:
98                  raise RuntimeError(
99                          "No mapper provided and no %s available" %
100                          (mapperFile,))
101          mapperFile = os.path.join(basePath, mapperFile)
102
103          # Read the name of the mapper class and instantiate it
104          with open(mapperFile, "r") as f:
105              mapperName = f.readline().strip()
106          components = mapperName.split(".")
107          if len(components) <= 1:
108              raise RuntimeError("Unqualified mapper name %s in %s" %
109                      (mapperName, mapperFile))
110          pkg = importlib.import_module(".".join(components[:-1]))
111          return getattr(pkg, components[-1])
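
A sketch of the static lookup; the repository path is a placeholder. The constructor
performs the same lookup itself when no mapper is supplied, so calling this directly
is only needed to inspect or instantiate the mapper class independently.

    MapperClass = Butler.getMapperClass("/path/to/repo")  # reads the "_mapper" file (following "_parent" links)
    mapper = MapperClass(root="/path/to/repo")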
def lsst.daf.persistence.butler.Butler.put(self, obj, datasetType, dataId={}, doBackup=False, **rest)
Persists a dataset given an output collection data id.

@param obj                 the object to persist.
@param datasetType (str)   the type of dataset to persist.
@param dataId (dict)       the data id.
@param doBackup            if True, rename existing instead of overwriting
@param **rest         keyword arguments for the data id.

WARNING: Setting doBackup=True is not safe for parallel processing, as it
may be subject to race conditions.

Definition at line 284 of file butler.py.

285      def put(self, obj, datasetType, dataId={}, doBackup=False, **rest):
286          """Persists a dataset given an output collection data id.
287
288          @param obj the object to persist.
289          @param datasetType (str) the type of dataset to persist.
290          @param dataId (dict) the data id.
291          @param doBackup if True, rename existing instead of overwriting
292          @param **rest keyword arguments for the data id.
293
294          WARNING: Setting doBackup=True is not safe for parallel processing, as it
295          may be subject to race conditions.
296          """
297
298          datasetType = self._resolveDatasetTypeAlias(datasetType)
299          if doBackup:
300              self.mapper.backup(datasetType, dataId)
301          dataId = self._combineDicts(dataId, **rest)
302          location = self.mapper.map(datasetType, dataId, write=True)
303          self.log.log(pexLog.Log.DEBUG, "Put type=%s keys=%s to %s" %
304                  (datasetType, dataId, str(location)))
305          additionalData = location.getAdditionalData()
306          storageName = location.getStorageName()
307          locations = location.getLocations()
308          # TODO support multiple output locations
309          locationString = locations[0]
310          logLoc = LogicalLocation(locationString, additionalData)
311          trace = pexLog.BlockTimingLog(self.log, "put",
312                  pexLog.BlockTimingLog.INSTRUM+1)
313          trace.setUsageFlags(trace.ALLUDATA)
314
315          if storageName == "PickleStorage":
316              trace.start("write to %s(%s)" % (storageName, logLoc.locString()))
317              outDir = os.path.dirname(logLoc.locString())
318              if outDir != "" and not os.path.exists(outDir):
319                  try:
320                      os.makedirs(outDir)
321                  except OSError, e:
322                      # Don't fail if directory exists due to race
323                      if e.errno != 17:
324                          raise e
325              with open(logLoc.locString(), "wb") as outfile:
326                  cPickle.dump(obj, outfile, cPickle.HIGHEST_PROTOCOL)
327              trace.done()
328              return
329
330          if storageName == "ConfigStorage":
331              trace.start("write to %s(%s)" % (storageName, logLoc.locString()))
332              outDir = os.path.dirname(logLoc.locString())
333              if outDir != "" and not os.path.exists(outDir):
334                  try:
335                      os.makedirs(outDir)
336                  except OSError, e:
337                      # Don't fail if directory exists due to race
338                      if e.errno != 17:
339                          raise e
340              obj.save(logLoc.locString())
341              trace.done()
342              return
343
344          if storageName == "FitsCatalogStorage":
345              trace.start("write to %s(%s)" % (storageName, logLoc.locString()))
346              outDir = os.path.dirname(logLoc.locString())
347              if outDir != "" and not os.path.exists(outDir):
348                  try:
349                      os.makedirs(outDir)
350                  except OSError, e:
351                      # Don't fail if directory exists due to race
352                      if e.errno != 17:
353                          raise e
354              flags = additionalData.getInt("flags", 0)
355              obj.writeFits(logLoc.locString(), flags=flags)
356              trace.done()
357              return
358
359          # Create a list of Storages for the item.
360          storageList = StorageList()
361          storage = self.persistence.getPersistStorage(storageName, logLoc)
362          storageList.append(storage)
363          trace.start("write to %s(%s)" % (storageName, logLoc.locString()))
364
365          # Persist the item.
366          if hasattr(obj, '__deref__'):
367              # We have a smart pointer, so dereference it.
368              self.persistence.persist(
369                      obj.__deref__(), storageList, additionalData)
370          else:
371              self.persistence.persist(obj, storageList, additionalData)
372          trace.done()
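
A persistence sketch; the dataset type and data id keys are placeholders.

    butler.put(calexp, "calexp", visit=1234, ccd=5)                       # write, overwriting any existing file
    butler.put(calexp, "calexp", dict(visit=1234, ccd=5), doBackup=True)  # rename the existing file first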
def lsst.daf.persistence.butler.Butler.queryMetadata(self, datasetType, key, format=None, dataId={}, **rest)
Returns the valid values for one or more keys when given a partial
input collection data id.

@param datasetType (str)    the type of dataset to inquire about.
@param key (str)            a key giving the level of granularity of the inquiry.
@param format (str, tuple)  an optional key or tuple of keys to be returned. 
@param dataId (dict)        the partial data id.
@param **rest               keyword arguments for the partial data id.
@returns (list) a list of valid values or tuples of valid values as
specified by the format (defaulting to the same as the key) at the
key's level of granularity.

Definition at line 186 of file butler.py.

187      def queryMetadata(self, datasetType, key, format=None, dataId={}, **rest):
188          """Returns the valid values for one or more keys when given a partial
189          input collection data id.
190
191          @param datasetType (str) the type of dataset to inquire about.
192          @param key (str) a key giving the level of granularity of the inquiry.
193          @param format (str, tuple) an optional key or tuple of keys to be returned.
194          @param dataId (dict) the partial data id.
195          @param **rest keyword arguments for the partial data id.
196          @returns (list) a list of valid values or tuples of valid values as
197          specified by the format (defaulting to the same as the key) at the
198          key's level of granularity.
199          """
200
201          datasetType = self._resolveDatasetTypeAlias(datasetType)
202          dataId = self._combineDicts(dataId, **rest)
203          if format is None:
204              format = (key,)
205          elif not hasattr(format, '__iter__'):
206              format = (format,)
207          tuples = self.mapper.queryMetadata(datasetType, key, format, dataId)
208          if len(format) == 1:
209              return [x[0] for x in tuples]
210          return tuples
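
A query sketch; the "raw" dataset type and the visit/ccd/filter keys are placeholders.

    visits = butler.queryMetadata("raw", "visit", dataId={'filter': 'r'})   # e.g. a list of visit numbers
    pairs = butler.queryMetadata("raw", "visit", format=("visit", "ccd"))   # a list of (visit, ccd) tuples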
def lsst.daf.persistence.butler.Butler.subset(self, datasetType, level=None, dataId={}, **rest)
Extracts a subset of a dataset collection.

Given a partial dataId specified in dataId and **rest, find all
datasets at a given level specified by a dataId key (e.g. visit or
sensor or amp for a camera) and return a collection of their dataIds
as ButlerDataRefs.

@param datasetType (str)  the type of dataset collection to subset
@param level (str)        the level of dataId at which to subset
@param dataId (dict)      the data id.
@param **rest             keyword arguments for the data id.
@returns (ButlerSubset) collection of ButlerDataRefs for datasets
matching the data id.

Definition at line 373 of file butler.py.

374      def subset(self, datasetType, level=None, dataId={}, **rest):
375          """Extracts a subset of a dataset collection.
376
377          Given a partial dataId specified in dataId and **rest, find all
378          datasets at a given level specified by a dataId key (e.g. visit or
379          sensor or amp for a camera) and return a collection of their dataIds
380          as ButlerDataRefs.
381
382          @param datasetType (str) the type of dataset collection to subset
383          @param level (str) the level of dataId at which to subset
384          @param dataId (dict) the data id.
385          @param **rest keyword arguments for the data id.
386          @returns (ButlerSubset) collection of ButlerDataRefs for datasets
387          matching the data id.
388          """
389
390          datasetType = self._resolveDatasetTypeAlias(datasetType)
391          if level is None:
392              level = self.mapper.getDefaultLevel()
393          dataId = self._combineDicts(dataId, **rest)
394          return ButlerSubset(self, datasetType, level, dataId)
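
An iteration sketch; the dataset type and keys are placeholders, and the
per-reference datasetExists()/get() calls assume the usual ButlerDataRef interface.

    # All sensor-level "raw" datasets belonging to one visit (the default level comes from the mapper).
    for ref in butler.subset("raw", visit=1234):
        if ref.datasetExists("raw"):
            raw = ref.get("raw")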

Member Data Documentation

lsst.daf.persistence.butler.Butler.datasetTypeAliasDict

Definition at line 128 of file butler.py.

lsst.daf.persistence.butler.Butler.log

Definition at line 139 of file butler.py.

lsst.daf.persistence.butler.Butler.mapper

Definition at line 131 of file butler.py.

lsst.daf.persistence.butler.Butler.persistence

Definition at line 138 of file butler.py.


The documentation for this class was generated from the following file: butler.py