LSST Applications  21.0.0+75b29a8a7f,21.0.0+e70536a077,21.0.0-1-ga51b5d4+62c747d40b,21.0.0-10-gbfb87ad6+3307648ee3,21.0.0-15-gedb9d5423+47cba9fc36,21.0.0-2-g103fe59+fdf0863a2a,21.0.0-2-g1367e85+d38a93257c,21.0.0-2-g45278ab+e70536a077,21.0.0-2-g5242d73+d38a93257c,21.0.0-2-g7f82c8f+e682ffb718,21.0.0-2-g8dde007+d179fbfa6a,21.0.0-2-g8f08a60+9402881886,21.0.0-2-ga326454+e682ffb718,21.0.0-2-ga63a54e+08647d4b1b,21.0.0-2-gde069b7+26c92b3210,21.0.0-2-gecfae73+0445ed2f95,21.0.0-2-gfc62afb+d38a93257c,21.0.0-27-gbbd0d29+ae871e0f33,21.0.0-28-g5fc5e037+feb0e9397b,21.0.0-3-g21c7a62+f4b9c0ff5c,21.0.0-3-g357aad2+57b0bddf0b,21.0.0-3-g4be5c26+d38a93257c,21.0.0-3-g65f322c+3f454acf5d,21.0.0-3-g7d9da8d+75b29a8a7f,21.0.0-3-gaa929c8+9e4ef6332c,21.0.0-3-ge02ed75+4b120a55c4,21.0.0-4-g3300ddd+e70536a077,21.0.0-4-g591bb35+4b120a55c4,21.0.0-4-gc004bbf+4911b9cd27,21.0.0-4-gccdca77+f94adcd104,21.0.0-4-ge8fba5a+2b3a696ff9,21.0.0-5-gb155db7+2c5429117a,21.0.0-5-gdf36809+637e4641ee,21.0.0-6-g00874e7+c9fd7f7160,21.0.0-6-g4e60332+4b120a55c4,21.0.0-7-gc8ca178+40eb9cf840,21.0.0-8-gfbe0b4b+9e4ef6332c,21.0.0-9-g2fd488a+d83b7cd606,w.2021.05
LSST Data Management Base Package
lsst.obs.base.mapping.Mapping Class Reference
Inheritance diagram for lsst.obs.base.mapping.Mapping (direct subclasses):
lsst.obs.base.mapping.CalibrationMapping, lsst.obs.base.mapping.DatasetMapping, lsst.obs.base.mapping.ExposureMapping, lsst.obs.base.mapping.ImageMapping

Public Member Functions

def __init__ (self, datasetType, policy, registry, rootStorage, provided=None)
 
def template (self)
 
def keys (self)
 
def map (self, mapper, dataId, write=False)
 
def lookup (self, properties, dataId)
 
def have (self, properties, dataId)
 
def need (self, properties, dataId)
 

Public Attributes

 datasetType
 
 registry
 
 rootStorage
 
 keyDict
 
 python
 
 persistable
 
 storage
 
 level
 
 tables
 
 range
 
 columns
 
 obsTimeName
 
 recipe
 

Detailed Description

Mapping is a base class for all mappings.  Mappings are used by
the Mapper to map (determine a path to some data given some
identifiers) and standardize (convert data into some standard
format or type) data, and to query the associated registry to see
what data is available.

Subclasses must specify self.storage or else override self.map().

Public methods: lookup, have, need, keys, map

Mappings are specified mainly by policy.  A Mapping policy should
consist of:

template (string): a Python string providing the filename for that
particular dataset type based on some data identifiers.  In the
case of redundancy in the path (e.g., file uniquely specified by
the exposure number, but filter in the path), the
redundant/dependent identifiers can be looked up in the registry.

python (string): the Python type for the retrieved data (e.g.
lsst.afw.image.ExposureF)

persistable (string): the Persistable registration for the on-disk data
(e.g. ImageU)

storage (string, optional): Storage type for this dataset type (e.g.
"FitsStorage")

level (string, optional): the level in the camera hierarchy at which the
data is stored (Amp, Ccd or skyTile), if relevant

tables (string, optional): a whitespace-delimited list of tables in the
registry that can be NATURAL JOIN-ed to look up additional
information.
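To make the fields above concrete, here is a hypothetical policy sketched as a plain Python dict (real policies are `daf_persistence.Policy` objects; the template, types, and table names below are invented for illustration, not taken from any real obs package):

```python
# Hypothetical Mapping policy for some "raw"-like dataset type.
# Every key mirrors a field described above; values are made up.
raw_policy = {
    "template": "raw/%(visit)d/%(filter)s/raw-%(visit)d-%(ccd)02d.fits",
    "python": "lsst.afw.image.DecoratedImageU",  # retrieved Python type
    "persistable": "DecoratedImageU",            # on-disk Persistable
    "storage": "FitsStorage",                    # optional storage type
    "level": "Ccd",                              # optional camera level
    "tables": "raw raw_visit",                   # optional registry tables
}
```

Note that the template contains both `visit` and the redundant `filter`/`ccd` identifiers; as described above, redundant identifiers can be resolved through the registry tables.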

Parameters
----------
datasetType : `str`
    Butler dataset type to be mapped.
policy : `daf_persistence.Policy`
    Mapping Policy.
registry : `lsst.obs.base.Registry`
    Registry for metadata lookups.
rootStorage : Storage subclass instance
    Interface to persisted repository data.
provided : `list` of `str`
    Keys provided by the mapper.

Definition at line 33 of file mapping.py.

Constructor & Destructor Documentation

◆ __init__()

def lsst.obs.base.mapping.Mapping.__init__ (   self,
  datasetType,
  policy,
  registry,
  rootStorage,
  provided = None 
)

Definition at line 84 of file mapping.py.

84  def __init__(self, datasetType, policy, registry, rootStorage, provided=None):
85 
86  if policy is None:
87  raise RuntimeError("No policy provided for mapping")
88 
89  self.datasetType = datasetType
90  self.registry = registry
91  self.rootStorage = rootStorage
92 
93  self._template = policy['template'] # Template path
94  # in most cases, the template can not be used if it is empty, and is
95  # accessed via a property that will raise if it is used while
96  # `not self._template`. In this case we *do* allow it to be empty, for
97  # the purpose of fetching the key dict so that the mapping can be
98  # constructed, so that it can raise if it's invalid. I know it's a
99  # little odd, but it allows this template check to be introduced
100  # without a major refactor.
101  if self._template:
102  self.keyDict = dict([
103  (k, _formatMap(v, k, datasetType))
104  for k, v in
105  re.findall(r'\%\((\w+)\).*?([diouxXeEfFgGcrs])', self.template)
106  ])
107  else:
108  self.keyDict = {}
109  if provided is not None:
110  for p in provided:
111  if p in self.keyDict:
112  del self.keyDict[p]
113  self.python = policy['python'] # Python type
114  self.persistable = policy['persistable'] # Persistable type
115  self.storage = policy['storage']
116  if 'level' in policy:
117  self.level = policy['level'] # Level in camera hierarchy
118  if 'tables' in policy:
119  self.tables = policy.asArray('tables')
120  else:
121  self.tables = None
122  self.range = None
123  self.columns = None
124  self.obsTimeName = policy['obsTimeName'] if 'obsTimeName' in policy else None
125  self.recipe = policy['recipe'] if 'recipe' in policy else 'default'
126 
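The keyDict construction above pulls each `%(key)` and its final conversion character out of the template with a regular expression. A minimal standalone sketch, using the same pattern but an invented template string:

```python
import re

# Hypothetical template; each %(name)<spec> pair is what gets extracted.
template = "raw/%(visit)d/%(filter)s/raw-%(visit)d-%(ccd)02d.fits"

# Same pattern as in __init__: capture the key name inside %(...) and
# the conversion character that terminates the format spec (d, s, x, ...).
pairs = re.findall(r'\%\((\w+)\).*?([diouxXeEfFgGcrs])', template)

# Building a dict collapses duplicate keys (visit appears twice above),
# which is exactly how keyDict ends up with one entry per key.
keyDict = dict(pairs)
```

Here `%(ccd)02d` shows why the non-greedy `.*?` is needed: it skips over the width specifier `02` to reach the conversion character `d`.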

Member Function Documentation

◆ have()

def lsst.obs.base.mapping.Mapping.have (   self,
  properties,
  dataId 
)
Returns whether the provided data identifier has all
the properties in the provided list.

Parameters
----------
properties : `list` of `str`
    Properties required.
dataId : `dict`
    Dataset identifier.

Returns
-------
bool
    True if all properties are present.

Definition at line 275 of file mapping.py.

275  def have(self, properties, dataId):
276  """Returns whether the provided data identifier has all
277  the properties in the provided list.
278 
279  Parameters
280  ----------
281  properties : `list` of `str`
282  Properties required.
283  dataId : `dict`
284  Dataset identifier.
285 
286  Returns
287  -------
288  bool
289  True if all properties are present.
290  """
291  for prop in properties:
292  if prop not in dataId:
293  return False
294  return True
295 
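The loop in have() is equivalent to an `all()` over membership tests. A tiny standalone sketch, with an invented data ID:

```python
def have(properties, dataId):
    # True only if every required property appears in the data ID.
    return all(prop in dataId for prop in properties)

# Hypothetical partial data identifier.
dataId = {"visit": 903334, "ccd": 23}
```

For example, `have(["visit", "ccd"], dataId)` is True, while `have(["visit", "filter"], dataId)` is False because `filter` is missing.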

◆ keys()

def lsst.obs.base.mapping.Mapping.keys (   self)
Return the dict of keys and value types required for this mapping.

Definition at line 135 of file mapping.py.

135  def keys(self):
136  """Return the dict of keys and value types required for this mapping.
137  """
138  return self.keyDict
139 

◆ lookup()

def lsst.obs.base.mapping.Mapping.lookup (   self,
  properties,
  dataId 
)
Look up properties in a metadata registry given a partial
dataset identifier.

Parameters
----------
properties : `list` of `str`
    What to look up.
dataId : `dict`
    Dataset identifier

Returns
-------
`list` of `tuple`
    Values of properties.

Reimplemented in lsst.obs.base.mapping.CalibrationMapping.

Definition at line 192 of file mapping.py.

192  def lookup(self, properties, dataId):
193  """Look up properties in a metadata registry given a partial
194  dataset identifier.
195 
196  Parameters
197  ----------
198  properties : `list` of `str`
199  What to look up.
200  dataId : `dict`
201  Dataset identifier
202 
203  Returns
204  -------
205  `list` of `tuple`
206  Values of properties.
207  """
208  if self.registry is None:
209  raise RuntimeError("No registry for lookup")
210 
211  skyMapKeys = ("tract", "patch")
212 
213  where = []
214  values = []
215 
216  # Prepare to remove skymap entries from properties list. These must
217  # be in the data ID, so we store which ones we're removing and create
218  # an OrderedDict that tells us where to re-insert them. That maps the
219  # name of the property to either its index in the properties list
220  # *after* the skymap ones have been removed (for entries that aren't
221  # skymap ones) or the value from the data ID (for those that are).
222  removed = set()
223  substitutions = OrderedDict()
224  index = 0
225  properties = list(properties) # don't modify the original list
226  for p in properties:
227  if p in skyMapKeys:
228  try:
229  substitutions[p] = dataId[p]
230  removed.add(p)
231  except KeyError:
232  raise RuntimeError(
233  "Cannot look up skymap key '%s'; it must be explicitly included in the data ID" % p
234  )
235  else:
236  substitutions[p] = index
237  index += 1
238  # Can't actually remove while iterating above, so we do it here.
239  for p in removed:
240  properties.remove(p)
241 
242  fastPath = True
243  for p in properties:
244  if p not in ('filter', 'expTime', 'taiObs'):
245  fastPath = False
246  break
247  if fastPath and 'visit' in dataId and "raw" in self.tables:
248  lookupDataId = {'visit': dataId['visit']}
249  result = self.registry.lookup(properties, 'raw_visit', lookupDataId, template=self.template)
250  else:
251  if dataId is not None:
252  for k, v in dataId.items():
253  if self.columns and k not in self.columns:
254  continue
255  if k == self.obsTimeName:
256  continue
257  if k in skyMapKeys:
258  continue
259  where.append((k, '?'))
260  values.append(v)
261  lookupDataId = {k[0]: v for k, v in zip(where, values)}
262  if self.range:
263  # format of self.range is
264  # ('?', isBetween-lowKey, isBetween-highKey)
265  # here we transform that to {(lowKey, highKey): value}
266  lookupDataId[(self.range[1], self.range[2])] = dataId[self.obsTimeName]
267  result = self.registry.lookup(properties, self.tables, lookupDataId, template=self.template)
268  if not removed:
269  return result
270  # Iterate over the query results, re-inserting the skymap entries.
271  result = [tuple(v if k in removed else item[v] for k, v in substitutions.items())
272  for item in result]
273  return result
274 
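The skymap bookkeeping above is the subtlest part of lookup(): `tract`/`patch` are stripped from the query (their values must come from the data ID) and then spliced back into each result row at their original positions. A self-contained sketch of just that substitution logic, with invented property names and values:

```python
from collections import OrderedDict

def reinsert_skymap(properties, dataId, queryResult, skyMapKeys=("tract", "patch")):
    """Sketch of the substitution bookkeeping in Mapping.lookup:
    skymap keys take their value from the data ID rather than the
    registry, and are re-inserted into each (trimmed) result row."""
    removed = set()
    substitutions = OrderedDict()
    index = 0
    for p in properties:
        if p in skyMapKeys:
            substitutions[p] = dataId[p]  # value comes from the data ID
            removed.add(p)
        else:
            substitutions[p] = index      # index into the trimmed result row
            index += 1
    # queryResult rows contain only the non-skymap properties, in order.
    return [tuple(v if k in removed else row[v] for k, v in substitutions.items())
            for row in queryResult]
```

For instance, asking for `['visit', 'tract', 'filter']` with `tract=8766` in the data ID queries the registry only for `visit` and `filter`, then rebuilds full rows with `8766` in the middle slot.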

◆ map()

def lsst.obs.base.mapping.Mapping.map (   self,
  mapper,
  dataId,
  write = False 
)
Standard implementation of map function.

Parameters
----------
mapper : `lsst.daf.persistence.Mapper`
    Object to be mapped.
dataId : `dict`
    Dataset identifier.

Returns
-------
`lsst.daf.persistence.ButlerLocation`
    Location of object that was mapped.

Reimplemented in lsst.obs.base.mapping.CalibrationMapping.

Definition at line 140 of file mapping.py.

140  def map(self, mapper, dataId, write=False):
141  """Standard implementation of map function.
142 
143  Parameters
144  ----------
145  mapper: `lsst.daf.persistence.Mapper`
146  Object to be mapped.
147  dataId: `dict`
148  Dataset identifier.
149 
150  Returns
151  -------
152  lsst.daf.persistence.ButlerLocation
153  Location of object that was mapped.
154  """
155  actualId = self.need(iter(self.keyDict.keys()), dataId)
156  usedDataId = {key: actualId[key] for key in self.keyDict.keys()}
157  path = mapper._mapActualToPath(self.template, actualId)
158  if os.path.isabs(path):
159  raise RuntimeError("Mapped path should not be absolute.")
160  if not write:
161  # This allows mapped files to be compressed, ending in .gz or .fz,
162  # without any indication from the policy that the file should be
163  # compressed, easily allowing repositories to contain a combination
164  # of compressed and not-compressed files.
165  # If needed we can add a policy flag to allow compressed files or
166  # not, and perhaps a list of allowed extensions that may exist
167  # at the end of the template.
168  for ext in (None, '.gz', '.fz'):
169  if ext and path.endswith(ext):
170  continue # if the path already ends with the extension
171  extPath = path + ext if ext else path
172  newPath = self.rootStorage.instanceSearch(extPath)
173  if newPath:
174  path = newPath
175  break
176  assert path, "Fully-qualified filename is empty."
177 
178  addFunc = "add_" + self.datasetType # Name of method for additionalData
179  if hasattr(mapper, addFunc):
180  addFunc = getattr(mapper, addFunc)
181  additionalData = addFunc(self.datasetType, actualId)
182  assert isinstance(additionalData, PropertySet), \
183  "Bad type for returned data: %s" % (type(additionalData),)
184  else:
185  additionalData = None
186 
187  return ButlerLocation(pythonType=self.python, cppType=self.persistable, storageName=self.storage,
188  locationList=path, dataId=actualId.copy(), mapper=mapper,
189  storage=self.rootStorage, usedDataId=usedDataId, datasetType=self.datasetType,
190  additionalData=additionalData)
191 
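On the read path (`write=False`), map() probes for compressed variants of the mapped file by trying the path as-is and then with `.gz` and `.fz` appended. A standalone sketch of that probe, where a plain set of paths stands in for `rootStorage.instanceSearch` (that stand-in is an assumption of this sketch, not the real Storage API):

```python
def resolve_compressed(path, existing_files):
    """Sketch of the read-path extension probe in Mapping.map: try the
    mapped path as-is, then with .gz and .fz appended, and return the
    first candidate found. `existing_files` stands in for the
    repository search done by rootStorage.instanceSearch."""
    for ext in (None, '.gz', '.fz'):
        if ext and path.endswith(ext):
            continue  # path already carries this extension
        candidate = path + ext if ext else path
        if candidate in existing_files:
            return candidate
    return path  # fall back to the uncompressed mapped path
```

This is why a repository can mix compressed and uncompressed files without any policy flag: a template ending in `.fits` still finds `*.fits.gz` on disk.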

◆ need()

def lsst.obs.base.mapping.Mapping.need (   self,
  properties,
  dataId 
)
Ensures all properties in the provided list are present in
the data identifier, looking them up as needed.  This is only
possible for the case where the data identifies a single
exposure.

Parameters
----------
properties : `list` of `str`
    Properties required.
dataId : `dict`
    Partial dataset identifier

Returns
-------
`dict`
    Copy of dataset identifier with enhanced values.

Definition at line 296 of file mapping.py.

296  def need(self, properties, dataId):
297  """Ensures all properties in the provided list are present in
298  the data identifier, looking them up as needed. This is only
299  possible for the case where the data identifies a single
300  exposure.
301 
302  Parameters
303  ----------
304  properties : `list` of `str`
305  Properties required.
306  dataId : `dict`
307  Partial dataset identifier
308 
309  Returns
310  -------
311  `dict`
312  Copy of dataset identifier with enhanced values.
313  """
314  newId = dataId.copy()
315  newProps = [] # Properties we don't already have
316  for prop in properties:
317  if prop not in newId:
318  newProps.append(prop)
319  if len(newProps) == 0:
320  return newId
321 
322  lookups = self.lookup(newProps, newId)
323  if len(lookups) != 1:
324  raise NoResults("No unique lookup for %s from %s: %d matches" %
325  (newProps, newId, len(lookups)),
326  self.datasetType, dataId)
327  for i, prop in enumerate(newProps):
328  newId[prop] = lookups[0][i]
329  return newId
330 
331 
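The flow of need() — copy the data ID, collect the missing properties, demand exactly one registry match, and merge the result in — can be sketched standalone, with the registry lookup passed in as a callable (that parameterization is an assumption of this sketch; the real method calls `self.lookup`):

```python
def need(properties, dataId, lookup):
    """Sketch of Mapping.need: fill in missing properties from a
    lookup callable, requiring a unique match."""
    newId = dict(dataId)                                  # don't mutate the caller's ID
    newProps = [p for p in properties if p not in newId]  # properties still missing
    if not newProps:
        return newId
    rows = lookup(newProps, newId)
    if len(rows) != 1:
        raise RuntimeError("No unique lookup for %s from %s: %d matches"
                           % (newProps, newId, len(rows)))
    newId.update(zip(newProps, rows[0]))                  # splice looked-up values in
    return newId
```

With a stub lookup returning a single row, `need(['visit', 'filter'], {'visit': 1}, ...)` fills in only `filter`; zero or multiple matches raise, mirroring the `NoResults` branch above.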

◆ template()

def lsst.obs.base.mapping.Mapping.template (   self)

Definition at line 128 of file mapping.py.

128  def template(self):
129  if self._template: # template must not be an empty string or None
130  return self._template
131  else:
132  raise RuntimeError(f"Template is not defined for the {self.datasetType} dataset type, "
133  "it must be set before it can be used.")
134 

Member Data Documentation

◆ columns

lsst.obs.base.mapping.Mapping.columns

Definition at line 123 of file mapping.py.

◆ datasetType

lsst.obs.base.mapping.Mapping.datasetType

Definition at line 89 of file mapping.py.

◆ keyDict

lsst.obs.base.mapping.Mapping.keyDict

Definition at line 102 of file mapping.py.

◆ level

lsst.obs.base.mapping.Mapping.level

Definition at line 117 of file mapping.py.

◆ obsTimeName

lsst.obs.base.mapping.Mapping.obsTimeName

Definition at line 124 of file mapping.py.

◆ persistable

lsst.obs.base.mapping.Mapping.persistable

Definition at line 114 of file mapping.py.

◆ python

lsst.obs.base.mapping.Mapping.python

Definition at line 113 of file mapping.py.

◆ range

lsst.obs.base.mapping.Mapping.range

Definition at line 122 of file mapping.py.

◆ recipe

lsst.obs.base.mapping.Mapping.recipe

Definition at line 125 of file mapping.py.

◆ registry

lsst.obs.base.mapping.Mapping.registry

Definition at line 90 of file mapping.py.

◆ rootStorage

lsst.obs.base.mapping.Mapping.rootStorage

Definition at line 91 of file mapping.py.

◆ storage

lsst.obs.base.mapping.Mapping.storage

Definition at line 115 of file mapping.py.

◆ tables

lsst.obs.base.mapping.Mapping.tables

Definition at line 119 of file mapping.py.


The documentation for this class was generated from the following file:
mapping.py