evefile 0.1.0rc2__py3-none-any.whl → 0.2.0__py3-none-any.whl

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -1,30 +1,10 @@
1
1
  """
2
2
  *Mapping time stamps of monitors to position counts.*
3
3
 
4
- .. admonition:: Points to discuss further (without claiming to be complete)
5
-
6
- * Mapping MonitorData to MeasureData
7
-
8
- Monitor data (with time in milliseconds as primary axis) need to be
9
- mapped to measured data (with position counts as primary axis).
10
- Mapping position counts to time stamps is trivial (lookup), but *vice
11
- versa* is not unique and the algorithm generally needs to be decided
12
- upon. There is an age-long discussion on this topic (`#5295 note 3
13
- <https://redmine.ahf.ptb.de/issues/5295#note-3>`_). For a current
14
- discussion see `#7722 <https://redmine.ahf.ptb.de/issues/7722>`_.
15
-
16
- Besides the question of how to best map one to the other (that needs to
17
- be discussed, decided, clearly documented and communicated,
18
- and eventually implemented): This mapping should most probably take
19
- place in the controllers technical layer of the measurement functional
20
- layer. The individual :class:`MonitorData
21
- <evefile.entities.data.MonitorData>` class cannot do the
22
- mapping without having access to the mapping table.
23
-
24
-
25
- For a detailed discussion/summary of the current state of affairs regarding
26
- the algorithm and its specification, see `#7722
27
- <https://redmine.ahf.ptb.de/issues/7722>`_. In short:
4
+ Monitor data (with time in milliseconds as primary axis) need to be mapped
5
+ to measured data (with position counts as primary axis). Mapping position
6
+ counts to time stamps is trivial (lookup), but *vice versa* is not unique
7
+ and the algorithm generally needs to be decided upon. In short:
28
8
 
29
9
  * Monitors corresponding to motor axes should be mapped to the *next* position.
30
10
  * Monitors corresponding to detector channels should be mapped to the
@@ -40,29 +20,13 @@ event could have occurred in the readout phase of the detectors (the
40
20
  position is incremented after moving the axes and before triggering the
41
21
  detector readout and start of nested scan modules).
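The two mapping rules can be sketched with NumPy's ``searchsorted``. This is an editorial illustration only, not the package's implementation; ``position_times`` and ``positions`` are made-up lookup data standing in for the actual position/timestamp table:

```python
import numpy as np

# Hypothetical lookup data: the timestamp (ms) at which each position
# count was reached. Real data come from the position_timestamps dataset.
position_times = np.array([0, 100, 250, 400])
positions = np.array([1, 2, 3, 4])

def next_position(timestamp):
    # Rule for motor-axis monitors: map to the *next* position.
    idx = np.searchsorted(position_times, timestamp, side="right")
    return int(positions[min(idx, len(positions) - 1)])

def current_position(timestamp):
    # Rule for detector-channel monitors: map to the position during
    # which the change occurred.
    idx = np.searchsorted(position_times, timestamp, side="right") - 1
    return int(positions[max(idx, 0)])

print(next_position(120), current_position(120))  # 3 2
```

A timestamp of 120 ms falls between the start of position 2 (100 ms) and position 3 (250 ms), so an axis monitor maps to 3 and a channel monitor to 2.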
42
22
 
43
- The :class:`TimestampData <evefile.entities.data.TimestampData>`
44
- class got a method :meth:`get_position()
45
- <evefile.entities.data.TimestampData.get_position>` to return
46
- position counts for given timestamps. Currently, the idea is to have one
47
- method handling both, scalars and lists/arrays of values, returning the same
48
- data type, respectively.
49
-
50
- This means that for a given :obj:`EveFile
51
- <evefile.boundaries.evefile.EveFile>` object, the controller carrying out
52
- the mapping knows to ask the :obj:`TimestampData
53
- <evefile.entities.data.TimestampData>` object via its :meth:`get_position()
54
- <evefile.entities.data.TimestampData.get_position>` method for the position
55
- counts corresponding to a given timestamp.
56
-
57
- Special cases that need to be addressed either here or during import of the
58
- data of a monitor:
23
+ Special cases that are addressed during mapping:
59
24
 
60
25
  * Multiple values with timestamp ``-1``, *i.e.* *before* the scan has been
61
26
  started.
62
27
 
63
- Probably the best solution here would be to skip all values except of the
64
- last (newest) with the special timestamp ``-1``. See `#7688, note 10
65
- <https://redmine.ahf.ptb.de/issues/7688#note-10>`_ for details.
28
+ All values except the last (newest) with the special timestamp ``-1``
29
+ will be skipped for now.
66
30
 
67
31
  For future developments of the measurement engine, it may be sensible to
68
32
  record timestamps for the monitor data in actual timestamps rather than
@@ -86,16 +50,138 @@ data of a monitor:
86
50
  array.
87
51
 
88
52
  Furthermore, a requirement is that the original monitor data are retained
89
- when converting timestamps to position counts. This most probably means to
90
- create a new :obj:`MeasureData <evefile.entities.data.MeasureData>` object.
91
- This is the case for the additional :obj:`DeviceData
92
- <evefile.entities.data.DeviceData>` class as subclass of :obj:`MeasureData
93
- <evefile.entities.data.MeasureData>`. The next question: Where to place
94
- these new objects in the :class:`EveFile <evefile.boundaries.evefile.EveFile>`
95
- (facade) class?
53
+ when converting timestamps to position counts. Hence, a new :obj:`MeasureData
54
+ <evefile.entities.data.MeasureData>` object is created upon mapping.
96
55
 
97
56
 
98
57
  Module documentation
99
58
  ====================
100
59
 
101
60
  """
61
+
62
+ import copy
63
+
64
+ import numpy as np
65
+
66
+ import evefile.entities.data
67
+
68
+
69
+ class Mapper:
70
+ """
71
+ Map monitor datasets with timestamps to datasets with position counts.
72
+
73
+ The eve measurement program has two different concepts for recorded
74
+ data: datasets with position counts as primary axes (generally,
75
+ motor axes and detector channels), and datasets for monitors that
76
+ observe (monitor) values for changes and only record values upon a
77
+ change. The latter have timestamps (in milliseconds since start of the
78
+ scan) as primary axis.
79
+
80
+ To be able to correlate monitors to position counts as primary axis,
81
+ the datasets need to be mapped accordingly. The result of mapping a
82
+ monitor dataset (a :obj:`evefile.entities.data.MonitorData` object) is a
83
+ (new) :obj:`evefile.entities.data.DeviceData` object with the same
84
+ data and metadata, but position counts instead of timestamps.
85
+
86
+
87
+ Attributes
88
+ ----------
89
+ file : :class:`evefile.boundaries.evefile.EveFile`
90
+ EveFile object the mapping should be performed for.
91
+
92
+ Although mapping is carried out for individual monitors contained in
93
+ the EveFile object, additional information from the EveFile object
94
+ is necessary to perform the task.
95
+
96
+ Parameters
97
+ ----------
98
+ file : :class:`evefile.boundaries.evefile.EveFile`
99
+ EveFile object the mapping should be performed for.
100
+
101
+
102
+ Examples
103
+ --------
104
+ Usually, mapping of monitor datasets to device datasets takes place from
105
+ within the :class:`evefile.boundaries.evefile.EveFile` class.
106
+
107
+ To map a monitor dataset (with ID ``DetP5000:gw2370700.STAT``) that is
108
+ contained in the :attr:`monitors
109
+ <evefile.boundaries.evefile.EveFile.monitors>` attribute of the
110
+ :obj:`EveFile <evefile.boundaries.evefile.EveFile>` object referenced
111
+ here with the variable ``evefile``, perform these steps:
112
+
113
+ .. code-block::
114
+
115
+ mapper = Mapper(file=evefile)
116
+ device_data = mapper.map("DetP5000:gw2370700.STAT")
117
+
118
+ This will return a device dataset in the variable ``device_data`` with
119
+ timestamps mapped to position counts. See the :meth:`map` method for
120
+ further details of the actual mapping.
121
+
122
+ """
123
+
124
+ def __init__(self, file=None):
125
+ self.file = file
126
+
127
+ def map(self, monitor=None):
128
+ """
129
+ Map a monitor dataset to a device data dataset.
130
+
131
+ The device data dataset returned contains mapped position counts
132
+ instead of the original timestamps (in milliseconds after start of
133
+ the scan).
134
+
135
+ .. note::
136
+
137
+ For duplicate timestamps, typically values recorded before the
138
+ start of the actual scan and hence with the "special" timestamp
139
+ ``-1``, only the last value (and its position count) is taken.
140
+
141
+
142
+ Parameters
143
+ ----------
144
+ monitor : :class:`str`
145
+ ID of the monitor dataset to map
146
+
147
+ Note that monitors do *not* have unique names. Hence, you need
148
+ to provide the (unique) ID rather than the name.
149
+
150
+ Returns
151
+ -------
152
+ device_data : :class:`evefile.entities.data.DeviceData`
153
+ Device data with mapped position counts instead of timestamps
154
+
155
+ Raises
156
+ ------
157
+ ValueError
158
+ Raised if no evefile is present
159
+ ValueError
160
+ Raised if no monitor is provided
161
+
162
+ """
163
+ if not self.file:
164
+ raise ValueError("Need an evefile to map data.")
165
+ if not monitor:
166
+ raise ValueError("Need monitor to map timestamps to positions.")
167
+ monitor_data = self.file.monitors[monitor]
168
+ # Need to force load data before mapping
169
+ monitor_data.get_data()
170
+ device_data = evefile.entities.data.DeviceData()
171
+ device_data.metadata.copy_attributes_from(monitor_data.metadata)
172
+ # Keep only the last value of each run of duplicate timestamps
173
+ indices = np.where(
174
+ np.diff(
175
+ [
176
+ *monitor_data.milliseconds,
177
+ monitor_data.milliseconds[-1] + 1,
178
+ ]
179
+ )
180
+ )
181
+ device_data.position_counts = (
182
+ self.file.position_timestamps.get_position(
183
+ monitor_data.milliseconds[indices]
184
+ )
185
+ )
186
+ device_data.data = copy.copy(monitor_data.data[indices])
187
+ return device_data
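The duplicate filtering in ``map()`` relies on a compact NumPy idiom; here is a standalone illustration with made-up timestamps:

```python
import numpy as np

# Made-up monitor timestamps: several values recorded before the scan
# start all carry the special timestamp -1.
milliseconds = np.array([-1, -1, -1, 120, 340, 340])

# Appending a sentinel (last value + 1) makes the final element end a run;
# np.diff is non-zero exactly where a value differs from its successor,
# so np.where picks the index of the *last* element of each run.
indices = np.where(np.diff([*milliseconds, milliseconds[-1] + 1]))[0]

print(milliseconds[indices].tolist())  # [-1, 120, 340]
```

The ``map()`` method indexes with the tuple returned by ``np.where`` directly, which NumPy treats the same as the unpacked index array used here.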
@@ -188,6 +188,9 @@ Module documentation
188
188
  import datetime
189
189
  import logging
190
190
  import sys
191
+ from collections.abc import Iterable
192
+
193
+ import numpy as np
191
194
 
192
195
  from evefile import entities
193
196
 
@@ -466,7 +469,11 @@ class VersionMapper:
466
469
  importer.source = dataset.filename
467
470
  importer.item = dataset.name
468
471
  for key, value in mapping.items():
469
- importer.mapping[dataset.dtype.names[key]] = value
472
+ # Note: datasets in array group have no column name(s)
473
+ if dataset.dtype.names:
474
+ importer.mapping[dataset.dtype.names[key]] = value
475
+ else:
476
+ importer.mapping[key] = value
470
477
  return importer
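The ``dtype.names`` branch distinguishes structured (column-named) HDF5 datasets from plain arrays; in NumPy terms, as a self-contained illustration:

```python
import numpy as np

# A structured dataset (a table with named columns, as for most HDF5
# datasets here) exposes its column names:
table = np.zeros(3, dtype=[("PosCounter", "i4"), ("value", "f8")])
print(table.dtype.names)     # ('PosCounter', 'value')
print(table.dtype.names[0])  # PosCounter

# A plain array dataset (e.g. one spectrum per position) has no columns,
# hence names is None and indexing it would raise a TypeError:
spectrum = np.zeros(2048, dtype="i4")
print(spectrum.dtype.names)  # None
```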
471
478
 
472
479
  @staticmethod
@@ -532,6 +539,7 @@ class VersionMapper:
532
539
  # mapped dataset is removed from this list.
533
540
  self._map_timestamp_dataset()
534
541
  self._map_monitor_datasets()
542
+ self._map_array_datasets()
535
543
  self._map_axis_datasets()
536
544
  self._map_0d_datasets()
537
545
  self._map_snapshot_datasets()
@@ -559,6 +567,26 @@ class VersionMapper:
559
567
  dataset
560
568
  )
561
569
 
570
+ def _map_array_datasets(self):
571
+ mapped_datasets = []
572
+ for name in self.datasets2map_in_main:
573
+ item = getattr(self._main_group, name)
574
+ # noinspection PyUnresolvedReferences
575
+ if isinstance(item, Iterable) and "DeviceType" in item.attributes:
576
+ # noinspection PyTypeChecker
577
+ # TODO: Distinguish between MCA and other array detectors
578
+ self._map_mca_dataset(hdf5_group=item)
579
+ # noinspection PyTypeChecker
580
+ mapped_datasets.append(self.get_dataset_name(item))
581
+ for item in mapped_datasets:
582
+ self.datasets2map_in_main.remove(item)
583
+
584
+ def _map_array_dataset(self, hdf5_group=None):
585
+ pass
586
+
587
+ def _map_mca_dataset(self, hdf5_group=None):
588
+ pass
589
+
562
590
  def _map_axis_datasets(self):
563
591
  mapped_datasets = []
564
592
  for name in self.datasets2map_in_main:
@@ -757,6 +785,158 @@ class VersionMapperV5(VersionMapper):
757
785
  dataset.metadata.unit = timestampdata.attributes["Unit"]
758
786
  self.destination.position_timestamps = dataset
759
787
 
788
+ def _map_mca_dataset(self, hdf5_group=None):
789
+ # TODO: Move up to VersionMapperV2 (at least the earliest one)
790
+ dataset = entities.data.MCAChannelData()
791
+ self.set_basic_metadata(hdf5_item=hdf5_group, dataset=dataset)
792
+ self._mca_dataset_set_data(dataset=dataset, hdf5_group=hdf5_group)
793
+ self._mca_dataset_set_options_in_main(dataset=dataset)
794
+ self._mca_dataset_set_options_in_snapshot(dataset=dataset)
795
+ self.destination.data[self.get_dataset_name(hdf5_group)] = dataset
796
+
797
+ def _mca_dataset_set_data(self, dataset=None, hdf5_group=None):
798
+ # Create positions vector and add it (needs to be done here)
799
+ positions = [int(i) for i in hdf5_group.item_names()]
800
+ sorter = np.argsort(positions)
801
+ dataset.position_counts = np.asarray(np.sort(positions), dtype="i4")
802
+ # Create and add importers for each individual array
803
+ importer_list = []
804
+ for position in hdf5_group:
805
+ importer_mapping = {
806
+ 0: "data",
807
+ }
808
+ importer = self.get_hdf5_dataset_importer(
809
+ dataset=position, mapping=importer_mapping
810
+ )
811
+ importer_list.append(importer)
812
+ for idx in sorter:
813
+ dataset.importer.append(importer_list[idx])
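The ``argsort``/``sort`` pairing keeps the position counts and the per-position importers aligned; a minimal sketch with made-up HDF5 item names:

```python
import numpy as np

# Made-up HDF5 group item names, one array per position count, in
# arbitrary (string) order:
item_names = ["10", "2", "7"]

positions = [int(i) for i in item_names]
sorter = np.argsort(positions)  # order that sorts the positions
position_counts = np.asarray(np.sort(positions), dtype="i4")

# Reordering any per-item list (here the names, in the method above the
# importers) with the same sorter keeps both aligned:
ordered = [item_names[idx] for idx in sorter]

print(position_counts.tolist())  # [2, 7, 10]
print(ordered)                   # ['2', '7', '10']
```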
814
+
815
+ def _mca_dataset_set_options_in_main(self, dataset=None):
816
+ # Handle options in main section
817
+ pv_base = dataset.metadata.pv.split(".")[0]
818
+ options_in_main = [
819
+ item
820
+ for item in self.datasets2map_in_main
821
+ if item.startswith(f"{pv_base}.")
822
+ ]
823
+ options_in_main.sort()
824
+ for option in options_in_main:
825
+ mapping_table = {
826
+ "ELTM": "life_time",
827
+ "ERTM": "real_time",
828
+ }
829
+ attribute = option.split(".")[-1]
830
+ if attribute in mapping_table:
831
+ importer_mapping = {1: mapping_table[attribute]}
832
+ importer = self.get_hdf5_dataset_importer(
833
+ dataset=getattr(self.source.c1.main, option),
834
+ mapping=importer_mapping,
835
+ )
836
+ dataset.importer.append(importer)
837
+ self.datasets2map_in_main.remove(option)
838
+ if attribute.startswith("R"):
839
+ roi = entities.data.MCAChannelROIData()
840
+ importer_mapping = {
841
+ 0: "position_counts",
842
+ 1: "data",
843
+ }
844
+ importer = self.get_hdf5_dataset_importer(
845
+ dataset=getattr(self.source.c1.main, option),
846
+ mapping=importer_mapping,
847
+ )
848
+ roi.importer.append(importer)
849
+ self.set_basic_metadata(
850
+ hdf5_item=getattr(self.source.c1.main, option),
851
+ dataset=roi,
852
+ )
853
+ dataset.roi.append(roi)
854
+ self.datasets2map_in_main.remove(option)
855
+
856
+ def _mca_dataset_set_options_in_snapshot(self, dataset):
857
+ # Handle options in snapshot section
858
+ pv_base = dataset.metadata.pv.split(".")[0]
859
+ options_in_snapshot = [
860
+ item
861
+ for item in self.datasets2map_in_snapshot
862
+ if item.startswith(f"{pv_base}.")
863
+ ]
864
+ options_in_snapshot.sort()
865
+ calibration_options = [
866
+ item.split(".")[-1]
867
+ for item in options_in_snapshot
868
+ if item.split(".")[-1].startswith("CAL")
869
+ ]
870
+ if calibration_options:
871
+ mapping_table = {
872
+ "CALO": "offset",
873
+ "CALQ": "quadratic",
874
+ "CALS": "slope",
875
+ }
876
+ calibration = entities.metadata.MCAChannelCalibration()
877
+ for option in calibration_options:
878
+ # HDF5 datasets are read directly and only the first data
879
+ # point taken from each, as calibration cannot sensibly
880
+ # change between scan modules of a scan.
881
+ name = ".".join([pv_base, option])
882
+ hdf5_dataset = getattr(self.source.c1.snapshot, name)
883
+ hdf5_dataset.get_data()
884
+ setattr(
885
+ calibration,
886
+ mapping_table[option],
887
+ hdf5_dataset.data[name][0],
888
+ )
889
+ options_in_snapshot.remove(name)
890
+ self.datasets2map_in_snapshot.remove(name)
891
+ dataset.metadata.calibration = calibration
892
+ roi_options = [
893
+ item.split(".")[-1]
894
+ for item in options_in_snapshot
895
+ if item.split(".")[-1].startswith("R")
896
+ ]
897
+ if roi_options:
898
+ n_rois = len(set(int(item[1:-2]) for item in roi_options))
899
+ for idx in range(n_rois):
900
+ if len(dataset.roi) - 1 < idx:
901
+ roi = entities.data.MCAChannelROIData()
902
+ dataset.roi.append(roi)
903
+ else:
904
+ roi = dataset.roi[idx]
905
+ name = ".".join([pv_base, f"R{idx}LO"])
906
+ hdf5_dataset = getattr(self.source.c1.snapshot, name)
907
+ hdf5_dataset.get_data()
908
+ roi.marker[0] = hdf5_dataset.data[name][0]
909
+ name = ".".join([pv_base, f"R{idx}HI"])
910
+ hdf5_dataset = getattr(self.source.c1.snapshot, name)
911
+ hdf5_dataset.get_data()
912
+ roi.marker[1] = hdf5_dataset.data[name][0]
913
+ name = ".".join([pv_base, f"R{idx}NM"])
914
+ hdf5_dataset = getattr(self.source.c1.snapshot, name)
915
+ hdf5_dataset.get_data()
916
+ roi.label = hdf5_dataset.data[name][0].decode()
917
+ for option in roi_options:
918
+ name = ".".join([pv_base, option])
919
+ options_in_snapshot.remove(name)
920
+ self.datasets2map_in_snapshot.remove(name)
921
+ mapping_table = {
922
+ "PRTM": "preset_real_time",
923
+ "PLTM": "preset_life_time",
924
+ }
925
+ for option, attribute in mapping_table.items():
926
+ name = ".".join([pv_base, option])
927
+ hdf5_dataset = getattr(self.source.c1.snapshot, name)
928
+ hdf5_dataset.get_data()
929
+ setattr(
930
+ dataset.metadata,
931
+ attribute,
932
+ hdf5_dataset.data[name][0],
933
+ )
934
+ options_in_snapshot.remove(name)
935
+ self.datasets2map_in_snapshot.remove(name)
936
+ for option in options_in_snapshot:
937
+ logger.warning("Option %s unmapped", option.split(".")[-1])
938
+ self.datasets2map_in_snapshot.remove(option)
939
+
760
940
  def _map_log_messages(self):
761
941
  if not hasattr(self.source, "LiveComment"):
762
942
  return