h5netcdf 1.6.2__tar.gz → 1.6.3__tar.gz

This diff compares the contents of two publicly released versions of the package as they appear in their public registry. It is provided for informational purposes only.

Files changed (33)
  1. {h5netcdf-1.6.2 → h5netcdf-1.6.3}/CHANGELOG.rst +23 -17
  2. {h5netcdf-1.6.2 → h5netcdf-1.6.3}/PKG-INFO +1 -1
  3. {h5netcdf-1.6.2 → h5netcdf-1.6.3}/h5netcdf/_version.py +2 -2
  4. {h5netcdf-1.6.2 → h5netcdf-1.6.3}/h5netcdf/core.py +4 -4
  5. {h5netcdf-1.6.2 → h5netcdf-1.6.3}/h5netcdf/dimensions.py +1 -1
  6. {h5netcdf-1.6.2 → h5netcdf-1.6.3}/h5netcdf/tests/test_h5netcdf.py +65 -93
  7. {h5netcdf-1.6.2 → h5netcdf-1.6.3}/h5netcdf.egg-info/PKG-INFO +1 -1
  8. {h5netcdf-1.6.2 → h5netcdf-1.6.3}/.pre-commit-config.yaml +0 -0
  9. {h5netcdf-1.6.2 → h5netcdf-1.6.3}/AUTHORS.txt +0 -0
  10. {h5netcdf-1.6.2 → h5netcdf-1.6.3}/LICENSE +0 -0
  11. {h5netcdf-1.6.2 → h5netcdf-1.6.3}/README.rst +0 -0
  12. {h5netcdf-1.6.2 → h5netcdf-1.6.3}/doc/Makefile +0 -0
  13. {h5netcdf-1.6.2 → h5netcdf-1.6.3}/doc/api.rst +0 -0
  14. {h5netcdf-1.6.2 → h5netcdf-1.6.3}/doc/changelog.rst +0 -0
  15. {h5netcdf-1.6.2 → h5netcdf-1.6.3}/doc/conf.py +0 -0
  16. {h5netcdf-1.6.2 → h5netcdf-1.6.3}/doc/devguide.rst +0 -0
  17. {h5netcdf-1.6.2 → h5netcdf-1.6.3}/doc/feature.rst +0 -0
  18. {h5netcdf-1.6.2 → h5netcdf-1.6.3}/doc/index.rst +0 -0
  19. {h5netcdf-1.6.2 → h5netcdf-1.6.3}/doc/legacyapi.rst +0 -0
  20. {h5netcdf-1.6.2 → h5netcdf-1.6.3}/h5netcdf/__init__.py +0 -0
  21. {h5netcdf-1.6.2 → h5netcdf-1.6.3}/h5netcdf/attrs.py +0 -0
  22. {h5netcdf-1.6.2 → h5netcdf-1.6.3}/h5netcdf/legacyapi.py +0 -0
  23. {h5netcdf-1.6.2 → h5netcdf-1.6.3}/h5netcdf/tests/conftest.py +0 -0
  24. {h5netcdf-1.6.2 → h5netcdf-1.6.3}/h5netcdf/tests/pytest.ini +0 -0
  25. {h5netcdf-1.6.2 → h5netcdf-1.6.3}/h5netcdf/utils.py +0 -0
  26. {h5netcdf-1.6.2 → h5netcdf-1.6.3}/h5netcdf.egg-info/SOURCES.txt +0 -0
  27. {h5netcdf-1.6.2 → h5netcdf-1.6.3}/h5netcdf.egg-info/dependency_links.txt +0 -0
  28. {h5netcdf-1.6.2 → h5netcdf-1.6.3}/h5netcdf.egg-info/requires.txt +0 -0
  29. {h5netcdf-1.6.2 → h5netcdf-1.6.3}/h5netcdf.egg-info/top_level.txt +0 -0
  30. {h5netcdf-1.6.2 → h5netcdf-1.6.3}/licenses/H5PY_LICENSE.txt +0 -0
  31. {h5netcdf-1.6.2 → h5netcdf-1.6.3}/licenses/PSF_LICENSE.txt +0 -0
  32. {h5netcdf-1.6.2 → h5netcdf-1.6.3}/pyproject.toml +0 -0
  33. {h5netcdf-1.6.2 → h5netcdf-1.6.3}/setup.cfg +0 -0
{h5netcdf-1.6.2 → h5netcdf-1.6.3}/CHANGELOG.rst

@@ -1,64 +1,70 @@
 Change Log
 ----------
 
+Version 1.6.3 (June 30th, 2025):
+
+- fix invalid string format specifier, match raises/warns with messages in test suite,
+  remove tests for h5py < 3.7, fix sphinx issue and pr roles in CHANGELOG.rst (:issue:`269`, :pull:`270`).
+  By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_
+
 Version 1.6.2 (June 26th, 2025):
 
-- Codespell fixes ({pull}`261`).
+- Codespell fixes (:pull:`261`).
   By `Kurt Schwehr <https://github.com/schwehr>`_
-- Fix hsds/h5pyd test fixture spinup issues ({pull}`265`).
+- Fix hsds/h5pyd test fixture spinup issues (:pull:`265`).
   By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_
-- Fix and add circular referrer tests for Python 3.14 and update CI matrix ({pull}`264`).
+- Fix and add circular referrer tests for Python 3.14 and update CI matrix (:pull:`264`).
   By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_
 - Avoid opening h5pyd file to check if there is a preexisting file,
-  instead remap mode "a" -> "r+", resort to "w" if file doesn't exist ({issue}`262`, {pull}`266`).
+  instead remap mode "a" -> "r+", resort to "w" if file doesn't exist (:issue:`262`, :pull:`266`).
   By `Jonas Grönberg <https://github.com/JonasGronberg>`_ and `Kai Mühlbauer <https://github.com/kmuehlbauer>`_
-- Reduce CI time by installing available scientific-python-nightly-wheels and using pip cache ({pull}`267`).
+- Reduce CI time by installing available scientific-python-nightly-wheels and using pip cache (:pull:`267`).
   By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_
 
 Version 1.6.1 (March 7th, 2025):
 
 - Let Variable.chunks return None for scalar variables, independent of what the underlying
-  h5ds object returns ({pull}`259`).
+  h5ds object returns (:pull:`259`).
   By `Rickard Holmberg <https://github.com/rho-novatron>`_
 
 Version 1.6.0 (March 7th, 2025):
 
-- Allow specifying `h5netcdf.File(driver="h5pyd")` to force the use of h5pyd ({issue}`255`, {pull}`256`).
+- Allow specifying `h5netcdf.File(driver="h5pyd")` to force the use of h5pyd (:issue:`255`, :pull:`256`).
   By `Rickard Holmberg <https://github.com/rho-novatron>`_
-- Add pytest-mypy-plugins for xarray nightly test ({pull}`257`).
+- Add pytest-mypy-plugins for xarray nightly test (:pull:`257`).
   By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_
 
 Version 1.5.0 (January 26th, 2025):
 
-- Update CI to new versions (Python 3.13, 3.14 alpha), remove numpy 1 from h5pyd runs ({pull}`250`).
+- Update CI to new versions (Python 3.13, 3.14 alpha), remove numpy 1 from h5pyd runs (:pull:`250`).
   By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_
-- Update CI and reinstate h5pyd/hsds test runs ({pull}`247`).
+- Update CI and reinstate h5pyd/hsds test runs (:pull:`247`).
   By `John Readey <https://github.com/jreadey>`_
 - Allow ``zlib`` to be used as an alias for ``gzip`` for enhanced compatibility with h5netcdf's API and xarray.
   By `Mark Harfouche <https://github.com/hmaarrfk>`_
 
 Version 1.4.1 (November 13th, 2024):
 
-- Add CI run for hdf5 1.10.6, fix complex tests, fix enum/user type tests ({pull}`244`).
+- Add CI run for hdf5 1.10.6, fix complex tests, fix enum/user type tests (:pull:`244`).
   By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_
 
 
 Version 1.4.0 (October 7th, 2024):
 
-- Add UserType class, add EnumType ({pull}`229`).
+- Add UserType class, add EnumType (:pull:`229`).
   By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_
-- Refactor fillvalue and dtype handling for user types, enhance sanity checks and tests ({pull}`230`).
+- Refactor fillvalue and dtype handling for user types, enhance sanity checks and tests (:pull:`230`).
   By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_
-- Add VLType and CompoundType, commit complex compound type to file. Align with nc-complex ({pull}`227`).
+- Add VLType and CompoundType, commit complex compound type to file. Align with nc-complex (:pull:`227`).
   By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_
 - Update h5pyd testing.
   By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_
-- CI and lint maintenance ({pull}`235`).
+- CI and lint maintenance (:pull:`235`).
   By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_
 - Support wrapping an h5py ``File`` object. Closing the h5netcdf file object
-  does not close the h5py file ({pull}`238`).
+  does not close the h5py file (:pull:`238`).
   By `Thomas Kluyver <https://github.com/takluyver>`_
-- CI and lint maintenance (format README.rst, use more f-strings, change Python 3.9 to 3.10 in CI) ({pull}`239`).
+- CI and lint maintenance (format README.rst, use more f-strings, change Python 3.9 to 3.10 in CI) (:pull:`239`).
   By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_
 
 Version 1.3.0 (November 7th, 2023):
{h5netcdf-1.6.2 → h5netcdf-1.6.3}/PKG-INFO

@@ -1,6 +1,6 @@
 Metadata-Version: 2.4
 Name: h5netcdf
-Version: 1.6.2
+Version: 1.6.3
 Summary: netCDF4 via h5py
 Author-email: Stephan Hoyer <shoyer@gmail.com>, Kai Mühlbauer <kmuehlbauer@wradlib.org>
 Maintainer-email: h5netcdf developers <devteam@h5netcdf.org>
{h5netcdf-1.6.2 → h5netcdf-1.6.3}/h5netcdf/_version.py

@@ -17,5 +17,5 @@ __version__: str
 __version_tuple__: VERSION_TUPLE
 version_tuple: VERSION_TUPLE
 
-__version__ = version = '1.6.2'
-__version_tuple__ = version_tuple = (1, 6, 2)
+__version__ = version = '1.6.3'
+__version_tuple__ = version_tuple = (1, 6, 3)
{h5netcdf-1.6.2 → h5netcdf-1.6.3}/h5netcdf/core.py

@@ -982,16 +982,16 @@ class Group(Mapping):
         for k, v in self._all_dimensions.maps[0].items():
             if k in value:
                 if v != value[k]:
-                    raise ValueError(f"cannot modify existing dimension {k:!r}")
+                    raise ValueError(f"cannot modify existing dimension {k!r}")
             else:
                 raise ValueError(
-                    f"new dimensions do not include existing dimension {k:!r}"
+                    f"new dimensions do not include existing dimension {k!r}"
                 )
         self._dimensions.update(value)
 
     def _create_child_group(self, name):
         if name in self:
-            raise ValueError(f"unable to create group {name:!r} (name already exists)")
+            raise ValueError(f"unable to create group {name!r} (name already exists)")
         kwargs = {}
         kwargs.update(track_order=self._track_order)
 
@@ -1035,7 +1035,7 @@ class Group(Mapping):
     ):
         if name in self:
             raise ValueError(
-                f"unable to create variable {name:!r} (name already exists)"
+                f"unable to create variable {name!r} (name already exists)"
             )
         if data is not None:
             data = np.asarray(data)
{h5netcdf-1.6.2 → h5netcdf-1.6.3}/h5netcdf/dimensions.py

@@ -22,7 +22,7 @@ class Dimensions(MutableMapping):
         if not self._group._root._writable:
             raise RuntimeError("H5NetCDF: Write to read only")
         if name in self._objects:
-            raise ValueError(f"dimension {name:!r} already exists")
+            raise ValueError(f"dimension {name!r} already exists")
 
         self._objects[name] = Dimension(self._group, name, size, create_h5ds=True)
{h5netcdf-1.6.2 → h5netcdf-1.6.3}/h5netcdf/tests/test_h5netcdf.py

@@ -13,7 +13,7 @@ import netCDF4
 import numpy as np
 import pytest
 from packaging import version
-from pytest import raises
+from pytest import raises, warns
 
 import h5netcdf
 from h5netcdf import legacyapi
@@ -164,7 +164,10 @@ def write_legacy_netcdf(tmp_netcdf, write_module):
     v = ds.createVariable("foo_unlimited", float, ("x", "unlimited"))
     v[...] = 1
 
-    with raises((h5netcdf.CompatibilityError, TypeError)):
+    with raises(
+        (h5netcdf.CompatibilityError, TypeError),
+        match=r"(?i)(boolean dtypes are not a supported NetCDF feature|illegal primitive data type)",
+    ):
         ds.createVariable("boolean", np.bool_, ("x"))
 
     g = ds.createGroup("subgroup")
@@ -257,7 +260,7 @@ def read_legacy_netcdf(tmp_netcdf, read_module, write_module):
     if write_module is not netCDF4:
         # skip for now: https://github.com/Unidata/netcdf4-python/issues/388
         assert ds.other_attr == "yes"
-    with pytest.raises(AttributeError):
+    with raises(AttributeError, match="not found"):
         ds.does_not_exist
     assert set(ds.dimensions) == set(
         ["x", "y", "z", "empty", "string3", "mismatched_dim", "unlimited"]
@@ -653,25 +656,27 @@ def test_optional_netcdf4_attrs(tmp_local_or_remote_netcdf):
 def test_error_handling(tmp_local_or_remote_netcdf):
     with h5netcdf.File(tmp_local_or_remote_netcdf, "w") as ds:
         ds.dimensions["x"] = 1
-        with raises(ValueError):
+        with raises(ValueError, match="already exists"):
             ds.dimensions["x"] = 2
-        with raises(ValueError):
+        with raises(ValueError, match="cannot modify existing dimension"):
             ds.dimensions = {"x": 2}
-        with raises(ValueError):
+        with raises(
+            ValueError, match="new dimensions do not include existing dimension"
+        ):
             ds.dimensions = {"y": 3}
         ds.create_variable("x", ("x",), dtype=float)
-        with raises(ValueError):
+        with raises(ValueError, match="unable to create variable"):
             ds.create_variable("x", ("x",), dtype=float)
-        with raises(ValueError):
+        with raises(ValueError, match="name parameter cannot be an empty string"):
            ds.create_variable("y/", ("x",), dtype=float)
         ds.create_group("subgroup")
-        with raises(ValueError):
+        with raises(ValueError, match="unable to create group"):
             ds.create_group("subgroup")
 
 
 def test_decode_string_error(tmp_local_or_remote_netcdf):
     write_h5netcdf(tmp_local_or_remote_netcdf)
-    with pytest.raises(TypeError):
+    with raises(TypeError, match="keyword argument is not allowed"):
         with h5netcdf.legacyapi.Dataset(
             tmp_local_or_remote_netcdf, "r", decode_vlen_strings=True
         ) as ds:
@@ -738,10 +743,10 @@ def test_invalid_netcdf4(tmp_local_or_remote_netcdf):
         check_invalid_netcdf4(var, i)
 
     with h5netcdf.File(tmp_local_or_remote_netcdf, "r") as ds:
-        with raises(ValueError):
+        with raises(ValueError, match="has no dimension scale associated"):
             ds["bar"].variables["foo1"].dimensions
 
-    with raises(ValueError):
+    with raises(ValueError, match="unknown value"):
         with h5netcdf.File(tmp_local_or_remote_netcdf, "r", phony_dims="srt") as ds:
             pass
 
@@ -806,7 +811,7 @@ def test_invalid_netcdf4_mixed(tmp_local_or_remote_netcdf):
         check_invalid_netcdf4_mixed(var, 3)
 
     with h5netcdf.File(tmp_local_or_remote_netcdf, "r") as ds:
-        with raises(ValueError):
+        with raises(ValueError, match="has no dimension scale associated with"):
             ds.variables["foo1"].dimensions
 
 
@@ -824,12 +829,12 @@ def test_invalid_netcdf_malformed_dimension_scales(tmp_local_or_remote_netcdf):
         f["z"].make_scale()
         f["foo1"].dims[0].attach_scale(f["x"])
 
-    with raises(ValueError):
+    with raises(ValueError, match="has mixing of labeled and unlabeled dimensions"):
         with h5netcdf.File(tmp_local_or_remote_netcdf, "r") as ds:
             assert ds
             print(ds)
 
-    with raises(ValueError):
+    with raises(ValueError, match="has mixing of labeled and unlabeled dimensions"):
         with h5netcdf.File(tmp_local_or_remote_netcdf, "r", phony_dims="sort") as ds:
             assert ds
             print(ds)
@@ -943,14 +948,17 @@ def test_invalid_netcdf_error(tmp_local_or_remote_netcdf):
         f.create_variable(
             "lzf_compressed", data=[1], dimensions=("x"), compression="lzf"
         )
-        with pytest.raises(h5netcdf.CompatibilityError):
+        with raises(
+            h5netcdf.CompatibilityError,
+            match="scale-offset filters are not a supported NetCDF feature",
+        ):
             f.create_variable("scaleoffset", data=[1], dimensions=("x",), scaleoffset=0)
 
 
 def test_invalid_netcdf_okay(tmp_local_or_remote_netcdf):
     if tmp_local_or_remote_netcdf.startswith(remote_h5):
         pytest.skip("h5pyd does not support NumPy complex dtype yet")
-    with pytest.warns(UserWarning, match="invalid netcdf features"):
+    with warns(UserWarning, match="invalid netcdf features"):
         with h5netcdf.File(tmp_local_or_remote_netcdf, "w", invalid_netcdf=True) as f:
             f.create_variable(
                 "lzf_compressed", data=[1], dimensions=("x"), compression="lzf"
@@ -972,7 +980,7 @@ def test_invalid_netcdf_overwrite_valid(tmp_local_netcdf):
     # https://github.com/h5netcdf/h5netcdf/issues/165
     with netCDF4.Dataset(tmp_local_netcdf, mode="w"):
         pass
-    with pytest.warns(UserWarning):
+    with warns(UserWarning, match="You are writing invalid netcdf features"):
         with h5netcdf.File(tmp_local_netcdf, "a", invalid_netcdf=True) as f:
             f.create_variable(
                 "lzf_compressed", data=[1], dimensions=("x"), compression="lzf"
@@ -1001,7 +1009,7 @@ def test_reopen_file_different_dimension_sizes(tmp_local_netcdf):
 
 
 def test_invalid_then_valid_no_ncproperties(tmp_local_or_remote_netcdf):
-    with pytest.warns(UserWarning, match="invalid netcdf features"):
+    with warns(UserWarning, match="invalid netcdf features"):
         with h5netcdf.File(tmp_local_or_remote_netcdf, "w", invalid_netcdf=True):
             pass
     with h5netcdf.File(tmp_local_or_remote_netcdf, "a"):
@@ -1019,11 +1027,8 @@ def test_creating_and_resizing_unlimited_dimensions(tmp_local_or_remote_netcdf):
         f.dimensions["z"] = None
         f.resize_dimension("z", 20)
 
-        with pytest.raises(ValueError) as e:
+        with raises(ValueError, match="is not unlimited and thus cannot be resized"):
             f.resize_dimension("y", 20)
-        assert e.value.args[0] == (
-            "Dimension 'y' is not unlimited and thus cannot be resized."
-        )
 
     h5 = get_hdf5_module(tmp_local_or_remote_netcdf)
     # Assert some behavior observed by using the C netCDF bindings.
@@ -1049,11 +1054,10 @@
 
         # Trying to create a variable while the current size of the dimension
         # is still zero will fail.
-        with pytest.raises(ValueError) as e:
+        with raises(ValueError, match="Shape tuple is incompatible with data"):
             f.create_variable(
                 "dummy2", data=np.array([[1, 2], [3, 4]]), dimensions=("x", "y")
             )
-        assert e.value.args[0] == "Shape tuple is incompatible with data"
 
         # Creating a coordinate variable
         f.create_variable("x", dimensions=("x",), dtype=np.int64)
@@ -1078,7 +1082,7 @@
         # We don't expect any errors. This is effectively a void context manager
         expected_errors = memoryview(b"")
     else:
-        expected_errors = pytest.raises(TypeError)
+        expected_errors = raises(TypeError, match="Can't broadcast")
     with expected_errors as e:
         f.variables["dummy3"][:] = np.ones((5, 2))
     if not tmp_local_or_remote_netcdf.startswith(remote_h5):
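
Note: the expected_errors lines above use a small trick the suite repeats: when an error is expected, expected_errors is bound to raises(...); when it is not, it is bound to memoryview(b""), which happens to act as a do-nothing context manager, so the with-block can be written once for both cases. A hedged sketch of the same idea using contextlib.nullcontext for the no-op branch (an equivalent alternative, not what the suite itself does):

    import contextlib
    import pytest

    def expectation(should_fail):
        # pick the assertion context up front, then use it uniformly below
        if should_fail:
            return pytest.raises(ZeroDivisionError)
        return contextlib.nullcontext()

    @pytest.mark.parametrize("denominator", [1, 0])
    def test_division(denominator):
        with expectation(denominator == 0):
            assert 1 / denominator > 0
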
@@ -1115,11 +1119,10 @@ def test_writing_to_an_unlimited_dimension(tmp_local_or_remote_netcdf):
         f.dimensions["z"] = None
 
         # Cannot create it without first resizing it.
-        with pytest.raises(ValueError) as e:
+        with raises(ValueError, match="Shape tuple is incompatible with data"):
             f.create_variable(
                 "dummy1", data=np.array([[1, 2, 3]]), dimensions=("x", "y")
             )
-        assert e.value.args[0] == "Shape tuple is incompatible with data"
 
         # Without data.
         f.create_variable("dummy1", dimensions=("x", "y"), dtype=np.int64)
@@ -1148,7 +1151,9 @@
 
     # broadcast writing
     if tmp_local_or_remote_netcdf.startswith(remote_h5):
-        expected_errors = pytest.raises(OSError)
+        expected_errors = raises(
+            OSError, match="Got asyncio.IncompleteReadError during binary read"
+        )
     else:
         # We don't expect any errors. This is effectively a void context manager
         expected_errors = memoryview(b"")
@@ -1773,18 +1778,10 @@ def test_more_than_7_attr_creation(tmp_local_netcdf):
 # https://github.com/h5netcdf/h5netcdf/issues/136#issuecomment-1017457067
 @pytest.mark.parametrize("track_order", [False, True])
 def test_more_than_7_attr_creation_track_order(tmp_local_netcdf, track_order):
-    h5py_version = version.parse(h5py.__version__)
-    if track_order and h5py_version < version.parse("3.7.0"):
-        expected_errors = pytest.raises(KeyError)
-    else:
-        # We don't expect any errors. This is effectively a void context manager
-        expected_errors = memoryview(b"")
-
     with h5netcdf.File(tmp_local_netcdf, "w", track_order=track_order) as h5file:
-        with expected_errors:
-            for i in range(100):
-                h5file.attrs[f"key{i}"] = i
-                h5file.attrs[f"key{i}"] = 0
+        for i in range(100):
+            h5file.attrs[f"key{i}"] = i
+            h5file.attrs[f"key{i}"] = 0
 
 
 def test_group_names(tmp_local_netcdf):
@@ -1871,18 +1868,11 @@ def test_bool_slicing_length_one_dim(tmp_local_netcdf):
         data = ds["hello"][bool_slice, :]
         np.testing.assert_equal(data, np.zeros((1, 2)))
 
-    # should raise for h5py >= 3.0.0 and h5py < 3.7.0
+    # regression test
     # https://github.com/h5py/h5py/pull/2079
     # https://github.com/h5netcdf/h5netcdf/pull/125/
     with h5netcdf.File(tmp_local_netcdf, "r") as ds:
-        h5py_version = version.parse(h5py.__version__)
-        if version.parse("3.0.0") <= h5py_version < version.parse("3.7.0"):
-            error = "Indexing arrays must have integer dtypes"
-            with pytest.raises(TypeError) as e:
-                ds["hello"][bool_slice, :]
-            assert error == str(e.value)
-        else:
-            ds["hello"][bool_slice, :]
+        ds["hello"][bool_slice, :]
 
 
 def test_fancy_indexing(tmp_local_or_remote_netcdf):
@@ -2303,38 +2293,36 @@ def test_user_type_errors_new_api(tmp_local_or_remote_netcdf):
         enum_type = ds.create_enumtype(np.uint8, "enum_t", enum_dict1)
 
         if tmp_local_or_remote_netcdf.startswith(remote_h5):
-            testcontext = pytest.raises(RuntimeError, match="Conflict")
+            testcontext = raises(RuntimeError, match="Conflict")
         else:
-            testcontext = pytest.raises(
-                (KeyError, TypeError), match="name already exists"
-            )
+            testcontext = raises((KeyError, TypeError), match="name already exists")
         with testcontext:
             ds.create_enumtype(np.uint8, "enum_t", enum_dict2)
 
         enum_type2 = g.create_enumtype(np.uint8, "enum_t2", enum_dict2)
         g.create_enumtype(np.uint8, "enum_t", enum_dict2)
-        with pytest.raises(TypeError, match="Please provide h5netcdf user type"):
+        with raises(TypeError, match="Please provide h5netcdf user type"):
             ds.create_variable(
                 "enum_var1",
                 ("enum_dim",),
                 dtype=enum_type._h5ds,
                 fillvalue=enum_dict1["missing"],
             )
-        with pytest.raises(TypeError, match="is not committed into current file"):
+        with raises(TypeError, match="is not committed into current file"):
             ds.create_variable(
                 "enum_var2",
                 ("enum_dim",),
                 dtype=enum_type_ext,
                 fillvalue=enum_dict1["missing"],
             )
-        with pytest.raises(TypeError, match="is not accessible in current group"):
+        with raises(TypeError, match="is not accessible in current group"):
             ds.create_variable(
                 "enum_var3",
                 ("enum_dim",),
                 dtype=enum_type2,
                 fillvalue=enum_dict2["missing"],
             )
-        with pytest.raises(TypeError, match="Another dtype with same name"):
+        with raises(TypeError, match="Another dtype with same name"):
             g.create_variable(
                 "enum_var4",
                 ("enum_dim",),
@@ -2353,38 +2341,36 @@ def test_user_type_errors_legacyapi(tmp_local_or_remote_netcdf):
         g = ds.createGroup("subgroup")
         enum_type = ds.createEnumType(np.uint8, "enum_t", enum_dict1)
         if tmp_local_or_remote_netcdf.startswith(remote_h5):
-            testcontext = pytest.raises(RuntimeError, match="Conflict")
+            testcontext = raises(RuntimeError, match="Conflict")
         else:
-            testcontext = pytest.raises(
-                (KeyError, TypeError), match="name already exists"
-            )
+            testcontext = raises((KeyError, TypeError), match="name already exists")
         with testcontext:
             ds.createEnumType(np.uint8, "enum_t", enum_dict1)
 
         enum_type2 = g.createEnumType(np.uint8, "enum_t2", enum_dict2)
         g.create_enumtype(np.uint8, "enum_t", enum_dict2)
-        with pytest.raises(TypeError, match="Please provide h5netcdf user type"):
+        with raises(TypeError, match="Please provide h5netcdf user type"):
             ds.createVariable(
                 "enum_var1",
                 enum_type._h5ds,
                 ("enum_dim",),
                 fill_value=enum_dict1["missing"],
             )
-        with pytest.raises(TypeError, match="is not committed into current file"):
+        with raises(TypeError, match="is not committed into current file"):
             ds.createVariable(
                 "enum_var2",
                 enum_type_ext,
                 ("enum_dim",),
                 fill_value=enum_dict1["missing"],
             )
-        with pytest.raises(TypeError, match="is not accessible in current group"):
+        with raises(TypeError, match="is not accessible in current group"):
             ds.createVariable(
                 "enum_var3",
                 enum_type2,
                 ("enum_dim",),
                 fill_value=enum_dict2["missing"],
             )
-        with pytest.raises(TypeError, match="Another dtype with same name"):
+        with raises(TypeError, match="Another dtype with same name"):
             g.createVariable(
                 "enum_var4",
                 enum_type,
@@ -2402,7 +2388,7 @@ def test_enum_type_errors_new_api(tmp_local_or_remote_netcdf):
         enum_type2 = ds.create_enumtype(np.uint8, "enum_t2", enum_dict2)
 
         # 1.
-        with pytest.warns(UserWarning, match="default fill_value 0 which IS defined"):
+        with warns(UserWarning, match="default fill_value 0 which IS defined"):
             ds.create_variable(
                 "enum_var1",
                 ("enum_dim",),
@@ -2410,18 +2396,14 @@
             )
         # 2. is for legacyapi only
         # 3.
-        with pytest.warns(
-            UserWarning, match="default fill_value 0 which IS NOT defined"
-        ):
+        with warns(UserWarning, match="default fill_value 0 which IS NOT defined"):
             ds.create_variable(
                 "enum_var2",
                 ("enum_dim",),
                 dtype=enum_type,
             )
         # 4.
-        with pytest.warns(
-            UserWarning, match="with specified fill_value 0 which IS NOT"
-        ):
+        with warns(UserWarning, match="with specified fill_value 0 which IS NOT"):
             ds.create_variable(
                 "enum_var3",
                 ("enum_dim",),
@@ -2429,9 +2411,7 @@
                 fillvalue=0,
             )
         # 5.
-        with pytest.raises(
-            ValueError, match="with specified fill_value 100 which IS NOT"
-        ):
+        with raises(ValueError, match="with specified fill_value 100 which IS NOT"):
             ds.create_variable(
                 "enum_var4",
                 ("enum_dim",),
@@ -2449,14 +2429,14 @@ def test_enum_type_errors_legacyapi(tmp_local_or_remote_netcdf):
         enum_type2 = ds.createEnumType(np.uint8, "enum_t2", enum_dict2)
 
         # 1.
-        with pytest.warns(UserWarning, match="default fill_value 255 which IS defined"):
+        with warns(UserWarning, match="default fill_value 255 which IS defined"):
             ds.createVariable(
                 "enum_var1",
                 enum_type2,
                 ("enum_dim",),
             )
         # 2.
-        with pytest.raises(ValueError, match="default fill_value 255 which IS NOT"):
+        with raises(ValueError, match="default fill_value 255 which IS NOT"):
             ds.createVariable(
                 "enum_var2",
                 enum_type,
@@ -2464,9 +2444,7 @@
             )
         # 3. is only for new api
         # 4.
-        with pytest.warns(
-            UserWarning, match="interpreted as '_UNDEFINED' by netcdf-c."
-        ):
+        with warns(UserWarning, match="interpreted as '_UNDEFINED' by netcdf-c."):
             ds.createVariable(
                 "enum_var3",
                 enum_type,
@@ -2474,9 +2452,7 @@
                 fill_value=0,
             )
         # 5.
-        with pytest.raises(
-            ValueError, match="with specified fill_value 100 which IS NOT"
-        ):
+        with raises(ValueError, match="with specified fill_value 100 which IS NOT"):
             ds.createVariable("enum_var4", enum_type, ("enum_dim",), fill_value=100)
 
 
@@ -2494,9 +2470,8 @@ def test_enum_type(tmp_local_or_remote_netcdf):
             "enum_var", ("enum_dim",), dtype=enum_type, fillvalue=enum_dict["missing"]
         )
         v[0:3] = [1, 2, 3]
-        with pytest.raises(ValueError) as e:
+        with raises(ValueError, match="assign illegal value"):
             v[3] = 5
-        assert "assign illegal value(s)" in e.value.args[0]
 
     # check, if new API can read them
     with h5netcdf.File(tmp_local_or_remote_netcdf, "r") as ds:
@@ -2537,9 +2512,8 @@
             "enum_var", enum_type, ("enum_dim",), fill_value=enum_dict["missing"]
         )
         v[0:3] = [1, 2, 3]
-        with pytest.raises(ValueError) as e:
+        with raises(ValueError, match="assign illegal value"):
             v[3] = 5
-        assert "assign illegal value(s)" in e.value.args[0]
 
     # check, if new API can read them
     with h5netcdf.File(tmp_local_or_remote_netcdf, "r") as ds:
@@ -2581,9 +2555,7 @@
             "enum_var", enum_type, ("enum_dim",), fill_value=enum_dict["missing"]
         )
         v[0:3] = [1, 2, 3]
-        with pytest.raises(
-            ValueError, match="assign illegal value to Enum variable"
-        ):
+        with raises(ValueError, match="assign illegal value to Enum variable"):
             v[3] = 5
 
     # check, if new API can read them
@@ -2770,14 +2742,14 @@ def test_complex_type_creation_errors(tmp_local_netcdf):
 
     with legacyapi.Dataset(tmp_local_netcdf, "w") as ds:
         ds.createDimension("x", size=len(complex_array))
-        with pytest.raises(TypeError, match="data type 'c4' not understood"):
+        with raises(TypeError, match="data type 'c4' not understood"):
             ds.createVariable("data", "c4", ("x",))
 
     if "complex256" not in np.sctypeDict:
         pytest.skip("numpy 'complex256' dtype not available")
     with legacyapi.Dataset(tmp_local_netcdf, "w") as ds:
         ds.createDimension("x", size=len(complex_array))
-        with pytest.raises(
+        with raises(
             TypeError,
             match="Currently only 'complex64' and 'complex128' dtypes are allowed.",
         ):
@@ -2839,7 +2811,7 @@ def test_h5pyd_append(hsds_up):
     rnd = "".join(random.choice(string.ascii_uppercase) for _ in range(5))
     fname = f"hdf5://testfile{rnd}.nc"
 
-    with pytest.warns(UserWarning, match="Append mode for h5pyd"):
+    with warns(UserWarning, match="Append mode for h5pyd"):
         with h5netcdf.File(fname, "a", driver="h5pyd") as ds:
             assert not ds._preexisting_file
 
{h5netcdf-1.6.2 → h5netcdf-1.6.3}/h5netcdf.egg-info/PKG-INFO

@@ -1,6 +1,6 @@
 Metadata-Version: 2.4
 Name: h5netcdf
-Version: 1.6.2
+Version: 1.6.3
 Summary: netCDF4 via h5py
 Author-email: Stephan Hoyer <shoyer@gmail.com>, Kai Mühlbauer <kmuehlbauer@wradlib.org>
 Maintainer-email: h5netcdf developers <devteam@h5netcdf.org>