tsp 1.8.1__py3-none-any.whl → 1.10.2__py3-none-any.whl
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- tsp/__init__.py +11 -11
- tsp/__meta__.py +1 -1
- tsp/concatenation.py +159 -153
- tsp/core.py +1306 -1162
- tsp/data/2023-01-06_755-test-Dataset_2031-Constant_Over_Interval-Hourly-Ground_Temperature-Thermistor_Automated.timeserie.csv +4 -4
- tsp/data/2023-01-06_755-test.metadata.txt +208 -208
- tsp/data/NTGS_example_csv.csv +6 -6
- tsp/data/NTGS_example_slash_dates.csv +6 -6
- tsp/data/NTGS_gtr_example_excel.xlsx +0 -0
- tsp/data/example_geotop.csv +5240 -5240
- tsp/data/example_gtnp.csv +1298 -1298
- tsp/data/example_permos.csv +7 -7
- tsp/data/ntgs-db-multi.txt +3872 -0
- tsp/data/ntgs-db-single.txt +2251 -0
- tsp/data/test_geotop_has_space.txt +5 -5
- tsp/data/tsp_format_long.csv +10 -0
- tsp/data/tsp_format_wide_1.csv +7 -0
- tsp/data/tsp_format_wide_2.csv +7 -0
- tsp/dataloggers/AbstractReader.py +43 -43
- tsp/dataloggers/FG2.py +110 -110
- tsp/dataloggers/GP5W.py +114 -114
- tsp/dataloggers/Geoprecision.py +34 -34
- tsp/dataloggers/HOBO.py +930 -914
- tsp/dataloggers/RBRXL800.py +190 -190
- tsp/dataloggers/RBRXR420.py +371 -308
- tsp/dataloggers/Vemco.py +84 -0
- tsp/dataloggers/__init__.py +15 -15
- tsp/dataloggers/logr.py +196 -115
- tsp/dataloggers/test_files/004448.DAT +2543 -2543
- tsp/dataloggers/test_files/004531.DAT +17106 -17106
- tsp/dataloggers/test_files/004531.HEX +3587 -3587
- tsp/dataloggers/test_files/004534.HEX +3587 -3587
- tsp/dataloggers/test_files/010252.dat +1731 -1731
- tsp/dataloggers/test_files/010252.hex +1739 -1739
- tsp/dataloggers/test_files/010274.hex +1291 -1291
- tsp/dataloggers/test_files/010278.hex +3544 -3544
- tsp/dataloggers/test_files/012064.dat +1286 -1286
- tsp/dataloggers/test_files/012064.hex +1294 -1294
- tsp/dataloggers/test_files/012064_modified_start.hex +1294 -0
- tsp/dataloggers/test_files/012081.hex +3532 -3532
- tsp/dataloggers/test_files/013138_recovery_stamp.hex +1123 -0
- tsp/dataloggers/test_files/014037-2007.hex +95 -0
- tsp/dataloggers/test_files/019360_20160918_1146_SlumpIslandTopofHill.hex +11253 -0
- tsp/dataloggers/test_files/019360_20160918_1146_SlumpIslandTopofHill.xls +0 -0
- tsp/dataloggers/test_files/07B1592.DAT +1483 -1483
- tsp/dataloggers/test_files/07B1592.HEX +1806 -1806
- tsp/dataloggers/test_files/07B4450.DAT +2234 -2234
- tsp/dataloggers/test_files/07B4450.HEX +2559 -2559
- tsp/dataloggers/test_files/2022018_2025-09-18T22-16-16.txt +36 -0
- tsp/dataloggers/test_files/2022018_2025-09-18T22-16-16_raw.csv +2074 -0
- tsp/dataloggers/test_files/2022018_2025-09-18T22-16-16_temp.csv +2074 -0
- tsp/dataloggers/test_files/2025004_2025-12-02T17-07-28_cfg.txt +30 -0
- tsp/dataloggers/test_files/2025004_2025-12-02T17-07-28_raw.csv +35 -0
- tsp/dataloggers/test_files/2025004_2025-12-02T17-07-28_temp.csv +35 -0
- tsp/dataloggers/test_files/204087.xlsx +0 -0
- tsp/dataloggers/test_files/Asc-1455As02.000 +2982 -0
- tsp/dataloggers/test_files/Asc-1456As02.000 +2992 -0
- tsp/dataloggers/test_files/Asc-1457As02.000 +2917 -0
- tsp/dataloggers/test_files/BGC_BH15_019362_20140610_1253.hex +1729 -0
- tsp/dataloggers/test_files/Bin2944.csv +759 -0
- tsp/dataloggers/test_files/Bin5494.csv +2972 -0
- tsp/dataloggers/test_files/Bin6786.csv +272 -0
- tsp/dataloggers/test_files/FG2_399.csv +9881 -9881
- tsp/dataloggers/test_files/GP5W.csv +1121 -1121
- tsp/dataloggers/test_files/GP5W_260.csv +1884 -1884
- tsp/dataloggers/test_files/GP5W_270.csv +2210 -2210
- tsp/dataloggers/test_files/H08-030-08_HOBOware.csv +998 -998
- tsp/dataloggers/test_files/Minilog-II-T_350763_20190711_1.csv +2075 -0
- tsp/dataloggers/test_files/Minilog-II-T_350769_20190921_1.csv +6384 -0
- tsp/dataloggers/test_files/Minilog-II-T_354284_20190921_1.csv +4712 -0
- tsp/dataloggers/test_files/Minilog-T_7943_20140920_1.csv +5826 -0
- tsp/dataloggers/test_files/Minilog-T_8979_20140806_1.csv +2954 -0
- tsp/dataloggers/test_files/Minilog-T_975_20110824_1.csv +4343 -0
- tsp/dataloggers/test_files/RBR_01.dat +1046 -1046
- tsp/dataloggers/test_files/RBR_02.dat +2426 -2426
- tsp/dataloggers/test_files/RI03b_062831_20240905_1801.rsk +0 -0
- tsp/dataloggers/test_files/RI03b_062831_20240905_1801.xlsx +0 -0
- tsp/dataloggers/test_files/RSTDT2055.csv +2152 -2152
- tsp/dataloggers/test_files/U23-001_HOBOware.csv +1001 -1001
- tsp/dataloggers/test_files/hobo-negative-2.txt +6396 -6396
- tsp/dataloggers/test_files/hobo-negative-3.txt +5593 -5593
- tsp/dataloggers/test_files/hobo-positive-number-1.txt +1000 -1000
- tsp/dataloggers/test_files/hobo-positive-number-2.csv +1003 -1003
- tsp/dataloggers/test_files/hobo-positive-number-3.csv +1133 -1133
- tsp/dataloggers/test_files/hobo-positive-number-4.csv +1209 -1209
- tsp/dataloggers/test_files/hobo2.csv +8702 -8702
- tsp/dataloggers/test_files/hobo_1_AB.csv +21732 -21732
- tsp/dataloggers/test_files/hobo_1_AB_Details.txt +133 -133
- tsp/dataloggers/test_files/hobo_1_AB_classic.csv +4373 -4373
- tsp/dataloggers/test_files/hobo_1_AB_defaults.csv +21732 -21732
- tsp/dataloggers/test_files/hobo_1_AB_minimal.txt +1358 -1358
- tsp/dataloggers/test_files/hobo_1_AB_var2.csv +3189 -3189
- tsp/dataloggers/test_files/hobo_1_AB_var3.csv +2458 -2458
- tsp/dataloggers/test_files/logR_ULogC16-32_1.csv +106 -106
- tsp/dataloggers/test_files/logR_ULogC16-32_2.csv +100 -100
- tsp/dataloggers/test_files/mon_3_Ta_2010-08-18_2013-02-08.txt +21724 -21724
- tsp/dataloggers/test_files/rbr_001.dat +1133 -1133
- tsp/dataloggers/test_files/rbr_001.hex +1139 -1139
- tsp/dataloggers/test_files/rbr_001_no_comment.dat +1132 -1132
- tsp/dataloggers/test_files/rbr_001_no_comment.hex +1138 -1138
- tsp/dataloggers/test_files/rbr_002.dat +1179 -1179
- tsp/dataloggers/test_files/rbr_002.hex +1185 -1185
- tsp/dataloggers/test_files/rbr_003.hex +1292 -1292
- tsp/dataloggers/test_files/rbr_xl_001.DAT +1105 -1105
- tsp/dataloggers/test_files/rbr_xl_002.DAT +1126 -1126
- tsp/dataloggers/test_files/rbr_xl_003.DAT +4622 -4622
- tsp/dataloggers/test_files/rbr_xl_003.HEX +3587 -3587
- tsp/gtnp.py +148 -148
- tsp/labels.py +3 -3
- tsp/misc.py +90 -90
- tsp/physics.py +101 -101
- tsp/plots/static.py +388 -374
- tsp/readers.py +829 -548
- tsp/standardization/__init__.py +0 -0
- tsp/standardization/metadata.py +95 -0
- tsp/standardization/metadata_ref.py +0 -0
- tsp/standardization/validator.py +535 -0
- tsp/time.py +45 -45
- tsp/tspwarnings.py +27 -15
- tsp/utils.py +131 -101
- tsp/version.py +1 -1
- {tsp-1.8.1.dist-info → tsp-1.10.2.dist-info}/METADATA +95 -86
- tsp-1.10.2.dist-info/RECORD +132 -0
- {tsp-1.8.1.dist-info → tsp-1.10.2.dist-info}/licenses/LICENSE +674 -674
- {tsp-1.8.1.dist-info → tsp-1.10.2.dist-info}/top_level.txt +1 -0
- tsp-1.8.1.dist-info/RECORD +0 -94
- {tsp-1.8.1.dist-info → tsp-1.10.2.dist-info}/WHEEL +0 -0
tsp/time.py
CHANGED
@@ -1,46 +1,46 @@
(every line of the old file was removed and re-added with identical text, i.e. a whitespace- or line-ending-only change; the resulting file is shown once below)

import re
from datetime import datetime, tzinfo

from typing import Union


def get_utc_offset(offset: "Union[str,int]") -> int:
    """Get the UTC offset in seconds from a string or integer"""

    if isinstance(offset, str):
        if offset.lower() == "utc" or (offset.lower() == "z"):
            return 0

        pattern = re.compile(r"([+-]?)(\d{2}):(\d{2})")
        match = pattern.match(offset)

        if not match:
            raise ValueError("Offset must be a string in the format '+HH:MM' or '-HH:MM'")

        sign = match.group(1)
        hours = int(match.group(2))
        minutes = int(match.group(3))
        utc_offset = (hours*60 + minutes)*60
        if sign == "-":
            utc_offset *= -1

    elif isinstance(offset, int):
        utc_offset = offset

    else:
        raise ValueError("Offset must be a string in the format '+HH:MM' or '-HH:MM' or an integer in seconds")

    return utc_offset


def format_utc_offset(offset: tzinfo) -> str:
    """Format a UTC offset as a string in the format '+HH:MM' or '-HH:MM'"""
    utc_offset = offset.utcoffset(datetime.now()).total_seconds()
    sign = "-" if utc_offset < 0 else "+"
    hours = int(abs(utc_offset)//3600)
    minutes = int(abs(utc_offset)%3600/60)

    if hours == 0 and minutes == 0:
        return "UTC"

    return f"{sign}{hours:02d}:{minutes:02d}"
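For reference, the two helpers in tsp/time.py are self-contained and can be exercised standalone. The sketch below is a lightly condensed copy of the functions from the diff above, with a few illustrative inputs (the example offsets are not from the package's own tests):

```python
import re
from datetime import datetime, timedelta, timezone, tzinfo
from typing import Union


def get_utc_offset(offset: "Union[str,int]") -> int:
    """Get the UTC offset in seconds from a string or integer (from tsp/time.py)."""
    if isinstance(offset, str):
        if offset.lower() == "utc" or offset.lower() == "z":
            return 0
        match = re.compile(r"([+-]?)(\d{2}):(\d{2})").match(offset)
        if not match:
            raise ValueError("Offset must be a string in the format '+HH:MM' or '-HH:MM'")
        utc_offset = (int(match.group(2)) * 60 + int(match.group(3))) * 60
        if match.group(1) == "-":
            utc_offset *= -1
    elif isinstance(offset, int):
        utc_offset = offset
    else:
        raise ValueError("Offset must be '+HH:MM'/'-HH:MM' or an integer in seconds")
    return utc_offset


def format_utc_offset(offset: tzinfo) -> str:
    """Format a tzinfo's UTC offset as '+HH:MM'/'-HH:MM' (from tsp/time.py)."""
    utc_offset = offset.utcoffset(datetime.now()).total_seconds()
    sign = "-" if utc_offset < 0 else "+"
    hours = int(abs(utc_offset) // 3600)
    minutes = int(abs(utc_offset) % 3600 / 60)
    if hours == 0 and minutes == 0:
        return "UTC"
    return f"{sign}{hours:02d}:{minutes:02d}"


print(get_utc_offset("-07:00"))                          # -25200
print(get_utc_offset("+05:30"))                          # 19800
print(format_utc_offset(timezone(timedelta(hours=-7))))  # -07:00
```

Note that the two functions round-trip: `format_utc_offset` of a zero offset returns "UTC", which `get_utc_offset` in turn maps back to 0 seconds.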
tsp/tspwarnings.py
CHANGED
@@ -1,15 +1,27 @@
(lines 1-15 are re-added unchanged; the update appends a new NonIncreasingTimesWarning class. The resulting file:)

import numpy as np


class DuplicateTimesWarning(UserWarning):
    """For when duplicate times are found in a file."""
    def __init__(self, times):
        self.times = times

    def _msg(self, times) -> str:
        m = f"Duplicate timestamps found: {times[np.where(times.duplicated())[0]]}. That's bad."
        return m

    def __str__(self):
        return self._msg(self.times)

class NonIncreasingTimesWarning(UserWarning):
    """For when non-increasing times are found in a file."""
    def __init__(self, times):
        self.times = times

    def _msg(self, times) -> str:
        n_bad = np.sum(np.diff(times.values) <= np.timedelta64(0, 'ns'))
        m = f"{n_bad} non-increasing timestamps found. That's bad."
        return m

    def __str__(self):
        return self._msg(self.times)
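The counting expression in the new NonIncreasingTimesWarning can be checked on a small pandas index; the sample timestamps below are illustrative, not from the package:

```python
import numpy as np
import pandas as pd

# A datalogger record with one repeated timestamp and one backwards jump.
times = pd.DatetimeIndex([
    "2023-01-01 00:00", "2023-01-01 01:00",
    "2023-01-01 01:00", "2023-01-01 00:30",
])

# Same expression used by NonIncreasingTimesWarning._msg: count consecutive
# steps where time does not strictly increase.
n_bad = np.sum(np.diff(times.values) <= np.timedelta64(0, 'ns'))
print(n_bad)  # 2
```

Here the repeated 01:00 stamp and the jump back to 00:30 each count as one non-increasing step.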
tsp/utils.py
CHANGED
@@ -1,101 +1,131 @@
(the old side of this hunk is truncated in the source diff, leaving only fragments such as "df =" and "averaged."; the rewritten file is shown in full)

import pandas as pd
import numpy as np
import warnings

import tsp
from tsp import TSP


def resolve_duplicate_times(t: TSP, keep="first") -> TSP:
    """Eliminate duplicate times in a TSP.

    Parameters
    ----------
    tsp : TSP
        TSP to resolve duplicate times in.
    keep : str, optional
        Method to resolve duplicate times. Chosen from "first", "average", "last", "strip"
        by default "first"

    Returns
    -------
    TSP
        TSP with no duplicated times."""
    resolver = _get_duplicate_resolver(keep)
    return resolver(t)


def _get_duplicate_resolver(keep: str):
    if keep == "first":
        return _first_duplicate_time
    elif keep == "average":
        return _average_duplicate_time
    elif keep == "last":
        return _last_duplicate_time
    elif keep == "strip":
        return _strip_duplicate_time
    else:
        raise ValueError(f"Unknown duplicate resolver method: {keep}")


def _first_duplicate_time(t: TSP):
    df = t.wide
    df = df[~df.index.duplicated(keep="first")]

    time = df.index
    values = df.drop(['time'], axis=1).values
    depths = df.drop(['time'], axis=1).columns

    t_new = TSP(times=time, values=values, depths=depths,
                latitude=t.latitude, longitude=t.longitude,
                site_id=t.site_id, metadata=t.metadata)

    return t_new


def _last_duplicate_time(t: TSP):
    df = t.wide
    df = df[~df.index.duplicated(keep="last")]

    time = df.index
    values = df.drop(['time'], axis=1).values
    depths = df.drop(['time'], axis=1).columns

    t_new = TSP(times=time, values=values, depths=depths,
                latitude=t.latitude, longitude=t.longitude,
                site_id=t.site_id, metadata=t.metadata)

    return t_new


def _strip_duplicate_time(t: TSP):
    df = t.wide
    df = df[~df.index.duplicated(keep=False)]

    time = df.index
    values = df.drop(['time'], axis=1).values
    depths = df.drop(['time'], axis=1).columns

    t_new = TSP(times=time, values=values, depths=depths,
                latitude=t.latitude, longitude=t.longitude,
                site_id=t.site_id, metadata=t.metadata)

    return t_new


def _average_duplicate_time(t: TSP):
    singleton = t.wide[~t.wide.index.duplicated(keep=False)]
    duplicated = t.wide[t.wide.index.duplicated(keep=False)].drop(['time'], axis=1).reset_index()
    averaged = duplicated.groupby(duplicated['index']).apply(lambda x: x[~x.isna()].mean(numeric_only=True))
    averaged.insert(0, 'time', averaged.index)

    df = pd.concat([singleton, averaged], ignore_index=False).sort_index()

    time = df.index
    values = df.drop(['time'], axis=1).values
    depths = df.drop(['time'], axis=1).columns

    t_new = TSP(times=time, values=values, depths=depths,
                latitude=t.latitude, longitude=t.longitude,
                site_id=t.site_id, metadata=t.metadata)

    return t_new


def midnight_offset_24_hr(t: TSP, reverse: bool = False) -> TSP:
    """ (de-)increment any timestamps that are exactly at midnight by 24 hours.

    Description:
    -----------
    Some dataloggers misattribute data at midnight to the previous day
    which can cause problems when plotting or analyzing data. This function
    corrects any timestamps that are exactly at midnight by adding/subtracting 24 hours.
    """

    values = t.values.copy()
    midnight_indices = np.where(t.times.hour == 0)[0]
    times = t.times.to_numpy().copy()
    if len(midnight_indices) > 0:
        if reverse:
            times[midnight_indices] -= np.timedelta64(24, 'h')
        else:
            times[midnight_indices] += np.timedelta64(24, 'h')

    with warnings.catch_warnings():
        warnings.filterwarnings("ignore", category=tsp.tspwarnings.NonIncreasingTimesWarning)

        t_new = TSP(times=times, values=values, depths=t.depths,
                    latitude=t.latitude, longitude=t.longitude,
                    site_id=t.site_id, metadata=t.metadata)

    return t_new
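The four duplicate resolvers differ only in the `keep` argument they pass to pandas' `Index.duplicated`, and `midnight_offset_24_hr` boils down to one `np.timedelta64` addition. Both idioms can be checked on toy data (the values below are illustrative, not from the package):

```python
import numpy as np
import pandas as pd

# Toy wide-format record with a duplicated reading at 2023-01-01.
df = pd.DataFrame(
    {"0.5": [1.0, 2.0, 3.0]},
    index=pd.to_datetime(["2023-01-01", "2023-01-01", "2023-01-02"]),
)

first = df[~df.index.duplicated(keep="first")]  # keeps the 1.0 reading
last = df[~df.index.duplicated(keep="last")]    # keeps the 2.0 reading
strip = df[~df.index.duplicated(keep=False)]    # drops both duplicates

print(first["0.5"].tolist())  # [1.0, 3.0]
print(last["0.5"].tolist())   # [2.0, 3.0]
print(strip["0.5"].tolist())  # [3.0]

# Core of midnight_offset_24_hr: push exact-midnight stamps forward one day.
t = pd.DatetimeIndex(["2023-01-01 00:00", "2023-01-01 12:00"])
arr = t.to_numpy().copy()
arr[np.where(t.hour == 0)[0]] += np.timedelta64(24, 'h')
print(arr[0])  # 2023-01-02T00:00
```

The "average" resolver is the odd one out: instead of a boolean mask it groups the duplicated rows by timestamp and takes a numeric mean, then re-concatenates them with the singleton rows.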
tsp/version.py
CHANGED
@@ -1 +1 @@
(the removed line is truncated in the source diff)
-version="1.
+version="1.10.2"
{tsp-1.8.1.dist-info → tsp-1.10.2.dist-info}/METADATA
CHANGED

@@ -1,86 +1,95 @@
(the old side of this hunk is partially truncated in the source diff; the new METADATA is shown in full)

Metadata-Version: 2.4
Name: tsp
Version: 1.10.2
Summary: Making permafrost data effortless
Home-page: https://gitlab.com/permafrostnet/teaspoon
Author: Nick Brown
Author-email: nick.brown@carleton.ca
Classifier: Development Status :: 4 - Beta
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: GNU General Public License (GPL)
Classifier: Operating System :: POSIX
Classifier: Programming Language :: Python
Classifier: Topic :: Software Development :: Libraries
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: pandas
Requires-Dist: numpy
Requires-Dist: regex
Requires-Dist: matplotlib
Requires-Dist: setuptools
Requires-Dist: netCDF4
Requires-Dist: pfit>=0.3.0
Requires-Dist: scipy
Requires-Dist: pyrsktools
Requires-Dist: openpyxl
Provides-Extra: nc
Requires-Dist: netCDF4; extra == "nc"
Requires-Dist: pfit>=0.3.0; extra == "nc"
Provides-Extra: plotting
Requires-Dist: scipy; extra == "plotting"
Provides-Extra: rbr
Requires-Dist: pyrsktools; extra == "rbr"
Requires-Dist: openpyxl; extra == "rbr"
Provides-Extra: full
Requires-Dist: scipy; extra == "full"
Requires-Dist: pyrsktools; extra == "full"
Requires-Dist: netCDF4; extra == "full"
Requires-Dist: openpyxl; extra == "full"
Requires-Dist: pfit>=0.3.0; extra == "full"
Provides-Extra: dev
Requires-Dist: manuel; extra == "dev"
Requires-Dist: pytest; extra == "dev"
Requires-Dist: pytest-cov; extra == "dev"
Requires-Dist: coverage; extra == "dev"
Requires-Dist: mock; extra == "dev"
Dynamic: author
Dynamic: author-email
Dynamic: classifier
Dynamic: description
Dynamic: description-content-type
Dynamic: home-page
Dynamic: license-file
Dynamic: provides-extra
Dynamic: requires-dist
Dynamic: summary

# Teaspoon

## See the full documentation on the [ReadTheDocs Pages](https://permafrostnet.gitlab.io/teaspoon/source/about.html)

## [What is it?](https://permafrostnet.gitlab.io/teaspoon/source/about.html)
`tsp` ('teaspoon') is a python library designed to make working with permafrost ground temperature time series data more straightforward, efficient, and reproducible. Some of the features include:

* Read a variety of common published data formats, datalogger outputs, and model results into a common data structure
  * GEOtop model output
  * GTN-P database export csv
  * NTGS ground temperature report csv
  * Geoprecision datalogger export
  * HoboWare datalogger export
* Export data in a variety of common formats
  * TSP-recommended csv format
  * netcdf
  * 'GTN-P'-style csv
  * 'NTGS'-style csv
* Perform common data transformations
  * Calculate daily, monthly, or yearly means, ignoring averaging periods with missing data
  * Switch between "long" and "wide" dataframes
* Visualize and explore your data with commonly used plots
  * Trumpet curves
  * Temperature-time graphs
  * Colour-contour profiles

## [Installation](https://permafrostnet.gitlab.io/teaspoon/source/install.html)

## [Usage Examples](https://permafrostnet.gitlab.io/teaspoon/source/examples.html)

## [How to contribute](https://permafrostnet.gitlab.io/teaspoon/source/contributions.html)

## Data Standard
TSP also defines a recommended csv format for ground temperature data (which can also be extended to many other kinds of permafrost data). It is described in the [DATA_STANDARD.md](./DATA_STANDARD.md) file in this directory. Files can be written (using the `TSP.to_csv` method) and read (using the `read_tsp` function) with the teaspoon software package.

## Citation
If you find this software helpful, please consider using the following citation:

> Brown, N., (2022). tsp ("Teaspoon"): A library for ground temperature data. Journal of Open Source Software, 7(77), 4704, [https://doi.org/10.21105/joss.04704](https://doi.org/10.21105/joss.04704)
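The README's "long vs wide" bullet corresponds to a standard pandas reshape. A toy sketch (column names and values are illustrative, not the package's exact schema):

```python
import pandas as pd

# Wide: one column per measurement depth (metres).
wide = pd.DataFrame({
    "time": pd.to_datetime(["2023-01-01", "2023-01-02"]),
    "0.5": [-1.2, -1.0],
    "1.0": [-0.8, -0.7],
})

# Wide -> long: one (time, depth, temperature) row per reading.
long = wide.melt(id_vars="time", var_name="depth", value_name="temperature")
print(len(long))  # 4

# Long -> wide again.
wide_again = long.pivot(index="time", columns="depth", values="temperature")
```

The long layout matches how databases such as GTN-P publish ground temperatures, while the wide layout is the natural shape for plotting profiles; `TSP` objects expose both views.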