shipgrav 1.0.2__py2.py3-none-any.whl → 1.0.4__py2.py3-none-any.whl

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
shipgrav/__init__.py CHANGED
@@ -9,9 +9,9 @@ DGS gravimeters output two types of files: serial, or 'raw' files; and 'laptop'
9
9
  Installation
10
10
  ------------
11
11
 
12
- shipgrav can be installed from `PyPI <https://pypi.org/project/shipgrav/>`_ using ``pip``. we recommend using an environment management tool like `conda <https://anaconda.org>`_. An exemplary set of commands to make a conda enviroment with shipgrav would be: ::
12
+ shipgrav can be installed from `PyPI <https://pypi.org/project/shipgrav/>`_ using ``pip``. We recommend using an environment management tool like `conda <https://anaconda.org>`_. An example set of commands to create a conda environment with shipgrav would be: ::
13
13
 
14
- conda create --name shipgrav numpy scipy pandas statsmodels tomli pyyaml matplotlib
14
+ conda create --name shipgrav numpy scipy pandas statsmodels tomli pyyaml tqdm pooch matplotlib geographiclib
15
15
  conda activate shipgrav
16
16
  pip install shipgrav
17
17
 
@@ -24,7 +24,14 @@ shipgrav's dependencies are
24
24
  * statsmodels
25
25
  * tomli
26
26
  * pyyaml
27
+ * tqdm
28
+ * pooch (optional, needed to run the example scripts)
27
29
  * matplotlib (optional, needed to run some of the example scripts)
30
+ * geographiclib (optional, needed to run one of the example scripts)
31
+ * jupyterlab (optional, if you want to run example scripts in jupyter)
32
+ * jupytext (optional, if you want to run example scripts in jupyter)
33
+
34
+ If you install shipgrav with ``pip``, it will also install any of the required dependencies that are missing. To make ``pip`` include the optional dependencies, run ``pip install shipgrav[examples]``.
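For instance (both commands appear in this documentation; note that in shells such as zsh the extras spec needs quoting, e.g. ``pip install "shipgrav[examples]"``): ::

    pip install shipgrav               # required dependencies only
    pip install shipgrav[examples]     # also install the optional example dependencies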
28
35
 
29
36
  The example scripts are `available on github <https://github.com/PFPE/shipgrav>`_. They are not packaged with the PyPI package and must be downloaded separately.
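One way to get them, assuming you want the whole repository, is: ::

    git clone https://github.com/PFPE/shipgrav.git
    cd shipgrav/example-scripts

Downloading the repository archive from the github page and unpacking it works as well.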
30
37
 
@@ -43,7 +50,7 @@ shipgrav consists of the modules ``io``, ``nav``, ``grav``, and ``utils``, along
43
50
  Data directories
44
51
  ----------------
45
52
 
46
- You can organize your data however you like; shipgrav does not care as long as you tell it where to look. However, to run the example scripts, you will need to download some data and ensure that the scripts are pointed to the paths where you put that data. A shell script ``download_data.sh`` is provided in ``example-scripts/`` to help download and organize the data.
53
+ You can organize your data however you like; shipgrav does not care as long as you tell it where to look. The example scripts are set up to download data files from public repositories using ``pooch``. The ``pooch.retrieve()`` function returns lists of file paths for files that have been downloaded and unpacked, so to adapt the example workflows for other data files, you will need to replace those lists of paths with the paths to your data.
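As a rough sketch of that pattern (the URL, checksum, and local paths below are placeholders, not the values used in the actual example scripts): ::

    import pooch

    # download an archive and unpack it; with a processor like Untar,
    # pooch.retrieve() returns the list of extracted file paths
    file_paths = pooch.retrieve(
        url='https://zenodo.org/record/0000000/files/example_data.tar.gz',  # placeholder
        known_hash=None,          # the example scripts pin real checksums here
        processor=pooch.Untar(),
    )

    # to adapt a workflow to your own data, substitute your own paths:
    file_paths = ['/path/to/my/cruise/dgs_laptop_001.dat']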
47
54
 
48
55
  Ties and bias
49
56
  -------------
@@ -62,7 +69,9 @@ The database file included in shipgrav lists the navigation talkers that we expe
62
69
  Example scripts
63
70
  ---------------
64
71
 
65
- The scripts in the ``example-scripts/`` directory use publicly available data files to run through some common workflows for marine gravity processing. The data files can be downloaded from R2R and Zenodo. The ``download_data.sh`` shell script in ``example-scripts/`` uses ``curl``, ``tar``, and ``unzip`` to download all the data files for you and unpack them into a directory structure that matches what the examples expect. You can also download and unpack the files on your own from the sources listed below:
72
+ The scripts in the ``example-scripts/`` directory use publicly available data files to run through some common workflows for marine gravity processing. All of the examples can be run as scripts (i.e., with ``python <script-name>.py``). All except ``interactive_line_pick.py`` can also be opened in ``jupyter`` as notebooks thanks to ``jupytext``. To run the examples in ``jupyter``, start ``jupyter lab``, right-click on the script file name, and select `open with -> notebook`.
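For instance, using one of the scripts described below: ::

    python dgs_bgm_comp.py      # run an example as a plain script

    jupyter lab                 # or start jupyter lab, then right-click the
                                # script and choose open with -> notebook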
73
+
74
+ The data files can be downloaded from R2R and Zenodo, and the scripts will do this automatically using ``pooch``. The sources are:
66
75
 
67
76
  * https://doi.org/10.7284/151470 - TN400 BGM3 data
68
77
  * https://doi.org/10.7284/151457 - TN400 nav data
@@ -71,8 +80,6 @@ The scripts in the ``example-scripts/`` directory use publicly available data fi
71
80
  * https://doi.org/10.7284/157177 - SR2312 mru data
72
81
  * https://doi.org/10.5281/zenodo.12733929 - TN400 DGS raw and laptop data, SR2312 DGS raw data, R/V Ride DGS meter and Hydrins metadata, satellite FAA tracks for comparison, example file for RMBA calculation
73
82
 
74
- If you are not using the download script provided, we recommend looking at what it does and mimicking the directory structure it sets up for the data files. Otherwise, you will have to edit the examples to point to the correct filepaths for wherever you put the data in your system.
75
-
76
83
  ``dgs_bgm_comp.py`` reads data from DGS and BGM gravimeter files from R/V Thompson cruise TN400. The files are lightly processed to obtain the FAA (including syncing with navigation data for more accurate locations), and the FAA is plotted alongside corresponding satellite-derived FAA.
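A minimal sketch of the reading and nav-syncing step of that workflow, using only reader functions that appear in this diff (``dgs_files`` and ``nav_files`` stand in for lists of file paths; the FAA reduction itself uses routines from ``shipgrav.grav`` that are not reproduced here): ::

    import numpy as np
    import shipgrav.io as sgi

    gdat = sgi.read_dgs_laptop(dgs_files, 'Thompson')   # DGS 'laptop' records
    nav = sgi.read_nav('Thompson', nav_files)           # navigation fixes

    # interpolate navigation onto the gravimeter timestamps so each
    # gravity sample gets a more accurate position
    tsec = np.array([t.timestamp() for t in gdat['date_time']])
    gdat['lon'] = np.interp(tsec, nav['time_sec'], nav['lon'])
    gdat['lat'] = np.interp(tsec, nav['time_sec'], nav['lat'])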
77
84
 
78
85
  .. image:: _static/TN400_FAA.png
@@ -101,7 +108,7 @@ If you are not using the download script provided, we recommend looking at what
101
108
  :height: 250px
102
109
  :align: center
103
110
 
104
- ``interactive_line_pick.py`` reads laptop data and navigation data from R/V Sally Ride cruise SR2312. The script generates an interactive plot with a cursor for users to select segments of the time series data based on mapped locations, in order to extract straight line segments from a cruise track. The selected segments are written to files that can be re-read by the next script...
111
+ ``interactive_line_pick.py`` reads laptop data and navigation data from R/V Sally Ride cruise SR2312. The script generates an interactive plot with a cursor for users to select segments of the time series data based on mapped locations, in order to extract straight line segments from a cruise track. `This script cannot be run in jupyter`. The selected segments are written to files that can be re-read by the next script...
105
112
 
106
113
  .. image:: _static/cursor.png
107
114
  :alt: Interactive line segment picker window.
@@ -141,4 +148,6 @@ Contributing
141
148
 
142
149
  Do you have ideas for making this software better? Go ahead and `raise an issue <https://github.com/PFPE/shipgrav/issues>`_ on the github page or, if you're a savvy Python programmer, submit a pull request. You can also email PFPE at pfpe-interal(at)whoi.edu.
143
150
 
151
+ If you raise an issue on github, please include as much detail as possible, such as the text of error messages. If there are no visible errors but you think the code is behaving oddly, provide a description of what the code is doing and what you think it *should* be doing instead. PFPE may ask for additional details or copies of data files in order to reproduce and diagnose an issue.
152
+
144
153
  """
shipgrav/database.toml CHANGED
@@ -13,6 +13,7 @@ NBP = 'GPGGA'
13
13
  Ride = 'INGGA'
14
14
  Revelle = 'GPGGA'
15
15
  Langseth = 'INGGA'
16
+ Sikuliaq = 'GPGGA'
16
17
 
17
18
  [BGM-scale]
18
19
  Atlantis = 4.994070552
shipgrav/grav.py CHANGED
@@ -1,11 +1,12 @@
1
- import numpy as np
2
- from scipy.signal import lfilter, firwin
3
- from pandas import DataFrame
4
- from statsmodels.api import OLS, add_constant
1
+ from copy import copy
5
2
  from datetime import datetime, timezone
6
3
  from math import factorial
4
+
5
+ import numpy as np
6
+ from pandas import DataFrame
7
+ from scipy.signal import firwin, lfilter
7
8
  from scipy.special import erf, erfcinv
8
- from copy import copy
9
+ from statsmodels.api import OLS, add_constant
9
10
 
10
11
  # impulse response of 10th order Taylor series differentiator
11
12
  tay10 = [1/1260, -5/504, 5/84, -5/21, 5/6,
@@ -579,10 +580,6 @@ def _calc_up_vecs(ctilt, ltilt):
579
580
  :return: **up_vecs** (*3xN ndarray*) - (cross, long, up) for platform
580
581
  """
581
582
 
582
- # get increments, assuming initial is 0
583
- inc_ct = np.append(ctilt[0], np.diff(ctilt))
584
- inc_lt = np.append(ltilt[0], np.diff(ltilt))
585
-
586
583
  # trig functions of tilts
587
584
  sc = np.sin(ctilt)
588
585
  cc = np.cos(ctilt)
@@ -756,8 +753,6 @@ def grav2d_folding(X, Y, Z, dx, dy, drho=0.6, dz=6000, ifold=True, npower=5):
756
753
  # folding frequency:
757
754
  mfx = int(nxt/2 + 1)
758
755
  mfy = int(nyt/2 + 1)
759
- mfx2 = mfx*2
760
- mfy2 = mfy*2
761
756
 
762
757
  # the important part:
763
758
  # (1) compute wavenumbers
@@ -1172,8 +1167,6 @@ def crustal_thickness_2D(ur, nx=1000, ny=1, dx=1.3, dy=0, zdown=10, rho=0.4,
1172
1167
  """
1173
1168
  assert wlarge > wsmall, 'wlarge must be larger than wsmall'
1174
1169
 
1175
- zmin = min(ur)
1176
- zmax = max(ur)
1177
1170
  ave = np.mean(ur)
1178
1171
 
1179
1172
  # shift to a new reference and convert milligal to gal
shipgrav/io.py CHANGED
@@ -1,11 +1,12 @@
1
- import numpy as np
2
- import pandas as pd
3
- from datetime import datetime, timezone
4
- import yaml
5
1
  import mmap
6
- import sys
7
2
  import os
8
3
  import re
4
+ from datetime import datetime, timezone
5
+
6
+ import numpy as np
7
+ import pandas as pd
8
+ import yaml
9
+ from tqdm import tqdm
9
10
 
10
11
  # TODO need a fix for places where we cross the international date line (read_nav)
11
12
  # TODO check pandas versions for datetime parsing (eg RGS read)
@@ -16,7 +17,7 @@ import re
16
17
  ########################################################################
17
18
 
18
19
 
19
- def read_nav(ship, pathlist, sampling=1, talker=None, ship_function=None):
20
+ def read_nav(ship, pathlist, sampling=1, talker=None, ship_function=None, progressbar=True):
20
21
  """ Read navigation strings from .GPS (or similar) files.
21
22
 
22
23
  Ships have different formats and use different talkers for preferred
@@ -37,6 +38,8 @@ def read_nav(ship, pathlist, sampling=1, talker=None, ship_function=None):
37
38
  This function should return arrays of lon, lat, and timestamps.
38
39
  Look at _navcoords() and navdate_Atlantis() (and similar functions) for examples.
39
40
  :type ship_function: function, optional
41
+ :param progressbar: display progress bar while list of files is read
42
+ :type progressbar: bool
40
43
 
41
44
  :returns: (*pd.DataFrame*) time series of geographic coordinates and timestamps
42
45
  """
@@ -48,7 +51,7 @@ def read_nav(ship, pathlist, sampling=1, talker=None, ship_function=None):
48
51
  nav_str = info['nav-talkers']
49
52
 
50
53
  # check to make sure we have some way to read nav data for this ship
51
- if ship not in nav_str.keys() and ship_function == None:
54
+ if ship not in nav_str.keys() and ship_function is None:
52
55
  print('R/V %s not yet supported for nav read; must supply read function' % ship)
53
56
  return -999
54
57
 
@@ -70,7 +73,7 @@ def read_nav(ship, pathlist, sampling=1, talker=None, ship_function=None):
70
73
  lonlon = np.array([])
71
74
  latlat = np.array([])
72
75
 
73
- for fpath in pathlist: # loop nav files (may be a lot of them)
76
+ for fpath in tqdm(pathlist,desc='reading nav',disable=not progressbar): # loop nav files (may be a lot of them)
74
77
  with open(fpath, 'r') as f:
75
78
  allnav = np.array(f.readlines()) # read the entire file
76
79
 
@@ -95,11 +98,17 @@ def read_nav(ship, pathlist, sampling=1, talker=None, ship_function=None):
95
98
  elif ship == 'Langseth':
96
99
  lon, lat = _navcoords(allnav, talker)
97
100
  timest = _navdate_Langseth(allnav, talker)
101
+ elif ship == 'Sikuliaq' and talker == 'GPGGA':
102
+ lon, lat = _navcoords(allnav, talker)
103
+ timest = _navdate_Ride(allnav, talker)
98
104
  else: # in theory we never get to this option, but catch just in case
99
105
  print(
100
106
  'R/V %s not yet supported for nav read; must supply read function' % ship)
101
107
  return -999
102
108
 
109
+ if type(timest) is int and timest == -999:
110
+ return timest # eg looking for a talker that is not in a file
111
+
103
112
  # posix, seconds, for interpolation
104
113
  sec_time = np.array([d.timestamp() for d in timest])
105
114
  _, idx = np.unique(sec_time, return_index=True)
@@ -169,7 +178,7 @@ def _navdate_Atlantis(allnav, talker):
169
178
 
170
179
  for i in range(N):
171
180
  pre = subnav[i].split(talker)[0]
172
- date = re.findall('NAV (\d{4})/(\d{2})/(\d{2})', pre)[0]
181
+ date = re.findall(r'NAV (\d{4})/(\d{2})/(\d{2})', pre)[0]
173
182
  year = int(date[0]) # year
174
183
  mon = int(date[1]) # month
175
184
  day = int(date[2]) # day
@@ -191,7 +200,7 @@ def _navdate_NBP(allnav, talker):
191
200
 
192
201
  for i in range(N):
193
202
  pre = subnav[i].split(talker)[0]
194
- date = re.findall('(\d{2})\+(\d{2,3}):.*', pre)[0]
203
+ date = re.findall(r'(\d{2})\+(\d{2,3}):.*', pre)[0]
195
204
  # year (NBP didn't exist before 2000 so this is ok)
196
205
  year = '20' + date[0]
197
206
  doy = date[1] # doy
@@ -216,7 +225,7 @@ def _navdate_Thompson(allnav, talker):
216
225
 
217
226
  for i in range(N):
218
227
  pre = subnav[i].split(talker)[0]
219
- date = re.findall('(\d{2})/(\d{2})/(\d{4}),*', pre)[0]
228
+ date = re.findall(r'(\d{2})/(\d{2})/(\d{4}),*', pre)[0]
220
229
  year = int(date[2])
221
230
  mon = int(date[0])
222
231
  day = int(date[1])
@@ -226,7 +235,7 @@ def _navdate_Thompson(allnav, talker):
226
235
 
227
236
 
228
237
  def _navdate_Revelle(allnav, talker):
229
- """Extract datetime info from Revelle nav files (mru_seapatah330_rr_navbho-*.txt).
238
+ """Extract datetime info from Revelle nav files (mru_seapath330_rr_navbho-*.txt).
230
239
  """
231
240
  hour, mint, sec, msec = _clock_time(allnav, talker)
232
241
 
@@ -247,8 +256,8 @@ def _navdate_Revelle(allnav, talker):
247
256
  for k in range(inds[i]-1, j, -1): # step backwards toward the last talker line
248
257
  before = allnav[k]
249
258
  # date is at the start of this line
250
- if re.match('(\d{4})-(\d{2})-(\d{2})T*', before):
251
- date = re.findall('(\d{4})-(\d{2})-(\d{2})T*', before)[0]
259
+ if re.match(r'(\d{4})-(\d{2})-(\d{2})T*', before):
260
+ date = re.findall(r'(\d{4})-(\d{2})-(\d{2})T*', before)[0]
252
261
  year = int(date[0])
253
262
  mon = int(date[1])
254
263
  day = int(date[2])
@@ -264,17 +273,19 @@ def _navdate_Ride(allnav, talker):
264
273
  inav = [talker in s for s in allnav] # find lines of file with this talker
265
274
  subnav = allnav[inav] # select only those lines
266
275
  N = len(subnav)
276
+ if N == 0:
277
+ return -999
267
278
  # array for timestamps, as datetime objects
268
279
  timest = np.empty(N, dtype=datetime)
269
280
 
270
281
  for i in range(N):
271
282
  if talker == 'INGGA': # on Ride, uses posix timestamps
272
- date = re.findall('(\d+(\.\d*)?) \$%s' % talker, subnav[i])[0]
283
+ date = re.findall(r'(\d+(\.\d*)?) \$%s' % talker, subnav[i])[0]
273
284
  timest[i] = datetime.fromtimestamp(
274
- float(date[0]), tzinfo=timezone.utc)
285
+ float(date[0]), timezone.utc)
275
286
  elif talker == 'GPGGA': # includes time only with date, unlike other GPGGAs
276
287
  date = re.findall(
277
- '(\d{4})\-(\d{2})\-(\d{2})T(\d{2}):(\d{2}):(\d{2})\.(\d.*?)Z', subnav[i])[0]
288
+ r'(\d{4})\-(\d{2})\-(\d{2})T(\d{2}):(\d{2}):(\d{2})\.(\d.*?)Z', subnav[i])[0]
278
289
  year = int(date[0])
279
290
  mon = int(date[1])
280
291
  day = int(date[2])
@@ -359,7 +370,7 @@ def _clean_180cross(gps_nav):
359
370
  ########################################################################
360
371
 
361
372
 
362
- def read_bgm_rgs(fp, ship):
373
+ def read_bgm_rgs(fp, ship, progressbar=True):
363
374
  """Read BGM gravity from RGS files.
364
375
 
365
376
  RGS is supposedly a standard format; is consistent between Atlantis
@@ -369,6 +380,8 @@ def read_bgm_rgs(fp, ship):
369
380
  :type fp: string, or list of strings
370
381
  :param ship: ship name
371
382
  :type ship: string
383
+ :param progressbar: display progress bar while list of files is read
384
+ :type progressbar: bool
372
385
 
373
386
  :returns: (*pd.DataFrame*) timestamps, raw gravity, and geographic coordinates
374
387
  """
@@ -377,21 +390,20 @@ def read_bgm_rgs(fp, ship):
377
390
  print('R/V %s not supported for RGS read yet' % ship)
378
391
  return -999
379
392
 
380
- if type(fp) == str:
393
+ if type(fp) is str:
381
394
  fp = [fp,] # make a list if only one path is given
382
395
 
383
396
  dats = []
384
- for path in fp:
397
+ for path in tqdm(fp, desc='reading RGS files', disable=not progressbar):
385
398
  dat = pd.read_csv(path, delimiter=' ', names=['date', 'time', 'grav', 'lat', 'lon'],
386
- usecols=(1, 2, 3, 11, 12), parse_dates=[[0, 1]])
387
- ndt = [e.tz_localize(timezone.utc) for e in dat['date_time']]
388
- dat['date_time'] = ndt
399
+ usecols=(1, 2, 3, 11, 12))
400
+ dat['date_time'] = pd.to_datetime(dat.pop('date')+' '+dat.pop('time'),utc=True)
389
401
  dats.append(dat)
390
402
 
391
403
  return pd.concat(dats, ignore_index=True)
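A usage sketch for this reader, mirroring the updated test (file name and ship taken from the test data): ::

    import shipgrav.io as sgi

    # read BGM gravity from an RGS file; timestamps are parsed as UTC
    bgm = sgi.read_bgm_rgs('AT05_01_bgm.RGS', 'Atlantis', progressbar=False)
    print(bgm[['date_time', 'grav', 'lat', 'lon']].head())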
392
404
 
393
405
 
394
- def read_bgm_raw(fp, ship, scale=None, ship_function=None):
406
+ def read_bgm_raw(fp, ship, scale=None, ship_function=None, progressbar=True):
395
407
  """Read BGM gravity from raw (serial) files (not RGS).
396
408
 
397
409
  This function uses scale factors determined for specific BGM meters
@@ -408,6 +420,8 @@ def read_bgm_raw(fp, ship, scale=None, ship_function=None):
408
420
  The function should return a pandas.DataFrame with timestamps and counts.
409
421
  Look at _bgmserial_Atlantis() and similar functions for examples.
410
422
  :type ship_function: function, optional
423
+ :param progressbar: display progress bar while list of files is read
424
+ :type progressbar: bool
411
425
 
412
426
  :return: (*pd.DataFrame*) timestamps and calibrated raw gravity values
413
427
  """
@@ -427,10 +441,10 @@ def read_bgm_raw(fp, ship, scale=None, ship_function=None):
427
441
  else:
428
442
  scale = sc_fac[ship]
429
443
 
430
- if type(fp) == str:
444
+ if type(fp) is str:
431
445
  fp = [fp,] # make a list if only one path is given
432
446
  dats = []
433
- for path in fp:
447
+ for path in tqdm(fp,desc='reading BGM files',disable=not progressbar):
434
448
  if ship_function != None:
435
449
  dat = ship_function(path)
436
450
  else:
@@ -456,10 +470,8 @@ def _bgmserial_Atlantis(path):
456
470
  def count(x): return (
457
471
  int(x.split(':')[-1])) # function to parse counts column
458
472
  dat = pd.read_csv(path, delimiter=' ', names=['date', 'time', 'counts'], usecols=(1, 2, 4),
459
- parse_dates=[[0, 1]], converters={'counts': count})
460
- ndt = [e.tz_localize(timezone.utc)
461
- for e in dat['date_time']] # timestamps cannot be naive
462
- dat['date_time'] = ndt
473
+ converters={'counts': count})
474
+ dat['date_time'] = pd.to_datetime(dat.pop('date')+' '+dat.pop('time'),utc=True)
463
475
  return dat
464
476
 
465
477
 
@@ -468,9 +480,8 @@ def _bgmserial_Thompson(path):
468
480
  """
469
481
  def count(x): return (int(x.split(' ')[0].split(':')[-1]))
470
482
  dat = pd.read_csv(path, delimiter=',', names=['date', 'time', 'counts'],
471
- parse_dates=[[0, 1]], converters={'counts': count})
472
- ndt = [e.tz_localize(timezone.utc) for e in dat['date_time']]
473
- dat['date_time'] = ndt
483
+ converters={'counts': count})
484
+ dat['date_time'] = pd.to_datetime(dat.pop('date')+' '+dat.pop('time'),utc=True)
474
485
  return dat
475
486
 
476
487
 
@@ -480,8 +491,6 @@ def _bgmserial_Revelle(path):
480
491
  def count(x): return (int(x.split(':')[-1]))
481
492
  dat = pd.read_csv(path, delimiter=' ', names=['date_time', 'counts'], usecols=(0, 1),
482
493
  parse_dates=[0], converters={'counts': count})
483
- ndt = [e.tz_localize(timezone.utc) for e in dat['date_time']]
484
- dat['date_time'] = ndt
485
494
  return dat
486
495
 
487
496
  def _bgmserial_Langseth(path):
@@ -523,7 +532,7 @@ def _despike_bgm_serial(dat, thresh=8000):
523
532
  ########################################################################
524
533
 
525
534
 
526
- def read_dgs_laptop(fp, ship, ship_function=None):
535
+ def read_dgs_laptop(fp, ship, ship_function=None, progressbar=True):
527
536
  """Read DGS 'laptop' file(s), usually written as .dat files.
528
537
 
529
538
  :param fp: filepath(s)
@@ -534,15 +543,17 @@ def read_dgs_laptop(fp, ship, ship_function=None):
534
543
  The function should return a pandas.DataFrame. See _dgs_laptop_general()
535
544
  for an example.
536
545
  :type ship_function: function, optional
546
+ :param progressbar: display progress bar while list of files is read
547
+ :type progressbar: bool
537
548
 
538
549
  :return: *(pd.DataFrame)* DGS output time series
539
550
  """
540
- if type(fp) == str:
551
+ if type(fp) is str:
541
552
  fp = [fp,] # listify
542
553
 
543
554
  dats = []
544
- for path in fp:
545
- if ship_function != None:
555
+ for path in tqdm(fp,desc='reading DGS files',disable=not progressbar):
556
+ if ship_function is not None:
546
557
  dat = ship_function(path)
547
558
  else:
548
559
  if ship in ['Atlantis', 'Revelle', 'NBP', 'Ride', 'DGStest']:
@@ -575,14 +586,12 @@ def _dgs_laptop_Thompson(path):
575
586
  """
576
587
  dat = pd.read_csv(path, delimiter=',', names=['date', 'time', 'rgrav', 've', 'vcc',
577
588
  'al', 'ax', 'lat', 'lon'],
578
- usecols=(0, 1, 3, 12, 13, 14, 15, 16, 17),
579
- parse_dates=[[0, 1]])
580
- ndt = [e.tz_localize(timezone.utc) for e in dat['date_time']]
581
- dat['date_time'] = ndt
589
+ usecols=(0, 1, 3, 12, 13, 14, 15, 16, 17))
590
+ dat['date_time'] = pd.to_datetime(dat.pop('date')+' '+dat.pop('time'),utc=True)
582
591
  return dat
583
592
 
584
593
 
585
- def read_dgs_raw(fp, ship, scale_ccp=True):
594
+ def read_dgs_raw(fp, ship, scale_ccp=True, progressbar=True):
586
595
  """Read raw (serial) output files from DGS AT1M.
587
596
 
588
597
  These will be in AD units mostly.
@@ -594,14 +603,16 @@ def read_dgs_raw(fp, ship, scale_ccp=True):
594
603
  :type fp: string or list of strings
595
604
  :param ship: ship name
596
605
  :type ship: string
606
+ :param progressbar: display progress bar while list of files is read
607
+ :type progressbar: bool
597
608
 
598
609
  :return: (*pd.DataFrame*) DGS output time series
599
610
  """
600
611
 
601
- if type(fp) == str:
612
+ if type(fp) is str:
602
613
  fp = [fp,] # listify
603
614
  dats = []
604
- for ip, path in enumerate(fp):
615
+ for path in tqdm(fp,desc='reading DGS files',disable=not progressbar):
605
616
  if ship == 'Thompson': # always with the special file formats
606
617
  dat = _dgs_raw_Thompson(path)
607
618
  else: # there might be exceptions besides Thompson but I don't know about them yet
@@ -746,4 +757,13 @@ def read_other_stuff(yaml_file, data_file, tag):
746
757
  except KeyError:
747
758
  pass
748
759
 
749
- return df.apply(pd.to_numeric, errors='ignore'), col_info
760
+ def numeric(col):
761
+ try:
762
+ return pd.to_numeric(col)
763
+ except (ValueError, TypeError):
764
+ return col
765
+
766
+ for ckey in df.columns: # to_numeric by column so errors for datetimes are skipped
767
+ df[ckey] = df[ckey].apply(numeric) # slower than applying to whole df, but avoids errors
768
+
769
+ return df, col_info
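A usage sketch matching the existing test for this function (YAML definition, data file, and NMEA tag taken from the test data): ::

    import shipgrav.io as sgi

    # parse PASHR motion records described by the IXBlue YAML definition;
    # columns are converted to numeric where possible, others are left alone
    mru, cols = sgi.read_other_stuff('IXBlue.yaml', 'SR2312_mru.txt', 'PASHR')
    print(mru[['Pitch:g', 'Roll:g', 'Heave:g']].head())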
@@ -2,15 +2,16 @@
2
2
  Tests for shipgrav
3
3
  """
4
4
 
5
- from pkg_resources import resource_filename
5
+ import importlib.resources as importlib_resources
6
6
  import sys
7
7
  import unittest
8
8
 
9
9
 
10
10
  def run():
11
11
  loader = unittest.TestLoader()
12
- test_dir = resource_filename('shipgrav', 'tests')
13
- suite = loader.discover(test_dir)
12
+ ref = importlib_resources.files('shipgrav') / 'tests'
13
+ with importlib_resources.as_file(ref) as path:
14
+ suite = loader.discover(str(path))
14
15
  runner = unittest.runner.TextTestRunner() # verbosity=2)
15
16
  ret = not runner.run(suite).wasSuccessful()
16
17
  sys.exit(ret)
@@ -0,0 +1,4 @@
1
+ BGM 2022/07/01 12:17:33.492 BGM3 04:024963 00
2
+ BGM 2022/07/01 12:17:34.492 BGM3 04:024982 00
3
+ BGM 2022/07/01 12:17:35.492 BGM3 04:024992 00
4
+ BGM 2022/07/01 12:17:36.492 BGM3 04:024983 00
@@ -0,0 +1,8 @@
1
+ NAV 2022/07/01 12:17:33.233 GPS $GPZDA,121733.00,01,07,2022,00,00*67
2
+ NAV 2022/07/01 12:17:33.433 GPS $GPRMC,121733.00,A,4131.403226,N,07040.311159,W,0.01,218.7,010722,,,D*7C
3
+ NAV 2022/07/01 12:17:33.433 GPS $GPVTG,218.7,T,,M,0.01,N,0.02,K,D*07
4
+ NAV 2022/07/01 12:17:33.537 GPS $GPGGA,121733.00,4131.403226,N,07040.311159,W,2,12,0.9,25.169,M,-30.062,M,4.0,0402*46
5
+ NAV 2022/07/01 12:17:34.233 GPS $GPZDA,121734.00,01,07,2022,00,00*60
6
+ NAV 2022/07/01 12:17:34.433 GPS $GPRMC,121734.00,A,4131.403225,N,07040.311162,W,0.00,206.6,010722,,,D*7F
7
+ NAV 2022/07/01 12:17:34.433 GPS $GPVTG,206.6,T,,M,0.00,N,0.01,K,D*0B
8
+ NAV 2022/07/01 12:17:34.537 GPS $GPGGA,121734.00,4131.403225,N,07040.311162,W,2,12,0.9,25.176,M,-30.062,M,5.0,0402*45
@@ -0,0 +1,4 @@
1
+ vc01 2020:244:00:00:00.3244 04:025229 00
2
+ vc01 2020:244:00:00:01.3244 04:025247 00
3
+ vc01 2020:244:00:00:02.3235 04:025404 00
4
+ vc01 2020:244:00:00:03.3248 04:025535 00
@@ -0,0 +1,12 @@
1
+ seapath 2020:244:00:00:00.6212 $INGGA,000000.37,5420.890903,N,13237.257959,W,2,09,1.2,1.62,M,-7.75,M,4.2,0291*55
2
+ seapath 2020:244:00:00:00.8547 $INGLL,5420.890903,N,13237.257959,W,000000.37,A,D*63
3
+ seapath 2020:244:00:00:00.8549 $INVTG,77.26,T,59.62,M,9.8,N,18.1,K,D*03
4
+ seapath 2020:244:00:00:00.8550 $INHDT,75.05,T*22
5
+ seapath 2020:244:00:00:00.8551 $PSXN,20,0,0,0,0*3B
6
+ seapath 2020:244:00:00:00.9253 $PSXN,23,1.06,0.15,75.05,0.04*08
7
+ seapath 2020:244:00:00:01.6218 $INGGA,000001.37,5420.891483,N,13237.253415,W,2,09,1.2,1.60,M,-7.75,M,5.2,0291*52
8
+ seapath 2020:244:00:00:01.8561 $INGLL,5420.891483,N,13237.253415,W,000001.37,A,D*67
9
+ seapath 2020:244:00:00:01.8562 $INVTG,77.89,T,60.26,M,9.8,N,18.2,K,D*0F
10
+ seapath 2020:244:00:00:01.8563 $INHDT,74.82,T*2C
11
+ seapath 2020:244:00:00:01.8564 $PSXN,20,0,0,0,0*3B
12
+ seapath 2020:244:00:00:01.9253 $PSXN,23,1.09,0.10,74.82,0.05*0D
@@ -0,0 +1,22 @@
1
+ 23+013:00:00:00.113 $GPVTG,5.71,T,,M,5.2,N,9.6,K,A*36
2
+ 23+013:00:00:00.115 $GPRMC,000000.81,A,7520.156405,S,17945.635852,W,5.2,5.71,130123,,,A*6C
3
+ 23+013:00:00:00.213 $GPHDT,4.62,T*05
4
+ 23+013:00:00:00.213 $PSXN,20,0,0,0,0*3B
5
+ 23+013:00:00:00.213 $PSXN,22,1.32,0.09*30
6
+ 23+013:00:00:00.213 $PSXN,23,-0.67,0.42,4.62,0.10*13
7
+ 23+013:00:00:00.996 $GPZDA,000001.81,13,01,2023,,*6E
8
+ 23+013:00:00:00.996 $GPGGA,000001.81,7520.154976,S,17945.635106,W,1,12,0.5,-1.36,M,-56.73,M,,*6F
9
+ 23+013:00:00:01.113 $GPVTG,6.29,T,,M,5.1,N,9.5,K,A*38
10
+ 23+013:00:00:01.113 $GPRMC,000001.81,A,7520.154976,S,17945.635106,W,5.1,6.29,130123,,,A*63
11
+ 23+013:00:00:01.213 $GPHDT,4.74,T*02
12
+ 23+013:00:00:01.213 $PSXN,20,0,0,0,0*3B
13
+ 23+013:00:00:01.213 $PSXN,22,1.32,0.09*30
14
+ 23+013:00:00:01.213 $PSXN,23,-0.46,0.32,4.74,0.06*17
15
+ 23+013:00:00:02.012 $GPZDA,000002.81,13,01,2023,,*6D
16
+ 23+013:00:00:02.012 $GPGGA,000002.81,7520.153540,S,17945.635196,W,1,12,0.5,-1.29,M,-56.73,M,,*65
17
+ 23+013:00:00:02.142 $GPVTG,2.95,T,,M,5.1,N,9.5,K,A*3B
18
+ 23+013:00:00:02.142 $GPRMC,000002.81,A,7520.153540,S,17945.635196,W,5.1,2.95,130123,,,A*64
19
+ 23+013:00:00:02.213 $GPHDT,4.81,T*08
20
+ 23+013:00:00:02.213 $PSXN,20,0,0,0,0*3B
21
+ 23+013:00:00:02.213 $PSXN,22,1.32,0.09*30
22
+ 23+013:00:00:02.214 $PSXN,23,-0.61,0.22,4.81,0.01*1E
@@ -0,0 +1,6 @@
1
+ 2022-11-07T17:13:45.329611Z 04:024882 00
2
+ 2022-11-07T17:13:46.336966Z 04:024881 00
3
+ 2022-11-07T17:13:47.329375Z 04:024871 00
4
+ 2022-11-07T17:13:48.329063Z 04:024860 00
5
+ 2022-11-07T17:13:49.328981Z 04:024850 00
6
+ 2022-11-07T17:13:50.328786Z 04:024840 00
@@ -0,0 +1,19 @@
1
+ 2022-11-05T00:00:00.002010Z $GPZDA,000000.00,05,11,2022,,*61
2
+ $GPGGA,000000.00,3345.095825,N,11923.337492,W,2,09,0.9,-3.95,M,-33.51,M,2.0,0135*5F
3
+ $GPGLL,3345.095825,N,11923.337492,W,000000.00,A,D*7C
4
+ $GPVTG,200.16,T,,M,11.0,N,20.4,K,D*3B
5
+ $GPVBW,,,V,10.99,-0.08,A,,V,,V*57
6
+ $GPRMC,000000.00,A,3345.095825,N,11923.337492,W,11.0,200.16,051122,,,D*4B
7
+ $GPHDT,200.59,T*0B
8
+ $PSXN,20,0,0,0,0*3B
9
+ $PSXN,23,-1.34,0.12,200.59,-0.07*34
10
+
11
+ 2022-11-05T00:00:00.202094Z $GPZDA,000000.20,05,11,2022,,*63
12
+ $GPGGA,000000.20,3345.095252,N,11923.337748,W,2,09,0.9,-4.00,M,-33.51,M,2.0,0135*58
13
+ $GPGLL,3345.095252,N,11923.337748,W,000000.20,A,D*70
14
+ $GPVTG,200.60,T,,M,11.0,N,20.4,K,D*3A
15
+ $GPVBW,,,V,11.00,-0.02,A,,V,,V*5C
16
+ $GPRMC,000000.20,A,3345.095252,N,11923.337748,W,11.0,200.60,051122,,,D*46
17
+ $GPHDT,200.70,T*00
18
+ $PSXN,20,0,0,0,0*3B
19
+ $PSXN,23,-1.32,0.06,200.70,-0.03*38
@@ -0,0 +1,36 @@
1
+ 1674228947.582 $INZDA,153547.44,20,01,2023,,*77
2
+ 1674228947.583 $INGGA,153547.44,3242.434865,N,11714.203305,W,2,09,0.8,-4.31,M,-33.32,M,2.0,0135*4B
3
+ 1674228947.583 $INGLL,3242.434865,N,11714.203305,W,153547.44,A,D*65
4
+ 1674228947.583 $INVTG,112.40,T,,M,0.0,N,0.1,K,D*2F
5
+ 1674228947.583 $INRMC,153547.44,A,3242.434865,N,11714.203305,W,0.0,112.40,200123,,,D*66
6
+ 1674228947.603 $GPGSA,M,3,21,31,09,16,03,01,26,04,22,,,,1.7,0.8,1.5*39
7
+ 1674228947.633 $INHDT,179.39,T*10
8
+ 1674228947.643 $PSXN,20,0,0,0,0*3B
9
+ 1674228947.653 $PSXN,23,-0.33,0.42,179.39,0.01*17
10
+ 1674228947.673 $PSXN,26,0,44.5000,0.7800,-0.9000,NRP*6D
11
+ 1674228947.693 $PSXN,26,1,44.5000,0.7800,-0.9000,Monitoring Point #1*44
12
+ 1674228947.723 $PSXN,26,2,44.5000,0.7800,-0.9000,Monitoring Point #2*1F
13
+ 1674228947.754 $PSXN,26,3,44.5000,0.7800,-0.9000,Monitoring Point #3*33
14
+ 1674228947.784 $PSXN,26,4,44.5000,0.7800,-0.9000,Monitoring Point #4*68
15
+ 1674228947.814 $PSXN,26,5,44.5000,0.7800,-0.9000,Monitoring Point #5*3D
16
+ 1674228947.844 $PSXN,26,6,44.5000,0.7800,-0.9000,Monitoring Point #6*11
17
+ 1674228947.874 $PSXN,26,7,44.5000,0.7800,-0.9000,Monitoring Point #7*4A
18
+ 1674228947.904 $PSXN,26,8,44.5000,0.7800,-0.9000,Monitoring Point #8*64
19
+ 1674228948.444 $INZDA,153548.44,20,01,2023,,*78
20
+ 1674228948.474 $INGGA,153548.44,3242.434850,N,11714.203293,W,2,09,0.8,-4.32,M,-33.32,M,3.0,0135*4E
21
+ 1674228948.514 $INGLL,3242.434850,N,11714.203293,W,153548.44,A,D*62
22
+ 1674228948.544 $INVTG,117.77,T,,M,0.0,N,0.1,K,D*2E
23
+ 1674228948.564 $INRMC,153548.44,A,3242.434850,N,11714.203293,W,0.0,117.77,200123,,,D*60
24
+ 1674228948.604 $GPGSA,M,3,21,31,09,16,03,01,26,04,22,,,,1.7,0.8,1.5*39
25
+ 1674228948.634 $INHDT,179.38,T*11
26
+ 1674228948.644 $PSXN,20,0,0,0,0*3B
27
+ 1674228948.654 $PSXN,23,-0.27,0.41,179.38,0.01*10
28
+ 1674228948.674 $PSXN,26,0,44.5000,0.7800,-0.9000,NRP*6D
29
+ 1674228948.694 $PSXN,26,1,44.5000,0.7800,-0.9000,Monitoring Point #1*44
30
+ 1674228948.724 $PSXN,26,2,44.5000,0.7800,-0.9000,Monitoring Point #2*1F
31
+ 1674228948.754 $PSXN,26,3,44.5000,0.7800,-0.9000,Monitoring Point #3*33
32
+ 1674228948.784 $PSXN,26,4,44.5000,0.7800,-0.9000,Monitoring Point #4*68
33
+ 1674228948.814 $PSXN,26,5,44.5000,0.7800,-0.9000,Monitoring Point #5*3D
34
+ 1674228948.844 $PSXN,26,6,44.5000,0.7800,-0.9000,Monitoring Point #6*11
35
+ 1674228948.874 $PSXN,26,7,44.5000,0.7800,-0.9000,Monitoring Point #7*4A
36
+ 1674228948.904 $PSXN,26,8,44.5000,0.7800,-0.9000,Monitoring Point #8*64
@@ -0,0 +1,2 @@
1
+ 03/12/2022,16:04:07.033,9984.869108166342,9995.951860660536, 0.031769279452, 0.049233442454, -0.894949104184, 62.394784400000, 16730, 4086876, 30.259833080000, 50.442681000000, 0.000020000000, 0.035710000000, 0.000070000000, -0.000010000000, 0.000000000000, 0.000000000000, 0.000000000000, 0.000000000000, 0.044215194148,2022,03,12,16,04,19.80,5370472
2
+ 03/12/2022,16:04:08.017,9984.870433770533,9977.883336291710, 0.030815605022, 0.037431721381, -0.894945527905, 62.394784400000, 16730, 4086005, 30.260701895000, 50.447645000000, 0.000020000000, 0.032780000000, 0.000020000000, -0.000010000000, 0.000000000000, 0.000000000000, 0.000000000000, 0.000000000000, 0.044061782566,2022,03,12,16,04,20.80,5370473
@@ -5,19 +5,25 @@ ccp coeffs are not objectively good because the time series is short but it work
5
5
  Not bothering with the leveling correction here because exactly how to calculate it
6
6
  is left to the user
7
7
  """
8
+ import os
8
9
  import unittest
10
+ from glob import glob
11
+
9
12
  import numpy as np
10
13
  import shipgrav.grav as sgg
11
14
  import shipgrav.io as sgi
12
15
  import shipgrav.nav as sgn
13
- from glob import glob
14
16
 
15
17
 
16
18
  class gravDataTestCase(unittest.TestCase):
19
+ @property
20
+ def ex_files(self):
21
+ return __file__.rsplit(os.sep, 1)[0] + os.sep + "ex_files"
22
+
17
23
  def setUp(self): # actions to take before running each test: load in some test data
18
24
  # test data is part of one of the files supplied by DGS (original file is 30M/86400
19
25
  # lines for 24 hr; we cut to 1 hr?)
20
- gfiles = glob('ex_files/DGStest*.dat')
26
+ gfiles = glob(f'{self.ex_files}'+os.sep+'DGStest*.dat')
21
27
  data = sgi.read_dgs_laptop(gfiles, 'DGStest')
22
28
  data['tsec'] = [e.timestamp()
23
29
  for e in data['date_time']] # get posix timestamps
@@ -2,6 +2,7 @@
2
2
  Tests for shipgrav.grav
3
3
  """
4
4
  import unittest
5
+
5
6
  import numpy as np
6
7
  import shipgrav.grav as sgg
7
8
 
@@ -35,9 +36,26 @@ class gravNoDataTestCase(unittest.TestCase):
35
36
 
36
37
  def test_crustalthickness(self):
37
38
  rng = np.random.default_rng(123) # seeded
38
- C = sgg.crustal_thickness_2D(10*rng.random(1000))
39
+ signal = 10*rng.random(1000)
40
+ C = sgg.crustal_thickness_2D(signal)
39
41
  self.assertTrue(np.real(C[0])[0] - 0.04019 < 0.001)
40
42
 
43
+ C2 = sgg.crustal_thickness_2D(signal, back=True)
44
+ self.assertEqual(np.real(C[0])[0], np.real(C2[0][-1])[0])
45
+
46
+ def test_grav2d_folding(self):
47
+ rng = np.random.default_rng(123) # seeded
48
+ X = np.arange(5); Y = np.arange(5)
49
+ Z = rng.random((5,5))
50
+ sdat = sgg.grav2d_folding(X, Y, Z, 100, 100, drho=0.6, dz=6000, ifold=True, npower=5)
51
+
52
+ self.assertTrue(sdat[0,0] + 0.0043244755 < 0.001)
53
+
54
+ def test_grav2d_layer(self):
55
+ rng = np.random.default_rng(123) # seeded
56
+ rho = 1e4*rng.random((5,5))
57
+ sdat = sgg.grav2d_layer_variable_density(rho, 100, 100, 3, 5)
58
+ self.assertTrue(sdat[0,0] - 130.877533 < 0.001)
41
59
 
42
60
  def suite():
43
61
  return unittest.makeSuite(gravNoDataTestCase, 'test')
shipgrav/tests/test_io.py CHANGED
@@ -4,44 +4,117 @@ This does not test every option for read (because there are many ships)
4
4
  but for each overall read function it reads a snippet of an example file
5
5
  and checks that the values in key columns are correct
6
6
  """
7
+ import os
7
8
  import unittest
9
+
8
10
  import shipgrav.io as sgi
9
11
 
10
12
 
11
13
  class ioTestCase(unittest.TestCase):
12
- def test_read_nav(self): # NOTE only tests one of the ship functions
13
- # but it does include extracting clock time -> datetime -> posix
14
- nav = sgi.read_nav('Thompson', 'ex_files/TN400_nav.Raw')
14
+ @property
15
+ def ex_files(self):
16
+ return __file__.rsplit(os.sep, 1)[0] + os.sep + "ex_files"
17
+
18
+ def test_read_nav_Thompson(self):
19
+ nav = sgi.read_nav('Thompson', f'{self.ex_files}'+os.sep+'TN400_nav.Raw', progressbar=False)
15
20
  self.assertEqual(nav.iloc[0].time_sec, 1647129603)
16
21
  self.assertTrue(nav.iloc[0].lon + 118.6524 < 0.001)
17
22
 
23
+ def test_read_nav_Atlantis(self):
24
+ nav = sgi.read_nav('Atlantis', f'{self.ex_files}'+os.sep+'AT01_nav.gps', progressbar=False)
25
+ self.assertEqual(nav.iloc[0].time_sec, 1656677853)
26
+ self.assertTrue(nav.iloc[0].lon + 70.67185265 < 0.001)
27
+
28
+ def test_read_nav_Langseth(self):
29
+ nav = sgi.read_nav('Langseth', f'{self.ex_files}'+os.sep+'MGL2003_nav.y2020d244', progressbar=False)
30
+ self.assertEqual(nav.iloc[0].time_sec, 1598832000.6212)
31
+ self.assertTrue(nav.iloc[0].lon + 132.620965893 < 0.001)
32
+
33
+ def test_read_nav_Revelle(self):
34
+ nav = sgi.read_nav('Revelle', f'{self.ex_files}'+os.sep+'RR2212_nav.txt', progressbar=False)
35
+ self.assertEqual(nav.iloc[0].time_sec, 1667606400)
36
+ self.assertTrue(nav.iloc[0].lon + 119.3889582 < 0.001)
37
+
38
+ def test_read_nav_Ride(self):
39
+ nav = sgi.read_nav('Ride', f'{self.ex_files}'+os.sep+'SR2302_nav.raw',talker='INGGA', progressbar=False)
40
+ self.assertEqual(nav.iloc[0].time_sec, 1674228947.583)
41
+ self.assertTrue(nav.iloc[0].lon + 117.23672175 < 0.001)
42
+
43
+ def test_read_nav_NBP(self):
44
+ nav = sgi.read_nav('NBP', f'{self.ex_files}'+os.sep+'NBP_2301_nav.d013', progressbar=False)
45
+ self.assertEqual(nav.iloc[0].time_sec, 1673568001.81)
46
+ self.assertTrue(nav.iloc[0].lon + 179.7605851 < 0.001)
47
+
48
+ def test_read_nav_nope(self):
49
+ nav = sgi.read_nav('Titanic', f'{self.ex_files}'+os.sep+'TN400_nav.Raw', progressbar=False)
50
+ self.assertEqual(nav, -999)
51
+
18
52
  def test_read_bgm_rgs(self):
19
- bgm = sgi.read_bgm_rgs('ex_files/AT05_01_bgm.RGS', 'Atlantis')
53
+ bgm = sgi.read_bgm_rgs(f'{self.ex_files}'+os.sep+'AT05_01_bgm.RGS', 'Atlantis', progressbar=False)
20
54
  self.assertEqual(bgm.iloc[0]['date_time'].timestamp(), 1656633600.445)
21
55
  self.assertEqual(bgm.iloc[0]['grav'], 980329.272)
22
56
 
23
- def test_read_bgm_raw(self):
24
- bgm = sgi.read_bgm_raw('ex_files/TN400_bgm.Raw', 'Thompson')
57
+ def test_read_bgm_raw_Atlantis(self):
58
+ bgm = sgi.read_bgm_raw(f'{self.ex_files}'+os.sep+'AT01_bgm.BGM', 'Atlantis', progressbar=False)
59
+ self.assertEqual(bgm.iloc[0]['date_time'].timestamp(), 1656677853.492)
60
+ self.assertEqual(bgm.iloc[0]['counts'], 24963)
61
+ self.assertEqual(bgm.iloc[0]['rgrav'], 124666.983189576)
62
+
63
+ def test_read_bgm_raw_Thompson(self):
64
+ bgm = sgi.read_bgm_raw(f'{self.ex_files}'+os.sep+'TN400_bgm.Raw', 'Thompson', progressbar=False)
25
65
  self.assertEqual(bgm.iloc[0]['date_time'].timestamp(), 1647129602.449)
26
66
  self.assertEqual(bgm.iloc[0]['counts'], 25529)
27
67
  self.assertEqual(bgm.iloc[0]['rgrav'], 127730.60800402702)
28
68
 
29
- def test_read_dgs_dat(self):
30
- dgs = sgi.read_dgs_laptop('ex_files/DGStest_laptop.dat', 'DGStest')
69
+ def test_read_bgm_raw_Langseth(self):
70
+ bgm = sgi.read_bgm_raw(f'{self.ex_files}'+os.sep+'MGL2003_bgm.y2020d244', 'Langseth', progressbar=False)
71
+ self.assertEqual(bgm.iloc[0]['date_time'].timestamp(), 1598832000.3244)
72
+ self.assertEqual(bgm.iloc[0]['counts'], 25229)
73
+ self.assertEqual(bgm.iloc[0]['rgrav'], 126145.0)
74
+
75
+ def test_read_bgm_raw_Revelle(self):
76
+ bgm = sgi.read_bgm_raw(f'{self.ex_files}'+os.sep+'RR2212_bgm.txt', 'Revelle', progressbar=False)
77
+ self.assertEqual(bgm.iloc[0]['date_time'].timestamp(), 1667841225.329611)
78
+ self.assertEqual(bgm.iloc[0]['counts'], 24882)
79
+ self.assertEqual(bgm.iloc[0]['rgrav'], 124405.2114591)
80
+
81
+ def test_read_bgm_raw_nope(self):
82
+ bgm = sgi.read_bgm_raw(f'{self.ex_files}'+os.sep+'TN400_bgm.Raw', 'Boaty McBoatface', progressbar=False)
83
+ self.assertEqual(bgm, -999)
84
+
85
+ def test_read_dgs_dat_general(self):
86
+ dgs = sgi.read_dgs_laptop(f'{self.ex_files}'+os.sep+'DGStest_laptop.dat', 'DGStest', progressbar=False)
31
87
  self.assertEqual(dgs.iloc[0]['date_time'].timestamp(), 1562803200.0)
32
88
  self.assertEqual(dgs.iloc[0]['ve'], 0.81098)
33
89
  self.assertTrue(dgs.iloc[0]['rgrav'] - 12295.691114 < 0.0001)
34
90
 
35
- def test_read_dgs_raw(self):
36
- dgs = sgi.read_dgs_raw('ex_files/SR2312_dgs_raw.txt', 'Ride')
91
+ def test_read_dgs_dat_Thompson(self):
92
+ dgs = sgi.read_dgs_laptop(f'{self.ex_files}'+os.sep+'TN400_dgs_proc.Raw', 'Thompson', progressbar=False)
93
+ self.assertEqual(dgs.iloc[0]['date_time'].timestamp(), 1647101047.033)
94
+ self.assertEqual(dgs.iloc[0]['ve'], 2e-5)
95
+ self.assertTrue(dgs.iloc[0]['rgrav'] - 9995.95186 < 0.0001)
96
+
97
+ def test_read_dgs_dat_nope(self):
98
+ dgs = sgi.read_dgs_laptop(f'{self.ex_files}'+os.sep+'DGStest_laptop.dat', 'Katama', progressbar=False)
99
+ self.assertEqual(dgs, -999)
100
+
101
+ def test_read_dgs_raw_general(self):
102
+ dgs = sgi.read_dgs_raw(f'{self.ex_files}'+os.sep+'SR2312_dgs_raw.txt', 'Ride', progressbar=False)
37
103
  self.assertEqual(
38
104
  dgs.iloc[0]['date_time'].timestamp(), 1686873600.857719)
39
105
  self.assertEqual(dgs.iloc[0]['Gravity'], -218747)
40
106
  self.assertTrue(dgs.iloc[0]['vcc'] - 76.8771 < 0.0001)
41
107
 
108
+ def test_read_dgs_raw_Thompson(self):
109
+ dgs = sgi.read_dgs_raw(f'{self.ex_files}'+os.sep+'TN400_dgs_raw.Raw', 'Thompson', progressbar=False)
110
+ self.assertEqual(
111
+ dgs.iloc[0]['date_time'].timestamp(), 1647101046.634)
112
+ self.assertEqual(dgs.iloc[0]['Gravity'], -82)
113
+ self.assertTrue(dgs.iloc[0]['vcc'] - 0.0357100 < 0.0001)
114
+
42
115
  def test_read_mru(self):
43
116
  mru, cols = sgi.read_other_stuff(
44
- 'ex_files/IXBlue.yaml', 'ex_files/SR2312_mru.txt', 'PASHR')
117
+ f'{self.ex_files}'+os.sep+'IXBlue.yaml', f'{self.ex_files}'+os.sep+'SR2312_mru.txt', 'PASHR')
45
118
  self.assertEqual(mru.iloc[0]['Pitch:g'], -0.41)
46
119
  self.assertEqual(mru.iloc[0]['Roll:g'], 2.03)
47
120
  self.assertEqual(mru.iloc[0]['Heave:g'], -0.6)
@@ -2,8 +2,9 @@
2
2
  Tests for shipgrav.nav
3
3
  """
4
4
  import unittest
5
- import shipgrav.nav as sgn
5
+
6
6
  import numpy as np
7
+ import shipgrav.nav as sgn
7
8
 
8
9
 
9
10
  class navTestCase(unittest.TestCase):
@@ -2,8 +2,9 @@
2
2
  Tests for shipgrav.utils
3
3
  """
4
4
  import unittest
5
- import shipgrav.utils as sgu
5
+
6
6
  import numpy as np
7
+ import shipgrav.utils as sgu
7
8
 
8
9
 
9
10
  class utilsTestCase(unittest.TestCase):
shipgrav/utils.py CHANGED
@@ -1,8 +1,8 @@
1
- import numpy as np
2
- from datetime import datetime
3
- from pandas import Series
4
1
  import re
5
2
 
3
+ import numpy as np
4
+ from pandas import DataFrame
5
+
6
6
  # TODO gaussian filter doesn't *need* time array
7
7
 
8
8
 
@@ -90,9 +90,9 @@ def decode_dgs_status_bits(stat, key='status'):
90
90
  flags = ['clamp status', 'GPSsync', 'reserved', 'feedback', 'R1', 'R2', 'ADlock', 'ack host',
91
91
  'NavMod1', 'NavMod2', 'dgs1_trouble', 'dgs2_trouble', 'GPStime', 'ADsat', 'nemo1', 'nemo2']
92
92
 
93
- assert type(stat) == int or type(
94
- stat) == DataFrame, 'bad input type for status bits'
95
- if type(stat) == int:
93
+ assert type(stat) is int or type(
94
+ stat) is DataFrame, 'bad input type for status bits'
95
+ if type(stat) is int:
96
96
  bt = format(stat, '016b')
97
97
  out = {}
98
98
  for i, f in enumerate(flags):
@@ -114,6 +114,8 @@ def clean_ini_to_toml(ini_file):
114
114
  the extension .ini replaced by .toml
115
115
 
116
116
  :param ini_file: path to input ini file
117
+
118
+ :return: **opath** (*string*) - path to output toml file
117
119
  """
118
120
 
119
121
  with open(ini_file, 'r') as file:
@@ -132,11 +134,12 @@ def clean_ini_to_toml(ini_file):
132
134
  text = re.sub('$', '', text)
133
135
 
134
136
  # fix unquoted strings and write
135
- fo = open(ini_file.rstrip('ini')+'toml', 'w')
137
+ opath = ini_file.rstrip('ini')+'toml'
138
+ fo = open(opath, 'w')
136
139
  lines = text.split('\n')
137
140
  for ln in lines:
138
141
  if ln.startswith('$'):
139
- ln = re.sub('\$', '', ln) # clean out $ for talkers
142
+ ln = re.sub(r'\$', '', ln) # clean out $ for talkers
140
143
 
141
144
  if ln.startswith('[') or ln.startswith('#') or ln == '':
142
145
  fo.write(ln)
@@ -169,7 +172,7 @@ def clean_ini_to_toml(ini_file):
169
172
  fo.write('\n')
170
173
 
171
174
  fo.close()
172
- return
175
+ return opath
173
176
 
174
177
 
175
178
  class _SnappingCursor:
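A brief sketch of the new return value (the .ini file name is hypothetical): ::

    import shipgrav.utils as sgu

    # convert an .ini settings file to .toml; the function now returns
    # the path of the .toml file it wrote
    toml_path = sgu.clean_ini_to_toml('meter_config.ini')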
@@ -1,10 +1,9 @@
1
1
  Metadata-Version: 2.3
2
2
  Name: shipgrav
3
- Version: 1.0.2
3
+ Version: 1.0.4
4
4
  Summary: Functions for marine gravity data processing and reduction
5
5
  Author-email: "Hannah F. Mark" <hmark@whoi.edu>
6
6
  Maintainer-email: "Hannah F. Mark" <hmark@whoi.edu>
7
- License-File: LICENSE
8
7
  Keywords: UNOLS,gravimetry,marine gravity
9
8
  Classifier: License :: OSI Approved :: GNU General Public License v3 (GPLv3)
10
9
  Classifier: Operating System :: OS Independent
@@ -15,12 +14,18 @@ Requires-Dist: pyyaml
15
14
  Requires-Dist: scipy
16
15
  Requires-Dist: statsmodels
17
16
  Requires-Dist: tomli
17
+ Requires-Dist: tqdm
18
18
  Provides-Extra: examples
19
+ Requires-Dist: geographiclib; extra == 'examples'
20
+ Requires-Dist: jupyterlab; extra == 'examples'
21
+ Requires-Dist: jupytext; extra == 'examples'
19
22
  Requires-Dist: matplotlib; extra == 'examples'
23
+ Requires-Dist: pooch; extra == 'examples'
20
24
  Description-Content-Type: text/markdown
21
25
 
22
26
  # shipgrav
23
27
  [![build status](https://github.com/PFPE/shipgrav/workflows/tests/badge.svg)](https://github.com/PFPE/shipgrav/actions)
28
+ [![codecov](https://codecov.io/gh/PFPE/shipgrav/branch/main/graph/badge.svg)](https://codecov.io/gh/PFPE/shipgrav)
24
29
 
25
30
  shipgrav is a Python package designed for reading, processing, and reducing marine gravity data from UNOLS ships. It is created and maintained by PFPE for the marine gravimetry community. The shipgrav repository also contains scripts with example workflows for gravity data processing and reduction.
26
31
 
@@ -31,9 +36,10 @@ scipy\
31
36
  pandas 2.0+\
32
37
  statsmodels\
33
38
  tomli\
34
- pyyaml
39
+ pyyaml\
40
+ tqdm
35
41
 
36
- To run the example scripts, you will also need matplotlib.
42
+ To run the example scripts, you will also need matplotlib, geographiclib, and pooch. To run the example scripts in jupyter, you will also need jupyterlab and jupytext.
37
43
 
38
44
  ## Installation
39
45
  shipgrav can be installed from PyPI using `pip`. Detailed instructions are in the [documentation](https://shipgrav.readthedocs.io/).
@@ -43,3 +49,7 @@ The shipgrav documentation is available online at [shipgrav.readthedocs.io](http
43
49
 
44
50
  ## Contributing to shipgrav
45
51
  Please do! If you have ideas for how to make shipgrav better, you can raise an issue on github or contact PFPE.
52
+
53
+ If you raise an issue on github, please include as much detail as possible about any errors you are encountering or any proposed enhancements to the code. Include the text of any error messages, and if the issue is unexpected behavior from the code without any visible error messages, describe both what the code is doing and what you think it *should* be doing instead. PFPE may ask for additional details and/or copies of data files in order to reproduce and diagnose an issue.
54
+
55
+ Additions or enhancements to the code are also welcome. Contributors are invited to fork the repository and submit pull requests for the maintainers to review.
@@ -0,0 +1,34 @@
1
+ shipgrav/__init__.py,sha256=Kmg5sCx0ZHOs1bEPZeKLs5kmlrqjvbrcBAkV4G151ws,11874
2
+ shipgrav/database.toml,sha256=8xSnXoAsfBMPDhvzKzw83oDrDT9AxD_QINZ8IWSUmiE,551
3
+ shipgrav/grav.py,sha256=09d5LIczUGys9VerTngumOMLeYOqdVJVtAdBGJZw2R8,43196
4
+ shipgrav/io.py,sha256=N2cqXRHIX_lSyCKJ35h9J4zcvs7tMKvSq0ch90xJncM,30448
5
+ shipgrav/nav.py,sha256=3E2Il-PhTEPVv8_8PrLVe7h6slHq-2HYX_Q8PdpB6X4,2895
6
+ shipgrav/utils.py,sha256=BJSBrKgCOADzdb6gBO3u9n16l22zzFXTAw42e5_QkBY,9582
7
+ shipgrav/tests/__init__.py,sha256=i8hMukDivY8kFp7OBdo1byRJWMBvb8nDDWJcAy2cOOI,427
8
+ shipgrav/tests/__main__.py,sha256=iJ8xSU8ucCmIDcOuv8uasZ9iV5dNR0DDuFRZrG0o2hE,38
9
+ shipgrav/tests/test_grav_data.py,sha256=dmD49rb5p4GQumaldU0d6Ara-2L2-TjS4S7fXMXFFwo,3537
10
+ shipgrav/tests/test_grav_nodata.py,sha256=skVHlimoyohxlDW7UG-sT_ASKOxs7sWakYm3HfkAoIE,2202
11
+ shipgrav/tests/test_io.py,sha256=aiZTB2znmMFYUKgnMBHqjIty2MXlpyo8hqt7gI4_pSA,6141
12
+ shipgrav/tests/test_nav.py,sha256=R2Fd_TIp_lOUjNtUcxvhZ5l-Vm1CmZ0-XzTcFiMFBRY,1195
13
+ shipgrav/tests/test_utils.py,sha256=dz9T_hDBpPhhLj9Xv9U-QpeMC3MsGS3aGnAWIaUVJAc,970
14
+ shipgrav/tests/ex_files/AT01_bgm.BGM,sha256=yaiVxd6vFeuoiKsUhMeTh22U2VOPPOR-UApr165Ua_I,184
15
+ shipgrav/tests/ex_files/AT01_nav.gps,sha256=75q1E_z65N82UW33x1vcaXfIHIwY_7qq1alcD2LbahQ,722
16
+ shipgrav/tests/ex_files/AT05_01_bgm.RGS,sha256=3MnlHJiIyzYe7KGr-bD8dfQ1nS_-Dt4A4g6TiVuNaBw,515
17
+ shipgrav/tests/ex_files/DGStest_laptop.dat,sha256=kUwAMltUXm32Dv9BnaDakA-KtNAuMxgSC504F8_0aws,353351
18
+ shipgrav/tests/ex_files/IXBlue.yaml,sha256=W-GGSMrSSvIXzX-b4rIU0u6Fv8bgNccZzNjTy-K1dNE,5394
19
+ shipgrav/tests/ex_files/MGL2003_bgm.y2020d244,sha256=MdF9duKY138vuVcOzZ26XcpKPSrGp20MbWYkCUtXTIk,164
20
+ shipgrav/tests/ex_files/MGL2003_nav.y2020d244,sha256=HheoWXYaevWkWE5uzNdv1vG5SHzh4AEog7UCu1O6Lw4,866
21
+ shipgrav/tests/ex_files/NBP_2301_nav.d013,sha256=Q-68kTPwGvYJ3u-nED3rem9JMSjaFxz75dYaxxrIb7c,1251
22
+ shipgrav/tests/ex_files/RR2212_bgm.txt,sha256=sB3Si6hhdZevjQrtoEcdztiX7Zf1H6g5GxRpZyXK-po,246
23
+ shipgrav/tests/ex_files/RR2212_nav.txt,sha256=BzazwIVkgILjHffoaWEKrqxN-4oOv_zGWqRvx3cF6I8,857
24
+ shipgrav/tests/ex_files/SR2302_nav.raw,sha256=7LhR1GEG17QDnTIp5NAamqjDteTLD2bTHaNfZrVFFh4,2352
25
+ shipgrav/tests/ex_files/SR2312_dgs_raw.txt,sha256=rTlIn75MBICBjnkjcn9SPCwpC5J3lMs4aq7-y6-dAmU,368
26
+ shipgrav/tests/ex_files/SR2312_mru.txt,sha256=YBM4qy4BS_D0U4ngEyoIFZr4QQbMWQgEO681fVEmgYM,1060
27
+ shipgrav/tests/ex_files/TN400_bgm.Raw,sha256=DL7T2aJednjGLMKJo6K8KcUYLMjQCiv6bsUiczlVwKw,74
28
+ shipgrav/tests/ex_files/TN400_dgs_proc.Raw,sha256=oxTXwDdyoz5yuKe253L1GgOSR5oWGGMuXfSQWrT24Ag,760
29
+ shipgrav/tests/ex_files/TN400_dgs_raw.Raw,sha256=tb-zerwCH6kkbGDbnnJ6I-sAUdCHOvkKgXXrBb6zTj8,311
30
+ shipgrav/tests/ex_files/TN400_nav.Raw,sha256=66dgRQTXjf0JItMWE1ZGUO6NptrGqV6JyN2wdYe5bW8,194
31
+ shipgrav-1.0.4.dist-info/METADATA,sha256=4IrgRSFtaOZcoYB5KwR-Qn0ynZjLKg6rfPt3hPT0K-w,2855
32
+ shipgrav-1.0.4.dist-info/WHEEL,sha256=aO3RJuuiFXItVSnAUEmQ0yRBvv9e1sbJh68PtuQkyAE,105
33
+ shipgrav-1.0.4.dist-info/licenses/LICENSE,sha256=5X8cMguM-HmKfS_4Om-eBqM6A1hfbgZf6pfx2G24QFI,35150
34
+ shipgrav-1.0.4.dist-info/RECORD,,
@@ -1,5 +1,5 @@
1
1
  Wheel-Version: 1.0
2
- Generator: hatchling 1.25.0
2
+ Generator: hatchling 1.26.3
3
3
  Root-Is-Purelib: true
4
4
  Tag: py2-none-any
5
5
  Tag: py3-none-any
@@ -1,24 +0,0 @@
1
- shipgrav/__init__.py,sha256=hkdZdfpVRK4AIdSk99WM6tJaMYmYNMQ6TU11qk2thDY,11058
2
- shipgrav/database.toml,sha256=qRSZBRsoMsbyhjkaYlXsWtVDz3JAhoW-cYfSoo8U3K4,532
3
- shipgrav/grav.py,sha256=OQK2HNiZtLlycjVwidpD902K08NaWbZXnjZSbjyPF8I,43410
4
- shipgrav/io.py,sha256=j9uy5z-CqhkUN4T-3iW5o5-cRLgjoGwfNc4o7cjw_pA,29273
5
- shipgrav/nav.py,sha256=3E2Il-PhTEPVv8_8PrLVe7h6slHq-2HYX_Q8PdpB6X4,2895
6
- shipgrav/utils.py,sha256=h5gd8cnWlO1pyAqLkJIVrbFRRLbZzHj-xJ9pWchaHtI,9521
7
- shipgrav/tests/__init__.py,sha256=mejp00TQif0WLU45j40zfBQXN2RujaOP6ZSTgxUm4A8,367
8
- shipgrav/tests/__main__.py,sha256=iJ8xSU8ucCmIDcOuv8uasZ9iV5dNR0DDuFRZrG0o2hE,38
9
- shipgrav/tests/test_grav_data.py,sha256=OnPNW8XIG3DvWOr_trQY71XLzA0rff8VfGd4Az3B37Q,3403
10
- shipgrav/tests/test_grav_nodata.py,sha256=0z5KUTzT9cJPko_bhyDG4EFm4i5xgOdVhJVKU2cPxX8,1493
11
- shipgrav/tests/test_io.py,sha256=RBRB2GYtesuSv68lJl5yMOLH2DGFz2AAKADyPKq8Dhk,2259
12
- shipgrav/tests/test_nav.py,sha256=6Ndn1mHN8kGruwk3R5yJFiK0n9Wt3XcK0GQ9PGbRIP8,1194
13
- shipgrav/tests/test_utils.py,sha256=10brOKgB5ij-oXqQnfo3CkMmZxSQMxSr_KVLLN1g1u0,969
14
- shipgrav/tests/ex_files/AT05_01_bgm.RGS,sha256=3MnlHJiIyzYe7KGr-bD8dfQ1nS_-Dt4A4g6TiVuNaBw,515
15
- shipgrav/tests/ex_files/DGStest_laptop.dat,sha256=kUwAMltUXm32Dv9BnaDakA-KtNAuMxgSC504F8_0aws,353351
16
- shipgrav/tests/ex_files/IXBlue.yaml,sha256=W-GGSMrSSvIXzX-b4rIU0u6Fv8bgNccZzNjTy-K1dNE,5394
17
- shipgrav/tests/ex_files/SR2312_dgs_raw.txt,sha256=rTlIn75MBICBjnkjcn9SPCwpC5J3lMs4aq7-y6-dAmU,368
18
- shipgrav/tests/ex_files/SR2312_mru.txt,sha256=YBM4qy4BS_D0U4ngEyoIFZr4QQbMWQgEO681fVEmgYM,1060
19
- shipgrav/tests/ex_files/TN400_bgm.Raw,sha256=DL7T2aJednjGLMKJo6K8KcUYLMjQCiv6bsUiczlVwKw,74
20
- shipgrav/tests/ex_files/TN400_nav.Raw,sha256=66dgRQTXjf0JItMWE1ZGUO6NptrGqV6JyN2wdYe5bW8,194
21
- shipgrav-1.0.2.dist-info/METADATA,sha256=aDE--gWj6U7q4-I37DIoosd4Tw6q2OE3QNqd-A9d8Og,1801
22
- shipgrav-1.0.2.dist-info/WHEEL,sha256=fl6v0VwpzfGBVsGtkAkhILUlJxROXbA3HvRL6Fe3140,105
23
- shipgrav-1.0.2.dist-info/licenses/LICENSE,sha256=5X8cMguM-HmKfS_4Om-eBqM6A1hfbgZf6pfx2G24QFI,35150
24
- shipgrav-1.0.2.dist-info/RECORD,,