weatherdb 1.1.1__py3-none-any.whl → 1.2.0__py3-none-any.whl

@@ -14,12 +14,14 @@ services:
14
14
  WEATHERDB_USER_CONFIG_FILE: /home/config_user.yaml
15
15
  WEATHERDB_DATA_BASE_DIR: /home/data
16
16
  WEATHERDB_HANDLE_NON_EXISTING_CONFIG: create
17
+ WEATHERDB_OPENTOPO_API_KEY: /run/secrets/opentopo_api_key
17
18
  DOCKER_ENV: main
18
19
  volumes:
19
20
  - weatherdb_home:/home
20
21
  command: ["sleep", "infinity"] # to keep awake
21
22
  secrets:
22
23
  - db_password
24
+ - opentopo_api_key
23
25
  develop:
24
26
  watch:
25
27
  - action: rebuild
@@ -53,6 +55,8 @@ volumes:
53
55
  secrets:
54
56
  db_password:
55
57
  file: ./db_password.docker-secret
58
+ opentopo_api_key:
59
+ file: ./opentopo_api_key.docker-secret
56
60
 
57
61
  # start from parent folder with `docker compose -f docker\\docker-compose.yaml up --build`
58
- # To connect to ther weatherdb service use `docker-compose exec weatherdb bash`
62
+ # To connect to the weatherdb service use `docker-compose exec weatherdb bash`
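The new `opentopo_api_key` secret is expected to hold the raw OpenTopography API key: the container points `WEATHERDB_OPENTOPO_API_KEY` at `/run/secrets/opentopo_api_key`, and the DEM download helper (see `weatherdb/utils/get_data.py` below) treats a value that is an existing file path as a file whose content is the key. A minimal sketch, assuming the secret file contains nothing but the key, is to create `docker/opentopo_api_key.docker-secret` with the key as its single line before starting the compose stack.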
docs/source/Methode.md CHANGED
@@ -117,7 +117,11 @@ N_{neighbor} * \dfrac{N_{station,ma,winter}}{N_{neighbor,ma,winter}} \space if\s
117
117
  N_{neighbor} * \dfrac{N_{station,ma,summer}}{N_{neighbor,ma,summer}} \space if\space month\notin[4:9]
118
118
  \end{cases}$
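For example (hypothetical numbers): if a neighboring station reports 10 mm and the multi-annual half-year sums are 600 mm at the station to be filled and 500 mm at the neighbor, the filled value becomes 10 mm · 600 / 500 = 12 mm.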
119
119
 
120
- For the precipitation stations only stations within a 100 km radius are taken to fill missing values. For the potential Evapotranspiration and the temperature this radius is 150km. For the temperature stations the median of the regionalised values from the 5 closest stations (but not more than 150 km away) is taken to fill missing values.
120
+ For the precipitation stations only stations within a 110 km radius are taken to fill missing values. For the potential Evapotranspiration and the temperature this radius is 150 km. For the temperature stations the median of the regionalised values from the 5 closest stations (but not more than 150 km away) is taken to fill missing values. These limits can be configured in the user configuration.
121
+
122
+ ### linear interpolation
123
+
124
+ After the missing values have been filled with regionalised values from the neighboring stations, there can still be gaps left. Those residual gaps are filled by linear interpolation. The maximum length of a gap that may be interpolated is configurable; the default limits are 1 hour for precipitation and 2 days for temperature and evapotranspiration.
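The package applies this rule in SQL during the fill-up routine (see `StationBases.py` further down in this diff). As a standalone illustration with made-up values, a minimal pandas sketch of the same gap-length limit could look like this:

```python
import numpy as np
import pandas as pd

# hypothetical 10-minute precipitation series with a short and a long gap
ts = pd.Series(
    [0.1, np.nan, np.nan, 0.4] + [np.nan] * 8 + [0.0],
    index=pd.date_range("2024-06-01 00:00", periods=13, freq="10min"),
)

max_gap = 6  # 1 hour limit at 10-minute resolution -> at most 6 missing values

# length of the NaN run each missing value belongs to
gap_id = ts.notna().cumsum()
gap_len = ts.isna().groupby(gap_id).transform("sum")

# interpolate everywhere, then revert gaps that are longer than the limit
filled = ts.interpolate(method="time")
filled[ts.isna() & (gap_len > max_gap)] = np.nan
```

Setting the limit to `0` disables the interpolation entirely, matching the `LINEAR_INTERPOLATION_LIMIT_* = 0` behaviour described in the default configuration.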
121
125
 
122
126
  ### adjusting precipitation to daily station measurements
123
127
 
@@ -46,7 +46,8 @@ So there is:
46
46
  - "raw" : the raw measurements as on the DWD server
47
47
  - "qc" : The quality checked data
48
48
  - "filled" : The filled timeseries
49
- - "filled_by" : The station ID of the station from which the data was taken to fill the measurements
49
+ - "filled_by" : The station ID of the station from which the data was taken to fill the measurements.
50
+ If the value was filled by linear interpolation, "filled_by" is `-1`.
50
51
  - "corr" : The Richter corrected timeserie.
51
52
 
52
53
  If you want more than just one kind of timeseries, e.g. the filled timeseries, together with the id from which station the respective field got filled with use:
@@ -0,0 +1,3 @@
1
+ # Readme
2
+
3
+ The DEM file in this folder originates from Copernicus and was clipped to the region of analysis in the test environment + 40 km
Binary file
@@ -0,0 +1,3 @@
1
+ # Readme
2
+
3
+ The files in this folder originate from https://zenodo.org/records/10066045 and were clipped to the region of analysis in the test environment
@@ -0,0 +1,28 @@
1
+ ; These configs should get loaded when the test files are loaded
2
+ [data:rasters]
3
+ DEMS = ${data:BASE_DIR}/DEM/COP-DEM_GLO-30-DGED__2023_1_clipped.tif
4
+
5
+ [data:rasters:hyras]
6
+ FILE = ${data:BASE_DIR}/regionalisation/HYRAS_ma_1991_2020_DGM25_clipped.tif
7
+ BAND_P_WIHY = n_hyras_wihj
8
+ BAND_P_SUHY = n_hyras_sohj
9
+ BAND_P_YEAR = n_hyras_year
10
+ SRID = 3035
11
+ FACTOR_P_WIHY = 1
12
+ FACTOR_P_SUHY = 1
13
+ FACTOR_P_YEAR = 1
14
+
15
+ [data:rasters:dwd]
16
+ FILE = ${data:BASE_DIR}/regionalisation/DWD-grid_ma_1991_2020_DGM25_clipped.tif
17
+ BAND_P_WIHY = n_wihj
18
+ BAND_P_SUHY = n_sohj
19
+ BAND_P_YEAR = n_year
20
+ BAND_T_YEAR = t_year
21
+ BAND_ET_YEAR = et_year
22
+ SRID = 3035
23
+ FACTOR_P_WIHY = 1
24
+ FACTOR_P_SUHY = 1
25
+ FACTOR_P_YEAR = 1
26
+ FACTOR_T_YEAR = 0.1
27
+ FACTOR_ET_YEAR = 1
28
+
weatherdb/_version.py CHANGED
@@ -1 +1 @@
1
- __version__ = "1.1.1"
1
+ __version__ = "1.2.0"
weatherdb/cli.py CHANGED
@@ -103,10 +103,16 @@ def download_ma_rasters(which, overwrite, update_user_config):
103
103
  """
104
104
  click.echo("starting downloading multi annual raster data")
105
105
  from weatherdb.utils.get_data import download_ma_rasters
106
- download_ma_rasters(overwrite=overwrite)
106
+ download_ma_rasters(
107
+ which=which,
108
+ overwrite=overwrite,
109
+ update_user_config=update_user_config)
107
110
 
108
111
 
109
112
  @cli.command(short_help="Download the needed digital elevation model raster data from Copernicus to the data folder.")
113
+ @click.option('--out-dir', '-d',
114
+ type=click.Path(), default=None, show_default=False,
115
+ help="The directory to save the downloaded DEM data to.")
110
116
  @click.option('--overwrite/--no-overwrite', '-o/-no-o',
111
117
  type=bool, is_flag=True, default=None, show_default=False,
112
118
  help="Should the digital elevation model raster be downloaded even if it already exists?")
@@ -116,7 +122,11 @@ def download_ma_rasters(which, overwrite, update_user_config):
116
122
  @click.option("--update-user-config", "-u",
117
123
  type=bool, default=False, show_default=True, is_flag=True,
118
124
  help="Should the user configuration be updated with the path to the downloaded DEM?")
119
- def download_dem(overwrite, extent):
125
+ @click.option("--service", "-s",
126
+ type=str, default=["prism", "openTopography"], show_default=True, multiple=True,
127
+ help="The service to use to download the DEM. Options are 'prism' or 'openTopography'. " +\
128
+ "You can use this option muultiple times to test both in the given order until the file could be downloaded.")
129
+ def download_dem(out_dir, overwrite, extent, update_user_config, service="prism"):
120
130
  """Download the newest DEM data from the Copernicus Sentinel dataset.
121
131
 
122
132
  Only the GLO-30 DEM, which has a 30m resolution, is downloaded as it is freely available.
@@ -131,7 +141,12 @@ def download_dem(overwrite, extent):
131
141
  """
132
142
  click.echo("Starting downloading digital elevation model from Copernicus")
133
143
  from weatherdb.utils.get_data import download_dem
134
- download_dem(overwrite=overwrite, extent=extent)
144
+ download_dem(
145
+ out_dir=out_dir,
146
+ overwrite=overwrite,
147
+ extent=extent,
148
+ service=service,
149
+ update_user_config=update_user_config)
135
150
 
136
151
 
137
152
  # cli statements to update the database
@@ -143,12 +143,25 @@ RASTER_BUFFER_CRS = ${weatherdb:HORIZON_CRS}
143
143
  ; The default is 1999-01-01
144
144
  MIN_DATE = 1999-01-01
145
145
 
146
+ ; When there are still NAs after filling with neighboring stations, the module can use linear interpolation to fill the gaps
147
+ ; The limit defines the maximum interval that is allowed to be filled by linear interpolation
148
+ ; The value is given with a unit, e.g. 3 days, 30 minutes, 1 hour
149
+ ; For further explanation about the format see https://www.postgresql.org/docs/current/datatype-datetime.html#DATATYPE-INTERVAL-INPUT
150
+ ; for precipitation, where the timeseries resolution is 10 minutes, 30 minutes means 3 missing values are interpolated, but not more.
151
+ ; If there shouldn't be any linear interpolation, set the limit to 0
152
+ ; For precipitation, the default is 1 hour
153
+ LINEAR_INTERPOLATION_LIMIT_P = 1 hour
154
+ ; For temperature, the default is 2 days
155
+ LINEAR_INTERPOLATION_LIMIT_T = 2 days
156
+ ; For Evapotranspiration, the default is 2 days
157
+ LINEAR_INTERPOLATION_LIMIT_ET = 2 days
158
+
146
159
 
147
160
  [weatherdb:max_fillup_distance]
148
161
  ; The maximum distance in meters to use for the filling of the station data
149
162
  ; For each parameter (P, T, ET) the module uses a different distance
150
163
  ; Precipitation (P)
151
- P = 110000
164
+ P = 130000
152
165
  ; Temperature (T)
153
166
  T = 150000
154
167
  ; Evapotranspiration (ET)
@@ -526,7 +526,9 @@ class GroupStation(object):
526
526
  period_new = period_filled.union(
527
527
  period,
528
528
  how="inner")
529
- if period_new != period:
529
+ if period_new != period and not (
530
+ (period.start == period_new.start) and
531
+ ((period.end - period_new.end) <= pd.Timedelta(days=1))):
530
532
  warnings.warn(
531
533
  f"The Period for Station {self.id} got changed from {str(period)} to {str(period_new)}.")
532
534
  period = period_new
@@ -504,10 +504,12 @@ class StationBase:
504
504
  '{interval}'::INTERVAL)::{tstp_dtype} AS timestamp)
505
505
  INSERT INTO timeseries."{stid}_{para}"(timestamp)
506
506
  (SELECT wts.timestamp
507
- FROM whole_ts wts
508
- LEFT JOIN timeseries."{stid}_{para}" ts
507
+ FROM whole_ts wts
508
+ LEFT JOIN timeseries."{stid}_{para}" ts
509
509
  ON ts.timestamp=wts.timestamp
510
- WHERE ts.timestamp IS NULL);
510
+ WHERE ts.timestamp IS NULL);
511
+ DELETE FROM timeseries."{stid}_{para}"
512
+ WHERE timestamp < '{min_date} 00:00'::{tstp_dtype};
511
513
  """.format(
512
514
  stid=self.id,
513
515
  para=self._para,
@@ -1489,6 +1491,70 @@ class StationBase:
1489
1491
  extra_fillup_where=sql_format_dict["extra_fillup_where"] +\
1490
1492
  ' OR ts."filled_by" IS DISTINCT FROM new."filled_by"'))
1491
1493
 
1494
+ # linear interpolation for the last missing values
1495
+ lr_limit = config.get(
1496
+ "weatherdb",
1497
+ "LINEAR_INTERPOLATION_LIMIT_{self._para_base}",
1498
+ fallback="0")
1499
+ if lr_limit != "0":
1500
+ sql_format_dict.update(dict(
1501
+ sql_linear_interpolation=f"""
1502
+ DO
1503
+ $do$
1504
+ DECLARE reg_borders record;
1505
+ BEGIN
1506
+ FOR reg_borders IN
1507
+ WITH empty_periods AS (
1508
+ SELECT *
1509
+ FROM ( SELECT
1510
+ CASE WHEN dist_next>'{self._interval}'::interval
1511
+ THEN timestamp ELSE NULL
1512
+ END AS start,
1513
+ CASE WHEN dist_next>'{self._interval}'::interval
1514
+ THEN LEAD(timestamp) OVER (ORDER BY timestamp)
1515
+ ELSE NULL
1516
+ END AS end
1517
+ FROM (
1518
+ SELECT *,
1519
+ timestamp - lag(timestamp, 1) OVER ( ORDER BY timestamp) AS dist_prev,
1520
+ lead(timestamp, 1) OVER ( ORDER BY timestamp) - timestamp AS dist_next
1521
+ FROM new_filled_{self.id}_{self._para}
1522
+ WHERE filled IS NOT NULL) t
1523
+ WHERE t.dist_prev > '{self._interval}'::interval
1524
+ OR t.dist_next > '{self._interval}'::interval
1525
+ ) p
1526
+ WHERE p.start IS NOT NULL AND p.end IS NOT NULL)
1527
+ SELECT
1528
+ ep.start AS timestamp_start,
1529
+ ttss.filled AS value_start,
1530
+ ep.end AS timestamp_end,
1531
+ ttse.filled AS value_end,
1532
+ (ttse.filled - ttss.filled)::numeric/(EXTRACT(EPOCH FROM (ep.end - ep.start))/EXTRACT(EPOCH FROM '{self._interval}'::interval))::numeric as slope
1533
+ FROM empty_periods ep
1534
+ LEFT JOIN new_filled_{self.id}_{self._para} ttss ON ep.start = ttss.timestamp
1535
+ LEFT JOIN new_filled_{self.id}_{self._para} ttse ON ep.end = ttse.timestamp
1536
+ where (ep.end - ep.start - '{self._interval}'::interval) <= '{lr_limit}'::interval
1537
+ loop
1538
+ execute FORMAT(
1539
+ $$
1540
+ UPDATE new_filled_{self.id}_{self._para} ts
1541
+ SET filled=%2$L + (EXTRACT(EPOCH FROM ts.timestamp - %1$L)::numeric/(EXTRACT(EPOCH FROM '{self._interval}'::interval))::numeric * %5$L),
1542
+ filled_by=-1
1543
+ WHERE ts.timestamp>%1$L and ts.timestamp<%3$L;
1544
+ $$,
1545
+ reg_borders.timestamp_start,
1546
+ reg_borders.value_start,
1547
+ reg_borders.timestamp_end,
1548
+ reg_borders.value_end,
1549
+ reg_borders.slope
1550
+ );
1551
+ END loop;
1552
+ END
1553
+ $do$;"""))
1554
+ else:
1555
+ sql_format_dict.update(dict(sql_linear_interpolation=""))
1556
+
1557
+
1492
1558
  # Make SQL statement to fill the missing values with values from nearby stations
1493
1559
  sql = """
1494
1560
  CREATE TEMP TABLE new_filled_{stid}_{para}
@@ -1561,15 +1627,16 @@ class StationBase:
1561
1627
  FROM new_filled_{stid}_{para}
1562
1628
  WHERE "filled" IS NULL {extra_unfilled_period_where};
1563
1629
  END LOOP;
1564
- {sql_extra_after_loop}
1565
- UPDATE timeseries."{stid}_{para}" ts
1566
- SET filled = new.filled, {extra_cols_fillup}
1567
- filled_by = new.filled_by
1568
- FROM new_filled_{stid}_{para} new
1569
- WHERE ts.timestamp = new.timestamp
1570
- AND (ts."filled" IS DISTINCT FROM new."filled" {extra_fillup_where}) ;
1571
1630
  END
1572
1631
  $do$;
1632
+ {sql_linear_interpolation}
1633
+ {sql_extra_after_loop}
1634
+ UPDATE timeseries."{stid}_{para}" ts
1635
+ SET filled = new.filled, {extra_cols_fillup}
1636
+ filled_by = new.filled_by
1637
+ FROM new_filled_{stid}_{para} new
1638
+ WHERE ts.timestamp = new.timestamp
1639
+ AND (ts."filled" IS DISTINCT FROM new."filled" {extra_fillup_where}) ;
1573
1640
  """.format(**sql_format_dict)
1574
1641
 
1575
1642
  # execute
@@ -6,9 +6,19 @@ from pathlib import Path
6
6
  from distutils.util import strtobool
7
7
  import hashlib
8
8
  import progressbar as pb
9
+ import keyring
10
+ import os
11
+ import getpass
12
+ import logging
9
13
 
10
14
  from ..config import config
11
15
 
16
+ __all__ = ["download_ma_rasters", "download_dem"]
17
+
18
+ log = logging.getLogger(__name__)
19
+
20
+ # Multi annual rasters
21
+ # --------------------
12
22
  def download_ma_rasters(which="all", overwrite=None, update_user_config=False):
13
23
  """Get the multi annual rasters on which bases the regionalisation is done.
14
24
 
@@ -60,7 +70,7 @@ def download_ma_rasters(which="all", overwrite=None, update_user_config=False):
60
70
  if file_key in which:
61
71
  # check if file is in config
62
72
  if f"data:rasters:{file_key}" not in config:
63
- print(f"Skipping {file_key} as it is not in your configuration.\nPlease add a section 'data:rasters:{file_key}' to your configuration file.")
73
+ log.debug(f"Skipping {file_key} as it is not in your configuration.\nPlease add a section 'data:rasters:{file_key}' to your configuration file.")
64
74
  continue
65
75
 
66
76
  # check if file already exists
@@ -75,7 +85,7 @@ def download_ma_rasters(which="all", overwrite=None, update_user_config=False):
75
85
  "Do you want to overwrite it? [y/n] "))
76
86
 
77
87
  if skip:
78
- print(f"Skipping {file_key} as overwriting is not allowed.")
88
+ log.debug(f"Skipping {file_key} as overwriting is not allowed.")
79
89
  continue
80
90
 
81
91
  # check if the directory exists
@@ -126,24 +136,42 @@ def download_ma_rasters(which="all", overwrite=None, update_user_config=False):
126
136
  if config.has_user_config:
127
137
  config.update_user_config(f"data:rasters:{file_key}", "file", str(file_path))
128
138
  else:
129
- print(f"No user configuration file found, therefor the raster '{file_key}' is not set in the user configuration file.")
139
+ log.error(f"No user configuration file found, therefor the raster '{file_key}' is not set in the user configuration file.")
130
140
 
141
+ # DEM data
142
+ # --------
143
+ def _check_write_fp(fp, overwrite):
144
+ """Check if a file exists and if it should be overwritten.
131
145
 
132
- def download_dem(overwrite=None, extent=(5.3, 46.1, 15.6, 55.4), update_user_config=False):
133
- """Download the newest DEM data from the Copernicus Sentinel dataset.
134
-
135
- Only the GLO-30 DEM, which has a 30m resolution, is downloaded as it is freely available.
136
- If you register as a scientific researcher also the EEA-10, with 10 m resolution, is available.
137
- You will have to download the data yourself and define it in the configuration file.
138
-
139
- After downloading the data, the files are merged and saved as a single tif file in the data directory in a subfolder called 'DEM'.
140
- To use the DEM data in the WeatherDB, you will have to define the path to the tif file in the configuration file.
146
+ Parameters
147
+ ----------
148
+ fp : str or Path
149
+ The path to the file to check.
150
+ overwrite : bool
151
+ Should the file be overwritten?
152
+
153
+ Returns
154
+ -------
155
+ bool
156
+ Should the file be written?
157
+ """
158
+ fp = Path(fp)
159
+ if fp.exists():
160
+ log.info(f"The file already exists at {fp}.")
161
+ if overwrite is None:
162
+ overwrite = strtobool(input(f"{fp} already exists. Do you want to overwrite it? [y/n] "))
163
+ if not overwrite:
164
+ log.info("Skipping, because overwritting was turned of.")
165
+ return False
166
+ return True
141
167
 
142
- Source:
143
- Copernicus DEM - Global and European Digital Elevation Model. Digital Surface Model (DSM) provided in 3 different resolutions (90m, 30m, 10m) with varying geographical extent (EEA: European and GLO: global) and varying format (INSPIRE, DGED, DTED). DOI:10.5270/ESA-c5d3d65.
168
+ def _download_dem_prism(out_dir, overwrite=None, extent=(5.3, 46.1, 15.6, 55.4)):
169
+ """Download the DEM data from the Copernicus PRISM service.
144
170
 
145
171
  Parameters
146
172
  ----------
173
+ out_dir: str or Path
174
+ The directory to save the DEM data to.
147
175
  overwrite : bool, optional
148
176
  Should the DEM data be downloaded even if it already exists?
149
177
  If None the user will be asked.
@@ -151,9 +179,11 @@ def download_dem(overwrite=None, extent=(5.3, 46.1, 15.6, 55.4), update_user_con
151
179
  extent : tuple, optional
152
180
  The extent in WGS84 of the DEM data to download.
153
181
  The default is the boundary of germany + ~40km = (5.3, 46.1, 15.6, 55.4).
154
- update_user_config : bool, optional
155
- Should the downloaded DEM be set as the used DEM in the user configuration file?
156
- The default is False.
182
+
183
+ Returns
184
+ -------
185
+ fp : Path
186
+ The path to the downloaded DEM file.
157
187
  """
158
188
  # import necessary modules
159
189
  import rasterio as rio
@@ -164,10 +194,8 @@ def download_dem(overwrite=None, extent=(5.3, 46.1, 15.6, 55.4), update_user_con
164
194
  import re
165
195
  import json
166
196
 
167
- # get dem_dir
168
- base_dir = Path(config.get("data", "base_dir"))
169
- dem_dir = base_dir / "DEM"
170
- dem_dir.mkdir(parents=True, exist_ok=True)
197
+ # check dir
198
+ out_dir = Path(out_dir)
171
199
 
172
200
  # get available datasets
173
201
  prism_url = "https://prism-dem-open.copernicus.eu/pd-desk-open-access/publicDemURLs"
@@ -191,20 +219,12 @@ def download_dem(overwrite=None, extent=(5.3, 46.1, 15.6, 55.4), update_user_con
191
219
  )[-1]["id"]
192
220
 
193
221
  # check if dataset already exists
194
- dem_file = dem_dir / f'{ds_id.replace("/", "__")}.tif'
195
- if dem_file.exists():
196
- print(f"The DEM data already exists at {dem_file}.")
197
- if overwrite is None:
198
- overwrite = strtobool(input("Do you want to overwrite it? [y/n] "))
199
- if not overwrite:
200
- print("Skipping, because overwritting was turned of.")
201
- return
202
- else:
203
- print("Overwriting the dataset.")
204
- dem_dir.mkdir(exist_ok=True)
222
+ dem_file = out_dir / f'{ds_id.replace("/", "__")}.tif'
223
+ if not _check_write_fp(dem_file, overwrite):
224
+ return
205
225
 
206
226
  # selecting DEM tiles
207
- print(f"getting available tiles for Copernicus dataset '{ds_id}'")
227
+ log.info(f"getting available tiles for Copernicus dataset '{ds_id}'")
208
228
  ds_files_req = json.loads(
209
229
  requests.get(
210
230
  f"{prism_url}/{ds_id.replace('/', '__')}",
@@ -225,13 +245,13 @@ def download_dem(overwrite=None, extent=(5.3, 46.1, 15.6, 55.4), update_user_con
225
245
  ds_files_all))
226
246
 
227
247
  # download DEM tiles
228
- print("downloading tiles")
248
+ log.info("downloading tiles")
229
249
  with TemporaryDirectory() as tmp_dir:
230
250
  tmp_dir_fp = Path(tmp_dir)
231
251
  for f in pb.progressbar(ds_files):
232
252
  with open(tmp_dir_fp / Path(f["nativeDemUrl"]).name, "wb") as d:
233
253
  d.write(requests.get(f["nativeDemUrl"]).content)
234
- print("downloaded all files")
254
+ log.info("downloaded all files")
235
255
 
236
256
  # extracting tifs from tars
237
257
  for i, f in pb.progressbar(list(enumerate(tmp_dir_fp.glob("*.tar")))):
@@ -269,17 +289,165 @@ def download_dem(overwrite=None, extent=(5.3, 46.1, 15.6, 55.4), update_user_con
269
289
  tmp_eula_fp = next(tmp_dir_fp.glob("*.pdf"))
270
290
  shutil.copyfile(
271
291
  tmp_eula_fp,
272
- dem_dir / tmp_eula_fp.name
292
+ out_dir / tmp_eula_fp.name
273
293
  )
274
294
 
275
- print(f"created DEM at '{dem_file}'.")
295
+ log.info(f"created DEM at '{dem_file}'.")
296
+ return dem_file
297
+
298
+ def _download_dem_opentopo(
299
+ out_dir,
300
+ overwrite=None,
301
+ extent=(5.3, 46.1, 15.6, 55.4),
302
+ api_key=os.environ.get(
303
+ "WEATHERDB_OPENTOPO_API_KEY",
304
+ default=keyring.get_password("weatherdb", "opentopo_api_key"))):
305
+ """Download the DEM data from the OpenTopography service.
306
+
307
+ Get an API key from [OpenTopography](https://portal.opentopography.org/) to use this service.
308
+
309
+ Parameters
310
+ ----------
311
+ out_dir : str or Path
312
+ The directory to save the DEM data to.
313
+ overwrite : bool, optional
314
+ Should the DEM data be downloaded even if it already exists?
315
+ If None the user will be asked.
316
+ The default is None.
317
+ extent : tuple, optional
318
+ The extent in WGS84 of the DEM data to download.
319
+ Should be in the format (west, south, east, north).
320
+ The default is the boundary of germany + ~40km = (5.3, 46.1, 15.6, 55.4).
321
+ api_key : str, optional
322
+ The API key for the OpenTopography service.
323
+ If None the user will be asked.
324
+ The default is to check if the environment variable "WEATHERDB_OPENTOPO_API_KEY" is set or if the keyring has a password for "weatherdb" and "opentopo_api_key".
325
+ If the value is a valid filepath the content of the file is used as the API key.
326
+
327
+ Returns
328
+ -------
329
+ fp : Path
330
+ The path to the downloaded DEM file.
331
+ """
332
+ # check api key
333
+ if api_key is None:
334
+ print("No API key for OpenTopography was given or found in the keyring or environment variable.")
335
+ api_key = getpass.getpass("Please enter your API key for OpenTopography: ")
336
+ if Path(api_key).exists():
337
+ with open(api_key) as f:
338
+ api_key = f.read().strip()
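# (illustrative note, not part of the package source) the key can also be stored
# beforehand in the system keyring, matching the lookup used above:
#   import keyring
#   keyring.set_password("weatherdb", "opentopo_api_key", "<your-api-key>")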
339
+
340
+ # make query
341
+ w, s, e, n = extent
342
+ url = f"https://portal.opentopography.org/API/globaldem?demtype=COP30&west={w}&south={s}&east={e}&north={n}&outputFormat=GTiff&API_Key={api_key}"
343
+ r = requests.get(url)
344
+ if r.status_code == 200:
345
+ # store api key
346
+ if api_key != keyring.get_password("weatherdb", "opentopo_api_key"):
347
+ keyring.set_password("weatherdb", "opentopo_api_key", api_key)
348
+
349
+ # get dem file
350
+ out_fp = out_dir / "OpenTopo_COP30.tif"
351
+
352
+ # check if file already exists
353
+ if not _check_write_fp(out_fp, overwrite):
354
+ return
355
+
356
+ # download file
357
+ with open(out_fp, "wb") as f:
358
+ for chunk in r.iter_content(chunk_size=1024):
359
+ f.write(chunk)
360
+ log.info(f"Downloaded DEM data from OpenTopography to '{out_fp}'.")
361
+ return out_fp
362
+
363
+ log.error(f"Request to openTopography API with url {r.url.replace(api_key, "[MASKED]")} returned status code {r.status_code}")
364
+
365
+ def download_dem(out_dir=None,
366
+ overwrite=None,
367
+ extent=(5.3, 46.1, 15.6, 55.4),
368
+ update_user_config=False,
369
+ service=["prism", "openTopography"], **kwargs):
370
+ """Download the newest DEM data from the Copernicus Sentinel dataset.
371
+
372
+ Only the GLO-30 DEM, which has a 30m resolution, is downloaded as it is freely available.
373
+ If you register as a scientific researcher also the EEA-10, with 10 m resolution, is available.
374
+ You will have to download the data yourself and define it in the configuration file.
375
+
376
+ After downloading the data, the files are merged and saved as a single tif file in the data directory in a subfolder called 'DEM'.
377
+ To use the DEM data in the WeatherDB, you will have to define the path to the tif file in the configuration file.
378
+
379
+ Source:
380
+ Copernicus DEM - Global and European Digital Elevation Model. Digital Surface Model (DSM) provided in 3 different resolutions (90m, 30m, 10m) with varying geographical extent (EEA: European and GLO: global) and varying format (INSPIRE, DGED, DTED). DOI:10.5270/ESA-c5d3d65.
381
+
382
+ Parameters
383
+ ----------
384
+ out_dir : str or Path, optional
385
+ The directory to save the DEM data to.
386
+ If None the data is saved in the data directory in a subfolder called 'DEM'.
387
+ The default is None.
388
+ overwrite : bool, optional
389
+ Should the DEM data be downloaded even if it already exists?
390
+ If None the user will be asked.
391
+ The default is None.
392
+ extent : tuple, optional
393
+ The extent in WGS84 of the DEM data to download.
394
+ The default is the boundary of germany + ~40km = (5.3, 46.1, 15.6, 55.4).
395
+ update_user_config : bool, optional
396
+ Should the downloaded DEM be set as the used DEM in the user configuration file?
397
+ The default is False.
398
+ service : str or list of str, optional
399
+ The service to use to download the DEM data.
400
+ Options are "prism" and "openTopography".
401
+ If both are given, they are executed in the order they are given.
402
+ If OpenTopography is selected, you will have to provide an API key.
403
+ The default is ["prism", "openTopography"].
404
+ """
405
+ # check service
406
+ if isinstance(service, str):
407
+ service = [service]
408
+
409
+ # check dir
410
+ if out_dir is None:
411
+ out_dir = Path(config.get("data", "base_dir")) / "DEM"
412
+ else:
413
+ out_dir = Path(out_dir)
414
+ out_dir.mkdir(parents=True, exist_ok=True)
415
+
416
+ # download data
417
+ fp=None
418
+ for s in service:
419
+ if s == "prism":
420
+ try:
421
+ fp = _download_dem_prism(
422
+ out_dir=out_dir,
423
+ overwrite=overwrite,
424
+ extent=extent,
425
+ **kwargs)
426
+ break
427
+ except Exception as e:
428
+ log.debug(f"Error while downloading DEM from PRISM: {e}")
429
+ elif s == "openTopography":
430
+ try:
431
+ fp = _download_dem_opentopo(
432
+ out_dir=out_dir,
433
+ overwrite=overwrite,
434
+ extent=extent,
435
+ **kwargs)
436
+ break
437
+ except Exception as e:
438
+ log.debug(f"Error while downloading DEM from OpenTopography: {e}")
439
+
440
+ # check if file was downloaded
441
+ if fp is None:
442
+ log.error("No DEM data was downloaded.")
443
+ return
276
444
 
277
445
  # update user config
278
446
  if update_user_config:
279
447
  if config.has_user_config:
280
- config.update_user_config("data:rasters", "dems", str(dem_file))
448
+ config.update_user_config("data:rasters", "dems", str(fp))
281
449
  return
282
450
  else:
283
- print("No user configuration file found, therefor the DEM is not set in the user configuration file.")
451
+ log.info("No user configuration file found, therefor the DEM is not set in the user configuration file.")
284
452
 
285
- print("To use the DEM data in the WeatherDB, you will have to define the path to the tif file in the user configuration file.")
453
+ log.info("To use the DEM data in the WeatherDB, you will have to define the path to the tif file in the user configuration file.")
@@ -1,6 +1,6 @@
1
1
  Metadata-Version: 2.1
2
2
  Name: weatherdb
3
- Version: 1.1.1
3
+ Version: 1.2.0
4
4
  Summary: This is a package to work with and to create the Weather Database which handles, checks, fills and corrects DWD Weather Station data.
5
5
  Author-email: Max Schmit <max.schmit@hydrology.uni-freiburg.de>
6
6
  License: GNU GENERAL PUBLIC LICENSE
@@ -1,11 +1,11 @@
1
1
  docker/Dockerfile,sha256=AmSjZsQHA3-XANZz8RiH0B5skIlF7MXCNiy8yIzZXr4,1051
2
- docker/docker-compose.yaml,sha256=krm0W_Nj1ZTuyKIr4I4dAfPM0XiM0lZQGhVfmjQIHM0,1568
2
+ docker/docker-compose.yaml,sha256=ezHqCUib83b5cbfbSy0fsrg0xl2iufp6diItsctJbRM,1719
3
3
  docker/docker-compose_test.yaml,sha256=LTCehPG6Oid-ri7kcPBMzXzeJBqYC-zXTs_i6XOkSyQ,931
4
4
  docker/start-docker-test.sh,sha256=8ue4LfoUrZU8inEZhNjsSDqucoN73Kn1cC3--lUPqTo,222
5
5
  docs/requirements.txt,sha256=gVqMWeiE0PAEz41NGrLwgzx1WUGuEIE7sfPLkFV4vU4,141
6
6
  docs/source/Changelog.md,sha256=5SUQr5KRoe5BUCsEM5IMXV7ryKwjp3_gDukYZP5i_9I,33
7
7
  docs/source/License.rst,sha256=BhhDehmlauu1zhoPc2tqYM2SNhgns67tmpFnA64scXQ,79
8
- docs/source/Methode.md,sha256=4UyLIEOwAx2ysWNeoSvw5g5ZHjZ_SGeaNo4j_c7f794,18232
8
+ docs/source/Methode.md,sha256=rhnfAjbn_xSz7rjOSnJC-9_LmmbiiP1DYrlLDmQ_XV0,18716
9
9
  docs/source/conf.py,sha256=StlvffI0pTq2R7WhE64cLlS4fi6DXgBuV1YDUgeu2-c,3718
10
10
  docs/source/index.rst,sha256=ByYiqrC2n_FO3zWpq-zfXUzxdcNLs4EQ38hlhXiU_z0,1832
11
11
  docs/source/_static/custom.css,sha256=FgPX621XaAYw41aiF6nWU6T6snJA42WLiCpe3ziYcQA,161
@@ -23,12 +23,19 @@ docs/source/api/weatherdb.utils.rst,sha256=j4nbIiCxYFtBkzi-jGEiefW4uKYRtSM4xnJHP
23
23
  docs/source/setup/Configuration.md,sha256=qtJ-gvXmaVfp_ldBMv60YaKENtoGb-L5HHc7rb9MaZQ,4340
24
24
  docs/source/setup/Hosting.md,sha256=4CUOgonpPhG-cPihrEsN3Z31rRKGWYeizg1QUhETmLw,105
25
25
  docs/source/setup/Install.md,sha256=IdmycMBztQFs_FAqfiU1OxtN2ihGXfIqwW4KBnsG_Iw,1265
26
- docs/source/setup/Quickstart.md,sha256=NmTvECVOS_uKoyWxgKtwQWHaH87gVFf1_QGGmZ97AZY,7062
26
+ docs/source/setup/Quickstart.md,sha256=suk3QweF4pQREBisXtLP0EJukInQePH0jWvwzY--D5U,7147
27
27
  docs/source/setup/setup.rst,sha256=bOmfcf-n9iivxbaBWk5t0nmiIc87bks_BMIJ1d6UHOM,258
28
+ tests/test-data/test-data-config.ini,sha256=XC1dho-dQglFhQCXWHtZRUuQLKjeOmjH4W0j0UumgoE,717
29
+ tests/test-data/DEM/COP-DEM_GLO-30-DGED__2023_1_clipped.tif,sha256=6JJpOHf-cDzBaTn6BIWkmroiuW9P8g9SLGfC2h7xCyk,35433174
30
+ tests/test-data/DEM/README.md,sha256=tnvGke8KzWFm3TKGbdCqajBJ4MbcRFZ0wv875x0_EMM,137
31
+ tests/test-data/DEM/eula_F.pdf,sha256=MgSZFMN_FOfVO0jRPXSknnfAMCNuKsx6DfQm-TRP66I,117522
32
+ tests/test-data/regionalisation/DWD-grid_ma_1991_2020_DGM25_clipped.tif,sha256=82r1UEYAjA6YzMC32ByZGHQFoYm1zcSajhKFtPjTqMw,2931290
33
+ tests/test-data/regionalisation/HYRAS_ma_1991_2020_DGM25_clipped.tif,sha256=HiSmi715dUpryQ1WSLSXR7GmGPHzK0rl6WOOrwjRwXw,2116377
34
+ tests/test-data/regionalisation/README.md,sha256=equFOPJVaESGKIK1ZPVEJHY4TEqkW3NpJOFQzjsBy7M,151
28
35
  weatherdb/__init__.py,sha256=2BzziScTYzA65WS6nhtCRZOWSLfwHTCwvZdeyuHw79U,846
29
- weatherdb/_version.py,sha256=y4gEwC0Q8o2BsM98PtyuCntN1nW-ViBe6bKQWBL7isA,21
36
+ weatherdb/_version.py,sha256=Btl_98iBIXFtvGx47MqpfbaEVYoOMPBQn9bao7UASkQ,21
30
37
  weatherdb/broker.py,sha256=gA5zWq4RpTJxgvKIzo9Bc5ol3k8JxAZ1BKZGkYCcr80,24474
31
- weatherdb/cli.py,sha256=BF555wpg2m0aeMTH5CdTTDKu7M6iCZABpnPISi_NgpY,9528
38
+ weatherdb/cli.py,sha256=jS_QNu4k2y7Q3d-iXTkdGP8qAEw2MWmE70wQdxBtqPo,10301
32
39
  weatherdb/alembic/README.md,sha256=6hq24TPr8z3nqJCaqrmj6__kLikVG66kLGqfC_aFPWQ,440
33
40
  weatherdb/alembic/alembic.ini,sha256=Mc3-qkQfA42ApX9douZT77g1pWE9gn4oMVJEDoebmdQ,3087
34
41
  weatherdb/alembic/config.py,sha256=Q4aolMUUT7f6VmbY0uz_dWBtKkj3KgTUs_xOY_Gbzjw,264
@@ -40,15 +47,15 @@ weatherdb/alembic/versions/V1.0.5_fix-ma-raster-values.py,sha256=EvJ_JLLJKUN5YXw
40
47
  weatherdb/alembic/versions/V1.0.6_update-views.py,sha256=ZA7cK5yeAqivFG2V8O7mvyKSkGRlphVOs0qKpBYgYKE,420
41
48
  weatherdb/config/ConfigParser.py,sha256=UEJlpeZSggG93C3MjpaNGpqKe_Ks6Jg6g5CjnhKCrpY,27978
42
49
  weatherdb/config/__init__.py,sha256=JBNlYPT43JnO8XMI8HTtg2hyA7C_JnMdtWmJ2vAtNZ8,85
43
- weatherdb/config/config_default.ini,sha256=f22zLpLVlguotbdENqUarNf-SlYwMjJFwIiXldnfBGc,6453
50
+ weatherdb/config/config_default.ini,sha256=UqEPzr4oU-S54Gpnr6nKnsoAhQtnBsmTlP1lE2cy9JY,7320
44
51
  weatherdb/db/__init__.py,sha256=g6F66XR6UPz41kPtSRc1qvDoOEZ5OawYxYtYXCgZ06s,76
45
52
  weatherdb/db/connections.py,sha256=mxArp79JOLTYYNB9fZEIspwcpdRZoxYWfQXh2Sq9ATw,14097
46
53
  weatherdb/db/models.py,sha256=6Oor07T1sAD6qEtoxwEcpmY1SkLVVONChXIIcTjbUk4,15242
47
54
  weatherdb/db/views.py,sha256=DxY_IIonmiiRM-IhQrUMLwj12gIg6Q30rQAmR3wP3BE,6359
48
55
  weatherdb/db/fixtures/RichterParameters.json,sha256=CKxrB5FBX_BRKqxegXNyNtn9DUKmgibUtdvHoE8E5JI,836
49
56
  weatherdb/db/queries/get_quotient.py,sha256=9wVFmXE8tk8igGj-Xk5uSI0eiF_PQ9d-yRq7RJpvMAA,6787
50
- weatherdb/station/GroupStation.py,sha256=Q7-1npCCyqDsHcvpg1iWkWh6avQm4awe8olpnMDzAIw,30138
51
- weatherdb/station/StationBases.py,sha256=YYnmg4Qdphw_-uAVtcfZh-EpxlABzOYLUg6eI67DHao,125553
57
+ weatherdb/station/GroupStation.py,sha256=XAovs-zK5mZmJuvNJ3iCgpWH-1yWHL0axtygYu1mVi0,30292
58
+ weatherdb/station/StationBases.py,sha256=TesbEH2Hd270_ns2hBGY22HsFwyTmS1xDDLgAudbZGE,129577
52
59
  weatherdb/station/StationET.py,sha256=hM6K8fCLC6mOLaE4cN92LHOyPGP01ifO6l2qO77I_bA,3621
53
60
  weatherdb/station/StationP.py,sha256=ENmWqEFwKR1WFsWmZ8-webEGTmrxODan8QO0gwIDDIU,32719
54
61
  weatherdb/station/StationPD.py,sha256=Bcg_uQTgIQPffsMGhNEwKAXhYs1Ii4VeB75afdFG-Kk,3367
@@ -67,11 +74,11 @@ weatherdb/utils/TimestampPeriod.py,sha256=6whEr2FLoHC-ldpMg9wW27ELEse5RW6dvaZ84M
67
74
  weatherdb/utils/__init__.py,sha256=siIvJT8lKLVvtdvWMpY3F9_h72Ky2d71DEdRGMWud5M,85
68
75
  weatherdb/utils/dwd.py,sha256=amEVVPLzXTSScBP5VAK35dBxuiy-Ua4hHyKnDGxD_4Q,12394
69
76
  weatherdb/utils/geometry.py,sha256=Od-RDAW1MASVAsGBXpHti9osbVJDqUbumSKBSJPbQqc,1823
70
- weatherdb/utils/get_data.py,sha256=5imKK_1Pm9cd6GWq44VP5NPZbeECDsyT8lPU8xYpYJE,11431
77
+ weatherdb/utils/get_data.py,sha256=-Qx7p8RMKSsYY_889egZdlvfEXr290CZtL5F4Ab4-1M,17188
71
78
  weatherdb/utils/logging.py,sha256=iaWnzCdBYxFmzWgLq9a6VAeBMdwy-rnt2d9PSAQq4lA,4242
72
- weatherdb-1.1.1.dist-info/LICENSE,sha256=ixuiBLtpoK3iv89l7ylKkg9rs2GzF9ukPH7ynZYzK5s,35148
73
- weatherdb-1.1.1.dist-info/METADATA,sha256=wm05_Yd6aIf_rrOF9y356OEbFWTlvauwJMFhHAoadBA,44266
74
- weatherdb-1.1.1.dist-info/WHEEL,sha256=tZoeGjtWxWRfdplE7E3d45VPlLNQnvbKiYnx7gwAy8A,92
75
- weatherdb-1.1.1.dist-info/entry_points.txt,sha256=kJemTd9Cm_QWPZt03KUWhpn1aB0-l_5ce6Ms3EoS_NM,55
76
- weatherdb-1.1.1.dist-info/top_level.txt,sha256=fI1xpTdDzn7NW-aRqQSj_VHgLvfN7wbK8Z1YXMx7ktM,22
77
- weatherdb-1.1.1.dist-info/RECORD,,
79
+ weatherdb-1.2.0.dist-info/LICENSE,sha256=ixuiBLtpoK3iv89l7ylKkg9rs2GzF9ukPH7ynZYzK5s,35148
80
+ weatherdb-1.2.0.dist-info/METADATA,sha256=obuXo2d9sc3nPJWiqpyr6yw_1vD8TLfjFAvsP_X9CYA,44266
81
+ weatherdb-1.2.0.dist-info/WHEEL,sha256=bFJAMchF8aTQGUgMZzHJyDDMPTO3ToJ7x23SLJa1SVo,92
82
+ weatherdb-1.2.0.dist-info/entry_points.txt,sha256=kJemTd9Cm_QWPZt03KUWhpn1aB0-l_5ce6Ms3EoS_NM,55
83
+ weatherdb-1.2.0.dist-info/top_level.txt,sha256=kLlRbXLn0GHvMWmciWRyvm0-FYEy56F6d0fGwfle9-g,28
84
+ weatherdb-1.2.0.dist-info/RECORD,,
@@ -1,5 +1,5 @@
1
1
  Wheel-Version: 1.0
2
- Generator: bdist_wheel (0.45.1)
2
+ Generator: bdist_wheel (0.45.0)
3
3
  Root-Is-Purelib: true
4
4
  Tag: py3-none-any
5
5
 
@@ -1,3 +1,4 @@
1
1
  docker
2
2
  docs
3
+ tests
3
4
  weatherdb