huff 1.5.8.tar.gz → 1.5.9.tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (37)
  1. {huff-1.5.8 → huff-1.5.9}/PKG-INFO +6 -9
  2. {huff-1.5.8 → huff-1.5.9}/README.md +5 -8
  3. {huff-1.5.8 → huff-1.5.9}/huff/models.py +20 -10
  4. {huff-1.5.8 → huff-1.5.9}/huff/ors.py +5 -2
  5. {huff-1.5.8 → huff-1.5.9}/huff/tests/tests_huff.py +6 -3
  6. {huff-1.5.8 → huff-1.5.9}/huff.egg-info/PKG-INFO +6 -9
  7. huff-1.5.9/huff.egg-info/requires.txt +10 -0
  8. {huff-1.5.8 → huff-1.5.9}/setup.py +8 -8
  9. huff-1.5.8/huff.egg-info/requires.txt +0 -10
  10. {huff-1.5.8 → huff-1.5.9}/MANIFEST.in +0 -0
  11. {huff-1.5.8 → huff-1.5.9}/huff/__init__.py +0 -0
  12. {huff-1.5.8 → huff-1.5.9}/huff/gistools.py +0 -0
  13. {huff-1.5.8 → huff-1.5.9}/huff/osm.py +0 -0
  14. {huff-1.5.8 → huff-1.5.9}/huff/tests/__init__.py +0 -0
  15. {huff-1.5.8 → huff-1.5.9}/huff/tests/data/Haslach.cpg +0 -0
  16. {huff-1.5.8 → huff-1.5.9}/huff/tests/data/Haslach.dbf +0 -0
  17. {huff-1.5.8 → huff-1.5.9}/huff/tests/data/Haslach.prj +0 -0
  18. {huff-1.5.8 → huff-1.5.9}/huff/tests/data/Haslach.qmd +0 -0
  19. {huff-1.5.8 → huff-1.5.9}/huff/tests/data/Haslach.shp +0 -0
  20. {huff-1.5.8 → huff-1.5.9}/huff/tests/data/Haslach.shx +0 -0
  21. {huff-1.5.8 → huff-1.5.9}/huff/tests/data/Haslach_new_supermarket.cpg +0 -0
  22. {huff-1.5.8 → huff-1.5.9}/huff/tests/data/Haslach_new_supermarket.dbf +0 -0
  23. {huff-1.5.8 → huff-1.5.9}/huff/tests/data/Haslach_new_supermarket.prj +0 -0
  24. {huff-1.5.8 → huff-1.5.9}/huff/tests/data/Haslach_new_supermarket.qmd +0 -0
  25. {huff-1.5.8 → huff-1.5.9}/huff/tests/data/Haslach_new_supermarket.shp +0 -0
  26. {huff-1.5.8 → huff-1.5.9}/huff/tests/data/Haslach_new_supermarket.shx +0 -0
  27. {huff-1.5.8 → huff-1.5.9}/huff/tests/data/Haslach_supermarkets.cpg +0 -0
  28. {huff-1.5.8 → huff-1.5.9}/huff/tests/data/Haslach_supermarkets.dbf +0 -0
  29. {huff-1.5.8 → huff-1.5.9}/huff/tests/data/Haslach_supermarkets.prj +0 -0
  30. {huff-1.5.8 → huff-1.5.9}/huff/tests/data/Haslach_supermarkets.qmd +0 -0
  31. {huff-1.5.8 → huff-1.5.9}/huff/tests/data/Haslach_supermarkets.shp +0 -0
  32. {huff-1.5.8 → huff-1.5.9}/huff/tests/data/Haslach_supermarkets.shx +0 -0
  33. {huff-1.5.8 → huff-1.5.9}/huff/tests/data/Wieland2015.xlsx +0 -0
  34. {huff-1.5.8 → huff-1.5.9}/huff.egg-info/SOURCES.txt +0 -0
  35. {huff-1.5.8 → huff-1.5.9}/huff.egg-info/dependency_links.txt +0 -0
  36. {huff-1.5.8 → huff-1.5.9}/huff.egg-info/top_level.txt +0 -0
  37. {huff-1.5.8 → huff-1.5.9}/setup.cfg +0 -0
--- huff-1.5.8/PKG-INFO
+++ huff-1.5.9/PKG-INFO
@@ -1,6 +1,6 @@
 Metadata-Version: 2.1
 Name: huff
-Version: 1.5.8
+Version: 1.5.9
 Summary: huff: Huff Model Market Area Analysis
 Author: Thomas Wieland
 Author-email: geowieland@googlemail.com
@@ -8,7 +8,7 @@ Description-Content-Type: text/markdown
 
 # huff: Huff Model Market Area Analysis
 
-This Python library is designed for performing market area analyses with the Huff Model (Huff 1962, 1964) and/or the Multiplicative Competitive Interaction (MCI) Model (Nakanishi and Cooper 1974, 1982). Users may load point shapefiles (or CSV, XLSX) of customer origins and supply locations and conduct a market area analysis step by step. The library supports parameter estimation based on empirical customer data using the MCI model and Maximum Likelihood. The package also includes supplementary GIS functions, including clients for OpenRouteService(1) for network analysis (e.g., transport cost matrix) and OpenStreetMap(2) for simple maps. See Huff and McCallum (2008) or Wieland (2017) for a description of the models and their practical application.
+This Python library is designed for performing market area analyses with the Huff Model (Huff 1962, 1964) and/or the Multiplicative Competitive Interaction (MCI) Model (Nakanishi and Cooper 1974, 1982). Users may load point shapefiles (or CSV, XLSX) of customer origins and supply locations and conduct a market area analysis step by step. The library supports parameter estimation based on empirical customer data using the MCI model and Maximum Likelihood. See Huff and McCallum (2008) or Wieland (2017) for a description of the models and their practical application. The package also includes GIS functions for market area analysis (buffer, distance matrix, overlay statistics) and clients for OpenRouteService(1) for network analysis (e.g., transport cost matrix) and OpenStreetMap(2) for simple maps.
 
 
 ## Author
@@ -18,14 +18,11 @@ Thomas Wieland [ORCID](https://orcid.org/0000-0001-5168-9846) [EMail](mailto:geo
 See the /tests directory for usage examples of most of the included functions.
 
 
-## Updates v1.5.8
+## Updates v1.5.9
 - Bugfixes:
-- buffers() now checks whether unique_id_col exists
-- buffers() now checks whether input gdf has geographic coordinate system
-- download_tile() which is used in map_with_basemap() now controls for timeouts
-- Extensions:
-- Function polygon_select(): Selection of polygons from point's airline distance
-
+- models.get_isochrones() now treats distances analogously to times
+- Update dependencies
+
 
 ## Features
 
--- huff-1.5.8/README.md
+++ huff-1.5.9/README.md
@@ -1,6 +1,6 @@
 # huff: Huff Model Market Area Analysis
 
-This Python library is designed for performing market area analyses with the Huff Model (Huff 1962, 1964) and/or the Multiplicative Competitive Interaction (MCI) Model (Nakanishi and Cooper 1974, 1982). Users may load point shapefiles (or CSV, XLSX) of customer origins and supply locations and conduct a market area analysis step by step. The library supports parameter estimation based on empirical customer data using the MCI model and Maximum Likelihood. The package also includes supplementary GIS functions, including clients for OpenRouteService(1) for network analysis (e.g., transport cost matrix) and OpenStreetMap(2) for simple maps. See Huff and McCallum (2008) or Wieland (2017) for a description of the models and their practical application.
+This Python library is designed for performing market area analyses with the Huff Model (Huff 1962, 1964) and/or the Multiplicative Competitive Interaction (MCI) Model (Nakanishi and Cooper 1974, 1982). Users may load point shapefiles (or CSV, XLSX) of customer origins and supply locations and conduct a market area analysis step by step. The library supports parameter estimation based on empirical customer data using the MCI model and Maximum Likelihood. See Huff and McCallum (2008) or Wieland (2017) for a description of the models and their practical application. The package also includes GIS functions for market area analysis (buffer, distance matrix, overlay statistics) and clients for OpenRouteService(1) for network analysis (e.g., transport cost matrix) and OpenStreetMap(2) for simple maps.
 
 
 ## Author
@@ -10,14 +10,11 @@ Thomas Wieland [ORCID](https://orcid.org/0000-0001-5168-9846) [EMail](mailto:geo
 See the /tests directory for usage examples of most of the included functions.
 
 
-## Updates v1.5.8
+## Updates v1.5.9
 - Bugfixes:
-- buffers() now checks whether unique_id_col exists
-- buffers() now checks whether input gdf has geographic coordinate system
-- download_tile() which is used in map_with_basemap() now controls for timeouts
-- Extensions:
-- Function polygon_select(): Selection of polygons from point's airline distance
-
+- models.get_isochrones() now treats distances analogously to times
+- Update dependencies
+
 
 ## Features
 
--- huff-1.5.8/huff/models.py
+++ huff-1.5.9/huff/models.py
@@ -4,8 +4,8 @@
 # Author: Thomas Wieland
 # ORCID: 0000-0001-5168-9846
 # mail: geowieland@googlemail.com
-# Version: 1.5.6
-# Last update: 2025-08-01 14:00
+# Version: 1.5.7
+# Last update: 2025-08-13 18:47
 # Copyright (c) 2025 Thomas Wieland
 #-----------------------------------------------------------------------
 
@@ -162,7 +162,7 @@ class CustomerOrigins:
 
     def isochrones(
         self,
-        segments_minutes: list = [5, 10, 15],
+        segments: list = [5, 10, 15],
         range_type: str = "time",
         intersections: str = "true",
         profile: str = "driving-car",
@@ -182,7 +182,7 @@ class CustomerOrigins:
         isochrones_gdf = get_isochrones(
             geodata_gpd = geodata_gpd,
             unique_id_col = metadata["unique_id"],
-            segments_minutes = segments_minutes,
+            segments = segments,
             range_type = range_type,
             intersections = intersections,
             profile = profile,
@@ -401,7 +401,7 @@ class SupplyLocations:
 
     def isochrones(
         self,
-        segments_minutes: list = [5, 10, 15],
+        segments: list = [5, 10, 15],
         range_type: str = "time",
         intersections: str = "true",
         profile: str = "driving-car",
@@ -421,7 +421,7 @@ class SupplyLocations:
         isochrones_gdf = get_isochrones(
             geodata_gpd = geodata_gpd,
             unique_id_col = metadata["unique_id"],
-            segments_minutes = segments_minutes,
+            segments = segments,
             range_type = range_type,
             intersections = intersections,
             profile = profile,
@@ -3197,7 +3197,7 @@ def log_centering_transformation(
 def get_isochrones(
     geodata_gpd: gp.GeoDataFrame,
     unique_id_col: str,
-    segments_minutes: list = [5, 10, 15],
+    segments: list = [5, 10, 15],
     range_type: str = "time",
     intersections: str = "true",
     profile: str = "driving-car",
@@ -3222,7 +3222,10 @@ def get_isochrones(
 
    isochrones_gdf = gp.GeoDataFrame(columns=[unique_id_col, "geometry"])
 
-    segments = [segment*60 for segment in segments_minutes]
+    if range_type == "time":
+        segments = [segment*60 for segment in segments]
+    if range_type == "distance":
+        segments = [segment*1000 for segment in segments]
 
     i = 0
 
@@ -3254,7 +3257,10 @@
 
         isochrone_gdf[unique_id_col] = unique_id_values[i]
 
-        isochrone_gdf["segm_min"] = isochrone_gdf["segment"]/60
+        if range_type == "time":
+            isochrone_gdf["segm_min"] = isochrone_gdf["segment"]/60
+        if range_type == "distance":
+            isochrone_gdf["segm_km"] = isochrone_gdf["segment"]/1000
 
         isochrones_gdf = pd.concat(
             [
@@ -3267,7 +3273,11 @@
         i = i+1
 
     isochrones_gdf["segment"] = isochrones_gdf["segment"].astype(int)
-    isochrones_gdf["segm_min"] = isochrones_gdf["segm_min"].astype(int)
+
+    if range_type == "time":
+        isochrones_gdf["segm_min"] = isochrones_gdf["segm_min"].astype(int)
+    if range_type == "distance":
+        isochrones_gdf["segm_km"] = isochrones_gdf["segm_km"].astype(int)
 
     isochrones_gdf.set_crs(
         output_crs,
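The get_isochrones() hunks above are the substance of the v1.5.9 change: the same `segments` list is read as minutes when range_type is "time" and as kilometers when it is "distance", converted to the seconds/meters the routing backend expects, and later labelled back into a `segm_min` or `segm_km` column. A standalone sketch of that round trip (plain Python with lists standing in for the GeoDataFrame columns; not the package's own code):

```python
# Sketch of the unit handling added in models.get_isochrones() (v1.5.9).

def convert_segments(segments, range_type="time"):
    # User-facing units -> backend units (minutes -> seconds, km -> meters).
    if range_type == "time":
        return [s * 60 for s in segments]
    if range_type == "distance":
        return [s * 1000 for s in segments]
    raise ValueError("range_type must be 'time' or 'distance'")

def label_segments(raw_segments, range_type="time"):
    # Backend units -> human-readable column (segm_min / segm_km).
    if range_type == "time":
        return ("segm_min", [int(s / 60) for s in raw_segments])
    if range_type == "distance":
        return ("segm_km", [int(s / 1000) for s in raw_segments])
    raise ValueError("range_type must be 'time' or 'distance'")

print(convert_segments([5, 10, 15], "time"))      # [300, 600, 900]
print(convert_segments([5, 10, 15], "distance"))  # [5000, 10000, 15000]
```

Note that the two `if` branches are not exclusive guards in the package code either: an unknown range_type silently skips both conversions there, whereas this sketch raises.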
--- huff-1.5.8/huff/ors.py
+++ huff-1.5.9/huff/ors.py
@@ -4,8 +4,8 @@
 # Author: Thomas Wieland
 # ORCID: 0000-0001-5168-9846
 # mail: geowieland@googlemail.com
-# Version: 1.4.1
-# Last update: 2025-06-16 17:44
+# Version: 1.4.2
+# Last update: 2025-08-13 18:47
 # Copyright (c) 2025 Thomas Wieland
 #-----------------------------------------------------------------------
 
@@ -151,6 +151,9 @@ class Client:
             "range_type": range_type
         }
 
+        print("body:") # TODO ?? RAUS
+        print(body) # TODO ?? RAUS
+
         save_config = {
             "range_type": range_type,
             "save_output": save_output,
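The debug prints added above (note the leftover `TODO ?? RAUS`, i.e. "remove", comments) dump the JSON body sent to the isochrones endpoint. For orientation, such a body typically has the following shape; this is a sketch based on the public OpenRouteService v2 isochrones API, and the field values (and any fields other than `range_type`, which is visible in the hunk) are assumptions, not copied from ors.py:

```python
# Assumed shape of an ORS v2 isochrones request body (see the
# OpenRouteService API docs); huff's ors.py builds something comparable.
body = {
    "locations": [[7.8522, 47.9959]],  # example [lon, lat] origin
    "range": [300, 600, 900],          # seconds for range_type="time",
                                       # meters for range_type="distance"
    "range_type": "time",
    "intersections": "true",
}
print(body["range_type"])  # time
```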
--- huff-1.5.8/huff/tests/tests_huff.py
+++ huff-1.5.9/huff/tests/tests_huff.py
@@ -4,8 +4,8 @@
 # Author: Thomas Wieland
 # ORCID: 0000-0001-5168-9846
 # mail: geowieland@googlemail.com
-# Version: 1.5.8
-# Last update: 2025-08-07 17:20
+# Version: 1.5.9
+# Last update: 2025-08-13 18:46
 # Copyright (c) 2025 Thomas Wieland
 #-----------------------------------------------------------------------
 
@@ -69,7 +69,10 @@ Haslach_supermarkets.define_attraction_weighting(
 # Define attraction weighting (gamma)
 
 Haslach_supermarkets.isochrones(
-    segments_minutes=[3, 6, 9, 12, 15],
+    segments=[3, 6, 9, 12, 15],
+    # minutes or kilometers
+    range_type = "time",
+    # "time" or "distance" (default: "time")
     profile = "foot-walking",
     save_output=True,
    ors_auth="5b3ce3597851110001cf62480a15aafdb5a64f4d91805929f8af6abd",
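The renamed test call above only runs against a live ORS key, but the two new parameters can be sanity-checked offline. A hypothetical pre-flight helper (not part of huff) mirroring what get_isochrones() accepts in v1.5.9:

```python
# Hypothetical argument check: a non-empty list of positive segments
# plus a range_type of "time" or "distance".
def validate_isochrone_args(segments, range_type="time"):
    if range_type not in ("time", "distance"):
        raise ValueError("range_type must be 'time' or 'distance'")
    if not segments or any(s <= 0 for s in segments):
        raise ValueError("segments must be a non-empty list of positive numbers")
    return True

print(validate_isochrone_args([3, 6, 9, 12, 15], "time"))  # True
```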
--- huff-1.5.8/huff.egg-info/PKG-INFO
+++ huff-1.5.9/huff.egg-info/PKG-INFO
@@ -1,6 +1,6 @@
 Metadata-Version: 2.1
 Name: huff
-Version: 1.5.8
+Version: 1.5.9
 Summary: huff: Huff Model Market Area Analysis
 Author: Thomas Wieland
 Author-email: geowieland@googlemail.com
@@ -8,7 +8,7 @@ Description-Content-Type: text/markdown
 
 # huff: Huff Model Market Area Analysis
 
-This Python library is designed for performing market area analyses with the Huff Model (Huff 1962, 1964) and/or the Multiplicative Competitive Interaction (MCI) Model (Nakanishi and Cooper 1974, 1982). Users may load point shapefiles (or CSV, XLSX) of customer origins and supply locations and conduct a market area analysis step by step. The library supports parameter estimation based on empirical customer data using the MCI model and Maximum Likelihood. The package also includes supplementary GIS functions, including clients for OpenRouteService(1) for network analysis (e.g., transport cost matrix) and OpenStreetMap(2) for simple maps. See Huff and McCallum (2008) or Wieland (2017) for a description of the models and their practical application.
+This Python library is designed for performing market area analyses with the Huff Model (Huff 1962, 1964) and/or the Multiplicative Competitive Interaction (MCI) Model (Nakanishi and Cooper 1974, 1982). Users may load point shapefiles (or CSV, XLSX) of customer origins and supply locations and conduct a market area analysis step by step. The library supports parameter estimation based on empirical customer data using the MCI model and Maximum Likelihood. See Huff and McCallum (2008) or Wieland (2017) for a description of the models and their practical application. The package also includes GIS functions for market area analysis (buffer, distance matrix, overlay statistics) and clients for OpenRouteService(1) for network analysis (e.g., transport cost matrix) and OpenStreetMap(2) for simple maps.
 
 
 ## Author
@@ -18,14 +18,11 @@ Thomas Wieland [ORCID](https://orcid.org/0000-0001-5168-9846) [EMail](mailto:geo
 See the /tests directory for usage examples of most of the included functions.
 
 
-## Updates v1.5.8
+## Updates v1.5.9
 - Bugfixes:
-- buffers() now checks whether unique_id_col exists
-- buffers() now checks whether input gdf has geographic coordinate system
-- download_tile() which is used in map_with_basemap() now controls for timeouts
-- Extensions:
-- Function polygon_select(): Selection of polygons from point's airline distance
-
+- models.get_isochrones() now treats distances analogously to times
+- Update dependencies
+
 
 ## Features
 
--- /dev/null
+++ huff-1.5.9/huff.egg-info/requires.txt
@@ -0,0 +1,10 @@
+geopandas==1.1.1
+pandas==2.3.1
+numpy==2.3.0
+statsmodels==0.14.4
+shapely==2.1.1
+requests==2.32.4
+matplotlib==3.10
+pillow==10.2.0
+contextily==1.6.2
+openpyxl==3.1.4
--- huff-1.5.8/setup.py
+++ huff-1.5.9/setup.py
@@ -7,7 +7,7 @@ def read_README():
 
 setup(
     name='huff',
-    version='1.5.8',
+    version='1.5.9',
     description='huff: Huff Model Market Area Analysis',
     packages=find_packages(include=["huff", "huff.tests"]),
     include_package_data=True,
@@ -20,13 +20,13 @@ setup(
         'huff': ['tests/data/*'],
     },
     install_requires=[
-        'geopandas==0.14.4',
-        'pandas==2.2.3',
-        'numpy==1.26.3',
-        'statsmodels==0.14.1',
-        'shapely==2.0.4',
-        'requests==2.31.0',
-        'matplotlib==3.8.2',
+        'geopandas==1.1.1',
+        'pandas==2.3.1',
+        'numpy==2.3.0',
+        'statsmodels==0.14.4',
+        'shapely==2.1.1',
+        'requests==2.32.4',
+        'matplotlib==3.10',
         'pillow==10.2.0',
         'contextily==1.6.2',
         'openpyxl==3.1.4'
--- huff-1.5.8/huff.egg-info/requires.txt
+++ /dev/null
@@ -1,10 +0,0 @@
-geopandas==0.14.4
-pandas==2.2.3
-numpy==1.26.3
-statsmodels==0.14.1
-shapely==2.0.4
-requests==2.31.0
-matplotlib==3.8.2
-pillow==10.2.0
-contextily==1.6.2
-openpyxl==3.1.4