huff 1.5.7__tar.gz → 1.5.9__tar.gz
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- {huff-1.5.7 → huff-1.5.9}/PKG-INFO +7 -6
- {huff-1.5.7 → huff-1.5.9}/README.md +6 -5
- {huff-1.5.7 → huff-1.5.9}/huff/gistools.py +72 -8
- {huff-1.5.7 → huff-1.5.9}/huff/models.py +20 -10
- {huff-1.5.7 → huff-1.5.9}/huff/ors.py +5 -2
- {huff-1.5.7 → huff-1.5.9}/huff/osm.py +27 -19
- {huff-1.5.7 → huff-1.5.9}/huff/tests/tests_huff.py +6 -3
- {huff-1.5.7 → huff-1.5.9}/huff.egg-info/PKG-INFO +7 -6
- huff-1.5.9/huff.egg-info/requires.txt +10 -0
- {huff-1.5.7 → huff-1.5.9}/setup.py +11 -11
- huff-1.5.7/huff.egg-info/requires.txt +0 -10
- {huff-1.5.7 → huff-1.5.9}/MANIFEST.in +0 -0
- {huff-1.5.7 → huff-1.5.9}/huff/__init__.py +0 -0
- {huff-1.5.7 → huff-1.5.9}/huff/tests/__init__.py +0 -0
- {huff-1.5.7 → huff-1.5.9}/huff/tests/data/Haslach.cpg +0 -0
- {huff-1.5.7 → huff-1.5.9}/huff/tests/data/Haslach.dbf +0 -0
- {huff-1.5.7 → huff-1.5.9}/huff/tests/data/Haslach.prj +0 -0
- {huff-1.5.7 → huff-1.5.9}/huff/tests/data/Haslach.qmd +0 -0
- {huff-1.5.7 → huff-1.5.9}/huff/tests/data/Haslach.shp +0 -0
- {huff-1.5.7 → huff-1.5.9}/huff/tests/data/Haslach.shx +0 -0
- {huff-1.5.7 → huff-1.5.9}/huff/tests/data/Haslach_new_supermarket.cpg +0 -0
- {huff-1.5.7 → huff-1.5.9}/huff/tests/data/Haslach_new_supermarket.dbf +0 -0
- {huff-1.5.7 → huff-1.5.9}/huff/tests/data/Haslach_new_supermarket.prj +0 -0
- {huff-1.5.7 → huff-1.5.9}/huff/tests/data/Haslach_new_supermarket.qmd +0 -0
- {huff-1.5.7 → huff-1.5.9}/huff/tests/data/Haslach_new_supermarket.shp +0 -0
- {huff-1.5.7 → huff-1.5.9}/huff/tests/data/Haslach_new_supermarket.shx +0 -0
- {huff-1.5.7 → huff-1.5.9}/huff/tests/data/Haslach_supermarkets.cpg +0 -0
- {huff-1.5.7 → huff-1.5.9}/huff/tests/data/Haslach_supermarkets.dbf +0 -0
- {huff-1.5.7 → huff-1.5.9}/huff/tests/data/Haslach_supermarkets.prj +0 -0
- {huff-1.5.7 → huff-1.5.9}/huff/tests/data/Haslach_supermarkets.qmd +0 -0
- {huff-1.5.7 → huff-1.5.9}/huff/tests/data/Haslach_supermarkets.shp +0 -0
- {huff-1.5.7 → huff-1.5.9}/huff/tests/data/Haslach_supermarkets.shx +0 -0
- {huff-1.5.7 → huff-1.5.9}/huff/tests/data/Wieland2015.xlsx +0 -0
- {huff-1.5.7 → huff-1.5.9}/huff.egg-info/SOURCES.txt +0 -0
- {huff-1.5.7 → huff-1.5.9}/huff.egg-info/dependency_links.txt +0 -0
- {huff-1.5.7 → huff-1.5.9}/huff.egg-info/top_level.txt +0 -0
- {huff-1.5.7 → huff-1.5.9}/setup.cfg +0 -0
{huff-1.5.7 → huff-1.5.9}/PKG-INFO

@@ -1,6 +1,6 @@
 Metadata-Version: 2.1
 Name: huff
-Version: 1.5.7
+Version: 1.5.9
 Summary: huff: Huff Model Market Area Analysis
 Author: Thomas Wieland
 Author-email: geowieland@googlemail.com
@@ -8,7 +8,7 @@ Description-Content-Type: text/markdown
 
 # huff: Huff Model Market Area Analysis
 
-This Python library is designed for performing market area analyses with the Huff Model (Huff 1962, 1964) and/or the Multiplicative Competitive Interaction (MCI) Model (Nakanishi and Cooper 1974, 1982). Users may load point shapefiles (or CSV, XLSX) of customer origins and supply locations and conduct a market area analysis step by step. The library supports parameter estimation based on empirical customer data using the MCI model and Maximum Likelihood. The package also includes
+This Python library is designed for performing market area analyses with the Huff Model (Huff 1962, 1964) and/or the Multiplicative Competitive Interaction (MCI) Model (Nakanishi and Cooper 1974, 1982). Users may load point shapefiles (or CSV, XLSX) of customer origins and supply locations and conduct a market area analysis step by step. The library supports parameter estimation based on empirical customer data using the MCI model and Maximum Likelihood. See Huff and McCallum (2008) or Wieland (2017) for a description of the models and their practical application. The package also includes GIS functions for market area analysis (buffer, distance matrix, overlay statistics) and clients for OpenRouteService(1) for network analysis (e.g., transport cost matrix) and OpenStreetMap(2) for simple maps.
 
 
 ## Author
@@ -18,10 +18,11 @@ Thomas Wieland [ORCID](https://orcid.org/0000-0001-5168-9846) [EMail](mailto:geo
 See the /tests directory for usage examples of most of the included functions.
 
 
-## Updates v1.5.
-- 
-- 
-
+## Updates v1.5.9
+- Bugfixes:
+- models.get_isochrones() now treats distances analogously to times
+- Update dependencies
+
 
 ## Features
{huff-1.5.7 → huff-1.5.9}/README.md

@@ -1,6 +1,6 @@
 # huff: Huff Model Market Area Analysis
 
-This Python library is designed for performing market area analyses with the Huff Model (Huff 1962, 1964) and/or the Multiplicative Competitive Interaction (MCI) Model (Nakanishi and Cooper 1974, 1982). Users may load point shapefiles (or CSV, XLSX) of customer origins and supply locations and conduct a market area analysis step by step. The library supports parameter estimation based on empirical customer data using the MCI model and Maximum Likelihood. The package also includes
+This Python library is designed for performing market area analyses with the Huff Model (Huff 1962, 1964) and/or the Multiplicative Competitive Interaction (MCI) Model (Nakanishi and Cooper 1974, 1982). Users may load point shapefiles (or CSV, XLSX) of customer origins and supply locations and conduct a market area analysis step by step. The library supports parameter estimation based on empirical customer data using the MCI model and Maximum Likelihood. See Huff and McCallum (2008) or Wieland (2017) for a description of the models and their practical application. The package also includes GIS functions for market area analysis (buffer, distance matrix, overlay statistics) and clients for OpenRouteService(1) for network analysis (e.g., transport cost matrix) and OpenStreetMap(2) for simple maps.
 
 
 ## Author
@@ -10,10 +10,11 @@ Thomas Wieland [ORCID](https://orcid.org/0000-0001-5168-9846) [EMail](mailto:geo
 See the /tests directory for usage examples of most of the included functions.
 
 
-## Updates v1.5.
-- 
-- 
-
+## Updates v1.5.9
+- Bugfixes:
+- models.get_isochrones() now treats distances analogously to times
+- Update dependencies
+
 
 ## Features
{huff-1.5.7 → huff-1.5.9}/huff/gistools.py

@@ -4,8 +4,8 @@
 # Author: Thomas Wieland
 # ORCID: 0000-0001-5168-9846
 # mail: geowieland@googlemail.com
-# Version: 1.4.
-# Last update: 2025-07
+# Version: 1.4.3
+# Last update: 2025-08-07 17:20
 # Copyright (c) 2025 Thomas Wieland
 #-----------------------------------------------------------------------
 
@@ -83,7 +83,7 @@ def distance_matrix(
         matrix.append(row)
 
     if lines_gdf:
-        return line_data
+        return gp.GeoDataFrame(line_data)
     else:
         return matrix
 
@@ -95,9 +95,15 @@ def buffers(
     donut: bool = True,
     save_output: bool = True,
     output_filepath: str = "buffers.shp",
-    output_crs: str = "EPSG:4326"
+    output_crs: str = "EPSG:4326"
     ):
+
+    if point_gdf.crs.is_geographic:
+        print(f"WARNING: Point GeoDataFrame has geographic coordinate system {point_gdf.crs}. Results may be invalid.")
 
+    if unique_id_col not in point_gdf.columns:
+        raise KeyError(f"No column {unique_id_col} in input GeoDataFrame")
+
     all_buffers_gdf = gp.GeoDataFrame(
         columns=[
             unique_id_col,
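The validation added to buffers() above follows a guard-clause pattern: warn on recoverable problems (a geographic CRS), raise on unrecoverable ones (a missing ID column). A dependency-free sketch of that pattern; the function and argument names are illustrative, not part of the huff API:

```python
# Guard-clause sketch: validate inputs up front, before any buffering happens.
def validate_point_input(columns, unique_id_col, crs_is_geographic):
    if crs_is_geographic:
        # Recoverable: buffering in degrees gives distorted rings, so only warn.
        print("WARNING: geographic coordinate system; buffer distances would be in degrees.")
    if unique_id_col not in columns:
        # Unrecoverable: downstream joins need the unique ID column.
        raise KeyError(f"No column {unique_id_col} in input GeoDataFrame")

# Passes silently for a valid input:
validate_point_input(["market_id", "geometry"], "market_id", crs_is_geographic=False)
```

Raising early keeps the failure at the call site rather than deep inside the buffer loop.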
@@ -146,15 +152,72 @@ def buffers(
 
     all_buffers_gdf = all_buffers_gdf.to_crs(output_crs)
 
-    if save_output:
-
-        all_buffers_gdf.to_file(output_filepath)
-
+    if save_output:
+        all_buffers_gdf.to_file(output_filepath)
         print ("Saved as", output_filepath)
 
     return all_buffers_gdf
 
 
+def polygon_select(
+    gdf: gp.GeoDataFrame,
+    gdf_unique_id_col: str,
+    gdf_polygon_select: gp.GeoDataFrame,
+    gdf_polygon_select_unique_id_col: str,
+    distance: int,
+    within: bool = False,
+    save_output: bool = True,
+    output_filepath: str = "polygon_select.shp",
+    output_crs: str = "EPSG:4326"
+    ):
+
+    if gdf.crs != gdf_polygon_select.crs:
+        raise ValueError(f"Coordinate reference systems of inputs do not match. Polygons: {str(gdf.crs)}, points: {str(gdf_polygon_select.crs)}")
+
+    if gdf_unique_id_col not in gdf.columns:
+        raise KeyError(f"No column {gdf_unique_id_col} in input GeoDataFrame")
+
+    if gdf_polygon_select_unique_id_col not in gdf_polygon_select.columns:
+        raise KeyError(f"No column {gdf_polygon_select_unique_id_col} in input GeoDataFrame for selection")
+
+    if gdf.crs.is_geographic:
+        print(f"WARNING: Input GeoDataFrames have geographic coordinate system {gdf.crs}. Results may be invalid.")
+
+    if len(gdf) > 1:
+        print(f"WARNING: Input GeoDataFrame 'gdf' includes > 1 objects. Using the first only.")
+        gdf = gdf[0]
+
+    gdf_buffer = buffers(
+        point_gdf = gdf,
+        unique_id_col = gdf_unique_id_col,
+        distances = [distance],
+        save_output = True,
+        output_filepath = "gdf_buffer.shp",
+        output_crs = output_crs
+        )
+
+    gdf_buffer = gdf_buffer.geometry.union_all()
+
+    gdf_polygon_select = gdf_polygon_select.to_crs(output_crs)
+
+    gdf_select_intersects = gdf_polygon_select[
+        gdf_polygon_select.geometry.intersects(gdf_buffer)
+        ]
+
+    if within:
+        gdf_select_intersects = gdf_select_intersects[gdf_select_intersects.geometry.within(gdf_buffer)]
+
+    gdf_select_intersects_unique_ids = gdf_select_intersects[gdf_polygon_select_unique_id_col].unique()
+
+    gdf_polygon_select_selection = gdf_polygon_select[gdf_polygon_select[gdf_polygon_select_unique_id_col].isin(gdf_select_intersects_unique_ids)]
+
+    if save_output:
+        gdf_polygon_select_selection.to_file(output_filepath)
+        print ("Saved as", output_filepath)
+
+    return gdf_polygon_select_selection
+
+
 def overlay_difference(
     polygon_gdf: gp.GeoDataFrame,
     sort_col: str = None,
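The new polygon_select() selects polygons that intersect (or, with within=True, lie entirely inside) a buffer around an input geometry. The core selection step can be sketched without geopandas by reducing polygons to axis-aligned bounding boxes (xmin, ymin, xmax, ymax); the intersects/within helpers here are simplified stand-ins for the shapely predicates the real function uses, and all names are illustrative:

```python
# Box overlap: the boxes are disjoint only if one lies fully left/right/above/below the other.
def intersects(a, b):
    return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

# Containment: every edge of a lies inside b.
def within(a, b):
    return a[0] >= b[0] and a[1] >= b[1] and a[2] <= b[2] and a[3] <= b[3]

def select_ids(polygons, buffer_box, require_within=False):
    """polygons: dict of unique_id -> box; returns ids touching the buffer box."""
    pred = within if require_within else intersects
    return [uid for uid, box in polygons.items() if pred(box, buffer_box)]

districts = {"A": (0, 0, 2, 2), "B": (1, 1, 3, 3), "C": (5, 5, 6, 6)}
print(select_ids(districts, (0, 0, 2.5, 2.5)))        # ['A', 'B'] — both intersect
print(select_ids(districts, (0, 0, 2.5, 2.5), True))  # ['A'] — only A lies within
```

The two-stage filter in the diff (intersects first, then optionally within) mirrors this: within implies intersects, so the second test only narrows the first selection.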
@@ -254,6 +317,7 @@ def point_spatial_join(
         spatial_join_stat
         ]
 
+
 def map_with_basemap(
     layers: list,
     osm_basemap: bool = True,
{huff-1.5.7 → huff-1.5.9}/huff/models.py

@@ -4,8 +4,8 @@
 # Author: Thomas Wieland
 # ORCID: 0000-0001-5168-9846
 # mail: geowieland@googlemail.com
-# Version: 1.5.
-# Last update: 2025-08-
+# Version: 1.5.7
+# Last update: 2025-08-13 18:47
 # Copyright (c) 2025 Thomas Wieland
 #-----------------------------------------------------------------------
 
@@ -162,7 +162,7 @@ class CustomerOrigins:
 
     def isochrones(
         self,
-
+        segments: list = [5, 10, 15],
         range_type: str = "time",
         intersections: str = "true",
         profile: str = "driving-car",
@@ -182,7 +182,7 @@ class CustomerOrigins:
         isochrones_gdf = get_isochrones(
             geodata_gpd = geodata_gpd,
             unique_id_col = metadata["unique_id"],
-
+            segments = segments,
             range_type = range_type,
             intersections = intersections,
             profile = profile,
@@ -401,7 +401,7 @@ class SupplyLocations:
 
     def isochrones(
         self,
-
+        segments: list = [5, 10, 15],
        range_type: str = "time",
         intersections: str = "true",
         profile: str = "driving-car",
@@ -421,7 +421,7 @@ class SupplyLocations:
         isochrones_gdf = get_isochrones(
             geodata_gpd = geodata_gpd,
             unique_id_col = metadata["unique_id"],
-
+            segments = segments,
             range_type = range_type,
             intersections = intersections,
             profile = profile,
@@ -3197,7 +3197,7 @@ def log_centering_transformation(
 def get_isochrones(
     geodata_gpd: gp.GeoDataFrame,
     unique_id_col: str,
-
+    segments: list = [5, 10, 15],
     range_type: str = "time",
     intersections: str = "true",
     profile: str = "driving-car",
@@ -3222,7 +3222,10 @@ def get_isochrones(
 
     isochrones_gdf = gp.GeoDataFrame(columns=[unique_id_col, "geometry"])
 
-
+    if range_type == "time":
+        segments = [segment*60 for segment in segments]
+    if range_type == "distance":
+        segments = [segment*1000 for segment in segments]
 
     i = 0
 
@@ -3254,7 +3257,10 @@
 
         isochrone_gdf[unique_id_col] = unique_id_values[i]
 
-
+        if range_type == "time":
+            isochrone_gdf["segm_min"] = isochrone_gdf["segment"]/60
+        if range_type == "distance":
+            isochrone_gdf["segm_km"] = isochrone_gdf["segment"]/1000
 
         isochrones_gdf = pd.concat(
             [
@@ -3267,7 +3273,11 @@
         i = i+1
 
     isochrones_gdf["segment"] = isochrones_gdf["segment"].astype(int)
-
+
+    if range_type == "time":
+        isochrones_gdf["segm_min"] = isochrones_gdf["segm_min"].astype(int)
+    if range_type == "distance":
+        isochrones_gdf["segm_km"] = isochrones_gdf["segm_km"].astype(int)
 
     isochrones_gdf.set_crs(
         output_crs,
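The get_isochrones() hunks above are the advertised bugfix: distance segments now get the same unit handling as time segments. The OpenRouteService isochrones endpoint expects ranges in seconds or meters, while segments are passed in minutes or kilometers, so the function scales on the way out and labels the result columns (segm_min / segm_km) on the way back. A minimal sketch of the outbound conversion (the helper name is illustrative):

```python
# Convert user-facing segments to the units the ORS isochrones API expects.
def to_ors_ranges(segments, range_type):
    if range_type == "time":
        return [segment * 60 for segment in segments]    # minutes -> seconds
    if range_type == "distance":
        return [segment * 1000 for segment in segments]  # kilometers -> meters
    raise ValueError(f"Unknown range_type: {range_type}")

print(to_ors_ranges([5, 10, 15], "time"))    # [300, 600, 900]
print(to_ors_ranges([3, 6, 9], "distance"))  # [3000, 6000, 9000]
```

Before this fix, only the "time" branch existed, so distance segments were sent to the API unscaled.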
{huff-1.5.7 → huff-1.5.9}/huff/ors.py

@@ -4,8 +4,8 @@
 # Author: Thomas Wieland
 # ORCID: 0000-0001-5168-9846
 # mail: geowieland@googlemail.com
-# Version: 1.4.
-# Last update: 2025-
+# Version: 1.4.2
+# Last update: 2025-08-13 18:47
 # Copyright (c) 2025 Thomas Wieland
 #-----------------------------------------------------------------------
 
@@ -151,6 +151,9 @@ class Client:
             "range_type": range_type
             }
 
+        print("body:") # TODO ?? RAUS
+        print(body) # TODO ?? RAUS
+
         save_config = {
             "range_type": range_type,
             "save_output": save_output,
{huff-1.5.7 → huff-1.5.9}/huff/osm.py

@@ -4,8 +4,8 @@
 # Author: Thomas Wieland
 # ORCID: 0000-0001-5168-9846
 # mail: geowieland@googlemail.com
-# Version: 1.4.
-# Last update: 2025-07
+# Version: 1.4.3
+# Last update: 2025-08-07 17:21
 # Copyright (c) 2025 Thomas Wieland
 #-----------------------------------------------------------------------
 
@@ -23,7 +23,7 @@ class Client:
         self,
         server = "http://a.tile.openstreetmap.org/",
         headers = {
-            'User-Agent': 'huff.osm/1.
+            'User-Agent': 'huff.osm/1.4.3 (your_name@your_email_provider.com)'
             }
         ):
 
@@ -40,23 +40,31 @@ class Client:
 
         osm_url = self.server + f"{zoom}/{x}/{y}.png"
 
-        response = requests.get(
-            osm_url,
-            headers = self.headers,
-            timeout = timeout
-            )
-
-        if response.status_code == 200:
-
-            with tempfile.NamedTemporaryFile(delete=False, suffix='.png') as tmp_file:
-                tmp_file.write(response.content)
-                tmp_file_path = tmp_file.name
-            return Image.open(tmp_file_path)
+        try:
 
-
-
-
-
+            response = requests.get(
+                osm_url,
+                headers = self.headers,
+                timeout = timeout
+                )
+
+            if response.status_code == 200:
+
+                with tempfile.NamedTemporaryFile(delete=False, suffix='.png') as tmp_file:
+                    tmp_file.write(response.content)
+                    tmp_file_path = tmp_file.name
+                return Image.open(tmp_file_path)
+
+            else:
+
+                print(f"Error while accessing OSM server with URL {osm_url}. Status code: {response.status_code} - {response.reason}")
+
+                return None
+
+        except Exception as e:
+
+            print(f"Error while accessing OSM server with URL {osm_url}. Error message: {e}")
+
             return None
 
 
{huff-1.5.7 → huff-1.5.9}/huff/tests/tests_huff.py

@@ -4,8 +4,8 @@
 # Author: Thomas Wieland
 # ORCID: 0000-0001-5168-9846
 # mail: geowieland@googlemail.com
-# Version: 1.5.
-# Last update: 2025-08-
+# Version: 1.5.9
+# Last update: 2025-08-13 18:46
 # Copyright (c) 2025 Thomas Wieland
 #-----------------------------------------------------------------------
 
@@ -69,7 +69,10 @@ Haslach_supermarkets.define_attraction_weighting(
 # Define attraction weighting (gamma)
 
 Haslach_supermarkets.isochrones(
-
+    segments=[3, 6, 9, 12, 15],
+    # minutes or kilometers
+    range_type = "time",
+    # "time" or "distance" (default: "time")
     profile = "foot-walking",
     save_output=True,
     ors_auth="5b3ce3597851110001cf62480a15aafdb5a64f4d91805929f8af6abd",
{huff-1.5.7 → huff-1.5.9}/huff.egg-info/PKG-INFO

@@ -1,6 +1,6 @@
 Metadata-Version: 2.1
 Name: huff
-Version: 1.5.7
+Version: 1.5.9
 Summary: huff: Huff Model Market Area Analysis
 Author: Thomas Wieland
 Author-email: geowieland@googlemail.com
@@ -8,7 +8,7 @@ Description-Content-Type: text/markdown
 
 # huff: Huff Model Market Area Analysis
 
-This Python library is designed for performing market area analyses with the Huff Model (Huff 1962, 1964) and/or the Multiplicative Competitive Interaction (MCI) Model (Nakanishi and Cooper 1974, 1982). Users may load point shapefiles (or CSV, XLSX) of customer origins and supply locations and conduct a market area analysis step by step. The library supports parameter estimation based on empirical customer data using the MCI model and Maximum Likelihood. The package also includes
+This Python library is designed for performing market area analyses with the Huff Model (Huff 1962, 1964) and/or the Multiplicative Competitive Interaction (MCI) Model (Nakanishi and Cooper 1974, 1982). Users may load point shapefiles (or CSV, XLSX) of customer origins and supply locations and conduct a market area analysis step by step. The library supports parameter estimation based on empirical customer data using the MCI model and Maximum Likelihood. See Huff and McCallum (2008) or Wieland (2017) for a description of the models and their practical application. The package also includes GIS functions for market area analysis (buffer, distance matrix, overlay statistics) and clients for OpenRouteService(1) for network analysis (e.g., transport cost matrix) and OpenStreetMap(2) for simple maps.
 
 
 ## Author
@@ -18,10 +18,11 @@ Thomas Wieland [ORCID](https://orcid.org/0000-0001-5168-9846) [EMail](mailto:geo
 See the /tests directory for usage examples of most of the included functions.
 
 
-## Updates v1.5.
-- 
-- 
-
+## Updates v1.5.9
+- Bugfixes:
+- models.get_isochrones() now treats distances analogously to times
+- Update dependencies
+
 
 ## Features
{huff-1.5.7 → huff-1.5.9}/setup.py

@@ -7,7 +7,7 @@ def read_README():
 
 setup(
     name='huff',
-    version='1.5.7',
+    version='1.5.9',
     description='huff: Huff Model Market Area Analysis',
     packages=find_packages(include=["huff", "huff.tests"]),
     include_package_data=True,
@@ -20,16 +20,16 @@ setup(
         'huff': ['tests/data/*'],
         },
     install_requires=[
-        'geopandas',
-        'pandas',
-        'numpy',
-        'statsmodels',
-        'shapely',
-        'requests',
-        'matplotlib',
-        'pillow',
-        'contextily',
-        'openpyxl'
+        'geopandas==1.1.1',
+        'pandas==2.3.1',
+        'numpy==2.3.0',
+        'statsmodels==0.14.4',
+        'shapely==2.1.1',
+        'requests==2.32.4',
+        'matplotlib==3.10',
+        'pillow==10.2.0',
+        'contextily==1.6.2',
+        'openpyxl==3.1.4'
         ],
     test_suite='tests',
     )