ObjectNat 0.0.3.tar.gz → 0.1.1.tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.

Potentially problematic release.


LICENSE
@@ -0,0 +1,28 @@
+ BSD 3-Clause License
+
+ Copyright (c) 2023, iduprojects
+
+ Redistribution and use in source and binary forms, with or without
+ modification, are permitted provided that the following conditions are met:
+
+ 1. Redistributions of source code must retain the above copyright notice, this
+    list of conditions and the following disclaimer.
+
+ 2. Redistributions in binary form must reproduce the above copyright notice,
+    this list of conditions and the following disclaimer in the documentation
+    and/or other materials provided with the distribution.
+
+ 3. Neither the name of the copyright holder nor the names of its
+    contributors may be used to endorse or promote products derived from
+    this software without specific prior written permission.
+
+ THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
+ AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
+ IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
+ DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
+ FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
+ DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
+ SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
+ CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
+ OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+ OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
PKG-INFO
@@ -0,0 +1,103 @@
+ Metadata-Version: 2.1
+ Name: ObjectNat
+ Version: 0.1.1
+ Summary: ObjectNat is an open-source library created for geospatial analysis created by IDU team
+ License: BSD-3-Clause
+ Author: Danila
+ Author-email: 63115678+DDonnyy@users.noreply.github.com
+ Requires-Python: >=3.9,<4.0
+ Classifier: License :: OSI Approved :: BSD License
+ Classifier: Programming Language :: Python :: 3
+ Classifier: Programming Language :: Python :: 3.9
+ Classifier: Programming Language :: Python :: 3.10
+ Classifier: Programming Language :: Python :: 3.11
+ Classifier: Programming Language :: Python :: 3.12
+ Requires-Dist: dongraphio (>=0.3.9,<0.4.0)
+ Requires-Dist: geopandas (>=0.14.3,<0.15.0)
+ Requires-Dist: joblib (>=1.4.2,<2.0.0)
+ Requires-Dist: networkit (>=11.0,<12.0)
+ Requires-Dist: networkx (>=3.2.1,<4.0.0)
+ Requires-Dist: numpy (>=1.23.5,<2.0.0)
+ Requires-Dist: pandarallel (>=1.6.5,<2.0.0)
+ Requires-Dist: pandas (>=2.2.0,<3.0.0)
+ Requires-Dist: population-restorator (>=0.2.3,<0.3.0)
+ Requires-Dist: provisio (>=0.1.7,<0.2.0)
+ Requires-Dist: scikit-learn (>=1.4.0,<2.0.0)
+ Requires-Dist: tqdm (>=4.66.2,<5.0.0)
+ Description-Content-Type: text/markdown
+
+ # ObjectNat - Meta Library
+
+ [![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)
+
+ <p align="center">
+ <img src="https://i.ibb.co/FWtHNQv/logo.png" alt="logo" width="400">
+ </p>
+
+ #### **ObjectNat** is an open-source library for geospatial analysis created by the **IDU team**
+
+ ## ObjectNat Components
+
+ - [dongraphio](https://github.com/DDonnyy/dongraphio) : `dongraphio` provides graph functions
+ - [provisio](https://github.com/DDonnyy/provisio) : `provisio` provides the main provision functions
+ - [population-restorator](https://github.com/kanootoko/population-restorator) : `restorator` provides city resettlement
+
+ ## Features and how to use
+
+ 1. **[City graph from OSM](./examples/graph_generator.ipynb)** - Function to assemble a road, pedestrian, and public
+    transport graph from OpenStreetMap (OSM) and create an intermodal graph.
+
+    <img src="https://i.ibb.co/VpsPgL1/Graph-generator-v1.webp" alt="Graph-generator-v1" height="250">
+
+ 2. **[Adjacency matrix](./examples/calculate_adjacency_matrix.ipynb)** - Calculate the adjacency matrix based on the
+    provided graph and edge weight type (time or distance). The intermodal graph can be obtained using the previous example.
+ 3. **[Isochrones, transport accessibility](./examples/isochrone_generator.ipynb)** - Function for generating isochrones to
+    analyze transport accessibility from specified starting coordinates. Isochrones can be constructed based on
+    pedestrian, automobile, or public transport graphs, or a combination thereof.
+
+    <img src="https://i.ibb.co/QM0tmZ2/isochrones-from-2-points.webp" alt="isochrones-from-2-points" height="250">
+
+ 4. **[Population restoration](./examples/restore_population.ipynb)** - Function for resettling population into the provided
+    layer of residential buildings. This function distributes people among dwellings based on the total city population
+    and the living area of each house.
+ 5. **[Service provision](./examples/calculate_provision.ipynb)** - Function for calculating the provision of residential
+    buildings and population with services. In case of missing data, this function utilizes previously described
+    functionality to retrieve the necessary information.
+
+    <img src="https://i.ibb.co/CW7Xj5F/Burger-Provision5min.webp" alt="Burger-Provision5min" height="250">
+
+ 6. **[Visibility analysis](./examples/visibility_analysis.ipynb)** - Function to get a quick estimate of visibility from
+    given point(s) to buildings within a given distance. There is also a visibility catchment area calculator for a large
+    urban area. This function is designed to work with at least 1000 points spaced 10-20 meters apart for optimal
+    results. Points can be generated using a road graph and random point distribution along edges.
+
+    <img src="https://i.ibb.co/LxcGTfN/visibility-from-point.webp" alt="visibility-from-point" height="250">
+
+    <img src="https://i.ibb.co/zNRzXc5/visibility-catchment-area.webp" alt="visibility-catchment-area" height="250">
+
+ 7. **[Point clusterization](./examples/point_clusterization.ipynb)** - Function to generate cluster polygons for given
+    points based on a specified minimum distance and minimum points per cluster. Optionally, calculate the relative ratio
+    between types of services within the clusters.
+
+    <img src="https://i.ibb.co/fF3c4YC/service-clusterization.webp" alt="service-clusterization" height="250">
+
+ ## Installation
+
+ **ObjectNat** can be installed with ``pip``:
+
+ ```
+ pip install ObjectNat
+ ```
+
+ ## Contacts
+
+ - [NCCR](https://actcognitive.org/) - National Center for Cognitive Research
+ - [IDU](https://idu.itmo.ru/) - Institute of Design and Urban Studies
+ - [Natalya Chichkova](https://t.me/nancy_nat) - project manager
+ - [Danila Oleynikov (Donny)](https://t.me/ddonny_dd) - lead software engineer
+
+ ## Publications
+
README.md
@@ -0,0 +1,74 @@
+ # ObjectNat - Meta Library
+
+ [![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)
+
+ <p align="center">
+ <img src="https://i.ibb.co/FWtHNQv/logo.png" alt="logo" width="400">
+ </p>
+
+ #### **ObjectNat** is an open-source library for geospatial analysis created by the **IDU team**
+
+ ## ObjectNat Components
+
+ - [dongraphio](https://github.com/DDonnyy/dongraphio) : `dongraphio` provides graph functions
+ - [provisio](https://github.com/DDonnyy/provisio) : `provisio` provides the main provision functions
+ - [population-restorator](https://github.com/kanootoko/population-restorator) : `restorator` provides city resettlement
+
+ ## Features and how to use
+
+ 1. **[City graph from OSM](./examples/graph_generator.ipynb)** - Function to assemble a road, pedestrian, and public
+    transport graph from OpenStreetMap (OSM) and create an intermodal graph.
+
+    <img src="https://i.ibb.co/VpsPgL1/Graph-generator-v1.webp" alt="Graph-generator-v1" height="250">
+
+ 2. **[Adjacency matrix](./examples/calculate_adjacency_matrix.ipynb)** - Calculate the adjacency matrix based on the
+    provided graph and edge weight type (time or distance). The intermodal graph can be obtained using the previous example.
+ 3. **[Isochrones, transport accessibility](./examples/isochrone_generator.ipynb)** - Function for generating isochrones to
+    analyze transport accessibility from specified starting coordinates. Isochrones can be constructed based on
+    pedestrian, automobile, or public transport graphs, or a combination thereof.
+
+    <img src="https://i.ibb.co/QM0tmZ2/isochrones-from-2-points.webp" alt="isochrones-from-2-points" height="250">
+
+ 4. **[Population restoration](./examples/restore_population.ipynb)** - Function for resettling population into the provided
+    layer of residential buildings. This function distributes people among dwellings based on the total city population
+    and the living area of each house.
+ 5. **[Service provision](./examples/calculate_provision.ipynb)** - Function for calculating the provision of residential
+    buildings and population with services. In case of missing data, this function utilizes previously described
+    functionality to retrieve the necessary information.
+
+    <img src="https://i.ibb.co/CW7Xj5F/Burger-Provision5min.webp" alt="Burger-Provision5min" height="250">
+
+ 6. **[Visibility analysis](./examples/visibility_analysis.ipynb)** - Function to get a quick estimate of visibility from
+    given point(s) to buildings within a given distance. There is also a visibility catchment area calculator for a large
+    urban area. This function is designed to work with at least 1000 points spaced 10-20 meters apart for optimal
+    results. Points can be generated using a road graph and random point distribution along edges.
+
+    <img src="https://i.ibb.co/LxcGTfN/visibility-from-point.webp" alt="visibility-from-point" height="250">
+
+    <img src="https://i.ibb.co/zNRzXc5/visibility-catchment-area.webp" alt="visibility-catchment-area" height="250">
+
+ 7. **[Point clusterization](./examples/point_clusterization.ipynb)** - Function to generate cluster polygons for given
+    points based on a specified minimum distance and minimum points per cluster. Optionally, calculate the relative ratio
+    between types of services within the clusters.
+
+    <img src="https://i.ibb.co/fF3c4YC/service-clusterization.webp" alt="service-clusterization" height="250">
+
+ ## Installation
+
+ **ObjectNat** can be installed with ``pip``:
+
+ ```
+ pip install ObjectNat
+ ```
+
+ ## Contacts
+
+ - [NCCR](https://actcognitive.org/) - National Center for Cognitive Research
+ - [IDU](https://idu.itmo.ru/) - Institute of Design and Urban Studies
+ - [Natalya Chichkova](https://t.me/nancy_nat) - project manager
+ - [Danila Oleynikov (Donny)](https://t.me/ddonny_dd) - lead software engineer
+
+ ## Publications
pyproject.toml
@@ -1,30 +1,27 @@
  [tool.poetry]
  name = "ObjectNat"
- version = "0.0.3"
- description = ""
+ version = "0.1.1"
+ description = "ObjectNat is an open-source library created for geospatial analysis created by IDU team"
+ license = "BSD-3-Clause"
  authors = ["Danila <63115678+DDonnyy@users.noreply.github.com>"]
  readme = "README.md"

  packages = [{ include = "objectnat", from = "src" }]

  [tool.poetry.dependencies]
- python = "^3.10"
+ python = "^3.9"
  geopandas = "^0.14.3"
- osmnx = "^1.9.1"
  tqdm = "^4.66.2"
- osm2geojson = "^0.2.4"
- pydantic = "^2.6.1"
- momepy = "^0.7.0"
  networkit = "^11.0"
  numpy = "^1.23.5"
  pandas = "^2.2.0"
  networkx = "^3.2.1"
  population-restorator = "^0.2.3"
- dongraphio = "^0.2.5"
- provisio = "^0.1.5"
-
-
-
+ dongraphio = "^0.3.9"
+ provisio = "^0.1.7"
+ joblib = "^1.4.2"
+ pandarallel = "^1.6.5"
+ scikit-learn = "^1.4.0"

  [tool.poetry.group.dev.dependencies]
  black = "^24.2.0"
src/objectnat/__init__.py
@@ -1,9 +1,18 @@
- __version__ = "0.0.3"
+ __version__ = "0.1.1"

  from dongraphio.enums import GraphType
+
  from .methods.adjacency_matrix import get_adjacency_matrix
  from .methods.balanced_buildings import get_balanced_buildings
+ from .methods.cluster_points_in_polygons import get_clusters_polygon
+ from .methods.coverage_zones import get_isochrone_zone_coverage, get_radius_zone_coverage
  from .methods.demands import get_demands
  from .methods.isochrones import get_accessibility_isochrones
  from .methods.osm_graph import get_intermodal_graph_from_osm
  from .methods.provision import NoOsmIdException, NoWeightAdjacencyException, get_provision
+ from .methods.visibility_analysis import (
+     calculate_visibility_catchment_area,
+     get_visibilities_from_points,
+     get_visibility,
+     get_visibility_accurate,
+ )
src/objectnat/methods/adjacency_matrix.py
@@ -13,6 +13,7 @@ def get_adjacency_matrix(
      city_crs: int | None = None,
      nx_graph: nx.MultiDiGraph | None = None,
      dongraphio: DonGraphio | None = None,
+     graph_type=None,
  ) -> pd.DataFrame:
      """
      Get the adjacency matrix for the specified city graph, buildings, and services.
@@ -28,11 +29,11 @@ def get_adjacency_matrix(
      """
      try:
          if dongraphio:
-             return dongraphio.get_adjacency_matrix(buildings_from, services_to, weight)
+             return dongraphio.get_adjacency_matrix(buildings_from, services_to, weight, graph_type=graph_type)

          dongraphio = DonGraphio(city_crs)
          dongraphio.set_graph(nx_graph)
-         return dongraphio.get_adjacency_matrix(buildings_from, services_to, weight)
+         return dongraphio.get_adjacency_matrix(buildings_from, services_to, weight, graph_type=graph_type)
      except ValidationError as e:
          logger.error("Function get_adjacency_matrix() missing 'weight' argument")
          raise e
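The only behavioral change in this hunk is the new optional `graph_type` argument, forwarded to dongraphio as a keyword so existing positional callers keep working. A minimal dependency-free sketch of that pattern (the function bodies here are hypothetical stand-ins, not the real dongraphio API):

```python
def _downstream(buildings_from, services_to, weight, graph_type=None):
    # stand-in for DonGraphio.get_adjacency_matrix: graph_type stays None
    # for callers that predate the new parameter
    return {"weight": weight, "graph_type": graph_type}


def get_adjacency_matrix(buildings_from, services_to, weight, graph_type=None):
    # forwarding graph_type as a keyword keeps the positional API unchanged
    return _downstream(buildings_from, services_to, weight, graph_type=graph_type)
```

Because the parameter defaults to `None` at every level, code written against 0.0.3's three-argument call sites continues to work unchanged.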
src/objectnat/methods/cluster_points_in_polygons.py
@@ -0,0 +1,113 @@
+ from typing import Literal
+
+ import geopandas as gpd
+ import pandas as pd
+ from loguru import logger
+ from sklearn.cluster import DBSCAN, HDBSCAN
+
+
+ def _get_cluster(services_select, min_dist, min_point, method):
+     services_coords = pd.DataFrame(
+         {"x": services_select.geometry.representative_point().x, "y": services_select.geometry.representative_point().y}
+     )
+     if method == "DBSCAN":
+         db = DBSCAN(eps=min_dist, min_samples=min_point).fit(services_coords.to_numpy())
+     else:
+         db = HDBSCAN(min_cluster_size=min_point, cluster_selection_epsilon=min_dist).fit(services_coords.to_numpy())
+     services_select["cluster"] = db.labels_
+     return services_select
+
+
+ def _get_service_ratio(loc):
+     all_services = loc.shape[0]
+     loc["service_code"] = loc["service_code"].astype(str)
+     services_count = loc.groupby("service_code").size()
+     return (services_count / all_services).round(2)
+
+
+ def get_clusters_polygon(
+     points: gpd.GeoDataFrame,
+     min_dist: float | int = 100,
+     min_point: int = 5,
+     method: Literal["DBSCAN", "HDBSCAN"] = "HDBSCAN",
+ ) -> tuple[gpd.GeoDataFrame, gpd.GeoDataFrame]:
+     """
+     Generate cluster polygons for given points based on a specified minimum distance and minimum points per cluster.
+     Optionally, calculate the relative ratio between types of services within the clusters.
+
+     Parameters
+     ----------
+     points : gpd.GeoDataFrame
+         GeoDataFrame containing the points to be clustered.
+         Must include a 'service_code' column for service ratio calculations.
+     min_dist : float | int, optional
+         Minimum distance between points to be considered part of the same cluster. Defaults to 100.
+     min_point : int, optional
+         Minimum number of points required to form a cluster. Defaults to 5.
+     method : Literal["DBSCAN", "HDBSCAN"], optional
+         The clustering method to use. Must be either "DBSCAN" or "HDBSCAN". Defaults to "HDBSCAN".
+
+     Returns
+     -------
+     tuple[gpd.GeoDataFrame, gpd.GeoDataFrame]
+         A tuple containing the clustered polygons GeoDataFrame and the original points GeoDataFrame with cluster labels.
+
+     Examples
+     --------
+     >>> import geopandas as gpd
+     >>> from shapely.geometry import Point
+
+     >>> points = gpd.GeoDataFrame({
+     ...     'geometry': [Point(0, 0), Point(1, 1), Point(2, 2)],
+     ...     'service_code': [1, 1, 2]
+     ... }, crs=4326)
+
+     >>> clusters, services = get_clusters_polygon(points, min_dist=50, min_point=2)
+     """
+     if method not in ["DBSCAN", "HDBSCAN"]:
+         raise ValueError("Method must be either 'DBSCAN' or 'HDBSCAN'")
+
+     services_select = _get_cluster(points, min_dist, min_point, method)
+
+     if "service_code" not in points.columns:
+         logger.warning(
+             "No 'service_code' column in provided GeoDataFrame, cluster polygons will be without relative ratio."
+         )
+         points["service_code"] = 1
+
+     services_normal = services_select[services_select["cluster"] != -1]
+     services_outlier = services_select[services_select["cluster"] == -1]
+
+     if len(services_normal) > 0:
+         cluster_service = services_normal.groupby("cluster", group_keys=True).apply(_get_service_ratio)
+         if isinstance(cluster_service, pd.Series):
+             cluster_service = cluster_service.unstack(level=1, fill_value=0)
+
+         polygons_normal = services_normal.dissolve("cluster").concave_hull(ratio=0.7, allow_holes=False)
+         df_clusters_normal = pd.concat([cluster_service, polygons_normal.rename("geometry")], axis=1)
+         cluster_normal = df_clusters_normal.index.max()
+     else:
+         df_clusters_normal = None
+         cluster_normal = 0
+
+     if len(services_outlier) > 0:
+         clusters_outlier = cluster_normal + 1
+         new_clusters = list(range(clusters_outlier, clusters_outlier + len(services_outlier)))
+         services_outlier.loc[:, "cluster"] = new_clusters
+
+         cluster_service = services_outlier.groupby("cluster", group_keys=True).apply(_get_service_ratio)
+         if isinstance(cluster_service, pd.Series):
+             cluster_service = cluster_service.unstack(level=1, fill_value=0)
+
+         df_clusters_outlier = cluster_service.join(services_outlier.set_index("cluster")["geometry"])
+     else:
+         services_outlier = None
+         df_clusters_outlier = None
+
+     df_clusters = pd.concat([df_clusters_normal, df_clusters_outlier]).fillna(0).set_geometry("geometry")
+     df_clusters["geometry"] = df_clusters["geometry"].buffer(min_dist / 2)
+     df_clusters = df_clusters.rename(columns={"index": "cluster_id"})
+
+     services = pd.concat([services_normal, services_outlier])
+
+     return df_clusters, services
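In the file above, `_get_service_ratio` reduces each cluster to the share of every service type, rounded to two decimals. The same computation in dependency-free Python (a sketch with a hypothetical helper name, no pandas):

```python
from collections import Counter


def service_ratio(service_codes):
    """Share of each service type among the points of one cluster,
    rounded to two decimals (mirrors _get_service_ratio above)."""
    counts = Counter(str(code) for code in service_codes)
    total = len(service_codes)
    return {code: round(n / total, 2) for code, n in counts.items()}
```

As in the library code, codes are stringified before grouping, so a cluster whose `service_code` values are `[1, 1, 2, 3]` yields `{"1": 0.5, "2": 0.25, "3": 0.25}`.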
src/objectnat/methods/coverage_zones.py
@@ -0,0 +1,100 @@
+ from typing import Literal
+
+ import geopandas as gpd
+ import networkx as nx
+ from dongraphio import GraphType
+
+ from .isochrones import get_accessibility_isochrones
+
+
+ def get_radius_zone_coverage(services: gpd.GeoDataFrame, radius: int) -> gpd.GeoDataFrame:
+     """
+     Create a buffer zone with a defined radius around each service location.
+
+     Parameters
+     ----------
+     services : gpd.GeoDataFrame
+         GeoDataFrame containing the service locations.
+     radius : int
+         The radius for the buffer in meters.
+
+     Returns
+     -------
+     gpd.GeoDataFrame
+         GeoDataFrame with the buffer zones around each service location.
+
+     Examples
+     --------
+     >>> import geopandas as gpd
+     >>> from shapely.geometry import Point
+
+     >>> # Load a sample GeoDataFrame of services
+     >>> services = gpd.read_file('services.geojson')
+
+     >>> # Define the radius
+     >>> radius = 50
+
+     >>> # Get radius zone coverage
+     >>> radius_zones = get_radius_zone_coverage(services, radius)
+     >>> print(radius_zones)
+     """
+     services["geometry"] = services["geometry"].buffer(radius)
+     return services
+
+
+ def get_isochrone_zone_coverage(
+     services: gpd.GeoDataFrame,
+     weight_type: Literal["time_min", "length_meter"],
+     weight_value: int,
+     city_graph: nx.Graph,
+     graph_type: list[GraphType],
+ ) -> tuple[gpd.GeoDataFrame, gpd.GeoDataFrame | None, gpd.GeoDataFrame | None]:
+     """
+     Create isochrones for each service location based on travel time/distance.
+
+     Parameters
+     ----------
+     services : gpd.GeoDataFrame
+         GeoDataFrame containing the service locations.
+     weight_type : str
+         Type of weight used for calculating isochrones, either "time_min" or "length_meter".
+     weight_value : int
+         The value of the weight, representing time in minutes or distance in meters.
+     city_graph : nx.Graph
+         The graph representing the city's transportation network.
+     graph_type : list[GraphType]
+         List of graph types to be used for isochrone calculations.
+
+     Returns
+     -------
+     tuple[gpd.GeoDataFrame, gpd.GeoDataFrame | None, gpd.GeoDataFrame | None]
+         The calculated isochrone zones, optionally including routes and stops.
+
+     Examples
+     --------
+     >>> import networkx as nx
+     >>> import geopandas as gpd
+     >>> from shapely.geometry import Point
+     >>> from dongraphio import GraphType
+
+     >>> # Create a sample city graph or download it from OSM with get_intermodal_graph_from_osm()
+     >>> city_graph = nx.MultiDiGraph()
+
+     >>> # Load a sample GeoDataFrame of services
+     >>> services = gpd.read_file('services.geojson')
+
+     >>> # Define parameters
+     >>> weight_type = "time_min"
+     >>> weight_value = 10
+     >>> graph_type = [GraphType.PUBLIC_TRANSPORT, GraphType.WALK]
+
+     >>> # Get isochrone zone coverage
+     >>> isochrone_zones = get_isochrone_zone_coverage(services, weight_type, weight_value, city_graph, graph_type)
+     >>> isochrone_zones[0]  # the isochrones GeoDataFrame
+     """
+     assert services.crs == city_graph.graph["crs"], "CRS not match"
+     points = services.geometry.representative_point()
+     isochrone_res = get_accessibility_isochrones(
+         points, graph_type, weight_value, weight_type, city_graph, points.crs.to_epsg()
+     )
+     return isochrone_res
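`get_radius_zone_coverage` above simply buffers each service geometry by `radius`; in a projected (metric) CRS, membership in such a zone is equivalent to a planar distance test against the service point. A stdlib-only sketch of that equivalence (the helper name is hypothetical, not part of the library):

```python
import math


def covered_by_radius_zone(service_xy, point_xy, radius):
    # a point lies inside the buffered zone iff its planar distance
    # to the service is at most the buffer radius (projected CRS assumed)
    return math.dist(service_xy, point_xy) <= radius
```

This is why the docstring specifies the radius in meters: with a geographic CRS such as EPSG:4326 the buffer would be in degrees and the distance interpretation would break.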
src/objectnat/methods/isochrones.py
@@ -0,0 +1,66 @@
+ import geopandas as gpd
+ import networkx as nx
+ from dongraphio import DonGraphio, GraphType
+ from shapely import Point
+
+
+ def get_accessibility_isochrones(
+     points: Point | gpd.GeoSeries,
+     graph_type: list[GraphType],
+     weight_value: int,
+     weight_type: str,
+     city_graph: nx.MultiDiGraph,
+     city_crs: int,
+ ) -> tuple[gpd.GeoDataFrame, gpd.GeoDataFrame | None, gpd.GeoDataFrame | None]:
+     """
+     Calculate accessibility isochrones based on the provided city graph for the selected graph_type from point(s).
+
+     Parameters
+     ----------
+     points : Point or gpd.GeoSeries
+         Points from which the isochrones will be calculated.
+     graph_type : list[GraphType]
+         List of graph types to calculate isochrones for.
+     weight_value : int
+         Weight value for the accessibility calculations.
+     weight_type : str
+         The type of the weight, either "time_min" or "length_meter".
+     city_graph : nx.MultiDiGraph
+         The graph representing the city.
+     city_crs : int
+         The CRS (Coordinate Reference System) for the city.
+
+     Returns
+     -------
+     tuple[gpd.GeoDataFrame, gpd.GeoDataFrame | None, gpd.GeoDataFrame | None]
+         A tuple containing the accessibility isochrones, and optionally the routes and stops.
+
+     Examples
+     --------
+     >>> import networkx as nx
+     >>> import geopandas as gpd
+     >>> from shapely.geometry import Point
+     >>> from dongraphio import GraphType
+
+     >>> # Create a sample city graph or download it from OSM with get_intermodal_graph_from_osm()
+     >>> city_graph = nx.MultiDiGraph()
+
+     >>> # Define parameters
+     >>> graph_type = [GraphType.PUBLIC_TRANSPORT, GraphType.WALK]
+     >>> points = gpd.GeoSeries([Point(0, 0)])
+     >>> weight_value = 15
+     >>> weight_type = "time_min"
+     >>> city_crs = 4326  # Must match the CRS of the city graph
+
+     >>> # Calculate isochrones
+     >>> isochrones, routes, stops = get_accessibility_isochrones(
+     ...     points, graph_type, weight_value, weight_type, city_graph, city_crs
+     ... )
+
+     >>> print(isochrones)
+     >>> print(routes)
+     >>> print(stops)
+     """
+     dongraphio = DonGraphio(city_crs)
+     dongraphio.set_graph(city_graph)
+     return dongraphio.get_accessibility_isochrones(graph_type, points, weight_value, weight_type)
src/objectnat/methods/provision.py
@@ -126,4 +126,10 @@ def get_provision(
      if calculate_matrix:
          adjacency_matrix = dngraph.get_adjacency_matrix(buildings, services, weight_adjacency_matrix)

-     return get_service_provision(services, adjacency_matrix, buildings, threshold, calculation_type)
+     return get_service_provision(
+         services=services,
+         adjacency_matrix=adjacency_matrix,
+         demanded_buildings=buildings,
+         threshold=threshold,
+         calculation_type=calculation_type,
+     )
@@ -0,0 +1,529 @@
1
+ import math
2
+ from multiprocessing import cpu_count
3
+
4
+ import geopandas as gpd
5
+ import numpy as np
6
+ import pandas as pd
7
+ from loguru import logger
8
+ from pandarallel import pandarallel
9
+ from shapely import LineString, MultiPolygon, Point, Polygon
10
+ from shapely.ops import polygonize, unary_union
11
+ from tqdm.contrib.concurrent import process_map
12
+
13
+
14
+ def get_visibility_accurate(point_from: Point, obstacles: gpd.GeoDataFrame, view_distance) -> Polygon:
15
+ """
16
+ Function to get accurate visibility from a given point to buildings within a given distance.
17
+
18
+ Parameters
19
+ ----------
20
+ point_from : Point
21
+ The point from which the line of sight is drawn.
22
+ obstacles : gpd.GeoDataFrame
23
+ A GeoDataFrame containing the geometry of the obstacles.
24
+ view_distance : float
25
+ The distance of view from the point.
26
+
27
+ Returns
28
+ -------
29
+ Polygon
30
+ A polygon representing the area of visibility from the given point.
31
+
32
+ Notes
33
+ -----
34
+ If a quick result is important, consider using the `get_visibility_result()` function instead.
35
+ However, please note that `get_visibility_result()` may provide less accurate results.
36
+
37
+ Examples
38
+ --------
39
+ >>> point_from = Point(1, 1)
40
+ >>> buildings = gpd.read_file('buildings.shp')
41
+ >>> view_distance = 1000
42
+ >>> visibility = get_visibility_accurate(point_from, obstacles, view_distance)
43
+ """
44
+
45
+ def get_point_from_a_thorough_b(a: Point, b: Point, dist):
46
+ """
47
+ Func to get Point from point a thorough point b on dist
48
+ """
49
+ direction = math.atan2(b.y - a.y, b.x - a.x)
50
+ c_x = a.x + dist * math.cos(direction)
51
+ c_y = a.y + dist * math.sin(direction)
52
+ return Point(c_x, c_y)
53
+
54
+ def polygon_to_linestring(geometry: Polygon):
55
+ """A function to return all segments of a polygon as a list of linestrings"""
56
+ coords_ext = geometry.exterior.coords # Create a list of all line node coordinates
57
+ polygons_inter = [Polygon(x) for x in geometry.interiors]
58
+ result = [LineString(part) for part in zip(coords_ext, coords_ext[1:])]
59
+ for poly in polygons_inter:
60
+ poly_coords = poly.exterior.coords
61
+ result.extend([LineString(part) for part in zip(poly_coords, poly_coords[1:])])
62
+ return result
63
+
64
+ point_buffer = point_from.buffer(view_distance, resolution=32)
65
+ s = obstacles.intersects(point_buffer)
66
+ buildings_in_buffer = obstacles.loc[s[s].index]
67
+ # TODO kick all geoms except Polygons/MultiPolygons
68
+ buildings_in_buffer = buildings_in_buffer.geometry.apply(
69
+ lambda x: list(x.geoms) if isinstance(x, MultiPolygon) else x
70
+ ).explode()
71
+
72
+ buildings_lines_in_buffer = gpd.GeoSeries(buildings_in_buffer.apply(polygon_to_linestring).explode())
73
+ buildings_lines_in_buffer = buildings_lines_in_buffer.loc[buildings_lines_in_buffer.intersects(point_buffer)]
74
+
75
+ buildings_in_buffer_points = gpd.GeoSeries(
76
+ [Point(line.coords[0]) for line in buildings_lines_in_buffer.geometry]
77
+ + [Point(line.coords[-1]) for line in buildings_lines_in_buffer.geometry]
78
+ )
79
+
80
+ max_dist = max(view_distance, buildings_in_buffer_points.distance(point_from).max())
81
+ polygons = []
82
+ buildings_lines_in_buffer = gpd.GeoDataFrame(geometry=buildings_lines_in_buffer, crs=obstacles.crs).reset_index(
83
+ drop=True
84
+ )
85
+ iteration = 0
86
+ while not buildings_lines_in_buffer.empty:
87
+ iteration += 1
88
+ gdf_sindex = buildings_lines_in_buffer.sindex
89
+ # TODO check if 2 walls are nearest and use the widest angle between points
90
+ nearest_wall_sind = gdf_sindex.nearest(point_from, return_all=False)
91
+ nearest_wall = buildings_lines_in_buffer.loc[nearest_wall_sind[1]].iloc[0]
92
+ wall_points = [Point(coords) for coords in nearest_wall.geometry.coords]
93
+
94
+ # Calculate angles and sort by angle
95
+ points_with_angle = sorted(
96
+ [(pt, math.atan2(pt.y - point_from.y, pt.x - point_from.x)) for pt in wall_points], key=lambda x: x[1]
97
+ )
98
+
99
+ delta_angle = 2 * math.pi + points_with_angle[0][1] - points_with_angle[-1][1]
100
+ if delta_angle > math.pi:
101
+ delta_angle = 2 * math.pi - delta_angle
102
+ a = math.sqrt((max_dist**2) * (1 + (math.tan(delta_angle / 2) ** 2)))
103
+ p1 = get_point_from_a_thorough_b(point_from, points_with_angle[0][0], a)
104
+ p2 = get_point_from_a_thorough_b(point_from, points_with_angle[1][0], a)
105
+ polygon = Polygon([points_with_angle[0][0], p1, p2, points_with_angle[1][0]])
106
+
107
+ polygons.append(polygon)
108
+
109
+ buildings_lines_in_buffer.drop(nearest_wall_sind[1], inplace=True)
110
+
111
+ lines_to_kick = buildings_lines_in_buffer.within(polygon)
112
+ buildings_lines_in_buffer = buildings_lines_in_buffer.loc[~lines_to_kick]
113
+ buildings_lines_in_buffer.reset_index(drop=True, inplace=True)
114
+ res = point_buffer.difference(unary_union(polygons))
115
+ if isinstance(res, Polygon):
116
+ return res
117
+ res = list(res.geoms)
118
+ polygon_containing_point = None
119
+ for polygon in res:
120
+ if polygon.contains(point_from):
121
+ polygon_containing_point = polygon
122
+ break
123
+ return polygon_containing_point
124
+
125
+
126
+ def get_visibility(point: Point, obstacles: gpd.GeoDataFrame, view_distance: float, resolution: int = 32) -> Polygon:
+ """
+ Function to get a quick estimate of visibility from a given point to buildings within a given distance.
+
+ Parameters
+ ----------
+ point : Point
+ The point from which the line of sight is drawn.
+ obstacles : gpd.GeoDataFrame
+ A GeoDataFrame containing the geometry of the buildings.
+ view_distance : float
+ The distance of view from the point.
+ resolution : int
+ Buffer resolution; higher values give a more accurate result but slow down the computation.
+
+ Returns
+ -------
+ Polygon
+ A polygon representing the estimated area of visibility from the given point.
+
+ Notes
+ -----
+ This function provides a quicker but less accurate result compared to `get_visibility_accurate()`.
+ If accuracy is important, consider using `get_visibility_accurate()` instead.
+
+ Examples
+ --------
+ >>> point = Point(1, 1)
+ >>> obstacles = gpd.read_file('buildings.shp')
+ >>> view_distance = 1000
+ >>> visibility = get_visibility(point, obstacles, view_distance)
+ """
+
+ point_buffer = point.buffer(view_distance, resolution=resolution)
+ s = obstacles.within(point_buffer)
+ buildings_in_buffer = obstacles.loc[s[s].index].reset_index(drop=True)
+ buffer_exterior_ = list(point_buffer.exterior.coords)
+ line_geometry = [LineString([point, ext]) for ext in buffer_exterior_]
+ buffer_lines_gdf = gpd.GeoDataFrame(geometry=line_geometry)
+ united_buildings = buildings_in_buffer.unary_union
+ if united_buildings:
+ splited_lines = buffer_lines_gdf["geometry"].apply(lambda x: x.difference(united_buildings))
+ else:
+ splited_lines = buffer_lines_gdf["geometry"]
+
+ splited_lines_gdf = gpd.GeoDataFrame(geometry=splited_lines).explode(index_parts=True)
+ splited_lines_list = []
+
+ for _, v in splited_lines_gdf.groupby(level=0):
+ splited_lines_list.append(v.iloc[0]["geometry"].coords[-1])
+ circuit = Polygon(splited_lines_list)
+ if united_buildings:
+ circuit = circuit.difference(united_buildings)
+ return circuit
+
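The ray-casting idea behind `get_visibility` can be illustrated standalone. The sketch below mirrors the function's approach (rays toward the buffer's exterior vertices, clipped at the first obstacle) on a hypothetical scene; the viewer, building, and all intermediate names are mine, not the package's, and this is a simplified sketch rather than the exact implementation.

```python
from shapely.geometry import LineString, Point, Polygon

# Hypothetical scene: one rectangular building 10 m east of a viewer at the origin.
viewer = Point(0, 0)
building = Polygon([(10, -5), (20, -5), (20, 5), (10, 5)])
view_distance = 50

# Cast a ray toward every vertex of the view-distance buffer and keep,
# for each ray, the farthest point still reachable before the obstacle.
circle = viewer.buffer(view_distance, resolution=32)
visible_pts = []
for x, y in circle.exterior.coords:
    ray = LineString([viewer, (x, y)])
    clipped = ray.difference(building)
    if clipped.geom_type == "MultiLineString":
        # keep the piece touching the viewer: the visible part of the ray
        clipped = min(clipped.geoms, key=lambda seg: seg.distance(viewer))
    # the endpoint farthest from the viewer is where this ray stops
    visible_pts.append(max(clipped.coords, key=lambda c: viewer.distance(Point(c))))

visibility = Polygon(visible_pts)
assert visibility.area < circle.area  # the building shadows part of the circle
```

Connecting the per-ray stop points in angular order yields the visibility polygon; the shadow behind the building is carved out of the view circle.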
+
+
+ def _multiprocess_get_vis(args):
+ point, buildings, view_distance, sectors_n = args
+ result = get_visibility_accurate(point, buildings, view_distance)
+
+ if sectors_n is not None:
+ sectors = []
+
+ cx, cy = point.x, point.y
+
+ angle_increment = 2 * math.pi / sectors_n
+ view_distance = math.sqrt((view_distance**2) * (1 + (math.tan(angle_increment / 2) ** 2)))
+ for i in range(sectors_n):
+ angle1 = i * angle_increment
+ angle2 = (i + 1) * angle_increment
+
+ x1, y1 = cx + view_distance * math.cos(angle1), cy + view_distance * math.sin(angle1)
+ x2, y2 = cx + view_distance * math.cos(angle2), cy + view_distance * math.sin(angle2)
+
+ sector_triangle = Polygon([point, (x1, y1), (x2, y2)])
+ sector = result.intersection(sector_triangle)
+
+ if not sector.is_empty:
+ sectors.append(sector)
+ result = sectors
+ return result
+
+
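The `view_distance` rescaling in `_multiprocess_get_vis` looks opaque but is the identity R·sqrt(1 + tan²(θ/2)) = R/cos(θ/2): it pushes the triangle corners out just far enough that the triangle's straight outer edge never cuts inside the original view circle, so intersecting the triangles with the visibility polygon yields clean sectors. A standalone check of my reading of the formula (the helper name `sector_corner_radius` is mine, not the package's):

```python
import math

def sector_corner_radius(view_distance: float, sectors_n: int) -> float:
    """Corner radius used for the sector triangles, as in _multiprocess_get_vis."""
    angle_increment = 2 * math.pi / sectors_n
    return math.sqrt(view_distance**2 * (1 + math.tan(angle_increment / 2) ** 2))

R, n = 500.0, 12
theta = 2 * math.pi / n
R_corner = sector_corner_radius(R, n)

# Identity: sqrt(R^2 * (1 + tan^2(x))) == R / cos(x)
assert math.isclose(R_corner, R / math.cos(theta / 2))

# The midpoint of the triangle's outer edge sits exactly at distance R,
# so the straight edge is tangent to (never inside) the view circle.
x1, y1 = R_corner * math.cos(0), R_corner * math.sin(0)
x2, y2 = R_corner * math.cos(theta), R_corner * math.sin(theta)
mx, my = (x1 + x2) / 2, (y1 + y2) / 2
assert math.isclose(math.hypot(mx, my), R)
```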
+ def _polygons_to_linestring(geom):
+ # pylint: disable-next=redefined-outer-name,reimported,import-outside-toplevel
+ from shapely import LineString, MultiLineString, MultiPolygon
+
+ def convert_polygon(polygon: Polygon):
+ lines = []
+ exterior = LineString(polygon.exterior.coords)
+ lines.append(exterior)
+ interior = [LineString(p.coords) for p in polygon.interiors]
+ lines = lines + interior
+ return lines
+
+ def convert_multipolygon(polygon: MultiPolygon):
+ return MultiLineString(sum([convert_polygon(p) for p in polygon.geoms], []))
+
+ if geom.geom_type == "Polygon":
+ return MultiLineString(convert_polygon(geom))
+ if geom.geom_type == "MultiPolygon":
+ return convert_multipolygon(geom)
+ return geom
+
+
+ def _combine_geometry(gdf: gpd.GeoDataFrame) -> gpd.GeoDataFrame:
+ """
+ Combine the geometry of intersecting layers into a single GeoDataFrame.
+
+ Parameters
+ ----------
+ gdf : gpd.GeoDataFrame
+ A GeoPandas GeoDataFrame
+
+ Returns
+ -------
+ gpd.GeoDataFrame
+ The combined GeoDataFrame, with attribute columns aggregated into lists.
+
+ Examples
+ --------
+ >>> gdf = gpd.read_file('path_to_your_file.geojson')
+ >>> result = _combine_geometry(gdf)
+ """
+
+ crs = gdf.crs
+ polygons = polygonize(gdf["geometry"].apply(_polygons_to_linestring).unary_union)
+ enclosures = gpd.GeoSeries(list(polygons), crs=crs)
+ enclosures_points = gpd.GeoDataFrame(enclosures.representative_point(), columns=["geometry"], crs=crs)
+ joined = gpd.sjoin(enclosures_points, gdf, how="inner", predicate="within").reset_index()
+ cols = joined.columns.tolist()
+ cols.remove("geometry")
+ joined = joined.groupby("index").agg({column: list for column in cols})
+ joined["geometry"] = enclosures
+ joined = gpd.GeoDataFrame(joined, geometry="geometry", crs=crs)
+ return joined
+
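The polygonize-over-noded-boundaries trick that `_combine_geometry` relies on can be seen on a toy input: converting polygons to their boundary lines, unioning the lines (which nodes them at crossings), and polygonizing rebuilds every enclosed face exactly once, so overlap regions become their own rows. A minimal sketch with synthetic squares, not package data:

```python
import math

from shapely.geometry import box
from shapely.ops import polygonize, unary_union

# Two overlapping squares stand in for two intersecting visibility layers.
a = box(0, 0, 2, 2)
b = box(1, 0, 3, 2)

# unary_union nodes the boundaries at their crossing points, and
# polygonize then reassembles every enclosed face exactly once.
edges = unary_union([a.boundary, b.boundary])
faces = list(polygonize(edges))

assert len(faces) == 3  # left-only part, the overlap, right-only part
assert math.isclose(sum(f.area for f in faces), unary_union([a, b]).area)
```

The representative-point spatial join then recovers, for each face, the list of source rows it came from.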
+
+ def _min_max_normalization(data, new_min=0, new_max=1):
+ """
+ Min-max normalization for a given array of data.
+
+ Parameters
+ ----------
+ data : numpy.ndarray
+ Input data to be normalized.
+ new_min : float, optional
+ New minimum value for normalization. Defaults to 0.
+ new_max : float, optional
+ New maximum value for normalization. Defaults to 1.
+
+ Returns
+ -------
+ numpy.ndarray
+ Normalized data.
+
+ Examples
+ --------
+ >>> import numpy as np
+ >>> data = np.array([1, 2, 3, 4, 5])
+ >>> normalized_data = _min_max_normalization(data, new_min=0, new_max=1)
+ """
+
+ min_value = np.min(data)
+ max_value = np.max(data)
+ normalized_data = (data - min_value) / (max_value - min_value) * (new_max - new_min) + new_min
+ return normalized_data
+
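A quick standalone check of the rescaling formula above (the helper name `min_max_scale` is mine; the arithmetic is the same affine map):

```python
import numpy as np

def min_max_scale(data, new_min=0.0, new_max=1.0):
    # Same affine rescaling as _min_max_normalization; note it divides by
    # (max - min), so constant input would need a guard in real use.
    lo, hi = np.min(data), np.max(data)
    return (data - lo) / (hi - lo) * (new_max - new_min) + new_min

scaled = min_max_scale(np.array([1.0, 2.0, 3.0, 4.0, 5.0]), new_min=1, new_max=10)
assert np.allclose(scaled, [1.0, 3.25, 5.5, 7.75, 10.0])
```

The `new_min=1, new_max=10` range used throughout the module keeps every ratio strictly positive, which matters for the geometric-mean factor computed in `_process_group`.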
+
+ def _process_group(group):
+ geom = group
+ combined_geometry = _combine_geometry(geom)
+ combined_geometry.drop(columns=["index", "index_right"], inplace=True)
+ combined_geometry["count_n"] = combined_geometry["ratio"].apply(len)
+ combined_geometry["new_ratio"] = combined_geometry.apply(
+ lambda x: np.power(np.prod(x.ratio), 1 / x.count_n) * x.count_n, axis=1
+ )
+
+ threshold = combined_geometry["new_ratio"].quantile(0.25)
+ combined_geometry = combined_geometry[combined_geometry["new_ratio"] > threshold]
+
+ combined_geometry["new_ratio_normalized"] = _min_max_normalization(
+ combined_geometry["new_ratio"].values, new_min=1, new_max=10
+ )
+
+ combined_geometry["new_ratio_normalized"] = np.round(combined_geometry["new_ratio_normalized"]).astype(int)
+
+ result_union = (
+ combined_geometry.groupby("new_ratio_normalized")
+ .agg({"geometry": lambda x: unary_union(MultiPolygon(list(x)).buffer(0))})
+ .reset_index(drop=True)
+ )
+ result_union.set_geometry("geometry", inplace=True)
+ result_union.set_crs(geom.crs, inplace=True)
+
+ result_union = result_union.explode("geometry", index_parts=False).reset_index(drop=True)
+
+ representative_points = combined_geometry.copy()
+ representative_points["geometry"] = representative_points["geometry"].representative_point()
+
+ joined = gpd.sjoin(result_union, representative_points, how="inner", predicate="contains").reset_index()
+ joined = joined.groupby("index").agg({"geometry": "first", "new_ratio": lambda x: np.mean(list(x))})
+
+ joined.set_geometry("geometry", inplace=True)
+ joined.set_crs(geom.crs, inplace=True)
+ return joined
+
+
+ def get_visibilities_from_points(
+ points: gpd.GeoDataFrame,
+ obstacles: gpd.GeoDataFrame,
+ view_distance: int,
+ sectors_n=None,
+ max_workers: int = cpu_count(),
+ ) -> list[Polygon]:
+ """
+ Calculate visibility polygons from a set of points considering obstacles within a specified view distance.
+
+ Parameters
+ ----------
+ points : gpd.GeoDataFrame
+ GeoDataFrame containing the points from which visibility is calculated.
+ obstacles : gpd.GeoDataFrame
+ GeoDataFrame containing the obstacles that block visibility.
+ view_distance : int
+ The maximum distance from each point within which visibility is calculated.
+ sectors_n : int, optional
+ Number of sectors to divide the view into for more detailed visibility calculations. Defaults to None.
+ max_workers : int, optional
+ Maximum number of worker processes; defaults to multiprocessing.cpu_count().
+
+ Returns
+ -------
+ list[Polygon]
+ A list of visibility polygons for each input point.
+
+ Notes
+ -----
+ This function runs `get_visibility_accurate()` in parallel across multiple processes.
+
+ Examples
+ --------
+ >>> import geopandas as gpd
+ >>> from shapely.geometry import Point, Polygon
+ >>> points = gpd.GeoDataFrame({'geometry': [Point(0, 0), Point(1, 1)]}, crs='epsg:4326')
+ >>> obstacles = gpd.GeoDataFrame({'geometry': [Polygon([(0, 0), (0, 1), (1, 1), (1, 0)])]}, crs='epsg:4326')
+ >>> view_distance = 100
+
+ >>> visibilities = get_visibilities_from_points(points, obstacles, view_distance)
+ >>> visibilities
+ """
+ # remove points inside polygons
+ joined = gpd.sjoin(points, obstacles, how="left", predicate="intersects")
+ points = joined[joined.index_right.isnull()]
+
+ # remove unused obstacles
+ points_view = points.geometry.buffer(view_distance).unary_union
+ s = obstacles.intersects(points_view)
+ buildings_in_buffer = obstacles.loc[s[s].index].reset_index(drop=True)
+
+ buildings_in_buffer.geometry = buildings_in_buffer.geometry.apply(
+ lambda geom: MultiPolygon([geom]) if isinstance(geom, Polygon) else geom
+ )
+ args = [(point, buildings_in_buffer, view_distance, sectors_n) for point in points.geometry]
+ all_visions = process_map(
+ _multiprocess_get_vis,
+ args,
+ chunksize=5,
+ desc="Calculating Visibility Catchment Area from each Point, it might take a while for a "
+ "large number of points",
+ max_workers=max_workers,
+ )
+ # could return sectorized visions if sectors_n is set
+ return all_visions
+
+
+ def calculate_visibility_catchment_area(
+ points: gpd.GeoDataFrame, obstacles: gpd.GeoDataFrame, view_distance: int | float, max_workers: int = cpu_count()
+ ) -> gpd.GeoDataFrame:
+ """
+ Calculate visibility catchment areas for a large urban area based on given points and obstacles.
+ This function is designed to work with at least 1000 points spaced 10-20 meters apart for optimal results.
+ Points can be generated using a road graph.
+
+ Parameters
+ ----------
+ points : gpd.GeoDataFrame
+ GeoDataFrame containing the points from which visibility is calculated.
+ obstacles : gpd.GeoDataFrame
+ GeoDataFrame containing the obstacles that block visibility.
+ view_distance : int
+ The maximum distance from each point within which visibility is calculated.
+ max_workers : int
+ Maximum number of worker processes; defaults to multiprocessing.cpu_count().
+
+ Returns
+ -------
+ gpd.GeoDataFrame
+ GeoDataFrame containing the calculated visibility catchment areas.
+
+ Examples
+ --------
+ >>> import geopandas as gpd
+ >>> from shapely.geometry import Point, Polygon
+ >>> points = gpd.read_file('points.shp')
+ >>> obstacles = gpd.read_file('obstacles.shp')
+ >>> view_distance = 1000
+
+ >>> visibility_areas = calculate_visibility_catchment_area(points, obstacles, view_distance)
+ >>> visibility_areas
+ """
+
+ def filter_geoms(x):
+ if x.geom_type == "GeometryCollection":
+ return MultiPolygon([y for y in x.geoms if y.geom_type in ["Polygon", "MultiPolygon"]])
+ return x
+
+ def calc_group_factor(x):
+ # pylint: disable-next=redefined-outer-name,reimported,import-outside-toplevel
+ import numpy as np
+
+ return np.mean(x.new_ratio) * x.count_n
+
+ def unary_union_groups(x):
+ # pylint: disable-next=redefined-outer-name,reimported,import-outside-toplevel
+ from shapely import MultiPolygon
+
+ # pylint: disable-next=redefined-outer-name,reimported,import-outside-toplevel
+ from shapely.ops import unary_union
+
+ return unary_union(MultiPolygon(list(x["geometry"])).buffer(0))
+
+ pandarallel.initialize(progress_bar=True, verbose=0)
+
+ assert points.crs == obstacles.crs
+ crs = obstacles.crs
+ sectors_n = 12
+ logger.info("Calculating Visibility Catchment Area from each point")
+ all_visions_sectorized = get_visibilities_from_points(points, obstacles, view_distance, sectors_n, max_workers)
+ all_visions_sectorized = gpd.GeoDataFrame(
+ geometry=[item for sublist in all_visions_sectorized for item in sublist], crs=crs
+ )
+ logger.info("Calculating non-vision part...")
+ all_visions_unary = all_visions_sectorized.unary_union
+ convex = all_visions_unary.convex_hull
+ dif = convex.difference(all_visions_unary)
+
+ del convex, all_visions_unary
+
+ buf_area = (math.pi * view_distance**2) / sectors_n
+ all_visions_sectorized["ratio"] = all_visions_sectorized.area / buf_area
+ all_visions_sectorized["ratio"] = _min_max_normalization(
+ all_visions_sectorized["ratio"].values, new_min=1, new_max=10
+ )
+ groups = all_visions_sectorized.sample(frac=1).groupby(all_visions_sectorized.index // 6000)
+ groups = [group for _, group in groups]
+
+ del all_visions_sectorized
+
+ groups_result = process_map(
+ _process_group,
+ groups,
+ desc="Counting intersections in each group...",
+ max_workers=max_workers,
+ )
+ logger.info("Calculating all groups intersection...")
+ all_in = _combine_geometry(gpd.GeoDataFrame(data=pd.concat(groups_result), geometry="geometry", crs=crs))
+
+ del groups_result
+
+ all_in["count_n"] = all_in["index_right"].apply(len)
+
+ logger.info("Calculating intersection's parameters")
+ all_in["factor"] = all_in.parallel_apply(calc_group_factor, axis=1)
+ threshold = all_in["factor"].quantile(0.3)
+ all_in = all_in[all_in["factor"] > threshold]
+
+ all_in["factor_normalized"] = np.round(
+ _min_max_normalization(np.sqrt(all_in["factor"].values), new_min=1, new_max=5)
+ ).astype(int)
+ logger.info("Calculating normalized groups geometry...")
+ all_in = all_in.groupby("factor_normalized").parallel_apply(unary_union_groups).reset_index()
+ all_in = gpd.GeoDataFrame(data=all_in.rename(columns={0: "geometry"}), geometry="geometry", crs=crs)
+
+ all_in = all_in.explode(index_parts=True).reset_index(drop=True)
+ all_in["area"] = all_in.area
+ threshold = all_in["area"].quantile(0.9)
+ all_in = all_in[all_in["area"] > threshold]
+ all_in = all_in.groupby("factor_normalized").apply(unary_union_groups).reset_index()
+ all_in = gpd.GeoDataFrame(data=all_in.rename(columns={0: "geometry"}), geometry="geometry", crs=crs)
+
+ all_in.geometry = all_in.geometry.buffer(20).buffer(-20).difference(dif)
+
+ all_in.sort_values(by="factor_normalized", ascending=False, inplace=True)
+ all_in.reset_index(drop=True, inplace=True)
+ logger.info("Smoothing normalized groups geometry...")
+ for ind, row in all_in.iloc[:-1].iterrows():
+ for ind2 in range(ind + 1, len(all_in)):
+ current_geometry = all_in.at[ind2, "geometry"]
+ all_in.at[ind2, "geometry"] = current_geometry.difference(row.geometry)
+ all_in["geometry"] = all_in["geometry"].apply(filter_geoms)
+
+ all_in = all_in.explode(index_parts=True)
+ logger.info("Done!")
+ return all_in
objectnat-0.0.3/PKG-INFO DELETED
@@ -1,79 +0,0 @@
- Metadata-Version: 2.1
- Name: ObjectNat
- Version: 0.0.3
- Summary:
- Author: Danila
- Author-email: 63115678+DDonnyy@users.noreply.github.com
- Requires-Python: >=3.10,<4.0
- Classifier: Programming Language :: Python :: 3
- Classifier: Programming Language :: Python :: 3.10
- Classifier: Programming Language :: Python :: 3.11
- Classifier: Programming Language :: Python :: 3.12
- Requires-Dist: dongraphio (>=0.2.5,<0.3.0)
- Requires-Dist: geopandas (>=0.14.3,<0.15.0)
- Requires-Dist: momepy (>=0.7.0,<0.8.0)
- Requires-Dist: networkit (>=11.0,<12.0)
- Requires-Dist: networkx (>=3.2.1,<4.0.0)
- Requires-Dist: numpy (>=1.23.5,<2.0.0)
- Requires-Dist: osm2geojson (>=0.2.4,<0.3.0)
- Requires-Dist: osmnx (>=1.9.1,<2.0.0)
- Requires-Dist: pandas (>=2.2.0,<3.0.0)
- Requires-Dist: population-restorator (>=0.2.3,<0.3.0)
- Requires-Dist: provisio (>=0.1.5,<0.2.0)
- Requires-Dist: pydantic (>=2.6.1,<3.0.0)
- Requires-Dist: tqdm (>=4.66.2,<5.0.0)
- Description-Content-Type: text/markdown
-
- # ObjectNat - Meta Library
-
- [![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)
-
-
- <p align="center">
- <img src="./docs/img/logo.png" alt="test-logo" width="400"></a>
- </p>
-
- **ObjectNat** is an open-source library created for geospatial analysis created by IDU team
-
- ## ObjectNat Components
-
- [dongraphio](https://github.com/DDonnyy/dongraphio) : `dongraphio` provides graph functions
- [provisio](https://github.com/DDonnyy/provisio) : `provisio` provides main provisio fuctions
- [population-restorator](https://github.com/kanootoko/population-restorator) : `restorator` provides city resettlement
-
- ## Features and how to use
-
- 1. [City graph from OSM](./examples/graph_generator.ipynb) - Function to assemble a road, pedestrian, and public
- transport graph from OpenStreetMap (OSM) and creating Intermodal graph.
-
- <img src="./docs/img/graphTara.png" alt="city_graph" width="300"></a>
-
- 2. [Adjacency matrix](./examples/calculate_adjacency_matrix.ipynb) - Calculate adjacency matrix based on the provided
- graph and edge weight type (time or distance). The intermodal graph can be obtained using the previous example.
- 3. [Isochrones,transport accessibility](./examples/isochrone_generator.ipynb) - Function for generating isochrones to
- analyze transportation accessibility from specified starting coordinates. Isochrones can be constructed based on
- pedestrian, automobile, or public transport graphs, or a combination thereof.
-
- <img src="./docs/img/isochronesTara.png" alt="isochrones" width="300"></a>
-
- 4. [Population restoration](./examples/restore_population.ipynb) - Function for resettling population into the provided
- layer of residential buildings. This function distributes people among dwellings based on the total city population
- and the living area of each house.
- 5. [Service provision](./examples/calculate_provision.ipynb) - Function for calculating the provision of residential
- buildings and population with services. In case of missing data, this function utilizes previously described
- functionality to retrieve the necessary information.
-
- <img src="./docs/img/provisioTara.png" alt="isochrones" width="300"></a>
-
- ## Installation
-
- **ObjectNat** soon ~~can be installed with~~ ``pip``:
-
- ```
- pip install ObjectNat
- ```
-
- ## Contacts
-
- ## Publications
-
objectnat-0.0.3/README.md DELETED
@@ -1,52 +0,0 @@
- # ObjectNat - Meta Library
-
- [![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)
-
-
- <p align="center">
- <img src="./docs/img/logo.png" alt="test-logo" width="400"></a>
- </p>
-
- **ObjectNat** is an open-source library created for geospatial analysis created by IDU team
-
- ## ObjectNat Components
-
- [dongraphio](https://github.com/DDonnyy/dongraphio) : `dongraphio` provides graph functions
- [provisio](https://github.com/DDonnyy/provisio) : `provisio` provides main provisio fuctions
- [population-restorator](https://github.com/kanootoko/population-restorator) : `restorator` provides city resettlement
-
- ## Features and how to use
-
- 1. [City graph from OSM](./examples/graph_generator.ipynb) - Function to assemble a road, pedestrian, and public
- transport graph from OpenStreetMap (OSM) and creating Intermodal graph.
-
- <img src="./docs/img/graphTara.png" alt="city_graph" width="300"></a>
-
- 2. [Adjacency matrix](./examples/calculate_adjacency_matrix.ipynb) - Calculate adjacency matrix based on the provided
- graph and edge weight type (time or distance). The intermodal graph can be obtained using the previous example.
- 3. [Isochrones,transport accessibility](./examples/isochrone_generator.ipynb) - Function for generating isochrones to
- analyze transportation accessibility from specified starting coordinates. Isochrones can be constructed based on
- pedestrian, automobile, or public transport graphs, or a combination thereof.
-
- <img src="./docs/img/isochronesTara.png" alt="isochrones" width="300"></a>
-
- 4. [Population restoration](./examples/restore_population.ipynb) - Function for resettling population into the provided
- layer of residential buildings. This function distributes people among dwellings based on the total city population
- and the living area of each house.
- 5. [Service provision](./examples/calculate_provision.ipynb) - Function for calculating the provision of residential
- buildings and population with services. In case of missing data, this function utilizes previously described
- functionality to retrieve the necessary information.
-
- <img src="./docs/img/provisioTara.png" alt="isochrones" width="300"></a>
-
- ## Installation
-
- **ObjectNat** soon ~~can be installed with~~ ``pip``:
-
- ```
- pip install ObjectNat
- ```
-
- ## Contacts
-
- ## Publications
@@ -1,34 +0,0 @@
- import geopandas as gpd
- import networkx as nx
- from dongraphio import DonGraphio, GraphType
-
-
- def get_accessibility_isochrones(
- graph_type: list[GraphType],
- x_from: float,
- y_from: float,
- weight_value: int,
- weight_type: str,
- city_graph: nx.MultiDiGraph,
- city_crs: int,
- ) -> tuple[gpd.GeoDataFrame, gpd.GeoDataFrame | None, gpd.GeoDataFrame | None]:
- """
- Calculate accessibility isochrones based on provided city graph for selected graph_type enum
-
- Args:
- graph_type (List[GraphType]): List of graph types to calculate isochrones for.
- x_from (float): X coordinate of the starting point in the corresponding coordinate system.
- y_from (float): Y coordinate of the starting point in the corresponding coordinate system.
- weight_value (int): Weight value for the accessibility calculations.
- weight_type (str): The type of the weight, could be only "time_min" or "length_meter".
- city_graph (nx.MultiDiGraph): The graph representing the city.
- city_crs (int): The CRS (Coordinate Reference System) for the city.
-
- Returns:
- Tuple[gpd.GeoDataFrame, Optional[gpd.GeoDataFrame], Optional[gpd.GeoDataFrame]]:
- The accessibility isochrones, optional routes and stops.
-
- """
- dongraphio = DonGraphio(city_crs)
- dongraphio.set_graph(city_graph)
- return dongraphio.get_accessibility_isochrones(graph_type, x_from, y_from, weight_value, weight_type)