stac-fastapi-elasticsearch 4.0.0a1__tar.gz → 4.1.0__tar.gz
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- {stac_fastapi_elasticsearch-4.0.0a1 → stac_fastapi_elasticsearch-4.1.0}/PKG-INFO +20 -7
- {stac_fastapi_elasticsearch-4.0.0a1 → stac_fastapi_elasticsearch-4.1.0}/README.md +19 -6
- {stac_fastapi_elasticsearch-4.0.0a1 → stac_fastapi_elasticsearch-4.1.0}/setup.py +1 -1
- {stac_fastapi_elasticsearch-4.0.0a1 → stac_fastapi_elasticsearch-4.1.0}/stac_fastapi/elasticsearch/app.py +15 -8
- {stac_fastapi_elasticsearch-4.0.0a1 → stac_fastapi_elasticsearch-4.1.0}/stac_fastapi/elasticsearch/config.py +39 -9
- {stac_fastapi_elasticsearch-4.0.0a1 → stac_fastapi_elasticsearch-4.1.0}/stac_fastapi/elasticsearch/database_logic.py +263 -71
- stac_fastapi_elasticsearch-4.1.0/stac_fastapi/elasticsearch/version.py +2 -0
- {stac_fastapi_elasticsearch-4.0.0a1 → stac_fastapi_elasticsearch-4.1.0}/stac_fastapi_elasticsearch.egg-info/PKG-INFO +20 -7
- {stac_fastapi_elasticsearch-4.0.0a1 → stac_fastapi_elasticsearch-4.1.0}/stac_fastapi_elasticsearch.egg-info/requires.txt +1 -1
- stac_fastapi_elasticsearch-4.0.0a1/stac_fastapi/elasticsearch/version.py +0 -2
- {stac_fastapi_elasticsearch-4.0.0a1 → stac_fastapi_elasticsearch-4.1.0}/setup.cfg +0 -0
- {stac_fastapi_elasticsearch-4.0.0a1 → stac_fastapi_elasticsearch-4.1.0}/stac_fastapi/elasticsearch/__init__.py +0 -0
- {stac_fastapi_elasticsearch-4.0.0a1 → stac_fastapi_elasticsearch-4.1.0}/stac_fastapi_elasticsearch.egg-info/SOURCES.txt +0 -0
- {stac_fastapi_elasticsearch-4.0.0a1 → stac_fastapi_elasticsearch-4.1.0}/stac_fastapi_elasticsearch.egg-info/dependency_links.txt +0 -0
- {stac_fastapi_elasticsearch-4.0.0a1 → stac_fastapi_elasticsearch-4.1.0}/stac_fastapi_elasticsearch.egg-info/entry_points.txt +0 -0
- {stac_fastapi_elasticsearch-4.0.0a1 → stac_fastapi_elasticsearch-4.1.0}/stac_fastapi_elasticsearch.egg-info/not-zip-safe +0 -0
- {stac_fastapi_elasticsearch-4.0.0a1 → stac_fastapi_elasticsearch-4.1.0}/stac_fastapi_elasticsearch.egg-info/top_level.txt +0 -0
{stac_fastapi_elasticsearch-4.0.0a1 → stac_fastapi_elasticsearch-4.1.0}/PKG-INFO

@@ -1,6 +1,6 @@
 Metadata-Version: 2.1
 Name: stac_fastapi_elasticsearch
-Version: 4.0.0a1
+Version: 4.1.0
 Summary: An implementation of STAC API based on the FastAPI framework with both Elasticsearch and Opensearch.
 Home-page: https://github.com/stac-utils/stac-fastapi-elasticsearch-opensearch
 License: MIT
@@ -50,8 +50,18 @@ Provides-Extra: server
 - There is [Postman](https://documenter.getpostman.com/view/12888943/2s8ZDSdRHA) documentation here for examples on how to run some of the API routes locally - after starting the elasticsearch backend via the compose.yml file.
 - The `/examples` folder shows an example of running stac-fastapi-elasticsearch from PyPI in docker without needing any code from the repository. There is also a Postman collection here that you can load into Postman for testing the API routes.
 
-
-
+
+### Performance Note
+
+The `enable_direct_response` option is provided by the stac-fastapi core library (introduced in stac-fastapi 5.2.0) and is available in this project starting from v4.0.0.
+
+**You can now control this setting via the `ENABLE_DIRECT_RESPONSE` environment variable.**
+
+When enabled (`ENABLE_DIRECT_RESPONSE=true`), endpoints return Starlette Response objects directly, bypassing FastAPI's default serialization for improved performance. **However, all FastAPI dependencies (including authentication, custom status codes, and validation) are disabled for all routes.**
+
+This mode is best suited for public or read-only APIs where authentication and custom logic are not required. Default is `false` for safety.
+
+See: [issue #347](https://github.com/stac-utils/stac-fastapi-elasticsearch-opensearch/issues/347)
 
 
 ### To install from PyPI:
@@ -95,8 +105,9 @@ If you wish to use a different version, put the following in a
 file named `.env` in the same directory you run Docker Compose from:
 
 ```shell
-ELASTICSEARCH_VERSION=
-OPENSEARCH_VERSION=2.11.
+ELASTICSEARCH_VERSION=8.11.0
+OPENSEARCH_VERSION=2.11.1
+ENABLE_DIRECT_RESPONSE=false
 ```
 The most recent Elasticsearch 7.x versions should also work. See the [opensearch-py docs](https://github.com/opensearch-project/opensearch-py/blob/main/COMPATIBILITY.md) for compatibility information.
 
@@ -121,8 +132,10 @@ You can customize additional settings in your `.env` file:
 | `RELOAD` | Enable auto-reload for development. | `true` | Optional |
 | `STAC_FASTAPI_RATE_LIMIT` | API rate limit per client. | `200/minute` | Optional |
 | `BACKEND` | Tests-related variable | `elasticsearch` or `opensearch` based on the backend | Optional |
-| `ELASTICSEARCH_VERSION`
-| `
+| `ELASTICSEARCH_VERSION` | Version of Elasticsearch to use. | `8.11.0` | Optional |
+| `ENABLE_DIRECT_RESPONSE` | Enable direct response for maximum performance (disables all FastAPI dependencies, including authentication, custom status codes, and validation) | `false` | Optional |
+| `OPENSEARCH_VERSION` | OpenSearch version | `2.11.1` | Optional |
+| `RAISE_ON_BULK_ERROR` | Controls whether bulk insert operations raise exceptions on errors. If set to `true`, the operation will stop and raise an exception when an error occurs. If set to `false`, errors will be logged, and the operation will continue. **Note:** STAC Item and ItemCollection validation errors will always raise, regardless of this flag. | `false` | Optional |
 
 > [!NOTE]
 > The variables `ES_HOST`, `ES_PORT`, `ES_USE_SSL`, and `ES_VERIFY_CERTS` apply to both Elasticsearch and OpenSearch backends, so there is no need to rename the key names to `OS_` even if you're using OpenSearch.
{stac_fastapi_elasticsearch-4.0.0a1 → stac_fastapi_elasticsearch-4.1.0}/README.md

@@ -29,8 +29,18 @@
 - There is [Postman](https://documenter.getpostman.com/view/12888943/2s8ZDSdRHA) documentation here for examples on how to run some of the API routes locally - after starting the elasticsearch backend via the compose.yml file.
 - The `/examples` folder shows an example of running stac-fastapi-elasticsearch from PyPI in docker without needing any code from the repository. There is also a Postman collection here that you can load into Postman for testing the API routes.
 
-
-
+
+### Performance Note
+
+The `enable_direct_response` option is provided by the stac-fastapi core library (introduced in stac-fastapi 5.2.0) and is available in this project starting from v4.0.0.
+
+**You can now control this setting via the `ENABLE_DIRECT_RESPONSE` environment variable.**
+
+When enabled (`ENABLE_DIRECT_RESPONSE=true`), endpoints return Starlette Response objects directly, bypassing FastAPI's default serialization for improved performance. **However, all FastAPI dependencies (including authentication, custom status codes, and validation) are disabled for all routes.**
+
+This mode is best suited for public or read-only APIs where authentication and custom logic are not required. Default is `false` for safety.
+
+See: [issue #347](https://github.com/stac-utils/stac-fastapi-elasticsearch-opensearch/issues/347)
 
 
 ### To install from PyPI:
@@ -74,8 +84,9 @@ If you wish to use a different version, put the following in a
 file named `.env` in the same directory you run Docker Compose from:
 
 ```shell
-ELASTICSEARCH_VERSION=
-OPENSEARCH_VERSION=2.11.
+ELASTICSEARCH_VERSION=8.11.0
+OPENSEARCH_VERSION=2.11.1
+ENABLE_DIRECT_RESPONSE=false
 ```
 The most recent Elasticsearch 7.x versions should also work. See the [opensearch-py docs](https://github.com/opensearch-project/opensearch-py/blob/main/COMPATIBILITY.md) for compatibility information.
 
@@ -100,8 +111,10 @@ You can customize additional settings in your `.env` file:
 | `RELOAD` | Enable auto-reload for development. | `true` | Optional |
 | `STAC_FASTAPI_RATE_LIMIT` | API rate limit per client. | `200/minute` | Optional |
 | `BACKEND` | Tests-related variable | `elasticsearch` or `opensearch` based on the backend | Optional |
-| `ELASTICSEARCH_VERSION`
-| `
+| `ELASTICSEARCH_VERSION` | Version of Elasticsearch to use. | `8.11.0` | Optional |
+| `ENABLE_DIRECT_RESPONSE` | Enable direct response for maximum performance (disables all FastAPI dependencies, including authentication, custom status codes, and validation) | `false` | Optional |
+| `OPENSEARCH_VERSION` | OpenSearch version | `2.11.1` | Optional |
+| `RAISE_ON_BULK_ERROR` | Controls whether bulk insert operations raise exceptions on errors. If set to `true`, the operation will stop and raise an exception when an error occurs. If set to `false`, errors will be logged, and the operation will continue. **Note:** STAC Item and ItemCollection validation errors will always raise, regardless of this flag. | `false` | Optional |
 
 > [!NOTE]
 > The variables `ES_HOST`, `ES_PORT`, `ES_USE_SSL`, and `ES_VERIFY_CERTS` apply to both Elasticsearch and OpenSearch backends, so there is no need to rename the key names to `OS_` even if you're using OpenSearch.
{stac_fastapi_elasticsearch-4.0.0a1 → stac_fastapi_elasticsearch-4.1.0}/stac_fastapi/elasticsearch/app.py

@@ -1,6 +1,9 @@
 """FastAPI application."""
 
 import os
+from contextlib import asynccontextmanager
+
+from fastapi import FastAPI
 
 from stac_fastapi.api.app import StacApi
 from stac_fastapi.api.models import create_get_request_model, create_post_request_model
@@ -87,7 +90,7 @@ post_request_model = create_post_request_model(search_extensions)
 api = StacApi(
     title=os.getenv("STAC_FASTAPI_TITLE", "stac-fastapi-elasticsearch"),
     description=os.getenv("STAC_FASTAPI_DESCRIPTION", "stac-fastapi-elasticsearch"),
-    api_version=os.getenv("STAC_FASTAPI_VERSION", "
+    api_version=os.getenv("STAC_FASTAPI_VERSION", "4.1.0"),
     settings=settings,
     extensions=extensions,
     client=CoreClient(
@@ -97,17 +100,21 @@ api = StacApi(
     search_post_request_model=post_request_model,
     route_dependencies=get_route_dependencies(),
 )
-app = api.app
-app.root_path = os.getenv("STAC_FASTAPI_ROOT_PATH", "")
-
-# Add rate limit
-setup_rate_limit(app, rate_limit=os.getenv("STAC_FASTAPI_RATE_LIMIT"))
 
 
-@
-async def
+@asynccontextmanager
+async def lifespan(app: FastAPI):
+    """Lifespan handler for FastAPI app. Initializes index templates and collections at startup."""
     await create_index_templates()
     await create_collection_index()
+    yield
+
+
+app = api.app
+app.router.lifespan_context = lifespan
+app.root_path = os.getenv("STAC_FASTAPI_ROOT_PATH", "")
+# Add rate limit
+setup_rate_limit(app, rate_limit=os.getenv("STAC_FASTAPI_RATE_LIMIT"))
 
 
 def run() -> None:
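The app.py hunk above replaces the removed startup hook with an explicit lifespan context attached to an already-built app. A minimal sketch of the same wiring outside this package (the startup body is a placeholder for `create_index_templates()`/`create_collection_index()`):

```python
from contextlib import asynccontextmanager

from fastapi import FastAPI


@asynccontextmanager
async def lifespan(app: FastAPI):
    # Startup work runs before the first request is served,
    # e.g. creating index templates and the collections index.
    yield
    # Optional shutdown work would run after the yield.


app = FastAPI()
# Attach the handler to an already-constructed app, as done in app.py above;
# FastAPI(lifespan=lifespan) at construction time is the more common form.
app.router.lifespan_context = lifespan
```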
{stac_fastapi_elasticsearch-4.0.0a1 → stac_fastapi_elasticsearch-4.1.0}/stac_fastapi/elasticsearch/config.py

@@ -1,19 +1,22 @@
 """API configuration."""
 
+import logging
 import os
 import ssl
 from typing import Any, Dict, Set
 
 import certifi
+from elasticsearch._async.client import AsyncElasticsearch
 
-from elasticsearch import
+from elasticsearch import Elasticsearch  # type: ignore[attr-defined]
 from stac_fastapi.core.base_settings import ApiBaseSettings
+from stac_fastapi.core.utilities import get_bool_env
 from stac_fastapi.types.config import ApiSettings
 
 
 def _es_config() -> Dict[str, Any]:
     # Determine the scheme (http or https)
-    use_ssl =
+    use_ssl = get_bool_env("ES_USE_SSL", default=True)
     scheme = "https" if use_ssl else "http"
 
     # Configure the hosts parameter with the correct scheme
@@ -44,7 +47,7 @@ def _es_config() -> Dict[str, Any]:
 
     config["headers"] = headers
 
-    http_compress =
+    http_compress = get_bool_env("ES_HTTP_COMPRESS", default=True)
     if http_compress:
         config["http_compress"] = True
 
@@ -53,8 +56,8 @@ def _es_config() -> Dict[str, Any]:
         return config
 
     # Include SSL settings if using https
-    config["ssl_version"] = ssl.TLSVersion.TLSv1_3
-    config["verify_certs"] =
+    config["ssl_version"] = ssl.TLSVersion.TLSv1_3
+    config["verify_certs"] = get_bool_env("ES_VERIFY_CERTS", default=True)
 
     # Include CA Certificates if verifying certs
     if config["verify_certs"]:
@@ -71,11 +74,19 @@ _forbidden_fields: Set[str] = {"type"}
 
 
 class ElasticsearchSettings(ApiSettings, ApiBaseSettings):
-    """
+    """
+    API settings.
+
+    Set enable_direct_response via the ENABLE_DIRECT_RESPONSE environment variable.
+    If enabled, all API routes use direct response for maximum performance, but ALL FastAPI dependencies (including authentication, custom status codes, and validation) are disabled.
+    Default is False for safety.
+    """
 
-    # Fields which are defined by STAC but not included in the database model
     forbidden_fields: Set[str] = _forbidden_fields
     indexed_fields: Set[str] = {"datetime"}
+    enable_response_models: bool = False
+    enable_direct_response: bool = get_bool_env("ENABLE_DIRECT_RESPONSE", default=False)
+    raise_on_bulk_error: bool = get_bool_env("RAISE_ON_BULK_ERROR", default=False)
 
     @property
     def create_client(self):
@@ -84,13 +95,32 @@ class ElasticsearchSettings(ApiSettings, ApiBaseSettings):
 
 
 class AsyncElasticsearchSettings(ApiSettings, ApiBaseSettings):
-    """
+    """
+    API settings.
+
+    Set enable_direct_response via the ENABLE_DIRECT_RESPONSE environment variable.
+    If enabled, all API routes use direct response for maximum performance, but ALL FastAPI dependencies (including authentication, custom status codes, and validation) are disabled.
+    Default is False for safety.
+    """
 
-    # Fields which are defined by STAC but not included in the database model
     forbidden_fields: Set[str] = _forbidden_fields
     indexed_fields: Set[str] = {"datetime"}
+    enable_response_models: bool = False
+    enable_direct_response: bool = get_bool_env("ENABLE_DIRECT_RESPONSE", default=False)
+    raise_on_bulk_error: bool = get_bool_env("RAISE_ON_BULK_ERROR", default=False)
 
     @property
     def create_client(self):
         """Create async elasticsearch client."""
         return AsyncElasticsearch(**_es_config())
+
+
+# Warn at import if direct response is enabled (applies to either settings class)
+if (
+    ElasticsearchSettings().enable_direct_response
+    or AsyncElasticsearchSettings().enable_direct_response
+):
+    logging.basicConfig(level=logging.WARNING)
+    logging.warning(
+        "ENABLE_DIRECT_RESPONSE is True: All FastAPI dependencies (including authentication) are DISABLED for all routes!"
+    )
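config.py now routes every boolean environment variable (`ES_USE_SSL`, `ES_HTTP_COMPRESS`, `ES_VERIFY_CERTS`, `ENABLE_DIRECT_RESPONSE`, `RAISE_ON_BULK_ERROR`) through `get_bool_env` from stac-fastapi core. The real helper lives upstream; a rough stand-in with the usual semantics (an assumption, not the library's code) looks like:

```python
import os


def parse_bool_env(name: str, default: bool = False) -> bool:
    """Illustrative stand-in for stac_fastapi.core.utilities.get_bool_env;
    the real helper may accept a different set of truthy spellings."""
    raw = os.getenv(name)
    if raw is None:
        return default
    return raw.strip().lower() in {"1", "true", "yes", "on"}


# Example: with ES_USE_SSL unset, the default of True (as used in _es_config above) applies.
print(parse_bool_env("ES_USE_SSL", default=True))
```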
{stac_fastapi_elasticsearch-4.0.0a1 → stac_fastapi_elasticsearch-4.1.0}/stac_fastapi/elasticsearch/database_logic.py

@@ -8,10 +8,11 @@ from copy import deepcopy
 from typing import Any, Dict, Iterable, List, Optional, Tuple, Type
 
 import attr
+import elasticsearch.helpers as helpers
 from elasticsearch.dsl import Q, Search
+from elasticsearch.exceptions import NotFoundError as ESNotFoundError
 from starlette.requests import Request
 
-from elasticsearch import exceptions, helpers  # type: ignore
 from stac_fastapi.core.base_database_logic import BaseDatabaseLogic
 from stac_fastapi.core.database_logic import (
     COLLECTIONS_INDEX,
@@ -50,19 +51,18 @@ async def create_index_templates() -> None:
 
     """
     client = AsyncElasticsearchSettings().create_client
-    await client.indices.
+    await client.indices.put_index_template(
         name=f"template_{COLLECTIONS_INDEX}",
         body={
             "index_patterns": [f"{COLLECTIONS_INDEX}*"],
-            "mappings": ES_COLLECTIONS_MAPPINGS,
+            "template": {"mappings": ES_COLLECTIONS_MAPPINGS},
         },
     )
-    await client.indices.
+    await client.indices.put_index_template(
         name=f"template_{ITEMS_INDEX_PREFIX}",
         body={
             "index_patterns": [f"{ITEMS_INDEX_PREFIX}*"],
-            "settings": ES_ITEMS_SETTINGS,
-            "mappings": ES_ITEMS_MAPPINGS,
+            "template": {"settings": ES_ITEMS_SETTINGS, "mappings": ES_ITEMS_MAPPINGS},
         },
     )
     await client.close()
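The hunk above switches to composable index templates (`put_index_template`), which is why the mappings and settings now sit under a `"template"` key. A hedged sketch of verifying the result against an Elasticsearch 8.x cluster; the template names mirror the f-strings above, but the concrete prefixes are assumptions:

```python
from elasticsearch import AsyncElasticsearch


async def templates_exist(client: AsyncElasticsearch) -> bool:
    # Composable templates created with put_index_template are visible to
    # exists_index_template; legacy templates from put_template are not.
    for name in ("template_collections", "template_items_"):  # assumed names
        if not await client.indices.exists_index_template(name=name):
            return False
    return True
```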
@@ -80,7 +80,7 @@ async def create_collection_index() -> None:
 
     await client.options(ignore_status=400).indices.create(
         index=f"{COLLECTIONS_INDEX}-000001",
-
+        body={"aliases": {COLLECTIONS_INDEX: {}}},
     )
     await client.close()
 
@@ -100,7 +100,7 @@ async def create_item_index(collection_id: str):
 
     await client.options(ignore_status=400).indices.create(
         index=f"{index_by_collection_id(collection_id)}-000001",
-
+        body={"aliases": {index_alias_by_collection_id(collection_id): {}}},
     )
     await client.close()
 
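Both index-creation hunks keep the same pattern: a numbered physical index (`*-000001`) created behind an alias, with `ignore_status=400` making the call idempotent. A standalone sketch of that pattern (the alias name is a placeholder, not a value from this release):

```python
from elasticsearch import AsyncElasticsearch


async def ensure_aliased_index(client: AsyncElasticsearch, alias: str) -> None:
    # Create <alias>-000001 and point the alias at it; a 400 from an
    # already-existing index is swallowed so repeated startups stay idempotent.
    await client.options(ignore_status=400).indices.create(
        index=f"{alias}-000001",
        body={"aliases": {alias: {}}},
    )
```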
@@ -128,8 +128,20 @@ async def delete_item_index(collection_id: str):
 class DatabaseLogic(BaseDatabaseLogic):
     """Database logic."""
 
-
-
+    async_settings: AsyncElasticsearchSettings = attr.ib(
+        factory=AsyncElasticsearchSettings
+    )
+    sync_settings: SyncElasticsearchSettings = attr.ib(
+        factory=SyncElasticsearchSettings
+    )
+
+    client = attr.ib(init=False)
+    sync_client = attr.ib(init=False)
+
+    def __attrs_post_init__(self):
+        """Initialize clients after the class is instantiated."""
+        self.client = self.async_settings.create_client
+        self.sync_client = self.sync_settings.create_client
 
     item_serializer: Type[ItemSerializer] = attr.ib(default=ItemSerializer)
     collection_serializer: Type[CollectionSerializer] = attr.ib(
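The DatabaseLogic hunk above moves client construction into `__attrs_post_init__`, so the settings objects (and therefore flags like `RAISE_ON_BULK_ERROR`) are resolved per instance. The attrs pattern in isolation, with stand-in types:

```python
import attr


@attr.s
class Holder:
    settings = attr.ib(factory=dict)  # built per instance via a factory
    client = attr.ib(init=False)      # excluded from the generated __init__

    def __attrs_post_init__(self):
        # Runs after the attrs-generated __init__, mirroring how DatabaseLogic
        # derives self.client / self.sync_client from its settings objects.
        self.client = {"derived_from": self.settings}


print(Holder().client)
```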
@@ -272,7 +284,7 @@ class DatabaseLogic(BaseDatabaseLogic):
                 index=index_alias_by_collection_id(collection_id),
                 id=mk_item_id(item_id, collection_id),
             )
-        except
+        except ESNotFoundError:
             raise NotFoundError(
                 f"Item {item_id} does not exist inside Collection {collection_id}"
             )
@@ -294,8 +306,8 @@ class DatabaseLogic(BaseDatabaseLogic):
         return search.filter("terms", collection=collection_ids)
 
     @staticmethod
-    def apply_datetime_filter(search: Search, datetime_search):
-        """Apply a filter to search
+    def apply_datetime_filter(search: Search, datetime_search: dict):
+        """Apply a filter to search on datetime, start_datetime, and end_datetime fields.
 
         Args:
             search (Search): The search object to filter.
@@ -304,17 +316,109 @@ class DatabaseLogic(BaseDatabaseLogic):
         Returns:
             Search: The filtered search object.
         """
+        should = []
+
+        # If the request is a single datetime return
+        # items with datetimes equal to the requested datetime OR
+        # the requested datetime is between their start and end datetimes
         if "eq" in datetime_search:
-
-
+            should.extend(
+                [
+                    Q(
+                        "bool",
+                        filter=[
+                            Q(
+                                "term",
+                                properties__datetime=datetime_search["eq"],
+                            ),
+                        ],
+                    ),
+                    Q(
+                        "bool",
+                        filter=[
+                            Q(
+                                "range",
+                                properties__start_datetime={
+                                    "lte": datetime_search["eq"],
+                                },
+                            ),
+                            Q(
+                                "range",
+                                properties__end_datetime={
+                                    "gte": datetime_search["eq"],
+                                },
+                            ),
+                        ],
+                    ),
+                ]
             )
+
+        # If the request is a date range return
+        # items with datetimes within the requested date range OR
+        # their start datetime within the requested date range OR
+        # their end datetime within the requested date range OR
+        # the requested daterange within their start and end datetimes
         else:
-
-
-
-
-
+            should.extend(
+                [
+                    Q(
+                        "bool",
+                        filter=[
+                            Q(
+                                "range",
+                                properties__datetime={
+                                    "gte": datetime_search["gte"],
+                                    "lte": datetime_search["lte"],
+                                },
+                            ),
+                        ],
+                    ),
+                    Q(
+                        "bool",
+                        filter=[
+                            Q(
+                                "range",
+                                properties__start_datetime={
+                                    "gte": datetime_search["gte"],
+                                    "lte": datetime_search["lte"],
+                                },
+                            ),
+                        ],
+                    ),
+                    Q(
+                        "bool",
+                        filter=[
+                            Q(
+                                "range",
+                                properties__end_datetime={
+                                    "gte": datetime_search["gte"],
+                                    "lte": datetime_search["lte"],
+                                },
+                            ),
+                        ],
+                    ),
+                    Q(
+                        "bool",
+                        filter=[
+                            Q(
+                                "range",
+                                properties__start_datetime={
+                                    "lte": datetime_search["gte"]
+                                },
+                            ),
+                            Q(
+                                "range",
+                                properties__end_datetime={
+                                    "gte": datetime_search["lte"]
+                                },
+                            ),
+                        ],
+                    ),
+                ]
             )
+
+        search = search.query(Q("bool", filter=[Q("bool", should=should)]))
+
         return search
 
     @staticmethod
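The rewritten `apply_datetime_filter` collects the alternatives into a single bool/should query. A condensed sketch of the query shape for a closed range, using the same `elasticsearch.dsl` Q objects imported above (the `properties__x` keyword form maps to the `properties.x` field path):

```python
from elasticsearch.dsl import Q

datetime_search = {"gte": "2024-01-01T00:00:00Z", "lte": "2024-01-31T23:59:59Z"}

q = Q(
    "bool",
    should=[
        Q("bool", filter=[Q("range", properties__datetime=datetime_search)]),
        Q("bool", filter=[Q("range", properties__start_datetime=datetime_search)]),
        Q("bool", filter=[Q("range", properties__end_datetime=datetime_search)]),
        Q(
            "bool",
            filter=[
                Q("range", properties__start_datetime={"lte": datetime_search["gte"]}),
                Q("range", properties__end_datetime={"gte": datetime_search["lte"]}),
            ],
        ),
    ],
)
# One alternative is enough to match: datetime in range, either bound in range,
# or the requested range fully spanned by start/end datetimes.
print(q.to_dict())
```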
@@ -512,7 +616,7 @@ class DatabaseLogic(BaseDatabaseLogic):
 
         try:
             es_response = await search_task
-        except
+        except ESNotFoundError:
             raise NotFoundError(f"Collections '{collection_ids}' do not exist")
 
         hits = es_response["hits"]["hits"]
@@ -595,7 +699,7 @@ class DatabaseLogic(BaseDatabaseLogic):
 
         try:
             db_response = await search_task
-        except
+        except ESNotFoundError:
             raise NotFoundError(f"Collections '{collection_ids}' do not exist")
 
         return db_response
@@ -607,7 +711,7 @@ class DatabaseLogic(BaseDatabaseLogic):
         if not await self.client.exists(index=COLLECTIONS_INDEX, id=collection_id):
             raise NotFoundError(f"Collection {collection_id} does not exist")
 
-    async def
+    async def async_prep_create_item(
         self, item: Item, base_url: str, exist_ok: bool = False
     ) -> Item:
         """
@@ -637,44 +741,114 @@ class DatabaseLogic(BaseDatabaseLogic):
 
         return self.item_serializer.stac_to_db(item, base_url)
 
-    def
+    async def bulk_async_prep_create_item(
         self, item: Item, base_url: str, exist_ok: bool = False
     ) -> Item:
         """
         Prepare an item for insertion into the database.
 
-        This method performs pre-insertion preparation on the given `item`,
-
-
+        This method performs pre-insertion preparation on the given `item`, such as:
+        - Verifying that the collection the item belongs to exists.
+        - Optionally checking if an item with the same ID already exists in the database.
+        - Serializing the item into a database-compatible format.
 
         Args:
-            item (Item): The item to be
-            base_url (str): The base URL used
-            exist_ok (bool): Indicates whether the item can exist
+            item (Item): The item to be prepared for insertion.
+            base_url (str): The base URL used to construct the item's self URL.
+            exist_ok (bool): Indicates whether the item can already exist in the database.
+                If False, a `ConflictError` is raised if the item exists.
 
         Returns:
-            Item: The item
+            Item: The prepared item, serialized into a database-compatible format.
 
         Raises:
             NotFoundError: If the collection that the item belongs to does not exist in the database.
-            ConflictError: If an item with the same ID already exists in the collection
+            ConflictError: If an item with the same ID already exists in the collection and `exist_ok` is False,
+                and `RAISE_ON_BULK_ERROR` is set to `true`.
         """
-
-        collection_id = item["collection"]
-        if not self.sync_client.exists(index=COLLECTIONS_INDEX, id=collection_id):
-            raise NotFoundError(f"Collection {collection_id} does not exist")
+        logger.debug(f"Preparing item {item['id']} in collection {item['collection']}.")
 
-        if
-
-
+        # Check if the collection exists
+        await self.check_collection_exists(collection_id=item["collection"])
+
+        # Check if the item already exists in the database
+        if not exist_ok and await self.client.exists(
+            index=index_alias_by_collection_id(item["collection"]),
+            id=mk_item_id(item["id"], item["collection"]),
         ):
-
-                f"Item {
+            error_message = (
+                f"Item {item['id']} in collection {item['collection']} already exists."
             )
+            if self.async_settings.raise_on_bulk_error:
+                raise ConflictError(error_message)
+            else:
+                logger.warning(
+                    f"{error_message} Continuing as `RAISE_ON_BULK_ERROR` is set to false."
+                )
+
+        # Serialize the item into a database-compatible format
+        prepped_item = self.item_serializer.stac_to_db(item, base_url)
+        logger.debug(f"Item {item['id']} prepared successfully.")
+        return prepped_item
+
+    def bulk_sync_prep_create_item(
+        self, item: Item, base_url: str, exist_ok: bool = False
+    ) -> Item:
+        """
+        Prepare an item for insertion into the database.
 
-
+        This method performs pre-insertion preparation on the given `item`, such as:
+        - Verifying that the collection the item belongs to exists.
+        - Optionally checking if an item with the same ID already exists in the database.
+        - Serializing the item into a database-compatible format.
+
+        Args:
+            item (Item): The item to be prepared for insertion.
+            base_url (str): The base URL used to construct the item's self URL.
+            exist_ok (bool): Indicates whether the item can already exist in the database.
+                If False, a `ConflictError` is raised if the item exists.
 
-
+        Returns:
+            Item: The prepared item, serialized into a database-compatible format.
+
+        Raises:
+            NotFoundError: If the collection that the item belongs to does not exist in the database.
+            ConflictError: If an item with the same ID already exists in the collection and `exist_ok` is False,
+                and `RAISE_ON_BULK_ERROR` is set to `true`.
+        """
+        logger.debug(f"Preparing item {item['id']} in collection {item['collection']}.")
+
+        # Check if the collection exists
+        if not self.sync_client.exists(index=COLLECTIONS_INDEX, id=item["collection"]):
+            raise NotFoundError(f"Collection {item['collection']} does not exist")
+
+        # Check if the item already exists in the database
+        if not exist_ok and self.sync_client.exists(
+            index=index_alias_by_collection_id(item["collection"]),
+            id=mk_item_id(item["id"], item["collection"]),
+        ):
+            error_message = (
+                f"Item {item['id']} in collection {item['collection']} already exists."
+            )
+            if self.sync_settings.raise_on_bulk_error:
+                raise ConflictError(error_message)
+            else:
+                logger.warning(
+                    f"{error_message} Continuing as `RAISE_ON_BULK_ERROR` is set to false."
+                )
+
+        # Serialize the item into a database-compatible format
+        prepped_item = self.item_serializer.stac_to_db(item, base_url)
+        logger.debug(f"Item {item['id']} prepared successfully.")
+        return prepped_item
+
+    async def create_item(
+        self,
+        item: Item,
+        refresh: bool = False,
+        base_url: str = "",
+        exist_ok: bool = False,
+    ):
         """Database logic for creating one item.
 
         Args:
@@ -690,18 +864,16 @@ class DatabaseLogic(BaseDatabaseLogic):
         # todo: check if collection exists, but cache
         item_id = item["id"]
         collection_id = item["collection"]
-
+        item = await self.async_prep_create_item(
+            item=item, base_url=base_url, exist_ok=exist_ok
+        )
+        await self.client.index(
             index=index_alias_by_collection_id(collection_id),
             id=mk_item_id(item_id, collection_id),
             document=item,
             refresh=refresh,
         )
 
-        if (meta := es_resp.get("meta")) and meta.get("status") == 409:
-            raise ConflictError(
-                f"Item {item_id} in collection {collection_id} already exists"
-            )
-
     async def delete_item(
         self, item_id: str, collection_id: str, refresh: bool = False
     ):
@@ -721,7 +893,7 @@ class DatabaseLogic(BaseDatabaseLogic):
                 id=mk_item_id(item_id, collection_id),
                 refresh=refresh,
             )
-        except
+        except ESNotFoundError:
             raise NotFoundError(
                 f"Item {item_id} in collection {collection_id} not found"
             )
@@ -741,7 +913,7 @@ class DatabaseLogic(BaseDatabaseLogic):
                 index=index_name, allow_no_indices=False
             )
             return mapping.body
-        except
+        except ESNotFoundError:
             raise NotFoundError(f"Mapping for index {index_name} not found")
 
     async def create_collection(self, collection: Collection, refresh: bool = False):
@@ -792,7 +964,7 @@ class DatabaseLogic(BaseDatabaseLogic):
             collection = await self.client.get(
                 index=COLLECTIONS_INDEX, id=collection_id
             )
-        except
+        except ESNotFoundError:
             raise NotFoundError(f"Collection {collection_id} not found")
 
         return collection["_source"]
@@ -867,52 +1039,72 @@ class DatabaseLogic(BaseDatabaseLogic):
             await delete_item_index(collection_id)
 
     async def bulk_async(
-        self,
-
-
+        self,
+        collection_id: str,
+        processed_items: List[Item],
+        refresh: bool = False,
+    ) -> Tuple[int, List[Dict[str, Any]]]:
+        """
+        Perform a bulk insert of items into the database asynchronously.
 
         Args:
-            self: The instance of the object calling this function.
             collection_id (str): The ID of the collection to which the items belong.
             processed_items (List[Item]): A list of `Item` objects to be inserted into the database.
             refresh (bool): Whether to refresh the index after the bulk insert (default: False).
 
+        Returns:
+            Tuple[int, List[Dict[str, Any]]]: A tuple containing:
+                - The number of successfully processed actions (`success`).
+                - A list of errors encountered during the bulk operation (`errors`).
+
         Notes:
-            This function performs a bulk insert of `processed_items` into the database using the specified `collection_id`.
-            insert is performed asynchronously, and the event loop is used to run the operation in a separate executor.
-            `mk_actions` function is called to generate a list of actions for the bulk insert. If `refresh` is set to True,
-            index is refreshed after the bulk insert.
+            This function performs a bulk insert of `processed_items` into the database using the specified `collection_id`.
+            The insert is performed asynchronously, and the event loop is used to run the operation in a separate executor.
+            The `mk_actions` function is called to generate a list of actions for the bulk insert. If `refresh` is set to True,
+            the index is refreshed after the bulk insert.
         """
-
+        raise_on_error = self.async_settings.raise_on_bulk_error
+        success, errors = await helpers.async_bulk(
             self.client,
             mk_actions(collection_id, processed_items),
             refresh=refresh,
-            raise_on_error=
+            raise_on_error=raise_on_error,
         )
+        return success, errors
 
     def bulk_sync(
-        self,
-
-
+        self,
+        collection_id: str,
+        processed_items: List[Item],
+        refresh: bool = False,
+    ) -> Tuple[int, List[Dict[str, Any]]]:
+        """
+        Perform a bulk insert of items into the database synchronously.
 
         Args:
-            self: The instance of the object calling this function.
             collection_id (str): The ID of the collection to which the items belong.
            processed_items (List[Item]): A list of `Item` objects to be inserted into the database.
            refresh (bool): Whether to refresh the index after the bulk insert (default: False).
 
+        Returns:
+            Tuple[int, List[Dict[str, Any]]]: A tuple containing:
+                - The number of successfully processed actions (`success`).
+                - A list of errors encountered during the bulk operation (`errors`).
+
         Notes:
-            This function performs a bulk insert of `processed_items` into the database using the specified `collection_id`.
-            insert is performed synchronously and blocking, meaning that the function does not return until the insert has
+            This function performs a bulk insert of `processed_items` into the database using the specified `collection_id`.
+            The insert is performed synchronously and blocking, meaning that the function does not return until the insert has
             completed. The `mk_actions` function is called to generate a list of actions for the bulk insert. If `refresh` is set to
-            True, the index is refreshed after the bulk insert.
+            True, the index is refreshed after the bulk insert.
         """
-
+        raise_on_error = self.sync_settings.raise_on_bulk_error
+        success, errors = helpers.bulk(
             self.sync_client,
             mk_actions(collection_id, processed_items),
            refresh=refresh,
-            raise_on_error=
+            raise_on_error=raise_on_error,
         )
+        return success, errors
 
     # DANGER
     async def delete_items(self) -> None:
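`bulk_async` and `bulk_sync` now surface the `(success, errors)` pair returned by the elasticsearch-py bulk helpers and thread `raise_on_bulk_error` through as `raise_on_error`. A minimal sketch of how that return value behaves; the action shape and index name are placeholders, not the package's `mk_actions` output:

```python
import elasticsearch.helpers as helpers
from elasticsearch import AsyncElasticsearch


async def demo(client: AsyncElasticsearch) -> None:
    actions = [{"_index": "items_demo", "_id": "item-1|demo", "_source": {"id": "item-1"}}]
    # raise_on_error=False collects failures instead of raising BulkIndexError,
    # which is what RAISE_ON_BULK_ERROR=false selects in the settings above.
    success, errors = await helpers.async_bulk(client, actions, raise_on_error=False)
    print(success, errors)  # e.g. (1, []) when everything indexed
```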
{stac_fastapi_elasticsearch-4.0.0a1 → stac_fastapi_elasticsearch-4.1.0}/stac_fastapi_elasticsearch.egg-info/PKG-INFO

@@ -1,6 +1,6 @@
 Metadata-Version: 2.1
 Name: stac-fastapi-elasticsearch
-Version: 4.0.0a1
+Version: 4.1.0
 Summary: An implementation of STAC API based on the FastAPI framework with both Elasticsearch and Opensearch.
 Home-page: https://github.com/stac-utils/stac-fastapi-elasticsearch-opensearch
 License: MIT
@@ -50,8 +50,18 @@ Provides-Extra: server
 - There is [Postman](https://documenter.getpostman.com/view/12888943/2s8ZDSdRHA) documentation here for examples on how to run some of the API routes locally - after starting the elasticsearch backend via the compose.yml file.
 - The `/examples` folder shows an example of running stac-fastapi-elasticsearch from PyPI in docker without needing any code from the repository. There is also a Postman collection here that you can load into Postman for testing the API routes.
 
-
-
+
+### Performance Note
+
+The `enable_direct_response` option is provided by the stac-fastapi core library (introduced in stac-fastapi 5.2.0) and is available in this project starting from v4.0.0.
+
+**You can now control this setting via the `ENABLE_DIRECT_RESPONSE` environment variable.**
+
+When enabled (`ENABLE_DIRECT_RESPONSE=true`), endpoints return Starlette Response objects directly, bypassing FastAPI's default serialization for improved performance. **However, all FastAPI dependencies (including authentication, custom status codes, and validation) are disabled for all routes.**
+
+This mode is best suited for public or read-only APIs where authentication and custom logic are not required. Default is `false` for safety.
+
+See: [issue #347](https://github.com/stac-utils/stac-fastapi-elasticsearch-opensearch/issues/347)
 
 
 ### To install from PyPI:
@@ -95,8 +105,9 @@ If you wish to use a different version, put the following in a
 file named `.env` in the same directory you run Docker Compose from:
 
 ```shell
-ELASTICSEARCH_VERSION=
-OPENSEARCH_VERSION=2.11.
+ELASTICSEARCH_VERSION=8.11.0
+OPENSEARCH_VERSION=2.11.1
+ENABLE_DIRECT_RESPONSE=false
 ```
 The most recent Elasticsearch 7.x versions should also work. See the [opensearch-py docs](https://github.com/opensearch-project/opensearch-py/blob/main/COMPATIBILITY.md) for compatibility information.
 
@@ -121,8 +132,10 @@ You can customize additional settings in your `.env` file:
 | `RELOAD` | Enable auto-reload for development. | `true` | Optional |
 | `STAC_FASTAPI_RATE_LIMIT` | API rate limit per client. | `200/minute` | Optional |
 | `BACKEND` | Tests-related variable | `elasticsearch` or `opensearch` based on the backend | Optional |
-| `ELASTICSEARCH_VERSION`
-| `
+| `ELASTICSEARCH_VERSION` | Version of Elasticsearch to use. | `8.11.0` | Optional |
+| `ENABLE_DIRECT_RESPONSE` | Enable direct response for maximum performance (disables all FastAPI dependencies, including authentication, custom status codes, and validation) | `false` | Optional |
+| `OPENSEARCH_VERSION` | OpenSearch version | `2.11.1` | Optional |
+| `RAISE_ON_BULK_ERROR` | Controls whether bulk insert operations raise exceptions on errors. If set to `true`, the operation will stop and raise an exception when an error occurs. If set to `false`, errors will be logged, and the operation will continue. **Note:** STAC Item and ItemCollection validation errors will always raise, regardless of this flag. | `false` | Optional |
 
 > [!NOTE]
 > The variables `ES_HOST`, `ES_PORT`, `ES_USE_SSL`, and `ES_VERIFY_CERTS` apply to both Elasticsearch and OpenSearch backends, so there is no need to rename the key names to `OS_` even if you're using OpenSearch.
The remaining files listed above are unchanged between the two versions.