cledar-sdk 2.0.2__py3-none-any.whl → 2.1.0__py3-none-any.whl
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- cledar/__init__.py +1 -0
- cledar/kafka/README.md +239 -0
- cledar/kafka/__init__.py +42 -0
- cledar/kafka/clients/base.py +117 -0
- cledar/kafka/clients/consumer.py +138 -0
- cledar/kafka/clients/producer.py +97 -0
- cledar/kafka/config/schemas.py +262 -0
- cledar/kafka/exceptions.py +17 -0
- cledar/kafka/handlers/dead_letter.py +88 -0
- cledar/kafka/handlers/parser.py +83 -0
- cledar/kafka/logger.py +5 -0
- cledar/kafka/models/input.py +17 -0
- cledar/kafka/models/message.py +14 -0
- cledar/kafka/models/output.py +12 -0
- cledar/kafka/tests/.env.test.kafka +3 -0
- cledar/kafka/tests/README.md +216 -0
- cledar/kafka/tests/conftest.py +104 -0
- cledar/kafka/tests/integration/__init__.py +1 -0
- cledar/kafka/tests/integration/conftest.py +78 -0
- cledar/kafka/tests/integration/helpers.py +47 -0
- cledar/kafka/tests/integration/test_consumer_integration.py +375 -0
- cledar/kafka/tests/integration/test_integration.py +394 -0
- cledar/kafka/tests/integration/test_producer_consumer_interaction.py +388 -0
- cledar/kafka/tests/integration/test_producer_integration.py +217 -0
- cledar/kafka/tests/unit/__init__.py +1 -0
- cledar/kafka/tests/unit/test_base_kafka_client.py +391 -0
- cledar/kafka/tests/unit/test_config_validation.py +609 -0
- cledar/kafka/tests/unit/test_dead_letter_handler.py +443 -0
- cledar/kafka/tests/unit/test_error_handling.py +674 -0
- cledar/kafka/tests/unit/test_input_parser.py +310 -0
- cledar/kafka/tests/unit/test_input_parser_comprehensive.py +489 -0
- cledar/kafka/tests/unit/test_utils.py +25 -0
- cledar/kafka/tests/unit/test_utils_comprehensive.py +408 -0
- cledar/kafka/utils/callbacks.py +28 -0
- cledar/kafka/utils/messages.py +39 -0
- cledar/kafka/utils/topics.py +15 -0
- cledar/kserve/README.md +352 -0
- cledar/kserve/__init__.py +5 -0
- cledar/kserve/tests/__init__.py +0 -0
- cledar/kserve/tests/test_utils.py +64 -0
- cledar/kserve/utils.py +30 -0
- cledar/logging/README.md +53 -0
- cledar/logging/__init__.py +5 -0
- cledar/logging/tests/test_universal_plaintext_formatter.py +249 -0
- cledar/logging/universal_plaintext_formatter.py +99 -0
- cledar/monitoring/README.md +71 -0
- cledar/monitoring/__init__.py +5 -0
- cledar/monitoring/monitoring_server.py +156 -0
- cledar/monitoring/tests/integration/test_monitoring_server_int.py +162 -0
- cledar/monitoring/tests/test_monitoring_server.py +59 -0
- cledar/nonce/README.md +99 -0
- cledar/nonce/__init__.py +5 -0
- cledar/nonce/nonce_service.py +62 -0
- cledar/nonce/tests/__init__.py +0 -0
- cledar/nonce/tests/test_nonce_service.py +136 -0
- cledar/redis/README.md +536 -0
- cledar/redis/__init__.py +17 -0
- cledar/redis/async_example.py +112 -0
- cledar/redis/example.py +67 -0
- cledar/redis/exceptions.py +25 -0
- cledar/redis/logger.py +5 -0
- cledar/redis/model.py +14 -0
- cledar/redis/redis.py +764 -0
- cledar/redis/redis_config_store.py +333 -0
- cledar/redis/tests/test_async_integration_redis.py +158 -0
- cledar/redis/tests/test_async_redis_service.py +380 -0
- cledar/redis/tests/test_integration_redis.py +119 -0
- cledar/redis/tests/test_redis_service.py +319 -0
- cledar/storage/README.md +529 -0
- cledar/storage/__init__.py +6 -0
- cledar/storage/constants.py +5 -0
- cledar/storage/exceptions.py +79 -0
- cledar/storage/models.py +41 -0
- cledar/storage/object_storage.py +1274 -0
- cledar/storage/tests/conftest.py +18 -0
- cledar/storage/tests/test_abfs.py +164 -0
- cledar/storage/tests/test_integration_filesystem.py +359 -0
- cledar/storage/tests/test_integration_s3.py +453 -0
- cledar/storage/tests/test_local.py +384 -0
- cledar/storage/tests/test_s3.py +521 -0
- {cledar_sdk-2.0.2.dist-info → cledar_sdk-2.1.0.dist-info}/METADATA +1 -1
- cledar_sdk-2.1.0.dist-info/RECORD +84 -0
- cledar_sdk-2.0.2.dist-info/RECORD +0 -4
- {cledar_sdk-2.0.2.dist-info → cledar_sdk-2.1.0.dist-info}/WHEEL +0 -0
- {cledar_sdk-2.0.2.dist-info → cledar_sdk-2.1.0.dist-info}/licenses/LICENSE +0 -0
cledar/kserve/README.md
ADDED
@@ -0,0 +1,352 @@
# KServe Service

## Purpose

The `cledar.kserve` package provides utilities for working with KServe inference services, particularly for handling the CloudEvents headers used in KServe's event-driven architecture. It simplifies the extraction and parsing of metadata from CloudEvents headers, making it easier to integrate with KServe deployments.

### Key Features

- **CloudEvents Parsing**: Extract Kafka topic names from CloudEvents source headers
- **Header Validation**: Robust validation of the CloudEvents header format
- **Type Safety**: Fully typed with Python type hints
- **Well Tested**: Comprehensive unit tests covering edge cases
- **Lightweight**: Minimal dependencies, focused utility functions

### Use Cases

- Parsing CloudEvents headers in KServe inference services
- Extracting Kafka topic information from event-driven requests
- Building event-driven ML inference pipelines
- Integration with KServe and Knative Eventing

## Installation

This package is part of the `cledar-python-sdk`. Install it using:

```bash
# Install with uv (recommended)
uv sync --all-groups

# Or with pip
pip install -e .
```

## Usage Example

```python
from cledar.kserve import get_input_topic

# Example CloudEvents headers from a KServe request
headers = {
    "ce-source": "kafka://my-cluster#input-topic",
    "ce-type": "dev.knative.kafka.event",
    "ce-id": "partition:0/offset:123",
}

# Extract the Kafka topic name
topic = get_input_topic(headers)
print(topic)  # Output: "input-topic"

# Handle missing or invalid headers
empty_headers = {}
topic = get_input_topic(empty_headers)
print(topic)  # Output: None

# Handle headers without the delimiter
invalid_headers = {"ce-source": "kafka://my-cluster/topic"}
topic = get_input_topic(invalid_headers)
print(topic)  # Output: None
```

## Development

### Project Structure

```
cledar/kserve/
├── __init__.py          # Package initialization with exports
├── utils.py             # Utility functions for CloudEvents
├── tests/
│   ├── __init__.py      # Test package initialization
│   └── test_utils.py    # Unit tests for utilities
└── README.md            # This file
```

## Running Linters

The project is configured for multiple linters (see `pyproject.toml` for configuration).

### Available Linter Configurations

The project includes configurations for:
- **Pylint**: Python code analysis (`[tool.pylint]` in `pyproject.toml`)
- **Mypy**: Static type checking (`[tool.mypy]` in `pyproject.toml`)
- **Black**: Code formatting (`[tool.black]` in `pyproject.toml`)

### Installing Linters

Linters are not included in the dev dependencies by default. Install them separately:

```bash
# Install all linters
pip install pylint mypy black

# Or with uv
uv pip install pylint mypy black
```

### Running Linters

Once installed, run them from the SDK root directory:

```bash
# From the SDK root directory
cd /path/to/cledar-python-sdk

# Run pylint on cledar.kserve
pylint cledar/kserve/

# Run mypy type checking (strict mode configured)
mypy cledar/kserve/

# Check code formatting with black
black --check cledar/kserve/

# Auto-format code
black cledar/kserve/
```

### Run All Linters

```bash
# Run all linters in sequence
pylint cledar/kserve/ && \
mypy cledar/kserve/ && \
black --check cledar/kserve/
```

### IDE Integration

Most IDEs support these linters natively:
- **VSCode**: Install the Python extension; linters are auto-detected via `pyproject.toml`
- **PyCharm**: Enable in Settings → Tools → Python Integrated Tools
- **Cursor**: Same as VSCode

## Running Unit Tests

Unit tests verify the functionality of the CloudEvents parsing utilities.

### Run All Unit Tests

```bash
# From the SDK root directory
cd /path/to/cledar-python-sdk

# Run all tests using uv
PYTHONPATH=$PWD uv run pytest cledar/kserve/tests/ -v
```

### Run Specific Test File

```bash
PYTHONPATH=$PWD uv run pytest cledar/kserve/tests/test_utils.py -v
```

### Run Specific Test

```bash
# Run a specific test by name
PYTHONPATH=$PWD uv run pytest cledar/kserve/tests/test_utils.py::test_get_input_topic_valid_source -v
```

### Run with Coverage

```bash
# Generate coverage report
PYTHONPATH=$PWD uv run pytest cledar/kserve/tests/ \
  --cov=cledar.kserve \
  --cov-report=html \
  --cov-report=term

# View HTML report
open htmlcov/index.html
```

### Unit Test Details

- **Test Framework**: pytest
- **Test Count**: 9 unit tests
- **Execution Time**: ~0.04 seconds (fast, no external dependencies)

#### What Unit Tests Cover

- ✅ Valid CloudEvents source parsing
- ✅ Whitespace trimming and normalization
- ✅ Missing header handling
- ✅ Invalid format detection (no delimiter)
- ✅ Empty topic after the delimiter
- ✅ Whitespace-only topics
- ✅ Multiple delimiters in the source
- ✅ Empty source values
- ✅ Complex topic names with namespaces

## CI/CD Integration

### GitLab CI Example

```yaml
test-kserve-service:
  stage: test
  image: python:3.12
  script:
    - pip install uv
    - uv sync --all-groups
    - PYTHONPATH=$PWD uv run pytest cledar/kserve/tests/ -v
```

### GitHub Actions Example

```yaml
name: KServe Service Tests
on: [push, pull_request]

jobs:
  unit-tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v4
        with:
          python-version: '3.12'
      - name: Install dependencies
        run: |
          pip install uv
          uv sync --all-groups
      - name: Run unit tests
        run: PYTHONPATH=$PWD uv run pytest cledar/kserve/tests/ -v
```

## API Reference

### Constants

#### `CE_SOURCE_HEADER`

The CloudEvents source header key used in KServe requests.

```python
CE_SOURCE_HEADER = "ce-source"
```

### Functions

#### `get_input_topic(headers: dict[str, str]) -> str | None`

Extracts the Kafka topic name from the CloudEvents source header.

Parses the `ce-source` header value, which is expected to be in the format `prefix#topic_name`, and returns the topic name after the `#` delimiter.

**Parameters:**
- `headers` (dict[str, str]): Dictionary of HTTP headers containing CloudEvents metadata.

**Returns:**
- `str | None`: The extracted topic name if the header exists, contains `#`, and has a non-empty topic name after the delimiter. Returns `None` otherwise.

**Example:**

```python
>>> headers = {"ce-source": "kafka://cluster#my-topic"}
>>> get_input_topic(headers)
'my-topic'

>>> headers = {"ce-source": "kafka://cluster#"}
>>> get_input_topic(headers)  # Returns None (the REPL prints nothing)

>>> headers = {}
>>> get_input_topic(headers)  # Returns None
```

**Edge Cases Handled:**
- Missing `ce-source` header → returns `None`
- No `#` delimiter in source → returns `None`
- Empty topic after `#` → returns `None`
- Whitespace-only topic → returns `None` (after stripping)
- Leading/trailing whitespace → stripped automatically
- Multiple `#` delimiters → only the first `#` is used as the delimiter

## CloudEvents Format

The `ce-source` header in KServe follows the CloudEvents specification and typically has this format:

```
<protocol>://<cluster-or-namespace>#<topic-name>
```

**Examples:**
- `kafka://prod-cluster#user-events`
- `kafka://namespace.kafka#model-predictions`
- `kafka://local#ml-inference-requests`

The `get_input_topic` function extracts the `<topic-name>` portion after the `#` delimiter.

## Integration with KServe

### Example KServe Predictor

```python
import logging

from kserve import Model, ModelServer

from cledar.kserve import get_input_topic

logger = logging.getLogger(__name__)


class MyPredictor(Model):
    def __init__(self, name: str):
        super().__init__(name)

    def predict(self, request: dict, headers: dict[str, str]) -> dict:
        # Extract the source topic from CloudEvents headers
        source_topic = get_input_topic(headers)

        if source_topic:
            logger.info(f"Processing request from topic: {source_topic}")
        else:
            logger.warning("Could not determine source topic from headers")

        # Your inference logic here
        predictions = self.model.predict(request["instances"])

        return {"predictions": predictions}


if __name__ == "__main__":
    model = MyPredictor("my-model")
    ModelServer().start([model])
```

## Running Pre-commit Checks

```bash
# Format code
uv run black cledar/kserve/

# Check types
uv run mypy cledar/kserve/

# Run linter
uv run pylint cledar/kserve/

# Run all tests
PYTHONPATH=$PWD uv run pytest cledar/kserve/tests/ -v
```

## License

See the main repository LICENSE file.

## Support

For issues, questions, or contributions, please refer to the main repository's contribution guidelines.

cledar/kserve/tests/__init__.py
File without changes
cledar/kserve/tests/test_utils.py
ADDED
@@ -0,0 +1,64 @@
from cledar.kserve.utils import get_input_topic


def test_get_input_topic_valid_source() -> None:
    headers = {"ce-source": "kafka://cluster#my-topic"}
    result = get_input_topic(headers)

    assert result == "my-topic"


def test_get_input_topic_with_whitespace() -> None:
    headers = {"ce-source": "kafka://cluster# my-topic "}
    result = get_input_topic(headers)

    assert result == "my-topic"


def test_get_input_topic_missing_header() -> None:
    headers: dict[str, str] = {}
    result = get_input_topic(headers)

    assert result is None


def test_get_input_topic_no_delimiter() -> None:
    headers = {"ce-source": "kafka://cluster/my-topic"}
    result = get_input_topic(headers)

    assert result is None


def test_get_input_topic_empty_after_delimiter() -> None:
    headers = {"ce-source": "kafka://cluster#"}
    result = get_input_topic(headers)

    assert result is None


def test_get_input_topic_only_whitespace_after_delimiter() -> None:
    headers = {"ce-source": "kafka://cluster# "}
    result = get_input_topic(headers)

    assert result is None


def test_get_input_topic_multiple_delimiters() -> None:
    headers = {"ce-source": "kafka://cluster#my-topic#with-hash"}
    result = get_input_topic(headers)

    assert result == "my-topic#with-hash"


def test_get_input_topic_empty_source_value() -> None:
    headers = {"ce-source": ""}
    result = get_input_topic(headers)

    assert result is None


def test_get_input_topic_complex_topic_name() -> None:
    headers = {"ce-source": "kafka://prod.cluster.example.com#namespace.my-topic-v2"}
    result = get_input_topic(headers)

    assert result == "namespace.my-topic-v2"
cledar/kserve/utils.py
ADDED
@@ -0,0 +1,30 @@
"""Utilities for KServe integration and CloudEvents processing."""

CE_SOURCE_HEADER = "ce-source"


def get_input_topic(headers: dict[str, str]) -> str | None:
    """Extract the Kafka topic name from the CloudEvents source header.

    Parses the 'ce-source' header value, which is expected to be in the format
    'prefix#topic_name', and returns the topic name after the '#' delimiter.

    Args:
        headers: Dictionary of HTTP headers containing CloudEvents metadata.

    Returns:
        The extracted topic name if the header exists, contains '#', and has
        a non-empty topic name after the delimiter. Returns None otherwise.

    Example:
        >>> headers = {"ce-source": "kafka://cluster#my-topic"}
        >>> get_input_topic(headers)
        'my-topic'

    """
    source = headers.get(CE_SOURCE_HEADER)
    if not source or "#" not in source:
        return None

    topic = source.split("#", 1)[1].strip()
    return topic if topic else None
cledar/logging/README.md
ADDED
@@ -0,0 +1,53 @@
# Universal Formatter

The `UniversalPlaintextFormatter` is a custom logging formatter that extends the standard `logging.Formatter` class. It adds the ability to include extra attributes from log records while excluding the standard attributes and any configurable keys.

## Usage

To use the `UniversalPlaintextFormatter` in your logging configuration, add the following to your `logging.conf` file:

```ini
[formatter_plaintextFormatter]
class=cledar.logging.universal_plaintext_formatter.UniversalPlaintextFormatter
format=%(asctime)s %(name)s [%(levelname)s]: %(message)s
datefmt=%Y-%m-%d %H:%M:%S
```

## Features

- Extends the standard `logging.Formatter`
- Automatically includes extra attributes from log records
- Excludes standard `LogRecord` attributes to keep logs clean
- Configurable exclusion of additional keys

## Configuration Options

In addition to the standard formatter options, you can configure which keys to exclude from the log output:

```ini
[formatter_plaintextFormatter]
class=cledar.logging.universal_plaintext_formatter.UniversalPlaintextFormatter
format=%(asctime)s %(name)s [%(levelname)s]: %(message)s
datefmt=%Y-%m-%d %H:%M:%S
exclude_keys=key1,key2,key3
```

The `exclude_keys` option specifies a comma-separated list of keys to exclude from the log output, in addition to the standard `LogRecord` attributes.

## Example

When using this formatter, any extra attributes added to the log record are automatically included in the log output:

```python
import logging

logger = logging.getLogger(__name__)
logger.info("User logged in", extra={"user_id": 123, "ip_address": "192.168.1.1"})
```

Output:
```
2023-08-04 12:34:56 my_module [INFO]: User logged in
user_id: 123
ip_address: 192.168.1.1
```
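The behavior described in the logging README above can be sketched as follows. This is a minimal illustration, not the actual `UniversalPlaintextFormatter` source: the class name `PlaintextExtrasFormatter`, the constructor signature, and the exact layout of the extra lines are assumptions.

```python
import logging

# Attributes present on every LogRecord, plus the ones Formatter.format() adds.
_STANDARD_ATTRS = set(vars(logging.makeLogRecord({}))) | {"message", "asctime"}


class PlaintextExtrasFormatter(logging.Formatter):
    """Sketch: append non-standard record attributes to the formatted line."""

    def __init__(self, fmt=None, datefmt=None, exclude_keys=()):
        super().__init__(fmt=fmt, datefmt=datefmt)
        self.exclude_keys = set(exclude_keys)

    def format(self, record: logging.LogRecord) -> str:
        base = super().format(record)
        # Anything on the record that is not a standard attribute and not
        # explicitly excluded is treated as an "extra".
        extras = {
            key: value
            for key, value in vars(record).items()
            if key not in _STANDARD_ATTRS and key not in self.exclude_keys
        }
        return "\n".join([base] + [f"{key}: {value}" for key, value in extras.items()])


handler = logging.StreamHandler()
handler.setFormatter(
    PlaintextExtrasFormatter(
        "%(asctime)s %(name)s [%(levelname)s]: %(message)s",
        datefmt="%Y-%m-%d %H:%M:%S",
    )
)
logger = logging.getLogger("my_module")
logger.addHandler(handler)
logger.setLevel(logging.INFO)
logger.info("User logged in", extra={"user_id": 123, "ip_address": "192.168.1.1"})
```

With this sketch, the `extra` keys land on separate lines below the message, matching the output shown in the README example.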