norialog 0.1.0__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
norialog-0.1.0/LICENSE ADDED
@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2026 Noria Labs

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
@@ -0,0 +1,604 @@
Metadata-Version: 2.4
Name: norialog
Version: 0.1.0
Summary: Structured JSON logging for Python services with stdout, stderr, file, and CloudWatch destinations.
Author: Noria Labs
License-Expression: MIT
Requires-Python: >=3.11
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: boto3>=1.42.85
Requires-Dist: botocore>=1.42.85
Provides-Extra: dev
Requires-Dist: pytest>=9.0.3; extra == "dev"
Requires-Dist: pytest-cov>=7.1.0; extra == "dev"
Requires-Dist: ruff>=0.15.9; extra == "dev"
Dynamic: license-file

# `norialog`

Structured JSON logging for Python services, with support for `stdout`, `stderr`, rotating file targets, and direct CloudWatch delivery.

This package is intentionally small and explicit. It does not wrap the standard `logging` module. Instead, it gives you a service logger that writes JSON records directly to one or more destinations, with schema remapping, secret redaction, target resolution, and CloudWatch batching built in.

## Install

```bash
pip install norialog
```

Python requirement: `>=3.11`

## Main Exports

```python
from norialog import (
    ManagedLogger,
    LoggerRuntimeContext,
    LoggerTargetContext,
    create_cloudwatch_destination,
    create_file_destination,
    create_logger_runtime_context,
    create_logger_target_context,
    create_redact_matcher,
    create_service_logger,
    format_date_stamp,
    parse_logger_destinations,
    parse_logger_redact_keys,
    resolve_target,
    sanitize_log_value,
)
```

## Quick Start

```python
from norialog import create_service_logger

managed = create_service_logger(
    service_name="payments",
    environment="production",
)

logger = managed.logger

logger.info("service started", provider="stripe")
logger.warn("slow upstream", duration_ms=812)
logger.exception("payment failed", RuntimeError("gateway timeout"), invoice_id="inv_123")

managed.flush()
managed.close()
```

## What `create_service_logger()` Returns

`create_service_logger()` returns a `ManagedLogger` dataclass with:

- `logger`: the `ServiceLogger` instance
- `flush()`: flushes every configured destination
- `close()`: flushes and closes managed destinations

Call `close()` before process exit when you use file or CloudWatch destinations.

## Logger Methods

The returned `logger` exposes:

- `trace(message, **fields)`
- `debug(message, **fields)`
- `info(message, **fields)`
- `warn(message, **fields)`
- `warning(message, **fields)`
- `error(message, **fields)`
- `fatal(message, **fields)`
- `exception(message, error, **fields)`
- `log(level, message, **fields)`

Supported levels are:

- `trace`
- `debug`
- `info`
- `warn`
- `error`
- `fatal`
- `silent`

Records below the configured threshold are skipped.
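The threshold check can be sketched as an ordering comparison. The level order mirrors the list above; the comparison itself is illustrative, not the library's implementation:

```python
# Illustrative sketch of level filtering; the ordering mirrors the
# documented levels, but this is not the library's actual code.
LEVELS = ["trace", "debug", "info", "warn", "error", "fatal", "silent"]

def should_emit(record_level: str, minimum: str) -> bool:
    # A record passes when its level sits at or above the configured minimum.
    return LEVELS.index(record_level) >= LEVELS.index(minimum)
```

Because `silent` sits above `fatal` in the ordering, a `silent` threshold suppresses every record.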

## Default Output Shape

By default, each log record contains:

```json
{
  "level": "info",
  "levelValue": 30,
  "time": 1711580000000,
  "timestamp": "2024-03-27T22:53:20.000Z",
  "service": "payments",
  "environment": "production",
  "msg": "service started"
}
```

Additional keyword arguments passed to the logger are merged into the record.

Exception values are normalized into objects with:

- `name`
- `message`
- `stack`

## Basic Configuration

```python
from norialog import create_service_logger

managed = create_service_logger(
    service_name="api",
    environment="staging",
    level="debug",
    destinations=["stdout", "file"],
    base={"team": "platform", "region": "eu-west-1"},
    redact_keys=["session_id"],
    file={
        "target": {
            "prefix": "/var/log/noria/api",
            "rotation": "daily",
            "suffix": ".jsonl",
        }
    },
)
```

### `create_service_logger()` Options

- `service_name`: required service identifier added to every record
- `environment`: optional environment name added to every record
- `level`: minimum level, default `info`
- `destinations`: list of destination names, default `["stdout"]`
- `schema`: field remapping configuration
- `identity`: runtime identity overrides for hostname, instance id, and pid
- `redact`: advanced redaction configuration
- `redact_keys`: extra exact-match redact keys
- `base`: base fields merged into every record
- `file`: file destination configuration, required when `file` is enabled
- `cloudwatch`: CloudWatch destination configuration, required when `cloudwatch` is enabled

## Destinations

Supported destination names are:

- `stdout`
- `stderr`
- `file`
- `cloudwatch`

Use `parse_logger_destinations()` if you want to accept a comma-separated destination list from an environment variable:

```python
from norialog import parse_logger_destinations

destinations = parse_logger_destinations("stdout,file,cloudwatch")
```

The parser:

- defaults to `["stdout"]` when the input is empty
- lowercases entries
- removes duplicates
- raises `ValueError` for unsupported names
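The parsing rules above can be written out as a sketch (the duplicate-removal order is assumed to preserve first occurrence; this is not the library's implementation):

```python
# Sketch of the documented parsing rules, not norialog's actual code.
SUPPORTED = ("stdout", "stderr", "file", "cloudwatch")

def parse_destinations(raw: str) -> list[str]:
    names = [part.strip().lower() for part in raw.split(",") if part.strip()]
    if not names:
        return ["stdout"]  # default when the input is empty
    result: list[str] = []
    for name in names:
        if name not in SUPPORTED:
            raise ValueError(f"unsupported destination: {name}")
        if name not in result:  # de-duplicate, keeping first occurrence
            result.append(name)
    return result
```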

## Schema Remapping

Use the `schema` option to rename output fields and choose which time fields are emitted.

```python
managed = create_service_logger(
    service_name="billing",
    schema={
        "messageKey": "message",
        "levelKey": "severity",
        "levelValueKey": "severityValue",
        "timeKey": "ts",
        "timestampKey": "tsIso",
        "serviceKey": "app",
        "environmentKey": "stage",
        "errorKey": "error",
        "timeMode": "iso",
    },
)
```

Supported schema keys:

- `messageKey`: default `msg`
- `levelKey`: default `level`
- `levelValueKey`: default `levelValue`
- `timeKey`: default `time`
- `timestampKey`: default `timestamp`
- `serviceKey`: default `service`
- `environmentKey`: default `environment`
- `errorKey`: default `err`
- `timeMode`: one of `epoch`, `iso`, `both`

Rules:

- `timeMode="epoch"` emits only the integer millisecond timestamp
- `timeMode="iso"` emits only the ISO timestamp
- `timeMode="both"` emits both
- when `timeMode="both"`, `timeKey` and `timestampKey` must be different
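The `timeMode` rules amount to the following field selection; helper name and structure are illustrative, not the library's internals:

```python
# Sketch of timeMode field selection. The ISO form mirrors the shape
# shown in the default output example; the helper itself is illustrative.
from datetime import datetime, timezone

def time_fields(epoch_ms: int, mode: str,
                time_key: str = "time", timestamp_key: str = "timestamp") -> dict:
    if mode == "both" and time_key == timestamp_key:
        raise ValueError("timeKey and timestampKey must differ when timeMode='both'")
    iso = (
        datetime.fromtimestamp(epoch_ms / 1000, tz=timezone.utc)
        .isoformat(timespec="milliseconds")
        .replace("+00:00", "Z")
    )
    fields: dict = {}
    if mode in ("epoch", "both"):
        fields[time_key] = epoch_ms
    if mode in ("iso", "both"):
        fields[timestamp_key] = iso
    return fields
```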

## Redaction

Redaction happens before records are encoded to JSON.

By default, the built-in matcher treats keys containing common secret-like names as sensitive, including:

- `token`
- `secret`
- `key`
- `password`
- `authorization`
- `credential`
- `api_key`

### Simple Redaction

```python
managed = create_service_logger(
    service_name="auth",
    redact_keys=["session_id", "otp"],
)
```

### Advanced Redaction

```python
managed = create_service_logger(
    service_name="auth",
    redact={
        "keys": ["session_id"],
        "mode": "replace",
    },
)
```

`redact.mode` controls how custom keys behave:

- `merge`: exact keys are added to the built-in secret matcher
- `replace`: only the explicitly listed keys are redacted

`redact_keys` and `redact["keys"]` can be combined. If `redact` is provided, its `mode` wins.
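A toy matcher illustrating the two modes. The substring behavior for the built-in names and exact matching for custom keys follow the descriptions above; everything else is an assumption of the sketch:

```python
# Toy redact matcher illustrating "merge" vs "replace"; not norialog's code.
BUILT_IN = ("token", "secret", "key", "password",
            "authorization", "credential", "api_key")

def make_matcher(keys: list[str], mode: str):
    exact = {k.lower() for k in keys}
    if mode == "replace":
        # only the explicitly listed keys are redacted
        return lambda key: key.lower() in exact
    # "merge": exact custom keys extend the built-in secret-like names
    return lambda key: key.lower() in exact or any(n in key.lower() for n in BUILT_IN)
```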

You can also use the helpers directly:

```python
from norialog import create_redact_matcher, sanitize_log_value

matcher = create_redact_matcher({"keys": ["session_id"], "mode": "merge"})
safe = sanitize_log_value({"token": "secret", "session_id": "abc"}, matcher)
```

## Runtime Identity

By default, runtime context uses the current hostname and process id. Override it with `identity` when you need deterministic names in tests or custom deployment metadata:

```python
managed = create_service_logger(
    service_name="worker",
    identity={
        "hostname": "queue-1",
        "instanceId": "i-abc123",
        "pid": 4242,
    },
)
```

Available identity keys:

- `hostname`
- `instanceId`
- `pid`

## Base Fields

Use `base` to inject fields into every record:

```python
managed = create_service_logger(
    service_name="payments",
    base={
        "team": "platform",
        "component": "webhook-consumer",
    },
)
```

Base fields are merged after the standard service and environment fields.
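The merge order can be sketched as successive dict updates; that per-call fields land last is an assumption of this sketch:

```python
# Sketch of record assembly order: standard fields, then base fields,
# then (assumed) the per-call keyword fields. Later updates win on clashes.
def build_record(service: str, environment: str, base: dict, fields: dict) -> dict:
    record = {"service": service, "environment": environment}
    record.update(base)
    record.update(fields)
    return record
```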

## File Destination

Enable the file destination by including `"file"` in `destinations` and supplying `file=...`.

```python
managed = create_service_logger(
    service_name="api",
    destinations=["file"],
    file={
        "target": {
            "prefix": "/var/log/noria/api",
            "rotation": "daily",
            "suffix": ".jsonl",
        },
        "mkdir": True,
    },
)
```

### File Config

- `target`: required target config
- `mkdir`: optional, default `True`; creates parent directories automatically

### File Target Resolution

`file["target"]` supports three styles:

1. Fixed path

```python
file={"target": {"value": "/var/log/noria/api.jsonl"}}
```

2. Declarative path building

```python
file={
    "target": {
        "prefix": "/var/log/noria/api",
        "rotation": "monthly",
        "includeServiceName": True,
        "includeEnvironment": True,
        "includeHostname": True,
        "includeInstanceId": True,
        "includePid": True,
        "suffix": ".jsonl",
        "separator": "/",
        "timezone": "America/New_York",
    }
}
```

3. Custom resolver

```python
file={
    "target": {
        "resolve": lambda context: (
            f"/var/log/{context.environment}/{context.service_name}-{context.pid}.jsonl"
        )
    }
}
```

Supported target keys:

- `value`: fixed path
- `prefix`: base path or prefix
- `rotation`: `none`, `daily`, `monthly`, `annual`
- `timezone`: IANA timezone used for rotation boundaries
- `includeServiceName`
- `includeEnvironment`
- `includeHostname`
- `includeInstanceId`
- `includePid`
- `identifier`
- `separator`: join string, default `-`
- `suffix`
- `resolve`: callable that receives `LoggerTargetContext`

Important behavior:

- file targets are resolved per event timestamp, not only once at startup
- this enables date-aware rollover based on the actual event time
- if the emitted JSON contains `time` or `timestamp`, the file destination uses it to choose the target
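The per-event resolution can be sketched for a daily prefix target. The `YYYY-MM-DD` stamp format and the UTC handling are assumptions of this sketch, not the library's documented output:

```python
# Illustrative daily target resolution from an event timestamp (epoch ms).
# Date-stamp format and UTC handling are assumed for the sketch.
from datetime import datetime, timezone

def resolve_daily_path(prefix: str, suffix: str, event_ms: int,
                       separator: str = "-") -> str:
    stamp = datetime.fromtimestamp(event_ms / 1000, tz=timezone.utc).strftime("%Y-%m-%d")
    return f"{prefix}{separator}{stamp}{suffix}"
```

Two events falling on different days therefore resolve to different files, which is what makes event-time rollover possible.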

## CloudWatch Destination

Enable the CloudWatch destination by including `"cloudwatch"` in `destinations` and supplying `cloudwatch=...`.

```python
managed = create_service_logger(
    service_name="api",
    destinations=["stdout", "cloudwatch"],
    cloudwatch={
        "region": "eu-west-1",
        "logGroupName": "/noria/api",
        "stream": {
            "prefix": "api",
            "rotation": "daily",
            "includeHostname": False,
            "includePid": False,
        },
        "retentionInDays": 30,
    },
)
```

### CloudWatch Config

- `region`: required unless you inject `client`
- `logGroupName`: required
- `credentials`: optional mapping with `access_key_id`, `secret_access_key`, `session_token`
- `client`: optional boto3 logs client override
- `stream`: optional target config for stream names
- `createLogGroup`: default `True`
- `createLogStream`: default `True`
- `retentionInDays`: optional CloudWatch retention policy
- `flushIntervalMs`: default `2000`
- `maxBatchCount`: default `1000`
- `maxBatchBytes`: default `900000`
- `maxBufferedEvents`: default `20000`
- `retryBaseDelayMs`: default `1000`

### Stream Naming

CloudWatch stream naming uses the same target resolution engine as file targets.

If you do not provide `stream`, the fallback stream name is:

```text
{hostname}-{pid}
```

When you provide a rotating stream prefix, rotation happens from each event timestamp, not wall-clock flush time.

### Retention Values

Supported `retentionInDays` values are:

`1, 3, 5, 7, 14, 30, 60, 90, 120, 150, 180, 365, 400, 545, 731, 1096, 1827, 2192, 2557, 2922, 3288, 3653`

### CloudWatch Operational Behavior

- log events are buffered in memory and flushed in batches
- batches are grouped by stream name
- oversized buffers are trimmed from the oldest events
- transient flush failures are retried with backoff
- CloudWatch setup can create the log group and stream automatically
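The batching rules can be sketched as grouping by stream and splitting on the documented count and byte caps; the function shape and bookkeeping are illustrative, not the library's code:

```python
# Sketch of CloudWatch-style batching: group by stream, then cap each
# batch by event count and by encoded byte size. Defaults mirror the
# documented maxBatchCount / maxBatchBytes values.
def batch_events(events: list[tuple[str, str]],
                 max_count: int = 1000, max_bytes: int = 900_000) -> dict:
    batches: dict[str, list[list[str]]] = {}
    batch_bytes: dict[str, int] = {}
    for stream, line in events:
        group = batches.setdefault(stream, [[]])
        size = len(line.encode("utf-8"))
        if len(group[-1]) >= max_count or batch_bytes.get(stream, 0) + size > max_bytes:
            group.append([])  # start a new batch for this stream
            batch_bytes[stream] = 0
        group[-1].append(line)
        batch_bytes[stream] = batch_bytes.get(stream, 0) + size
    return batches
```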

## Target Helper Functions

These helpers are available when you want to build your own file or CloudWatch wrappers:

```python
from norialog import (
    create_logger_runtime_context,
    create_logger_target_context,
    format_date_stamp,
    resolve_target,
)
```

Example:

```python
runtime = create_logger_runtime_context(
    service_name="payments",
    environment="prod",
)

target_context = create_logger_target_context(runtime, 1711578600000)

path = resolve_target(
    {
        "prefix": "logs",
        "rotation": "daily",
        "includeServiceName": True,
        "includeEnvironment": True,
        "suffix": ".jsonl",
        "separator": "/",
    },
    target_context,
)
```

## Direct Destination Construction

If you do not want the full managed logger, you can create destinations directly:

```python
from norialog import (
    create_cloudwatch_destination,
    create_file_destination,
    create_logger_runtime_context,
)

runtime = create_logger_runtime_context(service_name="api", environment="prod")

file_destination = create_file_destination(
    {"target": {"value": "/tmp/api.jsonl"}},
    runtime,
)

cloudwatch_destination = create_cloudwatch_destination(
    {
        "region": "eu-west-1",
        "logGroupName": "/noria/api",
        "createLogGroup": False,
        "createLogStream": False,
    },
    runtime,
)
```

Each destination exposes:

- `emit_line(line, timestamp_ms=None)`
- `flush()`
- `close()`

## Usage Patterns

### Stdout Only

```python
managed = create_service_logger(service_name="api")
managed.logger.info("ready")
```

### Stdout and File

```python
managed = create_service_logger(
    service_name="api",
    destinations=["stdout", "file"],
    file={"target": {"prefix": "/tmp/api", "rotation": "daily", "suffix": ".jsonl"}},
)
```

### File Only with Custom Schema

```python
managed = create_service_logger(
    service_name="jobs",
    destinations=["file"],
    schema={"messageKey": "message", "errorKey": "error", "timeMode": "iso"},
    file={"target": {"value": "/tmp/jobs.log"}},
)
```

### CloudWatch Only

```python
managed = create_service_logger(
    service_name="worker",
    destinations=["cloudwatch"],
    cloudwatch={
        "region": "eu-west-1",
        "logGroupName": "/noria/worker",
    },
)
```

## Notes and Caveats

- `close()` is the safe way to finish file and CloudWatch logging
- `StdDestination.close()` only flushes; it does not close `stdout` or `stderr`
- file and CloudWatch targets can rotate based on the timestamp inside each emitted JSON record
- `warn()` and `warning()` are equivalent
- fields passed as `error=` or `err=` are normalized to the configured error key
- JSON is emitted with compact separators and `ensure_ascii=False`
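That encoding corresponds to this `json.dumps` configuration:

```python
import json

# Compact separators, non-ASCII preserved — matching the documented output.
line = json.dumps({"msg": "café", "level": "info"},
                  separators=(",", ":"), ensure_ascii=False)
print(line)  # → {"msg":"café","level":"info"}
```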

## Development

Run tests:

```bash
uv sync --extra dev
uv run pytest
```

Run lint:

```bash
uv run ruff check .
```