pytest-allure-host 0.1.2__tar.gz → 2.0.0__tar.gz

This diff shows the changes between publicly released versions of the package as published to a supported registry. It is provided for informational purposes only and reflects the package contents exactly as they appear in the public registry.
@@ -1,6 +1,6 @@
  Metadata-Version: 2.4
  Name: pytest-allure-host
- Version: 0.1.2
+ Version: 2.0.0
  Summary: Publish Allure static reports to private S3 behind CloudFront with history preservation
  License-Expression: MIT
  License-File: LICENSE
@@ -17,6 +17,7 @@ Classifier: Intended Audience :: Developers
  Classifier: Topic :: Software Development :: Testing
  Classifier: Framework :: Pytest
  Classifier: Development Status :: 3 - Alpha
+ Classifier: License :: OSI Approved :: MIT License
  Classifier: Operating System :: OS Independent
  Requires-Dist: PyYAML (>=6,<7)
  Requires-Dist: boto3 (>=1.28,<2.0)
@@ -34,9 +35,12 @@ Description-Content-Type: text/markdown
  ![PyPI - Version](https://img.shields.io/pypi/v/pytest-allure-host.svg)
  ![License: MIT](https://img.shields.io/badge/License-MIT-blue.svg)
  [![Docs](https://img.shields.io/badge/docs-site-blue)](https://darrenrabbs.github.io/allurehosting/)
+ [![CDK Stack](https://img.shields.io/badge/CDK%20Stack-repo-blueviolet)](https://github.com/darrenrabbs/allurehosting-cdk)
 
  Publish Allure static reports to private S3 behind CloudFront with history preservation and SPA-friendly routing.
 
+ Optional infrastructure (AWS CDK stack to provision the private S3 bucket + CloudFront OAC distribution) lives externally: https://github.com/darrenrabbs/allurehosting-cdk
+
  See `docs/architecture.md` and `.github/copilot-instructions.md` for architecture and design constraints.
 
  ## Documentation
@@ -63,6 +67,39 @@ The README intentionally stays lean—refer to the site for detailed guidance.
  - Columns: Run ID, raw epoch, UTC Time (human readable), Size (pretty units), P/F/B (passed/failed/broken counts), links to the immutable run and the moving latest
  - Newest run highlighted with a star (★) and soft background
 
+ ## Quick start
+
+ ```bash
+ # Install the publisher
+ pip install pytest-allure-host
+
+ # Run your test suite and produce allure-results/
+ pytest --alluredir=allure-results
+
+ # Plan (no uploads) – shows what would be published
+ publish-allure \
+ --bucket my-allure-bucket \
+ --project myproj \
+ --branch main \
+ --dry-run --summary-json plan.json
+
+ # Real publish (requires AWS creds: env vars, profile, or OIDC)
+ publish-allure \
+ --bucket my-allure-bucket \
+ --project myproj \
+ --branch main
+ ```
+
+ Notes:
+
+ - `--prefix` defaults to `reports`; omit unless you need a different root.
+ - `--branch` defaults to `$GIT_BRANCH` or `main` if unset.
+ - Add `--cloudfront https://reports.example.com` to print CDN URLs.
+ - Use `--check` to preflight (AWS / allure binary / inputs) before a real run.
+ - Add `--context-url https://jira.example.com/browse/PROJ-123` to link a change ticket in the runs index.
+ - Use `--dry-run` + `--summary-json` in CI for a planning stage artifact.
+ - Provide `--ttl-days` and/or `--max-keep-runs` for lifecycle & cost controls.
+
  ## Requirements
 
  - Python 3.9+
@@ -230,6 +267,56 @@ Pytest-driven (plugin):
  --allure-max-keep-runs 10
  ```
 
+ ### Minimal publish-only workflow
+
+ Create `.github/workflows/allure-publish.yml` for a lightweight pipeline that runs tests, generates the report, and publishes it (using secrets for the bucket and AWS credentials):
+
+ ```yaml
+ name: allure-publish
+ on: [push, pull_request]
+ jobs:
+   publish:
+     runs-on: ubuntu-latest
+     permissions:
+       contents: read
+     steps:
+       - uses: actions/checkout@v4
+       - uses: actions/setup-python@v5
+         with:
+           python-version: "3.11"
+       - name: Install deps (minimal)
+         run: pip install pytest pytest-allure-host allure-pytest
+       - name: Run tests
+         run: pytest --alluredir=allure-results -q
+       - name: Publish Allure report (dry-run on PRs)
+         env:
+           AWS_REGION: us-east-1
+           AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
+           AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
+           ALLURE_BUCKET: ${{ secrets.ALLURE_BUCKET }}
+         run: |
+           EXTRA=""
+           if [ "${{ github.event_name }}" = "pull_request" ]; then EXTRA="--dry-run"; fi
+           publish-allure \
+             --bucket "$ALLURE_BUCKET" \
+             --project myproj \
+             --branch "${{ github.ref_name }}" \
+             --summary-json summary.json $EXTRA
+       - name: Upload publish summary (always)
+         if: always()
+         uses: actions/upload-artifact@v4
+         with:
+           name: allure-summary
+           path: summary.json
+ ```
+
+ Notes:
+
+ - Add `--cloudfront https://reports.example.com` if you have a CDN domain.
+ - Add `--context-url ${{ github.server_url }}/${{ github.repository }}/pull/${{ github.event.pull_request.number }}` inside PRs to link the run to its PR.
+ - Use `--max-keep-runs` / `--ttl-days` to manage storage costs.
+ - For LocalStack-based tests, set `--s3-endpoint` and export `ALLURE_S3_ENDPOINT` in `env:`.
+
  ## Troubleshooting
 
  - Missing Allure binary: ensure the Allure CLI is installed and on PATH.
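The Quick start and workflow hunks above drive everything through the `publish-allure` console script. The sketch below mirrors the same flow from Python, using only the imports and keyword names visible in the new `pytest_allure_host.cli` module later in this diff; it is illustrative, and any `PublishConfig` fields or defaults not shown in the diff are assumptions.

```python
# Minimal sketch of the programmatic equivalent of the Quick start CLI flow.
# Import paths and keyword names are taken from the cli.py hunk later in this
# diff; whether the remaining PublishConfig fields have defaults is not visible
# here, so only a representative subset is set.
from pathlib import Path

from pytest_allure_host import PublishConfig, default_run_id
from pytest_allure_host.publisher import Paths, plan_dry_run, publish

DO_UPLOAD = False  # flip to True for a real publish (needs AWS credentials)

cfg = PublishConfig(
    bucket="my-allure-bucket",  # --bucket
    prefix="reports",           # the CLI default when --prefix is omitted
    project="myproj",           # --project
    branch="main",              # --branch
    run_id=default_run_id(),    # the CLI's --run-id default
)
paths = Paths(results=Path("allure-results"), report=Path("allure-report"))

if DO_UPLOAD:
    print(publish(cfg, paths=paths))       # mirrors the second CLI invocation
else:
    print(plan_dry_run(cfg, paths=paths))  # mirrors --dry-run: plan, no uploads
```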
@@ -5,9 +5,12 @@
  ![PyPI - Version](https://img.shields.io/pypi/v/pytest-allure-host.svg)
  ![License: MIT](https://img.shields.io/badge/License-MIT-blue.svg)
  [![Docs](https://img.shields.io/badge/docs-site-blue)](https://darrenrabbs.github.io/allurehosting/)
+ [![CDK Stack](https://img.shields.io/badge/CDK%20Stack-repo-blueviolet)](https://github.com/darrenrabbs/allurehosting-cdk)
 
  Publish Allure static reports to private S3 behind CloudFront with history preservation and SPA-friendly routing.
 
+ Optional infrastructure (AWS CDK stack to provision the private S3 bucket + CloudFront OAC distribution) lives externally: https://github.com/darrenrabbs/allurehosting-cdk
+
  See `docs/architecture.md` and `.github/copilot-instructions.md` for architecture and design constraints.
 
  ## Documentation
@@ -34,6 +37,39 @@ The README intentionally stays lean—refer to the site for detailed guidance.
  - Columns: Run ID, raw epoch, UTC Time (human readable), Size (pretty units), P/F/B (passed/failed/broken counts), links to the immutable run and the moving latest
  - Newest run highlighted with a star (★) and soft background
 
+ ## Quick start
+
+ ```bash
+ # Install the publisher
+ pip install pytest-allure-host
+
+ # Run your test suite and produce allure-results/
+ pytest --alluredir=allure-results
+
+ # Plan (no uploads) – shows what would be published
+ publish-allure \
+ --bucket my-allure-bucket \
+ --project myproj \
+ --branch main \
+ --dry-run --summary-json plan.json
+
+ # Real publish (requires AWS creds: env vars, profile, or OIDC)
+ publish-allure \
+ --bucket my-allure-bucket \
+ --project myproj \
+ --branch main
+ ```
+
+ Notes:
+
+ - `--prefix` defaults to `reports`; omit unless you need a different root.
+ - `--branch` defaults to `$GIT_BRANCH` or `main` if unset.
+ - Add `--cloudfront https://reports.example.com` to print CDN URLs.
+ - Use `--check` to preflight (AWS / allure binary / inputs) before a real run.
+ - Add `--context-url https://jira.example.com/browse/PROJ-123` to link a change ticket in the runs index.
+ - Use `--dry-run` + `--summary-json` in CI for a planning stage artifact.
+ - Provide `--ttl-days` and/or `--max-keep-runs` for lifecycle & cost controls.
+
  ## Requirements
 
  - Python 3.9+
@@ -201,6 +237,56 @@ Pytest-driven (plugin):
  --allure-max-keep-runs 10
  ```
 
+ ### Minimal publish-only workflow
+
+ Create `.github/workflows/allure-publish.yml` for a lightweight pipeline that runs tests, generates the report, and publishes it (using secrets for the bucket and AWS credentials):
+
+ ```yaml
+ name: allure-publish
+ on: [push, pull_request]
+ jobs:
+   publish:
+     runs-on: ubuntu-latest
+     permissions:
+       contents: read
+     steps:
+       - uses: actions/checkout@v4
+       - uses: actions/setup-python@v5
+         with:
+           python-version: "3.11"
+       - name: Install deps (minimal)
+         run: pip install pytest pytest-allure-host allure-pytest
+       - name: Run tests
+         run: pytest --alluredir=allure-results -q
+       - name: Publish Allure report (dry-run on PRs)
+         env:
+           AWS_REGION: us-east-1
+           AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
+           AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
+           ALLURE_BUCKET: ${{ secrets.ALLURE_BUCKET }}
+         run: |
+           EXTRA=""
+           if [ "${{ github.event_name }}" = "pull_request" ]; then EXTRA="--dry-run"; fi
+           publish-allure \
+             --bucket "$ALLURE_BUCKET" \
+             --project myproj \
+             --branch "${{ github.ref_name }}" \
+             --summary-json summary.json $EXTRA
+       - name: Upload publish summary (always)
+         if: always()
+         uses: actions/upload-artifact@v4
+         with:
+           name: allure-summary
+           path: summary.json
+ ```
+
+ Notes:
+
+ - Add `--cloudfront https://reports.example.com` if you have a CDN domain.
+ - Add `--context-url ${{ github.server_url }}/${{ github.repository }}/pull/${{ github.event.pull_request.number }}` inside PRs to link the run to its PR.
+ - Use `--max-keep-runs` / `--ttl-days` to manage storage costs.
+ - For LocalStack-based tests, set `--s3-endpoint` and export `ALLURE_S3_ENDPOINT` in `env:`.
+
  ## Troubleshooting
 
  - Missing Allure binary: ensure the Allure CLI is installed and on PATH.
@@ -4,7 +4,7 @@ build-backend = "poetry.core.masonry.api"
 
  [project]
  name = "pytest-allure-host"
- version = "0.1.2"
+ version = "2.0.0"
  description = "Publish Allure static reports to private S3 behind CloudFront with history preservation"
  readme = "README.md"
  license = "MIT"
@@ -22,6 +22,7 @@ classifiers = [
  "Topic :: Software Development :: Testing",
  "Framework :: Pytest",
  "Development Status :: 3 - Alpha",
+ "License :: OSI Approved :: MIT License",
  "Operating System :: OS Independent"
  ]
 
@@ -47,6 +48,9 @@ pip-audit = ">=2.7,<3.0"
  black = ">=24,<26"
  mkdocs = ">=1.5,<2.0"
  mkdocs-material = ">=9.5,<10.0"
+ playwright = ">=1.44,<2.0" # optional UI interaction tests (RUN_UI=1)
+ beautifulsoup4 = "^4.12.3"
+ Pillow = ">=10,<11" # for webp conversion in screenshot helper
 
  [project.scripts]
  publish-allure = "pytest_allure_host.cli:main"
@@ -63,17 +67,45 @@ Changelog = "https://darrenrabbs.github.io/allurehosting/changelog/"
 
  # Package include (PEP 621 doesn't specify this; still handled by Poetry configuration)
  [tool.poetry]
+ # Duplicate minimal metadata for backward compatibility with Poetry 1.x commands; primary metadata is under [project].
+ name = "pytest-allure-host"
+ version = "2.0.0"
+ description = "Publish Allure static reports to private S3 behind CloudFront with history preservation"
+ authors = ["Allure Hosting Maintainers"]
+ license = "MIT"
+ readme = "README.md"
  packages = [{ include = "pytest_allure_host" }]
- # PEP 621 is authoritative; legacy duplicate metadata removed (require Poetry 2.2.1+)
+ # Limit published artifacts strictly to the runtime package; exclude development & generated dirs.
+ exclude = [
+ "tests",
+ "docs",
+ "infra",
+ "reports",
+ "allure-results",
+ "allure-report",
+ "demo_run",
+ "scripts/*.sh",
+ "*.iml",
+ ]
+ # Both PEP 621 ([project]) and minimal [tool.poetry] are present; [project] is the source of truth.
 
  [tool.poetry.dependencies]
- # Retained only for Poetry's internal processing of the runtime dependency group.
+ # Retained for Poetry 1.x compatibility
  python = ">=3.9,<4.0"
 
 
  [tool.ruff]
  line-length = 100
  target-version = "py39"
+ extend-exclude = [
+ "dev/**",
+ "docs/**",
+ "site/**",
+ "site-internal/**",
+ ".lh/**",
+ ".history/**",
+ "scripts/**",
+ ]
 
  [tool.ruff.lint]
  select = ["E", "F", "I", "B", "UP", "S", "W", "C90"]
@@ -0,0 +1,14 @@
+ from importlib import metadata as _md
+
+ from .utils import PublishConfig, default_run_id # re-export key types
+
+ try: # runtime version (works inside installed env)
+     __version__ = _md.version("pytest-allure-host")
+ except Exception: # pragma: no cover
+     __version__ = "0.0.0+unknown"
+
+ __all__ = [
+     "PublishConfig",
+     "default_run_id",
+     "__version__",
+ ]
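For orientation, here is a small sketch (assuming the wheel is installed in the active environment) of what the new package `__init__` module above exposes:

```python
# Small sketch exercising the re-exports in the new __init__ module above.
# Assumes pytest-allure-host is installed; if the distribution metadata is
# missing, __version__ falls back to "0.0.0+unknown" as the module shows.
import pytest_allure_host as pah

print(pah.__version__)       # resolved via importlib.metadata
print(pah.default_run_id())  # default run identifier, also used by the CLI
print(pah.PublishConfig)     # re-exported configuration type from .utils
```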
@@ -0,0 +1,291 @@
+ from __future__ import annotations
+
+ import argparse
+ import os
+ from pathlib import Path
+
+ from . import __version__
+ from .config import load_effective_config
+ from .publisher import plan_dry_run, preflight, publish
+ from .utils import PublishConfig, default_run_id
+
+
+ def parse_args() -> argparse.Namespace:
+     p = argparse.ArgumentParser("publish-allure")
+     p.add_argument(
+         "--version",
+         action="store_true",
+         help="Print version and exit",
+     )
+     p.add_argument("--config", help="Path to YAML config (optional)")
+     p.add_argument("--bucket")
+     p.add_argument("--prefix", default=None)
+     p.add_argument("--project")
+     p.add_argument("--branch", default=os.getenv("GIT_BRANCH", "main"))
+     p.add_argument(
+         "--run-id",
+         default=os.getenv("ALLURE_RUN_ID", default_run_id()),
+     )
+     p.add_argument("--cloudfront", default=os.getenv("ALLURE_CLOUDFRONT"))
+     p.add_argument(
+         "--results",
+         "--results-dir",
+         dest="results",
+         default=os.getenv("ALLURE_RESULTS_DIR", "allure-results"),
+         help="Path to allure-results directory (alias: --results-dir)",
+     )
+     p.add_argument(
+         "--report",
+         default=os.getenv("ALLURE_REPORT_DIR", "allure-report"),
+         help="Output directory for generated Allure static report",
+     )
+     p.add_argument("--ttl-days", type=int, default=None)
+     p.add_argument("--max-keep-runs", type=int, default=None)
+     p.add_argument(
+         "--sse",
+         default=os.getenv("ALLURE_S3_SSE"),
+         help="Server-side encryption algorithm (AES256 or aws:kms)",
+     )
+     p.add_argument(
+         "--sse-kms-key-id",
+         default=os.getenv("ALLURE_S3_SSE_KMS_KEY_ID"),
+         help="KMS Key ID / ARN when --sse=aws:kms",
+     )
+     p.add_argument(
+         "--s3-endpoint",
+         default=os.getenv("ALLURE_S3_ENDPOINT"),
+         help=("Custom S3 endpoint URL (e.g. http://localhost:4566)"),
+     )
+     p.add_argument("--summary-json", default=None)
+     p.add_argument(
+         "--context-url",
+         default=os.getenv("ALLURE_CONTEXT_URL"),
+         help="Optional hyperlink giving change context (e.g. Jira ticket)",
+     )
+     p.add_argument(
+         "--meta",
+         action="append",
+         default=[],
+         metavar="KEY=VAL",
+         help=(
+             "Attach arbitrary metadata (repeatable). Example: --meta "
+             "jira=PROJ-123 --meta env=staging. Adds dynamic columns to "
+             "runs index & manifest."
+         ),
+     )
+     p.add_argument("--dry-run", action="store_true", help="Plan only")
+     p.add_argument(
+         "--check",
+         action="store_true",
+         help="Run preflight checks (AWS, allure, inputs)",
+     )
+     p.add_argument(
+         "--verbose-summary",
+         action="store_true",
+         help="Print extended summary (CDN prefixes, manifest path, metadata)",
+     )
+     p.add_argument(
+         "--allow-duplicate-prefix-project",
+         action="store_true",
+         help=(
+             "Bypass guard preventing prefix==project duplication. "
+             "Only use if you intentionally want that folder layout."
+         ),
+     )
+     p.add_argument(
+         "--upload-workers",
+         type=int,
+         default=None,
+         help="Parallel upload worker threads (auto if unset)",
+     )
+     p.add_argument(
+         "--copy-workers",
+         type=int,
+         default=None,
+         help="Parallel copy worker threads for latest promotion",
+     )
+     p.add_argument(
+         "--archive-run",
+         action="store_true",
+         help="Also produce a compressed archive of the run (tar.gz)",
+     )
+     p.add_argument(
+         "--archive-format",
+         choices=["tar.gz", "zip"],
+         default="tar.gz",
+         help="Archive format when --archive-run is set",
+     )
+     return p.parse_args()
+
+
+ def _parse_metadata(pairs: list[str]) -> dict | None:
+     if not pairs:
+         return None
+     meta: dict[str, str] = {}
+     for raw in pairs:
+         if "=" not in raw:
+             continue
+         k, v = raw.split("=", 1)
+         k = k.strip()
+         v = v.strip()
+         if not k:
+             continue
+         safe_k = k.lower().replace("-", "_")
+         if safe_k and v:
+             meta[safe_k] = v
+     return meta or None
+
+
+ def _build_cli_overrides(args: argparse.Namespace) -> dict:
+     return {
+         "bucket": args.bucket,
+         "prefix": args.prefix,
+         "project": args.project,
+         "branch": args.branch,
+         "cloudfront": args.cloudfront,
+         "run_id": args.run_id,
+         "ttl_days": args.ttl_days,
+         "max_keep_runs": args.max_keep_runs,
+         "s3_endpoint": args.s3_endpoint,
+         "context_url": args.context_url,
+         "sse": args.sse,
+         "sse_kms_key_id": args.sse_kms_key_id,
+     }
+
+
+ def _effective_config(args: argparse.Namespace) -> tuple[dict, PublishConfig]:
+     overrides = _build_cli_overrides(args)
+     effective = load_effective_config(overrides, args.config)
+     cfg_source = effective.get("_config_file")
+     if cfg_source:
+         print(f"[config] loaded settings from {cfg_source}")
+     missing = [k for k in ("bucket", "project") if not effective.get(k)]
+     if missing:
+         missing_list = ", ".join(missing)
+         raise SystemExit(
+             f"Missing required config values: {missing_list}. Provide via CLI, env, or YAML."
+         )
+     cfg = PublishConfig(
+         bucket=effective["bucket"],
+         prefix=effective.get("prefix") or "reports",
+         project=effective["project"],
+         branch=effective.get("branch") or args.branch,
+         run_id=effective.get("run_id") or args.run_id,
+         cloudfront_domain=effective.get("cloudfront"),
+         ttl_days=effective.get("ttl_days"),
+         max_keep_runs=effective.get("max_keep_runs"),
+         s3_endpoint=effective.get("s3_endpoint"),
+         context_url=effective.get("context_url"),
+         sse=effective.get("sse"),
+         sse_kms_key_id=effective.get("sse_kms_key_id"),
+         metadata=_parse_metadata(args.meta),
+         upload_workers=args.upload_workers,
+         copy_workers=args.copy_workers,
+         archive_run=args.archive_run,
+         archive_format=args.archive_format if args.archive_run else None,
+     )
+     # Guard against accidental duplication like prefix==project producing
+     # 'reports/reports/<branch>/...' paths. This is usually unintentional
+     # and makes report URLs longer / redundant. Fail fast so users can
+     # correct config explicitly (they can still deliberately choose this
+     # by changing either value slightly, e.g. prefix='reports',
+     # project='team-reports').
+     if cfg.prefix == cfg.project and not getattr(args, "allow_duplicate_prefix_project", False):
+         parts = [
+             "Invalid config: prefix and project are identical (",
+             f"'{cfg.project}'). ",
+             "This yields duplicated S3 paths (",
+             f"{cfg.prefix}/{cfg.project}/<branch>/...). ",
+             "Set distinct values (e.g. prefix='reports', project='payments').",
+         ]
+         raise SystemExit("".join(parts))
+     return effective, cfg
+
+
+ def _write_json(path: str, payload: dict) -> None:
+     import json
+
+     with open(path, "w", encoding="utf-8") as f:
+         json.dump(payload, f, indent=2)
+
+
+ def _print_publish_summary(
+     cfg: PublishConfig,
+     out: dict,
+     verbose: bool = False,
+ ) -> None:
+     print("Publish complete")
+     if out.get("run_url"):
+         print(f"Run URL: {out['run_url']}")
+     if out.get("latest_url"):
+         print(f"Latest URL: {out['latest_url']}")
+     # Main aggregated runs index (HTML) at branch root if CDN configured
+     if cfg.cloudfront_domain:
+         branch_root = f"{cfg.prefix}/{cfg.project}/{cfg.branch}"
+         cdn_root = cfg.cloudfront_domain.rstrip("/")
+         runs_index_url = f"{cdn_root}/{branch_root}/runs/index.html"
+         print(f"Runs Index URL: {runs_index_url}")
+     run_prefix = out.get("run_prefix") or cfg.s3_run_prefix
+     latest_prefix = out.get("latest_prefix") or cfg.s3_latest_prefix
+     print(f"S3 run prefix: s3://{cfg.bucket}/{run_prefix}")
+     print(f"S3 latest prefix: s3://{cfg.bucket}/{latest_prefix}")
+     print(
+         "Report files: "
+         f"{out.get('report_files', '?')} Size: "
+         f"{out.get('report_size_bytes', '?')} bytes"
+     )
+     if verbose and cfg.cloudfront_domain:
+         # Duplicate earlier lines but clarify this is the CDN-root mapping
+         print("CDN run prefix (index root):", cfg.url_run())
+         print("CDN latest prefix (index root):", cfg.url_latest())
+     if verbose:
+         # Manifest stored at branch root under runs/index.json
+         branch_root = f"{cfg.prefix}/{cfg.project}/{cfg.branch}"
+         manifest_key = f"{branch_root}/runs/index.json"
+         print("Manifest object:", f"s3://{cfg.bucket}/{manifest_key}")
+         if cfg.metadata:
+             print("Metadata keys:", ", ".join(sorted(cfg.metadata.keys())))
+         if cfg.sse:
+             print("Encryption:", cfg.sse, cfg.sse_kms_key_id or "")
+
+
+ def main() -> int: # noqa: C901 (reduced but keep guard just in case)
+     args = parse_args()
+     if args.version:
+         print(__version__)
+         return 0
+     effective, cfg = _effective_config(args)
+     # Construct explicit Paths honoring custom results/report dirs
+     paths = None
+     try:
+         mod = __import__("pytest_allure_host.publisher", fromlist=["Paths"])
+         paths = mod.publisher.Paths(
+             results=Path(args.results),
+             report=Path(args.report),
+         )
+     except Exception: # pragma: no cover - defensive fallback
+         from .publisher import Paths # type: ignore
+
+         paths = Paths(results=Path(args.results), report=Path(args.report))
+
+     if args.check:
+         checks = preflight(cfg, paths=paths)
+         print(checks)
+         if not all(checks.values()):
+             return 2
+     if args.dry_run:
+         plan = plan_dry_run(cfg, paths=paths)
+         print(plan)
+         if args.summary_json:
+             _write_json(args.summary_json, plan)
+         return 0
+     out = publish(cfg, paths=paths)
+     print(out) # raw dict for backward compatibility
+     _print_publish_summary(cfg, out, verbose=args.verbose_summary)
+     if args.summary_json:
+         _write_json(args.summary_json, out)
+     return 0
+
+
+ if __name__ == "__main__": # pragma: no cover
+     raise SystemExit(main())
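One detail worth calling out from the CLI module above: `--meta KEY=VAL` pairs are normalized before they become extra columns in the runs index. The snippet below illustrates that behavior by calling the private `_parse_metadata` helper shown in the hunk (for illustration only; it is not public API).

```python
# Illustration of the --meta normalization implemented by _parse_metadata above:
# keys are lower-cased with dashes mapped to underscores, entries without "="
# or with an empty key/value are dropped, and an empty result becomes None.
from pytest_allure_host.cli import _parse_metadata

print(_parse_metadata(["JIRA=PROJ-123", "build-env = staging"]))
# expected: {'jira': 'PROJ-123', 'build_env': 'staging'}

print(_parse_metadata(["malformed", "empty="]))
# expected: None (no valid key=value pairs survive)
```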