uploadwhen-1.0.0.tar.gz
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- uploadwhen-1.0.0/LICENSE +21 -0
- uploadwhen-1.0.0/PKG-INFO +129 -0
- uploadwhen-1.0.0/README.md +98 -0
- uploadwhen-1.0.0/pyproject.toml +44 -0
- uploadwhen-1.0.0/setup.cfg +4 -0
- uploadwhen-1.0.0/uploadwhen/__init__.py +6 -0
- uploadwhen-1.0.0/uploadwhen/__main__.py +6 -0
- uploadwhen-1.0.0/uploadwhen/analyzer.py +120 -0
- uploadwhen-1.0.0/uploadwhen/cli.py +203 -0
- uploadwhen-1.0.0/uploadwhen/fetcher.py +115 -0
- uploadwhen-1.0.0/uploadwhen.egg-info/PKG-INFO +129 -0
- uploadwhen-1.0.0/uploadwhen.egg-info/SOURCES.txt +14 -0
- uploadwhen-1.0.0/uploadwhen.egg-info/dependency_links.txt +1 -0
- uploadwhen-1.0.0/uploadwhen.egg-info/entry_points.txt +2 -0
- uploadwhen-1.0.0/uploadwhen.egg-info/requires.txt +5 -0
- uploadwhen-1.0.0/uploadwhen.egg-info/top_level.txt +1 -0
uploadwhen-1.0.0/LICENSE
ADDED
@@ -0,0 +1,21 @@
+MIT License
+
+Copyright (c) 2026 Soubhagya
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all
+copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+SOFTWARE.
uploadwhen-1.0.0/PKG-INFO
ADDED
@@ -0,0 +1,129 @@
+Metadata-Version: 2.4
+Name: uploadwhen
+Version: 1.0.0
+Summary: Find out exactly when a YouTube video was uploaded — date & time in IST.
+Author: SoubhagyaSwain
+License: MIT
+Project-URL: Homepage, https://github.com/soubhagya2003jan/Uploadwhen
+Project-URL: Repository, https://github.com/soubhagya2003jan/Uploadwhen
+Keywords: youtube,upload-date,cli,yt-dlp,upload-time
+Classifier: Development Status :: 4 - Beta
+Classifier: Environment :: Console
+Classifier: Intended Audience :: Developers
+Classifier: Intended Audience :: End Users/Desktop
+Classifier: License :: OSI Approved :: MIT License
+Classifier: Operating System :: OS Independent
+Classifier: Programming Language :: Python :: 3
+Classifier: Programming Language :: Python :: 3.9
+Classifier: Programming Language :: Python :: 3.10
+Classifier: Programming Language :: Python :: 3.11
+Classifier: Programming Language :: Python :: 3.12
+Classifier: Programming Language :: Python :: 3.13
+Classifier: Topic :: Multimedia :: Video
+Classifier: Topic :: Utilities
+Requires-Python: >=3.9
+Description-Content-Type: text/markdown
+License-File: LICENSE
+Requires-Dist: yt-dlp>=2024.0.0
+Requires-Dist: rich>=13.0.0
+Requires-Dist: tzdata; sys_platform == "win32"
+Dynamic: license-file
+
+# uploadwhen
+
+> Find out **exactly** when a YouTube video was uploaded — date & time across multiple time zones.
+
+A Python CLI tool that extracts upload timestamps from YouTube videos and converts them to **IST, US/ET, AU/AEST, JP/JST, and UTC** (24-hour format). It also computes **upload consistency analytics** for any channel.
+
+Built with [yt-dlp](https://github.com/yt-dlp/yt-dlp) and [rich](https://github.com/Textualize/rich).
+
+---
+
+## Installation
+
+```bash
+pip install uploadwhen
+```
+
+## Quick Start
+
+```bash
+uploadwhen "https://www.youtube.com/watch?v=dQw4w9WgXcQ"
+```
+
+## Usage
+
+```
+uploadwhen [URL] [OPTIONS]
+```
+
+### Options
+
+| Flag | Short | Description |
+|---|---|---|
+| `--version` | `-V` | Show version |
+| `--help` | `-h` | Show help |
+| `--json` | | Export results to `uploadwhen_results.json` |
+| `--csv` | | Export results to `uploadwhen_results.csv` |
+| `--analyze` | | Fetch recent uploads and show consistency stats |
+| `--count N` | `-n N` | Number of uploads to analyze (default: 10) |
+| `--verbose` | `-v` | Show detailed yt-dlp output |
+
+### Examples
+
+```bash
+# Single video — shows upload date & time
+uploadwhen "https://www.youtube.com/watch?v=dQw4w9WgXcQ"
+
+# Analyze a channel's upload consistency (last 10 videos)
+uploadwhen "https://www.youtube.com/@theRadBrad" --analyze
+
+# Analyze last 25 uploads from a channel
+uploadwhen "https://www.youtube.com/@tseries" --analyze --count 25
+
+# Auto-detect channel from a video URL
+uploadwhen "https://www.youtube.com/watch?v=dQw4w9WgXcQ" --analyze
+
+# Export to JSON and CSV
+uploadwhen "https://www.youtube.com/watch?v=dQw4w9WgXcQ" --json --csv
+```
+
+## Time Zones
+
+| Zone | Region |
+|---|---|
+| **IST** | India / Asia/Kolkata (UTC+05:30) |
+| **ET** | US / New York (DST-aware) |
+| **AEST** | Australia / Sydney (DST-aware) |
+| **JST** | Japan / Tokyo (UTC+09:00) |
+| **UTC** | Reference time |
+
+## Upload Consistency Analytics
+
+The `--analyze` flag fetches recent uploads and computes:
+
+| Metric | Description |
+|---|---|
+| Avg gap | Average time between uploads |
+| Median gap | Middle value of all gaps |
+| Shortest / Longest gap | Extremes |
+| Std deviation | How much upload timing varies |
+| Consistency score (0–10) | Higher = more regular uploads |
+
+Gaps under 1 day are shown in hours for clarity.
+
+Use `--count N` to analyze more or fewer videos:
+
+```bash
+uploadwhen "https://www.youtube.com/@channel" --analyze --count 50
+```
+
+## Requirements
+
+- Python ≥ 3.9
+- [yt-dlp](https://github.com/yt-dlp/yt-dlp)
+- [rich](https://github.com/Textualize/rich)
+
+## License
+
+MIT
uploadwhen-1.0.0/README.md
ADDED
@@ -0,0 +1,98 @@
+# uploadwhen
+
+> Find out **exactly** when a YouTube video was uploaded — date & time across multiple time zones.
+
+A Python CLI tool that extracts upload timestamps from YouTube videos and converts them to **IST, US/ET, AU/AEST, JP/JST, and UTC** (24-hour format). It also computes **upload consistency analytics** for any channel.
+
+Built with [yt-dlp](https://github.com/yt-dlp/yt-dlp) and [rich](https://github.com/Textualize/rich).
+
+---
+
+## Installation
+
+```bash
+pip install uploadwhen
+```
+
+## Quick Start
+
+```bash
+uploadwhen "https://www.youtube.com/watch?v=dQw4w9WgXcQ"
+```
+
+## Usage
+
+```
+uploadwhen [URL] [OPTIONS]
+```
+
+### Options
+
+| Flag | Short | Description |
+|---|---|---|
+| `--version` | `-V` | Show version |
+| `--help` | `-h` | Show help |
+| `--json` | | Export results to `uploadwhen_results.json` |
+| `--csv` | | Export results to `uploadwhen_results.csv` |
+| `--analyze` | | Fetch recent uploads and show consistency stats |
+| `--count N` | `-n N` | Number of uploads to analyze (default: 10) |
+| `--verbose` | `-v` | Show detailed yt-dlp output |
+
+### Examples
+
+```bash
+# Single video — shows upload date & time
+uploadwhen "https://www.youtube.com/watch?v=dQw4w9WgXcQ"
+
+# Analyze a channel's upload consistency (last 10 videos)
+uploadwhen "https://www.youtube.com/@theRadBrad" --analyze
+
+# Analyze last 25 uploads from a channel
+uploadwhen "https://www.youtube.com/@tseries" --analyze --count 25
+
+# Auto-detect channel from a video URL
+uploadwhen "https://www.youtube.com/watch?v=dQw4w9WgXcQ" --analyze
+
+# Export to JSON and CSV
+uploadwhen "https://www.youtube.com/watch?v=dQw4w9WgXcQ" --json --csv
+```
+
+## Time Zones
+
+| Zone | Region |
+|---|---|
+| **IST** | India / Asia/Kolkata (UTC+05:30) |
+| **ET** | US / New York (DST-aware) |
+| **AEST** | Australia / Sydney (DST-aware) |
+| **JST** | Japan / Tokyo (UTC+09:00) |
+| **UTC** | Reference time |
+
+## Upload Consistency Analytics
+
+The `--analyze` flag fetches recent uploads and computes:
+
+| Metric | Description |
+|---|---|
+| Avg gap | Average time between uploads |
+| Median gap | Middle value of all gaps |
+| Shortest / Longest gap | Extremes |
+| Std deviation | How much upload timing varies |
+| Consistency score (0–10) | Higher = more regular uploads |
+
+Gaps under 1 day are shown in hours for clarity.
+
+Use `--count N` to analyze more or fewer videos:
+
+```bash
+uploadwhen "https://www.youtube.com/@channel" --analyze --count 50
+```
+
+## Requirements
+
+- Python ≥ 3.9
+- [yt-dlp](https://github.com/yt-dlp/yt-dlp)
+- [rich](https://github.com/Textualize/rich)
+
+## License
+
+MIT
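The Time Zones table in the README maps each display label to an IANA zone name. A minimal sketch of the conversion with the stdlib `zoneinfo` module the package relies on (the upload instant below is made up for illustration):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # stdlib since Python 3.9; needs tzdata on Windows

# Hypothetical upload instant as a single UTC moment.
upload_utc = datetime(2024, 1, 1, 12, 0, 0, tzinfo=timezone.utc)

# Display labels from the README mapped to IANA zone names.
zones = {
    "IST": "Asia/Kolkata",
    "ET": "America/New_York",
    "AEST": "Australia/Sydney",
    "JST": "Asia/Tokyo",
}

# One instant, rendered on a 24-hour clock in each zone (DST handled by tzdata).
local_times = {label: upload_utc.astimezone(ZoneInfo(name)).strftime("%H:%M")
               for label, name in zones.items()}
```

Because `astimezone` consults the zone database, the ET and Sydney results shift automatically across daylight-saving boundaries, which is why those rows are marked "DST-aware" in the table.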
uploadwhen-1.0.0/pyproject.toml
ADDED
@@ -0,0 +1,44 @@
+[build-system]
+requires = ["setuptools>=68.0", "wheel"]
+build-backend = "setuptools.build_meta"
+
+[project]
+name = "uploadwhen"
+version = "1.0.0"
+description = "Find out exactly when a YouTube video was uploaded — date & time in IST."
+readme = "README.md"
+license = {text = "MIT"}
+requires-python = ">=3.9"
+authors = [{name = "SoubhagyaSwain"}]
+keywords = ["youtube", "upload-date", "cli", "yt-dlp", "upload-time"]
+classifiers = [
+    "Development Status :: 4 - Beta",
+    "Environment :: Console",
+    "Intended Audience :: Developers",
+    "Intended Audience :: End Users/Desktop",
+    "License :: OSI Approved :: MIT License",
+    "Operating System :: OS Independent",
+    "Programming Language :: Python :: 3",
+    "Programming Language :: Python :: 3.9",
+    "Programming Language :: Python :: 3.10",
+    "Programming Language :: Python :: 3.11",
+    "Programming Language :: Python :: 3.12",
+    "Programming Language :: Python :: 3.13",
+    "Topic :: Multimedia :: Video",
+    "Topic :: Utilities",
+]
+dependencies = [
+    "yt-dlp>=2024.0.0",
+    "rich>=13.0.0",
+    "tzdata; sys_platform == 'win32'",
+]
+
+[project.urls]
+Homepage = "https://github.com/soubhagya2003jan/Uploadwhen"
+Repository = "https://github.com/soubhagya2003jan/Uploadwhen"
+
+[project.scripts]
+uploadwhen = "uploadwhen.cli:main"
+
+[tool.setuptools.packages.find]
+include = ["uploadwhen*"]
uploadwhen-1.0.0/uploadwhen/analyzer.py
ADDED
@@ -0,0 +1,120 @@
+"""Channel upload-consistency analytics."""
+
+from __future__ import annotations
+import math
+from datetime import datetime
+
+from rich.console import Console
+from rich.table import Table
+from rich.panel import Panel
+
+console = Console()
+
+
+def _sorted_dates(records: list[dict]) -> list[datetime]:
+    """Extract and sort UTC datetimes from records."""
+    dates = [r["datetime_utc"] for r in records if r.get("datetime_utc")]
+    dates.sort()
+    return dates
+
+
+def _gaps_in_days(dates: list[datetime]) -> list[float]:
+    """Compute inter-upload gaps in days."""
+    return [(dates[i + 1] - dates[i]).total_seconds() / 86400
+            for i in range(len(dates) - 1)]
+
+
+def _median(values: list[float]) -> float:
+    n = len(values)
+    s = sorted(values)
+    if n % 2 == 1:
+        return s[n // 2]
+    return (s[n // 2 - 1] + s[n // 2]) / 2
+
+
+def _std_dev(values: list[float], mean: float) -> float:
+    if len(values) < 2:
+        return 0.0
+    variance = sum((v - mean) ** 2 for v in values) / (len(values) - 1)
+    return math.sqrt(variance)
+
+
+def _consistency_score(std: float, mean: float) -> float:
+    """0–10 score using 10 / (1 + CV). Smoother curve that never hits zero."""
+    if mean == 0:
+        return 0.0
+    cv = std / mean
+    return round(10 / (1 + cv), 1)
+
+
+def _format_gap(days: float) -> str:
+    """Show hours when gap < 1 day, otherwise days."""
+    if days < 1:
+        hours = days * 24
+        return f"{round(hours, 1)} hours"
+    return f"{round(days, 2)} days"
+
+
+def analyze(records: list[dict]) -> dict | None:
+    """Compute upload-gap statistics. Returns None if < 2 dated videos."""
+    dates = _sorted_dates(records)
+    if len(dates) < 2:
+        return None
+
+    gaps = _gaps_in_days(dates)
+    avg = sum(gaps) / len(gaps)
+    med = _median(gaps)
+    shortest = min(gaps)
+    longest = max(gaps)
+    std = _std_dev(gaps, avg)
+    score = _consistency_score(std, avg)
+
+    return {
+        "channel": records[0].get("channel", "Unknown"),
+        "total_videos": len(dates),
+        "avg_gap": _format_gap(avg),
+        "median_gap": _format_gap(med),
+        "shortest_gap": _format_gap(shortest),
+        "longest_gap": _format_gap(longest),
+        "std_dev": _format_gap(std),
+        "consistency_score": score,
+    }
+
+
+def _score_color(score: float) -> str:
+    """Pick a Rich color based on score (0–10)."""
+    if score >= 7.5:
+        return "green"
+    if score >= 4.0:
+        return "yellow"
+    return "red"
+
+
+def print_analysis(stats: dict | None) -> None:
+    """Render analysis stats as a Rich panel."""
+    if stats is None:
+        console.print("\n [yellow]![/yellow] Need at least 2 dated videos for analytics.\n")
+        return
+
+    color = _score_color(stats["consistency_score"])
+
+    table = Table(show_header=False, box=None, padding=(0, 1), expand=False)
+    table.add_column("Metric", style="bold", width=26)
+    table.add_column("Value", justify="right")
+
+    table.add_row("Videos analysed", str(stats["total_videos"]))
+    table.add_row("Avg gap", stats["avg_gap"])
+    table.add_row("Median gap", stats["median_gap"])
+    table.add_row("Shortest gap", stats["shortest_gap"])
+    table.add_row("Longest gap", stats["longest_gap"])
+    table.add_row("Std deviation", stats["std_dev"])
+    table.add_row("Consistency score",
+                  f"[{color} bold]{stats['consistency_score']} / 10[/{color} bold]")
+
+    console.print()
+    console.print(Panel(
+        table,
+        title=f"[bold]Upload Consistency — {stats['channel']}[/bold]",
+        border_style=color,
+        expand=False,
+    ))
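The consistency score in `analyzer.py` is the coefficient of variation (sample std / mean) of the inter-upload gaps folded into `10 / (1 + CV)`. A standalone sketch of the same arithmetic, so the formula can be checked without installing the package (the gap values are invented):

```python
import math

def consistency_score(gaps: list[float]) -> float:
    """Standalone re-derivation of analyzer._consistency_score over gaps in days."""
    mean = sum(gaps) / len(gaps)
    if mean == 0:
        return 0.0
    if len(gaps) < 2:
        std = 0.0  # matches analyzer._std_dev's guard
    else:
        # Sample standard deviation (n - 1 denominator), as in analyzer._std_dev.
        std = math.sqrt(sum((g - mean) ** 2 for g in gaps) / (len(gaps) - 1))
    return round(10 / (1 + std / mean), 1)

# A perfectly regular schedule (one upload every 2 days) has CV = 0, so score 10.0.
regular = [2.0, 2.0, 2.0]
# An erratic schedule (half a day, then a week, then a day) scores much lower.
erratic = [0.5, 7.0, 1.0]
```

Because CV is scale-free, a channel uploading exactly weekly scores the same as one uploading exactly daily; only the *variability* of the rhythm is penalised.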
uploadwhen-1.0.0/uploadwhen/cli.py
ADDED
@@ -0,0 +1,203 @@
+"""CLI interface for uploadwhen."""
+
+import sys
+import json
+import csv
+import argparse
+from pathlib import Path
+
+from rich.console import Console
+from rich.table import Table
+from rich.panel import Panel
+from rich.text import Text
+from rich import box
+
+from uploadwhen import __version__
+from uploadwhen.fetcher import fetch_video, fetch_channel_videos, _is_channel_url
+from uploadwhen.analyzer import analyze, print_analysis
+
+console = Console()
+
+# Banner
+BANNER = """[bold cyan]
+██╗ ██╗██████╗ ██╗ ██████╗ █████╗ ██████╗ ██╗ ██╗██╗ ██╗███████╗███╗ ██╗
+██║ ██║██╔══██╗██║ ██╔═══██╗██╔══██╗██╔══██╗██║ ██║██║ ██║██╔════╝████╗ ██║
+██║ ██║██████╔╝██║ ██║ ██║███████║██║ ██║██║ █╗ ██║███████║█████╗ ██╔██╗ ██║
+██║ ██║██╔═══╝ ██║ ██║ ██║██╔══██║██║ ██║██║███╗██║██╔══██║██╔══╝ ██║╚██╗██║
+╚██████╔╝██║ ███████╗╚██████╔╝██║ ██║██████╔╝╚███╔███╔╝██║ ██║███████╗██║ ╚████║
+ ╚═════╝ ╚═╝ ╚══════╝ ╚═════╝ ╚═╝ ╚═╝╚═════╝ ╚══╝╚══╝ ╚═╝ ╚═╝╚══════╝╚═╝ ╚═══╝
+[/bold cyan][dim] Find out exactly when any YouTube video was uploaded.
+v""" + __version__ + "[/dim]"
+
+
+def _build_parser() -> argparse.ArgumentParser:
+    parser = argparse.ArgumentParser(
+        prog="uploadwhen",
+        description=(
+            "uploadwhen — Find out exactly when a YouTube video was uploaded.\n"
+            "Supports video URLs and channel-level upload consistency analysis.\n"
+            "Times shown in IST, US/ET, AU/AEST, JP/JST, and UTC."
+        ),
+        formatter_class=argparse.RawDescriptionHelpFormatter,
+        epilog=(
+            "Example:\n"
+            '  uploadwhen "https://www.youtube.com/watch?v=dQw4w9WgXcQ"\n'
+            "\n"
+            "Tip: Wrap URLs in quotes so your shell doesn't break on & characters."
+        ),
+    )
+
+    parser.add_argument("url", nargs="?", metavar="URL",
+                        help="A YouTube video or channel URL.")
+    parser.add_argument("-V", "--version", action="version",
+                        version=f"%(prog)s {__version__}")
+    parser.add_argument("--json", action="store_true", dest="export_json",
+                        help="Export results to uploadwhen_results.json")
+    parser.add_argument("--csv", action="store_true", dest="export_csv",
+                        help="Export results to uploadwhen_results.csv")
+    parser.add_argument("--analyze", action="store_true",
+                        help="Fetch recent uploads from the channel and show consistency stats.")
+    parser.add_argument("-n", "--count", type=int, default=10, metavar="N",
+                        help="Number of uploads to analyze (default: 10).")
+    parser.add_argument("-v", "--verbose", action="store_true",
+                        help="Show detailed yt-dlp output.")
+    return parser
+
+
+# Display —————————————————————————————————————————————————————————————————
+
+def _print_video(rec: dict) -> None:
+    table = Table(show_header=False, box=None, padding=(0, 1),
+                  expand=False, show_edge=False)
+    table.add_column("Key", style="bold", width=14)
+    table.add_column("Value")
+
+    table.add_row("Channel", rec["channel"])
+    table.add_row("Date", rec["date"])
+    table.add_row("Time (IST)", f"[green]{rec['time_ist']}[/green] [dim]India[/dim]")
+    table.add_row("Time (ET)", f"{rec['time_us']} [dim]US / New York[/dim]")
+    table.add_row("Time (AEST)", f"{rec['time_au']} [dim]Australia / Sydney[/dim]")
+    table.add_row("Time (JST)", f"{rec['time_jp']} [dim]Japan / Tokyo[/dim]")
+    table.add_row("Time (UTC)", f"[dim]{rec['time_utc']}[/dim]")
+    table.add_row("Duration", rec["duration"])
+    table.add_row("URL", f"[dim]{rec['video_url']}[/dim]")
+
+    console.print(Panel(table, title=f"[bold]{rec['title']}[/bold]",
+                        border_style="cyan", expand=False))
+
+
+def _print_channel_videos(records: list[dict]) -> None:
+    """Render a compact table of channel uploads."""
+    table = Table(title="Recent Uploads", box=box.ROUNDED,
+                  border_style="cyan", header_style="bold")
+    table.add_column("#", justify="right", style="dim", width=3)
+    table.add_column("Date", width=20)
+    table.add_column("Time (IST)", style="green", width=14)
+    table.add_column("Title", max_width=55, no_wrap=True)
+
+    for i, rec in enumerate(records, start=1):
+        table.add_row(str(i), rec["date"], rec["time_ist"],
+                      Text(rec["title"], overflow="ellipsis"))
+
+    console.print(table)
+
+
+# Export ——————————————————————————————————————————————————————————————————
+
+EXPORT_FIELDS = ["title", "channel", "date", "time_ist", "time_us",
+                 "time_au", "time_jp", "time_utc", "duration",
+                 "video_url", "video_id"]
+
+
+def _export_json(records: list[dict], path: Path) -> None:
+    clean = [{k: r[k] for k in EXPORT_FIELDS} for r in records]
+    path.write_text(json.dumps(clean, indent=2, ensure_ascii=False), encoding="utf-8")
+    console.print(f"\n [green]✓[/green] JSON saved → [bold]{path}[/bold]")
+
+
+def _export_csv(records: list[dict], path: Path) -> None:
+    with open(path, "w", newline="", encoding="utf-8") as f:
+        writer = csv.DictWriter(f, fieldnames=EXPORT_FIELDS)
+        writer.writeheader()
+        for r in records:
+            writer.writerow({k: r[k] for k in EXPORT_FIELDS})
+    console.print(f"\n [green]✓[/green] CSV saved → [bold]{path}[/bold]")
+
+
+# Entry point —————————————————————————————————————————————————————————————
+
+def main(argv: list[str] | None = None) -> None:
+    parser = _build_parser()
+    args = parser.parse_args(argv)
+
+    if not args.url:
+        console.print(BANNER)
+        parser.print_help()
+        sys.exit(0)
+
+    console.print(BANNER)
+    url = args.url.strip()
+    is_channel = _is_channel_url(url)
+
+    # Channel URL path
+    if is_channel:
+        if not args.analyze:
+            console.print(
+                f"\n [yellow]![/yellow] That looks like a channel URL. "
+                f"Add [bold]--analyze[/bold] to see upload consistency stats.\n"
+            )
+            sys.exit(0)
+
+        console.print(f"\n [cyan]⟳[/cyan] Fetching last [bold]{args.count}[/bold] uploads from channel…\n")
+        records = fetch_channel_videos(url, count=args.count, verbose=args.verbose)
+
+        if not records:
+            console.print(" [red]✗[/red] Could not retrieve videos from this channel.")
+            sys.exit(1)
+
+        _print_channel_videos(records)
+        print_analysis(analyze(records))
+
+        if args.export_json:
+            _export_json(records, Path.cwd() / "uploadwhen_results.json")
+        if args.export_csv:
+            _export_csv(records, Path.cwd() / "uploadwhen_results.csv")
+
+        console.print()
+        return
+
+    # Single video URL path
+    console.print(f"\n [cyan]⟳[/cyan] Fetching video info…\n")
+    video = fetch_video(url, verbose=args.verbose)
+
+    if not video:
+        console.print(" [red]✗[/red] Could not retrieve video info. Check the URL.")
+        sys.exit(1)
+
+    _print_video(video)
+
+    if args.export_json:
+        _export_json([video], Path.cwd() / "uploadwhen_results.json")
+    if args.export_csv:
+        _export_csv([video], Path.cwd() / "uploadwhen_results.csv")
+
+    # Auto-detect channel and run analysis
+    if args.analyze:
+        channel_url = video.get("channel_url")
+        if not channel_url:
+            console.print("\n [yellow]![/yellow] Could not determine channel URL from this video.\n")
+        else:
+            channel = video.get("channel", "this channel")
+            console.print(
+                f"\n [cyan]⟳[/cyan] Fetching last [bold]{args.count}[/bold] uploads "
+                f"from [bold]{channel}[/bold]…\n"
+            )
+            records = fetch_channel_videos(channel_url, count=args.count, verbose=args.verbose)
+
+            if not records:
+                console.print(" [red]✗[/red] Could not retrieve channel videos.")
+            else:
+                _print_channel_videos(records)
+                print_analysis(analyze(records))
+
+    console.print()
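A detail worth noting in `cli.py`: both exporters narrow each record to `EXPORT_FIELDS` before writing, because the in-memory record also carries a `datetime_utc` key holding a `datetime` object, which `json.dumps` cannot serialise. A minimal sketch of that filtering step (the sample record values are invented):

```python
import json

# Same field list as cli.EXPORT_FIELDS; datetime_utc is deliberately absent.
EXPORT_FIELDS = ["title", "channel", "date", "time_ist", "time_us",
                 "time_au", "time_jp", "time_utc", "duration",
                 "video_url", "video_id"]

def to_export_rows(records: list[dict]) -> list[dict]:
    """Keep only exportable fields, mirroring _export_json/_export_csv."""
    return [{k: r[k] for k in EXPORT_FIELDS} for r in records]

# A record like fetcher._extract_video_info produces: exportable strings
# plus one non-serialisable extra key that must be dropped.
record = {k: "" for k in EXPORT_FIELDS}
record.update(title="Demo", datetime_utc=object())
rows = to_export_rows([record])
serialised = json.dumps(rows)  # would raise TypeError without the filtering
```

The same filtered dict feeds `csv.DictWriter`, which would otherwise raise on the unexpected `datetime_utc` field.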
uploadwhen-1.0.0/uploadwhen/fetcher.py
ADDED
@@ -0,0 +1,115 @@
+"""Fetch metadata from YouTube videos and channels via yt-dlp."""
+
+import yt_dlp
+from urllib.parse import urlparse
+from datetime import datetime, timezone, timedelta
+
+try:
+    from zoneinfo import ZoneInfo
+    IST = ZoneInfo("Asia/Kolkata")
+    US_ET = ZoneInfo("America/New_York")
+    AU_SYD = ZoneInfo("Australia/Sydney")
+    JP = ZoneInfo("Asia/Tokyo")
+except ImportError:
+    IST = timezone(timedelta(hours=5, minutes=30), name="IST")
+    US_ET = timezone(timedelta(hours=-5), name="ET")
+    AU_SYD = timezone(timedelta(hours=11), name="AEDT")
+    JP = timezone(timedelta(hours=9), name="JST")
+
+
+def _extract_video_info(info: dict) -> dict:
+    """Parse a yt-dlp info dict into a clean record."""
+    title = info.get("title", "Unknown")
+    channel = info.get("channel") or info.get("uploader") or "Unknown"
+    channel_url = info.get("channel_url") or info.get("uploader_url") or ""
+    video_id = info.get("id", "")
+    video_url = info.get("webpage_url") or info.get("url") or ""
+    duration = info.get("duration_string") or info.get("duration") or "Unknown"
+    timestamp = info.get("timestamp")
+    upload_date = info.get("upload_date")
+
+    date_str = time_ist = time_utc = time_us = time_au = time_jp = dt_utc = None
+    na = "N/A"
+
+    if timestamp:
+        dt_utc = datetime.fromtimestamp(timestamp, tz=timezone.utc)
+        dt_ist = dt_utc.astimezone(IST)
+        dt_us = dt_utc.astimezone(US_ET)
+        dt_au = dt_utc.astimezone(AU_SYD)
+        dt_jp = dt_utc.astimezone(JP)
+        date_str = dt_ist.strftime("%d %B %Y")
+        time_ist = dt_ist.strftime("%H:%M:%S IST")
+        time_utc = dt_utc.strftime("%H:%M:%S UTC")
+        time_us = dt_us.strftime("%H:%M:%S ET")
+        time_au = dt_au.strftime("%H:%M:%S AEST")
+        time_jp = dt_jp.strftime("%H:%M:%S JST")
+    elif upload_date:
+        dt_utc = datetime.strptime(upload_date, "%Y%m%d").replace(tzinfo=timezone.utc)
+        date_str = dt_utc.strftime("%d %B %Y")
+        time_ist = time_utc = time_us = time_au = time_jp = na
+    else:
+        date_str = time_ist = time_utc = time_us = time_au = time_jp = na
+
+    return {
+        "title": title, "channel": channel, "channel_url": channel_url,
+        "video_id": video_id, "video_url": video_url,
+        "date": date_str, "time_ist": time_ist, "time_us": time_us,
+        "time_au": time_au, "time_jp": time_jp, "time_utc": time_utc,
+        "duration": str(duration), "datetime_utc": dt_utc,
+    }
+
+
+def fetch_video(url: str, verbose: bool = False) -> dict | None:
+    """Fetch info for a single video URL."""
+    ydl_opts = {
+        "quiet": not verbose, "no_warnings": not verbose,
+        "skip_download": True,
+    }
+    try:
+        with yt_dlp.YoutubeDL(ydl_opts) as ydl:
+            info = ydl.extract_info(url, download=False)
+            return _extract_video_info(info) if info else None
+    except Exception:
+        return None
+
+
+def _normalise_channel_url(url: str) -> str:
+    """Ensure channel URL points to /videos tab for yt-dlp."""
+    url = url.rstrip("/")
+    if url.endswith("/videos"):
+        return url
+    for suffix in ("/shorts", "/streams", "/playlists", "/community", "/about"):
+        if url.endswith(suffix):
+            url = url[: -len(suffix)]
+            break
+    return url + "/videos"
+
+
+def _is_channel_url(url: str) -> bool:
+    """Check if URL is a YouTube channel (not a video)."""
+    path = urlparse(url).path.lower()
+    if any(path.startswith(p) for p in ("/@", "/channel/", "/c/", "/user/")):
+        return True
+    return False
+
+
+def fetch_channel_videos(channel_url: str, count: int = 10,
+                         verbose: bool = False) -> list[dict]:
+    """Fetch the last N uploads from a YouTube channel."""
+    videos_url = _normalise_channel_url(channel_url)
+    ydl_opts = {
+        "quiet": not verbose, "no_warnings": not verbose,
+        "skip_download": True, "extract_flat": False,
+        "ignoreerrors": True, "playlistend": count,
+    }
+    results: list[dict] = []
+    try:
+        with yt_dlp.YoutubeDL(ydl_opts) as ydl:
+            info = ydl.extract_info(videos_url, download=False)
+            if info:
+                for entry in (info.get("entries") or []):
+                    if entry is not None:
+                        results.append(_extract_video_info(entry))
+    except Exception:
+        pass
+    return results
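The CLI's routing between the video and channel code paths rests entirely on the two small URL helpers in `fetcher.py`. They are copied out here standalone (without the leading underscores) so the routing behaviour can be exercised without yt-dlp or network access:

```python
from urllib.parse import urlparse

def is_channel_url(url: str) -> bool:
    """Mirror of fetcher._is_channel_url: channel handles, IDs, and legacy paths."""
    path = urlparse(url).path.lower()
    return any(path.startswith(p) for p in ("/@", "/channel/", "/c/", "/user/"))

def normalise_channel_url(url: str) -> str:
    """Mirror of fetcher._normalise_channel_url: always point at the /videos tab."""
    url = url.rstrip("/")
    if url.endswith("/videos"):
        return url
    # Strip a known tab suffix before appending /videos.
    for suffix in ("/shorts", "/streams", "/playlists", "/community", "/about"):
        if url.endswith(suffix):
            url = url[: -len(suffix)]
            break
    return url + "/videos"
```

Note that `/watch?v=...` URLs fail the prefix test, which is what sends them down the single-video path in `cli.main`.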
@@ -0,0 +1,129 @@
Metadata-Version: 2.4
Name: uploadwhen
Version: 1.0.0
Summary: Find out exactly when a YouTube video was uploaded — date & time in IST.
Author: SoubhagyaSwain
License: MIT
Project-URL: Homepage, https://github.com/soubhagya2003jan/Uploadwhen
Project-URL: Repository, https://github.com/soubhagya2003jan/Uploadwhen
Keywords: youtube,upload-date,cli,yt-dlp,upload-time
Classifier: Development Status :: 4 - Beta
Classifier: Environment :: Console
Classifier: Intended Audience :: Developers
Classifier: Intended Audience :: End Users/Desktop
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Topic :: Multimedia :: Video
Classifier: Topic :: Utilities
Requires-Python: >=3.9
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: yt-dlp>=2024.0.0
Requires-Dist: rich>=13.0.0
Requires-Dist: tzdata; sys_platform == "win32"
Dynamic: license-file

# uploadwhen

> Find out **exactly** when a YouTube video was uploaded — date & time across multiple time zones.

A Python CLI tool that extracts upload timestamps from YouTube videos and converts them to **IST, US/ET, AU/AEST, JP/JST, and UTC** (24-hour format). It also computes **upload consistency analytics** for any channel.

Built with [yt-dlp](https://github.com/yt-dlp/yt-dlp) and [rich](https://github.com/Textualize/rich).

---

## Installation

```bash
pip install uploadwhen
```

## Quick Start

```bash
uploadwhen "https://www.youtube.com/watch?v=dQw4w9WgXcQ"
```

## Usage

```
uploadwhen [URL] [OPTIONS]
```

### Options

| Flag | Short | Description |
|---|---|---|
| `--version` | `-V` | Show version |
| `--help` | `-h` | Show help |
| `--json` | | Export results to `uploadwhen_results.json` |
| `--csv` | | Export results to `uploadwhen_results.csv` |
| `--analyze` | | Fetch recent uploads and show consistency stats |
| `--count N` | `-n N` | Number of uploads to analyze (default: 10) |
| `--verbose` | `-v` | Show detailed yt-dlp output |

### Examples

```bash
# Single video — shows upload date & time
uploadwhen "https://www.youtube.com/watch?v=dQw4w9WgXcQ"

# Analyze a channel's upload consistency (last 10 videos)
uploadwhen "https://www.youtube.com/@theRadBrad" --analyze

# Analyze last 25 uploads from a channel
uploadwhen "https://www.youtube.com/@tseries" --analyze --count 25

# Auto-detect channel from a video URL
uploadwhen "https://www.youtube.com/watch?v=dQw4w9WgXcQ" --analyze

# Export to JSON and CSV
uploadwhen "https://www.youtube.com/watch?v=dQw4w9WgXcQ" --json --csv
```

## Time Zones

| Zone | Region |
|---|---|
| **IST** | India / Asia/Kolkata (UTC+05:30) |
| **ET** | US / New York (DST-aware) |
| **AEST** | Australia / Sydney (DST-aware) |
| **JST** | Japan / Tokyo (UTC+09:00) |
| **UTC** | Reference time |

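The conversions in the table above can be reproduced with the standard-library `zoneinfo` module (Python ≥ 3.9, matching the package's floor; on Windows the `tzdata` package supplies the zone database, which is why it appears as a platform-conditional dependency). A minimal sketch, using a hypothetical epoch timestamp of the kind yt-dlp reports:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# Label -> IANA zone name, matching the table above.
ZONES = {
    "IST": "Asia/Kolkata",
    "ET": "America/New_York",
    "AEST": "Australia/Sydney",
    "JST": "Asia/Tokyo",
    "UTC": "UTC",
}

# A hypothetical upload timestamp in epoch seconds (UTC).
uploaded = datetime.fromtimestamp(1700000000, tz=timezone.utc)

for label, name in ZONES.items():
    local = uploaded.astimezone(ZoneInfo(name))
    print(f"{label:5} {local:%Y-%m-%d %H:%M} ({local:%Z})")
```

Because `America/New_York` and `Australia/Sydney` are full IANA zones, the DST transitions noted in the table are handled automatically by `astimezone`.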
## Upload Consistency Analytics

The `--analyze` flag fetches recent uploads and computes:

| Metric | Description |
|---|---|
| Avg gap | Average time between uploads |
| Median gap | Middle value of all gaps |
| Shortest / Longest gap | Extremes |
| Std deviation | How much upload timing varies |
| Consistency score (0–10) | Higher = more regular uploads |

Gaps under 1 day are shown in hours for clarity.

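Apart from the consistency score (which is package-specific), the gap metrics above reduce to a few lines of the standard-library `statistics` module. The sketch below uses a hypothetical helper name, not the package's actual API:

```python
from datetime import datetime
from statistics import mean, median, pstdev


def gap_stats(uploads: list[datetime]) -> dict:
    """Gap statistics over a list of upload timestamps, in days."""
    times = sorted(uploads)
    # Gaps between consecutive uploads, in seconds.
    gaps = [(b - a).total_seconds() for a, b in zip(times, times[1:])]
    return {
        "avg_days": mean(gaps) / 86400,
        "median_days": median(gaps) / 86400,
        "min_days": min(gaps) / 86400,
        "max_days": max(gaps) / 86400,
        "stdev_days": pstdev(gaps) / 86400,
    }


uploads = [datetime(2026, 1, d) for d in (1, 3, 6, 10)]  # gaps: 2, 3, 4 days
stats = gap_stats(uploads)
print(stats)  # avg 3.0, median 3.0, min 2.0, max 4.0 days
```

A low standard deviation relative to the average gap indicates a regular schedule, which is the intuition behind a 0–10 consistency score.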
Use `--count N` to analyze more or fewer videos:

```bash
uploadwhen "https://www.youtube.com/@channel" --analyze --count 50
```

## Requirements

- Python ≥ 3.9
- [yt-dlp](https://github.com/yt-dlp/yt-dlp)
- [rich](https://github.com/Textualize/rich)

## License

MIT
@@ -0,0 +1,14 @@
LICENSE
README.md
pyproject.toml
uploadwhen/__init__.py
uploadwhen/__main__.py
uploadwhen/analyzer.py
uploadwhen/cli.py
uploadwhen/fetcher.py
uploadwhen.egg-info/PKG-INFO
uploadwhen.egg-info/SOURCES.txt
uploadwhen.egg-info/dependency_links.txt
uploadwhen.egg-info/entry_points.txt
uploadwhen.egg-info/requires.txt
uploadwhen.egg-info/top_level.txt
@@ -0,0 +1 @@

@@ -0,0 +1 @@
uploadwhen