nextcloud_release_agent 0.1.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- checksums.yaml +7 -0
- data/Gemfile +3 -0
- data/README.md +121 -0
- data/exe/nextcloud-release-agent +7 -0
- data/lib/nextcloud_release_agent/changelog_renderer.rb +65 -0
- data/lib/nextcloud_release_agent/cli.rb +855 -0
- data/lib/nextcloud_release_agent/version.rb +3 -0
- data/lib/nextcloud_release_agent.rb +3 -0
- data/nextcloud_release_agent.gemspec +21 -0
- metadata +64 -0
checksums.yaml
ADDED
@@ -0,0 +1,7 @@
+---
+SHA256:
+  metadata.gz: b3dbfa1416323265354d6d21af4fc5f4cf7f6a5b9e488f12d7d2fa97153a1001
+  data.tar.gz: 9a61861dbbea835ffd03ddeefa8007dde0ec8bd5cf45cbe5fc418a3ea265c8a6
+SHA512:
+  metadata.gz: 2c2a864e6498e28f52c2c9e9827ab4696760c2e02b78312d35255306e6a28e668ae8b95f1711ad82a08a39f66518a5a76b73e1bce44c0eed2d658a32dd81209d
+  data.tar.gz: cf316d8db8f6283cd7ee95b66076d1403626a91c2f540185fcb3569497aa01c20f7a06dfac28fd6addbf20106fd1f24800feaeff47940342d5228ee1fdcb6278
data/Gemfile
ADDED
data/README.md
ADDED
@@ -0,0 +1,121 @@

# Nextcloud Release Agent

> [!WARNING]
> This project is vibe coded and has been audited, but I'm no Ruby developer.
> It doesn't have much weight, but it does have my biases (squash preference, commit naming, etc.); your mileage may vary.

Ruby CLI for releasing Nextcloud apps using `git` and `gh`.
It automates changelog updates, semver bumps, PR creation, tagging, releases, and workflow monitoring for Nextcloud app repositories.

## Requirements

- Ruby 3.1+
- `git` in `PATH`
- `gh` in `PATH` and logged in (`gh auth status`)

## Setup

- The target repo needs `appinfo/info.xml` and a `changelog.yaml` (see the example file at `example/changelog.yaml`).
- The `release` remote is optional. If it is absent, tag pushes, GitHub release creation, and Actions monitoring are skipped for it with a warning.

## Install

```bash
gem install nextcloud_release_agent
```

User-local (no root):

```bash
gem install --user-install nextcloud_release_agent
export PATH="$(ruby -r rubygems -e 'print Gem.user_dir')/bin:$PATH"
```

Or without installing the gem:

```bash
git clone https://github.com/nextcloud/release_agent
ruby release_agent/exe/nextcloud-release-agent --help
```

## Usage

Prepare and publish in one shot:

```bash
nextcloud-release-agent run --monitor
```

Create the release PR only:

```bash
nextcloud-release-agent prepare
```

Merge, tag, release:

```bash
nextcloud-release-agent publish --monitor
```

Watch GitHub Actions for the latest version:

```bash
nextcloud-release-agent monitor
```

Watch GitHub Actions for a specific version:

```bash
nextcloud-release-agent monitor --repo translate2 2.4.0
```

## How it works

**`run`**

Runs `prepare` then `publish` in one shot. Use this for the happy path.

**`prepare`**

1. Fetch and sync the default branch.
2. Collect commits since the last `v*` tag.
3. Filter out commits matching `release_agent.ignore_commits` in `changelog.yaml`.
4. Fetch the GitHub PR for each commit.
5. Compute the next semver version.
6. Prepend a new changelog entry.
7. Render `CHANGELOG.md`.
8. Update `appinfo/info.xml` (and `<image-tag>` if present).
9. Create a `release/<version>` branch, commit, push, and open a PR.

**`publish`**

1. Squash-merge the release PR (`--admin`), or wait for a manual merge.
2. Pull the merged default branch.
3. Create and push `v<version>` to `origin` and `release`.
4. Create GitHub releases on both remotes with the rendered changelog entry.
5. Optionally monitor the resulting GitHub Actions runs (`--monitor`).

## Semver

The version is derived from commit and PR metadata:

- **Major**: `BREAKING CHANGE`, a conventional `!:` marker, or a breaking label.
- **Minor**: anything that looks like a feature or enhancement.
- **Patch**: everything else after the ignore rules are applied.

Changelog sections: `Added` for features, `Fixed` for fixes/security, `Changed` for breaking changes and everything else.

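The bump rule can be condensed to a few lines; this is a minimal sketch of the logic described above (the `bump` helper is hypothetical, not part of the gem's API):

```ruby
# Highest-ranked classification wins: breaking > feature > everything else.
def bump(current, classifications)
  major, minor, patch = current.split(".").map(&:to_i)
  if classifications.include?(:breaking)
    "#{major + 1}.0.0"
  elsif classifications.include?(:feature)
    "#{major}.#{minor + 1}.0"
  else
    "#{major}.#{minor}.#{patch + 1}"
  end
end

bump("1.4.2", [:fix, :feature]) # => "1.5.0"
bump("1.4.2", [:breaking])      # => "2.0.0"
bump("1.4.2", [:fix, :change])  # => "1.4.3"
```
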
## Ignore rules

Put ignore rules in the target repo's `changelog.yaml`:

```yaml
release_agent:
  ignore_commits:
    - author: "^pre-commit-ci\\[bot\\]$"
    - author: "^nextcloud-bot$"
    - author: "^dependabot\\[bot\\]$"
```

Rules are regexes. Omit a key to match anything.

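A standalone sketch of those matching semantics (the `ignored?` helper below is illustrative, not the CLI's actual implementation): a rule matches when every key it defines matches, and an omitted key matches anything.

```ruby
require "yaml"

rules = YAML.safe_load(<<~YAML)
  - author: "^nextcloud-bot$"
  - message: "^chore"
YAML

def ignored?(rules, commit)
  rules.any? do |rule|
    # Every key present in the rule must match (case-insensitively);
    # keys the rule omits are not checked at all.
    rule.all? { |key, pattern| Regexp.new(pattern, Regexp::IGNORECASE).match?(commit[key].to_s) }
  end
end

ignored?(rules, "author" => "nextcloud-bot", "message" => "bump deps")  # => true
ignored?(rules, "author" => "alice", "message" => "chore: tidy up")     # => true
ignored?(rules, "author" => "alice", "message" => "feat: add monitor")  # => false
```
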
## Releasing this CLI

It guides others to treasures it cannot possess.
Releases are manual for now.

data/lib/nextcloud_release_agent/changelog_renderer.rb
ADDED
@@ -0,0 +1,65 @@
+module NextcloudReleaseAgent
+  class ChangelogRenderer
+    def initialize(repository_link)
+      @repository_link = repository_link&.sub(%r{/$}, "")
+    end
+
+    def render_document(data, entries)
+      preamble = data["markdown_preamble"]
+      document_lines = []
+      if preamble && !preamble.empty?
+        document_lines << preamble.rstrip
+      end
+
+      rendered_entries = entries.map { |entry| render_entry(entry) }
+      entry_separator = "\n\n\n"
+
+      if document_lines.empty?
+        rendered_entries.join(entry_separator) + "\n"
+      else
+        document_lines.join("\n") + "\n\n" + rendered_entries.join(entry_separator) + "\n"
+      end
+    end
+
+    def render_entry(entry)
+      lines = []
+      lines << "## #{entry.fetch("version")} - #{entry.fetch("release_date")}"
+
+      notes = entry["notes"]
+      if notes && !notes.empty?
+        lines << ""
+        lines.concat(notes.split("\n"))
+      end
+
+      entry.fetch("sections", []).each do |section|
+        lines << "" unless lines.last == ""
+        lines << "### #{section.fetch("name")}"
+        section.fetch("items", []).each do |item|
+          lines << "- #{format_item(item)}"
+        end
+      end
+
+      lines.join("\n")
+    end
+
+    private
+
+    def format_item(item)
+      text = item.fetch("text")
+      issue_number = item["issue_number"]
+      issue_marker = item.fetch("issue_marker", "#")
+      authors = item.fetch("authors", [])
+
+      rendered = text.dup
+      unless issue_number.nil?
+        if @repository_link && !@repository_link.empty?
+          rendered << " ([#{issue_marker}#{issue_number}](#{@repository_link}/pull/#{issue_number}))"
+        else
+          rendered << " (#{issue_marker}#{issue_number})"
+        end
+      end
+      rendered << authors.map { |author| " @#{author}" }.join unless authors.empty?
+      rendered
+    end
+  end
+end

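For a feel of the renderer's output, here is a condensed, standalone version of the entry rendering (the example data is made up; the logic mirrors `render_entry`/`format_item` above but omits notes and the bare `(#N)` fallback):

```ruby
def render_entry(entry, repo_link)
  lines = ["## #{entry["version"]} - #{entry["release_date"]}"]
  entry.fetch("sections", []).each do |section|
    lines << "" << "### #{section["name"]}"
    section.fetch("items", []).each do |item|
      text = item["text"].dup
      if (number = item["issue_number"])
        text << " ([##{number}](#{repo_link}/pull/#{number}))"
      end
      text << Array(item["authors"]).map { |author| " @#{author}" }.join
      lines << "- #{text}"
    end
  end
  lines.join("\n")
end

entry = {
  "version" => "1.2.0",
  "release_date" => "2024-05-01",
  "sections" => [
    { "name" => "Added",
      "items" => [{ "text" => "Add monitor command", "issue_number" => 42, "authors" => ["alice"] }] }
  ]
}
puts render_entry(entry, "https://github.com/example/app")
```

This prints a `## 1.2.0 - 2024-05-01` heading, an `### Added` section, and one bullet linking PR #42 and crediting `@alice`.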
@@ -0,0 +1,855 @@
|
|
|
1
|
+
require "date"
|
|
2
|
+
require "json"
|
|
3
|
+
require "open3"
|
|
4
|
+
require "optparse"
|
|
5
|
+
require "pathname"
|
|
6
|
+
require "psych"
|
|
7
|
+
require "rexml/document"
|
|
8
|
+
require "shellwords"
|
|
9
|
+
require "tempfile"
|
|
10
|
+
require "time"
|
|
11
|
+
|
|
12
|
+
module NextcloudReleaseAgent
|
|
13
|
+
class Error < StandardError; end
|
|
14
|
+
|
|
15
|
+
CommandResult = Struct.new(:stdout, :stderr, :status, keyword_init: true)
|
|
16
|
+
CommitInfo = Struct.new(
|
|
17
|
+
:sha,
|
|
18
|
+
:author_name,
|
|
19
|
+
:author_email,
|
|
20
|
+
:message,
|
|
21
|
+
:pr_number,
|
|
22
|
+
:pr_title,
|
|
23
|
+
:pr_body,
|
|
24
|
+
:pr_url,
|
|
25
|
+
:pr_author,
|
|
26
|
+
:labels,
|
|
27
|
+
:classification,
|
|
28
|
+
:section,
|
|
29
|
+
keyword_init: true
|
|
30
|
+
)
|
|
31
|
+
|
|
32
|
+
class Logger
|
|
33
|
+
def info(message)
|
|
34
|
+
puts("[INFO] #{message}")
|
|
35
|
+
end
|
|
36
|
+
|
|
37
|
+
def status(message)
|
|
38
|
+
puts("[STEP] #{message}")
|
|
39
|
+
end
|
|
40
|
+
|
|
41
|
+
def warn(message)
|
|
42
|
+
$stderr.puts("[WARN] #{message}")
|
|
43
|
+
end
|
|
44
|
+
|
|
45
|
+
def error(message)
|
|
46
|
+
$stderr.puts("[ERROR] #{message}")
|
|
47
|
+
end
|
|
48
|
+
end
|
|
49
|
+
|
|
50
|
+
class Shell
|
|
51
|
+
def initialize(logger:, dry_run: false)
|
|
52
|
+
@logger = logger
|
|
53
|
+
@dry_run = dry_run
|
|
54
|
+
end
|
|
55
|
+
|
|
56
|
+
def capture(*command, chdir:, allow_failure: false, env: {})
|
|
57
|
+
printable = command.shelljoin
|
|
58
|
+
@logger.info("#{@dry_run ? 'would run' : 'running'}: #{printable}")
|
|
59
|
+
return CommandResult.new(stdout: "", stderr: "", status: 0) if @dry_run
|
|
60
|
+
|
|
61
|
+
stdout, stderr, status = Open3.capture3(env, *command, chdir: chdir)
|
|
62
|
+
result = CommandResult.new(stdout: stdout, stderr: stderr, status: status.exitstatus)
|
|
63
|
+
return result if status.success? || allow_failure
|
|
64
|
+
|
|
65
|
+
raise Error, "command failed (#{status.exitstatus}): #{printable}\n#{stderr.strip}"
|
|
66
|
+
end
|
|
67
|
+
|
|
68
|
+
def system!(*command, chdir:, allow_failure: false, env: {})
|
|
69
|
+
result = capture(*command, chdir: chdir, allow_failure: allow_failure, env: env)
|
|
70
|
+
return result if result.status.zero? || allow_failure
|
|
71
|
+
|
|
72
|
+
raise Error, "command failed (#{result.status}): #{command.shelljoin}\n#{result.stderr.strip}"
|
|
73
|
+
end
|
|
74
|
+
end
|
|
75
|
+
|
|
76
|
+
class ChangelogFile
|
|
77
|
+
SECTION_ORDER = ["Added", "Changed", "Fixed"].freeze
|
|
78
|
+
|
|
79
|
+
def initialize(path)
|
|
80
|
+
@path = Pathname(path)
|
|
81
|
+
end
|
|
82
|
+
|
|
83
|
+
def path
|
|
84
|
+
@path
|
|
85
|
+
end
|
|
86
|
+
|
|
87
|
+
def data
|
|
88
|
+
@data ||= Psych.safe_load(@path.read, aliases: false, permitted_classes: [], symbolize_names: false) || {}
|
|
89
|
+
end
|
|
90
|
+
|
|
91
|
+
def repository_link
|
|
92
|
+
data["repository_link"]
|
|
93
|
+
end
|
|
94
|
+
|
|
95
|
+
def entries
|
|
96
|
+
data.fetch("entries", [])
|
|
97
|
+
end
|
|
98
|
+
|
|
99
|
+
def latest_entry
|
|
100
|
+
entries.first
|
|
101
|
+
end
|
|
102
|
+
|
|
103
|
+
def ignore_rules
|
|
104
|
+
config = data.fetch("release_agent", {})
|
|
105
|
+
rules = config.fetch("ignore_commits", [])
|
|
106
|
+
rules.map do |rule|
|
|
107
|
+
{
|
|
108
|
+
"author" => compile_regex(rule["author"]),
|
|
109
|
+
"email" => compile_regex(rule["email"]),
|
|
110
|
+
"message" => compile_regex(rule["message"])
|
|
111
|
+
}
|
|
112
|
+
end
|
|
113
|
+
end
|
|
114
|
+
|
|
115
|
+
def ignored?(commit)
|
|
116
|
+
ignore_rules.any? do |rule|
|
|
117
|
+
matches?(rule["author"], commit.author_name) &&
|
|
118
|
+
matches?(rule["email"], commit.author_email) &&
|
|
119
|
+
matches?(rule["message"], commit.message)
|
|
120
|
+
end
|
|
121
|
+
end
|
|
122
|
+
|
|
123
|
+
def add_entry(entry)
|
|
124
|
+
data["entries"] = [entry] + entries
|
|
125
|
+
write!
|
|
126
|
+
end
|
|
127
|
+
|
|
128
|
+
private
|
|
129
|
+
|
|
130
|
+
def compile_regex(pattern)
|
|
131
|
+
return nil if pattern.nil? || pattern.empty?
|
|
132
|
+
|
|
133
|
+
Regexp.new(pattern, Regexp::IGNORECASE)
|
|
134
|
+
end
|
|
135
|
+
|
|
136
|
+
def matches?(pattern, value)
|
|
137
|
+
return true if pattern.nil?
|
|
138
|
+
|
|
139
|
+
!!(value.to_s =~ pattern)
|
|
140
|
+
end
|
|
141
|
+
|
|
142
|
+
def write!
|
|
143
|
+
@path.write(serialize(data))
|
|
144
|
+
end
|
|
145
|
+
|
|
146
|
+
def serialize(payload)
|
|
147
|
+
document = {
|
|
148
|
+
"repository_link" => payload["repository_link"],
|
|
149
|
+
"title" => payload["title"],
|
|
150
|
+
"description" => payload["description"]
|
|
151
|
+
}
|
|
152
|
+
document["markdown_preamble"] = payload["markdown_preamble"] if payload.key?("markdown_preamble")
|
|
153
|
+
document["release_agent"] = normalize_release_agent(payload["release_agent"]) if payload.key?("release_agent")
|
|
154
|
+
document["entries"] = normalize_entries(payload.fetch("entries", []))
|
|
155
|
+
|
|
156
|
+
yaml = Psych.dump(document, nil, line_width: -1).sub(/\A---\s*\n/, "")
|
|
157
|
+
yaml.end_with?("\n") ? yaml : "#{yaml}\n"
|
|
158
|
+
end
|
|
159
|
+
|
|
160
|
+
def normalize_release_agent(config)
|
|
161
|
+
{
|
|
162
|
+
"ignore_commits" => Array(config&.fetch("ignore_commits", [])).map do |rule|
|
|
163
|
+
normalized_rule = {}
|
|
164
|
+
%w[author email message].each do |key|
|
|
165
|
+
normalized_rule[key] = rule[key] if rule.key?(key)
|
|
166
|
+
end
|
|
167
|
+
normalized_rule
|
|
168
|
+
end
|
|
169
|
+
}
|
|
170
|
+
end
|
|
171
|
+
|
|
172
|
+
def normalize_entries(entries)
|
|
173
|
+
entries.map do |entry|
|
|
174
|
+
normalized_entry = {
|
|
175
|
+
"version" => entry.fetch("version"),
|
|
176
|
+
"release_date" => entry.fetch("release_date")
|
|
177
|
+
}
|
|
178
|
+
normalized_entry["notes"] = entry["notes"] if entry.key?("notes") && !entry["notes"].nil?
|
|
179
|
+
normalized_entry["sections"] = ordered_sections(entry.fetch("sections", [])).map do |section|
|
|
180
|
+
{
|
|
181
|
+
"name" => section.fetch("name"),
|
|
182
|
+
"items" => section.fetch("items", []).map do |item|
|
|
183
|
+
normalized_item = {
|
|
184
|
+
"text" => item.fetch("text"),
|
|
185
|
+
"authors" => Array(item["authors"])
|
|
186
|
+
}
|
|
187
|
+
normalized_item["issue_number"] = item["issue_number"] if item.key?("issue_number")
|
|
188
|
+
normalized_item["issue_marker"] = item["issue_marker"] if item.key?("issue_marker") && item["issue_marker"] != "#"
|
|
189
|
+
normalized_item
|
|
190
|
+
end
|
|
191
|
+
}
|
|
192
|
+
end
|
|
193
|
+
normalized_entry
|
|
194
|
+
end
|
|
195
|
+
end
|
|
196
|
+
|
|
197
|
+
def ordered_sections(sections)
|
|
198
|
+
sections.sort_by do |section|
|
|
199
|
+
index = SECTION_ORDER.index(section["name"])
|
|
200
|
+
index.nil? ? SECTION_ORDER.length : index
|
|
201
|
+
end
|
|
202
|
+
end
|
|
203
|
+
end
|
|
204
|
+
|
|
205
|
+
class ReleaseManager
|
|
206
|
+
RELEASE_EVENT = "release".freeze
|
|
207
|
+
PUSH_EVENT = "push".freeze
|
|
208
|
+
|
|
209
|
+
def initialize(options)
|
|
210
|
+
@options = options
|
|
211
|
+
@logger = Logger.new
|
|
212
|
+
@shell = Shell.new(logger: @logger, dry_run: options[:dry_run])
|
|
213
|
+
@repo_path = Pathname(options[:repo]).expand_path
|
|
214
|
+
@changelog_path = resolve_changelog_path
|
|
215
|
+
@info_xml_path = @repo_path.join("appinfo", "info.xml")
|
|
216
|
+
@markdown_path = resolve_markdown_path
|
|
217
|
+
@changelog = ChangelogFile.new(@changelog_path)
|
|
218
|
+
@origin_remote = options[:remote]
|
|
219
|
+
@release_remote = options[:release_remote]
|
|
220
|
+
@release_remote_exists = remote_exists?(@release_remote)
|
|
221
|
+
@logger.warn("release remote '#{@release_remote}' not found: skipping release remote steps") unless @release_remote_exists
|
|
222
|
+
end
|
|
223
|
+
|
|
224
|
+
def prepare
|
|
225
|
+
with_release_context do |context|
|
|
226
|
+
branch_name = "#{@options[:branch_prefix]}#{context[:version]}"
|
|
227
|
+
create_or_reset_branch(branch_name, context[:default_branch])
|
|
228
|
+
update_metadata_files(context)
|
|
229
|
+
commit_paths = [relative_path(@changelog_path), relative_path(@markdown_path), relative_path(@info_xml_path)]
|
|
230
|
+
commit_release(branch_name, context[:version], commit_paths)
|
|
231
|
+
push_branch(branch_name)
|
|
232
|
+
pr = create_pull_request(branch_name, context[:default_branch], context[:version], context[:release_notes])
|
|
233
|
+
@logger.status("prepared release #{context[:version]} on #{branch_name}")
|
|
234
|
+
@logger.info("pull request: #{pr.fetch('url')}")
|
|
235
|
+
{ version: context[:version], branch: branch_name, pr: pr }
|
|
236
|
+
end
|
|
237
|
+
end
|
|
238
|
+
|
|
239
|
+
def publish(version = nil, pr: nil)
|
|
240
|
+
version ||= latest_version
|
|
241
|
+
default_branch = ensure_default_branch_name
|
|
242
|
+
pr ||= find_release_pull_request(version)
|
|
243
|
+
raise Error, "could not find release PR for #{version}" if pr.nil?
|
|
244
|
+
|
|
245
|
+
ensure_pull_request_merged(pr.fetch("number"), version)
|
|
246
|
+
sync_default_branch(default_branch)
|
|
247
|
+
tag_name = "v#{version}"
|
|
248
|
+
create_and_push_tag(tag_name)
|
|
249
|
+
release_notes = render_release_notes(version)
|
|
250
|
+
create_github_release(@origin_remote, tag_name, version, release_notes)
|
|
251
|
+
create_github_release(@release_remote, tag_name, version, release_notes) if @release_remote_exists
|
|
252
|
+
monitor(version) if @options[:monitor]
|
|
253
|
+
{ version: version, pr: pr, tag: tag_name }
|
|
254
|
+
end
|
|
255
|
+
|
|
256
|
+
def run
|
|
257
|
+
prepared = prepare
|
|
258
|
+
publish(prepared.fetch(:version), pr: prepared.fetch(:pr))
|
|
259
|
+
end
|
|
260
|
+
|
|
261
|
+
def monitor(version = nil)
|
|
262
|
+
version ||= latest_version
|
|
263
|
+
commit_sha = git("rev-parse", "HEAD").stdout.strip
|
|
264
|
+
tag_name = "v#{version}"
|
|
265
|
+
@logger.status("monitoring workflow runs for #{tag_name}")
|
|
266
|
+
monitor_repo_runs(@origin_remote, PUSH_EVENT, commit_sha, version)
|
|
267
|
+
monitor_repo_runs(@release_remote, RELEASE_EVENT, commit_sha, version) if @release_remote_exists
|
|
268
|
+
end
|
|
269
|
+
|
|
270
|
+
private
|
|
271
|
+
|
|
272
|
+
def with_release_context
|
|
273
|
+
validate_repo!
|
|
274
|
+
default_branch = ensure_default_branch_name
|
|
275
|
+
ensure_clean_worktree! unless @options[:allow_dirty]
|
|
276
|
+
sync_default_branch(default_branch)
|
|
277
|
+
commits = collect_commits
|
|
278
|
+
raise Error, "no releasable commits found since the last tag" if commits.empty?
|
|
279
|
+
|
|
280
|
+
enriched_commits = enrich_commits(commits)
|
|
281
|
+
filtered_commits = enriched_commits.reject do |commit|
|
|
282
|
+
ignored = @changelog.ignored?(commit)
|
|
283
|
+
@logger.info("ignoring #{commit.sha[0, 7]} #{commit.message.inspect}") if ignored
|
|
284
|
+
ignored
|
|
285
|
+
end
|
|
286
|
+
raise Error, "all commits were ignored by changelog release_agent.ignore_commits" if filtered_commits.empty?
|
|
287
|
+
|
|
288
|
+
classify_commits!(filtered_commits)
|
|
289
|
+
version = bump_version(filtered_commits)
|
|
290
|
+
release_notes_entry = build_changelog_entry(version, filtered_commits)
|
|
291
|
+
release_notes = render_release_notes_for_entry(release_notes_entry)
|
|
292
|
+
yield(
|
|
293
|
+
version: version,
|
|
294
|
+
default_branch: default_branch,
|
|
295
|
+
commits: filtered_commits,
|
|
296
|
+
changelog_entry: release_notes_entry,
|
|
297
|
+
release_notes: release_notes
|
|
298
|
+
)
|
|
299
|
+
end
|
|
300
|
+
|
|
301
|
+
def validate_repo!
|
|
302
|
+
raise Error, "repo path does not exist: #{@repo_path}" unless @repo_path.directory?
|
|
303
|
+
raise Error, "missing changelog file: #{@changelog_path}" unless @changelog_path.file?
|
|
304
|
+
raise Error, "missing info.xml: #{@info_xml_path}" unless @info_xml_path.file?
|
|
305
|
+
end
|
|
306
|
+
|
|
307
|
+
def resolve_changelog_path
|
|
308
|
+
configured = @options[:changelog] && Pathname(@options[:changelog])
|
|
309
|
+
return configured.expand_path if configured
|
|
310
|
+
|
|
311
|
+
[@repo_path.join("changelog.yaml"), @repo_path.join("CHANGELOG.yaml")].find(&:file?) || @repo_path.join("changelog.yaml")
|
|
312
|
+
end
|
|
313
|
+
|
|
314
|
+
def resolve_markdown_path
|
|
315
|
+
configured = @options[:markdown] && Pathname(@options[:markdown])
|
|
316
|
+
return configured.expand_path if configured
|
|
317
|
+
|
|
318
|
+
[@repo_path.join("CHANGELOG.md"), @repo_path.join("changelog.md"), @repo_path.join("changelog.generated.md")].find(&:file?) || @repo_path.join("CHANGELOG.md")
|
|
319
|
+
end
|
|
320
|
+
|
|
321
|
+
def ensure_clean_worktree!
|
|
322
|
+
status = git("status", "--porcelain").stdout
|
|
323
|
+
return if status.strip.empty?
|
|
324
|
+
|
|
325
|
+
raise Error, "working tree is dirty; rerun with --allow-dirty to bypass"
|
|
326
|
+
end
|
|
327
|
+
|
|
328
|
+
def ensure_default_branch_name
|
|
329
|
+
return @options[:default_branch] if @options[:default_branch]
|
|
330
|
+
|
|
331
|
+
remote_head = git("symbolic-ref", "refs/remotes/#{@origin_remote}/HEAD", allow_failure: true)
|
|
332
|
+
if remote_head.status.zero?
|
|
333
|
+
return remote_head.stdout.strip.split("/").last
|
|
334
|
+
end
|
|
335
|
+
|
|
336
|
+
repo = remote_repo_slug(@origin_remote)
|
|
337
|
+
response = gh_json("repo", "view", "--repo", repo, "--json", "defaultBranchRef")
|
|
338
|
+
response.fetch("defaultBranchRef").fetch("name")
|
|
339
|
+
end
|
|
340
|
+
|
|
341
|
+
def sync_default_branch(default_branch)
|
|
342
|
+
@logger.status("syncing #{default_branch}")
|
|
343
|
+
git("fetch", "--all", "--tags", "--prune")
|
|
344
|
+
current_branch = git("branch", "--show-current").stdout.strip
|
|
345
|
+
git("checkout", default_branch) unless current_branch == default_branch
|
|
346
|
+
git("pull", "--ff-only", @origin_remote, default_branch)
|
|
347
|
+
end
|
|
348
|
+
|
|
349
|
+
def create_or_reset_branch(branch_name, default_branch)
|
|
350
|
+
@logger.status("creating branch #{branch_name}")
|
|
351
|
+
git("checkout", "-B", branch_name, default_branch)
|
|
352
|
+
end
|
|
353
|
+
|
|
354
|
+
def collect_commits
|
|
355
|
+
@logger.status("collecting commits since last tag")
|
|
356
|
+
last_tag = git("describe", "--tags", "--abbrev=0", "--match", "v*", "HEAD", allow_failure: true)
|
|
357
|
+
range = last_tag.status.zero? ? "#{last_tag.stdout.strip}..HEAD" : "HEAD"
|
|
358
|
+
format = "%H%x1f%an%x1f%ae%x1f%s"
|
|
359
|
+
output = git("log", "--reverse", "--format=#{format}", range).stdout
|
|
360
|
+
output.lines.filter_map do |line|
|
|
361
|
+
sha, author_name, author_email, message = line.chomp.split("\u001F", 4)
|
|
362
|
+
next if sha.nil? || sha.empty?
|
|
363
|
+
|
|
364
|
+
CommitInfo.new(
|
|
365
|
+
sha: sha,
|
|
366
|
+
author_name: author_name,
|
|
367
|
+
author_email: author_email,
|
|
368
|
+
message: message,
|
|
369
|
+
labels: []
|
|
370
|
+
)
|
|
371
|
+
end
|
|
372
|
+
end
|
|
373
|
+
|
|
374
|
+
def enrich_commits(commits)
|
|
375
|
+
repo = remote_repo_slug(@origin_remote)
|
|
376
|
+
commits.each do |commit|
|
|
377
|
+
pulls = gh_api_json("repos/#{repo}/commits/#{commit.sha}/pulls", repo: repo, default: [])
|
|
378
|
+
pull = Array(pulls).first
|
|
379
|
+
next unless pull
|
|
380
|
+
|
|
381
|
+
detail = gh_api_json("repos/#{repo}/pulls/#{pull.fetch('number')}", repo: repo)
|
|
382
|
+
commit.pr_number = detail.fetch("number")
|
|
383
|
+
commit.pr_title = detail.fetch("title")
|
|
384
|
+
commit.pr_body = detail["body"].to_s
|
|
385
|
+
commit.pr_url = detail.fetch("html_url")
|
|
386
|
+
commit.pr_author = detail.fetch("user", {}).fetch("login", nil)
|
|
387
|
+
commit.labels = Array(detail["labels"]).map { |label| label.fetch("name") }
|
|
388
|
+
end
|
|
389
|
+
commits
|
|
390
|
+
end
|
|
391
|
+
|
|
392
|
+
def classify_commits!(commits)
|
|
393
|
+
commits.each do |commit|
|
|
394
|
+
text = [commit.pr_title, commit.pr_body, commit.message].compact.join("\n")
|
|
395
|
+
labels = commit.labels.map(&:downcase)
|
|
396
|
+
commit.classification = if breaking_change?(text, labels)
|
|
397
|
+
:breaking
|
|
398
|
+
elsif feature_change?(text, labels)
|
|
399
|
+
:feature
|
|
400
|
+
elsif fix_change?(text, labels)
|
|
401
|
+
:fix
|
|
402
|
+
else
|
|
403
|
+
:change
|
|
404
|
+
end
|
|
405
|
+
|
|
406
|
+
commit.section = case commit.classification
|
|
407
|
+
when :feature
|
|
408
|
+
"Added"
|
|
409
|
+
when :fix
|
|
410
|
+
"Fixed"
|
|
411
|
+
else
|
|
412
|
+
"Changed"
|
|
413
|
+
end
|
|
414
|
+
end
|
|
415
|
+
end
|
|
416
|
+
|
|
417
|
+
def breaking_change?(text, labels)
|
|
418
|
+
return true if labels.any? { |label| label.include?("breaking") || label == "major" }
|
|
419
|
+
|
|
420
|
+
!!(text =~ /BREAKING CHANGE|!:/i)
|
|
421
|
+
end
|
|
422
|
+
|
|
423
|
+
def feature_change?(text, labels)
|
|
424
|
+
return true if labels.any? { |label| %w[feature enhancement added minor].include?(label) || label.include?("feature") }
|
|
425
|
+
|
|
426
|
+
!!(text =~ /(^|\n)feat(?:\([^)]+\))?!?:|(^|\n)add(ed)?\b|(^|\n)feature\b|(^|\n)enh\b/i)
|
|
427
|
+
end
|
|
428
|
+
|
|
429
|
+
def fix_change?(text, labels)
|
|
430
|
+
return true if labels.any? { |label| %w[fix bug bugfix security regression patch].include?(label) || label.include?("bug") }
|
|
431
|
+
|
|
432
|
+
!!(text =~ /(^|\n)fix(?:\([^)]+\))?!?:|(^|\n)bug|(^|\n)security\b/i)
|
|
433
|
+
end
|
|
434
|
+
|
|
435
|
+
def bump_version(commits)
|
|
436
|
+
current = parse_version(@changelog.latest_entry&.fetch("version", nil) || current_info_xml_version)
|
|
437
|
+
next_parts = current.dup
|
|
438
|
+
|
|
439
|
+
if commits.any? { |commit| commit.classification == :breaking }
|
|
440
|
+
next_parts[0] += 1
|
|
441
|
+
next_parts[1] = 0
|
|
442
|
+
next_parts[2] = 0
|
|
443
|
+
elsif commits.any? { |commit| commit.classification == :feature }
|
|
444
|
+
next_parts[1] += 1
|
|
445
|
+
next_parts[2] = 0
|
|
446
|
+
else
|
|
447
|
+
next_parts[2] += 1
|
|
448
|
+
end
|
|
449
|
+
|
|
450
|
+
next_parts.join(".")
|
|
451
|
+
end
|
|
452
|
+
|
|
453
|
+
def parse_version(version)
|
|
454
|
+
match = version.to_s.match(/\A(\d+)\.(\d+)\.(\d+)\z/)
|
|
455
|
+
raise Error, "unsupported semver version: #{version.inspect}" unless match
|
|
456
|
+
|
|
457
|
+
match.captures.map(&:to_i)
|
|
458
|
+
end
|
|
459
|
+
|
|
460
|
+
def current_info_xml_version
|
|
461
|
+
xml = REXML::Document.new(@info_xml_path.read)
|
|
462
|
+
REXML::XPath.first(xml, "//info/version/text()").to_s
|
|
463
|
+
end
|
|
464
|
+
|
|
465
|
+
def build_changelog_entry(version, commits)
|
|
466
|
+
grouped = commits.group_by(&:section)
|
|
467
|
+
sections = ChangelogFile::SECTION_ORDER.filter_map do |section_name|
|
|
468
|
+
items = Array(grouped[section_name]).map { |commit| build_changelog_item(commit) }
|
|
469
|
+
next if items.empty?
|
|
470
|
+
|
|
471
|
+
{ "name" => section_name, "items" => items }
|
|
472
|
+
end
|
|
473
|
+
|
|
474
|
+
{
|
|
475
|
+
"version" => version,
|
|
476
|
+
"release_date" => Date.today.iso8601,
|
|
477
|
+
"sections" => sections
|
|
478
|
+
}
|
|
479
|
+
end
|
|
480
|
+
|
|
481
|
+
def build_changelog_item(commit)
|
|
482
|
+
text = clean_summary(commit.pr_title || commit.message)
|
|
483
|
+
text = "BREAKING: #{text}" if commit.classification == :breaking && !text.start_with?("BREAKING:")
|
|
484
|
+
|
|
485
|
+
item = {
|
|
486
|
+
"text" => text,
|
|
487
|
+
"authors" => normalize_authors(commit)
|
|
488
|
+
}
|
|
489
|
+
item["issue_number"] = commit.pr_number unless commit.pr_number.nil?
|
|
490
|
+
item
|
|
491
|
+
end
|
|
492
|
+
|
|
493
|
+
def clean_summary(text)
|
|
494
|
+
text.to_s
|
|
495
|
+
.sub(/\A(?:feat|fix|chore|docs|refactor|test|ci|build)(?:\([^)]+\))?!?:\s*/i, "")
|
|
496
|
+
.sub(/\Amerge pull request\s+#\d+.*?\n?/i, "")
|
|
497
|
+
.strip
|
|
498
|
+
end
|
|
499
|
+
|
|
500
|
+
def normalize_authors(commit)
|
|
501
|
+
candidate = commit.pr_author || commit.author_name
|
|
502
|
+
return [] if candidate.nil?
|
|
503
|
+
|
|
504
|
+
sanitized = candidate.strip.gsub(/\s+/, "-")
|
|
505
|
+
sanitized.match?(/\A[0-9A-Za-z][0-9A-Za-z-]*\z/) ? [sanitized] : []
|
|
506
|
+
end
|
|
507
|
+
|
|
508
|
+
def update_metadata_files(context)
|
|
509
|
+
@logger.status("updating changelog and app metadata")
|
|
510
|
+
@changelog.add_entry(context.fetch(:changelog_entry))
|
|
511
|
+
render_markdown_changelog
|
|
512
|
+
update_info_xml(context.fetch(:version))
|
|
513
|
+
end
|
|
514
|
+
|
|
515
|
+
def render_markdown_changelog
|
|
516
|
+
@markdown_path.dirname.mkpath
|
|
517
|
+
data = Psych.safe_load(@changelog_path.read, aliases: false, permitted_classes: [], symbolize_names: false) || {}
|
|
518
|
+
renderer = ChangelogRenderer.new(data["repository_link"])
|
|
519
|
+
@markdown_path.write(renderer.render_document(data, data.fetch("entries", [])))
|
|
520
|
+
end
|
|
521
|
+
|
|
522
|
+
def render_release_notes(version)
|
|
523
|
+
data = Psych.safe_load(@changelog_path.read, aliases: false, permitted_classes: [], symbolize_names: false) || {}
|
|
524
|
+
renderer = ChangelogRenderer.new(data["repository_link"])
|
|
525
|
+
entry = data.fetch("entries", []).find { |e| e.fetch("version") == version }
|
|
526
|
+
raise Error, "no changelog entry found for version #{version}" unless entry
|
|
527
|
+
|
|
528
|
+
renderer.render_entry(entry)
|
|
529
|
+
end
|
|
530
|
+
|
|
531
|
+
def render_release_notes_for_entry(entry)
|
|
532
|
+
renderer = ChangelogRenderer.new(@changelog.repository_link)
|
|
533
|
+
renderer.render_entry(entry)
|
|
534
|
+
end
|
|
535
|
+
|
|
536
|
+
def update_info_xml(version)
|
|
537
|
+
xml_text = @info_xml_path.read
|
|
538
|
+
updated = xml_text.sub(%r{<version>.*?</version>}, "<version>#{version}</version>")
|
|
539
|
+
updated = updated.sub(%r{<image-tag>.*?</image-tag>}, "<image-tag>#{version}</image-tag>") if updated.include?("<image-tag>")
      @info_xml_path.write(updated)
    end

    def commit_release(branch_name, version, paths)
      @logger.status("committing release #{version}")
      paths.each { |path| git("add", path) }
      git("commit", "-s", "-m", version)
    rescue Error => error
      raise unless error.message.include?("nothing to commit")

      @logger.warn("no content changes were staged for #{branch_name}")
    end

    def push_branch(branch_name)
      return if @options[:no_push]

      @logger.status("pushing #{branch_name} to #{@origin_remote}")
      git("push", "--set-upstream", @origin_remote, branch_name)
      pause_for_visibility("waiting for #{branch_name} to be visible on GitHub")
    end

    def create_pull_request(branch_name, default_branch, version, release_notes)
      repo = remote_repo_slug(@origin_remote)
      body_file = Tempfile.new(["release-notes", ".md"])
      body_file.write(release_notes)
      body_file.flush
      create_result = gh(
        "pr",
        "create",
        "--repo",
        repo,
        "--base",
        default_branch,
        "--head",
        branch_name,
        "--title",
        version,
        "--body-file",
        body_file.path,
      )
      pr_url = extract_pull_request_url(create_result.stdout)
      raise Error, "could not determine created pull request URL" if pr_url.nil?

      wait_for_release_pull_request(repo, version, branch_name, pr_url: pr_url)
    ensure
      body_file&.close!
    end

    def extract_pull_request_url(text)
      text.to_s.lines.reverse_each do |line|
        match = line.match(%r{https://github\.com/[^\s]+/pull/\d+})
        return match[0] if match
      end

      nil
    end

    def latest_version
      @changelog.latest_entry&.fetch("version") || current_info_xml_version
    end

    def find_release_pull_request(version)
      repo = remote_repo_slug(@origin_remote)
      wait_for_release_pull_request(repo, version, "#{@options[:branch_prefix]}#{version}")
    end

    def ensure_pull_request_merged(pr_number, version)
      repo = remote_repo_slug(@origin_remote)
      begin
        @logger.status("merging PR ##{pr_number} for #{version}")
        gh("pr", "merge", pr_number.to_s, "--repo", repo, "--squash", "--admin")
      rescue Error => error
        @logger.warn("automatic merge failed: #{error.message.lines.first.strip}")
        @logger.warn("waiting for the PR to be merged manually")
      end

      wait_for_merge(repo, pr_number)
    end

    def wait_for_merge(repo, pr_number)
      deadline = Time.now + @options[:poll_timeout]
      loop do
        pr = gh_json("pr", "view", pr_number.to_s, "--repo", repo, "--json", "number,state,mergedAt,url")
        return pr if pr["mergedAt"]

        if pr["state"] == "CLOSED"
          raise Error, "PR #{pr.fetch('url')} was closed without merging"
        end

        raise Error, "timed out waiting for #{pr.fetch('url')} to merge" if Time.now >= deadline

        @logger.info("PR not merged yet, sleeping #{@options[:poll_interval]}s")
        sleep(@options[:poll_interval]) unless @options[:dry_run]
      end
    end
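The merge wait follows a deadline-polling pattern: probe, return on a terminal state, raise once the deadline passes, otherwise sleep and retry. A minimal sketch of that loop with an injected probe (the `poll_until` helper and its fake probe are stand-ins for exposition, not part of the gem):

```ruby
# Deadline-polling skeleton: `probe` returns a truthy result when done
# and nil while pending; the loop raises once `deadline` passes.
def poll_until(deadline, interval, &probe)
  loop do
    result = probe.call
    return result if result

    raise "timed out" if Time.now >= deadline

    sleep(interval)
  end
end

# Fake probe that succeeds on the third call, standing in for `gh pr view`.
calls = 0
result = poll_until(Time.now + 5, 0) do
  calls += 1
  calls >= 3 ? :merged : nil
end

puts result # => merged
puts calls  # => 3
```

Checking the deadline only after a failed probe means a probe that succeeds exactly at the deadline still returns instead of raising.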

    def create_and_push_tag(tag_name)
      @logger.status("tagging #{tag_name}")
      git("tag", tag_name)
      git("push", @origin_remote, tag_name)
      git("push", @release_remote, tag_name) if @release_remote_exists
      pause_for_visibility("waiting for #{tag_name} to be visible on GitHub")
    end

    def create_github_release(remote_name, tag_name, version, release_notes)
      repo = remote_repo_slug(remote_name)
      @logger.status("creating GitHub release #{version} in #{remote_name}")
      notes_file = Tempfile.new(["release-notes", ".md"])
      begin
        notes_file.write(release_notes)
        notes_file.flush
        gh("release", "create", tag_name, "--repo", repo, "--title", version, "--notes-file", notes_file.path, "--verify-tag")
      ensure
        notes_file.close!
      end
    end

    def wait_for_release_pull_request(repo, version, branch_name, pr_url: nil)
      deadline = Time.now + @options[:poll_timeout]

      loop do
        if pr_url
          view_result = gh("pr", "view", pr_url, "--repo", repo, "--json", "number,url,title,state,mergedAt,headRefName", allow_failure: true)
          return JSON.parse(view_result.stdout) if view_result.status.zero?
        end

        pr = query_release_pull_request(repo, version, branch_name)
        return pr if pr

        raise Error, "could not find release PR for #{version}" if Time.now >= deadline

        pause_for_visibility("waiting for release PR #{version} to become visible")
      end
    end

    def query_release_pull_request(repo, version, branch_name)
      prs = gh_json("pr", "list", "--repo", repo, "--state", "all", "--search", version, "--json", "number,title,url,state,mergedAt,headRefName")
      Array(prs).find { |pr| pr["title"] == version || pr["headRefName"] == branch_name }
    end

    def pause_for_visibility(message)
      @logger.info("#{message}; sleeping #{@options[:poll_interval]}s")
      sleep(@options[:poll_interval]) unless @options[:dry_run]
    end

    def monitor_repo_runs(remote_name, event, commit_sha, version)
      repo = remote_repo_slug(remote_name)
      deadline = Time.now + @options[:poll_timeout]
      matching_run = nil

      loop do
        runs = gh_api_json("repos/#{repo}/actions/runs?event=#{event}&per_page=20", repo: repo, default: {}).fetch("workflow_runs", [])
        matching_run = runs.find do |run|
          run.fetch("head_sha", "") == commit_sha || run.fetch("display_title", "").include?(version)
        end

        break if matching_run || Time.now >= deadline

        @logger.info("waiting for #{remote_name} #{event} workflow run")
        sleep(@options[:poll_interval]) unless @options[:dry_run]
      end

      unless matching_run
        @logger.warn("no matching #{event} workflow run found in #{remote_name}")
        return
      end

      loop do
        run = gh_api_json("repos/#{repo}/actions/runs/#{matching_run.fetch('id')}", repo: repo)
        status = run.fetch("status")
        conclusion = run["conclusion"]
        if status == "completed"
          if conclusion == "success"
            @logger.info("workflow succeeded: #{run.fetch('html_url')}")
          else
            @logger.warn("workflow #{conclusion || 'failed'}: #{run.fetch('html_url')}")
          end
          return
        end

        raise Error, "timed out waiting for workflow #{run.fetch('html_url')}" if Time.now >= deadline

        @logger.info("workflow #{run.fetch('name')} is #{status}; sleeping #{@options[:poll_interval]}s")
        sleep(@options[:poll_interval]) unless @options[:dry_run]
      end
    end

    def relative_path(path)
      Pathname(path).relative_path_from(@repo_path).to_s
    end

    def remote_exists?(remote_name)
      result = git("remote", "get-url", remote_name, allow_failure: true)
      result.status.zero?
    end

    def remote_repo_slug(remote_name)
      url = git("remote", "get-url", remote_name).stdout.strip
      match = url.match(%r{github\.com[:/](.+?)(?:\.git)?\z})
      raise Error, "unsupported GitHub remote URL for #{remote_name}: #{url}" unless match

      match[1]
    end
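The slug regex accepts both SSH and HTTPS GitHub remotes, and the lazy capture with the optional `.git` group strips the suffix when present. Checking it against the two common URL shapes (example URLs, not taken from the gem):

```ruby
# Same pattern as remote_repo_slug: match after "github.com:" or
# "github.com/", lazily capture the slug, drop a trailing ".git".
SLUG_PATTERN = %r{github\.com[:/](.+?)(?:\.git)?\z}

# Illustrative remote URLs in both SSH and HTTPS form.
urls = ["git@github.com:acme/app.git", "https://github.com/acme/app"]
slugs = urls.map { |url| url.match(SLUG_PATTERN)[1] }

puts slugs
# => acme/app
#    acme/app
```

Because `(.+?)` is lazy, the engine prefers the shorter capture that lets `(?:\.git)?` consume the suffix, so `.git` never leaks into the slug.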

    def gh_api_json(path, repo:, default: nil)
      result = gh("api", path, "-H", "Accept: application/vnd.github+json", allow_failure: !default.nil?)
      return default if !default.nil? && result.status != 0

      JSON.parse(result.stdout)
    end

    def gh_json(*args)
      result = gh(*args)
      JSON.parse(result.stdout)
    end

    def git(*args, allow_failure: false)
      @shell.capture("git", *args, chdir: @repo_path.to_s, allow_failure: allow_failure)
    end

    def gh(*args, allow_failure: false)
      @shell.capture("gh", *args, chdir: @repo_path.to_s, allow_failure: allow_failure)
    end
  end

  class CLI
    SUBCOMMANDS = %w[prepare publish run monitor].freeze

    def initialize(argv)
      @argv = argv.dup
    end

    def run
      subcommand = @argv.shift
      return usage if subcommand.nil? || %w[-h --help help].include?(subcommand)
      raise Error, "unknown subcommand: #{subcommand}" unless SUBCOMMANDS.include?(subcommand)

      options = default_options
      parser = option_parser(options)
      parser.parse!(@argv)

      manager = ReleaseManager.new(options)
      case subcommand
      when "prepare"
        manager.prepare
      when "publish"
        manager.publish(@argv.shift)
      when "run"
        manager.run
      when "monitor"
        manager.monitor(@argv.shift)
      end
      0
    rescue Error, OptionParser::ParseError => error
      $stderr.puts("[ERROR] #{error.message}")
      1
    end

    private

    def usage
      puts(option_parser(default_options))
      0
    end

    def default_options
      {
        repo: Dir.pwd,
        changelog: nil,
        markdown: nil,
        remote: "origin",
        release_remote: "release",
        default_branch: nil,
        branch_prefix: "release/",
        dry_run: false,
        no_push: false,
        allow_dirty: false,
        monitor: false,
        poll_interval: 15,
        poll_timeout: 1800
      }
    end

    def option_parser(options)
      OptionParser.new do |parser|
        parser.banner = <<~TEXT
          Usage: nextcloud-release-agent <prepare|publish|run|monitor> [options] [version]

          Commands:
            prepare   Update changelog and appinfo, create a release branch, push it, and open a PR.
            publish   Merge the PR if possible, tag the merge commit, push tags, create releases, and optionally monitor workflows.
            run       Execute prepare and publish in one pass.
            monitor   Watch GitHub Actions runs related to the given or latest version.
        TEXT

        parser.on("--repo PATH", "Target app repository") { |value| options[:repo] = value }
        parser.on("--changelog PATH", "Path to changelog.yaml inside the target repo") { |value| options[:changelog] = value }
        parser.on("--markdown PATH", "Path to rendered changelog markdown inside the target repo") { |value| options[:markdown] = value }
        parser.on("--remote NAME", "Git remote used for the development repository (default: origin)") { |value| options[:remote] = value }
        parser.on("--release-remote NAME", "Git remote used for the release repository (default: release)") { |value| options[:release_remote] = value }
        parser.on("--default-branch NAME", "Override default branch detection") { |value| options[:default_branch] = value }
        parser.on("--branch-prefix PREFIX", "Release branch prefix (default: release/)") { |value| options[:branch_prefix] = value }
        parser.on("--allow-dirty", "Allow running with a dirty worktree") { options[:allow_dirty] = true }
        parser.on("--no-push", "Skip pushing the release branch during prepare") { options[:no_push] = true }
        parser.on("--monitor", "Monitor workflow runs after publish") { options[:monitor] = true }
        parser.on("--poll-interval SECONDS", Integer, "Polling interval for PR and workflow checks") { |value| options[:poll_interval] = value }
        parser.on("--poll-timeout SECONDS", Integer, "Maximum wait time for merge and workflow checks") { |value| options[:poll_timeout] = value }
        parser.on("--dry-run", "Print commands without mutating the target repository") { options[:dry_run] = true }
        parser.on("-h", "--help", "Show help") do
          puts(parser)
          exit(0)
        end
      end
    end
  end
end
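The flag handling above is plain OptionParser mutation of a pre-seeded options hash. A reduced sketch of the same pattern with two of the flags (defaults mirror the ones above; the handlers are simplified):

```ruby
require "optparse"

# Seed the hash with defaults, then let the parser overwrite entries.
options = { poll_interval: 15, dry_run: false }

parser = OptionParser.new do |p|
  # Integer coercion, as in the --poll-interval declaration above.
  p.on("--poll-interval SECONDS", Integer) { |v| options[:poll_interval] = v }
  p.on("--dry-run") { options[:dry_run] = true }
end

parser.parse!(["--poll-interval", "30", "--dry-run"])

puts options[:poll_interval] # => 30
puts options[:dry_run]       # => true
```

Passing the hash into the `on` handlers keeps defaults and overrides in one place, which is why `default_options` is built before `option_parser` is invoked.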
data/nextcloud_release_agent.gemspec
ADDED
@@ -0,0 +1,21 @@
require_relative "lib/nextcloud_release_agent/version"

Gem::Specification.new do |spec|
  spec.name = "nextcloud_release_agent"
  spec.version = NextcloudReleaseAgent::VERSION
  spec.authors = ["Anupam Kumar"]
  spec.email = ["kyteinsky@gmail.com"]

  spec.summary = "Ruby CLI for releasing Nextcloud apps using `git` and `gh`."
  spec.description = "Automates changelog updates, semver bumps, PR creation, tagging, releases, and workflow monitoring for Nextcloud app repositories."
  spec.homepage = "https://github.com/kyteinsky/nextcloud_release_agent"
  spec.license = "AGPL-3.0-only"

  spec.files = Dir.glob("{exe,lib}/**/*") + %w[Gemfile README.md nextcloud_release_agent.gemspec]
  spec.bindir = "exe"
  spec.executables = ["nextcloud-release-agent"]
  spec.require_paths = ["lib"]

  spec.required_ruby_version = ">= 3.1"
  spec.add_runtime_dependency "rexml", "~> 3.2"
end
metadata
ADDED
@@ -0,0 +1,64 @@
--- !ruby/object:Gem::Specification
name: nextcloud_release_agent
version: !ruby/object:Gem::Version
  version: 0.1.0
platform: ruby
authors:
- Anupam Kumar
bindir: exe
cert_chain: []
date: 1980-01-02 00:00:00.000000000 Z
dependencies:
- !ruby/object:Gem::Dependency
  name: rexml
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - "~>"
      - !ruby/object:Gem::Version
        version: '3.2'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - "~>"
      - !ruby/object:Gem::Version
        version: '3.2'
description: Automates changelog updates, semver bumps, PR creation, tagging, releases,
  and workflow monitoring for Nextcloud app repositories.
email:
- kyteinsky@gmail.com
executables:
- nextcloud-release-agent
extensions: []
extra_rdoc_files: []
files:
- Gemfile
- README.md
- exe/nextcloud-release-agent
- lib/nextcloud_release_agent.rb
- lib/nextcloud_release_agent/changelog_renderer.rb
- lib/nextcloud_release_agent/cli.rb
- lib/nextcloud_release_agent/version.rb
- nextcloud_release_agent.gemspec
homepage: https://github.com/kyteinsky/nextcloud_release_agent
licenses:
- AGPL-3.0-only
metadata: {}
rdoc_options: []
require_paths:
- lib
required_ruby_version: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
    - !ruby/object:Gem::Version
      version: '3.1'
required_rubygems_version: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
    - !ruby/object:Gem::Version
      version: '0'
requirements: []
rubygems_version: 3.6.9
specification_version: 4
summary: Ruby CLI for releasing Nextcloud apps using `git` and `gh`.
test_files: []