oss-stats 0.0.1 → 0.0.2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
1
1
  ---
2
2
  SHA256:
3
- metadata.gz: 9343ec72ca5f1412829f0cfd8f64ced3c725ab9a01198a228e2caa9d3540c7f6
4
- data.tar.gz: c5ff104f234c412c7059fd3d914fda9c5f72466427ad00bc29bf1ad2e0a0b10e
3
+ metadata.gz: cf35db5f39b5f26e0ea59ca26d502137077aa46b18812dcfb2d3355c11fb8606
4
+ data.tar.gz: 37a81f53394b9ea8a6da952dbc17c47a8f74be94216829dc5826500c8fb2d762
5
5
  SHA512:
6
- metadata.gz: 5c455dd20bf92a07a843ecb6549e43ca760d94d423868cdcb2833c012d43149660c7963e664bd24a8bbae5caf7fb558400bc1f06f01cb23b11190da6f51cfcde
7
- data.tar.gz: f5912b0c5b8a1bf7c9f84da22562903f8ce6b79bc52efe6cefb1d5e1d3e01460184a382a4b8c6f32e56c7553d4b3f4e806e0ec9acba54ef6f6212c1a71b48281
6
+ metadata.gz: c06f182855176fa481653ea05574ba5c95eef1251a07cda209e4279dcbbdbc96f24de9792d5c680f0aa9995b512433ec533caef8a6fb43d754591bcb8a094eb6
7
+ data.tar.gz: 9edd71a6a6d5230e17cc42b8ccea40b6b385fb034ea65bbdd5255a2a8479a44c26f2a0a7c93a83eed42143d5983891c293b07b2b426a3b0f4fdb5c607ea0122a
data/CHANGELOG.md CHANGED
@@ -1,5 +1,13 @@
1
1
  # oss-stats Changelog
2
2
 
3
+ ## 0.0.2 (2025-07-10)
4
+
5
+ * Setup now uses the gem instead of a separate checkout
6
+ * Massive improvements to setup script
7
+ * Add "convert" option to setup script to move to gem-based setup
8
+ * Fix some ruby warnings
9
+ * Fix bug in default config file
10
+
3
11
  ## 0.0.1 (2025-06-15)
4
12
 
5
13
  * Initial release!
data/Gemfile CHANGED
@@ -3,8 +3,6 @@ source 'https://rubygems.org'
3
3
  gem 'oss-stats', path: '.'
4
4
 
5
5
  group(:development) do
6
- # for cookstyle
7
- gem 'base64'
8
6
  gem 'cookstyle'
9
7
  gem 'mdl'
10
8
  gem 'rspec'
data/README.md CHANGED
@@ -7,7 +7,8 @@
7
7
  This is a collection of tools that aim to make it easier to track and report
8
8
  various metrics around health of an open source project.
9
9
 
10
- * [How to use this repo](#how-to-use-this-repo)
10
+ * [Installation](#installation)
11
+ * [Converting from pre-gem versions](#converting-from-pre-gem-versions)
11
12
  * [Tools in this repo](#tools-in-this-repo)
12
13
  * [Repo Stats](#repo-stats)
13
14
  * [Pipeline Visibility Stats](#pipeline-visibility-stats)
@@ -15,34 +16,86 @@ various metrics around health of an open source project.
15
16
  * [Promises](#promises)
16
17
  * [Authentication](#authentication)
17
18
 
18
- ## How to use this repo
19
+ ## Installation
19
20
 
20
- You'll want to create your own repository to keep the data and results that
21
- these scripts use and generate about your project.
21
+ You'll want to create your own directory or git repository to keep the data and
22
+ results that these scripts use and generate about your project. Whether or not
23
+ your directory is actually a git repo doesn't matter, but we recommend making
24
+ it one.
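For example (the directory name here is just a placeholder), a fresh git-backed directory could be created with:

```bash
# Hypothetical project directory name; any empty directory works
mkdir my-project-oss-stats
cd my-project-oss-stats
git init   # optional, but recommended above
```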
22
25
 
23
- Currently docs and tools are all setup for you to keep your repo and this repo
24
- checked out at the same level and use `oss-stats` directly from git. Once we
25
- get a release out the door, we'll update this with alternative options.
26
+ In your fresh directory, run this command to set everything up:
26
27
 
27
- This repo has a script that'll do all the required initial work. On your new
28
- repo, do:
28
+ ```bash
29
+ \curl -sSL https://raw.githubusercontent.com/jaymzh/oss-stats/refs/heads/main/bin/initialize_repo.sh | bash -s
30
+ ```
31
+
32
+ You can pass in some options like:
29
33
 
30
- ```shell
31
- <path_to_this_repo>/scripts/intialize_repo.sh
34
+ ```bash
35
+ \curl -sSL https://raw.githubusercontent.com/jaymzh/oss-stats/refs/heads/main/bin/initialize_repo.sh | bash -s -- <options>
32
36
  ```
33
37
 
34
- This will:
38
+ You can find valid options with `-h`.
35
39
 
36
- * Generate basic config skeleton files for the various scripts
37
- * Create necessary directories
38
- * Setup a GitHub Actions workflow for you
40
+ This will create a Gemfile that depends on the `git` version of the `oss-stats`
41
+ gem, install the bundle, set up the binstubs in `./bin`, create some config
42
+ files for you, and even set up GitHub Workflows!
39
43
 
40
44
  You can run it with `-n` (dryrun) to see what it will do without actually
41
45
  doing anything.
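As a concrete sketch (the URL and the `-n` flag are taken from the snippets and option list above), a dry run looks like:

```bash
# Preview what the initializer would do without writing anything (-n = dryrun)
\curl -sSL https://raw.githubusercontent.com/jaymzh/oss-stats/refs/heads/main/bin/initialize_repo.sh | bash -s -- -n
```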
42
46
 
47
+ It'll look like:
48
+
49
+ ```text
50
+ Welcome to oss-stats!
51
+
52
+ We'll go ahead and setup this directory to be ready to track your open source
53
+ stats!
54
+
55
+ ➤ Initializing Gemfile to depend on oss-stats
56
+ ➤ Installing gem bundle
57
+ ➤ Making necessary directories
58
+ ➤ Copying basic skeleton files
59
+ ➤ Creating initial config files
60
+ ➤ Setting up GH Workflows
61
+
62
+ OK, this directory is set up.
63
+
64
+ NEXT STEPS:
65
+
66
+ 1. Edit `repo_stats_config.rb` in this directory to specify
67
+ what repositories you care about, and change anything else you may be
68
+ interested in.
69
+ 2. Run a sample report with: `./bin/repo_stats.rb`
70
+
71
+ We recommend running it regularly (e.g. weekly) and storing the output in the
72
+ repo_reports directory we've created, like so:
73
+
74
+ date=$(date '+%Y-%m-%d')
75
+ out="repo_reports/${date}.md"
76
+ for repo in $repos; do
77
+ ./bin/repo_stats.rb >> $out
78
+ done
79
+
80
+ Then you can also check `promise_stats`, `pipeline_visibility_stats`, and
81
+ `meeting_stats`.
82
+ ```
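A minimal way to follow that recommendation (the `repo_reports` path mirrors the directory the script creates; adjust as needed):

```bash
# Capture a dated repo_stats report into the repo_reports directory
date=$(date '+%Y-%m-%d')
./bin/repo_stats.rb > "repo_reports/${date}.md"
```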
83
+
43
84
  You can see an example of a downstream repo at
44
85
  [chef-oss-stats](https://github.com/jaymzh/chef-oss-stats/).
45
86
 
87
+ ### Converting from pre-gem versions
88
+
89
+ If you ran the setup script before it used the gem (and required `../oss-stats`
90
+ to exist), you can convert to the new setup. Make sure your `oss-stats` checkout
91
+ is updated, and then run:
92
+
93
+ ```bash
94
+ ../oss-stats/bin/initialize_repo.sh -c
95
+ ```
96
+
97
+ Once you've done that, your `oss-stats` checkout is no longer necessary.
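A quick post-conversion sanity pass, based on the script's own convert notes (the exact files worth reviewing will vary), might look like:

```bash
git diff README.md        # keep or revert the regenerated README (`git checkout README.md`)
git status                # new binstubs live in bin/, helper scripts in scripts/
git add bin/ scripts/
./bin/repo_stats.rb       # confirm the gem-based binstub works
```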
98
+
46
99
  ## Tools in this repo
47
100
 
48
101
  ### Repo Stats
@@ -0,0 +1,326 @@
1
+ #!/bin/bash
2
+
3
+ CONVERT=0
4
+ DRYRUN=0
5
+ DEBUG=0
6
+ FORCE=0
7
+ BRANCH=""
8
+ NOGIT=0
9
+ VERSION=""
10
+ DIRS=(
11
+ scripts
12
+ data
13
+ repo_reports
14
+ pipeline_visibility_reports
15
+ promises
16
+ images
17
+ .github/workflows
18
+ )
19
+ OSS_STATS_PATH=""
20
+
21
+ step() {
22
+ echo "➤ $*"
23
+ }
24
+
25
+ warn() {
26
+ echo "WARNING: $*" >&2
27
+ }
28
+
29
+ err() {
30
+ echo "ERROR: $*" >&2
31
+ }
32
+
33
+ die() {
34
+ err "$@"
35
+ exit 1
36
+ }
37
+
38
+ debug() {
39
+ [[ "$DEBUG" -eq 0 ]] && return
40
+
41
+ echo "DEBUG: $*" >&2
42
+ }
43
+
44
+ run() {
45
+ if [[ "$DRYRUN" -ne 0 ]]; then
46
+ echo "DRYRUN: $*"
47
+ return 0
48
+ fi
49
+ "$@"
50
+ }
51
+
52
+ usage() {
53
+ cat <<EOF
54
+ $0 <options>
55
+
56
+ This script will initialize a new repo that utilizes oss-stats. It should
57
+ be run in an empty directory, or a new git clone.
58
+
59
+ Options:
60
+ -b <branch>
61
+ Install oss-stats gem from a specific branch. You probably do
62
+ not want this.
63
+
64
+ -c
65
+ Convert Mode. If you set up oss-stats back when it required
66
+ a checkout of your downstream repo and oss-stats next to each
67
+ other, you can use this option to convert your repo to use the
68
+ gem instead.
69
+
70
+ -d
71
+ Enable debug output.
72
+
73
+ -f
74
+ Force. Copy over files, even if the directory is not empty. Not
75
+ recommended.
76
+
77
+ -G
78
+ When installing oss-stats, don't use git, instead use wahtever
79
+ is latest. Not recommended. See also '-V'.
80
+
81
+ -h
82
+ Print this help message.
83
+
84
+ -n
85
+ Dryrun. Don't do any work, just say what you would do.
86
+
87
+ -V <version_constraint>
88
+ When installing oss-stats, don't use git, and specifically use
89
+ this version of the gem. Must be in gem-constraint format.
90
+
91
+ EOF
92
+ }
93
+
94
+ do_gem() {
95
+ step "Initializing Gemfile to depend on oss-stats"
96
+ if [ "$CONVERT" -eq 1 ]; then
97
+ run rm Gemfile Gemfile.lock
98
+ fi
99
+ gemfile_line="gem 'oss-stats'"
100
+ if [[ -n "$VERSION" ]]; then
101
+ gemfile_line="$gemfile_line, $VERSION"
102
+ elif [[ "$NOGIT" -eq 0 ]]; then
103
+ gemfile_line="$gemfile_line,\n git: 'https://github.com/jaymzh/oss-stats.git'"
104
+ if [[ -n "$BRANCH" ]]; then
105
+ gemfile_line="$gemfile_line,\n branch: '$BRANCH'"
106
+ fi
107
+ fi
108
+
109
+ if [ -e Gemfile ] && [ "$CONVERT" -eq 0 ]; then
110
+ if grep -q 'oss-stats' Gemfile; then
111
+ warn "Gemfile already populated with oss-stats, skipping"
112
+ else
113
+ warn "Gemfile already exists, adding oss-stats"
114
+ if [ "$DRYRUN" -eq 0 ]; then
115
+ echo -e "$gemfile_line" >> Gemfile
116
+ fi
117
+ fi
118
+ else
119
+ cat >Gemfile <<EOF
120
+ source 'https://rubygems.org'
121
+
122
+ $(echo -e "$gemfile_line")
123
+
124
+ group(:development) do
125
+ gem 'cookstyle'
126
+ gem 'mdl'
127
+ gem 'rspec'
128
+ end
129
+ EOF
130
+ fi
131
+
132
+ step 'Installing gem bundle'
133
+ if ! out=$(run bundle install); then
134
+ die "Failed to install bundle. Output: \n$out"
135
+ fi
136
+ if ! out=$(run bundle update oss-stats); then
137
+ die "Failed to install latest oss-stats. Output: \n$out"
138
+ fi
139
+ run bundle binstubs oss-stats
140
+ OSS_STATS_PATH=$(bundle show oss-stats)
141
+ }
142
+
143
+ do_directories() {
144
+ step 'Making necessary directories'
145
+ run mkdir -p "${DIRS[@]}"
146
+ }
147
+
148
+ gen_file() {
149
+ local file="$1"
150
+ local content="$2"
151
+
152
+ if [ -e "$file" ]; then
153
+ warn "$file exists, skipping"
154
+ return
155
+ fi
156
+
157
+ if [ "$DRYRUN" -eq 0 ]; then
158
+ echo -e "$content" > "$file"
159
+ else
160
+ echo "DRYRUN: echo \"$content\" > $file"
161
+ fi
162
+ }
163
+
164
+ do_files() {
165
+ local dst
166
+ local file
167
+ step 'Copying basic skeleton files'
168
+
169
+ # top level files
170
+ mapfile -t files < <(
171
+ find "$OSS_STATS_PATH/initialization_data/" -maxdepth 1 -type f
172
+ )
173
+ debug "Copying files: ${files[*]}"
174
+ for file in "${files[@]}"; do
175
+ dst=$(basename "$file")
176
+ run cp "$file" "$dst"
177
+ done
178
+
179
+ # scripts
180
+ mapfile -t files < <(
181
+ find "$OSS_STATS_PATH/initialization_data/scripts" -maxdepth 1 -type f
182
+ )
183
+ debug "Copying scripts: ${files[*]}"
184
+ for file in "${files[@]}"; do
185
+ dst=$(basename "$file")
186
+ run cp "$file" "scripts/$dst"
187
+ chmod +x "scripts/$dst"
188
+ done
189
+ }
190
+
191
+ do_config_files() {
192
+ step 'Creating initial config files'
193
+ for file in "$OSS_STATS_PATH/examples/"*_config.rb; do
194
+ f=$(basename "$file")
195
+ # even if we're in force/convert, skip config files that exist
196
+ if [ -e "$f" ]; then
197
+ warn "Config file $f already exists, skipping"
198
+ else
199
+ run cp "$file" .
200
+ fi
201
+ done
202
+ }
203
+
204
+ do_gh_workflows() {
205
+ step 'Setting up GH Workflows'
206
+ for file in "$OSS_STATS_PATH/initialization_data/github_workflows/"*; do
207
+ f=".github/workflows/$(basename "$file")"
208
+ run cp "$file" "$f"
209
+ done
210
+ }
211
+
212
+ do_instructions() {
213
+ if [ "$CONVERT" -eq 1 ]; then
214
+ do_convert_instructions
215
+ else
216
+ do_install_instructions
217
+ fi
218
+ }
219
+
220
+ do_install_instructions() {
221
+ cat <<'EOF'
222
+ OK, this directory is set up.
223
+
224
+ NEXT STEPS:
225
+
226
+ 1. Edit `repo_stats_config.rb` in this directory to specify
227
+ what repositories you care about, and change anything else you may be
228
+ interested in.
229
+ 2. Run a sample report with: `./bin/repo_stats.rb`
230
+
231
+ We recommend running it regularly (e.g. weekly) and storing the output in the
232
+ repo_reports directory we've created, ala:
233
+
234
+ date=$(date '+%Y-%m-%d')
235
+ out="repo_reports/${date}.md"
236
+ for repo in $repos; do
237
+ ./bin/repo_stats.rb >> $out
238
+ done
239
+
240
+ Then you can also check `promise_stats`, `pipeline_visibility_stats`, and
241
+ `meeting_stats` - these are all in `./bin`
242
+ EOF
243
+ }
244
+
245
+ do_convert_instructions() {
246
+ cat <<'EOF'
247
+ We've done our best to convert your repo. Some things to check for:
248
+
249
+ * You will probably want to keep your own README if you've made changes,
250
+ so run `git diff README.md`, and optionally `git checkout README.md`
251
+ * If you have your own CI stuff, be sure to check out the changes to the
252
+ GitHub workflow files
253
+ * New stuff is in bin/ and scripts/ - be sure to git add them
254
+ * Finally, do a pass on `git diff` and `git status`, to make sure you like
255
+ what you see.
256
+
257
+ From now on, run scripts with `./bin/<script>` instead of
258
+ `../oss-stats/bin/<script>` and other than that, it should be the same!
259
+ EOF
260
+ }
261
+
262
+ while getopts b:cdfGhnV: opt; do
263
+ case "$opt" in
264
+ b)
265
+ BRANCH="$OPTARG"
266
+ ;;
267
+ c)
268
+ debug "Activating CONVERT mode"
269
+ CONVERT=1
270
+ ;;
271
+ d)
272
+ debug "Activating DEBUG mode"
273
+ DEBUG=1
274
+ ;;
275
+ f)
276
+ debug "Activating FORCE mode"
277
+ FORCE=1
278
+ ;;
279
+ G)
280
+ NOGIT=1
281
+ ;;
282
+ h)
283
+ usage
284
+ exit
285
+ ;;
286
+ n)
287
+ DRYRUN=1
288
+ ;;
289
+ V)
290
+ VERSION="$OPTARG"
291
+ ;;
292
+ ?)
293
+ exit 1
294
+ ;;
295
+ esac
296
+ done
297
+
298
+ # shellcheck disable=SC2012
299
+ num=$(ls | wc -l)
300
+ if [[ "$num" -ne 0 ]]; then
301
+ if [[ "$FORCE" -eq 1 ]]; then
302
+ warn "Directory not empty, but force is on"
303
+ elif [[ "$CONVERT" -eq 1 ]]; then
304
+ warn "Directory not empty, but convert is on"
305
+ else
306
+ die "Script should be run in an empty directory"
307
+ fi
308
+ fi
309
+
310
+ cat <<'EOF'
311
+ Welcome to oss-stats!
312
+
313
+ We'll go ahead and set up this directory to be ready to track your open source
314
+ stats!
315
+
316
+ EOF
317
+
318
+ do_gem
319
+ debug "OSS_STATS_PATH is $OSS_STATS_PATH"
320
+ do_directories
321
+ do_files
322
+ do_config_files
323
+ do_gh_workflows
324
+ do_instructions
325
+
326
+
data/bin/meeting_stats CHANGED
@@ -257,13 +257,13 @@ def summary(db_file)
257
257
  db.close
258
258
 
259
259
  # TODO: de-dupe this with generate_plots
260
- dates = data.map { |row| row[0] }.uniq.reverse
260
+ dates = data.map(&:first).uniq.reverse
261
261
  dates[0..2].each do |date|
262
- total_teams = data.count { |row| row[0] == date }
263
- present_teams = data.count { |row| row[0] == date && row[2] == 'Y' }
262
+ total_teams = data.count { |row| row.first == date }
263
+ present_teams = data.count { |row| row.first == date && row[2] == 'Y' }
264
264
  present_pct = ((present_teams / total_teams) * 100).round(2)
265
265
  reporting_builds = data.count do |row|
266
- row[0] == date && row[3] != 'N' && !row[3].strip.empty?
266
+ row.first == date && row[3] != 'N' && !row[3].strip.empty?
267
267
  end
268
268
  reporting_builds_pct =
269
269
  ((reporting_builds.to_f / total_teams) * 100).round(2)
@@ -284,15 +284,15 @@ def generate_plots(db_file, img_dir)
284
284
  )
285
285
  db.close
286
286
 
287
- dates = data.map { |row| row[0] }.uniq
287
+ dates = data.map(&:first).uniq
288
288
  attendance_percentages = []
289
289
  build_status_percentages = []
290
290
 
291
291
  dates.each do |date|
292
- total_teams = data.count { |row| row[0] == date }
293
- present_teams = data.count { |row| row[0] == date && row[2] == 'Y' }
292
+ total_teams = data.count { |row| row.first == date }
293
+ present_teams = data.count { |row| row.first == date && row[2] == 'Y' }
294
294
  reporting_builds = data.count do |row|
295
- row[0] == date && row[3] != 'N' && !row[3].strip.empty?
295
+ row.first == date && row[3] != 'N' && !row[3].strip.empty?
296
296
  end
297
297
 
298
298
  attendance_percentages <<
@@ -307,7 +307,7 @@ def generate_plots(db_file, img_dir)
307
307
  }
308
308
 
309
309
  sizes.each do |name, size|
310
- g = Gruff::Line.new(size[0], size[1])
310
+ g = Gruff::Line.new(size.first, size[1])
311
311
  g.maximum_value = 100
312
312
  g.minimum_value = 0
313
313
  g.title = 'Percentage of Teams Present Over Time'
@@ -315,7 +315,7 @@ def generate_plots(db_file, img_dir)
315
315
  g.labels = dates.each_with_index.to_h
316
316
  g.write(::File.join(img_dir, "attendance-#{name}.png"))
317
317
 
318
- g2 = Gruff::Line.new(size[0], size[1])
318
+ g2 = Gruff::Line.new(size.first, size[1])
319
319
  g2.maximum_value = 100
320
320
  g2.minimum_value = 0
321
321
  g2.title = 'Percentage of Teams Reporting Build Status Over Time'
@@ -6,7 +6,6 @@ require 'json'
6
6
  require 'mixlib/shellout'
7
7
  require 'net/http'
8
8
  require 'optparse'
9
- require 'set'
10
9
  require 'uri'
11
10
  require 'yaml'
12
11
 
@@ -254,11 +253,10 @@ def process_buildkite_pipelines(
254
253
  total_pipeline_count += 1
255
254
  next if visibility.casecmp('public').zero?
256
255
 
257
- if reported_slugs.add?(report_key)
258
- log.debug("Pipeline #{report_key} is #{visibility} (direct)")
259
- repo_missing_public << slug
260
- private_pipeline_count += 1
261
- end
256
+ next unless reported_slugs.add?(report_key)
257
+ log.debug("Pipeline #{report_key} is #{visibility} (direct)")
258
+ repo_missing_public << slug
259
+ private_pipeline_count += 1
262
260
  end
263
261
 
264
262
  # However, more likely, we don't have access to see the pipeline or
data/bin/promise_stats CHANGED
@@ -68,7 +68,7 @@ def update_promise_status(config, date, new_status)
68
68
 
69
69
  print 'Enter ID to update: '
70
70
  chosen_id = gets.strip.to_i
71
- if rows.any? { |r| r[0] == chosen_id }
71
+ if rows.any? { |r| r.first == chosen_id }
72
72
  if new_status == 'resolved'
73
73
  db.execute(
74
74
  "UPDATE promises SET resolved_on = ?, status = 'resolved' " +
@@ -107,7 +107,7 @@ def edit_promise(config)
107
107
 
108
108
  print 'Enter ID to edit: '
109
109
  chosen_id = gets.strip.to_i
110
- entry = rows.find { |r| r[0] == chosen_id }
110
+ entry = rows.find { |r| r.first == chosen_id }
111
111
 
112
112
  unless entry
113
113
  puts 'Invalid ID.'
@@ -197,7 +197,7 @@ def prompt(txt, default = nil)
197
197
  end
198
198
 
199
199
  def main
200
- if ARGV.empty? || %w{--help -h}.include?(ARGV[0])
200
+ if ARGV.empty? || %w{--help -h}.include?(ARGV.first)
201
201
  puts <<~HELP
202
202
  Usage: #{$PROGRAM_NAME} [subcommand] [options]
203
203
 
data/bin/repo_stats CHANGED
@@ -110,4 +110,4 @@ def main
110
110
  end
111
111
  end
112
112
 
113
- main if __FILE__ == $PROGRAM_NAME
113
+ main
@@ -8,7 +8,7 @@ db_file DEFAULT_DB_FILE = File.expand_path(
8
8
  __dir__,
9
9
  )
10
10
  header <<~EOF
11
- # Promises Report #{Date.today.to_s}
11
+ # Promises Report #{Date.today}
12
12
  EOF
13
13
 
14
14
  # Uncomment this and set it to a string to have the output
@@ -18,32 +18,60 @@ include_list false
18
18
  # is the organizations block. It allows you to specify
19
19
  # all of the repos that will be processed and how
20
20
  # they should be processed
21
- organizations({
22
- 'someorg' => {
23
- # if this org uses different branches (can further override under the repo)
24
- 'branches' => ['trunk'],
25
- # if you want a different number of days by for repos in this org (can
26
- # further override under the repo)
27
- 'days' => 7,
28
- 'repositories' => {
29
- 'repo1' => {},
30
- 'repo2' => {
31
- # crazy repo, only do 2 days
32
- 'days' => 2,
33
- 'branches' => ['main'],
21
+ organizations(
22
+ {
23
+ 'someorg' => {
24
+ # if this org uses different branches
25
+ # (can further override under the repo)
26
+ 'branches' => ['trunk'],
27
+ # if you want a different number of days for repos in this org (can
28
+ # further override under the repo)
29
+ 'days' => 7,
30
+ 'repositories' => {
31
+ 'repo1' => {},
32
+ 'repo2' => {
33
+ # crazy repo, only do 2 days
34
+ 'days' => 2,
35
+ 'branches' => ['main'],
36
+ },
37
+ 'repo3' => {
38
+ 'days' => 30,
39
+ 'branches' => ['main'],
40
+ },
41
+ },
42
+ },
43
+ 'anotherorg' => {
44
+ 'days' => 45,
45
+ 'branches' => %w{main oldstuff},
46
+ 'repositories' => {
47
+ 'repo1' => {},
48
+ 'repo2' => {},
49
+ 'repo3' => {},
34
50
  },
35
- 'repo3' => {
36
- 'days' => 30,
37
- 'branches' => ['main'],
38
- }
39
- },
40
- 'anotherorg' => {
41
- 'days' => 45,
42
- 'branches' => %w{main oldstuff},
43
- 'repositories' => {
44
- 'repo1' => {},
45
- 'repo2' => {},
46
- 'repo3' => {},
47
51
  },
48
52
  },
49
- })
53
+ )
54
+
55
+ # limit output to only repos in the top-N trouble-makers along various
56
+ # axes
57
+ #
58
+ # All of these expect "N" or "N%" (3 repos, or 3% of repos, for example)
59
+
60
+ # top_n_stale 3
61
+ # OR
62
+ # top_n_stale_pr 3
63
+ # top_n_stale_issue 3
64
+ #
65
+ # top_n_oldest 3
66
+ # OR
67
+ # top_n_oldest_pr 3
68
+ # top_n_oldest_issue 3
69
+ #
70
+ # top_n_time_to_close 3
71
+ # OR
72
+ # top_n_time_to_close_pr 3
73
+ # top_n_time_to_close_issue 3
74
+ #
75
+ # top_n_most_broken_ci_days 3
76
+ #
77
+ # top_n_most_broken_ci_jobs 3
@@ -1,7 +1,7 @@
1
- # OSS Stats for <Project>
1
+ # OSS Stats for YOUR_PROJECT_HERE
2
2
 
3
- This repo aims to track stats that affect how <project>'s open source community
4
- interacts with the project and repositories.
3
+ This repo aims to track stats that affect how YOUR_PROJECT_HERE's open source
4
+ community interacts with the project and repositories.
5
5
 
6
6
  It leverages [oss-stats](https://github.com/jaymzh/oss-stats) to track those
7
7
  stats. It assumes oss-stats and this repo are checked out next to each other
@@ -9,10 +9,10 @@ on the filesystem.
9
9
 
10
10
  ## tl;dr
11
11
 
12
- * See **Issue, PR, and CI stats** in [ci_reports](ci_reports)
12
+ * See **Issue, PR, and CI stats** in [ci_reports](repo_reports)
13
13
  * See **weekly meeting stats** in [Slack Status Tracking](team_slack_reports.md)
14
14
  * See **pipeline visiblity stats** in [pipeline_visibility_reports](pipeline_visibility_reports)
15
- * See **promises** in [promises][promises]
15
+ * See **promises** in [promises_reports][promises_reports]
16
16
 
17
17
  ## Usage
18
18
 
@@ -1,13 +1,24 @@
1
+ require_relative 'log'
2
+
1
3
  # looks for :github_token in `options`, falling back to
2
4
  # $GITHUB_TOKEN, and then gh's auth.
3
5
  def get_github_token(options)
4
- return options[:github_token] if options[:github_token]
5
- return ENV['GITHUB_TOKEN'] if ENV['GITHUB_TOKEN']
6
+ if options[:github_token]
7
+ log.debug('Using GH token from CLI')
8
+ return options[:github_token]
9
+ elsif ENV['GITHUB_TOKEN']
10
+ log.debug('Using GH token from env')
11
+ return ENV['GITHUB_TOKEN']
12
+ end
6
13
 
7
14
  config_path = File.expand_path('~/.config/gh/hosts.yml')
8
15
  if File.exist?(config_path)
9
16
  config = YAML.load_file(config_path)
10
- return config.dig('github.com', 'oauth_token')
17
+ token = config.dig('github.com', 'oauth_token')
18
+ if token
19
+ log.debug('Using GH token from gh cli config')
20
+ return token
21
+ end
11
22
  end
12
23
  nil
13
24
  end
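Given the lookup order above (the `:github_token` option, then `$GITHUB_TOKEN`, then the `gh` CLI's stored token), a simple way to supply credentials when running the binstubs is sketched below; the token value is a placeholder:

```bash
# Option 1: environment variable picked up by get_github_token
export GITHUB_TOKEN=ghp_xxxxxxxxxxxxxxxx   # placeholder token
./bin/repo_stats.rb

# Option 2: let the gh CLI store a token in ~/.config/gh/hosts.yml
gh auth login
./bin/repo_stats.rb
```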
@@ -3,7 +3,6 @@ require 'date'
3
3
  require 'deep_merge'
4
4
  require 'octokit'
5
5
  require 'optparse'
6
- require 'set'
7
6
  require 'yaml'
8
7
 
9
8
  require_relative 'buildkite_client'
@@ -149,7 +148,7 @@ module OssStats
149
148
  buildkite_badge_regex =
150
149
  %r{\)\]\((https://buildkite\.com\/([^\/]+)\/([^\/\)]+))\)}
151
150
  matches = readme.scan(buildkite_badge_regex)
152
- if matches.length.zero?
151
+ if matches.empty?
153
152
  log.debug('no BK pipelines found in readme')
154
153
  return pipelines
155
154
  end
@@ -209,7 +208,7 @@ module OssStats
209
208
  api_builds = bk_client.get_pipeline_builds(
210
209
  pl[:org], pl[:pipeline], from_date, today, branch
211
210
  )
212
- if api_builds.length.zero?
211
+ if api_builds.empty?
213
212
  log.debug("No builds for #{pl} on #{branch}")
214
213
  next
215
214
  end
@@ -314,7 +313,7 @@ module OssStats
314
313
  log.debug(" Acquiring page #{page}")
315
314
  runs = gh_client.workflow_runs(
316
315
  repo, workflow.id, branch:, status: 'completed', per_page: 100,
317
- page:
316
+ page:
318
317
  )
319
318
  rate_limited_sleep
320
319
 
@@ -499,7 +498,7 @@ module OssStats
499
498
  end
500
499
  log.info(
501
500
  " * Stale #{type} (>30 days without comment): " +
502
- stats[:stale_count],
501
+ stats[:stale_count].to_s,
503
502
  )
504
503
  avg_time = stats[:avg_time_to_close_hours]
505
504
  avg_time_str =
@@ -1,3 +1,3 @@
1
1
  module OssStats
2
- VERSION = '0.0.1'.freeze
2
+ VERSION = '0.0.2'.freeze
3
3
  end
data/oss-stats.gemspec CHANGED
@@ -25,9 +25,19 @@ Gem::Specification.new do |spec|
25
25
  Dir.glob('bin/*') +
26
26
  Dir.glob('extras/*') +
27
27
  Dir.glob('spec/*') +
28
+ Dir.glob('scripts/*') +
28
29
  Dir.glob('initialization_data/*') +
29
30
  Dir.glob('initialization_data/github_workflow/s*')
30
- %w{sqlite3 octokit mixlib-log mixlib-config gruff deep_merge}.each do |dep|
31
+ %w{
32
+ base64
33
+ deep_merge
34
+ faraday-retry
35
+ gruff
36
+ mixlib-config
37
+ mixlib-log
38
+ octokit
39
+ sqlite3
40
+ }.each do |dep|
31
41
  spec.add_dependency dep
32
42
  end
33
43
  spec.metadata = {
@@ -0,0 +1,3 @@
1
+ #!/bin/bash
2
+
3
+ exec bundle exec cookstyle "$@"
@@ -0,0 +1,3 @@
1
+ #!/bin/bash
2
+
3
+ exec bundle exec mdl "$@"
@@ -0,0 +1,8 @@
1
+ #!/bin/bash
2
+
3
+ if [ -z "$1" ]; then
4
+ mapfile -t files < <(find . -type f -name \*.sh)
5
+ exec shellcheck "${files[@]}"
6
+ fi
7
+
8
+ exec shellcheck "${@}"
@@ -0,0 +1,3 @@
1
+ #!/bin/bash
2
+
3
+ exec bundle exec rspec -f doc "$@"
@@ -108,8 +108,8 @@ RSpec.describe 'repo_stats' do
108
108
  .and_return(double(content: ''))
109
109
  allow(client).to receive(:workflows).and_return(
110
110
  double(workflows: [
111
- double(id: 1, name: 'Test Workflow', html_url: 'testurl'),
112
- ]),
111
+ double(id: 1, name: 'Test Workflow', html_url: 'testurl'),
112
+ ]),
113
113
  )
114
114
  allow(client).to receive(:workflow_runs).with(
115
115
  'test_org/test_repo',
@@ -127,8 +127,8 @@ RSpec.describe 'repo_stats' do
127
127
  ).and_return(double(workflow_runs: []))
128
128
  allow(client).to receive(:workflow_run_jobs).and_return(
129
129
  double(jobs: [
130
- double(name: 'Test Job', conclusion: 'failure'),
131
- ]),
130
+ double(name: 'Test Job', conclusion: 'failure'),
131
+ ]),
132
132
  )
133
133
 
134
134
  failed_tests = get_failed_tests_from_ci(client, nil, options, {})
@@ -205,15 +205,15 @@ RSpec.describe 'repo_stats' do
205
205
  expect(mock_buildkite_client).to receive(:get_pipeline)
206
206
  .with('test-buildkite-org', 'actual-pipeline-name')
207
207
  .and_return({
208
- url: 'testurl',
209
- slug: 'actual-pipeline-name',
210
- })
208
+ url: 'testurl',
209
+ slug: 'actual-pipeline-name',
210
+ })
211
211
  expect(mock_buildkite_client).to receive(:get_pipeline)
212
212
  .with('other-org', 'other-pipeline')
213
213
  .and_return({
214
- url: 'testurl',
215
- slug: 'other-pipeline',
216
- })
214
+ url: 'testurl',
215
+ slug: 'other-pipeline',
216
+ })
217
217
  expect(mock_buildkite_client).to receive(:get_pipeline_builds)
218
218
  .with(
219
219
  'test-buildkite-org',
@@ -222,13 +222,15 @@ RSpec.describe 'repo_stats' do
222
222
  Date.today,
223
223
  'main',
224
224
  )
225
- .and_return([
226
- {
227
- 'node' => {
228
- 'createdAt' => (Date.today - 1).to_s, 'state' => 'FAILED'
225
+ .and_return(
226
+ [
227
+ {
228
+ 'node' => {
229
+ 'createdAt' => (Date.today - 1).to_s, 'state' => 'FAILED'
230
+ },
229
231
  },
230
- },
231
- ])
232
+ ],
233
+ )
232
234
  expect(mock_buildkite_client).to receive(:get_pipeline_builds)
233
235
  .with(
234
236
  'other-org',
@@ -237,13 +239,15 @@ RSpec.describe 'repo_stats' do
237
239
  Date.today,
238
240
  'main',
239
241
  )
240
- .and_return([
241
- {
242
- 'node' => {
243
- 'createdAt' => (Date.today - 1).to_s, 'state' => 'PASSED'
242
+ .and_return(
243
+ [
244
+ {
245
+ 'node' => {
246
+ 'createdAt' => (Date.today - 1).to_s, 'state' => 'PASSED'
247
+ },
244
248
  },
245
- },
246
- ])
249
+ ],
250
+ )
247
251
  failed_tests = get_failed_tests_from_ci(
248
252
  client, mock_buildkite_client, settings_with_buildkite_token, {}
249
253
  )
@@ -256,9 +260,9 @@ RSpec.describe 'repo_stats' do
256
260
  expect(mock_buildkite_client).to receive(:get_pipeline)
257
261
  .with('test-buildkite-org', 'another-actual-pipeline')
258
262
  .and_return({
259
- url: 'testurl',
260
- slug: 'another-actual-pipelinename',
261
- })
263
+ url: 'testurl',
264
+ slug: 'another-actual-pipelinename',
265
+ })
262
266
  allow(client).to receive(:readme)
263
267
  .with(repo_full_name)
264
268
  .and_return(
@@ -282,24 +286,24 @@ RSpec.describe 'repo_stats' do
282
286
  expect(mock_buildkite_client).to receive(:get_pipeline)
283
287
  .with('test-buildkite-org', 'actual-pipeline-name')
284
288
  .and_return({
285
- url: 'testurl',
286
- slug: 'actual-pipeline-name',
287
- })
289
+ url: 'testurl',
290
+ slug: 'actual-pipeline-name',
291
+ })
288
292
  expect(mock_buildkite_client).to receive(:get_pipeline)
289
293
  .with('other-org', 'other-pipeline')
290
294
  .and_return({
291
- url: 'testurl',
292
- slug: 'other-pipeline',
293
- })
295
+ url: 'testurl',
296
+ slug: 'other-pipeline',
297
+ })
294
298
  allow(mock_buildkite_client).to receive(:get_pipeline_builds)
295
299
  .and_return([
296
- {
297
- 'node' => {
298
- 'createdAt' => (Date.today - 1).to_s,
299
- 'state' => 'PASSED',
300
- },
301
- },
302
- ])
300
+ {
301
+ 'node' => {
302
+ 'createdAt' => (Date.today - 1).to_s,
303
+ 'state' => 'PASSED',
304
+ },
305
+ },
306
+ ])
303
307
  failed_tests = get_failed_tests_from_ci(
304
308
  client, mock_buildkite_client, settings_with_buildkite_token, {}
305
309
  )
@@ -340,15 +344,15 @@ RSpec.describe 'repo_stats' do
340
344
  expect(mock_buildkite_client).to receive(:get_pipeline)
341
345
  .with('test-buildkite-org', 'actual-pipeline-name')
342
346
  .and_return({
343
- url: 'testurl',
344
- slug: 'actual-pipeline-name',
345
- })
347
+ url: 'testurl',
348
+ slug: 'actual-pipeline-name',
349
+ })
346
350
  expect(mock_buildkite_client).to receive(:get_pipeline)
347
351
  .with('other-org', 'other-pipeline')
348
352
  .and_return({
349
- url: 'testurl',
350
- slug: 'other-pipeline',
351
- })
353
+ url: 'testurl',
354
+ slug: 'other-pipeline',
355
+ })
352
356
  allow(mock_buildkite_client).to receive(:get_pipeline_builds)
353
357
  .with(
354
358
  org_name, pipeline_name, today - days_to_check, today, 'main'
@@ -359,15 +363,17 @@ RSpec.describe 'repo_stats' do
359
363
  client, mock_buildkite_client, options_for_ongoing, {}
360
364
  )
361
365
 
362
- expected_job_dates = Set.new([
363
- today - days_to_check + 1,
364
- today - days_to_check + 2,
365
- # no 3, it passed that day
366
- today - days_to_check + 4,
367
- # add today (days_to_check = 5), becuase we fill in
368
- # all days through today if the last check is failing
369
- today,
370
- ])
366
+ expected_job_dates = Set.new(
367
+ [
368
+ today - days_to_check + 1,
369
+ today - days_to_check + 2,
370
+ # no 3, it passed that day
371
+ today - days_to_check + 4,
372
+ # add today (days_to_check = 5), because we fill in
373
+ # all days through today if the last check is failing
374
+ today,
375
+ ],
376
+ )
371
377
  expect(failed_tests['main'][job_key][:dates])
372
378
  .to eq(expected_job_dates)
373
379
  expect(failed_tests['main'][job_key][:dates].size)
@@ -417,9 +423,9 @@ RSpec.describe 'repo_stats' do
417
423
  allow(mock_buildkite_client)
418
424
  .to receive(:get_pipeline)
419
425
  .and_return({
420
- url: 'testurl',
421
- slug: 'actual-pipeline-name',
422
- })
426
+ url: 'testurl',
427
+ slug: 'actual-pipeline-name',
428
+ })
423
429
  end
424
430
 
425
431
  it 'handles the error gracefully and logs it' do
@@ -593,26 +599,28 @@ RSpec.describe 'repo_stats' do
593
599
 
594
600
  describe '#determine_orgs_to_process' do
595
601
  before(:each) do
596
- OssStats::Config::RepoStats.organizations({
597
- 'org1' => {
598
- 'days' => 2,
599
- 'repositories' => {
600
- 'repo1' => {},
601
- 'repo2' => {
602
- 'days' => 3,
602
+ OssStats::Config::RepoStats.organizations(
603
+ {
604
+ 'org1' => {
605
+ 'days' => 2,
606
+ 'repositories' => {
607
+ 'repo1' => {},
608
+ 'repo2' => {
609
+ 'days' => 3,
610
+ },
603
611
  },
604
612
  },
605
- },
606
- 'org2' => {
607
- 'days' => 7,
608
- 'repositories' => {
609
- 'repoA' => {
610
- 'days' => 30,
613
+ 'org2' => {
614
+ 'days' => 7,
615
+ 'repositories' => {
616
+ 'repoA' => {
617
+ 'days' => 30,
618
+ },
619
+ 'repoB' => {},
611
620
  },
612
- 'repoB' => {},
613
621
  },
614
622
  },
615
- })
623
+ )
616
624
  end
617
625
  let(:config) { OssStats::Config::RepoStats }
618
626
 
@@ -1007,7 +1015,7 @@ ci_broken_days_map: { 'main_job' => 100 }),
1007
1015
  ),
1008
1016
  mock_repo_data(
1009
1017
  'repo_issue_slow', avg_close_pr_hours: 10,
1010
- avg_close_issue_hours: 100
1018
+ avg_close_issue_hours: 100
1011
1019
  ),
1012
1020
  mock_repo_data(
1013
1021
  'repo_both_fast', avg_close_pr_hours: 5, avg_close_issue_hours: 5
metadata CHANGED
@@ -1,17 +1,17 @@
1
1
  --- !ruby/object:Gem::Specification
2
2
  name: oss-stats
3
3
  version: !ruby/object:Gem::Version
4
- version: 0.0.1
4
+ version: 0.0.2
5
5
  platform: ruby
6
6
  authors:
7
7
  - Phil Dibowitz
8
8
  autorequire:
9
9
  bindir: bin
10
10
  cert_chain: []
11
- date: 2025-06-16 00:00:00.000000000 Z
11
+ date: 2025-07-10 00:00:00.000000000 Z
12
12
  dependencies:
13
13
  - !ruby/object:Gem::Dependency
14
- name: sqlite3
14
+ name: base64
15
15
  requirement: !ruby/object:Gem::Requirement
16
16
  requirements:
17
17
  - - ">="
@@ -25,7 +25,7 @@ dependencies:
25
25
  - !ruby/object:Gem::Version
26
26
  version: '0'
27
27
  - !ruby/object:Gem::Dependency
28
- name: octokit
28
+ name: deep_merge
29
29
  requirement: !ruby/object:Gem::Requirement
30
30
  requirements:
31
31
  - - ">="
@@ -39,7 +39,21 @@ dependencies:
39
39
  - !ruby/object:Gem::Version
40
40
  version: '0'
41
41
  - !ruby/object:Gem::Dependency
42
- name: mixlib-log
42
+ name: faraday-retry
43
+ requirement: !ruby/object:Gem::Requirement
44
+ requirements:
45
+ - - ">="
46
+ - !ruby/object:Gem::Version
47
+ version: '0'
48
+ type: :runtime
49
+ prerelease: false
50
+ version_requirements: !ruby/object:Gem::Requirement
51
+ requirements:
52
+ - - ">="
53
+ - !ruby/object:Gem::Version
54
+ version: '0'
55
+ - !ruby/object:Gem::Dependency
56
+ name: gruff
43
57
  requirement: !ruby/object:Gem::Requirement
44
58
  requirements:
45
59
  - - ">="
@@ -67,7 +81,7 @@ dependencies:
67
81
  - !ruby/object:Gem::Version
68
82
  version: '0'
69
83
  - !ruby/object:Gem::Dependency
70
- name: gruff
84
+ name: mixlib-log
71
85
  requirement: !ruby/object:Gem::Requirement
72
86
  requirements:
73
87
  - - ">="
@@ -81,7 +95,21 @@ dependencies:
81
95
  - !ruby/object:Gem::Version
82
96
  version: '0'
83
97
  - !ruby/object:Gem::Dependency
84
- name: deep_merge
98
+ name: octokit
99
+ requirement: !ruby/object:Gem::Requirement
100
+ requirements:
101
+ - - ">="
102
+ - !ruby/object:Gem::Version
103
+ version: '0'
104
+ type: :runtime
105
+ prerelease: false
106
+ version_requirements: !ruby/object:Gem::Requirement
107
+ requirements:
108
+ - - ">="
109
+ - !ruby/object:Gem::Version
110
+ version: '0'
111
+ - !ruby/object:Gem::Dependency
112
+ name: sqlite3
85
113
  requirement: !ruby/object:Gem::Requirement
86
114
  requirements:
87
115
  - - ">="
@@ -98,6 +126,7 @@ description:
98
126
  email:
99
127
  - phil@ipom.com
100
128
  executables:
129
+ - initialize_repo.sh
101
130
  - meeting_stats
102
131
  - pipeline_visibility_stats
103
132
  - promise_stats
@@ -123,6 +152,7 @@ files:
123
152
  - Gemfile
124
153
  - LICENSE
125
154
  - README.md
155
+ - bin/initialize_repo.sh
126
156
  - bin/meeting_stats
127
157
  - bin/pipeline_visibility_stats
128
158
  - bin/promise_stats
@@ -134,9 +164,7 @@ files:
134
164
  - examples/meeting_stats_config.rb
135
165
  - examples/promise_stats_config.rb
136
166
  - examples/repo_stats_config.rb
137
- - initialization_data/Gemfile
138
167
  - initialization_data/README.md
139
- - initialization_data/rubocop.yml
140
168
  - lib/oss_stats/buildkite_client.rb
141
169
  - lib/oss_stats/buildkite_token.rb
142
170
  - lib/oss_stats/config/meeting_stats.rb
@@ -149,6 +177,10 @@ files:
149
177
  - lib/oss_stats/repo_stats.rb
150
178
  - lib/oss_stats/version.rb
151
179
  - oss-stats.gemspec
180
+ - scripts/run_cookstyle.sh
181
+ - scripts/run_markdownlint.sh
182
+ - scripts/run_shellcheck.sh
183
+ - scripts/run_specs.sh
152
184
  - spec/buildkite_client_spec.rb
153
185
  - spec/repo_stats_spec.rb
154
186
  homepage: https://github.com/jaymzh/oss-stats
@@ -1,3 +0,0 @@
1
- source 'https://rubygems.org'
2
-
3
- eval_gemfile('../oss-stats/Gemfile')
@@ -1,2 +0,0 @@
1
- inherit_from:
2
- - ../oss-stats/.rubocop.yml