churn_vs_complexity 1.3.0 → 1.4.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- checksums.yaml +4 -4
- data/CHANGELOG.md +6 -0
- data/README.md +16 -2
- data/TODO +11 -0
- data/bin/churn_vs_complexity +5 -0
- data/lib/churn_vs_complexity/churn.rb +1 -1
- data/lib/churn_vs_complexity/cli.rb +18 -6
- data/lib/churn_vs_complexity/complexity/eslint_calculator.rb +3 -5
- data/lib/churn_vs_complexity/complexity/flog_calculator.rb +2 -2
- data/lib/churn_vs_complexity/complexity/pmd_calculator.rb +5 -2
- data/lib/churn_vs_complexity/concurrent_calculator.rb +1 -1
- data/lib/churn_vs_complexity/config.rb +74 -18
- data/lib/churn_vs_complexity/git_date.rb +7 -0
- data/lib/churn_vs_complexity/serializer/csv.rb +14 -0
- data/lib/churn_vs_complexity/serializer/graph.rb +24 -0
- data/lib/churn_vs_complexity/serializer/pass_through.rb +21 -0
- data/lib/churn_vs_complexity/serializer/summary.rb +27 -0
- data/lib/churn_vs_complexity/serializer/summary_hash.rb +54 -0
- data/lib/churn_vs_complexity/serializer/timetravel/quality_calculator.rb +38 -0
- data/lib/churn_vs_complexity/serializer/timetravel/stats_calculator.rb +60 -0
- data/lib/churn_vs_complexity/serializer/timetravel.rb +103 -0
- data/lib/churn_vs_complexity/serializer.rb +7 -60
- data/lib/churn_vs_complexity/timetravel/traveller.rb +66 -0
- data/lib/churn_vs_complexity/timetravel/worktree.rb +56 -0
- data/lib/churn_vs_complexity/timetravel.rb +70 -0
- data/lib/churn_vs_complexity/version.rb +1 -1
- data/lib/churn_vs_complexity.rb +2 -0
- data/tmp/template/graph.html +1 -4
- data/tmp/template/timetravel_graph.html +100 -0
- data/tmp/timetravel/.keep +0 -0
- metadata +16 -2
checksums.yaml
CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 7484ff3a1c015738808226a78087017f6b7aff5ce42d15879023f32df5648717
+  data.tar.gz: ad3bdeff5ba32e9d7f414b45173d8928b1c463313557202b96aaf5fdcf059109
 SHA512:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: a56e26296acfff22e755c414cab9ee36923e4ccd7181d67691f55feb8f99b5aa738f5d2b7645006f25c632efa346fb535d7fe04923443dedf607772ce1a21323
+  data.tar.gz: f406ee696facf7708e792b67ed8b463f1554365b3939deb4bde40c9fab88dfe891997f611727559a283103c5b7234f71d78d5fe108731628cb3fd8a782a73ce5
data/CHANGELOG.md
CHANGED
@@ -17,3 +17,9 @@
 
 - Add support for javascript and typescript complexity calculation using eslint
 - Fixed behavior when --since or short-hand flags were not provided
+
+## [1.4.0] - 2024-10-10
+
+- Add timetravel mode to visualise code quality over time
+- Add alpha, beta, and gamma scores to summaries
+- Fixed broken Ruby complexity calculation
data/README.md
CHANGED
@@ -4,7 +4,7 @@ A tool to visualise code complexity in a project and help direct refactoring eff
 
 Inspired by [Michael Feathers' article "Getting Empirical about Refactoring"](https://www.agileconnection.com/article/getting-empirical-about-refactoring) and the gem [turbulence](https://rubygems.org/gems/turbulence) by Chad Fowler and others.
 
-This gem
+This gem currently supports analysis of Java, Ruby, JavaScript, and TypeScript repositories, but it can easily be extended.
 
 ## Installation
 
@@ -40,14 +40,27 @@ Usage: churn_vs_complexity [options] folder
         --graph                      Format output as HTML page with Churn vs Complexity graph
         --summary                    Output summary statistics (mean and median) for churn and complexity
         --excluded PATTERN           Exclude file paths including this string. Can be used multiple times.
-        --since YYYY-MM-DD           Calculate churn after this date
+        --since YYYY-MM-DD           Normal mode: Calculate churn after this date. Timetravel mode: calculate summaries from this date
     -m, --month                      Calculate churn for the month leading up to the most recent commit
     -q, --quarter                    Calculate churn for the quarter leading up to the most recent commit
     -y, --year                       Calculate churn for the year leading up to the most recent commit
+        --timetravel N               Calculate summary for all commits at intervals of N days throughout project history or from the date specified with --since
         --dry-run                    Echo the chosen options from the CLI
     -h, --help                       Display help
+
+
 ```
 
+Note that when using the `--timetravel` mode, the semantics of some flags are subtly different from normal mode:
+
+* `--since YYYY-MM-DD`: Calculate summaries from this date
+* `--month`, `--quarter`, `--year`: Calculate churn for the period leading up to each commit being summarised
+
+Timetravel analysis can take many minutes for old and large repositories.
+
+Summaries in normal mode include a gamma score, which is an unnormalised harmonic mean of churn and complexity. This allows for comparison of summaries across different projects with the same language, or over time for a single project.
+
+Summary points in timetravel mode instead include an alpha score, which is the same harmonic mean of churn and complexity, where churn and complexity values are normalised to a 0-1 range to avoid either churn or complexity dominating the score. The summary points also include a beta score, which is the geometric mean of the normalised churn and complexity values.
 ## Examples
 
 `churn_vs_complexity --ruby --csv my_ruby_project > ~/Desktop/ruby-demo.csv`
@@ -56,6 +69,7 @@ Usage: churn_vs_complexity [options] folder
 
 `churn_vs_complexity --ruby --summary -m my_ruby_project >> ~/Desktop/monthly-report.txt`
 
+`churn_vs_complexity --java -m --since 2019-03-01 --timetravel 30 --graph my_java_project > ~/Desktop/timetravel-after-1st-march-2019.html`
 
 ## Development
 
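The gamma score described in the README changes above is easy to state directly. The following is a minimal standalone sketch (not the gem's own class) of the per-file unnormalised harmonic mean, using the same 0.0001 epsilon guard the gem applies to avoid division by zero:

```ruby
# Unnormalised harmonic mean of churn and complexity for one file.
# EPSILON keeps the denominator non-zero when both values are 0.
EPSILON = 0.0001

def gamma(churn, complexity)
  c = churn + EPSILON
  x = complexity + EPSILON
  (2 * c * x) / (c + x)
end

# Per-file scores are then averaged into the summary's mean gamma score.
scores = { 'a.rb' => [10.0, 2.0], 'b.rb' => [4.0, 4.0] }.map do |_, (churn, complexity)|
  gamma(churn, complexity)
end
mean_gamma = scores.sum / scores.size
```

Because the harmonic mean is dominated by the smaller of the two inputs, a file only scores high on gamma when it is both high-churn and high-complexity.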
data/TODO
ADDED
@@ -0,0 +1,11 @@
+TODO:
+
+- Move Timetravel calculations away from serializer
+
+- Database, where we can prepopulate the state of every file and every commit
+- Populate incrementally from each commit in log.
+- Only need to care about deltas between commits, and copy everything else from previous commit.
+- processed_commit table with sha and version of processing logic
+- Unit tests for simpler new classes
+- Integration test for Timetravel
+- Candlebars on mean dots in graph
data/bin/churn_vs_complexity
CHANGED

data/lib/churn_vs_complexity/churn.rb
CHANGED
@@ -10,7 +10,7 @@ module ChurnVsComplexity
       git_dir = File.join(folder, '.git')
       earliest_date = [date_of_first_commit(folder:), since].max
       formatted_date = earliest_date.strftime('%Y-%m-%d')
-      cmd = %
+      cmd = %(git --git-dir #{git_dir} --work-tree #{folder} log --format="%H" --follow --since="#{formatted_date}" -- #{file} | wc -l)
       `#{cmd}`.to_i
     end
 
data/lib/churn_vs_complexity/cli.rb
CHANGED
@@ -22,7 +22,8 @@ module ChurnVsComplexity
         options[:language] = :ruby
       end
 
-      opts.on('--js', '--ts', '--javascript', '--typescript',
+      opts.on('--js', '--ts', '--javascript', '--typescript',
+              'Check complexity of javascript and typescript files',) do
         options[:language] = :javascript
       end
 
@@ -43,20 +44,27 @@ module ChurnVsComplexity
         options[:excluded] << value
       end
 
-      opts.on('--since YYYY-MM-DD',
+      opts.on('--since YYYY-MM-DD',
+              'Normal mode: Calculate churn after this date. Timetravel mode: calculate summaries from this date',) do |value|
         options[:since] = value
       end
 
       opts.on('-m', '--month', 'Calculate churn for the month leading up to the most recent commit') do
-        options[:
+        options[:relative_period] = :month
       end
 
       opts.on('-q', '--quarter', 'Calculate churn for the quarter leading up to the most recent commit') do
-        options[:
+        options[:relative_period] = :quarter
       end
 
       opts.on('-y', '--year', 'Calculate churn for the year leading up to the most recent commit') do
-        options[:
+        options[:relative_period] = :year
+      end
+
+      opts.on('--timetravel N',
+              'Calculate summary for all commits at intervals of N days throughout project history or from the date specified with --since',) do |value|
+        options[:mode] = :timetravel
+        options[:jump_days] = value.to_i
       end
 
       opts.on('--dry-run', 'Echo the chosen options from the CLI') do
@@ -84,7 +92,11 @@ module ChurnVsComplexity
 
       config.validate!
 
-
+      if options[:mode] == :timetravel
+        puts config.timetravel.go(folder:)
+      else
+        puts config.to_engine.check(folder:)
+      end
     end
   end
 end
data/lib/churn_vs_complexity/complexity/eslint_calculator.rb
CHANGED
@@ -5,20 +5,18 @@ module ChurnVsComplexity
   module ESLintCalculator
     class << self
       def folder_based? = false
-
+
       def calculate(files:)
         dir_path = File.join(gem_root, 'tmp', 'eslint-support')
         script_path = File.join(dir_path, 'complexity-calculator.js')
         install_command = "npm install --prefix '#{dir_path}'"
         `#{install_command}`
 
-
         command = "node #{script_path} '#{files.to_json}'"
         complexity = `#{command}`
 
-        if complexity.empty?
-
-        end
+        raise Error, 'Failed to calculate complexity' if complexity.empty?
+
         all = JSON.parse(complexity)
         all.to_h do |abc|
           [abc['file'], abc['complexity']]
data/lib/churn_vs_complexity/complexity/flog_calculator.rb
CHANGED
@@ -9,9 +9,9 @@ module ChurnVsComplexity
       def folder_based? = false
 
       def calculate(files:)
-
-        # TODO: Run this concurrently
+        # TODO: Run this concurrently
         files.to_h do |file|
+          flog = Flog.new
           flog.flog(file)
           [file, flog.total_score]
         end
data/lib/churn_vs_complexity/complexity/pmd_calculator.rb
CHANGED
@@ -9,7 +9,10 @@ module ChurnVsComplexity
       def folder_based? = true
 
       def calculate(folder:)
-
+        cache_path = resolve_cache_path
+        output = `pmd check -d #{folder} -R #{resolve_ruleset_path} -f json -t #{CONCURRENCY} --cache #{cache_path}`
+        File.delete(cache_path)
+
         Parser.new.parse(output)
       end
 
@@ -29,7 +32,7 @@ module ChurnVsComplexity
       end
 
       def resolve_cache_path
-        File.join(gem_root, 'tmp', 'pmd-support',
+        File.join(gem_root, 'tmp', 'pmd-support', "pmd-cache-#{Process.pid}")
       end
 
       def gem_root
data/lib/churn_vs_complexity/concurrent_calculator.rb
CHANGED
@@ -57,7 +57,7 @@ module ChurnVsComplexity
     def combine_results
       result = {}
       result[:values_by_file] = @complexity_results.keys.each_with_object({}) do |file, acc|
-        # File with complexity score might not have churned in queried period,
+        # File with complexity score might not have churned in queried period,
         # set zero churn on miss
         acc[file] = [@churn_results[file] || 0, @complexity_results[file]]
       end
data/lib/churn_vs_complexity/config.rb
CHANGED
@@ -7,25 +7,42 @@ module ChurnVsComplexity
       serializer:,
       excluded: [],
       since: nil,
+      relative_period: nil,
       complexity_validator: ComplexityValidator,
-      since_validator: SinceValidator
+      since_validator: SinceValidator,
+      **options
     )
       @language = language
       @serializer = serializer
       @excluded = excluded
       @since = since
+      @relative_period = relative_period
       @complexity_validator = complexity_validator
       @since_validator = since_validator
+      @options = options
     end
 
     def validate!
-      raise
-      raise Error, "Unsupported serializer: #{@serializer}" unless %i[none csv graph summary].include?(@serializer)
+      raise ValidationError, "Unsupported language: #{@language}" unless %i[java ruby javascript].include?(@language)
 
-
+      SerializerValidator.validate!(serializer: @serializer, mode: @options[:mode])
+
+      @since_validator.validate!(since: @since, relative_period: @relative_period, mode: @options[:mode])
+      RelativePeriodValidator.validate!(relative_period: @relative_period, mode: @options[:mode])
       @complexity_validator.validate!(@language)
     end
 
+    def timetravel
+      engine = timetravel_engine_config.to_engine
+      Timetravel::Traveller.new(
+        since: @since,
+        relative_period: @relative_period,
+        engine:,
+        jump_days: @options[:jump_days],
+        serializer: @serializer,
+      )
+    end
+
     def to_engine
       case @language
       when :java
@@ -34,7 +51,7 @@ module ChurnVsComplexity
           churn:,
           file_selector: FileSelector::Java.excluding(@excluded),
           serializer:,
-          since: @since,
+          since: @since || @relative_period,
         )
       when :ruby
         Engine.concurrent(
@@ -42,7 +59,7 @@ module ChurnVsComplexity
          churn:,
          file_selector: FileSelector::Ruby.excluding(@excluded),
          serializer:,
-          since: @since,
+          since: @since || @relative_period,
        )
      when :javascript
        Engine.concurrent(
@@ -50,13 +67,26 @@ module ChurnVsComplexity
          churn:,
          file_selector: FileSelector::JavaScript.excluding(@excluded),
          serializer:,
-          since: @since,
+          since: @since || @relative_period,
        )
      end
    end
 
    private
 
+    def timetravel_engine_config
+      Config.new(
+        language: @language,
+        serializer: :pass_through,
+        excluded: @excluded,
+        since: nil, # since has a different meaning in timetravel mode
+        relative_period: @relative_period,
+        complexity_validator: @complexity_validator,
+        since_validator: @since_validator,
+        **@options,
+      )
+    end
+
    def churn = Churn::GitCalculator
 
    def serializer
@@ -69,6 +99,8 @@ module ChurnVsComplexity
        Serializer::Graph.new
      when :summary
        Serializer::Summary
+      when :pass_through
+        Serializer::PassThrough
      end
    end
 
@@ -81,21 +113,45 @@ module ChurnVsComplexity
      end
    end
 
+  # TODO: unit test
+  module SerializerValidator
+    def self.validate!(serializer:, mode:)
+      raise ValidationError, "Unsupported serializer: #{serializer}" \
+        unless %i[none csv graph summary].include?(serializer)
+      raise ValidationError, 'Does not support --summary in --timetravel mode' \
+        if serializer == :summary && mode == :timetravel
+    end
+  end
+
+  # TODO: unit test
+  module RelativePeriodValidator
+    def self.validate!(relative_period:, mode:)
+      if mode == :timetravel && relative_period.nil?
+        raise ValidationError,
+              'Relative period is required in timetravel mode'
+      end
+      return if relative_period.nil? || %i[month quarter year].include?(relative_period)
+
+      raise ValidationError, "Invalid relative period #{relative_period}"
+    end
+  end
+
  module SinceValidator
-    def self.validate!(since)
+    def self.validate!(since:, relative_period:, mode:)
      # since can be nil, a date string or a keyword (:month, :quarter, :year)
      return if since.nil?
 
-
-      raise
-
-
-
-
-
-
-
-
+      unless mode == :timetravel || since.nil? || relative_period.nil?
+        raise ValidationError,
+              '--since and relative period (--month, --quarter, --year) can only be used together in --timetravel mode'
+      end
+
+      raise ValidationError, "Invalid since value #{since}" unless since.is_a?(String)
+
+      begin
+        Date.strptime(since, '%Y-%m-%d')
+      rescue Date::Error
+        raise ValidationError, "Invalid date #{since}, please use correct format, YYYY-MM-DD"
      end
    end
  end
data/lib/churn_vs_complexity/git_date.rb
CHANGED
@@ -2,6 +2,13 @@
 
 module ChurnVsComplexity
   module GitDate
+    def self.select_dates_with_at_least_interval(dates, interval)
+      ds = dates.sort
+      ds.each_with_object([]) do |date, acc|
+        acc << date if acc.empty? || date - acc.last >= interval
+      end
+    end
+
     def self.git_period(cli_arg_since, latest_commit_date)
       latest_commit_date = latest_commit_date.to_date
       if cli_arg_since.nil?
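The `select_dates_with_at_least_interval` helper added above is what thins commit dates down to one sample per `--timetravel N` window: it walks the sorted dates and keeps a date only when it is at least `interval` days after the last kept one. A standalone sketch of the same logic:

```ruby
require 'date'

# Standalone copy of the GitDate helper above: sort the dates, then keep
# a date only if it is at least `interval` days after the last kept date.
def select_dates_with_at_least_interval(dates, interval)
  dates.sort.each_with_object([]) do |date, acc|
    acc << date if acc.empty? || date - acc.last >= interval
  end
end

dates = [Date.new(2024, 1, 1), Date.new(2024, 1, 5),
         Date.new(2024, 1, 20), Date.new(2024, 1, 25)]
select_dates_with_at_least_interval(dates, 14)
# keeps 2024-01-01 and 2024-01-20; the 5th and 25th fall inside a window
```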
data/lib/churn_vs_complexity/serializer/csv.rb
ADDED
@@ -0,0 +1,14 @@
+# frozen_string_literal: true
+
+module ChurnVsComplexity
+  module Serializer
+    module CSV
+      def self.serialize(result)
+        values_by_file = result[:values_by_file]
+        values_by_file.map do |file, values|
+          "#{file},#{values[0]},#{values[1]}\n"
+        end.join
+      end
+    end
+  end
+end
data/lib/churn_vs_complexity/serializer/graph.rb
ADDED
@@ -0,0 +1,24 @@
+# frozen_string_literal: true
+
+module ChurnVsComplexity
+  module Serializer
+    class Graph
+      def initialize(template: Graph.load_template_file)
+        @template = template
+      end
+
+      def serialize(result)
+        data = result[:values_by_file].map do |file, values|
+          "{ file_path: '#{file}', churn: #{values[0]}, complexity: #{values[1]} }"
+        end.join(",\n") + "\n"
+        title = Serializer.title(result)
+        @template.gsub("// INSERT DATA\n", data).gsub('INSERT TITLE', title)
+      end
+
+      def self.load_template_file
+        file_path = File.expand_path('../../../tmp/template/graph.html', __dir__)
+        File.read(file_path)
+      end
+    end
+  end
+end
data/lib/churn_vs_complexity/serializer/pass_through.rb
ADDED
@@ -0,0 +1,21 @@
+# frozen_string_literal: true
+
+module ChurnVsComplexity
+  module Serializer
+    module PassThrough
+      class << self
+        def serialize(result)
+          values_by_file = result[:values_by_file]
+          end_date = result[:git_period].end_date
+          values = values_by_file.map do |_, values|
+            [values[0].to_f, values[1].to_f]
+          end
+          {
+            end_date:,
+            values:,
+          }
+        end
+      end
+    end
+  end
+end
data/lib/churn_vs_complexity/serializer/summary.rb
ADDED
@@ -0,0 +1,27 @@
+# frozen_string_literal: true
+
+module ChurnVsComplexity
+  module Serializer
+    module Summary
+      def self.serialize(result)
+        values_by_file = result[:values_by_file]
+        summary = SummaryHash.serialize(result)
+
+        <<~SUMMARY
+          #{Serializer.title(result)}
+
+          Number of observations: #{values_by_file.size}
+
+          Churn:
+          Mean #{summary[:mean_churn]}, Median #{summary[:median_churn]}
+
+          Complexity:
+          Mean #{summary[:mean_complexity]}, Median #{summary[:median_complexity]}
+
+          Gamma score:
+          Mean #{summary[:mean_gamma_score]}, Median #{summary[:median_gamma_score]}
+        SUMMARY
+      end
+    end
+  end
+end
data/lib/churn_vs_complexity/serializer/summary_hash.rb
ADDED
@@ -0,0 +1,54 @@
+# frozen_string_literal: true
+
+module ChurnVsComplexity
+  module Serializer
+    module SummaryHash
+      class << self
+        def serialize(result)
+          values_by_file = result[:values_by_file]
+          churn_values = values_by_file.map { |_, values| values[0].to_f }
+          complexity_values = values_by_file.map { |_, values| values[1].to_f }
+
+          mean_churn = churn_values.sum / churn_values.size
+          median_churn = churn_values.sort[churn_values.size / 2]
+          mean_complexity = complexity_values.sum / complexity_values.size
+          median_complexity = complexity_values.sort[complexity_values.size / 2]
+
+          max_churn = churn_values.max
+          min_churn = churn_values.min
+          max_complexity = complexity_values.max
+          min_complexity = complexity_values.min
+
+          epsilon = 0.0001
+          gamma_score = values_by_file.map do |_, values|
+            # unnormalised harmonic mean of churn and complexity,
+            # since the summary needs to be comparable over time
+            churn = values[0].to_f + epsilon
+            complexity = values[1].to_f + epsilon
+
+            (2 * churn * complexity) / (churn + complexity)
+          end
+
+          mean_gamma_score = gamma_score.sum / gamma_score.size
+          median_gamma_score = gamma_score.sort[gamma_score.size / 2]
+
+          end_date = result[:git_period].end_date
+
+          {
+            mean_churn:,
+            median_churn:,
+            max_churn:,
+            min_churn:,
+            mean_complexity:,
+            median_complexity:,
+            max_complexity:,
+            min_complexity:,
+            mean_gamma_score:,
+            median_gamma_score:,
+            end_date:,
+          }
+        end
+      end
+    end
+  end
+end
data/lib/churn_vs_complexity/serializer/timetravel/quality_calculator.rb
ADDED
@@ -0,0 +1,38 @@
+# frozen_string_literal: true
+
+module ChurnVsComplexity
+  module Serializer
+    module Timetravel
+      EPSILON = 0.0001
+
+      class QualityCalculator
+        def initialize(min_churn:, max_churn:, min_complexity:, max_complexity:)
+          @min_churn = min_churn
+          @max_churn = max_churn
+          @min_complexity = min_complexity
+          @max_complexity = max_complexity
+        end
+
+        def alpha_score(raw_churn, raw_complexity)
+          # harmonic mean of normalised churn and complexity
+          churn = normalise(raw_churn, @min_churn, @max_churn, EPSILON)
+          complexity = normalise(raw_complexity, @min_complexity, @max_complexity, EPSILON)
+
+          (2 * churn * complexity) / (churn + complexity)
+        end
+
+        def beta_score(raw_churn, raw_complexity)
+          # geometric mean of normalised churn and complexity
+          churn = normalise(raw_churn, @min_churn, @max_churn, EPSILON)
+          complexity = normalise(raw_complexity, @min_complexity, @max_complexity, EPSILON)
+
+          Math.sqrt(churn * complexity)
+        end
+
+        private
+
+        def normalise(score, min, max, epsilon) = (score + epsilon - min) / (epsilon + max - min)
+      end
+    end
+  end
+end
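The normalisation in QualityCalculator maps each raw value into a 0-1 range relative to the observed extrema before combining, so a repository with churn in the hundreds and complexity in single digits still weights both equally. A standalone sketch of the same scoring (plain methods instead of the gem's class, same epsilon guard):

```ruby
EPSILON = 0.0001

# Map a raw score into roughly 0..1 relative to the observed min/max.
def normalise(score, min, max) = (score + EPSILON - min) / (EPSILON + max - min)

# Harmonic mean of the normalised values (the alpha score above).
def alpha(churn, complexity, extrema)
  c = normalise(churn, extrema[:min_churn], extrema[:max_churn])
  x = normalise(complexity, extrema[:min_complexity], extrema[:max_complexity])
  (2 * c * x) / (c + x)
end

# Geometric mean of the normalised values (the beta score above).
def beta(churn, complexity, extrema)
  c = normalise(churn, extrema[:min_churn], extrema[:max_churn])
  x = normalise(complexity, extrema[:min_complexity], extrema[:max_complexity])
  Math.sqrt(c * x)
end

extrema = { min_churn: 0.0, max_churn: 10.0, min_complexity: 1.0, max_complexity: 21.0 }
alpha(10.0, 21.0, extrema) # a file at both maxima scores 1.0
```

The epsilon keeps both the numerator and denominator of `normalise` non-zero, so files at the observed minimum still get a tiny positive score instead of collapsing the harmonic mean to zero.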
data/lib/churn_vs_complexity/serializer/timetravel/stats_calculator.rb
ADDED
@@ -0,0 +1,60 @@
+# frozen_string_literal: true
+
+module ChurnVsComplexity
+  module Serializer
+    module Timetravel
+      class StatsCalculator
+        # ['some_sha', { 'end_date' => '2024-01-01', 'values' => [[1, 2], [3, 4]] }]
+        def summaries(result)
+          observations = result.sort_by do |_sha, summary|
+            summary['end_date']
+          end.map { |entry| entry[1] }
+
+          quality_calculator = QualityCalculator.new(**extrema(observations))
+          observations.map do |o|
+            end_date = o['end_date']
+            scores = o['values'].map do |(churn, complexity)|
+              alpha = quality_calculator.alpha_score(churn, complexity)
+              beta = quality_calculator.beta_score(churn, complexity)
+              [churn, complexity, alpha, beta]
+            end
+            {
+              'end_date' => end_date,
+              'mean_churn' => mean(scores.map { |s| s[0] }),
+              'median_churn' => median(scores.map { |s| s[0] }),
+              'mean_complexity' => mean(scores.map { |s| s[1] }),
+              'median_complexity' => median(scores.map { |s| s[1] }),
+              'mean_alpha_score' => mean(scores.map { |s| s[2] }),
+              'median_alpha_score' => median(scores.map { |s| s[2] }),
+              'mean_beta_score' => mean(scores.map { |s| s[3] }),
+              'median_beta_score' => median(scores.map { |s| s[3] }),
+            }
+          end
+        end
+
+        private
+
+        def extrema(observations)
+          churn_series = observations.flat_map { |o| o['values'] }.map { |(churn, _)| churn }
+          max_churn = churn_series.max
+          min_churn = churn_series.min
+
+          complexity_series = observations.flat_map { |o| o['values'] }.map { |(_, complexity)| complexity }
+          max_complexity = complexity_series.max
+          min_complexity = complexity_series.min
+
+          { max_churn:, min_churn:, max_complexity:, min_complexity: }
+        end
+
+        def mean(series)
+          series.sum / series.size
+        end
+
+        def median(series)
+          sorted = series.sort
+          sorted[sorted.size / 2]
+        end
+      end
+    end
+  end
+end
data/lib/churn_vs_complexity/serializer/timetravel.rb
ADDED
@@ -0,0 +1,103 @@
+# frozen_string_literal: true
+
+require_relative 'timetravel/quality_calculator'
+require_relative 'timetravel/stats_calculator'
+
+module ChurnVsComplexity
+  module Serializer
+    module Timetravel
+      def self.summaries(result)
+        StatsCalculator.new.summaries(result)
+      end
+
+      def self.resolve(serializer:, git_period:, relative_period:, jump_days:)
+        case serializer
+        when :csv
+          CSV
+        when :graph
+          Graph.new(git_period:, relative_period:, jump_days:)
+        end
+      end
+
+      module CSV
+        def self.serialize(result)
+          summaries = Timetravel.summaries(result)
+
+          # 2. Add title row to front of summaries
+          summaries.unshift(
+            {
+              'end_date' => 'Date',
+              'mean_churn' => 'Mean Churn',
+              'median_churn' => 'Median Churn',
+              'mean_complexity' => 'Mean Complexity',
+              'median_complexity' => 'Median Complexity',
+              'mean_alpha_score' => 'Mean Alpha Score',
+              'median_alpha_score' => 'Median Alpha Score',
+              'mean_beta_score' => 'Mean Beta Score',
+              'median_beta_score' => 'Median Beta Score',
+            },
+          )
+
+          # 3. convert to csv
+          summaries.map do |summary|
+            "#{summary['end_date']},#{summary['mean_churn']},#{summary['median_churn']},#{summary['mean_complexity']},#{summary['median_complexity']},#{summary['mean_alpha_score']},#{summary['median_alpha_score']},#{summary['mean_beta_score']},#{summary['median_beta_score']}"
+          end.join("\n")
+        end
+      end
+
+      # TODO: unit test
+      class Graph
+        def initialize(git_period:, relative_period:, jump_days:, template: Graph.load_template_file)
+          @template = template
+          @git_period = git_period
+          @relative_period = relative_period
+          @jump_days = jump_days
+        end
+
+        def self.load_template_file
+          file_path = File.expand_path('../../../tmp/template/timetravel_graph.html', __dir__)
+          File.read(file_path)
+        end
+
+        def serialize(result)
+          summaries = Timetravel.summaries(result)
+
+          data = summaries.map do |summary|
+            JSON.dump(summary)
+          end.join(",\n") + "\n"
+
+          @template.gsub("// INSERT DATA\n", data)
+                   .gsub('INSERT TITLE', title)
+                   .gsub('INSERT CHURN MODIFIER', churn_modifier)
+        end
+
+        private
+
+        def title
+          "#{churn_modifier}churn and complexity since #{since} evaluated every #{@jump_days} days"
+        end
+
+        def since
+          if @git_period.requested_start_date.nil?
+            'start of project'
+          else
+            @git_period.effective_start_date.strftime('%Y-%m-%d').to_s
+          end
+        end
+
+        def churn_modifier
+          case @relative_period
+          when :month
+            'Monthly '
+          when :quarter
+            'Quarterly '
+          when :year
+            'Yearly '
+          else
+            ''
+          end
+        end
+      end
+    end
+  end
+end
data/lib/churn_vs_complexity/serializer.rb
CHANGED
@@ -1,5 +1,12 @@
 # frozen_string_literal: true
 
+require_relative 'serializer/timetravel'
+require_relative 'serializer/summary_hash'
+require_relative 'serializer/summary'
+require_relative 'serializer/csv'
+require_relative 'serializer/graph'
+require_relative 'serializer/pass_through'
+
 module ChurnVsComplexity
   module Serializer
     def self.title(result)
@@ -15,65 +22,5 @@ module ChurnVsComplexity
     module None
       def self.serialize(result) = result
     end
-
-    module Summary
-      def self.serialize(result)
-        values_by_file = result[:values_by_file]
-        churn_values = values_by_file.map { |_, values| values[0].to_f }
-        complexity_values = values_by_file.map { |_, values| values[1].to_f }
-
-        mean_churn = churn_values.sum / churn_values.size
-        median_churn = churn_values.sort[churn_values.size / 2]
-        mean_complexity = complexity_values.sum / complexity_values.size
-        median_complexity = complexity_values.sort[complexity_values.size / 2]
-
-        product = values_by_file.map { |_, values| values[0].to_f * values[1].to_f }
-        mean_product = product.sum / product.size
-        median_product = product.sort[product.size / 2]
-
-        <<~SUMMARY
-          #{Serializer.title(result)}
-
-          Number of observations: #{values_by_file.size}
-
-          Churn:
-          Mean #{mean_churn}, Median #{median_churn}
-
-          Complexity:
-          Mean #{mean_complexity}, Median #{median_complexity}
-
-          Product of churn and complexity:
-          Mean #{mean_product}, Median #{median_product}
-        SUMMARY
-      end
-    end
-
-    module CSV
-      def self.serialize(result)
-        values_by_file = result[:values_by_file]
-        values_by_file.map do |file, values|
-          "#{file},#{values[0]},#{values[1]}\n"
-        end.join
-      end
-    end
-
-    class Graph
-      def initialize(template: Graph.load_template_file)
-        @template = template
-      end
-
-      def serialize(result)
-        data = result[:values_by_file].map do |file, values|
-          "{ file_path: '#{file}', churn: #{values[0]}, complexity: #{values[1]} }"
-        end.join(",\n") + "\n"
-        title = Serializer.title(result)
-        @template.gsub("// INSERT DATA\n", data).gsub('INSERT TITLE', title)
-      end
-
-      def self.load_template_file
-        file_path = File.expand_path('../../tmp/template/graph.html', __dir__)
-        File.read(file_path)
-      end
-    end
   end
 end
data/lib/churn_vs_complexity/timetravel/traveller.rb
ADDED
@@ -0,0 +1,66 @@
+# frozen_string_literal: true
+
+module ChurnVsComplexity
+  # TODO: unit test and integration test
+  module Timetravel
+    class Traveller
+      def initialize(since:, relative_period:, engine:, serializer:, jump_days:, factory: Factory)
+        @relative_period = relative_period
+        @engine = engine
+        @jump_days = jump_days
+        @serializer = serializer
+        @git_period = GitDate.git_period(since, Time.now.to_date)
+        @factory = factory
+      end
+
+      def go(folder:)
+        git_strategy = @factory.git_strategy(folder:)
+        commits = git_strategy.resolve_commits_with_interval(git_period: @git_period, jump_days: @jump_days)
+
+        chunked = make_chunks(commits)
+        work_on(chunked:, folder:, git_strategy:)
+        combined = chunked.map { |c_and_p| read_result(c_and_p[:pipe]) }.reduce({}, :merge)
+
+        serializer.serialize(combined)
+      end
+
+      private
+
+      def work_on(chunked:, folder:, git_strategy:)
+        chunked.map.with_index do |c_and_p, i|
+          worktree = @factory.worktree(root_folder: folder, git_strategy:, number: i)
+          worktree.prepare
+          schedule_work(worktree:, **c_and_p)
+        end
+      end
+
+      def make_chunks(commits)
+        chunk_size = (commits.size / 3.0).ceil
+        commits.each_slice(chunk_size).map do |chunk|
+          { chunk:, pipe: @factory.pipe }
+        end.to_a
+      end
+
+      def read_result(pipe)
+        part = begin
+          JSON.parse(pipe[0].gets)
+        rescue StandardError => e
+          warn "Error parsing JSON: #{e}"
+          {}
+        end
+        pipe.each(&:close)
+        part
+      end
+
+      def schedule_work(chunk:, worktree:, pipe:)
+        @factory.worker(engine: @engine, worktree:)
+                .schedule(chunk:, pipe:)
+      end
+
+      def serializer
+        @factory.serializer(serializer: @serializer, git_period: @git_period,
+                            relative_period: @relative_period, jump_days: @jump_days,)
+      end
+    end
+  end
+end
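The Traveller above fans work out to forked child processes and collects each chunk's results back over an anonymous pipe (the Worker that does the forking is added in timetravel.rb in this same release). A minimal standalone sketch of that fork-and-pipe pattern, with illustrative payloads standing in for real engine results; note that `fork` requires a Unix-like platform:

```ruby
require 'json'

# One pipe per chunk of work; each child process writes a single JSON line.
chunks = [{ 'abc123' => 10 }, { 'def456' => 20 }] # stand-ins for engine results
pipes = chunks.map { IO.pipe }

chunks.each_with_index do |result, i|
  fork do
    _reader, writer = pipes[i]
    writer.puts(JSON.dump(result))
    writer.close
  end
end

# Parent: read one line per pipe, then merge the partial hashes,
# mirroring Traveller#read_result and the reduce({}, :merge) step.
combined = pipes.map do |reader, writer|
  part = JSON.parse(reader.gets)
  [reader, writer].each(&:close)
  part
end.reduce({}, :merge)

Process.waitall
```

Because `gets` blocks until the child writes its newline-terminated line, the parent needs no explicit synchronization beyond the final `Process.waitall`.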
data/lib/churn_vs_complexity/timetravel/worktree.rb
ADDED
@@ -0,0 +1,56 @@
+# frozen_string_literal: true
+
+require 'digest'
+require 'tmpdir'
+
+module ChurnVsComplexity
+  module Timetravel
+    class Worktree
+      attr_reader :folder
+
+      def initialize(root_folder:, git_strategy:, number:)
+        @root_folder = root_folder
+        @git_strategy = git_strategy
+        @number = number
+      end
+
+      def prepare
+        @folder = prepare_worktree
+      end
+
+      def checkout(sha)
+        raise Error, 'Worktree not prepared' if @folder.nil?
+
+        @git_strategy.checkout_in_worktree(@folder, sha)
+      end
+
+      def remove
+        raise Error, 'Worktree not prepared' if @folder.nil?
+
+        @git_strategy.remove_worktree(@folder)
+      end
+
+      private
+
+      def tt_folder
+        folder_hash = Digest::SHA256.hexdigest(@root_folder)[0..7]
+        File.join(Dir.tmpdir, 'churn_vs_complexity', 'timetravel', folder_hash)
+      end
+
+      def prepare_worktree
+        worktree_folder = File.join(tt_folder, "worktree_#{@number}")
+
+        unless File.directory?(worktree_folder)
+          begin
+            FileUtils.mkdir_p(worktree_folder)
+          rescue StandardError
+            nil
+          end
+          @git_strategy.add_worktree(worktree_folder)
+        end
+
+        worktree_folder
+      end
+    end
+  end
+end
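Worktree keeps its scratch checkouts under a path derived from a short SHA-256 digest of the repository folder, so repeated runs against the same repository reuse the same location while different repositories cannot collide. A small sketch of that derivation (the repository path is illustrative):

```ruby
require 'digest'
require 'tmpdir'

root_folder = '/path/to/repo' # illustrative repository path

# First 8 hex characters of the digest, as in Worktree#tt_folder.
folder_hash = Digest::SHA256.hexdigest(root_folder)[0..7]
base = File.join(Dir.tmpdir, 'churn_vs_complexity', 'timetravel', folder_hash)

# One numbered folder per concurrent worker, as in prepare_worktree.
worktree_folders = (0..2).map { |n| File.join(base, "worktree_#{n}") }
```

Since the digest is deterministic, the same repository path always maps to the same scratch folder across runs.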
data/lib/churn_vs_complexity/timetravel.rb
ADDED
@@ -0,0 +1,70 @@
+# frozen_string_literal: true
+
+require_relative 'timetravel/traveller'
+require_relative 'timetravel/worktree'
+
+module ChurnVsComplexity
+  module Timetravel
+    class Factory
+      def self.git_strategy(folder:) = GitStrategy.new(folder:)
+      def self.pipe = IO.pipe
+      def self.worker(engine:, worktree:) = Worker.new(engine:, worktree:)
+      def self.worktree(root_folder:, git_strategy:, number:) = Worktree.new(root_folder:, git_strategy:, number:)
+      def self.serializer(**args) = Serializer::Timetravel.resolve(**args)
+    end
+
+    class Worker
+      def initialize(engine:, worktree:)
+        @engine = engine
+        @worktree = worktree
+      end
+
+      def schedule(chunk:, pipe:)
+        fork do
+          results = chunk.to_h do |commit|
+            sha = commit.sha
+            @worktree.checkout(sha)
+            result = @engine.check(folder: @worktree.folder)
+            [sha, result]
+          end
+          @worktree.remove
+          pipe[1].puts(JSON.dump(results))
+          pipe[1].close
+        end
+      end
+    end
+
+    class GitStrategy
+      def initialize(folder:)
+        @repo = Git.open(folder)
+        @folder = folder
+      end
+
+      def checkout_in_worktree(worktree_folder, sha)
+        command = "(cd #{worktree_folder} && git checkout #{sha}) > /dev/null 2>&1"
+        `#{command}`
+      end
+
+      def resolve_commits_with_interval(git_period:, jump_days:)
+        candidates = @repo.log(1_000_000).since(git_period.effective_start_date).until(git_period.end_date).to_a
+
+        commits_by_date = candidates.filter { |c| c.date.to_date >= git_period.effective_start_date }
+                                    .group_by { |c| c.date.to_date }
+
+        found_dates = GitDate.select_dates_with_at_least_interval(commits_by_date.keys, jump_days)
+
+        found_dates.map { |date| commits_by_date[date].max_by(&:date) }
+      end
+
+      def add_worktree(wt_folder)
+        command = "(cd #{@folder} && git worktree add -f #{wt_folder}) > /dev/null 2>&1"
+        `#{command}`
+      end
+
+      def remove_worktree(worktree_folder)
+        command = "(cd #{worktree_folder} && git worktree remove -f #{worktree_folder}) > /dev/null 2>&1"
+        `#{command}`
+      end
+    end
+  end
+end
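GitStrategy#resolve_commits_with_interval defers the thinning of commit dates to GitDate.select_dates_with_at_least_interval, whose body is not part of this diff. One plausible greedy reading of that contract, keeping only dates spaced at least `jump_days` apart, is sketched below; the actual gem implementation may differ:

```ruby
require 'date'

# Hypothetical stand-in for GitDate.select_dates_with_at_least_interval:
# keep the earliest date, then every later date that falls at least
# `interval` days after the last date kept.
def select_dates_with_at_least_interval(dates, interval)
  dates.sort.each_with_object([]) do |date, kept|
    kept << date if kept.empty? || (date - kept.last) >= interval
  end
end

dates = [Date.new(2024, 1, 1), Date.new(2024, 1, 3), Date.new(2024, 1, 9)]
select_dates_with_at_least_interval(dates, 7) # Jan 3 is dropped, Jan 9 kept
```

Each surviving date is then mapped back to its latest commit via `commits_by_date[date].max_by(&:date)`, giving one representative commit per sampled date.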
data/lib/churn_vs_complexity.rb
CHANGED
@@ -13,7 +13,9 @@ require_relative 'churn_vs_complexity/cli'
 require_relative 'churn_vs_complexity/config'
 require_relative 'churn_vs_complexity/serializer'
 require_relative 'churn_vs_complexity/git_date'
+require_relative 'churn_vs_complexity/timetravel'

 module ChurnVsComplexity
   class Error < StandardError; end
+  class ValidationError < Error; end
 end
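Because the new ValidationError inherits from the gem's existing Error class, callers that already rescue ChurnVsComplexity::Error also catch validation failures with no changes. A minimal illustration (the module body is reproduced from the diff; the raise site is hypothetical):

```ruby
module ChurnVsComplexity
  class Error < StandardError; end
  class ValidationError < Error; end
end

begin
  # A hypothetical validation failure, e.g. from config checking.
  raise ChurnVsComplexity::ValidationError, 'unsupported mode'
rescue ChurnVsComplexity::Error => e
  # The broader rescue still catches the subclass.
  caught = e.message
end
```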
data/tmp/template/graph.html
CHANGED
@@ -16,10 +16,7 @@
        ];

        // Extract data for Chart.js
-       const labels = dataPoints.map(point => point.file_path);
-       const churnData = dataPoints.map(point => point.churn);
-       const complexityData = dataPoints.map(point => point.complexity);
-
+       const labels = dataPoints.map(point => point.file_path);
        // Prepare data in Chart.js format
        const data = {
          labels: labels,
data/tmp/template/timetravel_graph.html
ADDED
@@ -0,0 +1,100 @@
+<!DOCTYPE html>
+<html lang="en">
+<head>
+    <meta charset="UTF-8">
+    <meta name="viewport" content="width=device-width, initial-scale=1.0">
+    <title>INSERT TITLE</title>
+    <script src="https://cdn.jsdelivr.net/npm/chart.js"></script>
+    <script src="https://cdn.jsdelivr.net/npm/chartjs-adapter-date-fns"></script>
+    <style>
+        body {
+            font-family: Arial, sans-serif;
+        }
+        h1 {
+            text-align: center;
+            font-size: 24px;
+            font-weight: bold;
+            margin-bottom: 20px;
+            color: #333;
+        }
+        canvas {
+            margin: 20px auto;
+        }
+    </style>
+</head>
+<body>
+    <h1>INSERT TITLE</h1>
+    <canvas id="complexityChart" width="800" height="400"></canvas>
+    <canvas id="churnChart" width="800" height="400"></canvas>
+    <canvas id="alphaScoreChart" width="800" height="400"></canvas>
+    <canvas id="betaScoreChart" width="800" height="400"></canvas>

+    <script>

+        const dataPoints = [
+            // INSERT DATA
+        ];

+        // Extract dates for x-axis
+        const labels = dataPoints.map(point => point.end_date);

+        // Function to create a dataset
+        function createDataset(label, data, color) {
+            return {
+                label: label,
+                data: data,
+                borderColor: color,
+                backgroundColor: color,
+                fill: false,
+                tension: 0.1
+            };
+        }

+        // Function to create a chart
+        function createChart(ctx, title, datasets) {
+            return new Chart(ctx, {
+                type: 'line',
+                data: { labels: labels, datasets: datasets },
+                options: {
+                    responsive: true,
+                    plugins: {
+                        title: { display: true, text: title }
+                    },
+                    scales: {
+                        x: { type: 'time', time: { parser: 'yyyy-MM-dd', tooltipFormat: 'll' } },
+                        y: { beginAtZero: true }
+                    }
+                }
+            });
+        }

+        // Create Complexity Chart
+        const complexityCtx = document.getElementById('complexityChart').getContext('2d');
+        createChart(complexityCtx, 'Complexity Over Time', [
+            createDataset('Mean Complexity', dataPoints.map(p => ({ x: p.end_date, y: p.mean_complexity })), 'rgb(75, 192, 192)'),
+            createDataset('Median Complexity', dataPoints.map(p => ({ x: p.end_date, y: p.median_complexity })), 'rgb(255, 99, 132)')
+        ]);

+        // Create Churn Chart
+        const churnCtx = document.getElementById('churnChart').getContext('2d');
+        createChart(churnCtx, 'INSERT CHURN MODIFIERChurn Over Time', [
+            createDataset('Mean Churn', dataPoints.map(p => ({ x: p.end_date, y: p.mean_churn })), 'rgb(54, 162, 235)'),
+            createDataset('Median Churn', dataPoints.map(p => ({ x: p.end_date, y: p.median_churn })), 'rgb(255, 206, 86)')
+        ]);

+        // Create Alpha Score Chart
+        const alphaScoreCtx = document.getElementById('alphaScoreChart').getContext('2d');
+        createChart(alphaScoreCtx, 'Alpha Score Over Time', [
+            createDataset('Mean Alpha Score', dataPoints.map(p => ({ x: p.end_date, y: p.mean_alpha_score })), 'rgb(153, 102, 255)'),
+            createDataset('Median Alpha Score', dataPoints.map(p => ({ x: p.end_date, y: p.median_alpha_score })), 'rgb(255, 159, 64)')
+        ]);

+        // Create Beta Score Chart
+        const betaScoreCtx = document.getElementById('betaScoreChart').getContext('2d');
+        createChart(betaScoreCtx, 'Beta Score Over Time', [
+            createDataset('Mean Beta Score', dataPoints.map(p => ({ x: p.end_date, y: p.mean_beta_score })), 'rgb(153, 102, 255)'),
+            createDataset('Median Beta Score', dataPoints.map(p => ({ x: p.end_date, y: p.median_beta_score })), 'rgb(255, 159, 64)')
+        ]);
+    </script>
+</body>
+</html>
data/tmp/timetravel/.keep
File without changes
metadata
CHANGED
@@ -1,14 +1,14 @@
 --- !ruby/object:Gem::Specification
 name: churn_vs_complexity
 version: !ruby/object:Gem::Version
-  version: 1.
+  version: 1.4.0
 platform: ruby
 authors:
 - Erik T. Madsen
 autorequire:
 bindir: bin
 cert_chain: []
-date: 2024-
+date: 2024-10-10 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
   name: flog
@@ -57,6 +57,7 @@ files:
 - LICENSE.txt
 - README.md
 - Rakefile
+- TODO
 - bin/churn_vs_complexity
 - lib/churn_vs_complexity.rb
 - lib/churn_vs_complexity/churn.rb
@@ -71,12 +72,24 @@ files:
 - lib/churn_vs_complexity/file_selector.rb
 - lib/churn_vs_complexity/git_date.rb
 - lib/churn_vs_complexity/serializer.rb
+- lib/churn_vs_complexity/serializer/csv.rb
+- lib/churn_vs_complexity/serializer/graph.rb
+- lib/churn_vs_complexity/serializer/pass_through.rb
+- lib/churn_vs_complexity/serializer/summary.rb
+- lib/churn_vs_complexity/serializer/summary_hash.rb
+- lib/churn_vs_complexity/serializer/timetravel.rb
+- lib/churn_vs_complexity/serializer/timetravel/quality_calculator.rb
+- lib/churn_vs_complexity/serializer/timetravel/stats_calculator.rb
+- lib/churn_vs_complexity/timetravel.rb
+- lib/churn_vs_complexity/timetravel/traveller.rb
+- lib/churn_vs_complexity/timetravel/worktree.rb
 - lib/churn_vs_complexity/version.rb
 - package-lock.json
 - tmp/eslint-support/complexity-calculator.js
 - tmp/eslint-support/package.json
 - tmp/pmd-support/ruleset.xml
 - tmp/template/graph.html
+- tmp/template/timetravel_graph.html
 - tmp/test-support/java/small-example/src/main/java/org/example/Main.java
 - tmp/test-support/java/small-example/src/main/java/org/example/spice/Checker.java
 - tmp/test-support/javascript/complex.js
@@ -93,6 +106,7 @@ files:
 - tmp/test-support/txt/st.txt
 - tmp/test-support/txt/uvx.txt
 - tmp/test-support/txt/yz.txt
+- tmp/timetravel/.keep
 homepage: https://github.com/beatmadsen/churn_vs_complexity
 licenses:
 - MIT