heimdall_tools 1.3.48 → 1.3.49

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: aa900ce8ff5cabccb6e138bb636a0c56972f1894431c9ce67f95bcb0811b1c91
-  data.tar.gz: 830006eea9df8dfe413e8d0e472d9b38794f41f92a3ec5c468b882b01d54b39b
+  metadata.gz: f6386ac453df3c036d6fb57fd6993c338d3432c6c65dda92d06df94c808d4394
+  data.tar.gz: 1f215414fb5063abee81d84a760c1208a211dac9478dc8a96b95eb9f00d75b8f
 SHA512:
-  metadata.gz: b304f53e1ffd55c28734ca094c2d4ccedb23123ca144bad0d06c80f5a29e5b6f143b48cce2dedec0467ae344c8bbc55bd93cf6ca53dae6d0d7486019e73dcbe1
-  data.tar.gz: 7096c8751c387912720ba51fe18f69c1a8568d98a4e9a1d7962a5693d7469049eb71d24878dc7a75d8af96de70211324a173b9f3eb87c6104b0e5f6df647cdcd
+  metadata.gz: 6a1644af8db70b6de1853899547037c651886a1e8fed8d611826b8a367e5e26b60620bd2edc0d93735c4eb5da379a456f38d65be11bcc9b13e9977545f8566f5
+  data.tar.gz: 89b993467f7bf734dc5624218937090d3d093bbcdb979e80ecfe239b131a520abab7785a03204e4b707d613659ba82cee7e7894fdacef826599adebd440ce922
data/README.md CHANGED
@@ -5,20 +5,22 @@
 
 HeimdallTools supplies several methods to convert output from various tools to "Heimdall Data Format"(HDF) format to be viewable in Heimdall. The current converters are:
 
-1. [**aws_config_mapper**](#aws_config_mapper) - assess, audit, and evaluate AWS resources
-1. [**burpsuite_mapper**](#burpsuite_mapper) - commercial dynamic analysis tool
-1. [**dbprotect_mapper**](#dbprotect_mapper) - database vulnerability scanner
-1. [**fortify_mapper**](#fortify_mapper) - commercial static code analysis tool
-1. [**jfrog_xray_mapper**](#jfrog_xray_mapper) - package vulnerability scanner
-1. [**nessus_mapper**](#nessus_mapper) - commercial security scanner (supports compliance and vulnerability scans from Tenable.sc and Tenable.io)
-1. [**netsparker_mapper**](#netsparker_mapper) - web application security scanner
-1. [**nikto_mapper**](#nikto_mapper) - open-source web server scanner
-1. [**sarif_mapper**](#sarif_mapper) - static analysis results interchange format
+1. [**asff_mapper**](#asff_mapper) - custom findings format for AWS Security Hub
+1. [**aws_config_mapper**](#aws_config_mapper) - assess, audit, and evaluate AWS resources
+1. [**burpsuite_mapper**](#burpsuite_mapper) - commercial dynamic analysis tool
+1. [**dbprotect_mapper**](#dbprotect_mapper) - database vulnerability scanner
+1. [**fortify_mapper**](#fortify_mapper) - commercial static code analysis tool
+1. [**jfrog_xray_mapper**](#jfrog_xray_mapper) - package vulnerability scanner
+1. [**nessus_mapper**](#nessus_mapper) - commercial security scanner (supports compliance and vulnerability scans from Tenable.sc and Tenable.io)
+1. [**netsparker_mapper**](#netsparker_mapper) - web application security scanner
+1. [**nikto_mapper**](#nikto_mapper) - open-source web server scanner
+1. [**prowler_mapper**](#prowler_mapper) - assess, audit, harden, and facilitate incidence response for AWS resources
+1. [**sarif_mapper**](#sarif_mapper) - static analysis results interchange format
 1. [**scoutsuite_mapper**](#scoutsuite_mapper) - multi-cloud security auditing tool
 1. [**snyk_mapper**](#snyk_mapper) - commercial package vulnerability scanner
 1. [**sonarqube_mapper**](#sonarqube_mapper) - open-source static code analysis tool
 1. [**xccdf_results_mapper**](#xccdf_results_mapper) - extensible configuration checklist description results format
-1. [*scc_mapper](#xccdf_results_mapper) - scap compliance checker format
+1. [**scc_mapper**](#xccdf_results_mapper) - scap compliance checker format
 1. [**zap_mapper**](#zap_mapper) - OWASP ZAP - open-source dynamic code analysis tool
 
 ## Want to recommend a mapper for another tool? Please use these steps:
@@ -84,6 +86,27 @@ For Docker usage, replace the `heimdall_tools` command with the correct Docker c
 
 Note that all of the above Docker commands will mount your current directory on the Docker container. Ensure that you have navigated to the directory you intend to convert files in before executing the command.
 
+## asff_mapper
+
+asff_mapper translates AWS Security Finding Format results from JSON to HDF-formatted JSON so as to be viewable on Heimdall
+
+Note: The following commands are examples to extract data via the AWS CLI that need to be fed to the mapper:
+
+Output|Use|Command
+---|---|---
+ASFF json|All the findings that will be fed into the mapper|aws securityhub get-findings > asff.json
+AWS SecurityHub enabled standards json|Get all the enabled standards so you can get their identifiers|aws securityhub get-enabled-standards > asff_standards.json
+AWS SecurityHub standard controls json|Get all the controls for a standard that will be fed into the mapper|aws securityhub describe-standards-controls --standards-subscription-arn "arn:aws:securityhub:us-east-1:123456789123:subscription/cis-aws-foundations-benchmark/v/1.2.0" > asff_cis_standard.json
+
+USAGE: heimdall_tools asff_mapper -i <asff-finding-json> [--sh <standard-1-json> ... <standard-n-json>] -o <hdf-scan-results-json>
+
+FLAGS:
+  -i --input -j --json <asff-finding-json> : path to ASFF findings file.
+  --sh --securityhub-standards --input-securityhub-standards : array of paths to AWS SecurityHub standard files.
+  -o --output <hdf-scan-results-json> : path to output scan-results json.
+
+example: heimdall_tools asff_mapper -i asff_findings.json --sh aws_standard.json cis_standard.json -o asff_hdf.json
+
 ## aws_config_mapper
 
 aws_config_mapper pulls Ruby AWS SDK data to translate AWS Config Rule results into HDF format json to be viewable in Heimdall
@@ -99,8 +122,8 @@ aws_config_mapper pulls Ruby AWS SDK data to translate AWS Config Rule results i
 USAGE: heimdall_tools aws_config_mapper [OPTIONS] -o
 
 FLAGS:
-  -o --output : path to output scan-results json.
-  -V --verbose : verbose run [optional].
+  -o --output : path to output scan-results json.
+  -V --verbose : verbose run [optional].
 
 example: heimdall_tools aws_config_mapper -o aws_config_results_hdf.json
 
@@ -111,9 +134,9 @@ burpsuite_mapper translates an BurpSuite Pro exported XML results file into HDF
 USAGE: heimdall_tools burpsuite_mapper [OPTIONS] -x -o
 
 FLAGS:
-  -x : path to BurpSuitePro exported XML results file.
-  -o --output : path to output scan-results json.
-  -V --verbose : verbose run [optional].
+  -x : path to BurpSuitePro exported XML results file.
+  -o --output : path to output scan-results json.
+  -V --verbose : verbose run [optional].
 
 example: heimdall_tools burpsuite_mapper -x burpsuite_results.xml -o scan_results.json
 
@@ -124,9 +147,9 @@ dbprotect_mapper translates DBProtect report in `Check Results Details` format X
 USAGE: heimdall_tools dbprotect_mapper [OPTIONS] -x -o
 
 FLAGS:
-  -x : path to DBProtect report XML file.
-  -o --output : path to output scan-results json.
-  -V --verbose : verbose run [optional].
+  -x : path to DBProtect report XML file.
+  -o --output : path to output scan-results json.
+  -V --verbose : verbose run [optional].
 
 example: heimdall_tools dbprotect_mapper -x check_results_details_report.xml -o db_protect_hdf.json
 
@@ -137,9 +160,9 @@ fortify_mapper translates an Fortify results FVDL file into HDF format json to b
 USAGE: heimdall_tools fortify_mapper [OPTIONS] -f -o
 
 FLAGS:
-  -f --fvdl : path to Fortify Scan FVDL file.
-  -o --output : path to output scan-results json.
-  -V --verbose : verbose run [optional].
+  -f --fvdl : path to Fortify Scan FVDL file.
+  -o --output : path to output scan-results json.
+  -V --verbose : verbose run [optional].
 
 example: heimdall_tools fortify_mapper -f audit.fvdl -o scan_results.json
 
@@ -150,9 +173,9 @@ jfrog_xray_mapper translates an JFrog Xray results JSON file into HDF format JSO
 USAGE: heimdall_tools jfrog_xray_mapper [OPTIONS] -j -o
 
 FLAGS:
-  -j : path to xray results JSON file.
-  -o --output : path to output scan-results json.
-  -V --verbose : verbose run [optional].
+  -j : path to xray results JSON file.
+  -o --output : path to output scan-results json.
+  -V --verbose : verbose run [optional].
 
 example: heimdall_tools jfrog_xray_mapper -j xray_results.json -o xray_results_hdf.json
 
@@ -166,9 +189,9 @@ Note: A separate HDF JSON file is generated for each host reported in the Nessus
 USAGE: heimdall_tools nessus_mapper [OPTIONS] -x -o
 
 FLAGS:
-  -x : path to Nessus-exported XML results file.
-  -o --output_prefix : path to output scan-results json.
-  -V --verbose : verbose run [optional].
+  -x : path to Nessus-exported XML results file.
+  -o --output_prefix : path to output scan-results json.
+  -V --verbose : verbose run [optional].
 
 example: heimdall_tools nessus_mapper -x nessus-results.xml -o test-env
 
@@ -181,9 +204,9 @@ The current iteration only works with Netsparker Enterprise Vulnerabilities Scan
 USAGE: heimdall_tools netsparker_mapper [OPTIONS] -x -o
 
 FLAGS:
-  -x : path to netsparker results XML file.
-  -o --output : path to output scan-results json.
-  -V --verbose : verbose run [optional].
+  -x : path to netsparker results XML file.
+  -o --output : path to output scan-results json.
+  -V --verbose : verbose run [optional].
 
 example: heimdall_tools netsparker_mapper -x netsparker_results.xml -o netsparker_hdf.json
 
@@ -196,12 +219,26 @@ Note: Current this mapper only support single target Nikto Scans.
 USAGE: heimdall_tools nikto_mapper [OPTIONS] -x -o
 
 FLAGS:
-  -j : path to Nikto results JSON file.
-  -o --output_prefix : path to output scan-results json.
-  -V --verbose : verbose run [optional].
+  -j : path to Nikto results JSON file.
+  -o --output_prefix : path to output scan-results json.
+  -V --verbose : verbose run [optional].
 
 example: heimdall_tools nikto_mapper -j nikto_results.json -o nikto_results.json
 
+## prowler_mapper
+
+prowler_mapper translates Prowler-derived AWS Security Finding Format results from concatenated JSON blobs to HDF-formatted JSON so as to be viewable on Heimdall
+
+Note: Currently this mapper only supports Prowler's ASFF output format.
+
+USAGE: heimdall_tools prowler_mapper -i <prowler-asff-json> -o <hdf-scan-results-json>
+
+FLAGS:
+  -i --input -j --json <prowler-asff-json> : path to Prowler ASFF findings file.
+  -o --output <hdf-scan-results-json> : path to output scan-results json.
+
+example: heimdall_tools prowler_mapper -i prowler_results.js -o prowler_hdf.json
+
 ## sarif_mapper
 
 sarif_mapper translates a SARIF JSON file into HDF format JSON to be viewable in Heimdall
@@ -209,9 +246,9 @@ sarif_mapper translates a SARIF JSON file into HDF format JSON to be viewable in
 USAGE: heimdall_tools sarif_mapper [OPTIONS] -j -o
 
 FLAGS:
-  -j : path to SARIF results JSON file.
-  -o --output_prefix : path to output scan-results json.
-  -V --verbose : verbose run [optional].
+  -j : path to SARIF results JSON file.
+  -o --output_prefix : path to output scan-results json.
+  -V --verbose : verbose run [optional].
 
 example: heimdall_tools sarif_mapper -j sarif_results.json -o sarif_results_hdf.json
 
@@ -224,8 +261,8 @@ Note: Currently this mapper only supports AWS.
 USAGE: heimdall_tools scoutsuite_mapper -i -o
 
 FLAGS:
-  -i --input -j --javascript : path to Scout Suite results Javascript file.
-  -o --output : path to output scan-results json.
+  -i --input -j --javascript : path to Scout Suite results Javascript file.
+  -o --output : path to output scan-results json.
 
 example: heimdall_tools scoutsuite_mapper -i scoutsuite_results.js -o scoutsuite_hdf.json
 
@@ -238,9 +275,9 @@ Note: A separate HDF JSON is generated for each project reported in the Snyk Rep
 USAGE: heimdall_tools snyk_mapper [OPTIONS] -x -o
 
 FLAGS:
-  -j : path to Snyk results JSON file.
-  -o --output_prefix : path to output scan-results json.
-  -V --verbose : verbose run [optional].
+  -j : path to Snyk results JSON file.
+  -o --output_prefix : path to output scan-results json.
+  -V --verbose : verbose run [optional].
 
 example: heimdall_tools snyk_mapper -j snyk_results.json -o output-file-prefix
 
@@ -251,11 +288,11 @@ sonarqube_mapper pulls SonarQube results, for the specified project, from the AP
 USAGE: heimdall_tools sonarqube_mapper [OPTIONS] -n -u -o
 
 FLAGS:
-  -n --name : Project Key of the project in SonarQube
-  -u --api_url : url of the SonarQube Server API. Typically ends with /api.
-  --auth : username:password or token [optional].
-  -o --output : path to output scan-results json.
-  -V --verbose : verbose run [optional].
+  -n --name : Project Key of the project in SonarQube
+  -u --api_url : url of the SonarQube Server API. Typically ends with /api.
+  --auth : username:password or token [optional].
+  -o --output : path to output scan-results json.
+  -V --verbose : verbose run [optional].
 
 example:
 
@@ -272,8 +309,8 @@ xccdf_results_mapper translates an XCCDF_Results XML scan into HDF format json t
 USAGE: heimdall_tools xccdf_results_mapper [OPTIONS] -x -o
 
 FLAGS:
-  -x : path to XCCDF-Results XML file.
-  -o --output : path to output scan-results json.
+  -x : path to XCCDF-Results XML file.
+  -o --output : path to output scan-results json.
 
 example: heimdall_tools xccdf_results_mapper -x xccdf_results.xml -o scan_results.json
 
@@ -284,10 +321,10 @@ zap_mapper translates OWASP ZAP results Json to HDF format Json be viewed on Hei
 USAGE: heimdall_tools zap_mapper [OPTIONS] -j -n -o
 
 FLAGS:
-  -j --json : path to OWASP ZAP results JSON file.
-  -n --name : URL of the site being evaluated.
-  -o --output : path to output scan-results json.
-  -V --verbose : verbose run [optional].
+  -j --json : path to OWASP ZAP results JSON file.
+  -n --name : URL of the site being evaluated.
+  -o --output : path to output scan-results json.
+  -V --verbose : verbose run [optional].
 
 example: heimdall_tools zap_mapper -j zap_results.json -n site_name -o scan_results.json
 
@@ -355,6 +392,7 @@ To release a new version, update the version number in `version.rb` according to
 
 ### Authors
 
+- Author:: Amndeep Singh Mann [Amndeep7](https://github.com/Amndeep7)
 - Author:: Rony Xavier [rx294](https://github.com/rx294)
 - Author:: Dan Mirsky [mirskiy](https://github.com/mirskiy)
 
@@ -0,0 +1,11 @@
+module HeimdallTools
+  class FirewallManager
+    def self.finding_id(finding, *, encode:, **)
+      encode.call(finding['Title'])
+    end
+
+    def self.product_name(findings, *, encode:, **)
+      encode.call("#{findings[0]['ProductFields']['aws/securityhub/CompanyName']} #{findings[0]['ProductFields']['aws/securityhub/ProductName']}")
+    end
+  end
+end
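The product hooks above (here and in the companion Prowler and SecurityHub classes in this diff) share a small duck-typed interface: a class method that receives the finding plus an `encode:` callable and returns a string. A minimal stand-alone sketch of how such a hook is driven; the `SketchTools` namespace and the escaping lambda are illustrative stand-ins, not part of the gem (the real mapper passes an HTMLEntities-based encoder):

```ruby
# Sketch: how an ASFF product hook is invoked with an encoder callable.
module SketchTools
  class FirewallManager
    # Same shape as the gem's hook: positional finding, keyword encode.
    def self.finding_id(finding, *, encode:, **)
      encode.call(finding['Title'])
    end
  end
end

# Stand-in for the HTMLEntities encoder used by the real mapper.
escape = ->(s) { s.to_s.gsub('&', '&amp;').gsub('<', '&lt;').gsub('>', '&gt;') }

finding = { 'Title' => 'S3 bucket <public>' }
puts SketchTools::FirewallManager.finding_id(finding, encode: escape)
# prints "S3 bucket &lt;public&gt;"
```

Passing the encoder as a keyword rather than hard-wiring it keeps the product classes free of the mapper's HTML-encoding dependency.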
@@ -0,0 +1,19 @@
+module HeimdallTools
+  class Prowler
+    def self.subfindings_code_desc(finding, *, encode:, **)
+      encode.call(finding['Description'])
+    end
+
+    def self.finding_id(finding, *, encode:, **)
+      encode.call(finding['GeneratorId'].partition('-')[-1])
+    end
+
+    def self.product_name(findings, *, encode:, **)
+      encode.call(findings[0]['ProductFields']['ProviderName'])
+    end
+
+    def self.desc(*, **)
+      ' '
+    end
+  end
+end
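The `finding_id` hook above leans on `String#partition`: Prowler prefixes its GeneratorIds with the tool name, and `partition('-')[-1]` keeps everything after the first hyphen only. A quick illustration (the id value is made up for the example):

```ruby
# partition('-') splits at the FIRST hyphen, returning
# [before, separator, after]; [-1] takes the remainder intact.
generator_id = 'prowler-check114' # illustrative value, not from a real scan

p generator_id.partition('-')     # => ["prowler", "-", "check114"]
p generator_id.partition('-')[-1] # => "check114"
```

Unlike `split('-')`, `partition` leaves any later hyphens in the check identifier untouched.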
@@ -0,0 +1,89 @@
+require 'csv'
+require 'json'
+
+module HeimdallTools
+  class SecurityHub
+    private_class_method def self.corresponding_control(controls, finding)
+      controls.find { |c| c['StandardsControlArn'] == finding['ProductFields']['StandardsControlArn'] }
+    end
+
+    def self.supporting_docs(standards:)
+      begin
+        controls = standards.nil? ? nil : standards.map { |s| JSON.parse(s)['Controls'] }.flatten
+      rescue StandardError => e
+        raise "Invalid supporting docs for Security Hub:\nException: #{e}"
+      end
+
+      begin
+        resource_dir = Pathname.new(__FILE__).join('../../../data')
+        aws_config_mapping_file = File.join(resource_dir, 'aws-config-mapping.csv')
+        aws_config_mapping = CSV.read(aws_config_mapping_file, { encoding: 'UTF-8', headers: true, header_converters: :symbol }).map(&:to_hash)
+      rescue StandardError => e
+        raise "Invalid AWS Config mapping file:\nException: #{e}"
+      end
+
+      { controls: controls, aws_config_mapping: aws_config_mapping }
+    end
+
+    def self.finding_id(finding, *, encode:, controls: nil, **)
+      ret = if !controls.nil? && !(control = corresponding_control(controls, finding)).nil?
+              control['ControlId']
+            elsif finding['ProductFields'].member?('ControlId') # check if aws
+              finding['ProductFields']['ControlId']
+            elsif finding['ProductFields'].member?('RuleId') # check if cis
+              finding['ProductFields']['RuleId']
+            else
+              finding['GeneratorId'].split('/')[-1]
+            end
+      encode.call(ret)
+    end
+
+    def self.finding_impact(finding, *, controls: nil, **)
+      if !controls.nil? && !(control = corresponding_control(controls, finding)).nil?
+        imp = control['SeverityRating'].to_sym
+      else
+        # severity is required, but can be either 'label' or 'normalized' internally with 'label' being preferred. other values can be in here too such as the original severity rating.
+        imp = finding['Severity'].key?('Label') ? finding['Severity']['Label'].to_sym : finding['Severity']['Normalized']/100.0
+        # securityhub asff file does not contain accurate severity information by setting things that shouldn't be informational to informational: when additional context, i.e. standards, is not provided, set informational to medium.
+        imp = :MEDIUM if imp.is_a?(Symbol) && imp == :INFORMATIONAL
+      end
+      imp
+    end
+
+    def self.finding_nist_tag(finding, *, aws_config_mapping:, **)
+      return {} unless finding['ProductFields']['RelatedAWSResources:0/type'] == 'AWS::Config::ConfigRule'
+
+      entries = aws_config_mapping.select { |rule| finding['ProductFields']['RelatedAWSResources:0/name'].include? rule[:awsconfigrulename] }
+      entries.map do |rule|
+        tags_joined = rule[:nistid].split('|') # subheadings are joined together in the csv file
+        tags_joined.map do |tag|
+          if (i = tag.index('(')).nil?
+            tag
+          else
+            tag[i..-1].scan(/\(.+?\)/).map { |subheading| "#{tag[0..i-1]}#{subheading}" }
+          end
+        end
+      end.flatten.uniq
+    end
+
+    def self.finding_title(finding, *, encode:, controls: nil, **)
+      ret = if !controls.nil? && !(control = corresponding_control(controls, finding)).nil?
+              control['Title']
+            else
+              finding['Title']
+            end
+      encode.call(ret)
+    end
+
+    def self.product_name(findings, *, encode:, **)
+      # "#{findings[0]['ProductFields']['aws/securityhub/CompanyName']} #{findings[0]['ProductFields']['aws/securityhub/ProductName']}"
+      # not using above due to wanting to provide the standard's name instead
+      if findings[0]['Types'][0].split('/')[-1].gsub(/-/, ' ').downcase == findings[0]['ProductFields']['StandardsControlArn'].split('/')[-4].gsub(/-/, ' ').downcase
+        standardname = findings[0]['Types'][0].split('/')[-1].gsub(/-/, ' ')
+      else
+        standardname = findings[0]['ProductFields']['StandardsControlArn'].split('/')[-4].gsub(/-/, ' ').split.map(&:capitalize).join(' ')
+      end
+      encode.call("#{standardname} v#{findings[0]['ProductFields']['StandardsControlArn'].split('/')[-2]}")
+    end
+  end
+end
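The subheading expansion inside `finding_nist_tag` above is the densest part of this file: the mapping CSV joins NIST IDs with `|` and can fold several control enhancements into one token such as `AC-6(1)(2)`. A stand-alone sketch of just that expansion, extracted and renamed for illustration (`expand_nist_tags` is not a method in the gem):

```ruby
# Expand pipe-joined NIST IDs, splitting folded enhancements apart:
# "AC-6(1)(2)|SA-11" becomes ["AC-6(1)", "AC-6(2)", "SA-11"].
def expand_nist_tags(joined)
  joined.split('|').map do |tag|
    if (i = tag.index('(')).nil?
      tag # plain control id, no enhancements to expand
    else
      # pair the base id (text before the first paren) with each
      # non-greedy parenthesized enhancement found after it
      tag[i..-1].scan(/\(.+?\)/).map { |sub| "#{tag[0...i]}#{sub}" }
    end
  end.flatten.uniq
end

p expand_nist_tags('AC-6(1)(2)|SA-11')
# => ["AC-6(1)", "AC-6(2)", "SA-11"]
```

The non-greedy `\(.+?\)` is what keeps `(1)(2)` from matching as one span.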
@@ -0,0 +1,232 @@
+require 'json'
+require 'set'
+
+require 'htmlentities'
+
+require 'heimdall_tools/hdf'
+require 'heimdall_tools/asff_compatible_products/firewall_manager'
+require 'heimdall_tools/asff_compatible_products/prowler'
+require 'heimdall_tools/asff_compatible_products/securityhub'
+
+module HeimdallTools
+  DEFAULT_NIST_TAG = %w{SA-11 RA-5}.freeze
+
+  INSPEC_INPUTS_MAPPING = {
+    string: 'String',
+    numeric: 'Numeric',
+    regexp: 'Regexp',
+    array: 'Array',
+    hash: 'Hash',
+    boolean: 'Boolean',
+    any: 'Any'
+  }.freeze
+
+  # Loading spinner sign
+  $spinner = Enumerator.new do |e|
+    loop do
+      e.yield '|'
+      e.yield '/'
+      e.yield '-'
+      e.yield '\\'
+    end
+  end
+
+  # TODO: use hash.dig and safe navigation operator throughout
+  class ASFFMapper
+    IMPACT_MAPPING = {
+      CRITICAL: 0.9,
+      HIGH: 0.7,
+      MEDIUM: 0.5,
+      LOW: 0.3,
+      INFORMATIONAL: 0.0
+    }.freeze
+
+    PRODUCT_ARN_MAPPING = {
+      %r{arn:.+:securityhub:.+:.*:product/aws/firewall-manager} => FirewallManager,
+      %r{arn:.+:securityhub:.+:.*:product/aws/securityhub} => SecurityHub,
+      %r{arn:.+:securityhub:.+:.*:product/prowler/prowler} => Prowler
+    }.freeze
+
+    def initialize(asff_json, securityhub_standards_json_array: nil, meta: nil)
+      @meta = meta
+
+      @supporting_docs = {}
+      @supporting_docs[SecurityHub] = SecurityHub.supporting_docs({ standards: securityhub_standards_json_array })
+
+      begin
+        asff_required_keys = %w{AwsAccountId CreatedAt Description GeneratorId Id ProductArn Resources SchemaVersion Severity Title Types UpdatedAt}
+        @report = JSON.parse(asff_json)
+        if @report.length == 1 && @report.member?('Findings') && @report['Findings'].each { |finding| asff_required_keys.to_set.difference(finding.keys.to_set).none? }.all?
+          # ideal case that is spec compliant
+          # might need to ensure that the file is utf-8 encoded and remove a BOM if one exists
+        elsif asff_required_keys.to_set.difference(@report.keys.to_set).none?
+          # individual finding so have to add wrapping array
+          @report = { 'Findings' => [@report] }
+        else
+          raise 'Not a findings file nor an individual finding'
+        end
+      rescue StandardError => e
+        raise "Invalid ASFF file provided:\nException: #{e}"
+      end
+
+      @coder = HTMLEntities.new
+    end
+
+    def encode(string)
+      @coder.encode(string, :basic, :named, :decimal)
+    end
+
+    def external_product_handler(product, data, func, default)
+      if (product.is_a?(Regexp) || (arn = PRODUCT_ARN_MAPPING.keys.find { |a| product.match(a) })) && PRODUCT_ARN_MAPPING.key?(arn || product) && PRODUCT_ARN_MAPPING[arn || product].respond_to?(func)
+        keywords = { encode: method(:encode) }
+        keywords = keywords.merge(@supporting_docs[PRODUCT_ARN_MAPPING[arn || product]]) if @supporting_docs.member?(PRODUCT_ARN_MAPPING[arn || product])
+        PRODUCT_ARN_MAPPING[arn || product].send(func, data, **keywords)
+      elsif default.is_a? Proc
+        default.call
+      else
+        default
+      end
+    end
+
+    def nist_tag(finding)
+      tags = external_product_handler(finding['ProductArn'], finding, :finding_nist_tag, {})
+      tags.empty? ? DEFAULT_NIST_TAG : tags
+    end
+
+    def impact(finding)
+      # there can be findings listed that are intentionally ignored due to the underlying control being superceded by a control from a different standard
+      if finding.member?('Workflow') && finding['Workflow'].member?('Status') && finding['Workflow']['Status'] == 'SUPPRESSED'
+        imp = :INFORMATIONAL
+      else
+        # severity is required, but can be either 'label' or 'normalized' internally with 'label' being preferred. other values can be in here too such as the original severity rating.
+        default = proc { finding['Severity'].key?('Label') ? finding['Severity']['Label'].to_sym : finding['Severity']['Normalized']/100.0 }
+        imp = external_product_handler(finding['ProductArn'], finding, :finding_impact, default)
+      end
+      imp.is_a?(Symbol) ? IMPACT_MAPPING[imp] : imp
+    end
+
+    def desc_tags(data, label)
+      { data: data || NA_STRING, label: label || NA_STRING }
+    end
+
+    def subfindings(finding)
+      subfinding = {}
+
+      statusreason = finding['Compliance']['StatusReasons'].map { |reason| reason.flatten.map { |string| encode(string) } }.flatten.join("\n") if finding.key?('Compliance') && finding['Compliance'].key?('StatusReasons')
+      if finding.key?('Compliance') && finding['Compliance'].key?('Status')
+        case finding['Compliance']['Status']
+        when 'PASSED'
+          subfinding['status'] = 'passed'
+          subfinding['message'] = statusreason if statusreason
+        when 'WARNING'
+          subfinding['status'] = 'skipped'
+          subfinding['skip_message'] = statusreason if statusreason
+        when 'FAILED'
+          subfinding['status'] = 'failed'
+          subfinding['message'] = statusreason if statusreason
+        when 'NOT_AVAILABLE'
+          # primary meaning is that the check could not be performed due to a service outage or API error, but it's also overloaded to mean NOT_APPLICABLE so technically 'skipped' or 'error' could be applicable, but AWS seems to do the equivalent of skipped
+          subfinding['status'] = 'skipped'
+          subfinding['skip_message'] = statusreason if statusreason
+        else
+          subfinding['status'] = 'error' # not a valid value for the status enum
+          subfinding['message'] = statusreason if statusreason
+        end
+      else
+        subfinding['status'] = 'skipped' # if no compliance status is provided which is a weird but possible case, then skip
+        subfinding['skip_message'] = statusreason if statusreason
+      end
+
+      subfinding['code_desc'] = external_product_handler(finding['ProductArn'], finding, :subfindings_code_desc, '')
+      subfinding['code_desc'] += '; ' unless subfinding['code_desc'].empty?
+      subfinding['code_desc'] += "Resources: [#{finding['Resources'].map { |r| "Type: #{encode(r['Type'])}, Id: #{encode(r['Id'])}#{", Partition: #{encode(r['Partition'])}" if r.key?('Partition')}#{", Region: #{encode(r['Region'])}" if r.key?('Region')}" }.join(', ')}]"
+
+      subfinding['start_time'] = finding.key?('LastObservedAt') ? finding['LastObservedAt'] : finding['UpdatedAt']
+
+      [subfinding]
+    end
+
+    def to_hdf
+      product_groups = {}
+      @report['Findings'].each do |finding|
+        printf("\rProcessing: %s", $spinner.next)
+
+        external = method(:external_product_handler).curry(4)[finding['ProductArn']][finding]
+
+        # group subfindings by asff productarn and then hdf id
+        item = {}
+        item['id'] = external[:finding_id][encode(finding['GeneratorId'])]
+
+        item['title'] = external[:finding_title][encode(finding['Title'])]
+
+        item['tags'] = { nist: nist_tag(finding) }
+
+        item['impact'] = impact(finding)
+
+        item['desc'] = encode(finding['Description'])
+
+        item['descriptions'] = []
+        item['descriptions'] << desc_tags(finding['Remediation']['Recommendation'].map { |_k, v| encode(v) }.join("\n"), 'fix') if finding.key?('Remediation') && finding['Remediation'].key?('Recommendation')
+
+        item['refs'] = []
+        item['refs'] << { url: finding['SourceUrl'] } if finding.key?('SourceUrl')
+
+        item['source_location'] = NA_HASH
+
+        item['results'] = subfindings(finding)
+
+        arn = PRODUCT_ARN_MAPPING.keys.find { |a| finding['ProductArn'].match(a) }
+        if arn.nil?
+          product_info = finding['ProductArn'].split(':')[-1]
+          arn = Regexp.new "arn:.+:securityhub:.+:.*:product/#{product_info.split('/')[1]}/#{product_info.split('/')[2]}"
+        end
+        product_groups[arn] = {} if product_groups[arn].nil?
+        product_groups[arn][item['id']] = [] if product_groups[arn][item['id']].nil?
+        product_groups[arn][item['id']] << [item, finding]
+      end
+
+      controls = []
+      product_groups.each do |product, id_groups|
+        id_groups.each do |id, data|
+          printf("\rProcessing: %s", $spinner.next)
+
+          external = method(:external_product_handler).curry(4)[product]
+
+          group = data.map { |d| d[0] }
+          findings = data.map { |d| d[1] }
+
+          product_info = findings[0]['ProductArn'].split(':')[-1].split('/')
+          product_name = external[findings][:product_name][encode("#{product_info[1]}/#{product_info[2]}")]
+
+          item = {}
+          # add product name to id if any ids are the same across products
+          item['id'] = product_groups.reject { |pg| pg == product }.values.any? { |ig| ig.keys.include?(id) } ? "[#{product_name}] #{id}" : id
+
+          item['title'] = "#{product_name}: #{group.map { |d| d['title'] }.uniq.join(';')}"
+
+          item['tags'] = { nist: group.map { |d| d['tags'][:nist] }.flatten.uniq }
+
+          item['impact'] = group.map { |d| d['impact'] }.max
+
+          item['desc'] = external[group][:desc][group.map { |d| d['desc'] }.uniq.join("\n")]
+
+          item['descriptions'] = group.map { |d| d['descriptions'] }.flatten.compact.reject(&:empty?).uniq
+
+          item['refs'] = group.map { |d| d['refs'] }.flatten.compact.reject(&:empty?).uniq
+
+          item['source_location'] = NA_HASH
+          item['code'] = JSON.pretty_generate({ Findings: findings })
+
+          item['results'] = group.map { |d| d['results'] }.flatten.uniq
+
+          controls << item
+        end
+      end
+
+      results = HeimdallDataFormat.new(profile_name: @meta&.key?('name') ? @meta['name'] : 'AWS Security Finding Format',
+                                       title: @meta&.key?('title') ? @meta['title'] : 'ASFF findings',
+                                       controls: controls)
+      results.to_hdf
+    end
+  end
+end
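`external_product_handler` above dispatches through `PRODUCT_ARN_MAPPING`, whose keys are regexes matched against a finding's `ProductArn`, falling back to a default when no product-specific handler applies. A reduced sketch of that lookup; symbols stand in for the handler classes and the ARNs are illustrative:

```ruby
# Regex-keyed dispatch: pick the first pattern the ProductArn matches,
# otherwise fall back to a default handler.
ARN_HANDLERS = {
  %r{arn:.+:securityhub:.+:.*:product/aws/securityhub} => :security_hub,
  %r{arn:.+:securityhub:.+:.*:product/prowler/prowler} => :prowler
}.freeze

def handler_for(product_arn, default: :generic)
  key = ARN_HANDLERS.keys.find { |pattern| product_arn.match(pattern) }
  key.nil? ? default : ARN_HANDLERS[key]
end

p handler_for('arn:aws:securityhub:us-east-1::product/aws/securityhub')
# => :security_hub
p handler_for('arn:aws:securityhub:us-east-1:123456789012:product/acme/scanner')
# => :generic
```

Using regex keys rather than exact strings lets one entry cover any partition, region, and account id in the ARN.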
@@ -8,7 +8,7 @@ RESOURCE_DIR = Pathname.new(__FILE__).join('../../data')
8
8
  AWS_CONFIG_MAPPING_FILE = File.join(RESOURCE_DIR, 'aws-config-mapping.csv')
9
9
 
10
10
  NOT_APPLICABLE_MSG = 'No AWS resources found to evaluate complaince for this rule'.freeze
11
- INSUFFICIENT_DATA_MSG = 'Not enough data has been collectd to determine compliance yet.'.freeze
11
+ INSUFFICIENT_DATA_MSG = 'Not enough data has been collected to determine compliance yet.'.freeze
12
12
 
13
13
  ##
14
14
  # HDF mapper for use with AWS Config rules.
@@ -70,7 +70,7 @@ module HeimdallTools
70
70
  option :output_prefix, required: true, aliases: '-o'
71
71
  def snyk_mapper
72
72
  hdfs = HeimdallTools::SnykMapper.new(File.read(options[:json]), options[:name]).to_hdf
73
- puts "\r\HDF Generated:\n"
73
+ puts "\rHDF Generated:\n"
74
74
  hdfs.each_key do |host|
75
75
  File.write("#{options[:output_prefix]}-#{host}.json", hdfs[host])
76
76
  puts "#{options[:output_prefix]}-#{host}.json"
@@ -84,7 +84,7 @@ module HeimdallTools
84
84
  def nikto_mapper
85
85
  hdf = HeimdallTools::NiktoMapper.new(File.read(options[:json])).to_hdf
86
86
  File.write(options[:output], hdf)
87
- puts "\r\HDF Generated:\n"
87
+ puts "\rHDF Generated:\n"
88
88
  puts options[:output].to_s
89
89
  end
90
90
 
@@ -95,7 +95,7 @@ module HeimdallTools
95
95
  def jfrog_xray_mapper
96
96
  hdf = HeimdallTools::JfrogXrayMapper.new(File.read(options[:json])).to_hdf
97
97
  File.write(options[:output], hdf)
98
- puts "\r\HDF Generated:\n"
98
+ puts "\rHDF Generated:\n"
99
99
  puts options[:output].to_s
100
100
  end
101
101
 
@@ -106,7 +106,7 @@ module HeimdallTools
  def dbprotect_mapper
  hdf = HeimdallTools::DBProtectMapper.new(File.read(options[:xml])).to_hdf
  File.write(options[:output], hdf)
- puts "\r\HDF Generated:\n"
+ puts "\rHDF Generated:\n"
  puts options[:output].to_s
  end
 
@@ -117,7 +117,7 @@ module HeimdallTools
  def aws_config_mapper
  hdf = HeimdallTools::AwsConfigMapper.new(options[:custom_mapping]).to_hdf
  File.write(options[:output], hdf)
- puts "\r\HDF Generated:\n"
+ puts "\rHDF Generated:\n"
  puts options[:output].to_s
  end
 
@@ -128,7 +128,7 @@ module HeimdallTools
  def netsparker_mapper
  hdf = HeimdallTools::NetsparkerMapper.new(File.read(options[:xml])).to_hdf
  File.write(options[:output], hdf)
- puts "\r\HDF Generated:\n"
+ puts "\rHDF Generated:\n"
  puts options[:output].to_s
  end
 
@@ -140,7 +140,7 @@ module HeimdallTools
  def sarif_mapper
  hdf = HeimdallTools::SarifMapper.new(File.read(options[:json])).to_hdf
  File.write(options[:output], hdf)
- puts "\r\HDF Generated:\n"
+ puts "\rHDF Generated:\n"
  puts options[:output].to_s
  end
 
@@ -155,6 +155,29 @@ module HeimdallTools
  puts options[:output].to_s
  end
 
+ desc 'asff_mapper', 'asff_mapper translates AWS Security Finding Format results from JSON to HDF-formatted JSON so as to be viewable on Heimdall'
+ long_desc Help.text(:asff_mapper)
+ option :json, required: true, banner: 'ASFF-FINDING-JSON', aliases: ['-i', '--input', '-j']
+ option :securityhub_standards, required: false, type: :array, banner: 'ASFF-SECURITYHUB-STANDARDS-JSON', aliases: ['--sh', '--input-securityhub-standards']
+ option :output, required: true, banner: 'HDF-SCAN-RESULTS-JSON', aliases: '-o'
+ def asff_mapper
+ hdf = HeimdallTools::ASFFMapper.new(File.read(options[:json]), securityhub_standards_json_array: options[:securityhub_standards].nil? ? nil : options[:securityhub_standards].map { |filename| File.read(filename) }).to_hdf
+ File.write(options[:output], hdf)
+ puts "\rHDF Generated:\n"
+ puts options[:output].to_s
+ end
+
+ desc 'prowler_mapper', 'prowler_mapper translates Prowler-derived AWS Security Finding Format results from concatenated JSON blobs to HDF-formatted JSON so as to be viewable on Heimdall'
+ long_desc Help.text(:prowler_mapper)
+ option :json, required: true, banner: 'PROWLER-ASFF-JSON', aliases: ['-i', '--input', '-j']
+ option :output, required: true, banner: 'HDF-SCAN-RESULTS-JSON', aliases: '-o'
+ def prowler_mapper
+ hdf = HeimdallTools::ProwlerMapper.new(File.read(options[:json])).to_hdf
+ File.write(options[:output], hdf)
+ puts "\rHDF Generated:\n"
+ puts options[:output].to_s
+ end
+
  desc 'version', 'prints version'
  def version
  puts VERSION
@@ -55,15 +55,13 @@ module HeimdallTools
  findings.uniq
  end
 
- # rubocop:disable Layout/LineEndStringConcatenationIndentation
  def snippet(snippetid)
  snippet = @snippets.select { |x| x['id'].eql?(snippetid) }.first
  "\nPath: #{snippet['File']}\n" \
- "StartLine: #{snippet['StartLine']}, " \
- "EndLine: #{snippet['EndLine']}\n" \
- "Code:\n#{snippet['Text']['#cdata-section'].strip}" \
+ "StartLine: #{snippet['StartLine']}, " \
+ "EndLine: #{snippet['EndLine']}\n" \
+ "Code:\n#{snippet['Text']['#cdata-section'].strip}" \
  end
- # rubocop:enable Layout/LineEndStringConcatenationIndentation
 
  def nist_tag(rule)
  references = rule['References']['Reference']
@@ -0,0 +1,6 @@
+ asff_mapper translates AWS Security Finding Format results from JSON to HDF-formatted JSON so as to be viewable on Heimdall
+
+ Examples:
+
+ heimdall_tools asff_mapper -i <asff-finding-json> -o <hdf-scan-results-json>
+ heimdall_tools asff_mapper -i <asff-finding-json> --sh <standard-1-json> ... <standard-n-json> -o <hdf-scan-results-json>
@@ -0,0 +1,5 @@
+ prowler_mapper translates Prowler-derived AWS Security Finding Format results from concatenated JSON blobs to HDF-formatted JSON so as to be viewable on Heimdall
+
+ Examples:
+
+ heimdall_tools prowler_mapper -i <prowler-asff-json> -o <hdf-scan-results-json>
@@ -92,11 +92,17 @@ module HeimdallTools
 
  def finding(issue, timestamp)
  finding = {}
- # if compliance-result field, this is a policy compliance result entry
- # nessus policy compliance result provides a pass/fail data
- # For non policy compliance results are defaulted to failed
  if issue['compliance-result']
- finding['status'] = issue['compliance-result'].eql?('PASSED') ? 'passed' : 'failed'
+ case issue['compliance-result']
+ when 'PASSED'
+ finding['status'] = 'passed'
+ when 'ERROR'
+ finding['status'] = 'error'
+ when 'WARNING'
+ finding['status'] = 'skipped'
+ else
+ finding['status'] = 'failed'
+ end
  else
  finding['status'] = 'failed'
  end
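The new `case` above widens the Nessus compliance-result handling from a pass/fail ternary to four HDF statuses. Extracted as a standalone function for illustration (the name `compliance_status` is ours, not the mapper's):

```ruby
# Illustrative extraction of the status mapping introduced in the hunk above:
# 'PASSED' -> 'passed', 'ERROR' -> 'error', 'WARNING' -> 'skipped',
# and anything else (including nil) -> 'failed'.
def compliance_status(compliance_result)
  case compliance_result
  when 'PASSED'  then 'passed'
  when 'ERROR'   then 'error'
  when 'WARNING' then 'skipped'
  else 'failed'
  end
end

puts compliance_status('WARNING')  # => skipped
```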
@@ -0,0 +1,8 @@
+ module HeimdallTools
+ class ProwlerMapper < ASFFMapper
+ def initialize(prowler_asff_json)
+ # comes as an asff-json file which is basically all the findings concatenated into one file instead of putting it in the proper wrapper data structure
+ super("{ \"Findings\": [#{prowler_asff_json.split("\n").join(',')}]}", meta: { 'name' => 'Prowler', 'title' => 'Prowler findings' })
+ end
+ end
+ end
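Prowler emits one ASFF finding object per line rather than the `{"Findings": [...]}` wrapper that ASFF normally uses; the constructor above joins those lines into a valid wrapper document before handing it to `ASFFMapper`. A standalone sketch of that transformation (the function name is ours for illustration):

```ruby
require 'json'

# Wrap newline-delimited ASFF finding objects (as Prowler emits them)
# into the standard { "Findings": [...] } ASFF wrapper, mirroring the
# string interpolation in the ProwlerMapper constructor above.
def wrap_prowler_findings(prowler_asff_json)
  "{ \"Findings\": [#{prowler_asff_json.split("\n").join(',')}]}"
end

raw = "{\"Id\": \"finding-1\"}\n{\"Id\": \"finding-2\"}"
wrapped = JSON.parse(wrap_prowler_findings(raw))
puts wrapped['Findings'].length  # => 2
```

Note that this simple join assumes each finding occupies exactly one line; pretty-printed JSON would break it.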
@@ -19,4 +19,6 @@ module HeimdallTools
  autoload :SarifMapper, 'heimdall_tools/sarif_mapper'
  autoload :ScoutSuiteMapper, 'heimdall_tools/scoutsuite_mapper'
  autoload :XCCDFResultsMapper, 'heimdall_tools/xccdf_results_mapper'
+ autoload :ASFFMapper, 'heimdall_tools/asff_mapper'
+ autoload :ProwlerMapper, 'heimdall_tools/prowler_mapper'
  end
metadata CHANGED
@@ -1,16 +1,17 @@
  --- !ruby/object:Gem::Specification
  name: heimdall_tools
  version: !ruby/object:Gem::Version
- version: 1.3.48
+ version: 1.3.49
  platform: ruby
  authors:
  - Robert Thew
  - Rony Xavier
+ - Amndeep Singh Mann
  - Aaron Lippold
  autorequire:
  bindir: exe
  cert_chain: []
- date: 2021-06-29 00:00:00.000000000 Z
+ date: 2021-11-18 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: aws-sdk-configservice
@@ -26,6 +27,20 @@ dependencies:
  - - "~>"
  - !ruby/object:Gem::Version
  version: '1'
+ - !ruby/object:Gem::Dependency
+ name: aws-sdk-securityhub
+ requirement: !ruby/object:Gem::Requirement
+ requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: '1'
+ type: :runtime
+ prerelease: false
+ version_requirements: !ruby/object:Gem::Requirement
+ requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: '1'
  - !ruby/object:Gem::Dependency
  name: csv
  requirement: !ruby/object:Gem::Requirement
@@ -54,6 +69,20 @@ dependencies:
  - - ">="
  - !ruby/object:Gem::Version
  version: 0.17.2
+ - !ruby/object:Gem::Dependency
+ name: htmlentities
+ requirement: !ruby/object:Gem::Requirement
+ requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: 4.3.4
+ type: :runtime
+ prerelease: false
+ version_requirements: !ruby/object:Gem::Requirement
+ requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: 4.3.4
  - !ruby/object:Gem::Dependency
  name: httparty
  requirement: !ruby/object:Gem::Requirement
@@ -216,6 +245,10 @@ files:
  - lib/data/owasp-nist-mapping.csv
  - lib/data/scoutsuite-nist-mapping.csv
  - lib/heimdall_tools.rb
+ - lib/heimdall_tools/asff_compatible_products/firewall_manager.rb
+ - lib/heimdall_tools/asff_compatible_products/prowler.rb
+ - lib/heimdall_tools/asff_compatible_products/securityhub.rb
+ - lib/heimdall_tools/asff_mapper.rb
  - lib/heimdall_tools/aws_config_mapper.rb
  - lib/heimdall_tools/burpsuite_mapper.rb
  - lib/heimdall_tools/cli.rb
@@ -224,6 +257,7 @@ files:
  - lib/heimdall_tools/fortify_mapper.rb
  - lib/heimdall_tools/hdf.rb
  - lib/heimdall_tools/help.rb
+ - lib/heimdall_tools/help/asff_mapper.md
  - lib/heimdall_tools/help/aws_config_mapper.md
  - lib/heimdall_tools/help/burpsuite_mapper.md
  - lib/heimdall_tools/help/dbprotect_mapper.md
@@ -232,6 +266,7 @@ files:
  - lib/heimdall_tools/help/nessus_mapper.md
  - lib/heimdall_tools/help/netsparker_mapper.md
  - lib/heimdall_tools/help/nikto_mapper.md
+ - lib/heimdall_tools/help/prowler_mapper.md
  - lib/heimdall_tools/help/sarif_mapper.md
  - lib/heimdall_tools/help/scoutsuite_mapper.md
  - lib/heimdall_tools/help/snyk_mapper.md
@@ -241,6 +276,7 @@ files:
  - lib/heimdall_tools/nessus_mapper.rb
  - lib/heimdall_tools/netsparker_mapper.rb
  - lib/heimdall_tools/nikto_mapper.rb
+ - lib/heimdall_tools/prowler_mapper.rb
  - lib/heimdall_tools/sarif_mapper.rb
  - lib/heimdall_tools/scoutsuite_mapper.rb
  - lib/heimdall_tools/snyk_mapper.rb
@@ -268,8 +304,8 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  - !ruby/object:Gem::Version
  version: '0'
  requirements: []
- rubygems_version: 3.2.15
+ rubygems_version: 3.2.22
  signing_key:
  specification_version: 4
- summary: Convert Forify, Openzap and Sonarqube results to HDF
+ summary: Convert a variety of security product results to HDF
  test_files: []