heimdall_tools 1.3.42 → 1.3.47

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 938eedc5f080d2ffca3623fd3f25c8996244009cc65aa4b78981ee8310063052
- data.tar.gz: '0588206839c8f454134354721e3d850859e59c0fab8d964c4eb40a9987930443'
+ metadata.gz: 0c1758366de13bb1966a1ffeccbab60c7b1c2d91c221d5ee604dd2d4f1c96dcb
+ data.tar.gz: 2ba1491cb04b569b216c966724b9b8e4f06430a3dc7913ce8bf5e0611ed6896a
  SHA512:
- metadata.gz: f6b1a7071a33b8d54d2336ce715c67bff26fb1a02306612cb63d752553cd9758ce05274b37c62c4887a0746772c100772fba9d38abded9099ed14862726d5f90
- data.tar.gz: 1c752004ea7dde9a7695bb68d4e4a71839821afdf4032e6561bf0d119b2ee108dc131ace3b78382cab517afdfa19699f69a34b155faa546f4effa0c9fb658933
+ metadata.gz: 38d66cf5d8653a1a859b7254b9b2ba05518ce5c750b73fb6e638b24fca08ea574aaf1e89bbf560f98fa5cdd7d56d49ca5dedf23b20845d56dd9ecb8c41210ce1
+ data.tar.gz: b9db7f5c78640e37795493f36e2b119e4c15d5184d3b830e57bf92ad41709a0fa8345946e249afef7f1adb0ba02a89183c756ce8e81eeafb5cd766232518b197
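The digests above can be reproduced from the downloaded gem artifacts. A minimal Ruby sketch (the local file path is an assumption, not part of this diff):

```ruby
require 'digest'

# Compute the digests recorded in checksums.yaml for a gem artifact.
# 'path' is whatever local copy of metadata.gz or data.tar.gz you have.
def artifact_digests(path)
  bytes = File.read(path)
  {
    'SHA256' => Digest::SHA256.hexdigest(bytes),
    'SHA512' => Digest::SHA512.hexdigest(bytes)
  }
end
```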
data/README.md CHANGED
@@ -9,13 +9,30 @@ HeimdallTools supplies several methods to convert output from various tools to "
  - **fortify_mapper** - commercial static code analysis tool
  - **zap_mapper** - OWASP ZAP - open-source dynamic code analysis tool
  - **burpsuite_mapper** - commercial dynamic analysis tool
- - **nessus_mapper** - commercial vulnerability scanner
+ - **nessus_mapper** - commercial security scanner (supports compliance and vulnerability scans from Tenable.sc and Tenable.io)
  - **snyk_mapper** - commercial package vulnerability scanner
  - **nikto_mapper** - open-source web server scanner
  - **jfrog_xray_mapper** - package vulnerability scanner
  - **dbprotect_mapper** - database vulnerability scanner
  - **aws_config_mapper** - assess, audit, and evaluate AWS resources
  - **netsparker_mapper** - web application security scanner
+ - **sarif_mapper** - static analysis results interchange format
+ - **scoutsuite_mapper** - multi-cloud security auditing tool
+
+ ## Want to recommend a mapper for another tool? Please use these steps:
+ 1. Create an [issue](https://github.com/mitre/heimdall_tools/issues/new), and email saf@groups.mitre.org citing the issue link so we can help
+ 2. Provide a sample output, preferably the most detailed the tool can provide, and preferably in a machine-readable format such as XML, JSON, or CSV - whichever is natively available. If it is sensitive, we'll work that out in step 3. (If it's API-only, we'll also discuss that in step 3.)
+ 3. Let's arrange a time to take a close look at the data the tool provides to get an idea of all it has to offer. We'll suggest an initial mapping of the HDF core elements. (see https://saf.mitre.org/#/normalize)
+ 4. Note: if the tool doesn't provide a NIST SP 800-53 reference, we've worked on mappings to other references such as CWE or OWASP Top 10:
+ https://github.com/mitre/heimdall_tools/tree/master/lib/data
+ https://github.com/mitre/heimdall_tools/blob/master/lib/data/cwe-nist-mapping.csv
+ https://github.com/mitre/heimdall_tools/blob/master/lib/data/owasp-nist-mapping.csv
+ 5. If the tool doesn't provide something for step 4, or another core element such as impact, we'll help you identify a custom mapping approach.
+ 6. We'll help you decide how to preserve any other information (non-core elements) the tool provides so that all of the original tool's intent comes through for the user when the data is viewed in Heimdall.
+ 7. Finally, we'll provide a final peer review and support merging your pull request.
+ We appreciate your contributions, and we're here to help!
+
+ ## How to Install Heimdall Tools:

  Ruby 2.4 or higher (check using "ruby -v")

@@ -136,7 +153,8 @@ example: heimdall_tools burpsuite_mapper -x burpsuite_results.xml -o scan_result

  ## nessus_mapper

- nessus_mapper translates a Nessus-exported XML results file into HDF format json to be viewable in Heimdall
+ nessus_mapper translates a Nessus-exported XML results file into HDF format JSON to be viewable in Heimdall.
+ Supports compliance and vulnerability scans from Tenable.sc and Tenable.io.

  Note: A separate HDF JSON file is generated for each host reported in the Nessus Report.

@@ -185,6 +203,22 @@ FLAGS:
  example: heimdall_tools nikto_mapper -j nikto_results.json -o nikto_results.json
  ```

+ ## scoutsuite_mapper
+
+ scoutsuite_mapper translates Scout Suite results from JavaScript to HDF-formatted JSON to be viewable in Heimdall
+
+ Note: Currently this mapper only supports AWS.
+
+ ```
+ USAGE: heimdall_tools scoutsuite_mapper -i <scoutsuite-results-js> -o <hdf-scan-results-json>
+
+ FLAGS:
+ -i --input -j --javascript <scoutsuite-results-js> : path to Scout Suite results JavaScript file.
+ -o --output <hdf-scan-results-json> : path to output scan-results JSON.
+
+ example: heimdall_tools scoutsuite_mapper -i scoutsuite_results.js -o scoutsuite_hdf.json
+ ```
+
  ## jfrog_xray_mapper

  jfrog_xray_mapper translates a JFrog Xray results JSON file into HDF format JSON to be viewable in Heimdall
@@ -252,6 +286,21 @@ FLAGS:
  example: heimdall_tools netsparker_mapper -x netsparker_results.xml -o netsparker_hdf.json
  ```

+ ## sarif_mapper
+
+ sarif_mapper translates a SARIF JSON file into HDF format JSON to be viewable in Heimdall
+
+ ```
+ USAGE: heimdall_tools sarif_mapper [OPTIONS] -j <sarif-results-json> -o <hdf-scan-results.json>
+
+ FLAGS:
+ -j <sarif_results_json> : path to SARIF results JSON file.
+ -o --output_prefix <prefix> : path to output scan-results json.
+ -V --verbose : verbose run [optional].
+
+ example: heimdall_tools sarif_mapper -j sarif_results.json -o sarif_results_hdf.json
+ ```
+
  ## version

  Prints out the gem version
@@ -0,0 +1,140 @@
+ rule,nistid
+ acm-certificate-with-close-expiration-date,SC-12
+ acm-certificate-with-transparency-logging-disabled,SC-12
+ cloudformation-stack-with-role,AC-6
+ cloudtrail-duplicated-global-services-logging,AU-6
+ cloudtrail-no-cloudwatch-integration,AU-12|SI-4(2)
+ cloudtrail-no-data-logging,AU-12
+ cloudtrail-no-encryption-with-kms,AU-6
+ cloudtrail-no-global-services-logging,AU-12
+ cloudtrail-no-log-file-validation,AU-6
+ cloudtrail-no-logging,AU-12
+ cloudtrail-not-configured,AU-12
+ cloudwatch-alarm-without-actions,AU-12
+ config-recorder-not-configured,CM-8|CM-8(2)|CM-8(6)
+ ec2-ami-public,AC-3
+ ec2-default-security-group-in-use,AC-3(3)
+ ec2-default-security-group-with-rules,AC-3(3)
+ ec2-ebs-snapshot-not-encrypted,SC-28
+ ec2-ebs-snapshot-public,AC-3
+ ec2-ebs-volume-not-encrypted,SC-28
+ ec2-instance-in-security-group,CM-7(1)
+ ec2-instance-type,CM-2
+ ec2-instance-types,CM-2
+ ec2-instance-with-public-ip,AC-3
+ ec2-instance-with-user-data-secrets,AC-3
+ ec2-security-group-opens-all-ports,CM-7(1)
+ ec2-security-group-opens-all-ports-to-all,CM-7(1)
+ ec2-security-group-opens-all-ports-to-self,CM-7(1)
+ ec2-security-group-opens-icmp-to-all,CM-7(1)
+ ec2-security-group-opens-known-port-to-all,CM-7(1)
+ ec2-security-group-opens-plaintext-port,CM-7(1)
+ ec2-security-group-opens-port-range,CM-7(1)
+ ec2-security-group-opens-port-to-all,CM-7(1)
+ ec2-security-group-whitelists-aws,CM-7(1)
+ ec2-security-group-whitelists-aws-ip-from-banned-region,CM-7(1)
+ ec2-security-group-whitelists-non-elastic-ips,CM-7(1)
+ ec2-security-group-whitelists-unknown-aws,CM-7(1)
+ ec2-security-group-whitelists-unknown-cidrs,CM-7(1)
+ ec2-unused-security-group,CM-7(1)
+ elb-listener-allowing-cleartext,SC-8
+ elb-no-access-logs,AU-12
+ elb-older-ssl-policy,SC-8
+ elbv2-http-request-smuggling,SC-8
+ elbv2-listener-allowing-cleartext,SC-8
+ elbv2-no-access-logs,AU-12
+ elbv2-no-deletion-protection,SI-7
+ elbv2-older-ssl-policy,SC-8
+ iam-assume-role-lacks-external-id-and-mfa,AC-17
+ iam-assume-role-no-mfa,AC-6
+ iam-assume-role-policy-allows-all,AC-6
+ iam-ec2-role-without-instances,AC-6
+ iam-group-with-inline-policies,AC-6
+ iam-group-with-no-users,AC-6
+ iam-human-user-with-policies,AC-6
+ iam-inline-policy-allows-non-sts-action,AC-6
+ iam-inline-policy-allows-NotActions,AC-6
+ iam-inline-policy-for-role,AC-6
+ iam-managed-policy-allows-full-privileges,AC-6
+ iam-managed-policy-allows-non-sts-action,AC-6
+ iam-managed-policy-allows-NotActions,AC-6
+ iam-managed-policy-for-role,AC-6
+ iam-managed-policy-no-attachments,AC-6
+ iam-no-support-role,IR-7
+ iam-password-policy-expiration-threshold,AC-2
+ iam-password-policy-minimum-length,AC-2
+ iam-password-policy-no-expiration,AC-2
+ iam-password-policy-no-lowercase-required,AC-2
+ iam-password-policy-no-number-required,AC-2
+ iam-password-policy-no-symbol-required,AC-2
+ iam-password-policy-no-uppercase-required,AC-2
+ iam-password-policy-reuse-enabled,IA-5(1)
+ iam-role-with-inline-policies,AC-6
+ iam-root-account-no-hardware-mfa,IA-2(1)
+ iam-root-account-no-mfa,IA-2(1)
+ iam-root-account-used-recently,AC-6(9)
+ iam-root-account-with-active-certs,AC-6(9)
+ iam-root-account-with-active-keys,AC-6(9)
+ iam-service-user-with-password,AC-2
+ iam-unused-credentials-not-disabled,AC-2
+ iam-user-no-key-rotation,AC-2
+ iam-user-not-in-category-group,AC-2
+ iam-user-not-in-common-group,AC-2
+ iam-user-unused-access-key-initial-setup,AC-2
+ iam-user-with-multiple-access-keys,IA-2
+ iam-user-without-mfa,IA-2(1)
+ iam-user-with-password-and-key,IA-2
+ iam-user-with-policies,AC-2
+ kms-cmk-rotation-disabled,SC-12
+ logs-no-alarm-aws-configuration-changes,CM-8|CM-8(2)|CM-8(6)
+ logs-no-alarm-cloudtrail-configuration-changes,AU-6
+ logs-no-alarm-cmk-deletion,AC-2
+ logs-no-alarm-console-authentication-failures,AC-2
+ logs-no-alarm-iam-policy-changes,AC-2
+ logs-no-alarm-nacl-changes,CM-6(2)
+ logs-no-alarm-network-gateways-changes,AU-12|CM-6(2)
+ logs-no-alarm-root-usage,AU-2
+ logs-no-alarm-route-table-changes,AU-12|CM-6(2)
+ logs-no-alarm-s3-policy-changes,AC-6|AU-12
+ logs-no-alarm-security-group-changes,AC-2(4)
+ logs-no-alarm-signin-without-mfa,AC-2
+ logs-no-alarm-unauthorized-api-calls,AU-6|SI-4(2)
+ logs-no-alarm-vpc-changes,CM-6(1)
+ rds-instance-backup-disabled,CP-9
+ rds-instance-ca-certificate-deprecated,SC-12
+ rds-instance-no-minor-upgrade,SI-2
+ rds-instance-short-backup-retention-period,CP-9
+ rds-instance-single-az,CP-7
+ rds-instance-storage-not-encrypted,SC-28
+ rds-postgres-instance-with-invalid-certificate,SC-12
+ rds-security-group-allows-all,CM-7(1)
+ rds-snapshot-public,SC-28
+ redshift-cluster-database-not-encrypted,SC-28
+ redshift-cluster-no-version-upgrade,SI-2
+ redshift-cluster-publicly-accessible,AC-3
+ redshift-parameter-group-logging-disabled,AU-12
+ redshift-parameter-group-ssl-not-required,SC-8
+ redshift-security-group-whitelists-all,CM-7(1)
+ route53-domain-no-autorenew,SC-2
+ route53-domain-no-transferlock,SC-2
+ route53-domain-transferlock-not-authorized,SC-2
+ s3-bucket-allowing-cleartext,SC-28
+ s3-bucket-no-default-encryption,SC-28
+ s3-bucket-no-logging,AU-2|AU-12
+ s3-bucket-no-mfa-delete,SI-7
+ s3-bucket-no-versioning,SI-7
+ s3-bucket-world-acl,AC-3(3)
+ s3-bucket-world-policy-arg,AC-3(3)
+ s3-bucket-world-policy-star,AC-3(3)
+ ses-identity-dkim-not-enabled,SC-23
+ ses-identity-dkim-not-verified,SC-23
+ ses-identity-world-policy,AC-6
+ sns-topic-world-policy,AC-6
+ sqs-queue-world-policy,AC-6
+ vpc-custom-network-acls-allow-all,SC-7
+ vpc-default-network-acls-allow-all,SC-7
+ vpc-network-acl-not-used,SC-7
+ vpc-routing-tables-with-peering,AC-3(3)
+ vpc-subnet-with-bad-acls,SC-7
+ vpc-subnet-with-default-acls,SC-7
+ vpc-subnet-without-flow-log,AU-12
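The mapping file above pairs one Scout Suite rule per row with its NIST SP 800-53 control(s), joining multiple controls with `|`. A minimal sketch of reading it (the helper name is hypothetical, not the gem's API):

```ruby
require 'csv'

# Look up the NIST control tags for a Scout Suite rule in a
# rule,nistid CSV; '|'-separated IDs become an array of controls.
def nist_tags_for(csv_text, rule)
  row = CSV.parse(csv_text, headers: true).find { |r| r['rule'] == rule }
  row ? row['nistid'].split('|') : []
end
```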
@@ -16,4 +16,6 @@ module HeimdallTools
  autoload :DBProtectMapper, 'heimdall_tools/dbprotect_mapper'
  autoload :AwsConfigMapper, 'heimdall_tools/aws_config_mapper'
  autoload :NetsparkerMapper, 'heimdall_tools/netsparker_mapper'
+ autoload :SarifMapper, 'heimdall_tools/sarif_mapper'
+ autoload :ScoutSuiteMapper, 'heimdall_tools/scoutsuite_mapper'
  end
@@ -18,8 +18,7 @@ INSUFFICIENT_DATA_MSG = 'Not enough data has been collectd to determine complian
  #
  module HeimdallTools
  class AwsConfigMapper
- def initialize(custom_mapping, endpoint = nil, verbose = false)
- @verbose = verbose
+ def initialize(custom_mapping, endpoint = nil)
  @default_mapping = get_rule_mapping(AWS_CONFIG_MAPPING_FILE)
  @custom_mapping = custom_mapping.nil? ? {} : get_rule_mapping(custom_mapping)
  if endpoint.nil?
@@ -38,8 +37,8 @@ module HeimdallTools
  def to_hdf
  controls = @issues.map do |issue|
  @item = {}
- @item['id'] = issue[:config_rule_name]
- @item['title'] = issue[:config_rule_name]
+ @item['id'] = issue[:config_rule_id]
+ @item['title'] = "#{get_account_id(issue[:config_rule_arn])} - #{issue[:config_rule_name]}"
  @item['desc'] = issue[:description]
  @item['impact'] = 0.5
  @item['tags'] = hdf_tags(issue)
@@ -55,18 +54,33 @@ module HeimdallTools
  @item
  end
  end
+
  results = HeimdallDataFormat.new(
  profile_name: 'AWS Config',
- title: 'AWS Config',
- summary: 'AWS Config',
- controls: controls,
- statistics: { aws_config_sdk_version: Aws::ConfigService::GEM_VERSION },
+ title: 'AWS Config',
+ summary: 'AWS Config',
+ controls: controls,
+ statistics: { aws_config_sdk_version: Aws::ConfigService::GEM_VERSION },
  )
  results.to_hdf
  end

  private

+ ##
+ # Gets the account ID from a config rule ARN
+ #
+ # https://docs.aws.amazon.com/general/latest/gr/aws-arns-and-namespaces.html
+ # https://docs.aws.amazon.com/general/latest/gr/acct-identifiers.html
+ #
+ # Params:
+ # - arn: The ARN of the config rule
+ #
+ # Returns: The account ID portion of the ARN
+ def get_account_id(arn)
+ /:(\d{12}):config-rule/.match(arn)&.captures&.first || 'no-account-id'
+ end
+
  ##
  # Read in a config rule -> 800-53 control mapping CSV.
  #
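The new `get_account_id` helper pulls the 12-digit account ID out of a config rule ARN with a regex and falls back to a sentinel when the ARN does not match. A standalone sketch of the same logic (the sample ARN is made up):

```ruby
# Extract the 12-digit AWS account ID from a config rule ARN,
# or return a sentinel when no account ID can be found.
def get_account_id(arn)
  /:(\d{12}):config-rule/.match(arn)&.captures&.first || 'no-account-id'
end

get_account_id('arn:aws:config:us-east-1:123456789012:config-rule/config-rule-abc123')
# => "123456789012"
get_account_id('not-an-arn')
# => "no-account-id"
```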
@@ -263,7 +277,8 @@ module HeimdallTools
  # If no input parameters, then provide an empty JSON array to the JSON
  # parser because passing nil to JSON.parse throws an exception.
  params = (JSON.parse(config_rule[:input_parameters] || '[]').map { |key, value| "#{key}: #{value}" }).join('<br/>')
- check_text = config_rule[:config_rule_arn] || ''
+ check_text = "ARN: #{config_rule[:config_rule_arn] || 'N/A'}"
+ check_text += "<br/>Source Identifier: #{config_rule.dig(:source, :source_identifier) || 'N/A'}"
  check_text += "<br/>#{params}" unless params.empty?
  check_text
  end
@@ -20,9 +20,8 @@ DEFAULT_NIST_TAG = %w{SA-11 RA-5 Rev_4}.freeze

  module HeimdallTools
  class BurpSuiteMapper
- def initialize(burps_xml, _name = nil, verbose = false)
+ def initialize(burps_xml, _name = nil)
  @burps_xml = burps_xml
- @verbose = verbose

  begin
  @cwe_nist_mapping = parse_mapper
@@ -6,7 +6,6 @@ module HeimdallTools
  long_desc Help.text(:fortify_mapper)
  option :fvdl, required: true, aliases: '-f'
  option :output, required: true, aliases: '-o'
- option :verbose, type: :boolean, aliases: '-V'
  def fortify_mapper
  hdf = HeimdallTools::FortifyMapper.new(File.read(options[:fvdl])).to_hdf
  File.write(options[:output], hdf)
@@ -17,7 +16,6 @@ module HeimdallTools
  option :json, required: true, aliases: '-j'
  option :name, required: true, aliases: '-n'
  option :output, required: true, aliases: '-o'
- option :verbose, type: :boolean, aliases: '-V'
  def zap_mapper
  hdf = HeimdallTools::ZapMapper.new(File.read(options[:json]), options[:name]).to_hdf
  File.write(options[:output], hdf)
@@ -29,7 +27,6 @@ module HeimdallTools
  option :api_url, required: true, aliases: '-u'
  option :auth, type: :string, required: false
  option :output, required: true, aliases: '-o'
- option :verbose, type: :boolean, aliases: '-V'
  def sonarqube_mapper
  hdf = HeimdallTools::SonarQubeMapper.new(options[:name], options[:api_url], options[:auth]).to_hdf
  File.write(options[:output], hdf)
@@ -39,7 +36,6 @@ module HeimdallTools
  long_desc Help.text(:burpsuite_mapper)
  option :xml, required: true, aliases: '-x'
  option :output, required: true, aliases: '-o'
- option :verbose, type: :boolean, aliases: '-V'
  def burpsuite_mapper
  hdf = HeimdallTools::BurpSuiteMapper.new(File.read(options[:xml])).to_hdf
  File.write(options[:output], hdf)
@@ -49,7 +45,6 @@ module HeimdallTools
  long_desc Help.text(:nessus_mapper)
  option :xml, required: true, aliases: '-x'
  option :output_prefix, required: true, aliases: '-o'
- option :verbose, type: :boolean, aliases: '-V'
  def nessus_mapper
  hdfs = HeimdallTools::NessusMapper.new(File.read(options[:xml])).to_hdf

@@ -64,7 +59,6 @@ module HeimdallTools
  long_desc Help.text(:snyk_mapper)
  option :json, required: true, aliases: '-j'
  option :output_prefix, required: true, aliases: '-o'
- option :verbose, type: :boolean, aliases: '-V'
  def snyk_mapper
  hdfs = HeimdallTools::SnykMapper.new(File.read(options[:json]), options[:name]).to_hdf
  puts "\r\HDF Generated:\n"
@@ -78,7 +72,6 @@ module HeimdallTools
  long_desc Help.text(:nikto_mapper)
  option :json, required: true, aliases: '-j'
  option :output, required: true, aliases: '-o'
- option :verbose, type: :boolean, aliases: '-V'
  def nikto_mapper
  hdf = HeimdallTools::NiktoMapper.new(File.read(options[:json])).to_hdf
  File.write(options[:output], hdf)
@@ -90,7 +83,6 @@ module HeimdallTools
  long_desc Help.text(:jfrog_xray_mapper)
  option :json, required: true, aliases: '-j'
  option :output, required: true, aliases: '-o'
- option :verbose, type: :boolean, aliases: '-V'
  def jfrog_xray_mapper
  hdf = HeimdallTools::JfrogXrayMapper.new(File.read(options[:json])).to_hdf
  File.write(options[:output], hdf)
@@ -102,7 +94,6 @@ module HeimdallTools
  long_desc Help.text(:dbprotect_mapper)
  option :xml, required: true, aliases: '-x'
  option :output, required: true, aliases: '-o'
- option :verbose, type: :boolean, aliases: '-V'
  def dbprotect_mapper
  hdf = HeimdallTools::DBProtectMapper.new(File.read(options[:xml])).to_hdf
  File.write(options[:output], hdf)
@@ -114,7 +105,6 @@ module HeimdallTools
  long_desc Help.text(:aws_config_mapper)
  # option :custom_mapping, required: false, aliases: '-m'
  option :output, required: true, aliases: '-o'
- option :verbose, type: :boolean, aliases: '-V'
  def aws_config_mapper
  hdf = HeimdallTools::AwsConfigMapper.new(options[:custom_mapping]).to_hdf
  File.write(options[:output], hdf)
@@ -126,7 +116,6 @@ module HeimdallTools
  long_desc Help.text(:netsparker_mapper)
  option :xml, required: true, aliases: '-x'
  option :output, required: true, aliases: '-o'
- option :verbose, type: :boolean, aliases: '-V'
  def netsparker_mapper
  hdf = HeimdallTools::NetsparkerMapper.new(File.read(options[:xml])).to_hdf
  File.write(options[:output], hdf)
@@ -134,6 +123,29 @@ module HeimdallTools
  puts options[:output].to_s
  end

+ desc 'sarif_mapper', 'sarif_mapper translates a SARIF JSON file into HDF format JSON to be viewable in Heimdall'
+ long_desc Help.text(:sarif_mapper)
+ option :json, required: true, aliases: '-j'
+ option :output, required: true, aliases: '-o'
+ option :verbose, type: :boolean, aliases: '-V'
+ def sarif_mapper
+ hdf = HeimdallTools::SarifMapper.new(File.read(options[:json])).to_hdf
+ File.write(options[:output], hdf)
+ puts "\r\HDF Generated:\n"
+ puts options[:output].to_s
+ end
+
+ desc 'scoutsuite_mapper', 'scoutsuite_mapper translates Scout Suite results from Javascript to HDF-formatted JSON so as to be viewable on Heimdall'
+ long_desc Help.text(:scoutsuite_mapper)
+ option :javascript, required: true, banner: 'SCOUTSUITE-RESULTS-JS', aliases: ['-i', '--input', '-j']
+ option :output, required: true, banner: 'HDF-SCAN-RESULTS-JSON', aliases: '-o'
+ def scoutsuite_mapper
+ hdf = HeimdallTools::ScoutSuiteMapper.new(File.read(options[:javascript])).to_hdf
+ File.write(options[:output], hdf)
+ puts "\rHDF Generated:\n"
+ puts options[:output].to_s
+ end
+
  desc 'version', 'prints version'
  def version
  puts VERSION
@@ -12,15 +12,11 @@ IMPACT_MAPPING = {

  module HeimdallTools
  class DBProtectMapper
- def initialize(xml, _name = nil, verbose = false)
- @verbose = verbose
-
- begin
- dataset = xml_to_hash(xml)
- @entries = compile_findings(dataset['dataset'])
- rescue StandardError => e
- raise "Invalid DBProtect XML file provided Exception: #{e};\nNote that XML must be of kind `Check Results Details`."
- end
+ def initialize(xml, _name = nil)
+ dataset = xml_to_hash(xml)
+ @entries = compile_findings(dataset['dataset'])
+ rescue StandardError => e
+ raise "Invalid DBProtect XML file provided Exception: #{e};\nNote that XML must be of kind `Check Results Details`."
  end

  def to_hdf
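This refactor (repeated for several mappers below) drops the explicit `begin`/`end` because a Ruby method body supports an implicit `rescue` clause. A hypothetical parser illustrating the pattern (the method and error text are illustrative, not the gem's):

```ruby
# Method-level rescue: no begin/end wrapper is needed when the
# rescue guards the whole method body.
def parse_dataset(xml)
  raise ArgumentError, 'empty document' if xml.strip.empty?
  { 'dataset' => xml }
rescue StandardError => e
  raise "Invalid XML file provided Exception: #{e}"
end
```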
@@ -7,9 +7,8 @@ DEFAULT_NIST_TAG = %w{SA-11 RA-5}.freeze

  module HeimdallTools
  class FortifyMapper
- def initialize(fvdl, verbose = false)
+ def initialize(fvdl)
  @fvdl = fvdl
- @verbose = verbose

  begin
  data = xml_to_hash(fvdl)
@@ -0,0 +1,12 @@
+ sarif_mapper translates a SARIF JSON file into HDF format JSON to be viewable in Heimdall
+
+ SARIF level to HDF impact Mapping:
+ SARIF level error -> HDF impact 0.7
+ SARIF level warning -> HDF impact 0.5
+ SARIF level note -> HDF impact 0.3
+ SARIF level none -> HDF impact 0.1
+ SARIF level not provided -> HDF impact 0.1 as default
+
+ Examples:
+
+ heimdall_tools sarif_mapper [OPTIONS] -j <sarif-results-json> -o <hdf-scan-results.json>
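The level-to-impact table above is implemented as a simple hash lookup in the new SarifMapper later in this diff (where the mapper's table carries `none: 0.0`); unrecognized or absent levels fall back to 0.1. A minimal sketch of that lookup:

```ruby
# Map a SARIF result level to an HDF impact score; anything not in
# the table (including a missing level) defaults to 0.1.
IMPACT_MAPPING = { error: 0.7, warning: 0.5, note: 0.3, none: 0.0 }.freeze

def impact(severity)
  severity_mapping = IMPACT_MAPPING[severity.to_sym]
  severity_mapping.nil? ? 0.1 : severity_mapping
end
```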
@@ -0,0 +1,7 @@
+ scoutsuite_mapper translates Scout Suite results from JavaScript to HDF-formatted JSON to be viewable in Heimdall
+
+ Note: Currently this mapper only supports AWS.
+
+ Examples:
+
+ heimdall_tools scoutsuite_mapper -i <scoutsuite-results-js> -o <hdf-scan-results-json>
@@ -27,9 +27,8 @@ end

  module HeimdallTools
  class JfrogXrayMapper
- def initialize(xray_json, _name = nil, verbose = false)
+ def initialize(xray_json, _name = nil)
  @xray_json = xray_json
- @verbose = verbose

  begin
  @cwe_nist_mapping = parse_mapper
@@ -39,9 +39,8 @@ end

  module HeimdallTools
  class NessusMapper
- def initialize(nessus_xml, verbose = false)
+ def initialize(nessus_xml)
  @nessus_xml = nessus_xml
- @verbose = verbose
  read_cci_xml
  begin
  @cwe_nist_mapping = parse_mapper
@@ -72,7 +71,8 @@ module HeimdallTools
  info = {}

  info['policyName'] = policy['policyName']
- info['version'] = policy['Preferences']['ServerPreferences']['preference'].select { |x| x['name'].eql? 'sc_version' }.first['value']
+ scanner_version = policy['Preferences']['ServerPreferences']['preference'].select { |x| x['name'].eql? 'sc_version' }
+ info['version'] = scanner_version.empty? ? NA_STRING : scanner_version.first['value']
  info
  rescue StandardError => e
  raise "Invalid Nessus XML file provided Exception: #{e}"
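The change above guards the `sc_version` lookup: the old one-liner called `.first['value']` directly, which raises `NoMethodError` on `nil` when no preference matches. A standalone sketch of the guarded lookup (`NA_STRING` is stubbed here as an assumption; the gem defines it elsewhere):

```ruby
NA_STRING = 'N/A' # assumed stand-in for the gem's constant

# Return the 'sc_version' preference value, or NA_STRING when the
# preference is absent, instead of crashing on an empty selection.
def scanner_version(preferences)
  match = preferences.select { |x| x['name'].eql? 'sc_version' }
  match.empty? ? NA_STRING : match.first['value']
end
```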
@@ -221,8 +221,12 @@ module HeimdallTools
  end
  if item['compliance-reference']
  @item['tags']['nist'] = cci_nist_tag(parse_refs(item['compliance-reference'], 'CCI'))
+ @item['tags']['cci'] = parse_refs(item['compliance-reference'], 'CCI')
+ @item['tags']['rid'] = parse_refs(item['compliance-reference'], 'Rule-ID').join(',')
+ @item['tags']['stig_id'] = parse_refs(item['compliance-reference'], 'STIG-ID').join(',')
  else
  @item['tags']['nist'] = plugin_nist_tag(item['pluginFamily'], item['pluginID'])
+ @item['tags']['rid'] = item['pluginID'].to_s
  end
  if item['compliance-solution']
  @item['descriptions'] << desc_tags(item['compliance-solution'], 'check')
@@ -21,19 +21,15 @@ DEFAULT_NIST_TAG = %w{SA-11 RA-5}.freeze

  module HeimdallTools
  class NetsparkerMapper
- def initialize(xml, _name = nil, verbose = false)
- @verbose = verbose
-
- begin
- @cwe_nist_mapping = parse_mapper(CWE_NIST_MAPPING_FILE)
- @owasp_nist_mapping = parse_mapper(OWASP_NIST_MAPPING_FILE)
- data = xml_to_hash(xml)
-
- @vulnerabilities = data['netsparker-enterprise']['vulnerabilities']['vulnerability']
- @scan_info = data['netsparker-enterprise']['target']
- rescue StandardError => e
- raise "Invalid Netsparker XML file provided Exception: #{e}"
- end
+ def initialize(xml, _name = nil)
+ @cwe_nist_mapping = parse_mapper(CWE_NIST_MAPPING_FILE)
+ @owasp_nist_mapping = parse_mapper(OWASP_NIST_MAPPING_FILE)
+ data = xml_to_hash(xml)
+
+ @vulnerabilities = data['netsparker-enterprise']['vulnerabilities']['vulnerability']
+ @scan_info = data['netsparker-enterprise']['target']
+ rescue StandardError => e
+ raise "Invalid Netsparker XML file provided Exception: #{e}"
  end

  def to_hdf
@@ -26,9 +26,8 @@ end

  module HeimdallTools
  class NiktoMapper
- def initialize(nikto_json, _name = nil, verbose = false)
+ def initialize(nikto_json, _name = nil)
  @nikto_json = nikto_json
- @verbose = verbose

  begin
  @nikto_nist_mapping = parse_mapper
@@ -0,0 +1,198 @@
1
+ require 'json'
2
+ require 'csv'
3
+ require 'heimdall_tools/hdf'
4
+
5
+ RESOURCE_DIR = Pathname.new(__FILE__).join('../../data')
6
+
7
+ CWE_NIST_MAPPING_FILE = File.join(RESOURCE_DIR, 'cwe-nist-mapping.csv')
8
+
9
+ IMPACT_MAPPING = {
10
+ error: 0.7,
11
+ warning: 0.5,
12
+ note: 0.3,
13
+ none: 0.0
14
+ }.freeze
15
+
16
+ DEFAULT_NIST_TAG = %w{SA-11 RA-5}.freeze
17
+
18
+ # Loading spinner sign
19
+ $spinner = Enumerator.new do |e|
20
+ loop do
21
+ e.yield '|'
22
+ e.yield '/'
23
+ e.yield '-'
24
+ e.yield '\\'
25
+ end
26
+ end
27
+
28
+ module HeimdallTools
29
+ class SarifMapper
30
+ def initialize(sarif_json, _name = nil, verbose = false)
31
+ @sarif_json = sarif_json
32
+ @verbose = verbose
33
+ begin
34
+ @cwe_nist_mapping = parse_mapper
35
+ @sarif_log = JSON.parse(@sarif_json)
36
+ rescue StandardError => e
37
+ raise "Invalid SARIF JSON file provided\n\nException: #{e}"
38
+ end
39
+ end
40
+
41
+ def extract_scaninfo(sarif_log)
42
+ info = {}
43
+ begin
44
+ info['policy'] = 'SARIF'
45
+ info['version'] = sarif_log['version']
46
+ info['projectName'] = 'Static Analysis Results Interchange Format'
47
+ info['summary'] = NA_STRING
48
+ info
49
+ rescue StandardError => e
50
+ raise "Error extracting project info from SARIF JSON file provided Exception: #{e}"
51
+ end
52
+ end
53
+
54
+ def finding(result)
55
+ finding = {}
56
+ finding['status'] = 'failed'
57
+ finding['code_desc'] = ''
58
+ if get_location(result)['uri']
59
+ finding['code_desc'] += " URL : #{get_location(result)['uri']}"
60
+ end
61
+ if get_location(result)['start_line']
62
+ finding['code_desc'] += " LINE : #{get_location(result)['start_line']}"
63
+ end
64
+ if get_location(result)['start_column']
65
+ finding['code_desc'] += " COLUMN : #{get_location(result)['start_column']}"
66
+ end
67
+ finding['code_desc'].strip!
68
+ finding['run_time'] = NA_FLOAT
69
+ finding['start_time'] = NA_STRING
70
+ finding
71
+ end
72
+
73
+ def add_nist_tag_from_cwe(cweid, taxonomy_name, tags_node)
74
+ entries = @cwe_nist_mapping.select { |x| cweid.include?(x[:cweid].to_s) && !x[:nistid].nil? }
75
+ tags = entries.map { |x| x[:nistid] }
76
+ result_tags = tags.empty? ? DEFAULT_NIST_TAG : tags.flatten.uniq
77
+ if result_tags.count.positive?
78
+ if !tags_node
79
+ tags_node = {}
80
+ end
81
+ if !tags_node.key?(taxonomy_name)
82
+ tags_node[taxonomy_name] = []
83
+ end
84
+ result_tags.each do |t|
85
+ tags_node[taxonomy_name] |= [t]
86
+ end
87
+ end
88
+ tags_node
89
+ end
90
+
91
+ def get_location(result)
92
+ location_info = {}
93
+ location_info['uri'] = result.dig('locations', 0, 'physicalLocation', 'artifactLocation', 'uri')
94
+ location_info['start_line'] = result.dig('locations', 0, 'physicalLocation', 'region', 'startLine')
95
+ location_info['start_column'] = result.dig('locations', 0, 'physicalLocation', 'region', 'startColumn')
96
+ location_info
97
+ end
98
+
99
+    def get_rule_info(run, result, rule_id)
+      finding = {}
+      driver = run.dig('tool', 'driver')
+      finding['driver_name'] = driver['name']
+      finding['driver_version'] = driver['version']
+      rules = driver['rules']
+      if rules
+        rule = rules.find { |x| x['id'].eql?(rule_id) }
+        if rule
+          finding['rule_name'] = rule&.[]('name')
+          finding['rule_short_description'] = rule&.[]('shortDescription')&.[]('text')
+          finding['rule_tags'] = get_tags(rule)
+          finding['rule_name'] = rule&.[]('messageStrings')&.[]('default')&.[]('text') unless finding['rule_name']
+        end
+      end
+      finding['rule_name'] = result&.[]('message')&.[]('text') unless finding['rule_name']
+      finding
+    end
+
+    def get_tags(rule)
+      result = {}
+      Array(rule&.[]('relationships')).each do |relationship|
+        taxonomy_name = relationship['target']['toolComponent']['name'].downcase
+        taxonomy_id = relationship['target']['id']
+        if !result.key?(taxonomy_name)
+          result[taxonomy_name] = []
+        end
+        result[taxonomy_name] |= [taxonomy_id]
+      end
+      result
+    end
+
+    def parse_identifiers(rule_tags, ref)
+      # Extracting id number from reference style CWE-297
+      rule_tags[ref.downcase].map { |e| e.downcase.split("#{ref.downcase}-")[1] }
+    rescue StandardError
+      []
+    end
+
+    def impact(severity)
+      severity_mapping = IMPACT_MAPPING[severity.to_sym]
+      severity_mapping.nil? ? 0.1 : severity_mapping
+    end
+
+    def parse_mapper
+      csv_data = CSV.read(CWE_NIST_MAPPING_FILE, **{ encoding: 'UTF-8',
+                                                     headers: true,
+                                                     header_converters: :symbol,
+                                                     converters: :all })
+      csv_data.map(&:to_hash)
+    end
+
+    def desc_tags(data, label)
+      { data: data || NA_STRING, label: label || NA_STRING }
+    end
+
+    def process_item(run, result, controls)
+      printf("\rProcessing: %s", $spinner.next)
+      control = controls.find { |x| x['id'].eql?(result['ruleId']) }
+
+      if control
+        control['results'] << finding(result)
+      else
+        rule_info = get_rule_info(run, result, result['ruleId'])
+        item = {}
+        item['tags'] = rule_info['rule_tags']
+        item['descriptions'] = []
+        item['refs'] = NA_ARRAY
+        item['source_location'] = { ref: get_location(result)['uri'], line: get_location(result)['start_line'] }
+        item['descriptions'] = NA_ARRAY
+        item['title'] = rule_info['rule_name'].to_s
+        item['id'] = result['ruleId'].to_s
+        item['desc'] = rule_info['rule_short_description'].to_s
+        item['impact'] = impact(result['level'].to_s)
+        item['code'] = NA_STRING
+        item['results'] = [finding(result)]
+        item['tags'] = add_nist_tag_from_cwe(parse_identifiers(rule_info['rule_tags'], 'CWE'), 'nist', item['tags'])
+        controls << item
+      end
+    end
+
+    def to_hdf
+      controls = []
+      @sarif_log['runs'].each do |run|
+        run['results'].each do |result|
+          process_item(run, result, controls)
+        end
+      end
+
+      scaninfo = extract_scaninfo(@sarif_log)
+      results = HeimdallDataFormat.new(profile_name: scaninfo['policy'],
+                                       version: scaninfo['version'],
+                                       title: scaninfo['projectName'],
+                                       summary: scaninfo['summary'],
+                                       controls: controls,
+                                       target_id: scaninfo['projectName'])
+      results.to_hdf
+    end
+  end
+end
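The new SARIF mapper's CWE handling above can be seen in isolation. This is a standalone sketch of its `parse_identifiers` logic (copied out of the mapper, not the gem's public API): given the taxonomy tags collected by `get_tags`, it pulls the numeric IDs out of references like `CWE-297`, and a missing taxonomy key falls through to an empty list via the `rescue`.

```ruby
# Sketch of SarifMapper#parse_identifiers, extracted for illustration.
# rule_tags maps lowercased taxonomy names to arrays of IDs such as "CWE-79".
def parse_identifiers(rule_tags, ref)
  rule_tags[ref.downcase].map { |e| e.downcase.split("#{ref.downcase}-")[1] }
rescue StandardError
  []  # a missing taxonomy key (nil.map raises) yields an empty list
end

parse_identifiers({ 'cwe' => ['CWE-79', 'CWE-89'] }, 'CWE')  # => ["79", "89"]
parse_identifiers({}, 'CWE')                                 # => []
```

The extracted IDs are what `process_item` feeds into `add_nist_tag_from_cwe` to attach NIST 800-53 tags to each control.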
@@ -0,0 +1,180 @@
+require 'json'
+require 'csv'
+require 'heimdall_tools/hdf'
+
+RESOURCE_DIR = Pathname.new(__FILE__).join('../../data')
+
+SCOUTSUITE_NIST_MAPPING_FILE = File.join(RESOURCE_DIR, 'scoutsuite-nist-mapping.csv')
+
+IMPACT_MAPPING = {
+  danger: 0.7,
+  warning: 0.5
+}.freeze
+
+DEFAULT_NIST_TAG = %w{SA-11 RA-5}.freeze
+
+INSPEC_INPUTS_MAPPING = {
+  string: 'String',
+  numeric: 'Numeric',
+  regexp: 'Regexp',
+  array: 'Array',
+  hash: 'Hash',
+  boolean: 'Boolean',
+  any: 'Any'
+}.freeze
+
+# Loading spinner sign
+$spinner = Enumerator.new do |e|
+  loop do
+    e.yield '|'
+    e.yield '/'
+    e.yield '-'
+    e.yield '\\'
+  end
+end
+
+module HeimdallTools
+  # currently only tested against an AWS based result, but ScoutSuite supports many other cloud providers such as Azure
+  class ScoutSuiteMapper
+    def initialize(scoutsuite_js)
+      begin
+        @scoutsuite_nist_mapping = parse_mapper
+      rescue StandardError => e
+        raise "Invalid Scout Suite to NIST mapping file:\nException: #{e}"
+      end
+
+      begin
+        @scoutsuite_json = scoutsuite_js.lines[1] # first line is `scoutsuite_results =\n` and second line is json
+        @report = JSON.parse(@scoutsuite_json)
+      rescue StandardError => e
+        raise "Invalid Scout Suite JavaScript file provided:\nException: #{e}"
+      end
+    end
+
+    def parse_mapper
+      csv_data = CSV.read(SCOUTSUITE_NIST_MAPPING_FILE, { encoding: 'UTF-8', headers: true, header_converters: :symbol })
+      csv_data.map(&:to_hash)
+    end
+
+    def create_attribute(name, value, required = nil, sensitive = nil, type = nil)
+      { name: name, options: { value: value, required: required, sensitive: sensitive, type: type }.compact }
+    end
+
+    def extract_scaninfo(report)
+      info = {}
+      begin
+        info['name'] = 'Scout Suite Multi-Cloud Security Auditing Tool'
+        info['version'] = report['last_run']['version']
+        info['title'] = "Scout Suite Report using #{report['last_run']['ruleset_name']} ruleset on #{report['provider_name']} with account #{report['account_id']}"
+        info['target_id'] = "#{report['last_run']['ruleset_name']} ruleset:#{report['provider_name']}:#{report['account_id']}"
+        info['summary'] = report['last_run']['ruleset_about']
+        info['attributes'] = [
+          create_attribute('account_id', report['account_id'], true, false, INSPEC_INPUTS_MAPPING[:string]),
+          create_attribute('environment', report['environment']),
+          create_attribute('ruleset', report['ruleset_name']),
+          # think at least these run_parameters are aws only
+          create_attribute('run_parameters_excluded_regions', report['last_run']['run_parameters']['excluded_regions'].join(', ')),
+          create_attribute('run_parameters_regions', report['last_run']['run_parameters']['regions'].join(', ')),
+          create_attribute('run_parameters_services', report['last_run']['run_parameters']['services'].join(', ')),
+          create_attribute('run_parameters_skipped_services', report['last_run']['run_parameters']['skipped_services'].join(', ')),
+          create_attribute('time', report['last_run']['time']),
+          create_attribute('partition', report['partition']), # think this is aws only
+          create_attribute('provider_code', report['provider_code']),
+          create_attribute('provider_name', report['provider_name']),
+        ]
+
+        info
+      rescue StandardError => e
+        raise "Error extracting report info from Scout Suite JS->JSON file:\nException: #{e}"
+      end
+    end
+
+    def nist_tag(rule)
+      entries = @scoutsuite_nist_mapping.select { |x| rule.eql?(x[:rule].to_s) && !x[:nistid].nil? }
+      tags = entries.map { |x| x[:nistid].split('|') }
+      tags.empty? ? DEFAULT_NIST_TAG : tags.flatten.uniq
+    end
+
+    def impact(severity)
+      IMPACT_MAPPING[severity.to_sym]
+    end
+
+    def desc_tags(data, label)
+      { data: data || NA_STRING, label: label || NA_STRING }
+    end
+
+    def findings(details)
+      finding = {}
+      if (details['checked_items']).zero?
+        finding['status'] = 'skipped'
+        finding['skip_message'] = 'Skipped because no items were checked'
+      elsif (details['flagged_items']).zero?
+        finding['status'] = 'passed'
+        finding['message'] = "0 flagged items out of #{details['checked_items']} checked items"
+      else # there are checked items and things were flagged
+        finding['status'] = 'failed'
+        finding['message'] = "#{details['flagged_items']} flagged items out of #{details['checked_items']} checked items:\n#{details['items'].join("\n")}"
+      end
+      finding['code_desc'] = details['description']
+      finding['start_time'] = @report['last_run']['time']
+      [finding]
+    end
+
+    def compliance(arr)
+      str = 'Compliant with '
+      arr.map do |val|
+        info = "#{val['name']}, reference #{val['reference']}, version #{val['version']}"
+        str + info
+      end.join("\n")
+    end
+
+    def to_hdf
+      controls = []
+      @report['services'].each_key do |service|
+        @report['services'][service]['findings'].each_key do |finding|
+          printf("\rProcessing: %s", $spinner.next)
+
+          finding_id = finding
+          finding_details = @report['services'][service]['findings'][finding]
+
+          item = {}
+          item['id'] = finding_id
+          item['title'] = finding_details['description']
+
+          item['tags'] = { nist: nist_tag(finding_id) }
+
+          item['impact'] = impact(finding_details['level'])
+
+          item['desc'] = finding_details['rationale']
+
+          item['descriptions'] = []
+          item['descriptions'] << desc_tags(finding_details['remediation'], 'fix') unless finding_details['remediation'].nil?
+          item['descriptions'] << desc_tags(finding_details['service'], 'service')
+          item['descriptions'] << desc_tags(finding_details['path'], 'path')
+          item['descriptions'] << desc_tags(finding_details['id_suffix'], 'id_suffix')
+
+          item['refs'] = []
+          item['refs'] += finding_details['references'].map { |link| { url: link } } unless finding_details['references'].nil? || finding_details['references'].empty?
+          item['refs'] << { ref: compliance(finding_details['compliance']) } unless finding_details['compliance'].nil?
+
+          item['source_location'] = NA_HASH
+          item['code'] = NA_STRING
+
+          item['results'] = findings(finding_details)
+
+          controls << item
+        end
+      end
+
+      scaninfo = extract_scaninfo(@report)
+      results = HeimdallDataFormat.new(profile_name: scaninfo['name'],
+                                       version: scaninfo['version'],
+                                       title: scaninfo['title'],
+                                       summary: scaninfo['summary'],
+                                       controls: controls,
+                                       target_id: scaninfo['target_id'],
+                                       attributes: scaninfo['attributes'])
+      results.to_hdf
+    end
+  end
+end
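The status triage in `ScoutSuiteMapper#findings` above reduces to a small decision: zero checked items means the check was skipped, checked items with nothing flagged means it passed, and anything flagged means it failed. A minimal sketch of just that branch logic (`status_for` is a hypothetical helper name for illustration, not part of the gem):

```ruby
# Sketch of the status decision inside ScoutSuiteMapper#findings.
# details is a Scout Suite finding hash with 'checked_items' and
# 'flagged_items' counts.
def status_for(details)
  return 'skipped' if details['checked_items'].zero?

  details['flagged_items'].zero? ? 'passed' : 'failed'
end

status_for('checked_items' => 0, 'flagged_items' => 0)  # => "skipped"
status_for('checked_items' => 4, 'flagged_items' => 0)  # => "passed"
status_for('checked_items' => 4, 'flagged_items' => 2)  # => "failed"
```

In the mapper itself, the failed branch additionally joins the flagged item paths into the finding message, so the HDF result records exactly which resources tripped the rule.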
@@ -29,9 +29,8 @@ end
 
 module HeimdallTools
   class SnykMapper
-    def initialize(synk_json, _name = nil, verbose = false)
+    def initialize(synk_json, _name = nil)
       @synk_json = synk_json
-      @verbose = verbose
 
       begin
         @cwe_nist_mapping = parse_mapper
@@ -8,13 +8,10 @@ RESOURCE_DIR = Pathname.new(__FILE__).join('../../data')
 CWE_NIST_MAPPING_FILE = File.join(RESOURCE_DIR, 'cwe-nist-mapping.csv')
 DEFAULT_NIST_TAG = %w{SA-11 RA-5}.freeze
 
-# rubocop:disable Metrics/AbcSize
-
 module HeimdallTools
   class ZapMapper
-    def initialize(zap_json, name, verbose = false)
+    def initialize(zap_json, name)
       @zap_json = zap_json
-      @verbose = verbose
 
       begin
         data = JSON.parse(zap_json, symbolize_names: true)
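The Snyk and ZAP hunks both drop the unused `verbose` flag from the mapper constructors, which is a breaking change for any caller still passing it positionally. A sketch of the effect (`ZapMapperSketch` is a stand-in class for illustration, not `HeimdallTools::ZapMapper` itself):

```ruby
# Stand-in showing the new two-parameter initializer shape.
class ZapMapperSketch
  def initialize(zap_json, name)
    @zap_json = zap_json
    @name = name
  end
end

ZapMapperSketch.new('{}', 'target')  # two-argument form works

begin
  ZapMapperSketch.new('{}', 'target', true)  # old verbose argument
rescue ArgumentError
  # raises: wrong number of arguments (given 3, expected 2)
end
```

Callers upgrading from 1.3.42 should simply stop passing the third argument; it had no effect on output even before this release, since `@verbose` was assigned but never read.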
metadata CHANGED
@@ -1,7 +1,7 @@
 --- !ruby/object:Gem::Specification
 name: heimdall_tools
 version: !ruby/object:Gem::Version
-  version: 1.3.42
+  version: 1.3.47
 platform: ruby
 authors:
 - Robert Thew
@@ -10,7 +10,7 @@ authors:
 autorequire:
 bindir: exe
 cert_chain: []
-date: 2021-04-08 00:00:00.000000000 Z
+date: 2021-06-08 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
   name: aws-sdk-configservice
@@ -88,14 +88,14 @@ dependencies:
     requirements:
     - - "~>"
       - !ruby/object:Gem::Version
-        version: 1.10.9
+        version: '1.11'
   type: :runtime
   prerelease: false
   version_requirements: !ruby/object:Gem::Requirement
     requirements:
     - - "~>"
       - !ruby/object:Gem::Version
-        version: 1.10.9
+        version: '1.11'
 - !ruby/object:Gem::Dependency
   name: openssl
   requirement: !ruby/object:Gem::Requirement
@@ -214,6 +214,7 @@ files:
 - lib/data/nessus-plugins-nist-mapping.csv
 - lib/data/nikto-nist-mapping.csv
 - lib/data/owasp-nist-mapping.csv
+- lib/data/scoutsuite-nist-mapping.csv
 - lib/heimdall_tools.rb
 - lib/heimdall_tools/aws_config_mapper.rb
 - lib/heimdall_tools/burpsuite_mapper.rb
@@ -231,6 +232,8 @@ files:
 - lib/heimdall_tools/help/nessus_mapper.md
 - lib/heimdall_tools/help/netsparker_mapper.md
 - lib/heimdall_tools/help/nikto_mapper.md
+- lib/heimdall_tools/help/sarif_mapper.md
+- lib/heimdall_tools/help/scoutsuite_mapper.md
 - lib/heimdall_tools/help/snyk_mapper.md
 - lib/heimdall_tools/help/sonarqube_mapper.md
 - lib/heimdall_tools/help/zap_mapper.md
@@ -238,6 +241,8 @@ files:
 - lib/heimdall_tools/nessus_mapper.rb
 - lib/heimdall_tools/netsparker_mapper.rb
 - lib/heimdall_tools/nikto_mapper.rb
+- lib/heimdall_tools/sarif_mapper.rb
+- lib/heimdall_tools/scoutsuite_mapper.rb
 - lib/heimdall_tools/snyk_mapper.rb
 - lib/heimdall_tools/sonarqube_mapper.rb
 - lib/heimdall_tools/version.rb
@@ -262,7 +267,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
 - !ruby/object:Gem::Version
   version: '0'
 requirements: []
-rubygems_version: 3.2.3
+rubygems_version: 3.2.15
 signing_key:
 specification_version: 4
 summary: Convert Forify, Openzap and Sonarqube results to HDF