heimdall_tools 1.3.34 → 1.3.39.pre1
- checksums.yaml +4 -4
- data/README.md +53 -0
- data/lib/data/aws-config-mapping.csv +107 -0
- data/lib/data/cwe-nist-mapping.csv +5 -1
- data/lib/heimdall_tools.rb +3 -0
- data/lib/heimdall_tools/aws_config_mapper.rb +284 -0
- data/lib/heimdall_tools/burpsuite_mapper.rb +1 -1
- data/lib/heimdall_tools/cli.rb +36 -0
- data/lib/heimdall_tools/dbprotect_mapper.rb +127 -0
- data/lib/heimdall_tools/fortify_mapper.rb +2 -1
- data/lib/heimdall_tools/hdf.rb +3 -1
- data/lib/heimdall_tools/help/aws_config_mapper.md +30 -0
- data/lib/heimdall_tools/help/dbprotect_mapper.md +5 -0
- data/lib/heimdall_tools/help/jfrog_xray_mapper.md +5 -0
- data/lib/heimdall_tools/jfrog_xray_mapper.rb +142 -0
- data/lib/heimdall_tools/nessus_mapper.rb +1 -1
- data/lib/heimdall_tools/nikto_mapper.rb +1 -1
- data/lib/heimdall_tools/snyk_mapper.rb +1 -1
- data/lib/heimdall_tools/sonarqube_mapper.rb +3 -1
- data/lib/heimdall_tools/zap_mapper.rb +3 -2
- metadata +26 -33
checksums.yaml
CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 5bf59c85ab39e97a18e7ef7a18d0820541df75fc321a017e3b5b633270596e51
+  data.tar.gz: 25c4ab33be6805034e122fb4491a2241ad8bb2f993f90290cef0c2fefe8f81a8
 SHA512:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: b05652317c333cc5e0ac9eee14cef1052438b0d4b0e71a5bd9a6cc5cc79c312a314d939b9bbde5eeba64cf15edeb7e53633d61eecf5f3d44f8a8522551204001
+  data.tar.gz: 62de5d7471629610a26b2074a0fd1088717f0b2cd6199ec15d7c8b89787c682511f00e3088e9f7df7135f860e279afb7baaf86901a82239b359d6ddcb47d8f82
data/README.md
CHANGED
@@ -12,6 +12,9 @@ HeimdallTools supplies several methods to convert output from various tools to "
 - **nessus_mapper** - commercial vulnerability scanner
 - **snyk_mapper** - commercial package vulnerability scanner
 - **nikto_mapper** - open-source web server scanner
+- **jfrog_xray_mapper** - package vulnerability scanner
+- **dbprotect_mapper** - database vulnerability scanner
+- **aws_config_mapper** - assess, audit, and evaluate AWS resources

 Ruby 2.4 or higher (check using "ruby -v")

@@ -181,6 +184,56 @@ FLAGS:
 example: heimdall_tools nikto_mapper -j nikto_results.json -o nikto_results.json
 ```

+## jfrog_xray_mapper
+
+jfrog_xray_mapper translates an JFrog Xray results JSON file into HDF format JSON to be viewable in Heimdall
+
+```
+USAGE: heimdall_tools jfrog_xray_mapper [OPTIONS] -j <xray-results-json> -o <hdf-scan-results.json>
+
+FLAGS:
+-j <xray_results_json> : path to xray results JSON file.
+-o --output <scan-results> : path to output scan-results json.
+-V --verbose : verbose run [optional].
+
+example: heimdall_tools jfrog_xray_mapper -j xray_results.json -o xray_results_hdf.json
+```
+
+## dbprotect_mapper
+
+dbprotect_mapper translates DBProtect report in `Check Results Details` format XML to HDF format JSON be viewed on Heimdall.
+
+```
+USAGE: heimdall_tools dbprotect_mapper [OPTIONS] -x <check_results_details_report_xml> -o <db_protect_hdf.json>
+
+FLAGS:
+-x <check_results_details_report_xml> : path to DBProtect report XML file.
+-o --output <scan-results> : path to output scan-results json.
+-V --verbose : verbose run [optional].
+
+example: heimdall_tools dbprotect_mapper -x check_results_details_report.xml -o db_protect_hdf.json
+```
+
+## aws_config_mapper
+
+aws_config_mapper pulls Ruby AWS SDK data to translate AWS Config Rule results into HDF format json to be viewable in Heimdall
+
+### AWS Config Rule Mapping:
+The mapping of AWS Config Rules to 800-53 Controls was sourced from [this link](https://docs.aws.amazon.com/config/latest/developerguide/operational-best-practices-for-nist-800-53_rev_4.html).
+
+### Authentication with AWS:
+[Developer Guide for configuring Ruby AWS SDK for authentication](https://docs.aws.amazon.com/sdk-for-ruby/v3/developer-guide/setup-config.html)
+
+```
+USAGE: heimdall_tools aws_config_mapper [OPTIONS] -o <hdf-scan-results.json>
+
+FLAGS:
+-o --output <scan-results> : path to output scan-results json.
+-V --verbose : verbose run [optional].
+
+example: heimdall_tools aws_config_mapper -o aws_config_results_hdf.json
+```
+
 ## version

 Prints out the gem version
data/lib/data/aws-config-mapping.csv
ADDED
@@ -0,0 +1,107 @@
+AwsConfigRuleName,NIST-ID,Rev
+secretsmanager-scheduled-rotation-success-check,AC-2(1)|AC-2(j),4
+iam-user-group-membership-check,AC-2(1)|AC-2(j)|AC-3|AC-6,4
+iam-password-policy,AC-2(1)|AC-2(f)|AC-2(j)|IA-2|IA-5(1)(a)(d)(e)|IA-5(4),4
+access-keys-rotated,AC-2(1)|AC-2(j),4
+iam-user-unused-credentials-check,AC-2(1)|AC-2(3)|AC-2(f)|AC-3|AC-6,4
+securityhub-enabled,AC-2(1)|AC-2(4)|AC-2(12)(a)|AC-2(g)|AC-17(1)|AU-6(1)(3)|CA-7(a)(b)|SA-10|SI-4(2)|SI-4(4)|SI-4(5)|SI-4(16)|SI-4(a)(b)(c),4
+guardduty-enabled-centralized,AC-2(1)|AC-2(4)|AC-2(12)(a)|AC-2(g)|AC-17(1)|AU-6(1)(3)|CA-7(a)(b)|RA-5|SA-10|SI-4(1)|SI-4(2)|SI-4(4)|SI-4(5)|SI-4(16)|SI-4(a)(b)(c),4
+cloud-trail-cloud-watch-logs-enabled,AC-2(4)|AC-2(g)|AU-2(a)(d)|AU-3|AU-6(1)(3)|AU-7(1)|AU-12(a)(c)|CA-7(a)(b)|SI-4(2)|SI-4(4)|SI-4(5)|SI-4(a)(b)(c),4
+cloudtrail-enabled,AC-2(4)|AC-2(g)|AU-2(a)(d)|AU-3|AU-12(a)(c),4
+multi-region-cloudtrail-enabled,AC-2(4)|AU-2(a)(d)|AU-3|AU-12(a)(c),4
+rds-logging-enabled,AC-2(4)|AC-2(g)|AU-2(a)(d)|AU-3|AU-12(a)(c),4
+cloudwatch-alarm-action-check,AC-2(4)|AU-6(1)(3)|AU-7(1)|CA-7(a)(b)|IR-4(1)|SI-4(2)|SI-4(4)|SI-4(5)|SI-4(a)(b)(c),4
+redshift-cluster-configuration-check,AC-2(4)|AC-2(g)|AU-2(a)(d)|AU-3|AU-12(a)(c)|SC-13|SC-28,4
+iam-root-access-key-check,AC-2(f)|AC-2(j)|AC-3|AC-6|AC-6(10),4
+s3-bucket-logging-enabled,AC-2(g)|AU-2(a)(d)|AU-3|AU-12(a)(c),4
+cloudtrail-s3-dataevents-enabled,AC-2(g)|AU-2(a)(d)|AU-3|AU-12(a)(c),4
+root-account-mfa-enabled,AC-2(j)|IA-2(1)(11),4
+emr-kerberos-enabled,AC-2(j)|AC-3|AC-5(c)|AC-6,4
+iam-group-has-users-check,AC-2(j)|AC-3|AC-5(c)|AC-6|SC-2,4
+iam-policy-no-statements-with-admin-access,AC-2(j)|AC-3|AC-5(c)|AC-6|SC-2,4
+iam-user-no-policies-check,AC-2(j)|AC-3|AC-5(c)|AC-6,4
+s3-bucket-public-write-prohibited,AC-3|AC-4|AC-6|AC-21(b)|SC-7|SC-7(3),4
+lambda-function-public-access-prohibited,AC-3|AC-4|AC-6|AC-21(b)|SC-7|SC-7(3),4
+rds-snapshots-public-prohibited,AC-3|AC-4|AC-6|AC-21(b)|SC-7|SC-7(3),4
+redshift-cluster-public-access-check,AC-3|AC-4|AC-6|AC-21(b)|SC-7|SC-7(3),4
+s3-bucket-policy-grantee-check,AC-3|AC-6|SC-7|SC-7(3),4
+s3-bucket-public-read-prohibited,AC-3|AC-4|AC-6|AC-21(b)|SC-7|SC-7(3),4
+s3-account-level-public-access-blocks,AC-3|AC-4|AC-6|AC-21(b)|SC-7|SC-7(3),4
+dms-replication-not-public,AC-3|AC-4|AC-6|AC-21(b)|SC-7|SC-7(3),4
+ebs-snapshot-public-restorable-check,AC-3|AC-4|AC-6|AC-21(b)|SC-7|SC-7(3),4
+sagemaker-notebook-no-direct-internet-access,AC-3|AC-4|AC-6|AC-21(b)|SC-7|SC-7(3),4
+rds-instance-public-access-check,AC-4|AC-6|AC-21(b)|SC-7|SC-7(3),4
+lambda-inside-vpc,AC-4|SC-7|SC-7(3),4
+ec2-instances-in-vpc,AC-4|SC-7|SC-7(3),4
+restricted-common-ports,AC-4|CM-2|SC-7|SC-7(3),4
+restricted-ssh,AC-4|SC-7|SC-7(3),4
+vpc-default-security-group-closed,AC-4|SC-7|SC-7(3),4
+vpc-sg-open-only-to-authorized-ports,AC-4|SC-7|SC-7(3),4
+acm-certificate-expiration-check,AC-4|AC-17(2)|SC-12,4
+ec2-instance-no-public-ip,AC-4|AC-6|AC-21(b)|SC-7|SC-7(3),4
+elasticsearch-in-vpc-only,AC-4|SC-7|SC-7(3),4
+emr-master-no-public-ip,AC-4|AC-21(b)|SC-7|SC-7(3),4
+internet-gateway-authorized-vpc-only,AC-4|AC-17(3)|SC-7|SC-7(3),4
+codebuild-project-envvar-awscred-check,AC-6|IA-5(7)|SA-3(a),4
+ec2-imdsv2-check,AC-6,4
+iam-no-inline-policy-check,AC-6,4
+alb-http-to-https-redirection-check,AC-17(2)|SC-7|SC-8|SC-8(1)|SC-13|SC-23,4
+redshift-require-tls-ssl,AC-17(2)|SC-7|SC-8|SC-8(1)|SC-13,4
+s3-bucket-ssl-requests-only,AC-17(2)|SC-7|SC-8|SC-8(1)|SC-13,4
+elb-acm-certificate-required,AC-17(2)|SC-7|SC-8|SC-8(1)|SC-13,4
+alb-http-drop-invalid-header-enabled,AC-17(2)|SC-7|SC-8|SC-8(1)|SC-23,4
+elb-tls-https-listeners-only,AC-17(2)|SC-7|SC-8|SC-8(1)|SC-23,4
+api-gw-execution-logging-enabled,AU-2(a)(d)|AU-3|AU-12(a)(c),4
+elb-logging-enabled,AU-2(a)(d)|AU-3|AU-12(a)(c),4
+vpc-flow-logs-enabled,AU-2(a)(d)|AU-3|AU-12(a)(c),4
+wafv2-logging-enabled,AU-2(a)(d)|AU-3|AU-12(a)(c)|SC-7|SI-4(a)(b)(c),4
+cloud-trail-encryption-enabled,AU-9|SC-13|SC-28,4
+cloudwatch-log-group-encrypted,AU-9|SC-13|SC-28,4
+s3-bucket-replication-enabled,AU-9(2)|CP-9(b)|CP-10|SC-5|SC-36,4
+cw-loggroup-retention-period-check,AU-11|SI-12,4
+ec2-instance-detailed-monitoring-enabled,CA-7(a)(b)|SI-4(2)|SI-4(a)(b)(c),4
+rds-enhanced-monitoring-enabled,CA-7(a)(b),4
+ec2-instance-managed-by-systems-manager,CM-2|CM-7(a)|CM-8(1)|CM-8(3)(a)|SA-3(a)|SA-10|SI-2(2)|SI-7(1),4
+ec2-managedinstance-association-compliance-status-check,CM-2|CM-7(a)|CM-8(3)(a)|SI-2(2),4
+ec2-stopped-instance,CM-2,4
+ec2-volume-inuse-check,CM-2|SC-4,4
+elb-deletion-protection-enabled,CM-2|CP-10,4
+cloudtrail-security-trail-enabled,CM-2,4
+ec2-managedinstance-patch-compliance-status-check,CM-8(3)(a)|SI-2(2)|SI-7(1),4
+db-instance-backup-enabled,CP-9(b)|CP-10|SI-12,4
+dynamodb-pitr-enabled,CP-9(b)|CP-10|SI-12,4
+elasticache-redis-cluster-automatic-backup-check,CP-9(b)|CP-10|SI-12,4
+dynamodb-in-backup-plan,CP-9(b)|CP-10|SI-12,4
+ebs-in-backup-plan,CP-9(b)|CP-10|SI-12,4
+efs-in-backup-plan,CP-9(b)|CP-10|SI-12,4
+rds-in-backup-plan,CP-9(b)|CP-10|SI-12,4
+dynamodb-autoscaling-enabled,CP-10|SC-5,4
+rds-multi-az-support,CP-10|SC-5|SC-36,4
+s3-bucket-versioning-enabled,CP-10|SI-12,4
+vpc-vpn-2-tunnels-up,CP-10,4
+elb-cross-zone-load-balancing-enabled,CP-10|SC-5,4
+root-account-hardware-mfa-enabled,IA-2(1)(11),4
+mfa-enabled-for-iam-console-access,IA-2(1)(2)(11),4
+iam-user-mfa-enabled,IA-2(1)(2)(11),4
+guardduty-non-archived-findings,IR-4(1)|IR-6(1)|IR-7(1)|RA-5|SA-10|SI-4(a)(b)(c),4
+codebuild-project-source-repo-url-check,SA-3(a),4
+autoscaling-group-elb-healthcheck-required,SC-5,4
+rds-instance-deletion-protection-enabled,SC-5,4
+alb-waf-enabled,SC-7|SI-4(a)(b)(c),4
+elasticsearch-node-to-node-encryption-check,SC-7|SC-8|SC-8(1),4
+cmk-backing-key-rotation-enabled,SC-12,4
+kms-cmk-not-scheduled-for-deletion,SC-12|SC-28,4
+api-gw-cache-enabled-and-encrypted,SC-13|SC-28,4
+efs-encrypted-check,SC-13|SC-28,4
+elasticsearch-encrypted-at-rest,SC-13|SC-28,4
+encrypted-volumes,SC-13|SC-28,4
+rds-storage-encrypted,SC-13|SC-28,4
+s3-bucket-server-side-encryption-enabled,SC-13|SC-28,4
+sagemaker-endpoint-configuration-kms-key-configured,SC-13|SC-28,4
+sagemaker-notebook-instance-kms-key-configured,SC-13|SC-28,4
+sns-encrypted-kms,SC-13|SC-28,4
+dynamodb-table-encrypted-kms,SC-13,4
+s3-bucket-default-lock-enabled,SC-28,4
+ec2-ebs-encryption-by-default,SC-28,4
+rds-snapshot-encrypted,SC-28,4
+cloud-trail-log-file-validation-enabled,SI-7|SI-7(1),4
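For context (not part of the gem diff): the AWS Config mapper added later in this release reads this CSV and splits the NIST-ID column on `|` to build its `nist` tag array. A minimal sketch of that consumption, using one row from above:

```
require 'csv'

# One record from aws-config-mapping.csv; NIST-ID packs several controls separated by '|'.
row = CSV.parse_line('access-keys-rotated,AC-2(1)|AC-2(j),4')
nist_tags = row[1].split('|') # => ["AC-2(1)", "AC-2(j)"]
```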
data/lib/data/cwe-nist-mapping.csv
CHANGED
@@ -64,10 +64,13 @@
 251, Often Misused: String Management,,4,
 252, Unchecked Return Value,,4,
 256, Plaintext Storage of a Password,SC-28,4,Protection of Information at Rest
+257, Storing Passwords in a Recoverable Format,IA-5,4,Authenticator Management
 258, Empty Password in Configuration File,SC-28,4,Protection of Information at Rest
 259, Use of Hard-coded Password,,4,
 260, Password in Configuration File,SC-28,4,Protection of Information at Rest
 261, Weak Cryptography for Passwords,SC-13,4,Cryptographic Protection
+262, Not Using Password Aging,IA-5,4,Authenticator Management
+263, Password Aging with Long Expiration,IA-5,4,Authenticator Management
 265, Privilege / Sandbox Issues,AC-6,4,Least Privilege
 269, Improper Privilege Management,AC-4,4,Information Flow Enforcement
 272, Least Privilege Violation,AC-6,4,Least Privilege: Privilege Levels for Code Execution -8
@@ -175,8 +178,9 @@
 662, Improper Synchonization,,4,
 667, Improper Locking,,4,
 676, Use of Potentially Dangerous Function,,4,
-690
+690, Unchecked Return Value to NULL Pointer Dereference,,4,
 691, Insufficient Control Flow Management,SI-11,4,Error Handling
+693, Protection Mechanism Failure,IA-5,4,Authenticator Management
 694, Use of Multiple Resources with Duplicate Identifier,,4,
 732, Incorrect Permission Assignment for Critical Resource,AC-3,4,Access Enforcement
 733, Compiler Optimization Removal or Modification of Security-critical Code,,4,
data/lib/heimdall_tools.rb
CHANGED
@@ -12,4 +12,7 @@ module HeimdallTools
   autoload :NessusMapper, 'heimdall_tools/nessus_mapper'
   autoload :SnykMapper, 'heimdall_tools/snyk_mapper'
   autoload :NiktoMapper, 'heimdall_tools/nikto_mapper'
+  autoload :JfrogXrayMapper, 'heimdall_tools/jfrog_xray_mapper'
+  autoload :DBProtectMapper, 'heimdall_tools/dbprotect_mapper'
+  autoload :AwsConfigMapper, 'heimdall_tools/aws_config_mapper'
 end
data/lib/heimdall_tools/aws_config_mapper.rb
ADDED
@@ -0,0 +1,284 @@
+require 'aws-sdk-configservice'
+require 'heimdall_tools/hdf'
+require 'csv'
+require 'json'
+
+RESOURCE_DIR = Pathname.new(__FILE__).join('../../data')
+
+AWS_CONFIG_MAPPING_FILE = File.join(RESOURCE_DIR, 'aws-config-mapping.csv')
+
+NOT_APPLICABLE_MSG = 'No AWS resources found to evaluate complaince for this rule'.freeze
+INSUFFICIENT_DATA_MSG = 'Not enough data has been collectd to determine compliance yet.'.freeze
+
+##
+# HDF mapper for use with AWS Config rules.
+#
+# Ruby AWS Ruby SDK for ConfigService:
+# - https://docs.aws.amazon.com/sdk-for-ruby/v3/api/Aws/ConfigService/Client.html
+#
+# rubocop:disable Metrics/AbcSize, Metrics/ClassLength
+module HeimdallTools
+  class AwsConfigMapper
+    def initialize(custom_mapping, verbose = false)
+      @verbose = verbose
+      @default_mapping = get_rule_mapping(AWS_CONFIG_MAPPING_FILE)
+      @custom_mapping = custom_mapping.nil? ? {} : get_rule_mapping(custom_mapping)
+      @client = Aws::ConfigService::Client.new
+      @issues = get_all_config_rules
+    end
+
+    ##
+    # Convert to HDF
+    #
+    # If there is overlap in rule names from @default_mapping and @custom_mapping,
+    # then the tags from both will be added to the rule.
+    def to_hdf
+      controls = @issues.map do |issue|
+        @item = {}
+        @item['id'] = issue[:config_rule_name]
+        @item['title'] = issue[:config_rule_name]
+        @item['desc'] = issue[:description]
+        @item['impact'] = 0.5
+        @item['tags'] = hdf_tags(issue)
+        @item['descriptions'] = hdf_descriptions(issue)
+        @item['refs'] = NA_ARRAY
+        @item['source_location'] = { ref: issue[:config_rule_arn], line: 1 }
+        @item['code'] = ''
+        @item['results'] = issue[:results]
+        # Avoid duplicating rules that exist in the custom mapping as 'unmapped' in this loop
+        if @custom_mapping.include?(issue[:config_rule_name]) && !@default_mapping.include?(issue[:config_rule_name])
+          nil
+        else
+          @item
+        end
+      end
+      results = HeimdallDataFormat.new(
+        profile_name: 'AWS Config',
+        title: 'AWS Config',
+        summary: 'AWS Config',
+        controls: controls,
+        statistics: { aws_config_sdk_version: Aws::ConfigService::GEM_VERSION }
+      )
+      results.to_hdf
+    end
+
+    private
+
+    ##
+    # Read in a config rule -> 800-53 control mapping CSV.
+    #
+    # Params:
+    # - path: The file path to the CSV file
+    #
+    # Returns: A mapped version of the csv in the format { rule_name: row, ... }
+    def get_rule_mapping(path)
+      Hash[CSV.read(path, headers: true).map { |row| [row[0], row] }]
+    end
+
+    ##
+    # Fetches information on all of the config rules available to the
+    # AWS account.
+    #
+    # https://docs.aws.amazon.com/sdk-for-ruby/v3/api/Aws/ConfigService/Client.html#describe_config_rules-instance_method
+    #
+    # Returns: list of hash for all config rules available
+    def get_all_config_rules
+      config_rules = []
+
+      # Fetch all rules with pagination
+      response = @client.describe_config_rules
+      config_rules += response.config_rules
+      while response.next_token
+        response = @client.describe_config_rules(next_token: response.next_token)
+        config_rules += response.config_rules
+      end
+      config_rules = config_rules.map(&:to_h)
+
+      # Add necessary data to rules using helpers
+      add_compliance_to_config_rules(config_rules)
+      add_results_to_config_rules(config_rules)
+    end
+
+    ##
+    # Adds compliance information for config rules to the config rule hash
+    # from AwsConfigMapper::get_all_config_rules.
+    #
+    # `complaince_type` may be any of the following:
+    # ["COMPLIANT", "NON_COMPLIANT", "NOT_APPLICABLE", "INSUFFICIENT_DATA"]
+    #
+    # Params:
+    # - config_rules: The list of hash from AwsConfigMapper::get_all_config_rules
+    #
+    # Returns: The same config_rules array with `compliance` key added to each rule
+    def add_compliance_to_config_rules(config_rules)
+      mapped_compliance_results = fetch_all_compliance_info(config_rules)
+
+      # Add compliance to config_rules
+      config_rules.each do |rule|
+        rule[:compliance] = mapped_compliance_results[rule[:config_rule_name]]&.dig(:compliance, :compliance_type)
+      end
+
+      config_rules
+    end
+
+    ##
+    # Fetch and combine all compliance information for the config rules.
+    #
+    # AWS allows passing up to 25 rules at a time to this endpoint.
+    #
+    # https://docs.aws.amazon.com/sdk-for-ruby/v3/api/Aws/ConfigService/Client.html#describe_compliance_by_config_rule-instance_method
+    #
+    # Params:
+    # - config_rules: The list of hash from AwsConfigMapper::get_all_config_rules
+    #
+    # Returns: Results mapped by config rule in the format { name: {<response>}, ... }
+    def fetch_all_compliance_info(config_rules)
+      compliance_results = []
+
+      config_rules.each_slice(25).each do |slice|
+        config_rule_names = slice.map { |r| r[:config_rule_name] }
+        response = @client.describe_compliance_by_config_rule(config_rule_names: config_rule_names)
+        compliance_results += response.compliance_by_config_rules
+      end
+
+      # Map based on name for easy lookup
+      Hash[compliance_results.collect { |r| [r.config_rule_name, r.to_h] }]
+    end
+
+    ##
+    # Takes in config rules and formats the results for hdf format.
+    #
+    # https://docs.aws.amazon.com/sdk-for-ruby/v3/api/Aws/ConfigService/Client.html#get_compliance_details_by_config_rule-instance_method
+    #
+    # Example hdf results:
+    # [
+    #   {
+    #     "code_desc": "This rule...",
+    #     "run_time": 0.314016,
+    #     "start_time": "2018-11-18T20:21:40-05:00",
+    #     "status": "passed"
+    #   },
+    #   ...
+    # ]
+    #
+    # Status may be any of the following: ['passed', 'failed', 'skipped', 'loaded']
+    #
+    # Params:
+    # - rule: Rules from AwsConfigMapper::get_all_config_rules
+    #
+    # Returns: The same config_rules array with `results` key added to each rule.
+    def add_results_to_config_rules(config_rules)
+      config_rules.each do |rule|
+        response = @client.get_compliance_details_by_config_rule(config_rule_name: rule[:config_rule_name], limit: 100)
+        rule_results = response.to_h[:evaluation_results]
+        while response.next_token
+          response = @client.get_compliance_details_by_config_rule(next_token: response.next_token, limit: 100)
+          rule_results += response.to_h[:evaluation_results]
+        end
+
+        rule[:results] = []
+        rule_results.each do |result|
+          hdf_result = {}
+          # code_desc
+          hdf_result['code_desc'] = result.dig(:evaluation_result_identifier, :evaluation_result_qualifier)&.map do |k, v|
+            "#{k}: #{v}"
+          end&.join(', ')
+          # start_time
+          hdf_result['start_time'] = if result.key?(:config_rule_invoked_time)
+                                       DateTime.parse(result[:config_rule_invoked_time].to_s).strftime('%Y-%m-%dT%H:%M:%S%:z')
+                                     end
+          # run_time
+          hdf_result['run_time'] = if result.key?(:result_recorded_time) && result.key?(:config_rule_invoked_time)
+                                     (result[:result_recorded_time] - result[:config_rule_invoked_time]).round(6)
+                                   end
+          # status
+          hdf_result['status'] = case result.dig(:compliance_type)
+                                 when 'COMPLIANT'
+                                   'passed'
+                                 when 'NON_COMPLIANT'
+                                   'failed'
+                                 else
+                                   'skipped'
+                                 end
+          hdf_result['message'] = "(#{hdf_result['code_desc']}): #{result[:annotation] || 'Rule does not pass rule compliance'}" if hdf_result['status'] == 'failed'
+          rule[:results] << hdf_result
+        end
+        next unless rule[:results].empty?
+
+        case rule[:compliance]
+        when 'NOT_APPLICABLE'
+          rule[:impact] = 0
+          rule[:results] << {
+            'run_time': 0,
+            'code_desc': NOT_APPLICABLE_MSG,
+            'skip_message': NOT_APPLICABLE_MSG,
+            'start_time': DateTime.now.strftime('%Y-%m-%dT%H:%M:%S%:z'),
+            'status': 'skipped'
+          }
+        when 'INSUFFICIENT_DATA'
+          rule[:results] << {
+            'run_time': 0,
+            'code_desc': INSUFFICIENT_DATA_MSG,
+            'skip_message': INSUFFICIENT_DATA_MSG,
+            'start_time': DateTime.now.strftime('%Y-%m-%dT%H:%M:%S%:z'),
+            'status': 'skipped'
+          }
+        end
+      end
+
+      config_rules
+    end
+
+    ##
+    # Takes in a config rule and pulls out tags that are useful for HDF.
+    #
+    # Params:
+    # - config_rule: A single config rule from AwsConfigMapper::get_all_config_rules
+    #
+    # Returns: Hash containing all relevant HDF tags
+    def hdf_tags(config_rule)
+      result = {}
+
+      @default_mapping
+      @custom_mapping
+
+      # NIST tag
+      result['nist'] = []
+      default_mapping_match = @default_mapping[config_rule[:config_rule_name]]
+
+      result['nist'] += default_mapping_match[1].split('|') unless default_mapping_match.nil?
+
+      custom_mapping_match = @custom_mapping[config_rule[:config_rule_name]]
+
+      result['nist'] += custom_mapping_match[1].split('|').map { |name| "#{name} (user provided)" } unless custom_mapping_match.nil?
+
+      result['nist'] = ['unmapped'] if result['nist'].empty?
+
+      result
+    end
+
+    def check_text(config_rule)
+      params = (JSON.parse(config_rule[:input_parameters]).map { |key, value| "#{key}: #{value}" }).join('<br/>')
+      check_text = config_rule[:config_rule_arn]
+      check_text += "<br/>#{params}" unless params.empty?
+      check_text
+    end
+
+    ##
+    # Takes in a config rule and pulls out information for the descriptions array
+    #
+    # Params:
+    # - config_rule: A single config rule from AwsConfigMapper::get_all_config_rules
+    #
+    # Returns: Array containing all relevant descriptions information
+    def hdf_descriptions(config_rule)
+      [
+        {
+          'label': 'check',
+          'data': check_text(config_rule)
+        }
+      ]
+    end
+  end
+end
+# rubocop:enable Metrics/AbcSize, Metrics/ClassLength
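Usage note (not part of the diff): the `aws_config_mapper` CLI command added in cli.rb below drives this class roughly as follows. The output filename is illustrative, and credentials/region come from the standard Ruby AWS SDK configuration described in the help file later in this diff.

```
require 'heimdall_tools'

# nil means "no custom mapping CSV"; pass a CSV path to supplement the default mapping.
hdf = HeimdallTools::AwsConfigMapper.new(nil).to_hdf
File.write('aws_config_results_hdf.json', hdf) # illustrative output path
```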
data/lib/heimdall_tools/burpsuite_mapper.rb
CHANGED
@@ -63,7 +63,7 @@ module HeimdallTools
     end
 
     def nist_tag(cweid)
-      entries = @cwe_nist_mapping.select { |x| cweid.include?
+      entries = @cwe_nist_mapping.select { |x| cweid.include?(x[:cweid].to_s) && !x[:nistid].nil? }
       tags = entries.map { |x| [x[:nistid], "Rev_#{x[:rev]}"] }
       tags.empty? ? DEFAULT_NIST_TAG : tags.flatten.uniq
     end
data/lib/heimdall_tools/cli.rb
CHANGED
@@ -87,6 +87,42 @@ module HeimdallTools
       puts "#{options[:output]}"
     end
 
+    desc 'jfrog_xray_mapper', 'jfrog_xray_mapper translates Jfrog Xray results Json to HDF format Json be viewed on Heimdall'
+    long_desc Help.text(:jfrog_xray_mapper)
+    option :json, required: true, aliases: '-j'
+    option :output, required: true, aliases: '-o'
+    option :verbose, type: :boolean, aliases: '-V'
+    def jfrog_xray_mapper
+      hdf = HeimdallTools::JfrogXrayMapper.new(File.read(options[:json])).to_hdf
+      File.write(options[:output], hdf)
+      puts "\r\HDF Generated:\n"
+      puts "#{options[:output]}"
+    end
+
+    desc 'dbprotect_mapper', 'dbprotect_mapper translates dbprotect results xml to HDF format Json be viewed on Heimdall'
+    long_desc Help.text(:dbprotect_mapper)
+    option :xml, required: true, aliases: '-x'
+    option :output, required: true, aliases: '-o'
+    option :verbose, type: :boolean, aliases: '-V'
+    def dbprotect_mapper
+      hdf = HeimdallTools::DBProtectMapper.new(File.read(options[:xml])).to_hdf
+      File.write(options[:output], hdf)
+      puts "\r\HDF Generated:\n"
+      puts "#{options[:output]}"
+    end
+
+    desc 'aws_config_mapper', 'aws_config_mapper pulls Ruby AWS SDK data to translate AWS Config Rule results into HDF format Json to be viewable in Heimdall'
+    long_desc Help.text(:aws_config_mapper)
+    # option :custom_mapping, required: false, aliases: '-m'
+    option :output, required: true, aliases: '-o'
+    option :verbose, type: :boolean, aliases: '-V'
+    def aws_config_mapper
+      hdf = HeimdallTools::AwsConfigMapper.new(options[:custom_mapping]).to_hdf
+      File.write(options[:output], hdf)
+      puts "\r\HDF Generated:\n"
+      puts "#{options[:output]}"
+    end
+
     desc 'version', 'prints version'
     def version
       puts VERSION
data/lib/heimdall_tools/dbprotect_mapper.rb
ADDED
@@ -0,0 +1,127 @@
+require 'json'
+require 'csv'
+require 'heimdall_tools/hdf'
+require 'utilities/xml_to_hash'
+
+IMPACT_MAPPING = {
+  High: 0.7,
+  Medium: 0.5,
+  Low: 0.3,
+  Informational: 0.0
+}.freeze
+
+# rubocop:disable Metrics/AbcSize
+
+module HeimdallTools
+  class DBProtectMapper
+    def initialize(xml, name=nil, verbose = false)
+      @verbose = verbose
+
+      begin
+        dataset = xml_to_hash(xml)
+        @entries = compile_findings(dataset['dataset'])
+
+      rescue StandardError => e
+        raise "Invalid DBProtect XML file provided Exception: #{e};\nNote that XML must be of kind `Check Results Details`."
+      end
+
+    end
+
+    def to_hdf
+      controls = []
+      @entries.each do |entry|
+        @item = {}
+        @item['id'] = entry['Check ID']
+        @item['title'] = entry['Check']
+        @item['desc'] = format_desc(entry)
+        @item['impact'] = impact(entry['Risk DV'])
+        @item['tags'] = {}
+        @item['descriptions'] = []
+        @item['refs'] = NA_ARRAY
+        @item['source_location'] = NA_HASH
+        @item['code'] = ''
+        @item['results'] = finding(entry)
+
+        controls << @item
+      end
+      controls = collapse_duplicates(controls)
+      results = HeimdallDataFormat.new(profile_name: @entries.first['Policy'],
+                                       version: "",
+                                       title: @entries.first['Job Name'],
+                                       summary: format_summary(@entries.first),
+                                       controls: controls)
+      results.to_hdf
+    end
+
+    private
+
+    def compile_findings(dataset)
+      keys = dataset['metadata']['item'].map{ |e| e['name']}
+      findings = dataset['data']['row'].map { |e| Hash[keys.zip(e['value'])] }
+      findings
+    end
+
+    def format_desc(entry)
+      text = []
+      text << "Task : #{entry['Task']}"
+      text << "Check Category : #{entry['Check Category']}"
+      text.join("; ")
+    end
+
+    def format_summary(entry)
+      text = []
+      text << "Organization : #{entry['Organization']}"
+      text << "Asset : #{entry['Check Asset']}"
+      text << "Asset Type : #{entry['Asset Type']}"
+      text << "IP Address, Port, Instance : #{entry['Asset Type']}"
+      text << "IP Address, Port, Instance : #{entry['IP Address, Port, Instance']}"
+      text.join("\n")
+    end
+
+    def finding(entry)
+      finding = {}
+
+      finding['code_desc'] = entry['Details']
+      finding['run_time'] = 0.0
+      finding['start_time'] = entry['Date']
+
+      case entry['Result Status']
+      when 'Fact'
+        finding['status'] = 'skipped'
+      when 'Failed'
+        finding['status'] = 'failed'
+        finding['backtrace'] = ["DB Protect Failed Check"]
+      when 'Finding'
+        finding['status'] = 'failed'
+      when 'Not A Finding'
+        finding['status'] = 'passed'
+      when 'Skipped'
+        finding['status'] = 'skipped'
+      else
+        finding['status'] = 'skipped'
+      end
+      [finding]
+    end
+
+    def impact(severity)
+      IMPACT_MAPPING[severity.to_sym]
+    end
+
+    # DBProtect report could have multiple issue entries for multiple findings of same issue type.
+    # The meta data is identical across entries
+    # method collapse_duplicates return unique controls with applicable findings collapsed into it.
+    def collapse_duplicates(controls)
+      unique_controls = []
+
+      controls.map { |x| x['id'] }.uniq.each do |id|
+        collapsed_results = controls.select { |x| x['id'].eql?(id) }.map {|x| x['results']}
+        unique_control = controls.find { |x| x['id'].eql?(id) }
+        unique_control['results'] = collapsed_results.flatten
+        unique_controls << unique_control
+      end
+      unique_controls
+    end
+
+
+  end
+end
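Usage note (not part of the diff): the `dbprotect_mapper` CLI command added in cli.rb above drives this class roughly as follows; filenames are illustrative, and the XML must be a DBProtect `Check Results Details` export.

```
require 'heimdall_tools'

hdf = HeimdallTools::DBProtectMapper.new(File.read('check_results_details_report.xml')).to_hdf
File.write('db_protect_hdf.json', hdf)
```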
data/lib/heimdall_tools/fortify_mapper.rb
CHANGED
@@ -3,6 +3,7 @@ require 'heimdall_tools/hdf'
 require 'utilities/xml_to_hash'
 
 NIST_REFERENCE_NAME = 'Standards Mapping - NIST Special Publication 800-53 Revision 4'.freeze
+DEFAULT_NIST_TAG = ["SA-11", "RA-5"].freeze
 
 module HeimdallTools
   class FortifyMapper
@@ -68,7 +69,7 @@ module HeimdallTools
       references = rule['References']['Reference']
       references = [references] unless references.is_a?(Array)
       tag = references.detect { |x| x['Author'].eql?(NIST_REFERENCE_NAME) }
-      tag.nil? ?
+      tag.nil? ? DEFAULT_NIST_TAG : tag['Title'].match(/[a-zA-Z][a-zA-Z]-\d{1,2}/)
     end
 
     def impact(classid)
data/lib/heimdall_tools/hdf.rb
CHANGED
@@ -29,7 +29,8 @@ module HeimdallTools
                    groups: NA_ARRAY,
                    status: 'loaded',
                    controls: NA_TAG,
-                   target_id: NA_TAG
+                   target_id: NA_TAG,
+                   statistics: NA_HASH)
 
       @results_json = {}
       @results_json['platform'] = {}
@@ -40,6 +41,7 @@ module HeimdallTools
 
       @results_json['statistics'] = {}
       @results_json['statistics']['duration'] = duration || NA_TAG
+      @results_json['statistics'].merge! statistics
 
       @results_json['profiles'] = []
 
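Sketch (not part of the diff) of how a mapper exercises the new `statistics:` keyword; the profile name and statistic key below are hypothetical, and the supplied hash is merged into the report's top-level 'statistics' alongside 'duration':

```
require 'heimdall_tools/hdf'

controls = [] # a real mapper would supply its array of control hashes here

results = HeimdallTools::HeimdallDataFormat.new(profile_name: 'Example Scanner',
                                                title: 'Example Scan',
                                                controls: controls,
                                                statistics: { scanner_version: '1.0.0' })
puts results.to_hdf
```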
data/lib/heimdall_tools/help/aws_config_mapper.md
ADDED
@@ -0,0 +1,30 @@
+aws_config_mapper pulls Ruby AWS SDK data to translate AWS Config Rule results into HDF format json to be viewable in Heimdall
+
+AWS Config Rule Mapping:
+The mapping of AWS Config Rules to 800-53 Controls was sourced from [this link](https://docs.aws.amazon.com/config/latest/developerguide/operational-best-practices-for-nist-800-53_rev_4.html).
+
+Authentication with AWS:
+[Developer Guide for configuring Ruby AWS SDK for authentication](https://docs.aws.amazon.com/sdk-for-ruby/v3/developer-guide/setup-config.html)
+
+Authentication Example:
+
+- Create `~/.aws/credentials`
+- Add contents to file, replacing with your access ID and key
+
+```
+[default]
+aws_access_key_id = your_access_key_id
+aws_secret_access_key = your_secret_access_key
+```
+
+- (optional) set AWS region through `~/.aws/config` file with contents
+
+```
+[default]
+output = json
+region = us-gov-west-1
+```
+
+Examples:
+
+heimdall_tools aws_config_mapper -o aws_config_results.json
data/lib/heimdall_tools/jfrog_xray_mapper.rb
ADDED
@@ -0,0 +1,142 @@
+require 'json'
+require 'csv'
+require 'heimdall_tools/hdf'
+require 'utilities/xml_to_hash'
+
+RESOURCE_DIR = Pathname.new(__FILE__).join('../../data')
+
+CWE_NIST_MAPPING_FILE = File.join(RESOURCE_DIR, 'cwe-nist-mapping.csv')
+
+IMPACT_MAPPING = {
+  high: 0.7,
+  medium: 0.5,
+  low: 0.3,
+}.freeze
+
+DEFAULT_NIST_TAG = ["SA-11", "RA-5"].freeze
+
+# Loading spinner sign
+$spinner = Enumerator.new do |e|
+  loop do
+    e.yield '|'
+    e.yield '/'
+    e.yield '-'
+    e.yield '\\'
+  end
+end
+
+module HeimdallTools
+  class JfrogXrayMapper
+    def initialize(xray_json, name=nil, verbose = false)
+      @xray_json = xray_json
+      @verbose = verbose
+
+      begin
+        @cwe_nist_mapping = parse_mapper
+        @project = JSON.parse(xray_json)
+
+      rescue StandardError => e
+        raise "Invalid JFrog Xray JSON file provided Exception: #{e}"
+      end
+    end
+
+    def finding(vulnerability)
+      finding = {}
+      finding['status'] = 'failed'
+      finding['code_desc'] = []
+      finding['code_desc'] << "source_comp_id : #{vulnerability['source_comp_id'].to_s }"
+      finding['code_desc'] << "vulnerable_versions : #{vulnerability['component_versions']['vulnerable_versions'].to_s }"
+      finding['code_desc'] << "fixed_versions : #{vulnerability['component_versions']['fixed_versions'].to_s }"
+      finding['code_desc'] << "issue_type : #{vulnerability['issue_type'].to_s }"
+      finding['code_desc'] << "provider : #{vulnerability['provider'].to_s }"
+      finding['code_desc'] = finding['code_desc'].join("\n")
+      finding['run_time'] = NA_FLOAT
+
+      # Xray results does not profile scan timestamp; using current time to satisfy HDF format
+      finding['start_time'] = NA_STRING
+      [finding]
+    end
+
+    def nist_tag(cweid)
+      entries = @cwe_nist_mapping.select { |x| cweid.include?(x[:cweid].to_s) && !x[:nistid].nil? }
+      tags = entries.map { |x| x[:nistid] }
+      tags.empty? ? DEFAULT_NIST_TAG : tags.flatten.uniq
+    end
+
+    def parse_identifiers(vulnerability, ref)
+      # Extracting id number from reference style CWE-297
+      vulnerability['component_versions']['more_details']['cves'][0][ref.downcase].map { |e| e.split("#{ref}-")[1] }
+    rescue
+      return []
+    end
+
+    def impact(severity)
+      IMPACT_MAPPING[severity.downcase.to_sym]
+    end
+
+    def parse_mapper
+      csv_data = CSV.read(CWE_NIST_MAPPING_FILE, **{ encoding: 'UTF-8',
+                                                     headers: true,
+                                                     header_converters: :symbol,
+                                                     converters: :all })
+      csv_data.map(&:to_hash)
+    end
+
+    def desc_tags(data, label)
+      { "data": data || NA_STRING, "label": label || NA_STRING }
+    end
+
+    # Xray report could have multiple vulnerability entries for multiple findings of same issue type.
+    # The meta data is identical across entries
+    # method collapse_duplicates return unique controls with applicable findings collapsed into it.
+    def collapse_duplicates(controls)
+      unique_controls = []
+
+      controls.map { |x| x['id'] }.uniq.each do |id|
+        collapsed_results = controls.select { |x| x['id'].eql?(id) }.map {|x| x['results']}
+        unique_control = controls.find { |x| x['id'].eql?(id) }
+        unique_control['results'] = collapsed_results.flatten
+        unique_controls << unique_control
+      end
+      unique_controls
+    end
+
+    def to_hdf
+      controls = []
+      vulnerability_count = 0
+      @project['data'].uniq.each do | vulnerability |
+        printf("\rProcessing: %s", $spinner.next)
+
+        vulnerability_count +=1
+        item = {}
+        item['tags'] = {}
+        item['descriptions'] = []
+        item['refs'] = NA_ARRAY
+        item['source_location'] = NA_HASH
+        item['descriptions'] = NA_ARRAY
+
+        # Xray JSONs might note have `id` fields populated.
+        # If thats a case MD5 hash is used to collapse vulnerability findings of the same type.
+        item['id'] = vulnerability['id'].empty? ? OpenSSL::Digest::MD5.digest(vulnerability['summary'].to_s).unpack("H*")[0].to_s : vulnerability['id']
+        item['title'] = vulnerability['summary'].to_s
+        item['desc'] = vulnerability['component_versions']['more_details']['description'].to_s
+        item['impact'] = impact(vulnerability['severity'].to_s)
+        item['code'] = NA_STRING
+        item['results'] = finding(vulnerability)
+
+        item['tags']['nist'] = nist_tag( parse_identifiers( vulnerability, 'CWE') )
+        item['tags']['cweid'] = parse_identifiers( vulnerability, 'CWE')
+
+        controls << item
+      end
+
+      controls = collapse_duplicates(controls)
+      results = HeimdallDataFormat.new(profile_name: "JFrog Xray Scan",
+                                       version: NA_STRING,
+                                       title: "JFrog Xray Scan",
+                                       summary: "Continuous Security and Universal Artifact Analysis",
+                                       controls: controls)
+      results.to_hdf
+    end
+  end
+end
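Usage note (not part of the diff): mirroring the `jfrog_xray_mapper` CLI command added in cli.rb above, programmatic use looks roughly like this (filenames illustrative):

```
require 'heimdall_tools'

hdf = HeimdallTools::JfrogXrayMapper.new(File.read('xray_results.json')).to_hdf
File.write('xray_results_hdf.json', hdf)
```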
data/lib/heimdall_tools/nessus_mapper.rb
CHANGED
@@ -140,7 +140,7 @@ module HeimdallTools
     end
 
     def plugin_nist_tag(pluginfamily, pluginid)
-      entries = @cwe_nist_mapping.select { |x| (x[:pluginfamily].eql?(pluginfamily) && (x[:pluginid].eql?('*') || x[:pluginid].eql?(pluginid.to_i)) ) }
+      entries = @cwe_nist_mapping.select { |x| (x[:pluginfamily].eql?(pluginfamily) && (x[:pluginid].eql?('*') || x[:pluginid].eql?(pluginid.to_i)) ) && !x[:nistid].nil? }
       tags = entries.map { |x| [x[:nistid].split('|'), "Rev_#{x[:rev]}"] }
       tags.empty? ? DEFAULT_NIST_TAG : tags.flatten.uniq
     end
data/lib/heimdall_tools/nikto_mapper.rb
CHANGED
@@ -71,7 +71,7 @@ module HeimdallTools
     end
 
     def nist_tag(niktoid)
-      entries = @nikto_nist_mapping.select { |x| niktoid.eql?(x[:niktoid].to_s) }
+      entries = @nikto_nist_mapping.select { |x| niktoid.eql?(x[:niktoid].to_s) && !x[:nistid].nil? }
       tags = entries.map { |x| x[:nistid] }
       tags.empty? ? DEFAULT_NIST_TAG : tags.flatten.uniq
     end
data/lib/heimdall_tools/snyk_mapper.rb
CHANGED
@@ -74,7 +74,7 @@ module HeimdallTools
     end
 
     def nist_tag(cweid)
-      entries = @cwe_nist_mapping.select { |x| cweid.include?
+      entries = @cwe_nist_mapping.select { |x| cweid.include?(x[:cweid].to_s) && !x[:nistid].nil? }
       tags = entries.map { |x| x[:nistid] }
       tags.empty? ? DEFAULT_NIST_TAG : tags.flatten.uniq
     end
data/lib/heimdall_tools/sonarqube_mapper.rb
CHANGED
@@ -5,6 +5,8 @@ require 'heimdall_tools/hdf'
 
 RESOURCE_DIR = Pathname.new(__FILE__).join('../../data')
 
+DEFAULT_NIST_TAG = ["SA-11", "RA-5"].freeze
+
 MAPPING_FILES = {
   cwe: File.join(RESOURCE_DIR, 'cwe-nist-mapping.csv'),
   owasp: File.join(RESOURCE_DIR, 'owasp-nist-mapping.csv')
@@ -237,7 +239,7 @@ class Control
       return [@mappings[tag_type][parsed_tag]].flatten.uniq
     end
 
-
+    DEFAULT_NIST_TAG # Entries with unmapped NIST tags are defaulted to NIST tags ‘SA-11, RA-5 Rev_4’
   end
 
   def hdf
data/lib/heimdall_tools/zap_mapper.rb
CHANGED
@@ -7,6 +7,7 @@ require 'heimdall_tools/hdf'
 RESOURCE_DIR = Pathname.new(__FILE__).join('../../data')
 
 CWE_NIST_MAPPING_FILE = File.join(RESOURCE_DIR, 'cwe-nist-mapping.csv')
+DEFAULT_NIST_TAG = ["SA-11", "RA-5"].freeze
 
 # rubocop:disable Metrics/AbcSize
 
@@ -64,9 +65,9 @@ module HeimdallTools
     end
 
     def nist_tag(cweid)
-      entries = @cwe_nist_mapping.select { |x| x[:cweid].to_s.eql?(cweid.to_s) }
+      entries = @cwe_nist_mapping.select { |x| x[:cweid].to_s.eql?(cweid.to_s) && !x[:nistid].nil? }
       tags = entries.map { |x| [x[:nistid], "Rev_#{x[:rev]}"] }
-      tags.empty? ?
+      tags.empty? ? DEFAULT_NIST_TAG : tags.flatten.uniq
     end
 
     def impact(riskcode)
metadata
CHANGED
@@ -1,7 +1,7 @@
 --- !ruby/object:Gem::Specification
 name: heimdall_tools
 version: !ruby/object:Gem::Version
-  version: 1.3.34
+  version: 1.3.39.pre1
 platform: ruby
 authors:
 - Robert Thew
@@ -10,8 +10,22 @@ authors:
 autorequire:
 bindir: exe
 cert_chain: []
-date:
+date: 2021-03-11 00:00:00.000000000 Z
 dependencies:
+- !ruby/object:Gem::Dependency
+  name: aws-sdk-configservice
+  requirement: !ruby/object:Gem::Requirement
+    requirements:
+    - - "~>"
+      - !ruby/object:Gem::Version
+        version: '1'
+  type: :runtime
+  prerelease: false
+  version_requirements: !ruby/object:Gem::Requirement
+    requirements:
+    - - "~>"
+      - !ruby/object:Gem::Version
+        version: '1'
 - !ruby/object:Gem::Dependency
   name: nokogiri
   requirement: !ruby/object:Gem::Requirement
@@ -96,20 +110,6 @@ dependencies:
     - - "~>"
       - !ruby/object:Gem::Version
         version: '2.1'
-- !ruby/object:Gem::Dependency
-  name: nori
-  requirement: !ruby/object:Gem::Requirement
-    requirements:
-    - - "~>"
-      - !ruby/object:Gem::Version
-        version: '2.6'
-  type: :runtime
-  prerelease: false
-  version_requirements: !ruby/object:Gem::Requirement
-    requirements:
-    - - "~>"
-      - !ruby/object:Gem::Version
-        version: '2.6'
 - !ruby/object:Gem::Dependency
   name: git-lite-version-bump
   requirement: !ruby/object:Gem::Requirement
@@ -166,20 +166,6 @@ dependencies:
   - - ">="
     - !ruby/object:Gem::Version
       version: '0'
-- !ruby/object:Gem::Dependency
-  name: codeclimate-test-reporter
-  requirement: !ruby/object:Gem::Requirement
-    requirements:
-    - - ">="
-      - !ruby/object:Gem::Version
-        version: '0'
-  type: :development
-  prerelease: false
-  version_requirements: !ruby/object:Gem::Requirement
-    requirements:
-    - - ">="
-      - !ruby/object:Gem::Version
-        version: '0'
 - !ruby/object:Gem::Dependency
   name: rake
   requirement: !ruby/object:Gem::Requirement
@@ -209,24 +195,31 @@ files:
 - Rakefile
 - exe/heimdall_tools
 - lib/data/U_CCI_List.xml
+- lib/data/aws-config-mapping.csv
 - lib/data/cwe-nist-mapping.csv
 - lib/data/nessus-plugins-nist-mapping.csv
 - lib/data/nikto-nist-mapping.csv
 - lib/data/owasp-nist-mapping.csv
 - lib/heimdall_tools.rb
+- lib/heimdall_tools/aws_config_mapper.rb
 - lib/heimdall_tools/burpsuite_mapper.rb
 - lib/heimdall_tools/cli.rb
 - lib/heimdall_tools/command.rb
+- lib/heimdall_tools/dbprotect_mapper.rb
 - lib/heimdall_tools/fortify_mapper.rb
 - lib/heimdall_tools/hdf.rb
 - lib/heimdall_tools/help.rb
+- lib/heimdall_tools/help/aws_config_mapper.md
 - lib/heimdall_tools/help/burpsuite_mapper.md
+- lib/heimdall_tools/help/dbprotect_mapper.md
 - lib/heimdall_tools/help/fortify_mapper.md
+- lib/heimdall_tools/help/jfrog_xray_mapper.md
 - lib/heimdall_tools/help/nessus_mapper.md
 - lib/heimdall_tools/help/nikto_mapper.md
 - lib/heimdall_tools/help/snyk_mapper.md
 - lib/heimdall_tools/help/sonarqube_mapper.md
 - lib/heimdall_tools/help/zap_mapper.md
+- lib/heimdall_tools/jfrog_xray_mapper.rb
 - lib/heimdall_tools/nessus_mapper.rb
 - lib/heimdall_tools/nikto_mapper.rb
 - lib/heimdall_tools/snyk_mapper.rb
@@ -249,11 +242,11 @@ required_ruby_version: !ruby/object:Gem::Requirement
       version: '0'
 required_rubygems_version: !ruby/object:Gem::Requirement
   requirements:
-  - - "
+  - - ">"
     - !ruby/object:Gem::Version
-      version:
+      version: 1.3.1
 requirements: []
-rubygems_version: 3.
+rubygems_version: 3.2.3
 signing_key:
 specification_version: 4
 summary: Convert Forify, Openzap and Sonarqube results to HDF