splunk-pickaxe 2.0.0
- checksums.yaml +7 -0
- data/Gemfile +11 -0
- data/README.md +220 -0
- data/Rakefile +7 -0
- data/bin/pickaxe +6 -0
- data/lib/splunk/pickaxe.rb +30 -0
- data/lib/splunk/pickaxe/cli.rb +59 -0
- data/lib/splunk/pickaxe/client.rb +36 -0
- data/lib/splunk/pickaxe/config.rb +65 -0
- data/lib/splunk/pickaxe/objects.rb +125 -0
- data/lib/splunk/pickaxe/objects/alerts.rb +77 -0
- data/lib/splunk/pickaxe/objects/dashboards.rb +33 -0
- data/lib/splunk/pickaxe/objects/eventtypes.rb +20 -0
- data/lib/splunk/pickaxe/objects/field_extractions.rb +38 -0
- data/lib/splunk/pickaxe/objects/reports.rb +66 -0
- data/lib/splunk/pickaxe/objects/tags.rb +65 -0
- data/lib/splunk/pickaxe/version.rb +7 -0
- data/project.yml +34 -0
- metadata +132 -0
checksums.yaml
ADDED
@@ -0,0 +1,7 @@
---
SHA1:
  metadata.gz: 8bf76e38f6793776895f41afc8c1e345c1e0e3fe
  data.tar.gz: b85f42c837a2d3deb8dee1c2db57406675389409
SHA512:
  metadata.gz: a309da7903abab3668482669ac16e332670fabc9aaa62a0ba5da0e529f51918c4eda4e2aaab5399063728178513dd38a835170969b7bf2380237d4711dd69684
  data.tar.gz: e29feda288ce0bfc6a77b60b51cd779f09fcb9d217a3119be8a2790488c1219037a28398186eb560b1a739a406f7eb81a423740b99a0800ddd210011b1eb5f3d
data/Gemfile
ADDED
data/README.md
ADDED
@@ -0,0 +1,220 @@
Splunk-Pickaxe
==============

A tool for syncing your repo of Splunk objects with a Splunk instance(s).

This provides a development workflow for Splunk components (i.e. dashboards,
alerts, reports, etc.) and an easy way to apply them consistently.

Getting Started
---------------

Install the gem,

    gem install splunk-pickaxe

Create your repo,

    mkdir my-splunk-repo
    cd my-splunk-repo
    pickaxe init

Update the `.pickaxe.yml` file with your app name from Splunk. This should
be the name from the URL when you visit your Splunk app (i.e. if your URL is
`https://my-splunk.com/en-US/app/my_splunk_app/search`, your app is `my_splunk_app`).

You also need to add your Splunk environment(s). So your `.pickaxe.yml` should
look like this,

```yaml
namespace:
  # The application in which to create the Splunk knowledge objects
  app: MY_SPLUNK_APP

environments:
  ENVIRONMENT_NAME: SPLUNK_API_URL (i.e. https://search-head.my-splunk.com:8089)
```

Add some Splunk objects. See the [example repo](example-repo) or below for the format.

Sync your repo with Splunk,

    pickaxe sync ENVIRONMENT_NAME

Where `ENVIRONMENT_NAME` is the name of one of the environments configured in
your `.pickaxe.yml`. These map to different Splunk instances.

By default this command assumes the user running it has a Splunk account
and access to make these changes in the configured Splunk application. Your
password will be requested when run. Alternatively you can make use of the
`--user` and `--password` options.

Splunk Objects
--------------

Currently the following objects are supported,

* [alerts](http://docs.splunk.com/Documentation/Splunk/latest/RESTREF/RESTsearch#saved.2Fsearches)
* [dashboards](http://docs.splunk.com/Documentation/Splunk/latest/RESTREF/RESTknowledge#data.2Fui.2Fviews.2F.7Bname.7D)
* [eventtypes](http://docs.splunk.com/Documentation/Splunk/latest/RESTREF/RESTknowledge#saved.2Feventtypes)
* [reports](http://docs.splunk.com/Documentation/Splunk/latest/RESTREF/RESTsearch#saved.2Fsearches)
* [tags](http://docs.splunk.com/Documentation/Splunk/latest/RESTREF/RESTknowledge#search.2Ftags.2F.7Btag_name.7D)
* [field_extractions](http://docs.splunk.com/Documentation/Splunk/latest/RESTREF/RESTknowledge#data.2Fprops.2Fextractions)

### Alerts

To add a new alert to the repo, simply create a new `ALERT.yml` file under `alerts`.
This YAML file should contain the following at minimum,

```yaml
name: ALERT NAME
config:
  # Search query of events used to trigger the alert
  search: >
    MY SEARCH
```

It will be populated with the alert defaults, which include,

* Running hourly
* Emailing everyone listed in your `.pickaxe.yml`
* Alerting if the search returns any events

You can override these defaults or any other property by specifying the property
under the `config` section. [This doc](http://docs.splunk.com/Documentation/Splunk/latest/RESTREF/RESTsearch#saved.2Fsearches.2F.7Bname.7D)
contains all the properties for an alert.

#### Common Overrides

* To tweak the schedule, set `cron_schedule` in the `config` section. By default it is set up to run every hour. This should be a cron value (i.e. `0 10 * * 1` to run every Monday at 10am)
* To tweak the length of the search, or how far back it looks, set `dispatch.earliest_time` in the `config` section. By default this is set up to search over the last hour. You could change this to `-7d@h` to search the last 7 days
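For instance, an alert that overrides both of these defaults might look like the following sketch (the name and search are placeholders):

```yaml
name: Weekly Error Alert
config:
  search: >
    index=my_index level=ERROR
  # Run every Monday at 10am instead of hourly
  cron_schedule: 0 10 * * 1
  # Search over the last 7 days instead of the last hour
  dispatch.earliest_time: -7d@h
```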
### Dashboards

To add a new dashboard, create a new `DASHBOARD.xml` file under `dashboards`.
The name of the file should be the name of the dashboard. If the file is
`my_dashboard.xml`, this means the name is `my_dashboard`.

The file should contain the XML source of your dashboard. You can get this
by selecting your dashboard in Splunk, then Edit -> Edit Source. Copy the XML
and paste it into the file.

### Eventtypes

To add a new eventtype, create a new `eventtype.yml` file under `eventtypes`.
The name of the file should be the name of the eventtype. If the file is
`my_eventtype.yml`, this means the name is `my_eventtype`.

```yaml
name: EVENTTYPE NAME
config:
  disabled: <1|0> Set to 1 to disable
  search: index=my_index more search things
  priority: <1-10> 1 is highest priority and 10 is lowest
  tags: <string> Enter a comma-separated list of tags
  ...
```

### Reports

To add a new report to the repo, simply create a new `REPORT.yml` file under `reports`.
This YAML file should contain the following at minimum,

```yaml
name: REPORT NAME
config:
  # Search query of events in the report
  search: >
    MY SEARCH
```

It will be populated with the report defaults, which include,

* Running hourly
* Emailing everyone listed in your `.pickaxe.yml`

You can override these defaults or any other property by specifying the property
under the `config` section. [This doc](http://docs.splunk.com/Documentation/Splunk/latest/RESTREF/RESTsearch#saved.2Fsearches.2F.7Bname.7D)
contains all the properties for a report.

#### Common Overrides

* To tweak the schedule, set `cron_schedule` in the `config` section. By default it is set up to run every hour. This should be a cron value (i.e. `0 10 * * 1` to run every Monday at 10am)
* To tweak the length of the search, or how far back it looks, set `dispatch.earliest_time` in the `config` section. By default this is set up to search over the last hour. You could change this to `-7d@h` to search the last 7 days

### Tags

To add a new tag to the repo, simply create a new `TAG.yml` file under `tags`.
This YAML file should contain the following at minimum,

```yaml
name: TAG
fields:
  - source::/some/file
  - sourcetype::my_type
  ...
```

The important config here is the `fields` key, which contains the array of fields
you want in your tag. The values should be of the form `FIELD::VALUE`, where `FIELD`
can be things like `source`, `sourcetype`, `eventtype`, or any other Splunk field.

### Field Extractions

To add a new field extraction to the repo, simply create a new `field_extraction.yml`
file under `field_extractions`. This YAML file should contain the following,

```yaml
name: my field extraction
config:
  # The props.conf stanza to which this field extraction applies, e.g. the
  # sourcetype or source that triggers this field extraction
  stanza: See Comment Above
  # If using the EXTRACT type, this is the regular expression
  # If using the REPORT type, specify a comma- or space-delimited list of transforms.conf
  # stanza names that define the field transformations to apply
  value: See Comment Above
  # Use EXTRACT for inline regular expressions. Use REPORT for a transforms.conf stanza
  type: EXTRACT (or) REPORT
```

### Generic Configuration

Most Splunk objects offer the following additional configuration,

* `envs`: The list of environments the object should be deployed to

```yaml
name: MY_OBJECT_NAME
envs:
  # Only deploy to the dev environment
  - dev
```

By default, if `envs` is not provided, the object will be imported to all
environments.
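The env filtering described above can be sketched in plain Ruby; this mirrors the `skip?` check in `objects.rb` (the sample entity hashes are illustrative):

```ruby
# Decide whether an object should be skipped for a given environment.
# An object with no 'envs' key is deployed everywhere.
def skip?(entity, environment)
  return false unless entity.key?('envs')
  !entity['envs'].include?(environment)
end

everywhere = { 'name' => 'my_object' }
dev_only   = { 'name' => 'my_object', 'envs' => ['dev'] }

puts skip?(everywhere, 'production')  # false - deployed to all environments
puts skip?(dev_only, 'production')    # true  - only deployed to dev
puts skip?(dev_only, 'dev')           # false
```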
Config
------

The `.pickaxe.yml` file contains the config for your Splunk resources. You can add,

* `environments`: A hash of environment names (keys) to Splunk URLs
* `namespace`:
  * `app`: The name of the Splunk application to deploy objects to
  * `sharing`: The sharing setting for the Splunk resources (default=`app`)
* `emails`: An array of emails used for all reports and alerts (default=`[]`)
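Putting those keys together, a complete `.pickaxe.yml` might look like this sketch (all names, URLs, and emails are placeholders):

```yaml
namespace:
  app: my_splunk_app
  sharing: app

environments:
  dev: https://dev-search-head.my-splunk.com:8089
  production: https://search-head.my-splunk.com:8089

emails:
  - my.email@domain.com
```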
Contributing
------------

See [CONTRIBUTING.md](CONTRIBUTING.md)

LICENSE
-------

Copyright 2015 Cerner Innovation, Inc.

Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at

[http://www.apache.org/licenses/LICENSE-2.0](http://www.apache.org/licenses/LICENSE-2.0)

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
data/Rakefile
ADDED
data/bin/pickaxe
ADDED
data/lib/splunk/pickaxe.rb
ADDED
@@ -0,0 +1,30 @@
# frozen_string_literal: true

require 'splunk-sdk-ruby'
require 'uri'
require 'splunk/pickaxe/config'
require 'splunk/pickaxe/client'

module Splunk
  module Pickaxe
    def self.configure(environment, username, password, execution_path = Dir.getwd)
      config = Config.load execution_path

      raise "Unknown environment [#{environment}]. Expected #{config.environments.keys}" unless config.environments.key?(environment)

      uri = URI(config.environments[environment])

      puts "Connecting to splunk [#{uri}]"
      service = Splunk.connect(
        scheme: uri.scheme.to_sym,
        host: uri.host,
        port: uri.port,
        username: username,
        password: password,
        namespace: config.namespace
      )

      Client.new service, environment.downcase, config
    end
  end
end
data/lib/splunk/pickaxe/cli.rb
ADDED
@@ -0,0 +1,59 @@
# frozen_string_literal: true

require 'thor'
require 'etc'
require 'splunk/pickaxe'
require 'highline'

module Splunk
  module Pickaxe
    class CLI < Thor
      desc 'init', 'initializes your splunk repo'
      def init
        puts 'Creating Splunk Object directories...'
        [
          Alerts::DIR,
          Dashboards::DIR,
          EventTypes::DIR,
          Reports::DIR,
          Tags::DIR,
          FieldExtractions::DIR
        ].each do |dir|
          Dir.mkdir dir unless Dir.exist? dir
        end

        puts 'Writing Gemfile ...'
        File.open('Gemfile', 'w') do |f|
          f.puts 'source "https://rubygems.org"'
          f.puts
          f.puts 'gem "splunk-pickaxe"'
        end

        puts 'Writing .pickaxe.yml ...'
        File.open('.pickaxe.yml', 'w') do |f|
          f.puts 'namespace:'
          f.puts '  app: TODO'
          f.puts 'environments:'
          f.puts '  MY_ENV: SPLUNK_API_URL'
          f.puts 'emails:'
          f.puts '  - my.email@domain.com'
        end
      end

      desc 'sync ENVIRONMENT', 'sync your splunk repo to the given environment'
      option :user, type: :string, desc: 'The user to login to splunk with. If this is not provided it will use the current user'
      option :password, type: :string, desc: 'The password to login to splunk with. If this is not provided it will ask for a password'
      option :repo_path, type: :string, desc: 'The path to the repo. If this is not specified it is assumed you are executing from within the repo'
      def sync(environment)
        cli = HighLine.new

        user = options[:user] || Etc.getlogin
        password = options[:password] || cli.ask('Password: ') { |o| o.echo = '*' }
        execution_path = options[:repo_path] || Dir.getwd

        pickaxe = Pickaxe.configure environment, user, password, execution_path
        pickaxe.sync_all
      end
    end
  end
end
data/lib/splunk/pickaxe/client.rb
ADDED
@@ -0,0 +1,36 @@
# frozen_string_literal: true

require 'splunk/pickaxe/objects/alerts'
require 'splunk/pickaxe/objects/dashboards'
require 'splunk/pickaxe/objects/eventtypes'
require 'splunk/pickaxe/objects/reports'
require 'splunk/pickaxe/objects/tags'
require 'splunk/pickaxe/objects/field_extractions'

module Splunk
  module Pickaxe
    class Client
      attr_reader :service, :alerts, :dashboards, :eventtypes, :reports, :tags, :field_extractions

      def initialize(service, environment, config)
        @service = service

        @alerts = Alerts.new service, environment, config
        @dashboards = Dashboards.new service, environment, config
        @eventtypes = EventTypes.new service, environment, config
        @reports = Reports.new service, environment, config
        @tags = Tags.new service, environment, config
        @field_extractions = FieldExtractions.new service, environment, config
      end

      def sync_all
        @alerts.sync
        @dashboards.sync
        @eventtypes.sync
        @reports.sync
        @tags.sync
        @field_extractions.sync
      end
    end
  end
end
data/lib/splunk/pickaxe/config.rb
ADDED
@@ -0,0 +1,65 @@
# frozen_string_literal: true

require 'yaml'

module Splunk
  module Pickaxe
    class Config
      CONFIG_FILE ||= '.pickaxe.yml'

      DEFAULTS ||= {
        'namespace' => {
          'sharing' => 'app'
        },
        'environments' => {
        },
        'emails' => []
      }.freeze

      def self.load(execution_path)
        config_path = File.join(execution_path, CONFIG_FILE)
        raise "Unable to load config file [#{config_path}]" unless File.exist? config_path

        # Merges DEFAULTS with yaml config
        Config.new deep_merge(DEFAULTS, YAML.load_file(config_path)), execution_path
      end

      attr_reader :config, :namespace, :environments, :execution_path

      def initialize(config, execution_path)
        unless config['namespace'].key? 'app'
          raise "Config must have a 'namespace / app' config"
        end

        raise "Must have at least one environment" unless config['environments'].size > 0

        @config = config
        @execution_path = execution_path

        @environments = config['environments']

        # Convert namespace config hash to hash with symbols for keys
        namespace_config = config['namespace'].each_with_object({}) { |(k, v), memo| memo[k.to_sym] = v }
        @namespace = Splunk.namespace(namespace_config)
      end

      private

      # Simple deep merge of two hashes
      def self.deep_merge(hash1, hash2)
        copy = Hash[hash1]

        hash2.each do |key, value|
          if value.kind_of?(Hash) && hash1[key].kind_of?(Hash)
            copy[key] = deep_merge(hash1[key], value)
          else
            copy[key] = value
          end
        end

        copy
      end
    end
  end
end
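The deep merge above can be exercised in isolation. This self-contained sketch repeats the same logic as `Config.deep_merge` and shows how a user's config overrides nested defaults (the sample hashes are illustrative):

```ruby
# Simple deep merge of two hashes, as in Config.deep_merge above:
# values from hash2 win, except nested hashes are merged recursively.
def deep_merge(hash1, hash2)
  copy = Hash[hash1]

  hash2.each do |key, value|
    if value.kind_of?(Hash) && hash1[key].kind_of?(Hash)
      copy[key] = deep_merge(hash1[key], value)
    else
      copy[key] = value
    end
  end

  copy
end

defaults = { 'namespace' => { 'sharing' => 'app' }, 'emails' => [] }
user     = { 'namespace' => { 'app' => 'my_splunk_app' } }

merged = deep_merge(defaults, user)
# The default 'sharing' survives alongside the user-supplied 'app',
# and untouched defaults like 'emails' carry over.
puts merged['namespace']  # {"sharing"=>"app", "app"=>"my_splunk_app"}
```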
data/lib/splunk/pickaxe/objects.rb
ADDED
@@ -0,0 +1,125 @@
# frozen_string_literal: true

require 'splunk-sdk-ruby'
require 'yaml'

# Base class for syncing splunk objects (dashboards, alerts, etc...)
module Splunk
  module Pickaxe
    class Objects
      attr_reader :service, :environment, :pickaxe_config

      def initialize(service, environment, pickaxe_config)
        @service = service
        @environment = environment
        @pickaxe_config = pickaxe_config
      end

      def sync
        puts "Syncing all #{entity_dir.capitalize}"

        dir = File.join(pickaxe_config.execution_path, entity_dir)

        unless Dir.exist? dir
          puts "The directory #{dir} does not exist. Not syncing #{entity_dir.capitalize}"
          return
        end

        Dir.entries(dir).each do |entity_file|
          entity_path = File.join(dir, entity_file)

          next unless File.file?(entity_path) && entity_file_extensions.any? { |ext| entity_path.end_with?(ext) }

          entity = config(entity_path)
          entity_name = name(entity)

          puts "- #{entity_name}"

          # Check if we should skip this entity
          if skip? entity
            puts ' Skipping'
            next
          end

          splunk_entity = find entity

          if splunk_entity.nil?
            # Entity does not exist, create it
            puts ' Creating ...'
            create entity
            puts ' Created!'
          else
            # Entity exists, check if it needs an update
            if needs_update? splunk_entity, entity
              puts ' Updating ...'
              update splunk_entity, entity
              puts ' Updated!'
            else
              puts ' Up to date!'
            end
          end
        end
      end

      def config(file_path)
        YAML.load_file file_path
      end

      def create(entity)
        entity_collection = Splunk::Collection.new service, splunk_resource
        entity_collection.create(name(entity), splunk_config(entity))
      end

      def update(splunk_entity, entity)
        splunk_entity.update(splunk_config(entity))
      end

      def find(entity)
        # Either return the entity or nil if it doesn't exist
        Splunk::Entity.new service, service.namespace, splunk_resource, name(entity)
      rescue Splunk::SplunkHTTPError => e
        if e.code == 404
          nil
        else
          raise e
        end
      end

      def needs_update?(splunk_entity, entity)
        splunk_config(entity).each do |k, v|
          return true if splunk_entity[k] != v
        end

        false
      end

      def skip?(entity)
        return false unless entity.key?('envs')
        !entity['envs'].include?(environment)
      end

      def name(entity)
        entity['name']
      end

      def splunk_config(entity)
        entity['config']
      end

      def entity_file_extensions
        ['.yml', '.yaml']
      end

      def splunk_resource
        # Must be implemented by child class
        nil
      end

      def entity_dir
        # Must be implemented by child class
        nil
      end
    end
  end
end
data/lib/splunk/pickaxe/objects/alerts.rb
ADDED
@@ -0,0 +1,77 @@
# frozen_string_literal: true

require 'yaml'
require 'splunk/pickaxe/objects'

module Splunk
  module Pickaxe
    class Alerts < Objects
      DIR ||= 'alerts'

      def splunk_resource
        %w[saved searches]
      end

      def entity_dir
        DIR
      end

      def name(entity)
        # The alert name contains the environment name
        "#{entity['name']} [#{environment.capitalize}]"
      end

      def splunk_config(entity_yaml)
        # Include default values
        config = alert_defaults

        # Override defaults with any config provided in yaml
        config.merge! entity_yaml['config']

        config
      end

      def alert_defaults
        {
          # Who to email
          'action.email.to' => pickaxe_config.config['emails'].join(','),

          # How often to run alert (every hour)
          'cron_schedule' => '0 * * * *',
          'is_scheduled' => '1',

          # Email subject
          'action.email.subject' => 'Splunk Alert: $name$',
          'action.email.subject.alert' => 'Splunk Alert: $name$',

          # Email result formatting (inline results, table format, include alert link)
          'action.email.format' => 'table',
          'action.email.inline' => '1',
          'action.email.include.view_link' => '1',

          # Is an email alert
          'actions' => 'email',
          'action.email.sendresults' => '1',

          # Alert severity (High)
          'alert.severity' => '4',

          # When to trigger alert
          'alert_type' => 'number of events',
          'alert_comparator' => 'greater than',
          'alert_threshold' => '0',

          # The time bounds for alert search
          'dispatch.earliest_time' => '-1h',
          'dispatch.latest_time' => 'now',

          # Track alerts
          'alert.track' => '1',

          # Don't suppress any alerts
          'alert.suppress' => '0'
        }
      end
    end
  end
end
data/lib/splunk/pickaxe/objects/dashboards.rb
ADDED
@@ -0,0 +1,33 @@
# frozen_string_literal: true

require 'splunk/pickaxe/objects'

module Splunk
  module Pickaxe
    class Dashboards < Objects
      DIR ||= 'dashboards'

      def splunk_resource
        %w[data ui views]
      end

      def entity_dir
        DIR
      end

      def config(file_path)
        # Dashboards don't have many properties, just name and source XML
        {
          'name' => File.basename(file_path, '.xml'),
          'config' => {
            'eai:data' => IO.read(file_path)
          }
        }
      end

      def entity_file_extensions
        ['.xml']
      end
    end
  end
end
data/lib/splunk/pickaxe/objects/eventtypes.rb
ADDED
@@ -0,0 +1,20 @@
# frozen_string_literal: true

require 'yaml'
require 'splunk/pickaxe/objects'

module Splunk
  module Pickaxe
    class EventTypes < Objects
      DIR ||= 'eventtypes'

      def splunk_resource
        %w[saved eventtypes]
      end

      def entity_dir
        DIR
      end
    end
  end
end
data/lib/splunk/pickaxe/objects/field_extractions.rb
ADDED
@@ -0,0 +1,38 @@
# frozen_string_literal: true

require 'splunk/pickaxe/objects'

module Splunk
  module Pickaxe
    class FieldExtractions < Objects
      DIR ||= 'field_extractions'

      def splunk_resource
        %w[data props extractions]
      end

      def entity_dir
        DIR
      end

      def find(entity)
        # Splunk does some fun things by re-naming our field extraction to include
        # the stanza and type in the name when it's created, so do that here by
        # cloning the entity and editing its name before passing it to find
        find_entity = entity.clone
        find_entity['name'] = "#{entity['config']['stanza']} : #{entity['config']['type']}-#{entity['name']}"
        super(find_entity)
      end

      def update(splunk_entity, entity)
        # When updating, splunk only wants the value field
        splunk_entity.update('value' => splunk_config(entity)['value'])
      end

      def needs_update?(splunk_entity, entity)
        # When updating, splunk only cares about this field
        splunk_entity['value'] != splunk_config(entity)['value']
      end
    end
  end
end
data/lib/splunk/pickaxe/objects/reports.rb
ADDED
@@ -0,0 +1,66 @@
# frozen_string_literal: true

require 'yaml'
require 'splunk/pickaxe/objects'

module Splunk
  module Pickaxe
    class Reports < Objects
      DIR ||= 'reports'

      def splunk_resource
        %w[saved searches]
      end

      def entity_dir
        DIR
      end

      def name(entity)
        # The report name contains the environment name
        "#{entity['name']} [#{environment.capitalize}]"
      end

      def splunk_config(entity_yaml)
        # Include default values
        config = report_defaults

        # Override defaults with any config provided in yaml
        config.merge! entity_yaml['config']

        config
      end

      def report_defaults
        {
          # Who to email
          'action.email.to' => pickaxe_config.config['emails'].join(','),

          # How often to run the report (every hour)
          'cron_schedule' => '0 * * * *',
          'is_scheduled' => '1',

          # Email subject
          'action.email.subject' => 'Splunk Report: $name$',
          'action.email.subject.alert' => 'Splunk Report: $name$',

          # Email result formatting (inline results, table format, include view link)
          'action.email.format' => 'table',
          'action.email.inline' => '1',
          'action.email.include.view_link' => '1',

          # Send an email
          'actions' => 'email',
          'action.email.sendresults' => '1',

          # This is a report so always send it
          'alert_type' => 'always',

          # The time bounds for the report search (1 hour)
          'dispatch.earliest_time' => '-1h',
          'dispatch.latest_time' => 'now'
        }
      end
    end
  end
end
data/lib/splunk/pickaxe/objects/tags.rb
ADDED
@@ -0,0 +1,65 @@
# frozen_string_literal: true

require 'yaml'
require 'splunk/pickaxe/objects'

module Splunk
  module Pickaxe
    class Tags < Objects
      DIR ||= 'tags'

      def splunk_resource
        %w[search tags]
      end

      def entity_dir
        DIR
      end

      # Tags do not follow the typical conventions that other splunk resources do
      # so we have to change the find/create/update methods
      def find(entity)
        # Either return the entity or nil if it doesn't exist
        response = service.request(method: :GET, resource: splunk_resource + [name(entity)])
        # Parse out fields
        atom_feed = Splunk::AtomFeed.new(response.body)
        atom_feed.entries.map { |e| e['title'] }
      rescue Splunk::SplunkHTTPError => e
        if e.code == 404
          nil
        else
          raise e
        end
      end

      def create(entity)
        # Create and update are the same thing. Pass in no known fields
        update [], entity
      end

      def update(splunk_entity, entity)
        # what we want - what's there = what we need to create/update
        (splunk_config(entity) - splunk_entity).each do |field|
          response = service.request(method: :POST, resource: splunk_resource + [name(entity)], body: { add: field })
          raise "Failed to add field to tag [#{response.code}] - [#{response.body}]" unless response.is_a? Net::HTTPSuccess
        end

        # what's there - what we want = what we need to remove
        (splunk_entity - splunk_config(entity)).each do |field|
          response = service.request(method: :POST, resource: splunk_resource + [name(entity)], body: { delete: field })
          raise "Failed to delete field from tag [#{response.code}] - [#{response.body}]" unless response.is_a? Net::HTTPSuccess
        end
      end

      def splunk_config(entity)
        entity['fields']
      end

      def needs_update?(splunk_entity, entity)
        # Compares the fields in our config vs what's in splunk
        splunk_config(entity).uniq.sort != splunk_entity.uniq.sort
      end
    end
  end
end
data/project.yml
ADDED
@@ -0,0 +1,34 @@
name: splunk-pickaxe
group_id: com.cerner.bigdata
artifact_id: splunk-pickaxe
github:
  project_url: http://github.cerner.com/bigdata/splunk-pickaxe

doc: rdoc
test: rake

philter:
  linters:
    - ruby
  ruby:
    exclusions:
      - target/**/*
      - vendor/**/*
      - spec/**/*

jira:
  url: https://jira.cerner.com
  component: 25899

snapshot_repository:
  id: cerner-rubygems-snapshot
  url: http://repo.snapshot.cerner.corp/rubygems/
snapshot_site_repository:
  id: bigdata-snapshot-site
  url: http://repo.bigdata.cerner.corp/nexus/content/repositories/bigdata-snapshot-site/
repository:
  id: cerner-rubygems-internal
  url: http://repo.release.cerner.corp/internal/rubygems/
site_repository:
  id: cerner-main-internal-site
  url: http://repo.release.cerner.corp/internal/site/
metadata
ADDED
@@ -0,0 +1,132 @@
--- !ruby/object:Gem::Specification
name: splunk-pickaxe
version: !ruby/object:Gem::Version
  version: 2.0.0
platform: ruby
authors:
- Bryan Baugher
autorequire:
bindir: bin
cert_chain: []
date: 2017-07-10 00:00:00.000000000 Z
dependencies:
- !ruby/object:Gem::Dependency
  name: splunk-sdk-ruby
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - "~>"
      - !ruby/object:Gem::Version
        version: '1.0'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - "~>"
      - !ruby/object:Gem::Version
        version: '1.0'
- !ruby/object:Gem::Dependency
  name: highline
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - "~>"
      - !ruby/object:Gem::Version
        version: '1.7'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - "~>"
      - !ruby/object:Gem::Version
        version: '1.7'
- !ruby/object:Gem::Dependency
  name: thor
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - "~>"
      - !ruby/object:Gem::Version
        version: '0.19'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - "~>"
      - !ruby/object:Gem::Version
        version: '0.19'
- !ruby/object:Gem::Dependency
  name: bundler
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - "~>"
      - !ruby/object:Gem::Version
        version: '1.9'
  type: :development
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - "~>"
      - !ruby/object:Gem::Version
        version: '1.9'
- !ruby/object:Gem::Dependency
  name: rake
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - "~>"
      - !ruby/object:Gem::Version
        version: '10.0'
  type: :development
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - "~>"
      - !ruby/object:Gem::Version
        version: '10.0'
description: 'A tool for syncing your repo of splunk objects with a splunk instance '
email:
- bryan.baugher@Cerner.com
executables:
- pickaxe
extensions: []
extra_rdoc_files: []
files:
- Gemfile
- README.md
- Rakefile
- bin/pickaxe
- lib/splunk/pickaxe.rb
- lib/splunk/pickaxe/cli.rb
- lib/splunk/pickaxe/client.rb
- lib/splunk/pickaxe/config.rb
- lib/splunk/pickaxe/objects.rb
- lib/splunk/pickaxe/objects/alerts.rb
- lib/splunk/pickaxe/objects/dashboards.rb
- lib/splunk/pickaxe/objects/eventtypes.rb
- lib/splunk/pickaxe/objects/field_extractions.rb
- lib/splunk/pickaxe/objects/reports.rb
- lib/splunk/pickaxe/objects/tags.rb
- lib/splunk/pickaxe/version.rb
- project.yml
homepage: http://github.com/Cerner/splunk-pickaxe
licenses:
- Apache-2.0
metadata: {}
post_install_message:
rdoc_options: []
require_paths:
- lib
required_ruby_version: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
    - !ruby/object:Gem::Version
      version: '0'
required_rubygems_version: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
    - !ruby/object:Gem::Version
      version: '0'
requirements: []
rubyforge_project:
rubygems_version: 2.5.1
signing_key:
specification_version: 4
summary: A tool for syncing your repo of splunk objects with a splunk instance
test_files: []