logstash-output-fir 0.9.0
- checksums.yaml +7 -0
- data/CHANGELOG.md +3 -0
- data/CONTRIBUTORS +11 -0
- data/DEVELOPER.md +2 -0
- data/Gemfile +3 -0
- data/LICENSE +13 -0
- data/NOTICE.TXT +5 -0
- data/README.md +191 -0
- data/lib/logstash/outputs/fir.rb +490 -0
- data/logstash-output-fir.gemspec +24 -0
- data/spec/outputs/fir_spec.rb +22 -0
- metadata +90 -0
checksums.yaml
ADDED
@@ -0,0 +1,7 @@
---
SHA1:
  metadata.gz: 636de71feea1e34b4579a613463af11d26006f4e
  data.tar.gz: e52e5056843d45177f5620748b84a8ffdde42a53
SHA512:
  metadata.gz: b179f2ffe18d37e7b0d1389002810610f80166007e379cb6c9dbc788a2db4b370f7f378bf2de4aebc98d6c08fa94ff18b33591d19d5f89a0f616408350444f66
  data.tar.gz: f65f9e809db4ebb81d0c6fda09b8a0f7832cf464453388cb5fe3429e2a072ee4b4be2d67082d018525cc20d67f8d152832170bf00d7e4f7e3f871fa2ece2cbcf
data/CHANGELOG.md
ADDED
data/CONTRIBUTORS
ADDED
@@ -0,0 +1,11 @@
The following is a list of people who have contributed ideas, code, bug
reports, or in general have helped logstash along its way.

Contributors:
* Aaron Mildenstein (untergeek)
* Pier-Hugues Pellerin (ph)

Note: If you've sent us patches, bug reports, or otherwise contributed to
Logstash, and you aren't on the list above and want to be, please let us know
and we'll make sure you're here. Contributions from folks like you are what make
open source awesome.
data/DEVELOPER.md
ADDED
data/Gemfile
ADDED
data/LICENSE
ADDED
@@ -0,0 +1,13 @@
Copyright (c) 2012–2016 Elasticsearch <http://www.elastic.co>

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
data/NOTICE.TXT
ADDED
data/README.md
ADDED
@@ -0,0 +1,191 @@
# Logstash Plugin

A Logstash output plugin that sends alerts to FIR, the incident management platform of CERT Société Générale (https://github.com/certsocietegenerale/FIR).
**Requires the manticore gem.**

Works with Logstash 5.x and older.

This is a plugin for [Logstash](https://github.com/elastic/logstash).

It is fully free and fully open source. The license is Apache 2.0, meaning you are pretty much free to use it however you want in whatever way.

## Install
```
env GEM_HOME=/usr/share/logstash/vendor/bundle/jruby/1.9 /usr/share/logstash/vendor/jruby/bin/jruby /usr/share/logstash/vendor/jruby/bin/gem build logstash-output-fir.gemspec
/usr/share/logstash/bin/logstash-plugin install logstash-output-fir-2.0.0.gem
```

## Main Configuration (logstash-output.conf)
**Database refresh: the plugin reads several configuration/DB files and picks up changes while running; changes are applied at each refresh interval. You can manage the config/DB files with a git workflow.**
Once an alert/event has been created in FIR, the plugin can later update it with new information (for example a new related alert), instead of creating a near-duplicate event.
* url_api_fir [string]: URI of the FIR REST API used to send alerts
  * Example: "https://127.0.0.1:8000/api/"
* refresh_interval_remote [numeric]: interval (seconds) between downloads of the FIR incidents DB (re-synchronisation: picks up events added or closed directly in FIR)
  * Example: 3600
  * If the DB contains many incidents, the download can take a long time.
* refresh_interval [numeric]: interval (seconds) between reloads of the FIR configuration
  * Example: 3600
* headers [hash]: HTTP headers, including your FIR API token
  * Example: {"Authorization" => "Token 0000000000000000000000000000", "Content-Type" => "application/json"}
* ssl_options [string]: Manticore SSL options, for example to disable certificate verification
  * Example for disabling verification: "{ :verify => :disable }"
* template_new [path]: path of the ERB file used to build the body/description of a new FIR event
  * Example: "/etc/logstash/db/template_new.erb" (template examples in folder: template_erb)
* template_update [path]: path of the ERB file used to build the body/description of an updated FIR event
  * Example: "/etc/logstash/db/template_update.erb" (template examples in folder: template_erb)
* subj_template_new [path]: path of the ERB file used to build the subject of a new FIR event
  * Example: "/etc/logstash/db/subject_template_new.erb" (template examples in folder: template_erb)
* subj_template_update [path]: path of the ERB file used to build the subject of an updated FIR event
  * Example: "/etc/logstash/db/subject_template_update.erb" (template examples in folder: template_erb)
* conffile [path]: configuration file holding the rules for creating events in FIR (filters and near-event filters)
  * Example: "/etc/logstash/db/conf_fir.json" (example in folder: sample_conf)
* subject_field [string]: name of the "subject" field of a FIR event
  * Default: "subject"
* body_field [string]: name of the "description" field of a FIR event
  * Default: "description"
* severity_field [string]: name of the "severity" field of a FIR event
  * Default: "severity"
* status_field [string]: name of the "status" field of a FIR event
  * Default: "status"

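Putting the options above together, a minimal output block might look like the following (the token and file paths are placeholders, not working values):

```
output {
  fir {
    url_api_fir => "https://127.0.0.1:8000/api/"
    headers => { "Authorization" => "Token 0000000000000000000000000000" "Content-Type" => "application/json" }
    ssl_options => "{ :verify => :disable }"
    refresh_interval => 3600
    refresh_interval_remote => 3600
    conffile => "/etc/logstash/db/conf_fir.json"
    template_new => "/etc/logstash/db/template_new.erb"
    template_update => "/etc/logstash/db/template_update.erb"
    subj_template_new => "/etc/logstash/db/subject_template_new.erb"
    subj_template_update => "/etc/logstash/db/subject_template_update.erb"
  }
}
```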
### conf_fir.json
This file contains the rules: the filters, the near-event filters, and the template selection used when creating events in FIR.
**The file is in JSON format.**
```
{ "rules": [
  {
    "filters": {"sig_detected_note": "3|4"},
    "subject_filter": "src_ip",
    "subject_filter_prefix": "-",
    "subject_filter_sufix": "-",
    "body_filter": "fingerprint",
    "body_filter_prefix": "",
    "body_filter_sufix": " -> SCORE",
    "count_filter": " Count: ",
    "severity_add": "sig_detected_note",
    "fields_create": {"actor": 6, "category": 26, "confidentiality": 0, "detection": 36, "plan": 37, "is_starred": false, "is_major": false, "is_incident": false, "concerned_business_lines": []},
    "template_new_sujet": "",
    "template_new_body": "",
    "template_up_sujet": "",
    "template_up_body": ""
  }
] }
```
The JSON object contains a key "rules" whose value is an array of rule hashes.
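A minimal sketch of how such a rules file is consumed, using inline JSON here instead of reading `/etc/logstash/db/conf_fir.json` from disk:

```ruby
require "json"

# Parse a one-rule configuration (same shape as the sample above)
conf = JSON.parse('{"rules": [{"filters": {"sig_detected_note": "3|4"}, "subject_filter": "src_ip"}]}')
rule = conf["rules"].first
```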
Each element of a rule:
* filters [hash]: filters events by field values; several fields can be combined => {"field1": "regexp", "field2": "regexp"}
  * field_name [string]: name of the event field in which to search for the regexp
  * value_search [regexp]: regexp to search for in the selected event field
* subject_filter [string]: once the filters match, check whether a near event already exists: the value of this event field is searched in the subject of every incident in the FIR DB. If it is found, an incident already exists for this server/client, and the plugin then checks whether it also covers this exact event. If it is not found, a new event is created in FIR. The search is "subject == '*value_field_event*'".
* subject_filter_prefix [string]: to avoid false matches of subject_filter, a prefix can be added, changing the search to "subject == '*prefix+value_field_event*'"
* subject_filter_sufix [string]: to avoid false matches of subject_filter, a suffix can be added, changing the search to "subject == '*value_field_event+sufix*'"
* body_filter [string]: when the subject matches, check whether the event description already contains this exact event. The fingerprint field (from the logstash-filter-sig plugin) is used for this. If it is found, processing stops (the event is already recorded); otherwise the FIR event is updated with the new event for this client/server subject. The search is "body == '*value_field_event*'" (body is the description field in FIR).
* body_filter_prefix [string]: to avoid false matches of body_filter, a prefix can be added, changing the search to "body == '*prefix+value_field_event*'"
* body_filter_sufix [string]: to avoid false matches of body_filter, a suffix can be added, changing the search to "body == '*value_field_event+sufix*'"
* count_filter [string]: marker used to increment the count of identical alerts received.
* severity_add [string]: name of the event field whose value is copied into the severity field of the FIR event. If empty, severity is set to 1.
* fields_create [hash]: the information required to create an event in FIR
  * actor [numeric]: 1
  * category [numeric]: 2
  * confidentiality [numeric]: 1
  * detection [numeric]: 3
  * plan [numeric]: 4
  * is_starred [boolean]: false
  * is_major [boolean]: false
  * is_incident [boolean]: false
  * concerned_business_lines [array]: []
* template_new_sujet [path]: path of the ERB template file for the subject of a new event
* template_new_body [path]: path of the ERB template file for the description/body of a new event
* template_up_sujet [path]: path of the ERB template file for the subject of an updated event
* template_up_body [path]: path of the ERB template file for the description/body of an updated event

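The subject/body near-event lookup described above can be sketched as plain substring checks (the incident hash and values here are invented for illustration; the real plugin scans the FIR incidents DB):

```ruby
# Subject match: an incident already tracks this client/server
def near_event?(incident, value, prefix, sufix)
  incident["subject"].include?(prefix + value + sufix)
end

# Body match: this exact event is already in the description
def already_recorded?(incident, value, prefix, sufix)
  incident["description"].include?(prefix + value + sufix)
end

incident = { "subject" => "Alert -10.0.0.1- brute force",
             "description" => "abcd -> SCORE Count: 1" }
near     = near_event?(incident, "10.0.0.1", "-", "-")          # update instead of create
recorded = already_recorded?(incident, "abcd", "", " -> SCORE") # only bump the counter
```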
## Documentation

Logstash provides infrastructure to automatically generate documentation for this plugin. We use the asciidoc format to write documentation so any comments in the source code will be first converted into asciidoc and then into html. All plugin documentation is placed under one [central location](http://www.elastic.co/guide/en/logstash/current/).

- For formatting code or config examples, you can use the asciidoc `[source,ruby]` directive
- For more asciidoc formatting tips, see the excellent reference here https://github.com/elastic/docs#asciidoc-guide

## Need Help?

Need help? Try #logstash on freenode IRC or the https://discuss.elastic.co/c/logstash discussion forum.

## Developing

### 1. Plugin Development and Testing

#### Code
- To get started, you'll need JRuby with the Bundler gem installed.

- Create a new plugin or clone an existing one from the GitHub [logstash-plugins](https://github.com/logstash-plugins) organization. We also provide [example plugins](https://github.com/logstash-plugins?query=example).

- Install dependencies
```sh
bundle install
```

#### Test

- Update your dependencies

```sh
bundle install
```

- Run tests

```sh
bundle exec rspec
```

### 2. Running your unpublished Plugin in Logstash

#### 2.1 Run in a local Logstash clone

- Edit Logstash `Gemfile` and add the local plugin path, for example:
```ruby
gem "logstash-filter-awesome", :path => "/your/local/logstash-filter-awesome"
```
- Install plugin
```sh
# Logstash 2.3 and higher
bin/logstash-plugin install --no-verify

# Prior to Logstash 2.3
bin/plugin install --no-verify
```
- Run Logstash with your plugin
```sh
bin/logstash -e 'filter {awesome {}}'
```
At this point any modifications to the plugin code will be applied to this local Logstash setup. After modifying the plugin, simply rerun Logstash.

#### 2.2 Run in an installed Logstash

You can use the same **2.1** method to run your plugin in an installed Logstash by editing its `Gemfile` and pointing the `:path` to your local plugin development directory, or you can build the gem and install it using:

- Build your plugin gem
```sh
gem build logstash-filter-awesome.gemspec
```
- Install the plugin from the Logstash home
```sh
# Logstash 2.3 and higher
bin/logstash-plugin install --no-verify

# Prior to Logstash 2.3
bin/plugin install --no-verify
```
- Start Logstash and proceed to test the plugin

## Contributing

All contributions are welcome: ideas, patches, documentation, bug reports, complaints, and even something you drew up on a napkin.

Programming is not a required skill. Whatever you've seen about open source and maintainers or community members saying "send patches or die" - you will not see that here.

It is more important to the community that you are able to contribute.

For more information about contributing, see the [CONTRIBUTING](https://github.com/elastic/logstash/blob/master/CONTRIBUTING.md) file.
data/lib/logstash/outputs/fir.rb
ADDED
@@ -0,0 +1,490 @@
# encoding: utf-8
# Output "FIR": sends alerts to the FIR platform (CERT SG).
# Can be used together with logstash-filter-sig.
# Contact: Lionel PRAT (lionel.prat9@gmail.com)
require "logstash/outputs/base"
require "logstash/namespace"
require "json"
require "uri"
require "manticore"
require "time"
require "erb"
require "digest"

class LogStash::Outputs::Fir < LogStash::Outputs::Base
  config_name "fir"

  # URL of the FIR REST API
  config :url_api_fir, :validate => :string, :required => :true, :default => "https://127.0.0.1:8000/api/"
  # Refresh interval (seconds) for remote FIR information: closed tickets, new manual tickets, artefacts DB
  config :refresh_interval_remote, :validate => :number, :default => 3600

  # Custom headers to use
  # format is `headers => ["X-My-Header", "%{host}"]`
  # REPLACE 0000000000000000000000000000 with your API token
  config :headers, :validate => :hash, :required => :true, :default => {"Authorization" => "Token 0000000000000000000000000000", "Content-Type" => "application/json"}
  # Manticore SSL options; for insecure mode use '{ :verify => :disable }'
  config :ssl_options, :validate => :string, :default => "{ :verify => :disable }"

  # ERB templates for body & subject
  # body template for a new alert
  config :template_new, :validate => :path, :default => "/etc/logstash/db/template_new.erb"
  # body template for an updated alert
  config :template_update, :validate => :path, :default => "/etc/logstash/db/template_update.erb"
  # subject template for a new alert
  config :subj_template_new, :validate => :path, :default => "/etc/logstash/db/subject_template_new.erb"
  # subject template for an updated alert
  config :subj_template_update, :validate => :path, :default => "/etc/logstash/db/subject_template_update.erb"
  # MATCH CONFIGURATION:
  config :conffile, :validate => :string, :default => "/etc/logstash/db/conf_fir.json"
  # Each rule needs: filters + first identification in the subject + second identification (alert already exists) in the body + optional template overrides.
  # Configuration format (hash):
  # rules [
  #   {
  #     filters: {},
  #     subject_filter: 'name_field_in_event',  # e.g. field src_ip
  #     subject_filter_prefix: 'string that must appear before the subject filter field value',  # optional
  #     subject_filter_sufix: 'string that must appear after the subject filter field value',    # optional
  #     body_filter: 'name_field_in_event',     # e.g. field fingerprint
  #     body_filter_prefix: 'string that must appear before the body filter field value',        # optional
  #     body_filter_sufix: 'string that must appear after the body filter field value',          # optional
  #     severity_add: 'name_field_in_event',    # e.g. field sig_detected_note, optional
  #     count_filter: ' Count: ',
  #     fields_create: {'name' => value, ...},  # REQUIRED FOR CREATE: "actor" & "category" & "confidentiality" & "detection" & "plan" & "is_starred" & "is_major" & "is_incident" & "concerned_business_lines"
  #     template_new_sujet: 'path',  # optional
  #     template_new_body: 'path',   # optional
  #     template_up_sujet: 'path',   # optional
  #     template_up_body: 'path',    # optional
  #   }
  # ]

  # How frequently (in seconds) Logstash checks the configuration file for updates.
  config :refresh_interval, :validate => :number, :default => 3600

  # Names of the FIR REST API fields used for subject, body, severity and status
  config :subject_field, :validate => :string, :default => "subject"
  config :body_field, :validate => :string, :default => "description"
  config :severity_field, :validate => :string, :default => "severity"
  config :status_field, :validate => :string, :default => "status"

  concurrency :single

  public
  def register
    @fir_conf = []
    @incidents_db = {}
    @client = Manticore::Client.new(ssl: eval(@ssl_options))
    @logger.info("FIR Configuration -- Loading...")
    @hash_file = ""
    @load_statut = false
    load_db
    @load_statut = true
    @logger.info("finish")
    # load template files
    @logger.info("FIR templates -- Loading...")
    @template_data_n = ""
    @template_data_u = ""
    @template_subj_n = ""
    @template_subj_u = ""
    if File.file?(@template_new) && File.file?(@template_update) && File.file?(@subj_template_new) && File.file?(@subj_template_update)
      @template_data_n = File.read(@template_new)
      @template_data_u = File.read(@template_update)
      @template_subj_n = File.read(@subj_template_new)
      @template_subj_u = File.read(@subj_template_update)
    else
      @logger.error("FIR templates not found!")
      exit -1
    end
    if @template_subj_u.empty? or @template_subj_n.empty? or @template_data_u.empty? or @template_data_n.empty?
      @logger.error("FIR templates are empty!")
      exit -1
    end
    @logger.info("finish")
    @next_refresh = Time.now + @refresh_interval
    @load_statut_r = false
    @logger.info("FIR get incident DB -- Loading... could take some time...")
    load_incidents
    @logger.info("finish")
    @load_statut_r = true
    @next_refresh = Time.now + @refresh_interval
    @next_refresh_remote = Time.now + @refresh_interval_remote
    @token_create = true
  end # def register

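The register method above reads each `.erb` file and later renders it with `ERB.new(...).result(binding)`, so a template can reference local variables such as `event`. A minimal sketch with a hypothetical inline template (not the shipped `template_new.erb`):

```ruby
require "erb"

# Locals visible to the template through `binding`
event = { "src_ip" => "10.0.0.1", "fingerprint" => "abcd" }
template = "Alert from <%= event['src_ip'] %> (id <%= event['fingerprint'] %>)"

# Render the template against the current binding
text = ERB.new(template).result(binding)
# → "Alert from 10.0.0.1 (id abcd)"
```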
  public
  def multi_receive(events)
    events.each {|event| receive(event)}
  end

  def receive(event)
    tnow = Time.now
    # refresh DB & conf
    if @next_refresh < tnow
      if @load_statut == true
        @load_statut = false
        load_db
        if File.file?(@template_new) && File.file?(@template_update)
          @template_data_n = File.read(@template_new)
          @template_data_u = File.read(@template_update)
        end
        @next_refresh = tnow + @refresh_interval
        @load_statut = true
      end
    end
    if @next_refresh_remote < tnow
      if @load_statut_r == true
        @load_statut_r = false
        @logger.info("FIR refresh incident DB -- could take some time...")
        load_incidents
        @next_refresh_remote = tnow + @refresh_interval_remote
        @load_statut_r = true
      end
    end
    sleep(1) until @load_statut_r
    sleep(1) until @load_statut
    # verify db & conf are OK
    if @fir_conf.is_a?(Array) and not @incidents_db.nil?
      # check filter: {'field_name' => [] or ""} -- a numeric event field is converted to string
      # to match one regexp on an event field, use a string value "regexp"
      # to match multiple regexps (for array-typed fields), use an array [regexp1, regexp2, ...]; every regexp must match at least once
      for rule in @fir_conf
        # intersect the rule's filter fields with the event's fields
        eventK = event.to_hash.keys
        inter = rule['filters'].keys & eventK
        # check that all rule fields are present in the event
        if inter.length == rule['filters'].keys.length
          # all fields present -- check filters field by field
          sig_add = {}
          check_sig = false
          for kfield in inter
            check_sig = false
            # field X -- check type
            if event.get(kfield).is_a?(Array)
              # array-typed event field
              # is the rule regexp value an Array?
              if rule['filters'][kfield].is_a?(Array)
                for regexp in rule['filters'][kfield]
                  check_sig = false
                  for elem in event.get(kfield)
                    match = Regexp.new(regexp, nil, 'n').match(elem.to_s)
                    if not match.nil?
                      check_sig = true
                      break
                    end
                  end
                  break unless check_sig
                end
              else
                # rule value is not an array
                for elem in event.get(kfield)
                  match = Regexp.new(rule['filters'][kfield], nil, 'n').match(elem.to_s)
                  if not match.nil?
                    check_sig = true
                    break
                  end
                end
              end
            else
              # scalar event field
              # is the rule regexp value an Array?
              if rule['filters'][kfield].is_a?(Array)
                for regexp in rule['filters'][kfield]
                  match = Regexp.new(regexp, nil, 'n').match(event.get(kfield).to_s)
                  if not match.nil?
                    sig_add[kfield.to_s] = "Regexp found #{match}"
                    check_sig = true
                    next
                  end
                  break unless check_sig
                end
              else
                match = Regexp.new(rule['filters'][kfield], nil, 'n').match(event.get(kfield).to_s)
                if not match.nil?
                  check_sig = true
                  next
                end
              end
            end
            break unless check_sig
          end
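The matching loop above boils down to: a string rule value must match the field, and an array rule value requires every regexp to match at least one element of the field. A simplified, standalone sketch of that semantics (helper name is invented; the plugin does this inline with `event.get`):

```ruby
# Returns true when the rule value (string or array of regexps)
# matches the event field value (scalar or array).
def filter_match?(rule_value, field_value)
  values = Array(field_value).map(&:to_s)
  Array(rule_value).all? { |re| values.any? { |v| Regexp.new(re).match?(v) } }
end

a = filter_match?("3|4", 3)          # scalar field, alternation regexp
b = filter_match?(["a", "b"], ["abc"]) # every regexp must match some element
c = filter_match?("5", 3)            # no match
```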
          if check_sig
            # filter matched
            # determine whether the alert already exists
            check_if_create = true
            # verify the subject_filter and body_filter fields are present
            if not event.get(rule['subject_filter'].to_s).nil? and not event.get(rule['body_filter'].to_s).nil?
              # do not write at the same time, to avoid DB corruption
              # check in the incident DB whether the alert exists
              for incident in @incidents_db["results"]
                # only check open incidents (skip closed ones)
                next if not incident[@status_field].to_s == "O"
                # verify the subject filter value is present in the incident DB
                next if not incident[@subject_field].include?(rule['subject_filter_prefix'].to_s + event.get(rule['subject_filter'].to_s).to_s + rule['subject_filter_sufix'].to_s)
                # subject matched: an incident exists for this subject
                check_if_create = false
                # if the body matches too, the event is already recorded -> only bump the counter
                if incident[@body_field].include?(rule['body_filter_prefix'].to_s + event.get(rule['body_filter'].to_s).to_s + rule['body_filter_sufix'].to_s)
                  break if rule['count_filter'].nil? or rule['count_filter'].empty?
                  # increment the counter after "<body_filter_prefix><value><body_filter_sufix>...<count_filter>"
                  num_al = incident[@body_field].scan(/#{Regexp.escape(rule['body_filter_prefix'].to_s + event.get(rule['body_filter'].to_s).to_s + rule['body_filter_sufix'].to_s)}.*#{Regexp.escape(rule['count_filter'].to_s)}(\d+)/).last
                  if num_al.nil?
                    break
                  else
                    if num_al.is_a?(Array) and not num_al.empty?
                      num_al = num_al.first.to_i + 1
                      incident[@body_field] = incident[@body_field].gsub(/(#{Regexp.escape(rule['body_filter_prefix'].to_s + event.get(rule['body_filter'].to_s).to_s + rule['body_filter_sufix'].to_s)}.*#{Regexp.escape(rule['count_filter'].to_s)})(\d+)/, '\1' + num_al.to_s)
                    else
                      break
                    end
                  end
                  url = @url_api_fir + "incidents/" + incident["id"].to_s
                  begin
                    response = @client.patch(url, :body => incident.to_json, :headers => @headers)
                    if response.code < 200 or response.code > 299
                      log_failure(
                        "Encountered non-2xx HTTP code #{response.code}",
                        :response_code => response.code,
                        :url => url,
                        :event => event)
                    end
                  rescue
                    @logger.warn("ERROR SEND:", :string => incident.to_json)
                  end
                  break
                end
                # the body did not match: update the incident with the new event
                # UPDATE
                sleep(1) until @token_create
                @token_create = false
                # update incident: raise severity if the option is not empty
                if rule['severity_add'] and (event.get(rule['severity_add'].to_s).is_a?(String) or event.get(rule['severity_add'].to_s).is_a?(Numeric))
                  if incident["severity"] < event.get(rule['severity_add'].to_s).to_i
                    incident[@severity_field] = event.get(rule['severity_add'].to_s).to_i
                  end
                end
                if not rule['template_up_sujet'].nil? and not rule['template_up_sujet'].empty?
                  incident[@subject_field] = ERB.new(rule['template_up_sujet']).result(binding)
                else
                  incident[@subject_field] = ERB.new(@template_subj_u).result(binding)
                end
                # keep the old content
                if not rule['template_up_body'].nil? and not rule['template_up_body'].empty?
                  incident[@body_field] = ERB.new(rule['template_up_body']).result(binding) + incident[@body_field]
                else
                  incident[@body_field] = ERB.new(@template_data_u).result(binding) + incident[@body_field]
                end
                url = @url_api_fir + "incidents/" + incident["id"].to_s
                begin
                  response = @client.patch(url, :body => incident.to_json, :headers => @headers)
                  if response.code < 200 or response.code > 299
                    log_failure(
                      "Encountered non-2xx HTTP code #{response.code}",
                      :response_code => response.code,
                      :url => url,
                      :event => event)
                  else
                    begin
                      url = @url_api_fir + "files/" + incident["id"].to_s + "/upload"
                      bodyfile = {"files" => [{"content" => JSON.pretty_generate(event.to_hash), "description" => "Incident details", "filename" => "incident_detail-" + tnow.strftime("%Y-%m-%dT%H:%M:%S:%L").to_s + ".json"}]}
                      response = @client.post(url, :body => bodyfile.to_json, :headers => @headers)
                      @logger.info("Upload file content incident: ", :string => response.body)
                    rescue
                      @logger.warn("Upload file content incident: ERROR.", :string => response.body)
                    end
                  end
                rescue
                  @logger.warn("ERROR SEND:", :string => incident.to_json)
                end
                # done -- release the token
                @token_create = true
                # check_if_create is false: incident found and updated -> stop processing this rule
                break
              end
            end
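The counter bump above uses `scan` to read the current count after the body marker and `gsub` to write it back incremented. A standalone sketch with invented marker values:

```ruby
# Incident body as stored in FIR: "<marker><count_filter><N>"
body   = "fingerprint-1234 -> SCORE Count: 2"
marker = Regexp.escape("fingerprint-1234 -> SCORE")
count  = Regexp.escape(" Count: ")

# Read the last counter after the marker, then replace it with N+1
n = body.scan(/#{marker}.*#{count}(\d+)/).last          # → ["2"]
body = body.gsub(/(#{marker}.*#{count})(\d+)/, '\1' + (n.first.to_i + 1).to_s)
# → "fingerprint-1234 -> SCORE Count: 3"
```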
              if check_if_create and rule['fields_create'].is_a?(Hash)
                # create
                sleep(1) until @token_create
                @token_create = false
                body = {}
                # build the base JSON of the FIR incident
                rule['fields_create'].each do |jkey, jval|
                  body[jkey.to_s] = jval
                end
                body["date"] = tnow.strftime("%Y-%m-%dT%H:%M").to_s # format "2016-01-01T00:00" TODO!!! CHOICE GMT
                if rule['severity_add'] and (event.get(rule['severity_add'].to_s).is_a?(String) or event.get(rule['severity_add'].to_s).is_a?(Numeric))
                  if event.get(rule['severity_add'].to_s).to_i < 5
                    body[@severity_field] = event.get(rule['severity_add'].to_s).to_i
                  else
                    body[@severity_field] = 1
                  end
                else
                  body[@severity_field] = 1
                end
                body[@status_field] = "O"
                if not rule['template_new_sujet'].nil? and not rule['template_new_sujet'].empty?
                  body[@subject_field] = ERB.new(rule['template_new_sujet']).result(binding)
                else
                  body[@subject_field] = ERB.new(@template_subj_n).result(binding)
                end
                if not rule['template_new_body'].nil? and not rule['template_new_body'].empty?
                  body[@body_field] = ERB.new(rule['template_new_body']).result(binding)
                else
                  body[@body_field] = ERB.new(@template_data_n).result(binding)
                end
                url = @url_api_fir + "incidents"
                begin
                  response = @client.post(url, :body => body.to_json, :headers => @headers)
                  if response.code >= 200 and response.code <= 299
                    begin
                      add_inc = JSON.parse(response.body)
                      (@incidents_db["results"] ||= []) << add_inc
                      begin
                        url = @url_api_fir + "files/" + add_inc["id"].to_s + "/upload"
                        bodyfile = {"files" => [{"content" => JSON.pretty_generate(event.to_hash), "description" => "Incident details", "filename" => "incident_detail-" + tnow.strftime("%Y-%m-%dT%H:%M:%S:%L").to_s + ".json"}]}
                        response = @client.post(url, :body => bodyfile.to_json, :headers => @headers)
                        @logger.info("Upload file content incident: ", :string => response.body)
                      rescue
                        @logger.warn("Upload file content incident: ERROR.", :string => response.body)
                      end
                    rescue
                      @logger.warn("JSON CMD ERROR PARSE:", :string => response.body)
                    end
                  else
                    log_failure(
                      "Encountered non-2xx HTTP code #{response.code}",
                      :response_code => response.code,
                      :url => url,
                      :response => response,
                      :body => body)
                  end
                rescue
                  @logger.warn("ERROR SEND:", :string => body.to_json)
                end
                # done -- release the token
                @token_create = true
              else
                break # rule matched and handled -> stop checking rules
              end
            end
          end
        end
      end
    end
  end

  def close
    @client.close
  end

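The load_db helper below re-parses the rules file only when its SHA-256 digest differs from the last one seen. A minimal sketch of that change detection, hashing strings here instead of file contents:

```ruby
require "digest"

# Remember the digest of the last loaded configuration
last_hash = Digest::SHA256.hexdigest("rules v1")

# Same content -> digests are equal -> no reload
reload_v1 = (Digest::SHA256.hexdigest("rules v1") != last_hash)

# Changed content -> digests differ -> reload
reload_v2 = (Digest::SHA256.hexdigest("rules v2") != last_hash)
```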
  private
  def load_db
    if !File.exist?(@conffile)
      @logger.warn("Configuration file read failure, stop loading", :path => @conffile)
      return
    end
    tmp_hash = Digest::SHA256.hexdigest File.read @conffile
    if not tmp_hash == @hash_file
      @hash_file = tmp_hash
      begin
        tmp_conf = JSON.parse( IO.read(@conffile, encoding:'utf-8') )
        unless tmp_conf.nil?
          if tmp_conf['rules'].is_a?(Array)
            for rule in tmp_conf['rules']
              if not rule['template_new_sujet'].nil? and not rule['template_new_sujet'].empty? and !File.exist?(rule['template_new_sujet'].to_s)
                @logger.error("Template in configuration file does not exist", :path => rule['template_new_sujet'].to_s)
                return
              elsif not rule['template_new_sujet'].nil? and not rule['template_new_sujet'].empty?
                rule['template_new_sujet'] = File.read(rule['template_new_sujet'].to_s)
              end
              if not rule['template_new_body'].nil? and not rule['template_new_body'].empty? and !File.exist?(rule['template_new_body'].to_s)
                @logger.error("Template in configuration file does not exist", :path => rule['template_new_body'].to_s)
                return
              elsif not rule['template_new_body'].nil? and not rule['template_new_body'].empty?
                rule['template_new_body'] = File.read(rule['template_new_body'].to_s)
              end
              if not rule['template_up_sujet'].nil? and not rule['template_up_sujet'].empty? and !File.exist?(rule['template_up_sujet'].to_s)
                @logger.error("Template in configuration file does not exist", :path => rule['template_up_sujet'].to_s)
                return
              elsif not rule['template_up_sujet'].nil? and not rule['template_up_sujet'].empty?
                rule['template_up_sujet'] = File.read(rule['template_up_sujet'].to_s)
              end
              if not rule['template_up_body'].nil? and not rule['template_up_body'].empty? and !File.exist?(rule['template_up_body'].to_s)
                @logger.error("Template in configuration file does not exist", :path => rule['template_up_body'].to_s)
                return
              elsif not rule['template_up_body'].nil? and not rule['template_up_body'].empty?
                rule['template_up_body'] = File.read(rule['template_up_body'].to_s)
              end
              if rule['subject_filter_prefix'].nil?
                rule['subject_filter_prefix'] = ""
              end
              if rule['subject_filter_sufix'].nil?
                rule['subject_filter_sufix'] = ""
              end
              if rule['body_filter_prefix'].nil?
                rule['body_filter_prefix'] = ""
              end
              if rule['body_filter_sufix'].nil?
                rule['body_filter_sufix'] = ""
              end
            end
            @fir_conf = tmp_conf['rules']
          end
        end
        @logger.info("refreshing DB FIR condition file")
|
443
|
+
rescue
|
444
|
+
@logger.error("JSON CONF FIR -- PARSE ERROR")
|
445
|
+
end
|
446
|
+
end
|
447
|
+
end
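For reference, `load_db` expects a JSON file with a top-level `rules` array; each rule may point at template files (inlined at load time) and optional filter prefixes/suffixes. A minimal sketch of such a configuration — the key names come from the code above, all paths and values are hypothetical:

```json
{
  "rules": [
    {
      "template_new_sujet": "/etc/logstash/fir/new_subject.tpl",
      "template_new_body": "/etc/logstash/fir/new_body.tpl",
      "template_up_sujet": "/etc/logstash/fir/update_subject.tpl",
      "template_up_body": "/etc/logstash/fir/update_body.tpl",
      "subject_filter_prefix": "",
      "subject_filter_sufix": "",
      "body_filter_prefix": "",
      "body_filter_sufix": ""
    }
  ]
}
```

Template keys left `null` or empty are skipped; missing filter keys default to the empty string.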

  # This is split into a separate method mostly to help testing
  def log_failure(message, opts)
    @logger.error("[HTTP Output Failure] #{message}", opts)
  end

  def load_incidents
    # Page through the FIR incidents API, following the "next" links
    stop_load = true
    incidents_db_tmp = {}
    first = true
    error_load = false
    url = @url_api_fir + "incidents?format=json"
    while stop_load
      response = @client.get(url, :headers => @headers)
      @logger.info("BODY: #{response.body}")
      begin
        field_next = ""
        if first
          incidents_db_tmp = JSON.parse(response.body)
          field_next = incidents_db_tmp["next"]
          first = false
        else
          tmp_db = JSON.parse(response.body)
          incidents_db_tmp["results"] = incidents_db_tmp["results"] + tmp_db["results"]
          field_next = tmp_db["next"]
        end
        if field_next.nil?
          stop_load = false
        else
          url = field_next
        end
      rescue
        @logger.warn("JSON CMD ERROR PARSE:", :string => response.body)
        stop_load = false
        error_load = true
      end
    end
    @incidents_db = incidents_db_tmp unless error_load
    @logger.warn("INCIDENT DB LOADED") unless error_load
  end
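The pagination pattern in `load_incidents` can be exercised in isolation: FIR's `/incidents` endpoint returns `{"results" => [...], "next" => url-or-nil}`, and pages are fetched until `next` is `nil`. A minimal standalone sketch — `collect_paginated` and the fetcher lambda are illustrative stand-ins for the plugin's `@client.get` loop, not part of its API:

```ruby
require "json"

# Follow "next" links until nil, accumulating each page's "results".
def collect_paginated(first_url, fetcher)
  results = []
  url = first_url
  while url
    page = JSON.parse(fetcher.call(url))
    results.concat(page["results"])
    url = page["next"] # nil on the last page => loop ends
  end
  results
end

# Simulated API: two pages, the first linking to the second.
pages = {
  "/incidents?format=json" => { "results" => [1, 2], "next" => "/incidents?page=2" }.to_json,
  "/incidents?page=2"      => { "results" => [3],    "next" => nil }.to_json,
}
collect_paginated("/incidents?format=json", ->(u) { pages[u] }) # => [1, 2, 3]
```

Unlike the plugin code, this sketch has no rescue around the parse; the real method also flags `error_load` so a half-fetched set never replaces `@incidents_db`.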
end # class LogStash::Outputs::Fir
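`load_db` avoids re-parsing the rules file on every call by comparing SHA-256 digests and reloading only on change. The same pattern in isolation — `ConfWatcher` is a hypothetical name for this sketch, not a class in the plugin:

```ruby
require "digest"
require "tempfile"

# Re-read a config file only when its SHA-256 digest differs from the
# digest recorded on the previous read (the load_db change check).
class ConfWatcher
  def initialize(path)
    @path = path
    @hash = nil
  end

  # Returns the file content when it changed since the last call, else nil.
  def changed
    digest = Digest::SHA256.hexdigest(File.read(@path))
    return nil if digest == @hash
    @hash = digest
    File.read(@path)
  end
end

f = Tempfile.new("rules")
f.write('{"rules": []}')
f.flush
w = ConfWatcher.new(f.path)
w.changed # => '{"rules": []}'  (first read: digest not yet recorded)
w.changed # => nil              (unchanged since last read)
```

Hashing the whole file is cheap for a small rules file and, unlike mtime checks, is immune to touch-without-change.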

logstash-output-fir.gemspec
ADDED
@@ -0,0 +1,24 @@
Gem::Specification.new do |s|
  s.name          = 'logstash-output-fir'
  s.version       = "0.9.0"
  s.licenses      = ["Apache License (2.0)"]
  s.summary       = "This fir output sends alerts from the sig filter to FIR (https://github.com/certsocietegenerale/FIR)."
  s.description   = "This gem is a Logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/logstash-plugin install fir. This gem is not a stand-alone program."
  s.authors       = ["Lionel PRAT"]
  s.email         = 'lionel.prat9@gmail.com'
  s.homepage      = "http://www.elastic.co/guide/en/logstash/current/index.html"
  s.require_paths = ["lib"]

  # Files
  s.files = Dir['lib/**/*','spec/**/*','vendor/**/*','*.gemspec','*.md','CONTRIBUTORS','Gemfile','LICENSE','NOTICE.TXT']
  # Tests
  s.test_files = s.files.grep(%r{^(test|spec|features)/})

  # Special flag to let us know this is actually a logstash plugin
  s.metadata = { "logstash_plugin" => "true", "logstash_group" => "output" }

  # Gem dependencies
  s.add_runtime_dependency "logstash-core-plugin-api", ">= 1.60", "<= 2.99"
  s.add_development_dependency 'logstash-devutils'
end

spec/outputs/fir_spec.rb
ADDED
@@ -0,0 +1,22 @@
# encoding: utf-8
require "logstash/devutils/rspec/spec_helper"
require "logstash/outputs/fir"
require "logstash/codecs/plain"
require "logstash/event"

describe LogStash::Outputs::Fir do
  let(:sample_event) { LogStash::Event.new }
  let(:output) { LogStash::Outputs::Fir.new }

  before do
    output.register
  end

  describe "receive message" do
    subject { output.receive(sample_event) }

    it "returns a string" do
      expect(subject).to eq("Event received")
    end
  end
end
metadata
ADDED
@@ -0,0 +1,90 @@
--- !ruby/object:Gem::Specification
name: logstash-output-fir
version: !ruby/object:Gem::Version
  version: 0.9.0
platform: ruby
authors:
- Lionel PRAT
autorequire:
bindir: bin
cert_chain: []
date: 2017-06-02 00:00:00.000000000 Z
dependencies:
- !ruby/object:Gem::Dependency
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '1.60'
    - - "<="
      - !ruby/object:Gem::Version
        version: '2.99'
  name: logstash-core-plugin-api
  prerelease: false
  type: :runtime
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '1.60'
    - - "<="
      - !ruby/object:Gem::Version
        version: '2.99'
- !ruby/object:Gem::Dependency
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
  name: logstash-devutils
  prerelease: false
  type: :development
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
description: This gem is a Logstash plugin required to be installed on top of the
  Logstash core pipeline using $LS_HOME/bin/logstash-plugin install fir. This gem
  is not a stand-alone program
email: lionel.prat9@gmail.com
executables: []
extensions: []
extra_rdoc_files: []
files:
- CHANGELOG.md
- CONTRIBUTORS
- DEVELOPER.md
- Gemfile
- LICENSE
- NOTICE.TXT
- README.md
- lib/logstash/outputs/fir.rb
- logstash-output-fir.gemspec
- spec/outputs/fir_spec.rb
homepage: http://www.elastic.co/guide/en/logstash/current/index.html
licenses:
- Apache License (2.0)
metadata:
  logstash_plugin: 'true'
  logstash_group: output
post_install_message:
rdoc_options: []
require_paths:
- lib
required_ruby_version: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
    - !ruby/object:Gem::Version
      version: '0'
required_rubygems_version: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
    - !ruby/object:Gem::Version
      version: '0'
requirements: []
rubyforge_project:
rubygems_version: 2.4.8
signing_key:
specification_version: 4
summary: This fir output sends alerts from the sig filter to FIR (https://github.com/certsocietegenerale/FIR).
test_files:
- spec/outputs/fir_spec.rb