logstash-filter-sig 0.9.0

@@ -0,0 +1,7 @@
+ ---
+ SHA1:
+ metadata.gz: c8e9c8697571bb1f9686c8dca21a4ecbecee7417
+ data.tar.gz: fb04e71a22728d094d8e521d38512c171916827c
+ SHA512:
+ metadata.gz: 1434ac29a21f160cac2f8e96ca4248d77665095a12a0c79600f928981c9abc8dce40d8f9c325a7fd9f930f93954df7b51e1273b914469eb40d94ca465ce3637c
+ data.tar.gz: 1ec83b2379ed8610e0eb21ae8ee05b535a0e3f8e069f11bec09d6749dde428cd2ed191b82e9f10502fa782d2a024efc49664e96ff1b8b760a5f9fd906d02bdcd
@@ -0,0 +1,2 @@
+ ## 0.9.0
+ - Plugin works on Logstash 5.4
@@ -0,0 +1,11 @@
+ The following is a list of people who have contributed ideas, code, bug
+ reports, or in general have helped logstash along its way.
+
+ Contributors:
+ * Aaron Mildenstein (untergeek)
+ * Pier-Hugues Pellerin (ph)
+
+ Note: If you've sent us patches, bug reports, or otherwise contributed to
+ Logstash, and you aren't on the list above and want to be, please let us know
+ and we'll make sure you're here. Contributions from folks like you are what make
+ open source awesome.
@@ -0,0 +1,2 @@
+ # logstash-filter-sig
+ * Lionel PRAT - lionel.prat9@gmail.com
data/Gemfile ADDED
@@ -0,0 +1,2 @@
+ source 'https://rubygems.org'
+ gemspec
data/LICENSE ADDED
@@ -0,0 +1,13 @@
+ Copyright (c) 2012–2016 Elasticsearch <http://www.elastic.co>
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
@@ -0,0 +1,5 @@
+ Elasticsearch
+ Copyright 2012-2015 Elasticsearch
+
+ This product includes software developed by The Apache Software
+ Foundation (http://www.apache.org/).
@@ -0,0 +1,675 @@
+ # Logstash Plugin
+
+ The Logstash filter plugin "Sig" helps you detect security threats in logs through several techniques:
+ * Dropping false positives and noise
+ * Finding new values in fields
+ * Blocklist (BL) reputation checks
+ * IOC matching (IOCs extracted from MISP, for example)
+ * Signatures (SIG) with several search features
+   * Each rule has:
+     * a name: the rule name used in reports
+     * an id: used for correlation and for score (note) changes
+     * a score (note): the score given to the signature, used to decide whether to trigger an alert
+     * a type: either primary (1) or secondary (2). A secondary signature matches only if a primary signature matched before it.
+     * modeFP: true if it is a false-positive signature
+     * extract (optional): extracts data you are certain is malicious and puts it in the local IOC database, so it is detected in all subsequent events.
+   * Each rule can combine several search techniques per event, on fields and values:
+     * Field present / field not present
+     * Regexp match
+     * Regexp not matched
+     * motif: must be an Array containing all possible matches
+     * Compare one field's value with another field's value (string or numeric; operators: ==, <, >, !=)
+     * Check string length, with operators: ==, <, >, !=
+     * Check IP addresses (ex: 192.168.0.0/24), with operators: ==, !=
+     * Check numeric values, with operators: ==, !=, <, >
+     * Check a date value relative to now + X, with operators: ==, <, >, !=
+     * Check whether a date value falls in a given hour, with operators: ==, <, >, !=
+     * Check whether a date value falls on a given day (day number, ex: 0 == Sunday, 1 == Monday), with operators: ==, <, >, !=
+     * Check frequency across multiple events:
+       * for brute force (ex: a rule matching 3 auth errors within 60 seconds; once detected, not searched again for 3600 seconds)
+       * for correlating multiple sources that share a common field value (ex: ip) across different event types (squid and antivirus), e.g. one error/detection per type
+     * Check frequency on events (a second option, usable for correlation)
+ * Reference databases (built beforehand from ES data containing clean logs; a script is included), a new version of my project AEE [https://github.com/lprat/AEE]
+   * Check size, check data against regexps, detect unique values (ex: @timestamp is always unique because it changes every time)
+   * Check links/relationships between non-unique field values (idea inspired by the tool PicViz). Example: in apache logs, the page test.php always returns 200 across all logs; that link/relationship is what gets checked.
+ * A note/score feature that raises or lowers an alert's score when IOC / multiple SIG / REF matches correlate
+ * Frequency-based alerts with no defined cause: they just signal that log volume is rising abnormally. You can target frequency checks at specific events using filters.
+
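Once installed, the filter is enabled like any other Logstash filter. A minimal pipeline sketch (paths are illustrative; the option names and their defaults are described in the configuration section of this README):

```
filter {
  sig {
    db_drop => "/etc/logstash/db/drop-db.json"
    db_ioc => ["/etc/logstash/db/ioc.json", "/etc/logstash/db/ioc_local.json"]
    conf_rules_sig => "/etc/logstash/db/sig.json"
    disable_ref => true
  }
}
```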
+ This plugin uses simhash to find approximate matches; further checks and correlations based on it are planned.
+
+ **!!!! You must install simhash under logstash; follow these instructions:**
+
+ 1. curl -sSL https://get.rvm.io | bash && /usr/local/rvm/bin/rvm install 1.9.3-dev
+ 2. In vendor/jruby/lib/ruby/shared/mkmf.rb, add at line 45:
+    * RbConfig::MAKEFILE_CONFIG["CPPFLAGS"] += ' -I/usr/local/rvm/rubies/ruby-1.9.3-p551-dev/include/ruby-1.9.1/x86_64-linux/'
+    * RbConfig::MAKEFILE_CONFIG['includedir'] = "/usr/local/rvm/rubies/ruby-1.9.3-p551-dev/include/ruby-1.9.1/"
+ 3. env GEM_HOME=/usr/share/logstash/vendor/bundle/jruby/1.9 JRUBY_OPTS='-Xcext.enabled=true' /usr/share/logstash/vendor/jruby/bin/jruby /usr/share/logstash/vendor/bundle/jruby/1.9/bin/bundle install
+ 4. env GEM_HOME=/usr/share/logstash/vendor/bundle/jruby/1.9 JRUBY_OPTS='-Xcext.enabled=true' /usr/share/logstash/vendor/jruby/bin/jruby /usr/share/logstash/vendor/jruby/bin/gem build logstash-filter-sig.gemspec
+ 5. /usr/share/logstash/bin/logstash-plugin install logstash-filter-sig-3.0.0.gem
+
+ ** You are welcome to contribute (bug reports, new features, ...)! **
+
+ ** You may run into bugs: I recently ported the plugin to logstash 5.x! **
+
+ This is a plugin for [Logstash](https://github.com/elastic/logstash).
+
+ It is fully free and fully open source. The license is Apache 2.0, meaning you are pretty much free to use it however you want in whatever way.
+
+ ## Contact
+ Lionel PRAT lionel.prat9 (at) gmail.com or cronos56 (at) yahoo.com
+
+ ## Let's start with docker
+ Dockerfile
+ ```
+ FROM logstash
+ MAINTAINER Lionel PRAT <lionel.prat9@gmail.com>
+
+ RUN apt-get update && apt-get install -y vim nano
+ RUN mkdir -p /opt/logstash-plugins/ && mkdir /etc/logstash/db
+ ADD logstash-filter-sig /opt/logstash-plugins/
+ RUN cp /opt/logstash-plugins/logstash-filter-sig/conf-samples/* /etc/logstash/db/ && chown logstash.logstash -R /etc/logstash/db/
+ RUN cp /opt/logstash-plugins/logstash-filter-sig/need/mkmf.rb /usr/share/logstash/vendor/jruby/lib/ruby/shared/mkmf.rb
+ RUN cd /opt/logstash-plugins/logstash-filter-sig && curl -sSL https://get.rvm.io | bash && /usr/local/rvm/bin/rvm install 1.9.3-dev && env GEM_HOME=/usr/share/logstash/vendor/bundle/jruby/1.9 JRUBY_OPTS='-Xcext.enabled=true' /usr/share/logstash/vendor/jruby/bin/jruby /usr/share/logstash/vendor/bundle/jruby/1.9/bin/bundle install && env GEM_HOME=/usr/share/logstash/vendor/bundle/jruby/1.9 JRUBY_OPTS='-Xcext.enabled=true' /usr/share/logstash/vendor/jruby/bin/jruby /usr/share/logstash/vendor/jruby/bin/gem build logstash-filter-sig.gemspec
+ RUN /usr/share/logstash/bin/logstash-plugin install logstash-filter-sig-3.0.0.gem
+ ```
+
+ ## Main Configuration (logstash-filter.conf)
+ ** DB refresh: the plugin reads several configuration files, which you can change while it is running. Changes are picked up at each refresh interval. You can manage the config/db files with a git system, for example. **
+ ** The functions below are listed in processing order. **
+ * Disabling check techniques: used to turn off a check function
+   * no_check => "sig_no_apply_all" : add a field named "sig_no_apply_all" to an event to skip all checks on it
+   * disable_drop => false : if set to true, the "drop" function is disabled
+   * disable_fp => false : if set to true, the "fingerprint & drop fingerprint" function is disabled
+   * disable_nv => false : if set to true, the "new value" function is disabled
+   * disable_ioc => false : if set to true, the "ioc" function is disabled
+   * disable_sig => false : if set to true, the "signature" function is disabled
+   * disable_ref => false : if set to true, the "reference" function is disabled
+   * disable_freq => false : if set to true, the "frequency" function is disabled
+   * disable_note => false : if set to true, the "note/score" function is disabled
+
+ * Drop function: drops noise and events you don't want to analyze
+   * noapply_sig_dropdb => "sig_no_apply_dropdb" : add a field named "sig_no_apply_dropdb" to an event to skip this check on it
+   * db_drop => "/etc/logstash/db/drop-db.json" : path of the drop-db.json file (see below for more information)
+   * refresh_interval_dropdb => 3600 : refresh interval (in seconds) for db_drop
+
+ * New value: checks for new values in the specified event fields
+   * conf_nv => "/etc/logstash/db/new.json" : path of the new.json file (see below for more information)
+   * db_nv => "/etc/logstash/db/new-save.json" : path of the new-save.json file (see below for more information)
+   * noapply_sig_nv => "sig_no_apply_nv" : add a field named "sig_no_apply_nv" to an event to skip this check on it
+   * refresh_interval_confnv => 3600 : refresh interval (in seconds) for conf_nv
+   * save_interval_dbnv => 3600 : save interval (in seconds) for db_nv
+   * target_nv => "new_value_" : prefix used when a new value is detected; a field named "new_value_FIELDX" is created containing the new value
+
+ * BL REPUTATION: checks IP reputation for event fields
+   * conf_bl => "/etc/logstash/db/bl_conf.json" : path of the bl_conf.json file (see below for more information)
+   * file_bl => [Array type] ["/etc/logstash/db/firehol_level1.netset","/etc/logstash/db/firehol_level2.netset","/etc/logstash/db/firehol_level3.netset","/etc/logstash/db/firehol_level4.netset","/etc/logstash/db/firehol_webserver.netset","/etc/logstash/db/firehol_webclient.netset","/etc/logstash/db/firehol_abusers_30d.netset","/etc/logstash/db/firehol_anonymous.netset","/etc/logstash/db/firehol_proxies.netset"] : paths of the files containing IP reputation data
+   * You can use the firehol blocklists: https://raw.githubusercontent.com/firehol/blocklist-ipsets/master/firehol_level1.netset,https://raw.githubusercontent.com/firehol/blocklist-ipsets/master/firehol_level2.netset,https://raw.githubusercontent.com/firehol/blocklist-ipsets/master/firehol_level3.netset,https://raw.githubusercontent.com/firehol/blocklist-ipsets/master/firehol_level4.netset,https://raw.githubusercontent.com/firehol/blocklist-ipsets/master/firehol_webserver.netset,https://raw.githubusercontent.com/firehol/blocklist-ipsets/master/firehol_webclient.netset,https://raw.githubusercontent.com/firehol/blocklist-ipsets/master/firehol_abusers_30d.netset,https://raw.githubusercontent.com/firehol/blocklist-ipsets/master/firehol_anonymous.netset,https://raw.githubusercontent.com/firehol/blocklist-ipsets/master/firehol_proxies.netset
+   * noapply_sig_bl => "sig_no_apply_bl" : add a field named "sig_no_apply_bl" to an event to skip this check on it
+   * refresh_interval_confbl => 3600 : refresh interval (in seconds) for conf_bl & db_bl (the BL files)
+   * targetname_bl => "bl_detected_category" : name of the field where the category is saved when an IP reputation hit is found
+
+ * IOC: checks events against IOCs
+   * db_ioc => ["/etc/logstash/db/ioc.json", "/etc/logstash/db/ioc_local.json"] : array of db file paths (ioc_local.json => created by the signature function [file_save_localioc]; ioc.json) (see below for more information)
+   * conf_ioc => "/etc/logstash/db/ioc_conf.json" : path of the ioc_conf.json file (see below for more information)
+   * target_ioc => "ioc_detected" : name of the field where detected IOCs are saved
+   * targetnum_ioc => "ioc_detected_count" : name of the field where the count of detected IOCs is saved
+   * targetname_ioc => "ioc_detected_name" : name of the field where the names of detected IOCs are saved
+   * refresh_interval_dbioc => 3600 : refresh interval (in seconds) for conf_ioc & db_ioc
+   * noapply_ioc => "sig_no_apply_ioc" : add a field named "sig_no_apply_ioc" to an event to skip this check on it
+
+ * Signature: checks complex signatures against events
+   * conf_rules_sig => "/etc/logstash/db/sig.json" : path of the sig.json file (see below for more information)
+   * file_save_localioc => "/etc/logstash/db/ioc_local.json" : path of the ioc_local.json file (see below for more information)
+   * target_sig => "sig_detected" : name of the field where detected rules are saved
+   * targetnum_sig => "sig_detected_count" : name of the field where the count of detected rules is saved
+   * targetname_sig => "sig_detected_name" : name of the field where the names of detected rules are saved
+   * refresh_interval_confrules => 3600 : refresh interval (in seconds) for file_save_localioc & conf_rules_sig
+   * noapply_sig_rules => "sig_no_apply_rules" : add a field named "sig_no_apply_rules" to an event to skip this check on it
+   * check_stop => false : set to true to stop checking signatures after the first match
+
+ * REFERENCE (formerly ANOMALIE): verifies that an event is included in the reference database
+   * conf_ref => "/etc/logstash/db/conf_ref.json" : path of the conf_ref.json file (see below for more information)
+   * db_ref => "/etc/logstash/db/reference.json" : path of the reference.json file (see below for more information)
+   * db_pattern => "/etc/logstash/db/pattern.db" : path of the pattern.db file (see below for more information)
+   * refresh_interval_dbref => 3600 : refresh interval (in seconds) for db_ref & db_pattern & conf_ref
+   * noapply_ref => "sig_no_apply_ref" : add a field named "sig_no_apply_ref" to an event to skip this check on it
+   * target_ref => "ref_detected" : name of the field where detected differences between the event and the reference are saved
+   * targetnum_ref => "ref_detected_count" : name of the field where the count of all detected differences between the event and the reference is saved
+   * targetname_ref => "ref_detected_name" : name of the field where the names of detected differences between the event and the reference are saved
+   * ref_aroundfloat => 0.5 : rounding margin when the score is not an integer (float result)
+   * ref_stop_after_firstffind => true : set to false to continue checking the reference after the first matching rule
+
+ * Common SIG & IOC & REF configuration (SCORE/NOTE function): use the score function to raise or lower the score when multiple rules match (IOC/REF/SIG ==> correlation between matches)
+   * targetnote => "sig_detected_note" : name of the field where the score produced by the IOC/SIG/REF/SCORE functions is saved
+   * targetid => "sig_detected_id" : name of the field where the rule IDs produced by the IOC/SIG/REF functions are saved
+   * conf_rules_note => "/etc/logstash/db/note.json" : path of the note.json file (see below for more information)
+
+ * Fingerprint function: limits the information in alerts. The first detection gets a fingerprint value and a 'first' tag; all subsequent matches get an 'info' tag (complementary information). You can then send the 'first' alerts to an incident platform and store all 'info' events in ES to read for more context.
+   * noapply_sig_dropfp => "sig_no_apply_dropfp" : add a field named "sig_no_apply_dropfp" to an event to skip this check on it
+   * conf_fp => "/etc/logstash/db/fingerprint_conf.json" : path of the fingerprint_conf.json file (see below for more information)
+   * db_dropfp => "/etc/logstash/db/drop-fp.json" : path of the drop-fp.json file (see below for more information)
+   * select_fp => "tags" : name of the field used to select/filter the type, matched against fingerprint_conf.json. Example: event['tags']="squid" --> (fingerprint_conf.json ->) {"squid":{"fields":[....],...}}
+   * target_fp => "fingerprint" : name of the field where the fingerprint value is saved
+   * tag_name_first => "first_alert" : tag value for a unique alert event seen for the first time
+   * tag_name_after => "info_comp" : tag value for a unique alert event not seen for the first time
+   * target_tag_fp => "tags" : name of the field where the tag value (first or complementary) is saved
+   * refresh_interval_conffp => 3600 : refresh interval (in seconds) for db_dropfp and conf_fp
+
+ * FREQUENCY: detects abnormal frequency increases in the event flow
+   * conf_freq => "/etc/logstash/db/conf_freq.json" : path of the conf_freq.json file (see below for more information)
+   * refresh_interval_freqrules => 3600 : refresh interval (in seconds) for conf_freq
+   * noapply_freq => "sig_no_apply_freq" : add a field named "sig_no_apply_freq" to an event to skip this check on it
+
+ ## Files Configuration
+ ** See the conf-samples and scripts-create-db folders **
+ ### DROP FIRST
+ The drop-db.json file contains rules for dropping events that you don't want to analyze, or that are noise.
+ ** This file is in JSON format **
+ ```
+ {"dst_domain": "^google.com$|^mydomain.ext$", "dst_ip": "10.0.0.\\d+"}
+ ```
+ Each JSON key is a field name to check in the event (event['field']), and its value must be a regexp to check against it. If the regexp matches, the event is dropped.
+
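A minimal sketch of that drop logic (not the plugin's actual code): if any configured field matches its regexp, the event is dropped.

```ruby
# Hypothetical rule set mirroring the drop-db.json sample above.
DROP_RULES = {
  'dst_domain' => '^google.com$|^mydomain.ext$',
  'dst_ip'     => '10\.0\.0\.\d+'
}

# Returns true when the event should be dropped.
def drop_event?(event, rules = DROP_RULES)
  rules.any? do |field, pattern|
    event.key?(field) && event[field].to_s =~ Regexp.new(pattern)
  end
end
```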
+ ### New Value
+ #### new-save.json
+ This file is auto-generated by the plugin, but you must create it the first time with the content '{}' (empty JSON). Over time it accumulates all the information about the selected fields.
+ You can reset it to zero by recreating the file.
+ ** This file is in JSON format **
+ #### new.json
+ This file contains the rules indicating which fields are checked for new values.
+ ** This file is in JSON format **
+ ```
+ {"rules": ["dst_host","user_agent"]}
+ ```
+ Above, the rules select the field named "dst_host" for checking, plus another check on the field "user_agent".
+
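A sketch of how new-value detection can work (an assumption, not the plugin's code): keep a persistent set of seen values per watched field, and flag any value not seen before under the `new_value_` prefix.

```ruby
require 'set'

WATCHED = ['dst_host', 'user_agent']        # from new.json "rules"
SEEN = Hash.new { |h, k| h[k] = Set.new }   # stands in for new-save.json

# Returns a hash of new_value_<field> entries to merge into the event.
def flag_new_values(event, watched = WATCHED, seen = SEEN)
  flags = {}
  watched.each do |f|
    next unless event.key?(f)
    v = event[f]
    # Set#add? returns nil when the value was already present.
    flags["new_value_#{f}"] = v if seen[f].add?(v)
  end
  flags
end
```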
+ ### IP REPUTATION
+ #### bl_conf.json
+ This file contains the rules indicating which fields are checked against the IP reputation lists. The field must contain an IP address.
+ ** This file is in JSON format **
+ ```
+ {"fieldx": {"dbs": [file_name,file_name2,...], "id": "180XX", "note": 1, "category": "malware"}}
+ ```
+ Use an id range distinct from the other techniques (ioc, sig, ref, ...).
+ Note: between 1 and 4.
+ Category: the category covered by file_name (malware, webserver attack, proxies, ...)
+ dbs: must contain file names configured in the main conf option 'file_bl'
+
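A sketch of a netset lookup using Ruby's stdlib IPAddr (an assumption about how an event field could be matched against CIDR blocklists; the in-memory list stands in for the file_bl files):

```ruby
require 'ipaddr'

# Hypothetical blocklist; in the plugin these entries come from netset files.
BLOCKLIST = ['192.0.2.0/24', '198.51.100.7'].map { |c| IPAddr.new(c) }

# Returns true when the value is an IP covered by any blocklist entry.
def bl_hit?(ip_str, lists = BLOCKLIST)
  ip = IPAddr.new(ip_str)
  lists.any? { |net| net.include?(ip) }
rescue IPAddr::InvalidAddressError
  false # the field did not contain a valid IP
end
```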
+ ### IOC
+ #### ioc_conf.json
+ This file contains the rules explaining which fields each IOC list is checked against.
+ A rule is a hash made of 4 elements:
+ * First key: the IOC name in the DB, whose value lists substrings of event field names (ex: "ioc_hostname":["_host"] means: check the hostname IOCs against every event field whose name contains *_host*, such as event['dst_hostname'])
+ * Second key: the first key's name plus '_downcase'; its value can be true or false. True checks the IOC case-insensitively (AbC == abc), false case-sensitively (AbC != abc)
+ * Third key: the name again plus '_iocnote'; its value is the score assigned when the IOC is detected
+ * Fourth key: the name again plus '_iocid'; its value is the rule's ID, usable later in the NOTE function, for example.
+ ** This file is in JSON format **
+ ```
+ {
+ "ioc_hostname":["_host"], "ioc_hostname_downcase":true, "ioc_hostname_iocnote":2, "ioc_hostname_iocid":1001,
+ "ioc_domain":["_domain"], "ioc_domain_downcase":true, "ioc_domain_iocnote":2, "ioc_domain_iocid":1002,
+ "ioc_ip":["_ip"], "ioc_ip_downcase":false, "ioc_ip_iocnote":1, "ioc_ip_iocid":1003,
+ "ioc_emailaddr":["_emailaddr"], "ioc_emailaddr_downcase":true, "ioc_emailaddr_iocnote":3, "ioc_emailaddr_iocid":1004,
+ "ioc_user-agent":["user_agent"], "ioc_user-agent_downcase":false, "ioc_user-agent_iocnote":2, "ioc_user-agent_iocid":1005,
+ "ioc_uri":["_url","_request","_uripath_global"], "ioc_uri_downcase":false, "ioc_uri_iocnote":2, "ioc_uri_iocid":1006,
+ "ioc_attachment":["attachment","_uriparam","_uripage"], "ioc_attachment_downcase":false, "ioc_attachment_iocnote":1, "ioc_attachment_iocid":1007
+ }
+ ```
+
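The field-selection rule above can be sketched like this (illustrative only; the IOC values and event fields are hypothetical): a list such as `"ioc_hostname":["_host"]` applies to every event field whose name contains `_host`.

```ruby
# Hypothetical IOC database and configuration.
IOC_DB = { 'ioc_hostname' => ['evil.example'] }
CONF   = { 'ioc_hostname' => ['_host'], 'ioc_hostname_downcase' => true }

# Returns [field, matched_value] pairs for every IOC hit in the event.
def ioc_hits(event, conf = CONF, db = IOC_DB)
  hits = []
  db.each do |ioc_name, values|
    patterns = conf[ioc_name] || []
    event.each do |field, val|
      next unless patterns.any? { |p| field.include?(p) }
      v = conf["#{ioc_name}_downcase"] ? val.to_s.downcase : val.to_s
      hits << [field, v] if values.include?(v)
    end
  end
  hits
end
```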
+ #### ioc_local.json
+ This file is generated (in JSON format) by the plugin when the signature function is used with the 'extract' parameter.
+ ** The first time, create the file empty (echo '{}' > ioc_local.json) **
+
+ #### Script to generate ioc.json
+ Use the ioc_create.sh script to create the ioc.json file (in the path "/etc/logstash/db/") from a MISP db.
+ ** Requires PyMISP (https://github.com/MISP/PyMISP), wget (to download the alexa db), misp2json4ioc.py (included in the scripts folder), and blacklist.json (included in conf-samples) **
+
+ ##### blacklist.json
+ This file is used to avoid adding IOCs that would create more false positives.
+ ** This file is in JSON format **
+ ```
+ {
+ "ioc_ip":["(127\\.[0-9]+\\.[0-9]+\\.[0-9]+|10\\.\\d+\\.\\d+\\.\\d+|192\\.168\\.\\d+\\.\\d+|172\\.([1-2][0-9]|0|30|31)\\.\\d+\\.\\d+|255\\.255\\.255\\.\\d+)"],
+ "email-attachment":[],
+ "ioc_attachment":["2"],
+ "ioc_emailaddr":[],
+ "ioc_uri":["\/"],
+ "ioc_domain":[],
+ "ioc_hostname":[],
+ "ioc_user-agent":[],
+ "ioc_email-subject":[],
+ "ioc_as":[]
+ }
+ ```
+
+ ### SIGNATURES
+ The sig.json file contains the rules checked against events.
+ The first key is 'rules'; its value is an array containing all the rules.
+ Each rule is a hash made of several optional and mandatory elements:
+ * Level 1: every first-level key in the hash is a field name {fieldX:{},fieldY:{},...} holding search techniques; for the rule to match, every technique in every field must match!
+ * In one field (only) you must add this information (in its hash):
+   * "id": X => X is an integer; give a unique id number identifying the rule
+   * "name": "String" => the string gives the rule its name
+   * "type": 1 or 2 => use 1 for a rule searched without requiring a prior rule match, and 2 for a rule searched only if another rule already matched.
+   * "note": 1 to X => the score given when the rule matches
+   * "modeFP": true or false => drops the event when the rule matches (false-positive mode)
+   * "extract": {"field": "ioc_x"} => ** optional ** ; extracts the value of the field named by the hash key and puts it in the local IOC database, under the ioc_X list named by the hash value.
+ * Frequency & correlation techniques across different events:
+   * "freq_field": [fieldx,fieldz,...] => an array of event field names whose values link related events together
+   * "freq_delay": x (in seconds) => the maximum time between the first and last event (if freq_count == 3, between event 1 and event 3)
+   * "freq_count": y => the number of events that must be seen for a match
+   * "freq_resettime": z (in seconds) => how long to wait before searching for a new frequency match after a detection
+   * "correlate_change_fieldvalue": [] => an array of field names whose values must differ between the matched events
+ * Search techniques:
+   * "motif": ["X","Z"] => searches for a motif in the field. The field must contain at least one element of the array; if the field contains X, the technique matches!
+   * "false": {} => takes an empty hash; indicates that the field must NOT be present in the event
+   * "regexp": ["^\\d+$","..."] => an array of regexps; every regexp must match for the technique to match.
+   * "notregexp": [] => an array of regexps; every regexp must ** not ** match.
+   * "date": {'egal'|'inf'|'sup'|'diff': x in seconds} => for fields containing a date; checks the value against (time.now)-x with the operator (<, >, !=, ==)
+   * "hour": {'egal'|'inf'|'sup'|'diff': 0 to 23} => for fields containing a date; checks the hour of the value with the operator (<, >, !=, ==)
+   * "day": {'egal'|'inf'|'sup'|'diff': 0 to 6} => for fields containing a date; checks the day of the value with the operator (<, >, !=, ==)
+   * "ipaddr": {'egal'|'diff': ipaddr or subnet} => for fields containing an IP address; compares the field (with == or !=) to the hash value
+   * "sizeope": {['egal'|'inf'|'sup'|'diff']: x} => for string fields; compares the string's length to the hash value using the hash operator.
+   * "numope": {['egal'|'inf'|'sup'|'diff']: x} => for integer fields; compares the number to the hash value using the hash operator.
+   * "compope": {"fieldz": {['egal'|'inf'|'sup'|'diff']: "string"/numeric}} => compares two fields of the same value type within the same event
+ * ** !! Two more techniques are described in the information section above **
+
+ ** This file is in JSON format -- the example does not use every sig possibility **
+ ```
+ {"rules":[
+ {"type":{"motif":["squid"],"type":1,"note":1,"name":"Field User-AGENT not present","id":1},"user_agent":{"false":{}}},
+ {"new_value_dst_host":{"sizeope":{"sup":1},"type":1,"note":1,"name":"New value dst_host","id":2},"type":{"motif":["squid"]}},
+ {"elapsed_time":{"numope":{"sup":900000},"type":1,"note":2,"name":"Connection time too long > 15minutes","id":3}},
+ {"type":{"motif":["squid"],"type":2,"note":2,"name":"Referer FIELD not present","id":4},"uri_proto":{"notregexp":["tunnel"]},"referer_host":{"false":{}}}
+ ]}
+ ```
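The freq_* options above describe a sliding-window counter. A simplified sketch of the idea (an assumption about the mechanism; freq_resettime handling is omitted): count events sharing the same key-field values, and fire once freq_count events land within freq_delay seconds.

```ruby
class FreqTracker
  def initialize(delay:, count:)
    @delay = delay                         # freq_delay, in seconds
    @count = count                         # freq_count
    @seen  = Hash.new { |h, k| h[k] = [] } # key (freq_field values) -> timestamps
  end

  # Record an event at time `now` for `key`; true when the rule fires.
  def hit?(key, now)
    times = @seen[key]
    times << now
    times.reject! { |t| now - t > @delay } # keep only the window
    times.size >= @count
  end
end
```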
+ ### REFERENCE (FORMERLY ANOMALIE)
+ #### conf_ref.json
+ This file contains the rules checked on events, and it is also used to build the reference databases (script).
+ The JSON file contains a key named 'rules' whose value is an array of all the rules.
+ A rule is composed of several elements:
+ * Key "pivot_field": filters which events a rule applies to (rule selection); the value is a hash whose keys are event field names and whose values are arrays of values that must be present in the event field.
+ * Key "list_sig": an array of event field names to check against the reference databases. If a field is missing in some cases, it doesn't matter.
+ * Key "relation_min": an integer used for relationships between non-unique fields. Each relationship produces a simhash, and the reference database stores how often each simhash value was seen across all events of the type. For example, if simhash "1111111" was counted 9 times across all events and you set this parameter to 10, the rule matches because the relationship is considered unknown.
+ * Key "simhash_size": an integer setting the simhash size. The smaller the value, the more likely you are to find a simhash matching a nearby event.
+ * Key "simhash_use_size" (not working; I will work on it!)
+ * Key "id": identifies the matched rule; used in the score/note function.
+
+ ** This file is in JSON format **
+ ```
+ {"rules":[
+ {"pivot_field":{"tags":["squid"]}, "list_sig": ["src_host","src_ip","dst_host","dst_ip","uri_proto","uri_global"], "relation_min": 10, "simhash_size": 32, "simhash_use_size": 32, "id": 2001}
+ ]}
+ ```
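A rough illustration of the simhash idea used above (not the simhash gem the plugin depends on; this sketch uses a stable MD5 digest instead): each feature votes on every bit position, and the sign of the tally sets that bit of the fingerprint.

```ruby
require 'digest'

# Build a `bits`-wide simhash over a list of feature strings.
def simhash(features, bits = 32)
  weights = Array.new(bits, 0)
  features.each do |f|
    h = Digest::MD5.hexdigest(f.to_s).to_i(16)
    # Integer#[] reads bit i of the digest; each bit votes +1 or -1.
    bits.times { |i| weights[i] += h[i] == 1 ? 1 : -1 }
  end
  weights.each_with_index.reduce(0) { |acc, (w, i)| w > 0 ? acc | (1 << i) : acc }
end
```

Feature sets that differ by only one field produce fingerprints that differ in few bits, which is what makes the "around result" matching possible.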
+ #### Create reference database (reference.json)
+ To create the database (the reference.json file), use the script included in the scripts folder.
+ Run the script with the syntax: ./create.rb conf_ref.json pattern.db https://user:secret@localhost:9200
+ To build a good database, use an elasticsearch instance containing clean log data; otherwise, review the database contents and fix any strange values.
+
+ ##### note_ref_defaut.json
+ This file contains the default note/score for each matched reference check.
+ 'NOTE_UNIQ_REDUC' reduces the score of a match on a field check. For example, if a LEN problem matches on a field whose values are unique, the score is not 0.25 but 0.25-0.1 => 0.15.
+ ** This file is in JSON format **
+ ```
+ {
+ "NOTE_UNIQ_REDUC": 0.1,
+ "NOTE_DEFAULT": 2,
+ "NOTE_LISTV": 0.25,
+ "NOTE_ENCODING": 0.25,
+ "NOTE_LEN": 0.25,
+ "NOTE_LEN_AVG": 0.25,
+ "NOTE_LEN_EVEN": 0.25,
+ "NOTE_REGEXP": 0.25,
+ "NOTE_REGEXP_MIN": 0.25
+ }
+ ```
+ ##### pattern.db
+ This file is used by the regexp checks on field values.
+ This file is ** not ** in JSON format.
+ ```
+ ALPHA_MAJU=>>[A-Z]
+ ALPHA_MINU=>>[a-z]
+ NUM_1to9=>>[1-9]
+ NUM_0to9=>>[0-9]
+ ALPHA_MAJandMIN=>>[A-Za-z]
+ HEXA=>>(0x|x|\\x)[0-9A-Fa-f][0-9A-Fa-f]
+ CHAR_SPE_NUL=>>\x00
+ CHAR_SPE_SOH=>>\x01
+ CHAR_SPE_STX=>>\x02
+ CHAR_SPE_ETX=>>\x03
+ CHAR_SPE_EOT=>>\x04
+ CHAR_SPE_ENQ=>>\x05
+ CHAR_SPE_ACK=>>\x06
+ CHAR_SPE_BEL=>>\x07
+ CHAR_SPE_BS=>>\x08
+ CHAR_SPE_HT=>>\x09
+ CHAR_SPE_LF=>>\x0A
+ CHAR_SPE_VT=>>\x0B
+ CHAR_SPE_FF=>>\x0C
+ CHAR_SPE_CR=>>\x0D
+ CHAR_SPE_SO=>>\x0E
+ CHAR_SPE_SI=>>\x0F
+ CHAR_SPE_DLE=>>\x10
+ CHAR_SPE_DC1=>>\x11
+ CHAR_SPE_DC2=>>\x12
+ CHAR_SPE_DC3=>>\x13
+ CHAR_SPE_DC4=>>\x14
+ CHAR_SPE_NAK=>>\x15
+ CHAR_SPE_SYN=>>\x16
+ CHAR_SPE_ETB=>>\x17
+ CHAR_SPE_CAN=>>\x18
+ CHAR_SPE_EM=>>\x19
+ CHAR_SPE_SUB=>>\x1A
+ CHAR_SPE_ESC=>>\x1B
+ CHAR_SPE_FS=>>\x1C
+ CHAR_SPE_GS=>>\x1D
+ CHAR_SPE_RS=>>\x1E
+ CHAR_SPE_US=>>\x1F
+ CHAR_SPE_SP=>>\x20
+ CHAR_SPE_EXCL=>>\x21
+ CHAR_SPE_QUOTE=>>\x22
+ CHAR_SPE_DIEZ=>>\x23
+ CHAR_SPE_DOLLAR=>>\x24
+ CHAR_SPE_POURC=>>\x25
+ CHAR_SPE_AND=>>\x26
+ CHAR_SPE_QUOTE2=>>\x27
+ CHAR_SPE_DPARA=>>\x28
+ CHAR_SPE_FPARA=>>\x29
+ CHAR_SPE_ETOI=>>\x2A
+ CHAR_SPE_PLUS=>>\x2B
+ CHAR_SPE_VIRG=>>\x2C
+ CHAR_SPE_MOINS=>>\x2D
+ CHAR_SPE_POINT=>>\x2E
+ CHAR_SPE_SLASH=>>\x2F
+ CHAR_SPE_2POINT=>>\x3A
+ CHAR_SPE_POINTVIRG=>>\x3B
+ CHAR_SPE_DBALIZ=>>\x3C
+ CHAR_SPE_EGAL=>>\x3D
+ CHAR_SPE_FBALIZ=>>\x3E
+ CHAR_SPE_INTER=>>\x3F
+ CHAR_SPE_AROB=>>\x40
+ CHAR_SPE_DCROCH=>>\x5B
+ CHAR_SPE_ASLASH=>>\x5C
+ CHAR_SPE_FCROCH=>>\x5D
+ CHAR_SPE_CHAP=>>\x5E
+ CHAR_SPE_UNDERS=>>\x5F
+ CHAR_SPE_QUOTE3=>>\x60
+ CHAR_SPE_DACCOL=>>\x7B
+ CHAR_SPE_OR=>>\x7C
+ CHAR_SPE_FACCOL=>>\x7D
+ CHAR_SPE_TILD=>>\x7E
+ CHAR_SPE_DEL=>>\x7F
+ CHAR_ETEND_80=>>\x80
+ CHAR_ETEND_81=>>\x81
+ CHAR_ETEND_82=>>\x82
+ CHAR_ETEND_83=>>\x83
+ CHAR_ETEND_84=>>\x84
+ CHAR_ETEND_85=>>\x85
+ CHAR_ETEND_86=>>\x86
+ CHAR_ETEND_87=>>\x87
+ CHAR_ETEND_88=>>\x88
+ CHAR_ETEND_89=>>\x89
+ CHAR_ETEND_8A=>>\x8A
+ CHAR_ETEND_8B=>>\x8B
+ CHAR_ETEND_8C=>>\x8C
+ CHAR_ETEND_8D=>>\x8D
+ CHAR_ETEND_8E=>>\x8E
+ CHAR_ETEND_8F=>>\x8F
+ CHAR_ETEND_90=>>\x90
+ CHAR_ETEND_91=>>\x91
+ CHAR_ETEND_92=>>\x92
+ CHAR_ETEND_93=>>\x93
+ CHAR_ETEND_94=>>\x94
+ CHAR_ETEND_95=>>\x95
+ CHAR_ETEND_96=>>\x96
+ CHAR_ETEND_97=>>\x97
+ CHAR_ETEND_98=>>\x98
+ CHAR_ETEND_99=>>\x99
+ CHAR_ETEND_9A=>>\x9A
+ CHAR_ETEND_9B=>>\x9B
+ CHAR_ETEND_9C=>>\x9C
+ CHAR_ETEND_9D=>>\x9D
+ CHAR_ETEND_9E=>>\x9E
+ CHAR_ETEND_9F=>>\x9F
+ CHAR_ETEND_A0=>>\xA0
+ CHAR_ETEND_A1=>>\xA1
+ CHAR_ETEND_A2=>>\xA2
+ CHAR_ETEND_A3=>>\xA3
+ CHAR_ETEND_A4=>>\xA4
+ CHAR_ETEND_A5=>>\xA5
+ CHAR_ETEND_A6=>>\xA6
+ CHAR_ETEND_A7=>>\xA7
+ CHAR_ETEND_A8=>>\xA8
+ CHAR_ETEND_A9=>>\xA9
+ CHAR_ETEND_AA=>>\xAA
+ CHAR_ETEND_AB=>>\xAB
+ CHAR_ETEND_AC=>>\xAC
+ CHAR_ETEND_AD=>>\xAD
+ CHAR_ETEND_AE=>>\xAE
+ CHAR_ETEND_AF=>>\xAF
+ CHAR_ETEND_B0=>>\xB0
+ CHAR_ETEND_B1=>>\xB1
+ CHAR_ETEND_B2=>>\xB2
+ CHAR_ETEND_B3=>>\xB3
+ CHAR_ETEND_B4=>>\xB4
+ CHAR_ETEND_B5=>>\xB5
+ CHAR_ETEND_B6=>>\xB6
+ CHAR_ETEND_B7=>>\xB7
+ CHAR_ETEND_B8=>>\xB8
+ CHAR_ETEND_B9=>>\xB9
+ CHAR_ETEND_BA=>>\xBA
+ CHAR_ETEND_BB=>>\xBB
+ CHAR_ETEND_BC=>>\xBC
+ CHAR_ETEND_BD=>>\xBD
+ CHAR_ETEND_BE=>>\xBE
+ CHAR_ETEND_BF=>>\xBF
+ CHAR_ETEND_C0=>>\xC0
+ CHAR_ETEND_C1=>>\xC1
+ CHAR_ETEND_C2=>>\xC2
+ CHAR_ETEND_C3=>>\xC3
+ CHAR_ETEND_C4=>>\xC4
+ CHAR_ETEND_C5=>>\xC5
+ CHAR_ETEND_C6=>>\xC6
+ CHAR_ETEND_C7=>>\xC7
+ CHAR_ETEND_C8=>>\xC8
+ CHAR_ETEND_C9=>>\xC9
+ CHAR_ETEND_CA=>>\xCA
+ CHAR_ETEND_CB=>>\xCB
+ CHAR_ETEND_CC=>>\xCC
+ CHAR_ETEND_CD=>>\xCD
+ CHAR_ETEND_CE=>>\xCE
+ CHAR_ETEND_CF=>>\xCF
+ CHAR_ETEND_D0=>>\xD0
+ CHAR_ETEND_D1=>>\xD1
+ CHAR_ETEND_D2=>>\xD2
+ CHAR_ETEND_D3=>>\xD3
+ CHAR_ETEND_D4=>>\xD4
+ CHAR_ETEND_D5=>>\xD5
+ CHAR_ETEND_D6=>>\xD6
+ CHAR_ETEND_D7=>>\xD7
+ CHAR_ETEND_D8=>>\xD8
+ CHAR_ETEND_D9=>>\xD9
+ CHAR_ETEND_DA=>>\xDA
+ CHAR_ETEND_DB=>>\xDB
+ CHAR_ETEND_DC=>>\xDC
+ CHAR_ETEND_DD=>>\xDD
+ CHAR_ETEND_DE=>>\xDE
+ CHAR_ETEND_DF=>>\xDF
+ CHAR_ETEND_E0=>>\xE0
+ CHAR_ETEND_E1=>>\xE1
+ CHAR_ETEND_E2=>>\xE2
+ CHAR_ETEND_E3=>>\xE3
+ CHAR_ETEND_E4=>>\xE4
+ CHAR_ETEND_E5=>>\xE5
+ CHAR_ETEND_E6=>>\xE6
+ CHAR_ETEND_E7=>>\xE7
+ CHAR_ETEND_E8=>>\xE8
+ CHAR_ETEND_E9=>>\xE9
+ CHAR_ETEND_EA=>>\xEA
+ CHAR_ETEND_EB=>>\xEB
+ CHAR_ETEND_EC=>>\xEC
+ CHAR_ETEND_ED=>>\xED
+ CHAR_ETEND_EE=>>\xEE
+ CHAR_ETEND_EF=>>\xEF
+ CHAR_ETEND_F0=>>\xF0
+ CHAR_ETEND_F1=>>\xF1
+ CHAR_ETEND_F2=>>\xF2
+ CHAR_ETEND_F3=>>\xF3
+ CHAR_ETEND_F4=>>\xF4
+ CHAR_ETEND_F5=>>\xF5
+ CHAR_ETEND_F6=>>\xF6
+ CHAR_ETEND_F7=>>\xF7
+ CHAR_ETEND_F8=>>\xF8
+ CHAR_ETEND_F9=>>\xF9
+ CHAR_ETEND_FA=>>\xFA
+ CHAR_ETEND_FB=>>\xFB
+ CHAR_ETEND_FC=>>\xFC
+ CHAR_ETEND_FD=>>\xFD
+ CHAR_ETEND_FE=>>\xFE
+ CHAR_ETEND_FF=>>\xFF
+ ```
531

##### reference.json
**TODO: describe the file structure in case you need/want to change it**

### NOTE
The file note.json contains rules for the correlation score: you can lower or raise the event score when multiple rules (IOC/REF/SIG) match.
The JSON file has one main key, 'rules', whose value is an array. Each array element is a rule, composed of several keys:
* 'id': an array of IDs that must all be present in the event
* 'optid': an array of IDs that may optionally be present in the event
* 'opt_num': an integer giving how many of the optional IDs must be present in the event. In the example below, at least one of the IDs 3 and 4 must be present.
* 'noid': an array of IDs that must **not** be present in the event
* 'note': the score to apply when the rule matches
* 'overwrite': a boolean indicating whether to overwrite the score with a lower value even when the current event score is higher
**This file is in JSON format.**
```
{"rules":[
  {"id":[2],"optid":[3,4],"opt_num":1,"noid":[],"note":3,"overwrite":true}
]
}
```
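As a rough sketch (not the plugin's actual code), the rule evaluation described above could look like this in Ruby, where `matched_ids` is the list of rule IDs already detected in the event:

```ruby
# Hypothetical sketch of the note.json correlation check described above.
def rule_matches?(rule, matched_ids)
  # all mandatory IDs must be present
  return false unless (rule['id'] - matched_ids).empty?
  # none of the forbidden IDs may be present
  return false unless (rule['noid'] & matched_ids).empty?
  # at least 'opt_num' of the optional IDs must be present
  (rule['optid'] & matched_ids).size >= rule['opt_num']
end

rule = { 'id' => [2], 'optid' => [3, 4], 'opt_num' => 1,
         'noid' => [], 'note' => 3, 'overwrite' => true }

rule_matches?(rule, [2, 3]) # => true  (id 2 present, one optional id matched)
rule_matches?(rule, [2])    # => false (no optional id matched)
```

When the rule matches, the plugin would apply the 'note' score according to the 'overwrite' flag.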
### FINGERPRINT
The file fingerprint_conf.json contains rules that build a fingerprint for each event and tag it as "first" or with complementary information.
Each top-level JSON key must be a value present in select_fp (main configuration). Its value is a hash composed of several keys:
* 'fields': an array with the names of the fields used to create the simhash.
* 'delay': delay in seconds after which the "first" tag is set again (useful for DHCP, for example).
* 'hashbit': a number defining the size of the simhash in bits.
**This file is in JSON format.**
```
{
  "squid":{"fields":["src_ip","dst_host","dst_ip","uri_proto","sig_detected_name","ioc_detected","tags"],"delay":36000, "hashbit": 32}
}
```
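The fingerprint is a simhash over the listed fields, so near-identical events collapse onto the same (or a nearby) value. Below is a minimal sketch of the generic simhash technique, not the plugin's exact implementation:

```ruby
require 'digest'

# Generic simhash: hash each value, then vote per bit position; similar
# input sets differ in only a few bits of the resulting fingerprint.
def simhash(values, hashbit = 32)
  counts = Array.new(hashbit, 0)
  values.each do |v|
    h = Digest::MD5.hexdigest(v.to_s).to_i(16)
    hashbit.times { |i| counts[i] += (h[i] == 1 ? 1 : -1) }
  end
  counts.each_with_index.inject(0) { |fp, (c, i)| c > 0 ? fp | (1 << i) : fp }
end

# Field values here are made up for illustration; the field *names* come
# from the "squid" example above (src_ip, dst_host, dst_ip, uri_proto).
simhash(%w[10.0.0.1 example.com 93.184.216.34 http], 32)
```

Two events sharing most field values produce fingerprints that differ in only a few bit positions, which is what makes the fingerprint usable for grouping near-duplicates.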
#### drop-fp.json
Use this file to drop events, for example known false positives. Each JSON key is a simhash and its value is the reason for the drop.
**This file is in JSON format.**
```
{"821861840": "false positive: update of software XXX"}
```
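Conceptually the drop check is just a key lookup on the event's fingerprint (a sketch, not the plugin's code):

```ruby
require 'json'

# drop-fp.json content from the example above
drops = JSON.parse('{"821861840": "false positive: update of software XXX"}')

event_fp = '821861840'
if drops.key?(event_fp)
  # the event would be dropped; drops[event_fp] gives the documented reason
end
```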
+ ### FREQUENCE
570
+ THe file conf_freq.json contains rules for create interne db frequence (restart from zero if you restart logstash).
571
+ The first key is rules and value is array which contains rules.
572
+ A rule is hash composed of multi element:
573
+ * Key 'select_field': value is hash with key field name and value array contais value must present. THis parameter is filter.
574
+ * Key 'note': use parameter for set score if rule matched
575
+ * Key 'refresh_time': use parameter for give delay between each verify if event increase
576
+ * Key 'reset_time': use paramter for give delai for reset database value (for only this rule)
577
+ * Key 'wait_after_reset': time to wait after reset database or first start
578
+ * Key 'id': value is number. Use parameter to fix id of rule.
579
+ ** This file is a Json format **
580
```
{"rules":[
  {"select_field": {"tags":["squid"],"return_code":["404"]}, "note": 2, "refresh_time": 60, "reset_time": 86400, "wait_after_reset": 10, "id": 3001}
]}
```
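The 'select_field' filter can be read as: every listed field must hold one of the allowed values. A sketch under that assumption (not the plugin's code; event fields are treated as plain scalars here for simplicity):

```ruby
# Does a frequency rule's select_field filter apply to this event?
def freq_rule_applies?(rule, event)
  rule['select_field'].all? do |field, allowed|
    allowed.include?(event[field])
  end
end

rule = { 'select_field' => { 'tags' => ['squid'], 'return_code' => ['404'] },
         'note' => 2, 'refresh_time' => 60, 'reset_time' => 86_400,
         'wait_after_reset' => 10, 'id' => 3001 }

freq_rule_applies?(rule, 'tags' => 'squid', 'return_code' => '404') # => true
freq_rule_applies?(rule, 'tags' => 'squid', 'return_code' => '200') # => false
```

Only events passing this filter would be counted in the frequency database and scored with 'note' when the frequency check triggers.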

## Documentation

Logstash provides infrastructure to automatically generate documentation for this plugin. We use the asciidoc format to write documentation, so any comments in the source code are first converted into asciidoc and then into html. All plugin documentation is placed under one [central location](http://www.elastic.co/guide/en/logstash/current/).

- For formatting code or config examples, you can use the asciidoc `[source,ruby]` directive
- For more asciidoc formatting tips, see the excellent reference at https://github.com/elastic/docs#asciidoc-guide

## Need Help?

Need help? Try #logstash on freenode IRC or the https://discuss.elastic.co/c/logstash discussion forum.

## Developing

### 1. Plugin Development and Testing

#### Code
- To get started, you'll need JRuby with the Bundler gem installed.

- Create a new plugin or clone an existing one from the GitHub [logstash-plugins](https://github.com/logstash-plugins) organization. We also provide [example plugins](https://github.com/logstash-plugins?query=example).

- Install dependencies
```sh
bundle install
```

#### Test

- Update your dependencies

```sh
bundle install
```

- Run tests

```sh
bundle exec rspec
```

### 2. Running your unpublished Plugin in Logstash

#### 2.1 Run in a local Logstash clone

- Edit Logstash `Gemfile` and add the local plugin path, for example:
```ruby
gem "logstash-filter-awesome", :path => "/your/local/logstash-filter-awesome"
```
- Install plugin
```sh
# Logstash 2.3 and higher
bin/logstash-plugin install --no-verify

# Prior to Logstash 2.3
bin/plugin install --no-verify
```
- Run Logstash with your plugin
```sh
bin/logstash -e 'filter {awesome {}}'
```
At this point any modifications to the plugin code will be applied to this local Logstash setup. After modifying the plugin, simply rerun Logstash.

#### 2.2 Run in an installed Logstash

You can use the same **2.1** method to run your plugin in an installed Logstash by editing its `Gemfile` and pointing the `:path` to your local plugin development directory, or you can build the gem and install it using:

- Build your plugin gem
```sh
gem build logstash-filter-awesome.gemspec
```
- Install the plugin from the Logstash home
```sh
# Logstash 2.3 and higher
bin/logstash-plugin install --no-verify

# Prior to Logstash 2.3
bin/plugin install --no-verify
```
- Start Logstash and proceed to test the plugin

## Contributing

All contributions are welcome: ideas, patches, documentation, bug reports, complaints, and even something you drew up on a napkin.

Programming is not a required skill. Whatever you've seen about open source and maintainers or community members saying "send patches or die" - you will not see that here.

It is more important to the community that you are able to contribute.

For more information about contributing, see the [CONTRIBUTING](https://github.com/elastic/logstash/blob/master/CONTRIBUTING.md) file.