logstash-output-mageshlogs 0.1.0
- checksums.yaml +7 -0
- data/CHANGELOG.md +2 -0
- data/CONTRIBUTORS +10 -0
- data/DEVELOPER.md +2 -0
- data/Gemfile +3 -0
- data/LICENSE +11 -0
- data/README.md +86 -0
- data/lib/logstash/outputs/mageshlogs.rb +309 -0
- data/logstash-output-mageshlogs.gemspec +23 -0
- data/spec/outputs/mageshlogs_spec.rb +22 -0
- metadata +96 -0
checksums.yaml
ADDED
```yaml
---
SHA256:
  metadata.gz: a6c4064d475c7b3888cb102e97032ace990371d9d1b819332bb20c52320fc1f4
  data.tar.gz: 5f3b3181be0c13013a7fee4ed2023da2e95aab27a0daaace4ffead6a1ebefad3
SHA512:
  metadata.gz: c413d7d9f65f805e05e8cf3da60757f59782eb37f8385135b68ac832258f0df4a8e873a3b46d8da39c6f007a5e450ee7b3fd7a0710d758dd0966eb70b16ab169
  data.tar.gz: 35d9433ac3ec4f2784d55adf94330881f43e6ee8e201038ad45644f9926f0971a1dbcb1c1339620df3204d62e002668e400f8826199398916bcde9c758afaae8
```
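These digests can be recomputed with Ruby's standard Digest library. A minimal sketch over a stand-in payload (the real inputs are the gem's `metadata.gz` and `data.tar.gz` archives, not the string used here):

```ruby
require "digest"

# checksums.yaml pairs each packaged archive with SHA256 and SHA512 digests.
# Recomputing over a stand-in string shows the shape of those values.
data = "example archive bytes"
sha256 = Digest::SHA256.hexdigest(data)
sha512 = Digest::SHA512.hexdigest(data)
puts sha256.length # 64 hex characters
puts sha512.length # 128 hex characters
```

RubyGems compares these digests against the downloaded gem's contents to detect corruption or tampering.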
data/CHANGELOG.md
ADDED
data/CONTRIBUTORS
ADDED
The following is a list of people who have contributed ideas, code, bug
reports, or in general have helped logstash along its way.

Contributors:
* -

Note: If you've sent us patches, bug reports, or otherwise contributed to
Logstash, and you aren't on the list above and want to be, please let us know
and we'll make sure you're here. Contributions from folks like you are what make
open source awesome.
data/DEVELOPER.md
ADDED
data/Gemfile
ADDED
data/LICENSE
ADDED
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
data/README.md
ADDED
# Logstash Plugin

This is a plugin for [Logstash](https://github.com/elastic/logstash).

It is fully free and fully open source. The license is Apache 2.0, meaning you are pretty much free to use it however you want in whatever way.

## Documentation

Logstash provides infrastructure to automatically generate documentation for this plugin. We use the asciidoc format to write documentation, so any comments in the source code will first be converted into asciidoc and then into html. All plugin documentation is placed under one [central location](http://www.elastic.co/guide/en/logstash/current/).

- For formatting code or config examples, you can use the asciidoc `[source,ruby]` directive
- For more asciidoc formatting tips, see the excellent reference at https://github.com/elastic/docs#asciidoc-guide

## Need Help?

Need help? Try #logstash on freenode IRC or the https://discuss.elastic.co/c/logstash discussion forum.

## Developing

### 1. Plugin Development and Testing

#### Code
- To get started, you'll need JRuby with the Bundler gem installed.

- Create a new plugin or clone an existing one from the GitHub [logstash-plugins](https://github.com/logstash-plugins) organization. We also provide [example plugins](https://github.com/logstash-plugins?query=example).

- Install dependencies
```sh
bundle install
```

#### Test

- Update your dependencies

```sh
bundle install
```

- Run tests

```sh
bundle exec rspec
```

### 2. Running your unpublished Plugin in Logstash

#### 2.1 Run in a local Logstash clone

- Edit Logstash `Gemfile` and add the local plugin path, for example:
```ruby
gem "logstash-filter-awesome", :path => "/your/local/logstash-filter-awesome"
```
- Install plugin
```sh
bin/logstash-plugin install --no-verify
```
- Run Logstash with your plugin
```sh
bin/logstash -e 'filter {awesome {}}'
```
At this point any modifications to the plugin code will be applied to this local Logstash setup. After modifying the plugin, simply rerun Logstash.

#### 2.2 Run in an installed Logstash

You can use the same **2.1** method to run your plugin in an installed Logstash by editing its `Gemfile` and pointing the `:path` to your local plugin development directory, or you can build the gem and install it:

- Build your plugin gem
```sh
gem build logstash-filter-awesome.gemspec
```
- Install the plugin from the Logstash home
```sh
bin/logstash-plugin install /your/local/plugin/logstash-filter-awesome.gem
```
- Start Logstash and proceed to test the plugin

## Contributing

All contributions are welcome: ideas, patches, documentation, bug reports, complaints, and even something you drew up on a napkin.

Programming is not a required skill. Whatever you've seen about open source and maintainers or community members saying "send patches or die" - you will not see that here.

It is more important to the community that you are able to contribute.

For more information about contributing, see the [CONTRIBUTING](https://github.com/elastic/logstash/blob/master/CONTRIBUTING.md) file.
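Once installed, this gem's output is configured in a pipeline's `output` section. A minimal sketch, assuming a `log_type_config` token obtained from Site24x7 (the value below is a hypothetical placeholder, not a real token):

```
output {
  mageshlogs {
    log_type_config => "<base64url-encoded logtype JSON from Site24x7>"
  }
}
```

The token decodes to the JSON hash the plugin reads in `init_variables` (`uploadDomain`, `apiKey`, `logType`, `dateFormat`, plus optional `regex`, `jsonPath`, `ignored_fields`, and `timezone` keys).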
data/lib/logstash/outputs/mageshlogs.rb
ADDED

```ruby
# encoding: utf-8
require "logstash/outputs/base"
require "logstash/json"
require "zlib"
require "date"
require "json"
require "base64"
require "socket"   # for Socket.gethostname
require "stringio" # for the gzip buffer

# A mageshlogs output that parses incoming events and uploads them to Site24x7.
class LogStash::Outputs::Mageshlogs < LogStash::Outputs::Base
  config_name "mageshlogs"

  S247_MAX_RECORD_COUNT = 500
  S247_MAX_RECORD_SIZE = 1000000
  S247_MAX_BATCH_SIZE = 5000000
  S247_LOG_UPLOAD_CHECK_INTERVAL = 600 # 10 minutes
  S247_TRUNCATION_SUFFIX = "##TRUNCATED###"

  default :codec, "json"

  config :log_type_config, :validate => :string, :required => true
  config :max_retry, :validate => :number, :required => false, :default => 3
  config :retry_interval, :validate => :number, :required => false, :default => 2
  config :http_idle_timeout, :validate => :number, :required => false, :default => 5
  config :http_read_timeout, :validate => :number, :required => false, :default => 30
  config :http_proxy, :validate => :string, :required => false, :default => nil

  public
  def register
    init_variables()
    init_http_client(@logtype_config)
  end # def register

  public
  def multi_receive(events)
    return if events.empty?
    @logger.info("Events received : #{events}")
    process_http_events(events)
  end

  def close
    @s247_http_client.close if @s247_http_client
  end

  def base64_url_decode(str)
    # Restore '=' padding only when needed and map the URL-safe alphabet back.
    str += '=' * ((4 - str.length % 4) % 4)
    Base64.decode64(str.tr('-_', '+/'))
  end

  def init_variables()
    @logtype_config = JSON.parse(base64_url_decode(@log_type_config))
    @s247_custom_regex = @logtype_config.key?('regex') ? Regexp.compile(@logtype_config['regex'].gsub('?P<', '?<')) : nil
    @s247_ignored_fields = @logtype_config.fetch('ignored_fields', [])
    # String keys: the original built this hash with symbol keys ({'hrs': 0})
    # but read it with string keys everywhere else.
    @s247_tz = { 'hrs' => 0, 'mins' => 0 } # UTC
    @log_source = Socket.gethostname
    @valid_logtype = true
    @log_upload_allowed = true
    @log_upload_stopped_time = 0
    @s247_datetime_format_string = @logtype_config['dateFormat'].sub('%f', '%N')
    if !@s247_datetime_format_string.include? 'unix'
      @is_year_present = @s247_datetime_format_string.include?('%y') || @s247_datetime_format_string.include?('%Y')
      @s247_datetime_format_string += ' %Y' unless @is_year_present
      @is_timezone_present = @s247_datetime_format_string.include? '%z'
      if !@is_timezone_present && @logtype_config.key?('timezone')
        # Invert the configured offset (e.g. "+0530") so timestamps convert to
        # UTC. The original sliced [1..4]/[3..6], which grabbed the wrong digits.
        tz_value = @logtype_config['timezone']
        sign = tz_value.start_with?('-') ? 1 : -1
        @s247_tz['hrs'] = sign * Integer(tz_value[1..2], 10)
        @s247_tz['mins'] = sign * Integer(tz_value[3..4], 10)
      end
    end
  end

  def init_http_client(logtype_config)
    require 'manticore'
    @upload_url = 'https://' + logtype_config['uploadDomain'] + '/upload'
    @logger.info("Starting HTTP connection to #{@upload_url}")
    @headers = { "Content-Type" => "application/json", "Content-Encoding" => "gzip",
                 "X-DeviceKey" => logtype_config['apiKey'], "X-LogType" => logtype_config['logType'],
                 "X-StreamMode" => "1", "User-Agent" => "LogStash" }
    @s247_http_client = Manticore::Client.new({})
    #@s247_http_client.verify_mode = OpenSSL::SSL::VERIFY_NONE
    #@s247_http_client.idle_timeout = @http_idle_timeout
    #@s247_http_client.read_timeout = @http_read_timeout
  end

  def get_timestamp(datetime_string)
    begin
      # If the date value is in unix format there is no need to parse the string.
      if @s247_datetime_format_string.include? 'unix'
        return (@s247_datetime_format_string == 'unix' ? datetime_string + '000' : datetime_string)
      end
      datetime_string += ' ' + String(Time.new.year) unless @is_year_present
      # Use a local copy of the format so '%z' is not appended again on every call
      # (the original mutated @s247_datetime_format_string here).
      format_string = @s247_datetime_format_string
      if !@is_timezone_present && @logtype_config.key?('timezone')
        format_string += '%z'
        datetime_string += format('%+03d:%02d', @s247_tz['hrs'], @s247_tz['mins'].abs)
      end
      datetime_data = DateTime.strptime(datetime_string, format_string)
      return Integer(datetime_data.strftime('%Q'))
    rescue
      return 0
    end
  end

  def parse_lines(lines)
    parsed_lines = []
    log_size = 0
    lines.each do |line|
      next if line.empty?
      begin
        if match = line.match(@s247_custom_regex)
          log_size += line.bytesize
          log_fields = match.named_captures
          removed_log_size = 0
          @s247_ignored_fields.each do |field_name|
            removed_log_size += log_fields.key?(field_name) ? log_fields.delete(field_name).bytesize : 0
          end
          formatted_line = { '_zl_timestamp' => get_timestamp(log_fields[@logtype_config['dateField']]),
                             's247agentuid' => @log_source }
          formatted_line.merge!(log_fields)
          parsed_lines.push(formatted_line)
          log_size -= removed_log_size
        end
      rescue Exception => e
        @logger.error("Exception in parse_lines #{e.backtrace}")
      end
    end
    return parsed_lines, log_size
  end

  # Returns false when a configured filter excludes the line. The original
  # referenced an undefined `filter_config` variable; the rules live in
  # @logtype_config['filterConfig'] keyed by field name.
  def is_filters_matched(formatted_line)
    if @logtype_config.key?('filterConfig')
      @logtype_config['filterConfig'].each do |field, rule|
        if formatted_line.key?(field) && (rule['match'] ^ rule['values'].include?(formatted_line[field]))
          return false
        end
      end
    end
    true
  end

  def get_json_value(obj, key, datatype = nil)
    if obj != nil && obj.key?(key)
      if datatype == 'json-object'
        arr_json = []
        child_obj = obj[key]
        child_obj = JSON.parse(child_obj.gsub('\\', '\\\\')) if child_obj.class == String
        child_obj.each do |k, v|
          arr_json.push({ 'key' => k, 'value' => String(v) })
        end
        return arr_json
      else
        return obj[key]
      end
    elsif key.include?('.')
      # Resolve dotted keys ("parent.child") one level at a time.
      parent_key = key[0..key.index('.') - 1]
      child_key = key[key.index('.') + 1..-1]
      child_obj = obj[obj.key?(parent_key) ? parent_key : parent_key.capitalize()]
      # gsub, not String#replace: #replace swaps the entire string for its argument.
      child_obj = JSON.parse(child_obj.gsub('\\', '\\\\')) if child_obj.class == String
      return get_json_value(child_obj, child_key)
    end
  end

  def json_log_parser(lines_read)
    log_size = 0
    parsed_lines = []
    lines_read.each do |line|
      next if line.empty?
      current_log_size = 0
      formatted_line = {}
      # The original called Yajl::Parser.parse, but yajl is never required
      # (and is unavailable on JRuby); JSON is already loaded above.
      event_obj = JSON.parse(line)
      @logtype_config['jsonPath'].each do |path_obj|
        value = get_json_value(event_obj, path_obj[path_obj.key?('key') ? 'key' : 'name'], path_obj['type'])
        if value
          formatted_line[path_obj['name']] = value
          current_log_size += String(value).size - (value.class == Array ? value.size * 20 : 0)
        end
      end
      if is_filters_matched(formatted_line)
        formatted_line['_zl_timestamp'] = get_timestamp(formatted_line[@logtype_config['dateField']])
        formatted_line['s247agentuid'] = @log_source
        parsed_lines.push(formatted_line)
        log_size += current_log_size
      end
    end
    return parsed_lines, log_size
  end

  def process_http_events(events)
    batches = batch_http_events(events)
    batches.each do |batched_event|
      formatted_events, log_size = format_http_event_batch(batched_event)
      formatted_events = gzip_compress(formatted_events)
      send_logs_to_s247(formatted_events, log_size)
    end
  end

  def batch_http_events(encoded_events)
    batches = []
    current_batch = []
    current_batch_size = 0
    encoded_events.each_with_index do |encoded_event, i|
      event_message = encoded_event.to_hash['message']
      @logger.info("Message : #{event_message}")
      current_event_size = event_message.bytesize
      if current_event_size > S247_MAX_RECORD_SIZE
        # The original used an undefined DD_TRUNCATION_SUFFIX here; the constant
        # defined at the top of the class is S247_TRUNCATION_SUFFIX.
        event_message = event_message[0..(S247_MAX_RECORD_SIZE - S247_TRUNCATION_SUFFIX.length)] + S247_TRUNCATION_SUFFIX
        current_event_size = event_message.bytesize
      end

      if (i > 0 and i % S247_MAX_RECORD_COUNT == 0) or (current_batch_size + current_event_size > S247_MAX_BATCH_SIZE)
        batches << current_batch
        current_batch = []
        current_batch_size = 0
      end

      current_batch_size += current_event_size
      current_batch << event_message
    end
    batches << current_batch
    batches
  end

  def format_http_event_batch(events)
    parsed_lines = []
    log_size = 0
    if @logtype_config.key?('jsonPath')
      parsed_lines, log_size = json_log_parser(events)
    else
      parsed_lines, log_size = parse_lines(events)
    end
    return LogStash::Json.dump(parsed_lines), log_size
  end

  def gzip_compress(payload)
    gz = StringIO.new
    gz.set_encoding("BINARY")
    z = Zlib::GzipWriter.new(gz, 9)
    begin
      z.write(payload)
    ensure
      z.close
    end
    gz.string
  end

  def send_logs_to_s247(gzipped_parsed_lines, log_size)
    @headers['Log-Size'] = String(log_size)
    @logger.info("log_size = #{log_size}")
    sleep_interval = @retry_interval
    begin
      @max_retry.times do |counter|
        need_retry = false
        begin
          response = @s247_http_client.post(@upload_url, body: gzipped_parsed_lines, headers: @headers).call
          resp_headers = response.headers.to_h
          if response.code == 200
            if resp_headers.key?('LOG_LICENSE_EXCEEDS') && resp_headers['LOG_LICENSE_EXCEEDS'] == 'True'
              @logger.error("Log license limit exceeded, so not able to send logs")
              @log_upload_allowed = false
              @log_upload_stopped_time = Time.now.to_i
            elsif resp_headers.key?('BLOCKED_LOGTYPE') && resp_headers['BLOCKED_LOGTYPE'] == 'True'
              @logger.error("Max upload limit reached for log type")
              @log_upload_allowed = false
              @log_upload_stopped_time = Time.now.to_i
            elsif resp_headers.key?('INVALID_LOGTYPE') && resp_headers['INVALID_LOGTYPE'] == 'True'
              @logger.error("Log type not present in this account, so stopping log collection")
              @valid_logtype = false
            else
              @log_upload_allowed = true
              @logger.info("Successfully sent logs with size #{gzipped_parsed_lines.size} / #{log_size} to site24x7. Upload Id : #{resp_headers['x-uploadid']}")
            end
          else
            @logger.error("Response Code #{response.code} from Site24x7, so retrying (#{counter + 1}/#{@max_retry})")
            need_retry = true
          end
        rescue StandardError => e
          @logger.error("Error connecting to Site24x7. exception: #{e.backtrace}")
        end

        if need_retry
          if counter == @max_retry - 1
            @logger.error("Could not send your logs after #{@max_retry} tries")
            break
          end
          sleep(sleep_interval)
          sleep_interval *= 2
        else
          return
        end
      end
    rescue Exception => e
      @logger.error("Exception occurred in sending logs : #{e.backtrace}")
    end
  end

end
```
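The `log_type_config` setting is a base64url-encoded JSON blob, which `register` decodes via `base64_url_decode`. A minimal sketch of that round trip, using a hypothetical config hash rather than a real Site24x7 token:

```ruby
require "base64"
require "json"

# Same idea as the plugin's helper: restore '=' padding (only when the length
# is not already a multiple of 4) and map the URL-safe alphabet back.
def base64_url_decode(str)
  str += '=' * ((4 - str.length % 4) % 4)
  Base64.decode64(str.tr('-_', '+/'))
end

# Hypothetical logtype config, encoded the way the plugin expects to decode it.
config = { 'uploadDomain' => 'logu.example.com', 'logType' => 'demo', 'dateFormat' => 'unix' }
token = Base64.urlsafe_encode64(JSON.dump(config)).delete('=')

decoded = JSON.parse(base64_url_decode(token))
puts decoded['logType'] # demo
```

Stripping the `=` padding before transport is what makes the token URL-safe; the decode side must restore it, which is why the helper pads before calling `Base64.decode64`.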
data/logstash-output-mageshlogs.gemspec
ADDED

```ruby
Gem::Specification.new do |s|
  s.name          = 'logstash-output-mageshlogs'
  s.version       = '0.1.0'
  s.licenses      = ['Apache-2.0']
  s.summary       = 'Write a short summary, because Rubygems requires one.'
  s.homepage      = 'https://site24x7.com'
  s.authors       = ['']
  s.email         = ''
  s.require_paths = ['lib']

  # Files
  s.files = Dir['lib/**/*','spec/**/*','vendor/**/*','*.gemspec','*.md','CONTRIBUTORS','Gemfile','LICENSE','NOTICE.TXT']
  # Tests
  s.test_files = s.files.grep(%r{^(test|spec|features)/})

  # Special flag to let us know this is actually a logstash plugin
  s.metadata = { "logstash_plugin" => "true", "logstash_group" => "output" }

  # Gem dependencies
  s.add_runtime_dependency "logstash-core-plugin-api", "~> 2.0"
  s.add_runtime_dependency "logstash-codec-plain"
  s.add_development_dependency "logstash-devutils"
end
```
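The `logstash_plugin` metadata flag is how Logstash's plugin manager recognizes a gem as a plugin, and `logstash_group` tells it which pipeline section the plugin belongs to. A small sketch showing the flag is ordinary gemspec metadata (the spec below is a stand-in, not the shipped gemspec):

```ruby
require "rubygems"

# Minimal stand-in spec carrying the same discovery metadata as the gemspec above.
spec = Gem::Specification.new do |s|
  s.name     = "logstash-output-mageshlogs"
  s.version  = "0.1.0"
  s.summary  = "demo"
  s.authors  = [""]
  s.metadata = { "logstash_plugin" => "true", "logstash_group" => "output" }
end

puts spec.metadata["logstash_group"] # output
```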
data/spec/outputs/mageshlogs_spec.rb
ADDED

```ruby
# encoding: utf-8
require "logstash/devutils/rspec/spec_helper"
require "logstash/outputs/mageshlogs"
require "logstash/codecs/plain"
require "logstash/event"

describe LogStash::Outputs::Mageshlogs do
  let(:sample_event) { LogStash::Event.new }
  let(:output) { LogStash::Outputs::Mageshlogs.new }

  before do
    output.register
  end

  # NOTE: this is unmodified generator scaffold. The plugin implements
  # multi_receive rather than receive, and register requires a valid
  # log_type_config, so this spec fails as written.
  describe "receive message" do
    subject { output.receive(sample_event) }

    it "returns a string" do
      expect(subject).to eq("Event received")
    end
  end
end
```
metadata
ADDED
```yaml
--- !ruby/object:Gem::Specification
name: logstash-output-mageshlogs
version: !ruby/object:Gem::Version
  version: 0.1.0
platform: ruby
authors:
- ''
autorequire:
bindir: bin
cert_chain: []
date: 2021-08-05 00:00:00.000000000 Z
dependencies:
- !ruby/object:Gem::Dependency
  name: logstash-core-plugin-api
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - "~>"
      - !ruby/object:Gem::Version
        version: '2.0'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - "~>"
      - !ruby/object:Gem::Version
        version: '2.0'
- !ruby/object:Gem::Dependency
  name: logstash-codec-plain
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
- !ruby/object:Gem::Dependency
  name: logstash-devutils
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
  type: :development
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
description:
email: ''
executables: []
extensions: []
extra_rdoc_files: []
files:
- CHANGELOG.md
- CONTRIBUTORS
- DEVELOPER.md
- Gemfile
- LICENSE
- README.md
- lib/logstash/outputs/mageshlogs.rb
- logstash-output-mageshlogs.gemspec
- spec/outputs/mageshlogs_spec.rb
homepage: https://site24x7.com
licenses:
- Apache-2.0
metadata:
  logstash_plugin: 'true'
  logstash_group: output
post_install_message:
rdoc_options: []
require_paths:
- lib
required_ruby_version: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
    - !ruby/object:Gem::Version
      version: '0'
required_rubygems_version: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
    - !ruby/object:Gem::Version
      version: '0'
requirements: []
rubygems_version: 3.2.3
signing_key:
specification_version: 4
summary: Write a short summary, because Rubygems requires one.
test_files:
- spec/outputs/mageshlogs_spec.rb
```