logstash-input-mongodb 0.1.0
- checksums.yaml +7 -0
- data/DEVELOPER.md +2 -0
- data/Gemfile +3 -0
- data/LICENSE +13 -0
- data/README.md +95 -0
- data/Rakefile +1 -0
- data/lib/logstash/inputs/mongodb.rb +326 -0
- data/logstash-input-mongodb.gemspec +27 -0
- data/spec/inputs/example_spec.rb +1 -0
- metadata +146 -0
checksums.yaml
ADDED
```yaml
---
SHA1:
  metadata.gz: 069aa95b8c9675b2079f871d66a496da135741f8
  data.tar.gz: ee102f0da33e5a562bcc4facfa28a63f743a466a
SHA512:
  metadata.gz: 32c33f58cd391152b7c0ce788007cff3428c3b675a9a2734039d00d88b989ca921623b7b398b60c250ff38eb6c42187b4576552b2ccd40a1649e2e914c9db1a5
  data.tar.gz: f5e8da5a9250404735a6d5c0d92c2870c14e73c8bab1768a0da522805ef5c2527ce7c011d197d154d4803e1224bd9e99e77d43b722dfbe379e6ea80e295ab9fb
```
data/DEVELOPER.md
ADDED
data/Gemfile
ADDED
data/LICENSE
ADDED
```text
Copyright (c) 2012-2015 Elasticsearch <http://www.elasticsearch.org>

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
```
data/README.md
ADDED
````markdown
# Logstash Plugin

This is a plugin for [Logstash](https://github.com/elasticsearch/logstash).

It is fully free and fully open source. The license is Apache 2.0, meaning you are pretty much free to use it however you want in whatever way.

## Documentation

Logstash provides infrastructure to automatically generate documentation for this plugin. We use the asciidoc format to write documentation, so any comments in the source code will be first converted into asciidoc and then into html. All plugin documentation is placed under one [central location](http://www.elasticsearch.org/guide/en/logstash/current/).

- For formatting code or config examples, you can use the asciidoc `[source,ruby]` directive
- For more asciidoc formatting tips, see the excellent reference at https://github.com/elasticsearch/docs#asciidoc-guide

## Need Help?

Need help? Try #logstash on freenode IRC or the logstash-users@googlegroups.com mailing list.

## Developing

### 1. Plugin Development and Testing

#### Code
- To get started, you'll need JRuby with the Bundler gem installed.

- Create a new plugin or clone an existing one from the GitHub [logstash-plugins](https://github.com/logstash-plugins) organization.

- Install dependencies
```sh
bundle install
```

#### Test

```sh
bundle exec rspec
```

The Logstash code required to run the tests/specs is specified in the `Gemfile` by a line similar to:
```ruby
gem "logstash", :github => "elasticsearch/logstash", :branch => "1.5"
```
To test against another version or a local Logstash, edit the `Gemfile` to specify an alternative location, for example:
```ruby
gem "logstash", :github => "elasticsearch/logstash", :ref => "master"
```
```ruby
gem "logstash", :path => "/your/local/logstash"
```

Then update your dependencies and run your tests:

```sh
bundle install
bundle exec rspec
```

### 2. Running your unpublished Plugin in Logstash

#### 2.1 Run in a local Logstash clone

- Edit Logstash `tools/Gemfile` and add the local plugin path, for example:
```ruby
gem "logstash-filter-awesome", :path => "/your/local/logstash-filter-awesome"
```
- Update Logstash dependencies
```sh
rake vendor:gems
```
- Run Logstash with your plugin
```sh
bin/logstash -e 'filter {awesome {}}'
```
At this point any modifications to the plugin code will be applied to this local Logstash setup. After modifying the plugin, simply rerun Logstash.

#### 2.2 Run in an installed Logstash

- Build your plugin gem
```sh
gem build logstash-filter-awesome.gemspec
```
- Install the plugin from the Logstash home
```sh
bin/plugin install /your/local/plugin/logstash-filter-awesome.gem
```
- Start Logstash and proceed to test the plugin

## Contributing

All contributions are welcome: ideas, patches, documentation, bug reports, complaints, and even something you drew up on a napkin.

Programming is not a required skill. Whatever you've seen about open source and maintainers or community members saying "send patches or die" - you will not see that here.

It is more important to me that you are able to contribute.

For more information about contributing, see the [CONTRIBUTING](https://github.com/elasticsearch/logstash/blob/master/CONTRIBUTING.md) file.
````
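Applied to this gem specifically, the generic `logstash-filter-awesome` steps above translate to something like the following sketch. The built gem filename and all connection values (URI, since-database path, collection name) are illustrative assumptions:

```sh
# Build and install this plugin into an existing Logstash installation
gem build logstash-input-mongodb.gemspec
bin/plugin install /your/local/plugin/logstash-input-mongodb-0.1.0.gem

# Smoke-test it from the command line (placeholder connection values)
bin/logstash -e 'input { mongodb { uri => "mongodb://localhost:27017/mydb" path => "/tmp/logstash-mongodb-since.db" collection => "events" } }'
```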
data/Rakefile
ADDED
```ruby
require "logstash/devutils/rake"
```
data/lib/logstash/inputs/mongodb.rb
ADDED
```ruby
# encoding: utf-8
require "logstash/inputs/base"
require "logstash/namespace"
require "logstash/timestamp"
require "stud/interval"
require "socket" # for Socket.gethostname
require "json"
require "bson"

# Reads documents from MongoDB collections, tracking its position with a
# local sqlite since-database, and emits each new document as an event.

class LogStash::Inputs::MongoDB < LogStash::Inputs::Base
  config_name "mongodb"

  # If undefined, Logstash will complain, even if codec is unused.
  default :codec, "plain"

  # Example URI: mongodb://mydb.host:27017/mydbname?ssl=true
  config :uri, :validate => :string, :required => true

  # The path to the sqlite database file.
  config :path, :validate => :string, :required => true

  # Any table to exclude by name
  config :exclude_tables, :validate => :array, :default => []

  config :batch_size, :validate => :number, :default => 30

  config :since_table, :validate => :string, :default => "logstash_since"

  # The collection to use. Is turned into a regex so 'events' will match 'events_20150227'
  # Example collection: events_20150227 or events_
  config :collection, :validate => :string, :required => true

  # This allows you to select the method you would like to use to parse your data
  config :parse_method, :validate => :string, :default => 'flatten'

  # If not flattening you can dig to flatten select fields
  config :dig_fields, :validate => :array, :default => []

  # This is the second level of hash flattening
  config :dig_dig_fields, :validate => :array, :default => []

  # If true, store the @timestamp field in mongodb as an ISODate type instead
  # of an ISO8601 string. For more information about this, see
  # http://www.mongodb.org/display/DOCS/Dates
  config :isodate, :validate => :boolean, :default => false

  # Number of seconds to wait after failure before retrying
  config :retry_delay, :validate => :number, :default => 3, :required => false

  # If true, an "_id" field will be added to the document before insertion.
  # The "_id" field will use the timestamp of the event and overwrite an existing
  # "_id" field in the event.
  config :generateId, :validate => :boolean, :default => false

  config :unpack_mongo_id, :validate => :boolean, :default => false

  # The message string to use in the event.
  config :message, :validate => :string, :default => "Default message..."

  # Set how frequently messages should be sent.
  # The default, `1`, means send a message every second.
  config :interval, :validate => :number, :default => 1

  SINCE_TABLE = :since_table
```
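Putting those options together, a minimal pipeline configuration for this input might look like the sketch below. The URI, since-database path, and collection name are placeholder assumptions, and `batch_size` is shown at its default:

```
input {
  mongodb {
    uri        => "mongodb://mydb.host:27017/mydbname?ssl=true"
    path       => "/opt/logstash/mongodb_since.db"   # sqlite file for resume bookkeeping
    collection => "events"                           # regex: also matches events_20150227
    batch_size => 30
  }
}

output {
  stdout { }
}
```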
```ruby
  # lib/logstash/inputs/mongodb.rb (continued)

  public
  def init_placeholder_table(sqlitedb)
    begin
      sqlitedb.create_table "#{SINCE_TABLE}" do
        String :table
        Int :place
      end
    rescue
      @logger.debug("since table already exists")
    end
  end

  public
  def get_placeholder(sqlitedb, since_table, mongodb, mongo_collection_name)
    since = sqlitedb[SINCE_TABLE]
    x = since.where(:table => "#{since_table}_#{mongo_collection_name}")
    if x[:place].nil? || x[:place] == 0
      first_entry_id = init_placeholder(sqlitedb, since_table, mongodb, mongo_collection_name)
      return first_entry_id
    else
      @logger.debug("placeholder already exists, it is #{x[:place]}")
      return x[:place][:place]
    end
  end

  public
  def init_placeholder(sqlitedb, since_table, mongodb, mongo_collection_name)
    @logger.debug("init placeholder for #{since_table}_#{mongo_collection_name}")
    since = sqlitedb[SINCE_TABLE]
    mongo_collection = mongodb.collection(mongo_collection_name)
    first_entry = mongo_collection.find_one({})
    first_entry_id = first_entry['_id'].to_s
    since.insert(:table => "#{since_table}_#{mongo_collection_name}", :place => first_entry_id)
    return first_entry_id
  end

  public
  def update_placeholder(sqlitedb, since_table, mongo_collection_name, place)
    @logger.debug("updating placeholder for #{since_table}_#{mongo_collection_name} to #{place}")
    since = sqlitedb[SINCE_TABLE]
    since.where(:table => "#{since_table}_#{mongo_collection_name}").update(:place => place)
  end
```
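A minimal, runnable sketch (assuming the `sequel` and `sqlite3` gems, with an in-memory database) of the bookkeeping these helpers implement: one row per watched collection, holding the last-forwarded ObjectId. Note the plugin declares the `place` column as `Int` even though it stores ObjectId strings; the sketch uses `String` for clarity:

```ruby
require "sequel"

db = Sequel.sqlite  # in-memory SQLite for the example
db.create_table(:since_table) do
  String :table
  String :place
end
since = db[:since_table]

# init_placeholder: seed the row from the first document's _id
since.insert(:table => "logstash_since_events_20150227",
             :place => "54f9fae52d00001e00000001")

# get_placeholder: read it back on the next cycle
row = since.where(:table => "logstash_since_events_20150227").first
puts row[:place]  # => "54f9fae52d00001e00000001"

# update_placeholder: advance it after a batch has been queued
since.where(:table => "logstash_since_events_20150227")
     .update(:place => "54f9fae52d00001e00000002")
```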
```ruby
  # lib/logstash/inputs/mongodb.rb (continued)

  public
  def get_all_tables(mongodb)
    return @mongodb.collection_names
  end

  public
  def get_collection_names(mongodb, collection)
    collection_names = []
    @mongodb.collection_names.each do |coll|
      if /#{collection}/ =~ coll
        collection_names.push(coll)
        @logger.debug("Added #{coll} to the collection list as it matches our collection search")
      end
    end
    return collection_names
  end

  public
  def get_cursor_for_collection(mongodb, mongo_collection_name, last_id_object, batch_size)
    collection = mongodb.collection(mongo_collection_name)
    return collection.find({:_id => {:$gt => last_id_object}}).limit(batch_size)
  end

  public
  def update_watched_collections(mongodb, collection, sqlitedb)
    collections = get_collection_names(mongodb, collection)
    collection_data = {}
    collections.each do |my_collection|
      init_placeholder_table(sqlitedb)
      last_id = get_placeholder(sqlitedb, since_table, mongodb, my_collection)
      if !collection_data[my_collection]
        collection_data[my_collection] = { :name => my_collection, :last_id => last_id }
      end
    end
    return collection_data
  end

  public
  def register
    require "mongo"
    require "jdbc/sqlite3"
    require "sequel"
    uriParsed = Mongo::URIParser.new(@uri)
    conn = uriParsed.connection({})
    if uriParsed.auths.length > 0
      uriParsed.auths.each do |auth|
        if !auth['db_name'].nil?
          conn.add_auth(auth['db_name'], auth['username'], auth['password'], nil)
        end
      end
      conn.apply_saved_authentication()
    end
    @host = Socket.gethostname
    @logger.info("Registering MongoDB input", :database => @path)
    #@mongodb = conn.db(@database)
    @mongodb = conn.db(uriParsed.db_name)
    @sqlitedb = Sequel.connect("jdbc:sqlite:#{@path}")
    # Should check to see if there are new matching tables at a predefined interval or on some trigger
    @collection_data = update_watched_collections(@mongodb, @collection, @sqlitedb)
  end # def register

  class BSON::OrderedHash
    def to_h
      inject({}) do |acc, element|
        k, v = element
        acc[k] = (v.class == BSON::OrderedHash ? v.to_h : v)
        acc
      end
    end

    def to_json
      JSON.parse(self.to_h.to_json, :allow_nan => true)
    end
  end

  def flatten(my_hash)
    new_hash = {}
    if my_hash.respond_to? :each
      my_hash.each do |k1, v1|
        if v1.is_a?(Hash)
          v1.each do |k2, v2|
            if v2.is_a?(Hash)
              # Nested hash: recurse, joining the keys with underscores
              result = flatten(v2)
              result.each do |k3, v3|
                new_hash[k1.to_s + "_" + k2.to_s + "_" + k3.to_s] = v3
              end
            else
              new_hash[k1.to_s + "_" + k2.to_s] = v2
            end
          end
        else
          new_hash[k1.to_s] = v1
        end
      end
    else
      @logger.debug("Flatten [ERROR]: hash did not respond to :each")
    end
    return new_hash
  end
```
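To make the flattening rule concrete, here is a tiny standalone re-statement of what `flatten` produces for a nested document: nested keys are joined with underscores, non-hash values pass through. (Pure Ruby; the sample document is made up.)

```ruby
# Standalone equivalent of the plugin's flatten: underscore-join nested keys.
def flatten_keys(hash, prefix = nil)
  hash.each_with_object({}) do |(k, v), acc|
    key = prefix ? "#{prefix}_#{k}" : k.to_s
    if v.is_a?(Hash)
      acc.merge!(flatten_keys(v, key))
    else
      acc[key] = v
    end
  end
end

doc = { "source" => { "host" => "app01", "geo" => { "city" => "Austin" } },
        "status" => "200" }
p flatten_keys(doc)
# => {"source_host"=>"app01", "source_geo_city"=>"Austin", "status"=>"200"}
```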
```ruby
  # lib/logstash/inputs/mongodb.rb (continued)

  def run(queue)
    sleep_min = 0.01
    sleep_max = 5
    sleeptime = sleep_min

    begin
      @logger.debug("Tailing MongoDB", :path => @path)
      @logger.debug("Collection data is: #{@collection_data}")
      loop do
        @collection_data.each do |index, collection|
          collection_name = collection[:name]
          @logger.debug("collection_data is: #{@collection_data}")
          last_id = @collection_data[index][:last_id]
          @logger.debug("last_id is #{last_id}", :index => index, :collection => collection_name)
          # get batch of events starting at the last_place if it is set
          last_id_object = BSON::ObjectId(last_id)
          cursor = get_cursor_for_collection(@mongodb, collection_name, last_id_object, batch_size)
          cursor.each do |doc|
            @logger.debug("Date from mongo: #{doc['_id'].generation_time.to_s}")
            logdate = DateTime.parse(doc['_id'].generation_time.to_s)
            @logger.debug("logdate.iso8601: #{logdate.iso8601}")
            event = LogStash::Event.new("host" => @host)
            decorate(event)
            event["logdate"] = logdate.iso8601
            @logger.debug("type of doc is: " + doc.class.to_s)
            log_entry = doc.to_h
            log_entry['_id'] = log_entry['_id'].to_s
            event["log_entry"] = log_entry.to_s
            @logger.debug("Sent message: " + doc.to_h.to_s)
            @logger.debug("EVENT looks like: " + event.to_s)
            # Extract the HOST_ID and PID from the MongoDB BSON::ObjectId
            if @unpack_mongo_id
              doc_obj_bin = doc['_id'].to_a.pack("C*").unpack("a4 a3 a2 a3")
              host_id = doc_obj_bin[1].unpack("S")
              process_id = doc_obj_bin[2].unpack("S")
              event['host_id'] = host_id.first.to_i
              event['process_id'] = process_id.first.to_i
            end

            if @parse_method == 'flatten'
              # Flatten the JSON so that the data is usable in Kibana
              flat_doc = flatten(doc)
              # Check for different types of expected values and add them to the event
              flat_doc.each do |k, v|
                # Check for a float
                if /\A[-+]?\d+[.][\d]+\z/ === v
                  event[k.to_s] = v.to_f
                # Check for an integer
                elsif (/\A[-+]?\d+\z/ === v) || (v.is_a? Integer)
                  event[k.to_s] = v.to_i
                else
                  event[k.to_s] = v.to_s unless k.to_s == "_id" || k.to_s == "tags"
                  if (k.to_s == "tags") && (v.is_a? Array)
                    event['tags'] = v
                  end
                end
              end
            elsif @parse_method == 'dig'
              # Dig into the JSON and flatten select elements
              doc.each do |k, v|
                if k != "_id"
                  if (@dig_fields.include? k) && (v.respond_to? :each)
                    v.each do |kk, vv|
                      if (@dig_dig_fields.include? kk) && (vv.respond_to? :each)
                        vv.each do |kkk, vvv|
                          if /\A[-+]?\d+\z/ === vvv
                            event["#{k}_#{kk}_#{kkk}"] = vvv.to_i
                          else
                            event["#{k}_#{kk}_#{kkk}"] = vvv.to_s
                          end
                        end
                      else
                        if /\A[-+]?\d+\z/ === vv
                          event["#{k}_#{kk}"] = vv.to_i
                        else
                          event["#{k}_#{kk}"] = vv.to_s
                        end
                      end
                    end
                  else
                    if /\A[-+]?\d+\z/ === v
                      event[k] = v.to_i
                    else
                      event[k] = v.to_s
                    end
                  end
                end
              end
            else
              # Should probably do some sanitization here and insert the doc as raw as possible for parsing in logstash
            end

            queue << event
            @collection_data[index][:last_id] = doc['_id'].to_s
            @collection_data = update_watched_collections(@mongodb, @collection, @sqlitedb)
          end
          # Store the last-seen doc in the database
          update_placeholder(@sqlitedb, since_table, collection_name, @collection_data[index][:last_id])
        end

        # Nothing found in that iteration: back off and sleep a bit
        @logger.debug("No new rows. Sleeping.", :time => sleeptime)
        sleeptime = [sleeptime * 2, sleep_max].min
        sleep(sleeptime)
        #sleeptime = sleep_min
      end
    rescue LogStash::ShutdownSignal
      if @interrupted
        @logger.debug("Mongo Input shutting down")
      end
    end
  end # def run

end # class LogStash::Inputs::MongoDB
```
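For reference, the `unpack_mongo_id` slicing above follows the 12-byte BSON ObjectId layout of this era: a 4-byte epoch timestamp, a 3-byte machine identifier, a 2-byte process id, and a 3-byte counter. A pure-Ruby sketch with a made-up ObjectId hex string:

```ruby
raw = ["54f9fae52d0000010a00001e"].pack("H*")  # 12 ObjectId bytes from hex
time, machine, pid, counter = raw.unpack("a4 a3 a2 a3")

puts Time.at(time.unpack("N").first).utc  # creation time (big-endian seconds)
puts machine.unpack("H*").first           # machine identifier bytes
puts pid.unpack("S").first                # process id, read the way the plugin reads it
```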
data/logstash-input-mongodb.gemspec
ADDED
```ruby
Gem::Specification.new do |s|
  s.name = 'logstash-input-mongodb'
  s.version = '0.1.0'
  s.licenses = ['Apache License (2.0)']
  s.summary = "This takes entries from mongodb as an input to logstash."
  s.description = "This gem is a logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/plugin install gemname. This gem is not a stand-alone program"
  s.authors = ["Philip Hutchins"]
  s.email = 'flipture@gmail.com'
  s.homepage = "http://www.phutchins.com"
  s.require_paths = ["lib"]

  # Files
  s.files = `git ls-files`.split($\)
  # Tests
  s.test_files = s.files.grep(%r{^(test|spec|features)/})

  # Special flag to let us know this is actually a logstash plugin
  s.metadata = { "logstash_plugin" => "true", "logstash_group" => "input" }

  # Gem dependencies
  s.add_runtime_dependency 'logstash', '>= 1.4.0', '< 2.0.0'
  s.add_runtime_dependency 'logstash-codec-plain'
  s.add_runtime_dependency 'stud'
  s.add_runtime_dependency 'jdbc-sqlite3'
  s.add_runtime_dependency 'sequel'
  s.add_development_dependency 'logstash-devutils'
end
```
data/spec/inputs/example_spec.rb
ADDED
```ruby
require "logstash/devutils/rspec/spec_helper"
```
metadata
ADDED
```yaml
--- !ruby/object:Gem::Specification
name: logstash-input-mongodb
version: !ruby/object:Gem::Version
  version: 0.1.0
platform: ruby
authors:
- Philip Hutchins
autorequire:
bindir: bin
cert_chain: []
date: 2015-03-18 00:00:00.000000000 Z
dependencies:
- !ruby/object:Gem::Dependency
  name: logstash
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: 1.4.0
    - - "<"
      - !ruby/object:Gem::Version
        version: 2.0.0
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: 1.4.0
    - - "<"
      - !ruby/object:Gem::Version
        version: 2.0.0
- !ruby/object:Gem::Dependency
  name: logstash-codec-plain
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
- !ruby/object:Gem::Dependency
  name: stud
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
- !ruby/object:Gem::Dependency
  name: jdbc-sqlite3
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
- !ruby/object:Gem::Dependency
  name: sequel
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
- !ruby/object:Gem::Dependency
  name: logstash-devutils
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
  type: :development
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
description: This gem is a logstash plugin required to be installed on top of the
  Logstash core pipeline using $LS_HOME/bin/plugin install gemname. This gem is not
  a stand-alone program
email: flipture@gmail.com
executables: []
extensions: []
extra_rdoc_files: []
files:
- DEVELOPER.md
- Gemfile
- LICENSE
- README.md
- Rakefile
- lib/logstash/inputs/mongodb.rb
- logstash-input-mongodb.gemspec
- spec/inputs/example_spec.rb
homepage: http://www.phutchins.com
licenses:
- Apache License (2.0)
metadata:
  logstash_plugin: 'true'
  logstash_group: input
post_install_message:
rdoc_options: []
require_paths:
- lib
required_ruby_version: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
    - !ruby/object:Gem::Version
      version: '0'
required_rubygems_version: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
    - !ruby/object:Gem::Version
      version: '0'
requirements: []
rubyforge_project:
rubygems_version: 2.4.5
signing_key:
specification_version: 4
summary: This takes entries from mongodb as an input to logstash.
test_files:
- spec/inputs/example_spec.rb
```