logstash-output-mongodb_upsert_custom 0.1.0
- checksums.yaml +7 -0
- data/CHANGELOG.md +2 -0
- data/CONTRIBUTORS +10 -0
- data/DEVELOPER.md +2 -0
- data/Gemfile +3 -0
- data/LICENSE +11 -0
- data/README.md +86 -0
- data/lib/logstash/outputs/bson/big_decimal.rb +66 -0
- data/lib/logstash/outputs/bson/logstash_event.rb +76 -0
- data/lib/logstash/outputs/bson/logstash_timestamp.rb +50 -0
- data/lib/logstash/outputs/mongodb_upsert_custom.rb +202 -0
- data/logstash-output-mongodb_upsert_custom.gemspec +24 -0
- data/spec/outputs/mongodb_upsert_custom_spec.rb +22 -0
- metadata +99 -0
checksums.yaml
ADDED
@@ -0,0 +1,7 @@
---
SHA256:
  metadata.gz: 7f4e52a56cac12d0e591dacb03e3c9968888da377515300e6f80d14858ba98a6
  data.tar.gz: deb4b185920556250d6af0455cfa1a464ab7551a41ebd5e085d44b74edebe012
SHA512:
  metadata.gz: 4c6d965c49a3e955934be9d9aca63c61bb3b63aad6dd2169ec34c252a080b55b735c3feb6df3b12e3f623361ecf8de24745b9f11d5dc8142e10bd88e18c25593
  data.tar.gz: 3975b1adc830f4b4a88200b1f2b69d847e3a5976379df73bbf2f6266ed8d2190ad38f3922be631b17453b67fcc8241e99b8b566f2ce4989665be9c692b6aa45d
data/CHANGELOG.md
ADDED
data/CONTRIBUTORS
ADDED
@@ -0,0 +1,10 @@
The following is a list of people who have contributed ideas, code, bug
reports, or in general have helped logstash along its way.

Contributors:
* jijikarikkad - jijikarikkad@gmail.com

Note: If you've sent us patches, bug reports, or otherwise contributed to
Logstash, and you aren't on the list above and want to be, please let us know
and we'll make sure you're here. Contributions from folks like you are what make
open source awesome.
data/DEVELOPER.md
ADDED
data/Gemfile
ADDED
data/LICENSE
ADDED
@@ -0,0 +1,11 @@
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
data/README.md
ADDED
@@ -0,0 +1,86 @@
# Logstash Plugin

This is a plugin for [Logstash](https://github.com/elastic/logstash).

It is fully free and fully open source. The license is Apache 2.0, meaning you are pretty much free to use it however you want in whatever way.

## Documentation

Logstash provides infrastructure to automatically generate documentation for this plugin. We use the asciidoc format to write documentation, so any comments in the source code will be first converted into asciidoc and then into html. All plugin documentation is placed under one [central location](http://www.elastic.co/guide/en/logstash/current/).

- For formatting code or config examples, you can use the asciidoc `[source,ruby]` directive
- For more asciidoc formatting tips, see the excellent reference here https://github.com/elastic/docs#asciidoc-guide

## Need Help?

Need help? Try #logstash on freenode IRC or the https://discuss.elastic.co/c/logstash discussion forum.

## Developing

### 1. Plugin Development and Testing

#### Code
- To get started, you'll need JRuby with the Bundler gem installed.

- Create a new plugin or clone an existing one from the GitHub [logstash-plugins](https://github.com/logstash-plugins) organization. We also provide [example plugins](https://github.com/logstash-plugins?query=example).

- Install dependencies
```sh
bundle install
```

#### Test

- Update your dependencies

```sh
bundle install
```

- Run tests

```sh
bundle exec rspec
```

### 2. Running your unpublished Plugin in Logstash

#### 2.1 Run in a local Logstash clone

- Edit Logstash `Gemfile` and add the local plugin path, for example:
```ruby
gem "logstash-filter-awesome", :path => "/your/local/logstash-filter-awesome"
```
- Install plugin
```sh
bin/logstash-plugin install --no-verify
```
- Run Logstash with your plugin
```sh
bin/logstash -e 'filter {awesome {}}'
```
At this point any modifications to the plugin code will be applied to this local Logstash setup. After modifying the plugin, simply rerun Logstash.

#### 2.2 Run in an installed Logstash

You can use the same **2.1** method to run your plugin in an installed Logstash by editing its `Gemfile` and pointing the `:path` to your local plugin development directory, or you can build the gem and install it using:

- Build your plugin gem
```sh
gem build logstash-filter-awesome.gemspec
```
- Install the plugin from the Logstash home
```sh
bin/logstash-plugin install /your/local/plugin/logstash-filter-awesome.gem
```
- Start Logstash and proceed to test the plugin

## Contributing

All contributions are welcome: ideas, patches, documentation, bug reports, complaints, and even something you drew up on a napkin.

Programming is not a required skill. Whatever you've seen about open source and maintainers or community members saying "send patches or die" - you will not see that here.

It is more important to the community that you are able to contribute.

For more information about contributing, see the [CONTRIBUTING](https://github.com/elastic/logstash/blob/master/CONTRIBUTING.md) file.
data/lib/logstash/outputs/bson/big_decimal.rb
ADDED
@@ -0,0 +1,66 @@
# Copyright (C) 2009-2014 MongoDB Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

# Modified 2015 Elastic

module BSON

  # Injects behaviour for encoding and decoding BigDecimal values
  # to and from raw bytes as specified by the BSON spec.
  #
  # @see http://bsonspec.org/#/specification
  module BigDecimal

    # A floating point is type 0x01 in the BSON spec.
    BSON_TYPE = 1.chr.force_encoding(BINARY).freeze

    # The pack directive is for 8 byte floating points.
    PACK = "E".freeze

    # Get the floating point as encoded BSON.
    # @example Get the floating point as encoded BSON.
    #   1.221311.to_bson
    # @return [ String ] The encoded string.
    # @see http://bsonspec.org/#/specification
    def to_bson(encoded = ''.force_encoding(BINARY))
      encoded << [ self ].pack(PACK)
    end

    module ClassMethods

      # Deserialize an instance of a BigDecimal from a BSON double.
      # @param [ BSON ] bson object from Mongo.
      # @return [ BigDecimal ] The decoded BigDecimal.
      # @see http://bsonspec.org/#/specification
      def from_bson(bson)
        from_bson_double(bson.read(8))
      end

      private

      def from_bson_double(double)
        new(double.unpack(PACK).first.to_s)
      end
    end

    # Register this type when the module is loaded.
    Registry.register(BSON_TYPE, ::BigDecimal)
  end

  # Enrich the core BigDecimal class with this module.
  #
  # @since 2.0.0
  ::BigDecimal.send(:include, BigDecimal)
  ::BigDecimal.send(:extend, BigDecimal::ClassMethods)
end
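As a side note, the `PACK` directive `"E"` above is a little-endian IEEE 754 double, so a `BigDecimal` is serialized with `Float` precision only. A minimal sketch of that round-trip in plain Ruby (no MongoDB connection needed; the value is illustrative):

```ruby
require "bigdecimal"

value = BigDecimal("1.221311")
raw   = [value].pack("E")   # the same 8 bytes that to_bson appends
raw.unpack("E").first       # => 1.221311 (a Float, so precision beyond a double is lost)
```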
data/lib/logstash/outputs/bson/logstash_event.rb
ADDED
@@ -0,0 +1,76 @@
# Copyright (C) 2009-2014 MongoDB Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

# Modified 2015 Elastic

module BSON

  # Injects behaviour for encoding and decoding LogStash::Event values to
  # and from raw bytes as specified by the BSON spec.
  #
  # @see http://bsonspec.org/#/specification
  module LogStashEvent

    # An Event is an embedded document, type 0x03 in the BSON spec.
    BSON_TYPE = 3.chr.force_encoding(BINARY).freeze

    # Get the event as encoded BSON.
    # @example Get the hash as encoded BSON.
    #   Event.new("field" => "value").to_bson
    # @return [ String ] The encoded string.
    # @see http://bsonspec.org/#/specification
    def to_bson(buffer = ByteBuffer.new)
      position = buffer.length
      buffer.put_int32(0)
      to_hash.each do |field, value|
        buffer.put_byte(value.bson_type)
        buffer.put_cstring(field.to_bson_key)
        value.to_bson(buffer)
      end
      buffer.put_byte(NULL_BYTE)
      buffer.replace_int32(position, buffer.length - position)
    end

    # Converts the event to a normalized value in a BSON document.
    # @example Convert the event to a normalized value.
    #   event.to_bson_normalized_value
    # @return [ BSON::Document ] The normalized event.
    def to_bson_normalized_value
      Document.new(self)
    end

    module ClassMethods
      # Deserialize the Event from BSON.
      # @param [ ByteBuffer ] buffer The byte buffer.
      # @return [ Event ] The decoded bson document.
      # @see http://bsonspec.org/#/specification
      def from_bson(buffer)
        hash = Hash.new
        buffer.get_int32 # Throw away the size.
        while (type = buffer.get_byte) != NULL_BYTE
          field = buffer.get_cstring
          hash.store(field, BSON::Registry.get(type).from_bson(buffer))
        end
        new(hash)
      end
    end

    # Register this type when the module is loaded.
    Registry.register(BSON_TYPE, ::LogStash::Event)
  end

  # Enrich the core LogStash::Event class with this module.
  ::LogStash::Event.send(:include, LogStashEvent)
  ::LogStash::Event.send(:extend, LogStashEvent::ClassMethods)
end
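To make the framing above concrete: an embedded document is an int32 total length (counting itself), then one `type byte + cstring key + encoded value` triple per field, then a terminating NUL, which is what the `put_int32(0)` / `replace_int32` dance produces. A hand-rolled sketch for a hypothetical single-field event `{"a" => 1.0}`, using plain string packing instead of the driver's `ByteBuffer`:

```ruby
# Type 0x01 (double), cstring key "a", little-endian 8-byte value, trailing NUL...
body = "\x01".b + "a\x00".b + [1.0].pack("E") + "\x00".b
# ...prefixed by the total length, which includes the 4-byte length field itself.
doc  = [body.bytesize + 4].pack("l<") + body
doc.bytesize   # => 16
```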
data/lib/logstash/outputs/bson/logstash_timestamp.rb
ADDED
@@ -0,0 +1,50 @@
# Copyright (C) 2009-2014 MongoDB Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

# Modified 2015 Elastic

module BSON

  # Injects behaviour for encoding and decoding time values to
  # and from raw bytes as specified by the BSON spec.
  #
  # @see http://bsonspec.org/#/specification
  module LogStashTimestamp

    # A time is type 0x09 in the BSON spec.
    BSON_TYPE = 9.chr.force_encoding(BINARY).freeze

    def to_bson(encoded = ''.force_encoding(BINARY))
      time.to_bson(encoded)
    end

    module ClassMethods
      # Deserialize UTC time from BSON.
      # @param [ BSON ] bson encoded time.
      # @return [ ::LogStash::Timestamp ] The decoded UTC time as a ::LogStash::Timestamp.
      # @see http://bsonspec.org/#/specification
      def from_bson(bson)
        seconds, fragment = BSON::Int64.from_bson(bson).divmod(1000)
        new(::Time.at(seconds, fragment * 1000).utc)
      end
    end

    # Register this type when the module is loaded.
    Registry.register(BSON_TYPE, ::LogStash::Timestamp)
  end

  # Enrich the core LogStash::Timestamp class with this module.
  ::LogStash::Timestamp.send(:include, LogStashTimestamp)
  ::LogStash::Timestamp.send(:extend, LogStashTimestamp::ClassMethods)
end
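For context, BSON stores a UTC datetime as a signed 64-bit count of milliseconds since the Unix epoch, which is why `from_bson` splits the value with `divmod(1000)` and then scales the remainder to microseconds for `Time.at`. A small worked example in plain Ruby (the millisecond value is illustrative):

```ruby
millis = 1_585_180_800_123                # 2020-03-26 00:00:00.123 UTC, as BSON milliseconds
seconds, fragment = millis.divmod(1000)   # => [1585180800, 123]
# Time.at takes seconds plus microseconds, so the 123 ms remainder becomes 123_000 us.
Time.at(seconds, fragment * 1000).utc     # => 2020-03-26 00:00:00.123 UTC
```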
data/lib/logstash/outputs/mongodb_upsert_custom.rb
ADDED
@@ -0,0 +1,202 @@
# encoding: utf-8
require "logstash/outputs/base"
require "logstash/namespace"
require "mongo"
require_relative "bson/big_decimal"
require_relative "bson/logstash_timestamp"

# This output writes events to MongoDB.
class LogStash::Outputs::MongodbUpsertCustom < LogStash::Outputs::Base

  config_name "mongodb_upsert_custom"

  # A MongoDB URI to connect to.
  # See http://docs.mongodb.org/manual/reference/connection-string/.
  config :uri, :validate => :string, :required => true

  # The database to use.
  config :database, :validate => :string, :required => true

  # The collection to use. This value can use `%{foo}` values to dynamically
  # select a collection based on data in the event.
  config :collection, :validate => :string, :required => true

  # If true, store the @timestamp field in MongoDB as an ISODate type instead
  # of an ISO8601 string. For more information about this, see
  # http://www.mongodb.org/display/DOCS/Dates.
  config :isodate, :validate => :boolean, :default => false

  # The number of seconds to wait after failure before retrying.
  config :retry_delay, :validate => :number, :default => 3, :required => false

  # If true, a BSON ObjectId is generated and stored in the "_id" field before
  # insertion, overwriting any existing "_id" field in the event.
  config :generateId, :validate => :boolean, :default => false

  # Bulk insert flag; set to true to allow bulk insertion, otherwise events are written one by one.
  config :bulk, :validate => :boolean, :default => false
  # Bulk interval, used to flush queued events periodically if the "bulk" flag is activated.
  config :bulk_interval, :validate => :number, :default => 2
  # Bulk events number: if the number of events queued for a collection reaches this limit, they are
  # bulk written regardless of the bulk interval value (the MongoDB hard limit is 1000).
  config :bulk_size, :validate => :number, :default => 900, :maximum => 999, :min => 2

  # Comma-separated list of fields to convert to LogStash::Timestamp values before writing.
  config :date_keys, :validate => :string, :default => nil

  # If true, documents are always inserted; otherwise they are upserted using the filter keys below.
  config :is_insert, :validate => :boolean, :default => nil

  # Comma-separated pair of field names used as the upsert filter; the first two
  # become filter_key1 and filter_key2.
  config :update_keys, :validate => :string, :default => nil

  config :filter_key1, :validate => :string, :default => nil

  config :filter_key2, :validate => :string, :default => nil

  # Mutex used to synchronize access to 'documents'
  @@mutex = Mutex.new

  def register
    if @bulk_size > 1000
      raise LogStash::ConfigurationError, "Bulk size must be lower than '1000', currently '#{@bulk_size}'"
    end

    Mongo::Logger.logger = @logger
    conn = Mongo::Client.new(@uri)
    @db = conn.use(@database)

    @closed = Concurrent::AtomicBoolean.new(false)
    @documents = {}
    @bulk_thread = Thread.new(@bulk_interval) do |bulk_interval|
      while @closed.false? do
        sleep(bulk_interval)

        @@mutex.synchronize do
          @documents.each do |collection, values|
            if values.length > 0
              if !@is_insert
                values.each do |value|
                  criteria = Hash.new
                  criteria_key1 = @filter_key1
                  criteria[criteria_key1] = value[criteria_key1]
                  criteria_key2 = @filter_key2
                  criteria[criteria_key2] = value[criteria_key2]
                  documentToUpdate = @db[collection].find(criteria)
                  if documentToUpdate.count() > 0
                    documentToUpdate.update_many('$set' => value)
                  else
                    @db[collection].insert_many([value])
                  end
                end
              else
                @db[collection].insert_many(values)
              end
              @documents.delete(collection)
            end
          end
        end
      end
    end
  end

  def receive(event)
    begin
      # Our timestamp object now has a to_bson method, using it here
      # {}.merge(other) so we don't taint the event hash innards
      document = {}.merge(event.to_hash)
      if !@isodate
        timestamp = event.timestamp
        if timestamp
          # not using timestamp.to_bson
          document["@timestamp"] = timestamp.to_json
        else
          @logger.warn("Cannot set MongoDB document `@timestamp` field because it does not exist in the event", :event => event)
        end
      end

      if @date_keys
        keys = @date_keys.to_s.split(",")
        document.each do |key, value|
          if keys.index key
            document[key] = LogStash::Timestamp.new(value)
          end
        end
      end

      if @update_keys
        filterkeys = @update_keys.to_s.split(",")
        @filter_key1 = filterkeys[0]
        @filter_key2 = filterkeys[1]
      end

      if @generateId
        document["_id"] = BSON::ObjectId.new
      end

      if @bulk
        collection = event.sprintf(@collection)
        @@mutex.synchronize do
          if(!@documents[collection])
            @documents[collection] = []
          end
          @documents[collection].push(document)
          if(@documents[collection].length >= @bulk_size)
            if !@is_insert
              @documents[collection].each do |docRecord|
                criteria = Hash.new
                criteria_key1 = @filter_key1
                criteria[criteria_key1] = docRecord[criteria_key1]
                criteria_key2 = @filter_key2
                criteria[criteria_key2] = docRecord[criteria_key2]
                documentToUpdate = @db[collection].find(criteria)
                if documentToUpdate.count() > 0
                  documentToUpdate.update_many('$set' => docRecord)
                else
                  @db[collection].insert_many([docRecord])
                end
              end
            else
              @db[collection].insert_many(@documents[collection])
            end
            @documents.delete(collection)
          end
        end
      else
        if !@is_insert
          criteria = Hash.new
          criteria_key1 = @filter_key1
          criteria[criteria_key1] = document[criteria_key1]
          criteria_key2 = @filter_key2
          criteria[criteria_key2] = document[criteria_key2]
          singleDoc = @db[event.sprintf(@collection)].find(criteria)
          if singleDoc.count() > 0
            singleDoc.update_one('$set' => document)
          else
            @db[event.sprintf(@collection)].insert_one(document)
          end
        else
          @db[event.sprintf(@collection)].insert_one(document)
        end
      end
    rescue => e
      if e.message =~ /^E11000/
        # On a duplicate key error, skip the insert.
        # We could check if the duplicate key err is the _id key
        # and generate a new primary key.
        # If the duplicate key error is on another field, we have no way
        # to fix the issue.
        @logger.warn("Skipping insert because of a duplicate key error", :event => event, :exception => e)
      else
        @logger.warn("Failed to send event to MongoDB, retrying in #{@retry_delay.to_s} seconds", :event => event, :exception => e)
        sleep(@retry_delay)
        retry
      end
    end
  end

  def close
    @closed.make_true
    @bulk_thread.wakeup
    @bulk_thread.join
  end
end
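For reference, a minimal pipeline sketch showing how this output might be configured in upsert mode. This is not taken from the gem's documentation: the URI, database, collection, and `update_keys` field names are placeholders, and stdin is used only to keep the example self-contained.

```
input { stdin { codec => json } }

output {
  mongodb_upsert_custom {
    uri           => "mongodb://localhost:27017"   # placeholder connection string
    database      => "logstash_db"                 # placeholder database name
    collection    => "events_%{type}"              # %{field} interpolation is supported
    is_insert     => false                         # false => find-and-update on the two filter keys
    update_keys   => "order_id,line_no"            # placeholder fields; first two become filter_key1/filter_key2
    bulk          => true
    bulk_size     => 500
    bulk_interval => 2
  }
}
```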
data/logstash-output-mongodb_upsert_custom.gemspec
ADDED
@@ -0,0 +1,24 @@
Gem::Specification.new do |s|
  s.name          = 'logstash-output-mongodb_upsert_custom'
  s.version       = '0.1.0'
  s.licenses      = ['Apache-2.0']
  s.summary       = 'Plugin to write to mongo db.'
  s.description   = 'This plugin is to write the logstash input to given mongo db.'
  s.homepage      = 'http://www.elastic.co/guide/en/logstash/current/index.html'
  s.authors       = ['jijk']
  s.email         = 'jiji.k@softwareag.com'
  s.require_paths = ['lib']

  # Files
  s.files = Dir['lib/**/*','spec/**/*','vendor/**/*','*.gemspec','*.md','CONTRIBUTORS','Gemfile','LICENSE','NOTICE.TXT']
  # Tests
  s.test_files = s.files.grep(%r{^(test|spec|features)/})

  # Special flag to let us know this is actually a logstash plugin
  s.metadata = { "logstash_plugin" => "true", "logstash_group" => "output" }

  # Gem dependencies
  s.add_runtime_dependency "logstash-core-plugin-api", "~> 2.0"
  s.add_runtime_dependency "logstash-codec-plain"
  s.add_development_dependency "logstash-devutils"
end
data/spec/outputs/mongodb_upsert_custom_spec.rb
ADDED
@@ -0,0 +1,22 @@
# encoding: utf-8
require "logstash/devutils/rspec/spec_helper"
require "logstash/outputs/mongodb_upsert_custom"
require "logstash/codecs/plain"
require "logstash/event"

describe LogStash::Outputs::MongodbUpsertCustom do
  let(:sample_event) { LogStash::Event.new }
  let(:output) { LogStash::Outputs::MongodbUpsertCustom.new }

  before do
    output.register
  end

  describe "receive message" do
    subject { output.receive(sample_event) }

    it "returns a string" do
      expect(subject).to eq("Event received")
    end
  end
end
metadata
ADDED
@@ -0,0 +1,99 @@
--- !ruby/object:Gem::Specification
name: logstash-output-mongodb_upsert_custom
version: !ruby/object:Gem::Version
  version: 0.1.0
platform: ruby
authors:
- jijk
autorequire:
bindir: bin
cert_chain: []
date: 2020-03-26 00:00:00.000000000 Z
dependencies:
- !ruby/object:Gem::Dependency
  name: logstash-core-plugin-api
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - "~>"
      - !ruby/object:Gem::Version
        version: '2.0'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - "~>"
      - !ruby/object:Gem::Version
        version: '2.0'
- !ruby/object:Gem::Dependency
  name: logstash-codec-plain
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
- !ruby/object:Gem::Dependency
  name: logstash-devutils
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
  type: :development
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
description: This plugin is to write the logstash input to given mongo db.
email: jiji.k@softwareag.com
executables: []
extensions: []
extra_rdoc_files: []
files:
- CHANGELOG.md
- CONTRIBUTORS
- DEVELOPER.md
- Gemfile
- LICENSE
- README.md
- lib/logstash/outputs/bson/big_decimal.rb
- lib/logstash/outputs/bson/logstash_event.rb
- lib/logstash/outputs/bson/logstash_timestamp.rb
- lib/logstash/outputs/mongodb_upsert_custom.rb
- logstash-output-mongodb_upsert_custom.gemspec
- spec/outputs/mongodb_upsert_custom_spec.rb
homepage: http://www.elastic.co/guide/en/logstash/current/index.html
licenses:
- Apache-2.0
metadata:
  logstash_plugin: 'true'
  logstash_group: output
post_install_message:
rdoc_options: []
require_paths:
- lib
required_ruby_version: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
    - !ruby/object:Gem::Version
      version: '0'
required_rubygems_version: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
    - !ruby/object:Gem::Version
      version: '0'
requirements: []
rubygems_version: 3.0.3
signing_key:
specification_version: 4
summary: Plugin to write to mongo db.
test_files:
- spec/outputs/mongodb_upsert_custom_spec.rb