logstash-input-couchdb_changes 0.1.1
- checksums.yaml +7 -0
- data/.gitignore +5 -0
- data/DEVELOPER.md +82 -0
- data/Gemfile +4 -0
- data/LICENSE +13 -0
- data/README.md +95 -0
- data/Rakefile +1 -0
- data/lib/logstash/inputs/couchdb_changes.rb +204 -0
- data/logstash-input-couchdb_changes.gemspec +33 -0
- data/spec/inputs/ca_cert.pem +35 -0
- data/spec/inputs/couchdb_changes_spec.rb +499 -0
- data/spec/inputs/localhost.cert +35 -0
- data/spec/inputs/localhost.key +51 -0
- metadata +165 -0
checksums.yaml
ADDED
@@ -0,0 +1,7 @@
---
SHA1:
  metadata.gz: 08d8fd80e27c6ab8877b599ac947060e4028fdee
  data.tar.gz: 1b176733452b924a819430ed6397a51c2724ec8c
SHA512:
  metadata.gz: 3e4362208c4244b95814389879ad99702e390009538ed963a913107ced7dd52d3b1889ae0fab3605738fbf146ad27531d056ec7d190c113190b66964328f3788
  data.tar.gz: e9d45e052779bc1c45272618781d927f3510a5de095dd95ce2ffce640e2f92c1a7b4eb1ab386680c900d9dd075f0451b93e7267354e64523b32708bdf424e766
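These checksums can be verified against a downloaded gem: a `.gem` file is a tar archive containing `metadata.gz` and `data.tar.gz`, and the digests of those members should match the values above. A minimal sketch using Ruby's stdlib (the extraction paths are hypothetical):

```ruby
require "digest"

# Compute the same digest families recorded in checksums.yaml for a file
# extracted from the .gem archive (e.g. "metadata.gz", "data.tar.gz").
def digests(path)
  {
    "SHA1"   => Digest::SHA1.file(path).hexdigest,
    "SHA512" => Digest::SHA512.file(path).hexdigest,
  }
end

# Sanity check of the digest machinery on a known input:
Digest::SHA1.hexdigest("abc")  # => "a9993e364706816aba3e25717850c26c9cd0d89d"
```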
data/.gitignore
ADDED
data/DEVELOPER.md
ADDED
@@ -0,0 +1,82 @@
# CouchDB Changes

This plugin captures the stream of the `_changes` API and feeds it into Logstash.

## Testing

This plugin has some requirements for testing. In addition to the normal dev installation procedure (i.e. `bundle install`), you must also have a CouchDB instance running on localhost. To test SSL connectivity, a certificate, private key, and CA certificate are provided.

### CouchDB configuration

Locate the `local.ini` file of your CouchDB installation and edit it with your favorite text editor.

You will need to reconfigure the `[ssl]` section similar to this:

```
[ssl]
port = 6984
cert_file = /path/to/localhost.cert
key_file = /path/to/localhost.key
```

The files `localhost.cert` and `localhost.key` are in `spec/inputs` of this repository. You can copy them to any path you like. Configure `cert_file` and `key_file` with the full path to wherever you put those files.

Next, in the `[daemons]` configuration block, make sure you see:

```
httpd={couch_httpd, start_link, []}
httpsd={couch_httpd, start_link, [https]}
```

Chances are that you will only need to add the `httpsd` line. Be sure to put `https` in the square brackets at the end of that line.

### Running CouchDB locally

Launch CouchDB by calling the binary, or with your preferred method. In STDOUT or the log file you should see something like this:

```
$ couchdb
Apache CouchDB 1.6.1 (LogLevel=info) is starting.
Apache CouchDB has started. Time to relax.
[info] [<0.31.0>] Apache CouchDB has started on http://127.0.0.1:5984/
[info] [<0.31.0>] Apache CouchDB has started on https://127.0.0.1:6984/
```

If you see lines like these with 127.0.0.1, a local instance of Elasticsearch is all you still need before running the rspec tests.

### Ensure a local instance of Elasticsearch is running

It must be running on 127.0.0.1:9200.

### Execute the tests

The tests can be run with `bundle exec rspec -t elasticsearch`. Adding `-f d` shows more detail.

```
bundle exec rspec -t elasticsearch -f d
Using Accessor#strict_set for specs
Run options:
  include {:elasticsearch=>true}
  exclude {:redis=>true, :socket=>true, :performance=>true, :elasticsearch_secure=>true, :broken=>true, :export_cypher=>true, :integration=>true}

inputs/couchdb_changes
  Load couchdb documents
    agent(/Users/buh/WORK/logstash-plugins/logstash-input-couchdb_changes/spec/inputs/couchdb_changes_spec.rb:127:in) runs
  Test sincedb
    agent(/Users/buh/WORK/logstash-plugins/logstash-input-couchdb_changes/spec/inputs/couchdb_changes_spec.rb:266:in) runs
  Test document updates
    agent(/Users/buh/WORK/logstash-plugins/logstash-input-couchdb_changes/spec/inputs/couchdb_changes_spec.rb:195:in) runs
  Test Secure Connection
    agent(/Users/buh/WORK/logstash-plugins/logstash-input-couchdb_changes/spec/inputs/couchdb_changes_spec.rb:468:in) runs
  Test document deletion
    agent(/Users/buh/WORK/logstash-plugins/logstash-input-couchdb_changes/spec/inputs/couchdb_changes_spec.rb:336:in) runs
  Test authenticated connectivity
    agent(/Users/buh/WORK/logstash-plugins/logstash-input-couchdb_changes/spec/inputs/couchdb_changes_spec.rb:403:in) runs

Finished in 14.44 seconds
6 examples, 0 failures

Randomized with seed 31091
```

Your results should look something like this.
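Before running the full suite, the SSL setup described above can be sanity-checked from Ruby with the stdlib alone. A minimal sketch (the `ca_file` path is hypothetical; point it at the `ca_cert.pem` from `spec/inputs`):

```ruby
require "net/http"
require "uri"

# Build an SSL connection against the CouchDB https port configured in
# local.ini above. The actual request is left commented out so this runs
# without a live server.
uri = URI("https://127.0.0.1:6984/")
http = Net::HTTP.new(uri.host, uri.port)
http.use_ssl = true
http.ca_file = "/path/to/ca_cert.pem"  # hypothetical path to the test CA cert
# http.start { |h| puts h.get("/").body }  # uncomment with CouchDB running
```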
data/Gemfile
ADDED
data/LICENSE
ADDED
@@ -0,0 +1,13 @@
Copyright (c) 2012-2015 Elasticsearch <http://www.elasticsearch.org>

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
data/README.md
ADDED
@@ -0,0 +1,95 @@
# Logstash Plugin

This is a plugin for [Logstash](https://github.com/elasticsearch/logstash).

It is fully free and fully open source. The license is Apache 2.0, meaning you are pretty much free to use it however you want in whatever way.

## Documentation

Logstash provides infrastructure to automatically generate documentation for this plugin. We use the asciidoc format to write documentation, so any comments in the source code are first converted into asciidoc and then into html. All plugin documentation is placed under one [central location](http://www.elasticsearch.org/guide/en/logstash/current/).

- For formatting code or config examples, you can use the asciidoc `[source,ruby]` directive
- For more asciidoc formatting tips, see the excellent reference here https://github.com/elasticsearch/docs#asciidoc-guide

## Need Help?

Need help? Try #logstash on freenode IRC or the logstash-users@googlegroups.com mailing list.

## Developing

### 1. Plugin Development and Testing

#### Code
- To get started, you'll need JRuby with the Bundler gem installed.

- Create a new plugin or clone an existing one from the GitHub [logstash-plugins](https://github.com/logstash-plugins) organization.

- Install dependencies
```sh
bundle install
```

#### Test

```sh
bundle exec rspec
```

The Logstash code required to run the tests/specs is specified in the `Gemfile` by a line similar to:
```ruby
gem "logstash", :github => "elasticsearch/logstash", :branch => "1.5"
```
To test against another version or a local Logstash, edit the `Gemfile` to specify an alternative location, for example:
```ruby
gem "logstash", :github => "elasticsearch/logstash", :ref => "master"
```
```ruby
gem "logstash", :path => "/your/local/logstash"
```

Then update your dependencies and run your tests:

```sh
bundle install
bundle exec rspec
```

### 2. Running your unpublished Plugin in Logstash

#### 2.1 Run in a local Logstash clone

- Edit Logstash `tools/Gemfile` and add the local plugin path, for example:
```ruby
gem "logstash-filter-awesome", :path => "/your/local/logstash-filter-awesome"
```
- Update Logstash dependencies
```sh
rake vendor:gems
```
- Run Logstash with your plugin
```sh
bin/logstash -e 'filter {awesome {}}'
```
At this point any modifications to the plugin code will be applied to this local Logstash setup. After modifying the plugin, simply rerun Logstash.

#### 2.2 Run in an installed Logstash

- Build your plugin gem
```sh
gem build logstash-filter-awesome.gemspec
```
- Install the plugin from the Logstash home
```sh
bin/plugin install /your/local/plugin/logstash-filter-awesome.gem
```
- Start Logstash and proceed to test the plugin

## Contributing

All contributions are welcome: ideas, patches, documentation, bug reports, complaints, and even something you drew up on a napkin.

Programming is not a required skill. Whatever you've seen about open source and maintainers or community members saying "send patches or die" - you will not see that here.

It is more important to me that you are able to contribute.

For more information about contributing, see the [CONTRIBUTING](https://github.com/elasticsearch/logstash/blob/master/CONTRIBUTING.md) file.
data/Rakefile
ADDED
@@ -0,0 +1 @@
require "logstash/devutils/rake"
data/lib/logstash/inputs/couchdb_changes.rb
ADDED
@@ -0,0 +1,204 @@
# encoding: utf-8

require "logstash/inputs/base"
require "logstash/namespace"
require "net/http"
require "uri"

# Stream events from the CouchDB _changes URI.
# Use event metadata to allow for upsert and
# document deletion.
class LogStash::Inputs::CouchDBChanges < LogStash::Inputs::Base
  config_name "couchdb_changes"

  # IP or hostname of your CouchDB instance
  config :host, :validate => :string, :default => "localhost"

  # Port of your CouchDB instance.
  config :port, :validate => :number, :default => 5984

  # The CouchDB db to connect to.
  # Required parameter.
  config :db, :validate => :string, :required => true

  # Connect to CouchDB's _changes feed securely (via https)
  # Default: false (via http)
  config :secure, :validate => :boolean, :default => false

  # Path to a CA certificate file, used to validate certificates
  config :ca_file, :validate => :path

  # Username, if authentication is needed to connect to
  # CouchDB
  config :username, :validate => :string, :default => nil

  # Password, if authentication is needed to connect to
  # CouchDB
  config :password, :validate => :password, :default => nil

  # Logstash connects to CouchDB's _changes with feed=continuous
  # The heartbeat is how often (in milliseconds) Logstash will ping
  # CouchDB to ensure the connection is maintained. Changing this
  # setting is not recommended unless you know what you are doing.
  config :heartbeat, :validate => :number, :default => 1000

  # File path where the last sequence number in the _changes
  # stream is stored. If unset it will write to "$HOME/.couchdb_seq"
  config :sequence_path, :validate => :string

  # If unspecified, Logstash will attempt to read the last sequence number
  # from the `sequence_path` file. If that is empty or non-existent, it will
  # begin with 0 (the beginning).
  #
  # If you specify this value, it is anticipated that you will
  # only be doing so for an initial read under special circumstances
  # and that you will unset this value afterwards.
  config :initial_sequence, :validate => :number

  # Preserve the CouchDB document revision "_rev" value in the
  # output.
  config :keep_revision, :validate => :boolean, :default => false

  # Future feature! Until implemented, changing this from the default
  # will not do anything.
  #
  # Ignore attachments associated with CouchDB documents.
  config :ignore_attachments, :validate => :boolean, :default => true

  # Reconnect flag. When true, always try to reconnect after a failure
  config :always_reconnect, :validate => :boolean, :default => true

  # Reconnect delay: time between reconnect attempts, in seconds.
  config :reconnect_delay, :validate => :number, :default => 10

  # Timeout: Number of milliseconds to wait for new data before
  # terminating the connection. If a timeout is set it will disable
  # the heartbeat configuration option.
  config :timeout, :validate => :number

  # Declare these constants here.
  FEED = 'continuous'
  INCLUDEDOCS = 'true'

  public
  def register
    require "logstash/util/buftok"
    if @sequence_path.nil?
      if ENV["HOME"].nil?
        @logger.error("No HOME environment variable set, I don't know where " \
                      "to keep track of the files I'm watching. Either set " \
                      "HOME in your environment, or set sequence_path " \
                      "in your Logstash config.")
        raise ArgumentError
      end
      default_dir = ENV["HOME"]
      @sequence_path = File.join(default_dir, ".couchdb_seq")

      @logger.info("No sequence_path set, generating one...",
                   :sequence_path => @sequence_path)
    end

    @sequencedb = SequenceDB::File.new(@sequence_path)
    @path = '/' + @db + '/_changes'

    @scheme = @secure ? 'https' : 'http'

    @sequence = @initial_sequence ? @initial_sequence : @sequencedb.read

    if @username && @password
      @userinfo = @username + ':' + @password.value
    else
      @userinfo = nil
    end

  end

  module SequenceDB
    class File
      def initialize(file)
        @sequence_path = file
      end

      def read
        ::File.exists?(@sequence_path) ? ::File.read(@sequence_path).chomp.strip : 0
      end

      def write(sequence = nil)
        sequence = 0 if sequence.nil?
        ::File.write(@sequence_path, sequence.to_s)
      end
    end
  end
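The `SequenceDB::File` reader/writer is just a one-value file store: `write` persists the sequence as a string, `read` returns it (or `0` when the file is missing). A standalone sketch of the same logic against a throwaway path:

```ruby
require "tmpdir"

# Mirror of SequenceDB::File's read/write logic, using a temp path.
path = File.join(Dir.tmpdir, "couchdb_seq_demo")
File.write(path, 42.to_s)                                  # write
seq = File.exist?(path) ? File.read(path).chomp.strip : 0  # read
File.delete(path)
seq  # => "42"
```

Note that `read` returns a string when the file exists and the integer `0` otherwise, which is why the plugin calls `.to_s` before writing the sequence back.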

  public
  def run(queue)
    buffer = FileWatch::BufferedTokenizer.new
    @logger.info("Connecting to CouchDB _changes stream at:", :host => @host.to_s, :port => @port.to_s, :db => @db)
    uri = build_uri
    Net::HTTP.start(@host, @port, :use_ssl => (@secure == true), :ca_file => @ca_file) do |http|
      request = Net::HTTP::Get.new(uri.request_uri)
      http.request request do |response|
        raise ArgumentError, "Database not found!" if response.code == "404"
        response.read_body do |chunk|
          buffer.extract(chunk).each do |changes|
            # If no changes come since the last heartbeat period, a blank line is
            # sent as a sort of keep-alive. We should ignore those.
            next if changes.chomp.empty?
            if event = build_event(changes)
              @logger.debug("event", :event => event.to_hash_with_metadata) if @logger.debug?
              decorate(event)
              queue << event
              @sequence = event['@metadata']['seq']
              @sequencedb.write(@sequence.to_s)
            end
          end
        end
      end
    end
  rescue Timeout::Error, Errno::EINVAL, Errno::ECONNRESET, EOFError, Errno::EHOSTUNREACH, Errno::ECONNREFUSED,
         Net::HTTPBadResponse, Net::HTTPHeaderSyntaxError, Net::ProtocolError => e
    @logger.error("Connection problem encountered: Retrying connection in 10 seconds...", :error => e.to_s)
    retry if reconnect?
  rescue Errno::EBADF => e
    @logger.error("Unable to connect: Bad file descriptor: ", :error => e.to_s)
    retry if reconnect?
  rescue ArgumentError => e
    @logger.error("Unable to connect to database", :db => @db, :error => e.to_s)
    retry if reconnect?
  end

  private
  def build_uri
    options = {:feed => FEED, :include_docs => INCLUDEDOCS, :since => @sequence}
    options = options.merge(@timeout ? {:timeout => @timeout} : {:heartbeat => @heartbeat})
    URI::HTTP.build(:scheme => @scheme, :userinfo => @userinfo, :host => @host, :port => @port, :path => @path, :query => URI.encode_www_form(options))
  end
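With the default settings (`heartbeat => 1000`, no `timeout`, sequence 0, database `db`), `build_uri` produces a URI like the one sketched below. This is a standalone reconstruction of the same `URI::HTTP.build` call, omitting the `:scheme` and `:userinfo` components the plugin also passes:

```ruby
require "uri"

# Same query construction as build_uri: continuous feed, include_docs,
# last-seen sequence, and heartbeat (used only when no :timeout is set).
options = { :feed => "continuous", :include_docs => "true", :since => 0 }
options = options.merge(:heartbeat => 1000)
uri = URI::HTTP.build(:host => "localhost", :port => 5984,
                      :path => "/db/_changes",
                      :query => URI.encode_www_form(options))
uri.to_s
# => "http://localhost:5984/db/_changes?feed=continuous&include_docs=true&since=0&heartbeat=1000"
```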

  private
  def reconnect?
    sleep(@always_reconnect ? @reconnect_delay : 0)
    @always_reconnect
  end

  private
  def build_event(line)
    # In lieu of a codec, build the event here
    line = LogStash::Json.load(line)
    return nil if line.has_key?("last_seq")
    hash = Hash.new
    hash['@metadata'] = { '_id' => line['doc']['_id'] }
    if line['doc']['_deleted']
      hash['@metadata']['action'] = 'delete'
    else
      hash['doc'] = line['doc']
      hash['@metadata']['action'] = 'update'
      hash['doc'].delete('_id')
      hash['doc_as_upsert'] = true
      hash['doc'].delete('_rev') unless @keep_revision
    end
    hash['@metadata']['seq'] = line['seq']
    event = LogStash::Event.new(hash)
    @logger.debug("event", :event => event.to_hash_with_metadata) if @logger.debug?
    event
  end
end
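The branching in `build_event` can be exercised in isolation: deletions become a bare `@metadata` action, while live documents are copied into `doc` with `doc_as_upsert` set so the Elasticsearch output can upsert them. A sketch using stdlib `json` in place of `LogStash::Json`, with a hypothetical `build_hash` helper standing in for the event-hash portion:

```ruby
require "json"

# Stand-in for build_event's hash construction (keep_revision defaults off).
def build_hash(line, keep_revision: false)
  change = JSON.parse(line)
  return nil if change.key?("last_seq")   # trailer line, not a change
  hash = { "@metadata" => { "_id" => change["doc"]["_id"] } }
  if change["doc"]["_deleted"]
    hash["@metadata"]["action"] = "delete"
  else
    hash["doc"] = change["doc"]
    hash["@metadata"]["action"] = "update"
    hash["doc"].delete("_id")             # id lives in @metadata only
    hash["doc_as_upsert"] = true
    hash["doc"].delete("_rev") unless keep_revision
  end
  hash["@metadata"]["seq"] = change["seq"]
  hash
end

updated = '{"seq":3,"id":"1","doc":{"_id":"1","_rev":"1-abc","name":"Peter Parker"}}'
deleted = '{"seq":4,"id":"9","doc":{"_id":"9","_rev":"2-def","_deleted":true}}'
build_hash(updated)["doc"]                    # => {"name"=>"Peter Parker"}
build_hash(deleted)["@metadata"]["action"]    # => "delete"
```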
data/logstash-input-couchdb_changes.gemspec
ADDED
@@ -0,0 +1,33 @@
Gem::Specification.new do |s|

  s.name          = 'logstash-input-couchdb_changes'
  s.version       = '0.1.1'
  s.licenses      = ['Apache License (2.0)']
  s.summary       = "This input captures the _changes stream from a CouchDB instance"
  s.description   = "This gem is a logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/plugin install gemname. This gem is not a stand-alone program"
  s.authors       = ["Elasticsearch"]
  s.email         = 'info@elasticsearch.com'
  s.homepage      = "http://www.elasticsearch.org/guide/en/logstash/current/index.html"
  s.require_paths = ["lib"]

  # Files
  s.files = `git ls-files`.split($\) + ::Dir.glob('vendor/*')

  # Tests
  s.test_files = s.files.grep(%r{^(test|spec|features)/})

  # Special flag to let us know this is actually a logstash plugin
  s.metadata = { "logstash_plugin" => "true", "logstash_group" => "input" }

  # Gem dependencies
  s.add_runtime_dependency 'logstash', '>= 1.4.0', '< 2.0.0'
  s.add_runtime_dependency 'logstash-codec-plain'
  s.add_runtime_dependency 'ftw', '>= 0.0.41'
  s.add_runtime_dependency 'json'

  s.add_development_dependency 'ftw', '>= 0.0.41'
  s.add_development_dependency 'logstash-devutils', '>= 0.0.6'
  s.add_development_dependency 'logstash-output-elasticsearch'

end
data/spec/inputs/ca_cert.pem
ADDED
@@ -0,0 +1,35 @@
-----BEGIN CERTIFICATE-----
MIIGLDCCBBSgAwIBAgIJAK8PXnAk27i2MA0GCSqGSIb3DQEBCwUAMIGlMQswCQYD
VQQGEwJVUzETMBEGA1UECAwKQ2FsaWZvcm5pYTESMBAGA1UEBwwJTG9zIEFsdG9z
MRYwFAYDVQQKDA1FbGFzdGljc2VhcmNoMRAwDgYDVQQLDAdUZXN0aW5nMRkwFwYD
VQQDDBBFbGFzdGljc2VhcmNoIENBMSgwJgYJKoZIhvcNAQkBFhlub3JlcGx5QGVs
YXN0aWNzZWFyY2guY29tMB4XDTE0MTExNDE4MDk0MVoXDTI0MTExMTE4MDk0MVow
gaUxCzAJBgNVBAYTAlVTMRMwEQYDVQQIDApDYWxpZm9ybmlhMRIwEAYDVQQHDAlM
b3MgQWx0b3MxFjAUBgNVBAoMDUVsYXN0aWNzZWFyY2gxEDAOBgNVBAsMB1Rlc3Rp
bmcxGTAXBgNVBAMMEEVsYXN0aWNzZWFyY2ggQ0ExKDAmBgkqhkiG9w0BCQEWGW5v
cmVwbHlAZWxhc3RpY3NlYXJjaC5jb20wggIiMA0GCSqGSIb3DQEBAQUAA4ICDwAw
ggIKAoICAQDZZ7PvqXV3Qj8jPJE3qBV1ALDeM01PoclxxfKbcvKRhoMmzBWkEqFg
BhSQ49SoUzfhgtr7lpRPs7FPQYjWnbmJ6KSVESKK5VDuBx1/MLPgq3lQK/tCyKF0
yLiwXCw5doc1cnU/Ih2Ckk2ZNHpnSAM7C/lJ4jiKsei96RiXCRvf8uhJZxVdXnmS
R+uHqNsEjxQeh5EuuLzi7DA80GRn/+JWjWtaYClTHRkHKgCsWYF4MCEtwHIw0s9g
MDvNtkXA3guD1CP3sgm4+REMWABZll80xkzoC5WAyFcEP4oiBt7uPQMrbWTEOy/5
1TmKqV+mTdUpZWmsLIVLky8b8w0kCAd9QtwWacXLj38hvMifCjYukkYMp5g8c79U
4kimsYzm0r8AwEQLRsVisaq2a0SXkiN4SWgFwQ+gstEeRuLMBKasBgwMU/ebk+rl
Zli5eB7EYT89q2CSzp/I/CPt4NaMuCRgURKS6wFQzbg9SHfr9pRwhnNH2AGGBzSC
1e9Q61/rdGUCs/23/961lnehux0i99qbVy9Q/+65rrkkkcPlbT1tcnkZoh7tcYn9
YoBvIkbNoeOPtB1N1UPx08YkzWsNQncXKcm8sb/EtUHsvO7Telm3Q0DcjlzQGLII
U4iSj3xlpSiyrmsrFAzKRWBcrOODLC3ReWKtblYoUlGf3W7EGhCdNQIDAQABo10w
WzAdBgNVHQ4EFgQU0fMBg4eGjjrvuPzK4LYgyparbGcwHwYDVR0jBBgwFoAU0fMB
g4eGjjrvuPzK4LYgyparbGcwDAYDVR0TBAUwAwEB/zALBgNVHQ8EBAMCAQYwDQYJ
KoZIhvcNAQELBQADggIBAI8TfLPt5m6feuF1V48fwMUJsiWpe5D0t97gDMcnXAuT
gG53e7J966B5DaNzvQQ7HWSlYVswSLYgl6Qz1nwEeSnoSlKU8WZMOoJB0F7twdzm
ifCUMXHUv5+Ib9kMSP892KK99p6W0jg7B65+WIwBctnXPyB4FPlhu1mcLaeRzYH5
3FWmaOm/tv185o4kYLOwlkR0XkWRNEZplSY0h+HahVk96dCF+yxkJlgwu7DUujmA
FKsBV20CittQHOOMCTBMfj5kAL19S+5GJCY5GfzRbJAd8FJ2DV+5COUaFBaC3Mog
Y51D9E/13tyB2Lq8mzKhvB0Jq3WDCjn9zHAgWxQFAYtEIkJOIOKowoLi7PwMROSR
/KNVEwnZ3yuPWL5rkMz3ogZFNA1hIQiOj0S9KspL1gLaHDfX2kt0OY+Ri/G8mbg4
2Rn3R//ZtLQFLyI32mN/nyf5FoWJu0Y8+zLHSpkuWSO10SboImD3HNvxr6sZIggc
YBYgU6OYDwp3/owHQSss6wOiRdn6sp/pcCC31DiQCbcD3mLQbP+FjZEmOfo50y5I
gczeMu//CfWVmGDr+SqEbUkORYJrEeEZ4iSTbqPfrixLdMd/0SgPRqt/Ydk+ia+G
3INj46/FHVGAdTaUXfyPpw+cRAu4/rUjGRBGkaXIpAJ14x8w0bDQ0VpiIMxZEKz2
-----END CERTIFICATE-----
data/spec/inputs/couchdb_changes_spec.rb
ADDED
@@ -0,0 +1,499 @@
require "logstash/devutils/rspec/spec_helper"
require "ftw"
require "logstash/plugin"
require "logstash/json"
require "logstash/inputs/couchdb_changes"

module Helpers
  def createdb
    ftw = FTW::Agent.new
    ftw.put!("http://127.0.0.1:5984/db")
  end

  def deletedb
    ftw = FTW::Agent.new
    ftw.delete!("http://127.0.0.1:5984/db")
  end

  def populatedb
    ftw = FTW::Agent.new
    ftw.put!("http://127.0.0.1:5984/db/1", :body => '{"name":"Peter Parker"}')
    ftw.put!("http://127.0.0.1:5984/db/2", :body => '{"name":"Mary Jane Watson"}')
    ftw.put!("http://127.0.0.1:5984/db/3", :body => '{"name":"Captain America"}')
    ftw.put!("http://127.0.0.1:5984/db/4", :body => '{"name":"J. Jonah Jameson"}')
    ftw.put!("http://127.0.0.1:5984/db/5", :body => '{"name":"Otto Octavius"}')
    ftw.put!("http://127.0.0.1:5984/db/6", :body => '{"name":"May Parker"}')
    ftw.put!("http://127.0.0.1:5984/db/7", :body => '{"name":"Harry Osborne"}')
    ftw.put!("http://127.0.0.1:5984/db/8", :body => '{"name":"Norman Osborne"}')
    ftw.put!("http://127.0.0.1:5984/db/9", :body => '{"name":"Ben Parker"}')
    ftw.put!("http://127.0.0.1:5984/db/10", :body => '{"name":"Stan Lee"}')
  end

  def updatedocs
    ftw = FTW::Agent.new
    data = ""
    response = ftw.get!("http://127.0.0.1:5984/db/_changes?include_docs=true")
    response.read_body { |chunk| data << chunk }
    result = LogStash::Json.load(data)
    result["results"].each do |doc|
      upd = false
      body = doc["doc"]
      case doc["id"]
      when "1"
        body["Alter-ego"] = "Spider-man"
        upd = true
      when "5"
        body["Alter-ego"] = "Doctor Octopus"
        upd = true
      when "8"
        body["Alter-ego"] = "Green Goblin"
        upd = true
      end
      if upd
        ftw.put!("http://127.0.0.1:5984/db/#{doc["id"]}", :body => LogStash::Json.dump(body))
      end
    end
  end

  def deletedoc
    ftw = FTW::Agent.new
    data = ""
    response = ftw.get!("http://127.0.0.1:5984/db/9")
    response.read_body { |chunk| data << chunk }
    doc = LogStash::Json.load(data)
    ftw.delete!("http://127.0.0.1:5984/db/9?rev=#{doc["_rev"]}")
  end

  def createuser
    ftw = FTW::Agent.new
    ftw.put!("http://127.0.0.1:5984/_config/admins/logstash", :body => '"logstash"')
  end

  def deleteuser
    user = "logstash"
    pass = "logstash"
    auth = "#{user}:#{pass}@"
    ftw = FTW::Agent.new
    ftw.delete!("http://#{auth}127.0.0.1:5984/_config/admins/logstash")
  end

  def deleteindex
    ftw = FTW::Agent.new
    ftw.delete!("http://127.0.0.1:9200/couchdb_test")
  end

  def buildup
    # BEGIN: The following calls are a safety net in case of an aborted test
    deleteuser
    teardown
    # END
    createdb
    populatedb
  end

  def teardown
    deletedb
    deleteindex
    sequence = "/tmp/.couchdb_seq"
    File.delete(sequence) if File.exist?(sequence)
  end
end
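The helpers above issue their requests through the `ftw` gem. For readers without `ftw` installed, the same requests can be formed with stdlib `Net::HTTP`; a sketch of the first `populatedb` PUT, built but not sent so it runs without a live CouchDB:

```ruby
require "net/http"
require "uri"

# Stdlib equivalent of ftw.put!("http://127.0.0.1:5984/db/1", :body => ...).
uri = URI("http://127.0.0.1:5984/db/1")
req = Net::HTTP::Put.new(uri)
req["Content-Type"] = "application/json"
req.body = '{"name":"Peter Parker"}'
# Net::HTTP.start(uri.host, uri.port) { |http| http.request(req) }  # with CouchDB up
```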

describe "inputs/couchdb_changes", :elasticsearch => true, :couchdb => true do
  describe "Load couchdb documents", :elasticsearch => true, :couchdb => true do
    include Helpers
    sequence = "/tmp/.couchdb_seq"
    index = "couchdb_test"

    before do
      buildup
    end

    ftw = FTW::Agent.new

    config <<-CONFIG
    input {
      couchdb_changes {
        db => "db"
        host => "127.0.0.1"
        timeout => 2000
        always_reconnect => false
        sequence_path => "#{sequence}"
        type => "couchdb"
      }
    }
    output {
      elasticsearch {
        action => "%{[@metadata][action]}"
        document_id => "%{[@metadata][_id]}"
        host => "127.0.0.1"
        index => "#{index}"
        protocol => "http"
      }
    }
    CONFIG

    agent do
      # Verify the count
      ftw.post!("http://127.0.0.1:9200/#{index}/_refresh")
      data = ""
      response = ftw.get!("http://127.0.0.1:9200/#{index}/_count?q=*")
      response.read_body { |chunk| data << chunk }
      result = LogStash::Json.load(data)
      count = result["count"]
      insist { count } == 10
      # Get the docs and do a couple spot checks
      response = ftw.get!("http://127.0.0.1:9200/#{index}/_search?q=*&size=10")
      data = ""
      response.read_body { |chunk| data << chunk }
      result = LogStash::Json.load(data)
      result["hits"]["hits"].each do |doc|
        # With no 'index_type' set, the document type should be the type
        # set on the input
        insist { doc["_type"] } == "couchdb"
        insist { doc["_index"] } == index
        case doc["_id"]
        when 1
          insist { doc["_source"]["name"] } == "Peter Parker"
        when 5
          insist { doc["_source"]["name"] } == "Otto Octavius"
        when 8
          insist { doc["_source"]["name"] } == "Norman Osborne"
        end
      end
    end
    after do
      teardown
    end
  end

  describe "Test document updates", :elasticsearch => true, :couchdb => true do
    include Helpers
    sequence = "/tmp/.couchdb_seq"
    index = "couchdb_test"

    before do
      buildup
      updatedocs
    end

    ftw = FTW::Agent.new

    config <<-CONFIG
    input {
      couchdb_changes {
        db => "db"
        host => "127.0.0.1"
        timeout => 2000
        always_reconnect => false
        sequence_path => "#{sequence}"
        type => "couchdb"
      }
    }
    output {
      elasticsearch {
        action => "%{[@metadata][action]}"
        document_id => "%{[@metadata][_id]}"
        host => "127.0.0.1"
        index => "#{index}"
        protocol => "http"
      }
    }
    CONFIG

    agent do
      # Verify the count (which should still be 10)
      ftw.post!("http://127.0.0.1:9200/#{index}/_refresh")
      data = ""
      response = ftw.get!("http://127.0.0.1:9200/#{index}/_count?q=*")
      response.read_body { |chunk| data << chunk }
      result = LogStash::Json.load(data)
      count = result["count"]
      insist { count } == 10
      # Get the docs and do a couple more spot checks
      response = ftw.get!("http://127.0.0.1:9200/#{index}/_search?q=*&size=10")
      data = ""
      response.read_body { |chunk| data << chunk }
      result = LogStash::Json.load(data)
      result["hits"]["hits"].each do |doc|
        case doc["_id"]
        when 1
          insist { doc["_source"]["Alter-ego"] } == "Spider-man"
        when 5
          insist { doc["_source"]["Alter-ego"] } == "Doctor Octopus"
        when 8
          insist { doc["_source"]["Alter-ego"] } == "Green Goblin"
        end
      end
    end

    after do
      teardown
    end

  end

  describe "Test sequence", :elasticsearch => true, :couchdb => true do
    include Helpers
    sequence = "/tmp/.couchdb_seq"
    index = "couchdb_test"

    ftw = FTW::Agent.new

    config <<-CONFIG
    input {
      couchdb_changes {
        db => "db"
        host => "127.0.0.1"
        timeout => 2000
        always_reconnect => false
        sequence_path => "#{sequence}"
        type => "couchdb"
      }
    }
    output {
      elasticsearch {
        action => "%{[@metadata][action]}"
        document_id => "%{[@metadata][_id]}"
        host => "127.0.0.1"
        index => "#{index}"
        protocol => "http"
      }
    }
    CONFIG

    before do
      # This puts 10 docs into CouchDB
      buildup
      # And updates 3
      updatedocs
      # But let's set sequence to say we only read the 10th change
      # so it will start with change #11
      File.open(sequence, 'w') { |file| file.write("10") }
    end

    agent do
      # Verify the count
      ftw.post!("http://127.0.0.1:9200/#{index}/_refresh")
      data = ""
      response = ftw.get!("http://127.0.0.1:9200/#{index}/_count?q=*")
      response.read_body { |chunk| data << chunk }
      result = LogStash::Json.load(data)
      count = result["count"]
      # We should only have 3 documents here because of the sequence change
      insist { count } == 3
      # Get the docs and do a couple more spot checks
      response = ftw.get!("http://127.0.0.1:9200/#{index}/_search?q=*&size=10")
      data = ""
      response.read_body { |chunk| data << chunk }
      result = LogStash::Json.load(data)
      counter = 0
      result["hits"]["hits"].each do |doc|
        case doc["_id"]
        when 1
          insist { doc["_source"]["Alter-ego"] } == "Spider-man"
        when 5
          insist { doc["_source"]["Alter-ego"] } == "Doctor Octopus"
        when 8
          insist { doc["_source"]["Alter-ego"] } == "Green Goblin"
        end
      end
      # Logstash should have updated the sequence to 13 after all this
      insist { File.read(sequence) } == "13"
    end

    after do
      teardown
    end

  end

  describe "Test document deletion", :elasticsearch => true, :couchdb => true do
    include Helpers
    sequence = "/tmp/.couchdb_seq"
    index = "couchdb_test"

    before do
      buildup
      deletedoc # from CouchDB
    end

    ftw = FTW::Agent.new

    config <<-CONFIG
    input {
      couchdb_changes {
        db => "db"
        host => "127.0.0.1"
        timeout => 2000
        always_reconnect => false
        sequence_path => "#{sequence}"
        type => "couchdb"
      }
    }
    output {
      elasticsearch {
        action => "%{[@metadata][action]}"
        document_id => "%{[@metadata][_id]}"
        host => "127.0.0.1"
        index => "#{index}"
        protocol => "http"
      }
    }
    CONFIG

    agent do
      # Verify the count (should now be 9)
      ftw.post!("http://127.0.0.1:9200/#{index}/_refresh")
      data = ""
      response = ftw.get!("http://127.0.0.1:9200/#{index}/_count?q=*")
      response.read_body { |chunk| data << chunk }
      result = LogStash::Json.load(data)
      count = result["count"]
      insist { count } == 9
      # Get the docs and do a couple more spot checks
      response = ftw.get!("http://127.0.0.1:9200/#{index}/_search?q=*&size=10")
|
356
|
+
data = ""
|
357
|
+
response.read_body { |chunk| data << chunk }
|
358
|
+
result = LogStash::Json.load(data)
|
359
|
+
insist { result["hits"]["hits"] }.any? { |doc| doc["_id"] == "9" }
|
360
|
+
end
|
361
|
+
|
362
|
+
after do
|
363
|
+
teardown
|
364
|
+
end
|
365
|
+
|
366
|
+
end
|
367
|
+
|
368
|
+
describe "Test authenticated connectivity", :elasticsearch => true, :couchdb => true do
|
369
|
+
include Helpers
|
370
|
+
user = "logstash"
|
371
|
+
pass = "logstash"
|
372
|
+
sequence = "/tmp/.couchdb_seq"
|
373
|
+
index = "couchdb_test"
|
374
|
+
|
375
|
+
before do
|
376
|
+
buildup
|
377
|
+
createuser
|
378
|
+
end
|
379
|
+
|
380
|
+
ftw = FTW::Agent.new
|
381
|
+
|
382
|
+
config <<-CONFIG
|
383
|
+
input {
|
384
|
+
couchdb_changes {
|
385
|
+
db => "db"
|
386
|
+
host => "127.0.0.1"
|
387
|
+
timeout => 2000
|
388
|
+
always_reconnect => false
|
389
|
+
sequence_path => "#{sequence}"
|
390
|
+
type => "couchdb"
|
391
|
+
username => "#{user}"
|
392
|
+
password => "#{pass}"
|
393
|
+
}
|
394
|
+
}
|
395
|
+
output {
|
396
|
+
elasticsearch {
|
397
|
+
action => "%{[@metadata][action]}"
|
398
|
+
document_id => "%{[@metadata][_id]}"
|
399
|
+
host => "127.0.0.1"
|
400
|
+
index => "#{index}"
|
401
|
+
protocol => "http"
|
402
|
+
}
|
403
|
+
}
|
404
|
+
CONFIG
|
405
|
+
|
406
|
+
agent do
|
407
|
+
# Verify the count
|
408
|
+
ftw.post!("http://127.0.0.1:9200/#{index}/_refresh")
|
409
|
+
data = ""
|
410
|
+
response = ftw.get!("http://127.0.0.1:9200/#{index}/_count?q=*")
|
411
|
+
response.read_body { |chunk| data << chunk }
|
412
|
+
result = LogStash::Json.load(data)
|
413
|
+
count = result["count"]
|
414
|
+
insist { count } == 10
|
415
|
+
# Get the docs and do a couple spot checks
|
416
|
+
response = ftw.get!("http://127.0.0.1:9200/#{index}/_search?q=*&size=10")
|
417
|
+
data = ""
|
418
|
+
response.read_body { |chunk| data << chunk }
|
419
|
+
result = LogStash::Json.load(data)
|
420
|
+
doc3 = result["hits"]["hits"].find { |doc| doc["_id"] == "3" }
|
421
|
+
# Make sure it's found
|
422
|
+
reject { doc3 }.nil?
|
423
|
+
# verify the 'name' field
|
424
|
+
insist { doc3["_source"]["name"] } == "Captain America"
|
425
|
+
end
|
426
|
+
|
427
|
+
after do
|
428
|
+
deleteuser
|
429
|
+
teardown
|
430
|
+
end
|
431
|
+
end
|
432
|
+
|
433
|
+
describe "Test Secure Connection", :elasticsearch => true, :couchdb => true do
|
434
|
+
include Helpers
|
435
|
+
sequence = "/tmp/.couchdb_seq"
|
436
|
+
index = "couchdb_test"
|
437
|
+
ca_file = File.dirname(__FILE__) + "/ca_cert.pem"
|
438
|
+
|
439
|
+
before do
|
440
|
+
buildup
|
441
|
+
end
|
442
|
+
|
443
|
+
ftw = FTW::Agent.new
|
444
|
+
|
445
|
+
config <<-CONFIG
|
446
|
+
input {
|
447
|
+
couchdb_changes {
|
448
|
+
db => "db"
|
449
|
+
host => "localhost"
|
450
|
+
port => 6984
|
451
|
+
timeout => 2000
|
452
|
+
always_reconnect => false
|
453
|
+
sequence_path => "#{sequence}"
|
454
|
+
type => "couchdb"
|
455
|
+
secure => true
|
456
|
+
ca_file => "#{ca_file}"
|
457
|
+
}
|
458
|
+
}
|
459
|
+
output {
|
460
|
+
elasticsearch {
|
461
|
+
action => "%{[@metadata][action]}"
|
462
|
+
document_id => "%{[@metadata][_id]}"
|
463
|
+
host => "127.0.0.1"
|
464
|
+
index => "#{index}"
|
465
|
+
protocol => "http"
|
466
|
+
}
|
467
|
+
}
|
468
|
+
CONFIG
|
469
|
+
|
470
|
+
agent do
|
471
|
+
# Verify the count
|
472
|
+
ftw.post!("http://127.0.0.1:9200/#{index}/_refresh")
|
473
|
+
data = ""
|
474
|
+
response = ftw.get!("http://127.0.0.1:9200/#{index}/_count?q=*")
|
475
|
+
response.read_body { |chunk| data << chunk }
|
476
|
+
result = LogStash::Json.load(data)
|
477
|
+
count = result["count"]
|
478
|
+
insist { count } == 10
|
479
|
+
# Get the docs and do a couple spot checks
|
480
|
+
response = ftw.get!("http://127.0.0.1:9200/#{index}/_search?q=*&size=10")
|
481
|
+
data = ""
|
482
|
+
response.read_body { |chunk| data << chunk }
|
483
|
+
result = LogStash::Json.load(data)
|
484
|
+
doc8 = result["hits"]["hits"].find { |doc| doc["_id"] == "8" }
|
485
|
+
# Make sure it's found
|
486
|
+
reject { doc8 }.nil?
|
487
|
+
# verify the 'name' field
|
488
|
+
insist { doc8["_source"]["name"] } == "Norman Osborne"
|
489
|
+
end
|
490
|
+
|
491
|
+
after do
|
492
|
+
teardown
|
493
|
+
end
|
494
|
+
end
|
495
|
+
|
496
|
+
end
|
497
|
+
|
498
|
+
|
499
|
+
|
data/spec/inputs/localhost.cert
ADDED
@@ -0,0 +1,35 @@
+-----BEGIN CERTIFICATE-----
+MIIGDDCCA/SgAwIBAgICEAAwDQYJKoZIhvcNAQELBQAwgaUxCzAJBgNVBAYTAlVT
+MRMwEQYDVQQIDApDYWxpZm9ybmlhMRIwEAYDVQQHDAlMb3MgQWx0b3MxFjAUBgNV
+BAoMDUVsYXN0aWNzZWFyY2gxEDAOBgNVBAsMB1Rlc3RpbmcxGTAXBgNVBAMMEEVs
+YXN0aWNzZWFyY2ggQ0ExKDAmBgkqhkiG9w0BCQEWGW5vcmVwbHlAZWxhc3RpY3Nl
+YXJjaC5jb20wHhcNMTQxMTE0MTgxNzE4WhcNMjQxMTExMTgxNzE4WjBgMQswCQYD
+VQQGEwJVUzETMBEGA1UECAwKQ2FsaWZvcm5pYTEWMBQGA1UECgwNRWxhc3RpY3Nl
+YXJjaDEQMA4GA1UECwwHVGVzdGluZzESMBAGA1UEAwwJbG9jYWxob3N0MIICIjAN
+BgkqhkiG9w0BAQEFAAOCAg8AMIICCgKCAgEAwA0VFWDCX3iBli03N7ZztrjUOJxk
+7RBpXNv252FRdmf0vKaEKKkAQCbKzhHsBCloAH4CQPDUli3B2NdFYsWjSbmszgKk
+XpTmQldgipxB/e4CXOFvVbjffjzUWHsoP6E7QiFUSJUElG9P8xUBXFFIlEWaRko1
+gKv3DXOsSjZD5FdZymwAzcSS0c/PnwJNQ4X4RPqxq+JiAVnDt1T5XMfNelj+cIcG
+jQA7SNgbwPYiNDsjwPy3xqvsgLodXXCI7mRSAxCeTRIP2UJWeEnYbuBoaA9xI8eM
+gRsXWacCq8DoPT7zsxjO+BS33ui1DbFD4km7hS7qvluAjZe+N7q7/2F80e3V5SHQ
+U7RZ8UD8/87p4eSkZQ6fErvN0wjCdRLpZ53keC0rau5n+xjM4+ITRkRjQOMaQaFC
+8hSSXN0TmDeygrbOjW3Q68AVpjKQ8GJebZI2noE9YRDqs/+KtpHDL4saqFLSG7y/
+nbhtk4M0svpzHydYXKkdwaiIhx+Fk2OflCV3+QYUl/tzBq5xX7e6oXY0WtXaJEzv
+v+GY+vGu7jUXESMZ3Ar5GX1hwTpgHHbMHEn6MKdLVsrqnf9zKEjf1OdiV1FidUIH
+rr+7x/ZEJvdWmXEupyBeKgRD1JxjWUNrHTRcZNdosa6rDdtLXa9ZDCW69oC+ySRs
+LVovkPJqPhc05HECAwEAAaOBiTCBhjAJBgNVHRMEAjAAMAsGA1UdDwQEAwIF4DAs
+BglghkgBhvhCAQ0EHxYdT3BlblNTTCBHZW5lcmF0ZWQgQ2VydGlmaWNhdGUwHQYD
+VR0OBBYEFFI+R1vjzUbuZLgEkPGPN34KtJsQMB8GA1UdIwQYMBaAFNHzAYOHho46
+77j8yuC2IMqWq2xnMA0GCSqGSIb3DQEBCwUAA4ICAQBzxw6tK5s4A91Mfak7dTlF
+5/XoioxetPJaLzunGGDE1M2YXKMzyugJREJTP6bK2FxATujJHDtCYJxjoHtygJph
+uaOWVXBa3eAn0YwSPPC68S2YoWIoqbb1dvRJCuyXzVfjgjJyGtKwHtjV8yCsoQnS
+4TU2k/ARFCdbl5wpAUpH734smOUalqooaliG+8SpnlAA+mOQk0ZhtU6ZEzwhZWnc
+HjIJu/k+LA4pLrpinmTocZXH5jij5EowdR55iyy8t2yLHf/YopQfsvvbIxJAAVp9
++6K5vdwSgIH5y4sAca1DjeeIxTJogFXuup1E0VE84zs6kMvzcA5hNyamn+wM+NPJ
+svihI4mdLYN/RBTyHF1aI0kgvLqKmmR99srM/YUpy4SvRToRaM7zwyHw5Kga1s1q
+st9wmsSKmT6wqq1uE91wUAuJp0NU6BYmmwB37cmM1SwrJuN/IFv+PF/TsIIhOQTm
+QesmvesJzr8oaJuEzVZJ70UJCfFzkKUOk80a5v1uQtBbIhdQxb07S95FnJOMnb68
+PzdUwOL3Ffnqc0A28tZSuIPGJG8rhsZU5l7g+JBdaQujJsduRnM+eIkW3KaPt6yk
+eURFGI0lMdEq4aQOORebr+i2wXxCemP4q20xACoj0hNMQnOc+SeHtQLdbdw9aLnE
+djDQYN/eKL3Jn4Itnfy+Mg==
+-----END CERTIFICATE-----
data/spec/inputs/localhost.key
ADDED
@@ -0,0 +1,51 @@
+-----BEGIN RSA PRIVATE KEY-----
+MIIJKgIBAAKCAgEAwA0VFWDCX3iBli03N7ZztrjUOJxk7RBpXNv252FRdmf0vKaE
+KKkAQCbKzhHsBCloAH4CQPDUli3B2NdFYsWjSbmszgKkXpTmQldgipxB/e4CXOFv
+VbjffjzUWHsoP6E7QiFUSJUElG9P8xUBXFFIlEWaRko1gKv3DXOsSjZD5FdZymwA
+zcSS0c/PnwJNQ4X4RPqxq+JiAVnDt1T5XMfNelj+cIcGjQA7SNgbwPYiNDsjwPy3
+xqvsgLodXXCI7mRSAxCeTRIP2UJWeEnYbuBoaA9xI8eMgRsXWacCq8DoPT7zsxjO
++BS33ui1DbFD4km7hS7qvluAjZe+N7q7/2F80e3V5SHQU7RZ8UD8/87p4eSkZQ6f
+ErvN0wjCdRLpZ53keC0rau5n+xjM4+ITRkRjQOMaQaFC8hSSXN0TmDeygrbOjW3Q
+68AVpjKQ8GJebZI2noE9YRDqs/+KtpHDL4saqFLSG7y/nbhtk4M0svpzHydYXKkd
+waiIhx+Fk2OflCV3+QYUl/tzBq5xX7e6oXY0WtXaJEzvv+GY+vGu7jUXESMZ3Ar5
+GX1hwTpgHHbMHEn6MKdLVsrqnf9zKEjf1OdiV1FidUIHrr+7x/ZEJvdWmXEupyBe
+KgRD1JxjWUNrHTRcZNdosa6rDdtLXa9ZDCW69oC+ySRsLVovkPJqPhc05HECAwEA
+AQKCAgB+al4dc1Key1DpjJvTNWsXtLQlC3U3wtzH/haZGasouKcVYrqNlSkQETjf
+ylZEKwlFgax0GNKmhDocRR9sM9IXHnxMItsVUwf6VU+8Db02q+usPcwubgHXM61H
+DNJih/vcvNmg6U5Zcqf6xzHdFbgjuWkiqYhsSUXW+fRH6U5pSMJXBx7EU4edSiBN
+d5NqRg41QZugG+UNJIw66lk2JGLyDRB2+7ppJ+TePzqNmrbLEL6pMvC5esOzvE7G
+CeRon2qIj/DTpHcAOLV4eotX+KlhDgEYXyrydOW64r9UVSfJ4N1DX6olDGY3+Y/s
+Emov0v73XPmyEcMd8OuJ/YXwRbiFxOYJQLzthbUvEnOYn4AeTF2oMC/EllHAUlVG
+bPLupK3kMY7GcOLVfdySV/cnyb/QS7Tn8q9sH2rCsKVHQdZxEjwVuFnPvxwowjb0
+3F/ssfT1H1Ib5sWVMUGImngoUo95hI9I3YSpKev0dxANeqkiMrcq8T/j5M20YhbN
+K9VVN61+4yD5xrWlwVxvNh4VKAbIXXBw4BcbR+Ich8a2NT1Sqoo2NyKCuyqf4OsN
+RyGD+9XPx65GHSs/15yT11xztdsf8Hyzfi27eSdpMxVLMXUxQxEiRhJ3U+LlKFjW
+Cevm2TGme/CCI8NOStd5zxglV8NJRHghz43O9F2w6nC/Nch40QKCAQEA+n2cG2E9
+Ww5WikjdZ/VHpAKAfpT0mji0gzcWMTe1hc1bTgU5PGf/7nL8ndpiDxIV3XQOhQIy
+sFmwnG7Rr8r6EuNBRYYRu2XltveWNi1e1HsCMzDyPuCAvy5mKSAwabZjsbakH8dx
+f7mPpZErumTDSCVRLnXZ10ip5MSlZMqv+nYtVtmwB8vMXMWTknRJGyGVKFzXwiRx
+b1RHn3YigKU60/F3hrgp8wiIj/UU8inhdQCuMeA1x1sJDRiWgY9yq50e3Hoe2Wjw
+cX2GqdrzNicJfedvKSEN7665CbJKSdYd6386Gl+mzytY5dCBm0mCqHZy9aTS8h3U
+H0Jz3wMr7CuwnQKCAQEAxEZtk2uLAGm9dIzLoVOr209xuRBKbKAu0pwVQ+YZF5s/
+LNvupstY9wHXz3S0a3zeNhV1QzKT4E1uMGthJgUUyWrFx0TvPuuhOwSGfPDlPMC2
+rmQBDeh+WdsjEI+obpGKWe0tWLQ4jK3IIwsYtq9mIhg60InDrGBGuAypZyjeCaz8
+MZiCKBSPBnbp1/FveykVNTtd80SJQWEU8PB5dKdXZiLZ+OZllOREjgji/EVBoWyv
+VLnfWqgRE0xuZgsF7Gx8b7qjN5PhNj7bt9GmddXrnCwPq63CG7ndUhpKC+PDvT6c
+MucM4JIAg3u44/aiVjjb0tO1sUXUbmcJw8i3UpcI5QKCAQEApTkNKxIswk7mzjfZ
+sqSbKJdt4hCmdsNIbfR78uLoHOWjgAb6BdojekjmT8ioPYCUY4oVua4FeUTvtX9z
+WBzubl7vwsf4Ej/YEaOltP6gOk8Y2GNEpiy3P2N/h0jwJgpkH+h0wXFwb+sZ0P+8
+dCnalU/oCFk740DOr1L2NVFsWixxI4RbFAldNyQrfsKtJfQ0ynbS6f/XwrM4uvnV
+MD3MW9g+GuKG6QOL8EicFE+DowVb4RIe5uwpQDYjsDnKTWBJ5uu2RXluf61okckV
++3YsUJvDOsHOy3XYH4k2bxWIjrlQveyxvpy8+nlZw+/s+umGUnjxmzoJnMTcYRFa
+e2EmEQKCAQEAvsO44oym6Drdobbqf51ELn7TiExWGafCen6riHfOsYv5Zg9IsCJ6
+EHFhIMhMRyBxFV3bv/kbkumPDE6BeKN9pZo6KkhMw/nele77C9pS465mn66g+7SZ
+gZokRYdq7DRWpLqJ1WosgEaze6PgXEaz3LVyDJepcBOPCHl9+L7Wt87Cuy2Aa5Y3
+wM+4zmJhuPGgJEHUOYnGYZ4K7Xa8hW3T23hKJMlBt+n425e1jf1+IXfyHUYe2Qz6
+s6gYsONL2ZNDipB67Jy1SrTjdpC02liVjA+jF3hlbe9DNfZJO68e4wNEa5D3ihk9
+mI2akS54b5cJyKpuMKMvKjLJYdnmm8YiJQKCAQEA+GDeA5sPp2IfRRgUBrxQUQOY
+BmHAE8dOrooY94JQNB4OFWmTqrFw2chVgGJe9SQLJSPu8UxBlqbb3RiWy4yZLuoS
+53SI6atzz4U7+DLVj+Xvw5RRxI09ezGah/q10snSG4mQ+mo2ZfLcamHX4Y8aLPXP
+HgnpA86LDV3+A+0fuDQCr6MLkmA1xonldC1tT9vQbQlu57t2lvKug2n38w1qZ9qa
+upYqJUTRI54REfNETItakBs3bXwDHuMCpx4wcK1vnsw+sdnkoW6TvHBaulR4u4rW
+9SYvRHTJWbj3PhHUvet/2E4V60B5kKSm0obd5UeHYYR/LEheacnbB3mnAv4ahw==
+-----END RSA PRIVATE KEY-----
metadata
ADDED
@@ -0,0 +1,165 @@
+--- !ruby/object:Gem::Specification
+name: logstash-input-couchdb_changes
+version: !ruby/object:Gem::Version
+  version: 0.1.1
+platform: ruby
+authors:
+- Elasticsearch
+autorequire:
+bindir: bin
+cert_chain: []
+date: 2015-01-21 00:00:00.000000000 Z
+dependencies:
+- !ruby/object:Gem::Dependency
+  name: logstash
+  version_requirements: !ruby/object:Gem::Requirement
+    requirements:
+    - - '>='
+      - !ruby/object:Gem::Version
+        version: 1.4.0
+    - - <
+      - !ruby/object:Gem::Version
+        version: 2.0.0
+  requirement: !ruby/object:Gem::Requirement
+    requirements:
+    - - '>='
+      - !ruby/object:Gem::Version
+        version: 1.4.0
+    - - <
+      - !ruby/object:Gem::Version
+        version: 2.0.0
+  prerelease: false
+  type: :runtime
+- !ruby/object:Gem::Dependency
+  name: logstash-codec-plain
+  version_requirements: !ruby/object:Gem::Requirement
+    requirements:
+    - - '>='
+      - !ruby/object:Gem::Version
+        version: '0'
+  requirement: !ruby/object:Gem::Requirement
+    requirements:
+    - - '>='
+      - !ruby/object:Gem::Version
+        version: '0'
+  prerelease: false
+  type: :runtime
+- !ruby/object:Gem::Dependency
+  name: ftw
+  version_requirements: !ruby/object:Gem::Requirement
+    requirements:
+    - - '>='
+      - !ruby/object:Gem::Version
+        version: 0.0.41
+  requirement: !ruby/object:Gem::Requirement
+    requirements:
+    - - '>='
+      - !ruby/object:Gem::Version
+        version: 0.0.41
+  prerelease: false
+  type: :runtime
+- !ruby/object:Gem::Dependency
+  name: json
+  version_requirements: !ruby/object:Gem::Requirement
+    requirements:
+    - - '>='
+      - !ruby/object:Gem::Version
+        version: '0'
+  requirement: !ruby/object:Gem::Requirement
+    requirements:
+    - - '>='
+      - !ruby/object:Gem::Version
+        version: '0'
+  prerelease: false
+  type: :runtime
+- !ruby/object:Gem::Dependency
+  name: ftw
+  version_requirements: !ruby/object:Gem::Requirement
+    requirements:
+    - - '>='
+      - !ruby/object:Gem::Version
+        version: 0.0.41
+  requirement: !ruby/object:Gem::Requirement
+    requirements:
+    - - '>='
+      - !ruby/object:Gem::Version
+        version: 0.0.41
+  prerelease: false
+  type: :development
+- !ruby/object:Gem::Dependency
+  name: logstash-devutils
+  version_requirements: !ruby/object:Gem::Requirement
+    requirements:
+    - - '>='
+      - !ruby/object:Gem::Version
+        version: 0.0.6
+  requirement: !ruby/object:Gem::Requirement
+    requirements:
+    - - '>='
+      - !ruby/object:Gem::Version
+        version: 0.0.6
+  prerelease: false
+  type: :development
+- !ruby/object:Gem::Dependency
+  name: logstash-output-elasticsearch
+  version_requirements: !ruby/object:Gem::Requirement
+    requirements:
+    - - '>='
+      - !ruby/object:Gem::Version
+        version: '0'
+  requirement: !ruby/object:Gem::Requirement
+    requirements:
+    - - '>='
+      - !ruby/object:Gem::Version
+        version: '0'
+  prerelease: false
+  type: :development
+description: This gem is a logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/plugin install gemname. This gem is not a stand-alone program
+email: info@elasticsearch.com
+executables: []
+extensions: []
+extra_rdoc_files: []
+files:
+- .gitignore
+- DEVELOPER.md
+- Gemfile
+- LICENSE
+- README.md
+- Rakefile
+- lib/logstash/inputs/couchdb_changes.rb
+- logstash-input-couchdb_changes.gemspec
+- spec/inputs/ca_cert.pem
+- spec/inputs/couchdb_changes_spec.rb
+- spec/inputs/localhost.cert
+- spec/inputs/localhost.key
+homepage: http://www.elasticsearch.org/guide/en/logstash/current/index.html
+licenses:
+- Apache License (2.0)
+metadata:
+  logstash_plugin: 'true'
+  logstash_group: input
+post_install_message:
+rdoc_options: []
+require_paths:
+- lib
+required_ruby_version: !ruby/object:Gem::Requirement
+  requirements:
+  - - '>='
+    - !ruby/object:Gem::Version
+      version: '0'
+required_rubygems_version: !ruby/object:Gem::Requirement
+  requirements:
+  - - '>='
+    - !ruby/object:Gem::Version
+      version: '0'
+requirements: []
+rubyforge_project:
+rubygems_version: 2.1.9
+signing_key:
+specification_version: 4
+summary: This input captures the _changes stream from a CouchDB instance
+test_files:
+- spec/inputs/ca_cert.pem
+- spec/inputs/couchdb_changes_spec.rb
+- spec/inputs/localhost.cert
+- spec/inputs/localhost.key