logstash-input-elasticsearch 0.1.1 → 0.1.2
- checksums.yaml +4 -4
- data/CONTRIBUTORS +17 -0
- data/LICENSE +1 -1
- data/README.md +95 -0
- data/lib/logstash/inputs/elasticsearch.rb +76 -18
- data/logstash-input-elasticsearch.gemspec +1 -1
- data/spec/inputs/elasticsearch_spec.rb +198 -26
- metadata +23 -21
checksums.yaml CHANGED

@@ -1,7 +1,7 @@
 ---
 SHA1:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 46a597e767e0d6857aa3f7554342dcaf1bad8d3c
+  data.tar.gz: 1759b3870f1fcb3ce6d34189c08d6b63190229e2
 SHA512:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: bcaa1593c02088b7e63933edc8439cd76403916ed1ad9bc2fdb079aaf27c68880fae5e697163588cc19d72b3f0ea3ef0de5ee4f082f738bcfcbb9e00e00db986
+  data.tar.gz: 2fa3384a7a827853bd1afa72267d02536986921a7bd1cddc57dfe1537b630443a12e4fdcb6ed1645a912158bcf716914dc37d1ce81807cd90c694f0f83a9cebf
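As an aside, these digests cover the two archives packed inside the published `.gem` file (a `.gem` is a plain tar containing `metadata.gz` and `data.tar.gz`). A minimal sketch of recomputing them with Ruby's standard library, assuming both files have already been extracted into the current directory:

```ruby
# Sketch: recompute the checksums above from an unpacked .gem archive.
# Assumes metadata.gz and data.tar.gz sit in the current directory.
require "digest"

%w[metadata.gz data.tar.gz].each do |name|
  puts "SHA1   #{name}: #{Digest::SHA1.file(name).hexdigest}"
  puts "SHA512 #{name}: #{Digest::SHA512.file(name).hexdigest}"
end
```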
data/CONTRIBUTORS ADDED

@@ -0,0 +1,17 @@
+The following is a list of people who have contributed ideas, code, bug
+reports, or in general have helped logstash along its way.
+
+Contributors:
+* Colin Surprenant (colinsurprenant)
+* Jonathan Van Eenwyk (jdve)
+* Jordan Sissel (jordansissel)
+* João Duarte (jsvd)
+* Kurt Hurtado (kurtado)
+* Pier-Hugues Pellerin (ph)
+* Richard Pijnenburg (electrical)
+* Suyog Rao (suyograo)
+
+Note: If you've sent us patches, bug reports, or otherwise contributed to
+Logstash, and you aren't on the list above and want to be, please let us know
+and we'll make sure you're here. Contributions from folks like you are what make
+open source awesome.
data/LICENSE CHANGED

@@ -1,4 +1,4 @@
-Copyright (c) 2012-
+Copyright (c) 2012-2015 Elasticsearch <http://www.elasticsearch.org>
 
 Licensed under the Apache License, Version 2.0 (the "License");
 you may not use this file except in compliance with the License.
data/README.md ADDED

@@ -0,0 +1,95 @@
+# Logstash Plugin
+
+This is a plugin for [Logstash](https://github.com/elasticsearch/logstash).
+
+It is fully free and fully open source. The license is Apache 2.0, meaning you are pretty much free to use it however you want in whatever way.
+
+## Documentation
+
+Logstash provides infrastructure to automatically generate documentation for this plugin. We use the asciidoc format to write documentation, so any comments in the source code will be converted first into asciidoc and then into html. All plugin documentation is placed under one [central location](http://www.elasticsearch.org/guide/en/logstash/current/).
+
+- For formatting code or config examples, you can use the asciidoc `[source,ruby]` directive
+- For more asciidoc formatting tips, see the excellent reference here https://github.com/elasticsearch/docs#asciidoc-guide
+
+## Need Help?
+
+Need help? Try #logstash on freenode IRC or the logstash-users@googlegroups.com mailing list.
+
+## Developing
+
+### 1. Plugin Development and Testing
+
+#### Code
+- To get started, you'll need JRuby with the Bundler gem installed.
+
+- Create a new plugin or clone an existing one from the GitHub [logstash-plugins](https://github.com/logstash-plugins) organization.
+
+- Install dependencies
+```sh
+bundle install
+```
+
+#### Test
+
+```sh
+bundle exec rspec
+```
+
+The Logstash code required to run the tests/specs is specified in the `Gemfile` by a line similar to:
+```ruby
+gem "logstash", :github => "elasticsearch/logstash", :branch => "1.5"
+```
+To test against another version or a local Logstash, edit the `Gemfile` to specify an alternative location, for example:
+```ruby
+gem "logstash", :github => "elasticsearch/logstash", :ref => "master"
+```
+```ruby
+gem "logstash", :path => "/your/local/logstash"
+```
+
+Then update your dependencies and run your tests:
+
+```sh
+bundle install
+bundle exec rspec
+```
+
+### 2. Running your unpublished Plugin in Logstash
+
+#### 2.1 Run in a local Logstash clone
+
+- Edit Logstash `tools/Gemfile` and add the local plugin path, for example:
+```ruby
+gem "logstash-filter-awesome", :path => "/your/local/logstash-filter-awesome"
+```
+- Update Logstash dependencies
+```sh
+rake vendor:gems
+```
+- Run Logstash with your plugin
+```sh
+bin/logstash -e 'filter {awesome {}}'
+```
+At this point any modifications to the plugin code will be applied to this local Logstash setup. After modifying the plugin, simply rerun Logstash.
+
+#### 2.2 Run in an installed Logstash
+
+- Build your plugin gem
+```sh
+gem build logstash-filter-awesome.gemspec
+```
+- Install the plugin from the Logstash home
+```sh
+bin/plugin install /your/local/plugin/logstash-filter-awesome.gem
+```
+- Start Logstash and proceed to test the plugin
+
+## Contributing
+
+All contributions are welcome: ideas, patches, documentation, bug reports, complaints, and even something you drew up on a napkin.
+
+Programming is not a required skill. Whatever you've seen about open source and maintainers or community members saying "send patches or die" - you will not see that here.
+
+It is more important to me that you are able to contribute.
+
+For more information about contributing, see the [CONTRIBUTING](https://github.com/elasticsearch/logstash/blob/master/CONTRIBUTING.md) file.
data/lib/logstash/inputs/elasticsearch.rb CHANGED

@@ -18,12 +18,16 @@ require "base64"
 #
 # This would create an Elasticsearch query with the following format:
 # [source,json]
-#     http://localhost:9200/logstash-*/_search
+#     curl 'http://localhost:9200/logstash-*/_search?&scroll=1m&size=1000' -d '{
+#       "query": {
+#         "match": {
+#           "statuscode": 200
+#         }
+#       }
+#     }'
 #
-# TODO(sissel): Option to keep the index, type, and doc id so we can do reindexing?
 class LogStash::Inputs::Elasticsearch < LogStash::Inputs::Base
   config_name "elasticsearch"
-  milestone 1
 
   default :codec, "json"
 
@@ -37,7 +41,7 @@ class LogStash::Inputs::Elasticsearch < LogStash::Inputs::Base
   config :index, :validate => :string, :default => "logstash-*"
 
   # The query to be executed.
-  config :query, :validate => :string, :default => "
+  config :query, :validate => :string, :default => '{"query": { "match_all": {} } }'
 
   # Enable the Elasticsearch "scan" search type. This will disable
   # sorting but increase speed and performance.
 
@@ -51,6 +55,45 @@ class LogStash::Inputs::Elasticsearch < LogStash::Inputs::Base
   # round trip (i.e. between the previous scan scroll request, to the next).
   config :scroll, :validate => :string, :default => "1m"
 
+  # If set, include Elasticsearch document information such as index, type, and
+  # the id in the event.
+  #
+  # It might be important to note, with regards to metadata, that if you're
+  # ingesting documents with the intent to re-index them (or just update them)
+  # that the `action` option in the elasticsearch output wants to know how to
+  # handle those things. It can be dynamically assigned with a field
+  # added to the metadata.
+  #
+  # Example
+  # [source, ruby]
+  #     input {
+  #       elasticsearch {
+  #         host => "es.production.mysite.org"
+  #         index => "mydata-2018.09.*"
+  #         query => "*"
+  #         size => 500
+  #         scroll => "5m"
+  #         docinfo => true
+  #       }
+  #     }
+  #     output {
+  #       elasticsearch {
+  #         index => "copy-of-production.%{[@metadata][_index]}"
+  #         index_type => "%{[@metadata][_type]}"
+  #         document_id => "%{[@metadata][_id]}"
+  #       }
+  #     }
+  #
+  config :docinfo, :validate => :boolean, :default => false
+
+  # Where to move the Elasticsearch document information. By default we use the @metadata field.
+  config :docinfo_target, :validate => :string, :default => "@metadata"
+
+  # List of document metadata to move to the `docinfo_target` field.
+  # To learn more about Elasticsearch metadata fields read
+  # http://www.elasticsearch.org/guide/en/elasticsearch/guide/current/_document_metadata.html
+  config :docinfo_fields, :validate => :array, :default => ['_index', '_type', '_id']
+
   # Basic Auth - username
   config :user, :validate => :string
 
@@ -68,10 +111,10 @@ class LogStash::Inputs::Elasticsearch < LogStash::Inputs::Base
     require "elasticsearch"
 
     @options = {
-      index: @index,
-      body: @query,
-      scroll: @scroll,
-      size: @size
+      :index => @index,
+      :body => @query,
+      :scroll => @scroll,
+      :size => @size
     }
 
     @options[:search_type] = 'scan' if @scan
 
@@ -80,28 +123,27 @@ class LogStash::Inputs::Elasticsearch < LogStash::Inputs::Base
 
     if @user && @password
       token = Base64.strict_encode64("#{@user}:#{@password.value}")
-      transport_options[:headers] = { Authorization: "Basic #{token}" }
+      transport_options[:headers] = { :Authorization => "Basic #{token}" }
     end
 
     hosts = if @ssl then
-      @hosts.map {|h| { host: h, scheme: 'https' } }
+      @hosts.map { |h| { :host => h, :scheme => 'https' } }
     else
       @hosts
     end
 
     if @ssl && @ca_file
-      transport_options[:ssl] = { ca_file: @ca_file }
+      transport_options[:ssl] = { :ca_file => @ca_file }
     end
 
-    @client = Elasticsearch::Client.new
-
-
-
+    @client = Elasticsearch::Client.new(:hosts => hosts, :transport_options => transport_options)
+  end
+
   public
   def run(output_queue)
 
     # get first wave of data
-    r = @client.search
+    r = @client.search(@options)
 
     # since 'scan' doesn't return data on the search call, do an extra scroll
     if @scan
 
@@ -109,8 +151,24 @@ class LogStash::Inputs::Elasticsearch < LogStash::Inputs::Base
     end
 
     while r['hits']['hits'].any? do
-      r['hits']['hits'].each do |
+      r['hits']['hits'].each do |hit|
+        event = LogStash::Event.new(hit['_source'])
         decorate(event)
+
+        if @docinfo
+          event[@docinfo_target] ||= {}
+
+          unless event[@docinfo_target].is_a?(Hash)
+            @logger.error("Elasticsearch Input: Incompatible Event, incompatible type for the `@metadata` field in the `_source` document, expected a hash got:", :metadata_type => event[@docinfo_target].class)
+
+            raise Exception.new("Elasticsearch input: incompatible event")
+          end
+
+          @docinfo_fields.each do |field|
+            event[@docinfo_target][field] = hit[field]
+          end
+        end
+
         output_queue << event
       end
       r = scroll_request(r['_scroll_id'])
 
@@ -119,6 +177,6 @@ class LogStash::Inputs::Elasticsearch < LogStash::Inputs::Base
 
   private
   def scroll_request scroll_id
-    @client.scroll(body: scroll_id, scroll: @scroll)
+    @client.scroll(:body => scroll_id, :scroll => @scroll)
   end
 end # class LogStash::Inputs::Elasticsearch
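Taken together, the rewritten `register`/`run` methods implement the standard scan-and-scroll consumption pattern. Below is a minimal standalone sketch of that pattern using the same elasticsearch-ruby calls as the plugin; the host, index, and size values are placeholders:

```ruby
# Standalone sketch of the scan+scroll loop implemented above, using the
# same client calls (one search, then repeated scrolls) as the diff.
require "elasticsearch"

client = Elasticsearch::Client.new(:hosts => ["localhost:9200"])

options = {
  :index  => "logstash-*",
  :body   => '{"query": { "match_all": {} } }',
  :scroll => "1m",
  :size   => 1000,
  :search_type => "scan"  # 'scan' returns no hits on the initial search...
}

r = client.search(options)
r = client.scroll(:body => r['_scroll_id'], :scroll => "1m")  # ...so scroll once first

while r['hits']['hits'].any?
  r['hits']['hits'].each do |hit|
    puts hit['_source']  # the plugin wraps this in a LogStash::Event and decorates it
  end
  r = client.scroll(:body => r['_scroll_id'], :scroll => "1m")
end
```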
data/logstash-input-elasticsearch.gemspec CHANGED

@@ -1,7 +1,7 @@
 Gem::Specification.new do |s|
 
   s.name            = 'logstash-input-elasticsearch'
-  s.version         = '0.1.1'
+  s.version         = '0.1.2'
   s.licenses        = ['Apache License (2.0)']
   s.summary         = "Read from an Elasticsearch cluster, based on search query results"
   s.description     = "This gem is a logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/plugin install gemname. This gem is not a stand-alone program"
data/spec/inputs/elasticsearch_spec.rb CHANGED

@@ -1,11 +1,10 @@
+# encoding: utf-8
 require "logstash/devutils/rspec/spec_helper"
 require "logstash/inputs/elasticsearch"
 require "elasticsearch"
 
 describe "inputs/elasticsearch" do
-
   it "should retrieve json event from elasticsearch" do
-
     config = %q[
       input {
         elasticsearch {
 
@@ -33,7 +32,7 @@ describe "inputs/elasticsearch" do
           "_type" => "logs",
           "_id" => "C5b2xLQwTZa76jBmHIbwHQ",
           "_score" => 1.0,
-          "
+          "_source" => { "message" => ["ohayo"] }
         } ]
       }
     }
 
@@ -46,23 +45,15 @@ describe "inputs/elasticsearch" do
     client = Elasticsearch::Client.new
     expect(Elasticsearch::Client).to receive(:new).with(any_args).and_return(client)
     expect(client).to receive(:search).with(any_args).and_return(response)
-    expect(client).to receive(:scroll).with({:body=>"cXVlcnlUaGVuRmV0Y2g", :scroll=>"1m"}).and_return(scroll_reponse)
-
-    pipeline = LogStash::Pipeline.new(config)
-    queue = Queue.new
-    pipeline.instance_eval do
-      @output_func = lambda { |event| queue << event }
-    end
-    pipeline_thread = Thread.new { pipeline.run }
-    event = queue.pop
+    expect(client).to receive(:scroll).with({ :body => "cXVlcnlUaGVuRmV0Y2g", :scroll => "1m" }).and_return(scroll_reponse)
 
-
+    event = fetch_event(config)
 
-
+    insist { event }.is_a?(LogStash::Event)
+    insist { event["message"] } == [ "ohayo" ]
   end
 
   it "should retrieve json event from elasticsearch with scan" do
-
     config = %q[
       input {
         elasticsearch {
 
@@ -95,7 +86,7 @@ describe "inputs/elasticsearch" do
           "_type" => "logs",
          "_id" => "C5b2xLQwTZa76jBmHIbwHQ",
           "_score" => 1.0,
-          "
+          "_source" => { "message" => ["ohayo"] }
         } ]
       }
     },
 
@@ -108,19 +99,200 @@ describe "inputs/elasticsearch" do
     client = Elasticsearch::Client.new
     expect(Elasticsearch::Client).to receive(:new).with(any_args).and_return(client)
     expect(client).to receive(:search).with(any_args).and_return(scan_response)
-    expect(client).to receive(:scroll).with({:body=>"DcrY3G1xff6SB", :scroll=>"1m"}).and_return(scroll_responses.first)
-    expect(client).to receive(:scroll).with({:body=>"cXVlcnlUaGVuRmV0Y2g", :scroll=>"1m"}).and_return(scroll_responses.last)
+    expect(client).to receive(:scroll).with({ :body => "DcrY3G1xff6SB", :scroll => "1m" }).and_return(scroll_responses.first)
+    expect(client).to receive(:scroll).with({ :body => "cXVlcnlUaGVuRmV0Y2g", :scroll => "1m" }).and_return(scroll_responses.last)
+
+    event = fetch_event(config)
+
+    insist { event }.is_a?(LogStash::Event)
+    insist { event["message"] } == [ "ohayo" ]
+  end
 
-
-
-
-
-    pipeline_thread = Thread.new { pipeline.run }
-    event = queue.pop
-
+  context "with Elasticsearch document information" do
+    let!(:response) do
+      {
+        "_scroll_id" => "cXVlcnlUaGVuRmV0Y2g",
+        "took" => 27,
+        "timed_out" => false,
+        "_shards" => {
+          "total" => 169,
+          "successful" => 169,
+          "failed" => 0
+        },
+        "hits" => {
+          "total" => 1,
+          "max_score" => 1.0,
+          "hits" => [ {
+            "_index" => "logstash-2014.10.12",
+            "_type" => "logs",
+            "_id" => "C5b2xLQwTZa76jBmHIbwHQ",
+            "_score" => 1.0,
+            "_source" => {
+              "message" => ["ohayo"],
+              "metadata_with_hash" => { "awesome" => "logstash" },
+              "metadata_with_string" => "a string"
+            }
+          } ]
+        }
+      }
     end
 
+    let(:scroll_reponse) do
+      {
+        "_scroll_id" => "r453Wc1jh0caLJhSDg",
+        "hits" => { "hits" => [] }
+      }
+    end
+
+    let(:client) { Elasticsearch::Client.new }
+
+    before do
+      expect(Elasticsearch::Client).to receive(:new).with(any_args).and_return(client)
+      expect(client).to receive(:search).with(any_args).and_return(response)
+      allow(client).to receive(:scroll).with({ :body => "cXVlcnlUaGVuRmV0Y2g", :scroll => "1m" }).and_return(scroll_reponse)
+    end
+
+    context 'when defining docinfo' do
+      let(:config_metadata) do
+        %q[
+          input {
+            elasticsearch {
+              hosts => ["node01"]
+              scan => false
+              query => '{ "query": { "match": { "city_name": "Okinawa" } }, "fields": ["message"] }'
+              docinfo => true
+            }
+          }
+        ]
+      end
+
+      it 'merges the values if the `docinfo_target` already exists in the `_source` document' do
+        metadata_field = 'metadata_with_hash'
+
+        config_metadata_with_hash = %Q[
+          input {
+            elasticsearch {
+              hosts => ["node01"]
+              scan => false
+              query => '{ "query": { "match": { "city_name": "Okinawa" } }, "fields": ["message"] }'
+              docinfo => true
+              docinfo_target => '#{metadata_field}'
+            }
+          }
+        ]
+
+        event = fetch_event(config_metadata_with_hash)
+
+        expect(event[metadata_field]["_index"]).to eq('logstash-2014.10.12')
+        expect(event[metadata_field]["_type"]).to eq('logs')
+        expect(event[metadata_field]["_id"]).to eq('C5b2xLQwTZa76jBmHIbwHQ')
+        expect(event[metadata_field]["awesome"]).to eq("logstash")
+      end
+
+      it 'throws an exception if the `docinfo_target` exists but is not of type hash' do
+        metadata_field = 'metadata_with_string'
+
+        config_metadata_with_string = %Q[
+          input {
+            elasticsearch {
+              hosts => ["node01"]
+              scan => false
+              query => '{ "query": { "match": { "city_name": "Okinawa" } }, "fields": ["message"] }'
+              docinfo => true
+              docinfo_target => '#{metadata_field}'
+            }
+          }
+        ]
+
+        pipeline = LogStash::Pipeline.new(config_metadata_with_string)
+        queue = Queue.new
+        pipeline.instance_eval do
+          @output_func = lambda { |event| queue << event }
+        end
+
+        expect { pipeline.run }.to raise_error(Exception, /incompatible event/)
+      end
+
+      it "should move the document info to the @metadata field" do
+        event = fetch_event(config_metadata)
 
-
+        expect(event["[@metadata][_index]"]).to eq('logstash-2014.10.12')
+        expect(event["[@metadata][_type]"]).to eq('logs')
+        expect(event["[@metadata][_id]"]).to eq('C5b2xLQwTZa76jBmHIbwHQ')
+      end
+
+      it 'should move the document information to the specified field' do
+        config = %q[
+          input {
+            elasticsearch {
+              hosts => ["node01"]
+              scan => false
+              query => '{ "query": { "match": { "city_name": "Okinawa" } }, "fields": ["message"] }'
+              docinfo => true
+              docinfo_target => 'meta'
+            }
+          }
+        ]
+        event = fetch_event(config)
+
+        expect(event["[meta][_index]"]).to eq('logstash-2014.10.12')
+        expect(event["[meta][_type]"]).to eq('logs')
+        expect(event["[meta][_id]"]).to eq('C5b2xLQwTZa76jBmHIbwHQ')
+      end
+
+      it "should allow specifying which fields from the document info to save to the @metadata field" do
+        fields = ["_index"]
+        config = %Q[
+          input {
+            elasticsearch {
+              hosts => ["node01"]
+              scan => false
+              query => '{ "query": { "match": { "city_name": "Okinawa" } }, "fields": ["message"] }'
+              docinfo => true
+              docinfo_fields => #{fields}
+            }
+          }]
+
+        event = fetch_event(config)
+
+        expect(event["@metadata"].keys).to eq(fields)
+        expect(event["[@metadata][_type]"]).to eq(nil)
+        expect(event["[@metadata][_index]"]).to eq('logstash-2014.10.12')
+        expect(event["[@metadata][_id]"]).to eq(nil)
+      end
+    end
+
+    context "when not defining the docinfo" do
+      it 'should keep the document information in the root of the event' do
+        config = %q[
+          input {
+            elasticsearch {
+              hosts => ["node01"]
+              scan => false
+              query => '{ "query": { "match": { "city_name": "Okinawa" } }, "fields": ["message"] }'
+            }
+          }
+        ]
+        event = fetch_event(config)
+
+        expect(event["[@metadata][_index]"]).to eq(nil)
+        expect(event["[@metadata][_type]"]).to eq(nil)
+        expect(event["[@metadata][_id]"]).to eq(nil)
+      end
+    end
   end
 end
+
+def fetch_event(config)
+  pipeline = LogStash::Pipeline.new(config)
+  queue = Queue.new
+  pipeline.instance_eval do
+    @output_func = lambda { |event| queue << event }
+  end
+  pipeline_thread = Thread.new { pipeline.run }
+  event = queue.pop
+
+  pipeline_thread.join
+
+  return event
+end
metadata CHANGED

@@ -1,18 +1,17 @@
 --- !ruby/object:Gem::Specification
 name: logstash-input-elasticsearch
 version: !ruby/object:Gem::Version
-  version: 0.1.1
+  version: 0.1.2
 platform: ruby
 authors:
 - Elasticsearch
 autorequire:
 bindir: bin
 cert_chain: []
-date:
+date: 2015-01-27 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
-  name: logstash
-  version_requirements: !ruby/object:Gem::Requirement
+  requirement: !ruby/object:Gem::Requirement
     requirements:
     - - '>='
     - !ruby/object:Gem::Version
 
@@ -20,7 +19,10 @@ dependencies:
     - - <
     - !ruby/object:Gem::Version
       version: 2.0.0
-  requirement: !ruby/object:Gem::Requirement
+  name: logstash
+  prerelease: false
+  type: :runtime
+  version_requirements: !ruby/object:Gem::Requirement
     requirements:
     - - '>='
     - !ruby/object:Gem::Version
 
@@ -28,11 +30,8 @@ dependencies:
     - - <
     - !ruby/object:Gem::Version
       version: 2.0.0
-  prerelease: false
-  type: :runtime
 - !ruby/object:Gem::Dependency
-  name: elasticsearch
-  version_requirements: !ruby/object:Gem::Requirement
+  requirement: !ruby/object:Gem::Requirement
     requirements:
     - - '>='
     - !ruby/object:Gem::Version
 
@@ -40,7 +39,10 @@ dependencies:
     - - ~>
     - !ruby/object:Gem::Version
       version: '1.0'
-  requirement: !ruby/object:Gem::Requirement
+  name: elasticsearch
+  prerelease: false
+  type: :runtime
+  version_requirements: !ruby/object:Gem::Requirement
     requirements:
     - - '>='
     - !ruby/object:Gem::Version
 
@@ -48,36 +50,34 @@ dependencies:
     - - ~>
     - !ruby/object:Gem::Version
       version: '1.0'
-  prerelease: false
-  type: :runtime
 - !ruby/object:Gem::Dependency
-  name: logstash-codec-json
-  version_requirements: !ruby/object:Gem::Requirement
-    requirements:
-    - - '>='
-    - !ruby/object:Gem::Version
-      version: '0'
   requirement: !ruby/object:Gem::Requirement
     requirements:
     - - '>='
     - !ruby/object:Gem::Version
       version: '0'
+  name: logstash-codec-json
   prerelease: false
   type: :runtime
-- !ruby/object:Gem::Dependency
-  name: logstash-devutils
   version_requirements: !ruby/object:Gem::Requirement
     requirements:
     - - '>='
     - !ruby/object:Gem::Version
       version: '0'
+- !ruby/object:Gem::Dependency
   requirement: !ruby/object:Gem::Requirement
     requirements:
    - - '>='
     - !ruby/object:Gem::Version
       version: '0'
+  name: logstash-devutils
   prerelease: false
   type: :development
+  version_requirements: !ruby/object:Gem::Requirement
+    requirements:
+    - - '>='
+    - !ruby/object:Gem::Version
+      version: '0'
 description: This gem is a logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/plugin install gemname. This gem is not a stand-alone program
 email: info@elasticsearch.com
 executables: []
 
@@ -85,8 +85,10 @@ extensions: []
 extra_rdoc_files: []
 files:
 - .gitignore
+- CONTRIBUTORS
 - Gemfile
 - LICENSE
+- README.md
 - Rakefile
 - lib/logstash/inputs/elasticsearch.rb
 - logstash-input-elasticsearch.gemspec
 
@@ -113,7 +115,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
       version: '0'
 requirements: []
 rubyforge_project:
-rubygems_version: 2.
+rubygems_version: 2.1.9
 signing_key:
 specification_version: 4
 summary: Read from an Elasticsearch cluster, based on search query results