logstash-input-elasticsearch 4.8.1 → 4.9.0
- checksums.yaml +4 -4
- data/CHANGELOG.md +9 -0
- data/CONTRIBUTORS +1 -0
- data/docs/index.asciidoc +29 -3
- data/lib/logstash/inputs/elasticsearch.rb +12 -1
- data/logstash-input-elasticsearch.gemspec +2 -1
- data/spec/inputs/elasticsearch_spec.rb +76 -42
- metadata +17 -3
checksums.yaml
CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 4091feeb0b3bf292cfb9afcde7496a72f95168eb570b21d29064ade174bb1352
+  data.tar.gz: cfd02af050bb495dceea16b4ffe5daa6828088d2bd7994639ee08374f87da69d
 SHA512:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 873680bea22204e65d519d310f3952f26fd672ce336504e45e99b13988e2da2794b7c05f83b3e1b7a87f317eb51c079759f9df515dc99236577417ecfd379aa1
+  data.tar.gz: 85c66a665eb7a3b503ab7cec64ea2f24b0e16829c4b78a75560d69c6272b1861583dad41e04ea1157ce58435714a86eaa6f4d71b5a4a724edfb9604f77150b99
data/CHANGELOG.md
CHANGED
@@ -1,3 +1,12 @@
+## 4.9.0
+  - Added `target` option, allowing the hit's source to target a specific field instead of being expanded at the root of the event. This allows the input to play nicer with the Elastic Common Schema when the input does not follow the schema. [#117](https://github.com/logstash-plugins/logstash-input-elasticsearch/issues/117)
+
+## 4.8.3
+  - [DOC] Fixed links to restructured Logstash-to-cloud docs [#139](https://github.com/logstash-plugins/logstash-input-elasticsearch/pull/139)
+
+## 4.8.2
+  - [DOC] Document the permissions required in secured clusters [#137](https://github.com/logstash-plugins/logstash-input-elasticsearch/pull/137)
+
 ## 4.8.1
   - Fixed connection error when using multiple `slices`. [#133](https://github.com/logstash-plugins/logstash-input-elasticsearch/issues/133)
data/CONTRIBUTORS
CHANGED
data/docs/index.asciidoc
CHANGED
@@ -77,6 +77,11 @@ Authentication to a secure Elasticsearch cluster is possible using _one_ of the
 * <<plugins-{type}s-{plugin}-cloud_auth>>
 * <<plugins-{type}s-{plugin}-api_key>>
 
+[id="plugins-{type}s-{plugin}-autz"]
+==== Authorization
+
+Authorization to a secure Elasticsearch cluster requires `read` permission at index level and `monitoring` permissions at cluster level.
+The `monitoring` permission at cluster level is necessary to perform periodic connectivity checks.
 
 [id="plugins-{type}s-{plugin}-options"]
 ==== Elasticsearch Input Configuration Options
@@ -106,6 +111,7 @@ This plugin supports the following configuration options plus the <<plugins-{typ
 | <<plugins-{type}s-{plugin}-slices>> |<<number,number>>|No
 | <<plugins-{type}s-{plugin}-ssl>> |<<boolean,boolean>>|No
 | <<plugins-{type}s-{plugin}-socket_timeout_seconds>> | <<number,number>>|No
+| <<plugins-{type}s-{plugin}-target>> | https://www.elastic.co/guide/en/logstash/master/field-references-deepdive.html[field reference] | No
 | <<plugins-{type}s-{plugin}-user>> |<<string,string>>|No
 |=======================================================================
 
@@ -122,7 +128,10 @@ input plugins.
 
 Authenticate using Elasticsearch API key. Note that this option also requires enabling the `ssl` option.
 
-Format is `id:api_key` where `id` and `api_key` are as returned by the
+Format is `id:api_key` where `id` and `api_key` are as returned by the
+Elasticsearch
+https://www.elastic.co/guide/en/elasticsearch/reference/current/security-api-create-api-key.html[Create
+API key API].
 
 [id="plugins-{type}s-{plugin}-ca_file"]
 ===== `ca_file`
@@ -140,7 +149,9 @@ SSL Certificate Authority file in PEM encoded format, must also include any chai
 
 Cloud authentication string ("<username>:<password>" format) is an alternative for the `user`/`password` pair.
 
-For more info, check out the
+For more info, check out the
+https://www.elastic.co/guide/en/logstash/current/connecting-to-cloud.html[Logstash-to-Cloud
+documentation]
 
 [id="plugins-{type}s-{plugin}-cloud_id"]
 ===== `cloud_id`
@@ -150,7 +161,9 @@ For more info, check out the https://www.elastic.co/guide/en/logstash/current/co
 
 Cloud ID, from the Elastic Cloud web console. If set `hosts` should not be used.
 
-For more info, check out the
+For more info, check out the
+https://www.elastic.co/guide/en/logstash/current/connecting-to-cloud.html[Logstash-to-Cloud
+documentation]
 
 [id="plugins-{type}s-{plugin}-connect_timeout_seconds"]
 ===== `connect_timeout_seconds`
@@ -365,6 +378,19 @@ server (i.e. HTTPS will be used instead of plain HTTP).
 The maximum amount of time, in seconds, to wait on an incomplete response from Elasticsearch while no additional data has been appended.
 Socket timeouts usually occur while waiting for the first byte of a response, such as when executing a particularly complex query.
 
+
+[id="plugins-{type}s-{plugin}-target"]
+===== `target`
+
+* Value type is https://www.elastic.co/guide/en/logstash/master/field-references-deepdive.html[field reference]
+* There is no default value for this setting.
+
+Without a `target`, events are created from each hit's `_source` at the root level.
+When the `target` is set to a field reference, the `_source` of the hit is placed in the target field instead.
+
+This option can be useful to avoid populating unknown fields when a downstream schema such as ECS is enforced.
+It is also possible to target an entry in the event's metadata, which will be available during event processing but not exported to your outputs (e.g., `target \=> "[@metadata][_source]"`).
+
 [id="plugins-{type}s-{plugin}-user"]
 ===== `user`
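To make the `target` option above concrete, here is a minimal pipeline sketch; the index name, query, and `[document]` target field are illustrative, not part of the plugin's documentation:

```
input {
  elasticsearch {
    hosts  => ["localhost:9200"]
    index  => "legacy-logs-*"
    query  => '{ "query": { "match_all": {} } }'
    # Nest each hit's _source under [document] instead of the event root,
    # so fields from a non-ECS source index don't collide with the schema.
    target => "[document]"
  }
}
```

With this configuration a hit whose `_source` is `{ "message": "ohayo" }` produces an event with `[document][message]` rather than a top-level `message` field.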
data/lib/logstash/inputs/elasticsearch.rb
CHANGED
@@ -3,6 +3,7 @@ require "logstash/inputs/base"
 require "logstash/namespace"
 require "logstash/json"
 require "logstash/util/safe_uri"
+require 'logstash/plugin_mixins/validator_support/field_reference_validation_adapter'
 require "base64"
 require_relative "patch"
 
@@ -62,6 +63,8 @@ require_relative "patch"
 #
 #
 class LogStash::Inputs::Elasticsearch < LogStash::Inputs::Base
+  extend LogStash::PluginMixins::ValidatorSupport::FieldReferenceValidationAdapter
+
   config_name "elasticsearch"
 
   default :codec, "json"
@@ -175,6 +178,9 @@ class LogStash::Inputs::Elasticsearch < LogStash::Inputs::Base
   # exactly once.
   config :schedule, :validate => :string
 
+  # If set, the _source of each hit will be added nested under the target instead of at the top-level
+  config :target, :validate => :field_reference
+
   def register
     require "elasticsearch"
     require "rufus/scheduler"
@@ -298,7 +304,12 @@ class LogStash::Inputs::Elasticsearch < LogStash::Inputs::Base
   end
 
   def push_hit(hit, output_queue)
-    event = LogStash::Event.new(hit['_source'])
+    if @target.nil?
+      event = LogStash::Event.new(hit['_source'])
+    else
+      event = LogStash::Event.new
+      event.set(@target, hit['_source'])
+    end
 
     if @docinfo
       # do not assume event[@docinfo_target] to be in-place updatable. first get it, update it, then at the end set it in the event.
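The `push_hit` branching above is easy to exercise outside Logstash. This sketch mimics it with a Hash-backed stand-in for `LogStash::Event` (the real class lives in logstash-core); `StubEvent` and `build_event` are illustrative names, and the `set` method only handles the simple bracketed field-reference form used here:

```ruby
# Hash-backed stand-in for LogStash::Event, for illustration only.
class StubEvent
  def initialize(data = {})
    @data = data
  end

  # Supports only simple references like "[@metadata][_source]".
  def set(reference, value)
    keys = reference.scan(/\[([^\]]+)\]/).flatten
    last = keys.pop
    node = keys.reduce(@data) { |acc, k| acc[k] ||= {} }
    node[last] = value
  end

  def to_hash
    @data
  end
end

# Mirrors the new push_hit logic: no target expands _source at the root,
# a target nests _source under the referenced field.
def build_event(hit, target)
  if target.nil?
    StubEvent.new(hit['_source'])
  else
    event = StubEvent.new
    event.set(target, hit['_source'])
    event
  end
end

hit = { '_source' => { 'message' => ['ohayo'] } }
build_event(hit, nil)                        # fields at the event root
build_event(hit, '[@metadata][_source]')     # fields nested under metadata
```

This matches the behavior the spec below asserts: without a target, `message` is a root field; with `target => "[@metadata][_source]"`, it lands under `[@metadata][_source][message]`.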
data/logstash-input-elasticsearch.gemspec
CHANGED
@@ -1,7 +1,7 @@
 Gem::Specification.new do |s|
 
   s.name            = 'logstash-input-elasticsearch'
-  s.version         = '4.8.1'
+  s.version         = '4.9.0'
   s.licenses        = ['Apache License (2.0)']
   s.summary         = "Reads query results from an Elasticsearch cluster"
   s.description     = "This gem is a Logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/logstash-plugin install gemname. This gem is not a stand-alone program"
@@ -20,6 +20,7 @@ Gem::Specification.new do |s|
   s.metadata = { "logstash_plugin" => "true", "logstash_group" => "input" }
 
   # Gem dependencies
+  s.add_runtime_dependency "logstash-mixin-validator_support", '~> 1.0'
   s.add_runtime_dependency "logstash-core-plugin-api", ">= 1.60", "<= 2.99"
 
   s.add_runtime_dependency 'elasticsearch', '>= 5.0.3'
data/spec/inputs/elasticsearch_spec.rb
CHANGED
@@ -40,57 +40,91 @@ describe LogStash::Inputs::TestableElasticsearch do
   end
 end
 
-  config
+  context 'creating events from Elasticsearch' do
+    let(:config) do
+      %q[
+        input {
+          elasticsearch {
+            hosts => ["localhost"]
+            query => '{ "query": { "match": { "city_name": "Okinawa" } }, "fields": ["message"] }'
+          }
       }
+      ]
+    end
+
+    let(:mock_response) do
+      {
+        "_scroll_id" => "cXVlcnlUaGVuRmV0Y2g",
+        "took" => 27,
+        "timed_out" => false,
+        "_shards" => {
+          "total" => 169,
+          "successful" => 169,
+          "failed" => 0
+        },
+        "hits" => {
+          "total" => 1,
+          "max_score" => 1.0,
+          "hits" => [ {
+            "_index" => "logstash-2014.10.12",
+            "_type" => "logs",
+            "_id" => "C5b2xLQwTZa76jBmHIbwHQ",
+            "_score" => 1.0,
+            "_source" => { "message" => ["ohayo"] }
+          } ]
+        }
       }
-      "_shards" => {
-        "total" => 169,
-        "successful" => 169,
-        "failed" => 0
-      },
-      "hits" => {
-        "total" => 1,
-        "max_score" => 1.0,
-        "hits" => [ {
-          "_index" => "logstash-2014.10.12",
-          "_type" => "logs",
-          "_id" => "C5b2xLQwTZa76jBmHIbwHQ",
-          "_score" => 1.0,
-          "_source" => { "message" => ["ohayo"] }
-        } ]
+    end
+
+    let(:mock_scroll_response) do
+      {
+        "_scroll_id" => "r453Wc1jh0caLJhSDg",
+        "hits" => { "hits" => [] }
       }
+    end
 
+    before(:each) do
+      client = Elasticsearch::Client.new
+      expect(Elasticsearch::Client).to receive(:new).with(any_args).and_return(client)
+      expect(client).to receive(:search).with(any_args).and_return(mock_response)
+      expect(client).to receive(:scroll).with({ :body => { :scroll_id => "cXVlcnlUaGVuRmV0Y2g" }, :scroll=> "1m" }).and_return(mock_scroll_response)
+      expect(client).to receive(:clear_scroll).and_return(nil)
+    end
 
-      expect(client).to receive(:clear_scroll).and_return(nil)
+    it 'creates the events from the hits' do
+      event = input(config) do |pipeline, queue|
+        queue.pop
+      end
 
+      expect(event).to be_a(LogStash::Event)
+      puts event.to_hash_with_metadata
+      expect(event.get("message")).to eql [ "ohayo" ]
     end
 
+    context 'when a target is set' do
+      let(:config) do
+        %q[
+          input {
+            elasticsearch {
+              hosts => ["localhost"]
+              query => '{ "query": { "match": { "city_name": "Okinawa" } }, "fields": ["message"] }'
+              target => "[@metadata][_source]"
+            }
+          }
+        ]
+      end
 
+      it 'creates the event using the target' do
+        event = input(config) do |pipeline, queue|
+          queue.pop
+        end
+
+        expect(event).to be_a(LogStash::Event)
+        puts event.to_hash_with_metadata
+        expect(event.get("[@metadata][_source][message]")).to eql [ "ohayo" ]
+      end
+    end
+  end
 
 # This spec is an adapter-spec, ensuring that we send the right sequence of messages to our Elasticsearch Client
 # to support sliced scrolling. The underlying implementation will spawn its own threads to consume, so we must be
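The spec above mocks a specific client conversation: one `search`, `scroll` calls until the hits run dry, then `clear_scroll`. The loop can be sketched with a duck-typed stub in place of the real elasticsearch-ruby client; `StubClient` and `each_hit` are illustrative names, not part of the plugin:

```ruby
# Duck-typed stand-in for Elasticsearch::Client, returning canned pages.
class StubClient
  def initialize(pages)
    @pages = pages
  end

  def search(*_args)
    @pages.shift
  end

  def scroll(*_args)
    @pages.shift
  end

  def clear_scroll(*_args)
    nil
  end
end

# Drain a scroll: yield every hit, page by page, then release the scroll.
def each_hit(client)
  response = client.search
  loop do
    hits = response.dig('hits', 'hits')
    break if hits.nil? || hits.empty?
    hits.each { |hit| yield hit }
    response = client.scroll
  end
ensure
  client.clear_scroll
end

pages = [
  { '_scroll_id' => 'abc', 'hits' => { 'hits' => [{ '_source' => { 'message' => ['ohayo'] } }] } },
  { '_scroll_id' => 'abc', 'hits' => { 'hits' => [] } }
]
collected = []
each_hit(StubClient.new(pages)) { |hit| collected << hit['_source'] }
```

The empty second page is what terminates the loop, which is exactly why the spec's `mock_scroll_response` returns `"hits" => { "hits" => [] }`.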
metadata
CHANGED
@@ -1,15 +1,29 @@
 --- !ruby/object:Gem::Specification
 name: logstash-input-elasticsearch
 version: !ruby/object:Gem::Version
-  version: 4.8.1
+  version: 4.9.0
 platform: ruby
 authors:
 - Elastic
 autorequire:
 bindir: bin
 cert_chain: []
-date: 2020-
+date: 2020-12-16 00:00:00.000000000 Z
 dependencies:
+- !ruby/object:Gem::Dependency
+  requirement: !ruby/object:Gem::Requirement
+    requirements:
+    - - "~>"
+      - !ruby/object:Gem::Version
+        version: '1.0'
+  name: logstash-mixin-validator_support
+  prerelease: false
+  type: :runtime
+  version_requirements: !ruby/object:Gem::Requirement
+    requirements:
+    - - "~>"
+      - !ruby/object:Gem::Version
+        version: '1.0'
 - !ruby/object:Gem::Dependency
   requirement: !ruby/object:Gem::Requirement
     requirements:
@@ -231,7 +245,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
     version: '0'
 requirements: []
 rubyforge_project:
-rubygems_version: 2.
+rubygems_version: 2.7.10
 signing_key:
 specification_version: 4
 summary: Reads query results from an Elasticsearch cluster