logstash-output-elasticsearch 11.22.2-java → 11.22.4-java

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 011acede8b368a5fcf578665eacc4393b1d0bc531fc2f5c47814345826534d6f
- data.tar.gz: 00b17bcff6d2100a03e801da6e4dff1c97d58adee514764c0c6c7a11588d2bd8
+ metadata.gz: ca7853e786ec4b8b63975e609f741bfab19465b65b9d48cef21030a36b0ac6eb
+ data.tar.gz: 98ae91a57a372e758e494f355246c23a45ccae403eacce9649b36cfd95cbbd65
  SHA512:
- metadata.gz: c01b55a7daa6609256624e44f2f05f5528adc114a928c150dae3a06f6596d8cc0473252b207b60bd0dbaaeb801e4f44bc1b8338900030d8acfa981ee4cc1807a
- data.tar.gz: b422aca67422e08a8627f7b276f32d88ca0320823248bf89b6124eb2f205866718cbdc7f208b36ed595f6810737813c9ce291343885621622bd7cc7dbf427842
+ metadata.gz: 87d0fb490f7be1405e187731b8a80362c3a283f9896facd2ef77f3c569b62800afff1ed7806cfb958c9ad0caafcc175f729e8def1e4f9c01b0d756484d3cd6bb
+ data.tar.gz: 3dbfcb27c386f841c54adf1275ec09769a59f1ce8b6d86fd9d658de9ddfd4f316114bf17d716179f664c5dae2ecd9dd77d5230218a728eb93f941d765522e276
data/CHANGELOG.md CHANGED
@@ -1,3 +1,9 @@
+ ## 11.22.4
+ - [DOC] Adds note that ecs-compatibility is required for data streams to work properly [#1174](https://github.com/logstash-plugins/logstash-output-elasticsearch/pull/1174)
+
+ ## 11.22.3
+ - Fixes an issue where events containing non-unicode strings could fail to serialize correctly when compression is enabled [#1169](https://github.com/logstash-plugins/logstash-output-elasticsearch/pull/1169)
+
  ## 11.22.2
  - [DOC] Add content for sending data to Elasticsearch on serverless [#1164](https://github.com/logstash-plugins/logstash-output-elasticsearch/pull/1164)
 
@@ -136,9 +142,11 @@
 
  ## 11.5.0
  - Feat: add ssl_supported_protocols option [#1055](https://github.com/logstash-plugins/logstash-output-elasticsearch/pull/1055)
+ - [DOC] Add `v8` to supported values for ecs_compatiblity defaults [#1059](https://github.com/logstash-plugins/logstash-output-elasticsearch/pull/1059)
 
  ## 11.4.2
- - [DOC] Add `v8` to supported values for ecs_compatiblity defaults [#1059](https://github.com/logstash-plugins/logstash-output-elasticsearch/pull/1059)
+ - Fixes an issue where events containing non-unicode strings could fail to serialize correctly when compression is enabled [#1169](https://github.com/logstash-plugins/logstash-output-elasticsearch/pull/1169)
+ - NOTE: This is a backport of the relevant fix from v11.22.3 to the 11.4 series for inclusion with Logstash 7.17 maintenance releases
 
  ## 11.4.1
  - Feat: upgrade manticore (http-client) library [#1063](https://github.com/logstash-plugins/logstash-output-elasticsearch/pull/1063)
data/docs/index.asciidoc CHANGED
@@ -51,7 +51,7 @@ You can use this plugin to send your {ls} data to {es-serverless}.
  Some differences to note between {es-serverless} and self-managed {es}:
 
  * Use *API keys* to access {serverless-full} from {ls}.
- Any user-based security settings in your in your <<plugins-outputs-elasticsearch,{es} output plugin>> configuration are ignored and may cause errors.
+ Any user-based security settings in your <<plugins-outputs-elasticsearch,{es} output plugin>> configuration are ignored and may cause errors.
  * {es-serverless} uses *data streams* and {ref}/data-stream-lifecycle.html[{dlm} ({dlm-init})] instead of {ilm} ({ilm-init}).
  Any {ilm-init} settings in your <<plugins-outputs-elasticsearch,{es} output plugin>> configuration are ignored and may cause errors.
  * *{ls} monitoring* is available through the https://github.com/elastic/integrations/blob/main/packages/logstash/_dev/build/docs/README.md[{ls} Integration] in {serverless-docs}/observability/what-is-observability-serverless[Elastic Observability] on {serverless-full}.
@@ -99,6 +99,8 @@ as logs, metrics, and events) into {es} and {es-serverless}:
  * <<plugins-{type}s-{plugin}-data_stream_sync_fields>>
  * <<plugins-{type}s-{plugin}-data_stream_type>>
 
+ IMPORTANT: <<plugins-{type}s-{plugin}-ecs_compatibility,ECS compatibility>> must be enabled (set to `v1` or `v8`) for data streams to work properly.
+
  [id="plugins-{type}s-{plugin}-ds-examples"]
  ===== Data stream configuration examples
 
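For context on the IMPORTANT note added above: a minimal data-stream output configuration with ECS compatibility explicitly enabled might look like the sketch below. The host and API key values are placeholders, not taken from this release.

```
output {
  elasticsearch {
    hosts => ["https://my-deployment.es.example.com:9243"]
    api_key => "id:api_key_value"
    data_stream => "true"
    ecs_compatibility => "v8"  # data streams require ECS mode `v1` or `v8`
  }
}
```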
@@ -134,8 +136,6 @@ output {
  -----
 
 
-
-
  ==== Writing to different indices: best practices
 
  NOTE: You cannot use dynamic variable substitution when `ilm_enabled` is `true`
@@ -316,6 +316,11 @@ index level and `monitoring` permissions at cluster level. The `monitoring`
  permission at cluster level is necessary to perform periodic connectivity
  checks.
 
+ [id="plugins-{type}s-{plugin}-handling-non-utf-8"]
+ ==== Handling non UTF-8 data
+
+ This plugin transmits events to Elasticsearch using a JSON API, and therefore requires that all string values in events be valid UTF-8.
+ When a string value on an event contains one or more byte sequences that are not valid in UTF-8, each offending byte sequence is replaced with the UTF-8 replacement character (`\uFFFD`).
 
  [id="plugins-{type}s-{plugin}-options"]
  ==== Elasticsearch Output Configuration Options
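The replacement behavior documented in the hunk above matches Ruby's built-in `String#scrub`, which can be used to see the effect in isolation (the sample string here is illustrative, not from the plugin):

```ruby
# Each byte sequence that is invalid UTF-8 is swapped for U+FFFD,
# mirroring the behavior documented for this plugin.
raw = "contains \xAC invalid \xD7 bytes"

raw.valid_encoding?   # false: \xAC and \xD7 are not valid UTF-8 here
clean = raw.scrub     # String#scrub defaults to "\uFFFD" for UTF-8 strings
clean.valid_encoding? # true: payload is now safe to serialize as JSON
```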
@@ -506,6 +511,7 @@ The other `data_stream_*` settings will be used only if this setting is enabled.
 
  Logstash handles the output as a data stream when the supplied configuration
  is compatible with data streams and this value is set to `auto`.
+ Note that <<plugins-{type}s-{plugin}-ecs_compatibility,ECS compatibility>> must be enabled (set to `v1` or `v8`) for data streams to work properly.
 
  [id="plugins-{type}s-{plugin}-data_stream_auto_routing"]
  ===== `data_stream_auto_routing`
@@ -22,6 +22,7 @@ module LogStash; module Outputs; class ElasticSearch;
  # made sense. We picked one on the lowish side to not use too much heap.
  TARGET_BULK_BYTES = 20 * 1024 * 1024 # 20MiB
 
+
  class HttpClient
  attr_reader :client, :options, :logger, :pool, :action_count, :recv_count
  # This is here in case we use DEFAULT_OPTIONS in the future
@@ -37,7 +38,7 @@ module LogStash; module Outputs; class ElasticSearch;
  # * `:user` - String. The user to use for authentication.
  # * `:password` - String. The password to use for authentication.
  # * `:timeout` - Float. A duration value, in seconds, after which a socket
- # operation or request will be aborted if not yet successfull
+ # operation or request will be aborted if not yet successful
  # * `:client_settings` - a hash; see below for keys.
  #
  # The `client_settings` key is a has that can contain other settings:
@@ -132,6 +133,9 @@ module LogStash; module Outputs; class ElasticSearch;
  action.map {|line| LogStash::Json.dump(line)}.join("\n") :
  LogStash::Json.dump(action)
  as_json << "\n"
+
+ as_json.scrub! # ensure generated JSON is valid UTF-8
+
  if (stream_writer.pos + as_json.bytesize) > TARGET_BULK_BYTES && stream_writer.pos > 0
  stream_writer.flush # ensure writer has sync'd buffers before reporting sizes
  logger.debug("Sending partial bulk request for batch with one or more actions remaining.",
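The size check in the hunk above splits the bulk payload around `TARGET_BULK_BYTES`: a partial request is flushed before appending an action that would push the buffer past the target. A simplified sketch of that batching rule (`chunk_actions` is a hypothetical helper, not the plugin's actual stream-writer code):

```ruby
TARGET_BULK_BYTES = 20 * 1024 * 1024 # 20MiB, as in the plugin

# Group serialized bulk actions so a chunk is closed before it would grow
# past the target; an oversized single action still ships in its own chunk.
def chunk_actions(serialized_actions, target: TARGET_BULK_BYTES)
  chunks = [+""]
  serialized_actions.each do |as_json|
    if chunks.last.bytesize + as_json.bytesize > target && !chunks.last.empty?
      chunks << +"" # start a new bulk request, mirroring the flush above
    end
    chunks.last << as_json
  end
  chunks
end
```

This is why the spec further down expects `bulk_send` to be called twice when one message alone exceeds the target.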
@@ -496,5 +500,6 @@ module LogStash; module Outputs; class ElasticSearch;
  end
  [args, source]
  end
+
  end
  end end end
@@ -1,6 +1,6 @@
  Gem::Specification.new do |s|
  s.name = 'logstash-output-elasticsearch'
- s.version = '11.22.2'
+ s.version = '11.22.4'
  s.licenses = ['apache-2.0']
  s.summary = "Stores logs in Elasticsearch"
  s.description = "This gem is a Logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/logstash-plugin install gemname. This gem is not a stand-alone program"
@@ -11,10 +11,13 @@ end
  [ {"http_compression" => true}, {"compression_level" => 1} ].each do |compression_config|
  describe "indexing with http_compression turned on", :integration => true do
  let(:event) { LogStash::Event.new("message" => "Hello World!", "type" => type) }
+ let(:event_with_invalid_utf_8_bytes) { LogStash::Event.new("message" => "Message from spacecraft which contains \xAC invalid \xD7 byte sequences.", "type" => type) }
+
  let(:index) { 10.times.collect { rand(10).to_s }.join("") }
  let(:type) { ESHelper.es_version_satisfies?("< 7") ? "doc" : "_doc" }
  let(:event_count) { 10000 + rand(500) }
- let(:events) { event_count.times.map { event }.to_a }
+ # mix the events with valid and invalid UTF-8 payloads
+ let(:events) { event_count.times.map { |i| i%3 == 0 ? event : event_with_invalid_utf_8_bytes }.to_a }
  let(:config) {
  {
  "hosts" => get_host_port,
@@ -242,12 +242,14 @@ describe LogStash::Outputs::ElasticSearch::HttpClient do
  end
  end
 
- context "with two messages" do
- let(:message1) { "hey" }
- let(:message2) { "you" }
+ context "with multiple messages" do
+ let(:message_head) { "Spacecraft message" }
+ let(:message_tail) { "byte sequence" }
+ let(:invalid_utf_8_message) { "contains invalid \xAC" }
  let(:actions) { [
- ["index", {:_id=>nil, :_index=>"logstash"}, {"message"=> message1}],
- ["index", {:_id=>nil, :_index=>"logstash"}, {"message"=> message2}],
+ ["index", {:_id=>nil, :_index=>"logstash"}, {"message"=> message_head}],
+ ["index", {:_id=>nil, :_index=>"logstash"}, {"message"=> invalid_utf_8_message}],
+ ["index", {:_id=>nil, :_index=>"logstash"}, {"message"=> message_tail}],
  ]}
  it "executes one bulk_send operation" do
  allow(subject).to receive(:join_bulk_responses)
@@ -257,7 +259,7 @@ describe LogStash::Outputs::ElasticSearch::HttpClient do
 
  context "if one exceeds TARGET_BULK_BYTES" do
  let(:target_bulk_bytes) { LogStash::Outputs::ElasticSearch::TARGET_BULK_BYTES }
- let(:message1) { "a" * (target_bulk_bytes + 1) }
+ let(:message_head) { "a" * (target_bulk_bytes + 1) }
  it "executes two bulk_send operations" do
  allow(subject).to receive(:join_bulk_responses)
  expect(subject).to receive(:bulk_send).twice
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: logstash-output-elasticsearch
  version: !ruby/object:Gem::Version
- version: 11.22.2
+ version: 11.22.4
  platform: java
  authors:
  - Elastic
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2023-12-13 00:00:00.000000000 Z
+ date: 2024-03-22 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  requirement: !ruby/object:Gem::Requirement