logstash-input-file 4.2.3 → 4.4.0
- checksums.yaml +4 -4
- data/CHANGELOG.md +16 -0
- data/README.md +1 -1
- data/docs/index.asciidoc +30 -0
- data/lib/filewatch/helper.rb +5 -2
- data/lib/filewatch/sincedb_collection.rb +14 -3
- data/lib/jars/filewatch-1.0.1.jar +0 -0
- data/lib/logstash/inputs/file.rb +22 -2
- data/lib/logstash/inputs/file_listener.rb +1 -3
- data/logstash-input-file.gemspec +2 -1
- data/spec/filewatch/reading_spec.rb +4 -4
- data/spec/filewatch/rotate_spec.rb +4 -4
- data/spec/filewatch/tailing_spec.rb +10 -10
- data/spec/inputs/file_read_spec.rb +8 -5
- data/spec/inputs/file_tail_spec.rb +48 -28
- metadata +17 -4
checksums.yaml CHANGED

@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 667711732af65c8b27c1bf3e455d78748d0b0c9365cc612df7aa29cc65fbbb58
+  data.tar.gz: 3f25446f513d8cb7a5619bd7e00333bcade1dbb6ccd8aeb061b991b14e91cb1c
 SHA512:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 371c56fb328a940b11047b88ded84ef2c17427ae706999dcdb15a674109be7515c952b67aadb692400e76fae9edf0fd5bfb6984a9b23f9fe00762ec618d88b25
+  data.tar.gz: b980a4595e55b90783f72f9f36240c0ac9f4a0aea998ab27ae6e2b5024474961f100349f6a044aee29826d7c94789829b1f3f1598bd204ba1b4edb0d90295fa8
data/CHANGELOG.md CHANGED

@@ -1,3 +1,19 @@
+## 4.4.0
+  - Add support for ECS v8 [#301](https://github.com/logstash-plugins/logstash-input-file/pull/301)
+
+## 4.3.1
+  - Add extra safety to `chown` call in `atomic_write`, avoiding plugin crashes and falling back to a
+    `non_atomic_write` in the event of failure [#295](https://github.com/logstash-plugins/logstash-input-file/pull/295)
+  - Refactor: unify event updates to happen in one place [#297](https://github.com/logstash-plugins/logstash-input-file/pull/297)
+  - Test: Actually retry tests on `RSpec::Expectations::ExpectationNotMetError` and retry instead of relying on timeout
+    [#297](https://github.com/logstash-plugins/logstash-input-file/pull/297)
+
+## 4.3.0
+  - Add ECS Compatibility Mode [#291](https://github.com/logstash-plugins/logstash-input-file/pull/291)
+
+## 4.2.4
+  - Fix: sincedb_write issue on Windows machines [#283](https://github.com/logstash-plugins/logstash-input-file/pull/283)
+
 ## 4.2.3
   - Refactor: improve debug logging (log catched exceptions) [#280](https://github.com/logstash-plugins/logstash-input-file/pull/280)
data/README.md CHANGED

@@ -1,6 +1,6 @@
 # Logstash Plugin
-[![Travis Build Status](https://travis-ci.
+[![Travis Build Status](https://travis-ci.com/logstash-plugins/logstash-input-file.svg)](https://travis-ci.com/logstash-plugins/logstash-input-file)
 
 This is a plugin for [Logstash](https://github.com/elastic/logstash).
data/docs/index.asciidoc CHANGED

@@ -78,6 +78,21 @@ Read mode also allows for an action to take place after processing the file comp
 In the past attempts to simulate a Read mode while still assuming infinite streams
 was not ideal and a dedicated Read mode is an improvement.
 
+[id="plugins-{type}s-{plugin}-ecs"]
+==== Compatibility with the Elastic Common Schema (ECS)
+
+This plugin adds metadata about event's source, and can be configured to do so
+in an {ecs-ref}[ECS-compatible] way with <<plugins-{type}s-{plugin}-ecs_compatibility>>.
+This metadata is added after the event has been decoded by the appropriate codec,
+and will never overwrite existing values.
+
+|========
+| ECS Disabled | ECS `v1`, `v8` | Description
+
+| `host` | `[host][name]` | The name of the {ls} host that processed the event
+| `path` | `[log][file][path]` | The full path to the log file from which the event originates
+|========
+
 ==== Tracking of current position in watched files
 
 The plugin keeps track of the current position in each file by

@@ -168,6 +183,7 @@ see <<plugins-{type}s-{plugin}-string_duration,string_duration>> for the details
 | <<plugins-{type}s-{plugin}-close_older>> |<<number,number>> or <<plugins-{type}s-{plugin}-string_duration,string_duration>>|No
 | <<plugins-{type}s-{plugin}-delimiter>> |<<string,string>>|No
 | <<plugins-{type}s-{plugin}-discover_interval>> |<<number,number>>|No
+| <<plugins-{type}s-{plugin}-ecs_compatibility>> |<<string,string>>|No
 | <<plugins-{type}s-{plugin}-exclude>> |<<array,array>>|No
 | <<plugins-{type}s-{plugin}-exit_after_read>> |<<boolean,boolean>>|No
 | <<plugins-{type}s-{plugin}-file_chunk_count>> |<<number,number>>|No

@@ -242,6 +258,20 @@ This value is a multiple to `stat_interval`, e.g. if `stat_interval` is "500 ms"
 files could be discovered every 15 X 500 milliseconds - 7.5 seconds.
 In practice, this will be the best case because the time taken to read new content needs to be factored in.
 
+[id="plugins-{type}s-{plugin}-ecs_compatibility"]
+===== `ecs_compatibility`
+
+* Value type is <<string,string>>
+* Supported values are:
+** `disabled`: sets non-ECS metadata on event (such as top-level `host`, `path`)
+** `v1`,`v8`: sets ECS-compatible metadata on event (such as `[host][name]`, `[log][file][path]`)
+* Default value depends on which version of Logstash is running:
+** When Logstash provides a `pipeline.ecs_compatibility` setting, its value is used as the default
+** Otherwise, the default value is `disabled`.
+
+Controls this plugin's compatibility with the
+{ecs-ref}[Elastic Common Schema (ECS)].
+
 [id="plugins-{type}s-{plugin}-exclude"]
 ===== `exclude`
 
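To make the documented option concrete, a minimal pipeline snippet using the new `ecs_compatibility` setting might look like the following (the path and the chosen mode are illustrative, not taken from the diff):

```
input {
  file {
    path => "/var/log/app/*.log"
    # "v1"/"v8" emit [host][name] and [log][file][path];
    # "disabled" keeps the legacy top-level `host` and `path` fields
    ecs_compatibility => "v1"
  }
}
```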
data/lib/filewatch/helper.rb CHANGED

@@ -30,6 +30,7 @@ module FileHelper
     temp_file.binmode
     return_val = yield temp_file
     temp_file.close
+    new_stat = File.stat(temp_file)
 
     # Overwrite original file with temp file
     File.rename(temp_file.path, file_name)

@@ -37,8 +38,10 @@ module FileHelper
     # Unable to get permissions of the original file => return
     return return_val if old_stat.nil?
 
-    # Set correct uid/gid on new file
-    File.chown(old_stat.uid, old_stat.gid, file_name) if old_stat
+    # Set correct uid/gid on new file if ownership is different.
+    if old_stat && (old_stat.gid != new_stat.gid || old_stat.uid != new_stat.uid)
+      File.chown(old_stat.uid, old_stat.gid, file_name) if old_stat
+    end
 
     return_val
   end
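The hunk above makes the `chown` in the atomic-write helper conditional on ownership actually differing. A self-contained sketch of the same pattern (a hypothetical stand-in, not the plugin's `FileHelper`): write to a temp file in the destination directory, rename it over the original so readers never observe a partial file, and restore ownership only when it changed.

```ruby
require "tempfile"

# Hypothetical helper mirroring the write_atomically pattern in the diff.
def write_atomically(file_name)
  old_stat = File.stat(file_name) rescue nil # original may not exist yet

  # Temp file in the same directory, so the rename stays on one filesystem.
  temp_file = Tempfile.new(File.basename(file_name), File.dirname(file_name))
  temp_file.binmode
  return_val = yield temp_file
  temp_file.close
  new_stat = File.stat(temp_file.path)

  # Atomic replace: readers see either the old or the new content, never half.
  File.rename(temp_file.path, file_name)

  return return_val if old_stat.nil?

  # Only chown when ownership actually differs (the fix in the hunk above).
  if old_stat.gid != new_stat.gid || old_stat.uid != new_stat.uid
    File.chown(old_stat.uid, old_stat.gid, file_name)
  end
  return_val
end
```

The conditional matters because an unconditional `File.chown` can raise `Errno::EPERM` for non-root users even when nothing needed changing.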
data/lib/filewatch/sincedb_collection.rb CHANGED

@@ -225,14 +225,25 @@ module FileWatch
 
     # @return expired keys
     def atomic_write(time)
-      FileHelper.write_atomically(@full_path) do |io|
-        @serializer.serialize(@sincedb, io, time.to_f)
+      logger.trace? && logger.trace("atomic_write: ", :time => time)
+      begin
+        FileHelper.write_atomically(@full_path) do |io|
+          @serializer.serialize(@sincedb, io, time.to_f)
+        end
+      rescue Errno::EPERM, Errno::EACCES => e
+        logger.warn("sincedb_write: unable to write atomically due to permissions error, falling back to non-atomic write: #{path} error:", :exception => e.class, :message => e.message)
+        @write_method = method(:non_atomic_write)
+        non_atomic_write(time)
+      rescue => e
+        logger.warn("sincedb_write: unable to write atomically, attempting non-atomic write: #{path} error:", :exception => e.class, :message => e.message)
+        non_atomic_write(time)
       end
     end
 
     # @return expired keys
     def non_atomic_write(time)
-      File.open(@full_path, "w+") do |io|
+      logger.trace? && logger.trace("non_atomic_write: ", :time => time)
+      File.open(@full_path, "w+") do |io|
         @serializer.serialize(@sincedb, io, time.to_f)
       end
     end
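The `sincedb_collection.rb` hunk above implements a one-way fallback: a permissions error permanently rebinds `@write_method` to the non-atomic writer, while any other error falls back once but keeps trying atomically next time. A minimal sketch of that strategy with hypothetical names (plain lambdas stand in for the real serializers):

```ruby
# Hypothetical stand-in demonstrating the fallback pattern from the diff.
class SincedbWriter
  def initialize(atomic_writer, plain_writer)
    @atomic_writer = atomic_writer
    @plain_writer  = plain_writer
    @write_method  = method(:atomic_write) # start optimistic
  end

  def write(time)
    @write_method.call(time)
  end

  def atomic_write(time)
    @atomic_writer.call(time)
  rescue Errno::EPERM, Errno::EACCES
    # Permissions won't fix themselves: switch permanently, then recover.
    @write_method = method(:non_atomic_write)
    non_atomic_write(time)
  rescue
    # Possibly transient: recover now, but try atomically again next time.
    non_atomic_write(time)
  end

  def non_atomic_write(time)
    @plain_writer.call(time)
  end
end
```

After the first `Errno::EACCES`, every later `write` goes straight to the plain writer and never repeats the doomed atomic attempt.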
data/lib/jars/filewatch-1.0.1.jar CHANGED

Binary file
data/lib/logstash/inputs/file.rb CHANGED

@@ -2,6 +2,7 @@
 require "logstash/namespace"
 require "logstash/inputs/base"
 require "logstash/codecs/identity_map_codec"
+require 'logstash/plugin_mixins/ecs_compatibility_support'
 
 require "pathname"
 require "socket" # for Socket.gethostname

@@ -88,6 +89,8 @@ module LogStash module Inputs
 class File < LogStash::Inputs::Base
   config_name "file"
 
+  include PluginMixins::ECSCompatibilitySupport(:disabled, :v1, :v8 => :v1)
+
   # The path(s) to the file(s) to use as an input.
   # You can use filename patterns here, such as `/var/log/*.log`.
   # If you use a pattern like `/var/log/**/*.log`, a recursive search

@@ -325,6 +328,9 @@ class File < LogStash::Inputs::Base
     @codec = LogStash::Codecs::IdentityMapCodec.new(@codec)
     @completely_stopped = Concurrent::AtomicBoolean.new
     @queue = Concurrent::AtomicReference.new
+
+    @source_host_field = ecs_select[disabled: 'host', v1:'[host][name]']
+    @source_path_field = ecs_select[disabled: 'path', v1:'[log][file][path]']
   end # def register
 
   def completely_stopped?

@@ -367,9 +373,12 @@ class File < LogStash::Inputs::Base
     @completely_stopped.make_true
   end # def run
 
-  def post_process_this(event)
+  def post_process_this(event, path)
+    event.set("[@metadata][path]", path)
     event.set("[@metadata][host]", @host)
-    event
+    attempt_set(event, @source_host_field, @host)
+    attempt_set(event, @source_path_field, path) if path
+
     decorate(event)
     @queue.get << event
   end

@@ -407,6 +416,17 @@ class File < LogStash::Inputs::Base
     end
   end
 
+  # Attempt to set an event's field to the provided value
+  # without overwriting an existing value or producing an error
+  def attempt_set(event, field_reference, value)
+    return false if event.include?(field_reference)
+
+    event.set(field_reference, value)
+  rescue => e
+    logger.trace("failed to set #{field_reference} to `#{value}`", :exception => e.message)
+    false
+  end
+
   def build_sincedb_base_from_env
     # This section is going to be deprecated eventually, as path.data will be
     # the default, not an environment variable (SINCEDB_DIR or LOGSTASH_HOME)
data/logstash-input-file.gemspec CHANGED

@@ -1,7 +1,7 @@
 Gem::Specification.new do |s|
 
   s.name          = 'logstash-input-file'
-  s.version       = '4.2.3'
+  s.version       = '4.4.0'
   s.licenses      = ['Apache-2.0']
   s.summary       = "Streams events from files"
   s.description   = "This gem is a Logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/logstash-plugin install gemname. This gem is not a stand-alone program"

@@ -33,6 +33,7 @@ Gem::Specification.new do |s|
 
   s.add_runtime_dependency 'concurrent-ruby', '~> 1.0'
   s.add_runtime_dependency 'logstash-codec-multiline', ['~> 3.0']
+  s.add_runtime_dependency 'logstash-mixin-ecs_compatibility_support', '~>1.3'
 
   s.add_development_dependency 'stud', ['~> 0.0.19']
   s.add_development_dependency 'logstash-devutils'
|
|
91
91
|
context "when watching a directory with files using striped reading" do
|
92
92
|
let(:file_path2) { ::File.join(directory, "2.log") }
|
93
93
|
# use a chunk size that does not align with the line boundaries
|
94
|
-
let(:opts) { super.merge(:file_chunk_size => 10, :file_chunk_count => 1, :file_sort_by => "path")}
|
94
|
+
let(:opts) { super().merge(:file_chunk_size => 10, :file_chunk_count => 1, :file_sort_by => "path")}
|
95
95
|
let(:lines) { [] }
|
96
96
|
let(:observer) { TestObserver.new(lines) }
|
97
97
|
let(:listener2) { observer.listener_for(file_path2) }
|
@@ -121,7 +121,7 @@ module FileWatch
|
|
121
121
|
end
|
122
122
|
|
123
123
|
context "when a non default delimiter is specified and it is not in the content" do
|
124
|
-
let(:opts) { super.merge(:delimiter => "\nø") }
|
124
|
+
let(:opts) { super().merge(:delimiter => "\nø") }
|
125
125
|
let(:actions) do
|
126
126
|
RSpec::Sequencing.run("create file") do
|
127
127
|
File.open(file_path, "wb") { |file| file.write("line1\nline2") }
|
@@ -154,7 +154,7 @@ module FileWatch
|
|
154
154
|
let(:file_path2) { ::File.join(directory, "2.log") }
|
155
155
|
let(:file_path3) { ::File.join(directory, "3.log") }
|
156
156
|
|
157
|
-
let(:opts) { super.merge(:file_sort_by => "last_modified") }
|
157
|
+
let(:opts) { super().merge(:file_sort_by => "last_modified") }
|
158
158
|
let(:lines) { [] }
|
159
159
|
let(:observer) { TestObserver.new(lines) }
|
160
160
|
|
@@ -195,7 +195,7 @@ module FileWatch
|
|
195
195
|
end
|
196
196
|
|
197
197
|
context "when watching a directory with files using exit_after_read" do
|
198
|
-
let(:opts) { super.merge(:exit_after_read => true, :max_open_files => 2) }
|
198
|
+
let(:opts) { super().merge(:exit_after_read => true, :max_open_files => 2) }
|
199
199
|
let(:file_path3) { ::File.join(directory, "3.log") }
|
200
200
|
let(:file_path4) { ::File.join(directory, "4.log") }
|
201
201
|
let(:file_path5) { ::File.join(directory, "5.log") }
|
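The spec hunks above (and in the files that follow) all make the same mechanical change: bare `super` becomes `super()` inside `let` blocks. The reason: RSpec defines `let` values via `define_method`, and Ruby forbids zero-paren `super` (implicit argument forwarding) inside a `define_method` block, raising a `RuntimeError`; the explicit `super()` form is allowed. A minimal sketch with hypothetical class names:

```ruby
class Base
  def opts
    { :a => 1 }
  end
end

class Child < Base
  # Mirrors how RSpec's `let` installs its block: via define_method.
  # Inside this block, bare `super` raises
  # "implicit argument passing of super from method defined by define_method()
  #  is not supported"; `super()` with explicit (empty) arguments works.
  define_method(:opts) do
    super().merge(:b => 2)
  end
end
```

So the change is not cosmetic: under newer Ruby/RSpec combinations the bare-`super` form simply cannot run.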
data/spec/filewatch/rotate_spec.rb CHANGED

@@ -219,7 +219,7 @@ module FileWatch
     end
 
     context "create + rename rotation: when a new logfile is renamed to a path we have seen before but not all content from the previous the file is read" do
-      let(:opts) { super.merge(
+      let(:opts) { super().merge(
           :file_chunk_size => line1.bytesize.succ,
           :file_chunk_count => 1
       ) }

@@ -296,7 +296,7 @@ module FileWatch
     end
 
     context "copy + truncate rotation: when a logfile is copied to a new path and truncated before the open file is fully read" do
-      let(:opts) { super.merge(
+      let(:opts) { super().merge(
           :file_chunk_size => line1.bytesize.succ,
           :file_chunk_count => 1
       ) }

@@ -370,7 +370,7 @@ module FileWatch
     end
 
     context "? rotation: when an active file is renamed inside the glob and the reading lags behind" do
-      let(:opts) { super.merge(
+      let(:opts) { super().merge(
           :file_chunk_size => line1.bytesize.succ,
           :file_chunk_count => 2
       ) }

@@ -409,7 +409,7 @@ module FileWatch
     end
 
     context "? rotation: when a not active file is rotated outside the glob before the file is read" do
-      let(:opts) { super.merge(
+      let(:opts) { super().merge(
           :close_older => 3600,
           :max_open_files => 1,
           :file_sort_by => "path"
data/spec/filewatch/tailing_spec.rb CHANGED

@@ -77,7 +77,7 @@ module FileWatch
 
     context "when close_older is set" do
       let(:wait_before_quit) { 0.8 }
-      let(:opts) { super.merge(:close_older => 0.1, :max_open_files => 1, :stat_interval => 0.1) }
+      let(:opts) { super().merge(:close_older => 0.1, :max_open_files => 1, :stat_interval => 0.1) }
       let(:suffix) { "B" }
       it "opens both files" do
         actions.activate_quietly

@@ -278,7 +278,7 @@ module FileWatch
 
     context "when watching a directory with files and a file is renamed to match glob", :unix => true do
       let(:suffix) { "H" }
-      let(:opts) { super.merge(:close_older => 0) }
+      let(:opts) { super().merge(:close_older => 0) }
       let(:listener2) { observer.listener_for(file_path2) }
       let(:actions) do
         RSpec::Sequencing

@@ -346,7 +346,7 @@ module FileWatch
     end
 
     context "when close older expiry is enabled" do
-      let(:opts) { super.merge(:close_older => 1) }
+      let(:opts) { super().merge(:close_older => 1) }
       let(:suffix) { "J" }
       let(:actions) do
         RSpec::Sequencing.run("create file") do

@@ -370,7 +370,7 @@ module FileWatch
     end
 
     context "when close older expiry is enabled and after timeout the file is appended-to" do
-      let(:opts) { super.merge(:close_older => 0.5) }
+      let(:opts) { super().merge(:close_older => 0.5) }
       let(:suffix) { "K" }
       let(:actions) do
         RSpec::Sequencing

@@ -406,7 +406,7 @@ module FileWatch
     end
 
     context "when ignore older expiry is enabled and all files are already expired" do
-      let(:opts) { super.merge(:ignore_older => 1) }
+      let(:opts) { super().merge(:ignore_older => 1) }
       let(:suffix) { "L" }
       let(:actions) do
         RSpec::Sequencing

@@ -430,7 +430,7 @@ module FileWatch
 
     context "when a file is renamed before it gets activated", :unix => true do
       let(:max) { 1 }
-      let(:opts) { super.merge(:file_chunk_count => 8, :file_chunk_size => 6, :close_older => 0.1, :discover_interval => 6) }
+      let(:opts) { super().merge(:file_chunk_count => 8, :file_chunk_size => 6, :close_older => 0.1, :discover_interval => 6) }
       let(:suffix) { "M" }
       let(:start_new_files_at) { :beginning } # we are creating files and sincedb record before hand
       let(:actions) do

@@ -469,7 +469,7 @@ module FileWatch
     end
 
     context "when ignore_older is less than close_older and all files are not expired" do
-      let(:opts) { super.merge(:ignore_older => 1, :close_older => 1.1) }
+      let(:opts) { super().merge(:ignore_older => 1, :close_older => 1.1) }
       let(:suffix) { "N" }
       let(:start_new_files_at) { :beginning }
       let(:actions) do

@@ -497,7 +497,7 @@ module FileWatch
     end
 
     context "when ignore_older is less than close_older and all files are expired" do
-      let(:opts) { super.merge(:ignore_older => 10, :close_older => 1) }
+      let(:opts) { super().merge(:ignore_older => 10, :close_older => 1) }
       let(:suffix) { "P" }
       let(:actions) do
         RSpec::Sequencing

@@ -522,7 +522,7 @@ module FileWatch
     end
 
     context "when ignore older and close older expiry is enabled and after timeout the file is appended-to" do
-      let(:opts) { super.merge(:ignore_older => 20, :close_older => 0.5) }
+      let(:opts) { super().merge(:ignore_older => 20, :close_older => 0.5) }
       let(:suffix) { "Q" }
       let(:actions) do
         RSpec::Sequencing

@@ -551,7 +551,7 @@ module FileWatch
     end
 
     context "when a non default delimiter is specified and it is not in the content" do
-      let(:opts) { super.merge(:ignore_older => 20, :close_older => 1, :delimiter => "\nø") }
+      let(:opts) { super().merge(:ignore_older => 20, :close_older => 1, :delimiter => "\nø") }
       let(:suffix) { "R" }
       let(:actions) do
         RSpec::Sequencing
data/spec/inputs/file_read_spec.rb CHANGED

@@ -267,7 +267,7 @@ describe LogStash::Inputs::File do
   describe 'delete on complete' do
 
     let(:options) do
-      super.merge({ 'file_completed_action' => "delete", 'exit_after_read' => false })
+      super().merge({ 'file_completed_action' => "delete", 'exit_after_read' => false })
     end
 
     let(:sample_file) { File.join(temp_directory, "sample.log") }

@@ -306,7 +306,7 @@ describe LogStash::Inputs::File do
   describe 'sincedb cleanup' do
 
     let(:options) do
-      super.merge(
+      super().merge(
         'sincedb_path' => sincedb_path,
         'sincedb_clean_after' => '1.0 seconds',
         'sincedb_write_interval' => 0.25,

@@ -338,7 +338,7 @@ describe LogStash::Inputs::File do
       sincedb_content = File.read(sincedb_path).strip
       expect( sincedb_content ).to_not be_empty
 
-
+      try(3) do
        sleep(1.5) # > sincedb_clean_after
 
        sincedb_content = File.read(sincedb_path).strip

@@ -363,7 +363,10 @@ describe LogStash::Inputs::File do
     end
   end
 
-  def wait_for_file_removal(path
-
+  def wait_for_file_removal(path)
+    timeout = interval
+    try(5) do
+      wait(timeout).for { File.exist?(path) }.to be_falsey
+    end
   end
 end
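These spec hunks lean on a `try(n)` retry helper (from logstash-devutils) instead of a single long timeout, matching the changelog's "Actually retry tests ... instead of relying on timeout" entry. A hypothetical stand-in for such a helper, to show the shape of the pattern: re-run the block up to `attempts` times, re-raising only if the final attempt still fails.

```ruby
# Hypothetical retry helper (not the devutils implementation): the method
# body is an implicit begin block, so `retry` re-runs the yield.
def try(attempts)
  yield
rescue => e
  attempts -= 1
  retry if attempts > 0
  raise e
end
```

The benefit over a plain `sleep`-then-assert is that a flaky assertion gets several fresh chances, while a genuine failure still surfaces with the real exception.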
data/spec/inputs/file_tail_spec.rb CHANGED

@@ -3,7 +3,9 @@
 require "helpers/spec_helper"
 require "logstash/devutils/rspec/shared_examples"
 require "logstash/inputs/file"
+require "logstash/plugin_mixins/ecs_compatibility_support/spec_helper"
 
+require "json"
 require "tempfile"
 require "stud/temporary"
 require "logstash/codecs/multiline"

@@ -99,41 +101,59 @@ describe LogStash::Inputs::File do
     end
   end
 
-  context "when path and host fields exist" do
-    let(:name) { "C" }
-    it "should not overwrite them" do
-      conf = <<-CONFIG
-        input {
-          file {
-            type => "blah"
-            path => "#{path_path}"
-            start_position => "beginning"
-            sincedb_path => "#{sincedb_path}"
-            delimiter => "#{TEST_FILE_DELIMITER}"
-            codec => "json"
-          }
-        }
-      CONFIG
 
-
-
-
-
+  context "when path and host fields exist", :ecs_compatibility_support do
+    ecs_compatibility_matrix(:disabled, :v1, :v8 => :v1) do |ecs_select|
+
+      before(:each) do
+        allow_any_instance_of(described_class).to receive(:ecs_compatibility).and_return(ecs_compatibility)
       end
 
-
-
+      let(:file_path_target_field ) { ecs_select[disabled: "path", v1: '[log][file][path]'] }
+      let(:source_host_target_field) { ecs_select[disabled: "host", v1: '[host][name]'] }
+
+      let(:event_with_existing) do
+        LogStash::Event.new.tap do |e|
+          e.set(file_path_target_field, 'my_path')
+          e.set(source_host_target_field, 'my_host')
+        end.to_hash
       end
 
-
+      let(:name) { "C" }
+      it "should not overwrite them" do
+        conf = <<-CONFIG
+          input {
+            file {
+              type => "blah"
+              path => "#{path_path}"
+              start_position => "beginning"
+              sincedb_path => "#{sincedb_path}"
+              delimiter => "#{TEST_FILE_DELIMITER}"
+              codec => "json"
+            }
+          }
+        CONFIG
 
-
-
-
+        File.open(tmpfile_path, "w") do |fd|
+          fd.puts(event_with_existing.to_json)
+          fd.puts('{"my_field": "my_val"}')
+          fd.fsync
+        end
 
-
-
-
+        events = input(conf) do |pipeline, queue|
+          2.times.collect { queue.pop }
+        end
+
+        existing_path_index, added_path_index = "my_val" == events[0].get("my_field") ? [1,0] : [0,1]
+
+        expect(events[existing_path_index].get(file_path_target_field)).to eq "my_path"
+        expect(events[existing_path_index].get(source_host_target_field)).to eq "my_host"
+        expect(events[existing_path_index].get("[@metadata][host]")).to eq "#{Socket.gethostname.force_encoding(Encoding::UTF_8)}"
+
+        expect(events[added_path_index].get(file_path_target_field)).to eq "#{tmpfile_path}"
+        expect(events[added_path_index].get(source_host_target_field)).to eq "#{Socket.gethostname.force_encoding(Encoding::UTF_8)}"
+        expect(events[added_path_index].get("[@metadata][host]")).to eq "#{Socket.gethostname.force_encoding(Encoding::UTF_8)}"
+      end
     end
   end
 
metadata CHANGED

@@ -1,14 +1,14 @@
 --- !ruby/object:Gem::Specification
 name: logstash-input-file
 version: !ruby/object:Gem::Version
-  version: 4.2.3
+  version: 4.4.0
 platform: ruby
 authors:
 - Elastic
 autorequire:
 bindir: bin
 cert_chain: []
-date:
+date: 2021-08-04 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
   requirement: !ruby/object:Gem::Requirement

@@ -86,6 +86,20 @@ dependencies:
     - - "~>"
     - !ruby/object:Gem::Version
       version: '3.0'
+- !ruby/object:Gem::Dependency
+  requirement: !ruby/object:Gem::Requirement
+    requirements:
+    - - "~>"
+    - !ruby/object:Gem::Version
+      version: '1.3'
+  name: logstash-mixin-ecs_compatibility_support
+  prerelease: false
+  type: :runtime
+  version_requirements: !ruby/object:Gem::Requirement
+    requirements:
+    - - "~>"
+    - !ruby/object:Gem::Version
+      version: '1.3'
 - !ruby/object:Gem::Dependency
   requirement: !ruby/object:Gem::Requirement
   requirements:

@@ -268,8 +282,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
   - !ruby/object:Gem::Version
     version: '0'
 requirements: []
-
-rubygems_version: 2.6.13
+rubygems_version: 3.1.6
 signing_key:
 specification_version: 4
 summary: Streams events from files