logstash-logger 0.14.1 → 0.15.0

This diff shows the changes between two publicly released versions of the package, as they appear in the public registry. It is provided for informational purposes only.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA1:
-   metadata.gz: 3656436203c1f2ce10f02699e6013463c33f8e70
-   data.tar.gz: ab68643e0fd78501afeb7b4a3f438fb270400b2d
+   metadata.gz: 279993e60535511de3a792fdbd892bb52f6dee55
+   data.tar.gz: c0286204549928ed8caca944c8533a8468250333
  SHA512:
-   metadata.gz: f6d6cca25f3ec1c507a00513f3afdb8f035c1e2d1f55e1f1e1be6ff2b0ce58ffb9ee8bcb5b811de012dcd99d81e0841ea00f3a5ca5d819f7ee1437ecb70a3341
-   data.tar.gz: 11b3fbcec84828b2886956ed28f15ee7dd43c1255324c4179b184a842d9ff40a8a983c67a04c2e5e211ddacdfc09bb4cd546528771643403ca79cbeadd3f3839
+   metadata.gz: 0c2666eacf05151711814889f806b4ad2ea72b497455cab575b6ffa8e61f5decca9866f32d00f24751661707ad9f0d92efa9190feed248601fe7634e880772c5
+   data.tar.gz: 0d650d201b8896905b7cf9b96edf9f58e8d762337e3a9cc9ef35834fad190b0ddc6dce65071de17c988656079bf30f3b8dff5d8f98b4bcd5408511dfd3c95624
data/CHANGELOG.md CHANGED
@@ -1,3 +1,6 @@
+ ## 0.15.0
+ - Adds buffering and automatic retry support for all connectable outputs. [#35](https://github.com/dwbutler/logstash-logger/pull/35)
+
  ## 0.14.1
  - Fixes MultiLogger tagged logging in Rails. [#60](https://github.com/dwbutler/logstash-logger/pull/60)
 
data/README.md CHANGED
@@ -7,12 +7,13 @@ writing to a file or syslog since logstash can receive the structured data direc
 
  ## Features
 
- * Can write directly to logstash over a UDP or TCP/SSL connection.
+ * Can write directly to a logstash listener over a UDP or TCP/SSL connection.
  * Can write to a file, Redis, Kafka, a unix socket, syslog, stdout, or stderr.
- * Writes in logstash JSON format, but supports other formats as well.
- * Can write to multiple outputs.
  * Logger can take a string message, a hash, a `LogStash::Event`, an object, or a JSON string as input.
  * Events are automatically populated with message, timestamp, host, and severity.
+ * Writes in logstash JSON format, but supports other formats as well.
+ * Can write to multiple outputs.
+ * Log messages are buffered and automatically re-sent if there is a connection problem.
  * Easily integrates with Rails via configuration.
 
  ## Installation
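As a quick point of reference for the feature list above, here is a minimal usage sketch. It is not part of the diff; the constructor options mirror the ones used in the gem's own specs, and the host and port values are placeholders.

```ruby
require 'logstash-logger'

# Minimal sketch: a logger that ships events to a logstash UDP input.
# 'localhost' and 5228 are placeholder values, not defaults taken from the gem.
logger = LogStashLogger.new(type: :udp, host: 'localhost', port: 5228)

logger.info 'plain string message'      # becomes a logstash JSON event
logger.info foo: 'bar', user_id: 123    # hash keys become event fields
```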
@@ -234,6 +235,39 @@ This configuration would result in the following output.
  }
  ```
 
+ ## Buffering / Automatic Retries
+
+ Log messages are buffered internally, and automatically re-sent if there is a connection problem.
+ Outputs that support batch writing (Redis and Kafka) will write log messages in bulk from the
+ buffer. This functionality is implemented using
+ [Stud::Buffer](https://github.com/jordansissel/ruby-stud/blob/master/lib/stud/buffer.rb).
+ You can configure its behavior by passing the following options to LogStashLogger:
+
+     :buffer_max_items - Max number of items to buffer before flushing. Defaults to 50.
+     :buffer_max_interval - Max number of seconds to wait between flushes. Defaults to 5.
+
+ You can turn this behavior off by setting `buffer_max_items` to `1` or `sync` to `true`.
+
+ Please be aware of the following caveats to this behavior:
+
+ * It's possible for duplicate log messages to be sent when retrying. For outputs like Redis and
+   Kafka that write in batches, the whole batch could get re-sent. If this is a problem, you
+   can add a UUID field to each event to uniquely identify it. You can either do this
+   in a `customize_event` block, or by using logstash's
+   [UUID filter](https://www.elastic.co/guide/en/logstash/current/plugins-filters-uuid.html).
+ * It's still possible to lose log messages. Ruby won't detect a TCP/UDP connection problem
+   immediately. In my testing, it took Ruby about 4 seconds to notice the receiving end was down
+   and start raising exceptions. Since logstash listeners over TCP/UDP do not acknowledge received
+   messages, it's not possible to know which log messages to re-send.
+ * If your output source is unavailable long enough, writing to the log will block until it is
+   available again. This could make your application unresponsive.
+ * If your application suddenly terminates (for example, by SIGKILL or a power outage), the whole
+   buffer will be lost.
+
+ You can make message loss and application blockage less likely by increasing `buffer_max_items`
+ (so that more events can be held in the buffer), and increasing `buffer_max_interval` (to wait
+ longer between flushes). This will increase memory pressure on your application as log messages
+ accumulate in the buffer, so make sure you have allocated enough memory to your process.
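The duplicate-message caveat above suggests tagging each event with a UUID so retried batches can be de-duplicated downstream. A sketch of the Ruby side of that idea follows; the `customize_event` hook is documented elsewhere in the gem's README rather than in this diff, and the use of `SecureRandom` is an assumption, not something the gem prescribes.

```ruby
require 'securerandom'
require 'logstash-logger'

# Sketch only: stamp every event with a unique id so duplicates produced by a
# re-sent batch can be identified downstream. SecureRandom.uuid is just one
# way to generate the id.
LogStashLogger.configure do |config|
  config.customize_event do |event|
    event['uuid'] = SecureRandom.uuid
  end
end
```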
 
  ## Rails Integration
 
@@ -269,6 +303,12 @@ config.logstash.uri = ENV['LOGSTASH_URI']
  # Optional. Defaults to :json_lines. If there are multiple outputs,
  # they will all share the same formatter.
  config.logstash.formatter = :json_lines
+
+ # Optional, max number of items to buffer before flushing. Defaults to 50
+ config.logstash.buffer_max_items = 50
+
+ # Optional, max number of seconds to wait between flushes. Defaults to 5
+ config.logstash.buffer_max_interval = 5
  ```
 
  #### UDP
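For orientation, the two new `buffer_*` settings sit alongside the Rails options that were already documented. The sketch below is not part of the diff: the surrounding `Rails.application.configure` block, the output type, and the host/port values are illustrative assumptions; only the `buffer_*` lines correspond to what this release adds.

```ruby
# config/environments/production.rb -- illustrative only
Rails.application.configure do
  config.logstash.type = :udp          # any supported output type
  config.logstash.host = 'localhost'   # placeholder
  config.logstash.port = 5228          # placeholder

  # New in 0.15.0: buffering knobs (values shown are the documented defaults)
  config.logstash.buffer_max_items = 50
  config.logstash.buffer_max_interval = 5
end
```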
data/lib/logstash-logger/device/connectable.rb CHANGED
@@ -1,19 +1,45 @@
+ require 'stud/buffer'
+
  module LogStashLogger
    module Device
      class Connectable < Base
-       def write(message)
-         with_connection do
-           super
+       include Stud::Buffer
+
+       def initialize(opts = {})
+         super
+
+         if opts[:batch_events]
+           warn "The :batch_events option is deprecated. Please use :buffer_max_items instead"
          end
+
+         if opts[:batch_timeout]
+           warn "The :batch_timeout option is deprecated. Please use :buffer_max_interval instead"
+         end
+
+         @buffer_max_items = opts[:batch_events] || opts[:buffer_max_items]
+         @buffer_max_interval = opts[:batch_timeout] || opts[:buffer_max_interval]
+
+         buffer_initialize max_items: @buffer_max_items, max_interval: @buffer_max_interval
        end
 
-       def flush
-         return unless connected?
-         with_connection do
-           super
+       def write(message)
+         buffer_receive message
+         buffer_flush(force: true) if @sync
+       end
+
+       def flush(*args)
+         if args.empty?
+           buffer_flush
+         else
+           write_batch(args[0])
          end
        end
 
+       def close
+         buffer_flush(final: true)
+         super
+       end
+
        def to_io
          with_connection do
            @io
@@ -24,7 +50,13 @@ module LogStashLogger
          !!@io
        end
 
-       protected
+       def write_batch(messages)
+         with_connection do
+           messages.each do |message|
+             @io.write(message)
+           end
+         end
+       end
 
        # Implemented by subclasses
        def connect
@@ -38,12 +70,12 @@ module LogStashLogger
 
        # Ensure the block is executed with a valid connection
        def with_connection(&block)
-         connect unless @io
+         connect unless connected?
          yield
        rescue => e
          warn "#{self.class} - #{e.class} - #{e.message}"
-         close
          @io = nil
+         raise
        end
      end
    end
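To make the new control flow concrete: `write` now hands each message to `buffer_receive`, and the buffer is flushed through `flush`/`write_batch` once `buffer_max_items` messages have accumulated or `buffer_max_interval` seconds have passed, with `close` draining whatever is left. Below is a hedged sketch of driving a device directly; the UDP type and option names come from this diff and the README, while the endpoint values are placeholders.

```ruby
require 'logstash-logger'

# Sketch: a buffered UDP device. Messages queue in memory and are written out
# in batches via write_batch; passing sync: true would bypass the buffer.
device = LogStashLogger::Device.new(
  type: :udp, host: 'localhost', port: 5228,  # placeholder endpoint
  buffer_max_items: 100,                      # flush after 100 queued messages...
  buffer_max_interval: 10                     # ...or roughly every 10 seconds
)

device.write('{"message":"hello"}')  # buffered, not necessarily on the wire yet
device.close                         # buffer_flush(final: true) drains the buffer
```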
data/lib/logstash-logger/device/kafka.rb CHANGED
@@ -1,10 +1,8 @@
  require 'poseidon'
- require 'stud/buffer'
 
  module LogStashLogger
    module Device
      class Kafka < Connectable
-       include Stud::Buffer
 
        DEFAULT_HOST = 'localhost'
        DEFAULT_PORT = 9092
@@ -22,11 +20,6 @@ module LogStashLogger
        @topic = opts[:path] || DEFAULT_TOPIC
        @producer = opts[:producer] || DEFAULT_PRODUCER
        @backoff = opts[:backoff] || DEFAULT_BACKOFF
-
-       @batch_events = opts.fetch(:batch_events, 50)
-       @batch_timeout = opts.fetch(:batch_timeout, 5)
-
-       buffer_initialize max_items: @batch_events, max_interval: @batch_timeout
      end
 
      def connect
@@ -56,6 +49,12 @@ module LogStashLogger
        buffer_flush(force: true) if @sync
      end
 
+     def write_batch(messages)
+       with_connection do
+         @io.send_messages messages
+       end
+     end
+
      def close
        buffer_flush(final: true)
        @io && @io.close
@@ -70,9 +69,7 @@ module LogStashLogger
          buffer_flush
        else
          messages = *args.first
-         with_connection do
-           @io.send_messages messages
-         end
+         write_batch(messages)
        end
      end
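For the Kafka device, `write_batch` hands the whole buffer to a single `send_messages` call on the Poseidon producer. A rough standalone equivalent is sketched below, assuming Poseidon's documented producer API; the broker address, client id, and topic are placeholders.

```ruby
require 'poseidon'

# Illustration of the batch path: messages are wrapped as MessageToSend
# objects and the whole array goes to Kafka in one send_messages call.
producer = Poseidon::Producer.new(['localhost:9092'], 'logstash_logger_example')
messages = ['{"message":"first"}', '{"message":"second"}'].map do |json|
  Poseidon::MessageToSend.new('logstash', json)
end
producer.send_messages(messages)
```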
 
data/lib/logstash-logger/device/redis.rb CHANGED
@@ -1,11 +1,8 @@
  require 'redis'
- require 'stud/buffer'
 
  module LogStashLogger
    module Device
      class Redis < Connectable
-       include Stud::Buffer
-
        DEFAULT_LIST = 'logstash'
 
        attr_accessor :list
@@ -17,14 +14,8 @@ module LogStashLogger
        normalize_path(opts)
 
        @redis_options = opts
-
-       @batch_events = opts.fetch(:batch_events, 50)
-       @batch_timeout = opts.fetch(:batch_timeout, 5)
-
-       buffer_initialize max_items: @batch_events, max_interval: @batch_timeout
      end
 
-
      def connect
        @io = ::Redis.new(@redis_options)
      end
@@ -42,6 +33,7 @@ module LogStashLogger
      rescue => e
        warn "#{self.class} - #{e.class} - #{e.message}"
        @io = nil
+       raise
      end
 
      def write(message)
@@ -49,6 +41,12 @@ module LogStashLogger
        buffer_flush(force: true) if @sync
      end
 
+     def write_batch(messages, list = nil)
+       with_connection do
+         @io.rpush(list, messages)
+       end
+     end
+
      def close
        buffer_flush(final: true)
        @io && @io.quit
@@ -63,9 +61,7 @@ module LogStashLogger
          buffer_flush
        else
          messages, list = *args
-         with_connection do
-           @io.rpush(list, messages)
-         end
+         write_batch(messages, list)
        end
      end
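The Redis device's batch path is likewise a single bulk operation: `write_batch` issues one RPUSH of every buffered message onto the list. A rough standalone equivalent with the redis gem is below; the connection options are placeholders, and the list name mirrors the `DEFAULT_LIST` shown above.

```ruby
require 'redis'

# Illustration of the batch path: rpush accepts an array, so the entire
# buffer lands on the 'logstash' list in a single round trip.
redis = Redis.new(host: 'localhost', port: 6379)
messages = ['{"message":"first"}', '{"message":"second"}']
redis.rpush('logstash', messages)
```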
 
data/lib/logstash-logger/device/tcp.rb CHANGED
@@ -16,8 +16,6 @@ module LogStashLogger
        @use_ssl || !@ssl_certificate.nil?
      end
 
-     protected
-
      def connect
        if use_ssl?
          ssl_connect
@@ -28,6 +26,8 @@ module LogStashLogger
        @io
      end
 
+     protected
+
      def non_ssl_connect
        @io = TCPSocket.new(@host, @port).tap do |socket|
          socket.sync = sync unless sync.nil?
data/lib/logstash-logger/version.rb CHANGED
@@ -1,3 +1,3 @@
  module LogStashLogger
-   VERSION = "0.14.1"
+   VERSION = "0.15.0"
  end
data/spec/device/unix_spec.rb CHANGED
@@ -7,6 +7,7 @@ describe LogStashLogger::Device::Unix do
 
    before(:each) do
      allow(::UNIXSocket).to receive(:new) { unix_socket }
+     allow(unix_socket).to receive(:sync=)
    end
 
    it "writes to a local unix socket" do
data/spec/spec_helper.rb CHANGED
@@ -40,7 +40,7 @@ RSpec.shared_context 'logger' do
    let(:port) { PORT }
 
    # The logstash logger
-   let(:logger) { LogStashLogger.new(host: host, port: port, type: connection_type) }
+   let(:logger) { LogStashLogger.new(host: host, port: port, type: connection_type, sync: true) }
    # The log device that the logger writes to
    let(:logdev) { logger.instance_variable_get(:@logdev) }
  end
@@ -48,10 +48,10 @@ end
  RSpec.shared_context 'device' do
    let(:port) { PORT }
    let(:device_with_port) { LogStashLogger::Device.new(port: port) }
-   let(:udp_device) { LogStashLogger::Device.new(type: :udp, port: port) }
-   let(:tcp_device) { LogStashLogger::Device.new(type: :tcp, port: port) }
-   let(:ssl_tcp_device) { LogStashLogger::Device.new(type: :tcp, port: port, ssl_enable: true) }
-   let(:unix_device) { LogStashLogger::Device.new(type: :unix, path: '/tmp/logstash') }
+   let(:udp_device) { LogStashLogger::Device.new(type: :udp, port: port, sync: true) }
+   let(:tcp_device) { LogStashLogger::Device.new(type: :tcp, port: port, sync: true) }
+   let(:ssl_tcp_device) { LogStashLogger::Device.new(type: :tcp, port: port, ssl_enable: true, sync: true) }
+   let(:unix_device) { LogStashLogger::Device.new(type: :unix, path: '/tmp/logstash', sync: true) }
 
    let(:file) { Tempfile.new('test') }
    let(:file_device) { LogStashLogger::Device.new(type: :file, path: file.path)}
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: logstash-logger
  version: !ruby/object:Gem::Version
-   version: 0.14.1
+   version: 0.15.0
  platform: ruby
  authors:
  - David Butler
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2015-09-11 00:00:00.000000000 Z
+ date: 2015-09-21 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
    name: logstash-event