logstash-logger 0.26.1 → 1.0.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (58)
  1. checksums.yaml +4 -4
  2. data/.github/workflows/tests.yml +51 -0
  3. data/.gitignore +1 -0
  4. data/.rubocop.yml +85 -105
  5. data/Appraisals +6 -18
  6. data/CHANGELOG.md +28 -0
  7. data/Gemfile +0 -1
  8. data/README.md +12 -833
  9. data/docs/buffering.md +70 -0
  10. data/docs/customization.md +86 -0
  11. data/docs/outputs.md +42 -0
  12. data/docs/rails.md +344 -0
  13. data/docs/ssl.md +90 -0
  14. data/docs/troubleshooting.md +84 -0
  15. data/docs/usage.md +148 -0
  16. data/gemfiles/{rails_4.2.gemfile → rails_7.2.gemfile} +1 -2
  17. data/gemfiles/{rails_5.0.gemfile → rails_8.0.gemfile} +1 -2
  18. data/gemfiles/{rails_4.0.gemfile → rails_8.1.gemfile} +1 -2
  19. data/lib/logstash-logger/buffer.rb +0 -1
  20. data/lib/logstash-logger/configuration.rb +1 -2
  21. data/lib/logstash-logger/device/aws_stream.rb +1 -1
  22. data/lib/logstash-logger/device/base.rb +2 -2
  23. data/lib/logstash-logger/device/connectable.rb +3 -11
  24. data/lib/logstash-logger/device/file.rb +21 -4
  25. data/lib/logstash-logger/device/http.rb +33 -0
  26. data/lib/logstash-logger/device/kafka.rb +153 -36
  27. data/lib/logstash-logger/device/redis.rb +8 -1
  28. data/lib/logstash-logger/device/tcp.rb +1 -5
  29. data/lib/logstash-logger/device.rb +24 -2
  30. data/lib/logstash-logger/formatter/base.rb +54 -9
  31. data/lib/logstash-logger/formatter/cee_syslog.rb +1 -1
  32. data/lib/logstash-logger/formatter/json.rb +13 -0
  33. data/lib/logstash-logger/formatter/json_lines.rb +13 -0
  34. data/lib/logstash-logger/formatter.rb +14 -6
  35. data/lib/logstash-logger/logger.rb +6 -19
  36. data/lib/logstash-logger/multi_logger.rb +2 -1
  37. data/lib/logstash-logger/railtie.rb +1 -1
  38. data/lib/logstash-logger/tagged_logging.rb +3 -1
  39. data/lib/logstash-logger/version.rb +1 -1
  40. data/logstash-logger.gemspec +9 -1
  41. data/spec/device/file_spec.rb +65 -0
  42. data/spec/device/http_spec.rb +11 -0
  43. data/spec/device/kafka_spec.rb +337 -14
  44. data/spec/device_spec.rb +13 -0
  45. data/spec/formatter/base_spec.rb +46 -1
  46. data/spec/formatter/cee_syslog_spec.rb +3 -3
  47. data/spec/formatter/json_lines_spec.rb +23 -0
  48. data/spec/formatter/json_spec.rb +49 -0
  49. data/spec/formatter_spec.rb +19 -2
  50. data/spec/logger_spec.rb +5 -5
  51. data/spec/multi_logger_spec.rb +16 -0
  52. data/spec/spec_helper.rb +2 -5
  53. data/spec/tagged_logging_spec.rb +15 -0
  54. metadata +89 -16
  55. data/.travis.yml +0 -26
  56. data/gemfiles/rails_3.2.gemfile +0 -9
  57. data/gemfiles/rails_4.1.gemfile +0 -9
  58. data/gemfiles/rails_5.1.gemfile +0 -9
data/docs/troubleshooting.md ADDED
@@ -0,0 +1,84 @@
+ # Troubleshooting
+
+ ## Logstash never receives any logs
+
+ If you are using a device backed by a Ruby IO object (such as a file, UDP socket, or TCP socket), be aware that Ruby
+ keeps its own internal buffer. Although LogStashLogger buffers
+ messages and flushes them periodically, the data written to the IO object can
+ be buffered by Ruby internally indefinitely, and may not be written until the
+ program terminates. If this is a problem or you need to see log messages
+ immediately, your only recourse is to set the `sync: true` option.
+
+ ## JSON::GeneratorError
+
+ Your application is probably attempting to log data that is not encoded in a valid way. When this happens, Ruby's
+ standard JSON library will raise an exception. You may be able to overcome this by swapping in a different JSON encoder
+ such as Oj. Use the [oj_mimic_json](https://github.com/ohler55/oj_mimic_json) gem to use Oj for JSON generation.
+
+ ## No logs getting sent on Heroku
+
+ Heroku recommends installing the [rails_12factor](https://github.com/heroku/rails_12factor) gem so that logs get sent to STDOUT.
+ Unfortunately, this overrides LogStashLogger, preventing logs from being sent to their configured destination. The solution
+ is to remove `rails_12factor` from your Gemfile.
+
+ ## Logging eventually stops in production
+
+ This is most likely not a problem with LogStashLogger, but rather a different gem changing the log level of `Rails.logger`.
+ This is especially likely if you're using a threaded server such as Puma, since gems often change the log level of
+ `Rails.logger` in a non-thread-safe way. See [#17](https://github.com/dwbutler/logstash-logger/issues/17) for more information.
+
+ ## Sometimes two lines of JSON log messages get sent as one message
+
+ If you're using UDP output and writing to a logstash listener, you are most likely encountering a bug in the UDP implementation
+ of the logstash listener. There is no known fix at this time. See [#43](https://github.com/dwbutler/logstash-logger/issues/43)
+ for more information.
+
+ ## Errno::EMSGSIZE - Message too long
+
+ A known drawback of using TCP or UDP is the 65535 byte limit on total message size. To work around
+ this issue, you will have to truncate the message by setting the max message size:
+
+ ```ruby
+ LogStashLogger.configure do |config|
+   config.max_message_size = 2000
+ end
+ ```
+
+ This will truncate only the `message` field of the LogStash Event, so make sure
+ you set the max message size significantly less than 65535 bytes to leave room
+ for other fields.
+
+ # Breaking changes
+
+ ## Version 0.27+
+
+ MRI Ruby < 3.2 is no longer supported, since it has been EOL'ed. If you are on an older version of Ruby, you will need to use 0.26 or below.
+
+ ## Version 0.25+
+
+ Rails 3.2, MRI Ruby < 2.2, and JRuby 1.7 are no longer supported, since they have been
+ EOL'ed. If you are on an older version of Ruby, you will need to use 0.24 or below.
+
+ ## Version 0.5+
+
+ * The `source` event key has been replaced with `host` to better match the latest logstash.
+ * The `(host, port, type)` constructor has been deprecated in favor of an options hash constructor.
+
+ ## Version 0.4+
+
+ `LogStash::Event` uses the v1 format starting with version 1.2. If you're using the v1 format, you'll need to install
+ LogStashLogger version 0.4+. This is not backwards compatible with the old `LogStash::Event` v1.1.5, which uses
+ the v0 format.
+
+ ## Version 0.3+
+
+ Earlier versions of this gem (<= 0.2.1) only implemented a TCP connection.
+ Newer versions (>= 0.3) also implement UDP, and use that as the new default.
+ Please be aware that if you are using the default constructor and still require TCP, you should add an additional argument:
+
+ ```ruby
+ # Now defaults to UDP instead of TCP
+ logger = LogStashLogger.new('localhost', 5228)
+ # Explicitly specify TCP instead of UDP
+ logger = LogStashLogger.new('localhost', 5228, :tcp)
+ ```
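The `max_message_size` truncation described in the troubleshooting doc above can be sketched in plain Ruby. `truncate_message` is a hypothetical helper for illustration, not the gem's implementation:

```ruby
# Hypothetical sketch of message-field truncation (not the gem's code):
# keep the "message" field within a byte budget so the whole UDP datagram
# stays safely under the 65535-byte limit.
def truncate_message(event, max_bytes)
  message = event["message"].to_s
  return event if message.bytesize <= max_bytes

  # byteslice counts bytes, not characters, which is what EMSGSIZE cares about
  event.merge("message" => message.byteslice(0, max_bytes))
end

event = { "message" => "x" * 3000, "severity" => "INFO" }
truncated = truncate_message(event, 2000)
puts truncated["message"].bytesize # => 2000
```

Note that only the `message` field shrinks; timestamp, severity, and custom fields still add to the datagram size, which is why the doc suggests a budget well below 65535.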
data/docs/usage.md ADDED
@@ -0,0 +1,148 @@
+ # Usage Examples
+
+ ## Basic Usage
+
+ ```ruby
+ require 'logstash-logger'
+
+ # Defaults to UDP on 0.0.0.0
+ logger = LogStashLogger.new(port: 5228)
+
+ # Specify host and type (UDP or TCP) explicitly
+ udp_logger = LogStashLogger.new(type: :udp, host: 'localhost', port: 5228)
+ tcp_logger = LogStashLogger.new(type: :tcp, host: 'localhost', port: 5229)
+
+ # Other types of loggers
+ file_logger = LogStashLogger.new(type: :file, path: 'log/development.log', sync: true)
+ unix_logger = LogStashLogger.new(type: :unix, path: '/tmp/sock')
+ syslog_logger = LogStashLogger.new(type: :syslog)
+ redis_logger = LogStashLogger.new(type: :redis)
+ kafka_logger = LogStashLogger.new(type: :kafka)
+ stdout_logger = LogStashLogger.new(type: :stdout)
+ stderr_logger = LogStashLogger.new(type: :stderr)
+ io_logger = LogStashLogger.new(type: :io, io: io)
+ ```
+
+ ## Using Formatters
+
+ ```ruby
+ # Use a different formatter
+ cee_logger = LogStashLogger.new(
+   type: :tcp,
+   host: 'logsene-receiver-syslog.sematext.com',
+   port: 514,
+   formatter: :cee_syslog
+ )
+
+ custom_formatted_logger = LogStashLogger.new(
+   type: :redis,
+   formatter: MyCustomFormatter
+ )
+
+ lambda_formatted_logger = LogStashLogger.new(
+   type: :stdout,
+   formatter: ->(severity, time, progname, msg) { "[#{progname}] #{msg}" }
+ )
+
+ ruby_default_formatter_logger = LogStashLogger.new(
+   type: :file,
+   path: 'log/development.log',
+   formatter: ::Logger::Formatter
+ )
+ ```
+
+ ## Multiple Outputs
+
+ ```ruby
+ # Send messages to multiple outputs. Each output will have the same format.
+ # Syslog cannot be an output because it requires a separate logger.
+ multi_delegating_logger = LogStashLogger.new(
+   type: :multi_delegator,
+   outputs: [
+     { type: :file, path: 'log/development.log' },
+     { type: :udp, host: 'localhost', port: 5228 }
+   ])
+
+ # Balance messages between several outputs.
+ # Works the same as multi delegator, but randomly chooses an output to send each message.
+ balancer_logger = LogStashLogger.new(
+   type: :balancer,
+   outputs: [
+     { type: :udp, host: 'host1', port: 5228 },
+     { type: :udp, host: 'host2', port: 5228 }
+   ])
+
+ # Send messages to multiple loggers.
+ # Use this if you need to send different formats to different outputs.
+ # If you need to log to syslog, you must use this.
+ multi_logger = LogStashLogger.new(
+   type: :multi_logger,
+   outputs: [
+     { type: :file, path: 'log/development.log', formatter: ::Logger::Formatter },
+     { type: :tcp, host: 'localhost', port: 5228, formatter: :json }
+   ])
+ ```
+
+ ## Logging Messages
+
+ ```ruby
+ # The following messages are written to UDP port 5228:
+
+ logger.info 'test'
+ # {"message":"test","@timestamp":"2014-05-22T09:37:19.204-07:00","@version":"1","severity":"INFO","host":"[hostname]"}
+
+ logger.error '{"message": "error"}'
+ # {"message":"error","@timestamp":"2014-05-22T10:10:55.877-07:00","@version":"1","severity":"ERROR","host":"[hostname]"}
+
+ logger.debug message: 'test', foo: 'bar'
+ # {"message":"test","foo":"bar","@timestamp":"2014-05-22T09:43:24.004-07:00","@version":"1","severity":"DEBUG","host":"[hostname]"}
+
+ logger.warn LogStash::Event.new(message: 'test', foo: 'bar')
+ # {"message":"test","foo":"bar","@timestamp":"2014-05-22T16:44:37.364Z","@version":"1","severity":"WARN","host":"[hostname]"}
+
+ # Tagged logging
+ logger.tagged('foo') { logger.fatal('bar') }
+ # {"message":"bar","@timestamp":"2014-05-26T20:35:14.685-07:00","@version":"1","severity":"FATAL","host":"[hostname]","tags":["foo"]}
+ ```
+
+ ## URI Configuration
+
+ You can use a URI to configure your logstash logger instead of a hash. This is useful in environments
+ such as Heroku where you may want to read configuration values from the environment. The URI scheme
+ is `type://host:port/path?key=value`. Some sample URI configurations are given below.
+
+ ```
+ udp://localhost:5228
+ tcp://localhost:5229
+ unix:///tmp/socket
+ file:///path/to/file
+ redis://localhost:6379
+ kafka://localhost:9092
+ stdout:/
+ stderr:/
+ ```
+
+ Pass the URI into your logstash logger like so:
+
+ ```ruby
+ # Read the URI from an environment variable
+ logger = LogStashLogger.new(uri: ENV['LOGSTASH_URI'])
+ ```
+
+ ## What type of logger should I use?
+
+ It depends on your specific needs, but most applications should use the default (UDP). Here are the advantages and
+ disadvantages of each type:
+
+ * UDP is faster than TCP because it's asynchronous (fire-and-forget). However, this means that log messages could get dropped.
+   This is okay for many applications.
+ * TCP verifies that every message has been received via two-way communication. It also supports SSL for secure transmission
+   of log messages over a network. This could slow your app down to a crawl if the TCP listener is under heavy load.
+ * A file is simple to use, but you will have to worry about log rotation and running out of disk space.
+ * Writing to a Unix socket is faster than writing to a TCP or UDP port, but only works locally.
+ * Writing to Redis is good for distributed setups that generate tons of logs. However, you will have another moving part and
+   have to worry about Redis running out of memory.
+ * Writing to stdout is only recommended for debugging purposes.
+
+ For a more detailed discussion of UDP vs TCP, I recommend reading this article:
+ [UDP vs. TCP](http://gafferongames.com/networking-for-game-programmers/udp-vs-tcp/)
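The `type://host:port/path` scheme in the URI Configuration doc above maps directly onto Ruby's stdlib `URI`. A minimal, self-contained sketch of how such a URI might be decomposed into an options hash (`uri_to_options` is a hypothetical helper, not the gem's actual parser):

```ruby
require 'uri'

# Hypothetical helper (not the gem's implementation): decompose a
# "type://host:port/path" configuration URI into an options hash.
def uri_to_options(uri_string)
  uri = URI.parse(uri_string)
  {
    type: uri.scheme.to_sym, # udp, tcp, redis, ...
    host: uri.host,
    port: uri.port,
    path: uri.path
  }
end

opts = uri_to_options('udp://localhost:5228')
puts opts[:type] # => udp
puts opts[:port] # => 5228
```

This is also why reading `ENV['LOGSTASH_URI']` works well on Heroku: the whole output configuration collapses into one string.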
data/gemfiles/{rails_4.2.gemfile → rails_7.2.gemfile} RENAMED
@@ -2,8 +2,7 @@
 
  source "https://rubygems.org"
 
- gem "codecov", require: false, group: :test
  gem "codeclimate-test-reporter", group: :test, require: nil
- gem "rails", "~> 4.2.0"
+ gem "rails", "~> 7.2.0"
 
  gemspec path: "../"
data/gemfiles/{rails_5.0.gemfile → rails_8.0.gemfile} RENAMED
@@ -2,8 +2,7 @@
 
  source "https://rubygems.org"
 
- gem "codecov", require: false, group: :test
  gem "codeclimate-test-reporter", group: :test, require: nil
- gem "rails", "~> 5.0.0"
+ gem "rails", "~> 8.0.0"
 
  gemspec path: "../"
data/gemfiles/{rails_4.0.gemfile → rails_8.1.gemfile} RENAMED
@@ -2,8 +2,7 @@
 
  source "https://rubygems.org"
 
- gem "codecov", require: false, group: :test
  gem "codeclimate-test-reporter", group: :test, require: nil
- gem "rails", "~> 4.0.0"
+ gem "rails", "~> 8.1.0"
 
  gemspec path: "../"
data/lib/logstash-logger/buffer.rb CHANGED
@@ -66,7 +66,6 @@ module LogStashLogger
  # to ensure that all accumulated messages are flushed.
  module Buffer
 
- public
  # Initialize the buffer.
  #
  # Call directly from your constructor if you wish to set some non-default
data/lib/logstash-logger/configuration.rb CHANGED
@@ -13,12 +13,11 @@ module LogStashLogger
  attr_accessor :max_message_size
  attr_accessor :default_error_logger
 
- def initialize(*args)
+ def initialize(*_args)
  @customize_event_block = nil
  @default_error_logger = Logger.new($stderr)
 
  yield self if block_given?
- self
  end
 
  def customize_event(&block)
data/lib/logstash-logger/device/aws_stream.rb CHANGED
@@ -59,7 +59,7 @@ module LogStashLogger
  close(flush: false)
  end
 
- def write_batch(messages, group = nil)
+ def write_batch(messages, _group = nil)
  records = messages.map{ |m| transform_message(m) }
 
  with_connection do
data/lib/logstash-logger/device/base.rb CHANGED
@@ -29,7 +29,7 @@ module LogStashLogger
  end
  end
 
- def write_batch(messages, group = nil)
+ def write_batch(messages, _group = nil)
  messages.each do |message|
  write_one(message)
  end
@@ -43,7 +43,7 @@ module LogStashLogger
  close
  end
 
- def close(opts = {})
+ def close(_opts = {})
  close!
  rescue => e
  log_error(e)
data/lib/logstash-logger/device/connectable.rb CHANGED
@@ -10,17 +10,9 @@ module LogStashLogger
  def initialize(opts = {})
  super
 
- if opts[:batch_events]
- warn "The :batch_events option is deprecated. Please use :buffer_max_items instead"
- end
-
- if opts[:batch_timeout]
- warn "The :batch_timeout option is deprecated. Please use :buffer_max_interval instead"
- end
-
  @buffer_group = nil
- @buffer_max_items = opts[:batch_events] || opts[:buffer_max_items]
- @buffer_max_interval = opts[:batch_timeout] || opts[:buffer_max_interval]
+ @buffer_max_items = opts[:buffer_max_items]
+ @buffer_max_interval = opts[:buffer_max_interval]
  @drop_messages_on_flush_error =
  if opts.key?(:drop_messages_on_flush_error)
  opts.delete(:drop_messages_on_flush_error)
@@ -118,7 +110,7 @@ module LogStashLogger
  end
 
  # Ensure the block is executed with a valid connection
- def with_connection(&block)
+ def with_connection()
  connect unless connected?
  yield
  rescue => e
data/lib/logstash-logger/device/file.rb CHANGED
@@ -1,11 +1,14 @@
  require 'fileutils'
-
  module LogStashLogger
  module Device
  class File < Base
  def initialize(opts)
  super
  @path = opts[:path] || fail(ArgumentError, "Path is required")
+ @shift_age = opts[:shift_age]
+ @shift_size = opts[:shift_size]
+ @shift_period_suffix = opts[:shift_period_suffix]
+ @use_log_device = opts.key?(:shift_age) || opts.key?(:shift_size) || opts.key?(:shift_period_suffix)
  open
  end
 
@@ -14,9 +17,23 @@ module LogStashLogger
  ::FileUtils.mkdir_p ::File.dirname @path
  end
 
- @io = ::File.open @path, ::File::WRONLY | ::File::APPEND | ::File::CREAT
- @io.binmode
- @io.sync = self.sync
+ if @use_log_device
+ require 'logger'
+ log_device_options = { binmode: true }
+ log_device_options[:shift_age] = @shift_age unless @shift_age.nil?
+ log_device_options[:shift_size] = @shift_size unless @shift_size.nil?
+ log_device_options[:shift_period_suffix] = @shift_period_suffix unless @shift_period_suffix.nil?
+ @io = ::Logger::LogDevice.new(@path, **log_device_options)
+ @io.dev.sync = self.sync unless self.sync.nil?
+ else
+ @io = ::File.open @path, ::File::WRONLY | ::File::APPEND | ::File::CREAT
+ @io.binmode
+ @io.sync = self.sync unless self.sync.nil?
+ end
+ end
+
+ def to_io
+ @io.respond_to?(:dev) ? @io.dev : @io
  end
  end
  end
data/lib/logstash-logger/device/http.rb ADDED
@@ -0,0 +1,33 @@
+ require 'uri'
+ require 'net/http'
+
+ module LogStashLogger
+ module Device
+
+ # Rudimentary write to Logstash HTTP Input relying on buffering
+ # rather than persistent HTTP connections for efficiency.
+ class HTTP < Connectable
+
+ def initialize(opts)
+ super
+ @url = URI(opts[:url])
+ end
+
+ def connect
+ # no-op
+ end
+
+ def write_one(message)
+ write_batch([message])
+ end
+
+ def write_batch(messages, _group = nil)
+ # Logstash HTTP input expects JSON array instead of lines of JSON
+ body = "[#{messages.join(',')}]"
+ resp = Net::HTTP.post @url, body, {"Content-Type" => "application/json"}
+ raise resp.message if Net::HTTPError === resp
+ end
+
+ end
+ end
+ end
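The `write_batch` above relies on each buffered message already being a serialized JSON object (as the json_lines formatter produces), so joining with commas and wrapping in brackets yields a valid JSON array. That body construction can be checked in isolation with plain Ruby:

```ruby
require 'json'

# Each buffered message is already a serialized JSON object...
messages = [
  { message: 'first',  severity: 'INFO' }.to_json,
  { message: 'second', severity: 'WARN' }.to_json
]

# ...so the HTTP device's body is a plain string join, no re-encoding needed.
body = "[#{messages.join(',')}]"
parsed = JSON.parse(body)

puts parsed.length        # => 2
puts parsed[0]['message'] # => first
```

The trade-off named in the class comment is visible here: one POST per flushed buffer instead of a persistent connection per message.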
data/lib/logstash-logger/device/kafka.rb CHANGED
@@ -1,56 +1,173 @@
- require 'poseidon'
-
  module LogStashLogger
  module Device
  class Kafka < Connectable
+ class TLSConfiguration
+ attr_reader :ssl_ca_cert, :ssl_client_cert, :ssl_client_cert_key
+
+ def initialize(opts = {})
+ @ssl_ca_cert = opts[:ssl_ca_cert]
+ @ssl_client_cert = opts[:ssl_client_cert]
+ @ssl_client_cert_key = opts[:ssl_client_cert_key]
+ end
+
+ def cert_bundle
+ @cert_bundle ||= all_cert_params? ? cert_params_as_hash : {}
+ end
+
+ def valid?
+ all_cert_params? || no_cert_params?
+ end
 
- DEFAULT_HOST = 'localhost'
- DEFAULT_PORT = 9092
- DEFAULT_TOPIC = 'logstash'
- DEFAULT_PRODUCER = 'logstash-logger'
- DEFAULT_BACKOFF = 1
-
- attr_accessor :hosts, :topic, :producer, :backoff
-
- def initialize(opts)
- super
- host = opts[:host] || DEFAULT_HOST
- port = opts[:port] || DEFAULT_PORT
- @hosts = opts[:hosts] || host.split(',').map { |h| "#{h}:#{port}" }
- @topic = opts[:path] || DEFAULT_TOPIC
- @producer = opts[:producer] || DEFAULT_PRODUCER
- @backoff = opts[:backoff] || DEFAULT_BACKOFF
+ def invalid?
+ !valid?
+ end
+
+ private
+
+ def cert_params_as_hash
+ { ssl_ca_cert: @ssl_ca_cert,
+ ssl_client_cert: @ssl_client_cert,
+ ssl_client_cert_key: @ssl_client_cert_key,
+ }
+ end
+
+ def all_cert_params?
+ cert_params_as_hash.values.compact.length == valid_cert_params_length
+ end
+
+ def no_cert_params?
+ cert_params_as_hash.values.compact.empty?
+ end
+
+ def valid_cert_params_length
+ cert_params_as_hash.keys.length
+ end
+ end
+
+ attr_reader :topic, :brokers, :cert_bundle, :kafka_tls_configurator,
+ :client_id
+
+ def initialize(opts = {}, kafka_tls_configurator = TLSConfiguration)
+ require 'ruby-kafka'
+ super(opts)
+
+ opts = normalize_uri_options(opts)
+ @client_id = opts[:client_id]
+ @topic = normalize_topic(opts[:topic])
  @buffer_group = @topic
+ @kafka_tls_configurator = kafka_tls_configurator
+ @brokers = make_brokers_array(opts[:brokers])
+ raise_no_brokers_set! if @brokers.empty?
+ make_cert_bundle(opts)
+ end
+
+ def connection
+ @io ||= ::Kafka.new(**kafka_client_connection_hash)
  end
 
  def connect
- @io = ::Poseidon::Producer.new(@hosts, @producer)
+ @io = connection
  end
 
- def with_connection
- connect unless connected?
- yield
- rescue ::Poseidon::Errors::ChecksumError, Poseidon::Errors::UnableToFetchMetadata => e
- log_error(e)
- log_warning("reconnect/retry")
- sleep backoff if backoff
- reconnect
- retry
- rescue => e
- log_error(e)
- log_warning("giving up")
- close(flush: false)
+ def write_one(message, topic=nil)
+ topic ||= @topic
+ write_messages_to_broker_and_deliver do |producer|
+ producer.produce(message, topic: topic)
+ end
  end
 
- def write_batch(messages, topic = nil)
+ def write_batch(messages, topic=nil)
  topic ||= @topic
+ write_messages_to_broker_and_deliver do |producer|
+ messages.each {|msg| producer.produce(msg, topic: topic) }
+ end
+ end
+
+ private
+
+ def write_messages_to_broker_and_deliver(&block)
  with_connection do
- @io.send_messages messages.map { |message| Poseidon::MessageToSend.new(topic, message) }
+ kproducer = producer
+ block.call(kproducer) if block_given?
+ kproducer.deliver_messages
+ end
+ end
+
+ def close!
+ begin
+ if @producer
+ if @producer.respond_to?(:shutdown)
+ @producer.shutdown
+ elsif @producer.respond_to?(:close)
+ @producer.close
+ end
+ end
+ ensure
+ @producer = nil
+ super
+ end
+ end
+
+ def kafka_client_connection_hash
+ { seed_brokers: @brokers,
+ client_id: @client_id,
+ }.merge(@cert_bundle)
+ end
+
+ def producer
+ @producer ||= (connected? ? @io : connection).producer
+ end
+
+ def raise_no_topic_set!
+ fail ArgumentError, "a topic must be configured"
+ end
+
+ def raise_no_brokers_set!
+ fail ArgumentError, "brokers must be configured"
+ end
+
+ def normalize_topic(topic)
+ normalized = topic.to_s.strip
+ return raise_no_topic_set! if normalized.empty?
+ normalized
+ end
+
+ def make_brokers_array(opt)
+ brokers =
+ case opt
+ when Array
+ opt.flatten
+ when String
+ opt.split(/\s+/)
+ else
+ []
+ end
+ brokers.compact.map(&:to_s).reject(&:empty?)
+ end
+
+ def normalize_uri_options(opts)
+ normalized = opts.dup
+ if normalized[:brokers].nil?
+ if normalized[:hosts]
+ normalized[:brokers] = normalized[:hosts]
+ elsif normalized[:host]
+ broker = normalized[:host]
+ broker = "#{broker}:#{normalized[:port]}" if normalized[:port]
+ normalized[:brokers] = broker
+ end
+ end
+ if normalized[:topic].nil? && normalized[:path]
+ normalized[:topic] = normalized[:path].to_s.sub(%r{\A/}, '')
  end
+ normalized
  end
 
- def write_one(message, topic = nil)
- write_batch([message], topic)
+ def make_cert_bundle(opts)
+ tls_conf = kafka_tls_configurator.new(opts)
+ if tls_conf.invalid?
+ fail ArgumentError, "all ssl parameters (ssl_ca_cert, ssl_client_cert and ssl_client_cert_key) are required, or use none of them to disable TLS"
+ end
+ @cert_bundle ||= tls_conf.cert_bundle
  end
  end
  end
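The broker handling introduced above (`make_brokers_array` plus the URI fallbacks in `normalize_uri_options`) is pure Ruby and can be traced in isolation. This is a standalone restatement of that logic for illustration, not a test of the gem itself:

```ruby
# Standalone restatement of make_brokers_array's case logic:
# accept an Array or a whitespace-separated String, reject blanks.
def make_brokers_array(opt)
  brokers =
    case opt
    when Array then opt.flatten
    when String then opt.split(/\s+/)
    else []
    end
  brokers.compact.map(&:to_s).reject(&:empty?)
end

puts make_brokers_array("kafka1:9092 kafka2:9092").inspect # => ["kafka1:9092", "kafka2:9092"]
puts make_brokers_array([["kafka1:9092"], nil]).inspect    # => ["kafka1:9092"]
puts make_brokers_array(nil).inspect                        # => []
```

The empty-array case is what triggers `raise_no_brokers_set!` in the constructor, so a `kafka://` URI with no host fails fast instead of timing out on connect.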
data/lib/logstash-logger/device/redis.rb CHANGED
@@ -13,6 +13,7 @@ module LogStashLogger
  @buffer_group = @list
 
  normalize_path(opts)
+ delete_unknown_keywords(opts)
 
  @redis_options = opts
  end
@@ -59,11 +60,17 @@ module LogStashLogger
  def normalize_path(opts)
  path = opts.fetch(:path, nil)
  if path
- opts[:db] = path.gsub("/", "").to_i unless path.empty?
+ opts[:db] = path.delete("/").to_i unless path.empty?
  opts.delete(:path)
  end
  end
 
+ def delete_unknown_keywords(opts)
+ # Redis client versions >= 5 log continuous errors for unknown keywords.
+ # At present the only such keyword is :sync, so remove it before
+ # passing the options through.
+ opts.delete(:sync)
+ end
  end
  end
  end
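The `path.gsub("/", "")` → `path.delete("/")` change in `normalize_path` above is behavior-preserving for the Redis URI paths it sees, such as the `/2` in `redis://localhost:6379/2`. A quick check in plain Ruby:

```ruby
# A URI such as redis://localhost:6379/2 yields path "/2";
# String#delete strips the slash and to_i picks the database number,
# exactly as gsub("/", "") did before.
path = "/2"
db = path.delete("/").to_i
puts db # => 2

# Both methods agree on this input.
puts path.gsub("/", "").to_i == path.delete("/").to_i # => true
```

`delete` is simply the more direct (and allocation-friendly) method when removing fixed characters rather than substituting a pattern.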
data/lib/logstash-logger/device/tcp.rb CHANGED
@@ -12,10 +12,6 @@ module LogStashLogger
  @ssl_context = opts[:ssl_context]
  @use_ssl = !!(@ssl_certificate || opts[:ssl_context])
  @use_ssl = opts[:ssl_enable] if opts.has_key? :ssl_enable
- if opts.has_key?(:use_ssl)
- @use_ssl = opts[:use_ssl]
- warn "[LogStashLogger] The use_ssl option is deprecated. Use ssl_enable instead."
- end
  @verify_hostname = opts.fetch(:verify_hostname, true)
  end
 
@@ -41,7 +37,7 @@ module LogStashLogger
  ssl_io
  else
  tcp_io
- end
+ end
  end
 
  protected