logdna 1.2.0 → 1.5.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
- SHA1:
- metadata.gz: 6222bf49a91e32f57729e06f85badea969f6bacb
- data.tar.gz: 0b33b7292fa1072b81f978f07f331b369eabc9ee
+ SHA256:
+ metadata.gz: 32488a458ed8004dcb65531121c286ae1df1b5b3586cf146bb5e973e24e0f559
+ data.tar.gz: 03f487fbff81de61177296ec51849659c66391d7f16a00a2d170e7d03971eaf9
  SHA512:
- metadata.gz: 73b0894e04ca5246b7d2906a8b333f947b4c43c750938f7cb8c06ae3610bedd1b43bc3bc94a661ff4a54bf768279d1404abd4d76f258893aba3effb7dc09afe8
- data.tar.gz: 5d47c9ec27b2e620c7967d03312e1209b47cfb57107f480613f611e85d98400ec120572e4788d55d9301740c9a302daa7d43bcf7eb79cb5ba362bddc87d7322a
+ metadata.gz: 5a525a02bc844af91ecc52e324ebc9c1323c26ff0fada9cfb32092b9d57f84b7c8dacd7239e8ffb4d36caca3b89533b0fac577a17e3264643c2678368b7abc10
+ data.tar.gz: fa9721cf9605ab44e08d6222a3d8a222c0c0b3eca7802db673ab7a2c3c8bc840ddb060197a39ddb7b5aa149b93fc26d197c09c7c10a54e84f4579d00f7b8ed61
@@ -1,6 +1,6 @@
  The MIT License (MIT)

- Copyright (c) 2016 edwin-lai
+ Copyright (c) 2019 LogDNA

  Permission is hereby granted, free of charge, to any person obtaining a copy
  of this software and associated documentation files (the "Software"), to deal
@@ -9,13 +9,13 @@ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
  copies of the Software, and to permit persons to whom the Software is
  furnished to do so, subject to the following conditions:

- The above copyright notice and this permission notice shall be included in
- all copies or substantial portions of the Software.
+ The above copyright notice and this permission notice shall be included in all
+ copies or substantial portions of the Software.

  THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
  IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
  FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
  AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
  LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
- OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
- THE SOFTWARE.
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+ SOFTWARE.
data/README.md CHANGED
@@ -48,9 +48,10 @@ Options are optional variables that may contain hostname, app name, mac address,
  :ip => myIpAddress,
  :mac => myMacAddress,
  :app => myAppName,
- :level => "INFO", # LOG_LEVELS = ['TRACE', 'DEBUG', 'INFO', 'WARN', 'ERROR', 'FATAL'] or your customized log level
+ :level => "INFO", # LOG_LEVELS = ['TRACE', 'DEBUG', 'INFO', 'WARN', 'ERROR', 'FATAL'] or your customized log level (custom levels for Rails have to be sent with a log message)
  :env => "PRODUCTION",
- :meta => {:once => {:first => "nested1", :another => "nested2"}}
+ :meta => {:once => {:first => "nested1", :another => "nested2"}},
+ :endpoint => "https://fqdn/logs/ingest"
  }

  To send logs, use "log" method. Default log level is "INFO"
@@ -80,7 +81,6 @@ Clear current metadata, level, appname, environment
  logger.clear

  Check current log level:
-
  logger.info? => true
  logger.warn? => false

@@ -93,12 +93,19 @@ Log a message with a particular level easily

  Hostname and app name cannot be more than 80 characters.

+ ### Rails Setup
+ In your `config/environments/environment.rb`:
+
+ ```
+ Rails.application.configure do
+ config.logger = Logdna::Ruby.new(your_api_key, options)
+ end
+ ```

  # Important Notes

  1. This logger assumes that you pass in json formatted data
- 2. This logger is a singleton (do not create mutiple instances of the logger) even though the singleton structure is not strongly enforced.
-
+ 2. This logger is a singleton (do not create mutiple instances of the logger) even though the singleton structure is not strongly enforced.

  # API

@@ -115,10 +122,15 @@ Instantiates a new instance of the class it is called on. ingestion_key is requi
  |{ :level => Log level } | 'INFO' |
  |{ :env => STAGING, PRODUCTION .. etc} | Nil |
  |{ :meta => metadata} | Nil |
- |{ :flushtime => Log flush interval in seconds } | 0.25 seconds |
- |{ :flushbyte => Log flush upper limit in bytes } | 500000 bytes ~= 0.5 megabytes |
-
- Different log level displays log messages in different colors as well.
+ |{ :endpoint => LogDNA Ingestion URI | 'https://logs.logdna.com/logs/ingest' |
+ |{ :flush_interval => Limit to trigger a flush in seconds } | 0.25 seconds |
+ |{ :flush_size => Limit to trigger a flush in bytes } | 2097152 bytes = 2 MiB |
+ |{ :request_size => Upper limit of request in bytes } | 2097152 bytes = 2 MiB |
+ |{ :retry_timeout => Base timeout for retries in seconds } | 0.25 seconds |
+ |{ :retry_max_attempts => Maximum number of retries per request } | 3 attempts |
+ |{ :retry_max_jitter => Maximum amount of jitter to add to each retry request in seconds } | 0.25 seconds |
+
+ Different log level displays log messages in different colors as well.
  - ![TRACE DEBUG INFO Colors](https://placehold.it/15/515151/000000?text=+) "Trace" "Debug" "Info"
  - ![WARN Color](https://placehold.it/15/ec9563/000000?text=+) "Warn"
  - ![ERROR Fatal Colors](https://placehold.it/15/e37e7d/000000?text=+) "Error" "Fatal"
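The README hunks above replace the old `:flushtime`/`:flushbyte` options with `:endpoint`, `:flush_interval`, `:flush_size`, `:request_size`, and the `:retry_*` settings. A minimal usage sketch of the 1.5.0 configuration, assuming the gem is installed and a valid ingestion key is available in `LOGDNA_INGESTION_KEY`; the hostname, app name, and values are placeholders, not taken from the diff:

```ruby
require "logdna"

# Placeholder values; the defaults from the README table apply to anything omitted.
options = {
  :hostname => "my-host",
  :app => "my-app",
  :level => "INFO",
  :env => "PRODUCTION",
  :endpoint => "https://logs.logdna.com/logs/ingest", # default ingestion URI
  :flush_interval => 0.25,          # seconds before a scheduled flush fires
  :flush_size => 2 * 1024 * 1024,   # buffered bytes that trigger an immediate flush
  :request_size => 2 * 1024 * 1024, # upper limit of a single request body, in bytes
  :retry_timeout => 0.25,           # base backoff between retries, in seconds
  :retry_max_attempts => 3,
  :retry_max_jitter => 0.25         # random jitter added to each retry, in seconds
}

logger = Logdna::Ruby.new(ENV["LOGDNA_INGESTION_KEY"], options)
logger.log("Hello from the 1.5.0 logger")
```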
data/lib/logdna.rb CHANGED
@@ -1,157 +1,133 @@
- #!/usr/bin/env ruby
- # encoding: utf-8
- # require 'singleton'
- require 'socket'
- require 'uri'
- require_relative 'logdna/client.rb'
- require_relative 'logdna/resources.rb'
+ # frozen_string_literal: true
+
+ require "logger"
+ require "socket"
+ require "uri"
+ require_relative "logdna/client"
+ require_relative "logdna/resources"
+ require_relative "logdna/version"
+
  module Logdna
  class ValidURLRequired < ArgumentError; end
+
  class MaxLengthExceeded < ArgumentError; end

  class Ruby < ::Logger
  # uncomment line below and line 3 to enforce singleton
  # include Singleton
  Logger::TRACE = 5
- attr_accessor :level, :app, :env, :meta
+ attr_accessor :app, :env, :meta

- def initialize(key, opts={})
- @app = opts[:app] || 'default'
- @level = opts[:level] || 'INFO'
+ def initialize(key, opts = {})
+ super(nil, nil, nil)
+ @app = opts[:app] || "default"
+ @log_level = opts[:level] || "INFO"
  @env = opts[:env]
  @meta = opts[:meta]
- @@client = nil unless defined? @@client
-
+ @internal_logger = Logger.new($stdout)
+ @internal_logger.level = Logger::DEBUG
+ endpoint = opts[:endpoint] || Resources::ENDPOINT
  hostname = opts[:hostname] || Socket.gethostname
- ip = opts.key?(:ip) ? "&ip=#{opts[:ip]}" : ''
- mac = opts.key?(:mac) ? "&mac=#{opts[:mac]}" : ''
- url = "#{Resources::ENDPOINT}?hostname=#{hostname}#{mac}#{ip}"
-
- begin
- if (hostname.size > Resources::MAX_INPUT_LENGTH || @app.size > Resources::MAX_INPUT_LENGTH )
- raise MaxLengthExceeded.new
- end
- rescue MaxLengthExceeded => e
- puts "Hostname or Appname is over #{Resources::MAX_INPUT_LENGTH} characters"
- handle_exception(e)
- return
- end

- begin
- uri = URI(url)
- rescue URI::ValidURIRequired => e
- puts "Invalid URL Endpoint: #{url}"
- handle_exception(e)
+ if hostname.size > Resources::MAX_INPUT_LENGTH || @app.size > Resources::MAX_INPUT_LENGTH
+ @internal_logger.debug("Hostname or Appname is over #{Resources::MAX_INPUT_LENGTH} characters")
  return
  end

- begin
- request = Net::HTTP::Post.new(uri.request_uri, 'Content-Type' => 'application/json')
- request.basic_auth 'username', key
- rescue => e
- handle_exception(e)
- return
- end
-
- @@client = Logdna::Client.new(request, uri, opts)
- end
+ ip = opts.key?(:ip) ? "&ip=#{opts[:ip]}" : ""
+ mac = opts.key?(:mac) ? "&mac=#{opts[:mac]}" : ""
+ url = "#{endpoint}?hostname=#{hostname}#{mac}#{ip}"
+ uri = URI(url)

- def handle_exception(e)
- exception_message = e.message
- exception_backtrace = e.backtrace
- # NOTE: should log with Ruby logger?
- puts exception_message
+ request = Net::HTTP::Post.new(uri.request_uri, "Content-Type" => "application/json")
+ request.basic_auth("username", key)
+ request[:'user-agent'] = opts[:'user-agent'] || "ruby/#{LogDNA::VERSION}"
+ @client = Logdna::Client.new(request, uri, opts)
  end

  def default_opts
  {
  app: @app,
- level: @level,
+ level: @log_level,
  env: @env,
  meta: @meta,
  }
  end

+ def level
+ @log_level
+ end
+
  def level=(value)
  if value.is_a? Numeric
- @level = Resources::LOG_LEVELS[value]
+ @log_level = Resources::LOG_LEVELS[value]
  return
  end

- @level = value
+ @log_level = value
  end

- def log(msg=nil, opts={})
- loggerExist?
- message = yield if msg.nil? && block_given?
- @response = @@client.buffer(message, default_opts.merge(opts).merge({
- timestamp: (Time.now.to_f * 1000).to_i
- }))
- 'Saved'
+ def log(message = nil, opts = {})
+ if message.nil? && block_given?
+ message = yield
+ end
+ if message.nil?
+ @internal_logger.debug("provide either a message or block")
+ return
+ end
+ message = message.to_s.encode("UTF-8")
+ @client.write_to_buffer(message, default_opts.merge(opts).merge(
+ timestamp: (Time.now.to_f * 1000).to_i
+ ))
  end

  Resources::LOG_LEVELS.each do |lvl|
  name = lvl.downcase

- define_method name do |msg=nil, opts={}, &block|
- self.log(msg, opts.merge({
- level: lvl,
- }), &block)
+ define_method name do |msg = nil, opts = {}, &block|
+ self.log(msg, opts.merge(
+ level: lvl
+ ), &block)
  end

  define_method "#{name}?" do
- return Resources::LOG_LEVELS[self.level] == lvl if self.level.is_a? Numeric
+ return Resources::LOG_LEVELS[self.level] == lvl if level.is_a? Numeric
+
  self.level == lvl
  end
  end

  def clear
- @app = 'default'
- @level = 'INFO'
+ @app = "default"
+ @log_level = "INFO"
  @env = nil
  @meta = nil
  end

- def loggerExist?
- if @@client.nil?
- puts "Logger Not Initialized Yet"
- close
- end
- end
-
- def <<(msg=nil, opts={})
- self.log(msg, opts.merge({
- level: '',
- }))
+ def <<(msg = nil, opts = {})
+ log(msg, opts.merge(
+ level: ""
+ ))
  end

- def add(*arg)
- puts "add not supported in LogDNA logger"
- return false
+ def add(*_arg)
+ @internal_logger.debug("add not supported in LogDNA logger")
+ false
  end

- def unknown(msg=nil, opts={})
- self.log(msg, opts.merge({
- level: 'UNKNOWN',
- }))
+ def unknown(msg = nil, opts = {})
+ log(msg, opts.merge(
+ level: "UNKNOWN"
+ ))
  end

- def datetime_format(*arg)
- puts "datetime_format not supported in LogDNA logger"
- return false
+ def datetime_format(*_arg)
+ @internal_logger.debug("datetime_format not supported in LogDNA logger")
+ false
  end

-
  def close
- if defined? @@client and !@@client.nil?
- @@client.exitout()
- end
- end
-
- at_exit do
- if defined? @@client and !@@client.nil?
- @@client.exitout()
- end
+ @client&.exitout
  end
  end
  end
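The rewritten `lib/logdna.rb` above generates one helper and one predicate per entry in `Resources::LOG_LEVELS`, lets the message come from a block, and maps numeric levels through `Resources::LOG_LEVELS`. An illustrative sketch of that surface, assuming a configured logger as in the previous example; the `# =>` annotations are expected return values, not output:

```ruby
require "logdna"
require "logger"

logger = Logdna::Ruby.new(ENV["LOGDNA_INGESTION_KEY"], app: "example")

logger.warn("sent at WARN level")               # generated for each LOG_LEVELS entry
logger.debug { "built in a block #{Time.now}" } # block supplies the message when no argument is given
logger << "sent with an empty level"            # << logs with level: ""

logger.level = Logger::INFO   # numeric values are mapped through Resources::LOG_LEVELS
logger.info?                  # => true
logger.trace?                 # => false

logger.close                  # flushes any buffered lines via the client before exit
```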
data/lib/logdna/client.rb CHANGED
@@ -1,47 +1,61 @@
- require 'net/http'
- require 'socket'
- require 'json'
- require 'concurrent'
- require 'thread'
+ # frozen_string_literal: true
+
+ require "etc"
+ require "net/http"
+ require "socket"
+ require "json"
+ require "concurrent"
+ require "date"
+ require "securerandom"

  module Logdna
- class Client
+ Message = Struct.new(:source, :running_size)

+ class Client
  def initialize(request, uri, opts)
  @uri = uri

  # NOTE: buffer is in memory
- @buffer = StringIO.new
- @messages = []
- @buffer_over_limit = false
-
- @side_buffer = StringIO.new
- @side_messages = []
+ @buffer = []

  @lock = Mutex.new
- @task = nil

- # NOTE: the byte limit only affects the message, not the entire message_hash
- @actual_byte_limit = opts[:flushbyte] ||= Resources::FLUSH_BYTE_LIMIT
- @actual_flush_interval = opts[:flushtime] ||= Resources::FLUSH_INTERVAL
+ @flush_interval = opts[:flush_interval] || Resources::FLUSH_INTERVAL
+ @flush_size = opts[:flush_size] || Resources::FLUSH_SIZE
+
+ @request = request
+ @request_size = opts[:request_size] || Resources::REQUEST_SIZE
+
+ @retry_timeout = opts[:retry_timeout] || Resources::RETRY_TIMEOUT
+ @retry_max_jitter = opts[:retry_max_jitter] || Resources::RETRY_MAX_JITTER
+ @retry_max_attempts = opts[:retry_max_attempts] || Resources::RETRY_MAX_ATTEMPTS

- @@request = request
+ @internal_logger = Logger.new($stdout)
+ @internal_logger.level = Logger::DEBUG
+
+ @work_thread_pool = Concurrent::FixedThreadPool.new(Etc.nprocessors)
+ # TODO: Expose an option to configure the maximum concurrent requests
+ # Requires the instance-global request to be resolved first
+ @request_thread_pool = Concurrent::FixedThreadPool.new(Resources::MAX_CONCURRENT_REQUESTS)
+
+ @scheduled_flush = nil
  end

- def encode_message(msg)
- msg = msg.to_s unless msg.instance_of? String
+ def schedule_flush
+ if @scheduled_flush.nil? || @scheduled_flush.complete?
+ @scheduled_flush = Concurrent::ScheduledTask.execute(@flush_interval) { flush }
+ end
+ end

- begin
- msg = msg.encode("UTF-8")
- rescue Encoding::UndefinedConversionError => e
- # NOTE: should this be raised or handled silently?
- # raise e
+ def unschedule_flush
+ if !@scheduled_flush.nil?
+ @scheduled_flush.cancel
+ @scheduled_flush = nil
  end
- msg
  end

- def message_hash(msg, opts={})
- obj = {
+ def process_message(msg, opts = {})
+ processed_message = {
  line: msg,
  app: opts[:app],
  level: opts[:level],
@@ -49,116 +63,149 @@ module Logdna
  meta: opts[:meta],
  timestamp: Time.now.to_i,
  }
- obj.delete(:meta) if obj[:meta].nil?
- obj
+ processed_message.delete(:meta) if processed_message[:meta].nil?
+ processed_message
  end

- def create_flush_task
- return @task unless @task.nil? or !@task.running?
-
- t = Concurrent::TimerTask.new(execution_interval: @actual_flush_interval, timeout_interval: Resources::TIMER_OUT) do |task|
- if @messages.any?
- # keep running if there are queued messages, but don't flush
- # because the buffer is being flushed due to being over the limit
- unless @buffer_over_limit
- flush()
- end
- else
- # no messages means we can kill the task
- task.kill
- end
- end
- t.execute
+ def write_to_buffer(msg, opts)
+ Concurrent::Future.execute({ executor: @work_thread_pool }) { write_to_buffer_sync(msg, opts) }
  end

- def check_side_buffer
- return if @side_buffer.size == 0
-
- @buffer.write(@side_buffer.string)
- @side_buffer.truncate(0)
- queued_side_messages = @side_messages
- @side_messages = []
- queued_side_messages.each { |message_hash_obj| @messages.push(message_hash_obj) }
- end
+ def write_to_buffer_sync(msg, opts)
+ processed_message = process_message(msg, opts)
+ message_size = processed_message.to_s.bytesize

+ running_size = @lock.synchronize do
+ running_size = message_size
+ if @buffer.any?
+ running_size += @buffer[-1].running_size
+ end
+ @buffer.push(Message.new(processed_message, running_size))

- # this should always be running synchronously within this thread
- def buffer(msg, opts)
- buffer_size = write_to_buffer(msg, opts)
- unless buffer_size.nil?
- process_buffer(buffer_size)
+ running_size
  end
- end

- def write_to_buffer(msg, opts)
- return if msg.nil?
- msg = encode_message(msg)
-
- if @lock.locked?
- @side_buffer.write(msg)
- @side_messages.push(message_hash(msg, opts))
- return
+ if running_size >= @flush_size
+ unschedule_flush
+ flush_sync
+ else
+ schedule_flush
  end
-
- check_side_buffer
- buffer_size = @buffer.write(msg)
- @messages.push(message_hash(msg, opts))
- buffer_size
  end

- def queue_to_buffer(queue=@queue)
- next_object = queue.shift
- write_to_buffer(next_object[:msg], next_object[:opts])
+ ##
+ # Flushes all logs to LogDNA asynchronously
+ def flush(options = {})
+ Concurrent::Future.execute({ executor: @work_thread_pool }) { flush_sync(options) }
  end

- def process_buffer(buffer_size)
- if buffer_size > @actual_byte_limit
- @buffer_over_limit = true
- flush()
- @buffer_over_limit = false
- else
- @task = create_flush_task
+ ##
+ # Flushes all logs to LogDNA synchronously
+ def flush_sync(options = {})
+ slices = @lock.synchronize do
+ # Slice the buffer into chunks that try to be no larger than @request_size. Slice points are found with
+ # a binary search thanks to the structure of @buffer. We are working backwards because it's cheaper to
+ # remove from the tail of an array instead of the head
+ slices = []
+ until @buffer.empty?
+ search_size = @buffer[-1].running_size - @request_size
+ if search_size.negative?
+ search_size = 0
+ end
+
+ slice_index = @buffer.bsearch_index { |message| message.running_size >= search_size }
+ slices.push(@buffer.pop(@buffer.length - slice_index).map(&:source))
+ end
+ slices
+ end
+
+ # Remember the chunks are in reverse order, this un-reverses them
+ slices.reverse_each do |slice|
+ if options[:block_on_requests]
+ try_request(slice)
+ else
+ Concurrent::Future.execute({ executor: @request_thread_pool }) { try_request(slice) }
+ end
  end
  end

- # this should be running synchronously if @buffer_over_limit i.e. called from self.buffer
- # else asynchronously through @task
- def flush()
- if defined? @@request and !@@request.nil?
- request_messages = []
- @lock.synchronize do
- request_messages = @messages
- @buffer.truncate(0)
- @messages = []
+ def try_request(slice)
+ body = {
+ e: "ls",
+ ls: slice
+ }.to_json
+
+ flush_id = "#{SecureRandom.uuid} [#{slice.length} lines]"
+ error_header = "Flush {#{flush_id}} failed."
+ tries = 0
+ loop do
+ tries += 1
+
+ if tries > @retry_max_attempts
+ @internal_logger.debug("Flush {#{flush_id}} exceeded 3 tries. Discarding flush buffer")
+ break
  end
- return if request_messages.empty?

- real = {
- e: 'ls',
- ls: request_messages,
- }.to_json
+ if send_request(body, error_header)
+ break
+ end

- @@request.body = real
- @response = Net::HTTP.start(@uri.hostname, @uri.port, use_ssl: @uri.scheme == 'https') do |http|
- http.request(@@request)
+ sleep(@retry_timeout * (1 << (tries - 1)) + rand(@retry_max_jitter))
+ end
+ end
+
+ def send_request(body, error_header)
+ # TODO: Remove instance-global request object
+ @request.body = body
+ begin
+ response = Net::HTTP.start(
+ @uri.hostname,
+ @uri.port,
+ use_ssl: @uri.scheme == "https"
+ ) do |http|
+ http.request(@request)
  end

- # don't kill @task if this was executed from self.buffer
- # don't kill @task if there are queued messages
- unless @buffer_over_limit || @messages.any? || @task.nil?
- @task.shutdown
- @task.kill
+ code = response.code.to_i
+ if [401, 403].include?(code)
+ @internal_logger.debug("#{error_header} Please provide a valid ingestion key. Discarding flush buffer")
+ return true
+ elsif [408, 500, 504].include?(code)
+ # These codes might indicate a temporary ingester issue
+ @internal_logger.debug("#{error_header} The request failed #{response}. Retrying")
+ elsif code == 200
+ return true
+ else
+ @internal_logger.debug("#{error_header} The request failed #{response}. Discarding flush buffer")
+ return true
  end
+ rescue SocketError
+ @internal_logger.debug("#{error_header} Network connectivity issue. Retrying")
+ rescue Errno::ECONNREFUSED => e
+ @internal_logger.debug("#{error_header} The server is down. #{e.message}. Retrying")
+ rescue Timeout::Error => e
+ @internal_logger.debug("#{error_header} Timeout error occurred. #{e.message}. Retrying")
  end
+
+ false
  end

- def exitout()
- check_side_buffer
- if @messages.any?
- flush()
+ def exitout
+ unschedule_flush
+ @work_thread_pool.shutdown
+ if !@work_thread_pool.wait_for_termination(1)
+ @internal_logger.warn("Work thread pool unable to shutdown gracefully. Logs potentially dropped")
+ end
+ @request_thread_pool.shutdown
+ if !@request_thread_pool.wait_for_termination(5)
+ @internal_logger.warn("Request thread pool unable to shutdown gracefully. Logs potentially dropped")
+ end
+
+ if @buffer.any?
+ @internal_logger.debug("Exiting LogDNA logger: Logging remaining messages")
+ flush_sync({ block_on_requests: true })
+ @internal_logger.debug("Finished flushing logs to LogDNA")
  end
- puts "Exiting LogDNA logger: Logging remaining messages"
- return
  end
  end
  end
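The comment inside `flush_sync` above explains that each buffered entry carries a cumulative `running_size`, so request-sized chunks can be cut from the tail with a binary search. A standalone sketch of that slicing idea on toy data with a hypothetical 8-byte `request_size`; this is not the gem's code:

```ruby
# Each entry stores its payload plus the cumulative byte count so far, so
# running_size is monotonically increasing and bsearch_index applies.
Message = Struct.new(:source, :running_size)

buffer = []
%w[aaaa bb cccccc ddd e].each do |line|
  running = line.bytesize + (buffer.empty? ? 0 : buffer[-1].running_size)
  buffer.push(Message.new(line, running))
end

request_size = 8 # hypothetical per-request byte limit
slices = []
until buffer.empty?
  # Find the first entry whose cumulative size reaches (tail size - request_size);
  # everything from there to the tail becomes one chunk, so each chunk is roughly
  # request_size bytes and may overshoot by at most one message.
  target = [buffer[-1].running_size - request_size, 0].max
  index = buffer.bsearch_index { |m| m.running_size >= target }
  slices.push(buffer.pop(buffer.length - index).map(&:source))
end

p slices.reverse # => [["aaaa", "bb"], ["cccccc", "ddd", "e"]] (chunks back in original order)
```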
data/lib/logdna/resources.rb CHANGED
@@ -1,15 +1,21 @@
+ # frozen_string_literal: true
+
  module Resources
- LOG_LEVELS = ['DEBUG', 'INFO', 'WARN', 'ERROR', 'FATAL', 'TRACE'].freeze
- DEFAULT_REQUEST_HEADER = { 'Content-Type' => 'application/json; charset=UTF-8' }.freeze
- DEFAULT_REQUEST_TIMEOUT = 180000
- MS_IN_A_DAY = 86400000
- MAX_REQUEST_TIMEOUT = 300000
- MAX_LINE_LENGTH = 32000
+ LOG_LEVELS = %w[DEBUG INFO WARN ERROR FATAL TRACE].freeze
+ DEFAULT_REQUEST_HEADER = { "Content-Type" => "application/json; charset=UTF-8" }.freeze
+ DEFAULT_REQUEST_TIMEOUT = 180_000
+ MS_IN_A_DAY = 86_400_000
+ MAX_REQUEST_TIMEOUT = 300_000
+ MAX_LINE_LENGTH = 32_000
  MAX_INPUT_LENGTH = 80
+ RETRY_TIMEOUT = 0.25
+ RETRY_MAX_ATTEMPTS = 3
+ RETRY_MAX_JITTER = 0.5
  FLUSH_INTERVAL = 0.25
- TIMER_OUT = 15
- FLUSH_BYTE_LIMIT = 500000
- ENDPOINT = 'https://logs.logdna.com/logs/ingest'.freeze
- MAC_ADDR_CHECK = /^([0-9a-fA-F][0-9a-fA-F]:){5}([0-9a-fA-F][0-9a-fA-F])$/
- IP_ADDR_CHECK = /^(?:(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.){3}(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)$/
+ FLUSH_SIZE = 2 * 1_024 * 1_024
+ REQUEST_SIZE = 2 * 1_024 * 1_024
+ ENDPOINT = "https://logs.logdna.com/logs/ingest"
+ MAC_ADDR_CHECK = /^([0-9a-fA-F][0-9a-fA-F]:){5}([0-9a-fA-F][0-9a-fA-F])$/.freeze
+ IP_ADDR_CHECK = /^(?:(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.){3}(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)$/.freeze
+ MAX_CONCURRENT_REQUESTS = 1
  end
data/lib/logdna/version.rb CHANGED
@@ -1,3 +1,5 @@
+ # frozen_string_literal: true
+
  module LogDNA
- VERSION = '1.2.0'.freeze
+ VERSION = "1.5.0"
  end
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: logdna
  version: !ruby/object:Gem::Version
- version: 1.2.0
+ version: 1.5.0
  platform: ruby
  authors:
- - Gun Woo Choi, Derek Zhou
+ - Gun Woo Choi, Derek Zhou, Vilya Levitskiy, Muaz Siddiqui
  autorequire:
  bindir: exe
  cert_chain: []
- date: 2018-06-01 00:00:00.000000000 Z
+ date: 2021-01-29 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: concurrent-ruby
@@ -24,20 +24,6 @@ dependencies:
  - - "~>"
  - !ruby/object:Gem::Version
  version: '1.0'
- - !ruby/object:Gem::Dependency
- name: require_all
- requirement: !ruby/object:Gem::Requirement
- requirements:
- - - "~>"
- - !ruby/object:Gem::Version
- version: '1.4'
- type: :runtime
- prerelease: false
- version_requirements: !ruby/object:Gem::Requirement
- requirements:
- - - "~>"
- - !ruby/object:Gem::Version
- version: '1.4'
  - !ruby/object:Gem::Dependency
  name: json
  requirement: !ruby/object:Gem::Requirement
@@ -53,80 +39,45 @@ dependencies:
  - !ruby/object:Gem::Version
  version: '2.0'
  - !ruby/object:Gem::Dependency
- name: bundler
- requirement: !ruby/object:Gem::Requirement
- requirements:
- - - "~>"
- - !ruby/object:Gem::Version
- version: '1.13'
- type: :development
- prerelease: false
- version_requirements: !ruby/object:Gem::Requirement
- requirements:
- - - "~>"
- - !ruby/object:Gem::Version
- version: '1.13'
- - !ruby/object:Gem::Dependency
- name: rake
- requirement: !ruby/object:Gem::Requirement
- requirements:
- - - "~>"
- - !ruby/object:Gem::Version
- version: '10.5'
- type: :development
- prerelease: false
- version_requirements: !ruby/object:Gem::Requirement
- requirements:
- - - "~>"
- - !ruby/object:Gem::Version
- version: '10.5'
- - !ruby/object:Gem::Dependency
- name: rspec
+ name: require_all
  requirement: !ruby/object:Gem::Requirement
  requirements:
  - - "~>"
  - !ruby/object:Gem::Version
- version: '3.5'
- type: :development
+ version: '1.4'
+ type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
  - - "~>"
  - !ruby/object:Gem::Version
- version: '3.5'
+ version: '1.4'
  - !ruby/object:Gem::Dependency
- name: webmock
+ name: rubocop
  requirement: !ruby/object:Gem::Requirement
  requirements:
  - - "~>"
  - !ruby/object:Gem::Version
- version: '2.3'
+ version: '0.78'
  type: :development
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
  - - "~>"
  - !ruby/object:Gem::Version
- version: '2.3'
+ version: '0.78'
  description:
- email: support@logdna.com
+ email: apps+rubygems@logdna.com
  executables: []
  extensions: []
  extra_rdoc_files: []
  files:
- - ".gitignore"
- - ".rspec"
- - ".ruby-version"
- - Gemfile
- - LICENSE.txt
+ - LICENSE
  - README.md
- - Rakefile
  - lib/logdna.rb
  - lib/logdna/client.rb
  - lib/logdna/resources.rb
  - lib/logdna/version.rb
- - logdna.gemspec
- - test.rb
  homepage: https://github.com/logdna/ruby
  licenses:
  - MIT
@@ -139,15 +90,14 @@ required_ruby_version: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
  - !ruby/object:Gem::Version
- version: '0'
+ version: 2.5.0
  required_rubygems_version: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
  - !ruby/object:Gem::Version
  version: '0'
  requirements: []
- rubyforge_project:
- rubygems_version: 2.5.2
+ rubygems_version: 3.0.3
  signing_key:
  specification_version: 4
  summary: LogDNA Ruby logger
data/.gitignore DELETED
@@ -1,11 +0,0 @@
- /.bundle/
- /.yardoc
- /Gemfile.lock
- /_yardoc/
- /coverage/
- /doc/
- /pkg/
- /spec/reports/
- /tmp/
- *.gem
- .DS_Store
data/.rspec DELETED
@@ -1,2 +0,0 @@
- --format documentation
- --color
data/.ruby-version DELETED
@@ -1 +0,0 @@
- 2.2.0
data/Gemfile DELETED
@@ -1,4 +0,0 @@
- source 'https://rubygems.org'
-
- # Specify your gem's dependencies in logdna_ruby.gemspec
- gemspec
data/Rakefile DELETED
@@ -1,6 +0,0 @@
- require "bundler/gem_tasks"
- require "rspec/core/rake_task"
-
- RSpec::Core::RakeTask.new(:spec)
-
- task :default => :spec
data/logdna.gemspec DELETED
@@ -1,32 +0,0 @@
- # coding: utf-8
- lib = File.expand_path('../lib', __FILE__)
- $LOAD_PATH.unshift(lib) unless $LOAD_PATH.include?(lib)
- require 'logdna/version'
-
- Gem::Specification.new do |spec|
- spec.name = 'logdna'
- spec.version = LogDNA::VERSION
- spec.authors = 'Gun Woo Choi, Derek Zhou'
- spec.email = 'support@logdna.com'
-
- spec.summary = 'LogDNA Ruby logger'
- spec.homepage = 'https://github.com/logdna/ruby'
- spec.license = 'MIT'
-
- spec.files = `git ls-files -z`.split("\x0").reject do |f|
- f.match(%r{^(test|spec|features)/})
- end
- spec.bindir = 'exe'
- spec.executables = spec.files.grep(%r{^exe/}) { |f| File.basename(f) }
- spec.require_paths = ['lib']
-
-
- spec.add_runtime_dependency 'concurrent-ruby', '~> 1.0'
- spec.add_runtime_dependency 'require_all', '~> 1.4'
- spec.add_runtime_dependency 'json', '~> 2.0'
-
- spec.add_development_dependency 'bundler', '~> 1.13'
- spec.add_development_dependency 'rake', '~> 10.5'
- spec.add_development_dependency 'rspec', '~> 3.5'
- spec.add_development_dependency 'webmock', '~> 2.3'
- end
data/test.rb DELETED
@@ -1,69 +0,0 @@
- require 'require_all'
- require_all 'lib'
-
-
- options = {hostname: "new_ruby", meta:{:once => {:first => "nested1", :another => "nested2"}}}
-
-
- logger1 = Logdna::Ruby.new('Your API Key', options)
-
- logger1.log('**************** This is the start of test ****************')
- logger1.env = 'STAGING'
- logger1.app = 'HELLO'
- logger1.warn('Warn message with Staging and Hello')
- logger1.clear
- logger1.log('Is everything back to normal?')
-
-
- logger1.log('Testing env app name change using log')
- logger1.env = 'PRODUCTION'
- logger1.app = 'CHANGED'
- logger1.log('This should be stage = PRODUCTION and appname = CHANGED')
- logger1.log('Testing env app name change using other messages')
-
-
- logger1.error('This is error message with env = DEVELOPMENT and appname = NIHAO', {:env => 'DEVELOPMENT', :app => 'NIHAO'})
- logger1.log('Should not stay as DEVELOPMENT and NIHAO')
- logger1.env = 'DEVELOPMENT'
- logger1.app = 'NIHAO'
- logger1.log('Now should be DEVELOPMENT and NIHAO')
- logger1.log('Logging metadata in trace level', {:meta => {:once => {:first => "nested1", :another => "nested2"}}, :level => "TRACE"})
-
-
- logger1.level = Logger::DEBUG
- logger1.log('This is debug message')
- logger1.add('this should not be supported')
- logger1.fatal('Does this continue as fatal?')
- logger1.log('This should be debug')
- logger1.level = Logger::WARN
- logger1.log('**************** This is the end of test ****************')
-
-
- =begin
- logger1.level = Logger::WARN
- logger1.log('This should be warn')
- logger1.trace('This should be trace')
- logger1.log('Again warn level')
-
- logger1.log('Warn level log1')
- logger1.info('Info level log1')
- logger1.level = Logger::DEBUG
- logger1.log('DEBUG log1')
-
- logger1.app = 'NEW APP NAME'
- logger1.env = 'Staging'
- logger1.level = 'INFO'
-
-
-
- logger1.level = 'INFO'
- logger1.level == Logger::INFO
-
-
- logger1.log('are changes all updated')
- =end
- sleep 3
-
-
-
-