semantic_logger 4.8.2 → 4.10.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
1
1
  ---
2
2
  SHA256:
3
- metadata.gz: d0bd2e2a218bdbf0bfa67ca85da999b352de9b214383890278c0710cd3f84e91
4
- data.tar.gz: dd3b393e8f0a031ddb25c2ca3cdeaa7d761617586f366006e79d7fcf3de8a9a5
3
+ metadata.gz: 2ff7d4bcb345581f6ba19f834bb1bf2f921e97904f4b6236142b87af75a4526d
4
+ data.tar.gz: b1069d5360296c61c9796f630b1ec065eb1a0a7dc549778c66c5e26c90d0eadb
5
5
  SHA512:
6
- metadata.gz: dba969cf8d70cbb63a6542a692b68773f395eda22515267c572630aa5180a3bbfe490f2096e83cc366f06e86bbf06bafc87d2e5800810110107ac5080b7a1046
7
- data.tar.gz: 35d69548ee2a91fb4d7c0575ea2d1da48648a7352d52304e8f270a6fe095c42e7ad1367594cef0b14941e14fc47906816df2360899c37fc577828b368c71c21a
6
+ metadata.gz: c5cfbac7dd8795b11cf79ed3fa9a4a086128d5ee030f0e9db40b75f30f58997c1674a4284b4b287939216bc8de83372e2b7fe00d73e7856c3cc6830310740f0d
7
+ data.tar.gz: 2150a809562247d771103a5d387ae5852ff5b0649a4f4288a7dd07a94420bafcfb6b36f3e21077fcc053c2d8bcaffe80a110ab65b52178fb612b7368fd79d87d
data/README.md CHANGED
@@ -9,16 +9,6 @@ Semantic Logger is a feature rich logging framework, and replacement for existin
9
9
 
10
10
  [Semantic Logger Guide](https://logger.rocketjob.io/)
11
11
 
12
- ## Upgrading to Semantic Logger v4.4
13
-
14
- With some forking frameworks it is necessary to call `reopen` after the fork. With v4.4 the
15
- workaround for Ruby 2.5 crashes is no longer needed.
16
- I.e. Please remove the following line if being called anywhere:
17
-
18
- ~~~ruby
19
- SemanticLogger::Processor.instance.instance_variable_set(:@queue, Queue.new)
20
- ~~~
21
-
22
12
  ## Logging Destinations
23
13
 
24
14
  Logging to the following destinations is supported "out-of-the-box":
@@ -32,7 +22,7 @@ Logging to the following destinations is supported "out-of-the-box":
32
22
  * Splunk
33
23
  * MongoDB
34
24
  * Honeybadger
35
- * Sentry
25
+ * Sentry (both with the legacy `sentry-raven` and the modern `sentry-ruby` gems)
36
26
  * HTTP
37
27
  * TCP
38
28
  * UDP
@@ -70,8 +60,50 @@ and are therefore not automatically included by this gem:
70
60
  - Splunk Appender: gem 'splunk-sdk-ruby'
71
61
  - Elasticsearch Appender: gem 'elasticsearch'
72
62
  - Kafka Appender: gem 'ruby-kafka'
63
+ - Legacy Sentry Appender: gem 'sentry-raven' (deprecated)
64
+ - Sentry Appender: gem 'sentry-ruby'
65
+
66
+ ## Upgrading to Semantic Logger v4.9
67
+
68
+ These changes should not be noticeable to the majority of Semantic Logger users, since
69
+ they are confined to the internal API. It is possible that advanced users may be using these internal
70
+ APIs directly.
71
+
72
+ This does not affect any calls to the public API `SemanticLogger.add_appender`.
73
+
74
+ File and IO are now separate appenders. When creating the File appender explicitly, its arguments
75
+ have changed. For example, code that passes an IO stream needs to be changed from:
76
+
77
+ ~~~ruby
78
+ SemanticLogger::Appender::File.new(io: $stderr)
79
+ ~~~
80
+ to:
81
+ ~~~ruby
82
+ SemanticLogger::Appender::IO.new($stderr)
83
+ ~~~
84
+
85
+ Additionally, file name arguments need to be changed from:
86
+ ~~~ruby
87
+ SemanticLogger::Appender::File.new(file_name: "file.log")
88
+ ~~~
89
+ to:
90
+ ~~~ruby
91
+ SemanticLogger::Appender::File.new("file.log")
92
+ ~~~
93
+
94
+ Rails Semantic Logger, if used, needs to be upgraded to v4.9 when upgrading to Semantic Logger v4.9.
95
+
96
+ ## Upgrading to Semantic Logger v4.4
97
+
98
+ With some forking frameworks it is necessary to call `reopen` after the fork. With v4.4 the
99
+ workaround for Ruby 2.5 crashes is no longer needed.
100
+ That is, remove the following line if it is still being called anywhere:
101
+
102
+ ~~~ruby
103
+ SemanticLogger::Processor.instance.instance_variable_set(:@queue, Queue.new)
104
+ ~~~
73
105
 
74
- ## V4 Upgrade notes
106
+ ## Upgrading to Semantic Logger v4.0
75
107
 
76
108
  The following changes need to be made when upgrading to V4:
77
109
  - Ruby V2.3 / JRuby V9.1 is now the minimum runtime version.
@@ -64,6 +64,7 @@ module SemanticLogger
64
64
  signal.wait(batch_seconds)
65
65
 
66
66
  logs = []
67
+ messages = []
67
68
  first = true
68
69
  message_count = queue.length
69
70
  message_count.times do
@@ -76,10 +77,11 @@ module SemanticLogger
76
77
  first = false
77
78
  end
78
79
  else
79
- process_message(message)
80
+ messages << message
80
81
  end
81
82
  end
82
83
  appender.batch(logs) if logs.size.positive?
84
+ messages.each { |message| process_message(message) }
83
85
  signal.reset unless queue.size >= batch_size
84
86
  end
85
87
  end
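
The change above defers non-log control messages until after `appender.batch` has been called for the collected log entries. For context, batch-capable appenders are normally wired up through the public API; a minimal sketch, assuming the async batch proxy's documented `batch_size`/`batch_seconds` options:

~~~ruby
# Hedged sketch: appenders that respond to #batch (such as Elasticsearch) are wrapped
# in the async batch proxy by default. The thresholds below mirror the batch_seconds
# wait and batch_size check seen in the loop above.
SemanticLogger.add_appender(
  appender:      :elasticsearch,
  url:           "http://localhost:9200",
  batch:         true, # explicit here; also the default when the appender responds to #batch
  batch_size:    300,  # flush after this many queued log entries
  batch_seconds: 5     # ...or after this many seconds, whichever comes first
)
~~~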
@@ -135,7 +135,7 @@ module SemanticLogger
135
135
  application: nil,
136
136
  environment: nil,
137
137
  host: nil,
138
- metrics: false,
138
+ data_stream: false,
139
139
  **elasticsearch_args,
140
140
  &block)
141
141
 
@@ -146,6 +146,7 @@ module SemanticLogger
146
146
  @elasticsearch_args = elasticsearch_args.dup
147
147
  @elasticsearch_args[:url] = url if url && !elasticsearch_args[:hosts]
148
148
  @elasticsearch_args[:logger] = logger
149
+ @data_stream = data_stream
149
150
 
150
151
  super(level: level, formatter: formatter, filter: filter, application: application, environment: environment, host: host, metrics: false, &block)
151
152
  reopen
@@ -175,7 +176,12 @@ module SemanticLogger
175
176
  private
176
177
 
177
178
  def write_to_elasticsearch(messages)
178
- bulk_result = @client.bulk(body: messages)
179
+ bulk_result = if @data_stream
180
+ @client.bulk(index: index, body: messages)
181
+ else
182
+ @client.bulk(body: messages)
183
+ end
184
+
179
185
  return unless bulk_result["errors"]
180
186
 
181
187
  failed = bulk_result["items"].reject { |x| x["status"] == 201 }
@@ -184,11 +190,21 @@ module SemanticLogger
184
190
 
185
191
  def bulk_index(log)
186
192
  expanded_index_name = log.time.strftime("#{index}-#{date_pattern}")
187
- {"index" => {"_index" => expanded_index_name, "_type" => type}}
193
+ if @data_stream
194
+ {"create" => {}}
195
+ else
196
+ {"index" => {"_index" => expanded_index_name, "_type" => type}}
197
+ end
188
198
  end
189
199
 
190
200
  def default_formatter
191
- SemanticLogger::Formatters::Raw.new(time_format: :iso_8601, time_key: :timestamp)
201
+ time_key = if @data_stream
202
+ "@timestamp"
203
+ else
204
+ :timestamp
205
+ end
206
+
207
+ SemanticLogger::Formatters::Raw.new(time_format: :iso_8601, time_key: time_key)
192
208
  end
193
209
  end
194
210
  end
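
The new `data_stream` option switches the appender from classic index writes to Elasticsearch data streams: bulk requests use `create` actions against the configured index and the timestamp is written to the `@timestamp` field. A hedged usage sketch (the index name is illustrative):

~~~ruby
# Sketch only: option names follow the initializer shown above.
SemanticLogger.add_appender(
  appender:    :elasticsearch,
  url:         "http://localhost:9200",
  index:       "logs-myapp-production", # target data stream (illustrative name)
  data_stream: true                     # bulk `create` actions + "@timestamp" time key
)
~~~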
@@ -1,3 +1,4 @@
1
+ require "date"
1
2
  # File appender
2
3
  #
3
4
  # Writes log messages to a file or open iostream
@@ -5,114 +6,285 @@
5
6
  module SemanticLogger
6
7
  module Appender
7
8
  class File < SemanticLogger::Subscriber
8
- # Create a File Logger appender instance.
9
+ attr_accessor :file_name, :retry_count, :append, :exclusive_lock, :encoding,
10
+ :reopen_period, :reopen_count, :reopen_size
11
+ attr_reader :log_count, :log_size, :current_file_name, :reopen_at
12
+
13
+ # Create an appender to log to a named file.
9
14
  #
10
15
  # Parameters
11
- # :file_name [String]
12
- # Name of file to write to.
13
- # Or,
14
- # :io [IO]
15
- # An IO stream to which to write the log messages to.
16
- #
17
- # :level [:trace | :debug | :info | :warn | :error | :fatal]
18
- # Override the log level for this appender.
19
- # Default: SemanticLogger.default_level
20
- #
21
- # :formatter: [Object|Proc]
22
- # An instance of a class that implements #call, or a Proc to be used to format
23
- # the output from this appender
24
- # Default: Use the built-in formatter (See: #call)
25
- #
26
- # :filter [Regexp|Proc]
27
- # RegExp: Only include log messages where the class name matches the supplied
28
- # regular expression. All other messages will be ignored.
29
- # Proc: Only include log messages where the supplied Proc returns true
30
- # The Proc must return true or false.
16
+ # file_name [String]
17
+ # Name of the file to write to.
31
18
  #
32
- # Example
33
- # require 'semantic_logger'
19
+ # File name format directives:
20
+ # %p - Process Id
21
+ # %n - Short hostname (SemanticLogger.host). Everything before the first period in the hostname.
22
+ # %N - Full hostname (SemanticLogger.host)
23
+ # %a - Application name (SemanticLogger.application)
24
+ # %e - Environment name (SemanticLogger.environment)
25
+ # %D - Current Date. Equivalent to "%Y%m%d"
26
+ # %T - Current Time. Equivalent to "%H%M%S"
27
+ # %% - Literal `%` character
34
28
  #
35
- # # Enable trace level logging
36
- # SemanticLogger.default_level = :info
29
+ # Date:
30
+ # %Y - Year with century
31
+ # %C - year / 100 (round down. 20 in 2009)
32
+ # %y - year % 100 (00..99)
33
+ # %m - Month of the year, zero-padded (01..12)
34
+ # %d - Day of the month, zero-padded (01..31)
35
+ # %j - Day of the year (001..366)
36
+ # %U - Week number of the year. The week starts with Sunday. (00..53)
37
+ # %W - Week number of the year. The week starts with Monday. (00..53)
37
38
  #
38
- # # Log to screen
39
- # SemanticLogger.add_appender(io: $stdout, formatter: :color)
39
+ # Time:
40
+ # %H - 24 Hour of the day, zero-padded (00..23)
41
+ # %M - Minute of the hour (00..59)
42
+ # %S - Second of the minute (00..60)
40
43
  #
41
- # # And log to a file at the same time
42
- # SemanticLogger.add_appender(file_name: 'application.log', formatter: :color)
44
+ # Examples:
45
+ # Create a log file name consisting of the short host name, process id, date, and time.
46
+ # "log/production-%n-%p-%D-%T.log"
43
47
  #
44
- # logger = SemanticLogger['test']
45
- # logger.info 'Hello World'
48
+ # :level [:trace | :debug | :info | :warn | :error | :fatal]
49
+ # Override the log level for this appender.
50
+ # Default: SemanticLogger.default_level
46
51
  #
47
- # Example 2. To log all levels to file and only :info and above to screen:
52
+ # :formatter: [Object|Proc]
53
+ # An instance of a class that implements #call, or a Proc to be used to format
54
+ # the output from this appender
55
+ # Default: Use the built-in formatter (See: #call)
48
56
  #
49
- # require 'semantic_logger'
57
+ # :filter [Regexp|Proc]
58
+ # RegExp: Only include log messages where the class name matches the supplied
59
+ # regular expression. All other messages will be ignored.
60
+ # Proc: Only include log messages where the supplied Proc returns true
61
+ # The Proc must return true or false.
50
62
  #
51
- # # Enable trace level logging
52
- # SemanticLogger.default_level = :trace
63
+ # :append [true|false]
64
+ # Append to the log file if already present?
65
+ # Default: true
53
66
  #
54
- # # Log to screen but only display :info and above
55
- # SemanticLogger.add_appender(io: $stdout, level: :info)
67
+ # :exclusive_lock [true|false]
68
+ # Obtain an exclusive lock on the file, for operating systems that support it.
69
+ # Prevents multiple processes from trying to write to the same log file.
70
+ # Default: false
56
71
  #
57
- # # And log to a file at the same time, including all :trace level data
58
- # SemanticLogger.add_appender(file_name: 'application.log')
72
+ # :encoding ["UTF-8", "UTF-16", etc.]
73
+ # Encoding to use when writing to the file.
74
+ # Default: Encoding::BINARY
59
75
  #
60
- # logger = SemanticLogger['test']
61
- # logger.info 'Hello World'
62
- def initialize(io: nil, file_name: nil, **args, &block)
63
- if io
64
- @log = io
65
- unless @log.respond_to?(:write)
66
- raise(ArgumentError, "SemanticLogging::Appender::File :io is not a valid IO instance: #{io.inspect}")
67
- end
68
- else
69
- @file_name = file_name
70
- unless file_name
71
- raise(ArgumentError, "SemanticLogging::Appender::File missing mandatory parameter :file_name or :io")
72
- end
73
-
74
- reopen
76
+ # :retry_count [Integer]
77
+ # Number of times to attempt to re-open the file when an error occurs trying to
78
+ # write to the file.
79
+ # Note: Set to 0 to disable retries.
80
+ # Default: 1
81
+ #
82
+ # :reopen_period [String]
83
+ # Specify a period after which to re-open the log file, specified in minutes, hours, or days.
84
+ # The format of the duration must start with an Integer or Float number,
85
+ # followed by the duration specified as:
86
+ # "m" : minutes
87
+ # "h" : hours
88
+ # "d" : days
89
+ # The time is rounded down to the specified time interval, so that:
90
+ # - "1h" will re-open every hour at the beginning of the hour.
91
+ # - "30m" will re-open every 30 minutes at the beginning of the 30th minute.
92
+ # - "1d" will re-open every day at midnight.
93
+ # Examples:
94
+ # "60m" : Every 60 minutes at the beginning of the minute: 10:24:00, 11:24:00, 12:24:00, ...
95
+ # "1h" : Every hour at the beginning of the hour: 10:00:00, 11:00:00, 12:00:00, ...
96
+ # "1d" : Every day at the beginning of the day: "20211008 00:00:00", "20211009 00:00:00", ...
97
+ # Default: nil (Disabled)
98
+ #
99
+ # :reopen_count [Integer]
100
+ # Close and re-open the log file after every `reopen_count` number of logged entries.
101
+ # Default: 0 (Disabled)
102
+ #
103
+ # :reopen_size [Integer]
104
+ # Approximate number of bytes to write to a log file by this process before closing and re-opening it.
105
+ # Notes:
106
+ # - When `append: true` and the file already exists, it reads the size of the current log file
107
+ # and starts with that size.
108
+ # - If the current log file size already exceeds the `reopen_size`, its current size is ignored.
109
+ # - The `reopen_size` only counts bytes written by this process; it excludes data
110
+ # written by other processes. Use a unique filename to prevent multiple processes from writing to
111
+ # the same log file at the same time.
112
+ # Default: 0 (Disabled)
113
+ #
114
+ # Example
115
+ # require "semantic_logger"
116
+ #
117
+ # # Enable trace level logging
118
+ # SemanticLogger.default_level = :info
119
+ #
120
+ # # Log to a file
121
+ # SemanticLogger.add_appender(file_name: "application.log", formatter: :color)
122
+ #
123
+ # logger = SemanticLogger["test"]
124
+ # logger.info "Hello World"
125
+ def initialize(file_name, retry_count: 1, append: true, reopen_period: nil, reopen_count: 0, reopen_size: 0, encoding: Encoding::BINARY, exclusive_lock: false, **args, &block)
126
+ if !file_name.is_a?(String) || file_name.empty?
127
+ raise(ArgumentError, "SemanticLogging::Appender::File file_name must be a non-empty string")
75
128
  end
76
129
 
130
+ @file_name = file_name
131
+ @retry_count = retry_count
132
+ @file = nil
133
+ @append = append
134
+ @reopen_period = reopen_period
135
+ @reopen_count = reopen_count
136
+ @reopen_size = reopen_size
137
+ @encoding = encoding
138
+ @exclusive_lock = exclusive_lock
139
+ @log_count = 0
140
+ @log_size = 0
141
+ @reopen_at = nil
142
+
77
143
  super(**args, &block)
78
144
  end
79
145
 
80
146
  # After forking an active process call #reopen to re-open
81
- # open the file handles etc to resources
82
- #
83
- # Note: This method will only work if :file_name was supplied
84
- # on the initializer.
85
- # If :io was supplied, it will need to be re-opened manually.
147
+ # the file handles and other resources.
86
148
  def reopen
87
- return unless @file_name
149
+ begin
150
+ @file&.close
151
+ rescue StandardError
152
+ nil
153
+ end
154
+
155
+ self.current_file_name = apply_format_directives(file_name)
156
+ if ::File.directory?(file_name)
157
+ raise(ArgumentError, "The supplied log file_name: #{current_file_name} is already a directory.")
158
+ end
159
+
160
+ self.log_count = 0
161
+ if append && reopen_size && ::File.exist?(current_file_name)
162
+ self.log_size = ::File.size(current_file_name)
163
+ self.log_size = 0 if log_size >= reopen_size
164
+ else
165
+ self.log_size = 0
166
+ end
88
167
 
89
- @log = ::File.open(@file_name, ::File::WRONLY | ::File::APPEND | ::File::CREAT)
168
+ self.reopen_at = reopen_period ? next_reopen_period(reopen_period) : nil
169
+
170
+ options = ::File::WRONLY | ::File::CREAT
171
+ options |= ::File::APPEND if append
172
+ @file = ::File.open(current_file_name, options)
90
173
  # Force all log entries to write immediately without buffering
91
174
  # Allows multiple processes to write to the same log file simultaneously
92
- @log.sync = true
93
- @log.set_encoding(Encoding::BINARY) if @log.respond_to?(:set_encoding)
94
- @log
175
+ @file.sync = true
176
+ @file.set_encoding(encoding) if @file.respond_to?(:set_encoding)
177
+ @file.flock(::File::LOCK_EX) if exclusive_lock
178
+ @file
95
179
  end
96
180
 
97
- # Pass log calls to the underlying Rails, log4j or Ruby logger
98
- # trace entries are mapped to debug since :trace is not supported by the
99
- # Ruby or Rails Loggers
181
+ # Since only one appender thread will be writing to the file at a time
182
+ # it is not necessary to protect access to the file with a semaphore.
100
183
  def log(log)
101
- # Since only one appender thread will be writing to the file at a time
102
- # it is not necessary to protect access to the file with a semaphore
103
- # Allow this logger to filter out log levels lower than it's own
104
- @log.write(formatter.call(log, self) << "\n")
184
+ reopen if time_to_reopen?
185
+
186
+ count = 0
187
+ begin
188
+ message = formatter.call(log, self) << "\n"
189
+ @file.write(message)
190
+ @log_count += 1
191
+ @log_size += message.size
192
+ rescue StandardError => e
193
+ if count < retry_count
194
+ count += 1
195
+ reopen
196
+ retry
197
+ end
198
+ raise(e)
199
+ end
105
200
  true
106
201
  end
107
202
 
108
203
  # Flush all pending logs to disk.
109
- # Waits for all sent documents to be writted to disk
204
+ # Waits for all sent documents to be written to disk
110
205
  def flush
111
- @log.flush if @log.respond_to?(:flush)
206
+ @file&.flush
207
+ end
208
+
209
+ private
210
+
211
+ attr_writer :log_count, :log_size, :current_file_name, :reopen_at
212
+
213
+ def time_to_reopen?
214
+ return true unless @file
215
+
216
+ (reopen_count.positive? && (log_count >= reopen_count)) ||
217
+ (reopen_size.positive? && (log_size >= reopen_size)) ||
218
+ (reopen_at && (Time.now > reopen_at))
219
+ end
220
+
221
+ def apply_format_directives(file_name)
222
+ return file_name unless file_name.include?("%")
223
+
224
+ file_name.gsub(/%(.)/) { format_directive(Regexp.last_match(1)) }
225
+ end
226
+
227
+ def format_directive(directive)
228
+ case directive
229
+ when "p"
230
+ $$
231
+ when "n"
232
+ SemanticLogger.host.split(".")[0]
233
+ when "N"
234
+ SemanticLogger.host
235
+ when "a"
236
+ SemanticLogger.application
237
+ when "e"
238
+ SemanticLogger.environment
239
+ when "D"
240
+ Date.today.strftime("%Y%m%d")
241
+ when "Y", "C", "y", "m", "d", "j", "U", "W"
242
+ Date.today.strftime("%#{directive}")
243
+ when "T"
244
+ Time.now.strftime("%H%M%S")
245
+ when "H", "M", "S"
246
+ Time.now.strftime("%#{directive}")
247
+ when "%"
248
+ "%"
249
+ else
250
+ raise(ArgumentError, "Format Directive '#{directive}' in file_name: #{file_name} is not supported.")
251
+ end
112
252
  end
113
253
 
114
- def console_output?
115
- [$stderr, $stdout].include?(@log)
254
+ def next_reopen_period(period_string)
255
+ return unless period_string
256
+
257
+ duration, period = parse_period(period_string)
258
+ calculate_reopen_at(duration, period)
259
+ end
260
+
261
+ def parse_period(period_string)
262
+ match = period_string.to_s.downcase.gsub(/\s+/, "").match(/([\d.]+)([mhd])/)
263
+ unless match
264
+ raise(ArgumentError,
265
+ "Invalid period definition: #{period_string}, must begin with an integer, followed by m,h, or d.")
266
+ end
267
+
268
+ duration = match[1]
269
+ period = match[2]
270
+ raise(ArgumentError, "Invalid or missing duration in: #{period_string}, must begin with an integer.") unless duration
271
+ raise(ArgumentError, "Invalid or missing period in: #{period_string}, must end with m,h, or d.") unless period
272
+
273
+ [duration.to_i, period]
274
+ end
275
+
276
+ # Round down the current time based on the period, then add on the duration for that period
277
+ def calculate_reopen_at(duration, period, time = Time.now)
278
+ case period
279
+ when "m"
280
+ Time.new(time.year, time.month, time.day, time.hour, time.min, 0) + (duration * 60)
281
+ when "h"
282
+ Time.new(time.year, time.month, time.day, time.hour, 0, 0) + (duration * 60 * 60)
283
+ when "d"
284
+ Time.new(time.year, time.month, time.day, 0, 0, 0) + (duration * 24 * 60 * 60)
285
+ else
286
+ raise(ArgumentError, "Invalid or missing period in: #{reopen_period}, must end with m,h, or d.")
287
+ end
116
288
  end
117
289
  end
118
290
  end
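
Taken together, the rewritten File appender adds file-name format directives plus period, count, and size based re-opening. A short sketch of how these options might be combined, using the public `add_appender` interface and the parameter names documented above:

~~~ruby
# Hedged example: a per-process daily log file that is also re-opened after roughly 100 MB.
# %n = short hostname, %p = process id, %D = current date (see the directives above).
SemanticLogger.add_appender(
  file_name:     "log/production-%n-%p-%D.log",
  reopen_period: "1d",              # re-open at midnight each day
  reopen_size:   100 * 1024 * 1024, # ...or after ~100 MB written by this process
  retry_count:   1,                 # retry a failed write once after re-opening the file
  append:        true
)
~~~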
@@ -0,0 +1,68 @@
1
+ # IO appender
2
+ #
3
+ # Writes log messages to any open IO stream (e.g. $stdout, $stderr)
4
+ #
5
+ module SemanticLogger
6
+ module Appender
7
+ class IO < SemanticLogger::Subscriber
8
+ # Create a Stream Logger appender instance.
9
+ #
10
+ # Parameters
11
+ # io [IO]
12
+ # An IO stream to which to write the log messages to.
13
+ #
14
+ # :level [:trace | :debug | :info | :warn | :error | :fatal]
15
+ # Override the log level for this appender.
16
+ # Default: SemanticLogger.default_level
17
+ #
18
+ # :formatter: [Object|Proc]
19
+ # An instance of a class that implements #call, or a Proc to be used to format
20
+ # the output from this appender
21
+ # Default: Use the built-in formatter (See: #call)
22
+ #
23
+ # :filter [Regexp|Proc]
24
+ # RegExp: Only include log messages where the class name matches the supplied
25
+ # regular expression. All other messages will be ignored.
26
+ # Proc: Only include log messages where the supplied Proc returns true
27
+ # The Proc must return true or false.
28
+ #
29
+ # Example
30
+ # require "semantic_logger"
31
+ #
32
+ # # Enable trace level logging
33
+ # SemanticLogger.default_level = :info
34
+ #
35
+ # # Log to screen
36
+ # SemanticLogger.add_appender(io: $stdout, formatter: :color)
37
+ #
38
+ # logger = SemanticLogger['test']
39
+ # logger.info 'Hello World'
40
+ def initialize(io, **args, &block)
41
+ @io = io
42
+ unless @io.respond_to?(:write)
43
+ raise(ArgumentError, "SemanticLogging::Appender::IO io is not a valid IO instance: #{io.inspect}")
44
+ end
45
+
46
+ super(**args, &block)
47
+ end
48
+
49
+ def log(log)
50
+ # Since only one appender thread will be writing to the file at a time
51
+ # it is not necessary to protect access to the file with a semaphore
52
+ # Allow this logger to filter out log levels lower than its own
53
+ @io.write(formatter.call(log, self) << "\n")
54
+ true
55
+ end
56
+
57
+ # Flush all pending logs to disk.
58
+ # Waits for all sent documents to be written to disk
59
+ def flush
60
+ @io.flush if @io.respond_to?(:flush)
61
+ end
62
+
63
+ def console_output?
64
+ [$stderr, $stdout].include?(@io)
65
+ end
66
+ end
67
+ end
68
+ end
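
The new IO appender carries the stream handling that previously lived inside the File appender. The public `io:` option is unchanged (it is routed to this class by `Appender.build`, shown further below); only direct construction differs, per the v4.9 upgrade note in the README. A hedged sketch of both forms:

~~~ruby
# Via the unchanged public API; io: now resolves to SemanticLogger::Appender::IO.
SemanticLogger.add_appender(io: $stdout, formatter: :color)

# Direct construction, then registration of the instance (assumes add_appender
# accepts a Subscriber instance via appender:).
stderr_appender = SemanticLogger::Appender::IO.new($stderr, level: :warn)
SemanticLogger.add_appender(appender: stderr_appender)
~~~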
@@ -0,0 +1,138 @@
1
+ begin
2
+ require "sentry-ruby"
3
+ rescue LoadError
4
+ raise LoadError, 'Gem sentry-ruby is required for logging purposes. Please add the gem "sentry-ruby" to your Gemfile.'
5
+ end
6
+
7
+ # Send log messages to sentry
8
+ #
9
+ # Example:
10
+ # SemanticLogger.add_appender(appender: :sentry_ruby)
11
+ #
12
+ module SemanticLogger
13
+ module Appender
14
+ class SentryRuby < SemanticLogger::Subscriber
15
+ # Create Appender
16
+ #
17
+ # Parameters
18
+ # level: [:trace | :debug | :info | :warn | :error | :fatal]
19
+ # Override the log level for this appender.
20
+ # Default: :error
21
+ #
22
+ # formatter: [Object|Proc|Symbol|Hash]
23
+ # An instance of a class that implements #call, or a Proc to be used to format
24
+ # the output from this appender
25
+ # Default: Use the built-in formatter (See: #call)
26
+ #
27
+ # filter: [Regexp|Proc]
28
+ # RegExp: Only include log messages where the class name matches the supplied.
29
+ # regular expression. All other messages will be ignored.
30
+ # Proc: Only include log messages where the supplied Proc returns true
31
+ # The Proc must return true or false.
32
+ #
33
+ # host: [String]
34
+ # Name of this host to appear in log messages.
35
+ # Default: SemanticLogger.host
36
+ #
37
+ # application: [String]
38
+ # Name of this application to appear in log messages.
39
+ # Default: SemanticLogger.application
40
+ def initialize(level: :error, **args, &block)
41
+ # Replace the Sentry Ruby logger so that we can identify its log
42
+ # messages and not forward them to Sentry
43
+ ::Sentry.init { |config| config.logger = SemanticLogger[::Sentry] }
44
+ super(level: level, **args, &block)
45
+ end
46
+
47
+ # Send an error notification to sentry
48
+ def log(log)
49
+ # Ignore logs coming from Sentry itself
50
+ return false if log.name == "Sentry"
51
+
52
+ context = formatter.call(log, self)
53
+ payload = context.delete(:payload) || {}
54
+ named_tags = context[:named_tags] || {}
55
+ transaction_name = named_tags.delete(:transaction_name)
56
+
57
+ user = extract_user!(named_tags, payload)
58
+ tags = extract_tags!(context)
59
+
60
+ fingerprint = payload.delete(:fingerprint)
61
+
62
+ ::Sentry.with_scope do |scope|
63
+ scope.set_user(user) if user
64
+ scope.set_level(context.delete(:level)) if context[:level]
65
+ scope.set_fingerprint(fingerprint) if fingerprint
66
+ scope.set_transaction_name(transaction_name) if transaction_name
67
+ scope.set_tags(tags)
68
+ scope.set_extras(context)
69
+ scope.set_extras(payload)
70
+
71
+ if log.exception
72
+ ::Sentry.capture_exception(log.exception)
73
+ elsif log.backtrace
74
+ ::Sentry.capture_message(context[:message], backtrace: log.backtrace)
75
+ else
76
+ ::Sentry.capture_message(context[:message])
77
+ end
78
+ end
79
+
80
+ true
81
+ end
82
+
83
+ private
84
+
85
+ # Use Raw Formatter by default
86
+ def default_formatter
87
+ SemanticLogger::Formatters::Raw.new
88
+ end
89
+
90
+ # Extract user data from named tags or payload.
91
+ #
92
+ # Keys :user_id and :user_email will be used as :id and :email respectively.
93
+ # Keys :username and :ip_address will be used verbatim.
94
+ #
95
+ # Any additional value nested in a :user key will be added, provided any of
96
+ # the above keys is already present.
97
+ #
98
+ def extract_user!(*sources)
99
+ keys = {user_id: :id, username: :username, user_email: :email, ip_address: :ip_address}
100
+
101
+ user = {}
102
+
103
+ sources.each do |source|
104
+ keys.each do |source_key, target_key|
105
+ value = source.delete(source_key)
106
+ user[target_key] = value if value
107
+ end
108
+ end
109
+
110
+ return if user.empty?
111
+
112
+ sources.each do |source|
113
+ extras = source.delete(:user)
114
+ user.merge!(extras) if extras.is_a?(Hash)
115
+ end
116
+
117
+ user
118
+ end
119
+
120
+ # Extract tags.
121
+ #
122
+ # Named tags will be stringified (both key and value).
123
+ # Unnamed tags will be stringified and joined with a comma. Then they will
124
+ # be used as a "tag" named tag. If such a tag already exists, it is also
125
+ # joined with a comma.
126
+ #
127
+ # Finally, the tag names are limited to 32 characters and the tag values to 256.
128
+ #
129
+ def extract_tags!(context)
130
+ named_tags = context.delete(:named_tags) || {}
131
+ named_tags = named_tags.map { |k, v| [k.to_s, v.to_s] }.to_h
132
+ tags = context.delete(:tags)
133
+ named_tags.merge!("tag" => tags.join(", ")) { |_, v1, v2| "#{v1}, #{v2}" } if tags
134
+ named_tags.map { |k, v| [k[0...32], v[0...256]] }.to_h
135
+ end
136
+ end
137
+ end
138
+ end
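
The new `sentry-ruby` based appender is registered as `:sentry_ruby`. A minimal, hedged setup using the named-tag and payload keys the appender extracts above (`user_id`, `user_email`, `transaction_name`, `fingerprint`); the logger name and values are illustrative:

~~~ruby
# Sketch: forward :error and above to Sentry via the sentry-ruby gem.
SemanticLogger.add_appender(appender: :sentry_ruby, level: :error)

logger = SemanticLogger["Checkout"]
logger.tagged(user_id: 42, user_email: "jane@example.com", transaction_name: "Orders#create") do
  # :fingerprint in the payload is extracted by the appender and set on the Sentry scope.
  logger.error("Payment declined", fingerprint: ["checkout", "payment_declined"])
end
~~~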
@@ -9,6 +9,7 @@ module SemanticLogger
9
9
  autoload :File, "semantic_logger/appender/file"
10
10
  autoload :Graylog, "semantic_logger/appender/graylog"
11
11
  autoload :Honeybadger, "semantic_logger/appender/honeybadger"
12
+ autoload :IO, "semantic_logger/appender/io"
12
13
  autoload :Kafka, "semantic_logger/appender/kafka"
13
14
  autoload :Sentry, "semantic_logger/appender/sentry"
14
15
  autoload :Http, "semantic_logger/appender/http"
@@ -21,6 +22,7 @@ module SemanticLogger
21
22
  autoload :Tcp, "semantic_logger/appender/tcp"
22
23
  autoload :Udp, "semantic_logger/appender/udp"
23
24
  autoload :Wrapper, "semantic_logger/appender/wrapper"
25
+ autoload :SentryRuby, "semantic_logger/appender/sentry_ruby"
24
26
  # @formatter:on
25
27
 
26
28
  # Returns [SemanticLogger::Subscriber] appender for the supplied options
@@ -32,7 +34,7 @@ module SemanticLogger
32
34
  appender = build(**args, &block)
33
35
 
34
36
  # If appender implements #batch, then it should use the batch proxy by default.
35
- batch = true if batch.nil? && appender.respond_to?(:batch)
37
+ batch = true if batch.nil? && appender.respond_to?(:batch)
36
38
 
37
39
  if batch == true
38
40
  Appender::AsyncBatch.new(
@@ -56,8 +58,10 @@ module SemanticLogger
56
58
 
57
59
  # Returns [Subscriber] instance from the supplied options.
58
60
  def self.build(io: nil, file_name: nil, appender: nil, metric: nil, logger: nil, **args, &block)
59
- if io || file_name
60
- SemanticLogger::Appender::File.new(io: io, file_name: file_name, **args, &block)
61
+ if file_name
62
+ SemanticLogger::Appender::File.new(file_name, **args, &block)
63
+ elsif io
64
+ SemanticLogger::Appender::IO.new(io, **args, &block)
61
65
  elsif logger
62
66
  SemanticLogger::Appender::Wrapper.new(logger: logger, **args, &block)
63
67
  elsif appender
@@ -188,7 +188,8 @@ module SemanticLogger
188
188
  # - For better performance with clean tags, see `SemanticLogger.tagged`.
189
189
  def tagged(*tags, &block)
190
190
  # Allow named tags to be passed into the logger
191
- if tags.size == 1
191
+ # Rails::Rack::Logger passes logs as an array with a single argument
192
+ if tags.size == 1 && !tags.first.is_a?(Array)
192
193
  tag = tags[0]
193
194
  return yield if tag.nil? || tag == ""
194
195
 
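
The `tagged` change above makes a single Array argument behave as a list of tags (the form `Rails::Rack::Logger` passes) instead of one opaque tag. A brief illustration, assuming the multi-tag path flattens nested arrays as in the existing implementation:

~~~ruby
logger = SemanticLogger["Rack"]

logger.tagged("api", "v2")   { logger.info("two tags") }       # tags: "api", "v2"
logger.tagged(["api", "v2"]) { logger.info("still two tags") } # now also tags: "api", "v2"
logger.tagged(request_id: "abc123") { logger.info("named") }   # single Hash => named tags
~~~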
@@ -70,6 +70,7 @@ module SemanticLogger
70
70
 
71
71
  # Return the Time as a formatted string
72
72
  def format_time(time)
73
+ time = time.dup
73
74
  case time_format
74
75
  when :rfc_3339
75
76
  time.utc.to_datetime.rfc3339
@@ -2,6 +2,21 @@ require "json"
2
2
 
3
3
  module SemanticLogger
4
4
  module Formatters
5
+ # Produces logfmt formatted messages
6
+ #
7
+ # The following fields are extracted from the raw log and included in the formatted message:
8
+ # :timestamp, :level, :name, :message, :duration, :tags, :named_tags
9
+ #
10
+ # E.g.
11
+ # timestamp="2020-07-20T08:32:05.375276Z" level=info name="DefaultTest" base="breakfast" spaces="second breakfast" double_quotes="\"elevensies\"" single_quotes="'lunch'" tag="success"
12
+ #
13
+ # All timestamps are ISO8601 formatted
14
+ # All user-supplied values are escaped and surrounded by double quotes to avoid ambiguous message delimiters
15
+ # `tags` are treated as keys with boolean values. Tag names are not formatted or validated; ensure you use valid logfmt keys for tag names.
16
+ # `named_tags` are flattened and merged into the top-level message fields. Any conflicting fields are overridden.
17
+ # `payload` values take precedence over `tags` and `named_tags`. Any conflicting fields are overridden.
18
+ #
19
+ # Further reading: https://brandur.org/logfmt
5
20
  class Logfmt < Raw
6
21
  def initialize(time_format: :iso_8601, time_key: :timestamp, **args)
7
22
  super(time_format: time_format, time_key: time_key, **args)
@@ -16,13 +31,22 @@ module SemanticLogger
16
31
  private
17
32
 
18
33
  def raw_to_logfmt
19
- @parsed = @raw.slice(:timestamp, :level, :name, :message, :duration).merge tag: "success"
34
+ @parsed = @raw.slice(time_key, :level, :name, :message, :duration).merge(tag: "success")
35
+ handle_tags
20
36
  handle_payload
21
37
  handle_exception
22
38
 
23
39
  flatten_log
24
40
  end
25
41
 
42
+ def handle_tags
43
+ tags = @raw.fetch(:tags){ [] }
44
+ .each_with_object({}){ |tag, accum| accum[tag] = true }
45
+
46
+ @parsed = @parsed.merge(tags)
47
+ .merge(@raw.fetch(:named_tags){ {} })
48
+ end
49
+
26
50
  def handle_payload
27
51
  return unless @raw.key? :payload
28
52
 
@@ -38,17 +62,11 @@ module SemanticLogger
38
62
 
39
63
  def flatten_log
40
64
  flattened = @parsed.map do |key, value|
41
- "#{key}=#{parse_value(value)}"
65
+ "#{key}=#{value.to_json}"
42
66
  end
43
67
 
44
68
  flattened.join(" ")
45
69
  end
46
-
47
- def parse_value(value)
48
- return value.to_json if value.instance_of? String
49
-
50
- value
51
- end
52
70
  end
53
71
  end
54
72
  end
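
The logfmt formatter now emits tags and named tags as first-class key=value pairs and JSON-encodes every value. A hedged wiring example, assuming the formatter is registered under the `:logfmt` symbol as the formatter list change further below suggests; the output line is approximate:

~~~ruby
SemanticLogger.add_appender(io: $stdout, formatter: :logfmt)

logger = SemanticLogger["Payments"]
logger.tagged("checkout") do
  logger.info("Charge captured", amount_cents: 1299, currency: "USD")
end
# Approximate output:
# timestamp="..." level="info" name="Payments" message="Charge captured"
#   tag="success" checkout=true amount_cents=1299 currency="USD"
~~~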
@@ -15,7 +15,7 @@ module SemanticLogger
15
15
  def self.logger
16
16
  @logger ||=
17
17
  begin
18
- l = SemanticLogger::Appender::File.new(io: $stderr, level: :warn)
18
+ l = SemanticLogger::Appender::IO.new($stderr, level: :warn)
19
19
  l.name = name
20
20
  l
21
21
  end
@@ -122,7 +122,7 @@ module SemanticLogger
122
122
  # Default: SemanticLogger.default_level
123
123
  #
124
124
  # formatter: [Symbol|Object|Proc]
125
- # Any of the following symbol values: :default, :color, :json
125
+ # Any of the following symbol values: :default, :color, :json, :logfmt, etc...
126
126
  # Or,
127
127
  # An instance of a class that implements #call
128
128
  # Or,
@@ -164,7 +164,7 @@ module SemanticLogger
164
164
  # logger.info "Hello World"
165
165
  # logger.debug("Login time", user: 'Joe', duration: 100, ip_address: '127.0.0.1')
166
166
  def self.add_appender(**args, &block)
167
- appender = Logger.processor.appenders.add(**args, &block)
167
+ appender = appenders.add(**args, &block)
168
168
  # Start appender thread if it is not already running
169
169
  Logger.processor.start
170
170
  appender
@@ -175,7 +175,7 @@ module SemanticLogger
175
175
  def self.remove_appender(appender)
176
176
  return unless appender
177
177
 
178
- Logger.processor.appenders.delete(appender)
178
+ appenders.delete(appender)
179
179
  appender.close
180
180
  end
181
181
 
@@ -189,7 +189,7 @@ module SemanticLogger
189
189
  # Use SemanticLogger.add_appender and SemanticLogger.remove_appender
190
190
  # to manipulate the active appenders list
191
191
  def self.appenders
192
- Logger.processor.appenders.to_a
192
+ Logger.processor.appenders
193
193
  end
194
194
 
195
195
  # Flush all queued log entries disk, database, etc.
@@ -22,6 +22,11 @@ module SemanticLogger
22
22
  # NOOP
23
23
  end
24
24
 
25
+ # Method called to log an event
26
+ def log(log)
27
+ raise NotImplementedError
28
+ end
29
+
25
30
  # Returns [SemanticLogger::Formatters::Default] default formatter for this subscriber.
26
31
  def default_formatter
27
32
  SemanticLogger::Formatters::Default.new
@@ -68,6 +73,11 @@ module SemanticLogger
68
73
  super(log) && (log.metric_only? ? metrics? : true)
69
74
  end
70
75
 
76
+ # Whether this appender is logging to stdout or stderr
77
+ def console_output?
78
+ false
79
+ end
80
+
71
81
  private
72
82
 
73
83
  # Initializer for Abstract Class SemanticLogger::Subscriber
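
With `#log` now declared on the base class and raising `NotImplementedError`, a custom subscriber only needs to implement that single hook. A hedged sketch; the class name and in-memory destination are illustrative:

~~~ruby
class ListAppender < SemanticLogger::Subscriber
  attr_reader :lines

  def initialize(**args, &block)
    @lines = []
    super(**args, &block)
  end

  # Called for every log event that passes this subscriber's level and filter checks.
  def log(log)
    @lines << formatter.call(log, self)
    true
  end
end

SemanticLogger.add_appender(appender: ListAppender.new(level: :info))
~~~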
@@ -24,7 +24,7 @@ module SemanticLogger
24
24
  def self.logger
25
25
  @logger ||=
26
26
  begin
27
- l = SemanticLogger::Appender::File.new(io: $stderr, level: :warn)
27
+ l = SemanticLogger::Appender::IO.new($stderr, level: :warn)
28
28
  l.name = name
29
29
  l
30
30
  end
@@ -0,0 +1,34 @@
1
+ module SemanticLogger
2
+ module Test
3
+ # Logging class that captures all log events in memory.
4
+ #
5
+ # Example:
6
+ #
7
+ # class UserTest < ActiveSupport::TestCase
8
+ # describe User do
9
+ # let(:capture_logger) { SemanticLogger::Test::CaptureLogEvents.new }
10
+ # let(:user) { User.new }
11
+ #
12
+ # it "logs message" do
13
+ # user.stub(:logger, capture_logger) do
14
+ # user.enable!
15
+ # end
16
+ # assert_equal "Hello World", capture_logger.events.last.message
17
+ # assert_equal :info, capture_logger.events.last.level
18
+ # end
19
+ # end
20
+ # end
21
+ class CaptureLogEvents < SemanticLogger::Subscriber
22
+ attr_accessor :events
23
+
24
+ # By default collect all log levels, and collect metric only log events.
25
+ def initialize(level: :trace, metrics: true)
26
+ super(level: level, metrics: true)
27
+ end
28
+
29
+ def log(log)
30
+ (@events ||= []) << log
31
+ end
32
+ end
33
+ end
34
+ end
@@ -1,3 +1,3 @@
1
1
  module SemanticLogger
2
- VERSION = "4.8.2".freeze
2
+ VERSION = "4.10.0".freeze
3
3
  end
@@ -32,6 +32,10 @@ module SemanticLogger
32
32
  autoload :Minitest, "semantic_logger/reporters/minitest"
33
33
  end
34
34
 
35
+ module Test
36
+ autoload :CaptureLogEvents, "semantic_logger/test/capture_log_events"
37
+ end
38
+
35
39
  if defined?(JRuby)
36
40
  module JRuby
37
41
  autoload :GarbageCollectionLogger, "semantic_logger/jruby/garbage_collection_logger"
metadata CHANGED
@@ -1,14 +1,14 @@
1
1
  --- !ruby/object:Gem::Specification
2
2
  name: semantic_logger
3
3
  version: !ruby/object:Gem::Version
4
- version: 4.8.2
4
+ version: 4.10.0
5
5
  platform: ruby
6
6
  authors:
7
7
  - Reid Morrison
8
8
  autorequire:
9
9
  bindir: bin
10
10
  cert_chain: []
11
- date: 2021-08-15 00:00:00.000000000 Z
11
+ date: 2022-02-05 00:00:00.000000000 Z
12
12
  dependencies:
13
13
  - !ruby/object:Gem::Dependency
14
14
  name: concurrent-ruby
@@ -45,11 +45,13 @@ files:
45
45
  - lib/semantic_logger/appender/graylog.rb
46
46
  - lib/semantic_logger/appender/honeybadger.rb
47
47
  - lib/semantic_logger/appender/http.rb
48
+ - lib/semantic_logger/appender/io.rb
48
49
  - lib/semantic_logger/appender/kafka.rb
49
50
  - lib/semantic_logger/appender/mongodb.rb
50
51
  - lib/semantic_logger/appender/new_relic.rb
51
52
  - lib/semantic_logger/appender/rabbitmq.rb
52
53
  - lib/semantic_logger/appender/sentry.rb
54
+ - lib/semantic_logger/appender/sentry_ruby.rb
53
55
  - lib/semantic_logger/appender/splunk.rb
54
56
  - lib/semantic_logger/appender/splunk_http.rb
55
57
  - lib/semantic_logger/appender/syslog.rb
@@ -87,6 +89,7 @@ files:
87
89
  - lib/semantic_logger/subscriber.rb
88
90
  - lib/semantic_logger/sync.rb
89
91
  - lib/semantic_logger/sync_processor.rb
92
+ - lib/semantic_logger/test/capture_log_events.rb
90
93
  - lib/semantic_logger/utils.rb
91
94
  - lib/semantic_logger/version.rb
92
95
  homepage: https://logger.rocketjob.io
@@ -108,7 +111,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
108
111
  - !ruby/object:Gem::Version
109
112
  version: '0'
110
113
  requirements: []
111
- rubygems_version: 3.2.15
114
+ rubygems_version: 3.3.3
112
115
  signing_key:
113
116
  specification_version: 4
114
117
  summary: Feature rich logging framework, and replacement for existing Ruby & Rails