logstash-output-scalyr 0.2.6 → 0.2.8.beta
- checksums.yaml +4 -4
- data/CHANGELOG.md +26 -0
- data/README.md +14 -29
- data/lib/logstash/outputs/scalyr.rb +42 -12
- data/lib/scalyr/common/client.rb +23 -39
- data/lib/scalyr/constants.rb +6 -7
- data/logstash-output-scalyr.gemspec +2 -2
- metadata +5 -23
- data/spec/benchmarks/bignum_fixing.rb +0 -87
- data/spec/benchmarks/flattening_and_serialization.rb +0 -100
- data/spec/benchmarks/json_serialization.rb +0 -85
- data/spec/benchmarks/metrics_overhead.rb +0 -48
- data/spec/benchmarks/set_session_level_serverhost_on_events.rb +0 -107
- data/spec/benchmarks/util.rb +0 -24
- data/spec/logstash/outputs/scalyr_integration_spec.rb +0 -337
- data/spec/logstash/outputs/scalyr_spec.rb +0 -1262
- data/spec/scalyr/common/util_spec.rb +0 -543
checksums.yaml
CHANGED

@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: da91d80d418081307dc1a6b756f17024712c0eab9a6d527a140288976a58b76f
+  data.tar.gz: 2ac792b42939edb371614c73c4dbb50fcc835f199d03685ff619d9d355c6465a
 SHA512:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 0a87ca7115a2eb3f03282046510bd51869e575eb29b8bec6ee273cbc9f68fea0167890835ffdda8bd0d571ac4299aff689642542d8dd5d88eaa2a89fe760a3f0
+  data.tar.gz: 7457fff5003d10b7288a9a958f3ec3f4a490c19da51a3ac3ade0402a37f74fafffc2501185857b4b3a7263753cb6af3276471724870efa0dca9e0773671b2f3f
data/CHANGELOG.md
CHANGED

@@ -1,5 +1,31 @@
 # Beta
 
+## 0.2.8.beta
+
+* Update ``.gemspec`` gem metadata to not include the ``spec/`` directory with the tests and test
+  fixtures in the actual production gem file.
+
+* Do not retry requests that will never be accepted by the server.
+  Specifically, any request that returns HTTP status code 413 is too large and
+  will never be accepted. Instead of simply retrying for 10 minutes before
+  sending the request to the DLQ, skip the retries and go directly to sending the
+  request to the DLQ.
+
+  To be notified when an event fails to be ingested for whatever reason, create
+  an alert using the query: ``parser='logstash_plugin_metrics'
+  failed_events_processed > 0``. Instructions on how to create an alert can be
+  found in our docs here: https://scalyr.com/help/alerts
+
+## 0.2.7.beta
+
+* SSL cert validation code has been simplified. The ``ssl_ca_bundle_path`` config option now
+  defaults to the CA bundle path which is vendored / bundled with the RubyGem and includes CA
+  certs of the authorities which are used for DataSet API endpoint certificates.
+
+  In addition to that, the ``append_builtin_cert`` config option has been removed and the code now
+  throws an error in case an invalid / nonexistent path is specified for the ``ssl_ca_bundle_path``
+  config option - this represents a fatal config error.
+
 ## 0.2.6.beta, 0.2.6
 
 * Update default value of ``ssl_ca_bundle_path`` config option to
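The 0.2.8 retry-skip behavior described in the changelog can be sketched in a few lines of Ruby. This is a minimal illustration, not the plugin's actual code: `deliver`, `send_request`, and the error class here are hypothetical stand-ins for the plugin's `multi_receive` / DLQ logic.

```ruby
# Minimal sketch of the 0.2.8 behavior: requests rejected with HTTP 413
# are never retried, everything else gets the usual retry loop before
# landing in the dead letter queue (DLQ).
class PayloadTooLargeError < StandardError; end

def deliver(request, max_retries: 5)
  attempts = 0
  begin
    attempts += 1
    send_request(request)      # hypothetical transport call
    [:sent, attempts]
  rescue PayloadTooLargeError
    # A 413 response can never succeed - skip retries, go straight to the DLQ
    [:dlq, attempts]
  rescue StandardError
    retry if attempts < max_retries
    [:dlq, attempts]
  end
end
```

An oversized request is routed to the DLQ after a single attempt, while a transient failure still consumes the full retry budget first.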
data/README.md
CHANGED

@@ -132,42 +132,23 @@ set on the event object to prevent API from rejecting an invalid request.
 
 ## Note On Server SSL Certificate Validation
 
-By default when validating DataSet endpoint server SSL certificate, logstash
-of
-to issue / sign server certificates used by the DataSet API endpoint.
+By default when validating the DataSet endpoint server SSL certificate, logstash uses the CA certificate
+bundle which is vendored / bundled with the RubyGem / plugin. This bundle includes CA certificate
+files of the authorities which are used to issue DataSet API endpoint certificates.
 
+If you want to use the system CA bundle, you should update ``ssl_ca_bundle_path`` to the system CA bundle
+path (e.g. ``/etc/ssl/certs/ca-certificates.crt``), as shown in the example below:
 
 ```
 output {
   scalyr {
     api_write_token => 'SCALYR_API_KEY'
     ...
-    # You only need to set this config option in case default CA bundle path on your system is
-    # different
     ssl_ca_bundle_path => "/etc/ssl/certs/ca-certificates.crt"
-    append_builtin_cert => false
   }
 }
 ```
 
-In case you want to use only root CA certs which are bundled with the plugin (not use system CA
-certs bundle), you can do that by using the following config options:
-
-```
-output {
-  scalyr {
-    api_write_token => 'SCALYR_API_KEY'
-    ...
-    # You only need to set this config option in case default CA bundle path on your system is
-    # different
-    ssl_ca_bundle_path => nil
-    append_builtin_cert => true
-  }
-}
-```
 
 ## Options
 
@@ -185,7 +166,7 @@ output {
 
 - Path to SSL CA bundle file which is used to verify the server certificate.
 
-`config :ssl_ca_bundle_path, :validate => :string, :default =>
+`config :ssl_ca_bundle_path, :validate => :string, :default => CA_CERTS_PATH`
 
 If for some reason you need to disable server cert validation (you are strongly recommended to
 not disable it unless specifically instructed to do so or have a valid reason for it), you can do
@@ -460,6 +441,12 @@ Or to run a single test function defined on line XXX
 bundle exec rspec spec/scalyr/common/util_spec.rb:XXX
 ```
 
+Or using more verbose output mode:
+
+```bash
+bundle exec rspec -fd spec/scalyr/common/util_spec.rb
+```
+
 ## Instrumentation and metrics
 
 By default, plugin logs a special line with metrics to Scalyr every 5 minutes. This line contains
@@ -518,10 +505,9 @@ To deploy the current code on your machine run these commands:
 
 ```
 rm -rf vendor/
-bundle
+bundle install
 curl -u RUBY_USER:RUBY_PASSWORD https://rubygems.org/api/v1/api_key.yaml > ~/.gem/credentials
 chmod 0600 ~/.gem/credentials
-bundle exec rake vendor
 bundle exec rspec
 bundle exec rake publish_gem
 ```
@@ -531,10 +517,9 @@ Or as an alternative if ``rake publish_gem`` task doesn't appear to work for wha
 
 ```
 rm -rf vendor/
-bundle
+bundle install
 curl -u RUBY_USER:RUBY_PASSWORD https://rubygems.org/api/v1/api_key.yaml > ~/.gem/credentials
 chmod 0600 ~/.gem/credentials
-bundle exec rake vendor
 bundle exec rspec
 rvm use jruby
 bundle exec gem build logstash-output-scalyr.gemspec
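A quick way to sanity-check a CA bundle before pointing ``ssl_ca_bundle_path`` at it is to verify that the file exists and contains at least one PEM certificate block. A minimal sketch (the helper name is made up for illustration; the Debian/Ubuntu path matches the README example above):

```ruby
# Returns true only if the given path is a regular file containing
# at least one PEM-encoded certificate block.
def usable_ca_bundle?(path)
  return false unless File.file?(path)
  File.read(path).include?("-----BEGIN CERTIFICATE-----")
end

# e.g. usable_ca_bundle?('/etc/ssl/certs/ca-certificates.crt')
```

Running a check like this before reconfiguring the plugin avoids the fatal config error that 0.2.7+ raises for a nonexistent bundle path.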
data/lib/logstash/outputs/scalyr.rb
CHANGED

@@ -133,13 +133,14 @@ class LogStash::Outputs::Scalyr < LogStash::Outputs::Base
   # Whether or not to verify the connection to Scalyr, only set to false for debugging.
   config :ssl_verify_peer, :validate => :boolean, :default => true
 
-  # Path to SSL bundle file.
-  #
-  #
-
+  # Path to SSL bundle file used to validate remote / server SSL certificate. By default, path to
+  # the CA bundle which is vendored / bundled with the RubyGem is used.
+  # If a user has a specific reason to change this value (e.g. to a system CA bundle such as
+  # /etc/ssl/certs/ca-certificates.crt), they can update this option.
   config :ssl_ca_bundle_path, :validate => :string, :default => CA_CERTS_PATH
 
-  #
-  config :append_builtin_cert, :validate => :boolean, :default =>
+  # Unused since v0.2.7, left here for backward compatibility reasons
+  config :append_builtin_cert, :validate => :boolean, :default => false
 
   config :max_request_buffer, :validate => :number, :default => 5500000 # echee TODO: eliminate?
   config :force_message_encoding, :validate => :string, :default => nil
@@ -261,6 +262,10 @@ class LogStash::Outputs::Scalyr < LogStash::Outputs::Base
       end
     end
 
+    if not @append_builtin_cert.nil?
+      @logger.warn "append_builtin_cert config option has been deprecated and is unused in versions 0.2.7 and above"
+    end
+
     @dlq_writer = dlq_enabled? ? execution_context.dlq_writer : nil
 
     @message_encoding = nil
@@ -352,7 +357,7 @@ class LogStash::Outputs::Scalyr < LogStash::Outputs::Base
     @running = true
     @client_session = Scalyr::Common::Client::ClientSession.new(
       @logger, @add_events_uri,
-      @compression_type, @compression_level, @ssl_verify_peer, @ssl_ca_bundle_path,
+      @compression_type, @compression_level, @ssl_verify_peer, @ssl_ca_bundle_path,
       @record_stats_for_status, @flush_quantile_estimates_on_status_send,
       @http_connect_timeout, @http_socket_timeout, @http_request_timeout, @http_pool_max, @http_pool_max_per_route
     )
@@ -378,7 +383,7 @@ class LogStash::Outputs::Scalyr < LogStash::Outputs::Base
     # This way we don't need to keep idle long running connection open.
     initial_send_status_client_session = Scalyr::Common::Client::ClientSession.new(
       @logger, @add_events_uri,
-      @compression_type, @compression_level, @ssl_verify_peer, @ssl_ca_bundle_path,
+      @compression_type, @compression_level, @ssl_verify_peer, @ssl_ca_bundle_path,
       @record_stats_for_status, @flush_quantile_estimates_on_status_send,
       @http_connect_timeout, @http_socket_timeout, @http_request_timeout, @http_pool_max, @http_pool_max_per_route
     )
@@ -495,6 +500,24 @@ class LogStash::Outputs::Scalyr < LogStash::Outputs::Base
         result.push(multi_event_request)
       end
 
+    rescue Scalyr::Common::Client::PayloadTooLargeError => e
+      # if the payload is too large, we do not retry. we send to DLQ or drop it.
+      exc_data = {
+        :error_class => e.e_class,
+        :url => e.url.to_s,
+        :message => e.message,
+        :batch_num => batch_num,
+        :total_batches => total_batches,
+        :record_count => multi_event_request[:record_count],
+        :payload_size => multi_event_request[:body].bytesize,
+      }
+      exc_data[:code] = e.code if e.code
+      if defined?(e.body) and e.body
+        exc_data[:body] = Scalyr::Common::Util.truncate(e.body, 512)
+      end
+      exc_data[:payload] = "\tSample payload: #{multi_event_request[:body][0,1024]}..."
+      log_retry_failure(multi_event_request, exc_data, 0, 0)
+      next
     rescue Scalyr::Common::Client::ServerError, Scalyr::Common::Client::ClientError => e
       sleep_interval = sleep_for(sleep_interval)
       exc_sleep += sleep_interval
@@ -629,18 +652,25 @@ class LogStash::Outputs::Scalyr < LogStash::Outputs::Base
       @multi_receive_statistics[:total_events_processed] += multi_event_request[:logstash_events].length
       @multi_receive_statistics[:failed_events_processed] += multi_event_request[:logstash_events].length
     end
-    message = "Failed to send #{multi_event_request[:logstash_events].length} events after #{exc_retries} tries."
     sample_events = Array.new
     multi_event_request[:logstash_events][0,5].each {|l_event|
       sample_events << Scalyr::Common::Util.truncate(l_event.to_hash.to_json, 256)
     }
-
+    if exc_data[:code] == 413
+      message = "Failed to send #{multi_event_request[:logstash_events].length} events due to exceeding maximum request size. Not retrying non-retriable request."
+      # For PayloadTooLargeError we already include sample Scalyr payload in exc_data so there is no need
+      # to include redundant sample Logstash event objects
+      @logger.error(message, :error_data => exc_data)
+    else
+      message = "Failed to send #{multi_event_request[:logstash_events].length} events after #{exc_retries} tries."
+      @logger.error(message, :error_data => exc_data, :sample_events => sample_events, :retries => exc_retries, :sleep_time => exc_sleep)
+    end
     if @dlq_writer
       multi_event_request[:logstash_events].each {|l_event|
        @dlq_writer.write(l_event, "#{exc_data[:message]}")
      }
     else
-      @logger.warn("Dead letter queue not configured, dropping #{multi_event_request[:logstash_events].length} events
+      @logger.warn("Dead letter queue not configured, dropping #{multi_event_request[:logstash_events].length} events.", :sample_events => sample_events)
     end
   end
 
@@ -1045,7 +1075,7 @@ class LogStash::Outputs::Scalyr < LogStash::Outputs::Base
     serialized_request_size = serialized_body.bytesize
 
     # We give it "buffer" since the splitting code allows for some slack and doesn't take into account top-level non-event attributes
-    if
+    if serialized_request_size >= @max_request_buffer + 5000
       # TODO: If we end up here and the estimate config option is false, split the request here into multiple ones
       @logger.warn("Serialized request size (#{serialized_request_size}) is larger than max_request_buffer (#{max_request_buffer})!")
     end
data/lib/scalyr/common/client.rb
CHANGED

@@ -27,6 +27,18 @@ end
 # An exception that signifies the Scalyr server received the upload request but dropped it
 #---------------------------------------------------------------------------------------------------------------------
 class RequestDroppedError < ServerError;
+  def initialize(msg=nil, code=nil, url=nil, body=nil, e_class="Scalyr::Common::Client::RequestDroppedError")
+    super(msg, code, url, body, e_class)
+  end
+end
+
+#---------------------------------------------------------------------------------------------------------------------
+# An exception that signifies the Scalyr server received the upload request but dropped it due to it being too large.
+#---------------------------------------------------------------------------------------------------------------------
+class PayloadTooLargeError < ServerError;
+  def initialize(msg=nil, code=nil, url=nil, body=nil, e_class="Scalyr::Common::Client::PayloadTooLargeError")
+    super(msg, code, url, body, e_class)
+  end
 end
 
 #---------------------------------------------------------------------------------------------------------------------
@@ -57,7 +69,7 @@ end
 class ClientSession
 
   def initialize(logger, add_events_uri, compression_type, compression_level,
-                 ssl_verify_peer, ssl_ca_bundle_path,
+                 ssl_verify_peer, ssl_ca_bundle_path,
                  record_stats_for_status, flush_quantile_estimates_on_status_send,
                  connect_timeout, socket_timeout, request_timeout, pool_max, pool_max_per_route)
     @logger = logger
@@ -66,7 +78,6 @@ class ClientSession
     @compression_level = compression_level
     @ssl_verify_peer = ssl_verify_peer
     @ssl_ca_bundle_path = ssl_ca_bundle_path
-    @append_builtin_cert = append_builtin_cert
     @record_stats_for_status = record_stats_for_status
     @flush_quantile_estimates_on_status_send = flush_quantile_estimates_on_status_send
     @connect_timeout = connect_timeout
@@ -75,9 +86,6 @@ class ClientSession
     @pool_max = pool_max
     @pool_max_per_route = pool_max_per_route
 
-    # A cert to use by default to avoid issues caused by the OpenSSL library not validating certs according to standard
-    @cert_string = CA_CERT_STRING
-
     # Request statistics are accumulated across multiple threads and must be accessed through a mutex
     @stats_lock = Mutex.new
     @latency_stats = get_new_latency_stats
@@ -120,41 +128,15 @@ class ClientSession
     # verify peers to prevent potential MITM attacks
     if @ssl_verify_peer
       c[:ssl][:verify] = :strict
+      @logger.info("Using CA bundle from #{@ssl_ca_bundle_path} to validate the server side certificate")
 
-      if not @
-        @logger.info("Using CA bundle from #{@ssl_ca_bundle_path} to validate the server side certificate")
-        @ca_cert_path = @ssl_ca_bundle_path
-
-        if not File.file?(@ssl_ca_bundle_path)
-          # TODO: For now we don't throw to keep code backward compatible. In the future in case
-          # file doesn't exist, we should throw instead of write empty CA cert file and pass that
-          # to Manticore which will eventually fail and throw on cert validation
-          #raise Errno::ENOENT.new("ssl_ca_bundle_path config option to an invalid file path which doesn't exist - #{@ssl_ca_bundle_path}")
-          @ca_cert = Tempfile.new("ca_cert")
-          @ca_cert_path = @ca_cert.path
-        end
-      else
-        @ca_cert = Tempfile.new("ca_cert")
-
-        if File.file?(@ssl_ca_bundle_path)
-          @ca_cert.write(File.read(@ssl_ca_bundle_path))
-          @ca_cert.flush
-        else
-          @logger.warn("CA bundle (#{@ssl_ca_bundle_path}) doesn't exist, using only bundled CA certificates")
-        end
-
-        open(@ca_cert.path, "a") do |f|
-          f.puts @cert_string
-        end
-
-        @ca_cert.flush
-        @ca_cert_path = @ca_cert.path
-
-        @logger.info("Using CA bundle from #{@ssl_ca_bundle_path} combined with bundled certificates to validate the server side certificate (#{@ca_cert_path})")
+      if not File.file?(@ssl_ca_bundle_path)
+        raise Errno::ENOENT.new("Invalid path for ssl_ca_bundle_path config option - file doesn't exist or is not readable")
       end
-
+
+      c[:ssl][:ca_file] = @ssl_ca_bundle_path
     else
+      @logger.warn("SSL certificate validation has been disabled. You are strongly encouraged to enable it to prevent possible MITM and similar attacks.")
       c[:ssl][:verify] = :disable
     end
 
@@ -338,15 +320,17 @@ class ClientSession
     end
 
     status = response_hash["status"]
+    code = response.code.to_s.strip.to_i
 
     if status != "success"
-      if
+      if code == 413
+        raise PayloadTooLargeError.new(status, response.code, @add_events_uri, response.body)
+      elsif status =~ /discardBuffer/
         raise RequestDroppedError.new(status, response.code, @add_events_uri, response.body)
       else
        raise ServerError.new(status, response.code, @add_events_uri, response.body)
      end
     else
-      code = response.code.to_s.strip.to_i
       if code < 200 or code > 299
         raise ServerError.new(status, response.code, @add_events_uri, response.body)
       end
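The new response handling in client.rb checks HTTP 413 before the generic failure cases, so an oversized payload is classified as non-retriable up front. A simplified sketch of that ordering (the exception class names mirror the diff; `classify_failure` itself is a hypothetical helper, not part of the plugin):

```ruby
# Stand-in exception hierarchy mirroring Scalyr::Common::Client.
class ServerError < StandardError; end
class RequestDroppedError < ServerError; end
class PayloadTooLargeError < ServerError; end

# Mirrors the order of checks added in 0.2.8: HTTP 413 first (never
# retriable), then the discardBuffer case, then any other failure.
# Returns nil for a successful response, else the error class to raise.
def classify_failure(status, code)
  return nil if status == "success" && (200..299).cover?(code)
  return PayloadTooLargeError if code == 413
  return RequestDroppedError if status =~ /discardBuffer/
  ServerError
end
```

Because the 413 check comes first, a too-large request is never mistaken for a droppable or generically retriable one.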
data/lib/scalyr/constants.rb
CHANGED

@@ -1,15 +1,14 @@
 # encoding: utf-8
 
-PLUGIN_VERSION = "v0.2.
+PLUGIN_VERSION = "v0.2.8.beta"
 
 # Special event level attribute name which can be used for setting event level serverHost attribute
 EVENT_LEVEL_SERVER_HOST_ATTRIBUTE_NAME = '__origServerHost'
 
 # Path to the bundled root CA certs used to sign server cert
-
+CA_CERTS_PATH = File.expand_path(File.join(File.dirname(__FILE__), + "/certs/ca_certs.crt"))
 
-#
-
-CA_CERT_STRING = File.read(CA_CERT_PATH)
+# Additional check on import to catch this issue early (in case of an invalid path or similar)
+if not File.file?(CA_CERTS_PATH)
+  raise Errno::ENOENT.new("Invalid path specified for CA_CERTS_PATH module constant (likely a developer error).")
+end
data/logstash-output-scalyr.gemspec
CHANGED

@@ -1,6 +1,6 @@
 Gem::Specification.new do |s|
   s.name = 'logstash-output-scalyr'
-  s.version = '0.2.
+  s.version = '0.2.8.beta'
   s.licenses = ['Apache-2.0']
   s.summary = "Scalyr output plugin for Logstash"
   s.description = "Sends log data collected by Logstash to Scalyr (https://www.scalyr.com)"
@@ -10,7 +10,7 @@ Gem::Specification.new do |s|
   s.require_paths = ["lib"]
 
   # Files
-  s.files = Dir['lib/**/*','
+  s.files = Dir['lib/**/*','*.gemspec','*.md','CONTRIBUTORS','Gemfile','LICENSE','NOTICE.TXT']
   # Tests
   s.test_files = s.files.grep(%r{^(test|spec|features)/})
metadata
CHANGED

@@ -1,14 +1,14 @@
 --- !ruby/object:Gem::Specification
 name: logstash-output-scalyr
 version: !ruby/object:Gem::Version
-  version: 0.2.
+  version: 0.2.8.beta
 platform: ruby
 authors:
 - Edward Chee
 autorequire:
 bindir: bin
 cert_chain: []
-date: 2022-
+date: 2022-10-20 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
   requirement: !ruby/object:Gem::Requirement
@@ -165,15 +165,6 @@ files:
 - lib/scalyr/common/util.rb
 - lib/scalyr/constants.rb
 - logstash-output-scalyr.gemspec
-- spec/benchmarks/bignum_fixing.rb
-- spec/benchmarks/flattening_and_serialization.rb
-- spec/benchmarks/json_serialization.rb
-- spec/benchmarks/metrics_overhead.rb
-- spec/benchmarks/set_session_level_serverhost_on_events.rb
-- spec/benchmarks/util.rb
-- spec/logstash/outputs/scalyr_integration_spec.rb
-- spec/logstash/outputs/scalyr_spec.rb
-- spec/scalyr/common/util_spec.rb
 homepage: https://www.scalyr.com/help/data-sources#logstash
 licenses:
 - Apache-2.0
@@ -191,22 +182,13 @@ required_ruby_version: !ruby/object:Gem::Requirement
     version: '0'
 required_rubygems_version: !ruby/object:Gem::Requirement
   requirements:
-  - - "
+  - - ">"
   - !ruby/object:Gem::Version
-    version:
+    version: 1.3.1
 requirements: []
 rubyforge_project:
 rubygems_version: 2.7.10
 signing_key:
 specification_version: 4
 summary: Scalyr output plugin for Logstash
-test_files:
-- spec/benchmarks/bignum_fixing.rb
-- spec/benchmarks/flattening_and_serialization.rb
-- spec/benchmarks/json_serialization.rb
-- spec/benchmarks/metrics_overhead.rb
-- spec/benchmarks/set_session_level_serverhost_on_events.rb
-- spec/benchmarks/util.rb
-- spec/logstash/outputs/scalyr_integration_spec.rb
-- spec/logstash/outputs/scalyr_spec.rb
-- spec/scalyr/common/util_spec.rb
+test_files: []
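The gemspec and metadata changes above mean the packaged file list no longer contains anything under `spec/`, which is why `test_files` (computed by grepping `s.files`) collapses to an empty list. A small sketch of that effect, using a made-up file list rather than the real repository contents:

```ruby
# Pre-0.2.8 the gem packaged spec/ files; 0.2.8 ships only lib/ plus a
# few top-level files, so the test_files grep over s.files matches nothing.
files = ['lib/scalyr/constants.rb', 'README.md', 'Gemfile',
         'spec/scalyr/common/util_spec.rb']       # hypothetical pre-0.2.8 list

packaged   = files.reject { |f| f.start_with?('spec/') }   # 0.2.8 behavior
test_files = packaged.grep(%r{^(test|spec|features)/})     # same grep as the gemspec
```

With no `spec/` entries in `packaged`, `test_files` is `[]`, matching the `test_files: []` line in the new gem metadata.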
data/spec/benchmarks/bignum_fixing.rb
DELETED

@@ -1,87 +0,0 @@
-require 'benchmark'
-require 'quantile'
-
-require_relative '../../lib/scalyr/common/util'
-require_relative './util'
-
-# Micro benchmark which measures how long it takes to find all the Bignums in a record and convert them to strings
-
-ITERATIONS = 500
-
-def rand_bignum()
-  return 200004000020304050300 + rand(999999)
-end
-
-def generate_hash(widths)
-  result = {}
-  if widths.empty?
-    return rand_bignum()
-  else
-    widths[0].times do
-      result[rand_str(9)] = generate_hash(widths[1..widths.length])
-    end
-    return result
-  end
-end
-
-def generate_data_array_for_spec(spec)
-  data = []
-  ITERATIONS.times do
-    data << generate_hash(spec)
-  end
-
-  data
-end
-
-def run_benchmark_and_print_results(data, run_benchmark_func)
-  puts ""
-  puts "Using %s total keys in a hash" % [Scalyr::Common::Util.flatten(data[0]).count]
-  puts ""
-
-  result = []
-  ITERATIONS.times do |i|
-    result << Benchmark.measure { run_benchmark_func.(data[i]) }
-  end
-
-  sum = result.inject(nil) { |sum, t| sum.nil? ? sum = t : sum += t }
-  avg = sum / result.size
-
-  Benchmark.bm(7, "sum:", "avg:") do |b|
-    [sum, avg]
-  end
-  puts ""
-end
-
-
-puts "Using %s iterations" % [ITERATIONS]
-puts ""
-
-@value = Quantile::Estimator.new
-@prng = Random.new
-
-def convert_bignums(record)
-  Scalyr::Common::Util.convert_bignums(record)
-end
-
-puts "Util.convert_bignums()"
-puts "==============================="
-
-# Around ~200 keys in a hash
-data = generate_data_array_for_spec([4, 4, 3, 4])
-run_benchmark_and_print_results(data, method(:convert_bignums))
-
-# Around ~200 keys in a hash (single level)
-data = generate_data_array_for_spec([200])
-run_benchmark_and_print_results(data, method(:convert_bignums))
-
-# Around ~512 keys in a hash
-data = generate_data_array_for_spec([8, 4, 4, 4])
-run_benchmark_and_print_results(data, method(:convert_bignums))
-
-# Around ~960 keys in a hash
-data = generate_data_array_for_spec([12, 5, 4, 4])
-run_benchmark_and_print_results(data, method(:convert_bignums))
-
-# Around ~2700 keys in a hash
-data = generate_data_array_for_spec([14, 8, 6, 4])
-run_benchmark_and_print_results(data, method(:convert_bignums))
data/spec/benchmarks/flattening_and_serialization.rb
DELETED

@@ -1,100 +0,0 @@
-require 'benchmark'
-require 'json'
-
-require_relative '../../lib/scalyr/common/util'
-require_relative './util'
-
-# NOTE: When using jRuby using multiple iterations with the same dataset doesn't make
-# sense since it will just use JITed version of the code which will be very fast. If we
-# wanted to accurately measure using multiple iterations we would need te different
-# input data for each iteration.
-ITERATIONS = 500
-
-def run_benchmark_and_print_results(data, run_benchmark_func)
-  puts ""
-  puts "Using %s total keys in a hash" % [Scalyr::Common::Util.flatten(data[0]).count]
-  puts ""
-
-  result = []
-  ITERATIONS.times do |i|
-    result << Benchmark.measure { run_benchmark_func.(data[i]) }
-  end
-
-  sum = result.inject(nil) { |sum, t| sum.nil? ? sum = t : sum += t }
-  avg = sum / result.size
-
-  Benchmark.bm(7, "sum:", "avg:") do |b|
-    [sum, avg]
-  end
-  puts ""
-end
-
-def flatten_data_func(data)
-  Scalyr::Common::Util.flatten(data)
-end
-
-def json_serialize_data(data)
-  data.to_json
-end
-
-DATASETS = {
-  :keys_50 => generate_data_array_for_spec([3, 3, 3, 2]),
-  :keys_200 => generate_data_array_for_spec([4, 4, 3, 4]),
-  :keys_200_flat => generate_data_array_for_spec([200]),
-  :keys_512 => generate_data_array_for_spec([8, 4, 4, 4]),
-  :keys_960 => generate_data_array_for_spec([12, 5, 4, 4]),
-  :keys_2700 => generate_data_array_for_spec([14, 8, 6, 4])
-}
-
-puts "Using %s iterations" % [ITERATIONS]
-puts ""
-
-puts "Scalyr::Common::Util.flatten()"
-puts "==============================="
-
-# Around ~50 keys in a hash
-data = DATASETS[:keys_50]
-run_benchmark_and_print_results(data, method(:flatten_data_func))
-
-# Around ~200 keys in a hash
-data = DATASETS[:keys_200]
-run_benchmark_and_print_results(data, method(:flatten_data_func))
-
-# Around ~200 keys in a hash (single level)
-data = DATASETS[:keys_200_flat]
-run_benchmark_and_print_results(data, method(:flatten_data_func))
-
-# Around ~512 keys in a hash
-data = DATASETS[:keys_512]
-run_benchmark_and_print_results(data, method(:flatten_data_func))
-
-# Around ~960 keys in a hash
-data = DATASETS[:keys_960]
-run_benchmark_and_print_results(data, method(:flatten_data_func))
-
-# Around ~2700 keys in a hash
-data = DATASETS[:keys_2700]
-run_benchmark_and_print_results(data, method(:flatten_data_func))
-
-puts "JSON.dumps (hash.to_dict)"
-puts "==============================="
-
-# Around ~200 keys in a hash
-data = generate_data_array_for_spec([4, 4, 3, 4])
-run_benchmark_and_print_results(data, method(:json_serialize_data))
-
-# Around ~200 keys in a hash (single level)
-data = DATASETS[:keys_200_flat]
-run_benchmark_and_print_results(data, method(:json_serialize_data))
-
-# Around ~512 keys in a hash
-data = generate_data_array_for_spec([8, 4, 4, 4])
-run_benchmark_and_print_results(data, method(:json_serialize_data))
-
-# Around ~960 keys in a hash
-data = generate_data_array_for_spec([12, 5, 4, 4])
-run_benchmark_and_print_results(data, method(:json_serialize_data))
-
-# Around ~2700 keys in a hash
-data = generate_data_array_for_spec([14, 8, 6, 4])
-run_benchmark_and_print_results(data, method(:json_serialize_data))