logstash-output-scalyr 0.1.26.beta → 0.2.0.beta

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
1
1
  ---
2
2
  SHA256:
3
- metadata.gz: 5593e93dedad0ddb545f8911e7f1cd8cabde4043a6a76b8c5b78d99c0c4d12a5
4
- data.tar.gz: 6c821768c3d780529d00bc1f58c4ec8841ecba2758a46c71ecf17201fb8ba3d1
3
+ metadata.gz: d2f63d5c51030c7bbb033edc0d7f8b8d3affff48f94f0bb2134fdfb75af36a76
4
+ data.tar.gz: 5787605f828a4a4f7d992003ed8a27e1f67439c49fab37d91c5077aa08d50428
5
5
  SHA512:
6
- metadata.gz: 57901dc94f49aa844130f4b2a7ac084c95d473c1a948902b699cbaba8d63049d1e960b69b49b20cf8ffa8b2c1f5f68a53ad9687b72867799f113a384f490e01d
7
- data.tar.gz: 329f985e8121d49bab836381861ca878c229f2b316e160792263461900a8fc4cf7f46bdc2aba889174a0f35b0c67f567b0e5a09d429f888948b788707ab3c0b7
6
+ metadata.gz: ba560b44dd9ad73f56b2f105007a37bedd5b7821f7776c8d58c44ec7bb0fa544e80bd287756f8bb6ca76d53eedd3a1f36131cc60f2e7cf26ef9937af21c0a5a9
7
+ data.tar.gz: 4c8e3d4528ab613bd506e8b87e014ddfbd9725a088171396d483826348e8e7467ca75b1164dadb082b3d0732b37bfe4d4d1985b86cfb4ca389f592294055d2a3
data/CHANGELOG.md CHANGED
@@ -1,9 +1,24 @@
1
1
  # Beta
2
2
 
3
+ ## 0.2.0.beta
4
+
5
+ - Fix a bug and correctly handle the ``serverHost`` event level attribute. Now if an event contains
6
+ ``serverHost`` attribute, this attribute will be correctly set on the event level and available for
7
+ "Sources" filtering in the UI.
8
+ - The plugin no longer sets the ``serverHost`` attribute to a fixed value of ``Logstash`` on each
9
+ event. If you still want this behavior, you can achieve it with the logstash mutate filter.
10
+ - Session level ``serverHost`` value now defaults to the logstash aggregator node hostname
11
+ (``use_hostname_for_serverhost`` config option now defaults to true).
12
+ - ``host`` attribute is now removed by default from all the events. By default, logstash adds
13
+ ``host`` attribute which contains logstash aggregator host to each event. This is now redundant
14
+ and unnecessary with the fixed and improved serverHost behavior (host and serverHost would contain
15
+ the same value by default). If you want to change this behavior and still include ``host``
16
+ attribute on each event you can do that by setting ``remove_host_attribute_from_events`` config
17
+ option to false.
18
+
3
19
  ## 0.1.26.beta
4
20
  - Add support for new ``json_library`` config option. Valid values are ``stdlib`` (default) and ``jrjackson``. The latter may offer 2-4x faster JSON serialization.
5
21
 
6
-
7
22
  ## 0.1.23.beta
8
23
  - Add testing support for disabling estimation of serialized event size for each event in the batch.
9
24
 
data/README.md CHANGED
@@ -9,7 +9,7 @@ You can view documentation for this plugin [on the Scalyr website](https://app.s
9
9
  # Quick start
10
10
 
11
11
  1. Build the gem, run `gem build logstash-output-scalyr.gemspec`
12
- 2. Install the gem into a Logstash installation, run `/usr/share/logstash/bin/logstash-plugin install logstash-output-scalyr-0.1.25.beta.gem` or follow the latest official instructions on working with plugins from Logstash.
12
+ 2. Install the gem into a Logstash installation, run `/usr/share/logstash/bin/logstash-plugin install logstash-output-scalyr-0.2.0.beta.gem` or follow the latest official instructions on working with plugins from Logstash.
13
13
  3. Configure the output plugin (e.g. add it to a pipeline .conf)
14
14
  4. Restart Logstash
15
15
 
@@ -39,6 +39,51 @@ output {
39
39
 
40
40
  In the above example, the Logstash pipeline defines a file input that reads from `/var/log/messages`. Log events from this source have the `host` and `path` fields. The pipeline then outputs to the scalyr plugin, which in this example is configured to remap `host`->`serverHost` and `path`->`logfile`, thus facilitating filtering in the Scalyr UI.
41
41
 
42
+ ## Notes on serverHost attribute handling
43
+
44
+ > Some of this functionality has been fixed and changed in the v0.2.0.beta release. In previous
45
+ versions, the plugin added a ``serverHost`` attribute with a value of ``Logstash`` to each event and
46
+ this attribute was not handled correctly - it was treated as a regular event level attribute
47
+ and not a special attribute which can be used for Source functionality and filtering.
48
+
49
+ By default, this plugin will set ``serverHost`` for all the events in a batch to match the hostname of
50
+ the logstash node where the output plugin is running.
51
+
52
+ You can change that either by setting ``serverHost`` attribute in the ``server_attributes`` config
53
+ option hash or by setting ``serverHost`` attribute on the event level via logstash record attribute.
54
+
55
+ In both scenarios, you will be able to utilize this value for "Sources" functionality and filtering
56
+ in the Scalyr UI.
57
+
58
+ For example:
59
+
60
+ 1. Define static value for all the events handled by specific plugin instance
61
+
62
+ ```
63
+ output {
64
+ scalyr {
65
+ api_write_token => 'SCALYR_API_KEY'
66
+ server_attributes => {'serverHost' => 'my-host-1'}
67
+ }
68
+ }
69
+ ```
70
+
71
+ 2. Define static value on the event level which is set via logstash filter
72
+
73
+ ```
74
+ mutate {
75
+ add_field => { "serverHost" => "my hostname" }
76
+ }
77
+ ```
78
+
79
+ 3. Define dynamic value on the event level which is set via logstash filter
80
+
81
+ ```
82
+ mutate {
83
+ add_field => { "serverHost" => "%{[host][name]}" }
84
+ }
85
+ ```
86
+
42
87
  ## Options
43
88
 
44
89
  - The Scalyr API write token; these are available at https://www.scalyr.com/keys. This is the only compulsory configuration field required for proper upload
@@ -71,7 +116,7 @@ In the above example, the Logstash pipeline defines a file input that reads from
71
116
  - Related to the server_attributes dictionary above, if you do not define the 'serverHost' key in server_attributes,
72
117
  the plugin will automatically set it, using the aggregator hostname as value, if this value is true.
73
118
 
74
- `config :use_hostname_for_serverhost, :validate => :boolean, :default => false`
119
+ `config :use_hostname_for_serverhost, :validate => :boolean, :default => true`
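+
+ For example, a minimal sketch that opts out of the hostname based default (the session will then
+ carry no ``serverHost`` unless you set one explicitly via ``server_attributes``):
+
+ ```
+ output {
+   scalyr {
+     api_write_token => 'SCALYR_API_KEY'
+     use_hostname_for_serverhost => false
+   }
+ }
+ ```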
75
120
 
76
121
  ---
77
122
 
@@ -394,3 +439,65 @@ bundle exec rake publish_gem
394
439
 
395
440
  `RUBY_USER` and `RUBY_PASSWORD` should be replaced with the username and password to the RubyGems.org account you wish to release to,
396
441
  these credentials should be found in Keeper.
442
+
443
+ # Testing Plugin Changes
444
+
445
+ This section describes how to test the plugin and changes using the docker compose setup from the
446
+ logstash-config-tester repo available at https://github.com/Kami/logstash-config-tester.
447
+
448
+ This repo has been forked and already contains some changes which make testing the plugin
449
+ easier.
450
+
451
+ The logstash configuration in that repo is set up to receive records encoded as JSON via
452
+ standard input, decode the JSON into the event object and print it to stdout + send it
453
+ to the Scalyr output.
454
+
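+ Roughly, that pipeline has the following shape (a sketch only; the actual, templated config lives
+ under ``pipeline/`` in the tester repo and may differ):
+
+ ```
+ input {
+   stdin {
+     codec => json
+   }
+ }
+
+ output {
+   stdout {
+     codec => rubydebug
+   }
+
+   scalyr {
+     api_write_token => 'SCALYR_API_KEY'
+   }
+ }
+ ```
+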
455
+ ```bash
456
+ # 0. Clone the tester repo
457
+ git clone https://github.com/Kami/logstash-config-tester ~/
458
+
459
+ # 1. Build the plugin
460
+ gem build logstash-output-scalyr.gemspec
461
+
462
+ # 2. Copy it to the config test repo
463
+ cp logstash-output-scalyr-0.2.0.beta.gem ~/logstash-config-test/logstash-output-scalyr.gem
464
+
465
+ # 3. Build docker image with the latest dev version of the plugin (may take a while)
466
+ docker-compose build
467
+
468
+ # 4. Configure API key in docker-compose.yml and make any changes to plugin config in
469
+ # pipeline/scalyr_output.conf.j2, if necessary
470
+ vim docker-compose.yml
471
+
472
+ vim pipeline/scalyr_output.conf.j2
473
+
474
+ # 5. Run logstash with the stdin input and stdout + scalyr output
475
+ docker-compose run logstash
476
+ ```
477
+
478
+ A couple of things to keep in mind:
479
+
480
+ 1. Logstash log level is set to debug which is quite verbose, but it makes troubleshooting and
481
+ testing easier.
482
+ 2. The plugin accepts records (events) as JSON via standard input. This means that to inject a mock
483
+ event you can simply copy the JSON event representation string to stdin and press enter. If you
484
+ want to submit multiple events to be handled as a single batch, paste each event one at a time
485
+ and press enter at the end. The Logstash pipeline has been configured to wait up to 5 seconds before
486
+ handling the batch which should give you enough time to test batches with multiple events (this
487
+ setting can be adjusted in ``config/logstash.yml`` - ``pipeline.batch.delay``, if needed)
488
+
489
+ Example values you can enter into stdin:
490
+
491
+ 1. Single event batch
492
+
493
+ ```javascript
494
+ {"foo": "bar", "message": "test logstash"}
495
+ ```
496
+
497
+ 2. Batch with 3 events
498
+
499
+ ```javascript
500
+ {"serverHost": "host-1", "bar": "baz", "message": "test logstash 1"}
501
+ {"serverHost": "host-2", "bar": "baz", "message": "test logstash 2"}
502
+ {"bar": "baz", "message": "test logstash 3"}
503
+ ```
@@ -43,7 +43,7 @@ class LogStash::Outputs::Scalyr < LogStash::Outputs::Base
43
43
 
44
44
  # Related to the server_attributes dictionary above, if you do not define the 'serverHost' key in server_attributes,
45
45
  # the plugin will automatically set it, using the aggregator hostname as value, if this value is true.
46
- config :use_hostname_for_serverhost, :validate => :boolean, :default => false
46
+ config :use_hostname_for_serverhost, :validate => :boolean, :default => true
47
47
 
48
48
  # Field that represents the origin of the log event.
49
49
  # (Warning: for events with an existing 'serverHost' field, it will be overwritten)
@@ -66,6 +66,23 @@ class LogStash::Outputs::Scalyr < LogStash::Outputs::Base
66
66
  # value with have fields moved.
67
67
  config :log_constants, :validate => :array, :default => nil
68
68
 
69
+ # When this option is true and session level server host is defined (either via
70
+ # server_attributes config option or via node hostname) and some events in a batch contain
71
+ # "serverHost" attributes, other nodes in a batch which don't contain it will have serverHost
72
+ # set to the session level value.
73
+ # This is needed because the session level attribute has priority over the event level one, which means
74
+ # that in case we specify serverHost on some events, other events won't have any value set
75
+ # for serverHost.
76
+ # Since this option adds some overhead and requires additional processing time, it's
77
+ # disabled by default.
78
+ config :set_session_level_serverhost_on_events, :validate => :boolean, :default => false
79
+
80
+ # By default, logstash will add "host" attribute which includes logstash aggregator server
81
+ # host to each event. This is no longer needed or desired with the fixed and improved
82
+ # serverHost attribute handling (serverHost now contains logstash aggregator hostname by
83
+ # default).
84
+ config :remove_host_attribute_from_events, :validate => :boolean, :default => true
85
+
69
86
  # If true, nested values will be flattened (which changes keys to delimiter-separated concatenation of all
70
87
  # nested keys).
71
88
  config :flatten_nested_values, :validate => :boolean, :default => false
@@ -241,12 +258,13 @@ class LogStash::Outputs::Scalyr < LogStash::Outputs::Base
241
258
  @server_attributes = new_attributes
242
259
  end
243
260
 
244
- # See if we should use the hostname as the server_attributes.serverHost
245
- if @use_hostname_for_serverhost
246
- if @server_attributes.nil?
261
+ # See if we should use the hostname as the server_attributes.serverHost (aka if fixed serverHost is not
262
+ # defined as part of server_attributes config option)
263
+ if @server_attributes.nil?
247
264
  @server_attributes = {}
248
- end
265
+ end
249
266
 
267
+ if @use_hostname_for_serverhost
250
268
  # only set serverHost if it doesn't currently exist in server_attributes
251
269
  # Note: Use strings rather than symbols for the key, because keys coming
252
270
  # from the config file will be strings
@@ -258,6 +276,15 @@ class LogStash::Outputs::Scalyr < LogStash::Outputs::Base
258
276
  # Add monitor server attribute to identify this as coming from a plugin
259
277
  @server_attributes['monitor'] = 'pluginLogstash'
260
278
 
279
+ # We create a fixed copy without serverHost here so we can reference this later if needed to avoid
280
+ # some of the copy + manipulate overhead per batch
281
+ @server_attributes_without_serverhost = @server_attributes.clone
282
+ if @server_attributes_without_serverhost.key? "serverHost"
283
+ @server_attributes_without_serverhost.delete "serverHost"
284
+ end
285
+
286
+ @session_server_host = @server_attributes["serverHost"]
287
+
261
288
  @scalyr_server << '/' unless @scalyr_server.end_with?('/')
262
289
 
263
290
  @add_events_uri = URI(@scalyr_server) + "addEvents"
@@ -290,7 +317,8 @@ class LogStash::Outputs::Scalyr < LogStash::Outputs::Base
290
317
  @http_connect_timeout, @http_socket_timeout, @http_request_timeout, @http_pool_max, @http_pool_max_per_route
291
318
  )
292
319
 
293
- @logger.info(sprintf("Started Scalyr output plugin (%s)." % [PLUGIN_VERSION]), :class => self.class.name)
320
+ @logger.info(sprintf("Started Scalyr LogStash output plugin %s (compression_type=%s,compression_level=%s,json_library=%s)." %
321
+ [PLUGIN_VERSION, @compression_type, @compression_type, @json_library]), :class => self.class.name)
294
322
 
295
323
  # Finally, send a status line to Scalyr
296
324
  # We use a special separate short lived client session for sending the initial client status.
@@ -505,7 +533,7 @@ class LogStash::Outputs::Scalyr < LogStash::Outputs::Base
505
533
  @dlq_writer.write(l_event, "#{exc_data[:message]}")
506
534
  }
507
535
  else
508
- @logger.warn("Deal letter queue not configured, dropping #{multi_event_request[:logstash_events].length} events after #{exc_retries} tries.", :sample_events => sample_events)
536
+ @logger.warn("Dead letter queue not configured, dropping #{multi_event_request[:logstash_events].length} events after #{exc_retries} tries.", :sample_events => sample_events)
509
537
  end
510
538
  end
511
539
 
@@ -549,6 +577,8 @@ class LogStash::Outputs::Scalyr < LogStash::Outputs::Base
549
577
  logs_ids = Hash.new
550
578
  next_log_id = 1
551
579
 
580
+ batch_has_event_level_server_host = false
581
+
552
582
  logstash_events.each {|l_event|
553
583
 
554
584
  record = l_event.to_hash
@@ -604,6 +634,11 @@ class LogStash::Outputs::Scalyr < LogStash::Outputs::Base
604
634
  # Rename user-specified logfile field -> 'logfile'
605
635
  rename.call(@logfile_field, 'logfile')
606
636
 
637
+ # Remove "host" attribute
638
+ if @remove_host_attribute_from_events and record.key? "host"
639
+ record.delete("host")
640
+ end
641
+
607
642
  # Set a default parser is none is present in the event
608
643
  if record['parser'].to_s.empty?
609
644
  record['parser'] = "logstashParser"
@@ -614,26 +649,31 @@ class LogStash::Outputs::Scalyr < LogStash::Outputs::Base
614
649
  record['logfile'] = "/logstash/#{serverHost}"
615
650
  end
616
651
 
617
- # Set a default if no serverHost value is present.
618
- if serverHost.nil?
619
- record['serverHost'] = "Logstash"
652
+ # Rename serverHost (if it exists) to __origServerHost so sources filtering works correctly
653
+ # It's important that this happens at the very end of the event processing in this function.
654
+ record_has_server_host_attribute = record.key? 'serverHost'
655
+ batch_has_event_level_server_host |= record_has_server_host_attribute
656
+
657
+ if record_has_server_host_attribute
658
+ record[EVENT_LEVEL_SERVER_HOST_ATTRIBUTE_NAME] = record['serverHost']
659
+ record.delete('serverHost')
620
660
  end
621
661
 
662
+ # To reduce duplication of common event-level attributes, we "fold" them into the top-level "logs" attribute
663
+ # and reference the log entry from inside the event
622
664
  log_identifier = nil
623
665
  add_log = false
624
666
  if serverHost
625
667
  log_identifier = serverHost + record['logfile']
626
668
  end
669
+
627
670
  if log_identifier and not logs.key? log_identifier
628
671
  add_log = true
629
672
  logs[log_identifier] = {
630
673
  'id' => next_log_id,
631
674
  'attrs' => Hash.new
632
675
  }
633
- if not record['serverHost'].to_s.empty?
634
- logs[log_identifier]['attrs']['serverHost'] = record['serverHost']
635
- record.delete('serverHost')
636
- end
676
+
637
677
  if not record['logfile'].to_s.empty?
638
678
  logs[log_identifier]['attrs']['logfile'] = record['logfile']
639
679
  record.delete('logfile')
@@ -646,10 +686,21 @@ class LogStash::Outputs::Scalyr < LogStash::Outputs::Base
646
686
  end
647
687
  }
648
688
  end
689
+
649
690
  logs_ids[log_identifier] = next_log_id
650
691
  next_log_id += 1
651
692
  end
652
693
 
694
+ # If we already contain "logs" entry for this record, we remove duplicated serverHost from
695
+ # the event attributes since it's already part of the log level attributes which are
696
+ # referenced by the event.
697
+ if log_identifier and logs.key? log_identifier
698
+ if not record[EVENT_LEVEL_SERVER_HOST_ATTRIBUTE_NAME].to_s.empty?
699
+ logs[log_identifier]['attrs'][EVENT_LEVEL_SERVER_HOST_ATTRIBUTE_NAME] = record[EVENT_LEVEL_SERVER_HOST_ATTRIBUTE_NAME]
700
+ record.delete(EVENT_LEVEL_SERVER_HOST_ATTRIBUTE_NAME)
701
+ end
702
+ end
703
+
653
704
  # Delete unwanted fields from record
654
705
  record.delete('@version')
655
706
  record.delete('@timestamp')
@@ -695,7 +746,7 @@ class LogStash::Outputs::Scalyr < LogStash::Outputs::Base
695
746
  :attrs => record
696
747
  }
697
748
 
698
- # optionally set thread
749
+ # optionally set thread and referenced log file
699
750
  if serverHost
700
751
  scalyr_event[:thread] = thread_id.to_s
701
752
  scalyr_event[:log] = logs_ids[log_identifier]
@@ -756,7 +807,8 @@ class LogStash::Outputs::Scalyr < LogStash::Outputs::Base
756
807
  append_event = false
757
808
  end
758
809
 
759
- multi_event_request = self.create_multi_event_request(scalyr_events, l_events, current_threads, logs)
810
+ Scalyr::Common::Util.set_session_level_serverhost_on_events(@session_server_host, scalyr_events, logs, batch_has_event_level_server_host)
811
+ multi_event_request = self.create_multi_event_request(scalyr_events, l_events, current_threads, logs, batch_has_event_level_server_host)
760
812
  multi_event_request_array << multi_event_request
761
813
 
762
814
  total_bytes = 0
@@ -765,6 +817,7 @@ class LogStash::Outputs::Scalyr < LogStash::Outputs::Base
765
817
  logs_ids = Hash.new
766
818
  scalyr_events = Array.new
767
819
  l_events = Array.new
820
+ batch_has_event_level_server_host = false
768
821
  end
769
822
  else
770
823
  # If size estimation is disabled we simply append the event and handle splitting later on (if needed)
@@ -784,14 +837,14 @@ class LogStash::Outputs::Scalyr < LogStash::Outputs::Base
784
837
 
785
838
  # create a final request with any left over events (and make sure there is at least one event)
786
839
  if scalyr_events.size >= 1
787
- multi_event_request = self.create_multi_event_request(scalyr_events, l_events, current_threads, logs)
840
+ Scalyr::Common::Util.set_session_level_serverhost_on_events(@session_server_host, scalyr_events, logs, batch_has_event_level_server_host)
841
+ multi_event_request = self.create_multi_event_request(scalyr_events, l_events, current_threads, logs, batch_has_event_level_server_host)
788
842
  multi_event_request_array << multi_event_request
789
843
  end
790
844
 
791
845
  multi_event_request_array
792
846
  end
793
847
 
794
-
795
848
  # Helper method that adds a client_timestamp to a batch addEvents request body
796
849
  def add_client_timestamp_to_body(body)
797
850
  current_time_millis = DateTime.now.strftime('%Q').to_i
@@ -804,7 +857,7 @@ class LogStash::Outputs::Scalyr < LogStash::Outputs::Base
804
857
  # A request comprises multiple Scalyr Events. This function creates a request hash for
805
858
  # final upload to Scalyr (from an array of events, and an optional hash of current threads)
806
859
  # Note: The request body field will be json-encoded.
807
- def create_multi_event_request(scalyr_events, logstash_events, current_threads, current_logs)
860
+ def create_multi_event_request(scalyr_events, logstash_events, current_threads, current_logs, batch_has_event_level_server_host = false)
808
861
 
809
862
  body = {
810
863
  :session => @session_id + Thread.current.object_id.to_s,
@@ -833,7 +886,13 @@ class LogStash::Outputs::Scalyr < LogStash::Outputs::Base
833
886
  end
834
887
 
835
888
  # add serverAttributes
836
- body[:sessionInfo] = @server_attributes if @server_attributes
889
+ # If serverHost is defined on any of the events, we don't send it via sessionInfo since
890
+ # sesionInfo has the higest priority and would always overwritte the event level one
891
+ if batch_has_event_level_server_host
892
+ body[:sessionInfo] = @server_attributes_without_serverhost if @server_attributes_without_serverhost
893
+ else
894
+ body[:sessionInfo] = @server_attributes if @server_attributes
895
+ end
837
896
 
838
897
  # We time serialization to get some insight on how long it takes to serialize the request body
839
898
  start_time = Time.now.to_f
@@ -926,8 +985,9 @@ class LogStash::Outputs::Scalyr < LogStash::Outputs::Base
926
985
  }
927
986
  @send_stats.synchronize do
928
987
  if !@last_status_transmit_time
929
- status_event[:attrs]['message'] = sprintf("Started Scalyr LogStash output plugin (%s)." % [PLUGIN_VERSION])
930
- status_event[:attrs]['serverHost'] = @node_hostname
988
+ status_event[:attrs]['message'] = sprintf("Started Scalyr LogStash output plugin %s (compression_type=%s,compression_level=%s,json_library=%s)." %
989
+ [PLUGIN_VERSION, @compression_type, @compression_type, @json_library])
990
+ status_event[:attrs][EVENT_LEVEL_SERVER_HOST_ATTRIBUTE_NAME] = @node_hostname
931
991
  else
932
992
  cur_time = Time.now()
933
993
  return if (cur_time.to_i - @last_status_transmit_time.to_i) < @status_report_interval
@@ -949,7 +1009,7 @@ class LogStash::Outputs::Scalyr < LogStash::Outputs::Base
949
1009
  cnt += 1
950
1010
  end
951
1011
  status_event[:attrs]['message'] = msg
952
- status_event[:attrs]['serverHost'] = @node_hostname
1012
+ status_event[:attrs][EVENT_LEVEL_SERVER_HOST_ATTRIBUTE_NAME] = @node_hostname
953
1013
  status_event[:attrs]['parser'] = @status_parser
954
1014
  end
955
1015
  multi_event_request = create_multi_event_request([status_event], nil, nil, nil)
@@ -1,4 +1,5 @@
1
1
  require "scalyr/constants"
2
+ require "logstash-core"
2
3
 
3
4
  module Scalyr; module Common; module Client
4
5
 
@@ -298,7 +299,7 @@ class ClientSession
298
299
  version = sprintf('output-logstash-scalyr %s' % [PLUGIN_VERSION])
299
300
  post_headers = {
300
301
  'Content-Type': 'application/json',
301
- 'User-Agent': version + ';' + RUBY_VERSION + ';' + RUBY_PLATFORM
302
+ 'User-Agent': version + ';' + RUBY_VERSION + ';' + RUBY_PLATFORM + ';' + LOGSTASH_VERSION
302
303
  }
303
304
 
304
305
  post_body = nil
@@ -124,5 +124,32 @@ def self.convert_bignums(obj)
124
124
  end
125
125
  end
126
126
 
127
- end; end; end;
128
127
 
128
+ # Function which sets the special serverHost attribute, on events which don't already define it,
129
+ # to the session level serverHost value
130
+ # NOTE: This method mutates scalyr_events in place.
131
+ def self.set_session_level_serverhost_on_events(session_server_host, scalyr_events, logs, batch_has_event_level_server_host = false)
132
+ # Maps log id (number) to logfile attributes for more efficient lookups later on
133
+ logs_ids_to_attrs = Hash.new
134
+
135
+ logs.each {|_, log|
136
+ logs_ids_to_attrs[log["id"]] = log["attrs"]
137
+ }
138
+
139
+ if batch_has_event_level_server_host
140
+ scalyr_events.each {|s_event|
141
+ log_id = s_event[:log]
142
+ logfile_attrs = logs_ids_to_attrs[log_id]
143
+
144
+ if logfile_attrs.nil?
145
+ logfile_attrs = Hash.new
146
+ end
147
+
148
+ if s_event[:attrs][EVENT_LEVEL_SERVER_HOST_ATTRIBUTE_NAME].nil? and logfile_attrs[EVENT_LEVEL_SERVER_HOST_ATTRIBUTE_NAME].nil?
149
+ s_event[:attrs][EVENT_LEVEL_SERVER_HOST_ATTRIBUTE_NAME] = session_server_host
150
+ end
151
+ }
152
+ end
153
+ end
154
+
155
+ end; end; end;
@@ -1,2 +1,5 @@
1
1
  # encoding: utf-8
2
- PLUGIN_VERSION = "v0.1.26.beta"
2
+ PLUGIN_VERSION = "v0.2.0.beta"
3
+
4
+ # Special event level attribute name which can be used for setting event level serverHost attribute
5
+ EVENT_LEVEL_SERVER_HOST_ATTRIBUTE_NAME = '__origServerHost'
@@ -1,6 +1,6 @@
1
1
  Gem::Specification.new do |s|
2
2
  s.name = 'logstash-output-scalyr'
3
- s.version = '0.1.26.beta'
3
+ s.version = '0.2.0.beta'
4
4
  s.licenses = ['Apache-2.0']
5
5
  s.summary = "Scalyr output plugin for Logstash"
6
6
  s.description = "Sends log data collected by Logstash to Scalyr (https://www.scalyr.com)"
@@ -2,15 +2,12 @@ require 'benchmark'
2
2
  require 'quantile'
3
3
 
4
4
  require_relative '../../lib/scalyr/common/util'
5
+ require_relative './util'
5
6
 
6
7
  # Micro benchmark which measures how long it takes to find all the Bignums in a record and convert them to strings
7
8
 
8
9
  ITERATIONS = 500
9
10
 
10
- def rand_str(len)
11
- return (0...len).map { (65 + rand(26)).chr }.join
12
- end
13
-
14
11
  def rand_bignum()
15
12
  return 200004000020304050300 + rand(999999)
16
13
  end
@@ -43,7 +40,7 @@ def run_benchmark_and_print_results(data, run_benchmark_func)
43
40
 
44
41
  result = []
45
42
  ITERATIONS.times do |i|
46
- result << Benchmark.measure { run_benchmark_func.(data[0]) }
43
+ result << Benchmark.measure { run_benchmark_func.(data[i]) }
47
44
  end
48
45
 
49
46
  sum = result.inject(nil) { |sum, t| sum.nil? ? sum = t : sum += t }
@@ -17,7 +17,7 @@ def run_benchmark_and_print_results(data, run_benchmark_func)
17
17
 
18
18
  result = []
19
19
  ITERATIONS.times do |i|
20
- result << Benchmark.measure { run_benchmark_func.(data[0]) }
20
+ result << Benchmark.measure { run_benchmark_func.(data[i]) }
21
21
  end
22
22
 
23
23
  sum = result.inject(nil) { |sum, t| sum.nil? ? sum = t : sum += t }
@@ -31,7 +31,7 @@ def run_benchmark_and_print_results(data, run_benchmark_func)
31
31
 
32
32
  result = []
33
33
  ITERATIONS.times do |i|
34
- result << Benchmark.measure { run_benchmark_func.(data[0]) }
34
+ result << Benchmark.measure { run_benchmark_func.(data[i]) }
35
35
  end
36
36
 
37
37
  sum = result.inject(nil) { |sum, t| sum.nil? ? sum = t : sum += t }
@@ -0,0 +1,109 @@
1
+ require 'benchmark'
2
+ require 'quantile'
3
+
4
+ require_relative '../../lib/scalyr/constants'
5
+ require_relative '../../lib/scalyr/common/util'
6
+ require_relative './util'
7
+
8
+ # Micro benchmark which measures how long "set_session_level_serverhost_on_events" takes
9
+
10
+ ITERATIONS = 100
11
+
12
+ def run_benchmark_and_print_results(data, run_benchmark_func)
13
+ puts ""
14
+ puts "Using %s total events in a batch" % [data[0].size]
15
+ puts ""
16
+
17
+ result = []
18
+ ITERATIONS.times do |i|
19
+ result << Benchmark.measure { run_benchmark_func.(data[i]) }
20
+ end
21
+
22
+ sum = result.inject(nil) { |sum, t| sum.nil? ? sum = t : sum += t }
23
+ avg = sum / result.size
24
+
25
+ Benchmark.bm(7, "sum:", "avg:") do |b|
26
+ [sum, avg]
27
+ end
28
+ puts ""
29
+ end
30
+
31
+ # Generate random events with only a single event having the special serverHost attribute set, which
32
+ # represents a worst case scenario since we need to backfill the rest of the events.
33
+ def generate_events(count)
34
+ result = []
35
+
36
+ ITERATIONS.times do |iteration|
37
+ events = []
38
+
39
+ count.times do |index|
40
+ event = generate_hash([2])
41
+ event[:attrs] = Hash.new
42
+ event[:log] = 1
43
+
44
+ if index == count - 1
45
+ event[:attrs][EVENT_LEVEL_SERVER_HOST_ATTRIBUTE_NAME] = format("test-host-%s", index)
46
+ end
47
+
48
+ events << event
49
+ end
50
+
51
+ raise "Assertion failed" unless events.size == count
52
+
53
+ result << events
54
+ end
55
+
56
+ raise "Assertion failed" unless result.size == ITERATIONS
57
+ result
58
+ end
59
+
60
+ def run_func(events)
61
+ # NOTE: This function manipulates events in place
62
+ events.each_with_index do |event, index|
63
+ if index < events.size - 1
64
+ # Last event will have __origServerHost set, but others won't
65
+ raise "Assertion failed" unless event[:attrs][EVENT_LEVEL_SERVER_HOST_ATTRIBUTE_NAME].nil?
66
+ end
67
+ end
68
+
69
+ Scalyr::Common::Util.set_session_level_serverhost_on_events("session-server-host-dummy", events, {}, true)
70
+
71
+ events.each do |event|
72
+ raise "Assertion failed" unless event[:attrs][EVENT_LEVEL_SERVER_HOST_ATTRIBUTE_NAME].nil? == false
73
+ end
74
+ end
75
+
76
+
77
+ puts "Using %s iterations" % [ITERATIONS]
78
+ puts ""
79
+
80
+ @value = Quantile::Estimator.new
81
+
82
+ puts "Util.set_session_level_serverhost_on_events()"
83
+ puts "==============================="
84
+
85
+ # 100 events in a batch
86
+ data = generate_events(100)
87
+ run_benchmark_and_print_results(data, method(:run_func))
88
+
89
+
90
+ # 500 events in a batch
91
+ data = generate_events(500)
92
+ run_benchmark_and_print_results(data, method(:run_func))
93
+
94
+ # 1000 events in a batch
95
+ data = generate_events(1000)
96
+ run_benchmark_and_print_results(data, method(:run_func))
97
+
98
+ # 2000 events in a batch
99
+ data = generate_events(2000)
100
+ run_benchmark_and_print_results(data, method(:run_func))
101
+
102
+ # 3000 events in a batch
103
+ data = generate_events(3000)
104
+ run_benchmark_and_print_results(data, method(:run_func))
105
+
106
+ # 5000 events in a batch
107
+ data = generate_events(5000)
108
+ puts data.size
109
+ run_benchmark_and_print_results(data, method(:run_func))
@@ -42,7 +42,7 @@ describe LogStash::Outputs::Scalyr do
42
42
  :batch_num=>1,
43
43
  :code=>401,
44
44
  :message=>"error/client/badParam",
45
- :payload_size=>781,
45
+ :payload_size=>737,
46
46
  :record_count=>3,
47
47
  :total_batches=>1,
48
48
  :url=>"https://agent.scalyr.com/addEvents",
@@ -65,7 +65,7 @@ describe LogStash::Outputs::Scalyr do
65
65
  :error_class=>"Manticore::UnknownException",
66
66
  :batch_num=>1,
67
67
  :message=>"java.lang.RuntimeException: Unexpected error: java.security.InvalidAlgorithmParameterException: the trustAnchors parameter must be non-empty",
68
- :payload_size=>781,
68
+ :payload_size=>737,
69
69
  :record_count=>3,
70
70
  :total_batches=>1,
71
71
  :url=>"https://agent.scalyr.com/addEvents",
@@ -91,7 +91,7 @@ describe LogStash::Outputs::Scalyr do
91
91
  :error_class=>"Manticore::UnknownException",
92
92
  :batch_num=>1,
93
93
  :message=>"java.lang.RuntimeException: Unexpected error: java.security.InvalidAlgorithmParameterException: the trustAnchors parameter must be non-empty",
94
- :payload_size=>781,
94
+ :payload_size=>737,
95
95
  :record_count=>3,
96
96
  :total_batches=>1,
97
97
  :url=>"https://agent.scalyr.com/addEvents",
@@ -130,7 +130,7 @@ describe LogStash::Outputs::Scalyr do
130
130
  :error_class=>"Manticore::UnknownException",
131
131
  :batch_num=>1,
132
132
  :message=>"Host name 'invalid.mitm.should.fail.test.agent.scalyr.com' does not match the certificate subject provided by the peer (CN=*.scalyr.com)",
133
- :payload_size=>781,
133
+ :payload_size=>737,
134
134
  :record_count=>3,
135
135
  :total_batches=>1,
136
136
  :url=>"https://invalid.mitm.should.fail.test.agent.scalyr.com/addEvents",
@@ -175,7 +175,7 @@ describe LogStash::Outputs::Scalyr do
175
175
  :batch_num=>1,
176
176
  :code=>503,
177
177
  :message=>"Invalid JSON response from server",
178
- :payload_size=>781,
178
+ :payload_size=>737,
179
179
  :record_count=>3,
180
180
  :total_batches=>1,
181
181
  :url=>"https://agent.scalyr.com/addEvents",
@@ -203,7 +203,7 @@ describe LogStash::Outputs::Scalyr do
203
203
  :batch_num=>1,
204
204
  :code=>500,
205
205
  :message=>"Invalid JSON response from server",
206
- :payload_size=>781,
206
+ :payload_size=>737,
207
207
  :record_count=>3,
208
208
  :total_batches=>1,
209
209
  :url=>"https://agent.scalyr.com/addEvents",
@@ -231,7 +231,7 @@ describe LogStash::Outputs::Scalyr do
231
231
  :batch_num=>1,
232
232
  :code=>500,
233
233
  :message=>"Invalid JSON response from server",
234
- :payload_size=>781,
234
+ :payload_size=>737,
235
235
  :record_count=>3,
236
236
  :total_batches=>1,
237
237
  :url=>"https://agent.scalyr.com/addEvents",
@@ -6,6 +6,7 @@ require "logstash/event"
6
6
  require "json"
7
7
  require "quantile"
8
8
 
9
+ NODE_HOSTNAME = Socket.gethostname
9
10
 
10
11
  class MockClientSession
11
12
  DEFAULT_STATS = {
@@ -103,7 +104,7 @@ describe LogStash::Outputs::Scalyr do
103
104
  plugin.instance_variable_set(:@client_session, mock_client_session)
104
105
  plugin.instance_variable_set(:@session_id, "some_session_id")
105
106
  status_event = plugin.send_status
106
- expect(status_event[:attrs]["message"]).to eq("Started Scalyr LogStash output plugin (%s)." % [PLUGIN_VERSION])
107
+ expect(status_event[:attrs]["message"]).to eq("Started Scalyr LogStash output plugin %s (compression_type=deflate,compression_level=deflate,json_library=stdlib)." % [PLUGIN_VERSION])
107
108
 
108
109
  # 2. Second send
109
110
  plugin.instance_variable_set(:@last_status_transmit_time, 100)
@@ -210,7 +211,7 @@ describe LogStash::Outputs::Scalyr do
210
211
  body = JSON.parse(result[0][:body])
211
212
  expect(body['events'].size).to eq(3)
212
213
  logattrs2 = body['logs'][2]['attrs']
213
- expect(logattrs2.fetch('serverHost', nil)).to eq('my host 3')
214
+ expect(logattrs2.fetch(EVENT_LEVEL_SERVER_HOST_ATTRIBUTE_NAME, nil)).to eq('my host 3')
214
215
  expect(logattrs2.fetch('logfile', nil)).to eq('/logstash/my host 3')
215
216
  expect(logattrs2.fetch('tags', nil)).to eq(['t1', 't2', 't3'])
216
217
  end
@@ -240,7 +241,7 @@ describe LogStash::Outputs::Scalyr do
240
241
  body = JSON.parse(result[0][:body])
241
242
  expect(body['events'].size).to eq(3)
242
243
  logattrs2 = body['logs'][2]['attrs']
243
- expect(logattrs2.fetch('serverHost', nil)).to eq('my host 3')
244
+ expect(logattrs2.fetch(EVENT_LEVEL_SERVER_HOST_ATTRIBUTE_NAME, nil)).to eq('my host 3')
244
245
  expect(logattrs2.fetch('logfile', nil)).to eq('/logstash/my host 3')
245
246
  end
246
247
  end
@@ -258,7 +259,7 @@ describe LogStash::Outputs::Scalyr do
258
259
  body = JSON.parse(result[0][:body])
259
260
  expect(body['events'].size).to eq(3)
260
261
  logattrs2 = body['logs'][2]['attrs']
261
- expect(logattrs2.fetch('serverHost', nil)).to eq('my host 3')
262
+ expect(logattrs2.fetch(EVENT_LEVEL_SERVER_HOST_ATTRIBUTE_NAME, nil)).to eq('my host 3')
262
263
  expect(logattrs2.fetch('logfile', nil)).to eq('my file 3')
263
264
  end
264
265
  end
@@ -287,7 +288,6 @@ describe LogStash::Outputs::Scalyr do
287
288
  'seq' => 3,
288
289
  'source_file' => 'my file 3',
289
290
  'source_host' => 'my host 3',
290
- 'serverHost' => 'Logstash',
291
291
  "tag_prefix_t1" => "true",
292
292
  "tag_prefix_t2" => "true",
293
293
  "tag_prefix_t3" => "true",
@@ -321,7 +321,6 @@ describe LogStash::Outputs::Scalyr do
321
321
  'seq' => 3,
322
322
  'source_file' => 'my file 3',
323
323
  'source_host' => 'my host 3',
324
- 'serverHost' => 'Logstash',
325
324
  "tag_prefix_t1" => "true",
326
325
  "tag_prefix_t2" => "true",
327
326
  "tag_prefix_t3" => "true",
@@ -353,7 +352,6 @@ describe LogStash::Outputs::Scalyr do
353
352
  'seq' => 3,
354
353
  'source_file' => 'my file 3',
355
354
  'source_host' => 'my host 3',
356
- 'serverHost' => 'Logstash',
357
355
  "tag_prefix_t1" => "true",
358
356
  "tag_prefix_t2" => "true",
359
357
  "tag_prefix_t3" => "true",
@@ -385,7 +383,6 @@ describe LogStash::Outputs::Scalyr do
385
383
  'seq' => 3,
386
384
  'source_file' => 'my file 3',
387
385
  'source_host' => 'my host 3',
388
- 'serverHost' => 'Logstash',
389
386
  "tag_prefix_t1" => "true",
390
387
  "tag_prefix_t2" => "true",
391
388
  "tag_prefix_t3" => "true",
@@ -428,7 +425,6 @@ describe LogStash::Outputs::Scalyr do
428
425
  'seq' => 3,
429
426
  'source_file' => 'my file 3',
430
427
  'source_host' => 'my host 3',
431
- 'serverHost' => 'Logstash',
432
428
  "tag_prefix_t1" => "true",
433
429
  "tag_prefix_t2" => "true",
434
430
  "tag_prefix_t3" => "true",
@@ -468,7 +464,6 @@ describe LogStash::Outputs::Scalyr do
468
464
  'seq' => 3,
469
465
  'source_file' => 'my file 3',
470
466
  'source_host' => 'my host 3',
471
- 'serverHost' => 'Logstash',
472
467
  "tag_prefix_t1" => "true",
473
468
  "tag_prefix_t2" => "true",
474
469
  "tag_prefix_t3" => "true",
@@ -503,7 +498,6 @@ describe LogStash::Outputs::Scalyr do
503
498
  'seq' => 3,
504
499
  'source_file' => 'my file 3',
505
500
  'source_host' => 'my host 3',
506
- 'serverHost' => 'Logstash',
507
501
  "tag_prefix_t1" => "true",
508
502
  "tag_prefix_t2" => "true",
509
503
  "tag_prefix_t3" => "true",
@@ -528,7 +522,6 @@ describe LogStash::Outputs::Scalyr do
528
522
  'seq' => 3,
529
523
  'source_file' => 'my file 3',
530
524
  'source_host' => 'my host 3',
531
- 'serverHost' => 'Logstash',
532
525
  "tags" => ["t1", "t2", "t3"],
533
526
  "parser" => "logstashParser",
534
527
  })
@@ -554,19 +547,278 @@ describe LogStash::Outputs::Scalyr do
554
547
  'seq' => 3,
555
548
  'source_file' => 'my file 3',
556
549
  'source_host' => 'my host 3',
557
- 'serverHost' => 'Logstash',
558
550
  "tags" => ["t1", "t2", "t3"],
559
551
  "parser" => "logstashParser",
560
552
  })
561
553
  expect(plugin.instance_variable_get(:@logger)).to have_received(:warn).with("Error while flattening record",
562
554
  {
563
555
  :error_message=>"Resulting flattened object will contain more keys than the configured flattening_max_key_count of 3",
564
- :sample_keys=>["serverHost", "parser", "tags_2", "tags_1"]
556
+ :sample_keys=>["parser", "tags_2", "tags_1", "tags_0"]
565
557
  }
566
558
  ).exactly(3).times
567
559
  end
568
560
  end
569
561
 
562
+ context "serverHost attribute handling" do
563
+ it "no serverHost defined in server_attributes, no serverHost defined on event level - should use node hostname as the default session level value" do
564
+ config = {
565
+ 'api_write_token' => '1234',
566
+ }
567
+ plugin = LogStash::Outputs::Scalyr.new(config)
568
+
569
+ allow(plugin).to receive(:send_status).and_return(nil)
570
+ plugin.register
571
+ e = LogStash::Event.new
572
+ result = plugin.build_multi_event_request_array([e])
573
+ body = JSON.parse(result[0][:body])
574
+ expect(body['sessionInfo']['serverHost']).to eq(NODE_HOSTNAME)
575
+
576
+ expect(body['logs']).to eq([])
577
+ expect(body['events'].size).to eq(1)
578
+ expect(body['events'][0]['attrs']["serverHost"]).to eq(nil)
579
+ end
580
+
581
+ it "serverHost defined in server_attributes, nothing defined on event level - server_attributes value should be used" do
582
+ config = {
583
+ 'api_write_token' => '1234',
584
+ 'server_attributes' => {'serverHost' => 'fooHost'}
585
+ }
586
+ plugin = LogStash::Outputs::Scalyr.new(config)
587
+
588
+ allow(plugin).to receive(:send_status).and_return(nil)
589
+ plugin.register
590
+ e = LogStash::Event.new
591
+ result = plugin.build_multi_event_request_array([e])
592
+ body = JSON.parse(result[0][:body])
593
+ expect(body['sessionInfo']['serverHost']).to eq('fooHost')
594
+ expect(body['events'].size).to eq(1)
595
+ expect(body['events'][0]['attrs']["serverHost"]).to eq(nil)
596
+ end
597
+
598
+ # sessionInfo serverHost always has precedence, which means it's important that we don't include it if an event level attribute is set, otherwise
599
+ # the session level one would simply always overwrite the event level one, which would effectively be ignored
600
+ it "serverHost defined in server_attributes (explicitly defined), event level serverHost defined - event level value should be used" do
601
+ config = {
602
+ 'api_write_token' => '1234',
603
+ 'server_attributes' => {'serverHost' => 'fooHost', 'attr1' => 'val1'}
604
+ }
605
+ plugin = LogStash::Outputs::Scalyr.new(config)
606
+
607
+ allow(plugin).to receive(:send_status).and_return(nil)
608
+ plugin.register
609
+ expect(plugin.server_attributes['serverHost']).to eq('fooHost')
610
+
611
+ e1 = LogStash::Event.new
612
+ e1.set('a1', 'v1')
613
+ e1.set('serverHost', 'event-host-1')
614
+
615
+ e2 = LogStash::Event.new
616
+ e2.set('a2', 'v2')
617
+ e2.set('serverHost', 'event-host-2')
618
+
619
+ e3 = LogStash::Event.new
620
+ e3.set('a3', 'v3')
621
+ e3.set('serverHost', 'event-host-2')
622
+
623
+ e4 = LogStash::Event.new
624
+ e4.set('a4', 'v4')
625
+ e4.set('serverHost', 'event-host-1')
626
+
627
+ result = plugin.build_multi_event_request_array([e1, e2, e3, e4])
628
+ body = JSON.parse(result[0][:body])
629
+ expect(body['sessionInfo']['serverHost']).to eq(nil)
630
+ expect(body['sessionInfo']['attr1']).to eq('val1')
631
+
632
+ expect(body['logs'].size).to eq(2)
633
+ expect(body['logs'][0]['id']).to eq(1)
634
+ expect(body['logs'][0]['attrs'][EVENT_LEVEL_SERVER_HOST_ATTRIBUTE_NAME]).to eq('event-host-1')
635
+ expect(body['logs'][1]['id']).to eq(2)
636
+ expect(body['logs'][1]['attrs'][EVENT_LEVEL_SERVER_HOST_ATTRIBUTE_NAME]).to eq('event-host-2')
637
+
638
+ expect(body['events'].size).to eq(4)
639
+ expect(body['events'][0]['log']).to eq(1)
640
+ expect(body['events'][0]['attrs']["serverHost"]).to eq(nil)
641
+ expect(body['events'][0]['attrs'][EVENT_LEVEL_SERVER_HOST_ATTRIBUTE_NAME]).to eq(nil)
642
+
643
+ expect(body['events'][1]['log']).to eq(2)
644
+ expect(body['events'][1]['attrs']["serverHost"]).to eq(nil)
645
+ expect(body['events'][1]['attrs'][EVENT_LEVEL_SERVER_HOST_ATTRIBUTE_NAME]).to eq(nil)
646
+
647
+ expect(body['events'][2]['log']).to eq(2)
648
+ expect(body['events'][2]['attrs']["serverHost"]).to eq(nil)
649
+ expect(body['events'][2]['attrs'][EVENT_LEVEL_SERVER_HOST_ATTRIBUTE_NAME]).to eq(nil)
650
+
651
+ expect(body['events'][3]['log']).to eq(1)
652
+ expect(body['events'][3]['attrs']["serverHost"]).to eq(nil)
653
+ expect(body['events'][3]['attrs'][EVENT_LEVEL_SERVER_HOST_ATTRIBUTE_NAME]).to eq(nil)
654
+ end
655
+
656
+ it "serverHost defined in server_attributes (defined via node hostname), event level serverHost defined - event level value should be used" do
657
+ config = {
658
+ 'api_write_token' => '1234',
659
+ 'server_attributes' => {'attr1' => 'val1'}
660
+ }
661
+ plugin = LogStash::Outputs::Scalyr.new(config)
662
+
663
+ allow(plugin).to receive(:send_status).and_return(nil)
664
+ plugin.register
665
+
666
+ expect(plugin.server_attributes['serverHost']).to eq(NODE_HOSTNAME)
667
+
668
+ e1 = LogStash::Event.new
669
+ e1.set('a1', 'v1')
670
+ e1.set('serverHost', 'event-host-1')
671
+
672
+ e2 = LogStash::Event.new
673
+ e2.set('a2', 'v2')
674
+ e2.set('serverHost', 'event-host-2')
675
+
676
+ e3 = LogStash::Event.new
677
+ e3.set('a3', 'v3')
678
+ e3.set('serverHost', 'event-host-2')
679
+
680
+ e4 = LogStash::Event.new
681
+ e4.set('a4', 'v4')
682
+ e4.set('serverHost', 'event-host-1')
683
+
684
+ result = plugin.build_multi_event_request_array([e1, e2, e3, e4])
685
+ body = JSON.parse(result[0][:body])
686
+ expect(body['sessionInfo']['serverHost']).to eq(nil)
687
+ expect(body['sessionInfo']['attr1']).to eq('val1')
688
+
689
+ expect(body['logs'].size).to eq(2)
690
+ expect(body['logs'][0]['id']).to eq(1)
691
+ expect(body['logs'][0]['attrs'][EVENT_LEVEL_SERVER_HOST_ATTRIBUTE_NAME]).to eq('event-host-1')
692
+ expect(body['logs'][1]['id']).to eq(2)
693
+ expect(body['logs'][1]['attrs'][EVENT_LEVEL_SERVER_HOST_ATTRIBUTE_NAME]).to eq('event-host-2')
694
+
695
+ expect(body['events'].size).to eq(4)
696
+ expect(body['events'][0]['log']).to eq(1)
697
+ expect(body['events'][0]['attrs']["serverHost"]).to eq(nil)
698
+ expect(body['events'][0]['attrs'][EVENT_LEVEL_SERVER_HOST_ATTRIBUTE_NAME]).to eq(nil)
699
+
700
+ expect(body['events'][1]['log']).to eq(2)
701
+ expect(body['events'][1]['attrs']["serverHost"]).to eq(nil)
702
+ expect(body['events'][1]['attrs'][EVENT_LEVEL_SERVER_HOST_ATTRIBUTE_NAME]).to eq(nil)
703
+
704
+ expect(body['events'][2]['log']).to eq(2)
705
+ expect(body['events'][2]['attrs']["serverHost"]).to eq(nil)
706
+ expect(body['events'][2]['attrs'][EVENT_LEVEL_SERVER_HOST_ATTRIBUTE_NAME]).to eq(nil)
707
+
708
+ expect(body['events'][3]['log']).to eq(1)
709
+ expect(body['events'][3]['attrs']["serverHost"]).to eq(nil)
710
+ expect(body['events'][3]['attrs'][EVENT_LEVEL_SERVER_HOST_ATTRIBUTE_NAME]).to eq(nil)
711
+ end
712
+
713
+ # If set_session_level_serverhost_on_events config option is true, we set session level serverHost on events which don't
714
+ # explicitly define this special attribute.
715
+ it "serverHost defined in server_attributes (explicitly defined), event level serverHost defined - event level value should be used and server level one for events without server host" do
716
+ config = {
717
+ 'api_write_token' => '1234',
718
+ 'server_attributes' => {'serverHost' => 'top-level-session-host', 'attr1' => 'val1'}
719
+ }
720
+ plugin = LogStash::Outputs::Scalyr.new(config)
721
+
722
+ allow(plugin).to receive(:send_status).and_return(nil)
723
+ plugin.register
724
+ expect(plugin.server_attributes['serverHost']).to eq('top-level-session-host')
725
+
726
+ e1 = LogStash::Event.new
727
+ e1.set('a1', 'v1')
728
+ e1.set('serverHost', 'event-host-1')
729
+
730
+ e2 = LogStash::Event.new
731
+ e2.set('a2', 'v2')
732
+
733
+ e3 = LogStash::Event.new
734
+ e3.set('a3', 'v3')
735
+
736
+ e4 = LogStash::Event.new
737
+ e4.set('a4', 'v4')
738
+ e4.set('serverHost', 'event-host-1')
739
+
740
+ result = plugin.build_multi_event_request_array([e1, e2, e3, e4])
741
+ body = JSON.parse(result[0][:body])
742
+ expect(body['sessionInfo']['serverHost']).to eq(nil)
743
+ expect(body['sessionInfo']['attr1']).to eq('val1')
744
+
745
+ expect(body['logs'].size).to eq(1)
746
+ expect(body['logs'][0]['id']).to eq(1)
747
+ expect(body['logs'][0]['attrs'][EVENT_LEVEL_SERVER_HOST_ATTRIBUTE_NAME]).to eq('event-host-1')
748
+
749
+ expect(body['events'].size).to eq(4)
750
+ expect(body['events'][0]['log']).to eq(1)
751
+ expect(body['events'][0]['attrs']["serverHost"]).to eq(nil)
752
+ expect(body['events'][0]['attrs'][EVENT_LEVEL_SERVER_HOST_ATTRIBUTE_NAME]).to eq(nil)
753
+
754
+ expect(body['events'][1]['log']).to eq(nil)
755
+ expect(body['events'][1]['attrs']["serverHost"]).to eq(nil)
756
+ expect(body['events'][1]['attrs'][EVENT_LEVEL_SERVER_HOST_ATTRIBUTE_NAME]).to eq("top-level-session-host")
757
+
758
+ expect(body['events'][2]['log']).to eq(nil)
759
+ expect(body['events'][2]['attrs']["serverHost"]).to eq(nil)
760
+ expect(body['events'][2]['attrs'][EVENT_LEVEL_SERVER_HOST_ATTRIBUTE_NAME]).to eq("top-level-session-host")
761
+
762
+ expect(body['events'][3]['log']).to eq(1)
763
+ expect(body['events'][3]['attrs']["serverHost"]).to eq(nil)
764
+ expect(body['events'][3]['attrs'][EVENT_LEVEL_SERVER_HOST_ATTRIBUTE_NAME]).to eq(nil)
765
+ end
766
+
767
+ it "no serverHost defined, event level serverHost defined - event level value should be used" do
768
+ config = {
769
+ 'api_write_token' => '1234',
770
+ 'server_attributes' => {'attr1' => 'val1'},
771
+ 'use_hostname_for_serverhost' => false
772
+ }
773
+ plugin = LogStash::Outputs::Scalyr.new(config)
774
+
775
+ allow(plugin).to receive(:send_status).and_return(nil)
776
+ plugin.register
777
+
778
+ expect(plugin.server_attributes['serverHost']).to eq(nil)
779
+
780
+ e1 = LogStash::Event.new
781
+ e1.set('a1', 'v1')
782
+ e1.set('serverHost', 'event-host-1')
783
+
784
+ e2 = LogStash::Event.new
785
+ e2.set('a2', 'v2')
786
+ e2.set('serverHost', 'event-host-2')
787
+
788
+ e3 = LogStash::Event.new
789
+ e3.set('a3', 'v3')
790
+ e3.set('serverHost', 'event-host-2')
791
+
792
+ e4 = LogStash::Event.new
793
+ e4.set('a4', 'v4')
794
+ e4.set('serverHost', 'event-host-2')
795
+
796
+ result = plugin.build_multi_event_request_array([e1, e2, e3, e4])
797
+ body = JSON.parse(result[0][:body])
798
+ expect(body['sessionInfo']['serverHost']).to eq(nil)
799
+ expect(body['sessionInfo']['attr1']).to eq('val1')
800
+
801
+ expect(body['logs'][0]['id']).to eq(1)
802
+ expect(body['logs'][0]['attrs'][EVENT_LEVEL_SERVER_HOST_ATTRIBUTE_NAME]).to eq('event-host-1')
803
+ expect(body['logs'][1]['id']).to eq(2)
804
+ expect(body['logs'][1]['attrs'][EVENT_LEVEL_SERVER_HOST_ATTRIBUTE_NAME]).to eq('event-host-2')
805
+
806
+ expect(body['events'].size).to eq(4)
807
+ expect(body['events'][0]['log']).to eq(1)
808
+ expect(body['events'][0]['attrs']["serverHost"]).to eq(nil)
809
+ expect(body['events'][0]['attrs'][EVENT_LEVEL_SERVER_HOST_ATTRIBUTE_NAME]).to eq(nil)
810
+ expect(body['events'][1]['log']).to eq(2)
811
+ expect(body['events'][1]['attrs']["serverHost"]).to eq(nil)
812
+ expect(body['events'][1]['attrs'][EVENT_LEVEL_SERVER_HOST_ATTRIBUTE_NAME]).to eq(nil)
813
+ expect(body['events'][2]['log']).to eq(2)
814
+ expect(body['events'][2]['attrs']["serverHost"]).to eq(nil)
815
+ expect(body['events'][2]['attrs'][EVENT_LEVEL_SERVER_HOST_ATTRIBUTE_NAME]).to eq(nil)
816
+ expect(body['events'][3]['log']).to eq(2)
817
+ expect(body['events'][3]['attrs']["serverHost"]).to eq(nil)
818
+ expect(body['events'][3]['attrs'][EVENT_LEVEL_SERVER_HOST_ATTRIBUTE_NAME]).to eq(nil)
819
+ end
820
+ end
821
+
570
822
  context "when receiving an event with Bignums" do
571
823
  config = {
572
824
  'api_write_token' => '1234',
@@ -585,6 +837,59 @@ describe LogStash::Outputs::Scalyr do
585
837
  end
586
838
  end
587
839
 
840
+ context "host attribute handling" do
841
+ it "host attribute removed by default" do
842
+ config = {
843
+ 'api_write_token' => '1234',
844
+ }
845
+ plugin = LogStash::Outputs::Scalyr.new(config)
846
+
847
+ allow(plugin).to receive(:send_status).and_return(nil)
848
+ plugin.register
849
+
850
+ expect(plugin.server_attributes['serverHost']).to eq(NODE_HOSTNAME)
851
+
852
+ e1 = LogStash::Event.new
853
+ e1.set('a1', 'v1')
854
+ e1.set('host', 'event-host-1')
855
+
856
+ result = plugin.build_multi_event_request_array([e1])
857
+ body = JSON.parse(result[0][:body])
858
+ expect(body['sessionInfo']['serverHost']).to eq(NODE_HOSTNAME)
859
+
860
+ expect(body['logs'].size).to eq(0)
861
+
862
+ expect(body['events'].size).to eq(1)
863
+ expect(body['events'][0]['attrs']["host"]).to eq(nil)
864
+ end
865
+
866
+ it "host attribute not removed if config option set" do
867
+ config = {
868
+ 'api_write_token' => '1234',
869
+ 'remove_host_attribute_from_events' => false,
870
+ }
871
+ plugin = LogStash::Outputs::Scalyr.new(config)
872
+
873
+ allow(plugin).to receive(:send_status).and_return(nil)
874
+ plugin.register
875
+
876
+ expect(plugin.server_attributes['serverHost']).to eq(NODE_HOSTNAME)
877
+
878
+ e1 = LogStash::Event.new
879
+ e1.set('a1', 'v1')
880
+ e1.set('host', 'event-host-1')
881
+
882
+ result = plugin.build_multi_event_request_array([e1])
883
+ body = JSON.parse(result[0][:body])
884
+ expect(body['sessionInfo']['serverHost']).to eq(NODE_HOSTNAME)
885
+
886
+ expect(body['logs'].size).to eq(0)
887
+
888
+ expect(body['events'].size).to eq(1)
889
+ expect(body['events'][0]['attrs']["host"]).to eq("event-host-1")
890
+ end
891
+ end
892
+
588
893
  context "when using custom json library" do
589
894
  it "stdlib (implicit)" do
590
895
  config = {
@@ -598,7 +903,7 @@ describe LogStash::Outputs::Scalyr do
598
903
  e.set('bignumber', 20)
599
904
  result = plugin.build_multi_event_request_array([e])
600
905
  body = JSON.parse(result[0][:body])
601
- expect(result[0][:body]).to include('{"monitor":"pluginLogstash"}')
906
+ expect(result[0][:body]).to include(sprintf('{"serverHost":"%s","monitor":"pluginLogstash"}', NODE_HOSTNAME))
602
907
  expect(body['events'].size).to eq(1)
603
908
  end
604
909
 
@@ -615,7 +920,7 @@ describe LogStash::Outputs::Scalyr do
615
920
  e.set('bignumber', 20)
616
921
  result = plugin.build_multi_event_request_array([e])
617
922
  body = JSON.parse(result[0][:body])
618
- expect(result[0][:body]).to include('{"monitor":"pluginLogstash"}')
923
+ expect(result[0][:body]).to include(sprintf('{"serverHost":"%s","monitor":"pluginLogstash"}', NODE_HOSTNAME))
619
924
  expect(body['events'].size).to eq(1)
620
925
  end
621
926
 
@@ -632,7 +937,7 @@ describe LogStash::Outputs::Scalyr do
632
937
  e.set('bignumber', 20)
633
938
  result = plugin.build_multi_event_request_array([e])
634
939
  body = JSON.parse(result[0][:body])
635
- expect(result[0][:body]).to include('{"monitor":"pluginLogstash"}')
940
+ expect(result[0][:body]).to include(sprintf('{"serverHost":"%s","monitor":"pluginLogstash"}', NODE_HOSTNAME))
636
941
  expect(body['events'].size).to eq(1)
637
942
  end
638
943
  end
metadata CHANGED
@@ -1,14 +1,14 @@
1
1
  --- !ruby/object:Gem::Specification
2
2
  name: logstash-output-scalyr
3
3
  version: !ruby/object:Gem::Version
4
- version: 0.1.26.beta
4
+ version: 0.2.0.beta
5
5
  platform: ruby
6
6
  authors:
7
7
  - Edward Chee
8
8
  autorequire:
9
9
  bindir: bin
10
10
  cert_chain: []
11
- date: 2021-08-30 00:00:00.000000000 Z
11
+ date: 2021-09-09 00:00:00.000000000 Z
12
12
  dependencies:
13
13
  - !ruby/object:Gem::Dependency
14
14
  requirement: !ruby/object:Gem::Requirement
@@ -164,6 +164,7 @@ files:
164
164
  - spec/benchmarks/flattening_and_serialization.rb
165
165
  - spec/benchmarks/json_serialization.rb
166
166
  - spec/benchmarks/metrics_overhead.rb
167
+ - spec/benchmarks/set_session_level_serverhost_on_events.rb
167
168
  - spec/benchmarks/util.rb
168
169
  - spec/logstash/outputs/scalyr_integration_spec.rb
169
170
  - spec/logstash/outputs/scalyr_spec.rb
@@ -4097,6 +4098,7 @@ test_files:
4097
4098
  - spec/benchmarks/flattening_and_serialization.rb
4098
4099
  - spec/benchmarks/json_serialization.rb
4099
4100
  - spec/benchmarks/metrics_overhead.rb
4101
+ - spec/benchmarks/set_session_level_serverhost_on_events.rb
4100
4102
  - spec/benchmarks/util.rb
4101
4103
  - spec/logstash/outputs/scalyr_integration_spec.rb
4102
4104
  - spec/logstash/outputs/scalyr_spec.rb