logstash-codec-protobuf 1.2.0 → 1.2.1

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
- SHA1:
- metadata.gz: 4812c43d851064eacffc5c6fba4719eed4691702
- data.tar.gz: 88436b13daa031ffb714a0f3b1c9611e4130133a
+ SHA256:
+ metadata.gz: 48d78608a993ce87e16cefbf92995c27ea0f95b3a9af82f030d3bd7fc22a1e6d
+ data.tar.gz: 5f1e2aab280322aab0878eae12ee51f98f30257025ce8d50b0bac39a4edd793e
  SHA512:
- metadata.gz: b9e6b75105a06e434fdf6f968e80842ac874bbeb7cf1bd72b7339ce3434e43dd4cdc20e992add4b24e811302f2d4366ee8548426b8813042649ea73617cd14ca
- data.tar.gz: 53d72f5bdb6b5eac6fb88b4d092dba366c820043a709a0d433cf0ef30e7a52f468a48e46cd8a9e728a2fb9c2775c31fee305fb11e5e4f1d7ac0f7d491389a4dc
+ metadata.gz: 932aaff952bf4982d5192701eb3a3997416d76dc750516d53e48f08689f16f93bef0c734a2813f9d71f36ac8a3566fc8916af0c3e3b33b58714b33f40848b1f0
+ data.tar.gz: 76562195f5f05b10dc34695447ace2bd74081e1fdee281c4f5b28e04722f518f6fff0fac789b462c7a7be2877c03236922063d7bb86d16676c853bf1506cac26
data/CHANGELOG.md CHANGED
@@ -1,3 +1,10 @@
+ ## 1.2.1
+ - Keep original data in case of parsing errors
+
+ ## 1.2.0
+ - Autoload all referenced protobuf classes
+ - Fix concurrency issue when using multiple pipelines
+
  ## 1.1.0
  - Add support for protobuf3

data/README.md CHANGED
@@ -4,58 +4,62 @@ This is a codec plugin for [Logstash](https://github.com/elastic/logstash) to pa

  # Prerequisites and Installation

- * prepare your ruby versions of the protobuf definitions
- ** For protobuf 2 use the [ruby-protoc compiler](https://github.com/codekitchen/ruby-protocol-buffers).
- ** For protobuf 3 use the [official google protobuf compiler](https://developers.google.com/protocol-buffers/docs/reference/ruby-generated).
+ * prepare your Ruby versions of the Protobuf definitions:
+   * For protobuf 2 use the [ruby-protoc compiler](https://github.com/codekitchen/ruby-protocol-buffers).
+   * For protobuf 3 use the [official google protobuf compiler](https://developers.google.com/protocol-buffers/docs/reference/ruby-generated).
  * install the codec: `bin/logstash-plugin install logstash-codec-protobuf`
- * use the codec in your logstash config file. See details below.
+ * use the codec in your Logstash config file. See details below.

  ## Configuration

- include_path (required): an array of strings with filenames where logstash can find your protobuf definitions. Please provide absolute paths. For directories it will only try to import files ending on .rb
+ `include_path` (required): an array of strings with filenames where Logstash can find your protobuf definitions. Requires absolute paths. Please note that protobuf v2 files end in `.pb.rb` whereas files compiled for protobuf v3 end in `_pb.rb`.

- class_name (required): the name of the protobuf class that is to be decoded or encoded. For protobuf 2 separate the modules with ::. For protobuf 3 use single dots. See examples below.
+ `class_name` (required): the name of the protobuf class that is to be decoded or encoded. For protobuf 2 separate the modules with `::`. For protobuf 3 use single dots.

- protobuf_version (optional): set this to 3 if you want to use protobuf 3 definitions. Defaults to 2.
+ `protobuf_version` (optional): set this to 3 if you want to use protobuf 3 definitions. Defaults to 2.

  ## Usage example: decoder

  Use this as a codec in any Logstash input. Just provide the name of the class that your incoming objects will be encoded in, and specify the path to the compiled definition.
  Here's an example for a kafka input with protobuf 2:

- kafka
- {
-   zk_connect => "127.0.0.1"
-   topic_id => "unicorns_protobuffed"
-   key_deserializer_class => "org.apache.kafka.common.serialization.ByteArrayDeserializer"
-   value_deserializer_class => "org.apache.kafka.common.serialization.ByteArrayDeserializer"
-
-   codec => protobuf
-   {
-     class_name => "Animals::Unicorn"
-     include_path => ['/path/to/pb_definitions/Animal.pb.rb', '/path/to/pb_definitions/Unicorn.pb.rb']
-   }
- }
+ ```ruby
+ kafka
+ {
+   topic_id => "..."
+   key_deserializer_class => "org.apache.kafka.common.serialization.ByteArrayDeserializer"
+   value_deserializer_class => "org.apache.kafka.common.serialization.ByteArrayDeserializer"
+
+   codec => protobuf
+   {
+     class_name => "Animals::Mammals::Unicorn"
+     include_path => ['/path/to/pb_definitions/Animal.pb.rb', '/path/to/pb_definitions/Unicorn.pb.rb']
+   }
+ }
+ ```

  Example for protobuf 3:

- kafka
- {
-   zk_connect => "127.0.0.1"
-   topic_id => "unicorns_protobuffed"
-   key_deserializer_class => "org.apache.kafka.common.serialization.ByteArrayDeserializer"
-   value_deserializer_class => "org.apache.kafka.common.serialization.ByteArrayDeserializer"
-   codec => protobuf
-   {
-     class_name => "Animals.Unicorn"
-     include_path => ['/path/to/pb_definitions/Animal_pb.rb', '/path/to/pb_definitions/Unicorn_pb.rb']
-     protobuf_version => 3
-   }
- }
+ ```ruby
+ kafka
+ {
+   topic_id => "..."
+   key_deserializer_class => "org.apache.kafka.common.serialization.ByteArrayDeserializer"
+   value_deserializer_class => "org.apache.kafka.common.serialization.ByteArrayDeserializer"
+   codec => protobuf
+   {
+     class_name => "Animals.Mammals.Unicorn"
+     include_path => ['/path/to/pb_definitions/Animal_pb.rb', '/path/to/pb_definitions/Unicorn_pb.rb']
+     protobuf_version => 3
+   }
+ }
+ ```

  For version 3 class names check the bottom of the generated protobuf ruby file. It contains lines like this:

- Animals.Unicorn = Google::Protobuf::DescriptorPool.generated_pool.lookup("Animals.Unicorn").msgclass
+ ```ruby
+ Animals.Unicorn = Google::Protobuf::DescriptorPool.generated_pool.lookup("Animals.Unicorn").msgclass
+ ```

  Use the parameter for the lookup call as the class_name for the codec config.

@@ -65,32 +69,39 @@ If you're using a kafka input please also set the deserializer classes as shown

  Imagine you have the following protobuf version 2 relationship: class Unicorn lives in namespace Animal::Mammal and uses another class Wings.

- module Animal
-   module Horse
-     class Unicorn
-       set_fully_qualified_name "Animal.Horse.Unicorn"
-       optional ::Animal::Bodypart::Wings, :wings, 1
-       optional :string, :name, 2
-       # here be more field definitions
+ ```ruby
+ module Animal
+   module Mammal
+     class Unicorn
+       set_fully_qualified_name "Animal.Mammal.Unicorn"
+       optional ::Bodypart::Wings, :wings, 1
+       optional :string, :name, 2
+       ...
+ ```

  Make sure to put the referenced wings class first in the include_path:

- include_path => ['/path/to/pb_definitions/wings.pb.rb','/path/to/pb_definitions/unicorn.pb.rb']
+ ```ruby
+ include_path => ['/path/to/pb_definitions/wings.pb.rb','/path/to/pb_definitions/unicorn.pb.rb']
+ ```

  Set the class name to the parent class:
-
- class_name => "Animal::Horse::Unicorn"

- for protobuf 2. For protobuf 3 use
+ ```ruby
+ class_name => "Animal::Mammal::Unicorn"
+ ```

- class_name => "Animal.Horse.Unicorn"
+ for protobuf 2. For protobuf 3 use

+ ```ruby
+ class_name => "Animal.Mammal.Unicorn"
+ ```

  ## Usage example: encoder

  The configuration of the codec for encoding logstash events for a protobuf output is pretty much the same as for the decoder input usage as demonstrated above. There are some constraints though that you need to be aware of:
- * the protobuf definition needs to contain all the fields that logstash typically adds to an event, in the corrent data type. Examples for this are @timestamp (string), @version (string), host, path, all of which depend on your input sources and filters aswell. If you do not want to add those fields to your protobuf definition then please use a [modify filter](https://www.elastic.co/guide/en/logstash/current/plugins-filters-mutate.html) to [remove](https://www.elastic.co/guide/en/logstash/current/plugins-filters-mutate.html#plugins-filters-mutate-remove_field) the undesired fields.
- * object members starting with @ are somewhat problematic in protobuf definitions. Therefore those fields will automatically be renamed to remove the at character. This also effects the important @timestamp field. Please name it just "timestamp" in your definition.
+ * the protobuf definition needs to contain all the fields that Logstash typically adds to an event, in the correct data type. Examples for this are `@timestamp` (string), `@version` (string), `host`, `path`, all of which depend on your input sources and filters as well. If you do not want to add those fields to your protobuf definition then please use a [mutate filter](https://www.elastic.co/guide/en/logstash/current/plugins-filters-mutate.html) to [remove](https://www.elastic.co/guide/en/logstash/current/plugins-filters-mutate.html#plugins-filters-mutate-remove_field) the undesired fields.
+ * object members starting with `@` are somewhat problematic in protobuf definitions. Therefore those fields will automatically be renamed to remove the at character. This also affects the important `@timestamp` field. Please name it just "timestamp" in your definition.


  ## Troubleshooting
@@ -98,11 +109,11 @@ The configuration of the codec for encoding logstash events for a protobuf outpu
  ### Protobuf 2
  #### "uninitialized constant SOME_CLASS_NAME"

- If you include more than one definition class, consider the order of inclusion. This is especially relevant if you include whole directories. A definition might refer to another definition that is not loaded yet. In this case, please specify the files in the include_path variable in reverse order of reference. See 'Example with referenced definitions' above.
+ If you include more than one definition class, consider the order of inclusion. This is especially relevant if you include whole directories. A definition might refer to another definition that is not loaded yet. In this case, please specify the files in the `include_path` variable in reverse order of reference. See 'Example with referenced definitions' above.

  #### no protobuf output

- Maybe your protobuf definition does not fullfill the requirements and needs additional fields. Run logstash with the --debug flag and search for error messages.
+ Maybe your protobuf definition does not fulfill the requirements and needs additional fields. Run Logstash with the `--debug` flag and search for error messages.

  ### Protobuf 3

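A practical companion to the encoder constraints above: strip every event field that has no counterpart in your protobuf definition before the codec encodes. A minimal filter sketch, assuming the fields `@version`, `host` and `path` (hypothetical examples here) are absent from your definition:

```ruby
filter {
  mutate {
    # hypothetical field list: remove whatever your protobuf class does not define
    remove_field => ["@version", "host", "path"]
  }
}
```

The codec strips the leading `@` from field names on its own, so the definition only needs a string field named `timestamp` to receive `@timestamp`.
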
data/docs/index.asciidoc CHANGED
@@ -20,26 +20,51 @@ include::{include_path}/plugin_header.asciidoc[]

  ==== Description

- This codec converts protobuf encoded messages into logstash events and vice versa.
+ This codec converts protobuf encoded messages into Logstash events and vice versa. It supports protobuf versions 2 and 3.

- Requires the protobuf definitions as ruby files. You can create those using the [ruby-protoc compiler](https://github.com/codekitchen/ruby-protocol-buffers).
+ The plugin requires the protobuf definitions to be compiled to Ruby files. +
+ For protobuf 2 use the https://github.com/codekitchen/ruby-protocol-buffers[ruby-protoc compiler]. +
+ For protobuf 3 use the https://developers.google.com/protocol-buffers/docs/reference/ruby-generated[official google protobuf compiler].

- The following shows a usage example for decoding events from a kafka stream:
+ The following shows a usage example (protobuf v2) for decoding events from a kafka stream:
  [source,ruby]
  kafka
  {
-   zk_connect => "127.0.0.1"
-   topic_id => "your_topic_goes_here"
+   topic_id => "..."
    key_deserializer_class => "org.apache.kafka.common.serialization.ByteArrayDeserializer"
    value_deserializer_class => "org.apache.kafka.common.serialization.ByteArrayDeserializer"
    codec => protobuf
    {
-     class_name => "Animal::Unicorn"
+     class_name => "Animals::Mammals::Unicorn"
      include_path => ['/path/to/protobuf/definitions/UnicornProtobuf.pb.rb']
    }
  }

- Specifically for the kafka input: please set the deserializer classes as shown above.
+ Usage example for protobuf v3:
+ [source,ruby]
+ kafka
+ {
+   topic_id => "..."
+   key_deserializer_class => "org.apache.kafka.common.serialization.ByteArrayDeserializer"
+   value_deserializer_class => "org.apache.kafka.common.serialization.ByteArrayDeserializer"
+   codec => protobuf
+   {
+     class_name => "Animals.Mammals.Unicorn"
+     include_path => ['/path/to/pb_definitions/Animal_pb.rb', '/path/to/pb_definitions/Unicorn_pb.rb']
+     protobuf_version => 3
+   }
+ }
+
+
+ The codec can be used in input and output plugins. +
+ When using the codec in the kafka input plugin please set the deserializer classes as shown above. +
+ When using the codec in an output plugin:
+
+ * make sure to include all the desired fields in the protobuf definition, including timestamp.
+ Remove fields that are not part of the protobuf definition from the event by using the mutate filter.
+ * the `@` symbol is currently not supported in field names when loading the protobuf definitions for encoding. Make sure to call the timestamp field `timestamp`
+ instead of `@timestamp` in the protobuf file. Logstash event fields will be stripped of the leading `@` before conversion.
+

  [id="plugins-{type}s-{plugin}-options"]
  ==== Protobuf Codec Configuration Options
@@ -49,6 +74,7 @@ Specifically for the kafka input: please set the deserializer classes as shown a
  |Setting |Input type|Required
  | <<plugins-{type}s-{plugin}-class_name>> |<<string,string>>|Yes
  | <<plugins-{type}s-{plugin}-include_path>> |<<array,array>>|Yes
+ | <<plugins-{type}s-{plugin}-protobuf_version>> |<<number,number>>|No
  |=======================================================================

  &nbsp;
@@ -60,24 +86,21 @@ Specifically for the kafka input: please set the deserializer classes as shown a
  * Value type is <<string,string>>
  * There is no default value for this setting.

- Name of the class to decode.
- If your protobuf definition contains modules, prepend them to the class name with double colons like so:
+ Fully qualified name of the class to decode.
+ Please note that the module delimiter is different depending on the protobuf version. For protobuf v2, use double colons:
  [source,ruby]
- class_name => "Foods::Dairy::Cheese"
+ class_name => "Animals::Mammals::Unicorn"

- This corresponds to a protobuf definition starting as follows:
+ For protobuf v3, use single dots:
  [source,ruby]
- module Foods
-   module Dairy
-     class Cheese
-       # here are your field definitions.
+ class_name => "Animals.Mammals.Unicorn"

- If your class references other definitions: you only have to add the main class here.
- For version 3 class names check the bottom of the generated protobuf ruby file. It contains lines like this:
+ For protobuf v3, you can copy the class name from the DescriptorPool registrations at the bottom of the generated protobuf ruby file. It contains lines like this:
+ [source,ruby]
+ Animals.Mammals.Unicorn = Google::Protobuf::DescriptorPool.generated_pool.lookup("Animals.Mammals.Unicorn").msgclass

- Animals.Unicorn = Google::Protobuf::DescriptorPool.generated_pool.lookup("Animals.Unicorn").msgclass

- Use the parameter for the lookup call as the class_name for the codec config.
+ If your class references other definitions: you only have to add the name of the main class here.

  [id="plugins-{type}s-{plugin}-include_path"]
  ===== `include_path`
@@ -90,25 +113,29 @@ List of absolute paths to files with protobuf definitions.
  When using more than one file, make sure to arrange the files in reverse order of dependency so that each class is loaded before it is
  referred to by another.

- Example: a class _Cheese_ referencing another protobuf class _Milk_
+ Example: a class _Unicorn_ referencing another protobuf class _Wings_
  [source,ruby]
- module Foods
-   module Dairy
-     class Cheese
-       set_fully_qualified_name "Foods.Dairy.Cheese"
-       optional ::Foods::Cheese::Milk, :milk, 1
-       optional :int64, :unique_id, 2
-       # here be more field definitions
+ module Animal
+   module Mammal
+     class Unicorn
+       set_fully_qualified_name "Animal.Mammal.Unicorn"
+       optional ::Bodypart::Wings, :wings, 1
+       optional :string, :name, 2
+       ...

  would be configured as
  [source,ruby]
- include_path => ['/path/to/protobuf/definitions/Milk.pb.rb','/path/to/protobuf/definitions/Cheese.pb.rb']
+ include_path => ['/path/to/pb_definitions/wings.pb.rb','/path/to/pb_definitions/unicorn.pb.rb']
+
+ Please note that protobuf v2 files end in `.pb.rb` whereas files compiled for protobuf v3 end in `_pb.rb`.
+
+ [id="plugins-{type}s-{plugin}-protobuf_version"]
+ ===== `protobuf_version`
+
+ * Value type is <<number,number>>
+ * Default value is 2
+
+ Protocol buffers version. Valid settings are 2, 3.

- When using the codec in an output plugin:
- * make sure to include all the desired fields in the protobuf definition, including timestamp.
- Remove fields that are not part of the protobuf definition from the event by using the mutate filter.
- * the @ symbol is currently not supported in field names when loading the protobuf definitions for encoding. Make sure to call the timestamp field "timestamp"
- instead of "@timestamp" in the protobuf file. Logstash event fields will be stripped of the leading @ before conversion.
-


@@ -214,6 +214,8 @@ class LogStash::Codecs::Protobuf < LogStash::Codecs::Base
    @logger.warn("Couldn't decode protobuf: #{e.inspect}.")
    if stop_on_error
      raise e
+   else # keep original message so that the user can debug it.
+     yield LogStash::Event.new("message" => data, "tags" => ["_protobufdecodefailure"])
    end
  end # def decode

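This hunk implements the 1.2.1 changelog entry "Keep original data in case of parsing errors": when `stop_on_error` is not set, an undecodable payload now yields an event carrying the raw data in `message`, tagged `_protobufdecodefailure`, instead of being silently dropped. A minimal sketch of routing those events to a dead-letter file (the output path is hypothetical):

```ruby
output {
  if "_protobufdecodefailure" in [tags] {
    # hypothetical dead-letter destination for payloads the codec could not decode
    file { path => "/var/log/logstash/protobuf_failures.log" }
  }
}
```
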
@@ -1,7 +1,7 @@
  Gem::Specification.new do |s|

    s.name = 'logstash-codec-protobuf'
-   s.version = '1.2.0'
+   s.version = '1.2.1'
    s.licenses = ['Apache License (2.0)']
    s.summary = "Reads protobuf messages and converts to Logstash Events"
    s.description = "This gem is a logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/plugin install gemname. This gem is not a stand-alone program"
@@ -261,6 +261,7 @@ describe LogStash::Codecs::Protobuf do

  context "#test6_pb3" do

+
    let(:execution_context) { double("execution_context")}
    let(:pipeline_id) {rand(36**8).to_s(36)}

@@ -372,4 +373,36 @@ describe LogStash::Codecs::Protobuf do

  end # context #encodePB3

+
+
+ context "#test7_pb3" do
+
+   #### Test case 7: decode test case for github issue 17 ####
+   let(:plugin_7) { LogStash::Codecs::Protobuf.new("class_name" => "RepeatedEvents", "include_path" => [pb_include_path + '/pb3/events_pb.rb'], "protobuf_version" => 3) }
+   before do
+     plugin_7.register
+   end
+
+   it "should return an event from protobuf encoded data with repeated top level objects" do
+     event_class = Google::Protobuf::DescriptorPool.generated_pool.lookup("RepeatedEvent").msgclass # TODO this shouldn't be necessary because the classes are already
+     # specified at the end of the _pb.rb files
+     events_class = Google::Protobuf::DescriptorPool.generated_pool.lookup("RepeatedEvents").msgclass
+     test_a = event_class.new({:id => "1", :msg => "a"})
+     test_b = event_class.new({:id => "2", :msg => "b"})
+     test_c = event_class.new({:id => "3", :msg => "c"})
+     event_obj = events_class.new({:repeated_events => [test_a, test_b, test_c]})
+     bin = events_class.encode(event_obj)
+     plugin_7.decode(bin) do |event|
+       expect(event.get("repeated_events").size).to eq(3)
+       expect(event.get("repeated_events")[0]["id"]).to eq("1")
+       expect(event.get("repeated_events")[2]["id"]).to eq("3")
+       expect(event.get("repeated_events")[0]["msg"]).to eq("a")
+       expect(event.get("repeated_events")[2]["msg"]).to eq("c")
+     end
+   end # it
+
+
+ end # context test7_pb3
+
+
  end # describe
@@ -0,0 +1,10 @@
+ syntax = "proto3";
+
+ message Event {
+   string id = 1;
+   string msg = 2;
+ }
+
+ message Events {
+   repeated Event events = 1;
+ }
@@ -0,0 +1,17 @@
+ # Generated by the protocol buffer compiler. DO NOT EDIT!
+ # source: events.proto3
+
+ require 'google/protobuf'
+
+ Google::Protobuf::DescriptorPool.generated_pool.build do
+   add_message "RepeatedEvent" do
+     optional :id, :string, 1
+     optional :msg, :string, 2
+   end
+   add_message "RepeatedEvents" do
+     repeated :repeated_events, :message, 1, "RepeatedEvent"
+   end
+ end
+
+ Event = Google::Protobuf::DescriptorPool.generated_pool.lookup("RepeatedEvent").msgclass
+ Events = Google::Protobuf::DescriptorPool.generated_pool.lookup("RepeatedEvents").msgclass
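For orientation, a round trip with the helper classes registered above (`Event` and `Events` map to the `RepeatedEvent` and `RepeatedEvents` messages); a minimal sketch, not part of the gem:

```ruby
require_relative 'events_pb'  # the generated file shown above

# build the nested payload that the new spec exercises
payload = Events.new(repeated_events: [Event.new(id: "1", msg: "a"),
                                       Event.new(id: "2", msg: "b")])
blob = Events.encode(payload)                    # serialized protobuf bytes
Events.decode(blob).repeated_events.map(&:id)    # => ["1", "2"]
```
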
metadata CHANGED
@@ -1,7 +1,7 @@
  --- !ruby/object:Gem::Specification
  name: logstash-codec-protobuf
  version: !ruby/object:Gem::Version
-   version: 1.2.0
+   version: 1.2.1
  platform: ruby
  authors:
  - Inga Feick
@@ -103,6 +103,8 @@ files:
  - spec/helpers/pb2/unicorn_event.pb.rb
  - spec/helpers/pb3/ProbeResult_pb.rb
  - spec/helpers/pb3/dnsmessage_pb.rb
+ - spec/helpers/pb3/events.proto3
+ - spec/helpers/pb3/events_pb.rb
  - spec/helpers/pb3/header/header.proto3
  - spec/helpers/pb3/header/header_pb.rb
  - spec/helpers/pb3/integertest_pb.rb
@@ -134,7 +136,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
      version: '0'
  requirements: []
  rubyforge_project:
- rubygems_version: 2.6.14.1
+ rubygems_version: 2.7.6
  signing_key:
  specification_version: 4
  summary: Reads protobuf messages and converts to Logstash Events
@@ -152,6 +154,8 @@ test_files:
  - spec/helpers/pb2/unicorn_event.pb.rb
  - spec/helpers/pb3/ProbeResult_pb.rb
  - spec/helpers/pb3/dnsmessage_pb.rb
+ - spec/helpers/pb3/events.proto3
+ - spec/helpers/pb3/events_pb.rb
  - spec/helpers/pb3/header/header.proto3
  - spec/helpers/pb3/header/header_pb.rb
  - spec/helpers/pb3/integertest_pb.rb