logstash-codec-protobuf 1.1.0 → 1.2.0

This diff shows the changes between two publicly released package versions, as they appear in their respective public registries. It is provided for informational purposes only.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
- SHA256:
-   metadata.gz: c7759d64f37dcbb075892f198b46795d1b35964b66341e70107d13a9065de70b
-   data.tar.gz: db8bfbab2b9cdd7f3dbdbdaa8c0f8b12ab46b7ae4ef024cbc0b35706f3dc08e5
+ SHA1:
+   metadata.gz: 4812c43d851064eacffc5c6fba4719eed4691702
+   data.tar.gz: 88436b13daa031ffb714a0f3b1c9611e4130133a
  SHA512:
-   metadata.gz: 72147c0788aed161306b74f8537259b04c615313c85d803e51eca96abdfd0bfdee2fb0f415fbd2eb56e66f08e69a46e8e43403175966ec9178dcfa0133e90e4c
-   data.tar.gz: 1872037c0dff0b05cfaba1d9ac6fe663b12d8130cdab8931d5a3e8b4fc03af9b48720120835bf5acd941a1c0814276a797f9f7a1081ee0d5e4bdfdc63d444fab
+   metadata.gz: b9e6b75105a06e434fdf6f968e80842ac874bbeb7cf1bd72b7339ce3434e43dd4cdc20e992add4b24e811302f2d4366ee8548426b8813042649ea73617cd14ca
+   data.tar.gz: 53d72f5bdb6b5eac6fb88b4d092dba366c820043a709a0d433cf0ef30e7a52f468a48e46cd8a9e728a2fb9c2775c31fee305fb11e5e4f1d7ac0f7d491389a4dc
data/README.md CHANGED
@@ -27,6 +27,9 @@ Here's an example for a kafka input with protobuf 2:
  {
    zk_connect => "127.0.0.1"
    topic_id => "unicorns_protobuffed"
+   key_deserializer_class => "org.apache.kafka.common.serialization.ByteArrayDeserializer"
+   value_deserializer_class => "org.apache.kafka.common.serialization.ByteArrayDeserializer"
+
    codec => protobuf
    {
      class_name => "Animals::Unicorn"
@@ -40,13 +43,23 @@ Example for protobuf 3:
  {
    zk_connect => "127.0.0.1"
    topic_id => "unicorns_protobuffed"
+   key_deserializer_class => "org.apache.kafka.common.serialization.ByteArrayDeserializer"
+   value_deserializer_class => "org.apache.kafka.common.serialization.ByteArrayDeserializer"
    codec => protobuf
    {
-     class_name => "Animals.Unicorn"
+     class_name => "Animals.Unicorn"
      include_path => ['/path/to/pb_definitions/Animal_pb.rb', '/path/to/pb_definitions/Unicorn_pb.rb']
      protobuf_version => 3
    }
- }
+ }
+
+ For version 3 class names check the bottom of the generated protobuf ruby file. It contains lines like this:
+
+ Animals.Unicorn = Google::Protobuf::DescriptorPool.generated_pool.lookup("Animals.Unicorn").msgclass
+
+ Use the parameter for the lookup call as the class_name for the codec config.
+
+ If you're using a kafka input please also set the deserializer classes as shown above.
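To make the new class_name guidance concrete, here is a hedged sketch (the Animals.Unicorn message and the file names are the README's running example, not actual package content). A protoc-generated Ruby file ends with a msgclass lookup, and the string passed to lookup() is exactly what belongs in class_name:

    # bottom of the generated Unicorn_pb.rb (protobuf 3):
    module Animals
      Unicorn = Google::Protobuf::DescriptorPool.generated_pool.lookup("Animals.Unicorn").msgclass
    end

    # matching codec configuration:
    codec => protobuf
    {
      class_name => "Animals.Unicorn"   # the lookup() argument, dots included
      include_path => ['/path/to/pb_definitions/Animal_pb.rb', '/path/to/pb_definitions/Unicorn_pb.rb']
      protobuf_version => 3
    }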
 
  ### Class loading order
 
data/docs/index.asciidoc CHANGED
@@ -30,6 +30,8 @@ kafka
  {
    zk_connect => "127.0.0.1"
    topic_id => "your_topic_goes_here"
+   key_deserializer_class => "org.apache.kafka.common.serialization.ByteArrayDeserializer"
+   value_deserializer_class => "org.apache.kafka.common.serialization.ByteArrayDeserializer"
    codec => protobuf
    {
      class_name => "Animal::Unicorn"
@@ -37,6 +39,7 @@ kafka
    }
  }
 
+ Specifically for the kafka input: please set the deserializer classes as shown above.
 
  [id="plugins-{type}s-{plugin}-options"]
  ==== Protobuf Codec Configuration Options
@@ -70,6 +73,11 @@ module Foods
  # here are your field definitions.
 
  If your class references other definitions: you only have to add the main class here.
+ For version 3 class names check the bottom of the generated protobuf ruby file. It contains lines like this:
+
+ Animals.Unicorn = Google::Protobuf::DescriptorPool.generated_pool.lookup("Animals.Unicorn").msgclass
+
+ Use the parameter for the lookup call as the class_name for the codec config.
 
  [id="plugins-{type}s-{plugin}-include_path"]
  ===== `include_path`

data/lib/logstash/codecs/protobuf.rb CHANGED
@@ -4,17 +4,36 @@ require 'logstash/util/charset'
  require 'google/protobuf' # for protobuf3
  require 'protocol_buffers' # https://github.com/codekitchen/ruby-protocol-buffers, for protobuf2
 
- # This codec converts protobuf encoded messages into logstash events and vice versa.
+ # Monkey-patch the `Google::Protobuf::DescriptorPool` with a mutex for exclusive
+ # access.
+ #
+ # The DescriptorPool instance is not thread-safe when loading protobuf
+ # definitions. This can cause unrecoverable errors when registering multiple
+ # concurrent pipelines that try to register the same dependency. The
+ # DescriptorPool instance is global to the JVM and shared among all pipelines.
+ class << Google::Protobuf::DescriptorPool
+   def with_lock
+     if !@mutex
+       @mutex = Mutex.new
+     end
+
+     return @mutex
+   end
+ end
+
+ # This codec converts protobuf encoded messages into logstash events and vice versa.
  #
  # Requires the protobuf definitions as ruby files. You can create those using the [ruby-protoc compiler](https://github.com/codekitchen/ruby-protocol-buffers).
- #
+ #
  # The following shows a usage example for decoding protobuf 2 encoded events from a kafka stream:
  # [source,ruby]
- # kafka
+ # kafka
  # {
  #   zk_connect => "127.0.0.1"
  #   topic_id => "your_topic_goes_here"
- #   codec => protobuf
+ #   key_deserializer_class => "org.apache.kafka.common.serialization.ByteArrayDeserializer"
+ #   value_deserializer_class => "org.apache.kafka.common.serialization.ByteArrayDeserializer"
+ #   codec => protobuf
  #   {
  #     class_name => "Animal::Unicorn"
  #     include_path => ['/path/to/protobuf/definitions/UnicornProtobuf.pb.rb']
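The with_lock patch above only lazily creates and returns the mutex; callers are expected to wrap every pool mutation in synchronize, which is what the reworked register method later in this diff does. A minimal sketch of the intended pattern (the path is hypothetical):

    # serialize all loads into the JVM-global descriptor pool:
    Google::Protobuf::DescriptorPool.with_lock.synchronize do
      require '/path/to/pb_definitions/Unicorn_pb.rb'  # registers Animals.Unicorn and its dependencies
    end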
@@ -23,11 +42,13 @@ require 'protocol_buffers' # https://github.com/codekitchen/ruby-protocol-buffer
  #
  # Same example for protobuf 3:
  # [source,ruby]
- # kafka
+ # kafka
  # {
  #   zk_connect => "127.0.0.1"
  #   topic_id => "your_topic_goes_here"
- #   codec => protobuf
+ #   key_deserializer_class => "org.apache.kafka.common.serialization.ByteArrayDeserializer"
+ #   value_deserializer_class => "org.apache.kafka.common.serialization.ByteArrayDeserializer"
+ #   codec => protobuf
  #   {
  #     class_name => "Animal.Unicorn"
  #     include_path => ['/path/to/protobuf/definitions/UnicornProtobuf_pb.rb']
@@ -35,6 +56,7 @@ require 'protocol_buffers' # https://github.com/codekitchen/ruby-protocol-buffer
  #   }
  # }
  #
+ # Specifically for the kafka input: please set the deserializer classes as shown above.
 
  class LogStash::Codecs::Protobuf < LogStash::Codecs::Base
    config_name 'protobuf'
@@ -43,7 +65,7 @@ class LogStash::Codecs::Protobuf < LogStash::Codecs::Base
  # If your protobuf 2 definition contains modules, prepend them to the class name with double colons like so:
  # [source,ruby]
  #     class_name => "Animal::Horse::Unicorn"
- #
+ #
  # This corresponds to a protobuf definition starting as follows:
  # [source,ruby]
  #     module Animal
@@ -54,14 +76,44 @@ class LogStash::Codecs::Protobuf < LogStash::Codecs::Base
  # For protobuf 3 separate the modules with single dots.
  # [source,ruby]
  #     class_name => "Animal.Horse.Unicorn"
- #
+ # Check the bottom of the generated protobuf ruby file. It contains lines like this:
+ # [source,ruby]
+ # Animals.Unicorn = Google::Protobuf::DescriptorPool.generated_pool.lookup("Animals.Unicorn").msgclass
+ # Use the parameter for the lookup call as the class_name for the codec config.
+ #
  # If your class references other definitions: you only have to add the main class here.
  config :class_name, :validate => :string, :required => true
 
- # List of absolute pathes to files with protobuf definitions.
- # When using more than one file, make sure to arrange the files in reverse order of dependency so that each class is loaded before it is
+ # Relative path to the ruby file that contains class_name
+ #
+ # Relative path (from `protobuf_root_directory`) that holds the definition of the class specified in
+ # `class_name`.
+ #
+ # `class_file` and `include_path` cannot be used at the same time.
+ config :class_file, :validate => :string, :default => '', :required => false
+
+ # Absolute path to the directory that contains all compiled protobuf files.
+ #
+ # Absolute path to the root directory that contains all referenced/used dependencies
+ # of the main class (`class_name`) or any of its dependencies.
+ #
+ # For instance:
+ #
+ # pb3
+ # ├── header
+ # │   └── header_pb.rb
+ # ├── messageA_pb.rb
+ #
+ # In this case `messageA_pb.rb` has an embedded message from `header/header_pb.rb`.
+ # If `class_file` is set to `messageA_pb.rb`, and `class_name` to
+ # `MessageA`, `protobuf_root_directory` must be set to `/path/to/pb3`. Which includes
+ # both definitions.
+ config :protobuf_root_directory, :validate => :string, :required => false
+
+ # List of absolute pathes to files with protobuf definitions.
+ # When using more than one file, make sure to arrange the files in reverse order of dependency so that each class is loaded before it is
  # refered to by another.
- #
+ #
  # Example: a class _Unicorn_ referencing another protobuf class _Wings_
  # [source,ruby]
  #     module Animal
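Taken together, the two new options replace long absolute include_path lists for nested layouts. A hedged configuration sketch for the pb3 tree from the comment above (the directory and message names are the comment's example, not real package content):

    codec => protobuf
    {
      class_name => "MessageA"
      class_file => "messageA_pb.rb"              # relative to protobuf_root_directory
      protobuf_root_directory => "/path/to/pb3"   # prepended to $LOAD_PATH so header/header_pb.rb resolves
      protobuf_version => 3
    }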
@@ -76,15 +128,16 @@ class LogStash::Codecs::Protobuf < LogStash::Codecs::Base
  # [source,ruby]
  #     include_path => ['/path/to/protobuf/definitions/Wings.pb.rb','/path/to/protobuf/definitions/Unicorn.pb.rb']
  #
- # When using the codec in an output plugin:
- # * make sure to include all the desired fields in the protobuf definition, including timestamp.
+ # When using the codec in an output plugin:
+ # * make sure to include all the desired fields in the protobuf definition, including timestamp.
  #   Remove fields that are not part of the protobuf definition from the event by using the mutate filter.
- # * the @ symbol is currently not supported in field names when loading the protobuf definitions for encoding. Make sure to call the timestamp field "timestamp"
+ # * the @ symbol is currently not supported in field names when loading the protobuf definitions for encoding. Make sure to call the timestamp field "timestamp"
  #   instead of "@timestamp" in the protobuf file. Logstash event fields will be stripped of the leading @ before conversion.
- #
- config :include_path, :validate => :array, :required => true
+ #
+ # `class_file` and `include_path` cannot be used at the same time.
+ config :include_path, :validate => :array, :default => [], :required => false
 
- # Protocol buffer version switch. Defaults to version 2. Please note that the behaviour for enums varies between the versions.
+ # Protocol buffer version switch. Defaults to version 2. Please note that the behaviour for enums varies between the versions.
  # For protobuf 2 you will get integer representations for enums, for protobuf 3 you'll get string representations due to a different converter library.
  # Recommendation: use the translate plugin to restore previous behaviour when upgrading.
  config :protobuf_version, :validate => [2,3], :default => 2, :required => true
@@ -92,18 +145,61 @@ class LogStash::Codecs::Protobuf < LogStash::Codecs::Base
  # To tolerate faulty messages that cannot be decoded, set this to false. Otherwise the pipeline will stop upon encountering a non decipherable message.
  config :stop_on_error, :validate => :boolean, :default => false, :required => false
 
+ attr_reader :execution_context
+
+ # id of the pipeline whose events you want to read from.
+ def pipeline_id
+   respond_to?(:execution_context) && !execution_context.nil? ? execution_context.pipeline_id : "main"
+ end
+
  def register
    @metainfo_messageclasses = {}
    @metainfo_enumclasses = {}
    @metainfo_pb2_enumlist = []
-   include_path.each { |path| load_protobuf_definition(path) }
-   if @protobuf_version == 3
-     @pb_builder = Google::Protobuf::DescriptorPool.generated_pool.lookup(class_name).msgclass
-   else
-     @pb_builder = pb2_create_instance(class_name)
+
+   if @include_path.length > 0 and not class_file.strip.empty?
+     raise LogStash::ConfigurationError, "Cannot use `include_path` and `class_file` at the same time"
+   end
+
+   if @include_path.length == 0 and class_file.strip.empty?
+     raise LogStash::ConfigurationError, "Need to specify `include_path` or `class_file`"
+   end
+
+   should_register = Google::Protobuf::DescriptorPool.generated_pool.lookup(class_name).nil?
+
+   unless @protobuf_root_directory.nil? or @protobuf_root_directory.strip.empty?
+     if !$LOAD_PATH.include? @protobuf_root_directory and should_register
+       $LOAD_PATH.unshift(@protobuf_root_directory)
+     end
+   end
+
+   @class_file = "#{@protobuf_root_directory}/#{@class_file}" unless (Pathname.new @class_file).absolute? or @class_file.empty?
+
+   # exclusive access while loading protobuf definitions
+   Google::Protobuf::DescriptorPool.with_lock.synchronize do
+     # load from `class_file`
+     load_protobuf_definition(@class_file) if should_register and !@class_file.empty?
+     # load from `include_path`
+     include_path.each { |path| load_protobuf_definition(path) } if include_path.length > 0 and should_register
+
+     if @protobuf_version == 3
+       @pb_builder = Google::Protobuf::DescriptorPool.generated_pool.lookup(class_name).msgclass
+     else
+       @pb_builder = pb2_create_instance(class_name)
+     end
    end
  end
 
+ # Pipelines using this plugin cannot be reloaded.
+ # https://github.com/elastic/logstash/pull/6499
+ #
+ # The DescriptorPool instance registers the protobuf classes (and
+ # dependencies) as global objects. This makes it very difficult to reload a
+ # pipeline, because `class_name` and all of its dependencies are already
+ # registered.
+ def reloadable?
+   return false
+ end
 
  def decode(data)
    if @protobuf_version == 3
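Two behavioural consequences of the reworked register method are worth spelling out. First, include_path and class_file are now mutually exclusive and at least one is required, so a config like the hedged sketch below (paths hypothetical) fails fast with a LogStash::ConfigurationError instead of loading ambiguously:

    codec => protobuf
    {
      class_name => "MessageA"
      class_file => "messageA_pb.rb"
      include_path => ['/path/to/pb_definitions/MessageA_pb.rb']   # raises: cannot combine with class_file
      protobuf_version => 3
    }

Second, should_register makes registration idempotent: if class_name is already present in the generated pool (for example, because a second pipeline uses the same definitions), the definition files are not loaded again.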
@@ -111,7 +207,7 @@ class LogStash::Codecs::Protobuf < LogStash::Codecs::Base
      h = pb3_deep_to_hash(decoded)
    else
      decoded = @pb_builder.parse(data.to_s)
-     h = decoded.to_hash
+     h = decoded.to_hash
    end
    yield LogStash::Event.new(h) if block_given?
  rescue => e
@@ -126,7 +222,7 @@ class LogStash::Codecs::Protobuf < LogStash::Codecs::Base
    if @protobuf_version == 3
      protobytes = pb3_encode_wrapper(event)
    else
-     protobytes = pb2_encode_wrapper(event)
+     protobytes = pb2_encode_wrapper(event)
    end
    @on_event.call(event, protobytes)
  end # def encode
@@ -139,7 +235,7 @@ class LogStash::Codecs::Protobuf < LogStash::Codecs::Base
      result = Hash.new
      input.to_hash.each {|key, value|
        result[key] = pb3_deep_to_hash(value) # the key is required for the class lookup of enums.
-     }
+     }
    when ::Array
      result = []
      input.each {|value|
@@ -174,32 +270,32 @@ class LogStash::Codecs::Protobuf < LogStash::Codecs::Base
 
  def pb3_encode(datahash, class_name)
    if datahash.is_a?(::Hash)
-
-
+
+
 
      # Preparation: the data cannot be encoded until certain criteria are met:
      # 1) remove @ signs from keys.
      # 2) convert timestamps and other objects to strings
      datahash = datahash.inject({}){|x,(k,v)| x[k.gsub(/@/,'').to_sym] = (should_convert_to_string?(v) ? v.to_s : v); x}
-
+
      # Check if any of the fields in this hash are protobuf classes and if so, create a builder for them.
      meta = @metainfo_messageclasses[class_name]
      if meta
        meta.map do | (field_name,class_name) |
          key = field_name.to_sym
          if datahash.include?(key)
-           original_value = datahash[key]
-           datahash[key] =
+           original_value = datahash[key]
+           datahash[key] =
              if original_value.is_a?(::Array)
                # make this field an array/list of protobuf objects
                # value is a list of hashed complex objects, each of which needs to be protobuffed and
                # put back into the list.
-               original_value.map { |x| pb3_encode(x, class_name) }
+               original_value.map { |x| pb3_encode(x, class_name) }
                original_value
-             else
+             else
                r = pb3_encode(original_value, class_name)
                builder = Google::Protobuf::DescriptorPool.generated_pool.lookup(class_name).msgclass
-               builder.new(r)
+               builder.new(r)
              end # if is array
          end # if datahash_include
        end # do
@@ -214,7 +310,7 @@ class LogStash::Codecs::Protobuf < LogStash::Codecs::Base
          original_value = datahash[key]
          datahash[key] = case original_value
            when ::Array
-             original_value.map { |x| pb3_encode(x, class_name) }
+             original_value.map { |x| pb3_encode(x, class_name) }
              original_value
            when Fixnum
              original_value # integers will be automatically converted into enum
@@ -229,8 +325,8 @@ class LogStash::Codecs::Protobuf < LogStash::Codecs::Base
            # rescue => e
            #   @logger.debug("Encoding error 3: could not translate #{original_value} into enum. #{e}")
            #   raise e
-           # end
-         end
+           # end
+         end
        end # if datahash_include
      end # do
    end # if meta
@@ -252,27 +348,27 @@ class LogStash::Codecs::Protobuf < LogStash::Codecs::Base
 
 
 
- def pb2_encode(datahash, class_name)
+ def pb2_encode(datahash, class_name)
    if datahash.is_a?(::Hash)
      # Preparation: the data cannot be encoded until certain criteria are met:
      # 1) remove @ signs from keys.
      # 2) convert timestamps and other objects to strings
      datahash = ::Hash[datahash.map{|(k,v)| [k.to_s.dup.gsub(/@/,''), (should_convert_to_string?(v) ? v.to_s : v)] }]
-
+
      # Check if any of the fields in this hash are protobuf classes and if so, create a builder for them.
      meta = @metainfo_messageclasses[class_name]
      if meta
        meta.map do | (k,c) |
          if datahash.include?(k)
-           original_value = datahash[k]
-           datahash[k] =
+           original_value = datahash[k]
+           datahash[k] =
              if original_value.is_a?(::Array)
                # make this field an array/list of protobuf objects
                # value is a list of hashed complex objects, each of which needs to be protobuffed and
                # put back into the list.
-               original_value.map { |x| pb2_encode(x, c) }
+               original_value.map { |x| pb2_encode(x, c) }
                original_value
-             else
+             else
                proto_obj = pb2_create_instance(c)
                proto_obj.new(pb2_encode(original_value, c)) # this line is reached in the colourtest for an enum. Enums should not be instantiated. Should enums even be in the messageclasses? I dont think so! TODO bug
              end # if is array
@@ -288,7 +384,7 @@ class LogStash::Codecs::Protobuf < LogStash::Codecs::Base
    !(v.is_a?(Fixnum) || v.is_a?(::Hash) || v.is_a?(::Array) || [true, false].include?(v))
  end
 
-
+
  def pb2_create_instance(name)
    @logger.debug("Creating instance of " + name)
    name.split('::').inject(Object) { |n,c| n.const_get c }
@@ -302,7 +398,7 @@ class LogStash::Codecs::Protobuf < LogStash::Codecs::Base
    type = ""
    field_name = ""
    File.readlines(filename).each do |line|
-     if ! (line =~ regex_class_name).nil?
+     if ! (line =~ regex_class_name).nil?
        class_name = $1
        @metainfo_messageclasses[class_name] = {}
        @metainfo_enumclasses[class_name] = {}
@@ -326,7 +422,7 @@ class LogStash::Codecs::Protobuf < LogStash::Codecs::Base
      @logger.warn("Error 3: unable to read pb definition from file " + filename+ ". Reason: #{e.inspect}. Last settings were: class #{class_name} field #{field_name} type #{type}. Backtrace: " + e.backtrace.inspect.to_s)
      raise e
    end
-
+
 
 
  def pb2_metadata_analyis(filename)
@@ -359,13 +455,13 @@ class LogStash::Codecs::Protobuf < LogStash::Codecs::Base
        if type =~ /::/
          clean_type = type.gsub(/^:/,"")
          e = @metainfo_pb2_enumlist.include? clean_type.downcase
-
+
          if e
            if not @metainfo_enumclasses.key? class_name
              @metainfo_enumclasses[class_name] = {}
            end
            @metainfo_enumclasses[class_name][field_name] = clean_type
-         else
+         else
            if not @metainfo_messageclasses.key? class_name
              @metainfo_messageclasses[class_name] = {}
            end
@@ -381,32 +477,32 @@ class LogStash::Codecs::Protobuf < LogStash::Codecs::Base
  rescue LoadError => e
    raise ArgumentError.new("Could not load file: " + filename + ". Please try to use absolute pathes. Current working dir: " + Dir.pwd + ", loadpath: " + $LOAD_PATH.join(" "))
  rescue => e
-
+
    @logger.warn("Error 3: unable to read pb definition from file " + filename+ ". Reason: #{e.inspect}. Last settings were: class #{class_name} field #{field_name} type #{type}. Backtrace: " + e.backtrace.inspect.to_s)
    raise e
  end
-
+
 
  def load_protobuf_definition(filename)
    if filename.end_with? ('.rb')
+     # Add to the loading path of the protobuf definitions
      if (Pathname.new filename).absolute?
-       require filename
-     else
-       require_relative filename # needed for the test cases
-       r = File.expand_path(File.dirname(__FILE__))
-       filename = File.join(r, filename) # make the path absolute
+       begin
+         require filename
+       rescue Exception => e
+         @logger.error("Unable to load file: #{filename}. Reason: #{e.inspect}")
+       end
      end
-
+
      if @protobuf_version == 3
        pb3_metadata_analyis(filename)
      else
        pb2_metadata_analyis(filename)
      end
-
+
-   else
+   else
      @logger.warn("Not a ruby file: " + filename)
    end
  end
 
-
  end # class LogStash::Codecs::Protobuf
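As a closing usage note: load_protobuf_definition now only requires absolute paths (the require_relative fallback for relative paths is gone) and wraps the require in a rescue that logs instead of raising, while metadata analysis still runs afterwards. A hedged end-to-end sketch in the style of the plugin's specs (paths hypothetical):

    require 'logstash/codecs/protobuf'

    codec = LogStash::Codecs::Protobuf.new(
      "class_name"       => "Animals.Unicorn",
      "include_path"     => ['/path/to/pb_definitions/Animal_pb.rb', '/path/to/pb_definitions/Unicorn_pb.rb'],
      "protobuf_version" => 3
    )
    codec.register   # loads definitions under the DescriptorPool lock; the pipeline is not reloadable afterwards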