logstash-input-azure_blob_storage 0.10.1 → 0.10.2

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: db216440cf4319f70a5fbdb001a53da72826afdcbce43c04ada28b63c5d9e1f8
- data.tar.gz: e2e519090c0d67b6b65f4570c34be4f0b02592864459f261c7be613486f2941b
+ metadata.gz: 237b9c5343a6339760b9ab3281a1dfb44f9cc6fcfc826d24928ef95a61984e1e
+ data.tar.gz: 26b2c86c19252cb0c3e681aff153621a8cade7ad47a22c0a6fb093c7764ea79b
  SHA512:
- metadata.gz: 341abd35cf3b732c1a0bada111cbac79cafa92fb48fd14ad24e0121245c67f1819c1d6e5e4647dda645773231da6aa2a61b7e1aaee4027fa97fe9e857a8a334f
- data.tar.gz: 93542f740dda404889f623c9a9d460a1ac6f7181ddd6974de8d775e4e29585b8fccd43b9bd05a880250a4da2709121bf5f018f15a8ce49a9d758b1c645482cbf
+ metadata.gz: 88ad3e85f16731b84cd341924f281315d33504c9baef866b91166e9a298b3bc214dde336f621ff8a2de0acf3595a9ffcdfe6ed9b7703cb0795b5b1854b7f7554
+ data.tar.gz: ab0bf519697e96197f1b1ffda10376919a36bfdc9b47d73bdedac7821810b973ade5c5a2acf92b4c6f508c236faffac5fe36ec46c0e6b3a328c34157a25d500e
data/CHANGELOG.md CHANGED
@@ -1,3 +1,11 @@
+ ## 0.10.2
+ - moved iplookup to own plugin logstash-filter-lookup
+
+ ## 0.10.1
+ - implemented iplookup
+ - fixed sas tokens (maybe)
+ - introduced dns_suffix
+
  ## 0.10.0
  - Plugin created with the logstash plugin generator
  - Reimplemented logstash-input-azureblob with incompatible config and data/registry
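Since 0.10.2 the IP enrichment is no longer done by this input; the changelog points to the separate plugin logstash-filter-lookup. A minimal sketch of where such a filter would sit in a pipeline; the option names inside the filter block are hypothetical, mirroring the options removed from this input, and are not taken from that plugin's documentation:

```
filter {
  lookup {
    # hypothetical options, echoing the removed input settings
    iplookup => "http://10.0.0.5:6081/ripe.php?ip=<ip>"
    use_redis => true
  }
}
```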
data/README.md CHANGED
@@ -97,12 +97,6 @@ input {
  prefix => "resourceId=/"
  registry_create_policy => "resume"
  interval => 300
- iplookup => "http://10.0.0.5:6081/ripe.php?ip="
- use_redis => true
- iplist => [
- "{'ip':'10.0.0.4','netname':'Application Gateway','subnet':'10.0.0.0\/24','hostname':'appgw'}",
- "{'ip':'36.156.24.96',netname':'China Mobile','subnet':'36.156.0.0\/16','hostname':'bigbadwolf'}"
- ]
  }
  }
  ```
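After this removal, the README example reduces to the core options. Assembled here for reference from the surviving context lines above and the minimal example in the gemspec (the access_key value is the placeholder used throughout this gem):

```
input {
    azure_blob_storage {
        storageaccount => "yourstorageaccountname"
        access_key => "Ba5e64c0d3=="
        container => "insights-logs-networksecuritygroupflowevent"
        prefix => "resourceId=/"
        registry_create_policy => "resume"
        interval => 300
    }
}
```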
lib/logstash/inputs/azure_blob_storage.rb CHANGED
@@ -2,13 +2,7 @@
  require "logstash/inputs/base"
  require "stud/interval"
  require 'azure/storage/blob'
- #require 'securerandom'
- #require 'rbconfig'
- #require 'date'
- #require 'json'
- #require 'thread'
- #require 'redis'
- #require 'net/http'
+ require 'json'
 
  # This is a logstash input plugin for files in Azure Blob Storage. There is a storage explorer in the portal and an application with the same name, https://storageexplorer.com. A storage account by default has a globally unique name, {storageaccount}.blob.core.windows.net, which is a CNAME to Azure's blob servers blob.*.store.core.windows.net. A storage account has containers, and those have directories and blobs (like files). Blobs have one or more blocks. After writing the blocks, they can be committed. Some Azure diagnostics can send events to an Event Hub that can be parsed with the plugin logstash-input-azure_event_hubs, but for events that are only stored in a storage account, use this plugin. The original logstash-input-azureblob from azure-diagnostics-tools is great for low volumes, but it suffers from an outdated client, slow reads, lease-locking issues and JSON parse errors.
  # https://azure.microsoft.com/en-us/services/storage/blobs/
@@ -82,33 +76,13 @@ class LogStash::Inputs::AzureBlobStorage < LogStash::Inputs::Base
  # For NSGFLOWLOGS a path starts with "resourceId=/", but this would only be needed to exclude other files that may be written in the same container.
  config :prefix, :validate => :string, :required => false
 
- # Set the value for the registry file.
- #
- # The default, `data/registry`, it contains a Ruby Marshal Serialized Hash of the filename the offset read sofar and the filelength the list time a filelisting was done.
- config :registry_path, :validate => :string, :required => false, :default => 'data/registry'
-
- # The default, `resume`, will load the registry offsets and will start processing files from the offsets.
- # When set to `start_over`, all log files are processed from begining.
- # when set to `start_fresh`, it will read log files that are created or appended since this start of the pipeline.
- config :registry_create_policy, :validate => ['resume','start_over','start_fresh'], :required => false, :default => 'resume'
-
- # Optional to enrich NSGFLOWLOGS with netname and subnet the iplookup value points to a webservice that provides the information in JSON format like this.
- # {"ip":"8.8.8.8","netname":"Google","subnet":"8.8.8.0\/24","hostname":"google-public-dns-a.google.com"}
- # In the query parameter has the <ip> tag will be replaced by the IP address to lookup, other parameters are optional and according to your lookup service.
- config :iplookup, :validate => :string, :required => false, :default => 'http://127.0.0.1/ripe.php?ip=<ip>&TOKEN=token'
-
- # Optional array of JSON objects that don't require a lookup
- config :iplist, :validate => :array, :required => false, :default => ['{"ip":"10.0.0.4","netname":"Application Gateway","subnet":"10.0.0.0\/24","hostname":"appgw"}']
-
- # Optional Redis IP cache
- config :use_redis, :validate => :boolean, :required => false, :default => false
-
 
 
  public
  def register
  @pipe_id = Thread.current[:name].split("[").last.split("]").first
- @logger.info("=== "+config_name+"/"+@pipe_id+"/"+@id[0,6]+" ===")
+ @logger.info("=== "+config_name+" / "+@pipe_id+" / "+@id[0,6]+" ===")
+ #@logger.info("ruby #{ RUBY_VERSION }p#{ RUBY_PATCHLEVEL } / #{Gem.loaded_specs[config_name].version.to_s}")
  @logger.info("Contact me at jan@janmg.com, if something in this plugin doesn't work")
  # TODO: consider multiple readers, so add pipeline @id or use logstash-to-logstash communication?
  # TODO: Implement retry ... Error: Connection refused - Failed to open TCP connection to
@@ -127,9 +101,9 @@ def register
  end
  unless sas_token.nil?
  unless sas_token.value.start_with?('?')
- conn = "BlobEndpoint=https://#{storageaccount}.#{dns_suffix};SharedAccessSignature=#{sas_token.value}"
+ conn = "BlobEndpoint=https://#{storageaccount}.#{dns_suffix};SharedAccessSignature=#{sas_token.value}"
  else
  conn = sas_token.value
  end
  end
  unless conn.nil?
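The two branch bodies above differ only in whitespace in this release; the logic is that a SAS token not starting with '?' gets wrapped into a blob-endpoint connection string, while one starting with '?' is used as-is. A minimal Ruby sketch of the wrapping branch, with made-up placeholder values:

```ruby
# Hypothetical values for illustration only
storageaccount = "yourstorageaccountname"
dns_suffix     = "blob.core.windows.net"
sas_token      = "sv=2018-03-28&ss=b&sig=Ba5e64"  # does not start with '?'

# This branch builds the connection string around the bare token:
conn = "BlobEndpoint=https://#{storageaccount}.#{dns_suffix};SharedAccessSignature=#{sas_token}"
# => "BlobEndpoint=https://yourstorageaccountname.blob.core.windows.net;SharedAccessSignature=sv=2018-03-28&ss=b&sig=Ba5e64"
```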
@@ -220,7 +194,8 @@ def run(queue)
  end
  if logtype == "nsgflowlog" && @is_json
  begin
- @processed += nsgflowlog(queue, JSON.parse(chunk))
+ fingjson = JSON.parse(chunk)
+ @processed += nsgflowlog(queue, fingjson)
  rescue JSON::ParserError
  @logger.error(@pipe_id+" parse error on #{res[:nsg]} [#{res[:date]}] offset: #{file[:offset]} length: #{file[:length]}")
  end
@@ -303,9 +278,11 @@ def nsgflowlog(queue, json)
  if (record["properties"]["Version"]==2)
  ev.merge!( {:flowstate => tups[8], :src_pack => tups[9], :src_bytes => tups[10], :dst_pack => tups[11], :dst_bytes => tups[12]} )
  end
- unless iplookup.nil?
- ev.merge!(addip(tups[1], tups[2]))
- end
+ # Replaced by new plugin: logstash-filter-lookup
+ # This caused JSON parse errors since iplookup is now obsolete
+ #unless iplookup.nil?
+ # ev.merge!(addip(tups[1], tups[2]))
+ #end
  @logger.trace(ev.to_s)
  event = LogStash::Event.new('message' => ev.to_json)
  decorate(event)
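The tups indices above index into the comma-separated flow tuples of an NSG flow log record. A sketch of that mapping for a Version 2 tuple; the sample values are made up, but the field order matches both the indices used above and Azure's documented v2 tuple format (timestamp, source IP, destination IP, ports, protocol, direction, decision, then the v2 flow-state and traffic counters):

```ruby
# Hypothetical v2 flow tuple, split the way the plugin splits it
tuple = "1556871000,10.0.0.4,36.156.24.96,44564,443,T,O,A,E,52,29952,47,27092"
tups = tuple.split(',')

# Illustrative event keys (the actual ev hash is built earlier in nsgflowlog);
# tups[1] and tups[2] are the addresses that addip() received before 0.10.2
ev = { :src => tups[1], :dst => tups[2] }
ev.merge!( {:flowstate => tups[8], :src_pack => tups[9], :src_bytes => tups[10],
            :dst_pack => tups[11], :dst_bytes => tups[12]} )
```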
logstash-input-azure_blob_storage.gemspec CHANGED
@@ -1,19 +1,19 @@
  Gem::Specification.new do |s|
  s.name = 'logstash-input-azure_blob_storage'
- s.version = '0.10.1'
+ s.version = '0.10.2'
  s.licenses = ['Apache-2.0']
  s.summary = 'This logstash plugin reads and parses data from Azure Storage Blobs.'
  s.description = <<-EOF
  This gem is a Logstash plugin. It reads and parses data from Azure Storage Blobs. The azure_blob_storage plugin is a reimplementation that replaces azureblob from azure-diagnostics-tools/Logstash. It can deal with larger volumes and partial file reads, and it eliminates the delay when rebuilding the registry.
 
- The logstash pipeline configuration would look like this
- input {
- azure_blob_storage {
- storageaccount => "yourstorageaccountname"
- access_key => "Ba5e64c0d3=="
- container => "insights-logs-networksecuritygroupflowevent"
- }
- }
+ The minimal logstash pipeline configuration would look like this
+ > input {
+ > azure_blob_storage {
+ > storageaccount => "yourstorageaccountname"
+ > access_key => "Ba5e64c0d3=="
+ > container => "insights-logs-networksecuritygroupflowevent"
+ > }
+ > }
  EOF
  s.homepage = 'https://github.com/janmg/logstash-input-azure_blob_storage'
  s.authors = ['Jan Geertsma']
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: logstash-input-azure_blob_storage
  version: !ruby/object:Gem::Version
- version: 0.10.1
+ version: 0.10.2
  platform: ruby
  authors:
  - Jan Geertsma
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2019-03-22 00:00:00.000000000 Z
+ date: 2019-05-03 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  requirement: !ruby/object:Gem::Requirement
@@ -69,34 +69,34 @@ dependencies:
  - !ruby/object:Gem::Dependency
  requirement: !ruby/object:Gem::Requirement
  requirements:
- - - "~>"
- - !ruby/object:Gem::Version
- version: '1.0'
  - - ">="
  - !ruby/object:Gem::Version
  version: 1.0.0
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: '1.0'
  name: logstash-devutils
  prerelease: false
  type: :development
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
- - - "~>"
- - !ruby/object:Gem::Version
- version: '1.0'
  - - ">="
  - !ruby/object:Gem::Version
  version: 1.0.0
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: '1.0'
  description: |2
  This gem is a Logstash plugin. It reads and parses data from Azure Storage Blobs. The azure_blob_storage plugin is a reimplementation that replaces azureblob from azure-diagnostics-tools/Logstash. It can deal with larger volumes and partial file reads, and it eliminates the delay when rebuilding the registry.
 
- The logstash pipeline configuration would look like this
- input {
- azure_blob_storage {
- storageaccount => "yourstorageaccountname"
- access_key => "Ba5e64c0d3=="
- container => "insights-logs-networksecuritygroupflowevent"
- }
- }
+ The minimal logstash pipeline configuration would look like this
+ > input {
+ > azure_blob_storage {
+ > storageaccount => "yourstorageaccountname"
+ > access_key => "Ba5e64c0d3=="
+ > container => "insights-logs-networksecuritygroupflowevent"
+ > }
+ > }
  email: jan@janmg.com
  executables: []
  extensions: []
@@ -133,7 +133,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  version: '0'
  requirements: []
  rubyforge_project:
- rubygems_version: 2.6.13
+ rubygems_version: 2.7.9
  signing_key:
  specification_version: 4
  summary: This logstash plugin reads and parses data from Azure Storage Blobs.