logstash-input-azure_blob_storage 0.12.9 → 0.12.10

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 4714d163b8085f62c285af7e18cae4b0075e89ee11aa6e6f2a9e18a2fd0dde1a
- data.tar.gz: 0ebb527c554c1b48d7c1d3cb4b17b4ecb8aaa7745dab55b6d0eaa22660722fa2
+ metadata.gz: ebb6ddba4c6e8d122a85901fe33f03125951facfb164fa16e8aa987cf314c34b
+ data.tar.gz: 5fc3537dcbbeacfefb51b45f29493c8494e74a6fcc2fc0d8414626dc86a209aa
  SHA512:
- metadata.gz: c0696b1431363cd1e828340a54fe044eacceb6c494f8f054f0082777f9bf78512fd3a3c893cea063eedfe006f0fad973fc72ebe7af8f7f03bde290a89fb77b89
- data.tar.gz: 41a63e6decb12a501528b349b3c88b04c083f09f645360381af0cfaa520853989a9425fe8bf5c30a69666c3c87887efe649cc622291afdea6118abf462b7af3e
+ metadata.gz: 85abf8fdd855702ddb710ef82fce98d108dac266f4bd1afc8693626ef2319c838e751e03e029dedd3c13eddf090137078a061164bf8748b967aedf875f4d734e
+ data.tar.gz: 256db7cef14c127bccfcd409d5b161e7c9a82571eea17c0255709fd3995a70d49fb08e12a36f221b72c9c4d4f5a4205f1b2380d7df20e311afb65bf260563b7e
data/CHANGELOG.md CHANGED
@@ -1,3 +1,10 @@
+ ## 0.12.10 (Not yet started)
+ - gzip support
+ - csv support
+
+ ## 0.12.9
+ - fixed the processing of non-json and json_lines codecs
+
  ## 0.12.8
  - support append blob (use codec json_lines and logtype raw)
  - change the default head and tail to an empty string, unless the logtype is nsgflowlog
data/README.md CHANGED
@@ -1,18 +1,27 @@
- # Logstash
+ # WARNING !!!
+ Because of an update of logstash or azure I can't seem to get this plugin to work.
+ https://github.com/janmg/logstash-input-azure_blob_storage/issues/44
+ It doesn't look like I'll be able to fix this; even the quick start example is broken.
+
+ Going forward I will rebuild an NSGFLOWLOG-only tool in GOLANG that can fetch the log entries and feed them to stdout, a log file, or a queue like Kafka. This way I cut the JRuby dependencies on logstash. With the logstash-input-kafka plugin you can still pull the flow logs into logstash, or use an Azure Event Hub. The GOLANG program is a proof of concept; it will take some time before it is usable.
+
+ blob-to-kafka.go can already list blobs, list the blocks of a blob, read the blob, loop through the JSON, find the flowtuples and send them to Kafka. It is a work in progress: it does not yet use the file listing, the blob is read in full, partial reads are not yet implemented, and there is no tracking of which files were read. But at least it looks hopeful and progress is being made.
+
+ For problems or feature requests with this specific program, raise a github issue at [GITHUB/janmg/logstash-input-azure_blob_storage/](https://github.com/janmg/logstash-input-azure_blob_storage). Pull requests will also be welcomed after discussion through an issue.
 
- This is a plugin for [Logstash](https://github.com/elastic/logstash). It is fully free and fully open source. The license is Apache 2.0, meaning you are pretty much free to use it however you want in whatever way. All logstash plugin documentation are placed under one [central location](http://www.elastic.co/guide/en/logstash/current/). Need generic logstash help? Try #logstash on freenode IRC or the https://discuss.elastic.co/c/logstash discussion forum.
+ # Logstash
 
- For problems or feature requests with this specific plugin, raise a github issue [GITHUB/janmg/logstash-input-azure_blob_storage/](https://github.com/janmg/logstash-input-azure_blob_storage). Pull requests will also be welcomed after discussion through an issue.
+ This was a plugin for [Logstash](https://github.com/elastic/logstash). It was fully free and fully open source. The license is Apache 2.0, meaning you are pretty much free to use it however you want in whatever way. All logstash plugin documentation is placed under one [central location](http://www.elastic.co/guide/en/logstash/current/). Need generic logstash help? Try #logstash on freenode IRC or the https://discuss.elastic.co/c/logstash discussion forum.
 
  ## Purpose
- This plugin can read from Azure Storage Blobs, for instance JSON diagnostics logs for NSG flow logs or LINE based accesslogs from App Services.
+ This plugin was able to read from Azure Storage Blobs, for instance JSON diagnostics logs for NSG flow logs or LINE-based access logs from App Services.
  [Azure Blob Storage](https://azure.microsoft.com/en-us/services/storage/blobs/)
 
  ## Alternatives
  This plugin was inspired by the Azure diagnostics tools, but should work better for larger numbers of files. The configuration is not compatible: the configuration name azureblob refers to the diagnostics tools plugin, while this plugin uses azure_blob_storage.
  https://github.com/Azure/azure-diagnostics-tools/tree/master/Logstash/logstash-input-azureblob
 
- There is a Filebeat plugin, that may work in the future
+ There is a Filebeat plugin that may work in the future (or not?)
  https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-input-azure-blob-storage.html
 
  ## Innerworking
@@ -28,11 +37,28 @@ The plugin executes the following steps
  7. If logstash is stopped, a stop signal will try to finish the current file, save the registry and then quit
 
  ## Installation
- This plugin can be installed through logstash-plugin
+ This plugin can be installed through logstash-plugin as documented at https://www.elastic.co/guide/en/logstash/current/working-with-plugins.html#listing-plugins. This should pull the latest version from rubygems: https://rubygems.org/gems/logstash-input-azure_blob_storage
+
  ```
  /usr/share/logstash/bin/logstash-plugin install logstash-input-azure_blob_storage
  ```
 
+ For Ubuntu I use these commands to list, update, remove and install plugins:
+ ```
+ sudo -u logstash /usr/share/logstash/bin/logstash-plugin list --verbose
+ sudo -u logstash /usr/share/logstash/bin/logstash-plugin update
+ sudo -u logstash /usr/share/logstash/bin/logstash-plugin update logstash-input-azure_blob_storage
+ sudo -u logstash /usr/share/logstash/bin/logstash-plugin remove logstash-input-azurestorage
+ sudo -u logstash /usr/share/logstash/bin/logstash-plugin install logstash-input-azure_blob_storage
+ ```
+
+ Alternatively you can use the commands from the build.sh script to build and install the gem locally. You don't have to do this unless you want to modify the code in lib/logstash/inputs/azure_blob_storage.rb
+ ```
+ sudo -u logstash gem build logstash-input-azure_blob_storage.gemspec
+ sudo -u logstash gem install logstash-input-azure_blob_storage-${VERSION}.gem
+ sudo -u logstash /usr/share/logstash/bin/logstash-plugin install ${GEMPWD}/logstash-input-azure_blob_storage-${VERSION}.gem
+ ```
+
  ## Minimal Configuration
  The minimum configuration required as input is storageaccount, access_key and container.
 
@@ -74,9 +100,13 @@ The pipeline can be started in several ways.
  pipe.id = test
  pipe.path = /etc/logstash/conf.d/test.conf
  ```
+ and then started as a service
+ ```
+ service logstash start
+ ```
  - As managed pipeline from Kibana
 
- Logstash itself (so not specific to this plugin) has a feature where multiple instances can run on the same system. The default TCP port is 9600, but if it's already in use it will use 9601 (and up), this is probably not true anymore from v8. To update a config file on a running instance on the commandline you can add the argument --config.reload.automatic and if you modify the files that are in the pipeline.yml you can send a SIGHUP channel to reload the pipelines where the config was changed.
+ To update a config file on a running instance on the command line you can add the argument --config.reload.automatic, and if you modify the files that are in pipelines.yml you can send a SIGHUP signal to reload the pipelines whose config was changed.
  [https://www.elastic.co/guide/en/logstash/current/reloading-config.html](https://www.elastic.co/guide/en/logstash/current/reloading-config.html)
 
  ## Internal Working
@@ -88,6 +118,10 @@ Additional fields can be enabled with addfilename and addall, ecs_compatibility
 
  The configurations and the rest of the code are in [lib/logstash/inputs](https://github.com/janmg/logstash-input-azure_blob_storage/tree/master/lib/logstash/inputs), specifically [azure_blob_storage.rb](https://github.com/janmg/logstash-input-azure_blob_storage/blob/master/lib/logstash/inputs/azure_blob_storage.rb#L10)
 
+ ## Codecs
+ The default codec is json; the plugin should also work with the json_lines and line codecs. Other codecs like gzip and csv may work, but this plugin doesn't have specific code to handle them. The plugin reads all the binary data from the file and hands it to the codec to turn into events. For the logtype nsgflowlogs the plugin reads all the blocks and chops them into one event per rule.
+ https://www.elastic.co/guide/en/logstash/current/codec-plugins.html
+
  ## Enabling NSG Flowlogs
  1. Enable Network Watcher in your regions
  2. Create Storage account per region
data/lib/logstash/inputs/azure_blob_storage.rb CHANGED
@@ -331,6 +331,9 @@ public
  begin
    @codec.decode(chunk) do |event|
      counter += 1
+     if @addfilename
+       event.set('filename', name)
+     end
      queue << event
    end
    @processed += counter
logstash-input-azure_blob_storage.gemspec CHANGED
@@ -1,6 +1,6 @@
  Gem::Specification.new do |s|
  s.name = 'logstash-input-azure_blob_storage'
- s.version = '0.12.9'
+ s.version = '0.12.10'
  s.licenses = ['Apache-2.0']
  s.summary = 'This logstash plugin reads and parses data from Azure Storage Blobs.'
  s.description = <<-EOF
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: logstash-input-azure_blob_storage
  version: !ruby/object:Gem::Version
- version: 0.12.9
+ version: 0.12.10
  platform: ruby
  authors:
  - Jan Geertsma
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2023-07-15 00:00:00.000000000 Z
+ date: 2023-12-06 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  requirement: !ruby/object:Gem::Requirement