logstash-input-azure_blob_storage 0.12.8 → 0.12.10
- checksums.yaml +4 -4
- data/CHANGELOG.md +7 -0
- data/README.md +41 -7
- data/lib/logstash/inputs/azure_blob_storage.rb +21 -0
- data/logstash-input-azure_blob_storage.gemspec +1 -1
- metadata +2 -2
checksums.yaml
CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: ebb6ddba4c6e8d122a85901fe33f03125951facfb164fa16e8aa987cf314c34b
+  data.tar.gz: 5fc3537dcbbeacfefb51b45f29493c8494e74a6fcc2fc0d8414626dc86a209aa
 SHA512:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 85abf8fdd855702ddb710ef82fce98d108dac266f4bd1afc8693626ef2319c838e751e03e029dedd3c13eddf090137078a061164bf8748b967aedf875f4d734e
+  data.tar.gz: 256db7cef14c127bccfcd409d5b161e7c9a82571eea17c0255709fd3995a70d49fb08e12a36f221b72c9c4d4f5a4205f1b2380d7df20e311afb65bf260563b7e
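The SHA256/SHA512 values above are plain hex digests of the gem's inner archives. A minimal Ruby sketch of how such checksums are computed (the file name is illustrative; the sketch falls back to a literal string when the file is absent):

```ruby
# Sketch: RubyGems checksums are SHA256/SHA512 hex digests of the
# archive bytes (metadata.gz, data.tar.gz). File name is illustrative.
require 'digest'

bytes = File.exist?('data.tar.gz') ? File.binread('data.tar.gz') : 'example bytes'
puts Digest::SHA256.hexdigest(bytes)  # 64 hex chars
puts Digest::SHA512.hexdigest(bytes)  # 128 hex chars
```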
data/CHANGELOG.md
CHANGED
@@ -1,3 +1,10 @@
+## 0.12.10 (Not yet started)
+- gzip support
+- csv support
+
+## 0.12.9
+- fixed the processing of non json and json_lines codecs
+
 ## 0.12.8
 - support append blob (use codec json_lines and logtype raw)
 - change the default head and tail to an empty string, unless the logtype is nsgflowlog
data/README.md
CHANGED
@@ -1,18 +1,27 @@
-#
+# WARNING !!!
+Because of an update of logstash or azure I can't seem to get this plugin to work.
+https://github.com/janmg/logstash-input-azure_blob_storage/issues/44
+It doesn't look like I'll be able to fix this; even the quick start example is broken.
+
+Going forward I will rebuild an NSGFLOWLOG-only tool in GOLANG that can fetch the log entries and feed them to stdout, a log file, or a queue like Kafka; this way I cut the JRuby dependencies on logstash. With the logstash-input-kafka plugin you can still pull the flow logs into logstash, or use an Azure Event Hub. The GOLANG program is a proof of concept; it will take some time before it is usable.
+
+blob-to-kafka.go can already list blobs, list the blocks of a blob, read the blob, loop through the JSON, find the flowtuples and send them to Kafka. It's work in progress: it doesn't use the file listing yet, the blob is fully read, partial reads are not yet implemented, and there is no tracking of which files were read. But at least it looks hopeful and I'm making progress.
+
+For problems or feature requests with this specific program, raise a github issue at [github.com/janmg/logstash-input-azure_blob_storage](https://github.com/janmg/logstash-input-azure_blob_storage). Pull requests are also welcome after discussion through an issue.
 
-
+# Logstash
 
-
+This was a plugin for [Logstash](https://github.com/elastic/logstash). It was fully free and fully open source. The license is Apache 2.0, meaning you are pretty much free to use it however you want. All logstash plugin documentation is placed in one [central location](http://www.elastic.co/guide/en/logstash/current/). Need generic logstash help? Try #logstash on freenode IRC or the https://discuss.elastic.co/c/logstash discussion forum.
 
 ## Purpose
-This plugin
+This plugin was able to read from Azure Storage Blobs, for instance JSON diagnostics logs for NSG flow logs or line-based access logs from App Services.
 [Azure Blob Storage](https://azure.microsoft.com/en-us/services/storage/blobs/)
 
 ## Alternatives
 This plugin was inspired by the Azure diagnostics tools, but should work better for bigger amounts of files. The configuration is not compatible: azureblob refers to the diagnostics tools plugin, while this plugin uses azure_blob_storage.
 https://github.com/Azure/azure-diagnostics-tools/tree/master/Logstash/logstash-input-azureblob
 
-There is a Filebeat plugin, that may work in the future
+There is a Filebeat plugin that may work in the future (or not?)
 https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-input-azure-blob-storage.html
 
 ## Innerworking
|
@@ -28,11 +37,28 @@ The plugin executes the following steps
 7. If logstash is stopped, a stop signal will try to finish the current file, save the registry and then quit
 
 ## Installation
-This plugin can be installed through logstash-plugin
+This plugin can be installed through logstash-plugin as documented at https://www.elastic.co/guide/en/logstash/current/working-with-plugins.html#listing-plugins. This pulls the latest version from rubygems: https://rubygems.org/gems/logstash-input-azure_blob_storage
 
 ```
 /usr/share/logstash/bin/logstash-plugin install logstash-input-azure_blob_storage
 ```
 
+For Ubuntu I use these commands to list, update, remove and install:
+```
+sudo -u logstash /usr/share/logstash/bin/logstash-plugin list --verbose
+sudo -u logstash /usr/share/logstash/bin/logstash-plugin update
+sudo -u logstash /usr/share/logstash/bin/logstash-plugin update logstash-input-azure_blob_storage
+sudo -u logstash /usr/share/logstash/bin/logstash-plugin remove logstash-input-azurestorage
+sudo -u logstash /usr/share/logstash/bin/logstash-plugin install logstash-input-azure_blob_storage
+```
+
+Alternatively you can use the commands from the build.sh script to build and install the gem locally. You don't have to do this unless you want to modify the code in lib/logstash/inputs/azure_blob_storage.rb:
+```
+sudo -u logstash gem build logstash-input-azure_blob_storage.gemspec
+sudo -u logstash gem install logstash-input-azure_blob_storage-${VERSION}.gem
+sudo -u logstash /usr/share/logstash/bin/logstash-plugin install ${GEMPWD}/logstash-input-azure_blob_storage-${VERSION}.gem
+```
+
 ## Minimal Configuration
 The minimum configuration required as input is storageaccount, access_key and container.
 
@@ -74,9 +100,13 @@ The pipeline can be started in several ways.
 pipe.id = test
 pipe.path = /etc/logstash/conf.d/test.conf
 ```
+and then started as a service:
+```
+service logstash start
+```
 - As managed pipeline from Kibana
 
+To update a config file on a running instance from the command line, you can add the argument --config.reload.automatic; if you modify files that are listed in pipelines.yml, you can send a SIGHUP signal to reload the pipelines whose config was changed.
 [https://www.elastic.co/guide/en/logstash/current/reloading-config.html](https://www.elastic.co/guide/en/logstash/current/reloading-config.html)
 
 ## Internal Working
@@ -88,6 +118,10 @@ Additional fields can be enabled with addfilename and addall, ecs_compatibility
 
 The configurations and the rest of the code are in [lib/logstash/inputs](https://github.com/janmg/logstash-input-azure_blob_storage/tree/master/lib/logstash/inputs) and [azure_blob_storage.rb](https://github.com/janmg/logstash-input-azure_blob_storage/blob/master/lib/logstash/inputs/azure_blob_storage.rb#L10)
 
+## Codecs
+The default codec is json; the plugin should also work with json_lines and line. Other codecs like gzip and csv may work, but this plugin doesn't have specific code to handle them. The plugin reads all the binary data from the file and hands it to the codec to turn into events. For the logtype nsgflowlogs the plugin reads all the blocks and chops them into one event per rule.
+https://www.elastic.co/guide/en/logstash/current/codec-plugins.html
+
 ## Enabling NSG Flowlogs
 1. Enable Network Watcher in your regions
 2. Create Storage account per region
data/lib/logstash/inputs/azure_blob_storage.rb
CHANGED
@@ -321,6 +321,27 @@ public
             @logger.error("json_lines codec exception: #{e.message} .. continue and pretend this never happened")
           end
         end
+
+        if !@is_json_line && !@is_json
+          if logtype == "wadiis"
+            # TODO: Convert this to line based grokking.
+            @processed += wadiislog(queue, name)
+          else
+            # Any other codec and logstyle
+            begin
+              @codec.decode(chunk) do |event|
+                counter += 1
+                if @addfilename
+                  event.set('filename', name)
+                end
+                queue << event
+              end
+              @processed += counter
+            rescue Exception => e
+              @logger.error("other codec exception: #{e.message} .. continue and pretend this never happened")
+            end
+          end
+        end
       end
 
       # Update the size
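The fallback branch added above hands the raw chunk to whatever codec is configured and enqueues each event it yields. A toy stand-in (not the real LogStash::Codecs API) of that decode-and-enqueue loop:

```ruby
# Toy stand-in for a line-oriented logstash codec: decode yields one
# event per line, and the caller counts and enqueues each event, as the
# new fallback branch does with @codec.decode(chunk).
class ToyLineCodec
  def decode(chunk)
    chunk.each_line { |line| yield({ 'message' => line.chomp }) }
  end
end

queue = []
counter = 0
ToyLineCodec.new.decode("first\nsecond\n") do |event|
  counter += 1
  event['filename'] = 'blob.log'  # mirrors the plugin's addfilename option
  queue << event
end
# counter == 2; queue holds one event per input line
```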
metadata
CHANGED
@@ -1,14 +1,14 @@
 --- !ruby/object:Gem::Specification
 name: logstash-input-azure_blob_storage
 version: !ruby/object:Gem::Version
-  version: 0.12.
+  version: 0.12.10
 platform: ruby
 authors:
 - Jan Geertsma
 autorequire:
 bindir: bin
 cert_chain: []
-date: 2023-
+date: 2023-12-06 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
   requirement: !ruby/object:Gem::Requirement
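The version bump recorded in this metadata (0.12.8 → 0.12.10) is compared segment by segment by RubyGems. A quick sketch showing why Gem::Version ordering differs from plain string comparison:

```ruby
# Gem::Version compares versions segment-wise (8 < 10), whereas a plain
# string comparison would put '0.12.10' before '0.12.8'.
require 'rubygems'

old_v = Gem::Version.new('0.12.8')
new_v = Gem::Version.new('0.12.10')
puts new_v > old_v            # true
puts('0.12.10' > '0.12.8')    # false: string order compares '1' vs '8'
```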