logstash-output-azure_loganalytics 0.2.2 → 0.2.3
- checksums.yaml +4 -4
- data/CHANGELOG.md +5 -2
- data/README.md +31 -0
- data/VERSION +1 -1
- data/lib/logstash/outputs/azure_loganalytics.rb +5 -1
- metadata +3 -3
checksums.yaml
CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA1:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 979885fba76d51d97736cce9150ddc0df3e08061
+  data.tar.gz: 64e69b5fd9e4290e7ff40dfced1484033c8d764b
 SHA512:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: f43dcc28048eb42f27b9053f2bd530791eb852d43f6eff796ace77e4526a873da074cb0c839780f927e397cb120b2bebca11a1b05f03289b6c8978f7706f3e96
+  data.tar.gz: cb4f87632dd4af3913f21c01bd0611648c0be7d48beaf4de7b8f295e25f767447da8c9fa80f59a8b543db441d7fac0b720cb1dc4b069bb6da48dfe40ad7c1f28
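If you want to confirm these checksums against a local copy of the gem, the sketch below is one way to do it; it is not part of the gem itself. It assumes you have already run `gem fetch logstash-output-azure_loganalytics -v 0.2.3` and unpacked the archive with `tar -xf logstash-output-azure_loganalytics-0.2.3.gem`, which leaves `metadata.gz` and `data.tar.gz` in the current directory.

```ruby
# Illustrative verification helper; compare the printed digests with the
# SHA1/SHA512 values recorded in checksums.yaml above.
require 'digest'

%w[metadata.gz data.tar.gz].each do |name|
  abort "#{name} not found - unpack the .gem archive first" unless File.exist?(name)
  puts "#{name} SHA1:   #{Digest::SHA1.file(name).hexdigest}"
  puts "#{name} SHA512: #{Digest::SHA512.file(name).hexdigest}"
end
```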
data/CHANGELOG.md
CHANGED
@@ -1,5 +1,8 @@
+## 0.2.3
+* Added additional debug logging for successful requests - [PR#7](https://github.com/yokawasa/logstash-output-azure_loganalytics/pull/7) by [@daniel-chambers](https://github.com/daniel-chambers)
+
 ## 0.2.2
-* Fix logging failure [PR#6](https://github.com/yokawasa/logstash-output-azure_loganalytics/pull/6)
+* Fix logging failure - [PR#6](https://github.com/yokawasa/logstash-output-azure_loganalytics/pull/6) by [@daniel-chambers](https://github.com/daniel-chambers)
 
 ## 0.2.1
 
@@ -7,7 +10,7 @@
 
 ## 0.2.0
 
-* Support for time-generated-field in output configuration [Issue#4](https://github.com/yokawasa/logstash-output-azure_loganalytics/issues/4) (Thanks to [@KiZach](https://github.com/KiZach))
+* Support for time-generated-field in output configuration - [Issue#4](https://github.com/yokawasa/logstash-output-azure_loganalytics/issues/4) (Thanks to [@KiZach](https://github.com/KiZach))
 
 ## 0.1.1
 
data/README.md
CHANGED
@@ -115,6 +115,37 @@ Here is an expected output for sample input (Apache2 access log):
 }
 ```
 
+## Debugging
+If you need to debug and watch what this plugin is sending to Log Analytics, you can change the logstash log level for this plugin to `DEBUG` to get additional logs in the logstash logs.
+
+One way of changing the log level is to use the logstash API:
+
+```
+> curl -XPUT 'localhost:9600/_node/logging?pretty' -H "Content-Type: application/json" -d '{ "logger.logstash.outputs.azureloganalytics" : "DEBUG" }'
+{
+  "host" : "yoichitest01",
+  "version" : "6.5.4",
+  "http_address" : "127.0.0.1:9600",
+  "id" : "d8038a9e-02c6-411a-9f6b-597f910edc54",
+  "name" : "yoichitest01",
+  "acknowledged" : true
+}
+```
+
+You should then be able to see logs like this in your logstash logs:
+
+```
+[2019-03-29T01:18:52,652][DEBUG][logstash.outputs.azureloganalytics] Posting log batch (log count: 50) as log type HealthCheckLogs to DataCollector API. First log: {"message":{"Application":"HealthCheck.API","Environments":{},"Name":"SystemMetrics","LogLevel":"Information","Properties":{"CPU":3,"Memory":83}},"beat":{"version":"6.5.4","hostname":"yoichitest01","name":"yoichitest01"},"timestamp":"2019-03-29T01:18:51.901Z"}
+
+[2019-03-29T01:18:52,819][DEBUG][logstash.outputs.azureloganalytics] Successfully posted logs as log type HealthCheckLogs with result code 200 to DataCollector API
+```
+
+Once you're done, you can use the logstash API to undo your log level changes:
+
+```
+> curl -XPUT 'localhost:9600/_node/logging/reset?pretty'
+```
+
 ## Contributing
 
 Bug reports and pull requests are welcome on GitHub at https://github.com/yokawasa/logstash-output-azure_loganalytics.
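The curl call added in the Debugging section above talks to the logstash node API on port 9600. If curl is not convenient, the same request can be issued from Ruby's standard library; the snippet below is only an illustrative equivalent of that call, assuming logstash is listening on localhost:9600.

```ruby
# Illustrative Ruby equivalent of the curl command in the README's Debugging
# section: sets this plugin's logger to DEBUG via the logstash node API
# (assumed to be reachable at localhost:9600).
require 'net/http'
require 'json'

uri = URI('http://localhost:9600/_node/logging?pretty')
request = Net::HTTP::Put.new(uri, 'Content-Type' => 'application/json')
request.body = { 'logger.logstash.outputs.azureloganalytics' => 'DEBUG' }.to_json

response = Net::HTTP.start(uri.hostname, uri.port) { |http| http.request(request) }
puts response.body
```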
data/VERSION
CHANGED
@@ -1 +1 @@
-0.2.2
+0.2.3
data/lib/logstash/outputs/azure_loganalytics.rb
CHANGED
@@ -80,12 +80,16 @@ class LogStash::Outputs::AzureLogAnalytics < LogStash::Outputs::Base
 
     # Skip in case there are no candidate documents to deliver
     if documents.length < 1
+      @logger.debug("No documents in batch for log type #{@log_type}. Skipping")
       return
     end
 
     begin
+      @logger.debug("Posting log batch (log count: #{documents.length}) as log type #{@log_type} to DataCollector API. First log: " + (documents[0].to_json).to_s)
       res = @client.post_data(@log_type, documents, @time_generated_field)
-      if
+      if Azure::Loganalytics::Datacollectorapi::Client.is_success(res)
+        @logger.debug("Successfully posted logs as log type #{@log_type} with result code #{res.code} to DataCollector API")
+      else
         @logger.error("DataCollector API request failure: error code: #{res.code}, data=>" + (documents.to_json).to_s)
       end
     rescue Exception => ex
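To make the new control flow easier to read outside of diff context, here is a standalone sketch of the pattern this change introduces: log the outgoing batch at DEBUG, then log either a DEBUG success line or an ERROR line depending on the Data Collector API response. The `FakeResponse` struct and `success?` check are stand-ins for the real client objects, used only so the example runs on its own.

```ruby
# Standalone illustration of the logging pattern added in 0.2.3.
# FakeResponse and success? are stand-ins for the Data Collector API
# client's response object and its is_success check.
require 'json'
require 'logger'

FakeResponse = Struct.new(:code)

def success?(res)
  (200..299).cover?(res.code)
end

logger = Logger.new($stdout)
logger.level = Logger::DEBUG

log_type  = 'HealthCheckLogs'
documents = [{ 'Application' => 'HealthCheck.API', 'LogLevel' => 'Information' }]

if documents.empty?
  logger.debug("No documents in batch for log type #{log_type}. Skipping")
else
  logger.debug("Posting log batch (log count: #{documents.length}) as log type #{log_type} to DataCollector API. First log: #{documents[0].to_json}")
  res = FakeResponse.new(200) # pretend the POST returned HTTP 200
  if success?(res)
    logger.debug("Successfully posted logs as log type #{log_type} with result code #{res.code} to DataCollector API")
  else
    logger.error("DataCollector API request failure: error code: #{res.code}, data=>#{documents.to_json}")
  end
end
```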
metadata
CHANGED
@@ -1,14 +1,14 @@
 --- !ruby/object:Gem::Specification
 name: logstash-output-azure_loganalytics
 version: !ruby/object:Gem::Version
-  version: 0.2.2
+  version: 0.2.3
 platform: ruby
 authors:
 - Yoichi Kawasaki
 autorequire:
 bindir: bin
 cert_chain: []
-date: 2019-03-
+date: 2019-03-30 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
   requirement: !ruby/object:Gem::Requirement
@@ -123,7 +123,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
     version: '0'
 requirements: []
 rubyforge_project:
-rubygems_version: 2.
+rubygems_version: 2.6.8
 signing_key:
 specification_version: 4
 summary: logstash output plugin to store events into Azure Log Analytics
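Once the 0.2.3 gem is installed into your logstash plugin environment, the version and build date recorded in the metadata above can be confirmed from Ruby. This is just a convenience check, assuming the gem is visible to the local RubyGems installation.

```ruby
# Print the installed gem's version and build date; this should report
# 0.2.3 and the 2019-03-30 date shown in the metadata diff above.
require 'rubygems'

spec = Gem::Specification.find_by_name('logstash-output-azure_loganalytics')
puts "#{spec.name} #{spec.version} (built #{spec.date})"
```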