fluent-plugin-jsonish 2.0.1 → 2.0.2
- data.tar.gz.sig +0 -0
- data/README.md +23 -4
- data/lib/fluent/plugin/formatter_jsonish.rb +45 -0
- data/lib/fluent/plugin/formatter_logstash.rb +15 -0
- metadata +6 -4
- metadata.gz.sig +0 -0
data.tar.gz.sig CHANGED
Binary file
data/README.md CHANGED
@@ -3,6 +3,8 @@
 ## Notes
 
 Some of the functionality implemented by these plugins seems to now be available in the standard JSON parser for Fluentd 1.0. I haven't fully checked how usable it may be for particular purposes, but these plugins may end up being abandoned if it turns out that they are no longer needed. However, I have updated them so that they do work with Fluentd 1.0. I haven't checked whether the changes made for Fluentd 1.0 are backward-compatible with previous versions: the 2.x.x versions of this plugin now require Fluentd 1.0.
 
+It appears that Fluentd 1.0 has dropped support for inserting timestamp or tag data into the output when using the JSON formatter. A JSONish formatter has been added to this module, along with a trivial Logstash formatter, to reimplement this functionality.
+
 ## Overview
 
 The jsonish [parser plugin](http://docs.fluentd.org/articles/parser-plugin-overview) for fluentd. It subclasses the JSONParser to allow modifications to be made to input text before it is deserialized. It subclasses the TimeParser to allow time format specifications using Ruby Time class method names instead of strftime format strings -- in particular, the iso8601 method.
 
 :
@@ -21,7 +23,7 @@ gem install fluent-plugin-jsonish
 
 ## Configuration
 
-### jsonish
+### jsonish parser
 
 ```
 <source>
@@ -68,7 +70,21 @@ As an example, the nginx_jsonish subclasses the jsonish parser which really only
 
 The nodejs_bunyan plugin is similar in its behavior in setting needed defaults, although it also performs some needed processing after the JSON deserialization.
 
-### nginx_jsonish
+### jsonish formatter
+
+```<match **>
+type file # any type which takes a format argument
+format jsonish
+add_time <hash with record time insertion configuration>
+add_tag <hash with record tag insertion configuration>
+[ fluent file output plugin configuration ]
+</match>
+```
+
+`add_time`: a hash specifying how the record timestamp should be inserted into the output JSON -- 'key' must be set to the name of the key to use in the output, and 'format' to the Time method name used to convert it to a string (defaults to 'iso8601(3)').
+`add_tag`: a hash specifying how the record tag should be inserted into the output JSON -- 'key' must be set to the name of the key to use in the output, and 'format' to the method name used to convert it to a string (defaults to 'to_s').
+
+### nginx_jsonish parser
 
 The nginx configuration must be configured to output a "JSONish" format. As mentioned above, a true JSON format cannot be reliably emitted using an nginx custom log format. The "time" key, at a minimum, must be set with an ISO-8601 time stamp. By default, the parser will look for a "request" key and set the "message" key to this value.
 
 Something like the following is probably overkill for most, but it does work:
@@ -117,7 +133,7 @@ log_format extended "{ \"time\": \"$msec\", \"proxy_http_x_forwarded_for\": \"$
 </source>
 ```
 
-### nodejs_bunyan
+### nodejs_bunyan parser
 This is a parser for Node.js applications which use [node-bunyan](https://github.com/trentm/node-bunyan) for logging. It pretty much takes care of everything, including mapping the "level" from this format to standard [syslog severity levels](https://en.wikipedia.org/wiki/Syslog#Severity_level).
 
 The fluentd parser configuration for this input is straight-forward:
@@ -133,7 +149,7 @@ The fluentd parser configuration for this input is straight-forward:
 </source>
 ```
 
-### logstash
+### logstash parser
 This is a parser for inputs which are in [Logstash](https://gist.github.com/jordansissel/2996677) format. Only two keys are required in this format ("@timestamp" and "@version"). The parser automatically maps "@timestamp" to time. Both the "@version" and "@timestamp" keys are deleted, since they are part of the Logstash event metadata, not actual event data.
 
 The fluentd parser configuration for this input is straight-forward:
@@ -148,3 +164,6 @@ The fluentd parser configuration for this input is straight-forward:
 </parse>
 </source>
 ```
+
+### logstash formatter
+This is a trivial subclass of the jsonish formatter which automatically sets 'add_time' to '{ "key": "@timestamp", "format": "iso8601(3)" }'.
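The Overview above notes that the parsers accept Ruby Time class method names (such as `iso8601`) in place of strftime patterns. Outside Fluentd, a minimal sketch of the equivalent stdlib call (not part of the gem itself) looks like this:

```ruby
require 'time'

# Parse an ISO-8601 timestamp via the Time class method directly,
# rather than a strftime pattern like "%Y-%m-%dT%H:%M:%S.%L%z".
t = Time.iso8601('2019-02-19T12:34:56.789+00:00')

# Round-trip back to a string with millisecond precision.
formatted = t.utc.iso8601(3)
```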
data/lib/fluent/plugin/formatter_jsonish.rb ADDED
@@ -0,0 +1,45 @@
+require 'fluent/formatter'
+require 'time'
+
+module Fluent
+  module TextFormatter
+    class JSONishFormatter < JSONFormatter
+
+      Plugin.register_formatter("jsonish", self)
+
+      config_param :add_tag, :hash, :default => {}
+      config_param :add_time, :hash, :default => {}
+
+      def update_entry(tag, time, record)
+        merge_hash = {}
+
+        if @add_time.key?('key')
+          if not @add_time.key?('format') or @add_time['format'] == 'iso8601(3)'
+            merge_hash[@add_time['key']] = Time.at(time.to_r).iso8601(3)
+          else
+            merge_hash[@add_time['key']] = eval("Time.at(time.to_r).#{@add_time['format']}")
+          end
+        end
+
+        if @add_tag.key?('key')
+          if not @add_tag.key?('format') or @add_tag['format'] == 'to_s'
+            merge_hash[@add_tag['key']] = tag.to_s
+          else
+            merge_hash[@add_tag['key']] = eval("tag.#{@add_tag['format']}")
+          end
+        end
+
+        return tag, time, record.merge(merge_hash)
+      end
+
+      def format(tag, time, record)
+        super(*update_entry(tag, time, record))
+      end
+
+      def format_without_nl(tag, time, record)
+        super(*update_entry(tag, time, record))
+      end
+
+    end
+  end
+end
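The core of the new formatter is the `update_entry` merge step. A standalone sketch of that behavior, reproduced outside Fluentd for illustration (the helper name is hypothetical, and UTC is forced here for determinism where the gem uses local time):

```ruby
require 'time'

# Sketch of the formatter's merge step: the record hash gains the
# formatted event time and tag before JSON serialization.
def merge_time_and_tag(tag, time, record, add_time: {}, add_tag: {})
  merge_hash = {}
  # Default time format is iso8601(3), keyed by add_time['key'].
  merge_hash[add_time['key']] = Time.at(time.to_r).utc.iso8601(3) if add_time.key?('key')
  # Default tag format is to_s, keyed by add_tag['key'].
  merge_hash[add_tag['key']] = tag.to_s if add_tag.key?('key')
  record.merge(merge_hash)
end

out = merge_time_and_tag('app.access', 1550579696, { 'message' => 'ok' },
                         add_time: { 'key' => 'time' },
                         add_tag:  { 'key' => 'tag' })
```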
data/lib/fluent/plugin/formatter_logstash.rb ADDED
@@ -0,0 +1,15 @@
+require_relative 'formatter_jsonish'
+
+module Fluent
+  module TextFormatter
+    class LogstashFormatter < JSONishFormatter
+      Plugin.register_formatter('logstash', self)
+
+      def configure(conf)
+        super(conf)
+        @add_time = { 'key' => '@timestamp', 'format' => 'iso8601(3)' }
+      end
+
+    end
+  end
+end
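Since the Logstash formatter pins `@add_time` to the `@timestamp` key with `iso8601(3)` formatting, the emitted lines should take the following shape. A sketch of the assumed output (built with the stdlib directly, not through Fluentd):

```ruby
require 'time'
require 'json'

# Assumed output shape of the logstash formatter: the record serialized
# as JSON with an added "@timestamp" key formatted by Time#iso8601(3).
record = { 'message' => 'hello' }
event_time = Time.utc(2019, 2, 19, 12, 0, 0)
line = JSON.generate(record.merge('@timestamp' => event_time.iso8601(3)))
```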
metadata CHANGED
@@ -1,13 +1,13 @@
 --- !ruby/object:Gem::Specification
 name: fluent-plugin-jsonish
 version: !ruby/object:Gem::Version
-  hash:
+  hash: 11
   prerelease:
   segments:
   - 2
   - 0
-  - 1
-  version: 2.0.1
+  - 2
+  version: 2.0.2
 platform: ruby
 authors:
 - Alex Yamauchi
@@ -41,7 +41,7 @@ cert_chain:
   2Zk648Ep9HVPKmwoVuB75+xEQw==
   -----END CERTIFICATE-----
 
-date:
+date: 2019-02-19 00:00:00 Z
 dependencies:
 - !ruby/object:Gem::Dependency
   name: fluentd
@@ -75,6 +75,8 @@ files:
 - README.md
 - certs/oss@hotschedules.com.cert
 - fluent-plugin-jsonish.gemspec
+- lib/fluent/plugin/formatter_jsonish.rb
+- lib/fluent/plugin/formatter_logstash.rb
 - lib/fluent/plugin/parser_jsonish.rb
 - lib/fluent/plugin/parser_logstash.rb
 - lib/fluent/plugin/parser_nginx_jsonish.rb
metadata.gz.sig CHANGED
Binary file