fluent-plugin-jsonish 1.0.2 → 2.0.1

data.tar.gz.sig CHANGED
Binary file
data/README.md CHANGED
@@ -1,5 +1,8 @@
  # jsonish, nginx_jsonish, nodejs_bunyan, and logstash parser plugins for Fluentd
 
+ ## Notes
+ Some of the functionality implemented by these plugins now appears to be available in the standard JSON parser for Fluentd 1.0. I haven't fully checked how usable it is for particular purposes, and these plugins may end up being abandoned if it turns out that they are no longer needed. However, I have updated them so that they work with Fluentd 1.0. I haven't checked whether the changes made for Fluentd 1.0 are backward-compatible with previous versions: the 2.x.x versions of this plugin now require Fluentd 1.0.
+
  ## Overview
  jsonish is a [parser plugin](http://docs.fluentd.org/articles/parser-plugin-overview) for fluentd. It subclasses the JSONParser to allow modifications to be made to the input text before it is deserialized. It subclasses the TimeParser to allow time format specifications using Ruby Time class method names instead of strftime format strings -- in particular, the iso8601 method.
  :
@@ -22,14 +25,16 @@ gem install fluent-plugin-jsonish
 
  ```
  <source>
- type [tail|tcp|udp|syslog|http] # any input type which takes a format
- format jsonish
- maps <mappings for text processing prior to the JSON parse>
- null_pattern <the pattern used for nulls in the text>
- null_maps <how the patterns specified by null_pattern should be replaced>
- message_key <key to use for setting the 'message' key in the record>
- add_full_message <whether to the record 'full_message' key to the raw input>
- move_keys <hash in JSON format, defaults to {}>
+ @type [tail|tcp|udp|syslog|http] # any input type which takes a format
+ <parse>
+ @type jsonish
+ maps <mappings for text processing prior to the JSON parse>
+ null_pattern <the pattern used for nulls in the text>
+ null_maps <how the patterns specified by null_pattern should be replaced>
+ message_key <key to use for setting the 'message' key in the record>
+ add_full_message <whether to set the record's 'full_message' key to the raw input>
+ move_keys <hash in JSON format, defaults to {}>
+ </parse>
  </source>
  ```
  ```
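The `maps` and `null_pattern`/`null_maps` options above boil down to running a few text substitutions over each line before handing it to the JSON parser. The following is an editorial sketch of that idea in plain Ruby, not code from the plugin; the sample log line and the two substitutions are hypothetical stand-ins for whatever a real mapping would do.

```
require 'json'

# Hypothetical "jsonish" input: almost JSON, but with escaped slashes and a
# bare "-" where a null belongs (both made-up stand-ins for real log output).
line = '{ "request": "GET \/index.html", "gzip_ratio": - }'

fixed = line
          .gsub('\/', '/')        # roughly what a "slashes"-style map would do
          .gsub(': -', ': null')  # roughly what a null_pattern/null_maps pair would do

record = JSON.parse(fixed)
puts record['request']            # => GET /index.html
```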
 
@@ -50,12 +55,14 @@ As an example, the nginx_jsonish subclasses the jsonish parser which really only
 
  ```
  <source>
- type tail
- format jsonish
- path <nginx access log file>
- maps ([ "slashes", "nulls" ] automatically prepended to anything set here)
- message_key (set to "request", by default)
- [ standard parser or jsonish arguments ]
+ @type tail
+ <parse>
+ @type jsonish
+ path <nginx access log file>
+ maps ([ "slashes", "nulls" ] automatically prepended to anything set here)
+ message_key (set to "request", by default)
+ [ standard parser or jsonish arguments ]
+ </parse>
  </source>
  ```
 
@@ -80,10 +87,33 @@ The fluentd parser configuration for this input is straight-forward:
 
  ```
  <source>
- type tail
- format nginx_jsonish
- path <nginx access log file>
- [ standard parser or jsonish arguments ]
+ @type tail
+ <parse>
+ @type nginx_jsonish
+ path <nginx access log file>
+ [ standard parser or jsonish arguments ]
+ </parse>
+ </source>
+ ```
+
+ With Fluentd 1.0, timestamps with millisecond precision are available. They can be used in an nginx log_format specification, but not in an ISO-8601 format -- only via the "$msec" variable, which is horribly named (it's actually a Unix epoch in seconds, with millisecond precision). However, there is something strange going on with the time_type "float", where it seems to insist that the value for the time_key be a string. I haven't been able to get that to work, but the following does work for nginx access logs with millisecond-precision timestamps.
+
+ ```
+ log_format extended "{ \"time\": \"$msec\", \"proxy_http_x_forwarded_for\": \"$proxy_add_x_forwarded_for\", \"proxy_x_forwarded_host\": \"$host\", \"proxy_x_forwarded_proto\": \"$scheme\", \"proxy_host\": \"$proxy_host\", \"remote_addr\": \"$remote_addr\", \"remote_port\": \"$remote_port\", \"request\": \"$request\", \"request_method\": \"$request_method\", \"request_uri\": \"$request_uri\", \"request_protocol\": \"$server_protocol\", \"http_accept\": \"$http_accept\", \"http_accept_encoding\": \"$http_accept_encoding\", \"http_accept_language\": \"$http_accept_language\", \"http_connection\": \"$http_connection\", \"sent_http_connection\": \"$sent_http_connection\", \"http_host\": \"$http_host\", \"http_user_agent\": \"$http_user_agent\", \"http_x_forwarded_for\": \"$http_x_forwarded_for\", \"body_bytes_sent\": $body_bytes_sent, \"connection_requests\": $connection_requests, \"proxy_internal_body_length\": $proxy_internal_body_length, \"request_length\": $request_length, \"request_time\": $request_time, \"status\": $status, \"upstream_response_time\": [$upstream_response_time], \"upstream_response_length\": [$upstream_response_length], \"upstream_status\": [$upstream_status], \"gzip_ratio\": $gzip_ratio }";
+ ```
+
+ ```
+ <source>
+ @type tail
+ <parse>
+ @type nginx_jsonish
+ time_type string
+ time_format %s.%L
+ path <nginx access log file>
+ maps ([ "slashes", "nulls" ] automatically prepended to anything set here)
+ message_key (set to "request", by default)
+ [ standard parser or jsonish arguments ]
+ </parse>
  </source>
  ```
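As an aside on the configuration above: "$msec" produces a string such as "1531267200.123" (epoch seconds with a millisecond fraction). The sketch below shows, in plain Ruby, how such a value maps onto the Fluent::EventTime type that Fluentd 1.0 uses for record timestamps; the sample value is invented and this is illustration only, not plugin code.

```
require 'fluent/time'   # assumes the fluentd (>= 0.14) gem is installed

msec = '1531267200.123'              # the sort of string nginx's $msec produces
sec, frac = msec.split('.')
nsec = frac.to_s.ljust(9, '0').to_i  # pad the millisecond fraction out to nanoseconds

event_time = Fluent::EventTime.new(sec.to_i, nsec)
puts event_time.sec                  # => 1531267200
puts event_time.nsec                 # => 123000000
```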
 
@@ -94,10 +124,12 @@ The fluentd parser configuration for this input is straight-forward:
 
  ```
  <source>
- type tail
- format nodejs_bunyan
- path <application log file name>
- [ standard parser or jsonish arguments ]
+ @type tail
+ <parse>
+ @type nodejs_bunyan
+ path <application log file name>
+ [ standard parser or jsonish arguments ]
+ </parse>
  </source>
  ```
 
@@ -108,9 +140,11 @@ The fluentd parser configuration for this input is straight-forward:
 
  ```
  <source>
- type tail
- format logstash
- path <application log file name>
- [ standard parser or jsonish arguments ]
+ @type tail
+ <parse>
+ @type logstash
+ path <application log file name>
+ [ standard parser or jsonish arguments ]
+ </parse>
  </source>
  ```
@@ -10,7 +10,7 @@ Gem::Specification.new do |gem|
  gem.summary = %q{Input parser for records which require minor text processing before they can be parsed as JSON. Also allows names of standard Time parser methods to be passed as time_format arguments and sets a reasonable default (iso8601).}
  gem.homepage = 'https://github.com/bodhi-space/fluent-plugin-jsonish'
  gem.license = 'Apache-2.0'
- gem.add_runtime_dependency 'fluentd', '>= 0.10.0'
+ gem.add_runtime_dependency 'fluentd', '>= 0.14.0'
  gem.files = `git ls-files`.split("\n")
  gem.executables = gem.files.grep(%r{^bin/}) { |f| File.basename(f) }
  gem.test_files = gem.files.grep(%r{^(test|spec|features)/})
@@ -14,20 +14,13 @@ module Fluent
  # * '%y-%m-%dT%H:%M:%S%Z' (no fractional seconds)
  # * '%y-%m-%dT%H:%M:%S.%L%Z' (fractional seconds)
 
- def initialize(time_format)
-
- if not time_format.nil? and time_format.empty?
- # Set a reasonable default.
- time_format = 'iso8601'
- end
-
- if not time_format.nil? and not /%/.match(time_format)
- super(nil)
-
- @parser = Proc.new { |v| Time.method(time_format).call(v) }
+ def initialize(format)
 
+ if not /%/.match(format)
+ super()
+ @parse = ->(v){ Fluent::EventTime.from_time(Time.method(format).call(v)) }
  else
- super(time_format)
+ super(format)
  end
  end
  end
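For readers following the change above: when the supplied format contains no '%', the parser treats it as the name of a Time class method and dispatches to it by name. A minimal standalone sketch of that dispatch follows; the format and timestamp are made-up values, and this is not the plugin's code.

```
require 'time'   # provides Time.iso8601 among others

format = 'iso8601'                  # any Time class method name works the same way
value  = '2018-07-11T12:34:56.789Z' # made-up timestamp for illustration

# Same dispatch style as the lambda in the diff above: look the method up by
# name on the Time class and call it with the raw value.
parse = ->(v) { Time.method(format).call(v) }
t = parse.call(value)
puts t.utc.iso8601(3)               # => 2018-07-11T12:34:56.789Z
```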
@@ -54,27 +47,39 @@ module Fluent
 
  def configure(conf)
 
- if conf['time_format']
- # Remove the time_format key before the super call
- # so the it does as little as possible as possible
- # (ie. less that we'll have to override).
- tmp_time_format = conf['time_format']
- conf.delete('time_format')
+ if conf['time_format'] and not conf['time_format'].nil?
+ # Remove the time_format key before the super call so
+ # that it does as little as possible (i.e. there is
+ # less that we'll have to override).
+ tmp_time_format = conf.delete('time_format')
+ # This has to be set to string when time_format is set.
+ # Deleting it without deleting time_type will leave an
+ # invalid configuration.
+ tmp_time_type = conf.delete('time_type')
  end
 
  super(conf)
 
- @time_parser = StdFormatTimeParser.new(tmp_time_format)
+ # Overwrite the time parser unless the time_type is set
+ # to something other than string.
+ if not tmp_time_type.nil? and tmp_time_type == 'string'
+ @time_parser = StdFormatTimeParser.new(tmp_time_format)
+ # If these values are not restored, fluent will not
+ # show the value in trace/debug output.
+ conf['time_type'] = tmp_time_type
+ @time_type = tmp_time_type
+ conf['time_format'] = tmp_time_format
+ @time_format = tmp_time_format
+ elsif tmp_time_format.nil?
+ # The v1.0 time parser has a way to use the Time class
+ # method iso8601, but I would still like to be able to
+ # access any available Time methods in a generic way.
+ @time_parser = StdFormatTimeParser.new('iso8601')
+ @time_type = 'string'
+ @time_format = 'iso8601'
+ end
  @mutex = Mutex.new
 
- # This may look stupid (it actually is really stupid),
- # but this *must* be set back to a non-null string
- # prior to the return, since the superclass parser
- # method checks for this and them implements its
- # own ad hoc parser, in-line. This is a necessary
- # kludge to bypass a more egregious kludge.
- @time_format = 'ignore_me'
-
  @transforms = []
 
  @maps.each do |elem|
metadata CHANGED
@@ -1,13 +1,13 @@
  --- !ruby/object:Gem::Specification
  name: fluent-plugin-jsonish
  version: !ruby/object:Gem::Version
- hash: 19
+ hash: 13
  prerelease:
  segments:
- - 1
- - 0
  - 2
- version: 1.0.2
+ - 0
+ - 1
+ version: 2.0.1
  platform: ruby
  authors:
  - Alex Yamauchi
@@ -41,7 +41,7 @@ cert_chain:
  2Zk648Ep9HVPKmwoVuB75+xEQw==
  -----END CERTIFICATE-----
 
- date: 2017-08-17 00:00:00 Z
+ date: 2018-07-11 00:00:00 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: fluentd
@@ -51,12 +51,12 @@ dependencies:
  requirements:
  - - ">="
  - !ruby/object:Gem::Version
- hash: 55
+ hash: 39
  segments:
  - 0
- - 10
+ - 14
  - 0
- version: 0.10.0
+ version: 0.14.0
  type: :runtime
  version_requirements: *id001
  description: Input parser for records which require minor text processing before they can be parsed as JSON
metadata.gz.sig CHANGED
Binary file