fluent-plugin-azure-loganalytics 0.3.1 → 0.4.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
- SHA1:
- metadata.gz: b312ebea1de59bae4bc49210d65b1a8314edc225
- data.tar.gz: a73eaccd5388ade37c58eefba7ac8f6f7dc8d989
+ SHA256:
+ metadata.gz: 22b06d1614e45b6e4bd69d561fd9e1baa607aab324877c55279d68e6f95cdf89
+ data.tar.gz: 810e6437147632a77ec333e484282c320a75482c889d1cfe59d5e4410eb495e0
  SHA512:
- metadata.gz: 7efc7b88b43025f3d49a67178e1538df0b7e5970d8ebb69fcbf6578c432f0c5dd2f30c2fb99736b83de34d2e8b1095c1c3d27ae8bd581d8418de388f5f4a5807
- data.tar.gz: cc066b03aa2d1f3e836846fbc377aeeaceccca2e51e98de73270ab25657b77d6c5b6def784b17f5aed1cf4136585c9016c2255b0a137ab843ee13569cf17f516
+ metadata.gz: b17088b54b49a51b04d15545b7e15c16cef996ec4ac7c1ff2c1739e383a15f3064a11db2d7bc763833ef325790935c89dc11ff85589b56da9f3f700e3a69ade4
+ data.tar.gz: f80ef8d11245daf1bafa759dbfd26b3b55dbe6649eaa3bf6389c8a5f78a00c2bb76d5499c94051f5b8b5b7b540dd3249d7ec63885df763382faf8b6f85c25422
data/ChangeLog.md CHANGED
@@ -1,3 +1,7 @@
+ ## 0.4.0
+ * Add endpoint parameter for sovereign cloud - [PR#8](https://github.com/yokawasa/fluent-plugin-azure-loganalytics/pull/8)
+ * Changed dependency for azure-loganalytics-datacollector-api to `>= 0.1.5` - [PR#8](https://github.com/yokawasa/fluent-plugin-azure-loganalytics/pull/8)
+
  ## 0.3.1
 
  * Add requirements section - [PR#2](https://github.com/yokawasa/fluent-plugin-azure-loganalytics/pull/2)
data/README.md CHANGED
@@ -11,10 +11,20 @@
  | < 0.3.0 | >= v0.12.0 | >= 1.9 |
 
  ## Installation
+ ### Installing gems into system Ruby
  ```
  $ gem install fluent-plugin-azure-loganalytics
  ```
 
+ ### Installing gems into td-agent’s Ruby
+ If you installed td-agent and want to add this custom plugin, use td-agent-gem to install it. td-agent ships with its own Ruby, so the gem must be installed into td-agent’s Ruby, not the system Ruby:
+
+ ```
+ $ /usr/sbin/td-agent-gem install fluent-plugin-azure-loganalytics
+ ```
+ Please see also [I installed td-agent and want to add custom plugins. How do I do it?](https://docs.fluentd.org/v0.12/articles/faq#i-installed-td-agent-and-want-to-add-custom-plugins.-how-do-i-do-it?)
+
  ## Configuration
 
  ### Azure Log Analytics
@@ -33,6 +43,7 @@ Once you have the workspace, get Workspace ID and Shared Key (either Primary Key
  customer_id CUSTOMER_ID # Customer ID aka WorkspaceID String
  shared_key KEY_STRING # The primary or the secondary Connected Sources client authentication key
  log_type EVENT_TYPE_NAME # The name of the event type. ex) ApacheAccessLog
+ endpoint myendpoint
  add_time_field true
  time_field_name mytime
  time_format %s
@@ -45,6 +56,7 @@ Once you have the workspace, get Workspace ID and Shared Key (either Primary Key
  * **customer\_id (required)** - Your Operations Management Suite workspace ID
  * **shared\_key (required)** - The primary or the secondary Connected Sources client authentication key
  * **log\_type (required)** - The name of the event type that is being submitted to Log Analytics. log_type only supports alpha characters
+ * **endpoint (optional)** - Default:'ods.opinsights.azure.com'. The service endpoint. You may want to use this param for a sovereign cloud whose endpoint differs from the public cloud
  * **time\_generated\_field (optional)** - Default:''(empty string). The name of the time generated field. Be careful that the value of the field should strictly follow the ISO 8601 format (YYYY-MM-DDThh:mm:ssZ). See also [this](https://docs.microsoft.com/en-us/azure/log-analytics/log-analytics-data-collector-api#create-a-request) for more details
  * **add\_time\_field (optional)** - Default:true. This option allows inserting a time field into the record
  * **time\_field\_name (optional)** - Default:time. This is required only when add_time_field is true
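For illustration, the endpoint parameter only changes the host that records are posted to: the Data Collector API ingestion URI is composed from the workspace ID and the endpoint host. A minimal sketch, where `build_ingestion_uri` and the Azure US Government host are illustrative assumptions, not part of the plugin:

```ruby
# Hypothetical helper showing how the ingestion URI is composed from
# customer_id and the (optionally sovereign-cloud) endpoint host.
DEFAULT_ENDPOINT = 'ods.opinsights.azure.com'

def build_ingestion_uri(customer_id, endpoint = DEFAULT_ENDPOINT)
  "https://#{customer_id}.#{endpoint}/api/logs?api-version=2016-04-01"
end

# Public cloud (default endpoint):
puts build_ingestion_uri('818f7bbc-8034-4cc3-b97d-f068dd4cd658')
# A sovereign cloud only needs a different endpoint host:
puts build_ingestion_uri('818f7bbc-8034-4cc3-b97d-f068dd4cd658', 'ods.opinsights.azure.us')
```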
@@ -59,7 +71,7 @@ Once you have the workspace, get Workspace ID and Shared Key (either Primary Key
  fluent-plugin-azure-loganalytics adds **time** and **tag** attributes by default if **add_time_field** and **add_tag_field** are true respectively. Below are sample plugin configurations.
 
  ### (1) Default Configuration (No options)
- <u>fluent.conf</u>
+ <u>fluent_1.conf</u>
  ```
  <source>
  @type tail # input plugin
@@ -78,7 +90,33 @@ fluent-plugin-azure-loganalytics adds **time** and **tag** attributes by default
  ```
 
  ### (2) Configuration with All Options
- <u>fluent.conf</u>
+ <u>fluent_2.conf</u>
+ ```
+ <source>
+ @type tail # input plugin
+ path /var/log/apache2/access.log # monitoring file
+ pos_file /tmp/fluentd_pos_file # position file
+ format apache # format
+ tag azure-loganalytics.access # tag
+ </source>
+
+ <match azure-loganalytics.**>
+ @type azure-loganalytics
+ customer_id 818f7bbc-8034-4cc3-b97d-f068dd4cd658
+ shared_key ppC5500KzCcDsOKwM1yWUvZydCuC3m+ds/2xci0byeQr1G3E0Jkygn1N0Rxx/yVBUrDE2ok3vf4ksCzvBmQXHw==(dummy)
+ log_type ApacheAccessLog
+ add_time_field true
+ time_field_name mytime
+ time_format %s
+ localtime true
+ add_tag_field true
+ tag_field_name mytag
+ </match>
+ ```
+ ### (3) Configuration with Typecast filter
+
+ Add the typecast filter when you want to cast field types. In this example, the field types of code and size are cast by the typecast filter.
+ <u>fluent_typecast.conf</u>
  ```
  <source>
  @type tail # input plugin
@@ -88,6 +126,11 @@ fluent-plugin-azure-loganalytics adds **time** and **tag** attributes by default
  tag azure-loganalytics.access # tag
  </source>
 
+ <filter **>
+ @type typecast
+ types host:string,user:string,method:string,path:string,referer:string,agent:string,code:integer,size:integer
+ </filter>
+
  <match azure-loganalytics.**>
  @type azure-loganalytics
  customer_id 818f7bbc-8034-4cc3-b97d-f068dd4cd658
@@ -101,6 +144,54 @@ fluent-plugin-azure-loganalytics adds **time** and **tag** attributes by default
  tag_field_name mytag
  </match>
  ```
+ [note] You need to install [fluent-plugin-filter-typecast](https://github.com/sonots/fluent-plugin-filter_typecast) for the sample configuration above.
+ ```
+ gem install fluent-plugin-filter_typecast
+ ```
+ ### (4) Configuration with CSV format as input and specific field type as output
+ To send logs generated with a known delimiter (like a comma or semicolon) to Log Analytics, you can use fluentd's csv format together with the keys/types properties.
+ This can be used with any log; here it is demonstrated with a custom Nginx log.
+ <u>fluent_csv.conf</u>
+
+ Suppose your log is formatted as below in /etc/nginx/conf.d/log.conf:
+ ```
+ log_format appcustomlog '"$time_iso8601";"$hostname";$bytes_sent;$request_time;$upstream_response_length;$upstream_response_time;$content_length;"$remote_addr";$status;"$host";"$request";"$http_user_agent"';
+ ```
+ And this log is activated through /etc/nginx/conf.d/virtualhost.conf:
+ ```
+ server {
+ ...
+ access_log /var/log/nginx/access.log appcustomlog;
+ ...
+ }
+ ```
+ You can use the following configuration for the source to tail the log file and format it with the proper field types.
+ ```
+ <source>
+ @type tail
+ path /var/log/nginx/access.log
+ pos_file /var/log/td-agent/access.log.pos
+ tag nginx.accesslog
+ format csv
+ delimiter ;
+ keys time,hostname,bytes_sent,request_time,content_length,remote_addr,status,host,request,http_user_agent
+ types time:time,hostname:string,bytes_sent:float,request_time:float,content_length:string,remote_addr:string,status:integer,host:string,request:string,http_user_agent:string
+ time_key time
+ time_format %FT%T%z
+ </source>
+
+ <match nginx.accesslog>
+ @type azure-loganalytics
+ customer_id 818f7bbc-8034-4cc3-b97d-f068dd4cd658
+ shared_key ppC5500KzCcDsOKwM1yWUvZydCuC3m+ds/2xci0byeQr1G3E0Jkygn1N0Rxx/yVBUrDE2ok3vf4ksCzvBmQXHw==(dummy)
+ log_type NginxAccessLog
+ time_generated_field time
+ time_format %FT%T%z
+ add_tag_field true
+ tag_field_name mytag
+ </match>
+ ```
+
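The keys/types pair above is what performs the casting. A plain-Ruby sketch of the same idea, where `cast_fields` is a hypothetical stand-in for fluentd's converter, not its actual code:

```ruby
# Hypothetical stand-in for fluentd's `types` conversion: cast selected
# CSV string fields to integers/floats, leave the rest as strings.
CASTS = { 'status' => :integer, 'bytes_sent' => :float }

def cast_fields(record, casts = CASTS)
  record.map do |key, value|
    case casts[key]
    when :integer then [key, Integer(value)]
    when :float   then [key, Float(value)]
    else               [key, value]
    end
  end.to_h
end

row = { 'host' => 'mynginx.domain.com', 'status' => '200', 'bytes_sent' => '21381' }
p cast_fields(row)  # status becomes 200 (Integer), bytes_sent becomes 21381.0 (Float)
```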
 
  ## Sample inputs and expected records
 
@@ -117,9 +208,19 @@ The output record for sample input can be seen at Log Analytics portal like this
 
  ![fluent-plugin-azure-loganalytics output image](https://github.com/yokawasa/fluent-plugin-azure-loganalytics/raw/master/img/Azure-LogAnalytics-Output-Image.png)
 
+ <u>Sample Input (nginx custom access log)</u>
+ ```
+ "2017-12-13T11:31:59+00:00";"nginx0001";21381;0.238;20882;0.178;-;"193.192.35.178";200;"mynginx.domain.com";"GET /mysite/picture.jpeg HTTP/1.1";"Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/63.0.3239.84 Safari/537.36"
+ ```
+
+ <u>Output Record</u>
+
+ Part of the output record for the sample input can be seen at the Log Analytics portal like this, with fields of type _s (string) or _d (double):
+
+ ![fluent-plugin-azure-loganalytics output image](https://github.com/yokawasa/fluent-plugin-azure-loganalytics/raw/master/img/Azure-LogAnalytics-Output-Image-2.png)
 
  ## Tests
- ### Running test code
+ ### Running test code (using system rake)
  ```
  $ git clone https://github.com/yokawasa/fluent-plugin-azure-loganalytics.git
  $ cd fluent-plugin-azure-loganalytics
@@ -131,6 +232,18 @@ $ vi test/plugin/test_azure_loganalytics.rb
  $ rake test
  ```
 
+ ### Running test code (using td-agent's rake)
+ ```
+ $ git clone https://github.com/yokawasa/fluent-plugin-azure-loganalytics.git
+ $ cd fluent-plugin-azure-loganalytics
+
+ # edit CONFIG params of test/plugin/test_azure_loganalytics.rb
+ $ vi test/plugin/test_azure_loganalytics.rb
+
+ # run test
+ $ /opt/td-agent/embedded/bin/rake test
+ ```
+
  ### Creating package, running and testing locally
  ```
  $ rake build
@@ -148,9 +261,9 @@ $ ab -n 5 -c 2 http://localhost/test/foo.html
 
  ## Links
 
- * http://yokawasa.github.io/fluent-plugin-azure-loganalytics
  * https://rubygems.org/gems/fluent-plugin-azure-loganalytics
  * https://rubygems.org/gems/azure-loganalytics-datacollector-api
+ * [How to install td-agent and the fluent-plugin-azure-loganalytics plugin on RHEL](docs/install-tdagent-and-the-plugin-on-rhel.md)
 
  ## Contributing
 
data/VERSION CHANGED
@@ -1 +1 @@
- 0.3.1
+ 0.4.0
docs/install-tdagent-and-the-plugin-on-rhel.md ADDED
@@ -0,0 +1,68 @@
+ # How to install td-agent and the fluent-plugin-azure-loganalytics plugin on RHEL
+
+ This is a quick installation procedure for td-agent and the custom plugin (fluent-plugin-azure-loganalytics) on Red Hat Enterprise Linux (7.4).
+
+ $ cat /etc/os-release
+ ```
+ NAME="Red Hat Enterprise Linux Server"
+ VERSION="7.4 (Maipo)"
+ ID="rhel"
+ ID_LIKE="fedora"
+ VARIANT="Server"
+ VARIANT_ID="server"
+ VERSION_ID="7.4"
+ PRETTY_NAME="Red Hat Enterprise Linux Server 7.4 (Maipo)"
+ ANSI_COLOR="0;31"
+ CPE_NAME="cpe:/o:redhat:enterprise_linux:7.4:GA:server"
+ HOME_URL="https://www.redhat.com/"
+ BUG_REPORT_URL="https://bugzilla.redhat.com/"
+
+ REDHAT_BUGZILLA_PRODUCT="Red Hat Enterprise Linux 7"
+ REDHAT_BUGZILLA_PRODUCT_VERSION=7.4
+ REDHAT_SUPPORT_PRODUCT="Red Hat Enterprise Linux"
+ REDHAT_SUPPORT_PRODUCT_VERSION="7.4"
+ ```
+
+ ## 0. Prerequisites (for Red Hat/CentOS)
+ Install GCC and Development Tools on a CentOS / RHEL 7 server:
+ ```
+ $ sudo yum group install "Development Tools"
+ ```
+
+ ## 1. Install td-agent (fluentd)
+
+ Following the [fluentd official page](https://docs.fluentd.org/v0.12/articles/install-by-rpm), install like this:
+
+ ```
+ $ curl -L https://toolbelt.treasuredata.com/sh/install-redhat-td-agent2.sh | sh
+
+ $ td-agent --version
+ td-agent 0.12.40
+ ```
+
+ ## 2. Launching Daemon
+ ```
+ $ sudo /etc/init.d/td-agent start
+ $ sudo /etc/init.d/td-agent status
+ ```
+ ## 3. Post Sample Logs via HTTP
+ By default, /etc/td-agent/td-agent.conf is configured to take logs from HTTP and route them to stdout (/var/log/td-agent/td-agent.log). You can post sample log records using the curl command.
+
+ ```
+ $ curl -X POST -d 'json={"json":"message"}' http://localhost:8888/debug.test
+
+ # Check the log (/var/log/td-agent/td-agent.log) to see if the record was written
+ $ cat /var/log/td-agent/td-agent.log
+ ```
+
+ ## 4. Install the custom plugin
+ ```
+ $ sudo /usr/sbin/td-agent-gem install fluent-plugin-azure-loganalytics
+ ```
+
+ ## 5. Testing the plugin
+ ```
+ $ git clone https://github.com/yokawasa/fluent-plugin-azure-loganalytics.git
+ $ cd fluent-plugin-azure-loganalytics
+ $ /opt/td-agent/embedded/bin/rake test
+ ```
examples/fluent_csv.conf ADDED
@@ -0,0 +1,23 @@
+ <source>
+ @type tail # input plugin
+ path /var/log/nginx/access.log # monitoring file
+ pos_file /var/log/td-agent/access.log.pos # position file
+ format csv # format
+ tag nginx.accesslog # tag
+ delimiter ; # record delimiter used in source log
+ keys time,hostname,bytes_sent,request_time,content_length,remote_addr,status,host,request,http_user_agent
+ types time:time,hostname:string,bytes_sent:float,request_time:float,content_length:string,remote_addr:string,status:integer,host:string,request:string,http_user_agent:string
+ time_key time
+ time_format %FT%T%z
+ </source>
+
+ <match nginx.accesslog>
+ @type azure-loganalytics
+ customer_id CUSTOMER_ID # Customer ID aka WorkspaceID String
+ shared_key KEY_STRING # The primary or the secondary Connected Sources client authentication key
+ log_type EVENT_TYPE_NAME # The name of the event type. ex) NginxAccessLog
+ time_generated_field time
+ time_format %FT%T%z
+ add_tag_field true
+ tag_field_name mytag
+ </match>
examples/fluent_typecast.conf ADDED
@@ -0,0 +1,25 @@
+ <source>
+ @type tail # input plugin
+ path /var/log/apache2/access.log # monitoring file
+ pos_file /tmp/fluentd_pos_file # position file
+ format apache # format
+ tag azure-loganalytics.access # tag
+ </source>
+
+ <filter **>
+ @type typecast
+ types host:string,user:string,method:string,path:string,referer:string,agent:string,code:integer,size:integer
+ </filter>
+
+ <match azure-loganalytics.**>
+ @type azure-loganalytics
+ customer_id CUSTOMER_ID # Customer ID aka WorkspaceID String
+ shared_key KEY_STRING # The primary or the secondary Connected Sources client authentication key
+ log_type EVENT_TYPE_NAME # The name of the event type. ex) ApacheAccessLog
+ add_time_field true
+ time_field_name mytime
+ time_format %s
+ localtime true
+ add_tag_field true
+ tag_field_name mytag
+ </match>
fluent-plugin-azure-loganalytics.gemspec CHANGED
@@ -20,7 +20,7 @@ Gem::Specification.new do |gem|
 
  gem.add_dependency "fluentd", [">= 0.14.15", "< 2"]
  gem.add_dependency "rest-client"
- gem.add_dependency "azure-loganalytics-datacollector-api", [">= 0.1.2"]
+ gem.add_dependency "azure-loganalytics-datacollector-api", [">= 0.1.5"]
  gem.add_development_dependency "bundler", "~> 1.11"
  gem.add_development_dependency "rake", "~> 10.0"
  gem.add_development_dependency "test-unit"
lib/fluent/plugin/out_azure-loganalytics.rb CHANGED
@@ -16,6 +16,8 @@ module Fluent::Plugin
  :desc => "Your Operations Management Suite workspace ID"
  config_param :shared_key, :string, :secret => true,
  :desc => "The primary or the secondary Connected Sources client authentication key"
+ config_param :endpoint, :string, :default => 'ods.opinsights.azure.com',
+ :desc => "The service endpoint"
  config_param :log_type, :string,
  :desc => "The name of the event type that is being submitted to Log Analytics. log_type only supports alpha characters"
  config_param :time_generated_field, :string, :default => '',
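For context on what the client does with shared_key: the Data Collector API signs each POST with HMAC-SHA256 over a canonical string and sends it in a SharedKey authorization header. A self-contained sketch of that documented scheme follows; it mirrors Microsoft's public docs, not the azure-loganalytics-datacollector-api gem's actual code, and the workspace ID and key below are dummies:

```ruby
require 'base64'
require 'openssl'

# Sketch of the Data Collector API authorization signature (per Microsoft's
# published scheme); not the gem's actual implementation.
def build_signature(shared_key, rfc1123_date, body_length)
  string_to_sign =
    "POST\n#{body_length}\napplication/json\nx-ms-date:#{rfc1123_date}\n/api/logs"
  decoded_key = Base64.decode64(shared_key)  # shared_key is base64-encoded
  hmac = OpenSSL::HMAC.digest(OpenSSL::Digest.new('sha256'), decoded_key, string_to_sign)
  Base64.strict_encode64(hmac)
end

workspace_id = '818f7bbc-8034-4cc3-b97d-f068dd4cd658'   # dummy workspace ID
dummy_key = Base64.strict_encode64('dummy-shared-key')  # dummy key for illustration
signature = build_signature(dummy_key, 'Thu, 06 Jun 2019 00:00:00 GMT', 42)
authorization = "SharedKey #{workspace_id}:#{signature}"
```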
@@ -59,7 +61,7 @@ module Fluent::Plugin
  def start
  super
  # start
- @client=Azure::Loganalytics::Datacollectorapi::Client::new(@customer_id,@shared_key)
+ @client=Azure::Loganalytics::Datacollectorapi::Client::new(@customer_id,@shared_key,@endpoint)
  end
 
  def shutdown
test/plugin/test_azure_loganalytics.rb CHANGED
@@ -22,18 +22,16 @@ class AzureLogAnalyticsOutputTest < Test::Unit::TestCase
 
  def test_configure
  d = create_driver
- assert_equal '<Customer ID aka WorkspaceID String>', d.instance.customer_id
- assert_equal '<Primary Key String>', d.instance.shared_key
  assert_equal 'ApacheAccessLog', d.instance.log_type
- assert_true d.instance.add_time_field
- assert_true d.instance.localtime
- assert_true d.instance.add_tag_field
+ assert_equal true, d.instance.add_time_field
+ assert_equal true, d.instance.localtime
+ assert_equal true, d.instance.add_tag_field
  assert_equal 'tag', d.instance.tag_field_name
  end
 
  def test_format
  d = create_driver
- time = event_time("2011-01-02 13:14:15 UTC")
+ time = event_time("2017-11-24 01:14:15 UTC")
  d.run(default_tag: 'test') do
  d.feed(time, {"a"=>1})
  d.feed(time, {"a"=>2})
@@ -57,9 +55,9 @@ class AzureLogAnalyticsOutputTest < Test::Unit::TestCase
  d.feed(
  time,
  {
- :Log_ID => "5cdad72f-c848-4df0-8aaa-ffe033e75d57",
- :date => "2016-12-10 09:44:32 JST",
- :processing_time => "372",
+ :Log_ID => "5cdad72a-c848-4df0-8aaa-ffe033e75d57",
+ :date => "2017-11-24 01:44:32 JST",
+ :processing_time => 372,
  :remote => "101.202.74.59",
  :user => "-",
  :method => "GET / HTTP/1.1",
@@ -67,15 +65,15 @@ class AzureLogAnalyticsOutputTest < Test::Unit::TestCase
  :size => "-",
  :referer => "-",
  :agent => "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.7; rv:27.0) Gecko/20100101 Firefox/27.0",
- :eventtime => "2016-12-10T09:44:32Z"
+ :eventtime => "2017-11-24T01:44:32Z"
  })
 
  d.feed(
  time,
  {
- :Log_ID => "7260iswx-8034-4cc3-uirtx-f068dd4cd659",
- :date => "2016-12-10 09:45:14 JST",
- :processing_time => "105",
+ :Log_ID => "7260iswa-8034-4cc3-uirtx-f068dd4cd659",
+ :date => "2017-11-24 01:45:14 JST",
+ :processing_time => 105,
  :remote => "201.78.74.59",
  :user => "-",
  :method => "GET /manager/html HTTP/1.1",
@@ -83,7 +81,7 @@ class AzureLogAnalyticsOutputTest < Test::Unit::TestCase
  :size => "-",
  :referer => "-",
  :agent => "Mozilla/5.0 (Windows NT 5.1; rv:5.0) Gecko/20100101 Firefox/5.0",
- :eventtime => "2016-12-10T09:45:14Z"
+ :eventtime => "2017-11-24T01:45:14Z"
  })
  end
  data = d.events
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: fluent-plugin-azure-loganalytics
  version: !ruby/object:Gem::Version
- version: 0.3.1
+ version: 0.4.0
  platform: ruby
  authors:
  - Yoichi Kawasaki
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2017-09-05 00:00:00.000000000 Z
+ date: 2019-06-06 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: fluentd
@@ -50,14 +50,14 @@ dependencies:
  requirements:
  - - ">="
  - !ruby/object:Gem::Version
- version: 0.1.2
+ version: 0.1.5
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
  - !ruby/object:Gem::Version
- version: 0.1.2
+ version: 0.1.5
  - !ruby/object:Gem::Dependency
  name: bundler
  requirement: !ruby/object:Gem::Requirement
@@ -113,10 +113,14 @@ files:
  - README.md
  - Rakefile
  - VERSION
+ - docs/install-tdagent-and-the-plugin-on-rhel.md
  - examples/fluent_1.conf
  - examples/fluent_2.conf
+ - examples/fluent_csv.conf
+ - examples/fluent_typecast.conf
  - fluent-plugin-azure-loganalytics.gemspec
  - img/Azure-LogAnalytics-Fluentd.png
+ - img/Azure-LogAnalytics-Output-Image-2.png
  - img/Azure-LogAnalytics-Output-Image.png
  - lib/fluent/plugin/out_azure-loganalytics.rb
  - test/helper.rb
@@ -140,8 +144,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  - !ruby/object:Gem::Version
  version: '0'
  requirements: []
- rubyforge_project:
- rubygems_version: 2.5.2
+ rubygems_version: 3.0.3
  signing_key:
  specification_version: 4
  summary: Azure Log Analytics output plugin for Fluentd