fluent-plugin-azure-loganalytics 0.3.1 → 0.6.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
- SHA1:
- metadata.gz: b312ebea1de59bae4bc49210d65b1a8314edc225
- data.tar.gz: a73eaccd5388ade37c58eefba7ac8f6f7dc8d989
+ SHA256:
+ metadata.gz: c92e31fa9c5e7ae66bd87d98786f115195b1ca029bf3446ed12f80bc0f67ba11
+ data.tar.gz: e50d481460b64d351080b96ae1aa750792e139f9f9600034bc9cd46643a5393e
  SHA512:
- metadata.gz: 7efc7b88b43025f3d49a67178e1538df0b7e5970d8ebb69fcbf6578c432f0c5dd2f30c2fb99736b83de34d2e8b1095c1c3d27ae8bd581d8418de388f5f4a5807
- data.tar.gz: cc066b03aa2d1f3e836846fbc377aeeaceccca2e51e98de73270ab25657b77d6c5b6def784b17f5aed1cf4136585c9016c2255b0a137ab843ee13569cf17f516
+ metadata.gz: 5e110eb72a5cd2a385e25ab8e86bd18359b064b921e853ddb8eecfc0dd877a4504e70f2407cd18eede7e8cfea9a2a96d740dcabfd605357c0043b15f126ed979
+ data.tar.gz: 7aafd0e8c1071ecc237b24307d253f9628304efabc339f86123eb1a0947eb7baf0c283eadc07a4c2ea77884b2b2265e9f9eada4f5fa204cf0ef8399b45a8b0a0
@@ -1,3 +1,22 @@
+ ## 0.6.0
+ * Change base [azure-loganalytics-datacollector-api](https://github.com/yokawasa/azure-log-analytics-data-collector) dependency to ">= 0.4.0"
+
+ ## 0.5.0
+
+ * Support setting the [x-ms-AzureResourceId](https://docs.microsoft.com/en-us/azure/azure-monitor/platform/data-collector-api#request-headers) header - [issue #17](https://github.com/yokawasa/fluent-plugin-azure-loganalytics/issues/17)
+
+ ## 0.4.2
+ * Fix CVE-2020-8130 - [issue #13](https://github.com/yokawasa/fluent-plugin-azure-loganalytics/issues/13)
+
+ ## 0.4.1
+
+ * Use `yajl` instead of the default JSON encoder to fix logging exceptions - [PR#10](https://github.com/yokawasa/fluent-plugin-azure-loganalytics/pull/10)
+
+ ## 0.4.0
+
+ * Add endpoint parameter for sovereign clouds - [PR#8](https://github.com/yokawasa/fluent-plugin-azure-loganalytics/pull/8)
+ * Change dependency for azure-loganalytics-datacollector-api to `>= 0.1.5` - [PR#8](https://github.com/yokawasa/fluent-plugin-azure-loganalytics/pull/8)
+
  ## 0.3.1

  * Add requirements section - [PR#2](https://github.com/yokawasa/fluent-plugin-azure-loganalytics/pull/2)
data/README.md CHANGED
@@ -11,10 +11,20 @@
  | < 0.3.0 | >= v0.12.0 | >= 1.9 |

  ## Installation
+ ### Installing gems into system Ruby
  ```
  $ gem install fluent-plugin-azure-loganalytics
  ```

+ ### Installing gems into td-agent’s Ruby
+ If you installed td-agent and want to add this custom plugin, install it with td-agent-gem. td-agent bundles its own Ruby, so the gem must be installed into td-agent’s Ruby, not the system Ruby:
+
+ ```
+ $ /usr/sbin/td-agent-gem install fluent-plugin-azure-loganalytics
+ ```
+ Please see also [I installed td-agent and want to add custom plugins. How do I do it?](https://docs.fluentd.org/v0.12/articles/faq#i-installed-td-agent-and-want-to-add-custom-plugins.-how-do-i-do-it?)
+
+
  ## Configuration

  ### Azure Log Analytics
@@ -33,6 +43,7 @@ Once you have the workspace, get Workspace ID and Shared Key (either Primary Key
  customer_id CUSTOMER_ID # Customer ID aka WorkspaceID String
  shared_key KEY_STRING # The primary or the secondary Connected Sources client authentication key
  log_type EVENT_TYPE_NAME # The name of the event type. ex) ApacheAccessLog
+ endpoint myendpoint
  add_time_field true
  time_field_name mytime
  time_format %s
@@ -45,7 +56,10 @@ Once you have the workspace, get Workspace ID and Shared Key (either Primary Key
  * **customer\_id (required)** - Your Operations Management Suite workspace ID
  * **shared\_key (required)** - The primary or the secondary Connected Sources client authentication key
  * **log\_type (required)** - The name of the event type that is being submitted to Log Analytics. log_type only supports alpha characters
+ * **endpoint (optional)** - Default: 'ods.opinsights.azure.com'. The service endpoint. You may want to use this parameter for a sovereign cloud whose endpoint differs from the public cloud
  * **time\_generated\_field (optional)** - Default: '' (empty string). The name of the time generated field. Be careful that the value of the field strictly follows the ISO 8601 format (YYYY-MM-DDThh:mm:ssZ). See also [this](https://docs.microsoft.com/en-us/azure/log-analytics/log-analytics-data-collector-api#create-a-request) for more details
+ * **azure\_resource\_id (optional)** - Default: '' (empty string). The resource ID of the Azure resource the data should be associated with. This populates the [_ResourceId](https://docs.microsoft.com/en-us/azure/azure-monitor/platform/log-standard-properties#_resourceid) property and allows the data to be included in [resource-context](https://docs.microsoft.com/en-us/azure/azure-monitor/platform/design-logs-deployment#access-mode) queries in Azure Log Analytics (Azure Monitor). If this field isn't specified, the data will not be included in resource-context queries. The format should be /subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/{resourceProviderNamespace}/{resourceType}/{resourceName}. Please see [this](https://docs.microsoft.com/en-us/azure/azure-resource-manager/templates/template-functions-resource#resourceid) for more detail on the resource ID format.
+
  * **add\_time\_field (optional)** - Default: true. This option allows inserting a time field into each record
  * **time\_field\_name (optional)** - Default: time. This is required only when add_time_field is true
  * **localtime (optional)** - Default: false. The time record is inserted in UTC (Coordinated Universal Time) by default. This option allows using local time if you set localtime to true. This is valid only when add_time_field is true
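The customer_id and shared_key above are ultimately used by the underlying azure-loganalytics-datacollector-api gem to sign each POST to the HTTP Data Collector API with HMAC-SHA256. Here is a minimal Ruby sketch of that documented signature scheme; the method name `build_signature` and the sample values are illustrative, not the gem's internals:

```ruby
require 'openssl'
require 'time'

# Build the Data Collector API Authorization header value from a workspace ID
# (customer_id) and a base64-encoded shared key. Per the documented scheme,
# the signed string covers the verb, body length, content type, the x-ms-date
# header value, and the fixed /api/logs resource.
def build_signature(customer_id, shared_key, body_length, date_rfc1123)
  string_to_hash = "POST\n#{body_length}\napplication/json\n" \
                   "x-ms-date:#{date_rfc1123}\n/api/logs"
  decoded_key = shared_key.unpack1('m')            # base64-decode the shared key
  hmac = OpenSSL::HMAC.digest('sha256', decoded_key, string_to_hash)
  "SharedKey #{customer_id}:#{[hmac].pack('m0')}"  # 'm0' = strict base64 encode
end

date = Time.now.httpdate  # RFC 1123 date for the x-ms-date header
auth = build_signature('818f7bbc-8034-4cc3-b97d-f068dd4cd658',
                       ['dummy-key'].pack('m0'), 42, date)
puts auth
```

In practice the plugin and its API gem handle this for you; the sketch only shows why the shared_key must stay secret and why clock skew can cause authentication failures.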
@@ -59,7 +73,7 @@ Once you have the workspace, get Workspace ID and Shared Key (either Primary Key
  fluent-plugin-azure-loganalytics adds **time** and **tag** attributes by default if **add_time_field** and **add_tag_field** are true respectively. Below are two types of plugin configuration - default and with all options.

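The time/tag enrichment just described can be sketched in plain Ruby; this is only an illustration under the plugin's documented defaults, and `enrich` is an illustrative helper name, not the plugin's API:

```ruby
require 'time'

# Mimic add_time_field / add_tag_field: merge a formatted timestamp and the
# fluentd tag into each record before it is posted to Log Analytics.
def enrich(record, tag, unix_time, time_field_name: 'time',
           tag_field_name: 'tag', time_format: '%s', localtime: false)
  t = Time.at(unix_time)
  t = t.utc unless localtime  # UTC by default, local time when localtime=true
  record.merge(time_field_name => t.strftime(time_format),
               tag_field_name  => tag)
end

rec = enrich({ 'a' => 1 }, 'azure-loganalytics.access', 1_511_485_855)
puts rec.inspect
```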
  ### (1) Default Configuration (No options)
- <u>fluent.conf</u>
+ <u>fluent_1.conf</u>
  ```
  <source>
  @type tail # input plugin
@@ -78,7 +92,34 @@ fluent-plugin-azure-loganalytics adds **time** and **tag** attributes by default
  ```

  ### (2) Configuration with All Options
- <u>fluent.conf</u>
+ <u>fluent_2.conf</u>
+ ```
+ <source>
+ @type tail # input plugin
+ path /var/log/apache2/access.log # monitoring file
+ pos_file /tmp/fluentd_pos_file # position file
+ format apache # format
+ tag azure-loganalytics.access # tag
+ </source>
+
+ <match azure-loganalytics.**>
+ @type azure-loganalytics
+ customer_id 818f7bbc-8034-4cc3-b97d-f068dd4cd658
+ shared_key ppC5500KzCcDsOKwM1yWUvZydCuC3m+ds/2xci0byeQr1G3E0Jkygn1N0Rxx/yVBUrDE2ok3vf4ksCzvBmQXHw==(dummy)
+ log_type ApacheAccessLog
+ azure_resource_id /subscriptions/11111111-1111-1111-1111-111111111111/resourceGroups/otherResourceGroup/providers/Microsoft.Storage/storageAccounts/examplestorage
+ add_time_field true
+ time_field_name mytime
+ time_format %s
+ localtime true
+ add_tag_field true
+ tag_field_name mytag
+ </match>
+ ```
+ ### (3) Configuration with Typecast filter
+
+ Add the typecast filter when you want to cast field types. In this example, the field types of code and size are cast by the typecast filter.
+ <u>fluent_typecast.conf</u>
  ```
  <source>
  @type tail # input plugin
@@ -88,6 +129,11 @@ fluent-plugin-azure-loganalytics adds **time** and **tag** attributes by default
  tag azure-loganalytics.access # tag
  </source>

+ <filter **>
+ @type typecast
+ types host:string,user:string,method:string,path:string,referer:string,agent:string,code:integer,size:integer
+ </filter>
+
  <match azure-loganalytics.**>
  @type azure-loganalytics
  customer_id 818f7bbc-8034-4cc3-b97d-f068dd4cd658
@@ -101,6 +147,54 @@ fluent-plugin-azure-loganalytics adds **time** and **tag** attributes by default
  tag_field_name mytag
  </match>
  ```
+ [Note] You need to install [fluent-plugin-filter-typecast](https://github.com/sonots/fluent-plugin-filter_typecast) for the sample configuration above:
+ ```
+ gem install fluent-plugin-filter_typecast
+ ```
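Conceptually, the typecast filter rewrites string field values into the declared types before the match stage runs. A rough stand-alone illustration of that casting in Ruby (the `TYPES` map and `typecast` method are illustrative, not the filter plugin's code):

```ruby
# Cast selected fields of an apache-parsed record from String to Integer,
# mirroring the `types code:integer,size:integer` part of the setting above.
TYPES = { 'code' => :integer, 'size' => :integer }.freeze

def typecast(record)
  record.to_h do |key, value|
    case TYPES[key]
    when :integer then [key, value.to_i]
    else [key, value]           # untyped fields pass through unchanged
    end
  end
end

rec = typecast({ 'host' => '127.0.0.1', 'code' => '200', 'size' => '777' })
puts rec.inspect
```

Without such a cast, `code` and `size` would arrive in Log Analytics as strings (`_s` fields) rather than numbers.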
+ ### (4) Configuration with CSV format as input and specific field types as output
+ When you want to send Log Analytics logs generated with a known delimiter (like a comma or semicolon), you can use the csv format of fluentd together with the keys/types properties.
+ This can be used with any log; here it is shown with a custom Nginx log.
+ <u>fluent_csv.conf</u>
+
+ Suppose your log is formatted as below in /etc/nginx/conf.d/log.conf:
+ ```
+ log_format appcustomlog '"$time_iso8601";"$hostname";$bytes_sent;$request_time;$upstream_response_length;$upstream_response_time;$content_length;"$remote_addr";$status;"$host";"$request";"$http_user_agent"';
+ ```
+ And this log is activated through /etc/nginx/conf.d/virtualhost.conf:
+ ```
+ server {
+ ...
+ access_log /var/log/nginx/access.log appcustomlog;
+ ...
+ }
+ ```
+ You can use the following configuration for the source to tail the log file and format it with the proper field types.
+ ```
+ <source>
+ @type tail
+ path /var/log/nginx/access.log
+ pos_file /var/log/td-agent/access.log.pos
+ tag nginx.accesslog
+ format csv
+ delimiter ;
+ keys time,hostname,bytes_sent,request_time,content_length,remote_addr,status,host,request,http_user_agent
+ types time:time,hostname:string,bytes_sent:float,request_time:float,content_length:string,remote_addr:string,status:integer,host:string,request:string,http_user_agent:string
+ time_key time
+ time_format %FT%T%z
+ </source>
+
+ <match nginx.accesslog>
+ @type azure-loganalytics
+ customer_id 818f7bbc-8034-4cc3-b97d-f068dd4cd658
+ shared_key ppC5500KzCcDsOKwM1yWUvZydCuC3m+ds/2xci0byeQr1G3E0Jkygn1N0Rxx/yVBUrDE2ok3vf4ksCzvBmQXHw==(dummy)
+ log_type NginxAcessLog
+ time_generated_field time
+ time_format %FT%T%z
+ add_tag_field true
+ tag_field_name mytag
+ </match>
+ ```
+
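The source section above effectively splits each semicolon-delimited line and coerces the fields per the `types` map. A rough Ruby equivalent of that parse-and-cast step, shown on a shortened line (a sketch only; `cast` and `FIELD_TYPES` are illustrative names, not fluentd internals):

```ruby
require 'time'

# Type map mirroring part of the `types ...` line of the source section.
FIELD_TYPES = { 'bytes_sent' => :float, 'request_time' => :float,
                'status' => :integer, 'time' => :time }.freeze

def cast(key, value)
  case FIELD_TYPES[key]
  when :float   then value.to_f
  when :integer then value.to_i
  when :time    then Time.strptime(value, '%FT%T%z')
  else value                     # untyped fields stay strings
  end
end

keys = %w[time hostname bytes_sent request_time status]
line = '"2017-12-13T11:31:59+00:00";"nginx0001";21381;0.238;200'
# Split on the delimiter and strip surrounding double quotes (a crude
# stand-in for fluentd's csv parser).
row = line.split(';').map { |f| f.delete_prefix('"').delete_suffix('"') }
record = keys.zip(row).to_h { |k, v| [k, cast(k, v)] }
puts record.inspect
```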

  ## Sample inputs and expected records

@@ -117,9 +211,19 @@ The output record for sample input can be seen at Log Analytics portal like this

  ![fluent-plugin-azure-loganalytics output image](https://github.com/yokawasa/fluent-plugin-azure-loganalytics/raw/master/img/Azure-LogAnalytics-Output-Image.png)

+ <u>Sample Input (nginx custom access log)</u>
+ ```
+ "2017-12-13T11:31:59+00:00";"nginx0001";21381;0.238;20882;0.178;-;"193.192.35.178";200;"mynginx.domain.com";"GET /mysite/picture.jpeg HTTP/1.1";"Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/63.0.3239.84 Safari/537.36"
+ ```
+
+ <u>Output Record</u>
+
+ Part of the output record for the sample input can be seen at the Log Analytics portal like this, with fields of type _s (string) or _d (double):
+
+ ![fluent-plugin-azure-loganalytics output image](https://github.com/yokawasa/fluent-plugin-azure-loganalytics/raw/master/img/Azure-LogAnalytics-Output-Image-2.png)

  ## Tests
- ### Running test code
+ ### Running test code (using system rake)
  ```
  $ git clone https://github.com/yokawasa/fluent-plugin-azure-loganalytics.git
  $ cd fluent-plugin-azure-loganalytics
@@ -131,6 +235,18 @@ $ vi test/plugin/test_azure_loganalytics.rb
  $ rake test
  ```

+ ### Running test code (using td-agent's rake)
+ ```
+ $ git clone https://github.com/yokawasa/fluent-plugin-azure-loganalytics.git
+ $ cd fluent-plugin-azure-loganalytics
+
+ # edit CONFIG params of test/plugin/test_azure_loganalytics.rb
+ $ vi test/plugin/test_azure_loganalytics.rb
+
+ # run test
+ $ /opt/td-agent/embedded/bin/rake test
+ ```
+
  ### Creating package, running and testing locally
  ```
  $ rake build
@@ -148,9 +264,9 @@ $ ab -n 5 -c 2 http://localhost/test/foo.html

  ## Links

- * http://yokawasa.github.io/fluent-plugin-azure-loganalytics
  * https://rubygems.org/gems/fluent-plugin-azure-loganalytics
  * https://rubygems.org/gems/azure-loganalytics-datacollector-api
+ * [How to install td-agent and fluent-plugin-azure-loganalytics plugin on RHEL](docs/install-tdagent-and-the-plugin-on-rhel.md)

  ## Contributing

data/VERSION CHANGED
@@ -1 +1 @@
- 0.3.1
+ 0.6.0
@@ -0,0 +1,68 @@
+ # How to install td-agent and fluent-plugin-azure-loganalytics plugin on RHEL
+
+ This is a quick installation procedure for td-agent and the custom plugin (fluent-plugin-azure-loganalytics) on Red Hat Enterprise Linux (7.4).
+
+ $ cat /etc/os-release
+ ```
+ NAME="Red Hat Enterprise Linux Server"
+ VERSION="7.4 (Maipo)"
+ ID="rhel"
+ ID_LIKE="fedora"
+ VARIANT="Server"
+ VARIANT_ID="server"
+ VERSION_ID="7.4"
+ PRETTY_NAME="Red Hat Enterprise Linux Server 7.4 (Maipo)"
+ ANSI_COLOR="0;31"
+ CPE_NAME="cpe:/o:redhat:enterprise_linux:7.4:GA:server"
+ HOME_URL="https://www.redhat.com/"
+ BUG_REPORT_URL="https://bugzilla.redhat.com/"
+
+ REDHAT_BUGZILLA_PRODUCT="Red Hat Enterprise Linux 7"
+ REDHAT_BUGZILLA_PRODUCT_VERSION=7.4
+ REDHAT_SUPPORT_PRODUCT="Red Hat Enterprise Linux"
+ REDHAT_SUPPORT_PRODUCT_VERSION="7.4"
+ ```
+
+ ## 0. Prerequisites (for RHEL/CentOS)
+ Install GCC and the Development Tools on a CentOS / RHEL 7 server:
+ ```
+ $ sudo yum group install "Development Tools"
+ ```
+
+ ## 1. Install td-agent (fluentd)
+
+ Following the [fluentd official page](https://docs.fluentd.org/v0.12/articles/install-by-rpm), install like this:
+
+ ```
+ $ curl -L https://toolbelt.treasuredata.com/sh/install-redhat-td-agent2.sh | sh
+
+ $ td-agent --version
+ td-agent 0.12.40
+ ```
+
+ ## 2. Launching Daemon
+ ```
+ $ sudo /etc/init.d/td-agent start
+ $ sudo /etc/init.d/td-agent status
+ ```
+ ## 3. Post Sample Logs via HTTP
+ By default, /etc/td-agent/td-agent.conf is configured to take logs from HTTP and route them to stdout (/var/log/td-agent/td-agent.log). You can post sample log records using the curl command.
+
+ ```
+ $ curl -X POST -d 'json={"json":"message"}' http://localhost:8888/debug.test
+
+ # Check the log (/var/log/td-agent/td-agent.log) to see if the record was written
+ $ cat /var/log/td-agent/td-agent.log
+ ```
+
+ ## 4. Install the custom plugin
+ ```
+ $ sudo /usr/sbin/td-agent-gem install fluent-plugin-azure-loganalytics
+ ```
+
+ ## 5. Testing the plugin
+ ```
+ $ git clone https://github.com/yokawasa/fluent-plugin-azure-loganalytics.git
+ $ cd fluent-plugin-azure-loganalytics
+ $ /opt/td-agent/embedded/bin/rake test
+ ```
@@ -11,6 +11,7 @@
  customer_id CUSTOMER_ID # Customer ID aka WorkspaceID String
  shared_key KEY_STRING # The primary or the secondary Connected Sources client authentication key
  log_type EVENT_TYPE_NAME # The name of the event type. ex) ApacheAccessLog
+ azure_resource_id RESOURCE_ID # format: /subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/{resourceProviderNamespace}/{resourceType}/{resourceName}
  add_time_field true
  time_field_name mytime
  time_format %s
@@ -0,0 +1,23 @@
+ <source>
+ @type tail # input plugin
+ path /var/log/nginx/access.log # monitoring file
+ pos_file /var/log/td-agent/access.log.pos # position file
+ format csv # format
+ tag nginx.accesslog # tag
+ delimiter ; # record delimiter used in source log
+ keys time,hostname,bytes_sent,request_time,content_length,remote_addr,status,host,request,http_user_agent
+ types time:time,hostname:string,bytes_sent:float,request_time:float,content_length:string,remote_addr:string,status:integer,host:string,request:string,http_user_agent:string
+ time_key time
+ time_format %FT%T%z
+ </source>
+
+ <match nginx.accesslog>
+ @type azure-loganalytics
+ customer_id CUSTOMER_ID # Customer ID aka WorkspaceID String
+ shared_key KEY_STRING # The primary or the secondary Connected Sources client authentication key
+ log_type EVENT_TYPE_NAME # The name of the event type. ex) NginxAcessLog
+ time_generated_field time
+ time_format %FT%T%z
+ add_tag_field true
+ tag_field_name mytag
+ </match>
@@ -0,0 +1,25 @@
+ <source>
+ @type tail # input plugin
+ path /var/log/apache2/access.log # monitoring file
+ pos_file /tmp/fluentd_pos_file # position file
+ format apache # format
+ tag azure-loganalytics.access # tag
+ </source>
+
+ <filter **>
+ @type typecast
+ types host:string,user:string,method:string,path:string,referer:string,agent:string,code:integer,size:integer
+ </filter>
+
+ <match azure-loganalytics.**>
+ @type azure-loganalytics
+ customer_id CUSTOMER_ID # Customer ID aka WorkspaceID String
+ shared_key KEY_STRING # The primary or the secondary Connected Sources client authentication key
+ log_type EVENT_TYPE_NAME # The name of the event type. ex) ApacheAccessLog
+ add_time_field true
+ time_field_name mytime
+ time_format %s
+ localtime true
+ add_tag_field true
+ tag_field_name mytag
+ </match>
@@ -11,7 +11,6 @@ Gem::Specification.new do |gem|
  gem.description = gem.summary
  gem.homepage = "http://github.com/yokawasa/fluent-plugin-azure-loganalytics"
  gem.license = "Apache-2.0"
- gem.has_rdoc = false

  gem.files = `git ls-files`.split("\n")
  gem.executables = gem.files.grep(%r{^bin/}) { |f| File.basename(f) }
@@ -20,8 +19,8 @@ Gem::Specification.new do |gem|

  gem.add_dependency "fluentd", [">= 0.14.15", "< 2"]
  gem.add_dependency "rest-client"
- gem.add_dependency "azure-loganalytics-datacollector-api", [">= 0.1.2"]
- gem.add_development_dependency "bundler", "~> 1.11"
- gem.add_development_dependency "rake", "~> 10.0"
+ gem.add_dependency "yajl-ruby"
+ gem.add_dependency "azure-loganalytics-datacollector-api", [">= 0.4.0"]
+ gem.add_development_dependency "rake", ">= 12.3.3"
  gem.add_development_dependency "test-unit"
  end
@@ -16,10 +16,14 @@ module Fluent::Plugin
  :desc => "Your Operations Management Suite workspace ID"
  config_param :shared_key, :string, :secret => true,
  :desc => "The primary or the secondary Connected Sources client authentication key"
+ config_param :endpoint, :string, :default => 'ods.opinsights.azure.com',
+ :desc => "The service endpoint"
  config_param :log_type, :string,
  :desc => "The name of the event type that is being submitted to Log Analytics. log_type only alpha characters"
  config_param :time_generated_field, :string, :default => '',
  :desc => "The name of the time generated field. Be careful that the value of the field should strictly follow the ISO 8601 format (YYYY-MM-DDThh:mm:ssZ)"
+ config_param :azure_resource_id, :string, :default => '',
+ :desc => "Resource ID of the Azure resource the data should be associated with. This populates the _ResourceId property and allows the data to be included in resource-context queries in Azure Log Analytics (Azure Monitor). If this field isn't specified, the data will not be included in resource-context queries. The format should be like /subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/{resourceProviderNamespace}/{resourceType}/{resourceName}"
  config_param :add_time_field, :bool, :default => true,
  :desc => "This option allows to insert a time field to record"
  config_param :time_field_name, :string, :default => "time",
@@ -59,7 +63,7 @@ module Fluent::Plugin
  def start
  super
  # start
- @client=Azure::Loganalytics::Datacollectorapi::Client::new(@customer_id,@shared_key)
+ @client=Azure::Loganalytics::Datacollectorapi::Client::new(@customer_id,@shared_key,@endpoint)
  end

  def shutdown
@@ -91,14 +95,14 @@ module Fluent::Plugin
  records.push(record)
  }
  begin
- res = @client.post_data(@log_type, records, @time_generated_field)
+ res = @client.post_data(@log_type, records, @time_generated_field, @azure_resource_id)
  if not Azure::Loganalytics::Datacollectorapi::Client.is_success(res)
- log.fatal "DataCollector API request failure: error code: "
- + "#{res.code}, data=>" + records.to_json
+ log.fatal "DataCollector API request failure: error code: " +
+ "#{res.code}, data=>" + Yajl.dump(records)
  end
  rescue Exception => ex
- log.fatal "Exception occured in posting to DataCollector API: "
- + "'#{ex}', data=>" + records.to_json
+ log.fatal "Exception occurred in posting to DataCollector API: " +
+ "'#{ex}', data=>" + Yajl.dump(records)
  end
  end
  end
@@ -22,18 +22,16 @@ class AzureLogAnalyticsOutputTest < Test::Unit::TestCase

  def test_configure
  d = create_driver
- assert_equal '<Customer ID aka WorkspaceID String>', d.instance.customer_id
- assert_equal '<Primary Key String>', d.instance.shared_key
  assert_equal 'ApacheAccessLog', d.instance.log_type
- assert_true d.instance.add_time_field
- assert_true d.instance.localtime
- assert_true d.instance.add_tag_field
+ assert_equal true, d.instance.add_time_field
+ assert_equal true, d.instance.localtime
+ assert_equal true, d.instance.add_tag_field
  assert_equal 'tag', d.instance.tag_field_name
  end

  def test_format
  d = create_driver
- time = event_time("2011-01-02 13:14:15 UTC")
+ time = event_time("2017-11-24 01:14:15 UTC")
  d.run(default_tag: 'test') do
  d.feed(time, {"a"=>1})
  d.feed(time, {"a"=>2})
@@ -57,9 +55,9 @@ class AzureLogAnalyticsOutputTest < Test::Unit::TestCase
  d.feed(
  time,
  {
- :Log_ID => "5cdad72f-c848-4df0-8aaa-ffe033e75d57",
- :date => "2016-12-10 09:44:32 JST",
- :processing_time => "372",
+ :Log_ID => "5cdad72a-c848-4df0-8aaa-ffe033e75d57",
+ :date => "2017-11-24 01:44:32 JST",
+ :processing_time => 372,
  :remote => "101.202.74.59",
  :user => "-",
  :method => "GET / HTTP/1.1",
@@ -67,15 +65,15 @@ class AzureLogAnalyticsOutputTest < Test::Unit::TestCase
  :size => "-",
  :referer => "-",
  :agent => "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.7; rv:27.0) Gecko/20100101 Firefox/27.0",
- :eventtime => "2016-12-10T09:44:32Z"
+ :eventtime => "2017-11-24T01:44:32Z"
  })

  d.feed(
  time,
  {
- :Log_ID => "7260iswx-8034-4cc3-uirtx-f068dd4cd659",
- :date => "2016-12-10 09:45:14 JST",
- :processing_time => "105",
+ :Log_ID => "7260iswa-8034-4cc3-uirtx-f068dd4cd659",
+ :date => "2017-11-24 01:45:14 JST",
+ :processing_time => 105,
  :remote => "201.78.74.59",
  :user => "-",
  :method => "GET /manager/html HTTP/1.1",
@@ -83,7 +81,7 @@ class AzureLogAnalyticsOutputTest < Test::Unit::TestCase
  :size => "-",
  :referer => "-",
  :agent => "Mozilla/5.0 (Windows NT 5.1; rv:5.0) Gecko/20100101 Firefox/5.0",
- :eventtime => "2016-12-10T09:45:14Z"
+ :eventtime => "2017-11-24T01:45:14Z"
  })
  end
  data = d.events
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: fluent-plugin-azure-loganalytics
  version: !ruby/object:Gem::Version
- version: 0.3.1
+ version: 0.6.0
  platform: ruby
  authors:
  - Yoichi Kawasaki
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2017-09-05 00:00:00.000000000 Z
+ date: 2020-07-17 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: fluentd
@@ -45,47 +45,47 @@ dependencies:
  - !ruby/object:Gem::Version
  version: '0'
  - !ruby/object:Gem::Dependency
- name: azure-loganalytics-datacollector-api
+ name: yajl-ruby
  requirement: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
  - !ruby/object:Gem::Version
- version: 0.1.2
+ version: '0'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
  - !ruby/object:Gem::Version
- version: 0.1.2
+ version: '0'
  - !ruby/object:Gem::Dependency
- name: bundler
+ name: azure-loganalytics-datacollector-api
  requirement: !ruby/object:Gem::Requirement
  requirements:
- - - "~>"
+ - - ">="
  - !ruby/object:Gem::Version
- version: '1.11'
- type: :development
+ version: 0.4.0
+ type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
- - - "~>"
+ - - ">="
  - !ruby/object:Gem::Version
- version: '1.11'
+ version: 0.4.0
  - !ruby/object:Gem::Dependency
  name: rake
  requirement: !ruby/object:Gem::Requirement
  requirements:
- - - "~>"
+ - - ">="
  - !ruby/object:Gem::Version
- version: '10.0'
+ version: 12.3.3
  type: :development
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
- - - "~>"
+ - - ">="
  - !ruby/object:Gem::Version
- version: '10.0'
+ version: 12.3.3
  - !ruby/object:Gem::Dependency
  name: test-unit
  requirement: !ruby/object:Gem::Requirement
@@ -113,10 +113,14 @@ files:
  - README.md
  - Rakefile
  - VERSION
+ - docs/install-tdagent-and-the-plugin-on-rhel.md
  - examples/fluent_1.conf
  - examples/fluent_2.conf
+ - examples/fluent_csv.conf
+ - examples/fluent_typecast.conf
  - fluent-plugin-azure-loganalytics.gemspec
  - img/Azure-LogAnalytics-Fluentd.png
+ - img/Azure-LogAnalytics-Output-Image-2.png
  - img/Azure-LogAnalytics-Output-Image.png
  - lib/fluent/plugin/out_azure-loganalytics.rb
  - test/helper.rb
@@ -140,8 +144,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  - !ruby/object:Gem::Version
  version: '0'
  requirements: []
- rubyforge_project:
- rubygems_version: 2.5.2
+ rubygems_version: 3.1.4
  signing_key:
  specification_version: 4
  summary: Azure Log Analytics output plugin for Fluentd