logstash-output-google_bigquery 4.1.0-java → 4.1.1-java

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 31a414a63c70fc7b35c690e076db1d1595785c79187a94e361b50d8632161496
- data.tar.gz: 70622ec5c509cd8ac319e4e021b04086a841952b51619dd99fdaa4e2ac21c0a6
+ metadata.gz: 1bea90d55eb25689d86a79c1adf326d382d8f2d7c6045a027935b5f19c4e005f
+ data.tar.gz: fd315b0e136a61429a63489a2d3ea21e294d474814d809911b382268037a8ce8
  SHA512:
- metadata.gz: b0157e8e33e82c5532bf13cb758ef74d55b05c5ee32185ba53ae0f3dfefbfbc34bcd0df8eac2d1c024b01be93c034ad923e15741df31cb37a84746e944ffe69f
- data.tar.gz: 31a7c30f4a4bd0d5b95cfec1f4e569961263527a81f8d92b97c0d9ec04dcff8b73c4f5fe4c118cc307543a12958d0986a33ea35e9e45f3c89e7363dcf8a501d5
+ metadata.gz: d178dea332da37a9f4762624a409415b90b55b74d3c2279cb521c79621d74e2fb70686b1256249d368da22a08c742a10714b8623eb632c423d9d8be4f90e8d2f
+ data.tar.gz: a83de7d1395056256d36889a4ecb8ee30b0d79cfc81ae2b193363a86cc188728218670ad5250a4cf24796eeff0af9d8f4a2aa4972e8c9d169c4a486f1c5df5ff
@@ -1,3 +1,6 @@
+ ## 4.1.1
+ - Fixed inaccuracies in documentation [#46](https://github.com/logstash-plugins/logstash-output-google_bigquery/pull/46)
+
  ## 4.1.0
  - Added `skip_invalid_rows` configuration which will insert all valid rows of a BigQuery insert
  and skip any invalid ones.
@@ -23,25 +23,24 @@ include::{include_path}/plugin_header.asciidoc[]

  ===== Summary

- This plugin uploads events to Google BigQuery using the streaming API
- so data can become available nearly immediately.
+ This Logstash plugin uploads events to Google BigQuery using the streaming API
+ so data can become available to query nearly immediately.

  You can configure it to flush periodically, after N events or after
  a certain amount of data is ingested.

  ===== Environment Configuration

- You must enable BigQuery on your Google Cloud Storage (GCS) account and create a dataset to
+ You must enable BigQuery on your Google Cloud account and create a dataset to
  hold the tables this plugin generates.

- You must also grant the service account this plugin uses access to
- the dataset.
+ You must also grant the service account this plugin uses access to the dataset.

  You can use https://www.elastic.co/guide/en/logstash/current/event-dependent-configuration.html[Logstash conditionals]
  and multiple configuration blocks to upload events with different structures.

  ===== Usage
- This is an example of logstash config:
+ This is an example of Logstash config:

  [source,ruby]
  --------------------------
@@ -65,15 +64,18 @@ https://cloud.google.com/docs/authentication/production[Application Default Cred

  ===== Considerations

- * There is a small fee to insert data into BigQuery using the streaming API
+ * There is a small fee to insert data into BigQuery using the streaming API.
  * This plugin buffers events in-memory, so make sure the flush configurations are appropriate
  for your use-case and consider using
- https://www.elastic.co/guide/en/logstash/current/persistent-queues.html[Logstash Persistent Queues]
+ https://www.elastic.co/guide/en/logstash/current/persistent-queues.html[Logstash Persistent Queues].
+ * Events will be flushed when <<plugins-{type}s-{plugin}-batch_size>>, <<plugins-{type}s-{plugin}-batch_size_bytes>>, or <<plugins-{type}s-{plugin}-flush_interval_secs>> is met, whatever comes first.
+ If you notice a delay in your processing or low throughput, try adjusting those settings.

  ===== Additional Resources

  * https://cloud.google.com/docs/authentication/production[Application Default Credentials (ADC) Overview]
  * https://cloud.google.com/bigquery/[BigQuery Introduction]
+ * https://cloud.google.com/bigquery/quota[BigQuery Quotas and Limits]
  * https://cloud.google.com/bigquery/docs/schemas[BigQuery Schema Formats and Types]

  [id="plugins-{type}s-{plugin}-options"]
@@ -120,7 +122,12 @@ added[4.0.0]
  * Value type is <<number,number>>
  * Default value is `128`

- The number of messages to upload at a single time. (< 1000, default: 128)
+ The maximum number of messages to upload at a single time.
+ This number must be < 10,000.
+ Batching can increase performance and throughput to a point, but at the cost of per-request latency.
+ Too few rows per request and the overhead of each request can make ingestion inefficient.
+ Too many rows per request and the throughput may drop.
+ BigQuery recommends using about 500 rows per request, but experimentation with representative data (schema and data sizes) will help you determine the ideal batch size.

  [id="plugins-{type}s-{plugin}-batch_size_bytes"]
  ===== `batch_size_bytes`
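
As a starting point for the `batch_size` guidance above, you might begin near the recommended ~500 rows and measure from there. A hedged sketch, where the value shown is an assumption to experiment with rather than a documented default:

[source,ruby]
--------------------------
  google_bigquery {
    ...
    batch_size => 500   # near BigQuery's ~500 rows-per-request recommendation; must stay below 10,000
  }
--------------------------
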
@@ -130,10 +137,11 @@ added[4.0.0]
  * Value type is <<number,number>>
  * Default value is `1_000_000`

- An approximate number of bytes to upload as part of a batch. Default: 1MB
+ An approximate number of bytes to upload as part of a batch.
+ This number should be < 10MB or inserts may fail.

  [id="plugins-{type}s-{plugin}-csv_schema"]
- ===== `csv_schema`
+ ===== `csv_schema`

  * Value type is <<string,string>>
  * Default value is `nil`
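
When events are large, the byte cap above will usually trip before the row cap. A hedged sketch, with an assumed value sized well under the 10MB ceiling noted in the hunk:

[source,ruby]
--------------------------
  google_bigquery {
    ...
    batch_size_bytes => 5000000   # flush at roughly 5 MB, comfortably below the ~10 MB limit
  }
--------------------------
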
@@ -142,7 +150,7 @@ Schema for log data. It must follow the format `name1:type1(,name2:type2)*`.
  For example, `path:STRING,status:INTEGER,score:FLOAT`.

  [id="plugins-{type}s-{plugin}-dataset"]
- ===== `dataset`
+ ===== `dataset`

  * This is a required setting.
  * Value type is <<string,string>>
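
Following the `name1:type1(,name2:type2)*` format above, a `csv_schema` declaration could look like the sketch below, reusing the doc's own example fields:

[source,ruby]
--------------------------
  google_bigquery {
    ...
    csv_schema => "path:STRING,status:INTEGER,score:FLOAT"   # one name:TYPE pair per column, comma-separated
  }
--------------------------
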
@@ -151,7 +159,7 @@ For example, `path:STRING,status:INTEGER,score:FLOAT`.
  The BigQuery dataset the tables for the events will be added to.

  [id="plugins-{type}s-{plugin}-date_pattern"]
- ===== `date_pattern`
+ ===== `date_pattern`

  * Value type is <<string,string>>
  * Default value is `"%Y-%m-%dT%H:00"`
@@ -187,15 +195,16 @@ transparently upload to a GCS bucket.
  Files names follow the pattern `[table name]-[UNIX timestamp].log`

  [id="plugins-{type}s-{plugin}-flush_interval_secs"]
- ===== `flush_interval_secs`
+ ===== `flush_interval_secs`

  * Value type is <<number,number>>
  * Default value is `5`

- Uploads all data this often even if other upload criteria aren't met. Default: 5s
+ Uploads all data this often even if other upload criteria aren't met.
+

  [id="plugins-{type}s-{plugin}-ignore_unknown_values"]
- ===== `ignore_unknown_values`
+ ===== `ignore_unknown_values`

  * Value type is <<boolean,boolean>>
  * Default value is `false`
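
Because `flush_interval_secs` is a fallback trigger, lowering it trades more (and smaller) insert requests for fresher data. A sketch with an assumed, illustrative value:

[source,ruby]
--------------------------
  google_bigquery {
    ...
    flush_interval_secs => 2   # flush at least every 2 seconds, even if batch_size and batch_size_bytes aren't reached
  }
--------------------------
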
@@ -222,12 +231,12 @@ added[4.0.0, Replaces <<plugins-{type}s-{plugin}-key_password>>, <<plugins-{type
  * Value type is <<string,string>>
  * Default value is `nil`

- If logstash is running within Google Compute Engine, the plugin can use
+ If Logstash is running within Google Compute Engine, the plugin can use
  GCE's Application Default Credentials. Outside of GCE, you will need to
  specify a Service Account JSON key file.

  [id="plugins-{type}s-{plugin}-json_schema"]
- ===== `json_schema`
+ ===== `json_schema`

  * Value type is <<hash,hash>>
  * Default value is `nil`
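
A minimal sketch of the credentials behavior described above, assuming this hunk belongs to the `json_key_file` option section (the path is a hypothetical placeholder):

[source,ruby]
--------------------------
  google_bigquery {
    ...
    # On Google Compute Engine: leave this unset and the plugin can use Application Default Credentials.
    # Outside GCE: point it at the Service Account JSON key you generated.
    json_key_file => "/etc/logstash/keys/bigquery-sa.json"
  }
--------------------------
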
@@ -287,7 +296,7 @@ Please use one of the following mechanisms:
  `gcloud iam service-accounts keys create key.json --iam-account my-sa-123@my-project-123.iam.gserviceaccount.com`

  [id="plugins-{type}s-{plugin}-project_id"]
- ===== `project_id`
+ ===== `project_id`

  * This is a required setting.
  * Value type is <<string,string>>
@@ -314,7 +323,7 @@ Insert all valid rows of a request, even if invalid rows exist.
  The default value is false, which causes the entire request to fail if any invalid rows exist.

  [id="plugins-{type}s-{plugin}-table_prefix"]
- ===== `table_prefix`
+ ===== `table_prefix`

  * Value type is <<string,string>>
  * Default value is `"logstash"`
@@ -323,7 +332,7 @@ BigQuery table ID prefix to be used when creating new tables for log data.
  Table name will be `<table_prefix><table_separator><date>`

  [id="plugins-{type}s-{plugin}-table_separator"]
- ===== `table_separator`
+ ===== `table_separator`

  * Value type is <<string,string>>
  * Default value is `"_"`
@@ -361,4 +370,4 @@ around one hour).
  [id="plugins-{type}s-{plugin}-common-options"]
  include::{include_path}/{type}.asciidoc[]

- :default_codec!:
+ :default_codec!:
@@ -1,6 +1,6 @@
  Gem::Specification.new do |s|
  s.name = 'logstash-output-google_bigquery'
- s.version = '4.1.0'
+ s.version = '4.1.1'
  s.licenses = ['Apache License (2.0)']
  s.summary = "Writes events to Google BigQuery"
  s.description = "This gem is a Logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/logstash-plugin install gemname. This gem is not a stand-alone program"
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: logstash-output-google_bigquery
  version: !ruby/object:Gem::Version
- version: 4.1.0
+ version: 4.1.1
  platform: java
  authors:
  - Elastic
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2018-09-07 00:00:00.000000000 Z
+ date: 2018-10-25 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
    requirement: !ruby/object:Gem::Requirement