logstash-output-googlecloudstorage 0.1.0-java

checksums.yaml ADDED
@@ -0,0 +1,7 @@
+ ---
+ SHA256:
+   metadata.gz: f251744729ec8eef30bf68a88959af7d00de462c2ab1c37f7b01734039261fbb
+   data.tar.gz: 0b7333d7c5b8bc82e4bff84d711612481a5c664c256e3bc49823e638326d873f
+ SHA512:
+   metadata.gz: 6e739222bdb3f232422e5852b9e40cd6c7540dbe3fc5bf73cb8e195a9db6428c8af36ab1479dc3b24e5642054bc69f538b721ad1f0e7f5c0b357fc3861064b79
+   data.tar.gz: cbe5fc75109ce905c48dd4cb75679f80c6aa5aecb6c398fea85b57839e325437a8c6f90b6893236b71f5702dd05bd0a171dc73b2dbf1bfdb8adfd243fda93c36
data/CHANGELOG.md ADDED
@@ -0,0 +1,55 @@
+ ## 3.3.0
+ Added the ability to set `gzip` as `Content-Encoding`.
+ This saves storage size but still allows uncompressed downloads.
+
+ - Fixes [#13](https://github.com/logstash-plugins/logstash-output-google_cloud_storage/issues/13) - Use `gzip` for `Content-Encoding` instead of `Content-Type`
+
+ ## 3.2.1
+ - Refactoring work to add locks to file rotation and writing.
+ - Fixes [#2](https://github.com/logstash-plugins/logstash-output-google_cloud_storage/issues/2) - Plugin crashes on file rotation.
+ - Fixes [#19](https://github.com/logstash-plugins/logstash-output-google_cloud_storage/issues/19) - Deleted files remain in use by the system, eventually filling up disk space.
+
+ ## 3.2.0
+ - Change uploads to use a job pool for better performance
+   - Fixes [#22](https://github.com/logstash-plugins/logstash-output-google_cloud_storage/issues/22) - Refactor Job Queue Architecture
+   - Fixes [#5](https://github.com/logstash-plugins/logstash-output-google_cloud_storage/issues/5) - Major Performance Issues
+ - Wait for files to upload before Logstash quits
+   - Fixes [#15](https://github.com/logstash-plugins/logstash-output-google_cloud_storage/issues/15) - Fails to upload files when Logstash exits
+
+ ## 3.1.0
+ - Add support for disabling hostname in the log file names
+ - Add support for adding a UUID to the log file names
+
+ ## 3.0.5
+ - Docs: Set the default_codec doc attribute.
+
+ ## 3.0.4
+ - Fix some documentation issues
+
+ ## 3.0.2
+ - Docs: Fix doc formatting
+
+ ## 3.0.1
+ - Align the dependency on mime-type and google-api-client with `logstash-output-google_bigquery`
+
+ ## 3.0.0
+ - Breaking: Updated plugin to use new Java Event APIs
+ - Relax constraints on logstash-core-plugin-api
+ - Update .travis.yml
+ - Freeze google-api-client and mime-types
+ - Use concurrency :single
+
+ ## 2.0.4
+ - Depend on logstash-core-plugin-api instead of logstash-core, removing the need to mass update plugins on major releases of Logstash
+
+ ## 2.0.3
+ - New dependency requirements on logstash-core for the 5.0 release
+
+ ## 2.0.0
+ - Plugins were updated to follow the new shutdown semantic; this mainly allows Logstash to instruct input plugins to terminate gracefully,
+   instead of using Thread.raise on the plugins' threads. Ref: https://github.com/elastic/logstash/pull/3895
+ - Dependency on logstash-core updated to 2.0
+
+ ## 0.2.0
+ - Changed the Google Cloud Storage API version to v1
+ - Added simple test for plugin lookup
data/CONTRIBUTORS ADDED
@@ -0,0 +1,18 @@
+ The following is a list of people who have contributed ideas, code, bug
+ reports, or in general have helped logstash along its way.
+
+ Contributors:
+ * Aaron Mildenstein (untergeek)
+ * Ethan Estrada (eestrada)
+ * Google LLC.
+ * Jordan Sissel (jordansissel)
+ * Joseph Lewis III (jlewisiii)
+ * MetaPipe
+ * Pier-Hugues Pellerin (ph)
+ * Richard Pijnenburg (electrical)
+ * Rodrigo De Castro (rdcastro)
+
+ Note: If you've sent us patches, bug reports, or otherwise contributed to
+ Logstash, and you aren't on the list above and want to be, please let us know
+ and we'll make sure you're here. Contributions from folks like you are what make
+ open source awesome.
data/Gemfile ADDED
@@ -0,0 +1,11 @@
+ source 'https://rubygems.org'
+
+ gemspec
+
+ logstash_path = ENV["LOGSTASH_PATH"] || "../../logstash"
+ use_logstash_source = ENV["LOGSTASH_SOURCE"] && ENV["LOGSTASH_SOURCE"].to_s == "1"
+
+ if Dir.exist?(logstash_path) && use_logstash_source
+   gem 'logstash-core', :path => "#{logstash_path}/logstash-core"
+   gem 'logstash-core-plugin-api', :path => "#{logstash_path}/logstash-core-plugin-api"
+ end
data/LICENSE ADDED
@@ -0,0 +1,13 @@
+ Copyright (c) 2012-2018 Elasticsearch <http://www.elastic.co>
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+     http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
data/NOTICE.TXT ADDED
@@ -0,0 +1,5 @@
+ Elasticsearch
+ Copyright 2012-2015 Elasticsearch
+
+ This product includes software developed by The Apache Software
+ Foundation (http://www.apache.org/).
data/README.md ADDED
@@ -0,0 +1,98 @@
+ # Logstash Plugin
+
+ [![Travis Build Status](https://travis-ci.org/logstash-plugins/logstash-output-google_cloud_storage.svg)](https://travis-ci.org/logstash-plugins/logstash-output-google_cloud_storage)
+
+ This is a plugin for [Logstash](https://github.com/elastic/logstash).
+
+ It is fully free and fully open source. The license is Apache 2.0, meaning you are pretty much free to use it however you want in whatever way.
+
+ ## Documentation
+
+ Logstash provides infrastructure to automatically generate documentation for this plugin. We use the asciidoc format to write documentation, so any comments in the source code will be first converted into asciidoc and then into html. All plugin documentation is placed under one [central location](http://www.elastic.co/guide/en/logstash/current/).
+
+ - For formatting code or config examples, you can use the asciidoc `[source,ruby]` directive
+ - For more asciidoc formatting tips, see the excellent reference here https://github.com/elastic/docs#asciidoc-guide
+
+ ## Need Help?
+
+ Need help? Try #logstash on freenode IRC or the https://discuss.elastic.co/c/logstash discussion forum.
+
+ ## Developing
+
+ ### 1. Plugin Development and Testing
+
+ #### Code
+ - To get started, you'll need JRuby with the Bundler gem installed.
+
+ - Create a new plugin or clone an existing one from the GitHub [logstash-plugins](https://github.com/logstash-plugins) organization. We also provide [example plugins](https://github.com/logstash-plugins?query=example).
+
+ - Install dependencies
+ ```sh
+ bundle install
+ ```
+
+ #### Test
+
+ - Update your dependencies
+
+ ```sh
+ bundle install
+ ```
+
+ - Run tests
+
+ ```sh
+ bundle exec rspec
+ ```
+
+ ### 2. Running your unpublished Plugin in Logstash
+
+ #### 2.1 Run in a local Logstash clone
+
+ - Edit the Logstash `Gemfile` and add the local plugin path, for example:
+ ```ruby
+ gem "logstash-filter-awesome", :path => "/your/local/logstash-filter-awesome"
+ ```
+ - Install the plugin
+ ```sh
+ # Logstash 2.3 and higher
+ bin/logstash-plugin install --no-verify
+
+ # Prior to Logstash 2.3
+ bin/plugin install --no-verify
+
+ ```
+ - Run Logstash with your plugin
+ ```sh
+ bin/logstash -e 'filter {awesome {}}'
+ ```
+ At this point any modifications to the plugin code will be applied to this local Logstash setup. After modifying the plugin, simply rerun Logstash.
+
+ #### 2.2 Run in an installed Logstash
+
+ You can use the same **2.1** method to run your plugin in an installed Logstash by editing its `Gemfile` and pointing the `:path` to your local plugin development directory, or you can build the gem and install it using:
+
+ - Build your plugin gem
+ ```sh
+ gem build logstash-filter-awesome.gemspec
+ ```
+ - Install the plugin from the Logstash home
+ ```sh
+ # Logstash 2.3 and higher
+ bin/logstash-plugin install --no-verify
+
+ # Prior to Logstash 2.3
+ bin/plugin install --no-verify
+
+ ```
+ - Start Logstash and proceed to test the plugin
+
+ ## Contributing
+
+ All contributions are welcome: ideas, patches, documentation, bug reports, complaints, and even something you drew up on a napkin.
+
+ Programming is not a required skill. Whatever you've seen about open source and maintainers or community members saying "send patches or die" - you will not see that here.
+
+ It is more important to the community that you are able to contribute.
+
+ For more information about contributing, see the [CONTRIBUTING](https://github.com/elastic/logstash/blob/master/CONTRIBUTING.md) file.
@@ -0,0 +1,272 @@
+ :plugin: google_cloud_storage
+ :type: output
+ :default_codec: plain
+
+ ///////////////////////////////////////////
+ START - GENERATED VARIABLES, DO NOT EDIT!
+ ///////////////////////////////////////////
+ :version: %VERSION%
+ :release_date: %RELEASE_DATE%
+ :changelog_url: %CHANGELOG_URL%
+ :include_path: ../../../../logstash/docs/include
+ ///////////////////////////////////////////
+ END - GENERATED VARIABLES, DO NOT EDIT!
+ ///////////////////////////////////////////
+
+ [id="plugins-{type}s-{plugin}"]
+
+ === Google_cloud_storage output plugin
+
+ include::{include_path}/plugin_header.asciidoc[]
+
+ ==== Description
+
+ Summary: plugin to upload log events to Google Cloud Storage (GCS), rolling
+ files based on the date pattern provided as a configuration setting. Events
+ are written to files locally and, once a file is closed, this plugin uploads
+ it to the configured bucket.
+
+ For more info on Google Cloud Storage, please go to:
+ https://cloud.google.com/products/cloud-storage
+
+ In order to use this plugin, a Google service account must be used. For
+ more information, please refer to:
+ https://developers.google.com/storage/docs/authentication#service_accounts
+
+ Recommendation: experiment with the settings depending on how much log
+ data you generate, so the uploader can keep up with the generated logs.
+ Using gzip output can be a good option to reduce network traffic when
+ uploading the log files, and in terms of storage costs as well.
+
+ USAGE:
+ This is an example of logstash config:
+
+ [source,json]
+ --------------------------
+ output {
+    google_cloud_storage {
+      bucket => "my_bucket"                                     (required)
+      key_path => "/path/to/privatekey.p12"                     (required)
+      key_password => "notasecret"                              (optional)
+      service_account => "1234@developer.gserviceaccount.com"   (required)
+      temp_directory => "/tmp/logstash-gcs"                     (optional)
+      log_file_prefix => "logstash_gcs"                         (optional)
+      max_file_size_kbytes => 1024                              (optional)
+      output_format => "plain"                                  (optional)
+      date_pattern => "%Y-%m-%dT%H:00"                          (optional)
+      flush_interval_secs => 2                                  (optional)
+      gzip => false                                             (optional)
+      gzip_content_encoding => false                            (optional)
+      uploader_interval_secs => 60                              (optional)
+      include_uuid => true                                      (optional)
+      include_hostname => true                                  (optional)
+    }
+ }
+ --------------------------
+
+ Improvements TODO list:
+
+ * Support logstash event variables to determine filename.
+ * Turn Google API code into a Plugin Mixin (like AwsConfig).
+ * There's no recover method, so if logstash/plugin crashes, files may not
+   be uploaded to GCS.
+ * Allow user to configure file name.
+ * Allow parallel uploads for heavier loads (+ connection configuration if
+   exposed by Ruby API client)
+
+ [id="plugins-{type}s-{plugin}-options"]
+ ==== Google_cloud_storage Output Configuration Options
+
+ This plugin supports the following configuration options plus the <<plugins-{type}s-{plugin}-common-options>> described later.
+
+ [cols="<,<,<",options="header",]
+ |=======================================================================
+ |Setting |Input type|Required
+ | <<plugins-{type}s-{plugin}-bucket>> |<<string,string>>|Yes
+ | <<plugins-{type}s-{plugin}-date_pattern>> |<<string,string>>|No
+ | <<plugins-{type}s-{plugin}-flush_interval_secs>> |<<number,number>>|No
+ | <<plugins-{type}s-{plugin}-gzip>> |<<boolean,boolean>>|No
+ | <<plugins-{type}s-{plugin}-gzip_content_encoding>> |<<boolean,boolean>>|No
+ | <<plugins-{type}s-{plugin}-include_hostname>> |<<boolean,boolean>>|No
+ | <<plugins-{type}s-{plugin}-include_uuid>> |<<boolean,boolean>>|No
+ | <<plugins-{type}s-{plugin}-key_password>> |<<string,string>>|No
+ | <<plugins-{type}s-{plugin}-key_path>> |<<string,string>>|Yes
+ | <<plugins-{type}s-{plugin}-log_file_prefix>> |<<string,string>>|No
+ | <<plugins-{type}s-{plugin}-max_concurrent_uploads>> |<<number,number>>|No
+ | <<plugins-{type}s-{plugin}-max_file_size_kbytes>> |<<number,number>>|No
+ | <<plugins-{type}s-{plugin}-output_format>> |<<string,string>>, one of `["json", "plain"]`|No
+ | <<plugins-{type}s-{plugin}-service_account>> |<<string,string>>|Yes
+ | <<plugins-{type}s-{plugin}-temp_directory>> |<<string,string>>|No
+ | <<plugins-{type}s-{plugin}-uploader_interval_secs>> |<<number,number>>|No
+ |=======================================================================
+
+ Also see <<plugins-{type}s-{plugin}-common-options>> for a list of options supported by all
+ output plugins.
+
+ &nbsp;
+
+ [id="plugins-{type}s-{plugin}-bucket"]
+ ===== `bucket`
+
+ * This is a required setting.
+ * Value type is <<string,string>>
+ * There is no default value for this setting.
+
+ GCS bucket name, without "gs://" or any other prefix.
+
+ [id="plugins-{type}s-{plugin}-date_pattern"]
+ ===== `date_pattern`
+
+ * Value type is <<string,string>>
+ * Default value is `"%Y-%m-%dT%H:00"`
+
+ Time pattern for the log file; defaults to hourly files.
+ Must be Time.strftime patterns: www.ruby-doc.org/core-2.0/Time.html#method-i-strftime
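As a quick illustration of the default pattern (the timestamps below are made up), a Ruby sketch showing how `Time.strftime` maps all events from the same hour onto the same file name part:

```ruby
date_pattern = "%Y-%m-%dT%H:00" # the plugin's default: one file per hour

t1 = Time.utc(2018, 5, 3, 14, 7)  # 14:07 UTC
t2 = Time.utc(2018, 5, 3, 14, 59) # 14:59 UTC, same hour
t3 = Time.utc(2018, 5, 3, 15, 0)  # next hour, so a new file

puts t1.strftime(date_pattern) # => "2018-05-03T14:00"
puts t2.strftime(date_pattern) # => "2018-05-03T14:00" (same file part)
puts t3.strftime(date_pattern) # => "2018-05-03T15:00" (file rotates)
```

A coarser pattern such as `"%Y-%m-%d"` would roll files daily instead.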
+
+ [id="plugins-{type}s-{plugin}-flush_interval_secs"]
+ ===== `flush_interval_secs`
+
+ * Value type is <<number,number>>
+ * Default value is `2`
+
+ Flush interval in seconds for flushing writes to log files. 0 will flush
+ on every message.
+
+ [id="plugins-{type}s-{plugin}-gzip"]
+ ===== `gzip`
+
+ * Value type is <<boolean,boolean>>
+ * Default value is `false`
+
+ Gzip the output stream when writing events to log files, set
+ `Content-Type` to `application/gzip` instead of `text/plain`, and
+ use the file suffix `.log.gz` instead of `.log`.
+
+ [id="plugins-{type}s-{plugin}-gzip_content_encoding"]
+ ===== `gzip_content_encoding`
+
+ added[3.3.0]
+
+ * Value type is <<boolean,boolean>>
+ * Default value is `false`
+
+ Gzip the output stream when writing events to log files and set `Content-Encoding` to `gzip`.
+ This uploads your files as `gzip`, saving network and storage costs, but they will be
+ transparently decompressed when you read them from the storage bucket.
+
+ See the Cloud Storage documentation on https://cloud.google.com/storage/docs/metadata#content-encoding[metadata] and
+ https://cloud.google.com/storage/docs/transcoding#content-type_vs_content-encoding[transcoding]
+ for more information.
+
+ **Note**: It is not recommended to use both `gzip_content_encoding` and `gzip`.
+ This compresses your files _twice_, increases the work your machine does, and makes
+ the files larger than compressing them once.
+
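The effect this setting relies on can be round-tripped with Ruby's standard `zlib` library; a minimal sketch (not the plugin's actual upload code) showing that the stored bytes shrink, yet decompress back to the original log text, just as the bucket decompresses transparently on download:

```ruby
require 'zlib'
require 'stringio'

log_lines = "GET /index.html 200\n" * 100 # repetitive log data compresses well

# Compress as the plugin would before upload (object stored with Content-Encoding: gzip).
buffer = StringIO.new
gz = Zlib::GzipWriter.new(buffer)
gz.write(log_lines)
gz.close
compressed = buffer.string

# Transparent decompression, as Cloud Storage performs when the object is read.
restored = Zlib::GzipReader.new(StringIO.new(compressed)).read

puts compressed.bytesize < log_lines.bytesize # => true (smaller at rest and on the wire)
puts restored == log_lines                    # => true (readers see the original text)
```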
+ [id="plugins-{type}s-{plugin}-include_hostname"]
+ ===== `include_hostname`
+
+ added[3.1.0]
+
+ * Value type is <<boolean,boolean>>
+ * Default value is `true`
+
+ Should the hostname be included in the file name?
+ You may want to turn this off for privacy reasons, or if you are running multiple
+ instances of Logstash and need to match the files you create with a simple glob,
+ such as if you wanted to import files to BigQuery.
+
+ [id="plugins-{type}s-{plugin}-include_uuid"]
+ ===== `include_uuid`
+
+ added[3.1.0]
+
+ * Value type is <<boolean,boolean>>
+ * Default value is `false`
+
+ Adds a UUID to the end of a file name.
+ You may want to enable this feature so files don't clobber one another if you're
+ running multiple instances of Logstash or if you expect frequent node restarts.
+
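For illustration only (the part names below are hypothetical, not the plugin's exact naming code), a Ruby sketch of how a UUID suffix keeps file names from colliding across instances:

```ruby
require 'securerandom'

# Hypothetical file-name assembly; the plugin's real naming logic may differ.
prefix   = "logstash_gcs"
hostname = "worker-1"        # dropped when include_hostname => false
date     = "2018-05-03T14:00"
uuid     = SecureRandom.uuid # appended when include_uuid => true

name = [prefix, hostname, date, uuid].join("_") + ".log"
puts name # two Logstash instances writing the same hour still get distinct names
```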
+ [id="plugins-{type}s-{plugin}-key_password"]
+ ===== `key_password`
+
+ * Value type is <<string,string>>
+ * Default value is `"notasecret"`
+
+ GCS private key password.
+
+ [id="plugins-{type}s-{plugin}-key_path"]
+ ===== `key_path`
+
+ * This is a required setting.
+ * Value type is <<string,string>>
+ * There is no default value for this setting.
+
+ Path to the GCS private key file.
+
+ [id="plugins-{type}s-{plugin}-log_file_prefix"]
+ ===== `log_file_prefix`
+
+ * Value type is <<string,string>>
+ * Default value is `"logstash_gcs"`
+
+ Log file prefix. Log files will follow the format:
+ <prefix>_hostname_date<.part?>.log
+
+ [id="plugins-{type}s-{plugin}-max_concurrent_uploads"]
+ ===== `max_concurrent_uploads`
+
+ * Value type is <<number,number>>
+ * Default value is `5`
+
+ Sets the maximum number of concurrent uploads to Cloud Storage at a time.
+ Uploads are I/O bound, so it makes sense to tune this parameter with regard
+ to the network bandwidth available and the latency between your server and
+ Cloud Storage.
+
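The bound that `max_concurrent_uploads` places on the job pool can be sketched with plain Ruby threads (the upload itself is a stand-in here; the plugin's internals differ):

```ruby
max_concurrent_uploads = 5 # mirrors the plugin's default

pending  = Queue.new
uploaded = Queue.new
%w[a.log b.log c.log d.log e.log f.log].each { |f| pending << f }

# At most max_concurrent_uploads workers drain the pending queue in parallel.
workers = Array.new(max_concurrent_uploads) do
  Thread.new do
    loop do
      file = begin
        pending.pop(true) # non-blocking pop; raises ThreadError when queue is empty
      rescue ThreadError
        break
      end
      uploaded << file # stand-in for the real Cloud Storage upload call
    end
  end
end
workers.each(&:join)

puts uploaded.size # => 6: every file is uploaded, never more than 5 in flight
```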
+ [id="plugins-{type}s-{plugin}-max_file_size_kbytes"]
+ ===== `max_file_size_kbytes`
+
+ * Value type is <<number,number>>
+ * Default value is `10000`
+
+ Sets the maximum file size in kbytes. 0 disables the max file size check.
+
+ [id="plugins-{type}s-{plugin}-output_format"]
+ ===== `output_format`
+
+ * Value can be any of: `json`, `plain`
+ * Default value is `"plain"`
+
+ The event format you want to store in files. Defaults to plain text.
+
+ [id="plugins-{type}s-{plugin}-service_account"]
+ ===== `service_account`
+
+ * This is a required setting.
+ * Value type is <<string,string>>
+ * There is no default value for this setting.
+
+ GCS service account.
+
+ [id="plugins-{type}s-{plugin}-temp_directory"]
+ ===== `temp_directory`
+
+ * Value type is <<string,string>>
+ * Default value is `""`
+
+ Directory where temporary files are stored.
+ Defaults to /tmp/logstash-gcs-<random-suffix>.
+
+ [id="plugins-{type}s-{plugin}-uploader_interval_secs"]
+ ===== `uploader_interval_secs`
+
+ * Value type is <<number,number>>
+ * Default value is `60`
+
+ Uploader interval when uploading new files to GCS. Adjust the time based
+ on your time pattern (for example, for hourly files, this interval can be
+ around one hour).
+
+
+ include::{include_path}/{type}.asciidoc[]