logstash-output-documentdb 0.1.1

checksums.yaml ADDED
@@ -0,0 +1,7 @@
+ ---
+ SHA1:
+   metadata.gz: 5f78be4c25e8e6aa60117d8e38fce205e9d8a140
+   data.tar.gz: c7d4ae94f9fb1dd05bef5ae097597e5c6c72281c
+ SHA512:
+   metadata.gz: e2765023f4e8f62f069273e52a7ed3136fb398bd0eeb662c3b7283357b0681ec2654ae2a6028e6e9e272538d9421dc23b05e159dea4682aa7ebf6b9374daf03d
+   data.tar.gz: dad8180f0c381e813962a6438c902978c37ab862c4f2c72f6c16774c1308a39128eebdd22960bab57b2d638441cd8277c8f469f48ad7777dea996e25c2eb8f83
data/CHANGELOG.md ADDED
@@ -0,0 +1,8 @@
+ ## 0.1.1
+
+ * changed required package version in gemspec
+ * fixed a bug in error log message output
+
+ ## 0.1.0
+
+ * Initial Release
data/CONTRIBUTORS ADDED
@@ -0,0 +1,10 @@
+ The following is a list of people who have contributed ideas, code, bug
+ reports, or in general have helped logstash along its way.
+
+ Contributors:
+ * Yoichi Kawasaki (yokawasa)
+
+ Note: If you've sent us patches, bug reports, or otherwise contributed to
+ Logstash, and you aren't on the list above and want to be, please let us know
+ and we'll make sure you're here. Contributions from folks like you are what make
+ open source awesome.
data/Gemfile ADDED
@@ -0,0 +1,2 @@
+ source 'https://rubygems.org'
+ gemspec
data/LICENSE ADDED
@@ -0,0 +1,13 @@
+ Copyright (c) 2012-2015 Elasticsearch <http://www.elastic.co>
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+     http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
data/README.md ADDED
@@ -0,0 +1,127 @@
+ # Azure DocumentDB output plugin for Logstash
+
+ logstash-output-documentdb is a Logstash plugin that writes events to Azure DocumentDB. [Logstash](https://www.elastic.co/products/logstash) is an open-source, server-side data processing pipeline that ingests data from a multitude of sources simultaneously, transforms it, and then sends it to your favorite [destinations](https://www.elastic.co/products/logstash). [Azure DocumentDB](https://azure.microsoft.com/en-us/services/documentdb/) is a managed NoSQL database service provided by Microsoft Azure. It is schemaless, natively supports JSON, is fast and easy to use, is highly reliable, and enables rapid deployment.
+
+ ## Installation
+
+ You can install this plugin using the Logstash "plugin" command, or "logstash-plugin" for newer versions of Logstash:
+ ```
+ bin/plugin install logstash-output-documentdb
+ # or, for newer versions of Logstash
+ bin/logstash-plugin install logstash-output-documentdb
+ ```
+ Please see the [Logstash reference](https://www.elastic.co/guide/en/logstash/current/offline-plugins.html) for more information.
+
+ ## Configuration
+
+ ```
+ output {
+   documentdb {
+     docdb_endpoint => "https://<YOUR ACCOUNT>.documents.azure.com:443/"
+     docdb_account_key => "<ACCOUNT KEY>"
+     docdb_database => "<DATABASE NAME>"
+     docdb_collection => "<COLLECTION NAME>"
+     auto_create_database => true|false
+     auto_create_collection => true|false
+     partitioned_collection => true|false
+     partition_key => "<PARTITION KEY NAME>"
+     offer_throughput => <THROUGHPUT NUM>
+   }
+ }
+ ```
+
+ * **docdb\_endpoint (required)** - Azure DocumentDB account endpoint URI
+ * **docdb\_account\_key (required)** - Azure DocumentDB account key (master key). You must NOT use a read-only key
+ * **docdb\_database (required)** - DocumentDB database name
+ * **docdb\_collection (required)** - DocumentDB collection name
+ * **auto\_create\_database (optional)** - Default: true. A DocumentDB database named **docdb\_database** is created automatically if it does not exist
+ * **auto\_create\_collection (optional)** - Default: true. A DocumentDB collection named **docdb\_collection** is created automatically if it does not exist
+ * **partitioned\_collection (optional)** - Default: false. Set to true to create and/or write to a partitioned collection; set to false for a single-partition collection
+ * **partition\_key (optional)** - Default: nil. A partition key must be specified when using a partitioned collection (partitioned\_collection set to true)
+ * **offer\_throughput (optional)** - Default: 10100. Throughput for the collection, expressed in units of 100 request units per second. This only takes effect when a new partitioned collection is created (i.e. both auto\_create\_collection and partitioned\_collection are set to true)
+
+
+ ## Tests
+
+ logstash-output-documentdb automatically adds an id attribute (UUID format) to incoming events and sends them to DocumentDB. Here is an example configuration where Logstash's event source and destination are an Apache2 access log and DocumentDB, respectively.
+
+ ### Example Configuration
+ ```
+ input {
+   file {
+     path => "/var/log/apache2/access.log"
+     start_position => "beginning"
+   }
+ }
+
+ filter {
+   if [path] =~ "access" {
+     mutate { replace => { "type" => "apache_access" } }
+     grok {
+       match => { "message" => "%{COMBINEDAPACHELOG}" }
+     }
+   }
+   date {
+     match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
+   }
+ }
+
+ output {
+   documentdb {
+     docdb_endpoint => "https://yoichikademo.documents.azure.com:443/"
+     docdb_account_key => "EMwUa3EzsAtJ1qYfzwo9nQxxydofsXNm3xLh1SLffKkUHMFl80OZRZIVu4lxdKRKxkgVAj0c2mv9BZSyMN7tdg==(dummy)"
+     docdb_database => "testdb"
+     docdb_collection => "apache_access"
+     auto_create_database => true
+     auto_create_collection => true
+   }
+   # for debug
+   stdout { codec => rubydebug }
+ }
+ ```
+ You can find example configuration files in logstash-output-documentdb/examples.
+
+ ### Run the plugin with the example configuration
+
+ Now run Logstash with the example configuration like this:
+ ```
+ # Test your Logstash configuration before actually running Logstash
+ bin/logstash -f logstash-apache2-to-documentdb.conf --configtest
+ # run
+ bin/logstash -f logstash-apache2-to-documentdb.conf
+ ```
+
+ Here is the expected output for a sample input (an Apache2 access log line):
+
+ <u>Apache2 access log</u>
+ ```
+ 124.211.152.166 - - [27/Dec/2016:02:12:28 +0000] "GET /test.html HTTP/1.1" 200 316 "-" "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 Safari/537.36"
+ ```
+
+ <u>Output (rubydebug)</u>
+ ```
+ {
+     "message" => "124.211.152.166 - - [27/Dec/2016:02:12:28 +0000] \"GET /test.html HTTP/1.1\" 200 316 \"-\" \"Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 Safari/537.36\"",
+     "@version" => "1",
+     "@timestamp" => "2016-12-27T02:12:28.000Z",
+     "path" => "/var/log/apache2/access.log",
+     "host" => "yoichitest01",
+     "type" => "apache_access",
+     "clientip" => "124.211.152.166",
+     "ident" => "-",
+     "auth" => "-",
+     "timestamp" => "27/Dec/2016:02:12:28 +0000",
+     "verb" => "GET",
+     "request" => "/test.html",
+     "httpversion" => "1.1",
+     "response" => "200",
+     "bytes" => "316",
+     "referrer" => "\"-\"",
+     "agent" => "\"Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 Safari/537.36\"",
+     "id" => "0cae1966-b7ab-4f32-8893-b4fabc7800ae"
+ }
+ ```
+
+ ## Contributing
+ Bug reports and pull requests are welcome on GitHub at https://github.com/yokawasa/logstash-output-documentdb.
+
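
The Configuration section of the README above describes the partitioned-collection options (partitioned_collection, partition_key, offer_throughput), but its examples only cover a single-partition collection. The following sketch shows how those options fit together; the endpoint, key, database, collection, and partition key names are placeholders, not values taken from this gem:

```
output {
  documentdb {
    docdb_endpoint => "https://<YOUR ACCOUNT>.documents.azure.com:443/"
    docdb_account_key => "<ACCOUNT KEY>"
    docdb_database => "<DATABASE NAME>"
    docdb_collection => "<COLLECTION NAME>"
    auto_create_database => true
    auto_create_collection => true
    partitioned_collection => true
    partition_key => "host"          # each event/document must carry this attribute
    offer_throughput => 10100        # minimum accepted value per constants.rb
  }
}
```

With auto_create_collection and partitioned_collection both set to true, the plugin creates the collection with a hash partition key of /host and the given throughput at register time; documents missing the partition key attribute are rejected by PartitionedCollectionClient#create_document.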
data/VERSION ADDED
@@ -0,0 +1 @@
+ 0.1.1
data/lib/logstash/outputs/documentdb.rb ADDED
@@ -0,0 +1,101 @@
+ # encoding: utf-8
+ require "logstash/outputs/base"
+ require "logstash/namespace"
+ require 'time'
+ require 'securerandom'
+ require_relative 'documentdb/client'
+ require_relative 'documentdb/partitioned_coll_client'
+ require_relative 'documentdb/header'
+ require_relative 'documentdb/resource'
+
+
+ class LogStash::Outputs::Documentdb < LogStash::Outputs::Base
+   config_name "documentdb"
+
+   config :docdb_endpoint, :validate => :string, :required => true
+   config :docdb_account_key, :validate => :string, :required => true
+   config :docdb_database, :validate => :string, :required => true
+   config :docdb_collection, :validate => :string, :required => true
+   config :auto_create_database, :validate => :boolean, :default => true
+   config :auto_create_collection, :validate => :boolean, :default => true
+   config :partitioned_collection, :validate => :boolean, :default => false
+   config :partition_key, :validate => :string, :default => nil
+   config :offer_throughput, :validate => :number, :default => AzureDocumentDB::PARTITIONED_COLL_MIN_THROUGHPUT
+
+   public
+   def register
+     ## Configure
+     if @partitioned_collection
+       raise ArgumentError, 'partition_key must be set in partitioned collection mode' if @partition_key.nil? || @partition_key.empty?
+       if (@auto_create_collection &&
+           @offer_throughput < AzureDocumentDB::PARTITIONED_COLL_MIN_THROUGHPUT)
+         raise ArgumentError, sprintf("offer_throughput must be greater than or equal to %s",
+                                      AzureDocumentDB::PARTITIONED_COLL_MIN_THROUGHPUT)
+       end
+     end
+
+     ## Start
+     begin
+       @client = nil
+       if @partitioned_collection
+         @client = AzureDocumentDB::PartitionedCollectionClient.new(@docdb_account_key, @docdb_endpoint)
+       else
+         @client = AzureDocumentDB::Client.new(@docdb_account_key, @docdb_endpoint)
+       end
+
+       # initial operations for the database
+       res = @client.find_databases_by_name(@docdb_database)
+       if( res[:body]["_count"].to_i == 0 )
+         raise RuntimeError, "No database (#{@docdb_database}) exists! Enable auto_create_database or create it yourself" if !@auto_create_database
+         # create a new database since it doesn't exist
+         @client.create_database(@docdb_database)
+       end
+
+       # initial operations for the collection
+       database_resource = @client.get_database_resource(@docdb_database)
+       res = @client.find_collections_by_name(database_resource, @docdb_collection)
+       if( res[:body]["_count"].to_i == 0 )
+         raise "No collection (#{@docdb_collection}) exists! Enable auto_create_collection or create it yourself" if !@auto_create_collection
+         # create a new collection since it doesn't exist
+         if @partitioned_collection
+           partition_key_paths = ["/#{@partition_key}"]
+           @client.create_collection(database_resource,
+                                     @docdb_collection, partition_key_paths, @offer_throughput)
+         else
+           @client.create_collection(database_resource, @docdb_collection)
+         end
+       end
+       @coll_resource = @client.get_collection_resource(database_resource, @docdb_collection)
+
+     rescue Exception => ex
+       @logger.error("Documentdb output plugin's register error: '#{ex}'")
+       exit!
+     end
+   end # def register
+
+   public
+   def receive(event)
+     document = event.to_hash()
+     document['id'] = SecureRandom.uuid
+
+     ## Write the document to DocumentDB
+     unique_doc_identifier = document['id']
+     begin
+       if @partitioned_collection
+         @client.create_document(@coll_resource, unique_doc_identifier, document, @partition_key)
+       else
+         @client.create_document(@coll_resource, unique_doc_identifier, document)
+       end
+     rescue RestClient::ExceptionWithResponse => rcex
+       exdict = JSON.parse(rcex.response)
+       if exdict['code'] == 'Conflict'
+         @logger.error("Duplicate Error: document #{unique_doc_identifier} already exists, data=>" + (document.to_json).to_s)
+       else
+         @logger.error("RestClient Error: '#{rcex.response}', data=>" + (document.to_json).to_s)
+       end
+     rescue => ex
+       @logger.error("UnknownError: '#{ex}', uniqueid=>#{unique_doc_identifier}, data=>" + (document.to_json).to_s)
+     end
+   end # def receive
+
+ end # class LogStash::Outputs::Documentdb
data/lib/logstash/outputs/documentdb/client.rb ADDED
@@ -0,0 +1,167 @@
+ require 'rest-client'
+ require 'json'
+ require_relative 'constants'
+ require_relative 'header'
+ require_relative 'resource'
+
+ module AzureDocumentDB
+
+   class Client
+
+     def initialize (master_key, url_endpoint)
+       @master_key = master_key
+       @url_endpoint = url_endpoint
+       @header = AzureDocumentDB::Header.new(@master_key)
+     end
+
+     def create_database (database_name)
+       url = "#{@url_endpoint}/dbs"
+       custom_headers = {'Content-Type' => 'application/json'}
+       headers = @header.generate('post', AzureDocumentDB::RESOURCE_TYPE_DATABASE, '', custom_headers )
+       body_json = { 'id' => database_name }.to_json
+       res = RestClient.post( url, body_json, headers)
+       JSON.parse(res)
+     end
+
+     def find_databases_by_name (database_name)
+       query_params = []
+       query_text = "SELECT * FROM root r WHERE r.id=@id"
+       query_params.push( {:name=>"@id", :value=> database_name } )
+       url = sprintf("%s/dbs", @url_endpoint )
+       res = _query(AzureDocumentDB::RESOURCE_TYPE_DATABASE, '', url, query_text, query_params)
+       res
+     end
+
+     def get_database_resource (database_name)
+       resource = nil
+       res = find_databases_by_name (database_name)
+       if( res[:body]["_count"].to_i == 0 )
+         p "no #{database_name} database exists"
+         return resource
+       end
+       res[:body]['Databases'].select do |db|
+         if (db['id'] == database_name )
+           resource = AzureDocumentDB::DatabaseResource.new(db['_rid'])
+         end
+       end
+       resource
+     end
+
+     def create_collection(database_resource, collection_name, colls_options={}, custom_headers={} )
+       if !database_resource
+         raise ArgumentError.new 'No database_resource!'
+       end
+       url = sprintf("%s/dbs/%s/colls", @url_endpoint, database_resource.database_rid )
+       custom_headers['Content-Type'] = 'application/json'
+       headers = @header.generate('post',
+                     AzureDocumentDB::RESOURCE_TYPE_COLLECTION,
+                     database_resource.database_rid, custom_headers )
+       body = {'id' => collection_name }
+       colls_options.each{|k, v|
+         if k == 'indexingPolicy' || k == 'partitionKey'
+           body[k] = v
+         end
+       }
+       res = RestClient.post( url, body.to_json, headers)
+       JSON.parse(res)
+     end
+
+     def find_collections_by_name(database_resource, collection_name)
+       if !database_resource
+         raise ArgumentError.new 'No database_resource!'
+       end
+       ret = {}
+       query_params = []
+       query_text = "SELECT * FROM root r WHERE r.id=@id"
+       query_params.push( {:name=>"@id", :value=> collection_name } )
+       url = sprintf("%s/dbs/%s/colls", @url_endpoint, database_resource.database_rid)
+       ret = _query(AzureDocumentDB::RESOURCE_TYPE_COLLECTION,
+                    database_resource.database_rid, url, query_text, query_params)
+       ret
+     end
+
+     def get_collection_resource (database_resource, collection_name)
+       _collection_rid = ''
+       if !database_resource
+         raise ArgumentError.new 'No database_resource!'
+       end
+       res = find_collections_by_name(database_resource, collection_name)
+       res[:body]['DocumentCollections'].select do |col|
+         if (col['id'] == collection_name )
+           _collection_rid = col['_rid']
+         end
+       end
+       if _collection_rid.empty?
+         p "no #{collection_name} collection exists"
+         return nil
+       end
+       AzureDocumentDB::CollectionResource.new(database_resource.database_rid, _collection_rid)
+     end
+
+     def create_document(collection_resource, document_id, document, custom_headers={} )
+       if !collection_resource
+         raise ArgumentError.new 'No collection_resource!'
+       end
+       if document['id'] && document_id != document['id']
+         raise ArgumentError.new "Document id mismatch error (#{document_id})!"
+       end
+       body = { 'id' => document_id }.merge document
+       url = sprintf("%s/dbs/%s/colls/%s/docs",
+                     @url_endpoint, collection_resource.database_rid, collection_resource.collection_rid)
+       custom_headers['Content-Type'] = 'application/json'
+       headers = @header.generate('post', AzureDocumentDB::RESOURCE_TYPE_DOCUMENT,
+                                  collection_resource.collection_rid, custom_headers )
+       res = RestClient.post( url, body.to_json, headers)
+       JSON.parse(res)
+     end
+
+     def find_documents(collection_resource, document_id, custom_headers={})
+       if !collection_resource
+         raise ArgumentError.new 'No collection_resource!'
+       end
+       ret = {}
+       query_params = []
+       query_text = "SELECT * FROM c WHERE c.id=@id"
+       query_params.push( {:name=>"@id", :value=> document_id } )
+       url = sprintf("%s/dbs/%s/colls/%s/docs",
+                     @url_endpoint, collection_resource.database_rid, collection_resource.collection_rid)
+       ret = _query(AzureDocumentDB::RESOURCE_TYPE_DOCUMENT,
+                    collection_resource.collection_rid, url, query_text, query_params, custom_headers)
+       ret
+     end
+
+     def query_documents( collection_resource, query_text, query_params, custom_headers={} )
+       if !collection_resource
+         raise ArgumentError.new 'No collection_resource!'
+       end
+       ret = {}
+       url = sprintf("%s/dbs/%s/colls/%s/docs",
+                     @url_endpoint, collection_resource.database_rid, collection_resource.collection_rid)
+       ret = _query(AzureDocumentDB::RESOURCE_TYPE_DOCUMENT,
+                    collection_resource.collection_rid, url, query_text, query_params, custom_headers)
+       ret
+     end
+
+     protected
+
+     def _query( resource_type, parent_resource_id, url, query_text, query_params, custom_headers={} )
+       query_specific_header = {
+         'x-ms-documentdb-isquery' => 'True',
+         'Content-Type' => 'application/query+json',
+         'Accept' => 'application/json'
+       }
+       query_specific_header.merge! custom_headers
+       headers = @header.generate('post', resource_type, parent_resource_id, query_specific_header)
+       body_json = {
+         :query => query_text,
+         :parameters => query_params
+       }.to_json
+
+       res = RestClient.post( url, body_json, headers)
+       result = {
+         :header => res.headers,
+         :body => JSON.parse(res.body) }
+       return result
+     end
+   end
+ end
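
The client above is self-contained, so it can be exercised outside of Logstash. A minimal sketch, run from the gem root, mirroring what the plugin's register and receive methods do; the endpoint, account key, and the `testdb`/`events` names are placeholders, not values from this gem:

```ruby
require 'securerandom'
require_relative 'lib/logstash/outputs/documentdb/client'

# Placeholders -- substitute a real DocumentDB endpoint and master key.
client = AzureDocumentDB::Client.new('<ACCOUNT KEY>', 'https://<YOUR ACCOUNT>.documents.azure.com:443/')

# Create the database and collection if they are missing ('testdb' and 'events'
# are illustrative names).
client.create_database('testdb') if client.find_databases_by_name('testdb')[:body]['_count'].to_i == 0
db_resource = client.get_database_resource('testdb')
client.create_collection(db_resource, 'events') if client.find_collections_by_name(db_resource, 'events')[:body]['_count'].to_i == 0
coll_resource = client.get_collection_resource(db_resource, 'events')

# Write one document; the id must be unique within the collection.
doc_id = SecureRandom.uuid
client.create_document(coll_resource, doc_id, { 'message' => 'hello documentdb' })

# Read it back; the query response body is the parsed REST API JSON.
p client.find_documents(coll_resource, doc_id)[:body]
```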
data/lib/logstash/outputs/documentdb/constants.rb ADDED
@@ -0,0 +1,10 @@
+ module AzureDocumentDB
+   API_VERSION = '2015-12-16'.freeze
+   RESOURCE_TYPE_DATABASE = 'dbs'.freeze
+   RESOURCE_TYPE_COLLECTION = 'colls'.freeze
+   RESOURCE_TYPE_DOCUMENT = 'docs'.freeze
+   AUTH_TOKEN_VERSION = '1.0'.freeze
+   AUTH_TOKEN_TYPE_MASTER = 'master'.freeze
+   AUTH_TOKEN_TYPE_RESOURCE = 'resource'.freeze
+   PARTITIONED_COLL_MIN_THROUGHPUT = 10100
+ end
data/lib/logstash/outputs/documentdb/header.rb ADDED
@@ -0,0 +1,55 @@
+ require 'time'
+ require 'openssl'
+ require 'base64'
+ require 'erb'
+
+ module AzureDocumentDB
+
+   class Header
+
+     def initialize (master_key)
+       @master_key = master_key
+     end
+
+     def generate (verb, resource_type, parent_resource_id, api_specific_headers = {} )
+       headers = {}
+       utc_date = get_httpdate()
+       auth_token = generate_auth_token(verb, resource_type, parent_resource_id, utc_date )
+       default_headers = {
+         'x-ms-version' => AzureDocumentDB::API_VERSION,
+         'x-ms-date' => utc_date,
+         'authorization' => auth_token
+       }.freeze
+       headers.merge!(default_headers)
+       headers.merge(api_specific_headers)
+     end
+
+     private
+
+     def generate_auth_token ( verb, resource_type, resource_id, utc_date)
+       payload = sprintf("%s\n%s\n%s\n%s\n%s\n",
+                         verb,
+                         resource_type,
+                         resource_id,
+                         utc_date,
+                         "" )
+       sig = hmac_base64encode(payload)
+
+       ERB::Util.url_encode sprintf("type=%s&ver=%s&sig=%s",
+                                    AzureDocumentDB::AUTH_TOKEN_TYPE_MASTER,
+                                    AzureDocumentDB::AUTH_TOKEN_VERSION,
+                                    sig )
+     end
+
+     def get_httpdate
+       Time.now.httpdate
+     end
+
+     def hmac_base64encode( text )
+       key = Base64.urlsafe_decode64 @master_key
+       hmac = OpenSSL::HMAC.digest('sha256', key, text.downcase)
+       Base64.encode64(hmac).strip
+     end
+
+   end
+ end
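
The Header class implements DocumentDB's master-key authentication as shown above: the canonical string of verb, resource type, resource id, and RFC 1123 date is lowercased, signed with HMAC-SHA256 using the base64-decoded account key, and URL-encoded into a token of the form type=master&ver=1.0&sig=.... A minimal sketch of generating request headers, run from the gem root; the key is a dummy value and the collection _rid is a placeholder:

```ruby
require_relative 'lib/logstash/outputs/documentdb/constants'
require_relative 'lib/logstash/outputs/documentdb/header'

# A base64-encoded dummy master key standing in for a real account key.
dummy_key = Base64.urlsafe_encode64('not a real key')

header = AzureDocumentDB::Header.new(dummy_key)

# Headers for a document insert (POST on the 'docs' resource type of a collection).
# The returned hash carries x-ms-version, x-ms-date, and the signed authorization
# token alongside the caller-supplied Content-Type.
headers = header.generate('post', AzureDocumentDB::RESOURCE_TYPE_DOCUMENT,
                          '<collection _rid>', {'Content-Type' => 'application/json'})
p headers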
data/lib/logstash/outputs/documentdb/partitioned_coll_client.rb ADDED
@@ -0,0 +1,62 @@
+ require 'rest-client'
+ require 'json'
+ require_relative 'constants'
+ require_relative 'header'
+ require_relative 'resource'
+
+ module AzureDocumentDB
+
+   class PartitionedCollectionClient < Client
+
+     def create_collection(database_resource, collection_name,
+                           partition_key_paths, offer_throughput = AzureDocumentDB::PARTITIONED_COLL_MIN_THROUGHPUT )
+
+       if (offer_throughput < AzureDocumentDB::PARTITIONED_COLL_MIN_THROUGHPUT)
+         raise ArgumentError.new sprintf("Offer throughput needs to be at least %d!",
+                                         AzureDocumentDB::PARTITIONED_COLL_MIN_THROUGHPUT)
+       end
+       if (partition_key_paths.length < 1 )
+         raise ArgumentError.new "No PartitionKey paths!"
+       end
+       colls_options = {
+         'indexingPolicy' => { 'indexingMode' => "consistent", 'automatic' => true },
+         'partitionKey' => { "paths" => partition_key_paths, "kind" => "Hash" }
+       }
+       custom_headers = {'x-ms-offer-throughput' => offer_throughput }
+       super(database_resource, collection_name, colls_options, custom_headers)
+     end
+
+
+     def create_document(collection_resource, document_id, document, partitioned_key )
+       if partitioned_key.empty?
+         raise ArgumentError.new "No partitioned key!"
+       end
+       if !document.key?(partitioned_key)
+         raise ArgumentError.new "No partitioned key in your document!"
+       end
+       partitioned_key_value = document[partitioned_key]
+       custom_headers = {
+         'x-ms-documentdb-partitionkey' => "[\"#{partitioned_key_value}\"]"
+       }
+       super(collection_resource, document_id, document, custom_headers)
+     end
+
+     def find_documents(collection_resource, document_id,
+                        partitioned_key, partitioned_key_value, custom_headers={})
+       if !collection_resource
+         raise ArgumentError.new "No collection_resource!"
+       end
+       ret = {}
+       query_params = []
+       query_text = sprintf("SELECT * FROM c WHERE c.id=@id AND c.%s=@value", partitioned_key)
+       query_params.push( {:name=>"@id", :value=> document_id } )
+       query_params.push( {:name=>"@value", :value=> partitioned_key_value } )
+       url = sprintf("%s/dbs/%s/colls/%s/docs",
+                     @url_endpoint, collection_resource.database_rid, collection_resource.collection_rid)
+       ret = _query(AzureDocumentDB::RESOURCE_TYPE_DOCUMENT,
+                    collection_resource.collection_rid, url, query_text, query_params, custom_headers)
+       ret
+     end
+
+   end
+ end
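
PartitionedCollectionClient layers partition-key handling on top of Client: create_collection sends the partition key paths plus an x-ms-offer-throughput header, and create_document derives the x-ms-documentdb-partitionkey header from the document's own partition key attribute. A minimal sketch, run from the gem root, assuming the database already exists and the collection does not; the endpoint, key, and the `testdb`/`apache_access`/`host` names are placeholders:

```ruby
require 'securerandom'
require_relative 'lib/logstash/outputs/documentdb/client'
require_relative 'lib/logstash/outputs/documentdb/partitioned_coll_client'

# Placeholders -- substitute a real endpoint and master key.
client = AzureDocumentDB::PartitionedCollectionClient.new('<ACCOUNT KEY>',
                                                          'https://<YOUR ACCOUNT>.documents.azure.com:443/')

db_resource = client.get_database_resource('testdb')

# Hash-partitioned collection on /host with the minimum allowed throughput.
client.create_collection(db_resource, 'apache_access', ['/host'], 10100)
coll_resource = client.get_collection_resource(db_resource, 'apache_access')

# The document must contain the partition key attribute ('host' here),
# otherwise create_document raises an ArgumentError.
doc = { 'host' => 'yoichitest01', 'message' => 'hello partitioned collection' }
client.create_document(coll_resource, SecureRandom.uuid, doc, 'host')
```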
data/lib/logstash/outputs/documentdb/resource.rb ADDED
@@ -0,0 +1,40 @@
+ module AzureDocumentDB
+
+   class Resource
+     def initialize
+       @r = {}
+     end
+     protected
+     attr_accessor :r
+   end
+
+   class DatabaseResource < Resource
+
+     def initialize (database_rid)
+       super()
+       @r['database_rid'] = database_rid
+     end
+
+     def database_rid
+       @r['database_rid']
+     end
+   end
+
+   class CollectionResource < Resource
+
+     def initialize (database_rid, collection_rid)
+       super()
+       @r['database_rid'] = database_rid
+       @r['collection_rid'] = collection_rid
+     end
+
+     def database_rid
+       @r['database_rid']
+     end
+
+     def collection_rid
+       @r['collection_rid']
+     end
+   end
+
+ end
data/logstash-output-documentdb.gemspec ADDED
@@ -0,0 +1,25 @@
+ Gem::Specification.new do |s|
+   s.name = 'logstash-output-documentdb'
+   s.version = File.read("VERSION").strip
+   s.authors = ["Yoichi Kawasaki"]
+   s.email = "yoichi.kawasaki@outlook.com"
+   s.summary = %q{logstash output plugin to store events into Azure DocumentDB}
+   s.description = s.summary
+   s.homepage = "http://github.com/yokawasa/logstash-output-documentdb"
+   s.licenses = ["Apache License (2.0)"]
+   s.require_paths = ["lib"]
+
+   # Files
+   s.files = Dir['lib/**/*','spec/**/*','vendor/**/*','*.gemspec','*.md','CONTRIBUTORS','Gemfile','LICENSE','NOTICE.TXT', 'VERSION']
+   # Tests
+   s.test_files = s.files.grep(%r{^(test|spec|features)/})
+
+   # Special flag to let us know this is actually a logstash plugin
+   s.metadata = { "logstash_plugin" => "true", "logstash_group" => "output" }
+
+   # Gem dependencies
+   s.add_runtime_dependency "rest-client"
+   s.add_runtime_dependency "logstash-core", ">= 2.0.0", "< 3.0.0"
+   s.add_runtime_dependency "logstash-codec-plain"
+   s.add_development_dependency "logstash-devutils"
+ end
data/spec/outputs/documentdb_spec.rb ADDED
@@ -0,0 +1,41 @@
+ # encoding: utf-8
+ require "logstash/devutils/rspec/spec_helper"
+ require "logstash/outputs/documentdb"
+ require "logstash/codecs/plain"
+ require "logstash/event"
+
+ describe LogStash::Outputs::Documentdb do
+
+   let(:docdb_endpoint) { 'https://<YOUR ACCOUNT>.documents.azure.com:443/' }
+   let(:docdb_account_key) { '<ACCOUNT KEY>' }
+   let(:docdb_database) { '<DATABASE NAME>' }
+   let(:docdb_collection) { '<COLLECTION NAME>' }
+   let(:auto_create_database) { true }
+   let(:auto_create_collection) { true }
+
+   let(:docdb_config) {
+     {
+       "docdb_endpoint" => docdb_endpoint,
+       "docdb_account_key" => docdb_account_key,
+       "docdb_database" => docdb_database,
+       "docdb_collection" => docdb_collection,
+       "auto_create_database" => auto_create_database,
+       "auto_create_collection" => auto_create_collection,
+     }
+   }
+
+   let(:docdb_output) { LogStash::Outputs::Documentdb.new(docdb_config) }
+
+   before do
+     docdb_output.register
+   end
+
+   describe "#receive" do
+     it "should successfully send the event to documentdb" do
+       properties = { "a" => 1, "b" => 2, "c" => 3 }
+       event = LogStash::Event.new(properties)
+       expect {docdb_output.receive(event)}.to_not raise_error
+     end
+   end
+
+ end
metadata ADDED
@@ -0,0 +1,122 @@
+ --- !ruby/object:Gem::Specification
+ name: logstash-output-documentdb
+ version: !ruby/object:Gem::Version
+   version: 0.1.1
+ platform: ruby
+ authors:
+ - Yoichi Kawasaki
+ autorequire:
+ bindir: bin
+ cert_chain: []
+ date: 2016-12-29 00:00:00.000000000 Z
+ dependencies:
+ - !ruby/object:Gem::Dependency
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '0'
+   name: rest-client
+   prerelease: false
+   type: :runtime
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '0'
+ - !ruby/object:Gem::Dependency
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: 2.0.0
+     - - "<"
+       - !ruby/object:Gem::Version
+         version: 3.0.0
+   name: logstash-core
+   prerelease: false
+   type: :runtime
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: 2.0.0
+     - - "<"
+       - !ruby/object:Gem::Version
+         version: 3.0.0
+ - !ruby/object:Gem::Dependency
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '0'
+   name: logstash-codec-plain
+   prerelease: false
+   type: :runtime
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '0'
+ - !ruby/object:Gem::Dependency
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '0'
+   name: logstash-devutils
+   prerelease: false
+   type: :development
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '0'
+ description: logstash output plugin to store events into Azure DocumentDB
+ email: yoichi.kawasaki@outlook.com
+ executables: []
+ extensions: []
+ extra_rdoc_files: []
+ files:
+ - CHANGELOG.md
+ - CONTRIBUTORS
+ - Gemfile
+ - LICENSE
+ - README.md
+ - VERSION
+ - lib/logstash/outputs/documentdb.rb
+ - lib/logstash/outputs/documentdb/client.rb
+ - lib/logstash/outputs/documentdb/constants.rb
+ - lib/logstash/outputs/documentdb/header.rb
+ - lib/logstash/outputs/documentdb/partitioned_coll_client.rb
+ - lib/logstash/outputs/documentdb/resource.rb
+ - logstash-output-documentdb.gemspec
+ - spec/outputs/documentdb_spec.rb
+ homepage: http://github.com/yokawasa/logstash-output-documentdb
+ licenses:
+ - Apache License (2.0)
+ metadata:
+   logstash_plugin: 'true'
+   logstash_group: output
+ post_install_message:
+ rdoc_options: []
+ require_paths:
+ - lib
+ required_ruby_version: !ruby/object:Gem::Requirement
+   requirements:
+   - - ">="
+     - !ruby/object:Gem::Version
+       version: '0'
+ required_rubygems_version: !ruby/object:Gem::Requirement
+   requirements:
+   - - ">="
+     - !ruby/object:Gem::Version
+       version: '0'
+ requirements: []
+ rubyforge_project:
+ rubygems_version: 2.4.8
+ signing_key:
+ specification_version: 4
+ summary: logstash output plugin to store events into Azure DocumentDB
+ test_files:
+ - spec/outputs/documentdb_spec.rb