logstash-input-bitbucket 0.1.0
- checksums.yaml +7 -0
- data/CHANGELOG.md +2 -0
- data/CONTRIBUTORS +10 -0
- data/DEVELOPER.md +2 -0
- data/Gemfile +10 -0
- data/LICENSE +11 -0
- data/README.md +86 -0
- data/lib/logstash/inputs/bitbucket.rb +331 -0
- data/lib/response.rb +10 -0
- data/logstash-input-bitbucket.gemspec +28 -0
- data/spec/inputs/bitbucket_spec.rb +73 -0
- metadata +140 -0
checksums.yaml
ADDED
@@ -0,0 +1,7 @@
+---
+SHA256:
+  metadata.gz: 74e860e2f22e5dd067b1cae1a7e5d41f1b7e4e67e24db12fc00837f4a92b515e
+  data.tar.gz: e4a6ca4ab780c349e80189e569b4ad7608a83636c9f5743cf3f77030fba4c489
+SHA512:
+  metadata.gz: c0810e7944c85dc6408b729c2466875bd9eb5330b852c5d1e0d5720177992cee2c8a1329c4ec21bfb79af7e32447239cc5124ed540899239107c5d3fdc847f2f
+  data.tar.gz: bb3d5260e75ab91ad17b56f1ba29a91d237fe5c661caa639b66135bcb61e1a69a7610bfdd42b1dc13ce31446d265e62199dc854fb28b5fdc04fcf63a05bb042d
data/CHANGELOG.md
ADDED
data/CONTRIBUTORS
ADDED
@@ -0,0 +1,10 @@
+The following is a list of people who have contributed ideas, code, bug
+reports, or in general have helped logstash along its way.
+
+Contributors:
+* -
+
+Note: If you've sent us patches, bug reports, or otherwise contributed to
+Logstash, and you aren't on the list above and want to be, please let us know
+and we'll make sure you're here. Contributions from folks like you are what make
+open source awesome.
data/DEVELOPER.md
ADDED
data/Gemfile
ADDED
@@ -0,0 +1,10 @@
+source 'https://rubygems.org'
+gemspec
+
+logstash_path = ENV["LOGSTASH_PATH"] || "../../logstash"
+use_logstash_source = ENV["LOGSTASH_SOURCE"] && ENV["LOGSTASH_SOURCE"].to_s == "1"
+
+if Dir.exist?(logstash_path) && use_logstash_source
+  gem 'logstash-core', :path => "#{logstash_path}/logstash-core"
+  gem 'logstash-core-plugin-api', :path => "#{logstash_path}/logstash-core-plugin-api"
+end
data/LICENSE
ADDED
@@ -0,0 +1,11 @@
+Licensed under the Apache License, Version 2.0 (the "License");
+you may not use this file except in compliance with the License.
+You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
data/README.md
ADDED
@@ -0,0 +1,86 @@
+# Logstash Plugin
+
+This is a plugin for [Logstash](https://github.com/elastic/logstash).
+
+It is fully free and fully open source. The license is Apache 2.0, meaning you are pretty much free to use it however you want in whatever way.
+
+## Documentation
+
+Logstash provides infrastructure to automatically generate documentation for this plugin. We use the asciidoc format to write documentation, so any comments in the source code will be first converted into asciidoc and then into html. All plugin documentation is placed under one [central location](http://www.elastic.co/guide/en/logstash/current/).
+
+- For formatting code or config examples, you can use the asciidoc `[source,ruby]` directive
+- For more asciidoc formatting tips, see the excellent reference here https://github.com/elastic/docs#asciidoc-guide
+
+## Need Help?
+
+Need help? Try #logstash on freenode IRC or the https://discuss.elastic.co/c/logstash discussion forum.
+
+## Developing
+
+### 1. Plugin Development and Testing
+
+#### Code
+- To get started, you'll need JRuby with the Bundler gem installed.
+
+- Create a new plugin or clone an existing one from the GitHub [logstash-plugins](https://github.com/logstash-plugins) organization. We also provide [example plugins](https://github.com/logstash-plugins?query=example).
+
+- Install dependencies
+```sh
+bundle install
+```
+
+#### Test
+
+- Update your dependencies
+
+```sh
+bundle install
+```
+
+- Run tests
+
+```sh
+bundle exec rspec
+```
+
+### 2. Running your unpublished Plugin in Logstash
+
+#### 2.1 Run in a local Logstash clone
+
+- Edit Logstash `Gemfile` and add the local plugin path, for example:
+```ruby
+gem "logstash-filter-awesome", :path => "/your/local/logstash-filter-awesome"
+```
+- Install plugin
+```sh
+bin/logstash-plugin install --no-verify
+```
+- Run Logstash with your plugin
+```sh
+bin/logstash -e 'filter {awesome {}}'
+```
+At this point any modifications to the plugin code will be applied to this local Logstash setup. After modifying the plugin, simply rerun Logstash.
+
+#### 2.2 Run in an installed Logstash
+
+You can use the same **2.1** method to run your plugin in an installed Logstash by editing its `Gemfile` and pointing the `:path` to your local plugin development directory, or you can build the gem and install it using:
+
+- Build your plugin gem
+```sh
+gem build logstash-filter-awesome.gemspec
+```
+- Install the plugin from the Logstash home
+```sh
+bin/logstash-plugin install /your/local/plugin/logstash-filter-awesome.gem
+```
+- Start Logstash and proceed to test the plugin
+
+## Contributing
+
+All contributions are welcome: ideas, patches, documentation, bug reports, complaints, and even something you drew up on a napkin.
+
+Programming is not a required skill. Whatever you've seen about open source and maintainers or community members saying "send patches or die" - you will not see that here.
+
+It is more important to the community that you are able to contribute.
+
+For more information about contributing, see the [CONTRIBUTING](https://github.com/elastic/logstash/blob/master/CONTRIBUTING.md) file.
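The README template above is generic boilerplate; based on the options this plugin actually declares in `lib/logstash/inputs/bitbucket.rb` (`schedule`, `scheme`, `hostname`, `port`, `token`), a minimal pipeline configuration might look like the following sketch. The hostname, port, and token values are placeholders, not defaults from this gem:

```
input {
  bitbucket {
    schedule => { "every" => "1h" }
    scheme   => "http"
    hostname => "bitbucket.example.com"
    port     => 7990
    token    => "${BITBUCKET_TOKEN}"
  }
}
```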
data/lib/logstash/inputs/bitbucket.rb
ADDED
@@ -0,0 +1,331 @@
+# encoding: utf-8
+require "logstash/inputs/base"
+require 'logstash/plugin_mixins/http_client'
+require 'logstash/event'
+require 'logstash/json'
+require "stud/interval"
+require "socket" # for Socket.gethostname
+require "rufus/scheduler"
+require "json"
+require "ostruct"
+
+
+# DISCLAIMER: Functions for this plugin are made public for the sake of creating concise unit tests
+
+class LogStash::Inputs::Bitbucket < LogStash::Inputs::Base
+  include LogStash::PluginMixins::HttpClient
+
+  config_name "bitbucket"
+
+  # If undefined, Logstash will complain, even if codec is unused.
+  default :codec, "json"
+
+  # Schedule of when to periodically poll from the urls
+  # Format: A hash with
+  #   + key: "cron" | "every" | "in" | "at"
+  #   + value: string
+  # Examples:
+  #   a) { "every" => "1h" }
+  #   b) { "cron" => "* * * * * UTC" }
+  # See: rufus/scheduler for details about different schedule options and value string format
+  config :schedule, :validate => :hash, :required => true
+
+  config :scheme, :validate => :string, :default => 'http'
+
+  config :hostname, :validate => :string, :default => 'localhost'
+
+  config :port, :validate => :number, :default => 80
+
+  config :token, :validate => :string, :required => true
+
+  public
+
+  Schedule_types = %w(cron every at in)
+
+  def register
+    @host = Socket.gethostname.force_encoding(Encoding::UTF_8)
+    @authorization = "Bearer #{@token}"
+    @logger.info('Register BitBucket Input', :schedule => @schedule, :hostname => @hostname, :port => @port)
+  end
+
+  def run(queue)
+    @logger.info('RUN')
+    # The schedule hash must contain exactly one of the allowed keys
+    msg_invalid_schedule = "Invalid config. schedule hash must contain " +
+      "exactly one of the following keys - cron, at, every or in"
+    raise LogStash::ConfigurationError, msg_invalid_schedule if @schedule.keys.length != 1
+    schedule_type = @schedule.keys.first
+    schedule_value = @schedule[schedule_type]
+    raise LogStash::ConfigurationError, msg_invalid_schedule unless Schedule_types.include?(schedule_type)
+
+    @scheduler = Rufus::Scheduler.new(:max_work_threads => 1)
+    # As of v3.0.9, :first_in => :now doesn't work. Use the following workaround instead
+    opts = schedule_type == "every" ? {:first_in => 0.01} : {}
+    @scheduler.send(schedule_type, schedule_value, opts) { run_once(queue) }
+    @scheduler.join
+    @logger.info('RUN COMPLETE')
+  end
+
+  def run_once(queue)
+    @logger.info('RUN ONCE')
+
+    request_async(
+      queue,
+      'rest/api/1.0/projects',
+      {},
+      {},
+      'handle_projects_response')
+
+    # request_async(
+    #   queue,
+    #   "rest/api/1.0/projects/%{project}/repos",
+    #   {:project => 'SOCK', :start => 0},
+    #   {:headers => {'Authorization' => @authorization}},
+    #   'handle_repos_response')
+
+    client.execute!
+  end
+
+  def request_async(queue, path, parameters, request_options, callback)
+    started = Time.now
+
+    method = parameters[:method] ? parameters.delete(:method) : :get
+
+    uri = "#{@scheme}://#{@hostname}/#{path}" % parameters
+
+    request_options[:headers] = {'Authorization' => @authorization}
+
+    # @logger.info("Fetching URL", :method => method, :request => uri)
+
+    client.parallel.send(method, uri, request_options).
+      on_success { |response| self.send(callback, queue, uri, parameters, response, Time.now - started) }.
+      on_failure { |exception|
+        handle_failure(queue, uri, parameters, exception, Time.now - started)
+      }
+  end
+
+  def handle_projects_response(queue, uri, parameters, response, execution_time)
+    # Decode JSON
+    body = JSON.parse(response.body)
+
+    #@logger.info("Handle Projects Response", :uri => uri, :start => body['start'], :size => body['size'])
+    #@logger.info("Response Body", :body => response)
+
+    request_count = 0
+
+    # Fetch additional project pages
+    unless body['isLastPage']
+      request_async(
+        queue,
+        "rest/api/1.0/projects",
+        {},
+        {:query => {'start' => body['nextPageStart']}},
+        'handle_projects_response'
+      )
+
+      client.execute!
+    end
+
+    # Iterate over each project
+    body['values'].each do |project|
+      #@logger.info("Add project", :project => project['key'])
+
+      # Send get repos request
+      request_async(
+        queue,
+        "rest/api/1.0/projects/%{project}/repos",
+        {:project => project['key']},
+        {},
+        'handle_repos_response')
+
+      request_count += 1
+
+      if request_count > 1
+        request_count = 0
+        client.execute!
+      end
+
+      # Push project event into queue
+      event = LogStash::Event.new(project)
+      event.set('[@metadata][index]', 'project')
+      event.set('[@metadata][id]', project['id'])
+      queue << event
+    end
+
+    if request_count > 0
+      # Send HTTP requests
+      client.execute!
+    end
+  end
+
+  # Process response from get repos API request
+  def handle_repos_response(queue, uri, parameters, response, execution_time)
+    # Decode JSON
+    body = JSON.parse(response.body)
+
+    #@logger.info("Handle Repos Response", :uri => uri, :project => parameters[:project], :start => body['start'], :size => body['size'])
+
+    request_count = 0
+
+    # Fetch additional repo pages
+    unless body['isLastPage']
+      request_async(
+        queue,
+        "rest/api/1.0/projects/%{project}/repos",
+        {:project => parameters[:project]},
+        {:query => {'start' => body['nextPageStart']}},
+        'handle_repos_response'
+      )
+
+      client.execute!
+    end
+
+    # Iterate over each repo
+    body['values'].each { |repo|
+      #@logger.info("Add repo", :project => parameters[:project], :repo => repo['slug'])
+
+      # Send get pull requests request
+      request_async(
+        queue,
+        "rest/api/1.0/projects/%{project}/repos/%{repo}/pull-requests",
+        {:project => parameters[:project], :repo => repo['slug']},
+        {:query => {'state' => 'ALL'}},
+        'handle_pull_requests_response')
+
+      # Send get branches request
+      request_async(
+        queue,
+        "rest/api/1.0/projects/%{project}/repos/%{repo}/branches",
+        {:project => parameters[:project], :repo => repo['slug']},
+        {:query => {'state' => 'ALL'}},
+        'handle_branch_response')
+
+      # Send get commits request
+      request_async(
+        queue,
+        "rest/api/1.0/projects/%{project}/repos/%{repo}/commits",
+        {:project => parameters[:project], :repo => repo['slug']},
+        {:query => {'state' => 'ALL'}},
+        'handle_commits_response')
+
+      request_count += 1
+
+      if request_count > 1
+        request_count = 0
+        client.execute!
+      end
+
+      # Push repo event into queue
+      event = LogStash::Event.new(repo)
+      event.set("[@metadata][index]", "repo")
+      event.set("[@metadata][id]", repo['id'])
+      queue << event
+    }
+
+    if request_count > 0
+      # Send HTTP requests
+      client.execute!
+    end
+  end
+
+  def handle_pull_requests_response(queue, uri, parameters, response, execution_time)
+    # Decode JSON
+    body = JSON.parse(response.body)
+
+    #@logger.info("Handle Pull Requests Response", :uri => uri, :project => parameters[:project], :repo => parameters[:repo], :start => body['start'], :size => body['size'])
+
+    # Fetch additional pull request pages
+    unless body['isLastPage']
+      request_async(
+        queue,
+        "rest/api/1.0/projects/%{project}/repos/%{repo}/pull-requests",
+        {:project => parameters[:project], :repo => parameters[:repo]},
+        {:query => {'state' => 'ALL', 'start' => body['nextPageStart']}},
+        'handle_pull_requests_response')
+
+      # Send HTTP requests
+      client.execute!
+    end
+
+    # Iterate over each pull request
+    body['values'].each { |pull_request|
+      #@logger.info("Add Pull Request", :project => parameters[:project], :repo => parameters[:repo], :pull_request => pull_request['title'])
+
+      # Push pull request event into queue
+      event = LogStash::Event.new(pull_request)
+      event.set("[@metadata][index]", "pull_request")
+      event.set("[@metadata][id]", "#{parameters[:project]}-#{parameters[:repo]}-#{pull_request['id']}")
+      queue << event
+    }
+  end
+
+  def handle_branch_response(queue, uri, parameters, response, execution_time)
+    body = JSON.parse(response.body)
+
+    #@logger.info("Handle Branch Response", :uri => uri, :project => parameters[:project], :repo => parameters[:repo], :start => body['start'], :size => body['size'])
+
+    # Fetch additional branch pages
+    unless body['isLastPage']
+      request_async(
+        queue,
+        "rest/api/1.0/projects/%{project}/repos/%{repo}/branches",
+        {:project => parameters[:project], :repo => parameters[:repo]},
+        {:query => {'details' => 'true', 'state' => 'ALL', 'start' => body['nextPageStart']}},
+        'handle_branch_response')
+
+      # Send HTTP requests
+      client.execute!
+    end
+
+    # Iterate over each branch
+    body['values'].each { |branch|
+      #@logger.info("Add Branch Request", :project => parameters[:project], :repo => parameters[:repo], :branch => branch['id'])
+
+      # Push branch event into queue
+      event = LogStash::Event.new(branch)
+      event.set("[@metadata][index]", "branch")
+      event.set("[@metadata][id]", "#{parameters[:project]}-#{parameters[:repo]}-#{branch['id']}")
+      queue << event
+    }
+  end
+
+  def handle_commits_response(queue, uri, parameters, response, execution_time)
+    body = JSON.parse(response.body)
+
+    @logger.info("Handle Commits Response", :uri => uri, :project => parameters[:project], :repo => parameters[:repo], :start => body['start'], :size => body['size'])
+
+    # Fetch additional commit pages
+    unless body['isLastPage']
+      request_async(
+        queue,
+        "rest/api/1.0/projects/%{project}/repos/%{repo}/commits",
+        {:project => parameters[:project], :repo => parameters[:repo]},
+        {:query => {'details' => 'true', 'state' => 'ALL', 'start' => body['nextPageStart']}},
+        'handle_commits_response')
+
+      # Send HTTP requests
+      client.execute!
+    end
+
+    # Iterate over each commit
+    body['values'].each { |commit|
+      @logger.info("Add Commit Request", :project => parameters[:project], :repo => parameters[:repo], :commit => commit['id'])
+
+      # Push commit event into queue
+      event = LogStash::Event.new(commit)
+      event.set("[@metadata][index]", "commit")
+      event.set("[@metadata][id]", "#{parameters[:project]}-#{parameters[:repo]}-#{commit['id']}")
+      queue << event
+    }
+  end
+
+  def handle_failure(queue, path, parameters, exception, execution_time)
+    @logger.error('HTTP Request failed', :path => path, :parameters => parameters, :exception => exception, :backtrace => exception.backtrace)
+  end
+
+  def stop
+    # nothing to do in this case so it is not necessary to define stop
+    # examples of common "stop" tasks:
+    #  * close sockets (unblocking blocking reads/accepts)
+    #  * cleanup temporary files
+    #  * terminate spawned threads
+  end
+
+end # class LogStash::Inputs::Bitbucket
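Every handler above follows the same Bitbucket Server paging pattern: read the page's `values`, and if `isLastPage` is false, re-request with `start` set to `nextPageStart`. A standalone sketch of that loop, using a hypothetical canned `fetch_page` in place of the plugin's HTTP client:

```ruby
require 'json'

# Hypothetical stand-in for the HTTP client: canned page bodies keyed by start offset,
# mimicking Bitbucket Server's paged-response envelope.
PAGES = {
  0 => { 'values' => %w[a b], 'isLastPage' => false, 'nextPageStart' => 2 },
  2 => { 'values' => %w[c],   'isLastPage' => true }
}.freeze

def fetch_page(start)
  # Round-trip through JSON to mimic parsing a real response body
  JSON.parse(JSON.generate(PAGES.fetch(start)))
end

# Collect values across all pages, following nextPageStart until the last page
def collect_all(start = 0)
  values = []
  loop do
    body = fetch_page(start)
    values.concat(body['values'])
    break if body['isLastPage']
    start = body['nextPageStart']
  end
  values
end
```

The plugin does the same thing asynchronously: instead of looping, each handler re-enqueues a request for the next page with itself as the callback.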
data/lib/response.rb
ADDED
data/logstash-input-bitbucket.gemspec
ADDED
@@ -0,0 +1,28 @@
+Gem::Specification.new do |s|
+  s.name          = 'logstash-input-bitbucket'
+  s.version       = '0.1.0'
+  s.licenses      = ['Apache-2.0']
+  s.summary       = 'BitBucket input plugin'
+  s.description   = 'Import data from BitBucket server to aggregate analytic data'
+  s.homepage      = 'http://liatr.io'
+  s.authors       = ['Chris Schreiber']
+  s.email         = 'chriss@liatrio.com'
+  s.require_paths = ['lib']
+
+  # Files
+  s.files = Dir['lib/**/*','spec/**/*','vendor/**/*','*.gemspec','*.md','CONTRIBUTORS','Gemfile','LICENSE','NOTICE.TXT']
+  # Tests
+  s.test_files = s.files.grep(%r{^(test|spec|features)/})
+
+  # Special flag to let us know this is actually a logstash plugin
+  s.metadata = { "logstash_plugin" => "true", "logstash_group" => "input" }
+
+  # Gem dependencies
+  s.add_runtime_dependency "logstash-core-plugin-api", "~> 2.0"
+  s.add_runtime_dependency 'logstash-mixin-http_client', "~> 6.0"
+  s.add_runtime_dependency 'rufus-scheduler', "~> 3.0"
+  s.add_runtime_dependency 'stud', '~> 0.0'
+
+  s.add_development_dependency 'logstash-devutils', '~> 1.3'
+  s.add_development_dependency 'logstash-codec-json', '~> 3.0'
+end
data/spec/inputs/bitbucket_spec.rb
ADDED
@@ -0,0 +1,73 @@
+# encoding: utf-8
+require "logstash/devutils/rspec/spec_helper"
+require "logstash/inputs/bitbucket"
+require "response"
+
+describe LogStash::Inputs::Bitbucket do
+  # Create a new instance of the Bitbucket class named subject with the following configuration
+  subject { LogStash::Inputs::Bitbucket.new(config) }
+  let(:config) do
+    {
+      'schedule' => { "foo" => "bar", "tak" => "jon" },
+      'scheme' => 'foo',
+      'hostname' => 'foo',
+      'port' => '12345',
+      'token' => 'foo'
+    }
+  end
+
+  # Ensures that we initialized all required configuration vars
+  describe "bitbucket authorization" do
+    it "does not raise an error" do
+      expect {subject.register}.to_not raise_error
+    end
+  end
+
+  # Ruby stdlib Queue (FIFO)
+  let (:queue) { Queue.new }
+
+  # Calls request_async with bad input, expecting it to fail;
+  # specifically, the path string should have '/' between folder names
+  describe "handling API call with bad input" do
+    it "handles generic input" do
+      path = 'invalid garbage input'
+      parameters = {}
+      request_options = {}
+      callback = double("test data")
+      expect {subject.request_async(
+        queue, path, parameters, request_options, callback
+      )}.to raise_error
+    end
+  end
+
+  # Inverse of the previous test: passes in good data and succeeds
+  describe "handling API call with good input" do
+    it "handles generic input" do
+      path = 'valid/garbage/path'
+      parameters = {}
+      request_options = {}
+      callback = double("test data")
+      expect {subject.request_async(
+        queue, path, parameters, request_options, callback
+      )}.to_not raise_error
+    end
+  end
+
+  #let(:response) { IO.read("response.json") }
+  #describe "parsing data with real input" do
+  #  it "iterates over the data" do
+  #
+  #    response = Response.new(IO.read("response.json"))
+  #
+  #    uri = double("foo")
+  #    execution_time = double("bar")
+  #    parameters = {}
+  #    #def handle_pull_requests_response(queue, uri, parameters, response, execution_time)
+  #    expect {subject.handle_repos_response(
+  #      queue, uri, parameters, response, execution_time
+  #    )}.to_not raise_error
+  #  end
+  #end
+  #handle_projects_response
+end
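The spec above hands `register` an invalid `schedule` hash but never exercises the validation that `run` performs on it. That check can be sketched in isolation; this is a hypothetical standalone version using `ArgumentError` in place of `LogStash::ConfigurationError`:

```ruby
# The schedule hash must contain exactly one of the allowed keys,
# mirroring the check at the top of LogStash::Inputs::Bitbucket#run.
SCHEDULE_TYPES = %w(cron every at in).freeze

def validate_schedule(schedule)
  msg = 'schedule hash must contain exactly one of: cron, at, every, in'
  raise ArgumentError, msg if schedule.keys.length != 1
  type = schedule.keys.first
  raise ArgumentError, msg unless SCHEDULE_TYPES.include?(type)
  [type, schedule[type]]
end
```

Under this check, the spec's `{ "foo" => "bar", "tak" => "jon" }` config would be rejected twice over: it has two keys, and neither is an allowed schedule type.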
metadata
ADDED
@@ -0,0 +1,140 @@
+--- !ruby/object:Gem::Specification
+name: logstash-input-bitbucket
+version: !ruby/object:Gem::Version
+  version: 0.1.0
+platform: ruby
+authors:
+- Chris Schreiber
+autorequire:
+bindir: bin
+cert_chain: []
+date: 2019-03-19 00:00:00.000000000 Z
+dependencies:
+- !ruby/object:Gem::Dependency
+  requirement: !ruby/object:Gem::Requirement
+    requirements:
+    - - "~>"
+      - !ruby/object:Gem::Version
+        version: '2.0'
+  name: logstash-core-plugin-api
+  prerelease: false
+  type: :runtime
+  version_requirements: !ruby/object:Gem::Requirement
+    requirements:
+    - - "~>"
+      - !ruby/object:Gem::Version
+        version: '2.0'
+- !ruby/object:Gem::Dependency
+  requirement: !ruby/object:Gem::Requirement
+    requirements:
+    - - "~>"
+      - !ruby/object:Gem::Version
+        version: '6.0'
+  name: logstash-mixin-http_client
+  prerelease: false
+  type: :runtime
+  version_requirements: !ruby/object:Gem::Requirement
+    requirements:
+    - - "~>"
+      - !ruby/object:Gem::Version
+        version: '6.0'
+- !ruby/object:Gem::Dependency
+  requirement: !ruby/object:Gem::Requirement
+    requirements:
+    - - "~>"
+      - !ruby/object:Gem::Version
+        version: '3.0'
+  name: rufus-scheduler
+  prerelease: false
+  type: :runtime
+  version_requirements: !ruby/object:Gem::Requirement
+    requirements:
+    - - "~>"
+      - !ruby/object:Gem::Version
+        version: '3.0'
+- !ruby/object:Gem::Dependency
+  requirement: !ruby/object:Gem::Requirement
+    requirements:
+    - - "~>"
+      - !ruby/object:Gem::Version
+        version: '0.0'
+  name: stud
+  prerelease: false
+  type: :runtime
+  version_requirements: !ruby/object:Gem::Requirement
+    requirements:
+    - - "~>"
+      - !ruby/object:Gem::Version
+        version: '0.0'
+- !ruby/object:Gem::Dependency
+  requirement: !ruby/object:Gem::Requirement
+    requirements:
+    - - "~>"
+      - !ruby/object:Gem::Version
+        version: '1.3'
+  name: logstash-devutils
+  prerelease: false
+  type: :development
+  version_requirements: !ruby/object:Gem::Requirement
+    requirements:
+    - - "~>"
+      - !ruby/object:Gem::Version
+        version: '1.3'
+- !ruby/object:Gem::Dependency
+  requirement: !ruby/object:Gem::Requirement
+    requirements:
+    - - "~>"
+      - !ruby/object:Gem::Version
+        version: '3.0'
+  name: logstash-codec-json
+  prerelease: false
+  type: :development
+  version_requirements: !ruby/object:Gem::Requirement
+    requirements:
+    - - "~>"
+      - !ruby/object:Gem::Version
+        version: '3.0'
+description: Import data from BitBucket server to aggregate analytic data
+email: chriss@liatrio.com
+executables: []
+extensions: []
+extra_rdoc_files: []
+files:
+- CHANGELOG.md
+- CONTRIBUTORS
+- DEVELOPER.md
+- Gemfile
+- LICENSE
+- README.md
+- lib/logstash/inputs/bitbucket.rb
+- lib/response.rb
+- logstash-input-bitbucket.gemspec
+- spec/inputs/bitbucket_spec.rb
+homepage: http://liatr.io
+licenses:
+- Apache-2.0
+metadata:
+  logstash_plugin: 'true'
+  logstash_group: input
+post_install_message:
+rdoc_options: []
+require_paths:
+- lib
+required_ruby_version: !ruby/object:Gem::Requirement
+  requirements:
+  - - ">="
+    - !ruby/object:Gem::Version
+      version: '0'
+required_rubygems_version: !ruby/object:Gem::Requirement
+  requirements:
+  - - ">="
+    - !ruby/object:Gem::Version
+      version: '0'
+requirements: []
+rubyforge_project:
+rubygems_version: 2.6.13
+signing_key:
+specification_version: 4
+summary: BitBucket input plugin
+test_files:
+- spec/inputs/bitbucket_spec.rb