druiddb 1.2.0 → 1.2.1

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA1:
- metadata.gz: 6aff717c3c1644264319311423dd20b93c6b1c1b
- data.tar.gz: 13ee2b193c9fd2eb5ce6c875176b950f4be736aa
+ metadata.gz: d70e61ee7c8aa988f217cfca1f6f32431906c2db
+ data.tar.gz: 1e989cc4dffbef040e91a98062f7c58e9ba86cde
  SHA512:
- metadata.gz: c20d56f9c873ded1c52c99ee7fe3e6e6517c652335d45c8cf8e3696063b25e1291eff03a210036c0b3128d76ae6e86303f90e0d62ad5ede3ded5fa7a72db4ee2
- data.tar.gz: 1c8b7c392900c2b99c517186a498ed84d9843ccb40ba95ff24e0b929a9448e58b387fc8d20257debbbb69bdead3d5b0dbc7c1ab059feef65a63c6b0199047269
+ metadata.gz: df81ac19746fe5ed208b5011166cdcc10cc6f6eb6481e89387b4c3bf5d910614921a9f8ecaff4899f4a6ee6bada22ec379719ca0ded9ae53172541379d4bb618
+ data.tar.gz: 6a247ccbd87d1f48cd4cb0813dee467167cd40c3105cfdb75d8ef37f747e52b900c1a031372f3132553ceacd4d9e9b3294b1a85899e83728772872761eb4c78e
data/README.md CHANGED
@@ -3,7 +3,6 @@
  [![Build Status](https://travis-ci.org/andremleblanc/druiddb-ruby.svg?branch=master)](https://travis-ci.org/andremleblanc/druiddb-ruby)
  [![Gem Version](https://badge.fury.io/rb/druiddb.svg)](https://badge.fury.io/rb/druiddb)
  [![Code Climate](https://codeclimate.com/github/andremleblanc/druiddb-ruby/badges/gpa.svg)](https://codeclimate.com/github/andremleblanc/druiddb-ruby)
- [![Test Coverage](https://codeclimate.com/github/andremleblanc/druiddb-ruby/badges/coverage.svg)](https://codeclimate.com/github/andremleblanc/druiddb-ruby/coverage)
  [![Dependency Status](https://gemnasium.com/badges/github.com/andremleblanc/druiddb-ruby.svg)](https://gemnasium.com/github.com/andremleblanc/druiddb-ruby)

  This documentation is intended to be a quick-start guide, not a comprehensive
@@ -14,6 +13,10 @@ and the `DruidDB::Query` modules as they expose most of the methods on the clien
  This guide assumes a significant knowledge of Druid, for more info:
  http://druid.io/docs/latest/design/index.html

+ ## What Does It Do?
+
+ druiddb-ruby provides a client for your Ruby application to push data to Druid leveraging the [Kafka Indexing Service](http://druid.io/docs/latest/development/extensions-core/kafka-ingestion.html). The client also provides an interface for querying and performing management tasks. It discovers and connects to Kafka and the Druid nodes through ZooKeeper, so you only need to provide the ZooKeeper host.
+
  ## Install

  ```bash
@@ -31,7 +34,7 @@ client = DruidDB::Client.new()

  ### Writing Data

- #### Kafka Indexing service
+ #### Kafka Indexing Service
  This gem leverages the [Kafka Indexing Service](http://druid.io/docs/latest/development/extensions-core/kafka-ingestion.html) for ingesting data. The gem pushes datapoints onto Kafka topics (typically named after the datasource). You can also use the gem to upload an ingestion spec, which is needed for Druid to consume the Kafka topic.

  This repo contains a `docker-compose.yml` build that may help bootstrap development with Druid and the Kafka Indexing Service. It's what we use for integration testing.
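
The write path described above is small enough to sketch. The `FakeProducer` class below is a hypothetical in-memory stand-in for a real Kafka producer so the example runs without a broker; the `write_point` helper mirrors the gem's documented behavior of serializing each datapoint to JSON and publishing it to a topic named after the datasource:

```ruby
require 'json'

# Hypothetical stand-in for a Kafka producer; records what would be published.
class FakeProducer
  attr_reader :messages

  def initialize
    @messages = []
  end

  def produce(payload, topic:)
    @messages << { topic: topic, payload: payload }
  end
end

# Mirrors the gem's write path: one JSON message per datapoint,
# published to a topic named after the datasource.
def write_point(producer, datasource, datapoint)
  producer.produce(datapoint.to_json, topic: datasource)
end

producer = FakeProducer.new
write_point(producer, 'page_views', timestamp: '2017-10-19T00:00:00Z', views: 1)
producer.messages.first[:topic]  # => "page_views"
```

With a real client the producer and topic handling are internal; this only illustrates the datasource-to-topic mapping.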
@@ -112,7 +115,20 @@ This project uses docker-compose to provide a development environment.
  2. cd into project
  3. `docker-compose up` - this will download necessary images and run all dependencies in the foreground.

- Then you can use `docker build -t some_tag .` to build the Docker image for this project after making changes and `docker run -it --network=druiddbruby_druiddb some_tag some_command` to interact with it.
+ When changes are made to the project, rebuild the Docker image with:
+
+ ```shell
+ $ docker build -t <some_tag> .
+ ```
+
+ Where `<some_tag>` is something like `druiddb-ruby`.
+
+ To interact with the newly changed project, run it with:
+
+ ```shell
+ $ docker run -it --network=druiddbruby_druiddb <some_tag> <some_command>
+ ```
+
+ Where `<some_command>` is a shell command to run in the container (e.g. `bash` or anything in the `bin` folder).

  ### Metabase

@@ -123,7 +139,7 @@ Viewing data in the database can be a bit annoying, use a tool like [Metabase](h
  Testing is run utilizing the docker-compose environment.

  1. `docker-compose up`
- 2. `docker run -it --network=druiddbruby_druiddb druiddb-ruby bin/run_tests.sh`
+ 2. `docker run -it --network=druiddbruby_druiddb <some_tag> bin/run_tests.sh`

  ## License

@@ -4,6 +4,7 @@ module DruidDB
  DISCOVERY_PATH = '/druid/discovery'.freeze
  INDEX_SERVICE = 'druid/overlord'.freeze
  KAFKA_BROKER_PATH = '/brokers/ids'.freeze
+ KAFKA_OVERFLOW_WAIT = 0.05
  LOG_LEVEL = :error
  ROLLUP_GRANULARITY = :minute
  STRONG_DELETE = false
@@ -16,6 +17,7 @@ module DruidDB
  :discovery_path,
  :index_service,
  :kafka_broker_path,
+ :kafka_overflow_wait,
  :log_level,
  :rollup_granularity,
  :strong_delete,
@@ -29,6 +31,7 @@ module DruidDB
  @discovery_path = opts[:discovery_path] || DISCOVERY_PATH
  @index_service = opts[:index_service] || INDEX_SERVICE
  @kafka_broker_path = opts[:kafka_broker_path] || KAFKA_BROKER_PATH
+ @kafka_overflow_wait = opts[:kafka_overflow_wait] || KAFKA_OVERFLOW_WAIT
  @log_level = opts[:log_level] || LOG_LEVEL
  @rollup_granularity = rollup_granularity_string(opts[:rollup_granularity])
  @strong_delete = opts[:strong_delete] || STRONG_DELETE
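
The configuration hunks above follow the gem's existing pattern: each setting is a module-level constant default that the caller can override through the options hash (`opts[:key] || DEFAULT`). A trimmed, runnable sketch of that pattern for the new `kafka_overflow_wait` option (the class is reduced to this one option for illustration):

```ruby
module DruidDB
  class Configuration
    # Seconds to wait before retrying after a Kafka buffer overflow.
    KAFKA_OVERFLOW_WAIT = 0.05

    attr_reader :kafka_overflow_wait

    def initialize(opts = {})
      # A caller-supplied value wins; otherwise fall back to the constant.
      @kafka_overflow_wait = opts[:kafka_overflow_wait] || KAFKA_OVERFLOW_WAIT
    end
  end
end

DruidDB::Configuration.new.kafka_overflow_wait                           # => 0.05
DruidDB::Configuration.new(kafka_overflow_wait: 0.1).kafka_overflow_wait # => 0.1
```

One property of the `||` fallback worth noting: passing `nil` (or `false`) behaves the same as omitting the option entirely.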
@@ -1,3 +1,3 @@
  module DruidDB
- VERSION = '1.2.0'.freeze
+ VERSION = '1.2.1'.freeze
  end
@@ -10,7 +10,12 @@ module DruidDB

  def write_point(datasource, datapoint)
  raise DruidDB::ConnectionError, 'no kafka brokers available' if producer.nil?
- producer.produce(datapoint.to_json, topic: datasource)
+ begin
+ producer.produce(datapoint.to_json, topic: datasource)
+ rescue Kafka::BufferOverflow
+ sleep config.kafka_overflow_wait
+ retry
+ end
  end

  private
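
The new rescue/retry behavior can be exercised in isolation: on `Kafka::BufferOverflow` the writer sleeps for `kafka_overflow_wait` seconds and retries until the produce call succeeds. A runnable sketch with a stub producer that overflows twice before accepting the message (`FlakyProducer` and the local `Kafka::BufferOverflow` definition are stand-ins so the example runs without ruby-kafka or a broker):

```ruby
# Local stand-in for ruby-kafka's error class so the sketch runs without the gem.
module Kafka
  BufferOverflow = Class.new(StandardError)
end

# Hypothetical producer stub: overflows a fixed number of times, then accepts.
class FlakyProducer
  attr_reader :attempts, :delivered

  def initialize(failures)
    @failures = failures
    @attempts = 0
    @delivered = []
  end

  def produce(payload, topic:)
    @attempts += 1
    raise Kafka::BufferOverflow if @attempts <= @failures
    @delivered << [topic, payload]
  end
end

# Mirrors the patched write_point: pause briefly, then retry the produce call.
def write_point(producer, datasource, payload, wait: 0.001)
  producer.produce(payload, topic: datasource)
rescue Kafka::BufferOverflow
  sleep wait
  retry
end

producer = FlakyProducer.new(2)
write_point(producer, 'page_views', '{"views":1}')
producer.attempts  # => 3 (two overflows, then success)
```

Note the retry loop is unbounded: if the buffer never drains, `write_point` blocks indefinitely, which is a deliberate trade-off in favor of not dropping datapoints.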
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: druiddb
  version: !ruby/object:Gem::Version
- version: 1.2.0
+ version: 1.2.1
  platform: ruby
  authors:
  - Andre LeBlanc
  autorequire:
  bindir: exe
  cert_chain: []
- date: 2017-08-23 00:00:00.000000000 Z
+ date: 2017-10-19 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: activesupport