elasticsearch-rails-ha 1.0.0

checksums.yaml ADDED
@@ -0,0 +1,7 @@
+ ---
+ SHA1:
+ metadata.gz: be1cae2a83f408a6cf97871d0cd9fd2d284c2474
+ data.tar.gz: 6faa05ee8cb145ad8c484a85634224bb1fc3369f
+ SHA512:
+ metadata.gz: df5d68a32704dd00d819c1c7335d5a8bef8d7e64c86fdc61e8d903d661d79006384efbeb56d8e6fc2c888222930fe0c258842cb965156e95272266199bac5559
+ data.tar.gz: adb0cc7f73748e22fd62d663a59179430f4bf1f4bc282767254ed36f404206ebdcca31c87d342a9ff66458a9171a4d73732b33cfd263774823357fb3aedef265
data/CONTRIBUTING.md ADDED
@@ -0,0 +1,15 @@
+ ## Welcome!
+
+ We're so glad you're thinking about contributing to an 18F open source project! If you're unsure or afraid of anything, just ask or submit the issue or pull request anyways. The worst that can happen is that you'll be politely asked to change something. We appreciate any sort of contribution, and don't want a wall of rules to get in the way of that.
+
+ Before contributing, we encourage you to read our CONTRIBUTING policy (you are here), our LICENSE, and our README, all of which should be in this repository. If you have any questions, or want to read more about our underlying policies, you can consult the 18F Open Source Policy GitHub repository at https://github.com/18f/open-source-policy, or just shoot us an email/official government letterhead note to [18f@gsa.gov](mailto:18f@gsa.gov).
+
+ ## Public domain
+
+ This project is in the public domain within the United States, and
+ copyright and related rights in the work worldwide are waived through
+ the [CC0 1.0 Universal public domain dedication](https://creativecommons.org/publicdomain/zero/1.0/).
+
+ All contributions to this project will be released under the CC0
+ dedication. By submitting a pull request, you are agreeing to comply
+ with this waiver of copyright interest.
data/LICENSE.md ADDED
@@ -0,0 +1,31 @@
+ As a work of the United States Government, this project is in the
+ public domain within the United States.
+
+ Additionally, we waive copyright and related rights in the work
+ worldwide through the CC0 1.0 Universal public domain dedication.
+
+ ## CC0 1.0 Universal Summary
+
+ This is a human-readable summary of the [Legal Code (read the full text)](https://creativecommons.org/publicdomain/zero/1.0/legalcode).
+
+ ### No Copyright
+
+ The person who associated a work with this deed has dedicated the work to
+ the public domain by waiving all of his or her rights to the work worldwide
+ under copyright law, including all related and neighboring rights, to the
+ extent allowed by law.
+
+ You can copy, modify, distribute and perform the work, even for commercial
+ purposes, all without asking permission.
+
+ ### Other Information
+
+ In no way are the patent or trademark rights of any person affected by CC0,
+ nor are the rights that other persons may have in the work or in how the
+ work is used, such as publicity or privacy rights.
+
+ Unless expressly stated otherwise, the person who associated a work with
+ this deed makes no warranties about the work, and disclaims liability for
+ all uses of the work, to the fullest extent permitted by applicable law.
+ When using or citing the work, you should not imply endorsement by the
+ author or the affirmer.
data/README.md ADDED
@@ -0,0 +1,50 @@
+ # elasticsearch-rails-ha RubyGem
+
+ [![Build Status](https://travis-ci.org/18F/elasticsearch-rails-ha-gem.svg?branch=master)](https://travis-ci.org/18F/elasticsearch-rails-ha-gem)
+
+ Elasticsearch for Rails, high availability extensions.
+
+ See also:
+
+ * [elasticsearch-rails](https://github.com/elastic/elasticsearch-rails)
+
+ ## Examples
+
+ Add the high availability tasks to your Rake task file `lib/tasks/elasticsearch.rake`:
+
+ ```
+ require 'elasticsearch/rails/ha/tasks'
+ ```
+
+ Import all the Articles on a machine with 4 cores available:
+
+ ```
+ % bundle exec rake environment elasticsearch:ha:import NPROCS=4 CLASS='Article'
+ ```
+
+ Stage an index alongside your live index, but do not make it live yet:
+
+ ```
+ % bundle exec rake environment elasticsearch:ha:stage NPROCS=4 CLASS='Article'
+ ```
+
+ Promote your staged index (the task requires `CLASS`, like the others):
+
+ ```
+ % bundle exec rake environment elasticsearch:ha:promote CLASS='Article'
+ ```
+
+ ## Acknowledgements
+
+ Thanks to [Pop Up Archive](http://popuparchive.com/) for
+ contributing the [original version of this code](https://github.com/popuparchive/pop-up-archive/blob/master/lib/tasks/search.rake) to the public domain.
+
+ ## Public domain
+
+ This project is in the worldwide [public domain](LICENSE.md). As stated in [CONTRIBUTING](CONTRIBUTING.md):
+
+ > This project is in the public domain within the United States, and copyright and related rights in the work worldwide are waived through the [CC0 1.0 Universal public domain dedication](https://creativecommons.org/publicdomain/zero/1.0/).
+ >
+ > All contributions to this project will be released under the CC0
+ > dedication. By submitting a pull request, you are agreeing to comply
+ > with this waiver of copyright interest.
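The tasks above assume the target class is wired up with elasticsearch-model in the usual way. A minimal sketch of such a model (the `Article` name and the `stage_index_name` override are illustrative; `IndexStager` falls back to `<index_name>_staged` when the override is absent):

```ruby
# Hypothetical app/models/article.rb, assuming the elasticsearch-model gem is bundled.
class Article < ActiveRecord::Base
  include Elasticsearch::Model

  # Live index name; defaults to "articles" if omitted.
  index_name "articles"

  # Optional: IndexStager calls this when the class responds to it,
  # otherwise it derives "#{index_name}_staged".
  def self.stage_index_name
    "articles_staged"
  end
end
```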
data/lib/elasticsearch/rails/ha.rb ADDED
@@ -0,0 +1,3 @@
+ require_relative 'ha/version'
+ require_relative 'ha/parallel_indexer'
+ require_relative 'ha/index_stager'
data/lib/elasticsearch/rails/ha/index_stager.rb ADDED
@@ -0,0 +1,104 @@
+ module Elasticsearch
+   module Rails
+     module HA
+       class IndexStager
+         attr_reader :klass, :live_index_name
+
+         def initialize(klass)
+           @klass = klass.constantize
+         end
+
+         def stage_index_name
+           if klass.respond_to?(:stage_index_name)
+             klass.stage_index_name
+           else
+             klass.index_name + "_staged"
+           end
+         end
+
+         def tmp_index_name
+           @_suffix ||= Time.now.strftime('%Y%m%d%H%M%S') + '-' + SecureRandom.hex[0..7]
+           "#{klass.index_name}_#{@_suffix}"
+         end
+
+         def alias_stage_to_tmp_index
+           es_client.indices.delete index: stage_index_name rescue false
+           es_client.indices.update_aliases body: {
+             actions: [
+               { add: { index: tmp_index_name, alias: stage_index_name } }
+             ]
+           }
+         end
+
+         def promote(live_index_name=nil)
+           live_index_name ||= klass.index_name # the rake task may pass nil when ENV['INDEX'] is unset
+           @live_index_name = live_index_name
+
+           # the renaming actions (performed atomically by ES)
+           rename_actions = [
+             { remove: { index: stage_aliased_to, alias: stage_index_name } },
+             { add: { index: stage_index_name, alias: live_index_name } }
+           ]
+
+           # zap any existing index known as index_name,
+           # but do it conditionally since it is reasonable that it does not exist.
+           to_delete = []
+           existing_live_index = es_client.indices.get_aliases(index: live_index_name)
+           existing_live_index.each do |k, v|
+
+             # if the index is merely aliased, remove its alias as part of the aliasing transaction.
+             if k != klass.index_name
+               rename_actions.unshift({ remove: { index: k, alias: live_index_name } })
+
+               # mark it for deletion when we've successfully updated aliases
+               to_delete.push k
+
+             else
+               # this is a real, unaliased index with this name, so it must be deleted.
+               # (This usually happens the first time we implement the aliasing scheme against
+               # an existing installation.)
+               es_client.indices.delete index: live_index_name rescue false
+             end
+           end
+
+           # re-alias
+           es_client.indices.update_aliases body: { actions: rename_actions }
+
+           # clean up
+           to_delete.each do |idxname|
+             es_client.indices.delete index: idxname rescue false
+           end
+         end
+
+         private
+
+         def tmp_index_pattern
+           /#{klass.index_name}_(\d{14})-\w{8}$/
+         end
+
+         def es_client
+           klass.__elasticsearch__.client
+         end
+
+         def stage_aliased_to
+           # find the newest tmp index to which staged is aliased.
+           # we need this because we want to re-alias it.
+           aliased_to = nil
+           stage_aliases = es_client.indices.get_aliases(index: stage_index_name)
+           stage_aliases.each do |k, v|
+             aliased_to ||= k
+             stage_tstamp = aliased_to.match(tmp_index_pattern)[1]
+             k_tstamp = k.match(tmp_index_pattern)[1]
+             if Time.parse(stage_tstamp) < Time.parse(k_tstamp)
+               aliased_to = k
+             end
+           end
+           if !aliased_to
+             raise "Cannot identify index aliased to by '#{stage_index_name}'"
+           end
+           aliased_to
+         end
+       end
+     end
+   end
+ end
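For reference, `IndexStager` can also be driven directly rather than through the rake tasks. A minimal sketch, assuming the `Article` model shown earlier; the reindex step here uses plain `Article.import`, whereas the rake `stage` task uses `ParallelIndexer`:

```ruby
require 'elasticsearch/rails/ha'

stager = Elasticsearch::Rails::HA::IndexStager.new('Article')

# Build a fresh, timestamped tmp index and fill it.
Article.__elasticsearch__.create_index! force: true, index: stager.tmp_index_name
Article.import index: stager.tmp_index_name

# Point the "_staged" alias at the tmp index.
stager.alias_stage_to_tmp_index

# Atomically swap the live alias over and clean up superseded indices.
stager.promote
```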
data/lib/elasticsearch/rails/ha/parallel_indexer.rb ADDED
@@ -0,0 +1,152 @@
+ require 'ansi'
+
+ module Elasticsearch
+   module Rails
+     module HA
+       class ParallelIndexer
+
+         attr_reader :klass, :idx_name, :nprocs, :batch_size, :max, :force, :verbose, :scope
+
+         # leverage multiple cores to run indexing in parallel
+         def initialize(opts)
+           @klass = opts[:klass] or fail "klass required"
+           @idx_name = opts[:idx_name] or fail "idx_name required"
+           @nprocs = opts[:nprocs] or fail "nprocs required"
+           @batch_size = opts[:batch_size] or fail "batch_size required"
+           @max = opts[:max]
+           @force = opts[:force]
+           @verbose = opts[:verbose]
+           @scope = opts[:scope]
+
+           # calculate array of offsets based on nprocs
+           @total_expected = klass.count
+           @pool_size = (@total_expected / @nprocs.to_f).ceil
+         end
+
+         def run
+           # get all ids since we can't assume there are no holes in the PK sequencing
+           ids = klass.order('id ASC').pluck(:id)
+           offsets = []
+           ids.each_slice(@pool_size) do |chunk|
+             #puts "chunk: size=#{chunk.size} #{chunk.first}..#{chunk.last}"
+             offsets.push( chunk.first )
+           end
+           if @verbose
+             puts ::ANSI.blue{ "Parallel Indexer: index=#{@idx_name} total=#{@total_expected} nprocs=#{@nprocs} pool_size=#{@pool_size} offsets=#{offsets} " }
+           end
+
+           if @force
+             @verbose and puts ::ANSI.blue{ "Force creating new index" }
+             klass.__elasticsearch__.create_index! force: true, index: idx_name
+             klass.__elasticsearch__.refresh_index! index: idx_name
+           end
+
+           @current_db_config = ActiveRecord::Base.connection_config
+           # IMPORTANT before forks in offsets loop
+           ActiveRecord::Base.connection.disconnect!
+
+           child_pids = []
+           offsets.each do |start_at|
+             child_pid = fork do
+               run_child(start_at)
+             end
+             if child_pid
+               child_pids << child_pid
+             end
+           end
+
+           # reconnect in parent
+           ActiveRecord::Base.establish_connection(@current_db_config)
+
+           # Process.waitall seems to hang during tests. Do it manually.
+           child_results = []
+
+           child_pids.each do |pid|
+             Process.wait(pid)
+             child_results.push [pid, $?]
+           end
+
+           process_child_results(child_results)
+         end
+
+         def process_child_results(results)
+           # check exit status of each child so we know if we should throw exception
+           results.each do |pair|
+             pid = pair[0]
+             pstat = pair[1]
+             exit_ok = true
+             if pstat.exited?
+               @verbose and puts ::ANSI.blue{ "PID #{pid} exited with #{pstat.exitstatus}" }
+             end
+             if pstat.signaled?
+               puts ::ANSI.red{ " >> #{pid} exited with uncaught signal #{pstat.termsig}" }
+               exit_ok = false
+             end
+
+             if !pstat.success?
+               puts ::ANSI.red{ " >> #{pid} was not successful" }
+               exit_ok = false
+             end
+
+             if pair[1].exitstatus != 0
+               puts ::ANSI.red{ " >> #{pid} exited with non-zero status" }
+               exit_ok = false
+             end
+
+             if !exit_ok
+               raise ::ANSI.red{ "PID #{pair[0]} exited abnormally, so the whole reindex fails" }
+             end
+           end
+         end
+
+         def run_child(start_at)
+           # IMPORTANT after fork
+           ActiveRecord::Base.establish_connection(@current_db_config)
+
+           # IMPORTANT for tests to determine whether at_end should run
+           ENV["I_AM_HA_CHILD"] = "true"
+
+           completed = 0
+           errors = []
+           @verbose and puts ::ANSI.blue{ "Start worker #{$$} at offset #{start_at}" }
+           pbar = ::ANSI::Progressbar.new("#{klass} [#{$$}]", @pool_size, STDOUT) rescue nil
+           checkpoint = false
+           if pbar
+             pbar.__send__ :show
+             pbar.bar_mark = '='
+           else
+             checkpoint = true
+           end
+
+           @klass.__elasticsearch__.import return: 'errors',
+             index: @idx_name,
+             start: start_at,
+             scope: @scope,
+             batch_size: @batch_size do |resp|
+               # show errors immediately (rather than buffering them)
+               errors += resp['items'].select { |k, v| k.values.first['error'] }
+               completed += resp['items'].size
+               if pbar && @verbose
+                 pbar.inc resp['items'].size
+               end
+               if checkpoint && @verbose
+                 puts ::ANSI.blue{ "[#{$$}] #{Time.now.utc.iso8601} : #{completed} records completed" }
+               end
+               STDERR.flush
+               STDOUT.flush
+               if errors.size > 0
+                 STDOUT.puts "ERRORS in #{$$}:"
+                 STDOUT.puts pp(errors)
+               end
+               if completed >= @pool_size || (@max && @max.to_i == completed)
+                 pbar.finish if pbar
+                 @verbose and puts ::ANSI.blue{ "Worker #{$$} finished #{completed} records" }
+                 exit!(true) # exit child worker
+               end
+           end # end do |resp| block
+         end
+
+       end
+     end
+   end
+ end
1
+ # Rake tasks to make parallel indexing and high availability easier.
2
+
3
+ require 'elasticsearch/rails/ha'
4
+
5
+ namespace :elasticsearch do
6
+ namespace :ha do
7
+
8
+ desc "import records in parallel"
9
+ task :import do
10
+ nprocs = ENV['NPROCS'] || 1
11
+ batch_size = ENV['BATCH'] || 100
12
+ max = ENV['MAX'] || nil
13
+ klass = ENV['CLASS'] or fail "CLASS required"
14
+
15
+ indexer = Elasticsearch::Rails::HA::ParallelIndexer.new(
16
+ klass: klass.constantize,
17
+ idx_name: (ENV['INDEX'] || klass.constantize.index_name),
18
+ nprocs: nprocs.to_i,
19
+ batch_size: batch_size.to_i,
20
+ max: max,
21
+ scope: ENV.fetch('SCOPE', nil),
22
+ force: ENV['FORCE'],
23
+ verbose: !ENV['QUIET']
24
+ )
25
+
26
+ indexer.run
27
+ end
28
+
29
+ desc "stage an index"
30
+ task :stage do
31
+ nprocs = ENV['NPROCS'] || 1
32
+ batch_size = ENV['BATCH'] || 100
33
+ max = ENV['MAX'] || nil
34
+ klass = ENV['CLASS'] or fail "CLASS required"
35
+
36
+ stager = Elasticsearch::Rails::HA::IndexStager.new(klass)
37
+ indexer = Elasticsearch::Rails::HA::ParallelIndexer.new(
38
+ klass: stager.klass,
39
+ idx_name: (ENV['INDEX'] || stager.tmp_index_name),
40
+ nprocs: nprocs.to_i,
41
+ batch_size: batch_size.to_i,
42
+ max: max,
43
+ scope: ENV.fetch('SCOPE', nil),
44
+ force: true,
45
+ verbose: !ENV['QUIET']
46
+ )
47
+ indexer.run
48
+ stager.alias_stage_to_tmp_index
49
+ puts "[#{Time.now.utc.iso8601}] #{klass} index staged as #{stager.stage_index_name}"
50
+ end
51
+
52
+ desc "promote staged index to live"
53
+ task :promote do
54
+ klass = ENV['CLASS'] or fail "CLASS required"
55
+ stager = Elasticsearch::Rails::HA::IndexStager.new(klass)
56
+ stager.promote(ENV['INDEX'])
57
+ puts "[#{Time.now.utc.iso8601}] #{klass} promoted #{stage_index_name} to #{stager.live_index_name}"
58
+ end
59
+
60
+ end
61
+ end
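In an application, the stage and promote tasks are typically chained for a zero-downtime reindex. A hedged sketch of a wrapper task in the host app; the `search:reindex` name and `Article` class are illustrative:

```ruby
# lib/tasks/elasticsearch.rake in the host application (hypothetical wrapper).
require 'elasticsearch/rails/ha/tasks'

namespace :search do
  desc "Rebuild the Article index with no downtime: stage, then promote"
  task reindex: :environment do
    ENV['CLASS'] ||= 'Article'
    Rake::Task['elasticsearch:ha:stage'].invoke
    Rake::Task['elasticsearch:ha:promote'].invoke
  end
end
```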
data/lib/elasticsearch/rails/ha/version.rb ADDED
@@ -0,0 +1,7 @@
+ module Elasticsearch
+   module Rails
+     module HA
+       VERSION = '1.0.0'
+     end
+   end
+ end
metadata ADDED
@@ -0,0 +1,193 @@
+ --- !ruby/object:Gem::Specification
+ name: elasticsearch-rails-ha
+ version: !ruby/object:Gem::Version
+   version: 1.0.0
+ platform: ruby
+ authors:
+ - Peter Karman
+ autorequire:
+ bindir: bin
+ cert_chain: []
+ date: 2016-01-25 00:00:00.000000000 Z
+ dependencies:
+ - !ruby/object:Gem::Dependency
+   name: elasticsearch-model
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '0'
+   type: :runtime
+   prerelease: false
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '0'
+ - !ruby/object:Gem::Dependency
+   name: elasticsearch-rails
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '0'
+   type: :runtime
+   prerelease: false
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '0'
+ - !ruby/object:Gem::Dependency
+   name: ansi
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '0'
+   type: :runtime
+   prerelease: false
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '0'
+ - !ruby/object:Gem::Dependency
+   name: about_yml
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '0'
+   type: :development
+   prerelease: false
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '0'
+ - !ruby/object:Gem::Dependency
+   name: bundler
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - "~>"
+       - !ruby/object:Gem::Version
+         version: '1.3'
+   type: :development
+   prerelease: false
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - "~>"
+       - !ruby/object:Gem::Version
+         version: '1.3'
+ - !ruby/object:Gem::Dependency
+   name: rake
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '0'
+   type: :development
+   prerelease: false
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '0'
+ - !ruby/object:Gem::Dependency
+   name: rspec
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '0'
+   type: :development
+   prerelease: false
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '0'
+ - !ruby/object:Gem::Dependency
+   name: elasticsearch-extensions
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '0'
+   type: :development
+   prerelease: false
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '0'
+ - !ruby/object:Gem::Dependency
+   name: sqlite3
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '0'
+   type: :development
+   prerelease: false
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '0'
+ - !ruby/object:Gem::Dependency
+   name: rails
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '3.1'
+   type: :development
+   prerelease: false
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '3.1'
+ description: High Availability extensions to the Elasticsearch::Rails gem
+ email:
+ - peter.karman@gsa.gov
+ executables: []
+ extensions: []
+ extra_rdoc_files: []
+ files:
+ - CONTRIBUTING.md
+ - LICENSE.md
+ - README.md
+ - lib/elasticsearch/rails/ha.rb
+ - lib/elasticsearch/rails/ha/index_stager.rb
+ - lib/elasticsearch/rails/ha/parallel_indexer.rb
+ - lib/elasticsearch/rails/ha/tasks.rb
+ - lib/elasticsearch/rails/ha/version.rb
+ homepage: https://github.com/18F/elasticsearch-rails-ha-gem
+ licenses:
+ - CC0
+ metadata: {}
+ post_install_message:
+ rdoc_options: []
+ require_paths:
+ - lib
+ required_ruby_version: !ruby/object:Gem::Requirement
+   requirements:
+   - - ">="
+     - !ruby/object:Gem::Version
+       version: 1.9.3
+ required_rubygems_version: !ruby/object:Gem::Requirement
+   requirements:
+   - - ">="
+     - !ruby/object:Gem::Version
+       version: '0'
+ requirements: []
+ rubyforge_project:
+ rubygems_version: 2.4.5.1
+ signing_key:
+ specification_version: 4
+ summary: High Availability extensions to the Elasticsearch::Rails gem
+ test_files: []
+ has_rdoc: