db_sucker 3.1.0 → 3.1.1

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 1bad32bfbe4d3e2a24def83464a9bdce5c1c7f3158f558ed405d15967adc1b4e
- data.tar.gz: b5518954de8c6633ac22979b53f225eb7ad1e609fc191931290287479da69547
+ metadata.gz: a5629ff79f3a0b48bf58f3a19c3f3f342082d76ae896ffca04d781e19de8959d
+ data.tar.gz: 0601cd98271331cd62ffbda5f7ae0c6ecd27333bad3598229b224d5bceef586b
  SHA512:
- metadata.gz: aa025dc2bf4dac913e9e4a7295980d8030cb795cedb6edcd458d26783f8349499879bbb6520a2d09e9f1dc879e531656c7e639119fcee80e0b36df613748723c
- data.tar.gz: 55f8415d7f31e16d6b6d8aec346c37ce8cd30112d4f5aaa516200dd2a75c92099ed31994d2691cba1e360f36807f93b638ae83a1c4fa367e02bfa8922b01d97f
+ metadata.gz: 2f856a743e274a0a4caf46aa614268d14e92ea40f63cc60f248d7561ddc421c260ddec37542b5a1095e8f6b1b654e8afc4616dd8191170251a48d67ba40645cf
+ data.tar.gz: 3a633a3ca67f10b3f0914b8a4b5c7fa6355d4b471a94dab1b8423135919b6b183b4accce904b2a350394e17601715dfa24410f43873e79cc839dd0367bfe3a55
data/CHANGELOG.md CHANGED
@@ -1,3 +1,11 @@
+ ## 3.1.1
+
+ ### Fixes
+
+ * Prevent a single task (running in the main thread) from summoning workers (and failing) when it gets deferred
+
+ -------------------
+
  ## 3.1.0
 
  ### Fixes
data/README.md CHANGED
@@ -19,7 +19,7 @@ This tool is meant for pulling live data into your development environment. **It
  * independent parallel dump / download / import cycle for each table
  * verifies file integrity via SHA
  * flashy and colorful curses based interface with keyboard shortcuts
- * more status indications than you would ever want (even more if the remote has a somewhat recent `pv` (pipeviewer) installed)
+ * more status indications than you would ever want (even more if the remote has `pv` (pipeviewer) >= 1.3.8 installed)
  * limit concurrency of certain types of tasks (e.g. limit downloads, imports, etc.)
  * uses more threads than any application should ever use (seriously it's a nightmare)
 
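The SHA integrity check in the feature list above boils down to comparing a file's digest against an expected value; a minimal Ruby sketch (the `checksum_ok?` helper is illustrative, not db_sucker's internal API):

```ruby
require "digest"
require "tempfile"

# Compare a file's SHA256 hex digest against the expected checksum.
# (Illustrative helper; db_sucker's actual verification code differs.)
def checksum_ok?(path, expected_hex)
  Digest::SHA256.file(path).hexdigest == expected_hex
end

dump = Tempfile.new("dump")
dump.write("CREATE TABLE users (id INT);")
dump.flush
expected = Digest::SHA256.hexdigest("CREATE TABLE users (id INT);")
puts checksum_ok?(dump.path, expected) # true
```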
@@ -32,7 +32,7 @@ Currently `db_sucker` only handles the following data-flow constellation:
 
  On the local side you will need:
  - unixoid OS
- - Ruby (>= 2.0, != 2.3.1 see gotchas)
+ - Ruby (>= 2.0, != 2.3.0 see [Caveats](#caveats---bugs))
  - mysql2 gem
  - MySQL client (`mysql` command will be used for importing)
 
@@ -42,6 +42,7 @@ On the remote side you will need:
  - any folder with write permissions (for the temporary dumps)
  - mysqldump executable
  - MySQL credentials :)
+ - Optional: install `pv` (aka pipeviewer) >= 1.3.8 for progress displays on remote tasks
 
 
  ## Installation
@@ -50,9 +51,13 @@ Simple as:
 
  $ gem install db_sucker
 
- At the moment you are advised to adjust the MaxSessions limit on your remote SSH server if you run into issues, see Caveats.
+ You will need mysql2 as well (it's not a dependency as we might support other DBMS in the future):
 
- You will also need at least one configuration, see Configuration.
+ $ gem install mysql2
+
+ At the moment you are advised to adjust the MaxSessions limit on your remote SSH server if you run into issues, see [Caveats](#caveats---bugs).
+
+ You will also need at least one configuration, see [Configuration](#configuration-for-sucking---yaml-format).
 
 
  ## Usage
@@ -130,7 +135,7 @@ DbSucker has a lot of settings and other mechanisms which you can tweak and util
 
  ## Deferred import
 
- Tables with an uncompressed filesize of over 50MB will be queued up for import. Files smaller than 50MB will be imported concurrently with other tables. When all those have finished the large ones will import one after another. You can skip this behaviour with the `-n` resp. `--no-deffer` option. The threshold is changeable in your `config.rb`, see Configuration.
+ Tables with an uncompressed filesize of over 50MB will be queued up for import. Files smaller than 50MB will be imported concurrently with other tables. When all those have finished the large ones will import one after another. You can skip this behaviour with the `-n` resp. `--no-deffer` option. The threshold is changeable in your `config.rb`, see [Configuration](#configuration-application---ruby-format).
 
 
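The deferral rule described above is essentially a partition on uncompressed file size; a minimal sketch under the assumption of a flat list of dump files (`partition_imports` and the hash layout are hypothetical, not db_sucker's actual API):

```ruby
# 50MB default threshold for deferring large imports (changeable in config.rb)
DEFER_THRESHOLD = 50 * 1024 * 1024

# Split dump files into those imported concurrently right away and
# those deferred until the small ones have finished (hypothetical helper).
def partition_imports(files)
  files.partition { |f| f[:size] <= DEFER_THRESHOLD }
end

files = [
  { table: "users",  size: 12 * 1024 * 1024 },  # small, imports concurrently
  { table: "events", size: 300 * 1024 * 1024 }, # large, deferred
]
concurrent, deferred = partition_imports(files)
puts concurrent.map { |f| f[:table] }.inspect # ["users"]
puts deferred.map { |f| f[:table] }.inspect   # ["events"]
```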
  ## Importer
@@ -170,7 +175,7 @@ Under certain conditions the program might softlock when the remote unexpectedly
  If you get warnings that SSH errors occurred (and most likely tasks fail), please do any of the following to prevent the issue:
 
  * Raise the MaxSessions setting on the remote SSHd server if you can (recommended)
- * Lower the amount of slots for concurrent downloads (see Configuration)
+ * Lower the amount of slots for concurrent downloads (see [Configuration](#configuration-application---ruby-format))
  * Lower the amount of consumers (not recommended, use slots instead)
 
  You can run basic SSH diagnosis tests with `db_sucker <config_identifier> -a sshdiag`.
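Raising the limit is a one-line change in the remote's `/etc/ssh/sshd_config` (the value 30 here is only an example; OpenSSH defaults to 10, and sshd must be reloaded afterwards):

```
# /etc/ssh/sshd_config
# Allow more multiplexed sessions per SSH connection (OpenSSH default: 10)
MaxSessions 30
```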
data/VERSION CHANGED
@@ -1 +1 @@
- 3.1.0
+ 3.1.1
@@ -247,7 +247,9 @@ module DbSucker
 
  def _l_wait_for_workers
    @perform << "l_import_file_deferred"
-   Thread.main.sync { Thread.main[:summon_workers] += 1 }
+   unless Thread.current[:managed_worker] == :main
+     Thread.main.sync { Thread.main[:summon_workers] += 1 }
+   end
    wait_defer_ready
  end
 
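The fix above skips the worker summon when the deferred task already runs in the main thread. The pattern can be sketched in plain Ruby, with a `Mutex` standing in for db_sucker's `sync` helper (the thread-local keys mirror the diff, but the surrounding scaffolding is illustrative):

```ruby
Thread.main[:summon_workers] = 0
main_lock = Mutex.new

# Only tasks running outside the main thread may ask it to summon
# workers; a task already running in the main thread must not,
# which is exactly what the guard in the diff prevents.
summon = lambda do
  unless Thread.current[:managed_worker] == :main
    main_lock.synchronize { Thread.main[:summon_workers] += 1 }
  end
end

worker = Thread.new do
  Thread.current[:managed_worker] = :worker
  summon.call # increments the counter
end
worker.join

Thread.current[:managed_worker] = :main
summon.call # no-op: we are the main thread

puts Thread.main[:summon_workers] # 1
```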
@@ -1,4 +1,4 @@
  module DbSucker
-   VERSION = "3.1.0"
+   VERSION = "3.1.1"
    UPDATE_URL = "https://raw.githubusercontent.com/2called-chaos/db_sucker/master/VERSION"
  end
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: db_sucker
  version: !ruby/object:Gem::Version
- version: 3.1.0
+ version: 3.1.1
  platform: ruby
  authors:
  - Sven Pachnit
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2018-04-10 00:00:00.000000000 Z
+ date: 2018-06-17 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: curses