pgsync 0.1.0
- checksums.yaml +7 -0
- data/.gitignore +9 -0
- data/Gemfile +4 -0
- data/LICENSE.txt +21 -0
- data/README.md +149 -0
- data/Rakefile +8 -0
- data/bin/pgsync +8 -0
- data/config.yml +27 -0
- data/lib/pgsync.rb +366 -0
- data/lib/pgsync/version.rb +3 -0
- data/pgsync.gemspec +29 -0
- metadata +154 -0
checksums.yaml
ADDED
@@ -0,0 +1,7 @@
---
SHA1:
  metadata.gz: 58a427f20bdf42cd97ca4101964a571ab7f5da76
  data.tar.gz: 13da001cf27f6ddda55a96b5f9ec3a8e27577eb5
SHA512:
  metadata.gz: 7399365349da94533619effdeee4c80d05817384b75377af12c4d2f05d1cc2af6b9681dcc0e9b5d4cf8c3616e9f7f048b6b3fd7ca765d74b97437f29a503e04c
  data.tar.gz: 15abafa358b77760f493d28f7c20ad8c44cf2e5a6e3ed7f5432f2b2b50c0aa67c465219026f705e1e37da4830c6881fc579ccb756ea694a05429c92c4f270112
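The published checksums above can be verified against a downloaded release with Ruby's standard `digest` library. This is a sketch, not part of pgsync; the `verify_checksum` helper name and file paths are illustrative.

```ruby
require "digest"

# Compare a local file's SHA1 and SHA512 digests against the values
# published in checksums.yaml.
def verify_checksum(path, sha1:, sha512:)
  Digest::SHA1.file(path).hexdigest == sha1 &&
    Digest::SHA512.file(path).hexdigest == sha512
end
```

For example, `verify_checksum("metadata.gz", sha1: "58a427f2...", sha512: "73993653...")` with the values listed above.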
data/.gitignore
ADDED
data/Gemfile
ADDED
data/LICENSE.txt
ADDED
@@ -0,0 +1,21 @@
The MIT License (MIT)

Copyright (c) 2015 Andrew Kane

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
data/README.md
ADDED
@@ -0,0 +1,149 @@
# pgsync

Quickly and securely sync data between environments

## Installation

```sh
gem install pgsync
```

And in your project directory, run:

```sh
pgsync setup
```

This creates a `.pgsync.yml` for you to customize. We recommend checking this into your version control (assuming it doesn’t contain sensitive information).

## How to Use

Fetch all tables

```sh
pgsync
```

Fetch specific tables

```sh
pgsync table1,table2
```

Fetch specific rows (truncates the destination table first)

```sh
pgsync products --where "id < 100"
```

### Exclude Tables

```sh
pgsync --exclude users
```

To always exclude, add to `.pgsync.yml`:

```yml
exclude:
  - table1
  - table2
```

For Rails, you probably want to exclude schema migrations.

```yml
exclude:
  - schema_migrations
```

### Schema

Fetch schema

```sh
pgsync schema
```

Specify tables

```sh
pgsync schema table1,table2
```

### Groups

Define groups in `.pgsync.yml`:

```yml
groups:
  group1:
    - table1
    - table2
  group2:
    - table3
    - table4
```

And run:

```sh
pgsync groups group1,group2
```

## Sensitive Information

Prevent sensitive information - like passwords and email addresses - from leaving the remote server.

Define rules in `.pgsync.yml`:

```yml
data_rules:
  email: unique_email
  last_name: random_letter
  birthday: random_date
  users.auth_token:
    value: secret
  visits_count:
    statement: "(RANDOM() * 10)::int"
  encrypted_*: null
```

`last_name` matches all columns named `last_name`, while `users.last_name` matches only the `users` table. Wildcards are supported, and the first matching rule is applied.

The options for replacement are:

- null
- value
- statement
- unique_email
- unique_phone
- random_letter
- random_int
- random_date
- random_time
- random_ip
- untouched

## Safety

To keep you from accidentally overwriting production, the destination is limited to `localhost` or `127.0.0.1` by default.

To use another host, add `to_safe: true` to your `.pgsync.yml`.

## Thanks

Inspired by [heroku-pg-transfer](https://github.com/ddollar/heroku-pg-transfer).

## TODO

- Support for schemas other than `public`

## Contributing

Everyone is encouraged to help improve this project. Here are a few ways you can help:

- [Report bugs](https://github.com/ankane/pgsync/issues)
- Fix bugs and [submit pull requests](https://github.com/ankane/pgsync/pulls)
- Write, clarify, or fix documentation
- Suggest or add new features
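The README's rule semantics (bare column name matches any table, `table.column` is table-specific, `*` is a wildcard, first matching rule wins) can be sketched in a few lines of Ruby. The `matching_rule` helper and strategy symbols below are illustrative, not part of pgsync's API:

```ruby
# Illustrative rule set: a table-qualified rule listed first, a generic
# column rule, and a wildcard rule.
rules = {
  "users.last_name" => :untouched,     # only the users table
  "last_name"       => :random_letter, # every other table
  "encrypted_*"     => :null           # wildcard match
}

# Return the first [pattern, strategy] pair whose pattern matches either
# the bare column name or the table-qualified "table.column" form.
def matching_rule(rules, table, column)
  rules.find do |pattern, _strategy|
    regex = Regexp.new('\A' + Regexp.escape(pattern).gsub('\*', '[^.]*') + '\z')
    regex.match(column) || regex.match("#{table}.#{column}")
  end
end

matching_rule(rules, "users", "last_name")   # => ["users.last_name", :untouched]
matching_rule(rules, "people", "last_name")  # => ["last_name", :random_letter]
```

Because Ruby hashes preserve insertion order, "first matching rule" means first in the order rules appear in `.pgsync.yml`.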
data/Rakefile
ADDED
data/bin/pgsync
ADDED
data/config.yml
ADDED
@@ -0,0 +1,27 @@
# source database URL
# database URLs take the format of:
# postgres://user:password@host:port/dbname
# we recommend a command which outputs a database URL
# so sensitive information is not included in this file
from: $(heroku config:get DATABASE_URL)

# destination database URL
to: postgres://localhost:5432/myapp_development

# exclude tables
# exclude:
#   - schema_migrations

# define groups
# groups:
#   group1:
#     - table1
#     - table2

# protect sensitive information
data_rules:
  email: unique_email
  phone: unique_phone
  last_name: random_letter
  birthday: random_date
  encrypted_*: null
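The `$(command)` form in `from:` above is resolved by shelling out before connecting, so the URL never has to live in the file. A minimal sketch of that substitution (mirroring `parse_source` in `lib/pgsync.rb`):

```ruby
# Resolve a "$(command)" source by running the command and using its
# output as the database URL; plain URLs pass through unchanged.
def parse_source(source)
  if source && source.start_with?("$(") && source.end_with?(")")
    source = `#{source[2..-2]}`.chomp
  end
  source
end

parse_source("postgres://localhost:5432/dev")  # => "postgres://localhost:5432/dev"
parse_source("$(echo postgres://host/db)")     # => "postgres://host/db"
```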
data/lib/pgsync.rb
ADDED
@@ -0,0 +1,366 @@
require "pgsync/version"
require "yaml"
require "slop"
require "uri"
require "erb"
require "pg"
require "parallel"
require "multiprocessing"
require "fileutils"

module PgSync
  class Error < StandardError; end
  class Rollback < StandardError; end

  class Client
    def initialize(args)
      $stdout.sync = true
      @arguments, @options = parse_args(args)
      @config_file = @options[:config]
      @mutex = MultiProcessing::Mutex.new
    end

    # TODO clean up this mess
    def perform
      start_time = Time.now

      args, opts = @arguments, @options
      [:to, :from, :to_safe, :exclude].each do |opt|
        opts[opt] ||= config[opt.to_s]
      end
      command = args[0]

      if command == "setup"
        setup
      else
        source = parse_source(opts[:from])
        abort "No source" unless source
        source_uri = parse_uri(source)

        destination = parse_source(opts[:to])
        abort "No destination" unless destination
        destination_uri = parse_uri(destination)
        # %w (word array) is required here -- a plain %( ) string would pass
        # the check for any substring of "localhost 127.0.0.1"
        abort "Danger! Add `to_safe: true` to `.pgsync.yml` if the destination is not localhost or 127.0.0.1" unless %w(localhost 127.0.0.1).include?(destination_uri.host) || opts[:to_safe]

        print_uri("From", source_uri)
        print_uri("To", destination_uri)

        if args[0] == "schema"
          time =
            benchmark do
              puts "* Dumping schema"
              tables = to_arr(args[1]).map { |t| "-t #{t}" }.join(" ")
              dump_command = "pg_dump --verbose --schema-only --no-owner --no-acl --clean #{tables} #{to_url(source_uri)}"
              restore_command = "psql -q -d #{to_url(destination_uri)}"
              system("#{dump_command} | #{restore_command}")
            end

          puts "* DONE (#{time.round(1)}s)"
        else
          from_uri = source_uri
          to_uri = destination_uri

          tables =
            if args[0] == "groups"
              specified_groups = to_arr(args[1])
              specified_groups.map do |group|
                if (tables = config["groups"][group])
                  tables
                else
                  abort "Group not found: #{group}"
                end
              end.flatten
            elsif args[0] == "tables"
              to_arr(args[1])
            elsif args[0]
              to_arr(args[0])
            else
              nil
            end

          with_connection(from_uri, timeout: 3) do |conn|
            tables ||= self.tables(conn, "public") - to_arr(opts[:exclude])

            tables.each do |table|
              unless table_exists?(conn, table, "public")
                abort "Table does not exist in source: #{table}"
              end
            end
          end

          with_connection(to_uri, timeout: 3) do |conn|
            tables.each do |table|
              unless table_exists?(conn, table, "public")
                abort "Table does not exist in destination: #{table}"
              end
            end
          end

          Parallel.each(tables) do |table|
            time =
              benchmark do
                with_connection(from_uri) do |from_connection|
                  with_connection(to_uri) do |to_connection|
                    bad_fields = config["data_rules"]

                    from_fields = columns(from_connection, table, "public")
                    to_fields = columns(to_connection, table, "public")
                    shared_fields = to_fields & from_fields
                    extra_fields = to_fields - from_fields
                    missing_fields = from_fields - to_fields

                    from_sequences = sequences(from_connection, table, shared_fields)
                    to_sequences = sequences(to_connection, table, shared_fields)
                    shared_sequences = to_sequences & from_sequences
                    extra_sequences = to_sequences - from_sequences
                    missing_sequences = from_sequences - to_sequences

                    where = opts[:where]

                    @mutex.synchronize do
                      puts "* Syncing #{table}"
                      if where
                        puts "    #{where}"
                        where = " WHERE #{opts[:where]}"
                      end
                      puts "EXTRA COLUMNS: #{extra_fields}" if extra_fields.any?
                      puts "MISSING COLUMNS: #{missing_fields}" if missing_fields.any?
                      puts "EXTRA SEQUENCES: #{extra_sequences}" if extra_sequences.any?
                      puts "MISSING SEQUENCES: #{missing_sequences}" if missing_sequences.any?
                    end

                    if shared_fields.empty?
                      abort "No fields to copy: #{table}"
                    end

                    copy_fields = shared_fields.map { |f| f2 = bad_fields.to_a.find { |bf, bk| rule_match?(table, f, bf) }; f2 ? "#{apply_strategy(f2[1], f, from_connection)} AS #{escape_identifier(f)}" : escape_identifier(f) }.join(", ")
                    fields = shared_fields.map { |f| escape_identifier(f) }.join(", ")

                    seq_values = {}
                    shared_sequences.each do |seq|
                      seq_values[seq] = from_connection.exec("select last_value from #{seq}").to_a[0]["last_value"]
                    end

                    # use transaction to revert statement timeout
                    begin
                      from_connection.transaction do |fconn|
                        fconn.exec("SET statement_timeout = 0")
                        to_connection.exec("TRUNCATE #{table} CASCADE")
                        to_connection.copy_data "COPY #{table} (#{fields}) FROM STDIN" do
                          fconn.copy_data "COPY (SELECT #{copy_fields} FROM #{table}#{where}) TO STDOUT" do
                            while row = fconn.get_copy_data
                              to_connection.put_copy_data(row)
                            end
                          end
                        end
                        seq_values.each do |seq, value|
                          to_connection.exec("SELECT setval(#{escape(seq)}, #{escape(value)})")
                        end
                        raise PgSync::Rollback
                      end
                    rescue PgSync::Rollback
                      # success
                    end
                  end
                end
              end

            @mutex.synchronize do
              puts "* DONE #{table} (#{time.round(1)}s)"
            end
          end

          time = Time.now - start_time
          puts "Completed in #{time.round(1)}s"
        end
      end
      true
    end

    protected

    def parse_args(args)
      opts = Slop.parse(args) do |o|
        o.banner = "usage: pgsync [options]"
        o.string "--from", "source"
        o.string "--to", "destination"
        o.string "--where", "where"
        o.string "--exclude", "exclude tables"
        o.string "--config", "config file", default: ".pgsync.yml"
        # TODO much better name for this option
        o.boolean "--to-safe", "accept danger", default: false
        o.on "-v", "--version", "print the version" do
          puts PgSync::VERSION
          exit
        end
        o.on "-h", "--help", "prints help" do
          puts o
          exit
        end
      end
      [opts.arguments, opts.to_hash]
    rescue Slop::Error => e
      abort e.message
    end

    # TODO look down path
    def config
      @config ||= begin
        if File.exist?(@config_file)
          begin
            YAML.load_file(@config_file) || {}
          rescue Psych::SyntaxError => e
            raise PgSync::Error, e.message
          end
        else
          {}
        end
      end
    end

    def parse_source(source)
      if source && source[0..1] == "$(" && source[-1] == ")"
        source = `#{source[2..-2]}`.chomp
      end
      source
    end

    def setup
      if File.exist?(@config_file)
        abort "#{@config_file} exists."
      else
        FileUtils.cp(File.dirname(__FILE__) + "/../config.yml", @config_file)
        puts "#{@config_file} created. Add your database credentials."
      end
    end

    def with_connection(uri, timeout: 0)
      conn =
        PG::Connection.new(
          host: uri.host,
          port: uri.port,
          dbname: uri.path.sub(/\A\//, ""),
          user: uri.user,
          password: uri.password,
          connect_timeout: timeout
        )
      begin
        yield conn
      ensure
        conn.close
      end
    rescue PG::ConnectionBad => e
      puts
      abort e.message
    end

    def benchmark
      start_time = Time.now
      yield
      Time.now - start_time
    end

    def tables(conn, schema)
      query = "SELECT tablename FROM pg_catalog.pg_tables WHERE schemaname = $1 ORDER BY tablename ASC"
      conn.exec_params(query, [schema]).to_a.map { |row| row["tablename"] }
    end

    def columns(conn, table, schema)
      query = "SELECT column_name FROM information_schema.columns WHERE table_schema = $1 AND table_name = $2"
      conn.exec_params(query, [schema, table]).to_a.map { |row| row["column_name"] }
    end

    def table_exists?(conn, table, schema)
      query = "SELECT 1 FROM information_schema.tables WHERE table_schema = $1 AND table_name = $2"
      conn.exec_params(query, [schema, table]).to_a.size > 0
    end

    # TODO better performance
    def rule_match?(table, column, rule)
      regex = Regexp.new('\A' + Regexp.escape(rule).gsub('\*', '[^\.]*') + '\z')
      regex.match(column) || regex.match("#{table}.#{column}")
    end

    # TODO wildcard rules
    def apply_strategy(rule, column, conn)
      if rule.is_a?(Hash)
        if rule.key?("value")
          escape(rule["value"])
        elsif rule.key?("statement")
          rule["statement"]
        else
          abort "Unknown rule #{rule.inspect} for column #{column}"
        end
      else
        strategies = {
          "unique_email" => "'email' || id || '@example.org'",
          "untouched" => escape_identifier(column),
          "unique_phone" => "(id + 1000000000)::text",
          "random_int" => "(RANDOM() * 10)::int", # RANDOM() is the Postgres function; RAND() is MySQL
          "random_date" => "'1970-01-01'",
          "random_time" => "NOW()",
          "unique_secret" => "'secret' || id",
          "random_ip" => "'127.0.0.1'",
          "random_letter" => "'A'",
          "null" => "NULL",
          nil => "NULL"
        }
        if strategies[rule]
          strategies[rule]
        else
          abort "Unknown rule #{rule} for column #{column}"
        end
      end
    end

    def escape_identifier(value)
      PG::Connection.quote_ident(value)
    end

    def escape(value)
      if value.is_a?(String)
        "'#{quote_string(value)}'"
      else
        value
      end
    end

    # activerecord
    def quote_string(s)
      s.gsub(/\\/, '\&\&').gsub(/'/, "''")
    end

    def to_arr(value)
      if value.is_a?(Array)
        value
      else
        value.to_s.split(",")
      end
    end

    def parse_uri(url)
      uri = URI.parse(url)
      uri.host ||= "localhost"
      uri.port ||= 5432
      uri
    end

    def print_uri(prefix, uri)
      puts "#{prefix}: #{uri.path.sub(/\A\//, '')} on #{uri.host}:#{uri.port}"
    end

    def to_url(uri)
      uri = uri.dup
      uri.query = nil
      uri.to_s
    end

    def abort(message)
      raise PgSync::Error, message
    end

    def sequences(conn, table, columns)
      conn.exec("SELECT #{columns.map { |f| "pg_get_serial_sequence(#{escape(table)}, #{escape(f)}) AS #{f}" }.join(", ")}").to_a[0].values.compact
    end
  end
end
data/pgsync.gemspec
ADDED
@@ -0,0 +1,29 @@
# coding: utf-8
lib = File.expand_path("../lib", __FILE__)
$LOAD_PATH.unshift(lib) unless $LOAD_PATH.include?(lib)
require "pgsync/version"

Gem::Specification.new do |spec|
  spec.name          = "pgsync"
  spec.version       = PgSync::VERSION
  spec.authors       = ["Andrew Kane"]
  spec.email         = ["andrew@chartkick.com"]

  spec.summary       = "Quickly and securely sync data between environments"
  spec.homepage      = "https://github.com/ankane/pgsync"
  spec.license       = "MIT"

  spec.files         = `git ls-files -z`.split("\x0").reject { |f| f.match(%r{^(test|spec|features)/}) }
  spec.bindir        = "bin"
  spec.executables   = spec.files.grep(%r{^bin/}) { |f| File.basename(f) }
  spec.require_paths = ["lib"]

  spec.add_dependency "slop", ">= 4.2.0"
  spec.add_dependency "pg"
  spec.add_dependency "parallel"
  spec.add_dependency "multiprocessing"

  spec.add_development_dependency "bundler"
  spec.add_development_dependency "rake"
  spec.add_development_dependency "minitest"
end
metadata
ADDED
@@ -0,0 +1,154 @@
--- !ruby/object:Gem::Specification
name: pgsync
version: !ruby/object:Gem::Version
  version: 0.1.0
platform: ruby
authors:
- Andrew Kane
autorequire:
bindir: bin
cert_chain: []
date: 2015-12-08 00:00:00.000000000 Z
dependencies:
- !ruby/object:Gem::Dependency
  name: slop
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: 4.2.0
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: 4.2.0
- !ruby/object:Gem::Dependency
  name: pg
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
- !ruby/object:Gem::Dependency
  name: parallel
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
- !ruby/object:Gem::Dependency
  name: multiprocessing
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
- !ruby/object:Gem::Dependency
  name: bundler
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
  type: :development
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
- !ruby/object:Gem::Dependency
  name: rake
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
  type: :development
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
- !ruby/object:Gem::Dependency
  name: minitest
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
  type: :development
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
description:
email:
- andrew@chartkick.com
executables:
- pgsync
extensions: []
extra_rdoc_files: []
files:
- ".gitignore"
- Gemfile
- LICENSE.txt
- README.md
- Rakefile
- bin/pgsync
- config.yml
- lib/pgsync.rb
- lib/pgsync/version.rb
- pgsync.gemspec
homepage: https://github.com/ankane/pgsync
licenses:
- MIT
metadata: {}
post_install_message:
rdoc_options: []
require_paths:
- lib
required_ruby_version: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
    - !ruby/object:Gem::Version
      version: '0'
required_rubygems_version: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
    - !ruby/object:Gem::Version
      version: '0'
requirements: []
rubyforge_project:
rubygems_version: 2.4.5
signing_key:
specification_version: 4
summary: Quickly and securely sync data between environments
test_files: []
has_rdoc: