ghtorrent 0.7.1 → 0.7.2

data/CHANGELOG CHANGED
@@ -1,3 +1,6 @@
+ = Version 0.7.2
+ * Support for running in standalone mode
+
  = Version 0.7
  * Full support for issues (comments, labels etc) and pull requests
  * Cleaned up retrieval of pull request commits
data/Gemfile.lock CHANGED
@@ -1,7 +1,7 @@
  PATH
  remote: .
  specs:
- ghtorrent (0.6)
+ ghtorrent (0.7.2)
  amqp (~> 1.0.0)
  bson_ext (~> 1.8.0)
  daemons (~> 1.1.0)
@@ -15,7 +15,7 @@ GEM
  amq-client (1.0.2)
  amq-protocol (>= 1.2.0)
  eventmachine
- amq-protocol (1.4.0)
+ amq-protocol (1.5.0)
  amqp (1.0.2)
  amq-client (~> 1.0.2)
  amq-protocol (>= 1.3.0)
data/README.md CHANGED
@@ -1,16 +1,41 @@
- ghtorrent: Mirror and process the Github event steam
- =========================================================
+ ## ghtorrent: Mirror and process data from the Github API
+
+ A library and a collection of scripts used to retrieve data from the Github API
+ and extract metadata into an SQL database, in a modular and scalable manner. The
+ scripts are distributed as a Gem (`ghtorrent`), but they can also be run by
+ checking out this repository.
+
+ GHTorrent can be used for a variety of purposes, such as:
+
+ * Mirror the Github API event stream and follow links from events to actual data
+ to gradually build a [Github index](http://ghtorrent.org/dblite/)
+ * Create a queryable metadata index for a specific repository
+ * Query the Github API using intelligent caching to avoid duplicate queries
+
+ GHTorrent is composed of the following components (which can be used
+ individually):
+
+ * [APIClient](https://github.com/gousiosg/github-mirror/blob/master/lib/ghtorrent/api_client.rb): Knows how to query the Github API (both single entities and
+ pages) and respect the API request limit. Can be configured to override the
+ default IP address, in case of multihomed hosts. Uses configurable on-disk [caching](https://github.com/gousiosg/github-mirror/blob/master/lib/ghtorrent/cache.rb) to avoid retrieving data that do not change.
+ * [Retriever](https://github.com/gousiosg/github-mirror/blob/master/lib/ghtorrent/retriever.rb): Knows how to retrieve specific Github entities (users, repositories, watchers) by name. Uses an optional persister to avoid
+ retrieving data that have not changed.
+ * [Persister](https://github.com/gousiosg/github-mirror/blob/master/lib/ghtorrent/persister.rb): A key/value store, which can be backed by a real key/value store,
+ to store Github JSON replies and query them on request. The backing key/value
+ store must support arbitrary queries to the stored JSON objects.
+ * [GHTorrent](https://github.com/gousiosg/github-mirror/blob/master/lib/ghtorrent/ghtorrent.rb): Knows how to extract information from the data retrieved by
+ the retriever in order to update an SQL database (see [schema](http://ghtorrent.org/relational.html)) with metadata.
+
+ The Persister and GHTorrent components have configurable back ends:
+
+ * Persister: Either uses MongoDB > 2.0 (`mongo` driver) or no persister (`noop` driver)
+ * GHTorrent: GHTorrent is tested mainly with MySQL, but can theoretically be
+ used with any SQL database compatible with [Sequel](http://sequel.rubyforge.org/rdoc/files/doc/opening_databases_rdoc.html). Your mileage may vary.

- A collection of scripts used to mirror the Github event stream, for
- research purposes. The scripts are distributed as a Gem (`ghtorrent`),
- but they can also be run by checking out this repository.

- GHTorrent relies on the following software to work:
+ The distributed mirroring scripts also require RabbitMQ >= 2.8 or another AMQP-compatible message broker.

- * MongoDB > 2.0
- * RabbitMQ >= 2.7
- * MySQL >= 5.5. GHTorrent is tested mainly with MySQL, but can theoretically be
- used with any SQL database compatible with [Sequel](http://sequel.rubyforge.org/rdoc/files/doc/opening_databases_rdoc.html). Your milaege may vary.
+ #### Installing

  GHTorrent is written in Ruby (tested with 1.9). To install it as a Gem do:

@@ -19,8 +44,7 @@ sudo gem install ghtorrent
  </code>

  Depending on which SQL database you want to use, install the appropriate
- dependency gem. GHTorrent already installs the `mysql2` gem (if it fails,
- install the development package for `libmysql-dev` for your system).
+ dependency gem.

  <code>
  sudo gem install mysql2 #or sqlite3-ruby #or postgres
@@ -34,74 +58,41 @@ file to a file in your home directory. All provided scripts accept the `-c`
  option, which you can use to pass the location of the configuration file as
  a parameter.

- Edit the MongoDB and AMQP configuration options accordingly. The scripts
- require accounts with permissions to create queues and exchanges in the AMQP
- queue, collections in MongoDB and tables in the selected SQL database,
- respectively.
-
- To prepare MongoDB:
-
- <pre>
- $ mongo admin
- > db.addUser('github', 'github')
- > use github
- > db.addUser('github', 'github')
- </pre>
-
- To prepare RabbitMQ:
-
- <pre>
- $ rabbitmqctl add_user github
- $ rabbitmqctl set_permissions -p / github ".*" ".*" ".*"
-
- # The following will enable the RabbitMQ web admin for the github user
- # Not necessary to have, but good to debug and diagnose problems
- $ rabbitmq-plugins enable rabbitmq_management
- $ rabbitmqctl set_user_tags github administrator
- </pre>
-
- To prepare MySQL:
-
- <pre>
- $ mysql -u root -p
- mysql> create user 'github'@'localhost' identified by 'github';
- mysql> create database github;
- mysql> GRANT ALL PRIVILEGES ON github.* to github@'localhost';
- mysql> flush privileges;
- </pre>
-
- You can find more information of how you can setup a cluster of machines
+ You can find more information on how to set up a mirroring cluster of machines
  to retrieve data in parallel on the [Wiki](https://github.com/gousiosg/github-mirror/wiki/Setting-up-a-mirroring-cluster).

  ### Running

- To retrieve data with GHTorrent:
+ To mirror the event stream and capture all data:

  * `ght-mirror-events.rb` periodically polls Github's event
  queue (`https://api.github.com/events`), stores all new events in the
- `events` collection in MongoDB and posts them to the `github` exchange in
+ configured persister and posts them to the `github` exchange in
  RabbitMQ.

  * `ght-data_retrieval.rb` creates queues that route posted events to processor
  functions, which in turn use the appropriate Github API call to retrieve the
  linked contents, extract metadata to store in the SQL database and store the
- retrieved data in the appropriate collection in Mongo, to avoid further API
- calls. Data in the SQL database contain pointers (the MongoDB key) to the
- "raw" data in MongoDB.
-
- Both scripts can be run concurrently on more than one hosts, for resilience
- and performance reasons. To catch up with Github's event stream, it is
- usually enough to run `ght-mirror-events` on one host. To collect all data
- pointed by each event, one instance of `ght-data-retrieval` is not enough.
- Both scripts employ throttling mechanisms to keep API usage whithin the
- limits imposed by Github (currently 60 reqs/hr/ip). If you want the full
- 5000 reqs/hr/ip, you will have to provide your Github login details
- in the `config.yaml` file.
+ retrieved data in the appropriate collection in the persister, to avoid
+ duplicate API calls. Data in the SQL database contain pointers (the
+ `ext_ref_id` field) to the "raw" data in the persister.
+
+ To retrieve data for a repository or user:
+
+ * `ght-retrieve-repo` retrieves all data for a specific repository
+ * `ght-retrieve-user` retrieves all data for a specific user
+
+ To perform maintenance:
+
+ * `ght-load` loads selected events from the persister to the queue in order for
+ the `ght-data-retrieval` script to reprocess them
+ * `ght-get-more-commits` retrieves all commits for a specific repository

  #### Data

  You can find torrents for retrieving data on the
- [Available Torrents](https://ghtorrent.org/downloads.html) page. You need two sets of data:
+ [Available Torrents](https://ghtorrent.org/downloads.html) page. You can find two sets of data:

  * Raw events: Github's [event stream](https://api.github.com/events). These
  are the roots for mirroring operations. The `ght-data-retrieval` crawler starts
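For illustration, a minimal sketch of the back-end choice the README describes: the persister driver is picked by name (`mongo` or `noop`) through the `GHTorrent::Persister` factory that also changes further down in this diff. The wrapper class, the `require`, and the shape of `settings` are assumptions made for the sketch, not the gem's own wiring.

```ruby
require 'ghtorrent'

# Sketch only: pick a persister back end by name and hand it the parsed
# config.yaml Hash. 'noop' discards everything; 'mongo' needs a running
# MongoDB > 2.0, as noted above.
class PersisterDemo
  include GHTorrent::Persister

  def initialize(settings)
    @settings = settings
  end

  def store_event(event)
    store = connect('noop', @settings)  # or 'mongo'
    store.store('events', event)        # NoopPersister silently drops the data
  end
end
```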
@@ -3,7 +3,7 @@ module GHTorrent
  # Persister adapter that does not store any data.
  class NoopPersister < BaseAdapter

- def init(settings)
+ def initialize(settings)
  end

  def store(entity, data = {})
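The `init` → `initialize` rename above matters because the persister factory (changed further down in this diff) builds drivers with `driver.new(settings)`, and Ruby's `Class#new` dispatches to `#initialize`; a method named `init` is never called during construction. A standalone illustration, independent of the gem:

```ruby
# Class#new allocates the object and then calls #initialize, so constructor
# arguments only reach the adapter when the method is actually named initialize.
class BrokenAdapter
  def init(settings)        # never invoked by new
    @settings = settings
  end
end

class FixedAdapter
  def initialize(settings)
    @settings = settings
  end
end

BrokenAdapter.new                        # builds, but @settings is never set
FixedAdapter.new('parsed config hash')   # @settings is set as intended
```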
@@ -28,20 +28,25 @@ module GHTorrent
  def paged_api_request(url, pages = config(:mirror_history_pages_back),
  cache = true, last = nil)

- url = if not url.include?("per_page")
- if url.include?("?")
- url + "&per_page=100"
+ url = if not url.include?('per_page')
+ if url.include?('?')
+ url + '&per_page=100'
  else
- url + "?per_page=100"
+ url + '?per_page=100'
  end
  else
  url
  end

- data = if CGI::parse(URI::parse(url).query).has_key?("page")
+ params = CGI::parse(URI::parse(url).query)
+ data = if params.has_key?('page') or (params.has_key?('last_sha'))
  api_request_raw(url, use_cache?(cache, method = :paged))
  else
- api_request_raw(url, false)
+ if @cache_mode == :all
+ api_request_raw(url, true)
+ else
+ api_request_raw(url, false)
+ end
  end

  return [] if data.nil?
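The hunk above normalizes every paged URL to ask for 100 items per page and now also treats `last_sha` as a pagination marker; with the new `all` cache mode even a first page can be answered from the on-disk cache. A standalone sketch of the same URL handling (the helper name is invented for illustration):

```ruby
require 'cgi'
require 'uri'

# Append per_page=100 unless the caller already set it, mirroring the logic above.
def with_per_page(url)
  return url if url.include?('per_page')
  url + (url.include?('?') ? '&' : '?') + 'per_page=100'
end

url = with_per_page('https://api.github.com/repos/gousiosg/github-mirror/commits')
# => "https://api.github.com/repos/gousiosg/github-mirror/commits?per_page=100"

params = CGI.parse(URI.parse(url).query)
params.has_key?('page') || params.has_key?('last_sha')
# => false: a first page, which is only served from cache when cache_mode is 'all'
```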
@@ -77,7 +82,7 @@ module GHTorrent

  # A normal request. Returns a hash or an array of hashes representing the
  # parsed JSON result.
- def api_request(url, cache = false)
+ def api_request(url, cache = true)
  parse_request_result api_request_raw(url, use_cache?(cache))
  end

@@ -87,14 +92,18 @@ module GHTorrent
  # request
  def use_cache?(client_request, method = :non_paged)
  @cache_mode ||= case config(:cache_mode)
- when "dev"
+ when 'dev'
  :dev
- when "prod"
+ when 'prod'
  :prod
+ when 'all'
+ :all
  else
  raise GHTorrentException.new("Don't know cache configuration #{@cache_mode}")
  end
  case @cache_mode
+ when :all
+ return true
  when :dev
  unless client_request
  return false
@@ -160,7 +169,7 @@ module GHTorrent
  end

  total = Time.now.to_ms - start_time.to_ms
- debug "APIClient: Request: #{url} #{if from_cache then " from cache," else "(#{contents.meta['x-ratelimit-remaining']} remaining)," end} Total: #{total} ms"
+ debug "APIClient: Request: #{url} #{if from_cache then "from cache," else "(#{contents.meta['x-ratelimit-remaining']} remaining)," end} Total: #{total} ms"

  if not from_cache and config(:respect_api_ratelimit) and
  contents.meta['x-ratelimit-remaining'].to_i < 20
@@ -201,7 +210,7 @@ module GHTorrent
  :http_basic_authentication => [@username, @passwd])}
  end

- if @attach_ip.nil? or @attach_ip.eql? "0.0.0.0"
+ if @attach_ip.nil? or @attach_ip.eql? '0.0.0.0'
  @open_func.call(url)
  else
  attach_to(@attach_ip) do
@@ -33,7 +33,7 @@ module GHTorrent
  end
  end
  rescue
- warn "Could not cache object #{file} for key #{key}"
+ warn "Cache: Could not cache object #{file} for key #{key}"
  end
  end

@@ -47,7 +47,7 @@ module GHTorrent
  end

  unless (Time.now() - File.mtime(file)) < max_life
- debug "Cached object for key #{key} too old"
+ debug "Cache: Cached object for key #{key} too old"
  return nil
  end

@@ -58,7 +58,7 @@ module GHTorrent
  end
  end
  rescue
- warn "Could not read object from cache location #{file}"
+ warn "Cache: Could not read object from cache location #{file}"
  File.delete(file)
  end
  end
@@ -20,6 +20,7 @@ module GHTorrent
  @settings = settings
  @ext_uniq = config(:uniq_id)
  @logger = Logger.new(STDOUT)
+ debug "GHTorrent: Using cache dir #{config(:cache_dir)}"
  end

  # Get a connection to the database
@@ -40,7 +41,7 @@ module GHTorrent
  end

  def persister
- @persister ||= connect(:mongo, @settings)
+ @persister ||= connect(config(:mirror_persister), @settings)
  @persister
  end

@@ -3,20 +3,21 @@ require 'sequel'
  require 'ghtorrent/migrations/mysql_defaults'

  Sequel.migration do
+
  up do

- puts "Adding column forked_from in table projects"
+ puts 'Adding column forked_from in table projects'
  alter_table :projects do
  add_foreign_key :forked_from, :projects, :null => true
  end

- puts "Migrating data from forks to project(forked_from)"
- DB.transaction(:rollback => :reraise, :isolation => :committed) do
- DB[:projects].each do |p|
- fork = DB[:forks].first(:forked_project_id => p[:id])
+ puts 'Migrating data from forks to project(forked_from)'
+ self.transaction(:rollback => :reraise, :isolation => :committed) do
+ self[:projects].each do |p|
+ fork = self[:forks].first(:forked_project_id => p[:id])
  unless fork.nil?
- source = DB[:projects].first(:id => fork[:forked_from_id])
- DB[:projects].filter(:id => p[:id]).update(:forked_from => source[:id])
+ source = self[:projects].first(:id => fork[:forked_from_id])
+ self[:projects].filter(:id => p[:id]).update(:forked_from => source[:id])
  puts "#{p[:owner_id]}/#{p[:name]} is forked from #{source[:owner_id]}/#{source[:name]}"
  end
  end
@@ -10,26 +10,28 @@ Sequel.migration do
  :default => false

  puts "Updating pull_requests.merged"
- DB.transaction(:rollback => :reraise, :isolation => :committed) do
- DB << "update pull_requests pr
- set pr.merged = true
- where exists (select *
- from pull_request_commits prc, project_commits pc
- where prc.commit_id = pc.commit_id
- and prc.pull_request_id = pr.id
- and pc.project_id = pr.base_repo_id
- and pr.base_repo_id <> pr.head_repo_id)"
- DB << "update pull_requests pr
- set pr.merged = true
+ self.transaction(:rollback => :reraise, :isolation => :committed) do
+ self << "update pull_requests
+ set merged = '1'
+ where exists (
+ select *
+ from pull_request_commits prc, project_commits pc
+ where prc.commit_id = pc.commit_id
+ and prc.pull_request_id = pull_requests.id
+ and pc.project_id = pull_requests.base_repo_id
+ and pull_requests.base_repo_id <> pull_requests.head_repo_id);"
+
+ self << "update pull_requests
+ set merged = '1'
  where exists(
  select prh.created_at
  from pull_request_history prh
- where prh.action='merged' and prh.pull_request_id=pr.id)"
+ where prh.action='merged' and prh.pull_request_id=pull_requests.id)"
  end

- puts "Correcting intra_branch field"
- DB.transaction(:rollback => :reraise, :isolation => :committed) do
- DB << "update pull_requests set intra_branch = true where base_repo_id = head_repo_id"
+ puts 'Correcting intra_branch field'
+ self.transaction(:rollback => :reraise, :isolation => :committed) do
+ self << "update pull_requests set intra_branch = '1' where base_repo_id = head_repo_id"
  end
  end
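Both migrations above replace the global `DB` constant with `self`: inside the `up`/`down` blocks of `Sequel.migration`, `self` is the `Sequel::Database` the migrator is applying the migration to, so datasets (`self[:table]`) and raw SQL (`self << "..."`) need no global. A minimal standalone sketch; the table and SQL below are examples only, not taken from the gem:

```ruby
require 'sequel'

# Inside up/down, self is the database the migration runs against --
# the same pattern the migrations above switch to.
Sequel.migration do
  up do
    self.transaction(:rollback => :reraise) do
      self[:projects].where(:forked_from => 0).each do |p|
        puts "project #{p[:id]} has no fork source"
      end
      self << "update projects set forked_from = NULL where forked_from = 0"
    end
  end
end
```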
 
@@ -13,9 +13,9 @@ module GHTorrent

  # Factory method for retrieving persistence connections.
  # The +settings+ argument is a fully parsed YAML document
- # passed on to adapters. The available +adapter+ are :mongo and :noop
+ # passed on to adapters. The available +adapter+ are 'mongo' and 'noop'
  def connect(adapter, settings)
- driver = ADAPTERS[adapter]
+ driver = ADAPTERS[adapter.intern]
  driver.new(settings)
  end

@@ -559,7 +559,7 @@ module GHTorrent
  paged_api_request(ghurl urls)
  end

- items.each do |x|
+ items = items.map do |x|
  x['repo'] = repo
  x['owner'] = user

@@ -589,12 +589,18 @@ module GHTorrent
  debug "Retriever: #{entity} #{user}/#{repo} -> #{x[descriminator]} exists"
  end
  end
+ # If the persistence driver does not set an ext_ref_id key, set a dummy
+ # one here
+ unless x.has_key? ext_uniq
+ x[ext_uniq] = '0'
+ end
+ x
  end

  if item_id.nil?
- persister.find(entity, selector)
+ items
  else
- repo_bound_instance(entity, selector, descriminator, item_id)
+ [items.find{|x| x[descriminator] == item_id}]
  end
  end

@@ -9,68 +9,73 @@ module GHTorrent
  include GHTorrent::Utils

  CONFIGKEYS = {
- :amqp_host => "amqp.host",
- :amqp_port => "amqp.port",
- :amqp_username => "amqp.username",
- :amqp_password => "amqp.password",
- :amqp_exchange => "amqp.exchange",
- :amqp_prefetch => "amqp.prefetch",
+ :amqp_host => 'amqp.host',
+ :amqp_port => 'amqp.port',
+ :amqp_username => 'amqp.username',
+ :amqp_password => 'amqp.password',
+ :amqp_exchange => 'amqp.exchange',
+ :amqp_prefetch => 'amqp.prefetch',

- :sql_url => "sql.url",
+ :sql_url => 'sql.url',

- :mirror_urlbase => "mirror.urlbase",
- :mirror_pollevery => "mirror.pollevery",
- :mirror_persister => "mirror.persister",
- :mirror_commit_pages_new_repo => "mirror.commit_pages_new_repo",
- :mirror_history_pages_back => "mirror.history_pages_back",
- :uniq_id => "mirror.uniq_id",
- :user_agent => "mirror.user_agent",
+ :mirror_urlbase => 'mirror.urlbase',
+ :mirror_pollevery => 'mirror.pollevery',
+ :mirror_persister => 'mirror.persister',
+ :mirror_commit_pages_new_repo => 'mirror.commit_pages_new_repo',
+ :mirror_history_pages_back => 'mirror.history_pages_back',
+ :uniq_id => 'mirror.uniq_id',
+ :user_agent => 'mirror.user_agent',

- :cache_mode => "mirror.cache_mode",
- :cache_dir => "mirror.cache_dir",
- :cache_stale_age => "mirror.cache_stale_age",
+ :cache_mode => 'mirror.cache_mode',
+ :cache_dir => 'mirror.cache_dir',
+ :cache_stale_age => 'mirror.cache_stale_age',

- :github_username => "mirror.username",
- :github_passwd => "mirror.passwd",
+ :github_username => 'mirror.username',
+ :github_passwd => 'mirror.passwd',

- :respect_api_ratelimit => "mirror.respect_api_ratelimit",
+ :respect_api_ratelimit => 'mirror.respect_api_ratelimit',

- :attach_ip => "mirror.attach_ip"
+ :attach_ip => 'mirror.attach_ip'
  }

  DEFAULTS = {
- :amqp_host => "localhost",
+ :amqp_host => 'localhost',
  :amqp_port => 5672,
- :amqp_username => "github",
- :amqp_password => "github",
- :amqp_exchange => "github",
+ :amqp_username => 'github',
+ :amqp_password => 'github',
+ :amqp_exchange => 'github',
  :amqp_prefetch => 1,

- :sql_url => "sqlite://github.db",
+ :sql_url => 'sqlite://github.db',

- :mirror_urlbase => "https://api.github.com/",
- :mirror_pollevery => "mirror.pollevery",
- :mirror_persister => "no-op",
+ :mirror_urlbase => 'https://api.github.com/',
+ :mirror_pollevery => 'mirror.pollevery',
+ :mirror_persister => 'noop',
  :mirror_commit_pages_new_repo => 3,
  :mirror_history_pages_back => 1,
- :uniq_id => "ext_ref_id",
- :user_agent => "ghtorrent",
+ :uniq_id => 'ext_ref_id',
+ :user_agent => 'ghtorrent',

- :cache_mode => "dev",
- :cache_dir => Dir::tmpdir + File::SEPARATOR + "ghtorrent",
+ :cache_mode => 'dev',
+ :cache_dir => Dir::tmpdir + File::SEPARATOR + 'ghtorrent',
  :cache_stale_age => 604800,

- :github_username => "foo",
- :github_passwd => "bar",
+ :github_username => 'foo',
+ :github_passwd => 'bar',

- :respect_api_ratelimit => "true",
+ :respect_api_ratelimit => 'true',

- :attach_ip => "0.0.0.0"
+ :attach_ip => '0.0.0.0'
  }

  def config(key, use_default = true)
  begin
- read_value(settings, CONFIGKEYS[key])
+ a = read_value(settings, CONFIGKEYS[key])
+ if a.nil? && use_default
+ DEFAULTS[key]
+ else
+ a
+ end
  rescue Exception => e
  if use_default
  DEFAULTS[key]
@@ -91,7 +96,7 @@ module GHTorrent
  end

  def settings
- raise Exception.new("Unimplemented")
+ raise Exception.new('Unimplemented')
  end

  end
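The `config` change above means a key that is simply missing from config.yaml now falls back to `DEFAULTS`, rather than only when the lookup raises. A standalone sketch of the lookup path; the dotted-path walker below is a stand-in for `GHTorrent::Utils#read_value`, and the nested-Hash shape of the parsed config is an assumption:

```ruby
require 'yaml'

# Stand-in for read_value: walk a dotted path (as listed in CONFIGKEYS)
# through the nested Hash that YAML.load produces for config.yaml.
def read_value(settings, path)
  path.split('.').reduce(settings) { |acc, key| acc.is_a?(Hash) ? acc[key] : nil }
end

settings = YAML.load("mirror:\n  persister: mongo\n")

read_value(settings, 'mirror.persister')   # => "mongo"
read_value(settings, 'mirror.cache_mode')  # => nil, so config(:cache_mode)
                                           #    now returns DEFAULTS[:cache_mode] ('dev')
```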
data/lib/version.rb CHANGED
@@ -1,5 +1,5 @@
  module GHTorrent

- VERSION = '0.7.1'
+ VERSION = '0.7.2'

  end
metadata CHANGED
@@ -1,7 +1,7 @@
  --- !ruby/object:Gem::Specification
  name: ghtorrent
  version: !ruby/object:Gem::Version
- version: 0.7.1
+ version: 0.7.2
  prerelease:
  platform: ruby
  authors:
@@ -10,7 +10,7 @@ authors:
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2013-05-03 00:00:00.000000000 Z
+ date: 2013-05-12 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: amqp
@@ -180,24 +180,8 @@ files:
  homepage: https://github.com/gousiosg/github-mirror
  licenses: []
  post_install_message: !binary |-
- WxtbMzJtVmVyc2lvbiAwLjcbWzBtXSBGdWxsIHN1cHBvcnQgZm9yIGlzc3Vl
- cyAoY29tbWVudHMsIGxhYmVscyBldGMpIGFuZCBwdWxsIHJlcXVlc3RzClsb
- WzMybVZlcnNpb24gMC43G1swbV0gQ2xlYW5lZCB1cCByZXRyaWV2YWwgb2Yg
- cHVsbCByZXF1ZXN0IGNvbW1pdHMKWxtbMzJtVmVyc2lvbiAwLjcbWzBtXSBD
- bGVhbmVkIHVwIGFzc29jaWF0aW9uIG9mIGNvbW1pdHMgd2l0aCByZXBvc2l0
- b3JpZXMuClsbWzMybVZlcnNpb24gMC43G1swbV0gUmVtb3ZlZCB0aGUgZm9y
- a3MgdGFibGUuIEZvcmtzIGFyZSBub3cgdHJhY2tlZCBieSB0aGUgZm9ya2Vk
- X2Zyb20gZmllbGQgaW4gcHJvamVjdHMKWxtbMzJtVmVyc2lvbiAwLjcbWzBt
- XSBVc2UgR2l0aHViJ3MgSFRUUCBoZWFkZXJzIGZvciByZXF1ZXN0IHRocm90
- dGxpbmcKWxtbMzJtVmVyc2lvbiAwLjcbWzBtXSBTdXBwb3J0IGZvciBzZXR0
- aW5nIHVzZXIgYWdlbnQgaGVhZGVyIGFzIHBlciBHaXRodWIgQVBJIHJlcXVp
- cmVtZW50cwpbG1szMm1WZXJzaW9uIDAuNxtbMG1dIFN1cHBvcnQgZm9yIG1h
- cmtpbmcgcHJvamVjdHMgYXMgZGVsZXRlZCAocnVuIGZpeGVzL3VwZGF0ZV9k
- ZWxldGVkLnJiKQpbG1szMm1WZXJzaW9uIDAuNxtbMG1dIE5ldyB0b29sIChn
- aHQtcmV0cmlldmUtdXNlcikgdG8gcmV0cmlldmUgYWxsIGRhdGEgZm9yIGEg
- c2luZ2xlIHVzZXIgClsbWzMybVZlcnNpb24gMC43G1swbV0gU3VwcG9ydCBm
- b3IgcnVubmluZyB3aXRob3V0IGEgY29uZmlnLnlhbWwgZmlsZQpbG1szMm1W
- ZXJzaW9uIDAuNxtbMG1dIFRlY2huaWNhbCByZXBvcnQgaW4gZG9jLwo=
+ WxtbMzJtVmVyc2lvbiAwLjcuMhtbMG1dIFN1cHBvcnQgZm9yIHJ1bm5pbmcg
+ aW4gc3RhbmRhbG9uZSBtb2RlCg==
  rdoc_options:
  - --charset=UTF-8
  require_paths: