better_robots 0.0.1 → 1.0.0
This diff shows the content of publicly available package versions released to one of the supported registries. It is provided for informational purposes only and reflects the changes between versions as they appear in their public registries.
- checksums.yaml +7 -0
- data/.travis.yml +14 -0
- data/README.md +20 -7
- data/better_robots.gemspec +1 -0
- data/lib/better_robots/version.rb +1 -1
- metadata +10 -10
checksums.yaml
ADDED

```diff
@@ -0,0 +1,7 @@
+---
+SHA1:
+  metadata.gz: 83b05afc07541a3d877d570511bb1a215878aacb
+  data.tar.gz: bbbb9d0b00fafb73711fe508bd6154081a77c963
+SHA512:
+  metadata.gz: 06eb8f39b15772552a3fc6afb6902659376130e668beafafd3c9dff033ce2ab81011903a932594625e7aae0433c412bfe37cdaa9d729cc603b42ab35af88e4a2
+  data.tar.gz: b46f24857328f0f9f45a13de96e6861f4a7ff16b60f3fa319e0bf74ef8eb8d0b9a2b63f5f1ec7bf9fd0a9c7188cb312eca86394afdb698c803e7bca4112876be
```
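The checksums above are hex digests of the two archives packed inside the `.gem` file. How such digests are computed can be sketched with Ruby's standard `Digest` library (the helper name and inputs are illustrative, not part of better_robots):

```ruby
require "digest"

# Illustrative helper (not part of better_robots): compute the two digest
# types that checksums.yaml records, from an archive's raw bytes.
def gem_digests(data)
  {
    "SHA1"   => Digest::SHA1.hexdigest(data),
    "SHA512" => Digest::SHA512.hexdigest(data),
  }
end
```

Verifying a download would then mean calling `gem_digests(File.binread("data.tar.gz"))` and comparing the result against the recorded values.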
data/.travis.yml
ADDED
data/README.md
CHANGED

````diff
@@ -1,4 +1,6 @@
 # BetterRobots - Better SEO with robots.txt
+[](https://travis-ci.org/gerrypower/better_robots)
+[](https://codeclimate.com/github/gerrypower/better_robots)
 
 BetterRobots is designed to SEO enhance your robots.txt serving, in particular for situations where
 you have multiple domains or subdomains being served from one application. e.g. www.yoursite.com,
@@ -8,7 +10,7 @@ assets.yoursite.com, beta.yoursite.com, etc.
 
 A web application commonly has several subdomains that refer to the same application. For example,
 a standard practice to speed browser page load time, is to have multiple asset hosts aliased to your
-application, allowing a browser to simultaneously stream multiple assets. In an SEO context, each of
+application, allowing a browser to simultaneously stream multiple assets. In an SEO context, each of
 these aliased hosts are considered to be duplicate content. To avoid this, you should have different
 robots.txt that exclude search engines for all but your canonical domain.
 
@@ -21,7 +23,7 @@ robots.txt will return the following:
 
 www.yoursite.com/robots.txt -> User-agent: *
                                Crawl-Delay: 3
-
+
 assets0.yoursite.com/robots.txt -> User-agent: *
                                    Disallow: /
 
@@ -45,9 +47,9 @@ Or install it yourself as:
 
 ## Usage
 
-For Rails 3, add a route to config/routes.rb
+For Rails 3 & 4, add a route to config/routes.rb
 ```ruby
-
+get "/robots.txt" => BetterRobots::Generator
 ```
 
 For each domain name that you want a robots.txt file served, rename your public/robots.txt to
@@ -59,18 +61,29 @@ All other domain names will default to:
 
 ## <a name="works_with"></a>Works with:
 
 BetterRobots is a Rack based app, and should work with any Rack compatible framework. It has been tested with
-Rails 3.2 and Sinatra 1.3, and on the following Ruby implementations:
+Rails 3.2, Rails 4.0 and Sinatra 1.3, and on the following Ruby implementations:
 
 * JRuby 1.7.1
-* MRI 1.8.7
 * MRI 1.9.2
 * MRI 1.9.3
+* MRI 2.0.0
 * Rubinius 1.2.4
 * Ruby EE 1.8.7
 
+### Versioning
+This library aims to adhere to [Semantic Versioning 2.0.0](http://semver.org/). Violations of this scheme should be reported as
+bugs. Specifically, if a minor or patch version is released that breaks backward compatibility, that
+version should be immediately yanked and/or a new version should be immediately released that restores
+compatibility. Breaking changes to the public API will only be introduced with new major versions. As a
+result of this policy, once this gem reaches a 1.0 release, you can (and should) specify a dependency on
+this gem using the [Pessimistic Version Constraint](http://docs.rubygems.org/read/chapter/16#page74) with
+two digits of precision. For example:
+
+    spec.add_dependency 'better_robots', '~> 1.0'
+
 ### License
 
-
+better_robots is released under the [MIT license](http://www.opensource.org/licenses/MIT).
 
 ## Author
 
````
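The README changes above describe serving a per-host robots.txt with a blanket `Disallow` fallback for non-canonical hosts. That selection logic can be sketched as a plain Rack-style endpoint (the `<host>.robots.txt` file layout and the fallback body are assumptions for illustration, not the gem's actual implementation):

```ruby
# Sketch of host-based robots.txt dispatch, as the README describes.
ROBOTS_DIR     = "public"
DEFAULT_ROBOTS = "User-agent: *\nDisallow: /\n" # non-canonical hosts: exclude crawlers

def robots_response(env)
  host = env["HTTP_HOST"].to_s.split(":").first      # strip any port
  path = File.join(ROBOTS_DIR, "#{host}.robots.txt") # e.g. public/www.yoursite.com.robots.txt
  body = File.exist?(path) ? File.read(path) : DEFAULT_ROBOTS
  [200, { "Content-Type" => "text/plain" }, [body]]
end
```

Because each host resolves to its own file, the canonical domain can allow crawling while every alias (assets hosts, beta, etc.) falls through to the disallow-all default.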
data/better_robots.gemspec
CHANGED

```diff
@@ -11,6 +11,7 @@ Gem::Specification.new do |gem|
   gem.description = "Better SEO with robots.txt"
   gem.summary = gem.description
   gem.homepage = "https://github.com/gerrypower/better_robots"
+  gem.license = "MIT"
 
   gem.files = `git ls-files`.split($/)
   gem.executables = gem.files.grep(%r{^bin/}).map{ |f| File.basename(f) }
```
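The one-line gemspec change sets the gem's license; RubyGems exposes that single value through the `licenses` array that gets serialized into the gem metadata. A minimal sketch (hypothetical gem name, not the real gemspec):

```ruby
require "rubygems"

# Hypothetical minimal spec showing that `license=` feeds the `licenses` array.
spec = Gem::Specification.new do |gem|
  gem.name    = "example_gem" # illustrative name
  gem.version = "1.0.0"
  gem.summary = "demo"
  gem.license = "MIT"
end

puts spec.licenses.inspect # => ["MIT"]
```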
metadata
CHANGED

```diff
@@ -1,15 +1,14 @@
 --- !ruby/object:Gem::Specification
 name: better_robots
 version: !ruby/object:Gem::Version
-
-  version: 0.0.1
+  version: 1.0.0
 platform: ruby
 authors:
 - Gerry Power
 autorequire:
 bindir: bin
 cert_chain: []
-date:
+date: 2013-08-03 00:00:00.000000000 Z
 dependencies: []
 description: Better SEO with robots.txt
 email:
@@ -19,6 +18,7 @@ extensions: []
 extra_rdoc_files: []
 files:
 - .gitignore
+- .travis.yml
 - Gemfile
 - MIT-LICENSE
 - README.md
@@ -31,28 +31,28 @@ files:
 - test/routes.rb
 - test/test_helper.rb
 homepage: https://github.com/gerrypower/better_robots
-licenses:
+licenses:
+- MIT
+metadata: {}
 post_install_message:
 rdoc_options: []
 require_paths:
 - lib
 required_ruby_version: !ruby/object:Gem::Requirement
   requirements:
-  - -
+  - - '>='
     - !ruby/object:Gem::Version
       version: '0'
-  none: false
 required_rubygems_version: !ruby/object:Gem::Requirement
   requirements:
-  - -
+  - - '>='
     - !ruby/object:Gem::Version
      version: '0'
-  none: false
 requirements: []
 rubyforge_project:
-rubygems_version:
+rubygems_version: 2.0.3
 signing_key:
-specification_version:
+specification_version: 4
 summary: Better SEO with robots.txt
 test_files:
 - test/integration/better_robots_spec.rb
```