prosto_cache 0.2.1 → 0.2.2

data/Gemfile CHANGED
@@ -1,4 +1,4 @@
- source "http://rubygems.org"
+ source 'https://rubygems.org'

  group :development do
    gem "rspec" #, "~> 2.3.0"
data/Gemfile.lock CHANGED
@@ -1,5 +1,5 @@
  GEM
-   remote: http://rubygems.org/
+   remote: https://rubygems.org/
    specs:
      diff-lcs (1.2.4)
      rake (10.0.4)
data/README.rdoc CHANGED
@@ -1,8 +1,10 @@
  = prosto_cache

- prosto_cache provides a simple way to cache enumeration-like models in memory and to access this cache in uniform way.
+ {<img src="https://travis-ci.org/olek/prosto_cache.png?branch=master" alt="Build Status" />}[https://travis-ci.org/olek/prosto_cache]

- Here are couple thinks that it will NOT do for you.
+ prosto_cache provides a simple way to cache enumeration-like models in memory and to access this cache in a uniform way.
+
+ Here are a couple of things that it will NOT do for you.

  - it will not monkey patch rails in attempt to make itself 'totally' easy to use
  - it will not attempt to share/distribute cache
@@ -31,14 +33,14 @@ Usage:
  include ProstoCache
  2. Configure cache access keys (optional step, by default cache is accessed by key 'name')
  cache_accessor_keys %w(scope name)
- 3. Your model MUST have non-nullable column updated_at, add new migration with it if missing
+ 3. Your model MUST have a non-nullable column updated_at, add new migration with it if missing
  (this field is used for invalidating cache in other ruby processes)
  4. Access cached model object in your code like this
  * Simple case of one key

  YourModel.cache[:key1]
  * Case of 2 or more keys where it is known ahead of time that all parent keys exist.
- Looks conventional, but is not quite unreliable because brackets operation on nil will raise an error.
+ Looks conventional, but is not quite reliable because brackets operation on nil will raise an error.
  If you know a good way to make it reliable, let me know!

  YourModel.cache[:key1][:key2][:key3]
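
As a quick orientation for the code that follows, here is a minimal sketch of the access pattern the README describes; note that the inline documentation added in lib/prosto_cache/prosto_model_cache.rb (further down) passes composite keys to a single brackets call instead of chaining them. Model and column names here are illustrative, not taken from the gem.

   class YourModel < ActiveRecord::Base
     include ProstoCache
     cache_accessor_keys %w(scope name)   # optional, defaults to 'name'
   end

   YourModel.cache[:some_scope, :some_name]    # symbol keys raise BadCacheKeyError on a miss
   YourModel.cache['some_scope', 'some_name']  # string keys return nil on a miss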
data/lib/prosto_cache/extensions.rb ADDED
@@ -0,0 +1,17 @@
+ module ProstoCache
+   module Extensions
+     def lookup_enum_for(name, enum_class=nil)
+       raise ArgumentError, "No name provided" unless name
+       enum_class = name.to_s.classify.constantize unless enum_class
+       define_method("#{name}_with_lookup=") do |o|
+         new_value = o
+         unless o.is_a?(enum_class)
+           new_value = o.blank? ? nil : enum_class[o.to_s]
+         end
+         self.send("#{name}_without_lookup=", new_value)
+       end
+
+       alias_method_chain "#{name}=", :lookup
+     end
+   end
+ end
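
A sketch of how the writer generated by lookup_enum_for behaves, based on the documentation block that appears later in this diff. The diff does not show how Extensions gets mixed into a model class, so the explicit extend call below is an assumption, and the model names are illustrative.

   class OtherModel < ActiveRecord::Base
     extend ProstoCache::Extensions   # assumed wiring; not shown in this diff
     belongs_to :your_model
     lookup_enum_for :your_model
   end

   other = OtherModel.new
   other.your_model = YourModel.cache['some_name']  # enum instances are assigned as-is
   other.your_model = 'some_name'                   # other non-blank values go through YourModel['some_name']
   other.your_model = ''                            # blank values are stored as nil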
data/lib/prosto_cache/prosto_hash.rb ADDED
@@ -0,0 +1,43 @@
+ module ProstoCache
+   class ProstoHash
+     extend Forwardable
+     def initialize(hash = {})
+       @hash = hash.each_with_object({}) { |(k, v), memo| memo[k.to_sym] = v }
+     end
+
+     def [](key)
+       raise ArgumentError unless key
+       hash[key.to_sym]
+     end
+
+     def []=(key, value)
+       raise ArgumentError unless key
+       hash[key.to_sym] = value
+     end
+
+     def keys(init = nil)
+       @keys = @keys || init || hash.keys
+     end
+
+     def values(init = nil)
+       @values = @values || init || hash.values
+     end
+
+     def_delegators :hash, :to_s, :inspect
+
+     private
+
+     attr_reader :hash
+   end
+
+   def self.fail_on_missing_value?(litmus)
+     case litmus
+     when Symbol
+       true
+     when String
+       false
+     else
+       raise ArgumentError, "Unknown type of cache key #{litmus.inspect}"
+     end
+   end
+ end
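
An illustrative sketch of what the class above provides: keys are symbolized both when stored and when looked up, keys/values act as write-once memoized accessors, and fail_on_missing_value? is what later decides whether a cache miss raises or returns nil.

   h = ProstoCache::ProstoHash.new('foo' => 1)
   h[:foo]                                    # => 1
   h['foo']                                   # => 1, string and symbol keys are interchangeable

   ProstoCache.fail_on_missing_value?(:foo)   # => true,  symbol keys raise on a miss
   ProstoCache.fail_on_missing_value?('foo')  # => false, string keys return nil on a miss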
data/lib/prosto_cache/prosto_model_cache.rb ADDED
@@ -0,0 +1,225 @@
+ =begin
+ This library provides a simple way to cache model and to access this cache in some canonical way.
+ Any changes to the model's objects will automatically result in cache reload.
+ Cache reload in other ruby processes of same app will be triggered as well, but
+ with some delay (currently up to 60 seconds).
+ If the delay in cache reloading is not an option, well, this simple library will
+ not work for you, and you will have to use something fancier, like Memcached.
+
+ Usage:
+
+ * Add ProstoCache mixin to your model
+     class YourModel < ActiveRecord::Base
+       include ProstoCache
+
+ * Configure cache access keys (optional step, by default cache is accessed by key 'name')
+     cache_accessor_keys %w(scope name)
+
+ * Your model must have non-nullable column updated_at, add it in migration if
+   it is missing (this field is used for invalidating cache in other ruby processes).
+
+ * Access cached model object in your code like this
+   Simple case of one key
+     YourModel.cache[:key1]
+   Case of 2 or more keys
+     YourModel.cache[:key1, :key2, :key3]
+
+ * Handling of non-existing cache values.
+   If cache is accessed using symbol key and value not found, it will raise BadCacheKeyError.
+   If cache is accessed using string key and value not found, it will return nil.
+   For complex keys type of last key component is the one taken into account.
+
+ * If you want to, you can add extra lookup helpers to the objects that relate
+   to the cached object, that will allow those objects to update 'string'
+   attribute, and that will result in database reference change.
+     class OtherModel < ActiveRecord::Base
+       belongs_to :your_model
+       lookup_enum_for :your_model
+     end
+
+ This lookup was intertionally not integrated 'seamlessly' with ActiveRecord since not
+ everybody would want that, and monkey-patching other library (AR) is not really
+ a good thing, even if it results in a 'smoother' experience where everything works as if by magic.
+ =end
+
+ module ProstoCache
+   class BadCacheKeyError < StandardError; end
+   class BadCacheValuesError < StandardError; end
+
+   # cache itself, contains pretty much all the logic
+   class ProstoModelCache
+
+     def self.fail_on_missing_value?(litmus)
+       case litmus
+       when Symbol
+         true
+       when String
+         false
+       else
+         raise ArgumentError, "Unknown type of cache key #{litmus.inspect}"
+       end
+     end
+
+     MAX_CACHE_LIFE = 60 # seconds
+
+     def initialize(model_class, accessor_keys, sort_keys)
+       raise ArgumentError, "No model class provided" unless model_class
+
+       @model_class = model_class
+       @accessor_keys = [*(accessor_keys || :name)]
+       @sort_keys = sort_keys ? [*(sort_keys)] : @accessor_keys
+     end
+
+     def invalidate
+       self.cache = self.signature = self.validated_at = nil
+     end
+
+     def [](*keys)
+       unless keys.length == accessor_keys.length
+         raise BadCacheKeyError, "Cached accessed by #{keys.length} keys, expected #{accessor_keys.length}"
+       end
+
+       keys.zip((1..keys.length)).inject(safe_cache) do |memo, (key, index)|
+         value = memo[key]
+         unless value
+           if ProstoModelCache.fail_on_missing_value?(keys.last)
+             raise BadCacheKeyError, key
+           else
+             value = ProstoHash.new unless index == keys.length
+           end
+         end
+
+         value
+       end
+     end
+
+     def keys
+       safe_cache.keys
+     end
+
+     def values
+       safe_cache.values
+     end
+
+     private
+
+     attr_reader :model_class, :cache, :signature, :validated_at, :accessor_keys, :sort_keys
+     attr_writer :cache, :signature, :validated_at
+
+     def safe_cache
+       time = Time.now.to_i
+
+       if cache && validated_at < time - MAX_CACHE_LIFE
+         current_cache_signature = validate_cache_signature(time)
+       end
+
+       unless cache
+         load_cache(time, current_cache_signature)
+       end
+
+       cache
+     end
+
+     def validate_cache_signature(time)
+       query_cache_signature.tap { |current_cache_signature|
+         if current_cache_signature == signature
+           self.validated_at = time
+         else
+           invalidate
+         end
+       }
+     end
+
+     def load_cache(time, current_cache_signature = nil)
+       fail "Can not load already loaded cache" if cache
+
+       current_cache_signature ||= query_cache_signature
+
+       cache_values = model_class.all
+       self.cache = build_cache(cache_values, accessor_keys)
+       cache.values(sorted_cache_values(cache_values))
+       cache.keys(sorted_keys(cache.values))
+       self.validated_at = time
+       self.signature = current_cache_signature
+     end
+
+
+     def build_cache(objects, attributes=[])
+       attributes = [*attributes]
+       if attributes.empty?
+         # terminal case
+         raise BadCacheValuesError, "No cache entry found" if objects.nil? || objects.empty?
+         raise BadCacheValuesError, "Non deterministic search result, more then one cache entry found" if objects.size > 1
+         return objects.first
+       else
+         reduced_attributes = attributes.dup
+         attribute = reduced_attributes.delete_at(0).to_sym
+         # first, bucketize to reduce problem's complexity
+         array_map = objects.each_with_object({}) do |o, memo|
+           key = o.public_send(attribute).to_s
+           memo[key] ||= []
+           memo[key] << o
+         end
+         # second, recurse and build cache from those reduced buckets!
+         array_map.each_with_object(ProstoHash.new) do |(attr_value, attr_bucket), memo|
+           memo[attr_value] = build_cache(attr_bucket, reduced_attributes)
+         end
+       end
+     end
+
+     def sorted_cache_values(cache_values)
+       cache_values.sort_by { |o|
+         sort_keys.inject('') { |memo, k|
+           memo << o.public_send(k)
+         }
+       }
+     end
+
+     def sorted_keys(cache_values)
+       cache_values.map { |o|
+         accessor_keys.inject([]) { |memo, k|
+           memo << o.public_send(k).to_sym
+         }
+       }.tap { |rtn|
+         rtn.flatten! if accessor_keys.length == 1
+       }
+     end
+
+     def query_cache_signature
+       raw_result = ActiveRecord::Base.connection.execute(
+         "select max(updated_at) as max_updated_at, max(id) as max_id, count(id) as count from #{model_class.table_name}"
+       )
+       array_result = case raw_result.class.name
+       when 'Mysql::Result'
+         [].tap { |rows| raw_result.each_hash { |h| rows << h } }
+       when 'Mysql2::Result'
+         [].tap { |rows| raw_result.each(:as => :hash) { |h| rows << h } }
+       when 'PGresult'
+         raw_result.map(&:to_hash)
+       else
+         fail "Result class #{raw_result.class.name} in unsupported"
+       end
+       array_result.map(&:symbolize_keys).first
+     end
+   end
+
+   def self.included(cl)
+
+     cl.after_save { cl.cache.invalidate }
+
+     class << cl
+
+       def cache
+         @cache ||= ProstoModelCache.new self, @accessor_keys, @sort_keys
+       end
+
+       def cache_accessor_keys(keys)
+         @accessor_keys = keys
+       end
+
+       def cache_sort_keys(keys)
+         @sort_keys = keys
+       end
+     end
+   end
+ end
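
One user-visible addition in 0.2.2 is the cache_sort_keys macro registered at the end of the file above: cache.values (and the accessor-key tuples returned by cache.keys) are now sorted, falling back to the accessor keys when no sort keys are configured. A minimal sketch, with illustrative column names:

   class YourModel < ActiveRecord::Base
     include ProstoCache

     cache_accessor_keys %w(key1 key2)
     cache_sort_keys %w(name key1)   # optional; omit to sort by the accessor keys
   end

   YourModel.cache.values   # all cached records, ordered by name, then key1
   YourModel.cache.keys     # accessor-key tuples (arrays of symbols) in the same order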
data/lib/prosto_cache.rb CHANGED
@@ -1,238 +1,3 @@
- =begin
- This library provides a simple way to cache model and to access this cache in some canonical way.
- Any changes to the model's objects will automatically result in cache reload.
- Cache reload in other ruby processes of same app will be triggered as well, but
- with some delay (currently up to 60 seconds).
- If delay in cache reloading is not an option, well, this simply library will
- not work for you, and you have to use something fancier, like Memcached.
-
- Usage:
-
- * Add ProstoCache mixin to your model
-     class YourModel < ActiveRecord::Base
-       include ProstoCache
-
- * Configure cache access keys (optional step, by default cache is accessed by key 'name')
-     cache_accessor_keys %w(scope name)
-
- * Your model must have non-nullable column updated_at, add it in migration if
-   it is missing (this field is used for invalidating cache in other ruby processes).
-
- * Access cached model object in your code like this
-   Simple case of one key
-     YourModel.cache[:key1]
-   Case of 2 or more keys
-     YourModel.cache[:key1, :key2, :key3]
-
- * Handling of non-existing cache values.
-   If cache is accessed using symbol key and value not found, it will raise BadCacheKeyError.
-   If cache is accessed using string key and value not found, it will return nil.
-   For complex keys type of last key component is the one taken into account.
-
- * If you want to, you can add extra lookup helpers to the objects that relate
-   to the cached object, that will allow those objects to update 'string'
-   attribute, and that will result in database reference change.
-     class OtherModel < ActiveRecord::Base
-       belongs_to :your_model
-       lookup_enum_for :your_model
-     end
-
- This lookup was intertionally not integrated 'seamlessly' with ActiveRecord since not
- everybody would want that, and monkey-patching other library (AR) is not really
- a good thing, even if it results in more 'smooth' experience where everything works as if by magic.
- =end
-
- module ProstoCache
-   class BadCacheKeyError < StandardError; end
-   class BadCacheValuesError < StandardError; end
-
-   # cache itself, contains pretty much all the logic
-   class ProstoModelCache
-
-     class ProstoHash
-       extend Forwardable
-       def initialize(hash = {})
-         @hash = hash.each_with_object({}) { |(k, v), memo| memo[k.to_sym] = v }
-       end
-
-       def [](key)
-         raise ArgumentError unless key
-         hash[key.to_sym]
-       end
-
-       def []=(key, value)
-         raise ArgumentError unless key
-         hash[key.to_sym] = value
-       end
-
-       def_delegators :hash, :to_s, :inspect, :keys, :values
-
-       private
-
-       attr_reader :hash
-     end
-
-     def self.fail_on_missing_value?(litmus)
-       case litmus
-       when Symbol
-         true
-       when String
-         false
-       else
-         raise ArgumentError, "Unknown type of cache key #{litmus.inspect}"
-       end
-     end
-
-     MAX_CACHE_LIFE = 60 # seconds
-
-     def initialize(model_class, accessor_keys)
-       raise ArgumentError, "No model class provided" unless model_class
-
-       @model_class = model_class
-       @accessor_keys = [*(accessor_keys || :name)]
-     end
-
-     def invalidate
-       self.cache = self.signature = self.validated_at = nil
-     end
-
-     def [](*keys)
-       unless keys.length == accessor_keys.length
-         raise BadCacheKeyError, "Cached accessed by #{keys.length} keys, expected #{accessor_keys.length}"
-       end
-
-       keys.zip((1..keys.length)).inject(safe_cache) do |memo, (key, index)|
-         value = memo[key]
-         unless value
-           if ProstoModelCache.fail_on_missing_value?(keys.last)
-             raise BadCacheKeyError, key
-           else
-             value = ProstoHash.new unless index == keys.length
-           end
-         end
-
-         value
-       end
-     end
-
-     def keys
-       safe_cache.keys
-     end
-
-     def values
-       safe_cache.values
-     end
-
-     private
-
-     attr_reader :model_class, :cache, :signature, :validated_at, :accessor_keys
-     attr_writer :cache, :signature, :validated_at
-
-     def safe_cache
-       time = Time.now.to_i
-
-       if cache && validated_at < time - MAX_CACHE_LIFE
-         current_cache_signature = validate_cache_signature(time)
-       end
-
-       unless cache
-         load_cache(time, current_cache_signature)
-       end
-
-       cache
-     end
-
-     def validate_cache_signature(time)
-       query_cache_signature.tap { |current_cache_signature|
-         if current_cache_signature == signature
-           self.validated_at = time
-         else
-           invalidate
-         end
-       }
-     end
-
-     def load_cache(time, current_cache_signature = nil)
-       fail "Can not load already loaded cache" if cache
-
-       current_cache_signature ||= query_cache_signature
-
-       self.cache = build_cache(model_class.all, accessor_keys)
-       self.validated_at = time
-       self.signature = current_cache_signature
-     end
-
-
-     def build_cache(objects, attributes=[])
-       attributes = [*attributes]
-       if attributes.empty?
-         # terminal case
-         raise BadCacheValuesError, "No cache entry found" if objects.nil? || objects.empty?
-         raise BadCacheValuesError, "Non deterministic search result, more then one cache entry found" if objects.size > 1
-         return objects.first
-       else
-         reduced_attributes = attributes.dup
-         attribute = reduced_attributes.delete_at(0).to_sym
-         # first, bucketize to reduce problem's complexity
-         array_map = objects.each_with_object({}) do |o, memo|
-           key = o.send(attribute).to_s
-           memo[key] ||= []
-           memo[key] << o
-         end
-         # second, recurse and build cache from those reduced buckets!
-         array_map.each_with_object(ProstoHash.new) do |(attr_value, attr_bucket), memo|
-           memo[attr_value] = build_cache(attr_bucket, reduced_attributes)
-         end
-       end
-     end
-
-     def query_cache_signature
-       raw_result = ActiveRecord::Base.connection.execute(
-         "select max(updated_at) as max_updated_at, max(id) as max_id, count(id) as count from #{model_class.table_name}"
-       )
-       array_result = case raw_result.class.name
-       when 'Mysql::Result'
-         [].tap { |rows| raw_result.each_hash { |h| rows << h } }
-       when 'Mysql2::Result'
-         [].tap { |rows| raw_result.each(:as => :hash) { |h| rows << h } }
-       when 'PGresult'
-         raw_result.map(&:to_hash)
-       else
-         fail "Result class #{raw_result.class.name} in unsupported"
-       end
-       array_result.map(&:symbolize_keys).first
-     end
-   end
-
-   def self.included(cl)
-
-     cl.after_save { cl.cache.invalidate }
-
-     class << cl
-
-       def cache
-         @cache ||= ProstoModelCache.new self, @accessor_keys
-       end
-
-       def cache_accessor_keys(keys)
-         @accessor_keys = keys
-       end
-     end
-   end
-
-   module Extensions
-     def lookup_enum_for(name, enum_class=nil)
-       raise ArgumentError, "No name provided" unless name
-       enum_class = name.to_s.classify.constantize unless enum_class
-       define_method("#{name}_with_lookup=") do |o|
-         new_value = o
-         unless o.is_a?(enum_class)
-           new_value = o.blank? ? nil : enum_class[o.to_s]
-         end
-         self.send("#{name}_without_lookup=", new_value)
-       end
-
-       alias_method_chain "#{name}=", :lookup
-     end
-   end
- end
+ require 'prosto_cache/prosto_hash'
+ require 'prosto_cache/prosto_model_cache'
+ require 'prosto_cache/extensions'
data/spec/prosto_cache_spec.rb CHANGED
@@ -8,12 +8,12 @@ describe "Model with prosto cache mixed in" do
  }
  }

- describe '#cache' do
- it "should add a cache method" do
- model_class.cache.should_not be_nil
- model_class.cache.should be_an_instance_of(ProstoCache::ProstoModelCache)
- end
+ it "should have cache method" do
+ model_class.cache.should_not be_nil
+ model_class.cache.should be_an_instance_of(ProstoCache::ProstoModelCache)
+ end

+ describe '.cache' do
  it "should not load cache if it was never accessed" do
  model_class.cache.should_not_receive(:query_cache_signature)
  model_class.should_not_receive(:all)
@@ -21,8 +21,10 @@ describe "Model with prosto cache mixed in" do
  model_class.cache.should_not be_nil
  end
  end
+ end

- describe 'cache' do
+ describe ProstoCache::ProstoModelCache do
+ context "with single key access" do
  let(:model_class) {
  Class.new {
  def self.after_save; end
@@ -41,7 +43,7 @@ describe "Model with prosto cache mixed in" do

  describe "#keys" do
  it "should return all keys from the cache" do
- model_class.cache.keys.should == [:foo, :bar]
+ model_class.cache.keys.should == [:bar, :foo]
  end
  end

@@ -49,42 +51,119 @@ describe "Model with prosto cache mixed in" do
  it "should return all values from the cache" do
  values = model_class.cache.values
  values.should have(2).instances
- values.map(&:name).should == %w(foo bar)
+ # values.map(&:name).should == %w(foo bar)
+ values.map(&:name).should include('foo')
+ values.map(&:name).should include('bar')
+ end
+
+ it "should sort values by accessor key" do
+ values = model_class.cache.values
+ values.map(&:name).should == %w(bar foo)
+ end
+
+ context 'when sort_keys are provided' do
+ let(:model_class) {
+ Class.new {
+ def self.after_save; end
+
+ include ProstoCache
+
+ cache_sort_keys(:reversed_name)
+
+ attr_accessor :name
+ def initialize(name)
+ self.name = name
+ end
+
+ def reversed_name
+ name.reverse
+ end
+ }.tap { |model_class|
+ model_class.cache.stub(:query_cache_signature).once.and_return(:something)
+ model_class.stub(:all).once.and_return(%w(foo bar).map { |n| model_class.new(n) })
+ }
+ }
+
+ it "should sort values by sort key" do
+ values = model_class.cache.values
+ values.map(&:name).should == %w(foo bar)
+ end
  end
  end

  describe "#[]" do
- context "with single key access" do
- it "should load cache when it is accessed" do
- model_class.should_receive(:all).once.and_return(%w(foo bar).map { |n| model_class.new(n) })
+ it "should load cache when it is accessed" do
+ model_class.should_receive(:all).once.and_return(%w(foo bar).map { |n| model_class.new(n) })
+
+ model_class.cache[:foo]
+ end
+
+ context 'when key is symbol' do
+ it "should raise an error for key that was not found" do
+ expect { model_class.cache[:nondef] }.to raise_error ProstoCache::BadCacheKeyError
+ end

- model_class.cache[:foo]
+ it "should return proper object for key that was found" do
+ model_class.cache[:foo].should_not be_nil
+ model_class.cache[:foo].name.should == 'foo'
  end
+ end

- context 'when key is symbol' do
- it "should raise an error for key that was not found" do
- expect { model_class.cache[:nondef] }.to raise_error ProstoCache::BadCacheKeyError
- end
+ context 'when key is string' do
+ it "should return nil for key that was not found" do
+ model_class.cache['nondef'].should be_nil
+ end

- it "should return proper object for key that was found" do
- model_class.cache[:foo].should_not be_nil
- model_class.cache[:foo].name.should == 'foo'
- end
+ it "should return proper object for key that was found" do
+ model_class.cache['foo'].should_not be_nil
+ model_class.cache['foo'].name.should == 'foo'
  end
+ end
+ end
+ end
+
+ context "with composite key access" do
+ let(:model_class) {
+ Class.new {
+ def self.after_save; end

- context 'when key is string' do
- it "should return nil for key that was not found" do
- model_class.cache['nondef'].should be_nil
- end
+ include ProstoCache

- it "should return proper object for key that was found" do
- model_class.cache['foo'].should_not be_nil
- model_class.cache['foo'].name.should == 'foo'
- end
+ cache_accessor_keys %w(key1 key2)
+
+ attr_accessor :name, :key1, :key2
+ def initialize(name, key1, key2)
+ self.name = name
+ self.key1 = key1
+ self.key2 = key2
  end
+ }.tap { |model_class|
+ model_class.cache.stub(:query_cache_signature).once.and_return(:foo)
+ model_class.stub(:all).once.and_return(%w(foo bar).map { |n| model_class.new(n, n + '1', n + '2') })
+ }
+ }
+
+ describe "#keys" do
+ it "should return all keys from the cache as array of arrays" do
+ model_class.cache.keys.should == [[:bar1, :bar2], [:foo1, :foo2]]
  end
+ end

- context "with composite key access" do
+ describe "#values" do
+ it "should return all values from the cache" do
+ values = model_class.cache.values
+ values.should have(2).instances
+ # values.map(&:name).should == %w(foo bar)
+ values.map(&:name).should include('foo')
+ values.map(&:name).should include('bar')
+ end
+
+ it "should sort values by accessor keys" do
+ values = model_class.cache.values
+ values.map(&:name).should == %w(bar foo)
+ end
+
+ context 'when sort_keys are provided' do
  let(:model_class) {
  Class.new {
  def self.after_save; end
@@ -92,6 +171,7 @@ describe "Model with prosto cache mixed in" do
  include ProstoCache

  cache_accessor_keys %w(key1 key2)
+ cache_sort_keys(%w(reversed_name key1 key2))

  attr_accessor :name, :key1, :key2
  def initialize(name, key1, key2)
@@ -99,58 +179,69 @@ describe "Model with prosto cache mixed in" do
  self.key1 = key1
  self.key2 = key2
  end
+
+ def reversed_name
+ name.reverse
+ end
  }.tap { |model_class|
  model_class.cache.stub(:query_cache_signature).once.and_return(:foo)
  model_class.stub(:all).once.and_return(%w(foo bar).map { |n| model_class.new(n, n + '1', n + '2') })
  }
  }

- it "should raise an error when not enough keys provided" do
- expect { model_class.cache[:nondef] }.to raise_error ProstoCache::BadCacheKeyError
- expect { model_class.cache[:foo1] }.to raise_error ProstoCache::BadCacheKeyError
- expect { model_class.cache['nondef'] }.to raise_error ProstoCache::BadCacheKeyError
- expect { model_class.cache['foo1'] }.to raise_error ProstoCache::BadCacheKeyError
+ it "should sort values by sort keys" do
+ values = model_class.cache.values
+ values.map(&:name).should == %w(foo bar)
+ end
+ end
+ end
+
+ describe "#[]" do
+ it "should raise an error when not enough keys provided" do
+ expect { model_class.cache[:nondef] }.to raise_error ProstoCache::BadCacheKeyError
+ expect { model_class.cache[:foo1] }.to raise_error ProstoCache::BadCacheKeyError
+ expect { model_class.cache['nondef'] }.to raise_error ProstoCache::BadCacheKeyError
+ expect { model_class.cache['foo1'] }.to raise_error ProstoCache::BadCacheKeyError
+ end
+
+ it "should raise an error when too many keys provided" do
+ expect { model_class.cache[:nondef1, :nondef2, :nondef3] }.to raise_error ProstoCache::BadCacheKeyError
+ expect { model_class.cache[:foo1, :foo2, :nondef] }.to raise_error ProstoCache::BadCacheKeyError
+ expect { model_class.cache['nondef1', 'nondef2', 'nondef3'] }.to raise_error ProstoCache::BadCacheKeyError
+ expect { model_class.cache['foo1', 'foo2', 'nondef'] }.to raise_error ProstoCache::BadCacheKeyError
+ end
+
+ context 'when last key is symbol' do
+ it "should raise an error for first key that was not found" do
+ expect { model_class.cache[:undef, :foo2] }.to raise_error ProstoCache::BadCacheKeyError
+ expect { model_class.cache['undef', :foo2] }.to raise_error ProstoCache::BadCacheKeyError
  end

- it "should raise an error when too many keys provided" do
- expect { model_class.cache[:nondef1, :nondef2, :nondef3] }.to raise_error ProstoCache::BadCacheKeyError
- expect { model_class.cache[:foo1, :foo2, :nondef] }.to raise_error ProstoCache::BadCacheKeyError
- expect { model_class.cache['nondef1', 'nondef2', 'nondef3'] }.to raise_error ProstoCache::BadCacheKeyError
- expect { model_class.cache['foo1', 'foo2', 'nondef'] }.to raise_error ProstoCache::BadCacheKeyError
+ it "should raise an error for last key that was not found" do
+ expect { model_class.cache[:foo1, :nondef] }.to raise_error ProstoCache::BadCacheKeyError
+ expect { model_class.cache['foo1', :nondef] }.to raise_error ProstoCache::BadCacheKeyError
+ end
+
+ it "should return proper object for key that was found" do
+ model_class.cache[:foo1, :foo2].should_not be_nil
+ model_class.cache[:foo1, :foo2].name.should == 'foo'
+ end
+ end
+
+ context 'when last key is string' do
+ it "should return nil for first level key that was not found" do
+ model_class.cache['nondef', 'foo2'].should be_nil
+ model_class.cache[:nondef, 'foo2'].should be_nil
  end

- context 'when last key is symbol' do
- it "should raise an error for first key that was not found" do
- expect { model_class.cache[:undef, :foo2] }.to raise_error ProstoCache::BadCacheKeyError
- expect { model_class.cache['undef', :foo2] }.to raise_error ProstoCache::BadCacheKeyError
- end
-
- it "should raise an error for last key that was not found" do
- expect { model_class.cache[:foo1, :nondef] }.to raise_error ProstoCache::BadCacheKeyError
- expect { model_class.cache['foo1', :nondef] }.to raise_error ProstoCache::BadCacheKeyError
- end
-
- it "should return proper object for key that was found" do
- model_class.cache[:foo1, :foo2].should_not be_nil
- model_class.cache[:foo1, :foo2].name.should == 'foo'
- end
+ it "should return nil for second level key that was not found" do
+ model_class.cache['foo1', 'nondef'].should be_nil
+ model_class.cache[:foo1, 'nondef'].should be_nil
  end

- context 'when last key is string' do
- it "should return nil for first level key that was not found" do
- model_class.cache['nondef', 'foo2'].should be_nil
- model_class.cache[:nondef, 'foo2'].should be_nil
- end
-
- it "should return nil for second level key that was not found" do
- model_class.cache['foo1', 'nondef'].should be_nil
- model_class.cache[:foo1, 'nondef'].should be_nil
- end
-
- it "should return proper object for key that was found" do
- model_class.cache['foo1', 'foo2'].should_not be_nil
- model_class.cache['foo1', 'foo2'].name.should == 'foo'
- end
+ it "should return proper object for key that was found" do
+ model_class.cache['foo1', 'foo2'].should_not be_nil
+ model_class.cache['foo1', 'foo2'].name.should == 'foo'
  end
  end
  end
metadata CHANGED
@@ -1,7 +1,7 @@
  --- !ruby/object:Gem::Specification
  name: prosto_cache
  version: !ruby/object:Gem::Version
- version: 0.2.1
+ version: 0.2.2
  prerelease:
  platform: ruby
  authors:
@@ -9,7 +9,7 @@ authors:
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2011-05-23 00:00:00.000000000 Z
+ date: 2013-10-31 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: rspec
@@ -18,7 +18,7 @@ dependencies:
  requirements:
  - - ~>
  - !ruby/object:Gem::Version
- version: '2.3'
+ version: '2.13'
  type: :development
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
@@ -26,7 +26,7 @@ dependencies:
  requirements:
  - - ~>
  - !ruby/object:Gem::Version
- version: '2.3'
+ version: '2.13'
  - !ruby/object:Gem::Dependency
  name: bundler
  requirement: !ruby/object:Gem::Requirement
@@ -43,41 +43,25 @@ dependencies:
  - - ~>
  - !ruby/object:Gem::Version
  version: '1.1'
- - !ruby/object:Gem::Dependency
- name: rake
- requirement: !ruby/object:Gem::Requirement
- none: false
- requirements:
- - - ! '>='
- - !ruby/object:Gem::Version
- version: '0'
- type: :development
- prerelease: false
- version_requirements: !ruby/object:Gem::Requirement
- none: false
- requirements:
- - - ! '>='
- - !ruby/object:Gem::Version
- version: '0'
  description: Use this gem if you want a simple 'enum-like' cache for your models that
  does not restrict updates, but will stay current with them.
  email: olek@woodenbits.com
  executables: []
  extensions: []
- extra_rdoc_files:
- - LICENSE.txt
- - README.rdoc
+ extra_rdoc_files: []
  files:
- - .rspec
+ - lib/prosto_cache/extensions.rb
+ - lib/prosto_cache/prosto_hash.rb
+ - lib/prosto_cache/prosto_model_cache.rb
+ - lib/prosto_cache.rb
  - Gemfile
  - Gemfile.lock
  - LICENSE.txt
- - README.rdoc
  - Rakefile
- - lib/prosto_cache.rb
- - prosto_cache.gemspec
+ - README.rdoc
  - spec/prosto_cache_spec.rb
  - spec/spec_helper.rb
+ - .rspec
  homepage: http://github.com/olek/prosto_cache
  licenses:
  - MIT
@@ -90,17 +74,19 @@ required_ruby_version: !ruby/object:Gem::Requirement
  requirements:
  - - ! '>='
  - !ruby/object:Gem::Version
- version: '0'
+ version: 1.9.3
  required_rubygems_version: !ruby/object:Gem::Requirement
  none: false
  requirements:
  - - ! '>='
  - !ruby/object:Gem::Version
- version: '0'
+ version: 1.3.6
  requirements: []
  rubyforge_project:
  rubygems_version: 1.8.23
  signing_key:
  specification_version: 3
  summary: Very simple caching for your ActiveRecord models.
- test_files: []
+ test_files:
+ - spec/prosto_cache_spec.rb
+ - spec/spec_helper.rb
data/prosto_cache.gemspec DELETED
@@ -1,51 +0,0 @@
- # encoding: utf-8
-
- Gem::Specification.new do |s|
-   s.name = "prosto_cache"
-   s.version = "0.2.1"
-
-   s.required_rubygems_version = Gem::Requirement.new(">= 0") if s.respond_to? :required_rubygems_version=
-   s.authors = ["Olek Poplavsky"]
-   s.date = "2011-05-23"
-   s.description = "Use this gem if you want a simple 'enum-like' cache for your models that does not restrict updates, but will stay current with them."
-   s.email = "olek@woodenbits.com"
-   s.extra_rdoc_files = [
-     "LICENSE.txt",
-     "README.rdoc"
-   ]
-   s.files = [
-     ".rspec",
-     "Gemfile",
-     "Gemfile.lock",
-     "LICENSE.txt",
-     "README.rdoc",
-     "Rakefile",
-     "lib/prosto_cache.rb",
-     "prosto_cache.gemspec",
-     "spec/prosto_cache_spec.rb",
-     "spec/spec_helper.rb",
-   ]
-   s.homepage = "http://github.com/olek/prosto_cache"
-   s.licenses = ["MIT"]
-   s.require_paths = ["lib"]
-   s.rubygems_version = "1.8.11"
-   s.summary = "Very simple caching for your ActiveRecord models."
-
-   if s.respond_to? :specification_version then
-     s.specification_version = 3
-
-     if Gem::Version.new(Gem::VERSION) >= Gem::Version.new('1.2.0') then
-       s.add_development_dependency(%q<rspec>, ["~> 2.3"])
-       s.add_development_dependency(%q<bundler>, ["~> 1.1"])
-       s.add_development_dependency(%q<rake>)
-     else
-       s.add_dependency(%q<rspec>, ["~> 2.3"])
-       s.add_dependency(%q<bundler>, ["~> 1.1"])
-       s.add_dependency(%q<rake>)
-     end
-   else
-     s.add_dependency(%q<rspec>, ["~> 2.3"])
-     s.add_dependency(%q<bundler>, ["~> 1.1"])
-     s.add_dependency(%q<rake>)
-   end
- end