rediscache 0.3.3__tar.gz → 1.0.1__tar.gz
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- {rediscache-0.3.3 → rediscache-1.0.1}/PKG-INFO +27 -24
- {rediscache-0.3.3 → rediscache-1.0.1}/README.md +22 -19
- {rediscache-0.3.3 → rediscache-1.0.1}/pyproject.toml +17 -9
- {rediscache-0.3.3 → rediscache-1.0.1}/rediscache/__init__.py +11 -69
- {rediscache-0.3.3 → rediscache-1.0.1}/LICENSE +0 -0
- {rediscache-0.3.3 → rediscache-1.0.1}/rediscache/tools.py +0 -0
{rediscache-0.3.3 → rediscache-1.0.1}/PKG-INFO

````diff
@@ -1,8 +1,7 @@
-Metadata-Version: 2.
+Metadata-Version: 2.3
 Name: rediscache
-Version: 0.3.3
+Version: 1.0.1
 Summary: Redis caching of functions evolving over time
-Home-page: https://github.com/AmadeusITGroup/RedisCache
 License: MIT
 Keywords: redis,performance,cache
 Author: Pierre Cart-Grandjean
@@ -20,9 +19,10 @@ Classifier: Programming Language :: Python :: 3.9
 Classifier: Programming Language :: Python :: 3.10
 Classifier: Programming Language :: Python :: 3.11
 Classifier: Programming Language :: Python :: 3.12
+Classifier: Programming Language :: Python :: 3.13
 Classifier: Topic :: Software Development :: Libraries :: Python Modules
-Requires-Dist: executiontime (>=0.3
-Requires-Dist: redis (>=5.
+Requires-Dist: executiontime (>=0.4.3,<0.5.0)
+Requires-Dist: redis (>=5.2.1,<6.0.0)
 Project-URL: Repository, https://github.com/AmadeusITGroup/RedisCache
 Description-Content-Type: text/markdown
 
@@ -46,7 +46,7 @@ This is a great caching mechanism for functions that will give a consistent outp
 
 ## Installation
 
-Simply install the PyPi package:
+Simply install the PyPi package [rediscache](https://pypi.org/project/rediscache/):
 
 ```bash
 pip install rediscache
@@ -115,22 +115,6 @@ See `test_rediscache.py` for more examples.
 
 Note: when you choose to wait for the value, you do not have an absolute guarantee that you will not get the default value. For example if it takes more than the retry time to get an answer from the function, the decorator will give up.
 
-### `cache_raw` decorator helper
-
-No serializer or deserializer. This will only work if the cached function only returns `byte`, `str`, `int` or `float` types. Even `None` will fail.
-
-### `cache_raw_wait` decorator helper
-
-Same as above but waits for the value if not in the cache.
-
-### `cache_json` decorator helper
-
-Serialize the value with `json.dumps()` and deserialize the value with `json.loads()`.
-
-### `cache_json_wait` decorator helper
-
-Same as above but waits for the value if not in the cache.
-
 ### `get_stats(delete=False)`
 
 This will get the stats stored when using the cache. The `delete` option is to reset the counters after read.
@@ -138,6 +122,7 @@ The output is a dictionary with the following keys and values:
 
 - **Refresh**: Number of times the cached function was actually called.
 - **Wait**: Number of times we waited for the result when executing the function.
+- **Sleep**: Number of 1 seconds we waited for the results to be found in the cache.
 - **Failed**: Number of times the cached function raised an exception when called.
 - **Missed**: Number of times the functions result was not found in the cache.
 - **Success**: Number of times the function's result was found in the cache.
@@ -203,10 +188,10 @@ My development environment is handled by Poetry. I use `Python 3.11.7`.
 
 To make sure we use Redis properly, we do not mock it in the unit tess. So you will need a localhost default instance of Redis server without a password. This means that the unit tests are more like integration tests.
 
-The execution of the tests including coverage result
+The execution of the tests including coverage result is done with `pytest`:
 
 ```bash
-
+poetry run pytest --cov=rediscache
 ```
 
 ## CI/CD
@@ -225,3 +210,21 @@ We get help from re-usable actions. Here is the [Marketplace](https://github.com
 
 For the moment the publish to PyPI is done manually with the `publish.sh` script. You will need a PyPI API token in `PYPI_API_TOKEN`, stored in a `secrets.sh`.
 
+## Demo application
+
+In the `demo` directory you will find a web application to test `RedisCache`.
+
+```bash
+poetry run webapp
+```
+
+Entry points:
+
+- Call to long function with parameter value `20` and using the cache but waiting for a result: [link](http://localhost:9090/cached/20)
+- Call to long function with parameter value `20` without using the cache: [link](http://localhost:9090/direct/20)
+- Get the stats stored in Redis database: [link](http://localhost:9090/stats)
+
+There is also a `Nginx` configuration file to further test if with a load balancing of workers. It is useful to demonstrate that many workers can share efficiently the same instance of `Redis`.
+
+Finally a `Gatling` configuration file can be used to test the performance.
+
````
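The README changes above drop the `cache_raw*`/`cache_json*` helper sections and document the new **Sleep** counter. For orientation while reading this diff, here is a minimal sketch of how the documented `cache()` decorator and `get_stats(delete=False)` fit together in 1.0.1, assuming a default local Redis instance; the `expensive()` function and its parameter values are illustrative, not part of the package.

```python
from rediscache import RedisCache

# Assumes a localhost Redis on the default port, as the unit tests above require.
rediscache = RedisCache()

@rediscache.cache(refresh=60, expire=120, default="", wait=True)
def expensive(name: str) -> str:
    # Stand-in for a slow but deterministic computation.
    return f"hello {name}"

expensive("world")  # first call: runs the function and fills the cache
expensive("world")  # second call: served from Redis

# Read the counters listed above; delete=True would reset them after reading.
stats = rediscache.get_stats(delete=False)
print(stats.get("Refresh"), stats.get("Missed"), stats.get("Success"))
```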
{rediscache-0.3.3 → rediscache-1.0.1}/README.md

````diff
@@ -18,7 +18,7 @@ This is a great caching mechanism for functions that will give a consistent outp
 
 ## Installation
 
-Simply install the PyPi package:
+Simply install the PyPi package [rediscache](https://pypi.org/project/rediscache/):
 
 ```bash
 pip install rediscache
@@ -87,22 +87,6 @@ See `test_rediscache.py` for more examples.
 
 Note: when you choose to wait for the value, you do not have an absolute guarantee that you will not get the default value. For example if it takes more than the retry time to get an answer from the function, the decorator will give up.
 
-### `cache_raw` decorator helper
-
-No serializer or deserializer. This will only work if the cached function only returns `byte`, `str`, `int` or `float` types. Even `None` will fail.
-
-### `cache_raw_wait` decorator helper
-
-Same as above but waits for the value if not in the cache.
-
-### `cache_json` decorator helper
-
-Serialize the value with `json.dumps()` and deserialize the value with `json.loads()`.
-
-### `cache_json_wait` decorator helper
-
-Same as above but waits for the value if not in the cache.
-
 ### `get_stats(delete=False)`
 
 This will get the stats stored when using the cache. The `delete` option is to reset the counters after read.
@@ -110,6 +94,7 @@ The output is a dictionary with the following keys and values:
 
 - **Refresh**: Number of times the cached function was actually called.
 - **Wait**: Number of times we waited for the result when executing the function.
+- **Sleep**: Number of 1 seconds we waited for the results to be found in the cache.
 - **Failed**: Number of times the cached function raised an exception when called.
 - **Missed**: Number of times the functions result was not found in the cache.
 - **Success**: Number of times the function's result was found in the cache.
@@ -175,10 +160,10 @@ My development environment is handled by Poetry. I use `Python 3.11.7`.
 
 To make sure we use Redis properly, we do not mock it in the unit tess. So you will need a localhost default instance of Redis server without a password. This means that the unit tests are more like integration tests.
 
-The execution of the tests including coverage result
+The execution of the tests including coverage result is done with `pytest`:
 
 ```bash
-
+poetry run pytest --cov=rediscache
 ```
 
 ## CI/CD
@@ -196,3 +181,21 @@ We get help from re-usable actions. Here is the [Marketplace](https://github.com
 ### Publish to PyPI
 
 For the moment the publish to PyPI is done manually with the `publish.sh` script. You will need a PyPI API token in `PYPI_API_TOKEN`, stored in a `secrets.sh`.
+
+## Demo application
+
+In the `demo` directory you will find a web application to test `RedisCache`.
+
+```bash
+poetry run webapp
+```
+
+Entry points:
+
+- Call to long function with parameter value `20` and using the cache but waiting for a result: [link](http://localhost:9090/cached/20)
+- Call to long function with parameter value `20` without using the cache: [link](http://localhost:9090/direct/20)
+- Get the stats stored in Redis database: [link](http://localhost:9090/stats)
+
+There is also a `Nginx` configuration file to further test if with a load balancing of workers. It is useful to demonstrate that many workers can share efficiently the same instance of `Redis`.
+
+Finally a `Gatling` configuration file can be used to test the performance.
````
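With the `cache_json*` helpers removed and the `serializer`/`deserializer` arguments gone from `cache()` (see the `rediscache/__init__.py` changes below), JSON handling moves to the caller. A hedged sketch of one way to recover the old `cache_json_wait` behaviour, assuming the cached function returns the serialized string itself; the function names here are illustrative:

```python
from json import dumps, loads
from typing import Any, Dict

from rediscache import RedisCache

rediscache = RedisCache()

# In 1.0.1 the cached function must return str or bytes, so it serializes
# its own result instead of relying on serializer=dumps.
@rediscache.cache(refresh=30, expire=300, default="{}", wait=True)
def user_profile_json(user_id: int) -> str:
    profile = {"id": user_id, "name": f"user-{user_id}"}  # stand-in for a slow lookup
    return dumps(profile)

def user_profile(user_id: int) -> Dict[str, Any]:
    # Deserialize at the call site, which is what cache_json_wait used to do for you.
    return loads(user_profile_json(user_id))
```

The cached function now satisfies the `str`/`bytes` constraint on `T`, and deserialization happens entirely outside the cache layer.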
{rediscache-0.3.3 → rediscache-1.0.1}/pyproject.toml

````diff
@@ -1,6 +1,7 @@
+# For more details on this file see: https://python-poetry.org/docs/pyproject/
 [tool.poetry]
 name = "rediscache"
-version = "0.3.3"
+version = "1.0.1"
 description = "Redis caching of functions evolving over time"
 authors = ["Pierre Cart-Grandjean <pcart-grandjean@amadeus.com>"]
 license = "MIT"
@@ -21,28 +22,35 @@ classifiers = [
 "Programming Language :: Python :: 3.11",
 "Topic :: Software Development :: Libraries :: Python Modules",
 ]
+packages = [{ include = "rediscache" }]
 
 [tool.poetry.dependencies]
 python = "^3.9"
-redis = "^5.
-executiontime = "^0.3
+redis = "^5.2.1"
+executiontime = "^0.4.3"
 
 [tool.poetry.group.dev.dependencies]
-pylint = "^3.
-pytest = "^8.
+pylint = "^3.3.5"
+pytest = "^8.3.5"
 pdbpp = "^0.10.3"
-
-
-
-
+mypy = "^1.15.0"
+safety = "^3.3.1"
+black = "^25.1.0"
+pytest-cov = "^6.0.0"
+tornado = "^6.4.2"
+pip-audit = "^2.8.0"
 
 [build-system]
 requires = ["poetry-core"]
 build-backend = "poetry.core.masonry.api"
 
+[tool.poetry.scripts]
+webapp = "demo.webapp:main"
+
 [tool.mypy]
 strict = true
 
 [tool.pylint.format]
 # Maximum number of characters on a single line.
 max-line-length = 160
+max-args = 10
````
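The new `[tool.poetry.scripts]` table is what makes `poetry run webapp` work: Poetry generates a `webapp` console script that calls `main()` in `demo/webapp.py`. The demo module itself is not part of this diff; purely to illustrate the wiring, here is a hypothetical sketch using the `tornado` dependency added to the dev group and the routes named in the README (`/cached/<n>`, `/direct/<n>`, `/stats`):

```python
# Hypothetical demo/webapp.py, not taken from the package; it only shows how the
# webapp = "demo.webapp:main" script entry could be structured with Tornado.
import json
import time

import tornado.ioloop
import tornado.web

from rediscache import RedisCache

rediscache = RedisCache()

def long_function(seconds: int) -> str:
    time.sleep(seconds)  # simulate a slow computation
    return f"slept {seconds}s"

@rediscache.cache(refresh=10, expire=60, default="", wait=True)
def cached_long_function(seconds: int) -> str:
    return long_function(seconds)

class CachedHandler(tornado.web.RequestHandler):
    def get(self, seconds: str) -> None:
        self.write(cached_long_function(int(seconds)))

class DirectHandler(tornado.web.RequestHandler):
    def get(self, seconds: str) -> None:
        self.write(long_function(int(seconds)))

class StatsHandler(tornado.web.RequestHandler):
    def get(self) -> None:
        self.write(json.dumps(rediscache.get_stats()))

def main() -> None:
    app = tornado.web.Application([
        (r"/cached/([0-9]+)", CachedHandler),
        (r"/direct/([0-9]+)", DirectHandler),
        (r"/stats", StatsHandler),
    ])
    app.listen(9090)
    tornado.ioloop.IOLoop.current().start()

if __name__ == "__main__":
    main()
```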
{rediscache-0.3.3 → rediscache-1.0.1}/rediscache/__init__.py

````diff
@@ -35,15 +35,14 @@ TODO:
 """
 
 from functools import wraps
-from json import dumps, loads
 import logging
 import os
 import threading
 from time import sleep
-from typing import Any, Callable, Dict, List, Optional, ParamSpec, TypeVar
+from typing import Any, Callable, cast, Dict, List, Optional, ParamSpec, TypeVar
 
-import redis
 from executiontime import printexecutiontime, YELLOW, RED
+import redis
 
 PREFIX = "."
 REFRESH = "Refresh" # Number of times the cached function was actually called.
@@ -56,7 +55,7 @@ DEFAULT = "Default" # Number of times the default value was used because nothin
 STATS = [REFRESH, WAIT, SLEEP, FAILED, MISSED, SUCCESS, DEFAULT]
 
 P = ParamSpec("P")
-T = TypeVar("T")
+T = TypeVar("T", str, bytes)
 
 
 class RedisCache:
@@ -64,8 +63,6 @@ class RedisCache:
 Having the decorator provided by a class allows to have some context to improve performances.
 """
 
-# pylint: disable=too-many-arguments
-
 def __init__(
 self,
 host: Optional[str] = None,
@@ -120,25 +117,18 @@
 
 return f"{name}({','.join(values)})"
 
-# pylint: disable=line-too-long
 def cache(
 self,
 refresh: int,
 expire: int,
+default: T,
 retry: Optional[int] = None,
-default: Any = "",
 wait: bool = False,
-serializer: Optional[Callable[..., Any]] = None,
-deserializer: Optional[Callable[..., Any]] = None,
 use_args: Optional[List[int]] = None,
 use_kwargs: Optional[List[str]] = None,
 ) -> Callable[[Callable[P, T]], Callable[P, T]]:
 """
-Full decorator will all possible parameters.
-
-Specific examples when to use this decorator:
-- Raw storage of byte string that you do not want to be decoded: use the decode=False.
-- JSON dumps data that doesn't need to be loaded before it is sent by a REST API: use serializer=dumps but no deserializer.
+Full decorator will all possible parameters.
 """
 
 logger = logging.getLogger(__name__)
@@ -188,11 +178,8 @@
 self.server.incr(DEFAULT)
 new_value = default
 
-# Serialize the value if requested
-if serializer:
-new_value = serializer(new_value)
 # Store value in cache with expiration time
-self.server.set(key, new_value, ex=expire)
+self.server.set(key, new_value, ex=expire)
 # Set refresh key with refresh time
 self.server.set(PREFIX + key, 1, ex=refresh)
 return new_value
@@ -206,13 +193,7 @@
 
 # If the cache is disabled, directly call the function
 if not self.enabled:
-
-# If we have decided to serialize, we always do it to be consistent
-if serializer:
-direct_value = serializer(direct_value)
-if deserializer:
-direct_value = deserializer(direct_value)
-return direct_value
+return function(*args, **kwargs)
 
 # Lets create a key from the function's name and its parameters values
 key = self._create_key(name=function.__name__, args=args, use_args=use_args, kwargs=kwargs, use_kwargs=use_kwargs)
@@ -225,7 +206,7 @@
 
 # Get the value from the cache.
 # If it is not there we will get None.
-cached_value = self.server.get(key)
+cached_value = cast(T, self.server.get(key))
 
 # Time to update stats counters
 if cached_value is None:
@@ -254,16 +235,16 @@
 # Let's count how many times we wait 1s
 self.server.incr(SLEEP)
 sleep(1)
-cached_value = self.server.get(key)
+cached_value = cast(T, self.server.get(key))
 
 # If the cache was empty, we have None in the cached_value.
 if cached_value is None:
 # We are going to return the default value
 self.server.incr(DEFAULT)
-cached_value =
+cached_value = default
 
 # Return whatever value we have at this point.
-return
+return cached_value
 
 # If we want to bypass the cache at runtime, we need a reference to the decorated function
 wrapper.function = function # type: ignore
@@ -272,45 +253,6 @@
 
 return decorator
 
-def cache_raw(self, refresh: int, expire: int, retry: Optional[int] = None, default: Any = "") -> Callable[[Callable[P, T]], Callable[P, T]]:
-"""
-Normal caching of values directly storable in redis: byte string, string, int, float.
-"""
-return self.cache(refresh=refresh, expire=expire, retry=retry, default=default)
-
-def cache_raw_wait(self, refresh: int, expire: int, retry: Optional[int] = None, default: Any = "") -> Callable[[Callable[P, T]], Callable[P, T]]:
-"""
-Same as cache_raw() but will wait for the completion of the cached function if no value is found in redis.
-"""
-return self.cache(refresh=refresh, expire=expire, retry=retry, default=default, wait=True)
-
-def cache_json(self, refresh: int, expire: int, retry: Optional[int] = None, default: Any = "") -> Callable[[Callable[P, T]], Callable[P, T]]:
-"""
-JSON dumps the values to be stored in redis and loads them again when returning them to the caller.
-"""
-return self.cache(
-refresh=refresh,
-expire=expire,
-retry=retry,
-default=default,
-serializer=dumps,
-deserializer=loads,
-)
-
-def cache_json_wait(self, refresh: int, expire: int, retry: Optional[int] = None, default: Any = "") -> Callable[[Callable[P, T]], Callable[P, T]]:
-"""
-Same as cache_json() but will wait for the completion of the cached function if no value is found in redis.
-"""
-return self.cache(
-refresh=refresh,
-expire=expire,
-retry=retry,
-default=default,
-wait=True,
-serializer=dumps,
-deserializer=loads,
-)
-
 def get_stats(self, delete: bool = False) -> Dict[str, Any]:
 """
 Get the stats stored by RedisCache. See the list and definition at the top of this file.
````
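Taken together, the `__init__.py` changes tighten the decorator's contract: `T` is now constrained to `str` and `bytes`, `default` becomes a required parameter of that type, values go into Redis without any serialization hook, and `cast()` keeps the `str`/`bytes` typing intact under `mypy` strict mode. A small sketch of what that means for callers, assuming a local Redis; the function names are illustrative:

```python
from rediscache import RedisCache

rediscache = RedisCache()

# Accepted in 1.0.1: the function returns str and a str default is supplied up front.
@rediscache.cache(refresh=5, expire=60, default="n/a")
def greeting(name: str) -> str:
    return f"hello {name}"

# No longer expressible: T = TypeVar("T", str, bytes) rejects other return types,
# and there is no serializer hook to convert them before they reach Redis.
# @rediscache.cache(refresh=5, expire=60, default={})
# def profile(name: str) -> dict:
#     ...
```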
{rediscache-0.3.3 → rediscache-1.0.1}/LICENSE: file without changes

{rediscache-0.3.3 → rediscache-1.0.1}/rediscache/tools.py: file without changes