taskiq-redis 1.0.1__tar.gz → 1.0.3__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -1,22 +1,23 @@
- Metadata-Version: 2.1
+ Metadata-Version: 2.3
  Name: taskiq-redis
- Version: 1.0.1
+ Version: 1.0.3
  Summary: Redis integration for taskiq
- Home-page: https://github.com/taskiq-python/taskiq-redis
  Keywords: taskiq,tasks,distributed,async,redis,result_backend
  Author: taskiq-team
  Author-email: taskiq@norely.com
- Requires-Python: >=3.8.1,<4.0.0
+ Requires-Python: >=3.9,<4.0
  Classifier: Programming Language :: Python
  Classifier: Programming Language :: Python :: 3
  Classifier: Programming Language :: Python :: 3.9
  Classifier: Programming Language :: Python :: 3.10
  Classifier: Programming Language :: Python :: 3.11
  Classifier: Programming Language :: Python :: 3.12
+ Classifier: Programming Language :: Python :: 3.13
  Classifier: Programming Language :: Python :: 3 :: Only
  Classifier: Programming Language :: Python :: 3.8
  Requires-Dist: redis (>=5,<6)
- Requires-Dist: taskiq (>=0.11.1,<1)
+ Requires-Dist: taskiq (>=0.11.12,<1)
+ Project-URL: Homepage, https://github.com/taskiq-python/taskiq-redis
  Project-URL: Repository, https://github.com/taskiq-python/taskiq-redis
  Description-Content-Type: text/markdown

@@ -43,17 +44,17 @@ Let's see the example with the redis broker and redis async result:
  # broker.py
  import asyncio

- from taskiq_redis import ListQueueBroker, RedisAsyncResultBackend
+ from taskiq_redis import RedisAsyncResultBackend, RedisStreamBroker

- redis_async_result = RedisAsyncResultBackend(
+ result_backend = RedisAsyncResultBackend(
      redis_url="redis://localhost:6379",
  )

  # Or you can use PubSubBroker if you need broadcasting
- broker = ListQueueBroker(
+ # Or ListQueueBroker if you don't need acknowledgements
+ broker = RedisStreamBroker(
      url="redis://localhost:6379",
-     result_backend=redis_async_result,
- )
+ ).with_result_backend(result_backend)


  @broker.task
@@ -77,25 +78,48 @@ Launch the workers:
  Then run the main code:
  `python3 broker.py`

- ## PubSubBroker and ListQueueBroker configuration

- We have two brokers with similar interfaces, but with different logic.
- The PubSubBroker uses redis' pubsub mechanism and is very powerful,
- but it executes every task on all workers, because PUBSUB broadcasts message
- to all subscribers.
+ ## Brokers

- If you want your messages to be processed only once, please use ListQueueBroker.
- It uses redis' [LPUSH](https://redis.io/commands/lpush/) and [BRPOP](https://redis.io/commands/brpop/) commands to deal with messages.
+ This package contains 6 broker implementations.
+ 3 broker types:
+ * PubSub broker
+ * ListQueue broker
+ * Stream broker

- Brokers parameters:
- * `url` - url to redis.
- * `task_id_generator` - custom task_id genertaor.
- * `result_backend` - custom result backend.
- * `queue_name` - name of the pub/sub channel in redis.
- * `max_connection_pool_size` - maximum number of connections in pool.
- * Any other keyword arguments are passed to `redis.asyncio.BlockingConnectionPool`.
- Notably, you can use `timeout` to set custom timeout in seconds for reconnects
- (or set it to `None` to try reconnects indefinitely).
+ Each broker type is implemented for each Redis architecture:
+ * Single node
+ * Cluster
+ * Sentinel
+
+ Here's a small breakdown of how they differ from each other.
+
+
+ ### PubSub
+
+ On older Redis versions, PUBSUB was the usual way of turning Redis into a queue.
+ But using PUBSUB means that every message is delivered to all subscribed consumers.
+
+ > [!WARNING]
+ > This broker doesn't support acknowledgements. If a worker is killed
+ > while processing a message, the message is lost.
+
+ ### ListQueue
+
+ This broker stores messages in a list at a given key. New tasks are appended
+ on the left side using `lpush` and consumed from the right side using `brpop`.
+
+ > [!WARNING]
+ > This broker doesn't support acknowledgements. If a worker is killed
+ > while processing a message, the message is lost.
+
+ ### Stream
+
+ Stream brokers use the Redis [stream type](https://redis.io/docs/latest/develop/data-types/streams/) to store and fetch messages.
+
+ > [!TIP]
+ > This broker **supports** acknowledgements and is therefore a good fit when data durability is
+ > required.

  ## RedisAsyncResultBackend configuration

@@ -107,19 +131,21 @@ RedisAsyncResultBackend parameters:
  * Any other keyword arguments are passed to `redis.asyncio.BlockingConnectionPool`.
  Notably, you can use `timeout` to set custom timeout in seconds for reconnects
  (or set it to `None` to try reconnects indefinitely).
- > IMPORTANT: **It is highly recommended to use expire time in RedisAsyncResultBackend**
+
+ > [!WARNING]
+ > **It is highly recommended to use expire time in RedisAsyncResultBackend**
  > If you want to add expiration, either `result_ex_time` or `result_px_time` must be set.
- >```python
- ># First variant
- >redis_async_result = RedisAsyncResultBackend(
- >    redis_url="redis://localhost:6379",
- >    result_ex_time=1000,
- >)
+ > ```python
+ > # First variant
+ > redis_async_result = RedisAsyncResultBackend(
+ >     redis_url="redis://localhost:6379",
+ >     result_ex_time=1000,
+ > )
  >
- ># Second variant
- >redis_async_result = RedisAsyncResultBackend(
- >    redis_url="redis://localhost:6379",
- >    result_px_time=1000000,
- >)
- >```
+ > # Second variant
+ > redis_async_result = RedisAsyncResultBackend(
+ >     redis_url="redis://localhost:6379",
+ >     result_px_time=1000000,
+ > )
+ > ```

@@ -0,0 +1,127 @@
+ # TaskIQ-Redis
+
+ Taskiq-redis is a plugin for taskiq that adds a new broker and result backend based on redis.
+
+ # Installation
+
+ To use this project you must have the core taskiq library installed:
+ ```bash
+ pip install taskiq
+ ```
+ This project can be installed using pip:
+ ```bash
+ pip install taskiq-redis
+ ```
+
+ # Usage
+
+ Let's see the example with the redis broker and redis async result:
+
+ ```python
+ # broker.py
+ import asyncio
+
+ from taskiq_redis import RedisAsyncResultBackend, RedisStreamBroker
+
+ result_backend = RedisAsyncResultBackend(
+     redis_url="redis://localhost:6379",
+ )
+
+ # Or you can use PubSubBroker if you need broadcasting
+ # Or ListQueueBroker if you don't need acknowledgements
+ broker = RedisStreamBroker(
+     url="redis://localhost:6379",
+ ).with_result_backend(result_backend)
+
+
+ @broker.task
+ async def best_task_ever() -> None:
+     """Solve all problems in the world."""
+     await asyncio.sleep(5.5)
+     print("All problems are solved!")
+
+
+ async def main():
+     task = await best_task_ever.kiq()
+     print(await task.wait_result())
+
+
+ if __name__ == "__main__":
+     asyncio.run(main())
+ ```
+
+ Launch the workers:
+ `taskiq worker broker:broker`
+ Then run the main code:
+ `python3 broker.py`
+
+
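For context, the task in the example above takes no arguments. The sketch below is a variant that passes arguments through `.kiq()` and waits with a bounded timeout; the module name is made up, and the `timeout=`, `is_err`, and `return_value` names refer to taskiq's result API as I understand it, so double-check them against your taskiq version.

```python
# broker_with_args.py -- a hypothetical variant of broker.py above
import asyncio

from taskiq_redis import RedisAsyncResultBackend, RedisStreamBroker

result_backend = RedisAsyncResultBackend(redis_url="redis://localhost:6379")
broker = RedisStreamBroker(url="redis://localhost:6379").with_result_backend(result_backend)


@broker.task
async def add(a: int, b: int) -> int:
    """Add two numbers on a worker."""
    return a + b


async def main() -> None:
    # Arguments given to .kiq() mirror the task signature.
    task = await add.kiq(1, 2)
    # wait_result polls the result backend; timeout is in seconds.
    result = await task.wait_result(timeout=10)
    print("is_err:", result.is_err, "return_value:", result.return_value)


if __name__ == "__main__":
    # Start `taskiq worker broker_with_args:broker` first, then run this script.
    asyncio.run(main())
```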
+ ## Brokers
+
+ This package contains 6 broker implementations.
+ 3 broker types:
+ * PubSub broker
+ * ListQueue broker
+ * Stream broker
+
+ Each broker type is implemented for each Redis architecture:
+ * Single node
+ * Cluster
+ * Sentinel
+
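To make the type-by-architecture matrix concrete, the sketch below lists where each broker class can be imported from at the package top level. The class names come from the `__all__` exports changed further down in this diff; constructor arguments (URLs, sentinel addresses, and so on) differ per architecture and are intentionally omitted here.

```python
# A sketch of the available broker classes, grouped by Redis architecture.
# Names mirror the package's top-level exports; construction arguments are omitted.

# Single node
from taskiq_redis import ListQueueBroker, PubSubBroker, RedisStreamBroker

# Cluster (note: no PubSub variant is exported for cluster)
from taskiq_redis import ListQueueClusterBroker, RedisStreamClusterBroker

# Sentinel
from taskiq_redis import (
    ListQueueSentinelBroker,
    PubSubSentinelBroker,
    RedisStreamSentinelBroker,
)
```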
+ Here's a small breakdown of how they differ from each other.
+
+
+ ### PubSub
+
+ On older Redis versions, PUBSUB was the usual way of turning Redis into a queue.
+ But using PUBSUB means that every message is delivered to all subscribed consumers.
+
+ > [!WARNING]
+ > This broker doesn't support acknowledgements. If a worker is killed
+ > while processing a message, the message is lost.
+
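As a sketch of the broadcast behaviour (not taken from the package docs), the snippet below wires a task to `PubSubBroker` so that every connected worker runs it, which suits jobs like cache invalidation. It assumes `PubSubBroker` accepts `url=` like the other single-node brokers; treat that parameter name as an assumption.

```python
# pubsub_example.py -- a sketch of broadcast-style tasks (not durable!)
from taskiq_redis import PubSubBroker

# Assumption: PubSubBroker takes a Redis URL via `url=`,
# mirroring the other single-node brokers in this package.
broker = PubSubBroker(url="redis://localhost:6379")


@broker.task
async def drop_local_cache() -> None:
    """Runs on EVERY worker subscribed to the channel."""
    print("Local cache cleared on this worker")


# Kicking the task once makes every connected worker execute it;
# if no worker is listening at publish time, the message is simply lost.
# task = await drop_local_cache.kiq()
```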
+ ### ListQueue
+
+ This broker stores messages in a list at a given key. New tasks are appended
+ on the left side using `lpush` and consumed from the right side using `brpop`.
+
+ > [!WARNING]
+ > This broker doesn't support acknowledgements. If a worker is killed
+ > while processing a message, the message is lost.
+
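The list semantics can be reproduced with the plain redis client, which is a useful mental model for what this broker does. A minimal sketch, assuming a local Redis and an illustrative key name (the actual queue key the broker uses is its own, configurable detail):

```python
# listqueue_semantics.py -- illustrating LPUSH/BRPOP queue behaviour
import asyncio

from redis.asyncio import Redis


async def main() -> None:
    redis = Redis.from_url("redis://localhost:6379")

    # Producer side: push a message onto the left of the list.
    await redis.lpush("my_queue", b"task-payload")

    # Consumer side: block until a message is available on the right.
    # Returns a (key, value) pair; only one consumer receives each message.
    key, payload = await redis.brpop("my_queue", timeout=5)
    print(key, payload)

    # There is no acknowledgement step: once BRPOP returns, the message
    # is gone from Redis, so a crash at this point loses it.
    await redis.aclose()


asyncio.run(main())
```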
+ ### Stream
+
+ Stream brokers use the Redis [stream type](https://redis.io/docs/latest/develop/data-types/streams/) to store and fetch messages.
+
+ > [!TIP]
+ > This broker **supports** acknowledgements and is therefore a good fit when data durability is
+ > required.
+
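The durability comes from Redis consumer groups: an entry read from a stream stays pending until it is explicitly acknowledged with `XACK`, so a crashed consumer's messages can be re-delivered. A minimal sketch of that mechanism with the plain redis client (stream, group, and consumer names are made up):

```python
# stream_semantics.py -- why stream brokers can acknowledge messages
import asyncio

from redis.asyncio import Redis


async def main() -> None:
    redis = Redis.from_url("redis://localhost:6379")

    # Create the stream and a consumer group (mkstream creates the key;
    # this raises an error if the group already exists).
    await redis.xgroup_create("my_stream", "my_group", id="$", mkstream=True)

    # Producer: append a message to the stream.
    await redis.xadd("my_stream", {"payload": "task-data"})

    # Consumer: read as a member of the group; the entry becomes "pending".
    entries = await redis.xreadgroup(
        "my_group", "worker-1", {"my_stream": ">"}, count=1, block=5000
    )
    stream_name, messages = entries[0]
    message_id, fields = messages[0]
    print(message_id, fields)

    # Acknowledge only after the work is done; until then a crashed
    # consumer's pending entries can be claimed and re-processed.
    await redis.xack("my_stream", "my_group", message_id)

    await redis.aclose()


asyncio.run(main())
```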
+ ## RedisAsyncResultBackend configuration
+
+ RedisAsyncResultBackend parameters:
+ * `redis_url` - url to redis.
+ * `keep_results` - flag to not remove results from Redis after reading.
+ * `result_ex_time` - expire time in seconds (by default - not specified)
+ * `result_px_time` - expire time in milliseconds (by default - not specified)
+ * Any other keyword arguments are passed to `redis.asyncio.BlockingConnectionPool`.
+ Notably, you can use `timeout` to set custom timeout in seconds for reconnects
+ (or set it to `None` to try reconnects indefinitely).
+
+ > [!WARNING]
+ > **It is highly recommended to use expire time in RedisAsyncResultBackend**
+ > If you want to add expiration, either `result_ex_time` or `result_px_time` must be set.
+ > ```python
+ > # First variant
+ > redis_async_result = RedisAsyncResultBackend(
+ >     redis_url="redis://localhost:6379",
+ >     result_ex_time=1000,
+ > )
+ >
+ > # Second variant
+ > redis_async_result = RedisAsyncResultBackend(
+ >     redis_url="redis://localhost:6379",
+ >     result_px_time=1000000,
+ > )
+ > ```
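Putting those parameters together, here is a sketch of a backend configured with an expire time, results kept after reading, and a custom reconnect timeout forwarded to the connection pool (the values are arbitrary examples):

```python
from taskiq_redis import RedisAsyncResultBackend, RedisStreamBroker

result_backend = RedisAsyncResultBackend(
    redis_url="redis://localhost:6379",
    keep_results=True,   # don't delete results after the first read
    result_ex_time=600,  # expire results after 10 minutes
    timeout=10,          # forwarded to redis.asyncio.BlockingConnectionPool
)

broker = RedisStreamBroker(
    url="redis://localhost:6379",
).with_result_backend(result_backend)
```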
@@ -1,6 +1,6 @@
  [tool.poetry]
  name = "taskiq-redis"
- version = "1.0.1"
+ version = "1.0.3"
  description = "Redis integration for taskiq"
  authors = ["taskiq-team <taskiq@norely.com>"]
  readme = "README.md"
@@ -25,22 +25,21 @@ keywords = [
  ]

  [tool.poetry.dependencies]
- python = "^3.8.1"
- taskiq = ">=0.11.1,<1"
+ python = "^3.9"
+ taskiq = ">=0.11.12,<1"
  redis = "^5"

  [tool.poetry.group.dev.dependencies]
- pytest = "^7.0"
+ pytest = "^8"
  mypy = "^1"
- black = "^22.3.0"
- pytest-cov = "^3.0.0"
- anyio = "^3.6.1"
- pytest-env = "^0.6.2"
+ black = "^25"
+ pytest-cov = "^6"
+ anyio = "^4"
+ pytest-env = "^1"
  fakeredis = "^2"
- pre-commit = "^2.20.0"
- pytest-xdist = { version = "^2.5.0", extras = ["psutil"] }
+ pre-commit = "^4"
+ pytest-xdist = { version = "^3", extras = ["psutil"] }
  ruff = "^0"
- types-redis = "^4.6.0.20240425"

  [tool.mypy]
  strict = true
@@ -56,6 +55,7 @@ warn_return_any = false
  [[tool.mypy.overrides]]
  module = ['redis']
  ignore_missing_imports = true
+ ignore_errors = true
  strict = false

  [build-system]
@@ -65,7 +65,7 @@ build-backend = "poetry.core.masonry.api"
  [tool.ruff]
  # List of enabled rulsets.
  # See https://docs.astral.sh/ruff/rules/ for more information.
- select = [
+ lint.select = [
      "E", # Error
      "F", # Pyflakes
      "W", # Pycodestyle
@@ -92,24 +92,22 @@ select = [
      "PL", # PyLint checks
      "RUF", # Specific to Ruff checks
  ]
- ignore = [
+ lint.ignore = [
      "D105", # Missing docstring in magic method
      "D107", # Missing docstring in __init__
      "D212", # Multi-line docstring summary should start at the first line
      "D401", # First line should be in imperative mood
      "D104", # Missing docstring in public package
      "D100", # Missing docstring in public module
-     "ANN102", # Missing type annotation for self in method
-     "ANN101", # Missing type annotation for argument
      "ANN401", # typing.Any are disallowed in `**kwargs
      "PLR0913", # Too many arguments for function call
      "D106", # Missing docstring in public nested class
  ]
  exclude = [".venv/"]
- mccabe = { max-complexity = 10 }
+ lint.mccabe = { max-complexity = 10 }
  line-length = 88

- [tool.ruff.per-file-ignores]
+ [tool.ruff.lint.per-file-ignores]
  "tests/*" = [
      "S101", # Use of assert detected
      "S301", # Use of pickle detected
@@ -119,12 +117,12 @@ line-length = 88
      "D101", # Missing docstring in public class
  ]

- [tool.ruff.pydocstyle]
+ [tool.ruff.lint.pydocstyle]
  convention = "pep257"
  ignore-decorators = ["typing.overload"]

- [tool.ruff.pylint]
+ [tool.ruff.lint.pylint]
  allow-magic-value-types = ["int", "str", "float"]

- [tool.ruff.flake8-bugbear]
+ [tool.ruff.lint.flake8-bugbear]
  extend-immutable-calls = ["taskiq_dependencies.Depends", "taskiq.TaskiqDepends"]
@@ -1,14 +1,19 @@
  """Package for redis integration."""
+
  from taskiq_redis.redis_backend import (
      RedisAsyncClusterResultBackend,
      RedisAsyncResultBackend,
      RedisAsyncSentinelResultBackend,
  )
- from taskiq_redis.redis_broker import ListQueueBroker, PubSubBroker
- from taskiq_redis.redis_cluster_broker import ListQueueClusterBroker
+ from taskiq_redis.redis_broker import ListQueueBroker, PubSubBroker, RedisStreamBroker
+ from taskiq_redis.redis_cluster_broker import (
+     ListQueueClusterBroker,
+     RedisStreamClusterBroker,
+ )
  from taskiq_redis.redis_sentinel_broker import (
      ListQueueSentinelBroker,
      PubSubSentinelBroker,
+     RedisStreamSentinelBroker,
  )
  from taskiq_redis.schedule_source import (
      RedisClusterScheduleSource,
@@ -17,15 +22,18 @@ from taskiq_redis.schedule_source import (
  )

  __all__ = [
-     "RedisAsyncClusterResultBackend",
-     "RedisAsyncResultBackend",
-     "RedisAsyncSentinelResultBackend",
      "ListQueueBroker",
-     "PubSubBroker",
      "ListQueueClusterBroker",
      "ListQueueSentinelBroker",
+     "PubSubBroker",
      "PubSubSentinelBroker",
-     "RedisScheduleSource",
+     "RedisAsyncClusterResultBackend",
+     "RedisAsyncResultBackend",
+     "RedisAsyncSentinelResultBackend",
      "RedisClusterScheduleSource",
+     "RedisScheduleSource",
      "RedisSentinelScheduleSource",
+     "RedisStreamBroker",
+     "RedisStreamClusterBroker",
+     "RedisStreamSentinelBroker",
  ]
@@ -8,10 +8,16 @@ class TaskIQRedisError(TaskiqError):
  class DuplicateExpireTimeSelectedError(ResultBackendError, TaskIQRedisError):
      """Error if two lifetimes are selected."""

+     __template__ = "Choose either result_ex_time or result_px_time."
+

  class ExpireTimeMustBeMoreThanZeroError(ResultBackendError, TaskIQRedisError):
      """Error if two lifetimes are less or equal zero."""

+     __template__ = (
+         "You must select one expire time param and it must be more than zero."
+     )
+

  class ResultIsMissingError(TaskIQRedisError, ResultGetError):
      """Error if there is no result when trying to get it."""