taskiq-redis 1.0.2__py3-none-any.whl → 1.0.4__py3-none-any.whl

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -0,0 +1,215 @@
+ Metadata-Version: 2.3
+ Name: taskiq-redis
+ Version: 1.0.4
+ Summary: Redis integration for taskiq
+ Keywords: taskiq,tasks,distributed,async,redis,result_backend
+ Author: taskiq-team
+ Author-email: taskiq@norely.com
+ Requires-Python: >=3.9,<4.0
+ Classifier: Programming Language :: Python
+ Classifier: Programming Language :: Python :: 3
+ Classifier: Programming Language :: Python :: 3.9
+ Classifier: Programming Language :: Python :: 3.10
+ Classifier: Programming Language :: Python :: 3.11
+ Classifier: Programming Language :: Python :: 3.12
+ Classifier: Programming Language :: Python :: 3.13
+ Classifier: Programming Language :: Python :: 3 :: Only
+ Classifier: Programming Language :: Python :: 3.8
+ Requires-Dist: redis (>=5,<6)
+ Requires-Dist: taskiq (>=0.11.12,<1)
+ Project-URL: Homepage, https://github.com/taskiq-python/taskiq-redis
+ Project-URL: Repository, https://github.com/taskiq-python/taskiq-redis
+ Description-Content-Type: text/markdown
+
+ # TaskIQ-Redis
+
+ Taskiq-redis is a plugin for taskiq that adds a new broker and result backend based on redis.
+
+ # Installation
+
+ To use this project you must have the core taskiq library installed:
+ ```bash
+ pip install taskiq
+ ```
+ This project can then be installed using pip:
+ ```bash
+ pip install taskiq-redis
+ ```
+
+ # Usage
+
+ Let's look at an example using the Redis broker and the Redis async result backend:
+
+ ```python
+ # broker.py
+ import asyncio
+
+ from taskiq_redis import RedisAsyncResultBackend, RedisStreamBroker
+
+ result_backend = RedisAsyncResultBackend(
+     redis_url="redis://localhost:6379",
+ )
+
+ # Or you can use PubSubBroker if you need broadcasting
+ # Or ListQueueBroker if you don't need acknowledgements
+ broker = RedisStreamBroker(
+     url="redis://localhost:6379",
+ ).with_result_backend(result_backend)
+
+
+ @broker.task
+ async def best_task_ever() -> None:
+     """Solve all problems in the world."""
+     await asyncio.sleep(5.5)
+     print("All problems are solved!")
+
+
+ async def main():
+     task = await best_task_ever.kiq()
+     print(await task.wait_result())
+
+
+ if __name__ == "__main__":
+     asyncio.run(main())
+ ```
+
+ Launch the workers:
+ `taskiq worker broker:broker`
+ Then run the main code:
+ `python3 broker.py`
+
+
+ ## Brokers
+
+ This package contains 6 broker implementations, covering 3 broker types:
+ * PubSub broker
+ * ListQueue broker
+ * Stream broker
+
+ Each type is implemented for the following Redis architectures:
+ * Single node
+ * Cluster
+ * Sentinel
+
+ Here's a small breakdown of how they differ from each other.
+
+
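+ As a minimal sketch, here's how the three single-node broker types can be constructed (the `PubSubBroker` constructor is assumed to accept the same `url` argument as the other two; the cluster and sentinel counterparts follow the same pattern):
+
+ ```python
+ from taskiq_redis import ListQueueBroker, PubSubBroker, RedisStreamBroker
+
+ # Stream broker: messages are acknowledged, so they survive worker crashes.
+ stream_broker = RedisStreamBroker(url="redis://localhost:6379")
+
+ # List broker: each message is consumed by exactly one worker, no acknowledgements.
+ list_broker = ListQueueBroker(url="redis://localhost:6379")
+
+ # Pub/Sub broker: every subscribed worker receives every message.
+ pubsub_broker = PubSubBroker(url="redis://localhost:6379")
+ ```
+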
+ ### PubSub
+
+ On older Redis versions, PUBSUB was the default way of turning Redis into a queue.
+ But using PUBSUB means that every message is delivered to all subscribed consumers.
+
+ > [!WARNING]
+ > This broker doesn't support acknowledgements. If a worker is suddenly killed
+ > while processing a message, that message is lost.
+
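+ To make the broadcast semantics concrete, here is a small illustration of the underlying Redis PUBSUB primitive using `redis.asyncio` directly (this is not the broker's internal code, just the behaviour it builds on):
+
+ ```python
+ import asyncio
+
+ import redis.asyncio as redis
+
+
+ async def main() -> None:
+     client = redis.Redis.from_url("redis://localhost:6379", decode_responses=True)
+     sub1 = client.pubsub()
+     sub2 = client.pubsub()
+     await sub1.subscribe("tasks")
+     await sub2.subscribe("tasks")
+
+     await client.publish("tasks", "do-something")
+
+     # Both subscribers receive the same message, which is why every
+     # worker would execute the task when PUBSUB is used as a queue.
+     print(await sub1.get_message(ignore_subscribe_messages=True, timeout=1.0))
+     print(await sub2.get_message(ignore_subscribe_messages=True, timeout=1.0))
+     await client.aclose()
+
+
+ asyncio.run(main())
+ ```
+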
+ ### ListQueue
+
+ This broker keeps messages in a list at a given key. New tasks are appended
+ on the left side using `lpush` and consumed from the right side using `brpop`.
+
+ > [!WARNING]
+ > This broker doesn't support acknowledgements. If a worker is suddenly killed
+ > while processing a message, that message is lost.
+
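+ The queue mechanics look roughly like this when expressed with the raw Redis commands via `redis.asyncio` (again an illustration of the primitives, not the broker's actual implementation):
+
+ ```python
+ import asyncio
+
+ import redis.asyncio as redis
+
+
+ async def main() -> None:
+     client = redis.Redis.from_url("redis://localhost:6379", decode_responses=True)
+
+     # Producers push new messages onto the left of the list.
+     await client.lpush("my_queue", "task-1", "task-2")
+
+     # A worker blocks until a message is available and pops it from the right,
+     # so each message is handled by exactly one worker (but never acknowledged).
+     queue, message = await client.brpop("my_queue")
+     print(queue, message)  # my_queue task-1
+     await client.aclose()
+
+
+ asyncio.run(main())
+ ```
+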
+ ### Stream
+
+ Stream brokers use the Redis [stream type](https://redis.io/docs/latest/develop/data-types/streams/) to store and fetch messages.
+
+ > [!TIP]
+ > This broker **supports** acknowledgements and is therefore suitable for cases where
+ > data durability is required.
+
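+ Acknowledgements are possible because Redis streams track pending entries per consumer group. A rough illustration of those primitives with `redis.asyncio` (not the broker's internal code):
+
+ ```python
+ import asyncio
+
+ import redis.asyncio as redis
+ from redis import exceptions
+
+
+ async def main() -> None:
+     client = redis.Redis.from_url("redis://localhost:6379", decode_responses=True)
+
+     # Create a consumer group (mkstream creates the stream if it doesn't exist yet).
+     try:
+         await client.xgroup_create("my_stream", "workers", id="0", mkstream=True)
+     except exceptions.ResponseError:
+         pass  # group already exists
+
+     await client.xadd("my_stream", {"payload": "task-1"})
+
+     # A consumer reads an entry on behalf of the group; the entry stays in the
+     # pending list until it is explicitly acknowledged.
+     entries = await client.xreadgroup("workers", "worker-1", {"my_stream": ">"}, count=1)
+     message_id, fields = entries[0][1][0]
+     print(message_id, fields)
+
+     # Acknowledge only after successful processing; unacknowledged entries
+     # can be re-delivered, which is what makes this broker durable.
+     await client.xack("my_stream", "workers", message_id)
+     await client.aclose()
+
+
+ asyncio.run(main())
+ ```
+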
+ ## RedisAsyncResultBackend configuration
+
+ RedisAsyncResultBackend parameters:
+ * `redis_url` - URL of the Redis server.
+ * `keep_results` - flag to keep results in Redis after they have been read.
+ * `result_ex_time` - expiration time in seconds (not set by default).
+ * `result_px_time` - expiration time in milliseconds (not set by default).
+ * Any other keyword arguments are passed to `redis.asyncio.BlockingConnectionPool`.
+ Notably, you can use `timeout` to set a custom timeout in seconds for reconnects
+ (or set it to `None` to retry reconnects indefinitely).
+
+ > [!WARNING]
+ > **It is highly recommended to set an expiration time in RedisAsyncResultBackend.**
+ > To add expiration, set either `result_ex_time` or `result_px_time`.
+ > ```python
+ > # First variant
+ > redis_async_result = RedisAsyncResultBackend(
+ >     redis_url="redis://localhost:6379",
+ >     result_ex_time=1000,
+ > )
+ >
+ > # Second variant
+ > redis_async_result = RedisAsyncResultBackend(
+ >     redis_url="redis://localhost:6379",
+ >     result_px_time=1000000,
+ > )
+ > ```
+
+
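+ Putting the parameters above together, a backend that keeps results for an hour, leaves them readable more than once, and retries connections indefinitely might look like this (a sketch combining the options listed above):
+
+ ```python
+ from taskiq_redis import RedisAsyncResultBackend
+
+ result_backend = RedisAsyncResultBackend(
+     redis_url="redis://localhost:6379",
+     result_ex_time=3600,  # results expire after one hour
+     keep_results=True,    # don't delete a result when it is read
+     timeout=None,         # forwarded to BlockingConnectionPool: retry reconnects forever
+ )
+ ```
+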
+ ## Schedule sources
+
+
+ You can use this package to add dynamic schedule sources, which store
+ schedules for the taskiq scheduler.
+
+ The advantage of using schedule sources from this package over the default `LabelBased` source is that you can
+ add schedules to them dynamically, as shown in the sketch after this list.
+
+ There are two types of schedule sources:
+
+ * `RedisScheduleSource`
+ * `ListRedisScheduleSource`
+
+
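+ What adding a schedule at runtime could look like, assuming taskiq's `ScheduledTask` model and the source's `add_schedule` coroutine (both are assumptions based on the taskiq scheduler interface rather than details stated here):
+
+ ```python
+ import asyncio
+
+ from taskiq import ScheduledTask
+
+ from taskiq_redis import ListRedisScheduleSource
+
+ source = ListRedisScheduleSource("redis://localhost:6379", prefix="schedules")
+
+
+ async def add_dynamic_schedule() -> None:
+     # Register a cron schedule for an existing task at runtime;
+     # the scheduler picks it up on its next iteration.
+     await source.add_schedule(
+         ScheduledTask(
+             task_name="broker:best_task_ever",
+             labels={},
+             args=[],
+             kwargs={},
+             schedule_id="best-task-every-5-minutes",
+             cron="*/5 * * * *",
+         )
+     )
+
+
+ asyncio.run(add_dynamic_schedule())
+ ```
+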
+ ### RedisScheduleSource
+
+ This source is super simple. It stores every schedule under the key `{prefix}:{schedule_id}`. When the scheduler requests
+ schedules, it retrieves all values from Redis whose keys start with the given `prefix`.
+
+ This is very inefficient and should not be used for high volumes of schedules: if you have `1000` schedules, this source will make at least `20` requests to retrieve them (`scan` and `mget` are used to minimize the number of calls).
+
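+ The access pattern behind that cost estimate looks roughly like this when written against `redis.asyncio` directly (an illustration of the scan-then-mget idea, not the package's actual implementation):
+
+ ```python
+ import asyncio
+
+ import redis.asyncio as redis
+
+
+ async def fetch_all_schedules(prefix: str = "schedule") -> list:
+     client = redis.Redis.from_url("redis://localhost:6379")
+
+     # SCAN walks the keyspace in batches, so many schedule keys means
+     # many round trips before the values can even be fetched.
+     keys = [key async for key in client.scan_iter(match=f"{prefix}:*", count=100)]
+
+     # One more round trip (or several, for very large key sets) to load the values.
+     values = await client.mget(keys) if keys else []
+     await client.aclose()
+     return values
+
+
+ asyncio.run(fetch_all_schedules())
+ ```
+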
+ ### ListRedisScheduleSource
+
+ This source holds values in lists.
+
+ * For cron tasks it uses the key `{prefix}:cron`.
+ * For timed schedules it uses the key `{prefix}:time:{time}`, where `{time}` is the time at which the schedules should run.
+
+ The main advantage of this approach is that we only fetch the tasks that need to run at a given time and do not make any excessive calls to Redis.
+
+
+ ### Migration from one source to another
+
+ To migrate from `RedisScheduleSource` to `ListRedisScheduleSource` you can define the latter like this:
+
+ ```python
+ # broker.py
+ import asyncio
+ import datetime
+
+ from taskiq import TaskiqScheduler
+
+ from taskiq_redis import ListRedisScheduleSource, RedisStreamBroker
+ from taskiq_redis.schedule_source import RedisScheduleSource
+
+ broker = RedisStreamBroker(url="redis://localhost:6379")
+
+ old_source = RedisScheduleSource("redis://localhost/1", prefix="prefix1")
+ array_source = ListRedisScheduleSource(
+     "redis://localhost/1",
+     prefix="prefix2",
+     # To migrate schedules from an old source.
+ ).with_migrate_from(
+     old_source,
+     # To delete schedules from an old source.
+     delete_schedules=True,
+ )
+ scheduler = TaskiqScheduler(broker, [array_source])
+ ```
+
+ During startup the scheduler will try to migrate schedules from the old source to the new one. Be sure to use different prefixes for the two sources to avoid any kind of collision between them.
+
@@ -0,0 +1,13 @@
+ taskiq_redis/__init__.py,sha256=Sl4m9rKxweU1t0m289Qtf0qm4xSSkkFHoOfKq6qaz6g,1192
+ taskiq_redis/exceptions.py,sha256=7buBJ7CRVWd5WqVqSjtHO8cVL7QzZg-DOM3nB87t-Sk,738
+ taskiq_redis/list_schedule_source.py,sha256=guWql2hs2WT35vZtrsW1W9-TvaHsX5Lq_CyRjrl0tGA,9458
+ taskiq_redis/py.typed,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+ taskiq_redis/redis_backend.py,sha256=MLBaUN3Zx-DLvm1r-lgPU82_WZq9kc6oTxYI8LQjd6k,19882
+ taskiq_redis/redis_broker.py,sha256=ZLn7LAHj8Sh_oyW5hMgD7PZPQfUdXNPKdqhBcr9Okmg,9775
+ taskiq_redis/redis_cluster_broker.py,sha256=FuWl5fP7Fwr9FbytErmhcUGjRCdPexDK2Co2u6kpDlo,6591
+ taskiq_redis/redis_sentinel_broker.py,sha256=wHnbG3xuD_ruhhwp4AXo91NNjq8v2iufUZ0i_HbBRVQ,9073
+ taskiq_redis/schedule_source.py,sha256=hqpcs2D8W90KUDHREKblisnhGCE9dbVOtKtuJcOTGZw,9915
+ taskiq_redis-1.0.4.dist-info/LICENSE,sha256=lEHEEE-ZxmuItxYgUMPiFWdRcAITxE8DFMNyAg4eOYE,1075
+ taskiq_redis-1.0.4.dist-info/METADATA,sha256=sv_06NsLK3SODn9rj404w-mGKpnIWrI5iGLZEPyaBj8,6573
+ taskiq_redis-1.0.4.dist-info/WHEEL,sha256=fGIA9gx4Qxk2KDKeNJCbOEwSrmLtjWCwzBz351GyrPQ,88
+ taskiq_redis-1.0.4.dist-info/RECORD,,
@@ -1,4 +1,4 @@
  Wheel-Version: 1.0
- Generator: poetry-core 1.9.0
+ Generator: poetry-core 2.1.2
  Root-Is-Purelib: true
  Tag: py3-none-any
@@ -1,125 +0,0 @@
- Metadata-Version: 2.1
- Name: taskiq-redis
- Version: 1.0.2
- Summary: Redis integration for taskiq
- Home-page: https://github.com/taskiq-python/taskiq-redis
- Keywords: taskiq,tasks,distributed,async,redis,result_backend
- Author: taskiq-team
- Author-email: taskiq@norely.com
- Requires-Python: >=3.8.1,<4.0.0
- Classifier: Programming Language :: Python
- Classifier: Programming Language :: Python :: 3
- Classifier: Programming Language :: Python :: 3.9
- Classifier: Programming Language :: Python :: 3.10
- Classifier: Programming Language :: Python :: 3.11
- Classifier: Programming Language :: Python :: 3.12
- Classifier: Programming Language :: Python :: 3 :: Only
- Classifier: Programming Language :: Python :: 3.8
- Requires-Dist: redis (>=5,<6)
- Requires-Dist: taskiq (>=0.11.1,<1)
- Project-URL: Repository, https://github.com/taskiq-python/taskiq-redis
- Description-Content-Type: text/markdown
-
- # TaskIQ-Redis
-
- Taskiq-redis is a plugin for taskiq that adds a new broker and result backend based on redis.
-
- # Installation
-
- To use this project you must have installed core taskiq library:
- ```bash
- pip install taskiq
- ```
- This project can be installed using pip:
- ```bash
- pip install taskiq-redis
- ```
-
- # Usage
-
- Let's see the example with the redis broker and redis async result:
-
- ```python
- # broker.py
- import asyncio
-
- from taskiq_redis import ListQueueBroker, RedisAsyncResultBackend
-
- redis_async_result = RedisAsyncResultBackend(
-     redis_url="redis://localhost:6379",
- )
-
- # Or you can use PubSubBroker if you need broadcasting
- broker = ListQueueBroker(
-     url="redis://localhost:6379",
-     result_backend=redis_async_result,
- )
-
-
- @broker.task
- async def best_task_ever() -> None:
-     """Solve all problems in the world."""
-     await asyncio.sleep(5.5)
-     print("All problems are solved!")
-
-
- async def main():
-     task = await best_task_ever.kiq()
-     print(await task.wait_result())
-
-
- if __name__ == "__main__":
-     asyncio.run(main())
- ```
-
- Launch the workers:
- `taskiq worker broker:broker`
- Then run the main code:
- `python3 broker.py`
-
- ## PubSubBroker and ListQueueBroker configuration
-
- We have two brokers with similar interfaces, but with different logic.
- The PubSubBroker uses redis' pubsub mechanism and is very powerful,
- but it executes every task on all workers, because PUBSUB broadcasts message
- to all subscribers.
-
- If you want your messages to be processed only once, please use ListQueueBroker.
- It uses redis' [LPUSH](https://redis.io/commands/lpush/) and [BRPOP](https://redis.io/commands/brpop/) commands to deal with messages.
-
- Brokers parameters:
- * `url` - url to redis.
- * `task_id_generator` - custom task_id genertaor.
- * `result_backend` - custom result backend.
- * `queue_name` - name of the pub/sub channel in redis.
- * `max_connection_pool_size` - maximum number of connections in pool.
- * Any other keyword arguments are passed to `redis.asyncio.BlockingConnectionPool`.
- Notably, you can use `timeout` to set custom timeout in seconds for reconnects
- (or set it to `None` to try reconnects indefinitely).
-
- ## RedisAsyncResultBackend configuration
-
- RedisAsyncResultBackend parameters:
- * `redis_url` - url to redis.
- * `keep_results` - flag to not remove results from Redis after reading.
- * `result_ex_time` - expire time in seconds (by default - not specified)
- * `result_px_time` - expire time in milliseconds (by default - not specified)
- * Any other keyword arguments are passed to `redis.asyncio.BlockingConnectionPool`.
- Notably, you can use `timeout` to set custom timeout in seconds for reconnects
- (or set it to `None` to try reconnects indefinitely).
- > IMPORTANT: **It is highly recommended to use expire time in RedisAsyncResultBackend**
- > If you want to add expiration, either `result_ex_time` or `result_px_time` must be set.
- >```python
- ># First variant
- >redis_async_result = RedisAsyncResultBackend(
- >    redis_url="redis://localhost:6379",
- >    result_ex_time=1000,
- >)
- >
- ># Second variant
- >redis_async_result = RedisAsyncResultBackend(
- >    redis_url="redis://localhost:6379",
- >    result_px_time=1000000,
- >)
- >```
-
@@ -1,12 +0,0 @@
- taskiq_redis/__init__.py,sha256=UEW3rQXt4jinMnAKJlpXQhyPDh6SU2in0bPgzfIo3y4,911
- taskiq_redis/exceptions.py,sha256=eS4bfZVAjyMsnFs3IF74uYwO1KZOlrYxhxgPqD49ztU,561
- taskiq_redis/py.typed,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
- taskiq_redis/redis_backend.py,sha256=mGU3rXJ727X-qYqSfaderTFVGz93NpCyHCf5PDFTjGk,19543
- taskiq_redis/redis_broker.py,sha256=JeoA3-quZYqa_wixJefMRYPkZe94x-qoQb6tQKkHLzg,4733
- taskiq_redis/redis_cluster_broker.py,sha256=CgPKkoEHZ1moNM-VNmzPQdjjNOrhiVUCNV-7FrUgqTo,2121
- taskiq_redis/redis_sentinel_broker.py,sha256=5MxUFIX7qRyDT7IHebLUhxAmmUwk1_b2sxjpSXRcjlo,4114
- taskiq_redis/schedule_source.py,sha256=bk96UBg8op-Xqg_PVETgyDb92cDaY69EAjpP8GvYSnY,10068
- taskiq_redis-1.0.2.dist-info/LICENSE,sha256=lEHEEE-ZxmuItxYgUMPiFWdRcAITxE8DFMNyAg4eOYE,1075
- taskiq_redis-1.0.2.dist-info/METADATA,sha256=6A_nDPLAmO92y_Db7vNUIGiOGqH7gTm_rCpb0KMIPOc,4030
- taskiq_redis-1.0.2.dist-info/WHEEL,sha256=sP946D7jFCHeNz5Iq4fL4Lu-PrWrFsgfLXbbkciIZwg,88
- taskiq_redis-1.0.2.dist-info/RECORD,,