ltq 0.3.1__py3-none-any.whl → 0.3.2__py3-none-any.whl

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -0,0 +1,218 @@
+ Metadata-Version: 2.3
+ Name: ltq
+ Version: 0.3.2
+ Summary: Add your description here
+ Author: Tom Clesius
+ Author-email: Tom Clesius <tomclesius@gmail.com>
+ Requires-Dist: redis>=7.1.0
+ Requires-Dist: croniter>=6.0.0 ; extra == 'scheduler'
+ Requires-Dist: sentry-sdk>=2.0.0 ; extra == 'sentry'
+ Requires-Python: >=3.13
+ Provides-Extra: scheduler
+ Provides-Extra: sentry
+ Description-Content-Type: text/markdown
+
+ <p align="center">
+   <img src="https://raw.githubusercontent.com/tclesius/ltq/refs/heads/main/assets/logo.png" alt="LTQ" width="400">
+ </p>
+
+ <p align="center">
+   A lightweight, async-first task queue built on Redis.
+ </p>
+
+ ## Installation
+
+ ```bash
+ pip install ltq
+ # or
+ uv add ltq
+ ```
+
+ ## Broker Backends
+
+ LTQ supports multiple broker backends:
+
+ - **Redis** (default): `broker_url="redis://localhost:6379"`
+ - **Memory**: `broker_url="memory://"` (useful for testing)
+
+ All workers and schedulers accept a `broker_url` parameter.
+
+ ## Quick Start
+
+ ```python
+ import asyncio
+ import ltq
+
+ worker = ltq.Worker("emails", broker_url="redis://localhost:6379")
+
+ @worker.task()
+ async def send_email(to: str, subject: str, body: str) -> None:
+     # your async code here
+     pass
+
+ async def main():
+     # Enqueue a task
+     await send_email.send("user@example.com", "Hello", "World")
+
+     # Or enqueue multiple tasks
+     for email in ["a@example.com", "b@example.com"]:
+         await send_email.send(email, "Hi", "Message")
+
+ asyncio.run(main())
+ ```
+
+ Each worker has a namespace (e.g., `"emails"`), and tasks are automatically namespaced as `{namespace}:{function_name}`.
+
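The naming scheme above is plain string composition; as an illustrative sketch (a hypothetical helper mirroring the documented `{namespace}:{function_name}` convention, not ltq's internals):

```python
def queue_name(namespace: str, function_name: str) -> str:
    """Compose a task's queue key as {namespace}:{function_name},
    matching the convention described above."""
    return f"{namespace}:{function_name}"

# A worker named "emails" with a task function send_email
# would consume from this queue.
print(queue_name("emails", "send_email"))  # → emails:send_email
```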
+ ## Running Workers
+
+ ```bash
+ # Run a single worker
+ ltq run myapp:worker
+
+ # With options
+ ltq run myapp:worker --concurrency 100 --log-level DEBUG
+ ```
+
+ ## Running an App
+
+ Register multiple workers into an `App` to run them together:
+
+ ```python
+ import ltq
+
+ app = ltq.App()
+ app.register_worker(emails_worker)
+ app.register_worker(notifications_worker)
+ ```
+
+ ```bash
+ ltq run --app myapp:app
+ ```
+
+ ### App Middleware
+
+ Apply middleware globally to all workers in an app:
+
+ ```python
+ from ltq.middleware import Sentry
+
+ app = ltq.App(middlewares=[Sentry(dsn="https://...")])
+
+ # Or register after creation
+ app.register_middleware(Sentry(dsn="https://..."))
+ app.register_middleware(MyMiddleware(), pos=0)
+
+ # When workers are registered, app middlewares are prepended to each worker's stack
+ app.register_worker(emails_worker)
+ ```
+
+ ### Threading Model
+
+ By default, `App` runs each worker in its own thread with a separate event loop. This provides isolation between workers while keeping them in the same process. Workers won't block each other since each has its own async event loop.
+
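That thread-per-worker model is not ltq-specific; a minimal, library-independent sketch of the pattern (the names `worker_main` and `run_in_thread` are illustrative, not ltq APIs) looks like this:

```python
import asyncio
import threading

async def worker_main(name: str, results: list[str]) -> None:
    # Stand-in for a worker's consume loop.
    await asyncio.sleep(0.01)
    results.append(name)

def run_in_thread(name: str, results: list[str]) -> threading.Thread:
    # Each thread owns a fresh event loop via asyncio.run(), so one
    # worker's slow or blocking tasks cannot stall another worker.
    def target() -> None:
        asyncio.run(worker_main(name, results))
    t = threading.Thread(target=target, name=name)
    t.start()
    return t

results: list[str] = []
threads = [run_in_thread(n, results) for n in ("emails", "notifications")]
for t in threads:
    t.join()
print(sorted(results))  # → ['emails', 'notifications']
```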
+ **For maximum isolation** (separate memory, crash protection), run each worker in its own process:
+
+ ```bash
+ # Terminal 1
+ ltq run myapp:emails_worker
+
+ # Terminal 2
+ ltq run myapp:notifications_worker
+ ```
+
+ This gives you full process isolation at the cost of more overhead.
+
+ ## Queue Management
+
+ Manage queues using the CLI:
+
+ ```bash
+ # Clear a task queue
+ ltq clear emails:send_email
+
+ # Get queue size
+ ltq size emails:send_email
+
+ # With custom Redis URL
+ ltq clear emails:send_email --redis-url redis://localhost:6380
+ ltq size emails:send_email --redis-url redis://localhost:6380
+ ```
+
+ Queue names are automatically namespaced as `{worker_name}:{function_name}`.
+
+ ## Scheduler
+
+ Run tasks on a cron schedule (requires `ltq[scheduler]`):
+
+ ```python
+ import ltq
+
+ scheduler = ltq.Scheduler()
+ scheduler.cron("*/5 * * * *", send_email.message("admin@example.com", "Report", "..."))
+ scheduler.start()  # Runs scheduler in blocking mode with asyncio.run()
+ ```
+
+ ## Task Options
+
+ Configure task behavior with options:
+
+ ```python
+ from datetime import timedelta
+
+ @worker.task(max_tries=3, max_age=timedelta(hours=1), max_rate="10/s")
+ async def send_email(to: str, subject: str, body: str) -> None:
+     # your async code here
+     pass
+ ```
+
+ **Available options:**
+
+ - `max_tries` (int): Maximum retry attempts before rejecting the message
+ - `max_age` (timedelta): Maximum message age before rejection
+ - `max_rate` (str): Rate limit in format `"N/s"`, `"N/m"`, or `"N/h"` (requests per second/minute/hour)
+
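The `max_rate` strings are easiest to reason about once normalized to events per second. Here is an illustrative parser for the documented `"N/s"`, `"N/m"`, `"N/h"` format (a sketch, not ltq's actual implementation):

```python
def parse_max_rate(rate: str) -> float:
    """Convert a rate string like "10/s" into events per second."""
    seconds_per_unit = {"s": 1, "m": 60, "h": 3600}
    count, _, unit = rate.partition("/")
    if unit not in seconds_per_unit or not count.isdigit():
        raise ValueError(f"invalid rate: {rate!r}")
    return int(count) / seconds_per_unit[unit]

print(parse_max_rate("10/s"))   # → 10.0
print(parse_max_rate("120/m"))  # → 2.0
```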
+ ## Middleware
+
+ Middlewares are async context managers that wrap task execution. The default stack is `[MaxTries(), MaxAge(), MaxRate()]`, so you only need to specify middlewares if you want to customize the stack or add additional ones:
+
+ ```python
+ from ltq.middleware import MaxTries, MaxAge, MaxRate, Sentry
+
+ worker = ltq.Worker(
+     "emails",
+     broker_url="redis://localhost:6379",
+     middlewares=[
+         MaxTries(),
+         MaxAge(),
+         MaxRate(),
+         Sentry(dsn="https://..."),
+     ],
+ )
+ ```
+
+ **Built-in:** `MaxTries`, `MaxAge`, `MaxRate`, `Sentry` (requires `ltq[sentry]`)
+
+ You can also register middleware after creating the worker:
+
+ ```python
+ worker.register_middleware(Sentry(dsn="https://..."))
+
+ # Insert at specific position (default is -1 for append)
+ worker.register_middleware(MyMiddleware(), pos=0)
+ ```
+
+ **Custom middleware:**
+
+ ```python
+ from contextlib import asynccontextmanager
+ from ltq.middleware import Middleware
+ from ltq.message import Message
+ from ltq.task import Task
+
+ class Logger(Middleware):
+     @asynccontextmanager
+     async def __call__(self, message: Message, task: Task):
+         print(f"Processing {message.task_name}")
+         yield
+         print(f"Completed {message.task_name}")
+ ```
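Composing a stack of such async context managers around a task call is standard `contextlib` territory. This library-independent sketch (the middleware functions here are illustrative, not ltq APIs) shows how a worker could enter middlewares in order and have them unwind in reverse:

```python
import asyncio
from contextlib import AsyncExitStack, asynccontextmanager

events: list[str] = []

@asynccontextmanager
async def logger(task_name: str):
    events.append(f"before {task_name}")
    try:
        yield
    finally:
        events.append(f"after {task_name}")

@asynccontextmanager
async def timer(task_name: str):
    loop = asyncio.get_running_loop()
    start = loop.time()
    try:
        yield
    finally:
        events.append(f"{task_name} took {loop.time() - start:.3f}s")

async def run_task() -> None:
    # Enter each middleware in order; AsyncExitStack exits them in
    # reverse order, so the first middleware wraps all the others.
    async with AsyncExitStack() as stack:
        for mw in (logger, timer):
            await stack.enter_async_context(mw("send_email"))
        events.append("task body")

asyncio.run(run_task())
```

After the run, `events` records `before` first and `after` last, with the task body and the timer's exit line in between.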
@@ -0,0 +1,16 @@
+ ltq/__init__.py,sha256=dMoQtB1-odVRCDWDijNft-DlCverh8y83iQd-hDJxhw,491
+ ltq/app.py,sha256=l045sTXKHQo-pv5JDi8pE90wHGoCX7vsxrTQZIx7o5s,1335
+ ltq/broker.py,sha256=oNZwmugFHB2PbdDiLt4db4MYUxwNOhdOIDto4vLb8uU,4062
+ ltq/cli.py,sha256=8pt6_aP6RFi0Xi2garzefOlsBuPRXuJjpu_-ohG-__o,5282
+ ltq/errors.py,sha256=Oo6fVTFoNhO7xlXJLj4NwBBD4UsrYU_f4msoeS2GsM0,320
+ ltq/logger.py,sha256=XEcjvuZz_drfVyKAQj4r1Dz-gRwZcffkN_xTLg_h4hk,2088
+ ltq/message.py,sha256=vkF17yfTE7sWjqsn5PE9LpGh6FOIlwOJLS4OZ4iGD5k,829
+ ltq/middleware.py,sha256=uHNCJoohgHnPV2cPbaKOL0i-wU9D8WIhfoc07c78ve4,3475
+ ltq/scheduler.py,sha256=FauNClAN-N7A6hUCBI0oAGe8at1fohDLzXuFI1hF1-I,3111
+ ltq/task.py,sha256=0JUY55JezhrtUvAiZLtRPMap77nua09-D3S5VdOA9uY,944
+ ltq/utils.py,sha256=AbpHGcgLb-kRsJGnwFEktk7uzpZOCcBY74-YBdrKVGs,1
+ ltq/worker.py,sha256=6RPiNgPiJmdX38p8Iw_xpbExhYvE5VKqV_g-oC_nctc,3921
+ ltq-0.3.2.dist-info/WHEEL,sha256=iHtWm8nRfs0VRdCYVXocAWFW8ppjHL-uTJkAdZJKOBM,80
+ ltq-0.3.2.dist-info/entry_points.txt,sha256=OogYaOJ_RORrWtrLlEL_gTN9Vx5tkgawl8BO7G9FKcg,38
+ ltq-0.3.2.dist-info/METADATA,sha256=8PubjxCurEHvtFF4L7enJM7D9OF65Tg0exeUd_Tt2rY,5326
+ ltq-0.3.2.dist-info/RECORD,,
@@ -1,4 +1,4 @@
  Wheel-Version: 1.0
- Generator: uv 0.9.27
+ Generator: uv 0.9.30
  Root-Is-Purelib: true
  Tag: py3-none-any
ltq/q.py DELETED
@@ -1,82 +0,0 @@
- from __future__ import annotations
-
- import asyncio
- from typing import TYPE_CHECKING
-
- from .message import Message
-
- if TYPE_CHECKING:
-     from redis.asyncio import Redis
-
-
- class Queue:
-     _GET_SCRIPT = """
-     local items = {}
-     for i = 1, ARGV[1] do
-         local item = redis.call("RPOP", KEYS[1])
-         if not item then break end
-         table.insert(items, item)
-     end
-     for i = 1, #items, 5000 do
-         local chunk = {unpack(items, i, math.min(i + 4999, #items))}
-         redis.call("SADD", KEYS[2], unpack(chunk))
-     end
-     return items
-     """
-
-     def __init__(self, client: Redis, name: str) -> None:
-         self.client = client
-         self.name = name
-         self.queue_key = f"queue:{name}"
-         self.processing_key = f"queue:{name}:processing"
-         self._get = client.register_script(self._GET_SCRIPT)
-
-     @staticmethod
-     def _serialize(messages: list[Message]) -> list[str]:
-         return [msg.to_json() for msg in messages]
-
-     async def put(
-         self,
-         messages: list[Message],
-         delay: float = 0.0,
-         ttl: int | None = None,
-     ) -> None:
-         if not messages:
-             return
-         if delay > 0:
-             await asyncio.sleep(delay)
-         pipe = self.client.pipeline()
-         for item in self._serialize(messages):
-             pipe.lpush(self.queue_key, item)
-         if ttl:
-             pipe.expire(self.queue_key, ttl)
-         await pipe.execute()  # type: ignore
-
-     async def get(self, count: int) -> list[Message]:
-         results = await self._get(
-             keys=[self.queue_key, self.processing_key],
-             args=[count],
-         )  # type: ignore
-         return [Message.from_json(r) for r in results]
-
-     async def ack(self, messages: list[Message]) -> None:
-         if not messages:
-             return
-         items = self._serialize(messages)
-         await self.client.srem(self.processing_key, *items)  # type: ignore
-
-     async def nack(self, messages: list[Message]) -> None:
-         if not messages:
-             return
-         items = self._serialize(messages)
-         pipe = self.client.pipeline()
-         pipe.srem(self.processing_key, *items)
-         for item in items:
-             pipe.lpush(self.queue_key, item)
-         await pipe.execute()  # type: ignore
-
-     async def len(self) -> int:
-         return await self.client.llen(self.queue_key)  # type: ignore
-
-     async def clear(self) -> None:
-         await self.client.delete(self.queue_key, self.processing_key)  # type: ignore
@@ -1,137 +0,0 @@
- Metadata-Version: 2.3
- Name: ltq
- Version: 0.3.1
- Summary: Add your description here
- Author: Tom Clesius
- Author-email: Tom Clesius <tomclesius@gmail.com>
- Requires-Dist: redis>=7.1.0
- Requires-Dist: croniter>=6.0.0 ; extra == 'scheduler'
- Requires-Dist: sentry-sdk>=2.0.0 ; extra == 'sentry'
- Requires-Python: >=3.13
- Provides-Extra: scheduler
- Provides-Extra: sentry
- Description-Content-Type: text/markdown
-
- <p align="center">
-   <img src="https://raw.githubusercontent.com/tclesius/ltq/refs/heads/main/assets/logo.png" alt="LTQ" width="400">
- </p>
-
- <p align="center">
-   A lightweight, Async-first task queue built on Redis.
- </p>
-
- ## Installation
-
- ```bash
- pip install ltq
- # or
- uv add ltq
- ```
-
- ## Quick Start
-
- ```python
- import asyncio
- import ltq
-
- worker = ltq.Worker(url="redis://localhost:6379")
-
- @worker.task()
- async def send_email(to: str, subject: str, body: str) -> None:
-     # your async code here
-     pass
-
- async def main():
-     # Enqueue a task
-     await send_email.send("user@example.com", "Hello", "World")
-
-     # Or dispatch in bulk
-     messages = [
-         send_email.message("a@example.com", "Hi", "A"),
-         send_email.message("b@example.com", "Hi", "B"),
-     ]
-     await ltq.dispatch(messages)
-
- asyncio.run(main())
- ```
-
- Each task gets its own queue by default. To share a queue between tasks, pass `queue_name`:
-
- ```python
- @worker.task(queue_name="emails")
- async def send_email(...): ...
-
- @worker.task(queue_name="emails")
- async def send_newsletter(...): ...
- ```
-
- ## Running Workers
-
- ```bash
- # Run a single worker
- ltq myapp:worker
-
- # With options
- ltq myapp:worker --concurrency 100 --log-level DEBUG
- ```
-
- ## Running an App
-
- Register multiple workers into an `App` to run them together:
-
- ```python
- import ltq
-
- app = ltq.App()
- app.register_worker(emails_worker)
- app.register_worker(notifications_worker)
- ```
-
- ```bash
- ltq --app myapp:app
- ```
-
- ## Scheduler
-
- Run tasks on a cron schedule (requires `ltq[scheduler]`):
-
- ```python
- import ltq
-
- scheduler = ltq.Scheduler()
- scheduler.cron("*/5 * * * *", send_email.message("admin@example.com", "Report", "..."))
- scheduler.run()
- ```
-
- ## Middleware
-
- Add middleware to handle cross-cutting concerns:
-
- ```python
- from ltq.middleware import Retry, RateLimit, Timeout
-
- worker = ltq.Worker(
-     url="redis://localhost:6379",
-     middlewares=[
-         Retry(max_retries=3, min_delay=1.0),
-         RateLimit(requests_per_second=10),
-         Timeout(timeout=30.0),
-     ],
- )
- ```
-
- **Built-in:** `Retry`, `RateLimit`, `Timeout`, `Sentry` (requires `ltq[sentry]`)
-
- **Custom middleware:**
-
- ```python
- from ltq.middleware import Middleware, Handler
- from ltq.message import Message
-
- class Logger(Middleware):
-     async def handle(self, message: Message, next_handler: Handler):
-         print(f"Processing {message.task}")
-         result = await next_handler(message)
-         print(f"Completed {message.task}")
-         return result
- ```
@@ -1,16 +0,0 @@
- ltq/__init__.py,sha256=yZ65BhEtoQApFCOPWCL5_5b2rAIfBkw6XW5ppSpurW0,355
- ltq/app.py,sha256=PI9Zti37qpgYVzLRIbRoAMmhJ3wLdIY05bTQCkUzehA,316
- ltq/cli.py,sha256=JD_iHStkP2WuVlOz0L90Sako-oHPafTzV08INDJsHio,3219
- ltq/errors.py,sha256=i9kXtSVMpam_0VpyL5eaSfBwkGF6dK7xouAcLdc_eNc,324
- ltq/logger.py,sha256=HPClhDt3ecwZqE0Vq2oYF8Nr9jj-xrsSX9tM6enVgkA,1791
- ltq/message.py,sha256=hY_oPNMicSICV9-_o8Rma_kJO3DGJu_PFxyt1xZKMJE,871
- ltq/middleware.py,sha256=xy4VXy8uXp0vBMhdD6aSOQdzwCdbUkcuFw7FwPsqabU,3641
- ltq/q.py,sha256=3jRlPZKO4Zd5IYnb24jO_oRHtHBbG1la5SCZPI6QLI4,2537
- ltq/scheduler.py,sha256=6l53-Hf-2sURxTEpY1MOM5z18slQTl4450f-aLHF1sY,2516
- ltq/task.py,sha256=ZKKag4u9k9hQ-lNdIWZygmGCc01DRt8a6kMQlQr_yaw,1000
- ltq/utils.py,sha256=M7EWJ-n9J3MMwofWwtxVkwgHmGeP66eoWuWDhjbx67k,536
- ltq/worker.py,sha256=WIxLBCZYaVH_KVdexx77J607EEfQm1OKdtjmzCirvPM,3344
- ltq-0.3.1.dist-info/WHEEL,sha256=e_m4S054HL0hyR3CpOk-b7Q7fDX6BuFkgL5OjAExXas,80
- ltq-0.3.1.dist-info/entry_points.txt,sha256=OogYaOJ_RORrWtrLlEL_gTN9Vx5tkgawl8BO7G9FKcg,38
- ltq-0.3.1.dist-info/METADATA,sha256=WJzVlmvsEHVuUl4HovJ9Ip1EU2HERuarIcGpfcpOKAc,2863
- ltq-0.3.1.dist-info/RECORD,,