AxiomQuery 0.1.0.tar.gz → 0.2.0.tar.gz
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- {axiomquery-0.1.0 → axiomquery-0.2.0}/CHANGELOG.md +18 -0
- {axiomquery-0.1.0 → axiomquery-0.2.0}/PKG-INFO +22 -1
- {axiomquery-0.1.0 → axiomquery-0.2.0}/README.md +21 -0
- {axiomquery-0.1.0 → axiomquery-0.2.0}/examples/example_async.py +80 -14
- {axiomquery-0.1.0 → axiomquery-0.2.0}/examples/example_sync.py +81 -13
- {axiomquery-0.1.0 → axiomquery-0.2.0}/pyproject.toml +1 -1
- {axiomquery-0.1.0 → axiomquery-0.2.0}/src/axiom_query/__init__.py +1 -1
- {axiomquery-0.1.0 → axiomquery-0.2.0}/src/axiom_query/compiler.py +30 -13
- {axiomquery-0.1.0 → axiomquery-0.2.0}/src/axiom_query/schema.py +29 -3
- {axiomquery-0.1.0 → axiomquery-0.2.0}/tests/conftest.py +24 -3
- {axiomquery-0.1.0 → axiomquery-0.2.0}/tests/test_list.py +39 -1
- {axiomquery-0.1.0 → axiomquery-0.2.0}/.github/workflows/python-publish.yml +0 -0
- {axiomquery-0.1.0 → axiomquery-0.2.0}/.gitignore +0 -0
- {axiomquery-0.1.0 → axiomquery-0.2.0}/CONTRIBUTING.md +0 -0
- {axiomquery-0.1.0 → axiomquery-0.2.0}/LICENSE +0 -0
- {axiomquery-0.1.0 → axiomquery-0.2.0}/src/axiom_query/aggregation.py +0 -0
- {axiomquery-0.1.0 → axiomquery-0.2.0}/src/axiom_query/aggregation_parser.py +0 -0
- {axiomquery-0.1.0 → axiomquery-0.2.0}/src/axiom_query/ast.py +0 -0
- {axiomquery-0.1.0 → axiomquery-0.2.0}/src/axiom_query/compiler_aggregate.py +0 -0
- {axiomquery-0.1.0 → axiomquery-0.2.0}/src/axiom_query/engine.py +0 -0
- {axiomquery-0.1.0 → axiomquery-0.2.0}/src/axiom_query/errors.py +0 -0
- {axiomquery-0.1.0 → axiomquery-0.2.0}/src/axiom_query/operators.py +0 -0
- {axiomquery-0.1.0 → axiomquery-0.2.0}/src/axiom_query/parser.py +0 -0
- {axiomquery-0.1.0 → axiomquery-0.2.0}/src/axiom_query/py.typed +0 -0
- {axiomquery-0.1.0 → axiomquery-0.2.0}/tests/test_async.py +0 -0
- {axiomquery-0.1.0 → axiomquery-0.2.0}/tests/test_read_group.py +0 -0
**CHANGELOG.md**

```diff
@@ -7,6 +7,24 @@ Versioning follows [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
 
 ---
 
+## [0.2.0] — 2026-04-13
+
+### Added
+
+- M2O (Many-to-One) relational field filtering via dot-notation (e.g. `["customer.name", "ilike", "%Alice%"]`)
+  - Generates an `EXISTS` subquery joining the referenced table on the local FK column
+  - Composes freely with scalar filters and O2M child filters in the same domain
+- `RelatedSchema` dataclass in `schema.py` to capture M2O relationship metadata (referenced table, local FK column, columns)
+- `related` field on `ModelSchema` mapping relationship attribute names to their `RelatedSchema`
+
+---
+
+## [0.1.1] — 2026-04-05
+Acknowledgement and inspiration section added.
+
+### Added
+
+- Updated the README.md file with acknowledgement details
+
 ## [0.1.0] — 2026-03-29
 
 Initial release.
```
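The M2O changelog entry above turns a dot-notation condition into an `EXISTS` subquery joined on the local FK column. As a rough sketch of the SQL shape involved, built with plain SQLAlchemy Core (the `orders`/`customers` tables here are illustrative, mirroring the bundled examples, not the library's internals):

```python
from sqlalchemy import Column, Integer, MetaData, String, Table, and_, exists, select

metadata = MetaData()
# Illustrative tables mirroring the examples shipped with the package.
orders = Table(
    "orders",
    metadata,
    Column("id", Integer, primary_key=True),
    Column("customer_id", Integer),
)
customers = Table(
    "customers",
    metadata,
    Column("id", Integer, primary_key=True),
    Column("name", String(100)),
)

# ["customer.name", "ilike", "%Alice%"] → EXISTS over the referenced table,
# correlated on the local FK column (customers.id == orders.customer_id).
condition = exists(
    select(1)
    .select_from(customers)
    .where(
        and_(
            customers.c.id == orders.c.customer_id,
            customers.c.name.ilike("%Alice%"),
        )
    )
)
stmt = select(orders).where(condition)
print(stmt)  # the WHERE clause contains a correlated EXISTS (SELECT 1 FROM customers ...)
```

Because the subquery references `orders.customer_id`, SQLAlchemy correlates it automatically, which is what lets the condition compose with ordinary scalar filters on `orders`.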
**PKG-INFO**

````diff
@@ -1,6 +1,6 @@
 Metadata-Version: 2.4
 Name: AxiomQuery
-Version: 0.1.0
+Version: 0.2.0
 Summary: Specification-based query and aggregation engine for SQLAlchemy 2.0 ORM models
 Project-URL: Source Code, https://github.com/Axiom-Dev-Labs/AxiomQuery
 Project-URL: Bug Tracker, https://github.com/Axiom-Dev-Labs/AxiomQuery/issues
@@ -230,3 +230,24 @@ python examples/example_async.py
 ```
 
 Both cover: simple filters, AND / OR / NOT, combined nesting, child EXISTS filtering, pagination, `read_group` with domain / date granularity / child aggregation / HAVING, and `__domain` drill-down.
+
+***
+
+## 🙌 Acknowledgements & Inspirations
+
+The creation of AxiomQuery was sparked by a desire to cleanly bridge pure domain logic with robust data access. The conceptual "trigger point" for this library came from **Martin Fowler and Eric Evans' Specification Pattern**—a blueprint for encapsulating business rules. It was the foundation of **SQLAlchemy 2.0** that provided the mechanical reality, making it possible to translate those decoupled domain specifications into optimized SQL.
+
+A huge thank you to the maintainers and contributors of SQLAlchemy. AxiomQuery is built explicitly as a specification-based query and aggregation engine for SQLAlchemy 2.0 ORM models, and it relies on several of their most powerful features:
+
+* 🔍 **Introspection (`inspect()`):** AxiomQuery automatically derives all necessary schema data—including `mapper.columns`, one-to-many relationships (`RelationshipDirection.ONETOMANY`), and foreign key synchronization pairs—directly from SQLAlchemy's introspection tools. This lets the engine extract everything the compiler needs without forcing the developer to write duplicate descriptor code.
+* 🏗️ **Expression Language:** The underlying AST compiler relies heavily on SQLAlchemy's composable query constructs. Mapping the 11 supported operators to native methods makes it easy to safely compile complex SQL `WHERE` clauses, including `EXISTS` subqueries for parent-child filtering and `LEFT JOIN` aggregations with database-specific date truncations.
+* 🔌 **Decoupled Session Management:** Because SQLAlchemy cleanly separates the ORM models from the active database connection, AxiomQuery can operate as a thin, reusable facade. The library expects a caller-owned session (standard or `AsyncSession`), allowing developers to manage transactions across multiple engines without friction.
+
+Thank you for providing the introspection and query-building tools that make translating dynamic JSON expressions into complex SQL queries a reality! ✨
+
+### 📚 References
+
+* **The Specification Pattern:** [Specifications by Martin Fowler & Eric Evans (PDF)](https://martinfowler.com/apsupp/spec.pdf) - The foundational paper that inspired the core domain-driven architecture of this library.
+* **SQLAlchemy 2.0:** [Official Documentation](https://docs.sqlalchemy.org/en/20/) - The ORM and toolkit that powers the AxiomQuery engine.
````
**README.md**

````diff
@@ -187,3 +187,24 @@ python examples/example_async.py
 ```
 
 Both cover: simple filters, AND / OR / NOT, combined nesting, child EXISTS filtering, pagination, `read_group` with domain / date granularity / child aggregation / HAVING, and `__domain` drill-down.
+
+***
+
+## 🙌 Acknowledgements & Inspirations
+
+The creation of AxiomQuery was sparked by a desire to cleanly bridge pure domain logic with robust data access. The conceptual "trigger point" for this library came from **Martin Fowler and Eric Evans' Specification Pattern**—a blueprint for encapsulating business rules. It was the foundation of **SQLAlchemy 2.0** that provided the mechanical reality, making it possible to translate those decoupled domain specifications into optimized SQL.
+
+A huge thank you to the maintainers and contributors of SQLAlchemy. AxiomQuery is built explicitly as a specification-based query and aggregation engine for SQLAlchemy 2.0 ORM models, and it relies on several of their most powerful features:
+
+* 🔍 **Introspection (`inspect()`):** AxiomQuery automatically derives all necessary schema data—including `mapper.columns`, one-to-many relationships (`RelationshipDirection.ONETOMANY`), and foreign key synchronization pairs—directly from SQLAlchemy's introspection tools. This lets the engine extract everything the compiler needs without forcing the developer to write duplicate descriptor code.
+* 🏗️ **Expression Language:** The underlying AST compiler relies heavily on SQLAlchemy's composable query constructs. Mapping the 11 supported operators to native methods makes it easy to safely compile complex SQL `WHERE` clauses, including `EXISTS` subqueries for parent-child filtering and `LEFT JOIN` aggregations with database-specific date truncations.
+* 🔌 **Decoupled Session Management:** Because SQLAlchemy cleanly separates the ORM models from the active database connection, AxiomQuery can operate as a thin, reusable facade. The library expects a caller-owned session (standard or `AsyncSession`), allowing developers to manage transactions across multiple engines without friction.
+
+Thank you for providing the introspection and query-building tools that make translating dynamic JSON expressions into complex SQL queries a reality! ✨
+
+### 📚 References
+
+* **The Specification Pattern:** [Specifications by Martin Fowler & Eric Evans (PDF)](https://martinfowler.com/apsupp/spec.pdf) - The foundational paper that inspired the core domain-driven architecture of this library.
+* **SQLAlchemy 2.0:** [Official Documentation](https://docs.sqlalchemy.org/en/20/) - The ORM and toolkit that powers the AxiomQuery engine.
````
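The acknowledgements above cite the Specification pattern as the library's conceptual starting point. As a minimal, library-independent illustration of the idea (the `Spec` class here is hypothetical, not AxiomQuery's API): a specification wraps a boolean business rule and composes with and/or/not, exactly the shape that AxiomQuery's domains later compile into SQL.

```python
from dataclasses import dataclass
from typing import Any, Callable


@dataclass(frozen=True)
class Spec:
    """A business rule as a predicate; composable with &, |, ~."""

    pred: Callable[[Any], bool]

    def is_satisfied_by(self, candidate: Any) -> bool:
        return self.pred(candidate)

    def __and__(self, other: "Spec") -> "Spec":
        return Spec(lambda c: self.pred(c) and other.pred(c))

    def __or__(self, other: "Spec") -> "Spec":
        return Spec(lambda c: self.pred(c) or other.pred(c))

    def __invert__(self) -> "Spec":
        return Spec(lambda c: not self.pred(c))


# Rules stated once, composed freely, evaluated against plain objects.
confirmed = Spec(lambda o: o["status"] == "CONFIRMED")
large = Spec(lambda o: o["total"] >= 100)
rule = confirmed & ~large

print(rule.is_satisfied_by({"status": "CONFIRMED", "total": 50}))   # True
print(rule.is_satisfied_by({"status": "CONFIRMED", "total": 200}))  # False
```

The paper linked in the references develops the same idea further, including translating specifications into queries rather than evaluating them in memory.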
**examples/example_async.py**

```diff
@@ -5,7 +5,8 @@ methods. Demonstrates the same domain styles:
 - Simple equality / comparison
 - AND, OR, NOT
 - Combined nested conditions
-- Child-field EXISTS filtering
+- Child-field EXISTS filtering (O2M)
+- Many-to-One field filtering (M2O EXISTS subquery)
 - alist() options: limit, offset, order_by
 - aread_group() with domain, date granularity, child aggregate, HAVING
 - __domain drill-down with alist()
@@ -18,7 +19,7 @@ from __future__ import annotations
 
 import asyncio
 from datetime import datetime
-from typing import List
+from typing import List, Optional
 
 from sqlalchemy import DateTime, ForeignKey, String
 from sqlalchemy.ext.asyncio import AsyncSession, create_async_engine
@@ -34,12 +35,25 @@ class Base(DeclarativeBase):
     pass
 
 
+class Customer(Base):
+    __tablename__ = "customers"
+    id: Mapped[int] = mapped_column(primary_key=True)
+    name: Mapped[str] = mapped_column(String(100))
+
+    def __repr__(self) -> str:
+        return f"Customer(id={self.id}, name={self.name!r})"
+
+
 class Order(Base):
     __tablename__ = "orders"
     id: Mapped[int] = mapped_column(primary_key=True)
     status: Mapped[str] = mapped_column(String(20))
     total: Mapped[int] = mapped_column(default=0)
     created_at: Mapped[datetime] = mapped_column(DateTime)
+    customer_id: Mapped[Optional[int]] = mapped_column(
+        ForeignKey("customers.id"), nullable=True
+    )
+    customer: Mapped[Optional["Customer"]] = relationship()
     lines: Mapped[List["OrderLine"]] = relationship(back_populates="order")
 
     def __repr__(self) -> str:
@@ -62,14 +76,41 @@ class OrderLine(Base):
 async def seed(session: AsyncSession) -> None:
     session.add_all(
         [
+            Customer(id=1, name="Joy"),
+            Customer(id=2, name="Bob"),
+        ]
+    )
+    await session.flush()
+    session.add_all(
+        [
+            Order(
+                id=1,
+                status="CONFIRMED",
+                total=100,
+                created_at=datetime(2026, 1, 15),
+                customer_id=1,
+            ),
+            Order(
+                id=2,
+                status="CONFIRMED",
+                total=200,
+                created_at=datetime(2026, 2, 20),
+                customer_id=2,
+            ),
             Order(
-                id=
+                id=3,
+                status="DRAFT",
+                total=50,
+                created_at=datetime(2026, 1, 25),
+                customer_id=1,
             ),
             Order(
-                id=
+                id=4,
+                status="CANCELLED",
+                total=75,
+                created_at=datetime(2026, 3, 10),
+                customer_id=None,
             ),
-            Order(id=3, status="DRAFT", total=50, created_at=datetime(2026, 1, 25)),
-            Order(id=4, status="CANCELLED", total=75, created_at=datetime(2026, 3, 10)),
         ]
     )
     await session.flush()
@@ -302,9 +343,34 @@ async def main() -> None:
         ),
     )
 
-    # ── Section 8:
+    # ── Section 8: M2O field — EXISTS on referenced table ────────────
+
+    section("8. M2O field filtering (EXISTS on referenced table)")
+
+    show(
+        "orders where customer.name = 'Joy'",
+        await engine.alist(session, domain=[["customer.name", "=", "Joy"]]),
+    )
+
+    show(
+        "orders where customer.name ilike '%ob%'",
+        await engine.alist(session, domain=[["customer.name", "ilike", "%ob%"]]),
+    )
+
+    show(
+        "CONFIRMED orders for Joy (M2O + scalar)",
+        await engine.alist(
+            session,
+            domain=[
+                ["customer.name", "=", "Joy"],
+                ["status", "=", "CONFIRMED"],
+            ],
+        ),
+    )
+
+    # ── Section 9: alist() options ────────────────────────────────────
 
-    section("
+    section("9. alist() — limit, offset, order_by")
 
     show(
         "top 2 orders by total desc",
@@ -327,7 +393,7 @@ async def main() -> None:
 
     # ── Section 9: aread_group — basic ────────────────────────────────
 
-    section("
+    section("10. aread_group — basic groupby + count")
 
     groups, total = await engine.aread_group(
         session,
@@ -339,7 +405,7 @@ async def main() -> None:
 
     # ── Section 10: aread_group with domain ───────────────────────────
 
-    section("
+    section("11. aread_group with domain filter")
 
     groups, total = await engine.aread_group(
         session,
@@ -351,7 +417,7 @@ async def main() -> None:
 
     # ── Section 11: aread_group — date granularity ────────────────────
 
-    section("
+    section("12. aread_group — date granularity (month)")
 
     groups, total = await engine.aread_group(
         session,
@@ -363,7 +429,7 @@ async def main() -> None:
 
     # ── Section 12: aread_group — child aggregate (LEFT JOIN) ─────────
 
-    section("
+    section("13. aread_group — child aggregate (LEFT JOIN)")
 
     groups, total = await engine.aread_group(
         session,
@@ -374,7 +440,7 @@ async def main() -> None:
 
     # ── Section 13: aread_group — HAVING ──────────────────────────────
 
-    section("
+    section("14. aread_group — HAVING filter on aggregate")
 
     groups, total = await engine.aread_group(
         session,
@@ -386,7 +452,7 @@ async def main() -> None:
 
     # ── Section 14: __domain drill-down with alist() ──────────────────
 
-    section("
+    section("15. __domain drill-down — group → records via alist()")
 
     groups, _ = await engine.aread_group(
         session,
```
**examples/example_sync.py**

```diff
@@ -6,7 +6,8 @@ Demonstrates all domain condition styles:
 - OR — any condition must hold
 - NOT — negation
 - Combined — nested AND / OR / NOT
-- Child-field EXISTS filtering
+- Child-field EXISTS filtering (O2M)
+- Many-to-One field filtering (M2O EXISTS subquery)
 - list() options: limit, offset, order_by
 - read_group() with domain, date granularity, child aggregate, HAVING
 - __domain drill-down: group result → list of matching records
@@ -20,6 +21,8 @@ from __future__ import annotations
 from datetime import datetime
 from typing import List
 
+from typing import Optional
+
 from sqlalchemy import DateTime, ForeignKey, String, create_engine
 from sqlalchemy.orm import DeclarativeBase, Mapped, Session, mapped_column, relationship
 
@@ -33,12 +36,25 @@ class Base(DeclarativeBase):
     pass
 
 
+class Customer(Base):
+    __tablename__ = "customers"
+    id: Mapped[int] = mapped_column(primary_key=True)
+    name: Mapped[str] = mapped_column(String(100))
+
+    def __repr__(self) -> str:
+        return f"Customer(id={self.id}, name={self.name!r})"
+
+
 class Order(Base):
     __tablename__ = "orders"
     id: Mapped[int] = mapped_column(primary_key=True)
     status: Mapped[str] = mapped_column(String(20))
     total: Mapped[int] = mapped_column(default=0)
     created_at: Mapped[datetime] = mapped_column(DateTime)
+    customer_id: Mapped[Optional[int]] = mapped_column(
+        ForeignKey("customers.id"), nullable=True
+    )
+    customer: Mapped[Optional["Customer"]] = relationship()
     lines: Mapped[List["OrderLine"]] = relationship(back_populates="order")
 
     def __repr__(self) -> str:
@@ -61,14 +77,41 @@ class OrderLine(Base):
 def seed(session: Session) -> None:
     session.add_all(
         [
+            Customer(id=1, name="Joy"),
+            Customer(id=2, name="Bob"),
+        ]
+    )
+    session.flush()
+    session.add_all(
+        [
+            Order(
+                id=1,
+                status="CONFIRMED",
+                total=100,
+                created_at=datetime(2026, 1, 15),
+                customer_id=1,
+            ),
+            Order(
+                id=2,
+                status="CONFIRMED",
+                total=200,
+                created_at=datetime(2026, 2, 20),
+                customer_id=2,
+            ),
             Order(
-                id=
+                id=3,
+                status="DRAFT",
+                total=50,
+                created_at=datetime(2026, 1, 25),
+                customer_id=1,
             ),
             Order(
-                id=
+                id=4,
+                status="CANCELLED",
+                total=75,
+                created_at=datetime(2026, 3, 10),
+                customer_id=None,
             ),
-            Order(id=3, status="DRAFT", total=50, created_at=datetime(2026, 1, 25)),
-            Order(id=4, status="CANCELLED", total=75, created_at=datetime(2026, 3, 10)),
         ]
     )
     session.flush()
@@ -296,9 +339,34 @@ def main() -> None:
         ),
     )
 
-    # ── Section 8:
+    # ── Section 8: M2O field — EXISTS on referenced table ────────────
+
+    section("8. M2O field filtering (EXISTS on referenced table)")
+
+    show(
+        "orders where customer.name = 'Joy'",
+        engine.list(session, domain=[["customer.name", "=", "Joy"]]),
+    )
+
+    show(
+        "orders where customer.name ilike '%ob%'",
+        engine.list(session, domain=[["customer.name", "ilike", "%ob%"]]),
+    )
+
+    show(
+        "CONFIRMED orders for Joy (M2O + scalar)",
+        engine.list(
+            session,
+            domain=[
+                ["customer.name", "=", "Joy"],
+                ["status", "=", "CONFIRMED"],
+            ],
+        ),
+    )
+
+    # ── Section 9: list() options ─────────────────────────────────────
 
-    section("
+    section("9. list() — limit, offset, order_by")
 
     show(
         "top 2 by total desc",
@@ -312,7 +380,7 @@ def main() -> None:
 
     # ── Section 9: read_group — basic ─────────────────────────────────
 
-    section("
+    section("10. read_group — basic groupby + count")
 
     groups, total = engine.read_group(
         session,
@@ -324,7 +392,7 @@ def main() -> None:
 
     # ── Section 10: read_group with domain ───────────────────────────
 
-    section("
+    section("11. read_group with domain filter")
 
     groups, total = engine.read_group(
         session,
@@ -336,7 +404,7 @@ def main() -> None:
 
     # ── Section 11: read_group — date granularity ─────────────────────
 
-    section("
+    section("12. read_group — date granularity (month)")
 
     groups, total = engine.read_group(
         session,
@@ -348,7 +416,7 @@ def main() -> None:
 
     # ── Section 12: read_group — child aggregate (LEFT JOIN) ──────────
 
-    section("
+    section("13. read_group — child aggregate (LEFT JOIN)")
 
     groups, total = engine.read_group(
         session,
@@ -359,7 +427,7 @@ def main() -> None:
 
     # ── Section 13: read_group — HAVING ──────────────────────────────
 
-    section("
+    section("14. read_group — HAVING filter on aggregate")
 
     groups, total = engine.read_group(
         session,
@@ -371,7 +439,7 @@ def main() -> None:
 
     # ── Section 14: __domain drill-down ──────────────────────────────
 
-    section("
+    section("15. __domain drill-down — group → records")
 
     groups, _ = engine.read_group(
         session,
```
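The examples above pass domains as lists of `[field, op, value]` triples, with several triples combined conjunctively (the "M2O + scalar" case). As a minimal in-memory sketch of that flat, implicitly-ANDed form only (the `matches` helper and its operator table are illustrative; AxiomQuery's real domain syntax also supports nested AND / OR / NOT, which this sketch does not cover):

```python
from typing import Any


def matches(record: dict[str, Any], domain: list[list[Any]]) -> bool:
    """True iff the record satisfies every [field, op, value] triple (implicit AND)."""
    ops = {
        "=": lambda a, b: a == b,
        "!=": lambda a, b: a != b,
        ">=": lambda a, b: a >= b,
        "<=": lambda a, b: a <= b,
        # Case-insensitive substring match, in the spirit of SQL ILIKE with '%' wildcards.
        "ilike": lambda a, b: b.strip("%").lower() in str(a).lower(),
    }
    return all(ops[op](record[field], value) for field, op, value in domain)


orders = [
    {"id": 1, "status": "CONFIRMED", "total": 100},
    {"id": 2, "status": "CONFIRMED", "total": 200},
    {"id": 3, "status": "DRAFT", "total": 50},
]
hits = [
    o["id"]
    for o in orders
    if matches(o, [["status", "=", "CONFIRMED"], ["total", ">=", 150]])
]
print(hits)  # [2]
```

The library's value is doing exactly this filtering in the database instead of in memory, by compiling the same triples to SQL.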
**src/axiom_query/compiler.py**

```diff
@@ -94,22 +94,39 @@ def _make_table_resolver(schema: ModelSchema) -> SAResolver:
 
     def resolve(fp: str, op: Op, val: Any) -> ColumnElement:
         if "." in fp:
-
+            rel_name, field_name = fp.split(".", 1)
             from axiom_query.errors import QueryError
 
-            child
-
-
-
-
+            # O2M: FK is on the child table; use EXISTS over child rows
+            child = schema.children.get(rel_name)
+            if child is not None:
+                fk_col = child.table.c[child.fk_field]
+                field_col = child.table.c[field_name]
+                condition = _apply_operator(field_col, op, val)
+                return exists(
+                    select(1)
+                    .select_from(child.table)
+                    .where(and_(fk_col == schema.table.c.id, condition))
                 )
-
-
-
-
-
-
-
+
+            # M2O: FK is on the current table; use EXISTS over the referenced table
+            related = schema.related.get(rel_name)
+            if related is not None:
+                local_fk_col = schema.table.c[related.fk_field]
+                ref_pk = next(iter(related.table.primary_key))
+                field_col = related.table.c[field_name]
+                condition = _apply_operator(field_col, op, val)
+                return exists(
+                    select(1)
+                    .select_from(related.table)
+                    .where(and_(ref_pk == local_fk_col, condition))
+                )
+
+            all_relations = list(schema.children.keys()) + list(schema.related.keys())
+            raise QueryError(
+                "INVALID_FILTER_FIELD",
+                f"No relation '{rel_name}' on {schema.model_class.__name__}. "
+                f"Available: {', '.join(all_relations) or 'none'}",
             )
         else:
             col = _resolve_column(schema, fp)
```
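Both branches of the resolver above produce correlated `EXISTS` subqueries; only the join direction differs (O2M correlates the child's FK to the parent's PK, M2O correlates the referenced table's PK to the local FK). A standalone SQLAlchemy Core sketch of the two shapes (the tables and column names are illustrative, not the library's internals):

```python
from sqlalchemy import Column, Integer, MetaData, String, Table, and_, exists, select

metadata = MetaData()
orders = Table(
    "orders", metadata,
    Column("id", Integer, primary_key=True),
    Column("customer_id", Integer),
)
customers = Table(
    "customers", metadata,
    Column("id", Integer, primary_key=True),
    Column("name", String),
)
order_lines = Table(
    "order_lines", metadata,
    Column("id", Integer, primary_key=True),
    Column("order_id", Integer),
    Column("qty", Integer),
)

# O2M: FK on the child table → correlate child.fk == parent.pk
o2m = exists(
    select(1)
    .select_from(order_lines)
    .where(and_(order_lines.c.order_id == orders.c.id, order_lines.c.qty > 5))
)

# M2O: FK on the current table → correlate referenced.pk == local FK
m2o = exists(
    select(1)
    .select_from(customers)
    .where(and_(customers.c.id == orders.c.customer_id, customers.c.name == "Joy"))
)

# The two conditions compose like any other WHERE-clause elements.
stmt = select(orders).where(and_(o2m, m2o))
print(stmt)
```

Using `EXISTS` rather than a join keeps the outer result set free of row duplication, which is why the diff's filters compose freely with scalar conditions.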
**src/axiom_query/schema.py**

```diff
@@ -17,12 +17,27 @@ class ChildSchema:
     columns: dict[str, Column]
 
 
+@dataclass
+class RelatedSchema:
+    """Represents a Many-to-One (M2O) relationship on the inspected model.
+
+    The FK column lives on the *current* table (e.g. orders.customer_id),
+    and the referenced table holds the PK that the FK points to.
+    """
+
+    name: str
+    table: Table
+    fk_field: str  # FK column name on the current (owning) table
+    columns: dict[str, Column]
+
+
 @dataclass
 class ModelSchema:
     model_class: type
     table: Table
     columns: dict[str, Column]
     children: dict[str, ChildSchema] = field(default_factory=dict)
+    related: dict[str, RelatedSchema] = field(default_factory=dict)
 
 
 def derive_schema(model_class: type) -> ModelSchema:
@@ -32,24 +47,35 @@ def derive_schema(model_class: type) -> ModelSchema:
     columns = {col.key: col for col in mapper.columns}
 
     children: dict[str, ChildSchema] = {}
+    related: dict[str, RelatedSchema] = {}
     for rel_name, rel in mapper.relationships.items():
         if rel.direction == RelationshipDirection.ONETOMANY:
             child_table = rel.mapper.local_table
             child_columns = {col.key: col for col in rel.mapper.columns}
-            # Find the FK column on the child table via synchronize_pairs
             # synchronize_pairs: list of (parent_col, child_col) tuples
             _, child_fk_col = next(iter(rel.synchronize_pairs))
-            fk_field = child_fk_col.key
             children[rel_name] = ChildSchema(
                 name=rel_name,
                 table=child_table,
-                fk_field=
+                fk_field=child_fk_col.key,
                 columns=child_columns,
             )
+        elif rel.direction == RelationshipDirection.MANYTOONE:
+            ref_table = rel.mapper.local_table
+            ref_columns = {col.key: col for col in rel.mapper.columns}
+            # synchronize_pairs for M2O: (referenced_pk_col, local_fk_col)
+            _, local_fk_col = next(iter(rel.synchronize_pairs))
+            related[rel_name] = RelatedSchema(
+                name=rel_name,
+                table=ref_table,
+                fk_field=local_fk_col.key,
+                columns=ref_columns,
+            )
 
     return ModelSchema(
         model_class=model_class,
         table=table,
         columns=columns,
         children=children,
+        related=related,
    )
```
**tests/conftest.py**

```diff
@@ -16,12 +16,20 @@ class Base(DeclarativeBase):
     pass
 
 
+class Customer(Base):
+    __tablename__ = "customers"
+    id: Mapped[int] = mapped_column(primary_key=True)
+    name: Mapped[str] = mapped_column(String(100))
+
+
 class Order(Base):
     __tablename__ = "orders"
     id: Mapped[int] = mapped_column(primary_key=True)
     status: Mapped[str] = mapped_column(String(20))
     total: Mapped[int] = mapped_column(default=0)
     created_at: Mapped[datetime] = mapped_column(DateTime)
+    customer_id: Mapped[Optional[int]] = mapped_column(ForeignKey("customers.id"), nullable=True)
+    customer: Mapped[Optional["Customer"]] = relationship()
     lines: Mapped[List["OrderLine"]] = relationship(back_populates="order")
 
 
@@ -47,23 +55,31 @@ def db_engine():
 def seeded_engine(db_engine):
     """Seed test data once per session."""
     with Session(db_engine) as sess:
+        c1 = Customer(id=1, name="Alice")
+        c2 = Customer(id=2, name="Bob")
+        sess.add_all([c1, c2])
+        sess.flush()
+
         o1 = Order(
             id=1,
             status="CONFIRMED",
             total=100,
             created_at=datetime(2026, 1, 15),
+            customer_id=1,
         )
         o2 = Order(
             id=2,
             status="CONFIRMED",
             total=200,
             created_at=datetime(2026, 2, 20),
+            customer_id=2,
         )
         o3 = Order(
             id=3,
             status="DRAFT",
             total=50,
             created_at=datetime(2026, 1, 25),
+            customer_id=None,
         )
         sess.add_all([o1, o2, o3])
         sess.flush()
@@ -108,9 +124,14 @@ async def seeded_async_engine(async_db_engine):
     from sqlalchemy.ext.asyncio import AsyncSession
 
     async with AsyncSession(async_db_engine) as sess:
-
-
-
+        c1 = Customer(id=1, name="Alice")
+        c2 = Customer(id=2, name="Bob")
+        sess.add_all([c1, c2])
+        await sess.flush()
+
+        o1 = Order(id=1, status="CONFIRMED", total=100, created_at=datetime(2026, 1, 15), customer_id=1)
+        o2 = Order(id=2, status="CONFIRMED", total=200, created_at=datetime(2026, 2, 20), customer_id=2)
+        o3 = Order(id=3, status="DRAFT", total=50, created_at=datetime(2026, 1, 25), customer_id=None)
         sess.add_all([o1, o2, o3])
         await sess.flush()
 
```
**tests/test_list.py**

```diff
@@ -1,7 +1,9 @@
-"""Tests for QueryEngine.list() — slices 1-5."""
+"""Tests for QueryEngine.list() — slices 1-5 + M2O filtering."""
 
 from __future__ import annotations
 
+import pytest
+
 from conftest import Order
 
 
@@ -51,3 +53,39 @@ def test_list_with_order_by(session, engine):
     records = engine.list(session, order_by=[["total", "desc"]])
     assert records[0].total == 200
     assert records[1].total == 100
+
+
+# Slice 6 — M2O field filtering (EXISTS on referenced table)
+def test_list_filters_by_m2o_field(session, engine):
+    # Order 1 → customer Alice; Order 2 → customer Bob; Order 3 → no customer
+    records = engine.list(session, domain=[["customer.name", "=", "Alice"]])
+    assert len(records) == 1
+    assert records[0].id == 1
+
+
+def test_list_m2o_ilike(session, engine):
+    records = engine.list(session, domain=[["customer.name", "ilike", "%ob%"]])
+    assert len(records) == 1
+    assert records[0].id == 2
+
+
+def test_list_m2o_no_match(session, engine):
+    records = engine.list(session, domain=[["customer.name", "=", "Nobody"]])
+    assert records == []
+
+
+def test_list_m2o_combined_with_scalar(session, engine):
+    # Alice's order is CONFIRMED → should match
+    records = engine.list(
+        session,
+        domain=[["customer.name", "=", "Alice"], ["status", "=", "CONFIRMED"]],
+    )
+    assert len(records) == 1
+    assert records[0].id == 1
+
+
+def test_list_m2o_unknown_relation_raises(session, engine):
+    from axiom_query.errors import QueryError
+
+    with pytest.raises(QueryError):
+        engine.list(session, domain=[["nonexistent.name", "=", "x"]])
```
All remaining files listed above (+0 -0) are unchanged between versions.