sql-testing-library 0.14.0__tar.gz → 0.16.0__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (23)
  1. {sql_testing_library-0.14.0 → sql_testing_library-0.16.0}/CHANGELOG.md +23 -0
  2. sql_testing_library-0.14.0/README.md → sql_testing_library-0.16.0/PKG-INFO +173 -32
  3. sql_testing_library-0.14.0/PKG-INFO → sql_testing_library-0.16.0/README.md +114 -90
  4. {sql_testing_library-0.14.0 → sql_testing_library-0.16.0}/pyproject.toml +12 -5
  5. {sql_testing_library-0.14.0 → sql_testing_library-0.16.0}/src/sql_testing_library/__init__.py +3 -1
  6. {sql_testing_library-0.14.0 → sql_testing_library-0.16.0}/src/sql_testing_library/_adapters/bigquery.py +17 -3
  7. sql_testing_library-0.16.0/src/sql_testing_library/_adapters/duckdb.py +474 -0
  8. {sql_testing_library-0.14.0 → sql_testing_library-0.16.0}/src/sql_testing_library/_adapters/snowflake.py +11 -1
  9. {sql_testing_library-0.14.0 → sql_testing_library-0.16.0}/src/sql_testing_library/_core.py +11 -4
  10. {sql_testing_library-0.14.0 → sql_testing_library-0.16.0}/src/sql_testing_library/_pytest_plugin.py +6 -0
  11. {sql_testing_library-0.14.0 → sql_testing_library-0.16.0}/src/sql_testing_library/_sql_utils.py +11 -6
  12. {sql_testing_library-0.14.0 → sql_testing_library-0.16.0}/src/sql_testing_library/_types.py +12 -1
  13. {sql_testing_library-0.14.0 → sql_testing_library-0.16.0}/LICENSE +0 -0
  14. {sql_testing_library-0.14.0 → sql_testing_library-0.16.0}/src/sql_testing_library/_adapters/__init__.py +0 -0
  15. {sql_testing_library-0.14.0 → sql_testing_library-0.16.0}/src/sql_testing_library/_adapters/athena.py +0 -0
  16. {sql_testing_library-0.14.0 → sql_testing_library-0.16.0}/src/sql_testing_library/_adapters/base.py +0 -0
  17. {sql_testing_library-0.14.0 → sql_testing_library-0.16.0}/src/sql_testing_library/_adapters/presto.py +0 -0
  18. {sql_testing_library-0.14.0 → sql_testing_library-0.16.0}/src/sql_testing_library/_adapters/redshift.py +0 -0
  19. {sql_testing_library-0.14.0 → sql_testing_library-0.16.0}/src/sql_testing_library/_adapters/trino.py +0 -0
  20. {sql_testing_library-0.14.0 → sql_testing_library-0.16.0}/src/sql_testing_library/_exceptions.py +0 -0
  21. {sql_testing_library-0.14.0 → sql_testing_library-0.16.0}/src/sql_testing_library/_mock_table.py +0 -0
  22. {sql_testing_library-0.14.0 → sql_testing_library-0.16.0}/src/sql_testing_library/_sql_logger.py +0 -0
  23. {sql_testing_library-0.14.0 → sql_testing_library-0.16.0}/src/sql_testing_library/py.typed +0 -0
@@ -5,6 +5,29 @@ All notable changes to this project will be documented in this file.
  The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
  and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

+ ## 0.16.0 (2025-08-20)
+
+ ### Feat
+
+ - upgrade mocksmith to 6.0.1 (#122)
+
+ ### Fix
+
+ - upgrade snowflake-connector-python to >=3.13.1 for critical security vulnerabilities (#123)
+
+ ## 0.15.0 (2025-07-27)
+
+ ### Feat
+
+ - implement duckdb integration (#117)
+ - integrate mocksmith for test data generation and simplify relea… (#112)
+ - integrate mocksmith for test data generation and simplify release workflow
+
+ ### Fix
+
+ - added explicit dependency of faker
+ - upgrade mocksmith library version
+
  ## 0.14.0 (2025-06-30)

  ### Feat
@@ -1,6 +1,64 @@
+ Metadata-Version: 2.3
+ Name: sql-testing-library
+ Version: 0.16.0
+ Summary: A powerful Python framework for unit testing SQL queries across BigQuery, Snowflake, Redshift, Athena, Trino, and DuckDB with mock data
+ License: MIT
+ Keywords: sql,testing,unit-testing,mock-data,database-testing,bigquery,snowflake,redshift,athena,trino,duckdb,data-engineering,etl-testing,sql-validation,query-testing
+ Author: Gurmeet Saran
+ Author-email: gurmeetx@gmail.com
+ Maintainer: Gurmeet Saran
+ Maintainer-email: gurmeetx@gmail.com
+ Requires-Python: >=3.9
+ Classifier: Development Status :: 4 - Beta
+ Classifier: Environment :: Console
+ Classifier: Framework :: Pytest
+ Classifier: Intended Audience :: Developers
+ Classifier: Intended Audience :: Science/Research
+ Classifier: Intended Audience :: System Administrators
+ Classifier: License :: OSI Approved :: MIT License
+ Classifier: Natural Language :: English
+ Classifier: Operating System :: OS Independent
+ Classifier: Programming Language :: Python :: 3
+ Classifier: Programming Language :: Python :: 3.9
+ Classifier: Programming Language :: Python :: 3.10
+ Classifier: Programming Language :: Python :: 3.11
+ Classifier: Programming Language :: Python :: 3.12
+ Classifier: Programming Language :: Python :: 3.13
+ Classifier: Programming Language :: Python :: Implementation :: CPython
+ Classifier: Topic :: Database
+ Classifier: Topic :: Database :: Database Engines/Servers
+ Classifier: Topic :: Scientific/Engineering :: Information Analysis
+ Classifier: Topic :: Software Development :: Libraries :: Python Modules
+ Classifier: Topic :: Software Development :: Testing
+ Classifier: Topic :: Software Development :: Testing :: Unit
+ Classifier: Typing :: Typed
+ Provides-Extra: all
+ Provides-Extra: athena
+ Provides-Extra: bigquery
+ Provides-Extra: duckdb
+ Provides-Extra: redshift
+ Provides-Extra: snowflake
+ Provides-Extra: trino
+ Requires-Dist: db-dtypes (>=1.4.3,<2.0.0)
+ Requires-Dist: docutils (>=0.21.2,<0.23.0)
+ Requires-Dist: pandas (>=1.0.0)
+ Requires-Dist: pydantic (>=2.0.0)
+ Requires-Dist: sqlglot (>=26.0.0)
+ Requires-Dist: typing-extensions (>=4.5.0) ; python_version < "3.10"
+ Project-URL: Bug Tracker, https://github.com/gurmeetsaran/sqltesting/issues
+ Project-URL: Changelog, https://github.com/gurmeetsaran/sqltesting/blob/master/CHANGELOG.md
+ Project-URL: Contributing, https://github.com/gurmeetsaran/sqltesting/blob/master/CONTRIBUTING.md
+ Project-URL: Documentation, https://gurmeetsaran.github.io/sqltesting/
+ Project-URL: Discussions, https://github.com/gurmeetsaran/sqltesting/discussions
+ Project-URL: Homepage, https://gurmeetsaran.github.io/sqltesting/
+ Project-URL: Repository, https://github.com/gurmeetsaran/sqltesting
+ Project-URL: Release Notes, https://github.com/gurmeetsaran/sqltesting/releases
+ Project-URL: Source Code, https://github.com/gurmeetsaran/sqltesting
+ Description-Content-Type: text/markdown
+
  # SQL Testing Library

- A powerful Python framework for unit testing SQL queries with mock data injection across BigQuery, Snowflake, Redshift, Athena, and Trino.
+ A powerful Python framework for unit testing SQL queries with mock data injection across BigQuery, Snowflake, Redshift, Athena, Trino, and DuckDB.

  [![Unit Tests](https://github.com/gurmeetsaran/sqltesting/actions/workflows/tests.yaml/badge.svg)](https://github.com/gurmeetsaran/sqltesting/actions/workflows/tests.yaml)
  [![Athena Integration](https://github.com/gurmeetsaran/sqltesting/actions/workflows/athena-integration.yml/badge.svg)](https://github.com/gurmeetsaran/sqltesting/actions/workflows/athena-integration.yml)
@@ -8,6 +66,7 @@ A powerful Python framework for unit testing SQL queries with mock data injectio
  [![Redshift Integration](https://github.com/gurmeetsaran/sqltesting/actions/workflows/redshift-integration.yml/badge.svg)](https://github.com/gurmeetsaran/sqltesting/actions/workflows/redshift-integration.yml)
  [![Trino Integration](https://github.com/gurmeetsaran/sqltesting/actions/workflows/trino-integration.yml/badge.svg)](https://github.com/gurmeetsaran/sqltesting/actions/workflows/trino-integration.yml)
  [![Snowflake Integration](https://github.com/gurmeetsaran/sqltesting/actions/workflows/snowflake-integration.yml/badge.svg)](https://github.com/gurmeetsaran/sqltesting/actions/workflows/snowflake-integration.yml)
+ [![DuckDB Integration](https://github.com/gurmeetsaran/sqltesting/actions/workflows/duckdb-integration.yml/badge.svg)](https://github.com/gurmeetsaran/sqltesting/actions/workflows/duckdb-integration.yml)
  [![GitHub license](https://img.shields.io/github/license/gurmeetsaran/sqltesting)](https://github.com/gurmeetsaran/sqltesting/blob/master/LICENSE)
  [![Pepy Total Downloads](https://img.shields.io/pepy/dt/sql-testing-library?label=PyPI%20Downloads)](https://pepy.tech/projects/sql-testing-library)
  [![codecov](https://codecov.io/gh/gurmeetsaran/sqltesting/branch/master/graph/badge.svg?token=CN3G5X5ZA5)](https://codecov.io/gh/gurmeetsaran/sqltesting)
@@ -47,7 +106,7 @@ For more details on our journey and the engineering challenges we solved, read t

  ## Features

- - **Multi-Database Support**: Test SQL across BigQuery, Athena, Redshift, Trino, and Snowflake
+ - **Multi-Database Support**: Test SQL across BigQuery, Athena, Redshift, Trino, Snowflake, and DuckDB
  - **Mock Data Injection**: Use Python dataclasses for type-safe test data
  - **CTE or Physical Tables**: Automatic fallback for query size limits
  - **Type-Safe Results**: Deserialize results to Pydantic models
@@ -60,28 +119,28 @@ The library supports different data types across database engines. All checkmark

  ### Primitive Types

- | Data Type | Python Type | BigQuery | Athena | Redshift | Trino | Snowflake |
- |-----------|-------------|----------|--------|----------|-------|-----------|
- | **String** | `str` | ✅ | ✅ | ✅ | ✅ | ✅ |
- | **Integer** | `int` | ✅ | ✅ | ✅ | ✅ | ✅ |
- | **Float** | `float` | ✅ | ✅ | ✅ | ✅ | ✅ |
- | **Boolean** | `bool` | ✅ | ✅ | ✅ | ✅ | ✅ |
- | **Date** | `date` | ✅ | ✅ | ✅ | ✅ | ✅ |
- | **Datetime** | `datetime` | ✅ | ✅ | ✅ | ✅ | ✅ |
- | **Decimal** | `Decimal` | ✅ | ✅ | ✅ | ✅ | ✅ |
- | **Optional** | `Optional[T]` | ✅ | ✅ | ✅ | ✅ | ✅ |
+ | Data Type | Python Type | BigQuery | Athena | Redshift | Trino | Snowflake | DuckDB |
+ |-----------|-------------|----------|--------|----------|-------|-----------|--------|
+ | **String** | `str` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
+ | **Integer** | `int` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
+ | **Float** | `float` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
+ | **Boolean** | `bool` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
+ | **Date** | `date` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
+ | **Datetime** | `datetime` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
+ | **Decimal** | `Decimal` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
+ | **Optional** | `Optional[T]` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |

  ### Complex Types

- | Data Type | Python Type | BigQuery | Athena | Redshift | Trino | Snowflake |
- |-----------|-------------|----------|--------|----------|-------|-----------|
- | **String Array** | `List[str]` | ✅ | ✅ | ✅ | ✅ | ✅ |
- | **Integer Array** | `List[int]` | ✅ | ✅ | ✅ | ✅ | ✅ |
- | **Decimal Array** | `List[Decimal]` | ✅ | ✅ | ✅ | ✅ | ✅ |
- | **Optional Array** | `Optional[List[T]]` | ✅ | ✅ | ✅ | ✅ | ✅ |
- | **Map/Dict** | `Dict[K, V]` | ✅ | ✅ | ✅ | ✅ | ✅ |
- | **Struct/Record** | `dataclass` | ✅ | ✅ | ❌ | ✅ | ❌ |
- | **Nested Arrays** | `List[List[T]]` | ❌ | ❌ | ❌ | ❌ | ❌ |
+ | Data Type | Python Type | BigQuery | Athena | Redshift | Trino | Snowflake | DuckDB |
+ |-----------|-------------|----------|--------|----------|-------|-----------|--------|
+ | **String Array** | `List[str]` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
+ | **Integer Array** | `List[int]` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
+ | **Decimal Array** | `List[Decimal]` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
+ | **Optional Array** | `Optional[List[T]]` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
+ | **Map/Dict** | `Dict[K, V]` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
+ | **Struct/Record** | `dataclass` | ✅ | ✅ | ❌ | ✅ | ❌ | ✅ |
+ | **Nested Arrays** | `List[List[T]]` | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ |

  ### Database-Specific Notes

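The primitive-type matrix above corresponds, inside each adapter, to a mapping from Python annotations to SQL type names. A minimal illustrative sketch for a DuckDB-style dialect — the mapping below is hypothetical, not the library's actual table, and real adapters each carry their own dialect-specific variants:

```python
from datetime import date, datetime
from decimal import Decimal

# Hypothetical Python-type -> SQL-type mapping for a DuckDB-style dialect.
PY_TO_SQL = {
    str: "VARCHAR",
    int: "BIGINT",
    float: "DOUBLE",
    bool: "BOOLEAN",
    date: "DATE",
    datetime: "TIMESTAMP",
    Decimal: "DECIMAL(38, 9)",
}

def sql_type(py_type: type) -> str:
    """Resolve a column's SQL type from its Python annotation."""
    return PY_TO_SQL[py_type]
```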
@@ -90,15 +149,16 @@ The library supports different data types across database engines. All checkmark
  - **Redshift**: Arrays and maps implemented via SUPER type (JSON parsing); 16MB query size limit; struct types not yet supported
  - **Trino**: Memory catalog for testing; excellent decimal precision; supports arrays, maps, and struct types using `ROW` with named fields (dataclasses and Pydantic models)
  - **Snowflake**: Column names normalized to lowercase; 1MB query size limit; dict/map types implemented via VARIANT type (JSON parsing); struct types not yet supported
+ - **DuckDB**: Fast embedded analytics database; excellent SQL standards compliance; supports arrays, maps, and struct types using `STRUCT` syntax with named fields (dataclasses and Pydantic models)

  ## Execution Modes Support

  The library supports two execution modes for mock data injection. **CTE Mode is the default** and is automatically used unless Physical Tables mode is explicitly requested or required due to query size limits.

- | Execution Mode | Description | BigQuery | Athena | Redshift | Trino | Snowflake |
- |----------------|-------------|----------|--------|----------|-------|-----------|
- | **CTE Mode** | Mock data injected as Common Table Expressions | ✅ | ✅ | ✅ | ✅ | ✅ |
- | **Physical Tables** | Mock data created as temporary tables | ✅ | ✅ | ✅ | ✅ | ✅ |
+ | Execution Mode | Description | BigQuery | Athena | Redshift | Trino | Snowflake | DuckDB |
+ |----------------|-------------|----------|--------|----------|-------|-----------|--------|
+ | **CTE Mode** | Mock data injected as Common Table Expressions | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
+ | **Physical Tables** | Mock data created as temporary tables | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |

  ### Execution Mode Details

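CTE mode, as described in the hunk above, amounts to rendering mock rows as an inline relation prepended to the query. A self-contained sketch — the function name and SQL formatting are illustrative, not the library's internals:

```python
def mock_table_cte(name, columns, rows):
    """Render mock rows as a CTE body using VALUES (illustrative only)."""
    def lit(v):
        if v is None:
            return "NULL"
        if isinstance(v, str):
            return "'" + v.replace("'", "''") + "'"  # basic SQL string escaping
        return str(v)

    values = ",\n    ".join("(" + ", ".join(lit(v) for v in row) + ")" for row in rows)
    cols = ", ".join(columns)
    return f"{name} AS (\n  SELECT * FROM (VALUES\n    {values}\n  ) AS t({cols})\n)"

# Prepend the rendered CTE to the query under test
cte = mock_table_cte("users", ["user_id", "name"], [(1, "alice"), (2, None)])
query = f"WITH {cte}\nSELECT name FROM users WHERE user_id = 1"
```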
@@ -122,14 +182,16 @@ The library supports two execution modes for mock data injection. **CTE Mode is
  | **Redshift** | Temporary tables | Session-specific temp schema | Database automatic | Session end |
  | **Trino** | Memory tables | `memory.default` schema | Library executes `DROP TABLE` | After each test |
  | **Snowflake** | Temporary tables | Session-specific temp schema | Database automatic | Session end |
+ | **DuckDB** | Temporary tables | Database-specific temp schema | Library executes `DROP TABLE` | After each test |

  #### **Cleanup Behavior Explained**

- **Library-Managed Cleanup (BigQuery, Athena, Trino):**
+ **Library-Managed Cleanup (BigQuery, Athena, Trino, DuckDB):**
  - The SQL Testing Library explicitly calls cleanup methods after each test
  - **BigQuery**: Creates standard tables in your dataset, then deletes them via `client.delete_table()`
  - **Athena**: Creates external tables backed by S3 data, then drops table metadata via `DROP TABLE IF EXISTS` (⚠️ **S3 data files remain and require separate cleanup**)
  - **Trino**: Creates tables in memory catalog, then drops them via `DROP TABLE IF EXISTS`
+ - **DuckDB**: Creates temporary tables in the database, then drops them via `DROP TABLE IF EXISTS`

  **Database-Managed Cleanup (Redshift, Snowflake):**
  - These databases have built-in temporary table mechanisms
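The library-managed cleanup described above reduces to a create/use/drop-in-finally pattern. A sketch using stdlib `sqlite3` as a stand-in embedded engine — the helper name is hypothetical, and real adapters issue engine-specific DDL:

```python
import sqlite3

def run_with_mock_table(con, table, rows, query):
    """Create a mock table, run the query, and always drop the table."""
    con.execute(f"CREATE TABLE {table} (id INTEGER, name TEXT)")
    try:
        con.executemany(f"INSERT INTO {table} VALUES (?, ?)", rows)
        return con.execute(query).fetchall()
    finally:
        # Library-managed cleanup: drop even if the query raised
        con.execute(f"DROP TABLE IF EXISTS {table}")

con = sqlite3.connect(":memory:")
rows = run_with_mock_table(
    con, "temp_users", [(1, "alice"), (2, "bob")],
    "SELECT name FROM temp_users WHERE id = 1",
)
```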
@@ -154,7 +216,7 @@ A: Trino's memory catalog doesn't automatically clean up tables when sessions en
  A: BigQuery tables created by the library are **standard tables without TTL** - they persist until explicitly deleted. The library immediately calls `client.delete_table()` after each test. If you want to set TTL as a safety net, you can configure it at the dataset level (e.g., 24 hours) to auto-delete any orphaned tables.

  **Q: Which databases leave artifacts if tests crash?**
- - **BigQuery, Athena, Trino**: May leave tables if library crashes before cleanup
+ - **BigQuery, Athena, Trino, DuckDB**: May leave tables if library crashes before cleanup
  - **Redshift, Snowflake**: No artifacts - temporary tables auto-cleanup on session end

  **Q: How to manually clean up orphaned tables?**
@@ -170,6 +232,10 @@ DROP TABLE temp_table_name;
  -- Trino: List and drop tables with temp prefix
  SHOW TABLES FROM memory.default LIKE 'temp_%';
  DROP TABLE memory.default.temp_table_name;
+
+ -- DuckDB: List and drop tables with temp prefix
+ SHOW TABLES;
+ DROP TABLE temp_table_name;
  ```

  **Q: How to handle S3 cleanup for Athena tables?**
@@ -220,6 +286,7 @@ aws s3api list-objects-v2 --bucket your-athena-results-bucket --prefix "temp_" \
  | **Redshift** | 16MB | Automatically switches at 16MB |
  | **Trino** | 16MB (estimated) | Large dataset or complex CTEs |
  | **Snowflake** | 1MB | Automatically switches at 1MB |
+ | **DuckDB** | 32MB (estimated) | Large dataset or complex CTEs |

  ### How to Control Execution Mode

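The automatic fallback in the table above can be sketched as a size check against per-adapter limits. This is an illustration under stated assumptions — `choose_mode` and its flag are hypothetical names, and "estimated" limits are the documentation's estimates, not hard engine limits:

```python
# Query-size limits (bytes) from the table above.
LIMIT_BYTES = {
    "redshift": 16 * 1024 * 1024,
    "trino": 16 * 1024 * 1024,      # estimated
    "snowflake": 1 * 1024 * 1024,
    "duckdb": 32 * 1024 * 1024,     # estimated
}

def choose_mode(adapter, rendered_query, use_physical_tables=False):
    """Pick CTE mode unless physical tables are requested or the query is too big."""
    limit = LIMIT_BYTES.get(adapter)
    too_big = limit is not None and len(rendered_query.encode("utf-8")) > limit
    return "physical" if (use_physical_tables or too_big) else "cte"
```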
@@ -342,6 +409,9 @@ pip install sql-testing-library[trino]
  # Install with Snowflake support
  pip install sql-testing-library[snowflake]

+ # Install with DuckDB support
+ pip install sql-testing-library[duckdb]
+
  # Or install with all database adapters
  pip install sql-testing-library[all]
  ```
@@ -357,9 +427,10 @@ poetry install --with athena
  poetry install --with redshift
  poetry install --with trino
  poetry install --with snowflake
+ poetry install --with duckdb

  # Install with all database adapters and dev tools
- poetry install --with bigquery,athena,redshift,trino,snowflake,dev
+ poetry install --with bigquery,athena,redshift,trino,snowflake,duckdb,dev
  ```

  ## Quick Start
@@ -368,7 +439,7 @@ poetry install --with bigquery,athena,redshift,trino,snowflake,dev

  ```ini
  [sql_testing]
- adapter = bigquery # Use 'bigquery', 'athena', 'redshift', 'trino', or 'snowflake'
+ adapter = bigquery # Use 'bigquery', 'athena', 'redshift', 'trino', 'snowflake', or 'duckdb'

  # BigQuery configuration
  [sql_testing.bigquery]
@@ -426,6 +497,10 @@ credentials_path = <path to credentials json>
  #
  # # Option 2: Password authentication (for accounts without MFA)
  # password = <snowflake_password>
+
+ # DuckDB configuration
+ # [sql_testing.duckdb]
+ # database = <path/to/database.duckdb> # Optional: defaults to in-memory database
  ```

  ### Database Context Understanding
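The DuckDB section added to pytest.ini above is a plain INI fragment, so its shape can be checked with stdlib `configparser`. A hedged sketch — the file name `analytics.duckdb` is a made-up example, and the in-memory fallback mirrors the "defaults to in-memory database" comment above:

```python
import configparser

# A pytest.ini fragment mirroring the DuckDB section shown above.
INI = """
[sql_testing]
adapter = duckdb

[sql_testing.duckdb]
database = analytics.duckdb
"""

cfg = configparser.ConfigParser()
cfg.read_string(INI)
adapter = cfg.get("sql_testing", "adapter")
# database is optional; DuckDB can fall back to an in-memory database.
database = cfg.get(f"sql_testing.{adapter}", "database", fallback=":memory:")
```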
@@ -439,6 +514,7 @@ Each database adapter uses a different concept for organizing tables and queries
  | **Redshift** | `{database}` | database only | `"test_db"` | `SELECT * FROM test_db.orders` |
  | **Snowflake** | `{database}.{schema}` | database + schema | `"test_db.public"` | `SELECT * FROM test_db.public.products` |
  | **Trino** | `{catalog}.{schema}` | catalog + schema | `"memory.default"` | `SELECT * FROM memory.default.inventory` |
+ | **DuckDB** | `{database}` | database only | `"test_db"` | `SELECT * FROM test_db.analytics` |

  #### Key Points:

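The namespace column in the table above is simply prefixed onto unqualified table names to produce the qualified references shown in the example queries. A sketch (the `qualify` helper is illustrative, not a library API):

```python
def qualify(default_namespace: str, table: str) -> str:
    """Prefix an unqualified table name with the adapter's namespace."""
    return f"{default_namespace}.{table}"

# One example per row of the table above
examples = {
    "redshift": qualify("test_db", "orders"),
    "snowflake": qualify("test_db.public", "products"),
    "trino": qualify("memory.default", "inventory"),
    "duckdb": qualify("test_db", "analytics"),
}
```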
@@ -511,6 +587,14 @@ class ProductsMockTable(BaseMockTable):

      def get_table_name(self) -> str:
          return "products"
+
+ # DuckDB Mock Table
+ class AnalyticsMockTable(BaseMockTable):
+     def get_database_name(self) -> str:
+         return "test_db"  # database only
+
+     def get_table_name(self) -> str:
+         return "analytics"
  ```

  2. **Write a test** using one of the flexible patterns:
@@ -636,7 +720,7 @@ class EmployeesMockTable(BaseMockTable):

  # Test with struct types
  @sql_test(
-     adapter_type="athena", # or "trino" or "bigquery"
+     adapter_type="athena", # or "trino", "bigquery", or "duckdb"
      mock_tables=[
          EmployeesMockTable([
              Employee(
@@ -682,7 +766,7 @@ def test_struct_with_dot_notation():

  # You can also query entire structs
  @sql_test(
-     adapter_type="trino", # or "athena" or "bigquery"
+     adapter_type="trino", # or "athena", "bigquery", or "duckdb"
      mock_tables=[EmployeesMockTable([...])],
      result_class=dict # Returns full struct as dict
  )
@@ -930,9 +1014,21 @@ def test_snowflake_query():
          query="SELECT user_id, name FROM users WHERE user_id = 1",
          default_namespace="test_db"
      )
+
+ # Use DuckDB adapter for this test
+ @sql_test(
+     adapter_type="duckdb",
+     mock_tables=[...],
+     result_class=UserResult
+ )
+ def test_duckdb_query():
+     return TestCase(
+         query="SELECT user_id, name FROM users WHERE user_id = 1",
+         default_namespace="test_db"
+     )
  ```

- The adapter_type parameter will use the configuration from the corresponding section in pytest.ini, such as `[sql_testing.bigquery]`, `[sql_testing.athena]`, `[sql_testing.redshift]`, `[sql_testing.trino]`, or `[sql_testing.snowflake]`.
+ The adapter_type parameter will use the configuration from the corresponding section in pytest.ini, such as `[sql_testing.bigquery]`, `[sql_testing.athena]`, `[sql_testing.redshift]`, `[sql_testing.trino]`, `[sql_testing.snowflake]`, or `[sql_testing.duckdb]`.

  **Default Adapter Behavior:**
  - If `adapter_type` is not specified in the test, the library uses the adapter from `[sql_testing]` section's `adapter` setting
@@ -975,6 +1071,14 @@ The adapter_type parameter will use the configuration from the corresponding sec
  - Supports authentication via username and password
  - Optional support for warehouse, role, and schema specification

+ #### DuckDB Adapter
+ - Supports DuckDB embedded analytical database
+ - Uses CTAS (CREATE TABLE AS SELECT) for efficient temporary table creation
+ - Fast local database with excellent SQL standards compliance
+ - Supports both file-based and in-memory databases
+ - No authentication required - perfect for local development and testing
+ - Excellent performance for analytical workloads
+
  **Default Behavior:**
  - If adapter_type is not specified in the TestCase or decorator, the library will use the adapter specified in the `[sql_testing]` section's `adapter` setting.
  - If no adapter is specified in the `[sql_testing]` section, it defaults to "bigquery".
@@ -1183,6 +1287,41 @@ The library automatically:

  For detailed usage and configuration options, see the example files included.

+ ## Integration with Mocksmith
+
+ SQL Testing Library works seamlessly with [Mocksmith](https://github.com/gurmeetsaran/mocksmith) for automatic test data generation. Mocksmith can reduce your test setup code by ~70% while providing more realistic test data.
+
+ Install mocksmith with: `pip install mocksmith[mock,pydantic]`
+
+ ### Quick Example
+
+ ```python
+ # Without Mocksmith - Manual data creation
+ customers = []
+ for i in range(100):
+     customers.append(Customer(
+         id=i + 1,
+         name=f"Customer {i + 1}",
+         email=f"customer{i + 1}@test.com",
+         balance=Decimal(str(random.uniform(0, 10000)))
+     ))
+
+ # With Mocksmith - Automatic realistic data
+ from mocksmith import mockable, Varchar, Integer, Money
+
+ @mockable
+ @dataclass
+ class Customer:
+     id: Integer()
+     name: Varchar(100)
+     email: Varchar(255)
+     balance: Money()
+
+ customers = [Customer.mock() for _ in range(100)]
+ ```
+
+ See the [Mocksmith Integration Guide](docs/mocksmith_integration.md) and [examples](examples/mocksmith_integration_example.py) for detailed usage patterns.
+

  ## Known Limitations and TODOs

  The library has a few known limitations that are planned to be addressed in future updates:
@@ -1211,3 +1350,5 @@ The library has a few known limitations that are planned to be addressed in futu
  - psycopg2-binary for Redshift
  - trino for Trino
  - snowflake-connector-python for Snowflake
+ - duckdb for DuckDB
+