dbt-adapters 1.22.2__py3-none-any.whl
This diff shows the contents of publicly available package versions as they appear in their respective public registries, and is provided for informational purposes only.
- dbt/adapters/__about__.py +1 -0
- dbt/adapters/__init__.py +8 -0
- dbt/adapters/base/README.md +13 -0
- dbt/adapters/base/__init__.py +16 -0
- dbt/adapters/base/column.py +173 -0
- dbt/adapters/base/connections.py +429 -0
- dbt/adapters/base/impl.py +2036 -0
- dbt/adapters/base/meta.py +150 -0
- dbt/adapters/base/plugin.py +32 -0
- dbt/adapters/base/query_headers.py +106 -0
- dbt/adapters/base/relation.py +648 -0
- dbt/adapters/cache.py +521 -0
- dbt/adapters/capability.py +63 -0
- dbt/adapters/catalogs/__init__.py +14 -0
- dbt/adapters/catalogs/_client.py +54 -0
- dbt/adapters/catalogs/_constants.py +1 -0
- dbt/adapters/catalogs/_exceptions.py +39 -0
- dbt/adapters/catalogs/_integration.py +113 -0
- dbt/adapters/clients/__init__.py +0 -0
- dbt/adapters/clients/jinja.py +24 -0
- dbt/adapters/contracts/__init__.py +0 -0
- dbt/adapters/contracts/connection.py +229 -0
- dbt/adapters/contracts/macros.py +11 -0
- dbt/adapters/contracts/relation.py +160 -0
- dbt/adapters/events/README.md +51 -0
- dbt/adapters/events/__init__.py +0 -0
- dbt/adapters/events/adapter_types_pb2.py +2 -0
- dbt/adapters/events/base_types.py +36 -0
- dbt/adapters/events/logging.py +83 -0
- dbt/adapters/events/types.py +436 -0
- dbt/adapters/exceptions/__init__.py +40 -0
- dbt/adapters/exceptions/alias.py +24 -0
- dbt/adapters/exceptions/cache.py +68 -0
- dbt/adapters/exceptions/compilation.py +269 -0
- dbt/adapters/exceptions/connection.py +16 -0
- dbt/adapters/exceptions/database.py +51 -0
- dbt/adapters/factory.py +264 -0
- dbt/adapters/protocol.py +150 -0
- dbt/adapters/py.typed +0 -0
- dbt/adapters/record/__init__.py +2 -0
- dbt/adapters/record/base.py +291 -0
- dbt/adapters/record/cursor/cursor.py +69 -0
- dbt/adapters/record/cursor/description.py +37 -0
- dbt/adapters/record/cursor/execute.py +39 -0
- dbt/adapters/record/cursor/fetchall.py +69 -0
- dbt/adapters/record/cursor/fetchmany.py +23 -0
- dbt/adapters/record/cursor/fetchone.py +23 -0
- dbt/adapters/record/cursor/rowcount.py +23 -0
- dbt/adapters/record/handle.py +55 -0
- dbt/adapters/record/serialization.py +115 -0
- dbt/adapters/reference_keys.py +39 -0
- dbt/adapters/relation_configs/README.md +25 -0
- dbt/adapters/relation_configs/__init__.py +12 -0
- dbt/adapters/relation_configs/config_base.py +46 -0
- dbt/adapters/relation_configs/config_change.py +26 -0
- dbt/adapters/relation_configs/config_validation.py +57 -0
- dbt/adapters/sql/__init__.py +2 -0
- dbt/adapters/sql/connections.py +263 -0
- dbt/adapters/sql/impl.py +286 -0
- dbt/adapters/utils.py +69 -0
- dbt/include/__init__.py +3 -0
- dbt/include/global_project/__init__.py +4 -0
- dbt/include/global_project/dbt_project.yml +7 -0
- dbt/include/global_project/docs/overview.md +43 -0
- dbt/include/global_project/macros/adapters/apply_grants.sql +167 -0
- dbt/include/global_project/macros/adapters/columns.sql +144 -0
- dbt/include/global_project/macros/adapters/freshness.sql +32 -0
- dbt/include/global_project/macros/adapters/indexes.sql +41 -0
- dbt/include/global_project/macros/adapters/metadata.sql +105 -0
- dbt/include/global_project/macros/adapters/persist_docs.sql +33 -0
- dbt/include/global_project/macros/adapters/relation.sql +84 -0
- dbt/include/global_project/macros/adapters/schema.sql +20 -0
- dbt/include/global_project/macros/adapters/show.sql +26 -0
- dbt/include/global_project/macros/adapters/timestamps.sql +52 -0
- dbt/include/global_project/macros/adapters/validate_sql.sql +10 -0
- dbt/include/global_project/macros/etc/datetime.sql +62 -0
- dbt/include/global_project/macros/etc/statement.sql +52 -0
- dbt/include/global_project/macros/generic_test_sql/accepted_values.sql +27 -0
- dbt/include/global_project/macros/generic_test_sql/not_null.sql +9 -0
- dbt/include/global_project/macros/generic_test_sql/relationships.sql +23 -0
- dbt/include/global_project/macros/generic_test_sql/unique.sql +12 -0
- dbt/include/global_project/macros/get_custom_name/get_custom_alias.sql +36 -0
- dbt/include/global_project/macros/get_custom_name/get_custom_database.sql +32 -0
- dbt/include/global_project/macros/get_custom_name/get_custom_schema.sql +60 -0
- dbt/include/global_project/macros/materializations/configs.sql +21 -0
- dbt/include/global_project/macros/materializations/functions/aggregate.sql +65 -0
- dbt/include/global_project/macros/materializations/functions/function.sql +20 -0
- dbt/include/global_project/macros/materializations/functions/helpers.sql +20 -0
- dbt/include/global_project/macros/materializations/functions/scalar.sql +69 -0
- dbt/include/global_project/macros/materializations/hooks.sql +35 -0
- dbt/include/global_project/macros/materializations/models/clone/can_clone_table.sql +7 -0
- dbt/include/global_project/macros/materializations/models/clone/clone.sql +67 -0
- dbt/include/global_project/macros/materializations/models/clone/create_or_replace_clone.sql +7 -0
- dbt/include/global_project/macros/materializations/models/incremental/column_helpers.sql +80 -0
- dbt/include/global_project/macros/materializations/models/incremental/incremental.sql +99 -0
- dbt/include/global_project/macros/materializations/models/incremental/is_incremental.sql +13 -0
- dbt/include/global_project/macros/materializations/models/incremental/merge.sql +120 -0
- dbt/include/global_project/macros/materializations/models/incremental/on_schema_change.sql +159 -0
- dbt/include/global_project/macros/materializations/models/incremental/strategies.sql +92 -0
- dbt/include/global_project/macros/materializations/models/materialized_view.sql +121 -0
- dbt/include/global_project/macros/materializations/models/table.sql +64 -0
- dbt/include/global_project/macros/materializations/models/view.sql +72 -0
- dbt/include/global_project/macros/materializations/seeds/helpers.sql +128 -0
- dbt/include/global_project/macros/materializations/seeds/seed.sql +60 -0
- dbt/include/global_project/macros/materializations/snapshots/helpers.sql +345 -0
- dbt/include/global_project/macros/materializations/snapshots/snapshot.sql +109 -0
- dbt/include/global_project/macros/materializations/snapshots/snapshot_merge.sql +34 -0
- dbt/include/global_project/macros/materializations/snapshots/strategies.sql +184 -0
- dbt/include/global_project/macros/materializations/tests/helpers.sql +44 -0
- dbt/include/global_project/macros/materializations/tests/test.sql +66 -0
- dbt/include/global_project/macros/materializations/tests/unit.sql +40 -0
- dbt/include/global_project/macros/materializations/tests/where_subquery.sql +15 -0
- dbt/include/global_project/macros/python_model/python.sql +114 -0
- dbt/include/global_project/macros/relations/column/columns_spec_ddl.sql +89 -0
- dbt/include/global_project/macros/relations/create.sql +23 -0
- dbt/include/global_project/macros/relations/create_backup.sql +17 -0
- dbt/include/global_project/macros/relations/create_intermediate.sql +17 -0
- dbt/include/global_project/macros/relations/drop.sql +41 -0
- dbt/include/global_project/macros/relations/drop_backup.sql +14 -0
- dbt/include/global_project/macros/relations/materialized_view/alter.sql +55 -0
- dbt/include/global_project/macros/relations/materialized_view/create.sql +10 -0
- dbt/include/global_project/macros/relations/materialized_view/drop.sql +14 -0
- dbt/include/global_project/macros/relations/materialized_view/refresh.sql +9 -0
- dbt/include/global_project/macros/relations/materialized_view/rename.sql +10 -0
- dbt/include/global_project/macros/relations/materialized_view/replace.sql +10 -0
- dbt/include/global_project/macros/relations/rename.sql +35 -0
- dbt/include/global_project/macros/relations/rename_intermediate.sql +14 -0
- dbt/include/global_project/macros/relations/replace.sql +50 -0
- dbt/include/global_project/macros/relations/schema.sql +8 -0
- dbt/include/global_project/macros/relations/table/create.sql +60 -0
- dbt/include/global_project/macros/relations/table/drop.sql +14 -0
- dbt/include/global_project/macros/relations/table/rename.sql +10 -0
- dbt/include/global_project/macros/relations/table/replace.sql +10 -0
- dbt/include/global_project/macros/relations/view/create.sql +27 -0
- dbt/include/global_project/macros/relations/view/drop.sql +14 -0
- dbt/include/global_project/macros/relations/view/rename.sql +10 -0
- dbt/include/global_project/macros/relations/view/replace.sql +66 -0
- dbt/include/global_project/macros/unit_test_sql/get_fixture_sql.sql +107 -0
- dbt/include/global_project/macros/utils/any_value.sql +9 -0
- dbt/include/global_project/macros/utils/array_append.sql +8 -0
- dbt/include/global_project/macros/utils/array_concat.sql +7 -0
- dbt/include/global_project/macros/utils/array_construct.sql +12 -0
- dbt/include/global_project/macros/utils/bool_or.sql +9 -0
- dbt/include/global_project/macros/utils/cast.sql +7 -0
- dbt/include/global_project/macros/utils/cast_bool_to_text.sql +7 -0
- dbt/include/global_project/macros/utils/concat.sql +7 -0
- dbt/include/global_project/macros/utils/data_types.sql +129 -0
- dbt/include/global_project/macros/utils/date.sql +10 -0
- dbt/include/global_project/macros/utils/date_spine.sql +75 -0
- dbt/include/global_project/macros/utils/date_trunc.sql +7 -0
- dbt/include/global_project/macros/utils/dateadd.sql +14 -0
- dbt/include/global_project/macros/utils/datediff.sql +14 -0
- dbt/include/global_project/macros/utils/equals.sql +14 -0
- dbt/include/global_project/macros/utils/escape_single_quotes.sql +8 -0
- dbt/include/global_project/macros/utils/except.sql +9 -0
- dbt/include/global_project/macros/utils/generate_series.sql +53 -0
- dbt/include/global_project/macros/utils/hash.sql +7 -0
- dbt/include/global_project/macros/utils/intersect.sql +9 -0
- dbt/include/global_project/macros/utils/last_day.sql +15 -0
- dbt/include/global_project/macros/utils/length.sql +11 -0
- dbt/include/global_project/macros/utils/listagg.sql +30 -0
- dbt/include/global_project/macros/utils/literal.sql +7 -0
- dbt/include/global_project/macros/utils/position.sql +11 -0
- dbt/include/global_project/macros/utils/replace.sql +14 -0
- dbt/include/global_project/macros/utils/right.sql +12 -0
- dbt/include/global_project/macros/utils/safe_cast.sql +9 -0
- dbt/include/global_project/macros/utils/split_part.sql +26 -0
- dbt/include/global_project/tests/generic/builtin.sql +30 -0
- dbt/include/py.typed +0 -0
- dbt_adapters-1.22.2.dist-info/METADATA +124 -0
- dbt_adapters-1.22.2.dist-info/RECORD +173 -0
- dbt_adapters-1.22.2.dist-info/WHEEL +4 -0
- dbt_adapters-1.22.2.dist-info/licenses/LICENSE +201 -0
dbt/adapters/sql/connections.py
ADDED

@@ -0,0 +1,263 @@

```python
import abc
import time
from typing import (
    Any,
    Dict,
    Iterable,
    Iterator,
    List,
    Optional,
    Tuple,
    TYPE_CHECKING,
    Type,
)

from dbt_common.events.contextvars import get_node_info
from dbt_common.events.functions import fire_event
from dbt_common.exceptions import DbtInternalError, NotImplementedError
from dbt_common.utils import cast_to_str

from dbt.adapters.base import BaseConnectionManager
from dbt.adapters.contracts.connection import (
    AdapterResponse,
    Connection,
    ConnectionState,
)
from dbt.adapters.events.types import (
    ConnectionUsed,
    SQLCommit,
    SQLQuery,
    SQLQueryStatus,
    AdapterEventDebug,
)

if TYPE_CHECKING:
    import agate


class SQLConnectionManager(BaseConnectionManager):
    """The default connection manager with some common SQL methods implemented.

    Methods to implement:
        - exception_handler
        - cancel
        - get_response
        - open
    """

    @abc.abstractmethod
    def cancel(self, connection: Connection):
        """Cancel the given connection."""
        raise NotImplementedError("`cancel` is not implemented for this adapter!")

    def cancel_open(self) -> List[str]:
        names = []
        this_connection = self.get_if_exists()
        with self.lock:
            for connection in self.thread_connections.values():
                if connection is this_connection:
                    continue

                # if the connection failed, the handle will be None so we have
                # nothing to cancel.
                if connection.handle is not None and connection.state == ConnectionState.OPEN:
                    self.cancel(connection)
                if connection.name is not None:
                    names.append(connection.name)
        return names

    def add_query(
        self,
        sql: str,
        auto_begin: bool = True,
        bindings: Optional[Any] = None,
        abridge_sql_log: bool = False,
        retryable_exceptions: Tuple[Type[Exception], ...] = tuple(),
        retry_limit: int = 1,
    ) -> Tuple[Connection, Any]:
        """
        Retry function encapsulated here to avoid commitment to some
        user-facing interface. Right now, Redshift commits to a 1 second
        retry timeout so this serves as a default.
        """

        def _execute_query_with_retry(
            cursor: Any,
            sql: str,
            bindings: Optional[Any],
            retryable_exceptions: Tuple[Type[Exception], ...],
            retry_limit: int,
            attempt: int,
        ):
            """
            A success sees the try exit cleanly and avoid any recursive
            retries. Failure begins a sleep and retry routine.
            """
            try:
                cursor.execute(sql, bindings)
            except retryable_exceptions as e:
                # Cease retries and fail when limit is hit.
                if attempt >= retry_limit:
                    raise e

                fire_event(
                    AdapterEventDebug(
                        base_msg=f"Got a retryable error {type(e)}. {retry_limit - attempt} retries left. "
                        f"Retrying in 1 second.\nError:\n{e}"
                    )
                )
                time.sleep(1)

                return _execute_query_with_retry(
                    cursor=cursor,
                    sql=sql,
                    bindings=bindings,
                    retryable_exceptions=retryable_exceptions,
                    retry_limit=retry_limit,
                    attempt=attempt + 1,
                )

        connection = self.get_thread_connection()
        if auto_begin and connection.transaction_open is False:
            self.begin()
        fire_event(
            ConnectionUsed(
                conn_type=self.TYPE,
                conn_name=cast_to_str(connection.name),
                node_info=get_node_info(),
            )
        )

        with self.exception_handler(sql):
            if abridge_sql_log:
                log_sql = "{}...".format(sql[:512])
            else:
                log_sql = sql

            fire_event(
                SQLQuery(
                    conn_name=cast_to_str(connection.name),
                    sql=log_sql,
                    node_info=get_node_info(),
                )
            )

            pre = time.perf_counter()

            cursor = connection.handle.cursor()
            _execute_query_with_retry(
                cursor=cursor,
                sql=sql,
                bindings=bindings,
                retryable_exceptions=retryable_exceptions,
                retry_limit=retry_limit,
                attempt=1,
            )

            result = self.get_response(cursor)

            fire_event(
                SQLQueryStatus(
                    status=str(result),
                    elapsed=time.perf_counter() - pre,
                    node_info=get_node_info(),
                    query_id=result.query_id,
                )
            )

            return connection, cursor

    @classmethod
    @abc.abstractmethod
    def get_response(cls, cursor: Any) -> AdapterResponse:
        """Get the status of the cursor."""
        raise NotImplementedError("`get_response` is not implemented for this adapter!")

    @classmethod
    def process_results(
        cls, column_names: Iterable[str], rows: Iterable[Any]
    ) -> Iterator[Dict[str, Any]]:
        unique_col_names = dict()  # type: ignore[var-annotated]
        for idx in range(len(column_names)):  # type: ignore[arg-type]
            col_name = column_names[idx]  # type: ignore[index]
            if col_name in unique_col_names:
                unique_col_names[col_name] += 1
                column_names[idx] = f"{col_name}_{unique_col_names[col_name]}"  # type: ignore[index] # noqa
            else:
                unique_col_names[column_names[idx]] = 1  # type: ignore[index]

        for row in rows:
            yield dict(zip(column_names, row))

    @classmethod
    def get_result_from_cursor(cls, cursor: Any, limit: Optional[int]) -> "agate.Table":
        from dbt_common.clients.agate_helper import table_from_data_flat

        data: Iterable[Any] = []
        column_names: List[str] = []

        if cursor.description is not None:
            column_names = [col[0] for col in cursor.description]
            if limit:
                rows = cursor.fetchmany(limit)
            else:
                rows = cursor.fetchall()
            data = cls.process_results(column_names, rows)

        return table_from_data_flat(data, column_names)

    def execute(
        self,
        sql: str,
        auto_begin: bool = False,
        fetch: bool = False,
        limit: Optional[int] = None,
    ) -> Tuple[AdapterResponse, "agate.Table"]:
        from dbt_common.clients.agate_helper import empty_table

        sql = self._add_query_comment(sql)
        _, cursor = self.add_query(sql, auto_begin)
        response = self.get_response(cursor)
        if fetch:
            table = self.get_result_from_cursor(cursor, limit)
        else:
            table = empty_table()
        return response, table

    def add_begin_query(self):
        return self.add_query("BEGIN", auto_begin=False)

    def add_commit_query(self):
        return self.add_query("COMMIT", auto_begin=False)

    def add_select_query(self, sql: str) -> Tuple[Connection, Any]:
        sql = self._add_query_comment(sql)
        return self.add_query(sql, auto_begin=False)

    def begin(self):
        connection = self.get_thread_connection()
        if connection.transaction_open is True:
            raise DbtInternalError(
                'Tried to begin a new transaction on connection "{}", but '
                "it already had one open!".format(connection.name)
            )

        self.add_begin_query()

        connection.transaction_open = True
        return connection

    def commit(self):
        connection = self.get_thread_connection()
        if connection.transaction_open is False:
            raise DbtInternalError(
                'Tried to commit transaction on connection "{}", but '
                "it does not have one open!".format(connection.name)
            )

        fire_event(SQLCommit(conn_name=connection.name, node_info=get_node_info()))
        self.add_commit_query()

        connection.transaction_open = False

        return connection
```
dbt/adapters/sql/impl.py
ADDED
|
@@ -0,0 +1,286 @@
|
|
|
1
|
+
from typing import Any, List, Optional, Tuple, Type, TYPE_CHECKING
|
|
2
|
+
|
|
3
|
+
from dbt_common.events.functions import fire_event
|
|
4
|
+
from dbt_common.record import record_function
|
|
5
|
+
|
|
6
|
+
from dbt.adapters.base import BaseAdapter, BaseRelation, available
|
|
7
|
+
from dbt.adapters.cache import _make_ref_key_dict
|
|
8
|
+
from dbt.adapters.contracts.connection import AdapterResponse, Connection
|
|
9
|
+
from dbt.adapters.events.types import ColTypeChange, SchemaCreation, SchemaDrop
|
|
10
|
+
from dbt.adapters.exceptions import RelationTypeNullError
|
|
11
|
+
from dbt.adapters.record.base import AdapterTestSqlRecord, AdapterAddQueryRecord
|
|
12
|
+
from dbt.adapters.sql.connections import SQLConnectionManager
|
|
13
|
+
|
|
14
|
+
LIST_RELATIONS_MACRO_NAME = "list_relations_without_caching"
|
|
15
|
+
GET_COLUMNS_IN_RELATION_MACRO_NAME = "get_columns_in_relation"
|
|
16
|
+
LIST_SCHEMAS_MACRO_NAME = "list_schemas"
|
|
17
|
+
CHECK_SCHEMA_EXISTS_MACRO_NAME = "check_schema_exists"
|
|
18
|
+
CREATE_SCHEMA_MACRO_NAME = "create_schema"
|
|
19
|
+
DROP_SCHEMA_MACRO_NAME = "drop_schema"
|
|
20
|
+
RENAME_RELATION_MACRO_NAME = "rename_relation"
|
|
21
|
+
TRUNCATE_RELATION_MACRO_NAME = "truncate_relation"
|
|
22
|
+
DROP_RELATION_MACRO_NAME = "drop_relation"
|
|
23
|
+
ALTER_COLUMN_TYPE_MACRO_NAME = "alter_column_type"
|
|
24
|
+
VALIDATE_SQL_MACRO_NAME = "validate_sql"
|
|
25
|
+
|
|
26
|
+
if TYPE_CHECKING:
|
|
27
|
+
import agate
|
|
28
|
+
|
|
29
|
+
|
|
30
|
+
class SQLAdapter(BaseAdapter):
|
|
31
|
+
"""The default adapter with the common agate conversions and some SQL
|
|
32
|
+
methods was implemented. This adapter has a different much shorter list of
|
|
33
|
+
methods to implement, but some more macros that must be implemented.
|
|
34
|
+
|
|
35
|
+
To implement a macro, implement "${adapter_type}__${macro_name}". in the
|
|
36
|
+
adapter's internal project.
|
|
37
|
+
|
|
38
|
+
Methods to implement:
|
|
39
|
+
- date_function
|
|
40
|
+
|
|
41
|
+
Macros to implement:
|
|
42
|
+
- get_catalog
|
|
43
|
+
- list_relations_without_caching
|
|
44
|
+
- get_columns_in_relation
|
|
45
|
+
- get_catalog_for_single_relation
|
|
46
|
+
"""
|
|
47
|
+
|
|
48
|
+
ConnectionManager: Type[SQLConnectionManager]
|
|
49
|
+
connections: SQLConnectionManager
|
|
50
|
+
|
|
51
|
+
@available.parse(lambda *a, **k: (None, None))
|
|
52
|
+
@record_function(
|
|
53
|
+
AdapterAddQueryRecord, method=True, index_on_thread_id=True, id_field_name="thread_id"
|
|
54
|
+
)
|
|
55
|
+
def add_query(
|
|
56
|
+
self,
|
|
57
|
+
sql: str,
|
|
58
|
+
auto_begin: bool = True,
|
|
59
|
+
bindings: Optional[Any] = None,
|
|
60
|
+
abridge_sql_log: bool = False,
|
|
61
|
+
) -> Tuple[Connection, Any]:
|
|
62
|
+
"""Add a query to the current transaction. A thin wrapper around
|
|
63
|
+
ConnectionManager.add_query.
|
|
64
|
+
|
|
65
|
+
:param sql: The SQL query to add
|
|
66
|
+
:param auto_begin: If set and there is no transaction in progress,
|
|
67
|
+
begin a new one.
|
|
68
|
+
:param bindings: An optional list of bindings for the query.
|
|
69
|
+
:param abridge_sql_log: If set, limit the raw sql logged to 512
|
|
70
|
+
characters
|
|
71
|
+
"""
|
|
72
|
+
return self.connections.add_query(sql, auto_begin, bindings, abridge_sql_log)
|
|
73
|
+
|
|
74
|
+
@classmethod
|
|
75
|
+
def convert_text_type(cls, agate_table: "agate.Table", col_idx: int) -> str:
|
|
76
|
+
return "text"
|
|
77
|
+
|
|
78
|
+
@classmethod
|
|
79
|
+
def convert_number_type(cls, agate_table: "agate.Table", col_idx: int) -> str:
|
|
80
|
+
import agate
|
|
81
|
+
|
|
82
|
+
# TODO CT-211
|
|
83
|
+
decimals = agate_table.aggregate(agate.MaxPrecision(col_idx))
|
|
84
|
+
return "float8" if decimals else "integer"
|
|
85
|
+
|
|
86
|
+
@classmethod
|
|
87
|
+
def convert_integer_type(cls, agate_table: "agate.Table", col_idx: int) -> str:
|
|
88
|
+
return "integer"
|
|
89
|
+
|
|
90
|
+
@classmethod
|
|
91
|
+
def convert_boolean_type(cls, agate_table: "agate.Table", col_idx: int) -> str:
|
|
92
|
+
return "boolean"
|
|
93
|
+
|
|
94
|
+
@classmethod
|
|
95
|
+
def convert_datetime_type(cls, agate_table: "agate.Table", col_idx: int) -> str:
|
|
96
|
+
return "timestamp without time zone"
|
|
97
|
+
|
|
98
|
+
@classmethod
|
|
99
|
+
def convert_date_type(cls, agate_table: "agate.Table", col_idx: int) -> str:
|
|
100
|
+
return "date"
|
|
101
|
+
|
|
102
|
+
@classmethod
|
|
103
|
+
def convert_time_type(cls, agate_table: "agate.Table", col_idx: int) -> str:
|
|
104
|
+
return "time"
|
|
105
|
+
|
|
106
|
+
@classmethod
|
|
107
|
+
def is_cancelable(cls) -> bool:
|
|
108
|
+
return True
|
|
109
|
+
|
|
110
|
+
def expand_column_types(self, goal, current):
|
|
111
|
+
reference_columns = {c.name: c for c in self.get_columns_in_relation(goal)}
|
|
112
|
+
|
|
113
|
+
target_columns = {c.name: c for c in self.get_columns_in_relation(current)}
|
|
114
|
+
|
|
115
|
+
for column_name, reference_column in reference_columns.items():
|
|
116
|
+
target_column = target_columns.get(column_name)
|
|
117
|
+
|
|
118
|
+
if target_column is not None and target_column.can_expand_to(reference_column):
|
|
119
|
+
col_string_size = reference_column.string_size()
|
|
120
|
+
new_type = self.Column.string_type(col_string_size)
|
|
121
|
+
fire_event(
|
|
122
|
+
ColTypeChange(
|
|
123
|
+
orig_type=target_column.data_type,
|
|
124
|
+
new_type=new_type,
|
|
125
|
+
table=_make_ref_key_dict(current),
|
|
126
|
+
)
|
|
127
|
+
)
|
|
128
|
+
|
|
129
|
+
self.alter_column_type(current, column_name, new_type)
|
|
130
|
+
|
|
131
|
+
def alter_column_type(self, relation, column_name, new_column_type) -> None:
|
|
132
|
+
"""
|
|
133
|
+
1. Create a new column (w/ temp name and correct type)
|
|
134
|
+
2. Copy data over to it
|
|
135
|
+
3. Drop the existing column (cascade!)
|
|
136
|
+
4. Rename the new column to existing column
|
|
137
|
+
"""
|
|
138
|
+
kwargs = {
|
|
139
|
+
"relation": relation,
|
|
140
|
+
"column_name": column_name,
|
|
141
|
+
"new_column_type": new_column_type,
|
|
142
|
+
}
|
|
143
|
+
self.execute_macro(ALTER_COLUMN_TYPE_MACRO_NAME, kwargs=kwargs)
|
|
144
|
+
|
|
145
|
+
def drop_relation(self, relation):
|
|
146
|
+
if relation.type is None:
|
|
147
|
+
raise RelationTypeNullError(relation)
|
|
148
|
+
|
|
149
|
+
self.cache_dropped(relation)
|
|
150
|
+
self.execute_macro(DROP_RELATION_MACRO_NAME, kwargs={"relation": relation})
|
|
151
|
+
|
|
152
|
+
def truncate_relation(self, relation):
|
|
153
|
+
self.execute_macro(TRUNCATE_RELATION_MACRO_NAME, kwargs={"relation": relation})
|
|
154
|
+
|
|
155
|
+
def rename_relation(self, from_relation, to_relation):
|
|
156
|
+
self.cache_renamed(from_relation, to_relation)
|
|
157
|
+
|
|
158
|
+
kwargs = {"from_relation": from_relation, "to_relation": to_relation}
|
|
159
|
+
self.execute_macro(RENAME_RELATION_MACRO_NAME, kwargs=kwargs)
|
|
160
|
+
|
|
161
|
+
def get_columns_in_relation(self, relation):
|
|
162
|
+
return self.execute_macro(
|
|
163
|
+
GET_COLUMNS_IN_RELATION_MACRO_NAME, kwargs={"relation": relation}
|
|
164
|
+
)
|
|
165
|
+
|
|
166
|
+
def create_schema(self, relation: BaseRelation) -> None:
|
|
167
|
+
relation = relation.without_identifier()
|
|
168
|
+
fire_event(SchemaCreation(relation=_make_ref_key_dict(relation)))
|
|
169
|
+
kwargs = {
|
|
170
|
+
"relation": relation,
|
|
171
|
+
}
|
|
172
|
+
self.execute_macro(CREATE_SCHEMA_MACRO_NAME, kwargs=kwargs)
|
|
173
|
+
self.commit_if_has_connection()
|
|
174
|
+
# we can't update the cache here, as if the schema already existed we
|
|
175
|
+
# don't want to (incorrectly) say that it's empty
|
|
176
|
+
|
|
177
|
+
def drop_schema(self, relation: BaseRelation) -> None:
|
|
178
|
+
relation = relation.without_identifier()
|
|
179
|
+
fire_event(SchemaDrop(relation=_make_ref_key_dict(relation)))
|
|
180
|
+
kwargs = {
|
|
181
|
+
"relation": relation,
|
|
182
|
+
}
|
|
183
|
+
self.execute_macro(DROP_SCHEMA_MACRO_NAME, kwargs=kwargs)
|
|
184
|
+
self.commit_if_has_connection()
|
|
185
|
+
# we can update the cache here
|
|
186
|
+
self.cache.drop_schema(relation.database, relation.schema)
|
|
187
|
+
|
|
188
|
+
def list_relations_without_caching(
|
|
189
|
+
self,
|
|
190
|
+
schema_relation: BaseRelation,
|
|
191
|
+
) -> List[BaseRelation]:
|
|
192
|
+
kwargs = {"schema_relation": schema_relation}
|
|
193
|
+
results = self.execute_macro(LIST_RELATIONS_MACRO_NAME, kwargs=kwargs)
|
|
194
|
+
|
|
195
|
+
relations = []
|
|
196
|
+
quote_policy = {"database": True, "schema": True, "identifier": True}
|
|
197
|
+
for _database, name, _schema, _type in results:
|
|
198
|
+
try:
|
|
199
|
+
_type = self.Relation.get_relation_type(_type)
|
|
200
|
+
except ValueError:
|
|
201
|
+
_type = self.Relation.External
|
|
202
|
+
relations.append(
|
|
203
|
+
self.Relation.create(
|
|
204
|
+
database=_database,
|
|
205
|
+
schema=_schema,
|
|
206
|
+
identifier=name,
|
|
207
|
+
quote_policy=quote_policy,
|
|
208
|
+
type=_type,
|
|
209
|
+
)
|
|
210
|
+
)
|
|
211
|
+
return relations
|
|
212
|
+
|
|
213
|
+
@classmethod
|
|
214
|
+
def quote(self, identifier):
|
|
215
|
+
return '"{}"'.format(identifier)
|
|
216
|
+
|
|
217
|
+
def list_schemas(self, database: str) -> List[str]:
|
|
218
|
+
results = self.execute_macro(LIST_SCHEMAS_MACRO_NAME, kwargs={"database": database})
|
|
219
|
+
|
|
220
|
+
return [row[0] for row in results]
|
|
221
|
+
|
|
222
|
+
def check_schema_exists(self, database: str, schema: str) -> bool:
|
|
223
|
+
information_schema = self.Relation.create(
|
|
224
|
+
database=database,
|
|
225
|
+
schema=schema,
|
|
226
|
+
identifier="INFORMATION_SCHEMA",
|
|
227
|
+
quote_policy=self.config.quoting,
|
|
228
|
+
).information_schema()
|
|
229
|
+
|
|
230
|
+
kwargs = {"information_schema": information_schema, "schema": schema}
|
|
231
|
+
results = self.execute_macro(CHECK_SCHEMA_EXISTS_MACRO_NAME, kwargs=kwargs)
|
|
232
|
+
return results[0][0] > 0
|
|
233
|
+
|
|
234
|
+
def validate_sql(self, sql: str) -> AdapterResponse:
|
|
235
|
+
"""Submit the given SQL to the engine for validation, but not execution.
|
|
236
|
+
|
|
237
|
+
By default we simply prefix the query with the explain keyword and allow the
|
|
238
|
+
exceptions thrown by the underlying engine on invalid SQL inputs to bubble up
|
|
239
|
+
to the exception handler. For adjustments to the explain statement - such as
|
|
240
|
+
for adapters that have different mechanisms for hinting at query validation
|
|
241
|
+
or dry-run - callers may be able to override the validate_sql_query macro with
|
|
242
|
+
the addition of an <adapter>__validate_sql implementation.
|
|
243
|
+
|
|
244
|
+
:param sql str: The sql to validate
|
|
245
|
+
"""
|
|
246
|
+
kwargs = {
|
|
247
|
+
"sql": sql,
|
|
248
|
+
}
|
|
249
|
+
result = self.execute_macro(VALIDATE_SQL_MACRO_NAME, kwargs=kwargs)
|
|
250
|
+
# The statement macro always returns an AdapterResponse in the output AttrDict's
|
|
251
|
+
# `response` property, and we preserve the full payload in case we want to
|
|
252
|
+
# return fetched output for engines where explain plans are emitted as columnar
|
|
253
|
+
# results. Any macro override that deviates from this behavior may encounter an
|
|
254
|
+
# assertion error in the runtime.
|
|
255
|
+
adapter_response = result.response
|
|
256
|
+
assert isinstance(adapter_response, AdapterResponse), (
|
|
257
|
+
f"Expected AdapterResponse from validate_sql macro execution, "
|
|
258
|
+
f"got {type(adapter_response)}."
|
|
259
|
+
)
|
|
260
|
+
return adapter_response
|
|
261
|
+
|
|
262
|
+
    # This is for use in the test suite
    @available
    @record_function(
        AdapterTestSqlRecord, method=True, index_on_thread_id=True, id_field_name="thread_id"
    )
    def run_sql_for_tests(self, sql, fetch, conn):
        cursor = conn.handle.cursor()
        try:
            cursor.execute(sql)
            if hasattr(conn.handle, "commit"):
                conn.handle.commit()
            if fetch == "one":
                return cursor.fetchone()
            elif fetch == "all":
                return cursor.fetchall()
            else:
                return
        except BaseException as e:
            if conn.handle and not getattr(conn.handle, "closed", True):
                conn.handle.rollback()
            print(sql)
            print(e)
            raise
        finally:
            conn.transaction_open = False
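The `@record_function` decorator above comes from dbt's record/replay layer. A toy standalone sketch of the general idea (not dbt's actual implementation; all names here are hypothetical): wrap a callable, capture its arguments and result, and append them to a log that a later run can inspect or replay.

```python
# Toy sketch of record/replay-style call recording. This is NOT dbt's
# record_function; it only illustrates the pattern of logging calls.
import functools
from typing import Any, Callable, Dict, List

RECORDED_CALLS: List[Dict[str, Any]] = []


def record_calls(func: Callable) -> Callable:
    @functools.wraps(func)
    def wrapper(*args: Any, **kwargs: Any) -> Any:
        result = func(*args, **kwargs)
        # Capture enough to compare or replay this call later.
        RECORDED_CALLS.append(
            {"name": func.__name__, "args": args, "kwargs": kwargs, "result": result}
        )
        return result

    return wrapper


@record_calls
def run_sql(sql: str, fetch: str = "none") -> str:
    # Stand-in for actually executing SQL.
    return f"ran: {sql}"


run_sql("select 1", fetch="one")
print(RECORDED_CALLS[0]["name"])
```

dbt's real decorator additionally serializes the call into a typed record (`AdapterTestSqlRecord` here) and can key records by thread id, as the `index_on_thread_id`/`id_field_name` arguments suggest.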
dbt/adapters/utils.py
ADDED
@@ -0,0 +1,69 @@
from typing import Mapping, Sequence, Any, Dict, List

from dbt.adapters.exceptions import DuplicateAliasError


class Translator:
    def __init__(self, aliases: Mapping[str, str], recursive: bool = False) -> None:
        self.aliases = aliases
        self.recursive = recursive

    def translate_mapping(self, kwargs: Mapping[str, Any]) -> Dict[str, Any]:
        result: Dict[str, Any] = {}

        for key, value in kwargs.items():
            canonical_key = self.aliases.get(key, key)
            if canonical_key in result:
                raise DuplicateAliasError(kwargs, self.aliases, canonical_key)
            result[canonical_key] = self.translate_value(value)
        return result

    def translate_sequence(self, value: Sequence[Any]) -> List[Any]:
        return [self.translate_value(v) for v in value]

    def translate_value(self, value: Any) -> Any:
        if self.recursive:
            if isinstance(value, Mapping):
                return self.translate_mapping(value)
            elif isinstance(value, (list, tuple)):
                return self.translate_sequence(value)
        return value

    def translate(self, value: Mapping[str, Any]) -> Dict[str, Any]:
        try:
            return self.translate_mapping(value)
        except RuntimeError as exc:
            if "maximum recursion depth exceeded" in str(exc):
                raise RecursionError("Cycle detected in a value passed to translate!")
            raise


def translate_aliases(
    kwargs: Dict[str, Any],
    aliases: Dict[str, str],
    recurse: bool = False,
) -> Dict[str, Any]:
    """Given a dict of keyword arguments and a dict mapping aliases to their
    canonical values, canonicalize the keys in the kwargs dict.

    If recurse is True, perform this operation recursively.

    :returns: A dict containing all the values in kwargs referenced by their
        canonical key.
    :raises: `DuplicateAliasError`, if a canonical key is defined more than once.
    """
    translator = Translator(aliases, recurse)
    return translator.translate(kwargs)

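A quick illustration of the alias-canonicalization behavior `translate_aliases` provides. This is a standalone re-implementation of the non-recursive case so the example runs without dbt installed; `canonicalize` is a hypothetical name and `DuplicateAliasError` is replaced with a plain `ValueError`.

```python
# Standalone sketch of Translator.translate_mapping's non-recursive behavior.
# Hypothetical helper; dbt raises DuplicateAliasError where this uses ValueError.
from typing import Any, Dict, Mapping


def canonicalize(kwargs: Mapping[str, Any], aliases: Mapping[str, str]) -> Dict[str, Any]:
    result: Dict[str, Any] = {}
    for key, value in kwargs.items():
        # Map each key through the alias table; unknown keys pass through as-is.
        canonical = aliases.get(key, key)
        if canonical in result:
            # Two spellings of the same canonical key is an error.
            raise ValueError(f"duplicate alias for {canonical!r}")
        result[canonical] = value
    return result


aliases = {"pass": "password", "user": "username"}
print(canonicalize({"user": "sam", "pass": "hunter2"}, aliases))
# {'username': 'sam', 'password': 'hunter2'}
```

Passing both `user` and `username` in the same mapping raises, since both resolve to the canonical key `username`.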
# some types need to make constants available to the jinja context as
# attributes, and regular properties only work with objects. maybe this should
# be handled by the RelationProxy?


class classproperty(object):
    def __init__(self, func) -> None:
        self.func = func

    def __get__(self, obj, objtype):
        return self.func(objtype)
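The descriptor above lets a computed constant be read off the class itself, which is what the jinja context needs. A minimal usage sketch (the definition is repeated so the snippet runs standalone; `Platform.quote_character` is a hypothetical attribute, not part of dbt's API):

```python
# Usage sketch for the classproperty descriptor: __get__ receives the owning
# class as `objtype` and calls the wrapped function with it, so the attribute
# works on the class itself, no instance required.
class classproperty(object):
    def __init__(self, func) -> None:
        self.func = func

    def __get__(self, obj, objtype):
        return self.func(objtype)


class Platform:
    @classproperty
    def quote_character(cls) -> str:  # hypothetical example attribute
        return '"'


print(Platform.quote_character)  # class-level access, no instance needed
```

A plain `@property` would raise `AttributeError`-style failures here because properties only trigger on instance attribute access; the classproperty fires on class access too.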
dbt/include/__init__.py
ADDED
@@ -0,0 +1,43 @@

{% docs __overview__ %}

### Welcome!

Welcome to the auto-generated documentation for your dbt project!

### Navigation

You can use the `Project` and `Database` navigation tabs on the left side of the window to explore the models
in your project.

#### Project Tab
The `Project` tab mirrors the directory structure of your dbt project. In this tab, you can see all of the
models defined in your dbt project, as well as models imported from dbt packages.

#### Database Tab
The `Database` tab also exposes your models, but in a format that looks more like a database explorer. This view
shows relations (tables and views) grouped into database schemas. Note that ephemeral models are _not_ shown
in this interface, as they do not exist in the database.

### Graph Exploration
You can click the blue icon on the bottom-right corner of the page to view the lineage graph of your models.

On model pages, you'll see the immediate parents and children of the model you're exploring. By clicking the `Expand`
button at the top-right of this lineage pane, you'll be able to see all of the models that are used to build,
or are built from, the model you're exploring.

Once expanded, you'll be able to use the `--select` and `--exclude` model selection syntax to filter the
models in the graph. For more information on model selection, check out the [dbt docs](https://docs.getdbt.com/docs/model-selection-syntax).

Note that you can also right-click on models to interactively filter and explore the graph.

---

### More information

- [What is dbt](https://docs.getdbt.com/docs/introduction)?
- Read the [dbt viewpoint](https://docs.getdbt.com/docs/viewpoint)
- [Installation](https://docs.getdbt.com/docs/installation)
- Join the [dbt Community](https://www.getdbt.com/community/) for questions and discussion

{% enddocs %}