apache-airflow-providers-common-sql 1.21.0rc1.tar.gz → 1.23.0.tar.gz
This diff shows the content changes between two publicly released versions of the package, as they appear in their respective public registries. It is provided for informational purposes only.
Potentially problematic release.
This version of apache-airflow-providers-common-sql might be problematic.
- {apache_airflow_providers_common_sql-1.21.0rc1 → apache_airflow_providers_common_sql-1.23.0}/PKG-INFO +11 -29
- {apache_airflow_providers_common_sql-1.21.0rc1 → apache_airflow_providers_common_sql-1.23.0}/README.rst +4 -22
- {apache_airflow_providers_common_sql-1.21.0rc1 → apache_airflow_providers_common_sql-1.23.0}/pyproject.toml +27 -17
- {apache_airflow_providers_common_sql-1.21.0rc1 → apache_airflow_providers_common_sql-1.23.0/src}/airflow/providers/common/sql/LICENSE +0 -52
- {apache_airflow_providers_common_sql-1.21.0rc1 → apache_airflow_providers_common_sql-1.23.0/src}/airflow/providers/common/sql/__init__.py +1 -1
- apache_airflow_providers_common_sql-1.23.0/src/airflow/providers/common/sql/dialects/dialect.py +211 -0
- apache_airflow_providers_common_sql-1.23.0/src/airflow/providers/common/sql/dialects/dialect.pyi +82 -0
- apache_airflow_providers_common_sql-1.23.0/src/airflow/providers/common/sql/doc/adr/0003-introduce-notion-of-dialects-in-dbapihook.md +61 -0
- {apache_airflow_providers_common_sql-1.21.0rc1 → apache_airflow_providers_common_sql-1.23.0/src}/airflow/providers/common/sql/get_provider_info.py +15 -14
- apache_airflow_providers_common_sql-1.23.0/src/airflow/providers/common/sql/get_provider_info.pyi +35 -0
- {apache_airflow_providers_common_sql-1.21.0rc1 → apache_airflow_providers_common_sql-1.23.0/src}/airflow/providers/common/sql/hooks/handlers.pyi +3 -2
- {apache_airflow_providers_common_sql-1.21.0rc1 → apache_airflow_providers_common_sql-1.23.0/src}/airflow/providers/common/sql/hooks/sql.py +142 -34
- {apache_airflow_providers_common_sql-1.21.0rc1 → apache_airflow_providers_common_sql-1.23.0/src}/airflow/providers/common/sql/hooks/sql.pyi +27 -5
- {apache_airflow_providers_common_sql-1.21.0rc1 → apache_airflow_providers_common_sql-1.23.0/src}/airflow/providers/common/sql/operators/sql.py +2 -1
- apache_airflow_providers_common_sql-1.23.0/src/airflow/providers/common/sql/sensors/__init__.py +16 -0
- {apache_airflow_providers_common_sql-1.21.0rc1 → apache_airflow_providers_common_sql-1.23.0/src}/airflow/providers/common/sql/sensors/sql.pyi +6 -2
- apache_airflow_providers_common_sql-1.21.0rc1/airflow/providers/common/sql/operators/sql.pyi +0 -256
- {apache_airflow_providers_common_sql-1.21.0rc1 → apache_airflow_providers_common_sql-1.23.0/src}/airflow/providers/common/sql/README_API.md +0 -0
- {apache_airflow_providers_common_sql-1.21.0rc1/airflow/providers/common/sql/hooks → apache_airflow_providers_common_sql-1.23.0/src/airflow/providers/common/sql/dialects}/__init__.py +0 -0
- {apache_airflow_providers_common_sql-1.21.0rc1 → apache_airflow_providers_common_sql-1.23.0/src}/airflow/providers/common/sql/doc/adr/0001-record-architecture-decisions.md +0 -0
- {apache_airflow_providers_common_sql-1.21.0rc1 → apache_airflow_providers_common_sql-1.23.0/src}/airflow/providers/common/sql/doc/adr/0002-return-common-data-structure-from-dbapihook-derived-hooks.md +0 -0
- {apache_airflow_providers_common_sql-1.21.0rc1/airflow/providers/common/sql/operators → apache_airflow_providers_common_sql-1.23.0/src/airflow/providers/common/sql/hooks}/__init__.py +0 -0
- {apache_airflow_providers_common_sql-1.21.0rc1 → apache_airflow_providers_common_sql-1.23.0/src}/airflow/providers/common/sql/hooks/handlers.py +0 -0
- {apache_airflow_providers_common_sql-1.21.0rc1/airflow/providers/common/sql/sensors → apache_airflow_providers_common_sql-1.23.0/src/airflow/providers/common/sql/operators}/__init__.py +0 -0
- {apache_airflow_providers_common_sql-1.21.0rc1 → apache_airflow_providers_common_sql-1.23.0/src}/airflow/providers/common/sql/sensors/sql.py +0 -0
PKG-INFO

```diff
@@ -1,6 +1,6 @@
 Metadata-Version: 2.3
 Name: apache-airflow-providers-common-sql
-Version: 1.21.0rc1
+Version: 1.23.0
 Summary: Provider package apache-airflow-providers-common-sql for Apache Airflow
 Keywords: airflow-provider,common.sql,airflow,integration
 Author-email: Apache Software Foundation <dev@airflow.apache.org>
@@ -20,15 +20,14 @@ Classifier: Programming Language :: Python :: 3.10
 Classifier: Programming Language :: Python :: 3.11
 Classifier: Programming Language :: Python :: 3.12
 Classifier: Topic :: System :: Monitoring
-Requires-Dist: apache-airflow>=2.9.
-Requires-Dist: more-itertools>=9.0.0
+Requires-Dist: apache-airflow>=2.9.0
 Requires-Dist: sqlparse>=0.5.1
+Requires-Dist: more-itertools>=9.0.0
 Requires-Dist: apache-airflow-providers-openlineage ; extra == "openlineage"
-Requires-Dist: pandas>=2.1.2,<2.2 ; extra == "pandas"
-Requires-Dist: pandas>=1.5.3,<2.2 ; extra == "pandas" and (python_version<"3.9")
+Requires-Dist: pandas>=2.1.2,<2.2 ; extra == "pandas"
 Project-URL: Bug Tracker, https://github.com/apache/airflow/issues
-Project-URL: Changelog, https://airflow.apache.org/docs/apache-airflow-providers-common-sql/1.
-Project-URL: Documentation, https://airflow.apache.org/docs/apache-airflow-providers-common-sql/1.
+Project-URL: Changelog, https://airflow.apache.org/docs/apache-airflow-providers-common-sql/1.23.0/changelog.html
+Project-URL: Documentation, https://airflow.apache.org/docs/apache-airflow-providers-common-sql/1.23.0
 Project-URL: Slack Chat, https://s.apache.org/airflow-slack
 Project-URL: Source Code, https://github.com/apache/airflow
 Project-URL: Twitter, https://x.com/ApacheAirflow
@@ -37,23 +36,6 @@ Provides-Extra: openlineage
 Provides-Extra: pandas
 
 
-.. Licensed to the Apache Software Foundation (ASF) under one
-   or more contributor license agreements.  See the NOTICE file
-   distributed with this work for additional information
-   regarding copyright ownership.  The ASF licenses this file
-   to you under the Apache License, Version 2.0 (the
-   "License"); you may not use this file except in compliance
-   with the License.  You may obtain a copy of the License at
-
-.. http://www.apache.org/licenses/LICENSE-2.0
-
-.. Unless required by applicable law or agreed to in writing,
-   software distributed under the License is distributed on an
-   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-   KIND, either express or implied.  See the License for the
-   specific language governing permissions and limitations
-   under the License.
-
 .. Licensed to the Apache Software Foundation (ASF) under one
    or more contributor license agreements.  See the NOTICE file
    distributed with this work for additional information
@@ -71,8 +53,7 @@ Provides-Extra: pandas
    specific language governing permissions and limitations
    under the License.
 
-.. NOTE! THIS FILE IS AUTOMATICALLY GENERATED AND WILL BE
-   OVERWRITTEN WHEN PREPARING PACKAGES.
+.. NOTE! THIS FILE IS AUTOMATICALLY GENERATED AND WILL BE OVERWRITTEN!
 
 .. IF YOU WANT TO MODIFY TEMPLATE FOR THIS FILE, YOU SHOULD MODIFY THE TEMPLATE
    `PROVIDER_README_TEMPLATE.rst.jinja2` IN the `dev/breeze/src/airflow_breeze/templates` DIRECTORY
@@ -80,7 +61,7 @@ Provides-Extra: pandas
 
 Package ``apache-airflow-providers-common-sql``
 
-Release: ``1.21.0rc1``
+Release: ``1.23.0``
 
 
 `Common SQL Provider <https://en.wikipedia.org/wiki/SQL>`__
@@ -93,7 +74,7 @@ This is a provider package for ``common.sql`` provider. All classes for this pro
 are in ``airflow.providers.common.sql`` python package.
 
 You can find package information and changelog for the provider
-in the `documentation <https://airflow.apache.org/docs/apache-airflow-providers-common-sql/1.
+in the `documentation <https://airflow.apache.org/docs/apache-airflow-providers-common-sql/1.23.0/>`_.
 
 Installation
 ------------
@@ -135,4 +116,5 @@ Dependent package
 ============================================================================================================== ===============
 
 The changelog for the provider package can be found in the
-`changelog <https://airflow.apache.org/docs/apache-airflow-providers-common-sql/1.
+`changelog <https://airflow.apache.org/docs/apache-airflow-providers-common-sql/1.23.0/changelog.html>`_.
+
```
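The dropped ``pandas>=1.5.3`` line carried a PEP 508 environment marker (``python_version<"3.9"``), which installers evaluate against the target interpreter. A minimal sketch of how such a marker evaluates, using the third-party ``packaging`` library (an illustration, not part of this provider):

```python
from packaging.markers import Marker

# The marker from the removed Requires-Dist line: it matched only
# interpreters older than Python 3.9, which the provider no longer supports.
marker = Marker('python_version < "3.9"')

print(marker.evaluate({"python_version": "3.8"}))   # True
print(marker.evaluate({"python_version": "3.10"}))  # False
```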
README.rst

```diff
@@ -1,21 +1,4 @@
 
-.. Licensed to the Apache Software Foundation (ASF) under one
-   or more contributor license agreements.  See the NOTICE file
-   distributed with this work for additional information
-   regarding copyright ownership.  The ASF licenses this file
-   to you under the Apache License, Version 2.0 (the
-   "License"); you may not use this file except in compliance
-   with the License.  You may obtain a copy of the License at
-
-.. http://www.apache.org/licenses/LICENSE-2.0
-
-.. Unless required by applicable law or agreed to in writing,
-   software distributed under the License is distributed on an
-   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-   KIND, either express or implied.  See the License for the
-   specific language governing permissions and limitations
-   under the License.
-
 .. Licensed to the Apache Software Foundation (ASF) under one
    or more contributor license agreements.  See the NOTICE file
    distributed with this work for additional information
@@ -33,8 +16,7 @@
    specific language governing permissions and limitations
    under the License.
 
-.. NOTE! THIS FILE IS AUTOMATICALLY GENERATED AND WILL BE
-   OVERWRITTEN WHEN PREPARING PACKAGES.
+.. NOTE! THIS FILE IS AUTOMATICALLY GENERATED AND WILL BE OVERWRITTEN!
 
 .. IF YOU WANT TO MODIFY TEMPLATE FOR THIS FILE, YOU SHOULD MODIFY THE TEMPLATE
    `PROVIDER_README_TEMPLATE.rst.jinja2` IN the `dev/breeze/src/airflow_breeze/templates` DIRECTORY
@@ -42,7 +24,7 @@
 
 Package ``apache-airflow-providers-common-sql``
 
-Release: ``1.21.0rc1``
+Release: ``1.23.0``
 
 
 `Common SQL Provider <https://en.wikipedia.org/wiki/SQL>`__
@@ -55,7 +37,7 @@ This is a provider package for ``common.sql`` provider. All classes for this pro
 are in ``airflow.providers.common.sql`` python package.
 
 You can find package information and changelog for the provider
-in the `documentation <https://airflow.apache.org/docs/apache-airflow-providers-common-sql/1.
+in the `documentation <https://airflow.apache.org/docs/apache-airflow-providers-common-sql/1.23.0/>`_.
 
 Installation
 ------------
@@ -97,4 +79,4 @@ Dependent package
 ============================================================================================================== ===============
 
 The changelog for the provider package can be found in the
-`changelog <https://airflow.apache.org/docs/apache-airflow-providers-common-sql/1.
+`changelog <https://airflow.apache.org/docs/apache-airflow-providers-common-sql/1.23.0/changelog.html>`_.
```
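The README's Installation section (unchanged in this diff) amounts to ``pip install apache-airflow-providers-common-sql==1.23.0``. After installing, the resolved version can be checked with the stdlib ``importlib.metadata``; a small sketch:

```python
from importlib.metadata import PackageNotFoundError, version

def installed_provider_version(dist_name: str = "apache-airflow-providers-common-sql"):
    """Return the installed version of the provider, or None if it is absent."""
    try:
        return version(dist_name)
    except PackageNotFoundError:
        return None

print(installed_provider_version())
```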
pyproject.toml

```diff
@@ -1,4 +1,3 @@
-
 # Licensed to the Apache Software Foundation (ASF) under one
 # or more contributor license agreements.  See the NOTICE file
 # distributed with this work for additional information
@@ -16,10 +15,9 @@
 # specific language governing permissions and limitations
 # under the License.
 
-# NOTE! THIS FILE IS AUTOMATICALLY GENERATED AND WILL BE
-# OVERWRITTEN WHEN PREPARING PACKAGES.
+# NOTE! THIS FILE IS AUTOMATICALLY GENERATED AND WILL BE OVERWRITTEN!
 
-# IF YOU WANT TO MODIFY THIS FILE, YOU SHOULD MODIFY THE TEMPLATE
+# IF YOU WANT TO MODIFY THIS FILE EXCEPT DEPENDENCIES, YOU SHOULD MODIFY THE TEMPLATE
 # `pyproject_TEMPLATE.toml.jinja2` IN the `dev/breeze/src/airflow_breeze/templates` DIRECTORY
 [build-system]
 requires = ["flit_core==3.10.1"]
@@ -27,7 +25,7 @@ build-backend = "flit_core.buildapi"
 
 [project]
 name = "apache-airflow-providers-common-sql"
-version = "1.21.0rc1"
+version = "1.23.0"
 description = "Provider package apache-airflow-providers-common-sql for Apache Airflow"
 readme = "README.rst"
 authors = [
@@ -53,15 +51,32 @@ classifiers = [
     "Topic :: System :: Monitoring",
 ]
 requires-python = "~=3.9"
+
+# The dependencies should be modified in place in the generated file
+# Any change in the dependencies is preserved when the file is regenerated
 dependencies = [
-    "apache-airflow>=2.9.
-    "more-itertools>=9.0.0",
+    "apache-airflow>=2.9.0",
     "sqlparse>=0.5.1",
+    "more-itertools>=9.0.0",
+]
+
+# The optional dependencies should be modified in place in the generated file
+# Any change in the dependencies is preserved when the file is regenerated
+[project.optional-dependencies]
+"pandas" = [
+    # In pandas 2.2 minimal version of the sqlalchemy is 2.0
+    # https://pandas.pydata.org/docs/whatsnew/v2.2.0.html#increased-minimum-versions-for-dependencies
+    # However Airflow not fully supports it yet: https://github.com/apache/airflow/issues/28723
+    # In addition FAB also limit sqlalchemy to < 2.0
+    "pandas>=2.1.2,<2.2",
+]
+"openlineage" = [
+    "apache-airflow-providers-openlineage"
 ]
 
 [project.urls]
-"Documentation" = "https://airflow.apache.org/docs/apache-airflow-providers-common-sql/1.
-"Changelog" = "https://airflow.apache.org/docs/apache-airflow-providers-common-sql/1.
+"Documentation" = "https://airflow.apache.org/docs/apache-airflow-providers-common-sql/1.23.0"
+"Changelog" = "https://airflow.apache.org/docs/apache-airflow-providers-common-sql/1.23.0/changelog.html"
 "Bug Tracker" = "https://github.com/apache/airflow/issues"
 "Source Code" = "https://github.com/apache/airflow"
 "Slack Chat" = "https://s.apache.org/airflow-slack"
@@ -70,14 +85,9 @@ dependencies = [
 
 [project.entry-points."apache_airflow_provider"]
 provider_info = "airflow.providers.common.sql.get_provider_info:get_provider_info"
-[project.optional-dependencies]
-"openlineage" = [
-    "apache-airflow-providers-openlineage",
-]
-"pandas" = [
-    "pandas>=2.1.2,<2.2;python_version>=\"3.9\"",
-    "pandas>=1.5.3,<2.2;python_version<\"3.9\"",
-]
 
 [tool.flit.module]
 name = "airflow.providers.common.sql"
+
+[tool.pytest.ini_options]
+ignore = "tests/system/"
```
airflow/providers/common/sql/LICENSE

```diff
@@ -199,55 +199,3 @@ distributed under the License is distributed on an "AS IS" BASIS,
 WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 See the License for the specific language governing permissions and
 limitations under the License.
-
-============================================================================
-APACHE AIRFLOW SUBCOMPONENTS:
-
-The Apache Airflow project contains subcomponents with separate copyright
-notices and license terms. Your use of the source code for the these
-subcomponents is subject to the terms and conditions of the following
-licenses.
-
-
-========================================================================
-Third party Apache 2.0 licenses
-========================================================================
-
-The following components are provided under the Apache 2.0 License.
-See project link for details. The text of each license is also included
-at 3rd-party-licenses/LICENSE-[project].txt.
-
-(ALv2 License) hue v4.3.0 (https://github.com/cloudera/hue/)
-(ALv2 License) jqclock v2.3.0 (https://github.com/JohnRDOrazio/jQuery-Clock-Plugin)
-(ALv2 License) bootstrap3-typeahead v4.0.2 (https://github.com/bassjobsen/Bootstrap-3-Typeahead)
-(ALv2 License) connexion v2.7.0 (https://github.com/zalando/connexion)
-
-========================================================================
-MIT licenses
-========================================================================
-
-The following components are provided under the MIT License. See project link for details.
-The text of each license is also included at 3rd-party-licenses/LICENSE-[project].txt.
-
-(MIT License) jquery v3.5.1 (https://jquery.org/license/)
-(MIT License) dagre-d3 v0.6.4 (https://github.com/cpettitt/dagre-d3)
-(MIT License) bootstrap v3.4.1 (https://github.com/twbs/bootstrap/)
-(MIT License) d3-tip v0.9.1 (https://github.com/Caged/d3-tip)
-(MIT License) dataTables v1.10.25 (https://datatables.net)
-(MIT License) normalize.css v3.0.2 (http://necolas.github.io/normalize.css/)
-(MIT License) ElasticMock v1.3.2 (https://github.com/vrcmarcos/elasticmock)
-(MIT License) MomentJS v2.24.0 (http://momentjs.com/)
-(MIT License) eonasdan-bootstrap-datetimepicker v4.17.49 (https://github.com/eonasdan/bootstrap-datetimepicker/)
-
-========================================================================
-BSD 3-Clause licenses
-========================================================================
-The following components are provided under the BSD 3-Clause license. See project links for details.
-The text of each license is also included at 3rd-party-licenses/LICENSE-[project].txt.
-
-(BSD 3 License) d3 v5.16.0 (https://d3js.org)
-(BSD 3 License) d3-shape v2.1.0 (https://github.com/d3/d3-shape)
-(BSD 3 License) cgroupspy 0.2.1 (https://github.com/cloudsigma/cgroupspy)
-
-========================================================================
-See 3rd-party-licenses/LICENSES-ui.txt for packages used in `/airflow/www`
```
airflow/providers/common/sql/__init__.py

```diff
@@ -29,7 +29,7 @@ from airflow import __version__ as airflow_version
 
 __all__ = ["__version__"]
 
-__version__ = "1.21.0rc1"
+__version__ = "1.23.0"
 
 if packaging.version.parse(packaging.version.parse(airflow_version).base_version) < packaging.version.parse(
     "2.9.0"
```
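The version guard shown in the hunk strips pre-release and dev suffixes via ``base_version`` before comparing, so e.g. a ``2.9.0.dev0`` Airflow build still passes the ``>= 2.9.0`` check. A sketch of that comparison with the same ``packaging`` library the module imports (the helper name is illustrative):

```python
import packaging.version

def is_airflow_compatible(airflow_version: str, minimum: str = "2.9.0") -> bool:
    # Mirrors the provider's guard: strip pre-release/dev suffixes through
    # base_version before comparing against the minimum supported version.
    base = packaging.version.parse(packaging.version.parse(airflow_version).base_version)
    return base >= packaging.version.parse(minimum)

print(is_airflow_compatible("2.9.0.dev0"))  # True: the dev suffix is stripped first
print(is_airflow_compatible("2.8.4"))       # False
```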
apache_airflow_providers_common_sql-1.23.0/src/airflow/providers/common/sql/dialects/dialect.py
ADDED (+211 lines)

```python
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements.  See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership.  The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License.  You may obtain a copy of the License at
#
#   http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied.  See the License for the
# specific language governing permissions and limitations
# under the License.
from __future__ import annotations

import re
from collections.abc import Iterable, Mapping
from typing import TYPE_CHECKING, Any, Callable, TypeVar

from methodtools import lru_cache

from airflow.utils.log.logging_mixin import LoggingMixin

if TYPE_CHECKING:
    from sqlalchemy.engine import Inspector

T = TypeVar("T")


class Dialect(LoggingMixin):
    """Generic dialect implementation."""

    pattern = re.compile(r"[^\w]")

    def __init__(self, hook, **kwargs) -> None:
        super().__init__(**kwargs)

        from airflow.providers.common.sql.hooks.sql import DbApiHook

        if not isinstance(hook, DbApiHook):
            raise TypeError(f"hook must be an instance of {DbApiHook.__class__.__name__}")

        self.hook: DbApiHook = hook

    @property
    def placeholder(self) -> str:
        return self.hook.placeholder

    @property
    def inspector(self) -> Inspector:
        return self.hook.inspector

    @property
    def insert_statement_format(self) -> str:
        return self.hook.insert_statement_format

    @property
    def replace_statement_format(self) -> str:
        return self.hook.replace_statement_format

    @property
    def escape_word_format(self) -> str:
        return self.hook.escape_word_format

    @property
    def escape_column_names(self) -> bool:
        return self.hook.escape_column_names

    def escape_word(self, word: str) -> str:
        """
        Escape the word if necessary.

        If the word is a reserved word or contains special characters or if the ``escape_column_names``
        property is set to True in connection extra field, then the given word will be escaped.

        :param word: Name of the column
        :return: The escaped word
        """
        if word != self.escape_word_format.format(self.unescape_word(word)) and (
            self.escape_column_names or word.casefold() in self.reserved_words or self.pattern.search(word)
        ):
            return self.escape_word_format.format(word)
        return word

    def unescape_word(self, word: str | None) -> str | None:
        """
        Remove escape characters from each part of a dotted identifier (e.g., schema.table).

        :param word: Escaped schema, table, or column name, potentially with multiple segments.
        :return: The word without escaped characters.
        """
        if not word:
            return word

        escape_char_start = self.escape_word_format[0]
        escape_char_end = self.escape_word_format[-1]

        def unescape_part(part: str) -> str:
            if part.startswith(escape_char_start) and part.endswith(escape_char_end):
                return part[1:-1]
            return part

        return ".".join(map(unescape_part, word.split(".")))

    @classmethod
    def extract_schema_from_table(cls, table: str) -> tuple[str, str | None]:
        parts = table.split(".")
        return tuple(parts[::-1]) if len(parts) == 2 else (table, None)  # type: ignore[return-value]

    @lru_cache(maxsize=None)
    def get_column_names(
        self, table: str, schema: str | None = None, predicate: Callable[[T], bool] = lambda column: True
    ) -> list[str] | None:
        if schema is None:
            table, schema = self.extract_schema_from_table(table)
        column_names = list(
            column["name"]
            for column in filter(
                predicate,
                self.inspector.get_columns(
                    table_name=self.unescape_word(table),
                    schema=self.unescape_word(schema) if schema else None,
                ),
            )
        )
        self.log.debug("Column names for table '%s': %s", table, column_names)
        return column_names

    @lru_cache(maxsize=None)
    def get_target_fields(self, table: str, schema: str | None = None) -> list[str] | None:
        target_fields = self.get_column_names(
            table,
            schema,
            lambda column: not column.get("identity", False) and not column.get("autoincrement", False),
        )
        self.log.debug("Target fields for table '%s': %s", table, target_fields)
        return target_fields

    @lru_cache(maxsize=None)
    def get_primary_keys(self, table: str, schema: str | None = None) -> list[str] | None:
        if schema is None:
            table, schema = self.extract_schema_from_table(table)
        primary_keys = self.inspector.get_pk_constraint(
            table_name=self.unescape_word(table),
            schema=self.unescape_word(schema) if schema else None,
        ).get("constrained_columns", [])
        self.log.debug("Primary keys for table '%s': %s", table, primary_keys)
        return primary_keys

    def run(
        self,
        sql: str | Iterable[str],
        autocommit: bool = False,
        parameters: Iterable | Mapping[str, Any] | None = None,
        handler: Callable[[Any], T] | None = None,
        split_statements: bool = False,
        return_last: bool = True,
    ) -> tuple | list[tuple] | list[list[tuple] | tuple] | None:
        return self.hook.run(sql, autocommit, parameters, handler, split_statements, return_last)

    def get_records(
        self,
        sql: str | list[str],
        parameters: Iterable | Mapping[str, Any] | None = None,
    ) -> Any:
        return self.hook.get_records(sql=sql, parameters=parameters)

    @property
    def reserved_words(self) -> set[str]:
        return self.hook.reserved_words

    def _joined_placeholders(self, values) -> str:
        placeholders = [
            self.placeholder,
        ] * len(values)
        return ",".join(placeholders)

    def _joined_target_fields(self, target_fields) -> str:
        if target_fields:
            target_fields = ", ".join(map(self.escape_word, target_fields))
            return f"({target_fields})"
        return ""

    def generate_insert_sql(self, table, values, target_fields, **kwargs) -> str:
        """
        Generate the INSERT SQL statement.

        :param table: Name of the target table
        :param values: The row to insert into the table
        :param target_fields: The names of the columns to fill in the table
        :return: The generated INSERT SQL statement
        """
        return self.insert_statement_format.format(
            table, self._joined_target_fields(target_fields), self._joined_placeholders(values)
        )

    def generate_replace_sql(self, table, values, target_fields, **kwargs) -> str:
        """
        Generate the REPLACE SQL statement.

        :param table: Name of the target table
        :param values: The row to insert into the table
        :param target_fields: The names of the columns to fill in the table
        :return: The generated REPLACE SQL statement
        """
        return self.replace_statement_format.format(
            table, self._joined_target_fields(target_fields), self._joined_placeholders(values)
        )
```
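``generate_insert_sql`` is pure string templating around the hook's format strings. A standalone sketch of that templating (the format strings and the sample table/columns are illustrative defaults, not taken from a specific hook):

```python
# Illustrative defaults; in the provider these come from the DbApiHook.
INSERT_STATEMENT_FORMAT = "INSERT INTO {} {} VALUES ({})"
ESCAPE_WORD_FORMAT = '"{}"'
PLACEHOLDER = "%s"

def generate_insert_sql(table, values, target_fields):
    # One placeholder per value, comma-joined, as in Dialect._joined_placeholders.
    placeholders = ",".join([PLACEHOLDER] * len(values))
    # Escaped, comma-joined column list, as in Dialect._joined_target_fields.
    columns = (
        "({})".format(", ".join(ESCAPE_WORD_FORMAT.format(f) for f in target_fields))
        if target_fields
        else ""
    )
    return INSERT_STATEMENT_FORMAT.format(table, columns, placeholders)

sql = generate_insert_sql("hr.employees", ("Ada", "Lovelace"), ["first_name", "last_name"])
print(sql)  # INSERT INTO hr.employees ("first_name", "last_name") VALUES (%s,%s)
```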
apache_airflow_providers_common_sql-1.23.0/src/airflow/providers/common/sql/dialects/dialect.pyi
ADDED
|
@@ -0,0 +1,82 @@
|
|
|
1
|
+
# Licensed to the Apache Software Foundation (ASF) under one
|
|
2
|
+
# or more contributor license agreements. See the NOTICE file
|
|
3
|
+
# distributed with this work for additional information
|
|
4
|
+
# regarding copyright ownership. The ASF licenses this file
|
|
5
|
+
# to you under the Apache License, Version 2.0 (the
|
|
6
|
+
# "License"); you may not use this file except in compliance
|
|
7
|
+
# with the License. You may obtain a copy of the License at
|
|
8
|
+
#
|
|
9
|
+
# http://www.apache.org/licenses/LICENSE-2.0
|
|
10
|
+
#
|
|
11
|
+
# Unless required by applicable law or agreed to in writing,
|
|
12
|
+
# software distributed under the License is distributed on an
|
|
13
|
+
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
|
|
14
|
+
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
#
# This is automatically generated stub for the `common.sql` provider
#
# This file is generated automatically by the `update-common-sql-api stubs` pre-commit
# and the .pyi file represents part of the "public" API that the
# `common.sql` provider exposes to other providers.
#
# Any, potentially breaking change in the stubs will require deliberate manual action from the contributor
# making a change to the `common.sql` provider. Those stubs are also used by MyPy automatically when checking
# if only public API of the common.sql provider is used by all the other providers.
#
# You can read more in the README_API.md file
#
"""
Definition of the public interface for airflow.providers.common.sql.src.airflow.providers.common.sql.dialects.dialect
isort:skip_file
"""

from collections.abc import Iterable, Mapping
from typing import Any, Callable, TypeVar

from _typeshed import Incomplete as Incomplete
from sqlalchemy.engine import Inspector as Inspector

from airflow.utils.log.logging_mixin import LoggingMixin as LoggingMixin

T = TypeVar("T")

class Dialect(LoggingMixin):
    hook: Incomplete
    def __init__(self, hook, **kwargs) -> None: ...
    def escape_word(self, column_name: str) -> str: ...
    def unescape_word(self, value: str | None) -> str | None: ...
    @property
    def placeholder(self) -> str: ...
    @property
    def insert_statement_format(self) -> str: ...
    @property
    def replace_statement_format(self) -> str: ...
    @property
    def escape_word_format(self) -> str: ...
    @property
    def inspector(self) -> Inspector: ...
    @classmethod
    def extract_schema_from_table(cls, table: str) -> tuple[str, str | None]: ...
    def get_column_names(
        self, table: str, schema: str | None = None, predicate: Callable[[T], bool] = ...
    ) -> list[str] | None: ...
    def get_target_fields(self, table: str, schema: str | None = None) -> list[str] | None: ...
    def get_primary_keys(self, table: str, schema: str | None = None) -> list[str] | None: ...
    def run(
        self,
        sql: str | Iterable[str],
        autocommit: bool = False,
        parameters: Iterable | Mapping[str, Any] | None = None,
        handler: Callable[[Any], T] | None = None,
        split_statements: bool = False,
        return_last: bool = True,
    ) -> tuple | list[tuple] | list[list[tuple] | tuple] | None: ...
    def get_records(
        self, sql: str | list[str], parameters: Iterable | Mapping[str, Any] | None = None
    ) -> Any: ...
    @property
    def reserved_words(self) -> set[str]: ...
    def generate_insert_sql(self, table, values, target_fields, **kwargs) -> str: ...
    def generate_replace_sql(self, table, values, target_fields, **kwargs) -> str: ...
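To make the stub's contract concrete, here is a minimal standalone sketch of the dialect idea. It does not import the real `airflow` `Dialect`; the class name, default values, and escaping behavior are illustrative assumptions loosely modeled on the signatures in the `.pyi` above (`escape_word`, `placeholder`, `escape_word_format`, `generate_insert_sql`, `reserved_words`).

```python
from __future__ import annotations

# Hypothetical, simplified dialect (NOT the provider's implementation):
# it quotes identifiers that collide with reserved words and assembles
# an INSERT statement from the same kinds of format-string properties
# the stub exposes.
class SketchDialect:
    placeholder = "%s"                                  # assumed default
    escape_word_format = '"{}"'                         # assumed default
    insert_statement_format = "INSERT INTO {} {} VALUES ({})"

    def __init__(self, reserved_words: set[str] | None = None) -> None:
        # tiny illustrative reserved-word set
        self.reserved_words = reserved_words or {"order", "group"}

    def escape_word(self, word: str) -> str:
        # only quote identifiers that clash with reserved words
        if word.lower() in self.reserved_words:
            return self.escape_word_format.format(word)
        return word

    def generate_insert_sql(self, table: str, values: list, target_fields: list[str]) -> str:
        columns = ", ".join(self.escape_word(f) for f in target_fields)
        placeholders = ", ".join(self.placeholder for _ in values)
        return self.insert_statement_format.format(table, f"({columns})", placeholders)


d = SketchDialect()
print(d.generate_insert_sql("users", [1, "x"], ["id", "order"]))
# INSERT INTO users (id, "order") VALUES (%s, %s)
```

A real dialect would additionally consult the SQLAlchemy `Inspector` for column and primary-key metadata, as the stub's `inspector`, `get_column_names`, and `get_primary_keys` members suggest.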
@@ -0,0 +1,61 @@
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at

  http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->

# 3. Introduce notion of dialects in DbApiHook

Date: 2025-01-07

## Status

Accepted

## Context

This ADR describes why we wanted to introduce dialects in the ``DbApiHook``: we experienced that the
``_insert_statement_format`` and ``_replace_statement_format`` string formatting properties used by the
``insert_rows`` method of the ``DbApiHook`` were lacking in some cases, as the number of parameters passed to the
string format is hard-coded and isn't always sufficient when targeting different databases through the
generic JDBC and ODBC connection types.

That's why we wanted a generic approach in which the code isn't tied to a specific database hook.

For example, when using MSSQL through ODBC instead of the native ``MsSqlHook``, you won't have the merge-into
(i.e. replace) functionality for MSSQL, as that functionality was only available in the native ``MsSqlHook``.

That's where the notion of dialects comes into play: it allows us to benefit from the same functionality
independently of which connection type (ODBC/JDBC or native, if available) is used for a specific
database.


## Decision

We decided to introduce the notion of dialects, which allows us to implement database-specific functionality
independently of the connection type (i.e. hook) being used. That way, when using for example the ``insert_rows``
method on the ``DbApiHook`` with ODBC, JDBC, or native connection types, it will always be possible to use the
replace-into (i.e. merge-into) functionality, as that is no longer tied to a specific hook implementation and thus
a specific connection type.


## Consequences

The consequence of this decision is that, from now on, database-specific implementations should be done within the
dialect for that database instead of the specialized hook, unless the connection type is tied to the hook,
meaning that only one connection type is possible and no ODBC/JDBC (or, in the future, possibly ADBC,
i.e. Apache Arrow) alternative is available.