airflow-unicore-integration 0.0.4__tar.gz → 0.0.5__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (18)
  1. airflow_unicore_integration-0.0.5/LICENSE +29 -0
  2. airflow_unicore_integration-0.0.5/PKG-INFO +156 -0
  3. {airflow_unicore_integration-0.0.4 → airflow_unicore_integration-0.0.5}/README.rst +1 -1
  4. {airflow_unicore_integration-0.0.4 → airflow_unicore_integration-0.0.5}/pyproject.toml +5 -5
  5. airflow_unicore_integration-0.0.5/src/airflow_unicore_integration.egg-info/PKG-INFO +156 -0
  6. {airflow_unicore_integration-0.0.4 → airflow_unicore_integration-0.0.5}/src/airflow_unicore_integration.egg-info/SOURCES.txt +1 -0
  7. airflow_unicore_integration-0.0.4/PKG-INFO +0 -16
  8. airflow_unicore_integration-0.0.4/src/airflow_unicore_integration.egg-info/PKG-INFO +0 -16
  9. {airflow_unicore_integration-0.0.4 → airflow_unicore_integration-0.0.5}/setup.cfg +0 -0
  10. {airflow_unicore_integration-0.0.4 → airflow_unicore_integration-0.0.5}/src/airflow_unicore_integration/__init__.py +0 -0
  11. {airflow_unicore_integration-0.0.4 → airflow_unicore_integration-0.0.5}/src/airflow_unicore_integration/hooks/__init__.py +0 -0
  12. {airflow_unicore_integration-0.0.4 → airflow_unicore_integration-0.0.5}/src/airflow_unicore_integration/hooks/unicore_hooks.py +0 -0
  13. {airflow_unicore_integration-0.0.4 → airflow_unicore_integration-0.0.5}/src/airflow_unicore_integration/operators/__init__.py +0 -0
  14. {airflow_unicore_integration-0.0.4 → airflow_unicore_integration-0.0.5}/src/airflow_unicore_integration/operators/unicore_operators.py +0 -0
  15. {airflow_unicore_integration-0.0.4 → airflow_unicore_integration-0.0.5}/src/airflow_unicore_integration.egg-info/dependency_links.txt +0 -0
  16. {airflow_unicore_integration-0.0.4 → airflow_unicore_integration-0.0.5}/src/airflow_unicore_integration.egg-info/entry_points.txt +0 -0
  17. {airflow_unicore_integration-0.0.4 → airflow_unicore_integration-0.0.5}/src/airflow_unicore_integration.egg-info/requires.txt +0 -0
  18. {airflow_unicore_integration-0.0.4 → airflow_unicore_integration-0.0.5}/src/airflow_unicore_integration.egg-info/top_level.txt +0 -0
@@ -0,0 +1,29 @@
+ BSD 3-Clause License
+
+ Copyright (c) Forschungszentrum Juelich GmbH
+ All rights reserved.
+
+ Redistribution and use in source and binary forms, with or without
+ modification, are permitted provided that the following conditions are met:
+
+ * Redistributions of source code must retain the above copyright notice, this
+ list of conditions and the following disclaimer.
+
+ * Redistributions in binary form must reproduce the above copyright notice,
+ this list of conditions and the following disclaimer in the documentation
+ and/or other materials provided with the distribution.
+
+ * Neither the names of the copyright holders nor the names of its
+ contributors may be used to endorse or promote products derived from
+ this software without specific prior written permission.
+
+ THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
+ AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
+ IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
+ DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
+ FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
+ DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
+ SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
+ CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
+ OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+ OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
@@ -0,0 +1,156 @@
+ Metadata-Version: 2.1
+ Name: airflow-unicore-integration
+ Version: 0.0.5
+ Summary: Running Unicore Jobs from airflow DAGs.
+ Author-email: Christian Böttcher <c.boettcher@fz-juelich.de>
+ Project-URL: Homepage, https://github.com/UNICORE-EU/airflow-unicore-integration
+ Project-URL: Issues, https://github.com/UNICORE-EU/airflow-unicore-integration/issues
+ Classifier: Development Status :: 4 - Beta
+ Classifier: Framework :: Apache Airflow :: Provider
+ Classifier: Programming Language :: Python :: 3
+ Classifier: License :: OSI Approved :: BSD License
+ Classifier: Operating System :: OS Independent
+ Requires-Python: >=3.10
+ Description-Content-Type: text/x-rst
+ License-File: LICENSE
+ Requires-Dist: pyunicore>=1.0.0
+ Requires-Dist: apache-airflow==2.8.4
+
+ ===========================
+ Unicore Airflow Integration
+ ===========================
+
+
+ ---------------------------
+ Using the Unicore Operators
+ ---------------------------
+
+ This package provides multiple Unicore operators. The most versatile is the ``UnicoreGenericOperator``, which supports a wide range of job parameters.
+ All other operators offer a simpler constructor, and therefore simpler usage, but all generic parameters remain available.
+
+ All operators support all parameters of the `Unicore job description <https://unicore-docs.readthedocs.io/en/latest/user-docs/rest-api/job-description/index.html#overview>`_. Here is an excerpt of commonly used parameters:
+
+ ======================= ======================= =========================================== ====================
+ parameter name          type                    default                                     description
+ ======================= ======================= =========================================== ====================
+ application_name        str                     None                                        Application Name
+ application_version     str                     None                                        Application Version
+ executable              str                     None                                        Command line executable
+ arguments               List(str)               None                                        Command line arguments
+ environment             Map(str,str)            None                                        Environment variables
+ parameters              Map                     None                                        Application Parameters
+ project                 str                     None                                        Accounting Project
+ imports                 List(imports)           None                                        Stage-in/data import - see Unicore docs
+ exports                 List(exports)           None                                        Stage-out/data export - see Unicore docs
+ ======================= ======================= =========================================== ====================
+
+ For details on imports and exports, see `the Unicore documentation <https://unicore-docs.readthedocs.io/en/latest/user-docs/rest-api/job-description/index.html#importing-files-into-the-job-workspace>`_.
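Assembled as JSON, a job description using some of these parameters might look like the sketch below. Key names follow the Unicore job description documentation linked above; the executable, file names, project and storage URL are hypothetical examples.

```python
# Sketch of a Unicore job description using common parameters from the
# table above. All concrete values here are hypothetical examples.
job_description = {
    "Executable": "/usr/bin/env",
    "Arguments": ["python3", "analysis.py"],
    "Environment": {"OMP_NUM_THREADS": "4"},
    "Project": "demo-project",
    # Stage-in: place an inline-provided file into the job workspace
    "Imports": [
        {"From": "inline://dummy", "To": "analysis.py", "Data": "print('hello')"}
    ],
    # Stage-out: export a result file after the job finishes
    "Exports": [
        {"From": "result.txt", "To": "https://example.org/storage/result.txt"}
    ],
}

print(sorted(job_description))
```

A dictionary like this corresponds to what the operators build internally from their constructor arguments before submission via pyunicore.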
+
+
+ The ``UnicoreGenericOperator`` supports the following additional parameters:
+
+ ======================= ======================= =========================================== ====================
+ parameter name          type                    default                                     description
+ ======================= ======================= =========================================== ====================
+ name                    str                     None                                        name for the airflow task and the Unicore job
+ xcom_output_files       List(str)               ["stdout","stderr"]                         list of files whose content should be put into xcoms
+ base_url                str                     configured in airflow connections or None   The base URL of the UNICORE/X server to be used for the Unicore client
+ credential              pyunicore credential    configured in airflow connections or None   A Unicore credential to be used for the Unicore client
+ credential_username     str                     configured in airflow connections or None   Username for the Unicore client credentials
+ credential_password     str                     configured in airflow connections or None   Password for the Unicore client credentials
+ credential_token        str                     configured in airflow connections or None   An OIDC token to be used by the Unicore client
+ ======================= ======================= =========================================== ====================
+
+
+ The ``UnicoreScriptOperator`` offers an easier way to submit a script as a job; the script content is provided as a string.
+
+ ======================= ======================= =========================================== ====================
+ parameter name          type                    default                                     description
+ ======================= ======================= =========================================== ====================
+ script_content          str                     None                                        The content of the script file
+ ======================= ======================= =========================================== ====================
+
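For illustration, a ``script_content`` value is simply a plain string holding the whole script; in a DAG file it might be assembled like this (the script body is a made-up example):

```python
# A script_content value for the UnicoreScriptOperator is one plain string
# containing the whole script; joining lines keeps the DAG file readable.
# The script body below is a hypothetical example.
script_content = "\n".join([
    "#!/bin/bash",
    'echo "running on $(hostname)"',
    "date",
])

print(script_content)
```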
+
+ The ``UnicoreBSSOperator`` offers a way to submit batch scripts directly from their content strings.
+
+ ======================= ======================= =========================================== ====================
+ parameter name          type                    default                                     description
+ ======================= ======================= =========================================== ====================
+ bss_file_content        str                     None                                        The content of the batch script file
+ ======================= ======================= =========================================== ====================
+
+
+ The ``UnicoreExecutableOperator`` offers a reduced constructor that only requires an executable.
+
+ ======================= ======================= =========================================== ====================
+ parameter name          type                    default                                     description
+ ======================= ======================= =========================================== ====================
+ executable              str                     None                                        The executable to run for this job
+ xcom_output_files       List(str)               ["stdout","stderr"]                         list of files whose content should be put into xcoms
+ ======================= ======================= =========================================== ====================
+
+ The ``UnicoreDateOperator`` is mainly useful for testing, since it only runs the ``date`` executable.
+
+ -------------------------------
+ Behaviour on Errors and Success
+ -------------------------------
+
+ The Unicore operators do little error and exception handling themselves; they mostly forward any problems to be handled by airflow.
+ All of the Unicore logic is handled by the `pyunicore library <https://github.com/HumanBrainProject/pyunicore>`_.
+
+ While some validation of the resulting Unicore job description is done automatically, it is still possible to build an invalid job description with the operators.
+ This may lead to a submission failure with Unicore; in this case, an exception is raised to be handled by airflow.
+
+
+ For a successful job submission, the job exit code is returned as the task return value, so that airflow can handle non-zero exit codes.
+ All operators also append the content of the Unicore job log file to the airflow task log.
+ Some job results and values are also published via airflow xcoms:
+
+ ======================= ========================================
+ xcom name               description
+ ======================= ========================================
+ Unicore Job ID          the Unicore ID for the job
+ Unicore Job             the TSI script that was submitted by Unicore
+ BSS_SUBMIT              the bss_script submitted by Unicore
+ status_message          the status message for the Unicore job
+ log                     the Unicore job log
+ workdir_content         content of the job workdir upon completion
+ [xcom_output_files]     content of each file in its own xcom, by default stdout and stderr
+ ======================= ========================================
+
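To illustrate the shapes involved, a downstream task that pulls these xcoms might treat them roughly as below. The values and the helper function are made-up placeholders for illustration, not real operator output or part of this package's API.

```python
# Illustration only: made-up xcom values in the shape described by the table
# above, and a check a downstream task might perform on them.
xcoms = {
    "Unicore Job ID": "8f2c0a9e-0000-0000-0000-000000000000",  # placeholder
    "status_message": "SUCCESSFUL",
    "stdout": "hello\n",  # one xcom per entry in xcom_output_files
    "stderr": "",
}

def job_looks_successful(xcoms: dict) -> bool:
    # Hypothetical helper: treat the job as successful if the status
    # message reports success and stderr is empty.
    return xcoms.get("status_message") == "SUCCESSFUL" and not xcoms.get("stderr")

print(job_looks_successful(xcoms))
```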
+ ------------
+ Example DAGs
+ ------------
+
+ There are some example DAGs in this repository under ``project-dir/dags``.
+
+ - ``unicore-test-1.py`` shows basic date and executable usage.
+ - ``unicore-test-2.py`` has some basic examples for the generic operator.
+ - ``unicore-test-3.py`` also includes script-operator examples.
+ - ``unicore-test-4.py`` has some examples with more arguments.
+ - ``unicore-test-bss.py`` shows how bss submission can be done (very simple example).
+ - ``unicore-test-credentials.py`` demonstrates that credentials can be provided in the operator constructor as well as via the airflow connections backend.
+ - ``unicore-test-import-export.py`` gives short examples of imports and exports usage.
+
+
+ -----------------
+ Setup testing env
+ -----------------
+
+ Ensure a current version of docker is installed.
+
+ Run ``python3 -m build`` to build the python package.
+
+ Run the ``testing-env/build-image.sh`` script to create the customized airflow image, which will contain the newly built python package.
+
+ Run ``testing-env/run-testing-env.sh init`` to initialize the airflow containers, database etc. This only needs to be done once.
+
+ Run ``testing-env/run-testing-env.sh up`` to start the local airflow and Unicore deployment. Airflow will be available on port 8080, Unicore on port 8081.
+
+ The ``run-testing-env.sh`` script supports the commands ``up``, ``down``, ``start``, ``stop``, ``ps`` and ``init``, which map to the matching docker compose functions.
+
+ -----------------------
+ Install package via pip
+ -----------------------
+
+ ``pip install airflow-unicore-integration``
@@ -135,4 +135,4 @@ The ``run-testing-env.sh`` script supports the commands up, down, start, stop, p
  Install package via pip
  -----------------------
  
- ``pip install airflow-unicore-integration --index-url https://gitlab.jsc.fz-juelich.de/api/v4/projects/6269/packages/pypi/simple``
+ ``pip install airflow-unicore-integration``
@@ -6,18 +6,18 @@ build-backend = "setuptools.build_meta"
  
  [project]
  name = "airflow-unicore-integration"
- version = "0.0.4"
+ version = "0.0.5"
  authors = [
  { name="Christian Böttcher", email="c.boettcher@fz-juelich.de" },
  ]
  description = "Running Unicore Jobs from airflow DAGs."
- readme = {file = "README.txt", content-type = "text/markdown"}
+ readme = "README.rst"
  requires-python = ">=3.10"
  classifiers = [
  "Development Status :: 4 - Beta",
  "Framework :: Apache Airflow :: Provider",
  "Programming Language :: Python :: 3",
- "License :: OSI Approved :: MIT License",
+ "License :: OSI Approved :: BSD License",
  "Operating System :: OS Independent",
  ]
  
@@ -29,8 +29,8 @@ dependencies = [
  ]
  
  [project.urls]
- Homepage = "https://gitlab.jsc.fz-juelich.de/boettcher1/airflow_unicore_integration"
- Issues = "https://gitlab.jsc.fz-juelich.de/boettcher1/airflow_unicore_integration/-/issues"
+ Homepage = "https://github.com/UNICORE-EU/airflow-unicore-integration"
+ Issues = "https://github.com/UNICORE-EU/airflow-unicore-integration/issues"
  
  [project.entry-points."apache_airflow_provider"]
  provider_info = "airflow_unicore_integration:get_provider_info"
@@ -0,0 +1,156 @@
+ Metadata-Version: 2.1
+ Name: airflow-unicore-integration
+ Version: 0.0.5
+ Summary: Running Unicore Jobs from airflow DAGs.
+ Author-email: Christian Böttcher <c.boettcher@fz-juelich.de>
+ Project-URL: Homepage, https://github.com/UNICORE-EU/airflow-unicore-integration
+ Project-URL: Issues, https://github.com/UNICORE-EU/airflow-unicore-integration/issues
+ Classifier: Development Status :: 4 - Beta
+ Classifier: Framework :: Apache Airflow :: Provider
+ Classifier: Programming Language :: Python :: 3
+ Classifier: License :: OSI Approved :: BSD License
+ Classifier: Operating System :: OS Independent
+ Requires-Python: >=3.10
+ Description-Content-Type: text/x-rst
+ License-File: LICENSE
+ Requires-Dist: pyunicore>=1.0.0
+ Requires-Dist: apache-airflow==2.8.4
+
+ ===========================
+ Unicore Airflow Integration
+ ===========================
+
+
+ ---------------------------
+ Using the Unicore Operators
+ ---------------------------
+
+ This package provides multiple Unicore operators. The most versatile is the ``UnicoreGenericOperator``, which supports a wide range of job parameters.
+ All other operators offer a simpler constructor, and therefore simpler usage, but all generic parameters remain available.
+
+ All operators support all parameters of the `Unicore job description <https://unicore-docs.readthedocs.io/en/latest/user-docs/rest-api/job-description/index.html#overview>`_. Here is an excerpt of commonly used parameters:
+
+ ======================= ======================= =========================================== ====================
+ parameter name          type                    default                                     description
+ ======================= ======================= =========================================== ====================
+ application_name        str                     None                                        Application Name
+ application_version     str                     None                                        Application Version
+ executable              str                     None                                        Command line executable
+ arguments               List(str)               None                                        Command line arguments
+ environment             Map(str,str)            None                                        Environment variables
+ parameters              Map                     None                                        Application Parameters
+ project                 str                     None                                        Accounting Project
+ imports                 List(imports)           None                                        Stage-in/data import - see Unicore docs
+ exports                 List(exports)           None                                        Stage-out/data export - see Unicore docs
+ ======================= ======================= =========================================== ====================
+
+ For details on imports and exports, see `the Unicore documentation <https://unicore-docs.readthedocs.io/en/latest/user-docs/rest-api/job-description/index.html#importing-files-into-the-job-workspace>`_.
+
+
+ The ``UnicoreGenericOperator`` supports the following additional parameters:
+
+ ======================= ======================= =========================================== ====================
+ parameter name          type                    default                                     description
+ ======================= ======================= =========================================== ====================
+ name                    str                     None                                        name for the airflow task and the Unicore job
+ xcom_output_files       List(str)               ["stdout","stderr"]                         list of files whose content should be put into xcoms
+ base_url                str                     configured in airflow connections or None   The base URL of the UNICORE/X server to be used for the Unicore client
+ credential              pyunicore credential    configured in airflow connections or None   A Unicore credential to be used for the Unicore client
+ credential_username     str                     configured in airflow connections or None   Username for the Unicore client credentials
+ credential_password     str                     configured in airflow connections or None   Password for the Unicore client credentials
+ credential_token        str                     configured in airflow connections or None   An OIDC token to be used by the Unicore client
+ ======================= ======================= =========================================== ====================
+
+
+ The ``UnicoreScriptOperator`` offers an easier way to submit a script as a job; the script content is provided as a string.
+
+ ======================= ======================= =========================================== ====================
+ parameter name          type                    default                                     description
+ ======================= ======================= =========================================== ====================
+ script_content          str                     None                                        The content of the script file
+ ======================= ======================= =========================================== ====================
+
+
+ The ``UnicoreBSSOperator`` offers a way to submit batch scripts directly from their content strings.
+
+ ======================= ======================= =========================================== ====================
+ parameter name          type                    default                                     description
+ ======================= ======================= =========================================== ====================
+ bss_file_content        str                     None                                        The content of the batch script file
+ ======================= ======================= =========================================== ====================
+
+
+ The ``UnicoreExecutableOperator`` offers a reduced constructor that only requires an executable.
+
+ ======================= ======================= =========================================== ====================
+ parameter name          type                    default                                     description
+ ======================= ======================= =========================================== ====================
+ executable              str                     None                                        The executable to run for this job
+ xcom_output_files       List(str)               ["stdout","stderr"]                         list of files whose content should be put into xcoms
+ ======================= ======================= =========================================== ====================
+
+ The ``UnicoreDateOperator`` is mainly useful for testing, since it only runs the ``date`` executable.
+
+ -------------------------------
+ Behaviour on Errors and Success
+ -------------------------------
+
+ The Unicore operators do little error and exception handling themselves; they mostly forward any problems to be handled by airflow.
+ All of the Unicore logic is handled by the `pyunicore library <https://github.com/HumanBrainProject/pyunicore>`_.
+
+ While some validation of the resulting Unicore job description is done automatically, it is still possible to build an invalid job description with the operators.
+ This may lead to a submission failure with Unicore; in this case, an exception is raised to be handled by airflow.
+
+
+ For a successful job submission, the job exit code is returned as the task return value, so that airflow can handle non-zero exit codes.
+ All operators also append the content of the Unicore job log file to the airflow task log.
+ Some job results and values are also published via airflow xcoms:
+
+ ======================= ========================================
+ xcom name               description
+ ======================= ========================================
+ Unicore Job ID          the Unicore ID for the job
+ Unicore Job             the TSI script that was submitted by Unicore
+ BSS_SUBMIT              the bss_script submitted by Unicore
+ status_message          the status message for the Unicore job
+ log                     the Unicore job log
+ workdir_content         content of the job workdir upon completion
+ [xcom_output_files]     content of each file in its own xcom, by default stdout and stderr
+ ======================= ========================================
+
+ ------------
+ Example DAGs
+ ------------
+
+ There are some example DAGs in this repository under ``project-dir/dags``.
+
+ - ``unicore-test-1.py`` shows basic date and executable usage.
+ - ``unicore-test-2.py`` has some basic examples for the generic operator.
+ - ``unicore-test-3.py`` also includes script-operator examples.
+ - ``unicore-test-4.py`` has some examples with more arguments.
+ - ``unicore-test-bss.py`` shows how bss submission can be done (very simple example).
+ - ``unicore-test-credentials.py`` demonstrates that credentials can be provided in the operator constructor as well as via the airflow connections backend.
+ - ``unicore-test-import-export.py`` gives short examples of imports and exports usage.
+
+
+ -----------------
+ Setup testing env
+ -----------------
+
+ Ensure a current version of docker is installed.
+
+ Run ``python3 -m build`` to build the python package.
+
+ Run the ``testing-env/build-image.sh`` script to create the customized airflow image, which will contain the newly built python package.
+
+ Run ``testing-env/run-testing-env.sh init`` to initialize the airflow containers, database etc. This only needs to be done once.
+
+ Run ``testing-env/run-testing-env.sh up`` to start the local airflow and Unicore deployment. Airflow will be available on port 8080, Unicore on port 8081.
+
+ The ``run-testing-env.sh`` script supports the commands ``up``, ``down``, ``start``, ``stop``, ``ps`` and ``init``, which map to the matching docker compose functions.
+
+ -----------------------
+ Install package via pip
+ -----------------------
+
+ ``pip install airflow-unicore-integration``
@@ -1,3 +1,4 @@
+ LICENSE
  README.rst
  pyproject.toml
  src/airflow_unicore_integration/__init__.py
@@ -1,16 +0,0 @@
- Metadata-Version: 2.1
- Name: airflow-unicore-integration
- Version: 0.0.4
- Summary: Running Unicore Jobs from airflow DAGs.
- Author-email: Christian Böttcher <c.boettcher@fz-juelich.de>
- Project-URL: Homepage, https://gitlab.jsc.fz-juelich.de/boettcher1/airflow_unicore_integration
- Project-URL: Issues, https://gitlab.jsc.fz-juelich.de/boettcher1/airflow_unicore_integration/-/issues
- Classifier: Development Status :: 4 - Beta
- Classifier: Framework :: Apache Airflow :: Provider
- Classifier: Programming Language :: Python :: 3
- Classifier: License :: OSI Approved :: MIT License
- Classifier: Operating System :: OS Independent
- Requires-Python: >=3.10
- Description-Content-Type: text/markdown
- Requires-Dist: pyunicore>=1.0.0
- Requires-Dist: apache-airflow==2.8.4
@@ -1,16 +0,0 @@
- Metadata-Version: 2.1
- Name: airflow-unicore-integration
- Version: 0.0.4
- Summary: Running Unicore Jobs from airflow DAGs.
- Author-email: Christian Böttcher <c.boettcher@fz-juelich.de>
- Project-URL: Homepage, https://gitlab.jsc.fz-juelich.de/boettcher1/airflow_unicore_integration
- Project-URL: Issues, https://gitlab.jsc.fz-juelich.de/boettcher1/airflow_unicore_integration/-/issues
- Classifier: Development Status :: 4 - Beta
- Classifier: Framework :: Apache Airflow :: Provider
- Classifier: Programming Language :: Python :: 3
- Classifier: License :: OSI Approved :: MIT License
- Classifier: Operating System :: OS Independent
- Requires-Python: >=3.10
- Description-Content-Type: text/markdown
- Requires-Dist: pyunicore>=1.0.0
- Requires-Dist: apache-airflow==2.8.4