airbyte-source-convex 0.4.0__py3-none-any.whl → 0.4.2__py3-none-any.whl

@@ -0,0 +1,110 @@
1
+ Metadata-Version: 2.1
2
+ Name: airbyte-source-convex
3
+ Version: 0.4.2
4
+ Summary: Source implementation for Convex.
5
+ Home-page: https://airbyte.com
6
+ License: MIT
7
+ Author: Airbyte
8
+ Author-email: contact@airbyte.io
9
+ Requires-Python: >=3.9,<3.12
10
+ Classifier: License :: OSI Approved :: MIT License
11
+ Classifier: Programming Language :: Python :: 3
12
+ Classifier: Programming Language :: Python :: 3.9
13
+ Classifier: Programming Language :: Python :: 3.10
14
+ Classifier: Programming Language :: Python :: 3.11
15
+ Requires-Dist: airbyte-cdk (==0.80.0)
16
+ Project-URL: Documentation, https://docs.airbyte.com/integrations/sources/convex
17
+ Project-URL: Repository, https://github.com/airbytehq/airbyte
18
+ Description-Content-Type: text/markdown
19
+
20
+ # Convex source connector
21
+
22
+
23
+ This is the repository for the Convex source connector, written in Python.
24
+ For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/convex).
25
+
26
+ ## Local development
27
+
28
+ ### Prerequisites
29
+ * Python (~=3.9)
30
+ * Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation)
31
+
32
+
33
+ ### Installing the connector
34
+ From this connector directory, run:
35
+ ```bash
36
+ poetry install --with dev
37
+ ```
38
+
39
+
40
+ ### Create credentials
41
+ **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/convex)
42
+ to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_convex/spec.yaml` file.
43
+ Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
44
+ See `sample_files/sample_config.json` for a sample config file.
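+
+ As a reference, here is a minimal sketch of what `secrets/config.json` looks like. The field names come from the sample config shipped with the connector; the values below are placeholders, so substitute your own deployment URL and access key:
+ ```json
+ {
+   "deployment_url": "https://<your-deployment>.convex.cloud",
+   "access_key": "<your Convex access key>"
+ }
+ ```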
45
+
46
+
47
+ ### Locally running the connector
48
+ ```
49
+ poetry run source-convex spec
50
+ poetry run source-convex check --config secrets/config.json
51
+ poetry run source-convex discover --config secrets/config.json
52
+ poetry run source-convex read --config secrets/config.json --catalog sample_files/configured_catalog.json
53
+ ```
54
+
55
+ ### Running unit tests
56
+ To run unit tests locally, from the connector directory run:
57
+ ```
58
+ poetry run pytest unit_tests
59
+ ```
60
+
61
+ ### Building the docker image
62
+ 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md)
63
+ 2. Run the following command to build the docker image:
64
+ ```bash
65
+ airbyte-ci connectors --name=source-convex build
66
+ ```
67
+
68
+ An image will be available on your host with the tag `airbyte/source-convex:dev`.
69
+
70
+
71
+ ### Running as a docker container
72
+ After building the image, run any of the connector commands as follows:
73
+ ```
74
+ docker run --rm airbyte/source-convex:dev spec
75
+ docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-convex:dev check --config /secrets/config.json
76
+ docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-convex:dev discover --config /secrets/config.json
77
+ docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-convex:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
78
+ ```
79
+
80
+ ### Running our CI test suite
81
+ You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md):
82
+ ```bash
83
+ airbyte-ci connectors --name=source-convex test
84
+ ```
85
+
86
+ ### Customizing acceptance tests
87
+ Customize the `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information.
88
+ If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.
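+
+ A minimal sketch of such a fixture is shown below; it follows the placeholder `integration_tests/acceptance.py` pattern used by Airbyte connectors, and the setup/teardown comments mark where you would manage your own resources:
+ ```python
+ import pytest
+
+ # Makes the connector acceptance test plugin (and its fixtures) available to pytest.
+ pytest_plugins = ("connector_acceptance_test.plugin",)
+
+
+ @pytest.fixture(scope="session", autouse=True)
+ def connector_setup():
+     """Create any external resources the acceptance tests need, then clean them up."""
+     # Set up resources here, e.g. seed test data in your Convex deployment.
+     yield
+     # Tear down resources here.
+ ```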
89
+
90
+ ### Dependency Management
91
+ All of your dependencies should be managed via Poetry.
92
+ To add a new dependency, run:
93
+ ```bash
94
+ poetry add <package-name>
95
+ ```
96
+
97
+ Please commit the changes to `pyproject.toml` and `poetry.lock` files.
98
+
99
+ ## Publishing a new version of the connector
100
+ You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?
101
+ 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-convex test`
102
+ 2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)):
103
+ - bump the `dockerImageTag` value in `metadata.yaml`
104
+ - bump the `version` value in `pyproject.toml` (see the sketch after this list)
105
+ 3. Make sure the `metadata.yaml` content is up to date.
106
+ 4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/convex.md`).
107
+ 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
108
+ 6. Pat yourself on the back for being an awesome contributor.
109
+ 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
110
+ 8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry.
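+
+ As an illustration of step 2, a bump like this release's 0.4.0 to 0.4.2 touches just these two fields. The snippets below are sketches that omit all other keys; check your actual files for the exact layout:
+ ```yaml
+ # metadata.yaml (only the field to bump is shown)
+ dockerImageTag: 0.4.2
+ ```
+ ```toml
+ # pyproject.toml (only the field to bump is shown; Poetry keeps the version under [tool.poetry])
+ [tool.poetry]
+ version = "0.4.2"
+ ```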
@@ -0,0 +1,8 @@
1
+ source_convex/__init__.py,sha256=T8ZCfUI3YTwNnVbF2QrCbi0VvbltBZDfzIIQqs2FNi8,124
2
+ source_convex/run.py,sha256=DhNog0Q8GE3ttxc-yYiuT-1NBe34IDG_Z78CvDDum9c,230
3
+ source_convex/source.py,sha256=6G90cvi7pp6Xzu8akY5Vsuse62h8d1IcMJh2KCqITaA,9347
4
+ source_convex/spec.yaml,sha256=B3bAQFg19q5SYtZJXg_X3Xzvh7zthE6Iyxvm5DwQUU0,616
5
+ airbyte_source_convex-0.4.2.dist-info/METADATA,sha256=xTLQ5K5aiPUgzs0HJBfbGlrBfRJij9-IwCYM7HPAWcU,5200
6
+ airbyte_source_convex-0.4.2.dist-info/WHEEL,sha256=sP946D7jFCHeNz5Iq4fL4Lu-PrWrFsgfLXbbkciIZwg,88
7
+ airbyte_source_convex-0.4.2.dist-info/entry_points.txt,sha256=bpRwIyH5YD6_9_szRISOplm5kfQ4CfW_ROvHg2fP2e0,55
8
+ airbyte_source_convex-0.4.2.dist-info/RECORD,,
@@ -1,5 +1,4 @@
1
1
  Wheel-Version: 1.0
2
- Generator: bdist_wheel (0.42.0)
2
+ Generator: poetry-core 1.9.0
3
3
  Root-Is-Purelib: true
4
4
  Tag: py3-none-any
5
-
@@ -0,0 +1,3 @@
1
+ [console_scripts]
2
+ source-convex=source_convex.run:run
3
+
@@ -1,100 +0,0 @@
1
- Metadata-Version: 2.1
2
- Name: airbyte-source-convex
3
- Version: 0.4.0
4
- Summary: Source implementation for Convex.
5
- Author: Airbyte
6
- Author-email: contact@airbyte.io
7
- Description-Content-Type: text/markdown
8
- Requires-Dist: airbyte-cdk ~=0.2
9
- Provides-Extra: tests
10
- Requires-Dist: requests-mock ~=1.9.3 ; extra == 'tests'
11
- Requires-Dist: pytest ~=6.1 ; extra == 'tests'
12
- Requires-Dist: pytest-mock ~=3.6.1 ; extra == 'tests'
13
- Requires-Dist: responses ~=0.13.3 ; extra == 'tests'
14
-
15
- # Convex Source
16
-
17
- This is the repository for the Convex source connector, written in Python.
18
- For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.io/integrations/sources/convex).
19
-
20
-
21
- **To iterate on this connector, make sure to complete this prerequisites section.**
22
-
23
-
24
- From this connector directory, create a virtual environment:
25
- ```
26
- python -m venv .venv
27
- ```
28
-
29
- This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your
30
- development environment of choice. To activate it from the terminal, run:
31
- ```
32
- source .venv/bin/activate
33
- pip install -r requirements.txt
34
- pip install '.[tests]'
35
- ```
36
- If you are in an IDE, follow your IDE's instructions to activate the virtualenv.
37
-
38
- Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is
39
- used for editable installs (`pip install -e`) to pull in Python dependencies from the monorepo and will call `setup.py`.
40
- If this is mumbo jumbo to you, don't worry about it, just put your deps in `setup.py` but install using `pip install -r requirements.txt` and everything
41
- should work as you expect.
42
-
43
- **If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/convex)
44
- to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_convex/spec.yaml` file.
45
- Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
46
- See `integration_tests/sample_config.json` for a sample config file.
47
-
48
- **If you are an Airbyte core member**, copy the credentials in Lastpass under the secret name `source convex test creds`
49
- and place them into `secrets/config.json`.
50
-
51
- ```
52
- python main.py spec
53
- python main.py check --config secrets/config.json
54
- python main.py discover --config secrets/config.json
55
- python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
56
- ```
57
-
58
-
59
-
60
- **Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):**
61
- ```bash
62
- airbyte-ci connectors --name=source-convex build
63
- ```
64
-
65
- An image will be built with the tag `airbyte/source-convex:dev`.
66
-
67
- **Via `docker build`:**
68
- ```bash
69
- docker build -t airbyte/source-convex:dev .
70
- ```
71
-
72
- Then run any of the connector commands as follows:
73
- ```
74
- docker run --rm airbyte/source-convex:dev spec
75
- docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-convex:dev check --config /secrets/config.json
76
- docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-convex:dev discover --config /secrets/config.json
77
- docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-convex:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
78
- ```
79
-
80
- You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md):
81
- ```bash
82
- airbyte-ci connectors --name=source-convex test
83
- ```
84
-
85
- Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information.
86
- If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py.
87
-
88
- All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
89
- We split dependencies between two groups, dependencies that are:
90
- * required for your connector to work need to go to `MAIN_REQUIREMENTS` list.
91
- * required for the testing need to go to `TEST_REQUIREMENTS` list
92
-
93
- You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?
94
- 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-convex test`
95
- 2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors).
96
- 3. Make sure the `metadata.yaml` content is up to date.
97
- 4. Make the connector documentation and its changelog is up to date (`docs/integrations/sources/convex.md`).
98
- 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
99
- 6. Pat yourself on the back for being an awesome contributor.
100
- 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
@@ -1,20 +0,0 @@
1
- integration_tests/__init__.py,sha256=4Hw-PX1-VgESLF16cDdvuYCzGJtHntThLF4qIiULWeo,61
2
- integration_tests/abnormal_state.json,sha256=9ANLAXetox-PHz_0mt6aveFTCgpWtvC92BCHbrK7BRc,241
3
- integration_tests/acceptance.py,sha256=VpQhkCpkovNSs0uLNh6iAdJ0WY-gYg0FbSBl8te_D0A,314
4
- integration_tests/configured_catalog.json,sha256=cfFZATj-vHwPipaN1NgOcUviOW6bSJY7tVSpfVRnSQ8,1638
5
- integration_tests/invalid_config.json,sha256=tapB54qKNtQsnlodcBFCPFVQIhwRIAQx6O9GNGpjuRs,85
6
- integration_tests/sample_config.json,sha256=cOt8KoaakpACs_TJeCKAh7UUNh2NkZQ9R1poSDs5rKE,108
7
- integration_tests/sample_state.json,sha256=5erVJh8kM_32y6_jZX7BROVOWiwbmTMsBez3Ap1VJbc,207
8
- source_convex/__init__.py,sha256=T8ZCfUI3YTwNnVbF2QrCbi0VvbltBZDfzIIQqs2FNi8,124
9
- source_convex/run.py,sha256=DhNog0Q8GE3ttxc-yYiuT-1NBe34IDG_Z78CvDDum9c,230
10
- source_convex/source.py,sha256=6G90cvi7pp6Xzu8akY5Vsuse62h8d1IcMJh2KCqITaA,9347
11
- source_convex/spec.yaml,sha256=B3bAQFg19q5SYtZJXg_X3Xzvh7zthE6Iyxvm5DwQUU0,616
12
- unit_tests/__init__.py,sha256=4Hw-PX1-VgESLF16cDdvuYCzGJtHntThLF4qIiULWeo,61
13
- unit_tests/test_incremental_streams.py,sha256=sanSrKu0LhoncH7GuBK-hIWDmcoo9qnd6lT-xxKcmnA,3499
14
- unit_tests/test_source.py,sha256=6ixrrOLhTeE7IVd1X6mSZxbmTH4mZ3CYUAShlo4yMXc,4111
15
- unit_tests/test_streams.py,sha256=KMzgksepqoh55F9LhxFrwV4pZbbBivJA0dspjQ1Wo2w,7485
16
- airbyte_source_convex-0.4.0.dist-info/METADATA,sha256=A1b33nI0QgYmHVfnvhawGnzMhGKO8wSJmK6t55gCHjA,5432
17
- airbyte_source_convex-0.4.0.dist-info/WHEEL,sha256=oiQVh_5PnQM0E3gPdiz09WCNmwiHDMaGer_elqB3coM,92
18
- airbyte_source_convex-0.4.0.dist-info/entry_points.txt,sha256=rF4lpMvFNATzw42FmmmmitdqI1c8FPnX-kyPdfHeS1w,56
19
- airbyte_source_convex-0.4.0.dist-info/top_level.txt,sha256=NnXtleRcJaGOVmXZDaejzvTZyWeKQkLuKtJ-AgFbt5Q,43
20
- airbyte_source_convex-0.4.0.dist-info/RECORD,,
@@ -1,2 +0,0 @@
1
- [console_scripts]
2
- source-convex = source_convex.run:run
@@ -1,3 +0,0 @@
1
- integration_tests
2
- source_convex
3
- unit_tests
@@ -1,3 +0,0 @@
1
- #
2
- # Copyright (c) 2023 Airbyte, Inc., all rights reserved.
3
- #
@@ -1,12 +0,0 @@
1
- {
2
- "posts": {
3
- "snapshot_cursor": "hi",
4
- "snapshot_has_more": false,
5
- "delta_cursor": 2652635567679741986
6
- },
7
- "users": {
8
- "snapshot_cursor": "hi",
9
- "snapshot_has_more": false,
10
- "delta_cursor": 2660025892355943945
11
- }
12
- }
@@ -1,14 +0,0 @@
1
- #
2
- # Copyright (c) 2023 Airbyte, Inc., all rights reserved.
3
- #
4
-
5
-
6
- import pytest
7
-
8
- pytest_plugins = ("connector_acceptance_test.plugin",)
9
-
10
-
11
- @pytest.fixture(scope="session", autouse=True)
12
- def connector_setup():
13
- """This fixture is a placeholder for external resources that acceptance test might require."""
14
- yield
@@ -1,56 +0,0 @@
1
- {
2
- "streams": [
3
- {
4
- "sync_mode": "incremental",
5
- "destination_sync_mode": "append",
6
- "stream": {
7
- "name": "posts",
8
- "json_schema": {
9
- "type": "object",
10
- "properties": {
11
- "_creationTime": { "type": "number" },
12
- "_id": {
13
- "type": "object",
14
- "properties": { "$id": { "type": "string" } }
15
- },
16
- "author": {
17
- "type": "object",
18
- "properties": { "$id": { "type": "string" } }
19
- },
20
- "body": { "type": "string" },
21
- "time": { "type": "number" },
22
- "_ts": { "type": "number" }
23
- }
24
- },
25
- "supported_sync_modes": ["full_refresh", "incremental"],
26
- "source_defined_cursor": true,
27
- "default_cursor_field": ["_ts"],
28
- "source_defined_primary_key": [["_id"]]
29
- }
30
- },
31
- {
32
- "sync_mode": "incremental",
33
- "destination_sync_mode": "append",
34
- "stream": {
35
- "name": "users",
36
- "json_schema": {
37
- "type": "object",
38
- "properties": {
39
- "_creationTime": { "type": "number" },
40
- "_id": {
41
- "type": "object",
42
- "properties": { "$id": { "type": "string" } }
43
- },
44
- "name": { "type": "string" },
45
- "tokenIdentifier": { "type": "string" },
46
- "_ts": { "type": "number" }
47
- }
48
- },
49
- "supported_sync_modes": ["full_refresh", "incremental"],
50
- "source_defined_cursor": true,
51
- "default_cursor_field": ["_ts"],
52
- "source_defined_primary_key": [["_id"]]
53
- }
54
- }
55
- ]
56
- }
@@ -1,4 +0,0 @@
1
- {
2
- "deployment_url": "https://murky-swan-635.convex.cloud",
3
- "access_key": "bad"
4
- }
@@ -1,4 +0,0 @@
1
- {
2
- "deployment_url": "https://descriptive-vulture-260.convex.cloud",
3
- "access_key": "Your access token"
4
- }
@@ -1,12 +0,0 @@
1
- {
2
- "posts": {
3
- "snapshot_cursor": "hi",
4
- "snapshot_has_more": false,
5
- "delta_cursor": 1
6
- },
7
- "users": {
8
- "snapshot_cursor": null,
9
- "snapshot_has_more": true,
10
- "delta_cursor": null
11
- }
12
- }
unit_tests/__init__.py DELETED
@@ -1,3 +0,0 @@
1
- #
2
- # Copyright (c) 2023 Airbyte, Inc., all rights reserved.
3
- #
@@ -1,88 +0,0 @@
1
- #
2
- # Copyright (c) 2023 Airbyte, Inc., all rights reserved.
3
- #
4
-
5
-
6
- from unittest.mock import MagicMock
7
-
8
- from airbyte_cdk.models import SyncMode
9
- from pytest import fixture
10
- from source_convex.source import ConvexStream
11
-
12
-
13
- @fixture
14
- def patch_incremental_base_class(mocker):
15
- # Mock abstract methods to enable instantiating abstract class
16
- mocker.patch.object(ConvexStream, "path", "v0/example_endpoint")
17
- mocker.patch.object(ConvexStream, "primary_key", "test_primary_key")
18
- mocker.patch.object(ConvexStream, "__abstractmethods__", set())
19
-
20
-
21
- def test_cursor_field(patch_incremental_base_class):
22
- stream = ConvexStream("murky-swan-635", "accesskey", "json", "messages", None)
23
- expected_cursor_field = "_ts"
24
- assert stream.cursor_field == expected_cursor_field
25
-
26
-
27
- def test_get_updated_state(patch_incremental_base_class):
28
- stream = ConvexStream("murky-swan-635", "accesskey", "json", "messages", None)
29
- resp = MagicMock()
30
- resp.json = lambda: {"values": [{"_id": "my_id", "field": "f", "_ts": 123}], "cursor": 1234, "snapshot": 3000, "hasMore": True}
31
- resp.status_code = 200
32
- stream.parse_response(resp, {})
33
- stream.next_page_token(resp)
34
- assert stream.get_updated_state(None, None) == {
35
- "snapshot_cursor": 1234,
36
- "snapshot_has_more": True,
37
- "delta_cursor": 3000,
38
- }
39
- resp.json = lambda: {"values": [{"_id": "my_id", "field": "f", "_ts": 1235}], "cursor": 1235, "snapshot": 3000, "hasMore": False}
40
- stream.parse_response(resp, {})
41
- stream.next_page_token(resp)
42
- assert stream.get_updated_state(None, None) == {
43
- "snapshot_cursor": 1235,
44
- "snapshot_has_more": False,
45
- "delta_cursor": 3000,
46
- }
47
- resp.json = lambda: {"values": [{"_id": "my_id", "field": "f", "_ts": 1235}], "cursor": 8000, "hasMore": True}
48
- stream.parse_response(resp, {})
49
- stream.next_page_token(resp)
50
- assert stream.get_updated_state(None, None) == {
51
- "snapshot_cursor": 1235,
52
- "snapshot_has_more": False,
53
- "delta_cursor": 8000,
54
- }
55
- assert stream._delta_has_more is True
56
- resp.json = lambda: {"values": [{"_id": "my_id", "field": "f", "_ts": 1235}], "cursor": 9000, "hasMore": False}
57
- stream.parse_response(resp, {})
58
- stream.next_page_token(resp)
59
- assert stream.get_updated_state(None, None) == {
60
- "snapshot_cursor": 1235,
61
- "snapshot_has_more": False,
62
- "delta_cursor": 9000,
63
- }
64
- assert stream._delta_has_more is False
65
-
66
-
67
- def test_stream_slices(patch_incremental_base_class):
68
- stream = ConvexStream("murky-swan-635", "accesskey", "json", "messages", None)
69
- inputs = {"sync_mode": SyncMode.incremental, "cursor_field": [], "stream_state": {}}
70
- expected_stream_slice = [None]
71
- assert stream.stream_slices(**inputs) == expected_stream_slice
72
-
73
-
74
- def test_supports_incremental(patch_incremental_base_class, mocker):
75
- mocker.patch.object(ConvexStream, "cursor_field", "dummy_field")
76
- stream = ConvexStream("murky-swan-635", "accesskey", "json", "messages", None)
77
- assert stream.supports_incremental
78
-
79
-
80
- def test_source_defined_cursor(patch_incremental_base_class):
81
- stream = ConvexStream("murky-swan-635", "accesskey", "json", "messages", None)
82
- assert stream.source_defined_cursor
83
-
84
-
85
- def test_stream_checkpoint_interval(patch_incremental_base_class):
86
- stream = ConvexStream("murky-swan-635", "accesskey", "json", "messages", None)
87
- expected_checkpoint_interval = 128
88
- assert stream.state_checkpoint_interval == expected_checkpoint_interval
unit_tests/test_source.py DELETED
@@ -1,115 +0,0 @@
1
- #
2
- # Copyright (c) 2023 Airbyte, Inc., all rights reserved.
3
- #
4
-
5
- from unittest.mock import MagicMock
6
-
7
- import responses
8
- from source_convex.source import SourceConvex
9
-
10
-
11
- def setup_responses():
12
- sample_shapes_resp = {
13
- "posts": {
14
- "type": "object",
15
- "properties": {
16
- "_creationTime": {"type": "number"},
17
- "_id": {"$description": "Id(posts)", "type": "object", "properties": {"$id": {"type": "string"}}},
18
- "author": {"$description": "Id(users)", "type": "object", "properties": {"$id": {"type": "string"}}},
19
- "body": {"type": "string"},
20
- "_ts": {"type": "integer"},
21
- "_deleted": {"type": "boolean"},
22
- },
23
- "$schema": "http://json-schema.org/draft-07/schema#",
24
- },
25
- "users": {
26
- "type": "object",
27
- "properties": {
28
- "_creationTime": {"type": "number"},
29
- "_id": {"$description": "Id(users)", "type": "object", "properties": {"$id": {"type": "string"}}},
30
- "name": {"type": "string"},
31
- "tokenIdentifier": {"type": "string"},
32
- "_ts": {"type": "integer"},
33
- "_deleted": {"type": "boolean"},
34
- },
35
- "$schema": "http://json-schema.org/draft-07/schema#",
36
- },
37
- }
38
- responses.add(
39
- responses.GET,
40
- "https://murky-swan-635.convex.cloud/api/json_schemas?deltaSchema=true&format=json",
41
- json=sample_shapes_resp,
42
- )
43
- responses.add(
44
- responses.GET,
45
- "https://curious-giraffe-964.convex.cloud/api/json_schemas?deltaSchema=true&format=json",
46
- json={"code": "Error code", "message": "Error message"},
47
- status=400,
48
- )
49
-
50
-
51
- @responses.activate
52
- def test_check_connection(mocker):
53
- setup_responses()
54
- source = SourceConvex()
55
- logger_mock = MagicMock()
56
- assert source.check_connection(
57
- logger_mock,
58
- {
59
- "deployment_url": "https://murky-swan-635.convex.cloud",
60
- "access_key": "test_api_key",
61
- },
62
- ) == (True, None)
63
-
64
-
65
- @responses.activate
66
- def test_check_bad_connection(mocker):
67
- setup_responses()
68
- source = SourceConvex()
69
- logger_mock = MagicMock()
70
- assert source.check_connection(
71
- logger_mock,
72
- {
73
- "deployment_url": "https://curious-giraffe-964.convex.cloud",
74
- "access_key": "test_api_key",
75
- },
76
- ) == (False, "Connection to Convex via json_schemas endpoint failed: 400: Error code: Error message")
77
-
78
-
79
- @responses.activate
80
- def test_streams(mocker):
81
- setup_responses()
82
- source = SourceConvex()
83
- streams = source.streams(
84
- {
85
- "deployment_url": "https://murky-swan-635.convex.cloud",
86
- "access_key": "test_api_key",
87
- }
88
- )
89
- assert len(streams) == 2
90
- streams.sort(key=lambda stream: stream.table_name)
91
- assert streams[0].table_name == "posts"
92
- assert streams[1].table_name == "users"
93
- assert all(stream.deployment_url == "https://murky-swan-635.convex.cloud" for stream in streams)
94
- assert all(stream._session.auth.get_auth_header() == {"Authorization": "Convex test_api_key"} for stream in streams)
95
- shapes = [stream.get_json_schema() for stream in streams]
96
- assert all(shape["type"] == "object" for shape in shapes)
97
- properties = [shape["properties"] for shape in shapes]
98
- assert [
99
- props["_id"]
100
- == {
101
- "type": "object",
102
- "properties": {
103
- "$id": {"type": "string"},
104
- },
105
- }
106
- for props in properties
107
- ]
108
- assert [props["_ts"] == {"type": "number"} for props in properties]
109
- assert [props["_creationTime"] == {"type": "number"} for props in properties]
110
- assert set(properties[0].keys()) == set(
111
- ["_id", "_ts", "_deleted", "_creationTime", "author", "body", "_ab_cdc_lsn", "_ab_cdc_updated_at", "_ab_cdc_deleted_at"]
112
- )
113
- assert set(properties[1].keys()) == set(
114
- ["_id", "_ts", "_deleted", "_creationTime", "name", "tokenIdentifier", "_ab_cdc_lsn", "_ab_cdc_updated_at", "_ab_cdc_deleted_at"]
115
- )
@@ -1,169 +0,0 @@
1
- #
2
- # Copyright (c) 2023 Airbyte, Inc., all rights reserved.
3
- #
4
-
5
- from http import HTTPStatus
6
- from unittest.mock import MagicMock
7
-
8
- import pytest
9
- import requests
10
- import responses
11
- from airbyte_cdk.models import SyncMode
12
- from source_convex.source import ConvexStream
13
-
14
-
15
- @pytest.fixture
16
- def patch_base_class(mocker):
17
- # Mock abstract methods to enable instantiating abstract class
18
- mocker.patch.object(ConvexStream, "primary_key", "test_primary_key")
19
- mocker.patch.object(ConvexStream, "__abstractmethods__", set())
20
-
21
-
22
- def test_request_params(patch_base_class):
23
- stream = ConvexStream("murky-swan-635", "accesskey", "json", "messages", None)
24
- inputs = {"stream_slice": None, "stream_state": None, "next_page_token": None}
25
- expected_params = {"tableName": "messages", "format": "json"}
26
- assert stream.request_params(**inputs) == expected_params
27
- stream._snapshot_cursor_value = 1234
28
- expected_params = {"tableName": "messages", "format": "json", "cursor": 1234}
29
- assert stream.request_params(**inputs) == expected_params
30
- stream._snapshot_has_more = False
31
- stream._delta_cursor_value = 2345
32
- expected_params = {"tableName": "messages", "format": "json", "cursor": 2345}
33
- assert stream.request_params(**inputs) == expected_params
34
-
35
-
36
- def test_next_page_token(patch_base_class):
37
- stream = ConvexStream("murky-swan-635", "accesskey", "json", "messages", None)
38
- resp = MagicMock()
39
- resp.json = lambda: {"values": [{"_id": "my_id", "field": "f", "_ts": 123}], "cursor": 1234, "snapshot": 5000, "hasMore": True}
40
- resp.status_code = 200
41
- stream.parse_response(resp, {})
42
- assert stream.next_page_token(resp) == {
43
- "snapshot_cursor": 1234,
44
- "snapshot_has_more": True,
45
- "delta_cursor": 5000,
46
- }
47
- resp.json = lambda: {"values": [{"_id": "my_id", "field": "f", "_ts": 1235}], "cursor": 1235, "snapshot": 5000, "hasMore": False}
48
- stream.parse_response(resp, {})
49
- assert stream.next_page_token(resp) == {
50
- "snapshot_cursor": 1235,
51
- "snapshot_has_more": False,
52
- "delta_cursor": 5000,
53
- }
54
- resp.json = lambda: {"values": [{"_id": "my_id", "field": "f", "_ts": 1235}], "cursor": 6000, "hasMore": True}
55
- stream.parse_response(resp, {})
56
- assert stream.next_page_token(resp) == {
57
- "snapshot_cursor": 1235,
58
- "snapshot_has_more": False,
59
- "delta_cursor": 6000,
60
- }
61
- resp.json = lambda: {"values": [{"_id": "my_id", "field": "f", "_ts": 1235}], "cursor": 7000, "hasMore": False}
62
- stream.parse_response(resp, {})
63
- assert stream.next_page_token(resp) is None
64
- assert stream.state == {"snapshot_cursor": 1235, "snapshot_has_more": False, "delta_cursor": 7000}
65
-
66
-
67
- @responses.activate
68
- def test_read_records_full_refresh(patch_base_class):
69
- stream = ConvexStream("http://mocked_base_url:8080", "accesskey", "json", "messages", None)
70
- snapshot0_resp = {"values": [{"_id": "my_id", "field": "f", "_ts": 123}], "cursor": 1234, "snapshot": 5000, "hasMore": True}
71
- responses.add(
72
- responses.GET,
73
- "http://mocked_base_url:8080/api/list_snapshot?tableName=messages&format=json",
74
- json=snapshot0_resp,
75
- )
76
- snapshot1_resp = {"values": [{"_id": "an_id", "field": "b", "_ts": 100}], "cursor": 2345, "snapshot": 5000, "hasMore": True}
77
- responses.add(
78
- responses.GET,
79
- "http://mocked_base_url:8080/api/list_snapshot?tableName=messages&format=json&cursor=1234&snapshot=5000",
80
- json=snapshot1_resp,
81
- )
82
- snapshot2_resp = {"values": [{"_id": "a_id", "field": "x", "_ts": 300}], "cursor": 3456, "snapshot": 5000, "hasMore": False}
83
- responses.add(
84
- responses.GET,
85
- "http://mocked_base_url:8080/api/list_snapshot?tableName=messages&format=json&cursor=2345&snapshot=5000",
86
- json=snapshot2_resp,
87
- )
88
- records = list(stream.read_records(SyncMode.full_refresh))
89
- assert len(records) == 3
90
- assert [record["field"] for record in records] == ["f", "b", "x"]
91
- assert stream.state == {"delta_cursor": 5000, "snapshot_cursor": 3456, "snapshot_has_more": False}
92
-
93
-
94
- @responses.activate
95
- def test_read_records_incremental(patch_base_class):
96
- stream = ConvexStream("http://mocked_base_url:8080", "accesskey", "json", "messages", None)
97
- snapshot0_resp = {"values": [{"_id": "my_id", "field": "f", "_ts": 123}], "cursor": 1234, "snapshot": 5000, "hasMore": True}
98
- responses.add(
99
- responses.GET,
100
- "http://mocked_base_url:8080/api/list_snapshot?tableName=messages&format=json",
101
- json=snapshot0_resp,
102
- )
103
- snapshot1_resp = {"values": [{"_id": "an_id", "field": "b", "_ts": 100}], "cursor": 2345, "snapshot": 5000, "hasMore": False}
104
- responses.add(
105
- responses.GET,
106
- "http://mocked_base_url:8080/api/list_snapshot?tableName=messages&format=json&cursor=1234&snapshot=5000",
107
- json=snapshot1_resp,
108
- )
109
- delta0_resp = {"values": [{"_id": "a_id", "field": "x", "_ts": 300}], "cursor": 6000, "hasMore": True}
110
- responses.add(
111
- responses.GET,
112
- "http://mocked_base_url:8080/api/document_deltas?tableName=messages&format=json&cursor=5000",
113
- json=delta0_resp,
114
- )
115
- delta1_resp = {"values": [{"_id": "a_id", "field": "x", "_ts": 400}], "cursor": 7000, "hasMore": False}
116
- responses.add(
117
- responses.GET,
118
- "http://mocked_base_url:8080/api/document_deltas?tableName=messages&format=json&cursor=6000",
119
- json=delta1_resp,
120
- )
121
- records = list(stream.read_records(SyncMode.incremental))
122
- assert len(records) == 4
123
- assert [record["field"] for record in records] == ["f", "b", "x", "x"]
124
- assert stream.state == {"delta_cursor": 7000, "snapshot_cursor": 2345, "snapshot_has_more": False}
125
-
126
-
127
- def test_parse_response(patch_base_class):
128
- stream = ConvexStream("murky-swan-635", "accesskey", "json", "messages", None)
129
- resp = MagicMock()
130
- resp.json = lambda: {"values": [{"_id": "my_id", "field": "f", "_ts": 1234}], "cursor": 1234, "snapshot": 2000, "hasMore": True}
131
- resp.status_code = 200
132
- inputs = {"response": resp, "stream_state": {}}
133
- expected_parsed_objects = [{"_id": "my_id", "field": "f", "_ts": 1234}]
134
- assert stream.parse_response(**inputs) == expected_parsed_objects
135
-
136
-
137
- def test_request_headers(patch_base_class):
138
- stream = ConvexStream("murky-swan-635", "accesskey", "json", "messages", None)
139
- inputs = {"stream_slice": None, "stream_state": None, "next_page_token": None}
140
- assert stream.request_headers(**inputs) == {"Convex-Client": "airbyte-export-0.4.0"}
141
-
142
-
143
- def test_http_method(patch_base_class):
144
- stream = ConvexStream("murky-swan-635", "accesskey", "json", "messages", None)
145
- expected_method = "GET"
146
- assert stream.http_method == expected_method
147
-
148
-
149
- @pytest.mark.parametrize(
150
- ("http_status", "should_retry"),
151
- [
152
- (HTTPStatus.OK, False),
153
- (HTTPStatus.BAD_REQUEST, False),
154
- (HTTPStatus.TOO_MANY_REQUESTS, True),
155
- (HTTPStatus.INTERNAL_SERVER_ERROR, True),
156
- ],
157
- )
158
- def test_should_retry(patch_base_class, http_status, should_retry):
159
- response_mock = MagicMock()
160
- response_mock.status_code = http_status
161
- stream = ConvexStream("murky-swan-635", "accesskey", "json", "messages", None)
162
- assert stream.should_retry(response_mock) == should_retry
163
-
164
-
165
- def test_backoff_time(patch_base_class):
166
- response_mock = MagicMock()
167
- stream = ConvexStream("murky-swan-635", "accesskey", "json", "messages", None)
168
- expected_backoff_time = None
169
- assert stream.backoff_time(response_mock) == expected_backoff_time