feldera 0.129.0__tar.gz → 0.131.0__tar.gz
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Potentially problematic release.
- {feldera-0.129.0 → feldera-0.131.0}/PKG-INFO +7 -68
- feldera-0.131.0/README.md +79 -0
- {feldera-0.129.0 → feldera-0.131.0}/feldera.egg-info/PKG-INFO +7 -68
- {feldera-0.129.0 → feldera-0.131.0}/feldera.egg-info/SOURCES.txt +1 -6
- {feldera-0.129.0 → feldera-0.131.0}/pyproject.toml +1 -1
- feldera-0.131.0/tests/test_uda.py +283 -0
- feldera-0.129.0/README.md +0 -140
- feldera-0.129.0/tests/test_checkpoint_sync.py +0 -320
- feldera-0.129.0/tests/test_issue4457.py +0 -57
- feldera-0.129.0/tests/test_pipeline_builder.py +0 -53
- feldera-0.129.0/tests/test_shared_pipeline.py +0 -647
- feldera-0.129.0/tests/test_shared_pipeline_stress.py +0 -26
- feldera-0.129.0/tests/test_udf.py +0 -311
- {feldera-0.129.0 → feldera-0.131.0}/feldera/__init__.py +0 -0
- {feldera-0.129.0 → feldera-0.131.0}/feldera/_callback_runner.py +0 -0
- {feldera-0.129.0 → feldera-0.131.0}/feldera/_helpers.py +0 -0
- {feldera-0.129.0 → feldera-0.131.0}/feldera/enums.py +0 -0
- {feldera-0.129.0 → feldera-0.131.0}/feldera/output_handler.py +0 -0
- {feldera-0.129.0 → feldera-0.131.0}/feldera/pipeline.py +0 -0
- {feldera-0.129.0 → feldera-0.131.0}/feldera/pipeline_builder.py +0 -0
- {feldera-0.129.0 → feldera-0.131.0}/feldera/rest/__init__.py +0 -0
- {feldera-0.129.0 → feldera-0.131.0}/feldera/rest/_helpers.py +0 -0
- {feldera-0.129.0 → feldera-0.131.0}/feldera/rest/_httprequests.py +0 -0
- {feldera-0.129.0 → feldera-0.131.0}/feldera/rest/config.py +0 -0
- {feldera-0.129.0 → feldera-0.131.0}/feldera/rest/errors.py +0 -0
- {feldera-0.129.0 → feldera-0.131.0}/feldera/rest/feldera_client.py +0 -0
- {feldera-0.129.0 → feldera-0.131.0}/feldera/rest/feldera_config.py +0 -0
- {feldera-0.129.0 → feldera-0.131.0}/feldera/rest/pipeline.py +0 -0
- {feldera-0.129.0 → feldera-0.131.0}/feldera/rest/sql_table.py +0 -0
- {feldera-0.129.0 → feldera-0.131.0}/feldera/rest/sql_view.py +0 -0
- {feldera-0.129.0 → feldera-0.131.0}/feldera/runtime_config.py +0 -0
- {feldera-0.129.0 → feldera-0.131.0}/feldera/stats.py +0 -0
- {feldera-0.129.0 → feldera-0.131.0}/feldera.egg-info/dependency_links.txt +0 -0
- {feldera-0.129.0 → feldera-0.131.0}/feldera.egg-info/requires.txt +0 -0
- {feldera-0.129.0 → feldera-0.131.0}/feldera.egg-info/top_level.txt +0 -0
- {feldera-0.129.0 → feldera-0.131.0}/setup.cfg +0 -0
{feldera-0.129.0 → feldera-0.131.0}/PKG-INFO:

````diff
@@ -1,6 +1,6 @@
 Metadata-Version: 2.4
 Name: feldera
-Version: 0.
+Version: 0.131.0
 Summary: The feldera python client
 Author-email: Feldera Team <dev@feldera.com>
 License: MIT
@@ -59,6 +59,12 @@ source .venv/activate
 uv pip install .
 ```
 
+You also have to install the `pytest` module:
+
+```bash
+python3 -m pip install pytest
+```
+
 ## Documentation
 
 The Python SDK documentation is available at
@@ -78,73 +84,6 @@ make html
 
 To clean the build, run `make clean`.
 
-## Testing
-
-To run unit tests:
-
-```bash
-cd python && python3 -m pytest tests/
-```
-
-- This will detect and run all test files that match the pattern `test_*.py` or
-  `*_test.py`.
-- By default, the tests expect a running Feldera instance at `http://localhost:8080`.
-  To override the default endpoint, set the `FELDERA_HOST` environment variable.
-
-To run tests from a specific file:
-
-```bash
-(cd python && python3 -m pytest ./tests/path-to-file.py)
-```
-
-To run a specific test:
-
-```bash
-uv run python -m pytest tests/test_shared_pipeline.py::TestPipeline::test_adhoc_query_hash -v
-```
-
-#### Running All Tests
-
-The tests validate end-to-end correctness of SQL functionality. To
-run the tests use:
-
-```bash
-cd python
-PYTHONPATH=`pwd` ./tests/run-all-tests.sh
-```
-
-### Reducing Compilation Cycles
-
-To reduce redundant compilation cycles during testing:
-
-* **Inherit from `SharedTestPipeline`** instead of `unittest.TestCase`.
-* **Define DDLs** (e.g., `CREATE TABLE`, `CREATE VIEW`) in the **docstring** of each test method.
-* All DDLs from all test functions in the class are combined and compiled into a single pipeline.
-* If a table or view is already defined in one test, it can be used directly in others without redefinition.
-* Ensure that all table and view names are unique within the class.
-* Use `@enterprise_only` on tests that require Enterprise features. Their DDLs will be skipped on OSS builds.
-* Use `self.set_runtime_config(...)` to override the default pipeline config.
-* Reset it at the end using `self.reset_runtime_config()`.
-* Access the shared pipeline via `self.pipeline`.
-
-#### Example
-
-```python
-from tests.shared_test_pipeline import SharedTestPipeline
-
-class TestAverage(SharedTestPipeline):
-    def test_average(self):
-        """
-        CREATE TABLE students(id INT, name STRING);
-        CREATE MATERIALIZED VIEW v AS SELECT * FROM students;
-        """
-        ...
-        self.pipeline.start()
-        self.pipeline.input_pandas("students", df)
-        self.pipeline.wait_for_completion(True)
-        ...
-```
-
 ## Linting and formatting
 
 Use [Ruff] to run the lint checks that will be executed by the
````
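The removed Testing section describes how `SharedTestPipeline` combines DDLs from the docstrings of all test methods into a single compiled pipeline. A minimal sketch of that collection scheme, assuming nothing about the SDK's real implementation (`collect_ddls` and `FakeSharedPipeline` below are hypothetical illustration names):

```python
import inspect

# Hypothetical stand-in for a SharedTestPipeline subclass: each test method
# carries its DDLs in the docstring, as the removed Testing section describes.
class FakeSharedPipeline:
    def test_a(self):
        """CREATE TABLE students(id INT, name STRING);"""

    def test_b(self):
        """CREATE MATERIALIZED VIEW v AS SELECT * FROM students;"""


def collect_ddls(cls) -> str:
    """Concatenate the docstrings of all test_* methods into one SQL program."""
    ddls = [
        inspect.getdoc(fn)
        for name, fn in vars(cls).items()
        if name.startswith("test_") and inspect.getdoc(fn)
    ]
    return "\n".join(ddls)


program = collect_ddls(FakeSharedPipeline)
assert "CREATE TABLE students" in program
assert "CREATE MATERIALIZED VIEW v" in program
```

Because every docstring lands in the same program, table and view names must be unique across the class, which is exactly what the removed section warns about.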
feldera-0.131.0/README.md (new file):

````diff
@@ -0,0 +1,79 @@
+# Feldera Python SDK
+
+Feldera Python is the Feldera SDK for Python developers.
+
+## Installation
+
+```bash
+uv pip install feldera
+```
+
+### Installing from Github
+
+```bash
+uv pip install git+https://github.com/feldera/feldera#subdirectory=python
+```
+
+Similarly, to install from a specific branch:
+
+```bash
+uv pip install git+https://github.com/feldera/feldera@{BRANCH_NAME}#subdirectory=python
+```
+
+Replace `{BRANCH_NAME}` with the name of the branch you want to install from.
+
+### Installing from Local Directory
+
+If you have cloned the Feldera repo, you can install the python SDK as follows:
+
+```bash
+# the Feldera Python SDK is present inside the python/ directory
+cd python
+# If you don't have a virtual environment, create one
+uv venv
+source .venv/activate
+# Install the SDK in editable mode
+uv pip install .
+```
+
+You also have to install the `pytest` module:
+
+```bash
+python3 -m pip install pytest
+```
+
+## Documentation
+
+The Python SDK documentation is available at
+[Feldera Python SDK Docs](https://docs.feldera.com/python).
+
+To build the html documentation run:
+
+Ensure that you have sphinx installed. If not, install it using `uv pip install sphinx`.
+
+Then run the following commands:
+
+```bash
+cd docs
+sphinx-apidoc -o . ../feldera
+make html
+```
+
+To clean the build, run `make clean`.
+
+## Linting and formatting
+
+Use [Ruff] to run the lint checks that will be executed by the
+precommit hook when a PR is submitted:
+
+```bash
+ruff check python/
+```
+
+To reformat the code in the same way as the precommit hook:
+
+```bash
+ruff format
+```
+
+[Ruff]: https://github.com/astral-sh/ruff
````
@@ -1,6 +1,6 @@
|
|
|
1
1
|
Metadata-Version: 2.4
|
|
2
2
|
Name: feldera
|
|
3
|
-
Version: 0.
|
|
3
|
+
Version: 0.131.0
|
|
4
4
|
Summary: The feldera python client
|
|
5
5
|
Author-email: Feldera Team <dev@feldera.com>
|
|
6
6
|
License: MIT
|
|
@@ -59,6 +59,12 @@ source .venv/activate
|
|
|
59
59
|
uv pip install .
|
|
60
60
|
```
|
|
61
61
|
|
|
62
|
+
You also have to install the `pytest` module:
|
|
63
|
+
|
|
64
|
+
```bash
|
|
65
|
+
python3 -m pip install pytest
|
|
66
|
+
```
|
|
67
|
+
|
|
62
68
|
## Documentation
|
|
63
69
|
|
|
64
70
|
The Python SDK documentation is available at
|
|
@@ -78,73 +84,6 @@ make html
|
|
|
78
84
|
|
|
79
85
|
To clean the build, run `make clean`.
|
|
80
86
|
|
|
81
|
-
## Testing
|
|
82
|
-
|
|
83
|
-
To run unit tests:
|
|
84
|
-
|
|
85
|
-
```bash
|
|
86
|
-
cd python && python3 -m pytest tests/
|
|
87
|
-
```
|
|
88
|
-
|
|
89
|
-
- This will detect and run all test files that match the pattern `test_*.py` or
|
|
90
|
-
`*_test.py`.
|
|
91
|
-
- By default, the tests expect a running Feldera instance at `http://localhost:8080`.
|
|
92
|
-
To override the default endpoint, set the `FELDERA_HOST` environment variable.
|
|
93
|
-
|
|
94
|
-
To run tests from a specific file:
|
|
95
|
-
|
|
96
|
-
```bash
|
|
97
|
-
(cd python && python3 -m pytest ./tests/path-to-file.py)
|
|
98
|
-
```
|
|
99
|
-
|
|
100
|
-
To run a specific test:
|
|
101
|
-
|
|
102
|
-
```bash
|
|
103
|
-
uv run python -m pytest tests/test_shared_pipeline.py::TestPipeline::test_adhoc_query_hash -v
|
|
104
|
-
```
|
|
105
|
-
|
|
106
|
-
#### Running All Tests
|
|
107
|
-
|
|
108
|
-
The tests validate end-to-end correctness of SQL functionality. To
|
|
109
|
-
run the tests use:
|
|
110
|
-
|
|
111
|
-
```bash
|
|
112
|
-
cd python
|
|
113
|
-
PYTHONPATH=`pwd` ./tests/run-all-tests.sh
|
|
114
|
-
```
|
|
115
|
-
|
|
116
|
-
### Reducing Compilation Cycles
|
|
117
|
-
|
|
118
|
-
To reduce redundant compilation cycles during testing:
|
|
119
|
-
|
|
120
|
-
* **Inherit from `SharedTestPipeline`** instead of `unittest.TestCase`.
|
|
121
|
-
* **Define DDLs** (e.g., `CREATE TABLE`, `CREATE VIEW`) in the **docstring** of each test method.
|
|
122
|
-
* All DDLs from all test functions in the class are combined and compiled into a single pipeline.
|
|
123
|
-
* If a table or view is already defined in one test, it can be used directly in others without redefinition.
|
|
124
|
-
* Ensure that all table and view names are unique within the class.
|
|
125
|
-
* Use `@enterprise_only` on tests that require Enterprise features. Their DDLs will be skipped on OSS builds.
|
|
126
|
-
* Use `self.set_runtime_config(...)` to override the default pipeline config.
|
|
127
|
-
* Reset it at the end using `self.reset_runtime_config()`.
|
|
128
|
-
* Access the shared pipeline via `self.pipeline`.
|
|
129
|
-
|
|
130
|
-
#### Example
|
|
131
|
-
|
|
132
|
-
```python
|
|
133
|
-
from tests.shared_test_pipeline import SharedTestPipeline
|
|
134
|
-
|
|
135
|
-
class TestAverage(SharedTestPipeline):
|
|
136
|
-
def test_average(self):
|
|
137
|
-
"""
|
|
138
|
-
CREATE TABLE students(id INT, name STRING);
|
|
139
|
-
CREATE MATERIALIZED VIEW v AS SELECT * FROM students;
|
|
140
|
-
"""
|
|
141
|
-
...
|
|
142
|
-
self.pipeline.start()
|
|
143
|
-
self.pipeline.input_pandas("students", df)
|
|
144
|
-
self.pipeline.wait_for_completion(True)
|
|
145
|
-
...
|
|
146
|
-
```
|
|
147
|
-
|
|
148
87
|
## Linting and formatting
|
|
149
88
|
|
|
150
89
|
Use [Ruff] to run the lint checks that will be executed by the
|
|
{feldera-0.129.0 → feldera-0.131.0}/feldera.egg-info/SOURCES.txt:

````diff
@@ -24,9 +24,4 @@ feldera/rest/feldera_config.py
 feldera/rest/pipeline.py
 feldera/rest/sql_table.py
 feldera/rest/sql_view.py
-tests/
-tests/test_issue4457.py
-tests/test_pipeline_builder.py
-tests/test_shared_pipeline.py
-tests/test_shared_pipeline_stress.py
-tests/test_udf.py
+tests/test_uda.py
````
feldera-0.131.0/tests/test_uda.py (new file):

````diff
@@ -0,0 +1,283 @@
+import unittest
+
+from feldera import PipelineBuilder
+from tests import TEST_CLIENT
+
+
+class TestUDA(unittest.TestCase):
+    def test_local(self):
+        sql = """
+        CREATE LINEAR AGGREGATE I128_SUM(s BINARY(16)) RETURNS BINARY(16);
+        CREATE TABLE T(x BINARY(16));
+        CREATE MATERIALIZED VIEW V AS SELECT I128_SUM(x) AS S, COUNT(*) AS C FROM T;
+        """
+
+        toml = """
+        i256 = { version = "0.2.2", features = ["num-traits"] }
+        num-traits = "0.2.19"
+        """
+
+        udfs = """
+use i256::I256;
+use feldera_sqllib::*;
+use crate::{AddAssignByRef, AddByRef, HasZero, MulByRef, SizeOf, Tup3};
+use derive_more::Add;
+use num_traits::Zero;
+use rkyv::Fallible;
+use std::ops::{Add, AddAssign};
+
+#[derive(Add, Clone, Debug, Default, PartialOrd, Ord, Eq, PartialEq, Hash)]
+pub struct I256Wrapper {
+    pub data: I256,
+}
+
+impl SizeOf for I256Wrapper {
+    fn size_of_children(&self, context: &mut size_of::Context) {}
+}
+
+impl From<[u8; 32]> for I256Wrapper {
+    fn from(value: [u8; 32]) -> Self {
+        Self { data: I256::from_be_bytes(value) }
+    }
+}
+
+impl From<&[u8]> for I256Wrapper {
+    fn from(value: &[u8]) -> Self {
+        let mut padded = [0u8; 32];
+        // If original value is negative, pad with sign
+        if value[0] & 0x80 != 0 {
+            padded.fill(0xff);
+        }
+        let len = value.len();
+        if len > 32 {
+            panic!("Slice larger than target");
+        }
+        padded[32-len..].copy_from_slice(&value[..len]);
+        Self { data: I256::from_be_bytes(padded) }
+    }
+}
+
+impl MulByRef<Weight> for I256Wrapper {
+    type Output = Self;
+
+    fn mul_by_ref(&self, other: &Weight) -> Self::Output {
+        println!("Mul {:?} by {}", self, other);
+        Self {
+            data: self.data.checked_mul_i64(*other)
+                .expect("Overflow during multiplication"),
+        }
+    }
+}
+
+impl HasZero for I256Wrapper {
+    fn zero() -> Self {
+        Self { data: I256::zero() }
+    }
+
+    fn is_zero(&self) -> bool {
+        self.data.is_zero()
+    }
+}
+
+impl AddByRef for I256Wrapper {
+    fn add_by_ref(&self, other: &Self) -> Self {
+        Self { data: self.data.add(other.data) }
+    }
+}
+
+impl AddAssignByRef<Self> for I256Wrapper {
+    fn add_assign_by_ref(&mut self, other: &Self) {
+        self.data += other.data
+    }
+}
+
+#[repr(C)]
+#[derive(Debug, Copy, Clone, PartialOrd, Ord, Eq, PartialEq)]
+pub struct ArchivedI256Wrapper {
+    pub bytes: [u8; 32],
+}
+
+impl rkyv::Archive for I256Wrapper {
+    type Archived = ArchivedI256Wrapper;
+    type Resolver = ();
+
+    #[inline]
+    unsafe fn resolve(&self, pos: usize, _: Self::Resolver, out: *mut Self::Archived) {
+        out.write(ArchivedI256Wrapper {
+            bytes: self.data.to_be_bytes(),
+        });
+    }
+}
+
+impl<S: Fallible + ?Sized> rkyv::Serialize<S> for I256Wrapper {
+    #[inline]
+    fn serialize(&self, serializer: &mut S) -> Result<Self::Resolver, S::Error> {
+        Ok(())
+    }
+}
+
+impl<D: Fallible + ?Sized> rkyv::Deserialize<I256Wrapper, D> for ArchivedI256Wrapper {
+    #[inline]
+    fn deserialize(&self, _: &mut D) -> Result<I256Wrapper, D::Error> {
+        Ok(I256Wrapper::from(self.bytes))
+    }
+}
+
+pub type i128_sum_accumulator_type = Tup3<I256Wrapper, i64, i64>;
+
+pub fn i128_sum_map(val: Option<ByteArray>) -> i128_sum_accumulator_type {
+    match val {
+        None => Tup3::new(I256Wrapper::zero(), 0, 1),
+        Some(val) => Tup3::new(
+            I256Wrapper::from(val.as_slice()),
+            1,
+            1,
+        ),
+    }
+}
+
+pub fn i128_sum_post(val: i128_sum_accumulator_type) -> Option<ByteArray> {
+    if val.1 == 0 {
+        None
+    } else {
+        // Check for overflow
+        if val.0.data < I256::from(i128::MIN) || val.0.data > I256::from(i128::MAX) {
+            panic!("Result of aggregation {} does not fit in 128 bits", val.0.data);
+        }
+        Some(ByteArray::new(&val.0.data.to_be_bytes()[16..]))
+    }
+}
+"""
+
+        pipeline = PipelineBuilder(
+            TEST_CLIENT, name="test_uda", sql=sql, udf_rust=udfs, udf_toml=toml
+        ).create_or_replace()
+
+        pipeline.start()
+        pipeline.input_json(
+            "t",
+            [
+                {
+                    "insert": {
+                        "x": [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1],
+                    }
+                }
+            ],
+            update_format="insert_delete",
+        )
+        pipeline.wait_for_idle()
+        output = list(pipeline.query("SELECT * FROM V;"))
+        assert output == [{"s": "00000000000000000000000000000001", "c": 1}]
+
+        # Insert -1
+        pipeline.input_json(
+            "t",
+            [
+                {
+                    "insert": {
+                        "x": [
+                            255,
+                            255,
+                            255,
+                            255,
+                            255,
+                            255,
+                            255,
+                            255,
+                            255,
+                            255,
+                            255,
+                            255,
+                            255,
+                            255,
+                            255,
+                            255,
+                        ],
+                    }
+                }
+            ],
+            update_format="insert_delete",
+        )
+        pipeline.wait_for_idle()
+        output = list(pipeline.query("SELECT * FROM V;"))
+        assert output == [{"s": "00000000000000000000000000000000", "c": 2}]
+
+        pipeline.input_json(
+            "t",
+            [
+                {
+                    "insert": {
+                        "x": [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2],
+                    }
+                }
+            ],
+            update_format="insert_delete",
+        )
+        output = list(pipeline.query("SELECT * FROM V;"))
+        assert output == [{"s": "00000000000000000000000000000002", "c": 3}]
+
+        pipeline.input_json(
+            "t",
+            [
+                {
+                    "insert": {
+                        "x": [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 3],
+                    }
+                }
+            ],
+            update_format="insert_delete",
+        )
+        output = list(pipeline.query("SELECT * FROM V;"))
+        assert output == [{"s": "00000000000000000000000000000005", "c": 4}]
+
+        pipeline.input_json(
+            "t",
+            [
+                {
+                    "delete": {
+                        "x": [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1],
+                    }
+                },
+                {
+                    "delete": {
+                        "x": [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2],
+                    }
+                },
+                {
+                    "delete": {
+                        "x": [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 3],
+                    }
+                },
+                {
+                    "delete": {
+                        "x": [
+                            255,
+                            255,
+                            255,
+                            255,
+                            255,
+                            255,
+                            255,
+                            255,
+                            255,
+                            255,
+                            255,
+                            255,
+                            255,
+                            255,
+                            255,
+                            1,
+                        ],
+                    }
+                },
+            ],
+            update_format="insert_delete",
+        )
+        output = list(pipeline.query("SELECT * FROM V;"))
+        assert output == [{"s": None, "c": 0}]
+
+        pipeline.stop(force=True)
+
+
+if __name__ == "__main__":
+    unittest.main()
````
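The new `tests/test_uda.py` feeds the aggregate `BINARY(16)` values written as 16-element byte lists and asserts hex-encoded sums. A small illustration (not part of the package; `to_binary16` is a hypothetical helper name) of how those lists correspond to signed 128-bit integers in big-endian two's complement:

```python
def to_binary16(n: int) -> list[int]:
    """Encode a signed integer as the 16-byte big-endian list the test feeds to column x."""
    return list(n.to_bytes(16, byteorder="big", signed=True))

# The first inserted row is 1; the second (all 0xFF bytes) is -1.
assert to_binary16(1) == [0] * 15 + [1]
assert to_binary16(-1) == [255] * 16

# Round-trip back to an integer to confirm the encoding.
assert int.from_bytes(bytes(to_binary16(-1)), byteorder="big", signed=True) == -1

# 1 + (-1) + 2 + 3 = 5, matching the "...0005" hex sum the test asserts after four inserts.
assert 1 + (-1) + 2 + 3 == 5
```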
feldera-0.129.0/README.md (deleted):

````diff
@@ -1,140 +0,0 @@
-# Feldera Python SDK
-
-Feldera Python is the Feldera SDK for Python developers.
-
-## Installation
-
-```bash
-uv pip install feldera
-```
-
-### Installing from Github
-
-```bash
-uv pip install git+https://github.com/feldera/feldera#subdirectory=python
-```
-
-Similarly, to install from a specific branch:
-
-```bash
-uv pip install git+https://github.com/feldera/feldera@{BRANCH_NAME}#subdirectory=python
-```
-
-Replace `{BRANCH_NAME}` with the name of the branch you want to install from.
-
-### Installing from Local Directory
-
-If you have cloned the Feldera repo, you can install the python SDK as follows:
-
-```bash
-# the Feldera Python SDK is present inside the python/ directory
-cd python
-# If you don't have a virtual environment, create one
-uv venv
-source .venv/activate
-# Install the SDK in editable mode
-uv pip install .
-```
-
-## Documentation
-
-The Python SDK documentation is available at
-[Feldera Python SDK Docs](https://docs.feldera.com/python).
-
-To build the html documentation run:
-
-Ensure that you have sphinx installed. If not, install it using `uv pip install sphinx`.
-
-Then run the following commands:
-
-```bash
-cd docs
-sphinx-apidoc -o . ../feldera
-make html
-```
-
-To clean the build, run `make clean`.
-
-## Testing
-
-To run unit tests:
-
-```bash
-cd python && python3 -m pytest tests/
-```
-
-- This will detect and run all test files that match the pattern `test_*.py` or
-  `*_test.py`.
-- By default, the tests expect a running Feldera instance at `http://localhost:8080`.
-  To override the default endpoint, set the `FELDERA_HOST` environment variable.
-
-To run tests from a specific file:
-
-```bash
-(cd python && python3 -m pytest ./tests/path-to-file.py)
-```
-
-To run a specific test:
-
-```bash
-uv run python -m pytest tests/test_shared_pipeline.py::TestPipeline::test_adhoc_query_hash -v
-```
-
-#### Running All Tests
-
-The tests validate end-to-end correctness of SQL functionality. To
-run the tests use:
-
-```bash
-cd python
-PYTHONPATH=`pwd` ./tests/run-all-tests.sh
-```
-
-### Reducing Compilation Cycles
-
-To reduce redundant compilation cycles during testing:
-
-* **Inherit from `SharedTestPipeline`** instead of `unittest.TestCase`.
-* **Define DDLs** (e.g., `CREATE TABLE`, `CREATE VIEW`) in the **docstring** of each test method.
-* All DDLs from all test functions in the class are combined and compiled into a single pipeline.
-* If a table or view is already defined in one test, it can be used directly in others without redefinition.
-* Ensure that all table and view names are unique within the class.
-* Use `@enterprise_only` on tests that require Enterprise features. Their DDLs will be skipped on OSS builds.
-* Use `self.set_runtime_config(...)` to override the default pipeline config.
-* Reset it at the end using `self.reset_runtime_config()`.
-* Access the shared pipeline via `self.pipeline`.
-
-#### Example
-
-```python
-from tests.shared_test_pipeline import SharedTestPipeline
-
-class TestAverage(SharedTestPipeline):
-    def test_average(self):
-        """
-        CREATE TABLE students(id INT, name STRING);
-        CREATE MATERIALIZED VIEW v AS SELECT * FROM students;
-        """
-        ...
-        self.pipeline.start()
-        self.pipeline.input_pandas("students", df)
-        self.pipeline.wait_for_completion(True)
-        ...
-```
-
-## Linting and formatting
-
-Use [Ruff] to run the lint checks that will be executed by the
-precommit hook when a PR is submitted:
-
-```bash
-ruff check python/
-```
-
-To reformat the code in the same way as the precommit hook:
-
-```bash
-ruff format
-```
-
-[Ruff]: https://github.com/astral-sh/ruff
````