robin_sparkless-0.1.0-cp38-abi3-manylinux_2_28_aarch64.whl

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
robin_sparkless/__init__.py
@@ -0,0 +1,5 @@
+ from .robin_sparkless import *
+
+ __doc__ = robin_sparkless.__doc__
+ if hasattr(robin_sparkless, "__all__"):
+     __all__ = robin_sparkless.__all__
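This `__init__.py` is the thin re-export shim commonly generated by maturin for PyO3 packages: it pulls everything from the compiled `robin_sparkless` extension module into the package namespace and forwards its docstring and `__all__`. A minimal sketch of the effect (only names visible in this diff are used):

```python
import robin_sparkless as rs

# The package docstring is forwarded from the compiled Rust module,
# and the extension's exports are available at the package top level.
print(rs.__doc__)
print(sorted(name for name in dir(rs) if not name.startswith("_")))
```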
robin_sparkless/robin_sparkless.abi3.so
Binary file
robin_sparkless-0.1.0.dist-info/METADATA
@@ -0,0 +1,72 @@
+ Metadata-Version: 2.4
+ Name: robin-sparkless
+ Version: 0.1.0
+ Classifier: Development Status :: 3 - Alpha
+ Classifier: License :: OSI Approved :: MIT License
+ Classifier: Programming Language :: Python :: 3
+ Classifier: Programming Language :: Rust
+ Classifier: Topic :: Scientific/Engineering
+ Summary: PySpark-like DataFrame API in Rust (Polars backend), with Python bindings via PyO3
+ Author: Robin Sparkless contributors
+ License: MIT
+ Requires-Python: >=3.8
+ Description-Content-Type: text/markdown; charset=UTF-8; variant=GFM
+
+ # robin-sparkless (Python)
+
+ **PySpark-style DataFrames in Python, no JVM.** Uses [Polars](https://www.pola.rs/) under the hood for fast execution.
+
+ ## Install
+
+ ```bash
+ pip install robin-sparkless
+ ```
+
+ **Requirements:** Python 3.8+
+
+ ## Quick start
+
+ ```python
+ import robin_sparkless as rs
+
+ spark = rs.SparkSession.builder().app_name("demo").get_or_create()
+ df = spark.create_dataframe(
+     [(1, 25, "Alice"), (2, 30, "Bob"), (3, 35, "Charlie")],
+     ["id", "age", "name"],
+ )
+ filtered = df.filter(rs.col("age").gt(rs.lit(26)))
+ print(filtered.collect())
+ # [{"id": 2, "age": 30, "name": "Bob"}, {"id": 3, "age": 35, "name": "Charlie"}]
+ ```
+
+ Read from files:
+
+ ```python
+ df = spark.read_csv("data.csv")
+ df = spark.read_parquet("data.parquet")
+ df = spark.read_json("data.json")
+ ```
+
+ Filter, select, group, join, and use window functions with a PySpark-like API. See the [full documentation](https://robin-sparkless.readthedocs.io/) for details.
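+
+ For example, grouping and joining look roughly like this (a minimal sketch: `group_by`, `agg`, `join`, `select`, `rs.sum`, and `alias` are assumed from the snake_case style above; check the documentation for the exact names):
+
+ ```python
+ # NOTE: method names below are assumptions, not confirmed API.
+ people = spark.create_dataframe(
+     [(1, 25, "Alice"), (2, 30, "Bob"), (3, 35, "Charlie")],
+     ["id", "age", "name"],
+ )
+ orders = spark.create_dataframe(
+     [(1, 100.0), (2, 250.0), (2, 80.0)],
+     ["id", "amount"],
+ )
+
+ # Total order amount per id, then join back to the people table.
+ totals = orders.group_by("id").agg(rs.sum("amount").alias("total"))
+ report = people.join(totals, on="id", how="inner").select("name", "total")
+ print(report.collect())
+ ```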
+
+ ## Optional features
+
+ Install from source to enable extra features (requires [Rust](https://rustup.rs/) and [maturin](https://www.maturin.rs/)):
+
+ ```bash
+ pip install maturin
+ git clone https://github.com/eddiethedean/robin-sparkless && cd robin-sparkless
+ maturin develop --features "pyo3,sql"        # spark.sql() and temp views
+ maturin develop --features "pyo3,delta"      # read_delta / write_delta
+ maturin develop --features "pyo3,sql,delta"  # both
+ ```
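+
+ With the `sql` and `delta` features enabled, usage looks roughly like the sketch below. Only `spark.sql()`, temp views, and `read_delta`/`write_delta` are named above; the other method names and signatures (e.g. `create_or_replace_temp_view`) are assumptions:
+
+ ```python
+ # NOTE: requires a build with --features "pyo3,sql,delta"; names below are assumptions.
+ df = spark.create_dataframe([(1, 25, "Alice"), (2, 30, "Bob")], ["id", "age", "name"])
+
+ # SQL over a temporary view.
+ df.create_or_replace_temp_view("people")
+ adults = spark.sql("SELECT name, age FROM people WHERE age >= 26")
+ print(adults.collect())
+
+ # Delta Lake round trip.
+ df.write_delta("./people_delta")
+ df2 = spark.read_delta("./people_delta")
+ ```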
+
+ ## Links
+
+ - **Documentation:** [robin-sparkless.readthedocs.io](https://robin-sparkless.readthedocs.io/)
+ - **Source:** [github.com/eddiethedean/robin-sparkless](https://github.com/eddiethedean/robin-sparkless)
+ - **Rust crate:** [crates.io/crates/robin-sparkless](https://crates.io/crates/robin-sparkless)
+
+ ## License
+
+ MIT
+
robin_sparkless-0.1.0.dist-info/RECORD
@@ -0,0 +1,5 @@
+ robin_sparkless/__init__.py,sha256=h26i0kv9czksgzJ0zbFJvX-5IZV8hrWVEerFQOVG3fA,143
+ robin_sparkless/robin_sparkless.abi3.so,sha256=-4W9jCzrZ0H0Fuwgz1zjc2xDsVv9inQr28PRBTMv92c,49154576
+ robin_sparkless-0.1.0.dist-info/METADATA,sha256=3NbSFuqgt6XUqvRMJ8vZh962WrX6BSwFnO1D_gT9Lb0,2145
+ robin_sparkless-0.1.0.dist-info/WHEEL,sha256=TbOEPJcLRdoLQex6HY6MJy-GFvwdI5JUNazV49lxV2I,108
+ robin_sparkless-0.1.0.dist-info/RECORD,,
robin_sparkless-0.1.0.dist-info/WHEEL
@@ -0,0 +1,4 @@
+ Wheel-Version: 1.0
+ Generator: maturin (1.11.5)
+ Root-Is-Purelib: false
+ Tag: cp38-abi3-manylinux_2_28_aarch64