fraq-0.2.1.tar.gz

fraq-0.2.1/CHANGELOG.md ADDED
@@ -0,0 +1,78 @@
# Changelog

All notable changes to **fraq** are documented here.

## [Unreleased]

## [0.2.1] - 2026-03-17

### Docs
- Update README.md
- Update docs/README.md
- Update project/README.md
- Update project/context.md

### Other
- Update .env.example
- Update examples/app_integrations.py
- Update examples/text2fraq_examples.py
- Update fraq/__init__.py
- Update fraq/text2fraq.py
- Update project.sh
- Update project/analysis.toon
- Update project/calls.mmd
- Update project/calls.png
- Update project/compact_flow.mmd
- ... and 10 more files

## [0.2.0] - 2026-03-16

### Added
- **Unified Query Language** (`fraq/query.py`)
  - `FraqQuery` fluent builder with `.zoom()`, `.select()`, `.where()`, `.output()`, `.take()`
  - `FraqFilter` with operators: eq, ne, gt, lt, gte, lte, contains
  - `FraqExecutor` for running queries against any root node
  - `query()` one-liner convenience function

- **Data Source Adapters** (`fraq/adapters.py`)
  - `FileAdapter` — JSON/YAML/CSV on disk with save/load roundtrip
  - `HTTPAdapter` — REST API with deterministic fallback
  - `SQLAdapter` — PostgreSQL/SQLite row mapping, SQL function generation
  - `SensorAdapter` — infinite IoT stream simulation
  - `HybridAdapter` — merge multiple sources (mean position, XOR seeds)
  - `get_adapter()` factory function

- **Schema Export** (`fraq/schema_export.py`)
  - `to_nlp2cmd_schema()` — NLP2CMD command schemas
  - `to_nlp2cmd_actions()` — NLP2CMD ActionRegistry entries
  - `to_openapi()` — OpenAPI 3.0 specification
  - `to_graphql()` — GraphQL type definitions
  - `to_asyncapi()` — AsyncAPI 3.0 for streaming channels
  - `to_proto()` — gRPC/Protobuf .proto files
  - `to_json_schema()` — JSON Schema for validation

- **Async Streaming** (`fraq/streaming.py`)
  - `AsyncFraqStream` with `async for` iteration
  - `async_query()` — async execution via thread pool
  - `async_stream()` — convenience async generator with count limit

- **Examples** (`examples/`)
  - `query_examples.py` — disk, HTTP, SQL, sensor, hybrid queries
  - `nlp2cmd_integration.py` — full NLP2CMD workflow
  - `applications.py` — IoT, ERP, AI/ML, DevOps, Finance, Legal
  - `async_streaming.py` — FastAPI SSE, Kafka patterns

### Changed
- Test suite expanded from 64 to 132 tests
- Coverage maintained at 96%

## [0.1.0] - 2026-03-16

### Added
- Core fractal structures: `FraqNode`, `FraqSchema`, `FraqCursor`
- Format registry with 6 built-in serialisers (json, jsonl, csv, yaml, binary, msgpack_lite)
- 4 generators: `HashGenerator`, `FibonacciGenerator`, `PerlinGenerator`, `SensorStreamGenerator`
- CLI with `explore`, `stream`, `schema` subcommands
- Dockerfile + docker-compose.yml
- 64 tests, 96% coverage
fraq-0.2.1/LICENSE ADDED
@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2026 Softreck / Prototypowanie.pl

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
fraq-0.2.1/MANIFEST.in ADDED
@@ -0,0 +1,12 @@
include LICENSE
include README.md
include CHANGELOG.md
include pyproject.toml

recursive-include fraq *.py py.typed
recursive-include tests *.py
recursive-include examples *.py

prune __pycache__
prune *.egg-info
global-exclude *.pyc
fraq-0.2.1/PKG-INFO ADDED
@@ -0,0 +1,306 @@
Metadata-Version: 2.4
Name: fraq
Version: 0.2.1
Summary: Fractal Query Data Library — model data as infinite, self-similar fractal structures
Author: Softreck / Prototypowanie.pl
Author-email: Tom Sapletta <tom@sapletta.com>
License-Expression: Apache-2.0
Project-URL: Homepage, https://github.com/softreck/fraq
Project-URL: Documentation, https://github.com/softreck/fraq#readme
Project-URL: Repository, https://github.com/softreck/fraq
Project-URL: Issues, https://github.com/softreck/fraq/issues
Project-URL: Changelog, https://github.com/softreck/fraq/blob/main/CHANGELOG.md
Keywords: fractal,data,procedural,generation,lazy,infinite,query,streaming,iot,sensor,schema,openapi,graphql,asyncapi,grpc,protobuf,nlp2cmd
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Developers
Classifier: Intended Audience :: Science/Research
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Topic :: Software Development :: Libraries
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Classifier: Topic :: Scientific/Engineering
Classifier: Topic :: Scientific/Engineering :: Information Analysis
Classifier: Typing :: Typed
Requires-Python: >=3.10
Description-Content-Type: text/markdown
License-File: LICENSE
Provides-Extra: http
Requires-Dist: requests>=2.28.0; extra == "http"
Provides-Extra: async
Requires-Dist: aiohttp>=3.9.0; extra == "async"
Provides-Extra: fast
Requires-Dist: orjson>=3.9.0; extra == "fast"
Provides-Extra: ai
Requires-Dist: litellm>=1.0.0; extra == "ai"
Requires-Dist: python-dotenv>=1.0.0; extra == "ai"
Provides-Extra: all
Requires-Dist: requests>=2.28.0; extra == "all"
Requires-Dist: aiohttp>=3.9.0; extra == "all"
Requires-Dist: orjson>=3.9.0; extra == "all"
Requires-Dist: litellm>=1.0.0; extra == "all"
Requires-Dist: python-dotenv>=1.0.0; extra == "all"
Provides-Extra: dev
Requires-Dist: pytest>=7.0; extra == "dev"
Requires-Dist: pytest-cov>=4.0; extra == "dev"
Requires-Dist: pytest-asyncio>=0.21.0; extra == "dev"
Requires-Dist: mypy>=1.0; extra == "dev"
Requires-Dist: ruff>=0.1.0; extra == "dev"
Dynamic: license-file

# fraq — Fractal Query Data Library

> Model data as infinite, self-similar fractal structures in hyperspace.
> Each zoom level reveals procedurally generated detail — data exists only
> virtually and materialises on demand via lazy evaluation.

## Core Idea

Imagine a schema that is itself an **infinite fractal**. You can zoom into
any region and receive an unbounded amount of deterministic data in any
format you choose (JSON, CSV, YAML, binary). The data doesn't live in
memory or on disk — it is *generated* on the fly from the fractal
coordinates and a deterministic seed.

```
root (depth 0)
├── zoom (1,0,0) → child A (depth 1)
│   ├── zoom (1,0,0) → grandchild AA (depth 2) …
│   └── zoom (0,1,0) → grandchild AB (depth 2) …
└── zoom (0,0,1) → child B (depth 1)
    └── … infinite …
```

### Key Properties

| Property | Description |
|---|---|
| **Deterministic** | Same path → same data, always |
| **Lazy** | Nodes materialise only when accessed |
| **Unbounded** | No depth limit — zoom forever |
| **Format-agnostic** | JSON, CSV, YAML, binary, msgpack-lite, or register your own |
| **Source-agnostic** | File, HTTP, SQL, Sensor, Hybrid — same query interface |
| **Schema-exportable** | NLP2CMD, OpenAPI, GraphQL, AsyncAPI, gRPC, JSON Schema |
| **Zero storage** | Data exists only as computation |

## Quick Start

```bash
pip install -e ".[dev]"

# CLI
fraq explore --dims 3 --depth 5 --format json
fraq stream --dims 2 --count 20 --format csv
fraq schema --fields "name:str,value:float,flag:bool" --depth 2

# Docker
docker compose run test  # run full test suite
docker compose run cli explore --depth 5
```

## Python API

```python
from fraq import FraqNode, FraqSchema, FraqCursor, FormatRegistry, query

# 1. One-liner query
data = query(depth=2, fields=["temp:float", "id:str"], format="csv", limit=10)

# 2. Fluent query builder
from fraq import FraqQuery, FraqExecutor

q = (
    FraqQuery()
    .zoom(5, direction=(0.1, 0.2, 0.7))
    .select("temperature:float", "sensor_id:str", "active:bool")
    .where("temperature", "gt", 0.5)
    .output("json")
    .take(20)
)
result = FraqExecutor(dims=3).execute(q)

# 3. Source adapters — same query, different sources
from fraq import FileAdapter, SQLAdapter, SensorAdapter, HybridAdapter

# Disk
adapter = FileAdapter()
root = adapter.load_root("gradient_root.json")

# SQL
adapter = SQLAdapter(table="sensors")
root = adapter.load_root("", rows=[{"id": 1, "x": 0.0, "y": 0.0, "z": 0.0}])

# Sensors (infinite stream, zero storage)
adapter = SensorAdapter(base_temp=23.5)
for reading in adapter.stream(depth=3, count=100):
    print(reading)  # {'temperature': 24.1, 'humidity': 58.3, ...}

# Hybrid (merge multiple sources)
hybrid = HybridAdapter()
hybrid.add(FileAdapter(), "local_backup.json")
hybrid.add(SensorAdapter(), "")
merged = hybrid.load_root()

# 4. Async streaming (FastAPI SSE, Kafka, NATS) - runs inside an async endpoint
import json
from fraq.streaming import async_stream
async for record in async_stream(count=1000, interval=0.1):
    yield f"data: {json.dumps(record)}\n\n"
```

## NLP2CMD Integration

Export FraqSchema to NLP2CMD's SchemaRegistry for natural language → command transformation:

```python
from fraq import FraqNode, FraqSchema, to_nlp2cmd_schema, to_nlp2cmd_actions

schema = FraqSchema(root=FraqNode(position=(0.0, 0.0, 0.0)))
schema.add_field("temperature", "float")
schema.add_field("sensor_id", "str")

# Command schema → command_schemas/fraq_sensor.json
nlp2cmd_schema = to_nlp2cmd_schema(schema, command_name="fraq_sensor")

# Action registry entries
actions = to_nlp2cmd_actions(schema)
# → fraq_zoom, fraq_query, fraq_stream, fraq_save

# NLP2CMD can now transform:
# "Show active sensors" → fraq query --fields "temperature:float,sensor_id:str" --depth 2
```

## Schema Export Formats

```python
from fraq import to_openapi, to_graphql, to_asyncapi, to_proto, to_json_schema

to_openapi(schema)      # OpenAPI 3.0 — /zoom, /query, /stream endpoints
to_graphql(schema)      # GraphQL type + Query definitions
to_asyncapi(schema)     # AsyncAPI 3.0 — Kafka/WebSocket channels
to_proto(schema)        # gRPC .proto with FraqService (Zoom + Stream RPCs)
to_json_schema(schema)  # JSON Schema for validation
```

## Generators

| Generator | Output | Use Case |
|---|---|---|
| `HashGenerator` | `float` in configurable range | General-purpose pseudo-random |
| `FibonacciGenerator` | `float` based on depth | Mathematical sequences |
| `PerlinGenerator` | Smooth `float` noise | Organic sensor streams |
| `SensorStreamGenerator` | `dict` with IoT fields | Embedded / edge simulation |

## Data Source Adapters

| Adapter | Source | Key Feature |
|---|---|---|
| `FileAdapter` | JSON/YAML/CSV on disk | Save/load roundtrip |
| `HTTPAdapter` | REST APIs | Fallback to deterministic roots |
| `SQLAdapter` | PostgreSQL/SQLite | Row mapping, SQL function generation |
| `SensorAdapter` | IoT streams (RPi/ESP32) | Infinite deterministic readings |
| `HybridAdapter` | Multiple sources merged | Mean position, XOR seeds |

## text2fraq — Natural Language to Fractal Query

Convert natural language to fractal queries using an LLM (via LiteLLM) or a rule-based fallback.

### Setup (Ollama with small models)

```bash
# Install Ollama: https://ollama.com
ollama pull qwen2.5:3b   # Fast, instruction-tuned
ollama pull llama3.2:3b  # Balanced, multilingual
ollama pull phi3:3.8b    # Strong reasoning

# Install with AI extras
pip install -e ".[ai]"

# Copy and edit config
cp .env.example .env
```

### Usage

```python
from fraq import text2fraq, Text2Fraq, Text2FraqSimple

# One-liner (auto-fallback to rule-based if LLM unavailable)
result = text2fraq("Show 20 temperature readings in CSV")

# With a specific model
from fraq.text2fraq import Text2FraqConfig
t2f = Text2Fraq(Text2FraqConfig(model="qwen2.5:3b"))
parsed = t2f.parse("Get deep analysis with depth 5")

# Rule-based (deterministic, no LLM needed)
parser = Text2FraqSimple()
query = parser.parse("Stream sensor data as JSON")
```

### Environment Variables (.env)

| Variable | Default | Description |
|----------|---------|-------------|
| `LITELLM_PROVIDER` | `ollama` | LLM provider |
| `LITELLM_MODEL` | `qwen2.5:3b` | Model name |
| `LITELLM_BASE_URL` | `http://localhost:11434` | API endpoint |
| `LITELLM_TEMPERATURE` | `0.1` | Generation temperature |
| `TEXT2FRAQ_DEFAULT_FORMAT` | `json` | Default output format |
| `TEXT2FRAQ_DEFAULT_DIMS` | `3` | Default fractal dimensions |
| `TEXT2FRAQ_DEFAULT_DEPTH` | `3` | Default query depth |

## Application Integrations

See `examples/app_integrations.py` for templates:

- **FastAPI** — REST API with `/query`, `/stream` (SSE), `/zoom/{depth}`
- **Streamlit** — Interactive dashboard with sliders and charts
- **Flask** — Blueprints with NL endpoints
- **WebSocket** — Real-time streaming server
- **Kafka** — Producer/consumer with aiokafka
- **gRPC** — High-performance RPC service
- **Celery** — Background task processing
- **Jupyter** — Interactive exploration widgets

## Testing

```bash
pytest -v --cov=fraq  # 132 tests, 96% coverage
```

## Project Structure

```
fraq/
├── fraq/
│   ├── __init__.py       # public API
│   ├── core.py           # FraqNode, FraqSchema, FraqCursor
│   ├── formats.py        # FormatRegistry + 6 built-in serialisers
│   ├── generators.py     # Hash, Fibonacci, Perlin, SensorStream
│   ├── query.py          # FraqQuery, FraqExecutor, FraqFilter
│   ├── adapters.py       # File, HTTP, SQL, Sensor, Hybrid adapters
│   ├── schema_export.py  # NLP2CMD, OpenAPI, GraphQL, AsyncAPI, Proto, JSON Schema
│   ├── streaming.py      # AsyncFraqStream, async_query, async_stream
│   └── cli.py            # CLI entry point
├── tests/                # 132 tests, 96% coverage
├── examples/
│   ├── query_examples.py       # All data sources (disk, HTTP, SQL, sensor, hybrid)
│   ├── nlp2cmd_integration.py  # NLP2CMD schema workflow
│   ├── applications.py         # IoT, ERP, AI/ML, DevOps, Finance, Legal
│   └── async_streaming.py      # FastAPI SSE, Kafka patterns
├── Dockerfile
├── docker-compose.yml
├── pyproject.toml
└── README.md
```

## License

Apache License 2.0 - see [LICENSE](LICENSE) for details.

## Author

Created by **Tom Sapletta** - [tom@sapletta.com](mailto:tom@sapletta.com)
fraq-0.2.1/README.md ADDED
@@ -0,0 +1,253 @@
# fraq — Fractal Query Data Library

> Model data as infinite, self-similar fractal structures in hyperspace.
> Each zoom level reveals procedurally generated detail — data exists only
> virtually and materialises on demand via lazy evaluation.

## Core Idea

Imagine a schema that is itself an **infinite fractal**. You can zoom into
any region and receive an unbounded amount of deterministic data in any
format you choose (JSON, CSV, YAML, binary). The data doesn't live in
memory or on disk — it is *generated* on the fly from the fractal
coordinates and a deterministic seed.

```
root (depth 0)
├── zoom (1,0,0) → child A (depth 1)
│   ├── zoom (1,0,0) → grandchild AA (depth 2) …
│   └── zoom (0,1,0) → grandchild AB (depth 2) …
└── zoom (0,0,1) → child B (depth 1)
    └── … infinite …
```
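
The zoom tree above can be made concrete in a few lines of plain Python. The sketch below is a hypothetical stand-in (not fraq's actual implementation) showing the core idea: a value derived deterministically from a seed plus a path of zoom directions, so nothing needs to be stored.

```python
import hashlib
import json

def node_value(seed: int, path: tuple) -> float:
    """Derive a deterministic float in [0, 1) from a seed and a zoom path."""
    key = json.dumps([seed, list(path)]).encode()
    digest = hashlib.sha256(key).digest()
    # Interpret the first 8 bytes as an integer, scaled into [0, 1).
    return int.from_bytes(digest[:8], "big") / 2**64

root = node_value(42, ())                              # depth 0
child_a = node_value(42, ((1, 0, 0),))                 # depth 1
grandchild_aa = node_value(42, ((1, 0, 0), (1, 0, 0)))  # depth 2

# Same seed + same path always reproduces the same value ("zero storage"):
assert node_value(42, ((1, 0, 0),)) == child_a
```

Any hash of (seed, path) works here; the point is that the tree never exists in memory, only the values you ask for.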

### Key Properties

| Property | Description |
|---|---|
| **Deterministic** | Same path → same data, always |
| **Lazy** | Nodes materialise only when accessed |
| **Unbounded** | No depth limit — zoom forever |
| **Format-agnostic** | JSON, CSV, YAML, binary, msgpack-lite, or register your own |
| **Source-agnostic** | File, HTTP, SQL, Sensor, Hybrid — same query interface |
| **Schema-exportable** | NLP2CMD, OpenAPI, GraphQL, AsyncAPI, gRPC, JSON Schema |
| **Zero storage** | Data exists only as computation |

## Quick Start

```bash
pip install -e ".[dev]"

# CLI
fraq explore --dims 3 --depth 5 --format json
fraq stream --dims 2 --count 20 --format csv
fraq schema --fields "name:str,value:float,flag:bool" --depth 2

# Docker
docker compose run test  # run full test suite
docker compose run cli explore --depth 5
```

## Python API

```python
from fraq import FraqNode, FraqSchema, FraqCursor, FormatRegistry, query

# 1. One-liner query
data = query(depth=2, fields=["temp:float", "id:str"], format="csv", limit=10)

# 2. Fluent query builder
from fraq import FraqQuery, FraqExecutor

q = (
    FraqQuery()
    .zoom(5, direction=(0.1, 0.2, 0.7))
    .select("temperature:float", "sensor_id:str", "active:bool")
    .where("temperature", "gt", 0.5)
    .output("json")
    .take(20)
)
result = FraqExecutor(dims=3).execute(q)

# 3. Source adapters — same query, different sources
from fraq import FileAdapter, SQLAdapter, SensorAdapter, HybridAdapter

# Disk
adapter = FileAdapter()
root = adapter.load_root("gradient_root.json")

# SQL
adapter = SQLAdapter(table="sensors")
root = adapter.load_root("", rows=[{"id": 1, "x": 0.0, "y": 0.0, "z": 0.0}])

# Sensors (infinite stream, zero storage)
adapter = SensorAdapter(base_temp=23.5)
for reading in adapter.stream(depth=3, count=100):
    print(reading)  # {'temperature': 24.1, 'humidity': 58.3, ...}

# Hybrid (merge multiple sources)
hybrid = HybridAdapter()
hybrid.add(FileAdapter(), "local_backup.json")
hybrid.add(SensorAdapter(), "")
merged = hybrid.load_root()

# 4. Async streaming (FastAPI SSE, Kafka, NATS) - runs inside an async endpoint
import json
from fraq.streaming import async_stream
async for record in async_stream(count=1000, interval=0.1):
    yield f"data: {json.dumps(record)}\n\n"
```

## NLP2CMD Integration

Export FraqSchema to NLP2CMD's SchemaRegistry for natural language → command transformation:

```python
from fraq import FraqNode, FraqSchema, to_nlp2cmd_schema, to_nlp2cmd_actions

schema = FraqSchema(root=FraqNode(position=(0.0, 0.0, 0.0)))
schema.add_field("temperature", "float")
schema.add_field("sensor_id", "str")

# Command schema → command_schemas/fraq_sensor.json
nlp2cmd_schema = to_nlp2cmd_schema(schema, command_name="fraq_sensor")

# Action registry entries
actions = to_nlp2cmd_actions(schema)
# → fraq_zoom, fraq_query, fraq_stream, fraq_save

# NLP2CMD can now transform:
# "Show active sensors" → fraq query --fields "temperature:float,sensor_id:str" --depth 2
```

## Schema Export Formats

```python
from fraq import to_openapi, to_graphql, to_asyncapi, to_proto, to_json_schema

to_openapi(schema)      # OpenAPI 3.0 — /zoom, /query, /stream endpoints
to_graphql(schema)      # GraphQL type + Query definitions
to_asyncapi(schema)     # AsyncAPI 3.0 — Kafka/WebSocket channels
to_proto(schema)        # gRPC .proto with FraqService (Zoom + Stream RPCs)
to_json_schema(schema)  # JSON Schema for validation
```

## Generators

| Generator | Output | Use Case |
|---|---|---|
| `HashGenerator` | `float` in configurable range | General-purpose pseudo-random |
| `FibonacciGenerator` | `float` based on depth | Mathematical sequences |
| `PerlinGenerator` | Smooth `float` noise | Organic sensor streams |
| `SensorStreamGenerator` | `dict` with IoT fields | Embedded / edge simulation |
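
As a rough illustration of what the first two rows compute (hedged stand-ins, since the real classes' constructors and signatures are not shown here), deterministic values can be derived from a seed and a depth:

```python
import hashlib

def hash_generator(seed: int, depth: int, low: float = 0.0, high: float = 1.0) -> float:
    """HashGenerator-style value: a pseudo-random float in a configurable range."""
    digest = hashlib.sha256(f"{seed}:{depth}".encode()).digest()
    unit = int.from_bytes(digest[:8], "big") / 2**64  # uniform in [0, 1)
    return low + unit * (high - low)

def fibonacci_generator(depth: int) -> float:
    """FibonacciGenerator-style value: the depth-th Fibonacci number as a float."""
    a, b = 0, 1
    for _ in range(depth):
        a, b = b, a + b
    return float(a)

fibonacci_generator(7)  # → 13.0
```

Both are pure functions of their inputs, which is what makes the "same path → same data" property of the table in Key Properties hold.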

## Data Source Adapters

| Adapter | Source | Key Feature |
|---|---|---|
| `FileAdapter` | JSON/YAML/CSV on disk | Save/load roundtrip |
| `HTTPAdapter` | REST APIs | Fallback to deterministic roots |
| `SQLAdapter` | PostgreSQL/SQLite | Row mapping, SQL function generation |
| `SensorAdapter` | IoT streams (RPi/ESP32) | Infinite deterministic readings |
| `HybridAdapter` | Multiple sources merged | Mean position, XOR seeds |
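
The `HybridAdapter` merge rule in the last row (mean position, XOR seeds) can be sketched as follows; `Root` here is a hypothetical stand-in for whatever node type the adapter actually returns:

```python
from dataclasses import dataclass

@dataclass
class Root:
    position: tuple  # fractal coordinates
    seed: int        # deterministic generation seed

def merge_roots(roots: list[Root]) -> Root:
    """Combine several roots: average each coordinate, XOR the seeds."""
    dims = len(roots[0].position)
    mean_pos = tuple(
        sum(r.position[i] for r in roots) / len(roots) for i in range(dims)
    )
    seed = 0
    for r in roots:
        seed ^= r.seed  # XOR keeps the merge order-independent
    return Root(position=mean_pos, seed=seed)

merged = merge_roots([Root((0.0, 0.0, 0.0), 42), Root((1.0, 1.0, 1.0), 7)])
# merged.position == (0.5, 0.5, 0.5); merged.seed == 42 ^ 7 == 45
```

Averaging keeps the merged root inside the region spanned by its sources, while XOR gives a seed that changes if any source's seed changes.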

## text2fraq — Natural Language to Fractal Query

Convert natural language to fractal queries using an LLM (via LiteLLM) or a rule-based fallback.

### Setup (Ollama with small models)

```bash
# Install Ollama: https://ollama.com
ollama pull qwen2.5:3b   # Fast, instruction-tuned
ollama pull llama3.2:3b  # Balanced, multilingual
ollama pull phi3:3.8b    # Strong reasoning

# Install with AI extras
pip install -e ".[ai]"

# Copy and edit config
cp .env.example .env
```

### Usage

```python
from fraq import text2fraq, Text2Fraq, Text2FraqSimple

# One-liner (auto-fallback to rule-based if LLM unavailable)
result = text2fraq("Show 20 temperature readings in CSV")

# With a specific model
from fraq.text2fraq import Text2FraqConfig
t2f = Text2Fraq(Text2FraqConfig(model="qwen2.5:3b"))
parsed = t2f.parse("Get deep analysis with depth 5")

# Rule-based (deterministic, no LLM needed)
parser = Text2FraqSimple()
query = parser.parse("Stream sensor data as JSON")
```
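
The rule-based path can be approximated with simple keyword matching. The sketch below is a hypothetical illustration of the idea, not `Text2FraqSimple` itself; the query keys (`count`, `format`) and their defaults are assumptions:

```python
import re

def parse_rule_based(text: str) -> dict:
    """Extract a record count and output format from a natural-language request."""
    query = {"count": 10, "format": "json"}  # assumed defaults
    # First standalone number becomes the count.
    if m := re.search(r"\b(\d+)\b", text):
        query["count"] = int(m.group(1))
    # First known format keyword wins.
    for fmt in ("csv", "yaml", "json", "binary"):
        if re.search(rf"\b{fmt}\b", text, re.IGNORECASE):
            query["format"] = fmt
            break
    return query

parse_rule_based("Show 20 temperature readings in CSV")
# → {'count': 20, 'format': 'csv'}
```

Because it is pure pattern matching, this path stays deterministic and needs no model, which is exactly why it works as a fallback when the LLM is unavailable.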

### Environment Variables (.env)

| Variable | Default | Description |
|----------|---------|-------------|
| `LITELLM_PROVIDER` | `ollama` | LLM provider |
| `LITELLM_MODEL` | `qwen2.5:3b` | Model name |
| `LITELLM_BASE_URL` | `http://localhost:11434` | API endpoint |
| `LITELLM_TEMPERATURE` | `0.1` | Generation temperature |
| `TEXT2FRAQ_DEFAULT_FORMAT` | `json` | Default output format |
| `TEXT2FRAQ_DEFAULT_DIMS` | `3` | Default fractal dimensions |
| `TEXT2FRAQ_DEFAULT_DEPTH` | `3` | Default query depth |

## Application Integrations

See `examples/app_integrations.py` for templates:

- **FastAPI** — REST API with `/query`, `/stream` (SSE), `/zoom/{depth}`
- **Streamlit** — Interactive dashboard with sliders and charts
- **Flask** — Blueprints with NL endpoints
- **WebSocket** — Real-time streaming server
- **Kafka** — Producer/consumer with aiokafka
- **gRPC** — High-performance RPC service
- **Celery** — Background task processing
- **Jupyter** — Interactive exploration widgets

## Testing

```bash
pytest -v --cov=fraq  # 132 tests, 96% coverage
```

## Project Structure

```
fraq/
├── fraq/
│   ├── __init__.py       # public API
│   ├── core.py           # FraqNode, FraqSchema, FraqCursor
│   ├── formats.py        # FormatRegistry + 6 built-in serialisers
│   ├── generators.py     # Hash, Fibonacci, Perlin, SensorStream
│   ├── query.py          # FraqQuery, FraqExecutor, FraqFilter
│   ├── adapters.py       # File, HTTP, SQL, Sensor, Hybrid adapters
│   ├── schema_export.py  # NLP2CMD, OpenAPI, GraphQL, AsyncAPI, Proto, JSON Schema
│   ├── streaming.py      # AsyncFraqStream, async_query, async_stream
│   └── cli.py            # CLI entry point
├── tests/                # 132 tests, 96% coverage
├── examples/
│   ├── query_examples.py       # All data sources (disk, HTTP, SQL, sensor, hybrid)
│   ├── nlp2cmd_integration.py  # NLP2CMD schema workflow
│   ├── applications.py         # IoT, ERP, AI/ML, DevOps, Finance, Legal
│   └── async_streaming.py      # FastAPI SSE, Kafka patterns
├── Dockerfile
├── docker-compose.yml
├── pyproject.toml
└── README.md
```

## License

Apache License 2.0 - see [LICENSE](LICENSE) for details.

## Author

Created by **Tom Sapletta** - [tom@sapletta.com](mailto:tom@sapletta.com)