m8flow 1.1.3 → 1.1.5

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,15 +1,62 @@
- # M8Flow CLI
+ <p align="center">
+   <img src="https://cdn.jsdelivr.net/npm/m8flow@latest/bundled/frontend-dist/assets/logo_full.png" alt="M8Flow" height="60" />
+ </p>
 
- Visual ML Pipeline Builder install globally, run anywhere.
+ <h3 align="center">AI-powered visual machine learning workflow builder with local execution, pipeline orchestration, and zero-config setup.</h3>
 
- ```
+ <p align="center">
+   <a href="https://www.npmjs.com/package/m8flow"><img src="https://img.shields.io/npm/v/m8flow?color=6366f1&label=npm" alt="npm version" /></a>
+   <a href="https://www.npmjs.com/package/m8flow"><img src="https://img.shields.io/npm/dm/m8flow?color=22c55e&label=downloads" alt="downloads" /></a>
+   <a href="https://www.npmjs.com/package/m8flow"><img src="https://img.shields.io/npm/l/m8flow?color=f59e0b" alt="license" /></a>
+   <img src="https://img.shields.io/badge/Python-3.8%2B-blue" alt="python" />
+   <img src="https://img.shields.io/badge/Node.js-18%2B-green" alt="node" />
+ </p>
+
+ ---
+
+ ## Quick Start
+
+ ```bash
  npm install -g m8flow
  m8flow run
  ```
 
+ Your browser opens automatically. Start building ML pipelines in seconds — no Docker, no config files, no cloud accounts.
+
+ ---
+
+ ## Why M8Flow?
+
+ Most ML tooling forces you to write boilerplate, manage environments manually, or pay for cloud compute. M8Flow takes a different approach:
+
+ | Problem | M8Flow Solution |
+ |---|---|
+ | Writing pipeline code from scratch | Visual drag-and-drop node editor |
+ | Complex environment setup | Automatic Python venv + dependency install |
+ | No AI assistance for ML workflows | Built-in AI pipeline generation (OpenRouter / Gemini / Mistral) |
+ | Vendor lock-in and cloud costs | 100% local-first — your machine, your data |
+ | Slow iteration cycles | Live pipeline execution with real-time logs |
+
+ ---
+
+ ## Features
+
+ - **Visual drag-and-drop pipeline builder** — connect nodes with edges, no boilerplate
+ - **AI-assisted workflow generation** — describe your pipeline in plain English
+ - **Multi-provider AI support** — OpenRouter, Google Gemini, Mistral La Plateforme
+ - **Local-first architecture** — all data stays on your machine
+ - **Automatic environment setup** — Python virtualenv + dependencies on first run
+ - **60+ built-in ML nodes** — preprocessing, models, evaluation, visualization
+ - **Custom Python components** — add your own nodes with AI code generation
+ - **CSV dataset handling** — upload files, inspect schemas, feed into pipelines
+ - **Pipeline versioning** — save checkpoints, compare versions, roll back
+ - **Real-time execution monitoring** — live logs, node status, output previews
+ - **Cross-browser state sync** — flows persist across tabs and browsers via local server
+ - **Port conflict auto-resolution** — never manually kill a port again
+
  ---
 
- ## System requirements
+ ## Requirements
 
  | Tool | Minimum |
  |--------|---------|
@@ -23,38 +70,33 @@ m8flow run
 
  ### `m8flow run`
 
- Starts the full stack and opens the browser.
+ Launches the complete local M8Flow environment.
 
  ```
  Options:
    -p, --port <n>     Frontend port (default: 3000)
    -b, --backend <n>  Backend port (default: 8000)
-   --no-browser       Don't open the browser automatically
-   --verbose          Stream backend and frontend logs
+   --no-browser       Skip opening the browser automatically
+   --verbose          Stream full backend logs
    -h, --help         Show help
  ```
 
- **Examples**
-
  ```bash
- m8flow run                           # defaults (3000 / 8000)
- m8flow run --port 4000               # custom frontend port
- m8flow run --backend 9000 --verbose  # custom backend + full logs
- m8flow run --no-browser              # headless / CI mode
+ m8flow run                           # standard launch
+ m8flow run --port 4000               # custom port
+ m8flow run --no-browser --verbose    # headless with logs
  ```
 
  ### `m8flow doctor`
 
- Checks system requirements before you try to run.
+ Diagnoses your environment before running.
 
  ```bash
  m8flow doctor
  ```
 
- Output example:
-
  ```
- M8Flow Doctor v1.0.0
+ M8Flow Doctor v1.1.3
 
  ✔ Node.js (20.11.0)
  ✔ Python (Python 3.11.5)
@@ -62,77 +104,109 @@ Output example:
  ✔ Python dependencies
  ✔ Bundled assets
  ✔ Port Frontend (3000) — free
- Port Backend (8000) — busy — will auto-shift to 8001
+ ✔ Port Backend (8000) — free
  ```
 
  ---
 
- ## What `m8flow run` does
-
- | Step | Action |
- |------|--------|
- | 1 | Detect Python 3.8+ on PATH (`python3`, `python`, `py`) |
- | 2 | Create `~/.m8flow/{uploads,models,pipelines}` |
- | 3 | Create Python virtualenv at `~/.m8flow/venv` |
- | 4 | `pip install -r requirements.txt` (only when file changes) |
- | 5 | Auto-resolve port conflicts with `detect-port` |
- | 6 | Spawn `uvicorn main:app` via **execa** |
- | 7 | Serve React build via built-in Node HTTP server |
- | 8 | TCP-poll the backend until it accepts connections |
- | 9 | Open `http://localhost:3000` with **open** |
- | 10 | Clean `SIGINT` / `SIGTERM` shutdown |
+ ## What happens when you run `m8flow run`
+
+ | Step | What M8Flow does |
+ |------|-----------------|
+ | 1 | Locates Python 3.8+ on your PATH |
+ | 2 | Creates `~/.m8flow/` storage directories |
+ | 3 | Creates an isolated Python virtualenv |
+ | 4 | Installs all ML dependencies (once; hash-cached thereafter) |
+ | 5 | Auto-resolves port conflicts |
+ | 6 | Starts FastAPI backend via uvicorn |
+ | 7 | Serves the React frontend via Node HTTP server |
+ | 8 | Waits for backend health check (up to 120 s) |
+ | 9 | Opens `http://localhost:3000` in your browser |
 
  ---
 
- ## Data storage
+ ## Architecture
 
- Everything stays on your machine:
+ ```
+ m8flow run
+ ├── Node.js CLI (bin/m8flow.js)
+ │   ├── Python venv setup  → ~/.m8flow/venv/
+ │   ├── Frontend server    → localhost:3000 (Node static)
+ │   └── Backend server     → localhost:8000 (uvicorn)
+
+ ├── Frontend — React + XYFlow canvas, Zustand state
+ │   └── Visual node editor, AI chat, dataset manager
+
+ └── Backend — FastAPI + Python runtime
+     ├── Pipeline executor (topological DAG execution)
+     ├── LLM service (OpenRouter / Gemini / Mistral)
+     ├── Self-healer (AI auto-fix on node errors)
+     └── Template library (60+ pre-built ML components)
+ ```
 
- | Path | Purpose |
- |------|---------|
- | `~/.m8flow/uploads/` | Uploaded CSV files |
- | `~/.m8flow/models/` | Trained models |
- | `~/.m8flow/pipelines/` | Saved pipelines |
- | `~/.m8flow/venv/` | Python virtualenv |
- | `~/.m8flow/.env` | API keys (optional) |
+ | Layer | Technology |
+ |---|---|
+ | Frontend | React, XYFlow, Zustand, Vite |
+ | Backend | FastAPI, Python, uvicorn |
+ | Runtime | Isolated Python virtualenv |
+ | CLI | Node.js 18+, ESM, execa |
+ | AI Layer | OpenRouter · Google Gemini · Mistral La Plateforme |
+ | Storage | Local filesystem (`~/.m8flow/`) |
 
- ### Setting your OpenRouter API key
+ ---
 
- ```bash
- echo "OPENROUTER_API_KEY=sk-or-..." >> ~/.m8flow/.env
- ```
+ ## AI Keys
 
- The backend loads this file on every startup.
+ M8Flow works without any API key (demo mode with limited models). For full AI pipeline generation, add at least one key in **Settings → API Keys**:
+
+ | Provider | Key prefix | Models |
+ |---|---|---|
+ | OpenRouter | `sk-or-...` | Llama 3.3, NVIDIA Nemotron, Gemma, and 200+ more |
+ | Google Gemini | `AIza...` | Gemini 2.5 Flash, Flash Lite, Pro |
+ | Mistral | any | Codestral, Mistral Small, Mixtral |
+
+ Keys are stored locally in your browser and on your machine — never sent to M8Flow servers.
 
  ---
 
- ## Build from source
+ ## Data & Storage
+
+ Everything lives on your machine:
+
+ | Path | Contents |
+ |------|---------|
+ | `~/.m8flow/uploads/` | Uploaded CSV datasets |
+ | `~/.m8flow/models/` | Trained model files |
+ | `~/.m8flow/pipelines/` | Saved pipeline state |
+ | `~/.m8flow/venv/` | Python virtualenv |
+ | `~/.m8flow/app_state.json` | Flows, projects, settings |
+
+ ---
+
+ ## Build from Source
 
  ```bash
- git clone https://github.com/your-org/m8flow
+ git clone https://github.com/mursaleen231213/m8flow
  cd m8flow
 
- # 1. Bundle backend + frontend
+ # Build React frontend + bundle Python backend
  node cli/scripts/build.js
 
- # 2. Install globally from local build
+ # Install locally
  cd cli && npm install -g .
 
- # 3. Run
+ # Launch
  m8flow run
  ```
 
- ---
-
- ## Publish to npm
+ ### Push updates to npm
 
  ```bash
- # Inside cli/
- npm publish
+ cd cli
+ npm version patch         # bump version
+ npm publish --otp=<code>  # publish (OTP from authenticator)
  ```
 
- `prepublishOnly` runs `build.js` automatically.
-
  ---
 
  ## Uninstall
@@ -140,6 +214,13 @@ npm publish
  ```bash
  npm uninstall -g m8flow
 
- # Optional remove all data:
- rm -rf ~/.m8flow
+ # Remove all local data (optional):
+ rm -rf ~/.m8flow                                        # Mac/Linux
+ Remove-Item -Recurse -Force "$env:USERPROFILE\.m8flow"  # Windows
  ```
+
+ ---
+
+ <p align="center">
+   Built with ❤️ for ML engineers who value speed and simplicity.
+ </p>
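Step 4 of the run table above says dependencies are installed once and "hash-cached" afterwards (the old wording was "only when file changes"). That pattern can be sketched with a content-hash stamp file; the function and stamp-file names here are illustrative, not the CLI's actual implementation:

```python
import hashlib
from pathlib import Path

def _digest(path: Path) -> str:
    """SHA-256 of the file's exact bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def needs_install(req_file: Path, stamp_file: Path) -> bool:
    """True on first run, or whenever requirements.txt changed
    since the last recorded install."""
    if not stamp_file.exists():
        return True
    return stamp_file.read_text() != _digest(req_file)

def record_install(req_file: Path, stamp_file: Path) -> None:
    """Call only after `pip install -r` succeeds, so a failed
    install is retried on the next launch."""
    stamp_file.write_text(_digest(req_file))
```

Writing the stamp only after a successful install is the important detail: stamping before installing would skip retries forever after one failure.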
@@ -36,7 +36,9 @@ def get_state():
          return json.loads(f.read_text(encoding="utf-8"))
      except Exception as exc:
          logger.warning("app_state.json unreadable: %s — returning defaults", exc)
-         return {"flows": [], "projects": [], "myFiles": [], "openRouterKey": None}
+         return {"flows": [], "projects": [], "myFiles": [],
+                 "openRouterKey": None, "geminiKey": None, "mistralKey": None,
+                 "chatHistories": {}}
 
 
  class StatePayload(BaseModel):
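The widened fallback dict above matters for upgrades: state files written by 1.1.3 lack the new keys, so overlaying the saved dict onto a full set of defaults guarantees every 1.1.5 key exists. A minimal stdlib sketch of that idea (helper name is illustrative):

```python
# Defaults mirroring the 1.1.5 fallback dict in the diff above.
DEFAULT_STATE = {
    "flows": [], "projects": [], "myFiles": [],
    "openRouterKey": None, "geminiKey": None, "mistralKey": None,
    "chatHistories": {},
}

def with_defaults(saved: dict) -> dict:
    """Overlay a possibly-old saved state onto current defaults:
    saved values win, keys added in 1.1.5 fall back to defaults."""
    return {**DEFAULT_STATE, **saved}
```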
@@ -44,6 +46,9 @@ class StatePayload(BaseModel):
      projects: list = []
      myFiles: list = []
      openRouterKey: str | None = None
+     geminiKey: str | None = None
+     mistralKey: str | None = None
+     chatHistories: dict = {}
 
 
  @router.post("")
@@ -1,35 +1,60 @@
  # ── Core API ─────────────────────────────────────────────────────────────────
- fastapi
+ fastapi==0.136.1
  uvicorn[standard]
- python-multipart
- pydantic
- python-dotenv
- httpx
-
+ starlette==1.0.0
+ python-multipart==0.0.27
+ python-dotenv==1.2.2
+ pydantic==2.13.3
+ pydantic_core==2.46.3
+ annotated-types==0.7.0
+ # ── HTTP ─────────────────────────────────────────────────────────────────────
+ httpx==0.28.1
+ httpcore==1.0.9
+ h11==0.16.0
+ anyio==4.13.0
+ sniffio==1.3.1
+ certifi==2026.4.22
+ idna==3.13
  # ── Data ─────────────────────────────────────────────────────────────────────
- pandas
- numpy
- scipy
- joblib
-
+ pandas==3.0.2
+ numpy==2.4.4
+ scipy==1.17.1
+ joblib==1.5.3
+ threadpoolctl==3.6.0
+ python-dateutil==2.9.0.post0
+ six==1.17.0
+ tzdata==2026.2
  # ── ML ───────────────────────────────────────────────────────────────────────
- scikit-learn
- xgboost
- lightgbm
- statsmodels
-
- # ── Imbalanced learning ───────────────────────────────────────────────────────
- imbalanced-learn
-
- # ── Explainability ────────────────────────────────────────────────────────────
- shap
-
- # ── Dimensionality reduction ──────────────────────────────────────────────────
- umap-learn
-
- # ── Visualisation (server-side, not imported at runtime) ─────────────────────
- matplotlib
- seaborn
-
- # ── Legacy / compat ───────────────────────────────────────────────────────────
- openai
+ scikit-learn==1.8.0
+ # ── Visualisation ────────────────────────────────────────────────────────────
+ matplotlib==3.10.9
+ seaborn==0.13.2
+ plotly==6.7.0
+ pillow==12.2.0
+ contourpy==1.3.3
+ cycler==0.12.1
+ fonttools==4.62.1
+ kiwisolver==1.5.0
+ pyparsing==3.3.2
+ narwhals==2.21.0
+ # ── AI / LLM clients ─────────────────────────────────────────────────────────
+ openai==2.33.0
+ mistralai==2.4.4
+ httpx==0.28.1
+ jiter==0.14.0
+ distro==1.9.0
+ tqdm==4.67.3
+ # ── Telemetry ────────────────────────────────────────────────────────────────
+ opentelemetry-api==1.39.1
+ opentelemetry-semantic-conventions==0.60b1
+ # ── Utils ────────────────────────────────────────────────────────────────────
+ click==8.3.3
+ colorama==0.4.6
+ annotated-doc==0.0.4
+ eval_type_backport==0.3.1
+ importlib_metadata==8.7.1
+ jsonpath-python==1.1.5
+ packaging==26.2
+ typing_extensions==4.15.0
+ typing-inspection==0.4.2
+ zipp==3.23.1
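Nearly every entry in the new requirements file is pinned as `name==version`; note that `httpx==0.28.1` appears twice (under HTTP and again under AI / LLM clients), which pip tolerates because the pins agree. A tiny illustrative parser for this pin format, not part of the package, that collapses such duplicates:

```python
def parse_pins(text: str) -> dict[str, str]:
    """Map package name -> pinned version. Comments, blank lines, and
    unpinned entries (e.g. `uvicorn[standard]`) are skipped; a duplicate
    pin simply overwrites the earlier, identical one."""
    pins: dict[str, str] = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments
        if "==" not in line:
            continue                          # skip unpinned / blank lines
        name, version = line.split("==", 1)
        pins[name.strip().lower()] = version.strip()
    return pins
```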