@inteli.city/node-red-contrib-exec-collection 1.0.4 → 1.0.5
- package/README.md +836 -5
- package/exec.queue.html +228 -38
- package/exec.queue.js +553 -537
- package/exec.service.html +342 -229
- package/exec.service.js +325 -487
- package/node.queue.html +359 -0
- package/node.queue.js +569 -0
- package/package.json +19 -19
- package/python.config.html +55 -0
- package/python.config.js +24 -0
- package/python.queue.html +360 -0
- package/python.queue.js +555 -0
- package/utils/context.js +54 -0
- package/async.gpt.html +0 -327
- package/async.gpt.js +0 -615
- package/async.latex.html +0 -319
- package/async.latex.js +0 -618
- package/module.njk.html +0 -45
- package/module.njk.js +0 -12
- package/template.njk.html +0 -201
- package/template.njk.js +0 -138
- package/thrd.function.html +0 -312
- package/thrd.function.js +0 -586
- package/thread.queue.html +0 -311
- package/thread.queue.js +0 -586
package/README.md
CHANGED

# node-red-contrib-exec-collection

A collection of Node-RED nodes for running scripts and system commands.

---

## Table of Contents

- [Nodes](#nodes)
- [When to use which node](#when-to-use-which-node)
- [exec.queue](#execqueue)
- [python.queue](#pythonqueue)
- [node.queue](#nodequeue)
- [exec.service](#execservice)
- [State & Persistence](#state--persistence)
- [Output & Parsing](#output--parsing)
- [python.config](#pythonconfig)

---

## Nodes

| Node | Description |
|---|---|
| [exec.queue](#execqueue) | Renders a Nunjucks template into a temp file and runs a shell command against it. Fresh process per message. |
| [python.queue](#pythonqueue) | Persistent Python worker pool. Each message sends rendered code via stdin. Worker state persists across messages. |
| [node.queue](#nodequeue) | Persistent Node.js worker pool. Each message sends rendered code via stdin. State survives via `global.*`. |
| [exec.service](#execservice) | Runs a long-lived process as a managed service. Streams stdout continuously. Auto-restarts on crash. |
| [python.config](#pythonconfig) | Config node storing the Python executable path used by python.queue. |

---

## When to use which node

**Use `exec.queue` when:**

- Each execution must be fully isolated — no state between messages
- You need binary output (buffer mode)
- You run shell commands, R scripts, or other interpreters
- You prefer simplicity and predictability over performance

**Use `python.queue` when:**

- You want to eliminate process startup cost — workers stay alive between messages
- You run high-frequency Python workloads
- You want to load a model, open a connection, or build state once and reuse it

**Use `node.queue` when:**

- Same as python.queue, but for JavaScript
- You want to reuse loaded modules across messages without re-requiring them
- You are working in a JS context and don't want a separate interpreter

**Use `exec.service` when:**

- You need a process that runs indefinitely and streams output continuously
- You are watching files, tailing logs, listening on a channel, or polling a system
- You want automatic restart on crash with no intervention

---

## exec.queue

### Overview

`exec.queue` executes arbitrary system commands through a configurable concurrency queue. Each incoming message triggers one execution: a Nunjucks template is rendered into a temporary file, the command is run against that file, and stdout becomes the output message.

It handles:

- Short-lived commands (exec mode — waits for completion)
- Long-running or streaming processes (spawn mode — streams output as it arrives)
- Concurrent executions with backpressure via a queue
- Binary output (buffer mode)
- Cross-platform execution (Linux and Windows)

---

### Execution Pipeline

Every message follows this pipeline:

```
msg received
  → render Nunjucks template → write to temp file ($file)
  → render command string (optional Nunjucks)
  → run command
  → capture stdout → send msg
  → clean up temp files
```

Nothing is shared between concurrent executions. Each job gets its own temp file and its own Nunjucks environment.

---

### Core Concepts

#### 1. Template → `$file`

The template body is rendered with Nunjucks and written to a temporary file. The path to that file is available as `$file` (Linux/macOS) or `%file%` (Windows) inside the command.

```js
// Template (JavaScript mode)
const data = require("fs").readFileSync(process.env.INPUT_PATH, "utf8");
console.log(JSON.stringify({ lines: data.split("\n").length }));
```

```
# Command
node $file
```

#### 2. Command

The command string is what runs in the shell. `$file` is always the rendered template. The command itself can also be a Nunjucks template (enable "Cmd Template" in the node settings).

```
python3 $file
bash $file
Rscript $file
psql postgresql://user:pass@host:5432/db -f $file
```

#### 3. stdout → `msg.payload`

Whatever the process writes to stdout becomes the output message payload (subject to the selected output mode). Writing to stderr does not produce output — it produces warnings.

#### 4. Queue

Concurrency is controlled by the **Queue** setting. If Queue = 2, up to 2 commands run simultaneously; additional messages wait. The status badge shows `waiting (executing/concurrency)`.

---

### Template Engine (Nunjucks)

The template body uses [Nunjucks](https://mozilla.github.io/nunjucks/) syntax. All `msg` values are automatically converted to strings before rendering — no filters required.

| Value type | Renders as |
|---|---|
| String | value as-is |
| Number | string representation (`42` → `"42"`) |
| Object / Array | JSON-serialized (`{"a":1}`) |
| `null` / `undefined` | empty string |

The rendering context exposes:

| Variable | Value |
|---|---|
| `{{ payload }}` | `msg.payload` (stringified) |
| `{{ topic }}` | `msg.topic` |
| Any `msg.*` | Any top-level message property |
| `flow.get("key")` | Flow context value |
| `global.get("key")` | Global context value |
| `env` | `process.env` (all environment variables) |

> **Warning:** Nunjucks evaluates `{{ }}` expressions everywhere in the template — including inside `#` Python comments and `//` JS comments. Never put `{{ expr }}` in a comment unless you intend it to be rendered.
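
As a sketch, the conversion table above behaves like this hypothetical helper (illustrative only, not the node's actual code):

```python
import json

def to_template_value(value):
    """Mimic the documented stringification of msg values before rendering."""
    if value is None:
        return ""                                        # null/undefined: empty string
    if isinstance(value, str):
        return value                                     # strings pass through as-is
    if isinstance(value, (dict, list)):
        return json.dumps(value, separators=(",", ":"))  # objects/arrays: JSON
    return str(value)                                    # numbers: string representation

print(to_template_value({"a": 1}))  # {"a":1}
print(to_template_value(42))        # 42
```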

---

### `asset()` Helper

`asset(content)` creates an additional temporary file containing `content` and returns its path. The file is cleaned up automatically after the execution finishes.

```nunjucks
{% set config_path = asset('{"threshold": 0.9, "mode": "strict"}') %}
python3 $file --config {{ config_path }}
```

```python
# Template — Python ($file)
import sys, json

config = json.load(open(sys.argv[sys.argv.index("--config") + 1]))
print(json.dumps({"threshold": config["threshold"]}))
```

---

### Command Templating

When **Cmd Template** is enabled, the command string is also rendered with Nunjucks before execution:

```
python3 $file --input {{ env.INPUT_DIR }}/{{ payload }}
```

---

### Output Modes

| Mode | Behavior |
|---|---|
| Plain text | stdout as-is (string) |
| Parsed JSON | `JSON.parse(stdout)` |
| Parsed YAML | `js-yaml` parse of stdout |
| Parsed XML | `xml-js` parse of stdout |
| Buffer | raw stdout bytes as a `Buffer` |

#### Buffer Mode

Buffer mode captures stdout as raw bytes. `msg.payload` is a Node.js `Buffer`. Use this when the process outputs binary data: images, compressed files, protocol frames, etc.

```python
# Template — Python
import sys
with open("/path/to/image.png", "rb") as f:
    sys.stdout.buffer.write(f.read())
```

`splitLine` is not supported in buffer mode.

---

### Execution Modes

**exec mode (default)** — command runs to completion, stdout is captured and sent as one message.

**spawn mode** — process streams output as it runs. Each chunk of stdout triggers a message. Use `-u` for Python to disable output buffering:

```
python3 -u $file
```

---

### stdout vs stderr

**stdout is data. stderr is logs.**

```python
import sys
print("processing...", file=sys.stderr)  # node warning
print('{"result": 42}')                  # msg.payload
```

---

### Queue Behavior

- Messages arriving while the queue is full wait in line
- Status badge: `waiting (executing/concurrency)`, e.g. `3 (2/2)`
- `msg.stop = true` kills all active processes and drains the queue
- **⏹ button** in the node editor header does the same without redeploying

---

### Process Lifecycle

- Each active process is tracked by PID
- On redeploy or `msg.stop = true`, all tracked processes receive SIGTERM (Linux) or are terminated via `terminate()` (Windows)
- On Linux, the entire process group is signalled (`-pid`) to catch child processes
- Temp files are cleaned up in a `finally` block — removed even if the command fails
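
The Linux part of this behavior can be sketched in a few lines of Python (a minimal illustration of process-group signalling, not the node's implementation):

```python
import os, signal, subprocess

# Start the command as a process-group leader, so children belong to its group.
proc = subprocess.Popen(["sleep", "60"], start_new_session=True)

# start_new_session makes pgid == pid, so signalling the group is the
# equivalent of kill(-pid, SIGTERM): the process and its children all stop.
os.killpg(proc.pid, signal.SIGTERM)
print(proc.wait())  # negative value: terminated by SIGTERM (-15)
```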

---

### Cross-Platform Behavior

| Platform | Shell | Variable |
|---|---|---|
| Linux / macOS | `/bin/bash` | `$file` |
| Windows | `cmd.exe` | `%file%` |

---

### Examples

#### 1. Run a Python script

```python
# Template — Python
import json

data = json.loads("{{ payload }}")
result = {"length": len(data), "type": type(data).__name__}
print(json.dumps(result))
```

```
python3 $file
```

Output mode: **Parsed JSON**

---

#### 2. Stream logs in real time (spawn mode)

```python
# Template — Python
import time, sys, json

for i in range(10):
    print(json.dumps({"step": i}), flush=True)
    time.sleep(0.5)
```

```
python3 -u $file
```

Mode: **spawn** — each `print()` produces a separate output message.

---

#### 3. SSH remote execution

```bash
# Template — Bash
echo "hostname: $(hostname)"
df -h /
```

```
cat $file | ssh -i /path/to/key user@remote-host bash -s
```

---

## python.queue

### Overview

`python.queue` keeps a pool of persistent Python workers alive. Each incoming message renders your Nunjucks template into Python source code, sends it to a free worker via stdin, and returns whatever the code prints as `msg.payload`.

**Mental model: each worker is a persistent Python REPL session.**

You are not running a script — you are sending code to a running Python engine. Imports, variables, and objects defined at the top level accumulate in the worker's namespace and are available to every subsequent message on that worker.

---

### Execution Model

Internally, each worker runs an event loop:

```python
while True:
    code = read_next_job_from_stdin()
    exec(code, _ns)   # _ns is a persistent dict — the worker's global scope
    send_stdout_to_node_red()
```

`_ns` starts empty and grows with every execution. Any name defined at the top level of your code — variables, functions, classes, imports — persists in `_ns` for the lifetime of that worker.

This means:

```python
# First message on this worker:
import pandas as pd        # → stored in _ns["pd"]
data = pd.DataFrame(...)   # → stored in _ns["data"]

# Second message on the same worker:
print(data.shape)          # works — "data" is still in _ns
```

---

### State Boundaries

State is **per-worker**, not global across all workers.

With Queue > 1:

- Each worker has its own independent `_ns`
- A message routed to worker A cannot see state from worker B
- Execution is non-deterministic — you cannot predict which worker handles a given message

If you need consistent state across messages, use Queue = 1.
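
The routing can be pictured with a hypothetical pool (names like `WorkerPool` are illustrative, not from the package):

```python
import queue

class WorkerPool:
    """Illustrative sketch: each worker owns an independent namespace (_ns)."""
    def __init__(self, size):
        self.namespaces = [{} for _ in range(size)]   # one _ns per worker
        self.free = queue.Queue()
        for i in range(size):
            self.free.put(i)

    def run(self, code):
        i = self.free.get()                 # whichever worker is free first
        try:
            exec(code, self.namespaces[i])  # state lands only in that worker's _ns
        finally:
            self.free.put(i)

pool = WorkerPool(1)                        # Queue = 1: deterministic state
pool.run("count = count + 1 if 'count' in dir() else 1")
pool.run("count = count + 1 if 'count' in dir() else 1")
print(pool.namespaces[0]["count"])  # 2
```

With `size=1` every message lands in the same namespace, which is why Queue = 1 gives consistent state.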

---

### Imports

Imports are safe to repeat. Python caches loaded modules internally (`sys.modules`), so re-importing on every message has no performance cost. That said, write imports explicitly to keep code readable:

```python
import json   # safe — Python returns cached module
value = json.loads("{{ payload }}")
print(json.dumps({"ok": True}))
```

The common pattern is to guard expensive one-time initialization, not imports:

```python
if "model" not in dir():
    import pickle
    with open("/path/to/model.pkl", "rb") as f:
        model = pickle.load(f)

import json
features = json.loads("{{ payload }}")
print(json.dumps({"prediction": int(model.predict([features])[0])}))
```

---

### Persistent Resources Warning

> **Warning:** Long-lived resources (database connections, file handles, network sockets) stored in `_ns` may become stale. A connection opened on message 1 may be closed, timed out, or broken by message 100.

Always validate or recreate persistent resources:

```python
import psycopg2

if "conn" not in dir() or conn.closed:
    conn = psycopg2.connect("postgresql://user:pass@host/db")

with conn.cursor() as cur:
    cur.execute("SELECT count(*) FROM events WHERE id = %s", ("{{ payload }}",))
    print(cur.fetchone()[0])
```

---

### python.queue vs node.queue

Both nodes share the same queue-and-worker architecture. The key difference is how persistent state is scoped:

| | python.queue | node.queue |
|---|---|---|
| Language | Python | JavaScript |
| Runtime | Configurable via python.config | System `node` (same as Node-RED) |
| State mechanism | Implicit — `_ns` dict, like a module's global scope | Explicit — `global.*` on the vm context |
| Top-level variables | Persist automatically between messages | **Do not persist** — scoped to the execution |
| Output function | `print()` | `console.log()` |
| `require()` | N/A | Available (Node-RED module environment) |

**In python.queue**, top-level names persist automatically:

```python
count = count + 1 if "count" in dir() else 1
print(count)
```

**In node.queue**, top-level `const`/`let`/`var` are scoped to each execution and do not survive. You must use `global.*` explicitly:

```js
if (!global.count) global.count = 0;
global.count++;
console.log(global.count);
```

---

### Queue and Worker Lifecycle

The **Queue** setting controls how many Python workers run concurrently. Workers start lazily on the first incoming message.

| Status | Meaning |
|---|---|
| Blue dot `0 (0/2)` | Workers running, all idle |
| Blue ring `0 (2/2)` | All workers executing |
| Blue ring `3 (2/2)` | 3 messages waiting, both workers busy |
| Grey dot `0 (0/2)` | No workers running |

- **Idle 20 minutes** → all workers are killed; restart on next message
- **Worker crash** → worker removed; in-flight job fails; replacement created on next message
- **Node redeploy / close** → all workers killed, pending jobs drained
- **⏹ button** in the editor header → kills all workers immediately; a confirmation dialog appears when workers are alive
- **`msg.stop = true`** → same effect from a flow message

---

### Python Executable

`python.queue` uses the Python binary defined in a linked **python.config** node. It falls back to `python3` if none is linked.

The path can point to a system Python or a virtual environment:

```
/usr/bin/python3
/home/user/myenv/bin/python
```

The environment must already exist with all required packages installed.

---

### Template Engine (Nunjucks)

The template is rendered by Nunjucks before Python sees it. All `msg` values are automatically converted to strings.

**Always wrap string variables in Python quotes:**

```python
name = "{{ payload }}"   # correct — renders to: name = "hello"
name = {{ payload }}     # wrong — renders to: name = hello (NameError)

x = {{ payload }}        # correct when payload is a number
```

> **Warning:** Nunjucks evaluates `{{ }}` everywhere — including inside `#` comments. Do not put expressions in comments.

---

### Output

Use `print()` to produce output. Each `print()` call produces one message in Delimited mode (default).

**Parsing: Delimited** — buffers stdout and splits on the delimiter (`\n` by default).

**Parsing: Raw** — emits each stdout chunk immediately as a separate message.

---

### stdout vs stderr

```python
import sys
print("debug", file=sys.stderr)   # node warning — not in payload
print('{"result": 42}')           # becomes msg.payload
```

---

### Examples

#### Accumulate values across messages

```python
import json

if "history" not in dir():
    history = []

history.append("{{ payload }}")
print(json.dumps(history))
```

Output mode: **Parsed JSON** — `msg.payload` grows with each message on the same worker.

---

#### Load a model once, predict every message

```python
if "model" not in dir():
    import pickle
    with open("/path/to/model.pkl", "rb") as f:
        model = pickle.load(f)

import json
features = json.loads("{{ payload }}")
prediction = model.predict([features])[0]
print(json.dumps({"prediction": int(prediction)}))
```

---

## node.queue

### Overview

`node.queue` keeps a pool of persistent Node.js workers alive. Each incoming message renders your Nunjucks template into JavaScript, sends it to a free worker via stdin, and returns whatever `console.log()` prints as `msg.payload`.

Same execution model as `python.queue`, but runs JavaScript. Workers use the same Node.js runtime as Node-RED, so `require()` resolves from Node-RED's module environment.

**Mental model: each worker is a persistent Node.js vm context.**

---

### Persistent State

Each worker's vm context persists across all messages it handles. Top-level `const`/`let`/`var` are scoped to each execution — they do not survive between messages. Use `global.*` to persist state:

```js
if (!global.counter) {
  global.counter = 0;
}

global.counter++;
console.log(global.counter);
```

With Queue > 1, state is **per-worker** — no shared state between workers.

---

### require()

`require` is available in the worker context and resolves from Node-RED's module environment:

```js
const fs = require('fs');
const os = require('os');

console.log(JSON.stringify({ platform: os.platform(), home: os.homedir() }));
```

---

### Output

Use `console.log()` to produce output.

`console.warn()` and `console.error()` → node warnings, not in payload.

**Parsing: Delimited** — each `console.log()` call produces one message (splits on `\n`).

**Parsing: Raw** — each stdout chunk emitted immediately.

---

### Worker Lifecycle

- Workers start lazily on first message
- After 20 minutes idle → all workers killed; restart on next message
- **⏹ button** in editor header → kills all workers immediately; a confirmation dialog appears when workers are alive
- **`msg.stop = true`** → same effect from a flow message
- **Node redeploy / close** → all workers killed, pending jobs drained

---

## exec.service

### Overview

`exec.service` runs a shell command as a managed, long-lived service. It does not process flow messages — it spawns a process at deploy time and streams stdout continuously as output messages. When the process exits unexpectedly, the node restarts it automatically.

**Mental model: you are managing a daemon, not running a command.**

Use this node for processes that should always be running: file watchers, log tailers, event listeners, system monitors, persistent workers.

---

### Execution Model

- A single process is spawned immediately on deploy
- No queue — there is only one process at a time
- The node has **no input port** — it outputs only
- stdout is streamed to output messages using the same Parsing system (Delimited / Raw) as the other nodes
- stderr lines become node warnings

---

### Template and `$file`

Write code in the **Template** editor and reference it with `$file` in the command:

```
python3 -u $file
bash $file
node $file
```

The template is rendered with Nunjucks at process start (and again on every restart). Use `{{ flow.get('key') }}`, `{{ global.get('key') }}`, or `{{ env.MY_VAR }}` to inject values at startup. There is no `msg` context since the process starts independently of any incoming message.

If no template is configured, the command is run directly.

---

### Restart Behavior

When the process exits unexpectedly:

1. The node waits for the configured **Restart delay** (default: 3000 ms)
2. Then spawns a fresh process

**Max retries** limits how many consecutive failures are tolerated before the node stops trying. Set to `0` for infinite retries (default).

If the process runs stably for 10 seconds, the retry counter resets — so a stable process that occasionally crashes always gets a fresh set of retries.

| Config | Default | Description |
|---|---|---|
| Restart delay | 3000 ms | Wait before restarting after a crash |
| Max retries | 0 | Max consecutive failures before stopping (0 = infinite) |
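
The policy above can be sketched as a supervisor loop (hypothetical, using the defaults from the table; not the node's actual code):

```python
import time

def supervise(spawn, restart_delay=3.0, max_retries=0, stable_after=10.0):
    """Illustrative restart policy: delay between retries, reset after stable runs."""
    retries = 0
    while True:
        started = time.monotonic()
        spawn()                                  # blocks until the process exits
        if time.monotonic() - started >= stable_after:
            retries = 0                          # stable run: fresh set of retries
        retries += 1
        if max_retries and retries > max_retries:
            return "stopped"                     # give up after max consecutive failures
        time.sleep(restart_delay)

# A command that crashes instantly is retried max_retries times, then stopped:
attempts = []
supervise(lambda: attempts.append(1), restart_delay=0, max_retries=2)
print(len(attempts))  # 3: the initial run plus 2 retries
```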

---

### Control Actions

**Stop** — kills the process and prevents restart. Triggered by:

- The **⏹ button** in the editor header
- `POST /exec-service/:id/kill`

**Restart** — kills the current process (if running) and immediately starts a fresh one, resetting all retry counters. Triggered by:

- The **↺ button** in the editor header (confirm dialog if running)
- `POST /exec-service/:id/restart`

The restart action overrides a stopped state — clicking restart on a stopped service will start it.

---

### Status

| Badge | Meaning |
|---|---|
| Blue ring `running` | Process is alive and streaming |
| Yellow ring `restarting (retry N)` | Waiting to restart after a crash |
| Grey dot `stopped` | Manually killed or max retries exceeded |

---

### Output & Parsing

stdout is streamed using the same Parsing system as `python.queue` and `node.queue`:

**Parsing: Delimited** (default) — buffers output and splits on the delimiter (`\n` by default). Each line produces one message.

**Parsing: Raw** — emits each stdout chunk immediately. Chunks may not align with line boundaries.

---

### Use Cases

#### File watcher (inotifywait)

```
Command: inotifywait -m -e create,modify --format '{"file":"%w%f","event":"%e"}' /path/to/dir
Parsing: Delimited
Output: Parsed JSON
```

Each file event becomes a `msg.payload` object.

---

#### Log streaming

```
Command: tail -F /var/log/syslog | grep ERROR
Parsing: Delimited
Output: Plain text
```

Each matching log line becomes a `msg.payload` string.

---

#### PostgreSQL LISTEN

```
Command: psql postgresql://user:pass@host/db -c "LISTEN my_channel;" -c "SELECT 1" --no-align --tuples-only
Parsing: Delimited
Output: Plain text
```

Streams NOTIFY payloads as they arrive.

---

#### System monitoring

```
Command: while true; do df -h | jc --df; sleep 5; done
Parsing: Delimited
Output: Parsed JSON
```

Emits disk usage as a JSON object every 5 seconds.

---

#### Python worker with template

```
Template:
  import time, json, sys
  while True:
      print(json.dumps({"tick": True}), flush=True)
      time.sleep({{ flow.get('interval') or 1 }})

Command: python3 -u $file
Parsing: Delimited
Output: Parsed JSON
```

---

## State & Persistence

### exec.queue

No state. Each execution is fully isolated. Nothing survives between messages.

### python.queue

State lives in `_ns`, a persistent Python dict that acts as the worker's global scope. Everything defined at the top level of your code accumulates there.

```
Worker 1 _ns: { "model": <sklearn model>, "pd": <pandas>, "history": [...] }
Worker 2 _ns: { "model": <sklearn model>, "pd": <pandas>, "history": [...] }
```

Workers do not share state with each other.

### node.queue

State lives in the worker's `vm` context, accessible via `global.*`. Top-level variable declarations (`const`, `let`, `var`) are scoped to each execution and do not persist.

```
Worker 1 global: { counter: 42, db: <connection> }
Worker 2 global: { counter: 17, db: <connection> }
```

### exec.service

No persistent application state — the service process manages its own state internally. The node manages the process lifecycle only.

---

## Output & Parsing

All nodes except exec.queue use a shared streaming output system:

### Parsing: Delimited (default)

Buffers stdout and splits on the configured delimiter (default: `\n`). Each complete segment is emitted as a separate message. Incomplete segments at the end of a stream are flushed when the job completes.

This is the correct mode for line-oriented output (most scripts).
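
The buffering just described can be sketched as follows (an illustrative parser, not the package's implementation):

```python
class DelimitedParser:
    """Sketch of Delimited mode: buffer chunks, emit only complete segments."""
    def __init__(self, delimiter="\n"):
        self.delimiter = delimiter
        self.buffer = ""

    def feed(self, chunk):
        self.buffer += chunk
        *segments, self.buffer = self.buffer.split(self.delimiter)
        return segments                     # complete segments become messages

    def flush(self):                        # called when the job completes
        tail, self.buffer = self.buffer, ""
        return [tail] if tail else []

p = DelimitedParser()
print(p.feed('{"a":1}\n{"b"'))  # ['{"a":1}'], the partial segment is held back
print(p.feed(':2}\n'))          # ['{"b":2}']
```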

### Parsing: Raw

Emits each stdout chunk immediately as a separate message with no buffering. Chunks may not align with logical line boundaries — a single `print()` may produce multiple messages, or a single message may contain multiple lines.

Use Raw mode only when you need the lowest possible latency and can handle partial chunks.

### Output format

After splitting (Delimited) or on each chunk (Raw), the segment is parsed according to the selected format:

| Format | Behavior |
|---|---|
| Plain text | value as string (trimmed in Delimited, raw in Raw) |
| Parsed JSON | `JSON.parse(segment)` |
| Parsed YAML | YAML parse of segment |
| Parsed XML | XML parse of segment |

Parse errors are emitted as node errors and do not produce an output message.

---

## python.config

A config node that stores the Python executable path used by `python.queue`.

**Fields:**

- **Name** — optional label shown in the dropdown
- **Python Path** — path to the Python binary (required)

On deploy, the node warns if the path does not exist.

`python.queue` falls back to `python3` if no config node is linked.