claude-nbexec-0.1.0.tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -0,0 +1,17 @@
+ ---
+ name: nbexec
+ description: Execute code on remote Jupyter kernels (e.g. PySpark, Python) with local notebook logging. Use when the user asks to run code on a remote kernel, query data via Spark, or interact with a Jupyter notebook session.
+ argument-hint: <command> [options]
+ ---
+
+ # nbexec — Remote Jupyter Kernel Execution CLI
+
+ The nbexec binary is at: `__NBEXEC_PATH__`
+
+ Before doing anything else, run `__NBEXEC_PATH__ --help` to discover available commands and their options.
+
+ ## Task
+
+ $ARGUMENTS
+
+ Use `nbexec` to accomplish the task. Follow the help output precisely.
@@ -0,0 +1,20 @@
+ name: CI
+
+ on:
+   pull_request:
+   push:
+     branches: [main]
+
+ jobs:
+   build:
+     runs-on: ubuntu-latest
+     steps:
+       - uses: actions/checkout@v4
+         with:
+           fetch-depth: 0
+
+       - name: Install uv
+         uses: astral-sh/setup-uv@v4
+
+       - name: Build
+         run: uv build
@@ -0,0 +1,23 @@
+ name: Publish to PyPI
+
+ on:
+   push:
+     tags:
+       - "v*"
+
+ jobs:
+   publish:
+     runs-on: ubuntu-latest
+     steps:
+       - uses: actions/checkout@v4
+         with:
+           fetch-depth: 0
+
+       - name: Install uv
+         uses: astral-sh/setup-uv@v4
+
+       - name: Build
+         run: uv build
+
+       - name: Publish
+         run: uv publish --token ${{ secrets.PYPI_TOKEN }}
@@ -0,0 +1,10 @@
+ # Python-generated files
+ __pycache__/
+ *.py[oc]
+ build/
+ dist/
+ wheels/
+ *.egg-info
+
+ # Virtual environments
+ .venv
@@ -0,0 +1 @@
+ 3.10
@@ -0,0 +1,85 @@
+ Metadata-Version: 2.4
+ Name: claude-nbexec
+ Version: 0.1.0
+ Summary: CLI daemon that proxies code execution to remote Jupyter kernels with local notebook logging
+ Requires-Python: >=3.10
+ Requires-Dist: click>=8.1
+ Requires-Dist: jupyter-kernel-client>=0.3
+ Requires-Dist: nbformat>=5.9
+ Description-Content-Type: text/markdown
+
+ # nbexec
+
+ A CLI tool that lets AI agents (like Claude Code) execute code on remote Jupyter kernels. All executed code and outputs are logged to a local `.ipynb` notebook file for human review.
+
+ ## Why
+
+ When an AI agent needs to run code on a remote compute environment — a PySpark cluster, a GPU machine, a data warehouse notebook server — there's no simple way to do it interactively. The agent can't open a Jupyter UI. It needs to send code, get results, and move on.
+
+ nbexec bridges this gap. The agent calls `nbexec exec --code "..."` and gets text output on stdout. It can also run an existing `.ipynb` notebook on the same kernel with `nbexec exec --file ./analysis.ipynb` — all code cells execute sequentially, and variables persist across all exec calls in the session. Behind the scenes, a daemon holds a persistent WebSocket connection to the remote Jupyter kernel, and every cell + output is recorded in a local `.ipynb` file that you can open in VS Code or Jupyter to see exactly what the agent did.
+
+ ## How it works
+
+ ```
+ Agent (Claude Code)         nbexec daemon               Remote Jupyter Server
+ ───────────────────         ─────────────               ─────────────────────
+                             (background process)
+
+ exec --code "..." ────────► Unix socket request
+                             append cell to .ipynb
+                             send to kernel via WS ────► kernel executes code
+                                                   ◄─── results on iopub
+                             write output to .ipynb
+ stdout: result ◄─────────── return output
+ ```
+
+ The daemon is a long-running background process that holds persistent WebSocket connections to one or more remote Jupyter servers. CLI commands (`exec`, `session create`, etc.) are thin clients that talk to the daemon over a Unix socket — each `exec` is a synchronous request/response.
+
+ This is the same protocol VS Code uses when you connect a local notebook to a remote Jupyter server. The notebook document stays local; only code strings are sent to the kernel. nbexec replicates this model for CLI/agent use, using [jupyter-kernel-client](https://github.com/datalayer/jupyter-kernel-client) to manage the kernel connection.
+
+ ## Why a CLI and not an MCP server or raw HTTP
+
+ **Agents don't think in cells.** Existing Jupyter MCP servers expose notebook operations — create cell, edit cell, move cell, run cell. But an agent executing code on a remote kernel doesn't care about cells. It just wants to send code and get results. It doesn't need to edit cell 5 or reorder cells — if something went wrong, it sends corrected code as the next execution. nbexec matches this model: send code, get output, move on. The notebook is just a side effect for human review, not something the agent manages.
+
+ **Clean context.** An MCP server's tool definitions live in the agent's prompt at all times. nbexec adds nothing to the prompt until the agent actually needs it — the skill loads on demand, and `--help` is only fetched when invoked.
+
+ **Full visibility.** Everything inside an MCP server is opaque to the agent — it can only call the tools that are exposed. With a CLI, the agent has access to the source code, can inspect how things work, and can understand or work around issues on its own.
+
+ **Persistent connections without agent coupling.** The daemon runs as a separate process, managing WebSocket connections and kernel sessions independently. The agent doesn't need to hold connections or re-establish them between calls. Sessions survive across multiple agent conversations. An MCP server's lifecycle is tied to the agent process that started it.
+
+ **Fewer tokens than raw HTTP.** The agent could call the Jupyter REST API directly via curl, but that means generating verbose HTTP requests for every cell execution, manually managing XSRF tokens, parsing WebSocket message framing, and tracking kernel/session IDs. A single `nbexec exec --session spark --code "..."` replaces all of that. Fewer generated tokens, simpler logic, same result.
+
+ **Self-documenting from the CLI.** The agent runs `nbexec --help` and gets everything it needs — commands, options, examples, workflow patterns. No need to embed documentation in MCP tool descriptions or maintain it in two places.
+
+ ## Inspiration
+
+ The architectural pattern — a long-lived daemon process, CLI-driven interaction, persistent state across calls, and a skill file for agent discovery — is inspired by [OpenClaw](https://github.com/openclaw/openclaw). nbexec applies the same intuition to a narrower problem: giving AI coding agents structured access to remote Jupyter kernels.
+
+ ## Installation
+
+ Requires Python 3.10+.
+
+ ```bash
+ uv tool install claude-nbexec
+ ```
+
+ Or with pip:
+
+ ```bash
+ pip install claude-nbexec
+ ```
+
+ ### Install the Claude Code skill
+
+ Clone the repo and run the skill installer:
+
+ ```bash
+ git clone https://github.com/anish749/claude-nbexec.git && cd claude-nbexec
+ ./install-skill.sh
+ ```
+
+ This installs a skill to `$CLAUDE_CONFIG_DIR/skills/nbexec/` that teaches Claude Code when and how to use nbexec.
+
+ ### Usage
+
+ All commands and options are documented in `nbexec --help`.
@@ -0,0 +1,75 @@
+ # nbexec
+
+ A CLI tool that lets AI agents (like Claude Code) execute code on remote Jupyter kernels. All executed code and outputs are logged to a local `.ipynb` notebook file for human review.
+
+ ## Why
+
+ When an AI agent needs to run code on a remote compute environment — a PySpark cluster, a GPU machine, a data warehouse notebook server — there's no simple way to do it interactively. The agent can't open a Jupyter UI. It needs to send code, get results, and move on.
+
+ nbexec bridges this gap. The agent calls `nbexec exec --code "..."` and gets text output on stdout. It can also run an existing `.ipynb` notebook on the same kernel with `nbexec exec --file ./analysis.ipynb` — all code cells execute sequentially, and variables persist across all exec calls in the session. Behind the scenes, a daemon holds a persistent WebSocket connection to the remote Jupyter kernel, and every cell + output is recorded in a local `.ipynb` file that you can open in VS Code or Jupyter to see exactly what the agent did.
+
+ ## How it works
+
+ ```
+ Agent (Claude Code)         nbexec daemon               Remote Jupyter Server
+ ───────────────────         ─────────────               ─────────────────────
+                             (background process)
+
+ exec --code "..." ────────► Unix socket request
+                             append cell to .ipynb
+                             send to kernel via WS ────► kernel executes code
+                                                   ◄─── results on iopub
+                             write output to .ipynb
+ stdout: result ◄─────────── return output
+ ```
+
+ The daemon is a long-running background process that holds persistent WebSocket connections to one or more remote Jupyter servers. CLI commands (`exec`, `session create`, etc.) are thin clients that talk to the daemon over a Unix socket — each `exec` is a synchronous request/response.
+
+ This is the same protocol VS Code uses when you connect a local notebook to a remote Jupyter server. The notebook document stays local; only code strings are sent to the kernel. nbexec replicates this model for CLI/agent use, using [jupyter-kernel-client](https://github.com/datalayer/jupyter-kernel-client) to manage the kernel connection.
+
+ ## Why a CLI and not an MCP server or raw HTTP
+
+ **Agents don't think in cells.** Existing Jupyter MCP servers expose notebook operations — create cell, edit cell, move cell, run cell. But an agent executing code on a remote kernel doesn't care about cells. It just wants to send code and get results. It doesn't need to edit cell 5 or reorder cells — if something went wrong, it sends corrected code as the next execution. nbexec matches this model: send code, get output, move on. The notebook is just a side effect for human review, not something the agent manages.
+
+ **Clean context.** An MCP server's tool definitions live in the agent's prompt at all times. nbexec adds nothing to the prompt until the agent actually needs it — the skill loads on demand, and `--help` is only fetched when invoked.
+
+ **Full visibility.** Everything inside an MCP server is opaque to the agent — it can only call the tools that are exposed. With a CLI, the agent has access to the source code, can inspect how things work, and can understand or work around issues on its own.
+
+ **Persistent connections without agent coupling.** The daemon runs as a separate process, managing WebSocket connections and kernel sessions independently. The agent doesn't need to hold connections or re-establish them between calls. Sessions survive across multiple agent conversations. An MCP server's lifecycle is tied to the agent process that started it.
+
+
40
+ **Fewer tokens than raw HTTP.** The agent could call the Jupyter REST API directly via curl, but that means generating verbose HTTP requests for every cell execution, manually managing XSRF tokens, parsing WebSocket message framing, and tracking kernel/session IDs. A single `nbexec exec --session spark --code "..."` replaces all of that. Less generated tokens, simpler logic, same result.
41
+
42
+ **Self-documenting from the CLI.** The agent runs `nbexec --help` and gets everything it needs — commands, options, examples, workflow patterns. No need to embed documentation in MCP tool descriptions or maintain it in two places.
43
+
44
+ ## Inspiration
45
+
46
+ The architectural pattern — a long-lived daemon process, CLI-driven interaction, persistent state across calls, and a skill file for agent discovery — is inspired by [OpenClaw](https://github.com/openclaw/openclaw). nbexec applies the same intuition to a narrower problem: giving AI coding agents structured access to remote Jupyter kernels.
47
+
48
+ ## Installation
49
+
50
+ Requires Python 3.10+.
51
+
52
+ ```bash
53
+ uv tool install claude-nbexec
54
+ ```
55
+
56
+ Or with pip:
57
+
58
+ ```bash
59
+ pip install claude-nbexec
60
+ ```
61
+
62
+ ### Install the Claude Code skill
63
+
64
+ Clone the repo and run the skill installer:
65
+
66
+ ```bash
67
+ git clone https://github.com/anish749/claude-nbexec.git && cd claude-nbexec
68
+ ./install-skill.sh
69
+ ```
70
+
71
+ This installs a skill to `$CLAUDE_CONFIG_DIR/skills/nbexec/` that teaches Claude Code when and how to use nbexec.
72
+
73
+ ### Usage
74
+
75
+ All commands and options are documented in `nbexec --help`.
@@ -0,0 +1,22 @@
+ #!/usr/bin/env bash
+ set -euo pipefail
+
+ REPO_DIR="$(cd "$(dirname "$0")" && pwd)"
+ SKILL_SRC="$REPO_DIR/.claude/skills/nbexec/SKILL.md.template"
+ CLAUDE_DIR="${CLAUDE_CONFIG_DIR:-$HOME/.claude}"
+ CLAUDE_DIR="${CLAUDE_DIR%/}"
+ SKILL_DST="$CLAUDE_DIR/skills/nbexec"
+
+ # Find nbexec: prefer PATH, fall back to local .venv
+ if command -v nbexec &>/dev/null; then
+     NBEXEC_PATH="$(command -v nbexec)"
+ elif [ -x "$REPO_DIR/.venv/bin/nbexec" ]; then
+     NBEXEC_PATH="$REPO_DIR/.venv/bin/nbexec"
+ else
+     echo "Error: nbexec not found. Install with: pip install claude-nbexec" >&2
+     exit 1
+ fi
+
+ mkdir -p "$SKILL_DST"
+ sed "s|__NBEXEC_PATH__|$NBEXEC_PATH|g" "$SKILL_SRC" > "$SKILL_DST/SKILL.md"
+ echo "Installed skill to $SKILL_DST/SKILL.md (nbexec path: $NBEXEC_PATH)"
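The placeholder substitution the installer performs can be tried in isolation. This is a minimal sketch: the temp-file path and the nbexec location below are illustrative stand-ins, not the installer's real values.

```shell
# Render a one-line template the same way install-skill.sh does:
# replace every __NBEXEC_PATH__ with a concrete binary path.
printf 'The nbexec binary is at: `__NBEXEC_PATH__`\n' > /tmp/skill-demo.template
NBEXEC_PATH="/usr/local/bin/nbexec"   # assumed location, for the demo only
sed "s|__NBEXEC_PATH__|$NBEXEC_PATH|g" /tmp/skill-demo.template
# -> The nbexec binary is at: `/usr/local/bin/nbexec`
```

Using `|` as the `sed` delimiter is what lets the replacement contain `/` characters, which any absolute path will.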
@@ -0,0 +1,24 @@
+ [project]
+ name = "claude-nbexec"
+ dynamic = ["version"]
+ description = "CLI daemon that proxies code execution to remote Jupyter kernels with local notebook logging"
+ readme = "README.md"
+ requires-python = ">=3.10"
+ dependencies = [
+     "click>=8.1",
+     "jupyter-kernel-client>=0.3",
+     "nbformat>=5.9",
+ ]
+
+ [project.scripts]
+ nbexec = "nbexec.cli.main:cli"
+
+ [build-system]
+ requires = ["hatchling", "hatch-vcs"]
+ build-backend = "hatchling.build"
+
+ [tool.hatch.version]
+ source = "vcs"
+
+ [tool.hatch.build.targets.wheel]
+ packages = ["src/nbexec"]
@@ -0,0 +1,51 @@
+ import asyncio
+ import json
+ import sys
+
+ from nbexec import protocol as proto
+ from nbexec.paths import socket_path
+
+
+ def send_to_daemon(method: str, params: dict | None = None, timeout: float | None = None) -> dict:
+     """Send a request to the daemon and return the response.
+
+     Raises SystemExit on connection errors so CLI commands fail cleanly.
+     """
+     sock = socket_path()
+     if not sock.exists():
+         print("Error: daemon is not running. Start it with: nbexec daemon start", file=sys.stderr)
+         sys.exit(1)
+
+     request = proto.make_request(method, params)
+     response = asyncio.run(_send(str(sock), request, timeout))
+
+     if not response.get("ok"):
+         print(f"Error: {response.get('error', 'unknown error')}", file=sys.stderr)
+         sys.exit(1)
+
+     return response["result"]
+
+
+ async def _send(sock_path: str, request: dict, timeout: float | None) -> dict:
+     try:
+         reader, writer = await asyncio.open_unix_connection(sock_path)
+     except (ConnectionRefusedError, FileNotFoundError):
+         print("Error: cannot connect to daemon. Is it running?", file=sys.stderr)
+         sys.exit(1)
+
+     try:
+         writer.write(proto.encode(request))
+         await writer.drain()
+
+         line = await asyncio.wait_for(reader.readline(), timeout=timeout)
+         if not line:
+             print("Error: daemon closed connection", file=sys.stderr)
+             sys.exit(1)
+
+         return proto.decode(line)
+     finally:
+         writer.close()
+         try:
+             await writer.wait_closed()
+         except Exception:
+             pass
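The client above reads exactly one reply line per request, which implies newline-delimited JSON framing over the Unix socket. That framing can be exercised end-to-end against an in-process stand-in for the daemon. This is a sketch, not the real protocol module: the `method`/`params` request shape and the `ok`/`result` envelope follow the client code above, but the exact fields emitted by `proto.make_request` are an assumption.

```python
import asyncio
import contextlib
import json
import os

SOCK = "/tmp/nbexec-demo.sock"  # illustrative path, not the daemon's real socket

async def fake_daemon(reader, writer):
    # Read one newline-delimited JSON request and answer with an ok-envelope.
    line = await reader.readline()
    req = json.loads(line)
    resp = {"ok": True, "result": {"echo": req["method"]}}
    writer.write((json.dumps(resp) + "\n").encode())
    await writer.drain()
    writer.close()

async def main():
    with contextlib.suppress(FileNotFoundError):
        os.unlink(SOCK)  # remove a stale socket from a previous run
    server = await asyncio.start_unix_server(fake_daemon, path=SOCK)
    reader, writer = await asyncio.open_unix_connection(SOCK)
    # Hypothetical request shape; the real one comes from proto.make_request().
    request = {"method": "exec", "params": {"session_id": "spark", "code": "1+1"}}
    writer.write((json.dumps(request) + "\n").encode())
    await writer.drain()
    reply = json.loads(await reader.readline())
    writer.close()
    server.close()
    return reply

response = asyncio.run(main())
print(response)
```

One line out, one line back: that is what makes each `exec` a simple synchronous request/response from the CLI's point of view.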
@@ -0,0 +1,38 @@
+ import json
+ import click
+
+ from nbexec.daemon.process import start_daemon, stop_daemon, daemon_status
+
+
+ @click.group()
+ def daemon():
+     """Manage the nbexec daemon."""
+     pass
+
+
+ @daemon.command()
+ def start():
+     """Start the daemon in the background."""
+     if start_daemon():
+         click.echo("Daemon started.")
+     else:
+         click.echo("Daemon is already running.")
+
+
+ @daemon.command()
+ def stop():
+     """Stop the daemon."""
+     if stop_daemon():
+         click.echo("Daemon stopped.")
+     else:
+         click.echo("Daemon is not running.")
+
+
+ @daemon.command()
+ def status():
+     """Check daemon status."""
+     info = daemon_status()
+     if info["running"]:
+         click.echo(f"Running (pid={info['pid']}, socket={info['socket']})")
+     else:
+         click.echo("Not running.")
@@ -0,0 +1,81 @@
+ import sys
+ from pathlib import Path
+
+ import click
+ import nbformat
+
+ from nbexec import protocol as proto
+ from .client import send_to_daemon
+
+
+ def _exec_one(session_id, code, timeout):
+     """Execute a single code string and print output. Returns True on success."""
+     result = send_to_daemon(
+         proto.EXEC,
+         {"session_id": session_id, "code": code},
+         timeout=timeout,
+     )
+     text = result.get("text", "")
+     if text:
+         click.echo(text)
+     return result.get("status") != "error"
+
+
+ def _exec_notebook(session_id, notebook_path, timeout, from_cell, to_cell):
+     """Execute code cells from a .ipynb file sequentially."""
+     nb = nbformat.read(notebook_path, as_version=4)
+     code_cells = [c for c in nb.cells if c.cell_type == "code" and c.source.strip()]
+     total = len(code_cells)
+     if not code_cells:
+         click.echo("No code cells found in notebook", err=True)
+         sys.exit(1)
+
+     # --from-cell and --to-cell are 1-based, inclusive
+     start = (from_cell or 1) - 1
+     end = to_cell or total
+
+     if start >= total:
+         click.echo(f"--from-cell {from_cell} is beyond the {total} code cells in the notebook", err=True)
+         sys.exit(1)
+
+     selected = code_cells[start:end]
+     if not selected:
+         click.echo("No code cells in the specified range", err=True)
+         sys.exit(1)
+
+     for i, cell in enumerate(selected, start + 1):
+         click.echo(f"--- cell {i}/{total} ---", err=True)
+         ok = _exec_one(session_id, cell.source, timeout)
+         if not ok:
+             click.echo(f"Cell {i} failed, stopping.", err=True)
+             sys.exit(1)
+
+
+ @click.command()
+ @click.option("--session", "session_id", required=True, help="Session ID")
+ @click.option("--code", default=None, help="Code to execute")
+ @click.option("--file", "file_path", default=None, type=click.Path(exists=True), help="File containing code to execute (.py or .ipynb)")
+ @click.option("--timeout", default=None, type=float, help="Execution timeout in seconds (default: no timeout)")
+ @click.option("--from-cell", "from_cell", default=None, type=int, help="Start from this code cell (1-based, inclusive). Only for .ipynb files.")
+ @click.option("--to-cell", "to_cell", default=None, type=int, help="Stop at this code cell (1-based, inclusive). Only for .ipynb files.")
+ def exec_code(session_id, code, file_path, timeout, from_cell, to_cell):
+     """Execute code on a remote kernel."""
+     if (from_cell or to_cell) and (file_path is None or not file_path.endswith(".ipynb")):
+         click.echo("--from-cell and --to-cell can only be used with .ipynb files", err=True)
+         sys.exit(1)
+
+     if code is None and file_path is None:
+         # Read from stdin
+         if sys.stdin.isatty():
+             click.echo("Error: provide --code, --file, or pipe code via stdin", err=True)
+             sys.exit(1)
+         code = sys.stdin.read()
+     elif file_path is not None:
+         if file_path.endswith(".ipynb"):
+             _exec_notebook(session_id, file_path, timeout, from_cell, to_cell)
+             return
+         code = Path(file_path).read_text()
+
+     ok = _exec_one(session_id, code, timeout)
+     if not ok:
+         sys.exit(1)
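The `--from-cell`/`--to-cell` arithmetic in `_exec_notebook` (1-based, inclusive on both ends) is the kind of thing that is easy to get off by one. A standalone sketch of the same slicing, with `select_cells` as a hypothetical helper mirroring the `start`/`end` computation above:

```python
def select_cells(cells, from_cell=None, to_cell=None):
    # Mirrors _exec_notebook: both bounds are 1-based and inclusive.
    total = len(cells)
    start = (from_cell or 1) - 1  # 1-based -> 0-based start index
    end = to_cell or total        # slice end is exclusive, so inclusive N maps to index N
    return cells[start:end]

cells = list(range(1, 11))  # stand-in for 10 code cells, numbered 1..10
print(select_cells(cells, from_cell=5))             # [5, 6, 7, 8, 9, 10]
print(select_cells(cells, to_cell=3))               # [1, 2, 3]
print(select_cells(cells, from_cell=3, to_cell=5))  # [3, 4, 5]
```

Because Python slice ends are exclusive, the inclusive `to_cell` needs no `-1`: `cells[start:to_cell]` already stops after cell number `to_cell`.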
@@ -0,0 +1,12 @@
+ import click
+
+ from nbexec import protocol as proto
+ from .client import send_to_daemon
+
+
+ @click.command()
+ @click.option("--session", "session_id", required=True, help="Session ID")
+ def interrupt(session_id):
+     """Interrupt a running execution on a remote kernel."""
+     send_to_daemon(proto.INTERRUPT, {"session_id": session_id})
+     click.echo(f"Interrupt sent to session '{session_id}'.")
@@ -0,0 +1,179 @@
+ import sys
+
+ import click
+
+ from .daemon_cmds import daemon
+ from .session_cmds import session
+ from .exec_cmd import exec_code
+ from .interrupt_cmd import interrupt
+
+ USAGE = """\
+ nbexec — CLI daemon that proxies code execution to remote Jupyter kernels
+ with local notebook logging.
+
+ Designed for AI agent use. An agent sends code strings to a remote Jupyter
+ kernel (e.g. PySpark, Python) and gets text results back. All executed code
+ and outputs are recorded in a local .ipynb file for human inspection.
+
+ Architecture: a background daemon holds persistent WebSocket connections to
+ remote Jupyter servers. CLI commands talk to the daemon via a Unix socket.
+ Multiple sessions can run simultaneously, each connected to a different
+ (or the same) Jupyter server with its own kernel and notebook file.
+
+ Usage:
+   nbexec <command> [options]
+
+ Commands:
+
+   daemon start
+     Start the nbexec daemon in the background. The daemon listens on a
+     Unix socket at ~/.local/state/nbexec/nbexec.sock and manages all
+     kernel connections. Idempotent — prints a message if already running.
+
+     Example:
+       nbexec daemon start
+
+   daemon stop
+     Stop the daemon. Closes all sessions (shutting down remote kernels),
+     saves all notebooks, removes the socket and PID file.
+
+     Example:
+       nbexec daemon stop
+
+   daemon status
+     Check if the daemon is running. Prints PID and socket path if running.
+
+     Example:
+       nbexec daemon status
+
+   session create
+     Create a new session: connects to a remote Jupyter server, starts a
+     kernel, and creates a local .ipynb notebook file. The session ID is
+     used in subsequent exec and close commands.
+
+     Options:
+       --server URL      Jupyter server URL (required, e.g. http://localhost:8888)
+       --token TOKEN     Jupyter server auth token (required)
+       --notebook PATH   Local path for the .ipynb log file (required)
+       --name NAME       Session name/ID (optional, auto-generated if omitted)
+       --kernel NAME     Kernel name (default: python3)
+
+     Examples:
+       nbexec session create --server http://localhost:8888 --token abc123 \\
+         --notebook ./spark_session.ipynb --name spark
+       nbexec session create --server http://localhost:9999 --token xyz \\
+         --notebook ./analysis.ipynb --name analysis
+
+   session list
+     List all active sessions. Shows session ID, server URL, cell count,
+     and notebook path for each session.
+
+     Example:
+       nbexec session list
+
+   session close
+     Close a session: shuts down the remote kernel, saves the notebook
+     file, and removes the session from the daemon.
+
+     Options:
+       --session ID   Session ID to close (required)
+
+     Example:
+       nbexec session close --session spark
+
+   exec
+     Execute code on a remote kernel. Sends the code string to the kernel,
+     waits for completion, prints the output to stdout, and records the
+     cell and outputs in the session's notebook file.
+
+     Exit code is 0 on success, 1 on execution error.
+
+     Options:
+       --session ID    Session ID (required)
+       --file PATH     File containing code to execute (recommended).
+                       Supports .ipynb files — all code cells in the
+                       notebook are executed sequentially on the session's
+                       kernel. Execution stops on the first cell error.
+       --code CODE     Code string to execute (simple one-liners only)
+       --from-cell N   Start from code cell N (1-based, inclusive, .ipynb only)
+       --to-cell N     Stop at code cell N (1-based, inclusive, .ipynb only)
+
+     If neither --code nor --file is given, reads code from stdin.
+
+     Variables persist across exec calls within the same session (same
+     kernel). Each exec appends a new cell to the notebook.
+
+   interrupt
+     Interrupt a currently running execution on a remote kernel. Sends a
+     SIGINT to the kernel process, which will cause most Python code to
+     raise a KeyboardInterrupt. Use this to cancel long-running cells.
+
+     Options:
+       --session ID   Session ID (required)
+
+     Example:
+       nbexec interrupt --session spark
+
+ IMPORTANT — how to send code:
+
+   Prefer --file for anything beyond a trivial one-liner. Write the
+   code to a temporary file first, then pass the path. This avoids
+   bash escaping issues with quotes, newlines, and special characters
+   that are common in Python/SQL code.
+
+   Use --code only for simple single-line expressions like:
+     nbexec exec --session spark --code "df.show()"
+     nbexec exec --session spark --code "print(x)"
+
+   For multiline code, write to a file first, then use --file:
+     nbexec exec --session spark --file /tmp/cell.py
+
+   To run an existing .ipynb notebook (e.g. shared setup, a saved
+   analysis, or a notebook the user points you to), pass it to --file.
+   All code cells run sequentially on the session's kernel, and variables
+   persist in both directions:
+     nbexec exec --session spark --file ./analysis.ipynb
+
+   Use --from-cell and --to-cell to run a subset of code cells. Both are
+   1-based and inclusive. For a notebook with 10 code cells:
+     --from-cell 5              runs cells 5, 6, 7, 8, 9, 10
+     --to-cell 3                runs cells 1, 2, 3
+     --from-cell 3 --to-cell 5  runs cells 3, 4, 5
+
+ Agent Workflow Examples:
+
+   Start a session, run a notebook, then explore interactively:
+     nbexec daemon start
+     nbexec session create --server http://localhost:8888 --token $TOKEN \\
+       --notebook ./session.ipynb --name spark
+     nbexec exec --session spark --file ./analysis.ipynb
+     nbexec exec --session spark --code "df.show()"
+     nbexec exec --session spark --file /tmp/query.py
+     nbexec session close --session spark
+     nbexec daemon stop
+
+   Inspect what was executed:
+     Open the .ipynb file in VS Code or Jupyter to see all cells and outputs.
+     The notebook is updated after every exec call.
+
+ Runtime:
+   PID file:     ~/.local/state/nbexec/daemon.pid
+   Unix socket:  ~/.local/state/nbexec/nbexec.sock
+   Log file:     ~/.local/state/nbexec/daemon.log
+ """
+
+
+ class NbexecGroup(click.Group):
+     def format_help(self, ctx, formatter):
+         formatter.write(USAGE)
+
+
+ @click.group(cls=NbexecGroup)
+ def cli():
+     pass
+
+
+ cli.add_command(daemon)
+ cli.add_command(session)
+ cli.add_command(exec_code, name="exec")
+ cli.add_command(interrupt)