nerve-mcp 0.2.0 → 0.2.1
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +208 -0
- package/package.json +2 -2
package/README.md
ADDED
@@ -0,0 +1,208 @@
# Nerve

Nerve gives AI agents eyes and hands inside iOS apps.

Add the MCP server to your AI agent, and it can see every element on screen, tap buttons, fill forms, scroll, inspect state, intercept network calls, and debug your iOS app — all through natural language. No code changes needed.

## Setup

### 1. Install the MCP Server

```bash
npx nerve-mcp@latest
```

Or install globally:

```bash
npm install -g nerve-mcp
```
Or clone and build from source:

```bash
git clone https://github.com/luchi0208/nerve-ios.git
cd nerve-ios/mcp-server && npm install && npm run build
```

### 2. Configure Your AI Agent

**Claude Code** — add to your project's `.mcp.json`:

```json
{
  "mcpServers": {
    "nerve": {
      "command": "npx",
      "args": ["nerve-mcp@latest"]
    }
  }
}
```

**Claude Desktop / Cursor / Other MCP clients:**

```json
{
  "mcpServers": {
    "nerve": {
      "command": "npx",
      "args": ["nerve-mcp@latest"]
    }
  }
}
```

If installed from source:

```json
{
  "mcpServers": {
    "nerve": {
      "command": "node",
      "args": ["/path/to/nerve/mcp-server/dist/index.js"]
    }
  }
}
```

That's it. Tell your AI agent to build and run your app — Nerve auto-injects on the Simulator with no code changes needed.

## Example

Things you can ask your AI agent with Nerve:

> "Run my app on the simulator. Go to the checkout screen and try submitting an empty form — what validation errors show up?"

> "There's a bug where the cart badge doesn't update after removing an item. Can you reproduce it and check the console logs?"

> "Navigate through every screen in the app and find any buttons that don't respond to taps."

> "The login screen looks broken on iPhone SE. Run it on that simulator and screenshot just the login form so I can see what's wrong."

> "Trace all calls to `CartManager.addItem` and then add three items to the cart. Show me what arguments are being passed."

> "Check what's stored in UserDefaults after onboarding completes. I think we're saving the auth token in the wrong key."

> "Intercept the network requests when I pull to refresh on the orders screen. Show me the response bodies — I think the API is returning stale data."

These are just starting points. The agent combines Nerve's tools on its own — you describe what you want in plain English, and it figures out the sequence of taps, inspections, and checks to get there.

## How It Works

Nerve auto-injects into the app at launch on the Simulator — no code changes needed. It runs inside the app process, starts a WebSocket server, and the MCP server on the Mac connects to it. AI agent tool calls are translated into commands executed inside the app.

Because it runs in-process, Nerve has access to the full view hierarchy, the Objective-C runtime, live objects, network delegates, and the HID event system.

```
AI Agent → MCP Server (Mac) → WebSocket → Nerve (in-app) → UIKit/SwiftUI
```
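In code, the relay step amounts to serializing each tool call into a message for the in-app server. The envelope below is purely illustrative: the `NerveCommand` shape and `toCommand` helper are invented for this sketch, and Nerve's real wire protocol is internal.

```typescript
// Illustrative only: a made-up envelope showing how an MCP tool call could be
// relayed as a WebSocket message to the in-app server. Not Nerve's real protocol.
interface NerveCommand {
  id: number;                     // correlates a request with its response
  tool: string;                   // e.g. "nerve_tap"
  args: Record<string, unknown>;  // tool arguments, e.g. { query: "@e2" }
}

function toCommand(id: number, tool: string, args: Record<string, unknown>): string {
  const cmd: NerveCommand = { id, tool, args };
  return JSON.stringify(cmd);
}

// A tap on element @e2 would travel over the socket as:
// {"id":1,"tool":"nerve_tap","args":{"query":"@e2"}}
```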

## Tools

### See the Screen

| Tool | Description |
|------|-------------|
| `nerve_view` | See all visible elements with type, label, ID, tap point, and position |
| `nerve_tree` | Full view hierarchy (UIKit + SwiftUI) |
| `nerve_inspect` | Detailed properties of a specific element |
| `nerve_screenshot` | Capture the screen as an image |

### Interact

| Tool | Description |
|------|-------------|
| `nerve_tap` | Tap an element by `@eN` ref, `#identifier`, `@label`, or coordinates |
| `nerve_type` | Type text into the focused field |
| `nerve_scroll` | Scroll in any direction |
| `nerve_swipe` | Swipe gesture |
| `nerve_long_press` | Long press |
| `nerve_double_tap` | Double tap |
| `nerve_drag_drop` | Drag from one element to another |
| `nerve_pull_to_refresh` | Pull to refresh |
| `nerve_pinch` | Pinch/zoom |
| `nerve_context_menu` | Open context menu |
| `nerve_back` | Navigate back |
| `nerve_dismiss` | Dismiss keyboard or modal |

### Navigate

| Tool | Description |
|------|-------------|
| `nerve_map` | See all discovered screens and transitions |
| `nerve_navigate` | Auto-navigate to a known screen |
| `nerve_scroll_to_find` | Scroll until an element appears |
| `nerve_deeplink` | Open a URL scheme |

### Inspect & Debug

| Tool | Description |
|------|-------------|
| `nerve_console` | App logs (stdout/stderr) |
| `nerve_network` | Intercepted HTTP traffic with response bodies |
| `nerve_heap` | Find live object instances by class name |
| `nerve_storage` | Read UserDefaults, Keychain, cookies, files |
| `nerve_trace` | Swizzle any method to log calls |
| `nerve_highlight` | Draw colored borders on elements for visual debugging |
| `nerve_modify` | Change view properties at runtime |
| `nerve_lldb` | Full LLDB debugger access |

### Build & Launch

| Tool | Description |
|------|-------------|
| `nerve_run` | Build, install, and launch on the simulator (auto-injects Nerve) |
| `nerve_build` | Build only |
| `nerve_status` | Show connected targets |
| `nerve_list_simulators` | List available simulators |
| `nerve_boot_simulator` | Boot a simulator by name or UDID |
| `nerve_appearance` | Switch between light and dark mode |
| `nerve_grant_permissions` | Pre-grant iOS permissions |

## Element Queries

Nerve supports several query formats for targeting elements:

| Format | Example | Description |
|--------|---------|-------------|
| `@eN` | `@e2` | Element ref from `nerve_view` output |
| `#id` | `#login-btn` | Accessibility identifier |
| `@label` | `@Settings` | Accessibility label |
| `.type:index` | `.field:0` | Element type with index |
| `x,y` | `195,160` | Screen coordinates |
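Since the formats are syntactically disjoint, a client can tell them apart with a few pattern checks. A minimal sketch, where the `classifyQuery` helper is hypothetical and not part of Nerve's API:

```typescript
// Hypothetical helper: classify an element query string by the documented formats.
type QueryKind = "ref" | "identifier" | "label" | "type-index" | "coordinates";

function classifyQuery(q: string): QueryKind {
  // @eN must be checked before the generic @label form, since both start with "@".
  if (/^@e\d+$/.test(q)) return "ref";              // @e2
  if (q.startsWith("#")) return "identifier";       // #login-btn
  if (q.startsWith("@")) return "label";            // @Settings
  if (/^\.\w+:\d+$/.test(q)) return "type-index";   // .field:0
  if (/^\d+,\d+$/.test(q)) return "coordinates";    // 195,160
  throw new Error(`unrecognized element query: ${q}`);
}
```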

The `nerve_view` output shows each element with its ref and tap point:

```
@e1 btn "Product A" #product-a tap=195,222 x=16 y=195 w=358 h=54
@e2 field val=Email tap=195,160 x=32 y=149 w=326 h=22
```

Use `@e2` to tap that field — Nerve uses the element's activation point (center), which is always the correct hittable position.
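The activation point can be checked against the frame fields in the sample above: for `@e2`, x + w/2 = 32 + 163 = 195 and y + h/2 = 149 + 11 = 160, exactly the listed `tap=195,160`. A small parser sketch; the `parseViewLine` helper and its assumptions about the line grammar are ours, not Nerve's:

```typescript
// Hypothetical parser for one nerve_view output line; assumes the field layout
// shown above (ref first, then tap=X,Y and frame fields x= y= w= h=).
interface ViewElement {
  ref: string;
  tap: { x: number; y: number };
  frame: { x: number; y: number; w: number; h: number };
}

function parseViewLine(line: string): ViewElement {
  const ref = line.match(/^@e\d+/)?.[0];
  const tap = line.match(/\btap=(\d+),(\d+)/);
  const field = (key: string): number => {
    const m = line.match(new RegExp(`\\b${key}=(\\d+)`));
    if (!m) throw new Error(`missing ${key} in: ${line}`);
    return Number(m[1]);
  };
  if (!ref || !tap) throw new Error(`unparseable nerve_view line: ${line}`);
  return {
    ref,
    tap: { x: Number(tap[1]), y: Number(tap[2]) },
    frame: { x: field("x"), y: field("y"), w: field("w"), h: field("h") },
  };
}

const el = parseViewLine('@e2 field val=Email tap=195,160 x=32 y=149 w=326 h=22');
// el.frame.x + el.frame.w / 2 === el.tap.x  (32 + 163 = 195)
```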

## Architecture

```
Nerve/
  Sources/
    Nerve/        Swift framework — commands, element resolution, inspection
    NerveObjC/    ObjC/C bridge — touch synthesis, heap walking, swizzling
  Example/        Example app with test views
  Tests/
    E2E/          End-to-end tests (83 tests)
    NerveTests/   Unit tests
  mcp-server/     MCP server (TypeScript)
  cli/            CLI tool
```

## Requirements

- macOS 14+
- Xcode 16+
- iOS Simulator (iOS 16+)
- Node.js 18+

## License

MIT
package/package.json
CHANGED
@@ -1,6 +1,6 @@
 {
   "name": "nerve-mcp",
-  "version": "0.2.0",
+  "version": "0.2.1",
   "description": "MCP server for Nerve — gives AI agents runtime access to iOS apps",
   "main": "dist/index.js",
   "bin": {
@@ -10,7 +10,7 @@
     "build": "tsc",
     "start": "node dist/index.js",
     "dev": "tsx src/index.ts",
-    "prepublishOnly": "npm run build && npm run bundle-framework",
+    "prepublishOnly": "npm run build && npm run bundle-framework && cp ../README.md README.md",
     "bundle-framework": "bash ../scripts/build-framework.sh && rm -rf framework && mkdir -p framework/Nerve.framework && cp ../.build/inject/Nerve.framework/Nerve framework/Nerve.framework/ && cp ../.build/inject/Nerve.framework/Info.plist framework/Nerve.framework/"
   },
   "repository": {