@syndicalt/snow-cli 1.0.0 → 1.5.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (3)
  1. package/README.md +1062 -545
  2. package/dist/index.js +2047 -237
  3. package/package.json +52 -52
package/README.md CHANGED
@@ -1,545 +1,1062 @@
1
- # snow-cli
2
-
3
- A portable CLI for ServiceNow. Query tables, inspect schemas, edit script fields, and generate complete applications using AI — all from your terminal.
4
-
5
- - [Installation](#installation)
6
- - [Quick Start](#quick-start)
7
- - [Commands](#commands)
8
- - [snow instance](#snow-instance)
9
- - [snow table](#snow-table)
10
- - [snow schema](#snow-schema)
11
- - [snow script](#snow-script)
12
- - [snow provider](#snow-provider)
13
- - [snow ai](#snow-ai)
14
- - [Configuration File](#configuration-file)
15
- - [Development](#development)
16
-
17
- ---
18
-
19
- ## Installation
20
-
21
- **From npm (when published):**
22
- ```bash
23
- npm install -g snow-cli
24
- ```
25
-
26
- **From source:**
27
- ```bash
28
- git clone <repo>
29
- cd snow-cli
30
- npm install
31
- npm run build
32
- npm link
33
- ```
34
-
35
- Requires Node.js 18+.
36
-
37
- ---
38
-
39
- ## Quick Start
40
-
41
- ```bash
42
- # 1. Add a ServiceNow instance
43
- snow instance add
44
-
45
- # 2. Query a table
46
- snow table get incident -q "active=true" -l 10 -f "number,short_description,state,assigned_to"
47
-
48
- # 3. Configure an AI provider
49
- snow provider set openai
50
-
51
- # 4. Generate a simple feature
52
- snow ai build "Create a script include that auto-routes incidents by category and urgency"
53
-
54
- # 5. Generate a full scoped application (AI decides scope automatically)
55
- snow ai build "Build a hardware asset request application with a custom table, approval script include, and assignment business rule"
56
-
57
- # 6. Start an interactive session to build iteratively
58
- snow ai chat
59
- ```
60
-
61
- ---
62
-
63
- ## Commands
64
-
65
- ### `snow instance`
66
-
67
- Manage ServiceNow instance connections. Credentials are stored in `~/.snow/config.json` (mode `0600`).
68
-
69
- | Command | Description |
70
- |---|---|
71
- | `snow instance add` | Interactively add an instance (prompts for alias, URL, auth type) |
72
- | `snow instance list` | List all configured instances, showing the active one |
73
- | `snow instance use <alias>` | Switch the active instance |
74
- | `snow instance remove <alias>` | Remove an instance |
75
- | `snow instance test` | Test the active instance connection |
76
-
77
- **Adding an instance:**
78
- ```bash
79
- # Interactive (recommended)
80
- snow instance add
81
-
82
- # Basic auth — non-interactive
83
- snow instance add --alias dev --url https://dev12345.service-now.com --auth basic
84
-
85
- # OAuth (password grant) — prompts for client ID and secret
86
- snow instance add --alias prod --url https://prod.service-now.com --auth oauth
87
- ```
88
-
89
- OAuth access tokens are fetched automatically using the password grant flow and cached in the config file with their expiry time. They are refreshed transparently when they expire.
90
-
91
- ---
92
-
93
- ### `snow table`
94
-
95
- Table API CRUD operations. Output defaults to a terminal-friendly table; use `--json` for raw JSON.
96
-
97
- ```bash
98
- # Query records (table format by default)
99
- snow table get incident -q "active=true^priority=1" -l 20
100
- snow table get incident -q "active=true" -f "number,short_description,assigned_to" -l 50
101
-
102
- # Output as JSON array
103
- snow table get incident -q "active=true" -l 5 --json
104
-
105
- # Fetch a single record by sys_id
106
- snow table fetch incident <sys_id>
107
- snow table fetch incident <sys_id> -f "number,state,short_description"
108
-
109
- # Create a record
110
- snow table create incident -d '{"short_description":"VPN issue","urgency":"2","category":"network"}'
111
-
112
- # Update a record
113
- snow table update incident <sys_id> -d '{"state":"6","close_notes":"Resolved"}'
114
-
115
- # Delete a record (prompts for confirmation)
116
- snow table delete incident <sys_id>
117
- snow table delete incident <sys_id> --yes # skip confirmation
118
- ```
119
-
120
- **`snow table get` flags:**
121
-
122
- | Flag | Description |
123
- |---|---|
124
- | `-q, --query <sysparm_query>` | ServiceNow encoded query string |
125
- | `-f, --fields <fields>` | Comma-separated field list to return |
126
- | `-l, --limit <n>` | Max records (default: `20`) |
127
- | `-o, --offset <n>` | Pagination offset (default: `0`) |
128
- | `--display-value` | Return display values instead of raw values |
129
- | `--json` | Output as a JSON array instead of a table |
130
-
131
- **Output behaviour:** When a query returns more fields than can fit in the terminal, the CLI automatically switches to a **card layout** (one record per block) rather than trying to squash too many columns. Use `-f` to select specific fields for tabular output.
132
-
133
- ---
134
-
135
- ### `snow schema`
136
-
137
- Inspect field definitions for any table by querying `sys_dictionary`.
138
-
139
- ```bash
140
- snow schema incident
141
- snow schema sys_user --filter email # filter fields by name or label
142
- snow schema cmdb_ci_server --format json # JSON output
143
- ```
144
-
145
- Output columns: field name, label, type, max length, and flags (`M` = mandatory, `R` = read-only, `ref=<table>` for reference fields).
146
-
147
- ---
148
-
149
- ### `snow script`
150
-
151
- Pull a script field to disk, open it in your editor, then push the edited version back — all in one workflow.
152
-
153
- ```bash
154
- # Pull → edit → push (interactive)
155
- snow script pull sys_script_include <sys_id> script
156
-
157
- # Pull to a specific file path without opening an editor
158
- snow script pull sys_script_include <sys_id> script --no-open -o ./my-script.js
159
-
160
- # Push a local file to a script field
161
- snow script push sys_script_include <sys_id> script ./my-script.js
162
-
163
- # Push the last-pulled file for a record (no file path needed)
164
- snow script push sys_script_include <sys_id> script
165
-
166
- # List locally cached scripts
167
- snow script list
168
- ```
169
-
170
- **Supported field types and file extensions:**
171
-
172
- | Field type | Extension |
173
- |---|---|
174
- | Script (default) | `.js` |
175
- | HTML | `.html` |
176
- | CSS | `.css` |
177
- | XML | `.xml` |
178
- | JSON | `.json` |
179
-
180
- **Editor resolution order:**
181
- 1. `--editor <cmd>` flag
182
- 2. `$VISUAL` environment variable
183
- 3. `$EDITOR` environment variable
184
- 4. First found: `code`, `notepad++`, `notepad` (Windows) / `code`, `nvim`, `vim`, `nano`, `vi` (Unix)
185
-
186
- Cached scripts are stored in `~/.snow/scripts/`.
187
-
188
- ---
189
-
190
- ### `snow provider`
191
-
192
- Configure LLM providers used by `snow ai`. API keys and model preferences are stored in `~/.snow/config.json`.
193
-
194
- | Command | Description |
195
- |---|---|
196
- | `snow provider set <name>` | Add or update a provider (prompts for key and model) |
197
- | `snow provider list` | List all configured providers |
198
- | `snow provider use <name>` | Set the active provider |
199
- | `snow provider show` | Show the active provider details |
200
- | `snow provider test [name]` | Send a test message to verify connectivity |
201
- | `snow provider remove <name>` | Remove a provider configuration |
202
-
203
- **Supported providers:**
204
-
205
- | Name | Models | Notes |
206
- |---|---|---|
207
- | `openai` | `gpt-4o`, `gpt-4-turbo`, `gpt-4o-mini`, … | Requires OpenAI API key |
208
- | `anthropic` | `claude-opus-4-6`, `claude-sonnet-4-6`, … | Requires Anthropic API key |
209
- | `xai` | `grok-3`, `grok-2`, … | Requires xAI API key; uses OpenAI-compatible API |
210
- | `ollama` | `llama3`, `mistral`, `codellama`, … | Local inference; no API key needed |
211
-
212
- **Setup examples:**
213
-
214
- ```bash
215
- # OpenAI (prompts interactively for key and model)
216
- snow provider set openai
217
-
218
- # Anthropic
219
- snow provider set anthropic
220
-
221
- # xAI / Grok
222
- snow provider set xai
223
-
224
- # Ollama (local; no API key required)
225
- snow provider set ollama --model llama3
226
- snow provider set ollama --model codellama --url http://localhost:11434
227
-
228
- # Non-interactive with flags
229
- snow provider set openai --key sk-... --model gpt-4o
230
-
231
- # Switch between configured providers
232
- snow provider use anthropic
233
- snow provider use openai
234
-
235
- # Verify a provider is working
236
- snow provider test
237
- snow provider test anthropic
238
- ```
239
-
240
- ---
241
-
242
- ### `snow ai`
243
-
244
- Generate ServiceNow applications using an LLM. The AI produces structured artifacts that are exported as an importable **update set XML** file and optionally pushed directly to your instance via the Table API. When a request warrants it, the AI automatically creates a **scoped application** to namespace the artifacts.
245
-
246
- #### Supported artifact types
247
-
248
- | Type | ServiceNow table(s) | Description |
249
- |---|---|---|
250
- | `script_include` | `sys_script_include` | Reusable server-side JavaScript class |
251
- | `business_rule` | `sys_script` | Auto-triggers on table insert/update/delete/query |
252
- | `client_script` | `sys_client_script` | Browser-side form script (onLoad, onChange, onSubmit) |
253
- | `ui_action` | `sys_ui_action` | Form button or context menu item |
254
- | `ui_page` | `sys_ui_page` | Custom HTML page with Jelly templating |
255
- | `scheduled_job` | `sys_trigger` | Script that runs on a schedule |
256
- | `table` | `sys_db_object` + `sys_dictionary` + `sys_choice` | Custom database table with typed columns and choice lists |
257
- | `decision_table` | `sys_decision` + children | Ordered rule set mapping input conditions to an output value |
258
- | `flow_action` | `sys_hub_action_type_definition` + children | Reusable custom action for Flow Designer |
259
-
260
- #### Scoped applications
261
-
262
- The AI automatically determines whether a build warrants its own application scope. Scope is added when:
263
- - The request creates two or more custom tables
264
- - The request is described as an "application" or "module"
265
- - A distinct namespace is needed to avoid naming conflicts
266
-
267
- When a scope is generated, the build includes a `sys_app` record and every artifact is stamped with `sys_scope`/`sys_package`. All custom table names and script include API names are automatically prefixed (e.g. `x_myco_myapp_tablename`). The scope is shown in the build summary:
268
-
269
- ```
270
- My Incident App
271
- scope: x_myco_incapp v1.0.0
272
-
273
- + table x_myco_incapp_request
274
- + script_include x_myco_incapp.RequestUtils
275
- + business_rule Auto-Assign on Insert
276
- ```
277
-
278
- For Table API push, the CLI resolves or creates the scoped application on the target instance before pushing artifacts.
279
-
280
- #### `snow ai build <prompt>`
281
-
282
- Generate a ServiceNow application from a text description. The AI may ask clarifying questions before generating — answer them interactively and the build proceeds once the AI has enough information. Saves output to a directory named after the update set.
283
-
284
- ```bash
285
- snow ai build "Create a script include that routes incidents based on category and urgency"
286
-
287
- snow ai build "Add a before business rule on the incident table that sets priority = urgency + impact when a record is inserted"
288
-
289
- # Save to a custom directory
290
- snow ai build "Create a client script that hides the CI field when category is Software" --output my-feature
291
-
292
- # Push artifacts directly to the active instance after generating
293
- snow ai build "Create a scheduled job that closes resolved incidents older than 30 days" --push
294
-
295
- # Use a specific provider for this request only
296
- snow ai build "..." --provider anthropic
297
-
298
- # Debug mode — prints raw LLM response and full error traces
299
- snow ai build "..." --debug
300
- ```
301
-
302
- **Interactive clarification:** If your prompt is vague or the AI needs more detail, it will ask targeted questions before generating. You answer in the terminal and the conversation continues until the AI produces the final build.
303
-
304
- ```
305
- AI > To generate the right artifacts, a few questions:
306
- 1. Which table should the business rule operate on?
307
- 2. Should it fire on insert, update, or both?
308
-
309
- You > incident table, on insert only
310
-
311
- [AI generates the build]
312
- ```
313
-
314
- **Post-build flow:** After generating, the CLI presents two optional prompts:
315
-
316
- 1. **Review artifacts** — opens an interactive selector to inspect and edit any code field in your preferred editor before deploying
317
- 2. **Push to instance** — confirm to deploy directly, or decline to deploy later with `snow ai push`
318
-
319
- **Output structure:**
320
- ```
321
- incident-priority-router/
322
- incident-priority-router.xml ← importable update set XML
323
- incident-priority-router.manifest.json ← artifact manifest for snow ai push/review
324
- ```
325
-
326
- **Importing the update set in ServiceNow:**
327
- 1. Navigate to **System Update Sets > Retrieved Update Sets**
328
- 2. Click **Import Update Set from XML**
329
- 3. Upload the `.xml` file
330
- 4. Click **Load Update Set**, then **Preview**, then **Commit**
331
-
332
- **Generating tables:**
333
- ```
334
- snow ai build "Create a custom table for tracking hardware asset requests with fields for requester, asset type, urgency, and status"
335
- ```
336
- The AI generates `sys_db_object` + `sys_dictionary` entries for each column, plus `sys_choice` records for any choice fields. If the table extends `task`, inherited fields (number, state, assigned_to, etc.) are not re-declared.
337
-
338
- **Generating decision tables:**
339
- ```
340
- snow ai build "Create a decision table that maps urgency and impact values to a priority level"
341
- ```
342
- Produces a `sys_decision` record with input columns, ordered rules, and per-rule conditions. Decision tables are evaluated top-down — the first matching rule wins.
343
-
344
- **Generating flow actions:**
345
- ```
346
- snow ai build "Create a Flow Designer action that creates an incident from an email subject and body and returns the sys_id"
347
- ```
348
- Produces a `sys_hub_action_type_definition` with typed input and output variables. The action appears in the Flow Designer action picker under the specified category and can be used in any flow or subflow.
349
-
350
- **Generating a scoped application:**
351
- ```
352
- snow ai build "Build a full hardware asset request application with a custom table, approval workflow script include, and email notification business rule"
353
- ```
354
- When the AI determines a scope is appropriate it generates the `sys_app` record and prefixes all custom artifacts automatically. You can also explicitly ask for a scope:
355
- ```
356
- snow ai build "Create a scoped application called 'HR Onboarding' with scope prefix x_myco_hronboard ..."
357
- ```
358
-
359
- #### `snow ai chat`
360
-
361
- Interactive multi-turn session. The AI can ask clarifying questions before generating, and refines artifacts as the conversation continues. The update set is re-saved to disk automatically after each generation.
362
-
363
- ```bash
364
- snow ai chat
365
- snow ai chat --push # auto-push to instance on each generation
366
- snow ai chat --debug # show raw LLM responses
367
- snow ai chat --output ./my-app # save all builds to a specific directory
368
- ```
369
-
370
- **In-session commands:**
371
-
372
- | Command | Action |
373
- |---|---|
374
- | `/status` | Show the current build summary and artifact list |
375
- | `/save` | Write the current XML and manifest to disk |
376
- | `/push` | Push the current build to the active ServiceNow instance |
377
- | `/clear` | Reset the session (clears history and current build) |
378
- | `/exit` | Quit |
379
-
380
- After each generation, the CLI prompts whether to push to the active instance (unless `--push` is set, in which case it pushes automatically).
381
-
382
- **Example session:**
383
- ```
384
- You > I want to build an incident auto-assignment feature
385
-
386
- AI > To generate the right artifacts, a few questions:
387
- 1. Which teams/groups should incidents be routed to?
388
- 2. What fields determine the routing — category, urgency, location, or something else?
389
- 3. Should assignment happen on insert only, or also on update?
390
-
391
- You > Route by category: Network → Network Ops, Software → App Support.
392
- Assignment on insert only.
393
-
394
- AI > [generates script include + business rule]
395
-
396
- + script_include IncidentRouter
397
- + business_rule Auto-Assign Incident on Insert
398
-
399
- ✔ incident-auto-assignment/
400
- incident-auto-assignment.xml
401
- incident-auto-assignment.manifest.json
402
-
403
- Push 2 artifact(s) to dev (https://dev12345.service-now.com)? (y/N)
404
-
405
- You > Also add a UI action button to manually re-trigger the routing
406
-
407
- AI > [generates updated build with all three artifacts]
408
-
409
- + script_include IncidentRouter (unchanged)
410
- + business_rule Auto-Assign ... (unchanged)
411
- + ui_action Re-Route Incident ← new
412
- ```
413
-
414
- #### `snow ai review <path>`
415
-
416
- Review and edit a previously generated build. Opens an interactive artifact selector, shows code with line numbers, and lets you open any field in your editor. Changes are saved back to the XML and manifest immediately. After reviewing, you can optionally push the (modified) build to the active instance.
417
-
418
- ```bash
419
- # Review from a build directory
420
- snow ai review ./incident-auto-assignment/
421
-
422
- # Review from a manifest file
423
- snow ai review ./incident-auto-assignment/incident-auto-assignment.manifest.json
424
-
425
- # Review from the XML file (locates the sibling .manifest.json automatically)
426
- snow ai review ./incident-auto-assignment/incident-auto-assignment.xml
427
- ```
428
-
429
- **Review workflow:**
430
- 1. Select an artifact from the list
431
- 2. The code is printed with line numbers
432
- 3. Confirm to open in your editor (uses `$VISUAL`, `$EDITOR`, or auto-detected editor)
433
- 4. Save and close the editor — changes are written back to disk immediately
434
- 5. Select another artifact or choose **Done reviewing**
435
- 6. Confirm whether to push the build to the active instance
436
-
437
- #### `snow ai push <path>`
438
-
439
- Push a previously generated build to the active instance via Table API.
440
-
441
- ```bash
442
- # Push from a build directory
443
- snow ai push ./incident-auto-assignment/
444
-
445
- # Push from a manifest file
446
- snow ai push ./incident-auto-assignment/incident-auto-assignment.manifest.json
447
- ```
448
-
449
- **Push behaviour per artifact type:**
450
-
451
- | Artifact | Strategy |
452
- |---|---|
453
- | script_include, business_rule, etc. | Looks up by name — creates or updates the single record |
454
- | `table` | Upserts `sys_db_object`; upserts each `sys_dictionary` column by table+element; upserts `sys_choice` rows by table+element+value |
455
- | `decision_table` | Upserts `sys_decision`; deletes and recreates all input columns and rules on update |
456
- | `flow_action` | Upserts `sys_hub_action_type_definition`; deletes and recreates all input/output variables on update |
457
- | Scoped build | Resolves or creates the `sys_app` record first; stamps `sys_scope`/`sys_package` on every record |
458
-
459
- ---
460
-
461
- ## Configuration File
462
-
463
- All settings are stored in `~/.snow/config.json`. The directory is created with mode `0700` and the file with `0600`.
464
-
465
- ```json
466
- {
467
- "activeInstance": "dev",
468
- "instances": {
469
- "dev": {
470
- "alias": "dev",
471
- "url": "https://dev12345.service-now.com",
472
- "auth": {
473
- "type": "basic",
474
- "username": "admin",
475
- "password": "your-password"
476
- }
477
- },
478
- "prod": {
479
- "alias": "prod",
480
- "url": "https://prod.service-now.com",
481
- "auth": {
482
- "type": "oauth",
483
- "clientId": "...",
484
- "clientSecret": "...",
485
- "accessToken": "...",
486
- "tokenExpiry": 1700000000000
487
- }
488
- }
489
- },
490
- "ai": {
491
- "activeProvider": "openai",
492
- "providers": {
493
- "openai": {
494
- "model": "gpt-4o",
495
- "apiKey": "sk-..."
496
- },
497
- "anthropic": {
498
- "model": "claude-opus-4-6",
499
- "apiKey": "sk-ant-..."
500
- },
501
- "xai": {
502
- "model": "grok-3",
503
- "apiKey": "xai-...",
504
- "baseUrl": "https://api.x.ai/v1"
505
- },
506
- "ollama": {
507
- "model": "llama3",
508
- "baseUrl": "http://localhost:11434"
509
- }
510
- }
511
- }
512
- }
513
- ```
514
-
515
- ---
516
-
517
- ## Development
518
-
519
- ```bash
520
- npm run dev # Watch mode — rebuilds on every file change
521
- npm run build # One-time production build to dist/
522
- ```
523
-
524
- The entry point is `src/index.ts`. Commands live in `src/commands/`, shared utilities in `src/lib/`.
525
-
526
- **Project structure:**
527
- ```
528
- src/
529
- index.ts CLI entry point and command registration
530
- commands/
531
- instance.ts snow instance
532
- table.ts snow table
533
- schema.ts snow schema
534
- script.ts snow script
535
- provider.ts snow provider
536
- ai.ts snow ai (build, chat, review, push)
537
- lib/
538
- config.ts Config file read/write + instance and provider helpers
539
- client.ts ServiceNow HTTP client (axios, basic + OAuth auth)
540
- llm.ts LLM provider abstraction (OpenAI, Anthropic, xAI, Ollama)
541
- sn-context.ts ServiceNow system prompt and artifact type definitions
542
- update-set.ts XML update set generation and Table API push
543
- types/
544
- index.ts Shared TypeScript interfaces
545
- ```
1
+ # snow-cli
2
+
3
+ A portable CLI for ServiceNow. Query tables, inspect schemas, edit and search script fields, bulk-update records, manage users and groups, handle attachments, promote update sets across environments, and generate complete applications using AI — all from your terminal.
4
+
5
+ - [Installation](#installation)
6
+ - [Quick Start](#quick-start)
7
+ - [Commands](#commands)
8
+ - [snow instance](#snow-instance)
9
+ - [snow table](#snow-table)
10
+ - [snow schema](#snow-schema)
11
+ - [snow schema map](#snow-schema-map)
12
+ - [snow script](#snow-script)
13
+ - [snow bulk](#snow-bulk)
14
+ - [snow user](#snow-user)
15
+ - [snow attachment](#snow-attachment)
16
+ - [snow updateset](#snow-updateset)
17
+ - [snow status](#snow-status)
18
+ - [snow provider](#snow-provider)
19
+ - [snow ai](#snow-ai)
20
+ - [Configuration File](#configuration-file)
21
+ - [Development](#development)
22
+
23
+ ---
24
+
25
+ ## Installation
26
+
27
+ **From npm (when published):**
28
+ ```bash
29
+ npm install -g snow-cli
30
+ ```
31
+
32
+ **From source:**
33
+ ```bash
34
+ git clone <repo>
35
+ cd snow-cli
36
+ npm install
37
+ npm run build
38
+ npm link
39
+ ```
40
+
41
+ Requires Node.js 18+.
42
+
43
+ ---
44
+
45
+ ## Quick Start
46
+
47
+ ```bash
48
+ # 1. Add a ServiceNow instance
49
+ snow instance add
50
+
51
+ # 2. Query a table
52
+ snow table get incident -q "active=true" -l 10 -f "number,short_description,state,assigned_to"
53
+
54
+ # 3. Bulk-update records matching a query
55
+ snow bulk update incident -q "active=true^priority=1" --set assigned_to=admin --dry-run
56
+
57
+ # 4. Pull a script field, edit it locally, push it back
58
+ snow script pull sys_script_include <sys_id> script
59
+
60
+ # 5. Search for a pattern across all scripts in an app scope
61
+ snow script search x_myapp --contains "GlideRecord('old_table')"
62
+
63
+ # 6. Manage update sets — list, export, and promote to another instance
64
+ snow updateset list
65
+ snow updateset export "Sprint 42" --out ./updatesets
66
+ snow updateset apply ./sprint-42.xml --target prod
67
+
68
+ # 7. Add a user to a group or assign a role
69
+ snow user add-to-group john.doe "Network Support"
70
+ snow user assign-role john.doe itil
71
+
72
+ # 8. Download all attachments from a record
73
+ snow attachment pull incident <sys_id> --all --out ./downloads
74
+
75
+ # 9. Configure an AI provider and generate a feature
76
+ snow provider set openai
77
+ snow ai build "Create a script include that auto-routes incidents by category and urgency"
78
+
79
+ # 10. Start an interactive session to build iteratively
80
+ snow ai chat
81
+ ```
82
+
83
+ ---
84
+
85
+ ## Commands
86
+
87
+ ### `snow instance`
88
+
89
+ Manage ServiceNow instance connections. Credentials are stored in `~/.snow/config.json` (mode `0600`).
90
+
91
+ | Command | Description |
92
+ |---|---|
93
+ | `snow instance add` | Interactively add an instance (prompts for alias, URL, auth type) |
94
+ | `snow instance list` | List all configured instances, showing the active one |
95
+ | `snow instance use <alias>` | Switch the active instance |
96
+ | `snow instance remove <alias>` | Remove an instance |
97
+ | `snow instance test` | Test the active instance connection |
98
+
99
+ **Adding an instance:**
100
+ ```bash
101
+ # Interactive (recommended)
102
+ snow instance add
103
+
104
+ # Basic auth — non-interactive
105
+ snow instance add --alias dev --url https://dev12345.service-now.com --auth basic
106
+
107
+ # OAuth (password grant); prompts for client ID and secret
108
+ snow instance add --alias prod --url https://prod.service-now.com --auth oauth
109
+ ```
110
+
111
+ OAuth access tokens are fetched automatically using the password grant flow and cached in the config file with their expiry time. They are refreshed transparently when they expire.
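The cached token lives next to the OAuth credentials inside the instance entry in `~/.snow/config.json`; a trimmed sketch of one instance entry (shape taken from the Configuration File section, values illustrative):

```json
{
  "prod": {
    "alias": "prod",
    "url": "https://prod.service-now.com",
    "auth": {
      "type": "oauth",
      "clientId": "...",
      "clientSecret": "...",
      "accessToken": "...",
      "tokenExpiry": 1700000000000
    }
  }
}
```

When `tokenExpiry` (a millisecond epoch timestamp) has passed, the CLI re-runs the password grant and overwrites `accessToken` in place.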
112
+
113
+ ---
114
+
115
+ ### `snow table`
116
+
117
+ Table API CRUD operations. Output defaults to a terminal-friendly table; use `--json` for raw JSON.
118
+
119
+ ```bash
120
+ # Query records (table format by default)
121
+ snow table get incident -q "active=true^priority=1" -l 20
122
+ snow table get incident -q "active=true" -f "number,short_description,assigned_to" -l 50
123
+
124
+ # Output as JSON array
125
+ snow table get incident -q "active=true" -l 5 --json
126
+
127
+ # Fetch a single record by sys_id
128
+ snow table fetch incident <sys_id>
129
+ snow table fetch incident <sys_id> -f "number,state,short_description"
130
+
131
+ # Create a record
132
+ snow table create incident -d '{"short_description":"VPN issue","urgency":"2","category":"network"}'
133
+
134
+ # Update a record
135
+ snow table update incident <sys_id> -d '{"state":"6","close_notes":"Resolved"}'
136
+
137
+ # Delete a record (prompts for confirmation)
138
+ snow table delete incident <sys_id>
139
+ snow table delete incident <sys_id> --yes # skip confirmation
140
+ ```
141
+
142
+ **`snow table get` flags:**
143
+
144
+ | Flag | Description |
145
+ |---|---|
146
+ | `-q, --query <sysparm_query>` | ServiceNow encoded query string |
147
+ | `-f, --fields <fields>` | Comma-separated field list to return |
148
+ | `-l, --limit <n>` | Max records (default: `20`) |
149
+ | `-o, --offset <n>` | Pagination offset (default: `0`) |
150
+ | `--display-value` | Return display values instead of raw values |
151
+ | `--json` | Output as a JSON array instead of a table |
152
+
153
+ **Output behaviour:** When a query returns more fields than can fit in the terminal, the CLI automatically switches to a **card layout** (one record per block) rather than trying to squash too many columns. Use `-f` to select specific fields for tabular output.
154
+
155
+ ---
156
+
157
+ ### `snow schema`
158
+
159
+ Inspect field definitions for any table, or generate a full cross-table schema map.
160
+
161
+ #### Field inspection
162
+
163
+ ```bash
164
+ snow schema incident
165
+ snow schema sys_user --filter email # filter fields by name or label
166
+ snow schema cmdb_ci_server --format json # JSON output
167
+ ```
168
+
169
+ Output columns: field name, label, type, max length, and flags (`M` = mandatory, `R` = read-only, `ref=<table>` for reference fields).
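For example, a trimmed listing for `snow schema incident` might look like this (illustrative only; the exact fields and lengths depend on your instance):

```
field              label              type       max  flags
number             Number             string      40
short_description  Short description  string     160  M
caller_id          Caller             reference   32  ref=sys_user
```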
170
+
171
+ #### `snow schema map`
172
+
173
+ Crawl a table's reference and M2M fields to generate a complete relational schema diagram. The crawl follows references to the specified depth, building a graph of all connected tables including their scope information. Outputs Mermaid (`.mmd`) or DBML (`.dbml`) to disk.
174
+
175
+ ```bash
176
+ # Mermaid diagram, depth 2 (default)
177
+ snow schema map incident
178
+
179
+ # DBML format, custom filename, saved to a directory
180
+ snow schema map incident --format dbml --out ./diagrams --name incident-full
181
+
182
+ # Follow 3 levels of references
183
+ snow schema map incident --depth 3
184
+
185
+ # Include glide_list (M2M) relationships
186
+ snow schema map incident --show-m2m
187
+
188
+ # Also crawl tables that reference incident (inbound — keep depth low)
189
+ snow schema map incident --inbound --depth 1
190
+
191
+ # Include choice field enum blocks in DBML output
192
+ snow schema map incident --format dbml --enums
193
+
194
+ # Generate the diagram then have the AI explain the data model
195
+ snow schema map incident --explain
196
+
197
+ # All options combined
198
+ snow schema map x_myco_myapp_request --depth 2 --format dbml --enums --explain --name myapp-schema
199
+ ```
200
+
201
+ **Options:**
202
+
203
+ | Flag | Default | Description |
204
+ |---|---|---|
205
+ | `-d, --depth <n>` | `2` | How many levels of reference fields to follow |
206
+ | `--show-m2m` | off | Include `glide_list` fields as many-to-many relationships |
207
+ | `--format <fmt>` | `mermaid` | Output format: `mermaid` or `dbml` |
208
+ | `--out <dir>` | `.` | Directory to write the output file(s) |
209
+ | `--name <name>` | `<table>-schema` | Base filename; the extension is added automatically |
210
+ | `--inbound` | off | Also crawl tables that have reference fields pointing *to* this table |
211
+ | `--enums` | off | Fetch `sys_choice` values and emit Enum blocks (DBML) or `%%` comments (Mermaid) |
212
+ | `--explain` | off | Use the active AI provider to generate a plain-English explanation of the schema |
213
+
214
+ **Output files:**
215
+
216
+ | Format | File | Open with |
217
+ |---|---|---|
218
+ | Mermaid | `<name>.mmd` | VS Code Mermaid Preview, GitHub, mermaid.live |
219
+ | DBML | `<name>.dbml` | dbdiagram.io, any DBML-compatible tool |
220
+ | Explanation | `<name>.explanation.md` | Any Markdown viewer (`--explain` only) |
221
+
222
+ **How the crawl works:**
223
+
224
+ 1. Fetches all fields for the root table from `sys_dictionary`, plus label and `sys_scope` via `sys_db_object`
225
+ 2. For every `reference`-type field, records the relationship and queues the target table
226
+ 3. Repeats for each discovered table until `--depth` is reached
227
+ 4. With `--show-m2m`: also follows `glide_list` fields, shown as many-to-many edges
228
+ 5. With `--inbound`: additionally queues tables that have reference fields pointing *to* the current table
229
+ 6. Tables referenced by fields that fall outside the crawl depth are rendered as **stub placeholders** (marked `not crawled`) so all references in the diagram resolve without broken links
230
+
231
+ > **Tip:** `--inbound` with depth > 1 can produce very large graphs for highly referenced tables like `sys_user` or `task`. Use `--depth 1` when combining these flags. A warning is also printed when the crawl discovers more than 50 tables.
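The crawl described above is essentially a breadth-first traversal with a depth cutoff. A minimal sketch, using an in-memory stand-in for the `sys_dictionary` reference fields; `crawlSchema` and `RefMap` are illustrative names, not the CLI's actual internals:

```typescript
// Hypothetical stand-in for sys_dictionary reference fields:
// table name -> { fieldName: referencedTable }
type RefMap = Record<string, Record<string, string>>;

interface CrawlResult {
  crawled: Set<string>;              // tables whose fields were fetched
  stubs: Set<string>;                // referenced but beyond --depth
  edges: [string, string, string][]; // [fromTable, field, toTable]
}

function crawlSchema(dict: RefMap, root: string, depth: number): CrawlResult {
  const crawled = new Set<string>();
  const stubs = new Set<string>();
  const edges: [string, string, string][] = [];
  let frontier = [root];

  // Level 0 is the root table; each level follows one hop of references.
  for (let level = 0; level <= depth && frontier.length > 0; level++) {
    const next: string[] = [];
    for (const table of frontier) {
      if (crawled.has(table)) continue;
      crawled.add(table);
      stubs.delete(table); // a stub gets promoted if reached within depth
      for (const [field, target] of Object.entries(dict[table] ?? {})) {
        edges.push([table, field, target]);
        if (!crawled.has(target)) {
          if (level < depth) next.push(target);
          else stubs.add(target); // rendered as a "not crawled" placeholder
        }
      }
    }
    frontier = next;
  }
  return { crawled, stubs, edges };
}
```

With `depth = 2`, tables two hops from the root are still crawled in full, while anything referenced from those tables lands in `stubs`, matching step 6 above.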
232
+
233
+ **Scope annotations:**
234
+
235
+ Each table's application scope is fetched alongside its label. Both output formats include a scope summary header. In DBML, non-Global tables are annotated directly:
236
+
237
+ ```dbml
238
+ // Scopes: Global (12), ITSM (3)
239
+ // Warning: tables from 2 scopes — cross-scope references present
240
+
241
+ Table incident [note: 'Incident | scope: ITSM'] { ... }
242
+ ```
243
+
244
+ In Mermaid, scope information appears as `%%` comments at the top of the file.
245
+
246
+ **Choice enums (`--enums`):**
247
+
248
+ Queries `sys_choice` for every `choice`-type field across all crawled tables. In DBML, each set of choices becomes a named `Enum` block and the field type references it:
249
+
250
+ ```dbml
251
+ Table incident {
252
+ state incident__state [not null]
253
+ }
254
+
255
+ Enum incident__state {
256
+ "1" [note: 'New']
257
+ "2" [note: 'In Progress']
258
+ "6" [note: 'Resolved']
259
+ }
260
+ ```
261
+
262
+ In Mermaid, choice values are emitted as `%%` comments since `erDiagram` has no enum syntax.
263
+
264
+ **AI explanation (`--explain`):**
265
+
266
+ Requires a configured AI provider (`snow provider set`). After writing the schema file, the CLI sends the schema content to the active LLM and saves the Markdown response to `<name>.explanation.md`. The explanation is also printed to the terminal. It covers the business domain, key tables, notable relationships, and any cross-scope dependencies.
267
+
268
+ ```bash
269
+ snow provider set anthropic # configure a provider first
270
+ snow schema map incident --format dbml --explain
271
+ ```
272
+
273
+ **Example output (Mermaid):**
274
+ ```
275
+ %% Scopes: Global (8), ITSM (2)
276
+ incident }o--|| sys_user : "Caller"
277
+ incident }o--|| problem : "Problem"
278
+ incident }o--|| change_request : "RFC"
279
+ sys_user }o--|| cmn_department : "Department"
280
+ ...
281
+ cmn_location { string sys_id } %% stub: referenced but not crawled
282
+ ```
283
+
284
+ **Example output (DBML):**
285
+ ```dbml
286
+ // Scopes: Global (8), ITSM (2)
287
+
288
+ Table incident [note: 'Incident | scope: ITSM'] {
289
+ caller_id varchar(32) [ref: > sys_user.sys_id]
290
+ problem_id varchar(32) [ref: > problem.sys_id]
291
+ state incident__state [not null]
292
+ }
293
+
294
+ // Placeholder tables — referenced but not crawled (increase --depth to explore)
295
+ Table cmn_schedule [note: 'not crawled'] {
296
+ sys_id varchar(32) [pk]
297
+ }
298
+
299
+ // Choice field enums
300
+ Enum incident__state {
301
+ "1" [note: 'New']
302
+ "2" [note: 'In Progress']
303
+ "6" [note: 'Resolved']
304
+ }
305
+ ```
306
+
307
+ **Cardinality notation:**
308
+
309
+ | Relationship | Mermaid | Meaning |
310
+ |---|---|---|
311
+ | Reference field | `}o--\|\|` | Many records → one target |
312
+ | Glide list (M2M) | `}o--o{` | Many records ↔ many targets |
313
+
314
+ ---
315
+
316
+ ### `snow script`
317
+
318
+ Pull a script field to disk, open it in your editor, then push the edited version back — all in one workflow.
319
+
320
+ ```bash
321
+ # Pull → edit → push (interactive)
322
+ snow script pull sys_script_include <sys_id> script
323
+
324
+ # Pull to a specific file path without opening an editor
325
+ snow script pull sys_script_include <sys_id> script --no-open -o ./my-script.js
326
+
327
+ # Push a local file to a script field
328
+ snow script push sys_script_include <sys_id> script ./my-script.js
329
+
330
+ # Push the last-pulled file for a record (no file path needed)
331
+ snow script push sys_script_include <sys_id> script
332
+
333
+ # List locally cached scripts
334
+ snow script list
335
+ ```
336
+
337
+ **Supported field types and file extensions:**
338
+
339
+ | Field type | Extension |
340
+ |---|---|
341
+ | Script (default) | `.js` |
342
+ | HTML | `.html` |
343
+ | CSS | `.css` |
344
+ | XML | `.xml` |
345
+ | JSON | `.json` |
346
+
347
+ **Editor resolution order:**
348
+ 1. `--editor <cmd>` flag
349
+ 2. `$VISUAL` environment variable
350
+ 3. `$EDITOR` environment variable
351
+ 4. First found: `code`, `notepad++`, `notepad` (Windows) / `code`, `nvim`, `vim`, `nano`, `vi` (Unix)
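The fallback chain above can be expressed as a small resolver. A sketch only (`isOnPath` stands in for a real PATH lookup; the CLI's internals may differ):

```javascript
// Resolve which editor to open, following the priority order above (sketch).
function resolveEditor(flag, env, isOnPath, platform = process.platform) {
  if (flag) return flag;                      // 1. --editor <cmd>
  if (env.VISUAL) return env.VISUAL;          // 2. $VISUAL
  if (env.EDITOR) return env.EDITOR;          // 3. $EDITOR
  const candidates = platform === 'win32'
    ? ['code', 'notepad++', 'notepad']
    : ['code', 'nvim', 'vim', 'nano', 'vi'];
  return candidates.find(isOnPath) || null;   // 4. first candidate found on PATH
}
```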
352
+
353
+ Cached scripts are stored in `~/.snow/scripts/`.
354
+
355
+ #### `snow script search`
356
+
357
+ Search for a text pattern across script fields in a given app scope. Searches Script Includes, Business Rules, Client Scripts, UI Actions, UI Pages (HTML, client, and server scripts), and Scheduled Jobs.
358
+
359
+ ```bash
360
+ # Find all scripts containing a string
361
+ snow script search x_myapp --contains "GlideRecord('incident')"
362
+
363
+ # Search only specific tables
364
+ snow script search x_myapp --contains "oldMethod" --tables sys_script_include,sys_script
365
+
366
+ # Use a JavaScript regex
367
+ snow script search x_myapp --contains "gs\.(log|warn)" --regex
368
+ ```
369
+
370
+ Results show the record name, sys_id, and matching lines with line numbers (up to 5 preview lines per record).
371
+
372
+ **Options:**
373
+
374
+ | Flag | Description |
375
+ |---|---|
376
+ | `-c, --contains <pattern>` | Text or regex pattern to search for (required) |
377
+ | `-t, --tables <tables>` | Comma-separated table list (default: all 8 script tables) |
378
+ | `--regex` | Treat `--contains` as a JavaScript regex |
379
+ | `-l, --limit <n>` | Max records per table (default: `500`) |
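The per-record preview can be sketched as a small line scanner that records matching lines with their 1-based line numbers, capped at the preview limit (illustrative only, not the CLI's actual code):

```javascript
// Scan a script body for a pattern; return up to maxPreview matches (sketch).
function matchLines(script, pattern, isRegex, maxPreview = 5) {
  const re = isRegex ? new RegExp(pattern) : null;
  const hits = [];
  script.split('\n').forEach((line, i) => {
    const matched = re ? re.test(line) : line.includes(pattern);
    if (matched && hits.length < maxPreview) {
      hits.push({ line: i + 1, text: line.trim() }); // 1-based line numbers
    }
  });
  return hits;
}
```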
380
+
381
+ #### `snow script replace`
382
+
383
+ Find and replace text across script fields in an app scope. Supports dry-run preview before committing changes.
384
+
385
+ ```bash
386
+ # Preview what would change
387
+ snow script replace x_myapp --find "gs.log" --replace "gs.info" --dry-run
388
+
389
+ # Replace with confirmation prompt
390
+ snow script replace x_myapp --find "gs.log" --replace "gs.info"
391
+
392
+ # Replace with regex and skip confirmation
393
+ snow script replace x_myapp --find "GlideRecord\('old_table'\)" --replace "GlideRecord('new_table')" --regex --yes
394
+
395
+ # Target only specific tables
396
+ snow script replace x_myapp --find "deprecated.util" --replace "NewUtils" --tables sys_script_include
397
+ ```
398
+
399
+ **Options:**
400
+
401
+ | Flag | Description |
402
+ |---|---|
403
+ | `-f, --find <pattern>` | Text to find (required) |
404
+ | `-r, --replace <text>` | Replacement text (required) |
405
+ | `-t, --tables <tables>` | Comma-separated table list (default: all 8 script tables) |
406
+ | `--regex` | Treat `--find` as a JavaScript regex |
407
+ | `-l, --limit <n>` | Max records per table (default: `500`) |
408
+ | `--dry-run` | Show matches without writing any changes |
409
+ | `--yes` | Skip confirmation prompt |
410
+
411
+ ---
412
+
413
+ ### `snow bulk`
414
+
415
+ Bulk update multiple records in one command. Fetches matching records, shows a preview, asks for confirmation, then patches each record.
416
+
417
+ ```bash
418
+ # Preview affected records without making changes
419
+ snow bulk update incident -q "active=true^priority=1" --set state=2 --dry-run
420
+
421
+ # Update with confirmation prompt
422
+ snow bulk update incident -q "active=true^priority=1" --set state=2 --set assigned_to=admin
423
+
424
+ # Skip confirmation (useful in scripts)
425
+ snow bulk update sys_user -q "department=IT^active=true" --set location=NYC --yes
426
+
427
+ # Cap the number of records updated
428
+ snow bulk update incident -q "active=true" --set impact=2 --limit 50
429
+ ```
430
+
431
+ **Options:**
432
+
433
+ | Flag | Description |
434
+ |---|---|
435
+ | `-q, --query <query>` | ServiceNow encoded query to select records (required) |
436
+ | `-s, --set <field=value>` | Field assignment — repeat for multiple fields (required) |
437
+ | `-l, --limit <n>` | Max records to update (default: `200`) |
438
+ | `--dry-run` | Show preview without making changes |
439
+ | `--yes` | Skip confirmation prompt |
440
+
441
+ The preview table shows the sys_id, display name, and new values for every record that will be updated.
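Conceptually, the repeated `--set` flags fold into a single PATCH body. A sketch of that parsing (an assumption about the implementation; values are kept as strings):

```javascript
// Fold repeated --set field=value flags into one PATCH payload (sketch).
function buildPatchBody(sets) {
  const body = {};
  for (const s of sets) {
    const eq = s.indexOf('=');
    if (eq < 1) throw new Error(`invalid --set value: ${s}`);
    body[s.slice(0, eq)] = s.slice(eq + 1); // the value may itself contain '='
  }
  return body;
}
```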
442
+
443
+ ---
444
+
445
+ ### `snow user`
446
+
447
+ Manage group membership and role assignments for ServiceNow users. Users can be specified by `user_name`, email, display name, or sys_id.
448
+
449
+ ```bash
450
+ # Add a user to a group
451
+ snow user add-to-group john.doe "Network Support"
452
+ snow user add-to-group john.doe@example.com "IT Operations" --yes
453
+
454
+ # Remove a user from a group
455
+ snow user remove-from-group john.doe "Network Support"
456
+
457
+ # Assign a role to a user
458
+ snow user assign-role john.doe itil
459
+
460
+ # Remove a role from a user
461
+ snow user remove-role john.doe itil --yes
462
+ ```
463
+
464
+ Each command resolves the user and target, checks for existing membership/role to prevent duplicates, then prompts for confirmation before making any change.
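One way to express the multi-identifier lookup is a single encoded OR query; the exact query the CLI builds is an assumption here, including treating a 32-character hex string as a sys_id:

```javascript
// Build an encoded query that matches a user by any common identifier (sketch).
function userQuery(id) {
  const isSysId = /^[0-9a-f]{32}$/.test(id); // sys_ids are 32 hex chars
  return isSysId
    ? `sys_id=${id}`
    : `user_name=${id}^ORemail=${id}^ORname=${id}`;
}
```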
465
+
466
+ | Command | Description |
467
+ |---|---|
468
+ | `snow user add-to-group <user> <group>` | Add user to a `sys_user_group` |
469
+ | `snow user remove-from-group <user> <group>` | Remove user from a `sys_user_group` |
470
+ | `snow user assign-role <user> <role>` | Grant a `sys_user_role` to a user |
471
+ | `snow user remove-role <user> <role>` | Revoke a `sys_user_role` from a user |
472
+
473
+ All subcommands accept `--yes` to skip the confirmation prompt.
474
+
475
+ ---
476
+
477
+ ### `snow attachment`
478
+
479
+ Download and upload file attachments on ServiceNow records via the Attachment API. Also available as `snow att`.
480
+
481
+ ```bash
482
+ # List attachments on a record
483
+ snow attachment list incident <sys_id>
484
+ snow att ls incident <sys_id>
485
+
486
+ # Download all attachments to a directory
487
+ snow attachment pull incident <sys_id> --all --out ./downloads
488
+
489
+ # Download a specific attachment by file name
490
+ snow attachment pull incident <sys_id> --name report.pdf
491
+
492
+ # Upload a file as an attachment
493
+ snow attachment push incident <sys_id> ./fix-notes.pdf
494
+
495
+ # Override the auto-detected Content-Type
496
+ snow attachment push incident <sys_id> ./data.bin --type application/octet-stream
497
+ ```
498
+
499
+ **`snow attachment pull` options:**
500
+
501
+ | Flag | Description |
502
+ |---|---|
503
+ | `-a, --all` | Download all attachments on the record |
504
+ | `-n, --name <file_name>` | Download a specific attachment by its file name |
505
+ | `-o, --out <dir>` | Output directory (default: current directory) |
506
+
507
+ **`snow attachment push` options:**
508
+
509
+ | Flag | Description |
510
+ |---|---|
511
+ | `-t, --type <content-type>` | Override the Content-Type header (auto-detected from extension by default) |
512
+
513
+ Content-Type is inferred from the file extension for common formats (PDF, PNG, JPG, CSV, XML, JSON, ZIP, DOCX, XLSX, etc.). Defaults to `application/octet-stream` for unknown types.
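The inference amounts to a lookup table with a safe fallback. A sketch (the mapping below is illustrative, not the CLI's full list):

```javascript
// Infer a Content-Type from the file extension, with a fallback (sketch).
const MIME = {
  pdf: 'application/pdf', png: 'image/png', jpg: 'image/jpeg',
  csv: 'text/csv', xml: 'application/xml', json: 'application/json',
  zip: 'application/zip',
};
function contentType(filename) {
  const ext = filename.toLowerCase().split('.').pop();
  return MIME[ext] || 'application/octet-stream'; // unknown types fall back
}
```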
514
+
515
+ ---
516
+
517
+ ### `snow updateset`
518
+
519
+ Manage ServiceNow update sets from the CLI — list, inspect, export, import, and diff. Also available as `snow us`.
520
+
521
+ | Command | Description |
522
+ |---|---|
523
+ | `snow updateset list` | List update sets on the instance |
524
+ | `snow updateset current` | Show the currently active update set |
525
+ | `snow updateset set <name>` | Set the active update set |
526
+ | `snow updateset show <name>` | Show details and all captured items |
527
+ | `snow updateset capture <name> --add <table:sys_id>` | Add specific records to an update set |
528
+ | `snow updateset export <name>` | Download the update set as an XML file |
529
+ | `snow updateset apply <xml-file>` | Upload an XML file to another instance |
530
+ | `snow updateset diff <set1> <set2>` | Compare captured items between two update sets |
531
+
532
+ Names or sys_ids are accepted wherever `<name>` appears.
533
+
534
+ #### `snow updateset list`
535
+
536
+ ```bash
537
+ # All in-progress and complete update sets (default: excludes "ignore")
538
+ snow updateset list
539
+
540
+ # Filter by state
541
+ snow updateset list --state "in progress"
542
+ snow updateset list --state complete --limit 20
543
+ ```
544
+
545
+ **Options:**
546
+
547
+ | Flag | Description |
548
+ |---|---|
549
+ | `-s, --state <state>` | Filter by state: `in progress`, `complete`, `ignore` (default: all except `ignore`) |
550
+ | `-l, --limit <n>` | Max results (default: `50`) |
551
+
552
+ Output columns: **Name**, **State**, **Application** (scope), **Created by**, **Created on**. In-progress sets are highlighted in green; the active set is marked with a ★.
553
+
554
+ #### `snow updateset current`
555
+
556
+ ```bash
557
+ snow updateset current
558
+ ```
559
+
560
+ Shows the update set that is currently active for the authenticated user (read from `sys_user_preference`). Configuration changes made via the REST API are captured into this update set.
561
+
562
+ #### `snow updateset set <name>`
563
+
564
+ ```bash
565
+ snow updateset set "Sprint 42 - Incident fixes"
566
+ snow updateset set a1b2c3d4e5f6... # sys_id also accepted
567
+ ```
568
+
569
+ Stores the selection in `sys_user_preference` so subsequent REST API operations (table updates, script pushes, etc.) are captured into the selected update set.
570
+
571
+ #### `snow updateset show <name>`
572
+
573
+ ```bash
574
+ snow updateset show "Sprint 42 - Incident fixes"
575
+ snow updateset show "Sprint 42 - Incident fixes" --limit 200
576
+ ```
577
+
578
+ Displays update set metadata followed by a table of every captured item (`sys_update_xml`) with type, action, and target name.
579
+
580
+ #### `snow updateset capture <name> --add <table:sys_id>`
581
+
582
+ Force-captures specific records into an update set without modifying them:
583
+
584
+ ```bash
585
+ # Capture a single record
586
+ snow updateset capture "My Update Set" --add sys_script_include:abc123...
587
+
588
+ # Capture multiple records at once
589
+ snow updateset capture "My Update Set" \
590
+ --add sys_script_include:abc123... \
591
+ --add sys_script:def456... \
592
+ --yes
593
+ ```
594
+
595
+ **How it works:** temporarily activates the target update set for the authenticated user, performs a no-op PATCH on each record to trigger ServiceNow's capture mechanism, then restores the previously active update set. The records themselves are not changed.
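That sequence can be sketched as follows (the `client` helper methods are hypothetical, and the exact no-op payload is an implementation detail of the real CLI):

```javascript
// Capture records by briefly switching the user's active update set (sketch).
async function captureRecords(client, targetSetId, records) {
  const previous = await client.getActiveUpdateSet();
  await client.setActiveUpdateSet(targetSetId);
  try {
    for (const { table, sysId } of records) {
      // A PATCH (even one changing nothing) triggers update-set capture
      await client.patch(`/api/now/table/${table}/${sysId}`, {});
    }
  } finally {
    await client.setActiveUpdateSet(previous); // always restore the old set
  }
}
```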
596
+
597
+ #### `snow updateset export <name>`
598
+
599
+ ```bash
600
+ # Export to current directory
601
+ snow updateset export "Sprint 42 - Incident fixes"
602
+
603
+ # Export to a specific directory
604
+ snow updateset export "Sprint 42 - Incident fixes" --out ./updatesets
605
+ ```
606
+
607
+ Calls `/export_update_set.do` and saves the XML to `<safe-name>.xml`. The file can be imported into any ServiceNow instance using `snow updateset apply` or via the ServiceNow UI.
608
+
609
+ #### `snow updateset apply <xml-file>`
610
+
611
+ Import an update set XML into an instance. Creates a **Retrieved Update Set** record that you then load, preview, and commit.
612
+
613
+ ```bash
614
+ # Apply to the active instance
615
+ snow updateset apply ./sprint-42-incident-fixes.xml
616
+
617
+ # Apply to a different instance by alias
618
+ snow updateset apply ./sprint-42-incident-fixes.xml --target prod
619
+
620
+ # Skip confirmation
621
+ snow updateset apply ./sprint-42-incident-fixes.xml --target prod --yes
622
+ ```
623
+
624
+ After uploading, the CLI prints the direct link to the Retrieved Update Set record and instructions for the load → preview → commit steps, which must be completed in the ServiceNow UI (or scripted separately).
625
+
626
+ #### `snow updateset diff <set1> <set2>`
627
+
628
+ Compare the captured items of two update sets side by side:
629
+
630
+ ```bash
631
+ snow updateset diff "Sprint 42" "Sprint 43"
632
+ snow updateset diff a1b2c3... b4c5d6... # sys_ids
633
+ ```
634
+
635
+ Output shows:
636
+ - Items only in the first set (removed in second) — in red
637
+ - Items only in the second set (added in second) — in green
638
+ - Items in both sets, flagged if the action changed (e.g. `INSERT_OR_UPDATE` → `DELETE`) — in yellow
639
+ - A summary line: `3 removed 5 added 1 changed 42 unchanged`
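The comparison can be sketched as two maps keyed by target name (a sketch, not the CLI's actual code; item shape assumed):

```javascript
// Classify captured items of two update sets (sketch).
// Each item: { name, action }, keyed by a unique target name.
function diffSets(a, b) {
  const byName = (items) => new Map(items.map((i) => [i.name, i.action]));
  const A = byName(a), B = byName(b);
  const out = { removed: [], added: [], changed: [], unchanged: 0 };
  for (const [name, action] of A) {
    if (!B.has(name)) out.removed.push(name);           // only in first set
    else if (B.get(name) !== action) out.changed.push(name); // action changed
    else out.unchanged++;
  }
  for (const name of B.keys()) {
    if (!A.has(name)) out.added.push(name);             // only in second set
  }
  return out;
}
```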
640
+
641
+ ---
642
+
643
+ ### `snow status`
644
+
645
+ Print a dashboard-style health and stats overview of the active instance. All sections run in parallel and degrade gracefully to `N/A` when the authenticated user lacks access.
646
+
647
+ ```bash
648
+ snow status
649
+
650
+ # Omit syslog sections (faster for non-admin users, or when syslog is restricted)
651
+ snow status --no-errors
652
+ ```
653
+
654
+ **Sections:**
655
+
656
+ | Section | What it shows |
657
+ |---|---|
658
+ | **Instance** | ServiceNow version (`glide.version`), cluster node count and status |
659
+ | **Users** | Total active user count |
660
+ | **Development** | Custom scoped app count, custom table count (`x_` prefix), in-progress update sets (up to 5, with author) |
661
+ | **Syslog errors** | Error count in the last hour + last 3 error messages with timestamps |
662
+ | **Scheduler errors** | Failed scheduled job count in the last 24h + last 3 messages |
663
+
664
+ **Example output:**
665
+ ```
666
+ ────────────────────────────────────────────────────
667
+ snow-cli · dev (https://dev12345.service-now.com)
668
+ ────────────────────────────────────────────────────
669
+
670
+ Instance
671
+ ────────
672
+ Version Utah Patch 7
673
+ Cluster nodes 3 active / 3 total
674
+
675
+ Users
676
+ ─────
677
+ Active users 1,234
678
+
679
+ Development
680
+ ───────────
681
+ Custom apps 5
682
+ Custom tables 34
683
+ Update sets 2 in progress
684
+ • My Feature Branch admin
685
+ • Hotfix-001 dev.user
686
+
687
+ Syslog errors (last hour)
688
+ ──────────────────────────
689
+ Error count 3
690
+ [10:34:01] Script error in BusinessRule 'Assign P...
691
+ [10:22:45] Invalid GlideRecord field: assigne_to
692
+
693
+ Scheduler errors (last 24h)
694
+ ────────────────────────────
695
+ Failed jobs 0
696
+ ```
697
+
698
+ > **Note:** Version and cluster node stats require admin access to `sys_properties` and `sys_cluster_state`. Syslog sections require read access to the `syslog` table. Sections that can't be read are shown as `N/A` rather than failing the command.
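The parallel, degrade-to-`N/A` behaviour maps naturally onto `Promise.allSettled`. A minimal sketch (section functions are hypothetical stand-ins for the real queries):

```javascript
// Run independent status sections in parallel; any failure degrades to 'N/A' (sketch).
async function gatherSections(sections) {
  const entries = Object.entries(sections);
  const settled = await Promise.allSettled(entries.map(([, fn]) => fn()));
  return Object.fromEntries(entries.map(([name], i) => [
    name,
    settled[i].status === 'fulfilled' ? settled[i].value : 'N/A',
  ]));
}
```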
699
+
700
+ ---
701
+
702
+ ### `snow provider`
703
+
704
+ Configure LLM providers used by `snow ai`. API keys and model preferences are stored in `~/.snow/config.json`.
705
+
706
+ | Command | Description |
707
+ |---|---|
708
+ | `snow provider set <name>` | Add or update a provider (prompts for key and model) |
709
+ | `snow provider list` | List all configured providers |
710
+ | `snow provider use <name>` | Set the active provider |
711
+ | `snow provider show` | Show the active provider details |
712
+ | `snow provider test [name]` | Send a test message to verify connectivity |
713
+ | `snow provider remove <name>` | Remove a provider configuration |
714
+
715
+ **Supported providers:**
716
+
717
+ | Name | Models | Notes |
718
+ |---|---|---|
719
+ | `openai` | `gpt-4o`, `gpt-4-turbo`, `gpt-4o-mini`, … | Requires OpenAI API key |
720
+ | `anthropic` | `claude-opus-4-6`, `claude-sonnet-4-6`, … | Requires Anthropic API key |
721
+ | `xai` | `grok-3`, `grok-2`, … | Requires xAI API key; uses OpenAI-compatible API |
722
+ | `ollama` | `llama3`, `mistral`, `codellama`, … | Local inference; no API key needed |
723
+
724
+ **Setup examples:**
725
+
726
+ ```bash
727
+ # OpenAI (prompts interactively for key and model)
728
+ snow provider set openai
729
+
730
+ # Anthropic
731
+ snow provider set anthropic
732
+
733
+ # xAI / Grok
734
+ snow provider set xai
735
+
736
+ # Ollama (local — no key required)
737
+ snow provider set ollama --model llama3
738
+ snow provider set ollama --model codellama --url http://localhost:11434
739
+
740
+ # Non-interactive with flags
741
+ snow provider set openai --key sk-... --model gpt-4o
742
+
743
+ # Switch between configured providers
744
+ snow provider use anthropic
745
+ snow provider use openai
746
+
747
+ # Verify a provider is working
748
+ snow provider test
749
+ snow provider test anthropic
750
+ ```
751
+
752
+ ---
753
+
754
+ ### `snow ai`
755
+
756
+ Generate ServiceNow applications using an LLM. The AI produces structured artifacts that are exported as an importable **update set XML** file and optionally pushed directly to your instance via the Table API. When a request warrants it, the AI automatically creates a **scoped application** to namespace the artifacts.
757
+
758
+ #### Supported artifact types
759
+
760
+ | Type | ServiceNow table(s) | Description |
761
+ |---|---|---|
762
+ | `script_include` | `sys_script_include` | Reusable server-side JavaScript class |
763
+ | `business_rule` | `sys_script` | Auto-triggers on table insert/update/delete/query |
764
+ | `client_script` | `sys_client_script` | Browser-side form script (onLoad, onChange, onSubmit) |
765
+ | `ui_action` | `sys_ui_action` | Form button or context menu item |
766
+ | `ui_page` | `sys_ui_page` | Custom HTML page with Jelly templating |
767
+ | `scheduled_job` | `sys_trigger` | Script that runs on a schedule |
768
+ | `table` | `sys_db_object` + `sys_dictionary` + `sys_choice` | Custom database table with typed columns and choice lists |
769
+ | `decision_table` | `sys_decision` + children | Ordered rule set mapping input conditions to an output value |
770
+ | `flow_action` | `sys_hub_action_type_definition` + children | Reusable custom action for Flow Designer |
771
+
772
+ #### Scoped applications
773
+
774
+ The AI automatically determines whether a build warrants its own application scope. Scope is added when:
775
+ - The request creates two or more custom tables
776
+ - The request is described as an "application" or "module"
777
+ - A distinct namespace is needed to avoid naming conflicts
778
+
779
+ When a scope is generated, the build includes a `sys_app` record and every artifact is stamped with `sys_scope`/`sys_package`. All custom table names and script include API names are automatically prefixed (e.g. `x_myco_myapp_tablename`). The scope is shown in the build summary:
780
+
781
+ ```
782
+ My Incident App
783
+ scope: x_myco_incapp v1.0.0
784
+
785
+ + table x_myco_incapp_request
786
+ + script_include x_myco_incapp.RequestUtils
787
+ + business_rule Auto-Assign on Insert
788
+ ```
789
+
790
+ For Table API push, the CLI resolves or creates the scoped application on the target instance before pushing artifacts.
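The prefixing implied by the build summary above can be sketched as a small naming rule (rules inferred from the examples, not the CLI's exact logic):

```javascript
// Apply the scope prefix to custom artifact names (sketch).
function scopedName(scope, type, name) {
  if (type === 'table') return `${scope}_${name}`;          // x_..._tablename
  if (type === 'script_include') return `${scope}.${name}`; // x_....ClassName
  return name; // other artifact types keep their display name
}
```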
791
+
792
+ #### `snow ai build <prompt>`
793
+
794
+ Generate a ServiceNow application from a text description. The AI may ask clarifying questions before generating — answer them interactively and the build proceeds once the AI has enough information. Saves output to a directory named after the update set.
795
+
796
+ ```bash
797
+ snow ai build "Create a script include that routes incidents based on category and urgency"
798
+
799
+ snow ai build "Add a before business rule on the incident table that sets priority = urgency + impact when a record is inserted"
800
+
801
+ # Save to a custom directory
802
+ snow ai build "Create a client script that hides the CI field when category is Software" --output my-feature
803
+
804
+ # Push artifacts directly to the active instance after generating
805
+ snow ai build "Create a scheduled job that closes resolved incidents older than 30 days" --push
806
+
807
+ # Use a specific provider for this request only
808
+ snow ai build "..." --provider anthropic
809
+
810
+ # Debug mode — prints raw LLM response and full error traces
811
+ snow ai build "..." --debug
812
+ ```
813
+
814
+ **Interactive clarification:** If your prompt is vague or the AI needs more detail, it will ask targeted questions before generating. You answer in the terminal and the conversation continues until the AI produces the final build.
815
+
816
+ ```
817
+ AI > To generate the right artifacts, a few questions:
818
+ 1. Which table should the business rule operate on?
819
+ 2. Should it fire on insert, update, or both?
820
+
821
+ You > incident table, on insert only
822
+
823
+ [AI generates the build]
824
+ ```
825
+
826
+ **Post-build flow:** After generating, the CLI presents two optional prompts:
827
+
828
+ 1. **Review artifacts** — opens an interactive selector to inspect and edit any code field in your preferred editor before deploying
829
+ 2. **Push to instance** — confirm to deploy directly, or decline to deploy later with `snow ai push`
830
+
831
+ **Output structure:**
832
+ ```
833
+ incident-priority-router/
834
+ incident-priority-router.xml ← importable update set XML
835
+ incident-priority-router.manifest.json ← artifact manifest for snow ai push/review
836
+ ```
837
+
838
+ **Importing the update set in ServiceNow:**
839
+ 1. Navigate to **System Update Sets → Retrieved Update Sets**
840
+ 2. Click **Import Update Set from XML**
841
+ 3. Upload the `.xml` file
842
+ 4. Click **Load Update Set**, then **Preview**, then **Commit**
843
+
844
+ **Generating tables:**
845
+ ```
846
+ snow ai build "Create a custom table for tracking hardware asset requests with fields for requester, asset type, urgency, and status"
847
+ ```
848
+ The AI generates `sys_db_object` + `sys_dictionary` entries for each column, plus `sys_choice` records for any choice fields. If the table extends `task`, inherited fields (number, state, assigned_to, etc.) are not re-declared.
849
+
850
+ **Generating decision tables:**
851
+ ```
852
+ snow ai build "Create a decision table that maps urgency and impact values to a priority level"
853
+ ```
854
+ Produces a `sys_decision` record with input columns, ordered rules, and per-rule conditions. Decision tables are evaluated top-down — the first matching rule wins.
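Top-down, first-match evaluation can be sketched like this (the rule shape is illustrative):

```javascript
// Evaluate a decision table: the first matching rule wins (sketch).
function decide(rules, inputs) {
  for (const rule of rules) { // rules are pre-sorted by their order field
    const matches = Object.entries(rule.when)
      .every(([field, value]) => inputs[field] === value);
    if (matches) return rule.result;
  }
  return null; // no rule matched
}
```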
855
+
856
+ **Generating flow actions:**
857
+ ```
858
+ snow ai build "Create a Flow Designer action that creates an incident from an email subject and body and returns the sys_id"
859
+ ```
860
+ Produces a `sys_hub_action_type_definition` with typed input and output variables. The action appears in the Flow Designer action picker under the specified category and can be used in any flow or subflow.
861
+
862
+ **Generating a scoped application:**
863
+ ```
864
+ snow ai build "Build a full hardware asset request application with a custom table, approval workflow script include, and email notification business rule"
865
+ ```
866
+ When the AI determines a scope is appropriate it generates the `sys_app` record and prefixes all custom artifacts automatically. You can also explicitly ask for a scope:
867
+ ```
868
+ snow ai build "Create a scoped application called 'HR Onboarding' with scope prefix x_myco_hronboard ..."
869
+ ```
870
+
871
+ #### `snow ai chat`
872
+
873
+ Interactive multi-turn session. The AI can ask clarifying questions before generating, and refines artifacts as the conversation continues. The update set is re-saved to disk automatically after each generation.
874
+
875
+ ```bash
876
+ snow ai chat
877
+ snow ai chat --push # auto-push to instance on each generation
878
+ snow ai chat --debug # show raw LLM responses
879
+ snow ai chat --output ./my-app # save all builds to a specific directory
880
+ ```
881
+
882
+ **In-session commands:**
883
+
884
+ | Command | Action |
885
+ |---|---|
886
+ | `/status` | Show the current build summary and artifact list |
887
+ | `/save` | Write the current XML and manifest to disk |
888
+ | `/push` | Push the current build to the active ServiceNow instance |
889
+ | `/clear` | Reset the session (clears history and current build) |
890
+ | `/exit` | Quit |
891
+
892
+ After each generation, the CLI prompts whether to push to the active instance (unless `--push` is set, in which case it pushes automatically).
893
+
894
+ **Example session:**
895
+ ```
896
+ You > I want to build an incident auto-assignment feature
897
+
898
+ AI > To generate the right artifacts, a few questions:
899
+ 1. Which teams/groups should incidents be routed to?
900
+ 2. What fields determine the routing — category, urgency, location, or something else?
901
+ 3. Should assignment happen on insert only, or also on update?
902
+
903
+ You > Route by category: Network → Network Ops, Software → App Support.
904
+ Assignment on insert only.
905
+
906
+ AI > [generates script include + business rule]
907
+
908
+ + script_include IncidentRouter
909
+ + business_rule Auto-Assign Incident on Insert
910
+
911
+ ✔ incident-auto-assignment/
912
+ incident-auto-assignment.xml
913
+ incident-auto-assignment.manifest.json
914
+
915
+ Push 2 artifact(s) to dev (https://dev12345.service-now.com)? (y/N)
916
+
917
+ You > Also add a UI action button to manually re-trigger the routing
918
+
919
+ AI > [generates updated build with all three artifacts]
920
+
921
+ + script_include IncidentRouter (unchanged)
922
+ + business_rule Auto-Assign ... (unchanged)
923
+ + ui_action Re-Route Incident ← new
924
+ ```
925
+
926
+ #### `snow ai review <path>`
927
+
928
+ Review and edit a previously generated build. Opens an interactive artifact selector, shows code with line numbers, and lets you open any field in your editor. Changes are saved back to the XML and manifest immediately. After reviewing, you can optionally push the (modified) build to the active instance.
929
+
930
+ ```bash
931
+ # Review from a build directory
932
+ snow ai review ./incident-auto-assignment/
933
+
934
+ # Review from a manifest file
935
+ snow ai review ./incident-auto-assignment/incident-auto-assignment.manifest.json
936
+
937
+ # Review from the XML file (locates the sibling .manifest.json automatically)
938
+ snow ai review ./incident-auto-assignment/incident-auto-assignment.xml
939
+ ```
940
+
941
+ **Review workflow:**
942
+ 1. Select an artifact from the list
943
+ 2. The code is printed with line numbers
944
+ 3. Confirm to open in your editor (uses `$VISUAL`, `$EDITOR`, or auto-detected editor)
945
+ 4. Save and close the editor — changes are written back to disk immediately
946
+ 5. Select another artifact or choose **Done reviewing**
947
+ 6. Confirm whether to push the build to the active instance
948
+
949
+ #### `snow ai push <path>`
950
+
951
+ Push a previously generated build to the active instance via Table API.
952
+
953
+ ```bash
954
+ # Push from a build directory
955
+ snow ai push ./incident-auto-assignment/
956
+
957
+ # Push from a manifest file
958
+ snow ai push ./incident-auto-assignment/incident-auto-assignment.manifest.json
959
+ ```
960
+
961
+ **Push behaviour per artifact type:**
962
+
963
+ | Artifact | Strategy |
964
+ |---|---|
965
+ | script_include, business_rule, etc. | Looks up by name — creates or updates the single record |
966
+ | `table` | Upserts `sys_db_object`; upserts each `sys_dictionary` column by table+element; upserts `sys_choice` rows by table+element+value |
967
+ | `decision_table` | Upserts `sys_decision`; deletes and recreates all input columns and rules on update |
968
+ | `flow_action` | Upserts `sys_hub_action_type_definition`; deletes and recreates all input/output variables on update |
969
+ | Scoped build | Resolves or creates the `sys_app` record first; stamps `sys_scope`/`sys_package` on every record |
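The lookup-by-name strategy for single-record artifacts can be sketched as a create-or-update helper (the `client` methods are hypothetical stand-ins for Table API calls):

```javascript
// Upsert a record by name: update when it exists, create otherwise (sketch).
async function upsertByName(client, table, record) {
  const existing = await client.query(table, `name=${record.name}`);
  if (existing.length > 0) {
    await client.patch(table, existing[0].sys_id, record); // update in place
    return 'updated';
  }
  await client.post(table, record); // no match: create a new record
  return 'created';
}
```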
970
+
971
+ ---
972
+
973
+ ## Configuration File
974
+
975
+ All settings are stored in `~/.snow/config.json`. The directory is created with mode `0700` and the file with `0600`.
976
+
977
+ ```json
978
+ {
979
+ "activeInstance": "dev",
980
+ "instances": {
981
+ "dev": {
982
+ "alias": "dev",
983
+ "url": "https://dev12345.service-now.com",
984
+ "auth": {
985
+ "type": "basic",
986
+ "username": "admin",
987
+ "password": "your-password"
988
+ }
989
+ },
990
+ "prod": {
991
+ "alias": "prod",
992
+ "url": "https://prod.service-now.com",
993
+ "auth": {
994
+ "type": "oauth",
995
+ "clientId": "...",
996
+ "clientSecret": "...",
997
+ "accessToken": "...",
998
+ "tokenExpiry": 1700000000000
999
+ }
1000
+ }
1001
+ },
1002
+ "ai": {
1003
+ "activeProvider": "openai",
1004
+ "providers": {
1005
+ "openai": {
1006
+ "model": "gpt-4o",
1007
+ "apiKey": "sk-..."
1008
+ },
1009
+ "anthropic": {
1010
+ "model": "claude-opus-4-6",
1011
+ "apiKey": "sk-ant-..."
1012
+ },
1013
+ "xai": {
1014
+ "model": "grok-3",
1015
+ "apiKey": "xai-...",
1016
+ "baseUrl": "https://api.x.ai/v1"
1017
+ },
1018
+ "ollama": {
1019
+ "model": "llama3",
1020
+ "baseUrl": "http://localhost:11434"
1021
+ }
1022
+ }
1023
+ }
1024
+ }
1025
+ ```
1026
+
1027
+ ---
1028
+
1029
+ ## Development
1030
+
1031
+ ```bash
1032
+ npm run dev # Watch mode — rebuilds on every file change
1033
+ npm run build # One-time production build to dist/
1034
+ ```
1035
+
1036
+ The entry point is `src/index.ts`. Commands live in `src/commands/`, shared utilities in `src/lib/`.
1037
+
1038
+ **Project structure:**
1039
+ ```
1040
+ src/
1041
+ index.ts CLI entry point and command registration
1042
+ commands/
1043
+ instance.ts snow instance
1044
+ table.ts snow table
1045
+ schema.ts snow schema
1046
+ script.ts snow script (pull/push/list/search/replace)
1047
+ bulk.ts snow bulk (update)
1048
+ user.ts snow user (add-to-group/remove-from-group/assign-role/remove-role)
1049
+ attachment.ts snow attachment (list/pull/push)
1050
+ updateset.ts snow updateset (list/current/set/show/capture/export/apply/diff)
1051
+ status.ts snow status
1052
+ provider.ts snow provider
1053
+ ai.ts snow ai (build, chat, review, push)
1054
+ lib/
1055
+ config.ts Config file read/write + instance and provider helpers
1056
+ client.ts ServiceNow HTTP client (axios, basic + OAuth auth)
1057
+ llm.ts LLM provider abstraction (OpenAI, Anthropic, xAI, Ollama)
1058
+ sn-context.ts ServiceNow system prompt and artifact type definitions
1059
+ update-set.ts XML update set generation and Table API push
1060
+ types/
1061
+ index.ts Shared TypeScript interfaces
1062
+ ```