npcsh 1.0.17__tar.gz → 1.0.18__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -1,6 +1,6 @@
1
1
  Metadata-Version: 2.4
2
2
  Name: npcsh
3
- Version: 1.0.17
3
+ Version: 1.0.18
4
4
  Summary: npcsh is a command-line toolkit for using AI agents in novel ways.
5
5
  Home-page: https://github.com/NPC-Worldwide/npcsh
6
6
  Author: Christopher Agostino
@@ -43,6 +43,7 @@ Requires-Dist: redis
43
43
  Requires-Dist: psycopg2-binary
44
44
  Requires-Dist: flask_sse
45
45
  Requires-Dist: wikipedia
46
+ Requires-Dist: mcp
46
47
  Provides-Extra: lite
47
48
  Requires-Dist: anthropic; extra == "lite"
48
49
  Requires-Dist: openai; extra == "lite"
@@ -65,8 +66,6 @@ Requires-Dist: playsound==1.2.2; extra == "yap"
65
66
  Requires-Dist: pygame; extra == "yap"
66
67
  Requires-Dist: faster_whisper; extra == "yap"
67
68
  Requires-Dist: pyttsx3; extra == "yap"
68
- Provides-Extra: mcp
69
- Requires-Dist: mcp; extra == "mcp"
70
69
  Provides-Extra: all
71
70
  Requires-Dist: anthropic; extra == "all"
72
71
  Requires-Dist: openai; extra == "all"
@@ -87,7 +86,6 @@ Requires-Dist: playsound==1.2.2; extra == "all"
87
86
  Requires-Dist: pygame; extra == "all"
88
87
  Requires-Dist: faster_whisper; extra == "all"
89
88
  Requires-Dist: pyttsx3; extra == "all"
90
- Requires-Dist: mcp; extra == "all"
91
89
  Dynamic: author
92
90
  Dynamic: author-email
93
91
  Dynamic: classifier
@@ -107,21 +105,25 @@ Dynamic: summary
107
105
 
108
106
  # NPC Shell
109
107
 
110
- The NPC shell is a suite of executable command-line programs that allow users to easily interact with NPCs and LLMs through a command line shell.
111
-
112
- Programs within the NPC shell use the properties defined in `~/.npcshrc`, which is generated upon installation and running of `npcsh` for the first time.
108
+ The NPC shell is the toolkit for tomorrow, providing a suite of programs for using multi-modal LLMs and agents in novel interactive modes. `npcsh` is a command-line program, so it can be used wherever you work. `npcsh` is developed to work reliably with small models and performs excellently with state-of-the-art models from major providers.
113
109
 
114
110
  To get started:
115
- ```
111
+ ```bash
112
+ # for users who want to mainly use models through APIs (e.g. gemini, grok, deepseek, anthropic, openai, mistral, or any others provided by litellm):
113
+ pip install 'npcsh[lite]'
114
+ # for users who want to use local models (this installs the diffusers/transformers/torch stack, so it is big):
116
115
  pip install 'npcsh[local]'
116
+ # for users who want to use the voice mode `yap`; see also the OS-specific installation instructions for the required system audio libraries
117
+ pip install 'npcsh[yap]'
117
118
  ```
118
- Once installed, the following CLI tools will be available: `npcsh`, `guac`, `npc` cli, `yap` `pti`, `wander`, and `spool`.
119
-
120
-
121
- # npcsh
122
- - An AI-powered shell that parses bash, natural language, and special macro calls, `npcsh` processes your input accordingly, agentically, and automatically.
119
+ Once installed, run
120
+ ```bash
121
+ npcsh
122
+ ```
123
+ and you will enter the NPC shell. Additionally, the pip installation makes the following CLI tools available in bash: `corca`, `guac`, the `npc` CLI, `pti`, `spool`, `wander`, and `yap`.
123
124
 
124
125
 
126
+ # Usage
125
127
  - Get help with a task:
126
128
  ```bash
127
129
  npcsh:🤖sibiji:gemini-2.5-flash>can you help me identify what process is listening on port 5337?
@@ -139,18 +141,7 @@ Once installed, the following CLI tools will be available: `npcsh`, `guac`, `npc
139
141
  ```bash
140
142
  npcsh> has there ever been a better pasta shape than bucatini?
141
143
  ```
142
- ```
143
- Ultimately, the "best" pasta shape depends on personal preference and the dish being prepared. Bucatini shines in specific contexts, but pasta lovers often appreciate a diverse range of shapes for their unique qualities and compatibilities with various sauces and ingredients. Each shape has its place in the culinary world, and trying different types can enhance the overall dining experience.
144
- ```
145
-
146
- ```
147
- .Loaded .env file...
148
- Initializing database schema...
149
- Database schema initialization complete.
150
- Processing prompt: 'has there ever been a better pasta shape than bucatini?' with NPC: 'sibiji'...
151
- • Action chosen: answer_question
152
- • Explanation given: The question is a general opinion-based inquiry about pasta shapes and can be answered without external data or jinx invocation.
153
- ...............................................................................
144
+ ```
154
145
  Bucatini is certainly a favorite for many due to its unique hollow center, which holds sauces beautifully. Whether it's "better" is subjective and depends on the dish and personal
155
146
  preference. Shapes like orecchiette, rigatoni, or trofie excel in different recipes. Bucatini stands out for its versatility and texture, making it a top contender among pasta shapes!
156
147
  ```
@@ -193,38 +184,67 @@ Once installed, the following CLI tools will be available: `npcsh`, `guac`, `npc
193
184
  ```bash
194
185
  /ots
195
186
  ```
187
+ - **Use an MCP server**: make use of NPCs with MCP servers.
188
+ ```bash
189
+ /corca --mcp-server-path /path/to/server.py
190
+ ```
191
+
192
+
193
+ # NPC Data Layer
194
+
195
+ The core of npcsh's capabilities is powered by the NPC Data Layer. Upon initialization, a user will be prompted to make a team in the current directory or to use a global team stored in `~/.npcsh/`, which houses the NPC team with its jinxs, models, contexts, and assembly lines. By implementing these components as simple data structures, users can focus on tweaking the relevant parts of their multi-agent systems.
196
196
 
197
+ ## Creating Custom Components
198
+
199
+ Users can extend NPC capabilities through simple YAML files:
200
+
201
+ - **NPCs** (.npc): are defined with a name, primary directive, and optional model specifications
202
+ - **Jinxs** (.jinx): specify function-like capabilities with preprocessing, execution, and postprocessing steps
203
+ - **Context** (.ctx): specify contextual information, team preferences, MCP server paths, database connections, and other environment variables that are loaded for the team. Teams are specified by their path and the team name in the `<team>.ctx` file. Teams organize collections of NPCs with shared context and specify a coordinator within the team context.
204
+
205
+ The NPC Shell system integrates the capabilities of `npcpy` to maintain conversation history, track command execution, and provide intelligent autocomplete through an extensible command routing system. State is preserved between sessions, allowing for continuous knowledge building over time.
206
+
207
+ This architecture enables complex AI workflows while maintaining a simple, declarative syntax that abstracts away implementation complexity. By organizing AI capabilities as composable data structures rather than code, npcsh creates a more accessible and adaptable framework for AI automation.
197
208
 
198
209
  # Macros
199
210
  - Macros are activated by invoking `/<command> ...` in `npcsh`; they can also be called from bash through the `npc` CLI. In our examples, we provide both `npcsh` calls and bash calls with the `npc` CLI where relevant. To convert any `/<command>` in `npcsh` to its bash form, replace the `/` with `npc ` and the macro will be invoked as a positional argument.
200
211
 
201
- - `/alicanto` - Conduct deep research with multiple perspectives, identifying gold insights and cliff warnings
202
- - `/brainblast` - Execute an advanced chunked search on command history
203
- - `/breathe` - Condense context on a regular cadence
204
- - `/compile` - Compile NPC profiles
205
- - `/corca` - Enter the Corca MCP-powered agentic shell. Usage: /corca [--mcp-server-path path]
206
- - `/flush` - Flush the last N messages
207
- - `/guac` - Enter guac mode
208
- - `/help` - Show help for commands, NPCs, or Jinxs. Usage: /help
209
- - `/init` - Initialize NPC project
210
- - `/jinxs` - Show available jinxs for the current NPC/Team
211
- - `/npc-studio` - Start npc studio
212
- - `/ots` - Take screenshot and analyze with vision model
213
- - `/plan` - Execute a plan command
214
- - `/plonk` - Use vision model to interact with GUI. Usage: /plonk <task description>
215
- - `/pti` - Use pardon-the-interruption mode to interact with reasoning model LLM
216
- - `/rag` - Execute a RAG command using ChromaDB embeddings with optional file input (-f/--file)
217
- - `/roll` - generate a video with video generation model
218
- - `/sample` - Send a prompt directly to the LLM
219
- - `/search` - Execute a web search command
220
- - `/serve` - Serve an NPC Team server.
221
- - `/set` - Set configuration values
222
- - `/sleep` - Evolve knowledge graph with options for dreaming.
223
- - `/spool` - Enter interactive chat (spool) mode with an npc with fresh context or files for rag
224
- - `/trigger` - Execute a trigger command
225
- - `/vixynt` - Generate and edit images from text descriptions using local models, openai, gemini
226
- - `/wander` - A method for LLMs to think on a problem by switching between states of high temperature and low temperature
227
- - `/yap` - Enter voice chat (yap) mode
212
+ - `/alicanto` - Conduct deep research with multiple perspectives, identifying gold insights and cliff warnings. Usage: `/alicanto 'query to be researched' --num-npcs <int> --depth <int>`
213
+ - `/brainblast` - Execute an advanced chunked search on command history. Usage: `/brainblast 'query' --top_k 10`
214
+ - `/breathe` - Condense context on a regular cadence. Usage: `/breathe -p <provider: NPCSH_CHAT_PROVIDER> -m <model: NPCSH_CHAT_MODEL>`
215
+ - `/compile` - Compile NPC profiles. Usage: `/compile <path_to_npc>`
216
+ - `/corca` - Enter the Corca MCP-powered agentic shell. Usage: `/corca [--mcp-server-path path]`
217
+ - `/flush` - Flush the last N messages. Usage: `/flush N=10`
218
+ - `/guac` - Enter guac mode. Usage: `/guac`
219
+ - `/help` - Show help for commands, NPCs, or Jinxs. Usage: `/help`
220
+ - `/init` - Initialize NPC project. Usage: `/init`
221
+ - `/jinxs` - Show available jinxs for the current NPC/Team. Usage: `/jinxs`
222
+ - `/<jinx_name>` - Run a jinx with specified command line arguments. `/<jinx_name> jinx_arg1 jinx_arg2`
223
+ - `/npc-studio` - Start npc studio. Pulls the NPC Studio GitHub repository to `~/.npcsh/npc-studio` and launches it in development mode after installing the necessary NPM dependencies. Usage: `/npc-studio`
224
+ - `/ots` - Take screenshot and analyze with vision model. Usage: `/ots filename=<output_file_name_for_screenshot>` then select an area, and you will be prompted for your request.
225
+ - `/plan` - Execute a plan command. Usage: `/plan 'idea for a cron job to be set up to accomplish'`
226
+ - `/plonk` - Use vision model to interact with GUI. Usage: `/plonk '<task description>'`
227
+ - `/pti` - Use pardon-the-interruption mode to interact with reasoning model LLM. Usage: `/pti`
228
+ - `/rag` - Execute a RAG command using ChromaDB embeddings with optional file input (-f/--file). Usage: `/rag '<query_to_rag>' --emodel <NPCSH_EMBEDDING_MODEL> --eprovider <NPCSH_EMBEDDING_PROVIDER>`
229
+ - `/roll` - generate a video with video generation model. Usage: `/roll '<description_for_a_movie>' --vgmodel <NPCSH_VIDEO_GEN_MODEL> --vgprovider <NPCSH_VIDEO_GEN_PROVIDER>`
230
+ - `/sample` - Send a context-free prompt to an LLM, letting you get fresh answers without needing to start a separate conversation/shell. Usage: `/sample -m <NPCSH_CHAT_MODEL> 'question to sample' --temp <float> --top_k <int>`
231
+ - `/search` - Execute a web search command. Usage: `/search 'search query' --sprovider <provider>`, where the provider is currently limited to DuckDuckGo and Perplexity; Wikipedia integration is ongoing.
232
+ - `/serve` - Serve an NPC Team server.
233
+ - `/set` - Set configuration values.
234
+ - Usage:
235
+ - `/set model gemma3:4b`,
236
+ - `/set provider ollama`,
237
+ - `/set NPCSH_VIDEO_GEN_PROVIDER diffusers`
238
+ - `/sleep` - Evolve knowledge graph with options for dreaming. Usage: `/sleep --ops link_facts,deepen`
239
+ - `/spool` - Enter interactive chat (spool) mode with an npc, with fresh context or files for RAG. Usage: `/spool --attachments 'path1,path2,path3' -n <npc_name> -m <model> -p <provider>`
240
+ - `/trigger` - Execute a trigger command. Usage: `/trigger 'a description of a trigger to implement with system daemons/file system listeners.' -m gemma3:27b -p ollama`
241
+ - `/vixynt` - Generate and edit images from text descriptions using local models, openai, gemini.
242
+ - Usage:
243
+ - Gen Image: `/vixynt -igp <NPCSH_IMAGE_GEN_PROVIDER> --igmodel <NPCSH_IMAGE_GEN_MODEL> --output_file <path_to_file> width=<int:1024> height=<int:1024> 'description of image'`
244
+ - Edit Image: `/vixynt 'edit this....' --attachments '/path/to/image.png,/path/to/image.jpeg'`
245
+
246
+ - `/wander` - A method for LLMs to think on a problem by switching between states of high temperature and low temperature. Usage: `/wander 'query to wander about' --provider "ollama" --model "deepseek-r1:32b" environment="a vast dark ocean" interruption-likelihood=.1`
247
+ - `/yap` - Enter voice chat (yap) mode. Usage: `/yap -n <npc_to_chat_with>`
228
248
 
229
249
  ## Common Command-Line Flags:
230
250
 
@@ -372,23 +392,38 @@ python-tkinter (pyautogui)
372
392
  </details>
373
393
 
374
394
  ## Startup Configuration and Project Structure
375
- After `npcsh` has been pip installed, `npcsh`, `corca`, `guac`, `pti`, `spool`, `yap` and the `npc` CLI can be used as command line tools. To initialize these correctly, first start by starting the NPC shell:
395
+ To initialize the NPC shell environment parameters correctly, first start the NPC shell:
376
396
  ```bash
377
397
  npcsh
378
398
  ```
379
- When initialized, `npcsh` will generate a .npcshrc file in your home directory that stores your npcsh settings.
380
- Here is an example of what the .npcshrc file might look like after this has been run.
399
+ When initialized, `npcsh` will generate a `.npcshrc` file in your home directory that stores your npcsh settings.
400
+ Here is an example of what the `.npcshrc` file might look like after this has been run.
381
401
  ```bash
382
402
  # NPCSH Configuration File
383
403
  export NPCSH_INITIALIZED=1
384
- export NPCSH_CHAT_PROVIDER='ollama'
385
- export NPCSH_CHAT_MODEL='llama3.2'
386
404
  export NPCSH_DB_PATH='~/npcsh_history.db'
405
+ export NPCSH_CHAT_MODEL=gemma3:4b
406
+ export NPCSH_CHAT_PROVIDER=ollama
407
+ export NPCSH_DEFAULT_MODE=agent
408
+ export NPCSH_EMBEDDING_MODEL=nomic-embed-text
409
+ export NPCSH_EMBEDDING_PROVIDER=ollama
410
+ export NPCSH_IMAGE_GEN_MODEL=gpt-image-1
411
+ export NPCSH_IMAGE_GEN_PROVIDER=openai
412
+ export NPCSH_INITIALIZED=1
413
+ export NPCSH_REASONING_MODEL=deepseek-r1
414
+ export NPCSH_REASONING_PROVIDER=deepseek
415
+ export NPCSH_SEARCH_PROVIDER=duckduckgo
416
+ export NPCSH_STREAM_OUTPUT=1
417
+ export NPCSH_VECTOR_DB_PATH=~/npcsh_chroma.db
418
+ export NPCSH_VIDEO_GEN_MODEL=runwayml/stable-diffusion-v1-5
419
+ export NPCSH_VIDEO_GEN_PROVIDER=diffusers
420
+ export NPCSH_VISION_MODEL=gpt-4o-mini
421
+ export NPCSH_VISION_PROVIDER=openai
387
422
  ```
388
423
 
389
- `npcsh` also comes with a set of jinxs and NPCs that are used in processing. It will generate a folder at ~/.npcsh/ that contains the tools and NPCs that are used in the shell and these will be used in the absence of other project-specific ones. Additionally, `npcsh` records interactions and compiled information about npcs within a local SQLite database at the path specified in the .npcshrc file. This will default to ~/npcsh_history.db if not specified. When the data mode is used to load or analyze data in CSVs or PDFs, these data will be stored in the same database for future reference.
424
+ `npcsh` also comes with a set of jinxs and NPCs that are used in processing. It will generate a folder at `~/.npcsh/` that contains the tools and NPCs used in the shell; these will be used in the absence of project-specific ones. Additionally, `npcsh` records interactions and compiled information about npcs within a local SQLite database at the path specified in the `.npcshrc` file. This defaults to `~/npcsh_history.db` if not specified. When the data mode is used to load or analyze data in CSVs or PDFs, these data will be stored in the same database for future reference.
390
425
 
391
- The installer will automatically add this file to your shell config, but if it does not do so successfully for whatever reason you can add the following to your .bashrc or .zshrc:
426
+ The installer will automatically add this file to your shell config, but if it does not do so successfully for whatever reason you can add the following to your `.bashrc` or `.zshrc`:
392
427
 
393
428
  ```bash
394
429
  # Source NPCSH configuration
@@ -399,7 +434,7 @@ fi
399
434
 
400
435
  We support inference via all providers supported by litellm. For openai-compatible providers that are not explicitly named in litellm, use simply `openai-like` as the provider. The default provider must be one of `['openai','anthropic','ollama', 'gemini', 'deepseek', 'openai-like']` and the model must be one available from those providers.
401
436
 
402
- To use tools that require API keys, create an `.env` file in the folder where you are working or place relevant API keys as env variables in your ~/.npcshrc. If you already have these API keys set in a ~/.bashrc or a ~/.zshrc or similar files, you need not additionally add them to ~/.npcshrc or to an `.env` file. Here is an example of what an `.env` file might look like:
437
+ To use tools that require API keys, create an `.env` file in the folder where you are working or place relevant API keys as env variables in your `~/.npcshrc`. If you already have these API keys set in a `~/.bashrc` or a `~/.zshrc` or similar files, you need not additionally add them to `~/.npcshrc` or to an `.env` file. Here is an example of what an `.env` file might look like:
403
438
 
404
439
  ```bash
405
440
  export OPENAI_API_KEY="your_openai_key"
@@ -411,14 +446,15 @@ export PERPLEXITY_API_KEY='your_perplexity_key'
411
446
 
412
447
 
413
448
  Individual npcs can also be set to use different models and providers by setting the `model` and `provider` keys in the npc files.
414
- Once initialized and set up, you will find the following in your ~/.npcsh directory:
449
+
450
+ Once initialized and set up, you will find the following in your `~/.npcsh` directory:
415
451
  ```bash
416
452
  ~/.npcsh/
417
453
  ├── npc_team/ # Global NPCs
418
454
  │ ├── jinxs/ # Global tools
419
455
  │ └── assembly_lines/ # Workflow pipelines
420
- │ └── example.npc # globally available npc
421
- │ └── global.ctx # global context file
456
+ │ └── sibiji.npc # globally available npc
457
+ │ └── npcsh.ctx # global context file
422
458
 
423
459
 
424
460
 
@@ -5,21 +5,25 @@
5
5
 
6
6
  # NPC Shell
7
7
 
8
- The NPC shell is a suite of executable command-line programs that allow users to easily interact with NPCs and LLMs through a command line shell.
9
-
10
- Programs within the NPC shell use the properties defined in `~/.npcshrc`, which is generated upon installation and running of `npcsh` for the first time.
8
+ The NPC shell is the toolkit for tomorrow, providing a suite of programs to make use of multi-modal LLMs and agents in novel interactive modes. `npcsh` is a command-line program, and so can be used wherever you work. `npcsh` is developed to work relibly with small models and performs excellently with the state-of-the-art models from major model providers.
11
9
 
12
10
  To get started:
13
- ```
11
+ ```bash
12
+ # for users who want to mainly use models through APIs (e.g. , gemini, grok, deepseek, anthropic, openai, mistral, , any others provided by litellm ):
13
+ pip install 'npcsh[lite]'
14
+ # for users who want to use local models (these install diffusers/transformers/torch stack so it is big.):
14
15
  pip install 'npcsh[local]'
16
+ # for users who want to use the voice mode `yap`, see also the OS-specific installation instructions for installing needed system audio libraries
17
+ pip install 'npcsh[yap]'
15
18
  ```
16
- Once installed, the following CLI tools will be available: `npcsh`, `guac`, `npc` cli, `yap` `pti`, `wander`, and `spool`.
17
-
18
-
19
- # npcsh
20
- - An AI-powered shell that parses bash, natural language, and special macro calls, `npcsh` processes your input accordingly, agentically, and automatically.
19
+ Once installed: run
20
+ ```bash
21
+ npcsh
22
+ ```
23
+ and you will enter the NPC shell. Additionally, the pip installation includes making the following CLI tools available in bash: `corca`, `guac`, `npc` cli, `pti`, `spool`, `wander`, and`yap`.
21
24
 
22
25
 
26
+ # Usage
23
27
  - Get help with a task:
24
28
  ```bash
25
29
  npcsh:🤖sibiji:gemini-2.5-flash>can you help me identify what process is listening on port 5337?
@@ -37,18 +41,7 @@ Once installed, the following CLI tools will be available: `npcsh`, `guac`, `npc
37
41
  ```bash
38
42
  npcsh> has there ever been a better pasta shape than bucatini?
39
43
  ```
40
- ```
41
- Ultimately, the "best" pasta shape depends on personal preference and the dish being prepared. Bucatini shines in specific contexts, but pasta lovers often appreciate a diverse range of shapes for their unique qualities and compatibilities with various sauces and ingredients. Each shape has its place in the culinary world, and trying different types can enhance the overall dining experience.
42
- ```
43
-
44
- ```
45
- .Loaded .env file...
46
- Initializing database schema...
47
- Database schema initialization complete.
48
- Processing prompt: 'has there ever been a better pasta shape than bucatini?' with NPC: 'sibiji'...
49
- • Action chosen: answer_question
50
- • Explanation given: The question is a general opinion-based inquiry about pasta shapes and can be answered without external data or jinx invocation.
51
- ...............................................................................
44
+ ```
52
45
  Bucatini is certainly a favorite for many due to its unique hollow center, which holds sauces beautifully. Whether it's "better" is subjective and depends on the dish and personal
53
46
  preference. Shapes like orecchiette, rigatoni, or trofie excel in different recipes. Bucatini stands out for its versatility and texture, making it a top contender among pasta shapes!
54
47
  ```
@@ -91,38 +84,67 @@ Once installed, the following CLI tools will be available: `npcsh`, `guac`, `npc
91
84
  ```bash
92
85
  /ots
93
86
  ```
87
+ - **Use an mcp server**: make use of NPCs with MCP servers.
88
+ ```bash
89
+ /corca --mcp-server-path /path.to.server.py
90
+ ```
91
+
92
+
93
+ # NPC Data Layer
94
+
95
+ The core of npcsh's capabilities is powered by the NPC Data Layer. Upon initialization, a user will be prompted to make a team in the current directory or to use a global team stored in `~/.npcsh/` which houses the NPC team with its jinxs, models, contexts, assembly lines. By implementing these components as simple data structures, users can focus on tweaking the relevant parts of their multi-agent systems.
94
96
 
97
+ ## Creating Custom Components
98
+
99
+ Users can extend NPC capabilities through simple YAML files:
100
+
101
+ - **NPCs** (.npc): are defined with a name, primary directive, and optional model specifications
102
+ - **Jinxs** (.jinx): specify function-like capabilities with preprocessing, execution, and postprocessing steps
103
+ - **Context** (.ctx): Specify contextual information, team preferences, MCP server paths, database connections, and other environment variables that are loaded for the team. Teams are specified by their path and the team name in the `<team>.ctx` file. Teams organize collections of NPCs with shared context and specify a coordinator within the team context
104
+
105
+ The NPC Shell system integrates the capabilities of `npcpy` to maintain conversation history, track command execution, and provide intelligent autocomplete through an extensible command routing system. State is preserved between sessions, allowing for continuous knowledge building over time.
106
+
107
+ This architecture enables complex AI workflows while maintaining a simple, declarative syntax that abstracts away implementation complexity. By organizing AI capabilities as composable data structures rather than code, npcsh creates a more accessible and adaptable framework for AI automation.
95
108
 
96
109
  # Macros
97
110
  - activated by invoking `/<command> ...` in `npcsh`, macros can be called in bash or through the `npc` CLI. In our examples, we provide both `npcsh` calls as well as bash calls with the `npc` cli where relevant. For converting any `/<command>` in `npcsh` to a bash version, replace the `/` with `npc ` and the macro command will be invoked as a positional argument. Some, like breathe, flush,
98
111
 
99
- - `/alicanto` - Conduct deep research with multiple perspectives, identifying gold insights and cliff warnings
100
- - `/brainblast` - Execute an advanced chunked search on command history
101
- - `/breathe` - Condense context on a regular cadence
102
- - `/compile` - Compile NPC profiles
103
- - `/corca` - Enter the Corca MCP-powered agentic shell. Usage: /corca [--mcp-server-path path]
104
- - `/flush` - Flush the last N messages
105
- - `/guac` - Enter guac mode
106
- - `/help` - Show help for commands, NPCs, or Jinxs. Usage: /help
107
- - `/init` - Initialize NPC project
108
- - `/jinxs` - Show available jinxs for the current NPC/Team
109
- - `/npc-studio` - Start npc studio
110
- - `/ots` - Take screenshot and analyze with vision model
111
- - `/plan` - Execute a plan command
112
- - `/plonk` - Use vision model to interact with GUI. Usage: /plonk <task description>
113
- - `/pti` - Use pardon-the-interruption mode to interact with reasoning model LLM
114
- - `/rag` - Execute a RAG command using ChromaDB embeddings with optional file input (-f/--file)
115
- - `/roll` - generate a video with video generation model
116
- - `/sample` - Send a prompt directly to the LLM
117
- - `/search` - Execute a web search command
118
- - `/serve` - Serve an NPC Team server.
119
- - `/set` - Set configuration values
120
- - `/sleep` - Evolve knowledge graph with options for dreaming.
121
- - `/spool` - Enter interactive chat (spool) mode with an npc with fresh context or files for rag
122
- - `/trigger` - Execute a trigger command
123
- - `/vixynt` - Generate and edit images from text descriptions using local models, openai, gemini
124
- - `/wander` - A method for LLMs to think on a problem by switching between states of high temperature and low temperature
125
- - `/yap` - Enter voice chat (yap) mode
112
+ - `/alicanto` - Conduct deep research with multiple perspectives, identifying gold insights and cliff warnings. Usage: `/alicanto 'query to be researched' --num-npcs <int> --depth <int>`
113
+ - `/brainblast` - Execute an advanced chunked search on command history. Usage: `/brainblast 'query' --top_k 10`
114
+ - `/breathe` - Condense context on a regular cadence. Usage: `/breathe -p <provider: NPCSH_CHAT_PROVIDER> -m <model: NPCSH_CHAT_MODEL>`
115
+ - `/compile` - Compile NPC profiles. Usage: `/compile <path_to_npc> `
116
+ - `/corca` - Enter the Corca MCP-powered agentic shell. Usage: `/corca [--mcp-server-path path]`
117
+ - `/flush` - Flush the last N messages. Usage: `/flush N=10`
118
+ - `/guac` - Enter guac mode. Usage: `/guac`
119
+ - `/help` - Show help for commands, NPCs, or Jinxs. Usage: `/help`
120
+ - `/init` - Initialize NPC project. Usage: `/init`
121
+ - `/jinxs` - Show available jinxs for the current NPC/Team. Usage: `/jinxs`
122
+ - `/<jinx_name>` - Run a jinx with specified command line arguments. `/<jinx_name> jinx_arg1 jinx_arg2`
123
+ - `/npc-studio` - Start npc studio. Pulls NPC Studio github to `~/.npcsh/npc-studio` and launches it in development mode after installing necessary NPM dependencies.Usage: `/npc-studio`
124
+ - `/ots` - Take screenshot and analyze with vision model. Usage: `/ots filename=<output_file_name_for_screenshot>` then select an area, and you will be prompted for your request.
125
+ - `/plan` - Execute a plan command. Usage: `/plan 'idea for a cron job to be set up to accomplish'`
126
+ - `/plonk` - Use vision model to interact with GUI. Usage: `/plonk '<task description>' `
127
+ - `/pti` - Use pardon-the-interruption mode to interact with reasoning model LLM. Usage: `/pti`
128
+ - `/rag` - Execute a RAG command using ChromaDB embeddings with optional file input (-f/--file). Usage: `/rag '<query_to_rag>' --emodel <NPCSH_EMBEDDING_MODEL> --eprovider <NPCSH_EMBEDDING_PROVIDER>`
129
+ - `/roll` - generate a video with video generation model. Usage: `/roll '<description_for_a_movie>' --vgmodel <NPCSH_VIDEO_GEN_MODEL> --vgprovider <NPCSH_VIDEO_GEN_PROVIDER>`
130
+ - `/sample` - Send a context-free prompt to an LLM, letting you get fresh answers without needing to start a separate conversation/shell. Usage: `/sample -m <NPCSH_CHAT_MODEL> 'question to sample --temp <float> --top_k int`
131
+ - `/search` - Execute a web search command. Usage: `/search 'search query' --sprovider <provider>` where provider is currently limited to DuckDuckGo and Perplexity. Wikipedia integration ongoing.
132
+ - `/serve` - Serve an NPC Team server.
133
+ - `/set` - Set configuration values.
134
+ - Usage:
135
+ - `/set model gemma3:4b`,
136
+ - `/set provider ollama`,
137
+ - `/set NPCSH_VIDEO_GEN_PROVIDER diffusers`
138
+ - `/sleep` - Evolve knowledge graph with options for dreaming. Usage: `/sleep --ops link_facts,deepen`
139
+ - `/spool` - Enter interactive chat (spool) mode with an npc with fresh context or files for rag. Usage: `/spool --attachments 'path1,path2,path3' -n <npc_name> -m <modell> -p <provider>`
140
+ - `/trigger` - Execute a trigger command. Usage: `/trigger 'a description of a trigger to implement with system daemons/file system listeners.' -m gemma3:27b -p ollama`
141
+ - `/vixynt` - Generate and edit images from text descriptions using local models, openai, gemini.
142
+ - Usage:
143
+ - Gen Image: `/vixynt -igp <NPCSH_IMAGE_GEN_PROVIDER> --igmodel <NPCSH_IMAGE_GEN_MODEL> --output_file <path_to_file> width=<int:1024> height =<int:1024> 'description of image`
144
+ - Edit Image: `/vixynt 'edit this....' --attachments '/path/to/image.png,/path/to/image.jpeg'`
145
+
146
+ - `/wander` - A method for LLMs to think on a problem by switching between states of high temperature and low temperature. Usage: `/wander 'query to wander about' --provider "ollama" --model "deepseek-r1:32b" environment="a vast dark ocean" interruption-likelihood=.1`
147
+ - `/yap` - Enter voice chat (yap) mode. Usage: `/yap -n <npc_to_chat_with>`
126
148
 
127
149
  ## Common Command-Line Flags:
128
150
 
@@ -270,23 +292,38 @@ python-tkinter (pyautogui)
270
292
  </details>
271
293
 
272
294
  ## Startup Configuration and Project Structure
273
- After `npcsh` has been pip installed, `npcsh`, `corca`, `guac`, `pti`, `spool`, `yap` and the `npc` CLI can be used as command line tools. To initialize these correctly, first start by starting the NPC shell:
295
+ To initialize the NPC shell environment parameters correctly, first start the NPC shell:
274
296
  ```bash
275
297
  npcsh
276
298
  ```
277
- When initialized, `npcsh` will generate a .npcshrc file in your home directory that stores your npcsh settings.
278
- Here is an example of what the .npcshrc file might look like after this has been run.
299
+ When initialized, `npcsh` will generate a `.npcshrc` file in your home directory that stores your npcsh settings.
300
+ Here is an example of what the `.npcshrc` file might look like after this has been run.
279
301
  ```bash
280
302
  # NPCSH Configuration File
281
303
  export NPCSH_INITIALIZED=1
282
- export NPCSH_CHAT_PROVIDER='ollama'
283
- export NPCSH_CHAT_MODEL='llama3.2'
284
304
  export NPCSH_DB_PATH='~/npcsh_history.db'
305
+ export NPCSH_CHAT_MODEL=gemma3:4b
306
+ export NPCSH_CHAT_PROVIDER=ollama
307
+ export NPCSH_DEFAULT_MODE=agent
308
+ export NPCSH_EMBEDDING_MODEL=nomic-embed-text
309
+ export NPCSH_EMBEDDING_PROVIDER=ollama
310
+ export NPCSH_IMAGE_GEN_MODEL=gpt-image-1
311
+ export NPCSH_IMAGE_GEN_PROVIDER=openai
312
+ export NPCSH_INITIALIZED=1
313
+ export NPCSH_REASONING_MODEL=deepseek-r1
314
+ export NPCSH_REASONING_PROVIDER=deepseek
315
+ export NPCSH_SEARCH_PROVIDER=duckduckgo
316
+ export NPCSH_STREAM_OUTPUT=1
317
+ export NPCSH_VECTOR_DB_PATH=~/npcsh_chroma.db
318
+ export NPCSH_VIDEO_GEN_MODEL=runwayml/stable-diffusion-v1-5
319
+ export NPCSH_VIDEO_GEN_PROVIDER=diffusers
320
+ export NPCSH_VISION_MODEL=gpt-4o-mini
321
+ export NPCSH_VISION_PROVIDER=openai
285
322
  ```
286
323
 
287
- `npcsh` also comes with a set of jinxs and NPCs that are used in processing. It will generate a folder at ~/.npcsh/ that contains the tools and NPCs that are used in the shell and these will be used in the absence of other project-specific ones. Additionally, `npcsh` records interactions and compiled information about npcs within a local SQLite database at the path specified in the .npcshrc file. This will default to ~/npcsh_history.db if not specified. When the data mode is used to load or analyze data in CSVs or PDFs, these data will be stored in the same database for future reference.
324
+ `npcsh` also comes with a set of jinxs and NPCs that are used in processing. It will generate a folder at `~/.npcsh/` that contains the tools and NPCs that are used in the shell and these will be used in the absence of other project-specific ones. Additionally, `npcsh` records interactions and compiled information about npcs within a local SQLite database at the path specified in the `.npcshrc` file. This will default to `~/npcsh_history.db` if not specified. When the data mode is used to load or analyze data in CSVs or PDFs, these data will be stored in the same database for future reference.
288
325
 
289
- The installer will automatically add this file to your shell config, but if it does not do so successfully for whatever reason you can add the following to your .bashrc or .zshrc:
326
+ The installer will automatically add this file to your shell config, but if it does not do so successfully for whatever reason you can add the following to your `.bashrc` or `.zshrc`:
290
327
 
291
328
  ```bash
292
329
  # Source NPCSH configuration
@@ -297,7 +334,7 @@ fi
297
334
 
298
335
  We support inference via all providers supported by litellm. For openai-compatible providers that are not explicitly named in litellm, use simply `openai-like` as the provider. The default provider must be one of `['openai','anthropic','ollama', 'gemini', 'deepseek', 'openai-like']` and the model must be one available from those providers.
299
336
 
300
- To use tools that require API keys, create an `.env` file in the folder where you are working or place relevant API keys as env variables in your ~/.npcshrc. If you already have these API keys set in a ~/.bashrc or a ~/.zshrc or similar files, you need not additionally add them to ~/.npcshrc or to an `.env` file. Here is an example of what an `.env` file might look like:
337
+ To use tools that require API keys, create an `.env` file in the folder where you are working or place relevant API keys as env variables in your `~/.npcshrc`. If you already have these API keys set in a `~/.bashrc` or a `~/.zshrc` or similar files, you need not additionally add them to `~/.npcshrc` or to an `.env` file. Here is an example of what an `.env` file might look like:
301
338
 
302
339
  ```bash
303
340
  export OPENAI_API_KEY="your_openai_key"
@@ -309,14 +346,15 @@ export PERPLEXITY_API_KEY='your_perplexity_key'
309
346
 
310
347
 
311
348
  Individual npcs can also be set to use different models and providers by setting the `model` and `provider` keys in the npc files.
312
- Once initialized and set up, you will find the following in your ~/.npcsh directory:
349
+
350
+ Once initialized and set up, you will find the following in your `~/.npcsh` directory:
313
351
  ```bash
314
352
  ~/.npcsh/
315
353
  ├── npc_team/ # Global NPCs
316
354
  │ ├── jinxs/ # Global tools
317
355
  │ └── assembly_lines/ # Workflow pipelines
318
- │ └── example.npc # globally available npc
319
- │ └── global.ctx # global context file
356
+ │ └── sibiji.npc # globally available npc
357
+ │ └── npcsh.ctx # global context file
320
358
 
321
359
 
322
360
 
@@ -1860,7 +1860,8 @@ def should_skip_kg_processing(user_input: str, assistant_output: str) -> bool:
1860
1860
  def execute_slash_command(command: str,
1861
1861
  stdin_input: Optional[str],
1862
1862
  state: ShellState,
1863
- stream: bool, router) -> Tuple[ShellState, Any]:
1863
+ stream: bool,
1864
+ router) -> Tuple[ShellState, Any]:
1864
1865
  """Executes slash commands using the router or checking NPC/Team jinxs."""
1865
1866
  all_command_parts = shlex.split(command)
1866
1867
  command_name = all_command_parts[0].lstrip('/')
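The hunk above splits the raw slash command with `shlex` before routing. A minimal, illustrative sketch of that tokenization step (not the actual npcsh implementation) might look like:

```python
# Illustrative sketch of the slash-command tokenization shown above:
# shlex honors shell-style quoting, and the leading '/' is stripped
# from the first token to recover the route name.
import shlex

def parse_slash_command(command: str):
    parts = shlex.split(command)
    name = parts[0].lstrip('/')
    return name, parts[1:]

name, args = parse_slash_command("/rag 'what is bucatini' -f notes.txt")
```

Because `shlex.split` respects quoting, a quoted query survives as a single argument while flags remain separate tokens.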
@@ -2034,7 +2035,11 @@ def process_pipeline_command(
2034
2035
  exec_provider = provider_override or npc_provider or state.chat_provider
2035
2036
 
2036
2037
  if cmd_to_process.startswith("/"):
2037
- return execute_slash_command(cmd_to_process, stdin_input, state, stream_final, router)
2038
+ return execute_slash_command(cmd_to_process,
2039
+ stdin_input,
2040
+ state,
2041
+ stream_final,
2042
+ router)
2038
2043
 
2039
2044
  cmd_parts = parse_command_safely(cmd_to_process)
2040
2045
  if not cmd_parts:
@@ -2198,7 +2203,7 @@ def execute_command(
2198
2203
  for i, cmd_segment in enumerate(commands):
2199
2204
  render_markdown(f'- executing command {i+1}/{len(commands)}')
2200
2205
  is_last_command = (i == len(commands) - 1)
2201
- stream_this_segment = state.stream_output and not is_last_command
2206
+ stream_this_segment = state.stream_output and is_last_command
2202
2207
  try:
2203
2208
  current_state, output = process_pipeline_command(
2204
2209
  cmd_segment.strip(),
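The corrected flag above means that in a multi-segment pipeline only the final segment streams its output. A hedged sketch of that selection logic, under the assumption that segments are indexed in execution order:

```python
# Sketch of the corrected streaming behavior above: when several commands
# are chained, only the last segment streams (and only if streaming is
# enabled), so earlier segments hand complete output to the next stage.
def stream_flags(num_commands: int, stream_output: bool = True):
    return [stream_output and (i == num_commands - 1)
            for i in range(num_commands)]

flags = stream_flags(3)
```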
@@ -612,9 +612,7 @@ def brainblast_handler(command: str, **kwargs):
612
612
  return {"output": f"Error executing brainblast command: {e}", "messages": messages}
613
613
 
614
614
  @router.route("rag", "Execute a RAG command using ChromaDB embeddings with optional file input (-f/--file)")
615
- def rag_handler(command: str, **kwargs):
616
- messages = safe_get(kwargs, "messages", [])
617
-
615
+ def rag_handler(command: str, **kwargs):
618
616
  parts = shlex.split(command)
619
617
  user_command = []
620
618
  file_paths = []
@@ -640,7 +638,7 @@ def rag_handler(command: str, **kwargs):
640
638
  embedding_provider = safe_get(kwargs, "eprovider", NPCSH_EMBEDDING_PROVIDER)
641
639
 
642
640
  if not user_command and not file_paths:
643
- return {"output": "Usage: /rag [-f file_path] <query>", "messages": messages}
641
+ return {"output": "Usage: /rag [-f file_path] <query>", "messages": kwargs.get('messages', [])}
644
642
 
645
643
  try:
646
644
  # Process files if provided
@@ -652,9 +650,7 @@ def rag_handler(command: str, **kwargs):
652
650
  file_contents.extend([f"[{file_name}] {chunk}" for chunk in chunks])
653
651
  except Exception as file_err:
654
652
  file_contents.append(f"Error processing file {file_path}: {str(file_err)}")
655
-
656
- # Execute the RAG command
657
- return execute_rag_command(
653
+ exe_rag = execute_rag_command(
658
654
  command=user_command,
659
655
  vector_db_path=vector_db_path,
660
656
  embedding_model=embedding_model,
@@ -662,10 +658,11 @@ def rag_handler(command: str, **kwargs):
662
658
  file_contents=file_contents if file_paths else None,
663
659
  **kwargs
664
660
  )
661
+ return {'output':exe_rag.get('response'), 'messages': exe_rag.get('messages', kwargs.get('messages', []))}
665
662
 
666
663
  except Exception as e:
667
664
  traceback.print_exc()
668
- return {"output": f"Error executing RAG command: {e}", "messages": messages}
665
+ return {"output": f"Error executing RAG command: {e}", "messages": kwargs.get('messages', [])}
669
666
  @router.route("roll", "generate a video")
670
667
  def roll_handler(command: str, **kwargs):
671
668
  messages = safe_get(kwargs, "messages", [])
@@ -1,6 +1,6 @@
1
1
  Metadata-Version: 2.4
2
2
  Name: npcsh
3
- Version: 1.0.17
3
+ Version: 1.0.18
4
4
  Summary: npcsh is a command-line toolkit for using AI agents in novel ways.
5
5
  Home-page: https://github.com/NPC-Worldwide/npcsh
6
6
  Author: Christopher Agostino
@@ -43,6 +43,7 @@ Requires-Dist: redis
43
43
  Requires-Dist: psycopg2-binary
44
44
  Requires-Dist: flask_sse
45
45
  Requires-Dist: wikipedia
46
+ Requires-Dist: mcp
46
47
  Provides-Extra: lite
47
48
  Requires-Dist: anthropic; extra == "lite"
48
49
  Requires-Dist: openai; extra == "lite"
@@ -65,8 +66,6 @@ Requires-Dist: playsound==1.2.2; extra == "yap"
65
66
  Requires-Dist: pygame; extra == "yap"
66
67
  Requires-Dist: faster_whisper; extra == "yap"
67
68
  Requires-Dist: pyttsx3; extra == "yap"
68
- Provides-Extra: mcp
69
- Requires-Dist: mcp; extra == "mcp"
70
69
  Provides-Extra: all
71
70
  Requires-Dist: anthropic; extra == "all"
72
71
  Requires-Dist: openai; extra == "all"
@@ -87,7 +86,6 @@ Requires-Dist: playsound==1.2.2; extra == "all"
87
86
  Requires-Dist: pygame; extra == "all"
88
87
  Requires-Dist: faster_whisper; extra == "all"
89
88
  Requires-Dist: pyttsx3; extra == "all"
90
- Requires-Dist: mcp; extra == "all"
91
89
  Dynamic: author
92
90
  Dynamic: author-email
93
91
  Dynamic: classifier
@@ -107,21 +105,25 @@ Dynamic: summary
107
105
 
108
106
  # NPC Shell
109
107
 
110
- The NPC shell is a suite of executable command-line programs that allow users to easily interact with NPCs and LLMs through a command line shell.
111
-
112
- Programs within the NPC shell use the properties defined in `~/.npcshrc`, which is generated upon installation and running of `npcsh` for the first time.
108
+ The NPC shell is the toolkit for tomorrow, providing a suite of programs to make use of multi-modal LLMs and agents in novel interactive modes. `npcsh` is a command-line program, and so can be used wherever you work. `npcsh` is developed to work reliably with small models and performs excellently with the state-of-the-art models from major model providers.
113
109
 
114
110
  To get started:
115
- ```
111
+ ```bash
112
+ # for users who want to mainly use models through APIs (e.g. gemini, grok, deepseek, anthropic, openai, mistral, or any others provided by litellm):
113
+ pip install 'npcsh[lite]'
114
+ # for users who want to use local models (this installs the diffusers/transformers/torch stack, so the download is large):
116
115
  pip install 'npcsh[local]'
116
+ # for users who want to use the voice mode `yap`; see also the OS-specific installation instructions for the needed system audio libraries
117
+ pip install 'npcsh[yap]'
117
118
  ```
118
- Once installed, the following CLI tools will be available: `npcsh`, `guac`, `npc` cli, `yap` `pti`, `wander`, and `spool`.
119
-
120
-
121
- # npcsh
122
- - An AI-powered shell that parses bash, natural language, and special macro calls, `npcsh` processes your input accordingly, agentically, and automatically.
119
+ Once installed, run
120
+ ```bash
121
+ npcsh
122
+ ```
123
+ and you will enter the NPC shell. Additionally, the pip installation makes the following CLI tools available in bash: `corca`, `guac`, the `npc` CLI, `pti`, `spool`, `wander`, and `yap`.
123
124
 
124
125
 
126
+ # Usage
125
127
  - Get help with a task:
126
128
  ```bash
127
129
  npcsh:🤖sibiji:gemini-2.5-flash>can you help me identify what process is listening on port 5337?
@@ -139,18 +141,7 @@ Once installed, the following CLI tools will be available: `npcsh`, `guac`, `npc
139
141
  ```bash
140
142
  npcsh> has there ever been a better pasta shape than bucatini?
141
143
  ```
142
- ```
143
- Ultimately, the "best" pasta shape depends on personal preference and the dish being prepared. Bucatini shines in specific contexts, but pasta lovers often appreciate a diverse range of shapes for their unique qualities and compatibilities with various sauces and ingredients. Each shape has its place in the culinary world, and trying different types can enhance the overall dining experience.
144
- ```
145
-
146
- ```
147
- .Loaded .env file...
148
- Initializing database schema...
149
- Database schema initialization complete.
150
- Processing prompt: 'has there ever been a better pasta shape than bucatini?' with NPC: 'sibiji'...
151
- • Action chosen: answer_question
152
- • Explanation given: The question is a general opinion-based inquiry about pasta shapes and can be answered without external data or jinx invocation.
153
- ...............................................................................
144
+ ```
154
145
  Bucatini is certainly a favorite for many due to its unique hollow center, which holds sauces beautifully. Whether it's "better" is subjective and depends on the dish and personal
155
146
  preference. Shapes like orecchiette, rigatoni, or trofie excel in different recipes. Bucatini stands out for its versatility and texture, making it a top contender among pasta shapes!
156
147
  ```
@@ -193,38 +184,67 @@ Once installed, the following CLI tools will be available: `npcsh`, `guac`, `npc
193
184
  ```bash
194
185
  /ots
195
186
  ```
187
+ - **Use an MCP server**: make use of NPCs with MCP servers.
188
+ ```bash
189
+ /corca --mcp-server-path /path/to/server.py
190
+ ```
191
+
192
+
193
+ # NPC Data Layer
194
+
195
+ The core of npcsh's capabilities is powered by the NPC Data Layer. Upon initialization, a user will be prompted to make a team in the current directory or to use a global team stored in `~/.npcsh/`, which houses the NPC team with its jinxs, models, contexts, and assembly lines. By implementing these components as simple data structures, users can focus on tweaking the relevant parts of their multi-agent systems.
196
196
 
197
+ ## Creating Custom Components
198
+
199
+ Users can extend NPC capabilities through simple YAML files:
200
+
201
+ - **NPCs** (`.npc`): defined with a name, primary directive, and optional model specifications
202
+ - **Jinxs** (`.jinx`): specify function-like capabilities with preprocessing, execution, and postprocessing steps
203
+ - **Context** (`.ctx`): specifies contextual information, team preferences, MCP server paths, database connections, and other environment variables that are loaded for the team. Teams are specified by their path and the team name in the `<team>.ctx` file. Teams organize collections of NPCs with shared context and specify a coordinator within the team context.
204
+
205
+ The NPC Shell system integrates the capabilities of `npcpy` to maintain conversation history, track command execution, and provide intelligent autocomplete through an extensible command routing system. State is preserved between sessions, allowing for continuous knowledge building over time.
206
+
207
+ This architecture enables complex AI workflows while maintaining a simple, declarative syntax that abstracts away implementation complexity. By organizing AI capabilities as composable data structures rather than code, npcsh creates a more accessible and adaptable framework for AI automation.
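As a hypothetical illustration of this declarative approach: an `.npc` file is a small YAML document of flat key/value pairs. The field names below follow the description above but are assumptions, and a few lines of dependency-free Python are enough to inspect such a file:

```python
# Hypothetical sketch of the declarative NPC data layer: an .npc file
# is flat YAML, so for illustration a minimal key/value parse suffices.
# Field names here are illustrative assumptions, not a fixed schema.
npc_source = """\
name: sibiji
primary_directive: You are a helpful general-purpose assistant.
model: gemma3:4b
provider: ollama
"""

npc = {}
for line in npc_source.splitlines():
    # Split on the first ': ' so values containing ':' (model tags) survive.
    key, _, value = line.partition(": ")
    npc[key] = value
```

In practice a real YAML parser would be used; the point is only that agents are plain data, editable without touching code.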
197
208
 
198
209
  # Macros
199
210
 - Macros are activated by invoking `/<command> ...` in `npcsh` and can also be called in bash through the `npc` CLI. In our examples, we provide both `npcsh` calls and bash calls with the `npc` CLI where relevant. To convert any `/<command>` in `npcsh` to a bash version, replace the `/` with `npc ` and the macro command will be invoked as a positional argument.
200
211
 
201
- - `/alicanto` - Conduct deep research with multiple perspectives, identifying gold insights and cliff warnings
202
- - `/brainblast` - Execute an advanced chunked search on command history
203
- - `/breathe` - Condense context on a regular cadence
204
- - `/compile` - Compile NPC profiles
205
- - `/corca` - Enter the Corca MCP-powered agentic shell. Usage: /corca [--mcp-server-path path]
206
- - `/flush` - Flush the last N messages
207
- - `/guac` - Enter guac mode
208
- - `/help` - Show help for commands, NPCs, or Jinxs. Usage: /help
209
- - `/init` - Initialize NPC project
210
- - `/jinxs` - Show available jinxs for the current NPC/Team
211
- - `/npc-studio` - Start npc studio
212
- - `/ots` - Take screenshot and analyze with vision model
213
- - `/plan` - Execute a plan command
214
- - `/plonk` - Use vision model to interact with GUI. Usage: /plonk <task description>
215
- - `/pti` - Use pardon-the-interruption mode to interact with reasoning model LLM
216
- - `/rag` - Execute a RAG command using ChromaDB embeddings with optional file input (-f/--file)
217
- - `/roll` - generate a video with video generation model
218
- - `/sample` - Send a prompt directly to the LLM
219
- - `/search` - Execute a web search command
220
- - `/serve` - Serve an NPC Team server.
221
- - `/set` - Set configuration values
222
- - `/sleep` - Evolve knowledge graph with options for dreaming.
223
- - `/spool` - Enter interactive chat (spool) mode with an npc with fresh context or files for rag
224
- - `/trigger` - Execute a trigger command
225
- - `/vixynt` - Generate and edit images from text descriptions using local models, openai, gemini
226
- - `/wander` - A method for LLMs to think on a problem by switching between states of high temperature and low temperature
227
- - `/yap` - Enter voice chat (yap) mode
212
+ - `/alicanto` - Conduct deep research with multiple perspectives, identifying gold insights and cliff warnings. Usage: `/alicanto 'query to be researched' --num-npcs <int> --depth <int>`
213
+ - `/brainblast` - Execute an advanced chunked search on command history. Usage: `/brainblast 'query' --top_k 10`
214
+ - `/breathe` - Condense context on a regular cadence. Usage: `/breathe -p <provider: NPCSH_CHAT_PROVIDER> -m <model: NPCSH_CHAT_MODEL>`
215
+ - `/compile` - Compile NPC profiles. Usage: `/compile <path_to_npc> `
216
+ - `/corca` - Enter the Corca MCP-powered agentic shell. Usage: `/corca [--mcp-server-path path]`
217
+ - `/flush` - Flush the last N messages. Usage: `/flush N=10`
218
+ - `/guac` - Enter guac mode. Usage: `/guac`
219
+ - `/help` - Show help for commands, NPCs, or Jinxs. Usage: `/help`
220
+ - `/init` - Initialize NPC project. Usage: `/init`
221
+ - `/jinxs` - Show available jinxs for the current NPC/Team. Usage: `/jinxs`
222
+ - `/<jinx_name>` - Run a jinx with specified command-line arguments. Usage: `/<jinx_name> jinx_arg1 jinx_arg2`
223
+ - `/npc-studio` - Start npc studio. Pulls the NPC Studio GitHub repo to `~/.npcsh/npc-studio` and launches it in development mode after installing necessary NPM dependencies. Usage: `/npc-studio`
224
+ - `/ots` - Take screenshot and analyze with vision model. Usage: `/ots filename=<output_file_name_for_screenshot>` then select an area, and you will be prompted for your request.
225
+ - `/plan` - Execute a plan command. Usage: `/plan 'idea for a cron job to be set up to accomplish'`
226
+ - `/plonk` - Use vision model to interact with GUI. Usage: `/plonk '<task description>' `
227
+ - `/pti` - Use pardon-the-interruption mode to interact with reasoning model LLM. Usage: `/pti`
228
+ - `/rag` - Execute a RAG command using ChromaDB embeddings with optional file input (-f/--file). Usage: `/rag '<query_to_rag>' --emodel <NPCSH_EMBEDDING_MODEL> --eprovider <NPCSH_EMBEDDING_PROVIDER>`
229
+ - `/roll` - generate a video with video generation model. Usage: `/roll '<description_for_a_movie>' --vgmodel <NPCSH_VIDEO_GEN_MODEL> --vgprovider <NPCSH_VIDEO_GEN_PROVIDER>`
230
+ - `/sample` - Send a context-free prompt to an LLM, letting you get fresh answers without needing to start a separate conversation/shell. Usage: `/sample -m <NPCSH_CHAT_MODEL> 'question to sample' --temp <float> --top_k <int>`
231
+ - `/search` - Execute a web search command. Usage: `/search 'search query' --sprovider <provider>` where provider is currently limited to DuckDuckGo and Perplexity. Wikipedia integration ongoing.
232
+ - `/serve` - Serve an NPC Team server.
233
+ - `/set` - Set configuration values.
234
+ - Usage:
235
+ - `/set model gemma3:4b`,
236
+ - `/set provider ollama`,
237
+ - `/set NPCSH_VIDEO_GEN_PROVIDER diffusers`
238
+ - `/sleep` - Evolve knowledge graph with options for dreaming. Usage: `/sleep --ops link_facts,deepen`
239
+ - `/spool` - Enter interactive chat (spool) mode with an npc with fresh context or files for rag. Usage: `/spool --attachments 'path1,path2,path3' -n <npc_name> -m <model> -p <provider>`
240
+ - `/trigger` - Execute a trigger command. Usage: `/trigger 'a description of a trigger to implement with system daemons/file system listeners.' -m gemma3:27b -p ollama`
241
+ - `/vixynt` - Generate and edit images from text descriptions using local models, openai, gemini.
242
+ - Usage:
243
+ - Gen Image: `/vixynt -igp <NPCSH_IMAGE_GEN_PROVIDER> --igmodel <NPCSH_IMAGE_GEN_MODEL> --output_file <path_to_file> width=<int:1024> height=<int:1024> 'description of image'`
244
+ - Edit Image: `/vixynt 'edit this....' --attachments '/path/to/image.png,/path/to/image.jpeg'`
245
+
246
+ - `/wander` - A method for LLMs to think on a problem by switching between states of high temperature and low temperature. Usage: `/wander 'query to wander about' --provider "ollama" --model "deepseek-r1:32b" environment="a vast dark ocean" interruption-likelihood=.1`
247
+ - `/yap` - Enter voice chat (yap) mode. Usage: `/yap -n <npc_to_chat_with>`
228
248
 
229
249
  ## Common Command-Line Flags:
230
250
 
@@ -372,23 +392,38 @@ python-tkinter (pyautogui)
372
392
  </details>
373
393
 
374
394
  ## Startup Configuration and Project Structure
375
- After `npcsh` has been pip installed, `npcsh`, `corca`, `guac`, `pti`, `spool`, `yap` and the `npc` CLI can be used as command line tools. To initialize these correctly, first start by starting the NPC shell:
395
+ To initialize the NPC shell environment parameters correctly, first start the NPC shell:
376
396
  ```bash
377
397
  npcsh
378
398
  ```
379
- When initialized, `npcsh` will generate a .npcshrc file in your home directory that stores your npcsh settings.
380
- Here is an example of what the .npcshrc file might look like after this has been run.
399
+ When initialized, `npcsh` will generate a `.npcshrc` file in your home directory that stores your npcsh settings.
400
+ Here is an example of what the `.npcshrc` file might look like after this has been run.
381
401
  ```bash
382
402
  # NPCSH Configuration File
383
403
  export NPCSH_INITIALIZED=1
384
- export NPCSH_CHAT_PROVIDER='ollama'
385
- export NPCSH_CHAT_MODEL='llama3.2'
386
404
  export NPCSH_DB_PATH='~/npcsh_history.db'
405
+ export NPCSH_CHAT_MODEL=gemma3:4b
406
+ export NPCSH_CHAT_PROVIDER=ollama
407
+ export NPCSH_DEFAULT_MODE=agent
408
+ export NPCSH_EMBEDDING_MODEL=nomic-embed-text
409
+ export NPCSH_EMBEDDING_PROVIDER=ollama
410
+ export NPCSH_IMAGE_GEN_MODEL=gpt-image-1
411
+ export NPCSH_IMAGE_GEN_PROVIDER=openai
412
+ export NPCSH_INITIALIZED=1
413
+ export NPCSH_REASONING_MODEL=deepseek-r1
414
+ export NPCSH_REASONING_PROVIDER=deepseek
415
+ export NPCSH_SEARCH_PROVIDER=duckduckgo
416
+ export NPCSH_STREAM_OUTPUT=1
417
+ export NPCSH_VECTOR_DB_PATH=~/npcsh_chroma.db
418
+ export NPCSH_VIDEO_GEN_MODEL=runwayml/stable-diffusion-v1-5
419
+ export NPCSH_VIDEO_GEN_PROVIDER=diffusers
420
+ export NPCSH_VISION_MODEL=gpt-4o-mini
421
+ export NPCSH_VISION_PROVIDER=openai
387
422
  ```
388
423
 
389
- `npcsh` also comes with a set of jinxs and NPCs that are used in processing. It will generate a folder at ~/.npcsh/ that contains the tools and NPCs that are used in the shell and these will be used in the absence of other project-specific ones. Additionally, `npcsh` records interactions and compiled information about npcs within a local SQLite database at the path specified in the .npcshrc file. This will default to ~/npcsh_history.db if not specified. When the data mode is used to load or analyze data in CSVs or PDFs, these data will be stored in the same database for future reference.
424
+ `npcsh` also comes with a set of jinxs and NPCs that are used in processing. It will generate a folder at `~/.npcsh/` that contains the tools and NPCs that are used in the shell and these will be used in the absence of other project-specific ones. Additionally, `npcsh` records interactions and compiled information about npcs within a local SQLite database at the path specified in the `.npcshrc` file. This will default to `~/npcsh_history.db` if not specified. When the data mode is used to load or analyze data in CSVs or PDFs, these data will be stored in the same database for future reference.
390
425
 
391
- The installer will automatically add this file to your shell config, but if it does not do so successfully for whatever reason you can add the following to your .bashrc or .zshrc:
426
+ The installer will automatically add this file to your shell config, but if it does not do so successfully for whatever reason you can add the following to your `.bashrc` or `.zshrc`:
392
427
 
393
428
  ```bash
394
429
  # Source NPCSH configuration
@@ -399,7 +434,7 @@ fi
399
434
 
400
435
  We support inference via all providers supported by litellm. For openai-compatible providers that are not explicitly named in litellm, use simply `openai-like` as the provider. The default provider must be one of `['openai','anthropic','ollama', 'gemini', 'deepseek', 'openai-like']` and the model must be one available from those providers.
401
436
 
402
- To use tools that require API keys, create an `.env` file in the folder where you are working or place relevant API keys as env variables in your ~/.npcshrc. If you already have these API keys set in a ~/.bashrc or a ~/.zshrc or similar files, you need not additionally add them to ~/.npcshrc or to an `.env` file. Here is an example of what an `.env` file might look like:
437
+ To use tools that require API keys, create an `.env` file in the folder where you are working or place relevant API keys as env variables in your `~/.npcshrc`. If you already have these API keys set in a `~/.bashrc` or a `~/.zshrc` or similar files, you need not additionally add them to `~/.npcshrc` or to an `.env` file. Here is an example of what an `.env` file might look like:
403
438
 
404
439
  ```bash
405
440
  export OPENAI_API_KEY="your_openai_key"
@@ -411,14 +446,15 @@ export PERPLEXITY_API_KEY='your_perplexity_key'
411
446
 
412
447
 
413
448
  Individual npcs can also be set to use different models and providers by setting the `model` and `provider` keys in the npc files.
414
- Once initialized and set up, you will find the following in your ~/.npcsh directory:
449
+
450
+ Once initialized and set up, you will find the following in your `~/.npcsh` directory:
415
451
  ```bash
416
452
  ~/.npcsh/
417
453
  ├── npc_team/ # Global NPCs
418
454
  │ ├── jinxs/ # Global tools
419
455
  │ └── assembly_lines/ # Workflow pipelines
420
- │ └── example.npc # globally available npc
421
- │ └── global.ctx # global context file
456
+ │ └── sibiji.npc # globally available npc
457
+ │ └── npcsh.ctx # global context file
422
458
 
423
459
 
424
460
 
@@ -31,6 +31,7 @@ redis
31
31
  psycopg2-binary
32
32
  flask_sse
33
33
  wikipedia
34
+ mcp
34
35
 
35
36
  [all]
36
37
  anthropic
@@ -52,7 +53,6 @@ playsound==1.2.2
52
53
  pygame
53
54
  faster_whisper
54
55
  pyttsx3
55
- mcp
56
56
 
57
57
  [lite]
58
58
  anthropic
@@ -71,9 +71,6 @@ nltk
71
71
  torch
72
72
  darts
73
73
 
74
- [mcp]
75
- mcp
76
-
77
74
  [yap]
78
75
  pyaudio
79
76
  gtts
@@ -40,6 +40,7 @@ base_requirements = [
40
40
  "psycopg2-binary",
41
41
  "flask_sse",
42
42
  "wikipedia",
43
+ "mcp"
43
44
  ]
44
45
 
45
46
  # API integration requirements
@@ -50,10 +51,6 @@ api_requirements = [
50
51
  "google-genai",
51
52
  ]
52
53
 
53
- # mcp integration requirements
54
- mcp_requirements = [
55
- "mcp",
56
- ]
57
54
  # Local ML/AI requirements
58
55
  local_requirements = [
59
56
  "sentence_transformers",
@@ -81,15 +78,14 @@ extra_files = package_files("npcpy/npc_team/")
81
78
 
82
79
  setup(
83
80
  name="npcsh",
84
- version="1.0.17",
81
+ version="1.0.18",
85
82
  packages=find_packages(exclude=["tests*"]),
86
83
  install_requires=base_requirements, # Only install base requirements by default
87
84
  extras_require={
88
- "lite": api_requirements, # Just API integrations
89
- "local": local_requirements, # Local AI/ML features
90
- "yap": voice_requirements, # Voice/Audio features
91
- "mcp": mcp_requirements, # MCP integration
92
- "all": api_requirements + local_requirements + voice_requirements + mcp_requirements, # Everything
85
+ "lite": api_requirements,
86
+ "local": local_requirements,
87
+ "yap": voice_requirements,
88
+ "all": api_requirements + local_requirements + voice_requirements ,
93
89
  },
94
90
  entry_points={
95
91
  "console_scripts": [
File without changes