npcsh 1.0.12__tar.gz → 1.0.14__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (31)
  1. npcsh-1.0.14/PKG-INFO +777 -0
  2. npcsh-1.0.14/README.md +680 -0
  3. {npcsh-1.0.12 → npcsh-1.0.14}/npcsh/_state.py +110 -41
  4. {npcsh-1.0.12 → npcsh-1.0.14}/npcsh/alicanto.py +22 -7
  5. {npcsh-1.0.12 → npcsh-1.0.14}/npcsh/npcsh.py +270 -469
  6. npcsh-1.0.14/npcsh/plonk.py +342 -0
  7. {npcsh-1.0.12 → npcsh-1.0.14}/npcsh/routes.py +369 -170
  8. npcsh-1.0.14/npcsh/spool.py +259 -0
  9. {npcsh-1.0.12 → npcsh-1.0.14}/npcsh/yap.py +115 -106
  10. npcsh-1.0.14/npcsh.egg-info/PKG-INFO +777 -0
  11. {npcsh-1.0.12 → npcsh-1.0.14}/npcsh.egg-info/requires.txt +2 -0
  12. {npcsh-1.0.12 → npcsh-1.0.14}/setup.py +4 -2
  13. npcsh-1.0.12/PKG-INFO +0 -596
  14. npcsh-1.0.12/README.md +0 -501
  15. npcsh-1.0.12/npcsh/plonk.py +0 -409
  16. npcsh-1.0.12/npcsh/spool.py +0 -318
  17. npcsh-1.0.12/npcsh.egg-info/PKG-INFO +0 -596
  18. {npcsh-1.0.12 → npcsh-1.0.14}/LICENSE +0 -0
  19. {npcsh-1.0.12 → npcsh-1.0.14}/npcsh/__init__.py +0 -0
  20. {npcsh-1.0.12 → npcsh-1.0.14}/npcsh/guac.py +0 -0
  21. {npcsh-1.0.12 → npcsh-1.0.14}/npcsh/mcp_helpers.py +0 -0
  22. {npcsh-1.0.12 → npcsh-1.0.14}/npcsh/mcp_npcsh.py +0 -0
  23. {npcsh-1.0.12 → npcsh-1.0.14}/npcsh/mcp_server.py +0 -0
  24. {npcsh-1.0.12 → npcsh-1.0.14}/npcsh/npc.py +0 -0
  25. {npcsh-1.0.12 → npcsh-1.0.14}/npcsh/pti.py +0 -0
  26. {npcsh-1.0.12 → npcsh-1.0.14}/npcsh/wander.py +0 -0
  27. {npcsh-1.0.12 → npcsh-1.0.14}/npcsh.egg-info/SOURCES.txt +0 -0
  28. {npcsh-1.0.12 → npcsh-1.0.14}/npcsh.egg-info/dependency_links.txt +0 -0
  29. {npcsh-1.0.12 → npcsh-1.0.14}/npcsh.egg-info/entry_points.txt +0 -0
  30. {npcsh-1.0.12 → npcsh-1.0.14}/npcsh.egg-info/top_level.txt +0 -0
  31. {npcsh-1.0.12 → npcsh-1.0.14}/setup.cfg +0 -0
npcsh-1.0.14/PKG-INFO ADDED
@@ -0,0 +1,777 @@
1
+ Metadata-Version: 2.4
2
+ Name: npcsh
3
+ Version: 1.0.14
4
+ Summary: npcsh is a command-line toolkit for using AI agents in novel ways.
5
+ Home-page: https://github.com/NPC-Worldwide/npcsh
6
+ Author: Christopher Agostino
7
+ Author-email: info@npcworldwi.de
8
+ Classifier: Programming Language :: Python :: 3
9
+ Classifier: License :: OSI Approved :: MIT License
10
+ Requires-Python: >=3.10
11
+ Description-Content-Type: text/markdown
12
+ License-File: LICENSE
13
+ Requires-Dist: npcpy
14
+ Requires-Dist: jinja2
15
+ Requires-Dist: litellm
16
+ Requires-Dist: docx
17
+ Requires-Dist: scipy
18
+ Requires-Dist: numpy
19
+ Requires-Dist: imagehash
20
+ Requires-Dist: requests
21
+ Requires-Dist: matplotlib
22
+ Requires-Dist: markdown
23
+ Requires-Dist: networkx
24
+ Requires-Dist: PyYAML
25
+ Requires-Dist: PyMuPDF
26
+ Requires-Dist: pyautogui
27
+ Requires-Dist: pydantic
28
+ Requires-Dist: pygments
29
+ Requires-Dist: sqlalchemy
30
+ Requires-Dist: termcolor
31
+ Requires-Dist: rich
32
+ Requires-Dist: colorama
33
+ Requires-Dist: Pillow
34
+ Requires-Dist: python-dotenv
35
+ Requires-Dist: pandas
36
+ Requires-Dist: beautifulsoup4
37
+ Requires-Dist: duckduckgo-search
38
+ Requires-Dist: flask
39
+ Requires-Dist: flask_cors
40
+ Requires-Dist: redis
41
+ Requires-Dist: psycopg2-binary
42
+ Requires-Dist: flask_sse
43
+ Provides-Extra: lite
44
+ Requires-Dist: anthropic; extra == "lite"
45
+ Requires-Dist: openai; extra == "lite"
46
+ Requires-Dist: google-generativeai; extra == "lite"
47
+ Requires-Dist: google-genai; extra == "lite"
48
+ Provides-Extra: local
49
+ Requires-Dist: sentence_transformers; extra == "local"
50
+ Requires-Dist: opencv-python; extra == "local"
51
+ Requires-Dist: ollama; extra == "local"
52
+ Requires-Dist: kuzu; extra == "local"
53
+ Requires-Dist: chromadb; extra == "local"
54
+ Requires-Dist: diffusers; extra == "local"
55
+ Requires-Dist: nltk; extra == "local"
56
+ Requires-Dist: torch; extra == "local"
57
+ Provides-Extra: yap
58
+ Requires-Dist: pyaudio; extra == "yap"
59
+ Requires-Dist: gtts; extra == "yap"
60
+ Requires-Dist: playsound==1.2.2; extra == "yap"
61
+ Requires-Dist: pygame; extra == "yap"
62
+ Requires-Dist: faster_whisper; extra == "yap"
63
+ Requires-Dist: pyttsx3; extra == "yap"
64
+ Provides-Extra: mcp
65
+ Requires-Dist: mcp; extra == "mcp"
66
+ Provides-Extra: all
67
+ Requires-Dist: anthropic; extra == "all"
68
+ Requires-Dist: openai; extra == "all"
69
+ Requires-Dist: google-generativeai; extra == "all"
70
+ Requires-Dist: google-genai; extra == "all"
71
+ Requires-Dist: sentence_transformers; extra == "all"
72
+ Requires-Dist: opencv-python; extra == "all"
73
+ Requires-Dist: ollama; extra == "all"
74
+ Requires-Dist: kuzu; extra == "all"
75
+ Requires-Dist: chromadb; extra == "all"
76
+ Requires-Dist: diffusers; extra == "all"
77
+ Requires-Dist: nltk; extra == "all"
78
+ Requires-Dist: torch; extra == "all"
79
+ Requires-Dist: pyaudio; extra == "all"
80
+ Requires-Dist: gtts; extra == "all"
81
+ Requires-Dist: playsound==1.2.2; extra == "all"
82
+ Requires-Dist: pygame; extra == "all"
83
+ Requires-Dist: faster_whisper; extra == "all"
84
+ Requires-Dist: pyttsx3; extra == "all"
85
+ Requires-Dist: mcp; extra == "all"
86
+ Dynamic: author
87
+ Dynamic: author-email
88
+ Dynamic: classifier
89
+ Dynamic: description
90
+ Dynamic: description-content-type
91
+ Dynamic: home-page
92
+ Dynamic: license-file
93
+ Dynamic: provides-extra
94
+ Dynamic: requires-dist
95
+ Dynamic: requires-python
96
+ Dynamic: summary
97
+
98
+ <p align="center">
99
+ <a href= "https://github.com/npc-worldwide/npcsh/blob/main/docs/npcsh.md">
100
+ <img src="https://raw.githubusercontent.com/npc-worldwide/npcsh/main/npcsh/npcsh.png" alt="npcsh logo" width=250></a>
101
+ </p>
102
+
103
+ # NPC Shell
104
+
105
+ The NPC shell is a suite of executable command-line programs that let users easily interact with NPCs and LLMs from the command line.
106
+
107
+ Programs within the NPC shell use the properties defined in `~/.npcshrc`, which is generated upon installation and running of `npcsh` for the first time.
108
+
109
+ To get started:
110
+ ```
111
+ pip install 'npcsh[local]'
112
+ ```
113
+ Once installed, the following CLI tools will be available: `npcsh`, `guac`, the `npc` CLI, `yap`, `pti`, `wander`, and `spool`.
114
+
115
+
116
+ # npcsh
117
+ - An AI-powered shell that accepts bash commands, natural language, and special macro calls; `npcsh` parses your input and routes it accordingly, agentically and automatically.
118
+
119
+
120
+ - Get help with a task:
121
+ ```
122
+ npcsh:🤖sibiji:gemini-2.5-flash>can you help me identify what process is listening on port 5337?
123
+ ```
124
+ <p align="center">
125
+ <img src="https://raw.githubusercontent.com/npc-worldwide/npcsh/main/test_data/port5337.png" alt="example of running npcsh to check what processes are listening on port 5337" width=600>
126
+ </p>
127
+ - Edit files
128
+
129
+ - **Ask a Generic Question**
130
+ ```bash
131
+ npcsh> has there ever been a better pasta shape than bucatini?
132
+ ```
133
+
134
+ ```
135
+ .Loaded .env file...
136
+ Initializing database schema...
137
+ Database schema initialization complete.
138
+ Processing prompt: 'has there ever been a better pasta shape than bucatini?' with NPC: 'sibiji'...
139
+ • Action chosen: answer_question
140
+ • Explanation given: The question is a general opinion-based inquiry about pasta shapes and can be answered without external data or jinx invocation.
141
+ ...............................................................................
142
+ Bucatini is certainly a favorite for many due to its unique hollow center, which holds sauces beautifully. Whether it's "better" is subjective and depends on the dish and personal
143
+ preference. Shapes like orecchiette, rigatoni, or trofie excel in different recipes. Bucatini stands out for its versatility and texture, making it a top contender among pasta shapes!
144
+ ```
145
+
146
+
147
+ - **Search the Web**
148
+ ```bash
149
+ /search "cal golden bears football schedule" -sp perplexity
150
+ ```
151
+ <p align="center">
152
+ <img src="https://raw.githubusercontent.com/npc-worldwide/npcsh/main/test_data/search_example.png" alt="example of search results" width=600>
153
+ </p>
154
+
155
+ - **Computer Use**
156
+ ```bash
157
+ /plonk -n 'npc_name' -sp 'task for plonk to carry out'
158
+ ```
159
+
160
+ - **Generate Image**
161
+ ```bash
162
+ /vixynt 'generate an image of a rabbit eating ham in the brink of dawn' model='gpt-image-1' provider='openai'
163
+ ```
164
+ <p align="center">
165
+ <img src="https://raw.githubusercontent.com/npc-worldwide/npcsh/main/test_data/rabbit.PNG" alt="a rabbit eating ham in the brink of dawn" width=250>
166
+ </p>
167
+ - **Generate Video**
168
+ ```bash
169
+ /roll 'generate a video of a hat riding a dog'
170
+ ```
171
+ <p align="center">
172
+ <img src="https://raw.githubusercontent.com/npc-worldwide/npcsh/main/test_data/hat_video.mp4" alt="video of a hat riding a dog" width=250>
173
+ </p>
174
+
175
+ - **Serve an NPC Team**
176
+ ```bash
177
+ /serve --port 5337 --cors='http://localhost:5137/'
178
+ ```
179
+ - **Screenshot Analysis**
180
+ ```bash
181
+ /ots
182
+ ```
183
+
184
+
185
+ # Macros
186
+ - Activated by invoking `/<command> ...` in `npcsh`, macros can also be called from bash through the `npc` CLI. In our examples, we provide both `npcsh` calls and bash calls with the `npc` CLI where relevant. To convert any `/<command>` in `npcsh` to its bash form, replace the `/` with `npc ` and the macro will be invoked as a positional argument. Some macros, like `/breathe` and `/flush`, are shell-only and have no `npc` CLI equivalent.
187
+
188
+ - ## TL; DR:
189
+ - `/alicanto` - Conduct deep research with multiple perspectives, identifying gold insights and cliff warnings
190
+ - `/brainblast` - Execute an advanced chunked search on command history
191
+ - `/breathe` - Condense context on a regular cadence
192
+ - `/compile` - Compile NPC profiles
193
+ - `/flush` - Flush the last N messages
194
+ - `/guac` - Enter guac mode
195
+ - `/help` - Show help for commands, NPCs, or Jinxs. Usage: /help
196
+ - `/init` - Initialize NPC project
197
+ - `/jinxs` - Show available jinxs for the current NPC/Team
198
+ - `/ots` - Take screenshot and analyze with vision model
199
+ - `/plan` - Execute a plan command
+ - `/plonk` - Use vision model to interact with GUI. Usage: /plonk <task description>
200
+ - `/pti` - Use pardon-the-interruption mode to interact with reasoning model LLM
201
+ - `/rag` - Execute a RAG command using ChromaDB embeddings with optional file input (-f/--file)
202
+ - `/roll` - generate a video with video generation model
203
+ - `/sample` - Send a prompt directly to the LLM
204
+ - `/search` - Execute a web search command
205
+ - `/serve` - Serve an NPC Team server.
206
+ - `/set` - Set configuration values
207
+ - `/sleep` - Evolve knowledge graph with options for dreaming.
208
+ - `/spool` - Enter interactive chat (spool) mode with an npc with fresh context or files for rag
209
+ - `/trigger` - Execute a trigger command
210
+ - `/vixynt` - Generate and edit images from text descriptions using local models, openai, gemini
211
+ - `/wander` - A method for LLMs to think on a problem by switching between states of high temperature and low temperature
212
+ - `/yap` - Enter voice chat (yap) mode
213
+
214
+ ## Common Command-Line Flags
+
+ The shortest unambiguous prefix works (e.g., `-tem` for `--temperature`).
215
+
216
+ ```
217
+ Flag Shorthand | Flag Shorthand | Flag Shorthand | Flag Shorthand
218
+ ------------------------------ | ------------------------------ | ------------------------------ | ------------------------------
219
+ --attachments (-a) | --height (-h) | --num_npcs (-num_n) | --team (-tea)
220
+ --config_dir (-con) | --igmodel (-igm) | --output_file (-o) | --temperature (-tem)
221
+ --cors (-cor) | --igprovider (-igp) | --plots_dir (-pl) | --top_k
222
+ --creativity (-cr) | --lang (-l) | --port (-po) | --top_p
223
+ --depth (-d) | --max_tokens (-ma) | --provider (-pr) | --vmodel (-vm)
224
+ --emodel (-em) | --messages (-me) | --refresh_period (-re) | --vprovider (-vp)
225
+ --eprovider (-ep) | --model (-mo) | --rmodel (-rm) | --width (-w)
226
+ --exploration (-ex) | --npc (-np) | --rprovider (-rp) |
227
+ --format (-f) | --num_frames (-num_f) | --sprovider (-s) |
228
+ ```
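As an illustration of how shortest-unambiguous-prefix matching behaves (a sketch, not npcsh's actual argument parser), note that `-t` alone would be ambiguous among `--temperature`, `--team`, `--top_k`, and `--top_p`, which is why the table lists `-tem`:

```python
def resolve_flag(shorthand, known_flags):
    """Return the unique long flag that starts with the given shorthand.

    Raises ValueError when the prefix matches zero or several flags.
    (Illustrative sketch; npcsh's real parser may differ.)
    """
    prefix = shorthand.lstrip("-")
    matches = [f for f in known_flags if f.lstrip("-").startswith(prefix)]
    if len(matches) != 1:
        raise ValueError(f"ambiguous or unknown flag -{prefix}: {matches}")
    return matches[0]

flags = ["--temperature", "--team", "--top_k", "--top_p", "--model", "--messages"]
```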
230
+
231
+ - ## alicanto: a research exploration agent flow.
232
+
233
+ <p align="center"><a href ="https://github.com/npc-worldwide/npcsh/blob/main/docs/alicanto.md">
234
+ <img src="https://raw.githubusercontent.com/npc-worldwide/npcsh/main/npcsh/npc_team/alicanto.png" alt="logo for deep research" width=250></a>
235
+ </p>
236
+
237
+ - Examples:
238
+ ```bash
239
+ # npcsh
240
+ /alicanto "What are the implications of quantum computing for cybersecurity?"
241
+ /alicanto "How might climate change impact global food security?" --num-npcs 8 --depth 5
242
+ ```
243
+
244
+ ```bash
245
+ # bash
246
+ npc alicanto "What ethical considerations should guide AI development?" --max_facts_per_chain 0.5 --max_thematic_groups 3 --max_criticisms_per_group 3 --max_conceptual_combinations 3 --max_experiments 10
247
+
248
+ npc alicanto "What is the future of remote work?" --format report
249
+ ```
250
+ - ## Brainblast: searching through past messages:
251
+ ```bash
252
+ # npcsh
253
+ /brainblast 'subtle summer winds' --top_k 10
254
+ ```
255
+ ```bash
256
+ # bash
257
+ npc brainblast 'python dictionaries'
258
+ ```
259
+ - ## Breathe: Condense conversation context (shell only):
260
+ ```bash
261
+ # npcsh
262
+ /breathe
263
+ /breathe -p ollama -m qwen3:latest
264
+ ```
265
+ - ## Compile: render npcs for use without re-loading npcsh
266
+ ```bash
267
+ # npcsh
268
+ /compile /path/to/npc
269
+ ```
270
+ - ## flush: flush context (shell only):
271
+ ```bash
272
+ /flush
273
+ ```
274
+
275
+
276
+ - ## `guac`
277
+
278
+ <p align="center"><a href ="https://github.com/npc-worldwide/npcsh/blob/main/docs/guac.md">
279
+ <img src="https://raw.githubusercontent.com/npc-worldwide/npcsh/main/npcsh/npc_team/guac.png" alt="npcsh logo of a solarpunk sign" width=250></a>
280
+ </p>
281
+
282
+ - a replacement shell for interpreters like python/r/node/julia with an avocado input marker 🥑 that brings a pomodoro-like approach to interactive coding.
283
+ - Simulation:
284
+ `🥑 Make a markov chain simulation of a random walk in 2D space with 1000 steps and visualize`
285
+ ```
286
+ # Generated python code:
287
+ import numpy as np
288
+ import matplotlib.pyplot as plt
289
+
290
+ # Number of steps
291
+ n_steps = 1000
292
+
293
+ # Possible moves: up, down, left, right
294
+ moves = np.array([[0, 1], [0, -1], [1, 0], [-1, 0]])
295
+
296
+ # Initialize position array
297
+ positions = np.zeros((n_steps+1, 2), dtype=int)
298
+
299
+ # Generate random moves
300
+ for i in range(1, n_steps+1):
301
+ step = moves[np.random.choice(4)]
302
+ positions[i] = positions[i-1] + step
303
+
304
+ # Plot the random walk
305
+ plt.figure(figsize=(8, 8))
306
+ plt.plot(positions[:, 0], positions[:, 1], lw=1)
307
+ plt.scatter([positions[0, 0]], [positions[0, 1]], color='green', label='Start')
308
+ plt.scatter([positions[-1, 0]], [positions[-1, 1]], color='red', label='End')
309
+ plt.title('2D Random Walk - 1000 Steps (Markov Chain)')
310
+ plt.xlabel('X Position')
311
+ plt.ylabel('Y Position')
312
+ plt.legend()
313
+ plt.grid(True)
314
+ plt.axis('equal')
315
+ plt.show()
316
+ # Generated code executed successfully
317
+
318
+ ```
319
+ <p align="center">
320
+ <img src="https://raw.githubusercontent.com/npc-worldwide/npcsh/main/test_data/markov_chain.png" alt="markov chain figure" width=250>
321
+ </p>
322
+
323
+ Access the variables created in the code:
324
+ `🥑 print(positions)`
325
+ ```
326
+ [[ 0 0]
327
+ [ 0 -1]
328
+ [ -1 -1]
329
+ ...
330
+ [ 29 -23]
331
+ [ 28 -23]
332
+ [ 27 -23]]
333
+ ```
334
+
335
+ - Run a python script:
336
+ `🥑 run file.py`
337
+ - Refresh:
338
+ `🥑 /refresh`
339
+ - Show current variables:
340
+ `🥑 /show`
341
+
342
+ A guac session progresses through a series of stages of equal length, each of which adjusts the emoji input prompt: Stage 1: `🥑`, Stage 2: `🥑🔪`, Stage 3: `🥑🥣`, Stage 4: `🥑🥣🧂`, Stage 5: `🥘 TIME TO REFRESH`. At stage 5, the user is reminded to refresh with the /refresh macro. This will evaluate the session so far and suggest and implement new functions or automations that will aid in future sessions, with the ultimate approval of the user.
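The stage progression can be pictured with a small sketch (illustrative only; the real stage lengths and trigger logic are internal to guac):

```python
GUAC_STAGES = ["🥑", "🥑🔪", "🥑🥣", "🥑🥣🧂", "🥘 TIME TO REFRESH"]

def stage_prompt(commands_run, session_length=20):
    """Return the input-prompt emoji for the current point in a session.

    The session is split into len(GUAC_STAGES) stages of equal length;
    once the final stage is reached the user is nudged to /refresh.
    """
    stage_len = session_length / len(GUAC_STAGES)
    idx = min(int(commands_run // stage_len), len(GUAC_STAGES) - 1)
    return GUAC_STAGES[idx]
```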
343
+
344
+
345
+ - ## help: Show help for commands, NPCs, or Jinxs.
346
+ ```bash
347
+ /help
348
+ ```
349
+ - ## init - Initialize NPC project
350
+ - ## jinxs : show available jinxs
351
+ - ## ots: Over-the-shoulder screen shot analysis
352
+ - Screenshot analysis:
353
+ ```bash
354
+ #npcsh
355
+ /ots
356
+ /ots output_filename=...
357
+ ```
358
+ ```bash
359
+ #bash
360
+ npc ots ...
361
+ ```
362
+ - ## `plan`: set up cron jobs:
363
+ ```bash
364
+ # npcsh
365
+ /plan 'a description of a cron job to implement' -m gemma3:27b -p ollama
366
+ ```
367
+ ```bash
368
+ # bash
369
+ npc plan
370
+ ```
371
+
372
+ - ## `plonk`: Computer use:
373
+ ```bash
374
+ # npcsh
375
+ /plonk -n 'npc_name' -sp 'task for plonk to carry out '
376
+
377
+ #bash
378
+ npc plonk
379
+ ```
380
+ - ## `pti`: a reasoning REPL loop with interruptions
381
+
382
+ ```npcsh
383
+ /pti -n frederic -m qwen3:latest -p ollama
384
+ ```
385
+
386
+ Or from the bash cmd line:
387
+ ```bash
388
+ pti
389
+ ```
390
+ <p align="center"><a href ="https://github.com/npc-worldwide/npcsh/blob/main/docs/pti.md">
391
+ <img src="https://raw.githubusercontent.com/npc-worldwide/npcsh/main/npcsh/npc_team/frederic4.png" alt="npcsh logo of frederic the bear and the pti logo" width=250></a>
392
+ </p>
393
+
394
+ - ## `rag`: embedding search through chroma db, optional file input
395
+ - ## `roll`: your video generation assistant
397
+ ```npcsh
398
+ /roll --provider ollama --model llama3
399
+ ```
400
+
401
+ - ## sample: one-shot sampling from LLMs with specific parameters
402
+ ```bash
403
+ # npcsh
404
+ /sample 'prompt'
405
+ /sample -m gemini-1.5-flash "Summarize the plot of 'The Matrix' in three sentences."
406
+
407
+ /sample --model claude-3-5-haiku-latest "Translate 'good morning' to Japanese."
408
+
409
+ /sample model=qwen3:latest "tell me about the last time you went shopping."
410
+
411
+
412
+ ```
413
+ ```bash
414
+ # bash
415
+ npc sample -p ollama -m gemma3:12b --temp 1.8 --top_k 50 "Write a haiku about the command line."
416
+
417
+ npc sample model=gpt-4o-mini "What are the primary colors?" --provider openai
418
+ ```
419
+
420
+ - ## search: use an internet search provider
421
+ ```npcsh
422
+ /search -sp perplexity 'cal bears football schedule'
423
+ /search --sprovider duckduckgo 'beef tongue'
424
+ # Other search providers could be added, but we have only integrated duckduckgo and perplexity for the moment.
425
+ ```
426
+
427
+ ```bash
428
+ npc search 'when is the moon gonna go away from the earth'
429
+ ```
430
+
431
+
432
+ - ## serve: serve an npc team
433
+ ```bash
434
+ /serve
435
+ /serve ....
437
+ ```
438
+
439
+ ```bash
440
+ npc serve
441
+ ```
442
+
443
+ - ## set: change current model, env params
444
+ ```bash
445
+ /set model ...
446
+ /set provider ...
447
+ /set NPCSH_API_URL https://localhost:1937
448
+ ```
449
+
450
+ ```bash
451
+ npc set ...
452
+ ```
453
+ - ## sleep: prune and evolve the current knowledge graph
454
+ ```bash
455
+ /sleep
456
+ /sleep --dream
457
+ /sleep --ops link_facts,deepen
458
+ ```
459
+
460
+ ```bash
461
+ npc sleep
462
+ ```
463
+ - ## `spool`
464
+ <p align="center"><a href ="https://github.com/npc-worldwide/npcsh/blob/main/docs/spool.md">
465
+ <img src="https://raw.githubusercontent.com/npc-worldwide/npcsh/main/npcsh/npc_team/spool.png" alt="logo for spool" width=250></a>
466
+ </p>
467
+
468
+ - Enter chat loop with isolated context, attachments, specified models/providers:
469
+ ```npcsh
470
+ /spool -n <npc_name>
471
+ /spool --attachments ./test_data/port5337.png,./test_data/yuan2004.pdf,./test_data/books.csv
472
+ /spool --provider ollama --model llama3
473
+ /spool -p deepseek -m deepseek-reasoner
474
+ /spool -n alicanto
475
+ ```
476
+
477
+
478
+
479
+ - ## Trigger: schedule listeners, daemons
480
+ ```bash
481
+ /trigger 'a description of a trigger to implement with system daemons/file system listeners.' -m gemma3:27b -p ollama
482
+ ```
483
+ ```bash
484
+ npc trigger
485
+ ```
486
+
487
+
488
+
489
+
490
+
491
+
492
+ - ## Vixynt: Image generation and editing:
493
+ ```bash
494
+ # npcsh
495
+ /vixynt 'an image of a dog eating a hat'
496
+ /vixynt --output_file ~/Desktop/dragon.png "A terrifying dragon"
497
+ /vixynt "A photorealistic portrait of a cat wearing a wizard hat in the dungeon of the master and margarita" -w 1024 height=1024
498
+ /vixynt -igp ollama --igmodel Qwen/QwenImage --output_file /tmp/sub.png width=1024 height=512 "A detailed steampunk submarine exploring a vibrant coral reef, wide aspect ratio"
499
+ ```
500
+
501
+ ```bash
502
+ # bash
503
+ npc vixynt --attachments ./test_data/rabbit.PNG "Turn this rabbit into a fierce warrior in a snowy winter scene" -igp openai -igm gpt-image
504
+ npc vixynt --igmodel CompVis/stable-diffusion-v1-4 --igprovider diffusers "sticker of a red tree"
505
+ ```
506
+
507
+
508
+
509
+
510
+
511
+ - ## `wander`: daydreaming for LLMs
512
+
513
+ <p align="center"><a href ="https://github.com/npc-worldwide/npcsh/blob/main/docs/wander.md">
514
+ <img src="https://raw.githubusercontent.com/npc-worldwide/npcsh/main/npcsh/npc_team/kadiefa.png" alt="logo for wander" width=250></a>
515
+ </p>
516
+ A system for thinking outside of the box. From our testing, gpt-4o-mini and gpt-series models in general appear to wander the most through various languages and ideas at high temperatures, Gemini models and many Llama ones appear more stable despite high temps, and thinking models in general appear to be worse at this task.
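The high/low temperature alternation behind wandering can be sketched roughly as follows (a toy illustration of the idea, not npcsh's implementation; `llm` is a stand-in for any completion function):

```python
import random

def wander(problem, llm, n_streams=3, high_temp=1.95, low_temp=0.4, sample_rate=0.5):
    """Daydream at high temperature, then synthesize at low temperature."""
    # 1. generate several loose, high-temperature streams of association
    streams = [llm(problem, temperature=high_temp) for _ in range(n_streams)]
    # 2. sample a subset of the streams to carry forward
    kept = [s for s in streams if random.random() < sample_rate] or streams[:1]
    # 3. ground the associations with a low-temperature synthesis pass
    synthesis = problem + "\n\nLoose associations:\n" + "\n".join(kept)
    return llm(synthesis, temperature=low_temp)
```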
517
+
518
+ - Wander with an auto-generated environment
519
+ ```
520
+ npc --model "gemini-2.0-flash" --provider "gemini" wander "how does the bar of a galaxy influence the surrounding IGM?" \
521
+ n-high-temp-streams=10 \
522
+ high-temp=1.95 \
523
+ low-temp=0.4 \
524
+ sample-rate=0.5 \
525
+ interruption-likelihood=1
526
+ ```
527
+ - Specify a custom environment
528
+ ```
529
+ npc --model "gpt-4o-mini" --provider "openai" wander "how does the goos-hanchen effect impact neutron scattering?" \
530
+ environment="a ship's library in the south." \
531
+ num-events=3 \
532
+ n-high-temp-streams=10 \
533
+ high-temp=1.95 \
534
+ low-temp=0.4 \
535
+ sample-rate=0.5 \
536
+ interruption-likelihood=1
537
+ ```
538
+ - Control event generation
539
+ ```
540
+ npc wander "what is the goos hanchen effect and does it affect water refraction?" \
541
+ --provider "ollama" \
542
+ --model "deepseek-r1:32b" \
543
+ environment="a vast, dark ocean." \
544
+ interruption-likelihood=.1
545
+ ```
546
+
547
+ - ## `yap`: an agentic voice control loop
548
+
549
+
550
+ <p align="center"><a href ="https://github.com/npc-worldwide/npcsh/blob/main/docs/yap.md">
551
+ <img src="https://raw.githubusercontent.com/npc-worldwide/npcsh/main/npcsh/npc_team/yap.png" alt="logo for yap" width=250></a>
552
+ </p>
553
+
554
+ - an agentic voice control loop with a specified agent. When launching `yap`, the user enters the typical `npcsh` agentic loop except that the system is waiting for either text or audio input.
555
+ - voice chat:
556
+ ```bash
557
+ # npcsh
558
+ /yap
559
+ ```
560
+ ```bash
561
+ # bash
562
+ yap
563
+ npc yap
564
+ ```
565
+ - Show available Jinja Execution Templates:
566
+ ```bash
567
+ # npcsh
568
+ /jinxs
569
+ ```
570
+ ```bash
571
+ # bash
572
+ npc jinxs
573
+ ```
574
+
575
+
576
+
577
+ ## Inference Capabilities
578
+ - `npcsh` works with local and enterprise LLM providers through its LiteLLM integration, allowing users to run inference from Ollama, LMStudio, OpenAI, Anthropic, Gemini, and Deepseek, making it a versatile tool for both simple commands and sophisticated AI-driven tasks.
579
+
580
+ ## Read the Docs
581
+
582
+ Read the docs at [npcsh.readthedocs.io](https://npcsh.readthedocs.io/en/latest/)
583
+
584
+
585
+ ## NPC Studio
586
+ NPC Studio is a graphical user interface built on the NPC Toolkit. See the open source code for NPC Studio [here](https://github.com/npc-worldwide/npc-studio). Download the executables at [our website](https://enpisi.com/npc-studio).
587
+
588
+
589
+ ## Mailing List
590
+ Interested to stay in the loop and to hear the latest and greatest about `npcpy`, `npcsh`, and NPC Studio? Be sure to sign up for the [newsletter](https://forms.gle/n1NzQmwjsV4xv1B2A)!
591
+
592
+
593
+ ## Support
594
+ If you appreciate the work here, [consider supporting NPC Worldwide with a monthly donation](https://buymeacoffee.com/npcworldwide) or [buying NPC-WW themed merch](https://enpisi.com/shop). To hire us to help you explore how to use the NPC Toolkit and AI tools in your business or research team, please reach out to info@npcworldwi.de.
595
+
596
+
597
+ ## Installation
598
+ `npcsh` is available on PyPI and can be installed using pip. Before installing, make sure you have the necessary dependencies installed on your system. Below are the instructions for installing such dependencies on Linux, Mac, and Windows. If you find any other dependencies that are needed, please let us know so we can update the installation instructions to be more accommodating.
599
+
600
+ ### Linux install
601
+ <details> <summary> Toggle </summary>
602
+
603
+ ```bash
604
+
605
+ # these are primarily for audio; skip if you don't need TTS
606
+ sudo apt-get install espeak
607
+ sudo apt-get install portaudio19-dev python3-pyaudio
608
+ sudo apt-get install alsa-base alsa-utils
609
+ sudo apt-get install libcairo2-dev
610
+ sudo apt-get install libgirepository1.0-dev
611
+ sudo apt-get install ffmpeg
612
+
613
+ # for triggers
614
+ sudo apt install inotify-tools
615
+
616
+
617
+ #And if you don't have ollama installed, use this:
618
+ curl -fsSL https://ollama.com/install.sh | sh
619
+
620
+ ollama pull llama3.2
621
+ ollama pull llava:7b
622
+ ollama pull nomic-embed-text
623
+ pip install npcsh
624
+ # if you want to install with the API libraries
625
+ pip install 'npcsh[lite]'
626
+ # if you want the full local package set up (ollama, diffusers, transformers, cuda etc.)
627
+ pip install 'npcsh[local]'
628
+ # if you want to use tts/stt
629
+ pip install 'npcsh[yap]'
630
+ # if you want everything:
631
+ pip install 'npcsh[all]'
632
+
633
+ ```
634
+
635
+ </details>
636
+
637
+
638
+ ### Mac install
639
+
640
+ <details> <summary> Toggle </summary>
641
+
642
+ ```bash
643
+ #mainly for audio
644
+ brew install portaudio
645
+ brew install ffmpeg
646
+ brew install pygobject3
647
+
648
+ # for triggers
649
+ brew install fswatch  # inotify-tools is Linux-only; fswatch is the closest macOS equivalent
650
+
651
+
652
+ brew install ollama
653
+ brew services start ollama
654
+ ollama pull llama3.2
655
+ ollama pull llava:7b
656
+ ollama pull nomic-embed-text
657
+ pip install npcsh
658
+ # if you want to install with the API libraries
659
+ pip install 'npcsh[lite]'
660
+ # if you want the full local package set up (ollama, diffusers, transformers, cuda etc.)
661
+ pip install 'npcsh[local]'
662
+ # if you want to use tts/stt
663
+ pip install 'npcsh[yap]'
664
+
665
+ # if you want everything:
666
+ pip install 'npcsh[all]'
667
+ ```
668
+ </details>
669
+
670
+ ### Windows Install
671
+
672
+ <details> <summary> Toggle </summary>
673
+ Download and install ollama exe.
674
+
675
+ Download and install ffmpeg. Then, in a PowerShell:
676
+
677
+ ```powershell
678
+ ollama pull llama3.2
679
+ ollama pull llava:7b
680
+ ollama pull nomic-embed-text
681
+ pip install npcsh
682
+ # if you want to install with the API libraries
683
+ pip install 'npcsh[lite]'
684
+ # if you want the full local package set up (ollama, diffusers, transformers, cuda etc.)
685
+ pip install 'npcsh[local]'
686
+ # if you want to use tts/stt
687
+ pip install 'npcsh[yap]'
688
+
689
+ # if you want everything:
690
+ pip install 'npcsh[all]'
691
+ ```
692
+ As of now, npcsh appears to work well on Windows with some of the core functionalities like /ots and /yap.
693
+
694
+ </details>
695
+
696
+ ### Fedora Install (under construction)
697
+
698
+ <details> <summary> Toggle </summary>
699
+
700
+ ```bash
701
+ sudo dnf install python3-devel    # fixes hnswlib issues with chromadb
702
+ xhost +                           # allow pyautogui to control the display
703
+ sudo dnf install python3-tkinter  # needed by pyautogui
704
+ ```
705
+
706
+ </details>
707
+
708
+ ## Startup Configuration and Project Structure
709
+ After `npcsh` has been pip installed, `npcsh`, `guac`, `pti`, `spool`, `yap`, and the `npc` CLI can be used as command line tools. To initialize these correctly, start the NPC shell:
710
+ ```bash
711
+ npcsh
712
+ ```
713
+ When initialized, `npcsh` will generate a .npcshrc file in your home directory that stores your npcsh settings.
714
+ Here is an example of what the .npcshrc file might look like after this has been run.
715
+ ```bash
716
+ # NPCSH Configuration File
717
+ export NPCSH_INITIALIZED=1
718
+ export NPCSH_CHAT_PROVIDER='ollama'
719
+ export NPCSH_CHAT_MODEL='llama3.2'
720
+ export NPCSH_DB_PATH='~/npcsh_history.db'
721
+ ```
722
+
723
+ `npcsh` also comes with a set of jinxs and NPCs that are used in processing. It will generate a folder at ~/.npcsh/ containing the jinxs and NPCs used in the shell; these are used in the absence of project-specific ones. Additionally, `npcsh` records interactions and compiled information about NPCs in a local SQLite database at the path specified in the .npcshrc file, defaulting to ~/npcsh_history.db if not specified. When data mode is used to load or analyze data in CSVs or PDFs, these data are stored in the same database for future reference.
724
+
725
+ The installer will automatically add this file to your shell config, but if it does not do so successfully for whatever reason, you can add the following to your .bashrc or .zshrc:
726
+
727
+ ```bash
728
+ # Source NPCSH configuration
729
+ if [ -f ~/.npcshrc ]; then
730
+ . ~/.npcshrc
731
+ fi
732
+ ```
733
+
734
+ We support inference via all providers supported by litellm. For openai-compatible providers that are not explicitly named in litellm, use simply `openai-like` as the provider. The default provider must be one of `['openai','anthropic','ollama', 'gemini', 'deepseek', 'openai-like']` and the model must be one available from those providers.
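For example, a `~/.npcshrc` pointing at a self-hosted openai-compatible endpoint might look like this (a sketch; the model name and URL below are placeholders):

```shell
export NPCSH_CHAT_PROVIDER='openai-like'
export NPCSH_CHAT_MODEL='my-local-model'
export NPCSH_API_URL='https://localhost:1937'
```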
735
+
736
+ To use tools that require API keys, create an `.env` file in the folder where you are working or place relevant API keys as env variables in your ~/.npcshrc. If you already have these API keys set in a ~/.bashrc or a ~/.zshrc or similar files, you need not additionally add them to ~/.npcshrc or to an `.env` file. Here is an example of what an `.env` file might look like:
737
+
738
+ ```bash
739
+ export OPENAI_API_KEY="your_openai_key"
740
+ export ANTHROPIC_API_KEY="your_anthropic_key"
741
+ export DEEPSEEK_API_KEY='your_deepseek_key'
742
+ export GEMINI_API_KEY='your_gemini_key'
743
+ export PERPLEXITY_API_KEY='your_perplexity_key'
744
+ ```
745
+
746
+
747
+ Individual npcs can also be set to use different models and providers by setting the `model` and `provider` keys in the npc files.
748
+ Once initialized and set up, you will find the following in your ~/.npcsh directory:
749
+ ```bash
750
+ ~/.npcsh/
751
+ ├── npc_team/ # Global NPCs
752
+ │ ├── jinxs/ # Global tools
753
+ │ └── assembly_lines/ # Workflow pipelines
754
+
755
+ ```
756
+ For cases where you wish to set up a project-specific set of NPCs, jinxs, and assembly lines, add an `npc_team` directory to your project and `npcsh` will pick up on its presence, like so:
757
+ ```bash
758
+ ./npc_team/ # Project-specific NPCs
759
+ ├── jinxs/ # Project jinxs
760
+ │ └── example.jinx
761
+ └── assembly_lines/ # Project workflows
762
+ └── example.pipe
763
+ └── models/ # Project models
764
+ └── example.model
765
+ └── example1.npc # Example NPC
766
+ └── example2.npc # Example NPC
767
+ └── team.ctx # Example ctx
768
+
769
+
770
+ ```
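A minimal `.npc` file might look like the following (a YAML sketch; only the `model` and `provider` keys are confirmed above, and the other field names are assumptions for illustration):

```yaml
# example1.npc (illustrative)
name: example1
primary_directive: "You are a helpful project assistant."
model: llama3.2
provider: ollama
```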
771
+
772
+ ## Contributing
773
+ Contributions are welcome! Please submit issues and pull requests on the GitHub repository.
774
+
775
+
776
+ ## License
777
+ This project is licensed under the MIT License.