npcsh 1.0.16__py3-none-any.whl → 1.0.18__py3-none-any.whl

This diff represents the content of publicly available package versions released to one of the supported registries; it is provided for informational purposes only and reflects the changes between package versions as they appear in their respective public registries.
```
Metadata-Version: 2.4
Name: npcsh
Version: 1.0.16
Summary: npcsh is a command-line toolkit for using AI agents in novel ways.
Home-page: https://github.com/NPC-Worldwide/npcsh
Author: Christopher Agostino
Author-email: info@npcworldwi.de
Classifier: Programming Language :: Python :: 3
Classifier: License :: OSI Approved :: MIT License
Requires-Python: >=3.10
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: npcpy
Requires-Dist: jinja2
Requires-Dist: litellm
Requires-Dist: docx
Requires-Dist: scipy
Requires-Dist: numpy
Requires-Dist: thefuzz
Requires-Dist: imagehash
Requires-Dist: requests
Requires-Dist: matplotlib
Requires-Dist: markdown
Requires-Dist: networkx
Requires-Dist: PyYAML
Requires-Dist: PyMuPDF
Requires-Dist: pyautogui
Requires-Dist: pydantic
Requires-Dist: pygments
Requires-Dist: sqlalchemy
Requires-Dist: termcolor
Requires-Dist: rich
Requires-Dist: colorama
Requires-Dist: Pillow
Requires-Dist: python-dotenv
Requires-Dist: pandas
Requires-Dist: beautifulsoup4
Requires-Dist: duckduckgo-search
Requires-Dist: flask
Requires-Dist: flask_cors
Requires-Dist: redis
Requires-Dist: psycopg2-binary
Requires-Dist: flask_sse
Requires-Dist: wikipedia
Provides-Extra: lite
Requires-Dist: anthropic; extra == "lite"
Requires-Dist: openai; extra == "lite"
Requires-Dist: google-generativeai; extra == "lite"
Requires-Dist: google-genai; extra == "lite"
Provides-Extra: local
Requires-Dist: sentence_transformers; extra == "local"
Requires-Dist: opencv-python; extra == "local"
Requires-Dist: ollama; extra == "local"
Requires-Dist: kuzu; extra == "local"
Requires-Dist: chromadb; extra == "local"
Requires-Dist: diffusers; extra == "local"
Requires-Dist: nltk; extra == "local"
Requires-Dist: torch; extra == "local"
Provides-Extra: yap
Requires-Dist: pyaudio; extra == "yap"
Requires-Dist: gtts; extra == "yap"
Requires-Dist: playsound==1.2.2; extra == "yap"
Requires-Dist: pygame; extra == "yap"
Requires-Dist: faster_whisper; extra == "yap"
Requires-Dist: pyttsx3; extra == "yap"
Provides-Extra: mcp
Requires-Dist: mcp; extra == "mcp"
Provides-Extra: all
Requires-Dist: anthropic; extra == "all"
Requires-Dist: openai; extra == "all"
Requires-Dist: google-generativeai; extra == "all"
Requires-Dist: google-genai; extra == "all"
Requires-Dist: sentence_transformers; extra == "all"
Requires-Dist: opencv-python; extra == "all"
Requires-Dist: ollama; extra == "all"
Requires-Dist: kuzu; extra == "all"
Requires-Dist: chromadb; extra == "all"
Requires-Dist: diffusers; extra == "all"
Requires-Dist: nltk; extra == "all"
Requires-Dist: torch; extra == "all"
Requires-Dist: pyaudio; extra == "all"
Requires-Dist: gtts; extra == "all"
Requires-Dist: playsound==1.2.2; extra == "all"
Requires-Dist: pygame; extra == "all"
Requires-Dist: faster_whisper; extra == "all"
Requires-Dist: pyttsx3; extra == "all"
Requires-Dist: mcp; extra == "all"
Dynamic: author
Dynamic: author-email
Dynamic: classifier
Dynamic: description
Dynamic: description-content-type
Dynamic: home-page
Dynamic: license-file
Dynamic: provides-extra
Dynamic: requires-dist
Dynamic: requires-python
Dynamic: summary
```

<p align="center">
  <a href="https://github.com/npc-worldwide/npcsh/blob/main/docs/npcsh.md">
  <img src="https://raw.githubusercontent.com/npc-worldwide/npcsh/main/npcsh/npcsh.png" alt="npcsh logo" width=250></a>
</p>

# NPC Shell

The NPC shell is a suite of executable command-line programs that let users easily interact with NPCs and LLMs through a command-line shell.

Programs within the NPC shell use the properties defined in `~/.npcshrc`, which is generated the first time `npcsh` is installed and run.

To get started:
```
pip install 'npcsh[local]'
```
Once installed, the following CLI tools will be available: `npcsh`, `guac`, the `npc` CLI, `yap`, `pti`, `wander`, and `spool`.

# npcsh

- An AI-powered shell that parses bash, natural language, and special macro calls, processing your input agentically and automatically.

- Get help with a task:
```
npcsh:🤖sibiji:gemini-2.5-flash>can you help me identify what process is listening on port 5337?
```
<p align="center">
  <img src="https://raw.githubusercontent.com/npc-worldwide/npcsh/main/test_data/port5337.png" alt="example of running npcsh to check what processes are listening on port 5337" width=600>
</p>
- Edit files

- **Ask a Generic Question**
```bash
npcsh> has there ever been a better pasta shape than bucatini?
```

```
.Loaded .env file...
Initializing database schema...
Database schema initialization complete.
Processing prompt: 'has there ever been a better pasta shape than bucatini?' with NPC: 'sibiji'...
• Action chosen: answer_question
• Explanation given: The question is a general opinion-based inquiry about pasta shapes and can be answered without external data or jinx invocation.
...............................................................................
Bucatini is certainly a favorite for many due to its unique hollow center, which holds sauces beautifully. Whether it's "better" is subjective and depends on the dish and personal
preference. Shapes like orecchiette, rigatoni, or trofie excel in different recipes. Bucatini stands out for its versatility and texture, making it a top contender among pasta shapes!
```

- **Search the Web**
```bash
/search "cal golden bears football schedule" -sp perplexity
```
<p align="center">
  <img src="https://raw.githubusercontent.com/npc-worldwide/npcsh/main/test_data/search_example.png" alt="example of search results" width=600>
</p>

- **Computer Use**
```bash
/plonk 'find out the latest news on cnn'
```

- **Generate Image**
```bash
/vixynt 'generate an image of a rabbit eating ham in the brink of dawn' model='gpt-image-1' provider='openai'
```
<p align="center">
  <img src="https://raw.githubusercontent.com/npc-worldwide/npcsh/main/test_data/rabbit.PNG" alt="a rabbit eating ham in the brink of dawn" width=250>
</p>
- **Generate Video**
```bash
/roll 'generate a video of a hat riding a dog'
```
<!--
<p align="center">
  <img src="https://raw.githubusercontent.com/npc-worldwide/npcsh/main/test_data/hat_video.mp4" alt="video of a hat riding a dog" width=250>
</p> -->

- **Serve an NPC Team**
```bash
/serve --port 5337 --cors='http://localhost:5137/'
```
- **Screenshot Analysis**
```bash
/ots
```

# Macros

Macros are activated by invoking `/<command> ...` in `npcsh`, and they can also be called from bash through the `npc` CLI. In our examples, we provide both `npcsh` calls and bash calls with the `npc` CLI where relevant. To convert any `/<command>` in `npcsh` to a bash version, replace the `/` with `npc ` and the macro command will be invoked as a positional argument. Some macros, like `/breathe` and `/flush`, operate on the running shell's context and are only available inside `npcsh`.

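The `/`-to-`npc ` rule is purely textual; as a quick sketch (illustrative only, not part of npcsh itself), the conversion amounts to:

```python
def macro_to_cli(macro: str) -> str:
    """Convert an npcsh macro invocation to its `npc` CLI form by
    replacing the leading '/' with 'npc ' (the rule described above)."""
    if not macro.startswith("/"):
        raise ValueError("npcsh macros start with '/'")
    return "npc " + macro[1:]

print(macro_to_cli("/vixynt 'an image of a dog eating a hat'"))
# npc vixynt 'an image of a dog eating a hat'
```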
- ## TL;DR:
- `/alicanto` - Conduct deep research with multiple perspectives, identifying gold insights and cliff warnings
- `/brainblast` - Execute an advanced chunked search on command history
- `/breathe` - Condense context on a regular cadence
- `/compile` - Compile NPC profiles
- `/flush` - Flush the last N messages
- `/guac` - Enter guac mode
- `/help` - Show help for commands, NPCs, or jinxs
- `/init` - Initialize an NPC project
- `/jinxs` - Show available jinxs for the current NPC/team
- `/ots` - Take a screenshot and analyze it with a vision model
- `/plan` - Execute a plan command
- `/plonk` - Use a vision model to interact with the GUI. Usage: /plonk <task description>
- `/pti` - Use pardon-the-interruption mode to interact with a reasoning LLM
- `/rag` - Execute a RAG command using ChromaDB embeddings with optional file input (-f/--file)
- `/roll` - Generate a video with a video generation model
- `/sample` - Send a prompt directly to the LLM
- `/search` - Execute a web search command
- `/serve` - Serve an NPC team server
- `/set` - Set configuration values
- `/sleep` - Evolve the knowledge graph, with options for dreaming
- `/spool` - Enter interactive chat (spool) mode with an NPC, with fresh context or files for RAG
- `/trigger` - Execute a trigger command
- `/vixynt` - Generate and edit images from text descriptions using local models, OpenAI, or Gemini
- `/wander` - A method for LLMs to think on a problem by switching between states of high temperature and low temperature
- `/yap` - Enter voice chat (yap) mode

## Common Command-Line Flags:

```
Flag Shorthand                 | Flag Shorthand                 | Flag Shorthand                 | Flag Shorthand
------------------------------ | ------------------------------ | ------------------------------ | ------------------------------
--attachments (-a)             | --height (-h)                  | --num_npcs (-num_n)            | --team (-tea)
--config_dir (-con)            | --igmodel (-igm)               | --output_file (-o)             | --temperature (-t)
--cors (-cor)                  | --igprovider (-igp)            | --plots_dir (-pl)              | --top_k
--creativity (-cr)             | --lang (-l)                    | --port (-po)                   | --top_p
--depth (-d)                   | --max_tokens (-ma)             | --provider (-pr)               | --vmodel (-vm)
--emodel (-em)                 | --messages (-me)               | --refresh_period (-re)         | --vprovider (-vp)
--eprovider (-ep)              | --model (-mo)                  | --rmodel (-rm)                 | --width (-w)
--exploration (-ex)            | --npc (-np)                    | --rprovider (-rp)              |
--format (-f)                  | --num_frames (-num_f)          | --sprovider (-s)               |
```

- ## `/alicanto`: a research exploration agent flow.

<p align="center"><a href="https://github.com/npc-worldwide/npcsh/blob/main/docs/alicanto.md">
<img src="https://raw.githubusercontent.com/npc-worldwide/npcsh/main/npcsh/npc_team/alicanto.png" alt="logo for deep research" width=250></a>
</p>

- Examples:
```bash
# npcsh
/alicanto "What are the implications of quantum computing for cybersecurity?"
/alicanto "How might climate change impact global food security?" --num-npcs 8 --depth 5
```

```bash
# bash
npc alicanto "What ethical considerations should guide AI development?" --max_facts_per_chain 0.5 --max_thematic_groups 3 --max_criticisms_per_group 3 --max_conceptual_combinations 3 --max_experiments 10

npc alicanto "What is the future of remote work?" --format report  # NOTE: report generation and formatting requires a LaTeX installation.
```
- ## `/brainblast`: search through past messages (soon to incorporate options for the knowledge graph)
```bash
# npcsh
/brainblast 'subtle summer winds' --top_k 10
```
```bash
# bash
npc brainblast 'executing a mirror in the wondrous moon'
```
- ## `/breathe`: condense conversation context (shell only):
```bash
# npcsh
/breathe
/breathe -p ollama -m qwen3:latest
```
- ## `/compile`: render NPCs for use without re-loading npcsh
```bash
# npcsh
/compile ./npc_team/sibiji.npc
```
- ## `/flush`: flush context (shell only):
If you're in the NPC shell and have been in a conversation that's going nowhere and you want to start over, just flush the context.
```bash
/flush
```

- ## `/guac`

<p align="center"><a href="https://github.com/npc-worldwide/npcsh/blob/main/docs/guac.md">
<img src="https://raw.githubusercontent.com/npc-worldwide/npcsh/main/npcsh/npc_team/guac.png" alt="npcsh logo of a solarpunk sign" width=250></a>
</p>

- a replacement shell for interpreters like Python/R/Node/Julia with an avocado input marker 🥑 that brings a pomodoro-like approach to interactive coding.
- available as a standalone program runnable via the `guac` command after `npcsh` has been installed via pip.

- Simulation:
`🥑 Make a markov chain simulation of a random walk in 2D space with 1000 steps and visualize`
```
# Generated python code:
import numpy as np
import matplotlib.pyplot as plt

# Number of steps
n_steps = 1000

# Possible moves: up, down, left, right
moves = np.array([[0, 1], [0, -1], [1, 0], [-1, 0]])

# Initialize position array
positions = np.zeros((n_steps+1, 2), dtype=int)

# Generate random moves
for i in range(1, n_steps+1):
    step = moves[np.random.choice(4)]
    positions[i] = positions[i-1] + step

# Plot the random walk
plt.figure(figsize=(8, 8))
plt.plot(positions[:, 0], positions[:, 1], lw=1)
plt.scatter([positions[0, 0]], [positions[0, 1]], color='green', label='Start')
plt.scatter([positions[-1, 0]], [positions[-1, 1]], color='red', label='End')
plt.title('2D Random Walk - 1000 Steps (Markov Chain)')
plt.xlabel('X Position')
plt.ylabel('Y Position')
plt.legend()
plt.grid(True)
plt.axis('equal')
plt.show()
# Generated code executed successfully
```
<p align="center">
  <img src="https://raw.githubusercontent.com/npc-worldwide/npcsh/main/test_data/markov_chain.png" alt="markov_chain_figure" width=250>
</p>

Access the variables created in the code:
`🥑 print(positions)`
```
[[  0   0]
 [  0  -1]
 [ -1  -1]
 ...
 [ 29 -23]
 [ 28 -23]
 [ 27 -23]]
```

- Run a python script:
`🥑 run file.py`
- Refresh:
`🥑 /refresh`
- Show current variables:
`🥑 /show`

A guac session progresses through a series of stages of equal length, each of which adjusts the emoji input prompt: Stage 1: `🥑`, Stage 2: `🥑🔪`, Stage 3: `🥑🥣`, Stage 4: `🥑🥣🧂`, Stage 5: `🥘 TIME TO REFRESH`. At stage 5, the user is reminded to refresh with the /refresh macro. This will evaluate the session so far and suggest and implement new functions or automations to aid future sessions, subject to the user's ultimate approval.

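The stage progression can be sketched as a simple lookup (illustrative only, not guac's actual implementation; the stage length here is an arbitrary placeholder):

```python
# Illustrative sketch of guac's staged prompt; stage_length is a placeholder.
STAGES = ["🥑", "🥑🔪", "🥑🥣", "🥑🥣🧂", "🥘 TIME TO REFRESH"]

def prompt_for(commands_run: int, stage_length: int = 5) -> str:
    # Stages are of equal length; the prompt caps at the final "refresh" stage.
    return STAGES[min(commands_run // stage_length, len(STAGES) - 1)]

print(prompt_for(0))   # 🥑
print(prompt_for(23))  # 🥘 TIME TO REFRESH
```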
- ## `/help`: Show help for commands, NPCs, or jinxs.
```bash
# npcsh
/help
```
```bash
# bash
npc help
```
- ## `/init` - Initialize an NPC project
- set up bare-bones infra for an NPC team
```bash
# npcsh
/init
```
```bash
# bash
npc init
```

- ## `/jinxs`: show available jinxs for the team
Jinxs are Jinja execution templates that let users develop small programs that can build on and reference each other through Jinja templating. Jinx methods give smaller LLMs the scaffolding to perform "tool calling", so to speak, reliably.
```bash
# npcsh
/jinxs
# bash
npc jinxs
```

```
Available Jinxs:
--- Jinxs for NPC: sibiji ---

• /bash_executor: Execute bash queries.

• /calc: A jinx to simplify and evaluate mathematical expressions (/calc 1+5, /calc 47233*234234)

• /data_pull: Execute queries on the ~/npcsh_history.db to pull data. The database contains only information about conversations and other user-provided data. It does not store any information about individual files. (/data_pull 'select * from conversation_history limit 10')

• /file_editor: Examines a file, determines what changes are needed, and applies those changes. (/file_editor filename.py 'instructions for carrying out the editing')

• /image_generation_jinx: Generates images based on a text prompt. (/image_generation_jinx 'prompt for llm' output_name)

• /internet_search: Searches the web for information based on a query in order to verify timely details (e.g. current events) or to corroborate information in uncertain situations. Should mainly be used only when users specifically request a search; otherwise an LLM's basic knowledge should be sufficient. (/internet_search 'cost of cubs tickets')

• /local_search: Searches files in current and downstream directories to find items related to the user's query using fuzzy matching. Returns only relevant snippets (10 lines around matches) to avoid including too much irrelevant content. Intended for fuzzy searches, not for understanding file sizes. (/local_search 'class NPC')

• /python_executor: Execute scripts with python. Set the ultimate result as the "output" variable. It must be a string. Do not add unnecessary print statements. (/python_executor 'import numpy as np; print(np.arange(1000))')

• /screen_capture_analysis_jinx: Captures the whole screen and sends the image for analysis (mostly redundant with /ots).
```
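For a flavor of what a jinx like `/calc` does under the hood, here is a minimal safe expression evaluator in the same spirit (a sketch, not npcsh's actual implementation):

```python
import ast
import operator as op

# Whitelisted operators: anything else in the parsed expression is rejected.
OPS = {ast.Add: op.add, ast.Sub: op.sub, ast.Mult: op.mul,
       ast.Div: op.truediv, ast.Pow: op.pow, ast.USub: op.neg}

def calc(expr: str):
    """Safely evaluate an arithmetic expression by walking its AST,
    instead of calling eval() on untrusted input."""
    def ev(node):
        if isinstance(node, ast.Constant):
            return node.value
        if isinstance(node, ast.BinOp):
            return OPS[type(node.op)](ev(node.left), ev(node.right))
        if isinstance(node, ast.UnaryOp):
            return OPS[type(node.op)](ev(node.operand))
        raise ValueError(f"unsupported expression: {node!r}")
    return ev(ast.parse(expr, mode="eval").body)

print(calc("1+5"))  # 6
```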


- ## `/ots`: over-the-shoulder screenshot analysis
- Screenshot analysis:
```bash
# npcsh
/ots
/ots output_filename=...
```
```bash
# bash
npc ots ...
```
- ## `/plan`: set up cron jobs:
```bash
# npcsh
/plan 'set up a cron job that reminds me to stretch every thirty minutes' -m gemma3:27b -p ollama
```
```bash
# bash
npc plan 'record my cpu usage percentage every 45 minutes'
```

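Under the hood, a plan like the stretch reminder above would typically land in your crontab as an entry along these lines (illustrative only; the exact command npcsh schedules may differ):

```cron
# every 30 minutes, fire a desktop notification (illustrative command)
*/30 * * * * notify-send "Time to stretch"
```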
- ## `/plonk`: computer use:
```bash
# npcsh
/plonk -n 'npc_name' -sp 'task for plonk to carry out'

# bash
npc plonk
```
- ## `/pti`: a reasoning REPL loop with interruptions

```npcsh
/pti -n frederic -m qwen3:latest -p ollama
```

Or from the bash command line:
```bash
pti
```
<p align="center"><a href="https://github.com/npc-worldwide/npcsh/blob/main/docs/pti.md">
<img src="https://raw.githubusercontent.com/npc-worldwide/npcsh/main/npcsh/npc_team/frederic4.png" alt="npcsh logo of frederic the bear and the pti logo" width=250></a>
</p>

- ## `/rag`: embedding search through ChromaDB, with optional file input
- ## `/roll`: your video generation assistant
```npcsh
/roll --provider ollama --model llama3
```

- ## `/sample`: one-shot sampling from LLMs with specific parameters
```bash
# npcsh
/sample 'prompt'
/sample -m gemini-1.5-flash "Summarize the plot of 'The Matrix' in three sentences."

/sample --model claude-3-5-haiku-latest "Translate 'good morning' to Japanese."

/sample model=qwen3:latest "tell me about the last time you went shopping."
```
```bash
# bash
npc sample -p ollama -m gemma3:12b --temp 1.8 --top_k 50 "Write a haiku about the command line."

npc sample model=gpt-4o-mini "What are the primary colors?" --provider openai
```

- ## `/search`: use an internet search provider
```npcsh
/search -sp perplexity 'cal bears football schedule'
/search --sprovider duckduckgo 'beef tongue'
# Other search providers could be added, but we have only integrated duckduckgo and perplexity for the moment.
```

```bash
npc search 'when is the moon gonna go away from the earth'
```


- ## `/serve`: serve an NPC team
```bash
/serve
/serve ....
```

```bash
npc serve
```

- ## `/set`: change the current model, env params
```bash
/set model ...
/set provider ...
/set NPCSH_API_URL https://localhost:1937
```

```bash
npc set ...
```
- ## `/sleep`: prune and evolve the current knowledge graph
```bash
/sleep
/sleep --dream
/sleep --ops link_facts,deepen
```

```bash
npc sleep
```
- ## `/spool`
<p align="center"><a href="https://github.com/npc-worldwide/npcsh/blob/main/docs/spool.md">
<img src="https://raw.githubusercontent.com/npc-worldwide/npcsh/main/npcsh/npc_team/spool.png" alt="logo for spool" width=250></a>
</p>

- Enter a chat loop with isolated context, attachments, and specified models/providers:
```npcsh
/spool -n <npc_name>
/spool --attachments ./test_data/port5337.png,./test_data/yuan2004.pdf,./test_data/books.csv
/spool --provider ollama --model llama3
/spool -p deepseek -m deepseek-reasoner
/spool -n alicanto
```

- ## `/trigger`: schedule listeners, daemons
```bash
/trigger 'a description of a trigger to implement with system daemons/file system listeners.' -m gemma3:27b -p ollama
```
```bash
npc trigger
```


- ## `/vixynt`: image generation and editing:
```bash
# npcsh
/vixynt 'an image of a dog eating a hat'
/vixynt --output_file ~/Desktop/dragon.png "A terrifying dragon"
/vixynt "A photorealistic portrait of a cat wearing a wizard hat in the dungeon of the master and margarita" -w 1024 height=1024
/vixynt -igp ollama --igmodel Qwen/QwenImage --output_file /tmp/sub.png width=1024 height=512 "A detailed steampunk submarine exploring a vibrant coral reef, wide aspect ratio"
```

```bash
# bash
npc vixynt --attachments ./test_data/rabbit.PNG "Turn this rabbit into a fierce warrior in a snowy winter scene" -igp openai -igm gpt-image-1
npc vixynt --igmodel CompVis/stable-diffusion-v1-4 --igprovider diffusers "sticker of a red tree"
```


- ## `/wander`: daydreaming for LLMs

<p align="center"><a href="https://github.com/npc-worldwide/npcsh/blob/main/docs/wander.md">
<img src="https://raw.githubusercontent.com/npc-worldwide/npcsh/main/npcsh/npc_team/kadiefa.png" alt="logo for wander" width=250></a>
</p>
A system for thinking outside of the box. From our testing, gpt-4o-mini and gpt-series models in general appear to wander the most, ranging through various languages and ideas at high temperatures. Gemini models and many Llama models appear more stable despite high temps. Thinking models in general appear to be worse at this task.

- Wander with an auto-generated environment
```
npc --model "gemini-2.0-flash" --provider "gemini" wander "how does the bar of a galaxy influence the surrounding IGM?" \
n-high-temp-streams=10 \
high-temp=1.95 \
low-temp=0.4 \
sample-rate=0.5 \
interruption-likelihood=1
```
- Specify a custom environment
```
npc --model "gpt-4o-mini" --provider "openai" wander "how does the goos-hanchen effect impact neutron scattering?" \
environment="a ship's library in the south." \
num-events=3 \
n-high-temp-streams=10 \
high-temp=1.95 \
low-temp=0.4 \
sample-rate=0.5 \
interruption-likelihood=1
```
- Control event generation
```
npc wander "what is the goos hanchen effect and does it affect water refraction?" \
--provider "ollama" \
--model "deepseek-r1:32b" \
environment="a vast, dark ocean." \
interruption-likelihood=.1
```

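The high/low-temperature alternation can be sketched as follows; `sample` stands in for an LLM call, the parameter names mirror the CLI flags above, and this is an illustration rather than the actual wander implementation:

```python
import random

def wander(problem, sample, n_high_temp_streams=10,
           high_temp=1.95, low_temp=0.4, sample_rate=0.5):
    """Sketch of the wander loop: free-associate in several
    high-temperature streams, subsample fragments, then integrate
    them with a final low-temperature pass."""
    streams = [sample(problem, temperature=high_temp)
               for _ in range(n_high_temp_streams)]
    fragments = [s for s in streams if random.random() < sample_rate]
    prompt = problem + "\nFragments to integrate: " + " | ".join(fragments)
    return sample(prompt, temperature=low_temp)
```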
- ## `/yap`: an agentic voice control loop

<p align="center"><a href="https://github.com/npc-worldwide/npcsh/blob/main/docs/yap.md">
<img src="https://raw.githubusercontent.com/npc-worldwide/npcsh/main/npcsh/npc_team/yap.png" alt="logo for yap" width=250></a>
</p>

- an agentic voice control loop with a specified agent. When launching `yap`, the user enters the typical `npcsh` agentic loop, except that the system waits for either text or audio input.
- voice chat:
```bash
# npcsh
/yap
```
```bash
# bash
yap
npc yap
```

## Inference Capabilities
- `npcsh` works with local and enterprise LLM providers through its LiteLLM integration, allowing users to run inference from Ollama, LM Studio, vLLM, MLX, OpenAI, Anthropic, Gemini, and DeepSeek, making it a versatile tool for both simple commands and sophisticated AI-driven tasks.

## Read the Docs

Read the docs at [npcsh.readthedocs.io](https://npcsh.readthedocs.io/en/latest/)


## NPC Studio
NPC Studio is a graphical user interface built on the NPC Toolkit. See the open-source code for NPC Studio [here](https://github.com/npc-worldwide/npc-studio). Download the executables at [our website](https://enpisi.com/npc-studio).


## Mailing List
Interested in staying in the loop and hearing the latest and greatest about `npcpy`, `npcsh`, and NPC Studio? Be sure to sign up for the [newsletter](https://forms.gle/n1NzQmwjsV4xv1B2A)!


## Support
If you appreciate the work here, [consider supporting NPC Worldwide with a monthly donation](https://buymeacoffee.com/npcworldwide) or [buying NPC-WW themed merch](https://enpisi.com/shop). To hire us to help you explore how the NPC Toolkit and AI tools can serve your business or research team, reach out to info@npcworldwi.de.

## Installation
`npcsh` is available on PyPI and can be installed using pip. Before installing, make sure you have the necessary dependencies installed on your system. Below are the instructions for installing such dependencies on Linux, Mac, and Windows. If you find any other dependencies that are needed, please let us know so we can update the installation instructions to be more accommodating.

### Linux install
<details> <summary> Toggle </summary>

```bash
# these are for audio primarily, skip if you don't need tts
sudo apt-get install espeak
sudo apt-get install portaudio19-dev python3-pyaudio
sudo apt-get install alsa-base alsa-utils
sudo apt-get install libcairo2-dev
sudo apt-get install libgirepository1.0-dev
sudo apt-get install ffmpeg

# for triggers
sudo apt install inotify-tools


# And if you don't have ollama installed, use this:
curl -fsSL https://ollama.com/install.sh | sh

ollama pull llama3.2
ollama pull llava:7b
ollama pull nomic-embed-text
pip install npcsh
# if you want to install with the API libraries
pip install 'npcsh[lite]'
# if you want the full local package set up (ollama, diffusers, transformers, cuda etc.)
pip install 'npcsh[local]'
# if you want to use tts/stt
pip install 'npcsh[yap]'
# if you want everything:
pip install 'npcsh[all]'
```

</details>


### Mac install

<details> <summary> Toggle </summary>

```bash
# mainly for audio
brew install portaudio
brew install ffmpeg
brew install pygobject3

# for triggers
brew install inotify-tools


brew install ollama
brew services start ollama
ollama pull llama3.2
ollama pull llava:7b
ollama pull nomic-embed-text
pip install npcsh
# if you want to install with the API libraries
# (quote the extras so zsh does not treat the brackets as a glob)
pip install 'npcsh[lite]'
# if you want the full local package set up (ollama, diffusers, transformers, cuda etc.)
pip install 'npcsh[local]'
# if you want to use tts/stt
pip install 'npcsh[yap]'

# if you want everything:
pip install 'npcsh[all]'
```
</details>


### Windows Install

<details> <summary> Toggle </summary>
Download and install the Ollama executable, then download and install ffmpeg. In a PowerShell session:

```powershell
ollama pull llama3.2
ollama pull llava:7b
ollama pull nomic-embed-text
pip install npcsh
# if you want to install with the API libraries
pip install 'npcsh[lite]'
# if you want the full local package set up (ollama, diffusers, transformers, cuda etc.)
pip install 'npcsh[local]'
# if you want to use tts/stt
pip install 'npcsh[yap]'

# if you want everything:
pip install 'npcsh[all]'
```
As of now, npcsh appears to work well with some of the core functionalities like /ots and /yap.

</details>


### Fedora Install (under construction)

<details> <summary> Toggle </summary>

```bash
sudo dnf install python3-devel    # fixes hnswlib issues with chromadb
sudo dnf install python3-tkinter  # needed by pyautogui
xhost +                           # needed by pyautogui
```

</details>


## Startup Configuration and Project Structure
After `npcsh` has been pip installed, `npcsh`, `guac`, `pti`, `spool`, `yap` and the `npc` CLI can be used as command-line tools. To initialize these correctly, start the NPC shell:
```bash
npcsh
```
When initialized, `npcsh` will generate a .npcshrc file in your home directory that stores your npcsh settings.
Here is an example of what the .npcshrc file might look like after this has been run.
```bash
# NPCSH Configuration File
export NPCSH_INITIALIZED=1
export NPCSH_CHAT_PROVIDER='ollama'
export NPCSH_CHAT_MODEL='llama3.2'
export NPCSH_DB_PATH='~/npcsh_history.db'
```

`npcsh` also comes with a set of jinxs and NPCs that are used in processing. It will generate a folder at ~/.npcsh/ that contains the jinxs and NPCs used in the shell; these are used in the absence of project-specific ones. Additionally, `npcsh` records interactions and compiled information about NPCs within a local SQLite database at the path specified in the .npcshrc file. This defaults to ~/npcsh_history.db if not specified. When data mode is used to load or analyze data in CSVs or PDFs, these data are stored in the same database for future reference.

The installer will automatically add this file to your shell config, but if it does not do so successfully for whatever reason, you can add the following to your .bashrc or .zshrc:

```bash
# Source NPCSH configuration
if [ -f ~/.npcshrc ]; then
    . ~/.npcshrc
fi
```


We support inference via all providers supported by LiteLLM. For OpenAI-compatible providers that are not explicitly named in LiteLLM, simply use `openai-like` as the provider. The default provider must be one of `['openai','anthropic','ollama', 'gemini', 'deepseek', 'openai-like']` and the model must be one available from those providers.

To use tools that require API keys, create an `.env` file in the folder where you are working, or place the relevant API keys as env variables in your ~/.npcshrc. If you already have these API keys set in a ~/.bashrc, ~/.zshrc, or similar file, you need not additionally add them to ~/.npcshrc or to an `.env` file. Here is an example of what an `.env` file might look like:

```bash
export OPENAI_API_KEY="your_openai_key"
export ANTHROPIC_API_KEY="your_anthropic_key"
export DEEPSEEK_API_KEY='your_deepseek_key'
export GEMINI_API_KEY='your_gemini_key'
export PERPLEXITY_API_KEY='your_perplexity_key'
```


Individual NPCs can also be set to use different models and providers by setting the `model` and `provider` keys in the npc files.
Once initialized and set up, you will find the following in your ~/.npcsh directory:
```bash
~/.npcsh/
├── npc_team/            # Global NPCs
│   ├── jinxs/           # Global jinxs
│   ├── assembly_lines/  # Workflow pipelines
│   ├── example.npc      # Globally available NPC
│   └── global.ctx       # Global context file
```
For cases where you wish to set up a project-specific set of NPCs, jinxs, and assembly lines, add an `npc_team` directory to your project and `npcsh` will pick up on its presence, like so:
```bash
./npc_team/              # Project-specific NPCs
├── jinxs/               # Project jinxs
│   └── example.jinx
├── assembly_lines/      # Project workflows
│   └── example.pipe
├── models/              # Project models
│   └── example.model
├── example1.npc         # Example NPC
├── example2.npc         # Example NPC
└── team.ctx             # Example team context
```

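If you would rather scaffold a project-local team by hand than via `/init`, the layout above is just directories and files:

```shell
# Scaffold a project-local NPC team matching the tree above.
mkdir -p npc_team/jinxs npc_team/assembly_lines npc_team/models
touch npc_team/team.ctx
ls npc_team
```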

## Contributing
Contributions are welcome! Please submit issues and pull requests on the GitHub repository.


## License
This project is licensed under the MIT License.