npcsh 0.3.25__py3-none-any.whl → 0.3.26__py3-none-any.whl

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (26)
  1. npcsh/llm_funcs.py +25 -11
  2. {npcsh-0.3.25.dist-info → npcsh-0.3.26.dist-info}/METADATA +1207 -1202
  3. {npcsh-0.3.25.dist-info → npcsh-0.3.26.dist-info}/RECORD +26 -26
  4. {npcsh-0.3.25.dist-info → npcsh-0.3.26.dist-info}/WHEEL +1 -1
  5. {npcsh-0.3.25.data → npcsh-0.3.26.data}/data/npcsh/npc_team/calculator.tool +0 -0
  6. {npcsh-0.3.25.data → npcsh-0.3.26.data}/data/npcsh/npc_team/celona.npc +0 -0
  7. {npcsh-0.3.25.data → npcsh-0.3.26.data}/data/npcsh/npc_team/corca.npc +0 -0
  8. {npcsh-0.3.25.data → npcsh-0.3.26.data}/data/npcsh/npc_team/eriane.npc +0 -0
  9. {npcsh-0.3.25.data → npcsh-0.3.26.data}/data/npcsh/npc_team/foreman.npc +0 -0
  10. {npcsh-0.3.25.data → npcsh-0.3.26.data}/data/npcsh/npc_team/generic_search.tool +0 -0
  11. {npcsh-0.3.25.data → npcsh-0.3.26.data}/data/npcsh/npc_team/image_generation.tool +0 -0
  12. {npcsh-0.3.25.data → npcsh-0.3.26.data}/data/npcsh/npc_team/lineru.npc +0 -0
  13. {npcsh-0.3.25.data → npcsh-0.3.26.data}/data/npcsh/npc_team/local_search.tool +0 -0
  14. {npcsh-0.3.25.data → npcsh-0.3.26.data}/data/npcsh/npc_team/maurawa.npc +0 -0
  15. {npcsh-0.3.25.data → npcsh-0.3.26.data}/data/npcsh/npc_team/npcsh.ctx +0 -0
  16. {npcsh-0.3.25.data → npcsh-0.3.26.data}/data/npcsh/npc_team/raone.npc +0 -0
  17. {npcsh-0.3.25.data → npcsh-0.3.26.data}/data/npcsh/npc_team/screen_cap.tool +0 -0
  18. {npcsh-0.3.25.data → npcsh-0.3.26.data}/data/npcsh/npc_team/sibiji.npc +0 -0
  19. {npcsh-0.3.25.data → npcsh-0.3.26.data}/data/npcsh/npc_team/slean.npc +0 -0
  20. {npcsh-0.3.25.data → npcsh-0.3.26.data}/data/npcsh/npc_team/sql_executor.tool +0 -0
  21. {npcsh-0.3.25.data → npcsh-0.3.26.data}/data/npcsh/npc_team/test_pipeline.py +0 -0
  22. {npcsh-0.3.25.data → npcsh-0.3.26.data}/data/npcsh/npc_team/turnic.npc +0 -0
  23. {npcsh-0.3.25.data → npcsh-0.3.26.data}/data/npcsh/npc_team/welxor.npc +0 -0
  24. {npcsh-0.3.25.dist-info → npcsh-0.3.26.dist-info}/entry_points.txt +0 -0
  25. {npcsh-0.3.25.dist-info → npcsh-0.3.26.dist-info}/licenses/LICENSE +0 -0
  26. {npcsh-0.3.25.dist-info → npcsh-0.3.26.dist-info}/top_level.txt +0 -0
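The per-file breakdown above can be reproduced locally once both wheels have been downloaded (e.g. with `pip download npcsh==0.3.25 --no-deps`). A minimal illustrative sketch (`changed_members` is a hypothetical helper, not part of npcsh) that lists which archive members actually differ between two wheels:

```python
import zipfile

def changed_members(old_wheel, new_wheel):
    """List members present in both wheels whose contents differ."""
    def crc_table(path):
        with zipfile.ZipFile(path) as zf:
            # CRC32 checksums live in the zip central directory, so no
            # member needs to be decompressed for this comparison.
            return {info.filename: info.CRC for info in zf.infolist()}
    old, new = crc_table(old_wheel), crc_table(new_wheel)
    return sorted(name for name in old.keys() & new.keys() if old[name] != new[name])
```

Note that entries marked `+0 -0` above are renamed but byte-identical files; a content comparison like this one would not flag them.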
@@ -1,6 +1,6 @@
  Metadata-Version: 2.4
  Name: npcsh
- Version: 0.3.25
+ Version: 0.3.26
  Summary: npcsh is a command line tool for integrating LLMs into everyday workflows and for orchestrating teams of NPCs.
  Home-page: https://github.com/cagostino/npcsh
  Author: Christopher Agostino
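Every change below is introduced by a unified-diff hunk header like the one above, where `-1,6 +1,6` gives the start line and line count on each side. A small illustrative parser (not part of npcsh) for reading these headers:

```python
import re

def parse_hunk_header(line):
    """Split '@@ -old_start,old_len +new_start,new_len @@ context' into its fields."""
    m = re.match(r"@@ -(\d+)(?:,(\d+))? \+(\d+)(?:,(\d+))? @@", line)
    if m is None:
        raise ValueError(f"not a hunk header: {line!r}")
    old_start, old_len, new_start, new_len = m.groups()
    # A missing count (e.g. '@@ -3 +3 @@') defaults to 1 in the unified diff format.
    return {
        "old_start": int(old_start), "old_len": int(old_len or 1),
        "new_start": int(new_start), "new_len": int(new_len or 1),
    }
```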
@@ -27,7 +27,6 @@ Requires-Dist: opencv-python
  Requires-Dist: librosa
  Requires-Dist: openai
  Requires-Dist: jinja2
- Requires-Dist: pyautogui
  Requires-Dist: pandas
  Requires-Dist: matplotlib
  Requires-Dist: IPython
@@ -37,14 +36,11 @@ Requires-Dist: markdown
  Requires-Dist: PyYAML
  Requires-Dist: langchain
  Requires-Dist: langchain_community
- Requires-Dist: openai-whisper
- Requires-Dist: pyaudio
+ Requires-Dist: pyautogui
  Requires-Dist: pygments
  Requires-Dist: pyttsx3
  Requires-Dist: kuzu
  Requires-Dist: chromadb
- Requires-Dist: gtts
- Requires-Dist: playsound==1.2.2
  Requires-Dist: termcolor
  Requires-Dist: colorama
  Requires-Dist: python-dotenv
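The `Requires-Dist` edits above drop the audio stack (`openai-whisper`, `pyaudio`, `gtts`, `playsound`) and reposition `pyautogui`. Declared dependencies of an installed copy can be inspected with the standard library; `requirement_names` is an illustrative helper, not an npcsh API:

```python
import re
from importlib import metadata

def requirement_names(requires):
    """Extract bare project names from PEP 508 'Requires-Dist' strings."""
    # Each entry looks like 'pandas' or 'playsound==1.2.2; extra == "audio"'.
    return {re.match(r"[A-Za-z0-9._-]+", r).group(0) for r in (requires or [])}

def installed_requirement_names(dist_name):
    """Names an installed distribution declares as dependencies."""
    return requirement_names(metadata.requires(dist_name))
```

Assuming the metadata shown here is accurate, after upgrading one would expect `"playsound" not in installed_requirement_names("npcsh")` to hold for 0.3.26.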
@@ -72,7 +68,7 @@ Dynamic: summary
  # npcsh


- - `npcsh` is a python-based command-line tool designed to integrate Large Language Models (LLMs) and Agents into one's daily workflow by making them available and easily configurable through the command line shell.
+ - `npcsh` is a python-based AI Agent framework designed to integrate Large Language Models (LLMs) and Agents into one's daily workflow by making them available and easily configurable through a command line shell as well as an extensible python library.

  - **Smart Interpreter**: `npcsh` leverages the power of LLMs to understand your natural language commands and questions, executing tasks, answering queries, and providing relevant information from local files and the web.

@@ -83,10 +79,9 @@ Dynamic: summary

  * **Extensible with Python:** `npcsh`'s python package provides useful functions for interacting with LLMs, including explicit coverage for popular providers like ollama, anthropic, openai, gemini, deepseek, and openai-like providers. Each macro has a corresponding function and these can be used in python scripts. `npcsh`'s functions are purpose-built to simplify NPC interactions but NPCs are not required for them to work if you don't see the need.

- * **Simple, Powerful CLI:** Use the `npc` CLI commands to set up a flask server so you can expose your NPC team for use as a backend service. You can also use the `npc` CLI to run SQL models defined in your project, execute assembly lines, and verify the integrity of your NPC team's interrelations. `npcsh`'s NPCs take advantage of jinja templating to reference other NPCs and tools in their properties, and the `npc` CLI can be used to verify these references.
-
- * **Shell Strengths:** Execute bash commands directly. Use your favorite command-line tools like VIM, Emacs, ipython, sqlite3, git. Pipe the output of these commands to LLMs or pass LLM results to bash commands.
+ * **Simple, Powerful CLI:** Use the `npc` CLI commands to run `npcsh` macros or commands from one's regular shell. Set up a flask server so you can expose your NPC team for use as a backend service. You can also use the `npc` CLI to run SQL models defined in your project, execute assembly lines, and verify the integrity of your NPC team's interrelations. `npcsh`'s NPCs take advantage of jinja templating to reference other NPCs and tools in their properties, and the `npc` CLI can be used to verify these references.

+ * **Powerful Tool integrations:** `npcsh` has built-in tools for users to have agents execute code, analyze data, generate images, search the web, and more. Tools can be defined in YAML files as part of project-specific `npc_team`s or in the global `~/.npcsh/npc_team` directory or simply in python scripts. Once compiled, the tools can be used as macros in the `npc` cli as well as `/{tool_name}` commands in the `npcsh` shell.


  Interested to stay in the loop and to hear the latest and greatest about `npcsh` ? Be sure to sign up for the [npcsh newsletter](https://forms.gle/n1NzQmwjsV4xv1B2A)!
@@ -113,1625 +108,1635 @@ Users can take advantage of `npcsh` through its custom shell or through a comman
  | Voice Chat | npc whisper 'npc_name' | /whisper |


- ## Star History
+ ## Python Examples
+ Integrate npcsh into your Python projects for additional flexibility. Below are a few examples of how to use the library programmatically.

- [![Star History Chart](https://api.star-history.com/svg?repos=cagostino/npcsh&type=Date)](https://star-history.com/#cagostino/npcsh&Date)

- ## Installation
- `npcsh` is available on PyPI and can be installed using pip. Before installing, make sure you have the necessary dependencies installed on your system. Below are the instructions for installing such dependencies on Linux, Mac, and Windows. If you find any other dependencies that are needed, please let us know so we can update the installation instructions to be more accommodating.

- ### Linux install
+ ### Example 1: Creating and Using an NPC
+ This example shows how to create and initialize an NPC and use it to answer a question.
  ```bash
+ import sqlite3
+ from npcsh.npc_compiler import NPC

- sudo apt-get install espeak
- sudo apt-get install portaudio19-dev python3-pyaudio
- sudo apt-get install alsa-base alsa-utils
- sudo apt-get install libcairo2-dev
- sudo apt-get install libgirepository1.0-dev
- sudo apt-get install ffmpeg
+ # Set up database connection
+ db_path = '~/npcsh_history.db'
+ conn = sqlite3.connect(db_path)

- # And if you don't have ollama installed, use this:
- curl -fsSL https://ollama.com/install.sh | sh
+ # Create the NPC
+ npc = NPC(
+     name='Simon Bolivar',
+     db_conn=conn,
+     primary_directive='Liberate South America from the Spanish Royalists.',
+     model='gpt-4o-mini',
+     provider='openai',
+ )

- ollama pull llama3.2
- ollama pull llava:7b
- ollama pull nomic-embed-text
- pip install npcsh
+ response = npc.get_llm_response("What is the most important territory to retain in the Andes mountains?")
+ print(response['response'])
+ ```
+ ```bash
+ 'The most important territory to retain in the Andes mountains for the cause of liberation in South America would be the region of Quito in present-day Ecuador. This area is strategically significant due to its location and access to key trade routes. It also acts as a vital link between the northern and southern parts of the continent, influencing both military movements and the morale of the independence struggle. Retaining control over Quito would bolster efforts to unite various factions in the fight against Spanish colonial rule across the Andean states.'
  ```
+ ### Example 2: Using an NPC to Analyze Data
+ This example shows how to use an NPC to perform data analysis on a DataFrame using LLM commands.
+ ```bash
+ from npcsh.npc_compiler import NPC
+ import sqlite3
+ import os
+ # Set up database connection
+ db_path = '~/npcsh_history.db'
+ conn = sqlite3.connect(os.path.expanduser(db_path))

+ # Make a table to put into npcsh_history.db, or change this example to use an existing table in a database you have
+ import pandas as pd
+ data = {
+     'customer_feedback': ['The product is great!', 'The service was terrible.', 'I love the new feature.'],
+     'customer_id': [1, 2, 3],
+     'customer_rating': [5, 1, 3],
+     'timestamp': ['2022-01-01', '2022-01-02', '2022-01-03']
+ }


+ df = pd.DataFrame(data)
+ df.to_sql('customer_feedback', conn, if_exists='replace', index=False)

- ### Mac install
- ```bash
- brew install portaudio
- brew install ffmpeg
- brew install ollama
- brew services start ollama
- brew install pygobject3
- ollama pull llama3.2
- ollama pull llava:7b
- ollama pull nomic-embed-text
- pip install npcsh
- ```
- ### Windows Install

- Download and install ollama exe.
+ npc = NPC(
+     name='Felix',
+     db_conn=conn,
+     primary_directive='Analyze customer feedback for sentiment.',
+     model='gpt-4o-mini',
+     provider='openai',
+ )
+ response = npc.analyze_db_data('Provide a detailed report on the data contained in the `customer_feedback` table.')

- Then, in a PowerShell, download and install ffmpeg.

  ```
- ollama pull llama3.2
- ollama pull llava:7b
- ollama pull nomic-embed-text
- pip install npcsh
- ```
- As of now, npcsh appears to work well with some of the core functionalities like /ots and /whisper.


- ### Fedora Install (under construction)
-
- python3-dev (fixes hnswlib issues with chroma db)
- xhost + (pyautogui)
- python-tkinter (pyautogui)
+ ### Example 3: Creating and Using a Tool
+ You can define a tool and execute it from within your Python script.
+ Here we'll create a tool that will take in a pdf file, extract the text, and then answer a user request about the text.

- ## Startup Configuration and Project Structure
- After it has been pip installed, `npcsh` can be used as a command line tool. Start it by typing:
- ```bash
- npcsh
- ```
- When initialized, `npcsh` will generate a .npcshrc file in your home directory that stores your npcsh settings.
- Here is an example of what the .npcshrc file might look like after this has been run.
  ```bash
- # NPCSH Configuration File
- export NPCSH_INITIALIZED=1
- export NPCSH_CHAT_PROVIDER='ollama'
- export NPCSH_CHAT_MODEL='llama3.2'
- export NPCSH_DB_PATH='~/npcsh_history.db'
- ```
- `npcsh` also comes with a set of tools and NPCs that are used in processing. It will generate a folder at ~/.npcsh/ that contains the tools and NPCs that are used in the shell and these will be used in the absence of other project-specific ones. Additionally, `npcsh` records interactions and compiled information about npcs within a local SQLite database at the path specified in the .npcshrc file. This will default to ~/npcsh_history.db if not specified. When the data mode is used to load or analyze data in CSVs or PDFs, these data will be stored in the same database for future reference.
+ from npcsh.npc_compiler import Tool, NPC
+ import sqlite3
+ import os

- The installer will automatically add this file to your shell config, but if it does not do so successfully for whatever reason you can add the following to your .bashrc or .zshrc:
+ from jinja2 import Environment, FileSystemLoader

- ```bash
- # Source NPCSH configuration
- if [ -f ~/.npcshrc ]; then
-     . ~/.npcshrc
- fi
- ```
+ # Create a proper Jinja environment
+ jinja_env = Environment(loader=FileSystemLoader('.'))

- We support inference via `openai`, `anthropic`, `ollama`, `gemini`, `deepseek`, and `openai-like` APIs. The default provider must be one of `['openai','anthropic','ollama', 'gemini', 'deepseek', 'openai-like']` and the model must be one available from those providers.

- To use tools that require API keys, create an `.env` file up in the folder where you are working or place relevant API keys as env variables in your ~/.npcshrc. If you already have these API keys set in a ~/.bashrc or a ~/.zshrc or similar files, you need not additionally add them to ~/.npcshrc or to an `.env` file. Here is an example of what an `.env` file might look like:
+ tool_data = {
+     "tool_name": "pdf_analyzer",
+     "inputs": ["request", "file"],
+     "steps": [{  # Make this a list with one dict inside
+         "engine": "python",
+         "code": """
+ try:
+     import fitz  # PyMuPDF

- ```bash
- export OPENAI_API_KEY="your_openai_key"
- export ANTHROPIC_API_KEY="your_anthropic_key"
- export DEEPSEEK_API_KEY='your_deepseek_key'
- export GEMINI_API_KEY='your_gemini_key'
- export PERPLEXITY_API_KEY='your_perplexity_key'
- ```
+     shared_context = {}
+     shared_context['inputs'] = '{{request}}'

+     pdf_path = '{{file}}'

- Individual npcs can also be set to use different models and providers by setting the `model` and `provider` keys in the npc files.
- Once initialized and set up, you will find the following in your ~/.npcsh directory:
- ```bash
- ~/.npcsh/
- ├── npc_team/           # Global NPCs
- │   ├── tools/          # Global tools
- │   └── assembly_lines/ # Workflow pipelines

- ```
- For cases where you wish to set up a project specific set of NPCs, tools, and assembly lines, add a `npc_team` directory to your project and `npcsh` should be able to pick up on its presence, like so:
- ```bash
- ./npc_team/             # Project-specific NPCs
- ├── tools/              # Project tools
- │   └── example.tool
- ├── assembly_lines/     # Project workflows
- │   └── example.pipe
- ├── models/             # Project models
- │   └── example.model
- ├── example1.npc        # Example NPC
- ├── example2.npc        # Example NPC
- ├── example1.ctx        # Example context
- └── example2.ctx        # Example context

- ```
+     # Open the PDF
+     doc = fitz.open(pdf_path)
+     text = ""

- ## IMPORTANT: migrations and deprecations
+     # Extract text from each page
+     for page_num in range(len(doc)):
+         page = doc[page_num]
+         text += page.get_text()

- ### v0.3.4
- In v0.3.4, the structure for tools was adjusted. If you have made custom tools, please refer to the structure within npc_compiler to ensure that they are in the correct format. Otherwise, do the following
- ```bash
- rm ~/.npcsh/npc_team/tools/*.tool
- ```
- and then
- ```bash
- npcsh
- ```
- and the updated tools will be copied over into the correct location.
+     # Close the document
+     doc.close()

- ### v0.3.5
- Version 0.3.5 included a complete overhaul and refactoring of the llm_funcs module. This was done to make it not as horribly long and to make it easier to add new models and providers.
+     print(f"Extracted text length: {len(text)}")
+     if len(text) > 100:
+         print(f"First 100 characters: {text[:100]}...")

+     shared_context['extracted_text'] = text
+     print("Text extraction completed successfully")

- In version 0.3.5, a change was introduced to the database schema for messages to add npcs, models, providers, and associated attachments to data. If you have used `npcsh` before this version, you will need to run this migration script to update your database schema: [migrate_conversation_history_v0.3.5.py](https://github.com/cagostino/npcsh/blob/cfb9dc226e227b3e888f3abab53585693e77f43d/npcsh/migrations/migrate_conversation_history_%3Cv0.3.4-%3Ev0.3.5.py)
+ except Exception as e:
+     error_msg = f"Error processing PDF: {str(e)}"
+     print(error_msg)
+     shared_context['extracted_text'] = f"Error: {error_msg}"
+ """
+     },
+     {
+         "engine": "natural",
+         "code": """
+ {% if shared_context and shared_context.extracted_text %}
+ {% if shared_context.extracted_text.startswith('Error:') %}
+ {{ shared_context.extracted_text }}
+ {% else %}
+ Here is the text extracted from the PDF:

- Additionally, NPCSH_MODEL and NPCSH_PROVIDER have been renamed to NPCSH_CHAT_MODEL and NPCSH_CHAT_PROVIDER to provide a more consistent naming scheme now that we have additionally introduced `NPCSH_VISION_MODEL` and `NPCSH_VISION_PROVIDER`, `NPCSH_EMBEDDING_MODEL`, `NPCSH_EMBEDDING_PROVIDER`, `NPCSH_REASONING_MODEL`, `NPCSH_REASONING_PROVIDER`, `NPCSH_IMAGE_GEN_MODEL`, and `NPCSH_IMAGE_GEN_PROVIDER`.
- In addition, we have added NPCSH_API_URL to better accommodate openai-like apis that require a specific url to be set as well as `NPCSH_STREAM_OUTPUT` to indicate whether or not to use streaming in one's responses. It will be set to 0 (false) by default as it has only been tested and verified for a small subset of the models and providers we have available (openai, anthropic, and ollama). If you try it and run into issues, please post them here so we can correct them as soon as possible!
+ {{ shared_context.extracted_text }}

+ Please provide a response to user request: {{ request }} using the information extracted above.
+ {% endif %}
+ {% else %}
+ Error: No text was extracted from the PDF.
+ {% endif %}
+ """
+     },]
+ }

- ## npcsh usage
- In the `npcsh` shell, users can ask LLMs questions, have LLMs execute commands or use tools, or utilize macros that provide additional functionality. When a user does not invoke a specific macro, the shell will automatically decide which tool to use based on the user's input. Here are some examples of things one might ask the npcsh shell.
+ # Instantiate the tool
+ tool = Tool(tool_data)

- Here are some examples of how you can use npcsh
- ```npcsh
- npcsh> Who was Simon Bolivar?
+ # Create an NPC instance
+ npc = NPC(
+     name='starlana',
+     primary_directive='Analyze text from Astrophysics papers with a keen attention to theoretical machinations and mechanisms.',
+     model='llama3.2',
+     provider='ollama',
+     db_conn=sqlite3.connect(os.path.expanduser('~/npcsh_database.db'))
+ )

- Simón Bolívar, often referred to as "El Libertador," was a Venezuelan military and political leader who played a key role in the Latin American wars of independence against Spanish colonial rule in the early 19th century. He was born on July 24, 1783, in Caracas, Venezuela, into a wealthy Creole family.
- Bolívar's vision extended beyond merely liberating territories; he aspired to unify the newly independent nations of South America into a single federation, which he called "Gran Colombia," consisting of present-day Colombia, Venezuela, Ecuador, and Panama. He was known for his exceptional military strategies and leadership, which led to successful campaigns in various regions, including the battles of Boyacá, Carabobo, and Ayacucho.
- He faced numerous challenges, including political fragmentation, regional rivalries, and personal betrayals. Despite his initial successes, the unity he sought for Latin America proved difficult to achieve, and Gran Colombia eventually disintegrated in the early 1830s.
- Bolívar's influence and legacy extend far beyond his lifetime. He is celebrated in various countries across Latin America as a national hero and symbol of independence. His thoughts on liberty, governance, and social issues continue to inspire movements for democracy and social justice in the region. Simón Bolívar died on December 17, 1830, but remains a pivotal figure in the history of Latin America.
- ```
+ # Define input values dictionary
+ input_values = {
+     "request": "what is the point of the yuan and narayanan work?",
+     "file": os.path.abspath("test_data/yuan2004.pdf")
+ }

+ print(f"Attempting to read file: {input_values['file']}")
+ print(f"File exists: {os.path.exists(input_values['file'])}")

- ```npcsh
- npcsh> What is the capital of France?
- The capital of France is Paris. It is the largest city in the country and is known for its rich history, art, culture, and architecture, including famous landmarks such as the Eiffel Tower, Notre-Dame Cathedral, and the Louvre Museum.
+ # Execute the tool
+ output = tool.execute(input_values, npc.tools_dict, jinja_env, 'Sample Command', model=npc.model, provider=npc.provider, npc=npc)
+
+ print('Tool Output:', output)
  ```

- ```npcsh
- npcsh> can you tell me a joke about my favorite city?
+ ### Example 4: Orchestrating a team

- Additional input needed: The user did not specify their favorite city, which is necessary to generate a relevant joke.
- Please tell me your favorite city so I can share a joke about it!: boston

- Sure! Here's a joke about Boston:
- Why do Bostonians like to play hide and seek?
- Because good luck hiding when everyone yells, "Wicked awesome, ya gotta be here!"
- ```

- ```npcsh
- npcsh> What's the weather in Tokyo?
+ ```python
+ import pandas as pd
+ import numpy as np
+ import os
+ from npcsh.npc_compiler import NPC, NPCTeam, Tool

- handle_tool_call invoked with tool_name: generic_search_tool

- The weather in Tokyo, Japan, is expected to be mixed with sun and clouds. Here are some details from the recent forecasts:
+ # Create test data and save to CSV
+ def create_test_data(filepath="sales_data.csv"):
+     sales_data = pd.DataFrame(
+         {
+             "date": pd.date_range(start="2024-01-01", periods=90),
+             "revenue": np.random.normal(10000, 2000, 90),
+             "customer_count": np.random.poisson(100, 90),
+             "avg_ticket": np.random.normal(100, 20, 90),
+             "region": np.random.choice(["North", "South", "East", "West"], 90),
+             "channel": np.random.choice(["Online", "Store", "Mobile"], 90),
+         }
+     )

- Highs: Around 53°F to 58°F with a few variations depending on the day.
+     # Add patterns to make data more realistic
+     sales_data["revenue"] *= 1 + 0.3 * np.sin(
+         np.pi * np.arange(90) / 30
+     )  # Seasonal pattern
+     sales_data.loc[sales_data["channel"] == "Mobile", "revenue"] *= 1.1  # Mobile growth
+     sales_data.loc[
+         sales_data["channel"] == "Online", "customer_count"
+     ] *= 1.2  # Online customer growth

- • Lows: Approximately 35°F to 40°F.
+     sales_data.to_csv(filepath, index=False)
+     return filepath, sales_data

- • Winds: Generally from the northwest at 5 to 10 mph.

- Condition: Mainly sunny, but there may be periods of clouds and some overcast conditions throughout the week.
- For more detailed information, you can refer to sources like The Weather Channel or AccuWeather.
- /home/caug/npcww/npcsh:npcsh>
+ code_execution_tool = Tool(
+     {
+         "tool_name": "execute_code",
+         "description": """Executes a Python code block with access to pandas,
+ numpy, and matplotlib.
+ Results should be stored in the 'results' dict to be returned.
+ The only input should be a single code block with \n characters included.
+ The code block must use only the libraries or methods contained within the
+ pandas, numpy, and matplotlib libraries or using builtin methods.
+ Do not include any json formatting or markdown formatting.

- ```
- In the below example, the code that was open was the screen capture analysis tool itself.
- ```npcsh
- npcsh> Can you explain what the code does in the currently open VS code window?
+ When generating your script, the final output must be encoded in a variable
+ named "output". e.g.

- handle_tool_call invoked with tool_name: screen_capture_analysis_tool
+ output = some_analysis_function(inputs, derived_data_from_inputs)
+ Adapt accordingly based on the scope of the analysis

- Screenshot saved as screenshot_20241223_225815.png
+ """,
+         "inputs": ["script"],
+         "steps": [
+             {
+                 "engine": "python",
+                 "code": """{{script}}""",
+             }
+         ],
+     }
+ )

- The code in the visible section of your VS Code window appears to be a script for capturing and analyzing screenshots. Here's a breakdown of what the code does:
+ # Analytics team definition
+ analytics_team = [
+     {
+         "name": "analyst",
+         "primary_directive": "You analyze sales performance data, focusing on revenue trends, customer behavior metrics, and market indicators. Your expertise is in extracting actionable insights from complex datasets.",
+         "model": "gpt-4o-mini",
+         "provider": "openai",
+         "tools": [code_execution_tool],  # Only the code execution tool
+     },
+     {
+         "name": "researcher",
+         "primary_directive": "You specialize in causal analysis and experimental design. Given data insights, you determine what factors drive observed patterns and design tests to validate hypotheses.",
+         "model": "gpt-4o-mini",
+         "provider": "openai",
+         "tools": [code_execution_tool],  # Only the code execution tool
+     },
+     {
+         "name": "engineer",
+         "primary_directive": "You implement data pipelines and optimize data processing. When given analysis requirements, you create efficient workflows to automate insights generation.",
+         "model": "gpt-4o-mini",
+         "provider": "openai",
+         "tools": [code_execution_tool],  # Only the code execution tool
+     },
+ ]

- 1 Import Necessary Libraries: It imports required libraries like system, datetime, and pyautogui, which are essential for capturing screenshots and handling date-time operations.
330
371
 
331
- 2 Capture the Screen: The code captures the current screen using pyautogui.screenshot(), taking a screenshot of the entire screen.
372
+ def create_analytics_team():
373
+ # Initialize NPCs with just the code execution tool
374
+ npcs = []
375
+ for npc_data in analytics_team:
376
+ npc = NPC(
377
+ name=npc_data["name"],
378
+ primary_directive=npc_data["primary_directive"],
379
+ model=npc_data["model"],
380
+ provider=npc_data["provider"],
381
+ tools=[code_execution_tool], # Only code execution tool
382
+ )
383
+ npcs.append(npc)
332
384
 
333
- 3 File Management: It generates a unique filename for the screenshot using the current date and time. This ensures that each screenshot file is distinct.
385
+ # Create coordinator with just code execution tool
386
+ coordinator = NPC(
387
+ name="coordinator",
388
+ primary_directive="You coordinate the analytics team, ensuring each specialist contributes their expertise effectively. You synthesize insights and manage the workflow.",
389
+ model="gpt-4o-mini",
390
+ provider="openai",
391
+ tools=[code_execution_tool], # Only code execution tool
392
+ )
334
393
 
335
- 4 Save the Screenshot: The captured screenshot is saved to a file with the generated filename.
394
+ # Create team
395
+ team = NPCTeam(npcs=npcs, foreman=coordinator)
396
+ return team
336
397
 
337
- 5 Image Analysis: It likely imports a functionality (analyze_image_base) to analyze the screenshot for specific content or information.
338
398
 
339
- 6 Run the Analysis: While not explicitly shown, it appears to call this analysis function to process the saved screenshot.
340
- In summary, this code automates the process of capturing a screenshot, saving it with a unique filename, and analyzing that image for specific content or patterns.
399
+ def main():
400
+ # Create and save test data
401
+ data_path, sales_data = create_test_data()
341
402
 
403
+ # Initialize team
404
+ team = create_analytics_team()
342
405
 
343
- ```
406
+ # Run analysis - updated prompt to reflect code execution approach
+ results = team.orchestrate(
+ f"""
+ Analyze the sales data at {data_path} to:
+ 1. Identify key performance drivers
+ 2. Determine if mobile channel growth is significant
+ 3. Recommend tests to validate growth hypotheses

- ```npcsh
- npcsh> What is the biggest file in my current folder?
+ Here is the head of the data file at {data_path}:
+ {sales_data.head()}

- LLM suggests the following bash command: ls -S | head -n 1
+ When working with dates, ensure that date columns are converted from raw strings, e.g. with the pd.to_datetime function.

- Running command: ls -S | head -n 1

- Command executed with output: image_20241111_000033.png
+ When working with potentially messy data, handle null values by using the nan-aware versions of numpy functions or
+ by filtering them out with a mask.

- I ran the command ls -S | head -n 1 in your current folder. This command sorts all files by size in descending order and then selects the first entry, which represents the largest file. The result of this operation shows that the biggest file in your current folder is image_20241111_000033.png.
+ Use Python code execution to perform the analysis - load the data and perform statistical analysis directly.
+ """
+ )

- ```
+ print(results)

- ```npcsh
- npcsh>What is the best way to implement a linked list in Python?
+ # Cleanup
+ os.remove(data_path)

- The best way to implement a linked list in Python is to define a Node class for the individual elements and a LinkedList class to manage the nodes. Here's a basic implementation:
- class Node:
- def __init__(self, data):
- self.data = data # Store the data
- self.next = None # Initialize the next node as None

- class LinkedList:
- def __init__(self):
- self.head = None # Initialize the head of the list
+ if __name__ == "__main__":
+ main()

- def insert(self, data):
- """Inserts a new node at the end of the linked list."""
- new_node = Node(data)
- if not self.head: # If list is empty, set new node as head
- self.head = new_node
- return
- last_node = self.head
- while last_node.next: # Traverse to the last node
- last_node = last_node.next
- last_node.next = new_node # Link the new node
+ ```
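The date-handling and null-handling guidance embedded in the prompt above can be sketched in a few lines of pandas/numpy. This is an illustrative example; the column names and values here are hypothetical, not from the package:

```python
import numpy as np
import pandas as pd

# Hypothetical sales data: dates arrive as raw strings, revenue has a gap
df = pd.DataFrame({
    "date": ["2024-01-01", "2024-01-02", "2024-01-03"],
    "revenue": [100.0, np.nan, 120.0],
})

# Convert raw strings to datetimes before any date arithmetic
df["date"] = pd.to_datetime(df["date"])

# Handle nulls with nan-aware functions, or filter them with a mask
mean_revenue = float(np.nanmean(df["revenue"].to_numpy()))
non_null = df[df["revenue"].notna()]

print(mean_revenue)   # 110.0
print(len(non_null))  # 2
```

Either approach avoids the silent NaN propagation that plain `mean()` over a raw numpy array would produce.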

- def display(self):
- """Prints the linked list."""
- current = self.head
- while current:
- print(current.data, end=" -> ")
- current = current.next
- print("None")

- # Example usage:
- llist = LinkedList()
- llist.insert(1)
- llist.insert(2)
- llist.insert(3)
- llist.display() # Output: 1 -> 2 -> 3 -> None
- Explanation:

- 1 Node Class: Represents each node in the linked list. It has two attributes: data (to store the value) and next (to point to the next node).
+ ## Star History

- 2 LinkedList Class: Manages the linked list. It has a head attribute pointing to the first node.
+ [![Star History Chart](https://api.star-history.com/svg?repos=cagostino/npcsh&type=Date)](https://star-history.com/#cagostino/npcsh&Date)

- 3 Insert Method: Adds a new node to the end of the list.
+ ## Installation
+ `npcsh` is available on PyPI and can be installed using pip. Before installing, make sure you have the necessary dependencies installed on your system. Below are the instructions for installing those dependencies on Linux, Mac, and Windows. If you find any other dependencies that are needed, please let us know so we can update the installation instructions to be more accommodating.

- 4 Display Method: Prints the elements of the list in order.
- This implementation is simple and can be extended with more methods (e.g., deletion, searching, etc.) as needed.
+ ### Linux install
+ ```bash

- ```
+ sudo apt-get install espeak
+ sudo apt-get install portaudio19-dev python3-pyaudio
+ sudo apt-get install alsa-base alsa-utils
+ sudo apt-get install libcairo2-dev
+ sudo apt-get install libgirepository1.0-dev
+ sudo apt-get install ffmpeg

- ```npcsh
- npcsh> In which of the files in the current directory is "Simon Bolivar" mentioned?
+ # And if you don't have ollama installed, use this:
+ curl -fsSL https://ollama.com/install.sh | sh

- The user's query was about "Simon Bolivar." Based on the relevant context from the codebase, I found specific mentions and relevant information regarding Simon Bolívar in the documentation provided
- in the README.md file.
- From the relevant context, specifically in the section of README.md, it shared a code example that demonstrates how to load an NPC (Non-Player Character) named "Simon Bolivar" using SQLite and
- answer a question about project updates. This code snippet indicates that "Simon Bolivar" is represented in the NPC system, allowing for interaction through questions to retrieve information.
- Additionally, there is another passage that directly answers the query. It reads:
+ ollama pull llama3.2
+ ollama pull llava:7b
+ ollama pull nomic-embed-text
+ pip install npcsh
+ ```
+ If you'd like to enable STT and TTS, additionally install the following:
+ ```bash
+ pip install openai-whisper pyaudio gtts playsound
+ ```

- ▌ "Simón Bolívar, often referred to as 'El Libertador,' was a Venezuelan military and political leader who played a key role in the Latin American wars of independence against Spanish colonial
- ▌ rule in the early 19th century."
- This information highlights Bolívar's significant historical role and achievements, answering the query regarding who Simon Bolivar was, thus making it clear that he was a prominent leader in Latin
- America's fight for independence.
- To summarize, I arrived at this answer by examining the provided context in the README.md file that includes both the coding example that initializes an NPC named "Simon Bolivar" and a historical
- overview of his contributions, which directly addresses the user's inquiry.


+ ### Mac install
+ ```bash
+ brew install portaudio
+ brew install ffmpeg
+ brew install ollama
+ brew services start ollama
+ brew install pygobject3
+ ollama pull llama3.2
+ ollama pull llava:7b
+ ollama pull nomic-embed-text
+ pip install npcsh
 ```
+ ### Windows Install

+ Download and install the ollama exe.

- ```npcsh
- npcsh>what process is using the most ram on my computer right now?
+ Then, in a PowerShell, download and install ffmpeg.

- LLM suggests the following bash command: ps aux --sort=-%mem | head -n 10
- Running command: ps aux --sort=-%mem | head -n 10
- Command executed with output: USER PID %CPU %MEM VSZ RSS TTY STAT START TIME COMMAND
- caug 1040046 1.1 2.7 1226193464 1598980 ? SNl Dec22 23:30 /opt/vivaldi/vivaldi-bin --type=renderer --crashpad-handler-pid=4529 --enable-crash-reporter=,stable --change-stack-guard-on-fork=
- caug 683148 0.0 1.9 2047464 1137232 ? SNl Dec13 10:11 io.elementary.appcenter
- ollama 1884 0.0 1.8 3683920 1051020 ? SNsl Dec02 0:30 /usr/local/bin/ollama serve
- caug 1191836 5.3 1.6 7346052 964988 pts/17 Sl+ 22:57 0:22 /home/caug/.pyenv/versions/npcsh/bin/python /home/caug/.pyenv/versions/3.11.0/envs/npcsh/bin/npcsh
- caug 1145749 0.0 1.5 7068128 892500 pts/13 Sl+ Dec22 0:15 /home/caug/.pyenv/versions/3.11.0/bin/python3.11 /home/caug/.pyenv/versions/3.11.0/bin/npcsh
- caug 493912 0.0 1.5 7050968 890012 pts/11 Sl+ Dec10 0:14 /home/caug/.pyenv/versions/npcsh/bin/python /home/caug/.pyenv/versions/npcsh/bin/npcsh
- caug 4522 0.9 1.5 34391236 874756 ? SNLl Dec02 295:06 /opt/vivaldi/vivaldi-bin
- caug 3995 0.7 1.1 6051812 683916 ? R<sl Dec02 221:19 /usr/bin/gnome-shell
- caug 501721 0.3 1.0 1212584660 626248 ? SNl Dec10 60:22 /opt/vivaldi/vivaldi-bin --type=renderer --crashpad-handler-pid=4529 --enable-crash-reporter=,stable --change-stack-guard-on-fork=
-
- To determine which processes are using the most RAM on your computer, I executed the command ps aux --sort=-%mem | head -n 10.
- This command lists all running processes (ps aux), sorts them by memory usage in descending order (--sort=-%mem), and then retrieves the top 10 entries (head -n 10).
- The output shows the processes currently consuming the most memory. In the results, the process using the most RAM is Vivaldi's renderer, followed by other applications such as the Elementary App Center, Ollama server, and several instances of Python related to your NPC shell.
 ```
-
- ```npcsh
- npcsh> can you generate an image of a calico cat and an orange tabby cat in the style of a japanese woodblock print where the two cats are engaged in a samurai battle but instead of samurai swords they are holding cat toys that are like wands with felt birds on the ends of them on strings
-
- handle_tool_call invoked with tool_name: image_generation_tool
+ ollama pull llama3.2
+ ollama pull llava:7b
+ ollama pull nomic-embed-text
+ pip install npcsh
 ```
- ![catfight](test_data/catfight.PNG)
+ As of now, npcsh appears to work well with some of the core functionalities like /ots and /whisper.


- In addition to its ability to execute one-off tasks or tool calls, npcsh also has
- the ability to generate tool call sequences when asked:
- ```npcsh
+ ### Fedora Install (under construction)

- npcsh> can you use a tool sequence to find the gdp of russia in 2024 and then to use that information to generate an image?
+ python3-dev (fixes hnswlib issues with chroma db)
+ xhost + (pyautogui)
+ python-tkinter (pyautogui)

- handle_tool_call invoked with tool_name: generic_search_tool
- Tool found: generic_search_tool
- handle_tool_call invoked with tool_name: image_generation_tool
- Tool found: image_generation_tool
+ ## Startup Configuration and Project Structure
+ After it has been pip installed, `npcsh` can be used as a command line tool. Start it by typing:
+ ```bash
+ npcsh
+ ```
+ When initialized, `npcsh` will generate a .npcshrc file in your home directory that stores your npcsh settings.
+ Here is an example of what the .npcshrc file might look like after this has been run.
+ ```bash
+ # NPCSH Configuration File
+ export NPCSH_INITIALIZED=1
+ export NPCSH_CHAT_PROVIDER='ollama'
+ export NPCSH_CHAT_MODEL='llama3.2'
+ export NPCSH_DB_PATH='~/npcsh_history.db'
+ ```
+ `npcsh` also comes with a set of tools and NPCs that are used in processing. It will generate a folder at ~/.npcsh/ that contains the tools and NPCs that are used in the shell, and these will be used in the absence of other project-specific ones. Additionally, `npcsh` records interactions and compiled information about NPCs within a local SQLite database at the path specified in the .npcshrc file. This defaults to ~/npcsh_history.db if not specified. When the data mode is used to load or analyze data in CSVs or PDFs, these data will be stored in the same database for future reference.
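Because the history store is a plain SQLite file, you can inspect it with Python's standard library. A minimal sketch (table names and schema vary by npcsh version, so this just lists whatever tables exist):

```python
import os
import sqlite3

# Default path from the .npcshrc example above; honor NPCSH_DB_PATH if set
db_path = os.path.expanduser(os.environ.get("NPCSH_DB_PATH", "~/npcsh_history.db"))

with sqlite3.connect(db_path) as conn:
    tables = [row[0] for row in conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name"
    )]

print(tables)
```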

- The Gross Domestic Product (GDP) of Russia is estimated to be $8.311 trillion by the end of 2024, according to World Economics.
- This figure is significantly larger than the official estimate of $5.804 trillion published by the World Bank for the end of 2023.
+ The installer will automatically add this file to your shell config, but if it does not do so successfully for whatever reason you can add the following to your .bashrc or .zshrc:

- It seems that you've generated an image. If you have any questions or need assistance related to this image, please let me know how I can help!
+ ```bash
+ # Source NPCSH configuration
+ if [ -f ~/.npcshrc ]; then
+ . ~/.npcshrc
+ fi
 ```
- and then the associated image :
- ![gdp](test_data/r8ss9a.PNG)
-

+ We support inference via `openai`, `anthropic`, `ollama`, `gemini`, `deepseek`, and `openai-like` APIs. The default provider must be one of `['openai','anthropic','ollama', 'gemini', 'deepseek', 'openai-like']` and the model must be one available from those providers.

+ To use tools that require API keys, create an `.env` file in the folder where you are working or place the relevant API keys as env variables in your ~/.npcshrc. If you already have these API keys set in a ~/.bashrc or a ~/.zshrc or similar files, you need not additionally add them to ~/.npcshrc or to an `.env` file. Here is an example of what an `.env` file might look like:

+ ```bash
+ export OPENAI_API_KEY="your_openai_key"
+ export ANTHROPIC_API_KEY="your_anthropic_key"
+ export DEEPSEEK_API_KEY='your_deepseek_key'
+ export GEMINI_API_KEY='your_gemini_key'
+ export PERPLEXITY_API_KEY='your_perplexity_key'
+ ```
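Keys exported this way surface as ordinary process environment variables, so any Python tool can pick them up with `os.environ`. A minimal check (the variable name comes from the example above):

```python
import os

# API keys exported in .npcshrc, .bashrc, or a loaded .env file
# all surface the same way: as process environment variables.
api_key = os.environ.get("OPENAI_API_KEY")

if api_key is None:
    print("OPENAI_API_KEY is not set; OpenAI-backed providers will be unavailable.")
else:
    print("OPENAI_API_KEY is set.")
```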


+ Individual npcs can also be set to use different models and providers by setting the `model` and `provider` keys in the npc files.
+ Once initialized and set up, you will find the following in your ~/.npcsh directory:
+ ```bash
+ ~/.npcsh/
+ ├── npc_team/ # Global NPCs
+ │ ├── tools/ # Global tools
+ │ └── assembly_lines/ # Workflow pipelines

- ### Piping outputs
- An important facet that makes `npcsh` so powerful is the ability to pipe outputs from one tool call to another. This allows for the chaining of commands and the creation of complex workflows. For example, you can use the output of a search to generate an image, or you can use the output of an image analysis to generate a report. Here is an example of how this might look in practice:
- ```npcsh
- npcsh> what is the gdp of russia in 2024? | /vixynt 'generate an image that contains {0}'
+ ```
+ For cases where you wish to set up a project-specific set of NPCs, tools, and assembly lines, add an `npc_team` directory to your project and `npcsh` should be able to pick up on its presence, like so:
+ ```bash
+ ./npc_team/ # Project-specific NPCs
+ ├── tools/ # Project tools
+ │ └── example.tool # Example tool
+ └── assembly_lines/ # Project workflows
+ └── example.pipe
+ └── models/ # Project models
+ └── example.model
+ └── example1.npc # Example NPC
+ └── example2.npc # Example NPC
+ └── example1.ctx # Example context
+ └── example2.ctx # Example context

- ### Executing Bash Commands
- You can execute bash commands directly within npcsh. The LLM can also generate and execute bash commands based on your natural language requests.
- For example:
- ```npcsh
- npcsh> ls -l
+ ```
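The project-over-global lookup described above can be pictured as a small path check. This is an illustrative sketch of the documented behavior, not npcsh's actual resolution code:

```python
from pathlib import Path

def resolve_npc_team(cwd: str = ".") -> Path:
    """Prefer a project-local ./npc_team directory; otherwise fall back
    to the global ~/.npcsh/npc_team that npcsh generates on first run."""
    project_team = Path(cwd) / "npc_team"
    if project_team.is_dir():
        return project_team
    return Path.home() / ".npcsh" / "npc_team"

print(resolve_npc_team())
```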

- npcsh> cp file1.txt file2.txt
- npcsh> mv file1.txt file2.txt
- npcsh> mkdir new_directory
- npcsh> git status
- npcsh> vim file.txt
+ ## IMPORTANT: migrations and deprecations

+ ### v0.3.4
+ - In v0.3.4, the structure for tools was adjusted. If you have made custom tools, please refer to the structure within npc_compiler to ensure that they are in the correct format. Otherwise, do the following:
+ ```bash
+ rm ~/.npcsh/npc_team/tools/*.tool
+ ```
+ and then
+ ```bash
+ npcsh
 ```
+ and the updated tools will be copied over into the correct location.

- ### NPC CLI
- When npcsh is installed, it comes with the `npc` cli as well. The `npc` cli has various command to make initializing and serving NPC projects easier.
+ ### v0.3.5
+ - Version 0.3.5 included a complete overhaul and refactoring of the llm_funcs module. This was done to shorten it considerably and to make it easier to add new models and providers.

- Users can make queries like so:
- ```bash
- $ npc 'whats the biggest filei n my computer'
- Loaded .env file from /home/caug/npcww/npcsh
- action chosen: request_input
- explanation given: The user needs to provide more context about their operating system or specify which directory to search for the biggest file.

- Additional input needed: The user did not specify their operating system or the directory to search for the biggest file, making it unclear how to execute the command.
- Please specify your operating system (e.g., Windows, macOS, Linux) and the directory you want to search in.: linux and root
- action chosen: execute_command
- explanation given: The user is asking for the biggest file on their computer, which can be accomplished with a simple bash command that searches for the largest files.
- sibiji generating command
- LLM suggests the following bash command: sudo find / -type f -exec du -h {} + | sort -rh | head -n 1
- Running command: sudo find / -type f -exec du -h {} + | sort -rh | head -n 1
- Command executed with output: 11G /home/caug/.cache/huggingface/hub/models--state-spaces--mamba-2.8b/blobs/39911a8470a2b256016b57cc71c68e0f96751cba5b229216ab1f4f9d82096a46
+ - In version 0.3.5, a change was introduced to the database schema for messages to add npcs, models, providers, and associated attachments to data. If you have used `npcsh` before this version, you will need to run this migration script to update your database schema: [migrate_conversation_history_v0.3.5.py](https://github.com/cagostino/npcsh/blob/cfb9dc226e227b3e888f3abab53585693e77f43d/npcsh/migrations/migrate_conversation_history_%3Cv0.3.4-%3Ev0.3.5.py)

- I ran a command on your Linux system that searches for the largest files on your computer. The command `sudo find / -type f -exec du -h {} + | sort -rh | head -n 1` performs the following steps:
+ - Additionally, NPCSH_MODEL and NPCSH_PROVIDER have been renamed to NPCSH_CHAT_MODEL and NPCSH_CHAT_PROVIDER
+ to provide a more consistent naming scheme now that we have additionally introduced `NPCSH_VISION_MODEL` and `NPCSH_VISION_PROVIDER`, `NPCSH_EMBEDDING_MODEL`, `NPCSH_EMBEDDING_PROVIDER`, `NPCSH_REASONING_MODEL`, `NPCSH_REASONING_PROVIDER`, `NPCSH_IMAGE_GEN_MODEL`, and `NPCSH_IMAGE_GEN_PROVIDER`.
+ - In addition, we have added NPCSH_API_URL to better accommodate openai-like apis that require a specific url to be set, as well as `NPCSH_STREAM_OUTPUT` to indicate whether or not to use streaming in one's responses. It is set to 0 (false) by default, as it has only been tested and verified for a small subset of the models and providers we have available (openai, anthropic, and ollama). If you try it and run into issues, please post them here so we can correct them as soon as possible!
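The renamed settings are read like any other environment variables. A sketch of reading them in Python, using the values from the example .npcshrc earlier as fallbacks (those defaults are illustrative, not canonical):

```python
import os

# NPCSH_* settings are plain environment variables; the fallback values
# here mirror the example .npcshrc shown earlier in this README.
chat_model = os.environ.get("NPCSH_CHAT_MODEL", "llama3.2")
chat_provider = os.environ.get("NPCSH_CHAT_PROVIDER", "ollama")
# NPCSH_STREAM_OUTPUT defaults to 0, i.e. streaming off
stream_output = os.environ.get("NPCSH_STREAM_OUTPUT", "0") == "1"

print(chat_model, chat_provider, stream_output)
```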

- 1. **Find Command**: It searches for all files (`-type f`) starting from the root directory (`/`).
- 2. **Disk Usage**: For each file found, it calculates its disk usage in a human-readable format (`du -h`).
- 3. **Sort**: It sorts the results in reverse order based on size (`sort -rh`), so the largest files appear first.
- 4. **Head**: Finally, it retrieves just the largest file using `head -n 1`.

- The output indicates that the biggest file on your system is located at `/home/caug/.cache/huggingface/hub/models--state-spaces--mamba-2.8b/blobs/39911a8470a2b256016b57cc71c68e0f96751cba5b229216ab1f4f9d82096a46` and is 11GB in size.
+ ## npcsh usage
+ In the `npcsh` shell, users can ask LLMs questions, have LLMs execute commands or use tools, or utilize macros that provide additional functionality. When a user does not invoke a specific macro, the shell will automatically decide which tool to use based on the user's input.

+ Here are some examples of how you can use npcsh:
+ ```npcsh
+ npcsh> Who was Simon Bolivar?
+
+ Simón Bolívar, often referred to as "El Libertador," was a Venezuelan military and political leader who played a key role in the Latin American wars of independence against Spanish
+ colonial rule in the early 19th century. He was born on July 24, 1783, in Caracas, Venezuela, into a wealthy Creole family.
+ Bolívar's vision extended beyond merely liberating territories; he aspired to unify the newly independent nations of South America into a single federation, which he called "Gran
+ Colombia," consisting of present-day Colombia, Venezuela, Ecuador, and Panama. He was known for his exceptional military strategies and leadership, which led to successful campaigns in
+ various regions, including the battles of Boyacá, Carabobo, and Ayacucho.
+ He faced numerous challenges, including political fragmentation, regional rivalries, and personal betrayals. Despite his initial successes, the unity he sought for Latin America proved
+ difficult to achieve, and Gran Colombia eventually disintegrated in the early 1830s.
+ Bolívar's influence and legacy extend far beyond his lifetime. He is celebrated in various countries across Latin America as a national hero and symbol of independence. His thoughts on
+ liberty, governance, and social issues continue to inspire movements for democracy and social justice in the region. Simón Bolívar died on December 17, 1830, but remains a pivotal figure
+ in the history of Latin America.
 ```

- ```bash
- $ npc 'whats the weather in tokyo'
- Loaded .env file from /home/caug/npcww/npcsh
- action chosen: invoke_tool
- explanation given: The user's request for the current weather in Tokyo requires up-to-date information, which can be best obtained through an internet search.
- Tool found: internet_search
- Executing tool with input values: {'query': 'whats the weather in tokyo'}
- QUERY in tool whats the weather in tokyo
- [{'title': 'Tokyo, Tokyo, Japan Weather Forecast | AccuWeather', 'href': 'https://www.accuweather.com/en/jp/tokyo/226396/weather-forecast/226396', 'body': 'Tokyo, Tokyo, Japan Weather Forecast, with current conditions, wind, air quality, and what to expect for the next 3 days.'}, {'title': 'Tokyo, Japan 14 day weather forecast - timeanddate.com', 'href': 'https://www.timeanddate.com/weather/japan/tokyo/ext', 'body': 'Tokyo Extended Forecast with high and low temperatures. °F. Last 2 weeks of weather'}, {'title': 'Tokyo, Tokyo, Japan Current Weather | AccuWeather', 'href': 'https://www.accuweather.com/en/jp/tokyo/226396/current-weather/226396', 'body': 'Current weather in Tokyo, Tokyo, Japan. Check current conditions in Tokyo, Tokyo, Japan with radar, hourly, and more.'}, {'title': 'Weather in Tokyo, Japan - timeanddate.com', 'href': 'https://www.timeanddate.com/weather/japan/tokyo', 'body': 'Current weather in Tokyo and forecast for today, tomorrow, and next 14 days'}, {'title': 'Tokyo Weather Forecast Today', 'href': 'https://japanweather.org/tokyo', 'body': "For today's mild weather in Tokyo, with temperatures between 13ºC to 16ºC (55.4ºF to 60.8ºF), consider wearing: - Comfortable jeans or slacks - Sun hat (if spending time outdoors) - Lightweight sweater or cardigan - Long-sleeve shirt or blouse. Temperature. Day. 14°C. Night. 10°C. Morning. 10°C. Afternoon."}] <class 'list'>
- RESULTS in tool ["Tokyo, Tokyo, Japan Weather Forecast, with current conditions, wind, air quality, and what to expect for the next 3 days.\n Citation: https://www.accuweather.com/en/jp/tokyo/226396/weather-forecast/226396\n\n\n\nTokyo Extended Forecast with high and low temperatures. °F. Last 2 weeks of weather\n Citation: https://www.timeanddate.com/weather/japan/tokyo/ext\n\n\n\nCurrent weather in Tokyo, Tokyo, Japan. Check current conditions in Tokyo, Tokyo, Japan with radar, hourly, and more.\n Citation: https://www.accuweather.com/en/jp/tokyo/226396/current-weather/226396\n\n\n\nCurrent weather in Tokyo and forecast for today, tomorrow, and next 14 days\n Citation: https://www.timeanddate.com/weather/japan/tokyo\n\n\n\nFor today's mild weather in Tokyo, with temperatures between 13ºC to 16ºC (55.4ºF to 60.8ºF), consider wearing: - Comfortable jeans or slacks - Sun hat (if spending time outdoors) - Lightweight sweater or cardigan - Long-sleeve shirt or blouse. Temperature. Day. 14°C. Night. 10°C. Morning. 10°C. Afternoon.\n Citation: https://japanweather.org/tokyo\n\n\n", 'https://www.accuweather.com/en/jp/tokyo/226396/weather-forecast/226396\n\nhttps://www.timeanddate.com/weather/japan/tokyo/ext\n\nhttps://www.accuweather.com/en/jp/tokyo/226396/current-weather/226396\n\nhttps://www.timeanddate.com/weather/japan/tokyo\n\nhttps://japanweather.org/tokyo\n']
- The current weather in Tokyo, Japan is mild, with temperatures ranging from 13°C to 16°C (approximately 55.4°F to 60.8°F). For today's conditions, it is suggested to wear comfortable jeans or slacks, a lightweight sweater or cardigan, and a long-sleeve shirt or blouse, especially if spending time outdoors. The temperature today is expected to reach a high of 14°C (57.2°F) during the day and a low of 10°C (50°F) at night.

- For more detailed weather information, you can check out the following sources:
- - [AccuWeather Forecast](https://www.accuweather.com/en/jp/tokyo/226396/weather-forecast/226396)
- - [Time and Date Extended Forecast](https://www.timeanddate.com/weather/japan/tokyo/ext)
- - [Current Weather on AccuWeather](https://www.accuweather.com/en/jp/tokyo/226396/current-weather/226396)
- - [More on Time and Date](https://www.timeanddate.com/weather/japan/tokyo)
- - [Japan Weather](https://japanweather.org/tokyo)
+ ```npcsh
+ npcsh> What is the capital of France?
+ The capital of France is Paris. It is the largest city in the country and is known for its rich history, art, culture, and architecture, including famous landmarks such as the Eiffel Tower, Notre-Dame Cathedral, and the Louvre Museum.

 ```

+ ```npcsh
+ npcsh> can you tell me a joke about my favorite city?

- ### Serving
- To serve an NPC project, first install redis-server and start it
-
- on Ubuntu:
- ```bash
- sudo apt update && sudo apt install redis-server
- redis-server
- ```
+ Additional input needed: The user did not specify their favorite city, which is necessary to generate a relevant joke.
+ Please tell me your favorite city so I can share a joke about it!: boston

- on macOS:
- ```bash
- brew install redis
- redis-server
+ Sure! Here's a joke about Boston:
+ Why do Bostonians like to play hide and seek?
+ Because good luck hiding when everyone yells, "Wicked awesome, ya gotta be here!"
 ```

- Then navigate to the project directory and run:

- ```bash
- npc serve
- ```
- If you want to specify a certain port, you can do so with the `-p` flag:
- ```bash
- npc serve -p 5337
- ```
- or with the `--port` flag:
- ```bash
- npc serve --port 5337
+ ```npcsh
+ npcsh> What's the weather in Tokyo?

- ```
- If you want to initialize a project based on templates and then make it available for serving, you can do so like this
- ```bash
- npc serve -t 'sales, marketing' -ctx 'im developing a team that will focus on sales and marketing within the logging industry. I need a team that can help me with the following: - generate leads - create marketing campaigns - build a sales funnel - close deals - manage customer relationships - manage sales pipeline - manage marketing campaigns - manage marketing budget' -m llama3.2 -pr ollama
- ```
- This will use the specified model and provider to generate a team of npcs to fit the templates and context provided.
+ handle_tool_call invoked with tool_name: generic_search_tool

+ The weather in Tokyo, Japan, is expected to be mixed with sun and clouds. Here are some details from the recent forecasts:

- Once the server is up and running, you can access the API endpoints at `http://localhost:5337/api/`. Here are some example curl commands to test the endpoints:

- ```bash
- echo "Testing health endpoint..."
- curl -s http://localhost:5337/api/health | jq '.'
+ Highs: Around 53°F to 58°F with a few variations depending on the day.

- echo -e "\nTesting execute endpoint..."
- curl -s -X POST http://localhost:5337/api/execute \
- -H "Content-Type: application/json" \
- -d '{"commandstr": "hello world", "currentPath": "~/", "conversationId": "test124"}' | jq '.'
+ Lows: Approximately 35°F to 40°F.

- echo -e "\nTesting conversations endpoint..."
- curl -s "http://localhost:5337/api/conversations?path=/tmp" | jq '.'
+ Winds: Generally from the northwest at 5 to 10 mph.

+ Condition: Mainly sunny, but there may be periods of clouds and some overcast conditions throughout the week.
+ For more detailed information, you can refer to sources like The Weather Channel or AccuWeather.
+ /home/caug/npcww/npcsh:npcsh>

- echo -e "\nTesting conversation messages endpoint..."
- curl -s http://localhost:5337/api/conversation/test123/messages | jq '.'
 ```
+ ```npcsh
648
+ npcsh> Can you explain what the code does in the currently open VS code window?
605
649
 
606
- ###
650
+ handle_tool_call invoked with tool_name: screen_capture_analysis_tool
607
651
 
652
+ Screenshot saved as screenshot_20241223_225815.png
608
653
 
609
- * **Planned:** -npc scripts
610
- -npc run select +sql_model <run up>
611
- -npc run select +sql_model+ <run up and down>
612
- -npc run select sql_model+ <run down>
613
- -npc run line <assembly_line>
614
- -npc conjure fabrication_plan.fab
654
+ The code in the visible section of your VS Code window appears to be a script for capturing and analyzing screenshots. Here's a breakdown of what the code does:
615
655
 
656
+ 1 Import Necessary Libraries: It imports required libraries like system, datetime, and pyautogui, which are essential for capturing screenshots and handling date-time operations.
616
657
 
658
+ 2 Capture the Screen: The code captures the current screen using pyautogui.screenshot(), taking a screenshot of the entire screen.
617
659
 
618
- ## Macros
660
+ 3 File Management: It generates a unique filename for the screenshot using the current date and time. This ensures that each screenshot file is distinct.
619
661
 
620
- While npcsh can decide the best option to use based on the user's input, the user can also execute certain actions with a macro. Macros are commands within the NPC shell that start with a forward slash (/) and are followed (in some cases) by the relevant arguments for those macros. Each macro is also available as a sub-program within the NPC CLI. In the following examples we demonstrate how to carry out the same operations from within npcsh and from a regular shell.
662
+ 4 Save the Screenshot: The captured screenshot is saved to a file with the generated filename.
621
663
 
664
+ 5 Image Analysis: It likely imports a functionality (analyze_image_base) to analyze the screenshot for specific content or information.
622
665
 
623
- To learn about the available macros from within the shell, type:
624
- ```npcsh
625
- npcsh> /help
626
- ```
666
+ 6 Run the Analysis: While not explicitly shown, it appears to call this analysis function to process the saved screenshot.
667
+ In summary, this code automates the process of capturing a screenshot, saving it with a unique filename, and analyzing that image for specific content or patterns.

- or from bash
- ```bash
- npc --help
- #alternatively
- npc -h
- ```

- To exit the shell:
- ```npcsh
- npcsh> /exit
 ```

- Otherwise, here are some more detailed examples of macros that can be used in npcsh:
- ### Conjure (under construction)
- Use the `/conjure` macro to generate an NPC, a NPC tool, an assembly line, a job, or an SQL model
+ ```npcsh
+ npcsh> What is the biggest file in my current folder?

- ```bash
- npc conjure -n name -t 'templates'
- ```
+ LLM suggests the following bash command: ls -S | head -n 1

+ Running command: ls -S | head -n 1

- ### Data Interaction and analysis (under construction)
+ Command executed with output: image_20241111_000033.png

+ I ran the command ls -S | head -n 1 in your current folder. This command sorts all files by size in descending order and then selects the first entry, which represents the largest file. The result of this operation shows that the biggest file in your current folder is image_20241111_000033.png.

- ### Debate (under construction)
- Use the `/debate` macro to have two or more NPCs debate a topic, problem, or question.
+ ```
 
- For example:
 ```npcsh
- npcsh> /debate Should humans colonize Mars? npcs = ['sibiji', 'mark', 'ted']
- ```
+ npcsh> What is the best way to implement a linked list in Python?
 
+ The best way to implement a linked list in Python is to define a Node class for the individual elements and a LinkedList class to manage the nodes. Here's a basic implementation:
+ class Node:
+     def __init__(self, data):
+         self.data = data  # Store the data
+         self.next = None  # Initialize the next node as None
 
+ class LinkedList:
+     def __init__(self):
+         self.head = None  # Initialize the head of the list
 
- ### Notes
- Jot down notes and store them within the npcsh database and in the current directory as a text file.
- ```npcsh
- npcsh> /notes
- ```
+     def insert(self, data):
+         """Inserts a new node at the end of the linked list."""
+         new_node = Node(data)
+         if not self.head:  # If list is empty, set new node as head
+             self.head = new_node
+             return
+         last_node = self.head
+         while last_node.next:  # Traverse to the last node
+             last_node = last_node.next
+         last_node.next = new_node  # Link the new node
 
+     def display(self):
+         """Prints the linked list."""
+         current = self.head
+         while current:
+             print(current.data, end=" -> ")
+             current = current.next
+         print("None")
 
- ### Over-the-shoulder: Screenshots and image analysis
+ # Example usage:
+ llist = LinkedList()
+ llist.insert(1)
+ llist.insert(2)
+ llist.insert(3)
+ llist.display()  # Output: 1 -> 2 -> 3 -> None
+ Explanation:
 
- Use the /ots macro to take a screenshot and write a prompt for an LLM to answer about the screenshot.
- ```npcsh
- npcsh> /ots
+ 1. Node Class: Represents each node in the linked list. It has two attributes: data (to store the value) and next (to point to the next node).
 
- Screenshot saved to: /home/caug/.npcsh/screenshots/screenshot_1735015011.png
+ 2. LinkedList Class: Manages the linked list. It has a head attribute pointing to the first node.
 
- Enter a prompt for the LLM about this image (or press Enter to skip): describe whats in this image
+ 3. Insert Method: Adds a new node to the end of the list.
 
- The image displays a source control graph, likely from a version control system like Git. It features a series of commits represented by colored dots connected by lines, illustrating the project's development history. Each commit message provides a brief description of the changes made, including tasks like fixing issues, merging pull requests, updating README files, and adjusting code or documentation. Notably, several commits mention specific users, particularly "Chris Agostino," indicating collaboration and contributions to the project. The graph visually represents the branching and merging of code changes.
- ```
+ 4. Display Method: Prints the elements of the list in order.
+ This implementation is simple and can be extended with more methods (e.g., deletion, searching, etc.) as needed.
 
- In bash:
- ```bash
- npc ots
 ```
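The transcript above notes that the generated LinkedList can be extended with more methods. As one example, a deletion method might look like this (a sketch building on the same class shape, not output from npcsh; `to_list` is a hypothetical helper added for inspection):

```python
class Node:
    def __init__(self, data):
        self.data = data
        self.next = None

class LinkedList:
    def __init__(self):
        self.head = None

    def insert(self, data):
        """Append a new node at the end of the list."""
        new_node = Node(data)
        if not self.head:
            self.head = new_node
            return
        last = self.head
        while last.next:
            last = last.next
        last.next = new_node

    def delete(self, data):
        """Remove the first node holding `data`; return True if a node was removed."""
        prev, current = None, self.head
        while current:
            if current.data == data:
                if prev is None:      # deleting the head node
                    self.head = current.next
                else:                 # unlink an interior or tail node
                    prev.next = current.next
                return True
            prev, current = current, current.next
        return False

    def to_list(self):
        """Collect the node values into a Python list for easy inspection."""
        out, current = [], self.head
        while current:
            out.append(current.data)
            current = current.next
        return out
```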
 
-
-
- Alternatively, pass an existing image in like:
 ```npcsh
- npcsh> /ots test_data/catfight.PNG
- Enter a prompt for the LLM about this image (or press Enter to skip): whats in this ?
+ npcsh> In which of the files in the current directory is "Simon Bolivar" mentioned?
 
- The image features two cats, one calico and one orange tabby, playing with traditional Japanese-style toys. They are each holding sticks attached to colorful pom-pom balls, which resemble birds. The background includes stylized waves and a red sun, accentuating a vibrant, artistic style reminiscent of classic Japanese art. The playful interaction between the cats evokes a lively, whimsical scene.
- ```
+ The user's query was about "Simon Bolivar." Based on the relevant context from the codebase, I found specific mentions and relevant information regarding Simon Bolívar in the documentation provided in the README.md file.
+ From the relevant context, specifically in the section of README.md, it shared a code example that demonstrates how to load an NPC (Non-Player Character) named "Simon Bolivar" using SQLite and answer a question about project updates. This code snippet indicates that "Simon Bolivar" is represented in the NPC system, allowing for interaction through questions to retrieve information.
+ Additionally, there is another passage that directly answers the query. It reads:
 
- ```bash
- npc ots -f test_data/catfight.PNG
- ```
+ ▌ "Simón Bolívar, often referred to as 'El Libertador,' was a Venezuelan military and political leader who played a key role in the Latin American wars of independence against Spanish colonial rule in the early 19th century."
+ This information highlights Bolívar's significant historical role and achievements, answering the query regarding who Simon Bolivar was, thus making it clear that he was a prominent leader in Latin America's fight for independence.
+ To summarize, I arrived at this answer by examining the provided context in the README.md file that includes both the coding example that initializes an NPC named "Simon Bolivar" and a historical overview of his contributions, which directly addresses the user's inquiry.
 
 
- ### Plan: Schedule tasks to be run at regular intervals (under construction)
- Use the /plan macro to schedule tasks to be run at regular intervals.
- ```npcsh
- npcsh> /plan run a rag search on the files in the current directory every 5 minutes
 ```
 
- ```bash
- npc plan -f 30m -t 'task'
- ```
 
- ### Plonk: Computer Control
- Use the /plonk macro to allow the LLM to control your computer.
 ```npcsh
- npcsh> /plonk go to a web browser and go to wikipedia and find out information about simon bolivar
- ```
-
- ```bash
- npc plonk 'use a web browser to find out information about simon bolivar'
- ```
-
- ### RAG
+ npcsh> what process is using the most ram on my computer right now?
 
- Use the /rag macro to perform a local rag search.
- If you pass a `-f` flag with a filename or list of filenames (e.g. *.py) then it will embed the documents and perform the cosine similarity scoring.
+ LLM suggests the following bash command: ps aux --sort=-%mem | head -n 10
+ Running command: ps aux --sort=-%mem | head -n 10
+ Command executed with output: USER PID %CPU %MEM VSZ RSS TTY STAT START TIME COMMAND
+ caug 1040046 1.1 2.7 1226193464 1598980 ? SNl Dec22 23:30 /opt/vivaldi/vivaldi-bin --type=renderer --crashpad-handler-pid=4529 --enable-crash-reporter=,stable --change-stack-guard-on-fork=
+ caug 683148 0.0 1.9 2047464 1137232 ? SNl Dec13 10:11 io.elementary.appcenter
+ ollama 1884 0.0 1.8 3683920 1051020 ? SNsl Dec02 0:30 /usr/local/bin/ollama serve
+ caug 1191836 5.3 1.6 7346052 964988 pts/17 Sl+ 22:57 0:22 /home/caug/.pyenv/versions/npcsh/bin/python /home/caug/.pyenv/versions/3.11.0/envs/npcsh/bin/npcsh
+ caug 1145749 0.0 1.5 7068128 892500 pts/13 Sl+ Dec22 0:15 /home/caug/.pyenv/versions/3.11.0/bin/python3.11 /home/caug/.pyenv/versions/3.11.0/bin/npcsh
+ caug 493912 0.0 1.5 7050968 890012 pts/11 Sl+ Dec10 0:14 /home/caug/.pyenv/versions/npcsh/bin/python /home/caug/.pyenv/versions/npcsh/bin/npcsh
+ caug 4522 0.9 1.5 34391236 874756 ? SNLl Dec02 295:06 /opt/vivaldi/vivaldi-bin
+ caug 3995 0.7 1.1 6051812 683916 ? R<sl Dec02 221:19 /usr/bin/gnome-shell
+ caug 501721 0.3 1.0 1212584660 626248 ? SNl Dec10 60:22 /opt/vivaldi/vivaldi-bin --type=renderer --crashpad-handler-pid=4529 --enable-crash-reporter=,stable --change-stack-guard-on-fork=
 
- ```npcsh
- npcsh> /rag -f *.py what is the best way to implement a linked list in Python?
+ To determine which processes are using the most RAM on your computer, I executed the command ps aux --sort=-%mem | head -n 10.
+ This command lists all running processes (ps aux), sorts them by memory usage in descending order (--sort=-%mem), and then retrieves the top 10 entries (head -n 10).
+ The output shows the processes currently consuming the most memory. In the results, the process using the most RAM is Vivaldi's renderer, followed by other applications such as the Elementary App Center, Ollama server, and several instances of Python related to your NPC shell.
 ```
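The `ps aux --sort=-%mem | head -n 10` pipeline explained above can be wrapped from Python via `subprocess` (a sketch; `--sort` is a GNU procps flag and is assumed to be available, so the helper falls back to an empty list elsewhere):

```python
import subprocess

def top_memory_processes(n=10):
    """Return the top `n` rows of `ps aux --sort=-%mem`, or [] if unavailable."""
    try:
        out = subprocess.run(
            ["ps", "aux", "--sort=-%mem"],
            capture_output=True, text=True, check=True,
        )
    except (OSError, subprocess.CalledProcessError):
        return []  # e.g. a BSD `ps` without the GNU --sort option
    lines = out.stdout.splitlines()
    return lines[1 : n + 1]  # drop the USER/PID/%MEM header row
```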
 
- Alternatively, if you want to perform rag on your past conversations, you can do so like this:
 ```npcsh
- npcsh> /rag what is the best way to implement a linked list in Python?
- ```
- and it will automatically look through the recorded conversations in ~/npcsh_history.db
-
+ npcsh> can you generate an image of a calico cat and an orange tabby cat in the style of a japanese woodblock print where the two cats are engaged in a samurai battle but instead of samurai swords they are holding cat toys that are like wands with felt birds on the ends of them on strings
 
- In bash:
- ```bash
- npc rag -f *.py
+ handle_tool_call invoked with tool_name: image_generation_tool
 ```
+ ![catfight](test_data/catfight.PNG)
 
- ### Rehash
-
- Use the /rehash macro to re-send the last message to the LLM.
- ```npcsh
- npcsh> /rehash
- ```
 
- ### Sample
- Send a one-shot question to the LLM.
+ In addition to its ability to execute one-off tasks or tool calls, npcsh can also generate tool call sequences when asked:
 ```npcsh
- npcsh> /sample What is the capital of France?
- ```
 
- Bash:
- ```bash
- npc sample 'thing' -m model -p provider
+ npcsh> can you use a tool sequence to find the gdp of russia in 2024 and then to use that information to generate an image?
 
- ```
+ handle_tool_call invoked with tool_name: generic_search_tool
+ Tool found: generic_search_tool
+ handle_tool_call invoked with tool_name: image_generation_tool
+ Tool found: image_generation_tool
 
+ The Gross Domestic Product (GDP) of Russia is estimated to be $8.311 trillion by the end of 2024, according to World Economics.
+ This figure is significantly larger than the official estimate of $5.804 trillion published by the World Bank for the end of 2023.
 
- ### Search
- Search can be accomplished through the `/search` macro. You can specify the provider as either "perplexity" or "duckduckgo". For the former, you must set a Perplexity API key as an environment variable as described above. The default provider is duckduckgo.
+ It seems that you've generated an image. If you have any questions or need assistance related to this image, please let me know how I can help!
+ ```
+ and then the associated image:
+ ![gdp](test_data/r8ss9a.PNG)
 
- NOTE: While Google is an available search engine, changes they implemented in early 2025 have made the Python google search package less reliable. For now, we will use duckduckgo and revisit this issue when other more critical aspects are handled.
 
 
- ```npcsh
- npcsh!> /search -p duckduckgo who is the current us president
 
 
- President Donald J. Trump entered office on January 20, 2025. News, issues, and photos of the President Footer Disclaimer This is the official website of the U.S. Mission to the United Nations. External links to other Internet sites should not be construed as an endorsement of the views or privacy policies contained therein.
 
- Citation: https://usun.usmission.gov/our-leaders/the-president-of-the-united-states/
- 45th & 47th President of the United States After a landslide election victory in 2024, President Donald J. Trump is returning to the White House to build upon his previous successes and use his mandate to reject the extremist policies of the radical left while providing tangible quality of life improvements for the American people. Vice President of the United States In 2024, President Donald J. Trump extended JD the incredible honor of asking him to serve as the Vice-Presidential Nominee for th...
- Citation: https://www.whitehouse.gov/administration/
- Citation: https://www.instagram.com/potus/?hl=en
- The president of the United States (POTUS)[B] is the head of state and head of government of the United States. The president directs the executive branch of the federal government and is the commander-in-chief of the United States Armed Forces. The power of the presidency has grown substantially[12] since the first president, George Washington, took office in 1789.[6] While presidential power has ebbed and flowed over time, the presidency has played an increasingly significant role in American ...
- Citation: https://en.wikipedia.org/wiki/President_of_the_United_States
- Citation Links: https://usun.usmission.gov/our-leaders/the-president-of-the-united-states/
- https://www.whitehouse.gov/administration/
- https://www.instagram.com/potus/?hl=en
- https://en.wikipedia.org/wiki/President_of_the_United_States
- ```
+ ### Piping outputs
+ An important facet that makes `npcsh` so powerful is the ability to pipe outputs from one tool call to another. This allows for the chaining of commands and the creation of complex workflows. For example, you can use the output of a search to generate an image, or you can use the output of an image analysis to generate a report. Here is an example of how this might look in practice:
+ ```npcsh
+ npcsh> what is the gdp of russia in 2024? | /vixynt 'generate an image that contains {0}'
+ ```
 
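The `{0}` placeholder above receives the previous command's output. How npcsh performs this substitution internally is an assumption; the convention itself can be sketched with Python's `str.format` positional placeholders:

```python
def fill_template(template: str, piped_output: str) -> str:
    """Substitute piped text into a `{0}` placeholder, str.format-style.

    This only illustrates the placeholder convention shown in the example;
    npcsh's actual substitution mechanism is assumed, not documented here.
    """
    return template.format(piped_output)

# Hypothetical pipeline step: the first command's answer fills `{0}`.
prompt = fill_template(
    "generate an image that contains {0}",
    "Russia's estimated 2024 GDP",
)
```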
+ ### Executing Bash Commands
+ You can execute bash commands directly within npcsh. The LLM can also generate and execute bash commands based on your natural language requests.
+ For example:
 ```npcsh
- npcsh> /search -p perplexity who is the current us president
- The current President of the United States is Donald Trump, who assumed office on January 20, 2025, for his second non-consecutive term as the 47th president[1].
+ npcsh> ls -l
+
+ npcsh> cp file1.txt file2.txt
+ npcsh> mv file1.txt file2.txt
+ npcsh> mkdir new_directory
+ npcsh> git status
+ npcsh> vim file.txt
 
- Citation Links: ['https://en.wikipedia.org/wiki/List_of_presidents_of_the_United_States',
- 'https://en.wikipedia.org/wiki/Joe_Biden',
- 'https://www.britannica.com/topic/Presidents-of-the-United-States-1846696',
- 'https://news.gallup.com/poll/329384/presidential-approval-ratings-joe-biden.aspx',
- 'https://www.usa.gov/presidents']
 ```
 
- Bash:
+ ### NPC CLI
+ When npcsh is installed, it comes with the `npc` CLI as well. The `npc` CLI has various commands that make initializing and serving NPC projects easier.
 
+ Users can make queries like so:
 ```bash
- (npcsh) caug@pop-os:~/npcww/npcsh$ npc search 'simon bolivar' -sp perplexity
+ $ npc 'whats the biggest filei n my computer'
 Loaded .env file from /home/caug/npcww/npcsh
- urls ['https://en.wikipedia.org/wiki/Sim%C3%B3n_Bol%C3%ADvar', 'https://www.britannica.com/biography/Simon-Bolivar', 'https://en.wikipedia.org/wiki/File:Sim%C3%B3n_Bol%C3%ADvar_2.jpg', 'https://www.historytoday.com/archive/simon-bolivar-and-spanish-revolutions', 'https://kids.britannica.com/kids/article/Sim%C3%B3n-Bol%C3%ADvar/352872']
- openai
- - Simón José Antonio de la Santísima Trinidad Bolívar Palacios Ponte y Blanco[c] (24 July 1783 – 17 December 1830) was a Venezuelan statesman and military officer who led what are currently the countries of Colombia, Venezuela, Ecuador, Peru, Panama, and Bolivia to independence from the Spanish Empire. He is known colloquially as El Libertador, or the Liberator of America. Simón Bolívar was born in Caracas in the Captaincy General of Venezuela into a wealthy family of American-born Spaniards (crio...
- Citation: https://en.wikipedia.org/wiki/Sim%C3%B3n_Bol%C3%ADvar
+ action chosen: request_input
+ explanation given: The user needs to provide more context about their operating system or specify which directory to search for the biggest file.
 
+ Additional input needed: The user did not specify their operating system or the directory to search for the biggest file, making it unclear how to execute the command.
+ Please specify your operating system (e.g., Windows, macOS, Linux) and the directory you want to search in.: linux and root
+ action chosen: execute_command
+ explanation given: The user is asking for the biggest file on their computer, which can be accomplished with a simple bash command that searches for the largest files.
+ sibiji generating command
+ LLM suggests the following bash command: sudo find / -type f -exec du -h {} + | sort -rh | head -n 1
+ Running command: sudo find / -type f -exec du -h {} + | sort -rh | head -n 1
+ Command executed with output: 11G /home/caug/.cache/huggingface/hub/models--state-spaces--mamba-2.8b/blobs/39911a8470a2b256016b57cc71c68e0f96751cba5b229216ab1f4f9d82096a46
 
+ I ran a command on your Linux system that searches for the largest files on your computer. The command `sudo find / -type f -exec du -h {} + | sort -rh | head -n 1` performs the following steps:
 
- Our editors will review what you’ve submitted and determine whether to revise the article. Simón Bolívar was a Venezuelan soldier and statesman who played a central role in the South American independence movement. Bolívar served as president of Gran Colombia (1819–30) and as dictator of Peru (1823–26). The country of Bolivia is named for him. Simón Bolívar was born on July 24, 1783, in Caracas, Venezuela. Neither Bolívar’s aristocrat father nor his mother lived to see his 10th birthday. Bolívar...
- Citation: https://www.britannica.com/biography/Simon-Bolivar
+ 1. **Find Command**: It searches for all files (`-type f`) starting from the root directory (`/`).
+ 2. **Disk Usage**: For each file found, it calculates its disk usage in a human-readable format (`du -h`).
+ 3. **Sort**: It sorts the results in reverse order based on size (`sort -rh`), so the largest files appear first.
+ 4. **Head**: Finally, it retrieves just the largest file using `head -n 1`.
 
+ The output indicates that the biggest file on your system is located at `/home/caug/.cache/huggingface/hub/models--state-spaces--mamba-2.8b/blobs/39911a8470a2b256016b57cc71c68e0f96751cba5b229216ab1f4f9d82096a46` and is 11GB in size.
 
+ ```
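The four `find`/`du`/`sort`/`head` steps explained in the transcript above can be mirrored with a recursive walk in Python (an illustrative sketch, not npcsh code; unlike the `sudo find` pipeline it only sees files the current user can stat):

```python
import os

def biggest_file_recursive(root="."):
    """Walk `root` and return (path, size) of the largest readable file found."""
    best = (None, -1)
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                size = os.path.getsize(path)
            except OSError:
                continue  # skip unreadable or vanished files
            if size > best[1]:
                best = (path, size)
    return best
```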
 
818
- Original file (1,525 × 1,990 pixels, file size: 3.02 MB, MIME type: image/jpeg) Derivative works of this file: Simón Bolívar 5.jpg This work is in the public domain in its country of origin and other countries and areas where the copyright term is the author's life plus 100 years or fewer. This work is in the public domain in the United States because it was published (or registered with the U.S. Copyright Office) before January 1, 1930. https://creativecommons.org/publicdomain/mark/1.0/PDMCreat...
819
- Citation: https://en.wikipedia.org/wiki/File:Sim%C3%B3n_Bol%C3%ADvar_2.jpg
820
-
821
-
822
-
823
- SubscriptionOffers Give a Gift Subscribe A map of Gran Colombia showing the 12 departments created in 1824 and territories disputed with neighboring countries. What role did Simon Bolivar play in the history of Latin America's independence from Spain? Simon Bolivar lived a short but comprehensive life. History records his extraordinary versatility. He was a revolutionary who freed six countries, an intellectual who argued the problems of national liberation, a general who fought a war of unremit...
824
- Citation: https://www.historytoday.com/archive/simon-bolivar-and-spanish-revolutions
825
-
859
+ ```bash
860
+ $ npc 'whats the weather in tokyo'
861
+ Loaded .env file from /home/caug/npcww/npcsh
862
+ action chosen: invoke_tool
863
+ explanation given: The user's request for the current weather in Tokyo requires up-to-date information, which can be best obtained through an internet search.
864
+ Tool found: internet_search
865
+ Executing tool with input values: {'query': 'whats the weather in tokyo'}
866
+ QUERY in tool whats the weather in tokyo
867
+ [{'title': 'Tokyo, Tokyo, Japan Weather Forecast | AccuWeather', 'href': 'https://www.accuweather.com/en/jp/tokyo/226396/weather-forecast/226396', 'body': 'Tokyo, Tokyo, Japan Weather Forecast, with current conditions, wind, air quality, and what to expect for the next 3 days.'}, {'title': 'Tokyo, Japan 14 day weather forecast - timeanddate.com', 'href': 'https://www.timeanddate.com/weather/japan/tokyo/ext', 'body': 'Tokyo Extended Forecast with high and low temperatures. °F. Last 2 weeks of weather'}, {'title': 'Tokyo, Tokyo, Japan Current Weather | AccuWeather', 'href': 'https://www.accuweather.com/en/jp/tokyo/226396/current-weather/226396', 'body': 'Current weather in Tokyo, Tokyo, Japan. Check current conditions in Tokyo, Tokyo, Japan with radar, hourly, and more.'}, {'title': 'Weather in Tokyo, Japan - timeanddate.com', 'href': 'https://www.timeanddate.com/weather/japan/tokyo', 'body': 'Current weather in Tokyo and forecast for today, tomorrow, and next 14 days'}, {'title': 'Tokyo Weather Forecast Today', 'href': 'https://japanweather.org/tokyo', 'body': "For today's mild weather in Tokyo, with temperatures between 13ºC to 16ºC (55.4ºF to 60.8ºF), consider wearing: - Comfortable jeans or slacks - Sun hat (if spending time outdoors) - Lightweight sweater or cardigan - Long-sleeve shirt or blouse. Temperature. Day. 14°C. Night. 10°C. Morning. 10°C. Afternoon."}] <class 'list'>
+ RESULTS in tool ["Tokyo, Tokyo, Japan Weather Forecast, with current conditions, wind, air quality, and what to expect for the next 3 days.\n Citation: https://www.accuweather.com/en/jp/tokyo/226396/weather-forecast/226396\n\n\n\nTokyo Extended Forecast with high and low temperatures. °F. Last 2 weeks of weather\n Citation: https://www.timeanddate.com/weather/japan/tokyo/ext\n\n\n\nCurrent weather in Tokyo, Tokyo, Japan. Check current conditions in Tokyo, Tokyo, Japan with radar, hourly, and more.\n Citation: https://www.accuweather.com/en/jp/tokyo/226396/current-weather/226396\n\n\n\nCurrent weather in Tokyo and forecast for today, tomorrow, and next 14 days\n Citation: https://www.timeanddate.com/weather/japan/tokyo\n\n\n\nFor today's mild weather in Tokyo, with temperatures between 13ºC to 16ºC (55.4ºF to 60.8ºF), consider wearing: - Comfortable jeans or slacks - Sun hat (if spending time outdoors) - Lightweight sweater or cardigan - Long-sleeve shirt or blouse. Temperature. Day. 14°C. Night. 10°C. Morning. 10°C. Afternoon.\n Citation: https://japanweather.org/tokyo\n\n\n", 'https://www.accuweather.com/en/jp/tokyo/226396/weather-forecast/226396\n\nhttps://www.timeanddate.com/weather/japan/tokyo/ext\n\nhttps://www.accuweather.com/en/jp/tokyo/226396/current-weather/226396\n\nhttps://www.timeanddate.com/weather/japan/tokyo\n\nhttps://japanweather.org/tokyo\n']
+ The current weather in Tokyo, Japan is mild, with temperatures ranging from 13°C to 16°C (approximately 55.4°F to 60.8°F). For today's conditions, it is suggested to wear comfortable jeans or slacks, a lightweight sweater or cardigan, and a long-sleeve shirt or blouse, especially if spending time outdoors. The temperature today is expected to reach a high of 14°C (57.2°F) during the day and a low of 10°C (50°F) at night.
 
+ For more detailed weather information, you can check out the following sources:
+ - [AccuWeather Forecast](https://www.accuweather.com/en/jp/tokyo/226396/weather-forecast/226396)
+ - [Time and Date Extended Forecast](https://www.timeanddate.com/weather/japan/tokyo/ext)
+ - [Current Weather on AccuWeather](https://www.accuweather.com/en/jp/tokyo/226396/current-weather/226396)
+ - [More on Time and Date](https://www.timeanddate.com/weather/japan/tokyo)
+ - [Japan Weather](https://japanweather.org/tokyo)
+ ```
 
- Known as the Liberator, Simón Bolívar led revolutions against Spanish rule in South America. The countries of Venezuela, Colombia, Ecuador, Panama, Peru, and Bolivia all owe their independence largely to him. Bolívar was born on July 24, 1783, in Caracas, New Granada (now in Venezuela). After studying in Europe, he returned to South America and began to fight Spanish rule. Between 1810 and 1814 Venezuela made two failed tries to break free from Spain. After the second defeat, Bolívar fled to Jam...
- Citation: https://kids.britannica.com/kids/article/Sim%C3%B3n-Bol%C3%ADvar/352872
 
+ ### Serving
+ To serve an NPC project, first install redis-server and start it:
 
+ on Ubuntu:
+ ```bash
+ sudo apt update && sudo apt install redis-server
+ redis-server
+ ```
 
- - https://en.wikipedia.org/wiki/Sim%C3%B3n_Bol%C3%ADvar
+ on macOS:
+ ```bash
+ brew install redis
+ redis-server
+ ```
+ Then navigate to the project directory and run:
 
- https://www.britannica.com/biography/Simon-Bolivar
+ ```bash
+ npc serve
+ ```
+ If you want to specify a certain port, you can do so with the `-p` flag:
+ ```bash
+ npc serve -p 5337
+ ```
+ or with the `--port` flag:
+ ```bash
+ npc serve --port 5337
 
- https://en.wikipedia.org/wiki/File:Sim%C3%B3n_Bol%C3%ADvar_2.jpg
+ ```
+ If you want to initialize a project based on templates and then make it available for serving, you can do so like this:
+ ```bash
+ npc serve -t 'sales, marketing' -ctx 'im developing a team that will focus on sales and marketing within the logging industry. I need a team that can help me with the following: - generate leads - create marketing campaigns - build a sales funnel - close deals - manage customer relationships - manage sales pipeline - manage marketing campaigns - manage marketing budget' -m llama3.2 -pr ollama
+ ```
+ This will use the specified model and provider to generate a team of NPCs to fit the templates and context provided.
 
- https://www.historytoday.com/archive/simon-bolivar-and-spanish-revolutions
 
- https://kids.britannica.com/kids/article/Sim%C3%B3n-Bol%C3%ADvar/352872
- ```
+ Once the server is up and running, you can access the API endpoints at `http://localhost:5337/api/`. Here are some example curl commands to test the endpoints:
 
 ```bash
- npc search 'snipers on the roof indiana university' -sp duckduckgo
- ```
+ echo "Testing health endpoint..."
+ curl -s http://localhost:5337/api/health | jq '.'
 
+ echo -e "\nTesting execute endpoint..."
+ curl -s -X POST http://localhost:5337/api/execute \
+ -H "Content-Type: application/json" \
+ -d '{"commandstr": "hello world", "currentPath": "~/", "conversationId": "test124"}' | jq '.'
 
- ### Set: Changing defaults from within npcsh
- Users can change the default model and provider from within npcsh by using the following commands:
- ```npcsh
- npcsh> /set model llama3.2
- npcsh> /set provider ollama
+ echo -e "\nTesting conversations endpoint..."
+ curl -s "http://localhost:5337/api/conversations?path=/tmp" | jq '.'
+
+ echo -e "\nTesting conversation messages endpoint..."
+ curl -s http://localhost:5337/api/conversation/test123/messages | jq '.'
 ```
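The curl calls above can also be issued from Python's standard library. Below is a sketch of building the same `POST /api/execute` request with `urllib` (the payload shape mirrors the curl example; `build_execute_request` and `call` are hypothetical helpers, and `call` of course requires a running server):

```python
import json
import urllib.request

BASE = "http://localhost:5337/api"  # default `npc serve` port

def build_execute_request(commandstr, current_path="~/",
                          conversation_id="test124", base=BASE):
    """Build a POST /api/execute request mirroring the curl example."""
    payload = json.dumps({
        "commandstr": commandstr,
        "currentPath": current_path,
        "conversationId": conversation_id,
    }).encode("utf-8")
    return urllib.request.Request(
        f"{base}/execute",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def call(request):
    """Send a prepared request and decode the JSON response."""
    with urllib.request.urlopen(request) as resp:
        return json.load(resp)
```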
 
+ ###
 
- ### Sleep: a method for creating and updating a knowledge graph (under construction)
 
- Use the `/sleep` macro to create or update a knowledge graph. A knowledge graph is a structured representation of facts about you as a user that the NPCs can determine based on the conversations you have had with them.
- ```npcsh
- npcsh> /sleep
- ```
+ * **Planned:**
+   - npc scripts
+   - npc run select +sql_model <run up>
+   - npc run select +sql_model+ <run up and down>
+   - npc run select sql_model+ <run down>
+   - npc run line <assembly_line>
+   - npc conjure fabrication_plan.fab
 
- ### breathe: a method for condensing context on a regular cadence (# messages, len(context), etc.) (under construction)
- Every 10 messages or 7,500 characters, condense the conversation into lessons learned. The lessons learned are written down by the NPC for the day, and the NPC will then see the lessons it has learned that day in that folder as part of its context.
 
 
+ ## Macros
 
- ### Spool
- Spool mode lets you enter a conversation with a specific LLM or a specific NPC. These interactions are kept distinct from those in the base shell and are contained separately.
+ While npcsh can decide the best option to use based on the user's input, the user can also execute certain actions with a macro. Macros are commands within the NPC shell that start with a forward slash (/) and are followed (in some cases) by the relevant arguments for those macros. Each macro is also available as a sub-program within the NPC CLI. In the following examples we demonstrate how to carry out the same operations from within npcsh and from a regular shell.
 
 
- Start the spool mode:
+ To learn about the available macros from within the shell, type:
 ```npcsh
- npcsh> /spool
+ npcsh> /help
 ```
- Start the spool mode with a specific NPC:
 
- ```npcsh
- npcsh> /spool npc=foreman
+ or from bash:
+ ```bash
+ npc --help
+ # alternatively
+ npc -h
 ```
 
- Start the spool mode with specific files in context that will be referenced through RAG searches when relevant.
-
+ To exit the shell:
 ```npcsh
- npcsh> /spool files=[*.py,*.md] # Load specific files for context
+ npcsh> /exit
 ```
 
- Have a conversation and switch between text and voice mode by invoking `/whisper` mode from within spool mode.
- ```npcsh
- spool> what can you tell me about green bull from one piece?
894
-
895
- Green Bull, also known as Ryokugyu, is a character from the popular anime and manga series One Piece. He is one of the Marine Admirals and was introduced during the Wano Country arc. Here are some key points about Green
896
- Bull:
897
- 1 Real Name: His real name is Aramaki, but he is primarily known by his title, Green Bull (Ryokugyu in Japanese).
898
- 2 Appearance: Green Bull has a distinct look characterized by a green military uniform, a long cape, and a
899
- noticeable plant-based theme, which aligns with his powers.
900
- 3 Devil Fruit: Green Bull is a user of the Mori Mori no Mi (Forest Forest Fruit), a Logia-type Devil Fruit that
901
- allows him to generate and manipulate plant matter. This ability can be seen in his combat style, which utilizes
902
- plants to entrap and attack opponents.
903
- 4 Role as an Admiral: As an Admiral, Green Bull holds a high-ranking position in the Marines and is
904
- known for his strength and authority. He is fiercely loyal to the Marine organization and its goals.
905
- 5 Personality: Green Bull displays a laid-back attitude and often appears more relaxed compared to other Admirals.
906
- His personality can be somewhat ambiguous, which adds depth to his character.
907
- 6 Involvement in the Story: Green Bull's introduction brings anticipation regarding his potential confrontations with
908
- key characters in the series. His role in major arcs like Wano and his interactions with the Straw Hat Pirates and
909
- other significant figures are crucial for the unfolding storyline.
910
- Overall, Green Bull is an intriguing character with unique abilities and a significant role within the One Piece universe.
911
-
912
- spool> /whisper
913
-
914
- Calibrating silence level. Please remain quiet...
915
- Silence threshold set to: 5679.193548387097
916
- Listening... (speak now)
917
- ...............
918
- Max duration reached.
919
-
920
- Processing...
921
- You said: What do you think his greatest weakness is?
922
-
923
-
924
- Based on what we know about Green Bull from the One Piece series, I can identify a few potential weaknesses:
925
-
926
- 1. **Extreme Laziness**:
927
- - Green Bull is known for being incredibly lazy, often refusing to move or exert himself unnecessarily.
928
- - This laziness could be exploited by opponents who understand how to manipulate or provoke him into action.
967
+ Otherwise, here are some more detailed examples of macros that can be used in npcsh:
968
+ ### Conjure (under construction)
969
+ Use the `/conjure` macro to generate an NPC, an NPC tool, an assembly line, a job, or an SQL model.
929
970
 
930
- 2. **Dependency on External Nutrition**:
931
- - His Devil Fruit power allows him to absorb nutrients from the environment, which suggests he might become weakened in environments with limited plant life or nutrients.
932
- - In extremely barren or non-vegetative settings, his regenerative and plant-based abilities might be significantly reduced.
971
+ ```bash
972
+ npc conjure -n name -t 'templates'
973
+ ```
933
974
 
934
- 3. **Pride and Marine Ideology**:
935
- - Like many Marines, he likely has a rigid sense of justice that could be used against him strategically.
936
- - His commitment to Marine principles might make him predictable in certain confrontational scenarios.
937
975
 
938
- 4. **Potential Overconfidence**:
939
- - As an Admiral, he might underestimate opponents, especially pirates, due to his high-ranking status.
940
- - His laid-back nature might lead him to not take threats seriously until it's too late.
976
+ ### Data Interaction and Analysis (under construction)
941
977
 
942
- The most pronounced weakness seems to be his extreme laziness, which could potentially be exploited tactically by skilled opponents who understand how to force him into action or create scenarios that challenge his passive nature.
943
978
 
944
- Here are some additional details about Green Bull (Aramaki) in "One Piece":
979
+ ### Debate (under construction)
980
+ Use the `/debate` macro to have two or more NPCs debate a topic, problem, or question.
945
981
 
946
- 1 Devil Fruit Name:
982
+ For example:
983
+ ```npcsh
984
+ npcsh> /debate Should humans colonize Mars? npcs = ['sibiji', 'mark', 'ted']
985
+ ```
947
986
 
948
- • Green Bull's Devil Fruit has been confirmed to be the Mori Mori no Mi, which is a mythical Zoan-type fruit. This allows him to utilize various forms of plant life in combat and has regenerative
949
- capabilities.
950
987
 
951
- 2 Combat Abilities:
952
988
 
953
- • His ability to control vegetation gives him significant advantages in battle. He can generate plants to attack or defend and possibly use them for tactical advantages, such as creating barriers
954
- or entangling enemies.
989
+ ### Notes
990
+ Jot down notes and store them within the npcsh database and in the current directory as a text file.
991
+ ```npcsh
992
+ npcsh> /notes
993
+ ```
955
994
 
956
- 3 Reputation:
957
995
 
958
- • As an Admiral, Green Bull holds a position of great power and authority within the Marines. His reputation is likely formidable among pirates and other opponents.
996
+ ### Over-the-shoulder: Screenshots and image analysis
959
997
 
960
- 4 Interactions with Other Characters:
998
+ Use the `/ots` macro to take a screenshot and write a prompt for an LLM to answer questions about it.
999
+ ```npcsh
1000
+ npcsh> /ots
961
1001
 
962
- Green Bull's character dynamics with others in the series, particularly with fellow Marines and pirates, can provide insight into his personality and values. His interactions during missions or
963
- discussions about justice will reveal more about his character.
1002
+ Screenshot saved to: /home/caug/.npcsh/screenshots/screenshot_1735015011.png
964
1003
 
965
- 5 Appearance and Style:
1004
+ Enter a prompt for the LLM about this image (or press Enter to skip): describe whats in this image
966
1005
 
967
- He has a rather unique aesthetic, characterized by his green clothing that symbolizes his connection to nature. His overall appearance contributes to his identity as a plant-based fighter.
1006
+ The image displays a source control graph, likely from a version control system like Git. It features a series of commits represented by colored dots connected by lines, illustrating the project's development history. Each commit message provides a brief description of the changes made, including tasks like fixing issues, merging pull requests, updating README files, and adjusting code or documentation. Notably, several commits mention specific users, particularly "Chris Agostino," indicating collaboration and contributions to the project. The graph visually represents the branching and merging of code changes.
1007
+ ```
968
1008
 
969
- 6 Backstory:
1009
+ In bash:
1010
+ ```bash
1011
+ npc ots
1012
+ ```
970
1013
 
971
- • As of now, specific details about his backstory and how he rose to the rank of Admiral are not extensively explored in the series. This leaves room for further character development and
972
- background to be unveiled in future chapters.
973
1014
 
974
- 7 Ambiguous Personality:
975
1015
 
976
- • While his laziness and laid-back demeanor are evident, it is possible that there are deeper layers to his character that might be revealed through his actions and motivations within the
977
- overarching narrative of "One Piece."
1016
+ Alternatively, pass in an existing image like so:
1017
+ ```npcsh
1018
+ npcsh> /ots test_data/catfight.PNG
1019
+ Enter a prompt for the LLM about this image (or press Enter to skip): whats in this ?
978
1020
 
979
- 8 Role in the Marine Organization:
1021
+ The image features two cats, one calico and one orange tabby, playing with traditional Japanese-style toys. They are each holding sticks attached to colorful pom-pom balls, which resemble birds. The background includes stylized waves and a red sun, accentuating a vibrant, artistic style reminiscent of classic Japanese art. The playful interaction between the cats evokes a lively, whimsical scene.
1022
+ ```
980
1023
 
981
- • His position as Admiral places him in direct opposition to the main pirate characters, particularly the Straw Hat crew, making him a significant figure in the ongoing conflict between pirates
982
- and the Marines.
983
- As the story continues to develop, Green Bull's character may evolve and reveal more complexities, weaknesses, and relationships within the world of "One Piece."
1024
+ ```bash
1025
+ npc ots -f test_data/catfight.PNG
984
1026
  ```
985
1027
 
986
1028
 
987
- Start the spool with a specific llm model:
1029
+ ### Plan : Schedule tasks to be run at regular intervals (under construction)
1030
+ Use the `/plan` macro to schedule tasks to be run at regular intervals.
988
1031
  ```npcsh
989
- #note this is not yet implemented
990
- npcsh> /spool model=llama3.3
1032
+ npcsh> /plan run a rag search on the files in the current directory every 5 minutes
991
1033
  ```
992
1034
 
993
1035
  ```bash
994
- npc spool -n npc.npc
1036
+ npc plan -f 30m -t 'task'
995
1037
  ```
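npcsh's scheduler internals are not shown in this README, but the way a `-f 30m`-style frequency flag might be interpreted can be sketched in Python. `parse_interval` and `schedule` below are hypothetical helpers, not npcsh APIs:

```python
import re
import threading

def parse_interval(spec: str) -> int:
    """Convert a frequency spec like '45s', '30m', or '2h' into seconds."""
    match = re.fullmatch(r"(\d+)([smh])", spec.strip())
    if not match:
        raise ValueError(f"unrecognized interval: {spec!r}")
    value, unit = int(match.group(1)), match.group(2)
    return value * {"s": 1, "m": 60, "h": 3600}[unit]

def schedule(spec: str, task) -> threading.Timer:
    """Run `task` every `spec` interval using a re-arming timer."""
    def fire():
        task()
        schedule(spec, task)  # re-arm for the next run
    timer = threading.Timer(parse_interval(spec), fire)
    timer.daemon = True  # don't block interpreter shutdown
    timer.start()
    return timer

print(parse_interval("30m"))  # 1800
```

A real scheduler would also need to persist tasks across shell restarts, which a timer alone does not do.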
996
1038
 
1039
+ ### Plonk : Computer Control
1040
+ Use the `/plonk` macro to allow the LLM to control your computer.
1041
+ ```npcsh
1042
+ npcsh> /plonk go to a web browser and go to wikipedia and find out information about simon bolivar
1043
+ ```
997
1044
 
1045
+ ```bash
1046
+ npc plonk 'use a web browser to find out information about simon bolivar'
1047
+ ```
998
1048
 
999
- ### Vixynt: Image Generation
1000
- Image generation can be done with the /vixynt macro.
1049
+ ### RAG
1001
1050
 
1002
- Use /vixynt like so where you can also specify the model to use with an @ reference. This @ reference will override the default model in ~/.npcshrc.
1051
+ Use the `/rag` macro to perform a local RAG search.
1052
+ If you pass a `-f` flag with a filename or a glob of filenames (e.g. `*.py`), it will embed the documents and score them against your query with cosine similarity.
1003
1053
 
1004
1054
  ```npcsh
1005
- npcsh> /vixynt A futuristic cityscape @dall-e-3
1055
+ npcsh> /rag -f *.py what is the best way to implement a linked list in Python?
1006
1056
  ```
1007
- ![futuristic cityscape](test_data/futuristic_cityscape.PNG)
1008
1057
 
1058
+ Alternatively, if you want to perform RAG on your past conversations, you can do so like this:
1009
1059
  ```npcsh
1010
- npcsh> /vixynt A peaceful landscape @runwayml/stable-diffusion-v1-5
1060
+ npcsh> /rag what is the best way to implement a linked list in Python?
1011
1061
  ```
1012
- ![peaceful landscape](test_data/peaceful_landscape_stable_diff.png)
1062
+ and it will automatically look through the conversations recorded in `~/npcsh_history.db`.
1013
1063
 
1014
1064
 
1015
- Similarly, use vixynt with the NPC CLI from a regular shell:
1065
+ In bash:
1016
1066
  ```bash
1017
- $ npc --model 'dall-e-2' --provider 'openai' vixynt 'whats a french man to do in the southern bayeaux'
1067
+ npc rag -f *.py
1018
1068
  ```
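The embedding model npcsh uses for this is not specified here, but the cosine-similarity scoring step can be illustrated with a toy bag-of-words "embedding". The document names below are made up; a real setup would use a proper embedding model, though the scoring math is the same:

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy bag-of-words 'embedding'; a real RAG setup would use an embedding model."""
    return Counter(text.lower().split())

def cosine_similarity(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

docs = {
    "linked_list.py": "node class for a singly linked list implementation",
    "readme.md": "installation instructions and usage notes",
}
query = embed("linked list implementation")
ranked = sorted(docs, key=lambda name: cosine_similarity(query, embed(docs[name])), reverse=True)
print(ranked[0])  # linked_list.py
```

The documents with the highest scores are then handed to the LLM as context for answering the question.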
1019
1069
 
1070
+ ### Rehash
1020
1071
 
1021
-
1022
-
1023
- ### Whisper: Voice Control
1024
- Enter into a voice-controlled mode to interact with the LLM. This mode can execute commands and use tools just like the basic npcsh shell.
1072
+ Use the `/rehash` macro to re-send the last message to the LLM.
1025
1073
  ```npcsh
1026
- npcsh> /whisper
1074
+ npcsh> /rehash
1027
1075
  ```
1028
1076
 
1029
-
1030
-
1031
-
1032
- ### Compilation and NPC Interaction
1033
- Compile a specified NPC profile. This will make it available for use in npcsh interactions.
1077
+ ### Sample
1078
+ Send a one-shot question to the LLM.
1034
1079
  ```npcsh
1035
- npcsh> /compile <npc_file>
1080
+ npcsh> /sample What is the capital of France?
1036
1081
  ```
1037
- You can also use `/com` as an alias for `/compile`. If no NPC file is specified, all NPCs in the npc_team directory will be compiled.
1038
1082
 
1039
- Begin a conversation with a specified NPC by referencing their name:
1040
- ```npcsh
1041
- npcsh> /<npc_name>:
1083
+ Bash:
1084
+ ```bash
1085
+ npc sample 'thing' -m model -p provider
1086
+
1042
1087
  ```
1043
1088
 
1044
1089
 
1090
+ ### Search
1091
+ Search can be performed with the `/search` macro. You can specify the provider as either "perplexity" or "duckduckgo". For the former,
1092
+ you must set a Perplexity API key as an environment variable, as described above. The default provider is duckduckgo.
1045
1093
 
1046
- ## NPC Data Layer
1094
+ NOTE: while Google is an available search engine, changes they implemented in early 2025 have made the Python google search package less reliable.
1095
+ For now, we default to duckduckgo and will revisit this once other, more critical aspects are handled.
1047
1096
 
1048
- What principally powers the capabilities of npcsh is the NPC Data Layer. In the `~/.npcsh/` directory after installation, you will find
1049
- the npc team with its tools, models, contexts, assembly lines, and NPCs. By making tools, NPCs, contexts, and assembly lines simple data structures with
1050
- a fixed set of parameters, we can let users define them in easy-to-read YAML files, allowing for a modular and extensible system that can be easily modified and expanded upon. Furthermore, this data layer relies heavily on jinja templating to allow for dynamic content generation and the ability to reference other NPCs, tools, and assembly lines in the system.
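The templated references can be pictured with a tiny stand-in renderer. This is only a sketch of the idea, not npcsh's actual jinja environment, and the lookup tables below are hypothetical:

```python
import re

# Hypothetical registries standing in for compiled NPCs and database tables.
npcs = {"email_assistant": "<npc email_assistant>"}
tables = {"emails": "[rows from the emails table]"}

def render(template: str) -> str:
    """Resolve {{ ref('name') }} and {{ source('table') }} style references."""
    def resolve(match):
        func, arg = match.group(1), match.group(2)
        return npcs[arg] if func == "ref" else tables[arg]
    return re.sub(r"\{\{\s*(ref|source)\('([^']+)'\)\s*\}\}", resolve, template)

task = "Get me up to speed on my recent emails: {{ source('emails') }}."
print(render(task))
```

In the real system, jinja resolves these references against the compiled NPC team and the npcsh database at run time.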
1051
1097
 
1052
- ### Creating NPCs
1053
- NPCs are defined in YAML files within the npc_team directory. Each NPC must have a name and a primary directive. Optionally, one can specify an LLM model/provider for the NPC as well as provide an explicit list of tools and whether or not to use the globally available tools. See the data models contained in `npcsh/data_models.py` for more explicit type details on the NPC data structure.
1098
+ ```npcsh
1099
+ npcsh!> /search -p duckduckgo who is the current us president
1054
1100
 
1055
1101
 
1102
+ President Donald J. Trump entered office on January 20, 2025. News, issues, and photos of the President Footer Disclaimer This is the official website of the U.S. Mission to the United Nations. External links to other Internet sites should not be construed as an endorsement of the views or privacy policies contained therein.
1056
1103
 
1057
- Here is a typical NPC file:
1058
- ```yaml
1059
- name: sibiji
1060
- primary_directive: You are a foundational AI assistant. Your role is to provide basic support and information. Respond to queries concisely and accurately.
1061
- tools:
1062
- - simple data retrieval
1063
- model: llama3.2
1064
- provider: ollama
1104
+ Citation: https://usun.usmission.gov/our-leaders/the-president-of-the-united-states/
1105
+ 45th & 47th President of the United States After a landslide election victory in 2024, President Donald J. Trump is returning to the White House to build upon his previous successes and use his mandate to reject the extremist policies of the radical left while providing tangible quality of life improvements for the American people. Vice President of the United States In 2024, President Donald J. Trump extended JD the incredible honor of asking him to serve as the Vice-Presidential Nominee for th...
1106
+ Citation: https://www.whitehouse.gov/administration/
1107
+ Citation: https://www.instagram.com/potus/?hl=en
1108
+ The president of the United States (POTUS)[B] is the head of state and head of government of the United States. The president directs the executive branch of the federal government and is the commander-in-chief of the United States Armed Forces. The power of the presidency has grown substantially[12] since the first president, George Washington, took office in 1789.[6] While presidential power has ebbed and flowed over time, the presidency has played an increasingly significant role in American ...
1109
+ Citation: https://en.wikipedia.org/wiki/President_of_the_United_States
1110
+ Citation Links: https://usun.usmission.gov/our-leaders/the-president-of-the-united-states/
1111
+ https://www.whitehouse.gov/administration/
1112
+ https://www.instagram.com/potus/?hl=en
1113
+ https://en.wikipedia.org/wiki/President_of_the_United_States
1065
1114
  ```
1066
1115
 
1067
1116
 
1068
- ## Creating Tools
1069
- Tools are defined as YAMLs with the `.tool` extension within the npc_team/tools directory. Each tool has a name, inputs, and three distinct steps: preprocess, prompt, and postprocess. The idea is that a tool first preprocesses information, passes it to a prompt for some kind of analysis, and can then pass the result on to a postprocessing stage. In each of these three steps, an engine must be specified: either "natural" for natural language processing or "python" for Python code. The code is the actual code that will be executed.
1117
+ ```npcsh
1118
+ npcsh> /search -p perplexity who is the current us president
1119
+ The current President of the United States is Donald Trump, who assumed office on January 20, 2025, for his second non-consecutive term as the 47th president[1].
1070
1120
 
1071
- Here is an example of a tool file:
1072
- ```yaml
1073
- tool_name: "screen_capture_analysis_tool"
1074
- inputs:
1075
- - "prompt"
1076
- preprocess:
1077
- - engine: "python"
1078
- code: |
1079
- # Capture the screen
1080
- import pyautogui
1081
- import datetime
1082
- import os
1083
- from PIL import Image
1084
- from npcsh.image import analyze_image_base
1085
-
1086
- # Generate filename
1087
- filename = f"screenshot_{datetime.datetime.now().strftime('%Y%m%d_%H%M%S')}.png"
1088
- screenshot = pyautogui.screenshot()
1089
- screenshot.save(filename)
1090
- print(f"Screenshot saved as {filename}")
1091
-
1092
- # Load image
1093
- image = Image.open(filename)
1094
-
1095
- # Full file path
1096
- file_path = os.path.abspath('./'+filename)
1097
- # Analyze the image
1098
-
1099
- llm_output = analyze_image_base(inputs['prompt']+ '\n\n attached is a screenshot of my screen currently.', file_path, filename, npc=npc)
1100
- prompt:
1101
- engine: "natural"
1102
- code: ""
1103
- postprocess:
1104
- - engine: "natural"
1105
- code: |
1106
- Screenshot captured and saved as {{ filename }}.
1107
- Analysis Result: {{ llm_output }}
1121
+ Citation Links: ['https://en.wikipedia.org/wiki/List_of_presidents_of_the_United_States',
1122
+ 'https://en.wikipedia.org/wiki/Joe_Biden',
1123
+ 'https://www.britannica.com/topic/Presidents-of-the-United-States-1846696',
1124
+ 'https://news.gallup.com/poll/329384/presidential-approval-ratings-joe-biden.aspx',
1125
+ 'https://www.usa.gov/presidents']
1108
1126
  ```
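The preprocess/prompt/postprocess flow can be sketched as follows. This is not npcsh's actual engine: `call_llm` is a stand-in for the real model call, and single-brace `str.format` stands in for the jinja rendering npcsh performs:

```python
def call_llm(prompt: str) -> str:
    """Stand-in for the real LLM call."""
    return f"LLM response to: {prompt}"

def run_tool(tool: dict, inputs: dict) -> str:
    context = {"inputs": inputs}
    # preprocess: "python" steps run code against a shared context
    for step in tool.get("preprocess", []):
        if step["engine"] == "python":
            exec(step["code"], context)
    # prompt: the "natural" step renders text and sends it to the LLM
    rendered = tool["prompt"]["code"].format(**context)
    context["llm_output"] = call_llm(rendered) if rendered else ""
    # postprocess: render the final output from the accumulated context
    return "".join(step["code"].format(**context) for step in tool.get("postprocess", []))

tool = {
    "preprocess": [{"engine": "python", "code": "summary = inputs['prompt'].upper()"}],
    "prompt": {"engine": "natural", "code": "Analyze: {summary}"},
    "postprocess": [{"engine": "natural", "code": "Result: {llm_output}"}],
}
print(run_tool(tool, {"prompt": "hello"}))  # Result: LLM response to: Analyze: HELLO
```

The key design point is the shared context: each stage reads the variables the previous stage produced.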
1109
1127
 
1128
+ Bash:
1110
1129
 
1111
- When you have created a tool, it will be surfaced as a potential option to be used when you ask a question in the base npcsh shell. The LLM will decide if it is the best tool to use based on the user's input. Alternatively, if you'd like, you can call the tools directly, without needing to let the AI decide if it's the right one to use.
1112
-
1113
- ```npcsh
1114
- npcsh> /screen_cap_tool <prompt>
1115
- ```
1116
- or
1117
- ```npcsh
1118
- npcsh> /sql_executor select * from conversation_history limit 1
1119
-
1120
- ```
1121
- or
1122
- ```npcsh
1123
- npcsh> /calculator 5+6
1124
- ```
1125
-
1130
+ ```bash
1131
+ (npcsh) caug@pop-os:~/npcww/npcsh$ npc search 'simon bolivar' -sp perplexity
1132
+ Loaded .env file from /home/caug/npcww/npcsh
1133
+ urls ['https://en.wikipedia.org/wiki/Sim%C3%B3n_Bol%C3%ADvar', 'https://www.britannica.com/biography/Simon-Bolivar', 'https://en.wikipedia.org/wiki/File:Sim%C3%B3n_Bol%C3%ADvar_2.jpg', 'https://www.historytoday.com/archive/simon-bolivar-and-spanish-revolutions', 'https://kids.britannica.com/kids/article/Sim%C3%B3n-Bol%C3%ADvar/352872']
1134
+ openai
1135
+ - Simón José Antonio de la Santísima Trinidad Bolívar Palacios Ponte y Blanco[c] (24 July 1783 – 17 December 1830) was a Venezuelan statesman and military officer who led what are currently the countries of Colombia, Venezuela, Ecuador, Peru, Panama, and Bolivia to independence from the Spanish Empire. He is known colloquially as El Libertador, or the Liberator of America. Simón Bolívar was born in Caracas in the Captaincy General of Venezuela into a wealthy family of American-born Spaniards (crio...
1136
+ Citation: https://en.wikipedia.org/wiki/Sim%C3%B3n_Bol%C3%ADvar
1126
1137
 
1127
- ## NPC Pipelines
1128
1138
 
1129
1139
 
1140
+ Our editors will review what you’ve submitted and determine whether to revise the article. Simón Bolívar was a Venezuelan soldier and statesman who played a central role in the South American independence movement. Bolívar served as president of Gran Colombia (1819–30) and as dictator of Peru (1823–26). The country of Bolivia is named for him. Simón Bolívar was born on July 24, 1783, in Caracas, Venezuela. Neither Bolívar’s aristocrat father nor his mother lived to see his 10th birthday. Bolívar...
1141
+ Citation: https://www.britannica.com/biography/Simon-Bolivar
1130
1142
 
1131
- Let's say you want to create a pipeline of steps where NPCs are used along the way. Let's initialize with a pipeline file we'll call `morning_routine.pipe`:
1132
- ```yaml
1133
- steps:
1134
- - step_name: "review_email"
1135
- npc: "{{ ref('email_assistant') }}"
1136
- task: "Get me up to speed on my recent emails: {{source('emails')}}."
1137
1143
 
1138
1144
 
1139
- - step_name: "market_update"
1140
- npc: "{{ ref('market_analyst') }}"
1141
- task: "Give me an update on the latest events in the market: {{source('market_events')}}."
1145
+ Original file (1,525 × 1,990 pixels, file size: 3.02 MB, MIME type: image/jpeg) Derivative works of this file: Simón Bolívar 5.jpg This work is in the public domain in its country of origin and other countries and areas where the copyright term is the author's life plus 100 years or fewer. This work is in the public domain in the United States because it was published (or registered with the U.S. Copyright Office) before January 1, 1930. https://creativecommons.org/publicdomain/mark/1.0/PDMCreat...
1146
+ Citation: https://en.wikipedia.org/wiki/File:Sim%C3%B3n_Bol%C3%ADvar_2.jpg
1142
1147
 
1143
- - step_name: "summarize"
1144
- npc: "{{ ref('sibiji') }}"
1145
- model: llama3.2
1146
- provider: ollama
1147
- task: "Review the outputs from the {{review_email}} and {{market_update}} and provide me with a summary."
1148
1148
 
1149
- ```
1150
- Now you'll see that we reference NPCs in the pipeline file. We'll need to make sure we have each of those NPCs available.
1151
- Here is an example for the email assistant:
1152
- ```yaml
1153
- name: email_assistant
1154
- primary_directive: You are an AI assistant specialized in managing and summarizing emails. You should present the information in a clear and concise manner.
1155
- model: gpt-4o-mini
1156
- provider: openai
1157
- ```
1158
- Now for the market analyst:
1159
- ```yaml
1160
- name: market_analyst
1161
- primary_directive: You are an AI assistant focused on monitoring and analyzing market trends. Provide de
1162
- model: llama3.2
1163
- provider: ollama
1164
- ```
1165
- and then here is our trusty friend sibiji:
1166
- ```yaml
1167
- name: sibiji
1168
- primary_directive: You are a foundational AI assistant. Your role is to provide basic support and information. Respond to queries concisely and accurately.
1169
- suggested_tools_to_use:
1170
- - simple data retrieval
1171
- model: claude-3-5-sonnet-latest
1172
- provider: anthropic
1173
- ```
1174
- Now that we have our pipeline and NPCs defined, we also need to ensure that the source data we reference will be there. When we use `source('market_events')` and `source('emails')`, we are asking npcsh to pull those data directly from tables in our npcsh database. For simplicity, we will create and insert them with a short Python script for this demo:
1175
- ```python
1176
- import pandas as pd
1177
- from sqlalchemy import create_engine
1178
- import os
1179
1149
 
1180
- # Sample market events data
1181
- market_events_data = {
1182
- "datetime": [
1183
- "2023-10-15 09:00:00",
1184
- "2023-10-16 10:30:00",
1185
- "2023-10-17 11:45:00",
1186
- "2023-10-18 13:15:00",
1187
- "2023-10-19 14:30:00",
1188
- ],
1189
- "headline": [
1190
- "Stock Market Rallies Amid Positive Economic Data",
1191
- "Tech Giant Announces New Product Line",
1192
- "Federal Reserve Hints at Interest Rate Pause",
1193
- "Oil Prices Surge Following Supply Concerns",
1194
- "Retail Sector Reports Record Q3 Earnings",
1195
- ],
1196
- }
1150
+ SubscriptionOffers Give a Gift Subscribe A map of Gran Colombia showing the 12 departments created in 1824 and territories disputed with neighboring countries. What role did Simon Bolivar play in the history of Latin America's independence from Spain? Simon Bolivar lived a short but comprehensive life. History records his extraordinary versatility. He was a revolutionary who freed six countries, an intellectual who argued the problems of national liberation, a general who fought a war of unremit...
1151
+ Citation: https://www.historytoday.com/archive/simon-bolivar-and-spanish-revolutions
1197
1152
 
1198
- # Create a DataFrame
1199
- market_events_df = pd.DataFrame(market_events_data)
1200
1153
 
1201
- # Define database path relative to user's home directory
1202
- db_path = os.path.expanduser("~/npcsh_history.db")
1203
1154
 
1204
- # Create a connection to the SQLite database
1205
- engine = create_engine(f"sqlite:///{db_path}")
1206
- with engine.connect() as connection:
1207
- # Write the data to a new table 'market_events', replacing existing data
1208
- market_events_df.to_sql(
1209
- "market_events", con=connection, if_exists="replace", index=False
1210
- )
1155
+ Known as the Liberator, Simón Bolívar led revolutions against Spanish rule in South America. The countries of Venezuela, Colombia, Ecuador, Panama, Peru, and Bolivia all owe their independence largely to him. Bolívar was born on July 24, 1783, in Caracas, New Granada (now in Venezuela). After studying in Europe, he returned to South America and began to fight Spanish rule. Between 1810 and 1814 Venezuela made two failed tries to break free from Spain. After the second defeat, Bolívar fled to Jam...
1156
+ Citation: https://kids.britannica.com/kids/article/Sim%C3%B3n-Bol%C3%ADvar/352872
1211
1157
 
1212
- print("Market events have been added to the database.")
1213
1158
 
1214
- email_data = {
1215
- "datetime": [
1216
- "2023-10-10 10:00:00",
1217
- "2023-10-11 11:00:00",
1218
- "2023-10-12 12:00:00",
1219
- "2023-10-13 13:00:00",
1220
- "2023-10-14 14:00:00",
1221
- ],
1222
- "subject": [
1223
- "Meeting Reminder",
1224
- "Project Update",
1225
- "Invoice Attached",
1226
- "Weekly Report",
1227
- "Holiday Notice",
1228
- ],
1229
- "sender": [
1230
- "alice@example.com",
1231
- "bob@example.com",
1232
- "carol@example.com",
1233
- "dave@example.com",
1234
- "eve@example.com",
1235
- ],
1236
- "recipient": [
1237
- "bob@example.com",
1238
- "carol@example.com",
1239
- "dave@example.com",
1240
- "eve@example.com",
1241
- "alice@example.com",
1242
- ],
1243
- "body": [
1244
- "Don't forget the meeting tomorrow at 10 AM.",
1245
- "The project is progressing well, see attached update.",
1246
- "Please find your invoice attached.",
1247
- "Here is the weekly report.",
1248
- "The office will be closed on holidays, have a great time!",
1249
- ],
1250
- }
1251
1159
 
1252
- # Create a DataFrame
1253
- emails_df = pd.DataFrame(email_data)
1160
+ - https://en.wikipedia.org/wiki/Sim%C3%B3n_Bol%C3%ADvar
1254
1161
 
1255
- # Define database path relative to user's home directory
1256
- db_path = os.path.expanduser("~/npcsh_history.db")
1162
+ https://www.britannica.com/biography/Simon-Bolivar
1257
1163
 
1258
- # Create a connection to the SQLite database
1259
- engine = create_engine(f"sqlite:///{db_path}")
1260
- with engine.connect() as connection:
1261
- # Write the data to a new table 'emails', replacing existing data
1262
- emails_df.to_sql("emails", con=connection, if_exists="replace", index=False)
1164
+ https://en.wikipedia.org/wiki/File:Sim%C3%B3n_Bol%C3%ADvar_2.jpg
1263
1165
 
1264
- print("Sample emails have been added to the database.")
1166
+ https://www.historytoday.com/archive/simon-bolivar-and-spanish-revolutions
1265
1167
 
1168
+ https://kids.britannica.com/kids/article/Sim%C3%B3n-Bol%C3%ADvar/352872
1266
1169
  ```
1267
1170
 
1171
+ ```bash
1172
+ npc search 'snipers on the roof indiana university' -sp duckduckgo
1173
+ ```
1268
1174
 
1269
- With these data now in place, we can proceed with running the pipeline. We can do this in npcsh by using the /compile command.
1270
1175
 
1176
+ ### Set: Changing defaults from within npcsh
1177
+ Users can change the default model and provider from within npcsh by using the following commands:
1178
+ ```npcsh
1179
+ npcsh> /set model llama3.2
1180
+ npcsh> /set provider ollama
1181
+ ```
1271
1182
 
1272
1183
 
1184
+ ### Sleep : a method for creating and updating a knowledge graph (under construction)
1273
1185
 
1186
+ Use the `/sleep` macro to create or update a knowledge graph. A knowledge graph is a structured representation of facts about you as a user that the NPCs can determine based on the conversations you have had with it.
1274
1187
  ```npcsh
1275
- npcsh> /compile morning_routine.pipe
1188
+ npcsh> /sleep
1276
1189
  ```
1277
1190
 
1191
+ ### breathe: a method for condensing context on a regular cadence (# messages, len(context), etc) (under construction)
1192
+ Every 10 messages or 7,500 characters of context, condense the conversation into lessons learned. The NPC writes these lessons down
1193
+ for the day, and then sees the lessons it has learned that day in that folder as part of its context.
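The trigger condition described above can be sketched with a hypothetical helper (not npcsh's implementation):

```python
def should_condense(messages, max_messages=10, max_chars=7500):
    """True once the conversation hits 10 messages or 7,500 characters."""
    total_chars = sum(len(m) for m in messages)
    return len(messages) >= max_messages or total_chars >= max_chars

history = ["hello"] * 9
print(should_condense(history))  # False
history.append("one more message")
print(should_condense(history))  # True
```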
1278
1194
 
1279
1195
 
1280
- Alternatively we can run a pipeline like so in Python:
1281
1196
 
1282
- ```python
1283
- from npcsh.npc_compiler import PipelineRunner
1284
- import os
1197
+ ### Spool
1198
+ Spool mode lets you enter a conversation with a specific LLM or a specific NPC.
1199
+ These interactions are kept distinct from those in the base shell and are contained separately.
1285
1200
 
1286
- pipeline_runner = PipelineRunner(
1287
- pipeline_file="morning_routine.pipe",
1288
- npc_root_dir=os.path.abspath("./"),
1289
- db_path="~/npcsh_history.db",
1290
- )
1291
- pipeline_runner.execute_pipeline()
1292
- ```
1293
1201
 
1294
- What if you wanted to run operations on each row and some operations on all the data at once? We can do this with the pipelines as well. Here we will build a pipeline for news article analysis.
1295
- First we make the data for the pipeline that we'll use:
1296
- ```python
1297
- import pandas as pd
1298
- from sqlalchemy import create_engine
1299
- import os
1202
+ Start the spool mode:
1203
+ ```npcsh
1204
+ npcsh> /spool
1205
+ ```
1206
+ Start the spool mode with a specific NPC:
1300
1207
 
1301
- # Sample data generation for news articles
1302
- news_articles_data = {
1303
- "news_article_id": list(range(1, 21)),
1304
- "headline": [
1305
- "Economy sees unexpected growth in Q4",
1306
- "New tech gadget takes the world by storm",
1307
- "Political debate heats up over new policy",
1308
- "Health concerns rise amid new disease outbreak",
1309
- "Sports team secures victory in last minute",
1310
- "New economic policy introduced by government",
1311
- "Breakthrough in AI technology announced",
1312
- "Political leader delivers speech on reforms",
1313
- "Healthcare systems pushed to limits",
1314
- "Celebrated athlete breaks world record",
1315
- "Controversial economic measures spark debate",
1316
- "Innovative tech startup gains traction",
1317
- "Political scandal shakes administration",
1318
- "Healthcare workers protest for better pay",
1319
- "Major sports event postponed due to weather",
1320
- "Trade tensions impact global economy",
1321
- "Tech company accused of data breach",
1322
- "Election results lead to political upheaval",
1323
- "Vaccine developments offer hope amid pandemic",
1324
- "Sports league announces return to action",
1325
- ],
1326
- "content": ["Article content here..." for _ in range(20)],
1327
- "publication_date": pd.date_range(start="1/1/2023", periods=20, freq="D"),
1328
- }
1208
+ ```npcsh
1209
+ npcsh> /spool npc=foreman
1329
1210
  ```
1330
1211
 
1331
- Then we will create the pipeline file:
1332
- ```yaml
1333
- # news_analysis.pipe
1334
- steps:
1335
- - step_name: "classify_news"
1336
- npc: "{{ ref('news_assistant') }}"
1337
- task: |
1338
- Classify the following news articles into one of the categories:
1339
- ["Politics", "Economy", "Technology", "Sports", "Health"].
1340
- {{ source('news_articles') }}
1212
+ Start the spool mode with specific files in context that will be referenced through RAG searches when relevant.
1341
1213
 
1342
- - step_name: "analyze_news"
1343
- npc: "{{ ref('news_assistant') }}"
1344
- batch_mode: true # Process articles with knowledge of their tags
1345
- task: |
1346
- Based on the category assigned in {{classify_news}}, provide an in-depth
1347
- analysis and perspectives on the article. Consider these aspects:
1348
- ["Impacts", "Market Reaction", "Cultural Significance", "Predictions"].
1349
- {{ source('news_articles') }}
1214
+ ```npcsh
1215
+ npcsh> /spool files=[*.py,*.md] # Load specific files for context
1350
1216
  ```
1351
1217
 
1352
- Then we can run the pipeline like so:
1353
- ```bash
1354
- /compile ./npc_team/news_analysis.pipe
1355
- ```
1356
- or in Python:
1218
+ Have a conversation and switch between text and voice mode by invoking `/whisper` mode from within spool mode.
1219
+ ```npcsh
1220
+ spool> what can you tell me about green bull from one piece?
1357
1221
 
1358
- ```bash
1222
+ Green Bull, also known as Ryokugyu, is a character from the popular anime and manga series One Piece. He is one of the Marine Admirals and was introduced during the Wano Country arc. Here are some key points about Green
1223
+ Bull:
1224
+ 1 Real Name: His real name is Aramaki, but he is primarily known by his title, Green Bull (Ryokugyu in Japanese).
1225
+ 2 Appearance: Green Bull has a distinct look characterized by a green military uniform, a long cape, and a
1226
+ noticeable plant-based theme, which aligns with his powers.
1227
+ 3 Devil Fruit: Green Bull is a user of the Mori Mori no Mi (Forest Forest Fruit), a Logia-type Devil Fruit that
1228
+ allows him to generate and manipulate plant matter. This ability can be seen in his combat style, which utilizes
1229
+ plants to entrap and attack opponents.
1230
+ 4 Role as an Admiral: As an Admiral, Green Bull holds a high-ranking position in the Marines and is
1231
+ known for his strength and authority. He is fiercely loyal to the Marine organization and its goals.
1232
+ 5 Personality: Green Bull displays a laid-back attitude and often appears more relaxed compared to other Admirals.
1233
+ His personality can be somewhat ambiguous, which adds depth to his character.
1234
+ 6 Involvement in the Story: Green Bull's introduction brings anticipation regarding his potential confrontations with
1235
+ key characters in the series. His role in major arcs like Wano and his interactions with the Straw Hat Pirates and
1236
+ other significant figures are crucial for the unfolding storyline.
1237
+ Overall, Green Bull is an intriguing character with unique abilities and a significant role within the One Piece universe.
1359
1238
 
1360
- from npcsh.npc_compiler import PipelineRunner
1361
- import os
1362
- runner = PipelineRunner(
1363
- "./news_analysis.pipe",
1364
- db_path=os.path.expanduser("~/npcsh_history.db"),
1365
- npc_root_dir=os.path.abspath("."),
1366
- )
1367
- results = runner.execute_pipeline()
1368
- ```
1239
+ spool> /whisper
1369
1240
 
1370
- Alternatively, if youd like to use a mixture of agents in your pipeline, set one up like this:
1371
- ```yaml
1372
- steps:
1373
- - step_name: "classify_news"
1374
- npc: "news_assistant"
1375
- mixa: true
1376
- mixa_agents:
1377
- - "{{ ref('news_assistant') }}"
1378
- - "{{ ref('journalist_npc') }}"
1379
- - "{{ ref('data_scientist_npc') }}"
1380
- mixa_voters:
1381
- - "{{ ref('critic_npc') }}"
1382
- - "{{ ref('editor_npc') }}"
1383
- - "{{ ref('researcher_npc') }}"
1384
- mixa_voter_count: 5
1385
- mixa_turns: 3
1386
- mixa_strategy: "vote"
1387
- task: |
1388
- Classify the following news articles...
1389
- {{ source('news_articles') }}
1390
- ```
1391
- You'll have to make npcs for these references to work, here are versions that should work with the above:
1392
- ```yaml
1393
- name: news_assistant
1394
- ```
1395
- Then, we can run the mixture of agents method like:
1241
+ Calibrating silence level. Please remain quiet...
1242
+ Silence threshold set to: 5679.193548387097
1243
+ Listening... (speak now)
1244
+ ...............
1245
+ Max duration reached.
1396
1246
 
1397
- ```bash
1398
- /compile ./npc_team/news_analysis_mixa.pipe
1399
- ```
1400
- or in python like:
1247
+ Processing...
1248
+ You said: What do you think his greatest weakness is?
1401
1249
 
1402
- ```bash
1403
1250
 
1404
- from npcsh.npc_compiler import PipelineRunner
1405
- import os
1251
+ Based on what we know about Green Bull from the One Piece series, I can identify a few potential weaknesses:
1406
1252
 
1407
- runner = PipelineRunner(
1408
- "./news_analysis_mixa.pipe",
1409
- db_path=os.path.expanduser("~/npcsh_history.db"),
1410
- npc_root_dir=os.path.abspath("."),
1411
- )
1412
- results = runner.execute_pipeline()
1413
- ```
1253
+ 1. **Extreme Laziness**:
1254
+ - Green Bull is known for being incredibly lazy, often refusing to move or exert himself unnecessarily.
1255
+ - This laziness could be exploited by opponents who understand how to manipulate or provoke him into action.
1414
1256
 
1257
+ 2. **Dependency on External Nutrition**:
1258
+ - His Devil Fruit power allows him to absorb nutrients from the environment, which suggests he might become weakened in environments with limited plant life or nutrients.
1259
+ - In extremely barren or non-vegetative settings, his regenerative and plant-based abilities might be significantly reduced.
1415
1260
 
1261
+ 3. **Pride and Marine Ideology**:
1262
+ - Like many Marines, he likely has a rigid sense of justice that could be used against him strategically.
1263
+ - His commitment to Marine principles might make him predictable in certain confrontational scenarios.
1416
1264
 
1417
- Note, in the future we will aim to separate compilation and running so that we will have a compilation step that is more like a jinja rendering of the relevant information so that it can be more easily audited.
1265
+ 4. **Potential Overconfidence**:
1266
+ - As an Admiral, he might underestimate opponents, especially pirates, due to his high-ranking status.
1267
+ - His laid-back nature might lead him to not take threats seriously until it's too late.
1418
1268
 
1269
+ The most pronounced weakness seems to be his extreme laziness, which could potentially be exploited tactically by skilled opponents who understand how to force him into action or create scenarios that challenge his passive nature.
1419
1270
 
1420
- ## Python Examples
1421
- Integrate npcsh into your Python projects for additional flexibility. Below are a few examples of how to use the library programmatically.
1271
+ Here are some additional details about Green Bull (Aramaki) in "One Piece":
1422
1272
 
1273
+ 1 Devil Fruit Name:
1423
1274
 
1275
+ • Green Bull's Devil Fruit has been confirmed to be the Mori Mori no Mi, which is a mythical Zoan-type fruit. This allows him to utilize various forms of plant life in combat and has regenerative
1276
+ capabilities.
1424
1277
 
1425
- ### Example 1: Creating and Using an NPC
1426
- This example shows how to create and initialize an NPC and use it to answer a question.
1427
- ```bash
1428
- import sqlite3
1429
- from npcsh.npc_compiler import NPC
1278
+ 2 Combat Abilities:
1430
1279
 
1431
- # Set up database connection
1432
- db_path = '~/npcsh_history.db'
1433
- conn = sqlite3.connect(db_path)
1280
+ His ability to control vegetation gives him significant advantages in battle. He can generate plants to attack or defend and possibly use them for tactical advantages, such as creating barriers
1281
+ or entangling enemies.
1434
1282
 
1435
- # Load NPC from a file
1436
- npc = NPC(db_conn=conn,
1437
- name='Simon Bolivar',
1438
- primary_directive='Liberate South America from the Spanish Royalists.',
1439
- model='gpt-4o-mini',
1440
- provider='openai',
1441
- )
1283
+ 3 Reputation:
1442
1284
 
1443
- response = npc.get_llm_response("What is the most important territory to retain in the Andes mountains?")
1444
- print(response['response'])
1445
- ```
1446
- ```bash
1447
- 'The most important territory to retain in the Andes mountains for the cause of liberation in South America would be the region of Quito in present-day Ecuador. This area is strategically significant due to its location and access to key trade routes. It also acts as a vital link between the northern and southern parts of the continent, influencing both military movements and the morale of the independence struggle. Retaining control over Quito would bolster efforts to unite various factions in the fight against Spanish colonial rule across the Andean states.'
1448
- ```
1449
- ### Example 2: Using an NPC to Analyze Data
1450
- This example shows how to use an NPC to perform data analysis on a DataFrame using LLM commands.
1451
- ```bash
1452
- from npcsh.npc_compiler import NPC
1453
- import sqlite3
1454
- import os
1455
- # Set up database connection
1456
- db_path = '~/npcsh_history.db'
1457
- conn = sqlite3.connect(os.path.expanduser(db_path))
1285
+ As an Admiral, Green Bull holds a position of great power and authority within the Marines. His reputation is likely formidable among pirates and other opponents.
1458
1286
 
1459
- # make a table to put into npcsh_history.db or change this example to use an existing table in a database you have
1460
- import pandas as pd
1461
- data = {
1462
- 'customer_feedback': ['The product is great!', 'The service was terrible.', 'I love the new feature.'],
1463
- 'customer_id': [1, 2, 3],
1464
- 'customer_rating': [5, 1, 3],
1465
- 'timestamp': ['2022-01-01', '2022-01-02', '2022-01-03']
1466
- }
1287
+ 4 Interactions with Other Characters:
1467
1288
 
1289
+ • Green Bull's character dynamics with others in the series, particularly with fellow Marines and pirates, can provide insight into his personality and values. His interactions during missions or
1290
+ discussions about justice will reveal more about his character.
1468
1291
 
1469
- df = pd.DataFrame(data)
1470
- df.to_sql('customer_feedback', conn, if_exists='replace', index=False)
1292
+ 5 Appearance and Style:
1471
1293
 
1294
+ • He has a rather unique aesthetic, characterized by his green clothing that symbolizes his connection to nature. His overall appearance contributes to his identity as a plant-based fighter.
1472
1295
 
1473
- npc = NPC(db_conn=conn,
1474
- name='Felix',
1475
- primary_directive='Analyze customer feedback for sentiment.',
1476
- model='llama3.2',
1477
- provider='ollama',
1478
- )
1479
- response = npc.analyze_db_data('Provide a detailed report on the data contained in the `customer_feedback` table?')
1296
+ 6 Backstory:
1297
+
1298
+ As of now, specific details about his backstory and how he rose to the rank of Admiral are not extensively explored in the series. This leaves room for further character development and
1299
+ background to be unveiled in future chapters.
1300
+
1301
+ 7 Ambiguous Personality:
1302
+
1303
+ • While his laziness and laid-back demeanor are evident, it is possible that there are deeper layers to his character that might be revealed through his actions and motivations within the
1304
+ overarching narrative of "One Piece."
1480
1305
 
1306
+ 8 Role in the Marine Organization:
1481
1307
 
1308
+ • His position as Admiral places him in direct opposition to the main pirate characters, particularly the Straw Hat crew, making him a significant figure in the ongoing conflict between pirates
1309
+ and the Marines.
1310
+ As the story continues to develop, Green Bull's character may evolve and reveal more complexities, weaknesses, and relationships within the world of "One Piece."
1482
1311
  ```
1483
1312
 
1484
1313
 
1485
- ### Example 3: Creating and Using a Tool
1486
- You can define a tool and execute it from within your Python script.
1487
- Here we'll create a tool that will take in a pdf file, extract the text, and then answer a user request about the text.
1314
+ Start spool mode with a specific LLM model:
1315
+ ```npcsh
1316
+ # note: this is not yet implemented
1317
+ npcsh> /spool model=llama3.3
1318
+ ```
1488
1319
 
1489
1320
  ```bash
1490
- from npcsh.npc_compiler import Tool, NPC
1491
- import sqlite3
1492
- import os
1493
-
1494
- tool_data = {
1495
- "tool_name": "pdf_analyzer",
1496
- "inputs": ["request", "file"],
1497
- "steps": [{ # Make this a list with one dict inside
1498
- "engine": "python",
1499
- "code": """
1500
- try:
1501
- import fitz # PyMuPDF
1321
+ npc spool -n npc.npc
1322
+ ```
1502
1323
 
1503
- shared_context = {}
1504
- shared_context['inputs'] = inputs
1505
1324
 
1506
- pdf_path = inputs['file']
1507
- print(f"Processing PDF file: {pdf_path}")
1508
1325
 
1509
- # Open the PDF
1510
- doc = fitz.open(pdf_path)
1511
- text = ""
1326
+ ### Vixynt: Image Generation
1327
+ Image generation can be done with the /vixynt macro.
1512
1328
 
1513
- # Extract text from each page
1514
- for page_num in range(len(doc)):
1515
- page = doc[page_num]
1516
- text += page.get_text()
1329
+ Use /vixynt as shown below; you can also specify the model to use with an @ reference, which will override the default model set in ~/.npcshrc.
1517
1330
 
1518
- # Close the document
1519
- doc.close()
1331
+ ```npcsh
1332
+ npcsh> /vixynt A futuristic cityscape @dall-e-3
1333
+ ```
1334
+ ![futuristic cityscape](test_data/futuristic_cityscape.PNG)
1520
1335
 
1521
- print(f"Extracted text length: {len(text)}")
1522
- if len(text) > 100:
1523
- print(f"First 100 characters: {text[:100]}...")
1336
+ ```npcsh
1337
+ npcsh> /vixynt A peaceful landscape @runwayml/stable-diffusion-v1-5
1338
+ ```
1339
+ ![peaceful landscape](test_data/peaceful_landscape_stable_diff.png)
1524
1340
 
1525
- shared_context['extracted_text'] = text
1526
- print("Text extraction completed successfully")
1527
1341
 
1528
- except Exception as e:
1529
- error_msg = f"Error processing PDF: {str(e)}"
1530
- print(error_msg)
1531
- shared_context['extracted_text'] = f"Error: {error_msg}"
1532
- """
1533
- },
1534
- {
1535
- "engine": "natural",
1536
- "code": """
1537
- {% if shared_context and shared_context.extracted_text %}
1538
- {% if shared_context.extracted_text.startswith('Error:') %}
1539
- {{ shared_context.extracted_text }}
1540
- {% else %}
1541
- Here is the text extracted from the PDF:
1342
+ Similarly, use vixynt with the NPC CLI from a regular shell:
1343
+ ```bash
1344
+ $ npc --model 'dall-e-2' --provider 'openai' vixynt 'whats a french man to do in the southern bayeaux'
1345
+ ```
1542
1346
 
1543
- {{ shared_context.extracted_text }}
1544
1347
 
1545
- Please provide a response to user request: {{ inputs.request }} using the information extracted above.
1546
- {% endif %}
1547
- {% else %}
1548
- Error: No text was extracted from the PDF.
1549
- {% endif %}
1550
- """
1551
- },]
1552
1348
 
1553
- # Instantiate the tool
1554
- tool = Tool(tool_data)
1555
1349
 
1556
- # Create an NPC instance
1557
- npc = NPC(
1558
- name='starlana',
1559
- primary_directive='Analyze text from Astrophysics papers with a keen attention to theoretical machinations and mechanisms.',
1560
- db_conn=sqlite3.connect(os.path.expanduser('~/npcsh_database.db'))
1561
- )
1350
+ ### Whisper: Voice Control
1351
+ Enter a voice-controlled mode to interact with the LLM. This mode can execute commands and use tools just like the basic npcsh shell.
1352
+ ```npcsh
1353
+ npcsh> /whisper
1354
+ ```
1562
1355
 
1563
- # Define input values dictionary
1564
- input_values = {
1565
- "request": "what is the point of the yuan and narayanan work?",
1566
- "file": os.path.abspath("test_data/yuan2004.pdf")
1567
- }
1568
1356
 
1569
- print(f"Attempting to read file: {input_values['file']}")
1570
- print(f"File exists: {os.path.exists(input_values['file'])}")
1571
1357
 
1572
- # Execute the tool
1573
- output = tool.execute(input_values, npc.tools_dict, None, 'Sample Command', npc)
1574
1358
 
1575
- print('Tool Output:', output)
1359
+ ### Compilation and NPC Interaction
1360
+ Compile a specified NPC profile. This will make it available for use in npcsh interactions.
1361
+ ```npcsh
1362
+ npcsh> /compile <npc_file>
1576
1363
  ```
1364
+ You can also use `/com` as an alias for `/compile`. If no NPC file is specified, all NPCs in the npc_team directory will be compiled.
1577
1365
 
1578
- ### Example 4: Orchestrating a team
1579
-
1580
-
1366
+ Begin a conversation with a specified NPC by referencing their name:
1367
+ ```npcsh
1368
+ npcsh> /<npc_name>:
1369
+ ```
1581
1370
 
1582
- ```python
1583
- import pandas as pd
1584
- import numpy as np
1585
- import os
1586
- from npcsh.npc_compiler import NPC, NPCTeam, Tool
1587
1371
 
1588
1372
 
1589
- # Create test data and save to CSV
1590
- def create_test_data(filepath="sales_data.csv"):
1591
- sales_data = pd.DataFrame(
1592
- {
1593
- "date": pd.date_range(start="2024-01-01", periods=90),
1594
- "revenue": np.random.normal(10000, 2000, 90),
1595
- "customer_count": np.random.poisson(100, 90),
1596
- "avg_ticket": np.random.normal(100, 20, 90),
1597
- "region": np.random.choice(["North", "South", "East", "West"], 90),
1598
- "channel": np.random.choice(["Online", "Store", "Mobile"], 90),
1599
- }
1600
- )
1373
+ ## NPC Data Layer
1601
1374
 
1602
- # Add patterns to make data more realistic
1603
- sales_data["revenue"] *= 1 + 0.3 * np.sin(
1604
- np.pi * np.arange(90) / 30
1605
- ) # Seasonal pattern
1606
- sales_data.loc[sales_data["channel"] == "Mobile", "revenue"] *= 1.1 # Mobile growth
1607
- sales_data.loc[
1608
- sales_data["channel"] == "Online", "customer_count"
1609
- ] *= 1.2 # Online customer growth
1375
+ What principally powers the capabilities of npcsh is the NPC Data Layer. In the `~/.npcsh/` directory after installation, you will find
1376
+ the npc team with its tools, models, contexts, assembly lines, and NPCs. By making tools, NPCs, contexts, and assembly lines simple data structures with
1377
+ a fixed set of parameters, we let users define them in easy-to-read YAML files, yielding a modular, extensible system that can be easily modified and expanded. This data layer also relies heavily on Jinja templating for dynamic content generation and for referencing other NPCs, tools, and assembly lines within the system.
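
The `{{ ref(...) }}` and `{{ source(...) }}` calls seen throughout these files are ordinary Jinja expressions. A minimal sketch of how such references could resolve (the `ref`/`source` helpers below are hypothetical stand-ins for illustration, not npcsh's actual implementations):

```python
from jinja2 import Template

# Hypothetical stand-ins for npcsh's ref() and source() helpers:
# ref() maps an NPC name to its YAML file; source() names a database table.
def ref(npc_name: str) -> str:
    return f"./npc_team/{npc_name}.npc"

def source(table_name: str) -> str:
    return f"<rows from table '{table_name}'>"

task = Template(
    "npc: {{ ref('email_assistant') }}\n"
    "task: Summarize {{ source('emails') }}"
).render(ref=ref, source=source)
print(task)
```

Rendering substitutes the helper results into the template, which is how a single YAML file can refer to other NPCs and data sources by name.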
1610
1378
 
1611
- sales_data.to_csv(filepath, index=False)
1612
- return filepath, sales_data
1379
+ ### Creating NPCs
1380
+ NPCs are defined in YAML files within the npc_team directory. Each NPC must have a name and a primary directive. Optionally, one can specify an LLM model/provider for the NPC as well as provide an explicit list of tools and whether or not to use the globally available tools. See the data models contained in `npcsh/data_models.py` for more explicit type details on the NPC data structure.
1613
1381
 
1614
1382
 
1615
- code_execution_tool = Tool(
1616
- {
1617
- "tool_name": "execute_code",
1618
- "description": """Executes a Python code block with access to pandas,
1619
- numpy, and matplotlib.
1620
- Results should be stored in the 'results' dict to be returned.
1621
- The only input should be a single code block with \n characters included.
1622
- The code block must use only the libraries or methods contained withen the
1623
- pandas, numpy, and matplotlib libraries or using builtin methods.
1624
- do not include any json formatting or markdown formatting.
1625
1383
 
1626
- When generating your script, the final output must be encoded in a variable
1627
- named "output". e.g.
1384
+ Here is a typical NPC file:
1385
+ ```yaml
1386
+ name: sibiji
1387
+ primary_directive: You are a foundational AI assistant. Your role is to provide basic support and information. Respond to queries concisely and accurately.
1388
+ tools:
1389
+ - simple data retrieval
1390
+ model: llama3.2
1391
+ provider: ollama
1392
+ ```
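
Since an NPC file is plain YAML, it parses to an ordinary dictionary; here is a quick check with PyYAML (a sketch of the file format only, not npcsh's actual loader):

```python
import yaml

# The sibiji example above, embedded as a string for a self-contained demo.
npc_yaml = """
name: sibiji
primary_directive: You are a foundational AI assistant. Your role is to provide basic support and information. Respond to queries concisely and accurately.
tools:
  - simple data retrieval
model: llama3.2
provider: ollama
"""

npc = yaml.safe_load(npc_yaml)
print(npc["name"], npc["model"], npc["provider"])
```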
1628
1393
 
1629
- output = some_analysis_function(inputs, derived_data_from_inputs)
1630
- Adapt accordingly based on the scope of the analysis
1631
1394
 
1632
- """,
1633
- "inputs": ["script"],
1634
- "steps": [
1635
- {
1636
- "engine": "python",
1637
- "code": """{{script}}""",
1638
- }
1639
- ],
1640
- }
1641
- )
1395
+ ## Creating Tools
1396
+ Tools are defined as YAML files with a `.tool` extension within the npc_team/tools directory. Each tool has a name, inputs, and three distinct steps: preprocess, prompt, and postprocess. The idea is that information is first preprocessed, then passed to a prompt for some kind of analysis, and finally handed to a postprocessing stage. Each step must specify an engine: either "natural" for natural language processing or "python" for Python code. The code is the actual code that will be executed.
1642
1397
 
1643
- # Analytics team definition
1644
- analytics_team = [
1645
- {
1646
- "name": "analyst",
1647
- "primary_directive": "You analyze sales performance data, focusing on revenue trends, customer behavior metrics, and market indicators. Your expertise is in extracting actionable insights from complex datasets.",
1648
- "model": "gpt-4o-mini",
1649
- "provider": "openai",
1650
- "tools": [code_execution_tool], # Only the code execution tool
1651
- },
1652
- {
1653
- "name": "researcher",
1654
- "primary_directive": "You specialize in causal analysis and experimental design. Given data insights, you determine what factors drive observed patterns and design tests to validate hypotheses.",
1655
- "model": "gpt-4o-mini",
1656
- "provider": "openai",
1657
- "tools": [code_execution_tool], # Only the code execution tool
1658
- },
1659
- {
1660
- "name": "engineer",
1661
- "primary_directive": "You implement data pipelines and optimize data processing. When given analysis requirements, you create efficient workflows to automate insights generation.",
1662
- "model": "gpt-4o-mini",
1663
- "provider": "openai",
1664
- "tools": [code_execution_tool], # Only the code execution tool
1665
- },
1666
- ]
1398
+ Here is an example of a tool file:
1399
+ ```yaml
1400
+ tool_name: "screen_capture_analysis_tool"
1401
+ description: Captures the whole screen and sends the image for analysis
1402
+ inputs:
1403
+ - "prompt"
1404
+ steps:
1405
+ - engine: "python"
1406
+ code: |
1407
+ # Capture the screen
1408
+ import pyautogui
1409
+ import datetime
1410
+ import os
1411
+ from PIL import Image
1412
+ import time
1413
+ from npcsh.image import analyze_image_base, capture_screenshot
1667
1414
 
1415
+ out = capture_screenshot(npc = npc, full = True)
1668
1416
 
1669
- def create_analytics_team():
1670
- # Initialize NPCs with just the code execution tool
1671
- npcs = []
1672
- for npc_data in analytics_team:
1673
- npc = NPC(
1674
- name=npc_data["name"],
1675
- primary_directive=npc_data["primary_directive"],
1676
- model=npc_data["model"],
1677
- provider=npc_data["provider"],
1678
- tools=[code_execution_tool], # Only code execution tool
1679
- )
1680
- npcs.append(npc)
1417
+ llm_response = analyze_image_base( '{{prompt}}' + "\n\nAttached is a screenshot of my screen currently. Please use this to evaluate the situation. If the user asked for you to explain what's on their screen or something similar, they are referring to the details contained within the attached image. You do not need to actually view their screen. You do not need to mention that you cannot view or interpret images directly. You only need to answer the user's request based on the attached screenshot!",
1418
+ out['file_path'],
1419
+ out['filename'],
1420
+ npc=npc,
1421
+ **out['model_kwargs'])
1422
+ # Normalize the analysis response to a plain string
1423
+ if isinstance(llm_response, dict):
1424
+ llm_response = llm_response.get('response', 'No response from image analysis')
1425
+ else:
1426
+ llm_response = 'No response from image analysis'
1681
1427
 
1682
- # Create coordinator with just code execution tool
1683
- coordinator = NPC(
1684
- name="coordinator",
1685
- primary_directive="You coordinate the analytics team, ensuring each specialist contributes their expertise effectively. You synthesize insights and manage the workflow.",
1686
- model="gpt-4o-mini",
1687
- provider="openai",
1688
- tools=[code_execution_tool], # Only code execution tool
1689
- )
1428
+ ```
1690
1429
 
1691
- # Create team
1692
- team = NPCTeam(npcs=npcs, foreman=coordinator)
1693
- return team
1694
1430
 
1431
+ Once you have created a tool, it will be surfaced as a candidate whenever you ask a question in the base npcsh shell; the LLM decides whether it is the best tool to use based on your input. Alternatively, you can call a tool directly, without letting the AI decide whether it is the right one to use.
1695
1432
 
1696
- def main():
1697
- # Create and save test data
1698
- data_path, sales_data = create_test_data()
1433
+ ```npcsh
1434
+ npcsh> /screen_cap_tool <prompt>
1435
+ ```
1436
+ or
1437
+ ```npcsh
1438
+ npcsh> /sql_executor select * from conversation_history limit 1
1699
1439
 
1700
- # Initialize team
1701
- team = create_analytics_team()
1440
+ ```
1441
+ or
1442
+ ```npcsh
1443
+ npcsh> /calculator 5+6
1444
+ ```
1702
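
The `conversation_history` table referenced above lives in the npcsh history database, so it can also be queried outside the shell. A sketch using sqlite3 directly (assumes the default `~/npcsh_history.db` location; the table only exists once npcsh has logged a conversation):

```python
import os
import sqlite3

db_path = os.path.expanduser("~/npcsh_history.db")
conn = sqlite3.connect(db_path)  # creates an empty file if the db is missing
try:
    rows = conn.execute("SELECT * FROM conversation_history LIMIT 1").fetchall()
    print(rows)
except sqlite3.OperationalError as err:
    # The table won't exist until npcsh has logged a conversation.
    print(f"No history yet: {err}")
finally:
    conn.close()
```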
1445
 
1703
- # Run analysis - updated prompt to reflect code execution approach
1704
- results = team.orchestrate(
1705
- f"""
1706
- Analyze the sales data at {data_path} to:
1707
- 1. Identify key performance drivers
1708
- 2. Determine if mobile channel growth is significant
1709
- 3. Recommend tests to validate growth hypotheses
1710
1446
 
1711
- Here is a header for the data file at {data_path}:
1712
- {sales_data.head()}
1447
+ ## NPC Pipelines
1713
1448
 
1714
- When working with dates, ensure that date columns are converted from raw strings. e.g. use the pd.to_datetime function.
1715
1449
 
1716
1450
 
1717
- When working with potentially messy data, handle null values by using nan versions of numpy functions or
1718
- by filtering them with a mask .
1451
+ Let's say you want to create a pipeline of steps where NPCs are used along the way. We'll start with a pipeline file called `morning_routine.pipe`:
1452
+ ```yaml
1453
+ steps:
1454
+ - step_name: "review_email"
1455
+ npc: "{{ ref('email_assistant') }}"
1456
+ task: "Get me up to speed on my recent emails: {{source('emails')}}."
1719
1457
 
1720
- Use Python code execution to perform the analysis - load the data and perform statistical analysis directly.
1721
- """
1458
+
1459
+ - step_name: "market_update"
1460
+ npc: "{{ ref('market_analyst') }}"
1461
+ task: "Give me an update on the latest events in the market: {{source('market_events')}}."
1462
+
1463
+ - step_name: "summarize"
1464
+ npc: "{{ ref('sibiji') }}"
1465
+ model: llama3.2
1466
+ provider: ollama
1467
+ task: "Review the outputs from the {{review_email}} and {{market_update}} and provide me with a summary."
1468
+
1469
+ ```
1470
+ Now you'll see that we reference NPCs in the pipeline file. We'll need to make sure we have each of those NPCs available.
1471
+ Here is an example for the email assistant:
1472
+ ```yaml
1473
+ name: email_assistant
1474
+ primary_directive: You are an AI assistant specialized in managing and summarizing emails. You should present the information in a clear and concise manner.
1475
+ model: gpt-4o-mini
1476
+ provider: openai
1477
+ ```
1478
+ Now for the market analyst:
1479
+ ```yaml
1480
+ name: market_analyst
1481
+ primary_directive: You are an AI assistant focused on monitoring and analyzing market trends.
1482
+ model: llama3.2
1483
+ provider: ollama
1484
+ ```
1485
+ and then here is our trusty friend sibiji:
1486
+ ```yaml
1487
+ name: sibiji
1488
+ primary_directive: You are a foundational AI assistant. Your role is to provide basic support and information. Respond to queries concisely and accurately.
1489
+ suggested_tools_to_use:
1490
+ - simple data retrieval
1491
+ model: claude-3-5-sonnet-latest
1492
+ provider: anthropic
1493
+ ```
1494
+ Now that we have our pipeline and NPCs defined, we also need to ensure that the source data we reference will be there. When we use `source('market_events')` and `source('emails')`, we are asking npcsh to pull those data directly from tables in our npcsh database. For simplicity, we will create and insert them in Python for this demo:
1495
+ ```python
1496
+ import pandas as pd
1497
+ from sqlalchemy import create_engine
1498
+ import os
1499
+
1500
+ # Sample market events data
1501
+ market_events_data = {
1502
+ "datetime": [
1503
+ "2023-10-15 09:00:00",
1504
+ "2023-10-16 10:30:00",
1505
+ "2023-10-17 11:45:00",
1506
+ "2023-10-18 13:15:00",
1507
+ "2023-10-19 14:30:00",
1508
+ ],
1509
+ "headline": [
1510
+ "Stock Market Rallies Amid Positive Economic Data",
1511
+ "Tech Giant Announces New Product Line",
1512
+ "Federal Reserve Hints at Interest Rate Pause",
1513
+ "Oil Prices Surge Following Supply Concerns",
1514
+ "Retail Sector Reports Record Q3 Earnings",
1515
+ ],
1516
+ }
1517
+
1518
+ # Create a DataFrame
1519
+ market_events_df = pd.DataFrame(market_events_data)
1520
+
1521
+ # Define database path relative to user's home directory
1522
+ db_path = os.path.expanduser("~/npcsh_history.db")
1523
+
1524
+ # Create a connection to the SQLite database
1525
+ engine = create_engine(f"sqlite:///{db_path}")
1526
+ with engine.connect() as connection:
1527
+ # Write the data to a new table 'market_events', replacing existing data
1528
+ market_events_df.to_sql(
1529
+ "market_events", con=connection, if_exists="replace", index=False
1722
1530
  )
1723
1531
 
1724
- print(results)
1532
+ print("Market events have been added to the database.")
1725
1533
 
1726
- # Cleanup
1727
- os.remove(data_path)
1534
+ email_data = {
1535
+ "datetime": [
1536
+ "2023-10-10 10:00:00",
1537
+ "2023-10-11 11:00:00",
1538
+ "2023-10-12 12:00:00",
1539
+ "2023-10-13 13:00:00",
1540
+ "2023-10-14 14:00:00",
1541
+ ],
1542
+ "subject": [
1543
+ "Meeting Reminder",
1544
+ "Project Update",
1545
+ "Invoice Attached",
1546
+ "Weekly Report",
1547
+ "Holiday Notice",
1548
+ ],
1549
+ "sender": [
1550
+ "alice@example.com",
1551
+ "bob@example.com",
1552
+ "carol@example.com",
1553
+ "dave@example.com",
1554
+ "eve@example.com",
1555
+ ],
1556
+ "recipient": [
1557
+ "bob@example.com",
1558
+ "carol@example.com",
1559
+ "dave@example.com",
1560
+ "eve@example.com",
1561
+ "alice@example.com",
1562
+ ],
1563
+ "body": [
1564
+ "Don't forget the meeting tomorrow at 10 AM.",
1565
+ "The project is progressing well, see attached update.",
1566
+ "Please find your invoice attached.",
1567
+ "Here is the weekly report.",
1568
+ "The office will be closed on holidays, have a great time!",
1569
+ ],
1570
+ }
1728
1571
 
1572
+ # Create a DataFrame
1573
+ emails_df = pd.DataFrame(email_data)
1729
1574
 
1730
- if __name__ == "__main__":
1731
- main()
1575
+ # Define database path relative to user's home directory
1576
+ db_path = os.path.expanduser("~/npcsh_history.db")
1577
+
1578
+ # Create a connection to the SQLite database
1579
+ engine = create_engine(f"sqlite:///{db_path}")
1580
+ with engine.connect() as connection:
1581
+ # Write the data to a new table 'emails', replacing existing data
1582
+ emails_df.to_sql("emails", con=connection, if_exists="replace", index=False)
1583
+
1584
+ print("Sample emails have been added to the database.")
1732
1585
 
1733
1586
  ```
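
To sanity-check that `source()` will find your tables, you can read them back with pandas. The snippet below round-trips against an in-memory SQLite database so it runs anywhere; point the engine at `~/npcsh_history.db` to inspect the real one:

```python
import pandas as pd
from sqlalchemy import create_engine

# In-memory stand-in for ~/npcsh_history.db (swap in the real path to inspect it).
engine = create_engine("sqlite:///:memory:")

pd.DataFrame({
    "subject": ["Meeting Reminder"],
    "sender": ["alice@example.com"],
}).to_sql("emails", engine, if_exists="replace", index=False)

round_trip = pd.read_sql("SELECT subject, sender FROM emails", engine)
print(round_trip.to_dict("records"))
```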
1734
1587
 
1588
+
1589
+ With these data now in place, we can proceed with running the pipeline. We can do this in npcsh by using the /compile command.
1590
+
1591
+
1592
+
1593
+
1594
+ ```npcsh
1595
+ npcsh> /compile morning_routine.pipe
1596
+ ```
1597
+
1598
+
1599
+
1600
+ Alternatively, we can run a pipeline in Python:
1601
+
1602
+ ```python
1603
+ from npcsh.npc_compiler import PipelineRunner
1604
+ import os
1605
+
1606
+ pipeline_runner = PipelineRunner(
1607
+ pipeline_file="morning_routine.pipe",
1608
+ npc_root_dir=os.path.abspath("./"),
1609
+ db_path="~/npcsh_history.db",
1610
+ )
1611
+ pipeline_runner.execute_pipeline()
1612
+ ```
1613
+
1614
+ What if you wanted to run operations on each row and some operations on all the data at once? We can do this with the pipelines as well. Here we will build a pipeline for news article analysis.
1615
+ First we make the data for the pipeline that we'll use:
+ ```python
+ import pandas as pd
+ from sqlalchemy import create_engine
+ import os
+
+ # Sample data generation for news articles
+ news_articles_data = {
+     "news_article_id": list(range(1, 21)),
+     "headline": [
+         "Economy sees unexpected growth in Q4",
+         "New tech gadget takes the world by storm",
+         "Political debate heats up over new policy",
+         "Health concerns rise amid new disease outbreak",
+         "Sports team secures victory in last minute",
+         "New economic policy introduced by government",
+         "Breakthrough in AI technology announced",
+         "Political leader delivers speech on reforms",
+         "Healthcare systems pushed to limits",
+         "Celebrated athlete breaks world record",
+         "Controversial economic measures spark debate",
+         "Innovative tech startup gains traction",
+         "Political scandal shakes administration",
+         "Healthcare workers protest for better pay",
+         "Major sports event postponed due to weather",
+         "Trade tensions impact global economy",
+         "Tech company accused of data breach",
+         "Election results lead to political upheaval",
+         "Vaccine developments offer hope amid pandemic",
+         "Sports league announces return to action",
+     ],
+     "content": ["Article content here..." for _ in range(20)],
+     "publication_date": pd.date_range(start="1/1/2023", periods=20, freq="D"),
+ }
+
+ # Write the articles to the history database so the pipeline's
+ # source('news_articles') reference can find them
+ df = pd.DataFrame(news_articles_data)
+ engine = create_engine(f"sqlite:///{os.path.expanduser('~/npcsh_history.db')}")
+ df.to_sql("news_articles", engine, if_exists="replace", index=False)
+ ```
+
+ Then we will create the pipeline file:
+ ```yaml
+ # news_analysis.pipe
+ steps:
+   - step_name: "classify_news"
+     npc: "{{ ref('news_assistant') }}"
+     task: |
+       Classify the following news articles into one of the categories:
+       ["Politics", "Economy", "Technology", "Sports", "Health"].
+       {{ source('news_articles') }}
+
+   - step_name: "analyze_news"
+     npc: "{{ ref('news_assistant') }}"
+     batch_mode: true  # Process articles with knowledge of their tags
+     task: |
+       Based on the category assigned in {{ classify_news }}, provide an in-depth
+       analysis and perspectives on the article. Consider these aspects:
+       ["Impacts", "Market Reaction", "Cultural Significance", "Predictions"].
+       {{ source('news_articles') }}
+ ```
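Conceptually, `batch_mode` controls fan-out: a per-row step calls the model once per article, while a batch step makes a single call over all rows so that each article is processed with knowledge of the others. A schematic sketch of that distinction (illustrative only, not npcsh's actual internals):

```python
from typing import Callable, List

def run_step(rows: List[str],
             per_row: Callable[[str], str],
             over_all: Callable[[List[str]], str],
             batch_mode: bool = False) -> List[str]:
    """Schematic fan-out: a per-row step makes one model call per row;
    a batch_mode step makes a single call over all rows together."""
    if batch_mode:
        return [over_all(rows)]            # one call, shared context across rows
    return [per_row(row) for row in rows]  # independent call per row

# Toy stand-ins for model calls (hypothetical, for illustration only)
classify = lambda row: f"label({row})"
analyze_all = lambda rows: f"analysis over {len(rows)} tagged rows"

print(run_step(["a", "b"], classify, analyze_all))                   # ['label(a)', 'label(b)']
print(run_step(["a", "b"], classify, analyze_all, batch_mode=True))  # ['analysis over 2 tagged rows']
```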
+
+ Then we can run the pipeline like so:
+ ```npcsh
+ npcsh> /compile ./npc_team/news_analysis.pipe
+ ```
+ or from Python like:
+
+ ```python
+ from npcsh.npc_compiler import PipelineRunner
+ import os
+
+ runner = PipelineRunner(
+     "./news_analysis.pipe",
+     db_path=os.path.expanduser("~/npcsh_history.db"),
+     npc_root_dir=os.path.abspath("."),
+ )
+ results = runner.execute_pipeline()
+ ```
+
+ Alternatively, if you'd like to use a mixture of agents in your pipeline, set one up like this:
+ ```yaml
+ steps:
+   - step_name: "classify_news"
+     npc: "news_assistant"
+     mixa: true
+     mixa_agents:
+       - "{{ ref('news_assistant') }}"
+       - "{{ ref('journalist_npc') }}"
+       - "{{ ref('data_scientist_npc') }}"
+     mixa_voters:
+       - "{{ ref('critic_npc') }}"
+       - "{{ ref('editor_npc') }}"
+       - "{{ ref('researcher_npc') }}"
+     mixa_voter_count: 5
+     mixa_turns: 3
+     mixa_strategy: "vote"
+     task: |
+       Classify the following news articles...
+       {{ source('news_articles') }}
+ ```
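The "vote" strategy can be pictured as agents drafting candidate answers and voters selecting among them. A toy sketch of a single voting round (illustrative only, not npcsh's actual implementation):

```python
from collections import Counter
from typing import Callable, List

def mixa_vote(prompt: str,
              agents: List[Callable[[str], str]],
              voters: List[Callable[[List[str]], int]]) -> str:
    """Each agent drafts an answer, each voter returns the index of its
    preferred draft, and the draft with the most votes wins."""
    drafts = [agent(prompt) for agent in agents]
    ballots = Counter(voter(drafts) for voter in voters)
    winner, _ = ballots.most_common(1)[0]
    return drafts[winner]

# Toy agents and voters standing in for LLM calls
agents = [lambda p: p.upper(), lambda p: p.lower(), lambda p: p.title()]
voters = [lambda d: 0, lambda d: 2, lambda d: 2]
print(mixa_vote("breaking news", agents, voters))  # Breaking News
```

In npcsh, `mixa_turns` would repeat this draft-and-vote loop, letting agents revise their drafts between rounds.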
+ You'll have to create NPC files for these references to work. A minimal version that works with the above:
+ ```yaml
+ name: news_assistant
+ ```
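The remaining refs (journalist_npc, data_scientist_npc, critic_npc, editor_npc, researcher_npc) each need their own `.npc` file in the npc_team directory. A sketch for one of them, assuming a `primary_directive` field is supported (the exact field set is an assumption, not confirmed by this section):

```yaml
# journalist_npc.npc (hypothetical example)
name: journalist_npc
primary_directive: You are an experienced journalist who critiques news coverage.
```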
+ Then we can run the mixture-of-agents pipeline like so:
+
+ ```npcsh
+ npcsh> /compile ./npc_team/news_analysis_mixa.pipe
+ ```
+ or from Python like:
+
+ ```python
+ from npcsh.npc_compiler import PipelineRunner
+ import os
+
+ runner = PipelineRunner(
+     "./news_analysis_mixa.pipe",
+     db_path=os.path.expanduser("~/npcsh_history.db"),
+     npc_root_dir=os.path.abspath("."),
+ )
+ results = runner.execute_pipeline()
+ ```
+
+ Note: in the future we aim to separate compilation from execution, so that compilation becomes more of a Jinja rendering of the relevant information, which can then be audited more easily.
+
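The compile-as-render idea above amounts to expanding the `{{ ref(...) }}` and `{{ source(...) }}` calls in a pipeline file into concrete text before anything runs. A minimal stand-in for that rendering pass (illustrative only; npcsh's real compiler uses full Jinja templating):

```python
import re

def render(template: str, funcs: dict) -> str:
    """Replace {{ fn('arg') }} placeholders by calling fn(arg)."""
    pattern = re.compile(r"\{\{\s*(\w+)\('([^']+)'\)\s*\}\}")
    return pattern.sub(lambda m: funcs[m.group(1)](m.group(2)), template)

task = "Classify {{ source('news_articles') }} with {{ ref('news_assistant') }}"
print(render(task, {
    "source": lambda table: f"<rows of {table}>",  # would load table rows
    "ref": lambda name: f"<npc:{name}>",           # would resolve the NPC file
}))
# Classify <rows of news_articles> with <npc:news_assistant>
```

Rendering a pipeline this way yields a plain-text artifact that can be inspected or diffed before execution.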
  ## npcsql: SQL Integration and pipelines (UNDER CONSTRUCTION)
 