npcsh 1.0.25__tar.gz → 1.0.26__tar.gz
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- {npcsh-1.0.25 → npcsh-1.0.26}/PKG-INFO +12 -6
- {npcsh-1.0.25 → npcsh-1.0.26}/README.md +11 -5
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh/routes.py +36 -16
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh.egg-info/PKG-INFO +12 -6
- {npcsh-1.0.25 → npcsh-1.0.26}/setup.py +1 -1
- {npcsh-1.0.25 → npcsh-1.0.26}/LICENSE +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh/__init__.py +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh/_state.py +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh/alicanto.py +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh/corca.py +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh/guac.py +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh/mcp_helpers.py +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh/mcp_server.py +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh/npc.py +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh/npc_team/alicanto.npc +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh/npc_team/alicanto.png +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh/npc_team/corca.npc +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh/npc_team/corca.png +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh/npc_team/foreman.npc +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh/npc_team/frederic.npc +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh/npc_team/frederic4.png +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh/npc_team/guac.png +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh/npc_team/jinxs/bash_executer.jinx +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh/npc_team/jinxs/edit_file.jinx +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh/npc_team/jinxs/image_generation.jinx +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh/npc_team/jinxs/internet_search.jinx +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh/npc_team/jinxs/python_executor.jinx +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh/npc_team/jinxs/screen_cap.jinx +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh/npc_team/kadiefa.npc +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh/npc_team/kadiefa.png +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh/npc_team/npcsh.ctx +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh/npc_team/npcsh_sibiji.png +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh/npc_team/plonk.npc +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh/npc_team/plonk.png +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh/npc_team/plonkjr.npc +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh/npc_team/plonkjr.png +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh/npc_team/sibiji.npc +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh/npc_team/sibiji.png +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh/npc_team/spool.png +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh/npc_team/yap.png +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh/npcsh.py +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh/plonk.py +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh/pti.py +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh/spool.py +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh/wander.py +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh/yap.py +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh.egg-info/SOURCES.txt +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh.egg-info/dependency_links.txt +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh.egg-info/entry_points.txt +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh.egg-info/requires.txt +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/npcsh.egg-info/top_level.txt +0 -0
- {npcsh-1.0.25 → npcsh-1.0.26}/setup.cfg +0 -0
--- npcsh-1.0.25/PKG-INFO
+++ npcsh-1.0.26/PKG-INFO
@@ -1,6 +1,6 @@
 Metadata-Version: 2.4
 Name: npcsh
-Version: 1.0.25
+Version: 1.0.26
 Summary: npcsh is a command-line toolkit for using AI agents in novel ways.
 Home-page: https://github.com/NPC-Worldwide/npcsh
 Author: Christopher Agostino
@@ -125,7 +125,7 @@ Once installed: run
 ```bash
 npcsh
 ```
-and you will enter the NPC shell. Additionally, the pip installation includes
+and you will enter the NPC shell. Additionally, the pip installation includes the following CLI tools available in bash: `corca`, `guac`, `npc` cli, `pti`, `spool`, `wander`, and `yap`.
 
 
 # Usage
@@ -204,12 +204,14 @@ The core of npcsh's capabilities is powered by the NPC Data Layer. Upon initiali
 Users can extend NPC capabilities through simple YAML files:
 
 - **NPCs** (.npc): are defined with a name, primary directive, and optional model specifications
-- **Jinxs** (.jinx):
-- **Context** (.ctx): Specify contextual information, team preferences, MCP server paths, database connections, and other environment variables that are loaded for the team. Teams are specified by their path and the team name in the `<team>.ctx` file. Teams organize collections of NPCs with shared context and specify a coordinator within the team context
+- **Jinxs** (.jinx): Jinja execution templates that provide function-like capabilities and scalable extensibility through Jinja references that let jinxs call and build upon other jinxs. Jinxs are executed through prompt-based flows, allowing them to be used by models regardless of their tool-calling capabilities, making it possible to enable agents at the edge of computing through this simple methodology.
+- **Context** (.ctx): Specify contextual information, team preferences, MCP server paths, database connections, and other environment variables that are loaded for the team or for specific agents (e.g. `GUAC_FORENPC`). Teams are specified by their path and the team name in the `<team>.ctx` file. Teams organize collections of NPCs with shared context and specify a coordinator within the team context who is used whenever the team is called upon for orchestration.
 
 The NPC Shell system integrates the capabilities of `npcpy` to maintain conversation history, track command execution, and provide intelligent autocomplete through an extensible command routing system. State is preserved between sessions, allowing for continuous knowledge building over time.
 
-This architecture enables complex AI workflows while maintaining a simple, declarative syntax that abstracts away implementation complexity. By organizing AI capabilities
+This architecture enables users to build complex AI workflows while maintaining a simple, declarative syntax that abstracts away implementation complexity. By organizing AI capabilities in composable data structures rather than code, `npcsh` creates a more accessible and adaptable framework for AI automation that can scale more intentionally. Within teams there can be sub-teams, and these sub-teams may be called upon for orchestration; importantly, when the orchestrator is deciding between using one of its own team's NPCs and yielding to a sub-team, it sees only the descriptions of the sub-teams rather than the full persona descriptions of each sub-team's agents, making it easier for the orchestrator to delineate options and keep its attention focused by restricting the number of choices in each decision step. Thus, it may yield to the sub-team's orchestrator, letting that orchestrator decide which of its own team's NPCs to use.
+
+Importantly, users can switch easily between the NPCs they are chatting with by typing `/n npc_name` within the NPC shell. Likewise, they can create Jinxs and then use them from within the NPC shell by invoking the jinx name and the arguments required for the Jinx: `/<jinx_name> arg1 arg2`
 
 # Macros
 - activated by invoking `/<command> ...` in `npcsh`, macros can be called in bash or through the `npc` CLI. In our examples, we provide both `npcsh` calls as well as bash calls with the `npc` cli where relevant. For converting any `/<command>` in `npcsh` to a bash version, replace the `/` with `npc ` and the macro command will be invoked as a positional argument. Some, like breathe, flush,
@@ -276,7 +278,11 @@ To see more about how to use the macros and modes in the NPC Shell, read the doc
 - `npcsh` works with local and enterprise LLM providers through its LiteLLM integration, allowing users to run inference from Ollama, LMStudio, vLLM, MLX, OpenAI, Anthropic, Gemini, and Deepseek, making it a versatile tool for both simple commands and sophisticated AI-driven tasks.
 
 ## NPC Studio
-There is a graphical user interface that makes use of the NPC Toolkit through the NPC Studio. See the
+There is a graphical user interface that makes use of the NPC Toolkit through NPC Studio. See the source code for NPC Studio [here](https://github.com/npc-worldwide/npc-studio). Download the executables at [our website](https://enpisi.com/npc-studio). For the most up-to-date version, you can use NPC Studio by invoking it in npcsh
+```
+/npc-studio
+```
+which will download, set up, and serve the NPC Studio application within your `~/.npcsh` folder. It requires `npm` and `node` to work.
 
 ## Mailing List
 Interested to stay in the loop and to hear the latest and greatest about `npcpy`, `npcsh`, and NPC Studio? Be sure to sign up for the [newsletter](https://forms.gle/n1NzQmwjsV4xv1B2A)!
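The PKG-INFO hunks above (mirrored in README.md below) document that any npcsh macro `/command ...` can also be run from bash as `npc command ...`, and that `/vixynt` now advertises an `--n_images` option. As a minimal sketch, here is how that same macro might be driven from Python via the standard library, assuming the `npc` CLI is on the PATH and that the bash form accepts the same flags as the in-shell usage string (both assumptions, not confirmed by this diff):

```python
import subprocess

# Hypothetical invocation: "/vixynt a watercolor fox --n_images 2" in npcsh
# becomes "npc vixynt ..." in bash, per the macro-conversion rule quoted above.
result = subprocess.run(
    ["npc", "vixynt", "a watercolor fox", "--n_images", "2"],
    capture_output=True,
    text=True,
)
print(result.stdout)
```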
--- npcsh-1.0.25/README.md
+++ npcsh-1.0.26/README.md
@@ -25,7 +25,7 @@ Once installed: run
 ```bash
 npcsh
 ```
-and you will enter the NPC shell. Additionally, the pip installation includes
+and you will enter the NPC shell. Additionally, the pip installation includes the following CLI tools available in bash: `corca`, `guac`, `npc` cli, `pti`, `spool`, `wander`, and `yap`.
 
 
 # Usage
@@ -104,12 +104,14 @@ The core of npcsh's capabilities is powered by the NPC Data Layer. Upon initiali
 Users can extend NPC capabilities through simple YAML files:
 
 - **NPCs** (.npc): are defined with a name, primary directive, and optional model specifications
-- **Jinxs** (.jinx):
-- **Context** (.ctx): Specify contextual information, team preferences, MCP server paths, database connections, and other environment variables that are loaded for the team. Teams are specified by their path and the team name in the `<team>.ctx` file. Teams organize collections of NPCs with shared context and specify a coordinator within the team context
+- **Jinxs** (.jinx): Jinja execution templates that provide function-like capabilities and scalable extensibility through Jinja references that let jinxs call and build upon other jinxs. Jinxs are executed through prompt-based flows, allowing them to be used by models regardless of their tool-calling capabilities, making it possible to enable agents at the edge of computing through this simple methodology.
+- **Context** (.ctx): Specify contextual information, team preferences, MCP server paths, database connections, and other environment variables that are loaded for the team or for specific agents (e.g. `GUAC_FORENPC`). Teams are specified by their path and the team name in the `<team>.ctx` file. Teams organize collections of NPCs with shared context and specify a coordinator within the team context who is used whenever the team is called upon for orchestration.
 
 The NPC Shell system integrates the capabilities of `npcpy` to maintain conversation history, track command execution, and provide intelligent autocomplete through an extensible command routing system. State is preserved between sessions, allowing for continuous knowledge building over time.
 
-This architecture enables complex AI workflows while maintaining a simple, declarative syntax that abstracts away implementation complexity. By organizing AI capabilities
+This architecture enables users to build complex AI workflows while maintaining a simple, declarative syntax that abstracts away implementation complexity. By organizing AI capabilities in composable data structures rather than code, `npcsh` creates a more accessible and adaptable framework for AI automation that can scale more intentionally. Within teams there can be sub-teams, and these sub-teams may be called upon for orchestration; importantly, when the orchestrator is deciding between using one of its own team's NPCs and yielding to a sub-team, it sees only the descriptions of the sub-teams rather than the full persona descriptions of each sub-team's agents, making it easier for the orchestrator to delineate options and keep its attention focused by restricting the number of choices in each decision step. Thus, it may yield to the sub-team's orchestrator, letting that orchestrator decide which of its own team's NPCs to use.
+
+Importantly, users can switch easily between the NPCs they are chatting with by typing `/n npc_name` within the NPC shell. Likewise, they can create Jinxs and then use them from within the NPC shell by invoking the jinx name and the arguments required for the Jinx: `/<jinx_name> arg1 arg2`
 
 # Macros
 - activated by invoking `/<command> ...` in `npcsh`, macros can be called in bash or through the `npc` CLI. In our examples, we provide both `npcsh` calls as well as bash calls with the `npc` cli where relevant. For converting any `/<command>` in `npcsh` to a bash version, replace the `/` with `npc ` and the macro command will be invoked as a positional argument. Some, like breathe, flush,
@@ -176,7 +178,11 @@ To see more about how to use the macros and modes in the NPC Shell, read the doc
 - `npcsh` works with local and enterprise LLM providers through its LiteLLM integration, allowing users to run inference from Ollama, LMStudio, vLLM, MLX, OpenAI, Anthropic, Gemini, and Deepseek, making it a versatile tool for both simple commands and sophisticated AI-driven tasks.
 
 ## NPC Studio
-There is a graphical user interface that makes use of the NPC Toolkit through the NPC Studio. See the
+There is a graphical user interface that makes use of the NPC Toolkit through NPC Studio. See the source code for NPC Studio [here](https://github.com/npc-worldwide/npc-studio). Download the executables at [our website](https://enpisi.com/npc-studio). For the most up-to-date version, you can use NPC Studio by invoking it in npcsh
+```
+/npc-studio
+```
+which will download, set up, and serve the NPC Studio application within your `~/.npcsh` folder. It requires `npm` and `node` to work.
 
 ## Mailing List
 Interested to stay in the loop and to hear the latest and greatest about `npcpy`, `npcsh`, and NPC Studio? Be sure to sign up for the [newsletter](https://forms.gle/n1NzQmwjsV4xv1B2A)!
--- npcsh-1.0.25/npcsh/routes.py
+++ npcsh-1.0.26/npcsh/routes.py
@@ -976,8 +976,9 @@ def vixynt_handler(command: str, **kwargs):
     provider = safe_get(kwargs, 'igprovider', NPCSH_IMAGE_GEN_PROVIDER)
     height = safe_get(kwargs, 'height', 1024)
     width = safe_get(kwargs, 'width', 1024)
-
+    output_file_base = safe_get(kwargs, 'output_file')
     attachments = safe_get(kwargs, 'attachments')
+    n_images = safe_get(kwargs, 'n_images', 1)  # Get n_images from kwargs
     if isinstance(attachments, str):
         attachments = attachments.split(',')
 
@@ -986,34 +987,51 @@ def vixynt_handler(command: str, **kwargs):
     user_prompt = " ".join(safe_get(kwargs, 'positional_args', []))
 
     if not user_prompt:
-        return {"output": "Usage: /vixynt <prompt> [--output_file path] [--attachments path]", "messages": messages}
+        return {"output": "Usage: /vixynt <prompt> [--output_file path] [--attachments path] [--n_images num]", "messages": messages}
+
     try:
-
+        # Call gen_image, passing n_images and expecting a list of images
+        images_list = gen_image(
             prompt=user_prompt,
             model=model,
             provider=provider,
             npc=npc,
             height=height,
             width=width,
+            n_images=n_images,  # Pass n_images
            input_images=attachments
         )
 
-
-
-
-                os.path.expanduser("~/.npcsh/images/")
-                + f"image_{datetime.now().strftime('%Y%m%d_%H%M%S')}.png"
-            )
-        else:
-            output_file = os.path.expanduser(output_file)
+        saved_files = []
+        if not isinstance(images_list, list):
+            images_list = [images_list] if images_list is not None else []
 
-        image
-
+        for i, image in enumerate(images_list):
+            if image is None:
+                continue
 
-
-
+            if output_file_base is None:
+                os.makedirs(os.path.expanduser("~/.npcsh/images/"), exist_ok=True)
+                current_output_file = (
+                    os.path.expanduser("~/.npcsh/images/")
+                    + f"image_{datetime.now().strftime('%Y%m%d_%H%M%S')}_{i}.png"
+                )
+            else:
+                base_name, ext = os.path.splitext(os.path.expanduser(output_file_base))
+                current_output_file = f"{base_name}_{i}{ext}"
+
+            image.save(current_output_file)
+            image.show()
+            saved_files.append(current_output_file)
+
+        if saved_files:
+            if attachments:
+                output = f"Image(s) edited and saved to: {', '.join(saved_files)}"
+            else:
+                output = f"Image(s) generated and saved to: {', '.join(saved_files)}"
         else:
-            output = f"
+            output = f"No images {'edited' if attachments else 'generated'}."
+
     except Exception as e:
         traceback.print_exc()
         output = f"Error {'editing' if attachments else 'generating'} image: {e}"
@@ -1024,6 +1042,8 @@ def vixynt_handler(command: str, **kwargs):
         "model": model,
         "provider": provider
     }
+
+
 @router.route("wander", "Enter wander mode (experimental)")
 def wander_handler(command: str, **kwargs):
     messages = safe_get(kwargs, "messages", [])
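The routes.py hunks above change `/vixynt` to accept `n_images`, treat the result of `gen_image` as a list, and save each image under an indexed filename. Here is a standalone sketch of just that naming scheme, with `derive_output_path` as a hypothetical helper (not part of routes.py) so the behavior can be seen in isolation:

```python
import os
from datetime import datetime

def derive_output_path(output_file_base, i):
    # Hypothetical helper mirroring the naming logic in the new vixynt_handler code:
    # with no --output_file, images go to ~/.npcsh/images/ with a timestamp and index;
    # with --output_file foo.png, successive images become foo_0.png, foo_1.png, ...
    if output_file_base is None:
        images_dir = os.path.expanduser("~/.npcsh/images/")
        os.makedirs(images_dir, exist_ok=True)
        return images_dir + f"image_{datetime.now().strftime('%Y%m%d_%H%M%S')}_{i}.png"
    base_name, ext = os.path.splitext(os.path.expanduser(output_file_base))
    return f"{base_name}_{i}{ext}"

# Example: three generated images with an explicit base name.
print([derive_output_path("fox.png", i) for i in range(3)])
# ['fox_0.png', 'fox_1.png', 'fox_2.png']
```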
--- npcsh-1.0.25/npcsh.egg-info/PKG-INFO
+++ npcsh-1.0.26/npcsh.egg-info/PKG-INFO
(The hunks for npcsh.egg-info/PKG-INFO are identical to the PKG-INFO hunks shown above: the version bump from 1.0.25 to 1.0.26 and the same long-description updates at lines 125, 204-214, and 276-288.)
--- npcsh-1.0.25/setup.py
+++ npcsh-1.0.26/setup.py
@@ -78,7 +78,7 @@ extra_files = package_files("npcsh/npc_team/")
 
 setup(
     name="npcsh",
-    version="1.0.25",
+    version="1.0.26",
     packages=find_packages(exclude=["tests*"]),
     install_requires=base_requirements, # Only install base requirements by default
     extras_require={
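The setup.py hunk above bumps the package version to 1.0.26. A quick standard-library check to confirm which version is installed locally after upgrading:

```python
from importlib.metadata import version

# Prints the installed npcsh version; expect "1.0.26" after this release.
print(version("npcsh"))
```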
All other files listed above are unchanged between 1.0.25 and 1.0.26.
|