aia 0.5.18 → 0.8.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (51)
  1. checksums.yaml +4 -4
  2. data/.envrc +1 -0
  3. data/.version +1 -1
  4. data/CHANGELOG.md +39 -5
  5. data/README.md +388 -219
  6. data/Rakefile +16 -5
  7. data/_notes.txt +231 -0
  8. data/bin/aia +3 -2
  9. data/examples/README.md +140 -0
  10. data/examples/headlines +21 -0
  11. data/lib/aia/ai_client_adapter.rb +210 -0
  12. data/lib/aia/chat_processor_service.rb +120 -0
  13. data/lib/aia/config.rb +473 -4
  14. data/lib/aia/context_manager.rb +58 -0
  15. data/lib/aia/directive_processor.rb +267 -0
  16. data/lib/aia/{tools/fzf.rb → fzf.rb} +9 -17
  17. data/lib/aia/history_manager.rb +85 -0
  18. data/lib/aia/prompt_handler.rb +178 -0
  19. data/lib/aia/session.rb +215 -0
  20. data/lib/aia/shell_command_executor.rb +109 -0
  21. data/lib/aia/ui_presenter.rb +110 -0
  22. data/lib/aia/utility.rb +24 -0
  23. data/lib/aia/version.rb +9 -6
  24. data/lib/aia.rb +57 -61
  25. data/lib/extensions/openstruct_merge.rb +44 -0
  26. metadata +29 -43
  27. data/LICENSE.txt +0 -21
  28. data/doc/aia_and_pre_compositional_prompts.md +0 -474
  29. data/lib/aia/clause.rb +0 -7
  30. data/lib/aia/cli.rb +0 -452
  31. data/lib/aia/directives.rb +0 -142
  32. data/lib/aia/dynamic_content.rb +0 -26
  33. data/lib/aia/logging.rb +0 -62
  34. data/lib/aia/main.rb +0 -265
  35. data/lib/aia/prompt.rb +0 -275
  36. data/lib/aia/tools/ai_client_backend.rb +0 -92
  37. data/lib/aia/tools/backend_common.rb +0 -58
  38. data/lib/aia/tools/client.rb +0 -197
  39. data/lib/aia/tools/editor.rb +0 -52
  40. data/lib/aia/tools/glow.rb +0 -90
  41. data/lib/aia/tools/llm.rb +0 -77
  42. data/lib/aia/tools/mods.rb +0 -100
  43. data/lib/aia/tools/sgpt.rb +0 -79
  44. data/lib/aia/tools/subl.rb +0 -68
  45. data/lib/aia/tools/vim.rb +0 -93
  46. data/lib/aia/tools.rb +0 -88
  47. data/lib/aia/user_query.rb +0 -21
  48. data/lib/core_ext/string_wrap.rb +0 -73
  49. data/lib/core_ext/tty-spinner_log.rb +0 -25
  50. data/man/aia.1 +0 -272
  51. data/man/aia.1.md +0 -236
data/README.md CHANGED
@@ -1,13 +1,33 @@
1
- # AI Assistant (AIA)
1
+ # AI Assistant (AIA)
2
2
 
3
- `aia` is a command-line utility that facilitates interaction with AI models. It automates the management of pre-compositional prompts and executes generative AI (Gen-AI) commands on those prompts taking advantage of modern LLMs increased context window size.
3
+ **The prompt is the code!**
4
4
 
5
- It leverages the `prompt_manager` gem to manage prompts for the `mods` and `sgpt` CLI utilities. It utilizes "ripgrep" for searching for prompt files. It uses `fzf` for prompt selection based on a search term and fuzzy matching.
5
+ ```plain
6
+ , , AIA is a command-line utility that facilitates
7
+ (\____/) AI Assistant interaction with AI models. It automates the
8
+ (_oo_) Fancy LLM management of pre-compositional prompts and
9
+ (O) is Online executes generative AI (Gen-AI) commands on those
10
+ __||__ \) prompts. AIA includes enhanced features such as
11
+ [/______\] / * embedded directives * shell integration
12
+ / \__AI__/ \/ * embedded Ruby * history management
13
+ / /__\ * interactive chat * prompt workflows
14
+ (\ /____\
15
+ ```
6
16
 
7
- **Most Recent Change**: Refer to the [Changelog](CHANGELOG.md)
17
+ AIA leverages the [prompt_manager gem](https://github.com/madbomber/prompt_manager) to manage prompts. It utilizes the [CLI tool fzf](https://github.com/junegunn/fzf) for prompt selection.
8
18
 
9
- > Just an FYI ... I am working in the `develop` branch to **drop the dependency on backend LLM processors like mods and llm.** I'm refactor aia to use my own universal client gem called ai_client which gives access to all models and all providers.
19
+ **Most Recent Change**: Refer to the [Changelog](CHANGELOG.md)
10
20
 
21
+ **Notable Recent Changes:**
22
+ - **Directive Processing in Chat and Prompts:** You can now use directives in chat sessions and prompt files with the syntax: `//command args`. Supported directives include:
23
+ - `shell`/`sh`: Execute shell commands
24
+ - `ruby`/`rb`: Execute Ruby code
25
+ - `config`/`cfg`: Display or update configuration
26
+ - `include`/`inc`: Include file content
27
+ - `next`: Specify the next prompt ID in a sequence
28
+ - `pipeline`: Specify a pipeline of prompt IDs to process
29
+ - `clear`: Clear the context (handy in a chat session)
30
+ - `help`: Show available directives
11
31
 
12
32
 
13
33
  <!-- Tocer[start]: Auto-generated, don't remove. -->
@@ -15,42 +35,46 @@ It leverages the `prompt_manager` gem to manage prompts for the `mods` and `sgpt
15
35
  ## Table of Contents
16
36
 
17
37
  - [Installation](#installation)
38
+ - [What is a Prompt ID?](#what-is-a-prompt-id)
39
+ - [Embedded Parameters as Placeholders](#embedded-parameters-as-placeholders)
18
40
  - [Usage](#usage)
19
- - [Configuration Using Envars](#configuration-using-envars)
41
+ - [Configuration Options](#configuration-options)
42
+ - [Configuration Flexibility](#configuration-flexibility)
43
+ - [Expandable Configuration](#expandable-configuration)
20
44
  - [Shell Integration inside of a Prompt](#shell-integration-inside-of-a-prompt)
21
- - [Access to System Environment Variables](#access-to-system-environment-variables)
22
45
  - [Dynamic Shell Commands](#dynamic-shell-commands)
46
+ - [Shell Command Safety](#shell-command-safety)
23
47
  - [Chat Session Use](#chat-session-use)
24
48
  - [*E*mbedded *R*u*B*y (ERB)](#embedded-ruby-erb)
25
- - [Chat Session Behavior](#chat-session-behavior)
26
49
  - [Prompt Directives](#prompt-directives)
27
50
  - [Parameter and Shell Substitution in Directives](#parameter-and-shell-substitution-in-directives)
28
- - [`aia` Specific Directive Commands](#aia-specific-directive-commands)
51
+ - [Directive Syntax](#directive-syntax)
52
+ - [AIA Specific Directive Commands](#aia-specific-directive-commands)
29
53
  - [//config](#config)
30
54
  - [//include](#include)
31
55
  - [//ruby](#ruby)
32
56
  - [//shell](#shell)
33
- - [Backend Directive Commands](#backend-directive-commands)
57
+ - [//next](#next)
58
+ - [//pipeline](#pipeline)
34
59
  - [Using Directives in Chat Sessions](#using-directives-in-chat-sessions)
35
60
  - [Prompt Sequences](#prompt-sequences)
36
61
  - [--next](#--next)
37
62
  - [--pipeline](#--pipeline)
38
63
  - [Best Practices ??](#best-practices-)
39
- - [Example pipline](#example-pipline)
64
+ - [Example pipeline](#example-pipeline)
40
65
  - [All About ROLES](#all-about-roles)
41
- - [The --roles_dir (AIA_ROLES_DIR)](#the---roles_dir-aia_roles_dir)
66
+ - [The --roles_prefix (AIA_ROLES_PREFIX)](#the---roles_prefix-aia_roles_prefix)
42
67
  - [The --role Option](#the---role-option)
43
68
  - [Other Ways to Insert Roles into Prompts](#other-ways-to-insert-roles-into-prompts)
44
69
  - [External CLI Tools Used](#external-cli-tools-used)
45
- - [Optional External CLI-tools](#optional-external-cli-tools)
46
- - [Backend Processor `llm`](#backend-processor-llm)
47
- - [Backend Processor `sgpt`](#backend-processor-sgpt)
48
- - [Occassionally Useful Tool `plz`](#occassionally-useful-tool-plz)
49
70
  - [Shell Completion](#shell-completion)
50
71
  - [My Most Powerful Prompt](#my-most-powerful-prompt)
51
72
  - [My Configuration](#my-configuration)
73
+ - [Executable Prompts](#executable-prompts)
52
74
  - [Development](#development)
53
75
  - [Contributing](#contributing)
76
+ - [History of Development](#history-of-development)
77
+ - [Roadmap](#roadmap)
54
78
  - [License](#license)
55
79
 
56
80
  <!-- Tocer[finish]: Auto-generated, don't remove. -->
@@ -62,137 +86,251 @@ Install the gem by executing:
62
86
 
63
87
  gem install aia
64
88
 
65
-
66
89
  Install the command-line utilities by executing:
67
90
 
68
- brew install mods fzf ripgrep
91
+ brew install fzf
69
92
 
70
93
  You will also need to establish a directory in your file system where your prompt text files, last used parameters and usage log files are kept.
71
94
 
72
- Setup a system environment variable (envar) named "AIA_PROMPTS_DIR" that points to your prompts directory. The default is in your HOME directory named ".prompts". The envar "AIA_ROLES_DIR" points to your role directory where you have prompts that define the different roles you want the LLM to assume when it is doing its work. The default roles directory is inside the prompts directory. Its name is "roles".
95
+ Setup a system environment variable (envar) named "AIA_PROMPTS_DIR" that points to your prompts directory. The default is a directory named ".prompts" in your HOME directory. The envar "AIA_ROLES_PREFIX" names the prefix for prompts that define the different roles you want the LLM to assume when it is doing its work. The default roles prefix is "roles".
73
96
 
74
97
  You may also want to install the completion script for your shell. To get a copy of the completion script do:
75
98
 
76
- `aia --completion bash`
99
+ ```bash
100
+ aia --completion bash
101
+ ```
77
102
 
78
103
  `fish` and `zsh` are also available.
79
104
 
105
+ ## What is a Prompt ID?
80
106
 
81
- ## Usage
107
+ A prompt ID is the basename of a text file (extension *.txt) located in a prompts directory. The prompts directory is specified by the environment variable "AIA_PROMPTS_DIR". If this variable is not set, the default is a directory named ".prompts" in your HOME directory. It can also be set on the command line with the `--prompts-dir` option.
108
+
109
+ This file contains the context and instructions for the LLM to follow. The prompt ID is what you use as an option on the command line to specify which prompt text file to use. Prompt files can have comments, parameters, directives and ERB blocks along with the instruction text to feed to the LLM. They can also have shell commands and use system environment variables. Consider the following example:
110
+
111
+ ```plaintext
112
+ #!/usr/bin/env aia run
113
+ # ~/.prompts/example.txt
114
+ # Desc: Be an example prompt with all? the bells and whistles
115
+
116
+ # Set the configuration for this prompt
117
+
118
+ //config model = gpt-4
119
+ //config temperature = 0.7
120
+ //config shell = true
121
+ //config erb = true
122
+ //config out_file = path/to/output.md
123
+
124
+ # Add some file content to the context/instructions
125
+
126
+ //include path/to/file
127
+ //shell cat path/to/file
128
+ $(cat path/to/file)
129
+
130
+ # Setup some workflows
82
131
 
83
- The usage report obtained using either `-h` or `--help` is implemented as a standard `man` page. You can use both `--help --verbose` of `-h -v` together to get not only the `aia` man page but also the usage report from the `backend` LLM processing tool.
132
+ //next next_prompt_id
133
+ //pipeline prompt_id_1, prompt_id_2, prompt_id_3
84
134
 
85
- ```shell
86
- $ aia --help
135
+ # Execute some Ruby code
136
+
137
+ //ruby require 'some_library' # inserts into the context/instructions
138
+ <% some_ruby_things # not inserted into the context %>
139
+ <%= some_other_ruby_things # that are part of the context/instructions %>
140
+
141
+ Tell me how to do something for a $(uname -s) platform that would rename all
142
+ of the files in the directory $MY_DIRECTORY to have a prefix of for its filename
143
+ that is [PREFIX] and a ${SUFFIX}
144
+
145
+ # directives, ERB blocks and other junk can be used
146
+ # anywhere in the file mixing dynamic context/instructions with
147
+ # the static stuff.
148
+
149
+ __END__
150
+
151
+ Block comments that are not part of the context or instructions to
152
+ the LLM. Stuff that is just here for documentation.
87
153
  ```
88
154
 
89
- ## Configuration Using Envars
155
+ That is just about everything, including the kitchen sink, that a pre-compositional parameterized prompt file can have. It can be an executable with a she-bang line and a special system prompt named `run` as shown in the example. It has line comments that use the `#` symbol. It has end-of-file block comments that appear after the "__END__" line. It has directive commands that begin with the double slash `//` - an homage to IBM JCL. It has shell variables in both forms. It has shell commands. It has parameters that default to a regex that uses square brackets and all uppercase characters to define the parameter name whose value is to be given in a Q&A session before the prompt is sent to the LLM for processing.
90
156
 
91
- The `aia` configuration defaults can be over-ridden by system environment variables *(envars)* with the prefix "AIA_" followed by the config item name also in uppercase. All configuration items can be over-ridden in this way by an envar. The following table show a few examples.
157
+ AIA has the ability to define a workflow of prompt IDs with either the //next or //pipeline directives.
92
158
 
93
- | Config Item | Default Value | envar key |
94
- | ------------- | ------------- | --------- |
95
- | backend | mods | AIA_BACKEND |
96
- | config_file | nil | AIA_CONFIG_FILE |
97
- | debug | false | AIA_DEBUG |
98
- | edit | false | AIA_EDIT |
99
- | extra | '' | AIA_EXTRA |
100
- | fuzzy | false | AIA_FUZZY |
101
- | log_file | ~/.prompts/_prompts.log | AIA_LOG_FILE |
102
- | markdown | true | AIA_MARKDOWN |
103
- | model | gpt-4-1106-preview | AIA_MODEL |
104
- | out_file | STDOUT | AIA_OUT_FILE |
105
- | prompts_dir | ~/.prompts | AIA_PROMPTS_DIR |
106
- | speech_model. | tts-1 | AIA_SPEECH_MODEL |
107
- | verbose | FALSE | AIA_VERBOSE |
108
- | voice | alloy | AIA_VOICE |
159
+ You could say that instead of the prompt being part of a program, a program can be part of the prompt. **The prompt is the code!**
109
160
 
161
+ By using ERB you can make parts of the context/instructions conditional. You can also use ERB to make parts of the context/instructions dynamic for example to pull information from a database or an API.
110
162
 
163
+ ## Embedded Parameters as Placeholders
111
164
 
112
- See the `@options` hash in the `cli.rb` file for a complete list. There are some config items that do not necessarily make sense for use as an envar over-ride. For example if you set `export AIA_DUMP_FILE=config.yaml` then `aia` would dump the current configuration config.yaml and exit every time it is ran until you finally `unset AIA_DUMP_FILE`
165
+ In the example prompt text file above I used the default regex to define parameters as all upper case characters plus space, underscore and the vertical pipe enclosed within square brackets. Since I originally started writing AIA, I've seen more developers use double curly braces to define parameters. AIA allows you to specify your own regex as a string. If you want the curly brackets, use the `--regex` option on the command line like this:
113
166
 
114
- In addition to these config items for `aia` the optional command line parameters for the backend prompt processing utilities (mods and sgpt) can also be set using envars with the "AIA_" prefix. For example "export AIA_TOPP=1.0" will set the "--topp 1.0" command line option for the `mods` utility when its used as the backend processor.
167
+ `--regex '(?-mix:({{[a-zA-Z _|]+}}))'`
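A parameter regex of either style simply scans the prompt text for placeholder names. The following is an illustrative sketch only (not the prompt_manager gem's actual code) showing how the default and curly-brace regexes pick out parameters:

```ruby
# Illustration: extracting parameter placeholders from prompt text
# with the default [UPPERCASE] regex versus a {{curly}} regex.
prompt = "Rename files in [MY_DIR] to use the {{prefix}} placeholder"

default_regex = /(\[[A-Z _|]+\])/      # default style: [UPPERCASE]
curly_regex   = /({{[a-zA-Z _|]+}})/   # alternative style: {{curly}}

puts prompt.scan(default_regex).flatten.inspect  # ["[MY_DIR]"]
puts prompt.scan(curly_regex).flatten.inspect    # ["{{prefix}}"]
```

Whichever regex is active, each matched placeholder becomes a question in the Q&A session before the prompt is sent to the LLM.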
115
168
 
116
- ## Shell Integration inside of a Prompt
117
169
 
118
- Using the option `--shell` enables `aia` to access your terminal's shell environment from inside the prompt text.
170
+ ## Usage
171
+
172
+ The usage report is obtained with either the `-h` or `--help` option.
173
+
174
+ ```bash
175
+ aia --help
176
+ ```
177
+
178
+ ## Configuration Options
179
+
180
+ The following table provides a comprehensive list of configuration options, their default values, and the associated environment variables:
181
+
182
+ | Option | Default Value | Environment Variable |
183
+ |-------------------------|---------------------------------|---------------------------|
184
+ | out_file | temp.md | AIA_OUT_FILE |
185
+ | log_file | ~/.prompts/_prompts.log | AIA_LOG_FILE |
186
+ | prompts_dir | ~/.prompts | AIA_PROMPTS_DIR |
187
+ | roles_prefix | roles | AIA_ROLES_PREFIX |
188
+ | model | gpt-4o-mini | AIA_MODEL |
189
+ | speech_model | tts-1 | AIA_SPEECH_MODEL |
190
+ | transcription_model | whisper-1 | AIA_TRANSCRIPTION_MODEL |
191
+ | verbose | false | AIA_VERBOSE |
192
+ | markdown | true | AIA_MARKDOWN |
193
+ | shell | false | AIA_SHELL |
194
+ | erb | false | AIA_ERB |
195
+ | chat | false | AIA_CHAT |
196
+ | clear | false | AIA_CLEAR |
197
+ | terse | false | AIA_TERSE |
198
+ | debug | false | AIA_DEBUG |
199
+ | fuzzy | false | AIA_FUZZY |
200
+ | speak | false | AIA_SPEAK |
201
+ | append | false | AIA_APPEND |
202
+ | temperature | 0.7 | AIA_TEMPERATURE |
203
+ | max_tokens | 2048 | AIA_MAX_TOKENS |
204
+ | top_p | 1.0 | AIA_TOP_P |
205
+ | frequency_penalty | 0.0 | AIA_FREQUENCY_PENALTY |
206
+ | presence_penalty | 0.0 | AIA_PRESENCE_PENALTY |
207
+ | image_size | 1024x1024 | AIA_IMAGE_SIZE |
208
+ | image_quality | standard | AIA_IMAGE_QUALITY |
209
+ | image_style | vivid | AIA_IMAGE_STYLE |
210
+ | embedding_model | text-embedding-ada-002 | AIA_EMBEDDING_MODEL |
211
+ | speak_command | afplay | AIA_SPEAK_COMMAND |
212
+ | require_libs | [] | AIA_REQUIRE_LIBS |
213
+ | regex | '(?-mix:(\\[[A-Z _|]+\\]))' | AIA_REGEX |
214
+
215
+ These options can be configured via command-line arguments, environment variables, or configuration files.
216
+
217
+ ### Configuration Flexibility
218
+
219
+ AIA determines configuration settings using the following order of precedence:
220
+
221
+ 1. Embedded config directives
222
+ 2. Command-line arguments
223
+ 3. Environment variables
224
+ 4. Configuration files
225
+ 5. Default values
226
+
227
+ For example, let's consider the `model` option. Suppose the following conditions:
228
+
229
+ - Default value is "gpt-4o-mini"
230
+ - No entry in the config file
231
+ - No environment variable value for `AIA_MODEL`
232
+ - No command-line argument provided for `--model`
233
+ - No embedded directive like `//config model = some-fancy-llm`
234
+
235
+ In this scenario, the model used will be "gpt-4o-mini". However, you can override this default by setting the model at any level of the precedence order. Additionally, you can dynamically ask for user input by incorporating an embedded directive with a placeholder parameter, such as `//config model = [PROCESS_WITH_MODEL]`. When processing the prompt, AIA will prompt you to input a value for `[PROCESS_WITH_MODEL]`.
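The precedence chain described above can be sketched as a simple lookup: the first level that supplies a value wins. This is a hypothetical illustration of the idea, not AIA's actual configuration code:

```ruby
# Hypothetical sketch of the precedence order described above:
# embedded directive > command-line argument > envar > config file > default.
def resolve(directive: nil, cli: nil, env: nil, file: nil, default:)
  directive || cli || env || file || default
end

# No overrides anywhere, so the default wins.
puts resolve(default: "gpt-4o-mini")                            # gpt-4o-mini

# An envar is set, but a command-line argument outranks it.
puts resolve(cli: "gpt-4", env: "gpt-3.5-turbo", default: "gpt-4o-mini")  # gpt-4
```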
236
+
237
+ If you do not like the default regex used to identify parameters within the prompt text, don't worry: you can change it using the `--regex` option.
238
+
239
+ ### Expandable Configuration
240
+
241
+ The configuration options are expandable through a config file, allowing you to add custom entries. For example, you can define a custom configuration item like "xyzzy" in your config file. This value can then be accessed in your prompts using `AIA.config.xyzzy` within a `//ruby` directive or an ERB block, enabling dynamic prompt generation based on your custom configurations.
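As a rough model of the custom-entry idea, here is a self-contained sketch using `OpenStruct` and ERB to stand in for a config object carrying a custom "xyzzy" entry. The `config` object here is an assumption for illustration; AIA's real config object may behave differently:

```ruby
require "erb"
require "ostruct"

# Stand-in for AIA.config after a config file adds a custom "xyzzy"
# entry (illustrative only; not AIA's actual config loader).
config = OpenStruct.new(model: "gpt-4o-mini", xyzzy: "plugh")

# An ERB block in a prompt could then reference the custom entry.
template = ERB.new("The magic word is <%= config.xyzzy %>.")
puts template.result(binding)   # The magic word is plugh.
```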
119
242
 
120
- #### Access to System Environment Variables
243
+ ## Shell Integration inside of a Prompt
121
244
 
122
- `aia` can replace any system environment variable (envar) references in the prompt text with the value of the envar. Patterns like $USER and ${USER} in the prompt will be replaced with that envar's value - the name of the user's account. Any envar can be used.
245
+ Using the option `--shell` enables AIA to access your terminal's shell environment from inside the prompt text.
123
246
 
124
247
  #### Dynamic Shell Commands
125
248
 
126
- Dynamic content can be inserted into the prompt using the pattern $(shell command) where the output of the shell command will replace the $(...) pattern.
249
+ Dynamic content can be inserted into the prompt using the pattern $(shell command) where the output of the shell command will replace the $(...) pattern. It will become part of the context / instructions for the prompt.
127
250
 
128
- Consider the power to tailoring a prompt to your specific operating system:
251
+ Consider the power of tailoring a prompt to your specific operating system:
129
252
 
130
- ```
253
+ ```plaintext
131
254
  As a system administration on a $(uname -v) platform what is the best way to [DO_SOMETHING]
132
255
  ```
133
256
 
134
- or insert content from a file in your home directory:
257
+ Or insert content from a file in your home directory:
135
258
 
136
- ```
259
+ ```plaintext
137
260
  Given the following constraints $(cat ~/3_laws_of_robotics.txt) determine the best way to instruct my roomba to clean my kids room.
138
261
  ```
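Conceptually, the $(...) expansion above amounts to scanning the prompt for the pattern and splicing in each command's standard output. A minimal sketch, assuming a simple non-nested pattern (AIA's real implementation differs):

```ruby
# Illustrative $(...) expansion: replace each $(command) pattern with
# the command's standard output, as the --shell option does conceptually.
def expand_shell(prompt)
  prompt.gsub(/\$\((.+?)\)/) { `#{Regexp.last_match(1)}`.chomp }
end

puts expand_shell("Running on a $(echo Linux) platform")
# Running on a Linux platform
```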
139
262
 
263
+ #### Shell Command Safety
264
+
265
+ The catchphrase "the prompt is the code" within AIA means that you have the power to execute any command you want, but you must be careful not to execute commands that could cause harm. AIA is not going to protect you from doing something stupid. Sure, that's a copout; I just can't think of (actually I can) all the ways you can mess things up writing code. Remember what we learned from Forrest Gump: "Stupid is as stupid does." So don't do anything stupid. If someone gives you a prompt and says "run this with AIA," you had better review the prompt before processing it.
266
+
140
267
  #### Chat Session Use
141
268
 
142
- When you use the `--shell` option to start a chat session, shell integration is available in your follow up prompts. Suppose you started up a chat session using a roll of "Ruby Expert" expecting to chat about changes that could be made to a specific class BUT you forgot to include the class source file as part of the context when you got started. You could enter this as your follow up prompt to this to keep going:
269
+ When you use the `--shell` option to start a chat session, shell integration is available in your follow up prompts. Suppose you started a chat session (--chat) using a role of "Ruby Expert" expecting to chat about changes that could be made to a specific class BUT you forgot to include the class source file as part of the context when you got started. You could enter this as your follow up prompt to keep going:
143
270
 
144
- ```
271
+ ```plaintext
145
272
  The class I want to chat about refactoring is this one: $(cat my_class.rb)
146
273
  ```
147
274
 
148
- That inserts the entire class source file into your follow up prompt. You can continue chatting with you AI Assistant avout changes to the class.
275
+ That inserts the entire class source file into your follow up prompt. You can continue chatting with your AI Assistant about changes to the class.
149
276
 
150
277
  ## *E*mbedded *R*u*B*y (ERB)
151
278
 
152
- The inclusion of dynamic content through the shell integration provided by the `--shell` option is significant. `aia` also provides the full power of embedded Ruby code processing within the prompt text.
279
+ The inclusion of dynamic content through the shell integration provided by the `--shell` option is significant. AIA also provides the full power of embedded Ruby code processing within the prompt text.
153
280
 
154
281
  The `--erb` option turns the prompt text file into a fully functioning ERB template. The [Embedded Ruby (ERB) template syntax (2024)](https://bophin-com.ngontinh24.com/article/language-embedded-ruby-erb-template-syntax) provides a good overview of the syntax and power of ERB.
155
282
 
156
- Most websites that have information about ERB will give examples of how to use ERB to generate dynamice HTML content for web-based applications. That is a common use case for ERB. `aia` on the other hand uses ERB to generate dynamic prompt text.
283
+ Most websites that have information about ERB will give examples of how to use ERB to generate dynamic HTML content for web-based applications. That is a common use case for ERB. AIA on the other hand uses ERB to generate dynamic prompt text for LLM processing.
157
284
 
158
- ### Chat Session Behavior
159
-
160
- In a chat session whether started by the `--chat` option or its equivalent with a directive within a prompt text file behaves a little differently w/r/t its binding and local variable assignments. Since a chat session by definition has multiple prompts, setting a local variable in one prompt and expecting it to be available in a subsequent prompt does not work. You need to use instance variables to accomplish this prompt to prompt carry over of information.
161
-
162
- Also since follow up prompts are expected to be a single thing - sentence or paragraph - terminated by a single return, its likely that ERB enhance will be of benefit; but, you may find a use for it.
163
285
 
164
286
  ## Prompt Directives
165
287
 
166
- Downstream processing directives were added to the `prompt_manager` gem used by `au` at version 0.4.1. These directives are lines in the prompt text file that begin with "//" having this pattern:
288
+ Downstream processing directives were added to the `prompt_manager` gem used by AIA at version 0.4.1. These directives are lines in the prompt text file that begin with "//" having this pattern:
167
289
 
168
- ```
169
- //command parameters
290
+ ```bash
291
+ //command params
170
292
  ```
171
293
 
172
- There is no space between the "//" and the command.
294
+ There is no space between the "//" and the command. Commands do not require params. When more than one param is needed, they are typically space delimited; it all depends on the command.
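Parsing a directive line amounts to stripping the leading "//" and splitting off the command. The following is a hypothetical sketch, not the actual directive processor code:

```ruby
# Hypothetical parser for directive lines of the form "//command params".
def parse_directive(line)
  return nil unless line.start_with?("//")
  command, params = line[2..].split(" ", 2)
  { command: command, params: params.to_s }
end

d = parse_directive("//shell calc 22/7")
puts d[:command]   # shell
puts d[:params]    # calc 22/7
```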
173
295
 
174
- ### Parameter and Shell Substitution in Directives
296
+ ### Parameter and Shell Substitution in Directives
175
297
 
176
298
  When you combine prompt directives with prompt parameters and shell envar substitutions you can get some powerful compositional prompts.
177
299
 
178
300
  Here is an example of a pure generic directive.
179
301
 
180
- ```
302
+ ```bash
181
303
  //[DIRECTIVE_NAME] [DIRECTIVE_PARAMS]
182
304
  ```
183
305
 
184
306
  When the prompt runs, you will be asked to provide a value for each of the parameters. You could answer "shell" for the directive name and "calc 22/7" if you wanted a bad approximation of PI.
185
307
 
186
308
  Try this prompt file:
187
- ```
309
+ ```bash
188
310
  //shell calc [FORMULA]
189
311
 
190
312
  What does that number mean to you?
191
313
  ```
192
314
 
193
- ### `aia` Specific Directive Commands
315
+ ### Directive Syntax
316
+
317
+ Directives can be entered in chat or prompt files using the following syntax:
318
+ - `//command args`
319
+
320
+ Supported directives:
321
+ - `help`: Show available directives
322
+ - `shell` or `sh`: Execute a shell command
323
+ - `ruby` or `rb`: Execute Ruby code
324
+ - `config` or `cfg`: Show or update configuration
325
+ - `include` or `inc`: Include file content
326
+ - `next`: Set/Show the next prompt ID to be processed
327
+ - `pipeline`: Set/Extend/Show the workflow of prompt IDs
194
328
 
195
- At this time `aia` only has a few directives which are detailed below.
329
+ When a directive produces output, it is added to the chat context. If there is no output, you are prompted again.
330
+
331
+ ### AIA Specific Directive Commands
332
+
333
+ At this time AIA only has a few directives which are detailed below.
196
334
 
197
335
  #### //config
198
336
 
@@ -204,25 +342,23 @@ The switch options are treated like booleans. They are either `true` or `false`
204
342
 
205
343
 To set the value of a switch using `//config` for example `--terse` or `--chat` do this:
206
344
 
207
- ```
345
+ ```bash
208
346
  //config chat? = true
209
347
  //config terse? = true
210
348
  ```
211
349
 
212
350
  A configuration item such as `--out_file` or `--model` has an associated value on the command line. To set that value with the `//config` directive do it like this:
213
351
 
214
- ```
352
+ ```bash
215
353
  //config model = gpt-3.5-turbo
216
354
  //config out_file = temp.md
217
- //config backend = mods
218
355
  ```
219
356
 
220
357
+ BTW: the "=" is completely optional. It's actually ignored, as is ":=" if you were to choose that as your assignment operator. Also, the number of spaces between the item and the value is completely arbitrary. I like to line things up, so this syntax is just as valid:
221
358
 
222
- ```
359
+ ```bash
223
360
  //config model gpt-3.5-turbo
224
361
  //config out_file temp.md
225
- //config backend mods
226
362
  //config chat? true
227
363
  //config terse? true
228
364
  //config model gpt-4
@@ -233,7 +369,7 @@ NOTE: if you specify the same config item name more than once within the prompt
233
369
  #### //include
234
370
 
235
371
  Example:
236
- ```
372
+ ```bash
237
373
  //include path_to_file
238
374
  ```
239
375
 
@@ -241,100 +377,120 @@ The `path_to_file` can be either absolute or relative. If it is relative, it is
241
377
 
242
378
  The file that is included will have any comments or directives excluded. It is expected that the file will be a text file so that its content can be pre-pended to the existing prompt; however, if the file is a source code file (ex: file.rb) the source code will be included HOWEVER any comment line or line that starts with "//" will be excluded.
243
379
 
244
- TODO: Consider adding a command line option `--include_dir` to specify the place from which relative files are to come.
245
380
 
246
381
  #### //ruby
247
- Example:
248
- ```
249
- //ruby any_code_that_returns_an_instance_of_String
250
- ```
251
382
 
252
- This directive is in addition to ERB. At this point the `//ruby` directive is limited by the current binding which is within the `AIA::Directives#ruby` method. As such it is not likely to see much use.
383
+ The `//ruby` directive executes Ruby code. You can use this to perform complex operations or interact with Ruby libraries.
253
384
 
254
- However, sinces it implemented as a simple `eval(code)` then there is a potential for use like this:
255
- ```
256
- //ruby load(some_ruby_file); execute_some_method
385
+ For example:
386
+ ```ruby
387
+ //ruby puts "Hello from Ruby"
257
388
  ```
258
389
 
259
- Each execution of a `//ruby` directive will be a fresh execution of the `AIA::Directives#ruby` method so you cannot carry local variables from one invocation to another; however, you could do something with instance variables or global variables. You might even add something to the `AIA.config` object to be pasted on to the next invocation of the directive within the context of the same prompt.
390
+ You can also use the `--rq` option to specify Ruby libraries to require before executing Ruby code:
391
+
392
+ ```bash
393
+ # Command line
394
+ aia --rq json,csv my_prompt
395
+
396
+ # In chat
397
+ //ruby JSON.parse('{"data": [1,2,3]}')["data"]
398
+ ```
260
399
 
261
400
  #### //shell
262
401
  Example:
263
- ```
402
+ ```bash
264
403
  //shell some_shell_command
265
404
  ```
266
405
 
267
406
 It is expected that the shell command will return some text to STDOUT which will be pre-pended to the existing prompt text within the prompt file.
268
407
 
269
408
  There are no limitations on what the shell command can be. For example if you wanted to bypass the stripping of comments and directives from a file you could do something like this:
270
- ```
409
+ ```bash
271
410
  //shell cat path_to_file
272
411
  ```
273
412
 
274
413
  Which does basically the same thing as the `//include` directive, except it uses the entire content of the file. For relative file paths the same thing applies. The file's path will be relative to the PWD.
275
414
 
276
415
 
416
+ #### //next
417
+ Examples:
418
+ ```bash
419
+ # Show the next prompt ID
420
+ //next
277
421
 
278
- ### Backend Directive Commands
422
+ # Set the next prompt ID
423
+ //next prompt_id
279
424
 
280
- See the source code for the directives supported by the backends which at this time are configuration-based as well.
425
+ # Same as
426
+ //config next
427
+ //config next = prompt_id
428
+ ```
281
429
 
282
- - [mods](lib/aia/tools/mods.rb)
283
- - [sgpt](lib/aia/tools/sgpt.rb)
430
+ #### //pipeline
284
431
 
285
- For example `mods` has a configuration item `topp` which can be set by a directive in a prompt text file directly.
432
+ Examples:
433
+ ```bash
434
+ # Show the current prompt workflow
435
+ //pipeline
286
436
 
287
- ```
288
- //topp 1.5
289
- ```
437
+ # Set the prompt workflow
438
+ //pipeline = prompt_id_1, prompt_id_2, prompt_id_3
439
+
440
+ # Extend the prompt workflow
441
+ //pipeline << prompt_id_4, prompt_id_5, prompt_id_6
290
442
 
291
- If `mods` is not the backend the `//topp` direcive is ignored.
443
+ # Same as
444
+ //config pipeline
445
+ //config pipeline = prompt_id_1, prompt_id_2, prompt_id_3
446
+ //config pipeline << prompt_id_4, prompt_id_5, prompt_id_6
447
+ ```
292
448
 
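The `//next` and `//pipeline` directives can be thought of as maintaining a simple FIFO queue of prompt IDs, where each response becomes context for the next prompt. A conceptual Ruby sketch of that flow (not AIA's actual internals; the "LLM call" is faked):

```ruby
pipeline = %w[one two three]   # //pipeline = one, two, three
pipeline << "four"             # //pipeline << four

context = ""
pipeline.each do |prompt_id|
  # A real run would send (context + prompt text) to the LLM;
  # here we just record the order of execution.
  context += "[#{prompt_id}]"
end

puts context   # => "[one][two][three][four]"
```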
  ### Using Directives in Chat Sessions

- Whe you are in a chat session, you may use a directive as a follow up prompt. For example if you started the chat session with the option `--terse` expecting to get short answers from the backend; but, then you decide that you want more comprehensive answers you may do this:
+ When you are in a chat session, you may use a directive as a follow-up prompt. For example, if you started the chat session with the `--terse` option expecting to get short answers from the LLM but then decide that you want more comprehensive answers, you may do this in a chat follow-up:

- ```
+ ```bash
  //config terse? false
  ```

- The directive is executed and a new follow up prompt can be entered with a more lengthy response generated from the backend.
+ The directive is executed and a new follow-up prompt can be entered with a more lengthy response generated from the LLM.

+ The directive `//clear` clears your entire session context. The LLM will not remember anything you have discussed.

  ## Prompt Sequences

- Why would you need/want to use a sequence of prompts in a batch situation. Maybe you have a complex prompt which exceeds the token limitations of your model for input so you need to break it up into multiple parts. Or suppose its a simple prompt but the number of tokens on the output is limited and you do not get exactly the kind of full response for which you were looking.
+ Why would you need/want to use a sequence of prompts in a batch situation? Maybe you have a complex prompt which exceeds the token limitations of your model for input so you need to break it up into multiple parts. Or suppose it's a simple prompt but the number of tokens on the output is limited and you do not get exactly the kind of full response for which you were looking.

- Sometimes it takes a series of prompts to get the kind of response that you want. The reponse from one prompt becomes a context for the next prompt. This is easy to do within a `chat` session were you are manually entering and adjusting your prompts until you get the kind of response that you want.
+ Sometimes it takes a series of prompts to get the kind of response that you want. The response from one prompt becomes a context for the next prompt. This is easy to do within a `chat` session where you are manually entering and adjusting your prompts until you get the kind of response that you want.

- If you need to do this on a regular basis or within a batch you can use `aia` and the `--next` and `--pipeline` command line options.
+ If you need to do this on a regular basis or within a batch you can use AIA and the `--next` and `--pipeline` command line options.

- These two options specify the sequence of prompt IDs to be processed. Both options are available to be used within a prompt file using the `//config` directive. Like all embedded directives you can take advantage of parameterization shell integration and Ruby. I'm start to feel like TIm Tool man - more power!
+ These two options specify the sequence of prompt IDs to be processed. Both options are available to be used within a prompt file using the `//next` and `//pipeline` directives. Like all embedded directives you can take advantage of parameterization, shell integration, and Ruby. With this kind of dynamic content and control flow in your prompts you will start to feel like Tim the Tool Man - more power!

  Consider the case in which you have 4 prompt IDs that need to be processed in sequence. The IDs and associated prompt file names are:

- | Promt ID | Prompt File |
- | -------- | ----------- |
- | one. | one.txt |
- | two. | two.txt |
- | three. | three.txt |
- | four. | four.txt |
+ | Prompt ID | Prompt File |
+ | --------- | ----------- |
+ | one | one.txt |
+ | two | two.txt |
+ | three | three.txt |
+ | four | four.txt |


  ### --next

- ```shell
- export AIA_OUT_FILE=temp.md
- aia one --next two
- aia three --next four temp.md
+ ```bash
+ aia one --next two --out_file temp.md
+ aia three --next four temp.md -o answer.md
  ```

- or within each of the prompt files you use the config directive:
+ or within each of the prompt files you use the `//next` directive:

- ```
- one.txt contains //config next two
- two.txt contains //config next three
- three.txt contains //config next four
+ ```bash
+ one.txt contains //next two
+ two.txt contains //next three
+ three.txt contains //next four
  ```
  BUT if you have more than two prompts in your sequence then consider using the --pipeline option.
 
@@ -342,11 +498,15 @@ BUT if you have more than two prompts in your sequence then consider using the -

  ### --pipeline

- `aia one --pipeline two,three,four`
+ ```bash
+ aia one --pipeline two,three,four
+ ```

  or inside of the `one.txt` prompt file use this directive:

- `//config pipeline two,three,four`
+ ```bash
+ //pipeline two,three,four
+ ```

  **The directive //pipeline is short for //config pipeline**
 
@@ -356,15 +516,15 @@ Since the response of one prompt is fed into the next prompt within the sequence


  | Prompt File | Directive |
- | --- | --- |
- | one.txt | //config out_file one.md |
- | two.txt | //config out_file two.md |
- | three.txt | //config out_file three.md |
- | four.txt | //config out_file four.md |
+ | ----------- | --------- |
+ | one.txt | //config out_file one.md |
+ | two.txt | //config out_file two.md |
+ | three.txt | //config out_file three.md |
+ | four.txt | //config out_file four.md |

  This way you can see the response that was generated for each prompt in the sequence.

- ### Example pipline
+ ### Example pipeline

  TODO: the audio-to-text is still under development.

@@ -372,24 +532,24 @@ Suppose you have an audio file of a meeting. You want to get a transcription of

  Create two prompts named transcribe.txt and tech_summary.txt

- ```
+ ```bash
  # transcribe.txt
  # Desc: takes one audio file
  # note that there is no "prompt" text only the directive

- //config backend client
  //config model whisper-1
  //next tech_summary
  ```
+
  and

- ```
+ ```bash
  # tech_summary.txt

- //config model gpt-4-turbo
+ //config model gpt-4o-mini
  //config out_file meeting_summary.md

- Review the raw transcript of a technical meeting,
+ Review the raw transcript of a technical meeting,
  summarize the discussion and
  note any action items that were generated.

@@ -398,7 +558,7 @@ Format your response in markdown.

  Now you can do this:

- ```
+ ```bash
  aia transcribe my_tech_meeting.m4a
  ```
 
@@ -407,104 +567,62 @@ Your summary of the meeting is in the file `meeting_summary.md`

  ## All About ROLES

- ### The --roles_dir (AIA_ROLES_DIR)
-
- There are two kinds of prompts
- 1. instructional - tells the LLM what to do
- 2. personification - tells the LLM who it should pretend to be when it does its transformational work.
+ ### The --roles_prefix (AIA_ROLES_PREFIX)

- That second kind of prompt is called a role. Sometimes the role is incorporated into the instruction. For example "As a magician make a rabbit appear out of a hat." To reuse the same role in multiple prompts `aia` encourages you to designate a special `roles_dir` into which you put prompts that are specific to personification - roles.
+ The second kind of prompt is called a role (aka system prompt). Sometimes the role is incorporated into the instruction. For example, "As a magician make a rabbit appear out of a hat." To reuse the same role in multiple prompts, AIA encourages you to designate a special subdirectory for prompts that are specific to personification - roles.

- The default `roles_dir` is a sub-directory of the `prompts_dir` named roles. You can, however, put your `roles_dir` anywhere that makes sense to you.
+ The default `roles_prefix` is set to 'roles'. This creates a subdirectory under the `prompts_dir` where role files are stored. Internally, AIA calculates a `roles_dir` value by joining `prompts_dir` and `roles_prefix`. Keeping role prompts in this subdirectory makes them easier to organize and manage.
 
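The derivation described above amounts to a simple path join. A sketch, using the documented defaults:

```ruby
prompts_dir  = File.expand_path("~/.prompts")  # AIA_PROMPTS_DIR
roles_prefix = "roles"                         # AIA_ROLES_PREFIX default
roles_dir    = File.join(prompts_dir, roles_prefix)
puts roles_dir
```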
  ### The --role Option

- The `--role` option is used to identify a personification prompt within your roles directory which defines the context within which the LLM is to provide its response. The text of the role ID is pre-pended to the text of the primary prompt to form a complete prompt to be processed by the backend.
+ The `--role` option is used to identify a personification prompt within your roles directory which defines the context within which the LLM is to provide its response. The text of the role ID is prepended to the text of the primary prompt to form a complete prompt to be processed by the LLM.

  For example consider:

- ```shell
+ ```bash
  aia -r ruby refactor my_class.rb
  ```

- Within the roles directory the contents of the text file `ruby.txt` will be pre-pre-pended to the contents of the `refactor.txt` file from the prompts directory to produce a complete prompt. That complete prompt will have any parameters followed by directives processed before sending the combined prompt text to the backend.
+ The role ID is `ruby`, the prompt ID is `refactor`, and `my_class.rb` is a context file.
+
+ Within the roles directory the contents of the text file `ruby.txt` will be prepended to the contents of the `refactor.txt` file from the prompts directory to produce a complete prompt. That complete prompt will have any parameters followed by directives processed before sending the combined prompt text and the content of the context file to the LLM.

  Note that `--role` is just a way of saying add this prompt text file to the front of this other prompt text file. The contents of the "role" prompt could be anything. It does not necessarily have to be an actual role.
 
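In other words, `--role` is simple text concatenation. A sketch of the idea (the file contents here are made up for illustration):

```ruby
# role file (e.g. roles/ruby.txt) + prompt file (e.g. refactor.txt)
role_text   = "As a Ruby expert, review the code below.\n"
prompt_text = "Refactor this class to improve readability.\n"

complete_prompt = role_text + prompt_text
puts complete_prompt
```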
- `aia` fully supports a directory tree within the `prompts_dir` as a way of organization or classification of your different prompt text files.
+ AIA fully supports a directory tree within the `prompts_dir` as a way to organize or classify your different prompt text files.

- ```shell
- aia -r sw_eng doc_the_methods my_class.rb
+ ```bash
+ aia -r ruby sw_eng/doc_the_methods my_class.rb
  ```

- In this example the prompt text file `$AIA_ROLES_DIR/sw_eng.txt` is prepended to the prompt text file `$AIA_PROMPTS_DIR/doc_the_methods.txt`
-
+ In this example the role prompt file `ruby.txt` from your roles directory is prepended to the prompt text file `$AIA_PROMPTS_DIR/sw_eng/doc_the_methods.txt`.

  ### Other Ways to Insert Roles into Prompts

- Since `aia` supports parameterized prompts you could make a keyword like "[ROLE]" be part of your prompt. For example consider this prompt:
+ Since AIA supports parameterized prompts you could make a keyword like "[ROLE]" be part of your prompt. For example consider this prompt:

  ```text
  As a [ROLE] tell me what you think about [SUBJECT]
  ```

- When this prompt is processed, `aia` will ask you for a value for the keyword "ROLE" and the keyword "SUBJECT" to complete the prompt. Since `aia` maintains a history of your previous answers, you could just choose something that you used in the past or answer with a completely new value.
+ When this prompt is processed, AIA will ask you for a value for the keyword "[ROLE]" and the keyword "[SUBJECT]" to complete the prompt. Since AIA maintains a history of your previous answers, you could just choose something that you used in the past or answer with a completely new value.
 
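A sketch of how such bracketed keywords can be found and filled in. The regex follows the `[UPPERCASE]` convention shown in the example above; it is not necessarily the exact pattern AIA uses:

```ruby
prompt   = "As a [ROLE] tell me what you think about [SUBJECT]"
keywords = prompt.scan(/\[[A-Z_]+\]/)   # find every [KEYWORD] parameter
puts keywords.inspect                   # => ["[ROLE]", "[SUBJECT]"]

# Fill in answers collected from the user (values here are made up).
answers   = { "[ROLE]" => "historian", "[SUBJECT]" => "Rome" }
completed = prompt.gsub(/\[[A-Z_]+\]/) { |kw| answers[kw] }
puts completed
```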
  ## External CLI Tools Used

- To install the external CLI programs used by aia:
-
- brew install fzf mods rg glow
+ To install the external CLI programs used by AIA:
+
+ brew install fzf

  fzf
  Command-line fuzzy finder written in Go
  [https://github.com/junegunn/fzf](https://github.com/junegunn/fzf)

- mods
- AI on the command-line
- [https://github.com/charmbracelet/mods](https://github.com/charmbracelet/mods)
-
- rg
- Search tool like grep and The Silver Searcher
- [https://github.com/BurntSushi/ripgrep](https://github.com/BurntSushi/ripgrep)
-
- glow
- Render markdown on the CLI
- [https://github.com/charmbracelet/glow](https://github.com/charmbracelet/glow)
-
- A text editor whose executable is setup in the
- system environment variable 'EDITOR' like this:
-
- export EDITOR="subl -w"
-
- ### Optional External CLI-tools
-
- #### Backend Processor `llm`
-
- ```
- llm Access large language models from the command-line
- | brew install llm
- |__ https://llm.datasette.io/
- ```
-
- As of `aia v0.5.13` the `llm` backend processor is available in a limited integration. It is a very powerful python-based implementation that has its own prompt templating system. The reason that it is be included within the `aia` environment is for its ability to make use of local LLM models.
-
-
- #### Backend Processor `sgpt`
-
- `shell-gpt` aka `sgpt` is also a python implementation of a CLI-tool that processes prompts through OpenAI. It has less features than both `mods` and `llm` and is less flexible.
-
- #### Occassionally Useful Tool `plz`
-
- `plz-cli` aka `plz` is not integrated with `aia` however, it gets an honorable mention for its ability to except a prompt that tailored to doing something on the command line. Its response is a CLI command (sometimes a piped sequence) that accomplishes the task set forth in the prompt. It will return the commands to be executed agaist the data files you specified with a query to execute the command.
-
- - brew install plz-cli
503
621
  ## Shell Completion
504
622
 
505
623
  You can setup a completion function in your shell that will complete on the prompt_id saved in your `prompts_dir` - functions for `bash`, `fish` and `zsh` are available. To get a copy of these functions do this:
506
624
 
507
- ```shell
625
+ ```bash
508
626
  aia --completion bash
509
627
  ```
510
628
 
@@ -516,29 +634,30 @@ Copy the function to a place where it can be installed in your shell's instance.

  This is just between you and me so don't go blabbing this around to everyone. My most powerful prompt is in a file named `ad_hoc.txt`. It looks like this:

- > [WHAT NOW HUMAN]
+ ```text
+ [WHAT_NOW_HUMAN]
+ ```

  Yep. Just a single parameter for which I can provide a value of anything that is on my mind at the time. Its advantage is that I do not pollute my shell's command history with lots of text.

- Which do you think is better to have in your shell's history file?
-
- ```shell
- mods "As a certified public accountant specializing in forensic audit and analysis of public company financial statements, what do you think of mine? What is the best way to hide the millions dracma that I've skimmed?" < financial_statement.txt
+ ```bash
+ aia ad_hoc
  ```

- or
+ Or consider this executable prompt file:

- ```shell
- aia ad_hoc financial_statement.txt
+ ```bash
+ #!/usr/bin/env aia run
+ [WHAT_NOW_HUMAN]
  ```

- Both do the same thing; however, `aia` does not put the text of the prompt into the shell's history file.... of course the keyword/parameter value is saved in the prompt's JSON file and the prompt with the response are logged unless `--no-log` is specified; but, its not messing up the shell history!
+ Here the `run` prompt ID has a `run.txt` file in the prompt directory that is basically empty. Or maybe `run.txt` has some prompt instructions for how to run the prompt - some kind of meta-thinking instructions.

  ## My Configuration

  I use the `bash` shell. In my `.bashrc` file I source another file named `.bashrc__aia` which looks like this:

- ```shell
+ ```bash
  # ~/.bashrc__aia
  # AI Assistant

@@ -546,41 +665,91 @@ I use the `bash` shell. In my `.bashrc` file I source another file named `.bash
  export AIA_PROMPTS_DIR=~/.prompts
  export AIA_OUT_FILE=./temp.md
  export AIA_LOG_FILE=$AIA_PROMPTS_DIR/_prompts.log
- export AIA_BACKEND=mods
- export AIA_MODEL=gpt-4-1106-preview
+ export AIA_MODEL=gpt-4o-mini

- # Not a default. Invokes spinner.
+ # Not a default. Invokes spinner. If not true then there is no spinner
+ # for feedback while waiting for the LLM to respond.
  export AIA_VERBOSE=true

- alias chat='aia chat --terse'
+ alias chat='aia --chat --shell --erb --terse'

  # rest of the file is the completion function
  ```

- Here is what my `chat` prompt file looks like:

- ```shell
- # ~/.prompts/chat.txt
- # Desc: Start a chat session

- //config chat? = true
+ ## Executable Prompts
+
+ With all of the capabilities of the AI Assistant, you can create your own executable prompts. These prompts can be used to automate tasks, generate content, or perform any other action that you can think of. All you need to get started with executable prompts is a prompt that does not do anything. For example consider my `run.txt` prompt.
+
+ ```bash
+ # ~/.prompts/run.txt
+ # Desc: Run executable prompts coming in via STDIN
+ ```
+
+ Remember that the '#' character indicates a comment line making the `run` prompt ID basically a do nothing prompt.
+
+ An executable prompt can reside anywhere, either in your $PATH or not. That is your choice. It must however be executable. Consider the following `top10` executable prompt:
+
+ ```bash
+ #!/usr/bin/env aia run --no-out_file
+ # File: top10
+ # Desc: The top 10 cities by population
+
+ What are the top 10 cities by population in the USA? Summarize what people
+ like about living in each city. Include an average cost of living. Include
+ links to the Wikipedia pages. Format your response as a markdown document.
+ ```
+
+ Make sure that it is executable.

- [WHAT]
+ ```bash
+ chmod +x top10
  ```

+ The magic is in the first line of the prompt. It is a shebang line that tells the system how to execute the prompt. In this case it is telling the system to use the `aia` command line tool to execute the `run` prompt. The `--no-out_file` option tells the AIA command line tool not to write the output of the prompt to a file. Instead it will write the output to STDOUT. The remaining content of this `top10` prompt is sent via STDIN to the LLM.
+
+ Now just execute it like any other command in your terminal.
+
+ ```bash
+ ./top10
+ ```
+
+ Since its output is going to STDOUT you can set up a pipe chain. Use the CLI program `glow` to render markdown in the terminal
+ (brew install glow)
+
+ ```bash
+ ./top10 | glow
+ ```
+
+ This executable prompt concept provides the building blocks of a *nix CLI-based pipeline in the same way that the --pipeline and --next options and directives do.
+
  ## Development

- This CLI tool started life as a few lines of ruby in a file in my scripts repo. I just kep growing as I decided to add more capability and more backend tools. There was no real architecture to guide the design. What was left is a large code mess which is slowly being refactored into something more maintainable. That work is taking place in the `develop` branch. I welcome you help. Take a look at what is going on in that branch and send me a PR against it.
+ **ShellCommandExecutor Refactor:**
+ The `ShellCommandExecutor` is now a class (previously a module). It stores the config object as an instance variable and provides cleaner encapsulation. For backward compatibility, class-level methods are available and delegate to instance methods internally.
 
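The module-to-class refactor pattern described above looks roughly like this. This is a sketch of the pattern only, not the gem's actual source:

```ruby
class ShellCommandExecutor
  def initialize(config = {})
    @config = config   # config now lives on the instance
  end

  def execute_command(command)
    `#{command}`.chomp
  end

  # Backward-compatible class-level entry point that
  # delegates to a fresh instance.
  def self.execute_command(command, config = {})
    new(config).execute_command(command)
  end
end

puts ShellCommandExecutor.execute_command("echo ok")  # => ok
```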
- Of course if you see something in the main branch send me a PR against that one so that we can fix the problem for all.
+ **Prompt Variable Fallback:**
+ When processing a prompt file without a `.json` history file, variables are always parsed from the prompt text so you are prompted for values as needed.

  ## Contributing

  Bug reports and pull requests are welcome on GitHub at https://github.com/MadBomber/aia.

- When you find problems with `aia` please note them as an issue. This thing was written mostly by a human and you know how error prone humans are. There should be plenty of errors to find.
+ When you find problems with AIA please note them as an issue. This thing was written mostly by a human and you know how error prone humans are. There should be plenty of errors to find.
+
+ I'm not happy with the way some command line options for external commands are hard coded. I'm specifically talking about the way in which the `rg` and `fzf` tools are used. Their options decide the basic look and feel of the search capability on the command line. Maybe they should be part of the overall configuration so that users can tune their UI to the way they like.
+
+ ## History of Development
+
+ I originally wrote a tiny script called `aip.rb` to experiment with parameterized prompts. That was in August of 2023. AIP meant AI Parameterized. Adding an extra P for Prompts just seemed to be a little silly. It lived in my [scripts repo](https://github.com/MadBomber/scripts) for a while. It became useful to me so of course I needed to keep enhancing it. I moved it into my [experiments repo](https://github.com/MadBomber/experiments) and began adding features in a haphazard manner. No real plan or architecture. From those experiments I refactored out the [prompt_manager gem](https://github.com/MadBomber/prompt_manager) and the [ai_client gem](https://github.com/MadBomber/ai_client). The name was changed from AIP to AIA and it became a gem.
+
+ All of that undirected experimentation without a clear picture of where this thing was going resulted in chaotic code. I would use an Italian food dish to explain the organization but I think chaotic is more descriptive.
+
+ ## Roadmap

- I'm not happy with the way where some command line options for external command are hard coded. I'm specific talking about the way in which the `rg` and `fzf` tools are used. There options decide the basic look and feel of the search capability on the command line. Maybe they should be part of the overall configuration so that users can tune their UI to the way they like.
+ - support for using Ruby-based functional callback tools
+ - support for Model Context Protocol

  ## License