aia 0.4.1 → 0.4.2

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 0542f46fbbf4d3dd960229ae8f707f89b2cbab347d763d2b8db11aa8f4e2e552
- data.tar.gz: 6dd9b949fbdc26019c64f2ece134114b515782395bce52a71b0f5d281042071b
+ metadata.gz: 24e36e9066b83229df951d172cfb3fd7185cadb433fd992ab37fede8d45b8ea5
+ data.tar.gz: f7ce893975dfb29dd69d1ec0922f8ea30987f88d1d3dd69f0ca877218c6c54fc
  SHA512:
- metadata.gz: 2eefbbd2215bc82361d1335da437714760ca7c9db5237b89280dd5cdbe220da2dd14f48890ca56641758186c0a54d5cc199b22b18535fa62ed147c52bda2a77a
- data.tar.gz: '03990e11ef3fa9960fa8c3bd8aa34f667cf9fdd10b2e37646d2c54b199ea629fb58c1cafaeeb328bff1cbda2ce219300ab3831ab95894958317bc97c2835f18f'
+ metadata.gz: 0172e7c82b9e346d176df7691e5eb422515e25f4f7ed243cb92cd38419a4e8a18a1df9f5500ac0606c6bb30d48e88662237877aee4667aedf9cd02973a190268
+ data.tar.gz: c826c22f5b1789ffeeca7982860764527350d3b352cb428a4b1cc6ccc63025fa1c4347ed098477d8d6a0f2f63dfe8fc8187eb9570096698487bf69fe6f8903c0
data/.semver CHANGED
@@ -1,6 +1,6 @@
  ---
  :major: 0
  :minor: 4
- :patch: 1
+ :patch: 2
  :special: ''
  :metadata: ''
data/CHANGELOG.md CHANGED
@@ -1,5 +1,8 @@
  ## [Unreleased]
- ## [0.4.1] 2023-31
+ ## [0.4.2] 2023-12-31
+ - added the --role CLI option to pre-pend a "role" prompt to the front of a primary prompt.
+
+ ## [0.4.1] 2023-12-31
  - added a chat mode
  - prompt directives now supported
  - version bumped to match the `prompt_manager` gem
data/README.md CHANGED
@@ -6,6 +6,7 @@ Uses the gem "prompt_manager" to manage the prompts sent to the `mods` command-l

  **Most Recent Change**

+ v0.4.2 - Added a role option to be prepended to a primary prompt.
  v0.4.1 - Added a chat mode. Prompt directives are now supported.

  <!-- Tocer[start]: Auto-generated, don't remove. -->
@@ -16,8 +17,11 @@ v0.4.1 - Added a chat mode. Prompt directives are now supported.
  - [Usage](#usage)
  - [System Environment Variables (envar)](#system-environment-variables-envar)
  - [Prompt Directives](#prompt-directives)
+ - [All About ROLES](#all-about-roles)
+ - [Other Ways to Insert Roles into Prompts](#other-ways-to-insert-roles-into-prompts)
  - [External CLI Tools Used](#external-cli-tools-used)
  - [Shell Completion](#shell-completion)
+ - [My Most Powerful Prompt](#my-most-powerful-prompt)
  - [Development](#development)
  - [Contributing](#contributing)
  - [License](#license)
@@ -38,13 +42,14 @@ Install the command-line utilities by executing:

  You will also need to establish a directory in your file system where your prompt text files, last used parameters and usage log files are kept.

- Setup a system environment variable named "PROMPTS_DIR" that points to your prompts directory. The default is in your HOME directory named ".prompts_dir"
+ Set up a system environment variable named "AIA_PROMPTS_DIR" that points to your prompts directory. The default is a directory named ".prompts_dir" in your HOME directory.

- You will also need to source the completion script.
+ You may also want to install the completion script for your shell. To get a copy of the completion script do:

- TODO: May want to add a `setup` function to the command-line options that will create the directory, and do something with the completion function.
+ `aia --completion bash`
+
+ `fish` and `zsh` are also available.

- TODO: don't forget to mention have access token (API keys) setup as envars for the various backend services like OpenAI... if they are still in business.

  ## Usage

@@ -133,6 +138,11 @@ OPTIONS
  -p, --prompts PATH_TO_DIRECTORY
  Directory containing the prompt files - default is ~/.prompts

+ -r, --role ROLE_ID
+ A role ID is the same as a prompt ID. A “role” is a specialized prompt that
+ gets pre-pended to another prompt. Its purpose is to configure the LLM into a
+ certain orientation within which to resolve its primary prompt.
+
  -v, --verbose
  Be Verbose - default is false

@@ -217,7 +227,6 @@ AUTHOR
  Dewayne VanHoozer <dvanhoozer@gmail.com>

  AIA 2024-01-01 aia(1)
-
  ```

  ## System Environment Variables (envar)
@@ -256,9 +265,35 @@ That prompt will enter the chat loop regardles of the presents of a "--chat" CLI

  BTW did I mention that `aia` supports a chat mode where you can send an initial prompt to the backend and then followup the backend's reponse with additional keyboard entered questions, instructions, prompts etc.

- ## External CLI Tools Used
+ See the [AIA::Directives](lib/aia/directives.rb) class to see what directives are available on the front end within `aia`.
+
+ See the [AIA::Mods](lib/aia/tools/mods.rb) class for directives that are available to the `mods` backend.
+
+ See the [AIA::Sgpt](lib/aia/tools/sgpt.rb) class for directives that are available to the `sgpt` backend.
+
+ ## All About ROLES
+
+ `aia` provides the "-r --role" CLI option to identify a prompt ID within your prompts directory that defines the context within which the LLM is to provide its response. The text of the role prompt is pre-pended to the text of the primary prompt to form a complete prompt to be processed by the backend.
+
+ For example, consider:
+
+ > aia -r ruby refactor my_class.rb
+
+ Within the prompts directory the contents of the text file `ruby.txt` will be pre-pended to the contents of the `refactor.txt` file to produce a complete prompt. That complete prompt will have its parameters and then its directives processed before the prompt text is sent to the backend.
+
+ Note that "role" is just a way of saying add this prompt to the front of this other prompt. The contents of the "role" prompt could be anything. It does not necessarily have to be an actual role.
+
+ ### Other Ways to Insert Roles into Prompts
+
+ Since `aia` supports parameterized prompts, you could make a keyword like "[ROLE]" part of your prompt. For example, consider this prompt:
+
+ ```text
+ As a [ROLE] tell me what you think about [SUBJECT]
+ ```

- From the verbose help text ...
+ When this prompt is processed, `aia` will ask you for a value for the keyword "ROLE" and the keyword "SUBJECT" to complete the prompt. Since `aia` maintains a history of your previous answers, you could just choose something that you used in the past or answer with a completely new value.
+
+ ## External CLI Tools Used

  ```text
  External Tools Used
@@ -296,6 +331,22 @@ If you're not a fan of "born again" replace `bash` with one of the others.

  Copy the function to a place where it can be installed in your shell's instance. This might be a `.profile` or `.bashrc` file, etc.

+ ## My Most Powerful Prompt
+
+ This is just between you and me so don't go blabbing this around to everyone. My most powerful prompt is in a file named `ad_hoc.txt`. It looks like this:
+
+ > [WHAT NOW HUMAN]
+
+ Yep. Just a single parameter for which I can provide a value of anything that is on my mind at the time. Its advantage is that I do not pollute my shell's command history with lots of text.
+
+ Which do you think is better to have in your shell's history file?
+
+ > mods "As a certified public accountant specializing in forensic audit and analysis of public company financial statements, what do you think of mine? What is the best way to hide the millions of drachma that I've skimmed?" < financial_statement.txt
+
+ > aia ad_hoc financial_statement.txt
+
+ Both do the same thing; however, aia does not put the text of the prompt into the shell's history file. Of course the keyword/parameter value is saved in the prompt's JSON file, and the prompt and the response are logged unless --no-log is specified; but it's not messing up the shell history!
+
  ## Development

  After checking out the repo, run `bin/setup` to install dependencies. Then, run `rake test` to run the tests. You can also run `bin/console` for an interactive prompt that will allow you to experiment.
@@ -304,6 +355,21 @@ After checking out the repo, run `bin/setup` to install dependencies. Then, run

  Bug reports and pull requests are welcome on GitHub at https://github.com/MadBomber/aia.

+ I've designed `aia` so that it should be easy to integrate other backend LLM processors. If you've found one that you like, send me a pull request or a feature request.
+
+ When you find problems with `aia`, please note them as an issue. This thing was written mostly by a human, and you know how error-prone humans are. There should be plenty of errors to find.
+
+ Also I'm interested in doing more with the prompt directives. I'm thinking that there is a way to include dynamic content into the prompt computationally. Maybe something like this would be easy to do:
+
+ > //insert url https://www.whitehouse.gov/briefing-room/
+ > //insert file path_to_file.txt
+
+ or maybe incorporating the contents of system environment variables into prompts using $UPPERCASE or $(command) or ${envar_name} patterns.
+
+ I've also been thinking that the REGEX used to identify a keyword within a prompt could be a configuration item. I chose to use square brackets and uppercase in the default regex; maybe you have a collection of prompt files that uses some other regex. Why should it be one way and not the other?
+
+ Also, I'm not happy with the way some command-line options for external commands are hard-coded. I think they should be part of the configuration as well. For example, the way I'm using `rg` and `fzf` may not be the way that you want to use them.
+
  ## License

  The gem is available as open source under the terms of the [MIT License](https://opensource.org/licenses/MIT).
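The README hunks above describe parameterized prompts whose keywords are uppercase words in square brackets, with `aia` asking for (or recalling) a value for each one. The short Ruby sketch below illustrates that substitution idea only; the regex, the `replace_keywords` name, and the answers hash are assumptions made for illustration, not the gem's actual code.

```ruby
# Rough illustration of keyword replacement as described in the README --
# not aia's actual implementation. Keywords are UPPERCASE words wrapped
# in square brackets, e.g. [ROLE] or [SUBJECT].
KEYWORD_REGEX = /\[[A-Z _]+\]/

def replace_keywords(prompt_text, answers = {})
  prompt_text.gsub(KEYWORD_REGEX) do |keyword|
    # aia asks interactively and remembers past answers; here we just look
    # up a supplied value and leave the keyword alone when none is given.
    answers.fetch(keyword, keyword)
  end
end

text = "As a [ROLE] tell me what you think about [SUBJECT]"
puts replace_keywords(text, "[ROLE]" => "pirate", "[SUBJECT]" => "rubber duckies")
# => As a pirate tell me what you think about rubber duckies
```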
data/lib/aia/cli.rb CHANGED
@@ -136,11 +136,7 @@ class AIA::Cli
  terse?: [false, "--terse"],
  speak?: [false, "--speak"],
  #
- # TODO: May have to process the
- # "~" character and replace it with HOME
- #
- # TODO: Consider using standard suffix of _dif and _file
- # to signal Pathname objects fo validation
+ role: ['', "-r --role"],
  #
  config_file:[nil, "-c --config"],
  prompts_dir:["~/.prompts", "-p --prompts"],
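In the hunk above, each entry of the options table pairs a default value with the switch string that sets it, so the new `role` option defaults to an empty string and is set by `-r` or `--role`. The sketch below shows how a table of that shape could be consumed; `parse_options` and its simplified switch handling are assumptions for illustration, not `AIA::Cli`'s real parser.

```ruby
# Hypothetical illustration only -- not AIA's real option parser.
# Each option maps to [default_value, "switch string"], mirroring the
# table edited in lib/aia/cli.rb above.
OPTIONS = {
  role:        ['',           "-r --role"],
  prompts_dir: ["~/.prompts", "-p --prompts"]
}

# Walk ARGV; when a known switch appears, take the next token as the
# option's value, otherwise keep the default.
def parse_options(argv, table = OPTIONS)
  values = table.transform_values(&:first)
  table.each do |name, (_default, switches)|
    switches.split.each do |switch|
      index = argv.index(switch)
      values[name] = argv[index + 1] if index && argv[index + 1]
    end
  end
  values
end

p parse_options(%w[-r ruby refactor])
# => {:role=>"ruby", :prompts_dir=>"~/.prompts"}
```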
data/lib/aia/prompt.rb CHANGED
@@ -22,8 +22,20 @@ class AIA::Prompt

  # setting build: false supports unit testing.
  def initialize(build: true)
+ if AIA.config.role.empty?
+ @role = nil
+ else
+ AIA.config.arguments.prepend AIA.config.role
+ get_prompt
+ @role = @prompt.dup
+ end
+
  get_prompt

+ unless @role.nil?
+ @prompt.text.prepend @role.text
+ end
+
  process_prompt if build
  end

@@ -128,7 +140,7 @@ class AIA::Prompt

  puts "Parameter #{kw} ..."

- if default.empty?
+ if default&.empty?
  user_prompt = "\n-=> "
  else
  user_prompt = "\n(#{default}) -=>"
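The `initialize` change above fetches the role prompt first (when `--role` is given), then the primary prompt, and finally prepends the role text to the primary prompt text. Here is a rough, self-contained sketch of that effect using plain file reads; the `FakePrompt` struct and the paths are stand-ins for the `prompt_manager` objects the gem actually uses.

```ruby
# Rough sketch of the role-prepending behaviour shown above, using plain
# strings in place of prompt_manager prompt objects. Illustrative only.
FakePrompt = Struct.new(:text)

def build_prompt(prompt_id, role_id = nil, prompts_dir: File.expand_path("~/.prompts"))
  primary = FakePrompt.new(File.read(File.join(prompts_dir, "#{prompt_id}.txt")))
  if role_id && !role_id.empty?
    role = FakePrompt.new(File.read(File.join(prompts_dir, "#{role_id}.txt")))
    primary.text.prepend(role.text)  # mirrors: @prompt.text.prepend @role.text
  end
  primary
end

# e.g. `aia -r ruby refactor` roughly combines ruby.txt + refactor.txt:
#   puts build_prompt("refactor", "ruby").text
```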
data/man/aia.1 CHANGED
@@ -71,6 +71,9 @@ Out FILENAME \- default is \.\[sl]temp\.md
  \fB\-p\fR, \fB\-\-prompts\fR \fIPATH\[ru]TO\[ru]DIRECTORY\fP
  Directory containing the prompt files \- default is \[ti]\[sl]\.prompts
  .TP
+ \fB\-r\fR, \fB\-\-role\fR \fIROLE\[ru]ID\fP
+ A role ID is the same as a prompt ID\. A \[lq]role\[rq] is a specialized prompt that gets pre\-pended to another prompt\. Its purpose is to configure the LLM into a certain orientation within which to resolve its primary prompt\.
+ .TP
  \fB\-v\fR, \fB\-\-verbose\fR
  Be Verbose \- default is false
  .SH CONFIGURATION HIERARCHY
data/man/aia.1.md CHANGED
@@ -76,6 +76,9 @@ The aia command-line tool is an interface for interacting with an AI model backe
  `-p`, `--prompts` *PATH_TO_DIRECTORY*
  : Directory containing the prompt files - default is ~/.prompts

+ `-r`, `--role` *ROLE_ID*
+ : A role ID is the same as a prompt ID. A "role" is a specialized prompt that gets pre-pended to another prompt. Its purpose is to configure the LLM into a certain orientation within which to resolve its primary prompt.
+
  `-v`, `--verbose`
  : Be Verbose - default is false

metadata CHANGED
@@ -1,7 +1,7 @@
  --- !ruby/object:Gem::Specification
  name: aia
  version: !ruby/object:Gem::Version
- version: 0.4.1
+ version: 0.4.2
  platform: ruby
  authors:
  - Dewayne VanHoozer
@@ -205,7 +205,6 @@ files:
  - lib/aia/tools/vim.rb
  - lib/aia/version.rb
  - lib/core_ext/string_wrap.rb
- - lib/modularization_plan.md
  - main.just
  - man/aia.1
  - man/aia.1.md
data/lib/modularization_plan.md DELETED
@@ -1,126 +0,0 @@
- ## Suggested Refactoring into Modules
-
- ### ConfigurationModule
-
- This module could encapsulate all the constants and environment-dependent settings.
-
- ```ruby
- module Configuration
- HOME = Pathname.new(ENV['HOME'])
- PROMPTS_DIR = Pathname.new(ENV['PROMPTS_DIR'] || (HOME + ".prompts_dir"))
- AI_CLI_PROGRAM = "mods"
- EDITOR = ENV['EDITOR'] || 'edit'
- MY_NAME = Pathname.new(__FILE__).basename.to_s.split('.')[0]
- MODS_MODEL = ENV['MODS_MODEL'] || 'gpt-4-1106-preview'
- OUTPUT = Pathname.pwd + "temp.md"
- PROMPT_LOG = PROMPTS_DIR + "_prompts.log"
- USAGE = <<~EOUSAGE
- AI Assistant (aia)
- ==================
- The AI cli program being used is: #{AI_CLI_PROGRAM}
- You can pass additional CLI options to #{AI_CLI_PROGRAM} like this:
- "#{MY_NAME} my options -- options for #{AI_CLI_PROGRAM}"
- EOUSAGE
- end
- ```
-
- ### OptionParsingModule
-
- This module could manage the parsing of command-line arguments and configuring the options for the application.
-
- ```ruby
- module OptionParsing
- def build_reader_methods
- # ... method definition ...
- end
-
- def process_arguments
- # ... method definition ...
- end
-
- def check_for(an_option)
- # ... method definition ...
- end
-
- def process_option(option_sym, switches)
- # ... method definition ...
- end
- end
- ```
-
- ### CommandLineInterfaceModule
-
- This module would manage interactions with the command-line interface including editing of prompts and selection processes.
-
- ```ruby
- module CommandLineInterface
- def keyword_value(kw, default)
- # ... method definition ...
- end
-
- def handle_multiple_prompts(found_these, while_looking_for_this)
- # ... method definition ...
- end
- end
- ```
-
- ### LoggingModule
-
- Responsible for logging the results of the command.
-
- ```ruby
- module Logging
- def write_to_log(answer)
- # ... method definition ...
- end
- end
- ```
-
- ### AICommandModule
-
- Manages the building and execution of the AI CLI command.
-
- ```ruby
- module AICommand
- def setup_cli_program
- # ... method definition ...
- end
-
- def build_command
- # ... method definition ...
- end
-
- def execute_and_log_command(command)
- # ... method definition ...
- end
- end
- ```
-
- ### PromptProcessingModule
-
- Handles prompt retrieval, existing check, and keyword processing.
-
- ```ruby
- module PromptProcessing
- def existing_prompt?(prompt_id)
- # ... method definition ...
- end
-
- def process_prompt
- # ... method definition ...
- end
-
- def replace_keywords
- # ... method definition ...
- end
-
- def search_for_a_matching_prompt(prompt_id)
- # ... method definition ...
- end
- end
- ```
-
- Each module should only contain the methods relevant to that module's purpose. After defining these modules, they can be included in the `AIA::Main` class where appropriate. Note that the method `get_prompt_id` didn't fit neatly into one of the outlined modules; it may remain in the main class or be included in a module if additional context becomes available or if it can be logically grouped with similar methods.
-
- The `__END__` block and the Readline history management could be encapsulated into a separate module for terminal interactions if that block grows in complexity or moves out of the overall class definition.
-