aia 0.4.1 → 0.4.3

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 0542f46fbbf4d3dd960229ae8f707f89b2cbab347d763d2b8db11aa8f4e2e552
- data.tar.gz: 6dd9b949fbdc26019c64f2ece134114b515782395bce52a71b0f5d281042071b
+ metadata.gz: 3e8cad3b88bdc76e72dce9b7fa202923ab655f2da811039bf18df52c5fe714ae
+ data.tar.gz: 57ae0a1dd04c0fbd237ba8daa3210ae273a8db21e0efe5c154ffbe72c9afd0e5
  SHA512:
- metadata.gz: 2eefbbd2215bc82361d1335da437714760ca7c9db5237b89280dd5cdbe220da2dd14f48890ca56641758186c0a54d5cc199b22b18535fa62ed147c52bda2a77a
- data.tar.gz: '03990e11ef3fa9960fa8c3bd8aa34f667cf9fdd10b2e37646d2c54b199ea629fb58c1cafaeeb328bff1cbda2ce219300ab3831ab95894958317bc97c2835f18f'
+ metadata.gz: 0cc90b667bd99aae7688231286c58ee3cc7a3ad4a1b4202c531a71113911d24c084f6ccce5d154d57e0059a2a0b3d6d1706f3ee9a5fae5bf229b2604d1982d4d
+ data.tar.gz: 18dafdbfdeb257a2597f53070feebcb05f409f101b06ff17fdbe81cc813ebbf335833a0eace414638e437425ca47298b26655768890737cb065e3b15cfbf405e
data/.semver CHANGED
@@ -1,6 +1,6 @@
  ---
  :major: 0
  :minor: 4
- :patch: 1
+ :patch: 3
  :special: ''
  :metadata: ''
data/CHANGELOG.md CHANGED
@@ -1,5 +1,12 @@
  ## [Unreleased]
- ## [0.4.1] 2023-31
+ ## [0.4.3] 2023-12-31
+ - added --env to process embedded system environment variables and shell commands within a prompt.
+ - added --erb to process Embedded RuBy within a prompt, because having embedded shell commands will only get you into a little trouble. Having ERB will really get you into trouble. Remember: the simple prompt is usually the best prompt.
+
+ ## [0.4.2] 2023-12-31
+ - added the --role CLI option to prepend a "role" prompt to the front of a primary prompt.
+
+ ## [0.4.1] 2023-12-31
  - added a chat mode
  - prompt directives now supported
  - version bumped to match the `prompt_manager` gem
data/README.md CHANGED
@@ -6,7 +6,9 @@ Uses the gem "prompt_manager" to manage the prompts sent to the `mods` command-l
 
  **Most Recent Change**
 
- v0.4.1 - Added a chat mode. Prompt directives are now supported.
+ v0.4.3 - Added the --env and --erb options. Prompts can now be dynamic.
+ v0.4.2 - Added a --role option to be prepended to a primary prompt.
+ v0.4.1 - Added --chat and --speak options. Prompt directives are now supported.
 
  <!-- Tocer[start]: Auto-generated, don't remove. -->
 
@@ -15,9 +17,14 @@ v0.4.1 - Added a chat mode. Prompt directives are now supported.
  - [Installation](#installation)
  - [Usage](#usage)
  - [System Environment Variables (envar)](#system-environment-variables-envar)
+ - [System Environment Variables inside of a Prompt](#system-environment-variables-inside-of-a-prompt)
+ - [*E*mbedded *R*u*B*y (ERB)](#embedded-ruby-erb)
  - [Prompt Directives](#prompt-directives)
+ - [All About ROLES](#all-about-roles)
+ - [Other Ways to Insert Roles into Prompts](#other-ways-to-insert-roles-into-prompts)
  - [External CLI Tools Used](#external-cli-tools-used)
  - [Shell Completion](#shell-completion)
+ - [My Most Powerful Prompt](#my-most-powerful-prompt)
  - [Development](#development)
  - [Contributing](#contributing)
  - [License](#license)
@@ -38,13 +45,14 @@ Install the command-line utilities by executing:
 
  You will also need to establish a directory in your file system where your prompt text files, last used parameters and usage log files are kept.
 
- Setup a system environment variable named "PROMPTS_DIR" that points to your prompts directory. The default is in your HOME directory named ".prompts_dir"
+ Set up a system environment variable named "AIA_PROMPTS_DIR" that points to your prompts directory. The default is in your HOME directory named ".prompts_dir"
 
- You will also need to source the completion script.
+ You may also want to install the completion script for your shell. To get a copy of the completion script do:
 
- TODO: May want to add a `setup` function to the command-line options that will create the directory, and do something with the completion function.
+ `aia --completion bash`
+
+ `fish` and `zsh` are also available.
 
- TODO: don't forget to mention have access token (API keys) setup as envars for the various backend services like OpenAI... if they are still in business.
 
  ## Usage
 
@@ -89,6 +97,24 @@ OPTIONS
 
  --dump FORMAT
 
+ -e, --edit
+ Invokes an editor on the prompt file. You can make changes to the prompt file,
+ save it and the newly saved prompt will be processed by the backend.
+
+ --env This option tells aia to replace references to system environment variables in
+ the prompt with the value of the envar. envars look like $HOME and ${HOME}; in
+ this example their occurrence will be replaced by the value of ENV['HOME'].
+ Also the dynamic shell command in the pattern $(shell command) will be executed
+ and its output replaces its pattern. It does not matter if your shell uses
+ different patterns than BASH since the replacement is being done within a Ruby
+ context.
+
+ --erb If dynamic prompt content using $(...) wasn't enough, here is ERB (Embedded
+ RuBy). <%= ruby code %> within a prompt will have its Ruby code executed and
+ the results of that execution inserted into the prompt. I'm sure we will find
+ a way to truly misuse this capability. Remember, some say that the simple
+ prompt is the best prompt.
+
  --model NAME
  Name of the LLM model to use - default is gpt-4-1106-preview
 
@@ -133,6 +159,11 @@ OPTIONS
  -p, --prompts PATH_TO_DIRECTORY
  Directory containing the prompt files - default is ~/.prompts
 
+ -r, --role ROLE_ID
+ A role ID is the same as a prompt ID. A "role" is a specialized prompt that
+ gets prepended to another prompt. Its purpose is to configure the LLM into a
+ certain orientation within which to resolve its primary prompt.
+
  -v, --verbose
  Be Verbose - default is false
 
@@ -217,7 +248,6 @@ AUTHOR
  Dewayne VanHoozer <dvanhoozer@gmail.com>
 
  AIA 2024-01-01 aia(1)
-
  ```
 
  ## System Environment Variables (envar)
@@ -244,6 +274,20 @@ See the `@options` hash in the `cli.rb` file for a complete list. There are som
 
  In addition to these config items for `aia` the optional command line parameters for the backend prompt processing utilities (mods and sgpt) can also be set using envars with the "AIA_" prefix. For example "export AIA_TOPP=1.0" will set the "--topp 1.0" command line option for the `mods` utility when it's used as the backend processor.
 
+ ### System Environment Variables inside of a Prompt
+
+ The command line option "--env" instructs `aia` to replace any system environment variable references in the prompt text with the value of the system environment variable. So patterns like $HOME and ${HOME} in the prompt will be replaced with the value of ENV['HOME'].
+
+ As an added bonus, dynamic content can be inserted into the prompt using the pattern $(shell command) where the output of the shell command will replace the $(...) pattern.
+
+ That's enough power to get anyone into deep trouble. Wait, there's more coming. Do you want a full programming language in your prompt? Feel like running the prompt through ERB before sending it to the backend? I've been thinking about it. Should be a simple thing to do. Yes, it was.
+
+ ## *E*mbedded *R*u*B*y (ERB)
+
+ The --erb option now submits the prompt text through the ERB parser for execution before environment variable, keyword and dynamic shell replacement take place. With this much dynamic prompt manipulation we should really be able to get into trouble!
+
+ If you are not a Rubyist and do not know about ERB see this [webpage](https://ruby-doc.org/stdlib-3.0.0/libdoc/erb/rdoc/ERB.html) for some information about the power of embedded Ruby.
+
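For illustration only (the prompt ID `status_report` and its contents below are hypothetical, not files shipped with the gem), a prompt that leans on all three mechanisms could look like this:

```text
Today is $(date) and my login is $USER.
Summarize the <%= 2 + 3 %> open items below for [PROJECT_NAME].
```

Run as `aia --erb --env status_report`, the `<%= ... %>` expression is evaluated first, then `$USER` and `$(date)` are substituted, and the `[PROJECT_NAME]` keyword is resolved during normal prompt processing.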
  ## Prompt Directives
 
  Downstream processing directives were added to the `prompt_manager` gem at version 0.4.1. These directives are lines in the prompt text file that begin with "//"
@@ -256,9 +300,35 @@ That prompt will enter the chat loop regardless of the presence of a "--chat" CLI
 
  BTW did I mention that `aia` supports a chat mode where you can send an initial prompt to the backend and then follow up the backend's response with additional keyboard-entered questions, instructions, prompts etc.
 
- ## External CLI Tools Used
+ See the [AIA::Directives](lib/aia/directives.rb) class to see what directives are available on the front end within `aia`.
+
+ See the [AIA::Mods](lib/aia/tools/mods.rb) class for directives that are available to the `mods` backend.
+
+ See the [AIA::Sgpt](lib/aia/tools/sgpt.rb) class for directives that are available to the `sgpt` backend.
+
+ ## All About ROLES
+
+ `aia` provides the "-r --role" CLI option to identify a prompt ID within your prompts directory which defines the context within which the LLM is to provide its response. The text of the role prompt is prepended to the text of the primary prompt to form a complete prompt to be processed by the backend.
+
+ For example consider:
+
+ > aia -r ruby refactor my_class.rb
+
+ Within the prompts directory the contents of the text file `ruby.txt` will be prepended to the contents of the `refactor.txt` file to produce a complete prompt. That complete prompt will have any parameters and then directives processed before sending the prompt text to the backend.
+
+ Note that "role" is just a way of saying add this prompt to the front of this other prompt. The contents of the "role" prompt could be anything. It does not necessarily have to be an actual role.
+
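To make that example concrete, a role prompt such as `ruby.txt` might contain nothing more than a framing sentence. This sample content is illustrative, not part of the gem:

```text
As an expert Ruby developer who values small methods, clear naming and
idiomatic style, respond to the following request:
```

Whatever `refactor.txt` contains would then follow this text in the combined prompt.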
+ ### Other Ways to Insert Roles into Prompts
 
- From the verbose help text ...
+ Since `aia` supports parameterized prompts you could make a keyword like "[ROLE]" be part of your prompt. For example consider this prompt:
+
+ ```text
+ As a [ROLE] tell me what you think about [SUBJECT]
+ ```
+
+ When this prompt is processed, `aia` will ask you for a value for the keyword "ROLE" and the keyword "SUBJECT" to complete the prompt. Since `aia` maintains a history of your previous answers, you could just choose something that you used in the past or answer with a completely new value.
+
+ ## External CLI Tools Used
 
  ```text
  External Tools Used
@@ -296,6 +366,22 @@ If you're not a fan of "born again" replace `bash` with one of the others.
 
  Copy the function to a place where it can be installed in your shell's instance. This might be a `.profile` or `.bashrc` file, etc.
 
+ ## My Most Powerful Prompt
+
+ This is just between you and me so don't go blabbing this around to everyone. My most powerful prompt is in a file named `ad_hoc.txt`. It looks like this:
+
+ > [WHAT NOW HUMAN]
+
+ Yep. Just a single parameter for which I can provide a value of anything that is on my mind at the time. Its advantage is that I do not pollute my shell's command history with lots of text.
+
+ Which do you think is better to have in your shell's history file?
+
+ > mods "As a certified public accountant specializing in forensic audit and analysis of public company financial statements, what do you think of mine? What is the best way to hide the millions of drachma that I've skimmed?" < financial_statement.txt
+
+ > aia ad_hoc financial_statement.txt
+
+ Both do the same thing; however, aia does not put the text of the prompt into the shell's history file. Of course the keyword/parameter value is saved in the prompt's JSON file, and the prompt and the response are logged unless --no-log is specified; but it's not messing up the shell history!
+
  ## Development
 
  After checking out the repo, run `bin/setup` to install dependencies. Then, run `rake test` to run the tests. You can also run `bin/console` for an interactive prompt that will allow you to experiment.
@@ -304,6 +390,21 @@ After checking out the repo, run `bin/setup` to install dependencies. Then, run
 
  Bug reports and pull requests are welcome on GitHub at https://github.com/MadBomber/aia.
 
+ I've designed `aia` so that it should be easy to integrate other backend LLM processors. If you've found one that you like, send me a pull request or a feature request.
+
+ When you find problems with `aia` please note them as an issue. This thing was written mostly by a human and you know how error prone humans are. There should be plenty of errors to find.
+
+ Also I'm interested in doing more with the prompt directives. I'm thinking that there is a way to include dynamic content into the prompt computationally. Maybe something like this would be easy to do:
+
+ > //insert url https://www.whitehouse.gov/briefing-room/
+ > //insert file path_to_file.txt
+
+ or maybe incorporating the contents of system environment variables into prompts using $UPPERCASE or $(command) or ${envar_name} patterns.
+
+ I've also been thinking that the REGEX used to identify a keyword within a prompt could be a configuration item. I chose to use square brackets and uppercase in the default regex; maybe you have a collection of prompt files that use some other regex. Why should it be one way and not the other?
+
+ Also I'm not happy with the way some command line options for external commands are hard coded. I think they should be part of the configuration as well. For example the way I'm using `rg` and `fzf` may not be the way that you want to use them.
+
  ## License
 
  The gem is available as open source under the terms of the [MIT License](https://opensource.org/licenses/MIT).
data/lib/aia/cli.rb CHANGED
@@ -126,6 +126,8 @@ class AIA::Cli
  completion: [nil, "--completion"],
  #
  edit?: [false, "-e --edit"],
+ env?: [false, "--env"],
+ erb?: [false, "--erb"],
  debug?: [false, "-d --debug"],
  verbose?: [false, "-v --verbose"],
  version?: [false, "--version"],
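A rough sketch of how these entries appear to behave, based on how `AIA.config.env?`, `AIA.config.erb?` and `AIA.config.role` are read in `prompt.rb` later in this diff (this is an assumption, not an excerpt from the gem): the first element is the default value and the string lists the switches that override it.

```ruby
# Minimal stand-in for the pattern above; not aia's actual Cli class.
options = {
  env?: [false, "--env"],
  erb?: [false, "--erb"],
}

# A boolean entry keeps its default (first element) and flips to true
# when one of its switches appears on the command line.
config = options.transform_values do |default, switches|
  ARGV.any? { |arg| switches.split.include?(arg) } || default
end

p config   # e.g. {:env?=>true, :erb?=>false} when run with `--env`
```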
@@ -136,11 +138,7 @@ class AIA::Cli
  terse?: [false, "--terse"],
  speak?: [false, "--speak"],
  #
- # TODO: May have to process the
- # "~" character and replace it with HOME
- #
- # TODO: Consider using standard suffix of _dif and _file
- # to signal Pathname objects fo validation
+ role: ['', "-r --role"],
  #
  config_file:[nil, "-c --config"],
  prompts_dir:["~/.prompts", "-p --prompts"],
data/lib/aia/prompt.rb CHANGED
@@ -1,6 +1,7 @@
  # lib/aia/prompt.rb
 
  require 'reline'
+ require 'erb'
 
  class AIA::Prompt
  #
@@ -22,9 +23,25 @@ class AIA::Prompt
 
  # setting build: false supports unit testing.
  def initialize(build: true)
+   if AIA.config.role.empty?
+     @role = nil
+   else
+     AIA.config.arguments.prepend AIA.config.role
+     get_prompt
+     @role = @prompt.dup
+   end
+
    get_prompt
 
-   process_prompt if build
+   unless @role.nil?
+     @prompt.text.prepend @role.text
+   end
+
+   if build
+     @prompt.text = render_erb(@prompt.text)  if AIA.config.erb?
+     @prompt.text = replace_env(@prompt.text) if AIA.config.env?
+     process_prompt
+   end
  end
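As a plain-Ruby aside (ordinary Strings here, not aia's prompt objects), the prepend step above simply puts the role text in front of the primary prompt text:

```ruby
# Standalone illustration of the prepend step; not an excerpt from aia.
role_text   = "As a Ruby expert code reviewer,\n"
prompt_text = "refactor the attached class for clarity."

prompt_text.prepend(role_text)
puts prompt_text
# As a Ruby expert code reviewer,
# refactor the attached class for clarity.
```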
@@ -65,6 +82,25 @@ class AIA::Prompt
  end
 
 
+ # Inserts environment variables and dynamic content into a prompt.
+ # Replaces patterns like $HOME and ${HOME} with the value of ENV['HOME'].
+ # Replaces patterns like $(shell command) with the output of the shell command.
+ #
+ def replace_env(a_string)
+   a_string.gsub(/\$(\w+|\{\w+\})/) do |match|
+     ENV[match.tr('$', '').tr('{}', '')]
+   end.gsub(/\$\((.*?)\)/) do |match|
+     `#{match[2..-2]}`.chomp
+   end
+ end
+
+
+ # You are just asking for trouble!
+ def render_erb(a_string)
+   ERB.new(a_string).result(binding)
+ end
+
+
  def replace_keywords
    puts
    puts "ID: #{@prompt.id}"
@@ -128,7 +164,7 @@ class AIA::Prompt
 
  puts "Parameter #{kw} ..."
 
- if default.empty?
+ if default&.empty?
    user_prompt = "\n-=> "
  else
    user_prompt = "\n(#{default}) -=>"
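The switch to `default&.empty?` above guards against keywords that have no previously saved value; a minimal standalone illustration (plain Ruby, not aia's code):

```ruby
# `default` can be nil when a keyword has never been given a value before.
default = nil

# default.empty?  # raises NoMethodError under the old code
default&.empty?   # => nil (falsy, no exception), so execution falls through
                  #    to the else branch instead of crashing
```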
data/man/aia.1 CHANGED
@@ -29,6 +29,15 @@ begin a chat session with the backend after the initial prompt response; will s
  .TP
  \fB\-\-dump\fR \fIFORMAT\fP
  .TP
+ \fB\-e\fR, \fB\-\-edit\fR
+ Invokes an editor on the prompt file\. You can make changes to the prompt file, save it and the newly saved prompt will be processed by the backend\.
+ .TP
+ \fB\-\-env\fR
+ This option tells \fBaia\fR to replace references to system environment variables in the prompt with the value of the envar\. envars look like \[Do]HOME and \[Do]\[lC]HOME\[rC]; in this example their occurrence will be replaced by the value of ENV\[lB]\[oq]HOME\[cq]\[rB]\. Also the dynamic shell command in the pattern \[Do](shell command) will be executed and its output replaces its pattern\. It does not matter if your shell uses different patterns than BASH since the replacement is being done within a Ruby context\.
+ .TP
+ \fB\-\-erb\fR
+ If dynamic prompt content using \[Do](\.\.\.) wasn\[cq]t enough, here is ERB (Embedded RuBy)\. <%\[eq] ruby code %> within a prompt will have its Ruby code executed and the results of that execution inserted into the prompt\. I\[cq]m sure we will find a way to truly misuse this capability\. Remember, some say that the simple prompt is the best prompt\.
+ .TP
  \fB\-\-model\fR \fINAME\fP
  Name of the LLM model to use \- default is gpt\-4\-1106\-preview
  .TP
@@ -71,6 +80,9 @@ Out FILENAME \- default is \.\[sl]temp\.md
  \fB\-p\fR, \fB\-\-prompts\fR \fIPATH\[ru]TO\[ru]DIRECTORY\fP
  Directory containing the prompt files \- default is \[ti]\[sl]\.prompts
  .TP
+ \fB\-r\fR, \fB\-\-role\fR \fIROLE\[ru]ID\fP
+ A role ID is the same as a prompt ID\. A \[lq]role\[rq] is a specialized prompt that gets prepended to another prompt\. Its purpose is to configure the LLM into a certain orientation within which to resolve its primary prompt\.
+ .TP
  \fB\-v\fR, \fB\-\-verbose\fR
  Be Verbose \- default is false
  .SH CONFIGURATION HIERARCHY
data/man/aia.1.md CHANGED
@@ -34,6 +34,15 @@ The aia command-line tool is an interface for interacting with an AI model backe
  `--dump` *FORMAT*
  : Dump a Config File in [yaml | toml] to STDOUT - default is nil
 
+ `-e`, `--edit`
+ : Invokes an editor on the prompt file. You can make changes to the prompt file, save it and the newly saved prompt will be processed by the backend.
+
+ `--env`
+ : This option tells `aia` to replace references to system environment variables in the prompt with the value of the envar. envars look like $HOME and ${HOME}; in this example their occurrence will be replaced by the value of ENV['HOME']. Also the dynamic shell command in the pattern $(shell command) will be executed and its output replaces its pattern. It does not matter if your shell uses different patterns than BASH since the replacement is being done within a Ruby context.
+
+ `--erb`
+ : If dynamic prompt content using $(...) wasn't enough, here is ERB (Embedded RuBy). <%= ruby code %> within a prompt will have its Ruby code executed and the results of that execution inserted into the prompt. I'm sure we will find a way to truly misuse this capability. Remember, some say that the simple prompt is the best prompt.
+
  `--model` *NAME*
  : Name of the LLM model to use - default is gpt-4-1106-preview
 
@@ -76,6 +85,9 @@ The aia command-line tool is an interface for interacting with an AI model backe
  `-p`, `--prompts` *PATH_TO_DIRECTORY*
  : Directory containing the prompt files - default is ~/.prompts
 
+ `-r`, `--role` *ROLE_ID*
+ : A role ID is the same as a prompt ID. A "role" is a specialized prompt that gets prepended to another prompt. Its purpose is to configure the LLM into a certain orientation within which to resolve its primary prompt.
+
  `-v`, `--verbose`
  : Be Verbose - default is false
 
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: aia
  version: !ruby/object:Gem::Version
- version: 0.4.1
+ version: 0.4.3
  platform: ruby
  authors:
  - Dewayne VanHoozer
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2023-12-31 00:00:00.000000000 Z
+ date: 2024-01-01 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: hashie
@@ -205,7 +205,6 @@ files:
  - lib/aia/tools/vim.rb
  - lib/aia/version.rb
  - lib/core_ext/string_wrap.rb
- - lib/modularization_plan.md
  - main.just
  - man/aia.1
  - man/aia.1.md
data/lib/modularization_plan.md DELETED
@@ -1,126 +0,0 @@
- ## Suggested Refactoring into Modules
-
- ### ConfigurationModule
-
- This module could encapsulate all the constants and environment-dependent settings.
-
- ```ruby
- module Configuration
-   HOME = Pathname.new(ENV['HOME'])
-   PROMPTS_DIR = Pathname.new(ENV['PROMPTS_DIR'] || (HOME + ".prompts_dir"))
-   AI_CLI_PROGRAM = "mods"
-   EDITOR = ENV['EDITOR'] || 'edit'
-   MY_NAME = Pathname.new(__FILE__).basename.to_s.split('.')[0]
-   MODS_MODEL = ENV['MODS_MODEL'] || 'gpt-4-1106-preview'
-   OUTPUT = Pathname.pwd + "temp.md"
-   PROMPT_LOG = PROMPTS_DIR + "_prompts.log"
-   USAGE = <<~EOUSAGE
-     AI Assistant (aia)
-     ==================
-     The AI cli program being used is: #{AI_CLI_PROGRAM}
-     You can pass additional CLI options to #{AI_CLI_PROGRAM} like this:
-     "#{MY_NAME} my options -- options for #{AI_CLI_PROGRAM}"
-   EOUSAGE
- end
- ```
-
- ### OptionParsingModule
-
- This module could manage the parsing of command-line arguments and configuring the options for the application.
-
- ```ruby
- module OptionParsing
-   def build_reader_methods
-     # ... method definition ...
-   end
-
-   def process_arguments
-     # ... method definition ...
-   end
-
-   def check_for(an_option)
-     # ... method definition ...
-   end
-
-   def process_option(option_sym, switches)
-     # ... method definition ...
-   end
- end
- ```
-
- ### CommandLineInterfaceModule
-
- This module would manage interactions with the command-line interface including editing of prompts and selection processes.
-
- ```ruby
- module CommandLineInterface
-   def keyword_value(kw, default)
-     # ... method definition ...
-   end
-
-   def handle_multiple_prompts(found_these, while_looking_for_this)
-     # ... method definition ...
-   end
- end
- ```
-
- ### LoggingModule
-
- Responsible for logging the results of the command.
-
- ```ruby
- module Logging
-   def write_to_log(answer)
-     # ... method definition ...
-   end
- end
- ```
-
- ### AICommandModule
-
- Manages the building and execution of the AI CLI command.
-
- ```ruby
- module AICommand
-   def setup_cli_program
-     # ... method definition ...
-   end
-
-   def build_command
-     # ... method definition ...
-   end
-
-   def execute_and_log_command(command)
-     # ... method definition ...
-   end
- end
- ```
-
- ### PromptProcessingModule
-
- Handles prompt retrieval, existing check, and keyword processing.
-
- ```ruby
- module PromptProcessing
-   def existing_prompt?(prompt_id)
-     # ... method definition ...
-   end
-
-   def process_prompt
-     # ... method definition ...
-   end
-
-   def replace_keywords
-     # ... method definition ...
-   end
-
-   def search_for_a_matching_prompt(prompt_id)
-     # ... method definition ...
-   end
- end
- ```
-
- Each module should only contain the methods relevant to that module's purpose. After defining these modules, they can be included in the `AIA::Main` class where appropriate. Note that the method `get_prompt_id` didn't fit neatly into one of the outlined modules; it may remain in the main class or be included in a module if additional context becomes available or if it can be logically grouped with similar methods.
-
- The `__END__` block and the Readline history management could be encapsulated into a separate module for terminal interactions if that block grows in complexity or moves out of the overall class definition.
-