aia 0.3.20 → 0.4.1

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
- metadata.gz: b3061141e7330122b3ec6ffd64cf967d8fb0c4f8741531337bf32bfba28ee4ea
- data.tar.gz: b62080a48b5d4afefb9da0eb5bdc1deff20a8c666048a755b38f759815bea8f3
+ metadata.gz: 0542f46fbbf4d3dd960229ae8f707f89b2cbab347d763d2b8db11aa8f4e2e552
+ data.tar.gz: 6dd9b949fbdc26019c64f2ece134114b515782395bce52a71b0f5d281042071b
 SHA512:
- metadata.gz: c2e83d7db907fe2e3c91aae7acad12bf1569071479f99ba70f3626c98640323148e83b46cda2aa2353afcad54d740116999399e8e646b7db02b4a7e42916d0fc
- data.tar.gz: 3f327c042a67a574231d9e25a5ed8df83ca6d4cf28444ae92be20d241480143ce95ec4e4238aab70caecf0c9c64417d579c6e8fd3dd3e7928c64f8e56fb2196a
+ metadata.gz: 2eefbbd2215bc82361d1335da437714760ca7c9db5237b89280dd5cdbe220da2dd14f48890ca56641758186c0a54d5cc199b22b18535fa62ed147c52bda2a77a
+ data.tar.gz: '03990e11ef3fa9960fa8c3bd8aa34f667cf9fdd10b2e37646d2c54b199ea629fb58c1cafaeeb328bff1cbda2ce219300ab3831ab95894958317bc97c2835f18f'
data/.semver CHANGED
@@ -1,6 +1,6 @@
 ---
 :major: 0
- :minor: 3
- :patch: 20
+ :minor: 4
+ :patch: 1
 :special: ''
 :metadata: ''
data/CHANGELOG.md CHANGED
@@ -1,4 +1,9 @@
 ## [Unreleased]
+ ## [0.4.1] 2023-31
+ - added a chat mode
+ - prompt directives now supported
+ - version bumped to match the `prompt_manager` gem
+
 ## [0.3.20] 2023-12-28
 - added work around to issue with multiple context files going to the `mods` backend
 - added shellwords gem to santize prompt text on the command line
data/README.md CHANGED
@@ -6,7 +6,7 @@ Uses the gem "prompt_manager" to manage the prompts sent to the `mods` command-l
 
 **Most Recent Change**
 
- v0.3.19 - Major code refactoring. Now supports conf giles in YAML or TOML format. Supports system environment variable (envar) over-rides of default config values using envar's with keys that have the prefix "AIA_" followed by a config item name in all upper case.
+ v0.4.1 - Added a chat mode. Prompt directives are now supported.
 
 <!-- Tocer[start]: Auto-generated, don't remove. -->
 
@@ -15,6 +15,7 @@ v0.3.19 - Major code refactoring. Now supports conf giles in YAML or TOML forma
 - [Installation](#installation)
 - [Usage](#usage)
 - [System Environment Variables (envar)](#system-environment-variables-envar)
+ - [Prompt Directives](#prompt-directives)
 - [External CLI Tools Used](#external-cli-tools-used)
 - [Shell Completion](#shell-completion)
 - [Development](#development)
@@ -52,7 +53,7 @@ The usage report obtained using either `-h` or `--help` is implemented as a stan
 ```text
 $ aia --help
 
- aia(1) User Manuals aia(1)
+ aia(1) User Manuals aia(1)
 
 NAME
 aia - command-line interface for an AI assistant
@@ -61,157 +62,167 @@ SYNOPSIS
 aia [options]* PROMPT_ID [CONTEXT_FILE]* [-- EXTERNAL_OPTIONS+]
 
 DESCRIPTION
- The aia command-line tool is an interface for interacting with an AI
- model backend, providing a simple way to send prompts and receive
- responses. The CLI supports various options to customize the
- interaction, load a configuration file, edit prompts, set debugging
- levels, and more.
+ The aia command-line tool is an interface for interacting with an AI model backend,
+ providing a simple way to send prompts and receive responses. The CLI supports various
+ options to customize the interaction, load a configuration file, edit prompts, set
+ debugging levels, and more.
 
 ARGUMENTS
 PROMPT_ID
 This is a required argument.
 
 CONTEXT_FILES
- This is an optional argument. One or more files can be added to
- the prompt as context for the backend gen-AI tool to process.
+ This is an optional argument. One or more files can be added to the prompt as
+ context for the backend gen-AI tool to process.
 
 EXTERNAL_OPTIONS
- External options are optional. Anything that follow “ -- “ will
- be sent to the backend gen-AI tool. For example “-- -C -m
- gpt4-128k” will send the options “-C -m gpt4-128k” to the
- backend gen-AI tool. aia will not validate these external
- options before sending them to the backend gen-AI tool.
+ External options are optional. Anything that follows “ -- ” will be sent to the
+ backend gen-AI tool. For example “-- -C -m gpt4-128k” will send the options
+ “-C -m gpt4-128k” to the backend gen-AI tool. aia will not validate these
+ external options before sending them to the backend gen-AI tool.
 
 OPTIONS
- -c, --config PATH_TO_CONFIG_FILE
- Load Config File - default: nil
+ --chat begin a chat session with the backend after the initial prompt response; will
+ set --no-output so that the backend response comes to STDOUT.
+
+ --completion SHELL_NAME
 
 --dump FORMAT
 
- -e, --edit
- Edit the Prompt File - default: false
+ --model NAME
+ Name of the LLM model to use - default is gpt-4-1106-preview
 
- -d, --debug
- Turn On Debugging - default: false
+ --speak
+ Simple implementation. Uses the “say” command to speak the response. Fun with
+ --chat
 
- -v, --verbose
- Be Verbose - default: false
+ --terse
+ Add a clause to the prompt text that instructs the backend to be terse in its
+ response.
 
 --version
- Print Version - default: false
+ Print Version - default is false
 
- -h, --help
- Show Usage - default: false
+ -b, --[no]-backend LLM TOOL
+ Specify the backend prompt resolver - default is mods
 
- -s, --search TERM
- Search for prompts contain TERM - default: nil
+ -c, --config PATH_TO_CONFIG_FILE
+ Load Config File - default is nil
 
- -f, --fuzzy`
- Use Fuzzy Matching when searching for a prompt - default: false
+ -d, --debug
+ Turn On Debugging - default is false
 
- --completion SHELL_NAME
+ -e, --edit
+ Edit the Prompt File - default is false
 
- -o, --[no]-output PATH_TO_OUTPUT_FILE
- Out FILENAME - default: ./temp.md
+ -f, --fuzzy
+ Use Fuzzy Matching when searching for a prompt - default is false
+
+ -h, --help
+ Show Usage - default is false
 
 -l, --[no]-log PATH_TO_LOG_FILE
- Log FILEPATH - default: $HOME/.prompts/prompts.log
+ Log FILEPATH - default is $HOME/.prompts/prompts.log
 
 -m, --[no]-markdown
- Format with Markdown - default: true
+ Format with Markdown - default is true
 
- --model NAME
- Name of the LLM model to use - default: gpt-4-1106-preview
+ -o, --[no]-output PATH_TO_OUTPUT_FILE
+ Out FILENAME - default is ./temp.md
 
 -p, --prompts PATH_TO_DIRECTORY
- Directory containing the prompt files - default: ~/.prompts
-
- -b, --[no]-backend LLM TOOL
- Specify the backend prompt resolver - default: :mods
-
- ENVIRONMENT
- The aia CLI uses the following environment variables:
+ Directory containing the prompt files - default is ~/.prompts
 
- AIA_PROMPTS_DIR: Path to the directory containing prompts
- files - default: $HOME/.prompts_dir
-
- • AIA_BACKEND: The AI command-line program used - default: mods
+ -v, --verbose
+ Be Verbose - default is false
 
- EDITOR: The text editor used by the edit option - default:
- edit
+ CONFIGURATION HIERARCHY
+ System Environment Variables (envars) that are all uppercase and begin with “AIA_” can
+ be used to over-ride the default configuration settings. For example setting “export
+ AIA_PROMPTS_DIR=~/Documents/prompts” will over-ride the default configuration;
+ however, a config value provided by a command line option will over-ride an envar
+ setting.
 
- AIA_MODEL: The AI model specification - default:
- gpt-4-1106-preview
+ Configuration values found in a config file will over-ride all other values set for a
+ config item.
 
- AIA_OUTPUT: The default filename for output - default:
- ./temp.md
+ “//config” directives found inside a prompt file over-ride that config item
+ regardless of where the value was set.
 
- AIA_PROMPT_LOG: The default filepath for the prompts log -
- default: $HOME/.prompts/_prompts.log
+ For example “//config chat? = true” within a prompt will set up the back-and-forth
+ chat session for that specific prompt regardless of the command line options or the
+ envar AIA_CHAT setting.
 
- Additionally, the program requires an OpenAI access key, which can be
- specified using one of the following environment variables:
+ OpenAI ACCOUNT IS REQUIRED
+ Additionally, the program requires an OpenAI access key, which can be specified using
+ one of the following environment variables:
 
 • OPENAI_ACCESS_TOKEN
 
 • OPENAI_API_KEY
 
- Currently there is not specific standard for name of the OpenAI key.
- Some programs use one name, while others use a different name. Both of
- the envars listed above mean the same thing. If you use more than one
- tool to access OpenAI resources, you may have to set several envars to
- the same key value.
+ Currently there is no specific standard for the name of the OpenAI key. Some programs
+ use one name, while others use a different name. Both of the envars listed above mean
+ the same thing. If you use more than one tool to access OpenAI resources, you may
+ have to set several envars to the same key value.
 
- To acquire an OpenAI access key, first create an account on the OpenAI
- platform, where further documentation is available.
+ To acquire an OpenAI access key, first create an account on the OpenAI platform, where
+ further documentation is available.
 
 USAGE NOTES
- aia is designed for flexibility, allowing users to pass prompt ids and
- context files as arguments. Some options change the behavior of the
- output, such as --output for specifying a file or --no-output for
- disabling file output in favor of standard output.
+ aia is designed for flexibility, allowing users to pass prompt ids and context files
+ as arguments. Some options change the behavior of the output, such as --output for
+ specifying a file or --no-output for disabling file output in favor of standard output
+ (STDOUT).
+
+ The --completion option displays a script that enables prompt ID auto-completion for
+ bash, zsh, or fish shells. It’s crucial to integrate the script into the shell’s
+ runtime to take effect.
+
+ The --dump option will send the current configuration to STDOUT in the format
+ requested. Both YAML and TOML formats are supported.
 
- The --completion option displays a script that enables prompt ID
- auto-completion for bash, zsh, or fish shells. It’s crucial to
- integrate the script into the shell’s runtime to take effect.
+ PROMPT DIRECTIVES
+ Within a prompt text file any line that begins with “//” is considered a prompt
+ directive. There are numerous prompt directives available. In the discussion above
+ on the configuration you learned about the “//config” directive.
+
+ Detailed discussion of individual prompt directives is TBD. Most likely it will be
+ handled in the GitHub wiki <https://github.com/MadBomber/aia>
 SEE ALSO
 
- • OpenAI Platform Documentation
- <https://platform.openai.com/docs/overview>
+ • OpenAI Platform Documentation <https://platform.openai.com/docs/overview>
 for more information on obtaining access tokens
 <https://platform.openai.com/account/api-keys>
 and working with OpenAI models.
 
 • mods <https://github.com/charmbracelet/mods>
- for more information on mods - AI for the command line, built
- for pipelines. LLM based AI is really good at interpreting
- the output of commands and returning the results in CLI
- friendly text formats like Markdown. Mods is a simple tool
- that makes it super easy to use AI on the command line and in
+ for more information on mods - AI for the command line, built for pipelines.
+ LLM based AI is really good at interpreting the output of commands and
+ returning the results in CLI friendly text formats like Markdown. Mods is a
+ simple tool that makes it super easy to use AI on the command line and in
 your pipelines. Mods works with OpenAI
 <https://platform.openai.com/account/api-keys>
 and LocalAI <https://github.com/go-skynet/LocalAI>
 
 • sgpt <https://github.com/tbckr/sgpt>
- (aka shell-gpt) is a powerful command-line interface (CLI)
- tool designed for seamless interaction with OpenAI models
- directly from your terminal. Effortlessly run queries,
- generate shell commands or code, create images from text, and
- more, using simple commands. Streamline your workflow and
- enhance productivity with this powerful and user-friendly CLI
- tool.
+ (aka shell-gpt) is a powerful command-line interface (CLI) tool designed for
+ seamless interaction with OpenAI models directly from your terminal.
+ Effortlessly run queries, generate shell commands or code, create images from
+ text, and more, using simple commands. Streamline your workflow and enhance
+ productivity with this powerful and user-friendly CLI tool.
 
 AUTHOR
 Dewayne VanHoozer <dvanhoozer@gmail.com>
 
- AIA 2024-01-01 aia(1)
+ AIA 2024-01-01 aia(1)
 
 ```
 
 ## System Environment Variables (envar)
 
- The `aia` configuration defaults can be over-ridden by envars with the prefix "AIA_" followed by the config item name also in uppercase.
+ The `aia` configuration defaults can be over-ridden by envars with the prefix "AIA_" followed by the config item name also in uppercase. All configuration items can be over-ridden in this way by an envar. The following table shows a few examples.
 
 | Config Item | Default Value | envar key |
 | ------------- | ------------- | --------- |
@@ -231,6 +242,20 @@ The `aia` configuration defaults can be over-ridden by envars with the prefix "A
 
 See the `@options` hash in the `cli.rb` file for a complete list. There are some config items that do not necessarily make sense for use as an envar over-ride. For example if you set `export AIA_DUMP=yaml` then `aia` would dump a config file in HAML format and exit every time it is ran until you finally did `unset AIA_DUMP`
 
+ In addition to these config items for `aia`, the optional command line parameters for the backend prompt processing utilities (mods and sgpt) can also be set using envars with the "AIA_" prefix. For example "export AIA_TOPP=1.0" will set the "--topp 1.0" command line option for the `mods` utility when it's used as the backend processor.
+
+ ## Prompt Directives
+
+ Downstream processing directives were added to the `prompt_manager` gem at version 0.4.1. These directives are lines in the prompt text file that begin with "//".
+
+ For example, if a prompt text file has this line:
+
+ > //config chat? = true
+
+ That prompt will enter the chat loop regardless of the presence of a "--chat" CLI option or the setting of the envar AIA_CHAT.
+
+ BTW, did I mention that `aia` supports a chat mode where you can send an initial prompt to the backend and then follow up the backend's response with additional keyboard-entered questions, instructions, prompts, etc.?
+
 ## External CLI Tools Used
 
 From the verbose help text ...
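The configuration hierarchy described above (default, then envar, then command line, then config file, then `//config` directive) can be sketched as a simple resolver. This is an illustrative model only; the names and structure here are hypothetical, not `aia`'s actual internals:

```ruby
# Hypothetical defaults; aia's real config lives in AIA.config.
DEFAULTS = { "prompts_dir" => "~/.prompts" }

# Resolve one config item using the precedence described above:
# default < envar < command line < config file < //config directive.
def resolve(item, envars: {}, cli: {}, config_file: {}, directives: {})
  value = DEFAULTS[item]
  envar_key = "AIA_#{item.upcase}"
  value = envars[envar_key]    if envars.key?(envar_key)
  value = cli[item]            if cli.key?(item)
  value = config_file[item]    if config_file.key?(item)
  value = directives[item]     if directives.key?(item)
  value
end

puts resolve("prompts_dir", envars: { "AIA_PROMPTS_DIR" => "~/Documents/prompts" })
```

Note how a CLI value beats an envar, but a config-file value and then a `//config` directive beat everything before them, matching the man page's CONFIGURATION HIERARCHY section.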
data/lib/aia/cli.rb CHANGED
@@ -117,7 +117,8 @@ class AIA::Cli
 # Default
 # Key Value, switches
 arguments: [args], # NOTE: after process, prompt_id and context_files will be left
- extra: [''], # SMELL: should be nil?
+ directives: [[]], # an empty Array as the default value
+ extra: [''], #
 #
 model: ["gpt-4-1106-preview", "--llm --model"],
 #
@@ -130,8 +131,10 @@ class AIA::Cli
 version?: [false, "--version"],
 help?: [false, "-h --help"],
 fuzzy?: [false, "-f --fuzzy"],
- search: [nil, "-s --search"],
 markdown?: [true, "-m --markdown --no-markdown --md --no-md"],
+ chat?: [false, "--chat"],
+ terse?: [false, "--terse"],
+ speak?: [false, "--speak"],
 #
 # TODO: May have to process the
 # "~" character and replace it with HOME
@@ -194,13 +197,15 @@ class AIA::Cli
 
 
 def process_command_line_arguments
+ # get the options meant for the backend AI command
+ # doing this first in case there are any options that conflict
+ # between frontend and backend.
+ extract_extra_options
+
 @options.keys.each do |option|
 check_for option
 end
 
- # get the options meant for the backend AI command
- extract_extra_options
-
 bad_options = arguments.select{|a| a.start_with?('-')}
 
 unless bad_options.empty?
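The reordering above, which pulls the backend's options out of the argument list before `aia`'s own options are parsed, hinges on splitting the arguments at the `--` marker described in the EXTERNAL_OPTIONS section. A minimal sketch of that split (an assumption about the approach, not `aia`'s actual `extract_extra_options`):

```ruby
# Split an argument list at the first "--": everything before it belongs to
# the frontend, everything after it is passed through to the backend tool.
def split_at_double_dash(args)
  marker = args.index('--')
  return [args, []] if marker.nil?
  [args[0...marker], args[(marker + 1)..]]
end

front, extra = split_at_double_dash(%w[--chat my_prompt -- -C -m gpt4-128k])
# front => ["--chat", "my_prompt"], extra => ["-C", "-m", "gpt4-128k"]
```

Doing this split first means a flag like `-m` after `--` can never be mistaken for `aia`'s own `--markdown` switch.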
data/lib/aia/directives.rb ADDED
@@ -0,0 +1,66 @@
+ # lib/aia/directives.rb
+
+ require 'hashie'
+
+ class AIA::Directives
+   def initialize( prompt: )
+     @prompt = prompt # PromptManager::Prompt instance
+     AIA.config.directives = @prompt.directives
+   end
+
+
+   def execute_my_directives
+     return if AIA.config.directives.nil? || AIA.config.directives.empty?
+
+     not_mine = []
+
+     AIA.config.directives.each do |entry|
+       directive  = entry[0].to_sym
+       parameters = entry[1]
+
+       if respond_to? directive
+         send(directive, parameters)
+       else
+         not_mine << entry
+       end
+     end
+
+     AIA.config.directives = not_mine
+   end
+
+
+   def box(what)
+     f   = what[0]
+     bar = "#{f}"*what.size
+     puts "#{bar}\n#{what}\n#{bar}"
+   end
+
+
+   def shell(what) = puts `#{what}`
+   def ruby(what)  = eval what
+
+
+   # Allows a prompt to change its configuration environment
+   def config(what)
+     parts = what.split(' ')
+     item  = parts.shift
+     parts.shift if %w[:= =].include? parts[0]
+
+     if '<<' == parts[0]
+       parts.shift
+       value = parts.join
+       if AIA.config[item].is_a?(Array)
+         AIA.config[item] << value
+       else
+         AIA.config[item] = [ value ]
+       end
+     else
+       value = parts.join
+       if item.end_with?('?')
+         AIA.config[item] = %w[1 y yea yes t true].include?(value.downcase)
+       else
+         AIA.config[item] = "STDOUT" == value ? STDOUT : value
+       end
+     end
+   end
+ end
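The directive handling above can be illustrated with a standalone sketch: pull `//` lines out of a prompt into `[name, parameters]` pairs (the shape `AIA::Directives` receives from the `prompt_manager` gem) and coerce `?`-suffixed config items to booleans the same way `config` does. This is an illustrative reimplementation, not the gem's code:

```ruby
# Lines beginning with "//" become [name, parameters] pairs.
def parse_directives(prompt_text)
  prompt_text.lines
             .map(&:strip)
             .select { |line| line.start_with?('//') }
             .map    { |line| line.delete_prefix('//').split(' ', 2) }
end

# Boolean coercion for "?"-suffixed config items, mirroring config() above.
def truthy?(value) = %w[1 y yea yes t true].include?(value.downcase)

directives = parse_directives(<<~PROMPT)
  //config chat? = true
  Summarize the attached files.
PROMPT
# directives => [["config", "chat? = true"]]
```

Only the directive lines are extracted; the remaining text is what gets sent to the backend.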
data/lib/aia/main.rb CHANGED
@@ -4,7 +4,8 @@ module AIA ; end
 require_relative 'config'
 require_relative 'cli'
- require_relative 'prompt_processing'
+ require_relative 'directives'
+ require_relative 'prompt'
 require_relative 'logging'
 require_relative 'tools'
 
@@ -12,26 +13,58 @@ require_relative 'tools'
 # of a single class.
 
 class AIA::Main
- include AIA::PromptProcessing
 
- attr_accessor :logger, :tools
+ attr_accessor :logger, :tools, :backend
+
+ def initialize(args= ARGV)
+ AIA::Tools.load_tools
 
- def initialize(args= ARGV)
 AIA::Cli.new(args)
 
 @logger = AIA::Logging.new(AIA.config.log_file)
- AIA::Tools.load_tools
+
+ @logger.info(AIA.config) if AIA.config.debug? || AIA.config.verbose?
+
+ @prompt = AIA::Prompt.new.prompt
+
+ @engine = AIA::Directives.new(prompt: @prompt)
 
 # TODO: still should verify that the tools are ion the $PATH
 # tools.class.verify_tools
 end
 
 
+ def speak(what)
+ return unless AIA.config.speak?
+ # MacOS uses the say command
+ system "say #{Shellwords.escape(what)}"
+ end
+
+
+ # Function to setup the Reline history with a maximum depth
+ def setup_reline_history(max_history_size=5)
+ Reline::HISTORY.clear
+ # Reline::HISTORY.max_size = max_history_size
+ end
+
+
+ # Function to prompt the user with a question using reline
+ def ask_question_with_reline(prompt)
+ answer = Reline.readline(prompt)
+ Reline::HISTORY.push(answer) unless answer.nil? || Reline::HISTORY.to_a.include?(answer)
+ answer
+ rescue Interrupt
+ ''
+ end
+
+
 def call
- get_prompt
- process_prompt
-
- # send_prompt_to_external_command
+ @engine.execute_my_directives
+
+ if AIA.config.chat?
+ AIA.config.output_file = STDOUT
+ AIA.config.extra = "--quiet" if 'mods' == AIA.config.backend
+ end
 
 # TODO: the context_files left in the @arguments array
 # should be verified BEFORE asking the user for a
@@ -56,15 +89,63 @@ class AIA::Main
 
 abort "backend not found: #{AIA.config.backend}" if backend_klass.nil?
 
- backend = backend_klass.new(
- text: @prompt.to_s,
- files: AIA.config.arguments # FIXME: want validated context files
- )
+ the_prompt = @prompt.to_s
+
+ if AIA.config.terse?
+ the_prompt.prepend "Be terse in your response. "
+ end
+
+ @backend = backend_klass.new(
+ text: the_prompt,
+ files: AIA.config.arguments # FIXME: want validated context files
+ )
+
 
 result = backend.run
 
 AIA.config.output_file.write result
 
 logger.prompt_result(@prompt, result)
+
+
+ if AIA.config.chat?
+ setup_reline_history
+ speak result
+ lets_chat
+ end
+ end
+
+
+ def lets_chat
+ if 'mods' == AIA.config.backend
+ AIA.config.extra += " -C"
+ end
+
+ backend.text = ask_question_with_reline("\nFollow Up: ")
+
+ until backend.text.empty?
+ if AIA.config.terse?
+ backend.text.prepend "Be terse in your response. "
+ end
+
+ logger.info "Follow Up: #{backend.text}"
+ response = backend.run
+
+ speak response
+
+ puts "\nResponse: #{response}"
+ logger.info "Response: #{response}"
+
+ # TODO: Allow user to enter a directive; loop
+ # until answer is not a directive
+ #
+ # while !directive do
+ backend.text = ask_question_with_reline("\nFollow Up: ")
+
+ speak backend.text
+
+ # execute the directive
+ # end
+ end
 end
 end
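The `lets_chat` loop above depends on Reline and a live backend, which makes it hard to exercise in isolation. Its basic shape can be sketched with injected IO and a stub backend; this is a simplified illustration under those assumptions, not the gem's implementation:

```ruby
require 'stringio'

# Read follow-up prompts until a blank entry ends the session; the real
# loop uses Reline for input and the configured backend tool for replies.
def chat_loop(input, output, backend)
  loop do
    output.print "\nFollow Up: "
    text = input.gets.to_s.strip
    break if text.empty?
    output.puts "\nResponse: #{backend.call(text)}"
  end
end

echo_backend = ->(text) { "you said: #{text}" }
chat_loop(StringIO.new("hello\n\n"), $stdout, echo_backend)
```

Injecting the IO objects keeps the loop testable without a terminal or an API key.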