aia 0.9.2 → 0.9.3rc1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: 543eb535dae7c828b1dbd2a2dd981e566d27b1c5d8ad4c81d1fd21566015f175
-  data.tar.gz: 021bf728dd9082c9fa1aca73d72001e4ed512d2c7ddd23cff044d5d9447a8d2d
+  metadata.gz: 9e3aa7b74ca7b64aca1acd2360043829f9ac14ae63f4c66ced12ea1a70ae34d0
+  data.tar.gz: 5802184befd82ae7ff4eecf16da0e5fc1d9d39cc61cdcc5736ad4d56dbe0b87e
 SHA512:
-  metadata.gz: 5b108c87dc2c82c0c347efa2b084ce1274b412324c33c2450a8174a655dca3be9c14e34b83347b90a9a7c8817b0eadb90a17f81536234eb38b8d09377f4a3221
-  data.tar.gz: 43bbef3a40562333650251dde1ea5868b4cef29ed7b35c447429a75552899a6a73989ae87546320e8a51370675192f1d18d71c56eef8b98985beb863341f4869
+  metadata.gz: 31dae0e58c2d7f4b97a23f90b1b588b8735bb6a8e9774a2f40554baf64636ca8c56adc2df9fad804c7e05e91ca5d6a2fde358dd886c56219190f1ba8262fdbf0
+  data.tar.gz: d710b9e9fbe52cf014ed8a10274eb84ad4bd1ed00b2ee03a75e71767a4f45abb37d4b5ff6b258597c77b352cf1eb5c8272f1e59c3611eaba9c730447b17e2f2c
data/.version CHANGED
@@ -1 +1 @@
-0.9.2
+0.9.3rc1
data/CHANGELOG.md CHANGED
@@ -1,6 +1,18 @@
 # Changelog
 ## [Unreleased]
+### [0.9.3] WIP
+- need to pay attention to the test suite
+- also need to ensure the non text2text modes are working
+
 ## Released
+### [0.9.3rc1] 2025-05-24
+- using ruby_llm v1.3.0rc1
+- added a models database refresh based on an integer days interval with the --refresh option
+- config file now has a "last_refresh" String in format YYYY-MM-DD
+- enhanced the robot figure to show more config items, including tools
+- fixed a bug with the --require option where the specified libraries were not being loaded
+- fixed a bug in the prompt_manager gem, which is now at v0.5.5
+
 
 ### [0.9.2] 2025-05-18
 - removing the MCP experiment
data/README.md CHANGED
@@ -18,7 +18,7 @@ AIA leverages the [prompt_manager gem](https://github.com/madbomber/prompt_manag
 
 **Wiki**: [Checkout the AIA Wiki](https://github.com/MadBomber/aia/wiki)
 
-**MCRubyLLM::Tool Support:** AIA now supports the integration of Tools for those models that support function callbacks. See the --tools, --allowed_tools and --rejected_tools options. Yes, functional callbacks provided for dynamic prompt just like the AIA directives so why have both? Well, AIA is older that functional callbacks. Directives or legacy but more than that not all models support functional callbacks. That means the old directives capability, shell and erb integration are still viable ways to provided dynamic extra content to your prompts.
+**RubyLLM::Tool Support:** AIA now supports the integration of Tools for those models that support function callbacks. See the --tools, --allowed_tools and --rejected_tools options. Yes, function callbacks provide dynamic prompts just like the AIA directives, shell and ERB integrations, so why have both? Well, AIA is older than function callbacks. The AIA integrations are legacy, but more than that, not all models support function callbacks. That means the AIA integrations are still viable ways to provide dynamic extra content to your prompts.
 
 <!-- Tocer[start]: Auto-generated, don't remove. -->
 
@@ -27,11 +27,13 @@ AIA leverages the [prompt_manager gem](https://github.com/madbomber/prompt_manag
   - [Configuration Options](#configuration-options)
   - [Configuration Flexibility](#configuration-flexibility)
   - [Expandable Configuration](#expandable-configuration)
+  - [The Local Model Registry Refresh](#the-local-model-registry-refresh)
+  - [Important Note](#important-note)
   - [Shell Integration inside of a Prompt](#shell-integration-inside-of-a-prompt)
   - [Dynamic Shell Commands](#dynamic-shell-commands)
   - [Shell Command Safety](#shell-command-safety)
   - [Chat Session Use](#chat-session-use)
-  - [*E*mbedded *R*u*B*y (ERB)](#embedded-ruby-erb)
+  - [Embedded Ruby (ERB)](#embedded-ruby-erb)
   - [Prompt Directives](#prompt-directives)
   - [Parameter and Shell Substitution in Directives](#parameter-and-shell-substitution-in-directives)
   - [Directive Syntax](#directive-syntax)
@@ -105,7 +107,8 @@ The following table provides a comprehensive list of configuration options, thei
 | presence_penalty | --presence_penalty | 0.0 | AIA_PRESENCE_PENALTY |
 | prompt_extname | | .txt | AIA_PROMPT_EXTNAME |
 | prompts_dir | -p, --prompts_dir | ~/.prompts | AIA_PROMPTS_DIR |
-| require_libs | --rq | [] | AIA_REQUIRE_LIBS |
+| refresh | --refresh | 0 (days) | AIA_REFRESH |
+| require_libs | --rq, --require | [] | AIA_REQUIRE_LIBS |
 | role | -r, --role | | AIA_ROLE |
 | roles_dir | | ~/.prompts/roles | AIA_ROLES_DIR |
 | roles_prefix | --roles_prefix | roles | AIA_ROLES_PREFIX |
@@ -152,9 +155,25 @@ If you do not like the default regex used to identify parameters within the prom
 
 The configuration options are expandable through a config file, allowing you to add custom entries. For example, you can define a custom configuration item like "xyzzy" in your config file. This value can then be accessed in your prompts using `AIA.config.xyzzy` within a `//ruby` directive or an ERB block, enabling dynamic prompt generation based on your custom configurations.
 
+## The Local Model Registry Refresh
+
+The `ruby_llm` gem maintains a registry of providers and models, integrated with a new website that lets users download the latest information about each model. This capability is scheduled for release in version 1.3.0 of the gem.
+
+In anticipation of this new feature, the AIA tool has introduced the `--refresh` option, which specifies the number of days between updates to the local copy of the centralized model registry. Here’s how the `--refresh` option works:
+
+- A value of `0` (zero) updates the local model registry every time AIA is executed.
+- A value of `1` (one) updates the local model registry once per day.
+- etc.
+
+The date of the last successful refresh is stored in the configuration file under the key `last_refresh`. The default configuration file is located at `~/.aia/config.yml`. When a refresh is successful, the `last_refresh` value is updated to the current date, and the updated configuration is saved to `AIA.config.config_file`.
+
+### Important Note
+
+This approach to saving the `last_refresh` date can become cumbersome, particularly if you maintain multiple configuration files for different projects. The `last_refresh` date is only updated in the currently active configuration file. If you switch to a different project with a different configuration file, you may inadvertently hit the central model registry again, even if your local registry is already up to date.
+
 ## Shell Integration inside of a Prompt
 
-Using the option `--shell` enables AIA to access your terminal's shell environment from inside the prompt text.
+AIA configures the `prompt_manager` gem to be fully integrated with your local shell by default. This is not an option; it's a feature. If your prompt includes text patterns like $HOME, ${HOME}, or $(command), those patterns will be automatically replaced in the prompt text by their shell values.
 
 #### Dynamic Shell Commands
 
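For reference, the `refresh`/`last_refresh` keys described in the hunk above would appear in the active config file like this; a minimal sketch (other keys omitted, values illustrative):

```yaml
# ~/.aia/config.yml (excerpt)
model: gpt-4o-mini
refresh: 7                # days between local model registry refreshes; 0 = every run
last_refresh: 2025-05-24  # YYYY-MM-DD; rewritten after each successful refresh
```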
@@ -174,25 +193,25 @@ Given the following constraints $(cat ~/3_laws_of_robotics.txt) determine the be
 
 #### Shell Command Safety
 
-The catchphrase "the prompt is the code" within AIA means that you have the power to execute any command you want, but you must be careful not to execute commands that could cause harm. AIA is not going to protect you from doing something stupid. Sure that's a copout. I just can't think (actually I can) of all the ways you can mess things up writing code. Remember what we learned from Forrest Gump "Stupid is as stupid does." So don't do anything stupid. If someone gives you a prompt as says "run this with AIA" you had better review the prompt before processing it.
+The catchphrase "the prompt is the code" within AIA means that you have the power to execute any command you want, but you must be careful not to execute commands that could cause harm. AIA is not going to protect you from doing something dumb. Sure, that's a copout. I just can't think (actually I can) of all the ways you can mess things up writing code. Remember what we learned from Forrest Gump: "Stupid is as stupid does." So don't do anything dumb. If someone gives you a prompt and says "run this with AIA," you had better review the prompt before processing it.
 
 #### Chat Session Use
 
-When you use the `--shell` option to start a chat session, shell integration is available in your follow up prompts. Suppose you started a chat session (--chat) using a role of "Ruby Expert" expecting to chat about changes that could be made to a specific class BUT you forgot to include the class source file as part of the context when you got started. You could enter this as your follow up prompt to keep going:
+Shell integration is available in your follow-up prompts within a chat session. Suppose you started a chat session (--chat) using a role of "Ruby Expert" expecting to chat about changes that could be made to a specific class BUT you forgot to include the class source file as part of the context when you got started. You could enter this as your follow-up prompt to keep going:
 
 ```plaintext
 The class I want to chat about refactoring is this one: $(cat my_class.rb)
 ```
 
-That inserts the entire class source file into your follow up prompt. You can continue chatting with you AI Assistant about changes to the class.
+That inserts the entire class source file into your follow-up prompt. You can continue chatting with your AI Assistant about changes to the class.
 
-## *E*mbedded *R*u*B*y (ERB)
+## Embedded Ruby (ERB)
 
-The inclusion of dynamic content through the shell integration provided by the `--shell` option is significant. AIA also provides the full power of embedded Ruby code processing within the prompt text.
+The inclusion of dynamic content through the shell integration is significant. AIA also provides the full power of embedded Ruby code processing within the prompt text.
 
-The `--erb` option turns the prompt text file into a fully functioning ERB template. The [Embedded Ruby (ERB) template syntax (2024)](https://bophin-com.ngontinh24.com/article/language-embedded-ruby-erb-template-syntax) provides a good overview of the syntax and power of ERB.
+AIA takes advantage of the `prompt_manager` gem to enable ERB integration in prompt text by default. It's an always-available feature of AIA prompts. The [Embedded Ruby (ERB) template syntax (2024)](https://bophin-com.ngontinh24.com/article/language-embedded-ruby-erb-template-syntax) provides a good overview of the syntax and power of ERB.
 
-Most websites that have information about ERB will give examples of how to use ERB to generate dynamic HTML content for web-based applications. That is a common use case for ERB. AIA on the other hand uses ERB to generate dynamic prompt text for LLM processing.
+Most websites that have information about ERB will give examples of how to use ERB to generate dynamic HTML content for web-based applications. That is a common use case for ERB. AIA, on the other hand, uses ERB to generate dynamic or conditional prompt text for LLM processing.
 
 ## Prompt Directives
 
@@ -297,11 +316,11 @@ For example:
 //ruby puts "Hello from Ruby"
 ```
 
-You can also use the `--rq` option to specify Ruby libraries to require before executing Ruby code:
+You can also use the `--require` option to specify Ruby libraries to require before executing Ruby code:
 
 ```bash
 # Command line
-aia --rq json,csv my_prompt
+aia --rq json,csv --require os my_prompt
 
 # In chat
 //ruby JSON.parse('{"data": [1,2,3]}')["data"]
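Because both the shell and ERB integrations are always on, a single prompt file can mix the two. A minimal sketch of such a prompt (the file name and conditional are hypothetical, not from the README):

```erb
<%# refactor.txt -- ERB runs first, then shell patterns are substituted %>
Today is <%= Time.now.strftime('%Y-%m-%d') %>.
<% if File.exist?('my_class.rb') %>
Refactor this class: $(cat my_class.rb)
<% else %>
Propose a skeleton for a new Ruby class.
<% end %>
```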
@@ -28,9 +28,16 @@ module AIA
 
 
     def process_prompt(prompt, operation_type)
+      result = nil
       @ui_presenter.with_spinner("Processing", operation_type) do
-        send_to_client(prompt, operation_type)
+        result = send_to_client(prompt, operation_type)
      end
+
+      unless result.is_a? String
+        result = result.content
+      end
+
+      result
    end
 
 
data/lib/aia/config.rb CHANGED
@@ -7,6 +7,7 @@
 
 require 'yaml'
 require 'toml-rb'
+require 'date'
 require 'erb'
 require 'optparse'
 require 'json'
@@ -52,7 +53,6 @@ module AIA
       pipeline: [],
 
       # PromptManager::Prompt Tailoring
-
       parameter_regex: PromptManager::Prompt.parameter_regex.to_s,
 
       # LLM tuning parameters
@@ -64,15 +64,18 @@ module AIA
       image_size: '1024x1024',
       image_quality: 'standard',
       image_style: 'vivid',
+
       model: 'gpt-4o-mini',
       speech_model: 'tts-1',
       transcription_model: 'whisper-1',
+      embedding_model: 'text-embedding-ada-002',
+      image_model: 'dall-e-3',
+      refresh: 0, # days between refreshes of model info; 0 means every startup
+      last_refresh: Date.today - 1,
+
       voice: 'alloy',
       adapter: 'ruby_llm', # 'ruby_llm' or ???
 
-      # Embedding parameters
-      embedding_model: 'text-embedding-ada-002',
-
       # Default speak command
       speak_command: 'afplay', # 'afplay' for audio files
 
@@ -98,8 +101,13 @@ module AIA
     )
 
     tailor_the_config(config)
+    load_libraries(config)
     load_tools(config)
 
+    if config.dump_file
+      dump_config(config, config.dump_file)
+    end
+
     config
   end
 
@@ -180,11 +188,6 @@ module AIA
       and_exit = true
     end
 
-    if config.dump_file
-      dump_config(config, config.dump_file)
-      and_exit = true
-    end
-
     exit if and_exit
 
     # Only require a prompt_id if we're not in chat mode, not using fuzzy search, and no context files
@@ -207,6 +210,26 @@ module AIA
   end
 
 
+  def self.load_libraries(config)
+    return if config.require_libs.empty?
+
+    exit_on_error = false
+
+    config.require_libs.each do |library|
+      begin
+        require(library)
+      rescue => e
+        STDERR.puts "Error loading library '#{library}' #{e.message}"
+        exit_on_error = true
+      end
+    end
+
+    exit(1) if exit_on_error
+
+    config
+  end
+
+
   def self.load_tools(config)
     return if config.tool_paths.empty?
 
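As a usage sketch, the new `load_libraries` step makes a bad `--require` argument fail fast at startup; the message wording comes from the rescue clause above, while the library name here is hypothetical:

```bash
aia --require not_a_real_gem my_prompt
# STDERR: Error loading library 'not_a_real_gem' cannot load such file -- not_a_real_gem
# exit status: 1
```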
@@ -230,7 +253,7 @@ module AIA
         absolute_tool_path = File.expand_path(tool_path)
         require(absolute_tool_path)
       rescue => e
-        SYSERR.puts "Error loading tool '#{tool_path}' #{e.message}"
+        STDERR.puts "Error loading tool '#{tool_path}' #{e.message}"
         exit_on_error = true
       end
     end
@@ -272,253 +295,264 @@ module AIA
   def self.cli_options
     config = OpenStruct.new
 
-    opt_parser = OptionParser.new do |opts|
-      opts.banner = "Usage: aia [options] [PROMPT_ID] [CONTEXT_FILE]*\n" +
-                    "       aia --chat [PROMPT_ID] [CONTEXT_FILE]*\n" +
-                    "       aia --chat [CONTEXT_FILE]*"
-
-      opts.on("--chat", "Begin a chat session with the LLM after the initial prompt response; will set --no-out_file so that the LLM response comes to STDOUT.") do
-        config.chat = true
-        puts "Debug: Setting chat mode to true" if config.debug
-      end
+    begin
+      opt_parser = OptionParser.new do |opts|
+        opts.banner = "Usage: aia [options] [PROMPT_ID] [CONTEXT_FILE]*\n" +
+                      "       aia --chat [PROMPT_ID] [CONTEXT_FILE]*\n" +
+                      "       aia --chat [CONTEXT_FILE]*"
+
+        opts.on("--chat", "Begin a chat session with the LLM after the initial prompt response; will set --no-out_file so that the LLM response comes to STDOUT.") do
+          config.chat = true
+          puts "Debug: Setting chat mode to true" if config.debug
+        end
 
-      opts.on("--adapter ADAPTER", "Interface that adapts AIA to the LLM") do |adapter|
-        adapter.downcase!
-        valid_adapters = %w[ ruby_llm ] # NOTE: Add additional adapters here when needed
-        if valid_adapters.include? adapter
-          config.adapter = adapter
-        else
-          STDERR.puts "ERROR: Invalid adapter #{adapter} must be one of these: #{valid_adapters.join(', ')}"
-          exit 1
-        end
-      end
+        opts.on("--adapter ADAPTER", "Interface that adapts AIA to the LLM") do |adapter|
+          adapter.downcase!
+          valid_adapters = %w[ ruby_llm ] # NOTE: Add additional adapters here when needed
+          if valid_adapters.include? adapter
+            config.adapter = adapter
+          else
+            STDERR.puts "ERROR: Invalid adapter #{adapter} must be one of these: #{valid_adapters.join(', ')}"
+            exit 1
+          end
+        end
 
-      opts.on("-m MODEL", "--model MODEL", "Name of the LLM model to use") do |model|
-        config.model = model
-      end
+        opts.on("-m MODEL", "--model MODEL", "Name of the LLM model to use") do |model|
+          config.model = model
+        end
 
-      opts.on("--terse", "Adds a special instruction to the prompt asking the AI to keep responses short and to the point") do
-        config.terse = true
-      end
+        opts.on("--terse", "Adds a special instruction to the prompt asking the AI to keep responses short and to the point") do
+          config.terse = true
+        end
 
-      opts.on("-c", "--config_file FILE", "Load config file") do |file|
-        if File.exist?(file)
-          ext = File.extname(file).downcase
-          content = File.read(file)
+        opts.on("-c", "--config_file FILE", "Load config file") do |file|
+          if File.exist?(file)
+            ext = File.extname(file).downcase
+            content = File.read(file)
 
-          # Process ERB if filename ends with .erb
-          if file.end_with?('.erb')
-            content = ERB.new(content).result
-            file = file.chomp('.erb')
-            File.write(file, content)
-          end
+            # Process ERB if filename ends with .erb
+            if file.end_with?('.erb')
+              content = ERB.new(content).result
+              file = file.chomp('.erb')
+              File.write(file, content)
+            end
 
-          file_config = case ext
-                        when '.yml', '.yaml'
-                          YAML.safe_load(content, permitted_classes: [Symbol], symbolize_names: true)
-                        when '.toml'
-                          TomlRB.parse(content)
-                        else
-                          raise "Unsupported config file format: #{ext}"
-                        end
-
-          file_config.each do |key, value|
-            config[key.to_sym] = value
-          end
-        else
-          raise "Config file not found: #{file}"
-        end
-      end
+            file_config = case ext
+                          when '.yml', '.yaml'
+                            YAML.safe_load(content, permitted_classes: [Symbol], symbolize_names: true)
+                          when '.toml'
+                            TomlRB.parse(content)
+                          else
+                            raise "Unsupported config file format: #{ext}"
+                          end
+
+            file_config.each do |key, value|
+              config[key.to_sym] = value
+            end
+          else
+            raise "Config file not found: #{file}"
+          end
+        end
 
-      opts.on("-p", "--prompts_dir DIR", "Directory containing prompt files") do |dir|
-        config.prompts_dir = dir
-      end
+        opts.on("-p", "--prompts_dir DIR", "Directory containing prompt files") do |dir|
+          config.prompts_dir = dir
+        end
 
-      opts.on("--roles_prefix PREFIX", "Subdirectory name for role files (default: roles)") do |prefix|
-        config.roles_prefix = prefix
-      end
+        opts.on("--roles_prefix PREFIX", "Subdirectory name for role files (default: roles)") do |prefix|
+          config.roles_prefix = prefix
+        end
 
-      opts.on("-r", "--role ROLE_ID", "Role ID to prepend to prompt") do |role|
-        config.role = role
-      end
+        opts.on("-r", "--role ROLE_ID", "Role ID to prepend to prompt") do |role|
+          config.role = role
+        end
 
+        opts.on("--refresh DAYS", Integer, "Refresh models database interval in days") do |days|
+          config.refresh = days || 0
+        end
 
-      opts.on('--regex pattern', 'Regex pattern to extract parameters from prompt text') do |pattern|
-        config.parameter_regex = pattern
-      end
+        opts.on('--regex pattern', 'Regex pattern to extract parameters from prompt text') do |pattern|
+          config.parameter_regex = pattern
+        end
 
-      opts.on("-o", "--[no-]out_file [FILE]", "Output file (default: temp.md)") do |file|
-        config.out_file = file ? File.expand_path(file, Dir.pwd) : 'temp.md'
-      end
+        opts.on("-o", "--[no-]out_file [FILE]", "Output file (default: temp.md)") do |file|
+          config.out_file = file ? File.expand_path(file, Dir.pwd) : 'temp.md'
+        end
 
-      opts.on("-a", "--[no-]append", "Append to output file instead of overwriting") do |append|
-        config.append = append
-      end
+        opts.on("-a", "--[no-]append", "Append to output file instead of overwriting") do |append|
+          config.append = append
+        end
 
-      opts.on("-l", "--[no-]log_file [FILE]", "Log file") do |file|
-        config.log_file = file
-      end
+        opts.on("-l", "--[no-]log_file [FILE]", "Log file") do |file|
+          config.log_file = file
+        end
 
-      opts.on("--md", "--[no-]markdown", "Format with Markdown") do |md|
-        config.markdown = md
-      end
+        opts.on("--md", "--[no-]markdown", "Format with Markdown") do |md|
+          config.markdown = md
+        end
 
-      opts.on("-n", "--next PROMPT_ID", "Next prompt to process") do |next_prompt|
-        config.next = next_prompt
-      end
+        opts.on("-n", "--next PROMPT_ID", "Next prompt to process") do |next_prompt|
+          config.next = next_prompt
+        end
 
-      opts.on("--pipeline PROMPTS", "Pipeline of prompts to process") do |pipeline|
-        config.pipeline = pipeline.split(',')
-      end
+        opts.on("--pipeline PROMPTS", "Pipeline of prompts to process") do |pipeline|
+          config.pipeline = pipeline.split(',')
+        end
 
-      opts.on("-f", "--fuzzy", "Use fuzzy matching for prompt search") do
-        unless system("which fzf > /dev/null 2>&1")
-          STDERR.puts "Error: 'fzf' is not installed. Please install 'fzf' to use the --fuzzy option."
-          exit 1
-        end
-        config.fuzzy = true
-      end
+        opts.on("-f", "--fuzzy", "Use fuzzy matching for prompt search") do
+          unless system("which fzf > /dev/null 2>&1")
+            STDERR.puts "Error: 'fzf' is not installed. Please install 'fzf' to use the --fuzzy option."
+            exit 1
+          end
+          config.fuzzy = true
+        end
 
-      opts.on("-d", "--debug", "Enable debug output") do
-        config.debug = $DEBUG_ME = true
-      end
+        opts.on("-d", "--debug", "Enable debug output") do
+          config.debug = $DEBUG_ME = true
+        end
 
-      opts.on("--no-debug", "Disable debug output") do
-        config.debug = $DEBUG_ME = false
-      end
+        opts.on("--no-debug", "Disable debug output") do
+          config.debug = $DEBUG_ME = false
+        end
 
-      opts.on("-v", "--verbose", "Be verbose") do
-        config.verbose = true
-      end
+        opts.on("-v", "--verbose", "Be verbose") do
+          config.verbose = true
+        end
 
-      opts.on("--speak", "Simple implementation. Uses the speech model to convert text to audio, then plays the audio. Fun with --chat. Supports configuration of speech model and voice.") do
-        config.speak = true
-      end
+        opts.on("--speak", "Simple implementation. Uses the speech model to convert text to audio, then plays the audio. Fun with --chat. Supports configuration of speech model and voice.") do
+          config.speak = true
+        end
 
-      opts.on("--voice VOICE", "Voice to use for speech") do |voice|
-        config.voice = voice
-      end
+        opts.on("--voice VOICE", "Voice to use for speech") do |voice|
+          config.voice = voice
+        end
 
-      opts.on("--sm", "--speech_model MODEL", "Speech model to use") do |model|
-        config.speech_model = model
-      end
+        opts.on("--sm", "--speech_model MODEL", "Speech model to use") do |model|
+          config.speech_model = model
+        end
 
-      opts.on("--tm", "--transcription_model MODEL", "Transcription model to use") do |model|
-        config.transcription_model = model
-      end
+        opts.on("--tm", "--transcription_model MODEL", "Transcription model to use") do |model|
+          config.transcription_model = model
+        end
 
-      opts.on("--is", "--image_size SIZE", "Image size for image generation") do |size|
-        config.image_size = size
-      end
+        opts.on("--is", "--image_size SIZE", "Image size for image generation") do |size|
+          config.image_size = size
+        end
 
-      opts.on("--iq", "--image_quality QUALITY", "Image quality for image generation") do |quality|
-        config.image_quality = quality
-      end
+        opts.on("--iq", "--image_quality QUALITY", "Image quality for image generation") do |quality|
          config.image_quality = quality
+        end
 
-      opts.on("--style", "--image_style STYLE", "Style for image generation") do |style|
-        config.image_style = style
-      end
+        opts.on("--style", "--image_style STYLE", "Style for image generation") do |style|
+          config.image_style = style
+        end
 
-      opts.on("--system_prompt PROMPT_ID", "System prompt ID to use for chat sessions") do |prompt_id|
-        config.system_prompt = prompt_id
-      end
+        opts.on("--system_prompt PROMPT_ID", "System prompt ID to use for chat sessions") do |prompt_id|
+          config.system_prompt = prompt_id
+        end
 
-      # AI model parameters
-      opts.on("-t", "--temperature TEMP", Float, "Temperature for text generation") do |temp|
-        config.temperature = temp
-      end
+        # AI model parameters
+        opts.on("-t", "--temperature TEMP", Float, "Temperature for text generation") do |temp|
+          config.temperature = temp
+        end
 
-      opts.on("--max_tokens TOKENS", Integer, "Maximum tokens for text generation") do |tokens|
-        config.max_tokens = tokens
-      end
+        opts.on("--max_tokens TOKENS", Integer, "Maximum tokens for text generation") do |tokens|
+          config.max_tokens = tokens
+        end
 
-      opts.on("--top_p VALUE", Float, "Top-p sampling value") do |value|
-        config.top_p = value
-      end
+        opts.on("--top_p VALUE", Float, "Top-p sampling value") do |value|
+          config.top_p = value
+        end
 
-      opts.on("--frequency_penalty VALUE", Float, "Frequency penalty") do |value|
-        config.frequency_penalty = value
-      end
+        opts.on("--frequency_penalty VALUE", Float, "Frequency penalty") do |value|
+          config.frequency_penalty = value
+        end
 
-      opts.on("--presence_penalty VALUE", Float, "Presence penalty") do |value|
-        config.presence_penalty = value
-      end
+        opts.on("--presence_penalty VALUE", Float, "Presence penalty") do |value|
+          config.presence_penalty = value
+        end
 
-      opts.on("--dump FILE", "Dump config to file") do |file|
-        config.dump_file = file
-      end
+        opts.on("--dump FILE", "Dump config to file") do |file|
+          config.dump_file = file
+        end
 
-      opts.on("--completion SHELL", "Show completion script for bash|zsh|fish - default is nil") do |shell|
-        config.completion = shell
-      end
+        opts.on("--completion SHELL", "Show completion script for bash|zsh|fish - default is nil") do |shell|
+          config.completion = shell
+        end
 
-      opts.on("--version", "Show version") do
-        puts AIA::VERSION
-        exit
-      end
+        opts.on("--version", "Show version") do
+          puts AIA::VERSION
+          exit
+        end
 
-      opts.on("-h", "--help", "Prints this help") do
-        puts opts
-        exit
-      end
+        opts.on("-h", "--help", "Prints this help") do
+          puts opts
+          exit
+        end
 
-      opts.on("--rq LIBS", "Ruby libraries to require for Ruby directive") do |libs|
-        config.require_libs = libs.split(',')
-      end
+        opts.on("--rq LIBS", "--require LIBS", "Ruby libraries to require for Ruby directive") do |libs|
+          config.require_libs ||= []
+          config.require_libs += libs.split(',')
+        end
 
-      opts.on("--tools PATH_LIST", "Add a tool(s)") do |a_path_list|
-        config.tool_paths ||= []
+        opts.on("--tools PATH_LIST", "Add a tool(s)") do |a_path_list|
+          config.tool_paths ||= []
 
-        if a_path_list.empty?
-          STDERR.puts "No list of paths for --tools option"
-          exit 1
-        else
-          paths = a_path_list.split(',').map(&:strip).uniq
-        end
+          if a_path_list.empty?
+            STDERR.puts "No list of paths for --tools option"
+            exit 1
+          else
+            paths = a_path_list.split(',').map(&:strip).uniq
+          end
 
-        paths.each do |a_path|
-          if File.exist?(a_path)
-            if File.file?(a_path)
-              if '.rb' == File.extname(a_path)
-                config.tool_paths << a_path
-              else
-                STDERR.puts "file should have *.rb extension: #{a_path}"
-                exit 1
-              end
-            elsif File.directory?(a_path)
-              rb_files = Dir.glob(File.join(a_path, '**', '*.rb'))
-              config.tool_paths += rb_files
-            end
-          else
-            STDERR.puts "file/dir path is not valid: #{a_path}"
-            exit 1
-          end
-        end
+          paths.each do |a_path|
+            if File.exist?(a_path)
+              if File.file?(a_path)
+                if '.rb' == File.extname(a_path)
+                  config.tool_paths << a_path
+                else
+                  STDERR.puts "file should have *.rb extension: #{a_path}"
+                  exit 1
+                end
+              elsif File.directory?(a_path)
+                rb_files = Dir.glob(File.join(a_path, '**', '*.rb'))
+                config.tool_paths += rb_files
+              end
+            else
+              STDERR.puts "file/dir path is not valid: #{a_path}"
+              exit 1
+            end
+          end
 
-        config.tool_paths.uniq!
-      end
+          config.tool_paths.uniq!
+        end
 
-      opts.on("--at", "--allowed_tools TOOLS_LIST", "Allow only these tools to be used") do |tools_list|
-        config.allowed_tools ||= []
-        if tools_list.empty?
-          STDERR.puts "No list of tool names provided for --allowed_tools option"
-          exit 1
-        else
-          config.allowed_tools += tools_list.split(',').map(&:strip)
-          config.allowed_tools.uniq!
-        end
-      end
+        opts.on("--at", "--allowed_tools TOOLS_LIST", "Allow only these tools to be used") do |tools_list|
+          config.allowed_tools ||= []
+          if tools_list.empty?
+            STDERR.puts "No list of tool names provided for --allowed_tools option"
+            exit 1
+          else
+            config.allowed_tools += tools_list.split(',').map(&:strip)
+            config.allowed_tools.uniq!
+          end
+        end
 
-      opts.on("--rt", "--rejected_tools TOOLS_LIST", "Reject these tools") do |tools_list|
-        config.rejected_tools ||= []
-        if tools_list.empty?
-          STDERR.puts "No list of tool names provided for --rejected_tools option"
-          exit 1
-        else
-          config.rejected_tools += tools_list.split(',').map(&:strip)
-          config.rejected_tools.uniq!
-        end
+        opts.on("--rt", "--rejected_tools TOOLS_LIST", "Reject these tools") do |tools_list|
+          config.rejected_tools ||= []
+          if tools_list.empty?
+            STDERR.puts "No list of tool names provided for --rejected_tools option"
+            exit 1
+          else
+            config.rejected_tools += tools_list.split(',').map(&:strip)
+            config.rejected_tools.uniq!
+          end
+        end
       end
+      opt_parser.parse!
+    rescue => e
+      STDERR.puts "ERROR: #{e.message}"
+      STDERR.puts "       use --help for usage report"
+      exit 1
     end
 
     args = ARGV.dup
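One visible effect of wrapping the parser in `begin`/`rescue`: an unknown flag now produces a short error report instead of an unhandled `OptionParser::InvalidOption` backtrace. A sketch, assuming the standard OptionParser message text (the flag is hypothetical):

```bash
aia --no-such-flag
# ERROR: invalid option: --no-such-flag
#        use --help for usage report
```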
@@ -566,6 +600,12 @@ module AIA
       STDERR.puts "WARNING:Config file not found: #{file}"
     end
 
+    if config.last_refresh
+      if config.last_refresh.is_a? String
+        config.last_refresh = Date.strptime(config.last_refresh, '%Y-%m-%d')
+      end
+    end
+
     config
   end
 
@@ -584,10 +624,10 @@ module AIA
   def self.dump_config(config, file)
     # Implementation for config dump
     ext = File.extname(file).downcase
-    config_hash = config.to_h
 
-    # Remove non-serializable objects
-    config_hash.delete_if { |_, v| !v.nil? && !v.is_a?(String) && !v.is_a?(Numeric) && !v.is_a?(TrueClass) && !v.is_a?(FalseClass) && !v.is_a?(Array) && !v.is_a?(Hash) }
+    config.last_refresh = config.last_refresh.to_s if config.last_refresh.is_a? Date
+
+    config_hash = config.to_h
 
     # Remove dump_file key to prevent automatic exit on next load
     config_hash.delete(:dump_file)
@@ -57,7 +57,7 @@ module AIA
         RubyLLM.chat.clear_history
       end
     rescue => e
-      SYSERR.puts "ERROR: context_manager clear_context error #{e.message}"
+      STDERR.puts "ERROR: context_manager clear_context error #{e.message}"
     end
   end
 
@@ -57,7 +57,7 @@ module AIA
      else
        a_string.to_s
      end
-
+
      content.strip.start_with?(PromptManager::Prompt::DIRECTIVE_SIGNAL)
    end
 
@@ -162,6 +162,7 @@ module AIA
      end
      ''
    end
+   alias_method :workflow, :pipeline
 
    desc "Inserts the contents of a file Example: //include path/to/file"
    def include(args, context_manager=nil)
@@ -240,6 +241,10 @@ module AIA
 
    desc "Clears the conversation history (aka context) same as //config clear = true"
    def clear(args, context_manager=nil)
+     # TODO: review the robot's code in the Session class for when the
+     #       //clear directive is used in a follow up prompt. That processing
+     #       should be moved here so that it is also available in batch
+     #       sessions.
      if context_manager.nil?
        return "Error: Context manager not available for //clear directive."
      end
@@ -2,48 +2,135 @@
 
 require 'ruby_llm'
 
+class RubyLLM::Modalities
+  def supports?(query_mode)
+    parts = query_mode
+              .to_s
+              .downcase
+              .split(/2|-to-| to |_to_/)
+              .map(&:strip)
+
+    if 2 == parts.size
+      input.include?(parts[0]) && output.include?(parts[1])
+    elsif 1 == parts.size
+      input.include?(parts[0]) || output.include?(parts[0])
+    else
+      false
+    end
+  end
+end
+
 module AIA
   class RubyLLMAdapter
     attr_reader :tools
 
     def initialize
-      @model = AIA.config.model
-      model_info = extract_model_parts(@model)
+      @provider, @model = extract_model_parts.values
+
+      configure_rubyllm
+      refresh_local_model_registry
+      setup_chat_with_tools
+    end
 
-      # Configure RubyLLM with available API keys
+    def configure_rubyllm
+      # TODO: Add some of these configuration items to AIA.config
       RubyLLM.configure do |config|
-        config.openai_api_key = ENV.fetch('OPENAI_API_KEY', nil)
-        config.anthropic_api_key = ENV.fetch('ANTHROPIC_API_KEY', nil)
-        config.gemini_api_key = ENV.fetch('GEMINI_API_KEY', nil)
-        config.deepseek_api_key = ENV.fetch('DEEPSEEK_API_KEY', nil)
+        config.openai_api_key         = ENV.fetch('OPENAI_API_KEY', nil)
+        config.openai_organization_id = ENV.fetch('OPENAI_ORGANIZATION_ID', nil)
+        config.openai_project_id      = ENV.fetch('OPENAI_PROJECT_ID', nil)
+
+        config.anthropic_api_key      = ENV.fetch('ANTHROPIC_API_KEY', nil)
+        config.gemini_api_key         = ENV.fetch('GEMINI_API_KEY', nil)
+        config.deepseek_api_key       = ENV.fetch('DEEPSEEK_API_KEY', nil)
+        config.openrouter_api_key     = ENV.fetch('OPENROUTER_API_KEY', nil)
+
+        config.bedrock_api_key        = ENV.fetch('BEDROCK_ACCESS_KEY_ID', nil)
+        config.bedrock_secret_key     = ENV.fetch('BEDROCK_SECRET_ACCESS_KEY', nil)
+        config.bedrock_region         = ENV.fetch('BEDROCK_REGION', nil)
+        config.bedrock_session_token  = ENV.fetch('BEDROCK_SESSION_TOKEN', nil)
+
+        config.ollama_api_base        = ENV.fetch('OLLAMA_API_BASE', nil)
+
+        # --- Custom OpenAI Endpoint ---
+        # Use this for Azure OpenAI, proxies, or self-hosted models via OpenAI-compatible APIs.
+        config.openai_api_base = ENV.fetch('OPENAI_API_BASE', nil) # e.g., "https://your-azure.openai.azure.com"
+
+        # --- Default Models ---
+        # Used by RubyLLM.chat, RubyLLM.embed, RubyLLM.paint if no model is specified.
+        # config.default_model           = 'gpt-4.1-nano'           # Default: 'gpt-4.1-nano'
+        # config.default_embedding_model = 'text-embedding-3-small' # Default: 'text-embedding-3-small'
+        # config.default_image_model     = 'dall-e-3'               # Default: 'dall-e-3'
+
+        # --- Connection Settings ---
+        # config.request_timeout           = 120 # Request timeout in seconds (default: 120)
+        # config.max_retries               = 3   # Max retries on transient network errors (default: 3)
+        # config.retry_interval            = 0.1 # Initial delay in seconds (default: 0.1)
+        # config.retry_backoff_factor      = 2   # Multiplier for subsequent retries (default: 2)
+        # config.retry_interval_randomness = 0.5 # Jitter factor (default: 0.5)
+
+        # --- Logging Settings ---
+        # config.log_file = '/logs/ruby_llm.log'
+        config.log_level = :fatal # debug level can also be set to debug by setting RUBYLLM_DEBUG envar to true
+      end
+    end
 
-        # Bedrock configuration
-        config.bedrock_api_key = ENV.fetch('AWS_ACCESS_KEY_ID', nil)
-        config.bedrock_secret_key = ENV.fetch('AWS_SECRET_ACCESS_KEY', nil)
-        config.bedrock_region = ENV.fetch('AWS_REGION', nil)
-        config.bedrock_session_token = ENV.fetch('AWS_SESSION_TOKEN', nil)
+    def refresh_local_model_registry
+      if AIA.config.refresh.nil? ||
+         Integer(AIA.config.refresh).zero? ||
+         Date.today > (AIA.config.last_refresh + Integer(AIA.config.refresh))
+        RubyLLM.models.refresh!
+        AIA.config.last_refresh = Date.today
+        if AIA.config.config_file
+          AIA::Config.dump_config(AIA.config, AIA.config.config_file)
+        end
       end
+    end
 
-      @chat = RubyLLM.chat(model: model_info[:model])
+    def setup_chat_with_tools
+      begin
+        @chat = RubyLLM.chat(model: @model)
+      rescue => e
+        STDERR.puts "ERROR: #{e.message}"
+        exit 1
+      end
+
+      if !AIA.config.tool_paths.empty? && !@chat.model.supports?(:function_calling)
+        STDERR.puts "ERROR: The model #{@model} does not support tools"
+        exit 1
+      end
 
       @tools = ObjectSpace.each_object(Class).select do |klass|
         klass < RubyLLM::Tool
       end
 
-      @chat.with_tools(*tools) unless tools.empty?
+      unless tools.empty?
+        @chat.with_tools(*tools)
+        AIA.config.tools = tools.map(&:name).join(', ')
+      end
     end
 
+    # TODO: Need to rethink this dispatcher pattern w/r/t RubyLLM's capabilities
+    #       This code was originally designed for AiClient
+    #
     def chat(prompt)
-      if @model.downcase.include?('dall-e') || @model.downcase.include?('image-generation')
-        text_to_image(prompt)
-      elsif @model.downcase.include?('vision') || @model.downcase.include?('image')
+      modes = @chat.model.modalities
+
+      # TODO: Need to consider how to handle multi-mode models
+      if modes.supports? :text_to_text
+        text_to_text(prompt)
+
+      elsif modes.supports? :image_to_text
         image_to_text(prompt)
-      elsif @model.downcase.include?('tts') || @model.downcase.include?('speech')
+      elsif modes.supports? :text_to_image
+        text_to_image(prompt)
+
+      elsif modes.supports? :text_to_audio
         text_to_audio(prompt)
-      elsif @model.downcase.include?('whisper') || @model.downcase.include?('transcription')
+      elsif modes.supports? :audio_to_text
         audio_to_text(prompt)
+
       else
-        text_to_text(prompt)
+        # TODO: what else can be done?
       end
     end
 
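To see what the monkey-patched `supports?` is doing: the regex splits a query such as `:text_to_image` into an input/output pair, then checks each side against the model's declared modalities. A quick sketch (the modality arrays are illustrative):

```ruby
"text_to_image".split(/2|-to-| to |_to_/)  #=> ["text", "image"]
"text2text".split(/2|-to-| to |_to_/)      #=> ["text", "text"]

# For a model with input = ["text", "image"] and output = ["text"]:
#   supports?(:image_to_text) -> input has "image" AND output has "text" -> true
#   supports?(:text_to_audio) -> output lacks "audio"                    -> false
#   supports?(:image)         -> one part: input OR output has "image"   -> true
```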
@@ -67,14 +154,6 @@ module AIA
       end
     end
 
-    def method_missing(method, *args, &block)
-      if @chat.respond_to?(method)
-        @chat.public_send(method, *args, &block)
-      else
-        super
-      end
-    end
-
     # Clear the chat context/history
     # Needed for the //clear directive
     def clear_context
@@ -88,12 +167,18 @@ module AIA
 
       # Option 2: Force RubyLLM to create a new chat instance at the global level
       # This ensures any shared state is reset
-      model_info = extract_model_parts(@model)
+      @provider, @model = extract_model_parts.values
       RubyLLM.instance_variable_set(:@chat, nil) if RubyLLM.instance_variable_defined?(:@chat)
 
       # Option 3: Create a completely fresh chat instance for this adapter
       @chat = nil # First nil it to help garbage collection
-      @chat = RubyLLM.chat(model: model_info[:model])
+
+      begin
+        @chat = RubyLLM.chat(model: @model)
+      rescue => e
+        STDERR.puts "ERROR: #{e.message}"
+        exit 1
+      end
 
       # Option 4: Call official clear_history method if it exists
       if @chat.respond_to?(:clear_history)
@@ -114,22 +199,33 @@ module AIA
       end
     end
 
+    def method_missing(method, *args, &block)
+      if @chat.respond_to?(method)
+        @chat.public_send(method, *args, &block)
+      else
+        super
+      end
+    end
+
     def respond_to_missing?(method, include_private = false)
       @chat.respond_to?(method) || super
     end
 
     private
 
-    def extract_model_parts(model_string)
-      parts = model_string.split('/')
+    def extract_model_parts
+      parts = AIA.config.model.split('/')
       parts.map!(&:strip)
 
-      if parts.length > 1
-        provider = parts[0]
-        model = parts[1]
+      if 2 == parts.length
+        provider = parts[0]
+        model    = parts[1]
+      elsif 1 == parts.length
+        provider = nil # RubyLLM will figure it out from the model name
+        model    = parts[0]
       else
-        provider = nil # RubyLLM will figure it out from the model name
-        model = parts[0]
+        STDERR.puts "ERROR: malformed model name: #{AIA.config.model}"
+        exit 1
       end
 
       { provider: provider, model: model }
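The reworked `extract_model_parts` reads `AIA.config.model` and accepts an optional `provider/` prefix; a sketch of the accepted shapes (model names illustrative):

```ruby
# AIA.config.model = 'openai/gpt-4o-mini'
#   #=> { provider: 'openai', model: 'gpt-4o-mini' }
# AIA.config.model = 'gpt-4o-mini'
#   #=> { provider: nil, model: 'gpt-4o-mini' }  # RubyLLM infers the provider
# AIA.config.model = 'a/b/c'
#   prints "ERROR: malformed model name: a/b/c" and exits 1
```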
@@ -149,7 +245,7 @@ module AIA
 
     def text_to_text(prompt)
       text_prompt = extract_text_prompt(prompt)
-      @chat.ask(text_prompt)
+      @chat.ask(text_prompt).content
     end
 
     def text_to_image(prompt)
@@ -158,9 +254,9 @@ module AIA
 
       begin
         RubyLLM.paint(text_prompt, output_path: output_file,
-          size: AIA.config.image_size,
-          quality: AIA.config.image_quality,
-          style: AIA.config.image_style)
+                      size: AIA.config.image_size,
+                      quality: AIA.config.image_quality,
+                      style: AIA.config.image_style)
         "Image generated and saved to: #{output_file}"
       rescue => e
         "Error generating image: #{e.message}"
@@ -168,12 +264,12 @@ module AIA
     end
 
     def image_to_text(prompt)
-      image_path = extract_image_path(prompt)
+      image_path  = extract_image_path(prompt)
       text_prompt = extract_text_prompt(prompt)
 
       if image_path && File.exist?(image_path)
         begin
-          @chat.ask(text_prompt, with: { image: image_path })
+          @chat.ask(text_prompt, with: { image: image_path }).content
         rescue => e
           "Error analyzing image: #{e.message}"
         end
@@ -197,11 +293,16 @@ module AIA
       end
     end
 
+    # TODO: what if it's a multi-mode model and a text prompt is provided with
+    #       the audio file?
     def audio_to_text(prompt)
+      text = extract_text_prompt(prompt)
+      text = 'Transcribe this audio' if text.nil? || text.empty?
+
       if prompt.is_a?(String) && File.exist?(prompt) &&
          prompt.downcase.end_with?('.mp3', '.wav', '.m4a', '.flac')
         begin
-          @chat.ask("Transcribe this audio", with: { audio: prompt })
+          @chat.ask(text, with: { audio: prompt }).content
         rescue => e
           "Error transcribing audio: #{e.message}"
         end
data/lib/aia/session.rb CHANGED
@@ -273,7 +273,7 @@ module AIA
     begin
       AIA.config.client = AIA::RubyLLMAdapter.new
     rescue => e
-      SYSERR.puts "Error reinitializing client: #{e.message}"
+      STDERR.puts "Error reinitializing client: #{e.message}"
     end
 
     @ui_presenter.display_info("Chat context cleared.")
data/lib/aia/utility.rb CHANGED
@@ -1,23 +1,33 @@
 # lib/aia/utility.rb
 
+require 'word_wrapper' # Pure ruby word wrapping
+
 module AIA
   class Utility
     class << self
       # Displays the AIA robot ASCII art
+      # Yes, it's slightly frivolous, but it does contain some
+      # useful configuration information.
       def robot
+        indent = 18
+        spaces = " "*indent
+        width  = TTY::Screen.width - indent - 2
+
         puts <<-ROBOT
 
       ,      ,
-     (\\____/)  AI Assistant
+     (\\____/)  AI Assistant (v#{AIA::VERSION}) is Online
      (_oo_)   #{AIA.config.model}
-       (O)     is Online
-     __||__    \\) using #{AIA.config.adapter}
-  [/______\\]  /
- / \\__AI__/ \\/
+       (O)     using #{AIA.config.adapter} (v#{RubyLLM::VERSION})
+     __||__    \\) model db was last refreshed on
+  [/______\\]  /    #{AIA.config.last_refresh}
+ / \\__AI__/ \\/     #{AIA.config.tool_paths.empty? ? 'I forgot my toolbox' : 'I brought some tools'}
 /    /__\\
-(\\   /____\\
-
+(\\   /____\\    #{AIA.config.tool_paths.empty? ? '' : 'My Toolbox contains:'}
         ROBOT
+        if AIA.config.tools
+          puts WordWrapper::MinimumRaggedness.new(width, AIA.config.tools).wrap.split("\n").map{|s| spaces+s+"\n"}.join
+        end
      end
    end
  end
data/lib/aia.rb CHANGED
@@ -5,7 +5,6 @@
 # provides an interface for interacting with AI models and managing prompts.
 
 require 'ruby_llm'
-
 require 'prompt_manager'
 
 require 'debug_me'
@@ -80,12 +79,21 @@ module AIA
     prompt_handler = PromptHandler.new
 
     # Initialize the appropriate client adapter based on configuration
-    @config.client = if @config.adapter == 'ruby_llm'
+    @config.client = if 'ruby_llm' == @config.adapter
                        RubyLLMAdapter.new
                      else
-                       AIClientAdapter.new
+                       # TODO: ?? some other LLM API wrapper
+                       STDERR.puts "ERROR: There is no adapter for #{@config.adapter}"
+                       exit 1
                      end
 
+    # There are two kinds of sessions: batch and chat.
+    # A chat session is started when the --chat CLI option is used,
+    # BUT it's also possible to start a chat session with an initial prompt AND
+    # within that initial prompt there can be a workflow (aka pipeline)
+    # defined. If that is the case, then the chat session will not start
+    # until the initial prompt has completed its workflow.
+
     session = Session.new(prompt_handler)
 
     session.start
metadata CHANGED
@@ -1,7 +1,7 @@
 --- !ruby/object:Gem::Specification
 name: aia
 version: !ruby/object:Gem::Version
-  version: 0.9.2
+  version: 0.9.3rc1
 platform: ruby
 authors:
 - Dewayne VanHoozer
@@ -38,49 +38,49 @@ dependencies:
     - !ruby/object:Gem::Version
       version: '0'
 - !ruby/object:Gem::Dependency
-  name: os
+  name: prompt_manager
   requirement: !ruby/object:Gem::Requirement
     requirements:
    - - ">="
      - !ruby/object:Gem::Version
-        version: '0'
+        version: 0.5.4
   type: :runtime
   prerelease: false
   version_requirements: !ruby/object:Gem::Requirement
     requirements:
    - - ">="
      - !ruby/object:Gem::Version
-        version: '0'
+        version: 0.5.4
 - !ruby/object:Gem::Dependency
-  name: prompt_manager
+  name: ruby_llm
   requirement: !ruby/object:Gem::Requirement
     requirements:
    - - ">="
      - !ruby/object:Gem::Version
-        version: 0.5.4
+        version: 1.3.0rc1
   type: :runtime
   prerelease: false
   version_requirements: !ruby/object:Gem::Requirement
     requirements:
    - - ">="
      - !ruby/object:Gem::Version
-        version: 0.5.4
+        version: 1.3.0rc1
 - !ruby/object:Gem::Dependency
-  name: ruby_llm
+  name: reline
   requirement: !ruby/object:Gem::Requirement
     requirements:
    - - ">="
      - !ruby/object:Gem::Version
-        version: 1.2.0
+        version: '0'
   type: :runtime
   prerelease: false
   version_requirements: !ruby/object:Gem::Requirement
     requirements:
    - - ">="
      - !ruby/object:Gem::Version
-        version: 1.2.0
+        version: '0'
 - !ruby/object:Gem::Dependency
-  name: reline
+  name: shellwords
   requirement: !ruby/object:Gem::Requirement
     requirements:
    - - ">="
@@ -94,7 +94,7 @@ dependencies:
      - !ruby/object:Gem::Version
        version: '0'
 - !ruby/object:Gem::Dependency
-  name: shellwords
+  name: toml-rb
   requirement: !ruby/object:Gem::Requirement
     requirements:
    - - ">="
@@ -108,7 +108,7 @@ dependencies:
      - !ruby/object:Gem::Version
        version: '0'
 - !ruby/object:Gem::Dependency
-  name: toml-rb
+  name: tty-screen
   requirement: !ruby/object:Gem::Requirement
     requirements:
    - - ">="
@@ -122,7 +122,7 @@ dependencies:
      - !ruby/object:Gem::Version
        version: '0'
 - !ruby/object:Gem::Dependency
-  name: tty-screen
+  name: tty-spinner
   requirement: !ruby/object:Gem::Requirement
     requirements:
    - - ">="
@@ -136,7 +136,7 @@ dependencies:
      - !ruby/object:Gem::Version
        version: '0'
 - !ruby/object:Gem::Dependency
-  name: tty-spinner
+  name: versionaire
   requirement: !ruby/object:Gem::Requirement
     requirements:
    - - ">="
@@ -150,7 +150,7 @@ dependencies:
      - !ruby/object:Gem::Version
        version: '0'
 - !ruby/object:Gem::Dependency
-  name: versionaire
+  name: word_wrapper
   requirement: !ruby/object:Gem::Requirement
     requirements:
    - - ">="