aia 0.8.5 → 0.8.6

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
-   metadata.gz: 307fe4fba0b6c2134cba097ffd9b01750eea1f3e06d6c6de5131099435caaa16
-   data.tar.gz: ac46548236f1c9162d62af9a999776d9a01ffa5cc8ff283ed9bfd774414ddc85
+   metadata.gz: 7ce34c18195898a722843fc5936628eebb00e16517618ffc2d2735e291caa528
+   data.tar.gz: bdf4a3fcf9d5da9546caf69f4c5b9c969e4453d53755fc0ff9e70bce273c6998
  SHA512:
-   metadata.gz: '09ec4085dabf70365ccc74a95126891a6075eefc57ad1a652432278b38355f55d2c503ebc6e10b52d9f77e3d7788fef0a9b1d9d6e6aa3b7d72b2cda82d337c08'
-   data.tar.gz: 84eed4f09f8d9d506887ea91cd9b810cc47e77a724c3031e43b464779e5cfcaf5f0a92cc30304bf0d6ffea0eb31e35a23dc2baabff1671cff1bca8852b7e7801
+   metadata.gz: c72b22c0321c1b54b103a74acc7b79a3e4650f8588ce0017294966fe2a895cc0942a7a44471603a3517cb78bb9234a619174a91cde30c32f4eb9c1d69ff7f200
+   data.tar.gz: 49d2253408f2e5f846aa15dc83b6165fb2f6d4fad28d029aef4536ae699e267cfb476cce375576465a5bd17a418231f1cd3686bcde00c4f206735f838f78e798
data/.version CHANGED
@@ -1 +1 @@
- 0.8.5
+ 0.8.6
data/CHANGELOG.md CHANGED
@@ -1,8 +1,12 @@
  # Changelog
  ## [Unreleased]
-
  ## Released
- ### [0.8.5] 2025-04-49
+
+ ### [0.8.6] 2025-04-23
+ - Added a client adapter for the ruby_llm gem
+ - Added the adapter config item and the --adapter option to select at runtime which client to use: ai_client or ruby_llm
+
+ ### [0.8.5] 2025-04-19
  - documentation updates
  - integrated the https://pure.md web service for inserting web pages into the context window
  - //include http?://example.com/stuff
data/README.md CHANGED
@@ -1,4 +1,4 @@
- 3# AI Assistant (AIA)
+ # AI Assistant (AIA)

  **The prompt is the code!**

@@ -17,22 +17,19 @@
  AIA leverages the [prompt_manager gem](https://github.com/madbomber/prompt_manager) to manage prompts. It utilizes the [CLI tool fzf](https://github.com/junegunn/fzf) for prompt selection.

  **Most Recent Change**: Refer to the [Changelog](CHANGELOG.md)
+ - Added support for the `ruby_llm` gem as an alternative to `ai_client`
  - //include directive now supports web URLs
  - //webpage insert web URL content as markdown into context

  **Wiki**: [Checkout the AIA Wiki](https://github.com/MadBomber/aia/wiki)

  **Notable Recent Changes:**
- - **Directive Processing in Chat and Prompts:** You can now use directives in chat sessions and prompt files. Use the directive **//help** to get a list of available directives.
+ - **RubyLLM Integration:** AIA now supports the RubyLLM gem as an alternative to ai_client. Use `--adapter ruby_llm` to switch. Why am I replacing my own gem ai_client with the ruby_llm gem? Because it's better: newer, more elegant, and it will easily support some of the new features I have planned for AIA. It's not fully integrated yet, but it's close enough to work for text-to-text generation. Other modes will be added in the future.

  <!-- Tocer[start]: Auto-generated, don't remove. -->

  ## Table of Contents

- - [Installation](#installation)
- - [What is a Prompt ID?](#what-is-a-prompt-id)
- - [Embedded Parameters as Placeholders](#embedded-parameters-as-placeholders)
- - [Usage](#usage)
  - [Configuration Options](#configuration-options)
  - [Configuration Flexibility](#configuration-flexibility)
  - [Expandable Configuration](#expandable-configuration)
@@ -66,132 +63,21 @@ AIA leverages the [prompt_manager gem](https://github.com/madbomber/prompt_manag
  - [My Most Powerful Prompt](#my-most-powerful-prompt)
  - [My Configuration](#my-configuration)
  - [Executable Prompts](#executable-prompts)
+ - [Usage](#usage)
  - [Development](#development)
  - [Contributing](#contributing)
- - [History of Development](#history-of-development)
  - [Roadmap](#roadmap)
  - [License](#license)

  <!-- Tocer[finish]: Auto-generated, don't remove. -->

-
- ## Installation
-
- Install the gem by executing:
-
- gem install aia
-
- Install the command-line utilities by executing:
-
- brew install fzf
-
- You will also need to establish a directory in your filesystem where your prompt text files, last used parameters and usage log files are kept.
-
- Setup a system environment variable (envar) named "AIA_PROMPTS_DIR" that points to your prompts directory. The default is in your HOME directory named ".prompts". The envar "AIA_ROLES_PREFIX" points to your role prefix where you have prompts that define the different roles you want the LLM to assume when it is doing its work. The default roles prefix is "roles".
-
- You may also want to install the completion script for your shell. To get a copy of the completion script do:
-
- ```bash
- aia --completion bash
- ```
-
- `fish` and `zsh` are also available.
-
- ## What is a Prompt ID?
-
- A prompt ID is the basename of a text file (extension *.txt) located in a prompts directory. The prompts directory is specified by the environment variable "AIA_PROMPTS_DIR". If this variable is not set, the default is in your HOME directory named ".prompts". It can also be set on the command line with the `--prompts-dir` option.
-
- This file contains the context and instructions for the LLM to follow. The prompt ID is what you use as an option on the command line to specify which prompt text file to use. Prompt files can have comments, parameters, directives and ERB blocks along with the instruction text to feed to the LLM. It can also have shell commands and use system environment variables. Consider the following example:
-
- ```plaintext
- #!/usr/bin/env aia run
- # ~/.prompts/example.txt
- # Desc: Be an example prompt with all? the bells and whistles
-
- # Set the configuration for this prompt
-
- //config model = gpt-4
- //config temperature = 0.7
- //config shell = true
- //config erb = true
- //config out_file = path/to/output.md
-
- # Add some file content to the context/instructions
-
- //include path/to/file
- //shell cat path/to/file
- $(cat path/to/file)
-
- # Setup some workflows
-
- //next next_prompt_id
- //pipeline prompt_id_1, prompt_ie_2, prompt_id_3
-
- # Execute some Ruby code
-
- //ruby require 'some_library' # inserts into the context/instructions
- <% some_ruby_things # not inserted into the context %>
- <%= some_other_ruby_things # that are part of the context/instructions %>
-
- Tell me how to do something for a $(uname -s) platform that would rename all
- of the files in the directory $MY_DIRECTORY to have a prefix of for its filename
- that is [PREFIX] and a ${SUFFIX}
-
- <!--
- directives, ERB blocks and other junk can be used
- anywhere in the file mixing dynamic context/instructions with
- the static stuff.
- -->
-
- ```markdown
- # Header 1 -- not a comment
- ## Header 2 -- not a comment
- ### Header 3, etc -- not a comment
-
- ```ruby
- # this is a comment; but it stays in the prompt
- puts "hello world" <!-- this is also a comment; but it gets removed -->
- ```
- Kewl!
- ```
-
- __END__
-
- Everything after the "__END__" line is not part of the context or instructions to
- the LLM.
- ```
-
- Comments in a prompt text file are just there to document the prompt. They are removed before the completed prompt is processed by the LLM. This reduces token counts; but, more importantly it helps you remember why you structured your prompt they way you did - if you remembered to document your prompt.
-
- That is just about everything including the kitchen sink that a pre-compositional parameterized prompt file can have. It can be an executable with a she-bang line and a special system prompt name `run` as shown in the example. It has line comments that use the `#` symbol. It had end of file block comments that appear after the "__END__" line. It has directive command that begin with the double slash `//` - an homage to IBM JCL. It has shell variables in both forms. It has shell commands. It has parameters that default to a regex that uses square brackets and all uppercase characeters to define the parameter name whose value is to be given in a Q&A session before the prompt is sent to the LLM for processing.
-
- AIA has the ability to define a workflow of prompt IDs with either the //next or //pipeline directives.
-
- You could say that instead of the prompt being part of a program, a program can be part of the prompt. **The prompt is the code!**
-
- By using ERB you can make parts of the context/instructions conditional. You can also use ERB to make parts of the context/instructions dynamic for example to pull information from a database or an API.
-
- ## Embedded Parameters as Placeholders
-
- In the example prompt text file above I used the default regex to define parameters as all upper case characters plus space, underscore and the vertical pipe enclosed within square brackets. Since the time that I original starting writing AIA I've seen more developers use double curly braces to define parameters. AIA allows you to specify your own regex as a string. If you want the curly brackets use the `--regex` option on the command line like this:
-
- `--regex '(?-mix:({{[a-zA-Z _|]+}}))'`
-
-
- ## Usage
-
- The usage report is obtained with either `-h` or `--help` options.
-
- ```bash
- aia --help
- ```
-
  ## Configuration Options

  The following table provides a comprehensive list of configuration options, their default values, and the associated environment variables:

  | Option | Default Value | Environment Variable |
  |-------------------------|---------------------------------|---------------------------|
+ | adapter | ai_client | AIA_ADAPTER |
  | out_file | temp.md | AIA_OUT_FILE |
  | log_file | ~/.prompts/_prompts.log | AIA_LOG_FILE |
  | prompts_dir | ~/.prompts | AIA_PROMPTS_DIR |
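
As with the other options in this table, the new `adapter` row can be set either through its environment variable or with the new command-line option. A minimal sketch of both forms (`my_prompt` stands in for any prompt ID):

```bash
# Persistent selection via the environment variable listed above
export AIA_ADAPTER=ruby_llm
aia my_prompt

# One-off selection via the command-line option
aia --adapter ruby_llm my_prompt
```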
@@ -293,7 +179,6 @@ The `--erb` option turns the prompt text file into a fully functioning ERB templ

  Most websites that have information about ERB will give examples of how to use ERB to generate dynamic HTML content for web-based applications. That is a common use case for ERB. AIA on the other hand uses ERB to generate dynamic prompt text for LLM processing.

-
  ## Prompt Directives

  Downstream processing directives were added to the `prompt_manager` gem used by AIA at version 0.4.1. These directives are lines in the prompt text file that begin with "//" having this pattern:
@@ -388,7 +273,6 @@ The `path_to_file` can be either absolute or relative. If it is relative, it is

  The file that is included will have any comments or directives excluded. It is expected that the file will be a text file so that its content can be pre-pended to the existing prompt; however, if the file is a source code file (ex: file.rb) the source code will be included HOWEVER any comment line or line that starts with "//" will be excluded.

-
  #### //ruby

  The `//ruby` directive executes Ruby code. You can use this to perform complex operations or interact with Ruby libraries.
@@ -423,7 +307,6 @@ There are no limitations on what the shell command can be. For example if you w

  Which does basically the same thing as the `//include` directive, except it uses the entire content of the file. For relative file paths the same thing applies. The file's path will be relative to the PWD.

-
  #### //next
  Examples:
  ```bash
@@ -523,6 +406,7 @@ or inside of the `one.txt` prompt file use this directive:

  ### Best Practices ??

+
  Since the response of one prompt is fed into the next prompt within the sequence instead of having all prompts write their response to the same out file, use these directives inside the associated prompt files:

@@ -631,7 +515,7 @@ fzf

  ## Shell Completion

- You can setup a completion function in your shell that will complete on the prompt_id saved in your `prompts_dir` - functions for `bash`, `fish` and `zsh` are available. To get a copy of these functions do this:
+ You can set up a completion function in your shell that will complete on the prompt_id saved in your `prompts_dir` - functions for `bash`, `fish` and `zsh` are available. To get a copy of these functions do:

  ```bash
  aia --completion bash
@@ -735,6 +619,24 @@ Since its output is going to STDOUT you can setup a pipe chain. Using the CLI p

  This executable prompt concept sets up the building blocks of a *nix CLI-based pipeline in the same way that the --pipeline and --next options and directives are used.

+ ## Usage
+
+ The usage report is obtained with either `-h` or `--help` options.
+
+ ```bash
+ aia --help
+ ```
+
+ Key command-line options include:
+
+ - `--adapter ADAPTER`: Choose the LLM interface adapter to use. Valid options are 'ai_client' (default) or 'ruby_llm'. See [RubyLLM Integration Guide](README_RUBY_LLM.md) for details.
+ - `--model MODEL`: Specify which LLM model to use
+ - `--chat`: Start an interactive chat session
+ - `--shell`: Enable shell command integration
+ - `--erb`: Enable ERB processing
+ - `--role ROLE`: Specify a role/system prompt
+ - And many more (use --help to see all options)
+
  ## Development

  **ShellCommandExecutor Refactor:**
@@ -751,15 +653,11 @@ When you find problems with AIA please note them as an issue. This thing was wr

  I'm not happy with the way where some command line options for external command are hard coded. I'm specific talking about the way in which the `rg` and `fzf` tools are used. Their options decide the basic look and feel of the search capability on the command line. Maybe they should be part of the overall configuration so that users can tune their UI to the way they like.

- ## History of Development
-
- I originally wrote a tiny script called `aip.rb` to experiment with parameterized prompts. That was in August of 2023. AIP meant AI Parameterized. Adding an extra P for Prompts just seemed to be a little silly. It lived in my [scripts repo](https://github.com/MadBomber/scripts) for a while. It became useful to me so of course I need to keep enhancing it. I moved it into my [experiments repo](https://github.com/MadBomber/experiments) and began adding features in a haphazard manner. No real plan or architecture. From those experiments I refactored out the [prompt_manager gem](https://github.com/MadBomber/prompt_manager) and the [ai_client gem](https://github.com/MadBomber/ai_client)(https://github.com/MadBomber/ai_client). The name was changed from AIP to AIA and it became a gem.
-
- All of that undirected experimentation without a clear picture of where this thing was going resulted in chaotic code. I would use an Italian food dish to explain the organization but I think chaotic is more descriptive.
-
  ## Roadmap

- - support for using Ruby-based functional callback tools
+ - I'm thinking about removing the --erb and --shell options and just making those two integrations available all the time.
+ - restore the prompt text file search; currently fzf only looks at prompt IDs.
+ - continue integration of the ruby_llm gem
  - support for Model Context Protocol

  ## License
data/lib/aia/config.rb CHANGED
@@ -60,6 +60,7 @@ module AIA
  speech_model: 'tts-1',
  transcription_model: 'whisper-1',
  voice: 'alloy',
+ adapter: 'ai_client', # 'ai_client' or 'ruby_llm'

  # Embedding parameters
  embedding_model: 'text-embedding-ada-002',
@@ -236,6 +237,18 @@ module AIA
  puts "Debug: Setting chat mode to true" if config.debug
  end

+ opts.on("--adapter ADAPTER", "Interface that adapts AIA to the LLM") do |adapter|
+   adapter.downcase!
+   valid_adapters = %w[ai_client ruby_llm]
+   if valid_adapters.include? adapter
+     config.adapter = adapter
+   else
+     STDERR.puts "ERROR: Invalid adapter #{adapter} must be one of these: #{valid_adapters.join(', ')}"
+     exit 1
+   end
+ end
+
+
  opts.on("-m MODEL", "--model MODEL", "Name of the LLM model to use") do |model|
  config.model = model
  end
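
The handler above downcases its argument before validating it against the whitelist, so the option is case-insensitive and bad values fail fast at startup. Expected behavior, sketched from the code in this hunk (`my_prompt` is a placeholder prompt ID):

```bash
aia --adapter RUBY_LLM my_prompt   # downcased and accepted as ruby_llm
aia --adapter foo my_prompt        # prints the ERROR message listing the valid choices, exits 1
```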
data/lib/aia/prompt_handler.rb CHANGED
@@ -48,6 +48,13 @@ module AIA
  erb_flag: AIA.config.erb,
  envar_flag: AIA.config.shell
  )
+
+ # Ensure parameters are extracted even if no history file exists
+ if prompt && prompt.parameters.empty?
+   # Force re-reading of the prompt text to extract parameters
+   # This ensures parameters are found even without a .json file
+   prompt.reload
+ end

  return prompt if prompt
  else
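
For context, the guard added above leans on two `prompt_manager` calls that appear in this diff: `prompt.parameters` and `prompt.reload`. A minimal sketch of the pattern in isolation (the constructor arguments shown here are illustrative, not the gem's documented signature):

```ruby
require 'prompt_manager'

# Hypothetical construction; the real call site passes more arguments.
prompt = PromptManager::Prompt.new(
  id:         'example',
  erb_flag:   true,
  envar_flag: true
)

# Without a saved .json history file the parameter list can come back
# empty, so force a re-read of the prompt text to extract the parameters.
prompt.reload if prompt.parameters.empty?
```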
data/lib/aia/ruby_llm_adapter.rb ADDED
@@ -0,0 +1,179 @@
+ # lib/aia/ruby_llm_adapter.rb
+ #
+
+ require 'ruby_llm'
+
+ module AIA
+   class RubyLLMAdapter
+     def initialize
+       @model = AIA.config.model
+       model_info = extract_model_parts(@model)
+
+       # Configure RubyLLM with available API keys
+       RubyLLM.configure do |config|
+         config.openai_api_key = ENV.fetch('OPENAI_API_KEY', nil)
+         config.anthropic_api_key = ENV.fetch('ANTHROPIC_API_KEY', nil)
+         config.gemini_api_key = ENV.fetch('GEMINI_API_KEY', nil)
+         config.deepseek_api_key = ENV.fetch('DEEPSEEK_API_KEY', nil)
+
+         # Bedrock configuration
+         config.bedrock_api_key = ENV.fetch('AWS_ACCESS_KEY_ID', nil)
+         config.bedrock_secret_key = ENV.fetch('AWS_SECRET_ACCESS_KEY', nil)
+         config.bedrock_region = ENV.fetch('AWS_REGION', nil)
+         config.bedrock_session_token = ENV.fetch('AWS_SESSION_TOKEN', nil)
+       end
+
+       # Initialize chat with the specified model
+       @chat = RubyLLM.chat(model: model_info[:model])
+     end
+
+     def chat(prompt)
+       if @model.downcase.include?('dall-e') || @model.downcase.include?('image-generation')
+         text_to_image(prompt)
+       elsif @model.downcase.include?('vision') || @model.downcase.include?('image')
+         image_to_text(prompt)
+       elsif @model.downcase.include?('tts') || @model.downcase.include?('speech')
+         text_to_audio(prompt)
+       elsif @model.downcase.include?('whisper') || @model.downcase.include?('transcription')
+         audio_to_text(prompt)
+       else
+         text_to_text(prompt)
+       end
+     end
+
+     def transcribe(audio_file)
+       @chat.ask("Transcribe this audio", with: { audio: audio_file })
+     end
+
+     def speak(text)
+       output_file = "#{Time.now.to_i}.mp3"
+
+       # Note: RubyLLM doesn't have a direct text-to-speech feature
+       # This is a placeholder for a custom implementation or external service
+       begin
+         # Try using a TTS API if available
+         # For now, we'll use a mock implementation
+         File.write(output_file, "Mock TTS audio content")
+         system("#{AIA.config.speak_command} #{output_file}") if File.exist?(output_file) && system("which #{AIA.config.speak_command} > /dev/null 2>&1")
+         "Audio generated and saved to: #{output_file}"
+       rescue => e
+         "Error generating audio: #{e.message}"
+       end
+     end
+
+     def method_missing(method, *args, &block)
+       if @chat.respond_to?(method)
+         @chat.public_send(method, *args, &block)
+       else
+         super
+       end
+     end
+
+     def respond_to_missing?(method, include_private = false)
+       @chat.respond_to?(method) || super
+     end
+
+     private
+
+     def extract_model_parts(model_string)
+       parts = model_string.split('/')
+       parts.map!(&:strip)
+
+       if parts.length > 1
+         provider = parts[0]
+         model = parts[1]
+       else
+         provider = nil # RubyLLM will figure it out from the model name
+         model = parts[0]
+       end
+
+       { provider: provider, model: model }
+     end
+
+     def extract_text_prompt(prompt)
+       if prompt.is_a?(String)
+         prompt
+       elsif prompt.is_a?(Hash) && prompt[:text]
+         prompt[:text]
+       elsif prompt.is_a?(Hash) && prompt[:content]
+         prompt[:content]
+       else
+         prompt.to_s
+       end
+     end
+
+     def text_to_text(prompt)
+       text_prompt = extract_text_prompt(prompt)
+       @chat.ask(text_prompt)
+     end
+
+     def text_to_image(prompt)
+       text_prompt = extract_text_prompt(prompt)
+       output_file = "#{Time.now.to_i}.png"
+
+       begin
+         RubyLLM.paint(text_prompt, output_path: output_file,
+                       size: AIA.config.image_size,
+                       quality: AIA.config.image_quality,
+                       style: AIA.config.image_style)
+         "Image generated and saved to: #{output_file}"
+       rescue => e
+         "Error generating image: #{e.message}"
+       end
+     end
+
+     def image_to_text(prompt)
+       image_path = extract_image_path(prompt)
+       text_prompt = extract_text_prompt(prompt)
+
+       if image_path && File.exist?(image_path)
+         begin
+           @chat.ask(text_prompt, with: { image: image_path })
+         rescue => e
+           "Error analyzing image: #{e.message}"
+         end
+       else
+         text_to_text(prompt)
+       end
+     end
+
+     def text_to_audio(prompt)
+       text_prompt = extract_text_prompt(prompt)
+       output_file = "#{Time.now.to_i}.mp3"
+
+       begin
+         # Note: RubyLLM doesn't have a direct TTS feature
+         # This is a placeholder for a custom implementation
+         File.write(output_file, text_prompt)
+         system("#{AIA.config.speak_command} #{output_file}") if File.exist?(output_file) && system("which #{AIA.config.speak_command} > /dev/null 2>&1")
+         "Audio generated and saved to: #{output_file}"
+       rescue => e
+         "Error generating audio: #{e.message}"
+       end
+     end
+
+     def audio_to_text(prompt)
+       if prompt.is_a?(String) && File.exist?(prompt) &&
+          prompt.downcase.end_with?('.mp3', '.wav', '.m4a', '.flac')
+         begin
+           @chat.ask("Transcribe this audio", with: { audio: prompt })
+         rescue => e
+           "Error transcribing audio: #{e.message}"
+         end
+       else
+         # Fall back to regular chat if no valid audio file is found
+         text_to_text(prompt)
+       end
+     end
+
+     def extract_image_path(prompt)
+       if prompt.is_a?(String)
+         prompt.scan(/\b[\w\/\.\-]+\.(jpg|jpeg|png|gif|webp)\b/i).first&.first
+       elsif prompt.is_a?(Hash)
+         prompt[:image] || prompt[:image_path]
+       else
+         nil
+       end
+     end
+   end
+ end
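
The only model-string convention the adapter assumes is an optional `provider/` prefix, handled by the private `extract_model_parts` helper. A standalone restatement of that helper, for illustration only (the model names are examples):

```ruby
# Mirrors the helper above: split "provider/model" strings, leaving the
# provider nil when only a bare model name is given.
def extract_model_parts(model_string)
  parts = model_string.split('/').map(&:strip)
  if parts.length > 1
    { provider: parts[0], model: parts[1] }
  else
    { provider: nil, model: parts[0] } # RubyLLM infers the provider
  end
end

p extract_model_parts('openai/gpt-4o') #=> {:provider=>"openai", :model=>"gpt-4o"}
p extract_model_parts('gpt-4o')        #=> {:provider=>nil, :model=>"gpt-4o"}
```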
data/lib/aia/session.rb CHANGED
@@ -187,7 +187,7 @@ module AIA
  # Check for piped input (STDIN not a TTY and has data)
  if !STDIN.tty?
  # Save the original STDIN
- orig_stdin = STDIN.dup
+ original_stdin = STDIN.dup

  # Read the piped input
  piped_input = STDIN.read.strip
@@ -209,9 +209,12 @@ module AIA

  # Output the response
  @chat_processor.output_response(response)
- @chat_processor.speak(response)
+ @chat_processor.speak(response) if AIA.speak?
  @ui_presenter.display_separator
  end
+
+ # Restore original stdin when done with piped input processing
+ STDIN.reopen(original_stdin)
  end

  loop do
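
The change above completes a standard Ruby idiom: `STDIN.dup` keeps a handle on the stream before the pipe is consumed, and the new `STDIN.reopen(original_stdin)` restores it afterwards. A self-contained sketch of the idiom (the `/dev/tty` reattachment is illustrative and not part of this hunk):

```ruby
unless STDIN.tty?
  original_stdin = STDIN.dup   # keep a reference to the piped stream
  piped_input    = STDIN.read.strip

  STDIN.reopen('/dev/tty')     # illustrative: point STDIN at the terminal
  puts "read #{piped_input.bytesize} bytes from the pipe"
  # ... interactive work happens here ...

  STDIN.reopen(original_stdin) # restore the saved stream when done
end
```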
data/lib/aia/utility.rb CHANGED
@@ -11,7 +11,7 @@ module AIA
  (\\____/) AI Assistant
  (_oo_) #{AIA.config.model}
  (O) is Online
- __||__ \\)
+ __||__ \\) using #{AIA.config.adapter}
  [/______\\] /
  / \\__AI__/ \\/
  / /__\\
data/lib/aia.rb CHANGED
@@ -5,6 +5,7 @@
  # provides an interface for interacting with AI models and managing prompts.

  require 'ai_client'
+ require 'ruby_llm'
  require 'prompt_manager'
  require 'debug_me'
  include DebugMe
@@ -18,6 +19,7 @@ require_relative 'aia/config'
  require_relative 'aia/shell_command_executor'
  require_relative 'aia/prompt_handler'
  require_relative 'aia/ai_client_adapter'
+ require_relative 'aia/ruby_llm_adapter'
  require_relative 'aia/directive_processor'
  require_relative 'aia/history_manager'
  require_relative 'aia/ui_presenter'
@@ -76,7 +78,14 @@ module AIA
  end

  prompt_handler = PromptHandler.new
- @config.client = AIClientAdapter.new
+
+ # Initialize the appropriate client adapter based on configuration
+ @config.client = if @config.adapter == 'ruby_llm'
+                    RubyLLMAdapter.new
+                  else
+                    AIClientAdapter.new
+                  end
+
  session = Session.new(prompt_handler)

  session.start
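
Both adapters are assigned to the same `@config.client` slot, so everything downstream can stay adapter-agnostic as long as the two classes answer the same calls (`chat`, `transcribe`, `speak`, as seen in `RubyLLMAdapter` above; `AIClientAdapter` is assumed to match since it fills the same role). A sketch of that shared contract:

```ruby
client = if AIA.config.adapter == 'ruby_llm'
           AIA::RubyLLMAdapter.new
         else
           AIA::AIClientAdapter.new
         end

# Either adapter answers the same call; session code never needs to
# know which backend sits behind it.
response = client.chat('Explain the adapter pattern in one sentence.')
```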
metadata CHANGED
@@ -1,7 +1,7 @@
  --- !ruby/object:Gem::Specification
  name: aia
  version: !ruby/object:Gem::Version
-   version: 0.8.5
+   version: 0.8.6
  platform: ruby
  authors:
  - Dewayne VanHoozer
@@ -79,6 +79,20 @@ dependencies:
    - - ">="
      - !ruby/object:Gem::Version
        version: 0.5.2
+ - !ruby/object:Gem::Dependency
+   name: ruby_llm
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: 1.2.0
+   type: :runtime
+   prerelease: false
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: 1.2.0
  - !ruby/object:Gem::Dependency
    name: reline
    requirement: !ruby/object:Gem::Requirement
@@ -300,6 +314,7 @@ files:
  - lib/aia/fzf.rb
  - lib/aia/history_manager.rb
  - lib/aia/prompt_handler.rb
+ - lib/aia/ruby_llm_adapter.rb
  - lib/aia/session.rb
  - lib/aia/shell_command_executor.rb
  - lib/aia/ui_presenter.rb
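
For reference, the new runtime dependency recorded in the metadata above corresponds to this Gemfile declaration:

```ruby
gem 'ruby_llm', '>= 1.2.0'
```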