aia 0.8.4 → 0.8.6
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- checksums.yaml +4 -4
- data/.version +1 -1
- data/CHANGELOG.md +10 -1
- data/README.md +69 -175
- data/lib/aia/config.rb +13 -0
- data/lib/aia/directive_processor.rb +29 -0
- data/lib/aia/prompt_handler.rb +7 -0
- data/lib/aia/ruby_llm_adapter.rb +179 -0
- data/lib/aia/session.rb +5 -2
- data/lib/aia/utility.rb +1 -1
- data/lib/aia.rb +10 -1
- metadata +39 -13
checksums.yaml
CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 7ce34c18195898a722843fc5936628eebb00e16517618ffc2d2735e291caa528
+  data.tar.gz: bdf4a3fcf9d5da9546caf69f4c5b9c969e4453d53755fc0ff9e70bce273c6998
 SHA512:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: c72b22c0321c1b54b103a74acc7b79a3e4650f8588ce0017294966fe2a895cc0942a7a44471603a3517cb78bb9234a619174a91cde30c32f4eb9c1d69ff7f200
+  data.tar.gz: 49d2253408f2e5f846aa15dc83b6165fb2f6d4fad28d029aef4536ae699e267cfb476cce375576465a5bd17a418231f1cd3686bcde00c4f206735f838f78e798
data/.version
CHANGED
@@ -1 +1 @@
-0.8.4
+0.8.6
data/CHANGELOG.md
CHANGED
@@ -1,8 +1,17 @@
 # Changelog
 ## [Unreleased]
-
 ## Released
 
+### [0.8.6] 2025-04-23
+- Added a client adapter for the ruby_llm gem
+- Added the `adapter` config item and the `--adapter` option to select at runtime which client to use: ai_client or ruby_llm
+
+### [0.8.5] 2025-04-19
+- documentation updates
+- integrated the https://pure.md web service for inserting web pages into the context window
+- //include http(s)://example.com/stuff
+- //webpage http(s)://example.com/stuff
+
 ### [0.8.2] 2025-04-18
 - fixed problems with pre-loaded context and chat repl
 - piped content into `aia --chat` is now a part of the context/instructions
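To make the 0.8.5 directives concrete: both forms fetch the page through the pure.md service and require a PUREMD_API_KEY environment variable. A minimal prompt-file sketch (the URL and the prompt file name are hypothetical):

```plaintext
# ~/.prompts/summarize_page.txt  (hypothetical prompt file)
//webpage https://example.com/article
Summarize the page above in three bullet points.
```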
data/README.md
CHANGED
@@ -1,4 +1,4 @@
-
+# AI Assistant (AIA)
 
 **The prompt is the code!**
 
@@ -17,185 +17,67 @@
 AIA leverages the [prompt_manager gem](https://github.com/madbomber/prompt_manager) to manage prompts. It utilizes the [CLI tool fzf](https://github.com/junegunn/fzf) for prompt selection.
 
 **Most Recent Change**: Refer to the [Changelog](CHANGELOG.md)
+- Added support for the `ruby_llm` gem as an alternative to `ai_client`
+- //include directive now supports web URLs
+- //webpage inserts web URL content as markdown into the context
 
-**
-- **Directive Processing in Chat and Prompts:** You can now use directives in chat sessions and prompt files with the syntax: `//command args`. Supported directives include:
-  - `shell`/`sh`: Execute shell commands
-  - `ruby`/`rb`: Execute Ruby code
-  - `config`/`cfg`: Display or update configuration
-  - `include`/`inc`: Include file content
-  - `next`: Specify the next prompt ID in a sequence
-  - `pipeline`: Specify a pipeline of prompt IDs to process
-  - `clear`: Clear the context (handy in a chat session)
-  - `help`: Show available directives
+**Wiki**: [Checkout the AIA Wiki](https://github.com/MadBomber/aia/wiki)
 
+**Notable Recent Changes:**
+- **RubyLLM Integration:** AIA now supports the RubyLLM gem as an alternative to ai_client. Use `--adapter ruby_llm` to switch. Why am I replacing my own gem ai_client with the ruby_llm gem? Because it's better, newer, and more elegant, and it will easily support some of the new features I have planned for AIA. It's not fully integrated yet, but it's close enough to work for text-to-text generation. Other modes will be added in the future.
 
 <!-- Tocer[start]: Auto-generated, don't remove. -->
 
 ## Table of Contents
 
-- [
-- [
-- [
-- [
-
-
-- [
-
-
-- [
-- [
-- [
-
-
-
-
-
-
-- [
-
-- [
-
-- [
-
-- [
-
-
-- [
-- [
-- [
-- [
-- [
-- [
-- [
-- [
-- [
-- [
-- [
-- [History of Development](#history-of-development)
-- [Roadmap](#roadmap)
-- [License](#license)
+- [Configuration Options](#configuration-options)
+- [Configuration Flexibility](#configuration-flexibility)
+- [Expandable Configuration](#expandable-configuration)
+- [Shell Integration inside of a Prompt](#shell-integration-inside-of-a-prompt)
+- [Dynamic Shell Commands](#dynamic-shell-commands)
+- [Shell Command Safety](#shell-command-safety)
+- [Chat Session Use](#chat-session-use)
+- [*E*mbedded *R*u*B*y (ERB)](#embedded-ruby-erb)
+- [Prompt Directives](#prompt-directives)
+- [Parameter and Shell Substitution in Directives](#parameter-and-shell-substitution-in-directives)
+- [Directive Syntax](#directive-syntax)
+- [AIA Specific Directive Commands](#aia-specific-directive-commands)
+- [//config](#config)
+- [//include](#include)
+- [//ruby](#ruby)
+- [//shell](#shell)
+- [//next](#next)
+- [//pipeline](#pipeline)
+- [Using Directives in Chat Sessions](#using-directives-in-chat-sessions)
+- [Prompt Sequences](#prompt-sequences)
+- [--next](#--next)
+- [--pipeline](#--pipeline)
+- [Best Practices ??](#best-practices-)
+- [Example pipeline](#example-pipeline)
+- [All About ROLES](#all-about-roles)
+- [The --roles_prefix (AIA_ROLES_PREFIX)](#the---roles_prefix-aia_roles_prefix)
+- [The --role Option](#the---role-option)
+- [Other Ways to Insert Roles into Prompts](#other-ways-to-insert-roles-into-prompts)
+- [External CLI Tools Used](#external-cli-tools-used)
+- [Shell Completion](#shell-completion)
+- [My Most Powerful Prompt](#my-most-powerful-prompt)
+- [My Configuration](#my-configuration)
+- [Executable Prompts](#executable-prompts)
+- [Usage](#usage)
+- [Development](#development)
+- [Contributing](#contributing)
+- [Roadmap](#roadmap)
+- [License](#license)
 
 <!-- Tocer[finish]: Auto-generated, don't remove. -->
 
-
-## Installation
-
-Install the gem by executing:
-
-    gem install aia
-
-Install the command-line utilities by executing:
-
-    brew install fzf
-
-You will also need to establish a directory in your file system where your prompt text files, last used parameters and usage log files are kept.
-
-Setup a system environment variable (envar) named "AIA_PROMPTS_DIR" that points to your prompts directory. The default is in your HOME directory named ".prompts". The envar "AIA_ROLES_PREFIX" points to your role prefix where you have prompts that define the different roles you want the LLM to assume when it is doing its work. The default roles prefix is "roles".
-
-You may also want to install the completion script for your shell. To get a copy of the completion script do:
-
-```bash
-aia --completion bash
-```
-
-`fish` and `zsh` are also available.
-
-## What is a Prompt ID?
-
-A prompt ID is the basename of a text file (extension *.txt) located in a prompts directory. The prompts directory is specified by the environment variable "AIA_PROMPTS_DIR". If this variable is not set, the default is in your HOME directory named ".prompts". It can also be set on the command line with the `--prompts-dir` option.
-
-This file contains the context and instructions for the LLM to follow. The prompt ID is what you use as an option on the command line to specify which prompt text file to use. Prompt files can have comments, parameters, directives and ERB blocks along with the instruction text to feed to the LLM. They can also have shell commands and use system environment variables. Consider the following example:
-
-```plaintext
-#!/usr/bin/env aia run
-# ~/.prompts/example.txt
-# Desc: Be an example prompt with all? the bells and whistles
-
-# Set the configuration for this prompt
-
-//config model = gpt-4
-//config temperature = 0.7
-//config shell = true
-//config erb = true
-//config out_file = path/to/output.md
-
-# Add some file content to the context/instructions
-
-//include path/to/file
-//shell cat path/to/file
-$(cat path/to/file)
-
-# Setup some workflows
-
-//next next_prompt_id
-//pipeline prompt_id_1, prompt_id_2, prompt_id_3
-
-# Execute some Ruby code
-
-//ruby require 'some_library' # inserts into the context/instructions
-<% some_ruby_things # not inserted into the context %>
-<%= some_other_ruby_things # that are part of the context/instructions %>
-
-Tell me how to do something for a $(uname -s) platform that would rename all
-of the files in the directory $MY_DIRECTORY to have a prefix for its filename
-that is [PREFIX] and a ${SUFFIX}
-
-<!--
-  directives, ERB blocks and other junk can be used
-  anywhere in the file mixing dynamic context/instructions with
-  the static stuff.
--->
-
-```markdown
-# Header 1 -- not a comment
-## Header 2 -- not a comment
-### Header 3, etc -- not a comment
-
-```ruby
-# this is a comment; but it stays in the prompt
-puts "hello world" <!-- this is also a comment; but it gets removed -->
-```
-Kewl!
-```
-
-__END__
-
-Everything after the "__END__" line is not part of the context or instructions to
-the LLM.
-```
-
-Comments in a prompt text file are just there to document the prompt. They are removed before the completed prompt is processed by the LLM. This reduces token counts; but, more importantly, it helps you remember why you structured your prompt the way you did - if you remembered to document your prompt.
-
-That is just about everything, including the kitchen sink, that a pre-compositional parameterized prompt file can have. It can be an executable with a she-bang line and a special system prompt named `run` as shown in the example. It has line comments that use the `#` symbol. It has end-of-file block comments that appear after the "__END__" line. It has directive commands that begin with the double slash `//` - an homage to IBM JCL. It has shell variables in both forms. It has shell commands. It has parameters that default to a regex that uses square brackets and all uppercase characters to define the parameter name whose value is to be given in a Q&A session before the prompt is sent to the LLM for processing.
-
-AIA has the ability to define a workflow of prompt IDs with either the //next or //pipeline directives.
-
-You could say that instead of the prompt being part of a program, a program can be part of the prompt. **The prompt is the code!**
-
-By using ERB you can make parts of the context/instructions conditional. You can also use ERB to make parts of the context/instructions dynamic, for example to pull information from a database or an API.
-
-## Embedded Parameters as Placeholders
-
-In the example prompt text file above I used the default regex to define parameters as all upper case characters plus space, underscore and the vertical pipe enclosed within square brackets. Since the time that I originally started writing AIA I've seen more developers use double curly braces to define parameters. AIA allows you to specify your own regex as a string. If you want the curly brackets, use the `--regex` option on the command line like this:
-
-`--regex '(?-mix:({{[a-zA-Z _|]+}}))'`
-
-
-## Usage
-
-The usage report is obtained with either `-h` or `--help` options.
-
-```bash
-aia --help
-```
-
 ## Configuration Options
 
 The following table provides a comprehensive list of configuration options, their default values, and the associated environment variables:
 
 | Option                   | Default Value                   | Environment Variable       |
 |--------------------------|---------------------------------|----------------------------|
+| adapter                  | ai_client                       | AIA_ADAPTER                |
 | out_file                 | temp.md                         | AIA_OUT_FILE               |
 | log_file                 | ~/.prompts/_prompts.log         | AIA_LOG_FILE               |
 | prompts_dir              | ~/.prompts                      | AIA_PROMPTS_DIR            |
@@ -297,7 +179,6 @@ The `--erb` option turns the prompt text file into a fully functioning ERB templ
 
 Most websites that have information about ERB will give examples of how to use ERB to generate dynamic HTML content for web-based applications. That is a common use case for ERB. AIA on the other hand uses ERB to generate dynamic prompt text for LLM processing.
 
-
 ## Prompt Directives
 
 Downstream processing directives were added to the `prompt_manager` gem used by AIA at version 0.4.1. These directives are lines in the prompt text file that begin with "//" having this pattern:
@@ -392,7 +273,6 @@ The `path_to_file` can be either absolute or relative. If it is relative, it is
 
 The file that is included will have any comments or directives excluded. It is expected that the file will be a text file so that its content can be pre-pended to the existing prompt; however, if the file is a source code file (ex: file.rb) the source code will be included, HOWEVER any comment line or line that starts with "//" will be excluded.
 
-
 #### //ruby
 
 The `//ruby` directive executes Ruby code. You can use this to perform complex operations or interact with Ruby libraries.
@@ -427,7 +307,6 @@ There are no limitations on what the shell command can be. For example if you w
 
 Which does basically the same thing as the `//include` directive, except it uses the entire content of the file. For relative file paths the same thing applies. The file's path will be relative to the PWD.
 
-
 #### //next
 Examples:
 ```bash
@@ -527,6 +406,7 @@ or inside of the `one.txt` prompt file use this directive:
 
 ### Best Practices ??
 
+
 Since the response of one prompt is fed into the next prompt within the sequence instead of having all prompts write their response to the same out file, use these directives inside the associated prompt files:
 
 
@@ -635,7 +515,7 @@ fzf
 
 ## Shell Completion
 
-You can setup a completion function in your shell that will complete on the prompt_id saved in your `prompts_dir` - functions for `bash`, `fish` and `zsh` are available. To get a copy of these functions do
+You can setup a completion function in your shell that will complete on the prompt_id saved in your `prompts_dir` - functions for `bash`, `fish` and `zsh` are available. To get a copy of these functions do:
 
 ```bash
 aia --completion bash
@@ -739,6 +619,24 @@ Since its output is going to STDOUT you can setup a pipe chain. Using the CLI p
 
 This executable prompt concept sets up the building blocks of a *nix CLI-based pipeline in the same way that the --pipeline and --next options and directives are used.
 
+## Usage
+
+The usage report is obtained with either `-h` or `--help` options.
+
+```bash
+aia --help
+```
+
+Key command-line options include:
+
+- `--adapter ADAPTER`: Choose the LLM interface adapter to use. Valid options are 'ai_client' (default) or 'ruby_llm'. See [RubyLLM Integration Guide](README_RUBY_LLM.md) for details.
+- `--model MODEL`: Specify which LLM model to use
+- `--chat`: Start an interactive chat session
+- `--shell`: Enable shell command integration
+- `--erb`: Enable ERB processing
+- `--role ROLE`: Specify a role/system prompt
+- And many more (use --help to see all options)
+
 ## Development
 
 **ShellCommandExecutor Refactor:**
@@ -755,15 +653,11 @@ When you find problems with AIA please note them as an issue. This thing was wr
 
 I'm not happy with the way that some command line options for external commands are hard coded. I'm specifically talking about the way in which the `rg` and `fzf` tools are used. Their options decide the basic look and feel of the search capability on the command line. Maybe they should be part of the overall configuration so that users can tune their UI to the way they like.
 
-## History of Development
-
-I originally wrote a tiny script called `aip.rb` to experiment with parameterized prompts. That was in August of 2023. AIP meant AI Parameterized. Adding an extra P for Prompts just seemed to be a little silly. It lived in my [scripts repo](https://github.com/MadBomber/scripts) for a while. It became useful to me so of course I needed to keep enhancing it. I moved it into my [experiments repo](https://github.com/MadBomber/experiments) and began adding features in a haphazard manner. No real plan or architecture. From those experiments I refactored out the [prompt_manager gem](https://github.com/MadBomber/prompt_manager) and the [ai_client gem](https://github.com/MadBomber/ai_client). The name was changed from AIP to AIA and it became a gem.
-
-All of that undirected experimentation without a clear picture of where this thing was going resulted in chaotic code. I would use an Italian food dish to explain the organization but I think chaotic is more descriptive.
-
 ## Roadmap
 
--
+- I'm thinking about removing the --erb and --shell options and just making those two integrations available all the time.
+- restore the prompt text file search; currently fzf only looks at prompt IDs.
+- continue integration of the ruby_llm gem
 - support for Model Context Protocol
 
 ## License
data/lib/aia/config.rb
CHANGED
@@ -60,6 +60,7 @@ module AIA
       speech_model: 'tts-1',
       transcription_model: 'whisper-1',
       voice: 'alloy',
+      adapter: 'ai_client', # 'ai_client' or 'ruby_llm'
 
       # Embedding parameters
       embedding_model: 'text-embedding-ada-002',
@@ -236,6 +237,18 @@ module AIA
         puts "Debug: Setting chat mode to true" if config.debug
       end
 
+      opts.on("--adapter ADAPTER", "Interface that adapts AIA to the LLM") do |adapter|
+        adapter.downcase!
+        valid_adapters = %w[ai_client ruby_llm]
+        if valid_adapters.include?(adapter)
+          config.adapter = adapter
+        else
+          STDERR.puts "ERROR: Invalid adapter #{adapter}; must be one of: #{valid_adapters.join(', ')}"
+          exit 1
+        end
+      end
+
+
       opts.on("-m MODEL", "--model MODEL", "Name of the LLM model to use") do |model|
         config.model = model
       end
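As a quick orientation, the same validation can be exercised outside the option parser. This is a minimal sketch assuming the AIA_ADAPTER environment variable (from the configuration table) as the non-CLI equivalent; the exact precedence between the envar and the flag is an assumption here:

```ruby
# Hypothetical standalone mirror of the --adapter validation.
adapter        = (ENV['AIA_ADAPTER'] || 'ai_client').downcase
valid_adapters = %w[ai_client ruby_llm]

unless valid_adapters.include?(adapter)
  abort "ERROR: Invalid adapter #{adapter}; must be one of: #{valid_adapters.join(', ')}"
end

puts "Using the #{adapter} adapter"
```

On the command line the equivalent choice is `aia --adapter ruby_llm my_prompt_id` (the prompt ID is a placeholder).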
data/lib/aia/directive_processor.rb
CHANGED
@@ -1,7 +1,10 @@
 # lib/aia/directive_processor.rb
 
+require 'faraday'
+
 module AIA
   class DirectiveProcessor
+    PUREMD_API_KEY = ENV.fetch('PUREMD_API_KEY', nil)
     EXCLUDED_METHODS = %w[ run initialize private? ]
     @descriptions = {}
     @aliases = {}
@@ -100,10 +103,32 @@ module AIA
       !respond_to?(method_name) && respond_to?(method_name, true)
     end
 
+
     ################
     ## Directives ##
     ################
 
+    desc "webpage inserted as markdown to context using pure.md"
+    def webpage(args, context_manager=nil)
+      if PUREMD_API_KEY.nil?
+        "ERROR: PUREMD_API_KEY is required in order to include a webpage"
+      else
+        url = `echo #{args.shift}`.strip
+        puremd_url = "https://pure.md/#{url}"
+
+        response = Faraday.get(puremd_url) do |req|
+          req.headers['x-puremd-api-token'] = PUREMD_API_KEY
+        end
+
+        if 200 == response.status
+          response.body
+        else
+          "Error: status was #{response.status}\n#{ap response}"
+        end
+      end
+    end
+
     desc "Specify the next prompt ID to process after this one"
     def next(args = [])
       if args.empty?
@@ -129,6 +154,10 @@ module AIA
       # echo takes care of envars and tilde expansion
       file_path = `echo #{args.shift}`.strip
 
+      if file_path.start_with?(%r{https?://})
+        return webpage([file_path])
+      end
+
       if @included_files.include?(file_path)
         ""
       else
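For reference, the pure.md call that //webpage performs can be reproduced standalone. This sketch mirrors the code above, with a hypothetical URL and PUREMD_API_KEY assumed to be set in the environment:

```ruby
require 'faraday'

url      = 'https://example.com/article' # hypothetical page to import
response = Faraday.get("https://pure.md/#{url}") do |req|
  req.headers['x-puremd-api-token'] = ENV.fetch('PUREMD_API_KEY')
end

# pure.md returns the page converted to markdown in the response body
puts response.status == 200 ? response.body : "Error: status was #{response.status}"
```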
data/lib/aia/prompt_handler.rb
CHANGED
@@ -48,6 +48,13 @@ module AIA
         erb_flag: AIA.config.erb,
         envar_flag: AIA.config.shell
       )
+
+      # Ensure parameters are extracted even if no history file exists
+      if prompt && prompt.parameters.empty?
+        # Force re-reading of the prompt text to extract parameters
+        # This ensures parameters are found even without a .json file
+        prompt.reload
+      end
 
       return prompt if prompt
     else
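A short sketch of what this guard does, using only the prompt_manager calls visible in the hunk (the fetch helper is hypothetical):

```ruby
# Without a saved .json history file a freshly fetched prompt can report no
# parameters; reload re-parses the prompt text so [LIKE_THIS] placeholders
# are still detected.
prompt = fetch_prompt('example')          # hypothetical helper returning a PromptManager prompt
prompt.reload if prompt.parameters.empty? # re-extract parameters from the raw text
```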
data/lib/aia/ruby_llm_adapter.rb
ADDED
@@ -0,0 +1,179 @@
+# lib/aia/ruby_llm_adapter.rb
+#
+
+require 'ruby_llm'
+
+module AIA
+  class RubyLLMAdapter
+    def initialize
+      @model = AIA.config.model
+      model_info = extract_model_parts(@model)
+
+      # Configure RubyLLM with available API keys
+      RubyLLM.configure do |config|
+        config.openai_api_key = ENV.fetch('OPENAI_API_KEY', nil)
+        config.anthropic_api_key = ENV.fetch('ANTHROPIC_API_KEY', nil)
+        config.gemini_api_key = ENV.fetch('GEMINI_API_KEY', nil)
+        config.deepseek_api_key = ENV.fetch('DEEPSEEK_API_KEY', nil)
+
+        # Bedrock configuration
+        config.bedrock_api_key = ENV.fetch('AWS_ACCESS_KEY_ID', nil)
+        config.bedrock_secret_key = ENV.fetch('AWS_SECRET_ACCESS_KEY', nil)
+        config.bedrock_region = ENV.fetch('AWS_REGION', nil)
+        config.bedrock_session_token = ENV.fetch('AWS_SESSION_TOKEN', nil)
+      end
+
+      # Initialize chat with the specified model
+      @chat = RubyLLM.chat(model: model_info[:model])
+    end
+
+    def chat(prompt)
+      if @model.downcase.include?('dall-e') || @model.downcase.include?('image-generation')
+        text_to_image(prompt)
+      elsif @model.downcase.include?('vision') || @model.downcase.include?('image')
+        image_to_text(prompt)
+      elsif @model.downcase.include?('tts') || @model.downcase.include?('speech')
+        text_to_audio(prompt)
+      elsif @model.downcase.include?('whisper') || @model.downcase.include?('transcription')
+        audio_to_text(prompt)
+      else
+        text_to_text(prompt)
+      end
+    end
+
+    def transcribe(audio_file)
+      @chat.ask("Transcribe this audio", with: { audio: audio_file })
+    end
+
+    def speak(text)
+      output_file = "#{Time.now.to_i}.mp3"
+
+      # Note: RubyLLM doesn't have a direct text-to-speech feature
+      # This is a placeholder for a custom implementation or external service
+      begin
+        # Try using a TTS API if available
+        # For now, we'll use a mock implementation
+        File.write(output_file, "Mock TTS audio content")
+        system("#{AIA.config.speak_command} #{output_file}") if File.exist?(output_file) && system("which #{AIA.config.speak_command} > /dev/null 2>&1")
+        "Audio generated and saved to: #{output_file}"
+      rescue => e
+        "Error generating audio: #{e.message}"
+      end
+    end
+
+    def method_missing(method, *args, &block)
+      if @chat.respond_to?(method)
+        @chat.public_send(method, *args, &block)
+      else
+        super
+      end
+    end
+
+    def respond_to_missing?(method, include_private = false)
+      @chat.respond_to?(method) || super
+    end
+
+    private
+
+    def extract_model_parts(model_string)
+      parts = model_string.split('/')
+      parts.map!(&:strip)
+
+      if parts.length > 1
+        provider = parts[0]
+        model = parts[1]
+      else
+        provider = nil # RubyLLM will figure it out from the model name
+        model = parts[0]
+      end
+
+      { provider: provider, model: model }
+    end
+
+    def extract_text_prompt(prompt)
+      if prompt.is_a?(String)
+        prompt
+      elsif prompt.is_a?(Hash) && prompt[:text]
+        prompt[:text]
+      elsif prompt.is_a?(Hash) && prompt[:content]
+        prompt[:content]
+      else
+        prompt.to_s
+      end
+    end
+
+    def text_to_text(prompt)
+      text_prompt = extract_text_prompt(prompt)
+      @chat.ask(text_prompt)
+    end
+
+    def text_to_image(prompt)
+      text_prompt = extract_text_prompt(prompt)
+      output_file = "#{Time.now.to_i}.png"
+
+      begin
+        RubyLLM.paint(text_prompt, output_path: output_file,
+                      size: AIA.config.image_size,
+                      quality: AIA.config.image_quality,
+                      style: AIA.config.image_style)
+        "Image generated and saved to: #{output_file}"
+      rescue => e
+        "Error generating image: #{e.message}"
+      end
+    end
+
+    def image_to_text(prompt)
+      image_path = extract_image_path(prompt)
+      text_prompt = extract_text_prompt(prompt)
+
+      if image_path && File.exist?(image_path)
+        begin
+          @chat.ask(text_prompt, with: { image: image_path })
+        rescue => e
+          "Error analyzing image: #{e.message}"
+        end
+      else
+        text_to_text(prompt)
+      end
+    end
+
+    def text_to_audio(prompt)
+      text_prompt = extract_text_prompt(prompt)
+      output_file = "#{Time.now.to_i}.mp3"
+
+      begin
+        # Note: RubyLLM doesn't have a direct TTS feature
+        # This is a placeholder for a custom implementation
+        File.write(output_file, text_prompt)
+        system("#{AIA.config.speak_command} #{output_file}") if File.exist?(output_file) && system("which #{AIA.config.speak_command} > /dev/null 2>&1")
+        "Audio generated and saved to: #{output_file}"
+      rescue => e
+        "Error generating audio: #{e.message}"
+      end
+    end
+
+    def audio_to_text(prompt)
+      if prompt.is_a?(String) && File.exist?(prompt) &&
+          prompt.downcase.end_with?('.mp3', '.wav', '.m4a', '.flac')
+        begin
+          @chat.ask("Transcribe this audio", with: { audio: prompt })
+        rescue => e
+          "Error transcribing audio: #{e.message}"
+        end
+      else
+        # Fall back to regular chat if no valid audio file is found
+        text_to_text(prompt)
+      end
+    end
+
+    def extract_image_path(prompt)
+      if prompt.is_a?(String)
+        prompt.scan(/\b[\w\/\.\-]+\.(?:jpg|jpeg|png|gif|webp)\b/i).first
+      elsif prompt.is_a?(Hash)
+        prompt[:image] || prompt[:image_path]
+      else
+        nil
+      end
+    end
+  end
+end
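Two illustrative calls against the private extract_model_parts helper above (the model strings are hypothetical examples):

```ruby
adapter = AIA::RubyLLMAdapter.allocate # skip initialize, for illustration only
adapter.send(:extract_model_parts, 'openai/gpt-4o') #=> { provider: 'openai', model: 'gpt-4o' }
adapter.send(:extract_model_parts, 'gpt-4o')        #=> { provider: nil, model: 'gpt-4o' }
```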
data/lib/aia/session.rb
CHANGED
@@ -187,7 +187,7 @@ module AIA
       # Check for piped input (STDIN not a TTY and has data)
       if !STDIN.tty?
         # Save the original STDIN
-
+        original_stdin = STDIN.dup
 
         # Read the piped input
         piped_input = STDIN.read.strip
@@ -209,9 +209,12 @@ module AIA
 
         # Output the response
         @chat_processor.output_response(response)
-        @chat_processor.speak(response)
+        @chat_processor.speak(response) if AIA.speak?
         @ui_presenter.display_separator
       end
+
+      # Restore original stdin when done with piped input processing
+      STDIN.reopen(original_stdin)
     end
 
     loop do
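The STDIN save/restore pattern in isolation, as a generic sketch; the /dev/tty reattach is an assumption about what the surrounding session code does between the two hunks shown above:

```ruby
original_stdin = STDIN.dup        # hold on to the pipe's file descriptor
piped_input    = STDIN.read.strip # drain everything that was piped in
STDIN.reopen('/dev/tty')          # assumed: reattach to the terminal for the chat REPL
# ... run the interactive loop with piped_input as preloaded context ...
STDIN.reopen(original_stdin)      # restore the original descriptor when done
```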
data/lib/aia/utility.rb
CHANGED
data/lib/aia.rb
CHANGED
@@ -5,6 +5,7 @@
 # provides an interface for interacting with AI models and managing prompts.
 
 require 'ai_client'
+require 'ruby_llm'
 require 'prompt_manager'
 require 'debug_me'
 include DebugMe
@@ -18,6 +19,7 @@ require_relative 'aia/config'
 require_relative 'aia/shell_command_executor'
 require_relative 'aia/prompt_handler'
 require_relative 'aia/ai_client_adapter'
+require_relative 'aia/ruby_llm_adapter'
 require_relative 'aia/directive_processor'
 require_relative 'aia/history_manager'
 require_relative 'aia/ui_presenter'
@@ -76,7 +78,14 @@ module AIA
     end
 
     prompt_handler = PromptHandler.new
-
+
+    # Initialize the appropriate client adapter based on configuration
+    @config.client = if @config.adapter == 'ruby_llm'
+                       RubyLLMAdapter.new
+                     else
+                       AIClientAdapter.new
+                     end
+
     session = Session.new(prompt_handler)
 
     session.start
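Because both adapters answer the same chat(prompt) call (and RubyLLMAdapter forwards unknown methods to its underlying chat object via method_missing), the rest of AIA stays adapter-agnostic. A minimal sketch of that contract, assuming AIA's configuration is already loaded and using a hypothetical prompt string:

```ruby
client = if AIA.config.adapter == 'ruby_llm'
           AIA::RubyLLMAdapter.new
         else
           AIA::AIClientAdapter.new
         end

response = client.chat('Say hello in three languages.') # same call either way
```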
metadata
CHANGED
@@ -1,7 +1,7 @@
 --- !ruby/object:Gem::Specification
 name: aia
 version: !ruby/object:Gem::Version
-  version: 0.8.4
+  version: 0.8.6
 platform: ruby
 authors:
 - Dewayne VanHoozer
@@ -37,6 +37,20 @@ dependencies:
     - - ">="
       - !ruby/object:Gem::Version
         version: '0'
+- !ruby/object:Gem::Dependency
+  name: faraday
+  requirement: !ruby/object:Gem::Requirement
+    requirements:
+    - - ">="
+      - !ruby/object:Gem::Version
+        version: '0'
+  type: :runtime
+  prerelease: false
+  version_requirements: !ruby/object:Gem::Requirement
+    requirements:
+    - - ">="
+      - !ruby/object:Gem::Version
+        version: '0'
 - !ruby/object:Gem::Dependency
   name: os
   requirement: !ruby/object:Gem::Requirement
@@ -65,6 +79,20 @@ dependencies:
     - - ">="
       - !ruby/object:Gem::Version
         version: 0.5.2
+- !ruby/object:Gem::Dependency
+  name: ruby_llm
+  requirement: !ruby/object:Gem::Requirement
+    requirements:
+    - - ">="
+      - !ruby/object:Gem::Version
+        version: 1.2.0
+  type: :runtime
+  prerelease: false
+  version_requirements: !ruby/object:Gem::Requirement
+    requirements:
+    - - ">="
+      - !ruby/object:Gem::Version
+        version: 1.2.0
 - !ruby/object:Gem::Dependency
   name: reline
   requirement: !ruby/object:Gem::Requirement
@@ -247,17 +275,13 @@ dependencies:
     - - ">="
       - !ruby/object:Gem::Version
         version: '0'
-description:
-
-
-
-
-
-
-  power house that rivals specialized gen-AI tools. aia currently supports "mods"
-  and "sgpt" CLI tools. aia uses "ripgrep" and "fzf" CLI utilities to search for
-  and select prompt files to send to the backend gen-AI tool along with supported
-  context files.
+description: |
+  Unleash the full power of AI from your terminal! AIA is a cutting-edge CLI
+  assistant for generative AI workflows, offering dynamic prompt management,
+  seamless shell and Ruby integration, interactive chat, and advanced automation.
+  Effortlessly craft, manage, and execute prompts with embedded directives,
+  history, and flexible configuration. Experience next-level productivity for
+  developers, power users, and AI enthusiasts—all from your command line.
 email:
 - dvanhoozer@gmail.com
 executables:
@@ -290,6 +314,7 @@ files:
 - lib/aia/fzf.rb
 - lib/aia/history_manager.rb
 - lib/aia/prompt_handler.rb
+- lib/aia/ruby_llm_adapter.rb
 - lib/aia/session.rb
 - lib/aia/shell_command_executor.rb
 - lib/aia/ui_presenter.rb
@@ -321,5 +346,6 @@ required_rubygems_version: !ruby/object:Gem::Requirement
 requirements: []
 rubygems_version: 3.6.8
 specification_version: 4
-summary: AI Assistant
+summary: 'AI Assistant: dynamic prompts, shell & Ruby integration, and seamless chat
+  workflows.'
 test_files: []