aia 0.8.5 → 0.9.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 307fe4fba0b6c2134cba097ffd9b01750eea1f3e06d6c6de5131099435caaa16
- data.tar.gz: ac46548236f1c9162d62af9a999776d9a01ffa5cc8ff283ed9bfd774414ddc85
+ metadata.gz: 489e116b8ebfbfed10c2cd0d699a4e37294719d39899e65d4fec6f226dadf36d
+ data.tar.gz: 35f9488983cd2944c8638d51aacd31ba606ee3d01fcada61673a47aaba3f7645
  SHA512:
- metadata.gz: '09ec4085dabf70365ccc74a95126891a6075eefc57ad1a652432278b38355f55d2c503ebc6e10b52d9f77e3d7788fef0a9b1d9d6e6aa3b7d72b2cda82d337c08'
- data.tar.gz: 84eed4f09f8d9d506887ea91cd9b810cc47e77a724c3031e43b464779e5cfcaf5f0a92cc30304bf0d6ffea0eb31e35a23dc2baabff1671cff1bca8852b7e7801
+ metadata.gz: f4bb461ff9f7ac7c775cd72be754e1698cf057033f3ae6100d9ad7d1f348de96116af1f4a7271adf32b399147ad572f2c9bb198b27bbc7909f88548d4124f577
+ data.tar.gz: 3312f7ab73822ab0c315f0842bde82223d617b08bcc0b0734ff502c58494ef9c608433a9efcff1c7923a4fa89537eee4df35247d85fa96a34cc5f0b5b71177f8
data/.version CHANGED
@@ -1 +1 @@
- 0.8.5
+ 0.9.0
data/CHANGELOG.md CHANGED
@@ -1,8 +1,16 @@
  # Changelog
  ## [Unreleased]
+ ### [0.9.0] WIP
+ - Added MCP client support
+ - Removed the CLI options --erb and --shell but kept them in the config file with a default of true for both
 
  ## Released
- ### [0.8.5] 2025-04-49
+
+ ### [0.8.6] 2025-04-23
+ - Added a client adapter for the ruby_llm gem
+ - Added the adapter config item and the --adapter option to select at runtime which client to use: ai_client or ruby_llm
+
+ ### [0.8.5] 2025-04-19
  - documentation updates
  - integrated the https://pure.md web service for inserting web pages into the context window
  - //include http?://example.com/stuff
data/COMMITS.md ADDED
@@ -0,0 +1,23 @@
+ <!-- ~/COMMITS.md gem install aigcm -->
+
+ The JIRA ticket reference should be the first thing mentioned in the commit message.
+ It is usually the basename of the repository root. The repository root is
+ found in the system environment variable $RR.
+
+ A Git commit message includes:
+
+ 1. **Subject line**: starts with the JIRA ticket and is 50 characters or less, imperative mood.
+ - Example: `Fix bug in user login`
+
+ 2. **Body** (optional): Explain the "why" and "how", wrapped at 72 characters.
+ <example>
+ This commit fixes the login issue that occurs when the user
+ enters incorrect credentials. It improves error handling and
+ provides user feedback.
+ </example>
+
+ The body should also include bullet points for each change made.
+
+ 3. **Footer** (optional): Reference issues or breaking changes.
+ <example> Closes #123 </example>
+ <example> BREAKING CHANGE: API changed </example>
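Assembled, a commit message following the three-part format above might look like the sketch below (the ticket name `PROJ-123` and the bullet items are hypothetical, chosen to match the section's own examples):

```plaintext
PROJ-123 Fix bug in user login

This commit fixes the login issue that occurs when the user
enters incorrect credentials. It improves error handling and
provides user feedback.

- Guard against missing credentials before validation
- Show a clear error message when login fails

Closes #123
```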
data/README.md CHANGED
@@ -1,4 +1,4 @@
- 3# AI Assistant (AIA)
+ # AI Assistant (AIA)
 
  **The prompt is the code!**
 
@@ -11,28 +11,28 @@
  [/______\] / * embedded directives * shell integration
  / \__AI__/ \/ * embedded Ruby * history management
  / /__\ * interactive chat * prompt workflows
- (\ /____\
+ (\ /____\ # Experimental support of MCP servers via ruby_llm extension
  ```
 
  AIA leverages the [prompt_manager gem](https://github.com/madbomber/prompt_manager) to manage prompts. It utilizes the [CLI tool fzf](https://github.com/junegunn/fzf) for prompt selection.
 
  **Most Recent Change**: Refer to the [Changelog](CHANGELOG.md)
+ - Replaced `ai_client` with the `ruby_llm` gem
+ - Added --adapter, defaulting to ruby_llm, in case an alternative ever needs to be considered
  - //include directive now supports web URLs
  - //webpage insert web URL content as markdown into context
 
  **Wiki**: [Checkout the AIA Wiki](https://github.com/MadBomber/aia/wiki)
 
  **Notable Recent Changes:**
- - **Directive Processing in Chat and Prompts:** You can now use directives in chat sessions and prompt files. Use the directive **//help** to get a list of available directives.
+ - **RubyLLM Integration:** AIA now uses the `ruby_llm` gem as a replacement for ai_client. The option `--adapter ruby_llm` is the default. The `--adapter` option exists in case an alternative to ruby_llm is ever needed. Why am I replacing my own gem ai_client with the ruby_llm gem? Because it is better, newer, and more elegant, and it will easily support some of the new features I have planned for AIA.
+
+ - **MCP Server Support:** AIA now supports Model Context Protocol (MCP) servers through an extension to the ruby_llm gem. This experimental feature allows AIA to interact with various external tools and services through MCP servers using the --mcp and --allowed_tools CLI options.
 
  <!-- Tocer[start]: Auto-generated, don't remove. -->
 
  ## Table of Contents
 
- - [Installation](#installation)
- - [What is a Prompt ID?](#what-is-a-prompt-id)
- - [Embedded Parameters as Placeholders](#embedded-parameters-as-placeholders)
- - [Usage](#usage)
  - [Configuration Options](#configuration-options)
  - [Configuration Flexibility](#configuration-flexibility)
  - [Expandable Configuration](#expandable-configuration)
@@ -66,162 +66,68 @@ AIA leverages the [prompt_manager gem](https://github.com/madbomber/prompt_manag
  - [My Most Powerful Prompt](#my-most-powerful-prompt)
  - [My Configuration](#my-configuration)
  - [Executable Prompts](#executable-prompts)
+ - [Usage](#usage)
  - [Development](#development)
  - [Contributing](#contributing)
- - [History of Development](#history-of-development)
  - [Roadmap](#roadmap)
+ - [Model Context Protocol (MCP) Support](#model-context-protocol-mcp-support)
+ - [What is MCP?](#what-is-mcp)
+ - [Using MCP Servers with AIA](#using-mcp-servers-with-aia)
+ - [Focusing on Specific Tools with --allowed_tools](#focusing-on-specific-tools-with---allowed_tools)
+ - [How It Works](#how-it-works)
+ - [Current Limitations](#current-limitations)
+ - [Sounds like directives](#sounds-like-directives)
  - [License](#license)
 
  <!-- Tocer[finish]: Auto-generated, don't remove. -->
 
-
- ## Installation
-
- Install the gem by executing:
-
- gem install aia
-
- Install the command-line utilities by executing:
-
- brew install fzf
-
- You will also need to establish a directory in your filesystem where your prompt text files, last used parameters and usage log files are kept.
-
- Setup a system environment variable (envar) named "AIA_PROMPTS_DIR" that points to your prompts directory. The default is in your HOME directory named ".prompts". The envar "AIA_ROLES_PREFIX" points to your role prefix where you have prompts that define the different roles you want the LLM to assume when it is doing its work. The default roles prefix is "roles".
-
- You may also want to install the completion script for your shell. To get a copy of the completion script do:
-
- ```bash
- aia --completion bash
- ```
-
- `fish` and `zsh` are also available.
-
- ## What is a Prompt ID?
-
- A prompt ID is the basename of a text file (extension *.txt) located in a prompts directory. The prompts directory is specified by the environment variable "AIA_PROMPTS_DIR". If this variable is not set, the default is in your HOME directory named ".prompts". It can also be set on the command line with the `--prompts-dir` option.
-
- This file contains the context and instructions for the LLM to follow. The prompt ID is what you use as an option on the command line to specify which prompt text file to use. Prompt files can have comments, parameters, directives and ERB blocks along with the instruction text to feed to the LLM. It can also have shell commands and use system environment variables. Consider the following example:
-
- ```plaintext
- #!/usr/bin/env aia run
- # ~/.prompts/example.txt
- # Desc: Be an example prompt with all? the bells and whistles
-
- # Set the configuration for this prompt
-
- //config model = gpt-4
- //config temperature = 0.7
- //config shell = true
- //config erb = true
- //config out_file = path/to/output.md
-
- # Add some file content to the context/instructions
-
- //include path/to/file
- //shell cat path/to/file
- $(cat path/to/file)
-
- # Setup some workflows
-
- //next next_prompt_id
- //pipeline prompt_id_1, prompt_ie_2, prompt_id_3
-
- # Execute some Ruby code
-
- //ruby require 'some_library' # inserts into the context/instructions
- <% some_ruby_things # not inserted into the context %>
- <%= some_other_ruby_things # that are part of the context/instructions %>
-
- Tell me how to do something for a $(uname -s) platform that would rename all
- of the files in the directory $MY_DIRECTORY to have a prefix of for its filename
- that is [PREFIX] and a ${SUFFIX}
-
- <!--
- directives, ERB blocks and other junk can be used
- anywhere in the file mixing dynamic context/instructions with
- the static stuff.
- -->
-
- ```markdown
- # Header 1 -- not a comment
- ## Header 2 -- not a comment
- ### Header 3, etc -- not a comment
-
- ```ruby
- # this is a comment; but it stays in the prompt
- puts "hello world" <!-- this is also a comment; but it gets removed -->
- ```
- Kewl!
- ```
-
- __END__
-
- Everything after the "__END__" line is not part of the context or instructions to
- the LLM.
- ```
-
- Comments in a prompt text file are just there to document the prompt. They are removed before the completed prompt is processed by the LLM. This reduces token counts; but, more importantly it helps you remember why you structured your prompt they way you did - if you remembered to document your prompt.
-
- That is just about everything including the kitchen sink that a pre-compositional parameterized prompt file can have. It can be an executable with a she-bang line and a special system prompt name `run` as shown in the example. It has line comments that use the `#` symbol. It had end of file block comments that appear after the "__END__" line. It has directive command that begin with the double slash `//` - an homage to IBM JCL. It has shell variables in both forms. It has shell commands. It has parameters that default to a regex that uses square brackets and all uppercase characeters to define the parameter name whose value is to be given in a Q&A session before the prompt is sent to the LLM for processing.
-
- AIA has the ability to define a workflow of prompt IDs with either the //next or //pipeline directives.
-
- You could say that instead of the prompt being part of a program, a program can be part of the prompt. **The prompt is the code!**
-
- By using ERB you can make parts of the context/instructions conditional. You can also use ERB to make parts of the context/instructions dynamic for example to pull information from a database or an API.
-
- ## Embedded Parameters as Placeholders
-
- In the example prompt text file above I used the default regex to define parameters as all upper case characters plus space, underscore and the vertical pipe enclosed within square brackets. Since the time that I original starting writing AIA I've seen more developers use double curly braces to define parameters. AIA allows you to specify your own regex as a string. If you want the curly brackets use the `--regex` option on the command line like this:
-
- `--regex '(?-mix:({{[a-zA-Z _|]+}}))'`
-
-
- ## Usage
-
- The usage report is obtained with either `-h` or `--help` options.
-
- ```bash
- aia --help
- ```
-
  ## Configuration Options
 
  The following table provides a comprehensive list of configuration options, their default values, and the associated environment variables:
 
- | Option | Default Value | Environment Variable |
- |-------------------------|---------------------------------|---------------------------|
- | out_file | temp.md | AIA_OUT_FILE |
- | log_file | ~/.prompts/_prompts.log | AIA_LOG_FILE |
- | prompts_dir | ~/.prompts | AIA_PROMPTS_DIR |
- | roles_prefix | roles | AIA_ROLES_PREFIX |
- | model | gpt-4o-mini | AIA_MODEL |
- | speech_model | tts-1 | AIA_SPEECH_MODEL |
- | transcription_model | whisper-1 | AIA_TRANSCRIPTION_MODEL |
- | verbose | false | AIA_VERBOSE |
- | markdown | true | AIA_MARKDOWN |
- | shell | false | AIA_SHELL |
- | erb | false | AIA_ERB |
- | chat | false | AIA_CHAT |
- | clear | false | AIA_CLEAR |
- | terse | false | AIA_TERSE |
- | debug | false | AIA_DEBUG |
- | fuzzy | false | AIA_FUZZY |
- | speak | false | AIA_SPEAK |
- | append | false | AIA_APPEND |
- | temperature | 0.7 | AIA_TEMPERATURE |
- | max_tokens | 2048 | AIA_MAX_TOKENS |
- | top_p | 1.0 | AIA_TOP_P |
- | frequency_penalty | 0.0 | AIA_FREQUENCY_PENALTY |
- | presence_penalty | 0.0 | AIA_PRESENCE_PENALTY |
- | image_size | 1024x1024 | AIA_IMAGE_SIZE |
- | image_quality | standard | AIA_IMAGE_QUALITY |
- | image_style | vivid | AIA_IMAGE_STYLE |
- | embedding_model | text-embedding-ada-002 | AIA_EMBEDDING_MODEL |
- | speak_command | afplay | AIA_SPEAK_COMMAND |
- | require_libs | [] | AIA_REQUIRE_LIBS |
- | regex | '(?-mix:(\\[[A-Z _|]+\\]))' | AIA_REGEX |
+ | Config Item Name | CLI Options | Default Value | Environment Variable |
+ |----------------------|-------------|-----------------------------|---------------------------|
+ | adapter | --adapter | ruby_llm | AIA_ADAPTER |
+ | aia_dir | | ~/.aia | AIA_DIR |
+ | append | -a, --append | false | AIA_APPEND |
+ | chat | --chat | false | AIA_CHAT |
+ | clear | --clear | false | AIA_CLEAR |
+ | config_file | -c, --config_file | ~/.aia/config.yml | AIA_CONFIG_FILE |
+ | debug | -d, --debug | false | AIA_DEBUG |
+ | embedding_model | --em, --embedding_model | text-embedding-ada-002 | AIA_EMBEDDING_MODEL |
+ | erb | | true | AIA_ERB |
+ | frequency_penalty | --frequency_penalty | 0.0 | AIA_FREQUENCY_PENALTY |
+ | fuzzy | -f, --fuzzy | false | AIA_FUZZY |
+ | image_quality | --iq, --image_quality | standard | AIA_IMAGE_QUALITY |
+ | image_size | --is, --image_size | 1024x1024 | AIA_IMAGE_SIZE |
+ | image_style | --style, --image_style | vivid | AIA_IMAGE_STYLE |
+ | log_file | -l, --log_file | ~/.prompts/_prompts.log | AIA_LOG_FILE |
+ | markdown | --md, --markdown | true | AIA_MARKDOWN |
+ | max_tokens | --max_tokens | 2048 | AIA_MAX_TOKENS |
+ | mcp_servers | --mcp | [] | AIA_MCP_SERVERS |
+ | model | -m, --model | gpt-4o-mini | AIA_MODEL |
+ | next | -n, --next | nil | AIA_NEXT |
+ | out_file | -o, --out_file | temp.md | AIA_OUT_FILE |
+ | parameter_regex | --regex | '(?-mix:(\[[A-Z _|]+\]))' | AIA_PARAMETER_REGEX |
+ | pipeline | --pipeline | [] | AIA_PIPELINE |
+ | presence_penalty | --presence_penalty | 0.0 | AIA_PRESENCE_PENALTY |
+ | prompt_extname | | .txt | AIA_PROMPT_EXTNAME |
+ | prompts_dir | -p, --prompts_dir | ~/.prompts | AIA_PROMPTS_DIR |
+ | require_libs | --rq | [] | AIA_REQUIRE_LIBS |
+ | role | -r, --role | | AIA_ROLE |
+ | roles_dir | | ~/.prompts/roles | AIA_ROLES_DIR |
+ | roles_prefix | --roles_prefix | roles | AIA_ROLES_PREFIX |
+ | shell | | true | AIA_SHELL |
+ | speak | --speak | false | AIA_SPEAK |
+ | speak_command | | afplay | AIA_SPEAK_COMMAND |
+ | speech_model | --sm, --speech_model | tts-1 | AIA_SPEECH_MODEL |
+ | system_prompt | --system_prompt | | AIA_SYSTEM_PROMPT |
+ | temperature | -t, --temperature | 0.7 | AIA_TEMPERATURE |
+ | terse | --terse | false | AIA_TERSE |
+ | top_p | --top_p | 1.0 | AIA_TOP_P |
+ | transcription_model | --tm, --transcription_model | whisper-1 | AIA_TRANSCRIPTION_MODEL |
+ | verbose | -v, --verbose | false | AIA_VERBOSE |
+ | voice | --voice | alloy | AIA_VOICE |
 
  These options can be configured via command-line arguments, environment variables, or configuration files.
 
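The Environment Variable column of the new table maps each config item onto a plain shell export. A minimal sketch (the values shown are illustrative, and the CLI-option-over-environment precedence noted in the comments is an assumption, not something the table states):

```shell
# Set a few config items via their AIA_* environment variables;
# the equivalent CLI options from the table are noted alongside.
# A CLI option given at invocation time would presumably override these.
export AIA_MODEL=gpt-4o-mini        # same item as: -m, --model
export AIA_PROMPTS_DIR=~/.prompts   # same item as: -p, --prompts_dir
export AIA_TEMPERATURE=0.7          # same item as: -t, --temperature
echo "model=$AIA_MODEL temperature=$AIA_TEMPERATURE"
```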
@@ -293,7 +199,6 @@ The `--erb` option turns the prompt text file into a fully functioning ERB templ
 
  Most websites that have information about ERB will give examples of how to use ERB to generate dynamic HTML content for web-based applications. That is a common use case for ERB. AIA on the other hand uses ERB to generate dynamic prompt text for LLM processing.
 
-
  ## Prompt Directives
 
  Downstream processing directives were added to the `prompt_manager` gem used by AIA at version 0.4.1. These directives are lines in the prompt text file that begin with "//" having this pattern:
@@ -388,7 +293,6 @@ The `path_to_file` can be either absolute or relative. If it is relative, it is
 
  The file that is included will have any comments or directives excluded. It is expected that the file will be a text file so that its content can be pre-pended to the existing prompt; however, if the file is a source code file (ex: file.rb) the source code will be included HOWEVER any comment line or line that starts with "//" will be excluded.
 
-
  #### //ruby
 
  The `//ruby` directive executes Ruby code. You can use this to perform complex operations or interact with Ruby libraries.
@@ -423,7 +327,6 @@ There are no limitations on what the shell command can be. For example if you w
 
  Which does basically the same thing as the `//include` directive, except it uses the entire content of the file. For relative file paths the same thing applies. The file's path will be relative to the PWD.
 
-
  #### //next
  Examples:
  ```bash
@@ -523,6 +426,7 @@ or inside of the `one.txt` prompt file use this directive:
 
  ### Best Practices ??
 
+
  Since the response of one prompt is fed into the next prompt within the sequence instead of having all prompts write their response to the same out file, use these directives inside the associated prompt files:
 
 
@@ -631,7 +535,7 @@ fzf
 
  ## Shell Completion
 
- You can setup a completion function in your shell that will complete on the prompt_id saved in your `prompts_dir` - functions for `bash`, `fish` and `zsh` are available. To get a copy of these functions do this:
+ You can set up a completion function in your shell that will complete on the prompt_id saved in your `prompts_dir` - functions for `bash`, `fish` and `zsh` are available. To get a copy of these functions do:
 
  ```bash
  aia --completion bash
@@ -682,7 +586,7 @@ export AIA_MODEL=gpt-4o-mini
  # for feedback while waiting for the LLM to respond.
  export AIA_VERBOSE=true
 
- alias chat='aia --chat --shell --erb --terse'
+ alias chat='aia --chat --terse'
 
  # rest of the file is the completion function
  ```
@@ -735,6 +639,24 @@ Since its output is going to STDOUT you can setup a pipe chain. Using the CLI p
 
  This executable prompt concept sets up the building blocks of a *nix CLI-based pipeline in the same way that the --pipeline and --next options and directives are used.
 
+ ## Usage
+
+ The usage report is obtained with either `-h` or `--help` options.
+
+ ```bash
+ aia --help
+ ```
+
+ Key command-line options include:
+
+ - `--adapter ADAPTER`: Choose the LLM interface adapter. Currently 'ruby_llm' is the only valid option and the default; alternatives may be added in the future. See [RubyLLM Integration Guide](README_RUBY_LLM.md) for details.
+ - `--model MODEL`: Specify which LLM model to use
+ - `--chat`: Start an interactive chat session
+ - `--role ROLE`: Specify a role/system prompt
+ - And many more (use --help to see all options)
+
+ **Note:** ERB and shell processing are now standard features and always enabled. This allows you to use embedded Ruby code and shell commands in your prompts without needing to specify any additional options.
+
  ## Development
 
  **ShellCommandExecutor Refactor:**
@@ -751,16 +673,62 @@ When you find problems with AIA please note them as an issue. This thing was wr
 
  I'm not happy with the way where some command line options for external command are hard coded. I'm specific talking about the way in which the `rg` and `fzf` tools are used. Their options decide the basic look and feel of the search capability on the command line. Maybe they should be part of the overall configuration so that users can tune their UI to the way they like.
 
- ## History of Development
+ ## Roadmap
 
- I originally wrote a tiny script called `aip.rb` to experiment with parameterized prompts. That was in August of 2023. AIP meant AI Parameterized. Adding an extra P for Prompts just seemed to be a little silly. It lived in my [scripts repo](https://github.com/MadBomber/scripts) for a while. It became useful to me so of course I need to keep enhancing it. I moved it into my [experiments repo](https://github.com/MadBomber/experiments) and began adding features in a haphazard manner. No real plan or architecture. From those experiments I refactored out the [prompt_manager gem](https://github.com/MadBomber/prompt_manager) and the [ai_client gem](https://github.com/MadBomber/ai_client)(https://github.com/MadBomber/ai_client). The name was changed from AIP to AIA and it became a gem.
+ - restore the prompt text file search; currently fzf only looks at prompt IDs
+ - continue integration of the ruby_llm gem
+ - support for Model Context Protocol
 
- All of that undirected experimentation without a clear picture of where this thing was going resulted in chaotic code. I would use an Italian food dish to explain the organization but I think chaotic is more descriptive.
+ ## Model Context Protocol (MCP) Support
 
- ## Roadmap
+ AIA now includes experimental support for Model Context Protocol (MCP) servers through an extension to the `ruby_llm` gem, which utilizes the `ruby-mcp-client` gem for communication with MCP servers.
+
+ ### What is MCP?
+
+ Model Context Protocol (MCP) is a standardized way for AI models to interact with external tools and services. It allows AI assistants to perform actions beyond mere text generation, such as searching the web, accessing databases, or interacting with APIs.
+
+ ### Using MCP Servers with AIA
+
+ To use MCP with AIA, you can specify one or more MCP servers using the `--mcp` CLI option:
+
+ ```bash
+ aia --chat --mcp server1,server2
+ ```
+
+ This will connect your AIA session to the specified MCP servers, allowing the AI model to access tools provided by those servers.
+
+ ### Focusing on Specific Tools with --allowed_tools
+
+ If you want to limit which MCP tools are available to the AI model during your session, you can use the `--allowed_tools` option:
+
+ ```bash
+ aia --chat --mcp server1 --allowed_tools tool1,tool2,tool3
+ ```
+
+ This restricts the AI model to only use the specified tools, which can be helpful for:
+
+ - Focusing the AI on a specific task
+ - Preventing the use of unnecessary or potentially harmful tools
+ - Reducing the cognitive load on the AI by limiting available options
+
+ ### How It Works
+
+ Behind the scenes, AIA leverages a new `with_mcp` method in `RubyLLM::Chat` to set up the MCP tools for chat sessions. This method processes the hash-structured MCP tool definitions and makes them available for the AI to use during the conversation.
+
+ When you specify `--mcp` and optionally `--allowed_tools`, AIA configures the underlying `ruby_llm` adapter to connect to the appropriate MCP servers and expose only the tools you've explicitly allowed.
+
+ ### Current Limitations
+
+ As this is an experimental feature:
+
+ - Not all AI models support MCP tools
+ - The available tools may vary depending on the MCP server
+ - Tool behavior may change as the MCP specification evolves
+
+ ### Sounds like directives
+
+ Anthropic's MCP standard came months after aia appeared with its inherent support for dynamic prompt context enhancement via directives, shell integration and ERB. You could argue that, given a working MCP client and access to appropriate MCP servers, such enhancements are superfluous. I don't agree. Directives, shell integration and ERB are powerful tools in their own right, and I think they are a better way to do things.
 
- - support for using Ruby-based functional callback tools
- - support for Model Context Protocol
 
  ## License
 
@@ -60,7 +60,20 @@ module AIA
 
 
  def maybe_change_model
- if AIA.client.model != AIA.config.model
+ if AIA.client.model.is_a?(String)
+ client_model = AIA.client.model # AiClient instance
+ else
+ client_model = AIA.client.model.id # RubyLLM::Model instance
+ end
+
+ debug_me('== dynamic model change? =='){[
+ :client_model,
+ "AIA.config.model"
+ ]}
+
+ # when adapter is ruby_llm must use model.id as the name
+ unless AIA.config.model.downcase.include?(client_model.downcase)
+ # FIXME: assumes that the adapter is AiClient. It might be RubyLLM
  AIA.client = AIClientAdapter.new
  end
  end
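The comparison logic added in the hunk above can be exercised in isolation. The sketch below is a hedged reduction, not aia's actual classes: the `AiStyleClient`/`LlmStyleClient` structs and both helper methods are hypothetical stand-ins that mirror the new String-vs-model-object branch and the case-insensitive substring test.

```ruby
# Hypothetical stand-ins for the two adapter styles the diff handles:
# an AiClient-style client exposes the model name as a String, while a
# RubyLLM-style client exposes a model object that responds to #id.
AiStyleClient  = Struct.new(:model)   # model is a String
ModelObject    = Struct.new(:id)
LlmStyleClient = Struct.new(:model)   # model is a ModelObject

# Mirrors the branch added in maybe_change_model: normalize the client's
# model to a String before comparing it against the configured model name.
def client_model_name(client)
  m = client.model
  m.is_a?(String) ? m : m.id
end

# Mirrors the new substring test: a client change is needed unless the
# configured name already contains the client's model name, case-insensitively.
def model_change_needed?(configured, client)
  !configured.downcase.include?(client_model_name(client).downcase)
end

puts model_change_needed?("gpt-4o-mini", AiStyleClient.new("gpt-4o-mini"))
puts model_change_needed?("gpt-4o-mini", LlmStyleClient.new(ModelObject.new("GPT-4o-mini")))
puts model_change_needed?("gpt-4o-mini", AiStyleClient.new("claude-3"))
```

Note that the substring test deliberately tolerates naming differences between adapters (e.g. a configured `gpt-4o-mini` matching a RubyLLM model id that differs only in case), which is exactly the situation the diff's FIXME comment is flagging.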