aia 0.8.6 → 0.9.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- checksums.yaml +4 -4
- data/.version +1 -1
- data/CHANGELOG.md +4 -0
- data/COMMITS.md +23 -0
- data/README.md +111 -41
- data/lib/aia/chat_processor_service.rb +14 -1
- data/lib/aia/config.rb +147 -13
- data/lib/aia/prompt_handler.rb +7 -9
- data/lib/aia/ruby_llm_adapter.rb +79 -33
- data/lib/aia/session.rb +127 -99
- data/lib/aia/ui_presenter.rb +10 -1
- data/lib/aia.rb +6 -4
- data/lib/extensions/ruby_llm/chat.rb +197 -0
- data/mcp_servers/README.md +90 -0
- data/mcp_servers/filesystem.json +9 -0
- data/mcp_servers/imcp.json +7 -0
- data/mcp_servers/launcher.json +11 -0
- data/mcp_servers/playwright_server_definition.json +9 -0
- data/mcp_servers/timeserver.json +8 -0
- metadata +21 -14
- data/lib/aia/ai_client_adapter.rb +0 -210
checksums.yaml
CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 489e116b8ebfbfed10c2cd0d699a4e37294719d39899e65d4fec6f226dadf36d
+  data.tar.gz: 35f9488983cd2944c8638d51aacd31ba606ee3d01fcada61673a47aaba3f7645
 SHA512:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: f4bb461ff9f7ac7c775cd72be754e1698cf057033f3ae6100d9ad7d1f348de96116af1f4a7271adf32b399147ad572f2c9bb198b27bbc7909f88548d4124f577
+  data.tar.gz: 3312f7ab73822ab0c315f0842bde82223d617b08bcc0b0734ff502c58494ef9c608433a9efcff1c7923a4fa89537eee4df35247d85fa96a34cc5f0b5b71177f8
data/.version
CHANGED
@@ -1 +1 @@
-0.8.6
+0.9.0
data/CHANGELOG.md
CHANGED
data/COMMITS.md
ADDED
@@ -0,0 +1,23 @@
+<!-- ~/COMMITS.md gem install aigcm -->
+
+The JIRA ticket reference should be the first thing mentioned in the commit message.
+It is usually the basename of the repository root. The repository root is
+found in the system environment variable $RR.
+
+A Git commit message includes:
+
+1. **Subject line**: starts with the JIRA ticket and is 50 characters or less, imperative mood.
+   - Example: `Fix bug in user login`
+
+2. **Body** (optional): Explain the "why" and "how", wrapped at 72 characters.
+   <example>
+   This commit fixes the login issue that occurs when the user
+   enters incorrect credentials. It improves error handling and
+   provides user feedback.
+   </example>
+
+The body should also include bullet points for each change made.
+
+3. **Footer** (optional): Reference issues or breaking changes.
+   <example> Closes #123 </example>
+   <example> BREAKING CHANGE: API changed </example>
data/README.md
CHANGED
@@ -11,20 +11,23 @@
     [/______\] /  * embedded directives  * shell integration
    / \__AI__/ \/  * embedded Ruby       * history management
   /    /__\       * interactive chat    * prompt workflows
-  (\   /____\
+  (\   /____\     # Experimental support of MCP servers via ruby_llm extension
 ```
 
 AIA leverages the [prompt_manager gem](https://github.com/madbomber/prompt_manager) to manage prompts. It utilizes the [CLI tool fzf](https://github.com/junegunn/fzf) for prompt selection.
 
 **Most Recent Change**: Refer to the [Changelog](CHANGELOG.md)
-
+- Replaced `ai_client` with the `ruby_llm` gem
+- Added `--adapter` with default `ruby_llm` in case there is a need to consider something else
 - //include directive now supports web URLs
 - //webpage insert web URL content as markdown into context
 
 **Wiki**: [Checkout the AIA Wiki](https://github.com/MadBomber/aia/wiki)
 
 **Notable Recent Changes:**
-- **RubyLLM Integration:** AIA now
+- **RubyLLM Integration:** AIA now uses the `ruby_llm` gem as a replacement for ai_client. The option `--adapter ruby_llm` is the default; the `--adapter` option exists in case an alternative to ruby_llm is ever needed. Why am I replacing my own gem ai_client with the ruby_llm gem? Because it is better, newer, more elegant, and will easily support some of the new features I have planned for AIA.
+
+- **MCP Server Support:** AIA now supports Model Context Protocol (MCP) servers through an extension to the ruby_llm gem. This experimental feature allows AIA to interact with various external tools and services through MCP servers using the --mcp and --allowed_tools CLI options.
 
 <!-- Tocer[start]: Auto-generated, don't remove. -->
 
@@ -67,6 +70,13 @@ AIA leverages the [prompt_manager gem](https://github.com/madbomber/prompt_manag
   - [Development](#development)
   - [Contributing](#contributing)
   - [Roadmap](#roadmap)
+  - [Model Context Protocol (MCP) Support](#model-context-protocol-mcp-support)
+    - [What is MCP?](#what-is-mcp)
+    - [Using MCP Servers with AIA](#using-mcp-servers-with-aia)
+    - [Focusing on Specific Tools with --allowed_tools](#focusing-on-specific-tools-with---allowed_tools)
+    - [How It Works](#how-it-works)
+    - [Current Limitations](#current-limitations)
+    - [Sounds like directives](#sounds-like-directives)
   - [License](#license)
 
 <!-- Tocer[finish]: Auto-generated, don't remove. -->
@@ -75,39 +85,49 @@ AIA leverages the [prompt_manager gem](https://github.com/madbomber/prompt_manag
 
 The following table provides a comprehensive list of configuration options, their default values, and the associated environment variables:
 
-| adapter
-| presence_penalty
+| Config Item Name     | CLI Options                 | Default Value               | Environment Variable      |
+|----------------------|-----------------------------|-----------------------------|---------------------------|
+| adapter              | --adapter                   | ruby_llm                    | AIA_ADAPTER               |
+| aia_dir              |                             | ~/.aia                      | AIA_DIR                   |
+| append               | -a, --append                | false                       | AIA_APPEND                |
+| chat                 | --chat                      | false                       | AIA_CHAT                  |
+| clear                | --clear                     | false                       | AIA_CLEAR                 |
+| config_file          | -c, --config_file           | ~/.aia/config.yml           | AIA_CONFIG_FILE           |
+| debug                | -d, --debug                 | false                       | AIA_DEBUG                 |
+| embedding_model      | --em, --embedding_model     | text-embedding-ada-002      | AIA_EMBEDDING_MODEL       |
+| erb                  |                             | true                        | AIA_ERB                   |
+| frequency_penalty    | --frequency_penalty         | 0.0                         | AIA_FREQUENCY_PENALTY     |
+| fuzzy                | -f, --fuzzy                 | false                       | AIA_FUZZY                 |
+| image_quality        | --iq, --image_quality       | standard                    | AIA_IMAGE_QUALITY         |
+| image_size           | --is, --image_size          | 1024x1024                   | AIA_IMAGE_SIZE            |
+| image_style          | --style, --image_style      | vivid                       | AIA_IMAGE_STYLE           |
+| log_file             | -l, --log_file              | ~/.prompts/_prompts.log     | AIA_LOG_FILE              |
+| markdown             | --md, --markdown            | true                        | AIA_MARKDOWN              |
+| max_tokens           | --max_tokens                | 2048                        | AIA_MAX_TOKENS            |
+| mcp_servers          | --mcp                       | []                          | AIA_MCP_SERVERS           |
+| model                | -m, --model                 | gpt-4o-mini                 | AIA_MODEL                 |
+| next                 | -n, --next                  | nil                         | AIA_NEXT                  |
+| out_file             | -o, --out_file              | temp.md                     | AIA_OUT_FILE              |
+| parameter_regex      | --regex                     | '(?-mix:(\[[A-Z _|]+\]))'   | AIA_PARAMETER_REGEX       |
+| pipeline             | --pipeline                  | []                          | AIA_PIPELINE              |
+| presence_penalty     | --presence_penalty          | 0.0                         | AIA_PRESENCE_PENALTY      |
+| prompt_extname       |                             | .txt                        | AIA_PROMPT_EXTNAME        |
+| prompts_dir          | -p, --prompts_dir           | ~/.prompts                  | AIA_PROMPTS_DIR           |
+| require_libs         | --rq                        | []                          | AIA_REQUIRE_LIBS          |
+| role                 | -r, --role                  |                             | AIA_ROLE                  |
+| roles_dir            |                             | ~/.prompts/roles            | AIA_ROLES_DIR             |
+| roles_prefix         | --roles_prefix              | roles                       | AIA_ROLES_PREFIX          |
+| shell                |                             | true                        | AIA_SHELL                 |
+| speak                | --speak                     | false                       | AIA_SPEAK                 |
+| speak_command        |                             | afplay                      | AIA_SPEAK_COMMAND         |
+| speech_model         | --sm, --speech_model        | tts-1                       | AIA_SPEECH_MODEL          |
+| system_prompt        | --system_prompt             |                             | AIA_SYSTEM_PROMPT         |
+| temperature          | -t, --temperature           | 0.7                         | AIA_TEMPERATURE           |
+| terse                | --terse                     | false                       | AIA_TERSE                 |
+| top_p                | --top_p                     | 1.0                         | AIA_TOP_P                 |
+| transcription_model  | --tm, --transcription_model | whisper-1                   | AIA_TRANSCRIPTION_MODEL   |
+| verbose              | -v, --verbose               | false                       | AIA_VERBOSE               |
+| voice                | --voice                     | alloy                       | AIA_VOICE                 |
 
 These options can be configured via command-line arguments, environment variables, or configuration files.
 
@@ -566,7 +586,7 @@ export AIA_MODEL=gpt-4o-mini
 # for feedback while waiting for the LLM to respond.
 export AIA_VERBOSE=true
 
-alias chat='aia --chat --
+alias chat='aia --chat --terse'
 
 # rest of the file is the completion function
 ```
@@ -629,14 +649,14 @@ aia --help
 
 Key command-line options include:
 
-- `--adapter ADAPTER`: Choose the LLM interface adapter to use. Valid options are '
+- `--adapter ADAPTER`: Choose the LLM interface adapter to use. Valid options are 'ruby_llm' (default) or something else in the future. See [RubyLLM Integration Guide](README_RUBY_LLM.md) for details.
 - `--model MODEL`: Specify which LLM model to use
 - `--chat`: Start an interactive chat session
-- `--shell`: Enable shell command integration
-- `--erb`: Enable ERB processing
 - `--role ROLE`: Specify a role/system prompt
 - And many more (use --help to see all options)
 
+**Note:** ERB and shell processing are now standard features and always enabled. This allows you to use embedded Ruby code and shell commands in your prompts without needing to specify any additional options.
+
 ## Development
 
 **ShellCommandExecutor Refactor:**
@@ -655,11 +675,61 @@ I'm not happy with the way where some command line options for external command
 
 ## Roadmap
 
-- I'm thinking about removing the --erb and --shell options and just making those two integrations available all the time.
 - restore the prompt text file search. currently fzf only looks at prompt IDs.
 - continue integration of the ruby_llm gem
 - support for Model Context Protocol
 
+## Model Context Protocol (MCP) Support
+
+AIA now includes experimental support for Model Context Protocol (MCP) servers through an extension to the `ruby_llm` gem, which utilizes the `ruby-mcp-client` gem for communication with MCP servers.
+
+### What is MCP?
+
+Model Context Protocol (MCP) is a standardized way for AI models to interact with external tools and services. It allows AI assistants to perform actions beyond mere text generation, such as searching the web, accessing databases, or interacting with APIs.
+
+### Using MCP Servers with AIA
+
+To use MCP with AIA, you can specify one or more MCP servers using the `--mcp` CLI option:
+
+```bash
+aia --chat --mcp server1,server2
+```
+
+This will connect your AIA session to the specified MCP servers, allowing the AI model to access tools provided by those servers.
+
+### Focusing on Specific Tools with --allowed_tools
+
+If you want to limit which MCP tools are available to the AI model during your session, you can use the `--allowed_tools` option:
+
+```bash
+aia --chat --mcp server1 --allowed_tools tool1,tool2,tool3
+```
+
+This restricts the AI model to only use the specified tools, which can be helpful for:
+
+- Focusing the AI on a specific task
+- Preventing the use of unnecessary or potentially harmful tools
+- Reducing the cognitive load on the AI by limiting available options
+
+### How It Works
+
+Behind the scenes, AIA leverages a new `with_mcp` method in `RubyLLM::Chat` to set up the MCP tools for chat sessions. This method processes the hash-structured MCP tool definitions and makes them available for the AI to use during the conversation.
+
+When you specify `--mcp` and optionally `--allowed_tools`, AIA configures the underlying `ruby_llm` adapter to connect to the appropriate MCP servers and expose only the tools you've explicitly allowed.
+
+### Current Limitations
+
+As this is an experimental feature:
+
+- Not all AI models support MCP tools
+- The available tools may vary depending on the MCP server
+- Tool behavior may change as the MCP specification evolves
+
+### Sounds like directives
+
+Anthropic's MCP standard appeared months after aia, which already offered dynamic prompt context enhancement via directives, shell integration, and ERB support. You could argue that with a working MCP client and access to appropriate MCP servers, those enhancements become superfluous. I don't agree. Directives, shell integration, and ERB support are powerful tools in their own right, and I think they are a better way to do things.
+
+
 ## License
 
 The gem is available as open source under the terms of the [MIT License](https://opensource.org/licenses/MIT).
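To make the `--mcp` option concrete, here is a sketch of what a single stdio-type server definition might look like, expressed in Ruby for illustration. The server name, command, and arguments are hypothetical and are not taken from the gem's bundled `mcp_servers/` examples; the required keys follow the `is_single_server_definition?` heuristic in `config.rb` (a stdio server needs `command`, an sse server needs `url`).

```ruby
require 'json'

# Hypothetical single-server MCP definition (name, command, and args are
# illustrative only). "type" => "stdio" requires a "command" key.
definition = {
  "type"    => "stdio",
  "command" => "npx",
  "args"    => ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"]
}

# AIA ultimately wraps one or more such definitions under a single
# top-level "mcpServers" key before handing them to ruby-mcp-client.
combined = { "mcpServers" => { "filesystem" => definition } }
puts JSON.pretty_generate(combined)
```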
data/lib/aia/chat_processor_service.rb
CHANGED
@@ -60,7 +60,20 @@ module AIA
 
 
     def maybe_change_model
-      if AIA.client.model
+      if AIA.client.model.is_a?(String)
+        client_model = AIA.client.model     # AiClient instance
+      else
+        client_model = AIA.client.model.id  # RubyLLM::Model instance
+      end
+
+      debug_me('== dynamic model change? =='){[
+        :client_model,
+        "AIA.config.model"
+      ]}
+
+      # when adapter is ruby_llm must use model.id as the name
+      unless AIA.config.model.downcase.include?(client_model.downcase)
+        # FIXME: assumes that the adapter is AiClient. It might be RubyLLM
         AIA.client = AIClientAdapter.new
       end
     end
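The String-vs-`RubyLLM::Model` branching in `maybe_change_model` reduces to a small normalization helper. The sketch below mirrors that check in isolation; the `Struct` is a stand-in for a `RubyLLM::Model` (an assumption for illustration, since only its `#id` method matters here).

```ruby
# Normalizes the client's model to a comparable name string:
# AiClient stores a plain String, while RubyLLM::Model exposes
# the model name through #id.
def model_name(model)
  model.is_a?(String) ? model : model.id
end

FakeRubyLLMModel = Struct.new(:id)  # stand-in for RubyLLM::Model

model_name("gpt-4o-mini")                       # => "gpt-4o-mini"
model_name(FakeRubyLLMModel.new("gpt-4o-mini")) # => "gpt-4o-mini"
```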
data/lib/aia/config.rb
CHANGED
@@ -9,6 +9,9 @@ require 'yaml'
 require 'toml-rb'
 require 'erb'
 require 'optparse'
+require 'json'
+require 'tempfile'
+require 'fileutils'
 
 module AIA
   class Config
@@ -26,10 +29,14 @@ module AIA
       role:          '',
       system_prompt: '',
 
+      # MCP configuration
+      mcp_servers:   [],
+      allowed_tools: nil, # nil means all tools are allowed; otherwise an Array of Strings which are the tool names
+
       # Flags
       markdown: true,
-      shell:
-      erb:
+      shell:    true,
+      erb:      true,
       chat:  false,
       clear: false,
       terse: false,
@@ -60,7 +67,7 @@ module AIA
       speech_model:        'tts-1',
       transcription_model: 'whisper-1',
       voice:               'alloy',
-      adapter:             '
+      adapter:             'ruby_llm', # 'ruby_llm' or ???
 
       # Embedding parameters
       embedding_model: 'text-embedding-ada-002',
@@ -193,6 +200,13 @@ module AIA
         PromptManager::Prompt.parameter_regex = Regexp.new(config.parameter_regex)
       end
 
+      debug_me{[ 'config.mcp_servers' ]}
+
+      unless config.mcp_servers.empty?
+        # create a single JSON file containing all of the MCP server definitions specified by the --mcp option
+        config.mcp_servers = combine_mcp_server_json_files config.mcp_servers
+      end
+
       config
     end
 
@@ -239,7 +253,7 @@ module AIA
 
       opts.on("--adapter ADAPTER", "Interface that adapts AIA to the LLM") do |adapter|
         adapter.downcase!
-        valid_adapters = %w[
+        valid_adapters = %w[ ruby_llm ] # NOTE: Add additional adapters here when needed
         if valid_adapters.include? adapter
           config.adapter = adapter
         else
@@ -253,15 +267,7 @@ module AIA
         config.model = model
       end
 
-      opts.on("--
-        config.shell = true
-      end
-
-      opts.on("--erb", "Turns the prompt text file into a fully functioning ERB template, allowing for embedded Ruby code processing within the prompt text. This enables dynamic content generation and complex logic within prompts.") do
-        config.erb = true
-      end
-
-      opts.on("--terse", "Add terse instruction to prompt") do
+      opts.on("--terse", "Adds a special instruction to the prompt asking the AI to keep responses short and to the point") do
         config.terse = true
       end
 
@@ -428,6 +434,44 @@ module AIA
       opts.on("--rq LIBS", "Ruby libraries to require for Ruby directive") do |libs|
         config.require_libs = libs.split(',')
       end
+
+      opts.on("--mcp FILE", "Add MCP server configuration from JSON file. Can be specified multiple times.") do |file|
+        # FIXME: ruby-mcp-client is looking for a single JSON file that
+        # could contain multiple server definitions that looks like this:
+        # {
+        #   "mcpServers": {
+        #     "server one": { ... },
+        #     "server two": { ... }, ....
+        #   }
+        # }
+        # FIXME: need to turn multiple JSON files into one.
+        if AIA.good_file?(file)
+          config.mcp_servers ||= []
+          config.mcp_servers << file
+          begin
+            server_config = JSON.parse(File.read(file))
+            config.mcp_servers_config ||= []
+            config.mcp_servers_config << server_config
+          rescue JSON::ParserError => e
+            STDERR.puts "Error parsing MCP server config file #{file}: #{e.message}"
+            exit 1
+          end
+        else
+          STDERR.puts "MCP server config file not found: #{file}"
+          exit 1
+        end
+      end
+
+      opts.on("--at", "--allowed_tools TOOLS_LIST", "Allow only these tools to be used") do |tools_list|
+        config.allowed_tools ||= []
+        if tools_list.empty?
+          STDERR.puts "No list of tool names provided for --allowed_tools option"
+          exit 1
+        else
+          config.allowed_tools += tools_list.split(',').map(&:strip)
+          config.allowed_tools.uniq!
+        end
+      end
     end
 
     args = ARGV.dup
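The `--allowed_tools` handler above accumulates tool names across repeated uses of the option. In isolation, the list handling behaves like this sketch (tool names are illustrative):

```ruby
# Mimics repeated --allowed_tools options: each value is comma-split,
# whitespace-stripped, and merged into a single de-duplicated list.
allowed_tools = []
["read_file, write_file", "read_file,list_dir"].each do |tools_list|
  allowed_tools += tools_list.split(',').map(&:strip)
  allowed_tools.uniq!
end

allowed_tools  # => ["read_file", "write_file", "list_dir"]
```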
@@ -513,5 +557,95 @@ module AIA
       File.write(file, content)
       puts "Config successfully dumped to #{file}"
     end
+
+
+    # Combine multiple MCP server JSON files into a single file
+    def self.combine_mcp_server_json_files(file_paths)
+      raise ArgumentError, "No JSON files provided" if file_paths.nil? || file_paths.empty?
+
+      # The output will have only one top-level key: "mcpServers"
+      mcp_servers = {} # This will store all collected server_name => server_config pairs
+
+      file_paths.each do |file_path|
+        file_content = JSON.parse(File.read(file_path))
+        # Clean basename, e.g., "filesystem.json" -> "filesystem", "foo.json.erb" -> "foo"
+        cleaned_basename = File.basename(file_path).sub(/\.json\.erb$/, '').sub(/\.json$/, '')
+
+        if file_content.is_a?(Hash)
+          if file_content.key?("mcpServers") && file_content["mcpServers"].is_a?(Hash)
+            # Case A: {"mcpServers": {"name1": {...}, "name2": {...}}}
+            file_content["mcpServers"].each do |server_name, server_data|
+              if mcp_servers.key?(server_name)
+                STDERR.puts "Warning: Duplicate MCP server name '#{server_name}' found. Overwriting with definition from #{file_path}."
+              end
+              mcp_servers[server_name] = server_data
+            end
+          # Check if the root hash itself is a single server definition
+          elsif is_single_server_definition?(file_content)
+            # Case B: {"type": "stdio", ...} or {"url": "...", ...}
+            # Use "name" property from JSON if present, otherwise use cleaned_basename
+            server_name = file_content["name"] || cleaned_basename
+            if mcp_servers.key?(server_name)
+              STDERR.puts "Warning: Duplicate MCP server name '#{server_name}' (from file #{file_path}). Overwriting."
+            end
+            mcp_servers[server_name] = file_content
+          else
+            # Case D: Fallback for {"custom_name1": {server_config1}, "custom_name2": {server_config2}}
+            # This assumes top-level keys are server names and values are server configs.
+            file_content.each do |server_name, server_data|
+              if server_data.is_a?(Hash) && is_single_server_definition?(server_data)
+                if mcp_servers.key?(server_name)
+                  STDERR.puts "Warning: Duplicate MCP server name '#{server_name}' found in #{file_path}. Overwriting."
+                end
+                mcp_servers[server_name] = server_data
+              else
+                STDERR.puts "Warning: Unrecognized structure for key '#{server_name}' in #{file_path}. Value is not a valid server definition. Skipping."
+              end
+            end
+          end
+        elsif file_content.is_a?(Array)
+          # Case C: [ {server_config1}, {server_config2_with_name} ]
+          file_content.each_with_index do |server_data, index|
+            if server_data.is_a?(Hash) && is_single_server_definition?(server_data)
+              # Use "name" property from JSON if present, otherwise generate one
+              server_name = server_data["name"] || "#{cleaned_basename}_#{index}"
+              if mcp_servers.key?(server_name)
+                STDERR.puts "Warning: Duplicate MCP server name '#{server_name}' (from array in #{file_path}). Overwriting."
+              end
+              mcp_servers[server_name] = server_data
+            else
+              STDERR.puts "Warning: Unrecognized item in array in #{file_path} at index #{index}. Skipping."
+            end
+          end
+        else
+          STDERR.puts "Warning: Unrecognized JSON structure in #{file_path}. Skipping."
+        end
+      end
+
+      # Create the final output structure
+      output = {"mcpServers" => mcp_servers}
+      temp_file = Tempfile.new(['combined', '.json'])
+      temp_file.write(JSON.pretty_generate(output))
+      temp_file.close
+
+      temp_file.path
+    end
+
+    # Helper method to determine if a hash represents a valid MCP server definition
+    def self.is_single_server_definition?(config)
+      return false unless config.is_a?(Hash)
+      type = config['type']
+      if type
+        return true if type == 'stdio' && config.key?('command')
+        return true if type == 'sse' && config.key?('url')
+        # Potentially other explicit types if they exist in MCP
+        return false # Known type but missing required fields for it, or unknown type
+      else
+        # Infer type
+        return true if config.key?('command') || config.key?('args') || config.key?('env') # stdio
+        return true if config.key?('url') # sse
+      end
+      false
+    end
   end
 end
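The merge performed by `combine_mcp_server_json_files` can be exercised end to end with a pared-down sketch. This version handles only the `mcpServers`-wrapped form (Case A) and the bare single-server form, skipping the array and fallback cases and the duplicate-name warnings; file names below are throwaway tempfiles.

```ruby
require 'json'
require 'tempfile'

# Pared-down merge: every input file contributes server_name => config
# pairs, and the result is written to a tempfile as {"mcpServers" => {...}}.
def combine_sketch(paths)
  servers = {}
  paths.each do |path|
    content = JSON.parse(File.read(path))
    if content.is_a?(Hash) && content["mcpServers"].is_a?(Hash)
      servers.merge!(content["mcpServers"])            # wrapped form (Case A)
    else
      servers[File.basename(path, ".json")] = content  # bare definition
    end
  end
  out = Tempfile.new(['combined', '.json'])
  out.write(JSON.pretty_generate({ "mcpServers" => servers }))
  out.close
  out.path
end

# Usage: one file per supported shape.
a = Tempfile.new(['a', '.json'])
a.write(JSON.generate("mcpServers" => { "one" => { "command" => "srv1" } }))
a.close

b = Tempfile.new(['b', '.json'])
b.write(JSON.generate("command" => "srv2"))
b.close

combined = JSON.parse(File.read(combine_sketch([a.path, b.path])))
# "one" plus a name derived from b's tempfile basename
```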
data/lib/aia/prompt_handler.rb
CHANGED
@@ -48,14 +48,8 @@ module AIA
         erb_flag:   AIA.config.erb,
         envar_flag: AIA.config.shell
       )
-
-      # Ensure parameters are extracted even if no history file exists
-      if prompt && prompt.parameters.empty?
-        # Force re-reading of the prompt text to extract parameters
-        # This ensures parameters are found even without a .json file
-        prompt.reload
-      end
 
+      # Parameters should be extracted during initialization or to_s
       return prompt if prompt
     else
       puts "Warning: Invalid prompt ID or file not found: #{prompt_id}"
@@ -147,8 +141,11 @@ module AIA
     end
 
 
+    # FIXME: original implementation used a search_proc to look into the content of the prompt
+    # files. The use of the select statement does not work.
     def search_prompt_id_with_fzf(initial_query)
-      prompt_files = Dir.glob(File.join(@prompts_dir, "*.txt"))
+      prompt_files = Dir.glob(File.join(@prompts_dir, "*.txt"))
+                        .map { |file| File.basename(file, ".txt") }
       fzf = AIA::Fzf.new(
         list: prompt_files,
         directory: @prompts_dir,
@@ -160,7 +157,8 @@ module AIA
     end
 
     def search_role_id_with_fzf(initial_query)
-      role_files = Dir.glob(File.join(@roles_dir, "*.txt"))
+      role_files = Dir.glob(File.join(@roles_dir, "*.txt"))
+                      .map { |file| File.basename(file, ".txt") }
       fzf = AIA::Fzf.new(
         list: role_files,
         directory: @prompts_dir,