aia 0.9.0 → 0.9.2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
1
1
  ---
2
2
  SHA256:
3
- metadata.gz: 489e116b8ebfbfed10c2cd0d699a4e37294719d39899e65d4fec6f226dadf36d
4
- data.tar.gz: 35f9488983cd2944c8638d51aacd31ba606ee3d01fcada61673a47aaba3f7645
3
+ metadata.gz: 543eb535dae7c828b1dbd2a2dd981e566d27b1c5d8ad4c81d1fd21566015f175
4
+ data.tar.gz: 021bf728dd9082c9fa1aca73d72001e4ed512d2c7ddd23cff044d5d9447a8d2d
5
5
  SHA512:
6
- metadata.gz: f4bb461ff9f7ac7c775cd72be754e1698cf057033f3ae6100d9ad7d1f348de96116af1f4a7271adf32b399147ad572f2c9bb198b27bbc7909f88548d4124f577
7
- data.tar.gz: 3312f7ab73822ab0c315f0842bde82223d617b08bcc0b0734ff502c58494ef9c608433a9efcff1c7923a4fa89537eee4df35247d85fa96a34cc5f0b5b71177f8
6
+ metadata.gz: 5b108c87dc2c82c0c347efa2b084ce1274b412324c33c2450a8174a655dca3be9c14e34b83347b90a9a7c8817b0eadb90a17f81536234eb38b8d09377f4a3221
7
+ data.tar.gz: 43bbef3a40562333650251dde1ea5868b4cef29ed7b35c447429a75552899a6a73989ae87546320e8a51370675192f1d18d71c56eef8b98985beb863341f4869
data/.version CHANGED
@@ -1 +1 @@
1
- 0.9.0
1
+ 0.9.2
data/CHANGELOG.md CHANGED
@@ -1,11 +1,21 @@
1
1
  # Changelog
2
2
  ## [Unreleased]
3
- ### [0.9.0] WIP
4
- - Adding MCP Client support
5
- - removed the CLI options --erb and --shell but kept them in the config file with a default of true for both
6
-
7
3
  ## Released
8
4
 
5
+ ### [0.9.2] 2025-05-18
6
+ - removing the MCP experiment
7
+ - adding support for RubyLLM::Tool usage in place of the MCP stuff
8
+ - updated prompt_manager to v0.5.4 which fixed shell integration problem
9
+
10
+ ### [0.9.1] 2025-05-16
11
+ - rethink MCP approach in favor of just RubyLLM::Tool
12
+ - fixed problem with //clear
13
+ - fixed a problem with a priming prompt in a chat loop
14
+
15
+ ### [0.9.0] 2025-05-13
16
+ - Adding experimental MCP Client support
17
+ - removed the CLI options --erb and --shell but kept them in the config file with a default of true for both
18
+
9
19
  ### [0.8.6] 2025-04-23
10
20
  - Added a client adapter for the ruby_llm gem
11
21
  - Added the adapter config item and the --adapter option to select at runtime which client to use ai_client or ruby_llm
data/README.md CHANGED
@@ -11,23 +11,14 @@
11
11
  [/______\] / * embedded directives * shell integration
12
12
  / \__AI__/ \/ * embedded Ruby * history management
13
13
  / /__\ * interactive chat * prompt workflows
14
- (\ /____\ # Experimental support of MCP servers via ruby_llm extension
14
+ (\ /____\ # supports RubyLLM::Tool integration
15
15
  ```
16
16
 
17
17
  AIA leverages the [prompt_manager gem](https://github.com/madbomber/prompt_manager) to manage prompts. It utilizes the [CLI tool fzf](https://github.com/junegunn/fzf) for prompt selection.
18
18
 
19
- **Most Recent Change**: Refer to the [Changelog](CHANGELOG.md)
20
- - Replaced `ai_client` with `ruby_llm` gem
21
- - Added --adapter w/default ruby_llm in case there is a need to consider something else
22
- - //include directive now supports web URLs
23
- - //webpage insert web URL content as markdown into context
24
-
25
19
  **Wiki**: [Checkout the AIA Wiki](https://github.com/MadBomber/aia/wiki)
26
20
 
27
- **Notable Recent Changes:**
28
- - **RubyLLM Integration:** AIA now uses the `ruby_llm` gem as a replacement for ai_client. The option `--adapter ruby_llm` is the default. The `--adapter` option exists in case an alternative to ruby_llm is ever needed. Why am I replacing my own gem ai_client with the ruby_llm gem? Because it is better, newer, more elegant, and will easily support some of the new features I have planned for AIA.
29
-
30
- - **MCP Server Support:** AIA now supports Model Context Protocol (MCP) servers through an extension to the ruby_llm gem. This experimental feature allows AIA to interact with various external tools and services through MCP servers using the --mcp and --allowed_tools CLI options.
21
+ **RubyLLM::Tool Support:** AIA now supports the integration of tools for those models that support function callbacks. See the --tools, --allowed_tools and --rejected_tools options. Function callbacks provide dynamic prompt content just like AIA directives, so why have both? AIA predates function callbacks, which makes directives the legacy mechanism; more importantly, not all models support function callbacks. That means the old directive capability, shell and ERB integration remain viable ways to provide dynamic extra content to your prompts.
31
22
 
32
23
  <!-- Tocer[start]: Auto-generated, don't remove. -->
33
24
 
@@ -70,13 +61,15 @@ AIA leverages the [prompt_manager gem](https://github.com/madbomber/prompt_manag
70
61
  - [Development](#development)
71
62
  - [Contributing](#contributing)
72
63
  - [Roadmap](#roadmap)
73
- - [Model Context Protocol (MCP) Support](#model-context-protocol-mcp-support)
74
- - [What is MCP?](#what-is-mcp)
75
- - [Using MCP Servers with AIA](#using-mcp-servers-with-aia)
76
- - [Focusing on Specific Tools with --allowed_tools](#focusing-on-specific-tools-with---allowed_tools)
77
- - [How It Works](#how-it-works)
78
- - [Current Limitations](#current-limitations)
79
- - [Sounds like directives](#sounds-like-directives)
64
+ - [RubyLLM::Tool Support](#rubyllmtool-support)
65
+ - [What Are RubyLLM Tools?](#what-are-rubyllm-tools)
66
+ - [How to Use Tools](#how-to-use-tools)
67
+ - [`--tools` Option](#--tools-option)
68
+ - [Filtering the tool paths](#filtering-the-tool-paths)
69
+ - [`--at`, `--allowed_tools` Option](#--at---allowed_tools-option)
70
+ - [`--rt`, `--rejected_tools` Option](#--rt---rejected_tools-option)
71
+ - [Creating Your Own Tools](#creating-your-own-tools)
72
+ - [MCP Supported](#mcp-supported)
80
73
  - [License](#license)
81
74
 
82
75
  <!-- Tocer[finish]: Auto-generated, don't remove. -->
@@ -104,7 +97,6 @@ The following table provides a comprehensive list of configuration options, thei
104
97
  | log_file | -l, --log_file | ~/.prompts/_prompts.log | AIA_LOG_FILE |
105
98
  | markdown | --md, --markdown | true | AIA_MARKDOWN |
106
99
  | max_tokens | --max_tokens | 2048 | AIA_MAX_TOKENS |
107
- | mcp_servers | --mcp | [] | AIA_MCP_SERVERS |
108
100
  | model | -m, --model | gpt-4o-mini | AIA_MODEL |
109
101
  | next | -n, --next | nil | AIA_NEXT |
110
102
  | out_file | -o, --out_file | temp.md | AIA_OUT_FILE |
@@ -124,8 +116,11 @@ The following table provides a comprehensive list of configuration options, thei
124
116
  | system_prompt | --system_prompt | | AIA_SYSTEM_PROMPT |
125
117
  | temperature | -t, --temperature | 0.7 | AIA_TEMPERATURE |
126
118
  | terse | --terse | false | AIA_TERSE |
119
+ | tool_paths | --tools | [] | AIA_TOOL_PATHS |
120
+ | allowed_tools | --at --allowed_tools | nil | AIA_ALLOWED_TOOLS |
121
+ | rejected_tools | --rt --rejected_tools | nil | AIA_REJECTED_TOOLS |
127
122
  | top_p | --top_p | 1.0 | AIA_TOP_P |
128
- | transcription_model | --tm, --transcription_model | whisper-1 | AIA_TRANSCRIPTION_MODEL |
123
+ | transcription_model | --tm, --transcription_model | whisper-1 | AIA_TRANSCRIPTION_MODEL |
129
124
  | verbose | -v, --verbose | false | AIA_VERBOSE |
130
125
  | voice | --voice | alloy | AIA_VOICE |
131
126
 
@@ -679,56 +674,83 @@ I'm not happy with the way where some command line options for external command
679
674
  - continue integration of the ruby_llm gem
680
675
  - support for Model Context Protocol
681
676
 
682
- ## Model Context Protocol (MCP) Support
677
+ ## RubyLLM::Tool Support
678
+
679
+ AIA supports function calling capabilities through the `RubyLLM::Tool` framework, enabling LLMs to execute custom functions during a chat session.
680
+
681
+ ### What Are RubyLLM Tools?
683
682
 
684
- AIA now includes experimental support for Model Context Protocol (MCP) servers through an extension to the `ruby_llm` gem, which utilizes the `ruby-mcp-client` gem for communication with MCP servers.
683
+ Tools (or functions) allow LLMs to perform actions beyond generating text, such as:
685
684
 
686
- ### What is MCP?
685
+ - Retrieving real-time information
686
+ - Executing system commands
687
+ - Accessing external APIs
688
+ - Performing calculations
687
689
 
688
- Model Context Protocol (MCP) is a standardized way for AI models to interact with external tools and services. It allows AI assistants to perform actions beyond mere text generation, such as searching the web, accessing databases, or interacting with APIs.
690
+ Check out the [examples/tools](examples/tools) directory, which contains several ready-to-use tool implementations you can use as references.
689
691
 
690
- ### Using MCP Servers with AIA
692
+ ### How to Use Tools
691
693
 
692
- To use MCP with AIA, you can specify one or more MCP servers using the `--mcp` CLI option:
694
+ AIA provides three CLI options to manage function calling:
695
+
696
+ #### `--tools` Option
697
+
698
+ Specifies where to find tool implementations:
693
699
 
694
700
  ```bash
695
- aia --chat --mcp server1,server2
701
+ # Load tools from multiple sources
702
+ --tools /path/to/tools/directory,other/tools/dir,my_tool.rb
703
+
704
+ # Or use multiple --tools flags
705
+ --tools my_first_tool.rb --tools /tool_repo/tools
696
706
  ```
697
707
 
698
- This will connect your AIA session to the specified MCP servers, allowing the AI model to access tools provided by those servers.
708
+ Each path can be:
699
709
 
700
- ### Focusing on Specific Tools with --allowed_tools
710
+ - A Ruby file implementing a `RubyLLM::Tool` subclass
711
+ - A directory containing tool implementations (all Ruby files in that directory will be loaded)
701
712
 
702
- If you want to limit which MCP tools are available to the AI model during your session, you can use the `--allowed_tools` option:
713
+ Supporting files for tools can be placed in the same directory or subdirectories.
714
+
715
+ ### Filtering the tool paths
716
+
717
+ The --tools option must be given exact relative or absolute paths to the tool files AIA should use for function callbacks. If you specify directories, you may need to filter the resulting set of tools, allowing some or rejecting others based on some indicator in their file names. The following two options let you specify multiple sub-strings to match the tool paths against. For example, you might be comparing one version of a tool against another, with filenames carrying version suffixes like tool_v1.rb and tool_v2.rb. Using the allowed and rejected filters, you can choose one or the other when loading an entire directory of tools.
718
+
719
+ #### `--at`, `--allowed_tools` Option
720
+
721
+ Filters which tools to make available when loading from directories:
703
722
 
704
723
  ```bash
705
- aia --chat --mcp server1 --allowed_tools tool1,tool2,tool3
724
+ # Only allow tools with 'test' in their filename
725
+ --tools my_tools_directory --allowed_tools test
706
726
  ```
707
727
 
708
- This restricts the AI model to only use the specified tools, which can be helpful for:
728
+ This is useful when you have many tools but only want to use specific ones in a session.
709
729
 
710
- - Focusing the AI on a specific task
711
- - Preventing the use of unnecessary or potentially harmful tools
712
- - Reducing the cognitive load on the AI by limiting available options
730
+ #### `--rt`, `--rejected_tools` Option
713
731
 
714
- ### How It Works
732
+ Excludes specific tools:
715
733
 
716
- Behind the scenes, AIA leverages a new `with_mcp` method in `RubyLLM::Chat` to set up the MCP tools for chat sessions. This method processes the hash-structured MCP tool definitions and makes them available for the AI to use during the conversation.
734
+ ```bash
735
+ # Exclude tools with '_v1' in their filename
736
+ --tools my_tools_directory --rejected_tools _v1
737
+ ```
717
738
 
718
- When you specify `--mcp` and optionally `--allowed_tools`, AIA configures the underlying `ruby_llm` adapter to connect to the appropriate MCP servers and expose only the tools you've explicitly allowed.
739
+ Ideal for excluding older versions or temporarily disabling specific tools.
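The allowed/rejected filtering described above is plain substring matching against each tool path. The following standalone Ruby sketch shows the same behavior; `filter_tool_paths` is a hypothetical helper for illustration, not AIA's actual `Config.load_tools` code:

```ruby
# Sketch of --allowed_tools / --rejected_tools substring filtering.
def filter_tool_paths(paths, allowed: nil, rejected: nil)
  # Keep only paths containing at least one allowed substring.
  paths = paths.select { |p| allowed.any? { |a| p.include?(a) } } unless allowed.nil?
  # Drop paths containing any rejected substring.
  paths = paths.reject { |p| rejected.any? { |r| p.include?(r) } } unless rejected.nil?
  paths
end

paths = ["tools/search_v1.rb", "tools/search_v2.rb", "tools/calc.rb"]

filter_tool_paths(paths, allowed: ["search"])
# => ["tools/search_v1.rb", "tools/search_v2.rb"]

filter_tool_paths(paths, rejected: ["_v1"])
# => ["tools/search_v2.rb", "tools/calc.rb"]
```

Because the match is a bare substring test, a filter like `_v1` also matches any path that merely contains that text, so choose distinctive substrings.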
719
740
 
720
- ### Current Limitations
741
+ ### Creating Your Own Tools
721
742
 
722
- As this is an experimental feature:
743
+ To create a custom tool:
723
744
 
724
- - Not all AI models support MCP tools
725
- - The available tools may vary depending on the MCP server
726
- - Tool behavior may change as the MCP specification evolves
745
+ 1. Create a Ruby file that subclasses `RubyLLM::Tool`
746
+ 2. Define the tool's parameters and functionality
747
+ 3. Use the `--tools` option to load it in your AIA session
727
748
 
728
- ### Sounds like directives
749
+ For implementation details, refer to the [examples in the repository](examples/tools) or the RubyLLM documentation.
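As a minimal illustration of those three steps, here is a hypothetical `add_numbers` tool (not part of AIA or ruby_llm). The small stand-in `RubyLLM::Tool` class exists only so the sketch can run without the ruby_llm gem installed; with the gem present, the `require` alone suffices:

```ruby
begin
  require "ruby_llm/tool"
rescue LoadError
  # Minimal stub mimicking the Tool DSL so this sketch runs without the gem.
  module RubyLLM
    class Tool
      def self.description(text = nil)
        @description = text
      end

      def self.param(name, desc: nil)
        (@params ||= {})[name] = desc
      end
    end
  end
end

module Tools
  class AddNumbers < RubyLLM::Tool
    description "Add two numbers and return their sum."
    param :a, desc: "First addend"
    param :b, desc: "Second addend"

    def execute(a:, b:)
      a + b
    rescue => e
      { error: e.message }
    end
  end
end
```

Load it with `--tools path/to/add_numbers.rb` and models that support function callbacks can invoke it during a chat session.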
729
750
 
730
- Anthropic's MCP standard came months after the aia appeared with its inherent support of dynamic prompt context enhancements via directives, shell integration and ERB support. Yes you can argue that if you have a working MCP client and access to appropriate MCP servers then the dynamic prompt context enhancements via directives, shell integration and ERB support are superfluous. I don't agree with that argument. I think the dynamic prompt context enhancements via directives, shell integration and ERB support are powerful tools in and of themselves. I think they are a better way to do things.
751
+ ## MCP Supported
731
752
 
753
+ Abandon all hope of seeing an MCP client added to AIA. Maybe sometime in the future there will be a new gem "ruby_llm-mcp" that implements an MCP client as a native RubyLLM::Tool subclass. If that ever happens, you would use it the same way you use any other RubyLLM::Tool subclass, which AIA now supports.
732
754
 
733
755
  ## License
734
756
 
@@ -0,0 +1,26 @@
1
+ # experiments/ai_misc/coding_agent_with_ruby_llm/tools/edit_file.rb
2
+
3
+ require "ruby_llm/tool"
4
+
5
+ module Tools
6
+ class EditFile < RubyLLM::Tool
7
+ description <<~DESCRIPTION
8
+ Make edits to a text file.
9
+
10
+ Replaces 'old_str' with 'new_str' in the given file.
11
+ 'old_str' and 'new_str' MUST be different from each other.
12
+
13
+ If the file specified with path doesn't exist, it will be created.
14
+ DESCRIPTION
15
+ param :path, desc: "The path to the file"
16
+ param :old_str, desc: "Text to search for - must match exactly and must only have one match exactly"
17
+ param :new_str, desc: "Text to replace old_str with"
18
+
19
+ def execute(path:, old_str:, new_str:)
20
+ content = File.exist?(path) ? File.read(path) : ""
21
+ File.write(path, content.sub(old_str, new_str))
22
+ rescue => e
23
+ { error: e.message }
24
+ end
25
+ end
26
+ end
@@ -0,0 +1,18 @@
1
+
2
+ # experiments/ai_misc/coding_agent_with_ruby_llm/tools/list_files.rb
3
+ #
4
+ require "ruby_llm/tool"
5
+
6
+ module Tools
7
+ class ListFiles < RubyLLM::Tool
8
+ description "List files and directories at a given path. If no path is provided, lists files in the current directory."
9
+ param :path, desc: "Optional relative path to list files from. Defaults to current directory if not provided."
10
+
11
+ def execute(path: Dir.pwd)
12
+ Dir.glob(File.join(path, "*"))
13
+ .map { |filename| File.directory?(filename) ? "#{filename}/" : filename }
14
+ rescue => e
15
+ { error: e.message }
16
+ end
17
+ end
18
+ end
@@ -0,0 +1,16 @@
1
+ # experiments/ai_misc/coding_agent_with_ruby_llm/tools/read_file.rb
2
+ #
3
+ require "ruby_llm/tool"
4
+
5
+ module Tools
6
+ class ReadFile < RubyLLM::Tool
7
+ description "Read the contents of a given relative file path. Use this when you want to see what's inside a file. Do not use this with directory names."
8
+ param :path, desc: "The relative path of a file in the working directory."
9
+
10
+ def execute(path:)
11
+ File.read(path)
12
+ rescue => e
13
+ { error: e.message }
14
+ end
15
+ end
16
+ end
@@ -0,0 +1,21 @@
1
+ # experiments/ai_misc/coding_agent_with_ruby_llm/tools/run_shell_command.rb
2
+
3
+ require "ruby_llm/tool"
4
+
5
+ module Tools
6
+ class RunShellCommand < RubyLLM::Tool
7
+ description "Execute a linux shell command"
8
+ param :command, desc: "The command to execute"
9
+
10
+ def execute(command:)
11
+ puts "AI wants to execute the following shell command: '#{command}'"
12
+ print "Do you want to execute it? (y/n) "
13
+ response = gets.chomp
14
+ return { error: "User declined to execute the command" } unless response == "y"
15
+
16
+ `#{command}`
17
+ rescue => e
18
+ { error: e.message }
19
+ end
20
+ end
21
+ end
@@ -66,11 +66,6 @@ module AIA
66
66
  client_model = AIA.client.model.id # RubyLLM::Model instance
67
67
  end
68
68
 
69
- debug_me('== dynamic model change? =='){[
70
- :client_model,
71
- "AIA.config.model"
72
- ]}
73
-
74
69
  # when adapter is ruby_llm must use model.id as the name
75
70
  unless AIA.config.model.downcase.include?(client_model.downcase)
76
71
  # FIXME: assumes that the adapter is AiClient. It might be RUbyLLM
data/lib/aia/config.rb CHANGED
@@ -29,9 +29,10 @@ module AIA
29
29
  role: '',
30
30
  system_prompt: '',
31
31
 
32
- # MCP configuration
33
- mcp_servers: [],
34
- allowed_tools: nil, # nil means all tools are allowed; otherwise an Array of Strings which are the tool names
32
+ # Tools
33
+ allowed_tools: nil, # nil means all tools are allowed; otherwise an Array of Strings which are the tool names
34
+ rejected_tools: nil, # nil means no tools are rejected
35
+ tool_paths: [], # Strings - absolute and relative to tools
35
36
 
36
37
  # Flags
37
38
  markdown: true,
@@ -97,6 +98,9 @@ module AIA
97
98
  )
98
99
 
99
100
  tailor_the_config(config)
101
+ load_tools(config)
102
+
103
+ config
100
104
  end
101
105
 
102
106
 
@@ -129,7 +133,6 @@ module AIA
129
133
  exit 1
130
134
  end
131
135
 
132
-
133
136
  unless config.role.empty?
134
137
  unless config.roles_prefix.empty?
135
138
  unless config.role.start_with?(config.roles_prefix)
@@ -200,16 +203,44 @@ module AIA
200
203
  PromptManager::Prompt.parameter_regex = Regexp.new(config.parameter_regex)
201
204
  end
202
205
 
203
- debug_me{[ 'config.mcp_servers' ]}
206
+ config
207
+ end
208
+
209
+
210
+ def self.load_tools(config)
211
+ return if config.tool_paths.empty?
212
+
213
+ exit_on_error = false
214
+
215
+ unless config.allowed_tools.nil?
216
+ config.tool_paths.select! do |path|
217
+ config.allowed_tools.any? { |allowed| path.include?(allowed) }
218
+ end
219
+ end
220
+
221
+ unless config.rejected_tools.nil?
222
+ config.tool_paths.reject! do |path|
223
+ config.rejected_tools.any? { |rejected| path.include?(rejected) }
224
+ end
225
+ end
204
226
 
205
- unless config.mcp_servers.empty?
206
- # create a single JSON file contain all of the MCP server definitions specified my the --mcp option
207
- config.mcp_servers = combine_mcp_server_json_files config.mcp_servers
227
+ config.tool_paths.each do |tool_path|
228
+ begin
229
+ # expands path based on PWD
230
+ absolute_tool_path = File.expand_path(tool_path)
231
+ require(absolute_tool_path)
232
+ rescue => e
233
+ SYSERR.puts "Error loading tool '#{tool_path}' #{e.message}"
234
+ exit_on_error = true
235
+ end
208
236
  end
209
237
 
238
+ exit(1) if exit_on_error
239
+
210
240
  config
211
241
  end
212
242
 
243
+
213
244
  # envar values are always String object so need other config
214
245
  # layers to know the prompter type for each key's value
215
246
  def self.envar_options(default, cli_config)
@@ -435,31 +466,36 @@ module AIA
435
466
  config.require_libs = libs.split(',')
436
467
  end
437
468
 
438
- opts.on("--mcp FILE", "Add MCP server configuration from JSON file. Can be specified multiple times.") do |file|
439
- # debug_me FIXME ruby-mcp-client is looking for a single JSON file that
440
- # could contain multiple server definitions that looks like this:
441
- # {
442
- # "mcpServers": {
443
- # "server one": { ... },
444
- # "server two": { ... }, ....
445
- # }
446
- # }
447
- # FIXME: need to rurn multiple JSON files into one.
448
- if AIA.good_file?(file)
449
- config.mcp_servers ||= []
450
- config.mcp_servers << file
451
- begin
452
- server_config = JSON.parse(File.read(file))
453
- config.mcp_servers_config ||= []
454
- config.mcp_servers_config << server_config
455
- rescue JSON::ParserError => e
456
- STDERR.puts "Error parsing MCP server config file #{file}: #{e.message}"
469
+ opts.on("--tools PATH_LIST", "Add a tool(s)") do |a_path_list|
470
+ config.tool_paths ||= []
471
+
472
+ if a_path_list.empty?
473
+ STDERR.puts "No list of paths for --tools option"
474
+ exit 1
475
+ else
476
+ paths = a_path_list.split(',').map(&:strip).uniq
477
+ end
478
+
479
+ paths.each do |a_path|
480
+ if File.exist?(a_path)
481
+ if File.file?(a_path)
482
+ if '.rb' == File.extname(a_path)
483
+ config.tool_paths << a_path
484
+ else
485
+ STDERR.puts "file should have *.rb extension: #{a_path}"
486
+ exit 1
487
+ end
488
+ elsif File.directory?(a_path)
489
+ rb_files = Dir.glob(File.join(a_path, '**', '*.rb'))
490
+ config.tool_paths += rb_files
491
+ end
492
+ else
493
+ STDERR.puts "file/dir path is not valid: #{a_path}"
457
494
  exit 1
458
495
  end
459
- else
460
- STDERR.puts "MCP server config file not found: #{file}"
461
- exit 1
462
496
  end
497
+
498
+ config.tool_paths.uniq!
463
499
  end
464
500
 
465
501
  opts.on("--at", "--allowed_tools TOOLS_LIST", "Allow only these tools to be used") do |tools_list|
@@ -472,6 +508,17 @@ module AIA
472
508
  config.allowed_tools.uniq!
473
509
  end
474
510
  end
511
+
512
+ opts.on("--rt", "--rejected_tools TOOLS_LIST", "Reject these tools") do |tools_list|
513
+ config.rejected_tools ||= []
514
+ if tools_list.empty?
515
+ STDERR.puts "No list of tool names provided for --rejected_tools option"
516
+ exit 1
517
+ else
518
+ config.rejected_tools += tools_list.split(',').map(&:strip)
519
+ config.rejected_tools.uniq!
520
+ end
521
+ end
475
522
  end
476
523
 
477
524
  args = ARGV.dup
@@ -557,95 +604,5 @@ module AIA
557
604
  File.write(file, content)
558
605
  puts "Config successfully dumped to #{file}"
559
606
  end
560
-
561
-
562
- # Combine multiple MCP server JSON files into a single file
563
- def self.combine_mcp_server_json_files(file_paths)
564
- raise ArgumentError, "No JSON files provided" if file_paths.nil? || file_paths.empty?
565
-
566
- # The output will have only one top-level key: "mcpServers"
567
- mcp_servers = {} # This will store all collected server_name => server_config pairs
568
-
569
- file_paths.each do |file_path|
570
- file_content = JSON.parse(File.read(file_path))
571
- # Clean basename, e.g., "filesystem.json" -> "filesystem", "foo.json.erb" -> "foo"
572
- cleaned_basename = File.basename(file_path).sub(/\.json\.erb$/, '').sub(/\.json$/, '')
573
-
574
- if file_content.is_a?(Hash)
575
- if file_content.key?("mcpServers") && file_content["mcpServers"].is_a?(Hash)
576
- # Case A: {"mcpServers": {"name1": {...}, "name2": {...}}}
577
- file_content["mcpServers"].each do |server_name, server_data|
578
- if mcp_servers.key?(server_name)
579
- STDERR.puts "Warning: Duplicate MCP server name '#{server_name}' found. Overwriting with definition from #{file_path}."
580
- end
581
- mcp_servers[server_name] = server_data
582
- end
583
- # Check if the root hash itself is a single server definition
584
- elsif is_single_server_definition?(file_content)
585
- # Case B: {"type": "stdio", ...} or {"url": "...", ...}
586
- # Use "name" property from JSON if present, otherwise use cleaned_basename
587
- server_name = file_content["name"] || cleaned_basename
588
- if mcp_servers.key?(server_name)
589
- STDERR.puts "Warning: Duplicate MCP server name '#{server_name}' (from file #{file_path}). Overwriting."
590
- end
591
- mcp_servers[server_name] = file_content
592
- else
593
- # Case D: Fallback for {"custom_name1": {server_config1}, "custom_name2": {server_config2}}
594
- # This assumes top-level keys are server names and values are server configs.
595
- file_content.each do |server_name, server_data|
596
- if server_data.is_a?(Hash) && is_single_server_definition?(server_data)
597
- if mcp_servers.key?(server_name)
598
- STDERR.puts "Warning: Duplicate MCP server name '#{server_name}' found in #{file_path}. Overwriting."
599
- end
600
- mcp_servers[server_name] = server_data
601
- else
602
- STDERR.puts "Warning: Unrecognized structure for key '#{server_name}' in #{file_path}. Value is not a valid server definition. Skipping."
603
- end
604
- end
605
- end
606
- elsif file_content.is_a?(Array)
607
- # Case C: [ {server_config1}, {server_config2_with_name} ]
608
- file_content.each_with_index do |server_data, index|
609
- if server_data.is_a?(Hash) && is_single_server_definition?(server_data)
610
- # Use "name" property from JSON if present, otherwise generate one
611
- server_name = server_data["name"] || "#{cleaned_basename}_#{index}"
612
- if mcp_servers.key?(server_name)
613
- STDERR.puts "Warning: Duplicate MCP server name '#{server_name}' (from array in #{file_path}). Overwriting."
614
- end
615
- mcp_servers[server_name] = server_data
616
- else
617
- STDERR.puts "Warning: Unrecognized item in array in #{file_path} at index #{index}. Skipping."
618
- end
619
- end
620
- else
621
- STDERR.puts "Warning: Unrecognized JSON structure in #{file_path}. Skipping."
622
- end
623
- end
624
-
625
- # Create the final output structure
626
- output = {"mcpServers" => mcp_servers}
627
- temp_file = Tempfile.new(['combined', '.json'])
628
- temp_file.write(JSON.pretty_generate(output))
629
- temp_file.close
630
-
631
- temp_file.path
632
- end
633
-
634
- # Helper method to determine if a hash represents a valid MCP server definition
635
- def self.is_single_server_definition?(config)
636
- return false unless config.is_a?(Hash)
637
- type = config['type']
638
- if type
639
- return true if type == 'stdio' && config.key?('command')
640
- return true if type == 'sse' && config.key?('url')
641
- # Potentially other explicit types if they exist in MCP
642
- return false # Known type but missing required fields for it, or unknown type
643
- else
644
- # Infer type
645
- return true if config.key?('command') || config.key?('args') || config.key?('env') # stdio
646
- return true if config.key?('url') # sse
647
- end
648
- false
649
- end
650
607
  end
651
608
  end
@@ -41,7 +41,23 @@ module AIA
41
41
  @context = [@context.first]
42
42
  else
43
43
  @context = []
44
- AIA.config.client.clear_context
44
+ end
45
+
46
+ # Attempt to clear the LLM client's context as well
47
+ begin
48
+ if AIA.config.client && AIA.config.client.respond_to?(:clear_context)
49
+ AIA.config.client.clear_context
50
+ end
51
+
52
+ if AIA.config.respond_to?(:llm) && AIA.config.llm && AIA.config.llm.respond_to?(:clear_context)
53
+ AIA.config.llm.clear_context
54
+ end
55
+
56
+ if defined?(RubyLLM) && RubyLLM.respond_to?(:chat) && RubyLLM.chat.respond_to?(:clear_history)
57
+ RubyLLM.chat.clear_history
58
+ end
59
+ rescue => e
60
+ SYSERR.puts "ERROR: context_manager clear_context error #{e.message}"
45
61
  end
46
62
  end
47
63
 
@@ -51,14 +51,28 @@ module AIA
51
51
  end
52
52
 
53
53
  def directive?(a_string)
54
- a_string.strip.start_with?(PromptManager::Prompt::DIRECTIVE_SIGNAL)
54
+ # Handle RubyLLM::Message objects by extracting their content first
55
+ content = if a_string.is_a?(RubyLLM::Message)
56
+ a_string.content rescue a_string.to_s
57
+ else
58
+ a_string.to_s
59
+ end
60
+
61
+ content.strip.start_with?(PromptManager::Prompt::DIRECTIVE_SIGNAL)
55
62
  end
56
63
 
57
64
  # Used with the chat loop to allow user to enter a single directive
58
65
  def process(a_string, context_manager)
59
66
  return a_string unless directive?(a_string)
60
67
 
61
- key = a_string.strip
68
+ # Handle RubyLLM::Message objects by extracting their content first
69
+ content = if a_string.is_a?(RubyLLM::Message)
70
+ a_string.content rescue a_string.to_s
71
+ else
72
+ a_string.to_s
73
+ end
74
+
75
+ key = content.strip
62
76
  sans_prefix = key[@prefix_size..]
63
77
  args = sans_prefix.split(' ')
64
78
  method_name = args.shift.downcase
@@ -1,14 +1,12 @@
1
1
  # lib/aia/ruby_llm_adapter.rb
2
2
 
3
3
  require 'ruby_llm'
4
- require 'mcp_client'
5
4
 
6
5
  module AIA
7
6
  class RubyLLMAdapter
8
- def initialize
9
-
10
- debug_me('=== RubyLLMAdapter ===')
7
+ attr_reader :tools
11
8
 
9
+ def initialize
12
10
  @model = AIA.config.model
13
11
  model_info = extract_model_parts(@model)
14
12
 
@@ -26,17 +24,13 @@ module AIA
26
24
  config.bedrock_session_token = ENV.fetch('AWS_SESSION_TOKEN', nil)
27
25
  end
28
26
 
29
- debug_me{[ :model_info ]}
30
-
31
- mcp_client, mcp_tools = generate_mcp_tools(model_info[:provider])
32
-
33
- debug_me{[ :mcp_tools ]}
27
+ @chat = RubyLLM.chat(model: model_info[:model])
34
28
 
35
- if mcp_tools && !mcp_tools.empty?
36
- RubyLLM::Chat.with_mcp(client: mcp_client, call_tool_method: :call_tool, tools: mcp_tools)
29
+ @tools = ObjectSpace.each_object(Class).select do |klass|
30
+ klass < RubyLLM::Tool
37
31
  end
38
32
 
39
- @chat = RubyLLM.chat(model: model_info[:model])
33
+ @chat.with_tools(*tools) unless tools.empty?
40
34
  end
41
35
 
42
36
  def chat(prompt)
@@ -74,7 +68,6 @@ module AIA
74
68
  end
75
69
 
76
70
  def method_missing(method, *args, &block)
77
- debug_me(tag: '== missing ==', levels: 25){[ :method, :args ]}
78
71
  if @chat.respond_to?(method)
79
72
  @chat.public_send(method, *args, &block)
80
73
  else
@@ -82,45 +75,51 @@ module AIA
82
75
  end
83
76
  end
84
77
 
78
+ # Clear the chat context/history
79
+ # Needed for the //clear directive
80
+ def clear_context
81
+ begin
82
+ # Option 1: Directly clear the messages array in the current chat object
83
+ if @chat.instance_variable_defined?(:@messages)
84
+ old_messages = @chat.instance_variable_get(:@messages)
+ # Force a completely empty array, not just attempting to clear it
+ @chat.instance_variable_set(:@messages, [])
+ end
+
+ # Option 2: Force RubyLLM to create a new chat instance at the global level
+ # This ensures any shared state is reset
+ model_info = extract_model_parts(@model)
+ RubyLLM.instance_variable_set(:@chat, nil) if RubyLLM.instance_variable_defined?(:@chat)
+
+ # Option 3: Create a completely fresh chat instance for this adapter
+ @chat = nil # First nil it to help garbage collection
+ @chat = RubyLLM.chat(model: model_info[:model])
+
+ # Option 4: Call official clear_history method if it exists
+ if @chat.respond_to?(:clear_history)
+ @chat.clear_history
+ end
+
+ # Option 5: If chat has messages, force set it to empty again as a final check
+ if @chat.instance_variable_defined?(:@messages) && !@chat.instance_variable_get(:@messages).empty?
+ @chat.instance_variable_set(:@messages, [])
+ end
+
+ # Final verification
+ new_messages = @chat.instance_variable_defined?(:@messages) ? @chat.instance_variable_get(:@messages) : []
+
+ return "Chat context successfully cleared."
+ rescue => e
+ return "Error clearing chat context: #{e.message}"
+ end
+ end
+
  def respond_to_missing?(method, include_private = false)
  @chat.respond_to?(method) || super
  end
 
  private
 
- # Generate an array of MCP tools, filtered and formatted for the correct provider.
- # @param config [OpenStruct] the config object containing mcp_servers, allowed_tools, and model
- # @return [Array<Hash>, nil] the filtered and formatted MCP tools or nil if no tools
- def generate_mcp_tools(provider)
- return [nil, nil] unless AIA.config.mcp_servers && !AIA.config.mcp_servers.empty?
-
- debug_me('=== generate_mcp_tools ===')
-
- # AIA.config.mcp_servers is now a path to the combined JSON file
- mcp_client = MCPClient.create_client(server_definition_file: AIA.config.mcp_servers)
- debug_me
- all_tools = mcp_client.list_tools(cache: false).map(&:name)
- debug_me
- allowed = AIA.config.allowed_tools
- debug_me
- filtered_tools = allowed.nil? ? all_tools : all_tools & allowed
- debug_me{[ :filtered_tools ]}
-
- debug_me{[ :provider ]}
-
- mcp_tools = if :anthropic == provider.to_sym
- debug_me
- mcp_client.to_anthropic_tools(tool_names: filtered_tools)
- else
- debug_me
- mcp_client.to_openai_tools(tool_names: filtered_tools)
- end
- [mcp_client, mcp_tools]
- rescue => e
- STDERR.puts "ERROR: Failed to generate MCP tools: #{e.message}"
- nil
- end
-
  def extract_model_parts(model_string)
  parts = model_string.split('/')
  parts.map!(&:strip)
data/lib/aia/session.rb CHANGED
@@ -184,8 +184,8 @@ module AIA
  @chat_prompt = PromptManager::Prompt.new(
  id: @chat_prompt_id,
  directives_processor: @directive_processor,
- erb_flag: AIA.config.erb,
- envar_flag: AIA.config.shell,
+ erb_flag: true,
+ envar_flag: true,
  external_binding: binding,
  )
 
@@ -257,6 +257,25 @@ module AIA
  directive_output = @directive_processor.process(follow_up_prompt, @context_manager)
 
  if follow_up_prompt.strip.start_with?('//clear')
+ # The directive processor has called context_manager.clear_context
+ # but we need a more aggressive approach to fully clear all context
+
+ # First, clear the context manager's context
+ @context_manager.clear_context(keep_system_prompt: true)
+
+ # Second, try clearing the client's context
+ if AIA.config.client && AIA.config.client.respond_to?(:clear_context)
+ AIA.config.client.clear_context
+ end
+
+ # Third, completely reinitialize the client to ensure fresh state
+ # This is the most aggressive approach to ensure no context remains
+ begin
+ AIA.config.client = AIA::RubyLLMAdapter.new
+ rescue => e
+ SYSERR.puts "Error reinitializing client: #{e.message}"
+ end
+
  @ui_presenter.display_info("Chat context cleared.")
  next
  elsif directive_output.nil? || directive_output.strip.empty?
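
The layered reset added above (clear the context manager, clear the client, then rebuild the client) can be exercised in isolation. The sketch below uses a stand-in chat object; `FakeChat` and `force_clear_messages` are illustrative names, not part of the aia or ruby_llm APIs:

```ruby
# Stand-in for a chat object that keeps its history in a @messages ivar,
# mirroring the internal state the adapter resets. Illustrative only.
class FakeChat
  def initialize
    @messages = [:system, :user, :assistant]
  end
end

# Apply the same belt-and-braces reset the adapter uses: force the ivar
# empty, prefer an official clear_history method when one exists, then
# read the ivar back to verify the reset actually took.
def force_clear_messages(chat)
  chat.instance_variable_set(:@messages, [])
  chat.clear_history if chat.respond_to?(:clear_history)
  chat.instance_variable_get(:@messages)
end

puts force_clear_messages(FakeChat.new).length  # prints 0
```

The ivar-level reset is the fallback of last resort; when the library later grows an official `clear_history`, the `respond_to?` guard lets that path take precedence.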
data/lib/aia.rb CHANGED
@@ -5,10 +5,8 @@
5
5
  # provides an interface for interacting with AI models and managing prompts.
6
6
 
7
7
  require 'ruby_llm'
8
- require_relative 'extensions/ruby_llm/chat'
9
8
 
10
9
  require 'prompt_manager'
11
- require 'mcp_client'
12
10
 
13
11
  require 'debug_me'
14
12
  include DebugMe
metadata CHANGED
@@ -1,7 +1,7 @@
1
1
  --- !ruby/object:Gem::Specification
2
2
  name: aia
3
3
  version: !ruby/object:Gem::Version
4
- version: 0.9.0
4
+ version: 0.9.2
5
5
  platform: ruby
6
6
  authors:
7
7
  - Dewayne VanHoozer
@@ -57,14 +57,14 @@ dependencies:
  requirements:
  - - ">="
  - !ruby/object:Gem::Version
- version: 0.5.2
+ version: 0.5.4
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
  - !ruby/object:Gem::Version
- version: 0.5.2
+ version: 0.5.4
  - !ruby/object:Gem::Dependency
  name: ruby_llm
  requirement: !ruby/object:Gem::Requirement
@@ -79,20 +79,6 @@ dependencies:
  - - ">="
  - !ruby/object:Gem::Version
  version: 1.2.0
- - !ruby/object:Gem::Dependency
- name: ruby-mcp-client
- requirement: !ruby/object:Gem::Requirement
- requirements:
- - - ">="
- - !ruby/object:Gem::Version
- version: '0'
- type: :runtime
- prerelease: false
- version_requirements: !ruby/object:Gem::Requirement
- requirements:
- - - ">="
- - !ruby/object:Gem::Version
- version: '0'
  - !ruby/object:Gem::Dependency
  name: reline
  requirement: !ruby/object:Gem::Requirement
@@ -302,6 +288,10 @@ files:
  - bin/aia
  - examples/README.md
  - examples/headlines
+ - examples/tools/edit_file.rb
+ - examples/tools/list_files.rb
+ - examples/tools/read_file.rb
+ - examples/tools/run_shell_command.rb
  - justfile
  - lib/aia.rb
  - lib/aia/aia_completion.bash
@@ -321,7 +311,6 @@ files:
  - lib/aia/utility.rb
  - lib/aia/version.rb
  - lib/extensions/openstruct_merge.rb
- - lib/extensions/ruby_llm/chat.rb
  - main.just
  - mcp_servers/README.md
  - mcp_servers/filesystem.json
@@ -351,7 +340,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  - !ruby/object:Gem::Version
  version: '0'
  requirements: []
- rubygems_version: 3.6.8
+ rubygems_version: 3.6.9
  specification_version: 4
  summary: 'AI Assistant: dynamic prompts, shell & Ruby integration, and seamless chat
  workflows.'
data/lib/extensions/ruby_llm/chat.rb DELETED
@@ -1,197 +0,0 @@
- # lib/extensions/ruby_llm/chat.rb
-
- module RubyLLM
- class Chat
- class << self
- # Sets up Model Control Protocol (MCP) tools
- #
- # @param client [instance object] MCP client instance to use
- # @param call_tool_method [Symbol] Method name to use for tool execution
- # @param tools [Array<Hash>] Array of MCP tool definitions
- #
- # @return [self] Returns self for method chaining
- #
- def with_mcp(client:, call_tool_method:, tools:)
- # Validate all required parameters are present
- if client.nil?
- RubyLLM.logger.error "MCP setup failed: client must be provided"
- return clear_mcp_state
- end
-
- if call_tool_method.nil?
- RubyLLM.logger.error "MCP setup failed: call_tool_method must be provided"
- return clear_mcp_state
- end
-
- if tools.nil?
- RubyLLM.logger.error "MCP setup failed: tools must be provided"
- return clear_mcp_state
- end
-
- # Validate call_tool_method type
- unless call_tool_method.is_a?(Symbol) || call_tool_method.is_a?(String)
- RubyLLM.logger.error "MCP setup failed: call_tool_method must be a Symbol or String, got #{call_tool_method.class}"
- return clear_mcp_state
- end
-
- # Validate client responds to the method
- unless client.respond_to?(call_tool_method)
- RubyLLM.logger.error "MCP setup failed: client instance does not respond to call_tool_method #{call_tool_method}"
- return clear_mcp_state
- end
-
- # Set MCP configuration
- @mcp_client = client
- @mcp_call_tool = call_tool_method.to_sym
- @mcp_tools = tools
-
- self
- end
-
- # Get the MCP client instance if configured
- # @return [MCPClient::Client, nil] The MCP client instance or nil if not configured
- def mcp_client
- @mcp_client
- end
-
- # Get the method name to use for tool execution if configured
- # @return [Symbol, nil] The method name or nil if not configured
- def mcp_call_tool
- @mcp_call_tool
- end
-
- # Get the MCP tool definitions if configured
- # @return [Array<Hash>] The MCP tool definitions or empty array if not configured
- def mcp_tools
- @mcp_tools || []
- end
-
- private
-
- # Clear all MCP state and return self
- # @return [self]
- def clear_mcp_state
- @mcp_client = nil
- @mcp_call_tool = nil
- @mcp_tools = []
- self
- end
- end
-
- # Prepend a module to add MCP tool support
- module MCPSupport
- def initialize(...)
- super
- add_mcp_tools
- end
-
- private
-
- def add_mcp_tools
- self.class.mcp_tools.each do |tool_def|
- debug_me{[ :tool_def ]}
- tool_name = tool_def.dig(:function, :name).to_sym
- next if @tools.key?(tool_name) # Skip if local or MCP tool exists with same name
-
- @tools[tool_name] = MCPToolWrapper.new(tool_def)
- end
- end
- end
-
- # Add MCP support to the Chat class
- prepend MCPSupport
- end
-
- # Wraps an MCP tool definition to match the RubyLLM::Tool interface
- class MCPToolWrapper
- def initialize(mcp_tool)
- @mcp_tool = mcp_tool
- end
-
- def name
- @mcp_tool.dig(:function, :name)
- end
-
- def description
- @mcp_tool.dig(:function, :description)
- end
-
- # Simple parameter class that implements the interface expected by RubyLLM::Providers::OpenAI::Tools#param_schema
- class Parameter
- attr_reader :type, :description, :required
-
- def initialize(type, description, required)
- @type = type || 'string'
- @description = description
- @required = required
- end
- end
-
- def parameters
- @parameters ||= begin
- props = @mcp_tool.dig(:function, :parameters, "properties") || {}
- required_params = @mcp_tool.dig(:function, :parameters, "required") || []
-
- # Create Parameter objects with the expected interface
- # The parameter name is the key in the properties hash
- result = {}
- props.each do |param_name, param_def|
- result[param_name.to_sym] = Parameter.new(
- param_def["type"],
- param_def["description"],
- required_params.include?(param_name)
- )
- end
- result
- end
- end
-
- def call(args)
- # Log the tool call with arguments
- RubyLLM.logger.debug "Tool #{name} called with: #{args.inspect}"
-
- # Verify MCP client is configured properly
- unless Chat.mcp_client && Chat.mcp_call_tool
- error = { error: "MCP client not properly configured" }
- RubyLLM.logger.error error[:error]
- return error
- end
-
- # Handle tool calls that require non-string parameters
- normalized_args = {}
- args.each do |key, value|
- # Convert string numbers to actual numbers when needed
- if value.is_a?(String) && value.match?(/\A-?\d+(\.\d+)?\z/)
- param_type = @mcp_tool.dig(:function, :parameters, "properties", key.to_s, "type")
- if param_type == "number" || param_type == "integer"
- normalized_args[key] = value.include?('.') ? value.to_f : value.to_i
- next
- end
- end
- normalized_args[key] = value
- end
-
- # Execute the tool via the MCP client with a timeout
- timeout = 10 # seconds
- result = nil
-
- begin
- Timeout.timeout(timeout) do
- result = Chat.mcp_client.send(Chat.mcp_call_tool, name, normalized_args)
- end
- rescue Timeout::Error
- error = { error: "MCP tool execution timed out after #{timeout} seconds" }
- RubyLLM.logger.error error[:error]
- return error
- rescue StandardError => e
- error = { error: "MCP tool execution failed: #{e.message}" }
- RubyLLM.logger.error error[:error]
- return error
- end
-
- # Log the result
- RubyLLM.logger.debug "Tool #{name} returned: #{result.inspect}"
- result
- end
- end
- end
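
One piece of the removed `MCPToolWrapper#call` worth noting is the argument coercion: string values arriving from the model are converted to numbers when the tool's schema declares the parameter as `"number"` or `"integer"`. That step can be sketched standalone; `normalize_arg` is an illustrative name, not part of the gem:

```ruby
# Illustrative re-implementation of the coercion inside the removed
# MCPToolWrapper#call: numeric-looking strings are converted according
# to the declared JSON-schema type; everything else passes through.
def normalize_arg(value, param_type)
  if value.is_a?(String) && value.match?(/\A-?\d+(\.\d+)?\z/) &&
     %w[number integer].include?(param_type)
    value.include?('.') ? value.to_f : value.to_i
  else
    value
  end
end

p normalize_arg("42", "integer")  # prints 42
p normalize_arg("3.5", "number")  # prints 3.5
p normalize_arg("42", "string")   # prints "42"
```

With 0.9.2's move to `RubyLLM::Tool` subclasses (see the new `examples/tools/*.rb` files), this kind of coercion is handled by the library's own parameter handling rather than by a wrapper.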