aia 0.9.1 → 0.9.3rc1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
1
1
  ---
2
2
  SHA256:
3
- metadata.gz: 22c90e43d930e16211a66eca8d8698791de4e0c3d867ddcbf9ad6cec98e88202
4
- data.tar.gz: 5350fd33ec94debce7dc1f055cdb1b8eef6663506e6ae6b3d4d7f057537c1ea6
3
+ metadata.gz: 9e3aa7b74ca7b64aca1acd2360043829f9ac14ae63f4c66ced12ea1a70ae34d0
4
+ data.tar.gz: 5802184befd82ae7ff4eecf16da0e5fc1d9d39cc61cdcc5736ad4d56dbe0b87e
5
5
  SHA512:
6
- metadata.gz: 95f6953cb27609d45712b76c154ba949ac3b745004314ad48c1b234a49084db9594938a2a039e56650226cf821c0f52ed834b208665299b8b1677491860f8afd
7
- data.tar.gz: 03f418e5e112be5cd5f3d31a923e628e9c66a59838272fa0d80899b00cfc4065a079f06d03bb92e4139a3b1089f1d7bb05aee89ba3d820bf460bad177cebb52e
6
+ metadata.gz: 31dae0e58c2d7f4b97a23f90b1b588b8735bb6a8e9774a2f40554baf64636ca8c56adc2df9fad804c7e05e91ca5d6a2fde358dd886c56219190f1ba8262fdbf0
7
+ data.tar.gz: d710b9e9fbe52cf014ed8a10274eb84ad4bd1ed00b2ee03a75e71767a4f45abb37d4b5ff6b258597c77b352cf1eb5c8272f1e59c3611eaba9c730447b17e2f2c
data/.version CHANGED
@@ -1 +1 @@
1
- 0.9.1
1
+ 0.9.3rc1
data/CHANGELOG.md CHANGED
@@ -1,7 +1,23 @@
1
1
  # Changelog
2
2
  ## [Unreleased]
3
+ ### [0.9.3] WIP
4
+ - need to pay attention to the test suite
5
+ - also need to ensure the non text2text modes are working
3
6
 
4
7
  ## Released
8
+ ### [0.9.3rc1] 2025-05-24
9
+ - using ruby_llm v1.3.0rc1
10
+ - added a models database refresh based on integer days interval with the --refresh option
11
+ - config file now has a "last_refresh" String in format YYYY-MM-DD
12
+ - enhanced the robot figure to show more config items including tools
13
+ - fixed a bug with the --require option where the specified libraries were not being loaded
14
+ - fixed a bug in the prompt_manager gem which is now at v0.5.5
15
+
16
+
17
+ ### [0.9.2] 2025-05-18
18
+ - removing the MCP experiment
19
+ - adding support for RubyLLM::Tool usage in place of the MCP stuff
20
+ - updated prompt_manager to v0.5.4 which fixed shell integration problem
5
21
 
6
22
  ### [0.9.1] 2025-05-16
7
23
  - rethink MCP approach in favor of just RubyLLM::Tool
data/README.md CHANGED
@@ -11,23 +11,14 @@
11
11
  [/______\] / * embedded directives * shell integration
12
12
  / \__AI__/ \/ * embedded Ruby * history management
13
13
  / /__\ * interactive chat * prompt workflows
14
- (\ /____\ # Experimental support of MCP servers via ruby_llm extension
14
+ (\ /____\ # supports RubyLLM::Tool integration
15
15
  ```
16
16
 
17
17
  AIA leverages the [prompt_manager gem](https://github.com/madbomber/prompt_manager) to manage prompts. It utilizes the [CLI tool fzf](https://github.com/junegunn/fzf) for prompt selection.
18
18
 
19
- **Most Recent Change**: Refer to the [Changelog](CHANGELOG.md)
20
- - Replaced `ai_client` with `ruby_llm` gem
21
- - Added --adapter w/default ruby_llm in case there is a need to consider something else
22
- - //include directive now supports web URLs
23
- - //webpage insert web URL content as markdown into context
24
-
25
19
  **Wiki**: [Checkout the AIA Wiki](https://github.com/MadBomber/aia/wiki)
26
20
 
27
- **Notable Recent Changes:**
28
- - **RubyLLM Integration:** AIA now uses the `ruby_llm` gem as a replacement to ai_client. The option `--adapter ruby_llm` is the default. The `--adapter` option is there in case in the future an alternative to ruby_llm may be needed. I replacing my on gem ai_client with the ruby_llm gem? Because its better, newer, elegant and will easily support some of the new features I have planned for AIA.
29
-
30
- - **MCP Server Support:** AIA now supports Model Context Protocol (MCP) servers through an extension to the ruby_llm gem. This experimental feature allows AIA to interact with various external tools and services through MCP servers using the --mcp and --allowed_tools CLI options.
21
+ **RubyLLM::Tool Support:** AIA now supports the integration of Tools for those models that support function callbacks. See the --tools, --allowed_tools and --rejected_tools options. Function callbacks provide dynamic prompt content just like the AIA directives, shell and ERB integrations, so why have both? Well, AIA is older than function callbacks. The AIA integrations are legacy, but more importantly, not all models support function callbacks. That means the AIA integrations are still viable ways to provide dynamic extra content to your prompts.
31
22
 
32
23
  <!-- Tocer[start]: Auto-generated, don't remove. -->
33
24
 
@@ -36,11 +27,13 @@ AIA leverages the [prompt_manager gem](https://github.com/madbomber/prompt_manag
36
27
  - [Configuration Options](#configuration-options)
37
28
  - [Configuration Flexibility](#configuration-flexibility)
38
29
  - [Expandable Configuration](#expandable-configuration)
30
+ - [The Local Model Registry Refresh](#the-local-model-registry-refresh)
31
+ - [Important Note](#important-note)
39
32
  - [Shell Integration inside of a Prompt](#shell-integration-inside-of-a-prompt)
40
33
  - [Dynamic Shell Commands](#dynamic-shell-commands)
41
34
  - [Shell Command Safety](#shell-command-safety)
42
35
  - [Chat Session Use](#chat-session-use)
43
- - [*E*mbedded *R*u*B*y (ERB)](#embedded-ruby-erb)
36
+ - [Embedded Ruby (ERB)](#embedded-ruby-erb)
44
37
  - [Prompt Directives](#prompt-directives)
45
38
  - [Parameter and Shell Substitution in Directives](#parameter-and-shell-substitution-in-directives)
46
39
  - [Directive Syntax](#directive-syntax)
@@ -70,13 +63,15 @@ AIA leverages the [prompt_manager gem](https://github.com/madbomber/prompt_manag
70
63
  - [Development](#development)
71
64
  - [Contributing](#contributing)
72
65
  - [Roadmap](#roadmap)
73
- - [Model Context Protocol (MCP) Support](#model-context-protocol-mcp-support)
74
- - [What is MCP?](#what-is-mcp)
75
- - [Using MCP Servers with AIA](#using-mcp-servers-with-aia)
76
- - [Focusing on Specific Tools with --allowed_tools](#focusing-on-specific-tools-with---allowed_tools)
77
- - [How It Works](#how-it-works)
78
- - [Current Limitations](#current-limitations)
79
- - [Sounds like directives](#sounds-like-directives)
66
+ - [RubyLLM::Tool Support](#rubyllmtool-support)
67
+ - [What Are RubyLLM Tools?](#what-are-rubyllm-tools)
68
+ - [How to Use Tools](#how-to-use-tools)
69
+ - [`--tools` Option](#--tools-option)
70
+ - [Filtering the tool paths](#filtering-the-tool-paths)
71
+ - [`--at`, `--allowed_tools` Option](#--at---allowed_tools-option)
72
+ - [`--rt`, `--rejected_tools` Option](#--rt---rejected_tools-option)
73
+ - [Creating Your Own Tools](#creating-your-own-tools)
74
+ - [MCP Supported](#mcp-supported)
80
75
  - [License](#license)
81
76
 
82
77
  <!-- Tocer[finish]: Auto-generated, don't remove. -->
@@ -104,7 +99,6 @@ The following table provides a comprehensive list of configuration options, thei
104
99
  | log_file | -l, --log_file | ~/.prompts/_prompts.log | AIA_LOG_FILE |
105
100
  | markdown | --md, --markdown | true | AIA_MARKDOWN |
106
101
  | max_tokens | --max_tokens | 2048 | AIA_MAX_TOKENS |
107
- | mcp_servers | --mcp | [] | AIA_MCP_SERVERS |
108
102
  | model | -m, --model | gpt-4o-mini | AIA_MODEL |
109
103
  | next | -n, --next | nil | AIA_NEXT |
110
104
  | out_file | -o, --out_file | temp.md | AIA_OUT_FILE |
@@ -113,7 +107,8 @@ The following table provides a comprehensive list of configuration options, thei
113
107
  | presence_penalty | --presence_penalty | 0.0 | AIA_PRESENCE_PENALTY |
114
108
  | prompt_extname | | .txt | AIA_PROMPT_EXTNAME |
115
109
  | prompts_dir | -p, --prompts_dir | ~/.prompts | AIA_PROMPTS_DIR |
116
- | require_libs | --rq | [] | AIA_REQUIRE_LIBS |
110
+ | refresh | --refresh | 0 (days) | AIA_REFRESH |
111
+ | require_libs | --rq --require | [] | AIA_REQUIRE_LIBS |
117
112
  | role | -r, --role | | AIA_ROLE |
118
113
  | roles_dir | | ~/.prompts/roles | AIA_ROLES_DIR |
119
114
  | roles_prefix | --roles_prefix | roles | AIA_ROLES_PREFIX |
@@ -124,8 +119,11 @@ The following table provides a comprehensive list of configuration options, thei
124
119
  | system_prompt | --system_prompt | | AIA_SYSTEM_PROMPT |
125
120
  | temperature | -t, --temperature | 0.7 | AIA_TEMPERATURE |
126
121
  | terse | --terse | false | AIA_TERSE |
122
+ | tool_paths | --tools | [] | AIA_TOOL_PATHS |
123
+ | allowed_tools | --at --allowed_tools | nil | AIA_ALLOWED_TOOLS |
124
+ | rejected_tools | --rt --rejected_tools | nil | AIA_REJECTED_TOOLS |
127
125
  | top_p | --top_p | 1.0 | AIA_TOP_P |
128
- | transcription_model | --tm, --transcription_model | whisper-1 | AIA_TRANSCRIPTION_MODEL |
126
+ | transcription_model | --tm, --transcription_model | whisper-1 | AIA_TRANSCRIPTION_MODEL |
129
127
  | verbose | -v, --verbose | false | AIA_VERBOSE |
130
128
  | voice | --voice | alloy | AIA_VOICE |
131
129
 
@@ -157,9 +155,25 @@ If you do not like the default regex used to identify parameters within the prom
157
155
 
158
156
  The configuration options are expandable through a config file, allowing you to add custom entries. For example, you can define a custom configuration item like "xyzzy" in your config file. This value can then be accessed in your prompts using `AIA.config.xyzzy` within a `//ruby` directive or an ERB block, enabling dynamic prompt generation based on your custom configurations.
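As a sketch of the idea, suppose your config file contained a custom entry `xyzzy: plugh` (both names invented for illustration). Inside AIA you would read it as `AIA.config.xyzzy`; here the config object is simulated with an `OpenStruct` since this snippet runs outside AIA:

```ruby
require "ostruct"
require "yaml"

# Simulated outside AIA: custom config entries become readable as methods,
# so a config file containing "xyzzy: plugh" is available as AIA.config.xyzzy.
config = OpenStruct.new(YAML.safe_load("xyzzy: plugh"))
puts config.xyzzy  # → plugh
```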
159
157
 
158
+ ## The Local Model Registry Refresh
159
+
160
+ The `ruby_llm` gem maintains a registry of providers and models integrated with a new website that allows users to download the latest information about each model. This capability is scheduled for release in version 1.3.0 of the gem.
161
+
162
+ In anticipation of this new feature, the AIA tool has introduced the `--refresh` option, which specifies the number of days between updates to the centralized model registry. Here’s how the `--refresh` option works:
163
+
164
+ - A value of `0` (zero) updates the local model registry every time AIA is executed.
165
+ - A value of `1` (one) updates the local model registry once per day.
166
+ - etc.
167
+
168
+ The date of the last successful refresh is stored in the configuration file under the key `last_refresh`. The default configuration file is located at `~/.aia/config.yml`. When a refresh is successful, the `last_refresh` value is updated to the current date, and the updated configuration is saved in `AIA.config.config_file`.
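+ The interval check described above can be sketched in a few lines of Ruby. The `last_refresh` key and its YYYY-MM-DD String format come from this description; the decision logic itself is an assumption, not AIA's actual implementation:
+
+ ```ruby
+ require "date"
+ require "yaml"
+
+ # Hypothetical sketch of the --refresh decision, not AIA's actual code.
+ config       = YAML.safe_load("last_refresh: '2025-05-20'")  # e.g. read from ~/.aia/config.yml
+ refresh_days = 2                                             # from --refresh (0 means refresh every run)
+ last_refresh = Date.strptime(config["last_refresh"], "%Y-%m-%d")
+
+ needs_refresh = (Date.today - last_refresh).to_i >= refresh_days
+ puts needs_refresh
+ ```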
169
+
170
+ ### Important Note
171
+
172
+ This approach to saving the `last_refresh` date can become cumbersome, particularly if you maintain multiple configuration files for different projects. The `last_refresh` date is only updated in the currently active configuration file. If you switch to a different project with a different configuration file, you may inadvertently hit the central model registry again, even if your local registry is already up to date.
173
+
160
174
  ## Shell Integration inside of a Prompt
161
175
 
162
- Using the option `--shell` enables AIA to access your terminal's shell environment from inside the prompt text.
176
+ AIA configures the `prompt_manager` gem to be fully integrated with your local shell by default. This is not an option - it's a feature. If your prompt includes text patterns like $HOME, ${HOME} or $(command), those patterns will be automatically replaced in the prompt text by the shell's value for those patterns.
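+ The substitution behavior can be approximated with a short Ruby sketch. The regexes here are illustrative only, not `prompt_manager`'s actual implementation:
+
+ ```ruby
+ # Illustrative approximation of shell integration; prompt_manager's real
+ # implementation may differ.
+ prompt = "My home is $HOME and the year is $(date +%Y)"
+
+ expanded = prompt
+   .gsub(/\$\{?([A-Za-z_]\w*)\}?/) { ENV[Regexp.last_match(1)] || Regexp.last_match(0) }
+   .gsub(/\$\((.+?)\)/) { `#{Regexp.last_match(1)}`.chomp }
+
+ puts expanded
+ ```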
163
177
 
164
178
  #### Dynamic Shell Commands
165
179
 
@@ -179,25 +193,25 @@ Given the following constraints $(cat ~/3_laws_of_robotics.txt) determine the be
179
193
 
180
194
  #### Shell Command Safety
181
195
 
182
- The catchphrase "the prompt is the code" within AIA means that you have the power to execute any command you want, but you must be careful not to execute commands that could cause harm. AIA is not going to protect you from doing something stupid. Sure that's a copout. I just can't think (actually I can) of all the ways you can mess things up writing code. Remember what we learned from Forrest Gump "Stupid is as stupid does." So don't do anything stupid. If someone gives you a prompt as says "run this with AIA" you had better review the prompt before processing it.
196
+ The catchphrase "the prompt is the code" within AIA means that you have the power to execute any command you want, but you must be careful not to execute commands that could cause harm. AIA is not going to protect you from doing something dumb. Sure, that's a copout. I just can't think (actually I can) of all the ways you can mess things up writing code. Remember what we learned from Forrest Gump: "Stupid is as stupid does." So don't do anything dumb. If someone gives you a prompt and says "run this with AIA" you had better review the prompt before processing it.
183
197
 
184
198
  #### Chat Session Use
185
199
 
186
- When you use the `--shell` option to start a chat session, shell integration is available in your follow up prompts. Suppose you started a chat session (--chat) using a role of "Ruby Expert" expecting to chat about changes that could be made to a specific class BUT you forgot to include the class source file as part of the context when you got started. You could enter this as your follow up prompt to keep going:
200
+ Shell integration is available in your follow up prompts within a chat session. Suppose you started a chat session (--chat) using a role of "Ruby Expert" expecting to chat about changes that could be made to a specific class BUT you forgot to include the class source file as part of the context when you got started. You could enter this as your follow up prompt to keep going:
187
201
 
188
202
  ```plaintext
189
203
  The class I want to chat about refactoring is this one: $(cat my_class.rb)
190
204
  ```
191
205
 
192
- That inserts the entire class source file into your follow up prompt. You can continue chatting with you AI Assistant about changes to the class.
206
+ That inserts the entire class source file into your follow up prompt. You can continue chatting with your AI Assistant about changes to the class.
193
207
 
194
- ## *E*mbedded *R*u*B*y (ERB)
208
+ ## Embedded Ruby (ERB)
195
209
 
196
- The inclusion of dynamic content through the shell integration provided by the `--shell` option is significant. AIA also provides the full power of embedded Ruby code processing within the prompt text.
210
+ The inclusion of dynamic content through the shell integration is significant. AIA also provides the full power of embedded Ruby code processing within the prompt text.
197
211
 
198
- The `--erb` option turns the prompt text file into a fully functioning ERB template. The [Embedded Ruby (ERB) template syntax (2024)](https://bophin-com.ngontinh24.com/article/language-embedded-ruby-erb-template-syntax) provides a good overview of the syntax and power of ERB.
212
+ AIA takes advantage of the `prompt_manager` gem to enable ERB integration in prompt text by default. It's an always-available feature of AIA prompts. The [Embedded Ruby (ERB) template syntax (2024)](https://bophin-com.ngontinh24.com/article/language-embedded-ruby-erb-template-syntax) provides a good overview of the syntax and power of ERB.
199
213
 
200
- Most websites that have information about ERB will give examples of how to use ERB to generate dynamic HTML content for web-based applications. That is a common use case for ERB. AIA on the other hand uses ERB to generate dynamic prompt text for LLM processing.
214
+ Most websites that have information about ERB will give examples of how to use ERB to generate dynamic HTML content for web-based applications. That is a common use case for ERB. AIA on the other hand uses ERB to generate dynamic or conditional prompt text for LLM processing.
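+ As a sketch of the idea, here is an invented prompt fragment rendered with Ruby's standard ERB library, the same mechanism AIA applies to prompt text:
+
+ ```ruby
+ require "erb"
+
+ # Illustrative only: a prompt template rendered with Ruby's standard ERB library.
+ prompt = <<~PROMPT
+   <% language = "Ruby" %>
+   Explain the <%= language %> Enumerable#each_slice method.
+ PROMPT
+
+ # trim_mode "<>" suppresses the newline left behind by the assignment line.
+ puts ERB.new(prompt, trim_mode: "<>").result(binding)
+ ```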
201
215
 
202
216
  ## Prompt Directives
203
217
 
@@ -302,11 +316,11 @@ For example:
302
316
  //ruby puts "Hello from Ruby"
303
317
  ```
304
318
 
305
- You can also use the `--rq` option to specify Ruby libraries to require before executing Ruby code:
319
+ You can also use the `--require` option to specify Ruby libraries to require before executing Ruby code:
306
320
 
307
321
  ```bash
308
322
  # Command line
309
- aia --rq json,csv my_prompt
323
+ aia --rq json,csv --require os my_prompt
310
324
 
311
325
  # In chat
312
326
  //ruby JSON.parse('{"data": [1,2,3]}')["data"]
@@ -679,56 +693,83 @@ I'm not happy with the way where some command line options for external command
679
693
  - continue integration of the ruby_llm gem
680
694
  - support for Model Context Protocol
681
695
 
682
- ## Model Context Protocol (MCP) Support
696
+ ## RubyLLM::Tool Support
697
+
698
+ AIA supports function calling capabilities through the `RubyLLM::Tool` framework, enabling LLMs to execute custom functions during a chat session.
683
699
 
684
- AIA now includes experimental support for Model Context Protocol (MCP) servers through an extension to the `ruby_llm` gem, which utilizes the `ruby-mcp-client` gem for communication with MCP servers.
700
+ ### What Are RubyLLM Tools?
685
701
 
686
- ### What is MCP?
702
+ Tools (or functions) allow LLMs to perform actions beyond generating text, such as:
687
703
 
688
- Model Context Protocol (MCP) is a standardized way for AI models to interact with external tools and services. It allows AI assistants to perform actions beyond mere text generation, such as searching the web, accessing databases, or interacting with APIs.
704
+ - Retrieving real-time information
705
+ - Executing system commands
706
+ - Accessing external APIs
707
+ - Performing calculations
689
708
 
690
- ### Using MCP Servers with AIA
709
+ Check out the [examples/tools](examples/tools) directory which contains several ready-to-use tool implementations you can use as references.
691
710
 
692
- To use MCP with AIA, you can specify one or more MCP servers using the `--mcp` CLI option:
711
+ ### How to Use Tools
712
+
713
+ AIA provides three CLI options to manage function calling:
714
+
715
+ #### `--tools` Option
716
+
717
+ Specifies where to find tool implementations:
693
718
 
694
719
  ```bash
695
- aia --chat --mcp server1,server2
720
+ # Load tools from multiple sources
721
+ --tools /path/to/tools/directory,other/tools/dir,my_tool.rb
722
+
723
+ # Or use multiple --tools flags
724
+ --tools my_first_tool.rb --tools /tool_repo/tools
696
725
  ```
697
726
 
698
- This will connect your AIA session to the specified MCP servers, allowing the AI model to access tools provided by those servers.
727
+ Each path can be:
728
+
729
+ - A Ruby file implementing a `RubyLLM::Tool` subclass
730
+ - A directory containing tool implementations (all Ruby files in that directory will be loaded)
699
731
 
700
- ### Focusing on Specific Tools with --allowed_tools
732
+ Supporting files for tools can be placed in the same directory or subdirectories.
701
733
 
702
- If you want to limit which MCP tools are available to the AI model during your session, you can use the `--allowed_tools` option:
734
+ ### Filtering the tool paths
735
+
736
+ The --tools option must be given exact relative or absolute paths to the tool files AIA should use for function callbacks. When you specify directories, you may find yourself needing to filter the full set of tools, allowing some or rejecting others based on some indicator in their file names. The following two options let you specify multiple sub-strings to match the tool paths against. For example, you might be comparing one version of a tool against another, with version suffixes in their filenames like tool_v1.rb and tool_v2.rb. Using the allowed and rejected filters, you can choose one or the other while pointing at an entire directory full of tools.
737
+
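+ The filtering described above amounts to simple substring matching, sketched here as an illustration of the documented behavior rather than AIA's actual code:
+
+ ```ruby
+ # Illustrative sketch of --allowed_tools / --rejected_tools substring filtering.
+ tool_paths = ["tools/search_v1.rb", "tools/search_v2.rb", "tools/weather.rb"]
+ allowed    = nil        # e.g. from --allowed_tools (nil means allow all)
+ rejected   = ["_v1"]    # e.g. from --rejected_tools
+
+ selected = tool_paths.select do |path|
+   next false if rejected&.any? { |s| path.include?(s) }
+   allowed.nil? || allowed.any? { |s| path.include?(s) }
+ end
+
+ puts selected  # search_v1.rb is rejected; the other two paths remain
+ ```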
738
+ #### `--at`, `--allowed_tools` Option
739
+
740
+ Filters which tools to make available when loading from directories:
703
741
 
704
742
  ```bash
705
- aia --chat --mcp server1 --allowed_tools tool1,tool2,tool3
743
+ # Only allow tools with 'test' in their filename
744
+ --tools my_tools_directory --allowed_tools test
706
745
  ```
707
746
 
708
- This restricts the AI model to only use the specified tools, which can be helpful for:
747
+ This is useful when you have many tools but only want to use specific ones in a session.
709
748
 
710
- - Focusing the AI on a specific task
711
- - Preventing the use of unnecessary or potentially harmful tools
712
- - Reducing the cognitive load on the AI by limiting available options
749
+ #### `--rt`, `--rejected_tools` Option
713
750
 
714
- ### How It Works
751
+ Excludes specific tools:
715
752
 
716
- Behind the scenes, AIA leverages a new `with_mcp` method in `RubyLLM::Chat` to set up the MCP tools for chat sessions. This method processes the hash-structured MCP tool definitions and makes them available for the AI to use during the conversation.
753
+ ```bash
754
+ # Exclude tools with '_v1' in their filename
755
+ --tools my_tools_directory --rejected_tools _v1
756
+ ```
717
757
 
718
- When you specify `--mcp` and optionally `--allowed_tools`, AIA configures the underlying `ruby_llm` adapter to connect to the appropriate MCP servers and expose only the tools you've explicitly allowed.
758
+ Ideal for excluding older versions or temporarily disabling specific tools.
719
759
 
720
- ### Current Limitations
760
+ ### Creating Your Own Tools
721
761
 
722
- As this is an experimental feature:
762
+ To create a custom tool:
723
763
 
724
- - Not all AI models support MCP tools
725
- - The available tools may vary depending on the MCP server
726
- - Tool behavior may change as the MCP specification evolves
764
+ 1. Create a Ruby file that subclasses `RubyLLM::Tool`
765
+ 2. Define the tool's parameters and functionality
766
+ 3. Use the `--tools` option to load it in your AIA session
727
767
 
728
- ### Sounds like directives
768
+ For implementation details, refer to the [examples in the repository](examples/tools) or the RubyLLM documentation.
729
769
 
730
- Anthropic's MCP standard came months after the aia appeared with its inherent support of dynamic prompt context enhancements via directives, shell integration and ERB support. Yes you can argue that if you have a working MCP client and access to appropriate MCP servers then the dynamic prompt context enhancements via directives, shell integration and ERB support are superfluous. I don't agree with that argument. I think the dynamic prompt context enhancements via directives, shell integration and ERB support are powerful tools in and of themselves. I think they are a better way to do things.
770
+ ## MCP Supported
731
771
 
772
+ Abandon all hope of seeing an MCP client added to AIA. Maybe sometime in the future there will be a new gem "ruby_llm-mcp" that implements an MCP client as a native RubyLLM::Tool subclass. If that ever happens, you would use it the same way you use any other RubyLLM::Tool subclass, which AIA now supports.
732
773
 
733
774
  ## License
734
775
 
@@ -0,0 +1,26 @@
1
+ # experiments/ai_misc/coding_agent_with_ruby_llm/tools/edit_file.rb
2
+
3
+ require "ruby_llm/tool"
4
+
5
+ module Tools
6
+ class EditFile < RubyLLM::Tool
7
+ description <<~DESCRIPTION
8
+ Make edits to a text file.
9
+
10
+ Replaces 'old_str' with 'new_str' in the given file.
11
+ 'old_str' and 'new_str' MUST be different from each other.
12
+
13
+ If the file specified with path doesn't exist, it will be created.
14
+ DESCRIPTION
15
+ param :path, desc: "The path to the file"
16
+ param :old_str, desc: "Text to search for - must match exactly and must only have one match exactly"
17
+ param :new_str, desc: "Text to replace old_str with"
18
+
19
+ def execute(path:, old_str:, new_str:)
20
+ content = File.exist?(path) ? File.read(path) : ""
21
+ File.write(path, content.sub(old_str, new_str))
22
+ rescue => e
23
+ { error: e.message }
24
+ end
25
+ end
26
+ end
@@ -0,0 +1,18 @@
1
+
2
+ # experiments/ai_misc/coding_agent_with_ruby_llm/tools/list_files.rb
3
+ #
4
+ require "ruby_llm/tool"
5
+
6
+ module Tools
7
+ class ListFiles < RubyLLM::Tool
8
+ description "List files and directories at a given path. If no path is provided, lists files in the current directory."
9
+ param :path, desc: "Optional relative path to list files from. Defaults to current directory if not provided."
10
+
11
+ def execute(path: Dir.pwd)
12
+ Dir.glob(File.join(path, "*"))
13
+ .map { |filename| File.directory?(filename) ? "#{filename}/" : filename }
14
+ rescue => e
15
+ { error: e.message }
16
+ end
17
+ end
18
+ end
@@ -0,0 +1,16 @@
1
+ # experiments/ai_misc/coding_agent_with_ruby_llm/tools/read_file.rb
2
+ #
3
+ require "ruby_llm/tool"
4
+
5
+ module Tools
6
+ class ReadFile < RubyLLM::Tool
7
+ description "Read the contents of a given relative file path. Use this when you want to see what's inside a file. Do not use this with directory names."
8
+ param :path, desc: "The relative path of a file in the working directory."
9
+
10
+ def execute(path:)
11
+ File.read(path)
12
+ rescue => e
13
+ { error: e.message }
14
+ end
15
+ end
16
+ end
@@ -0,0 +1,21 @@
1
+ # experiments/ai_misc/coding_agent_with_ruby_llm/tools/run_shell_command.rb
2
+
3
+ require "ruby_llm/tool"
4
+
5
+ module Tools
6
+ class RunShellCommand < RubyLLM::Tool
7
+ description "Execute a linux shell command"
8
+ param :command, desc: "The command to execute"
9
+
10
+ def execute(command:)
11
+ puts "AI wants to execute the following shell command: '#{command}'"
12
+ print "Do you want to execute it? (y/n) "
13
+ response = gets.chomp
14
+ return { error: "User declined to execute the command" } unless response == "y"
15
+
16
+ `#{command}`
17
+ rescue => e
18
+ { error: e.message }
19
+ end
20
+ end
21
+ end
@@ -28,9 +28,16 @@ module AIA
28
28
 
29
29
 
30
30
  def process_prompt(prompt, operation_type)
31
+ result = nil
31
32
  @ui_presenter.with_spinner("Processing", operation_type) do
32
- send_to_client(prompt, operation_type)
33
+ result = send_to_client(prompt, operation_type)
33
34
  end
35
+
36
+ unless result.is_a? String
37
+ result = result.content
38
+ end
39
+
40
+ result
34
41
  end
35
42
 
36
43
 
@@ -66,11 +73,6 @@ module AIA
66
73
  client_model = AIA.client.model.id # RubyLLM::Model instance
67
74
  end
68
75
 
69
- debug_me('== dynamic model change? =='){[
70
- :client_model,
71
- "AIA.config.model"
72
- ]}
73
-
74
76
  # when adapter is ruby_llm must use model.id as the name
75
77
  unless AIA.config.model.downcase.include?(client_model.downcase)
76
78
  # FIXME: assumes that the adapter is AiClient. It might be RUbyLLM