girb-ruby_llm 0.1.0 → 0.1.2

This diff shows the content of publicly released package versions as they appear in their respective public registries, and is provided for informational purposes only.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
-   metadata.gz: a587322ce50087756c3a8e0178062385ea59f83737fb8b9c70816777024b230a
-   data.tar.gz: 165288aad6307d5e658abc4f0385409752bab2f35b2e0cc9fe81f9a0f0fac05d
+   metadata.gz: cfe19c19205f70c38e3f95b1fca105e6c3c5d5c3398871e1a43cf5310e7bfb7a
+   data.tar.gz: 52dcce3593879b4bd66864f8928e5b7eb0aa1f4adc186c62bc665bfc217c1fd5
  SHA512:
-   metadata.gz: 15a8df02dcec847a6c884d7ffc099cb9c9a7bfa484d210b8796102ef60e6a2029828f1efdfe343fefec2196f889fb467eb2d0921cd28468b4d27f317a6a72645
-   data.tar.gz: 2cded93009f453527579f0cf9b1ba9c356760bb3f6dabb63b7aea122172a12d6454294d0a7b9a5f3d1972a8f92f65871e174dbf744d3ed19ffa36984d4308f3c
+   metadata.gz: e030b3dd7df74d07d15b6bfef72e61c7f5c1d9c6d8aa093eed9440855960c8e0773bb3046461a8d05e496e07d8891d6366a4e17bac299cb74d4d8da3bfb5fc2b
+   data.tar.gz: eb256345ace0da90621bd10069a4b0ed3dc94bc72b768d5da79cfd3cf4f558250fae6d9aa0aa3782aa66c632c86783931cf025739dab8dd6e1826b6010dca17f
data/CHANGELOG.md CHANGED
@@ -1,5 +1,23 @@
  # Changelog
 
+ ## [0.1.2] - 2026-02-03
+
+ ### Fixed
+
+ - Fix tool names being empty strings in dynamic RubyLLM::Tool classes
+ - Properly execute girb tools within RubyLLM's auto-execute framework
+
+ ## [0.1.1] - 2026-02-03
+
+ ### Added
+
+ - GIRB_MODEL environment variable support (required)
+ - Auto-refresh models for local providers (Ollama, GPUStack)
+
+ ### Changed
+
+ - Recommend ~/.irbrc configuration instead of environment variables
+
  ## [0.1.0] - 2025-02-02
 
  ### Added
data/README.md CHANGED
@@ -8,43 +8,55 @@ This gem allows you to use multiple LLM providers (OpenAI, Anthropic, Google Gem
 
  ## Installation
 
+ ### For Rails Projects
+
  Add to your Gemfile:
 
  ```ruby
- gem 'girb'
- gem 'girb-ruby_llm'
+ group :development do
+   gem 'girb-ruby_llm'
+ end
  ```
 
- Or install directly:
+ Then run:
 
  ```bash
- gem install girb girb-ruby_llm
+ bundle install
  ```
 
- ## Setup
+ Create a `.girbrc` file in your project root:
 
- Set the provider and your API key:
+ ```ruby
+ # .girbrc
+ require 'girb-ruby_llm'
 
- ```bash
- export GIRB_PROVIDER=girb-ruby_llm
- export GEMINI_API_KEY=your-api-key  # or other provider's API key
+ Girb.configure do |c|
+   c.provider = Girb::Providers::RubyLlm.new(model: 'gemini-2.5-flash')
+ end
  ```
 
- Then start girb:
+ Now `rails console` will automatically load girb!
+
+ ### For Non-Rails Projects
+
+ Install globally:
 
  ```bash
- girb
+ gem install girb girb-ruby_llm
  ```
 
- ### Using with regular irb
-
- Add to your `~/.irbrc`:
+ Create a `.girbrc` file in your project directory:
 
  ```ruby
+ # .girbrc
  require 'girb-ruby_llm'
+
+ Girb.configure do |c|
+   c.provider = Girb::Providers::RubyLlm.new(model: 'gemini-2.5-flash')
+ end
  ```
 
- Then use regular `irb` command.
+ Then use `girb` command instead of `irb`.
 
  ## Configuration
 
@@ -85,61 +97,94 @@ Set your API key or endpoint as an environment variable:
 
  ## Examples
 
+ ### Using Google Gemini
+
+ ```ruby
+ # .girbrc
+ require 'girb-ruby_llm'
+
+ # Set GEMINI_API_KEY environment variable
+ Girb.configure do |c|
+   c.provider = Girb::Providers::RubyLlm.new(model: 'gemini-2.5-flash')
+ end
+ ```
+
  ### Using OpenAI
 
- ```bash
- export GIRB_PROVIDER=girb-ruby_llm
- export OPENAI_API_KEY="sk-..."
- girb
+ ```ruby
+ # .girbrc
+ require 'girb-ruby_llm'
+
+ # Set OPENAI_API_KEY environment variable
+ Girb.configure do |c|
+   c.provider = Girb::Providers::RubyLlm.new(model: 'gpt-4o')
+ end
  ```
 
  ### Using Anthropic Claude
 
- ```bash
- export GIRB_PROVIDER=girb-ruby_llm
- export ANTHROPIC_API_KEY="sk-ant-..."
- girb
+ ```ruby
+ # .girbrc
+ require 'girb-ruby_llm'
+
+ # Set ANTHROPIC_API_KEY environment variable
+ Girb.configure do |c|
+   c.provider = Girb::Providers::RubyLlm.new(model: 'claude-sonnet-4-20250514')
+ end
  ```
 
  ### Using Ollama (Local)
 
- ```bash
- # Start Ollama first
- ollama serve
+ ```ruby
+ # .girbrc
+ require 'girb-ruby_llm'
 
- # Set the provider and API base URL
- export GIRB_PROVIDER=girb-ruby_llm
- export OLLAMA_API_BASE="http://localhost:11434/v1"
- girb
+ # Set OLLAMA_API_BASE environment variable (e.g., http://localhost:11434/v1)
+ Girb.configure do |c|
+   c.provider = Girb::Providers::RubyLlm.new(model: 'llama3.2:latest')
+ end
  ```
 
  ### Using OpenAI-compatible APIs (e.g., LM Studio, vLLM)
 
- ```bash
- export GIRB_PROVIDER=girb-ruby_llm
- export OPENAI_API_KEY="not-needed"  # Some require any non-empty value
- export OPENAI_API_BASE="http://localhost:1234/v1"
- girb
- ```
+ ```ruby
+ # .girbrc
+ require 'girb-ruby_llm'
 
- ### Manual Configuration
+ # Set OPENAI_API_BASE and OPENAI_API_KEY environment variables
+ Girb.configure do |c|
+   c.provider = Girb::Providers::RubyLlm.new(model: 'your-model-name')
+ end
+ ```
 
- You can configure the provider manually in your `~/.irbrc`:
+ ### Advanced Configuration
 
  ```ruby
- # ~/.irbrc
+ # .girbrc
  require 'girb-ruby_llm'
 
- RubyLLM.configure do |config|
-   config.ollama_api_base = "http://localhost:11434/v1"
- end
- RubyLLM::Models.refresh!
-
  Girb.configure do |c|
-   c.provider = Girb::Providers::RubyLlm.new(model: 'llama3.2:latest')
+   c.provider = Girb::Providers::RubyLlm.new(model: 'gemini-2.5-flash')
+   c.debug = true  # Enable debug output
+   c.custom_prompt = <<~PROMPT
+     Always confirm before destructive operations.
+   PROMPT
  end
  ```
 
+ Note: `RubyLLM::Models.refresh!` is automatically called for local providers (Ollama, GPUStack).
+
+ ## Alternative: Environment Variable Configuration
+
+ For the `girb` command, you can also configure via environment variables (used when no `.girbrc` is found):
+
+ ```bash
+ export GIRB_PROVIDER=girb-ruby_llm
+ export GIRB_MODEL=gemini-2.5-flash
+ export GEMINI_API_KEY=your-api-key
+ girb
+ ```
+
  ## Supported Models
 
  See [RubyLLM Available Models](https://rubyllm.com/reference/available-models) for the full list of supported models.
data/README_ja.md CHANGED
@@ -6,43 +6,55 @@
 
  ## Installation
 
+ ### For Rails Projects
+
  Add to your Gemfile:
 
  ```ruby
- gem 'girb'
- gem 'girb-ruby_llm'
+ group :development do
+   gem 'girb-ruby_llm'
+ end
  ```
 
- Or install directly:
+ Then run:
 
  ```bash
- gem install girb girb-ruby_llm
+ bundle install
  ```
 
- ## Setup
+ Create a `.girbrc` file in your project root:
 
- Set the provider and your API key:
+ ```ruby
+ # .girbrc
+ require 'girb-ruby_llm'
 
- ```bash
- export GIRB_PROVIDER=girb-ruby_llm
- export GEMINI_API_KEY=your-api-key  # or another provider's API key
+ Girb.configure do |c|
+   c.provider = Girb::Providers::RubyLlm.new(model: 'gemini-2.5-flash')
+ end
  ```
 
- Then start girb:
+ Now `rails console` will automatically load girb!
+
+ ### For Non-Rails Projects
+
+ Install globally:
 
  ```bash
- girb
+ gem install girb girb-ruby_llm
  ```
 
- ### Using with regular irb
-
- Add to your `~/.irbrc`:
+ Create a `.girbrc` file in your project directory:
 
  ```ruby
+ # .girbrc
  require 'girb-ruby_llm'
+
+ Girb.configure do |c|
+   c.provider = Girb::Providers::RubyLlm.new(model: 'gemini-2.5-flash')
+ end
  ```
 
- Then use the regular `irb` command.
+ Use the `girb` command instead of `irb`.
 
  ## Configuration
 
@@ -83,61 +95,94 @@ Set your API key or endpoint as an environment variable:
 
  ## Examples
 
+ ### Using Google Gemini
+
+ ```ruby
+ # .girbrc
+ require 'girb-ruby_llm'
+
+ # Set the GEMINI_API_KEY environment variable
+ Girb.configure do |c|
+   c.provider = Girb::Providers::RubyLlm.new(model: 'gemini-2.5-flash')
+ end
+ ```
+
  ### Using OpenAI
 
- ```bash
- export GIRB_PROVIDER=girb-ruby_llm
- export OPENAI_API_KEY="sk-..."
- girb
+ ```ruby
+ # .girbrc
+ require 'girb-ruby_llm'
+
+ # Set the OPENAI_API_KEY environment variable
+ Girb.configure do |c|
+   c.provider = Girb::Providers::RubyLlm.new(model: 'gpt-4o')
+ end
  ```
 
  ### Using Anthropic Claude
 
- ```bash
- export GIRB_PROVIDER=girb-ruby_llm
- export ANTHROPIC_API_KEY="sk-ant-..."
- girb
+ ```ruby
+ # .girbrc
+ require 'girb-ruby_llm'
+
+ # Set the ANTHROPIC_API_KEY environment variable
+ Girb.configure do |c|
+   c.provider = Girb::Providers::RubyLlm.new(model: 'claude-sonnet-4-20250514')
+ end
  ```
 
  ### Using Ollama (Local)
 
- ```bash
- # Start Ollama first
- ollama serve
+ ```ruby
+ # .girbrc
+ require 'girb-ruby_llm'
 
- # Set the provider and API base URL
- export GIRB_PROVIDER=girb-ruby_llm
- export OLLAMA_API_BASE="http://localhost:11434/v1"
- girb
+ # Set the OLLAMA_API_BASE environment variable (e.g., http://localhost:11434/v1)
+ Girb.configure do |c|
+   c.provider = Girb::Providers::RubyLlm.new(model: 'llama3.2:latest')
+ end
  ```
 
  ### Using OpenAI-compatible APIs (e.g., LM Studio, vLLM)
 
- ```bash
- export GIRB_PROVIDER=girb-ruby_llm
- export OPENAI_API_KEY="not-needed"  # Some require any non-empty value
- export OPENAI_API_BASE="http://localhost:1234/v1"
- girb
- ```
+ ```ruby
+ # .girbrc
+ require 'girb-ruby_llm'
 
- ### Manual Configuration
+ # Set the OPENAI_API_BASE and OPENAI_API_KEY environment variables
+ Girb.configure do |c|
+   c.provider = Girb::Providers::RubyLlm.new(model: 'your-model-name')
+ end
+ ```
 
- You can configure the provider manually in your `~/.irbrc`:
+ ### Advanced Configuration
 
  ```ruby
- # ~/.irbrc
+ # .girbrc
  require 'girb-ruby_llm'
 
- RubyLLM.configure do |config|
-   config.ollama_api_base = "http://localhost:11434/v1"
- end
- RubyLLM::Models.refresh!
-
  Girb.configure do |c|
-   c.provider = Girb::Providers::RubyLlm.new(model: 'llama3.2:latest')
+   c.provider = Girb::Providers::RubyLlm.new(model: 'gemini-2.5-flash')
+   c.debug = true  # Enable debug output
+   c.custom_prompt = <<~PROMPT
+     Always confirm before destructive operations.
+   PROMPT
  end
  ```
 
+ Note: `RubyLLM::Models.refresh!` is automatically called when using local providers (Ollama, GPUStack).
+
+ ## Alternative: Environment Variable Configuration
+
+ With the `girb` command, you can also configure via environment variables when no `.girbrc` is found:
+
+ ```bash
+ export GIRB_PROVIDER=girb-ruby_llm
+ export GIRB_MODEL=gemini-2.5-flash
+ export GEMINI_API_KEY=your-api-key
+ girb
+ ```
+
  ## Supported Models
 
  See [RubyLLM Available Models](https://rubyllm.com/reference/available-models) for the full list of supported models.
@@ -6,11 +6,23 @@ require "girb/providers/base"
  module Girb
    module Providers
      class RubyLlm < Base
+       # Thread-local storage for current binding
+       def self.current_binding=(binding)
+         Thread.current[:girb_ruby_llm_binding] = binding
+       end
+
+       def self.current_binding
+         Thread.current[:girb_ruby_llm_binding]
+       end
+
        def initialize(model: nil)
          @model = model
        end
 
-       def chat(messages:, system_prompt:, tools:)
+       def chat(messages:, system_prompt:, tools:, binding: nil)
+         # Store binding for tool execution
+         self.class.current_binding = binding
+
          # Use specified model or RubyLLM's default
          chat_options = @model ? { model: @model } : {}
          ruby_llm_chat = ::RubyLLM.chat(**chat_options)
@@ -29,7 +41,7 @@ module Girb
          last_message = messages.last
          last_content = extract_content(last_message)
 
-         # Send the request
+         # Send the request - RubyLLM will auto-execute tools
          response = ruby_llm_chat.ask(last_content)
 
          parse_response(response)
@@ -37,6 +49,8 @@ module Girb
          Response.new(error: "API Error: #{e.message}")
        rescue StandardError => e
          Response.new(error: "Error: #{e.message}")
+       ensure
+         self.class.current_binding = nil
        end
 
        private
@@ -108,11 +122,42 @@ module Girb
            required: required_params.include?(prop_name.to_s) || required_params.include?(prop_name)
          end
 
-         # Override name method to return the custom name
+         # Store tool_name for execute method and override name for RubyLLM
+         define_method(:girb_tool_name) { tool_name }
          define_method(:name) { tool_name }
 
-         # Execute method (never actually called, just for tool definition)
-         define_method(:execute) { |**_args| "" }
+         # Execute method - actually execute the girb tool
+         define_method(:execute) do |**args|
+           tool_name_for_log = girb_tool_name
+
+           if Girb.configuration&.debug
+             args_str = args.map { |k, v| "#{k}: #{v.inspect}" }.join(", ")
+             puts "[girb] Tool: #{tool_name_for_log}(#{args_str})"
+           end
+
+           binding = Girb::Providers::RubyLlm.current_binding
+           girb_tool_class = Girb::Tools.find_tool(tool_name_for_log)
+
+           unless girb_tool_class
+             return { error: "Unknown tool: #{tool_name_for_log}" }
+           end
+
+           girb_tool = girb_tool_class.new
+
+           result = if binding
+             girb_tool.execute(binding, **args)
+           else
+             { error: "No binding available for tool execution" }
+           end
+
+           if Girb.configuration&.debug && result.is_a?(Hash) && result[:error]
+             puts "[girb] Tool error: #{result[:error]}"
+           end
+
+           result
+         rescue StandardError => e
+           { error: "Tool execution failed: #{e.class} - #{e.message}" }
+         end
        end
 
        tool_class.new
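The thread-local binding handoff in this file can be sketched in isolation. This is a simplified stand-in, not the gem's actual code: the class name, block-based request body, and the final `eval` call are illustrative, but the pattern (store the caller's `Binding` in `Thread.current`, clear it in `ensure`) mirrors the diff above.

```ruby
# Simplified sketch of the thread-local binding handoff used by the provider.
# The real gem stores the user's IRB Binding so dynamically built tool
# classes can evaluate code in the user's session.
class SketchProvider
  KEY = :girb_ruby_llm_binding

  def self.current_binding=(b)
    Thread.current[KEY] = b
  end

  def self.current_binding
    Thread.current[KEY]
  end

  # Store the binding for the duration of the request, and always clear
  # it afterwards, even on error (this mirrors the `ensure` in the diff).
  def chat(binding:)
    self.class.current_binding = binding
    yield
  ensure
    self.class.current_binding = nil
  end
end

x = 41
result = SketchProvider.new.chat(binding: binding) do
  # A tool would fetch the stored binding the same way the diff does:
  SketchProvider.current_binding.eval("x + 1")
end
result  # => 42
```

Because the storage is per-thread, concurrent REPL sessions on different threads cannot see each other's bindings, and the `ensure` clause guarantees no stale binding survives a failed request.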
@@ -1,5 +1,5 @@
  # frozen_string_literal: true
 
  module GirbRubyLlm
-   VERSION = "0.1.0"
+   VERSION = "0.1.2"
  end
data/lib/girb-ruby_llm.rb CHANGED
@@ -32,6 +32,9 @@ RUBY_LLM_CONFIG_MAP = {
  # Check if any RubyLLM config is available
  has_config = RUBY_LLM_CONFIG_MAP.any? { |env_var, _| ENV[env_var] }
 
+ # Local providers that require Models.refresh!
+ LOCAL_PROVIDERS = %w[OLLAMA_API_BASE GPUSTACK_API_BASE].freeze
+
  if has_config
    # Configure RubyLLM with all available environment variables
    RubyLLM.configure do |config|
@@ -40,9 +43,18 @@ if has_config
      end
    end
 
+   # Refresh models for local providers (Ollama, GPUStack)
+   if LOCAL_PROVIDERS.any? { |env_var| ENV[env_var] }
+     RubyLLM::Models.refresh!
+   end
+
+   model = ENV["GIRB_MODEL"]
+   unless model
+     warn "[girb-ruby_llm] GIRB_MODEL not set. Please specify a model."
+     warn "[girb-ruby_llm] Example: export GIRB_MODEL=gemini-2.5-flash"
+   end
+
    Girb.configure do |c|
-     unless c.provider
-       c.provider = Girb::Providers::RubyLlm.new(model: c.model)
-     end
+     c.provider ||= Girb::Providers::RubyLlm.new(model: model)
    end
  end
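The environment-driven setup in this file can be sketched as follows. The two-entry config map and the helper names (`collect_settings`, `local_provider?`) are illustrative stand-ins, not the gem's API; the real `RUBY_LLM_CONFIG_MAP` covers many more providers.

```ruby
# Sketch of the env-var configuration flow above (names illustrative).
CONFIG_MAP = {
  "GEMINI_API_KEY"  => :gemini_api_key,
  "OLLAMA_API_BASE" => :ollama_api_base,
}.freeze

LOCAL_PROVIDERS = %w[OLLAMA_API_BASE GPUSTACK_API_BASE].freeze

# Collect only the settings that are actually present in the environment,
# mirroring the `RUBY_LLM_CONFIG_MAP.any? { |env_var, _| ENV[env_var] }` check.
def collect_settings(env)
  CONFIG_MAP.each_with_object({}) do |(var, key), acc|
    acc[key] = env[var] if env[var]
  end
end

# Local endpoints have no hosted model catalog, which is why the gem calls
# RubyLLM::Models.refresh! when one of their base URLs is configured.
def local_provider?(env)
  LOCAL_PROVIDERS.any? { |var| env[var] }
end

env = { "OLLAMA_API_BASE" => "http://localhost:11434/v1" }
collect_settings(env)  # => { ollama_api_base: "http://localhost:11434/v1" }
local_provider?(env)   # => true
```

The `c.provider ||= ...` line in the diff follows the same defensive style: a `.girbrc` that already assigned a provider wins, and the environment-derived default is only used as a fallback.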
metadata CHANGED
@@ -1,7 +1,7 @@
  --- !ruby/object:Gem::Specification
  name: girb-ruby_llm
  version: !ruby/object:Gem::Version
-   version: 0.1.0
+   version: 0.1.2
  platform: ruby
  authors:
  - rira100000000