girb-ruby_llm 0.1.0

This diff represents the content of publicly available package versions released to one of the supported registries.
checksums.yaml ADDED
@@ -0,0 +1,7 @@
+ ---
+ SHA256:
+   metadata.gz: a587322ce50087756c3a8e0178062385ea59f83737fb8b9c70816777024b230a
+   data.tar.gz: 165288aad6307d5e658abc4f0385409752bab2f35b2e0cc9fe81f9a0f0fac05d
+ SHA512:
+   metadata.gz: 15a8df02dcec847a6c884d7ffc099cb9c9a7bfa484d210b8796102ef60e6a2029828f1efdfe343fefec2196f889fb467eb2d0921cd28468b4d27f317a6a72645
+   data.tar.gz: 2cded93009f453527579f0cf9b1ba9c356760bb3f6dabb63b7aea122172a12d6454294d0a7b9a5f3d1972a8f92f65871e174dbf744d3ed19ffa36984d4308f3c
data/CHANGELOG.md ADDED
@@ -0,0 +1,11 @@
+ # Changelog
+
+ ## [0.1.0] - 2025-02-02
+
+ ### Added
+
+ - Initial release
+ - RubyLLM provider for girb
+ - Support for multiple LLM providers: OpenAI, Anthropic, Gemini, Ollama, DeepSeek, Mistral, and more
+ - Auto-configuration via environment variables
+ - Function calling support
data/LICENSE ADDED
@@ -0,0 +1,21 @@
+ MIT License
+
+ Copyright (c) 2025 Rira
+
+ Permission is hereby granted, free of charge, to any person obtaining a copy
+ of this software and associated documentation files (the "Software"), to deal
+ in the Software without restriction, including without limitation the rights
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ copies of the Software, and to permit persons to whom the Software is
+ furnished to do so, subject to the following conditions:
+
+ The above copyright notice and this permission notice shall be included in all
+ copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+ SOFTWARE.
data/README.md ADDED
@@ -0,0 +1,149 @@
+ # girb-ruby_llm
+
+ [README in Japanese](README_ja.md)
+
+ RubyLLM provider for [girb](https://github.com/rira100000000/girb) (an AI-powered IRB assistant).
+
+ This gem lets you use multiple LLM providers (OpenAI, Anthropic, Google Gemini, Ollama, and more) through the unified [RubyLLM](https://github.com/crmne/ruby_llm) API.
+
+ ## Installation
+
+ Add to your Gemfile:
+
+ ```ruby
+ gem 'girb'
+ gem 'girb-ruby_llm'
+ ```
+
+ Or install directly:
+
+ ```bash
+ gem install girb girb-ruby_llm
+ ```
+
+ ## Setup
+
+ Set the provider and your API key:
+
+ ```bash
+ export GIRB_PROVIDER=girb-ruby_llm
+ export GEMINI_API_KEY=your-api-key  # or another provider's API key
+ ```
+
+ Then start girb:
+
+ ```bash
+ girb
+ ```
+
+ ### Using with regular irb
+
+ Add to your `~/.irbrc`:
+
+ ```ruby
+ require 'girb-ruby_llm'
+ ```
+
+ Then use the regular `irb` command.
+
+ ## Configuration
+
+ Set your API key or endpoint as an environment variable:
+
+ ### Cloud Providers
+
+ | Provider | Environment Variable |
+ |----------|---------------------|
+ | Anthropic | `ANTHROPIC_API_KEY` |
+ | OpenAI | `OPENAI_API_KEY` |
+ | Google Gemini | `GEMINI_API_KEY` |
+ | DeepSeek | `DEEPSEEK_API_KEY` |
+ | Mistral | `MISTRAL_API_KEY` |
+ | OpenRouter | `OPENROUTER_API_KEY` |
+ | Perplexity | `PERPLEXITY_API_KEY` |
+ | xAI | `XAI_API_KEY` |
+
+ ### Local Providers
+
+ | Provider | Environment Variable |
+ |----------|---------------------|
+ | Ollama | `OLLAMA_API_BASE` |
+ | GPUStack | `GPUSTACK_API_BASE` |
+
+ ### Additional Configuration
+
+ | Environment Variable | Description |
+ |---------------------|-------------|
+ | `OPENAI_API_BASE` | Custom OpenAI-compatible API endpoint |
+ | `GEMINI_API_BASE` | Custom Gemini API endpoint |
+ | `GPUSTACK_API_KEY` | GPUStack API key |
+ | `BEDROCK_API_KEY` | AWS Bedrock access key |
+ | `BEDROCK_SECRET_KEY` | AWS Bedrock secret key |
+ | `BEDROCK_REGION` | AWS Bedrock region |
+ | `VERTEXAI_PROJECT_ID` | Google Vertex AI project ID |
+ | `VERTEXAI_LOCATION` | Google Vertex AI location |
+
+ ## Examples
+
+ ### Using OpenAI
+
+ ```bash
+ export GIRB_PROVIDER=girb-ruby_llm
+ export OPENAI_API_KEY="sk-..."
+ girb
+ ```
+
+ ### Using Anthropic Claude
+
+ ```bash
+ export GIRB_PROVIDER=girb-ruby_llm
+ export ANTHROPIC_API_KEY="sk-ant-..."
+ girb
+ ```
+
+ ### Using Ollama (Local)
+
+ ```bash
+ # Start Ollama first
+ ollama serve
+
+ # Set the provider and API base URL
+ export GIRB_PROVIDER=girb-ruby_llm
+ export OLLAMA_API_BASE="http://localhost:11434/v1"
+ girb
+ ```
+
+ ### Using OpenAI-compatible APIs (e.g., LM Studio, vLLM)
+
+ ```bash
+ export GIRB_PROVIDER=girb-ruby_llm
+ export OPENAI_API_KEY="not-needed"  # some servers require a non-empty value
+ export OPENAI_API_BASE="http://localhost:1234/v1"
+ girb
+ ```
+
+ ### Manual Configuration
+
+ You can configure the provider manually in your `~/.irbrc`:
+
+ ```ruby
+ # ~/.irbrc
+ require 'girb-ruby_llm'
+
+ RubyLLM.configure do |config|
+   config.ollama_api_base = "http://localhost:11434/v1"
+ end
+ RubyLLM::Models.refresh!
+
+ Girb.configure do |c|
+   c.provider = Girb::Providers::RubyLlm.new(model: 'llama3.2:latest')
+ end
+ ```
+
+ ## Supported Models
+
+ See [RubyLLM Available Models](https://rubyllm.com/reference/available-models) for the full list of supported models.
+
+ ## License
+
+ MIT License
data/README_ja.md ADDED
@@ -0,0 +1,147 @@
+ # girb-ruby_llm
+
+ RubyLLM provider for [girb](https://github.com/rira100000000/girb) (an AI-powered IRB assistant).
+
+ This gem lets you use multiple LLM providers (OpenAI, Anthropic, Google Gemini, Ollama, and more) through the unified [RubyLLM](https://github.com/crmne/ruby_llm) API.
+
+ ## Installation
+
+ Add to your Gemfile:
+
+ ```ruby
+ gem 'girb'
+ gem 'girb-ruby_llm'
+ ```
+
+ Or install directly:
+
+ ```bash
+ gem install girb girb-ruby_llm
+ ```
+
+ ## Setup
+
+ Set the provider and your API key:
+
+ ```bash
+ export GIRB_PROVIDER=girb-ruby_llm
+ export GEMINI_API_KEY=your-api-key  # or another provider's API key
+ ```
+
+ Start girb:
+
+ ```bash
+ girb
+ ```
+
+ ### Using with regular irb
+
+ Add to your `~/.irbrc`:
+
+ ```ruby
+ require 'girb-ruby_llm'
+ ```
+
+ You can then use the regular `irb` command.
+
+ ## Configuration
+
+ Set your API key or endpoint as an environment variable:
+
+ ### Cloud Providers
+
+ | Provider | Environment Variable |
+ |----------|---------------------|
+ | Anthropic | `ANTHROPIC_API_KEY` |
+ | OpenAI | `OPENAI_API_KEY` |
+ | Google Gemini | `GEMINI_API_KEY` |
+ | DeepSeek | `DEEPSEEK_API_KEY` |
+ | Mistral | `MISTRAL_API_KEY` |
+ | OpenRouter | `OPENROUTER_API_KEY` |
+ | Perplexity | `PERPLEXITY_API_KEY` |
+ | xAI | `XAI_API_KEY` |
+
+ ### Local Providers
+
+ | Provider | Environment Variable |
+ |----------|---------------------|
+ | Ollama | `OLLAMA_API_BASE` |
+ | GPUStack | `GPUSTACK_API_BASE` |
+
+ ### Additional Configuration
+
+ | Environment Variable | Description |
+ |---------------------|-------------|
+ | `OPENAI_API_BASE` | Custom OpenAI-compatible API endpoint |
+ | `GEMINI_API_BASE` | Custom Gemini API endpoint |
+ | `GPUSTACK_API_KEY` | GPUStack API key |
+ | `BEDROCK_API_KEY` | AWS Bedrock access key |
+ | `BEDROCK_SECRET_KEY` | AWS Bedrock secret key |
+ | `BEDROCK_REGION` | AWS Bedrock region |
+ | `VERTEXAI_PROJECT_ID` | Google Vertex AI project ID |
+ | `VERTEXAI_LOCATION` | Google Vertex AI location |
+
+ ## Examples
+
+ ### Using OpenAI
+
+ ```bash
+ export GIRB_PROVIDER=girb-ruby_llm
+ export OPENAI_API_KEY="sk-..."
+ girb
+ ```
+
+ ### Using Anthropic Claude
+
+ ```bash
+ export GIRB_PROVIDER=girb-ruby_llm
+ export ANTHROPIC_API_KEY="sk-ant-..."
+ girb
+ ```
+
+ ### Using Ollama (Local)
+
+ ```bash
+ # Start Ollama first
+ ollama serve
+
+ # Set the provider and API base URL
+ export GIRB_PROVIDER=girb-ruby_llm
+ export OLLAMA_API_BASE="http://localhost:11434/v1"
+ girb
+ ```
+
+ ### Using OpenAI-compatible APIs (e.g., LM Studio, vLLM)
+
+ ```bash
+ export GIRB_PROVIDER=girb-ruby_llm
+ export OPENAI_API_KEY="not-needed"  # some servers require a non-empty value
+ export OPENAI_API_BASE="http://localhost:1234/v1"
+ girb
+ ```
+
+ ### Manual Configuration
+
+ You can configure the provider manually in your `~/.irbrc`:
+
+ ```ruby
+ # ~/.irbrc
+ require 'girb-ruby_llm'
+
+ RubyLLM.configure do |config|
+   config.ollama_api_base = "http://localhost:11434/v1"
+ end
+ RubyLLM::Models.refresh!
+
+ Girb.configure do |c|
+   c.provider = Girb::Providers::RubyLlm.new(model: 'llama3.2:latest')
+ end
+ ```
+
+ ## Supported Models
+
+ See [RubyLLM Available Models](https://rubyllm.com/reference/available-models) for the full list of supported models.
+
+ ## License
+
+ MIT License
data/Rakefile ADDED
@@ -0,0 +1,3 @@
+ # frozen_string_literal: true
+
+ require "bundler/gem_tasks"
data/girb-ruby_llm.gemspec ADDED
@@ -0,0 +1,33 @@
+ # frozen_string_literal: true
+
+ require_relative "lib/girb-ruby_llm/version"
+
+ Gem::Specification.new do |spec|
+   spec.name = "girb-ruby_llm"
+   spec.version = GirbRubyLlm::VERSION
+   spec.authors = ["rira100000000"]
+   spec.email = ["101010hayakawa@gmail.com"]
+
+   spec.summary = "RubyLLM provider for girb"
+   spec.description = "RubyLLM provider for girb (AI-powered IRB assistant). " \
+                      "Install this gem to use OpenAI, Anthropic, Gemini and other LLMs via RubyLLM as your LLM backend."
+   spec.homepage = "https://github.com/rira100000000/girb-ruby_llm"
+   spec.license = "MIT"
+   spec.required_ruby_version = ">= 3.2.0"
+
+   spec.metadata["homepage_uri"] = spec.homepage
+   spec.metadata["source_code_uri"] = spec.homepage
+   spec.metadata["changelog_uri"] = "#{spec.homepage}/blob/main/CHANGELOG.md"
+
+   spec.files = Dir.chdir(__dir__) do
+     `git ls-files -z`.split("\x0").reject do |f|
+       (File.expand_path(f) == __FILE__) ||
+         f.start_with?(*%w[bin/ test/ spec/ features/ .git .github appveyor Gemfile])
+     end
+   end
+   spec.require_paths = ["lib"]
+
+   # Runtime dependencies
+   spec.add_dependency "girb", "~> 0.1"
+   spec.add_dependency "ruby_llm", "~> 1.0"
+ end
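The gemspec's `spec.files` block selects what gets packaged by filtering `git ls-files` output against a list of development-only prefixes. A minimal plain-Ruby sketch of that filter (the prefix list is copied from the gemspec; `packaged_files` is an illustrative helper name, not part of the gem):

```ruby
# Mirrors the gemspec's reject block: drop any tracked path that starts
# with a development-only prefix so it is not shipped in the gem.
EXCLUDED_PREFIXES = %w[bin/ test/ spec/ features/ .git .github appveyor Gemfile].freeze

def packaged_files(paths)
  # String#start_with? accepts multiple prefixes and returns true on any match.
  paths.reject { |f| f.start_with?(*EXCLUDED_PREFIXES) }
end

packaged_files(%w[lib/girb-ruby_llm.rb test/provider_test.rb README.md .github/workflows/ci.yml])
# => ["lib/girb-ruby_llm.rb", "README.md"]
```

Note that the real gemspec additionally rejects the gemspec file itself via `File.expand_path(f) == __FILE__`.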
data/lib/girb/providers/ruby_llm.rb ADDED
@@ -0,0 +1,143 @@
+ # frozen_string_literal: true
+
+ require "securerandom"
+ require "ruby_llm"
+ require "girb/providers/base"
+
+ module Girb
+   module Providers
+     class RubyLlm < Base
+       def initialize(model: nil)
+         @model = model
+       end
+
+       def chat(messages:, system_prompt:, tools:)
+         # Use the specified model or RubyLLM's default
+         chat_options = @model ? { model: @model } : {}
+         ruby_llm_chat = ::RubyLLM.chat(**chat_options)
+
+         # Set the system prompt
+         ruby_llm_chat.with_instructions(system_prompt) if system_prompt && !system_prompt.empty?
+
+         # Add tools
+         tool_instances = build_tools(tools)
+         tool_instances.each { |tool| ruby_llm_chat.with_tool(tool) }
+
+         # Add all messages except the last user message
+         add_messages_to_chat(ruby_llm_chat, messages[0..-2])
+
+         # Get the last user message
+         last_message = messages.last
+         last_content = extract_content(last_message)
+
+         # Send the request
+         response = ruby_llm_chat.ask(last_content)
+
+         parse_response(response)
+       rescue Faraday::BadRequestError => e
+         Response.new(error: "API Error: #{e.message}")
+       rescue StandardError => e
+         Response.new(error: "Error: #{e.message}")
+       end
+
+       private
+
+       def add_messages_to_chat(chat, messages)
+         messages.each do |msg|
+           case msg[:role]
+           when :user
+             chat.add_message(role: :user, content: msg[:content])
+           when :assistant
+             chat.add_message(role: :assistant, content: msg[:content])
+           when :tool_call
+             # Add as an assistant message with tool_calls
+             chat.add_message(
+               role: :assistant,
+               content: nil,
+               tool_calls: {
+                 msg[:name] => ::RubyLLM::ToolCall.new(
+                   id: msg[:id] || "call_#{SecureRandom.hex(12)}",
+                   name: msg[:name],
+                   arguments: msg[:args]
+                 )
+               }
+             )
+           when :tool_result
+             chat.add_message(
+               role: :tool,
+               content: msg[:result].to_s,
+               tool_call_id: msg[:id] || "call_#{SecureRandom.hex(12)}"
+             )
+           end
+         end
+       end
+
+       def extract_content(message)
+         case message[:role]
+         when :user, :assistant
+           message[:content]
+         else
+           message[:content].to_s
+         end
+       end
+
+       def build_tools(tools)
+         return [] if tools.nil? || tools.empty?
+
+         tools.map do |tool|
+           create_dynamic_tool(tool)
+         end
+       end
+
+       def create_dynamic_tool(tool_def)
+         tool_name = tool_def[:name]
+         tool_description = tool_def[:description]
+         tool_parameters = tool_def[:parameters] || {}
+
+         # Create a dynamic tool class
+         tool_class = Class.new(::RubyLLM::Tool) do
+           description tool_description
+
+           # Define parameters
+           properties = tool_parameters[:properties] || {}
+           required_params = tool_parameters[:required] || []
+
+           properties.each do |prop_name, prop_def|
+             param prop_name.to_sym,
+                   type: prop_def[:type] || "string",
+                   desc: prop_def[:description],
+                   required: required_params.include?(prop_name.to_s) || required_params.include?(prop_name)
+           end
+
+           # Override the name method to return the custom name
+           define_method(:name) { tool_name }
+
+           # Execute method (never actually called; present only for the tool definition)
+           define_method(:execute) { |**_args| "" }
+         end
+
+         tool_class.new
+       end
+
+       def parse_response(response)
+         return Response.new(error: "No response") unless response
+
+         text = response.content.is_a?(String) ? response.content : response.content.to_s
+         function_calls = parse_function_calls(response)
+
+         Response.new(text: text, function_calls: function_calls, raw_response: response)
+       end
+
+       def parse_function_calls(response)
+         return [] unless response.tool_call?
+
+         response.tool_calls.map do |_id, tool_call|
+           {
+             name: tool_call.name.to_s,
+             args: tool_call.arguments || {}
+           }
+         end
+       end
+     end
+   end
+ end
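`create_dynamic_tool` builds a one-off `RubyLLM::Tool` subclass at runtime using `Class.new` plus `define_method`. The same metaprogramming pattern can be sketched without the `ruby_llm` gem by swapping in a plain stand-in base class (`ToolBase` and `build_tool` below are illustrative names, not part of either gem):

```ruby
# Stand-in for RubyLLM::Tool: a class-level `description` macro, as used
# inside the Class.new block in create_dynamic_tool above.
class ToolBase
  def self.description(text = nil)
    @description = text if text
    @description
  end
end

def build_tool(name:, desc:)
  # The block is class_eval'd against the anonymous subclass, so the
  # `description` macro runs there; `desc` is captured from the closure.
  klass = Class.new(ToolBase) do
    description desc
  end
  # Locals shadow methods inside the block, so `name` here is the
  # captured keyword argument, mirroring `define_method(:name) { tool_name }`.
  klass.define_method(:name) { name }
  klass.new
end

tool = build_tool(name: "eval_ruby", desc: "Evaluate Ruby code")
tool.name               # => "eval_ruby"
tool.class.description  # => "Evaluate Ruby code"
```

One subtlety the provider relies on: because each tool gets its own anonymous class, two tools with different names and parameter lists never interfere with each other's class-level state.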
data/lib/girb-ruby_llm/version.rb ADDED
@@ -0,0 +1,5 @@
+ # frozen_string_literal: true
+
+ module GirbRubyLlm
+   VERSION = "0.1.0"
+ end
data/lib/girb-ruby_llm.rb ADDED
@@ -0,0 +1,48 @@
+ # frozen_string_literal: true
+
+ require "girb"
+ require_relative "girb/providers/ruby_llm"
+
+ # Environment variable to RubyLLM config mapping
+ RUBY_LLM_CONFIG_MAP = {
+   # Cloud providers (API key based)
+   "ANTHROPIC_API_KEY" => :anthropic_api_key,
+   "OPENAI_API_KEY" => :openai_api_key,
+   "GEMINI_API_KEY" => :gemini_api_key,
+   "DEEPSEEK_API_KEY" => :deepseek_api_key,
+   "MISTRAL_API_KEY" => :mistral_api_key,
+   "OPENROUTER_API_KEY" => :openrouter_api_key,
+   "PERPLEXITY_API_KEY" => :perplexity_api_key,
+   "XAI_API_KEY" => :xai_api_key,
+   # Local providers (URL based)
+   "OLLAMA_API_BASE" => :ollama_api_base,
+   "GPUSTACK_API_BASE" => :gpustack_api_base,
+   "GPUSTACK_API_KEY" => :gpustack_api_key,
+   # Additional configs
+   "OPENAI_API_BASE" => :openai_api_base,
+   "GEMINI_API_BASE" => :gemini_api_base,
+   "BEDROCK_API_KEY" => :bedrock_api_key,
+   "BEDROCK_SECRET_KEY" => :bedrock_secret_key,
+   "BEDROCK_REGION" => :bedrock_region,
+   "BEDROCK_SESSION_TOKEN" => :bedrock_session_token,
+   "VERTEXAI_PROJECT_ID" => :vertexai_project_id,
+   "VERTEXAI_LOCATION" => :vertexai_location
+ }.freeze
+
+ # Check whether any RubyLLM config is available
+ has_config = RUBY_LLM_CONFIG_MAP.any? { |env_var, _| ENV[env_var] }
+
+ if has_config
+   # Configure RubyLLM with all available environment variables
+   RubyLLM.configure do |config|
+     RUBY_LLM_CONFIG_MAP.each do |env_var, config_key|
+       config.send("#{config_key}=", ENV[env_var]) if ENV[env_var]
+     end
+   end
+
+   Girb.configure do |c|
+     unless c.provider
+       c.provider = Girb::Providers::RubyLlm.new(model: c.model)
+     end
+   end
+ end
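The auto-configuration above walks `RUBY_LLM_CONFIG_MAP` and forwards every environment variable that is set to the matching `config` writer via `send`. A runnable sketch of that dispatch, with a `Struct` standing in for the real RubyLLM config object and a plain `Hash` standing in for `ENV` (names below are illustrative, not part of the gem):

```ruby
# Trimmed-down version of RUBY_LLM_CONFIG_MAP: env var name -> config attribute.
CONFIG_MAP = {
  "OPENAI_API_KEY"  => :openai_api_key,
  "OLLAMA_API_BASE" => :ollama_api_base
}.freeze

# Stand-in for the RubyLLM config object, which exposes plain attr writers.
Config = Struct.new(:openai_api_key, :ollama_api_base)

def apply_env(env, config)
  CONFIG_MAP.each do |var, key|
    # Same dynamic dispatch as the loader: call e.g. `openai_api_key=`
    # only when the corresponding variable is actually set.
    config.send("#{key}=", env[var]) if env[var]
  end
  config
end

config = apply_env({ "OPENAI_API_KEY" => "sk-test" }, Config.new)
config.openai_api_key  # => "sk-test"
config.ollama_api_base # => nil
```

The map-plus-`send` shape keeps adding a new provider down to one hash entry, with no change to the configure loop itself.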
metadata ADDED
@@ -0,0 +1,81 @@
+ --- !ruby/object:Gem::Specification
+ name: girb-ruby_llm
+ version: !ruby/object:Gem::Version
+   version: 0.1.0
+ platform: ruby
+ authors:
+ - rira100000000
+ bindir: bin
+ cert_chain: []
+ date: 1980-01-02 00:00:00.000000000 Z
+ dependencies:
+ - !ruby/object:Gem::Dependency
+   name: girb
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - "~>"
+       - !ruby/object:Gem::Version
+         version: '0.1'
+   type: :runtime
+   prerelease: false
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - "~>"
+       - !ruby/object:Gem::Version
+         version: '0.1'
+ - !ruby/object:Gem::Dependency
+   name: ruby_llm
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - "~>"
+       - !ruby/object:Gem::Version
+         version: '1.0'
+   type: :runtime
+   prerelease: false
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - "~>"
+       - !ruby/object:Gem::Version
+         version: '1.0'
+ description: RubyLLM provider for girb (AI-powered IRB assistant). Install this gem
+   to use OpenAI, Anthropic, Gemini and other LLMs via RubyLLM as your LLM backend.
+ email:
+ - 101010hayakawa@gmail.com
+ executables: []
+ extensions: []
+ extra_rdoc_files: []
+ files:
+ - CHANGELOG.md
+ - LICENSE
+ - README.md
+ - README_ja.md
+ - Rakefile
+ - girb-ruby_llm.gemspec
+ - lib/girb-ruby_llm.rb
+ - lib/girb-ruby_llm/version.rb
+ - lib/girb/providers/ruby_llm.rb
+ homepage: https://github.com/rira100000000/girb-ruby_llm
+ licenses:
+ - MIT
+ metadata:
+   homepage_uri: https://github.com/rira100000000/girb-ruby_llm
+   source_code_uri: https://github.com/rira100000000/girb-ruby_llm
+   changelog_uri: https://github.com/rira100000000/girb-ruby_llm/blob/main/CHANGELOG.md
+ rdoc_options: []
+ require_paths:
+ - lib
+ required_ruby_version: !ruby/object:Gem::Requirement
+   requirements:
+   - - ">="
+     - !ruby/object:Gem::Version
+       version: 3.2.0
+ required_rubygems_version: !ruby/object:Gem::Requirement
+   requirements:
+   - - ">="
+     - !ruby/object:Gem::Version
+       version: '0'
+ requirements: []
+ rubygems_version: 3.7.2
+ specification_version: 4
+ summary: RubyLLM provider for girb
+ test_files: []