ruby_llm 0.1.0.pre36 → 0.1.0.pre37

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -0,0 +1,247 @@
+ ---
+ layout: default
+ title: Tools
+ parent: Guides
+ nav_order: 3
+ permalink: /guides/tools
+ ---
+
+ # Using Tools with RubyLLM
+
+ Tools allow AI models to call your Ruby code to perform actions or retrieve information. This guide explains how to create and use tools with RubyLLM.
+
+ ## What Are Tools?
+
+ Tools (also known as "functions" or "plugins") let AI models:
+
+ 1. Recognize when external functionality is needed
+ 2. Call your Ruby code with appropriate parameters
+ 3. Use the results to enhance their responses
+
+ Common use cases include:
+ - Retrieving real-time data
+ - Performing calculations
+ - Accessing databases
+ - Controlling external systems
+
+ ## Creating a Tool
+
+ Tools are defined as Ruby classes that inherit from `RubyLLM::Tool`:
+
+ ```ruby
+ class Calculator < RubyLLM::Tool
+   description "Performs arithmetic calculations"
+
+   param :expression,
+     type: :string,
+     desc: "A mathematical expression to evaluate (e.g. '2 + 2')"
+
+   def execute(expression:)
+     eval(expression).to_s # NOTE: eval runs arbitrary Ruby; never use it on untrusted input
+   rescue StandardError => e
+     "Error: #{e.message}"
+   end
+ end
+ ```
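Since `execute` is an ordinary Ruby method, the logic above can be exercised without a model in the loop. A minimal standalone sketch of the same behavior (a plain method with the invented name `calculate`; no RubyLLM dependency):

```ruby
# Standalone sketch of Calculator#execute's logic (no RubyLLM required).
# NOTE: eval runs arbitrary Ruby code; never feed it untrusted input.
def calculate(expression)
  eval(expression).to_s
rescue StandardError => e
  "Error: #{e.message}"
end

calculate("2 + 2")  # => "4"
calculate("1 / 0")  # => "Error: divided by 0"
```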
+
+ ### Tool Components
+
+ Each tool has these key elements:
+
+ 1. **Description** - Explains what the tool does, helping the AI decide when to use it
+ 2. **Parameters** - Define the inputs the tool expects
+ 3. **Execute Method** - The code that runs when the tool is called
+
+ ### Parameter Definition
+
+ Parameters accept several options:
+
+ ```ruby
+ param :parameter_name,
+   type: :string,       # Data type (:string, :integer, :boolean, :array, :object)
+   desc: "Description", # Description of what the parameter does
+   required: true       # Whether the parameter is required (default: true)
+ ```
+
+ ## Using Tools in Chat
+
+ To use a tool, attach it to a chat:
+
+ ```ruby
+ # Create the chat
+ chat = RubyLLM.chat
+
+ # Add a tool
+ chat.with_tool(Calculator)
+
+ # Now you can ask questions that might require calculation
+ response = chat.ask "What's 123 * 456?"
+ # => "Let me calculate that for you. 123 * 456 = 56088."
+ ```
+
+ ### Multiple Tools
+
+ You can provide multiple tools to a single chat:
+
+ ```ruby
+ class Weather < RubyLLM::Tool
+   description "Gets current weather for a location"
+
+   param :location,
+     desc: "City name or zip code"
+
+   def execute(location:)
+     # Simulate a weather lookup
+     "72°F and sunny in #{location}"
+   end
+ end
+
+ # Add multiple tools
+ chat = RubyLLM.chat
+   .with_tools(Calculator, Weather)
+
+ # Ask questions that might use either tool
+ chat.ask "What's the temperature in New York City?"
+ chat.ask "If it's 72°F in NYC and 54°F in Boston, what's the average?"
+ ```
+
+ ## Custom Initialization
+
+ Tools can have custom initialization:
+
+ ```ruby
+ class DocumentSearch < RubyLLM::Tool
+   description "Searches documents by relevance"
+
+   param :query,
+     desc: "The search query"
+
+   param :limit,
+     type: :integer,
+     desc: "Maximum number of results",
+     required: false
+
+   def initialize(database)
+     @database = database
+   end
+
+   def execute(query:, limit: 5)
+     # Search in @database
+     @database.search(query, limit: limit)
+   end
+ end
+
+ # Initialize with dependencies
+ search_tool = DocumentSearch.new(MyDatabase)
+ chat.with_tool(search_tool)
+ ```
+
+ ## The Tool Execution Flow
+
+ Here's what happens when a tool is used:
+
+ 1. You ask a question
+ 2. The model decides a tool is needed
+ 3. The model selects the tool and provides arguments
+ 4. RubyLLM calls your tool's `execute` method
+ 5. The result is sent back to the model
+ 6. The model incorporates the result into its response
+
+ For example:
+
+ ```ruby
+ response = chat.ask "What's 123 squared plus 456?"
+
+ # Behind the scenes:
+ # 1. The model decides it needs to calculate
+ # 2. The model calls Calculator with expression: "123 * 123 + 456"
+ # 3. The tool returns "15585"
+ # 4. The model incorporates this in its response
+ ```
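The six steps above can be sketched as a toy loop (everything here is hypothetical: the scripted `fake_model` stands in for the provider, and this is not RubyLLM's real internals), but it shows the shape of the tool round-trip:

```ruby
# Toy, self-contained sketch of the execution flow (hypothetical names;
# not RubyLLM's real internals). The scripted "model" first requests the
# calculator tool, then answers using the tool's result.
TOOLS = {
  "calculator" => ->(expression:) { eval(expression).to_s } # eval for brevity only
}

def fake_model(turns)
  if turns.none? { |t| t[:role] == :tool }
    { tool_call: { name: "calculator", args: { expression: "123 * 123 + 456" } } }
  else
    { content: "123 squared plus 456 is #{turns.last[:content]}." }
  end
end

def run(prompt)
  turns = [{ role: :user, content: prompt }]        # 1. you ask a question
  loop do
    response = fake_model(turns)                    # 2-3. model picks a tool and arguments
    return response[:content] unless response[:tool_call]
    call = response[:tool_call]
    result = TOOLS[call[:name]].call(**call[:args]) # 4. your Ruby code runs
    turns << { role: :tool, content: result }       # 5. result goes back to the model
  end                                               # 6. model folds it into its answer
end

run("What's 123 squared plus 456?")
# => "123 squared plus 456 is 15585."
```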
+
+ ## Debugging Tools
+
+ Enable debug logging to see tool calls in action:
+
+ ```ruby
+ # Enable debug logging
+ ENV['RUBY_LLM_DEBUG'] = 'true'
+
+ # Make a request
+ chat.ask "What's 15329 divided by 437?"
+
+ # Console output:
+ # D, -- RubyLLM: Tool calculator called with: {"expression"=>"15329 / 437"}
+ # D, -- RubyLLM: Tool calculator returned: "35"
+ ```
+
+ ## Error Handling
+
+ Tools can handle errors gracefully:
+
+ ```ruby
+ class Calculator < RubyLLM::Tool
+   description "Performs arithmetic calculations"
+
+   param :expression,
+     type: :string,
+     desc: "Math expression to evaluate"
+
+   def execute(expression:)
+     eval(expression).to_s
+   rescue StandardError => e
+     # Return the error as the result so the model can explain it
+     { error: "Error calculating #{expression}: #{e.message}" }
+   end
+ end
+
+ # When there's an error, the model will receive and explain it
+ chat.ask "What's 1/0?"
+ # => "I tried to calculate 1/0, but there was an error: divided by 0"
+ ```
+
+ ## Advanced Tool Parameters
+
+ Tools can have complex parameter types:
+
+ ```ruby
+ class DataAnalysis < RubyLLM::Tool
+   description "Analyzes numerical data"
+
+   param :data,
+     type: :array,
+     desc: "Array of numbers to analyze"
+
+   param :operations,
+     type: :object,
+     desc: "Analysis operations to perform",
+     required: false
+
+   def execute(data:, operations: { mean: true, median: false })
+     result = {}
+
+     result[:mean] = data.sum.to_f / data.size if operations[:mean]
+     result[:median] = calculate_median(data) if operations[:median]
+
+     result
+   end
+
+   private
+
+   def calculate_median(data)
+     sorted = data.sort
+     mid = sorted.size / 2
+     sorted.size.odd? ? sorted[mid] : (sorted[mid - 1] + sorted[mid]) / 2.0
+   end
+ end
+ ```
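The statistics inside `execute` are plain Ruby, so they can be checked in isolation. A standalone sketch of the same mean/median logic (`analyze` is a name invented here; no RubyLLM dependency):

```ruby
# Standalone version of DataAnalysis's mean/median logic (no RubyLLM required).
def analyze(data, operations = { mean: true, median: false })
  result = {}
  result[:mean] = data.sum.to_f / data.size if operations[:mean]
  if operations[:median]
    sorted = data.sort
    mid = sorted.size / 2
    result[:median] = sorted.size.odd? ? sorted[mid] : (sorted[mid - 1] + sorted[mid]) / 2.0
  end
  result
end

analyze([1, 2, 3, 4], { mean: true, median: true })
# => {:mean=>2.5, :median=>2.5}
```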
+
+ ## When to Use Tools
+
+ Tools are best for:
+
+ 1. **External data retrieval** - Getting real-time information like weather, prices, or database records
+ 2. **Computation** - When calculations are complex or involve large numbers
+ 3. **System integration** - Connecting to external APIs or services
+ 4. **Data processing** - Working with files, formatting data, or analyzing information
+ 5. **Stateful operations** - When you need to maintain state between calls
data/docs/index.md ADDED
@@ -0,0 +1,53 @@
+ ---
+ layout: default
+ title: Home
+ nav_order: 1
+ description: "RubyLLM is a delightful Ruby way to work with AI."
+ permalink: /
+ ---
+
+ # RubyLLM
+ {: .fs-9 }
+
+ A delightful Ruby way to work with AI through a unified interface to OpenAI, Anthropic, Google, and DeepSeek.
+ {: .fs-6 .fw-300 }
+
+ [Get started now]({% link installation.md %}){: .btn .btn-primary .fs-5 .mb-4 .mb-md-0 .mr-2 }
+ [View on GitHub](https://github.com/crmne/ruby_llm){: .btn .fs-5 .mb-4 .mb-md-0 }
+
+ ---
+
+ ## Overview
+
+ RubyLLM provides a beautiful, unified interface to modern AI services, including:
+
+ - 💬 **Chat** with OpenAI GPT, Anthropic Claude, Google Gemini, and DeepSeek models
+ - 🖼️ **Image generation** with DALL-E and other providers
+ - 🔍 **Embeddings** for vector search and semantic analysis
+ - 🔧 **Tools** that let AI use your Ruby code
+ - 🚊 **Rails integration** to persist chats and messages with ActiveRecord
+ - 🌊 **Streaming** responses with proper Ruby patterns
+
+ ## Quick start
+
+ ```ruby
+ require 'ruby_llm'
+
+ # Configure your API keys
+ RubyLLM.configure do |config|
+   config.openai_api_key = ENV['OPENAI_API_KEY']
+ end
+
+ # Start chatting
+ chat = RubyLLM.chat
+ response = chat.ask "What's the best way to learn Ruby?"
+
+ # Generate images
+ image = RubyLLM.paint "a sunset over mountains"
+ puts image.url
+ ```
+
+ ## Learn more
+
+ - [Installation]({% link installation.md %})
+ - [Guides]({% link guides/index.md %})
@@ -0,0 +1,98 @@
+ ---
+ layout: default
+ title: Installation
+ nav_order: 2
+ permalink: /installation
+ ---
+
+ # Installation
+
+ RubyLLM is packaged as a Ruby gem, making it easy to install in your projects.
+
+ ## Requirements
+
+ * Ruby 3.1 or later
+ * An API key from at least one of the supported providers:
+   * OpenAI
+   * Anthropic
+   * Google (Gemini)
+   * DeepSeek
+
+ ## Installation Methods
+
+ ### Using Bundler (recommended)
+
+ Add RubyLLM to your project's Gemfile:
+
+ ```ruby
+ gem 'ruby_llm'
+ ```
+
+ Then install the dependencies:
+
+ ```bash
+ bundle install
+ ```
+
+ ### Manual Installation
+
+ If you're not using Bundler, you can install RubyLLM directly:
+
+ ```bash
+ gem install ruby_llm
+ ```
+
+ ## Configuration
+
+ After installing RubyLLM, you'll need to configure it with your API keys:
+
+ ```ruby
+ require 'ruby_llm'
+
+ RubyLLM.configure do |config|
+   # Required: At least one API key
+   config.openai_api_key = ENV['OPENAI_API_KEY']
+   config.anthropic_api_key = ENV['ANTHROPIC_API_KEY']
+   config.gemini_api_key = ENV['GEMINI_API_KEY']
+   config.deepseek_api_key = ENV['DEEPSEEK_API_KEY']
+
+   # Optional: Set default models
+   config.default_model = 'gpt-4o-mini'                      # Default chat model
+   config.default_embedding_model = 'text-embedding-3-small' # Default embedding model
+   config.default_image_model = 'dall-e-3'                   # Default image generation model
+
+   # Optional: Configure request settings
+   config.request_timeout = 120 # Request timeout in seconds
+   config.max_retries = 3       # Number of retries on failures
+ end
+ ```
+
+ We recommend storing your API keys as environment variables rather than hardcoding them in your application.
+
+ ## Verifying Installation
+
+ You can verify that RubyLLM is correctly installed and configured by running a simple test:
+
+ ```ruby
+ require 'ruby_llm'
+
+ # Configure with at least one API key
+ RubyLLM.configure do |config|
+   config.openai_api_key = ENV['OPENAI_API_KEY']
+ end
+
+ # Try a simple query
+ chat = RubyLLM.chat
+ response = chat.ask "Hello, world!"
+ puts response.content
+
+ # Check available models
+ puts "Available models:"
+ RubyLLM.models.chat_models.each do |model|
+   puts "- #{model.id} (#{model.provider})"
+ end
+ ```
+
+ ## Next Steps
+
+ Once you've successfully installed RubyLLM, check out the [Getting Started guide]({% link guides/getting-started.md %}) to learn how to use it in your applications.
@@ -85,10 +85,10 @@ module RubyLLM
     .on_end_message { |msg| persist_message_completion(msg) }
  end
 
- def ask(message, &block)
+ def ask(message, &)
    message = { role: :user, content: message }
    messages.create!(**message)
-   to_llm.complete(&block)
+   to_llm.complete(&)
  end
 
  alias say ask
data/lib/ruby_llm/chat.rb CHANGED
@@ -72,18 +72,18 @@ module RubyLLM
    self
  end
 
- def each(&block)
-   messages.each(&block)
+ def each(&)
+   messages.each(&)
  end
 
- def complete(&block)
+ def complete(&)
    @on[:new_message]&.call
-   response = @provider.complete(messages, tools: @tools, temperature: @temperature, model: @model.id, &block)
+   response = @provider.complete(messages, tools: @tools, temperature: @temperature, model: @model.id, &)
    @on[:end_message]&.call(response)
 
    add_message response
    if response.tool_call?
-     handle_tool_calls response, &block
+     handle_tool_calls(response, &)
    else
      response
    end
@@ -97,7 +97,7 @@ module RubyLLM
 
  private
 
- def handle_tool_calls(response, &block)
+ def handle_tool_calls(response, &)
    response.tool_calls.each_value do |tool_call|
      @on[:new_message]&.call
      result = execute_tool tool_call
@@ -105,7 +105,7 @@ module RubyLLM
      @on[:end_message]&.call(message)
    end
 
-   complete(&block)
+   complete(&)
  end
 
  def execute_tool(tool_call)
@@ -20,8 +20,7 @@ module RubyLLM
  # @return [Integer] the maximum output tokens
  def determine_max_tokens(model_id)
    case model_id
-   when /claude-3-7-sonnet/ then 8_192 # Can be increased to 64K with extended thinking
-   when /claude-3-5/ then 8_192
+   when /claude-3-(7-sonnet|5)/ then 8_192 # Can be increased to 64K with extended thinking
    else 4_096 # Claude 3 Opus and Haiku
    end
  end
@@ -35,7 +35,7 @@ module RubyLLM
 
  def extract_text_content(blocks)
    text_blocks = blocks.select { |c| c['type'] == 'text' }
-   text_blocks.map { |c| c['text'] }.join('')
+   text_blocks.map { |c| c['text'] }.join
  end
 
  def build_message(data, content, tool_use)
@@ -68,8 +68,7 @@ module RubyLLM
 
  def convert_role(role)
    case role
-   when :tool then 'user'
-   when :user then 'user'
+   when :tool, :user then 'user'
    else 'assistant'
    end
  end
@@ -95,7 +95,6 @@ module RubyLLM
  # @return [Symbol] the model family
  def model_family(model_id)
    case model_id
-   when /deepseek-chat/ then :chat
    when /deepseek-reasoner/ then :reasoner
    else :chat # Default to chat family
    end
@@ -97,7 +97,7 @@ module RubyLLM
    .join(' ')
    .gsub(/(\d+\.\d+)/, ' \1') # Add space before version numbers
    .gsub(/\s+/, ' ') # Clean up multiple spaces
-   .gsub(/Aqa/, 'AQA') # Special case for AQA
+   .gsub('Aqa', 'AQA') # Special case for AQA
    .strip
  end
 
@@ -15,7 +15,6 @@ module RubyLLM
    when /o1-2024/, /o3-mini/, /o3-mini-2025/ then 200_000
    when /gpt-4o/, /gpt-4o-mini/, /gpt-4-turbo/, /o1-mini/ then 128_000
    when /gpt-4-0[0-9]{3}/ then 8_192
-   when /gpt-3.5-turbo-instruct/ then 4_096
    when /gpt-3.5/ then 16_385
    when /babbage-002/, /davinci-002/ then 16_384
    else 4_096
@@ -25,16 +24,12 @@ module RubyLLM
  # Returns the maximum output tokens for the given model ID
  # @param model_id [String] the model identifier
  # @return [Integer] the maximum output tokens
- def max_tokens_for(model_id) # rubocop:disable Metrics/CyclomaticComplexity,Metrics/MethodLength
+ def max_tokens_for(model_id)
    case model_id
    when /o1-2024/, /o3-mini/, /o3-mini-2025/ then 100_000
    when /o1-mini-2024/ then 65_536
-   when /gpt-4o-2024-05-13/ then 4_096
-   when /gpt-4o-realtime/, /gpt-4o-mini-realtime/ then 4_096
-   when /gpt-4o/, /gpt-4o-mini/, /gpt-4o-audio/, /gpt-4o-mini-audio/ then 16_384
+   when /gpt-4o/, /gpt-4o-mini/, /gpt-4o-audio/, /gpt-4o-mini-audio/, /babbage-002/, /davinci-002/ then 16_384
    when /gpt-4-0[0-9]{3}/ then 8_192
-   when /gpt-4-turbo/, /gpt-3.5-turbo/ then 4_096
-   when /babbage-002/, /davinci-002/ then 16_384
    else 4_096
    end
  end
@@ -211,7 +206,7 @@ module RubyLLM
  # @return [String] the humanized model name
  def humanize(id)
    id.tr('-', ' ')
-     .split(' ')
+     .split
      .map(&:capitalize)
      .join(' ')
  end
@@ -229,12 +224,12 @@ module RubyLLM
    .gsub(/^Chatgpt /, 'ChatGPT-')
    .gsub(/^Tts /, 'TTS-')
    .gsub(/^Dall E /, 'DALL-E-')
-   .gsub(/3\.5 /, '3.5-')
-   .gsub(/4 /, '4-')
+   .gsub('3.5 ', '3.5-')
+   .gsub('4 ', '4-')
    .gsub(/4o (?=Mini|Preview|Turbo|Audio|Realtime)/, '4o-')
    .gsub(/\bHd\b/, 'HD')
-   .gsub(/Omni Moderation/, 'Omni-Moderation')
-   .gsub(/Text Moderation/, 'Text-Moderation')
+   .gsub('Omni Moderation', 'Omni-Moderation')
+   .gsub('Text Moderation', 'Text-Moderation')
  end
  end
  end
@@ -25,7 +25,7 @@ module RubyLLM
    vectors = data['data'].map { |d| d['embedding'] }
 
    # If we only got one embedding, return it as a single vector
-   vectors = vectors.size == 1 ? vectors.first : vectors
+   vectors = vectors.first if vectors.size == 1
 
    Embedding.new(
      vectors: vectors,
@@ -1,5 +1,5 @@
 # frozen_string_literal: true
 
 module RubyLLM
-  VERSION = '0.1.0.pre36'
+  VERSION = '0.1.0.pre37'
 end
@@ -70,8 +70,8 @@ namespace :models do # rubocop:disable Metrics/BlockLength
  RubyLLM.configure do |config|
    config.openai_api_key = ENV.fetch('OPENAI_API_KEY')
    config.anthropic_api_key = ENV.fetch('ANTHROPIC_API_KEY')
-   config.gemini_api_key = ENV['GEMINI_API_KEY']
-   config.deepseek_api_key = ENV['DEEPSEEK_API_KEY']
+   config.gemini_api_key = ENV.fetch('GEMINI_API_KEY', nil)
+   config.deepseek_api_key = ENV.fetch('DEEPSEEK_API_KEY', nil)
  end
 
  # Get all models
data/ruby_llm.gemspec CHANGED
@@ -9,19 +9,21 @@ Gem::Specification.new do |spec|
  spec.email = ['carmine@paolino.me']
 
  spec.summary = 'Beautiful Ruby interface to modern AI'
- spec.description = 'A delightful Ruby way to work with AI. Chat in text, analyze and generate images, understand' \
-   ' audio, and use tools through a unified interface to OpenAI, Anthropic, Google, and DeepSeek.' \
-   ' Built for developer happiness with automatic token counting, proper streaming, and Rails' \
-   ' integration. No wrapping your head around multiple APIs - just clean Ruby code that works.'
- spec.homepage = 'https://github.com/crmne/ruby_llm'
+ spec.description = 'A delightful Ruby way to work with AI. Chat in text, analyze and generate images, understand ' \
+   'audio, and use tools through a unified interface to OpenAI, Anthropic, Google, and DeepSeek. ' \
+   'Built for developer happiness with automatic token counting, proper streaming, and Rails ' \
+   'integration. No wrapping your head around multiple APIs - just clean Ruby code that works.'
+ spec.homepage = 'https://rubyllm.com'
  spec.license = 'MIT'
  spec.required_ruby_version = Gem::Requirement.new('>= 3.1.0')
 
  spec.metadata['homepage_uri'] = spec.homepage
- spec.metadata['source_code_uri'] = spec.homepage
- spec.metadata['changelog_uri'] = "#{spec.homepage}/commits/main"
+ spec.metadata['source_code_uri'] = 'https://github.com/crmne/ruby_llm'
+ spec.metadata['changelog_uri'] = "#{spec.metadata['source_code_uri']}/commits/main"
  spec.metadata['documentation_uri'] = spec.homepage
- spec.metadata['bug_tracker_uri'] = "#{spec.homepage}/issues"
+ spec.metadata['bug_tracker_uri'] = "#{spec.metadata['source_code_uri']}/issues"
+
+ spec.metadata['rubygems_mfa_required'] = 'true'
 
  # Specify which files should be added to the gem when it is released.
  # The `git ls-files -z` loads the files in the RubyGem that have been added into git.
@@ -38,28 +40,4 @@ Gem::Specification.new do |spec|
  spec.add_dependency 'faraday-multipart', '>= 1.0'
  spec.add_dependency 'faraday-retry', '>= 2.0'
  spec.add_dependency 'zeitwerk', '>= 2.6'
-
- # Rails integration dependencies
- spec.add_development_dependency 'activerecord', '>= 6.0', '< 9.0'
- spec.add_development_dependency 'activesupport', '>= 6.0', '< 9.0'
-
- # Development dependencies
- spec.add_development_dependency 'bundler', '>= 2.0'
- spec.add_development_dependency 'codecov'
- spec.add_development_dependency 'dotenv'
- spec.add_development_dependency 'irb'
- spec.add_development_dependency 'nokogiri'
- spec.add_development_dependency 'overcommit', '>= 0.66'
- spec.add_development_dependency 'pry', '>= 0.14'
- spec.add_development_dependency 'rake', '>= 13.0'
- spec.add_development_dependency 'rdoc'
- spec.add_development_dependency 'reline'
- spec.add_development_dependency 'rspec', '~> 3.12'
- spec.add_development_dependency 'rubocop', '>= 1.0'
- spec.add_development_dependency 'rubocop-rake', '>= 0.6'
- spec.add_development_dependency 'simplecov', '>= 0.21'
- spec.add_development_dependency 'simplecov-cobertura'
- spec.add_development_dependency 'sqlite3'
- spec.add_development_dependency 'webmock', '~> 3.18'
- spec.add_development_dependency 'yard', '>= 0.9'
 end