ai-lite 0.1.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml ADDED
@@ -0,0 +1,7 @@
+ ---
+ SHA256:
+   metadata.gz: f6f864c2c29ee79c0aa144c940fa34a3a4b84189239436cab38fcd1862401726
+   data.tar.gz: 293507b5e1af278ab17658f81f63dc257a7d6ce2b00c79b77d64c6ba1e9fb434
+ SHA512:
+   metadata.gz: 77bf79ade3eec73d1cbd06b269dd0d4566d4b06f96d7919dfa74a6f514572ff24fe36a93e65e9e43aa4a369e7c9a3db2e771e5c4419c8e22fc8f19d6f63db1af
+   data.tar.gz: 2033b7e8215841e2b57eb46355cef483e5badc30d56affffd1cd47b7e245442612fee91ec0918fda4828064fca7a7dc62e2268b329730ae0534e8ec07ed16aee
data/LICENSE ADDED
@@ -0,0 +1,21 @@
+ MIT License
+
+ Copyright (c) 2026 William Basmayor
+
+ Permission is hereby granted, free of charge, to any person obtaining a copy
+ of this software and associated documentation files (the "Software"), to deal
+ in the Software without restriction, including without limitation the rights
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ copies of the Software, and to permit persons to whom the Software is
+ furnished to do so, subject to the following conditions:
+
+ The above copyright notice and this permission notice shall be included in all
+ copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+ SOFTWARE.
data/README.md ADDED
@@ -0,0 +1,150 @@
+ # AI Lite
+
+ AI Lite is a pure Ruby, dependency-light client for simple chat-style calls through OpenAI's Responses API.
+
+ It is built for Rails apps and plain Ruby projects where installing a full OpenAI SDK is too heavy, incompatible, or unnecessary. It is especially useful in legacy Rails apps where constraints on Rails, ActiveSupport, Ruby, or HTTP-client versions make larger client libraries difficult to add.
+
+ Use AI Lite when you want a small OpenAI client that works without tying your app to a specific Rails version or a larger dependency stack.
+
+ This gem is intentionally small:
+
+ - No Rails dependency
+ - No official OpenAI gem dependency
+ - No Faraday, HTTParty, ActiveSupport, or connection-pool dependency
+ - Uses only the Ruby stdlib: `Net::HTTP`, `URI`, and `JSON`
+
+ It is not meant to replace the official OpenAI SDK. It is a small wrapper for projects that only need one clean interface:
+
+ ```ruby
+ ai.chat("Say hello")
+ ```
+
+ ## Usage
+
+ ```ruby
+ require "ai_lite"
+
+ ai = AiLite.new
+ result = ai.chat("Say hello")
+
+ puts result["content"]
+ ```
+
+ If no key is passed in, the client uses the key set via `AiLite.configure`, then falls back to the `OPENAI_API_KEY` and `OPEN_AI_TOKEN` environment variables, in that order.
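That lookup order can be sketched in plain Ruby (illustrative only; `resolve_api_key` is a hypothetical helper, not part of the gem):

```ruby
# Sketch of the key lookup order used when constructing a client:
# explicit argument, then configured key, then the two env vars.
def resolve_api_key(explicit = nil, configured = nil)
  explicit || configured || ENV["OPENAI_API_KEY"] || ENV["OPEN_AI_TOKEN"]
end

ENV["OPENAI_API_KEY"] = "env-key"

puts resolve_api_key("explicit-key") # an explicit key always wins
puts resolve_api_key                 # otherwise the environment is consulted
```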
+
+ You can also pass the key directly:
+
+ ```ruby
+ ai = AiLite.new(api_key: "sk-...")
+ ```
+
+ ## Configuration
+
+ In Rails, configure the default client from an initializer:
+
+ ```ruby
+ # config/initializers/ai_lite.rb
+ AiLite.configure do |config|
+   config.api_key = ENV["OPENAI_API_KEY"]
+   config.model = "gpt-5.5"
+   config.timeout = 120
+   config.max_output_tokens = 2000
+ end
+ ```
+
+ Then use the configured singleton-style client:
+
+ ```ruby
+ result = AiLite.chat("Say hello")
+ ```
+
+ You can still instantiate a separate client for another token:
+
+ ```ruby
+ client = AiLite.new(api_key: "sk-other-token")
+ result = client.chat("Say hello")
+ ```
+
+ ## Chat
+
+ ```ruby
+ result = ai.chat(
+   "Return JSON confirming whether this is valid.",
+   instructions: "Return only minified JSON.",
+   max_output_tokens: 500,
+   options: {
+     text: { verbosity: "low" }
+   }
+ )
+ ```
+
+ `chat` sends a `POST` request to `/v1/responses` with:
+
+ - `model`
+ - `input`
+ - `max_output_tokens`
+ - optional `instructions`
+ - any extra `options`, merged into the payload
+
+ The `debug` flag is not part of the payload; it only controls whether the raw API response is included in the returned envelope.
+
+ The default model is `gpt-5.5`.
+
+ The OpenAI API URL is fixed to `https://api.openai.com/v1/responses`.
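The reserved keys are merged over `options`, so `model`, `input`, and `max_output_tokens` win on collision. A self-contained sketch of that merge (it mirrors the gem's internals but does not call them):

```ruby
require "json"

# Reserved keys are merged last, so they override anything
# passed through `options` with the same name.
options = { text: { verbosity: "low" }, model: "this-gets-overridden" }

payload = options.merge(
  model: "gpt-5.5",
  input: "Say hello",
  max_output_tokens: 2000
)

puts JSON.generate(payload)
```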
+
+ ## Return Shape
+
+ `chat` always returns a hash envelope.
+
+ Text output:
+
+ ```ruby
+ {
+   "content" => "Hello!",
+   "status" => 200,
+   "error" => nil,
+   "raw" => nil
+ }
+ ```
+
+ JSON-looking model output is parsed into a Ruby hash:
+
+ ```ruby
+ {
+   "content" => { "valid" => true },
+   "status" => 200,
+   "error" => nil,
+   "raw" => nil
+ }
+ ```
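The JSON coercion is a simple parse-or-fallback rule; the gem's private `parse_content` helper is short enough to reproduce here:

```ruby
require "json"

# Parse-or-fallback: valid JSON becomes a Ruby object,
# anything else comes back as the original string.
def parse_content(raw_text)
  return nil if raw_text.nil? || raw_text.empty?

  JSON.parse(raw_text)
rescue JSON::ParserError
  raw_text
end

parse_content('{"valid":true}') # => { "valid" => true }
parse_content("plain text")     # => "plain text"
```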
+
+ Failure:
+
+ ```ruby
+ {
+   "content" => nil,
+   "status" => 401,
+   "error" => "Invalid API key",
+   "raw" => nil
+ }
+ ```
+
+ Pass `debug: true` to include the raw OpenAI response:
+
+ ```ruby
+ result = ai.chat("Say hello", debug: true)
+
+ {
+   "content" => "Hello!",
+   "status" => 200,
+   "error" => nil,
+   "raw" => { ... }
+ }
+ ```
+
+ ## Development
+
+ Run the test suite:
+
+ ```sh
+ ruby -Ilib:test test/ai_lite_test.rb
+ ```
data/lib/ai_lite/version.rb ADDED
@@ -0,0 +1,3 @@
+ class AiLite
+   VERSION = "0.1.0".freeze
+ end
data/lib/ai_lite.rb ADDED
@@ -0,0 +1,165 @@
+ require "json"
+ require "net/http"
+ require "uri"
+ require_relative "ai_lite/version"
+
+ class AiLite
+   API_BASE_URL = "https://api.openai.com/v1".freeze
+   DEFAULT_MODEL = "gpt-5.5".freeze
+   DEFAULT_TIMEOUT = 120
+   DEFAULT_MAX_OUTPUT_TOKENS = 2000
+
+   class Configuration
+     attr_accessor :api_key, :model, :timeout, :max_output_tokens
+
+     def initialize
+       @api_key = nil
+       @model = DEFAULT_MODEL
+       @timeout = DEFAULT_TIMEOUT
+       @max_output_tokens = DEFAULT_MAX_OUTPUT_TOKENS
+     end
+   end
+
+   class << self
+     def configuration
+       @configuration ||= Configuration.new
+     end
+
+     def configure
+       yield(configuration)
+       reset_client!
+       configuration
+     end
+
+     def reset_configuration!
+       @configuration = Configuration.new
+       reset_client!
+       configuration
+     end
+
+     def client
+       @client ||= new
+     end
+
+     def chat(message, **kwargs)
+       client.chat(message, **kwargs)
+     end
+
+     def reset_client!
+       @client = nil
+     end
+   end
+
+   attr_reader :api_key, :model, :timeout, :max_output_tokens, :headers
+
+   def initialize(api_key: nil, model: nil, timeout: nil, max_output_tokens: nil)
+     @api_key = api_key || self.class.configuration.api_key || ENV["OPENAI_API_KEY"] || ENV["OPEN_AI_TOKEN"]
+     raise ArgumentError, "Missing OpenAI API key" if @api_key.to_s.strip.empty?
+
+     @model = model || self.class.configuration.model
+     @timeout = timeout || self.class.configuration.timeout
+     @max_output_tokens = max_output_tokens || self.class.configuration.max_output_tokens
+     @headers = {
+       "Authorization" => "Bearer #{@api_key}",
+       "Content-Type" => "application/json"
+     }
+   end
+
+   def chat(message, model: nil, instructions: nil, max_output_tokens: nil, debug: false, options: {})
+     payload = options.merge(
+       model: model || self.model,
+       input: message,
+       max_output_tokens: max_output_tokens || self.max_output_tokens
+     )
+     payload[:instructions] = instructions if instructions
+
+     extract_content(post(payload), debug: debug)
+   rescue => e
+     prettify_data(status: "unknown", error: e.message, raw: nil, debug: debug)
+   end
+
+   private
+
+   def post(payload)
+     uri = URI.parse(response_endpoint)
+     request = Net::HTTP::Post.new(uri)
+
+     headers.each do |key, value|
+       request[key] = value
+     end
+
+     request.body = JSON.generate(payload)
+
+     Net::HTTP.start(uri.host, uri.port, use_ssl: uri.scheme == "https") do |http|
+       http.open_timeout = timeout if http.respond_to?(:open_timeout=)
+       http.read_timeout = timeout if http.respond_to?(:read_timeout=)
+       http.request(request)
+     end
+   end
+
+   def response_endpoint
+     "#{API_BASE_URL}/responses"
+   end
+
+   def extract_content(response, debug: false)
+     status = response.code.to_i
+     parsed_response = JSON.parse(response.body)
+
+     unless success_status?(status)
+       return prettify_data(status: status, error: error_message(parsed_response), raw: parsed_response, debug: debug)
+     end
+
+     raw_content = extract_output_text(parsed_response)
+     content = parse_content(raw_content)
+     prettify_data(status: status, content: content, raw: parsed_response, debug: debug)
+   rescue JSON::ParserError => e
+     prettify_data(status: response_status(response), error: e.message, raw: response&.body, debug: debug)
+   rescue => e
+     prettify_data(status: response_status(response), error: e.message, raw: nil, debug: debug)
+   end
+
+   def extract_output_text(raw)
+     Array(raw["output"]).flat_map do |item|
+       next [] unless item.is_a?(Hash) && item["type"] == "message"
+
+       Array(item["content"]).map do |content|
+         next unless content.is_a?(Hash) && content["type"] == "output_text"
+
+         content["text"]
+       end.compact
+     end.join.strip
+   end
+
+   def parse_content(raw_text)
+     return nil if raw_text.nil? || raw_text.empty?
+
+     JSON.parse(raw_text)
+   rescue JSON::ParserError
+     raw_text
+   end
+
+   def success_status?(status)
+     status >= 200 && status < 300
+   end
+
+   def error_message(raw)
+     if raw.is_a?(Hash)
+       raw.dig("error", "message") || raw["error"] || raw["message"] || raw.to_s
+     else
+       raw.to_s
+     end
+   end
+
+   def response_status(response)
+     response&.code&.to_i || "unknown"
+   end
+
+   def prettify_data(status:, content: nil, error: nil, raw:, debug: false)
+     {
+       "content" => content,
+       "status" => status,
+       "error" => error,
+       "raw" => debug ? raw : nil
+     }
+   end
+ end
data/test/ai_lite_test.rb ADDED
@@ -0,0 +1,379 @@
+ # frozen_string_literal: true
+
+ require_relative "test_helper"
+
+ class AiLiteTest < Minitest::Test
+   FakeResponse = Struct.new(:code, :body)
+
+   class FakeHttp
+     attr_accessor :open_timeout, :read_timeout
+     attr_reader :last_request
+
+     def initialize(response)
+       @response = response
+     end
+
+     def request(request)
+       @last_request = request
+       @response
+     end
+   end
+
+   def setup
+     AiLite.reset_configuration!
+   end
+
+   def test_has_a_version_number
+     refute_nil AiLite::VERSION
+   end
+
+   def test_stores_initializer_config_and_headers
+     with_env("OPENAI_API_KEY" => nil, "OPEN_AI_TOKEN" => nil) do
+       client = AiLite.new(
+         api_key: "explicit-key",
+         model: "gpt-test",
+         timeout: 10
+       )
+
+       assert_equal "explicit-key", client.api_key
+       assert_equal "gpt-test", client.model
+       assert_equal 10, client.timeout
+       assert_equal 2000, client.max_output_tokens
+       assert_equal "Bearer explicit-key", client.headers["Authorization"]
+       assert_equal "application/json", client.headers["Content-Type"]
+     end
+   end
+
+   def test_configure_sets_defaults_for_client
+     with_env("OPENAI_API_KEY" => nil, "OPEN_AI_TOKEN" => nil) do
+       AiLite.configure do |config|
+         config.api_key = "configured-key"
+         config.model = "gpt-config"
+         config.timeout = 15
+         config.max_output_tokens = 750
+       end
+
+       client = AiLite.client
+
+       assert_equal "configured-key", client.api_key
+       assert_equal "gpt-config", client.model
+       assert_equal 15, client.timeout
+       assert_equal 750, client.max_output_tokens
+       assert_same client, AiLite.client
+     end
+   end
+
+   def test_configure_resets_cached_client
+     AiLite.configure { |config| config.api_key = "first-key" }
+     first_client = AiLite.client
+
+     AiLite.configure { |config| config.api_key = "second-key" }
+     second_client = AiLite.client
+
+     refute_same first_client, second_client
+     assert_equal "second-key", second_client.api_key
+   end
+
+   def test_instance_arguments_override_configuration
+     AiLite.configure do |config|
+       config.api_key = "configured-key"
+       config.model = "gpt-config"
+       config.timeout = 15
+       config.max_output_tokens = 750
+     end
+
+     client = AiLite.new(
+       api_key: "explicit-key",
+       model: "gpt-explicit",
+       timeout: 5,
+       max_output_tokens: 300
+     )
+
+     assert_equal "explicit-key", client.api_key
+     assert_equal "gpt-explicit", client.model
+     assert_equal 5, client.timeout
+     assert_equal 300, client.max_output_tokens
+   end
+
+   def test_api_key_fallback_order
+     with_env("OPENAI_API_KEY" => "openai-key", "OPEN_AI_TOKEN" => "legacy-key") do
+       assert_equal "openai-key", AiLite.new.api_key
+       assert_equal "explicit-key", AiLite.new(api_key: "explicit-key").api_key
+     end
+
+     with_env("OPENAI_API_KEY" => nil, "OPEN_AI_TOKEN" => "legacy-key") do
+       assert_equal "legacy-key", AiLite.new.api_key
+     end
+   end
+
+   def test_configured_api_key_takes_precedence_over_env
+     AiLite.configure { |config| config.api_key = "configured-key" }
+
+     with_env("OPENAI_API_KEY" => "openai-key", "OPEN_AI_TOKEN" => "legacy-key") do
+       assert_equal "configured-key", AiLite.new.api_key
+       assert_equal "explicit-key", AiLite.new(api_key: "explicit-key").api_key
+     end
+   end
+
+   def test_missing_api_key_raises
+     with_env("OPENAI_API_KEY" => nil, "OPEN_AI_TOKEN" => nil) do
+       assert_raises(ArgumentError) { AiLite.new }
+     end
+   end
+
+   def test_chat_sends_post_to_responses_with_default_payload
+     client = AiLite.new(api_key: "token-abc")
+
+     with_stubbed_http(success_response("Hello!")) do |captured, _response|
+       result = client.chat("Say hello")
+       request = captured[:http].last_request
+       payload = JSON.parse(request.body)
+
+       assert_equal "Hello!", result["content"]
+       assert_equal 200, result["status"]
+       assert_nil result["error"]
+       assert_nil result["raw"]
+       assert_equal "api.openai.com", captured[:host]
+       assert_equal 443, captured[:port]
+       assert_equal true, captured[:use_ssl]
+       assert_instance_of Net::HTTP::Post, request
+       assert_equal "/v1/responses", request.path
+       assert_equal "Bearer token-abc", request["Authorization"]
+       assert_equal "application/json", request["Content-Type"]
+       assert_equal "gpt-5.5", payload["model"]
+       assert_equal "Say hello", payload["input"]
+       assert_equal 2000, payload["max_output_tokens"]
+     end
+   end
+
+   def test_chat_allows_overriding_max_output_tokens
+     client = AiLite.new(api_key: "token-abc")
+
+     with_stubbed_http(success_response("Short")) do |captured, _response|
+       client.chat("Summarize", max_output_tokens: 500)
+       payload = JSON.parse(captured[:http].last_request.body)
+
+       assert_equal 500, payload["max_output_tokens"]
+     end
+   end
+
+   def test_chat_uses_configured_max_output_tokens
+     AiLite.configure do |config|
+       config.api_key = "configured-key"
+       config.max_output_tokens = 750
+     end
+
+     with_stubbed_http(success_response("Configured")) do |captured, _response|
+       AiLite.chat("Use configured defaults")
+       payload = JSON.parse(captured[:http].last_request.body)
+
+       assert_equal 750, payload["max_output_tokens"]
+       assert_equal "Bearer configured-key", captured[:http].last_request["Authorization"]
+     end
+   end
+
+   def test_chat_includes_instructions_model_and_options
+     client = AiLite.new(api_key: "token-abc")
+
+     with_stubbed_http(success_response("Aye")) do |captured, _response|
+       client.chat(
+         "Are semicolons optional in JavaScript?",
+         model: "gpt-5.4-mini",
+         instructions: "Be concise.",
+         options: {
+           reasoning: { effort: "none" },
+           text: { verbosity: "low" },
+           store: false
+         }
+       )
+       payload = JSON.parse(captured[:http].last_request.body)
+
+       assert_equal "gpt-5.4-mini", payload["model"]
+       assert_equal "Be concise.", payload["instructions"]
+       assert_equal({ "effort" => "none" }, payload["reasoning"])
+       assert_equal({ "verbosity" => "low" }, payload["text"])
+       assert_equal false, payload["store"]
+     end
+   end
+
+   def test_extracts_text_from_nested_output_text_items
+     client = AiLite.new(api_key: "token-abc")
+     body = JSON.generate(
+       "output" => [
+         { "type" => "reasoning", "content" => [] },
+         {
+           "type" => "message",
+           "content" => [
+             { "type" => "output_text", "text" => "Hello" },
+             { "type" => "output_text", "text" => " world" }
+           ]
+         }
+       ]
+     )
+
+     with_stubbed_http(FakeResponse.new("200", body)) do |_captured, _response|
+       result = client.chat("Say hello")
+
+       assert_equal "Hello world", result["content"]
+       assert_equal 200, result["status"]
+       assert_nil result["error"]
+     end
+   end
+
+   def test_parses_json_looking_output_with_string_keys
+     client = AiLite.new(api_key: "token-abc")
+
+     with_stubbed_http(success_response(JSON.generate("valid" => true))) do |_captured, _response|
+       result = client.chat("Validate")
+
+       assert_equal({ "valid" => true }, result["content"])
+       assert_equal 200, result["status"]
+       assert_nil result["error"]
+       assert_nil result["raw"]
+     end
+   end
+
+   def test_debug_true_returns_raw_success_response
+     client = AiLite.new(api_key: "token-abc")
+
+     with_stubbed_http(success_response("Hello!")) do |_captured, _response|
+       result = client.chat("Say hello", debug: true)
+
+       assert_equal "Hello!", result["content"]
+       assert_equal 200, result["status"]
+       assert_nil result["error"]
+       assert_equal(
+         [
+           {
+             "type" => "message",
+             "content" => [
+               { "type" => "output_text", "text" => "Hello!" }
+             ]
+           }
+         ],
+         result["raw"]["output"]
+       )
+     end
+   end
+
+   def test_http_errors_return_standard_envelope
+     client = AiLite.new(api_key: "token-abc")
+     body = JSON.generate("error" => { "message" => "Invalid API key" })
+
+     with_stubbed_http(FakeResponse.new("401", body)) do |_captured, _response|
+       result = client.chat("Say hello")
+
+       assert_nil result["content"]
+       assert_equal 401, result["status"]
+       assert_equal "Invalid API key", result["error"]
+       assert_nil result["raw"]
+     end
+   end
+
+   def test_debug_true_returns_raw_http_error_response
+     client = AiLite.new(api_key: "token-abc")
+     body = JSON.generate("error" => { "message" => "Invalid API key" })
+
+     with_stubbed_http(FakeResponse.new("401", body)) do |_captured, _response|
+       result = client.chat("Say hello", debug: true)
+
+       assert_nil result["content"]
+       assert_equal 401, result["status"]
+       assert_equal "Invalid API key", result["error"]
+       assert_equal({ "error" => { "message" => "Invalid API key" } }, result["raw"])
+     end
+   end
+
+   def test_response_parse_errors_return_standard_envelope
+     client = AiLite.new(api_key: "token-abc")
+
+     with_stubbed_http(FakeResponse.new("200", "not-json")) do |_captured, _response|
+       result = client.chat("Say hello")
+
+       assert_nil result["content"]
+       assert_equal 200, result["status"]
+       assert_match(/unexpected token|unexpected character|parse/i, result["error"])
+       assert_nil result["raw"]
+     end
+   end
+
+   def test_debug_true_returns_raw_response_parse_error_body
+     client = AiLite.new(api_key: "token-abc")
+
+     with_stubbed_http(FakeResponse.new("200", "not-json")) do |_captured, _response|
+       result = client.chat("Say hello", debug: true)
+
+       assert_nil result["content"]
+       assert_equal 200, result["status"]
+       assert_match(/unexpected token|unexpected character|parse/i, result["error"])
+       assert_equal "not-json", result["raw"]
+     end
+   end
+
+   def test_network_errors_return_standard_envelope
+     client = AiLite.new(api_key: "token-abc")
+     original_start = Net::HTTP.method(:start)
+
+     Net::HTTP.singleton_class.send(:define_method, :start) do |_host, _port, use_ssl:, &_block|
+       raise IOError, "connection failed"
+     end
+
+     result = client.chat("Say hello")
+
+     assert_nil result["content"]
+     assert_equal "unknown", result["status"]
+     assert_equal "connection failed", result["error"]
+     assert_nil result["raw"]
+   ensure
+     Net::HTTP.singleton_class.send(:define_method, :start, original_start)
+   end
+
+   private
+
+   def success_response(content)
+     body = JSON.generate(
+       "output" => [
+         {
+           "type" => "message",
+           "content" => [
+             { "type" => "output_text", "text" => content }
+           ]
+         }
+       ]
+     )
+     FakeResponse.new("200", body)
+   end
+
+   def with_env(values)
+     originals = {}
+
+     values.each do |key, value|
+       originals[key] = ENV.key?(key) ? ENV[key] : :missing
+       value.nil? ? ENV.delete(key) : ENV[key] = value
+     end
+
+     yield
+   ensure
+     originals.each do |key, value|
+       value == :missing ? ENV.delete(key) : ENV[key] = value
+     end
+   end
+
+   def with_stubbed_http(response)
+     captured = {}
+     original_start = Net::HTTP.method(:start)
+
+     Net::HTTP.singleton_class.send(:define_method, :start) do |host, port, use_ssl:, &block|
+       http = FakeHttp.new(response)
+       captured[:host] = host
+       captured[:port] = port
+       captured[:use_ssl] = use_ssl
+       captured[:http] = http
+       block.call(http)
+     end
+
+     yield captured, response
+   ensure
+     Net::HTTP.singleton_class.send(:define_method, :start, original_start)
+   end
+ end
data/test/test_helper.rb ADDED
@@ -0,0 +1,6 @@
+ # frozen_string_literal: true
+
+ $LOAD_PATH.unshift File.expand_path("../lib", __dir__)
+
+ require "minitest/autorun"
+ require "ai_lite"
metadata ADDED
@@ -0,0 +1,50 @@
+ --- !ruby/object:Gem::Specification
+ name: ai-lite
+ version: !ruby/object:Gem::Version
+   version: 0.1.0
+ platform: ruby
+ authors:
+ - William Basmayor
+ autorequire:
+ bindir: bin
+ cert_chain: []
+ date: 2026-05-13 00:00:00.000000000 Z
+ dependencies: []
+ description: AI Lite is a dependency-light Ruby client for Rails apps and plain Ruby
+   projects that need simple OpenAI Responses API calls.
+ email:
+ executables: []
+ extensions: []
+ extra_rdoc_files: []
+ files:
+ - LICENSE
+ - README.md
+ - lib/ai_lite.rb
+ - lib/ai_lite/version.rb
+ - test/ai_lite_test.rb
+ - test/test_helper.rb
+ homepage: https://github.com/wbasmayor/ai-lite
+ licenses:
+ - MIT
+ metadata: {}
+ post_install_message:
+ rdoc_options: []
+ require_paths:
+ - lib
+ required_ruby_version: !ruby/object:Gem::Requirement
+   requirements:
+   - - ">="
+     - !ruby/object:Gem::Version
+       version: '2.6'
+ required_rubygems_version: !ruby/object:Gem::Requirement
+   requirements:
+   - - ">="
+     - !ruby/object:Gem::Version
+       version: '0'
+ requirements: []
+ rubygems_version: 3.5.22
+ signing_key:
+ specification_version: 4
+ summary: Minimal Ruby client for simple AI chat calls through the OpenAI Responses
+   API.
+ test_files: []