ai-chat 0.1.1 → 0.2.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
-   metadata.gz: c3ac8b911bbb322ec7470c226ed15f524b520ca927ccb50eb53d1beba9579e21
-   data.tar.gz: e15f7f7c3f8f06819357b16e853eb8937eb543c8d8f5079e61a5e2ce3599448e
+   metadata.gz: 1b559dc1098b7391dbca24aea20c9c631ec770eecaf125dc118a1358d62fba39
+   data.tar.gz: 8e5c4b588ed741e7e07d7cbfa975764e7c8df6bb7ebb27ab7d3f1730924bdca7
  SHA512:
-   metadata.gz: e78959fc4366d03d9cbc96e218f77283001af9e9a49b1753c9944040bb8de1fd7c25635cbfb40543a7abead39614cfd68a4ad9d3958907b56023651c44d395e3
-   data.tar.gz: 448d2cdc7892504079edff0028f820bbe384a440e06fc109ac8b4d131e903c196c5ca1f3ff1103615dfeb4b80c3cc6e8b8a08ddbfb5ab044d84c4eb7aa580563
+   metadata.gz: 934e8b03fee2aade7ec67eb122d78c1d271af3681f8a4ac4712f4ec8e1132a36a2cf9291167d05a44fa2d0a7e7c9096e1ce767ccb277de929e5acc783bf1ff52
+   data.tar.gz: d9edb5b4a0a2fb8da9cab3ad380ef21f66a054438c57bcc202769e4b5688a90e9820ff77a73331b4dceccb7f06bbebfa675264856cc1810fb77eb686575373af
data/README.md CHANGED
@@ -1,6 +1,44 @@
- # OpenAI Chat
+ # AI Chat
  
- This gem provides a class called `OpenAI::Chat` that is intended to make it as easy as possible to use OpenAI's cutting-edge generative AI models.
+ This gem provides a class called `AI::Chat` that is intended to make it as easy as possible to use OpenAI's cutting-edge generative AI models.
+ 
+ ## Examples
+ 
+ This gem includes comprehensive example scripts that showcase all features and serve as both documentation and validation tests. To explore the capabilities:
+ 
+ ### Quick Start
+ 
+ ```bash
+ # Run a quick overview of key features (takes ~1 minute)
+ bundle exec ruby examples/01_quick.rb
+ ```
+ 
+ ### Run All Examples
+ 
+ ```bash
+ # Run the complete test suite demonstrating all features
+ bundle exec ruby examples/all.rb
+ ```
+ 
+ ### Individual Feature Examples
+ 
+ The `examples/` directory contains focused examples for specific features:
+ 
+ - `01_quick.rb` - Quick overview of key features
+ - `02_core.rb` - Core functionality (basic chat, messages, responses)
+ - `03_configuration.rb` - Configuration options (API keys, models, reasoning effort)
+ - `04_multimodal.rb` - Basic file and image handling
+ - `05_file_handling_comprehensive.rb` - Advanced file handling (PDFs, text files, Rails uploads)
+ - `06_structured_output.rb` - Basic structured output with schemas
+ - `07_structured_output_comprehensive.rb` - All 6 supported schema formats
+ - `08_advanced_usage.rb` - Advanced patterns (chaining, web search)
+ - `09_edge_cases.rb` - Error handling and edge cases
+ - `10_additional_patterns.rb` - Less common usage patterns (direct add method, web search + schema, etc.)
+ 
+ Each example is self-contained and can be run individually:
+ ```bash
+ bundle exec ruby examples/[filename]
+ ```
  
  ## Installation
  
@@ -9,7 +47,7 @@ This gem provides a class called `OpenAI::Chat` that is intended to make it as e
  Add this line to your application's Gemfile:
  
  ```ruby
- gem "openai-chat", "< 1.0.0"
+ gem "ai-chat", "< 1.0.0"
  ```
  
  And then, at a command prompt:
@@ -23,7 +61,7 @@ bundle install
  Or, install it directly with:
  
  ```
- gem install openai-chat
+ gem install ai-chat
  ```
  
  ## Simplest usage
@@ -31,10 +69,10 @@ gem install openai-chat
  In your Ruby program:
  
  ```ruby
- require "openai/chat"
+ require "ai-chat"
  
- # Create an instance of OpenAI::Chat
- a = OpenAI::Chat.new
+ # Create an instance of AI::Chat
+ a = AI::Chat.new
  
  # Build up your conversation by adding messages
  a.add("If the Ruby community had an official motto, what might it be?")
@@ -50,7 +88,7 @@ a.generate! # => "Matz is nice and so we are nice" (or similar)
  pp a.messages
  # => [
  #   {:role=>"user", :content=>"If the Ruby community had an official motto, what might it be?"},
- #   {:role=>"assistant", :content=>"Matz is nice and so we are nice", :response => #<OpenAI::Chat::Response id=resp_abc... model=gpt-4.1-nano tokens=12>}
+ #   {:role=>"assistant", :content=>"Matz is nice and so we are nice", :response => #<AI::Chat::Response id=resp_abc... model=gpt-4.1-nano tokens=12>}
  # ]
  
  # Continue the conversation
@@ -70,7 +108,7 @@ That's it! You're building something like this:
  [
    {:role => "system", :content => "You are a helpful assistant"},
    {:role => "user", :content => "Hello!"},
-   {:role => "assistant", :content => "Hi there! How can I help you today?", :response => #<OpenAI::Chat::Response id=resp_abc... model=gpt-4.1-nano tokens=12>}
+   {:role => "assistant", :content => "Hi there! How can I help you today?", :response => #<AI::Chat::Response id=resp_abc... model=gpt-4.1-nano tokens=12>}
  ]
  ```
  
@@ -79,9 +117,9 @@ That last bit, under `:response`, is an object that represents the JSON that the
  ## Adding Different Types of Messages
  
  ```ruby
- require "openai/chat"
+ require "ai-chat"
  
- b = OpenAI::Chat.new
+ b = AI::Chat.new
  
  # Add system instructions
  b.add("You are a helpful assistant that talks like Shakespeare.", role: "system")
@@ -105,7 +143,7 @@ b.generate! # => "Methinks 'tis 'Ruby doth bring joy to all who craft with care'
  Instead of always specifying the role, you can use these shortcuts:
  
  ```ruby
- c = OpenAI::Chat.new
+ c = AI::Chat.new
  
  # These are equivalent:
  c.add("You are helpful", role: "system")
@@ -130,7 +168,7 @@ We use the `add` method (and its shortcuts) to build up an array because:
  
  ```ruby
  # In a Rails app, you might do:
- d = OpenAI::Chat.new
+ d = AI::Chat.new
  d.messages = @conversation.messages # Load existing messages
  d.user("What should I do next?") # Add a new question
  d.generate! # Generate a response
@@ -143,7 +181,7 @@ d.generate! # Generate a response
  By default, the gem uses OpenAI's `gpt-4.1-nano` model. If you want to use a different model, you can set it:
  
  ```ruby
- e = OpenAI::Chat.new
+ e = AI::Chat.new
  e.model = "o4-mini"
  ```
  
@@ -167,13 +205,13 @@ The gem by default looks for an environment variable called `OPENAI_API_KEY` and
  You can specify a different environment variable name:
  
  ```ruby
- f = OpenAI::Chat.new(api_key_env_var: "MY_OPENAI_TOKEN")
+ f = AI::Chat.new(api_key_env_var: "MY_OPENAI_TOKEN")
  ```
  
  Or, you can pass an API key in directly:
  
  ```ruby
- g = OpenAI::Chat.new(api_key: "your-api-key-goes-here")
+ g = AI::Chat.new(api_key: "your-api-key-goes-here")
  ```
  
  ## Inspecting Your Conversation
@@ -181,7 +219,7 @@ g = OpenAI::Chat.new(api_key: "your-api-key-goes-here")
  You can call `.messages` to get an array containing the conversation so far:
  
  ```ruby
- h = OpenAI::Chat.new
+ h = AI::Chat.new
  h.system("You are a helpful cooking assistant")
  h.user("How do I boil an egg?")
  h.generate!
@@ -208,7 +246,7 @@ h.last
  Get back Structured Output by setting the `schema` attribute (I suggest using [OpenAI's handy tool for generating the JSON Schema](https://platform.openai.com/docs/guides/structured-outputs)):
  
  ```ruby
- i = OpenAI::Chat.new
+ i = AI::Chat.new
  
  i.system("You are an expert nutritionist. The user will describe a meal. Estimate the calories, carbs, fat, and protein.")
  
@@ -224,10 +262,34 @@ response = i.generate!
  response[:total_calories] # => 285
  ```
  
- You can also provide the equivalent Ruby `Hash` rather than a `String` containing JSON.
+ ### Schema Formats
  
+ The gem supports multiple schema formats to accommodate different preferences and use cases. The gem will automatically wrap your schema in the correct format for the API.
+ 
+ #### 1. Full Schema with `format` Key (Most Explicit)
+ ```ruby
+ # When you need complete control over the schema structure
+ i.schema = {
+   format: {
+     type: :json_schema,
+     name: "nutrition_values",
+     strict: true,
+     schema: {
+       type: "object",
+       properties: {
+         fat: { type: "number", description: "The amount of fat in grams." },
+         protein: { type: "number", description: "The amount of protein in grams." }
+       },
+       required: ["fat", "protein"],
+       additionalProperties: false
+     }
+   }
+ }
+ ```
+ 
+ #### 2. Schema with `name`, `strict`, and `schema` Keys
  ```ruby
- # Equivalent to assigning the String above
+ # The format shown in OpenAI's documentation
  i.schema = {
    name: "nutrition_values",
    strict: true,
@@ -235,25 +297,55 @@ i.schema = {
      type: "object",
      properties: {
        fat: { type: "number", description: "The amount of fat in grams." },
-       protein: { type: "number", description: "The amount of protein in grams." },
-       carbs: { type: "number", description: "The amount of carbohydrates in grams." },
-       total_calories: { type: "number", description: "The total calories calculated based on fat, protein, and carbohydrates." }
+       protein: { type: "number", description: "The amount of protein in grams." }
      },
-     required: [:fat, :protein, :carbs, :total_calories],
+     required: [:fat, :protein],
      additionalProperties: false
    }
  }
  ```
  
- The keys can be `String`s or `Symbol`s.
+ #### 3. Simple JSON Schema Object
+ ```ruby
+ # The simplest format - just provide the schema itself
+ # The gem will wrap it with sensible defaults (name: "response", strict: true)
+ i.schema = {
+   type: "object",
+   properties: {
+     fat: { type: "number", description: "The amount of fat in grams." },
+     protein: { type: "number", description: "The amount of protein in grams." }
+   },
+   required: ["fat", "protein"],
+   additionalProperties: false
+ }
+ ```
+ 
+ #### 4. JSON String Formats
+ All the above formats also work as JSON strings:
+ 
+ ```ruby
+ # As a JSON string with full format
+ i.schema = '{"format":{"type":"json_schema","name":"nutrition_values","strict":true,"schema":{...}}}'
+ 
+ # As a JSON string with name/strict/schema
+ i.schema = '{"name":"nutrition_values","strict":true,"schema":{...}}'
+ 
+ # As a simple JSON schema string
+ i.schema = '{"type":"object","properties":{...}}'
+ ```
+ 
+ ### Schema Notes
+ 
+ - The keys can be `String`s or `Symbol`s.
+ - The gem automatically converts your schema to the format expected by the API.
+ - When a schema is set, `generate!` returns a parsed Ruby Hash with symbolized keys, not a String.
  
  ## Including Images
  
  You can include images in your chat messages using the `user` method with the `image` or `images` parameter:
  
  ```ruby
- j = OpenAI::Chat.new
+ j = AI::Chat.new
  
  # Send a single image
  j.user("What's in this image?", image: "path/to/local/image.jpg")
@@ -277,12 +369,55 @@ The gem supports three types of image inputs:
  - **File paths**: Pass a string with a path to a local image file
  - **File-like objects**: Pass an object that responds to `read` (like `File.open("image.jpg")` or Rails uploaded files)
  
+ ## Including Files
+ 
+ You can include files (PDFs, text files, etc.) in your messages using the `file` or `files` parameter:
+ 
+ ```ruby
+ k = AI::Chat.new
+ 
+ # Send a single file
+ k.user("Summarize this document", file: "report.pdf")
+ k.generate!
+ 
+ # Send multiple files
+ k.user("Compare these documents", files: ["doc1.pdf", "doc2.txt", "data.json"])
+ k.generate!
+ ```
+ 
+ Files are handled intelligently based on their type:
+ - **PDFs**: Sent as file attachments for the model to analyze
+ - **Text files**: Content is automatically extracted and sent as text
+ - **Other formats**: The gem attempts to read them as text if possible
+ 
+ ## Mixed Content (Images + Files)
+ 
+ You can send images and files together in a single message:
+ 
+ ```ruby
+ l = AI::Chat.new
+ 
+ # Mix image and file in one message
+ l.user("Compare this photo with the document",
+   image: "photo.jpg",
+   file: "document.pdf")
+ l.generate!
+ 
+ # Mix multiple images and files
+ l.user("Analyze all these materials",
+   images: ["chart1.png", "chart2.png"],
+   files: ["report.pdf", "data.csv"])
+ l.generate!
+ ```
+ 
+ **Note**: Images should use `image:`/`images:` parameters, while documents should use `file:`/`files:` parameters.
+ 
  ## Web Search
  
  To give the model access to real-time information from the internet, you can enable the `web_search` feature. This uses OpenAI's built-in `web_search_preview` tool.
  
  ```ruby
- m = OpenAI::Chat.new
+ m = AI::Chat.new
  m.web_search = true
  m.user("What are the latest developments in the Ruby language?")
  m.generate! # This may use web search to find current information
@@ -296,7 +431,7 @@ You can manually add assistant messages without making API calls, which is usefu
  
  ```ruby
  # Create a new chat instance
- k = OpenAI::Chat.new
+ k = AI::Chat.new
  
  # Add previous messages
  k.system("You are a helpful assistant who provides information about planets.")
@@ -316,14 +451,14 @@ response = k.generate!
  puts response
  ```
  
- With this, you can loop through any conversation's history (perhaps after retrieving it from your database), recreate an `OpenAI::Chat`, and then continue it.
+ With this, you can loop through any conversation's history (perhaps after retrieving it from your database), recreate an `AI::Chat`, and then continue it.
  
  ## Reasoning Models
  
  When using reasoning models like `o3` or `o4-mini`, you can specify a reasoning effort level to control how much reasoning the model does before producing its final response:
  
  ```ruby
- l = OpenAI::Chat.new
+ l = AI::Chat.new
  l.model = "o3-mini"
  l.reasoning_effort = "medium" # Can be "low", "medium", or "high"
  
@@ -343,7 +478,7 @@ Setting to `nil` disables the reasoning parameter.
  When you call `generate!`, the gem stores additional information about the API response:
  
  ```ruby
- t = OpenAI::Chat.new
+ t = AI::Chat.new
  t.user("Hello!")
  t.generate!
  
@@ -352,19 +487,14 @@ pp t.messages.last
  # => {
  #   :role => "assistant",
  #   :content => "Hello! How can I help you today?",
- #   :response => #<OpenAI::Chat::Response id=resp_abc... model=gpt-4.1-nano tokens=12>
+ #   :response => #<AI::Response id=resp_abc... model=gpt-4.1-nano tokens=12>
  # }
  
  # Access detailed information
- response = t.last_response
+ response = t.last[:response]
  response.id # => "resp_abc123..."
  response.model # => "gpt-4.1-nano"
  response.usage # => {:prompt_tokens=>5, :completion_tokens=>7, :total_tokens=>12}
- 
- # Helper methods
- t.last_response_id # => "resp_abc123..."
- t.last_usage # => {:prompt_tokens=>5, :completion_tokens=>7, :total_tokens=>12}
- t.total_tokens # => 12
  ```
  
@@ -373,20 +503,20 @@ This information is useful for:
  - Understanding which model was actually used.
  - Future features like cost tracking.
  
- You can also, if you know a response ID, pick up an old conversation at that point in time:
+ You can also, if you know a response ID, continue an old conversation by setting the `previous_response_id`:
  
  ```ruby
- t = OpenAI::Chat.new
+ t = AI::Chat.new
  t.user("Hello!")
  t.generate!
- old_id = t.last_response_id # => "resp_abc123..."
+ old_id = t.last[:response].id # => "resp_abc123..."
  
  # Some time in the future...
  
- u = OpenAI::Chat.new
- u.pick_up_from("resp_abc123...")
- u.messages # => [
- #   {:role=>"assistant", :response => #<OpenAI::Chat::Response id=resp_abc...}
- # ]
+ u = AI::Chat.new
+ u.previous_response_id = "resp_abc123..."
+ u.user("What did I just say?")
+ u.generate! # Will have context from the previous conversation
  u.user("What should we do next?")
  u.generate!
@@ -400,7 +530,7 @@ You can use `.messages=()` to assign an `Array` of `Hashes`. Each `Hash` must ha
  
  ```ruby
  # Using the planet example with array of hashes
- p = OpenAI::Chat.new
+ p = AI::Chat.new
  
  # Set all messages at once instead of calling methods sequentially
  p.messages = [
423
553
 
424
554
  ```ruby
425
555
  # Create a new chat instance
426
- q = OpenAI::Chat.new
556
+ q = AI::Chat.new
427
557
 
428
558
  # With images
429
559
  q.messages = [
@@ -446,7 +576,7 @@ If your chat history is contained in an `ActiveRecord::Relation`, you can assign
446
576
  # Load from ActiveRecord
447
577
  @thread = Thread.find(42)
448
578
 
449
- r = OpenAI::Chat.new
579
+ r = AI::Chat.new
450
580
  r.messages = @thread.posts.order(:created_at)
451
581
  r.user("What should we discuss next?")
452
582
  r.generate! # Creates a new post record, too
@@ -466,7 +596,7 @@ In order for the above to "magically" work, there are a few requirements. Your A
466
596
  If your columns have different names:
467
597
 
468
598
  ```ruby
469
- s = OpenAI::Chat.new
599
+ s = AI::Chat.new
470
600
  s.configure_message_attributes(
471
601
  role: :message_type, # Your column for role
472
602
  content: :message_body, # Your column for content
@@ -485,13 +615,13 @@ add_column :messages, :openai_response, :text
485
615
 
486
616
  # In your model
487
617
  class Message < ApplicationRecord
488
- serialize :openai_response, OpenAI::Chat::Response
618
+ serialize :openai_response, AI::Chat::Response
489
619
  end
490
620
 
491
621
  # Usage
492
622
  @thread = Thread.find(42)
493
623
 
494
- t = OpenAI::Chat.new
624
+ t = AI::Chat.new
495
625
  t.posts = @thread.messages
496
626
  t.user("Hello!")
497
627
  t.generate!
@@ -511,13 +641,12 @@ last_message.openai_response.usage # => {:prompt_tokens=>10, ...}
511
641
 
512
642
  While this gem includes specs, they use mocked API responses. To test with real API calls:
513
643
 
514
- 1. Navigate to the test program directory: `cd demo`
515
- 2. Create a `.env` file in the test_program directory with your API credentials:
644
+ 1. Create a `.env` file at the project root with your API credentials:
516
645
  ```
517
646
  # Your OpenAI API key
518
647
  OPENAI_API_KEY=your_openai_api_key_here
519
648
  ```
520
- 3. Install dependencies: `bundle install`
521
- 4. Run the test program: `ruby demo.rb`
649
+ 2. Install dependencies: `bundle install`
650
+ 3. Run the examples: `bundle exec ruby examples/all.rb`
522
651
 
523
652
  This test program runs through all the major features of the gem, making real API calls to OpenAI.
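The file-handling behavior described in the README changes above sends PDFs as base64 `data:` URIs. That encoding can be sketched standalone with only Ruby's stdlib `Base64` (a simplified sketch; `encode_as_data_uri` is the helper name that appears later in this diff, reimplemented here for illustration):

```ruby
require "base64"

# Simplified sketch of the data-URI encoding the gem's file handling relies on:
# the MIME type and base64-encoded bytes are joined into a "data:" URI.
def encode_as_data_uri(data, mime_type)
  "data:#{mime_type};base64,#{Base64.strict_encode64(data)}"
end

uri = encode_as_data_uri("%PDF-1.4 example", "application/pdf")
uri.start_with?("data:application/pdf;base64,") # => true
```

Decoding the portion after the comma with `Base64.strict_decode64` recovers the original bytes, which is why this format is safe for binary attachments inside JSON payloads.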
data/ai-chat.gemspec CHANGED
@@ -2,7 +2,7 @@
  
  Gem::Specification.new do |spec|
    spec.name = "ai-chat"
-   spec.version = "0.1.1"
+   spec.version = "0.2.0"
    spec.authors = ["Raghu Betina"]
    spec.email = ["raghu@firstdraft.com"]
    spec.homepage = "https://github.com/firstdraft/ai-chat"
@@ -18,12 +18,11 @@ Gem::Specification.new do |spec|
      "source_code_uri" => "https://github.com/firstdraft/ai-chat"
    }
  
- 
    spec.required_ruby_version = "~> 3.2"
-   spec.add_dependency "zeitwerk", "~> 2.7"
-   spec.add_dependency "openai", "~> 0.14"
-   spec.add_runtime_dependency "mime-types", "~> 3.0"
-   spec.add_runtime_dependency "base64", "~> 0.1" # Works for all Ruby versions
+   spec.add_runtime_dependency "openai", "~> 0.16"
+   spec.add_runtime_dependency "marcel", "~> 1.0"
+   spec.add_runtime_dependency "base64", "> 0.1.1"
+   spec.add_runtime_dependency "json", "~> 2.0"
  
    spec.add_development_dependency "dotenv"
    spec.add_development_dependency "refinements", "~> 11.1"
data/lib/ai/chat.rb CHANGED
@@ -1,31 +1,38 @@
  # frozen_string_literal: true
  
  require "base64"
- require "mime-types"
+ require "json"
+ require "marcel"
  require "openai"
+ require "pathname"
+ require "stringio"
  
  require_relative "response"
  
  module AI
-   # Main namespace.
+   # :reek:MissingSafeMethod { exclude: [ generate! ] }
+   # :reek:TooManyMethods
+   # :reek:TooManyInstanceVariables
+   # :reek:InstanceVariableAssumption
+   # :reek:IrresponsibleModule
    class Chat
-     def self.loader(registry = Zeitwerk::Registry)
-       @loader ||= registry.loaders.each.find { |loader| loader.tag == "ai-chat" }
-     end
- 
-     attr_accessor :messages, :schema, :model, :web_search
-     attr_reader :reasoning_effort, :client
+     # :reek:Attribute
+     attr_accessor :messages, :model, :web_search, :previous_response_id
+     attr_reader :reasoning_effort, :client, :schema
  
      VALID_REASONING_EFFORTS = [:low, :medium, :high].freeze
  
      def initialize(api_key: nil, api_key_env_var: "OPENAI_API_KEY")
-       @api_key = api_key || ENV.fetch(api_key_env_var)
+       api_key ||= ENV.fetch(api_key_env_var)
        @messages = []
        @reasoning_effort = nil
        @model = "gpt-4.1-nano"
-       @client = OpenAI::Client.new(api_key: @api_key)
+       @client = OpenAI::Client.new(api_key: api_key)
+       @previous_response_id = nil
      end
  
+     # :reek:TooManyStatements
+     # :reek:NilCheck
      def add(content, role: "user", response: nil, image: nil, images: nil, file: nil, files: nil)
        if image.nil? && images.nil? && file.nil? && files.nil?
          messages.push(
@@ -44,41 +51,25 @@ module AI
          }
        ]
  
-       if images && !images.empty?
-         images_array = images.map do |image|
-           {
-             type: "input_image",
-             image_url: process_file(image)
-           }
-         end
+       all_images = []
+       all_images << image if image
+       all_images.concat(Array(images)) if images
  
-         text_and_files_array += images_array
-       elsif image
+       all_images.each do |img|
          text_and_files_array.push(
            {
              type: "input_image",
-             image_url: process_file(image)
+             image_url: process_image_input(img)
            }
          )
-       elsif files && !files.empty?
-         files_array = files.map do |file|
-           {
-             type: "input_file",
-             filename: "test",
-             file_data: process_file(file)
-           }
-         end
+       end
  
-         text_and_files_array += files_array
-       else
-         text_and_files_array.push(
-           {
-             type: "input_file",
-             filename: "test",
-             file_data: process_file(file)
-           }
-         )
+       all_files = []
+       all_files << file if file
+       all_files.concat(Array(files)) if files
  
+       all_files.each do |file|
+         text_and_files_array.push(process_file_input(file))
        end
  
        messages.push(
@@ -97,69 +88,60 @@ module AI
      def user(message, image: nil, images: nil, file: nil, files: nil)
        add(message, role: "user", image: image, images: images, file: file, files: files)
      end
  
      def assistant(message, response: nil)
        add(message, role: "assistant", response: response)
      end
  
+     # :reek:NilCheck
+     # :reek:TooManyStatements
      def generate!
        response = create_response
  
-       if web_search
-         message = response.output.last.content.first.text
-         chat_response = Response.new(response)
-         assistant(message, response: chat_response)
-       elsif schema
-         # filtering out refusals...
-         json_response = response.output.flat_map { _1.content }.select { _1.is_a?(OpenAI::Models::Responses::ResponseOutputText) }.first.text
-         chat_response = Response.new(response)
-         message = JSON.parse(json_response, symbolize_names: true)
-         assistant(message, response: chat_response)
+       chat_response = Response.new(response)
+ 
+       text_response = extract_text_from_response(response)
+ 
+       message = if schema
+         if text_response.nil? || text_response.empty?
+           raise ArgumentError, "No text content in response to parse as JSON for schema: #{schema.inspect}"
+         end
+         JSON.parse(text_response, symbolize_names: true)
        else
-         message = response.output.last.content.first.text
-         chat_response = Response.new(response)
-         assistant(message, response: chat_response)
+         text_response
        end
  
-       message
-     end
- 
-     def pick_up_from(response_id)
-       response = client.responses.retrieve(response_id)
-       chat_response = Response.new(response)
-       message = response.output.flat_map { _1.content }.select { _1.is_a?(OpenAI::Models::Responses::ResponseOutputText) }.first.text
        assistant(message, response: chat_response)
+ 
+       self.previous_response_id = response.id
+ 
        message
      end
  
+     # :reek:NilCheck
+     # :reek:TooManyStatements
      def reasoning_effort=(value)
        if value.nil?
          @reasoning_effort = nil
-       else
-         # Convert string to symbol if needed
-         symbol_value = value.is_a?(String) ? value.to_sym : value
+         return
+       end
  
-         if VALID_REASONING_EFFORTS.include?(symbol_value)
-           @reasoning_effort = symbol_value
-         else
-           valid_values = VALID_REASONING_EFFORTS.map { |v| ":#{v} or \"#{v}\"" }.join(", ")
-           raise ArgumentError, "Invalid reasoning_effort value: '#{value}'. Must be one of: #{valid_values}"
-         end
+       normalized_value = value.to_sym
+ 
+       if VALID_REASONING_EFFORTS.include?(normalized_value)
+         @reasoning_effort = normalized_value
+       else
+         valid_values = VALID_REASONING_EFFORTS.map { |valid_value| ":#{valid_value} or \"#{valid_value}\"" }.join(", ")
+         raise ArgumentError, "Invalid reasoning_effort value: '#{value}'. Must be one of: #{valid_values}"
        end
      end
  
      def schema=(value)
        if value.is_a?(String)
-         @schema = JSON.parse(value, symbolize_names: true)
-         unless @schema.key?(:format) || @schema.key?("format")
-           @schema = { format: @schema }
-         end
+         parsed = JSON.parse(value, symbolize_names: true)
+         @schema = wrap_schema_if_needed(parsed)
        elsif value.is_a?(Hash)
-         if value.key?(:format) || value.key?("format")
-           @schema = value
-         else
-           @schema = { format: value }
-         end
+         @schema = wrap_schema_if_needed(value)
        else
          raise ArgumentError, "Invalid schema value: '#{value}'. Must be a String containing JSON or a Hash."
        end
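The new `schema=` writer delegates to a private `wrap_schema_if_needed` helper whose body does not appear in this hunk. Based on the schema formats documented in the README changes above, its rule can be sketched roughly like this (an assumption about its behavior, not the gem's actual code):

```ruby
# Rough sketch of the assumed wrapping rule: a hash that already has :format
# passes through, a name/strict/schema hash is nested under :format, and a
# bare JSON Schema is wrapped with the documented defaults.
def wrap_schema_if_needed(schema)
  if schema.key?(:format)
    schema
  elsif schema.key?(:schema)
    {format: {type: :json_schema}.merge(schema)}
  else
    {format: {type: :json_schema, name: "response", strict: true, schema: schema}}
  end
end

bare = {type: "object", properties: {fat: {type: "number"}}, required: [:fat], additionalProperties: false}
wrapped = wrap_schema_if_needed(bare)
wrapped[:format][:schema] == bare # => true
wrap_schema_if_needed(wrapped) == wrapped # => true (idempotent)
```

Whatever the exact implementation, normalizing every accepted input to the same `{format: {...}}` shape is what lets `create_response` pass `schema` straight through as the `text` parameter.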
@@ -169,50 +151,70 @@ module AI
        messages.last
      end
  
-     def last_response
-       last[:response]
-     end
- 
-     def last_response_id
-       last_response&.id
-     end
- 
      def inspect
        "#<#{self.class.name} @messages=#{messages.inspect} @model=#{@model.inspect} @schema=#{@schema.inspect} @reasoning_effort=#{@reasoning_effort.inspect}>"
      end
  
      private
  
-     # Custom exception class for input classification errors.
      class InputClassificationError < StandardError; end
  
+     # :reek:FeatureEnvy
+     # :reek:ManualDispatch
+     def extract_filename(obj)
+       if obj.respond_to?(:original_filename)
+         obj.original_filename
+       elsif obj.respond_to?(:path)
+         File.basename(obj.path)
+       else
+         raise InputClassificationError,
+           "Unable to determine filename from file object. File objects must respond to :original_filename or :path"
+       end
+     end
+ 
+     # :reek:TooManyStatements
      def create_response
        parameters = {
-         model: model,
-         input: strip_responses(messages),
-         tools: tools,
-         text: schema,
-         reasoning: {
-           effort: reasoning_effort
-         }.compact
-       }.compact
-       parameters = parameters.delete_if { |k, v| v.empty? }
+         model: model
+       }
+ 
+       parameters[:tools] = tools unless tools.empty?
+       parameters[:text] = schema if schema
+       parameters[:reasoning] = {effort: reasoning_effort} if reasoning_effort
+       parameters[:previous_response_id] = previous_response_id if previous_response_id
+ 
+       messages_to_send = prepare_messages_for_api
+       parameters[:input] = strip_responses(messages_to_send) unless messages_to_send.empty?
+ 
        client.responses.create(**parameters)
      end
  
+     def prepare_messages_for_api
+       return messages unless previous_response_id
+ 
+       previous_response_index = messages.find_index { |message| message[:response]&.id == previous_response_id }
+ 
+       if previous_response_index
+         messages[(previous_response_index + 1)..] || []
+       else
+         messages
+       end
+     end
+ 
+     # :reek:DuplicateMethodCall
+     # :reek:FeatureEnvy
+     # :reek:ManualDispatch
+     # :reek:TooManyStatements
      def classify_obj(obj)
        if obj.is_a?(String)
-         # Attempt to parse as a URL.
          begin
            uri = URI.parse(obj)
            if uri.is_a?(URI::HTTP) || uri.is_a?(URI::HTTPS)
              return :url
            end
          rescue URI::InvalidURIError
-           # Not a valid URL; continue to check if it's a file path.
          end
  
-         # Check if the string represents a local file path (must exist on disk).
          if File.exist?(obj)
            :file_path
          else
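The `prepare_messages_for_api` method added in the hunk above trims the local history when `previous_response_id` is set, since the server already holds the earlier turns. Its slicing rule can be exercised in isolation (a simplified sketch; `StubResponse` is a stand-in for the gem's `Response` object, of which only the id matters here):

```ruby
# Stand-in for AI::Chat::Response; only the id is needed for slicing.
StubResponse = Struct.new(:id)

# Mirrors prepare_messages_for_api: send only the messages added after the
# one whose stored response matches previous_response_id; if no match is
# found (or no id is set), fall back to sending the full history.
def slice_after_response(messages, previous_response_id)
  return messages unless previous_response_id
  index = messages.find_index { |message| message[:response]&.id == previous_response_id }
  index ? (messages[(index + 1)..] || []) : messages
end

history = [
  {role: "user", content: "Hello!"},
  {role: "assistant", content: "Hi!", response: StubResponse.new("resp_1")},
  {role: "user", content: "What did I just say?"}
]
slice_after_response(history, "resp_1") # => only the final user message
```

The `|| []` guard covers the case where the matched response is the last message, so the API receives an empty input list rather than `nil`.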
@@ -220,7 +222,6 @@ module AI
            "String provided is neither a valid URL (must start with http:// or https://) nor an existing file path on disk. Received value: #{obj.inspect}"
        end
      elsif obj.respond_to?(:read)
-       # For non-String objects, check if it behaves like a file.
        :file_like
      else
        raise InputClassificationError,
@@ -228,57 +229,143 @@ module AI
        end
      end
  
-     def process_file(obj)
+     # :reek:DuplicateMethodCall
+     # :reek:ManualDispatch
+     # :reek:TooManyStatements
+     def process_file_input(obj)
        case classify_obj(obj)
        when :url
-         obj
+         {
+           type: "input_file",
+           file_url: obj
+         }
        when :file_path
-         file_path = obj
+         mime_type = Marcel::MimeType.for(Pathname.new(obj))
  
-         mime_type = MIME::Types.type_for(file_path).first.to_s
+         if mime_type == "application/pdf"
+           pdf_data = File.binread(obj)
+           {
+             type: "input_file",
+             filename: File.basename(obj),
+             file_data: encode_as_data_uri(pdf_data, mime_type)
+           }
+         else
+           begin
+             content = File.read(obj, encoding: "UTF-8")
+             # Verify the content can be encoded as JSON (will raise if not)
+             JSON.generate({text: content})
+             {
+               type: "input_text",
+               text: content
+             }
+           rescue Encoding::InvalidByteSequenceError, Encoding::UndefinedConversionError, JSON::GeneratorError
+             raise InputClassificationError,
+               "Unable to read #{File.basename(obj)} as text. Only PDF and text files are supported."
+           end
+         end
+       when :file_like
+         filename = extract_filename(obj)
  
-         image_data = File.binread(file_path)
+         content = obj.read
+         obj.rewind if obj.respond_to?(:rewind)
  
-         base64_string = Base64.strict_encode64(image_data)
+         mime_type = Marcel::MimeType.for(StringIO.new(content), name: filename)
  
-         "data:#{mime_type};base64,#{base64_string}"
-       when :file_like
-         filename = if obj.respond_to?(:path)
-           obj.path
-         elsif obj.respond_to?(:original_filename)
-           obj.original_filename
+         if mime_type == "application/pdf"
+           {
+             type: "input_file",
+             filename: filename,
+             file_data: encode_as_data_uri(content, mime_type)
+           }
          else
-           "unknown"
+           begin
+             text_content = content.force_encoding("UTF-8")
+             JSON.generate({text: text_content})
+             {
+               type: "input_text",
+               text: text_content
+             }
+           rescue Encoding::InvalidByteSequenceError, Encoding::UndefinedConversionError, JSON::GeneratorError
+             raise InputClassificationError,
+               "Unable to read #{filename} as text. Only PDF and text files are supported."
+           end
          end
+       end
+     end
  
-       mime_type = MIME::Types.type_for(filename).first.to_s
-       mime_type = "image/jpeg" if mime_type.empty?
- 
+     # :reek:ManualDispatch
+     # :reek:TooManyStatements
+     def process_image_input(obj)
+       case classify_obj(obj)
+       when :url
+         obj
+       when :file_path
+         mime_type = Marcel::MimeType.for(Pathname.new(obj))
304
+ image_data = File.binread(obj)
305
+ encode_as_data_uri(image_data, mime_type)
306
+ when :file_like
307
+ filename = extract_filename(obj)
257
308
  file_data = obj.read
258
309
  obj.rewind if obj.respond_to?(:rewind)
259
-
260
- base64_string = Base64.strict_encode64(file_data)
261
-
262
- "data:#{mime_type};base64,#{base64_string}"
310
+ mime_type = Marcel::MimeType.for(StringIO.new(file_data), name: filename)
311
+ encode_as_data_uri(file_data, mime_type)
263
312
  end
264
313
  end
265
314
 
315
+ # :reek:UtilityFunction
316
+ def encode_as_data_uri(data, mime_type)
317
+ "data:#{mime_type};base64,#{Base64.strict_encode64(data)}"
318
+ end
319
+
320
+ # :reek:DuplicateMethodCall
321
+ # :reek:UtilityFunction
266
322
  def strip_responses(messages)
267
- messages.each do |message|
268
- message.delete(:response) if message.key?(:response)
269
- message[:content] = JSON.generate(message[:content]) if message[:content].is_a?(Hash)
323
+ messages.map do |message|
324
+ stripped = message.dup
325
+ stripped.delete(:response)
326
+ stripped[:content] = JSON.generate(stripped[:content]) if stripped[:content].is_a?(Hash)
327
+ stripped
270
328
  end
271
329
  end
272
330
 
273
331
  def tools
274
332
  tools_list = []
275
333
  if web_search
276
- tools_list << { type: "web_search_preview" }
334
+ tools_list << {type: "web_search_preview"}
277
335
  end
336
+ tools_list
278
337
  end
279
338
 
280
- def extract_message(response)
281
- response.output.flat_map { _1.content }.select { _1.is_a?(OpenAI::Models::Responses::ResponseOutputText)}.first.text
339
+ # :reek:UtilityFunction
340
+ # :reek:ManualDispatch
341
+ def extract_text_from_response(response)
342
+ response.output.flat_map { |output|
343
+ output.respond_to?(:content) ? output.content : []
344
+ }.compact.find { |content|
345
+ content.is_a?(OpenAI::Models::Responses::ResponseOutputText)
346
+ }&.text
347
+ end
348
+
349
+ # :reek:UtilityFunction
350
+ def wrap_schema_if_needed(schema)
351
+ if schema.key?(:format) || schema.key?("format")
352
+ schema
353
+ elsif (schema.key?(:name) || schema.key?("name")) &&
354
+ (schema.key?(:schema) || schema.key?("schema")) &&
355
+ (schema.key?(:strict) || schema.key?("strict"))
356
+ {
357
+ format: schema.merge(type: :json_schema)
358
+ }
359
+ else
360
+ {
361
+ format: {
362
+ type: :json_schema,
363
+ name: "response",
364
+ schema: schema,
365
+ strict: true
366
+ }
367
+ }
368
+ end
282
369
  end
283
370
  end
284
371
  end
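
The `wrap_schema_if_needed` helper added in this diff normalizes whatever schema the caller supplies into the `format` envelope the Responses API expects. A minimal standalone sketch of that logic (extracted here for illustration; in the gem it is a private instance method):

```ruby
# Sketch of the schema-normalization behavior added in 0.2.0:
# - an already-wrapped value (has :format) passes through unchanged;
# - a {name:, schema:, strict:} hash gains type: :json_schema and is wrapped;
# - a bare JSON Schema is wrapped with default name "response" and strict: true.
def wrap_schema_if_needed(schema)
  if schema.key?(:format) || schema.key?("format")
    schema
  elsif (schema.key?(:name) || schema.key?("name")) &&
      (schema.key?(:schema) || schema.key?("schema")) &&
      (schema.key?(:strict) || schema.key?("strict"))
    {format: schema.merge(type: :json_schema)}
  else
    {
      format: {
        type: :json_schema,
        name: "response",
        schema: schema,
        strict: true
      }
    }
  end
end

bare = {type: "object", properties: {answer: {type: "string"}}}
wrapped = wrap_schema_if_needed(bare)
# wrapped[:format] now carries type: :json_schema, name: "response", strict: true
```

This lets beginners pass a plain JSON Schema hash without knowing the API's envelope shape, while power users who supply a full `format` hash are not second-guessed.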
data/lib/ai/response.rb CHANGED
@@ -1,4 +1,5 @@
 module AI
+  # :reek:IrresponsibleModule
   class Response
     attr_reader :id, :model, :usage, :total_tokens
 
@@ -9,4 +10,4 @@ module AI
       @total_tokens = @usage[:total_tokens]
     end
   end
-end
+end
data/lib/ai-chat.rb CHANGED
@@ -1 +1 @@
-require_relative "ai/chat"
+require_relative "ai/chat"
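
The `strip_responses` rewrite in the chat.rb hunk earlier in this diff changes it from mutating the caller's messages in place (`each` + `delete`) to returning stripped copies (`map` + `dup`). A standalone sketch of the new behavior, with a hypothetical message array for illustration:

```ruby
require "json"

# Non-mutating strip_responses, as rewritten in 0.2.0: returns new hashes with
# :response removed and Hash content JSON-encoded, leaving the input untouched.
def strip_responses(messages)
  messages.map do |message|
    stripped = message.dup
    stripped.delete(:response)
    stripped[:content] = JSON.generate(stripped[:content]) if stripped[:content].is_a?(Hash)
    stripped
  end
end

messages = [
  {role: "user", content: {text: "hi"}, response: :fake_response_object}
]
stripped = strip_responses(messages)
# stripped.first has no :response key; messages.first still does
```

Returning copies means the chat's own history keeps its `AI::Response` objects (needed for `previous_response_id` lookups) while only the API payload is sanitized.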
metadata CHANGED
@@ -1,71 +1,71 @@
 --- !ruby/object:Gem::Specification
 name: ai-chat
 version: !ruby/object:Gem::Version
-  version: 0.1.1
+  version: 0.2.0
 platform: ruby
 authors:
 - Raghu Betina
-autorequire:
+autorequire:
 bindir: bin
 cert_chain: []
-date: 2025-07-29 00:00:00.000000000 Z
+date: 2025-08-05 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
-  name: zeitwerk
+  name: openai
   requirement: !ruby/object:Gem::Requirement
     requirements:
     - - "~>"
       - !ruby/object:Gem::Version
-        version: '2.7'
+        version: '0.16'
   type: :runtime
   prerelease: false
   version_requirements: !ruby/object:Gem::Requirement
     requirements:
    - - "~>"
       - !ruby/object:Gem::Version
-        version: '2.7'
+        version: '0.16'
 - !ruby/object:Gem::Dependency
-  name: openai
+  name: marcel
   requirement: !ruby/object:Gem::Requirement
     requirements:
     - - "~>"
       - !ruby/object:Gem::Version
-        version: '0.14'
+        version: '1.0'
   type: :runtime
   prerelease: false
   version_requirements: !ruby/object:Gem::Requirement
     requirements:
     - - "~>"
       - !ruby/object:Gem::Version
-        version: '0.14'
+        version: '1.0'
 - !ruby/object:Gem::Dependency
-  name: mime-types
+  name: base64
   requirement: !ruby/object:Gem::Requirement
     requirements:
-    - - "~>"
+    - - ">"
       - !ruby/object:Gem::Version
-        version: '3.0'
+        version: 0.1.1
   type: :runtime
   prerelease: false
   version_requirements: !ruby/object:Gem::Requirement
     requirements:
-    - - "~>"
+    - - ">"
       - !ruby/object:Gem::Version
-        version: '3.0'
+        version: 0.1.1
 - !ruby/object:Gem::Dependency
-  name: base64
+  name: json
   requirement: !ruby/object:Gem::Requirement
     requirements:
     - - "~>"
       - !ruby/object:Gem::Version
-        version: '0.1'
+        version: '2.0'
   type: :runtime
   prerelease: false
   version_requirements: !ruby/object:Gem::Requirement
     requirements:
     - - "~>"
       - !ruby/object:Gem::Version
-        version: '0.1'
+        version: '2.0'
 - !ruby/object:Gem::Dependency
   name: dotenv
   requirement: !ruby/object:Gem::Requirement
@@ -94,7 +94,7 @@ dependencies:
     - - "~>"
       - !ruby/object:Gem::Version
         version: '11.1'
-description:
+description:
 email:
 - raghu@firstdraft.com
 executables: []
@@ -109,7 +109,6 @@ files:
 - lib/ai-chat.rb
 - lib/ai/chat.rb
 - lib/ai/response.rb
-- lib/ai_chat.rb
 homepage: https://github.com/firstdraft/ai-chat
 licenses:
 - MIT
@@ -120,7 +119,7 @@ metadata:
   label: AI Chat
   rubygems_mfa_required: 'true'
   source_code_uri: https://github.com/firstdraft/ai-chat
-post_install_message:
+post_install_message:
 rdoc_options: []
 require_paths:
 - lib
@@ -135,8 +134,8 @@ required_rubygems_version: !ruby/object:Gem::Requirement
     - !ruby/object:Gem::Version
       version: '0'
 requirements: []
-rubygems_version: 3.5.23
-signing_key:
+rubygems_version: 3.4.6
+signing_key:
 specification_version: 4
 summary: A beginner-friendly Ruby interface for OpenAI's API
 test_files: []
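
The dependency changes above (marcel replacing mime-types, base64 and json pinned explicitly) back the new `encode_as_data_uri` helper from the chat.rb diff, which turns raw file bytes into the data-URI strings the API accepts. A standalone sketch using only the stdlib `base64` piece (Marcel's MIME detection is omitted here; the MIME type is passed in directly):

```ruby
require "base64"

# Sketch of the encode_as_data_uri helper added in 0.2.0: RFC 2397 data URI
# built from strict (no-linefeed) Base64, prefixed with the detected MIME type.
def encode_as_data_uri(data, mime_type)
  "data:#{mime_type};base64,#{Base64.strict_encode64(data)}"
end

uri = encode_as_data_uri("hello", "text/plain")
# => "data:text/plain;base64,aGVsbG8="
```

`strict_encode64` (rather than `encode64`) matters here: it emits no embedded newlines, which would corrupt the data URI when sent in a JSON payload.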
data/lib/ai_chat.rb DELETED
@@ -1,6 +0,0 @@
-require "zeitwerk"
-loader = Zeitwerk::Loader.for_gem
-loader.inflector.inflect("ai" => "AI")
-loader.setup
-
-require_relative "ai/chat"