ai-chat 0.1.1 → 0.2.1

This diff shows the changes between publicly released versions of this package, as they appear in their public registry. It is provided for informational purposes only.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: c3ac8b911bbb322ec7470c226ed15f524b520ca927ccb50eb53d1beba9579e21
- data.tar.gz: e15f7f7c3f8f06819357b16e853eb8937eb543c8d8f5079e61a5e2ce3599448e
+ metadata.gz: 6b050afeef6a27a67c0125c131e9f2825a0201cf1c1781f7f87750705b150ea8
+ data.tar.gz: d87412fd5c1439eaad5eba3d919b6cbb7dfc795e762199beaeb28825dc1d0281
  SHA512:
- metadata.gz: e78959fc4366d03d9cbc96e218f77283001af9e9a49b1753c9944040bb8de1fd7c25635cbfb40543a7abead39614cfd68a4ad9d3958907b56023651c44d395e3
- data.tar.gz: 448d2cdc7892504079edff0028f820bbe384a440e06fc109ac8b4d131e903c196c5ca1f3ff1103615dfeb4b80c3cc6e8b8a08ddbfb5ab044d84c4eb7aa580563
+ metadata.gz: d7e6064820465b1ce64d2fa551e5e92fdf4bb74f6a817e4473a28f69541e431cd84deec12dce1df08fba16041baf27f120e7b37ec6721d40201d138f7f563f69
+ data.tar.gz: f13ebe743b083cd8089fa28e1de37750416b03ce0414eb0b754454f16a1b0bd080abb71c6e2c5b966462b6bda5dc9893dfd530060997a177fce493b1e0782eee
data/README.md CHANGED
@@ -1,6 +1,44 @@
- # OpenAI Chat
+ # AI Chat
 
- This gem provides a class called `OpenAI::Chat` that is intended to make it as easy as possible to use OpenAI's cutting-edge generative AI models.
+ This gem provides a class called `AI::Chat` that is intended to make it as easy as possible to use OpenAI's cutting-edge generative AI models.
+
+ ## Examples
+
+ This gem includes comprehensive example scripts that showcase all features and serve as both documentation and validation tests. To explore the capabilities:
+
+ ### Quick Start
+
+ ```bash
+ # Run a quick overview of key features (takes ~1 minute)
+ bundle exec ruby examples/01_quick.rb
+ ```
+
+ ### Run All Examples
+
+ ```bash
+ # Run the complete test suite demonstrating all features
+ bundle exec ruby examples/all.rb
+ ```
+
+ ### Individual Feature Examples
+
+ The `examples/` directory contains focused examples for specific features:
+
+ - `01_quick.rb` - Quick overview of key features
+ - `02_core.rb` - Core functionality (basic chat, messages, responses)
+ - `03_configuration.rb` - Configuration options (API keys, models, reasoning effort)
+ - `04_multimodal.rb` - Basic file and image handling
+ - `05_file_handling_comprehensive.rb` - Advanced file handling (PDFs, text files, Rails uploads)
+ - `06_structured_output.rb` - Basic structured output with schemas
+ - `07_structured_output_comprehensive.rb` - All 6 supported schema formats
+ - `08_advanced_usage.rb` - Advanced patterns (chaining, web search)
+ - `09_edge_cases.rb` - Error handling and edge cases
+ - `10_additional_patterns.rb` - Less common usage patterns (direct add method, web search + schema, etc.)
+
+ Each example is self-contained and can be run individually:
+ ```bash
+ bundle exec ruby examples/[filename]
+ ```
 
  ## Installation
 
@@ -9,7 +47,7 @@ This gem provides a class called `OpenAI::Chat` that is intended to make it as e
  Add this line to your application's Gemfile:
 
  ```ruby
- gem "openai-chat", "< 1.0.0"
+ gem "ai-chat", "< 1.0.0"
  ```
 
  And then, at a command prompt:
@@ -23,7 +61,7 @@ bundle install
  Or, install it directly with:
 
  ```
- gem install openai-chat
+ gem install ai-chat
  ```
 
  ## Simplest usage
@@ -31,10 +69,10 @@ gem install openai-chat
  In your Ruby program:
 
  ```ruby
- require "openai/chat"
+ require "ai-chat"
 
- # Create an instance of OpenAI::Chat
- a = OpenAI::Chat.new
+ # Create an instance of AI::Chat
+ a = AI::Chat.new
 
  # Build up your conversation by adding messages
  a.add("If the Ruby community had an official motto, what might it be?")
@@ -50,7 +88,7 @@ a.generate! # => "Matz is nice and so we are nice" (or similar)
  pp a.messages
  # => [
  #   {:role=>"user", :content=>"If the Ruby community had an official motto, what might it be?"},
- #   {:role=>"assistant", :content=>"Matz is nice and so we are nice", :response => #<OpenAI::Chat::Response id=resp_abc... model=gpt-4.1-nano tokens=12>}
+ #   {:role=>"assistant", :content=>"Matz is nice and so we are nice", :response => #<AI::Chat::Response id=resp_abc... model=gpt-4.1-nano tokens=12>}
  # ]
 
  # Continue the conversation
@@ -70,7 +108,7 @@ That's it! You're building something like this:
  [
    {:role => "system", :content => "You are a helpful assistant"},
    {:role => "user", :content => "Hello!"},
-   {:role => "assistant", :content => "Hi there! How can I help you today?", :response => #<OpenAI::Chat::Response id=resp_abc... model=gpt-4.1-nano tokens=12>}
+   {:role => "assistant", :content => "Hi there! How can I help you today?", :response => #<AI::Chat::Response id=resp_abc... model=gpt-4.1-nano tokens=12>}
  ]
  ```
 
@@ -79,9 +117,9 @@ That last bit, under `:response`, is an object that represents the JSON that the
  ## Adding Different Types of Messages
 
  ```ruby
- require "openai/chat"
+ require "ai-chat"
 
- b = OpenAI::Chat.new
+ b = AI::Chat.new
 
  # Add system instructions
  b.add("You are a helpful assistant that talks like Shakespeare.", role: "system")
@@ -105,7 +143,7 @@ b.generate! # => "Methinks 'tis 'Ruby doth bring joy to all who craft with care'
  Instead of always specifying the role, you can use these shortcuts:
 
  ```ruby
- c = OpenAI::Chat.new
+ c = AI::Chat.new
 
  # These are equivalent:
  c.add("You are helpful", role: "system")
@@ -130,7 +168,7 @@ We use the `add` method (and its shortcuts) to build up an array because:
 
  ```ruby
  # In a Rails app, you might do:
- d = OpenAI::Chat.new
+ d = AI::Chat.new
  d.messages = @conversation.messages # Load existing messages
  d.user("What should I do next?") # Add a new question
  d.generate! # Generate a response
@@ -143,7 +181,7 @@ d.generate! # Generate a response
  By default, the gem uses OpenAI's `gpt-4.1-nano` model. If you want to use a different model, you can set it:
 
  ```ruby
- e = OpenAI::Chat.new
+ e = AI::Chat.new
  e.model = "o4-mini"
  ```
 
@@ -167,13 +205,13 @@ The gem by default looks for an environment variable called `OPENAI_API_KEY` and
  You can specify a different environment variable name:
 
  ```ruby
- f = OpenAI::Chat.new(api_key_env_var: "MY_OPENAI_TOKEN")
+ f = AI::Chat.new(api_key_env_var: "MY_OPENAI_TOKEN")
  ```
 
  Or, you can pass an API key in directly:
 
  ```ruby
- g = OpenAI::Chat.new(api_key: "your-api-key-goes-here")
+ g = AI::Chat.new(api_key: "your-api-key-goes-here")
  ```
 
  ## Inspecting Your Conversation
@@ -181,7 +219,7 @@ g = OpenAI::Chat.new(api_key: "your-api-key-goes-here")
  You can call `.messages` to get an array containing the conversation so far:
 
  ```ruby
- h = OpenAI::Chat.new
+ h = AI::Chat.new
  h.system("You are a helpful cooking assistant")
  h.user("How do I boil an egg?")
  h.generate!
@@ -203,12 +241,33 @@ h.last
  # => "Here's how to boil an egg..."
  ```
 
+ ## Web Search
+
+ To give the model access to real-time information from the internet, the `web_search` feature is enabled by default. It uses OpenAI's built-in `web_search_preview` tool.
+
+ ```ruby
+ m = AI::Chat.new
+ m.user("What are the latest developments in the Ruby language?")
+ m.generate! # This may use web search to find current information
+ ```
+
+ **Note:** This feature requires a model that supports the `web_search_preview` tool, such as `gpt-4o` or `gpt-4o-mini`. The gem will attempt to use a compatible model if you have `web_search` enabled.
+
+ If you don't want the model to use web search, set `web_search` to `false`:
+
+ ```ruby
+ m = AI::Chat.new
+ m.web_search = false
+ m.user("What are the latest developments in the Ruby language?")
+ m.generate! # This won't use web search to find current information
+ ```
+
  ## Structured Output
 
  Get back Structured Output by setting the `schema` attribute (I suggest using [OpenAI's handy tool for generating the JSON Schema](https://platform.openai.com/docs/guides/structured-outputs)):
 
  ```ruby
- i = OpenAI::Chat.new
+ i = AI::Chat.new
 
  i.system("You are an expert nutritionist. The user will describe a meal. Estimate the calories, carbs, fat, and protein.")
 
@@ -224,10 +283,34 @@ response = i.generate!
  response[:total_calories] # => 285
  ```
 
- You can also provide the equivalent Ruby `Hash` rather than a `String` containing JSON.
+ ### Schema Formats
+
+ The gem supports multiple schema formats to accommodate different preferences and use cases, and it automatically wraps whichever one you provide in the format the API expects.
+
+ #### 1. Full Schema with `format` Key (Most Explicit)
+ ```ruby
+ # When you need complete control over the schema structure
+ i.schema = {
+   format: {
+     type: :json_schema,
+     name: "nutrition_values",
+     strict: true,
+     schema: {
+       type: "object",
+       properties: {
+         fat: { type: "number", description: "The amount of fat in grams." },
+         protein: { type: "number", description: "The amount of protein in grams." }
+       },
+       required: ["fat", "protein"],
+       additionalProperties: false
+     }
+   }
+ }
+ ```
 
+ #### 2. Schema with `name`, `strict`, and `schema` Keys
  ```ruby
- # Equivalent to assigning the String above
+ # The format shown in OpenAI's documentation
  i.schema = {
    name: "nutrition_values",
    strict: true,
@@ -235,25 +318,55 @@ i.schema = {
    type: "object",
    properties: {
      fat: { type: "number", description: "The amount of fat in grams." },
-     protein: { type: "number", description: "The amount of protein in grams." },
-     carbs: { type: "number", description: "The amount of carbohydrates in grams." },
-     total_calories: { type: "number", description: "The total calories calculated based on fat, protein, and carbohydrates." }
+     protein: { type: "number", description: "The amount of protein in grams." }
    },
-   required: [:fat, :protein, :carbs, :total_calories],
+   required: [:fat, :protein],
    additionalProperties: false
  }
  ```
 
- The keys can be `String`s or `Symbol`s.
+ #### 3. Simple JSON Schema Object
+ ```ruby
+ # The simplest format - just provide the schema itself
+ # The gem will wrap it with sensible defaults (name: "response", strict: true)
+ i.schema = {
+   type: "object",
+   properties: {
+     fat: { type: "number", description: "The amount of fat in grams." },
+     protein: { type: "number", description: "The amount of protein in grams." }
+   },
+   required: ["fat", "protein"],
+   additionalProperties: false
+ }
+ ```
+
+ #### 4. JSON String Formats
+ All of the above formats also work as JSON strings:
+
+ ```ruby
+ # As a JSON string with full format
+ i.schema = '{"format":{"type":"json_schema","name":"nutrition_values","strict":true,"schema":{...}}}'
+
+ # As a JSON string with name/strict/schema
+ i.schema = '{"name":"nutrition_values","strict":true,"schema":{...}}'
+
+ # As a simple JSON schema string
+ i.schema = '{"type":"object","properties":{...}}'
+ ```
+
+ ### Schema Notes
+
+ - The keys can be `String`s or `Symbol`s.
+ - The gem automatically converts your schema to the format expected by the API.
+ - When a schema is set, `generate!` returns a parsed Ruby Hash with symbolized keys, not a String.
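The normalization described above can be sketched in plain Ruby. This is a stdlib-only illustration, not the gem's actual code (the diff below shows the gem delegates to an internal `wrap_schema_if_needed` helper); the defaults here are taken from the comments above, and symbol keys are assumed for brevity:

```ruby
require "json"

# Sketch: normalize any of the accepted schema shapes into a
# { format: { type:, name:, strict:, schema: } } wrapper.
# Assumes symbol keys for brevity; the gem also accepts string keys.
def wrap_schema(value)
  value = JSON.parse(value, symbolize_names: true) if value.is_a?(String)
  return value if value.key?(:format)    # Format 1: already fully wrapped

  if value.key?(:schema)                 # Format 2: name/strict/schema
    { format: { type: :json_schema }.merge(value) }
  else                                   # Format 3: bare JSON Schema object
    { format: { type: :json_schema, name: "response", strict: true, schema: value } }
  end
end

bare = { type: "object", properties: { fat: { type: "number" } } }
wrap_schema(bare)[:format][:name]                                     # => "response"
wrap_schema('{"name":"n","strict":true,"schema":{}}')[:format][:type] # => :json_schema
```

Whichever shape you start with, the same wrapped structure comes out, which is why all four formats behave identically when assigned to `schema=`.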
 
  ## Including Images
 
  You can include images in your chat messages using the `user` method with the `image` or `images` parameter:
 
  ```ruby
- j = OpenAI::Chat.new
+ j = AI::Chat.new
 
  # Send a single image
  j.user("What's in this image?", image: "path/to/local/image.jpg")
@@ -277,18 +390,130 @@ The gem supports three types of image inputs:
  - **File paths**: Pass a string with a path to a local image file
  - **File-like objects**: Pass an object that responds to `read` (like `File.open("image.jpg")` or Rails uploaded files)
 
- ## Web Search
+ ## Including Files
 
- To give the model access to real-time information from the internet, you can enable the `web_search` feature. This uses OpenAI's built-in `web_search_preview` tool.
+ You can include files (PDFs, text files, etc.) in your messages using the `file` or `files` parameter:
 
  ```ruby
- m = OpenAI::Chat.new
- m.web_search = true
- m.user("What are the latest developments in the Ruby language?")
- m.generate! # This may use web search to find current information
+ k = AI::Chat.new
+
+ # Send a single file
+ k.user("Summarize this document", file: "report.pdf")
+ k.generate!
+
+ # Send multiple files
+ k.user("Compare these documents", files: ["doc1.pdf", "doc2.txt", "data.json"])
+ k.generate!
  ```
 
- **Note:** This feature requires a model that supports the `web_search_preview` tool, such as `gpt-4o` or `gpt-4o-mini`. The gem will attempt to use a compatible model if you have `web_search` enabled.
+ Files are handled intelligently based on their type:
+ - **PDFs**: Sent as file attachments for the model to analyze
+ - **Text files**: Content is automatically extracted and sent as text
+ - **Other formats**: The gem attempts to read them as text if possible
+
+ ## Mixed Content (Images + Files)
+
+ You can send images and files together in a single message:
+
+ ```ruby
+ l = AI::Chat.new
+
+ # Mix an image and a file in one message
+ l.user("Compare this photo with the document",
+   image: "photo.jpg",
+   file: "document.pdf")
+ l.generate!
+
+ # Mix multiple images and files
+ l.user("Analyze all these materials",
+   images: ["chart1.png", "chart2.png"],
+   files: ["report.pdf", "data.csv"])
+ l.generate!
+ ```
+
+ **Note**: Images should use the `image:`/`images:` parameters, while documents should use the `file:`/`files:` parameters.
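The per-type handling described above can be sketched as follows. This is an illustrative stand-in, not the gem's actual code: the real implementation detects content types with the `marcel` gem (see the gemspec diff below), while this sketch dispatches on file extensions using only the standard library.

```ruby
# Illustrative sketch of per-type attachment handling: extension-based
# here, whereas the gem itself uses MIME detection (via marcel).
def attachment_strategy(path)
  case File.extname(path).downcase
  when ".pdf"
    :input_file  # sent as a file attachment for the model to analyze
  when ".txt", ".md", ".json", ".csv"
    :inline_text # contents extracted and sent as text
  else
    :try_as_text # best effort: attempt to read as text
  end
end

attachment_strategy("report.pdf") # => :input_file
attachment_strategy("data.csv")   # => :inline_text
```

Extension-based dispatch is simpler but less robust than MIME sniffing, which is presumably why the gem moved from `mime-types` to `marcel` (which inspects file contents, not just names).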
+
+ ## Re-sending Old Images and Files
+
+ If you generate another API request using the same chat, old images and files in the conversation history will not be re-sent by default. If you want them re-sent, set `previous_response_id` to `nil`:
+
+ ```ruby
+ a = AI::Chat.new
+ a.user("What color is the object in this photo?", image: "thing.png")
+ a.generate! # => "Red"
+ a.user("What is the object in the photo?")
+ a.generate! # => "I don't see a photo"
+
+ b = AI::Chat.new
+ b.user("What color is the object in this photo?", image: "thing.png")
+ b.generate! # => "Red"
+ b.user("What is the object in the photo?")
+ b.previous_response_id = nil
+ b.generate! # => "An apple"
+ ```
+
+ If you don't set `previous_response_id` to `nil`, the model won't have the old image(s) to work with.
+
+ ## Image Generation
+
+ You can enable OpenAI's image generation tool:
+
+ ```ruby
+ a = AI::Chat.new
+ a.image_generation = true
+ a.user("Draw a picture of a kitten")
+ a.generate! # => "Here is your picture of a kitten:"
+ ```
+
+ By default, images are saved to `./images`. You can configure a different location:
+
+ ```ruby
+ a = AI::Chat.new
+ a.image_generation = true
+ a.image_folder = "./my_images"
+ a.user("Draw a picture of a kitten")
+ a.generate! # => "Here is your picture of a kitten:"
+ ```
+
+ Images are saved in timestamped subfolders using ISO 8601 basic format. For example:
+ - `./images/20250804T11303912_resp_abc123/001.png`
+ - `./images/20250804T11303912_resp_abc123/002.png` (if multiple images)
+
+ The folder structure ensures images are organized chronologically and by response.
+
+ The messages array will now look like this:
+
+ ```ruby
+ pp a.messages
+ # => [
+ #   {:role=>"user", :content=>"Draw a picture of a kitten"},
+ #   {:role=>"assistant", :content=>"Here is your picture of a kitten:", :images => ["./images/20250804T11303912_resp_abc123/001.png"], :response => #<Response ...>}
+ # ]
+ ```
+
+ You can access the image filenames in several ways:
+
+ ```ruby
+ # From the last message
+ images = a.messages.last[:images]
+ # => ["./images/20250804T11303912_resp_abc123/001.png"]
+
+ # From the response object
+ images = a.messages.last[:response].images
+ # => ["./images/20250804T11303912_resp_abc123/001.png"]
+ ```
+
+ Note: Unlike with user-provided input images, OpenAI _does_ store AI-generated output images. So, if you make another API request using the same chat, previous images generated by the model in the conversation history will automatically be used; you don't have to re-send them. This allows you to refine an image with user input over multi-turn chats.
+
+ ```ruby
+ a = AI::Chat.new
+ a.image_generation = true
+ a.image_folder = "./images"
+ a.user("Draw a picture of a kitten")
+ a.generate! # => "Here is a picture of a kitten:"
+ a.user("Make it even cuter")
+ a.generate! # => "Here is the kitten, but even cuter:"
+ ```
 
  ## Building Conversations Without API Calls
 
@@ -296,7 +521,7 @@ You can manually add assistant messages without making API calls, which is usefu
 
  ```ruby
  # Create a new chat instance
- k = OpenAI::Chat.new
+ k = AI::Chat.new
 
  # Add previous messages
  k.system("You are a helpful assistant who provides information about planets.")
@@ -316,14 +541,14 @@ response = k.generate!
  puts response
  ```
 
- With this, you can loop through any conversation's history (perhaps after retrieving it from your database), recreate an `OpenAI::Chat`, and then continue it.
+ With this, you can loop through any conversation's history (perhaps after retrieving it from your database), recreate an `AI::Chat`, and then continue it.
 
  ## Reasoning Models
 
  When using reasoning models like `o3` or `o4-mini`, you can specify a reasoning effort level to control how much reasoning the model does before producing its final response:
 
  ```ruby
- l = OpenAI::Chat.new
+ l = AI::Chat.new
  l.model = "o3-mini"
  l.reasoning_effort = "medium" # Can be "low", "medium", or "high"
 
@@ -343,7 +568,7 @@ Setting to `nil` disables the reasoning parameter.
  When you call `generate!`, the gem stores additional information about the API response:
 
  ```ruby
- t = OpenAI::Chat.new
+ t = AI::Chat.new
  t.user("Hello!")
  t.generate!
 
@@ -352,19 +577,14 @@ pp t.messages.last
  # => {
  #   :role => "assistant",
  #   :content => "Hello! How can I help you today?",
- #   :response => #<OpenAI::Chat::Response id=resp_abc... model=gpt-4.1-nano tokens=12>
+ #   :response => #<AI::Response id=resp_abc... model=gpt-4.1-nano tokens=12>
  # }
 
  # Access detailed information
- response = t.last_response
+ response = t.last[:response]
  response.id # => "resp_abc123..."
  response.model # => "gpt-4.1-nano"
  response.usage # => {:prompt_tokens=>5, :completion_tokens=>7, :total_tokens=>12}
-
- # Helper methods
- t.last_response_id # => "resp_abc123..."
- t.last_usage # => {:prompt_tokens=>5, :completion_tokens=>7, :total_tokens=>12}
- t.total_tokens # => 12
  ```
 
 
@@ -373,20 +593,20 @@ This information is useful for:
  - Understanding which model was actually used.
  - Future features like cost tracking.
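For instance, the stored usage hashes make simple cost bookkeeping straightforward. A stdlib-only sketch (the token counts below are made up for illustration):

```ruby
# Sum token usage across a conversation's stored responses.
# The hashes mirror the :usage shape shown above; values are illustrative.
usages = [
  { prompt_tokens: 5, completion_tokens: 7, total_tokens: 12 },
  { prompt_tokens: 21, completion_tokens: 34, total_tokens: 55 }
]

total_tokens = usages.sum { |usage| usage[:total_tokens] }
# total_tokens => 67
```

In a real app you would collect these hashes from each message's `:response` object rather than hard-coding them.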
 
- You can also, if you know a response ID, pick up an old conversation at that point in time:
+ You can also, if you know a response ID, continue an old conversation by setting the `previous_response_id`:
 
  ```ruby
- t = OpenAI::Chat.new
+ t = AI::Chat.new
  t.user("Hello!")
  t.generate!
- old_id = t.last_response_id # => "resp_abc123..."
+ old_id = t.last[:response].id # => "resp_abc123..."
 
  # Some time in the future...
 
- u = OpenAI::Chat.new
- u.pick_up_from("resp_abc123...")
- u.messages # => [
- #   {:role=>"assistant", :response => #<OpenAI::Chat::Response id=resp_abc...}
+ u = AI::Chat.new
+ u.previous_response_id = "resp_abc123..."
+ u.user("What did I just say?")
+ u.generate! # Will have context from the previous conversation
  u.user("What should we do next?")
  u.generate!
@@ -400,7 +620,7 @@ You can use `.messages=()` to assign an `Array` of `Hashes`. Each `Hash` must ha
 
  ```ruby
  # Using the planet example with an array of hashes
- p = OpenAI::Chat.new
+ p = AI::Chat.new
 
  # Set all messages at once instead of calling methods sequentially
  p.messages = [
@@ -423,7 +643,7 @@ You can still include images:
 
  ```ruby
  # Create a new chat instance
- q = OpenAI::Chat.new
+ q = AI::Chat.new
 
  # With images
  q.messages = [
@@ -446,7 +666,7 @@ If your chat history is contained in an `ActiveRecord::Relation`, you can assign
  # Load from ActiveRecord
  @thread = Thread.find(42)
 
- r = OpenAI::Chat.new
+ r = AI::Chat.new
  r.messages = @thread.posts.order(:created_at)
  r.user("What should we discuss next?")
  r.generate! # Creates a new post record, too
@@ -466,7 +686,7 @@ In order for the above to "magically" work, there are a few requirements. Your A
  If your columns have different names:
 
  ```ruby
- s = OpenAI::Chat.new
+ s = AI::Chat.new
  s.configure_message_attributes(
    role: :message_type,    # Your column for role
    content: :message_body, # Your column for content
@@ -485,13 +705,13 @@ add_column :messages, :openai_response, :text
 
  # In your model
  class Message < ApplicationRecord
-   serialize :openai_response, OpenAI::Chat::Response
+   serialize :openai_response, AI::Chat::Response
  end
 
  # Usage
  @thread = Thread.find(42)
 
- t = OpenAI::Chat.new
+ t = AI::Chat.new
  t.posts = @thread.messages
  t.user("Hello!")
  t.generate!
@@ -511,13 +731,12 @@ last_message.openai_response.usage # => {:prompt_tokens=>10, ...}
 
  While this gem includes specs, they use mocked API responses. To test with real API calls:
 
- 1. Navigate to the test program directory: `cd demo`
- 2. Create a `.env` file in the test_program directory with your API credentials:
+ 1. Create a `.env` file at the project root with your API credentials:
  ```
  # Your OpenAI API key
  OPENAI_API_KEY=your_openai_api_key_here
  ```
- 3. Install dependencies: `bundle install`
- 4. Run the test program: `ruby demo.rb`
+ 2. Install dependencies: `bundle install`
+ 3. Run the examples: `bundle exec ruby examples/all.rb`
 
  This test program runs through all the major features of the gem, making real API calls to OpenAI.
data/ai-chat.gemspec CHANGED
@@ -2,7 +2,7 @@
 
  Gem::Specification.new do |spec|
    spec.name = "ai-chat"
-   spec.version = "0.1.1"
+   spec.version = "0.2.1"
    spec.authors = ["Raghu Betina"]
    spec.email = ["raghu@firstdraft.com"]
    spec.homepage = "https://github.com/firstdraft/ai-chat"
@@ -18,12 +18,11 @@ Gem::Specification.new do |spec|
      "source_code_uri" => "https://github.com/firstdraft/ai-chat"
    }
 
-
    spec.required_ruby_version = "~> 3.2"
-   spec.add_dependency "zeitwerk", "~> 2.7"
-   spec.add_dependency "openai", "~> 0.14"
-   spec.add_runtime_dependency "mime-types", "~> 3.0"
-   spec.add_runtime_dependency "base64", "~> 0.1" # Works for all Ruby versions
+   spec.add_runtime_dependency "openai", "~> 0.16"
+   spec.add_runtime_dependency "marcel", "~> 1.0"
+   spec.add_runtime_dependency "base64", "> 0.1.1"
+   spec.add_runtime_dependency "json", "~> 2.0"
 
    spec.add_development_dependency "dotenv"
    spec.add_development_dependency "refinements", "~> 11.1"
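One note on the dependency changes above: `base64` stays pinned as an explicit runtime dependency because newer Ruby versions no longer ship it as a default gem, and the gem relies on it to inline local images as base64 data URLs. A rough, stdlib-only sketch of that encoding step (the `image/png` MIME type here is an assumption; the gem detects the real type):

```ruby
require "base64"

# Sketch of how a local image ends up in a request payload: binary data
# is base64-encoded into a data URL. The image/png type is assumed here;
# the gem detects the actual MIME type (via marcel).
data = [137, 80, 78, 71].pack("C*") # stand-in bytes (a PNG signature fragment)
image_url = "data:image/png;base64,#{Base64.strict_encode64(data)}"

image_url # => "data:image/png;base64,iVBORw=="
```

`strict_encode64` is used rather than `encode64` because data URLs must not contain the newlines that `encode64` inserts every 60 characters.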
data/lib/ai/chat.rb CHANGED
@@ -1,31 +1,41 @@
  # frozen_string_literal: true
 
  require "base64"
- require "mime-types"
+ require "json"
+ require "marcel"
  require "openai"
+ require "pathname"
+ require "stringio"
+ require "fileutils"
 
  require_relative "response"
 
  module AI
-   # Main namespace.
+   # :reek:MissingSafeMethod { exclude: [ generate! ] }
+   # :reek:TooManyMethods
+   # :reek:TooManyInstanceVariables
+   # :reek:InstanceVariableAssumption
+   # :reek:IrresponsibleModule
    class Chat
-     def self.loader(registry = Zeitwerk::Registry)
-       @loader ||= registry.loaders.each.find { |loader| loader.tag == "ai-chat" }
-     end
-
-     attr_accessor :messages, :schema, :model, :web_search
-     attr_reader :reasoning_effort, :client
+     # :reek:Attribute
+     attr_accessor :messages, :model, :web_search, :previous_response_id, :image_generation, :image_folder
+     attr_reader :reasoning_effort, :client, :schema
 
      VALID_REASONING_EFFORTS = [:low, :medium, :high].freeze
 
      def initialize(api_key: nil, api_key_env_var: "OPENAI_API_KEY")
-       @api_key = api_key || ENV.fetch(api_key_env_var)
+       api_key ||= ENV.fetch(api_key_env_var)
        @messages = []
        @reasoning_effort = nil
        @model = "gpt-4.1-nano"
-       @client = OpenAI::Client.new(api_key: @api_key)
+       @client = OpenAI::Client.new(api_key: api_key)
+       @previous_response_id = nil
+       @image_generation = false
+       @image_folder = "./images"
      end
 
+     # :reek:TooManyStatements
+     # :reek:NilCheck
      def add(content, role: "user", response: nil, image: nil, images: nil, file: nil, files: nil)
        if image.nil? && images.nil? && file.nil? && files.nil?
          messages.push(
@@ -44,41 +54,25 @@ module AI
          }
        ]
 
-       if images && !images.empty?
-         images_array = images.map do |image|
-           {
-             type: "input_image",
-             image_url: process_file(image)
-           }
-         end
+       all_images = []
+       all_images << image if image
+       all_images.concat(Array(images)) if images
 
-         text_and_files_array += images_array
-       elsif image
+       all_images.each do |img|
          text_and_files_array.push(
            {
              type: "input_image",
-             image_url: process_file(image)
+             image_url: process_image_input(img)
            }
          )
-       elsif files && !files.empty?
-         files_array = files.map do |file|
-           {
-             type: "input_file",
-             filename: "test",
-             file_data: process_file(file)
-           }
-         end
+       end
 
-         text_and_files_array += files_array
-       else
-         text_and_files_array.push(
-           {
-             type: "input_file",
-             filename: "test",
-             file_data: process_file(file)
-           }
-         )
+       all_files = []
+       all_files << file if file
+       all_files.concat(Array(files)) if files
 
+       all_files.each do |file|
+         text_and_files_array.push(process_file_input(file))
        end
 
        messages.push(
97
91
  def user(message, image: nil, images: nil, file: nil, files: nil)
98
92
  add(message, role: "user", image: image, images: images, file: file, files: files)
99
93
  end
100
-
94
+
101
95
  def assistant(message, response: nil)
102
96
  add(message, role: "assistant", response: response)
103
97
  end
104
98
 
99
+ # :reek:NilCheck
100
+ # :reek:TooManyStatements
105
101
  def generate!
106
102
  response = create_response
107
103
 
108
- if web_search
109
- message = response.output.last.content.first.text
110
- chat_response = Response.new(response)
111
- assistant(message, response: chat_response)
112
- elsif schema
113
- # filtering out refusals...
114
- json_response = response.output.flat_map { _1.content }.select { _1.is_a?(OpenAI::Models::Responses::ResponseOutputText)}.first.text
115
- chat_response = Response.new(response)
116
- message = JSON.parse(json_response, symbolize_names: true)
117
- assistant(message, response: chat_response)
104
+ chat_response = Response.new(response)
105
+
106
+ text_response = extract_text_from_response(response)
107
+
108
+ image_filenames = extract_and_save_images(response)
109
+
110
+ chat_response.images = image_filenames
111
+
112
+ message = if schema
113
+ if text_response.nil? || text_response.empty?
114
+ raise ArgumentError, "No text content in response to parse as JSON for schema: #{schema.inspect}"
115
+ end
116
+ JSON.parse(text_response, symbolize_names: true)
118
117
  else
119
- message = response.output.last.content.first.text
120
- chat_response = Response.new(response)
118
+ text_response
119
+ end
120
+
121
+ if image_filenames.empty?
121
122
  assistant(message, response: chat_response)
123
+ else
124
+ messages.push(
125
+ {
126
+ role: "assistant",
127
+ content: message,
128
+ images: image_filenames,
129
+ response: chat_response
130
+ }.compact
131
+ )
122
132
  end
123
133
 
124
- message
125
- end
134
+ self.previous_response_id = response.id
126
135
 
127
- def pick_up_from(response_id)
128
- response = client.responses.retrieve(response_id)
129
- chat_response = Response.new(response)
130
- message = response.output.flat_map { _1.content }.select { _1.is_a?(OpenAI::Models::Responses::ResponseOutputText)}.first.text
131
- assistant(message, response: chat_response)
132
136
  message
133
137
  end
134
138
 
139
+ # :reek:NilCheck
140
+ # :reek:TooManyStatements
135
141
  def reasoning_effort=(value)
136
142
  if value.nil?
137
143
  @reasoning_effort = nil
138
- else
139
- # Convert string to symbol if needed
140
- symbol_value = value.is_a?(String) ? value.to_sym : value
144
+ return
145
+ end
141
146
 
142
- if VALID_REASONING_EFFORTS.include?(symbol_value)
143
- @reasoning_effort = symbol_value
144
- else
145
- valid_values = VALID_REASONING_EFFORTS.map { |v| ":#{v} or \"#{v}\"" }.join(", ")
146
- raise ArgumentError, "Invalid reasoning_effort value: '#{value}'. Must be one of: #{valid_values}"
147
- end
147
+ normalized_value = value.to_sym
148
+
149
+ if VALID_REASONING_EFFORTS.include?(normalized_value)
150
+ @reasoning_effort = normalized_value
151
+ else
152
+ valid_values = VALID_REASONING_EFFORTS.map { |valid_value| ":#{valid_value} or \"#{valid_value}\"" }.join(", ")
153
+ raise ArgumentError, "Invalid reasoning_effort value: '#{value}'. Must be one of: #{valid_values}"
148
154
  end
149
155
  end
150
-
156
+
151
157
  def schema=(value)
152
158
  if value.is_a?(String)
153
- @schema = JSON.parse(value, symbolize_names: true)
154
- unless @schema.key?(:format) || @schema.key?("format")
155
- @schema = { format: @schema }
156
- end
159
+ parsed = JSON.parse(value, symbolize_names: true)
160
+ @schema = wrap_schema_if_needed(parsed)
157
161
  elsif value.is_a?(Hash)
158
- if value.key?(:format) || value.key?("format")
159
- @schema = value
160
- else
161
- @schema = { format: value }
162
- end
162
+ @schema = wrap_schema_if_needed(value)
163
163
  else
164
164
  raise ArgumentError, "Invalid schema value: '#{value}'. Must be a String containing JSON or a Hash."
165
165
  end
@@ -169,50 +169,70 @@ module AI
  messages.last
  end

- def last_response
- last[:response]
- end
-
- def last_response_id
- last_response&.id
- end
-
  def inspect
  "#<#{self.class.name} @messages=#{messages.inspect} @model=#{@model.inspect} @schema=#{@schema.inspect} @reasoning_effort=#{@reasoning_effort.inspect}>"
  end

  private

- # Custom exception class for input classification errors.
  class InputClassificationError < StandardError; end

+ # :reek:FeatureEnvy
+ # :reek:ManualDispatch
+ def extract_filename(obj)
+ if obj.respond_to?(:original_filename)
+ obj.original_filename
+ elsif obj.respond_to?(:path)
+ File.basename(obj.path)
+ else
+ raise InputClassificationError,
+ "Unable to determine filename from file object. File objects must respond to :original_filename or :path"
+ end
+ end
+
+ # :reek:TooManyStatements
  def create_response
  parameters = {
- model: model,
- input: strip_responses(messages),
- tools: tools,
- text: schema,
- reasoning: {
- effort: reasoning_effort
- }.compact
- }.compact
- parameters = parameters.delete_if { |k, v| v.empty? }
+ model: model
+ }
+
+ parameters[:tools] = tools unless tools.empty?
+ parameters[:text] = schema if schema
+ parameters[:reasoning] = {effort: reasoning_effort} if reasoning_effort
+ parameters[:previous_response_id] = previous_response_id if previous_response_id
+
+ messages_to_send = prepare_messages_for_api
+ parameters[:input] = strip_responses(messages_to_send) unless messages_to_send.empty?
+
  client.responses.create(**parameters)
  end

+ def prepare_messages_for_api
+ return messages unless previous_response_id
+
+ previous_response_index = messages.find_index { |message| message[:response]&.id == previous_response_id }
+
+ if previous_response_index
+ messages[(previous_response_index + 1)..] || []
+ else
+ messages
+ end
+ end
+
+ # :reek:DuplicateMethodCall
+ # :reek:FeatureEnvy
+ # :reek:ManualDispatch
+ # :reek:TooManyStatements
  def classify_obj(obj)
  if obj.is_a?(String)
- # Attempt to parse as a URL.
  begin
  uri = URI.parse(obj)
  if uri.is_a?(URI::HTTP) || uri.is_a?(URI::HTTPS)
  return :url
  end
  rescue URI::InvalidURIError
- # Not a valid URL; continue to check if it's a file path.
  end

- # Check if the string represents a local file path (must exist on disk).
  if File.exist?(obj)
  :file_path
  else
@@ -220,7 +240,6 @@ module AI
  "String provided is neither a valid URL (must start with http:// or https://) nor an existing file path on disk. Received value: #{obj.inspect}"
  end
  elsif obj.respond_to?(:read)
- # For non-String objects, check if it behaves like a file.
  :file_like
  else
  raise InputClassificationError,
@@ -228,57 +247,217 @@ module AI
  end
  end

- def process_file(obj)
+ # :reek:DuplicateMethodCall
+ # :reek:ManualDispatch
+ # :reek:TooManyStatements
+ def process_file_input(obj)
  case classify_obj(obj)
  when :url
- obj
+ {
+ type: "input_file",
+ file_url: obj
+ }
  when :file_path
- file_path = obj
+ mime_type = Marcel::MimeType.for(Pathname.new(obj))

- mime_type = MIME::Types.type_for(file_path).first.to_s
+ if mime_type == "application/pdf"
+ pdf_data = File.binread(obj)
+ {
+ type: "input_file",
+ filename: File.basename(obj),
+ file_data: encode_as_data_uri(pdf_data, mime_type)
+ }
+ else
+ begin
+ content = File.read(obj, encoding: "UTF-8")
+ # Verify the content can be encoded as JSON (will raise if not)
+ JSON.generate({text: content})
+ {
+ type: "input_text",
+ text: content
+ }
+ rescue Encoding::InvalidByteSequenceError, Encoding::UndefinedConversionError, JSON::GeneratorError
+ raise InputClassificationError,
+ "Unable to read #{File.basename(obj)} as text. Only PDF and text files are supported."
+ end
+ end
+ when :file_like
+ filename = extract_filename(obj)

- image_data = File.binread(file_path)
+ content = obj.read
+ obj.rewind if obj.respond_to?(:rewind)

- base64_string = Base64.strict_encode64(image_data)
+ mime_type = Marcel::MimeType.for(StringIO.new(content), name: filename)

- "data:#{mime_type};base64,#{base64_string}"
- when :file_like
- filename = if obj.respond_to?(:path)
- obj.path
- elsif obj.respond_to?(:original_filename)
- obj.original_filename
+ if mime_type == "application/pdf"
+ {
+ type: "input_file",
+ filename: filename,
+ file_data: encode_as_data_uri(content, mime_type)
+ }
  else
- "unknown"
+ begin
+ text_content = content.force_encoding("UTF-8")
+ JSON.generate({text: text_content})
+ {
+ type: "input_text",
+ text: text_content
+ }
+ rescue Encoding::InvalidByteSequenceError, Encoding::UndefinedConversionError, JSON::GeneratorError
+ raise InputClassificationError,
+ "Unable to read #(unknown) as text. Only PDF and text files are supported."
+ end
  end
+ end
+ end

- mime_type = MIME::Types.type_for(filename).first.to_s
- mime_type = "image/jpeg" if mime_type.empty?
-
+ # :reek:ManualDispatch
+ # :reek:TooManyStatements
+ def process_image_input(obj)
+ case classify_obj(obj)
+ when :url
+ obj
+ when :file_path
+ mime_type = Marcel::MimeType.for(Pathname.new(obj))
+ image_data = File.binread(obj)
+ encode_as_data_uri(image_data, mime_type)
+ when :file_like
+ filename = extract_filename(obj)
  file_data = obj.read
  obj.rewind if obj.respond_to?(:rewind)
-
- base64_string = Base64.strict_encode64(file_data)
-
- "data:#{mime_type};base64,#{base64_string}"
+ mime_type = Marcel::MimeType.for(StringIO.new(file_data), name: filename)
+ encode_as_data_uri(file_data, mime_type)
  end
  end

+ # :reek:UtilityFunction
+ def encode_as_data_uri(data, mime_type)
+ "data:#{mime_type};base64,#{Base64.strict_encode64(data)}"
+ end
+
+ # :reek:DuplicateMethodCall
+ # :reek:UtilityFunction
  def strip_responses(messages)
- messages.each do |message|
- message.delete(:response) if message.key?(:response)
- message[:content] = JSON.generate(message[:content]) if message[:content].is_a?(Hash)
+ messages.map do |message|
+ stripped = message.dup
+ stripped.delete(:response)
+ stripped[:content] = JSON.generate(stripped[:content]) if stripped[:content].is_a?(Hash)
+ stripped
  end
  end

  def tools
  tools_list = []
  if web_search
- tools_list << { type: "web_search_preview" }
+ tools_list << {type: "web_search_preview"}
+ end
+ if image_generation
+ tools_list << {type: "image_generation"}
+ end
+ tools_list
+ end
+
+ def extract_text_from_response(response)
+ response.output.flat_map { |output|
+ output.respond_to?(:content) ? output.content : []
+ }.compact.find { |content|
+ content.is_a?(OpenAI::Models::Responses::ResponseOutputText)
+ }&.text
+ end
+
+ # :reek:FeatureEnvy
+ def wrap_schema_if_needed(schema)
+ if schema.key?(:format) || schema.key?("format")
+ schema
+ elsif (schema.key?(:name) || schema.key?("name")) &&
+ (schema.key?(:schema) || schema.key?("schema")) &&
+ (schema.key?(:strict) || schema.key?("strict"))
+ {
+ format: schema.merge(type: :json_schema)
+ }
+ else
+ {
+ format: {
+ type: :json_schema,
+ name: "response",
+ schema: schema,
+ strict: true
+ }
+ }
+ end
+ tools_list
+ end
+
+ # :reek:DuplicateMethodCall
+ # :reek:FeatureEnvy
+ # :reek:ManualDispatch
+ # :reek:TooManyStatements
+ def extract_and_save_images(response)
+ image_filenames = []
+
+ image_outputs = response.output.select { |output|
+ output.respond_to?(:type) && output.type == :image_generation_call
+ }
+
+ return image_filenames if image_outputs.empty?
+
+ # ISO 8601 basic format with centisecond precision
+ timestamp = Time.now.strftime("%Y%m%dT%H%M%S%2N")
+
+ subfolder_name = "#{timestamp}_#{response.id}"
+ subfolder_path = File.join(image_folder || "./images", subfolder_name)
+ FileUtils.mkdir_p(subfolder_path)
+
+ image_outputs.each_with_index do |output, index|
+ next unless output.respond_to?(:result) && output.result
+
+ begin
+ image_data = Base64.strict_decode64(output.result)
+
+ filename = "#{(index + 1).to_s.rjust(3, "0")}.png"
+ filepath = File.join(subfolder_path, filename)
+
+ File.binwrite(filepath, image_data)
+
+ image_filenames << filepath
+ rescue => error
+ warn "Failed to save image: #{error.message}"
+ end
  end
+
+ image_filenames
+ end
+
+ # :reek:UtilityFunction
+ # :reek:ManualDispatch
+ def extract_text_from_response(response)
+ response.output.flat_map { |output|
+ output.respond_to?(:content) ? output.content : []
+ }.compact.find { |content|
+ content.is_a?(OpenAI::Models::Responses::ResponseOutputText)
+ }&.text
  end

- def extract_message(response)
- response.output.flat_map { _1.content }.select { _1.is_a?(OpenAI::Models::Responses::ResponseOutputText)}.first.text
+ # :reek:UtilityFunction
+ def wrap_schema_if_needed(schema)
+ if schema.key?(:format) || schema.key?("format")
+ schema
+ elsif (schema.key?(:name) || schema.key?("name")) &&
+ (schema.key?(:schema) || schema.key?("schema")) &&
+ (schema.key?(:strict) || schema.key?("strict"))
+ {
+ format: schema.merge(type: :json_schema)
+ }
+ else
+ {
+ format: {
+ type: :json_schema,
+ name: "response",
+ schema: schema,
+ strict: true
+ }
+ }
+ end
+ end
  end
  end
  end
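
The rewritten `lib/ai/chat.rb` centralizes structured-output handling in `wrap_schema_if_needed`, which wraps a bare JSON Schema hash in the Responses API `format` envelope while passing already-wrapped hashes through. A standalone sketch of that wrapping logic, re-implemented here outside the gem purely for illustration:

```ruby
# Illustrative re-implementation of the schema-wrapping behavior shown in
# the diff above (not the gem's actual module). A bare JSON Schema is
# wrapped in a `format` envelope with default name/strict settings;
# hashes that already carry :format pass through untouched.
def wrap_schema_if_needed(schema)
  if schema.key?(:format) || schema.key?("format")
    schema
  elsif (schema.key?(:name) || schema.key?("name")) &&
      (schema.key?(:schema) || schema.key?("schema")) &&
      (schema.key?(:strict) || schema.key?("strict"))
    {format: schema.merge(type: :json_schema)}
  else
    {format: {type: :json_schema, name: "response", schema: schema, strict: true}}
  end
end

bare = {type: "object", properties: {answer: {type: "string"}}}
wrapped = wrap_schema_if_needed(bare)
puts wrapped[:format][:name]                        # prints: response
puts wrap_schema_if_needed(wrapped).equal?(wrapped) # prints: true
```

Because a hash that already contains a `format` key is returned as-is, wrapping is idempotent: assigning an already-wrapped schema a second time is safe.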
data/lib/ai/response.rb CHANGED
@@ -1,12 +1,17 @@
  module AI
+ # :reek:IrresponsibleModule
+ # :reek:TooManyInstanceVariables
  class Response
  attr_reader :id, :model, :usage, :total_tokens
+ # :reek:Attribute
+ attr_accessor :images

  def initialize(response)
  @id = response.id
  @model = response.model
  @usage = response.usage.to_h.slice(:input_tokens, :output_tokens, :total_tokens)
  @total_tokens = @usage[:total_tokens]
+ @images = []
  end
  end
- end
+ end
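
The updated `AI::Response` value object now carries an `images` array (defaulting to empty) alongside the token-usage fields. The shape can be exercised with hypothetical stand-ins for the SDK response object, which only needs to answer `#id`, `#model`, and `#usage` (with `#to_h`):

```ruby
# Hypothetical stand-ins for the OpenAI SDK objects the initializer reads.
FakeUsage = Struct.new(:input_tokens, :output_tokens, :total_tokens, keyword_init: true)
FakeRaw = Struct.new(:id, :model, :usage, keyword_init: true)

# The Response class as shown in the diff above.
class Response
  attr_reader :id, :model, :usage, :total_tokens
  attr_accessor :images

  def initialize(response)
    @id = response.id
    @model = response.model
    @usage = response.usage.to_h.slice(:input_tokens, :output_tokens, :total_tokens)
    @total_tokens = @usage[:total_tokens]
    @images = []  # filled in later when the image_generation tool produces files
  end
end

raw = FakeRaw.new(id: "resp_123", model: "example-model",
  usage: FakeUsage.new(input_tokens: 10, output_tokens: 5, total_tokens: 15))
response = Response.new(raw)
puts response.total_tokens   # prints: 15
puts response.images.inspect # prints: []
```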
data/lib/ai-chat.rb CHANGED
@@ -1 +1 @@
- require_relative "ai/chat"
+ require_relative "ai/chat"
metadata CHANGED
@@ -1,71 +1,71 @@
  --- !ruby/object:Gem::Specification
  name: ai-chat
  version: !ruby/object:Gem::Version
- version: 0.1.1
+ version: 0.2.1
  platform: ruby
  authors:
  - Raghu Betina
- autorequire:
+ autorequire:
  bindir: bin
  cert_chain: []
- date: 2025-07-29 00:00:00.000000000 Z
+ date: 2025-08-05 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
- name: zeitwerk
+ name: openai
  requirement: !ruby/object:Gem::Requirement
  requirements:
  - - "~>"
  - !ruby/object:Gem::Version
- version: '2.7'
+ version: '0.16'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
  - - "~>"
  - !ruby/object:Gem::Version
- version: '2.7'
+ version: '0.16'
  - !ruby/object:Gem::Dependency
- name: openai
+ name: marcel
  requirement: !ruby/object:Gem::Requirement
  requirements:
  - - "~>"
  - !ruby/object:Gem::Version
- version: '0.14'
+ version: '1.0'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
  - - "~>"
  - !ruby/object:Gem::Version
- version: '0.14'
+ version: '1.0'
  - !ruby/object:Gem::Dependency
- name: mime-types
+ name: base64
  requirement: !ruby/object:Gem::Requirement
  requirements:
- - - "~>"
+ - - ">"
  - !ruby/object:Gem::Version
- version: '3.0'
+ version: 0.1.1
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
- - - "~>"
+ - - ">"
  - !ruby/object:Gem::Version
- version: '3.0'
+ version: 0.1.1
  - !ruby/object:Gem::Dependency
- name: base64
+ name: json
  requirement: !ruby/object:Gem::Requirement
  requirements:
  - - "~>"
  - !ruby/object:Gem::Version
- version: '0.1'
+ version: '2.0'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
  - - "~>"
  - !ruby/object:Gem::Version
- version: '0.1'
+ version: '2.0'
  - !ruby/object:Gem::Dependency
  name: dotenv
  requirement: !ruby/object:Gem::Requirement
@@ -94,7 +94,7 @@ dependencies:
  - - "~>"
  - !ruby/object:Gem::Version
  version: '11.1'
- description:
+ description:
  email:
  - raghu@firstdraft.com
  executables: []
@@ -109,7 +109,6 @@ files:
  - lib/ai-chat.rb
  - lib/ai/chat.rb
  - lib/ai/response.rb
- - lib/ai_chat.rb
  homepage: https://github.com/firstdraft/ai-chat
  licenses:
  - MIT
@@ -120,7 +119,7 @@ metadata:
  label: AI Chat
  rubygems_mfa_required: 'true'
  source_code_uri: https://github.com/firstdraft/ai-chat
- post_install_message:
+ post_install_message:
  rdoc_options: []
  require_paths:
  - lib
@@ -135,8 +134,8 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  - !ruby/object:Gem::Version
  version: '0'
  requirements: []
- rubygems_version: 3.5.23
- signing_key:
+ rubygems_version: 3.4.6
+ signing_key:
  specification_version: 4
  summary: A beginner-friendly Ruby interface for OpenAI's API
  test_files: []
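
The metadata changes drop zeitwerk and mime-types as runtime dependencies in favor of marcel and an explicit json dependency, and loosen the base64 constraint. Expressed as a gemspec fragment (a sketch implied by the diff above; the gem's actual gemspec file is not shown here):

```ruby
# Sketch of the 0.2.1 runtime dependency set implied by the metadata diff.
spec = Gem::Specification.new do |s|
  s.name = "ai-chat"
  s.version = "0.2.1"
  s.summary = "A beginner-friendly Ruby interface for OpenAI's API"
  s.add_dependency "openai", "~> 0.16" # bumped from "~> 0.14"
  s.add_dependency "marcel", "~> 1.0"  # replaces mime-types for MIME detection
  s.add_dependency "base64", "> 0.1.1" # loosened from "~> 0.1"
  s.add_dependency "json", "~> 2.0"    # new explicit dependency
end

puts spec.dependencies.map(&:name).sort.inspect
# prints: ["base64", "json", "marcel", "openai"]
```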
data/lib/ai_chat.rb DELETED
@@ -1,6 +0,0 @@
- require "zeitwerk"
- loader = Zeitwerk::Loader.for_gem
- loader.inflector.inflect("ai" => "AI")
- loader.setup
-
- require_relative "ai/chat"