ai-chat 0.2.3 → 0.3.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: c2397a7d43d950bf70abd84c83d0e57b2cadb70fc127985fae4da4a9ecf142b8
- data.tar.gz: d44a9d1999ce48cd0af92f367f5f0e2f586b8c19d1fc9e95ce8c7aa7e3fffa84
+ metadata.gz: b21d17972572c7c6282aa2a40c539f48967b47caad6ac0f80f99046337436576
+ data.tar.gz: b27033cec74910347d8f965e7aa4928b888a0d42aa8f14c48bc229014522cf7d
  SHA512:
- metadata.gz: fdb13f4e405f46a21591f71ae818544bc60f7bc083e4b3eb9a8a6d57b9eba03995fe8b75828f3972e564dfa3ed3ed09aed4a8b7ae2260b3602cbede9d80afd8e
- data.tar.gz: 427b5db79e006d0f6f9b20b83472a78c3234118807e4d43d472304426ca2d80d5b189d029924cd548d7004206914a4e63fd52ca93b1944e55e81276309a501d9
+ metadata.gz: 5fd677c3e077c29a9777c1c1fa57108798d39ad09fed0093db12ce0819c9a4a204bbc8cbbfd84b4236df48b9ee7e3a0ea0469d29745188d1a22d3f5789f5c65a
+ data.tar.gz: 4edb6917f52330c5a8dd27e825e1cc27201da82ab8cc7e4988d0514b3ed92969f0b8369be4ceec44fbaa3684545774060430ca110730e5dfe753686c4f292526
data/README.md CHANGED
@@ -34,6 +34,11 @@ The `examples/` directory contains focused examples for specific features:
  - `08_advanced_usage.rb` - Advanced patterns (chaining, web search)
  - `09_edge_cases.rb` - Error handling and edge cases
  - `10_additional_patterns.rb` - Less common usage patterns (direct add method, web search + schema, etc.)
+ - `11_mixed_content.rb` - Combining text and images in messages
+ - `12_image_generation.rb` - Using the image generation tool
+ - `13_code_interpreter.rb` - Using the code interpreter tool
+ - `14_background_mode.rb` - Running responses in background mode
+ - `15_conversation_features_comprehensive.rb` - All conversation features (auto-creation, inspection, loading, forking)

  Each example is self-contained and can be run individually:
  ```bash
@@ -82,7 +87,7 @@ pp a.messages
  # => [{:role=>"user", :content=>"If the Ruby community had an official motto, what might it be?"}]

  # Generate the next message using AI
- a.generate! # => "Matz is nice and so we are nice" (or similar)
+ a.generate! # => { :role => "assistant", :content => "Matz is nice and so we are nice" (or similar) }

  # Your array now includes the assistant's response
  pp a.messages
@@ -93,7 +98,7 @@ pp a.messages

  # Continue the conversation
  a.add("What about Rails?")
- a.generate! # => "Convention over configuration."
+ a.generate! # => { :role => "assistant", :content => "Convention over configuration."}
  ```

  ## Understanding the Data Structure
@@ -135,7 +140,7 @@ pp b.messages
  # ]

  # Generate a response
- b.generate! # => "Methinks 'tis 'Ruby doth bring joy to all who craft with care'"
+ b.generate! # => { :role => "assistant", :content => "Methinks 'tis 'Ruby doth bring joy to all who craft with care'" }
  ```

  ### Convenience Methods
@@ -237,23 +242,24 @@ h.messages.last[:content]
  # => "Here's how to boil an egg..."

  # Or use the convenient shortcut
- h.last
+ h.last[:content]
  # => "Here's how to boil an egg..."
  ```

  ## Web Search

- To give the model access to real-time information from the internet, we enable the `web_search` feature by default. This uses OpenAI's built-in `web_search_preview` tool.
+ To give the model access to real-time information from the internet, you can enable web searching. This uses OpenAI's built-in `web_search_preview` tool.

  ```ruby
  m = AI::Chat.new
+ m.web_search = true
  m.user("What are the latest developments in the Ruby language?")
  m.generate! # This may use web search to find current information
  ```

  **Note:** This feature requires a model that supports the `web_search_preview` tool, such as `gpt-4o` or `gpt-4o-mini`. The gem will attempt to use a compatible model if you have `web_search` enabled.

- If you don't want the model to use web search, set `web_search` to `false`:
+ If you don't want the model to use web search, set `web_search` to `false` (this is the default):

  ```ruby
  m = AI::Chat.new
@@ -277,10 +283,11 @@ i.schema = '{"name": "nutrition_values","strict": true,"schema": {"type": "objec
  i.user("1 slice of pizza")

  response = i.generate!
+ data = response[:content]
  # => {:fat=>15, :protein=>12, :carbs=>35, :total_calories=>285}

  # The response is parsed JSON, not a string!
- response[:total_calories] # => 285
+ data[:total_calories] # => 285
  ```

  ### Schema Formats
@@ -442,14 +449,14 @@ a = AI::Chat.new
  a.user("What color is the object in this photo?", image: "thing.png")
  a.generate! # => "Red"
  a.user("What is the object in the photo?")
- a.generate! # => "I don't see a photo"
+ a.generate! # => { :content => "I don't see a photo", ... }

  b = AI::Chat.new
  b.user("What color is the object in this photo?", image: "thing.png")
  b.generate! # => "Red"
  b.user("What is the object in the photo?")
  b.previous_response_id = nil
- b.generate! # => "An apple"
+ b.generate! # => { :content => "An apple", ... }
  ```

  If you don't set `previous_response_id` to `nil`, the model won't have the old image(s) to work with.
@@ -462,7 +469,7 @@ You can enable OpenAI's image generation tool:
  a = AI::Chat.new
  a.image_generation = true
  a.user("Draw a picture of a kitten")
- a.generate! # => "Here is your picture of a kitten:"
+ a.generate! # => { :content => "Here is your picture of a kitten:", ... }
  ```

  By default, images are saved to `./images`. You can configure a different location:
@@ -472,7 +479,7 @@ a = AI::Chat.new
  a.image_generation = true
  a.image_folder = "./my_images"
  a.user("Draw a picture of a kitten")
- a.generate! # => "Here is your picture of a kitten:"
+ a.generate! # => { :content => "Here is your picture of a kitten:", ... }
  ```

  Images are saved in timestamped subfolders using ISO 8601 basic format. For example:
@@ -510,11 +517,35 @@ a = AI::Chat.new
  a.image_generation = true
  a.image_folder = "./images"
  a.user("Draw a picture of a kitten")
- a.generate! # => "Here is a picture of a kitten:"
+ a.generate! # => { :content => "Here is a picture of a kitten:", ... }
  a.user("Make it even cuter")
- a.generate! # => "Here is the kitten, but even cuter:"
+ a.generate! # => { :content => "Here is the kitten, but even cuter:", ... }
  ```

+ ## Code Interpreter
+
+ ```ruby
+ y = AI::Chat.new
+ y.code_interpreter = true
+ y.user("Plot y = 2x*3 when x is -5 to 5.")
+ y.generate! # => {:content => "Here is the graph.", ... }
+ ```
+
+ ## Proxying Through prepend.me
+
+ You can proxy API calls through [prepend.me](https://prepend.me/).
+
+ ```rb
+ chat = AI::Chat.new
+ chat.proxy = true
+ chat.user("Tell me a story")
+ chat.generate!
+ puts chat.last[:content]
+ # => "Once upon a time..."
+ ```
+
+ When proxy is enabled, **you must use the API key provided by prepend.me** in place of a real OpenAI API key. Refer to [the section on API keys](#api-key) for options on how to set your key.
+
  ## Building Conversations Without API Calls

  You can manually add assistant messages without making API calls, which is useful when reconstructing a past conversation:
@@ -614,6 +645,93 @@ u.generate!

  Unless you've stored the previous messages somewhere yourself, this technique won't bring them back. But OpenAI remembers what they were, so that you can at least continue the conversation. (If you're using a reasoning model, this technique also preserves all of the model's reasoning.)

+ ### Automatic Conversation Management
+
+ Starting with your first `generate!` call, the gem automatically creates and manages a conversation with OpenAI. This conversation is stored server-side and tracks all messages, tool calls, reasoning, and other items.
+
+ ```ruby
+ chat = AI::Chat.new
+ chat.user("Hello")
+ chat.generate!
+
+ # Conversation ID is automatically set
+ puts chat.conversation_id # => "conv_abc123..."
+
+ # Continue the conversation - context is automatically maintained
+ chat.user("What did I just say?")
+ chat.generate! # Uses the same conversation automatically
+ ```
+
+ You can also load an existing conversation from your database:
+
+ ```ruby
+ # Load stored conversation_id from your database
+ chat = AI::Chat.new
+ chat.conversation_id = @thread.conversation_id # From your database
+
+ chat.user("Continue our discussion")
+ chat.generate! # Uses the loaded conversation
+ ```
+
+ **Note on forking:** If you want to "fork" a conversation (create a branch), you can still use `previous_response_id`. If both `conversation_id` and `previous_response_id` are set, the gem will use `previous_response_id` and warn you.
+
+ ## Inspecting Conversation Details
+
+ The gem provides two methods to inspect what happened during a conversation:
+
+ ### `items` - Programmatic Access
+
+ Returns the raw conversation items for programmatic use (displaying in views, filtering, etc.):
+
+ ```ruby
+ chat = AI::Chat.new
+ chat.web_search = true
+ chat.user("Search for Ruby tutorials")
+ chat.generate!
+
+ # Get all conversation items (chronological order by default)
+ page = chat.items
+
+ # Access item data
+ page.data.each do |item|
+   case item.type
+   when :message
+     puts "#{item.role}: #{item.content.first.text}"
+   when :web_search_call
+     puts "Web search: #{item.action.query}"
+     puts "Results: #{item.results.length}"
+   when :reasoning
+     puts "Reasoning: #{item.summary.first.text}"
+   end
+ end
+
+ # For long conversations, you can request reverse chronological order
+ # (useful for pagination to get most recent items first)
+ recent_items = chat.items(order: :desc)
+ ```
+
+ ### `verbose` - Terminal Output
+
+ Pretty-prints the entire conversation with all details for debugging and learning:
+
+ ```ruby
+ chat.verbose
+
+ # Output:
+ # ┌────────────────────────────────────────────────────────────────────────────┐
+ # │ Conversation: conv_6903c1eea6cc819695af3a1b1ebf9b390c3db5e8ec021c9a        │
+ # │ Items: 3                                                                   │
+ # └────────────────────────────────────────────────────────────────────────────┘
+ #
+ # [detailed colorized output of all items including web searches,
+ # reasoning, tool calls, messages, etc.]
+ ```
+
+ This is useful for:
+ - **Learning** how the model uses tools (web search, code interpreter, etc.)
+ - **Debugging** why the model made certain decisions
+ - **Understanding** the full context beyond just the final response
+
  ## Setting messages directly

  You can use `.messages=()` to assign an `Array` of `Hashes`. Each `Hash` must have keys `:role` and `:content`, and optionally `:image` or `:images`:
@@ -658,69 +776,6 @@ q.messages = [
  ]
  ```

- ## Assigning `ActiveRecord::Relation`s
-
- If your chat history is contained in an `ActiveRecord::Relation`, you can assign it directly:
-
- ```ruby
- # Load from ActiveRecord
- @thread = Thread.find(42)
-
- r = AI::Chat.new
- r.messages = @thread.posts.order(:created_at)
- r.user("What should we discuss next?")
- r.generate! # Creates a new post record, too
- ```
-
- ### Requirements
-
- In order for the above to "magically" work, there are a few requirements. Your ActiveRecord model must have:
-
- - `.role` method that returns "system", "user", or "assistant"
- - `.content` method that returns the message text
- - `.image` method (optional) for single images - can return URLs, file paths, or Active Storage attachments
- - `.images` method (optional) for multiple images
-
- ### Custom Column Names
-
- If your columns have different names:
-
- ```ruby
- s = AI::Chat.new
- s.configure_message_attributes(
-   role: :message_type, # Your column for role
-   content: :message_body, # Your column for content
-   image: :attachment # Your column/association for images
- )
- s.messages = @conversation.messages
- ```
-
- ### Saving Responses with Metadata
-
- To preserve response metadata, add an `openai_response` column to your messages table:
-
- ```ruby
- # In your migration
- add_column :messages, :openai_response, :text
-
- # In your model
- class Message < ApplicationRecord
-   serialize :openai_response, AI::Chat::Response
- end
-
- # Usage
- @thread = Thread.find(42)
-
- t = AI::Chat.new
- t.posts = @thread.messages
- t.user("Hello!")
- t.generate!
-
- # The saved message will include token usage, model info, etc.
- last_message = @thread.messages.last
- last_message.openai_response.usage # => {:prompt_tokens=>10, ...}
- ```
-
  ## Other Features Being Considered

  - **Session management**: Save and restore conversations by ID
@@ -740,3 +795,15 @@ While this gem includes specs, they use mocked API responses. To test with real
  3. Run the examples: `bundle exec ruby examples/all.rb`

  This test program runs through all the major features of the gem, making real API calls to OpenAI.
+
+ ## Contributing
+
+ When contributing to this project:
+
+ 1. **Code Style**: This project uses StandardRB for linting. Run `bundle exec standardrb --fix` before committing to automatically fix style issues.
+
+ 2. **Testing**: Ensure all specs pass with `bundle exec rspec`.
+
+ 3. **Examples**: If adding a feature, consider adding an example in the `examples/` directory.
+
+ 4. **Documentation**: Update the README if your changes affect the public API.
data/ai-chat.gemspec CHANGED
@@ -2,7 +2,7 @@

  Gem::Specification.new do |spec|
  spec.name = "ai-chat"
- spec.version = "0.2.3"
+ spec.version = "0.3.0"
  spec.authors = ["Raghu Betina"]
  spec.email = ["raghu@firstdraft.com"]
  spec.homepage = "https://github.com/firstdraft/ai-chat"
@@ -12,19 +12,22 @@ Gem::Specification.new do |spec|
  spec.metadata = {
  "bug_tracker_uri" => "https://github.com/firstdraft/ai-chat/issues",
  "changelog_uri" => "https://github.com/firstdraft/ai-chat/blob/main/CHANGELOG.md",
- "homepage_uri" => "https://github.com/firstdraft/ai-chat",
+ "homepage_uri" => "https://rubygems.org/gems/ai-chat",
  "label" => "AI Chat",
  "rubygems_mfa_required" => "true",
  "source_code_uri" => "https://github.com/firstdraft/ai-chat"
  }

  spec.required_ruby_version = "~> 3.2"
- spec.add_runtime_dependency "openai", "~> 0.16"
+ spec.add_runtime_dependency "openai", "~> 0.34"
  spec.add_runtime_dependency "marcel", "~> 1.0"
- spec.add_runtime_dependency "base64", "> 0.1.1"
+ spec.add_runtime_dependency "base64", "~> 0.1", "> 0.1.1"
  spec.add_runtime_dependency "json", "~> 2.0"
+ spec.add_runtime_dependency "ostruct", "~> 0.2"
+ spec.add_runtime_dependency "tty-spinner", "~> 0.9.3"
+ spec.add_runtime_dependency "amazing_print", "~> 1.8"

- spec.add_development_dependency "dotenv"
+ spec.add_development_dependency "dotenv", ">= 1.0.0"
  spec.add_development_dependency "refinements", "~> 11.1"

  spec.extra_rdoc_files = Dir["README*", "LICENSE*"]
@@ -1,5 +1,5 @@
  require "amazing_print"
-
+ # :reek:IrresponsibleModule
  module AmazingPrint
  module AI
  def self.included(base)
@@ -27,6 +27,10 @@ module AmazingPrint
  end
  end

+ # :reek:DuplicateMethodCall
+ # :reek:FeatureEnvy
+ # :reek:NilCheck
+ # :reek:TooManyStatements
  def format_ai_chat(chat)
  vars = []

@@ -53,6 +57,8 @@ module AmazingPrint
  format_object(chat, vars)
  end

+ # :reek:TooManyStatements
+ # :reek:DuplicateMethodCall
  def format_object(object, vars)
  data = vars.map do |(name, value)|
  name = colorize(name, :variable) unless @options[:plain]