ruby-openai 6.3.1 → 6.4.0

This diff shows the changes between publicly released versions of the package, as they appear in their respective public registries, and is provided for informational purposes only.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 958a72f1590182a35f447bb66e286adde77217e58b5d81e9621cd00e7f62cf58
- data.tar.gz: 46729b5fc28187ba376af8c24d02e6b96f8dd4a18dca5cd19ad6fcea09e9d6a1
+ metadata.gz: fb6b0c3d7f87d9db80a130891489e538f5abe737edd0faac2d95305b6b4f2071
+ data.tar.gz: 79ca8323159be1b6b4aadac2712535a8ac09ca658d3b9b9312b753f625cf9546
  SHA512:
- metadata.gz: f50b6b5044c70a5e549600c5563ec57daf1aefd4301cf04ca5bb992942d5ff3a361ad26e1defefa9cdc490b4eeae7ced18c4b304a6b44245516501286005c159
- data.tar.gz: 26ce015243210f29239e4f993baabdee6e5a0265708ee407d5f7d3bef17db50b398e35c7ab225724aaec8519abc35f80e2d928d44e821bdfe27f6ff9db2bcc79
+ metadata.gz: d3370f207d43c390ee0c0fd8d6fa7a3860877f0f6fefe27c0dfd17ae5d629162e37b5f02e627eb60737a47eef16f98a91f91fb10815256cb883b89e395f8a344
+ data.tar.gz: 22c23aeda4f01a56d285a76c86dc36e6175ce95e027b69dbf69cc306953a612dd994b55f7e4966f9bfc751c5c6afa677acbc0339cc54a2cd4f1482a2b0594ab2
data/CHANGELOG.md CHANGED
@@ -5,6 +5,21 @@ All notable changes to this project will be documented in this file.
  The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
  and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

+ ## [6.4.0] - 2024-03-27
+
+ ### Added
+
+ - Add DALL·E 3 to specs and README - thanks to [@Gary-H9](https://github.com/Gary-H9)
+ - Add Whisper transcription language selection parameter to README - thanks to [@nfedyashev](https://github.com/nfedyashev)
+ - Add bundle exec rake lint and bundle exec rake test to make development easier - thanks to [@ignacio-chiazzo](https://github.com/ignacio-chiazzo)
+ - Add link to [https://github.com/sponsors/alexrudall](https://github.com/sponsors/alexrudall) when users run `bundle fund`
+
+ ### Fixed
+
+ - Update README and spec to use tool calls instead of functions - thanks to [@mpallenjr](https://github.com/mpallenjr)
+ - Remove nonexistent Thread#list method - thanks again! to [@ignacio-chiazzo](https://github.com/ignacio-chiazzo)
+ - Update finetunes docs in README to use chat instead of completions endpoint - thanks to [@blefev](https://github.com/blefev)
+
  ## [6.3.1] - 2023-12-04

  ### Fixed
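The most consequential entry above is the switch from `functions` to tool calls in the chat examples. As a reading aid only, here is the new 6.4.0 request shape condensed from the README hunk further down into a single snippet; the model name and access-token lookup are illustrative placeholders, not taken from this diff:

```ruby
require "openai"

# Sketch of the tool-calls shape shown in the README diff below (placeholders noted above).
client = OpenAI::Client.new(access_token: ENV.fetch("OPENAI_ACCESS_TOKEN"))

response = client.chat(
  parameters: {
    model: "gpt-4", # illustrative
    messages: [{ role: "user", content: "What is the weather like in San Francisco?" }],
    tools: [ # previously `functions:`
      {
        type: "function",
        function: {
          name: "get_current_weather",
          description: "Get the current weather in a given location",
          parameters: {
            type: :object,
            properties: {
              location: { type: :string, description: "The city and state, e.g. San Francisco, CA" },
              unit: { type: "string", enum: %w[celsius fahrenheit] }
            },
            required: ["location"]
          }
        }
      }
    ]
  }
)

# The assistant now answers with `tool_calls` rather than the old `function_call`.
message = response.dig("choices", 0, "message")
```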
data/Gemfile.lock CHANGED
@@ -1,7 +1,7 @@
  PATH
  remote: .
  specs:
- ruby-openai (6.3.1)
+ ruby-openai (6.4.0)
  event_stream_parser (>= 0.3.0, < 2.0.0)
  faraday (>= 1)
  faraday-multipart (>= 1)
@@ -12,14 +12,14 @@ GEM
  addressable (2.8.5)
  public_suffix (>= 2.0.2, < 6.0)
  ast (2.4.2)
- base64 (0.1.1)
+ base64 (0.2.0)
  byebug (11.1.3)
  crack (0.4.5)
  rexml
  diff-lcs (1.5.0)
  dotenv (2.8.1)
  event_stream_parser (1.0.0)
- faraday (2.7.11)
+ faraday (2.7.12)
  base64
  faraday-net_http (>= 2.0, < 3.1)
  ruby2_keywords (>= 0.0.4)
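These lockfile changes only bump the gem's own version plus the `base64` and `faraday` transitive dependencies. For an application picking up the release, a Gemfile entry along these lines is typical (the `~> 6.4` constraint is illustrative, not part of the diff):

```ruby
# Gemfile — illustrative constraint; then run `bundle update ruby-openai`
gem "ruby-openai", "~> 6.4"
```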
data/README.md CHANGED
@@ -10,6 +10,55 @@ Stream text with GPT-4, transcribe and translate audio with Whisper, or create i

  [🚢 Hire me](https://peaceterms.com?utm_source=ruby-openai&utm_medium=readme&utm_id=26072023) | [🎮 Ruby AI Builders Discord](https://discord.gg/k4Uc224xVD) | [🐦 Twitter](https://twitter.com/alexrudall) | [🧠 Anthropic Gem](https://github.com/alexrudall/anthropic) | [🚂 Midjourney Gem](https://github.com/alexrudall/midjourney)

+ # Table of Contents
+
+ - [Ruby OpenAI](#ruby-openai)
+ - [Table of Contents](#table-of-contents)
+ - [Installation](#installation)
+ - [Bundler](#bundler)
+ - [Gem install](#gem-install)
+ - [Usage](#usage)
+ - [Quickstart](#quickstart)
+ - [With Config](#with-config)
+ - [Custom timeout or base URI](#custom-timeout-or-base-uri)
+ - [Extra Headers per Client](#extra-headers-per-client)
+ - [Verbose Logging](#verbose-logging)
+ - [Azure](#azure)
+ - [Counting Tokens](#counting-tokens)
+ - [Models](#models)
+ - [Examples](#examples)
+ - [Chat](#chat)
+ - [Streaming Chat](#streaming-chat)
+ - [Vision](#vision)
+ - [JSON Mode](#json-mode)
+ - [Functions](#functions)
+ - [Edits](#edits)
+ - [Embeddings](#embeddings)
+ - [Files](#files)
+ - [Finetunes](#finetunes)
+ - [Assistants](#assistants)
+ - [Threads and Messages](#threads-and-messages)
+ - [Runs](#runs)
+ - [Runs involving function tools](#runs-involving-function-tools)
+ - [Image Generation](#image-generation)
+ - [DALL·E 2](#dalle-2)
+ - [DALL·E 3](#dalle-3)
+ - [Image Edit](#image-edit)
+ - [Image Variations](#image-variations)
+ - [Moderations](#moderations)
+ - [Whisper](#whisper)
+ - [Translate](#translate)
+ - [Transcribe](#transcribe)
+ - [Speech](#speech)
+ - [Errors](#errors)
+ - [Development](#development)
+ - [Release](#release)
+ - [Contributing](#contributing)
+ - [License](#license)
+ - [Code of Conduct](#code-of-conduct)
+
+ ## Installation
+
  ### Bundler

  Add this line to your application's Gemfile:
@@ -108,6 +157,15 @@ OpenAI.configure do |config|
  end
  ```

+ #### Extra Headers per Client
+
+ You can dynamically pass headers per client object, which will be merged with any headers set globally with OpenAI.configure:
+
+ ```ruby
+ client = OpenAI::Client.new(access_token: "access_token_goes_here")
+ client.add_headers("X-Proxy-TTL" => "43200")
+ ```
+
  #### Verbose Logging

  You can pass [Faraday middleware](https://lostisland.github.io/faraday/#/middleware/index) to the client in a block, eg. to enable verbose logging with Ruby's [Logger](https://ruby-doc.org/3.2.2/stdlibs/logger/Logger.html):
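The hunk ends before the README's logging example. For context, such a block typically looks like the following minimal sketch, assuming Faraday's standard `:logger` response middleware (this snippet is not part of the diff):

```ruby
require "logger"

# Minimal sketch of the Faraday-middleware block described above; not taken from this diff.
client = OpenAI::Client.new(access_token: ENV.fetch("OPENAI_ACCESS_TOKEN")) do |faraday|
  faraday.response :logger, Logger.new($stdout), bodies: true # log request/response bodies
end
```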
@@ -297,36 +355,39 @@ response =
  "content": "What is the weather like in San Francisco?",
  },
  ],
- functions: [
+ tools: [
  {
- name: "get_current_weather",
- description: "Get the current weather in a given location",
- parameters: {
- type: :object,
- properties: {
- location: {
- type: :string,
- description: "The city and state, e.g. San Francisco, CA",
- },
- unit: {
- type: "string",
- enum: %w[celsius fahrenheit],
+ type: "function",
+ function: {
+ name: "get_current_weather",
+ description: "Get the current weather in a given location",
+ parameters: {
+ type: :object,
+ properties: {
+ location: {
+ type: :string,
+ description: "The city and state, e.g. San Francisco, CA",
+ },
+ unit: {
+ type: "string",
+ enum: %w[celsius fahrenheit],
+ },
  },
+ required: ["location"],
  },
- required: ["location"],
  },
- },
+ }
  ],
  },
  )

  message = response.dig("choices", 0, "message")

- if message["role"] == "assistant" && message["function_call"]
- function_name = message.dig("function_call", "name")
+ if message["role"] == "assistant" && message["tool_calls"]
+ function_name = message.dig("tool_calls", "function", "name")
  args =
  JSON.parse(
- message.dig("function_call", "arguments"),
+ message.dig("tool_calls", "function", "arguments"),
  { symbolize_names: true },
  )

@@ -423,16 +484,16 @@ response = client.finetunes.retrieve(id: fine_tune_id)
  fine_tuned_model = response["fine_tuned_model"]
  ```

- This fine-tuned model name can then be used in completions:
+ This fine-tuned model name can then be used in chat completions:

  ```ruby
- response = client.completions(
+ response = client.chat(
  parameters: {
  model: fine_tuned_model,
- prompt: "I love Mondays!"
+ messages: [{ role: "user", content: "I love Mondays!"}]
  }
  )
- response.dig("choices", 0, "text")
+ response.dig("choices", 0, "message", "content")
  ```

  You can also capture the events for a job:
@@ -441,10 +502,221 @@ You can also capture the events for a job:
  client.finetunes.list_events(id: fine_tune_id)
  ```

+ ### Assistants
+
+ Assistants can call models to interact with threads and use tools to perform tasks (see [Assistant Overview](https://platform.openai.com/docs/assistants/overview)).
+
+ To create a new assistant (see [API documentation](https://platform.openai.com/docs/api-reference/assistants/createAssistant)):
+
+ ```ruby
+ response = client.assistants.create(
+ parameters: {
+ model: "gpt-3.5-turbo-1106", # Retrieve via client.models.list. Assistants need 'gpt-3.5-turbo-1106' or later.
+ name: "OpenAI-Ruby test assistant",
+ description: nil,
+ instructions: "You are a helpful assistant for coding a OpenAI API client using the OpenAI-Ruby gem.",
+ tools: [
+ { type: 'retrieval' }, # Allow access to files attached using file_ids
+ { type: 'code_interpreter' }, # Allow access to Python code interpreter
+ ],
+ "file_ids": ["file-123"], # See Files section above for how to upload files
+ "metadata": { my_internal_version_id: '1.0.0' }
+ })
+ assistant_id = response["id"]
+ ```
+
+ Given an `assistant_id` you can `retrieve` the current field values:
+
+ ```ruby
+ client.assistants.retrieve(id: assistant_id)
+ ```
+
+ You can get a `list` of all assistants currently available under the organization:
+
+ ```ruby
+ client.assistants.list
+ ```
+
+ You can modify an existing assistant using the assistant's id (see [API documentation](https://platform.openai.com/docs/api-reference/assistants/modifyAssistant)):
+
+ ```ruby
+ response = client.assistants.modify(
+ id: assistant_id,
+ parameters: {
+ name: "Modified Test Assistant for OpenAI-Ruby",
+ metadata: { my_internal_version_id: '1.0.1' }
+ })
+ ```
+
+ You can delete assistants:
+
+ ```
+ client.assistants.delete(id: assistant_id)
+ ```
+
+ ### Threads and Messages
+
+ Once you have created an assistant as described above, you need to prepare a `Thread` of `Messages` for the assistant to work on (see [introduction on Assistants](https://platform.openai.com/docs/assistants/how-it-works)). For example, as an initial setup you could do:
+
+ ```ruby
+ # Create thread
+ response = client.threads.create # Note: Once you create a thread, there is no way to list it
+ # or recover it currently (as of 2023-12-10). So hold onto the `id`
+ thread_id = response["id"]
+
+ # Add initial message from user (see https://platform.openai.com/docs/api-reference/messages/createMessage)
+ message_id = client.messages.create(
+ thread_id: thread_id,
+ parameters: {
+ role: "user", # Required for manually created messages
+ content: "Can you help me write an API library to interact with the OpenAI API please?"
+ })["id"]
+
+ # Retrieve individual message
+ message = client.messages.retrieve(thread_id: thread_id, id: message_id)
+
+ # Review all messages on the thread
+ messages = client.messages.list(thread_id: thread_id)
+ ```
+
+ To clean up after a thread is no longer needed:
+
+ ```ruby
+ # To delete the thread (and all associated messages):
+ client.threads.delete(id: thread_id)
+
+ client.messages.retrieve(thread_id: thread_id, id: message_id) # -> Fails after thread is deleted
+ ```
+
+ ### Runs
+
+ To submit a thread to be evaluated with the model of an assistant, create a `Run` as follows (Note: This is one place where OpenAI will take your money):
+
+ ```ruby
+ # Create run (will use instruction/model/tools from Assistant's definition)
+ response = client.runs.create(thread_id: thread_id,
+ parameters: {
+ assistant_id: assistant_id
+ })
+ run_id = response['id']
+
+ # Retrieve/poll Run to observe status
+ response = client.runs.retrieve(id: run_id, thread_id: thread_id)
+ status = response['status']
+ ```
+
+ The `status` response can include the following strings `queued`, `in_progress`, `requires_action`, `cancelling`, `cancelled`, `failed`, `completed`, or `expired` which you can handle as follows:
+
+ ```ruby
+ while true do
+
+ response = client.runs.retrieve(id: run_id, thread_id: thread_id)
+ status = response['status']
+
+ case status
+ when 'queued', 'in_progress', 'cancelling'
+ puts 'Sleeping'
+ sleep 1 # Wait one second and poll again
+ when 'completed'
+ break # Exit loop and report result to user
+ when 'requires_action'
+ # Handle tool calls (see below)
+ when 'cancelled', 'failed', 'expired'
+ puts response['last_error'].inspect
+ break # or `exit`
+ else
+ puts "Unknown status response: #{status}"
+ end
+ end
+ ```
+
+ If the `status` response indicates that the `run` is `completed`, the associated `thread` will have one or more new `messages` attached:
+
+ ```ruby
+ # Either retrieve all messages in bulk again, or...
+ messages = client.messages.list(thread_id: thread_id) # Note: as of 2023-12-11 adding limit or order options isn't working, yet
+
+ # Alternatively retrieve the `run steps` for the run which link to the messages:
+ run_steps = client.run_steps.list(thread_id: thread_id, run_id: run_id)
+ new_message_ids = run_steps['data'].filter_map { |step|
+ if step['type'] == 'message_creation'
+ step.dig('step_details', "message_creation", "message_id")
+ end # Ignore tool calls, because they don't create new messages.
+ }
+
+ # Retrieve the individual messages
+ new_messages = new_message_ids.map { |msg_id|
+ client.messages.retrieve(id: msg_id, thread_id: thread_id)
+ }
+
+ # Find the actual response text in the content array of the messages
+ new_messages.each { |msg|
+ msg['content'].each { |content_item|
+ case content_item['type']
+ when 'text'
+ puts content_item.dig('text', 'value')
+ # Also handle annotations
+ when 'image_file'
+ # Use File endpoint to retrieve file contents via id
+ id = content_item.dig('image_file', 'file_id')
+ end
+ }
+ }
+ ```
+
+ At any time you can list all runs which have been performed on a particular thread or are currently running (in descending/newest first order):
+
+ ```ruby
+ client.runs.list(thread_id: thread_id)
+ ```
+
+ #### Runs involving function tools
+
+ In case you are allowing the assistant to access `function` tools (they are defined in the same way as functions during chat completion), you might get a status code of `requires_action` when the assistant wants you to evaluate one or more function tools:
+
+ ```ruby
+ def get_current_weather(location:, unit: "celsius")
+ # Your function code goes here
+ if location =~ /San Francisco/i
+ return unit == "celsius" ? "The weather is nice 🌞 at 27°C" : "The weather is nice 🌞 at 80°F"
+ else
+ return unit == "celsius" ? "The weather is icy 🥶 at -5°C" : "The weather is icy 🥶 at 23°F"
+ end
+ end
+
+ if status == 'requires_action'
+
+ tools_to_call = response.dig('required_action', 'submit_tool_outputs', 'tool_calls')
+
+ my_tool_outputs = tools_to_call.map { |tool|
+ # Call the functions based on the tool's name
+ function_name = tool.dig('function', 'name')
+ arguments = JSON.parse(
+ tool.dig("function", "arguments"),
+ { symbolize_names: true },
+ )
+
+ tool_output = case function_name
+ when "get_current_weather"
+ get_current_weather(**arguments)
+ end
+
+ { tool_call_id: tool['id'], output: tool_output }
+ }
+
+ client.runs.submit_tool_outputs(thread_id: thread_id, run_id: run_id, parameters: { tool_outputs: my_tool_outputs })
+ end
+ ```
+
+ Note that you have 10 minutes to submit your tool output before the run expires.
+
  ### Image Generation

- Generate an image using DALL·E! The size of any generated images must be one of `256x256`, `512x512` or `1024x1024` -
- if not specified the image will default to `1024x1024`.
+ Generate images using DALL·E 2 or DALL·E 3!
+
+ #### DALL·E 2
+
+ For DALL·E 2 the size of any generated images must be one of `256x256`, `512x512` or `1024x1024` - if not specified the image will default to `1024x1024`.

  ```ruby
  response = client.images.generate(parameters: { prompt: "A baby sea otter cooking pasta wearing a hat of some sort", size: "256x256" })
@@ -454,6 +726,18 @@ puts response.dig("data", 0, "url")

  ![Ruby](https://i.ibb.co/6y4HJFx/img-d-Tx-Rf-RHj-SO5-Gho-Cbd8o-LJvw3.png)

+ #### DALL·E 3
+
+ For DALL·E 3 the size of any generated images must be one of `1024x1024`, `1024x1792` or `1792x1024`. Additionally the quality of the image can be specified to either `standard` or `hd`.
+
+ ```ruby
+ response = client.images.generate(parameters: { prompt: "A springer spaniel cooking pasta wearing a hat of some sort", size: "1024x1792", quality: "standard" })
+ puts response.dig("data", 0, "url")
+ # => "https://oaidalleapiprodscus.blob.core.windows.net/private/org-Rf437IxKhh..."
+ ```
+
+ ![Ruby](https://i.ibb.co/z2tCKv9/img-Goio0l-S0i81-NUNa-BIx-Eh-CT6-L.png)
+
  ### Image Edit

  Fill in the transparent part of an image, or upload a mask with transparent sections to indicate the parts of an image that can be changed according to your prompt...
@@ -511,11 +795,14 @@ puts response["text"]

  The transcriptions API takes as input the audio file you want to transcribe and returns the text in the desired output file format.

+ You can pass the language of the audio file to improve transcription quality. Supported languages are listed [here](https://github.com/openai/whisper#available-models-and-languages). You need to provide the language as an ISO-639-1 code, eg. "en" for English or "ne" for Nepali. You can look up the codes [here](https://en.wikipedia.org/wiki/List_of_ISO_639_language_codes).
+
  ```ruby
  response = client.audio.transcribe(
  parameters: {
  model: "whisper-1",
  file: File.open("path_to_file", "rb"),
+ language: "en" # Optional.
  })
  puts response["text"]
  # => "Transcription of the text"
@@ -555,9 +842,10 @@ After checking out the repo, run `bin/setup` to install dependencies. You can ru

  To install this gem onto your local machine, run `bundle exec rake install`.

- ### Warning
+ To run all tests, execute the command `bundle exec rake`, which will also run the linter (Rubocop). This repository uses [VCR](https://github.com/vcr/vcr) to log API requests.

- If you have an `OPENAI_ACCESS_TOKEN` in your `ENV`, running the specs will use this to run the specs against the actual API, which will be slow and cost you money - 2 cents or more! Remove it from your environment with `unset` or similar if you just want to run the specs against the stored VCR responses.
+ > [!WARNING]
+ > If you have an `OPENAI_ACCESS_TOKEN` in your `ENV`, running the specs will use this to run the specs against the actual API, which will be slow and cost you money - 2 cents or more! Remove it from your environment with `unset` or similar if you just want to run the specs against the stored VCR responses.

  ## Release

data/Rakefile CHANGED
@@ -1,6 +1,19 @@
  require "bundler/gem_tasks"
  require "rspec/core/rake_task"
+ require "rubocop/rake_task"

  RSpec::Core::RakeTask.new(:spec)

- task default: :spec
+ task :default do
+ Rake::Task["test"].invoke
+ Rake::Task["lint"].invoke
+ end
+
+ task :test do
+ Rake::Task["spec"].invoke
+ end
+
+ task :lint do
+ RuboCop::RakeTask.new(:rubocop)
+ Rake::Task["rubocop"].invoke
+ end
data/lib/openai/threads.rb CHANGED
@@ -4,10 +4,6 @@ module OpenAI
  @client = client.beta(assistants: "v1")
  end

- def list
- @client.get(path: "/threads")
- end
-
  def retrieve(id:)
  @client.get(path: "/threads/#{id}")
  end
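This removal tracks the API itself: there is no endpoint for listing threads, which is why the README section above tells you to hold onto the `id` returned at creation. A minimal sketch of the resulting workflow (how you persist the id is up to the caller):

```ruby
# Sketch only: store the thread id yourself, since client.threads.list no longer exists.
thread_id = client.threads.create["id"]
# ... persist thread_id somewhere durable (database, cache, etc.) ...
client.threads.retrieve(id: thread_id) # later lookups go through the stored id
client.threads.delete(id: thread_id)   # clean-up also needs the stored id
```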
data/lib/openai/version.rb CHANGED
@@ -1,3 +1,3 @@
  module OpenAI
- VERSION = "6.3.1".freeze
+ VERSION = "6.4.0".freeze
  end
data/ruby-openai.gemspec CHANGED
@@ -15,6 +15,7 @@ Gem::Specification.new do |spec|
  spec.metadata["source_code_uri"] = "https://github.com/alexrudall/ruby-openai"
  spec.metadata["changelog_uri"] = "https://github.com/alexrudall/ruby-openai/blob/main/CHANGELOG.md"
  spec.metadata["rubygems_mfa_required"] = "true"
+ spec.metadata["funding_uri"] = "https://github.com/sponsors/alexrudall"

  # Specify which files should be added to the gem when it is released.
  # The `git ls-files -z` loads the files in the RubyGem that have been added into git.
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: ruby-openai
  version: !ruby/object:Gem::Version
- version: 6.3.1
+ version: 6.4.0
  platform: ruby
  authors:
  - Alex
- autorequire:
+ autorequire:
  bindir: exe
  cert_chain: []
- date: 2023-12-04 00:00:00.000000000 Z
+ date: 2024-03-27 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: event_stream_parser
@@ -58,7 +58,7 @@ dependencies:
  - - ">="
  - !ruby/object:Gem::Version
  version: '1'
- description:
+ description:
  email:
  - alexrudall@users.noreply.github.com
  executables: []
@@ -113,7 +113,8 @@ metadata:
  source_code_uri: https://github.com/alexrudall/ruby-openai
  changelog_uri: https://github.com/alexrudall/ruby-openai/blob/main/CHANGELOG.md
  rubygems_mfa_required: 'true'
- post_install_message:
+ funding_uri: https://github.com/sponsors/alexrudall
+ post_install_message:
  rdoc_options: []
  require_paths:
  - lib
@@ -128,8 +129,8 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  - !ruby/object:Gem::Version
  version: '0'
  requirements: []
- rubygems_version: 3.4.10
- signing_key:
+ rubygems_version: 3.4.22
+ signing_key:
  specification_version: 4
  summary: "OpenAI API + Ruby! \U0001F916\U0001FA75"
  test_files: []