ruby-openai 3.7.0 → 4.0.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 3a26687eb00056e37ab5f2eed5b7bf6e73b4a9479016a101a828ae014ed299aa
- data.tar.gz: c729b1a8b2df16db50af716664108b1fb2418ef22f4159b7993148474cd93ad2
+ metadata.gz: ed826c08b658bc553da53acfc53a57c5a2f09320787dbe00fa6c901f1941e7a8
+ data.tar.gz: 0b1cb51e0cd24f1e50877045881485cdb1c0ecf772c98464ff159b1001abd81d
  SHA512:
- metadata.gz: ddd2bf5992ade00580d0c2cd2b668e70eb999c39a487da4020e46ad8debe96a08b5e0e5560baa612b4e89d0f91765532394a1d69f3cd7214c6bb938edb10118f
- data.tar.gz: aba44b222706820b3233fdcb623da8ba766c431485a723c870092c09608b4e1a7ab3466834d8683690b2873318bd807bcc3c57fe848831ed3c5fb8fc473d1a1a
+ metadata.gz: d8575821c9b6921840c3e8b34916992ef0d947f1cce1308e19b2e82d0d3f968870649eb9e456d9c38cfb95ae667576330176bae45cb173dab116735450cfc0b2
+ data.tar.gz: 8846dceef8669867c09168b19e0aedcb77cd76cd403a8ddf8a16e259132a17116c6c318adcbd4b04d0316c123166ef5e7da7e02cbbbe30e4e2ad3b13983fd67b
data/.rubocop.yml CHANGED
@@ -12,6 +12,9 @@ Layout/LineLength:
    Exclude:
      - "**/*.gemspec"

+ Metrics/AbcSize:
+   Max: 20
+
  Metrics/BlockLength:
    Exclude:
      - "spec/**/*"
data/CHANGELOG.md CHANGED
@@ -5,6 +5,18 @@ All notable changes to this project will be documented in this file.
  The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
  and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

+ ## [4.0.0] - 2023-04-25
+
+ ### Added
+
+ - Add the ability to stream Chat responses from the API! Thanks to everyone who requested this and made suggestions.
+ - Added instructions for streaming to the README.
+
+ ### Changed
+
+ - Switch HTTP library from HTTParty to Faraday to allow streaming and future feature and performance improvements.
+ - [BREAKING] Endpoints now return JSON rather than HTTParty objects. You will need to update your code to handle this change, changing `JSON.parse(response.body)["key"]` and `response.parsed_response["key"]` to just `response["key"]`.
+
  ## [3.7.0] - 2023-03-25

  ### Added
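The [BREAKING] entry above boils down to a one-line migration for callers. A minimal before/after sketch, assuming an illustrative `raw_body` and key name (any real call such as `client.finetunes.retrieve` is handled the same way):

```ruby
require "json"

# Illustrative raw body; in 3.x this is what `response.body` contained.
raw_body = '{"id":"ft-123","fine_tuned_model":"ada:ft-2023"}'

# 3.x: endpoints returned HTTParty objects, so callers parsed the body themselves.
old_way = JSON.parse(raw_body)["fine_tuned_model"]

# 4.0: endpoints return already-parsed JSON (a plain Hash), so you index directly.
response = JSON.parse(raw_body) # stand-in for e.g. client.finetunes.retrieve(id: id)
new_way = response["fine_tuned_model"]

puts old_way # => "ada:ft-2023"
puts new_way # => "ada:ft-2023"
```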
data/Gemfile CHANGED
@@ -7,6 +7,6 @@ gem "byebug", "~> 11.1.3"
  gem "dotenv", "~> 2.8.1"
  gem "rake", "~> 13.0"
  gem "rspec", "~> 3.12"
- gem "rubocop", "~> 1.48.1"
+ gem "rubocop", "~> 1.50.2"
  gem "vcr", "~> 6.1.0"
  gem "webmock", "~> 3.18.1"
data/Gemfile.lock CHANGED
@@ -1,8 +1,9 @@
  PATH
    remote: .
    specs:
-     ruby-openai (3.7.0)
-       httparty (>= 0.18.1)
+     ruby-openai (4.0.0)
+       faraday (>= 1)
+       faraday-multipart (>= 1)

  GEM
    remote: https://rubygems.org/
@@ -15,20 +16,22 @@ GEM
    rexml
  diff-lcs (1.5.0)
  dotenv (2.8.1)
+ faraday (2.7.4)
+   faraday-net_http (>= 2.0, < 3.1)
+   ruby2_keywords (>= 0.0.4)
+ faraday-multipart (1.0.4)
+   multipart-post (~> 2)
+ faraday-net_http (3.0.2)
  hashdiff (1.0.1)
- httparty (0.21.0)
-   mini_mime (>= 1.0.0)
-   multi_xml (>= 0.5.2)
  json (2.6.3)
- mini_mime (1.1.2)
- multi_xml (0.6.0)
+ multipart-post (2.3.0)
  parallel (1.22.1)
- parser (3.2.1.1)
+ parser (3.2.2.0)
    ast (~> 2.4.1)
  public_suffix (5.0.1)
  rainbow (3.1.1)
  rake (13.0.6)
- regexp_parser (2.7.0)
+ regexp_parser (2.8.0)
  rexml (3.2.5)
  rspec (3.12.0)
    rspec-core (~> 3.12.0)
@@ -43,19 +46,20 @@ GEM
    diff-lcs (>= 1.2.0, < 2.0)
    rspec-support (~> 3.12.0)
  rspec-support (3.12.0)
- rubocop (1.48.1)
+ rubocop (1.50.2)
    json (~> 2.3)
    parallel (~> 1.10)
    parser (>= 3.2.0.0)
    rainbow (>= 2.2.2, < 4.0)
    regexp_parser (>= 1.8, < 3.0)
    rexml (>= 3.2.5, < 4.0)
-   rubocop-ast (>= 1.26.0, < 2.0)
+   rubocop-ast (>= 1.28.0, < 2.0)
    ruby-progressbar (~> 1.7)
    unicode-display_width (>= 2.4.0, < 3.0)
- rubocop-ast (1.27.0)
+ rubocop-ast (1.28.0)
    parser (>= 3.2.1.0)
  ruby-progressbar (1.13.0)
+ ruby2_keywords (0.0.5)
  unicode-display_width (2.4.2)
  vcr (6.1.0)
  webmock (3.18.1)
@@ -71,7 +75,7 @@ DEPENDENCIES
  dotenv (~> 2.8.1)
  rake (~> 13.0)
  rspec (~> 3.12)
- rubocop (~> 1.48.1)
+ rubocop (~> 1.50.2)
  ruby-openai!
  vcr (~> 6.1.0)
  webmock (~> 3.18.1)
data/README.md CHANGED
@@ -3,13 +3,12 @@
  [![Gem Version](https://badge.fury.io/rb/ruby-openai.svg)](https://badge.fury.io/rb/ruby-openai)
  [![GitHub license](https://img.shields.io/badge/license-MIT-blue.svg)](https://github.com/alexrudall/ruby-openai/blob/main/LICENSE.txt)
  [![CircleCI Build Status](https://circleci.com/gh/alexrudall/ruby-openai.svg?style=shield)](https://circleci.com/gh/alexrudall/ruby-openai)
- [![Maintainability](https://api.codeclimate.com/v1/badges/a99a88d28ad37a79dbf6/maintainability)](https://codeclimate.com/github/codeclimate/codeclimate/maintainability)

  Use the [OpenAI API](https://openai.com/blog/openai-api/) with Ruby! 🤖❤️

- Generate text with ChatGPT, transcribe and translate audio with Whisper, or create images with DALL·E...
+ Stream text with GPT-4, transcribe and translate audio with Whisper, or create images with DALL·E...

- Check out [Ruby AI Builders](https://discord.gg/k4Uc224xVD) on Discord!
+ [Ruby AI Builders Discord](https://discord.gg/k4Uc224xVD)

  ### Bundler

@@ -35,10 +34,6 @@ and require with:
  require "openai"
  ```

- ## Upgrading
-
- The `::Ruby::OpenAI` module has been removed and all classes have been moved under the top level `::OpenAI` module. To upgrade, change `require 'ruby/openai'` to `require 'openai'` and change all references to `Ruby::OpenAI` to `OpenAI`.
-
  ## Usage

  - Get your API key from [https://beta.openai.com/account/api-keys](https://beta.openai.com/account/api-keys)
@@ -58,8 +53,8 @@ For a more robust setup, you can configure the gem with your API keys, for examp
  ```ruby
  OpenAI.configure do |config|
-   config.access_token = ENV.fetch('OPENAI_ACCESS_TOKEN')
-   config.organization_id = ENV.fetch('OPENAI_ORGANIZATION_ID') # Optional.
+   config.access_token = ENV.fetch("OPENAI_ACCESS_TOKEN")
+   config.organization_id = ENV.fetch("OPENAI_ORGANIZATION_ID") # Optional.
  end
  ```

@@ -71,30 +66,30 @@ client = OpenAI::Client.new
  #### Custom timeout or base URI

- The default timeout for any OpenAI request is 120 seconds. You can change that passing the `request_timeout` when initializing the client. You can also change the base URI used for all requests, eg. to use observability tools like [Helicone](https://docs.helicone.ai/quickstart/integrate-in-one-line-of-code):
+ The default timeout for any request using this library is 120 seconds. You can change that by passing a number of seconds to the `request_timeout` when initializing the client. You can also change the base URI used for all requests, eg. to use observability tools like [Helicone](https://docs.helicone.ai/quickstart/integrate-in-one-line-of-code):

  ```ruby
- client = OpenAI::Client.new(
-   access_token: "access_token_goes_here",
-   uri_base: "https://oai.hconeai.com",
-   request_timeout: 240
- )
+ client = OpenAI::Client.new(
+   access_token: "access_token_goes_here",
+   uri_base: "https://oai.hconeai.com/",
+   request_timeout: 240
+ )
  ```

  or when configuring the gem:

  ```ruby
- OpenAI.configure do |config|
-   config.access_token = ENV.fetch("OPENAI_ACCESS_TOKEN")
-   config.organization_id = ENV.fetch("OPENAI_ORGANIZATION_ID") # Optional
-   config.uri_base = "https://oai.hconeai.com" # Optional
-   config.request_timeout = 240 # Optional
- end
+ OpenAI.configure do |config|
+   config.access_token = ENV.fetch("OPENAI_ACCESS_TOKEN")
+   config.organization_id = ENV.fetch("OPENAI_ORGANIZATION_ID") # Optional
+   config.uri_base = "https://oai.hconeai.com/" # Optional
+   config.request_timeout = 240 # Optional
+ end
  ```

  ### Models

- There are different models that can be used to generate text. For a full list and to retrieve information about a single models:
+ There are different models that can be used to generate text. For a full list and to retrieve information about a single model:

  ```ruby
  client.models.list
@@ -131,6 +126,23 @@ puts response.dig("choices", 0, "message", "content")
  # => "Hello! How may I assist you today?"
  ```

+ ### Streaming ChatGPT
+
+ You can stream from the API in realtime, which can be much faster and used to create a more engaging user experience. Pass a [Proc](https://ruby-doc.org/core-2.6/Proc.html) to the `stream` parameter to receive the stream of text chunks as they are generated. The Proc will be called once for each chunk received, parsed as a Hash. If OpenAI returns an error, `ruby-openai` will pass that to your Proc as a Hash.
+
+ ```ruby
+ client.chat(
+   parameters: {
+     model: "gpt-3.5-turbo", # Required.
+     messages: [{ role: "user", content: "Describe a character called Anna!" }], # Required.
+     temperature: 0.7,
+     stream: proc do |chunk, _bytesize|
+       print chunk.dig("choices", 0, "delta", "content")
+     end
+   })
+ # => "Anna is a young woman in her mid-twenties, with wavy chestnut hair that falls to her shoulders..."
+ ```
+
  ### Completions

  Hit the OpenAI API for a completion using other GPT-3 models:
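To make the chunk shape in the streaming hunk above concrete, here is a standalone sketch of what the `stream:` Proc receives; the chunk Hashes are illustrative, modeled on the `delta` path the README example digs into:

```ruby
# Simulated stream chunks (illustrative shapes, not captured API output).
chunks = [
  { "choices" => [{ "delta" => { "content" => "Anna " } }] },
  { "choices" => [{ "delta" => { "content" => "is kind." } }] },
  { "choices" => [{ "delta" => {} }] } # a final chunk may carry no content
]

buffer = +""
handler = proc do |chunk, _bytesize|
  # dig returns nil when the path is missing, so guard with || "".
  buffer << (chunk.dig("choices", 0, "delta", "content") || "")
end

chunks.each { |c| handler.call(c, 0) }
puts buffer # => "Anna is kind."
```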
@@ -189,9 +201,9 @@ and pass the path to `client.files.upload` to upload it to OpenAI, and then inte
  ```ruby
  client.files.upload(parameters: { file: "path/to/sentiment.jsonl", purpose: "fine-tune" })
  client.files.list
- client.files.retrieve(id: 123)
- client.files.content(id: 123)
- client.files.delete(id: 123)
+ client.files.retrieve(id: "file-123")
+ client.files.content(id: "file-123")
+ client.files.delete(id: "file-123")
  ```

  ### Fine-tunes
@@ -209,9 +221,9 @@ You can then use this file ID to create a fine-tune model:
  response = client.finetunes.create(
    parameters: {
      training_file: file_id,
-     model: "text-ada-001"
+     model: "ada"
    })
- fine_tune_id = JSON.parse(response.body)["id"]
+ fine_tune_id = response["id"]
  ```

  That will give you the fine-tune ID. If you made a mistake you can cancel the fine-tune model before it is processed:
@@ -225,7 +237,7 @@ You may need to wait a short time for processing to complete. Once processed, yo
  ```ruby
  client.finetunes.list
  response = client.finetunes.retrieve(id: fine_tune_id)
- fine_tuned_model = JSON.parse(response.body)["fine_tuned_model"]
+ fine_tuned_model = response["fine_tuned_model"]
  ```

  This fine-tuned model name can then be used in completions:
@@ -237,7 +249,7 @@ response = client.completions(
      prompt: "I love Mondays!"
    }
  )
- JSON.parse(response.body)["choices"].map { |c| c["text"] }
+ response.dig("choices", 0, "text")
  ```

  You can delete the fine-tuned model when you are done with it:
@@ -306,9 +318,9 @@ The translations API takes as input the audio file in any of the supported langu
  response = client.translate(
    parameters: {
      model: "whisper-1",
-     file: File.open('path_to_file', 'rb'),
+     file: File.open("path_to_file", "rb"),
    })
- puts response.parsed_response['text']
+ puts response["text"]
  # => "Translation of the text"
  ```

@@ -320,9 +332,9 @@ The transcriptions API takes as input the audio file you want to transcribe and
  response = client.transcribe(
    parameters: {
      model: "whisper-1",
-     file: File.open('path_to_file', 'rb'),
+     file: File.open("path_to_file", "rb"),
    })
- puts response.parsed_response['text']
+ puts response["text"]
  # => "Transcription of the text"
  ```

@@ -332,12 +344,16 @@ After checking out the repo, run `bin/setup` to install dependencies. You can ru
  To install this gem onto your local machine, run `bundle exec rake install`.

+ ### Warning
+
+ If you have an `OPENAI_ACCESS_TOKEN` in your `ENV`, running the specs will use it to run the specs against the actual API, which will be slow and cost you money - 2 cents or more! Remove it from your environment with `unset` or similar if you just want to run the specs against the stored VCR responses.
+
  ## Release

- First run the specs without VCR so they actually hit the API. This will cost about 2 cents. You'll need to add your `OPENAI_ACCESS_TOKEN=` in `.env`.
+ First run the specs without VCR so they actually hit the API. This will cost 2 cents or more. Set `OPENAI_ACCESS_TOKEN` in your environment or pass it in like this:

  ```
- NO_VCR=true bundle exec rspec
+ OPENAI_ACCESS_TOKEN=123abc bundle exec rspec
  ```

  Then update the version number in `version.rb`, update `CHANGELOG.md`, run `bundle install` to update Gemfile.lock, and then run `bundle exec rake release`, which will create a git tag for the version, push git commits and tags, and push the `.gem` file to [rubygems.org](https://rubygems.org).
data/lib/openai/client.rb CHANGED
@@ -1,5 +1,7 @@
  module OpenAI
    class Client
+     extend OpenAI::HTTP
+
      def initialize(access_token: nil, organization_id: nil, uri_base: nil, request_timeout: nil)
        OpenAI.configuration.access_token = access_token if access_token
        OpenAI.configuration.organization_id = organization_id if organization_id
@@ -50,55 +52,5 @@ module OpenAI
      def translate(parameters: {})
        OpenAI::Client.multipart_post(path: "/audio/translations", parameters: parameters)
      end
-
-     def self.get(path:)
-       HTTParty.get(
-         uri(path: path),
-         headers: headers,
-         timeout: request_timeout
-       )
-     end
-
-     def self.json_post(path:, parameters:)
-       HTTParty.post(
-         uri(path: path),
-         headers: headers,
-         body: parameters&.to_json,
-         timeout: request_timeout
-       )
-     end
-
-     def self.multipart_post(path:, parameters: nil)
-       HTTParty.post(
-         uri(path: path),
-         headers: headers.merge({ "Content-Type" => "multipart/form-data" }),
-         body: parameters,
-         timeout: request_timeout
-       )
-     end
-
-     def self.delete(path:)
-       HTTParty.delete(
-         uri(path: path),
-         headers: headers,
-         timeout: request_timeout
-       )
-     end
-
-     private_class_method def self.uri(path:)
-       OpenAI.configuration.uri_base + OpenAI.configuration.api_version + path
-     end
-
-     private_class_method def self.headers
-       {
-         "Content-Type" => "application/json",
-         "Authorization" => "Bearer #{OpenAI.configuration.access_token}",
-         "OpenAI-Organization" => OpenAI.configuration.organization_id
-       }
-     end
-
-     private_class_method def self.request_timeout
-       OpenAI.configuration.request_timeout
-     end
    end
  end
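The `extend OpenAI::HTTP` line added to `Client` above is what lets the removed class methods (`get`, `json_post`, and friends) move into a module while staying callable as `OpenAI::Client.get`: `extend` mixes a module's instance methods in at the class level. A generic sketch with illustrative names:

```ruby
# `extend` adds a module's instance methods as singleton (class-level) methods,
# whereas `include` would add them as instance methods.
module Transport
  def get(path:)
    "GET #{path}" # stand-in for a real HTTP call
  end
end

class ApiClient
  extend Transport
end

puts ApiClient.get(path: "/models") # => "GET /models"
```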
data/lib/openai/http.rb ADDED
@@ -0,0 +1,93 @@
+ module OpenAI
+   module HTTP
+     def get(path:)
+       to_json(conn.get(uri(path: path)) do |req|
+         req.headers = headers
+       end&.body)
+     end
+
+     def json_post(path:, parameters:)
+       to_json(conn.post(uri(path: path)) do |req|
+         if parameters[:stream].is_a?(Proc)
+           req.options.on_data = to_json_stream(user_proc: parameters[:stream])
+           parameters[:stream] = true # Necessary to tell OpenAI to stream.
+         end
+
+         req.headers = headers
+         req.body = parameters.to_json
+       end&.body)
+     end
+
+     def multipart_post(path:, parameters: nil)
+       to_json(conn(multipart: true).post(uri(path: path)) do |req|
+         req.headers = headers.merge({ "Content-Type" => "multipart/form-data" })
+         req.body = multipart_parameters(parameters)
+       end&.body)
+     end
+
+     def delete(path:)
+       to_json(conn.delete(uri(path: path)) do |req|
+         req.headers = headers
+       end&.body)
+     end
+
+     private
+
+     def to_json(string)
+       return unless string
+
+       JSON.parse(string)
+     rescue JSON::ParserError
+       # Convert a multiline string of JSON objects to a JSON array.
+       JSON.parse(string.gsub("}\n{", "},{").prepend("[").concat("]"))
+     end
+
+     # Given a proc, returns an outer proc that can be used to iterate over a JSON stream of chunks.
+     # For each chunk, the inner user_proc is called giving it the JSON object. The JSON object could
+     # be a data object or an error object as described in the OpenAI API documentation.
+     #
+     # If the JSON object for a given data or error message is invalid, it is ignored.
+     #
+     # @param user_proc [Proc] The inner proc to call for each JSON object in the chunk.
+     # @return [Proc] An outer proc that iterates over a raw stream, converting it to JSON.
+     def to_json_stream(user_proc:)
+       proc do |chunk, _|
+         chunk.scan(/(?:data|error): (\{.*\})/i).flatten.each do |data|
+           user_proc.call(JSON.parse(data))
+         rescue JSON::ParserError
+           # Ignore invalid JSON.
+         end
+       end
+     end
+
+     def conn(multipart: false)
+       Faraday.new do |f|
+         f.options[:timeout] = OpenAI.configuration.request_timeout
+         f.request(:multipart) if multipart
+       end
+     end
+
+     def uri(path:)
+       OpenAI.configuration.uri_base + OpenAI.configuration.api_version + path
+     end
+
+     def headers
+       {
+         "Content-Type" => "application/json",
+         "Authorization" => "Bearer #{OpenAI.configuration.access_token}",
+         "OpenAI-Organization" => OpenAI.configuration.organization_id
+       }
+     end
+
+     def multipart_parameters(parameters)
+       parameters&.transform_values do |value|
+         next value unless value.is_a?(File)
+
+         # Doesn't seem like OpenAI need mime_type yet, so not worth
+         # the library to figure this out. Hence the empty string
+         # as the second argument.
+         Faraday::UploadIO.new(value, "", value.path)
+       end
+     end
+   end
+ end
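The core of the new streaming support is `to_json_stream` above: each network chunk may carry several `data: {...}` (or `error: {...}`) lines, each of which is parsed and handed to the caller's proc, with unparseable payloads skipped. A standalone sketch of that scan-and-rescue shape, using a made-up chunk:

```ruby
require "json"

received = []
user_proc = proc { |obj| received << obj }

# A made-up chunk: two valid data lines and one that matches the
# pattern but is not valid JSON.
chunk = <<~CHUNK
  data: {"choices":[{"delta":{"content":"Hi"}}]}
  data: {"choices":[{"delta":{"content":" there"}}]}
  data: {broken}
CHUNK

# Same regex and rescue as to_json_stream's inner proc.
chunk.scan(/(?:data|error): (\{.*\})/i).flatten.each do |data|
  user_proc.call(JSON.parse(data))
rescue JSON::ParserError
  # Invalid JSON is ignored, as in the library.
end

puts received.length # => 2
puts received.first.dig("choices", 0, "delta", "content") # => "Hi"
```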
data/lib/openai/version.rb CHANGED
@@ -1,3 +1,3 @@
  module OpenAI
-   VERSION = "3.7.0".freeze
+   VERSION = "4.0.0".freeze
  end
data/lib/openai.rb CHANGED
@@ -1,5 +1,7 @@
- require "httparty"
+ require "faraday"
+ require "faraday/multipart"

+ require_relative "openai/http"
  require_relative "openai/client"
  require_relative "openai/files"
  require_relative "openai/finetunes"
data/ruby-openai.gemspec CHANGED
@@ -25,7 +25,6 @@ Gem::Specification.new do |spec|
    spec.executables = spec.files.grep(%r{^exe/}) { |f| File.basename(f) }
    spec.require_paths = ["lib"]

-   spec.add_dependency "httparty", ">= 0.18.1"
-
-   spec.post_install_message = "Note if upgrading: The `::Ruby::OpenAI` module has been removed and all classes have been moved under the top level `::OpenAI` module. To upgrade, change `require 'ruby/openai'` to `require 'openai'` and change all references to `Ruby::OpenAI` to `OpenAI`."
+   spec.add_dependency "faraday", ">= 1"
+   spec.add_dependency "faraday-multipart", ">= 1"
  end
metadata CHANGED
@@ -1,30 +1,44 @@
  --- !ruby/object:Gem::Specification
  name: ruby-openai
  version: !ruby/object:Gem::Version
-   version: 3.7.0
+   version: 4.0.0
  platform: ruby
  authors:
  - Alex
- autorequire:
+ autorequire:
  bindir: exe
  cert_chain: []
- date: 2023-03-25 00:00:00.000000000 Z
+ date: 2023-04-26 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
-   name: httparty
+   name: faraday
    requirement: !ruby/object:Gem::Requirement
      requirements:
      - - ">="
        - !ruby/object:Gem::Version
-         version: 0.18.1
+         version: '1'
    type: :runtime
    prerelease: false
    version_requirements: !ruby/object:Gem::Requirement
      requirements:
      - - ">="
        - !ruby/object:Gem::Version
-         version: 0.18.1
- description:
+         version: '1'
+ - !ruby/object:Gem::Dependency
+   name: faraday-multipart
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '1'
+   type: :runtime
+   prerelease: false
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '1'
+ description:
  email:
  - alexrudall@users.noreply.github.com
  executables: []
@@ -53,6 +67,7 @@ files:
  - lib/openai/compatibility.rb
  - lib/openai/files.rb
  - lib/openai/finetunes.rb
+ - lib/openai/http.rb
  - lib/openai/images.rb
  - lib/openai/models.rb
  - lib/openai/version.rb
@@ -67,10 +82,7 @@ metadata:
    source_code_uri: https://github.com/alexrudall/ruby-openai
    changelog_uri: https://github.com/alexrudall/ruby-openai/blob/main/CHANGELOG.md
    rubygems_mfa_required: 'true'
- post_install_message: 'Note if upgrading: The `::Ruby::OpenAI` module has been removed
-   and all classes have been moved under the top level `::OpenAI` module. To upgrade,
-   change `require ''ruby/openai''` to `require ''openai''` and change all references
-   to `Ruby::OpenAI` to `OpenAI`.'
+ post_install_message:
  rdoc_options: []
  require_paths:
  - lib
@@ -85,8 +97,8 @@ required_rubygems_version: !ruby/object:Gem::Requirement
    - !ruby/object:Gem::Version
      version: '0'
  requirements: []
- rubygems_version: 3.1.4
- signing_key:
+ rubygems_version: 3.4.12
+ signing_key:
  specification_version: 4
  summary: "OpenAI API + Ruby! \U0001F916❤️"
  test_files: []