openai 0.57.0 → 0.58.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (42)
  1. checksums.yaml +4 -4
  2. data/CHANGELOG.md +31 -0
  3. data/README.md +98 -1
  4. data/lib/openai/auth/subject_token_provider.rb +15 -0
  5. data/lib/openai/auth/subject_token_providers/azure_managed_identity_token_provider.rb +88 -0
  6. data/lib/openai/auth/subject_token_providers/gcp_id_token_provider.rb +66 -0
  7. data/lib/openai/auth/subject_token_providers/k8s_service_account_token_provider.rb +37 -0
  8. data/lib/openai/auth/token_type.rb +10 -0
  9. data/lib/openai/auth/workload_identity.rb +23 -0
  10. data/lib/openai/auth/workload_identity_auth.rb +176 -0
  11. data/lib/openai/client.rb +59 -4
  12. data/lib/openai/errors.rb +39 -0
  13. data/lib/openai/internal/util.rb +22 -7
  14. data/lib/openai/models/conversations/message.rb +28 -1
  15. data/lib/openai/models/oauth_error_code.rb +29 -0
  16. data/lib/openai/models/realtime/realtime_session_create_request.rb +4 -3
  17. data/lib/openai/models/realtime/realtime_session_create_response.rb +7 -5
  18. data/lib/openai/models/realtime/realtime_tracing_config.rb +3 -2
  19. data/lib/openai/models/vector_stores/file_batch_create_params.rb +9 -5
  20. data/lib/openai/models/vector_stores/file_create_params.rb +3 -1
  21. data/lib/openai/models.rb +2 -0
  22. data/lib/openai/resources/realtime/calls.rb +1 -1
  23. data/lib/openai/version.rb +1 -1
  24. data/lib/openai.rb +8 -0
  25. data/rbi/openai/auth.rbi +55 -0
  26. data/rbi/openai/internal/util.rbi +8 -0
  27. data/rbi/openai/models/conversations/message.rbi +53 -1
  28. data/rbi/openai/models/oauth_error_code.rbi +24 -0
  29. data/rbi/openai/models/realtime/realtime_session_create_request.rbi +6 -4
  30. data/rbi/openai/models/realtime/realtime_session_create_response.rbi +9 -6
  31. data/rbi/openai/models/realtime/realtime_tracing_config.rbi +3 -2
  32. data/rbi/openai/models/vector_stores/file_batch_create_params.rbi +18 -10
  33. data/rbi/openai/models/vector_stores/file_create_params.rbi +6 -2
  34. data/rbi/openai/models.rbi +2 -0
  35. data/rbi/openai/resources/realtime/calls.rbi +3 -2
  36. data/rbi/openai/resources/vector_stores/file_batches.rbi +6 -4
  37. data/rbi/openai/resources/vector_stores/files.rbi +3 -1
  38. data/sig/openai/internal/util.rbs +4 -0
  39. data/sig/openai/models/conversations/message.rbs +18 -2
  40. data/sig/openai/models/oauth_error_code.rbs +14 -0
  41. data/sig/openai/models.rbs +2 -0
  42. metadata +13 -2
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: 1b3abca0a4ee2539ad853331fb8ba61b722546103cd80ab21e983af50b12b53b
-  data.tar.gz: 303c04c16a7dd957587657d7adc826a6baa096b0805609c35f6b56d7fd3c03c1
+  metadata.gz: 73c6650d695d8ce23e2049359d56d6d4d2ae68971afb993c362f09b2f7bcfa37
+  data.tar.gz: 7a47b51f536ab52ce437c98663f56642d67638a187cba7055be29b79185630a0
 SHA512:
-  metadata.gz: 2150df450775d2f132f901d485aaf115831036dfecc438ca1eed13d89d44fc01eabde37c559dd1c834e490bee861ac43358a0237a93247586c3d08e77a960c0c
-  data.tar.gz: b3c47548b3b6ef867128331819dcada070087ba3028beaa1b7578a67ec1de7b7309aec8fafcadf1643717f70fc4799236999e74041c1eccd9dc341a80b7aca9e
+  metadata.gz: 84bf4110e464bab432883383984801d45c6cc9dc9dc40d23a8b70c1ceda3c498d7a835fb55773f9734e7f0891bf763d2fb71d213539b04993027518b158eb9ce
+  data.tar.gz: fa6a7d0c969bb6a6b70779e827cb5756e8420c21b986a313bd7d8e7d00663f6f300f75d61a51156831080f6510579a79523e2433189c356d63d9ba739f9021b7
data/CHANGELOG.md CHANGED
@@ -1,5 +1,36 @@
 # Changelog
 
+## 0.58.0 (2026-04-08)
+
+Full Changelog: [v0.57.0...v0.58.0](https://github.com/openai/openai-ruby/compare/v0.57.0...v0.58.0)
+
+### Features
+
+* **api:** add phase field to conversations message model ([a5dc6f8](https://github.com/openai/openai-ruby/commit/a5dc6f85304b7271da53b91e581cc7c124e5a3b0))
+* **api:** add WEB_SEARCH_CALL_RESULTS to ResponseIncludable enum ([a556507](https://github.com/openai/openai-ruby/commit/a55650709792c28327253e054a849ed293167fc9))
+* **client:** add support for short-lived tokens ([#1311](https://github.com/openai/openai-ruby/issues/1311)) ([a86d3bc](https://github.com/openai/openai-ruby/commit/a86d3bccd6cfe6b003d7383a525b7285ddd4b4b5))
+
+### Bug Fixes
+
+* align path encoding with RFC 3986 section 3.3 ([8058a6d](https://github.com/openai/openai-ruby/commit/8058a6d56f8ad87bf6cd840760eff58f1b9fd973))
+* **api:** remove web_search_call.results from ResponseIncludable enum ([2861387](https://github.com/openai/openai-ruby/commit/2861387a6f43db7d7e7cc703022b0093b27723b9))
+* **internal:** correct multipart form field name encoding ([683d14b](https://github.com/openai/openai-ruby/commit/683d14b37f2b57d5273fbb5ed1e5c6d2aea20f4d))
+* multipart encoding for file arrays ([755b444](https://github.com/openai/openai-ruby/commit/755b4448c4065a2bc2993f718e1ced82b99b6ebf))
+* variable name typo ([6b333a4](https://github.com/openai/openai-ruby/commit/6b333a4c4a76ad8cc91f6696ca8352c633381cfc))
+
+### Chores
+
+* **ci:** support opting out of skipping builds on metadata-only commits ([1b6ddfa](https://github.com/openai/openai-ruby/commit/1b6ddfa633228b299115e84918be246ec60e95d8))
+* **tests:** bump steady to v0.20.1 ([952ea68](https://github.com/openai/openai-ruby/commit/952ea68c40059c7dbaacc1f2238e04d7191b31d9))
+* **tests:** bump steady to v0.20.2 ([615427b](https://github.com/openai/openai-ruby/commit/615427b02e022b32824303b303a1cf1a5d528a58))
+
+### Documentation
+
+* **api:** update file parameter descriptions in vector_stores ([260e754](https://github.com/openai/openai-ruby/commit/260e754f501203b1a4e7c17a2ace1642b5194d52))
+
 ## 0.57.0 (2026-03-25)
 
 Full Changelog: [v0.56.0...v0.57.0](https://github.com/openai/openai-ruby/compare/v0.56.0...v0.57.0)
data/README.md CHANGED
@@ -15,7 +15,7 @@ To use this gem, install via Bundler by adding the following to your application
 <!-- x-release-please-start-version -->
 
 ```ruby
-gem "openai", "~> 0.57.0"
+gem "openai", "~> 0.58.0"
 ```
 
 <!-- x-release-please-end -->
@@ -107,6 +107,103 @@ puts(edited.data.first)
 
 Note that you can also pass a raw `IO` descriptor, but this disables retries, as the library can't be sure if the descriptor is a file or pipe (which cannot be rewound).
 
+## Workload Identity Authentication
+
+For secure, automated environments like cloud-managed Kubernetes, Azure, and GCP, you can use workload identity authentication with short-lived tokens from cloud identity providers instead of long-lived API keys.
+
+### Kubernetes Service Account
+
+```ruby
+require "openai"
+
+# Configure Kubernetes service account provider
+provider = OpenAI::Auth::SubjectTokenProviders::K8sServiceAccountTokenProvider.new
+
+workload_identity = OpenAI::Auth::WorkloadIdentity.new(
+  client_id: ENV["OAUTH_CLIENT_ID"], # This is the default and can be omitted
+  identity_provider_id: ENV["IDENTITY_PROVIDER_ID"], # This is the default and can be omitted
+  service_account_id: ENV["SERVICE_ACCOUNT_ID"], # This is the default and can be omitted
+  provider: provider
+)
+
+client = OpenAI::Client.new(
+  workload_identity: workload_identity,
+)
+
+response = client.chat.completions.create(
+  messages: [{role: "user", content: "Hello!"}],
+  model: "gpt-5.2"
+)
+```
+
+### Azure Managed Identity
+
+```ruby
+provider = OpenAI::Auth::SubjectTokenProviders::AzureManagedIdentityTokenProvider.new
+
+workload_identity = OpenAI::Auth::WorkloadIdentity.new(
+  client_id: ENV["OAUTH_CLIENT_ID"], # This is the default and can be omitted
+  identity_provider_id: ENV["IDENTITY_PROVIDER_ID"], # This is the default and can be omitted
+  service_account_id: ENV["SERVICE_ACCOUNT_ID"], # This is the default and can be omitted
+  provider: provider
+)
+
+client = OpenAI::Client.new(
+  workload_identity: workload_identity,
+)
+```
+
+### GCP Metadata Server
+
+```ruby
+provider = OpenAI::Auth::SubjectTokenProviders::GCPIDTokenProvider.new
+
+workload_identity = OpenAI::Auth::WorkloadIdentity.new(
+  client_id: ENV["OAUTH_CLIENT_ID"], # This is the default and can be omitted
+  identity_provider_id: ENV["IDENTITY_PROVIDER_ID"], # This is the default and can be omitted
+  service_account_id: ENV["SERVICE_ACCOUNT_ID"], # This is the default and can be omitted
+  provider: provider
+)
+
+client = OpenAI::Client.new(
+  workload_identity: workload_identity,
+)
+```
+
+### Custom Token Providers
+
+You can implement custom token providers by including the `OpenAI::Auth::SubjectTokenProvider` module:
+
+```ruby
+class CustomProvider
+  include OpenAI::Auth::SubjectTokenProvider
+
+  def token_type
+    OpenAI::Auth::TokenType::JWT
+  end
+
+  def get_token
+    "custom-token"
+  end
+end
+
+provider = CustomProvider.new
+
+workload_identity = OpenAI::Auth::WorkloadIdentity.new(
+  client_id: ENV["OAUTH_CLIENT_ID"], # This is the default and can be omitted
+  identity_provider_id: ENV["IDENTITY_PROVIDER_ID"], # This is the default and can be omitted
+  service_account_id: ENV["SERVICE_ACCOUNT_ID"], # This is the default and can be omitted
+  provider: provider
+)
+
+client = OpenAI::Client.new(
+  workload_identity: workload_identity,
+  organization: ENV["OPENAI_ORG_ID"],
+  project: ENV["OPENAI_PROJECT_ID"]
+)
+```
+
 ## Webhook Verification
 
 Verifying webhook signatures is _optional but encouraged_.
data/lib/openai/auth/subject_token_provider.rb ADDED
@@ -0,0 +1,15 @@
+# frozen_string_literal: true
+
+module OpenAI
+  module Auth
+    module SubjectTokenProvider
+      def token_type
+        raise NotImplementedError.new("#{self.class} must implement #token_type")
+      end
+
+      def get_token
+        raise NotImplementedError.new("#{self.class} must implement #get_token")
+      end
+    end
+  end
+end
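The `SubjectTokenProvider` mixin above acts as an interface: concrete providers override `#token_type` and `#get_token`, and a bare include fails loudly only when the hooks are actually called. A standalone sketch of that contract (the module is re-declared here outside the gem's namespace, and the two provider classes are illustrative):

```ruby
# Re-declaration of the mixin from the diff, outside the OpenAI namespace.
module SubjectTokenProvider
  def token_type
    raise NotImplementedError.new("#{self.class} must implement #token_type")
  end

  def get_token
    raise NotImplementedError.new("#{self.class} must implement #get_token")
  end
end

# Forgetting to override the hooks fails at call time, not at load time.
class IncompleteProvider
  include SubjectTokenProvider
end

# A complete provider overrides both hooks.
class StaticProvider
  include SubjectTokenProvider

  def token_type
    :jwt
  end

  def get_token
    "static-token"
  end
end

puts StaticProvider.new.get_token # => static-token

begin
  IncompleteProvider.new.get_token
rescue NotImplementedError => e
  puts e.message # => IncompleteProvider must implement #get_token
end
```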
data/lib/openai/auth/subject_token_providers/azure_managed_identity_token_provider.rb ADDED
@@ -0,0 +1,88 @@
+# frozen_string_literal: true
+
+module OpenAI
+  module Auth
+    module SubjectTokenProviders
+      class AzureManagedIdentityTokenProvider
+        include OpenAI::Auth::SubjectTokenProvider
+
+        IMDS_ENDPOINT = "http://169.254.169.254/metadata/identity/oauth2/token"
+        DEFAULT_RESOURCE = "https://management.azure.com/"
+        DEFAULT_API_VERSION = "2018-02-01"
+        DEFAULT_TIMEOUT = 10.0
+
+        def initialize(
+          resource: self.class::DEFAULT_RESOURCE,
+          object_id: nil,
+          client_id: nil,
+          msi_res_id: nil,
+          api_version: self.class::DEFAULT_API_VERSION,
+          timeout: self.class::DEFAULT_TIMEOUT
+        )
+          @resource = resource
+          @object_id = object_id
+          @client_id = client_id
+          @msi_res_id = msi_res_id
+          @api_version = api_version
+          @timeout = timeout
+        end
+
+        def token_type
+          TokenType::JWT
+        end
+
+        def get_token
+          uri = URI(self.class::IMDS_ENDPOINT)
+          params = {
+            "api-version" => @api_version,
+            "resource" => @resource
+          }
+          params["object_id"] = @object_id if @object_id
+          params["client_id"] = @client_id if @client_id
+          params["msi_res_id"] = @msi_res_id if @msi_res_id
+          uri.query = URI.encode_www_form(params)
+
+          request = Net::HTTP::Get.new(uri)
+          request["Metadata"] = "true"
+
+          response = Net::HTTP.start(
+            uri.hostname,
+            uri.port,
+            use_ssl: uri.scheme == "https",
+            open_timeout: @timeout,
+            read_timeout: @timeout
+          ) do |http|
+            http.request(request)
+          end
+
+          unless response.is_a?(Net::HTTPSuccess)
+            raise OpenAI::Errors::SubjectTokenProviderError.new(
+              message: "Azure IMDS returned #{response.code}: #{response.body}",
+              provider: "azure-imds"
+            )
+          end
+
+          data = JSON.parse(response.body, symbolize_names: true)
+
+          case data
+          in {access_token: String => token}
+            token
+          else
+            raise OpenAI::Errors::SubjectTokenProviderError.new(
+              message: "Azure IMDS response missing access_token field",
+              provider: "azure-imds"
+            )
+          end
+        rescue OpenAI::Errors::SubjectTokenProviderError
+          raise
+        rescue StandardError => e
+          raise OpenAI::Errors::SubjectTokenProviderError.new(
+            message: "Failed to fetch token from Azure IMDS: #{e.message}",
+            provider: "azure-imds",
+            cause: e
+          )
+        end
+      end
+    end
+  end
+end
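The provider's query-string assembly can be exercised in isolation. This sketch mirrors the optional-parameter logic from `#get_token` above using only the standard library; the helper name `imds_uri` and the sample `client_id` are illustrative:

```ruby
require "uri"

# Endpoint as in AzureManagedIdentityTokenProvider above.
IMDS_ENDPOINT = "http://169.254.169.254/metadata/identity/oauth2/token"

def imds_uri(resource:, api_version:, object_id: nil, client_id: nil, msi_res_id: nil)
  uri = URI(IMDS_ENDPOINT)
  params = {"api-version" => api_version, "resource" => resource}
  # Optional identity selectors are only added when present, as in the diff.
  params["object_id"] = object_id if object_id
  params["client_id"] = client_id if client_id
  params["msi_res_id"] = msi_res_id if msi_res_id
  uri.query = URI.encode_www_form(params)
  uri
end

uri = imds_uri(
  resource: "https://management.azure.com/",
  api_version: "2018-02-01",
  client_id: "00000000-0000-0000-0000-000000000000"
)
puts uri # reserved characters in the resource URL are form-encoded
```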
data/lib/openai/auth/subject_token_providers/gcp_id_token_provider.rb ADDED
@@ -0,0 +1,66 @@
+# frozen_string_literal: true
+
+module OpenAI
+  module Auth
+    module SubjectTokenProviders
+      class GCPIDTokenProvider
+        include OpenAI::Auth::SubjectTokenProvider
+
+        METADATA_HOST = "metadata.google.internal"
+        DEFAULT_AUDIENCE = "https://api.openai.com/v1"
+        DEFAULT_TIMEOUT = 10.0
+
+        def initialize(
+          audience: self.class::DEFAULT_AUDIENCE,
+          timeout: self.class::DEFAULT_TIMEOUT
+        )
+          @audience = audience
+          @timeout = timeout
+        end
+
+        def token_type
+          TokenType::ID
+        end
+
+        def get_token
+          path = "/computeMetadata/v1/instance/service-accounts/default/identity"
+          uri = URI::HTTP.build(
+            host: self.class::METADATA_HOST,
+            path: path,
+            query: URI.encode_www_form("audience" => @audience)
+          )
+
+          request = Net::HTTP::Get.new(uri)
+          request["Metadata-Flavor"] = "Google"
+
+          response = Net::HTTP.start(
+            uri.hostname,
+            uri.port,
+            use_ssl: false,
+            open_timeout: @timeout,
+            read_timeout: @timeout
+          ) do |http|
+            http.request(request)
+          end
+
+          unless response.is_a?(Net::HTTPSuccess)
+            raise OpenAI::Errors::SubjectTokenProviderError.new(
+              message: "GCP Metadata Server returned #{response.code}: #{response.body}",
+              provider: "gcp-metadata"
+            )
+          end
+
+          response.body
+        rescue OpenAI::Errors::SubjectTokenProviderError
+          raise
+        rescue StandardError => e
+          raise OpenAI::Errors::SubjectTokenProviderError.new(
+            message: "Failed to fetch token from GCP Metadata Server: #{e.message}",
+            provider: "gcp-metadata",
+            cause: e
+          )
+        end
+      end
+    end
+  end
+end
data/lib/openai/auth/subject_token_providers/k8s_service_account_token_provider.rb ADDED
@@ -0,0 +1,37 @@
+# frozen_string_literal: true
+
+module OpenAI
+  module Auth
+    module SubjectTokenProviders
+      class K8sServiceAccountTokenProvider
+        include OpenAI::Auth::SubjectTokenProvider
+
+        DEFAULT_TOKEN_PATH = "/var/run/secrets/kubernetes.io/serviceaccount/token"
+
+        def initialize(token_path: self.class::DEFAULT_TOKEN_PATH)
+          @token_path = token_path
+        end
+
+        def token_type
+          TokenType::JWT
+        end
+
+        def get_token
+          File.read(@token_path).strip
+        rescue SystemCallError => e
+          raise OpenAI::Errors::SubjectTokenProviderError.new(
+            message: "Failed to read Kubernetes service account token from #{@token_path}: #{e.message}",
+            provider: "kubernetes",
+            cause: e
+          )
+        rescue StandardError => e
+          raise OpenAI::Errors::SubjectTokenProviderError.new(
+            message: "Unexpected error reading Kubernetes token: #{e.message}",
+            provider: "kubernetes",
+            cause: e
+          )
+        end
+      end
+    end
+  end
+end
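The Kubernetes provider's token fetch is just a file read plus `strip`, which matters because projected service-account token files typically end with a newline. A sketch simulating the token file with `Tempfile` (the token string below is a placeholder):

```ruby
require "tempfile"

# Simulate the projected service-account token file that
# K8sServiceAccountTokenProvider reads; real tokens usually end with a
# newline, which is why the provider calls #strip.
file = Tempfile.new("sa-token")
file.write("header.payload.signature\n")
file.flush

token = File.read(file.path).strip
puts token # => header.payload.signature
```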
data/lib/openai/auth/token_type.rb ADDED
@@ -0,0 +1,10 @@
+# frozen_string_literal: true
+
+module OpenAI
+  module Auth
+    module TokenType
+      JWT = :jwt
+      ID = :id
+    end
+  end
+end
data/lib/openai/auth/workload_identity.rb ADDED
@@ -0,0 +1,23 @@
+# frozen_string_literal: true
+
+module OpenAI
+  module Auth
+    class WorkloadIdentity
+      attr_reader :client_id, :identity_provider_id, :service_account_id, :provider, :refresh_buffer_seconds
+
+      def initialize(
+        client_id:,
+        identity_provider_id:,
+        service_account_id:,
+        provider:,
+        refresh_buffer_seconds: 1200
+      )
+        @client_id = client_id.to_s
+        @identity_provider_id = identity_provider_id.to_s
+        @service_account_id = service_account_id.to_s
+        @provider = provider
+        @refresh_buffer_seconds = refresh_buffer_seconds.to_i
+      end
+    end
+  end
+end
data/lib/openai/auth/workload_identity_auth.rb ADDED
@@ -0,0 +1,176 @@
+# frozen_string_literal: true
+
+module OpenAI
+  module Auth
+    class WorkloadIdentityAuth
+      SUBJECT_TOKEN_TYPES = {
+        TokenType::JWT => "urn:ietf:params:oauth:token-type:jwt",
+        TokenType::ID => "urn:ietf:params:oauth:token-type:id_token"
+      }.freeze
+
+      TOKEN_EXCHANGE_GRANT_TYPE = "urn:ietf:params:oauth:grant-type:token-exchange"
+      DEFAULT_TOKEN_EXCHANGE_URL = "https://auth.openai.com/oauth/token"
+      DEFAULT_REFRESH_BUFFER_SECONDS = 1200
+
+      def initialize(
+        config,
+        organization_id,
+        token_exchange_url: DEFAULT_TOKEN_EXCHANGE_URL
+      )
+        @config = config
+        @organization_id = organization_id
+        @token_exchange_url = URI(token_exchange_url)
+
+        @cached_token = nil
+        @cached_token_expires_at_monotonic = nil
+        @cached_token_refresh_at_monotonic = nil
+        @refreshing = false
+        @mutex = Mutex.new
+        @cond_var = ConditionVariable.new
+      end
+
+      def get_token
+        @mutex.synchronize do
+          @cond_var.wait(@mutex) while @refreshing && token_unusable?
+
+          unless token_unusable? || needs_refresh?
+            return @cached_token
+          end
+
+          if @refreshing
+            @cond_var.wait(@mutex) while @refreshing
+            token = @cached_token
+            raise OpenAI::Errors::AuthenticationError, "Token refresh failed" if token_unusable?
+            return token
+          end
+
+          @refreshing = true
+        end
+
+        perform_refresh
+        @mutex.synchronize do
+          raise OpenAI::Errors::AuthenticationError, "Token refresh failed" if token_unusable?
+          @cached_token
+        end
+      end
+
+      def invalidate_token
+        @mutex.synchronize do
+          @cached_token = nil
+          @cached_token_expires_at_monotonic = nil
+          @cached_token_refresh_at_monotonic = nil
+        end
+      end
+
+      private
+
+      def perform_refresh
+        token_data = fetch_token_from_exchange
+        now = OpenAI::Internal::Util.monotonic_secs
+        expires_in = token_data.fetch(:expires_in)
+
+        @mutex.synchronize do
+          @cached_token = token_data.fetch(:id)
+          @cached_token_expires_at_monotonic = now + expires_in
+          @cached_token_refresh_at_monotonic = now + refresh_delay_seconds(expires_in)
+        end
+      ensure
+        @mutex.synchronize do
+          @refreshing = false
+          @cond_var.broadcast
+        end
+      end
+
+      def fetch_token_from_exchange
+        subject_token = @config.provider.get_token
+
+        token_type = @config.provider.token_type
+        subject_token_type = SUBJECT_TOKEN_TYPES.fetch(token_type) do
+          raise ArgumentError,
+                "Unsupported token type: #{token_type.inspect}. Supported types: #{SUBJECT_TOKEN_TYPES.keys.join(', ')}"
+        end
+
+        request = Net::HTTP::Post.new(@token_exchange_url)
+        request["Content-Type"] = "application/json"
+        request.body = JSON.generate(
+          grant_type: TOKEN_EXCHANGE_GRANT_TYPE,
+          client_id: @config.client_id,
+          subject_token: subject_token,
+          subject_token_type: subject_token_type,
+          identity_provider_id: @config.identity_provider_id,
+          service_account_id: @config.service_account_id
+        )
+
+        response = Net::HTTP.start(
+          @token_exchange_url.hostname,
+          @token_exchange_url.port,
+          use_ssl: @token_exchange_url.scheme == "https",
+          open_timeout: 5,
+          read_timeout: 5,
+          write_timeout: 5
+        ) do |http|
+          http.request(request)
+        end
+
+        handle_token_response(response)
+      end
+
+      def handle_token_response(response)
+        body = parse_response_body(response)
+
+        case response
+        in Net::HTTPBadRequest | Net::HTTPUnauthorized | Net::HTTPForbidden
+          raise OpenAI::Errors::OAuthError.new(
+            status: response.code.to_i,
+            body: body,
+            headers: response.to_hash
+          )
+        in Net::HTTPSuccess
+          {
+            id: body&.dig(:access_token),
+            expires_in: body&.dig(:expires_in)
+          }
+        else
+          raise OpenAI::Errors::APIError.new(
+            url: @token_exchange_url,
+            status: response.code.to_i,
+            headers: response.to_hash,
+            body: body,
+            message: "Token exchange failed with status #{response.code}"
+          )
+        end
+      end
+
+      def parse_response_body(response)
+        return nil if response.body.nil? || response.body.empty?
+
+        JSON.parse(response.body, symbolize_names: true)
+      rescue JSON::ParserError
+        nil
+      end
+
+      def token_unusable?
+        @cached_token.nil? || token_expired?
+      end
+
+      def token_expired?
+        return true if @cached_token_expires_at_monotonic.nil?
+
+        OpenAI::Internal::Util.monotonic_secs >= @cached_token_expires_at_monotonic
+      end
+
+      def needs_refresh?
+        return false if @cached_token_refresh_at_monotonic.nil?
+
+        OpenAI::Internal::Util.monotonic_secs >= @cached_token_refresh_at_monotonic
+      end
+
+      def refresh_delay_seconds(expires_in)
+        configured_buffer = @config.refresh_buffer_seconds || DEFAULT_REFRESH_BUFFER_SECONDS
+        effective_buffer = [configured_buffer, expires_in / 2].min
+
+        [expires_in - effective_buffer, 0].max
+      end
+    end
+  end
+end
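The refresh scheduling in `#refresh_delay_seconds` caps the buffer at half the token lifetime, so even short-lived tokens are refreshed ahead of expiry rather than on every request. The arithmetic, extracted as a standalone sketch:

```ruby
# Pure-function extraction of WorkloadIdentityAuth#refresh_delay_seconds:
# the effective buffer is the smaller of the configured buffer and half the
# token lifetime; the delay never goes negative.
def refresh_delay_seconds(expires_in, configured_buffer: 1200)
  effective_buffer = [configured_buffer, expires_in / 2].min
  [expires_in - effective_buffer, 0].max
end

puts refresh_delay_seconds(3600) # 1-hour token: refresh 1200s early => 2400
puts refresh_delay_seconds(600)  # 10-minute token: buffer capped at 300 => 300
```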
data/lib/openai/client.rb CHANGED
@@ -15,6 +15,8 @@ module OpenAI
     # Default max retry delay in seconds.
     DEFAULT_MAX_RETRY_DELAY = 8.0
 
+    WORKLOAD_IDENTITY_API_KEY_PLACEHOLDER = "workload-identity-auth"
+
     # @return [String]
     attr_reader :api_key
 
@@ -27,6 +29,10 @@ module OpenAI
     # @return [String, nil]
     attr_reader :webhook_secret
 
+    # @return [OpenAI::Auth::WorkloadIdentityAuth, nil]
+    # @api private
+    attr_reader :workload_identity_auth
+
     # Given a prompt, the model will return one or more predicted completions, and can
     # also return the probabilities of alternative tokens at each position.
     # @return [OpenAI::Resources::Completions]
@@ -116,13 +122,48 @@
       {"authorization" => "Bearer #{@api_key}"}
     end
 
+    # @api private
+    private def request_replayable?(request)
+      body = request[:body]
+      return true if body.nil? || body.is_a?(String)
+      return false if body.respond_to?(:read)
+      true
+    end
+
+    # @api private
+    private def send_request(request, redirect_count:, retry_count:, send_retry_header:)
+      return super unless @workload_identity_auth
+
+      token = @workload_identity_auth.get_token
+      updated_headers = request[:headers].merge("authorization" => "Bearer #{token}")
+      updated_request = request.merge(headers: updated_headers)
+
+      begin
+        super(updated_request, redirect_count: redirect_count, retry_count: retry_count, send_retry_header: send_retry_header)
+      rescue OpenAI::Errors::AuthenticationError
+        raise unless retry_count.zero? && request_replayable?(request)
+        @workload_identity_auth.invalidate_token
+
+        fresh_token = @workload_identity_auth.get_token
+        refreshed_headers = request[:headers].merge("authorization" => "Bearer #{fresh_token}")
+        refreshed_request = request.merge(headers: refreshed_headers)
+
+        super(refreshed_request, redirect_count: redirect_count, retry_count: retry_count + 1, send_retry_header: send_retry_header)
+      end
+    end
+
     # Creates and returns a new client for interacting with the API.
     #
-    # @param api_key [String, nil] Defaults to `ENV["OPENAI_API_KEY"]`
+    # @param api_key [String, nil] Defaults to `ENV["OPENAI_API_KEY"]`.
+    #   Mutually exclusive with `workload_identity`.
+    #
+    # @param workload_identity [OpenAI::Auth::WorkloadIdentity, nil]
+    #   OAuth2 workload identity configuration for token exchange authentication.
+    #   Mutually exclusive with `api_key`.
     #
-    # @param organization [String, nil] Defaults to `ENV["OPENAI_ORG_ID"]`
+    # @param organization [String, nil] Defaults to `ENV["OPENAI_ORG_ID"]`.
     #
-    # @param project [String, nil] Defaults to `ENV["OPENAI_PROJECT_ID"]`
+    # @param project [String, nil] Defaults to `ENV["OPENAI_PROJECT_ID"]`.
     #
     # @param webhook_secret [String, nil] Defaults to `ENV["OPENAI_WEBHOOK_SECRET"]`
     #
@@ -138,6 +179,7 @@ module OpenAI
     # @param max_retry_delay [Float]
     def initialize(
       api_key: ENV["OPENAI_API_KEY"],
+      workload_identity: nil,
       organization: ENV["OPENAI_ORG_ID"],
       project: ENV["OPENAI_PROJECT_ID"],
       webhook_secret: ENV["OPENAI_WEBHOOK_SECRET"],
@@ -149,7 +191,20 @@ module OpenAI
     )
       base_url ||= "https://api.openai.com/v1"
 
-      if api_key.nil?
+      if workload_identity && api_key && api_key != WORKLOAD_IDENTITY_API_KEY_PLACEHOLDER
+        raise ArgumentError.new(
+          "The `api_key` and `workload_identity` arguments are mutually exclusive; " \
+          "only one can be passed at a time."
+        )
+      end
+
+      if workload_identity
+        @workload_identity_auth = OpenAI::Auth::WorkloadIdentityAuth.new(
+          workload_identity,
+          organization
+        )
+        api_key = WORKLOAD_IDENTITY_API_KEY_PLACEHOLDER
+      elsif api_key.nil?
         raise ArgumentError.new("api_key is required, and can be set via environ: \"OPENAI_API_KEY\"")
       end
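The 401-retry path in `send_request` only replays requests whose bodies are safe to resend. A standalone sketch of that `request_replayable?` check, extracted from the diff (the hash-shaped `request` values below are illustrative):

```ruby
require "stringio"

# Extraction of Client#request_replayable?: nil and String bodies can be
# resent after an AuthenticationError, but IO-like bodies (anything that
# responds to #read) may already be partially consumed and cannot be replayed.
def request_replayable?(request)
  body = request[:body]
  return true if body.nil? || body.is_a?(String)
  return false if body.respond_to?(:read)
  true
end

puts request_replayable?(body: nil)               # => true
puts request_replayable?(body: '{"input":"hi"}')  # => true
puts request_replayable?(body: StringIO.new("x")) # => false
```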