durable_streams 0.1.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml ADDED
@@ -0,0 +1,7 @@
---
SHA256:
  metadata.gz: 8b7669dbd968851bfab918dd9de9e7f31cc9a936ad5088d4920cc2543adf5737
  data.tar.gz: c58558467f45deb44f6497c48e119d8e4ec2a7842e6f5dba07c7dc32294594af
SHA512:
  metadata.gz: 5420a19b41a0333e82da75a559f1f6f24b475c0367d1cfddf3ccfe6197c23948b577b77fa793fa7c9d59e4c834cefcd499c075023c015c65c49d6720a02a6950
  data.tar.gz: 9453e5324920d5cfee9c5b0548e4a212ebb9b2490bb3573a4a6fafc69ebc45ecf5e5b33286a7b56acb4ebeb1f8ae2cc674cabd2ccccff18dcde6ff56364148ff
data/CHANGELOG.md ADDED
@@ -0,0 +1,7 @@
# Changelog

## 0.1.0

- Initial release based on upstream [durable-streams/packages/client-rb](https://github.com/durable-streams/durable-streams/tree/main/packages/client-rb) at commit TBD
- All source code manually reviewed
- Published to RubyGems for straightforward installation
data/LICENSE ADDED
@@ -0,0 +1,22 @@
MIT License

Copyright (c) Durable Streams contributors (https://github.com/durable-streams/durable-streams)
Copyright (c) tokimonki (https://github.com/tokimonki/durable_streams)

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
data/README.md ADDED
@@ -0,0 +1,276 @@
# Durable Streams Ruby Client

A maintained Ruby client for [Durable Streams](https://github.com/durable-streams/durable-streams) — the open protocol for persistent, resumable event streams over HTTP.

Based on the [upstream reference client](https://github.com/durable-streams/durable-streams/tree/main/packages/client-rb) with manual code review, testing, and ongoing maintenance. See [UPSTREAM.md](UPSTREAM.md) for the full sync policy.

## Why this gem exists

The Durable Streams project maintains clients in 10+ languages from a monorepo. The Ruby client is not yet published to RubyGems, so installing it means pointing your Gemfile into a monorepo subdirectory with `git:` and `glob:` options.

We maintain this standalone copy so that:

- **Installation is simple** — `gem "durable_streams"` in your Gemfile
- **The code is reviewed** — every line has been manually examined
- **Upstream changes are tracked** — we sync protocol and bug fixes, documented in the [CHANGELOG](CHANGELOG.md)

If the upstream team publishes the Ruby client to RubyGems, we will coordinate with them to consolidate.

## Installation

Add to your Gemfile:

```ruby
gem "durable_streams"
```

To track the latest development version from GitHub:

```ruby
gem "durable_streams", github: "tokimonki/durable_streams"
```

**Requirements:** Ruby 3.1+

## Quick Start

```ruby
require "durable_streams"

# Configure once at startup
DurableStreams.configure do |config|
  config.base_url = "https://streams.example.com"
  config.timeout = 30
  config.default_headers = {
    "Authorization" => -> { "Bearer #{current_token}" }
  }
end

# Create a stream
stream = DurableStreams.create("/my-stream", content_type: :json)

# Write
stream << { event: "user.created", user_id: 123 }

# Read
stream.read.each { |msg| puts msg }
```

## Writing

```ruby
stream = DurableStreams.create("/events/orders", content_type: :json)

stream << { order_id: 1, status: "created" }
stream << { order_id: 2, status: "shipped" }

# With sequence numbers for ordering
stream.append({ order_id: 3 }, seq: "123")
```

## Reading

```ruby
stream = DurableStreams.stream("/events/orders")

# Catch-up: read all existing messages
stream.each { |event| process(event) }

# With checkpointing
stream.read(offset: saved_offset).each_batch do |batch|
  batch.items.each { |event| process(event) }
  save_offset(batch.next_offset)
end
```

## Live Streaming

```ruby
stream = DurableStreams.stream("/events")

# Long-poll mode
stream.read(live: :long_poll).each { |msg| process(msg) }

# SSE mode (for JSON/text streams)
stream.read(live: :sse).each { |msg| puts msg }

# Lazy enumeration
stream.read(live: :sse).each.lazy.take(10).to_a
```

## Exactly-Once Writes with Producer

```ruby
DurableStreams::Producer.open(
  url: "https://streams.example.com/events",
  producer_id: "order-service-#{Process.pid}",
  epoch: 0,
  auto_claim: true
) do |producer|
  1000.times { |i| producer << { order_id: i, status: "created" } }
end
```

## Configuration

```ruby
DurableStreams.configure do |config|
  config.base_url = ENV["DURABLE_STREAMS_URL"]
  config.default_content_type = :json
  config.timeout = 30
  config.retry_policy = DurableStreams::RetryPolicy.new(
    max_retries: 5,
    initial_delay: 0.1,
    max_delay: 30.0
  )
  config.default_headers = {
    "Authorization" => -> { "Bearer #{refresh_token}" }
  }
end

# Reset to defaults (useful in tests)
DurableStreams.reset_configuration!
```

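Header values supplied as callables (like the `Authorization` lambda above) are re-evaluated on every request, which keeps short-lived tokens fresh without reconfiguring. A minimal self-contained sketch of that pattern — the `fetch_token` refresher and its cache are hypothetical stand-ins for your auth client:

```ruby
# Hypothetical token cache; a real app would call its auth provider.
TOKEN = { value: "tok-0", expires_at: Time.now + 300 }

def fetch_token
  if Time.now >= TOKEN[:expires_at]
    TOKEN[:value] = "tok-#{Time.now.to_i}" # refresh the expired token
    TOKEN[:expires_at] = Time.now + 300
  end
  TOKEN[:value]
end

# The same lambda shape used in default_headers above:
auth_header = -> { "Bearer #{fetch_token}" }

auth_header.call # evaluated at call time, so each request sees a live token
```

Because the lambda runs per request, a token rotated mid-process is picked up automatically; a plain string header would go stale.
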
### Isolated contexts

For multi-tenant or testing scenarios:

```ruby
staging = DurableStreams.new_context do |config|
  config.base_url = "https://staging.example.com"
end

stream = DurableStreams::Stream.new("/events", context: staging)
```

## API Reference

### Module methods

```ruby
DurableStreams.configure { |config| ... }
DurableStreams.reset_configuration!
DurableStreams.new_context { |config| ... }
DurableStreams.stream("/path")
DurableStreams.create("/path", content_type: :json)
DurableStreams.append("/path", data)
DurableStreams.read("/path", offset: "-1")
```

### Stream

```ruby
stream = DurableStreams.stream("/events")

# Metadata
stream.head     # => HeadResult
stream.exists?  # => true/false
stream.json?    # => true/false

# Writing
stream.append(data, seq: nil)  # => AppendResult
stream << data                 # => self (chainable)

# Reading
stream.each { |msg| ... }
stream.read(offset: "-1", live: false, format: :auto)  # => Reader
stream.read_all(offset: "-1")                          # => Array

# Lifecycle
stream.create_stream(content_type:, ttl_seconds: nil, expires_at: nil)
stream.delete
stream.close
```

### Read options

| `live` mode | Behavior |
|---|---|
| `false` | Return when caught up (catch-up only) |
| `:long_poll` | Wait for new data or timeout |
| `:sse` | Server-Sent Events with automatic reconnection |

| `format` | Behavior |
|---|---|
| `:auto` | Detect from Content-Type header |
| `:json` | Force JSON parsing (JsonReader) |
| `:bytes` | Force raw bytes (ByteReader) |

### Producer

```ruby
DurableStreams::Producer.open(url: "...", producer_id: "...") do |producer|
  producer << data
end

# Manual form
producer = DurableStreams::Producer.new(
  url: "...",
  producer_id: "unique-id",
  epoch: 0,
  auto_claim: false,
  max_batch_bytes: 1_048_576,
  linger_ms: 5,
  max_in_flight: 5
)

producer.append(data)   # Fire-and-forget (batched)
producer << data        # Same as append
producer.append!(data)  # Wait for acknowledgment => ProducerResult
producer.flush
producer.close
```

### Errors

All errors inherit from `DurableStreams::Error`:

| Error | HTTP Status | Meaning |
|---|---|---|
| `StreamNotFoundError` | 404 | Stream doesn't exist |
| `StreamExistsError` | 409 | Already exists with different config |
| `SeqConflictError` | 409 | Sequence number conflict |
| `ContentTypeMismatchError` | 409 | Wrong content type |
| `StaleEpochError` | 403 | Producer epoch is stale |
| `SequenceGapError` | 409 | Producer sequence gap |
| `RateLimitedError` | 429 | Rate limited |
| `BadRequestError` | 400 | Invalid request |
| `ConnectionError` | — | Network error |
| `TimeoutError` | — | Request timeout |

+ ## Testing
243
+
244
+ ```ruby
245
+ require "durable_streams/testing"
246
+
247
+ # Setup
248
+ DurableStreams::Testing.install!
249
+ DurableStreams::Testing.clear!
250
+
251
+ # Seed and use
252
+ DurableStreams::Testing.mock_transport.seed_stream("/events", [])
253
+ stream = DurableStreams.create("/events", content_type: :json)
254
+ stream.append({ event: "test" })
255
+
256
+ # Verify
257
+ messages = DurableStreams::Testing.messages_for("/events")
258
+
259
+ # Teardown
260
+ DurableStreams::Testing.reset!
261
+ ```
262
+
263
+ ## Thread safety
264
+
265
+ - **Configuration** — thread-safe after freeze (configure at startup)
266
+ - **Stream** — create one per thread for concurrent reads
267
+ - **Producer** — thread-safe, uses mutex internally
268
+ - **Readers** — single-threaded (create new reader per thread)
269
+
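The Producer's "thread-safe, uses a mutex internally" guarantee can be illustrated with a self-contained toy — not the gem's implementation, just the locking pattern it describes:

```ruby
# Toy stand-in for the Producer's internal locking: many threads append
# through one mutex, so no message is lost to a racy Array push.
class ToyBatcher
  def initialize
    @mutex = Mutex.new
    @buffer = []
  end

  def <<(message)
    @mutex.synchronize { @buffer << message }
    self # chainable, like Producer#<<
  end

  def size
    @mutex.synchronize { @buffer.size }
  end
end

batcher = ToyBatcher.new
threads = 8.times.map do |t|
  Thread.new { 100.times { |i| batcher << { thread: t, seq: i } } }
end
threads.each(&:join)
batcher.size # => 800
```

Streams and Readers offer no such lock, which is why the list above says to create one per thread instead of sharing.
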
## Upstream

This gem is based on the reference Ruby client from the [Durable Streams](https://github.com/durable-streams/durable-streams) project (`packages/client-rb/`). See [UPSTREAM.md](UPSTREAM.md) for the sync policy and process.

## License

MIT — see [LICENSE](LICENSE)
data/UPSTREAM.md ADDED
@@ -0,0 +1,85 @@
# Upstream Sync

This gem is based on the Ruby client from the [Durable Streams](https://github.com/durable-streams/durable-streams) project. The upstream code lives at `packages/client-rb/` in their monorepo.

We publish this as a standalone gem because the upstream team maintains clients in 10+ languages and has not yet published the Ruby client to RubyGems. Since we depend on this client for `durable_streams-rails`, we maintain a reviewed, tested copy here.

## Relationship with upstream

- **We coordinate, not compete.** If upstream publishes `durable_streams` to RubyGems, we will work with them to consolidate — either transferring gem ownership or deprecating this fork.
- **We attribute.** The LICENSE includes the original copyright. Every CHANGELOG entry that pulls in upstream changes links to the specific commits.
- **We contribute back.** Bugs found here get reported upstream. Fixes that apply to the reference client get submitted as PRs to the upstream repo.

## Sync process

### Checking for upstream changes

```bash
# Add upstream as a remote (one-time setup)
git remote add upstream-ref https://github.com/durable-streams/durable-streams.git

# Fetch latest
git fetch upstream-ref main

# Compare the Ruby client directory against our lib/
git diff HEAD upstream-ref/main -- packages/client-rb/lib/
```

Alternatively, review the upstream commit log filtered to the Ruby client:

```bash
git log upstream-ref/main -- packages/client-rb/
```

### Applying upstream changes

1. **Review each upstream commit individually.** Don't blindly merge. The upstream client may include changes we've already fixed differently, or patterns we've deliberately diverged from.

2. **For each relevant change, create a branch:**

   ```bash
   git checkout -b sync/upstream-YYYY-MM-DD
   ```

3. **Cherry-pick or manually apply** the changes. Prefer manual application — it forces you to understand what changed.

4. **Run the full test suite** before merging.

5. **Document in CHANGELOG.md** with a link to the upstream commit:

   ```
   ## x.y.z

   - Sync: applied upstream fix for SSE reconnection (durable-streams/durable-streams@abc1234)
   ```

### What to sync

- **Bug fixes** — always pull these in after review
- **Protocol compliance changes** — required to stay compatible with the server
- **New features** — evaluate case by case; only pull in what we need or what improves the client
- **Refactors** — evaluate carefully; our codebase may have diverged structurally

### What NOT to sync

- **Conformance adapter** (`conformance_adapter.rb`) — this is upstream's test harness, not part of the client
- **Design docs** (`design.md`) — internal upstream documentation
- **Style changes** — we follow our own conventions where they diverge

## Sync schedule

Check upstream monthly, or whenever a new protocol version is released. Pin the upstream commit hash in CHANGELOG.md so we always know our baseline.

## RubyGems coordination

The upstream team has an open issue to publish the Ruby client to RubyGems:
https://github.com/durable-streams/durable-streams/issues/166

As of February 2026, this issue has no activity beyond the initial filing. If upstream publishes their own `durable_streams` gem, we will coordinate with them — either transferring ownership of this gem or deprecating it in favor of theirs.

## Current baseline

- **Upstream repo:** https://github.com/durable-streams/durable-streams
- **Upstream path:** `packages/client-rb/`
- **Baseline commit:** TBD (fill in when publishing)
- **Baseline date:** TBD
@@ -0,0 +1,185 @@
# frozen_string_literal: true

require "base64"

module DurableStreams
  # Reader for byte streams - yields raw chunks
  class ByteReader
    attr_reader :next_offset, :cursor, :up_to_date, :status

    # @param stream [Stream] Parent stream handle
    # @param offset [String] Starting offset
    # @param live [Symbol, false] Live mode (:long_poll, :sse, false)
    # @param cursor [String, nil] Initial cursor
    def initialize(stream, offset: "-1", live: false, cursor: nil)
      @stream = stream
      @offset = DurableStreams.normalize_offset(offset)
      @live = live
      @next_offset = @offset
      @cursor = cursor
      @up_to_date = false
      @closed = false
      @status = nil
      @sse_reader = nil
    end

    # Iterate over byte chunks
    # @yield [ByteChunk] Each chunk with data, next_offset, cursor, up_to_date
    def each(&block)
      return enum_for(:each) unless block_given?

      # Handle SSE mode
      if @live == :sse
        each_sse(&block)
        return
      end

      loop do
        break if @closed

        chunk = fetch_next_chunk
        break if chunk.nil?

        @next_offset = chunk.next_offset
        @cursor = chunk.cursor
        @up_to_date = chunk.up_to_date

        yield chunk

        # Break for non-live modes when up_to_date
        break if @live == false && @up_to_date
        # Break for long-poll on 204 timeout (up_to_date with empty data = no new data)
        break if @live == :long_poll && @up_to_date && chunk.data.empty? && @status == 204
        break if @closed
      end
    end

    # Accumulate all bytes until up_to_date
    # @return [String]
    def body
      chunks = []
      each { |chunk| chunks << chunk.data }
      chunks.join
    end

    # Get as text
    # @return [String]
    def text
      body.encode("UTF-8")
    end

    # Collect chunks as array
    def to_a
      result = []
      each { |chunk| result << chunk }
      result
    end

    # Cancel/close the reader
    def close
      @closed = true
      @sse_reader&.close
    end

    def closed?
      @closed
    end

    def up_to_date?
      @up_to_date
    end

    private

    def each_sse(&block)
      @sse_reader = SSEReader.new(
        @stream,
        offset: @next_offset,
        cursor: @cursor
      )

      @sse_reader.each_event do |event|
        break if @closed

        @next_offset = event[:next_offset] if event[:next_offset]
        @cursor = event[:cursor] if event[:cursor]
        @up_to_date = event[:up_to_date]
        @status = 200

        # Only yield if there's data
        if event[:data] && !event[:data].empty?
          chunk = ByteChunk.new(
            data: event[:data],
            next_offset: @next_offset,
            cursor: @cursor,
            up_to_date: @up_to_date
          )
          yield chunk
        elsif event[:up_to_date]
          # Yield empty chunk on control event with up_to_date
          chunk = ByteChunk.new(
            data: "",
            next_offset: @next_offset,
            cursor: @cursor,
            up_to_date: @up_to_date
          )
          yield chunk
        end
      end
    ensure
      @sse_reader&.close
    end

    def fetch_next_chunk
      params = { offset: @next_offset }
      params[:cursor] = @cursor if @cursor

      # Add live mode parameter
      case @live
      when :long_poll
        params[:live] = "long-poll"
      when :sse
        # SSE is handled separately via each_sse
        return nil
      when false
        # No live param for catch-up only
      end

      request_url = HTTP.build_url(@stream.url, params)
      headers = @stream.resolved_headers

      response = @stream.transport.request(:get, request_url, headers: headers)
      @status = response.status

      raise StreamNotFoundError.new(url: @stream.url) if response.status == 404

      # Handle 204 No Content (long-poll timeout)
      # Still parse headers as they contain offset info
      if response.status == 204
        @next_offset = response[STREAM_NEXT_OFFSET_HEADER] || @next_offset
        @cursor = response[STREAM_CURSOR_HEADER] || @cursor
        @up_to_date = true
        return ByteChunk.new(data: "", next_offset: @next_offset, cursor: @cursor, up_to_date: true)
      end

      unless response.success?
        raise DurableStreams.error_from_status(response.status, url: @stream.url, body: response.body,
                                               headers: response.headers)
      end

      headers = DurableStreams.parse_stream_headers(response, next_offset: @next_offset, cursor: @cursor)
      @next_offset = headers[:next_offset]
      @cursor = headers[:cursor]
      @up_to_date = headers[:up_to_date]

      ByteChunk.new(
        data: response.body || "",
        next_offset: @next_offset,
        cursor: @cursor,
        up_to_date: @up_to_date
      )
    end
  end
end
@@ -0,0 +1,68 @@
# frozen_string_literal: true

module DurableStreams
  # Client manages HTTP connections and provides stream handles.
  # Creates a Context internally for configuration isolation.
  # Thread-safe for concurrent use.
  class Client
    attr_reader :context

    # Open a client with block form for automatic cleanup
    # @example
    #   Client.open(base_url: "https://...") do |client|
    #     stream = client.stream("/events")
    #     # ...
    #   end # auto-closes
    # @yield [Client] The client instance
    # @return [Object] The block's return value
    def self.open(**options, &block)
      client = new(**options)
      return client unless block_given?

      begin
        yield client
      ensure
        client.close
      end
    end

    # @param base_url [String, nil] Optional base URL for relative paths
    # @param headers [Hash] Default headers (values can be strings or callables)
    # @param timeout [Numeric] Request timeout in seconds
    # @param retry_policy [RetryPolicy] Custom retry configuration
    def initialize(base_url: nil, headers: {}, timeout: 30, retry_policy: nil)
      @context = Context.new do |config|
        config.base_url = base_url
        config.default_headers = headers || {}
        config.timeout = timeout
        config.retry_policy = retry_policy if retry_policy
      end
    end

    # Get a Stream handle for the given URL
    # @param url [String] Full URL or path (if base_url set)
    # @return [Stream]
    def stream(url, **options)
      Stream.new(url, context: @context, **options)
    end

    # Shortcut: connect to an existing stream
    # @param url [String] Stream URL or path
    # @return [Stream]
    def connect(url, **options)
      Stream.connect(url, context: @context, **options)
    end

    # Shortcut: create a new stream on the server
    # @param url [String] Stream URL or path
    # @param content_type [String] Content type for the stream
    # @return [Stream]
    def create(url, content_type:, **options)
      Stream.create(url, content_type: content_type, context: @context, **options)
    end

    # No-op close (Context doesn't hold resources)
    def close
    end
  end
end
@@ -0,0 +1,26 @@
# frozen_string_literal: true

module DurableStreams
  # Configuration holds all settings for DurableStreams.
  # Thread-safe when frozen (after the configure block completes).
  class Configuration
    attr_accessor :base_url, :default_content_type, :default_headers,
                  :timeout, :retry_policy

    def initialize
      @base_url = nil
      @default_content_type = :json
      @default_headers = {}
      @timeout = 30
      @retry_policy = RetryPolicy.default
    end

    # Dup nested mutable objects when copying so the copy doesn't share
    # state with the original. Called automatically by dup/clone.
    def initialize_copy(other)
      super
      @default_headers = other.default_headers.dup
      @retry_policy = other.retry_policy.dup if other.retry_policy.respond_to?(:dup) && !other.retry_policy.frozen?
    end
  end
end