dspy-openai 1.0.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml ADDED
@@ -0,0 +1,7 @@
+ ---
+ SHA256:
+ metadata.gz: e6c2209a6cdc9e1eba041246f0c8450fe569fcb0e7b7bf7be9988e24ac305f89
+ data.tar.gz: 43c0139cde022436be8b30ab36a5e42a68b1ee445161e8ba00c3571a51c0a962
+ SHA512:
+ metadata.gz: 6e85f8eb04fc8bb782ff35221da6de10b52f3786d4065aded4ea1a61a484770925679cc2fdc9714482128c8bcf6e1d79c9600134d42258e60b6a2819b89e7e45
+ data.tar.gz: 90848415fb21d32cf14445abcca2049b136a61c7ffd98476ef6b7439fdd5c96c46819829d82a0a96f1df524f4c0e3ebdcf2f4593f1b5d195d1d65cd709d0ec14
data/LICENSE ADDED
@@ -0,0 +1,45 @@
+ MIT License
+
+ Copyright (c) 2025 Vicente Services SL
+
+ Permission is hereby granted, free of charge, to any person obtaining a copy
+ of this software and associated documentation files (the "Software"), to deal
+ in the Software without restriction, including without limitation the rights
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ copies of the Software, and to permit persons to whom the Software is
+ furnished to do so, subject to the following conditions:
+
+ The above copyright notice and this permission notice shall be included in all
+ copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+ SOFTWARE.
+
+ This project is a Ruby port of the original Python [DSPy library](https://github.com/stanfordnlp/dspy), which is licensed under the MIT License:
+
+ MIT License
+
+ Copyright (c) 2023 Stanford Future Data Systems
+
+ Permission is hereby granted, free of charge, to any person obtaining a copy
+ of this software and associated documentation files (the "Software"), to deal
+ in the Software without restriction, including without limitation the rights
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ copies of the Software, and to permit persons to whom the Software is
+ furnished to do so, subject to the following conditions:
+
+ The above copyright notice and this permission notice shall be included in all
+ copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+ SOFTWARE.
data/README.md ADDED
@@ -0,0 +1,292 @@
+ # DSPy.rb
+
+ [![Gem Version](https://img.shields.io/gem/v/dspy)](https://rubygems.org/gems/dspy)
+ [![Total Downloads](https://img.shields.io/gem/dt/dspy)](https://rubygems.org/gems/dspy)
+ [![Build Status](https://img.shields.io/github/actions/workflow/status/vicentereig/dspy.rb/ruby.yml?branch=main&label=build)](https://github.com/vicentereig/dspy.rb/actions/workflows/ruby.yml)
+ [![Documentation](https://img.shields.io/badge/docs-vicentereig.github.io%2Fdspy.rb-blue)](https://vicentereig.github.io/dspy.rb/)
+ [![Discord](https://img.shields.io/discord/1161519468141355160?label=discord&logo=discord&logoColor=white)](https://discord.gg/zWBhrMqn)
+
+ > [!NOTE]
+ > The core Prompt Engineering Framework is production-ready with
+ > comprehensive documentation. I am now focusing on educational content on systematic Prompt Optimization and Context Engineering.
+ > Your feedback is invaluable. If you encounter issues, please open an [issue](https://github.com/vicentereig/dspy.rb/issues). If you have suggestions, open a [new thread](https://github.com/vicentereig/dspy.rb/discussions).
+ >
+ > If you want to contribute, feel free to reach out to me to coordinate efforts: hey at vicente.services
+ >
+ > And, yes, this is 100% a legit project. :)
+
+ **Build reliable LLM applications in idiomatic Ruby using composable, type-safe modules.**
+
+ DSPy.rb is the Ruby-first surgical port of Stanford's [DSPy framework](https://github.com/stanfordnlp/dspy). It delivers structured LLM programming, prompt engineering, and context engineering in the language we love. Instead of wrestling with brittle prompt strings, you define typed signatures in idiomatic Ruby and compose workflows and agents that actually behave.
+
+ **Prompts are just functions.** Traditional prompting is like writing code with string concatenation: it works until it doesn't. DSPy.rb brings you the programming approach pioneered by [dspy.ai](https://dspy.ai/): define modular signatures and let the framework deal with the messy bits.
+
+ While we implement the same signatures, predictors, and optimization algorithms as the original library, DSPy.rb leans hard into Ruby conventions with Sorbet-based typing, ReAct loops, and production-ready integrations like non-blocking OpenTelemetry instrumentation.
+
+ **What do you get?** Ruby LLM applications that scale and don't break when you sneeze.
+
+ Check the [examples](examples/) and take them for a spin!
+
+ ## Your First DSPy Program
+
+ ### Installation
+
+ Add to your Gemfile:
+
+ ```ruby
+ gem 'dspy'
+ ```
+
+ then run:
+
+ ```bash
+ bundle install
+ ```
+
+ ### Your First Reliable Predictor
+
+ ```ruby
+ require 'dspy'
+
+ # Configure DSPy globally to use your fave LLM (you can override per predictor).
+ DSPy.configure do |c|
+   c.lm = DSPy::LM.new('openai/gpt-4o-mini',
+                       api_key: ENV['OPENAI_API_KEY'],
+                       structured_outputs: true) # Enable OpenAI's native JSON mode
+ end
+
+ # Define a signature for sentiment classification - instead of writing a full prompt!
+ class Classify < DSPy::Signature
+   description "Classify sentiment of a given sentence." # sets the goal of the underlying prompt
+
+   class Sentiment < T::Enum
+     enums do
+       Positive = new('positive')
+       Negative = new('negative')
+       Neutral = new('neutral')
+     end
+   end
+
+   # Structured Inputs: makes sure you are sending only valid prompt inputs to your model.
+   input do
+     const :sentence, String, description: 'The sentence to analyze'
+   end
+
+   # Structured Outputs: your predictor will validate the output of the model too.
+   output do
+     const :sentiment, Sentiment, description: 'The sentiment of the sentence'
+     const :confidence, Float, description: 'A number between 0.0 and 1.0'
+   end
+ end
+
+ # Wire it to the simplest prompting technique: a prediction loop.
+ classify = DSPy::Predict.new(Classify)
+ # It may raise an error if you mess up the inputs or your LLM messes up the outputs.
+ result = classify.call(sentence: "This book was super fun to read!")
+
+ puts result.sentiment # => #<Sentiment::Positive>
+ puts result.confidence # => 0.85
+ ```
+
+ Save this as `examples/first_predictor.rb` and run it with:
+
+ ```bash
+ bundle exec ruby examples/first_predictor.rb
+ ```
+
+ ### Sibling Gems
+
+ DSPy.rb ships multiple gems from this monorepo so you can opt into features with heavier dependency trees (e.g., datasets pull in Polars/Arrow, MIPROv2 requires `numo-*` BLAS bindings) only when you need them. Add these alongside `dspy`:
+
+ | Gem | Description | Status |
+ | --- | --- | --- |
+ | `dspy-schema` | Exposes `DSPy::TypeSystem::SorbetJsonSchema` for downstream reuse. (Still required by the core `dspy` gem; extraction lets other projects depend on it directly.) | **Stable** (v1.0.0) |
+ | `dspy-code_act` | Think-Code-Observe agents that synthesize and execute Ruby safely. (Add the gem or set `DSPY_WITH_CODE_ACT=1` before requiring `dspy/code_act`.) | **Stable** (v1.0.0) |
+ | `dspy-datasets` | Dataset helpers plus Parquet/Polars tooling for richer evaluation corpora. (Toggle via `DSPY_WITH_DATASETS`.) | **Stable** (v1.0.0) |
+ | `dspy-evals` | High-throughput evaluation harness with metrics, callbacks, and regression fixtures. (Toggle via `DSPY_WITH_EVALS`.) | **Stable** (v1.0.0) |
+ | `dspy-miprov2` | Bayesian optimization + Gaussian Process backend for the MIPROv2 teleprompter. (Install or export `DSPY_WITH_MIPROV2=1` before requiring the teleprompter.) | **Stable** (v1.0.0) |
+ | `dspy-gepa` | `DSPy::Teleprompt::GEPA`, reflection loops, experiment tracking, telemetry adapters. (Install or set `DSPY_WITH_GEPA=1`.) | **Stable** (v1.0.0) |
+ | `gepa` | GEPA optimizer core (Pareto engine, telemetry, reflective proposer). | **Stable** (v1.0.0) |
+ | `dspy-o11y` | Core observability APIs: `DSPy::Observability`, async span processor, observation types. (Install or set `DSPY_WITH_O11Y=1`.) | **Stable** (v1.0.0) |
+ | `dspy-o11y-langfuse` | Auto-configures DSPy observability to stream spans to Langfuse via OTLP. (Install or set `DSPY_WITH_O11Y_LANGFUSE=1`.) | **Stable** (v1.0.0) |
+ | `dspy-deep_search` | Production DeepSearch loop with Exa-backed search/read, token budgeting, and instrumentation (Issue #163). | **Stable** (v1.0.0) |
+ | `dspy-deep_research` | Planner/QA orchestration atop DeepSearch plus the memory supervisor used by the CLI example. | **Stable** (v1.0.0) |
+
+ Set the matching `DSPY_WITH_*` environment variables (see `Gemfile`) to include or exclude each sibling gem when running Bundler locally (for example `DSPY_WITH_GEPA=1` or `DSPY_WITH_O11Y_LANGFUSE=1`). Refer to `adr/013-dependency-tree.md` for the full dependency map and roadmap.
+
+ ### Access to 200+ Models Across 5 Providers
+
+ DSPy.rb provides unified access to major LLM providers with provider-specific optimizations:
+
+ ```ruby
+ # OpenAI (GPT-4, GPT-4o, GPT-4o-mini, GPT-5, etc.)
+ DSPy.configure do |c|
+   c.lm = DSPy::LM.new('openai/gpt-4o-mini',
+                       api_key: ENV['OPENAI_API_KEY'],
+                       structured_outputs: true) # Native JSON mode
+ end
+
+ # Google Gemini (Gemini 1.5 Pro, Flash, Gemini 2.0, etc.)
+ DSPy.configure do |c|
+   c.lm = DSPy::LM.new('gemini/gemini-2.5-flash',
+                       api_key: ENV['GEMINI_API_KEY'],
+                       structured_outputs: true) # Native structured outputs
+ end
+
+ # Anthropic Claude (Claude 3.5, Claude 4, etc.)
+ DSPy.configure do |c|
+   c.lm = DSPy::LM.new('anthropic/claude-sonnet-4-5-20250929',
+                       api_key: ENV['ANTHROPIC_API_KEY'],
+                       structured_outputs: true) # Tool-based extraction (default)
+ end
+
+ # Ollama - Run any local model (Llama, Mistral, Gemma, etc.)
+ DSPy.configure do |c|
+   c.lm = DSPy::LM.new('ollama/llama3.2') # Free, runs locally, no API key needed
+ end
+
+ # OpenRouter - Access to 200+ models from multiple providers
+ DSPy.configure do |c|
+   c.lm = DSPy::LM.new('openrouter/deepseek/deepseek-chat-v3.1:free',
+                       api_key: ENV['OPENROUTER_API_KEY'])
+ end
+ ```
+
+ ## What You Get
+
+ **Developer Experience:** Official clients, multimodal coverage, and observability baked in.
+ <details>
+ <summary>Expand for everything included</summary>
+
+ - LLM provider support using official Ruby clients:
+   - [OpenAI Ruby](https://github.com/openai/openai-ruby) with vision model support
+   - [Anthropic Ruby SDK](https://github.com/anthropics/anthropic-sdk-ruby) with multimodal capabilities
+   - [Google Gemini API](https://ai.google.dev/) with native structured outputs
+   - [Ollama](https://ollama.com/) via OpenAI compatibility layer for local models
+ - **Multimodal Support** - Complete image analysis with DSPy::Image, type-safe bounding boxes, vision-capable models
+ - Runtime type checking with [Sorbet](https://sorbet.org/) including T::Enum and union types
+ - Type-safe tool definitions for ReAct agents
+ - Comprehensive instrumentation and observability
+ </details>
+
+ **Core Building Blocks:** Predictors, agents, and pipelines wired through type-safe signatures.
+ <details>
+ <summary>Expand for everything included</summary>
+
+ - **Signatures** - Define input/output schemas using Sorbet types with T::Enum and union type support
+ - **Predict** - LLM completion with structured data extraction and multimodal support
+ - **Chain of Thought** - Step-by-step reasoning for complex problems with automatic prompt optimization
+ - **ReAct** - Tool-using agents with type-safe tool definitions and error recovery
+ - **Module Composition** - Combine multiple LLM calls into production-ready workflows
+ </details>
+
+ **Optimization & Evaluation:** Treat prompt optimization like a real ML workflow.
+ <details>
+ <summary>Expand for everything included</summary>
+
+ - **Prompt Objects** - Manipulate prompts as first-class objects instead of strings
+ - **Typed Examples** - Type-safe training data with automatic validation
+ - **Evaluation Framework** - Advanced metrics beyond simple accuracy with error-resilient pipelines
+ - **MIPROv2 Optimization** - Advanced Bayesian optimization with Gaussian Processes, multiple optimization strategies, auto-config presets, and storage persistence
+ </details>
+
+ **Production Features:** Hardened behaviors for teams shipping actual products.
+ <details>
+ <summary>Expand for everything included</summary>
+
+ - **Reliable JSON Extraction** - Native structured outputs for OpenAI and Gemini, Anthropic tool-based extraction, and automatic strategy selection with fallback
+ - **Type-Safe Configuration** - Strategy enums with automatic provider optimization (Strict/Compatible modes)
+ - **Smart Retry Logic** - Progressive fallback with exponential backoff for handling transient failures
+ - **Zero-Config Langfuse Integration** - Set env vars and get automatic OpenTelemetry traces in Langfuse
+ - **Performance Caching** - Schema and capability caching for faster repeated operations
+ - **File-based Storage** - Optimization result persistence with versioning
+ - **Structured Logging** - JSON and key=value formats with span tracking
+ </details>
+
+ ## Recent Achievements
+
+ DSPy.rb has gone from experimental to production-ready in three fast releases.
+ <details>
+ <summary>Expand for the full changelog highlights</summary>
+
+ ### Foundation
+ - ✅ **JSON Parsing Reliability** - Native OpenAI structured outputs with adaptive retry logic and schema-aware fallbacks
+ - ✅ **Type-Safe Strategy Configuration** - Provider-optimized strategy selection and enum-backed optimizer presets
+ - ✅ **Core Module System** - Predict, ChainOfThought, ReAct with type safety (add `dspy-code_act` for Think-Code-Observe agents)
+ - ✅ **Production Observability** - OpenTelemetry, New Relic, and Langfuse integration
+ - ✅ **Advanced Optimization** - MIPROv2 with Bayesian optimization, Gaussian Processes, and multi-mode search
+
+ ### Recent Advances
+ - ✅ **MIPROv2 ADE Integrity (v0.29.1)** - Stratified train/val/test splits, honest precision accounting, and enum-driven `--auto` presets with integration coverage
+ - ✅ **Instruction Deduplication (v0.29.1)** - Candidate generation now filters repeated programs so optimization logs highlight unique strategies
+ - ✅ **GEPA Teleprompter (v0.29.0)** - Genetic-Pareto reflective prompt evolution with merge proposer scheduling, reflective mutation, and ADE demo parity
+ - ✅ **Optimizer Utilities Parity (v0.29.0)** - Bootstrap strategies, dataset summaries, and Layer 3 utilities unlock multi-predictor programs on Ruby
+ - ✅ **Observability Hardening (v0.29.0)** - OTLP exporter runs on a single-thread executor, preventing frozen SSL contexts without blocking spans
+ - ✅ **Documentation Refresh (v0.29.x)** - New GEPA guide plus ADE optimization docs covering presets, stratified splits, and error-handling defaults
+ </details>
+
+ **Current Focus Areas:** Closing the loop on production patterns and community adoption ahead of v1.0.
+ <details>
+ <summary>Expand for the roadmap</summary>
+
+ ### Production Readiness
+ - 🚧 **Production Patterns** - Real-world usage validation and performance optimization
+ - 🚧 **Ruby Ecosystem Integration** - Rails integration, Sidekiq compatibility, deployment patterns
+
+ ### Community & Adoption
+ - 🚧 **Community Examples** - Real-world applications and case studies
+ - 🚧 **Contributor Experience** - Making it easier to contribute and extend
+ - 🚧 **Performance Benchmarks** - Comparative analysis vs other frameworks
+ </details>
+
+ **v1.0 Philosophy:** v1.0 lands after battle-testing, not checkbox bingo. The API is already stable; the milestone marks production confidence.
+
+ ## Documentation
+
+ 📖 **[Complete Documentation Website](https://vicentereig.github.io/dspy.rb/)**
+
+ ### LLM-Friendly Documentation
+
+ For LLMs and AI assistants working with DSPy.rb:
+ - **[llms.txt](https://vicentereig.github.io/dspy.rb/llms.txt)** - Concise reference optimized for LLMs
+ - **[llms-full.txt](https://vicentereig.github.io/dspy.rb/llms-full.txt)** - Comprehensive API documentation
+
+ ### Getting Started
+ - **[Installation & Setup](docs/src/getting-started/installation.md)** - Detailed installation and configuration
+ - **[Quick Start Guide](docs/src/getting-started/quick-start.md)** - Your first DSPy programs
+ - **[Core Concepts](docs/src/getting-started/core-concepts.md)** - Understanding signatures, predictors, and modules
+
+ ### Prompt Engineering
+ - **[Signatures & Types](docs/src/core-concepts/signatures.md)** - Define typed interfaces for LLM operations
+ - **[Predictors](docs/src/core-concepts/predictors.md)** - Predict, ChainOfThought, ReAct, and more
+ - **[Modules & Pipelines](docs/src/core-concepts/modules.md)** - Compose complex multi-stage workflows
+ - **[Multimodal Support](docs/src/core-concepts/multimodal.md)** - Image analysis with vision-capable models
+ - **[Examples & Validation](docs/src/core-concepts/examples.md)** - Type-safe training data
+ - **[Rich Types](docs/src/advanced/complex-types.md)** - Sorbet type integration with automatic coercion for structs, enums, and arrays
+ - **[Composable Pipelines](docs/src/advanced/pipelines.md)** - Manual module composition patterns
+
+ ### Prompt Optimization
+ - **[Evaluation Framework](docs/src/optimization/evaluation.md)** - Advanced metrics beyond simple accuracy
+ - **[Prompt Optimization](docs/src/optimization/prompt-optimization.md)** - Manipulate prompts as objects
+ - **[MIPROv2 Optimizer](docs/src/optimization/miprov2.md)** - Advanced Bayesian optimization with Gaussian Processes
+ - **[GEPA Optimizer](docs/src/optimization/gepa.md)** *(beta)* - Reflective mutation with optional reflection LMs
+
+ ### Context Engineering
+ - **[Tools](docs/src/core-concepts/toolsets.md)** - Tool-wielding agents
+ - **[Agentic Memory](docs/src/core-concepts/memory.md)** - Memory Tools & Agentic Loops
+ - **[RAG Patterns](docs/src/advanced/rag.md)** - Manual RAG implementation with external services
+
+ ### Production Features
+ - **[Observability](docs/src/production/observability.md)** - Zero-config Langfuse integration with a dedicated export worker that never blocks your LLMs
+ - **[Storage System](docs/src/production/storage.md)** - Persistence and optimization result storage
+ - **[Custom Metrics](docs/src/advanced/custom-metrics.md)** - Proc-based evaluation logic
+
+ ## License
+ This project is licensed under the MIT License.
@@ -0,0 +1,40 @@
+ # DSPy OpenAI adapter gem
+
+ `dspy-openai` packages the OpenAI-compatible adapters for DSPy.rb so we can keep the core `dspy` gem lean while still talking to GPT models (and any OpenAI-compatible endpoint). Install it whenever your project needs to invoke `openai/*`, `openrouter/*`, or `ollama/*` models through DSPy.
+
+ ## When you need it
+ - You call `DSPy::LM.new` with a model id that starts with `openai/`, `openrouter/`, or `ollama/`.
+ - You want to take advantage of structured outputs, streaming, or multimodal (vision) features exposed by OpenAI's API.
+
+ If your project only uses non-OpenAI providers (e.g. Anthropic or Gemini) you can omit this gem entirely.
+
+ ## Installation
+ Add the gem next to your `dspy` dependency and install Bundler dependencies:
+
+ ```ruby
+ # Gemfile
+ gem 'dspy'
+ gem 'dspy-openai'
+ ```
+
+ ```sh
+ bundle install
+ ```
+
+ The gem depends on the official `openai` Ruby SDK (`~> 0.17`). The adapter checks this at load time and will raise if an incompatible version, or the older `ruby-openai` gem, is detected.
+
+ ## Basic usage
+
+ ```ruby
+ require 'dspy'
+ # No need to explicitly require 'dspy/openai'
+
+ lm = DSPy::LM.new('openai/gpt-4o-mini', api_key: ENV.fetch('OPENAI_API_KEY'))
+ ```
+
+ ## Working with alternate providers
+ - **OpenRouter**: instantiate with `DSPy::LM.new('openrouter/x-ai/grok-4-fast:free', api_key: ENV['OPENROUTER_API_KEY'])`. Any OpenAI-compatible model exposed by OpenRouter will work.
+ - **Ollama**: use `DSPy::LM.new('ollama/llama3.2', api_key: nil)`.
+
+ All three adapters share the same request handling, structured output support, and error reporting, so you can swap providers without changing higher-level DSPy code.
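The provider-by-prefix dispatch described above can be illustrated with a toy router. This is a sketch, not DSPy.rb's actual API: `ADAPTERS` and `adapter_for` are hypothetical names standing in for the gem's internal model-id resolution.

```ruby
# Illustrative only: route a DSPy-style model id ("provider/model") to an adapter.
# ADAPTERS and adapter_for are hypothetical, not part of the dspy-openai API.
ADAPTERS = {
  'openai'     => :openai_adapter,
  'openrouter' => :open_router_adapter,
  'ollama'     => :ollama_adapter
}.freeze

def adapter_for(model_id)
  # Split only on the first '/', so OpenRouter ids like
  # "openrouter/x-ai/grok-4-fast:free" keep their full model path.
  provider, model = model_id.split('/', 2)
  adapter = ADAPTERS.fetch(provider) do
    raise ArgumentError, "unsupported provider: #{provider}"
  end
  [adapter, model]
end

p adapter_for('openai/gpt-4o-mini')
p adapter_for('openrouter/x-ai/grok-4-fast:free')
```

The `split('/', 2)` is the important detail: it keeps multi-segment OpenRouter model paths intact while still isolating the provider prefix.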
@@ -0,0 +1,32 @@
+ # frozen_string_literal: true
+
+ require 'dspy/lm/errors'
+
+ module DSPy
+   module OpenAI
+     class Guardrails
+       SUPPORTED_OPENAI_VERSIONS = "~> 0.17".freeze
+
+       def self.ensure_openai_installed!
+         require 'openai'
+
+         spec = Gem.loaded_specs["openai"]
+         unless spec && Gem::Requirement.new(SUPPORTED_OPENAI_VERSIONS).satisfied_by?(spec.version)
+           msg = <<~MSG
+             DSPy requires the official `openai` gem #{SUPPORTED_OPENAI_VERSIONS}.
+             Please install or upgrade it with `bundle add openai --version "#{SUPPORTED_OPENAI_VERSIONS}"`.
+           MSG
+           raise DSPy::LM::UnsupportedVersionError, msg
+         end
+
+         if Gem.loaded_specs["ruby-openai"]
+           msg = <<~MSG
+             DSPy uses the official `openai` gem.
+             Please remove the `ruby-openai` gem to avoid namespace conflicts.
+           MSG
+           raise DSPy::LM::MissingOfficialSDKError, msg
+         end
+       end
+     end
+   end
+ end
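The pessimistic version constraint the guardrail relies on is plain RubyGems stdlib behavior, and can be exercised standalone:

```ruby
require 'rubygems' # Gem::Requirement and Gem::Version (loaded by default on modern Rubies)

requirement = Gem::Requirement.new("~> 0.17")

# "~> 0.17" accepts any release >= 0.17 and < 1.0.
puts requirement.satisfied_by?(Gem::Version.new("0.17.3")) # => true
puts requirement.satisfied_by?(Gem::Version.new("0.18.0")) # => true
puts requirement.satisfied_by?(Gem::Version.new("1.0.0"))  # => false
```

This is why the check above rejects both too-old SDKs and a hypothetical 1.x release with breaking changes.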
@@ -0,0 +1,78 @@
+ # frozen_string_literal: true
+
+ require 'openai'
+ require 'dspy/openai/lm/adapters/openai_adapter'
+
+ module DSPy
+   module OpenAI
+     module LM
+       module Adapters
+         class OllamaAdapter < OpenAIAdapter
+           DEFAULT_BASE_URL = 'http://localhost:11434/v1'
+
+           def initialize(model:, api_key: nil, base_url: nil, structured_outputs: true)
+             # Ollama doesn't require an API key for local instances,
+             # but may need one for remote/protected instances.
+             api_key ||= 'ollama' # OpenAI client requires a non-empty key
+             base_url ||= DEFAULT_BASE_URL
+
+             # Store base_url before calling super
+             @base_url = base_url
+
+             # Don't call parent's initialize; do it manually to control client creation
+             @model = model
+             @api_key = api_key
+             @structured_outputs_enabled = structured_outputs
+             validate_configuration!
+
+             # Create client with custom base URL
+             @client = ::OpenAI::Client.new(
+               api_key: @api_key,
+               base_url: @base_url
+             )
+           end
+
+           def chat(messages:, signature: nil, response_format: nil, &block)
+             # For Ollama, we need to be more lenient with structured outputs,
+             # as it may not fully support OpenAI's response_format spec.
+             begin
+               super
+             rescue => e
+               # If structured output fails, retry with enhanced prompting
+               if @structured_outputs_enabled && signature && e.message.include?('response_format')
+                 DSPy.logger.debug("Ollama structured output failed, falling back to enhanced prompting")
+                 @structured_outputs_enabled = false
+                 retry
+               else
+                 raise
+               end
+             end
+           end
+
+           private
+
+           def validate_configuration!
+             super
+             # Additional Ollama-specific validation could go here
+           end
+
+           def validate_api_key!(api_key, provider)
+             # For Ollama, the API key is optional for local instances.
+             # Only validate when the base URL looks remote.
+             if @base_url && !@base_url.include?('localhost') && !@base_url.include?('127.0.0.1')
+               super
+             end
+           end
+
+           # Ollama may have different model support for structured outputs
+           def supports_structured_outputs?
+             # For now, assume all Ollama models support basic JSON mode
+             # but may not support the full OpenAI structured output spec.
+             true
+           end
+         end
+       end
+     end
+   end
+ end
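The disable-and-retry flow in `OllamaAdapter#chat` reduces to a boolean flag plus Ruby's `retry`, which re-runs the guarded body from the top. Here is a self-contained sketch of that pattern; `FallbackCaller` and `call_model` are illustrative stand-ins for the adapter and the real API call, not dspy-openai classes.

```ruby
# Sketch of the "fall back to plain prompting" retry used by the Ollama adapter.
# FallbackCaller and call_model are hypothetical stand-ins for illustration.
class FallbackCaller
  attr_reader :attempts

  def initialize
    @structured = true
    @attempts = 0
  end

  def call_model
    @attempts += 1
    # Simulate a backend that rejects OpenAI-style response_format.
    raise "unsupported response_format" if @structured
    "plain-text answer"
  end

  def chat
    call_model
  rescue => e
    if @structured && e.message.include?('response_format')
      @structured = false # downgrade once, then retry the whole call
      retry
    else
      raise
    end
  end
end

client = FallbackCaller.new
puts client.chat     # => plain-text answer
puts client.attempts # => 2
```

Because the flag flips before `retry`, the second pass cannot trigger the same rescue branch again, so the fallback runs at most once per call.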
@@ -0,0 +1,196 @@
1
+ # frozen_string_literal: true
2
+
3
+ require 'openai'
4
+ require_relative '../schema_converter'
5
+ require 'dspy/lm/vision_models'
6
+ require 'dspy/lm/adapter'
7
+
8
+ require 'dspy/openai/guardrails'
9
+ DSPy::OpenAI::Guardrails.ensure_openai_installed!
10
+
11
+ module DSPy
12
+ module OpenAI
13
+ module LM
14
+ module Adapters
15
+ class OpenAIAdapter < DSPy::LM::Adapter
16
+ def initialize(model:, api_key:, structured_outputs: false)
17
+ super(model: model, api_key: api_key)
18
+ validate_api_key!(api_key, 'openai')
19
+ @client = ::OpenAI::Client.new(api_key: api_key)
20
+ @structured_outputs_enabled = structured_outputs
21
+ end
22
+
23
+ def chat(messages:, signature: nil, response_format: nil, &block)
24
+ normalized_messages = normalize_messages(messages)
25
+
26
+ # Validate vision support if images are present
27
+ if contains_images?(normalized_messages)
28
+ DSPy::LM::VisionModels.validate_vision_support!('openai', model)
29
+ # Convert messages to OpenAI format with proper image handling
30
+ normalized_messages = format_multimodal_messages(normalized_messages)
31
+ end
32
+
33
+ # Handle O1 model restrictions - convert system messages to user messages
34
+ if o1_model?(model)
35
+ normalized_messages = handle_o1_messages(normalized_messages)
36
+ end
37
+
38
+ request_params = default_request_params.merge(
39
+ messages: normalized_messages
40
+ )
41
+
42
+ # Add temperature based on model capabilities
43
+ unless o1_model?(model)
44
+ temperature = case model
45
+ when /^gpt-5/, /^gpt-4o/
46
+ 1.0 # GPT-5 and GPT-4o models only support default temperature of 1.0
47
+ else
48
+ 0.0 # Near-deterministic for other models (0.0 no longer universally supported)
49
+ end
50
+ request_params[:temperature] = temperature
51
+ end
52
+
53
+ # Add response format if provided by strategy
54
+ if response_format
55
+ request_params[:response_format] = response_format
56
+ elsif @structured_outputs_enabled && signature && supports_structured_outputs?
57
+ # Legacy behavior for backward compatibility
58
+ response_format = DSPy::OpenAI::LM::SchemaConverter.to_openai_format(signature)
59
+ request_params[:response_format] = response_format
60
+ end
61
+
62
+ # Add streaming if block provided
63
+ if block_given?
64
+ request_params[:stream] = proc do |chunk, _bytesize|
65
+ block.call(chunk) if chunk.dig("choices", 0, "delta", "content")
66
+ end
67
+ end
68
+
69
+ begin
70
+ response = @client.chat.completions.create(**request_params)
71
+
72
+ if response.respond_to?(:error) && response.error
73
+ raise DSPy::LM::AdapterError, "OpenAI API error: #{response.error}"
74
+ end
75
+
76
+ choice = response.choices.first
77
+ message = choice.message
78
+ content = message.content
79
+ usage = response.usage
80
+
81
+ # Handle structured output refusals
82
+ if message.respond_to?(:refusal) && message.refusal
83
+ raise DSPy::LM::AdapterError, "OpenAI refused to generate output: #{message.refusal}"
84
+ end
85
+
86
+ # Convert usage data to typed struct
87
+ usage_struct = DSPy::LM::UsageFactory.create('openai', usage)
88
+
89
+ # Create typed metadata
90
+ metadata = DSPy::LM::ResponseMetadataFactory.create('openai', {
91
+ model: model,
92
+ response_id: response.id,
93
+ created: response.created,
94
+ structured_output: @structured_outputs_enabled && signature && supports_structured_outputs?,
95
+ system_fingerprint: response.system_fingerprint,
96
+ finish_reason: choice.finish_reason
97
+ })
98
+
99
+ DSPy::LM::Response.new(
100
+ content: content,
101
+ usage: usage_struct,
102
+ metadata: metadata
103
+ )
104
+ rescue => e
105
+ # Check for specific error types and messages
106
+ error_msg = e.message.to_s
107
+
108
+ # Try to parse error body if it looks like JSON
109
+ error_body = if error_msg.start_with?('{')
110
+ JSON.parse(error_msg) rescue nil
111
+ elsif e.respond_to?(:response) && e.response
112
+ e.response[:body] rescue nil
113
+ end
114
+
115
+ # Check for specific image-related errors
116
+ if error_msg.include?('image_parse_error') || error_msg.include?('unsupported image')
117
+ raise DSPy::LM::AdapterError, "Image processing failed: #{error_msg}. Ensure your image is a valid PNG, JPEG, GIF, or WebP format and under 5MB."
118
+ elsif error_msg.include?('rate') && error_msg.include?('limit')
119
+ raise DSPy::LM::AdapterError, "OpenAI rate limit exceeded: #{error_msg}. Please wait and try again."
120
+ elsif error_msg.include?('authentication') || error_msg.include?('API key') || error_msg.include?('Unauthorized')
121
+ raise DSPy::LM::AdapterError, "OpenAI authentication failed: #{error_msg}. Check your API key."
122
+ elsif error_body && error_body.dig('error', 'message')
123
+ raise DSPy::LM::AdapterError, "OpenAI API error: #{error_body.dig('error', 'message')}"
124
+ else
125
+ # Generic error handling
126
+ raise DSPy::LM::AdapterError, "OpenAI adapter error: #{e.message}"
127
+ end
128
+ end
129
+ end
130
+
131
+ protected
132
+
133
+ # Allow subclasses to override request params (add headers, etc)
134
+ def default_request_params
135
+ {
136
+ model: model
137
+ }
138
+ end
139
+
140
+ private
141
+
142
+ def supports_structured_outputs?
143
+ DSPy::OpenAI::LM::SchemaConverter.supports_structured_outputs?(model)
144
+ end
145
+
146
+ def format_multimodal_messages(messages)
147
+ messages.map do |msg|
148
+ if msg[:content].is_a?(Array)
149
+ # Convert multimodal content to OpenAI format
150
+ formatted_content = msg[:content].map do |item|
151
+ case item[:type]
152
+ when 'text'
153
+ { type: 'text', text: item[:text] }
154
+ when 'image'
155
+ # Validate image compatibility before formatting
156
+ item[:image].validate_for_provider!('openai')
157
+ item[:image].to_openai_format
158
+ else
159
+ item
160
+ end
161
+ end
162
+
163
+ {
164
+ role: msg[:role],
165
+ content: formatted_content
166
+ }
167
+ else
168
+ msg
169
+ end
170
+ end
171
+ end
172
+
173
+ # Check if model is an O1 reasoning model (includes O1, O3, O4 series)
174
+ def o1_model?(model_name)
175
+ model_name.match?(/^o[134](-.*)?$/)
176
+ end
177
+
178
+ # Handle O1 model message restrictions
179
+ def handle_o1_messages(messages)
180
+ messages.map do |msg|
181
+ # Convert system messages to user messages for O1 models
182
+ if msg[:role] == 'system'
183
+ {
184
+ role: 'user',
185
+ content: "Instructions: #{msg[:content]}"
186
+ }
187
+ else
188
+ msg
189
+ end
190
+ end
191
+ end
192
+ end
193
+ end
194
+ end
195
+ end
196
+ end
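The two reasoning-model helpers above can be exercised standalone. This is a re-implementation sketch (`reasoning_model?` and `downgrade_system_messages` are hypothetical names), runnable in isolation:

```ruby
# Matches the o1, o3, and o4 families, bare or with a suffix such as "o1-mini".
def reasoning_model?(model_name)
  model_name.match?(/^o[134](-.*)?$/)
end

# Reasoning models reject system messages, so they are rewritten as user turns.
def downgrade_system_messages(messages)
  messages.map do |msg|
    next msg unless msg[:role] == 'system'
    { role: 'user', content: "Instructions: #{msg[:content]}" }
  end
end

reasoning_model?('o1-mini')  # => true
reasoning_model?('gpt-4o')   # => false
```

The anchored regex is what keeps `gpt-4o` from matching: the `o` must be the first character of the model name.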
@@ -0,0 +1,73 @@
1
+ # frozen_string_literal: true
2
+
3
+ require 'openai'
4
+ require 'dspy/openai/lm/adapters/openai_adapter'
5
+
6
+ module DSPy
7
+ module OpenAI
8
+ module LM
9
+ module Adapters
10
+ class OpenRouterAdapter < OpenAIAdapter
11
+ BASE_URL = 'https://openrouter.ai/api/v1'
12
+
13
+ def initialize(model:, api_key: nil, structured_outputs: true, http_referrer: nil, x_title: nil)
14
+ # Skip the parent initializer so we can control client creation ourselves
15
+ @model = model
16
+ @api_key = api_key
17
+ @structured_outputs_enabled = structured_outputs
18
+
19
+ @http_referrer = http_referrer
20
+ @x_title = x_title
21
+
22
+ validate_configuration!
23
+
24
+ # Create client with custom base URL
25
+ @client = ::OpenAI::Client.new(
26
+ api_key: @api_key,
27
+ base_url: BASE_URL
28
+ )
29
+ end
30
+
31
+ def chat(messages:, signature: nil, response_format: nil, &block)
32
+ # For OpenRouter, we need to be more lenient with structured outputs
33
+ # as the model behind it may not fully support OpenAI's response_format spec
34
+ begin
35
+ super
36
+ rescue => e
37
+ # If structured output fails, retry with enhanced prompting
38
+ if @structured_outputs_enabled && signature && e.message.include?('response_format')
39
+ DSPy.logger.debug("OpenRouter structured output failed, falling back to enhanced prompting")
40
+ @structured_outputs_enabled = false
41
+ retry
42
+ else
43
+ raise
44
+ end
45
+ end
46
+ end
47
+
48
+ protected
49
+
50
+ # Add any OpenRouter-specific headers to all requests
51
+ def default_request_params
52
+ headers = {
53
+ 'X-Title' => @x_title,
54
+ 'HTTP-Referer' => @http_referrer
55
+ }.compact
56
+
57
+ upstream_params = super
58
+ upstream_params.merge!(request_options: { extra_headers: headers }) if headers.any?
59
+ upstream_params
60
+ end
61
+
62
+ private
63
+
64
+ def supports_structured_outputs?
65
+ # Different models behind OpenRouter may have different capabilities
66
+ # For now, we rely on whatever was passed to the constructor
67
+ @structured_outputs_enabled
68
+ end
69
+ end
70
+ end
71
+ end
72
+ end
73
+ end
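The attribution-header merge in `default_request_params` above can be sketched standalone (`build_params` is a hypothetical stand-in):

```ruby
# Sketch of the OpenRouter header merge: nil headers are dropped via compact,
# and request_options is only added when at least one header survives.
def build_params(model:, x_title: nil, http_referrer: nil)
  headers = { 'X-Title' => x_title, 'HTTP-Referer' => http_referrer }.compact
  params = { model: model }
  params[:request_options] = { extra_headers: headers } if headers.any?
  params
end
```

When both attribution values are nil, `compact` empties the hash and the `request_options` key is omitted entirely, so the request body stays identical to the plain OpenAI adapter's.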
@@ -0,0 +1,362 @@
1
+ # frozen_string_literal: true
2
+
3
+ require "sorbet-runtime"
4
+
5
+ module DSPy
6
+ module OpenAI
7
+ module LM
8
+ # Converts DSPy signatures to OpenAI structured output format
9
+ class SchemaConverter
10
+ extend T::Sig
11
+
12
+ # Models that support structured outputs as of July 2025
13
+ STRUCTURED_OUTPUT_MODELS = T.let([
14
+ "gpt-4o-mini",
15
+ "gpt-4o-2024-08-06",
16
+ "gpt-4o",
17
+ "gpt-4-turbo",
18
+ "gpt-4-turbo-2024-04-09",
19
+ "gpt-5",
20
+ "gpt-5-pro",
21
+ "gpt-5-mini",
22
+ "gpt-5-nano",
23
+ "gpt-5-2025-08-07"
24
+ ].freeze, T::Array[String])
25
+
26
+ sig { params(signature_class: T.class_of(DSPy::Signature), name: T.nilable(String), strict: T::Boolean).returns(T::Hash[Symbol, T.untyped]) }
27
+ def self.to_openai_format(signature_class, name: nil, strict: true)
28
+ # Get the output JSON schema from the signature class
29
+ output_schema = signature_class.output_json_schema
30
+
31
+ # Convert oneOf to anyOf where safe, or raise error for unsupported cases
32
+ output_schema = convert_oneof_to_anyof_if_safe(output_schema)
33
+
34
+ # Build the complete schema with OpenAI-specific modifications
35
+ dspy_schema = {
36
+ "$schema": "http://json-schema.org/draft-06/schema#",
37
+ type: "object",
38
+ properties: output_schema[:properties] || {},
39
+ required: openai_required_fields(signature_class, output_schema)
40
+ }
41
+
42
+ # Generate a schema name if not provided
43
+ schema_name = name || generate_schema_name(signature_class)
44
+
45
+ # Remove the $schema field as OpenAI doesn't use it
46
+ openai_schema = dspy_schema.except(:$schema)
47
+
48
+ # Add additionalProperties: false for strict mode and fix nested struct schemas
49
+ if strict
50
+ openai_schema = add_additional_properties_recursively(openai_schema)
51
+ openai_schema = fix_nested_struct_required_fields(openai_schema)
52
+ end
53
+
54
+ # Wrap in OpenAI's required format
55
+ {
56
+ type: "json_schema",
57
+ json_schema: {
58
+ name: schema_name,
59
+ strict: strict,
60
+ schema: openai_schema
61
+ }
62
+ }
63
+ end
64
+
65
+ # Convert oneOf to anyOf if safe (discriminated unions), otherwise raise error
66
+ sig { params(schema: T.untyped).returns(T.untyped) }
67
+ def self.convert_oneof_to_anyof_if_safe(schema)
68
+ return schema unless schema.is_a?(Hash)
69
+
70
+ result = schema.dup
71
+
72
+ # Check if this schema has oneOf that we can safely convert
73
+ if result[:oneOf]
74
+ if all_have_discriminators?(result[:oneOf])
75
+ # Safe to convert - discriminators ensure mutual exclusivity
76
+ result[:anyOf] = result.delete(:oneOf).map { |s| convert_oneof_to_anyof_if_safe(s) }
77
+ else
78
+ # Unsafe conversion - raise error
79
+ raise DSPy::UnsupportedSchemaError.new(
80
+ "OpenAI structured outputs do not support oneOf schemas without discriminator fields. " \
81
+ "The schema contains union types that cannot be safely converted to anyOf. " \
82
+ "Please use enhanced_prompting strategy instead or add discriminator fields to union types."
83
+ )
84
+ end
85
+ end
86
+
87
+ # Recursively process nested schemas
88
+ if result[:properties].is_a?(Hash)
89
+ result[:properties] = result[:properties].transform_values { |v| convert_oneof_to_anyof_if_safe(v) }
90
+ end
91
+
92
+ if result[:items].is_a?(Hash)
93
+ result[:items] = convert_oneof_to_anyof_if_safe(result[:items])
94
+ end
95
+
96
+ # Process arrays of schema items
97
+ if result[:items].is_a?(Array)
98
+ result[:items] = result[:items].map { |item|
99
+ item.is_a?(Hash) ? convert_oneof_to_anyof_if_safe(item) : item
100
+ }
101
+ end
102
+
103
+ # Process anyOf arrays (in case there are nested oneOf within anyOf)
104
+ if result[:anyOf].is_a?(Array)
105
+ result[:anyOf] = result[:anyOf].map { |item|
106
+ item.is_a?(Hash) ? convert_oneof_to_anyof_if_safe(item) : item
107
+ }
108
+ end
109
+
110
+ result
111
+ end
112
+
113
+ # Check if all schemas in a oneOf array have discriminator fields (const properties)
114
+ sig { params(schemas: T::Array[T.untyped]).returns(T::Boolean) }
115
+ def self.all_have_discriminators?(schemas)
116
+ schemas.all? do |schema|
117
+ next false unless schema.is_a?(Hash)
118
+ next false unless schema[:properties].is_a?(Hash)
119
+
120
+ # Check if any property has a const value (our discriminator pattern)
121
+ schema[:properties].any? { |_, prop| prop.is_a?(Hash) && prop[:const] }
122
+ end
123
+ end
124
+
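The discriminator test above reduces to a small standalone check (`discriminated?` is a hypothetical name): a `oneOf` branch counts as discriminated when at least one of its properties carries a `const` value.

```ruby
# Standalone sketch of the discriminator detection above.
def discriminated?(schemas)
  schemas.all? do |schema|
    schema.is_a?(Hash) &&
      schema[:properties].is_a?(Hash) &&
      schema[:properties].any? { |_, prop| prop.is_a?(Hash) && prop[:const] }
  end
end

circle = { properties: { shape: { const: 'circle' }, r: { type: 'number' } } }
square = { properties: { shape: { const: 'square' }, side: { type: 'number' } } }
discriminated?([circle, square])                     # => true
discriminated?([circle, { properties: { x: {} } }])  # => false
```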
125
+ sig { params(model: String).returns(T::Boolean) }
126
+ def self.supports_structured_outputs?(model)
127
+ # Extract base model name without provider prefix
128
+ base_model = model.sub(/^openai\//, "")
129
+
130
+ # Check if it's a supported model or a newer version
131
+ STRUCTURED_OUTPUT_MODELS.any? { |supported| base_model.start_with?(supported) }
132
+ end
133
+
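The model check above is a prefix match. A standalone sketch with a trimmed model list (`structured_outputs?` and `SUPPORTED` are hypothetical stand-ins):

```ruby
# Strip an optional "openai/" provider prefix, then accept any listed model
# or a dated variant of it via start_with?.
SUPPORTED = ['gpt-4o-mini', 'gpt-4o', 'gpt-4-turbo'].freeze

def structured_outputs?(model)
  base = model.sub(%r{^openai/}, '')
  SUPPORTED.any? { |m| base.start_with?(m) }
end

structured_outputs?('openai/gpt-4o-2024-08-06')  # => true
structured_outputs?('gpt-3.5-turbo')             # => false
```

Prefix matching is what lets dated snapshots like `gpt-4o-2024-08-06` pass without being listed explicitly; it also means any future model whose name merely starts with a listed name is accepted.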
134
+ sig { params(schema: T::Hash[Symbol, T.untyped]).returns(T::Array[String]) }
135
+ def self.validate_compatibility(schema)
136
+ issues = []
137
+
138
+ # Check for deeply nested objects (OpenAI has depth limits)
139
+ depth = calculate_depth(schema)
140
+ if depth > 5
141
+ issues << "Schema depth (#{depth}) exceeds recommended limit of 5 levels"
142
+ end
143
+
144
+ # Check for unsupported JSON Schema features
145
+ if contains_pattern_properties?(schema)
146
+ issues << "Pattern properties are not supported in OpenAI structured outputs"
147
+ end
148
+
149
+ if contains_conditional_schemas?(schema)
150
+ issues << "Conditional schemas (if/then/else) are not supported"
151
+ end
152
+
153
+ issues
154
+ end
155
+
156
+ private
157
+
158
+ # OpenAI structured outputs requires ALL properties to be in the required array
159
+ # For T.nilable fields without defaults, we warn the user and mark as required
160
+ sig { params(signature_class: T.class_of(DSPy::Signature), output_schema: T::Hash[Symbol, T.untyped]).returns(T::Array[String]) }
161
+ def self.openai_required_fields(signature_class, output_schema)
162
+ all_properties = output_schema[:properties]&.keys || []
163
+ original_required = output_schema[:required] || []
164
+
165
+ # For OpenAI structured outputs, we need ALL properties to be required
166
+ # but warn about T.nilable fields without defaults
167
+ field_descriptors = signature_class.instance_variable_get(:@output_field_descriptors) || {}
168
+
169
+ all_properties.each do |property_name|
170
+ descriptor = field_descriptors[property_name.to_sym]
171
+
172
+ # If field is not originally required and doesn't have a default
173
+ if !original_required.include?(property_name.to_s) && descriptor && !descriptor.has_default
174
+ DSPy.logger.warn(
175
+ "OpenAI structured outputs: T.nilable field '#{property_name}' without default will be marked as required. " \
176
+ "Consider adding a default value or using a different provider for optional fields."
177
+ )
178
+ end
179
+ end
180
+
181
+ # Return all properties as required (OpenAI requirement)
182
+ all_properties.map(&:to_s)
183
+ end
184
+
185
+ # Fix nested struct schemas to include all properties in required array (OpenAI requirement)
186
+ sig { params(schema: T::Hash[Symbol, T.untyped]).returns(T::Hash[Symbol, T.untyped]) }
187
+ def self.fix_nested_struct_required_fields(schema)
188
+ return schema unless schema.is_a?(Hash)
189
+
190
+ result = schema.dup
191
+
192
+ # If this is an object with properties, make all properties required
193
+ if result[:type] == "object" && result[:properties].is_a?(Hash)
194
+ all_property_names = result[:properties].keys.map(&:to_s)
195
+ result[:required] = all_property_names unless result[:required] == all_property_names
196
+ end
197
+
198
+ # Process nested objects recursively
199
+ if result[:properties].is_a?(Hash)
200
+ result[:properties] = result[:properties].transform_values do |prop|
201
+ if prop.is_a?(Hash)
202
+ processed = fix_nested_struct_required_fields(prop)
203
+ # Handle arrays with object items
204
+ if processed[:type] == "array" && processed[:items].is_a?(Hash)
205
+ processed[:items] = fix_nested_struct_required_fields(processed[:items])
206
+ end
207
+ processed
208
+ else
209
+ prop
210
+ end
211
+ end
212
+ end
213
+
214
+ result
215
+ end
216
+
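The normalization above can be sketched as a small recursive pass (`require_all` is a hypothetical stand-in; the array-items special case is folded into one recursion here):

```ruby
# Every object's required array is rewritten to list all of its properties,
# as OpenAI strict mode demands; nested objects and array items are visited too.
def require_all(schema)
  return schema unless schema.is_a?(Hash)
  out = schema.dup
  if out[:type] == 'object' && out[:properties].is_a?(Hash)
    out[:required] = out[:properties].keys.map(&:to_s)
  end
  if out[:properties].is_a?(Hash)
    out[:properties] = out[:properties].transform_values { |p| require_all(p) }
  end
  out[:items] = require_all(out[:items]) if out[:items].is_a?(Hash)
  out
end
```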
217
+ sig { params(schema: T::Hash[Symbol, T.untyped]).returns(T::Hash[Symbol, T.untyped]) }
218
+ def self.add_additional_properties_recursively(schema)
219
+ return schema unless schema.is_a?(Hash)
220
+
221
+ result = schema.dup
222
+
223
+ # Add additionalProperties: false if this is an object
224
+ if result[:type] == "object"
225
+ result[:additionalProperties] = false
226
+ end
227
+
228
+ # Process properties recursively
229
+ if result[:properties].is_a?(Hash)
230
+ result[:properties] = result[:properties].transform_values do |prop|
231
+ if prop.is_a?(Hash)
232
+ processed = add_additional_properties_recursively(prop)
233
+ # Special handling for arrays - ensure their items have additionalProperties if they're objects
234
+ if processed[:type] == "array" && processed[:items].is_a?(Hash)
235
+ processed[:items] = add_additional_properties_recursively(processed[:items])
236
+ end
237
+ processed
238
+ else
239
+ prop
240
+ end
241
+ end
242
+ end
243
+
244
+ # Process array items
245
+ if result[:items].is_a?(Hash)
246
+ processed_items = add_additional_properties_recursively(result[:items])
247
+ # OpenAI requires additionalProperties on all objects, even in array items
248
+ if processed_items.is_a?(Hash) && processed_items[:type] == "object" && !processed_items.key?(:additionalProperties)
249
+ processed_items[:additionalProperties] = false
250
+ end
251
+ result[:items] = processed_items
252
+ elsif result[:items].is_a?(Array)
253
+ # Handle tuple validation
254
+ result[:items] = result[:items].map do |item|
255
+ processed = item.is_a?(Hash) ? add_additional_properties_recursively(item) : item
256
+ if processed.is_a?(Hash) && processed[:type] == "object" && !processed.key?(:additionalProperties)
257
+ processed[:additionalProperties] = false
258
+ end
259
+ processed
260
+ end
261
+ end
262
+
263
+ # Process anyOf/allOf (oneOf should be converted to anyOf by this point)
264
+ [:anyOf, :allOf].each do |key|
265
+ if result[key].is_a?(Array)
266
+ result[key] = result[key].map do |sub_schema|
267
+ sub_schema.is_a?(Hash) ? add_additional_properties_recursively(sub_schema) : sub_schema
268
+ end
269
+ end
270
+ end
271
+
272
+ result
273
+ end
274
+
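The `additionalProperties` pass above has the same recursive shape. A trimmed standalone sketch (`close_schema` is a hypothetical name; the anyOf/allOf and tuple-items handling are omitted for brevity):

```ruby
# Mark every object schema closed (additionalProperties: false), recursing
# through properties and array items as the full method does.
def close_schema(schema)
  return schema unless schema.is_a?(Hash)
  out = schema.dup
  out[:additionalProperties] = false if out[:type] == 'object'
  if out[:properties].is_a?(Hash)
    out[:properties] = out[:properties].transform_values { |p| close_schema(p) }
  end
  out[:items] = close_schema(out[:items]) if out[:items].is_a?(Hash)
  out
end
```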
275
+ sig { params(signature_class: T.class_of(DSPy::Signature)).returns(String) }
276
+ def self.generate_schema_name(signature_class)
277
+ # Use the signature class name
278
+ class_name = signature_class.name&.split("::")&.last
279
+ if class_name
280
+ class_name.gsub(/[^a-zA-Z0-9_]/, "_").downcase
281
+ else
282
+ # Fallback to a generic name
283
+ "dspy_output_#{Time.now.to_i}"
284
+ end
285
+ end
286
+
287
+ sig { params(schema: T::Hash[Symbol, T.untyped], current_depth: Integer).returns(Integer) }
288
+ def self.calculate_depth(schema, current_depth = 0)
289
+ return current_depth unless schema.is_a?(Hash)
290
+
291
+ max_depth = current_depth
292
+
293
+ # Check properties
294
+ if schema[:properties].is_a?(Hash)
295
+ schema[:properties].each_value do |prop|
296
+ if prop.is_a?(Hash)
297
+ prop_depth = calculate_depth(prop, current_depth + 1)
298
+ max_depth = [max_depth, prop_depth].max
299
+ end
300
+ end
301
+ end
302
+
303
+ # Check array items
304
+ if schema[:items].is_a?(Hash)
305
+ items_depth = calculate_depth(schema[:items], current_depth + 1)
306
+ max_depth = [max_depth, items_depth].max
307
+ end
308
+
309
+ # Check anyOf/allOf (oneOf should be converted to anyOf by this point)
310
+ [:anyOf, :allOf].each do |key|
311
+ if schema[key].is_a?(Array)
312
+ schema[key].each do |sub_schema|
313
+ if sub_schema.is_a?(Hash)
314
+ sub_depth = calculate_depth(sub_schema, current_depth + 1)
315
+ max_depth = [max_depth, sub_depth].max
316
+ end
317
+ end
318
+ end
319
+ end
320
+
321
+ max_depth
322
+ end
323
+
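The depth metric above counts one level per `properties`/`items` hop. A standalone sketch (`depth` is a hypothetical name; anyOf/allOf traversal is omitted):

```ruby
# Depth of a JSON-schema-like hash: a flat schema is depth 0, and each
# nested properties or items hash adds one.
def depth(schema, current = 0)
  return current unless schema.is_a?(Hash)
  levels = [current]
  (schema[:properties] || {}).each_value do |prop|
    levels << depth(prop, current + 1) if prop.is_a?(Hash)
  end
  levels << depth(schema[:items], current + 1) if schema[:items].is_a?(Hash)
  levels.max
end

depth({ type: 'object', properties: { a: { type: 'object', properties: { b: {} } } } })
# => 2
```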
324
+ sig { params(schema: T::Hash[Symbol, T.untyped]).returns(T::Boolean) }
325
+ def self.contains_pattern_properties?(schema)
326
+ return true if schema[:patternProperties]
327
+
328
+ # Recursively check nested schemas (oneOf should be converted to anyOf by this point)
329
+ [:properties, :items, :anyOf, :allOf].each do |key|
330
+ value = schema[key]
331
+ case value
332
+ when Hash
333
+ return true if contains_pattern_properties?(value)
334
+ when Array
335
+ return true if value.any? { |v| v.is_a?(Hash) && contains_pattern_properties?(v) }
336
+ end
337
+ end
338
+
339
+ false
340
+ end
341
+
342
+ sig { params(schema: T::Hash[Symbol, T.untyped]).returns(T::Boolean) }
343
+ def self.contains_conditional_schemas?(schema)
344
+ return true if schema[:if] || schema[:then] || schema[:else]
345
+
346
+ # Recursively check nested schemas (oneOf should be converted to anyOf by this point)
347
+ [:properties, :items, :anyOf, :allOf].each do |key|
348
+ value = schema[key]
349
+ case value
350
+ when Hash
351
+ return true if contains_conditional_schemas?(value)
352
+ when Array
353
+ return true if value.any? { |v| v.is_a?(Hash) && contains_conditional_schemas?(v) }
354
+ end
355
+ end
356
+
357
+ false
358
+ end
359
+ end
360
+ end
361
+ end
362
+ end
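The converter's end product is the `json_schema` envelope built in `to_openai_format`, named via `generate_schema_name`. A standalone sketch of those two final steps (`wrap` and `schema_name` are hypothetical stand-ins):

```ruby
# Derive a schema name from a class name: last constant segment,
# non-alphanumerics replaced with underscores, lowercased.
def schema_name(class_name)
  class_name.split('::').last.gsub(/[^a-zA-Z0-9_]/, '_').downcase
end

# Wrap a JSON schema in OpenAI's response_format envelope.
def wrap(schema, name:, strict: true)
  { type: 'json_schema', json_schema: { name: name, strict: strict, schema: schema } }
end

payload = wrap({ type: 'object', properties: {}, required: [] },
               name: schema_name('DSPy::SentimentSignature'))
payload[:json_schema][:name]  # => "sentimentsignature"
```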
@@ -0,0 +1,7 @@
1
+ # frozen_string_literal: true
2
+
3
+ module DSPy
4
+ module OpenAI
5
+ VERSION = '1.0.0'
6
+ end
7
+ end
@@ -0,0 +1,11 @@
1
+ # frozen_string_literal: true
2
+
3
+ require 'dspy/openai/version'
4
+
5
+ require 'dspy/openai/guardrails'
6
+ DSPy::OpenAI::Guardrails.ensure_openai_installed!
7
+
8
+ require 'dspy/openai/lm/adapters/openai_adapter'
9
+ require 'dspy/openai/lm/adapters/ollama_adapter'
10
+ require 'dspy/openai/lm/adapters/openrouter_adapter'
11
+ require 'dspy/openai/lm/schema_converter'
metadata ADDED
@@ -0,0 +1,80 @@
1
+ --- !ruby/object:Gem::Specification
2
+ name: dspy-openai
3
+ version: !ruby/object:Gem::Version
4
+ version: 1.0.0
5
+ platform: ruby
6
+ authors:
7
+ - Vicente Reig Rincón de Arellano
8
+ bindir: bin
9
+ cert_chain: []
10
+ date: 1980-01-02 00:00:00.000000000 Z
11
+ dependencies:
12
+ - !ruby/object:Gem::Dependency
13
+ name: dspy
14
+ requirement: !ruby/object:Gem::Requirement
15
+ requirements:
16
+ - - '='
17
+ - !ruby/object:Gem::Version
18
+ version: 0.31.0
19
+ type: :runtime
20
+ prerelease: false
21
+ version_requirements: !ruby/object:Gem::Requirement
22
+ requirements:
23
+ - - '='
24
+ - !ruby/object:Gem::Version
25
+ version: 0.31.0
26
+ - !ruby/object:Gem::Dependency
27
+ name: openai
28
+ requirement: !ruby/object:Gem::Requirement
29
+ requirements:
30
+ - - ">="
31
+ - !ruby/object:Gem::Version
32
+ version: '0'
33
+ type: :runtime
34
+ prerelease: false
35
+ version_requirements: !ruby/object:Gem::Requirement
36
+ requirements:
37
+ - - ">="
38
+ - !ruby/object:Gem::Version
39
+ version: '0'
40
+ description: Provides the OpenAI adapter plus the Ollama and OpenRouter adapters so
42
+ OpenAI-compatible providers can be added to DSPy.rb projects independently of the core gem.
42
+ email:
43
+ - hey@vicente.services
44
+ executables: []
45
+ extensions: []
46
+ extra_rdoc_files: []
47
+ files:
48
+ - LICENSE
49
+ - README.md
50
+ - lib/dspy/openai.rb
51
+ - lib/dspy/openai/README.md
52
+ - lib/dspy/openai/guardrails.rb
53
+ - lib/dspy/openai/lm/adapters/ollama_adapter.rb
54
+ - lib/dspy/openai/lm/adapters/openai_adapter.rb
55
+ - lib/dspy/openai/lm/adapters/openrouter_adapter.rb
56
+ - lib/dspy/openai/lm/schema_converter.rb
57
+ - lib/dspy/openai/version.rb
58
+ homepage: https://github.com/vicentereig/dspy.rb
59
+ licenses:
60
+ - MIT
61
+ metadata:
62
+ github_repo: git@github.com:vicentereig/dspy.rb
63
+ rdoc_options: []
64
+ require_paths:
65
+ - lib
66
+ required_ruby_version: !ruby/object:Gem::Requirement
67
+ requirements:
68
+ - - ">="
69
+ - !ruby/object:Gem::Version
70
+ version: 3.3.0
71
+ required_rubygems_version: !ruby/object:Gem::Requirement
72
+ requirements:
73
+ - - ">="
74
+ - !ruby/object:Gem::Version
75
+ version: '0'
76
+ requirements: []
77
+ rubygems_version: 3.6.9
78
+ specification_version: 4
79
+ summary: OpenAI and OpenRouter adapters for DSPy.rb.
80
+ test_files: []