llm_capabilities 0.2.0

checksums.yaml ADDED
@@ -0,0 +1,7 @@
+ ---
+ SHA256:
+   metadata.gz: d3046a9f7990b66895e6ee190eee74fff899b2c989e2719ab024cb0e8d3f5986
+   data.tar.gz: 377a8fd162deb086f6cbc1d60c2b6a3799b502b5f330cc1f5219f44cff7018ed
+ SHA512:
+   metadata.gz: 69888f933e3a33071b6d647ca735b5af9b1dd3cc5e2233960c44cb52855569fd33df38a24d389f8eb52c14c3cfc593fa750cad264b8620b898f73d4b6289612d
+   data.tar.gz: cbd50993c32cfb0d1ce7e3a8e96d6f374585f1bdceb84b283cd81436d9dc43e7297be4865d813c6bbec726f9d34cb6291572c3b5551547bbe193cf661826bcc3
data/CHANGELOG.md ADDED
@@ -0,0 +1,14 @@
+ # Changelog
+
+ ## 0.2.0
+
+ - Generalize API from `supports_schema?(model, thinking:)` to `supports?(model, capability, context: {})`
+ - Add 4-tier resolution hierarchy: empirical cache, OpenRouter model index, RubyLLM registry, provider heuristic
+ - Add `ModelIndex` class for unauthenticated OpenRouter `/api/v1/models` integration
+ - Add 18 validated capability symbols (`KNOWN_CAPABILITIES`) with `UnknownCapabilityError`
+ - Add freeform context hash for cache precision (e.g., `{thinking: true}`)
+ - Remove `sorbet-runtime` dependency (zero runtime dependencies)
+
+ ## 0.1.0
+
+ - Initial release: 3-tier structured output detection (cache, RubyLLM, provider heuristic)
data/LICENSE.txt ADDED
@@ -0,0 +1,21 @@
+ MIT License
+
+ Copyright (c) 2026 Sorcerous Machine, LLC
+
+ Permission is hereby granted, free of charge, to any person obtaining a copy
+ of this software and associated documentation files (the "Software"), to deal
+ in the Software without restriction, including without limitation the rights
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ copies of the Software, and to permit persons to whom the Software is
+ furnished to do so, subject to the following conditions:
+
+ The above copyright notice and this permission notice shall be included in all
+ copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+ SOFTWARE.
data/README.md ADDED
@@ -0,0 +1,155 @@
+ # llm_capabilities
+
+ [![CI](https://github.com/SorcerousMachine/LLMCapabilities/actions/workflows/ci.yml/badge.svg)](https://github.com/SorcerousMachine/LLMCapabilities/actions/workflows/ci.yml)
+ [![Gem Version](https://img.shields.io/gem/v/llm_capabilities)](https://rubygems.org/gems/llm_capabilities)
+ [![License: MIT](https://img.shields.io/badge/license-MIT-blue.svg)](LICENSE.txt)
+
+ 4-tier capability detection for LLM models. Zero runtime dependencies.
+
+ Answers the question: *does this model support this capability?* — using a layered resolution hierarchy that combines empirical observations, live model indexes, and static heuristics.
+
+ ## Resolution Hierarchy
+
+ Each query walks four tiers in order. The first non-nil result wins.
+
+ | Tier | Source | What it knows |
+ |------|--------|---------------|
+ | 1 | **Empirical cache** | Observed results from actual API calls, with optional context (e.g., `{thinking: true}`) |
+ | 2 | **OpenRouter model index** | Per-model capability data from OpenRouter's public API, cached locally for 24 hours |
+ | 3 | **RubyLLM model registry** | Soft dependency — used automatically when the `ruby_llm` gem is loaded |
+ | 4 | **Provider heuristic** | Static fallback mapping providers to known capabilities |
+
+ Tier 1 is the most authoritative because it reflects what you've actually observed. Tiers 2-4 provide progressively coarser defaults.
+
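+ For instance, an empirical record immediately overrides anything the lower tiers would have said. A minimal sketch using the `supports?`/`record` API documented below (the model id and outcomes are purely illustrative):
+
+ ```ruby
+ # No empirical data yet: an unknown "openai/..." model falls through to the
+ # tier 4 heuristic, which lists openai under :structured_output.
+ LLMCapabilities.supports?("openai/example-model", :structured_output)
+ # => true (tier 4)
+
+ # Record an observed failure; tier 1 now answers first for this combination.
+ LLMCapabilities.record("openai/example-model", :structured_output, supported: false)
+ LLMCapabilities.supports?("openai/example-model", :structured_output)
+ # => false (tier 1)
+ ```
+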
+ ## Quick Start
+
+ ```ruby
+ # Gemfile
+ gem "llm_capabilities"
+ ```
+
+ ```ruby
+ require "llm_capabilities"
+
+ # Query a capability (walks all 4 tiers automatically)
+ LLMCapabilities.supports?("openai/o4-mini", :structured_output)
+ # => true
+
+ LLMCapabilities.supports?("deepseek/deepseek-r1", :vision)
+ # => false
+
+ # Query with context (only matches tier 1 cache entries with the same context)
+ LLMCapabilities.supports?("anthropic/claude-haiku-4.5", :structured_output, context: { thinking: true })
+ # => false (if you've recorded that specific combination as unsupported)
+ ```
+
+ Model identifiers use the `"provider/model"` format (e.g., `"openai/gpt-4o"`, `"anthropic/claude-sonnet-4.5"`).
+
+ ## Recording Empirical Results
+
+ The real power is in tier 1: recording what you've actually observed from API calls.
+
+ ```ruby
+ # After a successful structured output call
+ LLMCapabilities.record("openai/o4-mini", :structured_output, supported: true)
+
+ # After discovering a model doesn't support a capability in a specific context
+ LLMCapabilities.record(
+   "anthropic/claude-haiku-4.5",
+   :structured_output,
+   context: { thinking: true },
+   supported: false
+ )
+
+ # Look up a cached result directly (nil if not recorded)
+ LLMCapabilities.lookup("openai/o4-mini", :structured_output)
+ # => true
+
+ # Cache management
+ LLMCapabilities.size # => 2
+ LLMCapabilities.clear! # wipes all cached entries
+ ```
+
+ Cache entries are persisted to disk as JSON and survive process restarts. Entries expire after 30 days by default.
+
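+ For reference, the cache file is plain JSON keyed by model, capability, and any sorted context pairs. A sketch of its shape after the calls above (timestamps illustrative):
+
+ ```ruby
+ require "json"
+
+ JSON.parse(File.read(".llm_capabilities_cache.json"))
+ # => {
+ #   "openai/o4-mini:structured_output" => {"supported" => true, "recorded_at" => 1767225600},
+ #   "anthropic/claude-haiku-4.5:structured_output:thinking=true" => {"supported" => false, "recorded_at" => 1767225600}
+ # }
+ ```
+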
+ ## Configuration
+
+ ```ruby
+ LLMCapabilities.configure do |config|
+   # File paths for persistent storage
+   config.cache_path = ".llm_capabilities_cache.json" # default
+   config.index_path = ".llm_capabilities_index.json" # default
+
+   # Cache entry lifetime (seconds)
+   config.max_age = 2_592_000 # 30 days, default
+
+   # OpenRouter index refresh interval (seconds)
+   config.index_ttl = 86_400 # 24 hours, default
+
+   # Override which providers support which capabilities
+   config.provider_capabilities = {
+     structured_output: %w[openai google anthropic deepseek],
+     function_calling: %w[openai google anthropic deepseek],
+     vision: %w[openai google anthropic],
+     streaming: %w[openai google anthropic deepseek]
+   }
+ end
+ ```
+
+ ## Known Capabilities
+
+ The gem recognizes 18 capability symbols. Passing anything else to `supports?` raises `LLMCapabilities::UnknownCapabilityError` (see the example after the table).
+
+ | Capability | Description |
+ |------------|-------------|
+ | `:structured_output` | JSON schema-constrained output |
+ | `:function_calling` | Tool/function calling |
+ | `:vision` | Image input processing |
+ | `:streaming` | Streaming response support |
+ | `:json_mode` | JSON output mode (less strict than structured output) |
+ | `:reasoning` | Extended thinking / chain-of-thought |
+ | `:image_generation` | Image output generation |
+ | `:speech_generation` | Audio/speech output |
+ | `:transcription` | Audio-to-text conversion |
+ | `:translation` | Language translation |
+ | `:citations` | Source citation support |
+ | `:predicted_outputs` | Predicted/cached output optimization |
+ | `:distillation` | Model distillation support |
+ | `:fine_tuning` | Fine-tuning API support |
+ | `:batch` | Batch API processing |
+ | `:realtime` | Real-time / WebSocket API |
+ | `:caching` | Prompt caching |
+ | `:moderation` | Content moderation |
+
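+ For example, a misspelled capability fails fast instead of silently returning `false` (a minimal sketch; the plural symbol is deliberately wrong):
+
+ ```ruby
+ begin
+   LLMCapabilities.supports?("openai/gpt-4o", :structured_outputs) # plural is not a known symbol
+ rescue LLMCapabilities::UnknownCapabilityError => e
+   warn e.message # lists the 18 known capability symbols
+ end
+ ```
+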
+ ## Default Provider Capabilities
+
+ Tier 4 uses these static mappings as a last resort when no better data is available:
+
+ | Capability | Providers |
+ |------------|-----------|
+ | `:structured_output` | openai, google, anthropic, deepseek |
+ | `:function_calling` | openai, google, anthropic, deepseek |
+ | `:vision` | openai, google, anthropic |
+ | `:streaming` | openai, google, anthropic, deepseek |
+
+ These can be overridden via `config.provider_capabilities`.
+
+ ## OpenRouter API Usage
+
+ Tier 2 fetches model capability data from [OpenRouter's](https://openrouter.ai) unauthenticated public endpoint (`GET /api/v1/models`). This data is cached locally on disk with a 24-hour TTL. No API key is required, and nothing about your models or queries is sent to OpenRouter — the gem only issues a read-only GET request.
+
+ If the fetch fails (network error, timeout, non-200 response), tier 2 is silently skipped and resolution falls through to tiers 3 and 4. The gem never blocks on network failure.
+
+ See OpenRouter's [terms of service](https://openrouter.ai/terms) for their data usage policies.
+
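+ If you want to warm or inspect the index outside the normal resolution flow, the class behind tier 2 can be constructed directly. A small sketch, assuming the default index path:
+
+ ```ruby
+ require "llm_capabilities"
+
+ index = LLMCapabilities::ModelIndex.new(path: ".llm_capabilities_index.json", ttl: 86_400)
+
+ # The first lookup fetches from OpenRouter (or reads the on-disk copy if it is fresh).
+ # nil means the index has no entry and resolution would fall through.
+ index.lookup("openai/gpt-4o", :vision)
+ ```
+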
+ ## RubyLLM Integration
+
+ Tier 3 activates automatically when the [`ruby_llm`](https://github.com/crmne/ruby_llm) gem is loaded in your process. No configuration required — if `RubyLLM` is defined, the gem queries its model registry for capability data. If `ruby_llm` is not present, tier 3 is silently skipped.
+
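+ For example, a minimal Gemfile that enables tier 3 with nothing else to wire up:
+
+ ```ruby
+ # Gemfile
+ gem "ruby_llm"
+ gem "llm_capabilities"
+ ```
+
+ Once `RubyLLM` is loaded, tier 3 asks its registry through the model's capability predicates (e.g. `vision?` for `:vision`).
+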
+ ## Requirements
+
+ - Ruby >= 3.2
+ - Zero runtime dependencies (stdlib only: `json`, `net/http`, `fileutils`)
+
+ ## License
+
+ MIT. See [LICENSE.txt](LICENSE.txt).
data/lib/llm_capabilities/cache.rb ADDED
@@ -0,0 +1,82 @@
+ # frozen_string_literal: true
+ # typed: true
+
+ require "json"
+ require "fileutils"
+
+ module LLMCapabilities
+   class Cache
+     def initialize(path: Configuration::DEFAULT_CACHE_PATH, max_age: Configuration::DEFAULT_MAX_AGE)
+       @path = path
+       @max_age = max_age
+       @entries = nil
+     end
+
+     def lookup(model, capability, context: {})
+       load_cache!
+       entry = @entries[cache_key(model, capability, context)]
+       return nil unless entry.is_a?(Hash)
+
+       if @max_age && entry["recorded_at"]
+         elapsed = Time.now.to_i - entry["recorded_at"]
+         return nil if elapsed > @max_age
+       end
+
+       entry["supported"]
+     end
+
+     def record(model, capability, supported:, context: {})
+       load_cache!
+       @entries[cache_key(model, capability, context)] = {
+         "supported" => supported,
+         "recorded_at" => Time.now.to_i
+       }
+       persist!
+     end
+
+     def clear!
+       @entries = {}
+       persist!
+     end
+
+     def size
+       load_cache!
+       @entries.length
+     end
+
+     private
+
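+     # Keys combine model, capability, and sorted context pairs, e.g.
+     #   "openai/o4-mini:structured_output"
+     #   "anthropic/claude-haiku-4.5:structured_output:thinking=true"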
+     def cache_key(model, capability, context)
+       base = "#{model}:#{capability}"
+       return base if context.empty?
+
+       pairs = context.sort_by { |k, _| k.to_s }.map { |k, v| "#{k}=#{v}" }.join(",")
+       "#{base}:#{pairs}"
+     end
+
+     def load_cache!
+       return unless @entries.nil?
+
+       @entries = if File.exist?(@path)
+         File.open(@path, File::RDONLY) do |f|
+           f.flock(File::LOCK_SH)
+           JSON.parse(f.read)
+         end
+       else
+         {}
+       end
+     rescue JSON::ParserError
+       @entries = {}
+     end
+
+     def persist!
+       dir = File.dirname(@path)
+       FileUtils.mkdir_p(dir) unless Dir.exist?(dir)
+
+       File.open(@path, File::CREAT | File::WRONLY | File::TRUNC) do |f|
+         f.flock(File::LOCK_EX)
+         f.write(JSON.pretty_generate(@entries))
+       end
+     end
+   end
+ end
data/lib/llm_capabilities/configuration.rb ADDED
@@ -0,0 +1,28 @@
+ # frozen_string_literal: true
+ # typed: true
+
+ module LLMCapabilities
+   class Configuration
+     DEFAULT_CACHE_PATH = ".llm_capabilities_cache.json"
+     DEFAULT_INDEX_PATH = ".llm_capabilities_index.json"
+     DEFAULT_INDEX_TTL = 86_400 # 24 hours in seconds
+     DEFAULT_MAX_AGE = 2_592_000 # 30 days in seconds
+
+     DEFAULT_PROVIDER_CAPABILITIES = {
+       structured_output: %w[openai google anthropic deepseek],
+       function_calling: %w[openai google anthropic deepseek],
+       vision: %w[openai google anthropic],
+       streaming: %w[openai google anthropic deepseek]
+     }.freeze
+
+     attr_accessor :cache_path, :index_path, :index_ttl, :provider_capabilities, :max_age
+
+     def initialize
+       @cache_path = DEFAULT_CACHE_PATH
+       @index_path = DEFAULT_INDEX_PATH
+       @index_ttl = DEFAULT_INDEX_TTL
+       @provider_capabilities = DEFAULT_PROVIDER_CAPABILITIES.transform_values(&:dup)
+       @max_age = DEFAULT_MAX_AGE
+     end
+   end
+ end
data/lib/llm_capabilities/detector.rb ADDED
@@ -0,0 +1,83 @@
+ # frozen_string_literal: true
+ # typed: true
+
+ module LLMCapabilities
+   class Detector
+     # Capability vocabulary derived from RubyLLM's model capability strings.
+     # See: https://github.com/crmne/ruby_llm
+     KNOWN_CAPABILITIES = %i[
+       streaming function_calling structured_output predicted_outputs
+       distillation fine_tuning batch realtime image_generation
+       speech_generation transcription translation citations
+       reasoning caching moderation json_mode vision
+     ].freeze
+
+     def initialize(cache:, provider_capabilities: Configuration::DEFAULT_PROVIDER_CAPABILITIES, model_index: nil, ruby_llm_enabled: nil)
+       @cache = cache
+       @provider_capabilities = provider_capabilities
+       @model_index = model_index
+       @ruby_llm_enabled = ruby_llm_enabled
+     end
+
+     def supports?(model, capability, context: {})
+       validate_capability!(capability)
+
+       # Tier 1: Empirical cache (most authoritative, with full context)
+       cached = @cache.lookup(model, capability, context: context)
+       return cached unless cached.nil?
+
+       # Tier 2: OpenRouter model index (base capability only)
+       if @model_index
+         index_result = @model_index.lookup(model, capability)
+         return index_result unless index_result.nil?
+       end
+
+       # Tier 3: RubyLLM model registry (base capability only)
+       ruby_llm_result = query_ruby_llm(model, capability)
+       return ruby_llm_result unless ruby_llm_result.nil?
+
+       # Tier 4: Provider-level heuristic (base capability only)
+       provider_supports?(model, capability)
+     end
+
+     def provider_supports?(model, capability)
+       provider = model.include?("/") ? model.split("/", 2).first : nil
+       return false unless provider
+
+       providers = @provider_capabilities[capability]
+       return false unless providers
+
+       providers.include?(provider)
+     end
+
+     private
+
+     def validate_capability!(capability)
+       return if KNOWN_CAPABILITIES.include?(capability)
+
+       raise UnknownCapabilityError,
+         "Unknown capability: #{capability.inspect}. Known capabilities: #{KNOWN_CAPABILITIES.join(", ")}"
+     end
+
+     def ruby_llm_available?
+       if @ruby_llm_enabled.nil?
+         @ruby_llm_enabled = defined?(RubyLLM) ? true : false
+       end
+       @ruby_llm_enabled
+     end
+
+     def query_ruby_llm(model, capability)
+       return nil unless ruby_llm_available?
+
+       model_info = RubyLLM.models.find(model)
+       return nil unless model_info
+
+       method_name = :"#{capability}?"
+       if model_info.respond_to?(method_name)
+         model_info.public_send(method_name)
+       end
+     # Any RubyLLM failure is treated as "no answer" so resolution can fall through to tier 4.
+     rescue
+       nil
+     end
+   end
+ end
data/lib/llm_capabilities/model_index.rb ADDED
@@ -0,0 +1,170 @@
+ # frozen_string_literal: true
+ # typed: true
+
+ require "json"
+ require "fileutils"
+ require "net/http"
+ require "uri"
+
+ module LLMCapabilities
+   class ModelIndex
+     OPENROUTER_MODELS_URL = "https://openrouter.ai/api/v1/models"
+
+     # Mapping from OpenRouter field values to gem capability symbols.
+     # NOTE: Name divergences are intentional and subtle:
+     # - OpenRouter "structured_outputs" (plural) -> gem :structured_output (singular)
+     # - OpenRouter "tools" -> gem :function_calling
+     # - OpenRouter "response_format" -> gem :json_mode
+     # - OpenRouter "reasoning" -> gem :reasoning
+     PARAMETER_MAPPING = {
+       "structured_outputs" => :structured_output,
+       "tools" => :function_calling,
+       "reasoning" => :reasoning,
+       "response_format" => :json_mode
+     }.freeze
+
+     def initialize(path:, ttl:)
+       @path = path
+       @ttl = ttl
+       @index = nil
+     end
+
+     def lookup(model, capability)
+       load_index!
+       model_caps = @index[model]
+       return nil unless model_caps
+
+       model_caps[capability]
+     end
+
+     private
+
+     def load_index!
+       return unless @index.nil?
+
+       if File.exist?(@path) && !stale?
+         load_from_disk
+         return unless @index.nil?
+       end
+
+       fetch_and_cache!
+     rescue => _e
+       @index ||= {}
+     end
+
+     def stale?
+       mtime = File.mtime(@path)
+       (Time.now - mtime) > @ttl
+     end
+
+     def load_from_disk
+       File.open(@path, File::RDONLY) do |f|
+         f.flock(File::LOCK_SH)
+         raw = JSON.parse(f.read)
+         @index = deserialize(raw)
+       end
+     rescue JSON::ParserError
+       @index = nil
+     end
+
+     def fetch_and_cache!
+       uri = URI.parse(OPENROUTER_MODELS_URL)
+       response = Net::HTTP.get_response(uri)
+
+       unless response.is_a?(Net::HTTPSuccess)
+         @index ||= {}
+         return
+       end
+
+       body = response.body
+       unless body
+         @index ||= {}
+         return
+       end
+
+       raw_data = JSON.parse(body)
+       @index = normalize(raw_data)
+       persist!
+     rescue => _e
+       @index ||= {}
+     end
+
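+     # Normalized index shape (illustrative; the model id is hypothetical):
+     #   { "openai/example-model" => { function_calling: true, structured_output: true, vision: true } }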
+     def normalize(raw_data)
+       result = {}
+       models = raw_data["data"]
+       return result unless models.is_a?(Array)
+
+       models.each do |model_entry|
+         next unless model_entry.is_a?(Hash)
+
+         id = model_entry["id"]
+         next unless id.is_a?(String)
+
+         caps = map_capabilities(model_entry)
+         result[id] = caps unless caps.empty?
+       end
+
+       result
+     end
+
+     def map_capabilities(model_entry)
+       caps = {}
+
+       # Map supported_parameters to capabilities
+       # NOTE: OpenRouter uses different names than the gem:
+       # "structured_outputs" (plural) -> :structured_output (singular)
+       # "tools" -> :function_calling
+       # "response_format" -> :json_mode
+       params = model_entry["supported_parameters"]
+       if params.is_a?(Array)
+         params.each do |param|
+           cap = PARAMETER_MAPPING[param]
+           caps[cap] = true if cap
+         end
+       end
+
+       # Map architecture modalities to capabilities
+       arch = model_entry["architecture"]
+       if arch.is_a?(Hash)
+         input_mods = arch["input_modalities"]
+         if input_mods.is_a?(Array) && input_mods.include?("image")
+           caps[:vision] = true
+         end
+
+         output_mods = arch["output_modalities"]
+         if output_mods.is_a?(Array) && output_mods.include?("image")
+           caps[:image_generation] = true
+         end
+       end
+
+       caps
+     end
+
+     def persist!
+       dir = File.dirname(@path)
+       FileUtils.mkdir_p(dir) unless Dir.exist?(dir)
+
+       serialized = serialize(@index)
+       File.open(@path, File::CREAT | File::WRONLY | File::TRUNC) do |f|
+         f.flock(File::LOCK_EX)
+         f.write(JSON.pretty_generate(serialized))
+       end
+     end
+
+     def serialize(index)
+       index.transform_values { |caps| caps.transform_keys(&:to_s) }
+     end
+
+     def deserialize(raw)
+       result = {}
+       raw.each do |model_id, caps|
+         next unless caps.is_a?(Hash)
+
+         result[model_id] = caps.each_with_object({}) do |(k, v), acc|
+           acc[k.to_sym] = v if v == true || v == false
+         end
+       end
+       result
+     end
+   end
+ end
data/lib/llm_capabilities/version.rb ADDED
@@ -0,0 +1,6 @@
+ # frozen_string_literal: true
+ # typed: true
+
+ module LLMCapabilities
+   VERSION = "0.2.0"
+ end
data/lib/llm_capabilities.rb ADDED
@@ -0,0 +1,77 @@
+ # frozen_string_literal: true
+ # typed: true
+
+ require_relative "llm_capabilities/version"
+ require_relative "llm_capabilities/configuration"
+ require_relative "llm_capabilities/cache"
+ require_relative "llm_capabilities/model_index"
+ require_relative "llm_capabilities/detector"
+
+ module LLMCapabilities
+   class Error < StandardError; end
+   class UnknownCapabilityError < Error; end
+
+   class << self
+     def configuration
+       @configuration ||= Configuration.new
+     end
+
+     def configure(&block)
+       block.call(configuration)
+       # Drop memoized collaborators so the new configuration takes effect on the next call.
+       @cache = nil
+       @model_index = nil
+       @detector = nil
+     end
+
+     def supports?(model, capability, context: {})
+       detector.supports?(model, capability, context: context)
+     end
+
+     def record(model, capability, supported:, context: {})
+       cache.record(model, capability, supported: supported, context: context)
+     end
+
+     def lookup(model, capability, context: {})
+       cache.lookup(model, capability, context: context)
+     end
+
+     def clear!
+       cache.clear!
+     end
+
+     def size
+       cache.size
+     end
+
+     def reset!
+       @configuration = nil
+       @cache = nil
+       @model_index = nil
+       @detector = nil
+     end
+
+     private
+
+     def cache
+       @cache ||= Cache.new(
+         path: configuration.cache_path,
+         max_age: configuration.max_age
+       )
+     end
+
+     def model_index
+       @model_index ||= ModelIndex.new(
+         path: configuration.index_path,
+         ttl: configuration.index_ttl
+       )
+     end
+
+     def detector
+       @detector ||= Detector.new(
+         cache: cache,
+         provider_capabilities: configuration.provider_capabilities,
+         model_index: model_index
+       )
+     end
+   end
+ end
metadata ADDED
@@ -0,0 +1,51 @@
+ --- !ruby/object:Gem::Specification
+ name: llm_capabilities
+ version: !ruby/object:Gem::Version
+   version: 0.2.0
+ platform: ruby
+ authors:
+ - Alex
+ bindir: bin
+ cert_chain: []
+ date: 1980-01-02 00:00:00.000000000 Z
+ dependencies: []
+ description: Detects whether LLM models support specific capabilities via empirical
+   cache, OpenRouter model index, RubyLLM model registry, and provider-level heuristics.
+   Zero runtime dependencies.
+ executables: []
+ extensions: []
+ extra_rdoc_files: []
+ files:
+ - CHANGELOG.md
+ - LICENSE.txt
+ - README.md
+ - lib/llm_capabilities.rb
+ - lib/llm_capabilities/cache.rb
+ - lib/llm_capabilities/configuration.rb
+ - lib/llm_capabilities/detector.rb
+ - lib/llm_capabilities/model_index.rb
+ - lib/llm_capabilities/version.rb
+ homepage: https://github.com/SorcerousMachine/LLMCapabilities
+ licenses:
+ - MIT
+ metadata:
+   source_code_uri: https://github.com/SorcerousMachine/LLMCapabilities
+   changelog_uri: https://github.com/SorcerousMachine/LLMCapabilities/blob/main/CHANGELOG.md
+ rdoc_options: []
+ require_paths:
+ - lib
+ required_ruby_version: !ruby/object:Gem::Requirement
+   requirements:
+   - - ">="
+     - !ruby/object:Gem::Version
+       version: '3.2'
+ required_rubygems_version: !ruby/object:Gem::Requirement
+   requirements:
+   - - ">="
+     - !ruby/object:Gem::Version
+       version: '0'
+ requirements: []
+ rubygems_version: 4.0.4
+ specification_version: 4
+ summary: 4-tier capability detection for LLM models
+ test_files: []