langfuse-rb 0.8.0 → 0.10.0

This diff shows the content of publicly available package versions as released to one of the supported registries. It is provided for informational purposes only and reflects the changes between the two versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: ba92036fbe7b63b8355a113e44ff3f62cfcb11817ae3b3c1c444756af55ebf84
- data.tar.gz: 44097d4d441ad6d2d8780a95cb836d0368ed0b772f16036e301ece7e018df0a1
+ metadata.gz: 0751bccadaa11e94f6fb8db5d8eb5a356b7102d2b811f1bc8b9b5e2f9c924072
+ data.tar.gz: 0c52e3a2738c090fa1804ed5a8722337ca58ba045a5aad0262e9a20ce4c1dd28
  SHA512:
- metadata.gz: 6e8880c58683dc7ff719f0c966515841e9a3e3df024f88175297090949a6befaddf23528b24a65d3825fa40e33118348087c0dbe5312543c16bfef65856eaf12
- data.tar.gz: a82f8be469125e99355c5b6a1bd747a1f2e395dd6e0eb098f8319a0976419dfd60087c6252e26c2570db15994ed084d3156069019541c3b32acc7f1827c92792
+ metadata.gz: 8a27652b71192d9a92b520f9c95af0990aae08af3e649b6c2d6f6827dacc3b8d402d3833e96abf2e7c5d20728defc19b1b92a4bc6b5778f47809f9266a34e11f
+ data.tar.gz: 435c45905b38b3f3f0e0641799394a9834c7ccd6e24d5f56adbf5cabed9345c77f7e842d33c91637c775c30f5b13cb26e434f9eb5fce45ac417b9e89544a9044
data/CHANGELOG.md CHANGED
@@ -7,6 +7,30 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 
  ## [Unreleased]
 
+ ## [0.10.0] - 2026-05-05
+
+ ### Added
+ - Expose prompt cache operations on the client (#89)
+
+ ### Changed
+ - Tighten cache event dispatch and generation safety for prompt caching (#90)
+
+ ### Documentation
+ - Align README with sibling Langfuse SDKs (#88)
+
+ ## [0.9.0] - 2026-04-28
+
+ ### Added
+ - Expose `type`, `commit_message`, and `resolution_graph` metadata on text and chat prompt clients (#87)
+
+ ### Fixed
+ - Preserve and compile chat prompt message placeholders in parity with Langfuse Python and JS SDKs (#86)
+ - Preserve raw prompt compile variables instead of HTML-escaping JSON, XML, and HTML-like values (#85)
+ - Suppress prompt name/version attribution on fallback prompt clients so fallback output is not reported as prompt version 0 (#84)
+
+ ### Documentation
+ - Link to upstream Langfuse agent skills and refresh README header image (#81, #83)
+
  ## [0.8.0] - 2026-04-24
 
  ### Added
@@ -100,7 +124,9 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
  - Migrated from legacy ingestion API to OTLP endpoint
  - Removed `tracing_enabled` configuration flag (#2)
 
- [Unreleased]: https://github.com/simplepractice/langfuse-rb/compare/v0.8.0...HEAD
+ [Unreleased]: https://github.com/simplepractice/langfuse-rb/compare/v0.10.0...HEAD
+ [0.10.0]: https://github.com/simplepractice/langfuse-rb/compare/v0.9.0...v0.10.0
+ [0.9.0]: https://github.com/simplepractice/langfuse-rb/compare/v0.8.0...v0.9.0
  [0.8.0]: https://github.com/simplepractice/langfuse-rb/compare/v0.7.0...v0.8.0
  [0.7.0]: https://github.com/simplepractice/langfuse-rb/compare/v0.6.0...v0.7.0
  [0.6.0]: https://github.com/simplepractice/langfuse-rb/compare/v0.5.0...v0.6.0
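
The 0.9.0 fix for #85 is easiest to see with a toy compiler. A hedged sketch (the helper names are illustrative, not the SDK's internals): a Mustache-style substitution that HTML-escapes every variable mangles JSON/XML payloads, while raw substitution preserves them.

```ruby
require "cgi"

# Illustrative only: a naive {{var}} compiler that HTML-escapes values,
# versus one that substitutes them verbatim, as the fixed SDK behavior does.
def compile_escaped(template, variables)
  template.gsub(/\{\{(\w+)\}\}/) { CGI.escape_html(variables[Regexp.last_match(1).to_sym].to_s) }
end

def compile_raw(template, variables)
  template.gsub(/\{\{(\w+)\}\}/) { variables[Regexp.last_match(1).to_sym].to_s }
end

template = "Summarize: {{payload}}"
vars = { payload: '{"user":"<alice>"}' }

compile_escaped(template, vars) # => 'Summarize: {&quot;user&quot;:&quot;&lt;alice&gt;&quot;}'
compile_raw(template, vars)     # => 'Summarize: {"user":"<alice>"}'
```

The escaped variant silently corrupts structured values passed into a prompt; preserving the raw string is the behavior the changelog entry describes.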
data/README.md CHANGED
@@ -1,74 +1,20 @@
- ![header](https://camo.githubusercontent.com/26d19b945bc752101b4aca468e07b118a44af07340db79af29f7df95505f2cea/68747470733a2f2f6c616e67667573652e636f6d2f6c616e67667573655f6c6f676f5f77686974652e706e67)
+ <img width="2255" height="527" alt="langfuse-wordart" src="https://github.com/user-attachments/assets/59422d0a-6ecb-4e5f-a21c-cae955b5ce75" />
 
  # Langfuse Ruby SDK
 
- [![Gem Version](https://badge.fury.io/rb/langfuse-rb.svg?icon=si%3Arubygems)](https://badge.fury.io/rb/langfuse-rb)
- [![Ruby](https://img.shields.io/badge/ruby-%3E%3D%203.2.0-ruby.svg)](https://www.ruby-lang.org/en/)
- [![Test Coverage](https://img.shields.io/badge/coverage-99.6%25-brightgreen.svg)](coverage)
-
- > Ruby SDK for [Langfuse](https://langfuse.com) - Open-source LLM observability and prompt management.
+ [![MIT License](https://img.shields.io/badge/License-MIT-red.svg?style=flat-square)](https://opensource.org/licenses/MIT)
+ [![CI test status](https://img.shields.io/github/actions/workflow/status/simplepractice/langfuse-rb/ci.yml?style=flat-square&label=All%20tests)](https://github.com/simplepractice/langfuse-rb/actions/workflows/ci.yml?query=branch%3Amain)
+ [![Gem Version](https://img.shields.io/gem/v/langfuse-rb.svg?style=flat-square&label=gem+langfuse-rb)](https://rubygems.org/gems/langfuse-rb)
 
  ## Installation
 
- ```ruby
- gem "langfuse-rb"
- ```
-
- ## Quick Start
-
- ```ruby
- Langfuse.configure do |config|
- config.public_key = ENV["LANGFUSE_PUBLIC_KEY"]
- config.secret_key = ENV["LANGFUSE_SECRET_KEY"]
- config.base_url = ENV.fetch("LANGFUSE_BASE_URL", "https://cloud.langfuse.com")
-
- # Optional: sample traces and trace-linked scores deterministically
- config.sample_rate = 1.0
- end
-
- message = Langfuse.client.compile_prompt(
- "greeting",
- variables: { name: "Alice" }
- )
- ```
-
- Langfuse tracing is isolated by default. `Langfuse.configure` stores configuration only; it does not replace `OpenTelemetry.tracer_provider`.
-
- `sample_rate` is applied to traces and trace-linked scores. Rebuild the client with `Langfuse.reset!` before expecting runtime sampling changes to take effect.
-
- ## Trace an LLM Call
+ > [!IMPORTANT]
+ > The SDK requires Ruby `>= 3.2.0`.
 
  ```ruby
- Langfuse.observe("chat-completion", as_type: :generation) do |gen|
- gen.model = "gpt-4.1-mini"
- gen.input = [{ role: "user", content: "Hello!" }]
-
- response = openai_client.chat(
- parameters: {
- model: "gpt-4.1-mini",
- messages: [{ role: "user", content: "Hello!" }]
- }
- )
-
- gen.update(
- output: response.dig("choices", 0, "message", "content"),
- usage_details: {
- prompt_tokens: response.dig("usage", "prompt_tokens"),
- completion_tokens: response.dig("usage", "completion_tokens")
- }
- )
- end
+ gem "langfuse-rb"
  ```
 
- ## Start Here
-
- - [Documentation Hub](docs/README.md)
- - [Getting Started](docs/GETTING_STARTED.md)
- - [Prompts](docs/PROMPTS.md)
- - [Tracing](docs/TRACING.md)
- - [Scoring](docs/SCORING.md)
- - [Rails Patterns](docs/RAILS.md)
-
- ## License
+ ## Docs
 
- [MIT](LICENSE)
+ Please [see our docs](docs/README.md) for detailed information on this SDK.
@@ -5,6 +5,7 @@ require "faraday/retry"
  require "base64"
  require "json"
  require "uri"
+ require_relative "prompt_fetch_result"
 
  module Langfuse
  # HTTP client for Langfuse API
@@ -22,6 +23,16 @@ module Langfuse
  # )
  #
  class ApiClient # rubocop:disable Metrics/ClassLength
+ include PromptCacheEvents
+
+ # Bundles the resolved cache key with the per-call TTL override so private
+ # prompt-fetch helpers take one arg instead of four.
+ PromptFetchOptions = Struct.new(:key, :cache_ttl, keyword_init: true) do
+ def name = key.name
+ def version = key.version
+ def label = key.label
+ end
+
  # @return [String] Langfuse public API key
  attr_reader :public_key
 
@@ -48,15 +59,20 @@ module Langfuse
  # @param timeout [Integer] HTTP request timeout in seconds
  # @param logger [Logger] Logger instance for debugging
  # @param cache [PromptCache, RailsCacheAdapter, nil] Optional cache for prompt responses
+ # @param cache_observer [#call, nil] Optional observer for prompt cache events
  # @return [ApiClient]
- def initialize(public_key:, secret_key:, base_url:, timeout: 5, logger: nil, cache: nil)
+ # rubocop:disable Metrics/ParameterLists
+ def initialize(public_key:, secret_key:, base_url:, timeout: 5, logger: nil, cache: nil, cache_observer: nil)
  @public_key = public_key
  @secret_key = secret_key
  @base_url = base_url
  @timeout = timeout
  @logger = logger || Logger.new($stdout, level: Logger::WARN)
  @cache = cache
+ @cache_backend_name = compute_cache_backend_name
+ setup_prompt_cache_events(cache_observer: cache_observer)
  end
+ # rubocop:enable Metrics/ParameterLists
 
  # Get a Faraday connection
  #
@@ -109,17 +125,127 @@ module Langfuse
  # @param name [String] The name of the prompt
  # @param version [Integer, nil] Optional specific version number
  # @param label [String, nil] Optional label (e.g., "production", "latest")
+ # @param cache_ttl [Integer, nil] Optional TTL override for this fetch
  # @return [Hash] The prompt data
  # @raise [ArgumentError] if both version and label are provided
  # @raise [NotFoundError] if the prompt is not found
  # @raise [UnauthorizedError] if authentication fails
  # @raise [ApiError] for other API errors
- def get_prompt(name, version: nil, label: nil)
+ def get_prompt(name, version: nil, label: nil, cache_ttl: nil)
+ get_prompt_result(name, version: version, label: label, cache_ttl: cache_ttl).prompt
+ end
+
+ # Fetch a prompt and include cache metadata.
+ #
+ # @param name [String] The name of the prompt
+ # @param version [Integer, nil] Optional specific version number
+ # @param label [String, nil] Optional label (e.g., "production", "latest")
+ # @param cache_ttl [Integer, nil] Optional TTL override for this fetch
+ # @return [PromptFetchResult] Prompt data plus cache metadata
+ # @raise [ArgumentError] if both version and label are provided
+ # @raise [ArgumentError] if cache_ttl is negative
+ # @raise [NotFoundError] if the prompt is not found
+ # @raise [UnauthorizedError] if authentication fails
+ # @raise [ApiError] for other API errors
+ def get_prompt_result(name, version: nil, label: nil, cache_ttl: nil)
+ validate_prompt_fetch_options!(version, label, cache_ttl)
+
+ options = PromptFetchOptions.new(
+ key: prompt_cache_key(name, version: version, label: label),
+ cache_ttl: cache_ttl
+ )
+ return fetch_uncached_prompt_result(options, CacheStatus::DISABLED) if cache.nil?
+ return fetch_uncached_prompt_result(options, CacheStatus::BYPASS) if cache_ttl&.zero?
+
+ fetch_cached_prompt_result(options)
+ end
+
+ # Refresh a prompt from the API, optionally writing through to cache.
+ #
+ # @param name [String] The name of the prompt
+ # @param version [Integer, nil] Optional specific version number
+ # @param label [String, nil] Optional label
+ # @param cache_ttl [Integer, nil] Optional TTL override for this refresh
+ # @return [PromptFetchResult] Prompt data plus cache metadata
+ # @raise [ArgumentError] if both version and label are provided
+ # @raise [ArgumentError] if cache_ttl is negative
+ # @raise [NotFoundError] if the prompt is not found
+ # @raise [UnauthorizedError] if authentication fails
+ # @raise [ApiError] for other API errors
+ def refresh_prompt(name, version: nil, label: nil, cache_ttl: nil)
+ validate_prompt_fetch_options!(version, label, cache_ttl)
+
+ refresh_prompt_result(
+ PromptFetchOptions.new(
+ key: prompt_cache_key(name, version: version, label: label),
+ cache_ttl: cache_ttl
+ )
+ )
+ end
+
+ # Inspect the logical and generated cache keys for a prompt.
+ #
+ # @param name [String] The prompt name
+ # @param version [Integer, nil] Optional specific version number
+ # @param label [String, nil] Optional label
+ # @return [PromptCacheKey] Logical and generated cache keys
+ # @raise [ArgumentError] if both version and label are provided
+ def prompt_cache_key(name, version: nil, label: nil)
  raise ArgumentError, "Cannot specify both version and label" if version && label
- return fetch_prompt_from_api(name, version: version, label: label) if cache.nil?
 
- cache_key = PromptCache.build_key(name, version: version, label: label)
- fetch_with_appropriate_caching_strategy(cache_key, name, version, label)
+ logical_key = PromptCache.build_key(name, version: version, label: label)
+ storage_key = if generated_storage_key_cache?
+ cache.storage_key(logical_key, name: name)
+ else
+ logical_key
+ end
+ PromptCacheKey.new(name: name, version: version, label: label, logical_key: logical_key, storage_key: storage_key)
+ end
+
+ # Invalidate one exact logical prompt cache key.
+ #
+ # @param name [String] The prompt name
+ # @param version [Integer, nil] Optional specific version number
+ # @param label [String, nil] Optional label
+ # @return [PromptCacheKey] The invalidated key
+ # @raise [ArgumentError] if both version and label are provided
+ def invalidate_prompt_cache(name, version: nil, label: nil)
+ key = prompt_cache_key(name, version: version, label: label)
+ deleted = cache&.delete(key.storage_key) || false
+ emit_prompt_cache_event(:delete) { event_payload(key, CacheStatus::MISS, CacheSource::CACHE, deleted: deleted) }
+ emit_prompt_cache_event(:invalidate) do
+ event_payload(key, CacheStatus::MISS, CacheSource::CACHE, scope: :exact)
+ end
+ key
+ end
+
+ # Invalidate all cached variants for one prompt name.
+ #
+ # @param name [String] The prompt name
+ # @return [Integer, nil] New generation, or nil when cache is disabled
+ def invalidate_prompt_cache_by_name(name)
+ generation = cache&.invalidate_name(name)
+ payload = { name: name, backend: cache_backend_name, generation: generation, scope: :name }
+ emit_prompt_cache_event(:invalidate, payload)
+ generation
+ end
+
+ # Logically clear the whole Langfuse prompt cache namespace.
+ #
+ # @return [Integer, nil] New global generation, or nil when cache is disabled
+ def clear_prompt_cache
+ generation = cache&.clear_logically
+ emit_prompt_cache_event(:clear, backend: cache_backend_name, generation: generation)
+ generation
+ end
+
+ # Return prompt cache statistics.
+ #
+ # @return [Hash] Cache statistics
+ def prompt_cache_stats
+ return disabled_prompt_cache_stats unless cache
+
+ cache.stats
  end
 
  # Create a new prompt (or new version if prompt with same name exists)
@@ -158,7 +284,7 @@ module Langfuse
  payload[:commitMessage] = commit_message if commit_message
 
  response = connection.post(path, payload)
- handle_response(response)
+ handle_response(response).tap { invalidate_prompt_cache_after_mutation(name) }
  end
  end
  # rubocop:enable Metrics/ParameterLists
@@ -188,7 +314,7 @@ module Langfuse
  payload = { newLabels: labels }
 
  response = connection.patch(path, payload)
- handle_response(response)
+ handle_response(response).tap { invalidate_prompt_cache_after_mutation(name) }
  end
  end
 
@@ -595,21 +721,219 @@ module Langfuse
 
  private
 
- # Fetch prompt using the most appropriate caching strategy available
- #
- # @param cache_key [String] The cache key for this prompt
- # @param name [String] The name of the prompt
- # @param version [Integer, nil] Optional specific version number
- # @param label [String, nil] Optional label
- # @return [Hash] The prompt data
- def fetch_with_appropriate_caching_strategy(cache_key, name, version, label)
- if swr_cache_available?
- fetch_with_swr_cache(cache_key, name, version, label)
- elsif distributed_cache_available?
- fetch_with_distributed_cache(cache_key, name, version, label)
+ def validate_prompt_fetch_options!(version, label, cache_ttl)
+ raise ArgumentError, "Cannot specify both version and label" if version && label
+ return if cache_ttl.nil?
+ raise ArgumentError, "cache_ttl must be a non-negative Integer" unless cache_ttl.is_a?(Integer)
+ raise ArgumentError, "cache_ttl must be non-negative" if cache_ttl.negative?
+ end
+
+ def fetch_uncached_prompt_result(options, cache_status)
+ prompt_data = fetch_prompt_for_options(options)
+ build_prompt_result(options.key, prompt_data, cache_status, CacheSource::API)
+ end
+
+ def fetch_cached_prompt_result(options)
+ return fetch_swr_prompt_result(options) if swr_cache_available?
+
+ fetch_non_swr_prompt_result(options)
+ end
+
+ def fetch_swr_prompt_result(options)
+ unless generated_storage_key_cache?
+ prompt_data = fetch_with_swr_cache(options.key.storage_key, options.name, options.version, options.label)
+ return cache_hit_prompt_result(options.key, prompt_data)
+ end
+
+ result = fetch_swr_cached_prompt_result(options)
+ return result if result
+
+ fetch_cache_miss_prompt_result(options, swr_enabled: true, distributed_enabled: false)
+ end
+
+ def fetch_non_swr_prompt_result(options)
+ distributed_enabled = distributed_cache_available?
+
+ if !generated_storage_key_cache? && distributed_enabled
+ prompt_data = fetch_with_distributed_cache(options.key.storage_key, options.name, options.version,
+ options.label)
+ return cache_hit_prompt_result(options.key, prompt_data)
+ end
+
+ cached_data = cache.get(options.key.storage_key)
+ return cache_hit_prompt_result(options.key, cached_data) if cached_data
+
+ fetch_cache_miss_prompt_result(options, swr_enabled: false, distributed_enabled: distributed_enabled)
+ end
+
+ def fetch_swr_cached_prompt_result(options)
+ key = options.key
+ entry = cache.entry(key.storage_key) if cache.respond_to?(:entry)
+ return nil unless entry.respond_to?(:fresh?)
+ return cache_hit_prompt_result(key, entry.data) if entry.fresh?
+ return nil unless entry.stale?
+
+ emit_prompt_cache_event(:stale_serve) { event_payload(key, CacheStatus::STALE, CacheSource::CACHE) }
+ schedule_prompt_cache_refresh(options)
+ build_prompt_result(key, entry.data, CacheStatus::STALE, CacheSource::CACHE)
+ end
+
+ def cache_hit_prompt_result(key, prompt_data)
+ emit_prompt_cache_event(:hit) { event_payload(key, CacheStatus::HIT, CacheSource::CACHE) }
+ build_prompt_result(key, prompt_data, CacheStatus::HIT, CacheSource::CACHE)
+ end
+
+ def fetch_cache_miss_prompt_result(options, swr_enabled: false, distributed_enabled: nil)
+ emit_prompt_cache_event(:miss) { event_payload(options.key, CacheStatus::MISS, CacheSource::API) }
+ distributed_enabled = distributed_cache_available? if distributed_enabled.nil?
+
+ if !swr_enabled && distributed_enabled
+ fetch_cache_miss_with_lock(options)
  else
- fetch_with_simple_cache(cache_key, name, version, label)
+ fetch_cache_miss_directly(options, swr_enabled: swr_enabled)
+ end
+ end
+
+ def fetch_cache_miss_with_lock(options)
+ key = options.key
+ fetched = false
+ prompt_data = cache_fetch_with_lock(key.storage_key, options.cache_ttl) do
+ fetched = true
+ fetch_prompt_for_options(options)
+ end
+ emit_prompt_cache_event(:write) { event_payload(key, CacheStatus::MISS, CacheSource::API) } if fetched
+ status = fetched ? CacheStatus::MISS : CacheStatus::HIT
+ source = fetched ? CacheSource::API : CacheSource::CACHE
+ build_prompt_result(key, prompt_data, status, source)
+ end
+
+ def fetch_cache_miss_directly(options, swr_enabled: false)
+ prompt_data = fetch_prompt_for_options(options)
+ write_prompt_cache(options.key, prompt_data, options.cache_ttl, swr_enabled: swr_enabled)
+ build_prompt_result(options.key, prompt_data, CacheStatus::MISS, CacheSource::API)
+ end
+
+ def refresh_prompt_result(options)
+ key = options.key
+ emit_prompt_cache_event(:refresh_start) { event_payload(key, CacheStatus::REFRESH, CacheSource::API) }
+ prompt_data = fetch_prompt_for_options(options)
+ write_refresh_prompt_cache(key, prompt_data, options.cache_ttl)
+ status = refresh_cache_status(options.cache_ttl)
+ emit_prompt_cache_event(:refresh_success) { event_payload(key, status, CacheSource::API) }
+ build_prompt_result(key, prompt_data, status, CacheSource::API)
+ rescue StandardError => e
+ emit_prompt_cache_event(:refresh_failure) do
+ event_payload(key, CacheStatus::REFRESH, CacheSource::API,
+ error_class: e.class.name, error_message: e.message)
  end
+ raise
+ end
+
+ def schedule_prompt_cache_refresh(options)
+ return unless cache.respond_to?(:refresh_async)
+
+ key = options.key
+ scheduled = cache.refresh_async(
+ key.storage_key,
+ ttl: options.cache_ttl,
+ on_success: ->(_value) { emit_refresh_success_events(key) },
+ on_failure: ->(error) { emit_refresh_failure_event(key, error) }
+ ) { fetch_prompt_for_options(options) }
+ return unless scheduled
+
+ emit_prompt_cache_event(:refresh_start) { event_payload(key, CacheStatus::STALE, CacheSource::CACHE) }
+ end
+
+ def fetch_prompt_for_options(options)
+ fetch_prompt_from_api(options.name, version: options.version, label: options.label)
+ end
+
+ def emit_refresh_success_events(key)
+ emit_prompt_cache_event(:refresh_success) { event_payload(key, CacheStatus::REFRESH, CacheSource::API) }
+ emit_prompt_cache_event(:write) { event_payload(key, CacheStatus::REFRESH, CacheSource::API) }
+ end
+
+ def emit_refresh_failure_event(key, error)
+ emit_prompt_cache_event(:refresh_failure) do
+ event_payload(key, CacheStatus::STALE, CacheSource::CACHE,
+ error_class: error.class.name, error_message: error.message)
+ end
+ end
+
+ def write_refresh_prompt_cache(key, prompt_data, cache_ttl)
+ return unless cache
+ return if cache_ttl&.zero?
+
+ write_prompt_cache(key, prompt_data, cache_ttl,
+ cache_status: CacheStatus::REFRESH, swr_enabled: swr_cache_available?)
+ end
+
+ def write_prompt_cache(key, prompt_data, cache_ttl, cache_status: CacheStatus::MISS, swr_enabled: false)
+ if swr_enabled && cache.respond_to?(:write_with_stale_while_revalidate)
+ cache.write_with_stale_while_revalidate(key.storage_key, prompt_data, ttl: cache_ttl)
+ elsif cache_ttl.nil?
+ cache.set(key.storage_key, prompt_data)
+ else
+ cache.set(key.storage_key, prompt_data, ttl: cache_ttl)
+ end
+ emit_prompt_cache_event(:write) { event_payload(key, cache_status, CacheSource::API) }
+ end
+
+ def cache_fetch_with_lock(storage_key, cache_ttl, &)
+ return cache.fetch_with_lock(storage_key, &) if cache_ttl.nil?
+
+ cache.fetch_with_lock(storage_key, ttl: cache_ttl, &)
+ end
+
+ def refresh_cache_status(cache_ttl)
+ return CacheStatus::DISABLED unless cache
+ return CacheStatus::BYPASS if cache_ttl&.zero?
+
+ CacheStatus::REFRESH
+ end
+
+ def build_prompt_result(key, prompt_data, cache_status, source)
+ PromptFetchResult.new(
+ prompt: prompt_data,
+ logical_key: key.logical_key,
+ storage_key: key.storage_key,
+ cache_status: cache_status,
+ source: source,
+ name: prompt_data["name"] || key.name,
+ version: prompt_data["version"] || key.version,
+ label: key.resolved_label
+ )
+ end
+
+ attr_reader :cache_backend_name
+
+ def compute_cache_backend_name
+ return CacheBackend::DISABLED unless cache
+ return CacheBackend::RAILS if cache.is_a?(RailsCacheAdapter)
+ return CacheBackend::MEMORY if cache.is_a?(PromptCache)
+
+ cache.class.name
+ end
+
+ def disabled_prompt_cache_stats
+ {
+ backend: CacheBackend::DISABLED,
+ enabled: false,
+ current_generation_entries: nil,
+ orphaned_entries: nil,
+ total_entries: nil,
+ unsupported_counts: CacheBackend::UNSUPPORTED_COUNT_KEYS
+ }
+ end
+
+ def generated_storage_key_cache?
+ cache.is_a?(PromptCache) || cache.is_a?(RailsCacheAdapter)
+ end
+
+ def invalidate_prompt_cache_after_mutation(name)
+ generation = cache&.invalidate_name(name)
+ payload = { name: name, backend: cache_backend_name, generation: generation, scope: :name, mutation: true }
+ emit_prompt_cache_event(:invalidate, payload)
  end
 
  # Check if SWR cache is available
@@ -707,16 +1031,6 @@ module Langfuse
  end
  end
 
- # Fetch with simple cache (in-memory cache)
- def fetch_with_simple_cache(cache_key, name, version, label)
- cached_data = cache.get(cache_key)
- return cached_data if cached_data
-
- prompt_data = fetch_prompt_from_api(name, version: version, label: label)
- cache.set(cache_key, prompt_data)
- prompt_data
- end
-
  # Fetch a prompt from the API (without caching)
  #
  # @param name [String] The name of the prompt
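
The new `get_prompt_result` entry point resolves its cache status before doing any other work: no cache means `CacheStatus::DISABLED`, a `cache_ttl` of `0` means `CacheStatus::BYPASS`, and anything else goes through the cached path. A condensed, self-contained sketch of that dispatch plus the `validate_prompt_fetch_options!` guard (the `:cached_path` marker is illustrative and stands in for the SWR / distributed / direct strategies the real client selects next):

```ruby
# Condensed from the diff: argument validation and the initial cache-status
# dispatch of get_prompt_result.
def validate_prompt_fetch_options!(version, label, cache_ttl)
  raise ArgumentError, "Cannot specify both version and label" if version && label
  return if cache_ttl.nil?
  raise ArgumentError, "cache_ttl must be a non-negative Integer" unless cache_ttl.is_a?(Integer)
  raise ArgumentError, "cache_ttl must be non-negative" if cache_ttl.negative?
end

def initial_cache_status(cache, cache_ttl)
  return :disabled if cache.nil?     # mirrors CacheStatus::DISABLED
  return :bypass if cache_ttl&.zero? # mirrors CacheStatus::BYPASS

  :cached_path # illustrative stand-in for the real strategy selection
end
```

Note that `cache_ttl: 0` is a per-call bypass, not an error: validation accepts zero, and the dispatch then skips the cache entirely for that fetch.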
@@ -0,0 +1,32 @@
+ # frozen_string_literal: true
+
+ module Langfuse
+ # Symbol constants for prompt cache event payloads.
+ # Producers (ApiClient, PromptFetchResult) and consumers (observers,
+ # ActiveSupport::Notifications subscribers) share these definitions so a
+ # rename in one place can't silently desync from the other.
+ module CacheStatus
+ HIT = :hit
+ MISS = :miss
+ STALE = :stale
+ REFRESH = :refresh
+ BYPASS = :bypass
+ DISABLED = :disabled
+ end
+
+ module CacheSource
+ CACHE = :cache
+ API = :api
+ FALLBACK = :fallback
+ end
+
+ module CacheBackend
+ MEMORY = "memory"
+ RAILS = "rails"
+ DISABLED = "disabled"
+
+ # Stat keys backend implementations may not be able to compute. Surfaced in
+ # `#stats[:unsupported_counts]` so callers can distinguish "0" from "unknown".
+ UNSUPPORTED_COUNT_KEYS = %i[current_generation_entries orphaned_entries total_entries].freeze
+ end
+ end
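
Since these constants ship in the gem, a `cache_observer` can match on them instead of bare symbols. A hedged sketch: the `CacheStatus` module is reproduced verbatim from the new file so the example is self-contained, but the observer's two-argument `call(event, payload)` signature and the counting logic are illustrative assumptions (the client only documents `cache_observer` as any `#call`-able):

```ruby
module Langfuse
  # Reproduced from the new file above so this example runs standalone.
  module CacheStatus
    HIT = :hit
    MISS = :miss
    STALE = :stale
    REFRESH = :refresh
    BYPASS = :bypass
    DISABLED = :disabled
  end
end

# Hypothetical observer: tallies cache events by (event, status) pair.
# The payload key :cache_status mirrors the field names used in the diff,
# but the exact payload shape delivered to observers is an assumption here.
class CacheStatsObserver
  attr_reader :counts

  def initialize
    @counts = Hash.new(0)
  end

  def call(event, payload)
    @counts[[event, payload[:cache_status]]] += 1
  end
end

observer = CacheStatsObserver.new
observer.call(:hit, { cache_status: Langfuse::CacheStatus::HIT })
observer.call(:miss, { cache_status: Langfuse::CacheStatus::MISS })
observer.call(:hit, { cache_status: Langfuse::CacheStatus::HIT })
observer.counts # => {[:hit, :hit]=>2, [:miss, :miss]=>1}
```

Sharing the constant modules between producer and consumer is the point of the new file: a renamed status symbol breaks both sides loudly rather than silently dropping events.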