langfuse-ruby 0.1.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
data/README.md ADDED
@@ -0,0 +1,436 @@
# Langfuse Ruby SDK

[![Gem Version](https://badge.fury.io/rb/langfuse-ruby.svg)](https://badge.fury.io/rb/langfuse-ruby)
[![Build Status](https://github.com/your-username/langfuse-ruby/workflows/CI/badge.svg)](https://github.com/your-username/langfuse-ruby/actions)

Ruby SDK for [Langfuse](https://langfuse.com) - the open-source LLM engineering platform. This SDK provides comprehensive tracing, prompt management, and evaluation capabilities for LLM applications.

## Features

- 🔍 **Tracing**: Complete observability for LLM applications with traces, spans, and generations
- 📝 **Prompt Management**: Version control and deployment of prompts with caching
- 📊 **Evaluation**: Built-in evaluators and custom scoring capabilities
- 🚀 **Async Processing**: Background event processing with automatic batching
- 🔒 **Type Safety**: Comprehensive error handling and validation
- 🎯 **Framework Integration**: Easy integration with popular Ruby frameworks

## Installation

Add this line to your application's Gemfile:

```ruby
gem 'langfuse-ruby'
```

And then execute:

```bash
$ bundle install
```

Or install it yourself as:

```bash
$ gem install langfuse-ruby
```

## Quick Start

### 1. Initialize the Client

```ruby
require 'langfuse'

# Initialize with API keys
client = Langfuse.new(
  public_key: "pk-lf-...",
  secret_key: "sk-lf-...",
  host: "https://cloud.langfuse.com" # Optional, defaults to cloud.langfuse.com
)

# Or configure globally
Langfuse.configure do |config|
  config.public_key = "pk-lf-..."
  config.secret_key = "sk-lf-..."
  config.host = "https://cloud.langfuse.com"
end

client = Langfuse.new
```

### 2. Basic Tracing

```ruby
# Create a trace
trace = client.trace(
  name: "chat-completion",
  user_id: "user123",
  session_id: "session456",
  input: { message: "Hello, world!" },
  metadata: { environment: "production" }
)

# Add a generation (LLM call)
generation = trace.generation(
  name: "openai-completion",
  model: "gpt-3.5-turbo",
  input: [{ role: "user", content: "Hello, world!" }],
  output: { content: "Hello! How can I help you today?" },
  usage: { prompt_tokens: 10, completion_tokens: 15, total_tokens: 25 },
  model_parameters: { temperature: 0.7, max_tokens: 100 }
)

# Update trace with final output
trace.update(
  output: { response: "Hello! How can I help you today?" }
)

# Flush events (optional - happens automatically)
client.flush
```

### 3. Nested Spans

```ruby
trace = client.trace(name: "document-qa")

# Create a span for document retrieval
retrieval_span = trace.span(
  name: "document-retrieval",
  input: { query: "What is machine learning?" }
)

# Add a generation for embedding
embedding_gen = retrieval_span.generation(
  name: "embedding-generation",
  model: "text-embedding-ada-002",
  input: "What is machine learning?",
  output: [0.1, 0.2, 0.3], # embedding vector
  usage: { prompt_tokens: 5, total_tokens: 5 }
)

# End the retrieval span
retrieval_span.end(
  output: { documents: ["ML is...", "Machine learning involves..."] }
)

# Create a span for answer generation
answer_span = trace.span(
  name: "answer-generation",
  input: {
    query: "What is machine learning?",
    context: ["ML is...", "Machine learning involves..."]
  }
)

# Add LLM generation
llm_gen = answer_span.generation(
  name: "openai-completion",
  model: "gpt-3.5-turbo",
  input: [
    { role: "system", content: "Answer based on context" },
    { role: "user", content: "What is machine learning?" }
  ],
  output: { content: "Machine learning is a subset of AI..." },
  usage: { prompt_tokens: 50, completion_tokens: 30, total_tokens: 80 }
)

answer_span.end(output: { answer: "Machine learning is a subset of AI..." })
```

## Prompt Management

### Get and Use Prompts

```ruby
# Get a prompt
prompt = client.get_prompt("chat-prompt", version: 1)

# Compile prompt with variables
compiled = prompt.compile(
  user_name: "Alice",
  topic: "machine learning"
)

puts compiled
# Output: "Hello Alice! Let's discuss machine learning today."
```
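
A compiled prompt plugs straight into tracing: the rendered text can be used as the input of a generation. A minimal sketch, assuming the `greeting-prompt` created in the next section exists on the server, and reusing only calls shown elsewhere in this README:

```ruby
# Fetch and compile a prompt, then trace the LLM call that uses it
prompt = client.get_prompt("greeting-prompt", version: 1)
compiled = prompt.compile(user_name: "Alice", topic: "machine learning")

trace = client.trace(name: "greeting", user_id: "user123")
trace.generation(
  name: "greeting-completion",
  model: "gpt-3.5-turbo",
  input: [{ role: "user", content: compiled }]
)
```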

### Create Prompts

```ruby
# Create a text prompt
text_prompt = client.create_prompt(
  name: "greeting-prompt",
  prompt: "Hello {{user_name}}! How can I help you with {{topic}} today?",
  labels: ["greeting", "customer-service"],
  config: { temperature: 0.7 }
)

# Create a chat prompt
chat_prompt = client.create_prompt(
  name: "chat-prompt",
  prompt: [
    { role: "system", content: "You are a helpful assistant specialized in {{domain}}." },
    { role: "user", content: "{{user_message}}" }
  ],
  labels: ["chat", "assistant"]
)
```

### Prompt Templates

```ruby
# Create prompt templates for reuse
template = Langfuse::PromptTemplate.from_template(
  "Translate the following text to {{language}}: {{text}}"
)

translated = template.format(
  language: "Spanish",
  text: "Hello, world!"
)

# Chat prompt templates
chat_template = Langfuse::ChatPromptTemplate.from_messages([
  { role: "system", content: "You are a {{role}} assistant." },
  { role: "user", content: "{{user_input}}" }
])

messages = chat_template.format(
  role: "helpful",
  user_input: "What is Ruby?"
)
```

## Evaluation and Scoring

### Built-in Evaluators

```ruby
# Exact match evaluator
exact_match = Langfuse::Evaluators::ExactMatchEvaluator.new

result = exact_match.evaluate(
  input: "What is 2+2?",
  output: "4",
  expected: "4"
)
# => { name: "exact_match", value: 1, comment: "Exact match" }

# Similarity evaluator
similarity = Langfuse::Evaluators::SimilarityEvaluator.new

result = similarity.evaluate(
  input: "What is AI?",
  output: "Artificial Intelligence is...",
  expected: "AI is artificial intelligence..."
)
# => { name: "similarity", value: 0.85, comment: "Similarity: 85%" }

# Length evaluator
length = Langfuse::Evaluators::LengthEvaluator.new(min_length: 10, max_length: 100)

result = length.evaluate(
  input: "Explain AI",
  output: "AI is a field of computer science that focuses on creating intelligent machines."
)
# => { name: "length", value: 1, comment: "Length 80 within range" }
```
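
Evaluator results come back in the same `name`/`value`/`comment` shape that the `score` helpers in the next section accept, so an evaluation can be recorded directly against a generation. A minimal sketch combining the two APIs shown in this README:

```ruby
evaluator = Langfuse::Evaluators::ExactMatchEvaluator.new

trace = client.trace(name: "qa-eval", input: { question: "What is 2+2?" })
generation = trace.generation(
  name: "answer-generation",
  model: "gpt-3.5-turbo",
  input: [{ role: "user", content: "What is 2+2?" }],
  output: { content: "4" }
)

# Evaluate the model output against the expected answer...
result = evaluator.evaluate(input: "What is 2+2?", output: "4", expected: "4")

# ...and attach the result to the generation as a score
generation.score(
  name: result[:name],
  value: result[:value],
  comment: result[:comment]
)
```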

### Custom Scoring

```ruby
# Add scores to traces or observations
trace = client.trace(name: "qa-session")

# Score the entire trace
trace.score(
  name: "user-satisfaction",
  value: 0.9,
  comment: "User was very satisfied"
)

# Score specific generations
generation = trace.generation(
  name: "answer-generation",
  model: "gpt-3.5-turbo",
  output: { content: "The answer is 42." }
)

generation.score(
  name: "accuracy",
  value: 0.8,
  comment: "Mostly accurate answer"
)

generation.score(
  name: "helpfulness",
  value: 0.95,
  comment: "Very helpful response"
)
```

## Advanced Usage

### Error Handling

```ruby
begin
  client = Langfuse.new(
    public_key: "invalid-key",
    secret_key: "invalid-secret"
  )

  trace = client.trace(name: "test")
  client.flush
rescue Langfuse::AuthenticationError => e
  puts "Authentication failed: #{e.message}"
rescue Langfuse::RateLimitError => e
  puts "Rate limit exceeded: #{e.message}"
rescue Langfuse::NetworkError => e
  puts "Network error: #{e.message}"
rescue Langfuse::APIError => e
  puts "API error: #{e.message}"
end
```
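
For transient failures such as rate limits or network errors, a simple retry with backoff around `flush` can be layered on top of the client's own `retries` option (shown below); a minimal sketch:

```ruby
# Retry flush a few times on transient errors, with exponential backoff
attempts = 0
begin
  client.flush
rescue Langfuse::RateLimitError, Langfuse::NetworkError
  attempts += 1
  raise if attempts > 3

  sleep(2**attempts)
  retry
end
```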

### Configuration Options

```ruby
client = Langfuse.new(
  public_key: "pk-lf-...",
  secret_key: "sk-lf-...",
  host: "https://your-instance.langfuse.com",
  debug: true,  # Enable debug logging
  timeout: 30,  # Request timeout in seconds
  retries: 3    # Number of retry attempts
)
```

### Environment Variables

You can also configure the client using environment variables:

```bash
export LANGFUSE_PUBLIC_KEY="pk-lf-..."
export LANGFUSE_SECRET_KEY="sk-lf-..."
export LANGFUSE_HOST="https://cloud.langfuse.com"
```
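
With those variables exported, the client can be created without passing keys explicitly. A small sketch, assuming the SDK falls back to `LANGFUSE_PUBLIC_KEY`, `LANGFUSE_SECRET_KEY`, and `LANGFUSE_HOST` when no arguments or global configuration are provided:

```ruby
require 'langfuse'

# No keys passed here; the values are expected to come from the
# LANGFUSE_* environment variables exported above.
client = Langfuse.new

trace = client.trace(name: "env-configured-trace")
client.flush
```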

### Shutdown

```ruby
# Ensure all events are flushed before shutdown
client.shutdown
```
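
In long-running processes you can tie this to process exit so queued events are not lost; a small sketch using Ruby's `at_exit` hook:

```ruby
client = Langfuse.new

# Flush any remaining events when the process terminates
at_exit { client.shutdown }
```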

## Framework Integration

### Rails Integration

```ruby
# config/initializers/langfuse.rb
Langfuse.configure do |config|
  config.public_key = Rails.application.credentials.langfuse_public_key
  config.secret_key = Rails.application.credentials.langfuse_secret_key
  config.debug = Rails.env.development?
end

# In your controller or service
class ChatController < ApplicationController
  def create
    @client = Langfuse.new

    trace = @client.trace(
      name: "chat-request",
      user_id: current_user.id,
      session_id: session.id,
      input: params[:message],
      metadata: {
        controller: self.class.name,
        action: action_name,
        ip: request.remote_ip
      }
    )

    # Your LLM logic here
    response = generate_response(params[:message])

    trace.update(output: { response: response })

    render json: { response: response }
  end
end
```
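
If the action can raise before the trace is updated, flushing in an `ensure` block keeps queued events from being lost; a sketch of that pattern, with `generate_response` remaining the hypothetical LLM call from above:

```ruby
def create
  @client = Langfuse.new
  trace = @client.trace(name: "chat-request", input: params[:message])

  response = generate_response(params[:message])
  trace.update(output: { response: response })

  render json: { response: response }
ensure
  # Flush even if generate_response raised
  @client&.flush
end
```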

### Sidekiq Integration

```ruby
class LLMProcessingJob < ApplicationJob
  def perform(user_id, message)
    client = Langfuse.new

    trace = client.trace(
      name: "background-llm-processing",
      user_id: user_id,
      input: { message: message },
      metadata: { job_class: self.class.name }
    )

    # Process with LLM
    result = process_with_llm(message)

    trace.update(output: result)

    # Ensure events are flushed
    client.flush
  end
end
```

## Examples

Check out the `examples/` directory for more comprehensive examples:

- [Basic Tracing](examples/basic_tracing.rb)
- [Prompt Management](examples/prompt_management.rb)
- [Evaluation Pipeline](examples/evaluation_pipeline.rb)
- [Rails Integration](examples/rails_integration.rb)

## Development

After checking out the repo, run:

```bash
bin/setup
```

to install dependencies. Then run the tests with:

```bash
rake spec
```

You can also run:

```bash
bin/console
```

for an interactive prompt that lets you experiment.

## Contributing

Bug reports and pull requests are welcome on GitHub at https://github.com/your-username/langfuse-ruby.

## License

The gem is available as open source under the terms of the [MIT License](https://opensource.org/licenses/MIT).

## Links

- [Langfuse Documentation](https://langfuse.com/docs)
- [Langfuse GitHub](https://github.com/langfuse/langfuse)
- [API Reference](https://api.reference.langfuse.com)
- [Ruby SDK Documentation](https://rubydoc.info/gems/langfuse-ruby)
@@ -0,0 +1,130 @@
# Langfuse Ruby SDK Release Checklist

## 📋 Pre-release Checks

### 1. Code Quality
- [ ] All tests pass (`bundle exec rspec`)
- [ ] Offline tests pass (`ruby test_offline.rb`)
- [ ] Style checks pass (`bundle exec rubocop`)
- [ ] Documentation is up to date

### 2. Version Management
- [ ] Bump the version number (`lib/langfuse/version.rb`)
- [ ] Update the changelog (`CHANGELOG.md`)
- [ ] Update the README if necessary

### 3. Personal Information
- [ ] Update the author information in `langfuse.gemspec`
- [ ] Update the GitHub repository links
- [ ] Update the email address

### 4. Git Repository Setup
- [ ] Initialize the Git repository (`git init`)
- [ ] Add the remote repository
- [ ] Commit all changes
- [ ] Push to GitHub

## 🚀 Release Steps

### Method 1: Release script (recommended)
```bash
./scripts/release.sh
```

### Method 2: Manual release
```bash
# 1. Run the tests
bundle exec rspec
ruby test_offline.rb

# 2. Build the gem
gem build langfuse.gemspec

# 3. Push to RubyGems
gem push langfuse-ruby-0.1.0.gem

# 4. Create a Git tag
git tag v0.1.0
git push origin main
git push origin v0.1.0
```

### Method 3: GitHub Actions
1. Push the code to GitHub
2. Create a version tag (`git tag v0.1.0 && git push origin v0.1.0`)
3. GitHub Actions publishes the release automatically

## 📊 Post-release Verification

### 1. Check RubyGems
- [ ] Visit https://rubygems.org/gems/langfuse-ruby
- [ ] Confirm the version number is correct
- [ ] Check the download links

### 2. Test the Installation
```bash
# Install from RubyGems
gem install langfuse-ruby

# Test basic functionality
ruby -e "require 'langfuse'; puts Langfuse::VERSION"
```

### 3. Update Documentation
- [ ] Update the installation instructions in the README
- [ ] Update the project website (if any)
- [ ] Publish an announcement

## 🔧 Troubleshooting

### RubyGems authentication issues
```bash
# Sign in to RubyGems
gem signin

# Or set an API key
gem signin --key your_api_key
```

### Git tag issues
```bash
# Delete the local tag
git tag -d v0.1.0

# Delete the remote tag
git push origin --delete v0.1.0

# Recreate the tag
git tag v0.1.0
git push origin v0.1.0
```

### Version conflicts
```bash
# Check the published versions
gem search langfuse

# Bump the version number and publish again
# Edit lib/langfuse/version.rb
# Rebuild and push the gem
```
+ ```
111
+
112
+ ## 📞 获取帮助
113
+
114
+ 如果遇到问题:
115
+ 1. 查看 RubyGems 文档
116
+ 2. 检查 GitHub Issues
117
+ 3. 联系 Ruby 社区
118
+ 4. 参考其他成功的 gem 项目
119
+
120
+ ## 🎉 发布成功后
121
+
122
+ 1. 🎊 庆祝!您的 gem 已成功发布
123
+ 2. 📢 在社交媒体上分享
124
+ 3. 📝 写博客文章介绍
125
+ 4. 🔗 联系 Langfuse 团队添加到官方文档
126
+ 5. 📈 监控使用情况和反馈
127
+
128
+ ---
129
+
130
+ **记住:发布是一个里程碑,但维护才是长期的工作!**
data/Rakefile ADDED
@@ -0,0 +1,23 @@
require 'bundler/gem_tasks'
require 'rspec/core/rake_task'
require_relative 'lib/langfuse/version'

RSpec::Core::RakeTask.new(:spec)

task default: :spec

# Custom release task
desc 'Release gem to RubyGems'
task release_gem: [:build] do
  sh "gem push pkg/langfuse-ruby-#{Langfuse::VERSION}.gem"
end

# Offline test task
desc 'Run offline tests'
task :test_offline do
  sh 'ruby test_offline.rb'
end

# Complete test suite
desc 'Run all tests'
task test_all: %i[spec test_offline]