helicone-rb 0.1.0 → 0.1.2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: 30551d818f767a29ad080d1ec1beffc42a5346e0e4f94f9cbe112b6c7aaaa9b5
-  data.tar.gz: 3505af534452304f15f075a0fbde6d334c8a1b3cc8a4a85f61abc292fe564226
+  metadata.gz: 22d855f83be3572fa91710cb1a63483ac70e07717b9a77103ca752d71931ad09
+  data.tar.gz: 26a6eca90b9d1c9ff1cee79ae8a774da5a65fec8088c30bae1956c493242e336
 SHA512:
-  metadata.gz: 2e09d82c356de9e0ca25af6846e582c63bbc773fab7c7f257e0e06a2dcf4ebbb0ee6af983457a7fd61fa9b5ec84116e7621017cf4fb8216649bdfc23fd6d14a9
-  data.tar.gz: 44affe9ff8922ba1254567b88cb7ac4b280aa3b95eac4162969c6ac606e6be5194f8b2b1784feaba1cdc73d340d67cef078f5bf7e63b4caa36b401a02ce384d8
+  metadata.gz: de7718c83e259ded888212a02c63f2cac6f02eb457ded45361efa1d903a2490a0774425dccab4861da8dd40d48888ab3a156ab769dac1c8f64d42e59b427aa41
+  data.tar.gz: 60bc8baed47ae2599f8a7f7f5ecfeca1e791b9b1ddf1123a61e99d7e2f1368254dfeef3a29d0d3941f27c1b1215fe1ac28220ce5ec74c292e610ea9c132d6763
data/README.md CHANGED
@@ -1,7 +1,11 @@
+> **Beta** - This library is under active development. APIs may change.
+
 # Helicone Ruby Client
 
 A Ruby client for the [Helicone AI Gateway](https://helicone.ai), wrapping the OpenAI API with built-in session tracking, cost attribution, and an agentic framework for building AI applications with tool/function calling.
 
+
+
 ## Why Helicone?
 
 [Helicone](https://helicone.ai) is an AI gateway that sits between your application and LLM providers (OpenAI, Anthropic, etc.). It provides:
@@ -16,6 +20,35 @@ A Ruby client for the [Helicone AI Gateway](https://helicone.ai), wrapping the O
 
 *The Helicone dashboard shows all your LLM requests with latency, cost, and token usage at a glance.*
 
+### Under the hood
+This library works with **any OpenAI-format /chat/completions LLM endpoint.**
+[Helicone](https://helicone.ai) acts as a proxy, similar to OpenRouter, with built-in telemetry and more.
+
+On top of that, this library adds Helicone-specific features like [session waterfall tracking](https://github.com/genevere-inc/helicone-rb/tree/main?tab=readme-ov-file#session-and-account-tracking).
+
+### Alternatives
+
+**[RubyLLM](https://rubyllm.com/)**
+
+Great for an opinionated quick start, but it has too much magic under the hood, which makes using proxies like Helicone and OpenRouter difficult.
+
+**[`alexrudall/ruby-openai`](https://github.com/alexrudall/ruby-openai)**
+
+This gem builds on that excellent library. If you don't need tool calling or Helicone, you can use it directly.
+
+**[`openai/openai-ruby`](https://github.com/openai/openai-ruby)**
+
+The official gem from OpenAI, but it lacks much of the higher-level functionality this gem provides.
+
+**[`Traceloop`](https://traceloop.com/)**
+
+A telemetry tool built on OpenTelemetry. Its `traceloop-sdk` gem did not work out of the box as of June 2025, but it accepts OpenTelemetry endpoints and works well for visualizing agentic workflows.
+
+Note that it **does not** act as a _proxy_ the way Helicone and OpenRouter do. That can be an advantage if you need to call LLM hosts directly, or a drawback, since it lacks the fallback mechanisms the proxies provide.
+
+_Note:_ I've used Traceloop in production and it tracks requests reliably, but cost aggregation never worked correctly. For that and other reasons, I switched to Helicone.
+
+
 ## Installation
 
 Add this line to your application's Gemfile:
@@ -79,7 +112,7 @@ response = client.ask("What is the capital of France?")
 # With system prompt
 response = client.ask(
   "Explain quantum computing",
-  system: "You are a physics teacher. Explain concepts simply."
+  system_prompt: "You are a physics teacher. Explain concepts simply."
 )
 ```
 
@@ -223,7 +256,7 @@ end
 ```ruby
 agent = Helicone::Agent.new(
   tools: [WeatherTool, CalendarTool],
-  system: "You are a helpful assistant with access to weather and calendar tools.",
+  system_prompt: "You are a helpful assistant with access to weather and calendar tools.",
   context: current_user # Passed to tool#initialize
 )
 
@@ -11,17 +11,17 @@ module Helicone
     # @param client [Helicone::Client] Optional client (creates new one if not provided)
     # @param tools [Array<Class>] Array of Tool subclasses
     # @param context [Object] Context object passed to tool#initialize
-    # @param system [String] System prompt
+    # @param system_prompt [String] System prompt
     # @param messages [Array<Helicone::Message>] Initial messages (for continuing conversations)
-    def initialize(client: nil, tools: [], context: nil, system: nil, messages: [])
+    def initialize(client: nil, tools: [], context: nil, system_prompt: nil, messages: [])
       @client = client || Client.new
       @tools = tools
       @context = context
       @messages = messages.dup
 
       # Add system message at the start if provided and not already present
-      if system && @messages.none? { |m| m.respond_to?(:role) && m.role == "system" }
-        @messages.unshift(Message.system(system))
+      if system_prompt && @messages.none? { |m| m.respond_to?(:role) && m.role == "system" }
+        @messages.unshift(Message.system(system_prompt))
       end
     end
 
@@ -66,12 +66,12 @@ module Helicone
     #
     # @param prompt [String] User prompt text
     # @param model [String] Model ID to use for completion
-    # @param system [String] Optional system prompt
+    # @param system_prompt [String] Optional system prompt
     # @param options [Hash] Additional options passed to chat
     # @return [String] The text content of the response
-    def ask(prompt, model: nil, system: nil, **options)
+    def ask(prompt, model: nil, system_prompt: nil, **options)
       messages = []
-      messages << Message.system(system) if system
+      messages << Message.system(system_prompt) if system_prompt
       messages << Message.user_text(prompt)
 
       response = chat(messages: messages, model: model, **options)
@@ -83,13 +83,13 @@ module Helicone
     # @param prompt [String] User prompt text
     # @param image_url [String] URL or base64 data URI of the image
     # @param model [String] Model ID to use for completion
-    # @param system [String] Optional system prompt
+    # @param system_prompt [String] Optional system prompt
     # @param detail [String] Image detail level: "auto", "low", or "high"
     # @param options [Hash] Additional options passed to chat
     # @return [String] The text content of the response
-    def ask_with_image(prompt, image_url, model: nil, system: nil, detail: "auto", **options)
+    def ask_with_image(prompt, image_url, model: nil, system_prompt: nil, detail: "auto", **options)
       messages = []
-      messages << Message.system(system) if system
+      messages << Message.system(system_prompt) if system_prompt
       messages << Message.user_with_images(prompt, image_url, detail: detail)
 
       response = chat(messages: messages, model: model, **options)
@@ -1,5 +1,5 @@
 # frozen_string_literal: true
 
 module Helicone
-  VERSION = "0.1.0"
+  VERSION = "0.1.2"
 end
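
The hunks above rename the `system:` keyword to `system_prompt:` across the public API, which is a breaking change for 0.1.0 callers. A minimal runnable sketch of the new call shape — the `Client` below is a stub that echoes the messages it would send, not the gem's real client, which performs an HTTP request to the gateway:

```ruby
# Stubbed sketch mirroring the 0.1.2 signatures shown in the diff above.
module Helicone
  Message = Struct.new(:role, :content) do
    def self.system(text)
      new("system", text)
    end

    def self.user_text(text)
      new("user", text)
    end
  end

  class Client
    # 0.1.0: ask(prompt, system: ...)  ->  0.1.2: ask(prompt, system_prompt: ...)
    def ask(prompt, model: nil, system_prompt: nil, **options)
      messages = []
      messages << Message.system(system_prompt) if system_prompt
      messages << Message.user_text(prompt)
      # Stub: echo the request instead of calling the gateway.
      messages.map { |m| "#{m.role}: #{m.content}" }.join("\n")
    end
  end
end

puts Helicone::Client.new.ask("Tell me a joke", system_prompt: "You are a comedian")
```

Note that, per the new signature, a leftover `system:` key appears to be swallowed by `**options` rather than raising, so an un-migrated call would silently drop its system prompt.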
data/sample.rb CHANGED
@@ -23,7 +23,7 @@ client.ask("What is 2 + 2?")
 # => "2 + 2 equals 4."
 
 # With a system prompt
-client.ask("Tell me a joke", system: "You are a comedian")
+client.ask("Tell me a joke", system_prompt: "You are a comedian")
 
 # With a specific model
 client.ask("Explain Ruby blocks", model: "gpt-4o-mini")
@@ -115,7 +115,7 @@ end
 # Run the agent
 agent = Helicone::Agent.new(
   tools: [WeatherTool, CalculatorTool],
-  system: "You are a helpful assistant with access to weather and calculator tools."
+  system_prompt: "You are a helpful assistant with access to weather and calculator tools."
 )
 
 result = agent.run("What's the weather in San Francisco?")
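
One behavior worth noting from the `Agent#initialize` hunk earlier: the system prompt is only prepended when the supplied message history does not already contain a system message, so resumed conversations keep their original prompt. A self-contained sketch of just that guard (`AgentSketch` and the `Message` struct here are stand-ins, not the gem's real classes):

```ruby
# Stand-in for the guard in Agent#initialize shown in the diff above:
# prepend a system message only if none is already present.
Message = Struct.new(:role, :content)

class AgentSketch
  attr_reader :messages

  def initialize(system_prompt: nil, messages: [])
    @messages = messages.dup
    if system_prompt && @messages.none? { |m| m.respond_to?(:role) && m.role == "system" }
      @messages.unshift(Message.new("system", system_prompt))
    end
  end
end

fresh   = AgentSketch.new(system_prompt: "Be helpful")
resumed = AgentSketch.new(system_prompt: "Be helpful",
                          messages: [Message.new("system", "Original prompt")])

puts fresh.messages.first.content    # => "Be helpful"
puts resumed.messages.first.content  # => "Original prompt" (not duplicated)
```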
metadata CHANGED
@@ -1,7 +1,7 @@
 --- !ruby/object:Gem::Specification
 name: helicone-rb
 version: !ruby/object:Gem::Version
-  version: 0.1.0
+  version: 0.1.2
 platform: ruby
 authors:
 - Genevere
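
Since the README describes Helicone as a proxy for any OpenAI-format /chat/completions endpoint, the gateway can in principle be reached with a plain HTTP client. A hedged sketch of the request shape — the base URL and the `Helicone-Auth` header name are assumptions drawn from Helicone's public docs, not from this gem, so verify them before use; the actual send is commented out:

```ruby
require "json"
require "net/http"
require "uri"

# Assumed gateway endpoint and auth header -- check Helicone's docs.
uri = URI("https://oai.helicone.ai/v1/chat/completions")

req = Net::HTTP::Post.new(uri)
req["Authorization"] = "Bearer #{ENV.fetch('OPENAI_API_KEY', 'sk-placeholder')}"
req["Helicone-Auth"] = "Bearer #{ENV.fetch('HELICONE_API_KEY', 'hk-placeholder')}" # assumed header name
req["Content-Type"]  = "application/json"
req.body = JSON.generate(
  model: "gpt-4o-mini",
  messages: [{ role: "user", content: "Hello from Ruby" }]
)

# Uncomment to actually send the request:
# res = Net::HTTP.start(uri.host, uri.port, use_ssl: true) { |http| http.request(req) }
# puts JSON.parse(res.body).dig("choices", 0, "message", "content")

puts req.body
```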