durable-llm 0.1.1 → 0.1.2

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
-   metadata.gz: 29574965602c24158dda8d571f97a8587e42eb50ef0248063cdc57a3ef909e84
-   data.tar.gz: 3ec88df34382fe52b4df9e2f0ee726a7999b160bdb07fc291445146d5b92629f
+   metadata.gz: 16b4fd786241b183bdc060199d2623155b5082570e687b839ecb3d1b610e25ab
+   data.tar.gz: c8257d7a7a8a3b0847a21b9b574cc405e6cb2f5b81690f2de59949c0a4a4c69e
  SHA512:
-   metadata.gz: 5efe187dd42f3149fe6e1571f9d22299eba2493fd0d747f6a7743a856b4a2c7b16c9a8490189fbcb7ee6d45eebc5aa1406997e189234fbf29b3c595ef12fb132
-   data.tar.gz: 6c99719b35db2cfafb7844aa42a90eaace7f16a976821ca86831026b98d01b490214446c52079da31622bdaf4d517d9e702030f2a6b881f9e86d3b750ae37255
+   metadata.gz: 563537cf042a171267c80edd1fee12965822a7b766acbe39e71cc2ba146b5291773f28dd04f31d35f35565f0b3f149cc5b40b724cfac1a6f24c8f78fef6a9c85
+   data.tar.gz: 521c0c54ac096d6e8df667777ba61ac941bd2e5bbc9bba001b5eed3065db97958cceb4bae57ae4bfd834eb04ecc85e176bd793ce20e62c41d0ed7131e7a30734
data/CLI.md ADDED
@@ -0,0 +1,107 @@
+ # Durable-LLM Command Line Interface (CLI)
+
+ Durable-LLM provides a command-line interface (CLI) for interacting with various Large Language Model providers. This document outlines the available commands and their usage.
+
+ ## Installation
+
+ Ensure you have installed the Durable-LLM gem:
+
+ ```
+ gem install durable-llm
+ ```
+
+ ## Commands
+
+ ### prompt
+
+ Run a single prompt and get a response.
+
+ ```
+ dllm prompt [OPTIONS] PROMPT
+ ```
+
+ Options:
+ - `-m, --model MODEL`: Specify the model to use (default: gpt-3.5-turbo)
+ - `-s, --system SYSTEM_PROMPT`: Set a system prompt
+ - `-c, --continue`: Continue the previous conversation
+ - `--cid CONVERSATION_ID`: Continue a specific conversation by ID
+ - `--no-stream`: Disable streaming of tokens
+ - `-o, --option KEY:VALUE`: Set model-specific options
+
+ Example:
+ ```
+ dllm prompt -m gpt-4 "What is the capital of France?"
+ ```
+
+ ### chat
+
+ Start an interactive chat session.
+
+ ```
+ dllm chat [OPTIONS]
+ ```
+
+ Options:
+ - `-m, --model MODEL`: Specify the model to use (default: gpt-3.5-turbo)
+ - `-s, --system SYSTEM_PROMPT`: Set a system prompt
+ - `-c, --continue`: Continue the previous conversation
+ - `--cid CONVERSATION_ID`: Continue a specific conversation by ID
+ - `-o, --option KEY:VALUE`: Set model-specific options
+
+ Example:
+ ```
+ dllm chat -m claude-3-opus-20240229
+ ```
+
+ ### models
+
+ List available models.
+
+ ```
+ dllm models [OPTIONS]
+ ```
+
+ Options:
+ - `--options`: Show model options
+
+ Example:
+ ```
+ dllm models
+ ```
+
+ ## Configuration
+
+ The CLI uses the same configuration as the Durable-LLM library. You can set up your API keys and other settings using environment variables or a configuration file.
+
+ Environment variables:
+ - `DLLM__OPENAI__API_KEY`: OpenAI API key
+ - `DLLM__ANTHROPIC__API_KEY`: Anthropic API key
+ - (Add other provider API keys as needed)
+
+ Alternatively, you can use the configuration file located at `~/.config/io.datasette.llm/keys.json` for compatibility with the `llm` tool.
+
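+ As a sketch, a minimal `keys.json` might look like the following (this assumes the flat provider-name-to-key layout used by the `llm` tool; the values are placeholders):
+
+ ```json
+ {
+   "openai": "sk-your-openai-key",
+   "anthropic": "your-anthropic-key"
+ }
+ ```
+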
+ ## Examples
+
+ 1. Run a simple prompt:
+ ```
+ dllm prompt "Tell me a joke"
+ ```
+
+ 2. Start an interactive chat with a specific model:
+ ```
+ dllm chat -m gpt-4
+ ```
+
+ 3. List available models:
+ ```
+ dllm models
+ ```
+
+ 4. Run a prompt with a system message and custom options:
+ ```
+ dllm prompt -s "You are a helpful assistant" -o temperature:0.7 "Explain quantum computing"
+ ```
+
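+ 5. Continue the previous conversation (using the `-c` flag described above):
+ ```
+ dllm prompt -c "Now summarize that in one sentence"
+ ```
+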
+ Remember to set up your API keys before using the CLI. Enjoy using Durable-LLM from the command line!
data/Gemfile.lock CHANGED
@@ -1,8 +1,9 @@
  PATH
    remote: .
    specs:
-     durable-llm (0.1.0)
-       faraday (~> 2.0)
+     durable-llm (0.1.1)
+       event_stream_parser (~> 1.0)
+       faraday (> 1.0)
        highline (~> 3.1)
        json (~> 2.6)
        thor (~> 1.3)
data/LICENSE.txt CHANGED
@@ -1,6 +1,11 @@
+ All code outside of the enterprise/ directory is under the following license unless
+ otherwise specified.
+
+ ---
+
  The MIT License (MIT)
 
- Copyright (c) 2024 Durable Programming Team
+ Copyright (c) 2024 Durable Programming, LLC.
 
  Permission is hereby granted, free of charge, to any person obtaining a copy
  of this software and associated documentation files (the "Software"), to deal
@@ -19,3 +24,31 @@ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
  LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
  OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
  THE SOFTWARE.
+
+
+ ---
+
+ Contains code from the ruby-openai project. The following license information is from the ruby-openai repo and applies to the code so denoted:
+
+ The MIT License (MIT)
+
+ Copyright (c) 2020 Alex
+
+ Permission is hereby granted, free of charge, to any person obtaining a copy
+ of this software and associated documentation files (the "Software"), to deal
+ in the Software without restriction, including without limitation the rights
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ copies of the Software, and to permit persons to whom the Software is
+ furnished to do so, subject to the following conditions:
+
+ The above copyright notice and this permission notice shall be included in
+ all copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
+ THE SOFTWARE.
+
data/README.md CHANGED
@@ -87,6 +87,8 @@ end
 
  Thank you to the lite-llm and llm.datasette.io projects for their hard work, which was invaluable to this project. The dllm command-line tool is patterned after the llm tool, though not as full-featured (yet).
 
+ The streaming JSONL code is from the ruby-openai repo; many thanks for their hard work.
+
  ## Contributing
 
  Bug reports and pull requests are welcome on GitHub at https://github.com/durableprogramming/durable-llm.
data/examples/anthropic.md ADDED
@@ -0,0 +1,85 @@
+ # Anthropic Example
+
+ This example demonstrates how to use the Durable-LLM gem with Anthropic's API.
+
+ ## Basic Usage
+
+ ```ruby
+ # Import the Durable-LLM library
+ require 'durable/llm'
+
+ # Initialize the client with the Anthropic provider.
+ # You can pass the API key directly or set it in the environment variable ANTHROPIC_API_KEY.
+ client = Durable::Llm::Client.new(:anthropic, api_key: 'your-api-key')
+
+ # Make a completion request
+ response = client.completion(
+   # Specify the Anthropic model to use
+   model: 'claude-3-5-sonnet-20240620',
+   # Provide the messages for the conversation
+   messages: [
+     { role: 'user', content: 'What is the capital of France?' }
+   ],
+   # Set the maximum number of tokens for the response (optional); defaults to 1024
+   max_tokens: 100
+ )
+
+ # Print the response content
+ puts response.choices.first.message.content
+
+ # Output: Paris is the capital of France.
+ ```
+
+ ## Streaming
+
+ ```ruby
+ # Use the stream method for real-time responses
+ client.stream(
+   model: 'claude-3-5-sonnet-20240620',
+   messages: [
+     { role: 'user', content: 'Tell me a short story about a robot.' }
+   ],
+   max_tokens: 200
+ ) do |chunk|
+   # Print each chunk of the response as it's received
+   print chunk.to_s
+ end
+
+ # Output will be printed as it's received from the API
+ ```
+
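+ To assemble the full text of a streamed reply, you can accumulate the chunks as they arrive (a minimal sketch using only the `stream` API shown above):
+
+ ```ruby
+ # Collect streamed chunks into a single string
+ full_response = +''
+ client.stream(
+   model: 'claude-3-5-sonnet-20240620',
+   messages: [{ role: 'user', content: 'Tell me a short story about a robot.' }],
+   max_tokens: 200
+ ) do |chunk|
+   full_response << chunk.to_s
+ end
+
+ puts full_response
+ ```
+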
+ ## Configuration
+
+ You can configure the Anthropic provider globally:
+
+ ```ruby
+ # Set up global configuration for the Anthropic provider
+ Durable::Llm.configure do |config|
+   config.anthropic.api_key = 'your-api-key'
+ end
+
+ # Create a client using the global configuration
+ client = Durable::Llm::Client.new(:anthropic)
+ ```
+
+ ## Error Handling
+
+ ```ruby
+ begin
+   # Attempt to make a completion request
+   response = client.completion(
+     model: 'claude-3-5-sonnet-20240620',
+     messages: [{ role: 'user', content: 'Hello!' }]
+   )
+ rescue Durable::Llm::RateLimitError => e
+   # Handle rate limit errors
+   puts "Rate limit exceeded: #{e.message}"
+ rescue Durable::Llm::APIError => e
+   # Handle general API errors
+   puts "API error occurred: #{e.message}"
+ end
+ ```
+
+ This example covers basic usage, streaming, configuration, and error handling for the Anthropic provider using the Durable-LLM gem. Note that Anthropic's API doesn't support embeddings, so that functionality is not included in this example.
data/examples/openai.md ADDED
@@ -0,0 +1,85 @@
+ # OpenAI Example
+
+ This example demonstrates how to use the Durable-LLM gem with OpenAI's API.
+
+ ## Basic Usage
+
+ ```ruby
+ require 'durable/llm'
+
+ # Initialize the client
+ client = Durable::Llm::Client.new(:openai, api_key: 'your-api-key')
+
+ # Make a completion request
+ response = client.completion(
+   model: 'gpt-3.5-turbo',
+   messages: [
+     { role: 'system', content: 'You are a helpful assistant.' },
+     { role: 'user', content: 'What is the capital of France?' }
+   ]
+ )
+
+ puts response.choices.first.message.content
+
+ # Output: Paris is the capital of France.
+ ```
+
+ ## Streaming
+
+ ```ruby
+ client.stream(
+   model: 'gpt-3.5-turbo',
+   messages: [
+     { role: 'system', content: 'You are a helpful assistant.' },
+     { role: 'user', content: 'Tell me a short story about a robot.' }
+   ]
+ ) do |chunk|
+   print chunk.to_s
+ end
+
+ # Output will be printed as it's received from the API
+ ```
+
+ ## Embeddings
+
+ ```ruby
+ embedding_response = client.embed(
+   model: 'text-embedding-ada-002',
+   input: 'The quick brown fox jumps over the lazy dog'
+ )
+
+ puts embedding_response.embedding.first(5).inspect
+
+ # Output: [0.0023064255, -0.009327292, -0.0028842434, 0.022165427, -0.01085841]
+ ```
+
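+ Embedding vectors are typically compared with cosine similarity. A minimal plain-Ruby sketch (the `first_embedding` and `second_embedding` variables are hypothetical stand-ins for two vectors obtained from `embed` calls as above):
+
+ ```ruby
+ # Cosine similarity between two embedding vectors (arrays of floats)
+ def cosine_similarity(a, b)
+   dot = a.zip(b).sum { |x, y| x * y }
+   dot / (Math.sqrt(a.sum { |x| x * x }) * Math.sqrt(b.sum { |x| x * x }))
+ end
+
+ # Higher values mean the two inputs are semantically closer
+ similarity = cosine_similarity(first_embedding, second_embedding)
+ ```
+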
+ ## Configuration
+
+ You can configure the OpenAI provider globally:
+
+ ```ruby
+ Durable::Llm.configure do |config|
+   config.openai.api_key = 'your-api-key'
+   config.openai.organization = 'your-organization-id' # Optional
+ end
+
+ # Now you can create a client without specifying the API key
+ client = Durable::Llm::Client.new(:openai)
+ ```
+
+ ## Error Handling
+
+ ```ruby
+ begin
+   response = client.completion(
+     model: 'gpt-3.5-turbo',
+     messages: [{ role: 'user', content: 'Hello!' }]
+   )
+ rescue Durable::Llm::RateLimitError => e
+   puts "Rate limit exceeded: #{e.message}"
+ rescue Durable::Llm::APIError => e
+   puts "API error occurred: #{e.message}"
+ end
+ ```
+
+ This example covers basic usage, streaming, embeddings, configuration, and error handling for the OpenAI provider using the Durable-LLM gem.
data/lib/durable/llm/configuration.rb CHANGED
@@ -8,6 +8,7 @@ module Durable
 
    def initialize
      @providers = {}
+     @default_provider = 'openai'
      load_from_env
 
    end
data/lib/durable/llm/providers/anthropic.rb CHANGED
@@ -18,6 +18,7 @@ module Durable
 
    def initialize(api_key: nil)
      @api_key = api_key || default_api_key
+
      @conn = Faraday.new(url: BASE_URL) do |faraday|
        faraday.request :json
        faraday.response :json
@@ -26,7 +27,8 @@ module Durable
    end
 
    def completion(options)
-     response = @conn.post('/v1/chat/completions') do |req|
+     options['max_tokens'] ||= 1024
+     response = @conn.post('/v1/messages') do |req|
        req.headers['x-api-key'] = @api_key
        req.headers['anthropic-version'] = '2023-06-01'
        req.body = options
@@ -39,7 +41,7 @@ module Durable
      self.class.models
    end
    def self.models
-     ['claude-3-opus', 'claude-3-sonnet', 'claude-3-haiku', 'claude-2.1', 'claude-2.0', 'claude-instant-1.2']
+     ['claude-3-5-sonnet-20240620', 'claude-3-opus-20240229', 'claude-3-haiku-20240307']
    end
 
    def self.stream?
@@ -47,7 +49,7 @@ module Durable
    end
    def stream(options, &block)
      options[:stream] = true
-     response = @conn.post('/v1/chat/completions') do |req|
+     response = @conn.post('/v1/messages') do |req|
        req.headers['x-api-key'] = @api_key
        req.headers['anthropic-version'] = '2023-06-01'
        req.headers['Accept'] = 'text/event-stream'
data/lib/durable/llm/providers/openai.rb CHANGED
@@ -2,6 +2,7 @@ require 'faraday'
  require 'json'
  require 'durable/llm/errors'
  require 'durable/llm/providers/base'
+ require 'event_stream_parser'
 
  module Durable
    module Llm
@@ -63,17 +64,30 @@ module Durable
    end
 
    def stream(options, &block)
+
      options[:stream] = true
+
      response = @conn.post('chat/completions') do |req|
+
        req.headers['Authorization'] = "Bearer #{@api_key}"
        req.headers['OpenAI-Organization'] = @organization if @organization
        req.headers['Accept'] = 'text/event-stream'
+
+       if options['temperature']
+         options['temperature'] = options['temperature'].to_f
+       end
+
        req.body = options
-       req.options.on_data = Proc.new do |chunk, size, total|
-         next if chunk.strip.empty?
+
+       user_proc = Proc.new do |chunk, size, total|
+
 
          yield OpenAIStreamResponse.new(chunk)
+
        end
+
+       req.options.on_data = to_json_stream(user_proc: user_proc)
+
      end
 
      handle_response(response)
@@ -81,6 +95,37 @@ module Durable
 
    private
 
+   # CODE-FROM: ruby-openai @ https://github.com/alexrudall/ruby-openai/blob/main/lib/openai/http.rb
+   # MIT License: https://github.com/alexrudall/ruby-openai/blob/main/LICENSE.md
+   # Given a proc, returns an outer proc that can be used to iterate over a JSON stream of chunks.
+   # For each chunk, the inner user_proc is called, giving it the JSON object. The JSON object could
+   # be a data object or an error object as described in the OpenAI API documentation.
+   #
+   # @param user_proc [Proc] The inner proc to call for each JSON object in the chunk.
+   # @return [Proc] An outer proc that iterates over a raw stream, converting it to JSON.
+   def to_json_stream(user_proc:)
+     parser = EventStreamParser::Parser.new
+
+     proc do |chunk, _bytes, env|
+       if env && env.status != 200
+         raise_error = Faraday::Response::RaiseError.new
+         raise_error.on_complete(env.merge(body: try_parse_json(chunk)))
+       end
+
+       parser.feed(chunk) do |_type, data|
+         user_proc.call(JSON.parse(data)) unless data == "[DONE]"
+       end
+     end
+   end
+
+   def try_parse_json(maybe_json)
+     JSON.parse(maybe_json)
+   rescue JSON::ParserError
+     maybe_json
+   end
+
+   # END-CODE-FROM
+
    def handle_response(response, responseClass=OpenAIResponse)
      case response.status
      when 200..299
@@ -157,14 +202,13 @@ module Durable
    class OpenAIStreamResponse
      attr_reader :choices
 
-     def initialize(fragment)
-       parsed = fragment.split("\n").map { |_| JSON.parse(_) }
+     def initialize(parsed)
 
-       @choices = parsed.map { |_| OpenAIStreamChoice.new(_['choices'])}
+       @choices = OpenAIStreamChoice.new(parsed['choices'])
      end
 
      def to_s
-       @choices.map(&:to_s).join('')
+       @choices.to_s
      end
    end
 
data/lib/durable/llm/version.rb CHANGED
@@ -2,6 +2,6 @@
 
  module Durable
    module Llm
-     VERSION = "0.1.1"
+     VERSION = "0.1.2"
    end
  end
data/lib/durable/llm.rb CHANGED
@@ -1,8 +1,11 @@
- require 'zeitwerk'
- loader = Zeitwerk::Loader.for_gem
- loader.setup
+ require "zeitwerk"
+ loader = Zeitwerk::Loader.new
+ loader.tag = File.basename(__FILE__, ".rb")
+ loader.inflector = Zeitwerk::GemInflector.new(__FILE__)
+ loader.push_dir(File.dirname(__FILE__) + '/..')
 
  require 'durable/llm/configuration'
+
  module Durable
    module Llm
      class << self
@@ -21,3 +24,4 @@ end
 
  Durable::Llm.configure do
  end
+
data/lib/durable.rb ADDED
@@ -0,0 +1,7 @@
+ require 'durable/llm'
+
+ module Durable
+   # This module serves as a namespace for the Durable gem.
+   # It currently only requires the Llm module, but can be expanded
+   # in the future to include other Durable-related functionality.
+ end
metadata CHANGED
@@ -1,7 +1,7 @@
  --- !ruby/object:Gem::Specification
  name: durable-llm
  version: !ruby/object:Gem::Version
-   version: 0.1.1
+   version: 0.1.2
  platform: ruby
  authors:
  - Durable Programming Team
@@ -80,6 +80,20 @@ dependencies:
      - - "~>"
        - !ruby/object:Gem::Version
          version: '3.1'
+ - !ruby/object:Gem::Dependency
+   name: event_stream_parser
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - "~>"
+       - !ruby/object:Gem::Version
+         version: '1.0'
+   type: :runtime
+   prerelease: false
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - "~>"
+       - !ruby/object:Gem::Version
+         version: '1.0'
  - !ruby/object:Gem::Dependency
    name: minitest
    requirement: !ruby/object:Gem::Requirement
@@ -159,12 +173,16 @@ extensions: []
  extra_rdoc_files: []
  files:
  - CHANGELOG.md
+ - CLI.md
  - Gemfile
  - Gemfile.lock
  - LICENSE.txt
  - README.md
  - Rakefile
  - enterprise/GET_ENTERPRISE.md
+ - examples/anthropic.md
+ - examples/openai.md
+ - lib/durable.rb
  - lib/durable/llm.rb
  - lib/durable/llm/cli.rb
  - lib/durable/llm/client.rb