last_llm 0.0.5 → 0.0.7

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: e8d63b9fe1cfc2adcf7d290b1deee351c2dbdde0922931f6bdc950b47e882884
-  data.tar.gz: 2ce5e77677f5734d6aa300083e1d62345e7da9b57db87dec88b4eb0945f5a35f
+  metadata.gz: c45436277c2f585b851688ade72bbd838d33f0c51b0792942e56734e534a9a2e
+  data.tar.gz: 0ad94566719d472ccfcea83cb5bce21cd4142df80b2ffdda0f139ff561d6c9d8
 SHA512:
-  metadata.gz: 99ebc99586238fa7bdc6464d01dba3ec0bb933488f2a9b61ca2201f93e97689cca59893628185649524582d4fa2da9be86867ebb7a4373b8b32ca0cc5e1459ea
-  data.tar.gz: ed0ebd17190f1797ff6f45523461fca9813066aedbb05fa13e7e9660cad567aa10d1f082655ab6d5c498116a330347a4f998d6e0118ee5cf940a76d81b3fff3e
+  metadata.gz: c77932637b424601a950f6ad655c3886773be3b9880f4ce5607d4af18f0036876dd9463c973f64628c1a97ef50d73b56af9d6cd7f492144da164177dc628d4dc
+  data.tar.gz: 51709ae9cfee0d6047b5076ca0926f6395ebae1cbaa7c3b6621787dca86e8f0ff3a5d3e5f44a1d1038315effeae118b32352b963ae12f23fdd43c7356b99ca1e
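The replaced digests above can be re-derived locally once the release artifacts are downloaded. A minimal stdlib sketch (the file path and expected digest you would pass in are your own, not values from this release):

```ruby
# Sketch: compare a file's SHA256 against an expected hex digest, as one
# would for the metadata.gz / data.tar.gz checksums listed above.
require 'digest'

def checksum_matches?(path, expected_sha256)
  Digest::SHA256.file(path).hexdigest == expected_sha256
end
```

The same pattern works for the SHA512 entries by swapping in `Digest::SHA512`.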
data/README.md CHANGED
@@ -45,128 +45,28 @@ LastLLM.configure do |config|
 end
 ```
 
-### Rails Integration
+### Logging
 
-LastLLM automatically integrates with Rails applications. Create a `config/last_llm.yml` file:
-
-```yaml
-default_provider: openai
-default_model: gpt-3.5-turbo
-globals:
-  timeout: 30
-  max_retries: 3
-```
-
-## Usage
-
-### Basic Text Generation
+LastLLM supports custom logging for monitoring requests, responses, and errors. You can configure a custom logger to integrate with your existing logging system:
 
 ```ruby
-client = LastLLM.client
-response = client.generate_text("What is the capital of France?")
-```
-
-### Structured Data Generation
+LastLLM.configure do |config|
+  # Use Rails logger
+  config.logger = Rails.logger
 
-```ruby
-schema = Dry::Schema.JSON do
-  required(:name).filled(:string)
-  required(:age).filled(:integer)
-  optional(:email).maybe(:string)
+  # Set log level (optional, defaults to :info)
+  config.log_level = :debug
 end
-
-client = LastLLM.client
-result = client.generate_object("Generate information about a person", schema)
 ```
 
-### Tool Calling
-
-```ruby
-calculator = LastLLM::Tool.new(
-  name: "calculator",
-  description: "Perform mathematical calculations",
-  parameters: {
-    type: "object",
-    properties: {
-      operation: {
-        type: "string",
-        enum: ["add", "subtract", "multiply", "divide"]
-      },
-      a: { type: "number" },
-      b: { type: "number" }
-    },
-    required: ["operation", "a", "b"]
-  },
-  function: ->(params) {
-    case params[:operation]
-    when "add"
-      { result: params[:a] + params[:b] }
-    end
-  }
-)
-```
-
-## Error Handling
-
-LastLLM provides several error classes:
-- `LastLLM::Error`: Base error class
-- `LastLLM::ConfigurationError`: Configuration-related errors
-- `LastLLM::ValidationError`: Schema validation errors
-- `LastLLM::ApiError`: API request errors
-
-## Development
+In a Rails application, you can also configure this in your `config/last_llm.yml`:
 
-After checking out the repo, run `bundle install` to install dependencies. Then, run `rspec` to run the tests.
-
-## Contributing
-
-Bug reports and pull requests are welcome on GitHub.
-# LastLLM
-
-LastLLM is a unified client for interacting with various LLM (Large Language Model) providers. It provides a consistent interface for text generation, structured data generation, and tool calling across different LLM services.
-
-## Features
-
-- Unified interface for multiple LLM providers:
-  - OpenAI
-  - Anthropic
-  - Google Gemini
-  - Deepseek
-  - Ollama
-- Text generation
-- Structured data generation with schema validation
-- Tool/function calling support
-- Rails integration
-- Configuration management
-- Error handling
-
-## Installation
-
-Add this line to your application's Gemfile:
-
-```ruby
-gem 'last_llm'
+```yaml
+logger: Rails.logger
+log_level: debug
 ```
 
-## Configuration
-
-Configure LastLLM globally or per instance:
-
-```ruby
-LastLLM.configure do |config|
-  config.default_provider = :openai
-  config.default_model = 'gpt-3.5-turbo'
-
-  # Configure OpenAI
-  config.configure_provider(:openai, api_key: 'your-api-key')
-
-  # Configure Anthropic
-  config.configure_provider(:anthropic, api_key: 'your-api-key')
-
-  # Configure Ollama
-  config.configure_provider(:ollama, host: 'http://localhost:11434')
-end
-```
+This will log LLM requests, responses, and errors to your Rails application logs, helping you debug and monitor your AI interactions.
 
 ### Rails Integration
 
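The new Logging section pairs a logger with a log level. The filtering behaviour it describes is standard Ruby `Logger` semantics, which can be seen without LastLLM itself (a `StringIO`-backed logger stands in for `Rails.logger` here):

```ruby
require 'logger'
require 'stringio'

buffer = StringIO.new
logger = Logger.new(buffer)
logger.level = Logger::INFO  # the counterpart of `log_level: :info`

# At :info, info-level request lines are kept...
logger.info('[openai] Request - Model: gpt-3.5-turbo')
# ...while debug-level prompt dumps are suppressed.
logger.debug('[openai] Prompt: What is the capital of France?')

puts buffer.string
```

Setting `log_level: debug`, as in the README example, corresponds to `Logger::DEBUG` and would let both lines through.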
lib/generators/last_llm/install/install_generator.rb ADDED
@@ -0,0 +1,25 @@
+# frozen_string_literal: true
+
+require 'rails/generators'
+
+module LastLLM
+  module Generators
+    class InstallGenerator < Rails::Generators::Base
+      source_root File.expand_path('templates', __dir__)
+
+      desc 'Creates a LastLLM configuration file and initializer'
+
+      def create_config_file
+        template 'last_llm.yml', 'config/last_llm.yml'
+      end
+
+      def create_initializer
+        template 'initializer.rb', 'config/initializers/last_llm.rb'
+      end
+
+      def show_readme
+        readme 'README.md'
+      end
+    end
+  end
+end
lib/generators/last_llm/install/templates/README.md ADDED
@@ -0,0 +1,20 @@
+LastLLM Installation
+===================
+
+LastLLM has been installed. Here's what was created:
+
+* config/last_llm.yml - Main configuration file
+* config/initializers/last_llm.rb - Rails initializer
+
+Next steps:
+
+1. Edit config/last_llm.yml and add your API keys
+2. Set up environment variables for your API keys
+3. Test the installation:
+
+```ruby
+client = LastLLM.client
+response = client.generate_text("Hello, world!")
+```
+
+For more information, visit: https://github.com/joemocha/last_llm
lib/generators/last_llm/install/templates/initializer.rb ADDED
@@ -0,0 +1,28 @@
+# frozen_string_literal: true
+
+# Configure LastLLM
+LastLLM.configure do |config|
+  # Load configuration from config/last_llm.yml
+  config_file = Rails.root.join('config', 'last_llm.yml')
+  if File.exist?(config_file)
+    yaml_config = YAML.safe_load_file(config_file, symbolize_names: true)
+
+    # Set default provider
+    config.default_provider = yaml_config[:default_provider].to_sym if yaml_config[:default_provider]
+
+    # Set default model
+    config.default_model = yaml_config[:default_model] if yaml_config[:default_model]
+
+    # Configure providers
+    yaml_config[:providers]&.each do |provider, settings|
+      settings.each do |key, value|
+        config.set_provider_config(provider, key, value)
+      end
+    end
+
+    # Configure global settings
+    yaml_config[:globals]&.each do |key, value|
+      config.set_global(key.to_sym, value)
+    end
+  end
+end
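The initializer's YAML handling can be exercised on its own. A sketch with neither Rails nor LastLLM loaded, assuming a config file of the shape the template ships (`YAML.safe_load` with `symbolize_names: true` behaves like the `safe_load_file` call above):

```ruby
require 'yaml'

yaml_config = YAML.safe_load(<<~YML, symbolize_names: true)
  default_provider: openai
  default_model: gpt-3.5-turbo
  providers:
    openai:
      api_key: test-key
  globals:
    temperature: 0.7
YML

# The same lookups the initializer performs:
provider = yaml_config[:default_provider].to_sym
model    = yaml_config[:default_model]
settings = yaml_config[:providers][provider]
```

Because `symbolize_names: true` symbolizes keys at every nesting level, `yaml_config[:providers][provider]` works directly; the `&.each` guards in the initializer make the `providers:` and `globals:` sections optional.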
lib/generators/last_llm/install/templates/last_llm.yml ADDED
@@ -0,0 +1,29 @@
+# LastLLM Configuration
+
+# Default provider to use (options: :openai, :anthropic, :google_gemini, :deepseek, :ollama)
+default_provider: :openai
+
+# Default model to use (provider-specific)
+default_model: gpt-3.5-turbo
+
+# Global settings
+globals:
+  temperature: 0.7
+  max_tokens: 1000
+
+# Provider-specific configurations
+providers:
+  openai:
+    api_key: <%= ENV['OPENAI_API_KEY'] %>
+
+  anthropic:
+    api_key: <%= ENV['ANTHROPIC_API_KEY'] %>
+
+  google_gemini:
+    api_key: <%= ENV['GOOGLE_GEMINI_API_KEY'] %>
+
+  deepseek:
+    api_key: <%= ENV['DEEPSEEK_API_KEY'] %>
+
+  ollama:
+    host: http://localhost:11434
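The `<%= ENV[...] %>` tags in the template above are ERB. This diff does not show whether the gem expands them at generation time or at application boot, but the expansion itself looks like this (the env var value is a placeholder for illustration):

```ruby
require 'erb'
require 'yaml'

ENV['OPENAI_API_KEY'] = 'sk-placeholder'  # illustrative value only

template = <<~YML
  providers:
    openai:
      api_key: <%= ENV['OPENAI_API_KEY'] %>
YML

# Render the ERB tags, then parse the resulting plain YAML.
config = YAML.safe_load(ERB.new(template).result, symbolize_names: true)
```

Keeping secrets in `ENV` this way means `config/last_llm.yml` can be committed without embedding API keys.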
@@ -148,5 +148,91 @@ module LastLLM
         end
       end
     end
+
+    def make_request(prompt, options = {})
+      log_request(prompt, options)
+
+      response = yield
+
+      log_response(response)
+
+      handle_response(response) do |result|
+        yield(result)
+      end
+    rescue Faraday::Error => e
+      @logger&.error("[#{@name}] Request failed: #{e.message}")
+      handle_provider_error(e)
+    end
+
+    private
+
+    def log_request(prompt, options)
+      return unless @logger
+
+      sanitized_options = options.dup
+      # Remove sensitive data
+      sanitized_options.delete(:api_key)
+
+      @logger.info("[#{@name}] Request - Model: #{options[:model]}")
+      @logger.debug("[#{@name}] Prompt: #{prompt}")
+      @logger.debug("[#{@name}] Options: #{sanitized_options.inspect}")
+    end
+
+    def log_response(response)
+      return unless @logger
+
+      @logger.info("[#{@name}] Response received - Status: #{response.status}")
+      @logger.debug("[#{@name}] Response body: #{response.body}")
+    rescue StandardError => e
+      @logger.error("[#{@name}] Failed to log response: #{e.message}")
+    end
+
+    def handle_provider_error(error)
+      @logger&.error("[#{@name}] #{error.class}: #{error.message}")
+      raise ApiError.new(error.message, error.response&.status)
+    end
+
+    def make_request(prompt, options = {})
+      log_request(prompt, options)
+
+      response = yield
+
+      log_response(response)
+
+      handle_response(response) do |result|
+        yield(result)
+      end
+    rescue Faraday::Error => e
+      @logger&.error("[#{@name}] Request failed: #{e.message}")
+      handle_provider_error(e)
+    end
+
+    private
+
+    def log_request(prompt, options)
+      return unless @logger
+
+      sanitized_options = options.dup
+      # Remove sensitive data
+      sanitized_options.delete(:api_key)
+
+      @logger.info("[#{@name}] Request - Model: #{options[:model]}")
+      @logger.debug("[#{@name}] Prompt: #{prompt}")
+      @logger.debug("[#{@name}] Options: #{sanitized_options.inspect}")
+    end
+
+    def log_response(response)
+      return unless @logger
+
+      @logger.info("[#{@name}] Response received - Status: #{response.status}")
+      @logger.debug("[#{@name}] Response body: #{response.body}")
+    rescue StandardError => e
+      @logger.error("[#{@name}] Failed to log response: #{e.message}")
+    end
+
+    def handle_provider_error(error)
+      @logger&.error("[#{@name}] #{error.class}: #{error.message}")
+      raise ApiError.new(error.message, error.response&.status)
+    end
   end
 end
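The request wrapper added above follows a wrap-yield-log shape: log the outgoing request, yield to obtain the HTTP response, log it, then extract a result. A minimal standalone rendition of that shape in plain Ruby (a `Struct` stands in for a Faraday response, and the double `yield` of the original is reduced to a single yield here):

```ruby
require 'logger'
require 'stringio'

FakeResponse = Struct.new(:status, :body)

# Run a request block, logging before and after, mirroring the
# make_request / log_response pairing above.
def logged_request(logger, name)
  logger.info("[#{name}] Request")
  response = yield
  logger.info("[#{name}] Response received - Status: #{response.status}")
  response.body
end

buffer = StringIO.new
body = logged_request(Logger.new(buffer), 'openai') { FakeResponse.new(200, 'ok') }
```

Note the `sanitized_options.delete(:api_key)` step in the gem's version: whatever shape the wrapper takes, credentials are stripped before options reach the log.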
@@ -32,7 +32,7 @@ module LastLLM
     end
 
     def generate_text(prompt, options = {})
-        make_request(prompt, options) do |response|
+      make_request(prompt, options) do |response|
         extract_text_content(response)
       end
     end
lib/last_llm/version.rb CHANGED
@@ -1,5 +1,5 @@
 # frozen_string_literal: true
 
 module LastLLM
-  VERSION = '0.0.5'
+  VERSION = '0.0.7'
 end
lib/tasks/release.rake ADDED
@@ -0,0 +1,67 @@
+# frozen_string_literal: true
+
+require 'bundler/gem_tasks'
+
+namespace :version do
+  desc 'Bump major version (X.y.z)'
+  task :major do
+    bump_version(:major)
+  end
+
+  desc 'Bump minor version (x.Y.z)'
+  task :minor do
+    bump_version(:minor)
+  end
+
+  desc 'Bump patch version (x.y.Z)'
+  task :patch do
+    bump_version(:patch)
+  end
+
+  def bump_version(type)
+    version_file = 'lib/last_llm/version.rb'
+    content = File.read(version_file)
+    current_version = content.match(/VERSION\s*=\s*['"](\d+\.\d+\.\d+)['"]/)[1]
+    major, minor, patch = current_version.split('.').map(&:to_i)
+
+    case type
+    when :major
+      major += 1
+      minor = 0
+      patch = 0
+    when :minor
+      minor += 1
+      patch = 0
+    when :patch
+      patch += 1
+    end
+
+    new_version = "#{major}.#{minor}.#{patch}"
+    new_content = content.gsub(/VERSION\s*=\s*['"](\d+\.\d+\.\d+)['"]/, "VERSION = '#{new_version}'")
+
+    File.write(version_file, new_content)
+    puts "Version bumped to #{new_version}"
+  end
+end
+
+namespace :release do
+  desc 'Build and push gem to RubyGems'
+  task :push do
+    system('gem build last_llm.gemspec') || abort('Gem build failed')
+    gem_file = Dir['last_llm-*.gem'].sort_by { |f| File.mtime(f) }.last
+    system("gem push #{gem_file}") || abort('Gem push failed')
+    FileUtils.rm(gem_file)
+    puts "Successfully released #{gem_file}"
+  end
+end
+
+desc 'Bump version, build and push gem (usage: rake release[patch])'
+task :release, [:type] => [:clean] do |_, args|
+  type = (args[:type] || 'patch').to_sym
+  unless %i[major minor patch].include?(type)
+    abort "Invalid version type. Use 'major', 'minor', or 'patch'"
+  end
+
+  Rake::Task["version:#{type}"].invoke
+  Rake::Task['release:push'].invoke
+end
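The core of `bump_version` is a regex parse-and-rewrite of `version.rb`. It can be checked against an in-memory copy of that file, no file I/O needed (the `:patch` branch shown is the one this release would have taken twice, 0.0.5 → 0.0.6 → 0.0.7):

```ruby
# Mirror bump_version's :patch branch on a string instead of a file.
content = "module LastLLM\n  VERSION = '0.0.5'\nend\n"

current = content.match(/VERSION\s*=\s*['"](\d+\.\d+\.\d+)['"]/)[1]
major, minor, patch = current.split('.').map(&:to_i)
patch += 1

new_version = "#{major}.#{minor}.#{patch}"
updated = content.gsub(/VERSION\s*=\s*['"]\d+\.\d+\.\d+['"]/, "VERSION = '#{new_version}'")
```

The rewrite normalizes whatever quoting and spacing the regex matched to a single canonical form, `VERSION = '0.0.6'`.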
metadata CHANGED
@@ -1,7 +1,7 @@
 --- !ruby/object:Gem::Specification
 name: last_llm
 version: !ruby/object:Gem::Version
-  version: 0.0.5
+  version: 0.0.7
 platform: ruby
 authors:
 - Sam Obukwelu
@@ -101,6 +101,10 @@ extensions: []
 extra_rdoc_files: []
 files:
 - README.md
+- lib/generators/last_llm/install/install_generator.rb
+- lib/generators/last_llm/install/templates/README.md
+- lib/generators/last_llm/install/templates/initializer.rb
+- lib/generators/last_llm/install/templates/last_llm.yml
 - lib/last_llm.rb
 - lib/last_llm/client.rb
 - lib/last_llm/completion.rb
@@ -119,6 +123,7 @@ files:
 - lib/last_llm/structured_output.rb
 - lib/last_llm/tool.rb
 - lib/last_llm/version.rb
+- lib/tasks/release.rake
 homepage: https://github.com/joemocha/last_llm
 licenses:
 - MIT