ruby_llm 0.1.0.pre26 → 0.1.0.pre27

This diff shows the changes between publicly released versions of the package as they appear in the public registry, and is provided for informational purposes only.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: ea5a792c0bc3361e9784fc902e8e0e08f07cb2a6db03b408ee6362deb50c6edc
-  data.tar.gz: 44679aa95838fb8cbc8ff0ae42db0a5d2c9862d7af0ec461ea98c41276276bdf
+  metadata.gz: cd52ddd301a631215257c48fa3c1bc460203a228128b7d8c4c2f4837beaf2f63
+  data.tar.gz: a0cfae0a59ff41ffeef2a1c64ff660a61839147d2587769e8e7760555fd7f8a2
 SHA512:
-  metadata.gz: 3fa7cdaa35b5bfe2f60a25edfaefec82f89cf47d54fe8bad2f225fb4d339326fc341194fd734310e224f07055d9c0dccad81c6e610c6bd071aa64883c7b49a1d
-  data.tar.gz: aba94a44af3cb6348cb7bac016ad30a539ef6f980e16261dce324e9235214d869e5bf9ae78d46a7dc01ca1a7cfd0ec682a97b18872c1b50a6ed984bdd699b077
+  metadata.gz: 854ea7639ed9e01aafe2ff987bebc47023529544525ec86dbfd1e216119ad6affae78dcadbf4d565f061970ae85454f00bcb7a3f42f6eebd876f6fa3287b8616
+  data.tar.gz: 99bf687ffca1d6ba80745a9406bb96f47148a9bb5a0ed499cddda5e10461675ecb5c434e12e5e3a208f4877fca5bd57b59ab15dd8e9f0f1473c66b86c44fdeb9
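The checksum changes are just the expected consequence of repackaging. If you want to confirm the new SHA256 values yourself, here is a minimal sketch; it assumes the release has already been fetched locally under the conventional filename `ruby_llm-0.1.0.pre27.gem` (e.g. via `gem fetch ruby_llm --prerelease`), and relies on a `.gem` file being a plain tar archive that contains `metadata.gz` and `data.tar.gz`:

```ruby
require 'digest'
require 'rubygems/package'

# Walk the .gem tar archive and hash the two members whose checksums
# are listed in checksums.yaml above. Filename is assumed, not part of this diff.
File.open('ruby_llm-0.1.0.pre27.gem', 'rb') do |gem_file|
  Gem::Package::TarReader.new(gem_file).each do |entry|
    next unless %w[metadata.gz data.tar.gz].include?(entry.full_name)

    puts "#{entry.full_name}: #{Digest::SHA256.hexdigest(entry.read)}"
  end
end
```

The printed digests should match the two `+ metadata.gz` / `+ data.tar.gz` SHA256 lines above.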
data/README.md CHANGED
@@ -1,6 +1,6 @@
 # RubyLLM
 
-A delightful Ruby interface to the latest large language models. Stop wrestling with multiple APIs and inconsistent interfaces. RubyLLM gives you a clean, unified way to work with models from OpenAI, Anthropic, Google, and DeepSeek.
+A delightful Ruby way to work with AI language models. Provides a unified interface to OpenAI, Anthropic, Google, and DeepSeek models with automatic token counting, proper streaming support, and a focus on developer happiness. No wrapping your head around multiple APIs - just clean Ruby code that works.
 
 <p align="center">
   <img src="https://upload.wikimedia.org/wikipedia/commons/4/4d/OpenAI_Logo.svg" alt="OpenAI" height="40" width="120">
@@ -73,7 +73,7 @@ image_models = RubyLLM.models.image_models
 Conversations are simple and natural:
 
 ```ruby
-chat = RubyLLM.chat model: 'claude-3-opus-20240229'
+chat = RubyLLM.chat model: 'gemini-2.0-flash'
 
 # Ask questions
 response = chat.ask "What's your favorite Ruby feature?"
@@ -101,7 +101,7 @@ Need vector embeddings for your text? RubyLLM makes it simple:
 RubyLLM.embed "Hello, world!"
 
 # Use a specific model
-RubyLLM.embed "Ruby is awesome!", model: "text-embedding-3-large"
+RubyLLM.embed "Ruby is awesome!", model: "text-embedding-004"
 
 # Process multiple texts at once
 RubyLLM.embed([
@@ -165,7 +165,7 @@ search = Search.new repo: Document
 chat.with_tools search, Calculator
 
 # Configure as needed
-chat.with_model('claude-3-opus-20240229')
+chat.with_model('claude-3-5-sonnet-20241022')
     .with_temperature(0.9)
 
 chat.ask "What's 2+2?"
@@ -373,7 +373,7 @@ class WeatherTool < RubyLLM::Tool
 end
 
 # Use tools with your persisted chats
-chat = Chat.create! model_id: "gpt-4"
+chat = Chat.create! model_id: "deepseek-reasoner"
 chat.chat.with_tool WeatherTool.new
 
 # Ask about weather - tool usage is automatically saved
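Taken together, the README edits only swap the example model IDs for newer ones; the API surface is unchanged. As a sanity check, here is a small sketch assembled from the updated snippets above, assuming the usual `require 'ruby_llm'` and provider API keys are already configured (the model IDs are the ones shown in this diff and may change again in later releases):

```ruby
require 'ruby_llm'

# Chat example now uses Gemini 2.0 Flash
chat = RubyLLM.chat model: 'gemini-2.0-flash'
response = chat.ask "What's your favorite Ruby feature?"

# Embedding example now uses Google's text-embedding-004
RubyLLM.embed "Ruby is awesome!", model: "text-embedding-004"

# Configuration example now targets Claude 3.5 Sonnet
chat.with_model('claude-3-5-sonnet-20241022')
    .with_temperature(0.9)
chat.ask "What's 2+2?"
```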
@@ -1,5 +1,5 @@
 # frozen_string_literal: true
 
 module RubyLLM
-  VERSION = '0.1.0.pre26'
+  VERSION = '0.1.0.pre27'
 end
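The version constant bump above is what actually produces the new gem. To try this prerelease in an application, a Gemfile entry along these lines should work (a sketch; Bundler only considers prerelease versions when the requirement names one explicitly):

```ruby
# Gemfile
gem 'ruby_llm', '0.1.0.pre27'
```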
data/ruby_llm.gemspec CHANGED
@@ -10,9 +10,9 @@ Gem::Specification.new do |spec|
 
   spec.summary = 'Clean Ruby interface to modern AI language models'
   spec.description = 'A delightful Ruby way to work with AI language models. Provides a unified interface to OpenAI' \
-                     ' and Anthropic models with automatic token counting, proper streaming support, and a focus on' \
-                     ' developer happiness. No wrapping your head around multiple APIs - just clean Ruby code that' \
-                     ' works.'
+                     ', Anthropic, Google, and DeepSeek models with automatic token counting, proper streaming' \
+                     ' support, and a focus on developer happiness. No wrapping your head around multiple APIs' \
+                     ' - just clean Ruby code that works.'
   spec.homepage = 'https://github.com/crmne/ruby_llm'
   spec.license = 'MIT'
   spec.required_ruby_version = Gem::Requirement.new('>= 3.1.0')
metadata CHANGED
@@ -1,7 +1,7 @@
 --- !ruby/object:Gem::Specification
 name: ruby_llm
 version: !ruby/object:Gem::Version
-  version: 0.1.0.pre26
+  version: 0.1.0.pre27
 platform: ruby
 authors:
 - Carmine Paolino
@@ -323,9 +323,9 @@ dependencies:
   - !ruby/object:Gem::Version
     version: '0.9'
 description: A delightful Ruby way to work with AI language models. Provides a unified
-  interface to OpenAI and Anthropic models with automatic token counting, proper streaming
-  support, and a focus on developer happiness. No wrapping your head around multiple
-  APIs - just clean Ruby code that works.
+  interface to OpenAI, Anthropic, Google, and DeepSeek models with automatic token
+  counting, proper streaming support, and a focus on developer happiness. No wrapping
+  your head around multiple APIs - just clean Ruby code that works.
 email:
 - carmine@paolino.me
 executables: []