llm_chain 0.3.0 → 0.5.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- checksums.yaml +4 -4
- data/README.md +466 -101
- data/examples/quick_demo.rb +93 -0
- data/examples/tools_example.rb +255 -0
- data/lib/llm_chain/chain.rb +58 -42
- data/lib/llm_chain/client_registry.rb +2 -0
- data/lib/llm_chain/clients/gemma3.rb +144 -0
- data/lib/llm_chain/clients/qwen.rb +14 -3
- data/lib/llm_chain/tools/base_tool.rb +81 -0
- data/lib/llm_chain/tools/calculator.rb +143 -0
- data/lib/llm_chain/tools/code_interpreter.rb +233 -0
- data/lib/llm_chain/tools/tool_manager.rb +204 -0
- data/lib/llm_chain/tools/web_search.rb +255 -0
- data/lib/llm_chain/version.rb +1 -1
- data/lib/llm_chain.rb +6 -0
- metadata +11 -3
checksums.yaml
CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 9dd4f92a849092fdd1770088d8724a557ae12ff16f7108f0cef3f6199445c521
+  data.tar.gz: ec2f30766da2cd6ab9b2a92a9df35b914e995d5243359d8398b0ac96a41ebd8c
 SHA512:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: faa70d193aaf6d0af6cfede19867d5bb3263098d0214f71d15ced7f336fecf91483b8c046c652e5e38289cfa0e9d0ac4f0dedebbcc815981904afb07a091cfd0
+  data.tar.gz: d8c5f550897bfe5ed92d9aa2098dd46e1726fd2783c56d886c90db7588375f15055a101cefb3ad16fe9a3522c1917ad99cb1dc6cd4091e8d117d3ee997795119
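For context, the digests above are ordinary SHA-256/SHA-512 hashes over the raw bytes of the `metadata.gz` and `data.tar.gz` archives packed inside the `.gem` file. A minimal sketch of the same computation in Ruby's standard library, on a sample string rather than the actual gem archives:

```ruby
# Registry checksums are plain content digests; Digest::SHA256 from the
# Ruby standard library computes the same kind of value.
require 'digest'

puts Digest::SHA256.hexdigest('hello')
# => 2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824
```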
data/README.md
CHANGED
@@ -1,190 +1,555 @@
-# LLMChain
-
-A Ruby gem for interacting with Large Language Models (LLMs) through a unified interface, with native Ollama and local model support.
+# 🦾 LLMChain
 
 [](https://badge.fury.io/rb/llm_chain)
-[](https://github.com/FuryCow/llm_chain/actions)
 [](LICENSE.txt)
 
-
+**A powerful Ruby library for working with Large Language Models (LLMs), with an intelligent tool system**
+
+LLMChain is a Ruby analog of LangChain, providing a unified interface to various LLMs, a built-in tool system, and RAG (Retrieval-Augmented Generation) support.
+
+## ✨ Key Features
+
+- 🤖 **Unified API** for multiple LLMs (OpenAI, Ollama, Qwen, LLaMA2, Gemma)
+- 🛠️ **Intelligent tool system** with automatic selection
+- 🧮 **Built-in tools**: calculator, web search, code interpreter
+- 🔍 **RAG-ready** with vector database integration
+- 💾 **Flexible memory system** (Array, Redis)
+- 🌊 **Streaming output** for real-time responses
+- 🏠 **Local models** via Ollama
+- 🔧 **Extensible architecture** for custom tools
+
+## 🚀 Quick Start
 
-
-- Native [Ollama](https://ollama.ai/) integration for local models
-- Prompt templating system
-- Streaming response support
-- RAG-ready with vector database integration
-- Automatic model verification
+### Installation
 
-
+```bash
+gem install llm_chain
+```
 
-
+Or add it to your Gemfile:
 
 ```ruby
 gem 'llm_chain'
 ```
-Or install directly:
 
-
-
+### Prerequisites
+
+1. **Install Ollama** for local models:
+```bash
+# macOS/Linux
+curl -fsSL https://ollama.ai/install.sh | sh
+
+# Download models
+ollama pull qwen3:1.7b
+ollama pull llama2:7b
+```
+
+2. **Optional**: API keys for external services
+```bash
+export OPENAI_API_KEY="your-key"
+export SEARCH_API_KEY="your-key"
+```
+
+### Simple Example
+
+```ruby
+require 'llm_chain'
+
+# Basic usage
+chain = LLMChain::Chain.new(model: "qwen3:1.7b")
+response = chain.ask("Hello! How are you?")
+puts response
 ```
 
-##
-Install [Ollama](https://ollama.ai/)
+## 🛠️ Tool System
 
-
+### Automatic Tool Usage
 
-```
-
-
+```ruby
+# Create a chain with tools
+tool_manager = LLMChain::Tools::ToolManager.create_default_toolset
+chain = LLMChain::Chain.new(
+  model: "qwen3:1.7b",
+  tools: tool_manager
+)
+
+# Tools are selected automatically
+chain.ask("Calculate 15 * 7 + 32")
+# 🧮 Automatically uses the calculator
+
+chain.ask("Find information about Ruby 3.2")
+# 🔍 Automatically uses web search
+
+chain.ask("Execute code: puts (1..10).sum")
+# 💻 Automatically uses the code interpreter
 ```
 
-
+### Built-in Tools
 
-
+#### 🧮 Calculator
+```ruby
+calculator = LLMChain::Tools::Calculator.new
+result = calculator.call("Find square root of 144")
+puts result[:formatted]
+# Output: sqrt(144) = 12.0
+```
 
+#### 🌐 Web Search
 ```ruby
-
+search = LLMChain::Tools::WebSearch.new
+results = search.call("Latest Ruby news")
+puts results[:formatted]
+```
 
-
-
-
+#### 💻 Code Interpreter
+```ruby
+interpreter = LLMChain::Tools::CodeInterpreter.new
+result = interpreter.call(<<~CODE)
+  ```ruby
+  def factorial(n)
+    n <= 1 ? 1 : n * factorial(n - 1)
+  end
+  puts factorial(5)
+  ```
+CODE
+puts result[:formatted]
 ```
 
-
+### Creating Custom Tools
 
 ```ruby
-
-
-
-
+class WeatherTool < LLMChain::Tools::BaseTool
+  def initialize(api_key:)
+    @api_key = api_key
+    super(
+      name: "weather",
+      description: "Gets weather information",
+      parameters: {
+        location: {
+          type: "string",
+          description: "City name"
+        }
+      }
+    )
+  end
+
+  def match?(prompt)
+    contains_keywords?(prompt, ['weather', 'temperature', 'forecast'])
+  end
+
+  def call(prompt, context: {})
+    location = extract_location(prompt)
+    # Your weather API integration goes here
+    {
+      location: location,
+      temperature: "22°C",
+      condition: "Sunny",
+      formatted: "Weather in #{location}: 22°C, Sunny"
+    }
+  end
+
+  private
+
+  def extract_location(prompt)
+    prompt.scan(/in\s+(\w+)/i).flatten.first || "Unknown"
+  end
+end
 
-
-
+# Usage
+weather = WeatherTool.new(api_key: "your-key")
+tool_manager.register_tool(weather)
 ```
 
-
+## 🤖 Supported Models
+
+| Model Family | Backend | Status | Notes |
+|--------------|---------|--------|-------|
+| **OpenAI** | Web API | ✅ Supported | GPT-3.5, GPT-4, GPT-4 Turbo |
+| **Qwen/Qwen2** | Ollama | ✅ Supported | 0.5B-72B parameters |
+| **LLaMA2/3** | Ollama | ✅ Supported | 7B, 13B, 70B |
+| **Gemma** | Ollama | ✅ Supported | 2B, 7B, 9B, 27B |
+| **Mistral/Mixtral** | Ollama | 🚧 In development | 7B, 8x7B |
+| **Claude** | Anthropic | 📅 Planned | Haiku, Sonnet, Opus |
+| **Command R+** | Cohere | 📅 Planned | Optimized for RAG |
+
+### Model Usage Examples
 
 ```ruby
-#
-
-model: "
+# OpenAI
+openai_chain = LLMChain::Chain.new(
+  model: "gpt-4",
+  api_key: ENV['OPENAI_API_KEY']
+)
+
+# Qwen via Ollama
+qwen_chain = LLMChain::Chain.new(model: "qwen3:1.7b")
+
+# LLaMA via Ollama, with settings
+llama_chain = LLMChain::Chain.new(
+  model: "llama2:7b",
   temperature: 0.8,
   top_p: 0.95
 )
-puts qwen.chat("Write Ruby code for Fibonacci sequence")
 ```
 
-
+## 💾 Memory System
 
+### Array Memory (default)
 ```ruby
-LLMChain::
-
-
-
+memory = LLMChain::Memory::Array.new(max_size: 10)
+chain = LLMChain::Chain.new(
+  model: "qwen3:1.7b",
+  memory: memory
+)
 
-
+chain.ask("My name is Alex")
+chain.ask("What's my name?") # Remembers previous context
+```
 
+### Redis Memory (for production)
 ```ruby
+memory = LLMChain::Memory::Redis.new(
+  redis_url: 'redis://localhost:6379',
+  max_size: 100,
+  namespace: 'my_app'
+)
+
 chain = LLMChain::Chain.new(
   model: "qwen3:1.7b",
-  memory:
+  memory: memory
 )
-
-# Conversation with context
-chain.ask("What's 2^10?")
-chain.ask("Now multiply that by 5")
 ```
 
-##
-
-| Model Family | Backend/Service | Notes |
-|-------------|----------------|-------|
-| OpenAI (GPT-3.5, GPT-4) | Web API | Supports all OpenAI API models (Not tested) |
-| LLaMA2 (7B, 13B, 70B) | Ollama | Local inference via Ollama |
-| Qwen/Qwen3 (0.5B-72B) | Ollama | Supports all Qwen model sizes |
-| Mistral/Mixtral | Ollama | Including Mistral 7B and Mixtral 8x7B (In progress) |
-| Gemma (2B, 7B) | Ollama | Google's lightweight models (In progress) |
-| Claude (Haiku, Sonnet, Opus) | Anthropic API | Web API access (In progress) |
-| Command R+ | Cohere API | Optimized for RAG (In progress) |
+## 🔍 RAG (Retrieval-Augmented Generation)
 
-
+### Setting up RAG with Weaviate
 
 ```ruby
 # Initialize components
-embedder = LLMChain::Embeddings::Clients::Local::OllamaClient.new(
-
-
-memory = LLMChain::Memory::Array.new
-tools = []
+embedder = LLMChain::Embeddings::Clients::Local::OllamaClient.new(
+  model: "nomic-embed-text"
+)
 
-
+vector_store = LLMChain::Embeddings::Clients::Local::WeaviateVectorStore.new(
+  embedder: embedder,
+  weaviate_url: 'http://localhost:8080'
+)
+
+retriever = LLMChain::Embeddings::Clients::Local::WeaviateRetriever.new(
+  embedder: embedder
+)
+
+# Create a chain with RAG
 chain = LLMChain::Chain.new(
   model: "qwen3:1.7b",
-
-  tools: tools, # There is no tools supported yet
-  retriever: retriever # LLMChain::Embeddings::Clients::Local::WeaviateRetriever.new is default
+  retriever: retriever
 )
+```
 
-
-
-simple_chain = LLMChain::Chain.new(model: "qwen3:1.7b")
+### Adding Documents
 
-
+```ruby
 documents = [
   {
-    text: "Ruby supports
-    metadata: { source: "ruby-
+    text: "Ruby supports OOP principles: encapsulation, inheritance, polymorphism",
+    metadata: { source: "ruby-guide", page: 15 }
   },
   {
     text: "Modules in Ruby are used for namespaces and mixins",
-    metadata: { source: "ruby-
-  },
-  {
-    text: "2 + 2 is equals to 4",
-    matadata: { source: 'mad_brain', author: 'John Doe' }
+    metadata: { source: "ruby-book", author: "Matz" }
   }
 ]
 
-#
+# Add to the vector database
 documents.each do |doc|
-
+  vector_store.add_document(
     text: doc[:text],
     metadata: doc[:metadata]
   )
 end
+```
 
-
-
-
+### RAG Queries
+
+```ruby
+# Regular query
+response = chain.ask("What is Ruby?")
 
-# Query with RAG
+# Query with RAG
 response = chain.ask(
   "What OOP principles does Ruby support?",
   rag_context: true,
   rag_options: { limit: 3 }
 )
-
+```
+
+## 🌊 Streaming Output
+
+```ruby
+chain = LLMChain::Chain.new(model: "qwen3:1.7b")
+
+# Streaming with a block
+chain.ask("Tell me about Ruby history", stream: true) do |chunk|
+  print chunk
+  $stdout.flush
+end
+
+# Streaming with tools
+tool_manager = LLMChain::Tools::ToolManager.create_default_toolset
+chain = LLMChain::Chain.new(
+  model: "qwen3:1.7b",
+  tools: tool_manager
+)
 
-
-chain.ask("Explain Ruby modules", stream: true, rag_context: true) do |chunk|
+chain.ask("Calculate 15! and explain the process", stream: true) do |chunk|
   print chunk
 end
 ```
 
-##
+## ⚙️ Configuration
+
+### Environment Variables
+
+```bash
+# OpenAI
+export OPENAI_API_KEY="sk-..."
+export OPENAI_ORGANIZATION_ID="org-..."
+
+# Search
+export SEARCH_API_KEY="your-search-api-key"
+export GOOGLE_SEARCH_ENGINE_ID="your-cse-id"
+
+# Redis
+export REDIS_URL="redis://localhost:6379"
+
+# Weaviate
+export WEAVIATE_URL="http://localhost:8080"
+```
+
+### Tool Configuration
+
+```ruby
+# Build a toolset from configuration
+tools_config = [
+  {
+    class: 'calculator'
+  },
+  {
+    class: 'web_search',
+    options: {
+      search_engine: :duckduckgo,
+      api_key: ENV['SEARCH_API_KEY']
+    }
+  },
+  {
+    class: 'code_interpreter',
+    options: {
+      timeout: 30,
+      allowed_languages: ['ruby', 'python']
+    }
+  }
+]
+
+tool_manager = LLMChain::Tools::ToolManager.from_config(tools_config)
+```
+
+### Client Settings
+
+```ruby
+# Qwen with custom parameters
+qwen = LLMChain::Clients::Qwen.new(
+  model: "qwen2:7b",
+  temperature: 0.7,
+  top_p: 0.9,
+  base_url: "http://localhost:11434"
+)
+
+# OpenAI with settings
+openai = LLMChain::Clients::OpenAI.new(
+  model: "gpt-4",
+  api_key: ENV['OPENAI_API_KEY'],
+  temperature: 0.8,
+  max_tokens: 2000
+)
+```
+
+## 🚨 Error Handling
 
 ```ruby
 begin
-  chain.
+  chain = LLMChain::Chain.new(model: "qwen3:1.7b")
+  response = chain.ask("Complex query")
+rescue LLMChain::UnknownModelError => e
+  puts "Unknown model: #{e.message}"
+rescue LLMChain::ClientError => e
+  puts "Client error: #{e.message}"
+rescue LLMChain::TimeoutError => e
+  puts "Timeout exceeded: #{e.message}"
 rescue LLMChain::Error => e
-  puts "
-
+  puts "General LLMChain error: #{e.message}"
+end
+```
+
+## 📚 Usage Examples
+
+### Chatbot with Tools
+
+```ruby
+require 'llm_chain'
+
+class ChatBot
+  def initialize
+    @tool_manager = LLMChain::Tools::ToolManager.create_default_toolset
+    @memory = LLMChain::Memory::Array.new(max_size: 20)
+    @chain = LLMChain::Chain.new(
+      model: "qwen3:1.7b",
+      memory: @memory,
+      tools: @tool_manager
+    )
+  end
+
+  def chat_loop
+    puts "🤖 Hello! I'm an AI assistant with tools. Ask me anything!"
+
+    loop do
+      print "\n👤 You: "
+      input = gets.chomp
+      break if ['exit', 'quit', 'bye'].include?(input.downcase)
+
+      @chain.ask(input, stream: true) do |chunk|
+        print chunk
+      end
+      puts "\n"
+    end
+  end
 end
+
+# Run
+bot = ChatBot.new
+bot.chat_loop
+```
+
+### Data Analysis with Code
+
+```ruby
+data_chain = LLMChain::Chain.new(
+  model: "qwen3:7b",
+  tools: LLMChain::Tools::ToolManager.create_default_toolset
+)
+
+# Ask the model to analyze and run a snippet
+response = data_chain.ask(<<~PROMPT)
+  Analyze this code and execute it:
+
+  ```ruby
+  data = [
+    { name: "Alice", age: 25, salary: 50000 },
+    { name: "Bob", age: 30, salary: 60000 },
+    { name: "Charlie", age: 35, salary: 70000 }
+  ]
+
+  average_age = data.sum { |person| person[:age] } / data.size.to_f
+  total_salary = data.sum { |person| person[:salary] }
+
+  puts "Average age: #{average_age}"
+  puts "Total salary: #{total_salary}"
+  puts "Average salary: #{total_salary / data.size}"
+  ```
+PROMPT
+
+puts response
 ```
 
-##
-
-
+## 🧪 Testing
+
+```bash
+# Run the test suite
+bundle exec rspec
+
+# Run the demo
+ruby -I lib examples/tools_example.rb
+
+# Interactive console
+bundle exec bin/console
+```
+
+## 📖 API Documentation
+
+### Main Classes
+
+- `LLMChain::Chain` - the main class for building chains
+- `LLMChain::Tools::ToolManager` - tool management
+- `LLMChain::Memory::Array` / `LLMChain::Memory::Redis` - memory backends
+- `LLMChain::Clients::*` - clients for the various LLMs
+
+### Chain Methods
+
+```ruby
+chain = LLMChain::Chain.new(options)
+
+# Main method
+chain.ask(prompt, stream: false, rag_context: false, rag_options: {})
+
+# Initialization parameters:
+# - model: model name
+# - memory: memory object
+# - tools: array of tools or a ToolManager
+# - retriever: RAG retriever
+# - client_options: additional client parameters
+```
+
+## 🗺️ Roadmap
+
+### v0.6.0
+- [ ] ReAct agents
+- [ ] More tools (files, databases)
+- [ ] Claude integration
+- [ ] Enhanced logging
+
+### v0.7.0
+- [ ] Multi-agent systems
+- [ ] Task planning
+- [ ] Web interface
+- [ ] Metrics and monitoring
+
+### v1.0.0
+- [ ] Stable API
+- [ ] Complete documentation
+- [ ] Production readiness
+
+## 🤝 Contributing
+
+1. Fork the repository
+2. Create a feature branch (`git checkout -b feature/amazing-feature`)
+3. Commit your changes (`git commit -m 'Add amazing feature'`)
+4. Push the branch (`git push origin feature/amazing-feature`)
+5. Open a Pull Request
+
+### Development
+
+```bash
+git clone https://github.com/FuryCow/llm_chain.git
+cd llm_chain
+bundle install
+bundle exec rspec
+```
+
+## 📄 License
+
+This project is distributed under the [MIT License](LICENSE.txt).
+
+## 🙏 Acknowledgments
+
+- The [Ollama](https://ollama.ai/) team, for an excellent local-LLM platform
+- The [LangChain](https://langchain.com/) developers, for inspiration
+- The Ruby community, for its support
+
+---
+
+**Made with ❤️ for the Ruby community**
 
-
-
+[Documentation](https://github.com/FuryCow/llm_chain/wiki) |
+[Examples](https://github.com/FuryCow/llm_chain/tree/main/examples) |
+[Issues](https://github.com/FuryCow/llm_chain/issues) |
+[Discussions](https://github.com/FuryCow/llm_chain/discussions)