htm 0.0.17 → 0.0.20
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- checksums.yaml +4 -4
- data/.architecture/decisions/adrs/001-use-postgresql-timescaledb-storage.md +1 -1
- data/.architecture/decisions/adrs/011-database-side-embedding-generation-with-pgai.md +4 -4
- data/.architecture/decisions/adrs/012-llm-driven-ontology-topic-extraction.md +1 -1
- data/.envrc +12 -25
- data/.irbrc +7 -7
- data/.tbls.yml +2 -2
- data/CHANGELOG.md +130 -1
- data/README.md +13 -1
- data/Rakefile +8 -3
- data/SETUP.md +12 -12
- data/bin/htm_mcp +0 -4
- data/db/seed_data/README.md +2 -2
- data/db/seeds.rb +3 -3
- data/docs/api/database.md +37 -37
- data/docs/api/embedding-service.md +140 -110
- data/docs/api/htm.md +1 -1
- data/docs/api/yard/HTM/ActiveRecordConfig.md +8 -2
- data/docs/api/yard/HTM/Config.md +173 -0
- data/docs/api/yard/HTM/ConfigSection.md +28 -0
- data/docs/api/yard/HTM/Database.md +7 -8
- data/docs/api/yard/HTM/JobAdapter.md +1 -1
- data/docs/api/yard/HTM.md +0 -57
- data/docs/api/yard/index.csv +76 -61
- data/docs/api/yard-reference.md +2 -1
- data/docs/architecture/adrs/001-postgresql-timescaledb.md +1 -1
- data/docs/architecture/adrs/003-ollama-embeddings.md +45 -36
- data/docs/architecture/adrs/004-hive-mind.md +1 -1
- data/docs/architecture/adrs/008-robot-identification.md +1 -1
- data/docs/architecture/adrs/011-pgai-integration.md +4 -4
- data/docs/architecture/index.md +11 -9
- data/docs/architecture/overview.md +11 -7
- data/docs/assets/images/balanced-strategy-decay.svg +41 -0
- data/docs/assets/images/class-hierarchy.svg +1 -1
- data/docs/assets/images/eviction-priority.svg +43 -0
- data/docs/assets/images/exception-hierarchy.svg +2 -2
- data/docs/assets/images/hive-mind-shared-memory.svg +52 -0
- data/docs/assets/images/htm-architecture-overview.svg +3 -3
- data/docs/assets/images/htm-core-components.svg +4 -4
- data/docs/assets/images/htm-layered-architecture.svg +1 -1
- data/docs/assets/images/htm-memory-addition-flow.svg +2 -2
- data/docs/assets/images/htm-memory-recall-flow.svg +2 -2
- data/docs/assets/images/memory-topology.svg +53 -0
- data/docs/assets/images/two-tier-memory-architecture.svg +55 -0
- data/docs/database_rake_tasks.md +5 -5
- data/docs/development/rake-tasks.md +11 -11
- data/docs/development/setup.md +97 -65
- data/docs/development/testing.md +1 -1
- data/docs/examples/basic-usage.md +133 -0
- data/docs/examples/config-files.md +170 -0
- data/docs/examples/file-loading.md +208 -0
- data/docs/examples/index.md +116 -0
- data/docs/examples/llm-configuration.md +168 -0
- data/docs/examples/mcp-client.md +172 -0
- data/docs/examples/rails-integration.md +173 -0
- data/docs/examples/robot-groups.md +210 -0
- data/docs/examples/sinatra-integration.md +218 -0
- data/docs/examples/standalone-app.md +216 -0
- data/docs/examples/telemetry.md +224 -0
- data/docs/examples/timeframes.md +143 -0
- data/docs/getting-started/installation.md +117 -60
- data/docs/getting-started/quick-start.md +35 -18
- data/docs/guides/configuration.md +515 -0
- data/docs/guides/file-loading.md +322 -0
- data/docs/guides/getting-started.md +42 -11
- data/docs/guides/index.md +3 -3
- data/docs/guides/long-term-memory.md +1 -1
- data/docs/guides/mcp-server.md +47 -29
- data/docs/guides/propositions.md +264 -0
- data/docs/guides/recalling-memories.md +4 -4
- data/docs/guides/search-strategies.md +3 -3
- data/docs/guides/tags.md +318 -0
- data/docs/guides/telemetry.md +229 -0
- data/docs/index.md +10 -18
- data/docs/multi_framework_support.md +8 -8
- data/docs/{architecture → robots}/hive-mind.md +8 -111
- data/docs/robots/index.md +73 -0
- data/docs/{guides → robots}/multi-robot.md +3 -3
- data/docs/{guides → robots}/robot-groups.md +14 -13
- data/docs/{architecture → robots}/two-tier-memory.md +13 -149
- data/docs/robots/why-robots.md +85 -0
- data/docs/setup_local_database.md +19 -19
- data/docs/using_rake_tasks_in_your_app.md +14 -14
- data/examples/README.md +50 -6
- data/examples/basic_usage.rb +31 -21
- data/examples/cli_app/README.md +8 -8
- data/examples/cli_app/htm_cli.rb +5 -5
- data/examples/config_file_example/README.md +256 -0
- data/examples/config_file_example/config/htm.local.yml +34 -0
- data/examples/config_file_example/custom_config.yml +22 -0
- data/examples/config_file_example/show_config.rb +125 -0
- data/examples/custom_llm_configuration.rb +7 -7
- data/examples/example_app/Rakefile +2 -2
- data/examples/example_app/app.rb +8 -8
- data/examples/file_loader_usage.rb +9 -9
- data/examples/mcp_client.rb +5 -5
- data/examples/rails_app/Gemfile.lock +48 -56
- data/examples/rails_app/README.md +1 -1
- data/examples/robot_groups/multi_process.rb +5 -5
- data/examples/robot_groups/robot_worker.rb +5 -5
- data/examples/robot_groups/same_process.rb +9 -9
- data/examples/sinatra_app/app.rb +1 -1
- data/examples/timeframe_demo.rb +1 -1
- data/lib/htm/active_record_config.rb +12 -25
- data/lib/htm/circuit_breaker.rb +0 -2
- data/lib/htm/config/defaults.yml +246 -0
- data/lib/htm/config.rb +888 -0
- data/lib/htm/database.rb +23 -27
- data/lib/htm/embedding_service.rb +0 -4
- data/lib/htm/integrations/sinatra.rb +3 -7
- data/lib/htm/job_adapter.rb +76 -16
- data/lib/htm/jobs/generate_embedding_job.rb +1 -7
- data/lib/htm/jobs/generate_propositions_job.rb +2 -12
- data/lib/htm/jobs/generate_tags_job.rb +1 -8
- data/lib/htm/loaders/defaults_loader.rb +143 -0
- data/lib/htm/loaders/xdg_config_loader.rb +116 -0
- data/lib/htm/mcp/cli.rb +200 -58
- data/lib/htm/mcp/server.rb +3 -3
- data/lib/htm/proposition_service.rb +2 -12
- data/lib/htm/railtie.rb +3 -4
- data/lib/htm/tag_service.rb +1 -8
- data/lib/htm/version.rb +1 -1
- data/lib/htm/workflows/remember_workflow.rb +212 -0
- data/lib/htm.rb +125 -5
- data/mkdocs.yml +33 -8
- metadata +83 -10
- data/config/database.yml +0 -77
- data/docs/api/yard/HTM/Configuration.md +0 -229
- data/docs/telemetry.md +0 -391
- data/lib/htm/configuration.rb +0 -799
````diff
@@ -8,7 +8,7 @@ Before installing HTM, ensure you have:
 
 - **Ruby 3.0 or higher** - HTM requires modern Ruby features
 - **PostgreSQL 17+** - For the database backend
-- **
+- **LLM Provider** - For generating embeddings and tags (Ollama is the default for local development, but OpenAI, Anthropic, Gemini, and others are also supported via RubyLLM)
 
 ### Check Your Ruby Version
 
````
````diff
@@ -103,24 +103,24 @@ CREATE EXTENSION IF NOT EXISTS pg_trgm;
 
 ```bash
 # Add to ~/.bashrc or your preferred config file
-export
-export
-export
-export
-export
-export
+export HTM_DATABASE__URL="postgres://username:password@localhost:5432/htm_db"
+export HTM_DATABASE__NAME="htm_db"
+export HTM_DATABASE__USER="your_username"
+export HTM_DATABASE__PASSWORD="your_password"
+export HTM_DATABASE__PORT="5432"
+export HTM_DATABASE__HOST="localhost"
 
 # Load the configuration
 source ~/.bashrc
 ```
 
 !!! tip "Environment Configuration"
-    HTM automatically uses the `
+    HTM automatically uses the `HTM_DATABASE__URL` environment variable if available. You can also pass database configuration directly to `HTM.new()`.
 
 Set environment variable:
 
 ```bash
-export
+export HTM_DATABASE__URL="postgres://localhost/htm_db"
 ```
 
 ## Step 3: Enable PostgreSQL Extensions
````
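The `HTM_DATABASE__URL` value introduced in this hunk packs the separate host, port, user, and database-name settings into a single connection string. As a quick, htm-independent sketch using Ruby's standard `URI` library, the pieces decompose like this:

```ruby
require 'uri'

# Decompose a PostgreSQL connection URL into its parts
url = URI.parse("postgres://username:password@localhost:5432/htm_db")

puts url.host                     # localhost
puts url.port                     # 5432
puts url.user                     # username
puts url.path.delete_prefix("/")  # htm_db (the database name)
```

Per the tip above, either form works: set the single URL, or the individual `HTM_DATABASE__*` variables.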
````diff
@@ -139,7 +139,7 @@ Test your database connection and verify extensions:
 cd /path/to/your/project
 ruby -e "
 require 'pg'
-conn = PG.connect(ENV['
+conn = PG.connect(ENV['HTM_DATABASE__URL'])
 result = conn.exec('SELECT extname, extversion FROM pg_extension ORDER BY extname')
 result.each { |row| puts \"✓ #{row['extname']}: Version #{row['extversion']}\" }
 conn.close
````
````diff
@@ -156,14 +156,29 @@ Expected output:
 !!! warning "Missing Extensions"
     If extensions are missing, you may need to install them. On Debian/Ubuntu: `sudo apt-get install postgresql-17-pgvector`. On macOS: `brew install pgvector`.
 
-## Step 4:
+## Step 4: Configure LLM Provider
 
-HTM uses
+HTM uses RubyLLM to generate vector embeddings and extract tags. RubyLLM supports multiple providers, allowing you to choose what works best for your use case.
 
-###
+### Supported Providers
 
-
+| Provider | Best For | API Key Required |
+|----------|----------|------------------|
+| **Ollama** (default) | Local development, privacy, no API costs | No |
+| **OpenAI** | Production, high-quality embeddings | Yes (`OPENAI_API_KEY`) |
+| **Anthropic** | Tag extraction with Claude models | Yes (`ANTHROPIC_API_KEY`) |
+| **Gemini** | Google Cloud integration | Yes (`GEMINI_API_KEY`) |
+| **Azure** | Enterprise Azure deployments | Yes (Azure credentials) |
+| **Bedrock** | AWS integration | Yes (AWS credentials) |
+| **DeepSeek** | Cost-effective alternative | Yes (`DEEPSEEK_API_KEY`) |
 
+### Option A: Ollama (Recommended for Local Development)
+
+Ollama runs locally with no API costs and keeps your data private.
+
+#### Install Ollama
+
+**macOS:**
 ```bash
 # Option 1: Direct download
 curl https://ollama.ai/install.sh | sh
````
````diff
@@ -172,17 +187,15 @@ curl https://ollama.ai/install.sh | sh
 brew install ollama
 ```
 
-
-
+**Linux:**
 ```bash
 curl https://ollama.ai/install.sh | sh
 ```
 
-
-
+**Windows:**
 Download the installer from [https://ollama.ai/download](https://ollama.ai/download)
 
-
+#### Start Ollama Service
 
 ```bash
 # Ollama typically starts automatically
````
````diff
@@ -190,43 +203,68 @@ Download the installer from [https://ollama.ai/download](https://ollama.ai/downl
 curl http://localhost:11434/api/version
 ```
 
-
+#### Pull Required Models
+
+```bash
+# Download embedding model
+ollama pull nomic-embed-text
+
+# Download tag extraction model
+ollama pull gemma3:latest
 
-
-
+# Verify models are available
+ollama list
 ```
 
-
+#### Configure Environment (Optional)
 
-
+If Ollama is running on a different host or port:
 
 ```bash
-
-ollama pull gpt-oss
-
-# Verify the model is available
-ollama list
+export OLLAMA_URL="http://custom-host:11434"
 ```
 
-
+### Option B: OpenAI (Recommended for Production)
 
-
+OpenAI provides high-quality embeddings with simple API access.
 
 ```bash
-#
-
+# Set your API key
+export OPENAI_API_KEY="sk-..."
 ```
 
-
+Configure HTM to use OpenAI:
 
-
+```ruby
+HTM.configure do |config|
+  config.embedding.provider = :openai
+  config.embedding.model = 'text-embedding-3-small'
+  config.tag.provider = :openai
+  config.tag.model = 'gpt-4o-mini'
+end
+```
+
+### Option C: Other Providers
+
+For Anthropic, Gemini, Azure, Bedrock, or DeepSeek, set the appropriate API key and configure HTM:
 
 ```bash
-
+# Example: Anthropic
+export ANTHROPIC_API_KEY="sk-ant-..."
+
+# Example: Gemini
+export GEMINI_API_KEY="..."
+```
+
+```ruby
+HTM.configure do |config|
+  config.tag.provider = :anthropic
+  config.tag.model = 'claude-3-haiku-20240307'
+end
 ```
 
-!!! tip "
-
+!!! tip "Mix and Match Providers"
+    You can use different providers for embeddings and tags. For example, use Ollama for local embedding generation and OpenAI for tag extraction.
 
 ## Step 5: Initialize HTM Database Schema
 
````
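The `HTM.configure` blocks added in this hunk use dot access into nested sections (`config.embedding.provider`, `config.tag.model`). As an illustration only — not the gem's actual implementation (the real gem has its own `Config`/`ConfigSection` classes, per the file list above) — the same access pattern can be modeled with plain Ruby `Struct`s:

```ruby
# Hypothetical stand-in for HTM's nested config sections (illustration only)
Section = Struct.new(:provider, :model)

config = Struct.new(:embedding, :tag).new(
  Section.new(:ollama, "nomic-embed-text"),  # local embeddings
  Section.new(:openai, "gpt-4o-mini")        # cloud tag extraction
)

puts config.embedding.provider  # ollama
puts config.tag.model           # gpt-4o-mini
```

The mixed values here echo the "Mix and Match Providers" tip: each section carries its own provider and model, so embeddings and tags can be served by different backends.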
````diff
@@ -297,10 +335,9 @@ puts "Testing HTM Installation..."
 # Initialize HTM
 htm = HTM.new(
   robot_name: "Test Robot",
-  working_memory_size: 128_000
-  embedding_service: :ollama,
-  embedding_model: 'gpt-oss'
+  working_memory_size: 128_000
 )
+# Uses configured provider, or defaults to Ollama
 
 puts "✓ HTM initialized successfully"
 puts " Robot ID: #{htm.robot_id}"
````
````diff
@@ -352,12 +389,15 @@ HTM uses the following environment variables:
 
 | Variable | Description | Default | Required |
 |----------|-------------|---------|----------|
-| `
-| `
-| `
-| `
-| `
-| `OLLAMA_URL` | Ollama API URL | `http://localhost:11434` | No |
+| `HTM_DATABASE__URL` | PostgreSQL connection URL | - | Yes |
+| `HTM_DATABASE__NAME` | Database name | `htm_db` | No |
+| `HTM_DATABASE__USER` | Database user | `postgres` | No |
+| `HTM_DATABASE__PASSWORD` | Database password | - | No |
+| `HTM_DATABASE__PORT` | Database port | `5432` | No |
+| `OLLAMA_URL` | Ollama API URL (if using Ollama) | `http://localhost:11434` | No |
+| `OPENAI_API_KEY` | OpenAI API key (if using OpenAI) | - | No |
+| `ANTHROPIC_API_KEY` | Anthropic API key (if using Anthropic) | - | No |
+| `GEMINI_API_KEY` | Gemini API key (if using Gemini) | - | No |
 
 ### Example Configuration File
 
````
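The double underscore in the new `HTM_DATABASE__*` variable names suggests a section/key split (e.g. `database` → `url`), matching the nested loaders added in this release; that mapping is an inference from the names, not something the diff states. A hypothetical sketch of such a convention:

```ruby
# Hypothetical: build a nested hash from HTM_-prefixed environment
# variables, splitting names on "__" into section/key (illustration only)
def htm_env_to_nested(env)
  env.each_with_object({}) do |(name, value), out|
    next unless name.start_with?("HTM_")
    section, key = name.delete_prefix("HTM_").downcase.split("__", 2)
    next unless key                     # skip names without a "__" split
    (out[section] ||= {})[key] = value
  end
end

cfg = htm_env_to_nested(
  "HTM_DATABASE__URL"  => "postgres://localhost/htm_db",
  "HTM_DATABASE__PORT" => "5432",
  "PATH"               => "/usr/bin"    # unrelated vars are ignored
)
# cfg => { "database" => { "url" => "postgres://localhost/htm_db", "port" => "5432" } }
```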
````diff
@@ -365,8 +405,13 @@ Create a configuration file for easy loading:
 
 ```bash
 # ~/.bashrc__htm
-export
+export HTM_DATABASE__URL="postgres://user:pass@host:port/db?sslmode=require"
+# Ollama (for local development)
 export OLLAMA_URL="http://localhost:11434"
+
+# Or use cloud providers:
+# export OPENAI_API_KEY="sk-..."
+# export ANTHROPIC_API_KEY="sk-ant-..."
 ```
 
 Load it in your shell:
````
````diff
@@ -385,11 +430,11 @@ source ~/.bashrc__htm
 **Solutions**:
 
 ```bash
-# 1. Verify
-echo $
+# 1. Verify HTM_DATABASE__URL is set
+echo $HTM_DATABASE__URL
 
 # 2. Test connection manually
-psql $
+psql $HTM_DATABASE__URL
 
 # 3. Check if PostgreSQL is running (local installs)
 pg_ctl status
````
````diff
@@ -398,11 +443,11 @@ pg_ctl status
 # Ensure URL includes: ?sslmode=require
 ```
 
-###
+### LLM Provider Connection Issues
 
-**Error**: `Connection refused
+**Error**: `Connection refused` (Ollama) or `API key invalid` (cloud providers)
 
-**Solutions**:
+**Solutions for Ollama**:
 
 ```bash
 # 1. Check if Ollama is running
````
````diff
@@ -415,8 +460,19 @@ curl http://localhost:11434/api/version
 killall ollama
 ollama serve
 
-# 4. Verify
-ollama list | grep
+# 4. Verify embedding model is installed
+ollama list | grep nomic-embed-text
+```
+
+**Solutions for Cloud Providers**:
+
+```bash
+# Verify API key is set
+echo $OPENAI_API_KEY
+echo $ANTHROPIC_API_KEY
+
+# Test API connectivity
+curl https://api.openai.com/v1/models -H "Authorization: Bearer $OPENAI_API_KEY"
 ```
 
 ### Missing Extensions
````
````diff
@@ -433,7 +489,7 @@ make
 sudo make install
 
 # Enable in database
-psql $
+psql $HTM_DATABASE__URL -c "CREATE EXTENSION IF NOT EXISTS pgvector;"
 ```
 
 ### Ruby Version Issues
````
````diff
@@ -462,7 +518,7 @@ ruby --version
 
 ```bash
 # Ensure your database user has necessary permissions
-psql $
+psql $HTM_DATABASE__URL -c "
 GRANT ALL PRIVILEGES ON DATABASE your_db TO your_user;
 GRANT ALL ON ALL TABLES IN SCHEMA public TO your_user;
 "
````
````diff
@@ -491,7 +547,8 @@ If you encounter issues:
 
 ## Additional Resources
 
-- **
+- **RubyLLM Documentation**: [https://rubyllm.com/](https://rubyllm.com/) - Multi-provider LLM interface
+- **Ollama Documentation**: [https://ollama.ai/](https://ollama.ai/) - Local LLM provider
+- **OpenAI API**: [https://platform.openai.com/docs/](https://platform.openai.com/docs/) - Cloud embeddings
 - **pgvector Documentation**: [https://github.com/pgvector/pgvector](https://github.com/pgvector/pgvector)
 - **PostgreSQL Documentation**: [https://www.postgresql.org/docs/](https://www.postgresql.org/docs/)
-- **RubyLLM Documentation**: [https://github.com/madbomber/ruby_llm](https://github.com/madbomber/ruby_llm)
````
````diff
@@ -123,12 +123,13 @@ puts "=" * 60
 Create an HTM instance for your robot:
 
 ```ruby
-# Configure HTM globally (optional -
+# Configure HTM globally (optional - defaults to Ollama for local development)
+# HTM uses RubyLLM which supports: :ollama, :openai, :anthropic, :gemini, :azure, :bedrock, :deepseek
 HTM.configure do |config|
-  config.
-  config.
-  config.
-  config.
+  config.embedding.provider = :ollama # or :openai, etc.
+  config.embedding.model = 'nomic-embed-text' # provider-specific model
+  config.tag.provider = :ollama
+  config.tag.model = 'gemma3:latest'
 end
 
 # Initialize HTM with a robot name
````
````diff
@@ -336,12 +337,13 @@ require 'htm'
 puts "My First HTM Application"
 puts "=" * 60
 
-# Step 1: Configure and initialize HTM
+# Step 1: Configure and initialize HTM (optional - uses Ollama by default)
+# Supports: :ollama, :openai, :anthropic, :gemini, :azure, :bedrock, :deepseek
 HTM.configure do |config|
-  config.
-  config.
-  config.
-  config.
+  config.embedding.provider = :ollama
+  config.embedding.model = 'nomic-embed-text'
+  config.tag.provider = :ollama
+  config.tag.model = 'gemma3:latest'
 end
 
 htm = HTM.new(
````
````diff
@@ -520,10 +522,15 @@ htm = HTM.new(
   working_memory_size: 256_000 # 256k tokens
 )
 
-# Try different
+# Try different providers or models
 HTM.configure do |config|
-
-  config.
+  # Use OpenAI for production
+  config.embedding.provider = :openai
+  config.embedding.model = 'text-embedding-3-small'
+
+  # Or use Ollama locally with different model
+  # config.embedding.provider = :ollama
+  # config.embedding.model = 'mxbai-embed-large'
 end
 
 # Try different recall strategies
````
````diff
@@ -595,33 +602,43 @@ htm.remember(
 
 ## Troubleshooting Quick Start
 
-### Issue: "Connection refused" error
+### Issue: "Connection refused" error (Ollama)
 
 **Solution**: Make sure Ollama is running:
 
 ```bash
 curl http://localhost:11434/api/version
-# If this fails, start Ollama
+# If this fails, start Ollama with: ollama serve
+```
+
+### Issue: "API key invalid" error (cloud providers)
+
+**Solution**: Verify your API key is set:
+
+```bash
+echo $OPENAI_API_KEY # or ANTHROPIC_API_KEY, GEMINI_API_KEY, etc.
 ```
 
 ### Issue: "Database connection failed"
 
-**Solution**: Verify your `
+**Solution**: Verify your `HTM_DATABASE__URL` is set:
 
 ```bash
-echo $
+echo $HTM_DATABASE__URL
 # Should show your connection string
 ```
 
 ### Issue: Embeddings taking too long
 
-**Solution
+**Solution for Ollama**: Check the model is downloaded:
 
 ```bash
 ollama list | grep nomic-embed-text
 # Should show nomic-embed-text model
 ```
 
+**Solution for cloud providers**: Check your internet connection and API status.
+
 ### Issue: Memory not found during recall
 
 **Solution**: Check your timeframe. If you just added a memory, use a recent timeframe:
````