@tecet/ollm 0.1.4 → 0.1.5

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (86)
  1. package/dist/cli.js +20 -14
  2. package/dist/cli.js.map +3 -3
  3. package/dist/services/documentService.d.ts.map +1 -1
  4. package/dist/services/documentService.js +12 -2
  5. package/dist/services/documentService.js.map +1 -1
  6. package/dist/ui/components/docs/DocsPanel.d.ts.map +1 -1
  7. package/dist/ui/components/docs/DocsPanel.js +1 -1
  8. package/dist/ui/components/docs/DocsPanel.js.map +1 -1
  9. package/dist/ui/components/launch/VersionBanner.js +1 -1
  10. package/dist/ui/components/launch/VersionBanner.js.map +1 -1
  11. package/dist/ui/components/layout/KeybindsLegend.d.ts.map +1 -1
  12. package/dist/ui/components/layout/KeybindsLegend.js +1 -1
  13. package/dist/ui/components/layout/KeybindsLegend.js.map +1 -1
  14. package/dist/ui/components/tabs/BugReportTab.js +1 -1
  15. package/dist/ui/components/tabs/BugReportTab.js.map +1 -1
  16. package/dist/ui/services/docsService.d.ts +12 -27
  17. package/dist/ui/services/docsService.d.ts.map +1 -1
  18. package/dist/ui/services/docsService.js +40 -67
  19. package/dist/ui/services/docsService.js.map +1 -1
  20. package/docs/README.md +3 -410
  21. package/package.json +10 -7
  22. package/scripts/copy-docs-to-user.cjs +34 -0
  23. package/docs/Context/CheckpointFlowDiagram.md +0 -673
  24. package/docs/Context/ContextArchitecture.md +0 -898
  25. package/docs/Context/ContextCompression.md +0 -1102
  26. package/docs/Context/ContextManagment.md +0 -750
  27. package/docs/Context/Index.md +0 -209
  28. package/docs/Context/README.md +0 -390
  29. package/docs/DevelopmentRoadmap/Index.md +0 -238
  30. package/docs/DevelopmentRoadmap/OLLM-CLI_Releases.md +0 -419
  31. package/docs/DevelopmentRoadmap/PlanedFeatures.md +0 -448
  32. package/docs/DevelopmentRoadmap/README.md +0 -174
  33. package/docs/DevelopmentRoadmap/Roadmap.md +0 -572
  34. package/docs/DevelopmentRoadmap/RoadmapVisual.md +0 -372
  35. package/docs/Hooks/Architecture.md +0 -885
  36. package/docs/Hooks/Index.md +0 -244
  37. package/docs/Hooks/KeyboardShortcuts.md +0 -248
  38. package/docs/Hooks/Protocol.md +0 -817
  39. package/docs/Hooks/README.md +0 -403
  40. package/docs/Hooks/UserGuide.md +0 -1483
  41. package/docs/Hooks/VisualGuide.md +0 -598
  42. package/docs/Index.md +0 -506
  43. package/docs/Installation.md +0 -586
  44. package/docs/Introduction.md +0 -367
  45. package/docs/LLM Models/Index.md +0 -239
  46. package/docs/LLM Models/LLM_GettingStarted.md +0 -748
  47. package/docs/LLM Models/LLM_Index.md +0 -701
  48. package/docs/LLM Models/LLM_MemorySystem.md +0 -337
  49. package/docs/LLM Models/LLM_ModelCompatibility.md +0 -499
  50. package/docs/LLM Models/LLM_ModelsArchitecture.md +0 -933
  51. package/docs/LLM Models/LLM_ModelsCommands.md +0 -839
  52. package/docs/LLM Models/LLM_ModelsConfiguration.md +0 -1094
  53. package/docs/LLM Models/LLM_ModelsList.md +0 -1071
  54. package/docs/LLM Models/LLM_ModelsList.md.backup +0 -400
  55. package/docs/LLM Models/README.md +0 -355
  56. package/docs/MCP/MCP_Architecture.md +0 -1086
  57. package/docs/MCP/MCP_Commands.md +0 -1111
  58. package/docs/MCP/MCP_GettingStarted.md +0 -590
  59. package/docs/MCP/MCP_Index.md +0 -524
  60. package/docs/MCP/MCP_Integration.md +0 -866
  61. package/docs/MCP/MCP_Marketplace.md +0 -160
  62. package/docs/MCP/README.md +0 -415
  63. package/docs/Prompts System/Architecture.md +0 -760
  64. package/docs/Prompts System/Index.md +0 -223
  65. package/docs/Prompts System/PromptsRouting.md +0 -1047
  66. package/docs/Prompts System/PromptsTemplates.md +0 -1102
  67. package/docs/Prompts System/README.md +0 -389
  68. package/docs/Prompts System/SystemPrompts.md +0 -856
  69. package/docs/Quickstart.md +0 -535
  70. package/docs/Tools/Architecture.md +0 -884
  71. package/docs/Tools/GettingStarted.md +0 -624
  72. package/docs/Tools/Index.md +0 -216
  73. package/docs/Tools/ManifestReference.md +0 -141
  74. package/docs/Tools/README.md +0 -440
  75. package/docs/Tools/UserGuide.md +0 -773
  76. package/docs/Troubleshooting.md +0 -1265
  77. package/docs/UI&Settings/Architecture.md +0 -729
  78. package/docs/UI&Settings/ColorASCII.md +0 -34
  79. package/docs/UI&Settings/Commands.md +0 -755
  80. package/docs/UI&Settings/Configuration.md +0 -872
  81. package/docs/UI&Settings/Index.md +0 -293
  82. package/docs/UI&Settings/Keybinds.md +0 -372
  83. package/docs/UI&Settings/README.md +0 -278
  84. package/docs/UI&Settings/Terminal.md +0 -637
  85. package/docs/UI&Settings/Themes.md +0 -604
  86. package/docs/UI&Settings/UIGuide.md +0 -550
package/docs/Installation.md
@@ -1,586 +0,0 @@
- # Installation Guide
-
- This guide will help you install OLLM CLI on your computer. The process is straightforward and should take about 5-10 minutes.
-
- ---
-
- ## Table of Contents
-
- - [Prerequisites](#prerequisites)
- - [Quick Install](#quick-install)
- - [Detailed Installation](#detailed-installation)
- - [Platform-Specific Instructions](#platform-specific-instructions)
- - [Verify Installation](#verify-installation)
- - [Next Steps](#next-steps)
- - [Troubleshooting](#troubleshooting)
-
- ---
-
- ## Prerequisites
-
- Before installing OLLM CLI, make sure you have:
-
- ### Required
-
- **Node.js 20 or higher**
-
- - Check your version: `node --version`
- - Download from: [nodejs.org](https://nodejs.org/)
- - We recommend the LTS (Long Term Support) version
-
- **npm (comes with Node.js)**
-
- - Check your version: `npm --version`
- - Should be version 10 or higher
-
- ### Recommended
-
- **Modern Terminal**
-
- - **Windows:** Windows Terminal (from Microsoft Store)
- - **macOS:** iTerm2 or built-in Terminal
- - **Linux:** Your favorite terminal emulator
-
- **System Resources**
-
- - **RAM:** 8GB minimum, 16GB recommended
- - **Storage:** 10GB free space (for models)
- - **GPU:** Optional but recommended for faster inference
-
- ---
-
- ## Quick Install
-
- The fastest way to get started:
-
- ```bash
- # Install OLLM CLI globally
- npm install -g @tecet/ollm
-
- # The installer will guide you through:
- # 1. Installing Ollama (if needed)
- # 2. Downloading a starter model
- # 3. Setting up configuration
-
- # Start using OLLM CLI
- ollm
- ```
-
- That's it! The interactive installer will handle everything else.
-
- ---
-
- ## Detailed Installation
-
- ### Step 1: Install Node.js
-
- If you don't have Node.js 20+ installed:
-
- **Windows:**
-
- 1. Download installer from [nodejs.org](https://nodejs.org/)
- 2. Run the installer
- 3. Follow the setup wizard
- 4. Restart your terminal
-
- **macOS:**
-
- ```bash
- # Using Homebrew (recommended)
- brew install node
-
- # Or download from nodejs.org
- ```
-
- **Linux:**
-
- ```bash
- # Ubuntu/Debian
- curl -fsSL https://deb.nodesource.com/setup_20.x | sudo -E bash -
- sudo apt-get install -y nodejs
-
- # Fedora
- sudo dnf install nodejs
-
- # Arch Linux
- sudo pacman -S nodejs npm
- ```
-
- **Verify installation:**
-
- ```bash
- node --version # Should show v20.x.x or higher
- npm --version # Should show 10.x.x or higher
- ```
-
- ### Step 2: Install OLLM CLI
-
- **Global Installation (Recommended):**
-
- ```bash
- npm install -g @tecet/ollm
- ```
-
- This makes the `ollm` command available everywhere.
-
- **Local Installation (Alternative):**
-
- ```bash
- # In your project directory
- npm install @tecet/ollm
-
- # Run with npx
- npx ollm
- ```
-
- ### Step 3: Interactive Setup
-
- After installation, the setup wizard will start automatically:
-
- ```
- ┌─ OLLM CLI Setup ─────────────────────────┐
- │ │
- │ Welcome to OLLM CLI! │
- │ │
- │ Checking for Ollama... │
- ```
-
- **If Ollama is not installed:**
-
- ```
- │ ✗ Ollama not detected │
- │ │
- │ Ollama is required to run local models. │
- │ Install Ollama now? (Y/n): │
- ```
-
- Type `Y` and press Enter. The installer will:
-
- 1. Download Ollama for your platform
- 2. Install it automatically
- 3. Start the Ollama service
-
- **Download a starter model:**
-
- ```
- │ Pull default model (llama3.2:3b ~2GB)? │
- │ This will download a small model to │
- │ get you started. (Y/n): │
- ```
-
- Type `Y` to download a small, fast model (about 2GB).
-
- **Setup complete:**
-
- ```
- │ ✓ Setup complete! │
- │ │
- │ Start using OLLM CLI: │
- │ ollm │
- │ │
- └───────────────────────────────────────────┘
- ```
-
- ---
-
- ## Platform-Specific Instructions
-
- ### Windows
-
- **1. Install Node.js:**
-
- - Download from [nodejs.org](https://nodejs.org/)
- - Run the `.msi` installer
- - Check "Add to PATH" during installation
-
- **2. Install OLLM CLI:**
-
- ```powershell
- # Open PowerShell as Administrator
- npm install -g @tecet/ollm
- ```
-
- **3. Ollama Installation:**
- The installer will download `OllamaSetup.exe` and install it automatically.
-
- **4. Windows Terminal (Recommended):**
-
- - Install from Microsoft Store
- - Better colors and Unicode support
- - Modern features
-
- **Common Issues:**
-
- - If you get permission errors, run PowerShell as Administrator
- - If `ollm` command not found, restart your terminal
- - Check that npm global bin is in PATH: `npm config get prefix`
-
- ### macOS
-
- **1. Install Node.js:**
-
- ```bash
- # Using Homebrew (recommended)
- brew install node
-
- # Or download from nodejs.org
- ```
-
- **2. Install OLLM CLI:**
-
- ```bash
- npm install -g @tecet/ollm
- ```
-
- **3. Ollama Installation:**
- The installer will download and install Ollama.app automatically.
-
- **4. Terminal Setup:**
-
- - Built-in Terminal works fine
- - iTerm2 recommended for better features
- - Alacritty for maximum performance
-
- **Common Issues:**
-
- - If you get permission errors: `sudo npm install -g @tecet/ollm`
- - Or configure npm to use user directory (see Troubleshooting)
- - macOS may ask for permission to install Ollama - click "Allow"
-
- ### Linux
-
- **1. Install Node.js:**
-
- ```bash
- # Ubuntu/Debian
- curl -fsSL https://deb.nodesource.com/setup_20.x | sudo -E bash -
- sudo apt-get install -y nodejs
-
- # Fedora
- sudo dnf install nodejs
-
- # Arch Linux
- sudo pacman -S nodejs npm
- ```
-
- **2. Install OLLM CLI:**
-
- ```bash
- npm install -g @tecet/ollm
- ```
-
- **3. Ollama Installation:**
- The installer will run the official Ollama install script:
-
- ```bash
- curl -fsSL https://ollama.ai/install.sh | sh
- ```
-
- **4. Terminal:**
- Most Linux terminals work great out of the box.
-
- **Common Issues:**
-
- - If you get permission errors, use `sudo` or configure npm user directory
- - Ensure your terminal supports 256 colors
- - Check that Ollama service is running: `systemctl status ollama`
-
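If `systemctl status ollama` reports that the service is not running, the sketch below shows one way to start it on a systemd-based distribution. It assumes the official install script registered a systemd unit named `ollama`; adjust if your setup differs.

```bash
# Start the Ollama service now and have it start on boot
# (assumes the install script created a systemd unit named "ollama")
sudo systemctl enable --now ollama

# Confirm the service is up and answering on the default port
systemctl status ollama
curl http://localhost:11434/api/tags
```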
- ---
-
- ## Verify Installation
-
- After installation, verify everything works:
-
- ### 1. Check OLLM CLI Version
-
- ```bash
- ollm --version
- ```
-
- Should show: `0.1.0` or higher
-
- ### 2. Check Ollama
-
- ```bash
- ollama --version
- ```
-
- Should show Ollama version information.
-
- ### 3. List Available Models
-
- ```bash
- ollama list
- ```
-
- Should show at least one model (like `llama3.2:3b`).
-
- ### 4. Test OLLM CLI
-
- ```bash
- ollm
- ```
-
- You should see the OLLM CLI interface with a welcome message.
-
- ### 5. Send a Test Message
-
- ```
- You: Hello!
- ```
-
- The AI should respond with a greeting.
-
- ---
-
- ## Manual Ollama Installation
-
- If the automatic installer doesn't work, install Ollama manually:
-
- ### Windows
-
- 1. Download from [ollama.ai/download](https://ollama.ai/download)
- 2. Run `OllamaSetup.exe`
- 3. Follow the installer
- 4. Ollama will start automatically
-
- ### macOS
-
- 1. Download from [ollama.ai/download](https://ollama.ai/download)
- 2. Open the `.zip` file
- 3. Drag Ollama.app to Applications
- 4. Open Ollama.app
- 5. It will run in the menu bar
-
- ### Linux
-
- ```bash
- curl -fsSL https://ollama.ai/install.sh | sh
- ```
-
- ### Verify Ollama Installation
-
- ```bash
- # Check if Ollama is running
- curl http://localhost:11434/api/tags
-
- # Should return JSON with model list
- ```
-
- ### Download a Model
-
- ```bash
- # Download a small, fast model
- ollama pull llama3.2:3b
-
- # Or a larger, more capable model
- ollama pull llama3.1:8b
- ```
-
- ---
-
- ## Configuration
-
- OLLM CLI creates configuration files automatically, but you can customize them:
-
- ### User Configuration
-
- **Location:** `~/.ollm/settings.json`
-
- ```json
- {
-   "provider": {
-     "ollama": {
-       "autoStart": true,
-       "host": "localhost",
-       "port": 11434,
-       "url": "http://localhost:11434"
-     }
-   },
-   "ui": {
-     "theme": "solarized-dark"
-   },
-   "context": {
-     "autoCompress": true
-   }
- }
- ```
-
- ### Workspace Configuration
-
- **Location:** `.ollm/settings.json` (in your project)
-
- Workspace settings override user settings.
-
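A minimal sketch of such an override, assuming the workspace file uses the same schema as the user-level example above: this project-local file disables automatic context compression while inheriting every other setting.

```bash
# Create a project-local settings file that overrides a single key.
# The schema mirrors ~/.ollm/settings.json shown above; all other
# values continue to come from the user-level configuration.
mkdir -p .ollm
cat > .ollm/settings.json << 'EOF'
{
  "context": {
    "autoCompress": false
  }
}
EOF
```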
- ### Environment Variables
-
- ```bash
- # Ollama host
- export OLLAMA_HOST=http://localhost:11434
-
- # Log level
- export OLLM_LOG_LEVEL=info
-
- # Custom config path
- export OLLM_CONFIG_PATH=~/.ollm/custom-config.json
- ```
-
- ---
-
- ## Updating
-
- ### Update OLLM CLI
-
- ```bash
- # Check for updates
- npm outdated -g @tecet/ollm
-
- # Update to latest version
- npm update -g @tecet/ollm
-
- # Or reinstall
- npm install -g @tecet/ollm@latest
- ```
-
- ### Update Ollama
-
- ```bash
- # Ollama updates itself automatically
- # Or manually:
-
- # macOS/Linux
- curl -fsSL https://ollama.ai/install.sh | sh
-
- # Windows
- # Download new installer from ollama.ai/download
- ```
-
- ### Update Models
-
- ```bash
- # Update a specific model
- ollama pull llama3.2:3b
-
- # List installed models (re-pull any you want to refresh)
- ollama list
- ```
-
- ---
-
- ## Uninstalling
-
- ### Uninstall OLLM CLI
-
- ```bash
- npm uninstall -g @tecet/ollm
- ```
-
- ### Remove Configuration
-
- ```bash
- # Remove user configuration
- rm -rf ~/.ollm
-
- # Remove workspace configuration
- rm -rf .ollm
- ```
-
- ### Uninstall Ollama
-
- **Windows:**
-
- - Use "Add or Remove Programs"
- - Search for "Ollama"
- - Click "Uninstall"
-
- **macOS:**
-
- - Drag Ollama.app to Trash
- - Remove data: `rm -rf ~/.ollama`
-
- **Linux:**
-
- ```bash
- # Stop service
- sudo systemctl stop ollama
-
- # Remove binary
- sudo rm /usr/local/bin/ollama
-
- # Remove data
- rm -rf ~/.ollama
- ```
-
- ---
-
- ## Next Steps
-
- Now that OLLM CLI is installed:
-
- 1. **[Quick Start Guide](Quickstart.md)** - Learn the basics in 5 minutes
- 2. **[User Interface Guide](UI&Settings/UIGuide.md)** - Understand the interface
- 3. **[Commands Reference](UI&Settings/Commands.md)** - Learn all commands
- 4. **[Configuration Guide](UI&Settings/Configuration.md)** - Customize your setup
-
- ---
-
- ## Troubleshooting
-
- ### Installation Issues
-
- **"npm: command not found"**
-
- - Node.js is not installed or not in PATH
- - Install Node.js from [nodejs.org](https://nodejs.org/)
- - Restart your terminal
-
- **"Permission denied" errors**
-
- - On macOS/Linux: Use `sudo npm install -g @tecet/ollm`
- - Or configure npm to use user directory (recommended)
- - See [Troubleshooting Guide](Troubleshooting.md#permission-errors)
-
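For the user-directory approach mentioned above, a common pattern is to point npm's global prefix at a directory you own and put its `bin` folder on your PATH. The directory name below is an arbitrary choice, not something OLLM CLI requires.

```bash
# Point npm's global prefix at a user-owned directory (name is arbitrary)
mkdir -p ~/.npm-global
npm config set prefix ~/.npm-global

# Make globally installed binaries reachable (add to ~/.bashrc or ~/.zshrc)
export PATH="$HOME/.npm-global/bin:$PATH"

# Reinstall without sudo
npm install -g @tecet/ollm
```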
- **"ollm: command not found"**
-
- - npm global bin not in PATH
- - Find npm bin: `npm config get prefix`
- - Add to PATH: `export PATH=$(npm config get prefix)/bin:$PATH`
- - Restart terminal
-
- **Ollama installation fails**
-
- - Install Ollama manually from [ollama.ai/download](https://ollama.ai/download)
- - Verify installation: `ollama --version`
- - Start Ollama: `ollama serve`
-
- **Model download fails**
-
- - Check internet connection
- - Check disk space (models are 2-10GB)
- - Try a smaller model: `ollama pull llama3.2:3b`
- - Check the Ollama server logs (on Linux: `journalctl -u ollama`)
-
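To act on the first two bullets, the commands below check free disk space and the size of the local model store before retrying the pull; they assume models live under the default `~/.ollama` directory used elsewhere in this guide.

```bash
# Free space on the filesystem that holds your home directory
df -h ~

# Space already used by downloaded models (default Ollama data directory)
du -sh ~/.ollama

# Retry with a small model once space is confirmed
ollama pull llama3.2:3b
```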
- ### Connection Issues
-
- **"Cannot connect to Ollama"**
-
- - Check if Ollama is running: `curl http://localhost:11434/api/tags`
- - Start Ollama: `ollama serve`
- - Check firewall settings
- - See [Troubleshooting Guide](Troubleshooting.md#connection-issues)
-
- ### More Help
-
- For detailed troubleshooting, see the [Troubleshooting Guide](Troubleshooting.md).
-
- For other issues:
-
- - **GitHub Issues:** [github.com/tecet/ollm/issues](https://github.com/tecet/ollm/issues)
- - **Discussions:** [github.com/tecet/ollm/discussions](https://github.com/tecet/ollm/discussions)
- - **Documentation:** [Complete Documentation](README.md)
-
- ---
-
- **Last Updated:** January 26, 2026
- **Version:** 0.1.0
- **Author:** tecet