@tecet/ollm 0.1.3 → 0.1.4

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (46)
  1. package/README.md +529 -529
  2. package/dist/cli.js +291 -244
  3. package/dist/cli.js.map +3 -3
  4. package/dist/config/LLM_profiles.json +4 -4
  5. package/dist/config/settingsService.d.ts +5 -0
  6. package/dist/config/settingsService.d.ts.map +1 -1
  7. package/dist/config/settingsService.js +20 -0
  8. package/dist/config/settingsService.js.map +1 -1
  9. package/dist/config/toolsConfig.d.ts +6 -3
  10. package/dist/config/toolsConfig.d.ts.map +1 -1
  11. package/dist/config/toolsConfig.js +52 -3
  12. package/dist/config/toolsConfig.js.map +1 -1
  13. package/dist/features/chat/hooks/useChatNetwork.d.ts.map +1 -1
  14. package/dist/features/chat/hooks/useChatNetwork.js +15 -4
  15. package/dist/features/chat/hooks/useChatNetwork.js.map +1 -1
  16. package/dist/features/context/ContextManagerContext.d.ts.map +1 -1
  17. package/dist/features/context/ContextManagerContext.js +14 -73
  18. package/dist/features/context/ContextManagerContext.js.map +1 -1
  19. package/dist/features/context/hooks/useToolSupport.d.ts.map +1 -1
  20. package/dist/features/context/hooks/useToolSupport.js +13 -5
  21. package/dist/features/context/hooks/useToolSupport.js.map +1 -1
  22. package/dist/features/context/utils/systemPromptBuilder.d.ts.map +1 -1
  23. package/dist/features/context/utils/systemPromptBuilder.js +6 -34
  24. package/dist/features/context/utils/systemPromptBuilder.js.map +1 -1
  25. package/dist/nonInteractive.d.ts.map +1 -1
  26. package/dist/nonInteractive.js +1 -2
  27. package/dist/nonInteractive.js.map +1 -1
  28. package/dist/templates/assistant/tier3.txt +2 -0
  29. package/dist/templates/system/CoreMandates.txt +3 -0
  30. package/dist/templates/system/ToolDescriptions.txt +11 -3
  31. package/dist/templates/system/skills/SkillsAssistant.txt +1 -2
  32. package/dist/ui/App.js +15 -15
  33. package/dist/ui/components/launch/VersionBanner.js +1 -1
  34. package/dist/ui/components/layout/KeybindsLegend.d.ts.map +1 -1
  35. package/dist/ui/components/layout/KeybindsLegend.js +1 -1
  36. package/dist/ui/components/layout/KeybindsLegend.js.map +1 -1
  37. package/dist/ui/components/tabs/BugReportTab.js +1 -1
  38. package/dist/ui/components/tools/CategorySection.d.ts.map +1 -1
  39. package/dist/ui/components/tools/CategorySection.js +1 -0
  40. package/dist/ui/components/tools/CategorySection.js.map +1 -1
  41. package/dist/ui/contexts/__tests__/mcpTestUtils.d.ts +14 -14
  42. package/docs/DevelopmentRoadmap/OLLM-CLI_Releases.md +419 -419
  43. package/docs/DevelopmentRoadmap/RoadmapVisual.md +372 -372
  44. package/docs/LLM Models/LLM_ModelsList.md +1071 -1071
  45. package/docs/MCP/MCP_Architecture.md +1086 -1086
  46. package/package.json +82 -82
package/README.md CHANGED
![Welcome Screen](https://github.com/Tecet/OLLM-CLI/blob/task-1-simplify-tier-selection/welcome.png)

# OLLM CLI

> A local-first command-line interface for open-source LLMs with intelligent context management, tools, hooks, and MCP integration.

![License](https://img.shields.io/badge/License-Apache%202.0-blue.svg)
![Node Version](https://img.shields.io/badge/node-%3E%3D20.0.0-brightgreen.svg)
![TypeScript](https://img.shields.io/badge/TypeScript-5.9-blue.svg)
![npm](https://img.shields.io/badge/npm-@tecet%2Follm-red.svg)

OLLM CLI brings the power of open-source large language models to your terminal with a focus on local-first operation, intelligent resource management, and extensibility. Built with TypeScript and React, it provides a modern terminal UI while maintaining compatibility with automation workflows.

---

## ✨ Features

### 🎨 Interactive Terminal UI

- **React + Ink powered interface** with streaming responses and real-time updates
- **Syntax highlighting** for code blocks with language detection
- **Status bar** showing model, context usage, and VRAM metrics
- **Tool execution preview** with diff visualization for file changes

### 🧠 Smart Context Management

- **Fixed context sizing** based on available VRAM (determined at startup)
- **Automatic compression** when approaching context limits
- **Snapshot and rollover** support for long conversations
- **Real-time monitoring** of token usage and memory consumption

### 🛠️ Powerful Tool System

- **Built-in tools**: File operations, shell execution, web fetch, search, memory
- **Policy-based confirmation**: ASK, AUTO, and YOLO approval modes
- **Diff preview** for file edits before applying changes
- **Output truncation** and streaming for long-running operations

### 🔌 Extensibility

- **Hook system** for event-driven automation and safety gates
- **Extension system** with manifest-based configuration
- **MCP integration** (Model Context Protocol) for external tools
- **Provider-agnostic** architecture supporting multiple LLM backends

### 💾 Session Management

- **Record and resume** conversations with full context
- **Automatic compression** to manage context limits
- **Loop detection** to prevent runaway tool calls
- **Session history** with searchable archives

### 🌐 Offline First

- **Works without internet** when models are installed locally
- **No telemetry** - all data stays on your machine
- **Local model management** - pull, list, and remove models

---

## 🚀 Quick Start

### Installation

Install OLLM CLI globally using npm:

```bash
# Install from npm
npm install -g @tecet/ollm

# The interactive installer will guide you through:
# 1. Installing Ollama (if needed)
# 2. Downloading a starter model
# 3. Setting up configuration

# Start using OLLM CLI
ollm
```

**That's it!** The installer handles everything automatically.

For detailed installation instructions, see the **[Installation Guide](docs/Installation.md)**.

### First Steps

```bash
# Start interactive mode
ollm

# Or try a quick question
ollm -p "Explain async/await in JavaScript"

# Select a specific model
ollm --model llama3.1:8b

# Get help
ollm --help
```

### Learn More

- **[Introduction](docs/Introduction.md)** - What is OLLM CLI and why use it?
- **[Quick Start Guide](docs/Quickstart.md)** - Get up and running in 5 minutes
- **[Complete Documentation](docs/README.md)** - Full documentation (57 guides)

---

## 📖 Documentation

### Getting Started

- **[Introduction](docs/Introduction.md)** - What is OLLM CLI? (friendly, non-technical)
- **[Installation](docs/Installation.md)** - Complete installation guide
- **[Quick Start](docs/Quickstart.md)** - Get started in 5 minutes
- **[Troubleshooting](docs/Troubleshooting.md)** - Common issues and solutions

### Core Features

- **[User Interface & Settings](docs/UI&Settings/README.md)** - Interface, commands, themes, keybinds, and configuration
- **[Context Management](docs/Context/README.md)** - Context sizing, compression, checkpoints, and VRAM monitoring
- **[Model Management](docs/LLM%20Models/README.md)** - Models, providers, compatibility, and memory system
- **[Tools System](docs/Tools/README.md)** - Tool execution, architecture, and manifest reference
- **[Hooks System](docs/Hooks/README.md)** - Event-driven automation, protocol, and visual guide
- **[MCP Integration](docs/MCP/README.md)** - Model Context Protocol, marketplace, and commands
- **[Prompts System](docs/Prompts%20System/README.md)** - System prompts, templates, and routing

### Development

- **[Development Roadmap](docs/DevelopmentRoadmap/README.md)** - Future plans and version releases (v0.2.0-v0.9.0)
- **[Complete Index](docs/Index.md)** - All 57 documentation files organized by topic

**Total Documentation:** 57 comprehensive guides covering every aspect of OLLM CLI

---

## ⚙️ System Requirements

| Component   | Minimum                     | Recommended        |
| ----------- | --------------------------- | ------------------ |
| **Node.js** | 20.0.0                      | 22.x LTS           |
| **RAM**     | 8GB                         | 16GB+              |
| **VRAM**    | 4GB                         | 8GB+               |
| **Storage** | 10GB                        | 50GB+ (for models) |
| **OS**      | Windows 10, macOS 11, Linux | Latest versions    |

### Recommended Models by VRAM

| VRAM  | Recommended Models       | Context Sweet Spot |
| ----- | ------------------------ | ------------------ |
| 4GB   | Llama 3.2 3B, Qwen3 4B   | 4K-8K tokens       |
| 8GB   | Llama 3.1 8B, Mistral 7B | 8K-16K tokens      |
| 12GB  | Gemma 3 12B, Phi-4 14B   | 16K-32K tokens     |
| 24GB+ | Qwen3 32B, Mixtral 8x7B  | 32K-64K tokens     |

---

## 🎯 Key Concepts

### Context Management

OLLM CLI uses **fixed context sizing** determined at startup based on your available VRAM:

- **Minimal** (2K tokens) - Very small models or limited VRAM
- **Basic** (4K tokens) - Small models, basic conversations
- **Standard** (8K tokens) - Most common use case
- **Premium** (16K tokens) - Larger models, complex tasks
- **Ultra** (32K+ tokens) - High-end hardware, long conversations

Context is **fixed for the session** and automatically compressed when approaching limits.

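The tier list above can be pictured as a simple lookup from free VRAM to a token budget. This is an illustrative sketch only: the tier names and token counts come from the list above, while the VRAM thresholds and the `selectTier` function are assumptions, not OLLM CLI's actual startup sizing logic.

```typescript
// Illustrative sketch: map free VRAM (GB) to one of the context tiers above.
// Tier names/token budgets are from the README; thresholds are assumptions.
type Tier = { name: string; tokens: number };

const TIERS: Tier[] = [
  { name: "Minimal", tokens: 2_048 },
  { name: "Basic", tokens: 4_096 },
  { name: "Standard", tokens: 8_192 },
  { name: "Premium", tokens: 16_384 },
  { name: "Ultra", tokens: 32_768 },
];

function selectTier(freeVramGb: number): Tier {
  // Roughly one tier step per doubling of spare VRAM.
  if (freeVramGb < 4) return TIERS[0];
  if (freeVramGb < 6) return TIERS[1];
  if (freeVramGb < 10) return TIERS[2];
  if (freeVramGb < 18) return TIERS[3];
  return TIERS[4];
}

console.log(selectTier(8).name); // "Standard"
```

Because the tier is chosen once at startup, the budget stays stable for the whole session and compression (not resizing) handles overflow.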
### Tool System

The AI can use tools to help you:

- **File Tools** - Read, write, edit files
- **Web Tools** - Search the internet, fetch pages
- **Shell Tool** - Run commands
- **Memory Tools** - Remember facts across sessions
- **MCP Tools** - Connect to external services

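Conceptually, each of the tools above exposes a name, a parameter description the model can read, and an execute function. The interface and the `read_file` example below are hypothetical sketches for illustration, not OLLM CLI's actual tool API (see the Tools System docs for the real manifest reference).

```typescript
// Hypothetical sketch of a tool as the model sees it: name, described
// parameters, and an execute function. Not the actual OLLM CLI interface.
interface ToolDefinition {
  name: string;
  description: string;
  parameters: Record<string, { type: string; description: string }>;
  execute(args: Record<string, unknown>): Promise<string>;
}

const readFileTool: ToolDefinition = {
  name: "read_file",
  description: "Read a file from the workspace",
  parameters: {
    path: { type: "string", description: "File path to read" },
  },
  async execute(args) {
    // File tools ultimately wrap ordinary Node.js I/O.
    const { readFile } = await import("node:fs/promises");
    return readFile(String(args.path), "utf8");
  },
};
```

MCP tools fit the same shape: the server advertises the name and parameter schema, and execution is forwarded over the protocol instead of running locally.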
### Approval Modes

Control how tools are executed:

- **ASK** - Confirm each tool use (default, safest)
- **AUTO** - Auto-approve safe tools, ask for risky ones
- **YOLO** - Auto-approve everything (use with caution!)

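The three modes boil down to one question per tool call: does this call need a confirmation prompt? A minimal sketch, assuming a risky/safe split (the mode names are from the list above; which tools count as "risky" here is an assumption, not OLLM CLI's actual policy engine):

```typescript
// Illustrative sketch of the ASK / AUTO / YOLO decision.
type ApprovalMode = "ASK" | "AUTO" | "YOLO";

// Assumed classification: tools that can mutate state or run commands.
const RISKY_TOOLS = new Set(["shell", "write_file", "edit_file"]);

function needsConfirmation(mode: ApprovalMode, tool: string): boolean {
  if (mode === "ASK") return true; // confirm every tool call
  if (mode === "YOLO") return false; // auto-approve everything
  return RISKY_TOOLS.has(tool); // AUTO: only risky tools prompt
}

console.log(needsConfirmation("AUTO", "shell")); // true
console.log(needsConfirmation("AUTO", "read_file")); // false
```

ASK is the safe default; AUTO trades a little safety for flow; YOLO removes the gate entirely, which is why it carries the warning above.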
---

## 🔧 Common Commands

### CLI Flags

```bash
# Interactive mode
ollm

# One-shot prompt
ollm -p "Explain async/await"

# Select model
ollm --model llama3.1:8b

# List models
ollm --list-models

# JSON output
ollm -p "List 5 languages" --output json

# Debug mode
ollm --debug

# Show version
ollm --version
```

### Interactive Slash Commands

```bash
/model list        # List available models
/model use <name>  # Switch model
/context           # Show context status
/theme list        # List themes
/theme use <name>  # Switch theme
/session list      # List sessions
/help              # Show help
/exit              # Exit
```

See **[Commands Reference](docs/UI&Settings/Commands.md)** for the complete list.

---

## 🏗️ Project Structure

```
ollm-cli/
├── packages/
│   ├── cli/                  # CLI entry point and UI components
│   │   ├── src/
│   │   │   ├── cli.tsx       # Main entry
│   │   │   ├── commands/     # Slash commands
│   │   │   └── ui/           # React components
│   │   └── package.json
│   │
│   ├── core/                 # Core runtime and business logic
│   │   ├── src/
│   │   │   ├── context/      # Context management
│   │   │   ├── tools/        # Tool system
│   │   │   ├── hooks/        # Hook system
│   │   │   ├── services/     # Business logic
│   │   │   └── mcp/          # MCP integration
│   │   └── package.json
│   │
│   ├── ollm-bridge/          # Provider adapters
│   │   ├── src/
│   │   │   └── provider/     # Ollama, vLLM, OpenAI adapters
│   │   └── package.json
│   │
│   └── test-utils/           # Shared test utilities
│       └── package.json
│
├── docs/                     # Documentation
├── scripts/                  # Build and utility scripts
├── schemas/                  # JSON schemas
└── package.json              # Root workspace config
```

---

## 🛠️ Development

### For Contributors

Want to contribute to OLLM CLI? Here's how to get started:

```bash
# Clone the repository
git clone https://github.com/tecet/ollm.git
cd ollm

# Install dependencies
npm install

# Build all packages
npm run build

# Run tests
npm test

# Start development
npm start
```

### Development Commands

```bash
npm run build   # Build all packages
npm test        # Run tests
npm run lint    # Lint code
npm run format  # Format code
npm start       # Run CLI
```

### Project Structure

```
ollm-cli/
├── packages/
│   ├── cli/           # CLI entry and UI
│   ├── core/          # Core runtime
│   ├── ollm-bridge/   # Provider adapters
│   └── test-utils/    # Test utilities
├── docs/              # Documentation (57 files)
├── scripts/           # Build scripts
└── package.json       # Root workspace
```

See **[Project Structure](.kiro/steering/structure.md)** for detailed architecture.

---

## 🗺️ Roadmap

OLLM CLI is under active development with a clear roadmap for future features.

### ✅ Completed (v0.1.0 - Alpha)

- Interactive TUI with React + Ink
- Context management with VRAM monitoring
- Tool system with policy engine
- Session recording and compression
- Hook system for automation
- MCP integration
- Comprehensive documentation (57 guides)
- Testing infrastructure

### 🔮 Planned Features (v0.2.0 - v0.9.0)

**v0.2.0 - Enhanced Context Management**

- Advanced context pool management
- Multi-tier context strategies
- Improved VRAM optimization

**v0.3.0 - Advanced Compression**

- Multiple compression strategies
- Semantic compression
- Context checkpointing

**v0.4.0 - Reasoning Models**

- Extended reasoning support
- Reasoning capture and display
- Specialized reasoning modes

**v0.5.0 - Session Management**

- Enhanced session persistence
- Session templates
- Collaborative sessions

**v0.6.0 - Multi-Provider Support**

- OpenAI, Anthropic, Google AI
- Cost tracking and budgets
- Auto-escalation between providers

**v0.7.0 - Developer Productivity**

- Git integration
- @-mentions for context
- Diff review workflows

**v0.8.0 - Intelligence Layer**

- Semantic codebase search (RAG)
- Structured output
- Code execution sandbox
- Vision support

**v0.9.0 - Cross-Platform Polish**

- Platform-specific optimizations
- Enhanced Windows support
- Improved terminal compatibility

**v1.0.0+ - Beta and Beyond**

- Production-ready release
- Enterprise features
- Plugin marketplace

See **[Development Roadmap](docs/DevelopmentRoadmap/Roadmap.md)** for detailed specifications.

---

## 🧰 Tech Stack

### Runtime & Language

- **Node.js 20+** - JavaScript runtime
- **TypeScript 5.9** - Type-safe development
- **ES Modules** - Modern module system

### Build & Tooling

- **npm workspaces** - Monorepo management
- **esbuild** - Fast bundling
- **Vitest** - Testing framework
- **ESLint** - Code linting
- **Prettier** - Code formatting

### UI Framework

- **React 19** - UI library
- **Ink 6** - Terminal rendering

### Key Dependencies

- **yargs** - CLI argument parsing
- **yaml** - Configuration parsing
- **ajv** - JSON schema validation
- **fast-check** - Property-based testing

---

## 📄 License

This project is licensed under the Apache License 2.0 - see the [LICENSE](LICENSE) file for details.

```
Copyright 2026 OLLM CLI Contributors

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
```

---

## 🤝 Contributing

We welcome contributions! Here's how you can help:

### Ways to Contribute

1. **Report Bugs** - Open an issue with details and reproduction steps
2. **Suggest Features** - Share your ideas in the discussions
3. **Submit Pull Requests** - Fix bugs or implement features
4. **Improve Documentation** - Help make the docs clearer
5. **Write Tests** - Increase test coverage

### Contribution Guidelines

1. **Fork the repository** and create a feature branch
2. **Follow the code style** - Run `npm run lint` and `npm run format`
3. **Write tests** for new features
4. **Update documentation** as needed
5. **Submit a pull request** with a clear description

### Development Workflow

```bash
# 1. Fork and clone
git clone https://github.com/tecet/ollm.git
cd ollm

# 2. Create a feature branch
git checkout -b feature/my-feature

# 3. Make changes and test
npm install
npm run build
npm test

# 4. Commit with clear messages
git commit -m "feat: add new feature"

# 5. Push and create PR
git push origin feature/my-feature
```

### Code of Conduct

- Be respectful and inclusive
- Provide constructive feedback
- Focus on the code, not the person
- Help create a welcoming environment

---

## 🙏 Acknowledgments

OLLM CLI is built on the shoulders of giants:

- **[Ollama](https://ollama.com/)** - Local LLM runtime
- **[React](https://react.dev/)** & **[Ink](https://github.com/vadimdemedes/ink)** - Terminal UI
- **[Vitest](https://vitest.dev/)** - Testing framework
- **[fast-check](https://fast-check.dev/)** - Property-based testing
- All our **[contributors](https://github.com/tecet/ollm/graphs/contributors)**

---

## 📞 Support

- **Documentation**: [docs/README.md](docs/README.md)
- **Issues**: [GitHub Issues](https://github.com/tecet/ollm/issues)
- **Discussions**: [GitHub Discussions](https://github.com/tecet/ollm/discussions)

---

<div align="center">

**[⬆ Back to Top](#ollm-cli)**

Made with ❤️ by **[tecet](https://github.com/tecet)**

</div>
1
+ ![Welcome Screen](https://github.com/Tecet/OLLM-CLI/blob/task-1-simplify-tier-selection/welcome.png)
2
+
3
+ # OLLM CLI
4
+
5
+ > A local-first command-line interface for open-source LLMs with intelligent context management, tools, hooks, and MCP integration.
6
+
7
+ ![License](https://img.shields.io/badge/License-Apache%202.0-blue.svg)
8
+ ![Node Version](https://img.shields.io/badge/node-%3E%3D20.0.0-brightgreen.svg)
9
+ ![TypeScript](https://img.shields.io/badge/TypeScript-5.9-blue.svg)
10
+ ![npm](https://img.shields.io/badge/npm-@tecet%2Follm-red.svg)
11
+
12
+ OLLM CLI brings the power of open-source large language models to your terminal with a focus on local-first operation, intelligent resource management, and extensibility. Built with TypeScript and React, it provides a modern terminal UI while maintaining compatibility with automation workflows.
13
+
14
+ ---
15
+
16
+ ## ✨ Features
17
+
18
+ ### 🎨 Interactive Terminal UI
19
+
20
+ - **React + Ink powered interface** with streaming responses and real-time updates
21
+ - **Syntax highlighting** for code blocks with language detection
22
+ - **Status bar** showing model, context usage, and VRAM metrics
23
+ - **Tool execution preview** with diff visualization for file changes
24
+
25
+ ### 🧠 Smart Context Management
26
+
27
+ - **Fixed context sizing** based on available VRAM (determined at startup)
28
+ - **Automatic compression** when approaching context limits
29
+ - **Snapshot and rollover** support for long conversations
30
+ - **Real-time monitoring** of token usage and memory consumption
31
+
32
+ ### 🛠️ Powerful Tool System
33
+
34
+ - **Built-in tools**: File operations, shell execution, web fetch, search, memory
35
+ - **Policy-based confirmation**: ASK, AUTO, and YOLO approval modes
36
+ - **Diff preview** for file edits before applying changes
37
+ - **Output truncation** and streaming for long-running operations
38
+
39
+ ### 🔌 Extensibility
40
+
41
+ - **Hook system** for event-driven automation and safety gates
42
+ - **Extension system** with manifest-based configuration
43
+ - **MCP integration** (Model Context Protocol) for external tools
44
+ - **Provider-agnostic** architecture supporting multiple LLM backends
45
+
46
+ ### 💾 Session Management
47
+
48
+ - **Record and resume** conversations with full context
49
+ - **Automatic compression** to manage context limits
50
+ - **Loop detection** to prevent runaway tool calls
51
+ - **Session history** with searchable archives
52
+
53
+ ### 🌐 Offline First
54
+
55
+ - **Works without internet** when models are installed locally
56
+ - **No telemetry** - all data stays on your machine
57
+ - **Local model management** - pull, list, and remove models
58
+
59
+ ---
60
+
61
+ ## 🚀 Quick Start
62
+
63
+ ### Installation
64
+
65
+ Install OLLM CLI globally using npm:
66
+
67
+ ```bash
68
+ # Install from npm
69
+ npm install -g @tecet/ollm
70
+
71
+ # The interactive installer will guide you through:
72
+ # 1. Installing Ollama (if needed)
73
+ # 2. Downloading a starter model
74
+ # 3. Setting up configuration
75
+
76
+ # Start using OLLM CLI
77
+ ollm
78
+ ```
79
+
80
+ **That's it!** The installer handles everything automatically.
81
+
82
+ For detailed installation instructions, see the **[Installation Guide](docs/Installation.md)**.
83
+
84
+ ### First Steps
85
+
86
+ ```bash
87
+ # Start interactive mode
88
+ ollm
89
+
90
+ # Or try a quick question
91
+ ollm -p "Explain async/await in JavaScript"
92
+
93
+ # Select a specific model
94
+ ollm --model llama3.1:8b
95
+
96
+ # Get help
97
+ ollm --help
98
+ ```
99
+
100
+ ### Learn More
101
+
102
+ - **[Introduction](docs/Introduction.md)** - What is OLLM CLI and why use it?
103
+ - **[Quick Start Guide](docs/Quickstart.md)** - Get up and running in 5 minutes
104
+ - **[Complete Documentation](docs/README.md)** - Full documentation (57 guides)
105
+
106
+ ---
107
+
108
+ ## 📖 Documentation
109
+
110
+ ### Getting Started
111
+
112
+ - **[Introduction](docs/Introduction.md)** - What is OLLM CLI? (friendly, non-technical)
113
+ - **[Installation](docs/Installation.md)** - Complete installation guide
114
+ - **[Quick Start](docs/Quickstart.md)** - Get started in 5 minutes
115
+ - **[Troubleshooting](docs/Troubleshooting.md)** - Common issues and solutions
116
+
117
+ ### Core Features
118
+
119
+ - **[User Interface & Settings](docs/UI&Settings/README.md)** - Interface, commands, themes, keybinds, and configuration
120
+ - **[Context Management](docs/Context/README.md)** - Context sizing, compression, checkpoints, and VRAM monitoring
121
+ - **[Model Management](docs/LLM%20Models/README.md)** - Models, providers, compatibility, and memory system
122
+ - **[Tools System](docs/Tools/README.md)** - Tool execution, architecture, and manifest reference
123
+ - **[Hooks System](docs/Hooks/README.md)** - Event-driven automation, protocol, and visual guide
124
+ - **[MCP Integration](docs/MCP/README.md)** - Model Context Protocol, marketplace, and commands
125
+ - **[Prompts System](docs/Prompts%20System/README.md)** - System prompts, templates, and routing
126
+
127
+ ### Development
128
+
129
+ - **[Development Roadmap](docs/DevelopmentRoadmap/README.md)** - Future plans and version releases (v0.2.0-v0.9.0)
130
+ - **[Complete Index](docs/Index.md)** - All 57 documentation files organized by topic
131
+
132
+ **Total Documentation:** 57 comprehensive guides covering every aspect of OLLM CLI
133
+
134
+ ---
135
+
136
+ ## ⚙️ System Requirements
137
+
138
+ | Component | Minimum | Recommended |
139
+ | ----------- | --------------------------- | ------------------ |
140
+ | **Node.js** | 20.0.0 | 22.x LTS |
141
+ | **RAM** | 8GB | 16GB+ |
142
+ | **VRAM** | 4GB | 8GB+ |
143
+ | **Storage** | 10GB | 50GB+ (for models) |
144
+ | **OS** | Windows 10, macOS 11, Linux | Latest versions |
145
+
146
+ ### Recommended Models by VRAM
147
+
148
+ | VRAM | Recommended Models | Context Sweet Spot |
149
+ | ----- | ------------------------ | ------------------ |
150
+ | 4GB | Llama 3.2 3B, Qwen3 4B | 4K-8K tokens |
151
+ | 8GB | Llama 3.1 8B, Mistral 7B | 8K-16K tokens |
152
+ | 12GB | Gemma 3 12B, Phi-4 14B | 16K-32K tokens |
153
+ | 24GB+ | Qwen3 32B, Mixtral 8x7B | 32K-64K tokens |
154
+
155
+ ---
156
+
157
+ ## 🎯 Key Concepts
158
+
159
+ ### Context Management
160
+
161
+ OLLM CLI uses **fixed context sizing** determined at startup based on your available VRAM:
162
+
163
+ - **Minimal** (2K tokens) - Very small models or limited VRAM
164
+ - **Basic** (4K tokens) - Small models, basic conversations
165
+ - **Standard** (8K tokens) - Most common use case
166
+ - **Premium** (16K tokens) - Larger models, complex tasks
167
+ - **Ultra** (32K+ tokens) - High-end hardware, long conversations
168
+
169
+ Context is **fixed for the session** and automatically compressed when approaching limits.
170
+
171
+ ### Tool System
172
+
173
+ The AI can use tools to help you:
174
+
175
+ - **File Tools** - Read, write, edit files
176
+ - **Web Tools** - Search internet, fetch pages
177
+ - **Shell Tool** - Run commands
178
+ - **Memory Tools** - Remember facts across sessions
179
+ - **MCP Tools** - Connect to external services
180
+
181
+ ### Approval Modes
182
+
183
+ Control how tools are executed:
184
+
185
+ - **ASK** - Confirm each tool use (default, safest)
186
+ - **AUTO** - Auto-approve safe tools, ask for risky ones
187
+ - **YOLO** - Auto-approve everything (use with caution!)
188
+
189
+ ---
190
+
191
+ ## 🔧 Common Commands
192
+
193
+ ### CLI Flags
194
+
195
+ ```bash
196
+ # Interactive mode
197
+ ollm
198
+
199
+ # One-shot prompt
200
+ ollm -p "Explain async/await"
201
+
202
+ # Select model
203
+ ollm --model llama3.1:8b
204
+
205
+ # List models
206
+ ollm --list-models
207
+
208
+ # JSON output
209
+ ollm -p "List 5 languages" --output json
210
+
211
+ # Debug mode
212
+ ollm --debug
213
+
214
+ # Show version
215
+ ollm --version
216
+ ```
217
+
218
+ ### Interactive Slash Commands
219
+
220
+ ```bash
221
+ /model list # List available models
222
+ /model use <name> # Switch model
223
+ /context # Show context status
224
+ /theme list # List themes
225
+ /theme use <name> # Switch theme
226
+ /session list # List sessions
227
+ /help # Show help
228
+ /exit # Exit
229
+ ```
230
+
231
+ See **[Commands Reference](docs/UI&Settings/Commands.md)** for complete list.
232
+
233
+ ---
234
+
235
+ ## 🏗️ Project Structure
236
+
237
+ ```
238
+ ollm-cli/
239
+ ├── packages/
240
+ │ ├── cli/ # CLI entry point and UI components
241
+ │ │ ├── src/
242
+ │ │ │ ├── cli.tsx # Main entry
243
+ │ │ │ ├── commands/ # Slash commands
244
+ │ │ │ └── ui/ # React components
245
+ │ │ └── package.json
246
+ │ │
247
+ │ ├── core/ # Core runtime and business logic
248
+ │ │ ├── src/
249
+ │ │ │ ├── context/ # Context management
250
+ │ │ │ ├── tools/ # Tool system
251
+ │ │ │ ├── hooks/ # Hook system
252
+ │ │ │ ├── services/ # Business logic
253
+ │ │ │ └── mcp/ # MCP integration
254
+ │ │ └── package.json
255
+ │ │
256
+ │ ├── ollm-bridge/ # Provider adapters
257
+ │ │ ├── src/
258
+ │ │ │ └── provider/ # Ollama, vLLM, OpenAI adapters
259
+ │ │ └── package.json
260
+ │ │
261
+ │ └── test-utils/ # Shared test utilities
262
+ │ └── package.json
263
+
264
+ ├── docs/ # Documentation
265
+ ├── scripts/ # Build and utility scripts
266
+ ├── schemas/ # JSON schemas
267
+ └── package.json # Root workspace config
268
+ ```
269
+
270
+ ---
271
+
272
+ ## 🛠️ Development
273
+
274
+ ### For Contributors
275
+
276
+ Want to contribute to OLLM CLI? Here's how to get started:
277
+
278
+ ```bash
279
+ # Clone the repository
280
+ git clone https://github.com/tecet/ollm.git
281
+ cd ollm
282
+
283
+ # Install dependencies
284
+ npm install
285
+
286
+ # Build all packages
287
+ npm run build
288
+
289
+ # Run tests
290
+ npm test
291
+
292
+ # Start development
293
+ npm start
294
+ ```
295
+
296
+ ### Development Commands
297
+
298
+ ```bash
299
+ npm run build # Build all packages
300
+ npm test # Run tests
301
+ npm run lint # Lint code
302
+ npm run format # Format code
303
+ npm start # Run CLI
304
+ ```
305
+
306
+ ### Project Structure
307
+
308
+ ```
309
+ ollm-cli/
310
+ ├── packages/
311
+ │ ├── cli/ # CLI entry and UI
312
+ │ ├── core/ # Core runtime
313
+ │ ├── ollm-bridge/ # Provider adapters
314
+ │ └── test-utils/ # Test utilities
315
+ ├── docs/ # Documentation (57 files)
316
+ ├── scripts/ # Build scripts
317
+ └── package.json # Root workspace
318
+ ```
319
+
320
+ See **[Project Structure](.kiro/steering/structure.md)** for detailed architecture.
321
+
322
+ ---
323
+
324
+ ## 🗺️ Roadmap
325
+
326
+ OLLM CLI is under active development with a clear roadmap for future features.
327
+
328
+ ### ✅ Completed (v0.1.0 - Alpha)
329
+
330
+ - Interactive TUI with React + Ink
331
+ - Context management with VRAM monitoring
332
+ - Tool system with policy engine
333
+ - Session recording and compression
334
+ - Hook system for automation
335
+ - MCP integration
336
+ - Comprehensive documentation (57 guides)
337
+ - Testing infrastructure
338
+
+ ### 🔮 Planned Features (v0.2.0 and beyond)
+
+ **v0.2.0 - Enhanced Context Management**
+
+ - Advanced context pool management
+ - Multi-tier context strategies
+ - Improved VRAM optimization
+
+ **v0.3.0 - Advanced Compression**
+
+ - Multiple compression strategies
+ - Semantic compression
+ - Context checkpointing
+
+ **v0.4.0 - Reasoning Models**
+
+ - Extended reasoning support
+ - Reasoning capture and display
+ - Specialized reasoning modes
+
+ **v0.5.0 - Session Management**
+
+ - Enhanced session persistence
+ - Session templates
+ - Collaborative sessions
+
+ **v0.6.0 - Multi-Provider Support**
+
+ - OpenAI, Anthropic, Google AI
+ - Cost tracking and budgets
+ - Auto-escalation between providers
+
+ **v0.7.0 - Developer Productivity**
+
+ - Git integration
+ - @-mentions for context
+ - Diff review workflows
+
+ **v0.8.0 - Intelligence Layer**
+
+ - Semantic codebase search (RAG)
+ - Structured output
+ - Code execution sandbox
+ - Vision support
+
+ **v0.9.0 - Cross-Platform Polish**
+
+ - Platform-specific optimizations
+ - Enhanced Windows support
+ - Improved terminal compatibility
+
+ **v1.0.0+ - Beta and Beyond**
+
+ - Production-ready release
+ - Enterprise features
+ - Plugin marketplace
+
+ See **[Development Roadmap](docs/DevelopmentRoadmap/Roadmap.md)** for detailed specifications.
+
+ ---
+
+ ## 🧰 Tech Stack
+
+ ### Runtime & Language
+
+ - **Node.js 20+** - JavaScript runtime
+ - **TypeScript 5.9** - Type-safe development
+ - **ES Modules** - Modern module system
+
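+ The two runtime requirements above go together in an npm manifest. As a minimal illustration (a hypothetical fragment, not this project's actual `package.json`), an ES-Modules package targeting Node.js 20+ typically declares both like this:
+
+ ```json
+ {
+   "type": "module",
+   "engines": {
+     "node": ">=20"
+   }
+ }
+ ```
+
+ With `"type": "module"`, `.js` files are loaded as ES modules (`import`/`export` syntax), and the `engines` field lets npm warn when an older Node.js version is used.
+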
+ ### Build & Tooling
+
+ - **npm workspaces** - Monorepo management
+ - **esbuild** - Fast bundling
+ - **Vitest** - Testing framework
+ - **ESLint** - Code linting
+ - **Prettier** - Code formatting
+
+ ### UI Framework
+
+ - **React 19** - UI library
+ - **Ink 6** - Terminal rendering
+
+ ### Key Dependencies
+
+ - **yargs** - CLI argument parsing
+ - **yaml** - Configuration parsing
+ - **ajv** - JSON schema validation
+ - **fast-check** - Property-based testing
+
+ ---
+
+ ## 📄 License
+
+ This project is licensed under the Apache License 2.0 - see the [LICENSE](LICENSE) file for details.
+
+ ```
+ Copyright 2026 OLLM CLI Contributors
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+     http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+ ```
+
+ ---
+
+ ## 🤝 Contributing
+
+ We welcome contributions! Here's how you can help:
+
+ ### Ways to Contribute
+
+ 1. **Report Bugs** - Open an issue with details and reproduction steps
+ 2. **Suggest Features** - Share your ideas in the discussions
+ 3. **Submit Pull Requests** - Fix bugs or implement features
+ 4. **Improve Documentation** - Help make the docs clearer
+ 5. **Write Tests** - Increase test coverage
+
+ ### Contribution Guidelines
+
+ 1. **Fork the repository** and create a feature branch
+ 2. **Follow the code style** - Run `npm run lint` and `npm run format`
+ 3. **Write tests** for new features
+ 4. **Update documentation** as needed
+ 5. **Submit a pull request** with a clear description
+
+ ### Development Workflow
+
+ ```bash
+ # 1. Fork and clone
+ git clone https://github.com/tecet/ollm.git
+ cd ollm
+
+ # 2. Create a feature branch
+ git checkout -b feature/my-feature
+
+ # 3. Make changes, then lint, format, and test
+ npm install
+ npm run build
+ npm run lint
+ npm run format
+ npm test
+
+ # 4. Commit with clear messages
+ git commit -m "feat: add new feature"
+
+ # 5. Push and create PR
+ git push origin feature/my-feature
+ ```
+
+ ### Code of Conduct
+
+ - Be respectful and inclusive
+ - Provide constructive feedback
+ - Focus on the code, not the person
+ - Help create a welcoming environment
+
+ ---
+
+ ## 🙏 Acknowledgments
+
+ OLLM CLI is built on the shoulders of giants:
+
+ - **[Ollama](https://ollama.com/)** - Local LLM runtime
+ - **[React](https://react.dev/)** & **[Ink](https://github.com/vadimdemedes/ink)** - Terminal UI
+ - **[Vitest](https://vitest.dev/)** - Testing framework
+ - **[fast-check](https://fast-check.dev/)** - Property-based testing
+ - All our **[contributors](https://github.com/tecet/ollm/graphs/contributors)**
+
+ ---
+
+ ## 📞 Support
+
+ - **Documentation**: [docs/README.md](docs/README.md)
+ - **Issues**: [GitHub Issues](https://github.com/tecet/ollm/issues)
+ - **Discussions**: [GitHub Discussions](https://github.com/tecet/ollm/discussions)
+
+ ---
+
+ <div align="center">
+
+ **[⬆ Back to Top](#ollm-cli)**
+
+ Made with ❤️ by **[tecet](https://github.com/tecet)**
+
+ </div>