@tecet/ollm 0.1.4 → 0.1.5

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (86)
  1. package/dist/cli.js +20 -14
  2. package/dist/cli.js.map +3 -3
  3. package/dist/services/documentService.d.ts.map +1 -1
  4. package/dist/services/documentService.js +12 -2
  5. package/dist/services/documentService.js.map +1 -1
  6. package/dist/ui/components/docs/DocsPanel.d.ts.map +1 -1
  7. package/dist/ui/components/docs/DocsPanel.js +1 -1
  8. package/dist/ui/components/docs/DocsPanel.js.map +1 -1
  9. package/dist/ui/components/launch/VersionBanner.js +1 -1
  10. package/dist/ui/components/launch/VersionBanner.js.map +1 -1
  11. package/dist/ui/components/layout/KeybindsLegend.d.ts.map +1 -1
  12. package/dist/ui/components/layout/KeybindsLegend.js +1 -1
  13. package/dist/ui/components/layout/KeybindsLegend.js.map +1 -1
  14. package/dist/ui/components/tabs/BugReportTab.js +1 -1
  15. package/dist/ui/components/tabs/BugReportTab.js.map +1 -1
  16. package/dist/ui/services/docsService.d.ts +12 -27
  17. package/dist/ui/services/docsService.d.ts.map +1 -1
  18. package/dist/ui/services/docsService.js +40 -67
  19. package/dist/ui/services/docsService.js.map +1 -1
  20. package/docs/README.md +3 -410
  21. package/package.json +10 -7
  22. package/scripts/copy-docs-to-user.cjs +34 -0
  23. package/docs/Context/CheckpointFlowDiagram.md +0 -673
  24. package/docs/Context/ContextArchitecture.md +0 -898
  25. package/docs/Context/ContextCompression.md +0 -1102
  26. package/docs/Context/ContextManagment.md +0 -750
  27. package/docs/Context/Index.md +0 -209
  28. package/docs/Context/README.md +0 -390
  29. package/docs/DevelopmentRoadmap/Index.md +0 -238
  30. package/docs/DevelopmentRoadmap/OLLM-CLI_Releases.md +0 -419
  31. package/docs/DevelopmentRoadmap/PlanedFeatures.md +0 -448
  32. package/docs/DevelopmentRoadmap/README.md +0 -174
  33. package/docs/DevelopmentRoadmap/Roadmap.md +0 -572
  34. package/docs/DevelopmentRoadmap/RoadmapVisual.md +0 -372
  35. package/docs/Hooks/Architecture.md +0 -885
  36. package/docs/Hooks/Index.md +0 -244
  37. package/docs/Hooks/KeyboardShortcuts.md +0 -248
  38. package/docs/Hooks/Protocol.md +0 -817
  39. package/docs/Hooks/README.md +0 -403
  40. package/docs/Hooks/UserGuide.md +0 -1483
  41. package/docs/Hooks/VisualGuide.md +0 -598
  42. package/docs/Index.md +0 -506
  43. package/docs/Installation.md +0 -586
  44. package/docs/Introduction.md +0 -367
  45. package/docs/LLM Models/Index.md +0 -239
  46. package/docs/LLM Models/LLM_GettingStarted.md +0 -748
  47. package/docs/LLM Models/LLM_Index.md +0 -701
  48. package/docs/LLM Models/LLM_MemorySystem.md +0 -337
  49. package/docs/LLM Models/LLM_ModelCompatibility.md +0 -499
  50. package/docs/LLM Models/LLM_ModelsArchitecture.md +0 -933
  51. package/docs/LLM Models/LLM_ModelsCommands.md +0 -839
  52. package/docs/LLM Models/LLM_ModelsConfiguration.md +0 -1094
  53. package/docs/LLM Models/LLM_ModelsList.md +0 -1071
  54. package/docs/LLM Models/LLM_ModelsList.md.backup +0 -400
  55. package/docs/LLM Models/README.md +0 -355
  56. package/docs/MCP/MCP_Architecture.md +0 -1086
  57. package/docs/MCP/MCP_Commands.md +0 -1111
  58. package/docs/MCP/MCP_GettingStarted.md +0 -590
  59. package/docs/MCP/MCP_Index.md +0 -524
  60. package/docs/MCP/MCP_Integration.md +0 -866
  61. package/docs/MCP/MCP_Marketplace.md +0 -160
  62. package/docs/MCP/README.md +0 -415
  63. package/docs/Prompts System/Architecture.md +0 -760
  64. package/docs/Prompts System/Index.md +0 -223
  65. package/docs/Prompts System/PromptsRouting.md +0 -1047
  66. package/docs/Prompts System/PromptsTemplates.md +0 -1102
  67. package/docs/Prompts System/README.md +0 -389
  68. package/docs/Prompts System/SystemPrompts.md +0 -856
  69. package/docs/Quickstart.md +0 -535
  70. package/docs/Tools/Architecture.md +0 -884
  71. package/docs/Tools/GettingStarted.md +0 -624
  72. package/docs/Tools/Index.md +0 -216
  73. package/docs/Tools/ManifestReference.md +0 -141
  74. package/docs/Tools/README.md +0 -440
  75. package/docs/Tools/UserGuide.md +0 -773
  76. package/docs/Troubleshooting.md +0 -1265
  77. package/docs/UI&Settings/Architecture.md +0 -729
  78. package/docs/UI&Settings/ColorASCII.md +0 -34
  79. package/docs/UI&Settings/Commands.md +0 -755
  80. package/docs/UI&Settings/Configuration.md +0 -872
  81. package/docs/UI&Settings/Index.md +0 -293
  82. package/docs/UI&Settings/Keybinds.md +0 -372
  83. package/docs/UI&Settings/README.md +0 -278
  84. package/docs/UI&Settings/Terminal.md +0 -637
  85. package/docs/UI&Settings/Themes.md +0 -604
  86. package/docs/UI&Settings/UIGuide.md +0 -550
package/docs/LLM Models/LLM_GettingStarted.md
@@ -1,748 +0,0 @@
- # Getting Started with Model Management
-
- **Quick Start Guide**
-
- This guide will help you get started with model management, routing, memory, templates, and project profiles in OLLM CLI.
-
- ---
-
- ## 📋 Table of Contents
-
- 1. [Introduction](#introduction)
- 2. [Prerequisites](#prerequisites)
- 3. [Quick Start](#quick-start)
- 4. [Basic Model Management](#basic-model-management)
- 5. [Using Model Routing](#using-model-routing)
- 6. [Working with Memory](#working-with-memory)
- 7. [Using Templates](#using-templates)
- 8. [Project Profiles](#project-profiles)
- 9. [Next Steps](#next-steps)
-
- **See Also:**
-
- - [Model Management Overview](README.md)
- - [Model Commands](LLM_ModelsCommands.md)
- - [Configuration Guide](LLM_ModelsConfiguration.md)
-
- ---
-
- ## Introduction
-
- OLLM CLI provides comprehensive model management capabilities that help you:
-
- - **Manage models**: List, download, delete, and inspect models
- - **Route intelligently**: Automatically select appropriate models for tasks
- - **Remember context**: Store facts and preferences across sessions
- - **Use templates**: Create reusable prompts with variables
- - **Configure projects**: Auto-detect and apply project-specific settings
-
- This guide covers the basics to get you started quickly.
-
- ---
-
- ## Prerequisites
-
- **Required:**
-
- - OLLM CLI installed and configured
- - Ollama (or compatible provider) running
- - At least one model installed
-
- **Optional:**
-
- - Project workspace for profile detection
- - Custom templates directory
-
- **Check your setup:**
-
- ```bash
- # Verify OLLM CLI is installed
- ollm --version
-
- # Check provider connection
- /model list
- ```
-
- ---
-
- ## Quick Start
-
- ### 1. List Available Models
-
- See what models you have installed:
-
- ```bash
- /model list
- ```
-
- **Output:**
-
- ```
- Available Models:
- ● llama3.1:8b (loaded)  4.7 GB  Modified 2 days ago
-   mistral:7b            4.1 GB  Modified 1 week ago
-   codellama:7b          3.8 GB  Modified 3 days ago
- ```
-
- ### 2. Download a Model
-
- Pull a new model from the registry:
-
- ```bash
- /model pull llama3.1:8b
- ```
-
- **Progress display:**
-
- ```
- Pulling llama3.1:8b...
- [████████████████████████] 100% 4.7 GB @ 12.3 MB/s
- Model llama3.1:8b ready.
- ```
-
- ### 3. View Model Details
-
- Get detailed information about a model:
-
- ```bash
- /model info llama3.1:8b
- ```
-
- **Output:**
-
- ```
- Model: llama3.1:8b
- Size: 4.7 GB
- Parameters: 8B
- Context Length: 128,000 tokens
- Quantization: Q4_K_M
- Family: llama
- Capabilities:
-   ✓ Tool calling
-   ✓ Streaming
-   ✗ Vision
- ```
-
- ### 4. Use a Model
-
- Switch to a specific model:
-
- ```bash
- /model use llama3.1:8b
- ```
-
- ---
-
- ## Basic Model Management
-
- ### Listing Models
-
- View all available models with details:
-
- ```bash
- /model list
- ```
-
- **Shows:**
-
- - Model name and version
- - Size on disk
- - Last modified date
- - Load status (loaded/unloaded)
-
- ### Pulling Models
-
- Download models from the provider registry:
-
- ```bash
- # Pull a specific model
- /model pull llama3.1:8b
-
- # Pull with tag
- /model pull mistral:7b-instruct
- ```
-
- **Features:**
-
- - Real-time progress display
- - Transfer rate monitoring
- - Cancellable with Ctrl+C
-
- ### Deleting Models
-
- Remove models to free disk space:
-
- ```bash
- /model delete codellama:7b
- ```
-
- **Safety features:**
-
- - Confirmation prompt
- - Shows space to be freed
- - Automatic unload if currently loaded
-
- ### Keeping Models Loaded
-
- Keep frequently-used models in memory for faster responses:
-
- ```bash
- # Keep a model loaded
- /model keep llama3.1:8b
-
- # Unload a model
- /model unload llama3.1:8b
- ```
-
- **Benefits:**
-
- - Eliminates model load time (2-5 seconds)
- - Reduces latency for subsequent requests
- - Useful for interactive sessions
-
- ### Unknown Model Handling
-
- When you switch to a model that isn't in the system's database, OLLM CLI will prompt you to configure its tool support:
-
- ```bash
- /model use custom-model:latest
- ```
-
- **Prompt:**
-
- ```
- Unknown model detected: custom-model:latest
- Does this model support function calling (tools)?
-   [y] Yes, it supports tools
-   [n] No, it doesn't support tools
-   [a] Auto-detect (test with a sample request)
- ```
-
- **Options:**
-
- 1. **Yes (y)**: Manually confirm the model supports tools
-    - Tools will be enabled for this model
-    - Choice is saved to `~/.ollm/user_models.json`
-
- 2. **No (n)**: Manually confirm the model doesn't support tools
-    - Tools will be disabled for this model
-    - Choice is saved to `~/.ollm/user_models.json`
-
- 3. **Auto-detect (a)**: Let the system test the model
-    - Sends a test request with a minimal tool schema
-    - Detects whether the model accepts or rejects tools
-    - Takes ~5 seconds with timeout
-    - Result is saved automatically
-
- **Timeout:** If you don't respond within 30 seconds, the system defaults to tools disabled (safe default).
-
- **Why this matters:**
-
- - Sending tools to models that don't support them causes errors
- - Proper configuration ensures smooth operation
- - Your choice is remembered for future sessions
-
- ### Auto-Detect Details
-
- When you choose auto-detect, here's what happens:
-
- **Process:**
-
- 1. System sends a test request to the model with a minimal tool schema
- 2. Monitors the response for tool-related errors
- 3. Times out after 5 seconds if no response
- 4. Saves the result to `~/.ollm/user_models.json`
-
- **Success indicators:**
-
- - Model accepts the request without tool-related errors
- - Tool support is enabled and saved
-
- **Failure indicators:**
-
- - Model returns errors like "unknown field: tools"
- - Model returns 400 status with tool-related error messages
- - Tool support is disabled and saved
-
- **Fallback:**
-
- - If auto-detect fails or times out, tools are disabled (safe default)
- - You can manually update the setting later
-
- **System messages:**
-
- ```
- Auto-detecting tool support for custom-model:latest...
- Tool support detected: Enabled
- ```
-
- or
-
- ```
- Auto-detecting tool support for custom-model:latest...
- Tool support detected: Disabled
- ```
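
The guide does not show the probe itself; the sketch below illustrates how such a check could work against an Ollama-style `/api/chat` endpoint, using the ~5-second timeout and safe-default fallback described above. The `detectToolSupport` and `saveToolSupport` helpers and the `user_models.json` shape are assumptions for illustration, not OLLM CLI internals.

```typescript
// Hypothetical sketch: probe a model for tool support, then persist the answer.
// The endpoint shape follows Ollama's /api/chat; the file format is assumed.
import { readFile, writeFile } from 'node:fs/promises';
import { homedir } from 'node:os';
import { join } from 'node:path';

async function detectToolSupport(model: string, baseUrl = 'http://localhost:11434'): Promise<boolean> {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), 5_000); // ~5 s timeout, as described above
  try {
    const res = await fetch(`${baseUrl}/api/chat`, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      signal: controller.signal,
      body: JSON.stringify({
        model,
        stream: false,
        messages: [{ role: 'user', content: 'ping' }],
        // Minimal tool schema used purely as a capability probe.
        tools: [{
          type: 'function',
          function: { name: 'noop', description: 'capability probe', parameters: { type: 'object', properties: {} } },
        }],
      }),
    });
    if (!res.ok) {
      // e.g. 400 with "unknown field: tools": the model rejected the tool schema.
      // Any failure falls back to the safe default (tools disabled).
      return false;
    }
    return true; // request accepted with tools attached
  } catch {
    return false; // timeout or network error: safe default
  } finally {
    clearTimeout(timer);
  }
}

// Persist the result; the user_models.json structure here is an assumption.
async function saveToolSupport(model: string, supportsTools: boolean): Promise<void> {
  const file = join(homedir(), '.ollm', 'user_models.json');
  let data: Record<string, { supportsTools: boolean }> = {};
  try { data = JSON.parse(await readFile(file, 'utf8')); } catch { /* first run: no file yet */ }
  data[model] = { supportsTools };
  await writeFile(file, JSON.stringify(data, null, 2));
}
```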
-
- ---
-
- ## Using Model Routing
-
- ### What is Model Routing?
-
- Model routing automatically selects appropriate models based on task type, eliminating the need for manual model selection.
-
- ### Enable Routing
-
- Add to your configuration (`~/.ollm/config.yaml`):
-
- ```yaml
- model:
-   routing:
-     enabled: true
-     defaultProfile: general
- ```
-
- ### Routing Profiles
-
- Four built-in profiles optimize for different use cases:
-
- **Fast Profile** - Quick responses with smaller models:
-
- ```yaml
- routing:
-   defaultProfile: fast
- ```
-
- **General Profile** - Balanced performance for most tasks:
-
- ```yaml
- routing:
-   defaultProfile: general
- ```
-
- **Code Profile** - Optimized for code generation:
-
- ```yaml
- routing:
-   defaultProfile: code
- ```
-
- **Creative Profile** - Creative writing and storytelling:
-
- ```yaml
- routing:
-   defaultProfile: creative
- ```
-
- ### Profile Overrides
-
- Specify models for specific profiles:
-
- ```yaml
- model:
-   routing:
-     enabled: true
-     defaultProfile: general
-     overrides:
-       code: deepseek-coder:6.7b
-       fast: phi3:mini
- ```
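
As a rough illustration of how a config like the one above could resolve to a concrete model, here is a minimal TypeScript sketch: per-profile overrides win over built-in defaults, and a manual `/model use` wins over both. The default model table and function names are assumptions, not the CLI's actual routing code.

```typescript
// Hypothetical sketch of profile-to-model resolution for a config like the one above.
type RoutingProfile = 'fast' | 'general' | 'code' | 'creative';

interface RoutingConfig {
  enabled: boolean;
  defaultProfile: RoutingProfile;
  overrides?: Partial<Record<RoutingProfile, string>>;
}

// Assumed defaults for illustration only.
const BUILT_IN_DEFAULTS: Record<RoutingProfile, string> = {
  fast: 'phi3:mini',
  general: 'llama3.1:8b',
  code: 'codellama:7b',
  creative: 'mistral:7b',
};

function resolveModel(
  config: RoutingConfig,
  requestedProfile?: RoutingProfile,
  manualModel?: string,
): string {
  // A manual `/model use <name>` always wins for the current session.
  if (manualModel) return manualModel;
  const profile = requestedProfile ?? config.defaultProfile;
  // Per-profile overrides take precedence over the built-in defaults.
  return config.overrides?.[profile] ?? BUILT_IN_DEFAULTS[profile];
}

// Example: with the overrides above, the code profile resolves to deepseek-coder:6.7b.
const cfg: RoutingConfig = {
  enabled: true,
  defaultProfile: 'general',
  overrides: { code: 'deepseek-coder:6.7b', fast: 'phi3:mini' },
};
console.log(resolveModel(cfg, 'code')); // deepseek-coder:6.7b
console.log(resolveModel(cfg));         // llama3.1:8b (assumed general default)
```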
-
- ### Manual Override
-
- You can always manually select a model:
-
- ```bash
- /model use llama3.1:8b
- ```
-
- This overrides routing for the current session.
-
- ---
-
- ## Working with Memory
-
- ### What is Memory?
-
- The memory system stores facts, preferences, and context across sessions, so you don't have to repeat information.
-
- ### Adding Memories
-
- Store information for future sessions:
-
- ```bash
- # Add a simple memory
- /memory add user_name Alice
-
- # Add with category
- /memory add preferred_language TypeScript --category preference
- ```
-
- ### Listing Memories
-
- View all stored memories:
-
- ```bash
- /memory list
- ```
-
- **Output:**
-
- ```
- Stored Memories:
-   user_name: Alice (preference)
-   preferred_language: TypeScript (preference)
-   project_type: monorepo (context)
- ```
-
- ### Searching Memories
-
- Find specific memories:
-
- ```bash
- /memory search project
- ```
-
- ### Forgetting Memories
-
- Remove memories you no longer need:
-
- ```bash
- # Forget a specific memory
- /memory forget old_preference
-
- # Clear all memories
- /memory clear
- ```
-
- ### LLM-Initiated Memory
-
- The LLM can also store memories during conversation:
-
- ```
- User: My name is Alice and I prefer TypeScript.
- Assistant: I'll remember that. [Stores: user_name=Alice, preferred_language=TypeScript]
- ```
-
- ### How Memory Works
-
- 1. Memories are stored in `~/.ollm/memory.json`
- 2. At session start, memories are loaded
- 3. Memories are injected into the system prompt (within token budget; see the sketch below)
- 4. Recently accessed memories are prioritized
- 5. Access count and timestamps are tracked
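
A minimal sketch of what budget-aware injection could look like follows. The `MemoryEntry` shape, the 4-characters-per-token estimate, and the function names are assumptions; only the behaviour (recently accessed first, stop at the budget) comes from the list above.

```typescript
// Hypothetical sketch of budget-aware memory injection into the system prompt.
interface MemoryEntry {
  key: string;
  value: string;
  category?: 'preference' | 'context' | 'fact';
  accessCount: number;
  lastAccessed: string; // ISO timestamp
}

// Rough token estimate (assumed heuristic): ~4 characters per token.
const estimateTokens = (text: string): number => Math.ceil(text.length / 4);

function buildMemoryBlock(entries: MemoryEntry[], tokenBudget: number): string {
  // Most recently accessed memories are prioritized.
  const sorted = [...entries].sort(
    (a, b) => Date.parse(b.lastAccessed) - Date.parse(a.lastAccessed),
  );
  const lines: string[] = [];
  let used = 0;
  for (const entry of sorted) {
    const line = `- ${entry.key}: ${entry.value}`;
    const cost = estimateTokens(line);
    if (used + cost > tokenBudget) break; // stay within the budget
    lines.push(line);
    used += cost;
  }
  return lines.length ? `Known about the user:\n${lines.join('\n')}` : '';
}

// Example: only as many memories as fit in ~50 tokens are injected.
const block = buildMemoryBlock(
  [
    { key: 'user_name', value: 'Alice', category: 'preference', accessCount: 3, lastAccessed: '2026-01-15T10:00:00Z' },
    { key: 'preferred_language', value: 'TypeScript', category: 'preference', accessCount: 5, lastAccessed: '2026-01-16T09:00:00Z' },
  ],
  50,
);
console.log(block);
```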
-
- ---
-
- ## Using Templates
-
- ### What are Templates?
-
- Templates are reusable prompts with variable substitution, allowing you to quickly use common prompts with different inputs.
-
- ### Listing Templates
-
- View available templates:
-
- ```bash
- /template list
- ```
-
- **Output:**
-
- ```
- Available Templates:
-   code_review - Review code for quality and security
-   explain_code - Explain how code works
-   write_tests - Generate unit tests for code
- ```
-
- ### Using Templates
-
- Apply a template with variables:
-
- ```bash
- # Use a template with variables
- /template use code_review language=TypeScript code="function add(a, b) { return a + b; }"
-
- # Variables with spaces
- /template use explain_code language="Python" code="def factorial(n): return 1 if n <= 1 else n * factorial(n-1)"
- ```
-
- ### Creating Templates
-
- Create a new template:
-
- ```bash
- /template create my_template
- ```
-
- **Template format (YAML):**
-
- ```yaml
- name: code_review
- description: Review code for quality and security
- template: "Review this {language} code for {focus:bugs and security}:\n\n{code}"
- variables:
-   - name: language
-     required: true
-     description: Programming language
-   - name: focus
-     required: false
-     default: 'bugs and security'
-     description: Review focus areas
-   - name: code
-     required: true
-     description: Code to review
- ```
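
To make the placeholder syntax concrete, here is a small sketch of how `{name}` and `{name:default}` substitution could be implemented. It is illustrative only, not the CLI's actual template engine, and the function and type names are assumptions.

```typescript
// Hypothetical sketch of template rendering for the format above: {name} placeholders,
// with {name:default} providing an inline fallback and declared defaults as a backup.
interface TemplateVariable {
  name: string;
  required: boolean;
  default?: string;
}

function renderTemplate(
  template: string,
  variables: TemplateVariable[],
  values: Record<string, string>,
): string {
  for (const v of variables) {
    if (v.required && !(v.name in values)) {
      throw new Error(`Missing required variable: ${v.name}`);
    }
  }
  // Matches {name} or {name:inline default}.
  return template.replace(/\{(\w+)(?::([^}]*))?\}/g, (_match, name: string, inline?: string) => {
    const provided = name in values ? values[name] : undefined;
    const declared = variables.find((v) => v.name === name)?.default;
    return provided ?? inline ?? declared ?? '';
  });
}

// Example matching the code_review template above.
const rendered = renderTemplate(
  'Review this {language} code for {focus:bugs and security}:\n\n{code}',
  [
    { name: 'language', required: true },
    { name: 'focus', required: false, default: 'bugs and security' },
    { name: 'code', required: true },
  ],
  { language: 'TypeScript', code: 'function add(a, b) { return a + b; }' },
);
console.log(rendered);
```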
-
- ### Template Locations
-
- Templates are loaded from:
-
- - User templates: `~/.ollm/templates/`
- - Workspace templates: `.ollm/templates/`
-
- Workspace templates override user templates with the same name.
-
- ---
-
- ## Project Profiles
-
- ### What are Project Profiles?
-
- Project profiles detect your project type and apply the appropriate settings automatically.
-
- ### Auto-Detection
-
- OLLM CLI detects project type from characteristic files:
-
- ```bash
- /project detect
- ```
-
- **Detected types** (detection logic sketched below):
-
- - **TypeScript**: `package.json` with TypeScript dependencies
- - **Python**: `requirements.txt`, `pyproject.toml`, `setup.py`
- - **Rust**: `Cargo.toml`
- - **Go**: `go.mod`
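
The detection can be pictured as a simple marker-file check. In the sketch below the function name, profile identifiers, and precedence order are assumptions; only the marker files come from the list above.

```typescript
// Hypothetical sketch of marker-file detection for the project types listed above.
import { existsSync, readFileSync } from 'node:fs';
import { join } from 'node:path';

type ProjectType = 'typescript' | 'python' | 'rust' | 'go' | 'unknown';

function detectProjectType(root: string): ProjectType {
  if (existsSync(join(root, 'Cargo.toml'))) return 'rust';
  if (existsSync(join(root, 'go.mod'))) return 'go';
  if (['requirements.txt', 'pyproject.toml', 'setup.py'].some((f) => existsSync(join(root, f)))) {
    return 'python';
  }
  const pkgPath = join(root, 'package.json');
  if (existsSync(pkgPath)) {
    // package.json counts as TypeScript only when TypeScript appears in its dependencies.
    const pkg = JSON.parse(readFileSync(pkgPath, 'utf8'));
    const deps = { ...pkg.dependencies, ...pkg.devDependencies };
    if ('typescript' in deps) return 'typescript';
  }
  return 'unknown';
}

console.log(detectProjectType(process.cwd()));
```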
-
- ### Using a Profile
-
- Manually select a profile:
-
- ```bash
- /project use typescript
- ```
-
- ### Initializing a Project
-
- Create a project configuration file:
-
- ```bash
- /project init
- ```
-
- This creates `.ollm/project.yaml` with the selected profile settings.
-
- ### Project Configuration
-
- Example `.ollm/project.yaml`:
-
- ```yaml
- # Project profile
- profile: typescript
-
- # Override global model
- model: deepseek-coder:6.7b
-
- # Project-specific routing
- routing:
-   defaultProfile: code
-
- # Project-specific options
- options:
-   temperature: 0.3
-   maxTokens: 4096
- ```
-
- ### Built-in Profiles
-
- **TypeScript Profile:**
-
- - Code-optimized model
- - Code routing profile
- - File and shell tools enabled
-
- **Python Profile:**
-
- - Code-optimized model
- - Code routing profile
- - Python-specific tools
-
- **Rust Profile:**
-
- - Code-optimized model
- - Emphasis on memory safety
-
- **Go Profile:**
-
- - Code-optimized model
- - Emphasis on concurrency
-
- **Documentation Profile:**
-
- - Writing-optimized model
- - Creative routing profile
-
- ---
-
- ## Managing Tools
-
- ### What are Tools?
-
- Tools are functions that the LLM can call to perform actions like reading files, executing shell commands, searching the web, and more. OLLM CLI provides 20 built-in tools organized into 6 categories.
-
- ### Tools Panel
-
- Access the Tools Panel to enable or disable individual tools:
-
- **Navigation:**
-
- - Switch to the Tools tab in the UI
- - Use keyboard shortcuts to navigate:
-   - `↑/↓`: Navigate between tools
-   - `←/→/Enter`: Toggle tool on/off
-   - `Tab`: Switch between tabs
-
- **Tool Categories:**
-
- 1. **File Operations** (4 tools)
-    - `fsWrite`: Create or overwrite files
-    - `fsAppend`: Append content to files
-    - `strReplace`: Replace text in files
-    - `deleteFile`: Delete files
-
- 2. **File Discovery** (5 tools)
-    - `readFile`: Read file contents
-    - `readMultipleFiles`: Read multiple files at once
-    - `listDirectory`: List directory contents
-    - `fileSearch`: Search for files by name
-    - `grepSearch`: Search file contents with regex
-
- 3. **Shell** (4 tools)
-    - `executePwsh`: Execute shell commands
-    - `controlPwshProcess`: Manage background processes
-    - `listProcesses`: List running processes
-    - `getProcessOutput`: Read process output
-
- 4. **Web** (2 tools)
-    - `remote_web_search`: Search the web
-    - `webFetch`: Fetch content from URLs
-
- 5. **Memory** (1 tool)
-    - `userInput`: Get input from the user
-
- 6. **Context** (4 tools)
-    - `prework`: Acceptance criteria testing prework
-    - `taskStatus`: Update task status
-    - `updatePBTStatus`: Update property-based test status
-    - `invokeSubAgent`: Delegate to specialized agents
-
- ### Enabling/Disabling Tools
-
- **Why disable tools?**
-
- - Reduce the number of tools sent to the LLM (improves focus)
- - Prevent certain actions (e.g., disable shell execution for safety)
- - Customize tool availability per project
-
- **How to disable:**
-
- 1. Navigate to the Tools tab
- 2. Use arrow keys to select a tool
- 3. Press Enter or Left/Right to toggle
-
- **Visual indicators:**
-
- - `[✓]` Tool is enabled
- - `[ ]` Tool is disabled
-
- **Persistence:**
-
- - Tool settings are saved to `~/.ollm/settings.json`
- - Settings persist across sessions
- - Workspace-specific settings can override user settings
-
- ### Tool Filtering
-
- Tools are filtered in two stages (sketched below):
-
- 1. **Model Capability Check**: If the current model doesn't support function calling, all tools are automatically disabled
- 2. **User Preference Check**: Even if the model supports tools, you can disable specific tools via the Tools Panel
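
A compact sketch of this two-stage filter is shown below; the type and function names are assumptions, but the order of the checks follows the two stages above.

```typescript
// Hypothetical sketch of the two-stage filter: model capability first, then user preferences.
interface ToolDefinition {
  name: string;
  description: string;
}

interface ModelInfo {
  name: string;
  supportsTools: boolean;
}

function filterTools(
  allTools: ToolDefinition[],
  model: ModelInfo,
  userPreferences: Record<string, boolean>, // e.g. toggles persisted in ~/.ollm/settings.json
): ToolDefinition[] {
  // Stage 1: model capability check — no function calling means no tools at all.
  if (!model.supportsTools) return [];
  // Stage 2: user preference check — drop tools the user disabled in the Tools Panel.
  return allTools.filter((tool) => userPreferences[tool.name] !== false);
}

// Example: shell execution disabled by the user, everything else passes through.
const tools = filterTools(
  [
    { name: 'readFile', description: 'Read file contents' },
    { name: 'executePwsh', description: 'Execute shell commands' },
  ],
  { name: 'llama3.1:8b', supportsTools: true },
  { executePwsh: false },
);
console.log(tools.map((t) => t.name)); // [ 'readFile' ]
```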
-
- **System message when tools are disabled:**
-
- ```
- Switched to gemma3:1b. Tools: Disabled
- ```
-
- ### Model Tool Support
-
- Some models don't support function calling. When you switch to such a model:
-
- - Tools are automatically disabled
- - System prompt includes a note: "This model does not support function calling"
- - Tools Panel shows: "Model doesn't support tools"
- - You can still view and configure tool preferences for when you switch back to a tool-capable model
-
- ---
-
- ## Next Steps
-
- ### Learn More
-
- **Model Management:**
-
- - [Model Commands Reference](LLM_ModelsCommands.md)
- - [Model Architecture](LLM_ModelsArchitecture.md)
- - [Configuration Guide](LLM_ModelsConfiguration.md)
-
- **Routing:**
-
- - [Routing User Guide](routing/user-guide.md)
- - [Routing Development Guide](routing/development-guide.md)
- - [Profiles Reference](profiles-reference.md)
-
- **Memory:**
-
- - [Memory User Guide](memory/user-guide.md)
- - [Memory API Reference](api-reference.md)
-
- **Templates:**
-
- - [Templates User Guide](templates/user-guide.md)
- - [Template Reference](template-reference.md)
-
- **Profiles:**
-
- - [Profiles User Guide](profiles/user-guide.md)
- - [Built-in Profiles](built-in-profiles.md)
-
- ### Advanced Topics
-
- - [Custom Routing Profiles](routing/development-guide.md)
- - [Template Libraries](template-reference.md)
- - [API Reference](api/)
-
- ### Get Help
-
- - [Troubleshooting Guide](../Troubleshooting.md)
- - [GitHub Issues](https://github.com/ollm/ollm-cli/issues)
- - [Community Forum](#)
-
- ---
-
- **Last Updated:** 2026-01-16
- **Version:** 0.1.0