@tecet/ollm 0.1.4-b → 0.1.5

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (66)
  1. package/docs/README.md +3 -410
  2. package/package.json +2 -2
  3. package/docs/Context/CheckpointFlowDiagram.md +0 -673
  4. package/docs/Context/ContextArchitecture.md +0 -898
  5. package/docs/Context/ContextCompression.md +0 -1102
  6. package/docs/Context/ContextManagment.md +0 -750
  7. package/docs/Context/Index.md +0 -209
  8. package/docs/Context/README.md +0 -390
  9. package/docs/DevelopmentRoadmap/Index.md +0 -238
  10. package/docs/DevelopmentRoadmap/OLLM-CLI_Releases.md +0 -419
  11. package/docs/DevelopmentRoadmap/PlanedFeatures.md +0 -448
  12. package/docs/DevelopmentRoadmap/README.md +0 -174
  13. package/docs/DevelopmentRoadmap/Roadmap.md +0 -572
  14. package/docs/DevelopmentRoadmap/RoadmapVisual.md +0 -372
  15. package/docs/Hooks/Architecture.md +0 -885
  16. package/docs/Hooks/Index.md +0 -244
  17. package/docs/Hooks/KeyboardShortcuts.md +0 -248
  18. package/docs/Hooks/Protocol.md +0 -817
  19. package/docs/Hooks/README.md +0 -403
  20. package/docs/Hooks/UserGuide.md +0 -1483
  21. package/docs/Hooks/VisualGuide.md +0 -598
  22. package/docs/Index.md +0 -506
  23. package/docs/Installation.md +0 -586
  24. package/docs/Introduction.md +0 -367
  25. package/docs/LLM Models/Index.md +0 -239
  26. package/docs/LLM Models/LLM_GettingStarted.md +0 -748
  27. package/docs/LLM Models/LLM_Index.md +0 -701
  28. package/docs/LLM Models/LLM_MemorySystem.md +0 -337
  29. package/docs/LLM Models/LLM_ModelCompatibility.md +0 -499
  30. package/docs/LLM Models/LLM_ModelsArchitecture.md +0 -933
  31. package/docs/LLM Models/LLM_ModelsCommands.md +0 -839
  32. package/docs/LLM Models/LLM_ModelsConfiguration.md +0 -1094
  33. package/docs/LLM Models/LLM_ModelsList.md +0 -1071
  34. package/docs/LLM Models/LLM_ModelsList.md.backup +0 -400
  35. package/docs/LLM Models/README.md +0 -355
  36. package/docs/MCP/MCP_Architecture.md +0 -1086
  37. package/docs/MCP/MCP_Commands.md +0 -1111
  38. package/docs/MCP/MCP_GettingStarted.md +0 -590
  39. package/docs/MCP/MCP_Index.md +0 -524
  40. package/docs/MCP/MCP_Integration.md +0 -866
  41. package/docs/MCP/MCP_Marketplace.md +0 -160
  42. package/docs/MCP/README.md +0 -415
  43. package/docs/Prompts System/Architecture.md +0 -760
  44. package/docs/Prompts System/Index.md +0 -223
  45. package/docs/Prompts System/PromptsRouting.md +0 -1047
  46. package/docs/Prompts System/PromptsTemplates.md +0 -1102
  47. package/docs/Prompts System/README.md +0 -389
  48. package/docs/Prompts System/SystemPrompts.md +0 -856
  49. package/docs/Quickstart.md +0 -535
  50. package/docs/Tools/Architecture.md +0 -884
  51. package/docs/Tools/GettingStarted.md +0 -624
  52. package/docs/Tools/Index.md +0 -216
  53. package/docs/Tools/ManifestReference.md +0 -141
  54. package/docs/Tools/README.md +0 -440
  55. package/docs/Tools/UserGuide.md +0 -773
  56. package/docs/Troubleshooting.md +0 -1265
  57. package/docs/UI&Settings/Architecture.md +0 -729
  58. package/docs/UI&Settings/ColorASCII.md +0 -34
  59. package/docs/UI&Settings/Commands.md +0 -755
  60. package/docs/UI&Settings/Configuration.md +0 -872
  61. package/docs/UI&Settings/Index.md +0 -293
  62. package/docs/UI&Settings/Keybinds.md +0 -372
  63. package/docs/UI&Settings/README.md +0 -278
  64. package/docs/UI&Settings/Terminal.md +0 -637
  65. package/docs/UI&Settings/Themes.md +0 -604
  66. package/docs/UI&Settings/UIGuide.md +0 -550
@@ -1,1265 +0,0 @@
- # Troubleshooting Guide
-
- Having trouble with OLLM CLI? This guide covers common issues and their solutions. Most problems can be solved in a few minutes!
-
- ---
-
- ## Table of Contents
-
- [Quick Fixes](#quick-fixes)
- [Connection Issues](#connection-issues)
- [Installation Issues](#installation-issues)
- [Tool Execution Issues](#tool-execution-issues)
- [Context and Memory Issues](#context-and-memory-issues)
- [UI and Display Issues](#ui-and-display-issues)
- [Debug Mode](#debug-mode)
- [Getting Help](#getting-help)
-
- ---
-
- ## Quick Fixes
-
- Try these first - they solve most problems:
-
- ### 1. Restart Everything
-
- ```bash
- # Exit OLLM CLI
- Ctrl+C
-
- # Restart Ollama
- ollama serve
-
- # Start OLLM CLI again
- ollm
- ```
-
- ### 2. Check Ollama is Running
-
- ```bash
- # Test Ollama connection
- curl http://localhost:11434/api/tags
-
- # Should return JSON with model list
- ```
-
- ### 3. Verify Model is Downloaded
-
- ```bash
- # List models
- ollama list
-
- # If empty, download a model
- ollama pull llama3.2:3b
- ```
-
- ### 4. Clear Cache and Restart
-
- ```bash
- # Clear npm cache
- npm cache clean --force
-
- # Reinstall OLLM CLI
- npm install -g @tecet/ollm
- ```
-
- ### 5. Check Node.js Version
-
- ```bash
- # Must be 20 or higher
- node --version
-
- # Update if needed
- nvm install 20
- nvm use 20
- ```
-
- ---
-
- ## Connection Issues
-
- ### Cannot connect to Ollama
-
- **Symptoms:**
-
- Error message: `Connection refused` or `ECONNREFUSED`
- Error message: `Failed to connect to Ollama at http://localhost:11434`
- Commands hang or timeout when trying to communicate with Ollama
-
- **Causes:**
-
- Ollama service is not running
- Ollama is running on a different host or port
- Firewall blocking the connection
- Network configuration issues
-
- **Solutions:**
-
- 1. **Start Ollama service:**
-
- ```bash
- # On macOS/Linux
- ollama serve
-
- # On Windows
- # Ollama typically runs as a service, check if it's running
- ```
-
- 2. **Verify Ollama is running:**
-
- ```bash
- # Check if Ollama is responding
- curl http://localhost:11434/api/tags
- ```
-
- 3. **Specify custom host:**
-
- ```bash
- # If Ollama is running on a different host/port
- ollm --host http://192.168.1.100:11434
-
- # Or set environment variable
- export OLLAMA_HOST=http://192.168.1.100:11434
- ollm
- ```
-
- 4. **Check firewall settings:**
- Ensure port 11434 is not blocked by your firewall
- On Windows: Check Windows Defender Firewall settings
- On macOS: Check System Preferences > Security & Privacy > Firewall
- On Linux: Check iptables or ufw rules
-
- 5. **Verify network connectivity:**
-
- ```bash
- # Test if the host is reachable
- ping localhost
-
- # Test if the port is open
- telnet localhost 11434
- ```
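If `telnet` itself is missing (common on minimal modern systems), bash can probe the port on its own. A minimal sketch using bash's `/dev/tcp` redirection; the host and port below are the Ollama defaults, so adjust them if you changed either:

```shell
# Probe a TCP port without telnet, using bash's /dev/tcp pseudo-device.
# Host/port are the Ollama defaults; adjust if you changed them.
host=127.0.0.1 port=11434
if timeout 2 bash -c "cat < /dev/null > /dev/tcp/$host/$port" 2>/dev/null; then
  status="port $port open on $host"
else
  status="port $port closed on $host (is 'ollama serve' running?)"
fi
echo "$status"
```

Either way the script prints a one-line verdict; `/dev/tcp` is a bash feature, so it will not work under plain `sh`.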
-
- ### Model not found
-
- **Symptoms:**
-
- Error message: `Model 'model-name' not found`
- Error message: `404 Not Found` when trying to use a model
- Model list doesn't show the expected model
-
- **Causes:**
-
- Model hasn't been downloaded yet
- Model name is misspelled
- Model was removed or renamed
-
- **Solutions:**
-
- 1. **List available models:**
-
- ```bash
- ollm --list-models
- # Or directly with Ollama
- ollama list
- ```
-
- 2. **Pull the model:**
-
- ```bash
- # Using OLLM CLI
- ollm --pull llama3.1:8b
-
- # Or directly with Ollama
- ollama pull llama3.1:8b
- ```
-
- 3. **Check model name spelling:**
- Model names are case-sensitive
- Use exact names from `ollama list`
- Common models: `llama3.1:8b`, `codellama:7b`, `mistral:7b`
-
- 4. **Verify model installation:**
- ```bash
- # Check if model files exist
- ollama show llama3.1:8b
- ```
-
- ### Network/Firewall Issues
-
- **Symptoms:**
-
- Intermittent connection failures
- Slow response times
- Timeout errors
-
- **Causes:**
-
- Corporate firewall blocking connections
- VPN interfering with local connections
- Proxy configuration issues
- Network instability
-
- **Solutions:**
-
- 1. **Check proxy settings:**
-
- ```bash
- # If behind a proxy, configure it
- export HTTP_PROXY=http://proxy.example.com:8080
- export HTTPS_PROXY=http://proxy.example.com:8080
- export NO_PROXY=localhost,127.0.0.1
- ```
-
- 2. **Disable VPN temporarily:**
- Some VPNs interfere with localhost connections
- Try disconnecting VPN and testing again
-
- 3. **Configure firewall exceptions:**
- Add Ollama (port 11434) to firewall exceptions
- Add OLLM CLI executable to allowed programs
-
- 4. **Use direct IP instead of localhost:**
- ```bash
- # Try using 127.0.0.1 instead of localhost
- ollm --host http://127.0.0.1:11434
- ```
-
- ---
-
- ## Installation Issues
-
- ### Global install fails
-
- **Symptoms:**
-
- `npm install -g @tecet/ollm` fails with errors
- Permission denied errors during installation
- Installation completes but `ollm` command not found
-
- **Causes:**
-
- Insufficient permissions
- npm global directory not in PATH
- Corrupted npm cache
- Node.js version incompatibility
-
- **Solutions:**
-
- 1. **Use sudo (macOS/Linux):**
-
- ```bash
- sudo npm install -g @tecet/ollm
- ```
-
- 2. **Configure npm to use user directory (recommended):**
-
- ```bash
- # Create a directory for global packages
- mkdir ~/.npm-global
-
- # Configure npm to use it
- npm config set prefix '~/.npm-global'
-
- # Add to PATH (add to ~/.bashrc or ~/.zshrc)
- export PATH=~/.npm-global/bin:$PATH
-
- # Reload shell configuration
- source ~/.bashrc # or source ~/.zshrc
-
- # Install without sudo
- npm install -g @tecet/ollm
- ```
-
- 3. **Clear npm cache:**
-
- ```bash
- npm cache clean --force
- npm install -g @tecet/ollm
- ```
-
- 4. **Verify installation:**
-
- ```bash
- # Check if ollm is in PATH
- which ollm
-
- # Check version
- ollm --version
- ```
-
- 5. **Manual PATH configuration:**
-
- ```bash
- # Find the npm global bin directory (npm 9+ removed 'npm bin -g';
- # the bin directory is <prefix>/bin)
- npm prefix -g
-
- # Add to PATH if not already there
- export PATH="$(npm prefix -g)/bin:$PATH"
- ```
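Repeatedly sourcing your shell config can append the same directory to `PATH` more than once. A small sketch that checks membership before appending; the `~/.npm-global` path matches the earlier example and may differ on your system:

```shell
# Check whether a directory is already on PATH before appending it again.
# ~/.npm-global/bin matches the example above; substitute your own prefix.
bin="$HOME/.npm-global/bin"
case ":$PATH:" in
  *":$bin:"*) result="already on PATH: $bin" ;;
  *)          result="not on PATH: $bin" ;;
esac
echo "$result"
```

The `case ":$PATH:"` trick wraps both sides in colons so the match works for the first and last PATH entries too.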
-
- ### Permission errors
-
- **Symptoms:**
-
- `EACCES` or `EPERM` errors during installation
- Cannot write to npm directories
- Installation fails with permission denied
-
- **Causes:**
-
- npm global directory owned by root
- Insufficient file system permissions
- Protected system directories
-
- **Solutions:**
-
- 1. **Fix npm permissions (macOS/Linux):**
-
- ```bash
- # Find npm directory
- npm config get prefix
-
- # Change ownership to your user
- sudo chown -R $(whoami) $(npm config get prefix)/{lib/node_modules,bin,share}
- ```
-
- 2. **Use npx instead of global install:**
-
- ```bash
- # Run without installing globally
- npx @tecet/ollm -p "your prompt"
- ```
-
- 3. **Install in user directory (Windows):**
-
- ```cmd
- REM npm should install to %APPDATA%\npm by default
- REM Verify with:
- npm config get prefix
-
- REM If needed, set to user directory:
- npm config set prefix %APPDATA%\npm
- ```
-
- 4. **Run as administrator (Windows):**
- Right-click Command Prompt or PowerShell
- Select "Run as administrator"
- Run installation command
-
- ### Node version incompatibility
-
- **Symptoms:**
-
- Error message about unsupported Node.js version
- Syntax errors during installation
- Module loading errors
-
- **Causes:**
-
- Node.js version is too old (< 20.0.0) or otherwise incompatible
-
- **Solutions:**
-
- 1. **Check Node.js version:**
-
- ```bash
- node --version
- ```
-
- 2. **Upgrade Node.js:**
-
- ```bash
- # Using nvm (recommended)
- nvm install 20
- nvm use 20
-
- # Or download from nodejs.org
- # https://nodejs.org/
- ```
-
- 3. **Install nvm (Node Version Manager):**
-
- ```bash
- # macOS/Linux
- curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.39.0/install.sh | bash
-
- # Windows - use nvm-windows
- # https://github.com/coreybutler/nvm-windows
- ```
-
- 4. **Verify installation after upgrade:**
- ```bash
- node --version # Should be 20.x or higher
- npm install -g @tecet/ollm
- ```
-
- ---
-
- ## Tool Execution Issues
-
- ### Tool support errors
-
- **Symptoms:**
-
- Error message: `unknown field: tools`
- Error message: `400 Bad Request` when using tools
- Model returns errors about function calling
- Tools don't work with certain models
-
- **Causes:**
-
- Model doesn't support function calling
- Tool support metadata is incorrect
- Model was updated and capabilities changed
- Custom/unknown model without proper configuration
-
- **Solutions:**
-
- 1. **Check model tool support:**
-
- ```bash
- /model info model-name
- ```
-
- Look for "Tool calling" in capabilities.
-
- 2. **For unknown models, configure tool support:**
- When switching to an unknown model, you'll be prompted:
-
- ```
- Does this model support function calling (tools)?
- [y] Yes, it supports tools
- [n] No, it doesn't support tools
- [a] Auto-detect (test with a sample request)
- ```
-
- Choose the appropriate option.
-
- 3. **Manually update tool support in user_models.json:**
-
- ```json
- {
-   "user_models": [
-     {
-       "id": "custom-model:latest",
-       "tool_support": false,
-       "tool_support_source": "user_confirmed",
-       "tool_support_confirmed_at": "2026-01-17T10:00:00Z"
-     }
-   ]
- }
- ```
-
- 4. **Use auto-detect for uncertain models:**
- Select "auto-detect" when prompted
- System will test the model with a sample tool request
- Result is saved automatically
-
- 5. **Runtime learning:**
- If tool errors occur during usage, you'll be prompted:
-
- ```
- This model appears to not support tools. Update metadata? (y/n)
- ```
-
- Confirm to save the updated metadata.
-
- 6. **Check system messages:**
- When switching models, look for:
-
- ```
- Switched to model-name. Tools: Enabled/Disabled
- ```
-
- 7. **Verify tool filtering:**
- Tools are automatically disabled for non-supporting models
- Check the Tools Panel to see current tool status
- System prompt includes note when tools are disabled
-
- **Prevention:**
-
- Use models from the shipped profiles (LLM_profiles.json)
- Confirm tool support when adding custom models
- Keep user_models.json up to date with `/model list`
-
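When auditing the metadata above, it can help to list each model's saved `tool_support` flag without opening an editor. A sketch using `python3` (so it works without `jq`); it writes a sample file rather than touching your real `user_models.json`, so point it at the real path for actual use:

```shell
# Print each saved model's tool_support flag. Uses a sample file so the
# sketch is safe to run; substitute the path to your real user_models.json.
cat > /tmp/user_models_sample.json <<'EOF'
{ "user_models": [ { "id": "custom-model:latest", "tool_support": false } ] }
EOF
summary=$(python3 -c '
import json
data = json.load(open("/tmp/user_models_sample.json"))
for m in data["user_models"]:
    print(m["id"], "tool_support:", m["tool_support"])
')
echo "$summary"
```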
- ### Shell command timeout
-
- **Symptoms:**
-
- Long-running commands are interrupted
- Error message: `Command timed out`
- Commands that should complete are killed prematurely
-
- **Causes:**
-
- Default timeout is too short for the command
- Command is actually hanging or stuck
- System resource constraints
-
- **Solutions:**
-
- 1. **Increase timeout in configuration:**
-
- ```yaml
- # ~/.ollm/config.yaml or .ollm/config.yaml
- tools:
-   shell:
-     timeout: 60000 # 60 seconds (in milliseconds)
- ```
-
- 2. **Use command-line flag:**
-
- ```bash
- ollm --tool-timeout 60000
- ```
-
- 3. **Set environment variable:**
-
- ```bash
- export OLLM_TOOL_TIMEOUT=60000
- ollm
- ```
-
- 4. **For specific long-running commands:**
-
- ```yaml
- # Configure per-tool timeouts
- tools:
-   shell:
-     timeout: 120000 # 2 minutes for shell commands
-   web:
-     timeout: 30000 # 30 seconds for web requests
- ```
-
- 5. **Check if command is actually stuck:**
- ```bash
- # Enable debug mode to see what's happening
- ollm --debug
- ```
-
- ### File operation denied
-
- **Symptoms:**
-
- Error message: `EACCES: permission denied`
- Cannot read or write files
- File operations fail silently
-
- **Causes:**
-
- Insufficient file system permissions
- File is locked by another process
- Protected system directories
- Workspace not in allowed paths
-
- **Solutions:**
-
- 1. **Check file permissions:**
-
- ```bash
- # macOS/Linux
- ls -la /path/to/file
-
- # Fix permissions if needed
- chmod 644 /path/to/file # For files
- chmod 755 /path/to/dir # For directories
- ```
-
- 2. **Run from correct directory:**
-
- ```bash
- # Ensure you're in a directory you have access to
- cd ~/projects/my-project
- ollm
- ```
-
- 3. **Configure allowed paths:**
-
- ```yaml
- # ~/.ollm/config.yaml
- tools:
-   file:
-     allowedPaths:
-       - ~/projects
-       - ~/documents
-       - /tmp
- ```
-
- 4. **Check file locks:**
-
- ```bash
- # macOS/Linux - check if file is open
- lsof /path/to/file
-
- # Windows - check file handles
- # Use Process Explorer or Resource Monitor
- ```
-
- 5. **Use workspace-relative paths:**
- ```bash
- # Instead of absolute paths, use relative paths
- # OLLM CLI operates within the current workspace
- cd /path/to/workspace
- ollm
- ```
-
- ### Tools Panel issues
-
- **Symptoms:**
-
- Tool settings don't persist across sessions
- Disabled tools still appear to be called
- Tools Panel shows incorrect state
- Cannot toggle tools on/off
-
- **Causes:**
-
- Settings file is corrupted or has wrong permissions
- Workspace settings override user settings
- Model doesn't support tools (all tools disabled)
- Cache not refreshed after settings change
-
- **Solutions:**
-
- 1. **Check settings file location:**
-
- ```bash
- # User settings
- cat ~/.ollm/settings.json
-
- # Workspace settings (overrides user)
- cat .ollm/settings.json
- ```
-
- 2. **Verify settings file format:**
-
- ```json
- {
-   "tools": {
-     "executePwsh": false,
-     "remote_web_search": true,
-     "fsWrite": true
-   }
- }
- ```
-
- 3. **Fix file permissions:**
-
- ```bash
- # macOS/Linux
- chmod 644 ~/.ollm/settings.json
-
- # Windows
- # Check file properties and ensure you have write access
- ```
-
- 4. **Reset to defaults:**
-
- ```bash
- # Backup current settings
- cp ~/.ollm/settings.json ~/.ollm/settings.json.backup
-
- # Remove settings file to reset
- rm ~/.ollm/settings.json
-
- # Restart OLLM CLI - new settings file will be created
- ```
-
- 5. **Check model tool support:**
- If model doesn't support tools, all tools are disabled
- Switch to a tool-capable model to enable tools
- Check system message: "Switched to model. Tools: Enabled/Disabled"
-
- 6. **Verify workspace overrides:**
-
- ```bash
- # Check if workspace has tool settings
- cat .ollm/settings.json
-
- # Remove workspace settings to use user settings
- rm .ollm/settings.json
- ```
-
- 7. **Debug tool filtering:**
- Enable debug mode: `ollm --debug`
- Check logs for tool filtering messages
- Verify two-stage filtering: model capability + user preference
-
- **Prevention:**
-
- Don't manually edit settings.json (use Tools Panel)
- Keep backups of working configurations
- Use version control for workspace settings
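Since a corrupted settings file is a common cause of non-persisting tool state, it is worth confirming the file parses as JSON before restarting. A sketch using `python3 -m json.tool`; it validates a sample file so it is safe to run as-is, so substitute `~/.ollm/settings.json` for real use:

```shell
# Confirm a settings file parses as JSON. Sample file used here; point the
# command at ~/.ollm/settings.json (or .ollm/settings.json) for real use.
cat > /tmp/settings_sample.json <<'EOF'
{ "tools": { "executePwsh": false, "fsWrite": true } }
EOF
if python3 -m json.tool /tmp/settings_sample.json > /dev/null 2>&1; then
  verdict="valid JSON"
else
  verdict="invalid JSON"
fi
echo "settings file: $verdict"
```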
-
- ---
-
- ## Context and Memory Issues
-
- ### Out of memory errors
-
- **Symptoms:**
-
- Error message: `JavaScript heap out of memory`
- Process crashes during operation
- System becomes unresponsive
- Slow performance before crash
-
- **Causes:**
-
- Context size too large for available memory
- Too many messages in conversation history
- Large files loaded into context
- Memory leak in long-running session
-
- **Solutions:**
-
- 1. **Increase Node.js memory limit:**
-
- ```bash
- # Set max memory to 4GB
- export NODE_OPTIONS="--max-old-space-size=4096"
- ollm
- ```
-
- 2. **Reduce context size:**
-
- ```yaml
- # ~/.ollm/config.yaml
- context:
-   maxTokens: 4096 # Reduce from default
-   maxMessages: 50 # Limit conversation history
- ```
-
- 3. **Enable automatic context compression:**
-
- ```yaml
- # ~/.ollm/config.yaml
- context:
-   compression:
-     enabled: true
-     strategy: 'summarize' # or 'truncate'
-     threshold: 0.8 # Compress at 80% capacity
- ```
-
- 4. **Clear conversation history:**
-
- ```bash
- # In interactive mode, use slash command
- /clear
-
- # Or start fresh session
- ollm --new-session
- ```
-
- 5. **Use context snapshots:**
-
- ```yaml
- # ~/.ollm/config.yaml
- context:
-   snapshots:
-     enabled: true
-     interval: 100 # Save every 100 messages
- ```
-
- 6. **Monitor memory usage:**
- ```bash
- # Enable VRAM monitoring
- ollm --monitor-vram
- ```
-
- ### Context overflow
-
- **Symptoms:**
-
- Error message: `Context length exceeded`
- Model refuses to process more input
- Responses become truncated or incomplete
- Warning about context limit
-
- **Causes:**
-
- Conversation history too long
- Large files or documents in context
- Model's context limit reached
- Accumulated tool outputs
-
- **Solutions:**
-
- 1. **Enable automatic context management:**
-
- ```yaml
- # ~/.ollm/config.yaml
- context:
-   management:
-     enabled: true
-     strategy: 'sliding-window' # Keep recent messages
- ```
-
- 2. **Configure context limits:**
-
- ```yaml
- # ~/.ollm/config.yaml
- context:
-   maxTokens: 8192 # Match model's capacity
-   reserveTokens: 1024 # Reserve for response
- ```
-
- 3. **Use context compression:**
-
- ```yaml
- # ~/.ollm/config.yaml
- context:
-   compression:
-     enabled: true
-     strategy: 'summarize'
-     threshold: 0.75
- ```
-
- 4. **Manually manage context:**
-
- ```bash
- # Clear old messages
- /clear
-
- # Create snapshot before clearing
- /snapshot save important-context
-
- # Load snapshot later
- /snapshot load important-context
- ```
-
- 5. **Use models with larger context:**
-
- ```bash
- # Switch to model with larger context window
- ollm --model llama3.1:70b # Has larger context capacity
- ```
-
- 6. **Optimize file loading:**
- ```yaml
- # ~/.ollm/config.yaml
- tools:
-   file:
-     maxFileSize: 100000 # Limit file size (bytes)
-     truncateOutput: true
-     maxOutputLines: 100
- ```
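The interaction of `maxTokens` and `reserveTokens` is simple budget arithmetic: the reserve is held back for the model's response, and everything else is available for history and the prompt. A sketch with the values from the example above:

```shell
# Budget arithmetic behind maxTokens/reserveTokens (values from the
# configuration example above).
maxTokens=8192
reserveTokens=1024
available=$((maxTokens - reserveTokens))
echo "available for conversation history: $available tokens"
```

So with an 8192-token window and a 1024-token reserve, overflow handling kicks in once the history plus prompt approaches 7168 tokens.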
-
- ---
-
- ## UI and Display Issues
-
- ### Colors not showing correctly
-
- **Symptoms:**
-
- Text appears in wrong colors
- No colors at all (plain text)
- Garbled characters or boxes
-
- **Causes:**
-
- Terminal doesn't support 256 colors
- Terminal color scheme overriding
- Wrong terminal emulator
-
- **Solutions:**
-
- 1. **Use a modern terminal:**
- **Windows:** Windows Terminal (from Microsoft Store)
- **macOS:** iTerm2 or built-in Terminal
- **Linux:** Most modern terminals work
-
- 2. **Check color support:**
-
- ```bash
- # Test 256 color support
- curl -s https://gist.githubusercontent.com/HaleTom/89ffe32783f89f403bba96bd7bcd1263/raw/ | bash
- ```
-
- 3. **Try different theme:**
-
- ```bash
- /theme list
- /theme use neon-dark
- ```
-
- 4. **Disable terminal color scheme:**
- Some terminals override colors
- Check terminal preferences
- Disable custom color schemes
-
- ### Text is too small/large
-
- **Solutions:**
-
- 1. **Adjust terminal font size:**
- **Windows Terminal:** Ctrl + Plus/Minus
- **iTerm2:** Cmd + Plus/Minus
- **Most terminals:** Ctrl + Plus/Minus
-
- 2. **Change terminal font:**
- Use a monospace font
- Recommended: Fira Code, JetBrains Mono, Cascadia Code
-
- ### Interface is cut off or wraps incorrectly
-
- **Symptoms:**
-
- Text wraps incorrectly
- Side panel overlaps chat
- Status bar missing
-
- **Causes:**
-
- Terminal window too small
- Terminal size not detected correctly
-
- **Solutions:**
-
- 1. **Resize terminal window:**
- Minimum: 80 columns × 24 rows
- Recommended: 120 columns × 40 rows
-
- 2. **Check terminal size:**
-
- ```bash
- # Show terminal dimensions
- echo "${COLUMNS}x${LINES}"
- ```
-
- 3. **Restart OLLM CLI after resizing:**
- ```bash
- # Exit and restart
- Ctrl+C
- ollm
- ```
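The size check above can be folded into a one-shot comparison against the 80×24 minimum. A sketch; `COLUMNS` and `LINES` are often unset in non-interactive shells, so it falls back to `tput` and then to the minimum itself:

```shell
# Compare the current terminal size against the 80x24 minimum.
# COLUMNS/LINES may be unset outside an interactive shell, so fall back
# to tput and then to the documented minimum.
cols=${COLUMNS:-$(tput cols 2>/dev/null || echo 80)}
rows=${LINES:-$(tput lines 2>/dev/null || echo 24)}
if [ "$cols" -ge 80 ] && [ "$rows" -ge 24 ]; then
  size_check="terminal OK: ${cols}x${rows}"
else
  size_check="terminal too small: ${cols}x${rows} (minimum 80x24)"
fi
echo "$size_check"
```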
-
- ### Side panel won't open/close
-
- **Solutions:**
-
- 1. **Use keyboard shortcut:**
-
- ```bash
- Ctrl+P # Toggle side panel
- ```
-
- 2. **Check focus:**
- Make sure OLLM CLI window is focused
- Click on the terminal window
-
- 3. **Try slash command:**
- ```bash
- /panel toggle
- ```
-
- ### Keyboard shortcuts not working
-
- **Symptoms:**
-
- Ctrl+K doesn't open commands
- Ctrl+P doesn't toggle panel
- Other shortcuts don't respond
-
- **Causes:**
-
- Terminal intercepting shortcuts
- Wrong keyboard layout
- Focus not on OLLM CLI
-
- **Solutions:**
-
- 1. **Check terminal key bindings:**
- Some terminals use Ctrl+K for other things
- Check terminal preferences
- Disable conflicting shortcuts
-
- 2. **Use alternative shortcuts:**
- Most commands have slash command alternatives
- Type `/help` to see all commands
-
- 3. **Check keyboard layout:**
- Ensure you're using the correct keyboard layout
- Some layouts have different key positions
-
- ---
-
- ## Debug Mode
-
- Debug mode provides detailed logging to help diagnose issues.
-
- ### Enable debug mode
-
- **Using command-line flag:**
-
- ```bash
- ollm --debug
- ```
-
- **Using environment variable:**
-
- ```bash
- # Set log level to debug
- export OLLM_LOG_LEVEL=debug
- ollm
-
- # Or inline
- OLLM_LOG_LEVEL=debug ollm
- ```
-
- **In configuration file:**
-
- ```yaml
- # ~/.ollm/config.yaml
- logging:
-   level: debug # Options: error, warn, info, debug
-   file: ~/.ollm/logs/ollm.log # Optional: log to file
- ```
-
- ### Log levels
-
- | Level | Description | Use Case |
- | ------- | ----------------------- | --------------- |
- | `error` | Only errors | Production use |
- | `warn` | Errors and warnings | Normal use |
- | `info` | General information | Default |
- | `debug` | Detailed debugging info | Troubleshooting |
-
- ### Interpreting debug output
-
- **Connection debugging:**
-
- ```
- [DEBUG] Connecting to Ollama at http://localhost:11434
- [DEBUG] Request: POST /api/generate
- [DEBUG] Response: 200 OK
- ```
-
- **Tool execution debugging:**
-
- ```
- [DEBUG] Executing tool: read-file
- [DEBUG] Tool input: { path: "example.txt" }
- [DEBUG] Tool output: [truncated 1024 bytes]
- [DEBUG] Tool execution time: 45ms
- ```
-
- **Context management debugging:**
-
- ```
- [DEBUG] Context usage: 3456/8192 tokens (42%)
- [DEBUG] VRAM usage: 4.2GB/8GB (52%)
- [DEBUG] Compression triggered at 80% threshold
- [DEBUG] Context compressed: 3456 -> 2048 tokens
- ```
-
- **Model routing debugging:**
-
- ```
- [DEBUG] Model routing: selected llama3.1:8b
- [DEBUG] Routing reason: matches 'general' profile
- [DEBUG] Model capabilities: tools=true, vision=false
- ```
-
- ### Debug output to file
-
- ```bash
- # Redirect debug output to file
- ollm --debug 2> debug.log
-
- # Or configure in settings
- ```
-
- ```yaml
- # ~/.ollm/config.yaml
- logging:
-   level: debug
-   file: ~/.ollm/logs/debug.log
-   maxSize: 10485760 # 10MB
-   maxFiles: 5 # Keep 5 rotated logs
- ```
-
- ### Common debug patterns
-
- **Trace a specific request:**
-
- ```bash
- # Enable debug mode and watch for specific patterns
- OLLM_LOG_LEVEL=debug ollm 2>&1 | grep "POST /api"
- ```
-
- **Monitor tool execution:**
-
- ```bash
- # See all tool calls
- OLLM_LOG_LEVEL=debug ollm 2>&1 | grep "Executing tool"
- ```
-
- **Check context usage:**
-
- ```bash
- # Monitor context and memory
- OLLM_LOG_LEVEL=debug ollm 2>&1 | grep -E "(Context|VRAM)"
- ```
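To see what a filter like `grep -E "(Context|VRAM)"` actually keeps, you can run it over canned lines shaped like the debug output shown earlier, then pipe live `ollm --debug` output through the same filter:

```shell
# Demonstrate the context/VRAM filter on sample lines shaped like the
# debug output above; pipe live 'ollm --debug' output the same way.
sample='[DEBUG] Context usage: 3456/8192 tokens (42%)
[INFO] startup complete
[DEBUG] VRAM usage: 4.2GB/8GB (52%)'
matches=$(printf '%s\n' "$sample" | grep -cE "(Context|VRAM)")
echo "matched $matches of 3 sample lines"
```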
-
- ---
-
- ## Getting Help
-
- If you're still experiencing issues after trying the solutions above, here are additional resources:
-
- ### GitHub Issues
-
- Report bugs or request features:
-
- **Repository:** https://github.com/ollm/ollm-cli
- **Issues:** https://github.com/ollm/ollm-cli/issues
- **Discussions:** https://github.com/ollm/ollm-cli/discussions
-
- **Before creating an issue:**
-
- 1. Search existing issues to avoid duplicates
- 2. Include debug logs (`ollm --debug`)
- 3. Provide system information (OS, Node.js version, Ollama version)
- 4. Include steps to reproduce the problem
- 5. Share relevant configuration (redact sensitive info)
-
- ### Documentation
-
- **README:** [Main documentation](./README.md)
- **Configuration Reference:** [Configuration guide](./UI&Settings/Configuration.md)
- **Roadmap:** [Future features](./DevelopmentRoadmap/Roadmap.md)
-
- ### Community Resources
-
- **Ollama Documentation:** https://github.com/ollama/ollama/tree/main/docs
- **Ollama Discord:** https://discord.gg/ollama
- **Model Context Protocol:** https://modelcontextprotocol.io/
-
- ### System Information
-
- When reporting issues, include:
-
- ```bash
- # Node.js version
- node --version
-
- # npm version
- npm --version
-
- # OLLM CLI version
- ollm --version
-
- # Ollama version
- ollama --version
-
- # Operating system
- # macOS
- sw_vers
-
- # Linux
- lsb_release -a
-
- # Windows
- systeminfo | findstr /B /C:"OS Name" /C:"OS Version"
-
- # Available models
- ollama list
- ```
-
- ### Diagnostic checklist
-
- Before seeking help, verify:
-
- [ ] Node.js 20+ is installed
- [ ] Ollama is running and accessible
- [ ] At least one model is downloaded
- [ ] OLLM CLI is installed globally
- [ ] `ollm --version` works
- [ ] Debug mode shows detailed logs
- [ ] Configuration file is valid YAML
- [ ] No firewall blocking connections
- [ ] Sufficient disk space and memory
- [ ] Latest version of OLLM CLI installed
-
- ### Quick diagnostic command
-
- ```bash
- # Run comprehensive diagnostic
- ollm --diagnose
-
- # This will check:
- # - Node.js version
- # - Ollama connectivity
- # - Available models
- # - Configuration validity
- # - Tool permissions
- # - Memory availability
- ```
-
- ---
-
- ## Additional Tips
-
- ### Performance optimization
-
- ```yaml
- # ~/.ollm/config.yaml
- performance:
-   # Reduce memory usage
-   context:
-     maxTokens: 4096
-
-   # Faster model for quick tasks
-   routing:
-     defaultProfile: 'fast'
-
-   # Limit tool output
-   tools:
-     truncateOutput: true
-     maxOutputLines: 100
- ```
-
- ### Security best practices
-
- ```yaml
- # ~/.ollm/config.yaml
- security:
-   # Restrict file access
-   tools:
-     file:
-       allowedPaths:
-         - ~/projects
-       deniedPaths:
-         - ~/.ssh
-         - ~/.aws
-
-   # Require confirmation for shell commands
-   policy:
-     shell: 'ask'
-     file: 'auto'
- ```
-
- ### Backup and recovery
-
- ```bash
- # Backup configuration
- cp ~/.ollm/config.yaml ~/.ollm/config.yaml.backup
-
- # Backup session data
- cp -r ~/.ollm/session-data ~/.ollm/session-data.backup
-
- # Restore from backup
- cp ~/.ollm/config.yaml.backup ~/.ollm/config.yaml
- ```
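A single `.backup` file is overwritten on every backup; adding a timestamp keeps a history of copies. A sketch that runs against a scratch directory so it is safe to try, so point `dir` at `~/.ollm` for real use:

```shell
# Timestamped backups avoid clobbering the single .backup copy above.
# Uses a scratch directory; set dir to ~/.ollm for real use.
dir=$(mktemp -d)
touch "$dir/config.yaml"
stamp=$(date +%Y%m%d-%H%M%S)
cp "$dir/config.yaml" "$dir/config.yaml.$stamp"
count=$(ls "$dir" | grep -c '^config.yaml')
echo "$count config.yaml copies in $dir"
```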
-
- ---
-
- **Last Updated:** January 2026
- **Version:** 0.1.0
-
- For the latest troubleshooting information, visit the [GitHub repository](https://github.com/ollm/ollm-cli).