glance-cli 0.13.1 → 0.15.0

package/CHANGELOG.md CHANGED
@@ -5,6 +5,39 @@ All notable changes to glance-cli will be documented in this file.
  The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
  and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
 
+ ## [0.15.0] - 2026-01-06
+
+ ### 🚀 Added: Comprehensive Local Model Support
+ - **Fixed gpt-oss model compatibility**: Added special handling for gpt-oss models that use a "thinking" field in responses
+ - **Tested and verified 7 Ollama models**:
+   - ✅ llama3:latest - Fast, reliable general-purpose model
+   - ✅ gemma3:4b - Lightweight, efficient for quick summaries
+   - ✅ mistral:7b / mistral:latest - Excellent quality responses
+   - ✅ gpt-oss:20b - Advanced local model with reasoning capabilities
+   - ✅ gpt-oss:120b-cloud - Largest model for complex tasks
+   - ✅ deepseek-r1:latest - Strong reasoning and analysis
+ - **100% test success rate**: All models work with --tldr, --key-points, and --eli5 modes
+ - **Improved response parsing**: Better extraction of meaningful content from various model response formats
+
+ ### 🐛 Fixed
+ - **gpt-oss models now properly route to Ollama**: Previously failing gpt-oss models now correctly use the local Ollama endpoint
+ - **Response extraction for thinking-based models**: Models that output reasoning in a "thinking" field now have their responses properly extracted
+ - **Consistent model detection**: Enhanced provider detection to correctly identify local vs cloud models
+
+ ### 📝 Documentation
+ - Added comprehensive list of tested and supported local models in README
+ - Included model-specific notes for optimal usage
+ - Updated examples with various model options
+
+ ## [0.14.0] - 2026-01-06
+
+ ### ✨ Added: Clipboard Copy Feature
+ - **New --copy/-c flag**: Copy summaries directly to clipboard
+ - **Cross-platform support**: Works on macOS, Windows, and Linux via the clipboardy package
+ - **Smart formatting**: Copies raw text for terminal output, formatted output for JSON/markdown
+ - **Updated documentation**: Added examples and help text for the copy feature
+ - **Fixed metadata display**: Now shows actual extracted values instead of placeholders
+
  ## [0.13.1] - 2026-01-05
 
  ### 📦 Optimized: Package Size (Breaking: Major Size Reduction)
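
The 0.15.0 entries above describe two behaviors without showing any code: gpt-oss models being routed to the local Ollama endpoint, and answers being recovered from models that put their reasoning in a separate "thinking" field. Since this diff does not include the package's source, the TypeScript sketch below only illustrates that idea; the response shape, the prefix list, and the isLocalModel/extractAnswer helpers are assumptions, not glance-cli's actual implementation.

```ts
// Sketch only: assumes an Ollama /api/chat response shaped like { message: { content, thinking? } }.
// None of these names come from glance-cli; they are hypothetical.

interface OllamaChatMessage {
  role: string;
  content: string;
  thinking?: string; // reasoning models (e.g. gpt-oss, deepseek-r1) may fill this in
}

interface OllamaChatResponse {
  message: OllamaChatMessage;
}

// Hypothetical provider check: names like "gpt-oss:20b" or "llama3:latest" are treated
// as local Ollama models; anything else would fall through to cloud providers.
const LOCAL_PREFIXES = ["llama3", "gemma3", "mistral", "gpt-oss", "deepseek-r1"];

function isLocalModel(model: string): boolean {
  return LOCAL_PREFIXES.some((p) => model === p || model.startsWith(`${p}:`));
}

// Prefer the final answer in `content`; fall back to `thinking` when a model
// places its entire output there instead.
function extractAnswer(res: OllamaChatResponse): string {
  const content = (res.message.content ?? "").trim();
  return content.length > 0 ? content : (res.message.thinking ?? "").trim();
}

// Minimal usage against a default local Ollama endpoint (port 11434).
async function summarize(model: string, prompt: string): Promise<string> {
  if (!isLocalModel(model)) {
    throw new Error(`"${model}" is not a known local model in this sketch`);
  }
  const resp = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model,
      stream: false,
      messages: [{ role: "user", content: prompt }],
    }),
  });
  return extractAnswer((await resp.json()) as OllamaChatResponse);
}
```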
package/README.md CHANGED
@@ -27,6 +27,7 @@ bun install -g glance-cli # Or: npm install -g glance-cli
  glance https://www.ayiti.ai # AI summary
  glance https://www.ayiti.ai/fr --read # Auto-detects French + voice
  glance https://news.com --output summary.md # Save as markdown
+ glance https://news.com --copy # Copy to clipboard
  ```
 
  **For 100% free local AI:**
@@ -51,6 +52,7 @@ glance <url> --key-points # Bullet points
  glance <url> --eli5 # Simple explanation
  glance <url> --full # Full content (no summary)
  glance <url> --ask "..." # Ask specific question
+ glance <url> --copy # Copy summary to clipboard
  ```
 
  ### 🎤 **Voice & Audio**
@@ -79,9 +81,21 @@ glance <url> --full -l es --voice isabella --read # Override to Spanish + voic
  ```
 
  ### 🤖 **AI Models**
+
+ #### Tested & Supported Local Models (via Ollama)
+ - ✅ **llama3:latest** - Fast, reliable, great for general use
+ - ✅ **gemma3:4b** - Lightweight, efficient for quick summaries
+ - ✅ **mistral:7b** / **mistral:latest** - Excellent quality responses
+ - ✅ **gpt-oss:20b** - Advanced local model with reasoning capabilities
+ - ✅ **gpt-oss:120b-cloud** - Largest model for complex tasks
+ - ✅ **deepseek-r1:latest** - Strong reasoning and analysis
+
  ```bash
  # Free local AI (recommended)
- glance <url> --model llama3
+ glance <url> --model llama3:latest
+ glance <url> --model gemma3:4b
+ glance <url> --model mistral:7b
+ glance <url> --model gpt-oss:20b
 
  # Premium cloud AI (optional, requires API keys)
  glance <url> --model gpt-4o-mini # OpenAI
@@ -101,6 +115,9 @@ glance <url> --free-only # Never use paid APIs
  # Morning news with audio
  glance https://news.ycombinator.com --tldr --read
 
+ # Quick copy for sharing
+ glance https://techcrunch.com/article --tldr --copy
+
  # Documentation lookup
  glance https://nextjs.org/docs --ask "What's the App Router?"
 
@@ -159,6 +176,7 @@ done
  ```bash
  --format <type> # Output format: md, json, plain (default: terminal)
  --output, -o <file> # Save to file (auto-detects format from extension)
+ --copy, -c # Copy summary to clipboard
  --stream # Live streaming output
  ```
 
@@ -197,6 +215,9 @@ export OLLAMA_ENDPOINT=http://localhost:11434
  # 1. Auto language detection + file saving
  glance https://lemonde.fr --output french-article.md # Auto-detects French format
 
+ # 2. Quick sharing workflow
+ glance https://article.com --tldr --copy # Copy summary for instant sharing
+
  # 2. Format override for different use cases
  glance https://news.com --format json --output backup.md # JSON content in .md file
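
The --copy/-c flag introduced in 0.14.0 is described as cross-platform via the clipboardy package, copying raw text for terminal output and keeping formatting for JSON/markdown. The snippet below is a minimal sketch of that kind of wiring under those assumptions; copySummary, its OutputFormat type, and the JSON handling are hypothetical and not taken from glance-cli's code.

```ts
// Minimal sketch, assuming clipboardy as named in the changelog.
// copySummary and OutputFormat are hypothetical, not glance-cli's actual API.
import clipboard from "clipboardy";

type OutputFormat = "terminal" | "md" | "json" | "plain";

async function copySummary(summary: string, format: OutputFormat): Promise<void> {
  // Per the changelog wording: raw text for terminal output, formatted output for JSON.
  const text = format === "json" ? JSON.stringify({ summary }, null, 2) : summary;
  await clipboard.write(text); // clipboardy covers macOS, Windows, and Linux
}
```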