glance-cli 0.10.9 → 0.11.0

This diff shows changes between publicly released package versions as they appear in their respective public registries, and is provided for informational purposes only.
Files changed (3)
  1. package/README.md +5 -19
  2. package/dist/cli.js +262 -267
  3. package/package.json +1 -1
package/README.md CHANGED
@@ -4,7 +4,7 @@
 
  - **100% FREE by default** – Uses local Ollama (no API keys needed!)
  - **Auto language detection** – Detects English, French, Spanish, Haitian Creole
- - **Privacy-first** – Your data stays on your machine
+ - **No tracking** – Your browsing history stays private
  - **Voice-enabled** – Read articles aloud with multilingual support
  - **File output** – Save summaries as markdown, JSON, or plain text
  - **Lightning fast** – Built with Bun and TypeScript
@@ -139,7 +139,7 @@ done
 
  ### **Voice & Audio**
  ```bash
- --read, --speak # Read aloud
+ --read, -r # Read aloud
  --audio-output <file> # Save as MP3
  --voice <name> # Choose voice (nova, onyx, antoine, isabella)
  --list-voices # Show available voices
@@ -179,7 +179,7 @@ done
 
  ```bash
  # AI Providers (optional)
- export OPENAI_API_KEY=sk-...
+ export OPENAI_API_KEY=...
  export GEMINI_API_KEY=...
 
  # Voice (optional, for premium)
@@ -200,7 +200,7 @@ glance https://lemonde.fr --output french-article.md # Auto-detects French for
  # 2. Format override for different use cases
  glance https://news.com --format json --output backup.md # JSON content in .md file
 
- # 3. Use local AI for privacy
+ # 3. Use local AI
  glance https://www.ayiti.ai --model llama3 --free-only
 
  # 4. Match voice to auto-detected language
@@ -214,26 +214,12 @@ glance https://long-article.com --stream
 
  ## 🚀 Performance
 
- - **Fast**: 2-3 seconds with local AI (Ollama)
+ - **Fast**: ~5 seconds with local AI (Ollama)
  - **Efficient**: Cheerio-based content extraction
- - **Smart caching**: Reduces redundant API calls
  - **Lightweight**: ~8MB bundle size
 
  ---
 
- ## 📊 Why Glance?
-
- | Feature | Glance + Ollama | Traditional Tools |
- |---------|-----------------|-------------------|
- | **Cost** | $0 forever | $20-100/month |
- | **Privacy** | 100% local | Cloud-based |
- | **Voice** | Built-in | Often separate |
- | **Speed** | 2-3 seconds | 10-30 seconds |
- | **Multilingual** | Full support | Limited |
- | **Offline** | Works offline | Requires internet |
-
- ---
-
  ## 🤝 Contributing
 
  Contributions welcome! Check out our [Contributing Guide](CONTRIBUTING.md).