jasper-context-compactor 0.2.1 → 0.3.0

Files changed (3):

1. package/README.md     +80 −16
2. package/cli.js        +99 −50
3. package/package.json  +1 −1
package/README.md CHANGED
@@ -1,27 +1,65 @@
- # Context Compactor
+ # Jasper Context Compactor
  
  > Token-based context compaction for OpenClaw with local models (MLX, llama.cpp, Ollama)
  
- ## Why?
+ ## The Problem
  
- Local LLM servers don't report context overflow errors like cloud APIs do. OpenClaw's built-in compaction relies on these errors to trigger. This plugin estimates tokens client-side and proactively summarizes older messages before hitting the model's limit.
+ Local LLMs don't report context overflow errors like cloud APIs do. When context gets too long, they either:
+ - Silently truncate your conversation
+ - Return garbage output
+ - Crash without explanation
+
+ OpenClaw's built-in compaction relies on error signals that local models don't provide.
+
+ ## The Solution
+
+ Jasper Context Compactor estimates tokens client-side and proactively summarizes older messages before hitting your model's limit. No more broken conversations.
  
  ## Quick Start
  
  ```bash
- # One command setup (installs + configures)
  npx jasper-context-compactor setup
+ ```
  
- # Restart gateway
+ **The setup will:**
+
+ 1. ✅ **Back up your config** — Saves `openclaw.json` to `~/.openclaw/backups/` with restore instructions
+ 2. ✅ **Ask permission** — Won't read your config without consent
+ 3. ✅ **Detect your model** — Suggests appropriate token limits based on your setup
+ 4. ✅ **Let you customize** — Enter your own values if auto-detection doesn't match
+ 5. ✅ **Update config safely** — Adds the plugin with your chosen settings
+
+ Then restart OpenClaw:
+ ```bash
  openclaw gateway restart
  ```
  
- That's it! The setup command:
- - Copies plugin files to `~/.openclaw/extensions/context-compactor/`
- - Adds plugin config to `openclaw.json` with sensible defaults
+ ## Privacy
+
+ 🔒 **Everything runs 100% locally.** Nothing is sent to external servers.
+
+ The setup only reads your local `openclaw.json` file (with your permission) to detect your model and suggest appropriate limits.
+
+ ## How It Works
+
+ 1. Before each message, estimates total context tokens (chars ÷ 4)
+ 2. If over `maxTokens`, splits messages into "old" and "recent"
+ 3. Summarizes old messages using your session model
+ 4. Injects summary as context — conversation continues seamlessly
+
+ ## Commands
+
+ After setup, use these in chat:
+
+ | Command | Description |
+ |---------|-------------|
+ | `/context-stats` | Show current token usage and limits |
+ | `/compact-now` | Clear cache and force fresh compaction |
  
  ## Configuration
  
+ The setup configures these values in `~/.openclaw/openclaw.json`:
+
  ```json
  {
    "plugins": {
@@ -40,17 +78,43 @@ That's it! The setup command:
  }
  ```
  
- ## Commands
+ | Option | Description |
+ |--------|-------------|
+ | `maxTokens` | Trigger compaction above this (set to ~80% of your model's context) |
+ | `keepRecentTokens` | Recent context to preserve (default: 25% of max) |
+ | `summaryMaxTokens` | Max tokens for the summary (default: 12.5% of max) |
+ | `charsPerToken` | Token estimation ratio (4 works for English) |
  
- - `/context-stats` Show current token usage
- - `/compact-now` — Force fresh compaction on next message
+ ## Restoring Your Config
  
- ## How It Works
+ Setup always backs up first. To restore:
+
+ ```bash
+ # List backups
+ ls ~/.openclaw/backups/
+
+ # Restore (use the timestamp from your backup)
+ cp ~/.openclaw/backups/openclaw-2026-02-11T08-00-00-000Z.json ~/.openclaw/openclaw.json
+
+ # Restart
+ openclaw gateway restart
+ ```
+
+ ## Uninstall
+
+ ```bash
+ # Remove plugin files
+ rm -rf ~/.openclaw/extensions/context-compactor
+
+ # Remove from config (edit openclaw.json and delete the context-compactor entry)
+ # Or restore from backup
+ ```
+
+ ## Links
  
- 1. Before each agent turn, estimates total context tokens
- 2. If over `maxTokens`, splits messages into "old" and "recent"
- 3. Summarizes old messages using the session model
- 4. Injects summary + recent messages as context
+ - **npm:** https://www.npmjs.com/package/jasper-context-compactor
+ - **GitHub:** https://github.com/E-x-O-Entertainment-Studios-Inc/openclaw-context-compactor
+ - **ClawHub:** https://clawhub.ai/skills/context-compactor
  
  ## License
  
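The README's "How It Works" steps (estimate tokens as chars ÷ 4, then split the history into "old" and "recent" when over the limit) can be sketched as follows. This is a minimal illustration with assumed message shapes (plain `{ content }` objects) and hypothetical function names, not the plugin's actual implementation:

```javascript
// Sketch of the compaction flow described in the README (assumptions:
// messages are { content: string } objects; names are illustrative only).
const CHARS_PER_TOKEN = 4;

function estimateTokens(messages) {
  // Token count approximated as total characters / 4, per the README.
  const chars = messages.reduce((sum, m) => sum + m.content.length, 0);
  return Math.ceil(chars / CHARS_PER_TOKEN);
}

function splitForCompaction(messages, maxTokens, keepRecentTokens) {
  if (estimateTokens(messages) <= maxTokens) {
    return { old: [], recent: messages }; // under the limit: nothing to do
  }
  // Walk backwards, keeping the newest messages up to keepRecentTokens;
  // everything earlier becomes the "old" slice that gets summarized.
  const recent = [];
  let budget = keepRecentTokens * CHARS_PER_TOKEN;
  let i = messages.length - 1;
  for (; i >= 0 && budget >= messages[i].content.length; i--) {
    budget -= messages[i].content.length;
    recent.unshift(messages[i]);
  }
  return { old: messages.slice(0, i + 1), recent };
}
```

The "old" slice would then be sent to the session model for summarization, with the summary injected ahead of the "recent" messages.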
package/cli.js CHANGED
@@ -1,6 +1,6 @@
  #!/usr/bin/env node
  /**
-  * Context Compactor CLI
+  * Jasper Context Compactor CLI
   * Setup script with interactive token limit configuration
   */
  
@@ -10,7 +10,9 @@ const os = require('os');
  const readline = require('readline');
  
  const OPENCLAW_CONFIG = path.join(os.homedir(), '.openclaw', 'openclaw.json');
+ const OPENCLAW_BACKUPS = path.join(os.homedir(), '.openclaw', 'backups');
  const OPENCLAW_EXTENSIONS = path.join(os.homedir(), '.openclaw', 'extensions', 'context-compactor');
+ const OLD_EXTENSIONS = path.join(os.homedir(), '.openclaw', 'extensions', 'openclaw-context-compactor');
  
  function log(msg) {
    console.log(`📦 ${msg}`);
@@ -34,23 +36,28 @@ function prompt(question) {
    });
  }
  
+ function backupConfig() {
+   if (!fs.existsSync(OPENCLAW_CONFIG)) return null;
+
+   fs.mkdirSync(OPENCLAW_BACKUPS, { recursive: true });
+   const timestamp = new Date().toISOString().replace(/[:.]/g, '-');
+   const backupPath = path.join(OPENCLAW_BACKUPS, `openclaw-${timestamp}.json`);
+
+   fs.copyFileSync(OPENCLAW_CONFIG, backupPath);
+   return backupPath;
+ }
+
  async function detectModelContextWindow(config) {
-   // Try to detect from OpenClaw config
    const model = config?.agents?.defaults?.model?.primary;
-
    if (!model) return null;
  
-   // Common context windows (conservative estimates)
    const knownContexts = {
-     // Anthropic
      'anthropic/claude-opus': 200000,
      'anthropic/claude-sonnet': 200000,
      'anthropic/claude-haiku': 200000,
-     // OpenAI
      'openai/gpt-4': 128000,
      'openai/gpt-4-turbo': 128000,
      'openai/gpt-3.5-turbo': 16000,
-     // Local models (common sizes)
      'mlx': 8000,
      'ollama': 8000,
      'llama': 8000,
@@ -58,7 +65,6 @@ async function detectModelContextWindow(config) {
      'qwen': 32000,
    };
  
-   // Check for exact match first
    for (const [pattern, tokens] of Object.entries(knownContexts)) {
      if (model.toLowerCase().includes(pattern.toLowerCase())) {
        return { model, tokens, source: 'detected' };
@@ -69,18 +75,49 @@ async function detectModelContextWindow(config) {
  }
  
  async function setup() {
-   log('Context Compactor — Setup');
-   console.log('='.repeat(50));
+   console.log('');
+   log('Jasper Context Compactor — Setup');
+   console.log('='.repeat(55));
+
+   // Explain what we're going to do
+   console.log('');
+   console.log('  This setup will:');
+   console.log('');
+   console.log('  1. Copy plugin files to ~/.openclaw/extensions/');
+   console.log('  2. Add plugin config to your openclaw.json');
+   console.log('  3. Help you configure token limits for your model');
+   console.log('');
+   console.log('  🔒 Privacy: Everything runs locally. Nothing is sent externally.');
+   console.log('  📁 Your config will be backed up before any changes.');
+   console.log('');
+
+   const proceed = await prompt('  Continue? (y/n): ');
+   if (proceed.toLowerCase() !== 'y' && proceed.toLowerCase() !== 'yes') {
+     console.log('\n  Setup cancelled.\n');
+     process.exit(0);
+   }
  
    // Check if OpenClaw is installed
    const openclawDir = path.join(os.homedir(), '.openclaw');
    if (!fs.existsSync(openclawDir)) {
+     console.log('');
      error('OpenClaw not detected (~/.openclaw not found)');
-     console.log('Install OpenClaw first: https://docs.openclaw.ai');
+     console.log('  Install OpenClaw first: https://docs.openclaw.ai');
      process.exit(1);
    }
  
-   // Copy plugin files to extensions directory
+   // Backup config FIRST
+   console.log('');
+   log('Backing up config...');
+   const backupPath = backupConfig();
+   if (backupPath) {
+     console.log(`  ✓ Backup saved: ${backupPath}`);
+     console.log('  → Restore with: cp "' + backupPath + '" ~/.openclaw/openclaw.json');
+   } else {
+     console.log('  ⚠ No existing config to backup');
+   }
+
+   // Copy plugin files
    console.log('');
    log('Installing plugin files...');
    fs.mkdirSync(OPENCLAW_EXTENSIONS, { recursive: true });
@@ -97,6 +134,16 @@ async function setup() {
      }
    }
  
+   // Clean up old package name
+   if (fs.existsSync(OLD_EXTENSIONS)) {
+     try {
+       fs.rmSync(OLD_EXTENSIONS, { recursive: true });
+       console.log('  ✓ Removed old openclaw-context-compactor extension');
+     } catch (e) {
+       console.log(`  ⚠ Could not remove old extension: ${e.message}`);
+     }
+   }
+
    // Load existing config
    let config = {};
    if (fs.existsSync(OPENCLAW_CONFIG)) {
@@ -115,12 +162,10 @@ async function setup() {
    console.log('  To set the right limit, I can check your OpenClaw config');
    console.log('  to see what model you\'re using.');
    console.log('');
-   console.log('  🔒 Privacy: This runs 100% locally. Nothing is sent externally.');
-   console.log('');
  
    const checkConfig = await prompt('  Check your config for model info? (y/n): ');
  
-   let maxTokens = 8000; // Default
+   let maxTokens = 8000;
    let detectedInfo = null;
  
    if (checkConfig.toLowerCase() === 'y' || checkConfig.toLowerCase() === 'yes') {
@@ -131,7 +176,6 @@ async function setup() {
      console.log(`  ✓ Detected model: ${detectedInfo.model}`);
      console.log(`  ✓ Context window: ~${detectedInfo.tokens.toLocaleString()} tokens`);
  
-     // Suggest a safe limit (leave 20% headroom)
      const suggested = Math.floor(detectedInfo.tokens * 0.8);
      console.log(`  → Suggested maxTokens: ${suggested.toLocaleString()} (80% of context)`);
      console.log('');
@@ -150,13 +194,13 @@ async function setup() {
      }
    }
  
-   // If we still don't have a good value, ask manually
+   // Manual entry if needed
    if (maxTokens === 8000 && (!detectedInfo || !detectedInfo.tokens)) {
      console.log('');
      console.log('  Common context windows:');
-     console.log('  • MLX / llama.cpp (small): 4,000 - 8,000');
-     console.log('  • Mistral / Qwen (medium): 32,000');
-     console.log('  • Claude / GPT-4 (large): 128,000+');
+     console.log('  • MLX / llama.cpp (small):  4,000 - 8,000');
+     console.log('  • Mistral / Qwen (medium):  32,000');
+     console.log('  • Claude / GPT-4 (large):   128,000+');
      console.log('');
      console.log('  Check your model\'s docs or LM Studio/Ollama settings.');
      console.log('  Config location: ~/.openclaw/openclaw.json');
@@ -168,12 +212,12 @@ async function setup() {
      }
    }
  
-   // Calculate keepRecentTokens (25% of max)
+   // Calculate derived values
    const keepRecentTokens = Math.floor(maxTokens * 0.25);
    const summaryMaxTokens = Math.floor(maxTokens * 0.125);
  
    console.log('');
-   console.log(`  Configuration:`);
+   console.log('  Final configuration:');
    console.log(`  maxTokens: ${maxTokens.toLocaleString()}`);
    console.log(`  keepRecentTokens: ${keepRecentTokens.toLocaleString()} (25%)`);
    console.log(`  summaryMaxTokens: ${summaryMaxTokens.toLocaleString()} (12.5%)`);
@@ -182,11 +226,9 @@ async function setup() {
    console.log('');
    log('Updating OpenClaw config...');
  
-   // Initialize plugins structure if needed
    if (!config.plugins) config.plugins = {};
    if (!config.plugins.entries) config.plugins.entries = {};
  
-   // Add/update plugin config
    config.plugins.entries['context-compactor'] = {
      enabled: true,
      config: {
@@ -197,49 +239,56 @@ async function setup() {
      }
    };
  
-   // Write back with nice formatting
    fs.writeFileSync(OPENCLAW_CONFIG, JSON.stringify(config, null, 2) + '\n');
    console.log('  ✓ Saved to openclaw.json');
  
+   // Done!
    console.log('');
-   console.log('='.repeat(50));
+   console.log('='.repeat(55));
    log('Setup complete!');
    console.log('');
-   console.log('Next steps:');
-   console.log('  1. Restart OpenClaw: openclaw gateway restart');
-   console.log('  2. Check status in chat: /context-stats');
+   console.log('  Next steps:');
+   console.log('  1. Restart OpenClaw: openclaw gateway restart');
+   console.log('  2. Check status in chat: /context-stats');
    console.log('');
-   console.log('To adjust later, edit ~/.openclaw/openclaw.json');
-   console.log('under plugins.entries["context-compactor"].config');
+   console.log('  To adjust later:');
+   console.log('  Edit ~/.openclaw/openclaw.json');
+   console.log('  Look for plugins.entries["context-compactor"].config');
+   console.log('');
+   if (backupPath) {
+     console.log('  To restore original config:');
+     console.log(`  cp "${backupPath}" ~/.openclaw/openclaw.json`);
+     console.log('');
+   }
  }
  
  function showHelp() {
    console.log(`
- Context Compactor
- Token-based context compaction for local models
+ Jasper Context Compactor
+ Token-based context compaction for local models (MLX, llama.cpp, Ollama)
  
  USAGE:
-   npx openclaw-context-compactor setup    Install and configure plugin
-   npx openclaw-context-compactor help     Show this help
+   npx jasper-context-compactor setup    Install and configure plugin
+   npx jasper-context-compactor help     Show this help
  
  WHAT IT DOES:
-   - Copies plugin files to ~/.openclaw/extensions/context-compactor/
-   - Detects your model's context window (with permission)
-   - Configures appropriate token limits
-   - Enables automatic context compaction for local models
+   Local LLMs don't report context overflow errors like cloud APIs.
+   This plugin estimates tokens client-side and proactively summarizes
+   older messages before hitting your model's context limit.
  
- CONFIGURATION:
-   After setup, adjust in openclaw.json:
-
-   "context-compactor": {
-     "enabled": true,
-     "config": {
-       "maxTokens": 8000,          // Your model's context limit minus buffer
-       "keepRecentTokens": 2000    // Recent context to preserve
-     }
-   }
+ SETUP PROCESS:
+   1. Backs up your openclaw.json (with restore instructions)
+   2. Copies plugin files to ~/.openclaw/extensions/
+   3. Asks permission before reading your config
+   4. Detects your model and suggests appropriate token limits
+   5. Lets you customize or enter values manually
+   6. Updates openclaw.json with the plugin config
+
+ PRIVACY:
+   Everything runs 100% locally. Nothing is sent to external servers.
+   We only read your local config file (with your permission).
  
- COMMANDS (in chat):
+ COMMANDS (in chat after setup):
    /context-stats    Show current token usage
    /compact-now      Force fresh compaction
  `);
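The setup flow above derives its numbers from fixed ratios: suggested `maxTokens` is 80% of the detected context window, `keepRecentTokens` is 25% of `maxTokens`, and `summaryMaxTokens` is 12.5%. A standalone sketch of that arithmetic (hypothetical helper, not part of the CLI):

```javascript
// Illustrative helper (not in the package): the ratios setup() applies.
// suggested maxTokens = 80% of the context window (20% headroom),
// keepRecentTokens = 25% of maxTokens, summaryMaxTokens = 12.5%.
function deriveLimits(contextWindow) {
  const maxTokens = Math.floor(contextWindow * 0.8);
  return {
    maxTokens,
    keepRecentTokens: Math.floor(maxTokens * 0.25),
    summaryMaxTokens: Math.floor(maxTokens * 0.125),
  };
}

console.log(deriveLimits(32000));
// { maxTokens: 25600, keepRecentTokens: 6400, summaryMaxTokens: 3200 }
```

So for a 32,000-token model (e.g. the Mistral/Qwen entries in `knownContexts`), setup would suggest compacting above 25,600 tokens while preserving the most recent 6,400.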
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
    "name": "jasper-context-compactor",
-   "version": "0.2.1",
+   "version": "0.3.0",
    "description": "Context compaction plugin for OpenClaw - works with local models (MLX, llama.cpp) that don't report token limits",
    "main": "index.ts",
    "bin": {