jasper-context-compactor 0.2.2 → 0.3.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +80 -16
- package/cli.js +89 -51
- package/package.json +1 -1
package/README.md  CHANGED
@@ -1,27 +1,65 @@
-# Context Compactor
+# Jasper Context Compactor
 
 > Token-based context compaction for OpenClaw with local models (MLX, llama.cpp, Ollama)
 
-##
+## The Problem
 
-Local
+Local LLMs don't report context overflow errors like cloud APIs do. When context gets too long, they either:
+- Silently truncate your conversation
+- Return garbage output
+- Crash without explanation
+
+OpenClaw's built-in compaction relies on error signals that local models don't provide.
+
+## The Solution
+
+Jasper Context Compactor estimates tokens client-side and proactively summarizes older messages before hitting your model's limit. No more broken conversations.
 
 ## Quick Start
 
 ```bash
-# One command setup (installs + configures)
 npx jasper-context-compactor setup
+```
 
-
+**The setup will:**
+
+1. ✅ **Back up your config** — Saves `openclaw.json` to `~/.openclaw/backups/` with restore instructions
+2. ✅ **Ask permission** — Won't read your config without consent
+3. ✅ **Detect your model** — Suggests appropriate token limits based on your setup
+4. ✅ **Let you customize** — Enter your own values if auto-detection doesn't match
+5. ✅ **Update config safely** — Adds the plugin with your chosen settings
+
+Then restart OpenClaw:
+```bash
 openclaw gateway restart
 ```
 
-
-
-
+## Privacy
+
+🔒 **Everything runs 100% locally.** Nothing is sent to external servers.
+
+The setup only reads your local `openclaw.json` file (with your permission) to detect your model and suggest appropriate limits.
+
+## How It Works
+
+1. Before each message, estimates total context tokens (chars ÷ 4)
+2. If over `maxTokens`, splits messages into "old" and "recent"
+3. Summarizes old messages using your session model
+4. Injects summary as context — conversation continues seamlessly
+
+## Commands
+
+After setup, use these in chat:
+
+| Command | Description |
+|---------|-------------|
+| `/context-stats` | Show current token usage and limits |
+| `/compact-now` | Clear cache and force fresh compaction |
 
 ## Configuration
 
+The setup configures these values in `~/.openclaw/openclaw.json`:
+
 ```json
 {
   "plugins": {
@@ -40,17 +78,43 @@ That's it! The setup command:
 }
 ```
 
-
+| Option | Description |
+|--------|-------------|
+| `maxTokens` | Trigger compaction above this (set to ~80% of your model's context) |
+| `keepRecentTokens` | Recent context to preserve (default: 25% of max) |
+| `summaryMaxTokens` | Max tokens for the summary (default: 12.5% of max) |
+| `charsPerToken` | Token estimation ratio (4 works for English) |
 
-
-- `/compact-now` — Force fresh compaction on next message
+## Restoring Your Config
 
-
+Setup always backs up first. To restore:
+
+```bash
+# List backups
+ls ~/.openclaw/backups/
+
+# Restore (use the timestamp from your backup)
+cp ~/.openclaw/backups/openclaw-2026-02-11T08-00-00-000Z.json ~/.openclaw/openclaw.json
+
+# Restart
+openclaw gateway restart
+```
+
+## Uninstall
+
+```bash
+# Remove plugin files
+rm -rf ~/.openclaw/extensions/context-compactor
+
+# Remove from config (edit openclaw.json and delete the context-compactor entry)
+# Or restore from backup
+```
+
+## Links
 
-
-
-
-4. Injects summary + recent messages as context
+- **npm:** https://www.npmjs.com/package/jasper-context-compactor
+- **GitHub:** https://github.com/E-x-O-Entertainment-Studios-Inc/openclaw-context-compactor
+- **ClawHub:** https://clawhub.ai/skills/context-compactor
 
 ## License
 
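The estimate-and-split steps in the README's "How It Works" list (chars ÷ 4 estimation, then splitting history into "old" and "recent") can be sketched roughly as below. This is illustrative only: the function names, the `{ role, content }` message shape, and the backwards-walk split strategy are assumptions for the sketch, not the package's actual internals.

```javascript
// Rough sketch of token-based compaction triggering, assuming the
// chars-per-token ratio of 4 that the README suggests for English.
// The { role, content } message shape is assumed for illustration.
const CHARS_PER_TOKEN = 4;

function estimateTokens(messages) {
  const chars = messages.reduce((n, m) => n + m.content.length, 0);
  return Math.ceil(chars / CHARS_PER_TOKEN);
}

// Walk backwards from the newest message, keeping messages until the
// "recent" budget is spent; everything older becomes summarization input.
function splitForCompaction(messages, maxTokens, keepRecentTokens) {
  if (estimateTokens(messages) <= maxTokens) {
    return { old: [], recent: messages }; // under budget: nothing to do
  }
  let budget = keepRecentTokens;
  let cut = messages.length;
  while (cut > 0 && budget - estimateTokens([messages[cut - 1]]) >= 0) {
    budget -= estimateTokens([messages[cut - 1]]);
    cut--;
  }
  return { old: messages.slice(0, cut), recent: messages.slice(cut) };
}
```

With `maxTokens: 8000` and the default `keepRecentTokens` of 25%, a conversation of roughly 40,000 characters (about 10,000 estimated tokens) would trip the split, and only about 2,000 tokens' worth of recent messages would survive uncompacted.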
package/cli.js  CHANGED
@@ -1,6 +1,6 @@
 #!/usr/bin/env node
 /**
- * Context Compactor CLI
+ * Jasper Context Compactor CLI
  * Setup script with interactive token limit configuration
  */
 
@@ -10,6 +10,7 @@ const os = require('os');
 const readline = require('readline');
 
 const OPENCLAW_CONFIG = path.join(os.homedir(), '.openclaw', 'openclaw.json');
+const OPENCLAW_BACKUPS = path.join(os.homedir(), '.openclaw', 'backups');
 const OPENCLAW_EXTENSIONS = path.join(os.homedir(), '.openclaw', 'extensions', 'context-compactor');
 const OLD_EXTENSIONS = path.join(os.homedir(), '.openclaw', 'extensions', 'openclaw-context-compactor');
 
@@ -35,23 +36,28 @@ function prompt(question) {
   });
 }
 
+function backupConfig() {
+  if (!fs.existsSync(OPENCLAW_CONFIG)) return null;
+
+  fs.mkdirSync(OPENCLAW_BACKUPS, { recursive: true });
+  const timestamp = new Date().toISOString().replace(/[:.]/g, '-');
+  const backupPath = path.join(OPENCLAW_BACKUPS, `openclaw-${timestamp}.json`);
+
+  fs.copyFileSync(OPENCLAW_CONFIG, backupPath);
+  return backupPath;
+}
+
 async function detectModelContextWindow(config) {
-  // Try to detect from OpenClaw config
   const model = config?.agents?.defaults?.model?.primary;
-
   if (!model) return null;
 
-  // Common context windows (conservative estimates)
   const knownContexts = {
-    // Anthropic
     'anthropic/claude-opus': 200000,
     'anthropic/claude-sonnet': 200000,
     'anthropic/claude-haiku': 200000,
-    // OpenAI
     'openai/gpt-4': 128000,
     'openai/gpt-4-turbo': 128000,
     'openai/gpt-3.5-turbo': 16000,
-    // Local models (common sizes)
     'mlx': 8000,
     'ollama': 8000,
     'llama': 8000,
@@ -59,7 +65,6 @@ async function detectModelContextWindow(config) {
     'qwen': 32000,
   };
 
-  // Check for exact match first
   for (const [pattern, tokens] of Object.entries(knownContexts)) {
     if (model.toLowerCase().includes(pattern.toLowerCase())) {
       return { model, tokens, source: 'detected' };
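The `detectModelContextWindow` matching logic above is easy to exercise in isolation. This sketch copies the `knownContexts` table from the diff into a standalone function (only the entries visible in the hunks; the viewer elides a few rows), dropping the config-reading part. The model ids at the bottom are hypothetical examples, not values from the package.

```javascript
// Standalone sketch of the detection logic from cli.js. The table is
// copied from the diff above; the model ids tried at the bottom are
// hypothetical examples.
const knownContexts = {
  'anthropic/claude-opus': 200000,
  'anthropic/claude-sonnet': 200000,
  'anthropic/claude-haiku': 200000,
  'openai/gpt-4': 128000,
  'openai/gpt-4-turbo': 128000,
  'openai/gpt-3.5-turbo': 16000,
  'mlx': 8000,
  'ollama': 8000,
  'llama': 8000,
  'qwen': 32000,
};

function detectContextWindow(model) {
  if (!model) return null;
  // First case-insensitive substring match wins, in table insertion order.
  for (const [pattern, tokens] of Object.entries(knownContexts)) {
    if (model.toLowerCase().includes(pattern.toLowerCase())) {
      return { model, tokens, source: 'detected' };
    }
  }
  return null;
}

console.log(detectContextWindow('ollama/llama3').tokens); // 8000
console.log(detectContextWindow('qwen2.5-coder').tokens); // 32000
```

Because matching is by substring in insertion order, broad patterns like `'llama'` act as fallbacks for ids the specific entries miss; an unrecognized id yields `null`, which is why the setup also offers manual entry.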
@@ -70,18 +75,49 @@ async function detectModelContextWindow(config) {
 }
 
 async function setup() {
-  log('
-
+  console.log('');
+  log('Jasper Context Compactor — Setup');
+  console.log('='.repeat(55));
+
+  // Explain what we're going to do
+  console.log('');
+  console.log('  This setup will:');
+  console.log('');
+  console.log('  1. Copy plugin files to ~/.openclaw/extensions/');
+  console.log('  2. Add plugin config to your openclaw.json');
+  console.log('  3. Help you configure token limits for your model');
+  console.log('');
+  console.log('  🔒 Privacy: Everything runs locally. Nothing is sent externally.');
+  console.log('  📁 Your config will be backed up before any changes.');
+  console.log('');
+
+  const proceed = await prompt('  Continue? (y/n): ');
+  if (proceed.toLowerCase() !== 'y' && proceed.toLowerCase() !== 'yes') {
+    console.log('\n  Setup cancelled.\n');
+    process.exit(0);
+  }
 
   // Check if OpenClaw is installed
   const openclawDir = path.join(os.homedir(), '.openclaw');
   if (!fs.existsSync(openclawDir)) {
+    console.log('');
     error('OpenClaw not detected (~/.openclaw not found)');
-    console.log('Install OpenClaw first: https://docs.openclaw.ai');
+    console.log('  Install OpenClaw first: https://docs.openclaw.ai');
     process.exit(1);
   }
 
-  //
+  // Backup config FIRST
+  console.log('');
+  log('Backing up config...');
+  const backupPath = backupConfig();
+  if (backupPath) {
+    console.log(`  ✓ Backup saved: ${backupPath}`);
+    console.log('  → Restore with: cp "' + backupPath + '" ~/.openclaw/openclaw.json');
+  } else {
+    console.log('  ⚠ No existing config to backup');
+  }
+
+  // Copy plugin files
   console.log('');
   log('Installing plugin files...');
   fs.mkdirSync(OPENCLAW_EXTENSIONS, { recursive: true });
@@ -98,7 +134,7 @@ async function setup() {
   }
 }
 
-  // Clean up old package name
+  // Clean up old package name
   if (fs.existsSync(OLD_EXTENSIONS)) {
     try {
       fs.rmSync(OLD_EXTENSIONS, { recursive: true });
@@ -126,12 +162,10 @@ async function setup() {
   console.log('  To set the right limit, I can check your OpenClaw config');
   console.log('  to see what model you\'re using.');
   console.log('');
-  console.log('  🔒 Privacy: This runs 100% locally. Nothing is sent externally.');
-  console.log('');
 
   const checkConfig = await prompt('  Check your config for model info? (y/n): ');
 
-  let maxTokens = 8000;
+  let maxTokens = 8000;
   let detectedInfo = null;
 
   if (checkConfig.toLowerCase() === 'y' || checkConfig.toLowerCase() === 'yes') {
@@ -142,7 +176,6 @@ async function setup() {
     console.log(`  ✓ Detected model: ${detectedInfo.model}`);
     console.log(`  ✓ Context window: ~${detectedInfo.tokens.toLocaleString()} tokens`);
 
-    // Suggest a safe limit (leave 20% headroom)
     const suggested = Math.floor(detectedInfo.tokens * 0.8);
     console.log(`  → Suggested maxTokens: ${suggested.toLocaleString()} (80% of context)`);
     console.log('');
@@ -161,13 +194,13 @@ async function setup() {
     }
   }
 
-  //
+  // Manual entry if needed
   if (maxTokens === 8000 && (!detectedInfo || !detectedInfo.tokens)) {
     console.log('');
     console.log('  Common context windows:');
-    console.log('  • MLX / llama.cpp (small):
-    console.log('  • Mistral / Qwen (medium):
-    console.log('  • Claude / GPT-4 (large):
+    console.log('  • MLX / llama.cpp (small): 4,000 - 8,000');
+    console.log('  • Mistral / Qwen (medium): 32,000');
+    console.log('  • Claude / GPT-4 (large): 128,000+');
     console.log('');
     console.log('  Check your model\'s docs or LM Studio/Ollama settings.');
     console.log('  Config location: ~/.openclaw/openclaw.json');
@@ -179,12 +212,12 @@ async function setup() {
   }
 }
 
-  // Calculate
+  // Calculate derived values
   const keepRecentTokens = Math.floor(maxTokens * 0.25);
   const summaryMaxTokens = Math.floor(maxTokens * 0.125);
 
   console.log('');
-  console.log(
+  console.log('  Final configuration:');
   console.log(`  maxTokens: ${maxTokens.toLocaleString()}`);
   console.log(`  keepRecentTokens: ${keepRecentTokens.toLocaleString()} (25%)`);
   console.log(`  summaryMaxTokens: ${summaryMaxTokens.toLocaleString()} (12.5%)`);
@@ -193,11 +226,9 @@ async function setup() {
   console.log('');
   log('Updating OpenClaw config...');
 
-  // Initialize plugins structure if needed
   if (!config.plugins) config.plugins = {};
   if (!config.plugins.entries) config.plugins.entries = {};
 
-  // Add/update plugin config
   config.plugins.entries['context-compactor'] = {
     enabled: true,
     config: {
@@ -208,49 +239,56 @@ async function setup() {
   }
   };
 
-  // Write back with nice formatting
   fs.writeFileSync(OPENCLAW_CONFIG, JSON.stringify(config, null, 2) + '\n');
   console.log('  ✓ Saved to openclaw.json');
 
+  // Done!
   console.log('');
-  console.log('='.repeat(
+  console.log('='.repeat(55));
   log('Setup complete!');
   console.log('');
-  console.log('Next steps:');
-  console.log('
-  console.log('
+  console.log('  Next steps:');
+  console.log('  1. Restart OpenClaw: openclaw gateway restart');
+  console.log('  2. Check status in chat: /context-stats');
+  console.log('');
+  console.log('  To adjust later:');
+  console.log('    Edit ~/.openclaw/openclaw.json');
+  console.log('    Look for plugins.entries["context-compactor"].config');
   console.log('');
-
-
+  if (backupPath) {
+    console.log('  To restore original config:');
+    console.log(`  cp "${backupPath}" ~/.openclaw/openclaw.json`);
+    console.log('');
+  }
 }
 
 function showHelp() {
   console.log(`
-Context Compactor
-Token-based context compaction for local models
+Jasper Context Compactor
+Token-based context compaction for local models (MLX, llama.cpp, Ollama)
 
 USAGE:
-  npx
-  npx
+  npx jasper-context-compactor setup    Install and configure plugin
+  npx jasper-context-compactor help     Show this help
 
 WHAT IT DOES:
-
-
-
-  - Enables automatic context compaction for local models
+  Local LLMs don't report context overflow errors like cloud APIs.
+  This plugin estimates tokens client-side and proactively summarizes
+  older messages before hitting your model's context limit.
 
-
-
-
-
-
-
-
-
-
-
+  SETUP PROCESS:
+  1. Backs up your openclaw.json (with restore instructions)
+  2. Copies plugin files to ~/.openclaw/extensions/
+  3. Asks permission before reading your config
+  4. Detects your model and suggests appropriate token limits
+  5. Lets you customize or enter values manually
+  6. Updates openclaw.json with the plugin config
+
+  PRIVACY:
+  Everything runs 100% locally. Nothing is sent to external servers.
+  We only read your local config file (with your permission).
 
-COMMANDS (in chat):
+COMMANDS (in chat after setup):
  /context-stats   Show current token usage
  /compact-now     Force fresh compaction
 `);
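The ratios the setup applies in the cli.js changes above (suggest 80% of the detected context window for `maxTokens`, then derive 25% and 12.5% of `maxTokens` for the other two limits) compose as in this small sketch. `deriveLimits` is an illustrative name, not a function in the package; the arithmetic mirrors the `Math.floor` calls in the diff.

```javascript
// Mirrors the arithmetic from cli.js: suggest 80% of the detected
// context window (20% headroom), then derive the other two limits
// from maxTokens. deriveLimits itself is a name chosen for this sketch.
function deriveLimits(contextWindow) {
  const maxTokens = Math.floor(contextWindow * 0.8);   // 20% headroom
  return {
    maxTokens,
    keepRecentTokens: Math.floor(maxTokens * 0.25),    // recent context kept
    summaryMaxTokens: Math.floor(maxTokens * 0.125),   // summary budget
  };
}

console.log(deriveLimits(8000));
// → { maxTokens: 6400, keepRecentTokens: 1600, summaryMaxTokens: 800 }
```

So for an 8,000-token local model the plugin compacts above ~6,400 estimated tokens, preserves the most recent ~1,600, and caps the generated summary at ~800.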
package/package.json  CHANGED