@xelth/eck-snapshot 6.0.4 → 6.0.6
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +27 -2
- package/package.json +1 -1
- package/src/cli/commands/createSnapshot.js +2 -2
package/README.md
CHANGED

@@ -2,6 +2,7 @@
 
 A specialized, AI-native CLI tool designed to create and restore single-file text snapshots of Git repositories. Optimized for providing full project context to Large Language Models (LLMs) and serving as the coordination hub for AI Coders.
 
+
 ## 🎯 The Battle-Tested Workflow & Quick Start
 
 I personally use this tool daily with local AI coding agents (**Claude Code** using Claude, and **OpenCode** using GLM). My reliable, heavily-tested workflow is:
@@ -14,9 +15,9 @@ npm install -g @xelth/eck-snapshot
 ### 2. Initial Context (Full Snapshots)
 Take a full snapshot and feed it to a powerful Web LLM (Senior Architect like **Gemini** or **Grok**). *(Note: **ChatGPT** also works, but you MUST paste the specific prompt provided at the end of the snapshot output as your first prompt).*
 ```bash
-eck-snapshot
+eck-snapshot
 ```
-*(For massive monorepos, slice the context using profiles: `eck-snapshot
+*(For massive monorepos, slice the context using profiles: `eck-snapshot --profile frontend`)*
 
 ### 3. Direct Execution
 Pass the Architect's technical plan to your local Coder agent (Claude Code / OpenCode). The Coder will implement the changes directly in your repository.
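The diff above shows profile-based slicing only from the CLI side (`--profile frontend`). How a profile maps to files is not part of this diff; as a rough illustration only, a profile can be modeled as a set of path prefixes used to filter the snapshotted file list. The `PROFILES` table and `sliceByProfile` helper below are hypothetical names, not eck-snapshot's actual API:

```javascript
// Hypothetical sketch of profile-based slicing. eck-snapshot's real
// profile mechanism is not shown in this diff; names here are invented.
const PROFILES = {
  frontend: ['src/ui/', 'public/'],
  backend: ['src/server/', 'src/cli/'],
};

// Keep only files whose path starts with one of the profile's prefixes.
function sliceByProfile(files, profileName) {
  const prefixes = PROFILES[profileName] ?? [];
  return files.filter((f) => prefixes.some((p) => f.startsWith(p)));
}

const files = ['src/ui/App.jsx', 'src/server/index.js', 'README.md'];
console.log(sliceByProfile(files, 'frontend')); // [ 'src/ui/App.jsx' ]
```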
@@ -48,5 +49,29 @@ This core loop is highly polished, actively maintained, and works exceptionally
 * **⚠️ Skeleton Mode:** Uses `Tree-sitter` and `Babel` to strip function bodies (`--skeleton`), drastically reducing token count.
 * **📊 Telemetry Hub:** Integrated with a Rust-based microservice for tracking agent execution metrics and auto-syncing token estimation weights.
 
+
+## 💡 The Philosophy: Why force a full snapshot?
+
+You've probably noticed a pattern: the longer you chat with an LLM about your codebase, the smarter it gets and the better its code becomes.
+
+How do you get that expert-level understanding from the very first prompt? You can't rely on an agent guessing which isolated files to read based on filenames. You need to force-feed it the entire context at once.
+
+Think of AI models like human engineers. Imagine a person standing next to a massive bookshelf of programming textbooks. The total amount of information available to them is exactly the same before they go to university and after they graduate. Both can open a book, check the table of contents, and look up a formula. Yet, the results they produce are vastly different. Why? Because the graduate has the structural context *inside their head*.
+
+LLMs work the exact same way. Giving an AI a "file search" tool is like putting a beginner next to the bookshelf. Forcing a complete project snapshot into the LLM's massive context window is like giving it a university degree in your specific codebase. That is what `eck-snapshot` does.
+
+## 🗺️ Roadmap
+
+* **NotebookLM Optimization:** Our generated snapshots already work exceptionally well with Google NotebookLM. In the near future, we plan to introduce specific adaptations and context profiles tailored specifically for NotebookLM's document architecture, alongside our support for standard Web LLMs.
+
+## 🔒 Ethical Automation Policy
+
+This project strictly respects the Terms of Service of AI providers. We will **never** implement browser automation tools (like Playwright, Puppeteer, etc.) to spoof human behavior, scrape, or exploit web chat interfaces that are subsidized and intended solely for human use. We explicitly forbid the integration of such deceptive API-bypassing techniques in any official branches of this project.
+
 ## License
 MIT © xelth-com
+
+<br>
+<div align="center">
+<sup>made in Eschborn</sup>
+</div>
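The README describes Skeleton Mode as stripping function bodies via Tree-sitter and Babel ASTs to cut token count. As a deliberately naive, stdlib-only sketch of the same idea (not the tool's actual implementation), one can brace-match each `function` declaration's body and replace it with a stub; a real implementation needs a parser to handle braces inside strings, arrow functions, methods, and so on:

```javascript
// Naive illustration of "skeleton" stripping. The real --skeleton mode
// uses Tree-sitter/Babel; this sketch only brace-matches the body of
// plain `function` declarations and swaps it for a stub comment.
function skeletonize(source) {
  let out = '';
  let i = 0;
  while (i < source.length) {
    const m = /function\s+\w+\s*\([^)]*\)\s*\{/.exec(source.slice(i));
    if (!m) { out += source.slice(i); break; }
    const bodyStart = i + m.index + m[0].length;
    out += source.slice(i, bodyStart); // keep the signature and '{'
    let depth = 1, j = bodyStart;
    while (j < source.length && depth > 0) { // scan to the matching '}'
      if (source[j] === '{') depth++;
      else if (source[j] === '}') depth--;
      j++;
    }
    out += ' /* ... */ }';
    i = j;
  }
  return out;
}

const code = 'function add(a, b) { return a + b; }';
console.log(skeletonize(code)); // function add(a, b) { /* ... */ }
```

The stub keeps signatures intact, so an LLM still sees the project's full API surface while the bodies (usually the bulk of the tokens) are elided.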
package/package.json
CHANGED

@@ -1,6 +1,6 @@
 {
   "name": "@xelth/eck-snapshot",
-  "version": "6.0.4",
+  "version": "6.0.6",
   "description": "A powerful CLI tool to create and restore single-file text snapshots of Git repositories. Optimized for AI context, LLM workflows, and multi-agent Swarm coordination.",
   "main": "index.js",
   "type": "module",
package/src/cli/commands/createSnapshot.js
CHANGED

@@ -927,9 +927,9 @@ export async function createRepoSnapshot(repoPath, options) {
     // Output AI Prompt Suggestion for stubborn LLMs
     console.log('\n🤖 AI PROMPT SUGGESTION (Crucial for ChatGPT, helpful for others):');
     console.log('---------------------------------------------------');
-    console.log(chalk.yellow('💡 Tip:
+    console.log(chalk.yellow('💡 Tip: Gemini and Grok handle large files best. ChatGPT works but can be slow.'));
     console.log('If your AI ignores the file instructions and acts as an external reviewer,');
-    console.log('copy and paste this exact prompt along with
+    console.log('copy and paste this exact text as your FIRST prompt along with the snapshot file:\n');
     console.log(chalk.cyan.bold('Read the SYSTEM DIRECTIVE at the very beginning of the attached file. Immediately assume the role of Senior Architect as instructed, then await my first task.\n'));
 
   } finally {