quackstack 1.0.6 → 1.0.7
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +58 -71
- package/package.json +1 -1
package/README.md
CHANGED
@@ -2,28 +2,27 @@
 
 **Your cracked unpaid intern for all things codebase related!**
 
-QuackStack is an interactive CLI tool that indexes your codebase using AI embeddings and lets you ask questions about it conversationally. Perfect for understanding unfamiliar code, onboarding to new projects, or giving your AI coding assistant persistent context.
-
+QuackStack is an interactive CLI tool that indexes your codebase using local AI embeddings and lets you ask questions about it conversationally. Perfect for understanding unfamiliar code, onboarding to new projects, or giving your AI coding assistant persistent context.
 
 ## ✨ Features
 
 * 🚀 **Zero-config** - Just run `quack` in any project directory
 * 🧠 **Smart code parsing** - Automatically extracts functions and classes
 * 💬 **Interactive REPL** - Ask questions conversationally, stays open until Ctrl+C
-*
-*
+* 🔒 **100% Local embeddings** - No API calls for vector generation, your code stays private
+* 🤖 **AI-powered answers** - Uses OpenAI, Claude, Gemini, DeepSeek, or Mistral for conversational responses
+* 🎯 **Universal AI tool support** - Auto-generate context for Cursor, Windsurf, Cline, Continue, and Aider
 * 📦 **Local database** - Your code stays on your infrastructure
 * 🌍 **Multi-language** - Supports JS/TS, Python, Go, Rust, Java, C/C++, C#, Ruby, PHP, Swift, Kotlin, and more
 
-
 ## 📦 Installation
 
 ### Global Install (Recommended)
 
 ```bash
-pnpm add -g quackstack
-
 npm install -g quackstack
+# or
+pnpm add -g quackstack
 ```
 
 ### Local Development
@@ -35,18 +34,18 @@ pnpm install
 pnpm build
 ```
 
-
 ## ⚙️ Setup
 
 ### 1. Create `.env` in your project root
 
 ```bash
-# REQUIRED
+# REQUIRED: Database for storing code embeddings
 QUACKSTACK_DATABASE_URL=postgresql://user:pass@host:port/dbname
 
-# Choose ONE AI provider
+# REQUIRED: Choose ONE AI provider for conversational answers
+# (Embeddings are computed locally - no API calls!)
 
-# Option 1: OpenAI (
+# Option 1: OpenAI (RECOMMENDED)
 QUACKSTACK_OPENAI_KEY=sk-...
 
 # Option 2: Anthropic Claude
@@ -60,9 +59,6 @@ QUACKSTACK_DEEPSEEK_KEY=sk-...
 
 # Option 5: Mistral AI
 QUACKSTACK_MISTRAL_KEY=...
-
-# NOTE: If using Claude/Gemini/Mistral, you still NEED OpenAI for embeddings:
-QUACKSTACK_EMBEDDING_KEY=sk-...
 ```
 
 ### 2. Initialize database
@@ -72,7 +68,6 @@ npx prisma generate
 npx prisma db push
 ```
 
-
 ## 🚀 Usage
 
 ### Interactive Mode (Default)
@@ -80,34 +75,32 @@ npx prisma db push
 ```bash
 quack
 
-
-🐥 Quack! How can I help? > where is authentication handled?
-
 # Answer appears with context
 # Press Ctrl+C to exit
 ```
 
-### Generate
+### Generate Context for ALL AI Coding Tools
 
 ```bash
-quack --
+quack --context
 
-# Creates
-# -
-# -
-# -
-# -
+# Creates context files for:
+# - Cursor (.cursorrules)
+# - Windsurf (.windsurfrules)
+# - Cline (.clinerules)
+# - Continue (.continue/context.md)
+# - Aider (.aider.conf.yml)
 
-#
+# Your AI coding assistants automatically read these files!
 ```
 
-### Watch Mode (Auto-update
+### Watch Mode (Auto-update Context)
 
 ```bash
 quack --watch
 
 # Watches for file changes
-# Auto-regenerates
+# Auto-regenerates all context files
 # Keep running in background during development
 ```
 
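The comments above describe watch mode at a high level: watch for file changes, then re-index and rewrite every context file. As a rough conceptual sketch only, here is roughly what that loop could look like in TypeScript; the use of chokidar and the `regenerateContextFiles` helper are illustrative assumptions, not the package's actual code:

```ts
// Conceptual sketch of a watch mode like `quack --watch`.
// Not the package's implementation; chokidar and regenerateContextFiles are assumptions.
import chokidar from "chokidar";

// Placeholder for the real work: re-index the changed file and rewrite
// .cursorrules, .windsurfrules, .clinerules, .continue/context.md, .aider.conf.yml.
async function regenerateContextFiles(changedPath: string): Promise<void> {
  console.log(`re-indexing ${changedPath} and refreshing context files...`);
}

const watcher = chokidar.watch(".", {
  ignored: ["**/node_modules/**", "**/.git/**"],
  ignoreInitial: true,
});

watcher.on("change", (filePath) => {
  void regenerateContextFiles(filePath);
});
```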
@@ -119,7 +112,6 @@ quack --reindex
 # Clears old index and re-scans entire codebase
 ```
 
-
 ## 📝 Example Session
 
 ```bash
@@ -132,8 +124,8 @@ Welcome to QuackStack! 🐥
 
 🐥 Quack! How can I help? > how does the search function work?
 
-The search function
-against stored code embeddings using cosine similarity, ranks results,
+The search function uses local embeddings to convert your query into a vector,
+compares it against stored code embeddings using cosine similarity, ranks results,
 and feeds the top matches to the AI for a conversational answer.
 
 Implementation is in src/commands/search.ts
@@ -144,18 +136,16 @@ Implementation is in src/commands/search.ts
 
 [1] src/commands/search.ts (relevance: 87.3%)
 export async function search(query: string, projectName: string) {
-  const queryEmbedding = await aiClient.getEmbeddings(query);
   const snippets = await client.codeSnippet.findMany({
     where: { projectName },
   });
   // ... cosine similarity ranking ...
 }
 
-
 🐥 Quack! How can I help? > where are embeddings generated?
 
-
-
+Embeddings are generated locally using the local-embeddings module.
+No API calls are made for vector generation, keeping your code private.
 
 💡 Want more details? (y/n) > n
 
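The snippet above elides the ranking step behind the `// ... cosine similarity ranking ...` comment. As an illustration of that step, a minimal TypeScript sketch of cosine-similarity ranking over stored embeddings follows; the `Snippet` shape loosely mirrors the `codeSnippet` model shown later in this diff, and the function names plus the `topK` parameter are assumptions rather than the package's actual API:

```ts
// Illustrative sketch only - not the package's actual implementation.
type Snippet = { content: string; filePath: string; embedding: number[] };

// Cosine similarity between two equal-length vectors.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Rank stored snippets against a locally computed query embedding and keep the best matches.
function rankSnippets(queryEmbedding: number[], snippets: Snippet[], topK = 5) {
  return snippets
    .map((s) => ({ ...s, relevance: cosineSimilarity(queryEmbedding, s.embedding) }))
    .sort((a, b) => b.relevance - a.relevance)
    .slice(0, topK);
}
```

Per the README's description, the top-ranked snippets are then handed to the configured chat provider as context for the conversational answer.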
@@ -163,48 +153,46 @@ the getEmbeddings() method with OpenAI's text-embedding-3-large model.
 🎉 Happy coding!
 ```
 
-
 ## 🛠️ How It Works
 
 1. **Scanning** - Finds all code files (ignoring `node_modules`, `.git`, etc.)
 2. **Parsing** - Uses AST parsing to extract functions/classes from JS/TS
-3. **Chunking** - Breaks
-4. **Embedding** - Generates vector embeddings
+3. **Chunking** - Breaks code into logical chunks
+4. **Local Embedding** - Generates vector embeddings **locally** (no API calls!)
 5. **Storage** - Saves to your PostgreSQL/Neon database
-6. **Search** - Semantic search + AI-powered conversational answers
-
+6. **Search** - Semantic search using cosine similarity + AI-powered conversational answers
 
 ## 🎯 Use Cases
 
 - **Context switching** - Quickly understand projects you haven't touched in months
 - **Onboarding** - New team members can ask questions instead of reading docs
 - **Code archaeology** - Find implementations without grepping
-- **AI coding assistants** - Give Cursor/
+- **AI coding assistants** - Give Cursor/Windsurf/Cline/Continue/Aider persistent codebase context
 - **Documentation** - Auto-generate explanations of how things work
-
+- **Privacy-focused** - All embeddings generated locally, no code sent to embedding APIs
 
 ## 📋 Commands Reference
 
 | Command | Description |
 |---------|-------------|
 | `quack` | Start interactive REPL (auto-indexes first time) |
-| `quack --
-| `quack --watch` | Watch mode - auto-update
+| `quack --context` | Generate context files for ALL AI coding tools |
+| `quack --watch` | Watch mode - auto-update context on file changes |
 | `quack --reindex` | Force reindex the entire codebase |
-
+| `quack --cursor` | [DEPRECATED] Use `--context` instead |
 
 ## 🔌 Supported AI Providers
 
-| Provider |
-
-|
-|
-|
-|
-|
-
-**Note:** If you use Claude, Gemini, or Mistral for chat, you still need an OpenAI or DeepSeek key for embeddings.
+| Provider | Used For | Cost | Privacy | Setup |
+|----------|----------|------|---------|-------|
+| **Local** | Embeddings | FREE | 🔒 100% Private | Built-in |
+| OpenAI | Chat answers | $$ | Query only | [Get key](https://platform.openai.com/api-keys) |
+| Anthropic | Chat answers | $$$ | Query only | [Get key](https://console.anthropic.com/) |
+| Gemini | Chat answers | FREE | Query only | [Get key](https://aistudio.google.com/app/apikey) |
+| DeepSeek | Chat answers | $ | Query only | [Get key](https://platform.deepseek.com/) |
+| Mistral | Chat answers | $$ | Query only | [Get key](https://console.mistral.ai/) |
 
+**Privacy Note:** QuackStack generates embeddings **locally** on your machine. Only your natural language queries and retrieved code context are sent to the AI provider for generating conversational answers. Your entire codebase is never sent to any API.
 
 ## 🗄️ Database Schema
 
@@ -212,7 +200,7 @@ the getEmbeddings() method with OpenAI's text-embedding-3-large model.
 model codeSnippet {
   id          Int     @id @default(autoincrement())
   content     String
-  embedding   Json
+  embedding   Json    // Stored as JSON array of numbers
   filePath    String
   projectName String
   language    String?
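Since the diff adds the comment that `embedding` is stored as a JSON array of numbers, a short sketch of how that column could be written and read back with the generated Prisma client may help; the helper names are illustrative only, and the project name follows the README's note that each project is keyed by its current directory name:

```ts
// Illustrative sketch only - assumes the codeSnippet model shown above and a
// generated Prisma client; the helper names are not from the package source.
import path from "node:path";
import { PrismaClient } from "@prisma/client";

const prisma = new PrismaClient();

// Per the README, each project is isolated by projectName = current directory name.
const projectName = path.basename(process.cwd());

async function saveSnippet(content: string, embedding: number[], filePath: string, language?: string) {
  await prisma.codeSnippet.create({
    // The Json column stores the number[] vector as-is.
    data: { content, embedding, filePath, projectName, language },
  });
}

async function loadProjectEmbeddings() {
  const rows = await prisma.codeSnippet.findMany({ where: { projectName } });
  // Json values come back untyped, so cast the vector before ranking.
  return rows.map((r) => ({ ...r, embedding: r.embedding as number[] }));
}
```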
@@ -228,12 +216,10 @@ model codeSnippet {
 
 Each project is isolated by `projectName` (uses current directory name).
 
-
 ## 🌍 Supported Languages
 
 JavaScript, TypeScript, Python, Go, Rust, Java, C, C++, C#, Ruby, PHP, Swift, Kotlin, Scala, R, Vue, Svelte
 
-
 ## 🚀 Development
 
 ```bash
@@ -243,24 +229,23 @@ pnpm install
 
 pnpm build
 
+# Run locally
 node dist/cli.cjs
-
-node dist/cli.cjs --cursor
+node dist/cli.cjs --context
 node dist/cli.cjs --watch
 ```
 
-
 ## 🗺️ Roadmap
 
-- [
-- [
-- [ ] Generate automatic codebase documentation
-- [ ] Export Q&A sessions as markdown docs
+- [x] Local embeddings (no API calls!)
+- [x] Support for all major AI coding assistants
 - [ ] VS Code extension
 - [ ] Official Cursor plugin
+- [ ] Export Q&A sessions as markdown docs
+- [ ] Add filtering by file type, date range, author
 - [ ] Support for code diffs and change tracking
 - [ ] Team collaboration features
-
+- [ ] Self-hosted web UI
 
 ## 🤝 Contributing
 
@@ -269,20 +254,22 @@ Contributions welcome! Feel free to:
 - Submit feature requests
 - Open pull requests
 
-
 ## 📄 License
 
 MIT
 
-
 ## 💡 Pro Tips
 
-**
+**Privacy First**: Embeddings are generated locally - your code never leaves your machine during indexing.
+
+**Gemini Free Tier**: Start with Google Gemini for chat responses - it's free and works great for most use cases.
 
-**
+**Universal Context**: Run `quack --context` once to generate context files for ALL major AI coding tools at once.
 
-**
+**Background Watcher**: Run `quack --watch &` in the background to keep context always fresh across all your AI tools.
 
 **Multiple Projects**: Each project gets its own namespace in the database. Just run `quack` in different directories.
 
-**Large Codebases**: First index might take a few minutes. After that, only changed files are re-indexed.
+**Large Codebases**: First index might take a few minutes. After that, only changed files are re-indexed.
+
+**No Vendor Lock-in**: Unlike other tools, QuackStack works with Cursor, Windsurf, Cline, Continue, and Aider - choose your favorite!