@gmickel/gno 0.3.0 → 0.3.5

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,256 +1,262 @@
1
- # GNO: Your Local Second Brain
1
+ # GNO
2
2
 
3
- **Index, Search, and Synthesize Your Entire Digital Life.**
3
+ **Your Local Second Brain** — Index, search, and synthesize your entire digital life.
4
4
 
5
- GNO is a **Local Knowledge Engine** designed for privacy-conscious individuals and AI agents. It indexes your notes, code, documents (Markdown, PDF, Office, and more), and meeting transcripts, providing lightning-fast semantic search and AI-powered answers—all on your machine.
5
+ [![npm](./assets/badges/npm.svg)](https://www.npmjs.com/package/@gmickel/gno)
6
+ [![MIT License](./assets/badges/license.svg)](./LICENSE)
7
+ [![Website](./assets/badges/website.svg)](https://gno.sh)
8
+ [![Twitter](./assets/badges/twitter.svg)](https://twitter.com/gmickel)
9
+
10
+ GNO is a local knowledge engine for privacy-conscious developers and AI agents. Index your notes, code, PDFs, and Office docs. Get hybrid search (BM25 + vector + reranking) and AI-powered answers—all running 100% on your machine.
6
11
 
7
12
  ---
8
13
 
9
- ## ✨ Key Features
14
+ ## Contents
10
15
 
11
- * **Universal Indexing**: Effortlessly ingest and search across Markdown, PDF, DOCX, XLSX, PPTX, and plain text files.
12
- * **Hybrid Search Pipeline**: Combines **BM25 keyword search** with **vector semantic search** and **AI re-ranking** for unparalleled retrieval accuracy.
13
- * **Local LLM Integration**: Get grounded AI answers with citations using **node-llama-cpp** and auto-downloaded GGUF models. No external services, maximum privacy.
14
- * **Agent-First Design (MCP)**: Seamlessly integrate GNO with AI agents via the Model Context Protocol (MCP) server.
15
- * **Deterministic Output**: Stable, schema-driven JSON, file-line, and markdown outputs for reliable scripting.
16
- * **Multilingual Support**: Robust handling of multiple languages in indexing and retrieval.
17
- * **Privacy-Preserving**: All processing happens locally. Your data never leaves your device.
18
- * **World-Class Engineering**: Spec-driven development, rigorous testing, and eval gates ensure reliability and quality.
16
+ - [Quick Start](#quick-start)
17
+ - [Installation](#installation)
18
+ - [Search Modes](#search-modes)
19
+ - [Agent Integration](#agent-integration)
20
+ - [How It Works](#how-it-works)
21
+ - [Features](#features)
22
+ - [Local Models](#local-models)
23
+ - [Architecture](#architecture)
24
+ - [Development](#development)
19
25
 
20
26
  ---
21
27
 
22
- ## 🚀 Quick Start
28
+ ## Quick Start
23
29
 
24
- Get searching in minutes with the 3-command workflow:
30
+ ```bash
31
+ gno init ~/notes --name notes # Point at your docs
32
+ gno index # Build search index
33
+ gno query "auth best practices" # Hybrid search
34
+ gno ask "summarize the API" --answer # AI answer with citations
35
+ ```
25
36
 
26
- 1. **Initialize your knowledge base**:
27
- ```sh
28
- # Create a collection for your notes (adjust path and name as needed)
29
- gno init ~/my-notes --name notes --pattern "**/*.md"
37
+ ---
30
38
 
31
- # Full index: sync files + generate embeddings
32
- gno index
33
- ```
39
+ ## Installation
34
40
 
35
- 2. **Ask a question**:
36
- ```sh
37
- # Get a direct, cited answer from your documents
38
- gno ask "What are the best practices for API authentication?" --collection notes
41
+ ### Install GNO
39
42
 
40
- # Search with keywords or natural language
41
- gno query "Q4 planning meeting summary" --collection notes
42
- ```
43
+ Requires [Bun](https://bun.sh/) >= 1.0.0.
43
44
 
44
- 3. **Explore your data**:
45
- ```sh
46
- # Retrieve specific document content
47
- gno get "notes/2024-01-15.md"
45
+ ```bash
46
+ bun install -g @gmickel/gno
47
+ ```
48
48
 
49
- # Get results in a machine-readable format for agents
50
- gno search "project deadlines" --json -n 10
51
- ```
49
+ **macOS**: Vector search requires Homebrew SQLite:
52
50
 
53
- ---
51
+ ```bash
52
+ brew install sqlite3
53
+ ```
54
+
55
+ Verify everything works:
54
56
 
55
- ## 🧠 For Humans & AI Agents
57
+ ```bash
58
+ gno doctor
59
+ ```
56
60
 
57
- GNO is built for both worlds:
61
+ ### Connect to AI Agents
58
62
 
59
- * **For Humans**: A powerful, yet intuitive CLI to quickly find information, get answers, and explore your local knowledge base.
60
- * **For AI Agents**: Exposes a stable MCP server and structured output formats (`--json`, `--files`) for seamless integration with LLMs and agentic workflows.
63
+ #### MCP Server (Claude Desktop, Cursor, Zed, etc.)
61
64
 
62
- ---
65
+ One command to add GNO to your AI assistant:
63
66
 
64
- ## 🔎 Search Modes
67
+ ```bash
68
+ gno mcp install # Claude Desktop (default)
69
+ gno mcp install --target cursor # Cursor
70
+ gno mcp install --target claude-code # Claude Code CLI
71
+ gno mcp install --target zed # Zed
72
+ gno mcp install --target windsurf # Windsurf
73
+ gno mcp install --target codex # OpenAI Codex CLI
74
+ gno mcp install --target opencode # OpenCode
75
+ gno mcp install --target amp # Amp
76
+ gno mcp install --target lmstudio # LM Studio
77
+ gno mcp install --target librechat # LibreChat
78
+ ```
65
79
 
66
- GNO offers multiple search strategies to suit your needs:
80
+ Check status: `gno mcp status`
67
81
 
68
- | Command | Mode | Description | Best For |
69
- | :---------- | :------------- | :-------------------------------------------------------------- | :----------------------------------------------- |
70
- | `gno search`| **BM25** | Fast, keyword-based full-text search. | Exact phrase matching, known terms. |
71
- | `gno vsearch`| **Vector** | Semantic search based on meaning, not just keywords. | Natural language queries, conceptual understanding. |
72
- | `gno query` | **Hybrid** | Combines BM25 and Vector search with LLM reranking and fusion. | Highest accuracy, nuanced understanding. |
73
- | `gno ask` | **RAG-focused**| Hybrid search providing a synthesized, cited answer from results. | Getting direct answers to complex questions. |
82
+ #### Skills (Claude Code, Codex, OpenCode)
74
83
 
75
- ---
84
+ Skills integrate via CLI—no MCP overhead:
76
85
 
77
- ## 🤖 Agent Integration
86
+ ```bash
87
+ gno skill install --scope user # User-wide
88
+ gno skill install --target codex # Codex
89
+ gno skill install --target all # Both Claude + Codex
90
+ ```
78
91
 
79
- GNO is designed to be the knowledge backbone for your AI agents.
92
+ > **Full setup guide**: [MCP Integration](https://gno.sh/docs/MCP/) · [CLI Reference](https://gno.sh/docs/CLI/)
80
93
 
81
- ### CLI Output Formats
94
+ ---
82
95
 
83
- Use `--json` or `--files` for machine-readable output:
96
+ ## Search Modes
84
97
 
85
- ```sh
86
- # Get JSON results for LLM processing
87
- gno query "meeting notes on user feedback" --json -n 5
98
+ | Command | Mode | Best For |
99
+ |:--------|:-----|:---------|
100
+ | `gno search` | BM25 | Exact phrases, code identifiers |
101
+ | `gno vsearch` | Vector | Natural language, concepts |
102
+ | `gno query` | Hybrid | Best accuracy (BM25 + vector + reranking) |
103
+ | `gno ask --answer` | RAG | Direct answers with citations |
88
104
 
89
- # Get file paths and scores for agent tool use
90
- gno search "API design" --files --min-score 0.3
105
+ ```bash
106
+ gno search "handleAuth" # Find exact matches
107
+ gno vsearch "error handling patterns" # Semantic similarity
108
+ gno query "database optimization" # Full pipeline
109
+ gno ask "what did we decide" --answer # AI synthesis
91
110
  ```
92
111
 
93
- ### Skill Installation (Recommended for Claude Code/Codex/OpenCode)
112
+ Output formats: `--json`, `--files`, `--csv`, `--md`, `--xml`
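The `--json` output is what agents and scripts should consume. A minimal TypeScript sketch of a consumer — the result shape (`path`, `score`, `snippet`) is an assumption for illustration, so verify against the actual `gno query --json` output on your machine:

```typescript
// Hypothetical hit shape for `gno query --json` (assumed, not the documented schema).
interface GnoHit {
  path: string;
  score: number; // normalized 0-1
  snippet: string;
}

// Keep only confident hits and format them as context lines for an agent prompt.
function formatHits(jsonText: string, minScore = 0.3): string[] {
  const hits = JSON.parse(jsonText) as GnoHit[];
  return hits
    .filter((h) => h.score >= minScore)
    .sort((a, b) => b.score - a.score)
    .map((h) => `${h.path} (${h.score.toFixed(2)}): ${h.snippet}`);
}

const sample = JSON.stringify([
  { path: "notes/auth.md", score: 0.82, snippet: "Use short-lived tokens." },
  { path: "notes/todo.md", score: 0.12, snippet: "Buy milk." },
]);
console.log(formatHits(sample));
```

The same pattern works for piping `gno search ... --json` into any downstream tool.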
94
113
 
95
- Skills integrate via CLI - the agent runs GNO commands directly. No MCP overhead, no context pollution.
114
+ ---
96
115
 
97
- ```bash
98
- gno skill install --scope user # User-wide for Claude Code
99
- gno skill install --target codex # For Codex
100
- gno skill install --target all # Both
101
- ```
116
+ ## Agent Integration
102
117
 
103
- After install, restart your agent. It will detect GNO and can search your indexed documents.
118
+ ### MCP Server
104
119
 
105
- ### MCP Server (For Claude Desktop/Cursor)
120
+ GNO exposes 6 tools via [Model Context Protocol](https://modelcontextprotocol.io):
106
121
 
107
- Exposes an MCP server for GUI-based AI applications.
122
+ | Tool | Description |
123
+ |:-----|:------------|
124
+ | `gno_search` | BM25 keyword search |
125
+ | `gno_vsearch` | Vector semantic search |
126
+ | `gno_query` | Hybrid search (recommended) |
127
+ | `gno_get` | Retrieve document by ID |
128
+ | `gno_multi_get` | Batch document retrieval |
129
+ | `gno_status` | Index health check |
108
130
 
109
- **Tools Exposed:**
110
- * `gno_search` (BM25)
111
- * `gno_vsearch` (Vector)
112
- * `gno_query` (Hybrid)
113
- * `gno_get` (Document retrieval)
114
- * `gno_multi_get` (Batch retrieval)
115
- * `gno_status` (Index health)
131
+ **Design**: MCP tools are retrieval-only. Your AI assistant (Claude, GPT-4) synthesizes answers from retrieved context—best retrieval (GNO) + best reasoning (your LLM).
116
132
 
117
- **Example Claude Desktop Configuration** (`~/Library/Application Support/Claude/claude_desktop_config.json`):
133
+ ### Skills
118
134
 
119
- ```json
120
- {
121
- "mcpServers": {
122
- "gno": {
123
- "command": "gno",
124
- "args": ["mcp"]
125
- }
126
- }
127
- }
135
+ Skills add GNO search to Claude Code/Codex without MCP protocol overhead:
136
+
137
+ ```bash
138
+ gno skill install --scope user
128
139
  ```
129
140
 
130
- *(Adjust path and `mcpServers` key based on your agent's configuration.)*
141
+ Then ask your agent: *"Search my notes for the auth discussion"*
131
142
 
132
- ---
143
+ > **Detailed docs**: [MCP Integration](https://gno.sh/docs/MCP/) · [Use Cases](https://gno.sh/docs/USE-CASES/)
133
144
 
134
- ## ⚙️ How It Works: The GNO Pipeline
145
+ ---
135
146
 
136
- GNO employs a sophisticated, multi-stage retrieval process for optimal results:
147
+ ## How It Works
137
148
 
138
149
  ```mermaid
139
150
  graph TD
140
- A[User Query] --> B(Query Expansion);
141
- B --> C{Lexical Variants};
142
- B --> D{Semantic Variants};
143
- B --> E{Optional HyDE};
144
- A --> F[Original Query];
145
-
146
- C --> G(BM25 Retrieval);
147
- D --> H(Vector Search);
148
- E --> H;
149
- F --> G;
150
- F --> H;
151
-
152
- G --> I(Ranked List 1);
153
- H --> J(Ranked List 2);
154
- I --> K{RRF Fusion + Bonus};
155
- J --> K;
156
-
157
- K --> L(Top Candidates);
158
- L --> M(LLM Re-ranking);
159
- M --> N(Position-Aware Blending);
160
- N --> O(Final Results);
161
-
162
- subgraph "Search Stages"
163
- B; C; D; E; F; G; H; I; J; K; L; M; N; O;
164
- end
151
+ A[User Query] --> B(Query Expansion)
152
+ B --> C{Lexical Variants}
153
+ B --> D{Semantic Variants}
154
+ B --> E{HyDE Passage}
155
+
156
+ C --> G(BM25 Search)
157
+ D --> H(Vector Search)
158
+ E --> H
159
+ A --> G
160
+ A --> H
161
+
162
+ G --> I(Ranked Results)
163
+ H --> J(Ranked Results)
164
+ I --> K{RRF Fusion}
165
+ J --> K
166
+
167
+ K --> L(Top 20 Candidates)
168
+ L --> M(Cross-Encoder Rerank)
169
+ M --> N[Final Results]
165
170
  ```
166
171
 
167
- ### Search Pipeline Details:
172
+ 1. **Query Expansion** — LLM generates lexical variants, semantic rephrases, and a [HyDE](https://arxiv.org/abs/2212.10496) passage
173
+ 2. **Parallel Retrieval** — BM25 + vector search run concurrently on all variants
174
+ 3. **Fusion** — Reciprocal Rank Fusion merges results with position-based scoring
175
+ 4. **Reranking** — Cross-encoder rescores top 20, blended with fusion scores
168
176
 
169
- 1. **Query Expansion**: Generates alternative queries (lexical and semantic) and an optional synthetic "HyDE" document using a local LLM for richer retrieval.
170
- 2. **Parallel Retrieval**: Executes BM25 (keyword) and Vector (semantic) searches concurrently.
171
- 3. **Fusion**: Combines results using Reciprocal Rank Fusion (RRF) with a weighted boost for original query matches and a top-rank bonus.
172
- 4. **Re-ranking**: An LLM-based cross-encoder re-scores the top candidates for final relevance.
173
- 5. **Blending**: Dynamically adjusts the mix of retrieval vs. reranked scores based on rank position to preserve accuracy.
174
-
175
- **Score Normalization**: Raw scores from FTS, vector distance, and reranker are normalized to a 0-1 scale for consistent fusion.
177
+ > **Deep dive**: [How Search Works](https://gno.sh/docs/HOW-SEARCH-WORKS/)
176
178
 
177
179
  ---
178
180
 
179
- ## 📦 Installation
181
+ ## Features
180
182
 
181
- Requires **Bun** >= 1.0.0.
183
+ | Feature | Description |
184
+ |:--------|:------------|
185
+ | **Hybrid Search** | BM25 + vector + RRF fusion + cross-encoder reranking |
186
+ | **Multi-Format** | Markdown, PDF, DOCX, XLSX, PPTX, plain text |
187
+ | **Local LLM** | AI answers via llama.cpp—no API keys |
188
+ | **Privacy First** | 100% offline, zero telemetry, your data stays yours |
189
+ | **MCP Server** | Works with Claude Desktop, Cursor, Zed, + 7 more |
190
+ | **Collections** | Organize sources with patterns, excludes, contexts |
191
+ | **Multilingual** | 30+ languages, auto-detection, cross-lingual search |
192
+ | **Incremental** | SHA-256 tracking—only changed files re-indexed |
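The incremental indexing row above boils down to content-hash bookkeeping. A minimal TypeScript sketch of the idea — purely illustrative, since GNO's real tracking lives in its SQLite store:

```typescript
import { createHash } from "node:crypto";

// path -> sha256 recorded at last index time (GNO persists this; we use a Map).
const indexed = new Map<string, string>();

function sha256(content: string): string {
  return createHash("sha256").update(content).digest("hex");
}

// Re-index a file only when its content hash differs from the recorded one.
function needsReindex(path: string, content: string): boolean {
  const hash = sha256(content);
  if (indexed.get(path) === hash) return false; // unchanged: skip embedding work
  indexed.set(path, hash);
  return true;
}

console.log(needsReindex("notes/a.md", "v1")); // true  (new file)
console.log(needsReindex("notes/a.md", "v1")); // false (unchanged)
console.log(needsReindex("notes/a.md", "v2")); // true  (content changed)
```

Hashing content rather than trusting mtimes means a `touch`ed but unmodified file costs nothing to "re-index".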
182
193
 
183
- ```sh
184
- # Install globally
185
- bun install -g @gmickel/gno
186
- ```
194
+ ---
187
195
 
188
- **macOS users**: For optimal vector search performance, install Homebrew SQLite:
189
- ```sh
190
- brew install sqlite3
191
- ```
196
+ ## Local Models
192
197
 
193
- Verify your installation:
194
- ```sh
195
- gno doctor
196
- ```
198
+ Models auto-download on first use to `~/.cache/gno/models/`.
197
199
 
198
- ---
200
+ | Model | Purpose | Size |
201
+ |:------|:--------|:-----|
202
+ | bge-m3 | Embeddings (1024-dim, multilingual) | ~500MB |
203
+ | bge-reranker-v2-m3 | Cross-encoder reranking | ~700MB |
204
+ | Qwen/SmolLM | Query expansion + AI answers | ~600MB-1.2GB |
199
205
 
200
- ## 🏠 Local LLM Models
206
+ ### Model Presets
201
207
 
202
- GNO runs embeddings, reranking, and query expansion locally using GGUF models via `node-llama-cpp`. Models are automatically downloaded and cached on first use in `~/.cache/gno/models/`.
208
+ | Preset | Disk | Best For |
209
+ |:-------|:-----|:---------|
210
+ | `slim` | ~1GB | Fast, lower quality |
211
+ | `balanced` | ~2GB | Good balance (default) |
212
+ | `quality` | ~2.5GB | Best answers |
203
213
 
204
- | Model | Purpose | Size (approx.) |
205
- | :-------------------- | :---------------- | :------------- |
206
- | `bge-m3` | Multilingual Embeddings | ~500MB |
207
- | `bge-reranker-v2-m3` | Cross-Encoder Re-ranking | ~700MB |
208
- | `Qwen-Instruct` | Query Expansion / HyDE | ~600MB |
214
+ ```bash
215
+ gno models use balanced
216
+ gno models pull --all
217
+ ```
209
218
 
210
- *(Specific GGUF versions are pinned for stability.)*
219
+ > **Configuration**: [Model Setup](https://gno.sh/docs/CONFIGURATION/)
211
220
 
212
221
  ---
213
222
 
214
- ## 📜 Architecture Overview
215
-
216
- GNO follows a layered, Ports and Adapters architecture for maintainability and testability:
223
+ ## Architecture
217
224
 
218
225
  ```
219
- ┌───────────────────────────────────────────────────────────┐
220
- │                       GNO CLI / MCP                       │
221
- ├───────────────────────────────────────────────────────────┤
222
- │ Ports: Converter, Store, Embedding, Generation, Rerank... │
223
- ├───────────────────────────────────────────────────────────┤
224
- │  Adapters: SQLite, FTS5, sqlite-vec, node-llama-cpp, CLI  │
225
- ├───────────────────────────────────────────────────────────┤
226
- │    Core Domain: Identity, Mirrors, Chunking, Retrieval    │
227
- └───────────────────────────────────────────────────────────┘
226
+ ┌─────────────────────────────────────────────────┐
227
+ │                  GNO CLI / MCP                  │
228
+ ├─────────────────────────────────────────────────┤
229
+ │   Ports: Converter, Store, Embedding, Rerank    │
230
+ ├─────────────────────────────────────────────────┤
231
+ │  Adapters: SQLite, FTS5, sqlite-vec, llama-cpp  │
232
+ ├─────────────────────────────────────────────────┤
233
+ │  Core: Identity, Mirrors, Chunking, Retrieval   │
234
+ └─────────────────────────────────────────────────┘
228
235
  ```
229
236
 
237
+ > **Details**: [Architecture](https://gno.sh/docs/ARCHITECTURE/)
238
+
230
239
  ---
231
240
 
232
- ## 💻 Development
241
+ ## Development
233
242
 
234
243
  ```bash
235
- # Clone the repository
236
- git clone https://github.com/gmickel/gno.git
237
- cd gno
238
-
239
- # Install dependencies
244
+ git clone https://github.com/gmickel/gno.git && cd gno
240
245
  bun install
241
-
242
- # Run tests
243
246
  bun test
244
-
245
- # Lint and format code
246
- bun run lint
247
-
248
- # Type check
249
- bun run typecheck
247
+ bun run lint && bun run typecheck
250
248
  ```
251
249
 
250
+ > **Contributing**: [CONTRIBUTING.md](.github/CONTRIBUTING.md)
251
+
252
252
  ---
253
253
 
254
- ## 📄 License
254
+ ## License
255
+
256
+ [MIT](./LICENSE)
257
+
258
+ ---
255
259
 
256
- [MIT License](./LICENSE)
260
+ <p align="center">
261
+ made with ❤️ by <a href="https://twitter.com/gmickel">@gmickel</a>
262
+ </p>
@@ -0,0 +1,12 @@
1
+ <svg xmlns="http://www.w3.org/2000/svg" width="85" height="28" viewBox="0 0 85 28">
2
+ <defs>
3
+ <linearGradient id="bg-lic" x1="0%" y1="0%" x2="100%" y2="100%">
4
+ <stop offset="0%" style="stop-color:#1a1a2e"/>
5
+ <stop offset="100%" style="stop-color:#0f0f1a"/>
6
+ </linearGradient>
7
+ </defs>
8
+ <rect width="85" height="28" rx="6" fill="url(#bg-lic)"/>
9
+ <rect x="1" y="1" width="83" height="26" rx="5" fill="none" stroke="#2dd4bf" stroke-opacity="0.3"/>
10
+ <text x="10" y="18" font-family="ui-monospace,SFMono-Regular,Menlo,Monaco,monospace" font-size="11" fill="#94a3b8">MIT</text>
11
+ <text x="34" y="18" font-family="ui-monospace,SFMono-Regular,Menlo,Monaco,monospace" font-size="11" fill="#2dd4bf">License</text>
12
+ </svg>
@@ -0,0 +1,13 @@
1
+ <svg xmlns="http://www.w3.org/2000/svg" width="66" height="28" viewBox="0 0 66 28">
2
+ <defs>
3
+ <linearGradient id="bg-npm" x1="0%" y1="0%" x2="100%" y2="100%">
4
+ <stop offset="0%" style="stop-color:#1a1a2e"/>
5
+ <stop offset="100%" style="stop-color:#0f0f1a"/>
6
+ </linearGradient>
7
+ </defs>
8
+ <rect width="66" height="28" rx="6" fill="url(#bg-npm)"/>
9
+ <rect x="1" y="1" width="64" height="26" rx="5" fill="none" stroke="#cb3837" stroke-opacity="0.4"/>
10
+ <rect x="8" y="7" width="32" height="14" rx="3" fill="#cb3837"/>
11
+ <text x="13" y="18" font-family="ui-monospace,SFMono-Regular,Menlo,Monaco,monospace" font-size="10" font-weight="700" fill="#fff">npm</text>
12
+ <text x="46" y="18" font-family="ui-monospace,SFMono-Regular,Menlo,Monaco,monospace" font-size="12" fill="#e2e8f0">↗</text>
13
+ </svg>
@@ -0,0 +1,22 @@
1
+ <svg xmlns="http://www.w3.org/2000/svg" width="130" height="28" viewBox="0 0 130 28">
2
+ <defs>
3
+ <linearGradient id="bg2" x1="0%" y1="0%" x2="100%" y2="100%">
4
+ <stop offset="0%" style="stop-color:#1a1a2e"/>
5
+ <stop offset="100%" style="stop-color:#0f0f1a"/>
6
+ </linearGradient>
7
+ <filter id="glow2">
8
+ <feGaussianBlur stdDeviation="0.5" result="coloredBlur"/>
9
+ <feMerge>
10
+ <feMergeNode in="coloredBlur"/>
11
+ <feMergeNode in="SourceGraphic"/>
12
+ </feMerge>
13
+ </filter>
14
+ </defs>
15
+ <rect width="130" height="28" rx="6" fill="url(#bg2)"/>
16
+ <rect x="1" y="1" width="128" height="26" rx="5" fill="none" stroke="#2dd4bf" stroke-opacity="0.3"/>
17
+ <!-- X logo -->
18
+ <g transform="translate(10, 7)" fill="#e2e8f0">
19
+ <path d="M10.5 1.5L6.75 6.5L10.75 12.5H8.5L5.75 8.5L2.5 12.5H0.25L4.25 7L0.5 1.5H2.75L5.25 5.25L8.25 1.5H10.5Z" transform="scale(1.1)"/>
20
+ </g>
21
+ <text x="32" y="18" font-family="ui-monospace,SFMono-Regular,Menlo,Monaco,monospace" font-size="12" font-weight="500" fill="#e2e8f0" filter="url(#glow2)">@gmickel</text>
22
+ </svg>
@@ -0,0 +1,22 @@
1
+ <svg xmlns="http://www.w3.org/2000/svg" width="120" height="28" viewBox="0 0 120 28">
2
+ <defs>
3
+ <linearGradient id="bg" x1="0%" y1="0%" x2="100%" y2="100%">
4
+ <stop offset="0%" style="stop-color:#1a1a2e"/>
5
+ <stop offset="100%" style="stop-color:#0f0f1a"/>
6
+ </linearGradient>
7
+ <filter id="glow">
8
+ <feGaussianBlur stdDeviation="1" result="coloredBlur"/>
9
+ <feMerge>
10
+ <feMergeNode in="coloredBlur"/>
11
+ <feMergeNode in="SourceGraphic"/>
12
+ </feMerge>
13
+ </filter>
14
+ </defs>
15
+ <rect width="120" height="28" rx="6" fill="url(#bg)"/>
16
+ <rect x="1" y="1" width="118" height="26" rx="5" fill="none" stroke="#2dd4bf" stroke-opacity="0.3"/>
17
+ <text x="10" y="18" font-family="ui-monospace,SFMono-Regular,Menlo,Monaco,monospace" font-size="11" fill="#2dd4bf" filter="url(#glow)">❯</text>
18
+ <text x="24" y="18" font-family="ui-monospace,SFMono-Regular,Menlo,Monaco,monospace" font-size="12" font-weight="600" fill="#e2e8f0">gno.sh</text>
19
+ <rect x="74" y="9" width="2" height="12" fill="#2dd4bf" opacity="0.8">
20
+ <animate attributeName="opacity" values="0.8;0.2;0.8" dur="1.2s" repeatCount="indefinite"/>
21
+ </rect>
22
+ </svg>
package/package.json CHANGED
@@ -1,6 +1,6 @@
1
1
  {
2
2
  "name": "@gmickel/gno",
3
- "version": "0.3.0",
3
+ "version": "0.3.5",
4
4
  "description": "Local semantic search for your documents. Index Markdown, PDF, and Office files with hybrid BM25 + vector search.",
5
5
  "keywords": [
6
6
  "search",
@@ -47,6 +47,7 @@
47
47
  "test:watch": "bun test --watch",
48
48
  "test:coverage": "bun test --coverage",
49
49
  "test:coverage:html": "bun test --coverage --html",
50
+ "test:fixtures": "bun scripts/generate-test-fixtures.ts",
50
51
  "typecheck": "tsgo --noEmit",
51
52
  "lint:typeaware": "bun x oxlint --type-aware",
52
53
  "reset": "bun run src/index.ts reset --confirm",
@@ -79,9 +80,13 @@
79
80
  "@typescript/native-preview": "^7.0.0-dev.20251215.1",
80
81
  "ajv": "^8.17.1",
81
82
  "ajv-formats": "^3.0.1",
83
+ "docx": "^9.5.1",
82
84
  "evalite": "^1.0.0-beta.15",
85
+ "exceljs": "^4.4.0",
83
86
  "lefthook": "^2.0.12",
84
87
  "oxlint-tsgolint": "^0.10.0",
88
+ "pdf-lib": "^1.17.1",
89
+ "pptxgenjs": "^4.0.1",
85
90
  "ultracite": "^6.5.0"
86
91
  },
87
92
  "peerDependencies": {