@openread/mcp 0.0.1-test.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (4)
  1. package/LICENSE +23 -0
  2. package/README.md +147 -0
  3. package/dist/cli.mjs +229 -0
  4. package/package.json +49 -0
package/LICENSE ADDED
@@ -0,0 +1,23 @@
MIT License

Copyright (c) 2025 OpenRead Contributors

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

SPDX-License-Identifier: MIT
package/README.md ADDED
@@ -0,0 +1,147 @@
# @openread/mcp

MCP server for [Openread](https://openread.ai) -- access your book library from AI assistants like Claude, Cursor, VS Code Copilot, and more.

## Quick Start

1. Get an API key from [Openread Settings > API Keys](https://app.openread.ai/settings)
2. Add it to your AI client configuration (see below)
3. Ask your AI about your books!

## Configuration

### Claude Desktop

Add to `~/Library/Application Support/Claude/claude_desktop_config.json` (macOS) or `%APPDATA%\Claude\claude_desktop_config.json` (Windows):

```json
{
  "mcpServers": {
    "openread": {
      "command": "npx",
      "args": ["-y", "@openread/mcp"],
      "env": {
        "OPENREAD_API_KEY": "orsk-your-key-here"
      }
    }
  }
}
```

### Claude Code

Add to `~/.claude/settings.json`:

```json
{
  "mcpServers": {
    "openread": {
      "command": "npx",
      "args": ["-y", "@openread/mcp"],
      "env": {
        "OPENREAD_API_KEY": "orsk-your-key-here"
      }
    }
  }
}
```

### Cursor

Add to `~/.cursor/mcp.json` (global) or `.cursor/mcp.json` (project):

```json
{
  "mcpServers": {
    "openread": {
      "command": "npx",
      "args": ["-y", "@openread/mcp"],
      "env": {
        "OPENREAD_API_KEY": "orsk-your-key-here"
      }
    }
  }
}
```

### Other Clients

The same configuration works for all MCP-compatible clients:

| Client     | Config File                        |
| ---------- | ---------------------------------- |
| VS Code    | `.vscode/mcp.json` (project-level) |
| Windsurf   | `~/.windsurf/mcp.json`             |
| Codex      | `~/.codex/mcp.json`                |
| Gemini CLI | `~/.gemini/settings.json`          |

## Available Tools

| Tool             | Description                                          |
| ---------------- | ---------------------------------------------------- |
| `list_books`     | List all books in your library                       |
| `get_book_info`  | Get metadata for a specific book                     |
| `get_toc`        | Get table of contents                                |
| `get_chapter`    | Get full chapter text                                |
| `get_page_range` | Get text for a page range                            |
| `get_headings`   | Get heading hierarchy for a chapter (~500B vs 100KB) |
| `get_section`    | Get content for a specific section by heading        |
| `get_passage`    | Get text around a location with context              |
| `get_key_terms`  | Get key terms extracted from a chapter               |
| `search_book`    | Full-text search within a book                       |
| `search_library` | Search across all books                              |
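Under the hood, an MCP client invokes these tools with a standard JSON-RPC `tools/call` request. A sketch for `search_book` follows; the argument names (`bookId`, `query`) are illustrative -- query the server's `tools/list` for the actual schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_book",
    "arguments": { "bookId": "your-book-id", "query": "entropy" }
  }
}
```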
## Agent Navigation Pattern

For efficient book navigation, agents can use a progressive disclosure pattern:

```
1. get_toc(bookId) -> chapter list
2. get_key_terms(bookId, ch) -> assess relevance (~500B)
3. get_headings(bookId, ch) -> section structure (~500B)
4. get_section(bookId, ch, h) -> targeted content (1-5KB)
```

This achieves 96%+ token savings compared to fetching full chapters.
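The steps above can be sketched as a loop in JavaScript. Here `callTool(name, args)` stands in for whatever invocation function your MCP client exposes, and the result field names (`chapters`, `terms`, `headings`) are illustrative assumptions, not the server's confirmed schema:

```javascript
// Progressive disclosure: cheap metadata first, full text only when needed.
// `callTool` is whatever your MCP client exposes for invoking server tools.
async function findRelevantSection(callTool, bookId, topic) {
  const toc = await callTool("get_toc", { bookId });
  for (const chapter of toc.chapters) {
    // Step 2: ~500B relevance check before fetching anything large.
    const { terms } = await callTool("get_key_terms", { bookId, chapter: chapter.id });
    if (!terms.some((t) => t.toLowerCase().includes(topic))) continue;
    // Step 3: section structure (~500B).
    const { headings } = await callTool("get_headings", { bookId, chapter: chapter.id });
    // Step 4: fetch only a matching section (1-5KB) instead of a full chapter.
    return callTool("get_section", { bookId, chapter: chapter.id, heading: headings[0] });
  }
  return null; // topic not found in any chapter's key terms
}
```

Note that `get_chapter` never appears in the loop; the full text is only worth fetching when the cheap lookups cannot answer the question.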
## Environment Variables

| Variable             | Required | Default                   | Description                                    |
| -------------------- | -------- | ------------------------- | ---------------------------------------------- |
| `OPENREAD_API_KEY`   | Yes      | --                        | Your `orsk-` API key from the Openread web app |
| `OPENREAD_API_URL`   | No       | `https://api.openread.ai` | API base URL (for self-hosted or staging)      |
| `OPENREAD_CACHE_DIR` | No       | OS-specific               | Override the default disk cache directory      |
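As a sketch, startup resolution of these variables might look like the following; the actual logic in `dist/cli.mjs` is not shown here, so the error text and the `null` fallback are assumptions, while the defaults come from the table above:

```javascript
// Resolve server settings from the environment, mirroring the table above.
// Only OPENREAD_API_KEY is required; the rest fall back to documented defaults.
function resolveConfig(env = process.env) {
  const apiKey = env.OPENREAD_API_KEY;
  if (!apiKey || !apiKey.startsWith("orsk-")) {
    throw new Error("OPENREAD_API_KEY must be set to an orsk- key");
  }
  return {
    apiKey,
    apiUrl: env.OPENREAD_API_URL ?? "https://api.openread.ai",
    cacheDir: env.OPENREAD_CACHE_DIR ?? null, // null -> OS-specific default
  };
}
```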
## Troubleshooting

### "Invalid token" error

- Verify your API key starts with `orsk-` followed by a UUID
- Check that the key has not been revoked in Settings > API Keys
- Ensure `OPENREAD_API_KEY` is set in the `env` block of your config (not as a shell environment variable)
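A quick local format check can rule out copy-paste errors before blaming the server. The `orsk-` prefix comes from this README; the exact 8-4-4-4-12 hex UUID layout after it is an assumption:

```javascript
// Shape check only -- a well-formed key can still be revoked or invalid.
const OPENREAD_KEY_RE =
  /^orsk-[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$/i;

function looksLikeOpenreadKey(key) {
  return OPENREAD_KEY_RE.test(key ?? "");
}
```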
### "Connection refused" or server won't start

- Verify Node.js 18+ is installed: `node --version`
- Verify npx works: `npx --version`
- Check your client's MCP log for detailed error messages

### Server starts but no books appear

- Verify you have uploaded books to your Openread library
- Try asking your AI to call `list_books` directly to check connectivity

### Slow first response

- The first call for each book downloads and parses the file (~5-15 seconds, depending on file size)
- Subsequent calls use the disk cache and are near-instant
- The cache is stored at `~/Library/Caches/openread-mcp` (macOS), `~/.cache/openread-mcp` (Linux), or `%LOCALAPPDATA%\openread-mcp\Cache` (Windows)
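One way to compute the per-OS cache locations listed above is shown below; the package itself may use a helper library for this, so treat it as an illustration rather than the shipped implementation:

```javascript
import os from "node:os";
import path from "node:path";

// Pick the conventional cache directory for the current platform.
function defaultCacheDir() {
  const home = os.homedir();
  switch (process.platform) {
    case "darwin": // macOS
      return path.join(home, "Library", "Caches", "openread-mcp");
    case "win32": // Windows: %LOCALAPPDATA%\openread-mcp\Cache
      return path.join(
        process.env.LOCALAPPDATA ?? path.join(home, "AppData", "Local"),
        "openread-mcp",
        "Cache"
      );
    default: // Linux and other POSIX systems, honoring XDG_CACHE_HOME
      return path.join(
        process.env.XDG_CACHE_HOME ?? path.join(home, ".cache"),
        "openread-mcp"
      );
  }
}
```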
## Requirements

- Node.js 18 or later
- An Openread account with books in your library

## License

MIT