@pixelguild/loom 0.1.2 → 0.1.4
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +41 -8
- package/package.json +1 -1
package/README.md
CHANGED
@@ -1,7 +1,3 @@
-<p align="center">
-<img src="https://raw.githubusercontent.com/pixelguild/loom/main/assets/loom_app_icon.png" width="96" height="96" alt="Loom">
-</p>
-
 <h1 align="center">@pixelguild/loom</h1>
 
 <p align="center">
@@ -9,6 +5,10 @@
 <strong>Persistent memory · Intelligent archiving · Cross-project patterns · Peer LLM consultation</strong>
 </p>
 
+<p align="center">
+<a href="https://loom.pixelguild.com">loom.pixelguild.com</a>
+</p>
+
 <p align="center">
 <a href="https://www.npmjs.com/package/@pixelguild/loom"><img src="https://img.shields.io/npm/v/@pixelguild/loom?color=0a7bca&label=npm" alt="npm"></a>
 <a href="https://github.com/pixelguild/loom/blob/main/LICENSE"><img src="https://img.shields.io/npm/l/@pixelguild/loom?color=333" alt="license"></a>
@@ -22,7 +22,8 @@ Loom is an [MCP server](https://modelcontextprotocol.io) that gives Claude Code
 ## Install
 
 ```bash
-
+npm install -g @pixelguild/loom
+loom init
 ```
 
 This scaffolds `docs/loom/`, wires the MCP server into `.claude/mcp.json`, and adds instructions to your `CLAUDE.md`. That's it.
@@ -30,7 +31,7 @@ This scaffolds `docs/loom/`, wires the MCP server into `.claude/mcp.json`, and a
 Or add manually:
 
 ```bash
-claude mcp add loom --
+claude mcp add loom -- loom
 ```
 
 ## What it does
@@ -81,8 +82,8 @@ The latest archive's carry-forward summary + current `context.md` always gives C
 ## CLI
 
 ```bash
-
-
+loom init      # Scaffold project, wire MCP, inject CLAUDE.md
+loom status    # Token count, archive status, context health
 ```
 
 ## Configuration
@@ -116,6 +117,38 @@ Layered config — per-project overrides global defaults.
 | `LOOM_PROJECT_ROOT` | Project directory (falls back to cwd) |
 | `LOOM_OPENAI_API_KEY` | OpenAI API key for peer consultation |
 
+## Peer Consultation Setup
+
+The `loom_consult_peer` tool lets Claude ask a second LLM for architecture advice, debugging help, or alternative perspectives.
+
+**OpenAI** — Set `LOOM_OPENAI_API_KEY` env var or pass it via MCP config:
+
+```bash
+claude mcp add loom -e LOOM_PROJECT_ROOT=$(pwd) -e LOOM_OPENAI_API_KEY=sk-... -- loom
+```
+
+```json
+{ "default_provider": "openai", "providers": { "openai": { "model": "gpt-4o" } } }
+```
+
+**Vertex AI (Gemini)** — Requires `gcloud auth application-default login`:
+
+```json
+{ "default_provider": "vertex", "providers": { "vertex": { "model": "gemini-2.0-flash", "project": "your-gcp-project", "location": "us-central1" } } }
+```
+
+**Ollama (local)** — No API key needed. Install [Ollama](https://ollama.com) and pull a model:
+
+```json
+{ "default_provider": "ollama", "providers": { "ollama": { "model": "llama3.1", "host": "http://localhost:11434" } } }
+```
+
+Restrict providers per project in `docs/loom/loom.config.json`:
+
+```json
+{ "peer_consultation": { "enabled": true, "allowed_providers": ["openai", "ollama"] } }
+```
+
 ## Storage
 
 ```
package/package.json
CHANGED
@@ -1,6 +1,6 @@
 {
   "name": "@pixelguild/loom",
-  "version": "0.1.2",
+  "version": "0.1.4",
   "description": "Session intelligence layer for Claude Code — persistent memory, intelligent archiving, cross-project patterns, and peer LLM consultation.",
   "type": "module",
   "main": "dist/index.js",