@link-assistant/hive-mind 1.50.9 → 1.50.11

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/CHANGELOG.md CHANGED
@@ -1,5 +1,17 @@
  # @link-assistant/hive-mind
 
+ ## 1.50.11
+
+ ### Patch Changes
+
+ - bf9cf54: Fix prompt template builders crashing when literal `.png` appears in screenshot guidance.
+
+ ## 1.50.10
+
+ ### Patch Changes
+
+ - 0dc1613: Fix log upload raw URL resolution so gist metadata lookups do not mirror full gist contents to stdout, and harden stdio handling when the terminal pipe is already broken.
+
  ## 1.50.9
 
  ### Patch Changes
package/README.md CHANGED
@@ -195,25 +195,23 @@ docker attach hive-mind
 
  # --- Persisting auth data across restarts ---
 
- # Extract auth data from a running (or stopped) container to the host:
- mkdir -p ~/.hive-mind
- docker cp hive-mind:/workspace/.claude ~/.hive-mind/claude
- docker cp hive-mind:/workspace/.claude.json ~/.hive-mind/claude.json
- docker cp hive-mind:/workspace/.config/gh ~/.hive-mind/gh
+ # On the host, create the directories used by the current Docker workflow:
+ mkdir -p /root/.hive-mind/claude /root/.hive-mind/codex /root/.hive-mind/gh
+ touch -a /root/.hive-mind/claude.json
 
- # Fix ownership to match the sandbox user inside the container:
- SANDBOX_UID=$(docker exec hive-mind id -u sandbox)
- chown -R $SANDBOX_UID:$SANDBOX_UID ~/.hive-mind/claude ~/.hive-mind/gh
- chown $SANDBOX_UID:$SANDBOX_UID ~/.hive-mind/claude.json
-
- # On subsequent runs, mount the auth data to keep it between restarts:
- docker run -dit \
- --name hive-mind \
- --restart unless-stopped \
+ # In our Docker images HOME=/workspace, so Codex stores its data in /workspace/.codex.
+ # Mount the full Codex directory so auth.json, config.toml, and sessions survive restarts.
+ docker run -dit --user sandbox --name hive-mind --restart unless-stopped \
   -v /root/.hive-mind/claude:/workspace/.claude \
+ -v /root/.hive-mind/codex:/workspace/.codex \
   -v /root/.hive-mind/claude.json:/workspace/.claude.json \
   -v /root/.hive-mind/gh:/workspace/.config/gh \
- konard/hive-mind:latest
+ konard/hive-mind:latest bash -l -c 'bash /workspace/start-bot.sh'
+
+ # After the first start, fix ownership to match the sandbox user inside the container:
+ SANDBOX_UID=$(docker exec hive-mind id -u sandbox)
+ chown -R $SANDBOX_UID:$SANDBOX_UID /root/.hive-mind/claude /root/.hive-mind/codex /root/.hive-mind/gh
+ chown $SANDBOX_UID:$SANDBOX_UID /root/.hive-mind/claude.json
  ```
 
  **Benefits of Docker:**
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "@link-assistant/hive-mind",
- "version": "1.50.9",
+ "version": "1.50.11",
  "description": "AI-powered issue solver and hive mind for collaborative problem solving",
  "main": "src/hive.mjs",
  "type": "module",
@@ -160,7 +160,7 @@ Initial research.
  - When you start, create a detailed plan for yourself and follow your todo list step by step. Add as many relevant points from these guidelines to the todo list as practical so you can track the work clearly.
  - When the user mentions CI failures or asks to investigate logs, consider adding these todos to track the investigation: (1) list recent CI runs with timestamps, (2) download logs from failed runs to the ci-logs/ directory, (3) analyze error messages and identify the root cause, (4) implement a fix, (5) verify that the fix resolves the specific errors found in the logs.
  - When you read the issue, read all details and comments thoroughly.
- - When you see screenshots or images in issue descriptions, pull request descriptions, comments, or discussions, download the image to a local file first, then use the Read tool to view and analyze it. Before reading downloaded images with the Read tool, verify that the file is a valid image rather than HTML by using a CLI tool such as the 'file' command. When corrupted or non-image files, such as GitHub "Not Found" pages saved as `.png`, are read, they can cause "Could not process image" errors and crash the AI solver process. When the file command shows "HTML", "text", or "ASCII text", the download failed, so do not call Read on that file. Instead: (1) when images are from GitHub issues or PRs, such as URLs containing "github.com/user-attachments", retry with: curl -L -H "Authorization: token $(gh auth token)" -o <filename> "<url>" (2) when the retry still fails, skip the image and note that it was unavailable.
+ - When you see screenshots or images in issue descriptions, pull request descriptions, comments, or discussions, download the image to a local file first, then use the Read tool to view and analyze it. Before reading downloaded images with the Read tool, verify that the file is a valid image rather than HTML by using a CLI tool such as the 'file' command. When corrupted or non-image files, such as GitHub "Not Found" pages saved as \`.png\`, are read, they can cause "Could not process image" errors and crash the AI solver process. When the file command shows "HTML", "text", or "ASCII text", the download failed, so do not call Read on that file. Instead: (1) when images are from GitHub issues or PRs, such as URLs containing "github.com/user-attachments", retry with: curl -L -H "Authorization: token $(gh auth token)" -o <filename> "<url>" (2) when the retry still fails, skip the image and note that it was unavailable.
  - When you need issue details, use gh issue view https://github.com/${owner}/${repo}/issues/${issueNumber}.
  - When you need related code, use gh search code --owner ${owner} [keywords].
  - When you need repo context, read files in your working directory.${
package/src/lib.mjs CHANGED
@@ -188,16 +188,107 @@ export const setupVerboseLogInterceptor = () => {
  */
  let stdioInterceptorInstalled = false;
  let _writingFromLog = false; // Guard flag to prevent double-logging from log()
+ let stdoutBroken = false;
+ let stderrBroken = false;
+ let brokenPipeDiagnosticsWritten = false;
+
+ const isBrokenPipeError = error => {
+ return error?.code === 'EPIPE' || error?.code === 'ERR_STREAM_DESTROYED';
+ };
+
+ const invokeWriteCallback = (callback, error = null) => {
+ if (typeof callback === 'function') {
+ callback(error);
+ }
+ };
+
+ const appendInternalDiagnostic = async message => {
+ if (!logFile) return;
+ const prefix = `[${new Date().toISOString()}] [INTERNAL]`;
+ await fs.appendFile(logFile, `${prefix} ${message}\n`).catch(() => {
+ // Silent fail to avoid recursive logging errors
+ });
+ };
+
+ const formatStreamDiagnostic = stream => {
+ return JSON.stringify({
+ isTTY: Boolean(stream?.isTTY),
+ destroyed: Boolean(stream?.destroyed),
+ writable: stream?.writable,
+ writableEnded: Boolean(stream?.writableEnded),
+ writableFinished: Boolean(stream?.writableFinished),
+ errored: stream?.errored?.code || stream?.errored?.message || null,
+ fd: typeof stream?.fd === 'number' ? stream.fd : null,
+ });
+ };
+
+ const normalizeWriteCallback = (encoding, callback) => {
+ return typeof encoding === 'function' ? encoding : callback;
+ };
+
+ const safeTerminalWrite = ({ originalWrite, chunk, encoding, callback, streamName }) => {
+ const isStdout = streamName === 'stdout';
+ const normalizedCallback = normalizeWriteCallback(encoding, callback);
+ if ((isStdout && stdoutBroken) || (!isStdout && stderrBroken)) {
+ invokeWriteCallback(normalizedCallback);
+ return false;
+ }
+
+ try {
+ return originalWrite(chunk, encoding, callback);
+ } catch (error) {
+ if (!isBrokenPipeError(error)) {
+ throw error;
+ }
+
+ if (isStdout) {
+ stdoutBroken = true;
+ } else {
+ stderrBroken = true;
+ }
+
+ invokeWriteCallback(normalizedCallback, error);
+ return false;
+ }
+ };
+
+ const installBrokenPipeGuard = (stream, streamName) => {
+ stream.on('error', error => {
+ if (isBrokenPipeError(error)) {
+ if (streamName === 'stdout') {
+ stdoutBroken = true;
+ } else {
+ stderrBroken = true;
+ }
+ if (!brokenPipeDiagnosticsWritten) {
+ brokenPipeDiagnosticsWritten = true;
+ void appendInternalDiagnostic(`Detected broken ${streamName} stream (${error.code || 'unknown'}). Stream state=${formatStreamDiagnostic(stream)}. Further terminal writes will be skipped when possible.`);
+ }
+ return;
+ }
+
+ throw error;
+ });
+ };
+
  export const setupStdioLogInterceptor = () => {
  if (stdioInterceptorInstalled) return;
  stdioInterceptorInstalled = true;
 
  const originalStdoutWrite = process.stdout.write.bind(process.stdout);
  const originalStderrWrite = process.stderr.write.bind(process.stderr);
+ installBrokenPipeGuard(process.stdout, 'stdout');
+ installBrokenPipeGuard(process.stderr, 'stderr');
 
  process.stdout.write = (chunk, encoding, callback) => {
- // Always write to terminal first
- const result = originalStdoutWrite(chunk, encoding, callback);
+ // Always write to terminal first, unless the output pipe is already broken.
+ const result = safeTerminalWrite({
+ originalWrite: originalStdoutWrite,
+ chunk,
+ encoding,
+ callback,
+ streamName: 'stdout',
+ });
 
  // Also append to log file if set, but skip if this write originated from log()
  if (logFile && !_writingFromLog) {
@@ -214,8 +305,14 @@ export const setupStdioLogInterceptor = () => {
  };
 
  process.stderr.write = (chunk, encoding, callback) => {
- // Always write to terminal first
- const result = originalStderrWrite(chunk, encoding, callback);
+ // Always write to terminal first, unless the output pipe is already broken.
+ const result = safeTerminalWrite({
+ originalWrite: originalStderrWrite,
+ chunk,
+ encoding,
+ callback,
+ streamName: 'stderr',
+ });
 
  // Also append to log file if set, but skip if this write originated from log()
  if (logFile && !_writingFromLog) {
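The hunks above route every `process.stdout.write` / `process.stderr.write` through `safeTerminalWrite`, which swallows broken-pipe errors and remembers that the stream is gone so later writes are skipped instead of crashing. A minimal standalone sketch of that pattern (simplified: one stream, no diagnostics file, names are illustrative):

```javascript
// Wrap a write function so that an EPIPE/ERR_STREAM_DESTROYED marks the
// stream as broken; subsequent writes are dropped instead of throwing.
const makeGuardedWrite = originalWrite => {
  let broken = false;
  return (chunk, encoding, callback) => {
    // Node allows write(chunk, cb) — normalize the optional encoding argument.
    const cb = typeof encoding === 'function' ? encoding : callback;
    if (broken) {
      if (typeof cb === 'function') cb();
      return false; // pipe is gone; drop the chunk silently
    }
    try {
      return originalWrite(chunk, encoding, callback);
    } catch (error) {
      if (error?.code !== 'EPIPE' && error?.code !== 'ERR_STREAM_DESTROYED') throw error;
      broken = true;
      if (typeof cb === 'function') cb(error);
      return false;
    }
  };
};

// Simulated write whose underlying pipe breaks after the first call:
let calls = 0;
const flakyWrite = () => {
  calls += 1;
  if (calls > 1) {
    const err = new Error('broken pipe');
    err.code = 'EPIPE';
    throw err;
  }
  return true;
};

const write = makeGuardedWrite(flakyWrite);
console.log(write('first'));  // true  – pipe still open
console.log(write('second')); // false – EPIPE swallowed, stream marked broken
console.log(write('third'));  // false – skipped without touching the pipe
console.log(calls);           // 2     – nothing reaches the pipe after breakage
```

The real implementation additionally installs an `error` listener on each stream (`installBrokenPipeGuard`), because a broken pipe can surface asynchronously rather than as a synchronous throw from `write`.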
@@ -11,6 +11,7 @@ const use = globalThis.use;
 
  // Use command-stream for consistent $ behavior across runtimes
  const { $ } = await use('command-stream');
+ const $silent = $({ mirror: false, capture: true });
 
  // Import shared library functions
  const lib = await import('./lib.mjs');
@@ -20,6 +21,12 @@ const { log } = lib;
  const sentryLib = await import('./sentry.lib.mjs');
  const { reportError } = sentryLib;
 
+ const summarizeCommandOutput = value => {
+ const text = value?.toString()?.trim() || '';
+ if (!text) return '';
+ return text.length > 500 ? `${text.slice(0, 500)}... [truncated ${text.length - 500} chars]` : text;
+ };
+
  /**
  * Upload a log file using gh-upload-log command
  * @param {Object} options - Upload options
@@ -92,12 +99,18 @@ export const uploadLogWithGhUploadLog = async ({ logFile, isPublic, description,
  // For gist: get raw URL from gist API
  const gistId = result.url.split('/').pop();
  try {
- const gistDetailsResult = await $`gh api gists/${gistId} --jq '{owner: .owner.login, files: .files, history: .history}'`;
+ if (verbose) {
+ await log(` 🔍 Fetching gist metadata for raw URL resolution (gistId=${gistId})`, { verbose: true });
+ }
+ const gistDetailsResult = await $silent`gh api gists/${gistId} --jq '{owner: .owner.login, history: .history, fileNames: (.files | keys)}'`;
+ if (verbose) {
+ await log(` 📥 Gist metadata fetch completed (code=${gistDetailsResult.code ?? 'unknown'})`, { verbose: true });
+ }
  if (gistDetailsResult.code === 0) {
  const gistDetails = JSON.parse(gistDetailsResult.stdout.toString());
  const gistOwner = gistDetails.owner;
  const commitSha = gistDetails.history?.[0]?.version;
- const fileNames = gistDetails.files ? Object.keys(gistDetails.files) : [];
+ const fileNames = Array.isArray(gistDetails.fileNames) ? gistDetails.fileNames : [];
  const fileName = fileNames.length > 0 ? fileNames[0] : 'log.txt';
 
  if (commitSha) {
@@ -105,6 +118,18 @@ export const uploadLogWithGhUploadLog = async ({ logFile, isPublic, description,
  } else {
  result.rawUrl = `https://gist.githubusercontent.com/${gistOwner}/${gistId}/raw/${fileName}`;
  }
+ if (verbose) {
+ await log(` 🧩 Gist metadata resolved owner=${gistOwner}, commitSha=${commitSha || 'latest'}, fileName=${fileName}`, { verbose: true });
+ }
+ } else if (verbose) {
+ const stderrSummary = summarizeCommandOutput(gistDetailsResult.stderr);
+ const stdoutSummary = summarizeCommandOutput(gistDetailsResult.stdout);
+ if (stderrSummary) {
+ await log(` ⚠️ Gist metadata stderr: ${stderrSummary}`, { verbose: true });
+ }
+ if (stdoutSummary) {
+ await log(` ⚠️ Gist metadata stdout: ${stdoutSummary}`, { verbose: true });
+ }
  }
  } catch (apiError) {
  if (verbose) {
@@ -121,7 +146,13 @@ export const uploadLogWithGhUploadLog = async ({ logFile, isPublic, description,
  try {
  const repoUrl = result.url;
  const repoPath = repoUrl.replace('https://github.com/', '');
- const contentsResult = await $`gh api repos/${repoPath}/contents --jq '.[].name'`;
+ if (verbose) {
+ await log(` 🔍 Fetching repository contents for raw URL resolution (repoPath=${repoPath})`, { verbose: true });
+ }
+ const contentsResult = await $silent`gh api repos/${repoPath}/contents --jq '.[].name'`;
+ if (verbose) {
+ await log(` 📥 Repository contents fetch completed (code=${contentsResult.code ?? 'unknown'})`, { verbose: true });
+ }
  if (contentsResult.code === 0) {
  const files = contentsResult.stdout
  .toString()
@@ -131,6 +162,18 @@ export const uploadLogWithGhUploadLog = async ({ logFile, isPublic, description,
  if (files.length > 0) {
  const fileName = files[0];
  result.rawUrl = `${repoUrl}/raw/main/${fileName}`;
+ if (verbose) {
+ await log(` 🧩 Repository contents resolved fileName=${fileName}`, { verbose: true });
+ }
+ }
+ } else if (verbose) {
+ const stderrSummary = summarizeCommandOutput(contentsResult.stderr);
+ const stdoutSummary = summarizeCommandOutput(contentsResult.stdout);
+ if (stderrSummary) {
+ await log(` ⚠️ Repository contents stderr: ${stderrSummary}`, { verbose: true });
+ }
+ if (stdoutSummary) {
+ await log(` ⚠️ Repository contents stdout: ${stdoutSummary}`, { verbose: true });
  }
  }
  } catch (apiError) {
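The new `summarizeCommandOutput` helper keeps verbose log lines readable by trimming command output and clipping it to 500 characters with a note about how much was dropped. The helper, exercised standalone exactly as added above:

```javascript
// Trim and clip command output so verbose logs stay readable.
const summarizeCommandOutput = value => {
  const text = value?.toString()?.trim() || '';
  if (!text) return '';
  return text.length > 500 ? `${text.slice(0, 500)}... [truncated ${text.length - 500} chars]` : text;
};

console.log(summarizeCommandOutput('  ok  '));        // ok
console.log(summarizeCommandOutput(null));            // (empty string – nullish input is safe)
console.log(summarizeCommandOutput('x'.repeat(650))); // 500 x's followed by "... [truncated 150 chars]"
```

Because `?.toString()?.trim()` short-circuits on `null`/`undefined`, the helper can be called directly on `result.stderr` or `result.stdout` without guarding for missing streams.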
@@ -137,7 +137,7 @@ ${workspaceInstructions}
  Initial research.
  - When you start, create a detailed plan for yourself and follow your todo list step by step. Add as many relevant points from these guidelines to the todo list as practical so you can track the work clearly.
  - When you read the issue, read all details and comments thoroughly.
- - When you see screenshots or images in issue descriptions, pull request descriptions, comments, or discussions, download the image to a local file first, then use the Read tool to view and analyze it. Before reading downloaded images with the Read tool, verify that the file is a valid image rather than HTML by using a CLI tool such as the 'file' command. When corrupted or non-image files, such as GitHub "Not Found" pages saved as `.png`, are read, they can cause "Could not process image" errors and crash the AI solver process. When the file command shows "HTML", "text", or "ASCII text", the download failed, so do not call Read on that file. Instead: (1) when images are from GitHub issues or PRs, such as URLs containing "github.com/user-attachments", retry with: curl -L -H "Authorization: token $(gh auth token)" -o <filename> "<url>" (2) when the retry still fails, skip the image and note that it was unavailable.
+ - When you see screenshots or images in issue descriptions, pull request descriptions, comments, or discussions, download the image to a local file first, then use the Read tool to view and analyze it. Before reading downloaded images with the Read tool, verify that the file is a valid image rather than HTML by using a CLI tool such as the 'file' command. When corrupted or non-image files, such as GitHub "Not Found" pages saved as \`.png\`, are read, they can cause "Could not process image" errors and crash the AI solver process. When the file command shows "HTML", "text", or "ASCII text", the download failed, so do not call Read on that file. Instead: (1) when images are from GitHub issues or PRs, such as URLs containing "github.com/user-attachments", retry with: curl -L -H "Authorization: token $(gh auth token)" -o <filename> "<url>" (2) when the retry still fails, skip the image and note that it was unavailable.
  - When you need issue details, use gh issue view https://github.com/${owner}/${repo}/issues/${issueNumber}.
  - When you need related code, use gh search code --owner ${owner} [keywords].
  - When you need repo context, read files in your working directory.${
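The prompt hunks above change only one thing: `` `.png` `` becomes `` \`.png\` ``. Per the 1.50.11 changelog entry, the guidance text lives inside a JavaScript template literal, where a bare backtick terminates the string and crashes the prompt builder; escaping renders the backticks literally. A small illustration (the `guidance` variable is hypothetical, not the real builder):

```javascript
// Inside a template literal, a bare ` would end the string early, so the
// prompt guidance must escape it as \` to render `.png` literally while
// still allowing ${...} interpolation.
const owner = 'link-assistant';
const guidance = `pages saved as \`.png\` can crash the solver (owner: ${owner})`;
console.log(guidance); // pages saved as `.png` can crash the solver (owner: link-assistant)
```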