@tiens.nguyen/gonext-local-worker 1.0.5 → 1.0.7

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
    "name": "@tiens.nguyen/gonext-local-worker",
-   "version": "1.0.5",
+   "version": "1.0.7",
    "description": "Polls GoNext cloud API for async local LLM jobs and runs them against Ollama/OpenAI-compatible servers on this Mac",
    "type": "module",
    "license": "MIT",
@@ -15,6 +15,11 @@
    "bin": {
      "gonext-local-worker": "./gonext-local-worker.mjs"
    },
+   "scripts": {
+     "deploy:local": "npm install -g .",
+     "run": "gonext-local-worker",
+     "publish:org": "npm publish --access public"
+   },
    "files": [
      "gonext-local-worker.mjs",
      "README.md",
package/README.md DELETED
@@ -1,120 +0,0 @@
- # gonext-local-worker
-
- Runs on **your Mac** next to **Ollama** or any **OpenAI-compatible** local server. It:
-
- 1. Polls **`POST /api/worker/jobs/next`** on your GoNext cloud API (Lambda).
- 2. Runs the job against **`payload.baseURL`** (your LAN `http://127.0.0.1:11434/v1`, etc.).
- 3. **`PATCH`**es **`running`** → **`completed`** / **`failed`** so DynamoDB and the web app update.
-
- You must create a **Worker API key** in the web app **Settings** (stored as a hash in DynamoDB).
-
- ## Install
-
- ```bash
- npm install -g @tiens.nguyen/gonext-local-worker
- ```
-
- Or from source:
-
- ```bash
- cd tools/gonext-local-worker
- npm install
- npm link
- ```
-
- Requires **Node.js 18+**.
-
- ## Configure
-
- Create `~/.gonext/worker.env` (optional):
-
- ```bash
- mkdir -p ~/.gonext
- cat > ~/.gonext/worker.env << 'EOF'
- GONEXT_API_BASE=https://YOUR_API.execute-api.ap-southeast-1.amazonaws.com
- GONEXT_WORKER_KEY=paste-your-worker-secret-here
- GONEXT_POLL_MS=1500
- EOF
- chmod 600 ~/.gonext/worker.env
- ```
-
- Or export the variables in your shell:
-
- ```bash
- export GONEXT_API_BASE=https://....execute-api....amazonaws.com
- export GONEXT_WORKER_KEY=...
- ```
-
- ## Run
-
- ```bash
- gonext-local-worker
- ```
-
- Options:
-
- ```bash
- gonext-local-worker --poll-ms 2000
- gonext-local-worker --api-base https://other-host.example
- gonext-local-worker --mode webhook --webhook-port 5001
- gonext-local-worker --mode both
- gonext-local-worker --help
- ```
-
- Leave this process **running** while you use async local models from the web app.
-
- ## Push-dispatch mode (new option)
-
- There are now two ways for the worker to receive jobs and update their status:
-
- 1. **Polling mode** (existing): the worker polls `/api/worker/jobs/next`.
- 2. **Webhook mode** (new): the API push-dispatches job payloads to your local worker endpoint.
-
- Run the local webhook endpoint:
-
- ```bash
- gonext-local-worker --mode webhook --webhook-port 5001
- ```
-
- Or keep both (recommended during the transition):
-
- ```bash
- gonext-local-worker --mode both --webhook-port 5001
- ```
-
- Expose the webhook with ngrok:
-
- ```bash
- ngrok http 5001
- ```
-
- Then, in the web app **Settings**, set **Worker webhook URL** to:
-
- `https://<your-ngrok-domain>/api/dispatch-job`
-
- The local worker exposes these API endpoints on port 5001:
-
- - `POST /api/chat` -> proxies to `http://localhost:11434/api/chat`
- - `POST /api/generate` -> proxies to `http://localhost:11434/api/generate`
- - `GET /api/tags` -> proxies to `http://localhost:11434/api/tags`
- - `POST /api/dispatch-job` -> cloud dispatch entrypoint (token-protected)
- - `GET /api/health`
-
- Legacy non-prefixed routes (`/chat`, `/generate`, `/tags`, `/dispatch-job`, `/health`) are also available for compatibility.
-
- ## Run at login (macOS LaunchAgent)
-
- 1. Copy `launchd/com.gonext.worker.plist.example` to `~/Library/LaunchAgents/com.gonext.worker.plist`.
- 2. Edit the paths: set the **full path** to `gonext-local-worker` (run `which gonext-local-worker` after `npm link`).
- 3. Run `launchctl load ~/Library/LaunchAgents/com.gonext.worker.plist`.
-
- To unload:
-
- ```bash
- launchctl unload ~/Library/LaunchAgents/com.gonext.worker.plist
- ```
-
- ## Troubleshooting
-
- - **`403` / invalid key** — Regenerate the worker key in Settings and update `GONEXT_WORKER_KEY`.
- - **Job never claimed** — Ensure DynamoDB and the worker routes work; Settings must include an **Ollama base URL** so the queued payload points at a host reachable from this Mac (`127.0.0.1` is fine).
package/gonext-local-worker.mjs DELETED
@@ -1,523 +0,0 @@
- #!/usr/bin/env node
- /**
-  * GoNext local LLM worker — runs on the Mac where Ollama / MLX HTTP server lives.
-  * Polls your cloud API for jobs, calls the model locally, PATCHes status back.
-  *
-  * Env:
-  *   GONEXT_API_BASE    HTTPS origin only (no trailing slash), e.g. https://xxx.execute-api....amazonaws.com
-  *   GONEXT_WORKER_KEY  Plaintext worker secret from Web → Settings → Worker API key
-  *   GONEXT_POLL_MS     Poll interval when idle (default 1500)
-  *
-  * Optional env files (loaded in order; shell exports win if set before launch):
-  *   ~/.gonext/worker.env
-  *   ./.env (cwd)
-  */
- import { homedir } from "node:os";
- import os from "node:os";
- import { createHash } from "node:crypto";
- import { join } from "node:path";
- import dotenv from "dotenv";
- import express from "express";
- import OpenAI from "openai";
-
- dotenv.config({ path: join(homedir(), ".gonext", "worker.env") });
- dotenv.config();
-
- function parseArgs(argv) {
-   const out = {
-     help: false,
-     pollMs: undefined,
-     apiBase: undefined,
-     mode: undefined,
-     webhookPort: undefined,
-   };
-   for (let i = 2; i < argv.length; i++) {
-     const a = argv[i];
-     if (a === "--help" || a === "-h") out.help = true;
-     else if (a === "--poll-ms" && argv[i + 1])
-       out.pollMs = Number(argv[++i]);
-     else if (a === "--api-base" && argv[i + 1]) out.apiBase = argv[++i];
-     else if (a === "--mode" && argv[i + 1]) out.mode = argv[++i];
-     else if (a === "--webhook-port" && argv[i + 1])
-       out.webhookPort = Number(argv[++i]);
-   }
-   return out;
- }
-
- function printHelp() {
-   console.log(`
- gonext-local-worker — bridge cloud GoNext jobs ↔ local LLM (Ollama / OpenAI-compatible)
-
- Usage:
-   export GONEXT_API_BASE=https://....amazonaws.com
-   export GONEXT_WORKER_KEY=your-secret-from-settings
-   gonext-local-worker
-
- Options:
-   --poll-ms <ms>              Idle poll interval (default 1500 or GONEXT_POLL_MS)
-   --api-base <url>            Override GONEXT_API_BASE
-   --mode <poll|webhook|both>  Worker mode (default: poll)
-   --webhook-port <n>          Local webhook port (default: 5001)
-
- Config files (optional):
-   ~/.gonext/worker.env
-   .env in current directory
-
- Install:
-   npm install -g @tiens.nguyen/gonext-local-worker
-
- Then keep this running while you use the web app with local models.
- `);
- }
-
- const args = parseArgs(process.argv);
- if (args.help) {
-   printHelp();
-   process.exit(0);
- }
-
- const apiBase = (
-   args.apiBase ??
-   process.env.GONEXT_API_BASE ??
-   ""
- ).replace(/\/+$/, "");
- const workerKey = process.env.GONEXT_WORKER_KEY ?? "";
- const pollMs =
-   (Number.isFinite(args.pollMs) && args.pollMs > 0
-     ? args.pollMs
-     : Number(process.env.GONEXT_POLL_MS ?? "1500")) || 1500;
- const modeRaw = String(args.mode ?? process.env.GONEXT_WORKER_MODE ?? "poll");
- const mode = ["poll", "webhook", "both"].includes(modeRaw)
-   ? modeRaw
-   : "poll";
- const webhookPort =
-   (Number.isFinite(args.webhookPort) && args.webhookPort > 0
-     ? args.webhookPort
-     : Number(process.env.GONEXT_WEBHOOK_PORT ?? "5001")) || 5001;
- const ollamaBase = (
-   process.env.GONEXT_LOCAL_OLLAMA_BASE ?? "http://127.0.0.1:11434"
- ).replace(/\/+$/, "");
-
- if (!apiBase || !workerKey) {
-   console.error(
-     "Missing GONEXT_API_BASE or GONEXT_WORKER_KEY.\nSet them or put them in ~/.gonext/worker.env — run with --help."
-   );
-   process.exit(1);
- }
-
- const WORKER_VERSION = "1.0.0";
- const WORKER_HOST = os.hostname();
- const DISPATCH_TOKEN = createHash("sha256")
-   .update(workerKey, "utf8")
-   .digest("hex");
-
- function ts() {
-   return new Date().toISOString();
- }
-
- function toOpenAIMessages(messages) {
-   return messages.map((m) => {
-     if (m.role === "user" && m.attachments?.length) {
-       return {
-         role: m.role,
-         content: [
-           { type: "text", text: m.content },
-           ...m.attachments.map((a) => ({
-             type: "image_url",
-             image_url: { url: `data:${a.mimeType};base64,${a.data}` },
-           })),
-         ],
-       };
-     }
-     return { role: m.role, content: m.content };
-   });
- }
-
- async function workerFetch(path, init = {}) {
-   const url = `${apiBase}${path.startsWith("/") ? path : `/${path}`}`;
-   const headers = {
-     "Content-Type": "application/json",
-     "X-Worker-Key": workerKey,
-     ...(init.headers ?? {}),
-   };
-   return fetch(url, { ...init, headers });
- }
-
- let shuttingDown = false;
- let activeJobId = "";
- let lastError = "";
- let webhookServer = null;
-
- async function postHeartbeat(payload) {
-   try {
-     await workerFetch("/api/worker/heartbeat", {
-       method: "POST",
-       body: JSON.stringify({
-         workerVersion: WORKER_VERSION,
-         host: WORKER_HOST,
-         ...payload,
-       }),
-     });
-   } catch {
-     /* keep worker loop alive */
-   }
- }
-
- async function runChatJob(job) {
-   const { jobId, payload } = job;
-   const start = Date.now();
-   activeJobId = jobId;
-   lastError = "";
-   await postHeartbeat({ state: "running", currentJobId: jobId });
-   const patchRunning = await workerFetch(`/api/worker/jobs/${jobId}`, {
-     method: "PATCH",
-     body: JSON.stringify({ jobStatus: "running" }),
-   });
-   if (!patchRunning.ok) {
-     const t = await patchRunning.text().catch(() => "");
-     console.error(`[${ts()}] PATCH running failed ${patchRunning.status}`, t);
-     await postHeartbeat({
-       state: "error",
-       currentJobId: jobId,
-       lastError: `PATCH running failed ${patchRunning.status}`,
-     });
-     lastError = `PATCH running failed ${patchRunning.status}`;
-     activeJobId = "";
-     return;
-   }
-
-   const client = new OpenAI({
-     baseURL: payload.baseURL,
-     apiKey: payload.apiKey || "ollama",
-   });
-
-   try {
-     const completion = await client.chat.completions.create({
-       model: payload.modelId,
-       messages: toOpenAIMessages(payload.messages),
-       temperature: 0,
-     });
-     const text = completion.choices[0]?.message?.content ?? "";
-     const totalTimeSeconds = (Date.now() - start) / 1000;
-     await workerFetch(`/api/worker/jobs/${jobId}`, {
-       method: "PATCH",
-       body: JSON.stringify({
-         jobStatus: "completed",
-         resultText: text,
-         tokenCount: Math.max(1, completion.usage?.total_tokens ?? 1),
-         totalTimeSeconds,
-       }),
-     });
-     console.log(
-       `[${ts()}] completed job ${jobId} (${totalTimeSeconds.toFixed(1)}s)`
-     );
-     await postHeartbeat({
-       state: "idle",
-       currentJobId: "",
-       lastJobCompletedAt: new Date().toISOString(),
-     });
-     activeJobId = "";
-   } catch (e) {
-     const message = e instanceof Error ? e.message : String(e);
-     await workerFetch(`/api/worker/jobs/${jobId}`, {
-       method: "PATCH",
-       body: JSON.stringify({
-         jobStatus: "failed",
-         errorMessage: message,
-         totalTimeSeconds: (Date.now() - start) / 1000,
-       }),
-     });
-     console.error(`[${ts()}] failed job ${jobId}:`, message);
-     await postHeartbeat({
-       state: "error",
-       currentJobId: jobId,
-       lastError: message,
-     });
-     lastError = message;
-     activeJobId = "";
-   }
- }
-
- async function pollOnce() {
-   const res = await workerFetch("/api/worker/jobs/next", { method: "POST" });
-   if (res.status === 204) return;
-   if (!res.ok) {
-     const t = await res.text().catch(() => "");
-     throw new Error(`POST /api/worker/jobs/next → ${res.status}: ${t}`);
-   }
-   const job = await res.json();
-   if (job?.jobId && job.payload) {
-     await runChatJob(job);
-   }
- }
-
- function sleep(ms) {
-   return new Promise((r) => setTimeout(r, ms));
- }
-
- function requireDispatchToken(req, res) {
-   const token = req.header("x-gonext-dispatch-token") ?? "";
-   if (token !== DISPATCH_TOKEN) {
-     res.status(403).json({ error: "Invalid dispatch token." });
-     return false;
-   }
-   return true;
- }
-
- function toOllamaChatMessages(messages) {
-   return (messages ?? []).map((m) => ({
-     role: m.role,
-     content: m.content,
-     ...(Array.isArray(m.attachments) && m.attachments.length > 0
-       ? { images: m.attachments.map((a) => a.data) }
-       : {}),
-   }));
- }
-
- async function runOllamaAndMaybeUpdateJob(params) {
-   const { endpoint, requestBody, jobId, extractResultText } = params;
-   const hasJob = typeof jobId === "string" && jobId.length > 0;
-   const start = Date.now();
-   if (hasJob) {
-     activeJobId = jobId;
-     lastError = "";
-     await postHeartbeat({ state: "running", currentJobId: jobId });
-     await workerFetch(`/api/worker/jobs/${jobId}`, {
-       method: "PATCH",
-       body: JSON.stringify({ jobStatus: "running" }),
-     });
-   }
-   try {
-     const res = await fetch(`${ollamaBase}${endpoint}`, {
-       method: "POST",
-       headers: { "Content-Type": "application/json" },
-       body: JSON.stringify({
-         stream: false,
-         ...requestBody,
-       }),
-     });
-     const raw = await res.text();
-     if (!res.ok) {
-       throw new Error(`Ollama ${endpoint} failed ${res.status}: ${raw}`);
-     }
-     const parsed = raw ? JSON.parse(raw) : {};
-     if (hasJob) {
-       await workerFetch(`/api/worker/jobs/${jobId}`, {
-         method: "PATCH",
-         body: JSON.stringify({
-           jobStatus: "completed",
-           resultText: extractResultText(parsed),
-           tokenCount: 1,
-           totalTimeSeconds: (Date.now() - start) / 1000,
-         }),
-       });
-       await postHeartbeat({
-         state: "idle",
-         currentJobId: "",
-         lastJobCompletedAt: new Date().toISOString(),
-       });
-       activeJobId = "";
-     }
-     return parsed;
-   } catch (e) {
-     if (hasJob) {
-       const message = e instanceof Error ? e.message : String(e);
-       await workerFetch(`/api/worker/jobs/${jobId}`, {
-         method: "PATCH",
-         body: JSON.stringify({
-           jobStatus: "failed",
-           errorMessage: message,
-           totalTimeSeconds: (Date.now() - start) / 1000,
-         }),
-       });
-       await postHeartbeat({
-         state: "error",
-         currentJobId: jobId,
-         lastError: message,
-       });
-       lastError = message;
-       activeJobId = "";
-     }
-     throw e;
-   }
- }
-
- function startWebhookServer() {
-   const app = express();
-   app.use(express.json({ limit: "10mb" }));
-
-   const healthHandler = (_req, res) => {
-     const state =
-       activeJobId.length > 0 ? "running" : lastError ? "error" : "idle";
-     res.json({
-       ok: true,
-       workerVersion: WORKER_VERSION,
-       host: WORKER_HOST,
-       mode,
-       state,
-       activeJobId: activeJobId || undefined,
-       lastError: lastError || undefined,
-       dispatchPath: "/dispatch-job",
-       chatPath: "/chat",
-       generatePath: "/generate",
-       tagsPath: "/tags",
-       ollamaBase,
-     });
-   };
-   app.get("/health", healthHandler);
-   app.get("/api/health", healthHandler);
-
-   const tagsHandler = async (_req, res) => {
-     try {
-       const r = await fetch(`${ollamaBase}/api/tags`, { method: "GET" });
-       const raw = await r.text();
-       if (!r.ok) {
-         res
-           .status(r.status)
-           .json({ error: `Ollama /api/tags failed ${r.status}`, raw });
-         return;
-       }
-       res.type("application/json").send(raw || "{}");
-     } catch (e) {
-       res.status(500).json({ error: e instanceof Error ? e.message : "error" });
-     }
-   };
-   app.get("/tags", tagsHandler);
-   app.get("/api/tags", tagsHandler);
-
-   const dispatchJobHandler = async (req, res) => {
-     if (!requireDispatchToken(req, res)) return;
-     const body = req.body ?? {};
-     if (!body?.jobId || !body?.payload) {
-       res.status(400).json({ error: "Expected { jobId, payload }." });
-       return;
-     }
-     if (activeJobId) {
-       res.status(409).json({ error: "Worker busy.", activeJobId });
-       return;
-     }
-     const payload = body.payload;
-     const chatBody = {
-       model: payload.modelId,
-       messages: toOllamaChatMessages(payload.messages),
-     };
-     void runOllamaAndMaybeUpdateJob({
-       endpoint: "/api/chat",
-       requestBody: chatBody,
-       jobId: String(body.jobId),
-       extractResultText: (j) => j?.message?.content ?? "",
-     }).catch((e) => {
-       console.error("[dispatch-job] error:", e instanceof Error ? e.message : e);
-     });
-     res.status(202).json({ ok: true, accepted: true, jobId: body.jobId });
-   };
-   app.post("/dispatch-job", dispatchJobHandler);
-   app.post("/api/dispatch-job", dispatchJobHandler);
-
-   const chatHandler = async (req, res) => {
-     try {
-       const body = req.body ?? {};
-       const jobId = typeof body.jobId === "string" ? body.jobId : "";
-       if (jobId && !requireDispatchToken(req, res)) return;
-       const requestBody =
-         body.request && typeof body.request === "object" ? body.request : body;
-       const parsed = await runOllamaAndMaybeUpdateJob({
-         endpoint: "/api/chat",
-         requestBody,
-         jobId,
-         extractResultText: (j) => j?.message?.content ?? "",
-       });
-       res.json(parsed);
-     } catch (e) {
-       res.status(500).json({ error: e instanceof Error ? e.message : "error" });
-     }
-   };
-   app.post("/chat", chatHandler);
-   app.post("/api/chat", chatHandler);
-
-   const generateHandler = async (req, res) => {
-     try {
-       const body = req.body ?? {};
-       const jobId = typeof body.jobId === "string" ? body.jobId : "";
-       if (jobId && !requireDispatchToken(req, res)) return;
-       const requestBody =
-         body.request && typeof body.request === "object" ? body.request : body;
-       const parsed = await runOllamaAndMaybeUpdateJob({
-         endpoint: "/api/generate",
-         requestBody,
-         jobId,
-         extractResultText: (j) => j?.response ?? "",
-       });
-       res.json(parsed);
-     } catch (e) {
-       res.status(500).json({ error: e instanceof Error ? e.message : "error" });
-     }
-   };
-   app.post("/generate", generateHandler);
-   app.post("/api/generate", generateHandler);
-
-   webhookServer = app.listen(webhookPort, "127.0.0.1", () => {
-     console.log(
-       `[${ts()}] local worker API on http://127.0.0.1:${webhookPort} (/chat, /generate, /tags, /dispatch-job, /health)`
-     );
-     console.log(`[${ts()}] dispatch token is SHA256(worker key)`);
-   });
- }
-
- async function main() {
-   console.log(`[${ts()}] gonext-local-worker`);
-   console.log(`  API   ${apiBase}`);
-   console.log(`  mode  ${mode}`);
-   if (mode === "poll" || mode === "both") {
-     console.log(`  poll  every ${pollMs}ms (idle)`);
-   }
-   if (mode === "webhook" || mode === "both") {
-     console.log(`  hook  http://127.0.0.1:${webhookPort}/dispatch-job`);
-   }
-   console.log(`  stop  Ctrl+C`);
-   await postHeartbeat({ state: "idle", currentJobId: "" });
-
-   const loop = async () => {
-     while (!shuttingDown) {
-       try {
-         await pollOnce();
-         await postHeartbeat({ state: "idle", currentJobId: "" });
-       } catch (e) {
-         console.error(`[${ts()}] poll error:`, e instanceof Error ? e.message : e);
-         await postHeartbeat({
-           state: "error",
-           currentJobId: "",
-           lastError: e instanceof Error ? e.message : String(e),
-         });
-       }
-       if (shuttingDown) break;
-       await sleep(pollMs);
-     }
-   };
-
-   const stop = () => {
-     if (shuttingDown) return;
-     shuttingDown = true;
-     console.log(`\n[${ts()}] shutting down…`);
-     if (webhookServer) {
-       webhookServer.close();
-     }
-     process.exit(0);
-   };
-   process.on("SIGINT", stop);
-   process.on("SIGTERM", stop);
-
-   if (mode === "webhook" || mode === "both") {
-     startWebhookServer();
-   }
-   if (mode === "poll" || mode === "both") {
-     await loop();
-   } else {
-     await new Promise(() => {});
-   }
- }
-
- main().catch((e) => {
-   console.error(e);
-   process.exit(1);
- });
package/launchd/com.gonext.worker.plist.example DELETED
@@ -1,34 +0,0 @@
- <?xml version="1.0" encoding="UTF-8"?>
- <!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
- <plist version="1.0">
- <dict>
-   <key>Label</key>
-   <string>com.gonext.worker</string>
-   <key>ProgramArguments</key>
-   <array>
-     <!-- Replace with output of: which gonext-local-worker -->
-     <string>/usr/local/bin/gonext-local-worker</string>
-   </array>
-   <!-- Optional: uncomment to force Node path -->
-   <!--
-   <array>
-     <string>/usr/local/bin/node</string>
-     <string>/ABSOLUTE/PATH/TO/gonext/tools/gonext-local-worker/gonext-local-worker.mjs</string>
-   </array>
-   -->
-   <key>RunAtLoad</key>
-   <true/>
-   <key>KeepAlive</key>
-   <true/>
-   <key>StandardOutPath</key>
-   <string>/tmp/gonext-worker.log</string>
-   <key>StandardErrorPath</key>
-   <string>/tmp/gonext-worker.err</string>
-   <key>EnvironmentVariables</key>
-   <dict>
-     <!-- Or rely on ~/.gonext/worker.env (loaded by the worker) -->
-     <key>PATH</key>
-     <string>/usr/local/bin:/usr/bin:/bin:/opt/homebrew/bin</string>
-   </dict>
- </dict>
- </plist>