@constructive-io/knative-job-worker 0.7.15 → 0.8.1
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +58 -0
- package/esm/index.js +208 -0
- package/esm/req.js +55 -0
- package/{src/run.ts → esm/run.js} +4 -11
- package/{dist/index.d.ts → index.d.ts} +5 -1
- package/{dist/index.js → index.js} +55 -5
- package/package.json +22 -13
- package/CHANGELOG.md +0 -228
- package/__tests__/req.test.ts +0 -130
- package/__tests__/worker.integration.test.ts +0 -117
- package/jest.config.js +0 -18
- package/src/index.ts +0 -210
- package/src/req.ts +0 -82
- package/tsconfig.esm.json +0 -9
- package/tsconfig.json +0 -9
- /package/{dist/req.d.ts → req.d.ts} +0 -0
- /package/{dist/req.js → req.js} +0 -0
- /package/{dist/run.d.ts → run.d.ts} +0 -0
- /package/{dist/run.js → run.js} +0 -0
package/README.md
CHANGED
@@ -1,3 +1,61 @@
  # knative-job-worker

  Knative-compatible job worker that uses the existing Constructive PostgreSQL job queue and job utilities, invoking HTTP functions via `KNATIVE_SERVICE_URL` (or `INTERNAL_GATEWAY_URL` as a fallback) while preserving the same headers and payload shape as the OpenFaaS worker.
+
+ ---
+
+ ## Education and Tutorials
+
+ 1. 🚀 [Quickstart: Getting Up and Running](https://constructive.io/learn/quickstart)
+ Get started with modular databases in minutes. Install prerequisites and deploy your first module.
+
+ 2. 📦 [Modular PostgreSQL Development with Database Packages](https://constructive.io/learn/modular-postgres)
+ Learn to organize PostgreSQL projects with pgpm workspaces and reusable database modules.
+
+ 3. ✏️ [Authoring Database Changes](https://constructive.io/learn/authoring-database-changes)
+ Master the workflow for adding, organizing, and managing database changes with pgpm.
+
+ 4. 🧪 [End-to-End PostgreSQL Testing with TypeScript](https://constructive.io/learn/e2e-postgres-testing)
+ Master end-to-end PostgreSQL testing with ephemeral databases, RLS testing, and CI/CD automation.
+
+ 5. ⚡ [Supabase Testing](https://constructive.io/learn/supabase)
+ Use TypeScript-first tools to test Supabase projects with realistic RLS, policies, and auth contexts.
+
+ 6. 💧 [Drizzle ORM Testing](https://constructive.io/learn/drizzle-testing)
+ Run full-stack tests with Drizzle ORM, including database setup, teardown, and RLS enforcement.
+
+ 7. 🔧 [Troubleshooting](https://constructive.io/learn/troubleshooting)
+ Common issues and solutions for pgpm, PostgreSQL, and testing.
+
+ ## Related Constructive Tooling
+
+ ### 📦 Package Management
+
+ * [pgpm](https://github.com/constructive-io/constructive/tree/main/pgpm/pgpm): **🖥️ PostgreSQL Package Manager** for modular Postgres development. Works with database workspaces, scaffolding, migrations, seeding, and installing database packages.
+
+ ### 🧪 Testing
+
+ * [pgsql-test](https://github.com/constructive-io/constructive/tree/main/postgres/pgsql-test): **📊 Isolated testing environments** with per-test transaction rollbacks—ideal for integration tests, complex migrations, and RLS simulation.
+ * [pgsql-seed](https://github.com/constructive-io/constructive/tree/main/postgres/pgsql-seed): **🌱 PostgreSQL seeding utilities** for CSV, JSON, SQL data loading, and pgpm deployment.
+ * [supabase-test](https://github.com/constructive-io/constructive/tree/main/postgres/supabase-test): **🧪 Supabase-native test harness** preconfigured for the local Supabase stack—per-test rollbacks, JWT/role context helpers, and CI/GitHub Actions ready.
+ * [graphile-test](https://github.com/constructive-io/constructive/tree/main/graphile/graphile-test): **🔐 Authentication mocking** for Graphile-focused test helpers and emulating row-level security contexts.
+ * [pg-query-context](https://github.com/constructive-io/constructive/tree/main/postgres/pg-query-context): **🔒 Session context injection** to add session-local context (e.g., `SET LOCAL`) into queries—ideal for setting `role`, `jwt.claims`, and other session settings.
+
+ ### 🧠 Parsing & AST
+
+ * [pgsql-parser](https://www.npmjs.com/package/pgsql-parser): **🔄 SQL conversion engine** that interprets and converts PostgreSQL syntax.
+ * [libpg-query-node](https://www.npmjs.com/package/libpg-query): **🌉 Node.js bindings** for `libpg_query`, converting SQL into parse trees.
+ * [pg-proto-parser](https://www.npmjs.com/package/pg-proto-parser): **📦 Protobuf parser** for parsing PostgreSQL Protocol Buffers definitions to generate TypeScript interfaces, utility functions, and JSON mappings for enums.
+ * [@pgsql/enums](https://www.npmjs.com/package/@pgsql/enums): **🏷️ TypeScript enums** for PostgreSQL AST for safe and ergonomic parsing logic.
+ * [@pgsql/types](https://www.npmjs.com/package/@pgsql/types): **📝 Type definitions** for PostgreSQL AST nodes in TypeScript.
+ * [@pgsql/utils](https://www.npmjs.com/package/@pgsql/utils): **🛠️ AST utilities** for constructing and transforming PostgreSQL syntax trees.
+
+ ## Credits
+
+ **🛠 Built by the [Constructive](https://constructive.io) team — creators of modular Postgres tooling for secure, composable backends. If you like our work, contribute on [GitHub](https://github.com/constructive-io).**
+
+ ## Disclaimer
+
+ AS DESCRIBED IN THE LICENSES, THE SOFTWARE IS PROVIDED "AS IS", AT YOUR OWN RISK, AND WITHOUT WARRANTIES OF ANY KIND.
+
+ No developer or entity involved in creating this software will be liable for any claims or damages whatsoever associated with your use, inability to use, or your interaction with other users of the code, including any direct, indirect, incidental, special, exemplary, punitive or consequential damages, or loss of profits, cryptocurrencies, tokens, or anything else of value.
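The README description above is the only place the target resolution is spelled out: `KNATIVE_SERVICE_URL` first, `INTERNAL_GATEWAY_URL` as the fallback. A minimal sketch of that precedence, assuming both are plain environment variables; the real lookup lives in `@constructive-io/job-utils` (e.g. `getJobGatewayConfig`), not in this snippet:

```ts
// Sketch only: mirrors the precedence documented in the README, not the
// actual implementation in @constructive-io/job-utils.
const resolveGatewayUrl = (): string => {
  const url =
    process.env.KNATIVE_SERVICE_URL ?? process.env.INTERNAL_GATEWAY_URL;
  if (!url) {
    throw new Error('Set KNATIVE_SERVICE_URL or INTERNAL_GATEWAY_URL');
  }
  // Strip a trailing slash so task names can be appended as `${base}/${fn}`,
  // matching how esm/req.js builds its URLs.
  return url.replace(/\/$/, '');
};
```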
package/esm/index.js
ADDED
@@ -0,0 +1,208 @@
+ import poolManager from '@constructive-io/job-pg';
+ import * as jobs from '@constructive-io/job-utils';
+ import { Logger } from '@pgpmjs/logger';
+ import { request as req } from './req';
+ const log = new Logger('jobs:worker');
+ export default class Worker {
+     idleDelay;
+     supportedTaskNames;
+     workerId;
+     doNextTimer;
+     pgPool;
+     _initialized;
+     listenClient;
+     listenRelease;
+     stopped;
+     constructor({ tasks, idleDelay = 15000, pgPool = poolManager.getPool(), workerId = 'worker-0' }) {
+         /*
+          * idleDelay: This is how long to wait between polling for jobs.
+          *
+          * Note: this does NOT need to be short, because we use LISTEN/NOTIFY to be
+          * notified when new jobs are added - this is just used in the case where
+          * LISTEN/NOTIFY fails for whatever reason.
+          */
+         this.idleDelay = idleDelay;
+         this.supportedTaskNames = tasks;
+         this.workerId = workerId;
+         this.doNextTimer = undefined;
+         this.pgPool = pgPool;
+         poolManager.onClose(async () => {
+             await jobs.releaseJobs(pgPool, { workerId: this.workerId });
+         });
+     }
+     async initialize(client) {
+         if (this._initialized === true)
+             return;
+         // release any jobs not finished from before if fatal error prevented cleanup
+         await jobs.releaseJobs(client, { workerId: this.workerId });
+         this._initialized = true;
+         await this.doNext(client);
+     }
+     async handleFatalError(client, { err, fatalError, jobId }) {
+         const when = err ? `after failure '${err.message}'` : 'after success';
+         log.error(`Failed to release job '${jobId}' ${when}; committing seppuku`);
+         await poolManager.close();
+         log.error(String(fatalError));
+         process.exit(1);
+     }
+     async handleError(client, { err, job, duration }) {
+         log.error(`Failed task ${job.id} (${job.task_identifier}) with error ${err.message} (${duration}ms)`);
+         if (err.stack) {
+             log.debug(err.stack);
+         }
+         await jobs.failJob(client, {
+             workerId: this.workerId,
+             jobId: job.id,
+             message: err.message
+         });
+     }
+     async handleSuccess(client, { job, duration }) {
+         log.info(`Async task ${job.id} (${job.task_identifier}) to be processed`);
+     }
+     async doWork(job) {
+         const { payload, task_identifier } = job;
+         log.debug('starting work on job', {
+             id: job.id,
+             task: task_identifier,
+             databaseId: job.database_id
+         });
+         if (!jobs.getJobSupportAny() &&
+             !this.supportedTaskNames.includes(task_identifier)) {
+             throw new Error('Unsupported task');
+         }
+         await req(task_identifier, {
+             body: payload,
+             databaseId: job.database_id,
+             workerId: this.workerId,
+             jobId: job.id
+         });
+     }
+     async doNext(client) {
+         if (this.stopped)
+             return;
+         if (!this._initialized) {
+             return await this.initialize(client);
+         }
+         log.debug('checking for jobs...');
+         if (this.doNextTimer) {
+             clearTimeout(this.doNextTimer);
+             this.doNextTimer = undefined;
+         }
+         try {
+             const job = (await jobs.getJob(client, {
+                 workerId: this.workerId,
+                 supportedTaskNames: jobs.getJobSupportAny()
+                     ? null
+                     : this.supportedTaskNames
+             }));
+             if (!job || !job.id) {
+                 if (!this.stopped) {
+                     this.doNextTimer = setTimeout(() => this.doNext(client), this.idleDelay);
+                 }
+                 return;
+             }
+             const start = process.hrtime();
+             let err = null;
+             try {
+                 await this.doWork(job);
+             }
+             catch (error) {
+                 err = error;
+             }
+             const durationRaw = process.hrtime(start);
+             const duration = ((durationRaw[0] * 1e9 + durationRaw[1]) / 1e6).toFixed(2);
+             const jobId = job.id;
+             try {
+                 if (err) {
+                     await this.handleError(client, { err, job, duration });
+                 }
+                 else {
+                     await this.handleSuccess(client, { job, duration });
+                 }
+             }
+             catch (fatalError) {
+                 await this.handleFatalError(client, { err, fatalError, jobId });
+             }
+             if (!this.stopped) {
+                 return this.doNext(client);
+             }
+             return;
+         }
+         catch (err) {
+             if (!this.stopped) {
+                 this.doNextTimer = setTimeout(() => this.doNext(client), this.idleDelay);
+             }
+         }
+     }
+     listen() {
+         if (this.stopped)
+             return;
+         const listenForChanges = (err, client, release) => {
+             if (err) {
+                 log.error('Error connecting with notify listener', err);
+                 if (err instanceof Error && err.stack) {
+                     log.debug(err.stack);
+                 }
+                 // Try again in 5 seconds
+                 // should this really be done in the node process?
+                 if (!this.stopped) {
+                     setTimeout(this.listen, 5000);
+                 }
+                 return;
+             }
+             if (this.stopped) {
+                 release();
+                 return;
+             }
+             this.listenClient = client;
+             this.listenRelease = release;
+             client.on('notification', () => {
+                 if (this.doNextTimer) {
+                     // Must be idle, do something!
+                     this.doNext(client);
+                 }
+             });
+             client.query('LISTEN "jobs:insert"');
+             client.on('error', (e) => {
+                 if (this.stopped) {
+                     release();
+                     return;
+                 }
+                 log.error('Error with database notify listener', e);
+                 if (e instanceof Error && e.stack) {
+                     log.debug(e.stack);
+                 }
+                 release();
+                 if (!this.stopped) {
+                     this.listen();
+                 }
+             });
+             log.info(`${this.workerId} connected and looking for jobs...`);
+             this.doNext(client);
+         };
+         this.pgPool.connect(listenForChanges);
+     }
+     async stop() {
+         this.stopped = true;
+         if (this.doNextTimer) {
+             clearTimeout(this.doNextTimer);
+             this.doNextTimer = undefined;
+         }
+         const client = this.listenClient;
+         const release = this.listenRelease;
+         this.listenClient = undefined;
+         this.listenRelease = undefined;
+         if (client && release) {
+             client.removeAllListeners('notification');
+             client.removeAllListeners('error');
+             try {
+                 await client.query('UNLISTEN "jobs:insert"');
+             }
+             catch {
+                 // Ignore listener cleanup errors during shutdown.
+             }
+             release();
+         }
+     }
+ }
+ export { Worker };
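A sketch of embedding this Worker directly, using only the API visible above (`listen()`, the new `stop()`, and the constructor options). The task identifiers here are hypothetical placeholders:

```ts
// Minimal embedding sketch based on the Worker class added in esm/index.js.
import poolManager from '@constructive-io/job-pg';
import Worker from '@constructive-io/knative-job-worker';

const worker = new Worker({
  pgPool: poolManager.getPool(),
  workerId: 'worker-local-1',              // hypothetical id; run.js derives it from the hostname
  tasks: ['send-email', 'generate-report'] // hypothetical task_identifier values
});

// LISTEN/NOTIFY wake-up plus the idleDelay fallback poll.
worker.listen();

// Graceful shutdown using the stop() method introduced in 0.8.x.
process.on('SIGTERM', async () => {
  await worker.stop();
  await poolManager.close();
  process.exit(0);
});
```

`stop()` is the main addition here: it flips the `stopped` flag, clears the fallback poll timer, removes the notification listeners, runs `UNLISTEN "jobs:insert"`, and releases the LISTEN client so the pool can drain cleanly.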
package/esm/req.js
ADDED
@@ -0,0 +1,55 @@
+ import requestLib from 'request';
+ import { getCallbackBaseUrl, getJobGatewayConfig, getJobGatewayDevMap, getNodeEnvironment } from '@constructive-io/job-utils';
+ import { Logger } from '@pgpmjs/logger';
+ const log = new Logger('jobs:req');
+ // callback URL for job completion
+ const completeUrl = getCallbackBaseUrl();
+ // Development override map (e.g. point a function name at localhost)
+ const nodeEnv = getNodeEnvironment();
+ const DEV_MAP = nodeEnv !== 'production' ? getJobGatewayDevMap() : null;
+ const getFunctionUrl = (fn) => {
+     if (DEV_MAP && DEV_MAP[fn]) {
+         return DEV_MAP[fn] || completeUrl;
+     }
+     const { gatewayUrl } = getJobGatewayConfig();
+     const base = gatewayUrl.replace(/\/$/, '');
+     return `${base}/${fn}`;
+ };
+ const request = (fn, { body, databaseId, workerId, jobId }) => {
+     const url = getFunctionUrl(fn);
+     log.info(`dispatching job`, {
+         fn,
+         url,
+         callbackUrl: completeUrl,
+         workerId,
+         jobId,
+         databaseId
+     });
+     return new Promise((resolve, reject) => {
+         requestLib.post({
+             headers: {
+                 'Content-Type': 'application/json',
+                 // these are used by job-worker/job-fn
+                 'X-Worker-Id': workerId,
+                 'X-Job-Id': jobId,
+                 'X-Database-Id': databaseId,
+                 // async HTTP completion callback
+                 'X-Callback-Url': completeUrl
+             },
+             url,
+             json: true,
+             body
+         }, function (error) {
+             if (error) {
+                 log.error(`request error for job[${jobId}] fn[${fn}]`, error);
+                 if (error instanceof Error && error.stack) {
+                     log.debug(error.stack);
+                 }
+                 return reject(error);
+             }
+             log.debug(`request success for job[${jobId}] fn[${fn}]`);
+             return resolve(true);
+         });
+     });
+ };
+ export { request };
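For reference, the dispatch above is what a target function receives: the job payload as a JSON body plus the `X-Worker-Id`, `X-Job-Id`, `X-Database-Id`, and `X-Callback-Url` headers. A sketch of a receiving handler follows; the exact completion contract against the callback URL belongs to the job-fn side and is an assumption here (Node 18+ for the global `fetch`):

```ts
// Sketch of a function on the receiving end of esm/req.js. Only the header
// names and JSON body shape are taken from this package; the callback POST
// body is an assumption.
import http from 'node:http';

http
  .createServer((req, res) => {
    const jobId = req.headers['x-job-id'];
    const callbackUrl = req.headers['x-callback-url'];

    let raw = '';
    req.on('data', (chunk: Buffer) => {
      raw += chunk.toString();
    });
    req.on('end', () => {
      const payload = raw ? JSON.parse(raw) : {};
      // ...do the real work with `payload` here...

      // Assumed completion callback; the actual contract lives in job-fn/job-utils.
      if (typeof callbackUrl === 'string') {
        fetch(callbackUrl, {
          method: 'POST',
          headers: { 'Content-Type': 'application/json', 'X-Job-Id': String(jobId) },
          body: JSON.stringify({ ok: true, payload })
        }).catch(() => {
          // Completion failures are left to the queue's retry/timeout handling.
        });
      }
      res.writeHead(200).end();
    });
  })
  .listen(8080);
```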
package/{src/run.ts → esm/run.js}
CHANGED
@@ -1,18 +1,11 @@
  #!/usr/bin/env node
-
  import Worker from './index';
  import poolManager from '@constructive-io/job-pg';
- import {
-   getWorkerHostname,
-   getJobSupported
- } from '@constructive-io/job-utils';
-
+ import { getWorkerHostname, getJobSupported } from '@constructive-io/job-utils';
  const pgPool = poolManager.getPool();
-
  const worker = new Worker({
-
-
-
+     pgPool,
+     workerId: getWorkerHostname(),
+     tasks: getJobSupported()
  });
-
  worker.listen();
package/{dist/index.d.ts → index.d.ts}
CHANGED
@@ -1,5 +1,5 @@
  import type { PgClientLike } from '@constructive-io/job-utils';
- import type { Pool } from 'pg';
+ import type { Pool, PoolClient } from 'pg';
  export interface JobRow {
      id: number | string;
      task_identifier: string;
@@ -13,6 +13,9 @@ export default class Worker {
      doNextTimer?: NodeJS.Timeout;
      pgPool: Pool;
      _initialized?: boolean;
+     listenClient?: PoolClient;
+     listenRelease?: () => void;
+     stopped?: boolean;
      constructor({ tasks, idleDelay, pgPool, workerId }: {
          tasks: string[];
          idleDelay?: number;
@@ -37,5 +40,6 @@ export default class Worker {
      doWork(job: JobRow): Promise<void>;
      doNext(client: PgClientLike): Promise<void>;
      listen(): void;
+     stop(): Promise<void>;
  }
  export { Worker };
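The declaration file now exposes the notify-client bookkeeping (`listenClient`, `listenRelease`, `stopped`) and the `stop(): Promise<void>` signature. `JobRow` is still exported from the package root; only `id` and `task_identifier` are visible in this hunk, so the sketch below sticks to those fields:

```ts
import type { JobRow } from '@constructive-io/knative-job-worker';

// Small helper typed against the published declarations; JobRow's remaining
// fields (payload, database_id, ...) are not shown in this hunk.
const describeJob = (job: JobRow): string =>
  `job ${job.id} (${job.task_identifier})`;

export { describeJob };
```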
package/{dist/index.js → index.js}
CHANGED
@@ -49,6 +49,9 @@ class Worker {
      doNextTimer;
      pgPool;
      _initialized;
+     listenClient;
+     listenRelease;
+     stopped;
      constructor({ tasks, idleDelay = 15000, pgPool = job_pg_1.default.getPool(), workerId = 'worker-0' }) {
          /*
           * idleDelay: This is how long to wait between polling for jobs.
@@ -114,6 +117,8 @@ class Worker {
          });
      }
      async doNext(client) {
+         if (this.stopped)
+             return;
          if (!this._initialized) {
              return await this.initialize(client);
          }
@@ -130,7 +135,9 @@ class Worker {
                      : this.supportedTaskNames
              }));
              if (!job || !job.id) {
-
+                 if (!this.stopped) {
+                     this.doNextTimer = setTimeout(() => this.doNext(client), this.idleDelay);
+                 }
                  return;
              }
              const start = process.hrtime();
@@ -155,13 +162,20 @@ class Worker {
              catch (fatalError) {
                  await this.handleFatalError(client, { err, fatalError, jobId });
              }
-
+             if (!this.stopped) {
+                 return this.doNext(client);
+             }
+             return;
          }
          catch (err) {
-
+             if (!this.stopped) {
+                 this.doNextTimer = setTimeout(() => this.doNext(client), this.idleDelay);
+             }
          }
      }
      listen() {
+         if (this.stopped)
+             return;
          const listenForChanges = (err, client, release) => {
              if (err) {
                  log.error('Error connecting with notify listener', err);
@@ -170,9 +184,17 @@ class Worker {
                  }
                  // Try again in 5 seconds
                  // should this really be done in the node process?
-
+                 if (!this.stopped) {
+                     setTimeout(this.listen, 5000);
+                 }
+                 return;
+             }
+             if (this.stopped) {
+                 release();
                  return;
              }
+             this.listenClient = client;
+             this.listenRelease = release;
              client.on('notification', () => {
                  if (this.doNextTimer) {
                      // Must be idle, do something!
@@ -181,18 +203,46 @@ class Worker {
              });
              client.query('LISTEN "jobs:insert"');
              client.on('error', (e) => {
+                 if (this.stopped) {
+                     release();
+                     return;
+                 }
                  log.error('Error with database notify listener', e);
                  if (e instanceof Error && e.stack) {
                      log.debug(e.stack);
                  }
                  release();
-                 this.
+                 if (!this.stopped) {
+                     this.listen();
+                 }
              });
              log.info(`${this.workerId} connected and looking for jobs...`);
              this.doNext(client);
          };
          this.pgPool.connect(listenForChanges);
      }
+     async stop() {
+         this.stopped = true;
+         if (this.doNextTimer) {
+             clearTimeout(this.doNextTimer);
+             this.doNextTimer = undefined;
+         }
+         const client = this.listenClient;
+         const release = this.listenRelease;
+         this.listenClient = undefined;
+         this.listenRelease = undefined;
+         if (client && release) {
+             client.removeAllListeners('notification');
+             client.removeAllListeners('error');
+             try {
+                 await client.query('UNLISTEN "jobs:insert"');
+             }
+             catch {
+                 // Ignore listener cleanup errors during shutdown.
+             }
+             release();
+         }
+     }
  }
  exports.default = Worker;
  exports.Worker = Worker;
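Both builds keep the `LISTEN "jobs:insert"` wake-up channel, so the `idleDelay` poll is only a fallback. A sketch of how a producer could wake an idle worker immediately; in practice the NOTIFY is more likely issued by a trigger in the jobs schema than by application code:

```ts
// Wake an idle worker listening on the "jobs:insert" channel.
import { Pool } from 'pg';

const pool = new Pool(); // connection settings come from the usual PG* env vars

export const wakeWorkers = async (): Promise<void> => {
  // pg_notify sidesteps identifier-quoting for the colon in the channel name.
  await pool.query(`SELECT pg_notify('jobs:insert', '')`);
};
```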
package/package.json
CHANGED
@@ -1,21 +1,24 @@
  {
    "name": "@constructive-io/knative-job-worker",
-   "version": "0.
+   "version": "0.8.1",
    "description": "knative job worker",
    "author": "Constructive <developers@constructive.io>",
    "homepage": "https://github.com/constructive-io/jobs/tree/master/packages/knative-job-worker#readme",
    "license": "SEE LICENSE IN LICENSE",
-   "main": "
+   "main": "index.js",
+   "module": "esm/index.js",
+   "types": "index.d.ts",
    "directories": {
      "lib": "src",
      "test": "__tests__"
    },
    "bin": {
-     "faas-jobs": "
-     "knative-jobs": "
+     "faas-jobs": "run.js",
+     "knative-jobs": "run.js"
    },
    "publishConfig": {
-     "access": "public"
+     "access": "public",
+     "directory": "dist"
    },
    "repository": {
      "type": "git",
@@ -25,21 +28,27 @@
      "test": "jest --passWithNoTests",
      "test:watch": "jest --watch",
      "test:debug": "node --inspect node_modules/.bin/jest --runInBand",
-     "
-     "
+     "clean": "makage clean",
+     "prepack": "npm run build",
+     "build": "makage build",
+     "build:dev": "makage build --dev"
    },
    "bugs": {
      "url": "https://github.com/constructive-io/jobs/issues"
    },
    "dependencies": {
-     "@constructive-io/job-pg": "^0.
-     "@constructive-io/job-utils": "^0.
-     "@pgpmjs/logger": "^1.
-     "pg": "8.
+     "@constructive-io/job-pg": "^0.4.1",
+     "@constructive-io/job-utils": "^0.6.1",
+     "@pgpmjs/logger": "^1.4.0",
+     "pg": "8.17.1",
      "request": "2.88.2"
    },
    "devDependencies": {
-     "
+     "@pgpm/database-jobs": "^0.16.0",
+     "@pgpm/verify": "^0.16.0",
+     "@pgpmjs/core": "^4.16.1",
+     "makage": "^0.1.10",
+     "pgsql-test": "^2.25.1"
    },
-   "gitHead": "
+   "gitHead": "3ffd5718e86ea5fa9ca6e0930aeb510cf392f343"
  }
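The package layout is flattened in 0.8.1: entry points move out of `dist/` to the package root (`main: index.js`, `module: esm/index.js`, `types: index.d.ts`), with the tarball published from `dist` via `publishConfig.directory`, so the bare import should be unchanged for consumers. A quick sketch, assuming no deep imports into `dist/` were being used:

```ts
// ESM consumers resolve the "module" entry (esm/index.js); CJS consumers get
// "index.js" via "main"; both are typed by the root index.d.ts.
import Worker from '@constructive-io/knative-job-worker';
// CJS equivalent: const { Worker } = require('@constructive-io/knative-job-worker');

export { Worker };
```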