langaro-api 1.2.3 → 1.2.4
- package/bin/langaro-api.js +12 -2
- package/lib/cli/documentation-templates/01-architecture-overview.md +240 -0
- package/lib/cli/documentation-templates/02-crud-layer.md +504 -0
- package/lib/cli/documentation-templates/03-models.md +362 -0
- package/lib/cli/documentation-templates/04-services.md +355 -0
- package/lib/cli/documentation-templates/05-controllers.md +395 -0
- package/lib/cli/documentation-templates/06-routes.md +268 -0
- package/lib/cli/documentation-templates/07-jobs.md +361 -0
- package/lib/cli/documentation-templates/08-tasks.md +265 -0
- package/lib/cli/documentation-templates/09-middlewares.md +238 -0
- package/lib/cli/documentation-templates/10-integrations.md +332 -0
- package/lib/cli/documentation-templates/11-config-and-bootstrap.md +352 -0
- package/lib/cli/documentation-templates/12-queues.md +205 -0
- package/lib/cli/documentation-templates/13-utils.md +281 -0
- package/lib/cli/documentation-templates/14-testing.md +315 -0
- package/lib/cli/documentation-templates/15-cli-and-scaffolding.md +344 -0
- package/lib/cli/documentation-templates/SUMMARY.md +116 -0
- package/lib/cli/init.js +30 -2
- package/package.json +1 -1
@@ -0,0 +1,361 @@ package/lib/cli/documentation-templates/07-jobs.md

# Jobs (BullMQ Async Processing)

## Role

Jobs are **asynchronous task handlers** that run in BullMQ workers. They process work that should not block HTTP requests: sending emails, processing files, calling external APIs, and batch operations.

**What jobs do:**
- Execute long-running operations asynchronously
- Retry failed operations with configurable backoff
- Process work with rate limiting
- Chain to other jobs
- Emit real-time updates via Socket.io

**What jobs do NOT do:**
- Handle HTTP requests (that's controllers)
- Define business logic that needs synchronous responses (that's services)
- Run on a schedule (that's tasks)

---
## Location & Naming

```
src/jobs/
├── index.js                      # loadJobs loader (do not modify)
├── send-mail.js                  # Job: 'send-mail'
├── export-invoices.js            # Job: 'export-invoices'
├── product-invoice/              # Subdirectory for related jobs
│   ├── transmit.js               # Job: 'product-invoice-transmit'
│   ├── process-response.js       # Job: 'product-invoice-process-response'
│   └── cancel.js                 # Job: 'product-invoice-cancel'
├── affiliate/                    # Another subdirectory
│   ├── click-register.js         # Job: 'affiliate-click-register'
│   └── commission-calculate.js   # Job: 'affiliate-commission-calculate'
└── ...
```

**Naming convention:** Kebab-case filenames. The job name is derived from the file path:
- `send-mail.js` → job name `send-mail`
- `product-invoice/transmit.js` → job name `product-invoice-transmit`
- `affiliate/click-register.js` → job name `affiliate-click-register`

Files named `index.js` or ending with `.spec.js` are ignored.
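The derivation rules above can be captured in a small pure function. This is a hypothetical sketch of what the loader does — `jobNameFromPath` is illustrative only; the real `loadJobs` implementation may differ:

```javascript
// Hypothetical sketch of the loader's naming rule: strip the `.js`
// extension and join directory segments with hyphens.
function jobNameFromPath(relativePath) {
  return relativePath.replace(/\.js$/, '').split('/').join('-');
}

console.log(jobNameFromPath('send-mail.js'));                // send-mail
console.log(jobNameFromPath('product-invoice/transmit.js')); // product-invoice-transmit
console.log(jobNameFromPath('affiliate/click-register.js')); // affiliate-click-register
```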
---

## Standard Structure

```javascript
/** @param {ServicesMap} services @generated-types */
module.exports = (services) => ({
  /**
   * Job handler function
   * @param {object} data - The payload passed to Queue.add()
   * @param {object} job - BullMQ Job object
   * @param {object} queue - Queue interface { add(name, data, options) }
   * @param {object} io - Socket.io server instance
   */
  async handle(data, job, queue, io) {
    const { recipientId, templateName } = data;

    // Access any service
    const user = await services.UsersServices.getWhere('id', recipientId, {
      firstOnly: true,
    });

    // Do the work
    await sendEmail(user.email, templateName);

    // Emit real-time update
    io.sockets.in(recipientId).emit('email:sent', { templateName });
  },

  // Optional: Override default job options
  jobOptions: {
    attempts: 10,
    backoff: { type: 'fixed', delay: 30000 },
  },

  // Optional: Override default worker options
  workerOptions: {
    limiter: { max: 5, duration: 1000 }, // 5 jobs per second
  },
});
```
---

## Handle Function Parameters

| Parameter | Type | Description |
|-----------|------|-------------|
| `data` | object | The payload passed to `Queue.add(name, data)`. Contains all job-specific data. |
| `job` | BullMQ Job | The job object. Has `job.data`, `job.id`, `job.attemptsMade`, `job.discard()`. |
| `queue` | `{ add }` | Queue interface to chain to other jobs: `await queue.add('other-job', payload)`. |
| `io` | Socket.io Server | For real-time updates: `io.sockets.in(userId).emit(event, data)`. |
---

## Adding Jobs to the Queue

### From Controllers

```javascript
await this.Queue.add('send-mail', {
  templateName: 'welcome',
  recipientsList: [{ Email: user.email, TemplateModel: { name: user.name } }],
});
```

### From Other Jobs

```javascript
async handle(data, job, queue, io) {
  // Process this job...

  // Then trigger another job
  await queue.add('send-notification', {
    userId: data.userId,
    message: 'Processing complete',
  });
}
```

### From Tasks

```javascript
module.exports = (services, Queue) => ({
  task: async () => {
    const items = await services.ItemsServices.get({ andWhere: [['status', 'pending']] });
    for (let i = 0; i < items.data.length; i++) {
      await Queue.add('process-item', { itemId: items.data[i].id });
    }
  },
  cronTime: '0 * * * *',
  isActive: true,
});
```

### With Options

```javascript
// Delayed execution
await this.Queue.add('reminder', data, { delay: 60000 }); // 1 minute delay

// Grouped queue (per-company isolation)
await this.Queue.add('sync-data', data, { groupId: companyId });
```
---

## Job Options

Default options (applied to all jobs unless overridden):

```javascript
{
  attempts: 100,                                   // Max retry attempts
  backoff: { type: 'fixed', delay: 60000 },        // 1 minute between retries
  removeOnComplete: { age: 2592000, count: 1000 }, // Keep completed for 30 days, max 1000
  removeOnFail: { age: 7776000, count: 5000 },     // Keep failed for 90 days, max 5000
}
```

Override per job:

```javascript
jobOptions: {
  attempts: 1, // No retries (e.g., for import jobs)
  backoff: null,
}
```

```javascript
jobOptions: {
  attempts: 10,
  backoff: {
    type: 'exponential', // 1s, 2s, 4s, 8s, 16s...
    delay: 1000,
  },
}
```
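For intuition, the exponential schedule above can be computed directly. This is a sketch of the documented `delay * 2^(n-1)` spacing — `retryDelaysMs` is an illustrative helper, not part of BullMQ:

```javascript
// Delay before the n-th retry under exponential backoff: delay * 2^(n - 1).
// With delay: 1000, the first retries wait roughly 1s, 2s, 4s, 8s, 16s...
function retryDelaysMs(attempts, baseDelayMs) {
  const delays = [];
  for (let n = 1; n < attempts; n++) {
    delays.push(baseDelayMs * 2 ** (n - 1));
  }
  return delays;
}

console.log(retryDelaysMs(6, 1000)); // [ 1000, 2000, 4000, 8000, 16000 ]
```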
---

## Worker Options

Default options:

```javascript
{
  autorun: false,                      // Workers start manually
  limiter: { max: 2, duration: 1000 }, // 2 jobs per second
  maxStalledCount: 5,                  // Kill after 5 stalls
}
```

Override per job:

```javascript
// No rate limiting (e.g., email sending)
workerOptions: {
  limiter: null,
}

// Higher throughput
workerOptions: {
  limiter: { max: 10, duration: 1000 }, // 10 jobs per second
}
```
---

## Grouped Queues

When `group: true`, each unique `groupId` creates its own queue and worker pair. This provides per-entity isolation.

```javascript
module.exports = (services) => ({
  async handle(data, job, queue, io) {
    // This handler runs in an isolated queue per groupId
    await processForCompany(data.companyId);
  },
  group: true, // Enable grouped queues
  workerOptions: {
    limiter: { max: 1, duration: 1000 }, // 1 job per second per group
  },
});
```

**Adding grouped jobs:**
```javascript
await this.Queue.add('sync-data', { companyId: 'abc' }, { groupId: 'abc' });
await this.Queue.add('sync-data', { companyId: 'xyz' }, { groupId: 'xyz' });
// Creates 2 separate queues: sync-data-abc and sync-data-xyz
```

Inactive grouped queues auto-close after 10 minutes.
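The queue names in the grouped-jobs example above suggest a simple `jobName-groupId` convention. As a hedged sketch — this is inferred from the example, not confirmed loader internals:

```javascript
// Inferred from the example above: each unique groupId appears to get its
// own queue named `${jobName}-${groupId}`. Hypothetical sketch only.
function groupedQueueName(jobName, groupId) {
  return groupId ? `${jobName}-${groupId}` : jobName;
}

console.log(groupedQueueName('sync-data', 'abc')); // sync-data-abc
console.log(groupedQueueName('sync-data', 'xyz')); // sync-data-xyz
```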
---

## Error Handling

### Retry (default)
Throw an error to trigger an automatic retry:
```javascript
async handle(data, job, queue, io) {
  const response = await callExternalApi(data);
  if (!response.ok) {
    // Will retry according to jobOptions.attempts and backoff
    throw new Error(`API failed: ${response.status}`);
  }
}
```

### Permanent Failure
Call `job.discard()` and then throw to stop retrying:
```javascript
async handle(data, job, queue, io) {
  const response = await callExternalApi(data);
  if (response.status === 404) {
    job.discard(); // Mark as permanently failed
    throw new Error('Resource not found — not retryable');
  }
}
```

### Partial Retry
Re-queue only the failed items:
```javascript
async handle(data, job, queue, io) {
  const results = await sendBatchEmails(data.recipients);
  const failed = results.filter(r => !r.success);

  if (failed.length > 0 && failed.length < data.recipients.length) {
    // Some succeeded, some failed — retry only the failed ones
    await queue.add('send-mail', {
      ...data,
      recipients: failed.map(f => f.recipient),
    });
  } else if (failed.length === data.recipients.length) {
    // All failed — throw to use the standard retry mechanism
    throw new Error('All emails failed');
  }
}
```

All job failures are automatically captured by Sentry.
---

## Real-Time Progress Updates

```javascript
async handle(data, job, queue, io) {
  const socket = io.sockets.in(data.userId);

  socket.emit('import:started', { id: data.importId });

  for (let i = 0; i < data.rows.length; i++) {
    await processRow(data.rows[i]);

    // Emit progress every 10%
    if (i % Math.ceil(data.rows.length / 10) === 0) {
      socket.emit('import:progress', {
        id: data.importId,
        progress: Math.round((i / data.rows.length) * 100),
      });
    }
  }

  socket.emit('import:completed', { id: data.importId });
}
```
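The throttling condition in the loop above can be isolated into a small helper to see exactly when events fire — an illustrative sketch, not project code:

```javascript
// Indices at which the loop above emits a progress event: every
// Math.ceil(total / 10) rows, starting at index 0 (so roughly 10 events).
function progressEmitIndices(totalRows) {
  if (totalRows === 0) return [];
  const step = Math.ceil(totalRows / 10);
  const indices = [];
  for (let i = 0; i < totalRows; i++) {
    if (i % step === 0) indices.push(i);
  }
  return indices;
}

console.log(progressEmitIndices(100)); // [ 0, 10, 20, 30, 40, 50, 60, 70, 80, 90 ]
```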
---

## Creating from Scratch

1. **Create the file:** `src/jobs/{name}.js` (kebab-case)
2. **Use the template:**
```javascript
/** @param {ServicesMap} services @generated-types */
module.exports = (services) => ({
  async handle(data, job, queue, io) {
    // Job logic here
  },
});
```
3. **Add job/worker options** if the defaults aren't suitable
4. **Trigger the job** from a controller or task: `await this.Queue.add('job-name', data)`
5. **Restart the server** to register the new job

---

## Anti-patterns

- **Do NOT run jobs synchronously in controllers.** If the work takes more than a few hundred milliseconds, use a job.
- **Do NOT forget to handle errors.** Unhandled errors cause retries until the maximum number of attempts is reached.
- **Do NOT store large payloads in job data.** Store the data in the database and pass only IDs.
- **Do NOT rely on job execution order.** BullMQ does not guarantee ordering across different job instances.
- **Do NOT use `job.discard()` without throwing.** The discard only takes effect when it is followed by a throw.
- **Do NOT create job files named `index.js`** — they are ignored by the loader.
---

## Checklist

When creating a new job:

- [ ] File is in `src/jobs/` (or a subdirectory)
- [ ] Filename is kebab-case (e.g., `send-mail.js`, not `sendMail.js`)
- [ ] Exports a factory function `(services) => JobDefinition`
- [ ] Has a `handle` method with the `(data, job, queue, io)` signature
- [ ] Has the correct `@generated-types` JSDoc annotation
- [ ] Error handling: either throw (to retry) or discard + throw (permanent failure)
- [ ] Job data is minimal (IDs, not full objects)
- [ ] At least one trigger point exists (controller, task, or another job)
- [ ] Worker options are set appropriately for the workload (rate limiting)
- [ ] Not named `index.js` or `*.spec.js`
@@ -0,0 +1,265 @@ package/lib/cli/documentation-templates/08-tasks.md

# Tasks (Cron Scheduled Jobs)

## Role

Tasks are **cron-scheduled functions** that run automatically at defined intervals. They handle recurring operations like data syncing, cache warming, expiration checks, and periodic reports.

**What tasks do:**
- Run on a cron schedule
- Poll external services for updates
- Clean up stale data
- Trigger batches of background jobs
- Aggregate or sync data periodically

**What tasks do NOT do:**
- Handle HTTP requests
- Process individual items in real-time (that's jobs)
- Run once on demand (use a job or controller for that)

---
## Location & Naming

```
src/tasks/
├── index.js                              # loadTasks loader (do not modify)
├── update-currencies-exchange-rates.js   # Runs every 6 hours
├── product-invoices-status-polling.js    # Runs every 5 minutes
├── send-certificate-expiration-mail.js   # Runs daily
└── ...
```

**Naming convention:** Kebab-case filenames describing what the task does.

Files named `index.js` or ending with `.spec.js` are ignored by the loader.

---
## Standard Structure

```javascript
/** @param {ServicesMap} services @param {QueueMap} Queue @generated-types */
module.exports = (services, Queue) => {
  const task = async () => {
    // Task logic runs on every cron tick

    // Access any service
    const { data: companies } = await services.CompaniesServices.get({
      showOnly: ['id'],
      perPage: 100000,
      andWhere: [['needs_sync', 1]],
    });

    if (!companies.length) return;

    // Option 1: Process directly
    await services.CompaniesServices.syncExternalData(companies.map(c => c.id));

    // Option 2: Queue individual jobs
    for (let i = 0; i < companies.length; i++) {
      await Queue.add('sync-company', { companyId: companies[i].id });
    }
  };

  return {
    task,
    cronTime: '0 */3 * * *', // Every 3 hours
    isActive: true,
  };
};
```

---
## Configuration Properties

| Property | Type | Required | Description |
|----------|------|----------|-------------|
| `task` | async function | Yes | The function to execute on each cron tick |
| `cronTime` | string | Yes | Cron expression (standard format) |
| `isActive` | boolean | Yes | Whether this task should be started |
| `environments` | string[] | No | Restrict to specific NODE_ENV values. Default: all environments |

### Cron Expression Format

```
┌────────── minute (0-59)
│ ┌──────── hour (0-23)
│ │ ┌────── day of month (1-31)
│ │ │ ┌──── month (1-12)
│ │ │ │ ┌── day of week (0-7; 0 and 7 are Sunday)
│ │ │ │ │
* * * * *
```

**Common patterns:**
```
'* * * * *'   — Every minute
'*/5 * * * *' — Every 5 minutes
'0 * * * *'   — Every hour (at minute 0)
'0 */3 * * *' — Every 3 hours
'0 6 * * *'   — Every day at 6:00 AM
'0 0 * * 1'   — Every Monday at midnight
'0 0 1 * *'   — First day of every month
```

**Timezone:** All tasks run in the `America/Sao_Paulo` timezone (hardcoded in the loader).

### Environment Filtering

```javascript
return {
  task,
  cronTime: '0 */6 * * *',
  isActive: true,
  environments: ['production'], // Only runs in production
};
```

```javascript
return {
  task,
  cronTime: '0 */6 * * *',
  isActive: true,
  environments: ['production', 'staging'], // Runs in production and staging
};
```

If `environments` is omitted, the task runs in all environments (except test — tasks are never loaded in test).
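The gating rules described in this section can be summarized in a predicate. This is a hypothetical sketch of the loader's decision, not the actual `loadTasks` code:

```javascript
// A task starts when it is active, NODE_ENV is not 'test', and either no
// environments list is declared or the list includes the current NODE_ENV.
function shouldStartTask({ isActive, environments }, nodeEnv) {
  if (!isActive) return false;
  if (nodeEnv === 'test') return false; // tasks are never loaded in test
  if (!environments) return true;      // omitted: run everywhere
  return environments.includes(nodeEnv);
}

console.log(shouldStartTask({ isActive: true, environments: ['production'] }, 'production'));  // true
console.log(shouldStartTask({ isActive: true, environments: ['production'] }, 'development')); // false
console.log(shouldStartTask({ isActive: true }, 'development'));                               // true
```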
---

## Factory Parameters

| Parameter | Type | Description |
|-----------|------|-------------|
| `services` | ServicesMap | All service instances |
| `Queue` | `{ add(name, data, options) }` | Queue interface for triggering jobs |

---
## Common Patterns

### Data Sync with Staleness Check

```javascript
module.exports = (services) => {
  const task = async () => {
    const sevenDaysAgo = new Date(Date.now() - 7 * 24 * 60 * 60 * 1000);

    const { data: staleCompanies } = await services.CompaniesServices.get({
      showOnly: ['id'],
      perPage: 100000,
      where: (query) => query.where((q) => {
        q.where('last_synced_at', '<', sevenDaysAgo.toISOString())
          .orWhereNull('last_synced_at');
      }),
    });

    if (!staleCompanies.length) return;

    await services.CompaniesServices.syncData(staleCompanies.map(c => c.id));
  };

  return { task, cronTime: '5 * * * *', isActive: true };
};
```

### Expiration Processing

```javascript
module.exports = (services, Queue) => {
  const task = async () => {
    const { data: expired } = await services.SubscriptionsServices.get({
      perPage: 100000,
      andWhere: [
        ['status', 'active'],
        ['expires_at', '<', new Date().toISOString()],
      ],
    });

    for (let i = 0; i < expired.length; i++) {
      await Queue.add('process-expiration', { subscriptionId: expired[i].id });
    }
  };

  return { task, cronTime: '0 */6 * * *', isActive: true };
};
```

### Periodic Report Generation

```javascript
module.exports = (services, Queue) => {
  const task = async () => {
    const { data: companies } = await services.CompaniesServices.get({
      showOnly: ['id', 'admin_email'],
      perPage: 100000,
      andWhere: [['reports_enabled', true]],
    });

    for (let i = 0; i < companies.length; i++) {
      await Queue.add('generate-monthly-report', { companyId: companies[i].id });
    }
  };

  return {
    task,
    cronTime: '0 8 1 * *', // 1st of every month at 8 AM
    isActive: true,
    environments: ['production'],
  };
};
```

---
## Creating from Scratch

1. **Create the file:** `src/tasks/{task-name}.js` (kebab-case)
2. **Use the template:**
```javascript
/** @param {ServicesMap} services @param {QueueMap} Queue @generated-types */
module.exports = (services, Queue) => {
  const task = async () => {
    // Task logic
  };

  return {
    task,
    cronTime: '0 * * * *', // Adjust schedule
    isActive: true,
  };
};
```
3. **Choose the right cron expression** for the frequency needed
4. **Consider environment filtering** — not all tasks should run in development
5. **Restart the server** to register the new task

---
## Anti-patterns

- **Do NOT run CPU-intensive work directly in tasks.** Queue individual jobs instead — tasks should orchestrate, not execute heavy work.
- **Do NOT forget `isActive: true`.** An omitted or false `isActive` means the task never runs.
- **Do NOT assume single-instance execution.** If you run multiple server instances, the task runs on ALL of them. Use Redis locks to prevent duplicate execution.
- **Do NOT use a large `perPage` without `showOnly`.** Fetching 100K full records wastes memory; select only the fields you need.
- **Do NOT name the file `index.js`** — it is ignored by the loader.
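For the multi-instance caveat above, one common guard is a short-lived Redis `SET NX PX` lock. The sketch below assumes an ioredis-style client; `runExclusive`, the lock key, and the TTL are illustrative, not project API:

```javascript
// Hedged sketch: run fn() only if this instance wins the lock, so a cron
// tick fires once across all servers. Assumes an ioredis-style redis.set().
async function runExclusive(redis, lockKey, ttlMs, fn) {
  const token = `${process.pid}-${Date.now()}`;
  const acquired = await redis.set(lockKey, token, 'PX', ttlMs, 'NX');
  if (acquired !== 'OK') return false; // another instance holds the lock
  try {
    await fn();
  } finally {
    // Best-effort release; production code should verify the token first
    // (e.g. with a Lua script) to avoid deleting another instance's lock.
    await redis.del(lockKey);
  }
  return true;
}
```

Inside a task body this might look like `if (!(await runExclusive(redis, 'locks:my-task', 60000, doWork))) return;` — where the lock key and how the task obtains a Redis client depend on the project's setup.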
---

## Checklist

When creating a new task:

- [ ] File is in `src/tasks/` (or a subdirectory)
- [ ] Filename is kebab-case
- [ ] Exports a factory function `(services, Queue) => TaskDefinition`
- [ ] Has the correct `@generated-types` JSDoc annotation
- [ ] Returns an object with `task`, `cronTime`, `isActive`
- [ ] Cron expression is correct (use crontab.guru to verify)
- [ ] `isActive` is set to `true`
- [ ] `environments` is set if the task should not run in development
- [ ] Heavy operations are queued as jobs, not run inline
- [ ] Queries use `showOnly` and `perPage` limits
- [ ] Not named `index.js` or `*.spec.js`