nodejs-quickstart-structure 1.16.2 → 1.17.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/CHANGELOG.md CHANGED
@@ -5,6 +5,16 @@ All notable changes to this project will be documented in this file.
  The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
  and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
 
+ ## [1.17.0] - 2026-03-23
+
+ ### Added
+ - **Kafka KRaft Mode Integration**: Modernized Kafka setups across all templates (MVC & Clean Architecture) by completely removing the Zookeeper dependency and enabling KRaft mode in `docker-compose.yml`, reducing project orchestration overhead.
+ - **End-to-End (E2E) Verification Framework**: Implemented dedicated Docker container targeted end-to-end tests (`tests/e2e/`) utilizing Supertest via dynamic `SERVER_URL` mapping to eliminate port collisions and test the fully assembled container cluster directly.
+ - **Enhanced Validation Pipelines**: Automatically executes the `npm run test:e2e` suite at the conclusion of internal validations across the entire platform matrix for improved CI accountability.
+
+ ### Refactored
+ - **Test Directory Strict Isolation**: Restructured internal code generation workflows (`lib/modules/`) to pipe all generated `.spec` files strictly into a dedicated `tests/unit/` subdirectory, cleanly abstracting unit specifications from end-to-end specifications.
+
  ## [1.16.2] - 2026-03-19
 
  ### Changed
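The KRaft change described in the changelog entry above typically corresponds to a single-broker compose service along these lines. This is a sketch only: the image, listener names, and internal ports are assumptions, not taken from the package; only the host-facing port 9093 is grounded in the `.env` template change later in this diff.

```yaml
services:
  kafka:
    image: bitnami/kafka:3.7          # assumed image; the template may use another
    ports:
      - "9093:9093"                   # host-facing listener, matching KAFKA_BROKER=localhost:9093
    environment:
      # KRaft: the broker is its own controller, so no zookeeper service is needed
      - KAFKA_CFG_NODE_ID=1
      - KAFKA_CFG_PROCESS_ROLES=broker,controller
      - KAFKA_CFG_CONTROLLER_LISTENER_NAMES=CONTROLLER
      - KAFKA_CFG_CONTROLLER_QUORUM_VOTERS=1@kafka:9094
      - KAFKA_CFG_LISTENERS=INTERNAL://:9092,CONTROLLER://:9094,EXTERNAL://:9093
      - KAFKA_CFG_ADVERTISED_LISTENERS=INTERNAL://kafka:9092,EXTERNAL://localhost:9093
      - KAFKA_CFG_LISTENER_SECURITY_PROTOCOL_MAP=INTERNAL:PLAINTEXT,CONTROLLER:PLAINTEXT,EXTERNAL:PLAINTEXT
      - KAFKA_CFG_INTER_BROKER_LISTENER_NAME=INTERNAL
```

Containers on the compose network would reach the broker as `kafka:9092`, while the host (and the generated app in local dev) uses `localhost:9093`.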
package/LICENSE ADDED
@@ -0,0 +1,15 @@
+ ISC License
+
+ Copyright (c) 2026, Pau Dang
+
+ Permission to use, copy, modify, and/or distribute this software for any
+ purpose with or without fee is hereby granted, provided that the above
+ copyright notice and this permission notice appear in all copies.
+
+ THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
+ WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
+ MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
+ ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
+ WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
+ ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
+ OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
package/README.md CHANGED
@@ -18,7 +18,7 @@ A powerful CLI tool to scaffold production-ready Node.js microservices with buil
  - **Communication Flow**: Scaffold APIs using **REST**, **GraphQL** (with Apollo Server), or **Kafka** (event-driven).
  - **Caching Layer**: Choose between **Redis** or built-in **Memory Cache** for data caching.
  - **Centralized Error Handling**: Every project ships with a global error handler, custom error classes and structured JSON error responses — consistent across REST & GraphQL.
- - **Dockerized**: Automatically generates `docker-compose.yml` for DB, Kafka, Redis, and Zookeeper.
+ - **Dockerized**: Automatically generates `docker-compose.yml` for DB, Kafka, and Redis.
  - **Database Migrations/Schemas**: Integrated **Flyway** for SQL migrations or **Mongoose** schemas for MongoDB.
  - **Professional Standards**: Generated projects come with highly professional, industry-standard tooling.
 
@@ -35,7 +35,7 @@ We don't just generate boilerplate; we generate **production-ready** foundations
  - **🔍 Code Quality**: Pre-configured `Eslint` and `Prettier` for consistent coding standards.
  - **🛡️ Security**: Built-in `Helmet`, `HPP`, `CORS`, and Rate-Limiting middleware.
  - **🚨 Error Handling**: Centralized global error middleware with custom error classes and structured JSON responses. GraphQL uses Apollo's `formatError` hook; REST uses Express error middleware.
- - **🧪 Testing Excellence**: Integrated `Jest` and `Supertest`. Every generated project maintains **>70% Unit Test coverage** for controllers, services, and resolvers out of the box.
+ - **🧪 Testing Excellence**: Integrated `Jest` and `Supertest`. Every generated project maintains **>80% Unit Test coverage** for controllers, services, and resolvers out of the box.
  - **🔄 CI/CD Integration**: Pre-configured workflows for **GitHub Actions**, **Jenkins**, and **GitLab CI**.
  - **⚓ Git Hooks**: `Husky` and `Lint-Staged` to ensure no bad code is ever committed.
  - **🤝 Reliability**: Health Checks (`/health`) with deep database pings, Infrastructure Retry Logic (handling Docker startup delays), and Graceful Shutdown workflows.
@@ -51,7 +51,7 @@ The CLI supports a massive number of configurations to fit your exact needs:
  - **Clean Architecture**: 60 variants (Languages × Databases × Communication Patterns × Caching)
  - **480 Total Scenarios**:
  - Every combination can be generated with or without (**GitHub Actions CI/CD** / **Jenkins** or **GitLab CI**), tripling the possibilities.
- - Every single one of these 480 scenarios is verified to be compatible with our 70% Coverage Threshold policy.
+ - Every single one of these 480 scenarios is verified to be compatible with our 80% Coverage Threshold policy.
 
  For a detailed list of all supported cases, check out [docs/generateCase.md](docs/generateCase.md).
 
package/bin/index.js CHANGED
@@ -1,89 +1,90 @@
- #!/usr/bin/env node
-
- import { Command } from 'commander';
- import chalk from 'chalk';
- import { getProjectDetails } from '../lib/prompts.js';
- import { generateProject } from '../lib/generator.js';
- import { readFileSync } from 'fs';
- import { join, dirname } from 'path';
- import { fileURLToPath } from 'url';
-
- const __dirname = dirname(fileURLToPath(import.meta.url));
- const pkg = JSON.parse(readFileSync(join(__dirname, '../package.json'), 'utf-8'));
-
- const program = new Command();
-
- program
- .name('nodejs-quickstart')
- .description('🚀 CLI to scaffold production-ready Node.js microservices.\n\nGenerates projects with:\n- MVC or Clean Architecture\n- REST or Kafka\n- MySQL, PostgreSQL, or MongoDB\n- Docker, Flyway & Mongoose support')
- .version(pkg.version, '-v, --version', 'Output the current version')
- .addHelpText('after', `\n${chalk.yellow('Example:')}\n $ nodejs-quickstart init ${chalk.gray('# Start the interactive setup')}\n`);
-
- program
- .command('init')
- .description('Initialize a new Node.js project')
- .option('-n, --project-name <name>', 'Project name')
- .option('-l, --language <language>', 'Language (JavaScript, TypeScript)')
- .option('-a, --architecture <architecture>', 'Architecture (MVC, Clean Architecture)')
- .option('--view-engine <view>', 'View Engine (None, EJS, Pug) - MVC only')
- .option('-d, --database <database>', 'Database (MySQL, PostgreSQL)')
- .option('--db-name <name>', 'Database name')
- .option('-c, --communication <communication>', 'Communication (REST APIs, GraphQL, Kafka)')
- .option('--ci-provider <provider>', 'CI/CD Provider (None, GitHub Actions, Jenkins)')
- .option('--caching <type>', 'Caching Layer (None/Redis)')
- .action(async (options) => {
- // Fix for Commander camelCase conversion
- if (options.ciProvider) {
- options.ciProvider = options.ciProvider;
- }
-
- console.log(chalk.blue('Welcome to the Node.js Quickstart Generator!'));
-
- try {
- const answers = await getProjectDetails(options);
- console.log(chalk.green('\nConfiguration received:'));
- console.log(JSON.stringify(answers, null, 2));
-
- console.log(chalk.yellow('\nGenerating project...'));
- await generateProject(answers);
-
- console.log(chalk.green('\n✔ Project generated successfully!'));
-
- console.log(chalk.magenta('\n🚀 Project is AI-Ready!'));
- console.log(chalk.magenta('-----------------------------------------'));
- console.log(chalk.magenta('🤖 We detected you are using AI tools.'));
- console.log(chalk.magenta(`📝 Use Cursor? We've configured '.cursorrules' for you.`));
- console.log(chalk.magenta(`📝 Use ChatGPT/Gemini? Check the 'prompts/' folder for Agent Skills.`));
- console.log(chalk.magenta('-----------------------------------------'));
-
- let manualStartInstructions = `\n${chalk.yellow('Development:')}\n cd ${answers.projectName}\n npm install`;
-
- const needsInfrastructure = answers.database !== 'None' || answers.caching === 'Redis' || answers.communication === 'Kafka';
-
- if (needsInfrastructure) {
- let servicesToStart = '';
- if (answers.database !== 'None') servicesToStart += ' db';
- if (answers.caching === 'Redis') servicesToStart += ' redis';
- if (answers.communication === 'Kafka') servicesToStart += ' zookeeper kafka';
-
- manualStartInstructions += `\n docker-compose up -d${servicesToStart} # Start infrastructure first\n npm run dev`;
- } else {
- manualStartInstructions += `\n npm run dev`;
- }
-
- console.log(chalk.cyan(`\nNext steps:\n cd ${answers.projectName}\n npm install\n docker-compose up\n-----------------------${manualStartInstructions}\n\n${chalk.yellow('Production (PM2):')}\n npm run build\n npm run deploy\n npx pm2 logs`));
-
- } catch (error) {
- if (error.name === 'ExitPromptError') {
- console.log(chalk.yellow('\n\n👋 Goodbye! Setup cancelled.'));
- process.exit(0);
- }
- console.error(chalk.red('Error generating project:'), error);
- process.exit(1);
- }
- });
- program.parse(process.argv);
-
- if (!process.argv.slice(2).length) {
- program.outputHelp();
- }
+ #!/usr/bin/env node
+
+ import { Command } from 'commander';
+ import chalk from 'chalk';
+ import { getProjectDetails } from '../lib/prompts.js';
+ import { generateProject } from '../lib/generator.js';
+ import { readFileSync } from 'fs';
+ import { join, dirname } from 'path';
+ import { fileURLToPath } from 'url';
+
+ const __dirname = dirname(fileURLToPath(import.meta.url));
+ const pkg = JSON.parse(readFileSync(join(__dirname, '../package.json'), 'utf-8'));
+
+ const program = new Command();
+
+ program
+ .name('nodejs-quickstart')
+ .description('🚀 CLI to scaffold production-ready Node.js microservices.\n\nGenerates projects with:\n- MVC or Clean Architecture\n- REST or Kafka\n- MySQL, PostgreSQL, or MongoDB\n- Docker, Flyway & Mongoose support')
+ .version(pkg.version, '-v, --version', 'Output the current version')
+ .addHelpText('after', `\n${chalk.yellow('Example:')}\n $ nodejs-quickstart init ${chalk.gray('# Start the interactive setup')}\n`);
+
+ program
+ .command('init')
+ .description('Initialize a new Node.js project')
+ .option('-n, --project-name <name>', 'Project name')
+ .option('-l, --language <language>', 'Language (JavaScript, TypeScript)')
+ .option('-a, --architecture <architecture>', 'Architecture (MVC, Clean Architecture)')
+ .option('--view-engine <view>', 'View Engine (None, EJS, Pug) - MVC only')
+ .option('-d, --database <database>', 'Database (MySQL, PostgreSQL)')
+ .option('--db-name <name>', 'Database name')
+ .option('-c, --communication <communication>', 'Communication (REST APIs, GraphQL, Kafka)')
+ .option('--ci-provider <provider>', 'CI/CD Provider (None, GitHub Actions, Jenkins)')
+ .option('--caching <type>', 'Caching Layer (None/Redis)')
+ .action(async (options) => {
+ // Fix for Commander camelCase conversion
+ if (options.ciProvider) {
+ options.ciProvider = options.ciProvider;
+ }
+
+ console.log(chalk.blue('Welcome to the Node.js Quickstart Generator!'));
+
+ try {
+ const answers = await getProjectDetails(options);
+ console.log(chalk.green('\nConfiguration received:'));
+ console.log(JSON.stringify(answers, null, 2));
+
+ console.log(chalk.yellow('\nGenerating project...'));
+ await generateProject(answers);
+
+ console.log(chalk.green('\n✔ Project generated successfully!'));
+
+ console.log(chalk.magenta('\n🚀 Project is AI-Ready!'));
+ console.log(chalk.magenta('-----------------------------------------'));
+ console.log(chalk.magenta('🤖 We detected you are using AI tools.'));
+ console.log(chalk.magenta(`📝 Use Cursor? We've configured '.cursorrules' for you.`));
+ console.log(chalk.magenta(`📝 Use ChatGPT/Gemini? Check the 'prompts/' folder for Agent Skills.`));
+ console.log(chalk.magenta('-----------------------------------------'));
+
+ let manualStartInstructions = `\n${chalk.yellow('Development:')}\n cd ${answers.projectName}\n npm install`;
+
+ const needsInfrastructure = answers.database !== 'None' || answers.caching === 'Redis' || answers.communication === 'Kafka';
+
+ if (needsInfrastructure) {
+ let servicesToStart = '';
+ if (answers.database === 'MongoDB') servicesToStart += ' db';
+ else if (answers.database !== 'None') servicesToStart += ' db flyway';
+ if (answers.caching === 'Redis') servicesToStart += ' redis';
+ if (answers.communication === 'Kafka') servicesToStart += ' kafka';
+
+ manualStartInstructions += `\n docker-compose up -d${servicesToStart} # Start infrastructure first\n npm run dev`;
+ } else {
+ manualStartInstructions += `\n npm run dev`;
+ }
+
+ console.log(chalk.cyan(`\nNext steps:\n cd ${answers.projectName}\n npm install\n docker-compose up\n-----------------------${manualStartInstructions}\n\n${chalk.yellow('Production (PM2):')}\n npm run build\n npm run deploy\n npx pm2 logs`));
+
+ } catch (error) {
+ if (error.name === 'ExitPromptError') {
+ console.log(chalk.yellow('\n\n👋 Goodbye! Setup cancelled.'));
+ process.exit(0);
+ }
+ console.error(chalk.red('Error generating project:'), error);
+ process.exit(1);
+ }
+ });
+ program.parse(process.argv);
+
+ if (!process.argv.slice(2).length) {
+ program.outputHelp();
+ }
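The behavioral change in the rewritten `bin/index.js` above is the infrastructure-selection logic: SQL databases now also start a `flyway` container, and Kafka starts without Zookeeper. Extracted as a pure function for illustration (the function name is ours, not part of the package):

```javascript
// Sketch of the updated servicesToStart logic from bin/index.js.
function buildServiceList(answers) {
  let services = '';
  // MongoDB needs only the db container; SQL databases also run Flyway migrations.
  if (answers.database === 'MongoDB') services += ' db';
  else if (answers.database !== 'None') services += ' db flyway';
  if (answers.caching === 'Redis') services += ' redis';
  // KRaft mode: the kafka service starts alone, no zookeeper service anymore.
  if (answers.communication === 'Kafka') services += ' kafka';
  return services;
}

console.log(buildServiceList({ database: 'MySQL', caching: 'Redis', communication: 'Kafka' }));
// " db flyway redis kafka"
```

The returned string is appended directly to `docker-compose up -d`, so it keeps its leading space.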
@@ -73,7 +73,7 @@ export const renderErrorMiddleware = async (templatePath, targetDir, config) =>
  const specExt = language === 'TypeScript' ? 'ts' : 'js';
  const specTemplatePath = path.join(templatePath, '../../common/src/utils', `errorMiddleware.spec.${specExt}.ejs`);
  if (await fs.pathExists(specTemplatePath)) {
- const testUtilsDir = path.join(targetDir, 'tests', 'utils');
+ const testUtilsDir = path.join(targetDir, 'tests', 'unit', 'utils');
  await fs.ensureDir(testUtilsDir);
  const specContent = ejs.render(await fs.readFile(specTemplatePath, 'utf-8'), config);
  await fs.writeFile(path.join(testUtilsDir, `errorMiddleware.spec.${specExt}`), specContent);
@@ -89,7 +89,7 @@ export const renderDynamicComponents = async (templatePath, targetDir, config) =
  const userControllerSpecName = language === 'TypeScript' ? 'userController.spec.ts' : 'userController.spec.js';
 
  const userControllerPath = path.join(targetDir, 'src/controllers', userControllerName);
- const userControllerSpecPath = path.join(targetDir, 'tests/controllers', userControllerSpecName);
+ const userControllerSpecPath = path.join(targetDir, 'tests/unit/controllers', userControllerSpecName);
 
  const userControllerTemplate = path.join(templatePath, 'src/controllers', `${userControllerName}.ejs`);
  const userControllerSpecTemplate = path.join(templatePath, 'src/controllers', `${userControllerSpecName}.ejs`);
@@ -101,7 +101,7 @@ export const renderDynamicComponents = async (templatePath, targetDir, config) =
  }
 
  if (await fs.pathExists(userControllerSpecTemplate)) {
- await fs.ensureDir(path.join(targetDir, 'tests/controllers'));
+ await fs.ensureDir(path.join(targetDir, 'tests/unit/controllers'));
  const content = ejs.render(await fs.readFile(userControllerSpecTemplate, 'utf-8'), { ...config });
  await fs.writeFile(userControllerSpecPath, content);
  await fs.remove(path.join(targetDir, 'src/controllers', `${userControllerSpecName}.ejs`));
@@ -113,7 +113,7 @@ export const renderDynamicComponents = async (templatePath, targetDir, config) =
  const repoSpecName = language === 'TypeScript' ? 'UserRepository.spec.ts' : 'UserRepository.spec.js';
 
  const repoPath = path.join(targetDir, 'src/infrastructure/repositories', repoName);
- const repoSpecPath = path.join(targetDir, 'tests/infrastructure/repositories', repoSpecName);
+ const repoSpecPath = path.join(targetDir, 'tests/unit/infrastructure/repositories', repoSpecName);
 
  const repoTemplate = path.join(templatePath, 'src/infrastructure/repositories', `${repoName}.ejs`);
  const repoSpecTemplate = path.join(templatePath, 'src/infrastructure/repositories', `${repoSpecName}.ejs`);
@@ -124,7 +124,7 @@ export const renderDynamicComponents = async (templatePath, targetDir, config) =
  await fs.remove(path.join(targetDir, 'src/infrastructure/repositories', `${repoName}.ejs`));
  }
  if (await fs.pathExists(repoSpecTemplate)) {
- await fs.ensureDir(path.join(targetDir, 'tests/infrastructure/repositories'));
+ await fs.ensureDir(path.join(targetDir, 'tests/unit/infrastructure/repositories'));
  const content = ejs.render(await fs.readFile(repoSpecTemplate, 'utf-8'), { ...config });
  await fs.writeFile(repoSpecPath, content);
  await fs.remove(path.join(targetDir, 'src/infrastructure/repositories', `${repoSpecName}.ejs`));
@@ -134,7 +134,7 @@ export const renderDynamicComponents = async (templatePath, targetDir, config) =
  const controllerSpecName = language === 'TypeScript' ? 'userController.spec.ts' : 'userController.spec.js';
 
  const controllerPath = path.join(targetDir, 'src/interfaces/controllers', controllerName);
- const controllerSpecPath = path.join(targetDir, 'tests/interfaces/controllers', controllerSpecName);
+ const controllerSpecPath = path.join(targetDir, 'tests/unit/interfaces/controllers', controllerSpecName);
 
  const controllerTemplate = path.join(templatePath, 'src/interfaces/controllers', `${controllerName}.ejs`);
  const controllerSpecTemplate = path.join(templatePath, 'src/interfaces/controllers', `${controllerSpecName}.ejs`);
@@ -146,7 +146,7 @@ export const renderDynamicComponents = async (templatePath, targetDir, config) =
  }
 
  if (await fs.pathExists(controllerSpecTemplate)) {
- await fs.ensureDir(path.join(targetDir, 'tests/interfaces/controllers'));
+ await fs.ensureDir(path.join(targetDir, 'tests/unit/interfaces/controllers'));
  const content = ejs.render(await fs.readFile(controllerSpecTemplate, 'utf-8'), { ...config });
  await fs.writeFile(controllerSpecPath, content);
  await fs.remove(path.join(targetDir, 'src/interfaces/controllers', `${controllerSpecName}.ejs`));
@@ -191,9 +191,9 @@ export const renderDynamicComponents = async (templatePath, targetDir, config) =
  // Render health route spec template
  const healthSpecTemplatePath = path.join(templatePath, '../../common/health', healthExt, `healthRoute.spec.${healthExt}.ejs`);
  if (await fs.pathExists(healthSpecTemplatePath)) {
- let testRouteDestDir = path.join(targetDir, 'tests', 'routes');
+ let testRouteDestDir = path.join(targetDir, 'tests', 'unit', 'routes');
  if (architecture === 'Clean Architecture') {
- testRouteDestDir = path.join(targetDir, 'tests', 'interfaces', 'routes');
+ testRouteDestDir = path.join(targetDir, 'tests', 'unit', 'interfaces', 'routes');
  }
  await fs.ensureDir(testRouteDestDir);
  const specContent = ejs.render(await fs.readFile(healthSpecTemplatePath, 'utf-8'), config);
@@ -215,13 +215,40 @@ export const renderDynamicComponents = async (templatePath, targetDir, config) =
  // Render graceful shutdown spec template
  const shutdownSpecTemplatePath = path.join(templatePath, '../../common/shutdown', shutdownExt, `gracefulShutdown.spec.${shutdownExt}.ejs`);
  if (await fs.pathExists(shutdownSpecTemplatePath)) {
- const testUtilsDestDir = path.join(targetDir, 'tests', 'utils');
+ const testUtilsDestDir = path.join(targetDir, 'tests', 'unit', 'utils');
  await fs.ensureDir(testUtilsDestDir);
  const specContent = ejs.render(await fs.readFile(shutdownSpecTemplatePath, 'utf-8'), config);
  await fs.writeFile(path.join(testUtilsDestDir, `gracefulShutdown.spec.${shutdownExt}`), specContent);
  }
  }
 
+ // Advanced E2E Testing Generation
+ const e2eExt = language === 'TypeScript' ? 'ts' : 'js';
+ const e2eTemplatePath = path.join(templatePath, '../../common/src/tests/e2e', `e2e.users.test.${e2eExt}.ejs`);
+
+ if (await fs.pathExists(e2eTemplatePath)) {
+ let e2eDestDir = path.join(targetDir, 'tests', 'e2e');
+ await fs.ensureDir(e2eDestDir);
+
+ const e2eContent = ejs.render(await fs.readFile(e2eTemplatePath, 'utf-8'), { ...config });
+ await fs.writeFile(path.join(e2eDestDir, `e2e.users.test.${e2eExt}`), e2eContent);
+ }
+
+ // E2E Test Orchestrator Generation
+ const e2eOrchestratorTemplatePath = path.join(templatePath, '../../common/scripts', 'run-e2e.js.ejs');
+ if (await fs.pathExists(e2eOrchestratorTemplatePath)) {
+ let scriptsDestDir = path.join(targetDir, 'scripts');
+ await fs.ensureDir(scriptsDestDir);
+
+ const orchestratorContent = ejs.render(await fs.readFile(e2eOrchestratorTemplatePath, 'utf-8'), { ...config });
+ await fs.writeFile(path.join(scriptsDestDir, 'run-e2e.js'), orchestratorContent);
+
+ // Cleanup the raw ejs copy in target
+ if (await fs.pathExists(path.join(scriptsDestDir, 'run-e2e.js.ejs'))) {
+ await fs.remove(path.join(scriptsDestDir, 'run-e2e.js.ejs'));
+ }
+ }
+
  // GraphQL Setup
  if (config.communication === 'GraphQL') {
  const ext = language === 'TypeScript' ? 'ts' : 'js';
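The hunk above generates `tests/e2e/e2e.users.test.*` from a template whose body is not shown in this diff. Based on the changelog's description (Supertest plus a dynamic `SERVER_URL` to avoid port collisions), the generated file plausibly resolves its target like this; the helper name, fallback port, and route are our assumptions:

```javascript
// Hypothetical shape of the SERVER_URL resolution in the generated e2e test.
const resolveServerUrl = (env) => env.SERVER_URL || 'http://localhost:3000';

// In the generated test, Supertest would then target the running container
// cluster directly rather than importing the Express app:
//   const request = require('supertest');
//   const api = request(resolveServerUrl(process.env));
//   it('creates a user', async () => {
//     await api.post('/api/users').send({ name: 'Jane' }).expect(201);
//   });

console.log(resolveServerUrl({ SERVER_URL: 'http://localhost:4100' }));
// http://localhost:4100
```

Because the URL comes from the environment, the orchestrator can map each compose run to a free host port and pass it down without editing the test files.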
@@ -342,13 +369,23 @@ export const processAllTests = async (targetDir, config) => {
  await processDir(itemPath);
  } else if (itemPath.endsWith('.spec.ts') ||
  itemPath.endsWith('.spec.js') ||
+ itemPath.endsWith('.test.ts') ||
+ itemPath.endsWith('.test.js') ||
  itemPath.endsWith('.spec.ts.ejs') ||
- itemPath.endsWith('.spec.js.ejs')) {
+ itemPath.endsWith('.spec.js.ejs') ||
+ itemPath.endsWith('.test.ts.ejs') ||
+ itemPath.endsWith('.test.js.ejs')) {
  const relativePath = path.relative(srcDir, itemPath);
 
  const cleanRelativePath = relativePath.replace(/\.ejs$/, '');
 
- const targetTestPath = path.join(testsDir, cleanRelativePath);
+ // Exclude e2e if it accidentally falls here, as it's processed separately
+ if (cleanRelativePath.includes('e2e')) {
+ await fs.remove(itemPath);
+ continue;
+ }
+
+ const targetTestPath = path.join(testsDir, 'unit', cleanRelativePath);
 
  await fs.ensureDir(path.dirname(targetTestPath));
 
@@ -64,7 +64,7 @@ export const setupCaching = async (templatesDir, targetDir, config) => {
  const specLoggerPath = architecture === 'Clean Architecture' ? '@/infrastructure/log/logger' : '@/utils/logger';
  const specRedisPath = architecture === 'Clean Architecture' ? '@/infrastructure/caching/redisClient' : '@/config/redisClient';
  const specContent = ejs.render(specTemplate, { ...config, loggerPath: specLoggerPath, redisClientPath: specRedisPath });
- const specTarget = cacheTarget.replace(`${path.sep}src${path.sep}`, `${path.sep}tests${path.sep}`).replace(`.${langExt}`, `.spec.${langExt}`);
+ const specTarget = cacheTarget.replace(`${path.sep}src${path.sep}`, `${path.sep}tests${path.sep}unit${path.sep}`).replace(`.${langExt}`, `.spec.${langExt}`);
  await fs.ensureDir(path.dirname(specTarget));
  await fs.writeFile(specTarget, specContent);
  }
@@ -47,6 +47,11 @@ export const renderProfessionalConfig = async (templatesDir, targetDir, config)
  const jestTemplate = await fs.readFile(path.join(templatesDir, 'common', 'jest.config.js.ejs'), 'utf-8');
  const jestContent = ejs.render(jestTemplate, { ...config });
  await fs.writeFile(path.join(targetDir, 'jest.config.js'), jestContent);
+
+ // E2E Config
+ const jestE2eTemplate = await fs.readFile(path.join(templatesDir, 'common', 'jest.e2e.config.js.ejs'), 'utf-8');
+ const jestE2eContent = ejs.render(jestE2eTemplate, { ...config });
+ await fs.writeFile(path.join(targetDir, 'jest.e2e.config.js'), jestE2eContent);
  };
 
  export const renderAiNativeFiles = async (templatesDir, targetDir, config) => {
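The `jest.e2e.config.js` written above comes from a template whose content is not shown in this diff. A plausible minimal shape, using standard Jest options (the specific keys and values below are assumptions, not taken from the package):

```javascript
// Sketch of a minimal jest.e2e.config.js a generator like this might emit.
const e2eConfig = {
  testMatch: ['**/tests/e2e/**/*.test.[jt]s'], // pick up only the e2e files
  testTimeout: 30000, // real containers respond slower than mocked units
  maxWorkers: 1,      // serialize tests against one shared container cluster
};

module.exports = e2eConfig;
```

A separate config file lets `npm run test:e2e` invoke `jest --config jest.e2e.config.js` without the unit-test coverage thresholds applying to the e2e run.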
@@ -67,7 +67,7 @@ export const setupDatabase = async (templatesDir, targetDir, config) => {
  if (await fs.pathExists(specTemplateSource)) {
  const specTemplate = await fs.readFile(specTemplateSource, 'utf-8');
  const specContent = ejs.render(specTemplate, { ...config });
- const specTarget = dbConfigTarget.replace(`${path.sep}src${path.sep}`, `${path.sep}tests${path.sep}`).replace(`.${langExt}`, `.spec.${langExt}`);
+ const specTarget = dbConfigTarget.replace(`${path.sep}src${path.sep}`, `${path.sep}tests${path.sep}unit${path.sep}`).replace(`.${langExt}`, `.spec.${langExt}`);
  await fs.ensureDir(path.dirname(specTarget));
  await fs.writeFile(specTarget, specContent);
  }
@@ -108,7 +108,7 @@ export const generateModels = async (templatesDir, targetDir, config) => {
  if (await fs.pathExists(modelSpecTemplateSource)) {
  const modelSpecTemplate = await fs.readFile(modelSpecTemplateSource, 'utf-8');
  const modelSpecContent = ejs.render(modelSpecTemplate, { ...config });
- const modelSpecTarget = modelTarget.replace(`${path.sep}src${path.sep}`, `${path.sep}tests${path.sep}`).replace(`.${langExt}`, `.spec.${langExt}`);
+ const modelSpecTarget = modelTarget.replace(`${path.sep}src${path.sep}`, `${path.sep}tests${path.sep}unit${path.sep}`).replace(`.${langExt}`, `.spec.${langExt}`);
  await fs.ensureDir(path.dirname(modelSpecTarget));
  await fs.writeFile(modelSpecTarget, modelSpecContent);
  }
@@ -63,9 +63,9 @@ export const setupKafka = async (templatesDir, targetDir, config) => {
  const specContent = ejs.render(await fs.readFile(kafkaConfigSpecTemplate, 'utf-8'), { ...config });
  let specTarget;
  if (architecture === 'MVC') {
- specTarget = path.join(targetDir, 'tests', 'config', kafkaConfigSpecFileName);
+ specTarget = path.join(targetDir, 'tests', 'unit', 'config', kafkaConfigSpecFileName);
  } else {
- specTarget = path.join(targetDir, 'tests', 'infrastructure', 'config', kafkaConfigSpecFileName);
+ specTarget = path.join(targetDir, 'tests', 'unit', 'infrastructure', 'config', kafkaConfigSpecFileName);
  }
  await fs.ensureDir(path.dirname(specTarget));
  await fs.writeFile(specTarget, specContent);
@@ -79,7 +79,7 @@ export const setupKafka = async (templatesDir, targetDir, config) => {
  if (architecture === 'Clean Architecture') {
  // Clean Architecture Restructuring
  await fs.ensureDir(path.join(targetDir, 'src/infrastructure/messaging'));
- await fs.ensureDir(path.join(targetDir, 'tests/infrastructure/messaging'));
+ await fs.ensureDir(path.join(targetDir, 'tests/unit/infrastructure/messaging'));
  await fs.ensureDir(path.join(targetDir, 'src/infrastructure/config'));
 
  const serviceExt = language === 'TypeScript' ? 'ts' : 'js';
@@ -93,7 +93,7 @@ export const setupKafka = async (templatesDir, targetDir, config) => {
  if (await fs.pathExists(path.join(targetDir, `src/services/kafkaService.spec.${serviceExt}`))) {
  await fs.move(
  path.join(targetDir, `src/services/kafkaService.spec.${serviceExt}`),
- path.join(targetDir, `tests/infrastructure/messaging/kafkaClient.spec.${serviceExt}`),
+ path.join(targetDir, `tests/unit/infrastructure/messaging/kafkaClient.spec.${serviceExt}`),
  { overwrite: true }
  );
  }
@@ -131,7 +131,7 @@ export const setupKafka = async (templatesDir, targetDir, config) => {
  const specTemplateSource = path.join(templatesDir, 'common', 'kafka', langExt, 'messaging', `${t.src}.spec.${langExt}.ejs`);
  if (await fs.pathExists(specTemplateSource)) {
  const specContent = ejs.render(await fs.readFile(specTemplateSource, 'utf-8'), { ...config, loggerPath });
- const specDest = path.join(targetDir, 'tests', `${t.dest}.spec.${langExt}`);
+ const specDest = path.join(targetDir, 'tests', 'unit', `${t.dest}.spec.${langExt}`);
  await fs.ensureDir(path.dirname(specDest));
  await fs.writeFile(specDest, specContent);
  }
@@ -162,17 +162,17 @@ export const setupKafka = async (templatesDir, targetDir, config) => {
  const specTemplateSource = path.join(templatesDir, 'common', 'kafka', langExt, 'messaging', `${t.src}.spec.${langExt}.ejs`);
  if (await fs.pathExists(specTemplateSource)) {
  const specContent = ejs.render(await fs.readFile(specTemplateSource, 'utf-8'), { ...config, loggerPath });
- const specDest = path.join(targetDir, 'tests', `${t.dest}.spec.${langExt}`);
+ const specDest = path.join(targetDir, 'tests', 'unit', `${t.dest}.spec.${langExt}`);
  await fs.ensureDir(path.dirname(specDest));
  await fs.writeFile(specDest, specContent);
  }
  }
 
  if (await fs.pathExists(path.join(targetDir, `src/services/kafkaService.spec.${serviceExt}`))) {
- await fs.ensureDir(path.join(targetDir, 'tests/services'));
+ await fs.ensureDir(path.join(targetDir, 'tests/unit/services'));
  await fs.move(
  path.join(targetDir, `src/services/kafkaService.spec.${serviceExt}`),
- path.join(targetDir, `tests/services/kafkaService.spec.${serviceExt}`),
+ path.join(targetDir, `tests/unit/services/kafkaService.spec.${serviceExt}`),
  { overwrite: true }
  );
  }
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "nodejs-quickstart-structure",
- "version": "1.16.2",
+ "version": "1.17.0",
  "type": "module",
  "description": "The ultimate nodejs quickstart structure CLI to scaffold Node.js microservices with MVC or Clean Architecture",
  "main": "bin/index.js",
@@ -20,7 +20,7 @@ When indexing or searching the workspace, ignore the following paths to prevent
 
  ### 1. Testing First
  - Every new service or controller method MUST have a test file in `tests/`.
- - **Coverage Gate**: Aim for > 70% coverage (Statement/Line/Function/Branch).
+ - **Coverage Gate**: Aim for > 80% coverage (Statement/Line/Function/Branch).
  - **Format**: Use Jest with the AAA (Arrange, Act, Assert) pattern.
  - **Isolation**: Mock external dependencies (DB, Redis, etc.) using `jest.mock()`.
 
@@ -28,7 +28,7 @@ DB_NAME=<%= dbName %>
 
  <%_ if (communication === 'Kafka') { -%>
  # Communication
- KAFKA_BROKER=localhost:9092
+ KAFKA_BROKER=localhost:9093
  KAFKA_CLIENT_ID=<%= projectName %>
  KAFKA_GROUP_ID=<%= projectName %>-group
  <%_ } -%>
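The `.env` template above now points `KAFKA_BROKER` at port 9093, consistent with a host-mapped KRaft listener. A generated client typically splits this value into the broker array a library such as KafkaJS expects; `parseBrokers` is our illustrative helper, not code from the package:

```javascript
// Split a comma-separated broker env value into a broker list.
const parseBrokers = (value) => value.split(',').map((b) => b.trim());

// e.g. new Kafka({ clientId: process.env.KAFKA_CLIENT_ID,
//                  brokers: parseBrokers(process.env.KAFKA_BROKER) })
console.log(parseBrokers('localhost:9093'));
// [ 'localhost:9093' ]
```

Keeping the broker address in the environment means the same generated code works against `localhost:9093` in local dev and `kafka:9092` inside the compose network.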
@@ -22,12 +22,22 @@ lint_code:
  script:
  - npm run lint
 
- run_tests:
+ run_unit_tests:
  stage: test
  image: node:22-alpine
  script:
  - npm run test:coverage
 
+ run_e2e_tests:
+ stage: test
+ image: docker:20.10.16
+ services:
+ - docker:20.10.16-dind
+ script:
+ - apk add --no-cache nodejs npm docker-compose
+ - npm ci
+ - npm run test:e2e
+
  build_app:
  stage: build
  image: node:22-alpine
@@ -19,12 +19,18 @@ pipeline {
  }
  }

- stage('Test') {
+ stage('Unit Test') {
  steps {
  sh 'npm run test:coverage'
  }
  }

+ stage('E2E Test') {
+ steps {
+ sh 'npm run test:e2e'
+ }
+ }
+
  // stage('Build') {
  // steps {
  // sh 'npm run build'
@@ -124,7 +124,7 @@ This project demonstrates a production-ready Kafka flow:
  2. **Consumer**: `WelcomeEmailConsumer` listens to `user-topic` and simulates sending an email.

  ### How to verify:
- 1. Ensure infrastructure is running: `docker-compose up -d<% if (database !== 'None') { %> db<% } %><% if (caching === 'Redis') { %> redis<% } %><% if (communication === 'Kafka') { %> zookeeper kafka<% } %>`
+ 1. Ensure infrastructure is running: `docker-compose up -d<% if (database !== 'None') { %> db<% } %><% if (caching === 'Redis') { %> redis<% } %><% if (communication === 'Kafka') { %> kafka<% } %>`
  2. Start the app: `npm run dev`
  3. Trigger an event by creating a user (via Postman or curl):
  ```bash
@@ -167,7 +167,7 @@ To run the Node.js application locally while using Docker for the infrastructure

  ```bash
  # Start infrastructure
- docker-compose up -d<% if (database !== 'None') { %> db<% } %><% if (caching === 'Redis') { %> redis<% } %><% if (communication === 'Kafka') { %> zookeeper kafka<% } %>
+ docker-compose up -d<% if (database !== 'None') { %> db<% } %><% if (caching === 'Redis') { %> redis<% } %><% if (communication === 'Kafka') { %> kafka<% } %>

  # Start the application
  npm run dev
@@ -215,7 +215,7 @@ npm install
  2. **Start Infrastructure (DB, Redis, Kafka, etc.) in the background**
  *(This specifically starts the background services without running the application inside Docker, allowing PM2 to handle it).*
  ```bash
- docker-compose up -d<% if (database !== 'None') { %> db<% } %><% if (caching === 'Redis') { %> redis<% } %><% if (communication === 'Kafka') { %> zookeeper kafka<% } %>
+ docker-compose up -d<% if (database !== 'None') { %> db<% } %><% if (caching === 'Redis') { %> redis<% } %><% if (communication === 'Kafka') { %> kafka<% } %>
  ```
  3. **Wait 5-10s** for the database to fully initialize.
  4. **Deploy the App using PM2 in Cluster Mode**
@@ -248,6 +248,6 @@ docker-compose down
  This project is "AI-Ready" out of the box. We have pre-configured industry-leading AI context files to bridge the gap between "Generated Code" and "AI-Assisted Development."

  - **Magic Defaults**: We've automatically tailored your AI context to focus on **<%= projectName %>** and its specific architectural stack (<%= architecture %>, <%= database %>, etc.).
- - **Use Cursor?** We've configured **`.cursorrules`** at the root. It enforces project standards (70% coverage, MVC/Clean) directly within the editor.
+ - **Use Cursor?** We've configured **`.cursorrules`** at the root. It enforces project standards (80% coverage, MVC/Clean) directly within the editor.
  - *Pro-tip*: You can customize the `Project Goal` placeholder in `.cursorrules` to help the AI understand your specific business logic!
  - **Use ChatGPT/Gemini/Claude?** Check the **`prompts/`** directory. It contains highly-specialized Agent Skill templates. You can copy-paste these into any LLM to give it a "Senior Developer" understanding of your codebase immediately.
@@ -33,8 +33,11 @@ jobs:
  - name: Lint Code
  run: npm run lint

- - name: Run Tests
+ - name: Run Unit Tests
  run: npm run test:coverage

+ - name: Run E2E Tests
+ run: npm run test:e2e
+
  - name: Build
  run: npm run build --if-present
@@ -1,5 +1,3 @@
- const Redis = require('ioredis');
-
  jest.mock('ioredis', () => {
  const mRedis = jest.fn(() => ({
  on: jest.fn(),
@@ -14,7 +14,7 @@ services:
  <%_ } -%>
  <%_ if (communication === 'Kafka') { -%>
  environment:
- - KAFKA_BROKER=kafka:29092
+ - KAFKA_BROKER=kafka:9092
  - KAFKAJS_NO_PARTITIONER_WARNING=1
  - PORT=3000
  <%_ if (caching === 'Redis') { -%>
@@ -131,27 +131,20 @@ services:
  - db
  <%_ } -%>
  <%_ if (communication === 'Kafka') { -%>
- zookeeper:
- image: confluentinc/cp-zookeeper:7.4.0
- environment:
- ZOOKEEPER_CLIENT_PORT: 2181
- ZOOKEEPER_TICK_TIME: 2000
- ports:
- - "${ZOOKEEPER_PORT:-2181}:2181"
-
  kafka:
- image: confluentinc/cp-kafka:7.4.0
- depends_on:
- - zookeeper
+ image: bitnamilegacy/kafka:3.4.1-debian-12-r39
  ports:
- - "${KAFKA_PORT:-9092}:9092"
+ - "${KAFKA_EXTERNAL_PORT:-9093}:9093"
  environment:
- KAFKA_BROKER_ID: 1
- KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
- KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:29092,PLAINTEXT_HOST://localhost:9092
- KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
- KAFKA_INTER_BROKER_LISTENER_NAME: PLAINTEXT
- KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
+ - KAFKA_ENABLE_KRAFT=yes
+ - KAFKA_CFG_PROCESS_ROLES=broker,controller
+ - KAFKA_CFG_CONTROLLER_QUORUM_VOTERS=1@localhost:9094
+ - KAFKA_CFG_NODE_ID=1
+ - KAFKA_CFG_ADVERTISED_LISTENERS=PLAINTEXT://kafka:9092,PLAINTEXT_HOST://localhost:${KAFKA_PORT:-9093}
+ - KAFKA_CFG_LISTENERS=PLAINTEXT://:9092,PLAINTEXT_HOST://:9093,CONTROLLER://:9094
+ - KAFKA_CFG_LISTENER_SECURITY_PROTOCOL_MAP=CONTROLLER:PLAINTEXT,PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
+ - KAFKA_CFG_CONTROLLER_LISTENER_NAMES=CONTROLLER
+ - KAFKA_CFG_INTER_BROKER_LISTENER_NAME=PLAINTEXT
  <%_ } -%>
  <%_ if (caching === 'Redis') { -%>
  redis:
@@ -15,7 +15,7 @@ module.exports = {
  REDIS_PASSWORD: "",
  <%_ } -%>
  <%_ if (communication === 'Kafka') { -%>
- KAFKA_BROKER: "127.0.0.1:9092",
+ KAFKA_BROKER: "127.0.0.1:9093",
  KAFKAJS_NO_PARTITIONER_WARNING: 1,
  <%_ } -%>
  <%_ if (database !== 'None') { -%>
@@ -3,6 +3,7 @@ module.exports = {
  coverageDirectory: 'coverage',
  collectCoverageFrom: ['src/**/*.{js,ts}'],
  testMatch: ['**/*.test.ts', '**/*.test.js', '**/*.spec.ts', '**/*.spec.js'],
+ testPathIgnorePatterns: ['/node_modules/', '/tests/e2e/'],
  <% if (language === 'TypeScript') { %>preset: 'ts-jest',<% } %>
  moduleNameMapper: {
  '^@/(.*)$': '<rootDir>/src/$1',
@@ -0,0 +1,8 @@
+ <%_ if (language === 'TypeScript') { _%>/* eslint-disable @typescript-eslint/no-require-imports */<%_ } _%>
+ module.exports = {
+ ...require('./jest.config'),
+ testMatch: ['<rootDir>/tests/e2e/**/*.test.ts', '<rootDir>/tests/e2e/**/*.test.js'],
+ testPathIgnorePatterns: ['/node_modules/'],
+ testTimeout: 30000,
+ clearMocks: true
+ };
@@ -16,7 +16,9 @@
  <% } -%>
  "test": "jest",
  "test:watch": "jest --watch",
- "test:coverage": "jest --coverage"
+ "test:coverage": "jest --coverage",
+ "test:e2e:run": "jest --config ./jest.e2e.config.js --passWithNoTests",
+ "test:e2e": "node scripts/run-e2e.js"
  },
  "dependencies": {
  "express": "^4.18.2",
@@ -62,15 +64,13 @@
  "@types/express": "^4.17.21",
  "@types/cors": "^2.8.17",
  "@types/hpp": "^0.2.3",
- <% if (caching === 'Redis') { %> "@types/ioredis": "^5.0.0",
- <% } -%>
  <% if (caching === 'Memory Cache') { %> "@types/node-cache": "^4.2.5",
  <% } -%>
  <% if (database === 'PostgreSQL') { %> "@types/pg": "^8.10.9",
  <% } -%>
- <%_ if (database === 'MySQL' || database === 'PostgreSQL') { -%>
+ <% if (database === 'MySQL' || database === 'PostgreSQL') { -%>
  "@types/sequelize": "^4.28.19",
- <%_ } -%>
+ <% } -%>
  "@types/morgan": "^1.9.9",
  "rimraf": "^6.0.1"<% if ((viewEngine && viewEngine !== 'None') || communication === 'REST APIs' || communication === 'Kafka') { %>,
  "cpx2": "^8.0.0"<% } %><% } %>,
@@ -89,12 +89,14 @@
  "jest": "^29.7.0",
  "ts-jest": "^29.2.5",
  "@types/jest": "^29.5.14",
- "supertest": "6.3.3",
+ "wait-on": "^7.2.0",
+ "supertest": "^7.1.3",
  "tsconfig-paths": "^4.2.0",
  "tsc-alias": "^1.8.10",
  "@types/supertest": "^6.0.2"<% } else { %>,
  "jest": "^29.7.0",
- "supertest": "6.3.3"<% } %>
+ "wait-on": "^7.2.0",
+ "supertest": "^7.1.3"<% } %>
  },
  "lint-staged": {
  "*.{js,ts}": [
@@ -21,6 +21,6 @@ Please provide the code implementation following these steps:
  3. **Controller**: Implement the business logic and request handling.
  4. **Route**: Create the API endpoints and wire them to the controller.
  <% } -%>
- 6. **Testing**: Write comprehensive Jest unit tests covering the "Happy Path" and "Edge Cases/Errors" (AAA pattern). Remember, our coverage requirement is > 70%!
+ 6. **Testing**: Write comprehensive Jest unit tests covering the "Happy Path" and "Edge Cases/Errors" (AAA pattern). Remember, our coverage requirement is > 80%!

  Please provide the plan first so I can review it before we write the code.
@@ -32,7 +32,7 @@ We use the MVC (Model-View-Controller) pattern.
  <% } -%>

  ## Core Standards
- 1. **Testing**: We enforce > 70% coverage. Tests use Jest and the AAA (Arrange, Act, Assert) pattern.
+ 1. **Testing**: We enforce > 80% coverage. Tests use Jest and the AAA (Arrange, Act, Assert) pattern.
  2. **Error Handling**: We use centralized custom errors (e.g., `ApiError`) and global error middleware. Status codes come from standard constants, not hardcoded numbers.
  3. **Paths & Naming**:
  <% if (language === 'TypeScript') { -%>
@@ -0,0 +1,63 @@
+ /* eslint-disable */
+ const { execSync } = require('child_process');
+ const path = require('path');
+
+ // Set a specific port for E2E tests to avoid collisions with local development
+ process.env.PORT = '3001';
+ const TEST_PORT = process.env.PORT;
+
+ const execute = (command) => {
+ console.log(`\n> ${command}`);
+ // Run commands from the project root instead of the scripts folder
+ execSync(command, { stdio: 'inherit', cwd: path.resolve(__dirname, '../') });
+ };
+
+ let composeCmd = 'docker-compose';
+ try {
+ execSync('docker compose version', { stdio: 'ignore' });
+ composeCmd = 'docker compose';
+ } catch (e) {
+ // fallback to docker-compose
+ }
+
+ let currentProcessStartedDocker = false;
+
+ try {
+ let isAlreadyUp = false;
+ try {
+ // Silently check if the endpoint is already live (1.5-second timeout)
+ execSync(`npx wait-on http-get://127.0.0.1:${TEST_PORT}/health -t 1500`, {
+ stdio: 'ignore',
+ cwd: path.resolve(__dirname, '../')
+ });
+ isAlreadyUp = true;
+ } catch (e) {
+ isAlreadyUp = false;
+ }
+
+ if (isAlreadyUp) {
+ console.log('Infrastructure is already running! Skipping Docker spin-up...');
+ } else {
+ console.log(`Starting Docker Compose infrastructure using '${composeCmd}'...`);
+ execute(`${composeCmd} up -d --build`);
+ currentProcessStartedDocker = true;
+
+ console.log('Waiting for application healthcheck to turn green (120s timeout)...');
+ // Using wait-on to poll the universal /health endpoint injected into all architectures
+ execute(`npx wait-on http-get://127.0.0.1:${TEST_PORT}/health -t 120000`);
+ console.log('Infrastructure is healthy!');
+ }
+
+ console.log('Running E2E tests...');
+ execute('npm run test:e2e:run');
+ } catch (error) {
+ console.error('E2E tests failed or infrastructure did not boot in time.');
+ process.exitCode = 1;
+ } finally {
+ if (currentProcessStartedDocker) {
+ console.log('Tearing down isolated Docker Compose infrastructure...');
+ execute(`${composeCmd} down`);
+ } else {
+ console.log('Leaving preexisting infrastructure running.');
+ }
+ }
@@ -0,0 +1,49 @@
+ const request = require('supertest');
+
+ const SERVER_URL = process.env.TEST_URL || `http://127.0.0.1:${process.env.PORT || 3001}`;
+
+ describe('E2E User Tests', () => {
+ // Global setup and teardown hooks can be added here
+ // typically for database seeding or external authentication checks prior to E2E.
+ const uniqueEmail = `test_${Date.now()}@example.com`;
+
+ <%_ if (communication === 'GraphQL') { _%>
+ it('should create a user and verify flow via GraphQL', async () => {
+ const query = `
+ mutation {
+ createUser(name: "Test User", email: "${uniqueEmail}") {
+ id
+ name
+ email
+ }
+ }
+ `;
+ const response = await request(SERVER_URL)
+ .post('/graphql')
+ .send({ query });
+
+ expect(response.statusCode).toBe(200);
+ });
+ <%_ } else if (communication === 'Kafka') { _%>
+ it('should trigger Kafka event for user creation', async () => {
+ const response = await request(SERVER_URL)
+ .post('/api/users')
+ .send({ name: 'Test User', email: uniqueEmail });
+
+ // Assuming the API returns 201 or 404 (if no REST endpoint is exposed in Kafka skeleton)
+ expect([201, 202, 404]).toContain(response.statusCode);
+
+ // Wait for Kafka to process...
+ await new Promise(resolve => setTimeout(resolve, 1000));
+ });
+ <%_ } else { _%>
+ it('should create a user successfully via REST', async () => {
+ const response = await request(SERVER_URL)
+ .post('/api/users')
+ .send({ name: 'Test User', email: uniqueEmail });
+
+ // E2E Tests must have strict and deterministic assertions
+ expect(response.statusCode).toBe(201);
+ });
+ <%_ } _%>
+ });
@@ -0,0 +1,49 @@
+ import request from 'supertest';
+
+ const SERVER_URL = process.env.TEST_URL || `http://127.0.0.1:${process.env.PORT || 3001}`;
+
+ describe('E2E User Tests', () => {
+ // Global setup and teardown hooks can be added here
+ // typically for database seeding or external authentication checks prior to E2E.
+ const uniqueEmail = `test_${Date.now()}@example.com`;
+
+ <%_ if (communication === 'GraphQL') { _%>
+ it('should create a user and verify flow via GraphQL', async () => {
+ const query = `
+ mutation {
+ createUser(name: "Test User", email: "${uniqueEmail}") {
+ id
+ name
+ email
+ }
+ }
+ `;
+ const response = await request(SERVER_URL)
+ .post('/graphql')
+ .send({ query });
+
+ expect(response.statusCode).toBe(200);
+ });
+ <%_ } else if (communication === 'Kafka') { _%>
+ it('should trigger Kafka event for user creation', async () => {
+ const response = await request(SERVER_URL)
+ .post('/api/users')
+ .send({ name: 'Test User', email: uniqueEmail });
+
+ // Assuming the API returns 201 or 404 (if no REST endpoint is exposed in Kafka skeleton)
+ expect([201, 202, 404]).toContain(response.statusCode);
+
+ // Wait for Kafka to process...
+ await new Promise(resolve => setTimeout(resolve, 1000));
+ });
+ <%_ } else { _%>
+ it('should create a user successfully via REST', async () => {
+ const response = await request(SERVER_URL)
+ .post('/api/users')
+ .send({ name: 'Test User', email: uniqueEmail });
+
+ // E2E Tests must have strict and deterministic assertions
+ expect(response.statusCode).toBe(201);
+ });
+ <%_ } _%>
+ });