cistack 2.0.0 → 3.0.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -10,12 +10,15 @@
 
  - 🔍 **Deep codebase analysis** — reads `package.json`, lock files, config files, and directory structure
  - 🧠 **Smart detection** — identifies 30+ frameworks, 12 languages, 12+ testing tools, and 10+ hosting platforms
+ - ⚡ **Native cache support** — speeds up pipelines by 2–4 minutes with native caching for npm, pip, go, cargo, maven, gradle, and bundler
+ - ✨ **PR preview deploys** — automatic preview environments for Vercel and Netlify on every pull request
  - 🚀 **Hosting auto-detect** — Firebase, Vercel, Netlify, AWS, GCP, Azure, Heroku, Render, Railway, GitHub Pages, Docker
- - 🏗️ **Multi-workflow output** — generates separate `ci.yml`, `deploy.yml`, `docker.yml`, and `security.yml` as appropriate
+ - 🛡️ **Workflow audit & upgrade** — analyze existing `.github/workflows` for outdated actions and missing best practices
+ - 🏗️ **Multi-workflow output** — generates separate `ci.yml`, `deploy.yml`, `docker.yml`, and `security.yml`
  - 🔒 **Security built-in** — CodeQL analysis + dependency auditing on every pipeline
- - 📦 **Monorepo aware** — detects Turborepo, Nx, Lerna, pnpm workspaces
+ - 📦 **Monorepo aware** — detects Turborepo, Nx, Lerna, pnpm workspaces (supports per-package workflows)
  - ✅ **Interactive mode** — confirms detected settings before writing files
- - 🎯 **Zero config** — works out of the box with no configuration needed
+ - 🎯 **Zero config** — works out of the box, with optional `cistack.config.js` overrides
 
  ---
 
@@ -33,10 +36,15 @@ npm install -g cistack
 
  ## Usage
 
+ ### Generate Pipelines
+ Analyze your stack and generate best-practice workflows.
  ```bash
  # In your project directory
  npx cistack
 
+ # Show reasoning for the detected stack
+ npx cistack --explain
+
  # Specify a project path
  npx cistack --path /path/to/project
 
@@ -45,19 +53,40 @@ npx cistack --output .github/workflows
 
  # Dry run (print YAML without writing files)
  npx cistack --dry-run
+ ```
 
- # Skip interactive prompts
- npx cistack --no-prompt
+ ### Audit Existing Workflows
+ Analyze your current `.github/workflows` folder for outdated actions or missing features.
+ ```bash
+ npx cistack audit
+ ```
 
- # Verbose output
- npx cistack --verbose
+ ### Automatic Upgrade
+ Automatically bump action versions (e.g., `actions/checkout@v3` → `@v4`) across your workflow files to the latest stable releases.
+ ```bash
+ npx cistack upgrade
+ ```
 
- # Force overwrite existing files
- npx cistack --force
+ ### Initialization
+ Create a `cistack.config.js` to override auto-detected settings.
+ ```bash
+ npx cistack init
  ```
 
  ---
 
+ ## Flags
+
+ - `--explain` — Show detailed reasoning for every detection
+ - `--dry-run` — Print YAML to the terminal without writing to disk
+ - `--force` — Overwrite existing files instead of smart-merging
+ - `--no-prompt` — Skip interactive confirmation
+ - `--verbose` — Show raw analysis data
+ - `--path <dir>` — Project root directory
+ - `--output <dir>` — Workflow output directory (default: `.github/workflows`)
+
+ ---
+
  ## Detected Hosting Platforms
 
  | Platform | Detection Signal |
@@ -112,21 +141,23 @@ Runs on every push and pull request:
 2. **Test** — unit tests with coverage upload (matrix across Node versions)
 3. **Build** — production build, artifact upload
 4. **E2E** — Cypress / Playwright (if detected)
+ 5. **Caching** — full dependency caching for faster runs
 
  ### `deploy.yml` — Continuous Deployment
  Triggers on push to `main`/`master` + manual dispatch:
- - Platform-specific deploy using the best available GitHub Action
+ - Platform-specific deploy using official GitHub Actions
+ - **PR preview deploys** — automatic previews for Vercel and Netlify pull requests
  - Proper secret references documented in the file header
 
  ### `docker.yml` — Docker Build & Push
  Triggers on push to `main` and version tags:
  - Multi-platform build via Docker Buildx
  - Pushes to GitHub Container Registry (GHCR)
- - Build cache via GitHub Actions cache
+ - Build cache via the GitHub Actions cache backend (`type=gha`)
 
  ### `security.yml` — Security Audit
  Runs on push, PRs, and weekly schedule:
- - Dependency vulnerability audit (npm audit / safety / etc.)
+ - Dependency vulnerability audit (npm audit / safety / cargo audit)
  - GitHub CodeQL analysis for the detected language
 
  ---
@@ -142,28 +173,11 @@ Each generated `deploy.yml` has a comment at the top listing the exact secrets n
 
  ## Examples
 
- **Next.js + Vercel project:**
- ```
- npx cistack
- # → .github/workflows/ci.yml (lint, test, build)
- # → .github/workflows/deploy.yml (vercel deploy)
- # → .github/workflows/security.yml
- ```
-
- **Firebase + React project:**
- ```
- npx cistack
- # → .github/workflows/ci.yml
- # → .github/workflows/deploy.yml (firebase deploy --only hosting)
- # → .github/workflows/security.yml
- ```
-
- **Node.js API + Docker:**
- ```
- npx cistack
- # → .github/workflows/ci.yml
- # → .github/workflows/docker.yml (GHCR push)
- # → .github/workflows/security.yml
+ **Next.js + Vercel project with audit:**
+ ```bash
+ npx cistack audit     # Check existing workflows
+ npx cistack upgrade   # Bump actions to their latest versions
+ npx cistack generate  # Refresh with latest caching & previews
  ```
 
  ---
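The `cistack.config.js` referenced by `npx cistack init` is not shown in this diff. A minimal sketch of what such a config could look like, assuming keys that mirror the overrides handled in the ConfigLoader changes (`nodeVersion`, `packageManager`, `hosting`, `frameworks`, `testing`) — the exact schema is an assumption, not documented API:

```javascript
// Hypothetical cistack.config.js — key names mirror the overrides that
// ConfigLoader.applyToStack reads; the schema itself is an assumption.
const config = {
  nodeVersion: 20,          // pins the primary JS/TS language entry
  packageManager: 'pnpm',   // overrides the detected package manager
  hosting: 'Netlify',       // a single name or an array are both accepted
  frameworks: ['Next.js'],
  testing: ['Jest'],
};

module.exports = config;
```

With `module.exports`, the loader's `require()` path picks this up directly; an `export default` config would go through the `__esModule` branch instead.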
package/bin/ciflow.js CHANGED
@@ -20,6 +20,7 @@ program
  .option('--force', 'Overwrite existing workflow files without smart-merge')
  .option('--no-prompt', 'Skip interactive prompts and use detected settings')
  .option('--verbose', 'Show detailed analysis output')
+ .option('--explain', 'Show reasoning for the detected stack')
  .action(async (options) => {
  const ciflow = new CIFlow({
  projectPath: path.resolve(options.path),
@@ -28,12 +29,35 @@ program
  force: options.force,
  prompt: options.prompt,
  verbose: options.verbose,
+ explain: options.explain,
  });
  await ciflow.run();
  });
 
+ program
+ .command('audit')
+ .description('Analyze the existing .github/workflows/ folder and suggest fixes')
+ .option('-p, --path <dir>', 'Path to the project root', process.cwd())
+ .action(async (options) => {
+ const ciflow = new CIFlow({ projectPath: path.resolve(options.path) });
+ await ciflow.audit();
+ });
+
+ program
+ .command('upgrade')
+ .description('Automatically bump action versions across all workflow files')
+ .option('-p, --path <dir>', 'Path to the project root', process.cwd())
+ .option('--dry-run', 'Show what would be upgraded without modifying files')
+ .action(async (options) => {
+ const ciflow = new CIFlow({
+ projectPath: path.resolve(options.path),
+ dryRun: options.dryRun,
+ });
+ await ciflow.upgrade();
+ });
+
  program
  .command('init')
  .description('Create a starter cistack.config.js in the current directory')
  .option('-p, --path <dir>', 'Path to the project root', process.cwd())
  .action(async (options) => {
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "cistack",
- "version": "2.0.0",
+ "version": "3.0.0",
  "description": "Automatically generate GitHub Actions CI/CD pipelines by analysing your codebase",
  "main": "src/index.js",
  "bin": {
@@ -0,0 +1,195 @@
+ 'use strict';
+
+ const fs = require('fs');
+ const path = require('path');
+ const yaml = require('js-yaml');
+ const chalk = require('chalk');
+
+ class WorkflowAnalyzer {
+   constructor(projectPath) {
+     this.projectPath = projectPath;
+     this.workflowsDir = path.join(projectPath, '.github/workflows');
+
+     // Latest stable versions for common actions
+     this.latestVersions = {
+       'actions/checkout': 'v4',
+       'actions/setup-node': 'v4',
+       'actions/setup-python': 'v5',
+       'actions/setup-java': 'v4',
+       'actions/setup-go': 'v5',
+       'actions/upload-artifact': 'v4',
+       'actions/download-artifact': 'v4',
+       'actions/cache': 'v4',
+       'docker/setup-buildx-action': 'v3',
+       'docker/login-action': 'v3',
+       'docker/build-push-action': 'v5',
+       'docker/metadata-action': 'v5',
+       'pnpm/action-setup': 'v3',
+       'codecov/codecov-action': 'v4',
+       'github/codeql-action/init': 'v3',
+       'github/codeql-action/analyze': 'v3',
+       'github/codeql-action/autobuild': 'v3',
+     };
+   }
+
+   async audit() {
+     const results = {
+       files: [],
+       totalIssues: 0,
+       suggestions: [],
+     };
+
+     if (!fs.existsSync(this.workflowsDir)) {
+       return results;
+     }
+
+     const files = fs.readdirSync(this.workflowsDir).filter(f => f.endsWith('.yml') || f.endsWith('.yaml'));
+
+     for (const filename of files) {
+       const filePath = path.join(this.workflowsDir, filename);
+       const content = fs.readFileSync(filePath, 'utf8');
+
+       try {
+         const parsed = yaml.load(content);
+         const fileIssues = this._auditFile(filename, parsed, content);
+         results.files.push({
+           filename,
+           issues: fileIssues,
+         });
+         results.totalIssues += fileIssues.length;
+       } catch (err) {
+         results.files.push({
+           filename,
+           error: `Failed to parse YAML: ${err.message}`,
+         });
+       }
+     }
+
+     return results;
+   }
+
+   _auditFile(filename, parsed, rawContent) {
+     const issues = [];
+
+     // 1. Check for concurrency
+     if (!parsed.concurrency) {
+       issues.push({
+         type: 'missing_concurrency',
+         severity: 'medium',
+         message: 'Missing concurrency group (highly recommended to prevent redundant runs)',
+         fix: 'Add concurrency block with cancel-in-progress: true',
+       });
+     }
+
+     // 2. Check for outdated actions
+     const actionRegex = /uses:\s*([\w\-\/]+)@([\w\.]+)/g;
+     let match;
+     while ((match = actionRegex.exec(rawContent)) !== null) {
+       const fullAction = match[0];
+       const actionName = match[1];
+       const currentVersion = match[2];
+
+       const latest = this.latestVersions[actionName];
+       if (latest && this._isOutdated(currentVersion, latest)) {
+         issues.push({
+           type: 'outdated_action',
+           severity: 'low',
+           message: `Outdated action: ${actionName}@${currentVersion} (latest is ${latest})`,
+           action: actionName,
+           current: currentVersion,
+           latest: latest,
+           fix: `Update to @${latest}`,
+         });
+       }
+     }
+
+     // 3. Check for node-version (hardcoded or in a matrix)
+     const rawLines = rawContent.split('\n');
+     for (let i = 0; i < rawLines.length; i++) {
+       if (rawLines[i].includes('node-version:')) {
+         const versionMatch = rawLines[i].match(/node-version:\s*['"]?(\d+)['"]?/);
+         if (versionMatch && parseInt(versionMatch[1]) < 18) {
+           issues.push({
+             type: 'old_node_version',
+             severity: 'medium',
+             message: `Using end-of-life Node.js version: ${versionMatch[1]}`,
+             line: i + 1,
+             fix: 'Upgrade to Node.js 18 or 20',
+           });
+         }
+       }
+     }
+
+     // 4. Check for caching
+     if (rawContent.includes('actions/setup-node') && !rawContent.includes('cache:')) {
+       issues.push({
+         type: 'missing_cache',
+         severity: 'high',
+         message: 'Dependency caching is not enabled in setup-node',
+         fix: 'Add cache: "npm" (or yarn/pnpm) to actions/setup-node',
+       });
+     }
+
+     return issues;
+   }
+
+   async upgrade(dryRun = false) {
+     const results = {
+       upgradedFiles: [],
+       changes: 0,
+     };
+
+     if (!fs.existsSync(this.workflowsDir)) {
+       return results;
+     }
+
+     const files = fs.readdirSync(this.workflowsDir).filter(f => f.endsWith('.yml') || f.endsWith('.yaml'));
+
+     for (const filename of files) {
+       const filePath = path.join(this.workflowsDir, filename);
+       let content = fs.readFileSync(filePath, 'utf8');
+       let originalContent = content;
+       let fileChanges = 0;
+
+       for (const [action, latest] of Object.entries(this.latestVersions)) {
+         const regex = new RegExp(`uses:\\s*${action}@([\\w\\.]+)`, 'g');
+         content = content.replace(regex, (match, version) => {
+           if (this._isOutdated(version, latest)) {
+             fileChanges++;
+             return `uses: ${action}@${latest}`;
+           }
+           return match;
+         });
+       }
+
+       if (fileChanges > 0) {
+         if (!dryRun) {
+           fs.writeFileSync(filePath, content, 'utf8');
+         }
+         results.upgradedFiles.push({
+           filename,
+           changes: fileChanges,
+         });
+         results.changes += fileChanges;
+       }
+     }
+
+     return results;
+   }
+
+   _isOutdated(current, latest) {
+     // Simple version comparison for vX formats
+     if (current === latest) return false;
+
+     const currNum = parseInt(current.replace('v', ''));
+     const lateNum = parseInt(latest.replace('v', ''));
+
+     if (!isNaN(currNum) && !isNaN(lateNum)) {
+       return currNum < lateNum;
+     }
+
+     return current !== latest; // Fallback for complex tags
+   }
+ }
+
+ module.exports = WorkflowAnalyzer;
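The rewrite at the heart of `upgrade()` can be exercised in isolation. Below is a condensed sketch of the same scan-and-bump pass over an in-memory workflow string — no file I/O, and the two pinned versions are illustrative rather than the package's full table:

```javascript
// Condensed sketch of WorkflowAnalyzer.upgrade's action-version bump:
// scan `uses: owner/action@ref` references and rewrite outdated vN tags.
const latestVersions = {
  'actions/checkout': 'v4',
  'actions/setup-node': 'v4',
};

function bumpActions(workflow) {
  let changes = 0;
  let content = workflow;
  for (const [action, latest] of Object.entries(latestVersions)) {
    const regex = new RegExp(`uses:\\s*${action}@([\\w\\.]+)`, 'g');
    content = content.replace(regex, (match, version) => {
      const curr = parseInt(version.replace('v', ''), 10);
      const late = parseInt(latest.replace('v', ''), 10);
      if (!isNaN(curr) && !isNaN(late) && curr < late) {
        changes++;
        return `uses: ${action}@${latest}`;
      }
      return match; // already current, or not a simple vN tag
    });
  }
  return { content, changes };
}

const { content, changes } = bumpActions(
  '      - uses: actions/checkout@v3\n      - uses: actions/setup-node@v4'
);
```

This sketch only bumps plain `vN` tags; the shipped `_isOutdated` additionally falls back to a string inequality for other refs.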
@@ -34,16 +34,30 @@ class ConfigLoader {
  const fullPath = path.join(this.projectPath, candidate);
  if (fs.existsSync(fullPath)) {
  try {
- // Clear require cache so hot-reloads work in watch mode
- delete require.cache[require.resolve(fullPath)];
- const cfg = require(fullPath);
+ // .js and .cjs configs load synchronously via require(), with the
+ // cache cleared so hot-reloads work in watch mode. .mjs configs
+ // generally need async import(), so require() may throw here; the
+ // catch below warns instead of crashing.
+ let cfg;
+ if (candidate.endsWith('.mjs')) {
+ cfg = require(fullPath);
+ } else {
+ delete require.cache[require.resolve(fullPath)];
+ cfg = require(fullPath);
+ }
+
  // Handle both `module.exports = {}` and `export default {}`
  const resolved = cfg && cfg.__esModule ? cfg.default : cfg;
  if (resolved && typeof resolved === 'object') {
  return resolved;
  }
  } catch (err) {
- console.warn(`[cistack] Warning: could not load ${candidate}: ${err.message}`);
+ // A broken config should warn, not crash — with a hint when the
+ // failure is an ESM config loaded through require().
+ console.warn(chalk.yellow(`[cistack] Warning: could not load ${candidate}: ${err.message}`));
+ if (err.code === 'ERR_REQUIRE_ESM') {
+ console.warn(chalk.dim(`  Tip: Rename ${candidate} to ${candidate.replace('.js', '.cjs')} or use CommonJS syntax.`));
+ }
  }
  }
  }
@@ -79,7 +96,7 @@ class ConfigLoader {
  * Apply config file overrides onto the full detected stack.
  *
  * @param {object} cfg - raw cistack.config.js export
- * @param {object} detected - { hosting, frameworks, languages, testing }
+ * @param {object} detected - { hosting, frameworks, languages, testing, ... }
  * @returns {object} - merged config ready for the generator
  */
  static applyToStack(cfg, detected) {
@@ -87,24 +104,25 @@ class ConfigLoader {
 
  const result = { ...detected };
 
- // Override primary language settings
+ // 1. Language overrides (Node version, package manager)
  if (cfg.nodeVersion && result.languages && result.languages.length > 0) {
  result.languages = result.languages.map((l, i) =>
  i === 0 && (l.name === 'JavaScript' || l.name === 'TypeScript')
- ? { ...l, nodeVersion: String(cfg.nodeVersion) }
+ ? { ...l, nodeVersion: String(cfg.nodeVersion), manual: true }
  : l
  );
  }
 
  if (cfg.packageManager && result.languages && result.languages.length > 0) {
  result.languages = result.languages.map((l, i) =>
- i === 0 ? { ...l, packageManager: cfg.packageManager } : l
+ i === 0 ? { ...l, packageManager: cfg.packageManager, manual: true } : l
  );
  }
 
- // Override hosting
- if (cfg.hosting && Array.isArray(cfg.hosting)) {
- result.hosting = cfg.hosting.map((name) => ({
+ // 2. Hosting overrides
+ if (cfg.hosting) {
+ const hostingNames = Array.isArray(cfg.hosting) ? cfg.hosting : [cfg.hosting];
+ result.hosting = hostingNames.map((name) => ({
  name,
  confidence: 1.0,
  manual: true,
@@ -113,8 +131,30 @@ class ConfigLoader {
  }));
  }
 
+ // 3. Framework overrides
+ if (cfg.frameworks) {
+ const frameworkNames = Array.isArray(cfg.frameworks) ? cfg.frameworks : [cfg.frameworks];
+ result.frameworks = frameworkNames.map(name => ({
+ name,
+ confidence: 1.0,
+ manual: true
+ }));
+ }
+
+ // 4. Testing overrides
+ if (cfg.testing) {
+ const testNames = Array.isArray(cfg.testing) ? cfg.testing : [cfg.testing];
+ result.testing = testNames.map(name => ({
+ name,
+ confidence: 1.0,
+ manual: true,
+ type: 'unit', // default
+ command: `npm run test` // fallback
+ }));
+ }
+
  // Pass through raw extras for generators to consume
- result._config = cfg;
+ result._config = { ...(result._config || {}), ...cfg };
 
  return result;
  }
@@ -64,29 +64,46 @@ class FrameworkDetector {
  // ── generic JS/TS checker ─────────────────────────────────────────────────
  _check(name, depKeys, configFiles, meta = {}) {
  let confidence = 0;
+ const reasons = [];
 
  for (const dep of depKeys) {
- if (this.deps[dep]) { confidence += 0.5; break; }
+ if (this.deps[dep]) {
+ confidence += 0.5;
+ reasons.push(`dependency: ${dep}`);
+ break;
+ }
  }
  for (const cfg of configFiles) {
- if (this.configs.has(cfg) || this.files.has(cfg)) { confidence += 0.4; break; }
+ if (this.configs.has(cfg) || this.files.has(cfg)) {
+ confidence += 0.4;
+ reasons.push(`config file: ${cfg}`);
+ break;
+ }
  }
 
- return { name, confidence: Math.min(confidence, 1), ...meta };
+ return { name, confidence: Math.min(confidence, 1), reasons, ...meta };
  }
 
  _checkPython(name, pkg, markerFile) {
  let confidence = 0;
+ const reasons = [];
  const reqFiles = ['requirements.txt', 'Pipfile', 'pyproject.toml'];
  for (const rf of reqFiles) {
  const fullPath = path.join(this.root, rf);
  if (fs.existsSync(fullPath)) {
  const content = fs.readFileSync(fullPath, 'utf8').toLowerCase();
- if (content.includes(pkg.toLowerCase())) { confidence += 0.7; break; }
+ if (content.includes(pkg.toLowerCase())) {
+ confidence += 0.7;
+ reasons.push(`found ${pkg} in ${rf}`);
+ break;
+ }
  }
  }
- if (markerFile && this.files.has(markerFile)) confidence += 0.2;
- return confidence > 0 ? { name, confidence: Math.min(confidence, 1), isServer: true, isPython: true } : null;
+ if (markerFile && this.files.has(markerFile)) {
+ confidence += 0.2;
+ reasons.push(`found marker file ${markerFile}`);
+ }
+ return confidence > 0 ? { name, confidence: Math.min(confidence, 1), isServer: true, isPython: true, reasons } : null;
  }
 
  _checkRuby(name, gem) {
@@ -94,20 +111,27 @@ class FrameworkDetector {
  if (!fs.existsSync(gemfilePath)) return null;
  const content = fs.readFileSync(gemfilePath, 'utf8').toLowerCase();
  const confidence = content.includes(gem.toLowerCase()) ? 0.9 : 0;
- return confidence > 0 ? { name, confidence, isServer: true, isRuby: true } : null;
+ const reasons = confidence > 0 ? [`found ${gem} in Gemfile`] : [];
+ return confidence > 0 ? { name, confidence, isServer: true, isRuby: true, reasons } : null;
  }
 
  _checkJVM(name, keyword) {
  const gradlePath = path.join(this.root, 'build.gradle');
  const pomPath = path.join(this.root, 'pom.xml');
  let confidence = 0;
+ let foundIn = '';
  for (const p of [gradlePath, pomPath]) {
  if (fs.existsSync(p)) {
  const content = fs.readFileSync(p, 'utf8').toLowerCase();
- if (content.includes(keyword.toLowerCase())) { confidence = 0.9; break; }
+ if (content.includes(keyword.toLowerCase())) {
+ confidence = 0.9;
+ foundIn = path.basename(p);
+ break;
+ }
  }
  }
- return confidence > 0 ? { name, confidence, isServer: true, isJVM: true } : null;
+ const reasons = confidence > 0 ? [`found ${keyword} in ${foundIn}`] : [];
+ return confidence > 0 ? { name, confidence, isServer: true, isJVM: true, reasons } : null;
  }
 
  _checkComposer(name, pkg) {
@@ -117,20 +141,21 @@ class FrameworkDetector {
  const composer = JSON.parse(fs.readFileSync(composerPath, 'utf8'));
  const allDeps = { ...(composer.require || {}), ...(composer['require-dev'] || {}) };
  const confidence = allDeps[pkg] ? 0.9 : 0;
- return confidence > 0 ? { name, confidence, isServer: true, isPHP: true } : null;
+ const reasons = confidence > 0 ? [`found ${pkg} in composer.json`] : [];
+ return confidence > 0 ? { name, confidence, isServer: true, isPHP: true, reasons } : null;
  } catch (_) { return null; }
  }
 
  _checkGo(name) {
  const goMod = path.join(this.root, 'go.mod');
  if (!fs.existsSync(goMod)) return null;
- return { name, confidence: 0.9, isServer: true, isGo: true };
+ return { name, confidence: 0.9, isServer: true, isGo: true, reasons: ['go.mod found'] };
  }
 
  _checkRust(name) {
  const cargoToml = path.join(this.root, 'Cargo.toml');
  if (!fs.existsSync(cargoToml)) return null;
- return { name, confidence: 0.9, isServer: true, isRust: true };
+ return { name, confidence: 0.9, isServer: true, isRust: true, reasons: ['Cargo.toml found'] };
  }
  }
 
@@ -60,13 +60,13 @@ class HostingDetector {
 
  _checkFirebase() {
  let confidence = 0;
- const notes = [];
+ const reasons = [];
 
- if (this.configs.has('firebase.json')) { confidence += 0.6; notes.push('firebase.json found'); }
- if (this.configs.has('.firebaserc')) { confidence += 0.3; notes.push('.firebaserc found'); }
- if (this.deps['firebase-tools'] || this.deps['firebase']) { confidence += 0.2; notes.push('firebase dep'); }
- if (Object.values(this.scripts).some((s) => s.includes('firebase deploy'))) { confidence += 0.3; notes.push('deploy script'); }
- if (this.info.srcStructure.hasFunctions) { confidence += 0.1; }
+ if (this.configs.has('firebase.json')) { confidence += 0.6; reasons.push('firebase.json found'); }
+ if (this.configs.has('.firebaserc')) { confidence += 0.3; reasons.push('.firebaserc found'); }
+ if (this.deps['firebase-tools'] || this.deps['firebase']) { confidence += 0.2; reasons.push('firebase dependency found'); }
+ if (Object.values(this.scripts).some((s) => s.includes('firebase deploy'))) { confidence += 0.3; reasons.push('firebase deploy script found'); }
+ if (this.info.srcStructure.hasFunctions) { confidence += 0.1; reasons.push('functions directory found'); }
 
  // Detect what Firebase services are used
  let deployTarget = 'hosting';
@@ -85,38 +85,38 @@ class HostingDetector {
  confidence: Math.min(confidence, 1),
  deployCommand: `firebase deploy --only ${deployTarget}`,
  secrets: ['FIREBASE_TOKEN'],
- notes,
+ reasons,
  buildStep: this._detectBuildScript(),
  };
  }
 
  _checkVercel() {
  let confidence = 0;
- const notes = [];
+ const reasons = [];
 
- if (this.configs.has('vercel.json')) { confidence += 0.7; notes.push('vercel.json found'); }
- if (this.configs.has('.vercel')) { confidence += 0.4; notes.push('.vercel dir found'); }
- if (this.deps['vercel']) { confidence += 0.3; notes.push('vercel dep'); }
- if (Object.values(this.scripts).some((s) => s.includes('vercel'))) { confidence += 0.3; notes.push('vercel script'); }
+ if (this.configs.has('vercel.json')) { confidence += 0.7; reasons.push('vercel.json found'); }
+ if (this.configs.has('.vercel')) { confidence += 0.4; reasons.push('.vercel directory found'); }
+ if (this.deps['vercel']) { confidence += 0.3; reasons.push('vercel dependency found'); }
+ if (Object.values(this.scripts).some((s) => s.includes('vercel'))) { confidence += 0.3; reasons.push('vercel script found'); }
 
  return {
  name: 'Vercel',
  confidence: Math.min(confidence, 1),
  deployCommand: 'vercel --prod --token $VERCEL_TOKEN',
  secrets: ['VERCEL_TOKEN', 'VERCEL_ORG_ID', 'VERCEL_PROJECT_ID'],
- notes,
+ reasons,
  buildStep: this._detectBuildScript(),
  };
  }
 
  _checkNetlify() {
  let confidence = 0;
- const notes = [];
+ const reasons = [];
 
- if (this.configs.has('netlify.toml')) { confidence += 0.7; notes.push('netlify.toml found'); }
- if (this.configs.has('_redirects')) { confidence += 0.2; notes.push('_redirects found'); }
- if (this.deps['netlify-cli'] || this.deps['netlify']) { confidence += 0.3; notes.push('netlify dep'); }
- if (Object.values(this.scripts).some((s) => s.includes('netlify'))) { confidence += 0.3; notes.push('netlify script'); }
+ if (this.configs.has('netlify.toml')) { confidence += 0.7; reasons.push('netlify.toml found'); }
+ if (this.configs.has('_redirects')) { confidence += 0.2; reasons.push('_redirects file found'); }
+ if (this.deps['netlify-cli'] || this.deps['netlify']) { confidence += 0.3; reasons.push('netlify dependency found'); }
+ if (Object.values(this.scripts).some((s) => s.includes('netlify'))) { confidence += 0.3; reasons.push('netlify script found'); }
 
  let publishDir = 'dist';
  try {
@@ -130,7 +130,7 @@ class HostingDetector {
  confidence: Math.min(confidence, 1),
  deployCommand: `netlify deploy --prod --dir=${publishDir}`,
  secrets: ['NETLIFY_AUTH_TOKEN', 'NETLIFY_SITE_ID'],
- notes,
+ reasons,
  publishDir,
  buildStep: this._detectBuildScript(),
  };
@@ -138,110 +138,118 @@ class HostingDetector {
  _checkRender() {
  let confidence = 0;
- if (this.configs.has('render.yaml')) { confidence += 0.8; }
+ const reasons = [];
+ if (this.configs.has('render.yaml')) { confidence += 0.8; reasons.push('render.yaml detected'); }
  return {
  name: 'Render',
  confidence,
  deployCommand: 'curl -X POST $RENDER_DEPLOY_HOOK_URL',
  secrets: ['RENDER_DEPLOY_HOOK_URL'],
- notes: ['render.yaml detected'],
+ reasons,
  };
  }
 
  _checkRailway() {
  let confidence = 0;
- if (this.configs.has('railway.json') || this.configs.has('railway.toml')) confidence += 0.8;
- if (this.deps['@railway/cli']) confidence += 0.2;
+ const reasons = [];
+ if (this.configs.has('railway.json') || this.configs.has('railway.toml')) { confidence += 0.8; reasons.push('railway config found'); }
+ if (this.deps['@railway/cli']) { confidence += 0.2; reasons.push('railway cli dependency found'); }
  return {
  name: 'Railway',
  confidence,
  deployCommand: 'railway up',
  secrets: ['RAILWAY_TOKEN'],
- notes: [],
+ reasons,
  };
  }
 
  _checkHeroku() {
  let confidence = 0;
- if (this.configs.has('Procfile')) { confidence += 0.5; }
- if (this.configs.has('heroku.yml')) { confidence += 0.5; }
- if (this.deps['heroku']) { confidence += 0.2; }
+ const reasons = [];
+ if (this.configs.has('Procfile')) { confidence += 0.5; reasons.push('Procfile found'); }
+ if (this.configs.has('heroku.yml')) { confidence += 0.5; reasons.push('heroku.yml found'); }
+ if (this.deps['heroku']) { confidence += 0.2; reasons.push('heroku dependency found'); }
  return {
  name: 'Heroku',
  confidence,
  deployCommand: 'git push heroku main',
  secrets: ['HEROKU_API_KEY', 'HEROKU_APP_NAME'],
- notes: [],
+ reasons,
  };
  }
 
  _checkGCPAppEngine() {
  let confidence = 0;
- if (this.configs.has('app.yaml')) { confidence += 0.7; }
- if (this.deps['@google-cloud/functions-framework']) confidence += 0.2;
+ const reasons = [];
+ if (this.configs.has('app.yaml')) { confidence += 0.7; reasons.push('app.yaml detected'); }
+ if (this.deps['@google-cloud/functions-framework']) { confidence += 0.2; reasons.push('gcp functions framework found'); }
  return {
  name: 'GCP App Engine',
  confidence,
  deployCommand: 'gcloud app deploy',
  secrets: ['GCP_PROJECT_ID', 'GCP_SA_KEY'],
- notes: [],
+ reasons,
  };
  }
 
  _checkAWS() {
  let confidence = 0;
- if (this.configs.has('appspec.yml')) confidence += 0.5;
- if (this.configs.has('serverless.yml') || this.configs.has('serverless.yaml')) confidence += 0.6;
- if (this.configs.has('cdk.json')) confidence += 0.4;
- if (this.deps['aws-sdk'] || this.deps['@aws-sdk/client-s3']) confidence += 0.15;
+ const reasons = [];
+ if (this.configs.has('appspec.yml')) { confidence += 0.5; reasons.push('appspec.yml found'); }
+ if (this.configs.has('serverless.yml') || this.configs.has('serverless.yaml')) { confidence += 0.6; reasons.push('serverless.yml found'); }
+ if (this.configs.has('cdk.json')) { confidence += 0.4; reasons.push('cdk.json found'); }
+ if (this.deps['aws-sdk'] || this.deps['@aws-sdk/client-s3']) { confidence += 0.15; reasons.push('aws-sdk found'); }
  return {
  name: 'AWS',
  confidence: Math.min(confidence, 1),
  deployCommand: 'aws s3 sync ./dist s3://$AWS_S3_BUCKET --delete',
  secrets: ['AWS_ACCESS_KEY_ID', 'AWS_SECRET_ACCESS_KEY', 'AWS_REGION'],
- notes: [],
+ reasons,
  };
  }
 
  _checkAzure() {
  let confidence = 0;
- if (this.files.has('.azure/pipelines.yml')) confidence += 0.5;
- if (this.deps['@azure/core-http']) confidence += 0.2;
+ const reasons = [];
+ if (this.files.has('.azure/pipelines.yml')) { confidence += 0.5; reasons.push('.azure/pipelines.yml found'); }
+ if (this.deps['@azure/core-http']) { confidence += 0.2; reasons.push('azure core-http found'); }
  return {
  name: 'Azure',
  confidence,
  deployCommand: 'az webapp up',
  secrets: ['AZURE_CREDENTIALS'],
- notes: [],
+ reasons,
  };
  }
 
  _checkGitHubPages() {
  let confidence = 0;
+ const reasons = [];
  const pkgHomepage = this.pkg.homepage || '';
- if (pkgHomepage.includes('github.io')) { confidence += 0.6; }
- if (this.deps['gh-pages']) { confidence += 0.4; }
- if (Object.values(this.scripts).some((s) => s.includes('gh-pages'))) confidence += 0.3;
+ if (pkgHomepage.includes('github.io')) { confidence += 0.6; reasons.push('homepage contains github.io'); }
+ if (this.deps['gh-pages']) { confidence += 0.4; reasons.push('gh-pages dependency found'); }
+ if (Object.values(this.scripts).some((s) => s.includes('gh-pages'))) { confidence += 0.3; reasons.push('gh-pages script found'); }
  return {
  name: 'GitHub Pages',
  confidence: Math.min(confidence, 1),
  deployCommand: null, // handled by actions/deploy-pages
  secrets: [],
- notes: [],
+ reasons,
  buildStep: this._detectBuildScript(),
  };
  }
 
  _checkDocker() {
  let confidence = 0;
- if (this.configs.has('Dockerfile')) confidence += 0.5;
- if (this.configs.has('docker-compose.yml') || this.configs.has('docker-compose.yaml')) confidence += 0.3;
+ const reasons = [];
+ if (this.configs.has('Dockerfile')) { confidence += 0.5; reasons.push('Dockerfile found'); }
+ if (this.configs.has('docker-compose.yml') || this.configs.has('docker-compose.yaml')) { confidence += 0.3; reasons.push('docker-compose.yml found'); }
  return {
  name: 'Docker',
  confidence,
  deployCommand: 'docker push $DOCKER_IMAGE',
  secrets: ['DOCKER_USERNAME', 'DOCKER_PASSWORD'],
- notes: [],
+ reasons,
  };
  }
 
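Every detector in this hunk follows the same accumulate-confidence-and-reasons pattern. A minimal standalone sketch of that pattern (a plain `Set` stands in for cistack's real config scanner; this is not the package's actual class):

```javascript
// Standalone sketch of the confidence/reasons pattern shared by the
// _checkX() detectors above. A plain Set stands in for this.configs.
function checkDocker(configs) {
  let confidence = 0;
  const reasons = [];
  if (configs.has('Dockerfile')) { confidence += 0.5; reasons.push('Dockerfile found'); }
  if (configs.has('docker-compose.yml')) { confidence += 0.3; reasons.push('docker-compose.yml found'); }
  return { name: 'Docker', confidence: Math.min(confidence, 1), reasons };
}

const result = checkDocker(new Set(['Dockerfile']));
console.log(result.confidence, result.reasons); // 0.5 [ 'Dockerfile found' ]
```

The `reasons` array replaces the unused `notes` field and is what an explain-style summary can print next to each detected item.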
@@ -277,7 +277,6 @@ class WorkflowGenerator {
   _buildDeployWorkflow() {
     const h = this.primaryHosting;
     const lang = this.primaryLang;
-
     const branches = this.extraConfig.branches || ['main', 'master'];
 
     const preDeploySteps = [
@@ -286,17 +285,30 @@ class WorkflowGenerator {
       this._stepInstallDeps(lang),
     ].filter(Boolean);
 
-    const deploySteps = this._hostingDeploySteps(h, lang);
+    const deploySteps = this._hostingDeploySteps(h, lang, false); // production
+    const previewSteps = this._hostingDeploySteps(h, lang, true); // preview
 
     const jobs = {
       deploy: {
-        name: `šŸš€ Deploy → ${h.name}`,
+        name: `šŸš€ Deploy → ${h.name} (Production)`,
+        if: "github.event_name == 'push' || github.event_name == 'workflow_dispatch'",
         'runs-on': 'ubuntu-latest',
         environment: 'production',
         steps: [...preDeploySteps, ...deploySteps].filter(Boolean),
       },
     };
 
+    // Add preview job if supported
+    if (previewSteps.length > 0) {
+      jobs.preview = {
+        name: `✨ Deploy → ${h.name} (Preview)`,
+        if: "github.event_name == 'pull_request'",
+        'runs-on': 'ubuntu-latest',
+        environment: 'preview',
+        steps: [...preDeploySteps, ...previewSteps].filter(Boolean),
+      };
+    }
+
     const allSecrets = [
       ...(h.secrets || []),
       ...this.envVars.secrets,
@@ -313,6 +325,7 @@ class WorkflowGenerator {
       name: `Deploy to ${h.name}`,
       on: {
         push: { branches: branches.filter((b) => b !== 'develop') },
+        pull_request: { branches },
         workflow_dispatch: {},
       },
       jobs,
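Taken together, these hunks make the generator emit two jobs gated by event name: production deploys run on `push`/`workflow_dispatch`, previews on `pull_request`. For a Vercel project the assembled object would look roughly like this (illustrative values, simplified job bodies; not literal cistack output):

```javascript
// Illustrative shape of the workflow object _buildDeployWorkflow() assembles
// for a Vercel project; branch names and job bodies are simplified.
const workflow = {
  name: 'Deploy to Vercel',
  on: {
    push: { branches: ['main', 'master'] },
    pull_request: { branches: ['main', 'master'] },
    workflow_dispatch: {},
  },
  jobs: {
    deploy: {
      if: "github.event_name == 'push' || github.event_name == 'workflow_dispatch'",
      environment: 'production',
    },
    preview: {
      if: "github.event_name == 'pull_request'",
      environment: 'preview',
    },
  },
};

console.log(Object.keys(workflow.jobs)); // [ 'deploy', 'preview' ]
```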
@@ -499,8 +512,8 @@ class WorkflowGenerator {
       uses: 'actions/setup-node@v4',
       with: {
         'node-version': lang.nodeVersion || '20',
-        // setup-node handles npm/yarn/pnpm caching natively
-        cache: lang.packageManager === 'yarn' ? 'yarn' : lang.packageManager === 'pnpm' ? 'pnpm' : 'npm',
+        // Use native caching in setup-node
+        cache: cacheOverride.npm !== false ? (lang.packageManager === 'yarn' ? 'yarn' : lang.packageManager === 'pnpm' ? 'pnpm' : 'npm') : undefined,
       },
     });
   }
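The ternaries in these setup steps can leave `cache: undefined` inside a step's `with` block; presumably the serializer drops such keys before writing YAML. A sketch of that pruning (helper name hypothetical, not cistack's API):

```javascript
// Hypothetical helper: strip `undefined` values (e.g. a disabled cache key)
// from a step's `with` object before YAML serialization.
function pruneUndefined(obj) {
  return Object.fromEntries(
    Object.entries(obj).filter(([, value]) => value !== undefined)
  );
}

const withBlock = pruneUndefined({ 'node-version': '20', cache: undefined });
console.log(withBlock); // { 'node-version': '20' }
```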
@@ -510,35 +523,12 @@ class WorkflowGenerator {
     steps.push({
       name: 'Set up Python',
       uses: 'actions/setup-python@v5',
-      with: { 'python-version': '3.x' },
+      with: {
+        'python-version': '3.x',
+        // Native caching for pip/poetry
+        cache: cacheOverride.pip !== false ? (lang.packageManager === 'poetry' ? 'poetry' : 'pip') : undefined,
+      },
     });
-
-    if (cacheOverride.pip !== false) {
-      if (lang.packageManager === 'poetry') {
-        steps.push({
-          name: 'Cache Poetry virtualenv',
-          uses: 'actions/cache@v4',
-          with: {
-            path: [
-              '~/.cache/pypoetry',
-              '~/.local/share/pypoetry',
-            ].join('\n'),
-            key: "${{ runner.os }}-poetry-${{ hashFiles('**/poetry.lock') }}",
-            'restore-keys': '${{ runner.os }}-poetry-',
-          },
-        });
-      } else {
-        steps.push({
-          name: 'Cache pip',
-          uses: 'actions/cache@v4',
-          with: {
-            path: '~/.cache/pip',
-            key: "${{ runner.os }}-pip-${{ hashFiles('**/requirements*.txt') }}",
-            'restore-keys': '${{ runner.os }}-pip-',
-          },
-        });
-      }
-    }
   }
 
   // ── Go ───────────────────────────────────────────────────────────────
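The `cacheOverride.*` flags consulted in these steps would plausibly come from user configuration. A hypothetical `cistack.config.js` fragment (key names inferred from the `cacheOverride` checks in this diff; the real schema may differ):

```javascript
// Hypothetical cistack.config.js: disable pip caching, keep npm caching.
// Key names mirror the cacheOverride.* checks; the real schema may differ.
// In an actual config file this object would be assigned to module.exports.
const config = {
  cache: {
    pip: false,
    npm: true,
  },
};

console.log(config.cache); // { pip: false, npm: true }
```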
@@ -546,20 +536,11 @@ class WorkflowGenerator {
     steps.push({
       name: 'Set up Go',
       uses: 'actions/setup-go@v5',
-      with: { 'go-version': 'stable', cache: true },
+      with: {
+        'go-version': 'stable',
+        cache: cacheOverride.go !== false,
+      },
     });
-    // setup-go has built-in module cache; add explicit one for Go pkg mod
-    if (cacheOverride.go !== false) {
-      steps.push({
-        name: 'Cache Go modules',
-        uses: 'actions/cache@v4',
-        with: {
-          path: '~/go/pkg/mod',
-          key: "${{ runner.os }}-go-${{ hashFiles('**/go.sum') }}",
-          'restore-keys': '${{ runner.os }}-go-',
-        },
-      });
-    }
   }
 
   // ── Java / Kotlin ─────────────────────────────────────────────────────
@@ -567,35 +548,13 @@ class WorkflowGenerator {
     steps.push({
       name: 'Set up JDK',
       uses: 'actions/setup-java@v4',
-      with: { 'java-version': '21', distribution: 'temurin' },
+      with: {
+        'java-version': '21',
+        distribution: 'temurin',
+        // Native caching for maven/gradle, honoring the per-manager override keys
+        cache: lang.packageManager === 'gradle'
+          ? (cacheOverride.gradle !== false ? 'gradle' : undefined)
+          : (cacheOverride.maven !== false ? 'maven' : undefined),
+      },
     });
-
-    if (lang.packageManager === 'maven' && cacheOverride.maven !== false) {
-      steps.push({
-        name: 'Cache Maven repository',
-        uses: 'actions/cache@v4',
-        with: {
-          path: '~/.m2',
-          key: "${{ runner.os }}-m2-${{ hashFiles('**/pom.xml') }}",
-          'restore-keys': '${{ runner.os }}-m2-',
-        },
-      });
-    }
-
-    if (lang.packageManager === 'gradle' && cacheOverride.gradle !== false) {
-      steps.push({
-        name: 'Cache Gradle packages',
-        uses: 'actions/cache@v4',
-        with: {
-          path: [
-            '~/.gradle/caches',
-            '~/.gradle/wrapper',
-          ].join('\n'),
-          key: "${{ runner.os }}-gradle-${{ hashFiles('**/*.gradle*', '**/gradle-wrapper.properties') }}",
-          'restore-keys': '${{ runner.os }}-gradle-',
-        },
-      });
-    }
   }
 
   // ── Ruby ─────────────────────────────────────────────────────────────
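Note that the old code gated Maven caching on `cacheOverride.maven` and Gradle caching on a separate `cacheOverride.gradle` key; the JDK cache selection should keep those two keys distinct. Extracted as a standalone function (a sketch, not cistack's actual helper):

```javascript
// Standalone sketch of per-package-manager cache selection for the JDK step,
// honoring the separate cacheOverride.gradle / cacheOverride.maven keys.
function javaCache(packageManager, cacheOverride = {}) {
  if (packageManager === 'gradle') {
    return cacheOverride.gradle !== false ? 'gradle' : undefined;
  }
  return cacheOverride.maven !== false ? 'maven' : undefined;
}

console.log(javaCache('gradle'));                  // gradle
console.log(javaCache('maven', { maven: false })); // undefined
```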
@@ -603,9 +562,8 @@ class WorkflowGenerator {
     steps.push({
       name: 'Set up Ruby',
       uses: 'ruby/setup-ruby@v1',
-      with: { 'bundler-cache': true },
+      with: { 'bundler-cache': cacheOverride.bundler !== false },
     });
-    // setup-ruby already handles bundler cache via bundler-cache: true
   }
 
   // ── Rust ─────────────────────────────────────────────────────────────
@@ -754,7 +712,7 @@ class WorkflowGenerator {
   // Hosting-specific deploy steps
   // ══════════════════════════════════════════════════════════════════════════
 
-  _hostingDeploySteps(h, lang) {
+  _hostingDeploySteps(h, lang, isPreview = false) {
     const steps = [];
     const buildScript = this._findScript(['build', 'build:prod']);
     const pm = lang.packageManager || 'npm';
@@ -766,23 +724,24 @@ class WorkflowGenerator {
           steps.push({ name: 'Build', run: runCmd(buildScript), env: { NODE_ENV: 'production' } });
         }
         steps.push({
-          name: 'Deploy to Firebase',
+          name: isPreview ? 'Deploy Preview' : 'Deploy to Firebase',
           uses: 'FirebaseExtended/action-hosting-deploy@v0',
           with: {
             repoToken: '${{ secrets.GITHUB_TOKEN }}',
             firebaseServiceAccount: '${{ secrets.FIREBASE_SERVICE_ACCOUNT }}',
-            channelId: 'live',
+            channelId: isPreview ? 'preview-${{ github.event.number }}' : 'live',
           },
         });
         break;
       }
 
       case 'Vercel': {
+        const prodFlag = isPreview ? '' : '--prod';
         steps.push(
           { name: 'Install Vercel CLI', run: 'npm install -g vercel' },
-          { name: 'Pull Vercel environment', run: 'vercel pull --yes --environment=production --token=${{ secrets.VERCEL_TOKEN }}' },
-          { name: 'Build project', run: 'vercel build --prod --token=${{ secrets.VERCEL_TOKEN }}' },
-          { name: 'Deploy to Vercel', run: 'vercel deploy --prebuilt --prod --token=${{ secrets.VERCEL_TOKEN }}' },
+          { name: 'Pull Vercel environment', run: `vercel pull --yes --environment=${isPreview ? 'preview' : 'production'} --token=\${{ secrets.VERCEL_TOKEN }}` },
+          { name: 'Build project', run: `vercel build ${prodFlag} --token=\${{ secrets.VERCEL_TOKEN }}` },
+          { name: 'Deploy to Vercel', run: `vercel deploy --prebuilt ${prodFlag} --token=\${{ secrets.VERCEL_TOKEN }}` },
         );
         break;
       }
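When `isPreview` is true, the empty `prodFlag` above leaves a doubled space in the interpolated run string; the shell ignores it, but the command can be normalized. A standalone sketch (function name hypothetical; the `${{ secrets.VERCEL_TOKEN }}` expression is replaced by a plain env var for brevity):

```javascript
// Build the Vercel deploy command for preview vs production runs, collapsing
// the doubled space an empty flag leaves behind.
function vercelDeployCmd(isPreview) {
  const prodFlag = isPreview ? '' : '--prod';
  return `vercel deploy --prebuilt ${prodFlag} --token=$VERCEL_TOKEN`
    .replace(/\s+/g, ' ')
    .trim();
}

console.log(vercelDeployCmd(false)); // vercel deploy --prebuilt --prod --token=$VERCEL_TOKEN
console.log(vercelDeployCmd(true));  // vercel deploy --prebuilt --token=$VERCEL_TOKEN
```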
@@ -792,15 +751,17 @@ class WorkflowGenerator {
           steps.push({ name: 'Build', run: runCmd(buildScript), env: { NODE_ENV: 'production' } });
         }
         steps.push({
-          name: 'Deploy to Netlify',
+          name: isPreview ? 'Deploy Preview' : 'Deploy to Netlify',
           uses: 'nwtgck/actions-netlify@v3.0',
           with: {
             'publish-dir': h.publishDir || 'dist',
             'production-branch': 'main',
             'github-token': '${{ secrets.GITHUB_TOKEN }}',
-            'deploy-message': 'Deploy from GitHub Actions – ${{ github.sha }}',
+            'deploy-message': isPreview ? 'Preview Deploy – ${{ github.event.number }}' : 'Production Deploy – ${{ github.sha }}',
             'enable-pull-request-comment': true,
             'enable-commit-comment': true,
+            'production-deploy': !isPreview,
+            alias: isPreview ? 'preview-${{ github.event.number }}' : undefined,
           },
           env: {
             NETLIFY_AUTH_TOKEN: '${{ secrets.NETLIFY_AUTH_TOKEN }}',
package/src/index.js CHANGED
@@ -20,6 +20,8 @@ const ReleaseGenerator = require('./generators/release');
 const ConfigLoader = require('./config/loader');
 const { ensureDir, writeFile, banner, smartMergeWorkflow } = require('./utils/helpers');
 
+const WorkflowAnalyzer = require('./analyzers/workflow');
+
 class CIFlow {
   constructor(options) {
     this.options = options;
@@ -29,6 +31,7 @@ class CIFlow {
     this.force = options.force || false;
     this.prompt = options.prompt !== false;
     this.verbose = options.verbose || false;
+    this.explain = options.explain || false;
   }
 
   async run() {
@@ -129,19 +132,100 @@ class CIFlow {
     }
   }
 
+  async audit() {
+    banner();
+    const spinner = ora({ text: 'Auditing existing workflows...', color: 'cyan' }).start();
+
+    try {
+      const analyzer = new WorkflowAnalyzer(this.projectPath);
+      const results = await analyzer.audit();
+      spinner.succeed(chalk.green('Audit complete'));
+
+      if (results.files.length === 0) {
+        console.log(chalk.yellow('\nNo workflow files found to audit.'));
+        return;
+      }
+
+      console.log('\n' + chalk.bold('šŸ” Workflow Audit Results'));
+      console.log(chalk.dim('─'.repeat(48)));
+
+      for (const file of results.files) {
+        if (file.error) {
+          console.log(`\nšŸ“„ ${chalk.red(file.filename)} – ${chalk.red(file.error)}`);
+          continue;
+        }
+
+        console.log(`\nšŸ“„ ${chalk.cyan(file.filename)} – ${file.issues.length > 0 ? chalk.yellow(file.issues.length + ' issue(s) found') : chalk.green('Excellent')}`);
+
+        for (const issue of file.issues) {
+          const color = issue.severity === 'high' ? chalk.red : issue.severity === 'medium' ? chalk.yellow : chalk.dim;
+          console.log(`  ${color('•')} ${issue.message}`);
+          console.log(`    ${chalk.dim('Fix:')} ${chalk.italic(issue.fix)}`);
+        }
+      }
+
+      if (results.totalIssues > 0) {
+        console.log('\n' + chalk.yellow(`šŸ’” Run ${chalk.bold('cistack upgrade')} to automatically fix outdated actions.`));
+      } else {
+        console.log('\n' + chalk.green('āœ… Your workflows are up to date and follow best practices.'));
+      }
+      console.log('');
+    } catch (err) {
+      spinner.fail(chalk.red('Audit failed: ' + err.message));
+      process.exit(1);
+    }
+  }
+
+  async upgrade() {
+    banner();
+    const spinner = ora({ text: 'Upgrading actions...', color: 'cyan' }).start();
+
+    try {
+      const analyzer = new WorkflowAnalyzer(this.projectPath);
+      const results = await analyzer.upgrade(this.dryRun);
+
+      if (results.changes === 0) {
+        spinner.succeed(chalk.green('All actions are already up to date.'));
+        return;
+      }
+
+      spinner.succeed(chalk.green(`Upgraded ${results.changes} action(s) across ${results.upgradedFiles.length} file(s)`));
+
+      if (this.dryRun) {
+        console.log(chalk.yellow('\n── DRY RUN – files not modified ──'));
+      }
+
+      for (const file of results.upgradedFiles) {
+        console.log(`  ${chalk.green('āœ”')} ${file.filename} (${file.changes} changes)`);
+      }
+      console.log('');
+    } catch (err) {
+      spinner.fail(chalk.red('Upgrade failed: ' + err.message));
+      process.exit(1);
+    }
+  }
+
   // ── helpers ──────────────────────────────────────────────────────────────
 
-  _printSummary({ hosting, frameworks, languages, testing }, releaseInfo, envVars, monorepoPackages) {
-    const line = (label, value) =>
+  _printSummary(config, releaseInfo, envVars, monorepoPackages) {
+    const { hosting, frameworks, languages, testing } = config;
+    const line = (label, value, reasons = []) => {
       console.log(`  ${chalk.dim(label.padEnd(20))} ${chalk.cyan(value || chalk.italic('none detected'))}`);
+      if (this.explain && reasons && reasons.length > 0) {
+        for (const reason of reasons) {
+          console.log(`    ${chalk.dim('↳')} ${chalk.italic.gray(reason)}`);
+        }
+      }
+    };
 
     console.log('\n' + chalk.bold(' šŸ“Š Detected Stack'));
     console.log(chalk.dim(' ' + '─'.repeat(48)));
-    line('Languages:', languages.map((l) => l.name).join(', '));
-    line('Frameworks:', frameworks.map((f) => f.name).join(', '));
-    line('Hosting:', hosting.map((h) => h.name).join(', ') || 'none');
-    line('Testing:', testing.map((t) => t.name).join(', ') || 'none');
-    line('Release tool:', releaseInfo ? releaseInfo.tool : 'none');
+
+    line('Languages:', languages.map((l) => l.name).join(', '), languages[0] && languages[0].reasons);
+    line('Frameworks:', frameworks.map((f) => f.name).join(', '), frameworks[0] && frameworks[0].reasons);
+    line('Hosting:', hosting.map((h) => h.name).join(', ') || 'none', hosting[0] && hosting[0].reasons);
+    line('Testing:', testing.map((t) => t.name).join(', ') || 'none', testing[0] && testing[0].reasons);
+    line('Release tool:', releaseInfo ? releaseInfo.tool : 'none', releaseInfo && releaseInfo.reasons);
 
     if (monorepoPackages.length > 0) {
       line('Monorepo pkgs:', monorepoPackages.map((p) => p.name).join(', '));
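The `audit()` loop above implies a results shape like the following. Only the field names are taken from the code in this diff; the filenames, messages, fixes, and counts are invented for illustration:

```javascript
// Illustrative WorkflowAnalyzer.audit() result; field names come from the
// audit() loop above, values are made up.
const results = {
  totalIssues: 2,
  files: [
    {
      filename: 'ci.yml',
      issues: [
        { severity: 'high', message: 'actions/checkout@v2 is outdated', fix: 'pin actions/checkout@v4' },
        { severity: 'low', message: 'no timeout-minutes set', fix: 'add timeout-minutes to long-running jobs' },
      ],
    },
  ],
};

console.log(results.files[0].issues.map((i) => i.severity)); // [ 'high', 'low' ]
```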