ai-flow-dev 2.6.0 → 2.8.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +24 -21
- package/package.json +6 -6
- package/prompts/backend/flow-check-review.md +648 -12
- package/prompts/backend/flow-check-test.md +520 -8
- package/prompts/backend/flow-check.md +687 -29
- package/prompts/backend/flow-commit.md +18 -49
- package/prompts/backend/flow-finish.md +919 -0
- package/prompts/backend/flow-release.md +949 -0
- package/prompts/backend/flow-work.md +296 -221
- package/prompts/desktop/flow-check-review.md +648 -12
- package/prompts/desktop/flow-check-test.md +520 -8
- package/prompts/desktop/flow-check.md +687 -29
- package/prompts/desktop/flow-commit.md +18 -49
- package/prompts/desktop/flow-finish.md +910 -0
- package/prompts/desktop/flow-release.md +662 -0
- package/prompts/desktop/flow-work.md +398 -219
- package/prompts/frontend/flow-check-review.md +648 -12
- package/prompts/frontend/flow-check-test.md +520 -8
- package/prompts/frontend/flow-check.md +687 -29
- package/prompts/frontend/flow-commit.md +18 -49
- package/prompts/frontend/flow-finish.md +910 -0
- package/prompts/frontend/flow-release.md +519 -0
- package/prompts/frontend/flow-work-api.md +1540 -0
- package/prompts/frontend/flow-work.md +774 -218
- package/prompts/mobile/flow-check-review.md +648 -12
- package/prompts/mobile/flow-check-test.md +520 -8
- package/prompts/mobile/flow-check.md +687 -29
- package/prompts/mobile/flow-commit.md +18 -49
- package/prompts/mobile/flow-finish.md +910 -0
- package/prompts/mobile/flow-release.md +751 -0
- package/prompts/mobile/flow-work-api.md +1493 -0
- package/prompts/mobile/flow-work.md +792 -222
- package/templates/AGENT.template.md +1 -1
---
description: Finalization Workflow - Archive, Generate Descriptions, and Push
---

# AI Flow - Finish Workflow

**YOU ARE AN EXPERT DEVELOPMENT WORKFLOW AUTOMATION SPECIALIST.**

Your mission is to finalize completed work by archiving metrics, generating professional PR/Jira descriptions with AI analysis, and optionally pushing changes when the user executes `/flow-finish`.

**🚀 AGENT MODE ENABLED:** Act proactively; run validations and commits automatically if needed. Request confirmation only before the final push.

---

## Command: `/flow-finish`

### Objective

Automate the complete finalization of development work with:

- **Smart validation** (only if needed - skip if already executed)
- **Smart commit** (only if uncommitted changes exist)
- **Automatic archiving** (metrics to analytics.jsonl)
- **AI-powered descriptions** (professional PR and Jira descriptions with optimized token usage)
- **Optional push** (always ask for confirmation)
- **Cleanup** (remove work folder after success)

---

## Workflow: 5 Steps

### Step 0: Pre-Flight Checks & State Detection

**🔍 CRITICAL VALIDATION** - Detect the complete state BEFORE any costly operations:

```bash
# 1. Verify active work exists
if [ ! -d ".ai-flow/work" ] || [ -z "$(ls -A .ai-flow/work)" ]; then
  echo "❌ No active work in .ai-flow/work/"
  echo "💡 Start work with: /flow-work"
  exit 1
fi

# 2. Detect work folder (should be only one)
TASK_FOLDER=$(ls .ai-flow/work/ | head -n 1)
TASK_PATH=".ai-flow/work/$TASK_FOLDER"

# 3. Check if already PAUSED
if [ -f "$TASK_PATH/PAUSED" ]; then
  echo "⏸️ This task is paused."
  echo "💡 Resume with: /flow-work"
  exit 1
fi

# 4. Check if already archived
if [ ! -f "$TASK_PATH/work.md" ]; then
  echo "✅ This task was already archived."
  exit 0
fi

# 5. Detect current branch
CURRENT_BRANCH=$(git branch --show-current)

# 6. Verify branch protection
PROTECTED_BRANCHES="main|master|develop|development"
if [[ "$CURRENT_BRANCH" =~ ^($PROTECTED_BRANCHES)$ ]]; then
  echo "⚠️ ERROR: You cannot finish work on a protected branch: $CURRENT_BRANCH"
  echo "💡 Create a feature branch first with: git checkout -b feature/[name]"
  exit 1
fi

# 7. Detect uncommitted changes
UNCOMMITTED=$(git status --porcelain)
HAS_UNCOMMITTED_CHANGES=false
if [ -n "$UNCOMMITTED" ]; then
  HAS_UNCOMMITTED_CHANGES=true
fi

# 8. Read validation state (if status.json exists)
if [ -f "$TASK_PATH/status.json" ]; then
  TESTS_EXECUTED=$(jq -r '.validation.tests.executed' "$TASK_PATH/status.json")
  TESTS_PASSED=$(jq -r '.validation.tests.passed' "$TASK_PATH/status.json")
  TESTS_FAILED=$(jq -r '.validation.tests.failed' "$TASK_PATH/status.json")
  LINT_EXECUTED=$(jq -r '.validation.lint.executed' "$TASK_PATH/status.json")
  LAST_VALIDATION_TIMESTAMP=$(jq -r '.validation.lastExecuted' "$TASK_PATH/status.json" 2>/dev/null || echo "0")
else
  TESTS_EXECUTED=false
  TESTS_PASSED=0
  TESTS_FAILED=0
  LINT_EXECUTED=false
  LAST_VALIDATION_TIMESTAMP="0"
fi

# 9. Detect if there are changes since last validation
LAST_COMMIT_TIMESTAMP=$(git log -1 --format=%ct 2>/dev/null || echo "0")
NEEDS_REVALIDATION=false
if [ "$TESTS_EXECUTED" = "true" ] && [ "$LAST_COMMIT_TIMESTAMP" -gt "$LAST_VALIDATION_TIMESTAMP" ]; then
  NEEDS_REVALIDATION=true
fi

# 10. Show current state summary
echo ""
echo "---"
echo "📊 Current Work State"
echo "---"
echo "📂 Task: $TASK_FOLDER"
echo "🌿 Branch: $CURRENT_BRANCH"
echo "📝 Uncommitted changes: $([ "$HAS_UNCOMMITTED_CHANGES" = true ] && echo "⚠️ Yes" || echo "✅ No")"
echo "🧪 Tests executed: $([ "$TESTS_EXECUTED" = "true" ] && echo "✅ Yes ($TESTS_PASSED passed, $TESTS_FAILED failed)" || echo "❌ No")"
echo "🔍 Lint executed: $([ "$LINT_EXECUTED" = "true" ] && echo "✅ Yes" || echo "❌ No")"
echo "♻️ Needs re-validation: $([ "$NEEDS_REVALIDATION" = true ] && echo "⚠️ Yes" || echo "✅ No")"
echo ""
```
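
Check 9's re-validation trigger is a plain integer comparison of two Unix epoch timestamps. A minimal standalone sketch, with hypothetical values substituted for the `git log -1 --format=%ct` and `status.json` lookups:

```shell
# Hypothetical epochs standing in for the git/jq lookups above
LAST_COMMIT_TIMESTAMP=1700000300     # latest commit
LAST_VALIDATION_TIMESTAMP=1700000000 # last /flow-check run
TESTS_EXECUTED=true

NEEDS_REVALIDATION=false
if [ "$TESTS_EXECUTED" = "true" ] && [ "$LAST_COMMIT_TIMESTAMP" -gt "$LAST_VALIDATION_TIMESTAMP" ]; then
  NEEDS_REVALIDATION=true  # a commit landed after the last validation
fi
echo "$NEEDS_REVALIDATION"
```

With these values the commit postdates the validation, so re-validation is flagged.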

---

### Step 1: Smart Validation

**Only execute `/flow-check` if:**

- `TESTS_EXECUTED == false` (never executed), **OR**
- `NEEDS_REVALIDATION == true` (commits after the last validation)

```bash
SHOULD_RUN_CHECK=false

if [ "$TESTS_EXECUTED" = "false" ]; then
  echo "🧪 Tests not executed. Running /flow-check..."
  SHOULD_RUN_CHECK=true
elif [ "$NEEDS_REVALIDATION" = "true" ]; then
  echo "🔄 Changes detected since the last validation. Running /flow-check..."
  SHOULD_RUN_CHECK=true
else
  echo "✅ Previous validation OK. Skipping /flow-check"
fi

if [ "$SHOULD_RUN_CHECK" = "true" ]; then
  # INVOKE /flow-check HERE
  # Execute the complete /flow-check workflow
  # This will update status.json with results

  # After execution, re-read validation results
  TESTS_PASSED=$(jq -r '.validation.tests.passed' "$TASK_PATH/status.json" 2>/dev/null || echo "0")
  TESTS_FAILED=$(jq -r '.validation.tests.failed' "$TASK_PATH/status.json" 2>/dev/null || echo "0")

  # If tests FAIL → STOP EVERYTHING
  if [ "$TESTS_FAILED" -gt 0 ]; then
    echo ""
    echo "❌ TESTS FAILED"
    echo "---"
    echo "$TESTS_FAILED test(s) failed."
    echo ""
    echo "🛑 WORKFLOW STOPPED"
    echo "💡 Fix the tests and run /flow-finish again"
    exit 1
  fi
fi
```

---

### Step 2: Smart Commit

**Only execute `/flow-commit` if:**

- `HAS_UNCOMMITTED_CHANGES == true`

```bash
if [ "$HAS_UNCOMMITTED_CHANGES" = "true" ]; then
  echo "📝 Uncommitted changes detected. Running /flow-commit..."

  # INVOKE /flow-commit HERE
  # Execute the complete /flow-commit workflow
  # This will update status.json with new commits
else
  echo "✅ No uncommitted changes. Skipping /flow-commit"
fi

# Verify working directory is clean
FINAL_STATUS=$(git status --porcelain)
if [ -n "$FINAL_STATUS" ]; then
  echo "⚠️ There are still uncommitted changes after /flow-commit"
  echo "$FINAL_STATUS"
  echo ""
  echo "🛑 WORKFLOW STOPPED"
  echo "💡 Commit manually or review the changes"
  exit 1
fi
```

---

### Step 3: Archive, Cleanup & Commit

**ALWAYS execute this step, before asking about the push:**

```bash
echo ""
echo "📦 Archiving and cleaning up work..."
echo ""

# 1. Extract metadata for analytics
if [ -f "$TASK_PATH/status.json" ]; then
  # COMPLEX: has status.json
  TASK_TYPE=$(jq -r '.type' "$TASK_PATH/status.json")
  TASK_SOURCE=$(jq -r '.source' "$TASK_PATH/status.json")
  CREATED_AT=$(jq -r '.timestamps.created' "$TASK_PATH/status.json")
  COMPLETED_AT=$(date -u +"%Y-%m-%dT%H:%M:%SZ")

  # Calculate duration in minutes
  CREATED_TS=$(date -d "$CREATED_AT" +%s 2>/dev/null || date -j -f "%Y-%m-%dT%H:%M:%SZ" "$CREATED_AT" +%s 2>/dev/null || echo "0")
  COMPLETED_TS=$(date +%s)
  DURATION_MIN=$(( (COMPLETED_TS - CREATED_TS) / 60 ))

  TOTAL_TASKS=$(jq -r '.progress.totalTasks' "$TASK_PATH/status.json")
  COMMIT_COUNT=$(jq -r '.git.commits | length' "$TASK_PATH/status.json")
  VALIDATION_PASSED=$( [ "$TESTS_FAILED" -eq 0 ] && echo "true" || echo "false" )
else
  # MEDIUM: only work.md + git
  TASK_TYPE="unknown"
  # Detect type from folder name patterns
  if echo "$TASK_FOLDER" | grep -qiE '^(feature|feat)'; then
    TASK_TYPE="feature"
  elif echo "$TASK_FOLDER" | grep -qiE '^(fix|bugfix)'; then
    TASK_TYPE="fix"
  elif echo "$TASK_FOLDER" | grep -qiE '^refactor'; then
    TASK_TYPE="refactor"
  fi

  TASK_SOURCE="manual"

  # First commit timestamp
  FIRST_COMMIT=$(git log --reverse --format=%ct --all -- "$TASK_PATH/work.md" 2>/dev/null | head -n 1)
  CREATED_AT=$(date -u -d "@$FIRST_COMMIT" +"%Y-%m-%dT%H:%M:%SZ" 2>/dev/null || date -u +"%Y-%m-%dT%H:%M:%SZ")
  COMPLETED_AT=$(date -u +"%Y-%m-%dT%H:%M:%SZ")

  CREATED_TS=$FIRST_COMMIT
  COMPLETED_TS=$(date +%s)
  DURATION_MIN=$(( (COMPLETED_TS - CREATED_TS) / 60 ))

  # Count checkboxes in work.md (grep -c already prints 0 on no match)
  TOTAL_TASKS=$(grep -c '^\- \[ \]' "$TASK_PATH/work.md" 2>/dev/null || true)
  TOTAL_TASKS=${TOTAL_TASKS:-0}

  # Count commits in branch
  COMMIT_COUNT=$(git log --oneline "$CURRENT_BRANCH" ^main 2>/dev/null | wc -l | tr -d ' ')

  VALIDATION_PASSED="true"
fi

# Extract Story Points from work.md
STORY_POINTS=$(grep -oP '• \K\d+(?= SP)' "$TASK_PATH/work.md" 2>/dev/null | awk '{sum+=$1} END {print sum}')
STORY_POINTS=${STORY_POINTS:-0}

# Calculate hours and minutes
DURATION_HOURS=$(( DURATION_MIN / 60 ))
DURATION_MINS=$(( DURATION_MIN % 60 ))

# 2. Build JSON analytics
ANALYTICS_JSON="{\"task\":\"$TASK_FOLDER\",\"type\":\"$TASK_TYPE\",\"src\":\"$TASK_SOURCE\",\"dur\":$DURATION_MIN,\"start\":\"$CREATED_AT\",\"end\":\"$COMPLETED_AT\",\"tasks\":$TOTAL_TASKS,\"sp\":$STORY_POINTS,\"commits\":$COMMIT_COUNT,\"valid\":$VALIDATION_PASSED}"

# 3. Append to analytics.jsonl
mkdir -p .ai-flow/archive
echo "$ANALYTICS_JSON" >> .ai-flow/archive/analytics.jsonl

echo "✅ Metrics archived in analytics.jsonl"

# 4. Delete work folder
rm -rf "$TASK_PATH"
echo "✅ Work folder removed"

# 5. Commit analytics
git add .ai-flow/archive/analytics.jsonl

# Check if there's something to commit
if git diff --cached --quiet; then
  echo "✅ Analytics already committed previously"
else
  git commit -m "chore: archive $TASK_TYPE task '$TASK_FOLDER' (${DURATION_HOURS}h ${DURATION_MINS}min, ${STORY_POINTS} SP)"
  echo "✅ Analytics committed"
fi

echo ""
echo "---"
echo "📊 Completed Work Summary"
echo "---"
echo "📂 Task: $TASK_FOLDER"
echo "🏷️ Type: $TASK_TYPE"
echo "⏱️ Duration: ${DURATION_HOURS}h ${DURATION_MINS}min"
echo "📊 Story Points: $STORY_POINTS SP"
echo "💾 Commits: $COMMIT_COUNT"
echo "🧪 Validation: $([ "$VALIDATION_PASSED" = "true" ] && echo "✅ Passed" || echo "⚠️ With warnings")"
echo "🌿 Branch: $CURRENT_BRANCH"
echo ""
```
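
The record appended in steps 2-3 is one compact JSON object per line, which keeps `analytics.jsonl` append-only and easy to grep. A sketch of the serialized line using hypothetical values in place of the extracted metadata (same key set and template as above):

```shell
# Hypothetical task metadata; keys match the ANALYTICS_JSON template above
TASK_FOLDER="feature-user-auth"
TASK_TYPE="feature"
TASK_SOURCE="manual"
DURATION_MIN=95
CREATED_AT="2025-01-10T09:00:00Z"
COMPLETED_AT="2025-01-10T10:35:00Z"
TOTAL_TASKS=4
STORY_POINTS=5
COMMIT_COUNT=3
VALIDATION_PASSED=true

# Same one-line serialization used by the archiving step
ANALYTICS_JSON="{\"task\":\"$TASK_FOLDER\",\"type\":\"$TASK_TYPE\",\"src\":\"$TASK_SOURCE\",\"dur\":$DURATION_MIN,\"start\":\"$CREATED_AT\",\"end\":\"$COMPLETED_AT\",\"tasks\":$TOTAL_TASKS,\"sp\":$STORY_POINTS,\"commits\":$COMMIT_COUNT,\"valid\":$VALIDATION_PASSED}"
echo "$ANALYTICS_JSON"
```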
|
|
295
|
+
|
|
296
|
+
---
|
|
297
|
+
|
|
298
|
+
### Step 4: AI-Powered Description Generation
|
|
299
|
+
|
|
300
|
+
**Generate professional descriptions using AI with optimized token consumption:**
|
|
301
|
+
|
|
302
|
+
```bash
|
|
303
|
+
echo "---"
|
|
304
|
+
echo "📋 Generando descripciones para PR y Jira con IA..."
|
|
305
|
+
echo "---"
|
|
306
|
+
echo ""
|
|
307
|
+
|
|
308
|
+
# ============================================
|
|
309
|
+
# LAYER 1: Bash Extraction (0 tokens)
|
|
310
|
+
# ============================================
|
|
311
|
+
|
|
312
|
+
# Extract objective from work.md
|
|
313
|
+
function extract_objective_from_work_md() {
|
|
314
|
+
if [ ! -f ".ai-flow/work/$TASK_FOLDER/work.md" ]; then
|
|
315
|
+
# Work folder already deleted, try to reconstruct from commits
|
|
316
|
+
git log "$CURRENT_BRANCH" --format="%B" -1 | head -n 3 | tr '\n' ' ' | sed 's/ */ /g'
|
|
317
|
+
return
|
|
318
|
+
fi
|
|
319
|
+
|
|
320
|
+
# Extract Objective section
|
|
321
|
+
awk '/^## Objective$/,/^## [^O]/' ".ai-flow/work/$TASK_FOLDER/work.md" 2>/dev/null | \
|
|
322
|
+
grep -v '^##' | sed '/^$/d' | head -n 3 | tr '\n' ' ' | sed 's/ */ /g' | sed 's/^ *//;s/ *$//'
|
|
323
|
+
}
|
|
324
|
+
|
|
325
|
+
# Extract completed tasks
|
|
326
|
+
function extract_completed_tasks() {
|
|
327
|
+
if [ ! -f ".ai-flow/work/$TASK_FOLDER/work.md" ]; then
|
|
328
|
+
echo "Tareas completadas (ver commits)"
|
|
329
|
+
return
|
|
330
|
+
fi
|
|
331
|
+
|
|
332
|
+
awk '/^## Tasks$/,/^## [^T]/' ".ai-flow/work/$TASK_FOLDER/work.md" 2>/dev/null | \
|
|
333
|
+
grep '^\- \[x\]' | sed 's/^\- \[x\] /✅ /' | head -n 8
|
|
334
|
+
}
|
|
335
|
+
|
|
336
|
+
# Categorize changed files
|
|
337
|
+
function categorize_changed_files() {
|
|
338
|
+
local all_files=$(git diff --name-only main..HEAD 2>/dev/null || git diff --name-only --staged)
|
|
339
|
+
|
|
340
|
+
local backend_count=$(echo "$all_files" | grep -icE '(controller|service|repository|handler|route|api)' 2>/dev/null || echo 0)
|
|
341
|
+
local frontend_count=$(echo "$all_files" | grep -icE '(component|view|page|screen|widget)' 2>/dev/null || echo 0)
|
|
342
|
+
local db_count=$(echo "$all_files" | grep -icE '(migration|entity|model|schema|\.sql)' 2>/dev/null || echo 0)
|
|
343
|
+
local test_count=$(echo "$all_files" | grep -icE '(test|spec|e2e)' 2>/dev/null || echo 0)
|
|
344
|
+
local doc_count=$(echo "$all_files" | grep -icE '\.md$' 2>/dev/null || echo 0)
|
|
345
|
+
local config_count=$(echo "$all_files" | grep -icE '(\.json|\.yaml|\.yml|\.env|docker|k8s)' 2>/dev/null || echo 0)
|
|
346
|
+
|
|
347
|
+
cat <<EOF
|
|
348
|
+
- Backend: $backend_count files
|
|
349
|
+
- Frontend: $frontend_count files
|
|
350
|
+
- Database: $db_count files
|
|
351
|
+
- Tests: $test_count files
|
|
352
|
+
- Documentation: $doc_count files
|
|
353
|
+
- Configuration: $config_count files
|
|
354
|
+
EOF
|
|
355
|
+
}
|
|
356
|
+
|
|
357
|
+
# Detect file purpose
|
|
358
|
+
function detect_file_purpose() {
|
|
359
|
+
local file=$1
|
|
360
|
+
|
|
361
|
+
case "$file" in
|
|
362
|
+
*controller*|*route*|*handler*) echo "API endpoint" ;;
|
|
363
|
+
*service*|*repository*) echo "Business logic" ;;
|
|
364
|
+
*entity*|*model*|*schema*) echo "Data model" ;;
|
|
365
|
+
*test*|*spec*) echo "Tests" ;;
|
|
366
|
+
*migration*) echo "Database migration" ;;
|
|
367
|
+
*.md) echo "Documentation" ;;
|
|
368
|
+
*) echo "Source code" ;;
|
|
369
|
+
esac
|
|
370
|
+
}
|
|
371
|
+
|
|
372
|
+
# Show top 3 files by impact
|
|
373
|
+
function show_top_3_files_summary() {
|
|
374
|
+
local top_files=$(git diff --stat main..HEAD 2>/dev/null | sort -rn -k3 | head -n 3 | awk '{print $1}')
|
|
375
|
+
|
|
376
|
+
echo "### Most Impacted Files"
|
|
377
|
+
for file in $top_files; do
|
|
378
|
+
local lines_changed=$(git diff --stat main..HEAD -- "$file" 2>/dev/null | tail -n 1 | awk '{print $4}')
|
|
379
|
+
local file_type=$(detect_file_purpose "$file")
|
|
380
|
+
echo "- \`$file\` ($lines_changed lines) - $file_type"
|
|
381
|
+
done
|
|
382
|
+
}
|
|
383
|
+
|
|
384
|
+
# Detect deployment requirements
|
|
385
|
+
function detect_deployment_requirements() {
|
|
386
|
+
local changed_files=$(git diff --name-only main..HEAD 2>/dev/null || echo "")
|
|
387
|
+
|
|
388
|
+
# Migrations (universal)
|
|
389
|
+
HAS_MIGRATIONS=false
|
|
390
|
+
MIGRATION_FILES=""
|
|
391
|
+
if echo "$changed_files" | grep -qiE '(migration|migrate|schema|upgrade|\.sql$)'; then
|
|
392
|
+
HAS_MIGRATIONS=true
|
|
393
|
+
MIGRATION_FILES=$(echo "$changed_files" | grep -iE '(migration|migrate)' | wc -l | tr -d ' ')
|
|
394
|
+
fi
|
|
395
|
+
|
|
396
|
+
# Environment variables (universal)
|
|
397
|
+
NEW_ENV_VARS=""
|
|
398
|
+
ENV_FILES=$(echo "$changed_files" | grep -iE '(\.env\.example|\.env\.template|\.env\.sample|env\.example|env\.sample)')
|
|
399
|
+
if [ -n "$ENV_FILES" ]; then
|
|
400
|
+
NEW_ENV_VARS=$(git diff main..HEAD -- $ENV_FILES 2>/dev/null | grep -E '^\+[A-Z_0-9]+=' | sed 's/^+//' | cut -d'=' -f1 | sort -u)
|
|
401
|
+
fi
|
|
402
|
+
|
|
403
|
+
# Dependencies (language-agnostic)
|
|
404
|
+
HAS_NEW_DEPS=false
|
|
405
|
+
INSTALL_CMD=""
|
|
406
|
+
|
|
407
|
+
if echo "$changed_files" | grep -qE 'package\.json'; then
|
|
408
|
+
HAS_NEW_DEPS=true
|
|
409
|
+
INSTALL_CMD="npm install"
|
|
410
|
+
elif echo "$changed_files" | grep -qE '(requirements\.txt|pyproject\.toml|Pipfile)'; then
|
|
411
|
+
HAS_NEW_DEPS=true
|
|
412
|
+
INSTALL_CMD="pip install -r requirements.txt"
|
|
413
|
+
elif echo "$changed_files" | grep -qE 'composer\.json'; then
|
|
414
|
+
HAS_NEW_DEPS=true
|
|
415
|
+
INSTALL_CMD="composer install"
|
|
416
|
+
elif echo "$changed_files" | grep -qE 'Gemfile'; then
|
|
417
|
+
HAS_NEW_DEPS=true
|
|
418
|
+
INSTALL_CMD="bundle install"
|
|
419
|
+
elif echo "$changed_files" | grep -qE 'go\.(mod|sum)'; then
|
|
420
|
+
HAS_NEW_DEPS=true
|
|
421
|
+
INSTALL_CMD="go mod download"
|
|
422
|
+
elif echo "$changed_files" | grep -qE 'Cargo\.(toml|lock)'; then
|
|
423
|
+
HAS_NEW_DEPS=true
|
|
424
|
+
INSTALL_CMD="cargo build"
|
|
425
|
+
elif echo "$changed_files" | grep -qE '\.csproj'; then
|
|
426
|
+
HAS_NEW_DEPS=true
|
|
427
|
+
INSTALL_CMD="dotnet restore"
|
|
428
|
+
elif echo "$changed_files" | grep -qE 'pom\.xml'; then
|
|
429
|
+
HAS_NEW_DEPS=true
|
|
430
|
+
INSTALL_CMD="mvn install"
|
|
431
|
+
elif echo "$changed_files" | grep -qE 'build\.gradle'; then
|
|
432
|
+
HAS_NEW_DEPS=true
|
|
433
|
+
INSTALL_CMD="gradle build"
|
|
434
|
+
fi
|
|
435
|
+
|
|
436
|
+
# Determine if showing deployment section
|
|
437
|
+
SHOW_DEPLOYMENT_NOTES=false
|
|
438
|
+
if [ "$HAS_MIGRATIONS" = "true" ] || [ -n "$NEW_ENV_VARS" ] || [ "$HAS_NEW_DEPS" = "true" ]; then
|
|
439
|
+
SHOW_DEPLOYMENT_NOTES=true
|
|
440
|
+
fi
|
|
441
|
+
|
|
442
|
+
# Export variables
|
|
443
|
+
export HAS_MIGRATIONS
|
|
444
|
+
export MIGRATION_FILES
|
|
445
|
+
export NEW_ENV_VARS
|
|
446
|
+
export HAS_NEW_DEPS
|
|
447
|
+
export INSTALL_CMD
|
|
448
|
+
export SHOW_DEPLOYMENT_NOTES
|
|
449
|
+
}
|
|
450
|
+
|
|
451
|
+
# Detect area of impact
|
|
452
|
+
function detect_impact_area() {
|
|
453
|
+
local changed_files=$(git diff --name-only main..HEAD 2>/dev/null || echo "")
|
|
454
|
+
local area="General"
|
|
455
|
+
local module=""
|
|
456
|
+
|
|
457
|
+
# Backend API (framework-agnostic)
|
|
458
|
+
if echo "$changed_files" | grep -qiE '(controller|service|repository|handler|route|api|endpoint)'; then
|
|
459
|
+
area="Backend API"
|
|
460
|
+
|
|
461
|
+
# Module by subdirectory or filename
|
|
462
|
+
if echo "$changed_files" | grep -qiE '(auth|login|jwt|user|session)'; then
|
|
463
|
+
module="Authentication"
|
|
464
|
+
elif echo "$changed_files" | grep -qiE '(payment|billing|stripe|paypal)'; then
|
|
465
|
+
module="Payments"
|
|
466
|
+
elif echo "$changed_files" | grep -qiE '(notification|email|sms|push)'; then
|
|
467
|
+
module="Notifications"
|
|
468
|
+
elif echo "$changed_files" | grep -qiE '(report|analytics|dashboard)'; then
|
|
469
|
+
module="Analytics"
|
|
470
|
+
fi
|
|
471
|
+
|
|
472
|
+
# Frontend (framework-agnostic)
|
|
473
|
+
elif echo "$changed_files" | grep -qiE '(component|view|page|screen|widget|template)'; then
|
|
474
|
+
area="Frontend"
|
|
475
|
+
|
|
476
|
+
if echo "$changed_files" | grep -qiE '(auth|login)'; then
|
|
477
|
+
module="Authentication UI"
|
|
478
|
+
elif echo "$changed_files" | grep -qiE '(dashboard|home)'; then
|
|
479
|
+
module="Dashboard"
|
|
480
|
+
elif echo "$changed_files" | grep -qiE '(profile|account|settings)'; then
|
|
481
|
+
module="User Profile"
|
|
482
|
+
fi
|
|
483
|
+
|
|
484
|
+
# Mobile (agnostic: React Native, Flutter, Native)
|
|
485
|
+
elif echo "$changed_files" | grep -qiE '(ios/|android/|mobile/|\.swift|\.kt|\.dart)'; then
|
|
486
|
+
area="Mobile"
|
|
487
|
+
|
|
488
|
+
# Database (agnostic)
|
|
489
|
+
elif echo "$changed_files" | grep -qiE '(migration|schema|seed|model|entity|\.sql)'; then
|
|
490
|
+
area="Database"
|
|
491
|
+
module="Schema"
|
|
492
|
+
|
|
493
|
+
# Infrastructure (agnostic)
|
|
494
|
+
elif echo "$changed_files" | grep -qiE '(docker|k8s|kubernetes|terraform|ansible|\.yaml|\.yml|ci|cd|\.github|\.gitlab)'; then
|
|
495
|
+
area="Infrastructure"
|
|
496
|
+
module="DevOps"
|
|
497
|
+
|
|
498
|
+
# Testing (agnostic)
|
|
499
|
+
elif echo "$changed_files" | grep -qiE '(test|spec|\.test\.|\.spec\.|e2e|integration)'; then
|
|
500
|
+
area="Testing"
|
|
501
|
+
|
|
502
|
+
# Documentation
|
|
503
|
+
elif echo "$changed_files" | grep -qiE '(\.md$|docs?/|README)'; then
|
|
504
|
+
area="Documentation"
|
|
505
|
+
fi
|
|
506
|
+
|
|
507
|
+
# Final format
|
|
508
|
+
if [ -n "$module" ]; then
|
|
509
|
+
echo "$area - $module"
|
|
510
|
+
else
|
|
511
|
+
echo "$area"
|
|
512
|
+
fi
|
|
513
|
+
}
|
|
514
|
+
|
|
515
|
+
# Detect Git platform and generate commit URLs
|
|
516
|
+
function get_commit_urls() {
|
|
517
|
+
local remote_url=$(git config --get remote.origin.url 2>/dev/null)
|
|
518
|
+
|
|
519
|
+
if [ -z "$remote_url" ]; then
|
|
520
|
+
echo "⚠️ No se detectó remote origin, commits sin links"
|
|
521
|
+
COMMIT_URL_PATTERN=""
|
|
522
|
+
PLATFORM="Unknown"
|
|
523
|
+
return 1
|
|
524
|
+
fi
|
|
525
|
+
|
|
526
|
+
# Normalize URL (SSH -> HTTPS)
|
|
527
|
+
local base_url=""
|
|
528
|
+
|
|
529
|
+
# GitHub
|
|
530
|
+
if echo "$remote_url" | grep -qE 'github\.com'; then
|
|
531
|
+
base_url=$(echo "$remote_url" | sed -E 's|git@github.com:(.*)|https://github.com/\1|' | sed 's|\.git$||')
|
|
532
|
+
COMMIT_URL_PATTERN="${base_url}/commit/"
|
|
533
|
+
PLATFORM="GitHub"
|
|
534
|
+
|
|
535
|
+
# GitLab
|
|
536
|
+
elif echo "$remote_url" | grep -qE 'gitlab\.com'; then
|
|
537
|
+
base_url=$(echo "$remote_url" | sed -E 's|git@gitlab.com:(.*)|https://gitlab.com/\1|' | sed 's|\.git$||')
|
|
538
|
+
COMMIT_URL_PATTERN="${base_url}/-/commit/"
|
|
539
|
+
PLATFORM="GitLab"
|
|
540
|
+
|
|
541
|
+
# Bitbucket
|
|
542
|
+
elif echo "$remote_url" | grep -qE 'bitbucket\.org'; then
|
|
543
|
+
base_url=$(echo "$remote_url" | sed -E 's|git@bitbucket.org:(.*)|https://bitbucket.org/\1|' | sed 's|\.git$||')
|
|
544
|
+
COMMIT_URL_PATTERN="${base_url}/commits/"
|
|
545
|
+
PLATFORM="Bitbucket"
|
|
546
|
+
|
|
547
|
+
# Azure DevOps
|
|
548
|
+
elif echo "$remote_url" | grep -qE 'dev\.azure\.com'; then
|
|
549
|
+
base_url=$(echo "$remote_url" | sed -E 's|git@ssh\.dev\.azure\.com:v3/(.*)|https://dev.azure.com/\1|' | sed 's|\.git$||')
|
|
550
|
+
COMMIT_URL_PATTERN="${base_url}/commit/"
|
|
551
|
+
PLATFORM="Azure DevOps"
|
|
552
|
+
|
|
553
|
+
# GitLab Self-Hosted
|
|
554
|
+
elif echo "$remote_url" | grep -qE 'gitlab'; then
|
|
555
|
+
base_url=$(echo "$remote_url" | sed -E 's|git@([^:]+):(.*)|https://\1/\2|' | sed 's|\.git$||')
|
|
556
|
+
COMMIT_URL_PATTERN="${base_url}/-/commit/"
|
|
557
|
+
PLATFORM="GitLab (Self-Hosted)"
|
|
558
|
+
|
|
559
|
+
# GitHub Enterprise
|
|
560
|
+
elif echo "$remote_url" | grep -qE 'github'; then
|
|
561
|
+
base_url=$(echo "$remote_url" | sed -E 's|git@([^:]+):(.*)|https://\1/\2|' | sed 's|\.git$||')
|
|
562
|
+
COMMIT_URL_PATTERN="${base_url}/commit/"
|
|
563
|
+
PLATFORM="GitHub Enterprise"
|
|
564
|
+
|
|
565
|
+
else
|
|
566
|
+
echo "⚠️ Plataforma Git no reconocida, commits sin links"
|
|
567
|
+
COMMIT_URL_PATTERN=""
|
|
568
|
+
PLATFORM="Unknown"
|
|
569
|
+
return 1
|
|
570
|
+
fi
|
|
571
|
+
|
|
572
|
+
echo "✅ Detectado: $PLATFORM"
|
|
573
|
+
export COMMIT_URL_PATTERN
|
|
574
|
+
export PLATFORM
|
|
575
|
+
}
|
|
576
|
+
|
|
577
|
+
# Generate commit list with links
|
|
578
|
+
function generate_commit_links() {
|
|
579
|
+
local max_commits=${1:-5}
|
|
580
|
+
local commits=$(git log main..HEAD --format="%h" -${max_commits} 2>/dev/null)
|
|
581
|
+
local total_commits=$(git log main..HEAD --format="%h" 2>/dev/null | wc -l | tr -d ' ')
|
|
582
|
+
|
|
583
|
+
# For summary line (first 5 hashes)
|
|
584
|
+
COMMIT_HASHES_SUMMARY=""
|
|
585
|
+
local count=0
|
|
586
|
+
|
|
587
|
+
for hash in $commits; do
|
|
588
|
+
if [ $count -lt 5 ]; then
|
|
589
|
+
if [ -n "$COMMIT_HASHES_SUMMARY" ]; then
|
|
590
|
+
COMMIT_HASHES_SUMMARY="${COMMIT_HASHES_SUMMARY}, "
|
|
591
|
+
fi
|
|
592
|
+
|
|
593
|
+
if [ -n "$COMMIT_URL_PATTERN" ]; then
|
|
594
|
+
COMMIT_HASHES_SUMMARY="${COMMIT_HASHES_SUMMARY}[${hash}](${COMMIT_URL_PATTERN}${hash})"
|
|
595
|
+
else
|
|
596
|
+
COMMIT_HASHES_SUMMARY="${COMMIT_HASHES_SUMMARY}\`${hash}\`"
|
|
597
|
+
fi
|
|
598
|
+
fi
|
|
599
|
+
count=$((count + 1))
|
|
600
|
+
done
|
|
601
|
+
|
|
602
|
+
# Add indicator if more commits
|
|
603
|
+
if [ $total_commits -gt 5 ]; then
|
|
604
|
+
COMMIT_HASHES_SUMMARY="${COMMIT_HASHES_SUMMARY}, ... (${total_commits} total)"
|
|
605
|
+
elif [ $total_commits -gt 0 ]; then
|
|
606
|
+
COMMIT_HASHES_SUMMARY="${COMMIT_HASHES_SUMMARY} (${total_commits} total)"
|
|
607
|
+
else
|
|
608
|
+
COMMIT_HASHES_SUMMARY="No commits"
|
|
609
|
+
fi
|
|
610
|
+
|
|
611
|
+
export COMMIT_HASHES_SUMMARY
|
|
612
|
+
export TOTAL_COMMITS=$total_commits
|
|
613
|
+
}
|
|
614
|
+
|
|
615
|
+
# ============================================
|
|
616
|
+
# LAYER 2: Smart Summary (0 tokens)
|
|
617
|
+
# ============================================
|
|
618
|
+
|
|
619
|
+
WORK_OBJECTIVE=$(extract_objective_from_work_md)
|
|
620
|
+
WORK_TASKS=$(extract_completed_tasks)
|
|
621
|
+
COMMIT_SUBJECTS=$(git log main..HEAD --format="%s" 2>/dev/null | head -10)
|
|
622
|
+
COMMIT_BREAKING=$(git log main..HEAD --grep="BREAKING CHANGE:" --format="%s" 2>/dev/null)
|
|
623
|
+
HAS_BREAKING_CHANGES=false
|
|
624
|
+
if [ -n "$COMMIT_BREAKING" ]; then
|
|
625
|
+
HAS_BREAKING_CHANGES=true
|
|
626
|
+
fi
|
|
627
|
+
|
|
628
|
+
FILES_BY_CATEGORY=$(categorize_changed_files)
|
|
629
|
+
TOP_FILES=$(show_top_3_files_summary)
|
|
630
|
+
detect_deployment_requirements
|
|
631
|
+
IMPACT_AREA=$(detect_impact_area)
|
|
632
|
+
get_commit_urls
|
|
633
|
+
generate_commit_links 5
|
|
634
|
+
|
|
635
|
+
# File statistics
|
|
636
|
+
FILES_STAT=$(git diff --stat main..HEAD 2>/dev/null | tail -n 1)
|
|
637
|
+
FILES_COUNT=$(echo "$FILES_STAT" | awk '{print $1}' | tr -d ' ')
|
|
638
|
+
LINES_ADDED=$(echo "$FILES_STAT" | grep -oP '\d+(?= insertion)' 2>/dev/null || echo "0")
|
|
639
|
+
LINES_DELETED=$(echo "$FILES_STAT" | grep -oP '\d+(?= deletion)' 2>/dev/null || echo "0")
|
|
640
|
+
|
|
641
|
+
# Test metrics (from status.json or default)
|
|
642
|
+
TESTS_TOTAL="N/A"
|
|
643
|
+
TESTS_PASSED="N/A"
|
|
TESTS_NEW="0"
COVERAGE="N/A"

if [ -f "$TASK_PATH/status.json" ]; then
  TESTS_PASSED=$(jq -r '.validation.tests.passed // "N/A"' "$TASK_PATH/status.json" 2>/dev/null)
  TESTS_FAILED=$(jq -r '.validation.tests.failed // 0' "$TASK_PATH/status.json" 2>/dev/null)
  # Guard the sum: TESTS_PASSED falls back to "N/A", which is not valid in arithmetic
  if [[ "$TESTS_PASSED" =~ ^[0-9]+$ ]]; then
    TESTS_TOTAL=$((TESTS_PASSED + TESTS_FAILED))
  else
    TESTS_TOTAL="N/A"
  fi
  TESTS_NEW=$(jq -r '.validation.tests.new // 0' "$TASK_PATH/status.json" 2>/dev/null)
  COVERAGE=$(jq -r '.validation.tests.coverage // "N/A"' "$TASK_PATH/status.json" 2>/dev/null)
fi

# Create structured summary for AI
cat > /tmp/ai-context-summary.md <<EOF
# Context Summary for AI Analysis

## Work Overview
Objective: $WORK_OBJECTIVE
Completed Tasks:
$WORK_TASKS
Type: $TASK_TYPE
Story Points: $STORY_POINTS

## Changes Made
Commits (subjects only):
$COMMIT_SUBJECTS

Breaking Changes: $([ "$HAS_BREAKING_CHANGES" = "true" ] && echo "YES" || echo "NO")

Files Changed by Category:
$FILES_BY_CATEGORY

## Technical Metrics
- Duration: ${DURATION_HOURS}h ${DURATION_MINS}min
- Commits: $TOTAL_COMMITS
- Files: $FILES_COUNT (+$LINES_ADDED/-$LINES_DELETED)
- Tests: $TESTS_PASSED/$TESTS_TOTAL passing ($TESTS_NEW new)
- Coverage: $COVERAGE%
- Branch: $CURRENT_BRANCH

## Deployment Requirements
$([ "$HAS_MIGRATIONS" = "true" ] && echo "Migrations: YES ($MIGRATION_FILES files)" || echo "Migrations: NO")
$([ -n "$NEW_ENV_VARS" ] && echo "Env Vars: $NEW_ENV_VARS" || echo "Env Vars: None")
$([ "$HAS_NEW_DEPS" = "true" ] && echo "Dependencies: YES (install: $INSTALL_CMD)" || echo "Dependencies: NO")

$TOP_FILES
EOF

# ============================================
# LAYER 3: AI Generation (~800-1200 tokens)
# ============================================

echo "🤖 Analyzing context and generating professional descriptions..."
echo ""
```
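The inline `$([ … ] && echo … || echo …)` pattern used inside the heredoc above can be illustrated in isolation. A minimal sketch with hypothetical values (not taken from a real run):

```bash
# Hypothetical values, for illustration only
HAS_MIGRATIONS="true"
MIGRATION_FILES="3"

# An unquoted <<EOF heredoc expands the inline conditional before emitting the text
cat <<EOF
$([ "$HAS_MIGRATIONS" = "true" ] && echo "Migrations: YES ($MIGRATION_FILES files)" || echo "Migrations: NO")
EOF
# prints: Migrations: YES (3 files)
```

Because the heredoc delimiter is unquoted, every `$VAR` and `$(...)` inside it is evaluated at write time, which is exactly why the context summary file ends up with concrete values rather than placeholders.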

**NOW INVOKE AI TO GENERATE DESCRIPTIONS:**

Read the structured summary from `/tmp/ai-context-summary.md` (400-600 words) and generate two professional descriptions:

**AI Prompt:**

```markdown
Generate two professional descriptions (PR and Jira) based on this structured summary:

<context-summary>
$(cat /tmp/ai-context-summary.md)
</context-summary>

<commit-links>
Platform: $PLATFORM
Base URL Pattern: $COMMIT_URL_PATTERN
Commit Hashes Summary: $COMMIT_HASHES_SUMMARY
</commit-links>

<formatting-requirements>
- Detected impact area: $IMPACT_AREA
- Show deployment notes: $([ "$SHOW_DEPLOYMENT_NOTES" = "true" ] && echo "YES" || echo "NO")
- Breaking changes: $([ "$HAS_BREAKING_CHANGES" = "true" ] && echo "YES (highlight with ⚠️)" || echo "NO")
- Branch: $CURRENT_BRANCH
- Story Points: $STORY_POINTS SP
- Duration: ${DURATION_HOURS}h ${DURATION_MINS}min
</formatting-requirements>

**Requirements:**

1. **PR Description (GitHub/GitLab/Bitbucket):**
   - Title: ## ${TASK_TYPE^}: [descriptive name based on the objective]
   - Header with branch, SP, duration
   - "Impact Area" section with value: $IMPACT_AREA
   - Context: summarize the problem/need in 2-3 lines (extract from the objective)
   - Implemented Solution: summarize the technical approach in 2-3 lines (infer from commits)
   - Main Changes: list 5-7 significant changes (analyze the commit subjects)
   - Validation: table with Tests, Coverage, Lint, Docs
   - Metrics: table with Commits, Files, Breaking Changes
   - Deployment Notes: only IF $SHOW_DEPLOYMENT_NOTES=true, include the specific requirements
   - References: commits with links using $COMMIT_HASHES_SUMMARY
   - Reviewer checklist

2. **Jira Description (standard Markdown):**
   - Similar structure, but more concise
   - Focus on the business outcome
   - Metrics in a table
   - Deployment notes if applicable
   - References with commits

**Important Rules:**

- Use professional but clear language
- Be specific about technical changes
- Use the commit links already formatted in $COMMIT_HASHES_SUMMARY
- Escape special characters so the Markdown is valid (backticks, pipes, asterisks)
- Use `---` separators (not `━━━━`)
- If there are breaking changes, highlight them with ⚠️ in the Metrics section
- If there are deployment notes, be specific about each requirement

**Output format (CRITICAL - respect the delimiters):**

\`\`\`markdown

<!-- PR_DESCRIPTION_START -->

[full PR description content here]

<!-- PR_DESCRIPTION_END -->

<!-- JIRA_DESCRIPTION_START -->

[full Jira description content here]

<!-- JIRA_DESCRIPTION_END -->

\`\`\`

Analyze the context and generate the optimal descriptions now.
```

**After AI generates the descriptions, extract and save them:**

```bash
# Extract PR description
sed -n '/<!-- PR_DESCRIPTION_START -->/,/<!-- PR_DESCRIPTION_END -->/p' /tmp/ai-output.md | \
  sed '1d;$d' > /tmp/pr-description.md

# Extract Jira description
sed -n '/<!-- JIRA_DESCRIPTION_START -->/,/<!-- JIRA_DESCRIPTION_END -->/p' /tmp/ai-output.md | \
  sed '1d;$d' > /tmp/jira-description.md

# Display descriptions
echo ""
echo "---"
echo "📋 PULL REQUEST DESCRIPTION (GitHub/GitLab/Bitbucket)"
echo "---"
echo ""
cat /tmp/pr-description.md
echo ""
echo ""
echo "---"
echo "🎫 JIRA/CLICKUP/LINEAR DESCRIPTION (Markdown)"
echo "---"
echo ""
cat /tmp/jira-description.md
echo ""
echo ""
echo "💡 Copy the descriptions above into your tickets"
echo ""
```
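The two-stage extraction can be checked against a tiny, hypothetical AI output file: the outer `sed -n '/START/,/END/p'` prints the delimited range including the marker lines, and `sed '1d;$d'` then strips those first and last lines.

```bash
# Stand-in for /tmp/ai-output.md (contents are hypothetical)
cat > /tmp/demo-ai-output.md <<'EOF'
preamble the AI might emit
<!-- PR_DESCRIPTION_START -->
PR body line 1
PR body line 2
<!-- PR_DESCRIPTION_END -->
trailing text
EOF

# Same pattern as above: range-print, then drop the marker lines
sed -n '/<!-- PR_DESCRIPTION_START -->/,/<!-- PR_DESCRIPTION_END -->/p' /tmp/demo-ai-output.md | sed '1d;$d'
# prints:
# PR body line 1
# PR body line 2
```

Note that this relies on the AI respecting the delimiters exactly, which is why the prompt marks them as CRITICAL.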

---

### Step 5: Git Push (Optional)

**Always ask before pushing:**

```bash
echo "---"
echo "🚀 Push to origin/$CURRENT_BRANCH?"
echo ""
echo "This will upload:"
echo "  - $TOTAL_COMMITS work commits"
echo "  - 1 archived analytics commit"
echo ""
read -p "Confirm push (y/N): " CONFIRM_PUSH

if [[ "$CONFIRM_PUSH" =~ ^[Yy]$ ]]; then
  echo ""
  echo "⬆️ Pushing changes to origin/$CURRENT_BRANCH..."
  echo ""

  git push origin "$CURRENT_BRANCH"

  if [ $? -eq 0 ]; then
    echo ""
    echo "---"
    echo "✅ PUSH SUCCESSFUL"
    echo "---"
    echo ""
    echo "📊 Work completed and pushed successfully"
    echo "🌿 Branch: $CURRENT_BRANCH (pushed)"
    echo ""
    echo "💡 Next step: create a Pull Request"
    echo ""
    echo "   GitHub CLI:"
    echo "   gh pr create --title \"$TASK_TYPE: $TASK_FOLDER\" --body-file /tmp/pr-description.md"
    echo ""
    echo "   GitLab CLI:"
    echo "   glab mr create --title \"$TASK_TYPE: $TASK_FOLDER\" --description \"\$(cat /tmp/pr-description.md)\""
    echo ""
    echo "   Bitbucket CLI:"
    echo "   bb pr create --title \"$TASK_TYPE: $TASK_FOLDER\" --description \"\$(cat /tmp/pr-description.md)\""
    echo ""
    echo "   Or open your repository in the browser and create the PR manually"
    echo ""
  else
    echo ""
    echo "❌ PUSH FAILED"
    echo ""
    echo "Possible causes:"
    echo "  - Conflict with the remote"
    echo "  - Branch has no upstream configured"
    echo "  - Insufficient permissions"
    echo ""
    echo "💡 Useful commands:"
    echo "   git pull origin $CURRENT_BRANCH --rebase"
    echo "   git push -u origin $CURRENT_BRANCH"
    echo ""
    exit 1
  fi
else
  echo ""
  echo "⏭️ Push canceled by the user"
  echo ""
  echo "✅ Work archived and committed locally"
  echo ""
  echo "💡 You can push manually when ready:"
  echo "   git push origin $CURRENT_BRANCH"
  echo ""
  echo "⚠️ Don't forget to create the PR after pushing"
  echo ""
fi
```
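As a style note, the `git push …; if [ $? -eq 0 ]` pattern above can also be written as a direct `if` on the command itself, which avoids `$?` being clobbered if another command is ever inserted in between. A minimal sketch with a stand-in function (not part of the original flow):

```bash
# Stand-in for `git push origin "$CURRENT_BRANCH"`; replace with the real command
push_cmd() { return 0; }

# `if` tests the command's exit status directly, no $? needed
if push_cmd; then
  echo "PUSH SUCCESSFUL"
else
  echo "PUSH FAILED"
fi
# prints: PUSH SUCCESSFUL
```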

---

## Summary

**Token Consumption Estimate:**

- **Step 1 (/flow-check):** ~1,800 tokens (only if needed)
- **Step 2 (/flow-commit):** ~1,000 tokens (only if uncommitted changes)
- **Step 4 (AI Descriptions):** ~1,200 tokens (optimized summary + generation)

**Total:** ~4,000 tokens for full execution with validation and commit
**Total:** ~1,200 tokens if check/commit already executed
**Total:** ~0 tokens if no changes detected (smart skip)

**Benefits:**

- ✅ Professional-quality descriptions (senior-level)
- ✅ Optimized token usage (66% reduction vs full diff analysis)
- ✅ One command does everything
- ✅ Smart skip for validation/commit
- ✅ Tests are blocking (prevents broken code)
- ✅ Archive before push (guarantees versioning)
- ✅ Commit links (6 Git platforms supported)
- ✅ Language/framework agnostic
- ✅ Deployment notes auto-detection

---

## Notes

- This workflow completes the development cycle started by `/flow-work`
- Always archives analytics to `.ai-flow/archive/analytics.jsonl` before push
- Generates descriptions using AI with optimized context (structured summary, not full diffs)
- Push is always optional (user confirmation required)
- Work folder cleanup only happens after successful archiving