@forwardimpact/schema 0.8.3 → 0.9.0
- package/examples/capabilities/delivery.yaml +821 -172
- package/examples/capabilities/reliability.yaml +165 -285
- package/examples/capabilities/scale.yaml +1344 -103
- package/package.json +1 -1
- package/schema/json/capability.schema.json +12 -4
- package/schema/rdf/capability.ttl +34 -8
- package/src/loader.js +5 -0
- package/src/validation.js +44 -0
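This release tightens the capability document shape: the examples now carry an `id` and a populated `ordinalRank`, `capability.schema.json` changes by +12/−4, and `src/validation.js` gains 44 lines of checks. A minimal sketch of validating one of the bundled examples against the bundled JSON Schema, assuming `ajv` and `js-yaml` as validator and parser (neither is named by this diff) and a draft-07-compatible schema:

```typescript
// validate-example.ts — hypothetical helper, not this package's own loader API.
import { readFileSync } from "node:fs";
import Ajv from "ajv";
import { load } from "js-yaml";

// Paths follow the file list above, resolved from node_modules.
const pkg = "node_modules/@forwardimpact/schema";
const schema = JSON.parse(
  readFileSync(`${pkg}/schema/json/capability.schema.json`, "utf8")
);
const doc = load(
  readFileSync(`${pkg}/examples/capabilities/delivery.yaml`, "utf8")
);

const ajv = new Ajv({ allErrors: true });
const validate = ajv.compile(schema);
if (!validate(doc)) {
  // Would surface errors such as a missing `id` or a non-numeric `ordinalRank`.
  console.error(validate.errors);
  process.exit(1);
}
console.log("delivery.yaml conforms to capability.schema.json");
```

The package's `src/loader.js` and `src/validation.js` presumably expose their own entry points; this sketch only exercises the published schema file directly.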
@@ -1,8 +1,9 @@
 # yaml-language-server: $schema=https://www.forwardimpact.team/schema/json/capability.schema.json
 
+id: delivery
 name: Delivery
 emojiIcon: 🚀
-ordinalRank:
+ordinalRank: 3
 description: |
   Building and shipping solutions that solve real problems.
   Encompasses full-stack development, data integration, problem discovery,
@@ -41,144 +42,239 @@ managementResponsibilities:
     Shape delivery culture across the business unit, lead strategic delivery
     transformations, and represent delivery commitments at executive level
 skills:
-  - id:
-    name:
+  - id: data_integration
+    name: Data Integration
     human:
       description:
-
-
-
+        Gaining access to enterprise data, cleaning messy real-world datasets,
+        and making information usable for decision-making—often with
+        inconsistent formats, missing values, and undocumented schemas. The
+        heart of embedded engineering work.
       levelDescriptions:
         awareness:
-          You understand
-
-
+          You understand how data flows through systems and can use existing
+          pipelines, APIs, and data sources with guidance. You know to ask about
+          data quality.
         foundational:
-          You
-
-
+          You create simple data transformations and handle common formats (CSV,
+          JSON, SQL). You identify and report data quality issues and understand
+          basic ETL concepts.
         working:
-          You
-
-
+          You integrate multiple data sources independently, clean messy
+          datasets, handle inconsistent formats and missing values, and document
+          data lineage. You troubleshoot integration failures.
         practitioner:
-          You
-
-
-
+          You navigate complex enterprise data landscapes across teams, build
+          relationships to gain data access, handle undocumented schemas through
+          investigation, and build robust, maintainable integration solutions.
+          You mentor engineers in your area on data integration challenges.
         expert:
-          You define
-          unit. You
-
-
+          You define data integration patterns and best practices across the
+          business unit. You architect large-scale data flows, solve the most
+          complex integration challenges, and are the authority on enterprise
+          data integration.
     agent:
-      name:
-      description:
-        Guide for
+      name: data-integration
+      description: |
+        Guide for integrating data from multiple sources, cleaning messy
+        datasets, and handling data quality issues.
       useWhen: |
-
-
+        Working with enterprise data, ETL pipelines, or data transformation
+        tasks.
       stages:
         specify:
           focus: |
-            Define
-            Clarify
+            Define data integration requirements and acceptance criteria.
+            Clarify data sources, formats, and quality expectations.
           readChecklist:
-            -
-            -
-
-            -
-            - Identify stakeholders and their concerns
+            - Identify source and target data systems
+            - Document data format and schema requirements
+            - Define data quality acceptance criteria
+            - Clarify data freshness and latency requirements
             - Mark ambiguities with [NEEDS CLARIFICATION]
           confirmChecklist:
-            -
-            -
-            -
-            -
+            - Data sources are identified and accessible
+            - Data format requirements are documented
+            - Quality criteria are defined
+            - Latency requirements are clear
         plan:
-          focus:
+          focus: |
+            Plan data integration approach. Identify sources, assess quality,
+            and plan transformation logic.
           readChecklist:
-            -
-            -
-            -
-            -
-            - Document approach with rationale
+            - Identify data sources and access requirements
+            - Assess data quality and completeness
+            - Plan transformation logic and validation
+            - Document data lineage approach
          confirmChecklist:
-            -
-            -
-            -
-            -
+            - Data sources are identified
+            - Data formats are understood
+            - Data quality requirements are defined
+            - Transformation logic is planned
        onboard:
          focus: |
-            Set up the
-
-
+            Set up the data integration environment. Install data
+            processing tools, configure data source access, and verify
+            connectivity to all required systems.
          readChecklist:
-            - Install
-            -
-            -
-            - Set up
-            -
+            - Install data tools (DuckDB, Polars, Great Expectations)
+            - Configure database connections and API credentials
+            - Verify access to all identified data sources
+            - Set up virtual environment and pin dependency versions
+            - Create .env file with connection strings and credentials
          confirmChecklist:
-            -
-            -
-            -
-            -
-            -
+            - All data processing libraries installed and importable
+            - Data source connections verified and working
+            - Credentials stored securely in .env (not committed to git)
+            - Sample queries run successfully against each data source
+            - Virtual environment is reproducible (requirements.txt or
+              pyproject.toml)
        code:
-          focus:
+          focus: |
+            Implement data transformations with robust quality checks
+            and error handling for messy real-world data.
          readChecklist:
-            -
-            -
-            -
-            -
+            - Implement data extraction and loading
+            - Handle data quality issues (nulls, formats, duplicates)
+            - Create transformation logic
+            - Add validation and error handling
+            - Document data lineage
          confirmChecklist:
-            -
-            -
-            -
-            -
+            - Data transformations produce expected output
+            - Basic validation exists for input data
+            - Data formats are handled correctly
+            - Error handling exists for malformed data
+            - Pipeline is idempotent
        review:
-          focus:
+          focus: |
+            Validate data quality, transformation correctness, and
+            operational readiness.
          readChecklist:
-            -
-            -
-            -
-            -
+            - Verify data quality checks
+            - Test with edge cases and malformed data
+            - Review error handling coverage
+            - Validate documentation completeness
          confirmChecklist:
-            -
-            -
-            -
+            - Data quality checks are implemented
+            - Edge cases are handled
+            - Data lineage is documented
+            - Failures are logged and alertable
        deploy:
          focus: |
-            Deploy
-
+            Deploy data pipeline to production and verify data flow.
+            Monitor for data quality and latency issues.
          readChecklist:
-            - Deploy
-            - Verify
-            - Monitor
-            -
+            - Deploy pipeline configuration
+            - Verify data flows end-to-end in production
+            - Monitor data quality metrics
+            - Confirm alerting is operational
          confirmChecklist:
-            -
-            -
-            -
-            -
+            - Pipeline deployed successfully
+            - Data flowing in production
+            - Quality metrics within thresholds
+            - Alerting verified working
+      toolReferences:
+        - name: DuckDB
+          url: https://duckdb.org/docs/
+          simpleIcon: duckdb
+          description: In-process analytical database
+          useWhen: Querying CSV/Parquet files with SQL or quick data exploration
+        - name: Polars
+          url: https://docs.pola.rs/
+          simpleIcon: polars
+          description: Fast DataFrame library with lazy evaluation
+          useWhen: Transforming and cleaning large datasets programmatically
+        - name: Great Expectations
+          url: https://docs.greatexpectations.io/
+          simpleIcon: python
+          description: Data validation and profiling framework
+          useWhen: Validating data quality and creating data documentation
+      instructions: |
+        ## Step 1: Explore the Source Data
+
+        Use DuckDB to quickly inspect files without loading into memory.
+        Check schema, data types, row counts, and null distributions.
+
+        ## Step 2: Transform with Polars
+
+        Use lazy evaluation for large datasets: filter, fill nulls,
+        parse dates, and aggregate. Collect only when the query plan
+        is complete. Write cleaned data to Parquet.
+
+        ## Step 3: Validate Data Quality
+
+        Define expectations with Great Expectations: not-null checks,
+        uniqueness constraints, value ranges. Run validation and
+        check results.
+
+        ## Step 4: Export to Target Format
+
+        Use DuckDB COPY or Polars write methods to export transformed
+        data to the target format and location.
+      installScript: |
+        set -e
+        pip install duckdb polars great-expectations
+        python -c "import duckdb, polars, great_expectations"
       implementationReference: |
-        ##
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
+        ## SQL Exploration
+
+        ```sql
+        SELECT * FROM read_csv('data.csv') LIMIT 10;
+        DESCRIBE SELECT * FROM read_csv('data.csv');
+        SELECT COUNT(*), COUNT(id), COUNT(email) FROM read_csv('data.csv');
+        ```
+
+        ## Polars Transformation
+
+        ```python
+        import polars as pl
+
+        df = (
+            pl.scan_csv("source_data.csv")
+            .filter(pl.col("status") == "active")
+            .with_columns(
+                pl.col("value").fill_null(0),
+                pl.col("date").str.to_date("%Y-%m-%d")
+            )
+            .group_by("category")
+            .agg(pl.col("value").sum())
+            .collect()
+        )
+        df.write_parquet("cleaned_data.parquet")
+        ```
+
+        ## Data Quality Validation
+
+        ```python
+        import great_expectations as gx
+
+        context = gx.get_context()
+        validator = context.sources.pandas_default.read_csv("cleaned_data.csv")
+        validator.expect_column_values_to_not_be_null("id")
+        validator.expect_column_values_to_be_unique("id")
+        validator.expect_column_values_to_be_between("age", 0, 120)
+        results = validator.validate()
+        ```
+
+        ## Verification
+
+        Your pipeline is working when:
+        - Source data loads without errors
+        - Transformation produces expected row counts
+        - Data quality checks pass
+        - Output file is readable and contains expected data
+
+        ```python
+        result = pl.read_parquet("output.parquet")
+        assert len(result) > 0, "Output should have rows"
+        ```
+
+        ## Common Pitfalls
+
+        - **Data leakage**: Using future data in training sets
+        - **Silent nulls**: Empty strings vs NULL vs placeholder values
+        - **Schema drift**: Columns change without warning
+        - **Encoding issues**: UTF-8 vs Latin-1 in CSV files
   - id: full_stack_development
     name: Full-Stack Development
     human:
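The data-integration agent's instructions lean on DuckDB for the exploration step, and the implementationReference above shows only the raw SQL. A minimal sketch of running the same checks from Node with the `duckdb` npm client (an assumption; the YAML prescribes the SQL, not a client library):

```typescript
// explore.ts — sketch only; assumes the `duckdb` npm package is installed.
import duckdb from "duckdb";

const db = new duckdb.Database(":memory:"); // in-process, nothing to provision

// Inferred schema: column names and types as DuckDB sniffs them from the CSV.
db.all("DESCRIBE SELECT * FROM read_csv('data.csv')", (err, cols) => {
  if (err) throw err;
  console.table(cols);
});

// COUNT(col) skips NULLs, so gaps against COUNT(*) expose null density.
db.all(
  "SELECT COUNT(*) AS total, COUNT(id) AS ids, COUNT(email) AS emails FROM read_csv('data.csv')",
  (err, rows) => {
    if (err) throw err;
    console.log(rows[0]);
  }
);
```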
@@ -213,11 +309,12 @@ skills:
           polymathic engineering.
     agent:
       name: full-stack-development
-      description:
-        Guide for building complete solutions across the full technology
+      description: |
+        Guide for building complete solutions across the full technology
+        stack.
       useWhen: |
-
-        infrastructure layers.
+        Asked to implement features spanning frontend, backend, database,
+        and infrastructure layers.
       stages:
         specify:
           focus: |
@@ -235,60 +332,68 @@ skills:
             - Integration points are identified
             - Non-functional requirements are clear
         plan:
-          focus:
+          focus: |
+            Design the full-stack solution architecture. Define API
+            contracts and plan layer interactions.
           readChecklist:
-            - Define the API contract
-            -
+            - Define the API contract first
+            - Plan frontend and backend responsibilities
+            - Design database schema
             - Plan infrastructure requirements
-            - Identify cross-layer dependencies
           confirmChecklist:
             - API contract is defined
-            -
-            -
-            -
+            - Layer responsibilities are clear
+            - Database schema is planned
+            - Infrastructure approach is decided
         onboard:
           focus: |
-            Set up the full-stack development environment.
-
-
+            Set up the full-stack development environment. Install
+            frameworks, configure services, set up database, and verify
+            the development server runs.
           readChecklist:
-            - Install
-            -
-            - Start local database and
-            - Configure
-            -
-            -
+            - Install project dependencies (npm install, pip install)
+            - Configure environment variables in .env.local or .env
+            - Start local database and apply schema/migrations
+            - Configure linter, formatter, and pre-commit hooks
+            - Set up GitHub tokens for API access if needed
+            - Verify development server starts without errors
           confirmChecklist:
-            -
-            -
-            - Database
-            -
-            -
-            -
+            - All dependencies installed and versions locked
+            - Environment variables configured for local development
+            - Database running locally with schema applied
+            - Linter and formatter pass on existing code
+            - Development server starts and responds to requests
+            - CI pipeline configuration is valid
         code:
-          focus:
+          focus: |
+            Build vertically—complete one feature end-to-end before
+            starting another. Validates assumptions early.
           readChecklist:
-            - Implement
-            - Build frontend
-            -
-            - Configure infrastructure as
-            - Test across
+            - Implement API endpoints
+            - Build frontend integration
+            - Create database schema and queries
+            - Configure infrastructure as needed
+            - Test across layers
           confirmChecklist:
             - Frontend connects to backend correctly
             - Database schema supports the feature
             - Error handling spans all layers
             - Feature works end-to-end
+            - Deployment is automated
         review:
-          focus:
+          focus: |
+            Verify integration across layers and ensure deployment
+            readiness.
           readChecklist:
-            - Test
-            - Verify error handling
-            - Check deployment
-            -
+            - Test integration across all layers
+            - Verify error handling end-to-end
+            - Check deployment configuration
+            - Review documentation
           confirmChecklist:
-            -
-            - Deployment
-            -
+            - Integration tests pass
+            - Deployment verified
+            - Documentation is complete
+            - Feature is production-ready
         deploy:
           focus: |
             Deploy full-stack feature to production and verify end-to-end
@@ -305,39 +410,583 @@ skills:
             - No errors in monitoring
             - Performance meets requirements
       toolReferences:
-        - name:
-          url: https://
-          simpleIcon:
-          description:
-          useWhen:
-
-
-
-
+        - name: Supabase
+          url: https://supabase.com/docs
+          simpleIcon: supabase
+          description: Open source Firebase alternative with PostgreSQL
+          useWhen:
+            Building applications with PostgreSQL, auth, and real-time features
+        - name: Next.js
+          url: https://nextjs.org/docs
+          simpleIcon: nextdotjs
+          description: React framework for full-stack web applications
+          useWhen:
+            Building React applications with server-side rendering or API routes
+        - name: GitHub Actions
+          url: https://docs.github.com/en/actions
+          simpleIcon: githubactions
+          description: CI/CD and automation platform
+          useWhen: Automating builds, tests, and deployments
+        - name: Nixpacks
+          url: https://nixpacks.com/docs
+          simpleIcon: nixos
+          description: Build tool that auto-detects and builds applications
+          useWhen: Auto-building and deploying applications to containers
         - name: Colima
           url: https://github.com/abiosoft/colima
           simpleIcon: docker
-          description:
+          description:
+            Lightweight container runtime for macOS with Docker-compatible CLI
           useWhen:
-            Running containers locally, building images, or
-
+            Running containers locally for development, building images, or
+            testing containerized apps
+      instructions: |
+        ## Step 1: Configure Environment
+
+        Get connection details from `supabase status`. Create `.env.local`
+        with Supabase URL and anon key. Create the Supabase client module.
+
+        ## Step 2: Create Database Schema
+
+        Create a migration with `supabase migration new`, define the
+        SQL schema with RLS enabled, and apply with `supabase db push`.
+
+        ## Step 3: Build API Routes
+
+        Create Next.js API routes for GET and POST operations using
+        the Supabase client.
+
+        ## Step 4: Build Frontend
+
+        Create a React component that fetches from the API and renders
+        data. Start with a simple list display.
+
+        ## Step 5: Deploy
+
+        Use Nixpacks to auto-detect and build the image. Run it
+        locally with Colima's Docker-compatible runtime to verify
+        before deploying to production.
+      installScript: |
+        set -e
+        brew install colima
+        colima start
+        brew install supabase/tap/supabase || npm install -g supabase
+        npx create-next-app@latest my-app --typescript
+        cd my-app
+        supabase init
+        supabase start
+        npm install @supabase/supabase-js
+        colima status
       implementationReference: |
-        ##
+        ## Supabase Client Setup
+
+        ```typescript
+        // lib/supabase.ts
+        import { createClient } from '@supabase/supabase-js'
+
+        export const supabase = createClient(
+          process.env.NEXT_PUBLIC_SUPABASE_URL!,
+          process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY!
+        )
+        ```
+
+        ## Database Schema
+
+        ```sql
+        CREATE TABLE items (
+          id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
+          name TEXT NOT NULL,
+          description TEXT,
+          created_at TIMESTAMPTZ DEFAULT NOW()
+        );
+        ALTER TABLE items ENABLE ROW LEVEL SECURITY;
+        ```
+
+        ## API Route
+
+        ```typescript
+        // app/api/items/route.ts
+        import { supabase } from '@/lib/supabase'
+        import { NextResponse } from 'next/server'
+
+        export async function GET() {
+          const { data, error } = await supabase.from('items').select('*')
+          if (error) return NextResponse.json({ error }, { status: 500 })
+          return NextResponse.json(data)
+        }
+        ```
+
+        ## Frontend Component
+
+        ```typescript
+        // app/page.tsx
+        'use client'
+        import { useEffect, useState } from 'react'
+
+        export default function Home() {
+          const [items, setItems] = useState([])
+          useEffect(() => {
+            fetch('/api/items').then(r => r.json()).then(setItems)
+          }, [])
+          return (
+            <main>
+              <h1>Items</h1>
+              <ul>{items.map((item: any) => <li key={item.id}>{item.name}</li>)}</ul>
+            </main>
+          )
+        }
+        ```
+
+        ## Verification
+
+        Your full-stack app is working when:
+        - `npm run dev` starts without errors
+        - Frontend loads at http://localhost:3000
+        - API responds at http://localhost:3000/api/items
+        - Data persists in database (check Supabase Studio at http://localhost:54323)
+
+        ## Common Pitfalls
+
+        - **Missing env vars**: Supabase client fails silently
+        - **RLS without policies**: Queries return empty results
+        - **Type mismatch**: Generate types with `supabase gen types typescript`
+        - **Migration order**: Migrations apply alphabetically by filename
+
+        ## Local Container Testing with Colima
+
+        ```bash
+        # Start Colima (lightweight Docker-compatible runtime)
+        colima start
+
+        # Build with Nixpacks and run locally
+        nixpacks build . --name my-app
+        docker run --rm -p 3000:3000 --env-file .env.local my-app
+
+        # Verify app responds
+        curl http://localhost:3000
+        ```
+  - id: problem_discovery
+    name: Problem Discovery
+    human:
+      description:
+        Navigating undefined problem spaces to uncover real requirements through
+        observation and immersion. Where most engineers expect specifications,
+        FDEs embrace ambiguity—starting with open questions like "How can we
+        accelerate patient recruitment?" rather than detailed requirements
+        documents.
+      levelDescriptions:
+        awareness:
+          You recognize that initial requirements are often incomplete. You ask
+          clarifying questions when you encounter gaps and don't make
+          assumptions.
+        foundational:
+          You actively seek context beyond initial requirements, interview
+          stakeholders to understand "why" behind requests, and document
+          discovered constraints and assumptions.
+        working:
+          You navigate ambiguous problem spaces independently. You discover
+          requirements through observation and user shadowing, reframe problems
+          to find higher-value solutions, and distinguish symptoms from root
+          causes.
+        practitioner:
+          You seek out undefined problems rather than avoiding them. You embed
+          with users to discover latent needs, coach engineers in your area on
+          problem discovery techniques, and turn ambiguity into clear problem
+          statements.
+        expert:
+          You shape approaches to problem discovery across the business unit.
+          You are recognized for transforming ambiguous situations into clear
+          opportunities, influence how teams engage with business problems, and
+          are the go-to person for the most undefined challenges.
+    agent:
+      name: problem-discovery
+      description: |
+        Guide for navigating undefined problem spaces and uncovering real
+        requirements.
+      useWhen: |
+        Facing ambiguous requests, exploring user needs, or translating vague
+        asks into clear problem statements.
+      stages:
+        specify:
+          focus: |
+            Explore the problem space and document what is known.
+            Surface ambiguities and unknowns before attempting solutions.
+          readChecklist:
+            - Document the initial problem statement as understood
+            - List stakeholders and their perspectives
+            - Identify what is known vs unknown
+            - Document assumptions that need validation
+            - Mark all ambiguities with [NEEDS CLARIFICATION]
+          confirmChecklist:
+            - Initial problem statement is documented
+            - Stakeholders are identified
+            - Known vs unknown is explicit
+            - Assumptions are listed for validation
+        plan:
+          focus: |
+            Embrace ambiguity and explore the problem space. Understand
+            context deeply before proposing solutions.
+          readChecklist:
+            - Ask open-ended questions about goals and context
+            - Identify stakeholders and their needs
+            - Discover constraints and prior attempts
+            - Distinguish symptoms from root causes
+            - Write clear problem statement
+          confirmChecklist:
+            - Understand who has the problem
+            - Success criteria are clear
+            - Root cause identified, not just symptoms
+            - Constraints and assumptions documented
+            - Problem statement is validated
+        onboard:
+          focus: |
+            Set up the environment for solution implementation.
+            Install required tools, configure access to relevant
+            systems, and prepare workspace for development.
+          readChecklist:
+            - Install project dependencies from plan requirements
+            - Configure access to relevant data sources and APIs
+            - Set up environment variables and credentials
+            - Verify access to stakeholder communication channels
+            - Create workspace structure for documentation and code
+          confirmChecklist:
+            - All planned tools and dependencies are installed
+            - API keys and credentials are configured securely
+            - Workspace structure supports the planned approach
+            - Access to all required systems is verified
+            - Development environment matches plan requirements
+        code:
+          focus: |
+            Implement solution while staying connected to the original
+            problem. Validate assumptions as you build.
+          readChecklist:
+            - Build incrementally to validate understanding
+            - Check in with stakeholders frequently
+            - Adjust as new information emerges
+            - Document discovered requirements
+          confirmChecklist:
+            - Solution addresses the validated problem
+            - Stakeholder feedback is incorporated
+            - Discovered requirements are documented
+            - Scope boundaries are maintained
+        review:
+          focus: |
+            Verify solution addresses the real problem and stakeholders
+            agree on success.
+          readChecklist:
+            - Validate with original stakeholders
+            - Confirm problem is addressed
+            - Document learnings for future reference
+          confirmChecklist:
+            - Stakeholders confirm problem is solved
+            - Success criteria are met
+            - Learnings are documented
+        deploy:
+          focus: |
+            Release solution and verify it addresses the real problem
+            in production context.
+          readChecklist:
+            - Deploy solution to production
+            - Gather stakeholder feedback on live solution
+            - Monitor for unexpected usage patterns
+            - Document discovered requirements for future iterations
+          confirmChecklist:
+            - Solution is deployed
+            - Stakeholders have validated in production
+            - Usage patterns match expectations
+            - Learnings are captured
+      instructions: |
+        ## Discovery Process
+
+        ### 1. Embrace Ambiguity
+        - Don't rush to solutions
+        - Resist the urge to fill gaps with assumptions
+        - Ask open-ended questions
+        - Seek to understand context deeply
+
+        ### 2. Understand the Context
+        - Who are the stakeholders?
+        - What triggered this request?
+        - What has been tried before?
+        - What constraints exist?
+        - What does success look like?
+
+        ### 3. Find the Real Problem
+        - Ask "why" repeatedly (5 Whys technique)
+        - Distinguish wants from needs
+        - Identify root causes vs symptoms
+        - Challenge initial framing
+
+        ### 4. Validate Understanding
+        - Restate the problem in your own words
+        - Confirm with stakeholders
+        - Check for hidden assumptions
+        - Identify what's still unknown
+      implementationReference: |
+        ## Key Questions
+
+        ### Understanding Goals
+        - What outcome are you trying to achieve?
+        - How will you know if this succeeds?
+        - What happens if we do nothing?
+        - What's the deadline and why?
+
+        ### Understanding Context
+        - Who uses this and how?
+        - What's the current workaround?
+        - What constraints must we work within?
+        - What has been tried before?
+
+        ### Understanding Scope
+        - What's in scope vs out of scope?
+        - What's the minimum viable solution?
+        - What could we cut if needed?
+        - What can't we compromise on?
+
+        ## Problem Statement Template
+
+        A good problem statement answers:
+        - **Who** has this problem?
+        - **What** is the problem they face?
+        - **Why** does it matter?
+        - **When/Where** does it occur?
+        - **How** is it currently handled?
+
+        Format: "[User type] needs [capability] because [reason], but currently [obstacle]."
+
+        ## Common Pitfalls
+
+        - **Solutioning too early**: Jumping to "how" before understanding "what"
+        - **Taking requests literally**: Building what was asked, not what's needed
+        - **Assuming completeness**: Believing initial requirements are complete
+        - **Ignoring context**: Missing business or user context
+        - **Single perspective**: Only talking to one stakeholder
+  - id: rapid_prototyping
+    name: Rapid Prototyping & Validation
+    human:
+      description:
+        Building working solutions quickly to validate ideas and build trust
+        through delivery. Credibility comes from showing real software in days,
+        not months—demonstrating value before polishing details. "Working
+        solutions delivered in days" is the FDE standard.
+      levelDescriptions:
+        awareness:
+          You understand the value of prototypes for learning quickly. You can
+          create simple demos and mockups with guidance.
+        foundational:
+          You build functional prototypes to validate ideas, prioritize core
+          functionality over polish, and iterate based on user feedback. You
+          know the difference between prototype and production code.
+        working:
+          You deliver working solutions rapidly (days not weeks). You use
+          prototypes to build stakeholder trust, know when to stop prototyping
+          and start productionizing, and balance speed with appropriate quality.
+        practitioner:
+          You lead rapid delivery initiatives across teams in your area, coach
+          on prototype-first approaches, establish trust through consistent fast
+          delivery, and define clear criteria for prototype-to-production
+          transitions.
+        expert:
+          You shape culture around rapid validation and iterative delivery
+          across the business unit. You are recognized for transformative fast
+          delivery, define standards for prototype-to-production, and exemplify
+          the "deliver in days" mindset.
+    agent:
+      name: rapid-prototyping
+      description: |
+        Guide for building working prototypes quickly to validate ideas and
+        demonstrate feasibility.
+      useWhen: |
+        Asked to build a quick demo, proof of concept, MVP, or prototype
+        something rapidly.
+      stages:
+        specify:
+          focus: |
+            Define what the prototype must demonstrate and success criteria.
+            Scope ruthlessly—prototypes are for learning, not production.
+          readChecklist:
+            - Identify the key question or hypothesis to validate
+            - Document minimum acceptable demonstration
+            - Define what success looks like for this prototype
+            - Explicitly mark what is out of scope
+            - Mark any ambiguities with [NEEDS CLARIFICATION]
+          confirmChecklist:
+            - Key question to answer is clear
+            - Minimum viable demonstration is defined
+            - Success criteria are explicit
+            - Out of scope items are documented
+        plan:
+          focus: |
+            Define what the prototype needs to demonstrate and set
+            success criteria. Scope ruthlessly for speed.
+          readChecklist:
+            - Define the key question to answer
+            - Scope to minimum viable demonstration
+            - Identify what can be hardcoded or skipped
+            - Set time box for delivery
+          confirmChecklist:
+            - Success criteria are defined
+            - Scope is minimal and focused
+            - Time box is agreed
+            - It's clear this is a prototype
+        onboard:
+          focus: |
+            Set up the prototyping environment as fast as possible.
+            Use scaffolding tools, install minimal dependencies,
+            and get to a running state quickly.
+          readChecklist:
+            - Scaffold project using template or CLI tool
+            - Install only essential dependencies
+            - Configure minimal environment variables
+            - Start development server and verify it runs
+            - Skip non-essential tooling (linters, CI) for speed
+          confirmChecklist:
+            - Project scaffolded and running locally
+            - Core dependencies installed
+            - Development server responds to requests
+            - Ready to start building visible output immediately
+        code:
+          focus: |
+            Build the simplest thing that demonstrates the concept.
+            Prioritize visible progress over backend elegance.
+          readChecklist:
+            - Start with visible UI/output
+            - Hardcode values that would normally be configurable
+            - Skip edge cases that won't appear in demo
+            - Show progress frequently
+            - Document shortcuts taken
+          confirmChecklist:
+            - Core concept is demonstrable
+            - Happy path works end-to-end
+            - Known limitations are documented
+            - Stakeholders can interact with it
+        review:
+          focus: |
+            Validate prototype answers the original question. Decide
+            whether to iterate, productionize, or abandon.
+          readChecklist:
+            - Demo to stakeholders
+            - Gather feedback on the concept
+            - Decide next steps
+            - Document learnings
+          confirmChecklist:
+            - Stakeholders have seen the prototype
+            - Original question is answered
+            - Next steps are decided
+            - Learnings are captured
+        deploy:
+          focus: |
+            Make prototype accessible to stakeholders for evaluation.
+            Prototypes may not need production deployment.
+          readChecklist:
+            - Deploy to accessible environment (staging or demo)
+            - Share access with stakeholders
+            - Gather hands-on feedback
+            - Decide on next phase (iterate, productionize, or abandon)
+          confirmChecklist:
+            - Prototype is accessible to stakeholders
+            - Feedback has been gathered
+            - Decision on next steps is made
+            - Learnings are documented
+      toolReferences:
+        - name: Supabase
+          url: https://supabase.com/docs
+          simpleIcon: supabase
+          description: Open source Firebase alternative with PostgreSQL
+          useWhen: Instant PostgreSQL database with auth for rapid prototypes
+        - name: Next.js
+          url: https://nextjs.org/docs
+          simpleIcon: nextdotjs
+          description: React framework for full-stack web applications
+          useWhen: Scaffolding a full-stack prototype with server-side rendering
+        - name: Nixpacks
+          url: https://nixpacks.com/docs
+          simpleIcon: nixos
+          description: Build tool that auto-detects and builds applications
+          useWhen: Deploying prototypes to containers without writing Dockerfiles
+      instructions: |
+        ## Step 1: Define What to Demonstrate
+
+        Before writing code, answer: What question does this prototype
+        answer? What's the minimum to demonstrate the concept? What can
+        be hardcoded or skipped? When will you stop?
+
+        ## Step 2: Start with Visible Output
+
+        Build the UI first—stakeholders need to see something.
+        Hardcode data initially so you have working output in minutes.
+
+        ## Step 3: Add Real Data When Needed
+
+        Only add database when the UI needs real data. Use Supabase
+        Studio to create tables directly (skip migrations for prototypes).
+
+        ## Step 4: Document Shortcuts
+
+        Add a README section listing what was skipped and what's needed
+        to productionize. This prevents confusion later.
+      installScript: |
+        set -e
+        npx create-next-app@latest my-prototype --typescript
+        cd my-prototype
+        supabase init
+        supabase start
+        npm run dev
+      implementationReference: |
+        ## Start with Hardcoded UI
+
+        ```typescript
+        // app/page.tsx
+        export default function Home() {
+          const items = [
+            { id: 1, name: 'Demo Item 1' },
+            { id: 2, name: 'Demo Item 2' },
+          ]
+          return (
+            <main style={{ padding: '2rem' }}>
+              <h1>Prototype Demo</h1>
+              <ul>{items.map(item => <li key={item.id}>{item.name}</li>)}</ul>
+            </main>
+          )
+        }
+        ```
+
+        ## Replace with Real Data
+
+        ```typescript
+        import { supabase } from '@/lib/supabase'
+        const { data: items } = await supabase.from('items').select('*')
+        ```
+
+        ## Document Shortcuts
+
+        ```markdown
+        ## Prototype Limitations
+        This is a prototype for [purpose]. Not production-ready.
+
+        **Shortcuts taken:**
+        - No authentication
+        - Hardcoded configuration in code
+        - No error handling for edge cases
+
+        **To productionize:**
+        - Add authentication
+        - Move config to environment variables
+        - Add proper error handling
+        ```
 
-
-        - **JavaScript/TypeScript**: Frontend and Node.js backend
-        - **Python**: Backend APIs and data processing
+        ## Acceptable vs Required
 
-
-
-
-
+        | Acceptable to Skip | Still Required |
+        |-------------------|----------------|
+        | Authentication | Core functionality works |
+        | Error handling | Happy path is reliable |
+        | Migrations | It's clear this is a prototype |
+        | Tests | Limitations are documented |
 
-        ##
+        ## Common Pitfalls
 
-
-
-
-
-        | Database | Persistence, queries, migrations |
-        | Infrastructure | Deployment, scaling, monitoring |
+        - **Over-engineering**: Adding features "while you're at it"
+        - **No stopping point**: Polishing what you might throw away
+        - **Unclear purpose**: Building without knowing what question to answer
+        - **Hidden shortcuts**: Not documenting what was skipped