create-web-ai-service 1.0.0 → 1.0.2
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +495 -0
- package/package.json +2 -2
package/README.md
ADDED
@@ -0,0 +1,495 @@

# Web AI Service

A TypeScript-based workflow engine that creates dynamic API endpoints from YAML workflow definitions. Build powerful AI-powered APIs with LLM calls, custom code execution, and data transformations, all without writing server boilerplate.

## Features

- **YAML-Based Configuration** - Define API endpoints declaratively
- **Multi-LLM Support** - Built-in support for Gemini, OpenAI, Anthropic, and Grok
- **Custom Code Nodes** - Execute TypeScript functions in your workflows
- **Parallel Execution** - Run multiple nodes concurrently with error strategies
- **Data Transformation** - Reduce, split, and map data with JSONPath
- **Input Validation** - JSON Schema validation on request inputs
- **Type-Safe** - Full TypeScript support with strict typing
- **Auto-Routing** - Endpoint folders automatically become API routes
- **Plugin System** - Extensible with Supabase and custom plugins

---

## Table of Contents

1. [Quick Start](#quick-start)
2. [Project Structure](#project-structure)
3. [Creating Endpoints](#creating-endpoints)
4. [Node Types](#node-types)
5. [Using Plugins](#using-plugins)
6. [Configuration](#configuration)
7. [Commands Reference](#commands-reference)
8. [Troubleshooting](#troubleshooting)
9. [Documentation](#documentation)

---

## Quick Start

### Option 1: Create a New Project (Recommended)

The easiest way to start is with the scaffolder:

```bash
npx create-web-ai-service
```

You'll be prompted to:
1. **Enter your project name** - e.g., `my-api`
2. **Select plugins** - Choose from available plugins like Supabase

Or use command-line arguments for non-interactive setup:

```bash
npx create-web-ai-service my-api --plugins supabase
```

After scaffolding:

```bash
cd my-api
cp .env.example .env   # Configure your API keys
npm run dev            # Start the server
```

Your API is now running at `http://localhost:3000`!

### Option 2: Install as a Dependency

Add to an existing project:

```bash
npm install web-ai-service
```

### Option 3: Global Installation

```bash
npm install -g web-ai-service
web-ai-service   # Run from any directory with a src/endpoints folder
```

---

## Project Structure

When you create a new project, you'll get this structure:

```
my-api/
├── src/
│   ├── endpoints/               # Your API endpoints
│   │   └── hello/               # Example: GET /hello
│   │       ├── GET.yaml         # Workflow definition
│   │       ├── codes/           # TypeScript code nodes
│   │       │   └── format-greeting.ts
│   │       └── prompts/         # LLM system prompts
│   │           └── greeting-system.txt
│   │
│   └── plugins/                 # Shared code modules
│       └── supabase.ts          # (if selected during setup)
│
├── .env                         # Your API keys (gitignored)
├── .env.example                 # Template for environment variables
├── package.json
└── tsconfig.json
```
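The scaffolded `hello` endpoint pairs a workflow with the prompt and code files shown in the tree. A plausible sketch of its `GET.yaml`, following the same schema as the workflows documented below (the stage and node names here are illustrative assumptions, not taken from the actual template), could be:

```yaml
version: "1.0"

stages:
  - name: main
    nodes:
      greet:
        type: llm
        input: $input
        provider: gemini
        model: gemini-2.0-flash-lite
        systemMessages:
          - file: greeting-system.txt

  - name: format
    nodes:
      format_greeting:
        type: code
        input: main.greet
        file: format-greeting.ts
```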
### Key Concepts

| Concept | Description |
|---------|-------------|
| **Endpoint** | A folder in `src/endpoints/` that becomes an API route |
| **Workflow** | A YAML file (e.g., `POST.yaml`, `GET.yaml`) defining the processing pipeline |
| **Stage** | A sequential step in the workflow containing one or more nodes |
| **Node** | An individual processing unit (LLM call, code execution, etc.) |

### How Routing Works

| Folder Path | HTTP Method | API Route |
|-------------|-------------|-----------|
| `src/endpoints/hello/GET.yaml` | GET | `/hello` |
| `src/endpoints/summarize/POST.yaml` | POST | `/summarize` |
| `src/endpoints/users/profile/GET.yaml` | GET | `/users/profile` |
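The convention in the table can be modeled as a small function. This is only a sketch of the documented mapping, not the engine's actual router:

```typescript
// Illustrates the folder-to-route convention from the table above:
// strip the src/endpoints/ prefix, and the METHOD.yaml filename
// becomes the HTTP method while the remaining folders form the route.
function routeFor(filePath: string): { method: string; route: string } {
  const parts = filePath
    .replace(/^src\/endpoints\//, '')
    .split('/');
  const file = parts.pop() ?? '';           // e.g. "GET.yaml"
  const method = file.replace(/\.yaml$/, '');
  return { method, route: '/' + parts.join('/') };
}
```

For example, `routeFor('src/endpoints/users/profile/GET.yaml')` yields `{ method: 'GET', route: '/users/profile' }`, matching the last table row.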
---

## Creating Endpoints

### Basic Example: Text Summarization

Create a POST endpoint at `/summarize`:

**1. Create the folder structure:**
```bash
mkdir -p src/endpoints/summarize/{codes,prompts}
```

**2. Create the system prompt** (`src/endpoints/summarize/prompts/system.txt`):
```text
You are a concise summarization assistant. Summarize the provided text clearly in 2-3 paragraphs.
```

**3. Create the workflow** (`src/endpoints/summarize/POST.yaml`):
```yaml
version: "1.0"

stages:
  - name: main
    nodes:
      summarize:
        type: llm
        input: $input.text
        provider: gemini
        model: gemini-2.0-flash-lite
        temperature: 0.3
        maxTokens: 1024
        systemMessages:
          - file: system.txt
```

**4. Test it:**
```bash
curl -X POST http://localhost:3000/summarize \
  -H "Content-Type: application/json" \
  -d '{"text": "Long text to summarize..."}'
```
### Adding Input Validation with Code Nodes

Create a code node to validate inputs before processing:

**`src/endpoints/summarize/codes/validate.ts`:**
```typescript
import type { NodeOutput } from '@workflow/types';

interface SummarizeInput {
  text?: string;
}

export default async function (input: unknown): Promise<NodeOutput> {
  const body = input as SummarizeInput;

  if (!body.text || typeof body.text !== 'string') {
    throw new Error('Missing required field: text');
  }

  if (body.text.length < 10) {
    throw new Error('Text must be at least 10 characters');
  }

  return { type: 'string', value: body.text };
}
```

**Updated workflow with validation stage:**
```yaml
version: "1.0"

stages:
  - name: validate
    nodes:
      check_input:
        type: code
        input: $input
        file: validate.ts

  - name: summarize
    nodes:
      summary:
        type: llm
        input: validate.check_input   # Reference previous node output
        provider: gemini
        model: gemini-2.0-flash-lite
        systemMessages:
          - file: system.txt
```

### Multi-Stage Workflow Example

Chain multiple processing stages:

```yaml
version: "1.0"

stages:
  - name: extract
    nodes:
      parse_data:
        type: code
        input: $input
        file: extract-data.ts

  - name: analyze
    nodes:
      analyze_content:
        type: llm
        input: extract.parse_data
        provider: gemini
        model: gemini-2.0-flash-lite
        systemMessages:
          - file: analyzer-prompt.txt

  - name: format
    nodes:
      format_response:
        type: code
        input: analyze.analyze_content
        file: format-output.ts
```
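The final stage above references `format-output.ts`, which the example leaves undefined. A hedged sketch of what it might contain follows; the `NodeOutput` type is redeclared locally here (an assumption about its shape) so the snippet is self-contained, whereas in a real project it is imported from `@workflow/types`, and the envelope fields are purely illustrative:

```typescript
// Hypothetical format-output.ts for the multi-stage workflow above.
// NodeOutput is normally imported from '@workflow/types'; redeclared
// here (assumed shape) so the sketch runs standalone.
type NodeOutput = {
  type: 'string' | 'json' | 'number' | 'boolean' | 'array';
  value: unknown;
};

export default async function formatOutput(input: unknown): Promise<NodeOutput> {
  // The analyze stage hands over the LLM's output; wrap it in a
  // simple JSON envelope for the API response.
  const analysis = typeof input === 'string' ? input : JSON.stringify(input);
  return {
    type: 'json',
    value: { analysis, generatedAt: new Date().toISOString() },
  };
}
```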
### Parallel Node Execution

Run multiple LLM calls simultaneously within a stage:

```yaml
stages:
  - name: parallel_analysis
    nodes:
      sentiment:
        type: llm
        input: $input.text
        provider: gemini
        model: gemini-2.0-flash-lite
        systemMessages:
          - file: sentiment-prompt.txt

      keywords:
        type: llm
        input: $input.text
        provider: openai
        model: gpt-4o-mini
        systemMessages:
          - file: keywords-prompt.txt

  - name: combine
    nodes:
      merge:
        type: reduce
        inputs:
          - parallel_analysis.sentiment
          - parallel_analysis.keywords
        mapping:
          sentiment: $.0
          keywords: $.1
```
---

## Node Types

### LLM Node

Call an LLM provider:

```yaml
my_llm_node:
  type: llm
  input: $input.text         # or reference: stageName.nodeName
  provider: gemini           # gemini | openai | anthropic | grok
  model: gemini-2.0-flash-lite
  temperature: 0.7           # Optional (0.0-1.0)
  maxTokens: 1024            # Optional
  systemMessages:
    - file: prompt.txt
  cache: true                # Cache for performance
```

**Supported Providers & Models:**

| Provider | Example Models |
|----------|----------------|
| `gemini` | `gemini-2.0-flash-lite`, `gemini-2.0-flash`, `gemini-1.5-pro` |
| `openai` | `gpt-4o`, `gpt-4o-mini`, `gpt-4-turbo` |
| `anthropic` | `claude-3-5-sonnet-latest`, `claude-3-haiku-20240307` |
| `grok` | `grok-2`, `grok-2-mini` |

### Code Node

Execute custom TypeScript:

```yaml
my_code_node:
  type: code
  input: $input
  file: my-processor.ts
```

The TypeScript file must export a default async function:

```typescript
import type { NodeOutput } from '@workflow/types';

export default async function (input: unknown): Promise<NodeOutput> {
  // Your logic here
  return {
    type: 'json',   // 'string' | 'json' | 'number' | 'boolean' | 'array'
    value: { processed: true }
  };
}
```
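As a concrete (hypothetical) instance of that contract, a `my-processor.ts` that computes simple text statistics might look like this; again, `NodeOutput` is redeclared locally (an assumed shape) so the sketch is self-contained:

```typescript
// Hypothetical my-processor.ts: counts words and characters in the input.
// NodeOutput is redeclared here (assumed shape); real code nodes import
// it from '@workflow/types'.
type NodeOutput = {
  type: 'string' | 'json' | 'number' | 'boolean' | 'array';
  value: unknown;
};

export default async function textStats(input: unknown): Promise<NodeOutput> {
  const text = String(input ?? '');
  // Split on whitespace runs; an empty string counts as zero words.
  const words = text.trim() === '' ? 0 : text.trim().split(/\s+/).length;
  return {
    type: 'json',
    value: { words, characters: text.length },
  };
}
```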
### Reduce Node

Combine multiple node outputs:

```yaml
merge_results:
  type: reduce
  inputs:
    - stageName.node1
    - stageName.node2
  mapping:
    firstResult: $.0
    secondResult: $.1
```
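Reading the mapping syntax above, `$.N` appears to select the N-th entry of the inputs collected in listed order. A small model of that transformation, as an interpretation of the documented syntax rather than the engine's implementation:

```typescript
// Models the reduce mapping above: inputs are gathered in the order
// listed under `inputs`, and `$.N` picks index N. This is an
// interpretation of the documented syntax, not the engine's code.
function applyReduceMapping(
  inputs: unknown[],
  mapping: Record<string, string>,
): Record<string, unknown> {
  const result: Record<string, unknown> = {};
  for (const [key, path] of Object.entries(mapping)) {
    const index = Number(path.slice(2));   // "$.0" -> 0
    result[key] = inputs[index];
  }
  return result;
}
```

Applied to the parallel-analysis example, `applyReduceMapping(['positive', ['ai', 'yaml']], { sentiment: '$.0', keywords: '$.1' })` yields `{ sentiment: 'positive', keywords: ['ai', 'yaml'] }`.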
### Split Node

Divide output into named parts:

```yaml
split_data:
  type: split
  input: stageName.nodeName
  mapping:
    header: $.header
    body: $.content
    footer: $.footer
```

### Passthrough Node

Pass input directly to output:

```yaml
forward:
  type: passthrough
  input: $input
```

---

## Using Plugins

### Supabase Plugin

If you selected Supabase during project setup, you can use it in code nodes:

```typescript
import { supabase } from '@code-plugins/supabase.js';
import type { NodeOutput } from '@workflow/types';

export default async function (input: unknown): Promise<NodeOutput> {
  const { data, error } = await supabase
    .from('articles')
    .select('*')
    .limit(10);

  if (error) {
    throw new Error(`Database error: ${error.message}`);
  }

  return { type: 'json', value: data };
}
```

Configure in `.env`:
```bash
SUPABASE_URL=https://your-project.supabase.co
SUPABASE_ANON_KEY=your-anon-key
SUPABASE_SERVICE_KEY=your-service-key   # Optional
```

### Creating Custom Plugins

Add files to `src/plugins/` and import via `@code-plugins/*`:

```typescript
// src/plugins/my-helper.ts
export function formatDate(date: Date): string {
  return date.toISOString().split('T')[0];
}

// In any code node:
import { formatDate } from '@code-plugins/my-helper.js';
```
---

## Configuration

### Environment Variables

| Variable | Default | Description |
|----------|---------|-------------|
| `PORT` | `3000` | Server port |
| `LOG_LEVEL` | `info` | Logging level (`debug`, `info`, `warn`, `error`) |
| `LLM_TIMEOUT_MS` | `30000` | LLM request timeout |

### LLM Provider API Keys

You need at least one provider configured:

| Variable | Provider |
|----------|----------|
| `GEMINI_API_KEY` | Google Gemini |
| `OPENAI_API_KEY` | OpenAI |
| `ANTHROPIC_API_KEY` | Anthropic Claude |
| `GROK_API_KEY` | xAI Grok |

### Plugin-Specific Variables

| Variable | Plugin |
|----------|--------|
| `SUPABASE_URL` | Supabase |
| `SUPABASE_ANON_KEY` | Supabase |
| `SUPABASE_SERVICE_KEY` | Supabase (optional) |
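Putting the tables together, a minimal `.env` for a Gemini-only project with the Supabase plugin might look like this (placeholder values only; include only the provider keys and plugin variables your workflows actually use):

```bash
# Server settings (defaults shown)
PORT=3000
LOG_LEVEL=info
LLM_TIMEOUT_MS=30000

# At least one LLM provider key is required
GEMINI_API_KEY=your-gemini-key

# Only needed if the Supabase plugin was selected
SUPABASE_URL=https://your-project.supabase.co
SUPABASE_ANON_KEY=your-anon-key
```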
---

## Commands Reference

| Command | Description |
|---------|-------------|
| `npm run dev` | Start development server with hot reload |
| `npm run build` | Compile TypeScript to JavaScript |
| `npm start` | Start production server |
| `npm run validate` | Validate all workflows |
| `npm run create-endpoint` | Scaffold a new endpoint interactively |
| `npm run scan-deps` | Scan and install code node dependencies |
| `npm run lint` | Run ESLint |
| `npm run format` | Format code with Prettier |

---

## Troubleshooting

| Error | Solution |
|-------|----------|
| "Provider not found" | Check `provider` is valid and API key is set in `.env` |
| "Code node file not found" | Verify file exists in `codes/` folder with correct filename |
| "Cannot find module '@workflow/types'" | Run `npm run build` or restart TypeScript server |
| LLM timeout | Increase `LLM_TIMEOUT_MS` in `.env` or use a faster model |
| "SUPABASE_URL required" | Add Supabase credentials to `.env` |

---

## Documentation

For more detailed guides, see the `docs/` folder:

- [Getting Started](docs/getting-started.md) - Complete setup walkthrough
- [Creating Endpoints](docs/creating-endpoints.md) - Advanced endpoint patterns
- [Using Plugins](docs/using-plugins.md) - Plugin configuration and custom plugins
- [Configuration Reference](docs/configuration.md) - All environment options

## License

ISC
package/package.json
CHANGED

```diff
@@ -1,6 +1,6 @@
 {
   "name": "create-web-ai-service",
-  "version": "1.0.0",
+  "version": "1.0.2",
   "description": "CLI scaffolder for creating new web-ai-service projects",
   "type": "module",
   "main": "dist/index.js",
@@ -40,4 +40,4 @@
     "@types/node": "^20.11.5",
     "typescript": "^5.3.3"
   }
-}
+}
```