@cascadeflow/n8n-nodes-cascadeflow 0.5.0 → 0.5.2
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md
CHANGED
|
@@ -16,6 +16,16 @@
|
|
|
16
16
|
|
|
17
17
|
</div>
|
|
18
18
|
|
|
19
|
+
> **⚠️ IMPORTANT: Version Migration (v5.0.x → v0.5.0)**
|
|
20
|
+
>
|
|
21
|
+
> If you previously installed versions **5.0.1** through **5.0.7**, you need to **manually reinstall** to get v0.5.0:
|
|
22
|
+
>
|
|
23
|
+
> 1. Go to **Settings** > **Community Nodes** in n8n
|
|
24
|
+
> 2. **Uninstall** `@cascadeflow/n8n-nodes-cascadeflow`
|
|
25
|
+
> 3. **Reinstall** by searching for `@cascadeflow/n8n-nodes-cascadeflow`
|
|
26
|
+
>
|
|
27
|
+
> **Why?** Versions 5.0.x were published with incorrect version numbering. We've corrected this to align with the main cascadeflow release (v0.5.0). npm will not auto-upgrade from 5.0.x to 0.5.0 because it treats the change as a downgrade.
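Why npm treats this as a downgrade follows from semver ordering; a minimal sketch of the major/minor/patch comparison (illustrative only — npm itself uses the full `semver` package):

```javascript
// Compare two "major.minor.patch" strings the way semver orders them.
// Returns >0 if a is newer, <0 if a is older, 0 if equal.
function compareSemver(a, b) {
  const pa = a.split('.').map(Number);
  const pb = b.split('.').map(Number);
  for (let i = 0; i < 3; i++) {
    if (pa[i] !== pb[i]) return pa[i] - pb[i];
  }
  return 0;
}

// 5.0.7 sorts higher than 0.5.0, so installing 0.5.0 over 5.0.7 is a
// downgrade and npm's update check will not perform it automatically.
console.log(compareSemver('5.0.7', '0.5.0') > 0); // true
```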
|
|
28
|
+
|
|
19
29
|
---
|
|
20
30
|
|
|
21
31
|
**Intelligent AI model cascading for n8n workflows.**
|
|
@@ -26,6 +36,43 @@ This is an n8n community node that brings cascadeflow's intelligent AI model cas
|
|
|
26
36
|
|
|
27
37
|
[n8n](https://n8n.io/) is a fair-code licensed workflow automation platform.
|
|
28
38
|
|
|
39
|
+
## How It Works
|
|
40
|
+
|
|
41
|
+
The CascadeFlow node is a **Language Model sub-node** that sits between your AI models and downstream n8n nodes (like Basic LLM Chain, Chain, or any node that accepts Language Model inputs):
|
|
42
|
+
|
|
43
|
+

|
|
44
|
+
|
|
45
|
+
**Architecture:**
|
|
46
|
+
|
|
47
|
+
```
|
|
48
|
+
┌─────────────┐
|
|
49
|
+
│ Drafter │ (e.g., Claude Haiku, GPT-4o-mini)
|
|
50
|
+
│ AI Model │
|
|
51
|
+
└──────┬──────┘
|
|
52
|
+
│
|
|
53
|
+
├──────► ┌──────────────┐
|
|
54
|
+
│ │ CascadeFlow │
|
|
55
|
+
│ │ Node │ ────► ┌──────────────┐
|
|
56
|
+
│ └──────────────┘ │ Basic Chain │
|
|
57
|
+
├──────► Quality checks │ Chain │
|
|
58
|
+
│ Cascades if needed │ & more │
|
|
59
|
+
│ └──────────────┘
|
|
60
|
+
┌──────┴──────┐
|
|
61
|
+
│ Verifier │ (e.g., Claude Sonnet, GPT-4o)
|
|
62
|
+
│ AI Model │
|
|
63
|
+
└─────────────┘
|
|
64
|
+
```
|
|
65
|
+
|
|
66
|
+
**Flow:**
|
|
67
|
+
1. Query goes to cheap drafter model first
|
|
68
|
+
2. CascadeFlow validates the response quality
|
|
69
|
+
3. If quality passes → return drafter response (fast + cheap ✅)
|
|
70
|
+
4. If quality fails → escalate to verifier model (slower but accurate ⚠️)
|
|
71
|
+
|
|
72
|
+
**Result:** The drafter's response is accepted for 70-80% of queries, saving 40-85% on costs.
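The savings math behind these numbers can be sketched as follows, using the rough per-request cost estimates the node itself uses for logging ($0.0001 drafter, $0.0016 verifier); actual costs depend on your models and token counts:

```javascript
// Expected cost per request under cascading (rough illustrative numbers).
const drafterCost = 0.0001;   // cheap model, per request
const verifierCost = 0.0016;  // expensive model, per request
const acceptanceRate = 0.75;  // ~70-80% of drafts accepted

// Escalated requests pay for both the draft and the verification.
const cascadeCost =
  acceptanceRate * drafterCost +
  (1 - acceptanceRate) * (drafterCost + verifierCost);

const savings = 1 - cascadeCost / verifierCost;
console.log(savings.toFixed(2)); // "0.69" → roughly 69% cheaper than verifier-only
```

At a higher acceptance rate the savings approach the ~94% per-request figure the logs report for an accepted draft.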
|
|
73
|
+
|
|
74
|
+
> **ℹ️ Note:** CascadeFlow works with n8n Chain nodes but **not with AI Agent nodes**, as n8n only allows whitelisted models for Agent inputs. Use with Basic LLM Chain, Chain, or other nodes that accept Language Model connections.
|
|
75
|
+
|
|
29
76
|
## Installation
|
|
30
77
|
|
|
31
78
|
Follow the [installation guide](https://docs.n8n.io/integrations/community-nodes/installation/) in the n8n community nodes documentation.
|
|
@@ -51,290 +98,345 @@ For Docker-based deployments add the following line before the font installation
|
|
|
51
98
|
RUN cd /usr/local/lib/node_modules/n8n && npm install @cascadeflow/n8n-nodes-cascadeflow
|
|
52
99
|
```
|
|
53
100
|
|
|
54
|
-
##
|
|
55
|
-
|
|
56
|
-
The cascadeflow node supports these operations:
|
|
101
|
+
## Quick Start
|
|
57
102
|
|
|
58
|
-
###
|
|
103
|
+
### Basic Setup
|
|
59
104
|
|
|
60
|
-
|
|
105
|
+
1. **Add two AI Chat Model nodes** (e.g., OpenAI Chat Model, Anthropic Chat Model, Ollama Chat Model)
|
|
106
|
+
- Configure one as your **drafter** (cheap model like `gpt-4o-mini` or `claude-3-5-haiku-20241022`)
|
|
107
|
+
- Configure one as your **verifier** (powerful model like `gpt-4o` or `claude-3-5-sonnet-20241022`)
|
|
61
108
|
|
|
62
|
-
**
|
|
63
|
-
|
|
64
|
-
|
|
65
|
-
|
|
66
|
-
4. If quality fails → escalate to verifier model (e.g., GPT-5) ⚠️
|
|
109
|
+
2. **Add the CascadeFlow node**
|
|
110
|
+
- Connect the drafter model to the **Drafter** input
|
|
111
|
+
- Connect the verifier model to the **Verifier** input
|
|
112
|
+
- Adjust the **Quality Threshold** (default: 0.7)
|
|
67
113
|
|
|
68
|
-
**
|
|
114
|
+
3. **Connect to a Chain node**
|
|
115
|
+
- The CascadeFlow node outputs a Language Model connection
|
|
116
|
+
- Connect it to nodes that accept AI models (Basic LLM Chain, Chain, etc.)
|
|
117
|
+
- **Note:** Does not work with AI Agent nodes (n8n limitation)
|
|
69
118
|
|
|
70
|
-
###
|
|
119
|
+
### Example Workflow
|
|
71
120
|
|
|
72
|
-
|
|
121
|
+
```
|
|
122
|
+
┌──────────────────┐
|
|
123
|
+
│ When chat │
|
|
124
|
+
│ message received │
|
|
125
|
+
└────────┬─────────┘
|
|
126
|
+
│
|
|
127
|
+
v
|
|
128
|
+
┌──────────────────┐ ┌──────────────────┐
|
|
129
|
+
│ OpenAI Model │──────►│ │
|
|
130
|
+
│ gpt-4o-mini │ │ CascadeFlow │ ┌──────────────────┐
|
|
131
|
+
└──────────────────┘ │ Node │──────►│ Basic LLM Chain │
|
|
132
|
+
│ │ │ │
|
|
133
|
+
┌──────────────────┐ │ Threshold: 0.7 │ └──────────────────┘
|
|
134
|
+
│ OpenAI Model │──────►│ │
|
|
135
|
+
│ gpt-4o │ └──────────────────┘
|
|
136
|
+
└──────────────────┘
|
|
137
|
+
```
|
|
73
138
|
|
|
74
139
|
## Configuration
|
|
75
140
|
|
|
76
|
-
###
|
|
141
|
+
### Node Parameters
|
|
142
|
+
|
|
143
|
+
#### Quality Threshold (0-1)
|
|
77
144
|
|
|
78
|
-
|
|
145
|
+
Controls how aggressively to accept drafter responses:
|
|
79
146
|
|
|
80
|
-
- **
|
|
81
|
-
- **
|
|
82
|
-
- **
|
|
83
|
-
- **Together AI API Key** - For open-source models
|
|
84
|
-
- **HuggingFace API Key** - For HuggingFace models
|
|
147
|
+
- **0.5-0.6**: Very aggressive (maximum cost savings, ~80-90% acceptance)
|
|
148
|
+
- **0.7** (default): Balanced (good quality + savings, ~70-80% acceptance)
|
|
149
|
+
- **0.8-0.9**: Conservative (highest quality, ~50-60% acceptance)
|
|
85
150
|
|
|
86
|
-
|
|
151
|
+
Lower threshold = more cost savings, higher threshold = better quality assurance.
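At its core the threshold is a simple comparison against the validator's confidence score; a minimal sketch (the real quality validator derives confidence from several signals, not shown here):

```javascript
// Accept the draft when its confidence clears the configured threshold,
// otherwise escalate to the verifier model.
function cascadeDecision(confidence, threshold = 0.7) {
  return confidence >= threshold ? 'accept drafter' : 'escalate to verifier';
}

console.log(cascadeDecision(0.85, 0.7)); // accept drafter
console.log(cascadeDecision(0.62, 0.7)); // escalate to verifier
```

These two cases correspond to the fast-path and slow-path log examples shown below in Flow Visualization.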
|
|
87
152
|
|
|
88
|
-
|
|
153
|
+
## Flow Visualization
|
|
89
154
|
|
|
90
|
-
|
|
91
|
-
|
|
92
|
-
- **Message**: The query or prompt to send to AI
|
|
93
|
-
- **Draft Model**: Configuration for the cheap model
|
|
94
|
-
- Provider (OpenAI, Anthropic, Groq, etc.)
|
|
95
|
-
- Model name (e.g., `claude-3-5-haiku-20241022`, `gpt-4o-mini`)
|
|
96
|
-
- Cost per 1K tokens
|
|
97
|
-
- **Verifier Model**: Configuration for the expensive model
|
|
98
|
-
- Provider
|
|
99
|
-
- Model name (e.g., `gpt-5`, `claude-3-5-sonnet-20241022`)
|
|
100
|
-
- Cost per 1K tokens
|
|
101
|
-
|
|
102
|
-
#### Optional Parameters
|
|
103
|
-
|
|
104
|
-
- **Quality Settings**
|
|
105
|
-
- Quality Threshold (0-1): Minimum score to accept draft
|
|
106
|
-
- Require Validation: Whether to validate before accepting
|
|
107
|
-
- **Advanced Options**
|
|
108
|
-
- Max Tokens: Maximum tokens to generate
|
|
109
|
-
- Temperature: Sampling temperature (0-2)
|
|
110
|
-
- System Prompt: Optional system instructions
|
|
111
|
-
- **Output Mode**: What data to return
|
|
112
|
-
- Full Metrics: All cascade diagnostics
|
|
113
|
-
- Content Only: Just the AI response
|
|
114
|
-
- Metrics Summary: Response + key metrics
|
|
155
|
+
### Viewing Cascade Decisions in Real-Time
|
|
115
156
|
|
|
116
|
-
|
|
157
|
+
CascadeFlow provides detailed logging of every cascade decision in n8n's UI:
|
|
158
|
+
|
|
159
|
+
1. **Execute your workflow** with the CascadeFlow node
|
|
160
|
+
2. **Click on the downstream Chain node** after execution (the node that receives CascadeFlow output)
|
|
161
|
+
3. **Navigate to the "Logs" tab**
|
|
117
162
|
|
|
118
|
-
|
|
163
|
+
You'll see detailed flow information like:
|
|
119
164
|
|
|
120
165
|
```
|
|
121
|
-
|
|
122
|
-
|
|
123
|
-
|
|
124
|
-
|
|
125
|
-
|
|
126
|
-
|
|
127
|
-
|
|
128
|
-
|
|
129
|
-
|
|
130
|
-
|
|
131
|
-
|
|
132
|
-
|
|
166
|
+
🎯 CascadeFlow: Trying drafter model...
|
|
167
|
+
📊 Quality validation: confidence=0.85, method=heuristic
|
|
168
|
+
🎯 Alignment: 0.82
|
|
169
|
+
|
|
170
|
+
┌─────────────────────────────────────────┐
|
|
171
|
+
│ ✅ FLOW: DRAFTER ACCEPTED (FAST PATH) │
|
|
172
|
+
└─────────────────────────────────────────┘
|
|
173
|
+
Query → Drafter → Quality Check ✅ → Response
|
|
174
|
+
⚡ Fast & Cheap: Used drafter model only
|
|
175
|
+
Confidence: 0.85 (threshold: 0.70)
|
|
176
|
+
Quality score: 0.85
|
|
177
|
+
Latency: 420ms
|
|
178
|
+
💰 Cost savings: ~93.8% (used cheap model)
|
|
179
|
+
📊 Stats: 7 drafter, 2 verifier
|
|
133
180
|
```
|
|
134
181
|
|
|
135
|
-
|
|
182
|
+
Or when escalating:
|
|
136
183
|
|
|
137
184
|
```
|
|
138
|
-
|
|
139
|
-
|
|
140
|
-
|
|
141
|
-
|
|
142
|
-
|
|
143
|
-
|
|
144
|
-
|
|
185
|
+
🎯 CascadeFlow: Trying drafter model...
|
|
186
|
+
📊 Quality validation: confidence=0.62, method=heuristic
|
|
187
|
+
|
|
188
|
+
┌────────────────────────────────────────────────┐
|
|
189
|
+
│ ⚠️ FLOW: ESCALATED TO VERIFIER (SLOW PATH) │
|
|
190
|
+
└────────────────────────────────────────────────┘
|
|
191
|
+
Query → Drafter → Quality Check ❌ → Verifier → Response
|
|
192
|
+
🔄 Escalating: Drafter quality too low, using verifier
|
|
193
|
+
Confidence: 0.62 < 0.70 (threshold)
|
|
194
|
+
Reason: Simple check failed (confidence: 0.62 < 0.70)
|
|
195
|
+
Drafter latency: 380ms
|
|
196
|
+
🔄 Loading verifier model...
|
|
197
|
+
✅ Verifier completed successfully
|
|
198
|
+
Verifier latency: 890ms
|
|
199
|
+
Total latency: 1270ms (drafter: 380ms + verifier: 890ms)
|
|
200
|
+
💰 Cost: Full verifier cost (0% savings this request)
|
|
201
|
+
📊 Stats: 7 drafter (77.8%), 2 verifier
|
|
145
202
|
```
|
|
146
203
|
|
|
147
|
-
###
|
|
204
|
+
### What the Logs Show
|
|
205
|
+
|
|
206
|
+
- **Flow path taken**: Drafter accepted, escalated to verifier, or error fallback
|
|
207
|
+
- **Quality scores**: Confidence level and alignment scores
|
|
208
|
+
- **Latency breakdown**: Time spent on each model
|
|
209
|
+
- **Cost analysis**: Savings percentage for each request
|
|
210
|
+
- **Running statistics**: Acceptance rate across all requests
|
|
211
|
+
|
|
212
|
+
## Recommended Model Configurations
|
|
213
|
+
|
|
214
|
+
### ⭐ Best Overall: Claude Haiku + GPT-4o (Recommended)
|
|
148
215
|
|
|
149
216
|
```
|
|
150
|
-
|
|
151
|
-
|
|
152
|
-
|
|
153
|
-
|
|
154
|
-
|
|
155
|
-
↓
|
|
156
|
-
cascadeflow Node: Proofread and edit
|
|
157
|
-
↓
|
|
158
|
-
Notion Node: Save to content calendar
|
|
217
|
+
Drafter: claude-3-5-haiku-20241022
|
|
218
|
+
Verifier: gpt-4o
|
|
219
|
+
Savings: ~73% average
|
|
220
|
+
Why: Haiku's fast, high-quality drafts + GPT-4o's reasoning
|
|
221
|
+
Use for: General purpose, coding, reasoning, complex queries
|
|
159
222
|
```
|
|
160
223
|
|
|
161
|
-
###
|
|
224
|
+
### Anthropic Only (High Quality)
|
|
162
225
|
|
|
163
226
|
```
|
|
164
|
-
|
|
165
|
-
|
|
166
|
-
|
|
167
|
-
|
|
168
|
-
"type": "function",
|
|
169
|
-
"function": {
|
|
170
|
-
"name": "get_weather",
|
|
171
|
-
"description": "Get current weather",
|
|
172
|
-
"parameters": {
|
|
173
|
-
"type": "object",
|
|
174
|
-
"properties": {
|
|
175
|
-
"location": { "type": "string" }
|
|
176
|
-
}
|
|
177
|
-
}
|
|
178
|
-
}
|
|
179
|
-
}
|
|
180
|
-
]
|
|
181
|
-
|
|
182
|
-
Output:
|
|
183
|
-
toolCalls: [
|
|
184
|
-
{
|
|
185
|
-
"id": "call_abc123",
|
|
186
|
-
"function": {
|
|
187
|
-
"name": "get_weather",
|
|
188
|
-
"arguments": "{\"location\": \"Paris\"}"
|
|
189
|
-
}
|
|
190
|
-
}
|
|
191
|
-
]
|
|
227
|
+
Drafter: claude-3-5-haiku-20241022
|
|
228
|
+
Verifier: claude-3-5-sonnet-20241022
|
|
229
|
+
Savings: ~70% average
|
|
230
|
+
Why: Consistent Anthropic experience, excellent quality
|
|
192
231
|
```
|
|
193
232
|
|
|
194
|
-
|
|
195
|
-
|
|
196
|
-
|
|
197
|
-
|
|
198
|
-
|
|
199
|
-
|
|
200
|
-
|
|
201
|
-
"modelUsed": "gpt-4o-mini",
|
|
202
|
-
"totalCost": 0.000211,
|
|
203
|
-
"savingsPercentage": 97.8,
|
|
204
|
-
"latencyMs": 820,
|
|
205
|
-
"cascaded": true,
|
|
206
|
-
"draftAccepted": true,
|
|
207
|
-
"complexity": "simple",
|
|
208
|
-
"routingStrategy": "cascade",
|
|
209
|
-
"qualityScore": 0.89,
|
|
210
|
-
"draftCost": 0.000211,
|
|
211
|
-
"verifierCost": 0.0,
|
|
212
|
-
"summary": {
|
|
213
|
-
"saved": "97.8%",
|
|
214
|
-
"cost": "$0.000211",
|
|
215
|
-
"model": "gpt-4o-mini",
|
|
216
|
-
"speed": "820ms",
|
|
217
|
-
"status": "✅ Draft accepted"
|
|
218
|
-
}
|
|
219
|
-
}
|
|
233
|
+
### OpenAI Only (Good Balance)
|
|
234
|
+
|
|
235
|
+
```
|
|
236
|
+
Drafter: gpt-4o-mini
|
|
237
|
+
Verifier: gpt-4o
|
|
238
|
+
Savings: ~85% average
|
|
239
|
+
Why: Both models from the same provider, great cost efficiency
|
|
220
240
|
```
|
|
221
241
|
|
|
222
|
-
###
|
|
242
|
+
### Ultra Fast with Ollama (Local)
|
|
223
243
|
|
|
224
|
-
```json
|
|
225
|
-
{
|
|
226
|
-
"content": "AI response here..."
|
|
227
|
-
}
|
|
228
244
|
```
|
|
229
|
-
|
|
230
|
-
|
|
231
|
-
|
|
232
|
-
|
|
233
|
-
|
|
234
|
-
"content": "AI response here...",
|
|
235
|
-
"modelUsed": "gpt-4o-mini",
|
|
236
|
-
"totalCost": 0.000211,
|
|
237
|
-
"savingsPercentage": 97.8,
|
|
238
|
-
"cascaded": true,
|
|
239
|
-
"draftAccepted": true,
|
|
240
|
-
"latencyMs": 820
|
|
241
|
-
}
|
|
245
|
+
Drafter: ollama/qwen2.5:3b (local)
|
|
246
|
+
Verifier: gpt-4o (cloud)
|
|
247
|
+
Savings: ~99% on drafter calls (no API cost)
|
|
248
|
+
Why: Local model for drafts, cloud for verification
|
|
249
|
+
Note: Requires Ollama installed locally
|
|
242
250
|
```
|
|
243
251
|
|
|
244
252
|
## Cost Savings Examples
|
|
245
253
|
|
|
246
|
-
**
|
|
254
|
+
**Example: Claude Haiku + GPT-4o**
|
|
247
255
|
|
|
248
|
-
|
|
|
249
|
-
|
|
250
|
-
| Simple Q&A (
|
|
251
|
-
| Complex query (
|
|
252
|
-
| **Average** | **$0.
|
|
256
|
+
| Scenario | Traditional (GPT-4o only) | CascadeFlow (Haiku + GPT-4o) | Savings |
|
|
257
|
+
|----------|---------------------------|------------------------------|---------|
|
|
258
|
+
| Simple Q&A (75% acceptance) | $0.0025 | $0.0008 | 68% |
|
|
259
|
+
| Complex query (25% escalation) | $0.0025 | $0.0025 | 0% (correctly escalated) |
|
|
260
|
+
| **Average** | **$0.0025** | **$0.00115** | **54%** |
|
|
253
261
|
|
|
254
262
|
**Monthly savings (10,000 queries):**
|
|
255
|
-
- Traditional (GPT-
|
|
256
|
-
-
|
|
257
|
-
- **You save: $
|
|
263
|
+
- Traditional (GPT-4o only): $25.00
|
|
264
|
+
- CascadeFlow (Haiku + GPT-4o): $11.50
|
|
265
|
+
- **You save: $13.50/month** (54% savings)
|
|
258
266
|
|
|
259
267
|
**Monthly savings (100,000 queries):**
|
|
260
|
-
- Traditional (GPT-
|
|
261
|
-
-
|
|
262
|
-
- **You save: $
|
|
268
|
+
- Traditional (GPT-4o only): $250.00
|
|
269
|
+
- CascadeFlow (Haiku + GPT-4o): $115.00
|
|
270
|
+
- **You save: $135.00/month** (54% savings)
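The monthly figures follow directly from the per-request averages in the table above; a quick check:

```javascript
// Monthly cost projection from the per-request averages above.
const traditionalPerQuery = 0.0025;  // GPT-4o only
const cascadePerQuery = 0.00115;     // blended Haiku + GPT-4o average

function monthlySavings(queries) {
  const traditional = queries * traditionalPerQuery;
  const cascade = queries * cascadePerQuery;
  // Round to cents to avoid floating-point noise.
  return Math.round((traditional - cascade) * 100) / 100;
}

console.log(monthlySavings(10000));  // 13.5  → $13.50/month
console.log(monthlySavings(100000)); // 135   → $135.00/month
```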
|
|
263
271
|
|
|
264
|
-
|
|
265
|
-
|
|
266
|
-
## Recommended Model Configurations
|
|
272
|
+
## Example Workflows
|
|
267
273
|
|
|
268
|
-
###
|
|
274
|
+
### Customer Support Bot
|
|
269
275
|
|
|
270
276
|
```
|
|
271
|
-
|
|
272
|
-
|
|
273
|
-
|
|
274
|
-
|
|
275
|
-
|
|
277
|
+
┌──────────────────┐
|
|
278
|
+
│ Webhook │ ← Customer question
|
|
279
|
+
│ (POST /support) │
|
|
280
|
+
└────────┬─────────┘
|
|
281
|
+
│
|
|
282
|
+
v
|
|
283
|
+
┌─────────────────────────────────────┐
|
|
284
|
+
│ Claude Haiku ────┐ │
|
|
285
|
+
│ │ CascadeFlow │ ┌──────────────────┐
|
|
286
|
+
│ Claude Sonnet ───┴─► Node │──────►│ Basic Chain │
|
|
287
|
+
└─────────────────────────────────────┘ │ (responds) │
|
|
288
|
+
└──────┬───────────┘
|
|
289
|
+
│
|
|
290
|
+
v
|
|
291
|
+
┌──────────────────┐
|
|
292
|
+
│ Send Response │
|
|
293
|
+
└──────────────────┘
|
|
276
294
|
```
|
|
277
295
|
|
|
278
|
-
|
|
279
|
-
|
|
280
|
-
### OpenAI Only (Good Balance)
|
|
296
|
+
### Content Generation
|
|
281
297
|
|
|
282
298
|
```
|
|
283
|
-
|
|
284
|
-
|
|
285
|
-
|
|
286
|
-
|
|
287
|
-
|
|
299
|
+
┌──────────────────┐
|
|
300
|
+
│ Schedule Trigger │ ← Daily at 9am
|
|
301
|
+
│ (Daily) │
|
|
302
|
+
└────────┬─────────┘
|
|
303
|
+
│
|
|
304
|
+
v
|
|
305
|
+
┌────────────────────────────────────────┐
|
|
306
|
+
│ GPT-4o-mini ─────┐ │
|
|
307
|
+
│ │ CascadeFlow │ ┌──────────────────┐
|
|
308
|
+
│ GPT-4o ──────────┴─► Node │──────►│ Basic Chain │
|
|
309
|
+
└────────────────────────────────────────┘ │ (generates) │
|
|
310
|
+
└──────┬───────────┘
|
|
311
|
+
│
|
|
312
|
+
v
|
|
313
|
+
┌──────────────────┐
|
|
314
|
+
│ Save to Notion │
|
|
315
|
+
└──────────────────┘
|
|
288
316
|
```
|
|
289
317
|
|
|
290
|
-
###
|
|
318
|
+
### Code Review Assistant
|
|
291
319
|
|
|
292
320
|
```
|
|
293
|
-
|
|
294
|
-
|
|
295
|
-
|
|
296
|
-
|
|
321
|
+
┌──────────────────┐
|
|
322
|
+
│ GitHub Trigger │ ← New PR
|
|
323
|
+
│ (PR opened) │
|
|
324
|
+
└────────┬─────────┘
|
|
325
|
+
│
|
|
326
|
+
v
|
|
327
|
+
┌─────────────────────────────────────┐
|
|
328
|
+
│ Ollama qwen2.5 ──┐ │
|
|
329
|
+
│ │ CascadeFlow │ ┌──────────────────┐
|
|
330
|
+
│ GPT-4o ──────────┴─► Node │──────►│ Basic Chain │
|
|
331
|
+
└─────────────────────────────────────┘ │ (reviews code) │
|
|
332
|
+
└──────┬───────────┘
|
|
333
|
+
│
|
|
334
|
+
v
|
|
335
|
+
┌──────────────────┐
|
|
336
|
+
│ Post Comment │
|
|
337
|
+
└──────────────────┘
|
|
297
338
|
```
|
|
298
339
|
|
|
299
|
-
|
|
340
|
+
## UI Visualization Note
|
|
300
341
|
|
|
301
|
-
|
|
302
|
-
Draft: groq/llama-3.1-8b-instant ($0.00005)
|
|
303
|
-
Verifier: gpt-5 ($0.00125)
|
|
304
|
-
Savings: ~75-85%
|
|
305
|
-
Why: Groq's instant speed + GPT-5's reasoning power
|
|
306
|
-
Note: Highest cost savings, best for high-volume workloads
|
|
307
|
-
```
|
|
342
|
+
⚠️ **Important:** Due to n8n's rendering conventions, the node visualization always highlights the **Drafter** connection as active (green), regardless of which model was actually used at runtime. This is because n8n highlights the first input in a sub-node's definition.
|
|
308
343
|
|
|
309
|
-
|
|
344
|
+
**This does not affect functionality**: the cascade logic works correctly. The drafter is always tried first, and the verifier is only loaded when needed.
|
|
310
345
|
|
|
311
|
-
|
|
312
|
-
|
|
313
|
-
|
|
314
|
-
|
|
315
|
-
|
|
316
|
-
|
|
346
|
+
**To see the actual cascade flow and which model was used:**
|
|
347
|
+
|
|
348
|
+
1. Execute your workflow
|
|
349
|
+
2. Click on the downstream Chain node after execution (the node that receives CascadeFlow output)
|
|
350
|
+
3. Navigate to the **"Logs"** tab
|
|
351
|
+
4. You'll see detailed flow information showing:
|
|
352
|
+
- Whether the drafter was accepted or escalated to verifier
|
|
353
|
+
- Quality confidence scores and validation methods
|
|
354
|
+
- Latency breakdown for each model
|
|
355
|
+
- Cost savings percentage
|
|
356
|
+
- Running statistics across all executions
|
|
357
|
+
|
|
358
|
+
The logs provide complete visibility into the cascade decision-making process, showing exactly which path was taken for each request.
|
|
359
|
+
|
|
360
|
+
> **ℹ️ Important:** CascadeFlow does **not work with AI Agent nodes** in n8n, as n8n only allows whitelisted models for Agent inputs. Use with Basic LLM Chain, Chain, or other nodes that accept Language Model connections.
|
|
317
361
|
|
|
318
362
|
## Compatibility
|
|
319
363
|
|
|
320
|
-
|
|
364
|
+
- **n8n version**: 1.0+
|
|
365
|
+
- **Works with any AI Chat Model node** in n8n:
|
|
366
|
+
- OpenAI Chat Model
|
|
367
|
+
- Anthropic Chat Model
|
|
368
|
+
- Ollama Chat Model
|
|
369
|
+
- Azure OpenAI Chat Model
|
|
370
|
+
- Google PaLM Chat Model
|
|
371
|
+
- And more...
|
|
372
|
+
|
|
373
|
+
## Troubleshooting
|
|
374
|
+
|
|
375
|
+
### Issue: "Drafter model is required"
|
|
376
|
+
|
|
377
|
+
**Solution:** Make sure you've connected an AI Chat Model to the **Drafter** input (second input, bottom position).
|
|
378
|
+
|
|
379
|
+
### Issue: "Verifier model is required"
|
|
380
|
+
|
|
381
|
+
**Solution:** Make sure you've connected an AI Chat Model to the **Verifier** input (first input, top position).
|
|
382
|
+
|
|
383
|
+
### Issue: Not seeing cascade logs
|
|
384
|
+
|
|
385
|
+
**Solution:**
|
|
386
|
+
1. Make sure your workflow has executed successfully
|
|
387
|
+
2. Click on the **Chain node that receives the CascadeFlow output** (Basic LLM Chain, Chain, etc.)
|
|
388
|
+
3. Navigate to the **"Logs"** tab (not the "Output" tab)
|
|
389
|
+
4. The logs appear in the downstream node, not the CascadeFlow node itself
|
|
390
|
+
|
|
391
|
+
### Issue: "This node cannot be connected" when connecting to AI Agent
|
|
392
|
+
|
|
393
|
+
**Solution:** This is expected. CascadeFlow does **not work with AI Agent nodes** because n8n only allows whitelisted models for Agent inputs. Use CascadeFlow with:
|
|
394
|
+
- ✅ Basic LLM Chain
|
|
395
|
+
- ✅ Chain
|
|
396
|
+
- ✅ Other nodes that accept Language Model connections
|
|
397
|
+
- ❌ AI Agent (not supported)
|
|
398
|
+
|
|
399
|
+
### Issue: Always escalating to verifier
|
|
400
|
+
|
|
401
|
+
**Solution:**
|
|
402
|
+
1. Check your Quality Threshold setting (try lowering to 0.6-0.65)
|
|
403
|
+
2. Verify your drafter model is actually a cheaper/faster model
|
|
404
|
+
3. Check the logs to see the confidence scores being reported
|
|
321
405
|
|
|
322
406
|
## Resources
|
|
323
407
|
|
|
324
408
|
- [n8n community nodes documentation](https://docs.n8n.io/integrations/community-nodes/)
|
|
325
409
|
- [cascadeflow GitHub](https://github.com/lemony-ai/cascadeflow)
|
|
326
|
-
- [cascadeflow Documentation](https://github.com/lemony-ai/cascadeflow)
|
|
410
|
+
- [cascadeflow Documentation](https://github.com/lemony-ai/cascadeflow/blob/main/README.md)
|
|
411
|
+
- [Full n8n Integration Guide](https://github.com/lemony-ai/cascadeflow/blob/main/docs/guides/n8n_integration.md)
|
|
327
412
|
|
|
328
413
|
## License
|
|
329
414
|
|
|
330
415
|
[MIT](https://github.com/lemony-ai/cascadeflow/blob/main/LICENSE)
|
|
331
416
|
|
|
332
|
-
## Version
|
|
417
|
+
## Version History
|
|
418
|
+
|
|
419
|
+
### v0.5.0 (Latest)
|
|
420
|
+
|
|
421
|
+
**Major Release - Aligned with cascadeflow v0.5.0**
|
|
422
|
+
|
|
423
|
+
- **Flow visualization in n8n Logs tab**: Added detailed cascade flow logging with visual boxes showing drafter acceptance, verifier escalation, and error fallback paths
|
|
424
|
+
- **Real-time quality metrics**: Logs show confidence scores, alignment scores, latency breakdown, and cost savings for each request
|
|
425
|
+
- **Running statistics**: Track acceptance rates across multiple executions
|
|
426
|
+
- **Fixed UI visualization**: Swapped input order to match n8n's highlighting convention (Drafter now highlights as active)
|
|
427
|
+
- **Auto-dependency installation**: Moved @cascadeflow/core from optionalDependencies to regular dependencies
|
|
428
|
+
- **Crash fix**: Fixed n8n crashes caused by @cascadeflow/core static imports loading ML dependencies
|
|
429
|
+
- **Dynamic imports**: SemanticQualityChecker now uses dynamic imports to avoid loading @xenova/transformers at module initialization
|
|
430
|
+
- **Quality validator integration**: Integrated QualityValidator from @cascadeflow/core for improved quality scoring
|
|
431
|
+
- **Better cascade decisions**: Replaced the naive length-based quality check with complexity-aware validation
|
|
432
|
+
- **Graceful degradation**: Added CASCADE_QUALITY_CONFIG with useSemanticValidation: false for stable operation
|
|
433
|
+
|
|
434
|
+
> **Note:** Versions 5.0.1 through 5.0.7 were deprecated due to incorrect version numbering. If upgrading from 5.0.x, please uninstall and reinstall.
|
|
333
435
|
|
|
334
|
-
###
|
|
436
|
+
### v0.4.19 and earlier
|
|
335
437
|
|
|
336
|
-
- Initial
|
|
337
|
-
- Support for
|
|
338
|
-
-
|
|
339
|
-
-
|
|
340
|
-
-
|
|
438
|
+
- Initial releases as a LangChain sub-node
|
|
439
|
+
- Support for any AI Chat Model in n8n
|
|
440
|
+
- Lazy verifier loading (only fetches when needed)
|
|
441
|
+
- Quality threshold configuration
|
|
442
|
+
- Console logging for cascade flow
|
|
@@ -1 +1 @@
|
|
|
1
|
-
{"version":3,"file":"LmChatCascadeFlow.node.d.ts","sourceRoot":"","sources":["../../../nodes/LmChatCascadeFlow/LmChatCascadeFlow.node.ts"],"names":[],"mappings":"AAAA,OAAO,KAAK,EACV,SAAS,EACT,oBAAoB,EACpB,oBAAoB,EACpB,UAAU,EACX,MAAM,cAAc,CAAC;
|
|
1
|
+
{"version":3,"file":"LmChatCascadeFlow.node.d.ts","sourceRoot":"","sources":["../../../nodes/LmChatCascadeFlow/LmChatCascadeFlow.node.ts"],"names":[],"mappings":"AAAA,OAAO,KAAK,EACV,SAAS,EACT,oBAAoB,EACpB,oBAAoB,EACpB,UAAU,EACX,MAAM,cAAc,CAAC;AAmRtB,qBAAa,iBAAkB,YAAW,SAAS;IACjD,WAAW,EAAE,oBAAoB,CA6D/B;IAEI,UAAU,CAAC,IAAI,EAAE,oBAAoB,GAAG,OAAO,CAAC,UAAU,CAAC;CAgDlE"}
|
|
@@ -3,6 +3,18 @@ Object.defineProperty(exports, "__esModule", { value: true });
|
|
|
3
3
|
exports.LmChatCascadeFlow = void 0;
|
|
4
4
|
const n8n_workflow_1 = require("n8n-workflow");
|
|
5
5
|
const chat_models_1 = require("@langchain/core/language_models/chat_models");
|
|
6
|
+
// Quality validation - optional import, fallback if unavailable
|
|
7
|
+
let QualityValidator;
|
|
8
|
+
let CASCADE_QUALITY_CONFIG;
|
|
9
|
+
try {
|
|
10
|
+
const cascadeCore = require('@cascadeflow/core');
|
|
11
|
+
QualityValidator = cascadeCore.QualityValidator;
|
|
12
|
+
CASCADE_QUALITY_CONFIG = cascadeCore.CASCADE_QUALITY_CONFIG;
|
|
13
|
+
}
|
|
14
|
+
catch (e) {
|
|
15
|
+
// @cascadeflow/core not available - use simple validation
|
|
16
|
+
console.warn('⚠️ @cascadeflow/core not available, using simple quality check');
|
|
17
|
+
}
|
|
6
18
|
/**
|
|
7
19
|
* Custom CascadeChatModel that wraps two models (drafter and verifier)
|
|
8
20
|
* and implements cascading logic with cost tracking
|
|
@@ -18,6 +30,23 @@ class CascadeChatModel extends chat_models_1.BaseChatModel {
|
|
|
18
30
|
this.drafterModel = drafterModel;
|
|
19
31
|
this.verifierModelGetter = verifierModelGetter;
|
|
20
32
|
this.qualityThreshold = qualityThreshold;
|
|
33
|
+
// Initialize quality validator with CASCADE-optimized config (if available)
|
|
34
|
+
if (QualityValidator && CASCADE_QUALITY_CONFIG) {
|
|
35
|
+
try {
|
|
36
|
+
this.qualityValidator = new QualityValidator({
|
|
37
|
+
...CASCADE_QUALITY_CONFIG,
|
|
38
|
+
minConfidence: qualityThreshold,
|
|
39
|
+
});
|
|
40
|
+
console.log('✅ CascadeFlow quality validator initialized');
|
|
41
|
+
}
|
|
42
|
+
catch (e) {
|
|
43
|
+
console.warn('⚠️ Quality validator initialization failed, using simple check');
|
|
44
|
+
this.qualityValidator = null;
|
|
45
|
+
}
|
|
46
|
+
}
|
|
47
|
+
else {
|
|
48
|
+
this.qualityValidator = null;
|
|
49
|
+
}
|
|
21
50
|
}
|
|
22
51
|
async getVerifierModel() {
|
|
23
52
|
if (!this.verifierModel) {
|
|
@@ -26,31 +55,99 @@ class CascadeChatModel extends chat_models_1.BaseChatModel {
|
|
|
26
55
|
}
|
|
27
56
|
return this.verifierModel;
|
|
28
57
|
}
|
|
58
|
+
/**
|
|
59
|
+
* Simple quality validation fallback (when @cascadeflow/core not available)
|
|
60
|
+
*/
|
|
61
|
+
simpleQualityCheck(responseText) {
|
|
62
|
+
const wordCount = responseText.split(/\s+/).length;
|
|
63
|
+
// Base confidence on response length and structure
|
|
64
|
+
let confidence = 0.75;
|
|
65
|
+
// Very short responses get lower confidence
|
|
66
|
+
if (wordCount < 5) {
|
|
67
|
+
confidence = 0.50;
|
|
68
|
+
}
|
|
69
|
+
else if (wordCount < 15) {
|
|
70
|
+
confidence = 0.65;
|
|
71
|
+
}
|
|
72
|
+
else if (wordCount > 30) {
|
|
73
|
+
confidence = 0.85;
|
|
74
|
+
}
|
|
75
|
+
// Check for uncertainty markers
|
|
76
|
+
const uncertaintyMarkers = ['i don\'t know', 'i\'m not sure', 'unclear', 'uncertain'];
|
|
77
|
+
const hasUncertainty = uncertaintyMarkers.some(marker => responseText.toLowerCase().includes(marker));
|
|
78
|
+
if (hasUncertainty) {
|
|
79
|
+
confidence -= 0.20;
|
|
80
|
+
}
|
|
81
|
+
const passed = confidence >= this.qualityThreshold;
|
|
82
|
+
const reason = passed
|
|
83
|
+
? `Simple check passed (confidence: ${confidence.toFixed(2)} >= ${this.qualityThreshold})`
|
|
84
|
+
: `Simple check failed (confidence: ${confidence.toFixed(2)} < ${this.qualityThreshold})`;
|
|
85
|
+
return { passed, confidence, score: confidence, reason };
|
|
86
|
+
}
|
|
29
87
|
_llmType() {
|
|
30
88
|
return 'cascade';
|
|
31
89
|
}
|
|
32
90
|
async _generate(messages, options, runManager) {
|
|
33
91
|
try {
|
|
34
92
|
// Step 1: Try the drafter model
|
|
93
|
+
await runManager?.handleText('🎯 CascadeFlow: Trying drafter model...\n');
|
|
35
94
|
console.log('🎯 CascadeFlow: Trying drafter model...');
|
|
36
95
|
const drafterStartTime = Date.now();
|
|
37
96
|
const drafterMessage = await this.drafterModel.invoke(messages, options);
|
|
38
97
|
const drafterLatency = Date.now() - drafterStartTime;
|
|
39
98
|
this.drafterCount++;
|
|
40
|
-
// Step 2: Quality check
|
|
99
|
+
// Step 2: Quality check using CascadeFlow validator (or simple fallback)
|
|
41
100
|
const responseText = drafterMessage.content.toString();
|
|
42
|
-
|
|
101
|
+
let validationResult;
|
|
102
|
+
if (this.qualityValidator) {
|
|
103
|
+
// Use full CascadeFlow quality validator
|
|
104
|
+
const queryText = messages.map(m => m.content.toString()).join(' ');
|
|
105
|
+
try {
|
|
106
|
+
validationResult = await this.qualityValidator.validate(responseText, queryText);
|
|
107
|
+
const qualityLog = ` 📊 Quality validation: confidence=${validationResult.confidence.toFixed(2)}, method=${validationResult.method}\n`;
|
|
108
|
+
await runManager?.handleText(qualityLog);
|
|
109
|
+
console.log(qualityLog);
|
|
110
|
+
if (validationResult.details?.alignmentScore) {
|
|
111
|
+
const alignmentLog = ` 🎯 Alignment: ${validationResult.details.alignmentScore.toFixed(2)}\n`;
|
|
112
|
+
await runManager?.handleText(alignmentLog);
|
|
113
|
+
console.log(alignmentLog);
|
|
114
|
+
}
|
|
115
|
+
}
|
|
116
|
+
catch (e) {
|
|
117
|
+
const errorLog = ` ⚠️ Quality validator error, using simple check: ${e}\n`;
|
|
118
|
+
await runManager?.handleText(errorLog);
|
|
119
|
+
console.warn(errorLog);
|
|
120
|
+
validationResult = this.simpleQualityCheck(responseText);
|
|
121
|
+
}
|
|
122
|
+
}
|
|
123
|
+
else {
|
|
124
|
+
// Use simple quality check (fallback)
|
|
125
|
+
validationResult = this.simpleQualityCheck(responseText);
|
|
126
|
+
const simpleLog = ` 📊 Simple quality check: confidence=${validationResult.confidence.toFixed(2)}\n`;
|
|
127
|
+
await runManager?.handleText(simpleLog);
|
|
128
|
+
console.log(simpleLog);
|
|
129
|
+
}
|
|
43
130
|
// Step 3: If quality is sufficient, return drafter response
|
|
44
|
-
if (
|
|
131
|
+
if (validationResult.passed) {
|
|
45
132
|
// Estimate cost savings
|
|
46
133
|
const estimatedDrafterCost = 0.0001; // $0.0001 per request (rough estimate)
|
|
47
134
|
const estimatedVerifierCost = 0.0016; // $0.0016 per request (rough estimate)
|
|
48
135
|
const savings = ((estimatedVerifierCost - estimatedDrafterCost) / estimatedVerifierCost * 100).toFixed(1);
|
|
49
|
-
|
|
50
|
-
|
|
51
|
-
console.log(
|
|
52
|
-
|
|
53
|
-
|
|
136
|
+
const flowLog = `\n┌─────────────────────────────────────────┐\n│ ✅ FLOW: DRAFTER ACCEPTED (FAST PATH) │\n└─────────────────────────────────────────┘\n Query → Drafter → Quality Check ✅ → Response\n ⚡ Fast & Cheap: Used drafter model only\n Confidence: ${validationResult.confidence.toFixed(2)} (threshold: ${this.qualityThreshold})\n Quality score: ${validationResult.score.toFixed(2)}\n Latency: ${drafterLatency}ms\n 💰 Cost savings: ~${savings}% (used cheap model)\n 📊 Stats: ${this.drafterCount} drafter, ${this.verifierCount} verifier\n`;
|
|
137
|
+
await runManager?.handleText(flowLog);
|
|
138
|
+
console.log(flowLog);
|
|
139
|
+
// Add flow metadata to message for n8n UI visibility (logs only, not in response text)
|
|
140
|
+
if (!drafterMessage.response_metadata) {
|
|
141
|
+
drafterMessage.response_metadata = {};
|
|
142
|
+
}
|
|
143
|
+
drafterMessage.response_metadata.cascadeflow = {
|
|
144
|
+
flow: 'drafter_accepted',
|
|
145
|
+
confidence: validationResult.confidence,
|
|
146
|
+
quality_score: validationResult.score,
|
|
147
|
+
latency_ms: drafterLatency,
|
|
148
|
+
cost_savings_percent: parseFloat(savings),
|
|
149
|
+
model_used: 'drafter'
|
|
150
|
+
};
|
|
54
151
|
return {
|
|
55
152
|
generations: [{
|
|
56
153
|
text: drafterMessage.content.toString(),
|
|
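As a sanity check on the hard-coded cost figures in the fast-path hunk above (the dollar amounts are the diff's own rough estimates, not real provider pricing), the savings formula evaluates roughly as follows:

```javascript
// Reproduces the savings arithmetic from the fast-path hunk above.
// Dollar figures are the diff's own rough estimates, not measured costs.
const estimatedDrafterCost = 0.0001;  // ~$0.0001 per drafter request
const estimatedVerifierCost = 0.0016; // ~$0.0016 per verifier request
const savings = ((estimatedVerifierCost - estimatedDrafterCost) / estimatedVerifierCost * 100).toFixed(1);
console.log(`~${savings}% saved on a drafter-accepted request`);
```

So a drafter-accepted request saves roughly 93.8% versus always calling the verifier, which is where the log line's savings percentage comes from.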
@@ -59,9 +156,9 @@ class CascadeChatModel extends chat_models_1.BaseChatModel {
|
|
|
59
156
|
};
|
|
60
157
|
}
|
|
61
158
|
// Step 4: Otherwise, escalate to verifier
|
|
62
|
-
|
|
63
|
-
|
|
64
|
-
console.log(
|
|
159
|
+
const escalateLog = `\n┌────────────────────────────────────────────────┐\n│ ⚠️ FLOW: ESCALATED TO VERIFIER (SLOW PATH) │\n└────────────────────────────────────────────────┘\n Query → Drafter → Quality Check ❌ → Verifier → Response\n 🔄 Escalating: Drafter quality too low, using verifier\n Confidence: ${validationResult.confidence.toFixed(2)} < ${this.qualityThreshold} (threshold)\n Reason: ${validationResult.reason}\n Drafter latency: ${drafterLatency}ms\n 🔄 Loading verifier model...\n`;
|
|
160
|
+
await runManager?.handleText(escalateLog);
|
|
161
|
+
console.log(escalateLog);
|
|
65
162
|
const verifierStartTime = Date.now();
|
|
66
163
|
const verifierModel = await this.getVerifierModel();
|
|
67
164
|
const verifierMessage = await verifierModel.invoke(messages, options);
|
|
@@ -69,10 +166,23 @@ class CascadeChatModel extends chat_models_1.BaseChatModel {
|
|
|
69
166
|
this.verifierCount++;
|
|
70
167
|
const totalLatency = drafterLatency + verifierLatency;
|
|
71
168
|
const acceptanceRate = (this.drafterCount / (this.drafterCount + this.verifierCount) * 100).toFixed(1);
|
|
72
|
-
|
|
73
|
-
|
|
74
|
-
console.log(
|
|
75
|
-
|
|
169
|
+
const completionLog = ` ✅ Verifier completed successfully\n Verifier latency: ${verifierLatency}ms\n Total latency: ${totalLatency}ms (drafter: ${drafterLatency}ms + verifier: ${verifierLatency}ms)\n 💰 Cost: Full verifier cost (0% savings this request)\n 📊 Stats: ${this.drafterCount} drafter (${acceptanceRate}%), ${this.verifierCount} verifier\n`;
|
|
170
|
+
await runManager?.handleText(completionLog);
|
|
171
|
+
console.log(completionLog);
|
|
172
|
+
// Add flow metadata to message for n8n UI visibility (logs only, not in response text)
|
|
173
|
+
if (!verifierMessage.response_metadata) {
|
|
174
|
+
verifierMessage.response_metadata = {};
|
|
175
|
+
}
|
|
176
|
+
verifierMessage.response_metadata.cascadeflow = {
|
|
177
|
+
flow: 'escalated_to_verifier',
|
|
178
|
+
confidence: validationResult.confidence,
|
|
179
|
+
drafter_latency_ms: drafterLatency,
|
|
180
|
+
verifier_latency_ms: verifierLatency,
|
|
181
|
+
total_latency_ms: totalLatency,
|
|
182
|
+
cost_savings_percent: 0,
|
|
183
|
+
model_used: 'verifier',
|
|
184
|
+
reason: validationResult.reason
|
|
185
|
+
};
|
|
76
186
|
return {
|
|
77
187
|
generations: [{
|
|
78
188
|
text: verifierMessage.content.toString(),
|
|
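The hunk above derives a running acceptance rate from two counters. A minimal sketch of that arithmetic, with hypothetical counts (the counter names mirror the diff; the values are made up):

```javascript
// Running-stats arithmetic from the escalation log above, with sample counts.
let drafterCount = 7;   // requests answered by the cheap drafter alone
let verifierCount = 3;  // requests escalated to the expensive verifier
const acceptanceRate = (drafterCount / (drafterCount + verifierCount) * 100).toFixed(1);
console.log(`${acceptanceRate}% of requests stayed on the fast path`);
// → 70.0% of requests stayed on the fast path
```

With 7 of 10 requests accepted at the drafter stage, the logged acceptance rate is 70.0%.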
@@ -82,12 +192,26 @@ class CascadeChatModel extends chat_models_1.BaseChatModel {
|
|
|
82
192
|
}
|
|
83
193
|
catch (error) {
|
|
84
194
|
// Fallback to verifier on error
|
|
85
|
-
|
|
86
|
-
|
|
195
|
+
const errorMsg = error instanceof Error ? error.message : String(error);
|
|
196
|
+
const errorLog = `\n┌─────────────────────────────────────────────┐\n│ ❌ FLOW: DRAFTER ERROR - FALLBACK PATH │\n└─────────────────────────────────────────────┘\n Query → Drafter ❌ ERROR → Verifier → Response\n 🔄 Fallback: Drafter failed, using verifier as backup\n Error: ${errorMsg}\n 🔄 Loading verifier model...\n`;
|
|
197
|
+
await runManager?.handleText(errorLog);
|
|
198
|
+
console.log(errorLog);
|
|
87
199
|
const verifierModel = await this.getVerifierModel();
|
|
88
200
|
const verifierMessage = await verifierModel.invoke(messages, options);
|
|
89
201
|
this.verifierCount++;
|
|
90
|
-
|
|
202
|
+
const fallbackCompleteLog = ` ✅ Verifier fallback completed successfully\n 💰 Cost: Full verifier cost (fallback due to error)\n`;
|
|
203
|
+
await runManager?.handleText(fallbackCompleteLog);
|
|
204
|
+
console.log(fallbackCompleteLog);
|
|
205
|
+
// Add flow metadata to message for n8n UI visibility (logs only, not in response text)
|
|
206
|
+
if (!verifierMessage.response_metadata) {
|
|
207
|
+
verifierMessage.response_metadata = {};
|
|
208
|
+
}
|
|
209
|
+
verifierMessage.response_metadata.cascadeflow = {
|
|
210
|
+
flow: 'error_fallback',
|
|
211
|
+
error: errorMsg,
|
|
212
|
+
cost_savings_percent: 0,
|
|
213
|
+
model_used: 'verifier'
|
|
214
|
+
};
|
|
91
215
|
return {
|
|
92
216
|
generations: [{
|
|
93
217
|
text: verifierMessage.content.toString(),
|
|
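All three paths above (`drafter_accepted`, `escalated_to_verifier`, `error_fallback`) attach a `cascadeflow` object to `response_metadata`. A hypothetical downstream consumer (for example, an n8n Code node inspecting the model output) could read it like this; the field names are taken from the diff, but the helper itself is illustrative, not part of the package:

```javascript
// Hypothetical reader for the metadata this node attaches to
// response_metadata.cascadeflow (field names from the hunks above).
function describeFlow(message) {
  const cf = (message.response_metadata || {}).cascadeflow;
  if (!cf) return 'no cascade metadata';
  if (cf.flow === 'drafter_accepted') {
    return `fast path, ~${cf.cost_savings_percent}% saved`;
  }
  return `${cf.flow} via ${cf.model_used} (${cf.total_latency_ms ?? cf.latency_ms ?? '?'}ms)`;
}

const escalated = {
  response_metadata: {
    cascadeflow: {
      flow: 'escalated_to_verifier',
      model_used: 'verifier',
      total_latency_ms: 2100,
      cost_savings_percent: 0,
    },
  },
};
console.log(describeFlow(escalated));
// → escalated_to_verifier via verifier (2100ms)
```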
@@ -123,16 +247,20 @@ class LmChatCascadeFlow {
|
|
|
123
247
|
},
|
|
124
248
|
},
|
|
125
249
|
// Sub-node: accepts AI model connections
|
|
250
|
+
// ⚠️ NOTE: Labels are REVERSED from actual logic!
|
|
251
|
+
// Visual layout: Index 0 = "Verifier" label (top), Index 1 = "Drafter" label (bottom)
|
|
252
|
+
// Actual logic: Index 0 = DRAFTER model (tried first), Index 1 = VERIFIER model (only if needed)
|
|
253
|
+
// User connects: TOP port = drafter model, BOTTOM port = verifier model
|
|
126
254
|
// eslint-disable-next-line n8n-nodes-base/node-class-description-inputs-wrong-regular-node
|
|
127
255
|
inputs: [
|
|
128
256
|
{
|
|
129
|
-
displayName: '
|
|
257
|
+
displayName: 'Verifier (Expensive, Only If Needed)',
|
|
130
258
|
type: 'ai_languageModel',
|
|
131
259
|
maxConnections: 1,
|
|
132
260
|
required: true,
|
|
133
261
|
},
|
|
134
262
|
{
|
|
135
|
-
displayName: '
|
|
263
|
+
displayName: 'Drafter (Cheap, Tried First)',
|
|
136
264
|
type: 'ai_languageModel',
|
|
137
265
|
maxConnections: 1,
|
|
138
266
|
required: true,
|
|
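The comments in the hunk above concede that the port labels are swapped relative to the logic (index 0 carries the drafter despite its "Verifier" label). A more direct fix, sketched here only as a suggestion and not what this version ships, would be to make the labels match the indices that `supplyData` actually reads:

```javascript
// Hypothetical corrected ordering: label index 0 as the drafter, since
// supplyData() reads the drafter from connection index 0. Not the shipped code.
const inputs = [
  {
    displayName: 'Drafter (Cheap, Tried First)',        // index 0 - tried first
    type: 'ai_languageModel',
    maxConnections: 1,
    required: true,
  },
  {
    displayName: 'Verifier (Expensive, Only If Needed)', // index 1 - fallback
    type: 'ai_languageModel',
    maxConnections: 1,
    required: true,
  },
];
console.log(inputs.map((i) => i.displayName.split(' ')[0]).join(' -> '));
// → Drafter -> Verifier
```

With labels aligned to the indices, the extra "labels are reversed" warnings in the error messages and comments would no longer be needed.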
@@ -161,18 +289,20 @@ class LmChatCascadeFlow {
|
|
|
161
289
|
async supplyData() {
|
|
162
290
|
// Get parameters
|
|
163
291
|
const qualityThreshold = this.getNodeParameter('qualityThreshold', 0, 0.7);
|
|
164
|
-
// Get the drafter model immediately
|
|
292
|
+
// Get the drafter model immediately (at index 0 - top port in UI, labeled "Verifier")
|
|
293
|
+
// NOTE: Labels are reversed from logic - top port has drafter model despite "Verifier" label
|
|
165
294
|
const drafterData = await this.getInputConnectionData('ai_languageModel', 0);
|
|
166
295
|
const drafterModel = (Array.isArray(drafterData) ? drafterData[0] : drafterData);
|
|
167
296
|
if (!drafterModel) {
|
|
168
|
-
throw new n8n_workflow_1.NodeOperationError(this.getNode(), 'Drafter model is required. Please connect
|
|
297
|
+
throw new n8n_workflow_1.NodeOperationError(this.getNode(), 'Drafter model is required. Please connect your DRAFTER model to the TOP port (labeled "Verifier").');
|
|
169
298
|
}
|
|
170
|
-
// Create a lazy loader for the verifier model (only fetched when needed)
|
|
299
|
+
// Create a lazy loader for the verifier model (index 1 - bottom port in UI, labeled "Drafter"); only fetched when needed
|
|
300
|
+
// NOTE: Labels are reversed from logic - bottom port has verifier model despite "Drafter" label
|
|
171
301
|
const verifierModelGetter = async () => {
|
|
172
302
|
const verifierData = await this.getInputConnectionData('ai_languageModel', 1);
|
|
173
303
|
const verifierModel = (Array.isArray(verifierData) ? verifierData[0] : verifierData);
|
|
174
304
|
if (!verifierModel) {
|
|
175
|
-
throw new n8n_workflow_1.NodeOperationError(this.getNode(), 'Verifier model is required. Please connect
|
|
305
|
+
throw new n8n_workflow_1.NodeOperationError(this.getNode(), 'Verifier model is required. Please connect your VERIFIER model to the BOTTOM port (labeled "Drafter").');
|
|
176
306
|
}
|
|
177
307
|
return verifierModel;
|
|
178
308
|
};
|
|
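The `verifierModelGetter` above defers fetching the expensive model until escalation actually happens. A minimal sketch of that lazy-loader pattern (the model object is a stand-in, not the real n8n connection data):

```javascript
// Lazy-loader sketch: cache the promise so the expensive loader runs at most
// once, even if several callers ask for the verifier concurrently.
function makeLazy(loader) {
  let cached;
  return () => {
    if (cached === undefined) cached = loader(); // cache the promise itself
    return cached;
  };
}

let loads = 0;
const getVerifierModel = makeLazy(async () => {
  loads += 1; // counts how many times the loader actually ran
  return { name: 'verifier' };
});

getVerifierModel(); // first call triggers the load
getVerifierModel(); // second call reuses the cached promise
console.log(`loader ran ${loads} time(s)`);
// → loader ran 1 time(s)
```

Caching the promise rather than the resolved value also avoids a duplicate fetch when two requests escalate before the first load finishes.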
@@ -1 +1 @@
|
|
|
1
|
-
{"version":3,"file":"LmChatCascadeFlow.node.js","sourceRoot":"","sources":["../../../nodes/LmChatCascadeFlow/LmChatCascadeFlow.node.ts"],"names":[],"mappings":";;;AAOA,+CAAsE;AAEtE,6EAA4E;AAK5E;;;GAGG;AACH,MAAM,gBAAiB,SAAQ,2BAAa;
|
|
1
|
+
{"version":3,"file":"LmChatCascadeFlow.node.js","sourceRoot":"","sources":["../../../nodes/LmChatCascadeFlow/LmChatCascadeFlow.node.ts"],"names":[],"mappings":";;;AAOA,+CAAsE;AAEtE,6EAA4E;AAK5E,gEAAgE;AAChE,IAAI,gBAAqB,CAAC;AAC1B,IAAI,sBAA2B,CAAC;AAChC,IAAI,CAAC;IACH,MAAM,WAAW,GAAG,OAAO,CAAC,mBAAmB,CAAC,CAAC;IACjD,gBAAgB,GAAG,WAAW,CAAC,gBAAgB,CAAC;IAChD,sBAAsB,GAAG,WAAW,CAAC,sBAAsB,CAAC;AAC9D,CAAC;AAAC,OAAO,CAAC,EAAE,CAAC;IACX,0DAA0D;IAC1D,OAAO,CAAC,IAAI,CAAC,iEAAiE,CAAC,CAAC;AAClF,CAAC;AAED;;;GAGG;AACH,MAAM,gBAAiB,SAAQ,2BAAa;IAiB1C,YACE,YAA2B,EAC3B,mBAAiD,EACjD,mBAA2B,GAAG;QAE9B,KAAK,CAAC,EAAE,CAAC,CAAC;QAjBZ,gBAAgB;QACR,gBAAW,GAAW,CAAC,CAAC;QACxB,iBAAY,GAAW,CAAC,CAAC;QACzB,iBAAY,GAAW,CAAC,CAAC;QACzB,kBAAa,GAAW,CAAC,CAAC;QAchC,IAAI,CAAC,YAAY,GAAG,YAAY,CAAC;QACjC,IAAI,CAAC,mBAAmB,GAAG,mBAAmB,CAAC;QAC/C,IAAI,CAAC,gBAAgB,GAAG,gBAAgB,CAAC;QAEzC,4EAA4E;QAC5E,IAAI,gBAAgB,IAAI,sBAAsB,EAAE,CAAC;YAC/C,IAAI,CAAC;gBACH,IAAI,CAAC,gBAAgB,GAAG,IAAI,gBAAgB,CAAC;oBAC3C,GAAG,sBAAsB;oBACzB,aAAa,EAAE,gBAAgB;iBAChC,CAAC,CAAC;gBACH,OAAO,CAAC,GAAG,CAAC,6CAA6C,CAAC,CAAC;YAC7D,CAAC;YAAC,OAAO,CAAC,EAAE,CAAC;gBACX,OAAO,CAAC,IAAI,CAAC,iEAAiE,CAAC,CAAC;gBAChF,IAAI,CAAC,gBAAgB,GAAG,IAAI,CAAC;YAC/B,CAAC;QACH,CAAC;aAAM,CAAC;YACN,IAAI,CAAC,gBAAgB,GAAG,IAAI,CAAC;QAC/B,CAAC;IACH,CAAC;IAEO,KAAK,CAAC,gBAAgB;QAC5B,IAAI,CAAC,IAAI,CAAC,aAAa,EAAE,CAAC;YACxB,OAAO,CAAC,GAAG,CAAC,iCAAiC,CAAC,CAAC;YAC/C,IAAI,CAAC,aAAa,GAAG,MAAM,IAAI,CAAC,mBAAmB,EAAE,CAAC;QACxD,CAAC;QACD,OAAO,IAAI,CAAC,aAAa,CAAC;IAC5B,CAAC;IAED;;OAEG;IACK,kBAAkB,CAAC,YAAoB;QAC7C,MAAM,SAAS,GAAG,YAAY,CAAC,KAAK,CAAC,KAAK,CAAC,CAAC,MAAM,CAAC;QAEnD,mDAAmD;QACnD,IAAI,UAAU,GAAG,IAAI,CAAC;QAEtB,4CAA4C;QAC5C,IAAI,SAAS,GAAG,CAAC,EAAE,CAAC;YAClB,UAAU,GAAG,IAAI,CAAC;QACpB,CAAC;aAAM,IAAI,SAAS,GAAG,EAAE,EAAE,CAAC;YAC1B,UAAU,GAAG,IAAI,CAAC;QACpB,CAAC;aAAM,IAAI,SAAS,GAAG,EAAE,EAAE,CAAC;YAC1B,UAAU,GAAG,IAAI,CAAC;QACpB,CAAC;QAED,gCAAgC;QAChC,MAAM,kBAAkB,GAAG,CAAC,eAAe,EAAE,eAAe,EAAE,SAAS,EAAE,WAAW,CAAC,CAAC;QACtF,MAAM,cAAc,GAAG,kBAAkB,CAAC,IAAI,CAAC,MAAM,CAAC,
EAAE,CAAC,YAAY,CAAC,WAAW,EAAE,CAAC,QAAQ,CAAC,MAAM,CAAC,CAAC,CAAC;QACtG,IAAI,cAAc,EAAE,CAAC;YACnB,UAAU,IAAI,IAAI,CAAC;QACrB,CAAC;QAED,MAAM,MAAM,GAAG,UAAU,IAAI,IAAI,CAAC,gBAAgB,CAAC;QACnD,MAAM,MAAM,GAAG,MAAM;YACnB,CAAC,CAAC,oCAAoC,UAAU,CAAC,OAAO,CAAC,CAAC,CAAC,OAAO,IAAI,CAAC,gBAAgB,GAAG;YAC1F,CAAC,CAAC,oCAAoC,UAAU,CAAC,OAAO,CAAC,CAAC,CAAC,MAAM,IAAI,CAAC,gBAAgB,GAAG,CAAC;QAE5F,OAAO,EAAE,MAAM,EAAE,UAAU,EAAE,KAAK,EAAE,UAAU,EAAE,MAAM,EAAE,CAAC;IAC3D,CAAC;IAED,QAAQ;QACN,OAAO,SAAS,CAAC;IACnB,CAAC;IAED,KAAK,CAAC,SAAS,CACb,QAAuB,EACvB,OAAkC,EAClC,UAAqC;QAErC,IAAI,CAAC;YACH,gCAAgC;YAChC,MAAM,UAAU,EAAE,UAAU,CAAC,2CAA2C,CAAC,CAAC;YAC1E,OAAO,CAAC,GAAG,CAAC,yCAAyC,CAAC,CAAC;YACvD,MAAM,gBAAgB,GAAG,IAAI,CAAC,GAAG,EAAE,CAAC;YACpC,MAAM,cAAc,GAAG,MAAM,IAAI,CAAC,YAAY,CAAC,MAAM,CAAC,QAAQ,EAAE,OAAO,CAAC,CAAC;YACzE,MAAM,cAAc,GAAG,IAAI,CAAC,GAAG,EAAE,GAAG,gBAAgB,CAAC;YAErD,IAAI,CAAC,YAAY,EAAE,CAAC;YAEpB,yEAAyE;YACzE,MAAM,YAAY,GAAG,cAAc,CAAC,OAAO,CAAC,QAAQ,EAAE,CAAC;YAEvD,IAAI,gBAAqB,CAAC;YAE1B,IAAI,IAAI,CAAC,gBAAgB,EAAE,CAAC;gBAC1B,yCAAyC;gBACzC,MAAM,SAAS,GAAG,QAAQ,CAAC,GAAG,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC,OAAO,CAAC,QAAQ,EAAE,CAAC,CAAC,IAAI,CAAC,GAAG,CAAC,CAAC;gBACpE,IAAI,CAAC;oBACH,gBAAgB,GAAG,MAAM,IAAI,CAAC,gBAAgB,CAAC,QAAQ,CAAC,YAAY,EAAE,SAAS,CAAC,CAAC;oBACjF,MAAM,UAAU,GAAG,wCAAwC,gBAAgB,CAAC,UAAU,CAAC,OAAO,CAAC,CAAC,CAAC,YAAY,gBAAgB,CAAC,MAAM,IAAI,CAAC;oBACzI,MAAM,UAAU,EAAE,UAAU,CAAC,UAAU,CAAC,CAAC;oBACzC,OAAO,CAAC,GAAG,CAAC,UAAU,CAAC,CAAC;oBAExB,IAAI,gBAAgB,CAAC,OAAO,EAAE,cAAc,EAAE,CAAC;wBAC7C,MAAM,YAAY,GAAG,oBAAoB,gBAAgB,CAAC,OAAO,CAAC,cAAc,CAAC,OAAO,CAAC,CAAC,CAAC,IAAI,CAAC;wBAChG,MAAM,UAAU,EAAE,UAAU,CAAC,YAAY,CAAC,CAAC;wBAC3C,OAAO,CAAC,GAAG,CAAC,YAAY,CAAC,CAAC;oBAC5B,CAAC;gBACH,CAAC;gBAAC,OAAO,CAAC,EAAE,CAAC;oBACX,MAAM,QAAQ,GAAG,uDAAuD,CAAC,IAAI,CAAC;oBAC9E,MAAM,UAAU,EAAE,UAAU,CAAC,QAAQ,CAAC,CAAC;oBACvC,OAAO,CAAC,IAAI,CAAC,QAAQ,CAAC,CAAC;oBACvB,gBAAgB,GAAG,IAAI,CAAC,kBAAkB,CAAC,YAAY,CAAC,CAAC;gBAC3D,CAAC;YACH,CAAC;iBAAM,CAAC;gBACN,sCAAsC;gBACtC,gBAAgB,GAAG,IAAI,CAAC,kBAAkB,CAAC,Y
AAY,CAAC,CAAC;gBACzD,MAAM,SAAS,GAAG,0CAA0C,gBAAgB,CAAC,UAAU,CAAC,OAAO,CAAC,CAAC,CAAC,IAAI,CAAC;gBACvG,MAAM,UAAU,EAAE,UAAU,CAAC,SAAS,CAAC,CAAC;gBACxC,OAAO,CAAC,GAAG,CAAC,SAAS,CAAC,CAAC;YACzB,CAAC;YAED,4DAA4D;YAC5D,IAAI,gBAAgB,CAAC,MAAM,EAAE,CAAC;gBAC5B,wBAAwB;gBACxB,MAAM,oBAAoB,GAAG,MAAM,CAAC,CAAC,uCAAuC;gBAC5E,MAAM,qBAAqB,GAAG,MAAM,CAAC,CAAC,uCAAuC;gBAC7E,MAAM,OAAO,GAAG,CAAC,CAAC,qBAAqB,GAAG,oBAAoB,CAAC,GAAG,qBAAqB,GAAG,GAAG,CAAC,CAAC,OAAO,CAAC,CAAC,CAAC,CAAC;gBAE1G,MAAM,OAAO,GAAG,sPAAsP,gBAAgB,CAAC,UAAU,CAAC,OAAO,CAAC,CAAC,CAAC,gBAAgB,IAAI,CAAC,gBAAgB,wBAAwB,gBAAgB,CAAC,KAAK,CAAC,OAAO,CAAC,CAAC,CAAC,iBAAiB,cAAc,4BAA4B,OAAO,sCAAsC,IAAI,CAAC,YAAY,aAAa,IAAI,CAAC,aAAa,aAAa,CAAC;gBAEhjB,MAAM,UAAU,EAAE,UAAU,CAAC,OAAO,CAAC,CAAC;gBACtC,OAAO,CAAC,GAAG,CAAC,OAAO,CAAC,CAAC;gBAErB,uFAAuF;gBACvF,IAAI,CAAC,cAAc,CAAC,iBAAiB,EAAE,CAAC;oBACrC,cAAsB,CAAC,iBAAiB,GAAG,EAAE,CAAC;gBACjD,CAAC;gBACA,cAAsB,CAAC,iBAAiB,CAAC,WAAW,GAAG;oBACtD,IAAI,EAAE,kBAAkB;oBACxB,UAAU,EAAE,gBAAgB,CAAC,UAAU;oBACvC,aAAa,EAAE,gBAAgB,CAAC,KAAK;oBACrC,UAAU,EAAE,cAAc;oBAC1B,oBAAoB,EAAE,UAAU,CAAC,OAAO,CAAC;oBACzC,UAAU,EAAE,SAAS;iBACtB,CAAC;gBAEF,OAAO;oBACL,WAAW,EAAE,CAAC;4BACZ,IAAI,EAAE,cAAc,CAAC,OAAO,CAAC,QAAQ,EAAE;4BACvC,OAAO,EAAE,cAAc;yBACxB,CAAC;iBACH,CAAC;YACJ,CAAC;YAED,0CAA0C;YAC1C,MAAM,WAAW,GAAG,sSAAsS,gBAAgB,CAAC,UAAU,CAAC,OAAO,CAAC,CAAC,CAAC,MAAM,IAAI,CAAC,gBAAgB,4BAA4B,gBAAgB,CAAC,MAAM,yBAAyB,cAAc,uCAAuC,CAAC;YAE7f,MAAM,UAAU,EAAE,UAAU,CAAC,WAAW,CAAC,CAAC;YAC1C,OAAO,CAAC,GAAG,CAAC,WAAW,CAAC,CAAC;YAEzB,MAAM,iBAAiB,GAAG,IAAI,CAAC,GAAG,EAAE,CAAC;YACrC,MAAM,aAAa,GAAG,MAAM,IAAI,CAAC,gBAAgB,EAAE,CAAC;YACpD,MAAM,eAAe,GAAG,MAAM,aAAa,CAAC,MAAM,CAAC,QAAQ,EAAE,OAAO,CAAC,CAAC;YACtE,MAAM,eAAe,GAAG,IAAI,CAAC,GAAG,EAAE,GAAG,iBAAiB,CAAC;YAEvD,IAAI,CAAC,aAAa,EAAE,CAAC;YAErB,MAAM,YAAY,GAAG,cAAc,GAAG,eAAe,CAAC;YACtD,MAAM,cAAc,GAAG,CAAC,IAAI,CAAC,YAAY,GAAG,CAAC,IAAI,CAAC,YAAY,GAAG,IAAI,CAAC,aAAa,CAAC,GAAG,GAAG,CAAC,CAAC,OAAO,CAAC,CAAC,CAAC,CAAC;YAEvG,MAAM,aAAa,GAAG,8DAA8D,eAAe,yBAAyB,YAAY,gBAAgB,cAAc,kBAAkB,eAAe,+EAA+E,IAAI
,CAAC,YAAY,aAAa,cAAc,OAAO,IAAI,CAAC,aAAa,aAAa,CAAC;YAEzW,MAAM,UAAU,EAAE,UAAU,CAAC,aAAa,CAAC,CAAC;YAC5C,OAAO,CAAC,GAAG,CAAC,aAAa,CAAC,CAAC;YAE3B,uFAAuF;YACvF,IAAI,CAAC,eAAe,CAAC,iBAAiB,EAAE,CAAC;gBACtC,eAAuB,CAAC,iBAAiB,GAAG,EAAE,CAAC;YAClD,CAAC;YACA,eAAuB,CAAC,iBAAiB,CAAC,WAAW,GAAG;gBACvD,IAAI,EAAE,uBAAuB;gBAC7B,UAAU,EAAE,gBAAgB,CAAC,UAAU;gBACvC,kBAAkB,EAAE,cAAc;gBAClC,mBAAmB,EAAE,eAAe;gBACpC,gBAAgB,EAAE,YAAY;gBAC9B,oBAAoB,EAAE,CAAC;gBACvB,UAAU,EAAE,UAAU;gBACtB,MAAM,EAAE,gBAAgB,CAAC,MAAM;aAChC,CAAC;YAEF,OAAO;gBACL,WAAW,EAAE,CAAC;wBACZ,IAAI,EAAE,eAAe,CAAC,OAAO,CAAC,QAAQ,EAAE;wBACxC,OAAO,EAAE,eAAe;qBACzB,CAAC;aACH,CAAC;QACJ,CAAC;QAAC,OAAO,KAAK,EAAE,CAAC;YACf,gCAAgC;YAChC,MAAM,QAAQ,GAAG,KAAK,YAAY,KAAK,CAAC,CAAC,CAAC,KAAK,CAAC,OAAO,CAAC,CAAC,CAAC,MAAM,CAAC,KAAK,CAAC,CAAC;YACxE,MAAM,QAAQ,GAAG,4QAA4Q,QAAQ,qCAAqC,CAAC;YAE3U,MAAM,UAAU,EAAE,UAAU,CAAC,QAAQ,CAAC,CAAC;YACvC,OAAO,CAAC,GAAG,CAAC,QAAQ,CAAC,CAAC;YAEtB,MAAM,aAAa,GAAG,MAAM,IAAI,CAAC,gBAAgB,EAAE,CAAC;YACpD,MAAM,eAAe,GAAG,MAAM,aAAa,CAAC,MAAM,CAAC,QAAQ,EAAE,OAAO,CAAC,CAAC;YACtE,IAAI,CAAC,aAAa,EAAE,CAAC;YAErB,MAAM,mBAAmB,GAAG,yGAAyG,CAAC;YAEtI,MAAM,UAAU,EAAE,UAAU,CAAC,mBAAmB,CAAC,CAAC;YAClD,OAAO,CAAC,GAAG,CAAC,mBAAmB,CAAC,CAAC;YAEjC,uFAAuF;YACvF,IAAI,CAAC,eAAe,CAAC,iBAAiB,EAAE,CAAC;gBACtC,eAAuB,CAAC,iBAAiB,GAAG,EAAE,CAAC;YAClD,CAAC;YACA,eAAuB,CAAC,iBAAiB,CAAC,WAAW,GAAG;gBACvD,IAAI,EAAE,gBAAgB;gBACtB,KAAK,EAAE,QAAQ;gBACf,oBAAoB,EAAE,CAAC;gBACvB,UAAU,EAAE,UAAU;aACvB,CAAC;YAEF,OAAO;gBACL,WAAW,EAAE,CAAC;wBACZ,IAAI,EAAE,eAAe,CAAC,OAAO,CAAC,QAAQ,EAAE;wBACxC,OAAO,EAAE,eAAe;qBACzB,CAAC;aACH,CAAC;QACJ,CAAC;IACH,CAAC;CACF;AAED,MAAa,iBAAiB;IAA9B;QACE,gBAAW,GAAyB;YAClC,WAAW,EAAE,aAAa;YAC1B,IAAI,EAAE,mBAAmB;YACzB,IAAI,EAAE,sBAAsB;YAC5B,KAAK,EAAE,CAAC,WAAW,CAAC;YACpB,OAAO,EAAE,CAAC;YACV,WAAW,EAAE,yIAAyI;YACtJ,QAAQ,EAAE;gBACR,IAAI,EAAE,aAAa;aACpB;YACD,KAAK,EAAE;gBACL,UAAU,EAAE,CAAC,IAAI,CAAC;gBAClB,aAAa,EAAE;oBACb,EAAE,EAAE,CAAC,iBAAiB,EAAE,aAAa,CAAC;iBACvC;gBACD,SAAS,EAAE;oBACT,oBAAoB,EAAE;wBACpB;4BACE,GAAG,EAAE,0CAA0C;yBAChD;qBACF;iB
ACF;aACF;YACD,yCAAyC;YACzC,kDAAkD;YAClD,sFAAsF;YACtF,iGAAiG;YACjG,wEAAwE;YACxE,2FAA2F;YAC3F,MAAM,EAAE;gBACN;oBACE,WAAW,EAAE,sCAAsC;oBACnD,IAAI,EAAE,kBAAyB;oBAC/B,cAAc,EAAE,CAAC;oBACjB,QAAQ,EAAE,IAAI;iBACf;gBACD;oBACE,WAAW,EAAE,8BAA8B;oBAC3C,IAAI,EAAE,kBAAyB;oBAC/B,cAAc,EAAE,CAAC;oBACjB,QAAQ,EAAE,IAAI;iBACf;aACF;YACD,2DAA2D;YAC3D,+EAA+E;YAC/E,OAAO,EAAE,CAAC,kBAAyB,CAAC;YACpC,WAAW,EAAE,CAAC,OAAO,CAAC;YACtB,UAAU,EAAE;gBACV;oBACE,WAAW,EAAE,mBAAmB;oBAChC,IAAI,EAAE,kBAAkB;oBACxB,IAAI,EAAE,QAAQ;oBACd,OAAO,EAAE,GAAG;oBACZ,WAAW,EAAE;wBACX,QAAQ,EAAE,CAAC;wBACX,QAAQ,EAAE,CAAC;wBACX,eAAe,EAAE,CAAC;qBACnB;oBACD,WAAW,EAAE,6GAA6G;iBAC3H;aACF;SACF,CAAC;IAkDJ,CAAC;IAhDC,KAAK,CAAC,UAAU;QACd,iBAAiB;QACjB,MAAM,gBAAgB,GAAG,IAAI,CAAC,gBAAgB,CAAC,kBAAkB,EAAE,CAAC,EAAE,GAAG,CAAW,CAAC;QAErF,sFAAsF;QACtF,6FAA6F;QAC7F,MAAM,WAAW,GAAG,MAAM,IAAI,CAAC,sBAAsB,CAAC,kBAAyB,EAAE,CAAC,CAAC,CAAC;QACpF,MAAM,YAAY,GAAG,CAAC,KAAK,CAAC,OAAO,CAAC,WAAW,CAAC,CAAC,CAAC,CAAC,WAAW,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,WAAW,CAAkB,CAAC;QAElG,IAAI,CAAC,YAAY,EAAE,CAAC;YAClB,MAAM,IAAI,iCAAkB,CAC1B,IAAI,CAAC,OAAO,EAAE,EACd,oGAAoG,CACrG,CAAC;QACJ,CAAC;QAED,6HAA6H;QAC7H,gGAAgG;QAChG,MAAM,mBAAmB,GAAG,KAAK,IAAI,EAAE;YACrC,MAAM,YAAY,GAAG,MAAM,IAAI,CAAC,sBAAsB,CAAC,kBAAyB,EAAE,CAAC,CAAC,CAAC;YACrF,MAAM,aAAa,GAAG,CAAC,KAAK,CAAC,OAAO,CAAC,YAAY,CAAC,CAAC,CAAC,CAAC,YAAY,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,YAAY,CAAkB,CAAC;YAEtG,IAAI,CAAC,aAAa,EAAE,CAAC;gBACnB,MAAM,IAAI,iCAAkB,CAC1B,IAAI,CAAC,OAAO,EAAE,EACd,wGAAwG,CACzG,CAAC;YACJ,CAAC;YAED,OAAO,aAAa,CAAC;QACvB,CAAC,CAAC;QAEF,OAAO,CAAC,GAAG,CAAC,4BAA4B,CAAC,CAAC;QAC1C,OAAO,CAAC,GAAG,CAAC,eAAe,OAAO,YAAY,CAAC,QAAQ,KAAK,UAAU,CAAC,CAAC,CAAC,YAAY,CAAC,QAAQ,EAAE,CAAC,CAAC,CAAC,WAAW,EAAE,CAAC,CAAC;QAClH,OAAO,CAAC,GAAG,CAAC,sDAAsD,CAAC,CAAC;QACpE,OAAO,CAAC,GAAG,CAAC,yBAAyB,gBAAgB,EAAE,CAAC,CAAC;QAEzD,yDAAyD;QACzD,MAAM,YAAY,GAAG,IAAI,gBAAgB,CACvC,YAAY,EACZ,mBAAmB,EACnB,gBAAgB,CACjB,CAAC;QAEF,OAAO;YACL,QAAQ,EAAE,YAAY;SACvB,CAAC;IACJ,CAAC;CACF;AAhHD,8CAgHC"}
|
package/package.json
CHANGED
|
@@ -1,6 +1,6 @@
|
|
|
1
1
|
{
|
|
2
2
|
"name": "@cascadeflow/n8n-nodes-cascadeflow",
|
|
3
|
-
"version": "0.5.
|
|
3
|
+
"version": "0.5.2",
|
|
4
4
|
"description": "n8n node for cascadeflow - Smart AI model cascading with 40-85% cost savings",
|
|
5
5
|
"keywords": [
|
|
6
6
|
"n8n-community-node-package",
|
|
@@ -21,6 +21,14 @@
|
|
|
21
21
|
"url": "https://github.com/lemony-ai/cascadeflow.git",
|
|
22
22
|
"directory": "packages/integrations/n8n"
|
|
23
23
|
},
|
|
24
|
+
"scripts": {
|
|
25
|
+
"build": "tsc && gulp build:icons",
|
|
26
|
+
"dev": "tsc --watch",
|
|
27
|
+
"format": "prettier nodes credentials --write",
|
|
28
|
+
"lint": "eslint \"nodes/**/*.ts\" \"credentials/**/*.ts\" package.json",
|
|
29
|
+
"lintfix": "eslint \"nodes/**/*.ts\" \"credentials/**/*.ts\" package.json --fix",
|
|
30
|
+
"prepublishOnly": "npm run build && npm run lint"
|
|
31
|
+
},
|
|
24
32
|
"files": [
|
|
25
33
|
"dist"
|
|
26
34
|
],
|
|
@@ -47,13 +55,7 @@
|
|
|
47
55
|
"n8n-workflow": "*"
|
|
48
56
|
},
|
|
49
57
|
"dependencies": {
|
|
50
|
-
"@langchain/core": "^0.3.0"
|
|
51
|
-
|
|
52
|
-
"scripts": {
|
|
53
|
-
"build": "tsc && gulp build:icons",
|
|
54
|
-
"dev": "tsc --watch",
|
|
55
|
-
"format": "prettier nodes credentials --write",
|
|
56
|
-
"lint": "eslint \"nodes/**/*.ts\" \"credentials/**/*.ts\" package.json",
|
|
57
|
-
"lintfix": "eslint \"nodes/**/*.ts\" \"credentials/**/*.ts\" package.json --fix"
|
|
58
|
+
"@langchain/core": "^0.3.0",
|
|
59
|
+
"@cascadeflow/core": "^5.0.3"
|
|
58
60
|
}
|
|
59
|
-
}
|
|
61
|
+
}
|
package/LICENSE
DELETED
|
@@ -1,21 +0,0 @@
|
|
|
1
|
-
MIT License
|
|
2
|
-
|
|
3
|
-
Copyright (c) 2025 Lemony Inc.
|
|
4
|
-
|
|
5
|
-
Permission is hereby granted, free of charge, to any person obtaining a copy
|
|
6
|
-
of this software and associated documentation files (the "Software"), to deal
|
|
7
|
-
in the Software without restriction, including without limitation the rights
|
|
8
|
-
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
|
|
9
|
-
copies of the Software, and to permit persons to whom the Software is
|
|
10
|
-
furnished to do so, subject to the following conditions:
|
|
11
|
-
|
|
12
|
-
The above copyright notice and this permission notice shall be included in all
|
|
13
|
-
copies or substantial portions of the Software.
|
|
14
|
-
|
|
15
|
-
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
|
|
16
|
-
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
|
|
17
|
-
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
|
|
18
|
-
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
|
|
19
|
-
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
|
|
20
|
-
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
|
|
21
|
-
SOFTWARE.
|