agentic-flow 1.3.0 → 1.3.1
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +53 -3
- package/dist/agents/claudeAgent.js +18 -0
- package/dist/cli/claude-code-wrapper.js +9 -0
- package/dist/cli-proxy.js +88 -6
- package/dist/proxy/anthropic-to-requesty.js +627 -0
- package/dist/utils/modelCapabilities.js +22 -0
- package/docs/plans/agent-booster/00-INDEX.md +230 -0
- package/docs/plans/agent-booster/00-OVERVIEW.md +454 -0
- package/docs/plans/agent-booster/01-ARCHITECTURE.md +699 -0
- package/docs/plans/agent-booster/02-INTEGRATION.md +771 -0
- package/docs/plans/agent-booster/03-BENCHMARKS.md +616 -0
- package/docs/plans/agent-booster/04-NPM-SDK.md +673 -0
- package/docs/plans/agent-booster/GITHUB-ISSUE.md +523 -0
- package/docs/plans/agent-booster/README.md +576 -0
- package/docs/plans/requesty/00-overview.md +176 -0
- package/docs/plans/requesty/01-api-research.md +573 -0
- package/docs/plans/requesty/02-architecture.md +1076 -0
- package/docs/plans/requesty/03-implementation-phases.md +1129 -0
- package/docs/plans/requesty/04-testing-strategy.md +905 -0
- package/docs/plans/requesty/05-migration-guide.md +576 -0
- package/docs/plans/requesty/README.md +290 -0
- package/package.json +1 -1
package/README.md
CHANGED
@@ -6,7 +6,7 @@
 [](https://nodejs.org/)
 [](https://github.com/ruvnet/)
 
-**Production-ready AI agent orchestration with 66+ specialized agents, 213 MCP tools, and multi-model routing (Anthropic, OpenRouter, Gemini, ONNX).**
+**Production-ready AI agent orchestration with 66+ specialized agents, 213 MCP tools, and multi-model routing (Anthropic, OpenRouter, Requesty, Gemini, ONNX).**
 
 ---
 
@@ -23,10 +23,10 @@ Extending agent capabilities is effortless. Add custom tools and integrations th
 Define routing rules through flexible policy modes: Strict mode keeps sensitive data offline, Economy mode prefers free models (99% savings), Premium mode uses Anthropic for highest quality, or create custom cost/quality thresholds. The policy defines the rules; the swarm enforces them automatically. Runs local for development, Docker for CI/CD, or Flow Nexus cloud for production scale. Agentic Flow is the framework for autonomous efficiency—one unified runner for every Claude Code agent, self-tuning, self-routing, and built for real-world deployment.
 
 **Key Capabilities:**
-- ✅ **Claude Code Mode** - Run Claude Code with OpenRouter/Gemini/ONNX (85-99% savings)
+- ✅ **Claude Code Mode** - Run Claude Code with OpenRouter/Requesty/Gemini/ONNX (85-99% savings)
 - ✅ **66 Specialized Agents** - Pre-built experts for coding, research, review, testing, DevOps
 - ✅ **213 MCP Tools** - Memory, GitHub, neural networks, sandboxes, workflows, payments
-- ✅ **Multi-Model Router** - Anthropic, OpenRouter (100+ models), Gemini, ONNX (free local)
+- ✅ **Multi-Model Router** - Anthropic, OpenRouter (100+ models), Requesty (300+ models), Gemini, ONNX (free local)
 - ✅ **Cost Optimization** - DeepSeek at $0.14/M tokens vs Claude at $15/M (99% savings)
 
 **Built On:**
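The 99% figure in the cost bullet above checks out as a simple per-token price ratio (a rough sanity check only; real bills also depend on the input/output token split):

```javascript
// Rough check of the quoted savings: DeepSeek at $0.14/M tokens vs
// Claude at $15/M tokens, compared as a per-token price ratio.
const claudePerM = 15.0;   // $/M tokens
const deepseekPerM = 0.14; // $/M tokens
const savingsPct = (1 - deepseekPerM / claudePerM) * 100;
console.log(savingsPct.toFixed(1)); // "99.1"
```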
@@ -56,6 +56,10 @@ npx agentic-flow --agent coder --task "Build a REST API with authentication"
 export OPENROUTER_API_KEY=sk-or-v1-...
 npx agentic-flow --agent coder --task "Build REST API" --model "meta-llama/llama-3.1-8b-instruct"
 
+# Run with Requesty (300+ models, 95% cost savings)
+export REQUESTY_API_KEY=sk-...
+npx agentic-flow --agent coder --task "Build REST API" --provider requesty
+
 # Run with Gemini (free tier)
 export GOOGLE_GEMINI_API_KEY=AIza...
 npx agentic-flow --agent coder --task "Build REST API" --provider gemini
@@ -129,14 +133,17 @@ Automatically spawns Claude Code with proxy configuration for OpenRouter, Gemi
 ```bash
 # Interactive mode - Opens Claude Code UI with proxy
 npx agentic-flow claude-code --provider openrouter
+npx agentic-flow claude-code --provider requesty
 npx agentic-flow claude-code --provider gemini
 
 # Non-interactive mode - Execute task and exit
 npx agentic-flow claude-code --provider openrouter "Write a Python hello world function"
+npx agentic-flow claude-code --provider requesty "Write a Python hello world function"
 npx agentic-flow claude-code --provider openrouter --model "deepseek/deepseek-chat" "Create REST API"
 
 # Use specific models
 npx agentic-flow claude-code --provider openrouter --model "mistralai/mistral-small"
+npx agentic-flow claude-code --provider requesty --model "openai/gpt-4o-mini"
 npx agentic-flow claude-code --provider gemini --model "gemini-2.0-flash-exp"
 
 # Local ONNX models (100% free, privacy-focused)
@@ -148,6 +155,8 @@ npx agentic-flow claude-code --provider onnx "Analyze this codebase"
 | Provider | Model | Cost/M Tokens | Context | Best For |
 |----------|-------|---------------|---------|----------|
 | OpenRouter | `deepseek/deepseek-chat` (default) | $0.14 | 128k | General tasks, best value |
+| Requesty | `deepseek/deepseek-chat` | $0.14 | 128k | 300+ models, unified API |
+| Requesty | `openai/gpt-4o-mini` | $0.15 | 128k | OpenAI models via Requesty |
 | OpenRouter | `anthropic/claude-3.5-sonnet` | $3.00 | 200k | Highest quality, complex reasoning |
 | OpenRouter | `google/gemini-2.0-flash-exp:free` | FREE | 1M | Development, testing (rate limited) |
 | Gemini | `gemini-2.0-flash-exp` | FREE | 1M | Fast responses, rate limited |
@@ -169,6 +178,9 @@ npx agentic-flow claude-code --provider onnx "Analyze this codebase"
 # OpenRouter (100+ models at 85-99% savings)
 export OPENROUTER_API_KEY=sk-or-v1-...
 
+# Requesty (300+ models, unified access)
+export REQUESTY_API_KEY=sk-...
+
 # Gemini (FREE tier available)
 export GOOGLE_GEMINI_API_KEY=AIza...
 
@@ -1014,11 +1026,49 @@ export HEALTH_PORT=8080 # Health check port
 # .env file (auto-loaded)
 ANTHROPIC_API_KEY=sk-ant-...
 OPENROUTER_API_KEY=sk-or-v1-...
+REQUESTY_API_KEY=sk-...
 GOOGLE_GEMINI_API_KEY=AIza...
 ENABLE_CLAUDE_FLOW_SDK=true
 COMPLETION_MODEL=deepseek/deepseek-chat-v3.1
 ```
 
+**Requesty Configuration (v1.3.1+)**
+
+Requesty.ai provides unified access to 300+ AI models including OpenAI (GPT-4o, GPT-4o-mini), Anthropic, DeepSeek, Meta, Mistral, and more through a single API key:
+
+```bash
+# Get your API key from https://requesty.ai
+export REQUESTY_API_KEY=sk-...
+
+# Use with agents
+npx agentic-flow --agent coder --task "your task" --provider requesty
+
+# Use with Claude Code
+npx agentic-flow claude-code --provider requesty
+
+# Specify model (default: deepseek/deepseek-chat)
+npx agentic-flow --agent coder --task "task" --provider requesty --model "openai/gpt-4o-mini"
+
+# Enable via environment variable
+export USE_REQUESTY=true
+npx agentic-flow --agent coder --task "your task"
+```
+
+**Supported Requesty Models:**
+- `deepseek/deepseek-chat` (default) - $0.14/M tokens, 128k context, native tools
+- `openai/gpt-4o` - $2.50/M tokens, 128k context, native tools
+- `openai/gpt-4o-mini` - $0.15/M tokens, 128k context, native tools
+- `openai/gpt-4-turbo` - $10/M tokens, 128k context, native tools
+- 300+ other models available
+
+**Features:**
+- ✅ OpenAI-compatible API format
+- ✅ Native tool calling support for GPT-4o, DeepSeek
+- ✅ Automatic max_tokens optimization (capped at 8192)
+- ✅ 95% cost savings vs Anthropic direct API
+- ✅ Full MCP tool integration (all 213 tools)
+- ✅ Works with `--verbose` flag for debugging
+
 ---
 
 ## 📚 Complete Agent List
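The README section above says the proxy presents an Anthropic-style endpoint while forwarding to Requesty's OpenAI-compatible API, with max_tokens capped at 8192. As an illustration only (not the shipped `anthropic-to-requesty.js`), the request-body mapping could look like:

```javascript
// Hypothetical sketch of the request translation: an Anthropic Messages API
// body becomes an OpenAI-style chat body, with the system prompt folded into
// the messages array and max_tokens clamped to the 8192 cap noted above.
function anthropicToOpenAI(body, defaultModel = 'deepseek/deepseek-chat') {
  return {
    model: body.model || defaultModel,
    max_tokens: Math.min(body.max_tokens ?? 4096, 8192), // proxy cap
    messages: (body.system ? [{ role: 'system', content: body.system }] : [])
      .concat(body.messages ?? []),
  };
}

const out = anthropicToOpenAI({
  model: 'openai/gpt-4o-mini',
  max_tokens: 64000,
  system: 'You are a coder.',
  messages: [{ role: 'user', content: 'Write hello world' }],
});
console.log(out.max_tokens); // 8192
```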
package/dist/agents/claudeAgent.js
CHANGED

@@ -8,6 +8,9 @@ function getCurrentProvider() {
     if (process.env.PROVIDER === 'gemini' || process.env.USE_GEMINI === 'true') {
         return 'gemini';
     }
+    if (process.env.PROVIDER === 'requesty' || process.env.USE_REQUESTY === 'true') {
+        return 'requesty';
+    }
     if (process.env.PROVIDER === 'openrouter' || process.env.USE_OPENROUTER === 'true') {
         return 'openrouter';
     }
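The hunk above fixes a precedence order: gemini is checked before requesty, and requesty before openrouter. A standalone sketch of that ordering (a hypothetical helper taking env as a parameter so it can be tested in isolation; the anthropic fallback is assumed, since the function's tail is outside this hunk):

```javascript
// Provider precedence implied by the patched getCurrentProvider():
// gemini > requesty > openrouter, with anthropic assumed as the default.
function resolveProvider(env) {
  if (env.PROVIDER === 'gemini' || env.USE_GEMINI === 'true') return 'gemini';
  if (env.PROVIDER === 'requesty' || env.USE_REQUESTY === 'true') return 'requesty';
  if (env.PROVIDER === 'openrouter' || env.USE_OPENROUTER === 'true') return 'openrouter';
  return 'anthropic';
}

console.log(resolveProvider({ USE_REQUESTY: 'true' }));                     // requesty
console.log(resolveProvider({ PROVIDER: 'gemini', USE_REQUESTY: 'true' })); // gemini
```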
@@ -24,6 +27,12 @@ function getModelForProvider(provider) {
                 apiKey: process.env.GOOGLE_GEMINI_API_KEY || process.env.ANTHROPIC_API_KEY || '',
                 baseURL: process.env.PROXY_URL || undefined
             };
+        case 'requesty':
+            return {
+                model: process.env.COMPLETION_MODEL || 'deepseek/deepseek-chat',
+                apiKey: process.env.REQUESTY_API_KEY || process.env.ANTHROPIC_API_KEY || '',
+                baseURL: process.env.PROXY_URL || undefined
+            };
         case 'openrouter':
             return {
                 model: process.env.COMPLETION_MODEL || 'deepseek/deepseek-chat',
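The new requesty case mirrors the gemini one: the API key falls back from the provider key to ANTHROPIC_API_KEY, and the model falls back to deepseek/deepseek-chat. Distilled into a pure function (a sketch, not the shipped code):

```javascript
// Fallback chains in the requesty branch of getModelForProvider():
// COMPLETION_MODEL -> deepseek/deepseek-chat, and
// REQUESTY_API_KEY -> ANTHROPIC_API_KEY -> ''.
function requestyModelConfig(env) {
  return {
    model: env.COMPLETION_MODEL || 'deepseek/deepseek-chat',
    apiKey: env.REQUESTY_API_KEY || env.ANTHROPIC_API_KEY || '',
    baseURL: env.PROXY_URL || undefined,
  };
}

console.log(requestyModelConfig({}).model);                                 // deepseek/deepseek-chat
console.log(requestyModelConfig({ ANTHROPIC_API_KEY: 'sk-ant-1' }).apiKey); // sk-ant-1
```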
@@ -75,6 +84,15 @@ export async function claudeAgent(agent, input, onStream, modelOverride) {
             model: finalModel
         });
     }
+    else if (provider === 'requesty' && process.env.REQUESTY_API_KEY) {
+        // Use ANTHROPIC_BASE_URL if already set by CLI (proxy mode)
+        envOverrides.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY || 'proxy-key';
+        envOverrides.ANTHROPIC_BASE_URL = process.env.ANTHROPIC_BASE_URL || process.env.REQUESTY_PROXY_URL || 'http://localhost:3000';
+        logger.info('Using Requesty proxy', {
+            proxyUrl: envOverrides.ANTHROPIC_BASE_URL,
+            model: finalModel
+        });
+    }
     else if (provider === 'openrouter' && process.env.OPENROUTER_API_KEY) {
         // Use ANTHROPIC_BASE_URL if already set by CLI (proxy mode)
         envOverrides.ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY || 'proxy-key';
package/dist/cli/claude-code-wrapper.js
CHANGED

@@ -43,6 +43,15 @@ function getProxyConfig(provider, customPort) {
                 apiKey: process.env.OPENROUTER_API_KEY || '',
                 requiresProxy: true
             };
+        case 'requesty':
+            return {
+                provider: 'requesty',
+                port,
+                baseUrl,
+                model: process.env.COMPLETION_MODEL || 'deepseek/deepseek-chat',
+                apiKey: process.env.REQUESTY_API_KEY || '',
+                requiresProxy: true
+            };
         case 'gemini':
             return {
                 provider: 'gemini',
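Each proxy-backed provider case in getProxyConfig() returns the same shape, and requiresProxy: true is what tells the wrapper to start the local translation proxy before spawning Claude Code. A hypothetical standalone version of the requesty case, with port and baseUrl passed in explicitly:

```javascript
// Standalone sketch of the requesty branch of getProxyConfig(); not the
// shipped code, but the same config shape shared with openrouter/gemini.
function requestyProxyConfig(env, port, baseUrl) {
  return {
    provider: 'requesty',
    port,
    baseUrl,
    model: env.COMPLETION_MODEL || 'deepseek/deepseek-chat',
    apiKey: env.REQUESTY_API_KEY || '',
    requiresProxy: true, // wrapper must start the translation proxy first
  };
}

const cfg = requestyProxyConfig({ REQUESTY_API_KEY: 'sk-r1' }, 3000, 'http://localhost:3000');
console.log(cfg.requiresProxy); // true
```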
package/dist/cli-proxy.js
CHANGED
@@ -25,6 +25,7 @@ function loadEnvRecursive(startPath = process.cwd()) {
 }
 loadEnvRecursive();
 import { AnthropicToOpenRouterProxy } from "./proxy/anthropic-to-openrouter.js";
+import { AnthropicToRequestyProxy } from "./proxy/anthropic-to-requesty.js";
 import { logger } from "./utils/logger.js";
 import { parseArgs } from "./utils/cli.js";
 import { getAgent, listAgents } from "./utils/agentLoader.js";
@@ -154,6 +155,7 @@ class AgenticFlowCLI {
         const useONNX = this.shouldUseONNX(options);
         const useOpenRouter = this.shouldUseOpenRouter(options);
         const useGemini = this.shouldUseGemini(options);
+        const useRequesty = this.shouldUseRequesty(options);
         // Debug output for provider selection
         if (options.verbose || process.env.VERBOSE === 'true') {
             console.log('\n🔍 Provider Selection Debug:');
@@ -162,16 +164,22 @@ class AgenticFlowCLI {
             console.log(` Use ONNX: ${useONNX}`);
             console.log(` Use OpenRouter: ${useOpenRouter}`);
             console.log(` Use Gemini: ${useGemini}`);
+            console.log(` Use Requesty: ${useRequesty}`);
             console.log(` OPENROUTER_API_KEY: ${process.env.OPENROUTER_API_KEY ? '✓ set' : '✗ not set'}`);
             console.log(` GOOGLE_GEMINI_API_KEY: ${process.env.GOOGLE_GEMINI_API_KEY ? '✓ set' : '✗ not set'}`);
+            console.log(` REQUESTY_API_KEY: ${process.env.REQUESTY_API_KEY ? '✓ set' : '✗ not set'}`);
             console.log(` ANTHROPIC_API_KEY: ${process.env.ANTHROPIC_API_KEY ? '✓ set' : '✗ not set'}\n`);
         }
         try {
-            // Start proxy if needed (ONNX, OpenRouter, or
+            // Start proxy if needed (ONNX, OpenRouter, Gemini, or Requesty)
             if (useONNX) {
                 console.log('🚀 Initializing ONNX local inference proxy...');
                 await this.startONNXProxy(options.model);
             }
+            else if (useRequesty) {
+                console.log('🚀 Initializing Requesty proxy...');
+                await this.startRequestyProxy(options.model);
+            }
             else if (useOpenRouter) {
                 console.log('🚀 Initializing OpenRouter proxy...');
                 await this.startOpenRouterProxy(options.model);
@@ -184,7 +192,7 @@ class AgenticFlowCLI {
             console.log('🚀 Using direct Anthropic API...\n');
         }
         // Run agent
-        await this.runAgent(options, useOpenRouter, useGemini, useONNX);
+        await this.runAgent(options, useOpenRouter, useGemini, useONNX, useRequesty);
         logger.info('Execution completed successfully');
         process.exit(0);
     }
@@ -227,14 +235,30 @@ class AgenticFlowCLI {
         }
         return false;
     }
+    shouldUseRequesty(options) {
+        // Use Requesty if:
+        // 1. Provider is explicitly set to requesty
+        // 2. PROVIDER env var is set to requesty
+        // 3. USE_REQUESTY env var is set
+        if (options.provider === 'requesty' || process.env.PROVIDER === 'requesty') {
+            return true;
+        }
+        if (process.env.USE_REQUESTY === 'true') {
+            return true;
+        }
+        return false;
+    }
     shouldUseOpenRouter(options) {
-        // Don't use OpenRouter if ONNX or
+        // Don't use OpenRouter if ONNX, Gemini, or Requesty is explicitly requested
        if (options.provider === 'onnx' || process.env.USE_ONNX === 'true' || process.env.PROVIDER === 'onnx') {
             return false;
         }
         if (options.provider === 'gemini' || process.env.PROVIDER === 'gemini') {
             return false;
         }
+        if (options.provider === 'requesty' || process.env.PROVIDER === 'requesty') {
+            return false;
+        }
         // Use OpenRouter if:
         // 1. Provider is explicitly set to openrouter
         // 2. Model parameter contains "/" (e.g., "meta-llama/llama-3.1-8b-instruct")
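The point of the new guard in shouldUseOpenRouter is mutual exclusion: an explicit requesty selection wins even when the model name contains "/" (the heuristic that would otherwise route to OpenRouter). A simplified standalone sketch of the two predicates, taking (options, env) so they can be exercised directly:

```javascript
// Simplified sketches of the patched predicates (not the shipped code):
// an explicit requesty choice disables the OpenRouter path, so at most one
// proxy provider is selected even when the model name contains "/".
function shouldUseRequesty(opts, env) {
  return opts.provider === 'requesty' || env.PROVIDER === 'requesty' ||
         env.USE_REQUESTY === 'true';
}
function shouldUseOpenRouter(opts, env) {
  const excluded = ['onnx', 'gemini', 'requesty'];
  if (excluded.includes(opts.provider) || excluded.includes(env.PROVIDER) ||
      env.USE_ONNX === 'true') return false;
  return opts.provider === 'openrouter' || env.USE_OPENROUTER === 'true' ||
         Boolean(opts.model && opts.model.includes('/'));
}

const opts = { provider: 'requesty', model: 'openai/gpt-4o-mini' };
console.log(shouldUseRequesty(opts, {}));   // true
console.log(shouldUseOpenRouter(opts, {})); // false (despite "/" in model)
```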
@@ -326,6 +350,46 @@ class AgenticFlowCLI {
         // Wait for proxy to be ready
         await new Promise(resolve => setTimeout(resolve, 1500));
     }
+    async startRequestyProxy(modelOverride) {
+        const requestyKey = process.env.REQUESTY_API_KEY;
+        if (!requestyKey) {
+            console.error('❌ Error: REQUESTY_API_KEY required for Requesty models');
+            console.error('Set it in .env or export REQUESTY_API_KEY=sk-xxxxx');
+            process.exit(1);
+        }
+        logger.info('Starting integrated Requesty proxy');
+        const defaultModel = modelOverride ||
+            process.env.COMPLETION_MODEL ||
+            process.env.REASONING_MODEL ||
+            'deepseek/deepseek-chat';
+        const capabilities = detectModelCapabilities(defaultModel);
+        const proxy = new AnthropicToRequestyProxy({
+            requestyApiKey: requestyKey,
+            requestyBaseUrl: process.env.REQUESTY_BASE_URL,
+            defaultModel,
+            capabilities: capabilities
+        });
+        // Start proxy in background
+        proxy.start(this.proxyPort);
+        this.proxyServer = proxy;
+        // Set ANTHROPIC_BASE_URL to proxy
+        process.env.ANTHROPIC_BASE_URL = `http://localhost:${this.proxyPort}`;
+        // Set dummy ANTHROPIC_API_KEY for proxy (actual auth uses REQUESTY_API_KEY)
+        if (!process.env.ANTHROPIC_API_KEY) {
+            process.env.ANTHROPIC_API_KEY = 'sk-ant-proxy-dummy-key';
+        }
+        console.log(`🔗 Proxy Mode: Requesty`);
+        console.log(`🔧 Proxy URL: http://localhost:${this.proxyPort}`);
+        console.log(`🤖 Default Model: ${defaultModel}`);
+        if (capabilities.requiresEmulation) {
+            console.log(`\n⚙️ Detected: Model lacks native tool support`);
+            console.log(`🔧 Using ${capabilities.emulationStrategy.toUpperCase()} emulation pattern`);
+            console.log(`📊 Expected reliability: ${capabilities.emulationStrategy === 'react' ? '70-85%' : '50-70%'}`);
+        }
+        console.log('');
+        // Wait for proxy to be ready
+        await new Promise(resolve => setTimeout(resolve, 1500));
+    }
     async startONNXProxy(modelOverride) {
         logger.info('Starting integrated ONNX local inference proxy');
         console.log('🔧 Provider: ONNX Local (Phi-4-mini)');
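The environment rewiring in startRequestyProxy() is the key trick: the Anthropic SDK inside Claude Code is pointed at the local proxy, and a dummy key satisfies its key check while real authentication happens proxy-side with REQUESTY_API_KEY. Distilled into a pure function (a sketch, not the shipped code):

```javascript
// Sketch of the env overrides startRequestyProxy() applies before the agent
// runs: redirect the Anthropic SDK to the local proxy and supply a dummy
// key only when none is set (real auth uses REQUESTY_API_KEY proxy-side).
function requestyEnv(env, proxyPort) {
  const out = { ...env };
  out.ANTHROPIC_BASE_URL = `http://localhost:${proxyPort}`;
  if (!out.ANTHROPIC_API_KEY) out.ANTHROPIC_API_KEY = 'sk-ant-proxy-dummy-key';
  return out;
}

const env = requestyEnv({ REQUESTY_API_KEY: 'sk-r1' }, 3000);
console.log(env.ANTHROPIC_BASE_URL); // http://localhost:3000
console.log(env.ANTHROPIC_API_KEY);  // sk-ant-proxy-dummy-key
```

A real ANTHROPIC_API_KEY, if present, is left untouched; only the base URL changes.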
@@ -494,7 +558,7 @@ EXAMPLES:
   claude
 `);
     }
-    async runAgent(options, useOpenRouter, useGemini, useONNX = false) {
+    async runAgent(options, useOpenRouter, useGemini, useONNX = false, useRequesty = false) {
         const agentName = options.agent || process.env.AGENT || '';
         const task = options.task || process.env.TASK || '';
         if (!agentName) {
@@ -513,12 +577,24 @@ EXAMPLES:
         }
         // Check for API key (unless using ONNX)
         const isOnnx = options.provider === 'onnx' || process.env.USE_ONNX === 'true' || process.env.PROVIDER === 'onnx';
-        if (!isOnnx && !useOpenRouter && !useGemini && !process.env.ANTHROPIC_API_KEY) {
+        if (!isOnnx && !useOpenRouter && !useGemini && !useRequesty && !process.env.ANTHROPIC_API_KEY) {
             console.error('\n❌ Error: ANTHROPIC_API_KEY is required\n');
             console.error('Please set your API key:');
             console.error('  export ANTHROPIC_API_KEY=sk-ant-xxxxx\n');
             console.error('Or use alternative providers:');
             console.error('  --provider openrouter (requires OPENROUTER_API_KEY)');
+            console.error('  --provider requesty (requires REQUESTY_API_KEY)');
+            console.error('  --provider gemini (requires GOOGLE_GEMINI_API_KEY)');
+            console.error('  --provider onnx (free local inference)\n');
+            process.exit(1);
+        }
+        if (!isOnnx && useRequesty && !process.env.REQUESTY_API_KEY) {
+            console.error('\n❌ Error: REQUESTY_API_KEY is required for Requesty\n');
+            console.error('Please set your API key:');
+            console.error('  export REQUESTY_API_KEY=sk-xxxxx\n');
+            console.error('Or use alternative providers:');
+            console.error('  --provider anthropic (requires ANTHROPIC_API_KEY)');
+            console.error('  --provider openrouter (requires OPENROUTER_API_KEY)');
             console.error('  --provider gemini (requires GOOGLE_GEMINI_API_KEY)');
             console.error('  --provider onnx (free local inference)\n');
             process.exit(1);
@@ -529,6 +605,7 @@ EXAMPLES:
             console.error('  export OPENROUTER_API_KEY=sk-or-v1-xxxxx\n');
             console.error('Or use alternative providers:');
             console.error('  --provider anthropic (requires ANTHROPIC_API_KEY)');
+            console.error('  --provider requesty (requires REQUESTY_API_KEY)');
             console.error('  --provider gemini (requires GOOGLE_GEMINI_API_KEY)');
             console.error('  --provider onnx (free local inference)\n');
             process.exit(1);
@@ -665,7 +742,7 @@ AGENT COMMANDS:
 OPTIONS:
   --task, -t <task>      Task description for agent mode
   --model, -m <model>    Model to use (triggers OpenRouter if contains "/")
-  --provider, -p <name>  Provider to use (anthropic, openrouter, gemini, onnx)
+  --provider, -p <name>  Provider to use (anthropic, openrouter, requesty, gemini, onnx)
   --stream, -s           Enable real-time streaming output
   --help, -h             Show this help message
 
|
@@ -712,16 +789,19 @@ EXAMPLES:
|
|
|
712
789
|
|
|
713
790
|
# Proxy Server for Claude Code/Cursor
|
|
714
791
|
npx agentic-flow proxy --provider openrouter --port 3000
|
|
792
|
+
npx agentic-flow proxy --provider requesty --port 3000
|
|
715
793
|
npx agentic-flow proxy --provider gemini --port 3001
|
|
716
794
|
|
|
717
795
|
# Claude Code Integration (Auto-start proxy + spawn Claude Code)
|
|
718
796
|
npx agentic-flow claude-code --provider openrouter "Write a Python function"
|
|
797
|
+
npx agentic-flow claude-code --provider requesty "Write a Python function"
|
|
719
798
|
npx agentic-flow claude-code --provider gemini "Create a REST API"
|
|
720
799
|
npx agentic-flow claude-code --provider anthropic "Help me debug this code"
|
|
721
800
|
|
|
722
801
|
# Agent Execution
|
|
723
802
|
npx agentic-flow --list # List all 150+ agents
|
|
724
803
|
npx agentic-flow --agent coder --task "Create Python hello world"
|
|
804
|
+
npx agentic-flow --agent coder --task "Create REST API" --provider requesty
|
|
725
805
|
npx agentic-flow --agent coder --task "Create REST API" --model "meta-llama/llama-3.1-8b-instruct"
|
|
726
806
|
npx agentic-flow --agent coder --task "Create code" --provider onnx
|
|
727
807
|
|
|
@@ -734,8 +814,10 @@ EXAMPLES:
 ENVIRONMENT VARIABLES:
   ANTHROPIC_API_KEY      Anthropic API key (for Claude models)
   OPENROUTER_API_KEY     OpenRouter API key (for alternative models)
+  REQUESTY_API_KEY       Requesty API key (for Requesty models - 300+ models)
   GOOGLE_GEMINI_API_KEY  Google Gemini API key (for Gemini models)
   USE_OPENROUTER         Set to 'true' to force OpenRouter usage
+  USE_REQUESTY           Set to 'true' to force Requesty usage
   USE_GEMINI             Set to 'true' to force Gemini usage
   COMPLETION_MODEL       Default model for OpenRouter
   AGENTS_DIR             Path to agents directory