clasp-ai 0.0.1 → 0.7.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/LICENSE ADDED
@@ -0,0 +1,21 @@
+ MIT License
+
+ Copyright (c) 2024 jedarden
+
+ Permission is hereby granted, free of charge, to any person obtaining a copy
+ of this software and associated documentation files (the "Software"), to deal
+ in the Software without restriction, including without limitation the rights
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ copies of the Software, and to permit persons to whom the Software is
+ furnished to do so, subject to the following conditions:
+
+ The above copyright notice and this permission notice shall be included in all
+ copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+ SOFTWARE.
package/README.md CHANGED
@@ -1,59 +1,516 @@
- # CLASP
+ # CLASP - Claude Language Agent Super Proxy
 
- **C**laude **L**anguage **A**gent **S**uper **P**roxy
+ A high-performance Go proxy that translates Claude/Anthropic API calls to OpenAI-compatible endpoints, enabling Claude Code to work with any LLM provider.
 
- Run Claude Code with any LLM provider - OpenAI, Azure, OpenRouter, and more.
+ ## Features
+
+ - **Multi-Provider Support**: OpenAI, Azure OpenAI, OpenRouter (200+ models), and custom endpoints (Ollama, vLLM, LM Studio)
+ - **Full Protocol Translation**: Anthropic Messages API ↔ OpenAI Chat Completions API
+ - **SSE Streaming**: Real-time token streaming with state machine processing
+ - **Tool Calls**: Complete translation of tool_use/tool_result between formats
+ - **Connection Pooling**: Optimized HTTP transport with persistent connections
+ - **Retry Logic**: Exponential backoff for transient failures
+ - **Metrics Endpoint**: Request statistics and performance monitoring
+ - **API Key Authentication**: Secure the proxy with optional API key validation
 
  ## Installation
 
+ ### Via npm (recommended)
+
  ```bash
+ # Install globally
  npm install -g clasp-ai
+
+ # Or run directly with npx
+ npx clasp-ai
  ```
 
- ## Features (Coming Soon)
+ ### Via Go
 
- - **Multi-Provider Support**: OpenAI, Azure OpenAI, OpenRouter, Anthropic, custom endpoints
- - **Zero Configuration**: Auto-detects provider from environment variables
- - **Interactive Setup**: Guided configuration for any provider
- - **Model Selection**: Dynamic model fetching from provider APIs
- - **Profile System**: Save and switch between provider configurations
- - **Auto-Update**: Built-in update mechanism
+ ```bash
+ go install github.com/jedarden/clasp/cmd/clasp@latest
+ ```
 
- ## Supported Providers
+ ### From Source
 
- | Provider | Models | Status |
- |----------|--------|--------|
- | OpenRouter | 200+ models | Coming Soon |
- | OpenAI | GPT-4o, o1, o3 | Coming Soon |
- | Azure OpenAI | Enterprise deployments | Coming Soon |
- | Anthropic | Claude (passthrough) | Coming Soon |
- | Custom | Ollama, vLLM, LM Studio | Coming Soon |
+ ```bash
+ git clone https://github.com/jedarden/CLASP.git
+ cd CLASP
+ make build
+ ```
 
- ## Usage (Preview)
+ ### Via Docker
 
  ```bash
- # With OpenRouter (default)
- export OPENROUTER_API_KEY=sk-or-...
- clasp
+ # Run with Docker
+ docker run -d -p 8080:8080 \
+ -e OPENAI_API_KEY=sk-... \
+ ghcr.io/jedarden/clasp-ai:latest
+
+ # Or with docker-compose
+ docker-compose up -d
+ ```
+
+ ## Quick Start
 
- # With OpenAI
- export PROVIDER=openai
+ ### Using with OpenAI
+
+ ```bash
+ # Set your API key
  export OPENAI_API_KEY=sk-...
- clasp --model gpt-4o
 
- # With Azure
- export PROVIDER=azure
- export AZURE_API_KEY=...
- export AZURE_OPENAI_ENDPOINT=https://xxx.openai.azure.com
+ # Start the proxy
+ clasp -model gpt-4o
+
+ # In another terminal, use Claude Code
+ ANTHROPIC_BASE_URL=http://localhost:8080 claude
+ ```
+
+ ### Using with Azure OpenAI
+
+ ```bash
+ export AZURE_API_KEY=your-key
+ export AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com
  export AZURE_DEPLOYMENT_NAME=gpt-4
- clasp
 
- # With local Ollama
- export PROVIDER=custom
+ clasp -provider azure
+ ```
+
+ ### Using with OpenRouter
+
+ ```bash
+ export OPENROUTER_API_KEY=sk-or-...
+
+ clasp -provider openrouter -model anthropic/claude-3-sonnet
+ ```
+
+ ### Using with Local Models (Ollama)
+
+ ```bash
  export CUSTOM_BASE_URL=http://localhost:11434/v1
- clasp --model llama3.1:70b
+
+ clasp -provider custom -model llama3.1
+ ```
+
+ ## Configuration
+
+ ### Command Line Options
+
+ ```
+ clasp [options]
+
+ Options:
+ -port <port> Port to listen on (default: 8080)
+ -provider <name> LLM provider: openai, azure, openrouter, custom
+ -model <model> Default model to use for all requests
+ -debug Enable debug logging (full request/response)
+ -rate-limit Enable rate limiting
+ -cache Enable response caching
+ -cache-max-size <n> Maximum cache entries (default: 1000)
+ -cache-ttl <n> Cache TTL in seconds (default: 3600)
+ -multi-provider Enable multi-provider tier routing
+ -fallback Enable fallback routing for auto-failover
+ -auth Enable API key authentication
+ -auth-api-key <key> API key for authentication (required with -auth)
+ -version Show version information
+ -help Show help message
+ ```
+
+ ### Environment Variables
+
+ | Variable | Description | Default |
+ |----------|-------------|---------|
+ | `PROVIDER` | LLM provider type | `openai` |
+ | `CLASP_PORT` | Proxy server port | `8080` |
+ | `CLASP_MODEL` | Default model | - |
+ | `CLASP_MODEL_OPUS` | Model for Opus tier | - |
+ | `CLASP_MODEL_SONNET` | Model for Sonnet tier | - |
+ | `CLASP_MODEL_HAIKU` | Model for Haiku tier | - |
+ | `OPENAI_API_KEY` | OpenAI API key | - |
+ | `OPENAI_BASE_URL` | Custom OpenAI base URL | `https://api.openai.com/v1` |
+ | `AZURE_API_KEY` | Azure OpenAI API key | - |
+ | `AZURE_OPENAI_ENDPOINT` | Azure endpoint URL | - |
+ | `AZURE_DEPLOYMENT_NAME` | Azure deployment name | - |
+ | `AZURE_API_VERSION` | Azure API version | `2024-02-15-preview` |
+ | `OPENROUTER_API_KEY` | OpenRouter API key | - |
+ | `CUSTOM_BASE_URL` | Custom endpoint base URL | - |
+ | `CUSTOM_API_KEY` | Custom endpoint API key | - |
+ | `CLASP_DEBUG` | Enable all debug logging | `false` |
+ | `CLASP_DEBUG_REQUESTS` | Log requests only | `false` |
+ | `CLASP_DEBUG_RESPONSES` | Log responses only | `false` |
+ | `CLASP_RATE_LIMIT` | Enable rate limiting | `false` |
+ | `CLASP_RATE_LIMIT_REQUESTS` | Requests per window | `60` |
+ | `CLASP_RATE_LIMIT_WINDOW` | Window in seconds | `60` |
+ | `CLASP_RATE_LIMIT_BURST` | Burst allowance | `10` |
+ | `CLASP_CACHE` | Enable response caching | `false` |
+ | `CLASP_CACHE_MAX_SIZE` | Maximum cache entries | `1000` |
+ | `CLASP_CACHE_TTL` | Cache TTL in seconds | `3600` |
+ | `CLASP_MULTI_PROVIDER` | Enable multi-provider routing | `false` |
+ | `CLASP_FALLBACK` | Enable fallback routing | `false` |
+ | `CLASP_AUTH` | Enable API key authentication | `false` |
+ | `CLASP_AUTH_API_KEY` | Required API key for access | - |
+ | `CLASP_AUTH_ALLOW_ANONYMOUS_HEALTH` | Allow /health without auth | `true` |
+ | `CLASP_AUTH_ALLOW_ANONYMOUS_METRICS` | Allow /metrics without auth | `false` |
155
+ ### Model Mapping
156
+
157
+ CLASP can automatically map Claude model tiers to your provider's models:
158
+
159
+ ```bash
160
+ # Map Claude tiers to specific models
161
+ export CLASP_MODEL_OPUS=gpt-4o
162
+ export CLASP_MODEL_SONNET=gpt-4o-mini
163
+ export CLASP_MODEL_HAIKU=gpt-3.5-turbo
164
+ ```
165
+
166
+ ### Multi-Provider Routing
167
+
168
+ Route different Claude model tiers to different LLM providers for cost optimization:
169
+
170
+ ```bash
171
+ # Enable multi-provider routing
172
+ export CLASP_MULTI_PROVIDER=true
173
+
174
+ # Route Opus tier to OpenAI (premium)
175
+ export CLASP_OPUS_PROVIDER=openai
176
+ export CLASP_OPUS_MODEL=gpt-4o
177
+ export CLASP_OPUS_API_KEY=sk-... # Optional, inherits from OPENAI_API_KEY
178
+
179
+ # Route Sonnet tier to OpenRouter (cost-effective)
180
+ export CLASP_SONNET_PROVIDER=openrouter
181
+ export CLASP_SONNET_MODEL=anthropic/claude-3-sonnet
182
+ export CLASP_SONNET_API_KEY=sk-or-... # Optional, inherits from OPENROUTER_API_KEY
183
+
184
+ # Route Haiku tier to local Ollama (free)
185
+ export CLASP_HAIKU_PROVIDER=custom
186
+ export CLASP_HAIKU_MODEL=llama3.1
187
+ export CLASP_HAIKU_BASE_URL=http://localhost:11434/v1
188
+
189
+ clasp -multi-provider
190
+ ```
191
+
192
+ **Multi-Provider Environment Variables:**
193
+
194
+ | Variable | Description |
195
+ |----------|-------------|
196
+ | `CLASP_MULTI_PROVIDER` | Enable multi-provider routing (`true`/`1`) |
197
+ | `CLASP_{TIER}_PROVIDER` | Provider for tier: `openai`, `openrouter`, `custom` |
198
+ | `CLASP_{TIER}_MODEL` | Model name for the tier |
199
+ | `CLASP_{TIER}_API_KEY` | API key (optional, inherits from main config) |
200
+ | `CLASP_{TIER}_BASE_URL` | Base URL (optional, uses provider default) |
201
+
202
+ Where `{TIER}` is `OPUS`, `SONNET`, or `HAIKU`.
203
+
204
+ **Benefits:**
205
+ - **Cost Optimization**: Use expensive providers only for complex tasks
206
+ - **Latency Reduction**: Route simple requests to faster local models
207
+ - **Redundancy**: Mix cloud and local providers for reliability
208
+ - **A/B Testing**: Compare different models across tiers
209
+
210
+ ## API Endpoints
211
+
212
+ | Endpoint | Description |
213
+ |----------|-------------|
214
+ | `POST /v1/messages` | Anthropic Messages API (translated) |
215
+ | `GET /health` | Health check |
216
+ | `GET /metrics` | Request statistics (JSON) |
217
+ | `GET /metrics/prometheus` | Prometheus metrics |
218
+ | `GET /` | Server info |
219
+
220
+ ## Example Usage
221
+
222
+ ### With curl
223
+
224
+ ```bash
225
+ curl http://localhost:8080/v1/messages \
226
+ -H "Content-Type: application/json" \
227
+ -H "x-api-key: any-key" \
228
+ -d '{
229
+ "model": "claude-3-5-sonnet-20241022",
230
+ "max_tokens": 1024,
231
+ "messages": [
232
+ {"role": "user", "content": "Hello!"}
233
+ ]
234
+ }'
235
+ ```
236
+
237
+ ### Streaming
238
+
239
+ ```bash
240
+ curl http://localhost:8080/v1/messages \
241
+ -H "Content-Type: application/json" \
242
+ -H "x-api-key: any-key" \
243
+ -d '{
244
+ "model": "claude-3-5-sonnet-20241022",
245
+ "max_tokens": 1024,
246
+ "stream": true,
247
+ "messages": [
248
+ {"role": "user", "content": "Count to 5"}
249
+ ]
250
+ }'
251
+ ```
252
+
253
+ ## Response Caching
254
+
255
+ CLASP can cache responses to reduce API costs and improve latency for repeated requests:
256
+
257
+ ```bash
258
+ # Enable caching with defaults (1000 entries, 1 hour TTL)
259
+ clasp -cache
260
+
261
+ # Custom cache settings
262
+ clasp -cache -cache-max-size 500 -cache-ttl 1800
263
+
264
+ # Via environment
265
+ CLASP_CACHE=true CLASP_CACHE_MAX_SIZE=500 clasp
266
+ ```
267
+
268
+ **Caching behavior:**
269
+ - Only non-streaming requests are cached
270
+ - Requests with `temperature > 0` are not cached (non-deterministic)
271
+ - Cache uses LRU (Least Recently Used) eviction when full
272
+ - Cache entries expire after TTL (time-to-live)
273
+ - Response headers include `X-CLASP-Cache: HIT` or `X-CLASP-Cache: MISS`
274
+
275
+ ## Metrics
276
+
277
+ Access `/metrics` for request statistics:
278
+
279
+ ```json
280
+ {
281
+ "requests": {
282
+ "total": 100,
283
+ "successful": 98,
284
+ "errors": 2,
285
+ "streaming": 75,
286
+ "tool_calls": 15,
287
+ "success_rate": "98.00%"
288
+ },
289
+ "performance": {
290
+ "avg_latency_ms": "523.50",
291
+ "requests_per_sec": "2.34"
292
+ },
293
+ "cache": {
294
+ "enabled": true,
295
+ "size": 42,
296
+ "max_size": 1000,
297
+ "hits": 156,
298
+ "misses": 44,
299
+ "hit_rate": "78.00%"
300
+ },
301
+ "uptime": "5m30s"
302
+ }
303
+ ```
304
+
305
+ ## Docker
306
+
307
+ ### Build and Run
308
+
309
+ ```bash
310
+ # Build Docker image
311
+ make docker
312
+
313
+ # Run container
314
+ make docker-run
315
+
316
+ # Stop container
317
+ make docker-stop
318
+ ```
319
+
320
+ ### Docker Compose
321
+
322
+ Create a `.env` file with your configuration:
323
+
324
+ ```bash
325
+ PROVIDER=openai
326
+ OPENAI_API_KEY=sk-...
327
+ CLASP_DEFAULT_MODEL=gpt-4o
328
+ ```
329
+
330
+ Then start the service:
331
+
332
+ ```bash
333
+ docker-compose up -d
334
+ ```
335
+
336
+ ### Docker Environment Variables
337
+
338
+ All configuration is done through environment variables. See the Environment Variables section above.
339
+
340
+ ## Development
341
+
342
+ ```bash
343
+ # Build
344
+ make build
345
+
346
+ # Run tests
347
+ make test
348
+
349
+ # Build for all platforms
350
+ make build-all
351
+
352
+ # Build Docker image
353
+ make docker
354
+
355
+ # Format code
356
+ make fmt
357
+ ```
358
+
359
+ ## Debugging
360
+
361
+ Enable debug logging to troubleshoot issues:
362
+
363
+ ```bash
364
+ # Via CLI flag
365
+ clasp -debug
366
+
367
+ # Via environment variable
368
+ CLASP_DEBUG=true clasp
369
+
370
+ # Log only requests or responses
371
+ CLASP_DEBUG_REQUESTS=true clasp
372
+ CLASP_DEBUG_RESPONSES=true clasp
373
+ ```
374
+
375
+ Debug output includes:
376
+ - Incoming Anthropic requests
377
+ - Outgoing OpenAI requests
378
+ - Raw OpenAI responses
379
+ - Transformed Anthropic responses
380
+
381
+ ## Authentication
382
+
383
+ Secure your CLASP proxy with API key authentication to control access:
384
+
385
+ ```bash
386
+ # Enable authentication with CLI flags
387
+ clasp -auth -auth-api-key "my-secret-key"
388
+
389
+ # Or via environment variables
390
+ CLASP_AUTH=true CLASP_AUTH_API_KEY="my-secret-key" clasp
391
+ ```
392
+
393
+ ### Providing the API Key
394
+
395
+ Clients can provide the API key in two ways:
396
+
397
+ ```bash
398
+ # Via x-api-key header
399
+ curl http://localhost:8080/v1/messages \
400
+ -H "x-api-key: my-secret-key" \
401
+ -H "Content-Type: application/json" \
402
+ -d '{"model": "claude-3-5-sonnet-20241022", ...}'
403
+
404
+ # Via Authorization header (Bearer token)
405
+ curl http://localhost:8080/v1/messages \
406
+ -H "Authorization: Bearer my-secret-key" \
407
+ -H "Content-Type: application/json" \
408
+ -d '{"model": "claude-3-5-sonnet-20241022", ...}'
409
+ ```
410
+
411
+ ### Authentication Options
412
+
413
+ | Variable | Description | Default |
414
+ |----------|-------------|---------|
415
+ | `CLASP_AUTH` | Enable authentication | `false` |
416
+ | `CLASP_AUTH_API_KEY` | Required API key | - |
417
+ | `CLASP_AUTH_ALLOW_ANONYMOUS_HEALTH` | Allow /health without auth | `true` |
418
+ | `CLASP_AUTH_ALLOW_ANONYMOUS_METRICS` | Allow /metrics without auth | `false` |
419
+
420
+ ### Endpoint Access with Authentication Enabled
421
+
422
+ | Endpoint | Default Access |
423
+ |----------|----------------|
424
+ | `/` | Always accessible |
425
+ | `/health` | Anonymous by default |
426
+ | `/metrics` | Requires auth by default |
427
+ | `/metrics/prometheus` | Requires auth by default |
428
+ | `/v1/messages` | Requires auth |
429
+
430
+ ### Using with Claude Code
431
+
432
+ When authentication is enabled, set both the base URL and API key:
433
+
434
+ ```bash
435
+ # Start CLASP with auth
436
+ OPENAI_API_KEY=sk-... clasp -auth -auth-api-key "proxy-key"
437
+
438
+ # Use with Claude Code (the proxy key is passed as the Anthropic key)
439
+ ANTHROPIC_BASE_URL=http://localhost:8080 ANTHROPIC_API_KEY=proxy-key claude
440
+ ```
441
+
442
+ ## Request Queuing
443
+
444
+ Queue requests during provider outages for automatic retry:
445
+
446
+ ```bash
447
+ # Enable request queuing
448
+ clasp -queue
449
+
450
+ # Custom queue settings
451
+ clasp -queue -queue-max-size 200 -queue-max-wait 60
452
+
453
+ # Via environment
454
+ CLASP_QUEUE=true CLASP_QUEUE_MAX_SIZE=200 clasp
455
+ ```
456
+
457
+ ### Queue Options
458
+
459
+ | Variable | Description | Default |
460
+ |----------|-------------|---------|
461
+ | `CLASP_QUEUE` | Enable request queuing | `false` |
462
+ | `CLASP_QUEUE_MAX_SIZE` | Maximum queued requests | `100` |
463
+ | `CLASP_QUEUE_MAX_WAIT` | Queue timeout in seconds | `30` |
464
+ | `CLASP_QUEUE_RETRY_DELAY` | Retry delay in milliseconds | `1000` |
465
+ | `CLASP_QUEUE_MAX_RETRIES` | Maximum retries per request | `3` |
466
+
467
+ ## Circuit Breaker
468
+
469
+ Prevent cascade failures with circuit breaker pattern:
470
+
471
+ ```bash
472
+ # Enable circuit breaker
473
+ clasp -circuit-breaker
474
+
475
+ # Custom circuit breaker settings
476
+ clasp -circuit-breaker -cb-threshold 10 -cb-recovery 3 -cb-timeout 60
477
+
478
+ # Via environment
479
+ CLASP_CIRCUIT_BREAKER=true clasp
480
+ ```
481
+
482
+ ### Circuit Breaker Options
483
+
484
+ | Variable | Description | Default |
485
+ |----------|-------------|---------|
486
+ | `CLASP_CIRCUIT_BREAKER` | Enable circuit breaker | `false` |
487
+ | `CLASP_CIRCUIT_BREAKER_THRESHOLD` | Failures before opening circuit | `5` |
488
+ | `CLASP_CIRCUIT_BREAKER_RECOVERY` | Successes to close circuit | `2` |
489
+ | `CLASP_CIRCUIT_BREAKER_TIMEOUT` | Timeout in seconds before retry | `30` |
490
+
491
+ ### Circuit Breaker States
492
+
493
+ - **Closed**: Normal operation, requests pass through
494
+ - **Open**: Circuit tripped, requests fail fast with 503
495
+ - **Half-Open**: Testing if service recovered, limited requests allowed
496
+
497
+ ### Maximum Resilience Configuration
498
+
499
+ For production deployments requiring maximum resilience:
500
+
501
+ ```bash
502
+ # Enable queue + circuit breaker + fallback
503
+ OPENAI_API_KEY=sk-xxx OPENROUTER_API_KEY=sk-or-xxx \
504
+ clasp -queue -circuit-breaker -fallback \
505
+ -queue-max-size 200 \
506
+ -cb-threshold 5 \
507
+ -cb-timeout 30
55
508
  ```
56
509
 
57
510
  ## License
58
511
 
59
- MIT
512
+ MIT License - see [LICENSE](LICENSE) for details.
513
+
514
+ ## Contributing
515
+
516
+ Contributions are welcome! Please open an issue or submit a pull request on [GitHub](https://github.com/jedarden/CLASP).
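The tier mapping described in the README's Model Mapping section (Claude model names containing "opus", "sonnet", or "haiku" are routed to the model named by `CLASP_MODEL_<TIER>`, falling back to `CLASP_MODEL`) can be sketched as follows. This is an illustrative approximation only; `mapModel` is a hypothetical helper, not CLASP's actual Go implementation:

```javascript
// Hypothetical sketch of CLASP's tier-to-model mapping (not the real implementation).
// Claude model names embed a tier keyword; each tier maps to the model configured
// via CLASP_MODEL_<TIER>, with CLASP_MODEL as the fallback and pass-through last.
function mapModel(claudeModel, env) {
  const tiers = ["opus", "sonnet", "haiku"];
  for (const tier of tiers) {
    if (claudeModel.toLowerCase().includes(tier)) {
      const mapped = env[`CLASP_MODEL_${tier.toUpperCase()}`];
      if (mapped) return mapped;
    }
  }
  return env.CLASP_MODEL || claudeModel; // default model, else pass through unchanged
}

// Example using the mapping from the README:
const env = {
  CLASP_MODEL_OPUS: "gpt-4o",
  CLASP_MODEL_SONNET: "gpt-4o-mini",
  CLASP_MODEL_HAIKU: "gpt-3.5-turbo",
};
console.log(mapModel("claude-3-5-sonnet-20241022", env)); // → gpt-4o-mini
```

With this scheme an unmapped tier or unrecognized model name is forwarded to the provider unchanged, which matches the pass-through behavior the README implies for unconfigured tiers.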
package/bin/clasp-wrapper.js ADDED
@@ -0,0 +1,40 @@
+ #!/usr/bin/env node
+
+ /**
+ * CLASP Wrapper Script
+ * Executes the Go binary with all passed arguments
+ */
+
+ const { spawn } = require('child_process');
+ const path = require('path');
+ const os = require('os');
+ const fs = require('fs');
+
+ const BINARY_NAME = os.platform() === 'win32' ? 'clasp.exe' : 'clasp';
+ const binaryPath = path.join(__dirname, BINARY_NAME);
+
+ // Check if binary exists
+ if (!fs.existsSync(binaryPath)) {
+ console.error('[CLASP] Binary not found. Running install script...');
+ require('../scripts/install.js');
+ }
+
+ // Spawn the Go binary with all arguments
+ const args = process.argv.slice(2);
+ const child = spawn(binaryPath, args, {
+ stdio: 'inherit',
+ env: process.env
+ });
+
+ child.on('error', (err) => {
+ if (err.code === 'ENOENT') {
+ console.error('[CLASP] Binary not found. Please run: npm run postinstall');
+ } else {
+ console.error(`[CLASP] Error: ${err.message}`);
+ }
+ process.exit(1);
+ });
+
+ child.on('close', (code) => {
+ process.exit(code || 0);
+ });
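The caching rules listed in the README (non-streaming requests only, `temperature > 0` never cached, LRU eviction when full, TTL expiry) can be illustrated with a small sketch. This is a hypothetical model of the behavior, not CLASP's actual Go cache:

```javascript
// Hypothetical sketch of the README's caching rules (CLASP's real cache is written in Go).
// Only deterministic, non-streaming requests are cacheable; entries expire after a TTL
// and the least recently used entry is evicted when the cache is full.
class ResponseCache {
  constructor(maxSize = 1000, ttlMs = 3600 * 1000) {
    this.maxSize = maxSize;
    this.ttlMs = ttlMs;
    this.entries = new Map(); // Map preserves insertion order, giving easy LRU tracking
  }
  // Mirrors the README: skip streaming and non-deterministic (temperature > 0) requests.
  cacheable(req) {
    return !req.stream && !(req.temperature > 0);
  }
  get(key, now = Date.now()) {
    const e = this.entries.get(key);
    if (!e || now - e.at > this.ttlMs) {
      this.entries.delete(key); // expired or absent → X-CLASP-Cache: MISS
      return null;
    }
    this.entries.delete(key); // re-insert to mark as most recently used
    this.entries.set(key, e);
    return e.value; // X-CLASP-Cache: HIT
  }
  set(key, value, now = Date.now()) {
    if (this.entries.size >= this.maxSize) {
      this.entries.delete(this.entries.keys().next().value); // evict LRU entry
    }
    this.entries.set(key, { value, at: now });
  }
}
```

The `X-CLASP-Cache` header the README documents corresponds to whether `get` returns an entry or `null` in this sketch.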
package/package.json CHANGED
@@ -1,38 +1,53 @@
  {
  "name": "clasp-ai",
- "version": "0.0.1",
- "description": "CLASP - Claude Language Agent Super Proxy. Run Claude Code with any LLM provider - OpenAI, Azure, OpenRouter, and more.",
- "bin": {
- "clasp": "./bin/clasp"
+ "version": "0.7.1",
+ "description": "Claude Language Agent Super Proxy - Translate Claude/Anthropic API calls to OpenAI-compatible endpoints",
+ "author": "jedarden",
+ "license": "MIT",
+ "repository": {
+ "type": "git",
+ "url": "https://github.com/jedarden/CLASP.git"
  },
- "scripts": {
- "postinstall": "node postinstall.js || true"
+ "bugs": {
+ "url": "https://github.com/jedarden/CLASP/issues"
  },
+ "homepage": "https://github.com/jedarden/CLASP#readme",
  "keywords": [
  "claude",
- "claude-code",
- "openai",
- "openrouter",
- "azure",
  "anthropic",
- "llm",
+ "openai",
  "proxy",
+ "llm",
  "ai",
- "clasp",
- "gpt",
- "gemini",
- "grok"
+ "api",
+ "translation",
+ "openrouter",
+ "azure"
  ],
- "author": "",
- "license": "MIT",
- "repository": {
- "type": "git",
- "url": "https://github.com/yourname/clasp-ai"
+ "bin": {
+ "clasp": "./bin/clasp-wrapper.js"
  },
- "homepage": "https://github.com/yourname/clasp-ai#readme",
+ "scripts": {
+ "postinstall": "node scripts/install.js",
+ "start": "node bin/clasp-wrapper.js",
+ "prepack": "node scripts/prepack.js"
+ },
+ "files": [
+ "bin/",
+ "scripts/",
+ "README.md",
+ "LICENSE"
+ ],
  "engines": {
- "node": ">=18.0.0"
+ "node": ">=16.0.0"
  },
- "os": ["darwin", "linux", "win32"],
- "cpu": ["x64", "arm64"]
+ "os": [
+ "darwin",
+ "linux",
+ "win32"
+ ],
+ "cpu": [
+ "x64",
+ "arm64"
+ ]
  }