spectyra-proxy 1.0.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -0,0 +1,96 @@
# Installation Guide for End Users

## How to Install Spectyra Proxy

### Option 1: Install via npm (Recommended - When Published)

**Once the package is published to npm:**

```bash
npm install -g spectyra-proxy
```

**Then start it:**

```bash
spectyra-proxy
```

**That's it!** The proxy will start on:

- Proxy: http://localhost:3001
- Dashboard: http://localhost:3002

### Option 2: Install from GitHub (Current Method)

**If the npm package isn't published yet:**

1. **Clone or download the repository:**

   ```bash
   git clone https://github.com/your-username/spectyra.git
   cd spectyra/tools/proxy
   ```

2. **Install dependencies:**

   ```bash
   npm install
   # or
   pnpm install
   ```

3. **Start the proxy:**

   ```bash
   npm start
   # or
   pnpm start
   ```

### Option 3: Download Standalone (Future)

**When available, you can download a standalone installer:**

- macOS: `.dmg` installer
- Windows: `.exe` installer
- Linux: `.deb` or `.rpm` package

## Requirements

- **Node.js**: Version 18 or higher
- **npm** or **pnpm**: For package installation
- **Terminal/Command Prompt**: To run commands

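If you're not sure whether these prerequisites are met, a quick check from the terminal (standard Node.js/npm commands) will tell you:

```bash
# Check that Node.js 18+ and a package manager are available
node --version   # should print v18.0.0 or higher
npm --version    # or: pnpm --version
```
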
## Verify Installation

After starting, you should see:

```
🚀 Spectyra Proxy running on http://localhost:3001
📊 Dashboard: http://localhost:3002
```

Open http://localhost:3002 to verify the dashboard is working.

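You can also confirm the proxy itself is responding from the terminal, using the `/health` endpoint described in the README:

```bash
# Ask the local proxy for its status
curl http://localhost:3001/health
# Expected response: {"status":"ok","service":"spectyra-proxy"}
```
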
## Next Steps

1. Configure the proxy (see Step 2 on the Connections page)
2. Configure your coding tool (see Step 3 on the Connections page)
3. Start coding and see savings!

## Troubleshooting

### "Command not found: spectyra-proxy"

- Make sure you installed globally: `npm install -g spectyra-proxy`
- Or run it without a global install: `npx spectyra-proxy`

### "Port already in use"

- Another process is using port 3001 or 3002
- Stop that process or change ports via environment variables:

```bash
PROXY_PORT=3003 DASHBOARD_PORT=3004 spectyra-proxy
```

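If you're not sure which process is holding the port, standard OS tools can tell you, for example:

```bash
# Show the process currently listening on port 3001 (macOS/Linux)
lsof -i :3001

# On Windows (Command Prompt):
# netstat -ano | findstr :3001
```
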
### "Cannot find module"

- Make sure you ran `npm install` in the proxy directory
- Try deleting `node_modules` and running `npm install` again

## Need Help?

- See `README.md` for detailed documentation
- See `SETUP_GUIDE.md` for tool-specific setup
- See `PROVIDER_SUPPORT.md` for provider information

@@ -0,0 +1,162 @@
# Provider Support Guide

## Supported Providers

The Spectyra Proxy supports **all major LLM providers** used by coding tools:

### ✅ OpenAI

**Tools:**
- GitHub Copilot
- Cursor (when using OpenAI models)
- Codeium (OpenAI mode)
- Tabnine (OpenAI mode)

**API Format:** OpenAI-compatible (`/v1/chat/completions`)
**Configuration:** Select "OpenAI" in dashboard

### ✅ Anthropic (Claude)

**Tools:**
- Claude Code
- Cursor (when using Claude models)
- Tools using Claude API

**API Format:** Anthropic Messages API (`/v1/messages`)
**Configuration:** Select "Anthropic" in dashboard

**Note:** The proxy automatically handles Anthropic's different API format.

### ✅ Gemini

**Tools:**
- Google AI Studio tools
- Tools using Gemini API

**API Format:** Gemini GenerateContent API
**Configuration:** Select "Gemini" in dashboard

### ✅ Grok

**Tools:**
- X.AI tools
- Tools using Grok API

**API Format:** OpenAI-compatible (`/v1/chat/completions`)
**Configuration:** Select "Grok" in dashboard

## How It Works

The proxy automatically:

1. **Detects API format** from the request
2. **Converts to Spectyra format** for optimization
3. **Routes through Spectyra** with your configured provider
4. **Converts back to original format** for your tool

**You don't need to worry about API format differences!**

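As a concrete illustration of this flow, here is a minimal OpenAI-format request sent directly to the proxy with `curl`. The model name and prompt are placeholders, and whether an `Authorization` header is also needed depends on your setup, since keys are normally stored in the proxy configuration:

```bash
# Send an OpenAI-style chat completion request to the local proxy
curl http://localhost:3001/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o-mini",
    "messages": [
      {"role": "user", "content": "Write a hello-world script in Python"}
    ]
  }'
```

The proxy forwards this through Spectyra to your configured provider and returns a normal OpenAI-format response to the caller.
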
## Configuration

### Step 1: Select Provider

In the dashboard (http://localhost:3002), select your provider:

- **OpenAI** - For Copilot, Cursor (OpenAI), etc.
- **Anthropic** - For Claude Code, Cursor (Claude), etc.
- **Gemini** - For Google AI tools
- **Grok** - For X.AI tools

### Step 2: Enter API Keys

- **Spectyra API Key:** Your Spectyra key
- **Provider API Key:** Your provider key (OpenAI, Anthropic, etc.)

### Step 3: Configure Tool

Set your tool's API endpoint to: `http://localhost:3001/v1`

**The proxy handles format conversion automatically!**

## Tool-Specific Setup

### GitHub Copilot (OpenAI)

```bash
export OPENAI_API_BASE=http://localhost:3001/v1
```

Provider: **OpenAI**

### Claude Code (Anthropic)

Configure Claude Code to use: `http://localhost:3001/v1/messages`

Provider: **Anthropic**

### Cursor

- If using OpenAI models: Provider = **OpenAI**
- If using Claude models: Provider = **Anthropic**
- Set API base: `http://localhost:3001/v1`

### Codeium

- If using OpenAI: Provider = **OpenAI**
- Set API endpoint: `http://localhost:3001/v1`

## API Format Conversion

The proxy automatically converts between formats:

### OpenAI → Spectyra → OpenAI

```
OpenAI format (messages array)
  ↓
Spectyra format (standardized)
  ↓
OpenAI format (response)
```

### Anthropic → Spectyra → Anthropic

```
Anthropic format (messages with content arrays)
  ↓
Spectyra format (standardized)
  ↓
Anthropic format (message object)
```

### Gemini → Spectyra → Gemini

```
Gemini format (contents with parts)
  ↓
Spectyra format (standardized)
  ↓
Gemini format (candidates)
```

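As a worked example of the second flow, a minimal Anthropic-format request to the proxy's `/v1/messages` endpoint might look like this (the model name, token limit, and prompt are placeholders):

```bash
# Send an Anthropic Messages API style request to the local proxy
curl http://localhost:3001/v1/messages \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-3-5-sonnet-latest",
    "max_tokens": 256,
    "messages": [
      {"role": "user", "content": "Summarize what this function does"}
    ]
  }'
```

The response comes back as an Anthropic-style message object, so tools expecting the Anthropic API keep working unchanged.
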
## Troubleshooting

### Tool not connecting

- Verify provider is correct in dashboard
- Check tool's API endpoint setting
- Ensure proxy is running

### Wrong format errors

The proxy should auto-detect, but if issues occur:

- Check provider selection in dashboard
- Verify tool is using correct endpoint
- Check proxy console for format detection logs

### Provider-specific issues

**Anthropic:**
- Ensure you're using the `/v1/messages` endpoint
- System messages are handled automatically

**Gemini:**
- Verify API key format
- Check model name is correct

**OpenAI/Grok:**
- Standard `/v1/chat/completions` endpoint
- Should work with most tools

## Summary

- ✅ **All providers supported** - OpenAI, Anthropic, Gemini, Grok
- ✅ **Automatic format conversion** - No manual configuration needed
- ✅ **Works with all coding tools** - Copilot, Cursor, Claude Code, etc.
- ✅ **Simple setup** - Just select provider in dashboard

**The proxy handles all the complexity for you!**

package/README.md ADDED
@@ -0,0 +1,194 @@
# Spectyra Proxy

Local proxy server that provides an OpenAI-compatible endpoint, routing requests through Spectyra for automatic optimization. Works with GitHub Copilot, Cursor, Claude Code, and other coding assistants.

## Features

- ✅ **OpenAI-compatible API** - Works with any tool that uses the OpenAI API
- ✅ **Automatic optimization** - Routes through Spectyra for 40-65% token savings
- ✅ **Real-time dashboard** - Web UI showing savings and stats
- ✅ **Configuration management** - Easy setup via web dashboard
- ✅ **BYOK support** - Use your own API keys
- ✅ **Multi-provider support** - OpenAI, Anthropic, Gemini, Grok

## Installation

### Option 1: Install via npm (Recommended)

```bash
npm install -g spectyra-proxy
```

Then start with:

```bash
spectyra-proxy
```

**Note:** The npm package ships only compiled JavaScript; the TypeScript source code is not included in the published package.

### Option 2: Install from Source (Development)

If you need to modify the code or the npm package isn't available:

```bash
# Clone the repository
git clone https://github.com/your-username/spectyra.git
cd spectyra/tools/proxy

# Install dependencies
npm install
# or
pnpm install

# Build (compiles TypeScript to JavaScript)
npm run build

# Start
npm start
```

## Quick Start

1. **Start the proxy:**

   **If installed via npm:**
   ```bash
   spectyra-proxy
   ```

   **If installed from source:**
   ```bash
   npm start
   # or
   pnpm start
   ```

2. **Configure via dashboard:**
   - Open http://localhost:3002 in your browser
   - Enter your Spectyra API key
   - Enter your provider API key (OpenAI, Anthropic, etc.)
   - Select provider and optimization settings
   - Click "Save Configuration"

3. **Configure your coding tool:**
   - Set `OPENAI_API_BASE=http://localhost:3001/v1`
   - Or configure in your tool's settings

## Usage

### With GitHub Copilot

1. Set environment variable:
   ```bash
   export OPENAI_API_BASE=http://localhost:3001/v1
   ```

2. Restart VS Code/Copilot

### With Cursor

1. Open Cursor Settings
2. Set API base URL to: `http://localhost:3001/v1`
3. Restart Cursor

### With Claude Code

1. Configure Claude Code to use a custom API endpoint
2. Set endpoint to: `http://localhost:3001/v1`

## Dashboard

Access the dashboard at: **http://localhost:3002**

**Features:**
- Real-time savings statistics
- Recent request history
- Configuration management
- Live updates every 2 seconds

## Configuration

### Environment Variables

- `PROXY_PORT` - Proxy server port (default: 3001)
- `DASHBOARD_PORT` - Dashboard port (default: 3002)
- `SPECTYRA_API_URL` - Spectyra API URL (default: https://spectyra.up.railway.app/v1)

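For example, to run the proxy on alternate ports (the URL shown is the documented default, included here only to show where an override would go):

```bash
# Start the proxy on different ports, with an explicit Spectyra API URL
PROXY_PORT=3003 DASHBOARD_PORT=3004 \
SPECTYRA_API_URL=https://spectyra.up.railway.app/v1 \
spectyra-proxy
```
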
### Configuration File

Configuration is saved to `.spectyra-proxy-config.json` in the proxy directory.

**Fields:**
- `spectyraKey` - Your Spectyra API key
- `providerKey` - Your provider API key (BYOK)
- `provider` - Provider name (openai, anthropic, gemini, grok)
- `path` - Optimization path (code, talk)
- `optimizationLevel` - Optimization level (0-4)

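The dashboard normally writes this file for you, but as a rough sketch of its shape (the key values below are placeholders; hand-editing is only needed if you bypass the dashboard):

```bash
# Example shape of .spectyra-proxy-config.json (values are placeholders)
cat > .spectyra-proxy-config.json <<'EOF'
{
  "spectyraKey": "your-spectyra-api-key",
  "providerKey": "your-provider-api-key",
  "provider": "openai",
  "path": "code",
  "optimizationLevel": 2
}
EOF
```
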
## API Endpoints

### Proxy Endpoint
- `POST /v1/chat/completions` - OpenAI-compatible chat endpoint
- `POST /v1/messages` - Anthropic-compatible messages endpoint

### Configuration
- `GET /config` - Get current configuration (without keys)
- `POST /config` - Update configuration

### Statistics
- `GET /stats` - Get usage statistics and savings

### Health Check
- `GET /health` - Check proxy status

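For a quick smoke test of these endpoints from the command line (the health check is shown in the setup guide; the stats call assumes it is served from the same proxy port):

```bash
# Check that the proxy is up
curl http://localhost:3001/health
# Expected: {"status":"ok","service":"spectyra-proxy"}

# Fetch usage statistics and savings
curl http://localhost:3001/stats
```
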
## How It Works

```
Your Coding Tool (Copilot/Cursor/etc)
  → Local Proxy (localhost:3001)
  → Spectyra API (optimization)
  → Provider API (OpenAI/Anthropic)
  → Optimized Response
  → Your Tool
```

**Benefits:**
- Transparent optimization
- No code changes needed
- Real-time savings tracking
- Works with any OpenAI-compatible tool

## Troubleshooting

### Proxy not starting
- Check if ports 3001 and 3002 are available
- Try different ports via environment variables

### Configuration not saving
- Check file permissions in the proxy directory
- Ensure `.spectyra-proxy-config.json` is writable

### Tool not connecting
- Verify the proxy is running: `curl http://localhost:3001/health`
- Check the tool's API base URL setting
- Ensure the tool supports custom API endpoints

### No savings showing
- Verify configuration is correct
- Check Spectyra API key is valid
- Check provider API key is valid
- Look at proxy console for errors

## Development

```bash
# Watch mode (auto-restart on changes)
npm run dev

# Build for distribution
npm run build
```

## License

MIT

package/SETUP_GUIDE.md ADDED
@@ -0,0 +1,183 @@
# Spectyra Proxy Setup Guide

## For Developers Using Coding Assistants

This guide shows how to set up Spectyra Proxy to optimize your coding assistant API costs.

## Supported Tools

- ✅ GitHub Copilot
- ✅ Cursor
- ✅ Claude Code
- ✅ Codeium
- ✅ Tabnine
- ✅ Any OpenAI-compatible tool

## Step 1: Install and Start Proxy

```bash
cd tools/proxy
pnpm install
pnpm start
```

The proxy will start on:

- **Proxy:** http://localhost:3001
- **Dashboard:** http://localhost:3002

## Step 2: Configure Proxy

1. Open http://localhost:3002 in your browser
2. Go to the "Configuration" tab
3. Enter:
   - **Spectyra API Key:** Your Spectyra API key
   - **Provider API Key:** Your OpenAI/Anthropic/etc. key (BYOK)
   - **Provider:** Select your provider
   - **Path:** "Code" (recommended for coding)
   - **Optimization Level:** 2-3 (balanced to aggressive)
4. Click "Save Configuration"

## Step 3: Configure Your Coding Tool

### GitHub Copilot

1. Set environment variable:
   ```bash
   export OPENAI_API_BASE=http://localhost:3001/v1
   ```

2. Restart VS Code

**Or in VS Code settings.json:**

```json
{
  "github.copilot.advanced": {
    "api.baseUrl": "http://localhost:3001/v1"
  }
}
```

### Cursor

1. Open Cursor Settings
2. Search for "API"
3. Set "OpenAI API Base URL" to: `http://localhost:3001/v1`
4. Restart Cursor

### Claude Code

1. Open Claude Code settings
2. Set custom API endpoint: `http://localhost:3001/v1`
3. Restart Claude Code

### Codeium

1. Open Codeium settings
2. Set API endpoint: `http://localhost:3001/v1`
3. Restart VS Code/editor

## Step 4: Monitor Savings

1. Open the dashboard: http://localhost:3002
2. Go to the "Statistics" tab
3. See in real time:
   - Total requests
   - Tokens saved
   - Cost saved
   - Recent request history

## Verification

1. Use your coding assistant normally
2. Check the dashboard - you should see requests appearing
3. See savings percentages (typically 40-65%)

## Troubleshooting

### Proxy not running

```bash
# Check if running
curl http://localhost:3001/health

# Should return: {"status":"ok","service":"spectyra-proxy"}
```

### Tool not connecting
- Verify proxy is running
- Check tool's API base URL setting
- Check tool logs for connection errors
- Ensure tool supports custom endpoints

### No savings showing
- Check configuration is saved
- Verify API keys are correct
- Check proxy console for errors
- Ensure optimization level > 0

### Configuration not saving
- Check file permissions
- Ensure `.spectyra-proxy-config.json` is writable
- Check proxy console for errors

## Advanced: Auto-Start on Boot

Both examples below launch the proxy with `node`. Point them at the compiled JavaScript entry produced by `npm run build` rather than the `.ts` source shown in the placeholder paths, unless your Node.js setup can run TypeScript files directly.

### macOS (launchd)

Create `~/Library/LaunchAgents/com.spectyra.proxy.plist`:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <key>Label</key>
  <string>com.spectyra.proxy</string>
  <key>ProgramArguments</key>
  <array>
    <string>/usr/local/bin/node</string>
    <string>/path/to/spectyra/tools/proxy/spectyra-proxy.ts</string>
  </array>
  <key>RunAtLoad</key>
  <true/>
  <key>KeepAlive</key>
  <true/>
</dict>
</plist>
```

Then:

```bash
launchctl load ~/Library/LaunchAgents/com.spectyra.proxy.plist
```

### Linux (systemd)

Create `/etc/systemd/system/spectyra-proxy.service`:

```ini
[Unit]
Description=Spectyra Proxy
After=network.target

[Service]
Type=simple
User=your-username
WorkingDirectory=/path/to/spectyra/tools/proxy
ExecStart=/usr/bin/node /path/to/spectyra/tools/proxy/spectyra-proxy.ts
Restart=always

[Install]
WantedBy=multi-user.target
```

Then:

```bash
sudo systemctl enable spectyra-proxy
sudo systemctl start spectyra-proxy
```

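To confirm the service came up and to watch its output, the usual systemd commands apply:

```bash
# Confirm the service is running
sudo systemctl status spectyra-proxy

# Follow the service logs
journalctl -u spectyra-proxy -f
```
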
## Next Steps

- Monitor savings in the dashboard
- Adjust the optimization level if needed
- Share feedback and improvements!