oca-proxy 1.0.1 → 1.0.3

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (3)
  1. package/README.md +28 -92
  2. package/bin/oca-proxy.js +111 -51
  3. package/package.json +46 -46
package/README.md CHANGED
@@ -7,23 +7,24 @@ This proxy handles OCI authentication via web-based OAuth flow and exposes stand
  ## Quick Start
 
  ```bash
- cd oca-proxy
- npm install
- npm run build
- npx ./bin/oca-proxy.js
+ # Run without installing (recommended)
+ npx oca-proxy
  ```
 
- Or install globally and run from anywhere:
+ Or install globally from npm and run:
 
  ```bash
- npm install -g .
+ npm install -g oca-proxy
  oca-proxy
  ```
 
- You can also use npx after local or global install:
+ ### From Source
 
  ```bash
- npx oca-proxy
+ cd oca-proxy
+ npm install
+ npm run build
+ npx ./bin/oca-proxy.js
  ```
 
  On first run, the browser will automatically open for OAuth login. After authentication, the proxy is ready to use.
@@ -36,7 +37,7 @@ The proxy uses web-based OAuth with PKCE on whitelisted ports (8669, 8668, 8667)
  - **Logout:** Visit `http://localhost:8669/logout`
  - **Status:** Visit `http://localhost:8669/health`
 
- Tokens are stored in `~/.oca/refresh_token.json` (same location as Python proxy).
+ Tokens are stored in `~/.oca/refresh_token.json`.
 
  ## Usage with OpenAI SDK
 
@@ -107,7 +108,7 @@ curl http://localhost:8669/v1/chat/completions \
 
  Models not starting with `oca/` are automatically mapped to `oca/gpt-4.1` by default.
 
- Custom mappings can be configured in `~/.config/oca/oca-proxy.config.json` (old path `~/.oca/oca-proxy-config.json` still read):
+ Custom mappings can be configured in `~/.config/oca/oca-proxy.config.json`:
 
  ```json
  {
@@ -120,69 +121,7 @@ Custom mappings can be configured in `~/.config/oca/oca-proxy.config.json` (old
 
  ## Integration Examples
 
- ### Claude Code
-
- Use the Anthropic endpoint with Claude Code:
-
- ```bash
- export ANTHROPIC_API_KEY=dummy
- export ANTHROPIC_BASE_URL=http://localhost:8669
- claude
- ```
-
- Or use environment variables in one line:
-
- ```bash
- ANTHROPIC_API_KEY=dummy ANTHROPIC_BASE_URL=http://localhost:8669 claude
- ```
-
- ### OpenCode
-
- Create `opencode.json` in your project root:
-
- ```json
- {
-   "$schema": "https://opencode.ai/config.json",
-   "provider": {
-     "oca": {
-       "api": "openai",
-       "name": "Oracle Code Assist",
-       "options": {
-         "baseURL": "http://localhost:8669/v1",
-         "apiKey": "dummy"
-       },
-       "models": {
-         "gpt-4.1": {
-           "id": "oca/gpt-4.1",
-           "name": "OCA GPT 4.1"
-         }
-       }
-     }
-   },
-   "model": "oca/gpt-4.1"
- }
- ```
-
- ### Aider
-
- ```bash
- aider --openai-api-key dummy --openai-api-base http://localhost:8669/v1
- ```
-
- ### Continue (VS Code)
-
- ```json
- {
-   "models": [
-     {
-       "provider": "openai",
-       "model": "oca/gpt-4.1",
-       "apiKey": "dummy",
-       "apiBase": "http://localhost:8669/v1"
-     }
-   ]
- }
- ```
+ See CONFIG.md for editor and tool setup examples: [CONFIG.md](./CONFIG.md)
 
  ## Files
 
@@ -202,7 +141,7 @@ oca-proxy/
 
  ## Running with PM2
 
- PM2 is a production process manager for Node.js applications. To run the OCA Proxy with PM2:
+ PM2 is a production process manager for Node.js applications. You can run the OCA Proxy via the global binary or npx.
 
  1. Install PM2 globally:
 
@@ -210,19 +149,19 @@ PM2 is a production process manager for Node.js applications. To run the OCA Pro
  npm install -g pm2
  ```
 
- 2. Build the project:
-
- ```bash
- npm run build
- ```
+ 2. Start the proxy (choose one):
 
- 3. Start the proxy:
+ - Global install:
+ ```bash
+ pm2 start oca-proxy --name oca-proxy
+ ```
 
- ```bash
- pm2 start dist/oca-proxy.js --name oca-proxy
- ```
+ - Using npx (no global install):
+ ```bash
+ pm2 start "npx oca-proxy" --name oca-proxy
+ ```
 
- 4. Monitor and manage:
+ 3. Monitor and manage:
  - View status: `pm2 status`
  - View logs: `pm2 logs oca-proxy`
  - Restart: `pm2 restart oca-proxy`
@@ -236,7 +175,11 @@ module.exports = {
    apps: [
      {
        name: 'oca-proxy',
-       script: 'bin/oca-proxy.js',
+       // If installed globally:
+       script: 'oca-proxy',
+       // Or, if you prefer npx, use:
+       // script: 'npx',
+       // args: 'oca-proxy',
        env: {
          NODE_ENV: 'production',
          PORT: 8669,
@@ -248,11 +191,4 @@ module.exports = {
 
  Then start with `pm2 start ecosystem.config.js`.
 
- ## Comparison with Python Proxy
-
- This TypeScript proxy is functionally equivalent to the Python proxy at `~/project/ccr-oca/oca-proxy/`. Both:
 
- - Use the same OAuth client (internal mode)
- - Store tokens in the same location (`~/.oca/refresh_token.json`)
- - Support the same whitelisted ports (8669, 8668, 8667)
- - Provide OpenAI-compatible endpoints
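For reference, the default model-mapping rule described in the README diff above (models not prefixed with `oca/` fall back to `oca/gpt-4.1`, with custom mappings read from `oca-proxy.config.json`) can be sketched as follows. This is a minimal illustration, not the package's actual code; `mapModel` is a hypothetical helper, and the precedence of custom mappings over the default is an assumption:

```javascript
// Hypothetical sketch of the proxy's model-mapping rule; the real
// implementation lives in package/bin/oca-proxy.js.
// Assumption: entries from ~/.config/oca/oca-proxy.config.json take
// precedence, then models already prefixed "oca/" pass through, and
// everything else falls back to the documented default "oca/gpt-4.1".
function mapModel(requested, customMappings = {}) {
  if (customMappings[requested]) return customMappings[requested];
  if (requested.startsWith("oca/")) return requested;
  return "oca/gpt-4.1";
}

console.log(mapModel("gpt-4o"));                          // "oca/gpt-4.1"
console.log(mapModel("oca/gpt-4.1"));                     // "oca/gpt-4.1"
console.log(mapModel("claude-3", { "claude-3": "oca/gpt-4.1" })); // "oca/gpt-4.1"
```

A client can therefore send any model name to the proxy's OpenAI-compatible endpoint and still reach a valid `oca/` model.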