ghc-proxy 0.1.1
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/LICENSE +21 -0
- package/README.md +171 -0
- package/dist/main.js +1816 -0
- package/dist/main.js.map +1 -0
- package/package.json +68 -0
package/LICENSE
ADDED
@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2025 wxxb789

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
package/README.md
ADDED
@@ -0,0 +1,171 @@
# ghc-proxy

[npm](https://www.npmjs.com/package/ghc-proxy)
[CI](https://github.com/wxxb789/ghc-proxy/actions/workflows/ci.yml)
[License](https://github.com/wxxb789/ghc-proxy/blob/master/LICENSE)

A proxy that turns your GitHub Copilot subscription into an OpenAI- and Anthropic-compatible API. Use it to power [Claude Code](https://docs.anthropic.com/en/docs/claude-code/overview), [Cursor](https://www.cursor.com/), or any tool that speaks the OpenAI Chat Completions or Anthropic Messages protocol.

> [!WARNING]
> Reverse-engineered, unofficial, may break. Excessive use can trigger GitHub abuse detection. Use at your own risk.

**Note:** If you're using [opencode](https://github.com/sst/opencode), you don't need this -- opencode supports GitHub Copilot natively.

## Installation

The quickest way to get started is with `npx`:

    npx ghc-proxy@latest start

This starts the proxy on `http://localhost:4141`. It will walk you through GitHub authentication on first run.

You can also install it globally:

    npm install -g ghc-proxy

Or run it from source with [Bun](https://bun.sh/) (>= 1.2):

    git clone https://github.com/wxxb789/ghc-proxy.git
    cd ghc-proxy
    bun install
    bun run dev

## What it does

ghc-proxy sits between your tools and the GitHub Copilot API. It authenticates with GitHub using the device code flow, obtains a Copilot token, and exposes the following endpoints:

**OpenAI compatible:**

- `POST /v1/chat/completions` -- chat completions (streaming and non-streaming)
- `GET /v1/models` -- list available models
- `POST /v1/embeddings` -- generate embeddings

**Anthropic compatible:**

- `POST /v1/messages` -- the Anthropic Messages API, with full tool use and streaming support
- `POST /v1/messages/count_tokens` -- token counting

Anthropic requests are translated to OpenAI format on the fly, sent to Copilot, and the responses are translated back. This means Claude Code thinks it's talking to Anthropic, but it's actually talking to Copilot.

There are also utility endpoints: `GET /usage` for quota monitoring and `GET /token` to inspect the current Copilot token.
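The translation described above can be sketched roughly as follows. This is an illustrative simplification, not the actual `src/translator/` code: the function name and the subset of fields handled here are assumptions, and the real layer also translates tool schemas, streaming events, thinking blocks, and image content.

```python
def anthropic_to_openai(req: dict) -> dict:
    """Convert a minimal Anthropic Messages request into OpenAI chat format.

    Illustrative sketch only -- the proxy's real translator covers far more.
    """
    messages = []
    # Anthropic keeps the system prompt outside `messages`; OpenAI puts it inside.
    if "system" in req:
        messages.append({"role": "system", "content": req["system"]})
    for m in req["messages"]:
        content = m["content"]
        if isinstance(content, list):  # flatten Anthropic content blocks to text
            content = "".join(b["text"] for b in content if b.get("type") == "text")
        messages.append({"role": m["role"], "content": content})
    return {
        "model": req["model"],
        "messages": messages,
        "max_tokens": req.get("max_tokens"),
        "stream": req.get("stream", False),
    }


req = {
    "model": "gpt-4.1",
    "system": "You are terse.",
    "max_tokens": 256,
    "messages": [{"role": "user", "content": [{"type": "text", "text": "hi"}]}],
}
print(anthropic_to_openai(req))
```

The reverse direction works the same way on responses, so a Messages-protocol client never sees the OpenAI-shaped traffic underneath.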
## Using with Claude Code

The fastest way to get Claude Code running on Copilot:

    npx ghc-proxy@latest start --claude-code

This starts the proxy, prompts you to pick a model, and copies a ready-to-paste command to your clipboard. Run that command in another terminal to launch Claude Code.

If you prefer a permanent setup, create `.claude/settings.json` in your project:

```json
{
  "env": {
    "ANTHROPIC_BASE_URL": "http://localhost:4141",
    "ANTHROPIC_AUTH_TOKEN": "dummy",
    "ANTHROPIC_MODEL": "gpt-4.1",
    "ANTHROPIC_DEFAULT_SONNET_MODEL": "gpt-4.1",
    "ANTHROPIC_SMALL_FAST_MODEL": "gpt-4.1",
    "ANTHROPIC_DEFAULT_HAIKU_MODEL": "gpt-4.1",
    "DISABLE_NON_ESSENTIAL_MODEL_CALLS": "1",
    "CLAUDE_CODE_DISABLE_NONESSENTIAL_TRAFFIC": "1"
  },
  "permissions": {
    "deny": ["WebSearch"]
  }
}
```

See the [Claude Code settings docs](https://docs.anthropic.com/en/docs/claude-code/settings#environment-variables) for more options.
## CLI commands

ghc-proxy uses a subcommand structure:

    ghc-proxy start        # start the proxy server
    ghc-proxy auth         # run the GitHub auth flow without starting the server
    ghc-proxy check-usage  # show your Copilot usage/quota in the terminal
    ghc-proxy debug        # print diagnostic info (version, paths, token status)

### Start options

| Option | Alias | Default | Description |
|--------|-------|---------|-------------|
| `--port` | `-p` | `4141` | Port to listen on |
| `--verbose` | `-v` | `false` | Enable verbose logging |
| `--account-type` | `-a` | `individual` | `individual`, `business`, or `enterprise` |
| `--rate-limit` | `-r` | -- | Minimum seconds between requests |
| `--wait` | `-w` | `false` | Wait instead of rejecting when rate-limited |
| `--manual` | -- | `false` | Manually approve each request |
| `--github-token` | `-g` | -- | Pass a GitHub token directly (from `auth`) |
| `--claude-code` | `-c` | `false` | Generate a Claude Code launch command |
| `--show-token` | -- | `false` | Display tokens on auth and refresh |
| `--proxy-env` | -- | `false` | Use `HTTP_PROXY`/`HTTPS_PROXY` from env |
| `--idle-timeout` | -- | `120` | Bun server idle timeout in seconds |
## Rate limiting

If you're worried about hitting Copilot rate limits, you have a few options:

    # Enforce a 30-second cooldown between requests
    npx ghc-proxy@latest start --rate-limit 30

    # Same, but wait instead of returning a 429 error
    npx ghc-proxy@latest start --rate-limit 30 --wait

    # Manually approve every request (useful for debugging)
    npx ghc-proxy@latest start --manual
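The `--rate-limit`/`--wait` pair amounts to a minimum-interval gate in front of each request: too-fast requests are either rejected (a 429) or held until the cooldown elapses. A rough sketch of that idea, not the proxy's actual implementation (class and method names are made up):

```python
import time


class IntervalLimiter:
    """Gate requests so at least `interval` seconds pass between them."""

    def __init__(self, interval: float, wait: bool = False):
        self.interval = interval
        self.wait = wait
        self.last = float("-inf")

    def acquire(self) -> bool:
        now = time.monotonic()
        remaining = self.interval - (now - self.last)
        if remaining > 0:
            if not self.wait:
                return False  # without --wait: caller answers with 429
            time.sleep(remaining)  # with --wait: block until the cooldown elapses
        self.last = time.monotonic()
        return True


limiter = IntervalLimiter(interval=0.1)
print(limiter.acquire())  # first request passes: True
print(limiter.acquire())  # immediate second request is rejected: False
```

`--manual` replaces this automatic gate with an interactive yes/no prompt per request.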
## Docker

Build and run:

    docker build -t ghc-proxy .
    mkdir -p ./copilot-data
    docker run -p 4141:4141 -v $(pwd)/copilot-data:/root/.local/share/ghc-proxy ghc-proxy

Authentication and settings are persisted in `copilot-data/config.json`, so they survive container restarts.

You can also pass a GitHub token via environment variable:

    docker run -p 4141:4141 -e GH_TOKEN=your_token ghc-proxy

Docker Compose:

```yaml
services:
  ghc-proxy:
    build: .
    ports:
      - '4141:4141'
    environment:
      - GH_TOKEN=your_token_here
    restart: unless-stopped
```
## Account types

If you have a GitHub business or enterprise Copilot plan, pass the `--account-type` flag:

    npx ghc-proxy@latest start --account-type business
    npx ghc-proxy@latest start --account-type enterprise

This routes requests to the correct Copilot API endpoint for your plan. See the [GitHub docs on network routing](https://docs.github.com/en/enterprise-cloud@latest/copilot/managing-copilot/managing-github-copilot-in-your-organization/managing-access-to-github-copilot-in-your-organization/managing-github-copilot-access-to-your-organizations-network#configuring-copilot-subscription-based-network-routing-for-your-enterprise-or-organization) for more details.
## How it works

The proxy authenticates with GitHub using the [device code OAuth flow](https://docs.github.com/en/apps/oauth-apps/building-oauth-apps/authorizing-oauth-apps#device-flow) (the same flow VS Code uses), then exchanges the GitHub token for a short-lived Copilot token that auto-refreshes.

Incoming requests hit a [Hono](https://hono.dev/) server. OpenAI-format requests are forwarded directly to `api.githubcopilot.com`. Anthropic-format requests pass through a translation layer (`src/translator/`) that converts the message format, tool schemas, and streaming events between the two protocols -- including full support for tool use, thinking blocks, and image content.

The server spoofs VS Code headers so the Copilot API treats it like a normal editor session.
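The device code flow boils down to three steps: request a device/user code pair, show the user code for the person to enter on github.com, then poll the token endpoint until authorization succeeds. A schematic sketch with a stubbed transport so it runs offline -- the response fields mirror GitHub's device flow, but the stub values and function names here are illustrative, not the proxy's code:

```python
import itertools

# GitHub's real endpoints are https://github.com/login/device/code and
# https://github.com/login/oauth/access_token; `fake_post` stands in for an
# HTTP POST so the sketch needs no network. It "authorizes" on the second poll.
def fake_post(url: str, data: dict, _polls=itertools.count()) -> dict:
    if url.endswith("/device/code"):
        return {"device_code": "dev123", "user_code": "ABCD-1234", "interval": 0}
    if next(_polls) < 1:
        return {"error": "authorization_pending"}
    return {"access_token": "gho_example"}


def device_flow(post, client_id: str) -> str:
    codes = post("https://github.com/login/device/code",
                 {"client_id": client_id, "scope": "read:user"})
    print(f"Enter code {codes['user_code']} at https://github.com/login/device")
    while True:
        resp = post("https://github.com/login/oauth/access_token",
                    {"client_id": client_id,
                     "device_code": codes["device_code"],
                     "grant_type": "urn:ietf:params:oauth:grant-type:device_code"})
        if "access_token" in resp:
            return resp["access_token"]
        # a real client sleeps codes["interval"] seconds between polls


print(device_flow(fake_post, "client-id"))  # prints the code prompt, then gho_example
```

The GitHub token obtained this way is what `ghc-proxy auth` stores; `start` then trades it for the short-lived Copilot token on your behalf.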
## Development

    bun install
    bun run dev        # start with --watch
    bun run build      # build with tsdown
    bun run lint       # eslint
    bun run typecheck  # tsc --noEmit
    bun test           # run tests