@gitlawb/openclaude 0.1.7 → 0.2.2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/LICENSE ADDED
@@ -0,0 +1,29 @@
+ NOTICE
+
+ This repository contains code derived from Anthropic's Claude Code CLI.
+
+ The original Claude Code source is proprietary software:
+ Copyright (c) Anthropic PBC. All rights reserved.
+ Subject to Anthropic's Commercial Terms of Service.
+
+ Modifications and additions by OpenClaude contributors are offered under
+ the MIT License where legally permissible:
+
+ MIT License
+ Copyright (c) 2026 OpenClaude contributors (modifications only)
+
+ Permission is hereby granted, free of charge, to any person obtaining
+ a copy of the modifications made by OpenClaude contributors, to deal
+ in those modifications without restriction, including without limitation
+ the rights to use, copy, modify, merge, publish, distribute, sublicense,
+ and/or sell copies, subject to the following conditions:
+
+ The above copyright notice and this permission notice shall be included
+ in all copies or substantial portions of the modifications.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND.
+
+ The underlying derived code remains subject to Anthropic's copyright.
+ This project does not have Anthropic's authorization to distribute
+ their proprietary source. Users and contributors should evaluate their
+ own legal position.
package/README.md CHANGED
@@ -1,52 +1,61 @@
  # OpenClaude
 
- Use Claude Code with **any LLM** not just Claude.
+ OpenClaude is an open-source coding-agent CLI for cloud and local model providers.
 
- OpenClaude is a fork of the [Claude Code source leak](https://gitlawb.com/node/repos/z6MkgKkb/instructkr-claude-code) (exposed via npm source maps on March 31, 2026). We added an OpenAI-compatible provider shim so you can plug in GPT-4o, DeepSeek, Gemini, Llama, Mistral, or any model that speaks the OpenAI chat completions API. It now also supports the ChatGPT Codex backend for `codexplan` and `codexspark`, and local inference via [Atomic Chat](https://atomic.chat/) on Apple Silicon.
+ Use OpenAI-compatible APIs, Gemini, GitHub Models, Codex, Ollama, Atomic Chat, and other supported backends while keeping one terminal-first workflow: prompts, tools, agents, MCP, slash commands, and streaming output.
 
- All of Claude Code's tools work — bash, file read/write/edit, grep, glob, agents, tasks, MCP — just powered by whatever model you choose.
+ [![PR Checks](https://github.com/Gitlawb/openclaude/actions/workflows/pr-checks.yml/badge.svg?branch=main)](https://github.com/Gitlawb/openclaude/actions/workflows/pr-checks.yml)
+ [![Release](https://img.shields.io/github/v/tag/Gitlawb/openclaude?label=release&color=0ea5e9)](https://github.com/Gitlawb/openclaude/tags)
+ [![Discussions](https://img.shields.io/badge/discussions-open-7c3aed)](https://github.com/Gitlawb/openclaude/discussions)
+ [![Security Policy](https://img.shields.io/badge/security-policy-0f766e)](SECURITY.md)
+ [![License](https://img.shields.io/badge/license-MIT-2563eb)](LICENSE)
 
- ---
-
- ## Start Here
+ [Quick Start](#quick-start) | [Setup Guides](#setup-guides) | [Providers](#supported-providers) | [Source Build](#source-build-and-local-development) | [VS Code Extension](#vs-code-extension) | [Community](#community)
 
- If you are new to terminals or just want the easiest path, start with the beginner guides:
+ ## Why OpenClaude
 
- - [Non-Technical Setup](docs/non-technical-setup.md)
- - [Windows Quick Start](docs/quick-start-windows.md)
- - [macOS / Linux Quick Start](docs/quick-start-mac-linux.md)
+ - Use one CLI across cloud APIs and local model backends
+ - Save provider profiles inside the app with `/provider`
+ - Run with OpenAI-compatible services, Gemini, GitHub Models, Codex, Ollama, Atomic Chat, and other supported providers
+ - Keep coding-agent workflows in one place: bash, file tools, grep, glob, agents, tasks, MCP, and web tools
+ - Use the bundled VS Code extension for launch integration and theme support
 
- If you want source builds, Bun workflows, profile launchers, or full provider examples, use:
+ ## Quick Start
 
- - [Advanced Setup](docs/advanced-setup.md)
-
- ---
-
- ## Beginner Install
-
- For most users, install the npm package:
+ ### Install
 
  ```bash
  npm install -g @gitlawb/openclaude
  ```
 
- The package name is `@gitlawb/openclaude`, but the command you run is:
+ If the install later reports `ripgrep not found`, install ripgrep system-wide and confirm `rg --version` works in the same terminal before starting OpenClaude.
+
+ ### Start
 
  ```bash
  openclaude
  ```
 
- If you install via npm and later see `ripgrep not found`, install ripgrep system-wide and confirm `rg --version` works in the same terminal before starting OpenClaude.
+ Inside OpenClaude:
 
- ---
+ - run `/provider` for guided provider setup and saved profiles
+ - run `/onboard-github` for GitHub Models onboarding
 
- ## Fastest Setup
+ ### Fastest OpenAI setup
 
- ### Windows PowerShell
+ macOS / Linux:
 
- ```powershell
- npm install -g @gitlawb/openclaude
+ ```bash
+ export CLAUDE_CODE_USE_OPENAI=1
+ export OPENAI_API_KEY=sk-your-key-here
+ export OPENAI_MODEL=gpt-4o
+
+ openclaude
+ ```
+
+ Windows PowerShell:
 
+ ```powershell
  $env:CLAUDE_CODE_USE_OPENAI="1"
  $env:OPENAI_API_KEY="sk-your-key-here"
  $env:OPENAI_MODEL="gpt-4o"
@@ -54,147 +63,264 @@ $env:OPENAI_MODEL="gpt-4o"
  openclaude
  ```
 
- ### macOS / Linux
+ ### Fastest local Ollama setup
 
- ```bash
- npm install -g @gitlawb/openclaude
+ macOS / Linux:
 
+ ```bash
  export CLAUDE_CODE_USE_OPENAI=1
- export OPENAI_API_KEY=sk-your-key-here
- export OPENAI_MODEL=gpt-4o
+ export OPENAI_BASE_URL=http://localhost:11434/v1
+ export OPENAI_MODEL=qwen2.5-coder:7b
 
  openclaude
  ```
 
- That is enough to start with OpenAI.
+ Windows PowerShell:
 
- ---
+ ```powershell
+ $env:CLAUDE_CODE_USE_OPENAI="1"
+ $env:OPENAI_BASE_URL="http://localhost:11434/v1"
+ $env:OPENAI_MODEL="qwen2.5-coder:7b"
+
+ openclaude
+ ```
 
- ## Choose Your Guide
+ ## Setup Guides
 
- ### Beginner
+ Beginner-friendly guides:
 
- - Want the easiest setup with copy-paste steps: [Non-Technical Setup](docs/non-technical-setup.md)
- - On Windows: [Windows Quick Start](docs/quick-start-windows.md)
- - On macOS or Linux: [macOS / Linux Quick Start](docs/quick-start-mac-linux.md)
+ - [Non-Technical Setup](docs/non-technical-setup.md)
+ - [Windows Quick Start](docs/quick-start-windows.md)
+ - [macOS / Linux Quick Start](docs/quick-start-mac-linux.md)
 
- ### Advanced
+ Advanced and source-build guides:
 
- - Want source builds, Bun, local profiles, runtime checks, or more provider choices: [Advanced Setup](docs/advanced-setup.md)
+ - [Advanced Setup](docs/advanced-setup.md)
+ - [Android Install](ANDROID_INSTALL.md)
 
- ---
+ ## Supported Providers
 
- ## Common Beginner Choices
+ | Provider | Setup Path | Notes |
+ | --- | --- | --- |
+ | OpenAI-compatible | `/provider` or env vars | Works with OpenAI, OpenRouter, DeepSeek, Groq, Mistral, LM Studio, and other compatible `/v1` servers |
+ | Gemini | `/provider` or env vars | Supports API key, access token, or local ADC workflow on current `main` |
+ | GitHub Models | `/onboard-github` | Interactive onboarding with saved credentials |
+ | Codex | `/provider` | Uses existing Codex credentials when available |
+ | Ollama | `/provider` or env vars | Local inference with no API key |
+ | Atomic Chat | advanced setup | Local Apple Silicon backend |
+ | Bedrock / Vertex / Foundry | env vars | Additional provider integrations for supported environments |
 
- ### OpenAI
+ ## What Works
 
- Best default if you already have an OpenAI API key.
+ - **Tool-driven coding workflows**: Bash, file read/write/edit, grep, glob, agents, tasks, MCP, and slash commands
+ - **Streaming responses**: Real-time token output and tool progress
+ - **Tool calling**: Multi-step tool loops with model calls, tool execution, and follow-up responses
+ - **Images**: URL and base64 image inputs for providers that support vision
+ - **Provider profiles**: Guided setup plus saved `.openclaude-profile.json` support
+ - **Local and remote model backends**: Cloud APIs, local servers, and Apple Silicon local inference
+
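The vision support noted above uses the standard OpenAI chat-completions content-part shape for image inputs. As an illustration of that wire format (this is the generic OpenAI format, not OpenClaude's internal message type), a user message can mix a URL image and a base64 data-URL image:

```javascript
// Illustration: OpenAI-style chat message carrying both image input forms.
// The base64 payload here is just the 4-byte PNG magic number as a stand-in.
const base64Png = Buffer.from([0x89, 0x50, 0x4e, 0x47]).toString('base64');

const message = {
  role: 'user',
  content: [
    { type: 'text', text: 'What is in this screenshot?' },
    // Remote image referenced by URL
    { type: 'image_url', image_url: { url: 'https://example.com/shot.png' } },
    // Inline image embedded as a base64 data URL
    { type: 'image_url', image_url: { url: `data:image/png;base64,${base64Png}` } },
  ],
};

console.log(message.content.length); // 3
```

Whether a given backend accepts these parts depends on the selected model's vision support, as the list above notes.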
+ ## Provider Notes
+
+ OpenClaude supports multiple providers, but behavior is not identical across all of them.
+
+ - Anthropic-specific features may not exist on other providers
+ - Tool quality depends heavily on the selected model
+ - Smaller local models can struggle with long multi-step tool flows
+ - Some providers impose lower output caps than the CLI defaults, and OpenClaude adapts where possible
+
+ For best results, use models with strong tool/function calling support.
+
+ ## Agent Routing
+
+ OpenClaude can route different agents to different models through settings-based routing. This is useful for cost optimization or splitting work by model strength.
+
+ Add to `~/.claude/settings.json`:
+
+ ```json
+ {
+   "agentModels": {
+     "deepseek-chat": {
+       "base_url": "https://api.deepseek.com/v1",
+       "api_key": "sk-your-key"
+     },
+     "gpt-4o": {
+       "base_url": "https://api.openai.com/v1",
+       "api_key": "sk-your-key"
+     }
+   },
+   "agentRouting": {
+     "Explore": "deepseek-chat",
+     "Plan": "gpt-4o",
+     "general-purpose": "gpt-4o",
+     "frontend-dev": "deepseek-chat",
+     "default": "gpt-4o"
+   }
+ }
+ ```
 
- ### Ollama
+ When no routing match is found, the global provider remains the fallback.
 
- Best if you want to run models locally on your own machine.
+ > **Note:** `api_key` values in `settings.json` are stored in plaintext. Keep this file private and do not commit it to version control.
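The fallback order described in the Agent Routing section (agent name, then the `default` entry, then the global provider) can be sketched as a small helper. This is a hypothetical illustration, not the project's actual implementation:

```javascript
// Hypothetical sketch of the routing fallback described above:
// agent name -> "default" entry -> global provider.
function resolveAgentModel(settings, agentName, globalModel) {
  const routing = settings.agentRouting ?? {};
  return routing[agentName] ?? routing.default ?? globalModel;
}

const settings = {
  agentRouting: { Explore: 'deepseek-chat', Plan: 'gpt-4o', default: 'gpt-4o' },
};

console.log(resolveAgentModel(settings, 'Explore', 'global-model'));      // deepseek-chat
console.log(resolveAgentModel(settings, 'frontend-dev', 'global-model')); // gpt-4o (via "default")
console.log(resolveAgentModel({}, 'Plan', 'global-model'));               // global-model
```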
 
- ### Codex
+ ## Web Search and Fetch
 
- Best if you already use the Codex CLI or ChatGPT Codex backend.
+ By default, `WebSearch` works on non-Anthropic models using DuckDuckGo. This gives GPT-4o, DeepSeek, Gemini, Ollama, and other OpenAI-compatible providers a free web search path out of the box.
 
- ### Atomic Chat
+ > **Note:** DuckDuckGo fallback works by scraping search results and may be rate-limited, blocked, or subject to DuckDuckGo's Terms of Service. If you want a more reliable supported option, configure Firecrawl.
 
- Best if you want local inference on Apple Silicon with Atomic Chat. See [Advanced Setup](docs/advanced-setup.md).
+ For Anthropic-native backends and Codex responses, OpenClaude keeps the native provider web search behavior.
 
- ---
+ `WebFetch` works, but its basic HTTP plus HTML-to-markdown path can still fail on JavaScript-rendered sites or sites that block plain HTTP requests.
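To see why a plain HTTP plus HTML-to-text pass fails on JavaScript-rendered sites, consider a toy converter (an illustration only, not OpenClaude's actual converter): tag stripping can only recover text that is already present in the fetched HTML.

```javascript
// Toy HTML-to-text pass: drop script bodies, strip tags, collapse whitespace.
// Content injected at runtime by JavaScript never appears in the raw HTML,
// so there is nothing for this pass to extract.
function htmlToText(html) {
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, '') // remove script bodies
    .replace(/<[^>]+>/g, ' ')                   // strip remaining tags
    .replace(/\s+/g, ' ')
    .trim();
}

const staticPage = '<html><body><h1>Docs</h1><p>Hello world</p></body></html>';
const jsPage = '<html><body><div id="app"></div><script>render()</script></body></html>';

console.log(htmlToText(staticPage)); // "Docs Hello world"
console.log(htmlToText(jsPage));     // "" (the content never shipped in the HTML)
```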
 
- ## What Works
+ Set a [Firecrawl](https://firecrawl.dev) API key if you want Firecrawl-powered search/fetch behavior:
+
+ ```bash
+ export FIRECRAWL_API_KEY=your-key-here
+ ```
 
- - **All tools**: Bash, FileRead, FileWrite, FileEdit, Glob, Grep, WebFetch, WebSearch, Agent, MCP, LSP, NotebookEdit, Tasks
- - **Streaming**: Real-time token streaming
- - **Tool calling**: Multi-step tool chains (the model calls tools, gets results, continues)
- - **Images**: Base64 and URL images passed to vision models
- - **Slash commands**: /commit, /review, /compact, /diff, /doctor, etc.
- - **Sub-agents**: AgentTool spawns sub-agents using the same provider
- - **Memory**: Persistent memory system
+ With Firecrawl enabled:
 
- ## What's Different
+ - `WebSearch` can use Firecrawl's search API while DuckDuckGo remains the default free path for non-Claude models
+ - `WebFetch` uses Firecrawl's scrape endpoint instead of raw HTTP, handling JS-rendered pages correctly
 
- - **No thinking mode**: Anthropic's extended thinking is disabled (OpenAI models use different reasoning)
- - **No prompt caching**: Anthropic-specific cache headers are skipped
- - **No beta features**: Anthropic-specific beta headers are ignored
- - **Token limits**: Defaults to 32K max output — some models may cap lower, which is handled gracefully
+ The free tier at [firecrawl.dev](https://firecrawl.dev) includes 500 credits. The key is optional.
 
  ---
 
- ## How It Works
+ ## Headless gRPC Server
 
- The shim (`src/services/api/openaiShim.ts`) sits between Claude Code and the LLM API:
+ OpenClaude can be run as a headless gRPC service, allowing you to integrate its agentic capabilities (tools, bash, file editing) into other applications, CI/CD pipelines, or custom user interfaces. The server uses bidirectional streaming to send real-time text chunks and tool calls, and to request permission for sensitive commands.
 
- ```
- Claude Code Tool System
- |
- v
- Anthropic SDK interface (duck-typed)
- |
- v
- openaiShim.ts <-- translates formats
- |
- v
- OpenAI Chat Completions API
- |
- v
- Any compatible model
+ ### 1. Start the gRPC Server
+
+ Start the core engine as a gRPC service on `localhost:50051`:
+
+ ```bash
+ npm run dev:grpc
  ```
 
- It translates:
- - Anthropic message blocks → OpenAI messages
- - Anthropic tool_use/tool_result → OpenAI function calls
- - OpenAI SSE streaming → Anthropic stream events
- - Anthropic system prompt arrays → OpenAI system messages
+ #### Configuration
 
- The rest of Claude Code doesn't know it's talking to a different model.
+ | Variable | Default | Description |
+ | --- | --- | --- |
+ | `GRPC_PORT` | `50051` | Port the gRPC server listens on |
+ | `GRPC_HOST` | `localhost` | Bind address. Use `0.0.0.0` to expose on all interfaces (not recommended without authentication) |
208
 
154
- ---
209
+ ### 2. Run the Test CLI Client
155
210
 
156
- ## Model Quality Notes
211
+ We provide a lightweight CLI client that communicates exclusively over gRPC. It acts just like the main interactive CLI, rendering colors, streaming tokens, and prompting you for tool permissions (y/n) via the gRPC `action_required` event.
157
212
 
158
- Not all models are equal at agentic tool use. Here's a rough guide:
213
+ In a separate terminal, run:
159
214
 
160
- | Model | Tool Calling | Code Quality | Speed |
161
- |-------|-------------|-------------|-------|
162
- | GPT-4o | Excellent | Excellent | Fast |
163
- | DeepSeek-V3 | Great | Great | Fast |
164
- | Gemini 2.0 Flash | Great | Good | Very Fast |
165
- | Llama 3.3 70B | Good | Good | Medium |
166
- | Mistral Large | Good | Good | Fast |
167
- | GPT-4o-mini | Good | Good | Very Fast |
168
- | Qwen 2.5 72B | Good | Good | Medium |
169
- | Smaller models (<7B) | Limited | Limited | Very Fast |
215
+ ```bash
216
+ npm run dev:grpc:cli
217
+ ```
170
218
 
171
- For best results, use models with strong function/tool calling support.
219
+ *Note: The gRPC definitions are located in `src/proto/openclaude.proto`. You can use this file to generate clients in Python, Go, Rust, or any other language.*
172
220
 
173
221
  ---
174
222
 
175
- ## Files Changed from Original
223
+ ## Source Build And Local Development
224
+
225
+ ```bash
226
+ bun install
227
+ bun run build
228
+ node dist/cli.mjs
229
+ ```
230
+
231
+ Helpful commands:
232
+
233
+ - `bun run dev`
234
+ - `bun test`
235
+ - `bun run test:coverage`
236
+ - `bun run security:pr-scan -- --base origin/main`
237
+ - `bun run smoke`
238
+ - `bun run doctor:runtime`
239
+ - `bun run verify:privacy`
240
+ - focused `bun test ...` runs for the areas you touch
241
+
242
+ ## Testing And Coverage
243
+
244
+ OpenClaude uses Bun's built-in test runner for unit tests.
245
+
246
+ Run the full unit suite:
176
247
 
248
+ ```bash
249
+ bun test
177
250
  ```
178
- src/services/api/openaiShim.ts — NEW: OpenAI-compatible API shim (724 lines)
179
- src/services/api/client.ts — Routes to shim when CLAUDE_CODE_USE_OPENAI=1
180
- src/utils/model/providers.ts — Added 'openai' provider type
181
- src/utils/model/configs.ts — Added openai model mappings
182
- src/utils/model/model.ts — Respects OPENAI_MODEL for defaults
183
- src/utils/auth.ts — Recognizes OpenAI as valid 3P provider
251
+
252
+ Generate unit test coverage:
253
+
254
+ ```bash
255
+ bun run test:coverage
184
256
  ```
185
257
 
186
- 6 files changed. 786 lines added. Zero dependencies added.
258
+ Open the visual coverage report:
187
259
 
188
- ---
260
+ ```bash
261
+ open coverage/index.html
262
+ ```
263
+
264
+ If you already have `coverage/lcov.info` and only want to rebuild the UI:
189
265
 
190
- ## Origin
266
+ ```bash
267
+ bun run test:coverage:ui
268
+ ```
191
269
 
192
- This is a fork of [instructkr/claude-code](https://gitlawb.com/node/repos/z6MkgKkb/instructkr-claude-code), which mirrored the Claude Code source snapshot that became publicly accessible through an npm source map exposure on March 31, 2026.
270
+ Use focused test runs when you only touch one area:
193
271
 
194
- The original Claude Code source is the property of Anthropic. This repository is not affiliated with or endorsed by Anthropic.
272
+ - `bun run test:provider`
273
+ - `bun run test:provider-recommendation`
274
+ - `bun test path/to/file.test.ts`
195
275
 
196
- ---
276
+ Recommended contributor validation before opening a PR:
277
+
278
+ - `bun run build`
279
+ - `bun run smoke`
280
+ - `bun run test:coverage` for broader unit coverage when your change affects shared runtime or provider logic
281
+ - focused `bun test ...` runs for the files and flows you changed
282
+
283
+ Coverage output is written to `coverage/lcov.info`, and OpenClaude also generates a git-activity-style heatmap at `coverage/index.html`.
284
+ ## Repository Structure
285
+
286
+ - `src/` - core CLI/runtime
287
+ - `scripts/` - build, verification, and maintenance scripts
288
+ - `docs/` - setup, contributor, and project documentation
289
+ - `python/` - standalone Python helpers and their tests
290
+ - `vscode-extension/openclaude-vscode/` - VS Code extension
291
+ - `.github/` - repo automation, templates, and CI configuration
292
+ - `bin/` - CLI launcher entrypoints
293
+
294
+ ## VS Code Extension
295
+
296
+ The repo includes a VS Code extension in [`vscode-extension/openclaude-vscode`](vscode-extension/openclaude-vscode) for OpenClaude launch integration, provider-aware control-center UI, and theme support.
297
+
298
+ ## Security
299
+
300
+ If you believe you found a security issue, see [SECURITY.md](SECURITY.md).
301
+
302
+ ## Community
303
+
304
+ - Use [GitHub Discussions](https://github.com/Gitlawb/openclaude/discussions) for Q&A, ideas, and community conversation
305
+ - Use [GitHub Issues](https://github.com/Gitlawb/openclaude/issues) for confirmed bugs and actionable feature work
306
+
307
+ ## Contributing
308
+
309
+ Contributions are welcome.
310
+
311
+ For larger changes, open an issue first so the scope is clear before implementation. Helpful validation commands include:
312
+
313
+ - `bun run build`
314
+ - `bun run test:coverage`
315
+ - `bun run smoke`
316
+ - focused `bun test ...` runs for touched areas
317
+
318
+ ## Disclaimer
319
+
320
+ OpenClaude is an independent community project and is not affiliated with, endorsed by, or sponsored by Anthropic.
321
+
322
+ OpenClaude originated from the Claude Code codebase and has since been substantially modified to support multiple providers and open use. "Claude" and "Claude Code" are trademarks of Anthropic PBC. See [LICENSE](LICENSE) for details.
197
323
 
198
324
  ## License
199
325
 
200
- This repository is provided for educational and research purposes. The original source code is subject to Anthropic's terms. The OpenAI shim additions are public domain.
326
+ See [LICENSE](LICENSE).
@@ -1,7 +1,13 @@
- import { join } from 'path'
+ import { join, win32 } from 'path'
  import { pathToFileURL } from 'url'
 
  export function getDistImportSpecifier(baseDir) {
-   const distPath = join(baseDir, '..', 'dist', 'cli.mjs')
+   if (/^[A-Za-z]:\\/.test(baseDir)) {
+     const distPath = win32.join(baseDir, '..', 'dist', 'cli.mjs')
+     return `file:///${distPath.replace(/\\/g, '/')}`
+   }
+
+   const joinImpl = join
+   const distPath = joinImpl(baseDir, '..', 'dist', 'cli.mjs')
    return pathToFileURL(distPath).href
  }
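The patched function in the hunk above can be exercised directly. This self-contained copy (with the `export` keyword dropped so it runs as a plain script) shows the added Windows branch converting a drive-letter path into a forward-slash `file://` URL without going through `pathToFileURL`:

```javascript
// Self-contained copy of the patched launcher helper for demonstration.
// path.win32.join handles backslash paths correctly on any host OS,
// which is what makes the Windows branch testable cross-platform.
import { join, win32 } from 'path';
import { pathToFileURL } from 'url';

function getDistImportSpecifier(baseDir) {
  if (/^[A-Za-z]:\\/.test(baseDir)) {
    const distPath = win32.join(baseDir, '..', 'dist', 'cli.mjs');
    return `file:///${distPath.replace(/\\/g, '/')}`;
  }
  const distPath = join(baseDir, '..', 'dist', 'cli.mjs');
  return pathToFileURL(distPath).href;
}

console.log(getDistImportSpecifier('C:\\tools\\openclaude\\bin'));
// file:///C:/tools/openclaude/dist/cli.mjs
```

On POSIX paths the original `pathToFileURL` branch still runs unchanged; the regex only diverts drive-letter paths of the form `C:\...`.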