@gitlawb/openclaude 0.1.7 → 0.1.8

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/LICENSE ADDED
@@ -0,0 +1,29 @@
+ NOTICE
+
+ This repository contains code derived from Anthropic's Claude Code CLI.
+
+ The original Claude Code source is proprietary software:
+ Copyright (c) Anthropic PBC. All rights reserved.
+ Subject to Anthropic's Commercial Terms of Service.
+
+ Modifications and additions by OpenClaude contributors are offered under
+ the MIT License where legally permissible:
+
+ MIT License
+ Copyright (c) 2026 OpenClaude contributors (modifications only)
+
+ Permission is hereby granted, free of charge, to any person obtaining
+ a copy of the modifications made by OpenClaude contributors, to deal
+ in those modifications without restriction, including without limitation
+ the rights to use, copy, modify, merge, publish, distribute, sublicense,
+ and/or sell copies, subject to the following conditions:
+
+ The above copyright notice and this permission notice shall be included
+ in all copies or substantial portions of the modifications.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND.
+
+ The underlying derived code remains subject to Anthropic's copyright.
+ This project does not have Anthropic's authorization to distribute
+ their proprietary source. Users and contributors should evaluate their
+ own legal position.
package/README.md CHANGED
@@ -1,52 +1,61 @@
  # OpenClaude
 
- Use Claude Code with **any LLM** not just Claude.
+ OpenClaude is an open-source coding-agent CLI for cloud and local model providers.
 
- OpenClaude is a fork of the [Claude Code source leak](https://gitlawb.com/node/repos/z6MkgKkb/instructkr-claude-code) (exposed via npm source maps on March 31, 2026). We added an OpenAI-compatible provider shim so you can plug in GPT-4o, DeepSeek, Gemini, Llama, Mistral, or any model that speaks the OpenAI chat completions API. It now also supports the ChatGPT Codex backend for `codexplan` and `codexspark`, and local inference via [Atomic Chat](https://atomic.chat/) on Apple Silicon.
+ Use OpenAI-compatible APIs, Gemini, GitHub Models, Codex, Ollama, Atomic Chat, and other supported backends while keeping one terminal-first workflow: prompts, tools, agents, MCP, slash commands, and streaming output.
 
- All of Claude Code's tools work — bash, file read/write/edit, grep, glob, agents, tasks, MCP — just powered by whatever model you choose.
+ [![PR Checks](https://github.com/Gitlawb/openclaude/actions/workflows/pr-checks.yml/badge.svg?branch=main)](https://github.com/Gitlawb/openclaude/actions/workflows/pr-checks.yml)
+ [![Release](https://img.shields.io/github/v/tag/Gitlawb/openclaude?label=release&color=0ea5e9)](https://github.com/Gitlawb/openclaude/tags)
+ [![Discussions](https://img.shields.io/badge/discussions-open-7c3aed)](https://github.com/Gitlawb/openclaude/discussions)
+ [![Security Policy](https://img.shields.io/badge/security-policy-0f766e)](SECURITY.md)
+ [![License](https://img.shields.io/badge/license-MIT-2563eb)](LICENSE)
 
- ---
+ [Quick Start](#quick-start) | [Setup Guides](#setup-guides) | [Providers](#supported-providers) | [Source Build](#source-build-and-local-development) | [VS Code Extension](#vs-code-extension) | [Community](#community)
 
- ## Start Here
+ ## Why OpenClaude
 
- If you are new to terminals or just want the easiest path, start with the beginner guides:
+ - Use one CLI across cloud APIs and local model backends
+ - Save provider profiles inside the app with `/provider`
+ - Run with OpenAI-compatible services, Gemini, GitHub Models, Codex, Ollama, Atomic Chat, and other supported providers
+ - Keep coding-agent workflows in one place: bash, file tools, grep, glob, agents, tasks, MCP, and web tools
+ - Use the bundled VS Code extension for launch integration and theme support
 
- - [Non-Technical Setup](docs/non-technical-setup.md)
- - [Windows Quick Start](docs/quick-start-windows.md)
- - [macOS / Linux Quick Start](docs/quick-start-mac-linux.md)
-
- If you want source builds, Bun workflows, profile launchers, or full provider examples, use:
-
- - [Advanced Setup](docs/advanced-setup.md)
-
- ---
-
- ## Beginner Install
+ ## Quick Start
 
- For most users, install the npm package:
+ ### Install
 
  ```bash
  npm install -g @gitlawb/openclaude
  ```
 
- The package name is `@gitlawb/openclaude`, but the command you run is:
+ If the install later reports `ripgrep not found`, install ripgrep system-wide and confirm `rg --version` works in the same terminal before starting OpenClaude.
+
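If `rg` is missing, installing it is a one-liner on most platforms. The package names below are the standard ones for each package manager; pick the line for your system:

```shell
# macOS (Homebrew)
brew install ripgrep

# Debian/Ubuntu
sudo apt-get install -y ripgrep

# Windows (winget, run in PowerShell)
winget install BurntSushi.ripgrep.MSVC

# Verify it is reachable from the same terminal
rg --version
```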
+ ### Start
 
  ```bash
  openclaude
  ```
 
- If you install via npm and later see `ripgrep not found`, install ripgrep system-wide and confirm `rg --version` works in the same terminal before starting OpenClaude.
+ Inside OpenClaude:
 
- ---
+ - run `/provider` for guided provider setup and saved profiles
+ - run `/onboard-github` for GitHub Models onboarding
 
- ## Fastest Setup
+ ### Fastest OpenAI setup
 
- ### Windows PowerShell
+ macOS / Linux:
 
- ```powershell
- npm install -g @gitlawb/openclaude
+ ```bash
+ export CLAUDE_CODE_USE_OPENAI=1
+ export OPENAI_API_KEY=sk-your-key-here
+ export OPENAI_MODEL=gpt-4o
 
+ openclaude
+ ```
+
+ Windows PowerShell:
+
+ ```powershell
  $env:CLAUDE_CODE_USE_OPENAI="1"
  $env:OPENAI_API_KEY="sk-your-key-here"
  $env:OPENAI_MODEL="gpt-4o"
@@ -54,147 +63,229 @@ $env:OPENAI_MODEL="gpt-4o"
  openclaude
  ```
 
- ### macOS / Linux
+ ### Fastest local Ollama setup
 
- ```bash
- npm install -g @gitlawb/openclaude
+ macOS / Linux:
 
+ ```bash
  export CLAUDE_CODE_USE_OPENAI=1
- export OPENAI_API_KEY=sk-your-key-here
- export OPENAI_MODEL=gpt-4o
+ export OPENAI_BASE_URL=http://localhost:11434/v1
+ export OPENAI_MODEL=qwen2.5-coder:7b
 
  openclaude
  ```
 
- That is enough to start with OpenAI.
+ Windows PowerShell:
 
- ---
+ ```powershell
+ $env:CLAUDE_CODE_USE_OPENAI="1"
+ $env:OPENAI_BASE_URL="http://localhost:11434/v1"
+ $env:OPENAI_MODEL="qwen2.5-coder:7b"
 
- ## Choose Your Guide
+ openclaude
+ ```
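Before pointing OpenClaude at Ollama, make sure the model tag used above actually exists locally. Assuming a default Ollama install listening on port 11434:

```shell
# Pull the model referenced in the examples above
ollama pull qwen2.5-coder:7b

# Sanity-check Ollama's OpenAI-compatible endpoint
curl -s http://localhost:11434/v1/models
```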
 
- ### Beginner
+ ## Setup Guides
 
- - Want the easiest setup with copy-paste steps: [Non-Technical Setup](docs/non-technical-setup.md)
- - On Windows: [Windows Quick Start](docs/quick-start-windows.md)
- - On macOS or Linux: [macOS / Linux Quick Start](docs/quick-start-mac-linux.md)
+ Beginner-friendly guides:
 
- ### Advanced
+ - [Non-Technical Setup](docs/non-technical-setup.md)
+ - [Windows Quick Start](docs/quick-start-windows.md)
+ - [macOS / Linux Quick Start](docs/quick-start-mac-linux.md)
 
- - Want source builds, Bun, local profiles, runtime checks, or more provider choices: [Advanced Setup](docs/advanced-setup.md)
+ Advanced and source-build guides:
 
- ---
+ - [Advanced Setup](docs/advanced-setup.md)
+ - [Android Install](ANDROID_INSTALL.md)
 
- ## Common Beginner Choices
+ ## Supported Providers
 
- ### OpenAI
+ | Provider | Setup Path | Notes |
+ | --- | --- | --- |
+ | OpenAI-compatible | `/provider` or env vars | Works with OpenAI, OpenRouter, DeepSeek, Groq, Mistral, LM Studio, and other compatible `/v1` servers |
+ | Gemini | `/provider` or env vars | Supports API key, access token, or local ADC workflow on current `main` |
+ | GitHub Models | `/onboard-github` | Interactive onboarding with saved credentials |
+ | Codex | `/provider` | Uses existing Codex credentials when available |
+ | Ollama | `/provider` or env vars | Local inference with no API key |
+ | Atomic Chat | advanced setup | Local Apple Silicon backend |
+ | Bedrock / Vertex / Foundry | env vars | Additional provider integrations for supported environments |
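Any server in the OpenAI-compatible row can be targeted by overriding the base URL. For example, with OpenRouter (the base URL is OpenRouter's documented `/v1` endpoint; the model id is illustrative, substitute one your account can access):

```shell
export CLAUDE_CODE_USE_OPENAI=1
export OPENAI_BASE_URL=https://openrouter.ai/api/v1
export OPENAI_API_KEY=sk-or-your-key-here
export OPENAI_MODEL=deepseek/deepseek-chat

openclaude
```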
 
- Best default if you already have an OpenAI API key.
+ ## What Works
 
- ### Ollama
+ - **Tool-driven coding workflows**: Bash, file read/write/edit, grep, glob, agents, tasks, MCP, and slash commands
+ - **Streaming responses**: Real-time token output and tool progress
+ - **Tool calling**: Multi-step tool loops with model calls, tool execution, and follow-up responses
+ - **Images**: URL and base64 image inputs for providers that support vision
+ - **Provider profiles**: Guided setup plus saved `.openclaude-profile.json` support
+ - **Local and remote model backends**: Cloud APIs, local servers, and Apple Silicon local inference
+
+ ## Provider Notes
+
+ OpenClaude supports multiple providers, but behavior is not identical across all of them.
+
+ - Anthropic-specific features may not exist on other providers
+ - Tool quality depends heavily on the selected model
+ - Smaller local models can struggle with long multi-step tool flows
+ - Some providers impose lower output caps than the CLI defaults, and OpenClaude adapts where possible
+
+ For best results, use models with strong tool/function calling support.
+
+ ## Agent Routing
+
+ OpenClaude can route different agents to different models through settings-based routing. This is useful for cost optimization or for splitting work by model strength.
+
+ Add to `~/.claude/settings.json`:
+
+ ```json
+ {
+   "agentModels": {
+     "deepseek-chat": {
+       "base_url": "https://api.deepseek.com/v1",
+       "api_key": "sk-your-key"
+     },
+     "gpt-4o": {
+       "base_url": "https://api.openai.com/v1",
+       "api_key": "sk-your-key"
+     }
+   },
+   "agentRouting": {
+     "Explore": "deepseek-chat",
+     "Plan": "gpt-4o",
+     "general-purpose": "gpt-4o",
+     "frontend-dev": "deepseek-chat",
+     "default": "gpt-4o"
+   }
+ }
+ ```
 
- Best if you want to run models locally on your own machine.
+ When no routing match is found, the global provider remains the fallback.
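The fallback order described above can be sketched as follows. This is a hypothetical helper for illustration, not OpenClaude's actual implementation:

```javascript
// Resolve which model an agent should use: exact agent match first,
// then the "default" route, then the globally configured provider.
function resolveAgentModel(settings, agentName, globalModel) {
  const routing = (settings && settings.agentRouting) || {};
  return routing[agentName] || routing.default || globalModel;
}

const settings = {
  agentRouting: { Explore: 'deepseek-chat', default: 'gpt-4o' },
};

console.log(resolveAgentModel(settings, 'Explore', 'global-model')); // deepseek-chat
console.log(resolveAgentModel(settings, 'Plan', 'global-model'));    // gpt-4o (default route)
console.log(resolveAgentModel({}, 'Plan', 'global-model'));          // global-model (fallback)
```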
 
- ### Codex
+ > **Note:** `api_key` values in `settings.json` are stored in plaintext. Keep this file private and do not commit it to version control.
 
- Best if you already use the Codex CLI or ChatGPT Codex backend.
+ ## Web Search and Fetch
 
- ### Atomic Chat
+ By default, `WebSearch` works on non-Anthropic models using DuckDuckGo. This gives GPT-4o, DeepSeek, Gemini, Ollama, and other non-Anthropic providers a free web search path out of the box.
 
- Best if you want local inference on Apple Silicon with Atomic Chat. See [Advanced Setup](docs/advanced-setup.md).
+ > **Note:** The DuckDuckGo fallback works by scraping search results and may be rate-limited, blocked, or subject to DuckDuckGo's Terms of Service. If you want a more reliable supported option, configure Firecrawl.
 
- ---
+ For Anthropic-native backends and Codex responses, OpenClaude keeps the native provider web search behavior.
 
- ## What Works
+ `WebFetch` works, but its basic HTTP plus HTML-to-markdown path can still fail on JavaScript-rendered sites or sites that block plain HTTP requests.
 
- - **All tools**: Bash, FileRead, FileWrite, FileEdit, Glob, Grep, WebFetch, WebSearch, Agent, MCP, LSP, NotebookEdit, Tasks
- - **Streaming**: Real-time token streaming
- - **Tool calling**: Multi-step tool chains (the model calls tools, gets results, continues)
- - **Images**: Base64 and URL images passed to vision models
- - **Slash commands**: /commit, /review, /compact, /diff, /doctor, etc.
- - **Sub-agents**: AgentTool spawns sub-agents using the same provider
- - **Memory**: Persistent memory system
+ Set a [Firecrawl](https://firecrawl.dev) API key if you want Firecrawl-powered search/fetch behavior:
 
- ## What's Different
+ ```bash
+ export FIRECRAWL_API_KEY=your-key-here
+ ```
 
- - **No thinking mode**: Anthropic's extended thinking is disabled (OpenAI models use different reasoning)
- - **No prompt caching**: Anthropic-specific cache headers are skipped
- - **No beta features**: Anthropic-specific beta headers are ignored
- - **Token limits**: Defaults to 32K max output — some models may cap lower, which is handled gracefully
+ With Firecrawl enabled:
 
- ---
+ - `WebSearch` can use Firecrawl's search API while DuckDuckGo remains the default free path for non-Claude models
+ - `WebFetch` uses Firecrawl's scrape endpoint instead of raw HTTP, handling JS-rendered pages correctly
 
- ## How It Works
+ The free tier at [firecrawl.dev](https://firecrawl.dev) includes 500 credits. The key is optional.
 
- The shim (`src/services/api/openaiShim.ts`) sits between Claude Code and the LLM API:
+ ## Source Build and Local Development
 
- ```
- Claude Code Tool System
- |
- v
- Anthropic SDK interface (duck-typed)
- |
- v
- openaiShim.ts <-- translates formats
- |
- v
- OpenAI Chat Completions API
- |
- v
- Any compatible model
+ ```bash
+ bun install
+ bun run build
+ node dist/cli.mjs
  ```
 
- It translates:
- - Anthropic message blocks → OpenAI messages
- - Anthropic tool_use/tool_result → OpenAI function calls
- - OpenAI SSE streaming → Anthropic stream events
- - Anthropic system prompt arrays → OpenAI system messages
+ Helpful commands:
 
- The rest of Claude Code doesn't know it's talking to a different model.
+ - `bun run dev`
+ - `bun test`
+ - `bun run test:coverage`
+ - `bun run security:pr-scan -- --base origin/main`
+ - `bun run smoke`
+ - `bun run doctor:runtime`
+ - `bun run verify:privacy`
+ - focused `bun test ...` runs for the areas you touch
 
- ---
+ ## Testing and Coverage
 
- ## Model Quality Notes
+ OpenClaude uses Bun's built-in test runner for unit tests.
 
- Not all models are equal at agentic tool use. Here's a rough guide:
+ Run the full unit suite:
 
- | Model | Tool Calling | Code Quality | Speed |
- |-------|-------------|-------------|-------|
- | GPT-4o | Excellent | Excellent | Fast |
- | DeepSeek-V3 | Great | Great | Fast |
- | Gemini 2.0 Flash | Great | Good | Very Fast |
- | Llama 3.3 70B | Good | Good | Medium |
- | Mistral Large | Good | Good | Fast |
- | GPT-4o-mini | Good | Good | Very Fast |
- | Qwen 2.5 72B | Good | Good | Medium |
- | Smaller models (<7B) | Limited | Limited | Very Fast |
+ ```bash
+ bun test
+ ```
 
- For best results, use models with strong function/tool calling support.
+ Generate unit test coverage:
 
- ---
+ ```bash
+ bun run test:coverage
+ ```
 
- ## Files Changed from Original
+ Open the visual coverage report:
 
+ ```bash
+ open coverage/index.html
  ```
- src/services/api/openaiShim.ts — NEW: OpenAI-compatible API shim (724 lines)
- src/services/api/client.ts — Routes to shim when CLAUDE_CODE_USE_OPENAI=1
- src/utils/model/providers.ts — Added 'openai' provider type
- src/utils/model/configs.ts — Added openai model mappings
- src/utils/model/model.ts — Respects OPENAI_MODEL for defaults
- src/utils/auth.ts — Recognizes OpenAI as valid 3P provider
+
+ If you already have `coverage/lcov.info` and only want to rebuild the UI:
+
+ ```bash
+ bun run test:coverage:ui
  ```
 
- 6 files changed. 786 lines added. Zero dependencies added.
+ Use focused test runs when you only touch one area:
+
+ - `bun run test:provider`
+ - `bun run test:provider-recommendation`
+ - `bun test path/to/file.test.ts`
+
+ Recommended contributor validation before opening a PR:
+
+ - `bun run build`
+ - `bun run smoke`
+ - `bun run test:coverage` for broader unit coverage when your change affects shared runtime or provider logic
+ - focused `bun test ...` runs for the files and flows you changed
+
+ Coverage output is written to `coverage/lcov.info`, and OpenClaude also generates a git-activity-style heatmap at `coverage/index.html`.
+
+ ## Repository Structure
+
+ - `src/` - core CLI/runtime
+ - `scripts/` - build, verification, and maintenance scripts
+ - `docs/` - setup, contributor, and project documentation
+ - `python/` - standalone Python helpers and their tests
+ - `vscode-extension/openclaude-vscode/` - VS Code extension
+ - `.github/` - repo automation, templates, and CI configuration
+ - `bin/` - CLI launcher entrypoints
+
+ ## VS Code Extension
+
+ The repo includes a VS Code extension in [`vscode-extension/openclaude-vscode`](vscode-extension/openclaude-vscode) for OpenClaude launch integration, a provider-aware control-center UI, and theme support.
+
+ ## Security
+
+ If you believe you have found a security issue, see [SECURITY.md](SECURITY.md).
+
+ ## Community
+
+ - Use [GitHub Discussions](https://github.com/Gitlawb/openclaude/discussions) for Q&A, ideas, and community conversation
+ - Use [GitHub Issues](https://github.com/Gitlawb/openclaude/issues) for confirmed bugs and actionable feature work
+
+ ## Contributing
+
+ Contributions are welcome.
 
- ---
+ For larger changes, open an issue first so the scope is clear before implementation. Helpful validation commands include:
 
- ## Origin
+ - `bun run build`
+ - `bun run test:coverage`
+ - `bun run smoke`
+ - focused `bun test ...` runs for touched areas
 
- This is a fork of [instructkr/claude-code](https://gitlawb.com/node/repos/z6MkgKkb/instructkr-claude-code), which mirrored the Claude Code source snapshot that became publicly accessible through an npm source map exposure on March 31, 2026.
+ ## Disclaimer
 
- The original Claude Code source is the property of Anthropic. This repository is not affiliated with or endorsed by Anthropic.
+ OpenClaude is an independent community project and is not affiliated with, endorsed by, or sponsored by Anthropic.
 
- ---
+ OpenClaude originated from the Claude Code codebase and has since been substantially modified to support multiple providers and open use. "Claude" and "Claude Code" are trademarks of Anthropic PBC. See [LICENSE](LICENSE) for details.
 
  ## License
 
- This repository is provided for educational and research purposes. The original source code is subject to Anthropic's terms. The OpenAI shim additions are public domain.
+ See [LICENSE](LICENSE).
@@ -1,7 +1,13 @@
- import { join } from 'path'
+ import { join, win32 } from 'path'
  import { pathToFileURL } from 'url'
 
  export function getDistImportSpecifier(baseDir) {
-   const distPath = join(baseDir, '..', 'dist', 'cli.mjs')
+   if (/^[A-Za-z]:\\/.test(baseDir)) {
+     const distPath = win32.join(baseDir, '..', 'dist', 'cli.mjs')
+     return `file:///${distPath.replace(/\\/g, '/')}`
+   }
+
+   const joinImpl = join
+   const distPath = joinImpl(baseDir, '..', 'dist', 'cli.mjs')
    return pathToFileURL(distPath).href
  }