@amirtechai/xclaude 0.1.7

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/LICENSE ADDED
@@ -0,0 +1,29 @@
NOTICE

This repository contains code derived from Anthropic's Claude Code CLI.

The original Claude Code source is proprietary software:
Copyright (c) Anthropic PBC. All rights reserved.
Subject to Anthropic's Commercial Terms of Service.

Modifications and additions by OpenClaude contributors are offered under
the MIT License where legally permissible:

MIT License
Copyright (c) 2026 OpenClaude contributors (modifications only)

Permission is hereby granted, free of charge, to any person obtaining
a copy of the modifications made by OpenClaude contributors, to deal
in those modifications without restriction, including without limitation
the rights to use, copy, modify, merge, publish, distribute, sublicense,
and/or sell copies, subject to the following conditions:

The above copyright notice and this permission notice shall be included
in all copies or substantial portions of the modifications.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND.

The underlying derived code remains subject to Anthropic's copyright.
This project does not have Anthropic's authorization to distribute
their proprietary source. Users and contributors should evaluate their
own legal position.
package/README.md ADDED
@@ -0,0 +1,277 @@
# Xclaude

Xclaude is an open-source coding-agent CLI that works with more than one model provider.

Use OpenAI-compatible APIs, Gemini, GitHub Models, Codex, Ollama, Atomic Chat, and other supported backends while keeping the same terminal-first workflow: prompts, tools, agents, MCP, slash commands, and streaming output.

## Why Xclaude

- Use one CLI across cloud and local model providers
- Save provider profiles inside the app with `/provider`
- Run locally with Ollama or Atomic Chat
- Keep core coding-agent workflows: bash, file tools, grep, glob, agents, tasks, MCP, and web tools

## Provenance & Legal Notice

Xclaude is derived from Anthropic's Claude Code CLI source code, which was inadvertently exposed in March 2026 through a packaging error in npm. The original Claude Code source is proprietary software owned by Anthropic PBC.

This project adds multi-provider support, strips telemetry, and adapts the codebase for open use. It is not an authorized fork or open-source release by Anthropic.

**"Claude" and "Claude Code" are trademarks of Anthropic PBC.**

Contributors should be aware that the legal status of distributing code derived from Anthropic's proprietary source is unresolved. See the LICENSE file for details.

---
## Quick Start

### Install

```bash
npm install -g @amirtechai/xclaude
```

If the installed CLI later reports `ripgrep not found`, install ripgrep system-wide and confirm `rg --version` works in the same terminal before starting Xclaude.

### Start

```bash
xclaude
```

Inside Xclaude:

- run `/provider` for guided setup of OpenAI-compatible, Gemini, Ollama, or Codex profiles
- run `/onboard-github` for GitHub Models setup
### Fastest OpenAI setup

macOS / Linux:

```bash
export CLAUDE_CODE_USE_OPENAI=1
export OPENAI_API_KEY=sk-your-key-here
export OPENAI_MODEL=gpt-4o

xclaude
```

Windows PowerShell:

```powershell
$env:CLAUDE_CODE_USE_OPENAI="1"
$env:OPENAI_API_KEY="sk-your-key-here"
$env:OPENAI_MODEL="gpt-4o"

xclaude
```
### Fastest local Ollama setup

macOS / Linux:

```bash
export CLAUDE_CODE_USE_OPENAI=1
export OPENAI_BASE_URL=http://localhost:11434/v1
export OPENAI_MODEL=qwen2.5-coder:7b

xclaude
```

Windows PowerShell:

```powershell
$env:CLAUDE_CODE_USE_OPENAI="1"
$env:OPENAI_BASE_URL="http://localhost:11434/v1"
$env:OPENAI_MODEL="qwen2.5-coder:7b"

xclaude
```
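The OpenAI and Ollama setups above go through the same OpenAI-compatible switch; only the base URL and model change. A minimal sketch of how this kind of env-var resolution could work (illustrative only: `resolveOpenAIConfig` and its defaults are assumptions, not Xclaude's actual internals):

```javascript
// Hypothetical sketch: derive an OpenAI-compatible provider config from env vars.
// The function name and default values are assumptions, not Xclaude internals.
function resolveOpenAIConfig(env) {
  if (env.CLAUDE_CODE_USE_OPENAI !== '1') return null // switch not enabled
  return {
    baseURL: env.OPENAI_BASE_URL ?? 'https://api.openai.com/v1', // Ollama overrides this with its local /v1 server
    apiKey: env.OPENAI_API_KEY ?? '', // local servers like Ollama ignore the key
    model: env.OPENAI_MODEL ?? 'gpt-4o',
  }
}
```

The same few variables therefore cover every OpenAI-compatible backend, from hosted APIs to a local Ollama server.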
---
## Setup Guides

Beginner-friendly guides:

- [Non-Technical Setup](docs/non-technical-setup.md)
- [Windows Quick Start](docs/quick-start-windows.md)
- [macOS / Linux Quick Start](docs/quick-start-mac-linux.md)

Advanced and source-build guides:

- [Advanced Setup](docs/advanced-setup.md)
- [Android Install](ANDROID_INSTALL.md)

---

## Supported Providers

| Provider | Setup Path | Notes |
| --- | --- | --- |
| OpenAI-compatible | `/provider` or env vars | Works with OpenAI, OpenRouter, DeepSeek, Groq, Mistral, LM Studio, and compatible local `/v1` servers |
| Gemini | `/provider` or env vars | Google Gemini support through the runtime provider layer |
| GitHub Models | `/onboard-github` | Interactive onboarding with saved credentials |
| Codex | `/provider` | Uses existing Codex credentials when available |
| Ollama | `/provider` or env vars | Local inference with no API key |
| Atomic Chat | advanced setup | Local Apple Silicon backend |
| Bedrock / Vertex / Foundry | env vars | Additional provider integrations for supported environments |

---
## What Works

- **Tool-driven coding workflows**: Bash, file read/write/edit, grep, glob, agents, tasks, MCP, and slash commands
- **Streaming responses**: real-time token output and tool progress
- **Tool calling**: multi-step tool loops with model calls, tool execution, and follow-up responses
- **Images**: URL and base64 image inputs for providers that support vision
- **Provider profiles**: guided setup plus saved `.xclaude-profile.json` support
- **Local and remote model backends**: cloud APIs, local servers, and Apple Silicon local inference

---
## Provider Notes

Xclaude supports multiple providers, but behavior is not identical across all of them.

- Anthropic-specific features may not exist on other providers
- Tool quality depends heavily on the selected model
- Smaller local models can struggle with long multi-step tool flows
- Some providers impose lower output caps than the CLI defaults, and Xclaude adapts where possible

For best results, use models with strong tool/function calling support.

---
## Agent Routing

Route different agents to different AI providers within the same session. This is useful for cost optimization (a cheap model for code review, a powerful model for complex coding) or for playing to each model's strengths.

### Configuration

Add to `~/.claude/settings.json`:

```json
{
  "agentModels": {
    "deepseek-chat": {
      "base_url": "https://api.deepseek.com/v1",
      "api_key": "sk-your-key"
    },
    "gpt-4o": {
      "base_url": "https://api.openai.com/v1",
      "api_key": "sk-your-key"
    }
  },
  "agentRouting": {
    "Explore": "deepseek-chat",
    "Plan": "gpt-4o",
    "general-purpose": "gpt-4o",
    "frontend-dev": "deepseek-chat",
    "default": "gpt-4o"
  }
}
```
### How It Works

- **agentModels**: maps model names to OpenAI-compatible API endpoints
- **agentRouting**: maps agent types or team member names to model names
- **Priority**: `name` > `subagent_type` > `"default"` > global provider
- **Matching**: case-insensitive, with hyphens and underscores treated as equivalent (`general-purpose` = `general_purpose`)
- **Teams**: team members are routed by their `name`, with no extra config needed

When no routing match is found, the global provider (env vars) is used as a fallback.

> **Note:** `api_key` values in `settings.json` are stored in plaintext. Keep this file private and do not commit it to version control.
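The priority and matching rules above can be condensed into a small lookup. The sketch below is a hypothetical helper mirroring the documented behavior, not the actual implementation:

```javascript
// Hypothetical sketch of the documented routing rules: case-insensitive matching,
// hyphens and underscores equivalent, priority name > subagent_type > "default".
const normalize = (s) => s.toLowerCase().replace(/_/g, '-')

function routeAgent(agentRouting, { name, subagent_type }) {
  // Build a lookup table with normalized keys so "General_Purpose" matches "general-purpose".
  const table = Object.fromEntries(
    Object.entries(agentRouting).map(([k, v]) => [normalize(k), v]),
  )
  for (const key of [name, subagent_type, 'default']) {
    if (key && table[normalize(key)] !== undefined) return table[normalize(key)]
  }
  return null // caller falls back to the global provider (env vars)
}
```

With the example config above, `routeAgent(agentRouting, { subagent_type: 'general_purpose' })` resolves to `"gpt-4o"` even though the config spells it `general-purpose`.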

---

## Web Search and Fetch

By default, `WebSearch` now works on non-Anthropic models using DuckDuckGo. This gives GPT-4o, DeepSeek, Gemini, Ollama, and other OpenAI-compatible providers a free web search path out of the box.

> **Note:** the DuckDuckGo fallback works by scraping search results and may be rate-limited, blocked, or subject to DuckDuckGo's Terms of Service. If you want a more reliable supported option, configure Firecrawl.

For Anthropic-native backends (Anthropic/Vertex/Foundry) and Codex responses, Xclaude keeps the native provider web search behavior.

`WebFetch` works but uses basic HTTP plus HTML-to-markdown conversion. That fails on JavaScript-rendered pages (React, Next.js, Vue SPAs) and on sites that block plain HTTP requests.
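To see why, compare a server-rendered page with an SPA shell under a naive tag-stripping conversion (a toy illustration, not Xclaude's actual HTML-to-markdown converter):

```javascript
// Toy illustration: naive HTML-to-text extraction finds nothing in an SPA shell,
// because the real content is only injected by JavaScript at runtime.
const naiveTextOf = (html) => html.replace(/<[^>]*>/g, '').trim()

const serverRendered = '<article><h1>Docs</h1><p>Hello</p></article>'
const spaShell = '<div id="root"></div><script src="/bundle.js"></script>'

naiveTextOf(serverRendered) // 'DocsHello'
naiveTextOf(spaShell)       // '' (nothing to extract before JS runs)
```

A scraping backend that executes the page's JavaScript first, such as Firecrawl, avoids this failure mode.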
Set a [Firecrawl](https://firecrawl.dev) API key if you want Firecrawl-powered search/fetch behavior:

```bash
export FIRECRAWL_API_KEY=your-key-here
```

With Firecrawl enabled:

- `WebSearch` can use Firecrawl's search API (while DuckDuckGo remains the default free path for non-Claude models)
- `WebFetch` uses Firecrawl's scrape endpoint instead of raw HTTP, handling JS-rendered pages correctly

The free tier at [firecrawl.dev](https://firecrawl.dev) includes 500 credits. The key is optional.

---

## Source Build

```bash
bun install
bun run build
node dist/cli.mjs
```

Helpful commands:

- `bun run dev`
- `bun run smoke`
- `bun run doctor:runtime`

---

## VS Code Extension

The repo includes a VS Code extension in [`vscode-extension/xclaude-vscode`](vscode-extension/xclaude-vscode) for Xclaude launch integration and theme support.

---

## Security

If you believe you have found a security issue, see [SECURITY.md](SECURITY.md).

---

## Contributing

Contributions are welcome.

For larger changes, open an issue first so the scope is clear before implementation. Helpful validation commands include:

- `bun run build`
- `bun run smoke`
- focused `bun test ...` runs for touched areas

---

## Disclaimer

Xclaude is an independent community project and is not affiliated with, endorsed by, or sponsored by Anthropic.

"Claude" and "Claude Code" are trademarks of Anthropic.

---

## License

MIT (modifications only; see the LICENSE file for the status of the underlying derived code)
@@ -0,0 +1,7 @@
import { join } from 'path'
import { pathToFileURL } from 'url'

// Build a file:// URL import specifier for dist/cli.mjs, resolved relative
// to the given base directory (typically the package's bin/ directory).
export function getDistImportSpecifier(baseDir) {
  const distPath = join(baseDir, '..', 'dist', 'cli.mjs')
  return pathToFileURL(distPath).href
}
@@ -0,0 +1,14 @@
import assert from 'node:assert/strict'
import { join } from 'node:path'
import test from 'node:test'
import { pathToFileURL } from 'node:url'

import { getDistImportSpecifier } from './import-specifier.mjs'

test('builds a file URL import specifier for dist/cli.mjs', () => {
  const binDir = join('/tmp', 'xclaude-import-spec-test', 'bin')
  const specifier = getDistImportSpecifier(binDir)
  const expected = pathToFileURL(join(binDir, '..', 'dist', 'cli.mjs')).href

  assert.equal(specifier, expected)
})
package/bin/xclaude ADDED
@@ -0,0 +1,46 @@
#!/usr/bin/env node

/**
 * OpenClaude — Claude Code with any LLM
 *
 * If dist/cli.mjs exists (built), run that.
 * Otherwise, tell the user to build first or use `bun run dev`.
 */

// Node.js version check. Note: static ESM imports are hoisted and evaluated
// before this block runs, but the imports below only use APIs that exist in
// much older Node versions, so the check still reports cleanly on Node < 20.
{
  const [nodeMajor] = process.versions.node.split('.').map(Number)
  if (nodeMajor < 20) {
    process.stderr.write(
      `\n openclaude requires Node.js 20 or later.\n` +
        ` You are running Node.js ${process.versions.node}.\n` +
        ` Please upgrade: https://nodejs.org/en/download\n\n`,
    )
    process.exit(1)
  }
}

import { existsSync } from 'fs'
import { join, dirname } from 'path'
import { fileURLToPath, pathToFileURL } from 'url'

const __dirname = dirname(fileURLToPath(import.meta.url))
const distPath = join(__dirname, '..', 'dist', 'cli.mjs')

if (existsSync(distPath)) {
  await import(pathToFileURL(distPath).href)
} else {
  console.error(`
openclaude: dist/cli.mjs not found.

Build first:
  bun run build

Or run directly with Bun:
  bun run dev

See README.md for setup instructions.
`)
  process.exit(1)
}