pertrellyclaude 0.1.7

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md ADDED
@@ -0,0 +1,193 @@
+ # OpenClaude
+
+ Use Claude Code with **any LLM** — not just Claude.
+
+ OpenClaude lets you use Claude Code with any LLM through an OpenAI-compatible API shim. Plug in GPT-4o, DeepSeek, Gemini, Llama, Mistral, or any model that speaks the OpenAI chat completions API. It also supports the ChatGPT Codex backend for `codexplan` and `codexspark`, and local inference via [Atomic Chat](https://atomic.chat/) on Apple Silicon.
+
+ All of Claude Code's tools work — bash, file read/write/edit, grep, glob, agents, tasks, MCP — just powered by whatever model you choose.
+
+ ---
+
+ ## Start Here
+
+ If you are new to terminals or just want the easiest path, start with the beginner guides:
+
+ - [Non-Technical Setup](docs/non-technical-setup.md)
+ - [Windows Quick Start](docs/quick-start-windows.md)
+ - [macOS / Linux Quick Start](docs/quick-start-mac-linux.md)
+
+ If you want source builds, Bun workflows, profile launchers, or full provider examples, use:
+
+ - [Advanced Setup](docs/advanced-setup.md)
+
+ ---
+
+ ## Install
+
+ Clone the repository and build:
+
+ ```bash
+ git clone https://github.com/Izaiaspertrelly/openclaude.git
+ cd openclaude
+ npm install
+ npm run build
+ ```
+
+ Then run:
+
+ ```bash
+ npm start
+ # or
+ node bin/openclaude
+ ```
+
+ If you install via npm and later see `ripgrep not found`, install ripgrep system-wide and confirm that `rg --version` works in the same terminal before starting OpenClaude.
+
+ ---
+
+ ## Fastest Setup
+
+ ### Windows PowerShell
+
+ ```powershell
+ git clone https://github.com/Izaiaspertrelly/openclaude.git
+ cd openclaude
+ npm install && npm run build
+
+ $env:CLAUDE_CODE_USE_OPENAI="1"
+ $env:OPENAI_API_KEY="sk-your-key-here"
+ $env:OPENAI_MODEL="gpt-4o"
+
+ node bin/openclaude
+ ```
+
+ ### macOS / Linux
+
+ ```bash
+ git clone https://github.com/Izaiaspertrelly/openclaude.git
+ cd openclaude
+ npm install && npm run build
+
+ export CLAUDE_CODE_USE_OPENAI=1
+ export OPENAI_API_KEY=sk-your-key-here
+ export OPENAI_MODEL=gpt-4o
+
+ node bin/openclaude
+ ```
+
+ That is all you need to start with OpenAI.
+
+ ---
+
+ ## Choose Your Guide
+
+ ### Beginner
+
+ - Want the easiest setup with copy-paste steps: [Non-Technical Setup](docs/non-technical-setup.md)
+ - On Windows: [Windows Quick Start](docs/quick-start-windows.md)
+ - On macOS or Linux: [macOS / Linux Quick Start](docs/quick-start-mac-linux.md)
+
+ ### Advanced
+
+ - Want source builds, Bun, local profiles, runtime checks, or more provider choices: [Advanced Setup](docs/advanced-setup.md)
+
+ ---
+
+ ## Common Beginner Choices
+
+ ### OpenAI
+
+ Best default if you already have an OpenAI API key.
+
+ ### Ollama
+
+ Best if you want to run models locally on your own machine.
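Ollama serves an OpenAI-compatible API at `http://localhost:11434/v1` by default. As a rough sketch (not OpenClaude's code — the helper name and example model are illustrative), this is the request shape such an endpoint accepts:

```javascript
// Sketch only: build an OpenAI-style chat completions request for a local
// Ollama server. 'llama3.3' is an example model name; use any model you
// have pulled with `ollama pull`.
function buildChatRequest(model, prompt, baseUrl = 'http://localhost:11434/v1') {
  return {
    url: `${baseUrl}/chat/completions`,
    options: {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        model,
        messages: [{ role: 'user', content: prompt }],
        stream: true, // OpenClaude streams tokens in real time
      }),
    },
  };
}

const req = buildChatRequest('llama3.3', 'Say hello');
console.log(req.url); // http://localhost:11434/v1/chat/completions
```

Because the endpoint speaks the same protocol as OpenAI, no Ollama-specific code is needed — only the base URL differs.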
+
+ ### Codex
+
+ Best if you already use the Codex CLI or the ChatGPT Codex backend.
+
+ ### Atomic Chat
+
+ Best if you want local inference on Apple Silicon with Atomic Chat. See [Advanced Setup](docs/advanced-setup.md).
+
+ ---
+
+ ## What Works
+
+ - **All tools**: Bash, FileRead, FileWrite, FileEdit, Glob, Grep, WebFetch, WebSearch, Agent, MCP, LSP, NotebookEdit, Tasks
+ - **Streaming**: Real-time token streaming
+ - **Tool calling**: Multi-step tool chains (the model calls tools, gets results, and continues)
+ - **Images**: Base64 and URL images passed to vision models
+ - **Slash commands**: `/commit`, `/review`, `/compact`, `/diff`, `/doctor`, etc.
+ - **Sub-agents**: AgentTool spawns sub-agents using the same provider
+ - **Memory**: Persistent memory system
+
+ ## What's Different
+
+ - **No thinking mode**: Anthropic's extended thinking is disabled (non-Anthropic models expose reasoning differently)
+ - **No prompt caching**: Anthropic-specific cache headers are skipped
+ - **No beta features**: Anthropic-specific beta headers are ignored
+ - **Token limits**: Defaults to 32K max output — some models cap lower, which is handled gracefully
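The token-limit behavior can be pictured with a minimal sketch. The function name and the per-model cap values below are illustrative assumptions, not taken from the shim:

```javascript
// Illustrative sketch of graceful max-output handling: requests default to
// 32K output tokens, and known lower per-model caps win over the default.
// The cap table here is an example, not OpenClaude's actual data.
const DEFAULT_MAX_OUTPUT_TOKENS = 32768; // the 32K default mentioned above
const MODEL_OUTPUT_CAPS = {
  'gpt-4o': 16384, // example of a model that caps below the default
};

function effectiveMaxTokens(model, requested = DEFAULT_MAX_OUTPUT_TOKENS) {
  const cap = MODEL_OUTPUT_CAPS[model] ?? DEFAULT_MAX_OUTPUT_TOKENS;
  return Math.min(requested, cap);
}

console.log(effectiveMaxTokens('gpt-4o')); // 16384
console.log(effectiveMaxTokens('deepseek-chat')); // 32768
```

The point is simply that a lower model-side cap never causes an error; the request is clamped before it is sent.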
+
+ ---
+
+ ## How It Works
+
+ The shim (`src/services/api/openaiShim.ts`) sits between Claude Code and the LLM API:
+
+ ```
+ Claude Code Tool System
+         |
+         v
+ Anthropic SDK interface (duck-typed)
+         |
+         v
+ openaiShim.ts  <-- translates formats
+         |
+         v
+ OpenAI Chat Completions API
+         |
+         v
+ Any compatible model
+ ```
+
+ It translates:
+
+ - Anthropic message blocks → OpenAI messages
+ - Anthropic tool_use/tool_result → OpenAI function calls
+ - OpenAI SSE streaming → Anthropic stream events
+ - Anthropic system prompt arrays → OpenAI system messages
+
+ The rest of Claude Code doesn't know it's talking to a different model.
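As an illustration of the first two translations, here is a simplified sketch — not the actual `openaiShim.ts` source, which handles many more block types and edge cases:

```javascript
// Simplified sketch: convert Anthropic-style messages (whose content can be
// an array of typed blocks) into OpenAI-style chat messages. text blocks
// join into one string; each tool_use block becomes an OpenAI tool call.
function anthropicToOpenAI(messages) {
  return messages.map((m) => {
    // Plain-string content maps straight through.
    if (typeof m.content === 'string') {
      return { role: m.role, content: m.content };
    }
    const text = m.content
      .filter((b) => b.type === 'text')
      .map((b) => b.text)
      .join('\n');
    const toolCalls = m.content
      .filter((b) => b.type === 'tool_use')
      .map((b) => ({
        id: b.id,
        type: 'function',
        function: { name: b.name, arguments: JSON.stringify(b.input) },
      }));
    const out = { role: m.role, content: text || null };
    if (toolCalls.length > 0) out.tool_calls = toolCalls;
    return out;
  });
}

const [msg] = anthropicToOpenAI([{
  role: 'assistant',
  content: [
    { type: 'text', text: 'Let me read that file.' },
    { type: 'tool_use', id: 'tu_1', name: 'FileRead', input: { path: 'README.md' } },
  ],
}]);
console.log(msg.tool_calls[0].function.name); // FileRead
```

The reverse direction (OpenAI SSE chunks back into Anthropic stream events) follows the same idea: each format's fields are mapped onto the other's equivalent, so neither side needs to know about the translation.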
+
+ ---
+
+ ## Model Quality Notes
+
+ Not all models are equal at agentic tool use. Here's a rough guide:
+
+ | Model | Tool Calling | Code Quality | Speed |
+ |-------|--------------|--------------|-------|
+ | GPT-4o | Excellent | Excellent | Fast |
+ | DeepSeek-V3 | Great | Great | Fast |
+ | Gemini 2.0 Flash | Great | Good | Very Fast |
+ | Llama 3.3 70B | Good | Good | Medium |
+ | Mistral Large | Good | Good | Fast |
+ | GPT-4o-mini | Good | Good | Very Fast |
+ | Qwen 2.5 72B | Good | Good | Medium |
+ | Smaller models (<7B) | Limited | Limited | Very Fast |
+
+ For best results, use models with strong function/tool calling support.
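Concretely, "strong function/tool calling support" means the model accepts an OpenAI-style `tools` array like the one below and replies with structured `tool_calls` rather than describing the call in prose. The tool shown is a generic example, not one of Claude Code's actual tool definitions:

```javascript
// An OpenAI-style tool definition: a JSON Schema describing the function's
// parameters. Models with strong tool calling reliably emit tool_calls
// that validate against this schema.
const tools = [
  {
    type: 'function',
    function: {
      name: 'read_file',
      description: 'Read a file from disk and return its contents',
      parameters: {
        type: 'object',
        properties: {
          path: { type: 'string', description: 'Path to the file to read' },
        },
        required: ['path'],
      },
    },
  },
];

console.log(tools[0].function.name); // read_file
```

Weaker models tend to hallucinate arguments, skip required fields, or answer in free text instead of calling the tool, which breaks multi-step tool chains.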
+
+ ---
+
+ ## About
+
+ OpenClaude is provided for educational and research purposes.
+
+ ---
+
+ ## License
+
+ This repository is provided for educational and research purposes. The original source code is subject to Anthropic's terms. The OpenAI shim additions are in the public domain.
@@ -0,0 +1,7 @@
+ import { join } from 'path'
+ import { pathToFileURL } from 'url'
+
+ export function getDistImportSpecifier(baseDir) {
+   const distPath = join(baseDir, '..', 'dist', 'cli.mjs')
+   return pathToFileURL(distPath).href
+ }
@@ -0,0 +1,13 @@
+ import assert from 'node:assert/strict'
+ import test from 'node:test'
+
+ import { getDistImportSpecifier } from './import-specifier.mjs'
+
+ // Note: the expected URL assumes Windows path semantics, where path.join
+ // treats the backslash-separated input as directory segments.
+ test('builds a file URL import specifier for dist/cli.mjs', () => {
+   const specifier = getDistImportSpecifier('C:\\repo\\bin')
+
+   assert.equal(
+     specifier,
+     'file:///C:/repo/dist/cli.mjs',
+   )
+ })
package/bin/openclaude ADDED
@@ -0,0 +1,32 @@
+ #!/usr/bin/env node
+
+ /**
+  * OpenClaude — Claude Code with any LLM
+  *
+  * If dist/cli.mjs exists (built), run that.
+  * Otherwise, tell the user to build first or use `bun run dev`.
+  */
+
+ import { existsSync } from 'fs'
+ import { join, dirname } from 'path'
+ import { fileURLToPath, pathToFileURL } from 'url'
+
+ const __dirname = dirname(fileURLToPath(import.meta.url))
+ const distPath = join(__dirname, '..', 'dist', 'cli.mjs')
+
+ if (existsSync(distPath)) {
+   await import(pathToFileURL(distPath).href)
+ } else {
+   console.error(`
+ openclaude: dist/cli.mjs not found.
+
+ Build first:
+   bun run build
+
+ Or run directly with Bun:
+   bun run dev
+
+ See README.md for setup instructions.
+ `)
+   process.exit(1)
+ }