claude-codex-local 0.2.0__tar.gz

@@ -0,0 +1,21 @@
+ MIT License
+
+ Copyright (c) 2024 Luong NGUYEN
+
+ Permission is hereby granted, free of charge, to any person obtaining a copy
+ of this software and associated documentation files (the "Software"), to deal
+ in the Software without restriction, including without limitation the rights
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ copies of the Software, and to permit persons to whom the Software is
+ furnished to do so, subject to the following conditions:
+
+ The above copyright notice and this permission notice shall be included in all
+ copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+ SOFTWARE.
@@ -0,0 +1,299 @@
+ Metadata-Version: 2.4
+ Name: claude-codex-local
+ Version: 0.2.0
+ Summary: Local backend bridge for Claude Code and Codex.
+ License-Expression: MIT
+ Project-URL: Homepage, https://github.com/luongnv89/claude-codex-local
+ Project-URL: Repository, https://github.com/luongnv89/claude-codex-local
+ Project-URL: Bug Tracker, https://github.com/luongnv89/claude-codex-local/issues
+ Project-URL: Changelog, https://github.com/luongnv89/claude-codex-local/blob/main/docs/CHANGELOG.md
+ Keywords: claude,codex,llm,local,ollama,lm-studio,bridge,cli
+ Classifier: Development Status :: 3 - Alpha
+ Classifier: Environment :: Console
+ Classifier: Intended Audience :: Developers
+ Classifier: Programming Language :: Python :: 3
+ Classifier: Programming Language :: Python :: 3.10
+ Classifier: Programming Language :: Python :: 3.11
+ Classifier: Programming Language :: Python :: 3.12
+ Classifier: Topic :: Software Development :: Libraries
+ Classifier: Topic :: Utilities
+ Requires-Python: >=3.10
+ Description-Content-Type: text/markdown
+ License-File: LICENSE
+ Requires-Dist: questionary>=2.0
+ Requires-Dist: rich>=13.0
+ Dynamic: license-file
+
+ [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](LICENSE)
+ [![Python 3.10+](https://img.shields.io/badge/python-3.10%2B-blue.svg)](https://www.python.org/downloads/)
+ [![CI](https://github.com/luongnv89/claude-codex-local/actions/workflows/ci.yml/badge.svg)](https://github.com/luongnv89/claude-codex-local/actions/workflows/ci.yml)
+ [![Code style: ruff](https://img.shields.io/badge/code%20style-ruff-000000.svg)](https://github.com/astral-sh/ruff)
+
+ # Claude Code + Codex, running entirely on your machine
+
+ One alias (`cc` or `cx`) swaps the backend to a local model. Your skills, agents, MCP servers, and config stay untouched.
+
+ [**Get Started →**](#quick-start)
+
+ ---
+
+ ## How It Works
+
+ ```mermaid
+ graph LR
+ A["cc / cx alias"] --> B["helper script<br>.claude-codex-local/bin/cc"]
+ B -->|Ollama| C["ollama launch claude<br>--model gemma4:26b"]
+ B -->|"LM Studio / llama.cpp"| D["inline env vars<br>+ exec claude"]
+ C --> E["~/.claude<br>your real config"]
+ D --> E
+ ```
+
+ The wizard runs once and wires everything up. After that, `cc` just works. Your real `~/.claude` and `~/.codex` are never modified.
+
+ ---
+
+ ## Features
+
+ | Feature | What you get |
+ |---|---|
+ | Ollama first-class | `ollama launch` — no duplicated config, no custom Modelfiles |
+ | Config untouched | All skills, statusline, agents, plugins, and MCP servers carry over |
+ | Smart model selection | `llmfit` analyses your hardware and picks the best quantization that fits |
+ | Resume on failure | Wizard persists progress — `--resume` picks up from the last completed step |
+ | Idempotent aliases | Re-running the wizard replaces the existing alias block, never appends |
+ | Cloud fallback | Run `claude` / `codex` directly (no prefix) to switch back instantly |
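The "replace, never append" alias behavior can be sketched as a marker-delimited block rewrite. This is an illustrative sketch, not the wizard's actual code; the marker strings and alias body are assumptions, demonstrated on a temp file standing in for `~/.zshrc`:

```shell
#!/usr/bin/env bash
# Sketch of an idempotent alias-block update: delete any existing fenced
# block, then append a fresh one, so re-running never stacks duplicates.
# Marker text and the alias body are hypothetical.
set -eu
rc="$(mktemp)"                       # stand-in for ~/.zshrc
begin='# >>> claude-codex-local >>>'
end='# <<< claude-codex-local <<<'

write_alias_block() {
  sed -i.bak "/^$begin\$/,/^$end\$/d" "$rc"
  {
    printf '%s\n' "$begin"
    printf '%s\n' "alias cc='$HOME/.claude-codex-local/bin/cc'"
    printf '%s\n' "$end"
  } >> "$rc"
}

write_alias_block
write_alias_block                    # run twice: still exactly one block
```

Running the function twice leaves exactly one fenced block in the file, which is what lets the wizard be re-run safely.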
+
+ ---
+
+ ## Quick Start
+
+ ### One-command install (no clone required)
+
+ ```bash
+ bash <(curl -sSL https://raw.githubusercontent.com/luongnv89/claude-codex-local/main/install.sh)
+ ```
+
+ Or with wget:
+
+ ```bash
+ bash <(wget -qO- https://raw.githubusercontent.com/luongnv89/claude-codex-local/main/install.sh)
+ ```
+
+ > Use `bash <(...)`, not `curl … | bash`. The wizard is interactive and needs a real TTY — piping steals stdin.
+
+ Override defaults with env vars:
+
+ ```bash
+ CCL_REF=v0.2.0 CCL_INSTALL_DIR=~/tools/claude-codex-local \
+ bash <(curl -sSL https://raw.githubusercontent.com/luongnv89/claude-codex-local/main/install.sh)
+ ```
+
+ ### Install from a clone
+
+ ```bash
+ git clone https://github.com/luongnv89/claude-codex-local.git
+ cd claude-codex-local
+ ```
+
+ ```bash
+ python3 -m venv .venv && source .venv/bin/activate
+ pip install -r requirements.txt
+ ```
+
+ ```bash
+ ./bin/claude-codex-local
+ ```
+
+ ### After setup
+
+ Reload your shell so the alias is available:
+
+ ```bash
+ source ~/.zshrc # or source ~/.bashrc
+ ```
+
+ Then run:
+
+ ```bash
+ cc # Claude Code → local model
+ cx # Codex CLI → local model
+ ```
+
+ ---
+
+ ## Wizard Steps
+
+ ```mermaid
+ graph TD
+ A[1. Discover environment] --> B[2. Install missing components]
+ B --> C[3. Pick harness + engine]
+ C --> D[4. Pick model]
+ D --> E[5. Smoke test engine]
+ E --> F[6. Wire harness]
+ F --> G[7. Install helper + aliases]
+ G --> H[8. Verify launch end-to-end]
+ H --> I[9. Generate guide.md]
+ ```
+
+ See [`guide.example.md`](guide.example.md) for the personalized daily-use guide the wizard generates.
+
+ ---
+
+ ## Usage
+
+ ```bash
+ ./bin/claude-codex-local setup --harness claude --engine ollama # skip prefs picker
+ ./bin/claude-codex-local setup --non-interactive # CI-friendly
+ ./bin/claude-codex-local setup --resume # resume after failure
+ ./bin/claude-codex-local find-model # standalone model recommendation
+ ```
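`--resume` works off the wizard state persisted in `.claude-codex-local/wizard-state.json` (see Architecture details). The file's exact schema is not documented here, so the keys below are assumptions; this sketch shows how a resume check could read a last-completed-step marker:

```shell
#!/usr/bin/env bash
# Hypothetical wizard-state.json and a resume check. The file name comes
# from the docs; the JSON keys are assumptions for illustration.
set -eu
state_dir="$(mktemp -d)/.claude-codex-local"
mkdir -p "$state_dir"
cat > "$state_dir/wizard-state.json" <<'EOF'
{"last_completed_step": 5, "harness": "claude", "engine": "ollama"}
EOF

# Grep-based read to stay dependency-free; the real wizard parses JSON.
last=$(grep -o '"last_completed_step": *[0-9]*' "$state_dir/wizard-state.json" | grep -o '[0-9]*$')
next=$((last + 1))
echo "resuming at step $next"
```

With the sample state above, the check resumes at step 6 of 9.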
+
+ Diagnostic helpers:
+
+ ```bash
+ ./bin/poc-doctor # wizard state + presence check
+ ./bin/poc-machine-profile # full hardware profile as JSON
+ ./bin/poc-recommend # llmfit-only model recommendation
+ ```
+
+ ---
+
+ ## Prerequisites
+
+ - macOS or Linux with zsh or bash
+ - Python 3.10+
+ - At least one harness: [Claude Code](https://claude.ai/code) or [Codex CLI](https://github.com/openai/codex)
+ - At least one engine: [Ollama](https://ollama.com) (recommended), [LM Studio](https://lmstudio.ai), or llama.cpp
+ - [`llmfit`](https://github.com/luongnv89/llmfit) on `PATH` (optional — for automatic model selection)
+
+ ---
+
+ ## Proven Paths
+
+ | Harness | Engine | Model | Status |
+ |---|---|---|---|
+ | Claude Code | Ollama | `gemma4:26b` | Verified end-to-end |
+ | Codex CLI | Ollama | `gemma4:26b` | Verified |
+ | Codex CLI | Ollama | `qwen2.5-coder:0.5b` | Verified |
+ | Claude Code | LM Studio | Qwen3 family | Blocked — `400 thinking.type`; wizard warns and recommends alternatives |
+ | Any | llama.cpp | any | Inline-env code path exists, no live proof yet |
+
+ ---
+
+ ## Rollback
+
+ ```bash
+ # 1. Remove the fenced alias block from ~/.zshrc (the lines between the wizard's markers)
+ # 2. Delete the bridge state:
+ rm -rf .claude-codex-local
+ ```
+
+ That's it. Your `~/.claude` and `~/.codex` are unchanged.
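If you prefer a scripted cleanup of the rc file, the fenced block can be deleted with a sed range expression. The marker text below is an assumption; check your rc file for the exact lines the wizard wrote. The sketch runs against a temp copy so it is safe to try:

```shell
#!/usr/bin/env bash
# Sketch: delete the wizard's fenced alias block between marker lines.
# Marker text is hypothetical; demonstrated on a temp stand-in rc file.
set -eu
rc="$(mktemp)"
printf '%s\n' \
  'alias ll="ls -l"' \
  '# >>> claude-codex-local >>>' \
  "alias cc='example'" \
  '# <<< claude-codex-local <<<' > "$rc"

sed -i.bak '/^# >>> claude-codex-local >>>$/,/^# <<< claude-codex-local <<<$/d' "$rc"
```

Only the lines between (and including) the markers are removed; the rest of the rc file is untouched.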
+
+ ---
+
+ <details>
+ <summary>Architecture details</summary>
+
+ ### Three layers
+
+ 1. **Machine profile + model recommendation** (`poc_bridge.py`) — dumps a JSON snapshot of installed harnesses/engines/llmfit/disk, runs `llmfit` for ranked model recommendations, and provides a `doctor` command for pretty-printing wizard state.
+
+ 2. **Interactive wizard** (`wizard.py`) — 9 steps from discovery to ready-to-use daily alias. Persists progress in `.claude-codex-local/wizard-state.json` so `--resume` picks up after a failure.
+
+ 3. **Helper scripts + shell aliases** — `.claude-codex-local/bin/cc` (or `cx`) is a short bash wrapper. For Ollama it runs `ollama launch claude|codex --model <tag>`. For LM Studio / llama.cpp it sets inline env vars and execs the real harness. A fenced block in `~/.zshrc` / `~/.bashrc` declares the aliases.
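On the Ollama path, the helper amounts to a wrapper along these lines. This is a sketch, not the generated script: `CCL_MODEL` and `DRY_RUN` are illustrative knobs, and the real helper hard-codes the model you picked and execs directly:

```shell
#!/usr/bin/env bash
# Sketch of what .claude-codex-local/bin/cc does on the Ollama path.
# CCL_MODEL and DRY_RUN are hypothetical; the wizard bakes in your model.
MODEL="${CCL_MODEL:-gemma4:26b}"
cmd="ollama launch claude --model $MODEL"
if [ "${DRY_RUN:-1}" = "1" ]; then
  echo "would run: $cmd"
else
  exec $cmd
fi
```

Because the wrapper `exec`s, there is no extra shell process between your terminal and the harness.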
+
+ ### Why `ollama launch`
+
+ `ollama launch claude --model <tag>` is an official Ollama subcommand that sets the right env vars internally and execs the user's real `claude` binary against the local daemon — using `~/.claude` as-is.
+
+ This means:
+ - No duplicated `~/.claude` directory
+ - No custom Modelfile or `ollama create`
+ - No `ANTHROPIC_CUSTOM_MODEL_OPTION` to manage manually
+ - `cc` just works
+
+ ### Claude Code → LM Studio / llama.cpp env vars
+
+ | Env var | LM Studio | llama.cpp |
+ |---|---|---|
+ | `ANTHROPIC_BASE_URL` | `http://localhost:1234` | `http://localhost:8001` |
+ | `ANTHROPIC_API_KEY` | `lmstudio` | `sk-local` |
+ | `ANTHROPIC_CUSTOM_MODEL_OPTION` | `<tag>` | `<tag>` |
+ | `ANTHROPIC_CUSTOM_MODEL_OPTION_NAME` | `Local (lmstudio) <tag>` | `Local (llamacpp) <tag>` |
+ | `CLAUDE_CODE_ATTRIBUTION_HEADER` | `"0"` | `"0"` |
+ | `CLAUDE_CODE_DISABLE_NONESSENTIAL_TRAFFIC` | `"1"` | `"1"` |
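Put together, the LM Studio column of the table becomes an inline-env launch like this. The model tag is a hypothetical example; the final `exec` is commented out so the sketch can be read without Claude Code on `PATH`:

```shell
#!/usr/bin/env bash
# Sketch of the inline-env launch for Claude Code -> LM Studio,
# assembled from the table above. The model tag is hypothetical.
TAG="qwen2.5-coder:7b"
export ANTHROPIC_BASE_URL="http://localhost:1234"
export ANTHROPIC_API_KEY="lmstudio"
export ANTHROPIC_CUSTOM_MODEL_OPTION="$TAG"
export ANTHROPIC_CUSTOM_MODEL_OPTION_NAME="Local (lmstudio) $TAG"
export CLAUDE_CODE_ATTRIBUTION_HEADER="0"
export CLAUDE_CODE_DISABLE_NONESSENTIAL_TRAFFIC="1"
echo "would launch claude via $ANTHROPIC_BASE_URL as $ANTHROPIC_CUSTOM_MODEL_OPTION_NAME"
# exec claude "$@"   # uncomment on a machine with Claude Code installed
```

For llama.cpp, swap in the second column of the table (base URL `http://localhost:8001`, key `sk-local`).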
+
+ ### Codex CLI → Ollama
+
+ ```bash
+ ollama launch codex --model <tag> -- --oss --local-provider=ollama
+ ```
+
+ The `--oss --local-provider=ollama` flags are required after `--` because Codex otherwise tries to route through the ChatGPT account and rejects non-OpenAI model names.
+
+ ### Qwen3 + Claude Code
+
+ Claude Code sends a `thinking` payload that Qwen3 reasoning models interpret as an unterminated `<think>` block. The wizard detects Qwen3 model names at pick time and recommends Gemma 3 or Qwen 2.5 Coder instead.
+
+ </details>
+
+ <details>
+ <summary>Project structure</summary>
+
+ ```
+ .
+ ├── bin/
+ │ ├── claude-codex-local # Main wizard entrypoint
+ │ ├── poc-doctor # Diagnostic: wizard state
+ │ ├── poc-machine-profile # Diagnostic: hardware profile
+ │ └── poc-recommend # Diagnostic: model recommendation
+ ├── scripts/
+ │ └── e2e_smoke.sh # End-to-end smoke test
+ ├── docs/
+ │ ├── poc-wizard.md # 9-step wizard architecture
+ │ ├── poc-architecture.md # System design overview
+ │ ├── poc-bootstrap.md # Bootstrap / install flow
+ │ └── poc-proof.md # Design rationale
+ ├── tests/ # pytest test suite
+ ├── wizard.py # Interactive setup wizard (core logic)
+ ├── poc_bridge.py # Backend bridge / harness wiring
+ ├── install.sh # One-command remote installer
+ └── pyproject.toml # Project metadata and tool config
+ ```
+
+ </details>
+
+ <details>
+ <summary>Tech stack</summary>
+
+ | Layer | Tool |
+ |---|---|
+ | Language | Python 3.10+ |
+ | UI / prompts | [questionary](https://github.com/tmbo/questionary), [rich](https://github.com/Textualize/rich) |
+ | Linting | [ruff](https://github.com/astral-sh/ruff) |
+ | Type checking | [mypy](https://mypy-lang.org) |
+ | Testing | [pytest](https://pytest.org) + pytest-cov |
+ | Security | [bandit](https://github.com/PyCQA/bandit), [detect-secrets](https://github.com/Yelp/detect-secrets) |
+ | Pre-commit | [pre-commit](https://pre-commit.com) |
+
+ </details>
+
+ <details>
+ <summary>Local state</summary>
+
+ Everything written by the bridge goes under `.claude-codex-local/`. Override with `CLAUDE_CODEX_LOCAL_STATE_DIR`.
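For example, the override can point the state at an XDG-style location (the XDG path is a personal preference, not a project requirement; only the variable name comes from the docs):

```shell
# Sketch: relocate all bridge state via the documented override variable.
# The XDG-style target path is an illustrative choice.
export CLAUDE_CODEX_LOCAL_STATE_DIR="${XDG_STATE_HOME:-$HOME/.local/state}/claude-codex-local"
mkdir -p "$CLAUDE_CODEX_LOCAL_STATE_DIR"
```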
+
+ </details>
+
+ <details>
+ <summary>Contributing</summary>
+
+ Contributions are welcome. Read [CONTRIBUTING.md](CONTRIBUTING.md) before opening a PR.
+
+ For security issues, see [SECURITY.md](SECURITY.md).
+
+ </details>
+
+ ---
+
+ [MIT](LICENSE) — © 2024 Luong NGUYEN
@@ -0,0 +1,3 @@
+ """claude-codex-local — local backend bridge for Claude Code and Codex."""
+
+ __version__ = "0.2.0"