@hanzo/dev 3.0.6 → 3.0.8
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
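The exact tool used to produce this report is not specified; for reference, a similar comparison can be generated locally with npm's built-in `npm diff` command (npm 7+), which fetches both published versions from the registry. A sketch:

```shell
# Compare two published versions of a package straight from the registry
# (requires npm 7+ and network access)
npm diff --diff=@hanzo/dev@3.0.6 --diff=@hanzo/dev@3.0.8

# Restrict the comparison to a single file
npm diff --diff=@hanzo/dev@3.0.6 --diff=@hanzo/dev@3.0.8 ./package.json
```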
- package/README.md +34 -333
- package/package.json +6 -6
package/README.md
CHANGED
@@ -1,358 +1,59 @@
-<
-
- 
-
-**Every Code** (Code for short) is a fast, local coding agent for your terminal. It's a community-driven fork of `openai/codex` focused on real developer ergonomics: Browser integration, multi-agents, theming, and reasoning control — all while staying compatible with upstream.
-
- 
-## What's new in v0.6.0 (December 2025)
-
-- **Auto Review** – background ghost-commit watcher runs reviews in a separate worktree whenever a turn changes code; uses `codex-5.1-mini-high` and reports issues plus ready-to-apply fixes without blocking the main thread.
-- **Code Bridge** – Sentry-style local bridge that streams errors, console, screenshots, and control from running apps into Code; ships an MCP server; install by asking Code to pull `https://github.com/just-every/code-bridge`.
-- **Plays well with Auto Drive** – reviews run in parallel with long Auto Drive tasks so quality checks land while the flow keeps moving.
-- **Quality-first focus** – the release shifts emphasis from "can the model write this file" to "did we verify it works".
-- _From v0.5.0:_ rename to Every Code, upgraded `/auto` planning/recovery, unified `/settings`, faster streaming/history with card-based activity, and more reliable `/resume` + `/undo`.
-
-[Read the full notes in RELEASE_NOTES.md](docs/release-notes/RELEASE_NOTES.md)
-
- 
-## Why Every Code
-
-- 🚀 **Auto Drive orchestration** – Multi-agent automation that now self-heals and ships complete tasks.
-- 🌐 **Browser Integration** – CDP support, headless browsing, screenshots captured inline.
-- 🤖 **Multi-agent commands** – `/plan`, `/code` and `/solve` coordinate multiple CLI agents.
-- 🧭 **Unified settings hub** – `/settings` overlay for limits, theming, approvals, and provider wiring.
-- 🎨 **Theme system** – Switch between accessible presets, customize accents, and preview live via `/themes`.
-- 🔌 **MCP support** – Extend with filesystem, DBs, APIs, or your own tools.
-- 🔒 **Safety modes** – Read-only, approvals, and workspace sandboxing.
-
- 
-## AI Videos
-
- 
-<p align="center">
-<a href="https://www.youtube.com/watch?v=Ra3q8IVpIOc">
-<img src="docs/images/video-auto-review-play.jpg" alt="Play Auto Review video" width="100%">
-</a><br>
-<strong>Auto Review</strong>
-</p>
-
- 
+<p align="center"><code>npm i -g @openai/codex</code><br />or <code>brew install --cask codex</code></p>
+<p align="center"><strong>Codex CLI</strong> is a coding agent from OpenAI that runs locally on your computer.
 <p align="center">
-<
-<img src="docs/images/video-auto-drive-new-play.jpg" alt="Play Introducing Auto Drive video" width="100%">
-</a><br>
-<strong>Auto Drive Overview</strong>
+<img src="https://github.com/openai/codex/blob/main/.github/codex-cli-splash.png" alt="Codex CLI splash" width="80%" />
 </p>
+</br>
+If you want Codex in your code editor (VS Code, Cursor, Windsurf), <a href="https://developers.openai.com/codex/ide">install in your IDE.</a>
+</br>If you are looking for the <em>cloud-based agent</em> from OpenAI, <strong>Codex Web</strong>, go to <a href="https://chatgpt.com/codex">chatgpt.com/codex</a>.</p>
 
-
-<p align="center">
-<a href="https://youtu.be/sV317OhiysQ">
-<img src="docs/images/video-v03-play.jpg" alt="Play Multi-Agent Support video" width="100%">
-</a><br>
-<strong>Multi-Agent Promo</strong>
-</p>
-
-
+---
 
- 
 ## Quickstart
 
-###
-
-```bash
-npx -y @just-every/code
-```
-
-### Install & Run
-
-```bash
-npm install -g @just-every/code
-code // or `coder` if you're using VS Code
-```
-
-Note: If another tool already provides a `code` command (e.g. VS Code), our CLI is also installed as `coder`. Use `coder` to avoid conflicts.
-
-**Authenticate** (one of the following):
-- **Sign in with ChatGPT** (Plus/Pro/Team; uses models available to your plan)
-- Run `code` and pick "Sign in with ChatGPT"
-- **API key** (usage-based)
-- Set `export OPENAI_API_KEY=xyz` and run `code`
-
-### Install Claude & Gemini (optional)
-
-Every Code supports orchestrating other AI CLI tools. Install these and config to use alongside Code.
-
-```bash
-# Ensure Node.js 20+ is available locally (installs into ~/.n)
-npm install -g n
-export N_PREFIX="$HOME/.n"
-export PATH="$N_PREFIX/bin:$PATH"
-n 20.18.1
-
-# Install the companion CLIs
-export npm_config_prefix="${npm_config_prefix:-$HOME/.npm-global}"
-mkdir -p "$npm_config_prefix/bin"
-export PATH="$npm_config_prefix/bin:$PATH"
-npm install -g @anthropic-ai/claude-code @google/gemini-cli @qwen-code/qwen-code
-
-# Quick smoke tests
-claude --version
-gemini --version
-qwen --version
-```
-
-> ℹ️ Add `export N_PREFIX="$HOME/.n"` and `export PATH="$N_PREFIX/bin:$PATH"` (plus the `npm_config_prefix` bin path) to your shell profile so the CLIs stay on `PATH` in future sessions.
-
- 
-## Commands
-
-### Browser
-```bash
-# Connect code to external Chrome browser (running CDP)
-/chrome # Connect with auto-detect port
-/chrome 9222 # Connect to specific port
-
-# Switch to internal browser mode
-/browser # Use internal headless browser
-/browser https://example.com # Open URL in internal browser
-```
-
-### Agents
-```bash
-# Plan code changes (Claude, Gemini and GPT-5 consensus)
-# All agents review task and create a consolidated plan
-/plan "Stop the AI from ordering pizza at 3AM"
-
-# Solve complex problems (Claude, Gemini and GPT-5 race)
-# Fastest preferred (see https://arxiv.org/abs/2505.17813)
-/solve "Why does deleting one user drop the whole database?"
-
-# Write code! (Claude, Gemini and GPT-5 consensus)
-# Creates multiple worktrees then implements the optimal solution
-/code "Show dark mode when I feel cranky"
-```
-
-### Auto Drive
-```bash
-# Hand off a multi-step task; Auto Drive will coordinate agents and approvals
-/auto "Refactor the auth flow and add device login"
-
-# Resume or inspect an active Auto Drive run
-/auto status
-```
-
-### General
-```bash
-# Try a new theme!
-/themes
-
-# Change reasoning level
-/reasoning low|medium|high
-
-# Switch models or effort presets
-/model
-
-# Start new conversation
-/new
-```
+### Installing and running Codex CLI
 
-
+Install globally with your preferred package manager:
 
 ```shell
-
-
-Options:
---model <name> Override the model for the active provider (e.g. gpt-5.1)
---read-only Prevent file modifications
---no-approval Skip approval prompts (use with caution)
---config <key=val> Override config values
---oss Use local open source models
---sandbox <mode> Set sandbox level (read-only, workspace-write, etc.)
---help Show help information
---debug Log API requests and responses to file
---version Show version number
-```
-
-Note: `--model` only changes the model name sent to the active provider. To use a different provider, set `model_provider` in `config.toml`. Providers must expose an OpenAI-compatible API (Chat Completions or Responses).
-
- 
-## Memory & project docs
-
-Every Code can remember context across sessions:
-
-1. **Create an `AGENTS.md` or `CLAUDE.md` file** in your project root:
-```markdown
-# Project Context
-This is a React TypeScript application with:
-- Authentication via JWT
-- PostgreSQL database
-- Express.js backend
-
-## Key files:
-- `/src/auth/` - Authentication logic
-- `/src/api/` - API client code
-- `/server/` - Backend services
+# Install using npm
+npm install -g @openai/codex
21
|
```
|
|
197
22
|
|
|
198
|
-
2. **Session memory**: Every Code maintains conversation history
|
|
199
|
-
3. **Codebase analysis**: Automatically understands project structure
|
|
200
|
-
|
|
201
|
-
 
|
|
202
|
-
## Non-interactive / CI mode
|
|
203
|
-
|
|
204
|
-
For automation and CI/CD:
|
|
205
|
-
|
|
206
23
|
```shell
|
|
-#
-
-
-# Generate reports
-code --read-only "analyze code quality and generate report"
-
-# Batch processing
-code --config output_format=json "list all TODO comments"
-```
-
- 
-## Model Context Protocol (MCP)
-
-Every Code supports MCP for extended capabilities:
-
-- **File operations**: Advanced file system access
-- **Database connections**: Query and modify databases
-- **API integrations**: Connect to external services
-- **Custom tools**: Build your own extensions
-
-Configure MCP in `~/.code/config.toml` Define each server under a named table like `[mcp_servers.<name>]` (this maps to the JSON `mcpServers` object used by other clients):
-
-```toml
-[mcp_servers.filesystem]
-command = "npx"
-args = ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/project"]
-```
-
- 
-## Configuration
-
-Main config file: `~/.code/config.toml`
-
-> [!NOTE]
-> Every Code reads from both `~/.code/` and `~/.codex/` for backwards compatibility, but it only writes updates to `~/.code/`. If you switch back to Codex and it fails to start, remove `~/.codex/config.toml`. If Every Code appears to miss settings after upgrading, copy your legacy `~/.codex/config.toml` into `~/.code/`.
-
-```toml
-# Model settings
-model = "gpt-5.1"
-model_provider = "openai"
-
-# Behavior
-approval_policy = "on-request" # untrusted | on-failure | on-request | never
-model_reasoning_effort = "medium" # low | medium | high
-sandbox_mode = "workspace-write"
-
-# UI preferences see THEME_CONFIG.md
-[tui.theme]
-name = "light-photon"
-
-# Add config for specific models
-[profiles.gpt-5]
-model = "gpt-5.1"
-model_provider = "openai"
-approval_policy = "never"
-model_reasoning_effort = "high"
-model_reasoning_summary = "detailed"
+# Install using Homebrew
+brew install --cask codex
 ```
 
-
-
-- `CODE_HOME`: Override config directory location
-- `OPENAI_API_KEY`: Use API key instead of ChatGPT auth
-- `OPENAI_BASE_URL`: Use OpenAI-compatible API endpoints (chat or responses)
-- `OPENAI_WIRE_API`: Force the built-in OpenAI provider to use `chat` or `responses` wiring
-
- 
-## FAQ
-
-**How is this different from the original?**
-> This fork adds browser integration, multi-agent commands (`/plan`, `/solve`, `/code`), theme system, and enhanced reasoning controls while maintaining full compatibility.
+Then simply run `codex` to get started.
 
-
->
+<details>
+<summary>You can also go to the <a href="https://github.com/openai/codex/releases/latest">latest GitHub Release</a> and download the appropriate binary for your platform.</summary>
 
-
-> Absolutely. Use the same "Sign in with ChatGPT" flow as the original.
+Each GitHub Release contains many executables, but in practice, you likely want one of these:
 
-
-
+- macOS
+  - Apple Silicon/arm64: `codex-aarch64-apple-darwin.tar.gz`
+  - x86_64 (older Mac hardware): `codex-x86_64-apple-darwin.tar.gz`
+- Linux
+  - x86_64: `codex-x86_64-unknown-linux-musl.tar.gz`
+  - arm64: `codex-aarch64-unknown-linux-musl.tar.gz`
 
-
-## Contributing
+Each archive contains a single entry with the platform baked into the name (e.g., `codex-x86_64-unknown-linux-musl`), so you likely want to rename it to `codex` after extracting it.
 
-
+</details>
 
-###
+### Using Codex with your ChatGPT plan
 
-
-# Clone and setup
-git clone https://github.com/just-every/code.git
-cd code
-npm install
+Run `codex` and select **Sign in with ChatGPT**. We recommend signing into your ChatGPT account to use Codex as part of your Plus, Pro, Team, Edu, or Enterprise plan. [Learn more about what's included in your ChatGPT plan](https://help.openai.com/en/articles/11369540-codex-in-chatgpt).
 
-
-./build-fast.sh
-
-# Run locally
-./code-rs/target/dev-fast/code
-```
-
-#### Git hooks
-
-This repo ships shared hooks under `.githooks/`. To enable them locally:
-
-```bash
-git config core.hooksPath .githooks
-```
+You can also use Codex with an API key, but this requires [additional setup](https://developers.openai.com/codex/auth#sign-in-with-an-api-key).
 
-
+## Docs
 
-
+- [**Codex Documentation**](https://developers.openai.com/codex)
+- [**Contributing**](./docs/contributing.md)
+- [**Installing & building**](./docs/install.md)
+- [**Open source fund**](./docs/open-source-fund.md)
 
-
-2. Create a feature branch: `git checkout -b feature/amazing-feature`
-3. Make your changes
-4. Run tests: `cargo test`
-5. Build successfully: `./build-fast.sh`
-6. Submit a pull request
-
-
- 
-## Legal & Use
-
-### License & attribution
-- This project is a community fork of `openai/codex` under **Apache-2.0**. We preserve upstream LICENSE and NOTICE files.
-- **Every Code** (Code) is **not** affiliated with, sponsored by, or endorsed by OpenAI.
-
-### Your responsibilities
-Using OpenAI, Anthropic or Google services through Every Code means you agree to **their Terms and policies**. In particular:
-- **Don't** programmatically scrape/extract content outside intended flows.
-- **Don't** bypass or interfere with rate limits, quotas, or safety mitigations.
-- Use your **own** account; don't share or rotate accounts to evade limits.
-- If you configure other model providers, you're responsible for their terms.
-
-### Privacy
-- Your auth file lives at `~/.code/auth.json`
-- Inputs/outputs you send to AI providers are handled under their Terms and Privacy Policy; consult those documents (and any org-level data-sharing settings).
-
-### Subject to change
-AI providers can change eligibility, limits, models, or authentication flows. Every Code supports **both** ChatGPT sign-in and API-key modes so you can pick what fits (local/hobby vs CI/automation).
-
- 
-## License
-
-Apache 2.0 - See [LICENSE](LICENSE) file for details.
-
-Every Code is a community fork of the original Codex CLI. We maintain compatibility while adding enhanced features requested by the developer community.
-
- 
----
-**Need help?** Open an issue on [GitHub](https://github.com/just-every/code/issues) or check our documentation.
+This repository is licensed under the [Apache-2.0 License](LICENSE).
package/package.json
CHANGED
@@ -1,6 +1,6 @@
 {
   "name": "@hanzo/dev",
-  "version": "3.0.6",
+  "version": "3.0.8",
   "license": "Apache-2.0",
   "description": "Hanzo AI coding assistant - intelligent CLI for developers",
   "bin": {
@@ -35,10 +35,10 @@
     "prettier": "^3.3.3"
   },
   "optionalDependencies": {
-    "@hanzo/dev-darwin-arm64": "3.0.6",
-    "@hanzo/dev-darwin-x64": "3.0.6",
-    "@hanzo/dev-linux-x64-musl": "3.0.6",
-    "@hanzo/dev-linux-arm64-musl": "3.0.6",
-    "@hanzo/dev-win32-x64": "3.0.6"
+    "@hanzo/dev-darwin-arm64": "3.0.8",
+    "@hanzo/dev-darwin-x64": "3.0.8",
+    "@hanzo/dev-linux-x64-musl": "3.0.8",
+    "@hanzo/dev-linux-arm64-musl": "3.0.8",
+    "@hanzo/dev-win32-x64": "3.0.8"
   }
 }