@samooth/open-codex 0.2.7 → 0.2.10
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +45 -53
- package/dist/cli.js +284 -282
- package/dist/cli.js.map +4 -4
- package/package.json +1 -1
package/README.md
CHANGED
@@ -290,55 +290,6 @@ npm link
 
 </details>
 
-### Alternative AI Providers
-
-This fork of Codex supports multiple AI providers:
-
-- `openai` (default)
-- `gemini`
-- `openrouter`
-- `ollama`
-- `xai`
-- `deepseek`
-- `hf` (Hugging Face)
-
-To use a different provider, set the `provider` key in your config file:
-
-```json
-{
-  "provider": "gemini"
-}
-```
-
-Or use the `--provider` flag, e.g. `codex --provider gemini`.
-
-#### Ollama Configuration
-
-When using the `ollama` provider, OpenCodex defaults to communicating with a local server at `http://localhost:11434`. You can customize this by setting the **`OLLAMA_BASE_URL`** environment variable:
-
-```bash
-export OLLAMA_BASE_URL="http://192.168.1.50:11434"
-```
-
-For embeddings (used in Project Memory and RAG), OpenCodex defaults to the `nomic-embed-text:latest` model for Ollama. Ensure you have pulled it:
-`ollama pull nomic-embed-text:latest`
-
-#### Dynamic Model Discovery
-
-For many providers, you can use the `/models` command within the interactive chat to see a list of available models and switch between them. For the **Hugging Face** provider, this dynamically fetches the latest `tool-use` compatible models directly from the Hugging Face Hub.
-
-Here's a table of all providers and their default models:
-
-| Provider   | Environment Variable Required  | Default Agentic Model  | Default Full Context Model |
-|------------|--------------------------------|------------------------|----------------------------|
-| openai     | `OPENAI_API_KEY`               | `o4-mini`              | `o3`                       |
-| gemini     | `GOOGLE_GENERATIVE_AI_API_KEY` | `gemini-3-pro-preview` | `gemini-2.5-pro`           |
-| openrouter | `OPENROUTER_API_KEY`           | `openai/o4-mini`       | `openai/o3`                |
-| ollama     | *Not required*                 | *User must specify*    | *User must specify*        |
-| xai        | `XAI_API_KEY`                  | `grok-3-mini-beta`     | `grok-3-beta`              |
-| deepseek   | `DS_API_KEY`                   | `deepseek-chat`        | `deepseek-reasoner`        |
-| hf         | `HF_API_KEY`                   | `moonshotai/Kimi-K2.5` | `moonshotai/Kimi-K2.5`     |
-
 ---
 
 ## Configuration
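The removed section above documents selecting a provider via the `provider` key in the config file. As a minimal sketch of that setup (the `/tmp` path is illustrative; per the README the real file lives at `~/.codex/config.json`, with YAML also accepted):

```shell
# Sketch: write a config file that selects the gemini provider, as the
# README section describes. /tmp/codex-home stands in for $HOME here.
mkdir -p /tmp/codex-home/.codex
cat > /tmp/codex-home/.codex/config.json <<'EOF'
{
  "provider": "gemini"
}
EOF
cat /tmp/codex-home/.codex/config.json
```

Equivalently, the provider can be chosen per invocation with `codex --provider gemini`, leaving the config file untouched.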
@@ -352,9 +303,6 @@ Codex looks for config files in **`~/.codex/`** (either YAML or JSON format). Th
   "provider": "openai", // Default provider
   "approvalMode": "suggest", // or auto-edit, full-auto
   "fullAutoErrorMode": "ask-user", // or ignore-and-continue
-  "enableWebSearch": false, // default is false
-  "enableDeepThinking": false, // adds "Deep Thinking" prefix to prompt
-  "embeddingModel": "text-embedding-3-small", // Custom model for RAG/Memory
   "memory": {
     "enabled": true
   }
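The hunk above removes three config keys in 0.2.10 (`enableWebSearch`, `enableDeepThinking`, `embeddingModel`). A sketch of the config shape that survives, with keys taken from the context lines (values are the defaults shown there; the `//` comments are dropped so the file is strict JSON):

```shell
# Sketch: the config keys that remain after 0.2.10 drops enableWebSearch,
# enableDeepThinking, and embeddingModel. Written to /tmp for illustration.
cat > /tmp/codex-config.json <<'EOF'
{
  "provider": "openai",
  "approvalMode": "suggest",
  "fullAutoErrorMode": "ask-user",
  "memory": {
    "enabled": true
  }
}
EOF
# Confirm none of the removed keys are present:
! grep -E 'enableWebSearch|enableDeepThinking|embeddingModel' /tmp/codex-config.json && echo "removed keys absent"
```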
@@ -363,13 +311,57 @@ Codex looks for config files in **`~/.codex/`** (either YAML or JSON format). Th
 
 You can also define custom instructions:
 
-```
+```md
 # ~/.codex/instructions.md
 
 - Always respond with emojis
 - Only use git commands if I explicitly mention you should
 ```
 
+### Alternative AI Providers
+
+This fork of Codex supports multiple AI providers:
+
+- openai (default)
+- gemini
+- openrouter
+- ollama
+- xai
+- deepseek
+- hf (Hugging Face)
+
+To use a different provider, set the `provider` key in your config file:
+
+```json
+{
+  "provider": "gemini"
+}
+```
+
+OR use the `--provider` flag. eg. `codex --provider gemini`
+
+#### Dynamic Model Discovery
+
+For many providers, you can use the `/models` command within the interactive chat to see a list of available models and switch between them. For the **Hugging Face** provider, this dynamically fetches the latest `tool-use` compatible models directly from the Hugging Face Hub.
+
+Here's a list of all the providers and their default models:
+
+| Provider   | Environment Variable Required | Default Agentic Model        | Default Full Context Model |
+| ---------- | ----------------------------- | ---------------------------- | -------------------------- |
+| openai     | OPENAI_API_KEY                | o4-mini                      | o3                         |
+| gemini     | GOOGLE_GENERATIVE_AI_API_KEY  | gemini-3-pro-preview         | gemini-2.5-pro             |
+| openrouter | OPENROUTER_API_KEY            | openai/o4-mini               | openai/o3                  |
+| ollama     | Not required                  | User must specify            | User must specify          |
+| xai        | XAI_API_KEY                   | grok-3-mini-beta             | grok-3-beta                |
+| deepseek   | DS_API_KEY                    | deepseek-chat                | deepseek-reasoner          |
+| hf         | HF_API_KEY                    | moonshotai/Kimi-K2.5         | moonshotai/Kimi-K2.5       |
+
+#### When using an alternative provider, make sure you have the correct environment variables set.
+
+```bash
+export GOOGLE_GENERATIVE_AI_API_KEY="your-gemini-api-key-here"
+```
+
 ---
 
 ## FAQ