zencefyl 0.2.8 → 0.2.9

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,47 +1,181 @@
- # zencefyl
+ <div align="center">
+ <img src="assets/duck-mascot.svg" width="160" alt="Zencefyl duck mascot" />
+ <h1>zencefyl</h1>
+ <p><strong>Personal AI engineering companion for the terminal.</strong></p>
+ <p>Workspace-aware, memory-backed, knowledge-tracking, document-building, and actually pleasant to use.</p>

- Personal AI engineering companion. Knows what you know, corrects you when you're wrong, and grows with you over time.
-
- **No API key required.** Runs on your existing Claude.ai subscription via Claude Code.
+ <p>
+ <a href="https://www.npmjs.com/package/zencefyl"><img alt="npm version" src="https://img.shields.io/npm/v/zencefyl?color=111111&label=npm"></a>
+ <img alt="node" src="https://img.shields.io/badge/node-18%2B-111111">
+ <img alt="ui" src="https://img.shields.io/badge/interface-terminal-111111">
+ <img alt="memory" src="https://img.shields.io/badge/memory-workspace--aware-111111">
+ </p>
+ </div>

 ---

- ## Prerequisites
+ ## What It Is
+
+ `zencefyl` is a terminal AI companion that does more than answer prompts.
+
+ It can:
+ - remember what happened in the same workspace across sessions
+ - track what you know, where you struggle, and what is due for review
+ - generate serious study guides, reports, and LaTeX/PDF documents
+ - switch between multiple providers and local/hosted model paths
+ - keep the interface interactive instead of dumping walls of flat text
+
+ And yes, the duck is part of the product. It is the repo mascot and the terminal omen.
+
+ ## Why It Feels Different

- - Node.js 18+
- - [pnpm](https://pnpm.io) — `npm install -g pnpm`
- - [Claude Code](https://claude.ai/code) installed and logged in with your Claude.ai account
+ Most CLI agents are good at "answer and act."
+
+ `zencefyl` aims to be better at:
+ - continuity
+ - learning
+ - document output
+ - interactive terminal UX
+ - being useful in the same repo over time
+
+ That means the product is shaped around:
+ - workspace memory
+ - knowledge graph + review
+ - project recap and resume
+ - approval flows
+ - guided panels
+ - exportable artifacts

 ## Install

+ ### npm
+
+ ```bash
+ npm install -g zencefyl
+ zencefyl
+ ```
+
+ ### From source
+
 ```bash
 git clone https://github.com/bartugundogdu/zencefyl
 cd zencefyl
 pnpm install
 pnpm build
 pnpm link --global
+ zencefyl
 ```

- If `pnpm link --global` fails with a missing bin directory error, run `pnpm setup` first, then `source ~/.bashrc`, then re-run `pnpm link --global`.
-
- ## Run
+ If `pnpm link --global` fails because the global bin directory is missing, run:

 ```bash
- zencefyl
+ pnpm setup
 ```

- Works from any directory. Type `exit` or press Ctrl+C to quit.
+ Then reload your shell (e.g. `source ~/.bashrc`) and re-run `pnpm link --global`.
+
+ ## Core Features
+
+ ### Workspace continuity
+ - remembers recent work in the same directory
+ - startup recap for known workspaces
+ - `/resume` and `/timeline` for picking up where you left off
+ - project-first memory retrieval instead of global memory noise
+
+ ### Knowledge tracking
+ - evidence-backed knowledge graph
+ - mastery separated from raw retrievability
+ - due-topic review with FSRS
+ - gaps, corrections, and profile memory
+
+ ### Document studio
+ - study guides
+ - reports
+ - textbooks
+ - revision-aware rebuilds
+ - LaTeX-first PDF generation

- ## Switch to direct API (optional)
+ ### Interactive terminal UX
+ - structured panels instead of flat dumps
+ - expandable long outputs with `Ctrl+O`
+ - model/provider browser
+ - approval prompts with keyboard navigation
+ - duck companion in the corner

- If you'd rather use the Anthropic API directly instead of Claude Code:
+ ## Providers

- ```json
- // ~/.zencefyl/config.json
- {
- "provider": "anthropic",
- "apiKey": "sk-ant-..."
- }
+ `zencefyl` supports multiple model paths, including:
+ - Claude Code
+ - Anthropic API
+ - OpenAI subscription/API paths
+ - Gemini subscription
+ - Hugging Face browsing/inference flows
+ - Ollama
+ - local Transformers.js
+
+ The `/model` panel is the control center for switching and browsing.
+
+ ## Commands
+
+ The command surface is intentionally kept small, with each command made stronger over time.
+
+ Current core commands include:
+ - `/help`
+ - `/model`
+ - `/settings`
+ - `/resume`
+ - `/timeline`
+ - `/review`
+ - `/knowledge`
+ - `/gaps`
+ - `/profile`
+ - `/corrections`
+ - `/export`
+ - `/save`
+ - `/forget`
+ - `/prune`
+ - `/doctor`
+ - `/remap`
+
+ ## Example Uses
+
+ ```text
+ can you fix the failing build in this repo?
+ ```
+
+ ```text
+ make me a deep study guide on stationary systolic array types and export it to pdf
+ ```
+
+ ```text
+ what were we doing in this workspace last time?
 ```

- Or set `ANTHROPIC_API_KEY` in your environment.
+ ```text
+ review my recent corrections and tell me what patterns you see
+ ```
+
+ ## Product Direction
+
+ The project is actively pushing toward:
+ - stronger repo understanding
+ - better diff/review UX
+ - better workspace memory
+ - richer panels and terminal interaction
+ - higher-quality document output
+
+ The goal is not to become another generic coding CLI.
+
+ The goal is to become a real personal engineering companion.
+
+ ## Notes
+
+ - Node.js `18+` is required.
+ - The tool does not auto-start after install; run `zencefyl` explicitly.
+ - If a capability requires system setup, `zencefyl` is meant to guide you through it honestly rather than pretend it already exists.
+
+ ## Repo
+
+ - Source: `src/`
+ - Internal reports and development notes: `docs/internal/`
+ - Private/internal docs are intentionally kept separate from public repo-facing docs.
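The direct-API instructions removed above correspond to a small config file. A sketch of `~/.zencefyl/config.json`, based on the removed example and on comments in the bundled source ("Override model IDs in ~/.zencefyl/config.json"; the optional `models` block with fast/default/deep tiers is an assumption drawn from those comments, not a documented schema):

```json
{
  "provider": "anthropic",
  "apiKey": "sk-ant-...",
  "models": {
    "fast": "claude-haiku-4-5-20251001",
    "default": "claude-sonnet-4-6",
    "deep": "claude-opus-4-6"
  }
}
```

The previous README also documented setting `ANTHROPIC_API_KEY` in the environment as an alternative to the `apiKey` field.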
@@ -137,6 +137,11 @@ var MOONSHOT_MODELS = {
 deep: "kimi-k1.5"
 // strongest reasoning with thinking
 };
+ var HUGGINGFACE_MODELS = {
+ fast: "Qwen/Qwen2.5-7B-Instruct",
+ default: "Qwen/Qwen2.5-32B-Instruct",
+ deep: "meta-llama/Llama-3.3-70B-Instruct"
+ };
 var MODEL_REGISTRY = [
 // ── Anthropic (claude-code or anthropic provider) ─────────────────────────
 { id: "claude-sonnet-4-6", label: "Claude Sonnet 4.6", provider: "anthropic" },
@@ -208,7 +213,8 @@ var PROVIDER_LABELS = {
 "gemini-subscription": "Gemini (subscription)",
 "ollama": "Ollama (local)",
 "local-transformers": "Local Models (transformers.js)",
- "moonshot": "Moonshot (Kimi)"
+ "moonshot": "Moonshot (Kimi)",
+ "huggingface": "Hugging Face"
 };

 export {
@@ -220,7 +226,8 @@ export {
 GEMINI_SUBSCRIPTION_MODELS,
 OLLAMA_MODELS_DEFAULT,
 MOONSHOT_MODELS,
+ HUGGINGFACE_MODELS,
 MODEL_REGISTRY,
 PROVIDER_LABELS
 };
- //# sourceMappingURL=chunk-CZ4NCT3D.js.map
+ //# sourceMappingURL=chunk-Q5BQ7N7F.js.map
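The new `HUGGINGFACE_MODELS` table follows the same fast/default/deep tier shape as the other provider defaults in this chunk. A minimal sketch of how such a tier map is typically consumed (`resolveModel` is a hypothetical helper for illustration, not part of the package's API):

```typescript
// Tier map shape shared by all per-provider default tables in the chunk.
type Tier = "fast" | "default" | "deep";
type ModelConfig = Record<Tier, string>;

// The table added in this version of the bundle.
const HUGGINGFACE_MODELS: ModelConfig = {
  fast: "Qwen/Qwen2.5-7B-Instruct",
  default: "Qwen/Qwen2.5-32B-Instruct",
  deep: "meta-llama/Llama-3.3-70B-Instruct",
};

// Look up a tier, falling back to "default" for unknown tier names.
function resolveModel(config: ModelConfig, tier: string): string {
  return (config as Record<string, string>)[tier] ?? config.default;
}

console.log(resolveModel(HUGGINGFACE_MODELS, "deep"));
```

The fallback-to-default behaviour here is a design sketch; the real picker routes selection through the `/model` panel described in the README.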
@@ -0,0 +1 @@
+ {"version":3,"sources":["../src/constants/models.ts"], ...} (new source map for the models chunk; the embedded sourcesContent mirrors src/constants/models.ts and is omitted here along with the mappings)
@@ -244,6 +244,15 @@ function isExpired(creds) {
 }
 async function getAccessToken(dataDir, providerId) {
 const store = loadCredentials(dataDir);
+ if (providerId === "huggingface") {
+ const token = store["huggingface"]?.token ?? process.env["HUGGINGFACE_API_KEY"] ?? process.env["HF_TOKEN"] ?? "";
+ if (!token) {
+ throw new Error(
+ "No Hugging Face token found. Run zencefyl setup and choose Hugging Face, or set HUGGINGFACE_API_KEY / HF_TOKEN."
+ );
+ }
+ return token;
+ }
 const creds = store[providerId];
 if (!creds) {
 throw new Error(
@@ -280,4 +289,4 @@ export {
 ensureGeminiProjectId,
 getAccessToken
 };
- //# sourceMappingURL=chunk-JZA53FAX.js.map
+ //# sourceMappingURL=chunk-VIJFYNJO.js.map
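The added `huggingface` branch resolves the token in a fixed fallback order: stored credential first, then `HUGGINGFACE_API_KEY`, then `HF_TOKEN`, and it fails loudly when none is present. A self-contained sketch of that order (`resolveHfToken` is illustrative, not an export of the package):

```typescript
// Minimal shape of the relevant slice of credentials.json.
type CredentialsStore = { huggingface?: { token: string } };

// Resolve a Hugging Face token using the same fallback chain as the diff:
// stored credential -> HUGGINGFACE_API_KEY -> HF_TOKEN -> error.
function resolveHfToken(
  store: CredentialsStore,
  env: Record<string, string | undefined>,
): string {
  const token =
    store.huggingface?.token ?? env["HUGGINGFACE_API_KEY"] ?? env["HF_TOKEN"] ?? "";
  if (!token) {
    // Mirrors the bundled code: fail with setup guidance instead of
    // silently returning an empty token.
    throw new Error("No Hugging Face token found. Set HUGGINGFACE_API_KEY or HF_TOKEN.");
  }
  return token;
}
```

Note that a stored credential always wins over environment variables, so re-running setup is the way to replace a stale saved token.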
@@ -0,0 +1 @@
+ {"version":3,"sources":["../src/auth/credentials.ts"], ...} (new source map for the credentials chunk; the embedded sourcesContent mirrors src/auth/credentials.ts and is omitted here along with the mappings)
{ duetProject: envProjectId } : {}),\n },\n }\n\n const loadResponse = await fetch(`${GOOGLE_CODE_ASSIST_ENDPOINT}/v1internal:loadCodeAssist`, {\n method: 'POST',\n headers,\n body: JSON.stringify(loadBody),\n })\n\n let data: {\n currentTier?: { id?: string }\n cloudaicompanionProject?: string | { id?: string }\n allowedTiers?: Array<{ id?: string; isDefault?: boolean }>\n }\n\n if (!loadResponse.ok) {\n let payload: unknown = null\n try {\n payload = await loadResponse.clone().json()\n } catch {\n payload = null\n }\n\n if (isVpcScAffected(payload)) {\n data = { currentTier: { id: GOOGLE_TIER_STANDARD } }\n } else {\n const text = await loadResponse.text()\n throw new Error(`Gemini loadCodeAssist failed: ${loadResponse.status} ${loadResponse.statusText}${text ? `: ${text}` : ''}`)\n }\n } else {\n data = await loadResponse.json() as typeof data\n }\n\n if (data.currentTier) {\n const project = data.cloudaicompanionProject\n if (typeof project === 'string' && project.trim()) return project.trim()\n if (typeof project === 'object' && typeof project?.id === 'string' && project.id.trim()) return project.id.trim()\n if (envProjectId) return envProjectId\n throw new Error(\n 'This Gemini account requires GOOGLE_CLOUD_PROJECT, GOOGLE_CLOUD_PROJECT_ID, or an active gcloud project.'\n )\n }\n\n const tierId = getDefaultGeminiTier(data.allowedTiers).id ?? 
GOOGLE_TIER_FREE\n if (tierId !== GOOGLE_TIER_FREE && !envProjectId) {\n throw new Error(\n 'This Gemini account requires GOOGLE_CLOUD_PROJECT, GOOGLE_CLOUD_PROJECT_ID, or an active gcloud project.'\n )\n }\n\n const onboardBody: Record<string, unknown> = {\n tierId,\n metadata: {\n ideType: 'IDE_UNSPECIFIED',\n platform: 'PLATFORM_UNSPECIFIED',\n pluginType: 'GEMINI',\n },\n }\n\n if (tierId !== GOOGLE_TIER_FREE && envProjectId) {\n onboardBody['cloudaicompanionProject'] = envProjectId\n ;(onboardBody['metadata'] as Record<string, unknown>)['duetProject'] = envProjectId\n }\n\n const onboardResponse = await fetch(`${GOOGLE_CODE_ASSIST_ENDPOINT}/v1internal:onboardUser`, {\n method: 'POST',\n headers,\n body: JSON.stringify(onboardBody),\n })\n\n if (!onboardResponse.ok) {\n const text = await onboardResponse.text()\n throw new Error(`Gemini onboardUser failed: ${onboardResponse.status} ${onboardResponse.statusText}${text ? `: ${text}` : ''}`)\n }\n\n let operation = await onboardResponse.json() as {\n done?: boolean\n name?: string\n response?: { cloudaicompanionProject?: { id?: string } }\n }\n\n if (!operation.done && operation.name) {\n operation = await pollGeminiProjectOperation(operation.name, headers)\n }\n\n const projectId = operation.response?.cloudaicompanionProject?.id\n if (typeof projectId === 'string' && projectId.trim()) return projectId.trim()\n if (envProjectId) return envProjectId\n\n throw new Error(\n 'Could not discover or provision a Gemini Code Assist project. Set GOOGLE_CLOUD_PROJECT or select a gcloud project.'\n )\n}\n\n// Gemini CLI / Code Assist can require a project that is not present in the\n// OAuth payload. 
We resolve it lazily from env, gcloud config, or by asking the\n// Cloud Code Assist backend to discover/provision one for the authenticated user.\nexport async function ensureGeminiProjectId(\n accessToken: string,\n value?: string | null,\n): Promise<string> {\n const existing = resolveGeminiProjectId(value)\n if (existing) return existing\n return await discoverGeminiProjectIdWithAccessToken(accessToken)\n}\n\n// ── Token refresh ─────────────────────────────────────────────────────────────\n\n// Buffer in ms — refresh 60s before actual expiry so requests never hit an\n// expired token mid-flight.\nconst REFRESH_BUFFER_MS = 60_000\n\nfunction isExpired(creds: OAuthCredentials): boolean {\n return Date.now() >= creds.expires - REFRESH_BUFFER_MS\n}\n\n// Returns a valid access token, refreshing first if needed.\n// Persists updated credentials back to disk on refresh.\nexport async function getAccessToken(\n dataDir: string,\n providerId: 'openai-subscription',\n): Promise<string>\nexport async function getAccessToken(\n dataDir: string,\n providerId: 'gemini-subscription',\n): Promise<string>\nexport async function getAccessToken(\n dataDir: string,\n providerId: 'huggingface',\n): Promise<string>\nexport async function getAccessToken(dataDir: string, providerId: ProviderId): Promise<string> {\n const store = loadCredentials(dataDir)\n if (providerId === 'huggingface') {\n const token = store['huggingface']?.token\n ?? process.env['HUGGINGFACE_API_KEY']\n ?? process.env['HF_TOKEN']\n ?? ''\n\n if (!token) {\n throw new Error(\n 'No Hugging Face token found. Run zencefyl setup and choose Hugging Face, or set HUGGINGFACE_API_KEY / HF_TOKEN.'\n )\n }\n\n return token\n }\n\n const creds = store[providerId]\n\n if (!creds) {\n throw new Error(\n `No credentials for ${providerId}. 
` +\n `Run zencefyl setup and choose the subscription option.`\n )\n }\n\n if (!isExpired(creds)) {\n return creds.access\n }\n\n // Token expired — refresh it\n let refreshed: OAuthCredentials\n\n if (providerId === 'openai-subscription') {\n const { refreshOpenAICodexToken } = await import('@mariozechner/pi-ai/oauth')\n refreshed = await refreshOpenAICodexToken(creds.refresh)\n } else {\n const { refreshGoogleCloudToken } = await import('@mariozechner/pi-ai/oauth')\n const projectId = await ensureGeminiProjectId(\n creds.access,\n (creds as GeminiCredentials).projectId ?? null,\n )\n refreshed = await refreshGoogleCloudToken(creds.refresh, projectId)\n // Preserve projectId through refresh\n ;(refreshed as GeminiCredentials).projectId = projectId\n }\n\n saveProviderCredentials(dataDir, providerId, refreshed)\n return refreshed.access\n}\n"],"mappings":";;;AAWA,OAAO,QAAU;AACjB,OAAO,UAAU;AACjB,OAAO,QAAU;AAoBjB,IAAM,8BAA8B;AACpC,IAAM,mBAAmB;AACzB,IAAM,qBAAqB;AAC3B,IAAM,uBAAuB;AAItB,SAAS,gBAAgB,SAAyB;AACvD,SAAO,KAAK,KAAK,SAAS,kBAAkB;AAC9C;AAIO,SAAS,gBAAgB,SAAmC;AACjE,QAAM,IAAI,gBAAgB,OAAO;AACjC,MAAI,CAAC,GAAG,WAAW,CAAC,EAAG,QAAO,CAAC;AAC/B,MAAI;AACF,WAAO,KAAK,MAAM,GAAG,aAAa,GAAG,MAAM,CAAC;AAAA,EAC9C,QAAQ;AACN,WAAO,CAAC;AAAA,EACV;AACF;AAEO,SAAS,gBAAgB,SAAiB,OAA+B;AAC9E,KAAG,cAAc,gBAAgB,OAAO,GAAG,KAAK,UAAU,OAAO,MAAM,CAAC,GAAG,MAAM;AACnF;AAEO,SAAS,wBACd,SACA,YACA,OACM;AACN,QAAM,QAAQ,gBAAgB,OAAO;AACrC,QAAM,UAAU,IAAI;AACpB,kBAAgB,SAAS,KAAK;AAChC;AAEA,SAAS,kBAAkB,OAA8B;AACvD,QAAM,QAAQ,MAAM,MAAM,GAAG;AAC7B,MAAI,MAAM,SAAS,EAAG,QAAO;AAE7B,MAAI;AACF,UAAM,aAAa,OAAO,KAAK,MAAM,CAAC,KAAK,IAAI,WAAW,EAAE,SAAS,MAAM;AAC3E,UAAM,UAAU,KAAK,MAAM,UAAU;AACrC,WAAO,OAAO,QAAQ,QAAQ,YAAY,OAAO,SAAS,QAAQ,GAAG,KAAK,QAAQ,MAAM,IACpF,QAAQ,MAAM,MACd;AAAA,EACN,QAAQ;AACN,WAAO;AAAA,EACT;AACF;AAEA,SAAS,0BAAkC;AACzC,QAAM,YAAY,QAAQ,IAAI,YAAY,IACtC,KAAK,QAAQ,QAAQ,IAAI,YAAY,CAAC,IACtC,KAAK,KAAK,GAAG,QAAQ,GAAG,QAAQ;AAEpC,SAAO,KAAK,KAAK,WAAW,WAAW;AACzC;AAIO,SAAS,0BAA+C;AAC7D,QAAM,WAAW,wBAAwB;AACzC,MAAI,CAAC,GAAG
,WAAW,QAAQ,EAAG,QAAO;AAErC,MAAI;AACF,UAAM,MAAM,KAAK,MAAM,GAAG,aAAa,UAAU,MAAM,CAAC;AACxD,UAAM,SAAS,IAAI,QAAQ;AAC3B,QAAI,CAAC,UAAU,OAAO,WAAW,SAAU,QAAO;AAElD,UAAM,SAAS;AACf,UAAM,SAAS,OAAO,cAAc;AACpC,UAAM,UAAU,OAAO,eAAe;AACtC,QAAI,OAAO,WAAW,YAAY,CAAC,UAAU,OAAO,YAAY,YAAY,CAAC,SAAS;AACpF,aAAO;AAAA,IACT;AAEA,QAAI,iBAAiB,KAAK,IAAI,IAAI,KAAK,KAAK;AAC5C,QAAI;AACF,uBAAiB,GAAG,SAAS,QAAQ,EAAE,UAAU,KAAK,KAAK;AAAA,IAC7D,QAAQ;AAAA,IAER;AAEA,WAAO;AAAA,MACL;AAAA,MACA;AAAA,MACA,SAAS,kBAAkB,MAAM,KAAK;AAAA,MACtC,GAAI,OAAO,OAAO,YAAY,MAAM,WAAW,EAAE,WAAW,OAAO,YAAY,EAAE,IAAI,CAAC;AAAA,IACxF;AAAA,EACF,QAAQ;AACN,WAAO;AAAA,EACT;AACF;AAEO,SAAS,uBACd,OACe;AACf,QAAM,YAAY,SACb,QAAQ,IAAI,sBAAsB,KAClC,QAAQ,IAAI,yBAAyB,KACrC,QAAQ,IAAI,gBAAgB,KAC5B,8BAA8B,KAC9B;AAEL,SAAO,OAAO,cAAc,YAAY,UAAU,KAAK,IACnD,UAAU,KAAK,IACf;AACN;AAEA,SAAS,kBAA0B;AACjC,SAAO,KAAK,KAAK,GAAG,QAAQ,GAAG,WAAW,QAAQ;AACpD;AAEA,SAAS,cAAc,UAAkB,SAAiB,KAA4B;AACpF,MAAI,iBAAiB;AACrB,aAAW,WAAW,SAAS,MAAM,OAAO,GAAG;AAC7C,UAAM,OAAO,QAAQ,KAAK;AAC1B,QAAI,CAAC,QAAQ,KAAK,WAAW,GAAG,KAAK,KAAK,WAAW,GAAG,EAAG;AAE3D,UAAM,eAAe,KAAK,MAAM,YAAY;AAC5C,QAAI,cAAc;AAChB,uBAAiB,aAAa,CAAC,KAAK;AACpC;AAAA,IACF;AAEA,QAAI,mBAAmB,QAAS;AAChC,UAAM,aAAa,KAAK,MAAM,uBAAuB;AACrD,QAAI,CAAC,WAAY;AACjB,QAAI,WAAW,CAAC,GAAG,KAAK,MAAM,IAAK;AAEnC,UAAM,QAAQ,WAAW,CAAC,GAAG,KAAK,KAAK;AACvC,WAAO,SAAS;AAAA,EAClB;AAEA,SAAO;AACT;AAEA,SAAS,gCAA+C;AACtD,QAAM,YAAY,gBAAgB;AAClC,QAAM,mBAAmB,KAAK,KAAK,WAAW,eAAe;AAC7D,MAAI,eAAe;AAEnB,MAAI;AACF,QAAI,GAAG,WAAW,gBAAgB,GAAG;AACnC,qBAAe,GAAG,aAAa,kBAAkB,MAAM,EAAE,KAAK,KAAK;AAAA,IACrE;AAAA,EACF,QAAQ;AAAA,EAER;AAEA,QAAM,aAAa,KAAK,KAAK,WAAW,kBAAkB,UAAU,YAAY,EAAE;AAClF,MAAI,CAAC,GAAG,WAAW,UAAU,EAAG,QAAO;AAEvC,MAAI;AACF,WAAO,cAAc,GAAG,aAAa,YAAY,MAAM,GAAG,QAAQ,SAAS;AAAA,EAC7E,QAAQ;AACN,WAAO;AAAA,EACT;AACF;AAEA,SAAS,gBAAgB,SAA2B;AAClD,MAAI,CAAC,WAAW,OAAO,YAAY,SAAU,QAAO;AACpD,QAAM,QAAS,QAAgC;AAC/C,MAAI,CAAC,SAAS,OAAO,UAAU,SAAU,QAAO;AAChD,QAAM,UAAW,MAAkC;AACnD,MAAI,CAAC,MAAM,QAAQ,OAAO,EAAG,QAAO;AAEpC,SAAO,QAAQ;AAAA,IAAK,UAClB,OAAO,SAAS,YAChB,SAAS,QACR,KAA6
B,WAAW;AAAA,EAC3C;AACF;AAEA,SAAS,qBACP,cACiB;AACjB,MAAI,CAAC,cAAc,OAAQ,QAAO,EAAE,IAAI,mBAAmB;AAC3D,SAAO,aAAa,KAAK,UAAQ,KAAK,SAAS,KAAK,EAAE,IAAI,mBAAmB;AAC/E;AAEA,eAAe,KAAK,IAA2B;AAC7C,QAAM,IAAI,QAAQ,aAAW,WAAW,SAAS,EAAE,CAAC;AACtD;AAEA,eAAe,2BACb,eACA,SACuF;AACvF,WAAS,UAAU,GAAG,UAAU,IAAI,WAAW,GAAG;AAChD,QAAI,UAAU,EAAG,OAAM,KAAK,GAAI;AAEhC,UAAM,WAAW,MAAM,MAAM,GAAG,2BAA2B,eAAe,aAAa,IAAI;AAAA,MACzF,QAAS;AAAA,MACT;AAAA,IACF,CAAC;AAED,QAAI,CAAC,SAAS,IAAI;AAChB,YAAM,IAAI,MAAM,+CAA+C,SAAS,MAAM,IAAI,SAAS,UAAU,EAAE;AAAA,IACzG;AAEA,UAAM,OAAO,MAAM,SAAS,KAAK;AAKjC,QAAI,KAAK,KAAM,QAAO;AAAA,EACxB;AAEA,QAAM,IAAI,MAAM,yDAAyD;AAC3E;AAEA,eAAe,uCAAuC,aAAsC;AAC1F,QAAM,eAAe,uBAAuB;AAC5C,QAAM,UAAU;AAAA,IACd,eAAmB,UAAU,WAAW;AAAA,IACxC,gBAAmB;AAAA,IACnB,cAAmB;AAAA,IACnB,qBAAqB,WAAW,QAAQ,SAAS,IAAI;AAAA,EACvD;AAEA,QAAM,WAAW;AAAA,IACf,GAAI,eAAe,EAAE,yBAAyB,aAAa,IAAI,CAAC;AAAA,IAChE,UAAU;AAAA,MACR,SAAY;AAAA,MACZ,UAAY;AAAA,MACZ,YAAY;AAAA,MACZ,GAAI,eAAe,EAAE,aAAa,aAAa,IAAI,CAAC;AAAA,IACtD;AAAA,EACF;AAEA,QAAM,eAAe,MAAM,MAAM,GAAG,2BAA2B,8BAA8B;AAAA,IAC3F,QAAQ;AAAA,IACR;AAAA,IACA,MAAQ,KAAK,UAAU,QAAQ;AAAA,EACjC,CAAC;AAED,MAAI;AAMJ,MAAI,CAAC,aAAa,IAAI;AACpB,QAAI,UAAmB;AACvB,QAAI;AACF,gBAAU,MAAM,aAAa,MAAM,EAAE,KAAK;AAAA,IAC5C,QAAQ;AACN,gBAAU;AAAA,IACZ;AAEA,QAAI,gBAAgB,OAAO,GAAG;AAC5B,aAAO,EAAE,aAAa,EAAE,IAAI,qBAAqB,EAAE;AAAA,IACrD,OAAO;AACL,YAAM,OAAO,MAAM,aAAa,KAAK;AACrC,YAAM,IAAI,MAAM,iCAAiC,aAAa,MAAM,IAAI,aAAa,UAAU,GAAG,OAAO,KAAK,IAAI,KAAK,EAAE,EAAE;AAAA,IAC7H;AAAA,EACF,OAAO;AACL,WAAO,MAAM,aAAa,KAAK;AAAA,EACjC;AAEA,MAAI,KAAK,aAAa;AACpB,UAAM,UAAU,KAAK;AACrB,QAAI,OAAO,YAAY,YAAY,QAAQ,KAAK,EAAG,QAAO,QAAQ,KAAK;AACvE,QAAI,OAAO,YAAY,YAAY,OAAO,SAAS,OAAO,YAAY,QAAQ,GAAG,KAAK,EAAG,QAAO,QAAQ,GAAG,KAAK;AAChH,QAAI,aAAc,QAAO;AACzB,UAAM,IAAI;AAAA,MACR;AAAA,IACF;AAAA,EACF;AAEA,QAAM,SAAS,qBAAqB,KAAK,YAAY,EAAE,MAAM;AAC7D,MAAI,WAAW,oBAAoB,CAAC,cAAc;AAChD,UAAM,IAAI;AAAA,MACR;AAAA,IACF;AAAA,EACF;AAEA,QAAM,cAAuC;AAAA,IAC3C;AAAA,IACA,UAAU;AAAA,MACR,SAAY;AAAA,MACZ,UAAY;AAAA,MACZ,YAAY;AAAA,IACd;AAAA,EACF;AAEA,MAAI,WAAW,oBAAoB
,cAAc;AAC/C,gBAAY,yBAAyB,IAAI;AACxC,IAAC,YAAY,UAAU,EAA8B,aAAa,IAAI;AAAA,EACzE;AAEA,QAAM,kBAAkB,MAAM,MAAM,GAAG,2BAA2B,2BAA2B;AAAA,IAC3F,QAAQ;AAAA,IACR;AAAA,IACA,MAAQ,KAAK,UAAU,WAAW;AAAA,EACpC,CAAC;AAED,MAAI,CAAC,gBAAgB,IAAI;AACvB,UAAM,OAAO,MAAM,gBAAgB,KAAK;AACxC,UAAM,IAAI,MAAM,8BAA8B,gBAAgB,MAAM,IAAI,gBAAgB,UAAU,GAAG,OAAO,KAAK,IAAI,KAAK,EAAE,EAAE;AAAA,EAChI;AAEA,MAAI,YAAY,MAAM,gBAAgB,KAAK;AAM3C,MAAI,CAAC,UAAU,QAAQ,UAAU,MAAM;AACrC,gBAAY,MAAM,2BAA2B,UAAU,MAAM,OAAO;AAAA,EACtE;AAEA,QAAM,YAAY,UAAU,UAAU,yBAAyB;AAC/D,MAAI,OAAO,cAAc,YAAY,UAAU,KAAK,EAAG,QAAO,UAAU,KAAK;AAC7E,MAAI,aAAc,QAAO;AAEzB,QAAM,IAAI;AAAA,IACR;AAAA,EACF;AACF;AAKA,eAAsB,sBACpB,aACA,OACiB;AACjB,QAAM,WAAW,uBAAuB,KAAK;AAC7C,MAAI,SAAU,QAAO;AACrB,SAAO,MAAM,uCAAuC,WAAW;AACjE;AAMA,IAAM,oBAAoB;AAE1B,SAAS,UAAU,OAAkC;AACnD,SAAO,KAAK,IAAI,KAAK,MAAM,UAAU;AACvC;AAgBA,eAAsB,eAAe,SAAiB,YAAyC;AAC7F,QAAM,QAAQ,gBAAgB,OAAO;AACrC,MAAI,eAAe,eAAe;AAChC,UAAM,QAAQ,MAAM,aAAa,GAAG,SAC/B,QAAQ,IAAI,qBAAqB,KACjC,QAAQ,IAAI,UAAU,KACtB;AAEL,QAAI,CAAC,OAAO;AACV,YAAM,IAAI;AAAA,QACR;AAAA,MACF;AAAA,IACF;AAEA,WAAO;AAAA,EACT;AAEA,QAAM,QAAQ,MAAM,UAAU;AAE9B,MAAI,CAAC,OAAO;AACV,UAAM,IAAI;AAAA,MACR,sBAAsB,UAAU;AAAA,IAElC;AAAA,EACF;AAEA,MAAI,CAAC,UAAU,KAAK,GAAG;AACrB,WAAO,MAAM;AAAA,EACf;AAGA,MAAI;AAEJ,MAAI,eAAe,uBAAuB;AACxC,UAAM,EAAE,wBAAwB,IAAI,MAAM,OAAO,2BAA2B;AAC5E,gBAAY,MAAM,wBAAwB,MAAM,OAAO;AAAA,EACzD,OAAO;AACL,UAAM,EAAE,wBAAwB,IAAI,MAAM,OAAO,2BAA2B;AAC5E,UAAM,YAAY,MAAM;AAAA,MACtB,MAAM;AAAA,MACL,MAA4B,aAAa;AAAA,IAC5C;AACA,gBAAmB,MAAM,wBAAwB,MAAM,SAAS,SAAS;AAExE,IAAC,UAAgC,YAAY;AAAA,EAChD;AAEA,0BAAwB,SAAS,YAAY,SAAS;AACtD,SAAO,UAAU;AACnB;","names":[]}
@@ -8,7 +8,7 @@ import {
   resolveGeminiProjectId,
   saveCredentials,
   saveProviderCredentials
-} from "./chunk-JZA53FAX.js";
+} from "./chunk-VIJFYNJO.js";
 export {
   credentialsPath,
   ensureGeminiProjectId,
@@ -19,4 +19,4 @@ export {
   saveCredentials,
   saveProviderCredentials
 };
-//# sourceMappingURL=credentials-C7NSDIAG.js.map
+//# sourceMappingURL=credentials-TLQMSS4A.js.map