autoforge-ai 0.1.4 → 0.1.5

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/.env.example CHANGED
@@ -9,11 +9,6 @@
  # - webkit: Safari engine
  # - msedge: Microsoft Edge
  # PLAYWRIGHT_BROWSER=firefox
- #
- # PLAYWRIGHT_HEADLESS: Run browser without visible window
- # - true: Browser runs in background, saves CPU (default)
- # - false: Browser opens a visible window (useful for debugging)
- # PLAYWRIGHT_HEADLESS=true

  # Extra Read Paths (Optional)
  # Comma-separated list of absolute paths for read-only access to external directories.
@@ -25,56 +20,17 @@
  # Google Cloud Vertex AI Configuration (Optional)
  # To use Claude via Vertex AI on Google Cloud Platform, uncomment and set these variables.
  # Requires: gcloud CLI installed and authenticated (run: gcloud auth application-default login)
- # Note: Use @ instead of - in model names (e.g., claude-opus-4-5@20251101)
+ # Note: Use @ instead of - in model names for date-suffixed models (e.g., claude-sonnet-4-5@20250929)
  #
  # CLAUDE_CODE_USE_VERTEX=1
  # CLOUD_ML_REGION=us-east5
  # ANTHROPIC_VERTEX_PROJECT_ID=your-gcp-project-id
- # ANTHROPIC_DEFAULT_OPUS_MODEL=claude-opus-4-5@20251101
+ # ANTHROPIC_DEFAULT_OPUS_MODEL=claude-opus-4-6
  # ANTHROPIC_DEFAULT_SONNET_MODEL=claude-sonnet-4-5@20250929
  # ANTHROPIC_DEFAULT_HAIKU_MODEL=claude-3-5-haiku@20241022

  # ===================
- # Alternative API Providers
+ # Alternative API Providers (GLM, Ollama, Kimi, Custom)
  # ===================
- # NOTE: These env vars are the legacy way to configure providers.
- # The recommended way is to use the Settings UI (API Provider section).
- # UI settings take precedence when api_provider != "claude".
-
- # Kimi K2.5 (Moonshot) Configuration (Optional)
- # Get an API key at: https://kimi.com
- #
- # ANTHROPIC_BASE_URL=https://api.kimi.com/coding/
- # ANTHROPIC_API_KEY=your-kimi-api-key
- # ANTHROPIC_DEFAULT_SONNET_MODEL=kimi-k2.5
- # ANTHROPIC_DEFAULT_OPUS_MODEL=kimi-k2.5
- # ANTHROPIC_DEFAULT_HAIKU_MODEL=kimi-k2.5
-
- # GLM/Alternative API Configuration (Optional)
- # To use Zhipu AI's GLM models instead of Claude, uncomment and set these variables.
- # This only affects AutoForge - your global Claude Code settings remain unchanged.
- # Get an API key at: https://z.ai/subscribe
- #
- # ANTHROPIC_BASE_URL=https://api.z.ai/api/anthropic
- # ANTHROPIC_AUTH_TOKEN=your-zhipu-api-key
- # API_TIMEOUT_MS=3000000
- # ANTHROPIC_DEFAULT_SONNET_MODEL=glm-4.7
- # ANTHROPIC_DEFAULT_OPUS_MODEL=glm-4.7
- # ANTHROPIC_DEFAULT_HAIKU_MODEL=glm-4.5-air
-
- # Ollama Local Model Configuration (Optional)
- # To use local models via Ollama instead of Claude, uncomment and set these variables.
- # Requires Ollama v0.14.0+ with Anthropic API compatibility.
- # See: https://ollama.com/blog/claude
- #
- # ANTHROPIC_BASE_URL=http://localhost:11434
- # ANTHROPIC_AUTH_TOKEN=ollama
- # API_TIMEOUT_MS=3000000
- # ANTHROPIC_DEFAULT_SONNET_MODEL=qwen3-coder
- # ANTHROPIC_DEFAULT_OPUS_MODEL=qwen3-coder
- # ANTHROPIC_DEFAULT_HAIKU_MODEL=qwen3-coder
- #
- # Model recommendations:
- # - For best results, use a capable coding model like qwen3-coder or deepseek-coder-v2
- # - You can use the same model for all tiers, or different models per tier
- # - Larger models (70B+) work best for Opus tier, smaller (7B-20B) for Haiku
+ # Configure alternative providers via the Settings UI (gear icon > API Provider).
+ # The Settings UI is the recommended way to switch providers and models.
package/README.md CHANGED
@@ -326,37 +326,13 @@ When test progress increases, the agent sends:
  }
  ```

- ### Using GLM Models (Alternative to Claude)
+ ### Alternative API Providers (GLM, Ollama, Kimi, Custom)

- Add these variables to your `.env` file to use Zhipu AI's GLM models:
+ Alternative providers are configured via the **Settings UI** (gear icon > API Provider). Select your provider, set the base URL, auth token, and model directly in the UI — no `.env` changes needed.

- ```bash
- ANTHROPIC_BASE_URL=https://api.z.ai/api/anthropic
- ANTHROPIC_AUTH_TOKEN=your-zhipu-api-key
- API_TIMEOUT_MS=3000000
- ANTHROPIC_DEFAULT_SONNET_MODEL=glm-4.7
- ANTHROPIC_DEFAULT_OPUS_MODEL=glm-4.7
- ANTHROPIC_DEFAULT_HAIKU_MODEL=glm-4.5-air
- ```
-
- This routes AutoForge's API requests through Zhipu's Claude-compatible API, allowing you to use GLM-4.7 and other models. **This only affects AutoForge** - your global Claude Code settings remain unchanged.
-
- Get an API key at: https://z.ai/subscribe
-
- ### Using Ollama Local Models
-
- Add these variables to your `.env` file to run agents with local models via Ollama v0.14.0+:
-
- ```bash
- ANTHROPIC_BASE_URL=http://localhost:11434
- ANTHROPIC_AUTH_TOKEN=ollama
- API_TIMEOUT_MS=3000000
- ANTHROPIC_DEFAULT_SONNET_MODEL=qwen3-coder
- ANTHROPIC_DEFAULT_OPUS_MODEL=qwen3-coder
- ANTHROPIC_DEFAULT_HAIKU_MODEL=qwen3-coder
- ```
+ Available providers: **Claude** (default), **GLM** (Zhipu AI), **Ollama** (local models), **Kimi** (Moonshot), **Custom**

- See the [CLAUDE.md](CLAUDE.md) for recommended models and known limitations.
+ For Ollama, install [Ollama v0.14.0+](https://ollama.com), run `ollama serve`, and pull a coding model (e.g., `ollama pull qwen3-coder`). Then select "Ollama" in the Settings UI.

  ### Using Vertex AI

@@ -366,7 +342,7 @@ Add these variables to your `.env` file to run agents via Google Cloud Vertex AI
  CLAUDE_CODE_USE_VERTEX=1
  CLOUD_ML_REGION=us-east5
  ANTHROPIC_VERTEX_PROJECT_ID=your-gcp-project-id
- ANTHROPIC_DEFAULT_OPUS_MODEL=claude-opus-4-5@20251101
+ ANTHROPIC_DEFAULT_OPUS_MODEL=claude-opus-4-6
  ANTHROPIC_DEFAULT_SONNET_MODEL=claude-sonnet-4-5@20250929
  ANTHROPIC_DEFAULT_HAIKU_MODEL=claude-3-5-haiku@20241022
  ```
package/client.py CHANGED
@@ -46,8 +46,9 @@ def convert_model_for_vertex(model: str) -> str:
      """
      Convert model name format for Vertex AI compatibility.

-     Vertex AI uses @ to separate model name from version (e.g., claude-opus-4-5@20251101)
-     while the Anthropic API uses - (e.g., claude-opus-4-5-20251101).
+     Vertex AI uses @ to separate model name from version (e.g., claude-sonnet-4-5@20250929)
+     while the Anthropic API uses - (e.g., claude-sonnet-4-5-20250929).
+     Models without a date suffix (e.g., claude-opus-4-6) pass through unchanged.

      Args:
          model: Model name in Anthropic format (with hyphens)
@@ -61,7 +62,7 @@ def convert_model_for_vertex(model: str) -> str:
          return model

      # Pattern: claude-{name}-{version}-{date} -> claude-{name}-{version}@{date}
-     # Example: claude-opus-4-5-20251101 -> claude-opus-4-5@20251101
+     # Example: claude-sonnet-4-5-20250929 -> claude-sonnet-4-5@20250929
      # The date is always 8 digits at the end
      match = re.match(r'^(claude-.+)-(\d{8})$', model)
      if match:
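The hunk above updates the examples in `convert_model_for_vertex`'s docstring. For reference, a minimal sketch of the conversion it documents, reconstructed from the docstring and the regex visible in the diff (the shipped function may carry extra guards):

```python
import re

def convert_model_for_vertex(model: str) -> str:
    """Convert an Anthropic-format model name to Vertex AI format.

    Vertex AI uses "@" to separate the model name from its release date
    (claude-sonnet-4-5@20250929), while the Anthropic API uses "-"
    (claude-sonnet-4-5-20250929). Names without an 8-digit date suffix
    (e.g., claude-opus-4-6) pass through unchanged.
    """
    # The date is always exactly 8 digits at the end of the name.
    match = re.match(r"^(claude-.+)-(\d{8})$", model)
    if match:
        return f"{match.group(1)}@{match.group(2)}"
    return model
```

So `claude-sonnet-4-5-20250929` becomes `claude-sonnet-4-5@20250929`, while the dateless `claude-opus-4-6` is returned as-is, which is exactly why the new pass-through sentence was added to the docstring.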
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
    "name": "autoforge-ai",
-   "version": "0.1.4",
+   "version": "0.1.5",
    "description": "Autonomous coding agent with web UI - build complete apps with AI",
    "license": "AGPL-3.0",
    "bin": {
package/registry.py CHANGED
@@ -46,10 +46,16 @@ def _migrate_registry_dir() -> None:
  # Available models with display names
  # To add a new model: add an entry here with {"id": "model-id", "name": "Display Name"}
  AVAILABLE_MODELS = [
-     {"id": "claude-opus-4-5-20251101", "name": "Claude Opus 4.5"},
-     {"id": "claude-sonnet-4-5-20250929", "name": "Claude Sonnet 4.5"},
+     {"id": "claude-opus-4-6", "name": "Claude Opus"},
+     {"id": "claude-sonnet-4-5-20250929", "name": "Claude Sonnet"},
  ]

+ # Map legacy model IDs to their current replacements.
+ # Used by get_all_settings() to auto-migrate stale values on first read after upgrade.
+ LEGACY_MODEL_MAP = {
+     "claude-opus-4-5-20251101": "claude-opus-4-6",
+ }
+
  # List of valid model IDs (derived from AVAILABLE_MODELS)
  VALID_MODELS = [m["id"] for m in AVAILABLE_MODELS]

@@ -59,7 +65,7 @@ VALID_MODELS = [m["id"] for m in AVAILABLE_MODELS]
  _env_default_model = os.getenv("ANTHROPIC_DEFAULT_OPUS_MODEL")
  if _env_default_model is not None:
      _env_default_model = _env_default_model.strip()
- DEFAULT_MODEL = _env_default_model or "claude-opus-4-5-20251101"
+ DEFAULT_MODEL = _env_default_model or "claude-opus-4-6"

  # Ensure env-provided DEFAULT_MODEL is in VALID_MODELS for validation consistency
  # (idempotent: only adds if missing, doesn't alter AVAILABLE_MODELS semantics)
@@ -598,6 +604,9 @@ def get_all_settings() -> dict[str, str]:
      """
      Get all settings as a dictionary.

+     Automatically migrates legacy model IDs (e.g. claude-opus-4-5-20251101 -> claude-opus-4-6)
+     on first read after upgrade. This is a one-time silent migration.
+
      Returns:
          Dictionary mapping setting keys to values.
      """
@@ -606,7 +615,26 @@ def get_all_settings() -> dict[str, str]:
      session = SessionLocal()
      try:
          settings = session.query(Settings).all()
-         return {s.key: s.value for s in settings}
+         result = {s.key: s.value for s in settings}
+
+         # Auto-migrate legacy model IDs
+         migrated = False
+         for key in ("model", "api_model"):
+             old_id = result.get(key)
+             if old_id and old_id in LEGACY_MODEL_MAP:
+                 new_id = LEGACY_MODEL_MAP[old_id]
+                 setting = session.query(Settings).filter(Settings.key == key).first()
+                 if setting:
+                     setting.value = new_id
+                     setting.updated_at = datetime.now()
+                     result[key] = new_id
+                     migrated = True
+                     logger.info("Migrated setting '%s': %s -> %s", key, old_id, new_id)
+
+         if migrated:
+             session.commit()
+
+         return result
      finally:
          session.close()
  except Exception as e:
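Stripped of the SQLAlchemy session handling, the migration added above boils down to a dictionary rewrite. A database-free sketch of that core logic (`migrate_legacy_models` is an illustrative name; the real code also persists the new value and logs the change):

```python
# Retired model IDs mapped to their replacements, as in the diff above.
LEGACY_MODEL_MAP = {
    "claude-opus-4-5-20251101": "claude-opus-4-6",
}

def migrate_legacy_models(settings: dict[str, str]) -> dict[str, str]:
    """Rewrite retired model IDs found under the "model" and "api_model"
    keys; all other settings pass through untouched."""
    result = dict(settings)
    for key in ("model", "api_model"):
        old_id = result.get(key)
        if old_id and old_id in LEGACY_MODEL_MAP:
            result[key] = LEGACY_MODEL_MAP[old_id]
    return result
```

Running the migration on every read keeps it idempotent: once a value has been rewritten (and committed), subsequent reads find no key in `LEGACY_MODEL_MAP` and do nothing.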
@@ -624,10 +652,10 @@ API_PROVIDERS: dict[str, dict[str, Any]] = {
          "base_url": None,
          "requires_auth": False,
          "models": [
-             {"id": "claude-opus-4-5-20251101", "name": "Claude Opus 4.5"},
-             {"id": "claude-sonnet-4-5-20250929", "name": "Claude Sonnet 4.5"},
+             {"id": "claude-opus-4-6", "name": "Claude Opus"},
+             {"id": "claude-sonnet-4-5-20250929", "name": "Claude Sonnet"},
          ],
-         "default_model": "claude-opus-4-5-20251101",
+         "default_model": "claude-opus-4-6",
      },
      "kimi": {
          "name": "Kimi K2.5 (Moonshot)",
@@ -7,7 +7,6 @@ Settings are stored in the registry database and shared across all projects.
  """

  import mimetypes
- import os
  import sys

  from fastapi import APIRouter
@@ -39,19 +38,6 @@ def _parse_yolo_mode(value: str | None) -> bool:
      return (value or "false").lower() == "true"


- def _is_glm_mode() -> bool:
-     """Check if GLM API is configured via environment variables."""
-     base_url = os.getenv("ANTHROPIC_BASE_URL", "")
-     # GLM mode is when ANTHROPIC_BASE_URL is set but NOT pointing to Ollama
-     return bool(base_url) and not _is_ollama_mode()
-
-
- def _is_ollama_mode() -> bool:
-     """Check if Ollama API is configured via environment variables."""
-     base_url = os.getenv("ANTHROPIC_BASE_URL", "")
-     return "localhost:11434" in base_url or "127.0.0.1:11434" in base_url
-
-
  @router.get("/providers", response_model=ProvidersResponse)
  async def get_available_providers():
      """Get list of available API providers."""
@@ -116,9 +102,8 @@ async def get_settings():

      api_provider = all_settings.get("api_provider", "claude")

-     # Compute glm_mode / ollama_mode from api_provider for backward compat
-     glm_mode = api_provider == "glm" or _is_glm_mode()
-     ollama_mode = api_provider == "ollama" or _is_ollama_mode()
+     glm_mode = api_provider == "glm"
+     ollama_mode = api_provider == "ollama"

      return SettingsResponse(
          yolo_mode=_parse_yolo_mode(all_settings.get("yolo_mode")),
@@ -181,8 +166,8 @@ async def update_settings(update: SettingsUpdate):
      # Return updated settings
      all_settings = get_all_settings()
      api_provider = all_settings.get("api_provider", "claude")
-     glm_mode = api_provider == "glm" or _is_glm_mode()
-     ollama_mode = api_provider == "ollama" or _is_ollama_mode()
+     glm_mode = api_provider == "glm"
+     ollama_mode = api_provider == "ollama"

      return SettingsResponse(
          yolo_mode=_parse_yolo_mode(all_settings.get("yolo_mode")),
package/server/schemas.py CHANGED
@@ -411,8 +411,8 @@ class SettingsResponse(BaseModel):
      """Response schema for global settings."""
      yolo_mode: bool = False
      model: str = DEFAULT_MODEL
-     glm_mode: bool = False  # True if GLM API is configured via .env
-     ollama_mode: bool = False  # True if Ollama API is configured via .env
+     glm_mode: bool = False  # True when api_provider is "glm"
+     ollama_mode: bool = False  # True when api_provider is "ollama"
      testing_agent_ratio: int = 1  # Regression testing agents (0-3)
      playwright_headless: bool = True
      batch_size: int = 3  # Features per coding agent batch (1-3)
@@ -157,7 +157,7 @@ class AssistantChatSession:
      """
      Manages a read-only assistant conversation for a project.

-     Uses Claude Opus 4.5 with only read-only tools enabled.
+     Uses Claude Opus with only read-only tools enabled.
      Persists conversation history to SQLite.
      """

@@ -258,11 +258,11 @@ class AssistantChatSession:
      system_cli = shutil.which("claude")

      # Build environment overrides for API configuration
-     from registry import get_effective_sdk_env
+     from registry import DEFAULT_MODEL, get_effective_sdk_env
      sdk_env = get_effective_sdk_env()

      # Determine model from SDK env (provider-aware) or fallback to env/default
-     model = sdk_env.get("ANTHROPIC_DEFAULT_OPUS_MODEL") or os.getenv("ANTHROPIC_DEFAULT_OPUS_MODEL", "claude-opus-4-5-20251101")
+     model = sdk_env.get("ANTHROPIC_DEFAULT_OPUS_MODEL") or os.getenv("ANTHROPIC_DEFAULT_OPUS_MODEL", DEFAULT_MODEL)

      try:
          logger.info("Creating ClaudeSDKClient...")
@@ -154,11 +154,11 @@ class ExpandChatSession:
      system_prompt = skill_content.replace("$ARGUMENTS", project_path)

      # Build environment overrides for API configuration
-     from registry import get_effective_sdk_env
+     from registry import DEFAULT_MODEL, get_effective_sdk_env
      sdk_env = get_effective_sdk_env()

      # Determine model from SDK env (provider-aware) or fallback to env/default
-     model = sdk_env.get("ANTHROPIC_DEFAULT_OPUS_MODEL") or os.getenv("ANTHROPIC_DEFAULT_OPUS_MODEL", "claude-opus-4-5-20251101")
+     model = sdk_env.get("ANTHROPIC_DEFAULT_OPUS_MODEL") or os.getenv("ANTHROPIC_DEFAULT_OPUS_MODEL", DEFAULT_MODEL)

      # Build MCP servers config for feature creation
      mcp_servers = {
@@ -346,7 +346,7 @@ class AgentProcessManager:

      Args:
          yolo_mode: If True, run in YOLO mode (skip testing agents)
-         model: Model to use (e.g., claude-opus-4-5-20251101)
+         model: Model to use (e.g., claude-opus-4-6)
          parallel_mode: DEPRECATED - ignored, always uses unified orchestrator
          max_concurrency: Max concurrent coding agents (1-5, default 1)
          testing_agent_ratio: Number of regression testing agents (0-3, default 1)
@@ -140,11 +140,11 @@ class SpecChatSession:
      system_cli = shutil.which("claude")

      # Build environment overrides for API configuration
-     from registry import get_effective_sdk_env
+     from registry import DEFAULT_MODEL, get_effective_sdk_env
      sdk_env = get_effective_sdk_env()

      # Determine model from SDK env (provider-aware) or fallback to env/default
-     model = sdk_env.get("ANTHROPIC_DEFAULT_OPUS_MODEL") or os.getenv("ANTHROPIC_DEFAULT_OPUS_MODEL", "claude-opus-4-5-20251101")
+     model = sdk_env.get("ANTHROPIC_DEFAULT_OPUS_MODEL") or os.getenv("ANTHROPIC_DEFAULT_OPUS_MODEL", DEFAULT_MODEL)

      try:
          self.client = ClaudeSDKClient(
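The three chat-session hunks above replace the same hard-coded fallback model with `registry.DEFAULT_MODEL` at each call site. The resolution order they all implement can be sketched as follows (`resolve_opus_model` is an illustrative name, not a function in the package):

```python
import os

# Stands in for registry.DEFAULT_MODEL, which itself honors the env var.
DEFAULT_MODEL = "claude-opus-4-6"

def resolve_opus_model(sdk_env: dict[str, str]) -> str:
    """Pick the Opus-tier model: the provider-aware SDK env wins,
    then the process environment, then the registry default."""
    return (
        sdk_env.get("ANTHROPIC_DEFAULT_OPUS_MODEL")
        or os.getenv("ANTHROPIC_DEFAULT_OPUS_MODEL", DEFAULT_MODEL)
    )
```

With `DEFAULT_MODEL` as the single source of truth, a future model bump only touches `registry.py` instead of three separate call sites, which is presumably why each session now imports it alongside `get_effective_sdk_env`.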