loren-code 0.1.0 → 0.1.2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,45 +1,29 @@
- # LOREN CODE
+ # loren-code

- Ollama Cloud model manager and local bridge for Claude Code, with dynamic model switching, API key rotation, and first-run bootstrap.
+ `loren-code` installs the `loren` CLI for working with a local Ollama Cloud bridge.

- ## Features
+ It is built to rotate multiple Ollama Cloud API keys, including the common setup where users configure more than one free-tier key for longer uninterrupted bridge usage.

- - Dynamic model switching without restarting the server
- - Live model list fetched from Ollama Cloud
- - API key add/remove/rotate commands
- - First-run setup for `.env.local` and `.runtime`
- - Local bridge on port `8788`
- - Claude Code wrapper support
+ Loren manages rotation and failover, but it does not bypass upstream limits or service terms.

- ## Quick Start
-
- ### Prerequisites
-
- - Node.js 18+
- - Ollama Cloud API key(s)
-
- ### Clone And Run Locally
+ ## Install

  ```bash
- git clone https://github.com/lorenzune/loren-code.git
- cd loren-code
- npm install
- node scripts/loren.js help
+ npm install -g loren-code
  ```

- If `.env.local` is missing, Loren creates it automatically from `.env.example`.
- You still need to add real `OLLAMA_API_KEYS`.
-
- ### Install From npm
+ Verify:

  ```bash
- npm install -g loren-code
  loren help
  ```

- The published package exposes `loren` via the `bin` field automatically.
+ ## First Run

- ## Configuration
+ Loren creates `.env.local` automatically if it does not exist.
+
+ You must add valid `OLLAMA_API_KEYS` before the bridge can make upstream requests.
+ If you configure multiple keys, Loren rotates them automatically.

  Example `.env.local`:

@@ -52,16 +36,17 @@ DEFAULT_MODEL_ALIAS=gpt-oss:20b
  OLLAMA_MODEL_ALIASES={"ollama-free-auto":"gpt-oss:20b","ollama-free-fast":"gemma3:12b"}
  ```

- ## Usage
-
- ### CLI
+ ## Main Commands

  ```bash
  loren help
  loren config:show
  loren status
+ loren start
+ loren stop
  loren model:list
  loren model:set gpt-oss:20b
+ loren model:current
  loren model:refresh
  loren keys:list
  loren keys:add sk-your-new-key
@@ -69,56 +54,34 @@ loren keys:remove 0
  loren keys:rotate
  ```

- ### Server
-
- ```bash
- npm start
- ```
-
- or:
+ ## Start The Bridge

  ```bash
  loren start
- loren stop
- loren status
  ```

- ## Bridge Endpoints
-
- - `GET /health`
- - `GET /v1/models`
- - `GET /v1/models?refresh=true`
- - `POST /v1/refresh`
- - `POST /v1/messages`
- - `POST /v1/messages/count_tokens`
- - `GET /metrics`
- - `GET /dashboard`
-
- ## Claude Code Integration
+ The local bridge runs on:

- 1. Start the bridge.
- 2. Point Claude Code to `http://127.0.0.1:8788`.
- 3. Use `loren model:set` to switch model aliases.
- 4. Use `loren model:refresh` to refresh the model list.
+ ```text
+ http://127.0.0.1:8788
+ ```

  ## Troubleshooting

- ### `Command not found: loren`
+ ### `loren` not found

- Install the package globally:
+ Make sure the package was installed globally:

  ```bash
  npm install -g loren-code
  ```

- If you are working from a local clone, use `node scripts/loren.js ...`.
-
  ### `npm` blocked in PowerShell

- Use `npm.cmd` instead:
+ Use:

  ```powershell
- npm.cmd run help
+ npm.cmd install -g loren-code
  ```

  ### Missing API keys
@@ -129,25 +92,8 @@ Populate `OLLAMA_API_KEYS` in `.env.local`.

  Change `BRIDGE_PORT` in `.env.local`.

- ## Project Structure
-
- ```text
- loren-code/
- |- scripts/
- | |- loren.js
- | |- claude-wrapper.js
- | `- install-claude-ollama.ps1
- |- src/
- | |- bootstrap.js
- | |- server.js
- | |- config.js
- | |- key-manager.js
- | `- ...
- |- .env.example
- |- package.json
- `- README.md
- ```
+ ## Repository

- ## License
+ Source code and project documentation:

- MIT
+ https://github.com/lorenzune/loren-code
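The README hunk above shows only the tail of the `Example .env.local` block. For orientation, here is a fuller sketch using only variable names that appear elsewhere in this diff; the values are placeholders, and the comma-separated format for `OLLAMA_API_KEYS` is an assumption, not confirmed by the diff:

```text
# Placeholder keys; Loren rotates across all configured keys
OLLAMA_API_KEYS=sk-key-one,sk-key-two
BRIDGE_HOST=127.0.0.1
BRIDGE_PORT=8788
DEFAULT_MODEL_ALIAS=gpt-oss:20b
OLLAMA_MODEL_ALIASES={"ollama-free-auto":"gpt-oss:20b","ollama-free-fast":"gemma3:12b"}
```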
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "loren-code",
- "version": "0.1.0",
+ "version": "0.1.2",
  "description": "Ollama Cloud Model Manager - Dynamic model switching, API key rotation, and real-time configuration updates",
  "author": "lorenzune",
  "license": "MIT",
@@ -55,6 +55,8 @@
  "help": "node scripts/loren.js help",
  "smoke": "node scripts/smoke-test.js",
  "check:publish": "node scripts/publish-check.js",
+ "prepack": "node scripts/sync-readme.js npm",
+ "postpack": "node scripts/sync-readme.js github",
  "prepublishOnly": "npm test && npm run lint",
  "test": "node scripts/publish-check.js",
  "lint": "node -e \"console.log('No linter configured')\""
@@ -12,7 +12,7 @@ $launcherExePath = Join-Path $repoRoot "scripts\\ClaudeWrapperLauncher.exe"
  $envPath = Join-Path $repoRoot ".env.local"

  if (-not (Test-Path $envPath)) {
- throw ".env.local non trovato. Crea prima il file con OLLAMA_API_KEYS."
+ throw ".env.local not found. Create it first with OLLAMA_API_KEYS."
  }

  New-Item -ItemType Directory -Force -Path $workspaceSettingsDir | Out-Null
@@ -70,6 +70,19 @@ function Get-EnvValue {
  }

  function Get-CSharpCompiler {
+ $command = Get-Command csc -ErrorAction SilentlyContinue
+ if ($command -and $command.Source -and (Test-Path $command.Source)) {
+ return $command.Source
+ }
+
+ $runtimeDir = [Runtime.InteropServices.RuntimeEnvironment]::GetRuntimeDirectory()
+ if (-not [string]::IsNullOrWhiteSpace($runtimeDir)) {
+ $runtimeCandidate = Join-Path $runtimeDir "csc.exe"
+ if (Test-Path $runtimeCandidate) {
+ return $runtimeCandidate
+ }
+ }
+
  $candidates = @(
  "C:\Windows\Microsoft.NET\Framework64\v4.0.30319\csc.exe",
  "C:\Windows\Microsoft.NET\Framework\v4.0.30319\csc.exe"
@@ -81,7 +94,7 @@ function Get-CSharpCompiler {
  }
  }

- throw "Compilatore C# non trovato. Impossibile generare il launcher .exe."
+ throw "C# compiler not found. Unable to generate the launcher .exe."
  }

  function Get-OllamaAvailableModels {
@@ -133,7 +146,7 @@ function Get-OllamaAvailableModels {
  }
  }
  } catch {
- Write-Warning "Impossibile caricare la lista modelli da Ollama Cloud. Continuo con alias e target locali."
+ Write-Warning "Unable to load the model list from Ollama Cloud. Continuing with local aliases and targets."
  }

  return $models
@@ -143,21 +156,51 @@ $compilerPath = Get-CSharpCompiler
  & $compilerPath "/nologo" "/target:exe" "/out:$launcherExePath" $launcherSourcePath | Out-Null
  if ($LASTEXITCODE -ne 0 -or -not (Test-Path $launcherExePath)) {
  if (Test-Path $launcherExePath) {
- Write-Warning "Compilazione launcher fallita, ma uso il launcher esistente."
+ Write-Warning "Launcher compilation failed, but an existing launcher will be used."
  } else {
- throw "Compilazione del launcher fallita."
+ throw "Launcher compilation failed."
  }
  }

  $workspaceSettings = Read-JsonFile -Path $workspaceSettingsPath
+ $bridgeHost = Get-EnvValue -Path $envPath -Name "BRIDGE_HOST"
+ if (-not $bridgeHost) {
+ $bridgeHost = "127.0.0.1"
+ }
+
+ $bridgePort = Get-EnvValue -Path $envPath -Name "BRIDGE_PORT"
+ if (-not $bridgePort) {
+ $bridgePort = "8788"
+ }
+
+ $bridgeBaseUrl = "http://${bridgeHost}:${bridgePort}"
+
  $workspaceSettings["claudeCode.claudeProcessWrapper"] = $launcherExePath
  $workspaceSettings["claudeCode.disableLoginPrompt"] = $true
+ $workspaceSettings["claudeCode.environmentVariables"] = @(
+ @{
+ name = "ANTHROPIC_BASE_URL"
+ value = $bridgeBaseUrl
+ },
+ @{
+ name = "ANTHROPIC_API_KEY"
+ value = "bridge-local"
+ },
+ @{
+ name = "ANTHROPIC_AUTH_TOKEN"
+ value = ""
+ },
+ @{
+ name = "CLAUDE_CODE_SKIP_AUTH_LOGIN"
+ value = "1"
+ }
+ )
  Write-JsonFile -Path $workspaceSettingsPath -Data $workspaceSettings

  $claudeSettings = Read-JsonFile -Path $claudeSettingsPath
  $aliasJson = Get-EnvValue -Path $envPath -Name "OLLAMA_MODEL_ALIASES"
  if (-not $aliasJson) {
- throw "OLLAMA_MODEL_ALIASES non trovato in .env.local"
+ throw "OLLAMA_MODEL_ALIASES not found in .env.local"
  }

  $parsedAliases = $aliasJson | ConvertFrom-Json
@@ -168,7 +211,7 @@ foreach ($property in $parsedAliases.PSObject.Properties) {
  $availableModels = Get-OllamaAvailableModels -EnvPath $envPath -Aliases $aliases

  if ($availableModels.Count -eq 0) {
- throw "OLLAMA_MODEL_ALIASES non contiene modelli"
+ throw "OLLAMA_MODEL_ALIASES does not contain any models"
  }

  $defaultModel = if ($aliases.ContainsKey("ollama-free-auto")) { "ollama-free-auto" } else { $availableModels[0] }
@@ -176,9 +219,9 @@ $claudeSettings["model"] = $defaultModel
  $claudeSettings["availableModels"] = $availableModels
  Write-JsonFile -Path $claudeSettingsPath -Data $claudeSettings

- Write-Host "Installazione completata."
- Write-Host "Launcher Claude:" $launcherExePath
+ Write-Host "Installation completed."
+ Write-Host "Claude launcher:" $launcherExePath
  Write-Host "VS Code user settings:" $workspaceSettingsPath
  Write-Host "Claude user settings:" $claudeSettingsPath
  Write-Host ""
- Write-Host "Riavvia VS Code. Claude Code usera il bridge in qualsiasi progetto."
+ Write-Host "Restart VS Code. Claude Code will use the bridge in any project."
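The installer changes above inject the bridge connection details into the VS Code user `settings.json`. Assuming the default host and port, the written fragment would look roughly like this (the wrapper path is machine-specific and shown as a placeholder):

```json
{
  "claudeCode.claudeProcessWrapper": "C:\\path\\to\\scripts\\ClaudeWrapperLauncher.exe",
  "claudeCode.disableLoginPrompt": true,
  "claudeCode.environmentVariables": [
    { "name": "ANTHROPIC_BASE_URL", "value": "http://127.0.0.1:8788" },
    { "name": "ANTHROPIC_API_KEY", "value": "bridge-local" },
    { "name": "ANTHROPIC_AUTH_TOKEN", "value": "" },
    { "name": "CLAUDE_CODE_SKIP_AUTH_LOGIN", "value": "1" }
  ]
}
```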
package/scripts/loren.js CHANGED
@@ -13,6 +13,8 @@ const runtimeDir = path.join(projectRoot, ".runtime");
  const pidFilePath = path.join(runtimeDir, "loren.pid");
  const logFilePath = path.join(runtimeDir, "bridge.log");
  const errorLogFilePath = path.join(runtimeDir, "bridge.err.log");
+ const userHome = process.env.USERPROFILE || process.env.HOME || projectRoot;
+ const claudeSettingsPath = path.join(userHome, ".claude", "settings.json");

  // Force working directory to project root for config loading
  process.chdir(projectRoot);
@@ -53,7 +55,7 @@ const COMMANDS = {

  function main() {
  const args = process.argv.slice(2);
- const [command, subcommand, ...rest] = args;
+ const [command] = args;

  if (!command || command === "help" || command === "--help" || command === "-h") {
  printHelp();
@@ -78,7 +80,7 @@ function main() {
  }

  if (category && action && COMMANDS[category] && COMMANDS[category][action]) {
- COMMANDS[category][action](rest);
+ COMMANDS[category][action](args.slice(1));
  return;
  }

@@ -194,9 +196,13 @@ function setModel(args) {
  const envVars = loadEnvFile(envFilePath);
  envVars.DEFAULT_MODEL_ALIAS = requestedModel;
  saveEnvFile(envFilePath, envVars);
+ syncClaudeSelectedModel(requestedModel);

  console.log(`\n✓ Default model set to: ${requestedModel}`);
  console.log(" New requests will use this model immediately.");
+ if (fs.existsSync(claudeSettingsPath)) {
+ console.log(" Claude Code settings were updated as well.");
+ }
  console.log("");
  }

@@ -469,6 +475,28 @@ function safeUnlink(filePath) {
  }
  }

+ function syncClaudeSelectedModel(model) {
+ const settingsDir = path.dirname(claudeSettingsPath);
+ fs.mkdirSync(settingsDir, { recursive: true });
+
+ let settings = {};
+ if (fs.existsSync(claudeSettingsPath)) {
+ try {
+ settings = JSON.parse(fs.readFileSync(claudeSettingsPath, "utf8").replace(/^\uFEFF/, ""));
+ } catch {
+ settings = {};
+ }
+ }
+
+ const availableModels = Array.isArray(settings.availableModels) ? settings.availableModels : [];
+ if (!availableModels.includes(model)) {
+ settings.availableModels = [model, ...availableModels];
+ }
+
+ settings.model = model;
+ fs.writeFileSync(claudeSettingsPath, `${JSON.stringify(settings, null, 2)}\n`, "utf8");
+ }
+
  // ============== HELP ==============

  function printHelp() {
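The new `syncClaudeSelectedModel` helper above merges the chosen model into the Claude settings file: prepend the model to `availableModels` if it is missing, then set it as the current model. A minimal, file-system-free sketch of that merge logic (the settings object here is hypothetical):

```javascript
// Merge a selected model into a settings object, as syncClaudeSelectedModel does:
// keep unrelated keys, prepend the model to availableModels when absent,
// and always set it as the current model.
function mergeSelectedModel(settings, model) {
  const availableModels = Array.isArray(settings.availableModels)
    ? settings.availableModels
    : [];
  const merged = { ...settings, availableModels };
  if (!availableModels.includes(model)) {
    merged.availableModels = [model, ...availableModels];
  }
  merged.model = model;
  return merged;
}

const next = mergeSelectedModel(
  { theme: "dark", availableModels: ["gemma3:12b"] },
  "gpt-oss:20b"
);
console.log(next.model);           // "gpt-oss:20b"
console.log(next.availableModels); // [ 'gpt-oss:20b', 'gemma3:12b' ]
```

In the package itself this merged object is serialized back to `~/.claude/settings.json`; parse failures fall back to an empty settings object so a corrupt file does not block model switching.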
@@ -47,6 +47,7 @@ if (Test-Path $workspaceSettingsPath) {
  $settings = Read-JsonFile -Path $workspaceSettingsPath
  [void]$settings.Remove("claudeCode.claudeProcessWrapper")
  [void]$settings.Remove("claudeCode.disableLoginPrompt")
+ [void]$settings.Remove("claudeCode.environmentVariables")
  Write-JsonFile -Path $workspaceSettingsPath -Data $settings
  }

@@ -70,4 +71,4 @@ if (Test-Path $launcherExePath) {
  Remove-Item -LiteralPath $launcherExePath -Force -ErrorAction SilentlyContinue
  }

- Write-Host "Configurazione globale rimossa."
+ Write-Host "Global configuration removed."
package/src/cache.js CHANGED
@@ -1,21 +1,21 @@
  import NodeCache from 'node-cache';
  import logger from './logger.js';

- // Cache per i modelli (5 minuti di TTL)
+ // Cache for models (5 minute TTL)
  export const modelCache = new NodeCache({
- stdTTL: 300, // 5 minuti
- checkperiod: 60, // Controlla ogni minuto se ci sono entry scadute
- useClones: false // Performance migliore se non usiamo clones
+ stdTTL: 300, // 5 minutes
+ checkperiod: 60, // Check for expired entries every minute
+ useClones: false // Better performance when values are not cloned
  });

- // Cache per le risposte delle API (30 secondi)
+ // Cache for API responses (30 seconds)
  export const apiCache = new NodeCache({
  stdTTL: 30,
  checkperiod: 10,
  useClones: false
  });

- // Funzione helper per il caching con error handling
+ // Helper for cache access with error handling
  export function getFromCache(cache, key) {
  try {
  const value = cache.get(key);
@@ -51,7 +51,7 @@ export function deleteFromCache(cache, key) {
  }
  }

- // Stats per il monitoraggio
+ // Monitoring stats
  export function getCacheStats(cache, name) {
  const stats = cache.getStats();
  return {
@@ -61,4 +61,4 @@ export function getCacheStats(cache, name) {
  keys: cache.keys().length,
  hitRate: stats.hits > 0 ? (stats.hits / (stats.hits + stats.misses) * 100).toFixed(2) : 0
  };
- }
+ }
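The `getCacheStats` helper above derives a hit rate from node-cache's hit/miss counters. A self-contained sketch of just that calculation (the stats objects here are hypothetical, not taken from a live cache):

```javascript
// Hit rate as computed in getCacheStats: hits / (hits + misses) * 100,
// returned as a fixed-point string, or 0 when there are no hits yet.
function hitRate(stats) {
  return stats.hits > 0
    ? (stats.hits / (stats.hits + stats.misses) * 100).toFixed(2)
    : 0;
}

console.log(hitRate({ hits: 75, misses: 25 })); // "75.00"
console.log(hitRate({ hits: 0, misses: 10 }));  // 0
```

Note the asymmetry kept from the source: a cache with zero hits reports the number `0`, not the string `"0.00"`, so consumers should not assume the field is always a string.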
@@ -7,7 +7,7 @@ export class ConfigWatcher {
  this.configFile = configFile;
  this.onChange = onChange;
  this.watcher = null;
- this.debounceTimeout = 1000; // 1 secondo
+ this.debounceTimeout = 1000; // 1 second
  this.debounceTimer = null;
  }

@@ -45,7 +45,7 @@ export class ConfigWatcher {
  }

  handleChange() {
- // Debounce per evitare multipli reload rapidi
+ // Debounce to avoid multiple rapid reloads
  if (this.debounceTimer) {
  clearTimeout(this.debounceTimer);
  }
@@ -62,7 +62,7 @@ export class ConfigWatcher {
  }
  }

- // Funzione helper per creare un config watcher con reload automatico
+ // Helper to create a config watcher with automatic reload
  export function createConfigWatcher(configFile, loadConfigFunction) {
  const watcher = new ConfigWatcher(configFile, async () => {
  const newConfig = loadConfigFunction();
@@ -70,4 +70,4 @@ export function createConfigWatcher(configFile, loadConfigFunction) {
  });

  return watcher;
- }
+ }
@@ -55,7 +55,7 @@ export function getAgentStats() {
  };
  }

- // Cleanup function per chiudere tutti gli agent
+ // Cleanup function to close all agents
  export function closeAgents() {
  return new Promise((resolve) => {
  let pending = 2;
@@ -77,4 +77,4 @@ export function closeAgents() {
  done();
  });
  });
- }
+ }
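The `closeAgents` cleanup above resolves a single promise once both agents have closed, using a `pending` counter decremented by a shared `done` callback. A generalized sketch of that pattern (the closer callbacks here are stand-ins, not the package's real agents):

```javascript
// Resolve one promise after every independent close callback has finished.
function closeAll(closers) {
  return new Promise((resolve) => {
    let pending = closers.length;
    if (pending === 0) return resolve();
    const done = () => {
      pending -= 1;
      if (pending === 0) resolve();
    };
    for (const close of closers) close(done);
  });
}

const order = [];
closeAll([
  (done) => { order.push("http"); done(); },
  (done) => { order.push("https"); done(); },
]);
console.log(order); // [ 'http', 'https' ]
```

The counter approach avoids wrapping each close callback in its own promise; `Promise.all` over promisified closers would be the more idiomatic modern alternative.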
@@ -11,7 +11,7 @@ export class KeyManager {
  }));
  this.index = 0;
  this.maxFailures = 3;
- this.failureWindowMs = 5 * 60 * 1000; // 5 minuti
+ this.failureWindowMs = 5 * 60 * 1000; // 5 minutes
  }

  async getHealthyKey() {
@@ -21,7 +21,7 @@ export class KeyManager {
  do {
  const keyInfo = this.keys[this.index];

- // Resetta lo stato se è passato abbastanza tempo dall'ultimo fallimento
+ // Reset key state if enough time has passed since the last failure
  if (keyInfo.lastFailure && (now - keyInfo.lastFailure) > this.failureWindowMs) {
  keyInfo.failureCount = 0;
  keyInfo.healthy = true;
@@ -66,4 +66,4 @@ export class KeyManager {
  }))
  };
  }
- }
+ }
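The `KeyManager` hunks above reset a key's failure state once `failureWindowMs` has elapsed since its last failure, so temporarily failing keys return to rotation after five minutes. A reduced sketch of that reset check (field names follow the diff; the key values and timestamps are hypothetical):

```javascript
// A key becomes healthy again when its last failure is older than the window.
const failureWindowMs = 5 * 60 * 1000; // 5 minutes, as in the diff

function refreshKeyState(keyInfo, now) {
  if (keyInfo.lastFailure && now - keyInfo.lastFailure > failureWindowMs) {
    return { ...keyInfo, failureCount: 0, healthy: true };
  }
  return keyInfo;
}

const now = Date.now();
const stale = { key: "sk-a", healthy: false, failureCount: 3, lastFailure: now - 6 * 60 * 1000 };
const recent = { key: "sk-b", healthy: false, failureCount: 1, lastFailure: now - 60 * 1000 };

console.log(refreshKeyState(stale, now).healthy);  // true
console.log(refreshKeyState(recent, now).healthy); // false
```

In the real class this check runs inside `getHealthyKey` as it walks the key ring, so a stale failure never permanently removes a key from rotation.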
package/src/metrics.js CHANGED
@@ -1,8 +1,8 @@
  import logger from './logger.js';

- // Metriche raccolte
+ // Collected metrics
  const metrics = {
- // Contatori
+ // Counters
  requests: {
  total: 0,
  byEndpoint: {
@@ -29,7 +29,7 @@ const metrics = {
  other: 0
  }
  },
- // Tempi di risposta
+ // Response times
  responseTimes: [],
  // Token usage
  tokens: {
@@ -39,13 +39,13 @@ const metrics = {
  },
  // Uptime
  startTime: Date.now(),
- // Connessioni attive
+ // Active connections
  activeConnections: 0,
  // Cache stats
  cacheStats: {}
  };

- // Funzioni per incrementare metriche
+ // Metric update helpers
  export function incrementRequest(endpoint, statusCode) {
  metrics.requests.total++;
  if (metrics.requests.byEndpoint[endpoint] === undefined) {
@@ -68,7 +68,7 @@ export function incrementError(type = 'other') {

  export function recordResponseTime(duration) {
  metrics.responseTimes.push(duration);
- // Mantieni solo gli ultimi 1000 tempi
+ // Keep only the latest 1000 samples
  if (metrics.responseTimes.length > 1000) {
  metrics.responseTimes.shift();
  }
@@ -93,7 +93,7 @@ export function setCacheStats(stats) {
  metrics.cacheStats = stats;
  }

- // Calcola statistiche sui tempi di risposta
+ // Calculate response time stats
  function getResponseTimeStats() {
  if (metrics.responseTimes.length === 0) {
  return { avg: 0, min: 0, max: 0, p95: 0 };
@@ -109,7 +109,7 @@ function getResponseTimeStats() {
  return { avg: avg.toFixed(2), min, max, p95 };
  }

- // Ottieni tutte le metriche
+ // Return all collected metrics
  export function getMetrics() {
  const uptime = Date.now() - metrics.startTime;
  const memoryUsage = process.memoryUsage();
@@ -155,7 +155,7 @@ export function getMetrics() {
  };
  }

- // Funzioni di utilità
+ // Utility functions
  function formatUptime(ms) {
  const seconds = Math.floor(ms / 1000);
  const minutes = Math.floor(seconds / 60);
@@ -176,7 +176,7 @@ function formatBytes(bytes) {
  return parseFloat((bytes / Math.pow(k, i)).toFixed(2)) + ' ' + sizes[i];
  }

- // Middleware per tracciare richieste
+ // Middleware for request tracking
  export function metricsMiddleware(req, res, next) {
  const start = Date.now();
  const endpoint = (() => {
@@ -187,12 +187,12 @@ export function metricsMiddleware(req, res, next) {
  }
  })();

- // Override res.end per catturare la risposta
+ // Override res.end to capture the response
  const originalEnd = res.end;
  res.end = function(chunk, encoding) {
  const duration = Date.now() - start;

- // Registra metriche
+ // Record metrics
  incrementRequest(endpoint, res.statusCode);
  recordResponseTime(duration);

@@ -201,7 +201,7 @@ export function metricsMiddleware(req, res, next) {
  incrementError(errorType);
  }

- // Ripristina il metodo originale
+ // Restore the original method
  res.end = originalEnd;
  res.end(chunk, encoding);
  };
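`getResponseTimeStats` above summarizes the rolling 1000-sample window into `avg`, `min`, `max`, and `p95`, with `avg` serialized via `toFixed(2)`. The hunk context elides the body, so this is a sketch of one consistent way to compute those fields; the package's exact percentile index math may differ:

```javascript
// Summarize response-time samples: average, min, max and 95th percentile.
function responseTimeStats(samples) {
  if (samples.length === 0) return { avg: 0, min: 0, max: 0, p95: 0 };
  const sorted = [...samples].sort((a, b) => a - b);
  const avg = sorted.reduce((sum, t) => sum + t, 0) / sorted.length;
  const p95 = sorted[Math.min(sorted.length - 1, Math.floor(sorted.length * 0.95))];
  return { avg: avg.toFixed(2), min: sorted[0], max: sorted[sorted.length - 1], p95 };
}

const stats = responseTimeStats([120, 80, 450, 95, 100]);
console.log(stats); // { avg: '169.00', min: 80, max: 450, p95: 450 }
```

Keeping only the latest 1000 samples (via `shift()` in `recordResponseTime`) bounds memory at the cost of O(n) removal; a ring buffer would avoid that but the window is small enough not to matter here.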
package/src/server.js CHANGED
@@ -15,7 +15,7 @@ import { getMetrics, incrementError, recordTokenUsage, metricsMiddleware } from
  import { createConfigWatcher } from "./config-watcher.js";
  import usageTracker from "./usage-tracker.js";

- // Configurazione globale
+ // Global runtime state
  const __filename = fileURLToPath(import.meta.url);
  const __dirname = path.dirname(__filename);
  const projectRoot = path.resolve(__dirname, "..");
@@ -25,7 +25,7 @@ ensureEnvLocal(projectRoot, { logger });

  let config = loadConfig();
  let keyManager = new KeyManager(config.apiKeys);
- const envFilePath = path.resolve(process.cwd(), ".env.local");
+ const envFilePath = path.join(projectRoot, ".env.local");

  function reloadRuntimeConfig() {
  config = loadConfig();
@@ -35,15 +35,15 @@ function reloadRuntimeConfig() {
  }

  // Config watcher
- const configWatcher = createConfigWatcher('.env.local', () => {
+ const configWatcher = createConfigWatcher(envFilePath, () => {
  reloadRuntimeConfig();
  void probeAllApiKeys();
  });

- // Avvia il watcher
+ // Start the watcher
  configWatcher.start();

- // Cleanup alla chiusura
+ // Cleanup on shutdown
  process.on('SIGINT', () => {
  logger.info('Shutting down gracefully...');
  configWatcher.stop();
@@ -62,7 +62,7 @@ if (!config.apiKeys.length) {
  }

  const server = http.createServer(async (req, res) => {
- // Applica il middleware di metriche
+ // Apply metrics middleware
  metricsMiddleware(req, res, () => {});

  try {
@@ -106,7 +106,7 @@ async function routeRequest(req, res) {

  const url = new URL(req.url, `http://${req.headers.host || "localhost"}`);

- // Log della richiesta
+ // Request logging
  logger.info(`${req.method} ${url.pathname}`, {
  ip: req.socket.remoteAddress,
  userAgent: req.headers['user-agent']
@@ -134,7 +134,7 @@ async function routeRequest(req, res) {
  return;
  }

- // Usage API endpoint (GET per dati, POST per reset)
+ // Usage API endpoint (GET for data, POST for reset)
  if (url.pathname === "/api/usage" && (req.method === "GET" || req.method === "POST")) {
  await handleUsage(req, res);
  return;
@@ -192,7 +192,7 @@ async function handleHealth(_req, res) {
  }

  async function handleMetrics(_req, res) {
- // Aggiorna cache stats
+ // Refresh cache stats
  const cacheStats = {
  models: getCacheStats(modelCache, 'models')
  };
@@ -316,7 +316,7 @@ async function handleEvents(req, res) {

  async function handleDashboard(_req, res) {
  try {
- // Usa il percorso corretto per il file dashboard.html
+ // Resolve the dashboard.html path relative to this module
  const __filename = fileURLToPath(import.meta.url);
  const __dirname = path.dirname(__filename);
  const dashboardPath = path.join(__dirname, 'dashboard.html');
@@ -392,7 +392,7 @@ async function handleUsage(req, res) {
  try {
  const url = new URL(req.url, `http://${req.headers.host || "localhost"}`);

- // Supporta reset via query parameter: /api/usage?reset=true
+ // Support reset via query parameter: /api/usage?reset=true
  if (url.searchParams.get('reset') === 'true') {
  usageTracker.resetAll();
  sendJson(res, 200, { ok: true, message: 'Usage data reset successfully' });
@@ -401,7 +401,7 @@ async function handleUsage(req, res) {

  const usageData = usageTracker.getDashboardData();

- // Aggiungi informazioni sui rate limit attivi
+ // Add active rate limit details
  const rateLimitedKeys = usageData.keys.filter(k => k.isRateLimited);

  sendJson(res, 200, {
@@ -518,7 +518,7 @@ async function handleMessages(req, res) {
  try {
  const body = await readJson(req);

- // Valida input
+ // Validate input
  const validatedBody = validateInput(MessageSchema, body);

  const anthropicRequest = normalizeAnthropicRequest(validatedBody);
@@ -547,7 +547,7 @@ async function handleMessages(req, res) {
  const payload = await upstream.json();
  const message = ollamaToAnthropicMessage(payload, anthropicRequest.model);

- // Registra token usage con la chiave specifica
+ // Record token usage for the specific API key
  usageTracker.recordUsage(apiKey, message.usage?.output_tokens || 0);
  recordTokenUsage(
  anthropicRequest.model,
@@ -602,7 +602,7 @@ async function handleCountTokens(req, res) {
  }
  }

- // Resto delle funzioni helper (abbreviate per brevità)
+ // Remaining helper functions
  function normalizeAnthropicRequest(body) {
  const requestedModel = body.model || config.defaultModel;
  return {
@@ -765,12 +765,11 @@ function ollamaToAnthropicMessage(payload, requestedModel) {
  });
  }

- // Traccia l'usage con i token reali
+ // Track usage with actual token counts
  const inputTokens = payload.prompt_eval_count || 0;
  const outputTokens = payload.eval_count || 0;

- // Qui dovremmo avere l'API key usata, ma per ora usiamo un approccio diverso
- // Lo tracceremo a livello superiore
+ // The API key is tracked at a higher level for now

  return {
  id: `msg_${randomUUID().replace(/-/g, "")}`,
@@ -912,7 +911,7 @@ async function pipeStreamingResponse(upstream, request, res, apiKey) {
  nextIndex += 1;
  }

- // Registra token usage con la chiave specifica (anche per streaming)
+ // Record token usage for the specific API key, including streaming responses
  usageTracker.recordUsage(apiKey, outputTokens || estimateTokens(aggregatedText));
  recordTokenUsage(
  request.model,
@@ -988,7 +987,7 @@ function mapDoneReason(reason) {
  }

  async function fetchUpstream(pathname, init) {
- // Controlla prima se ci sono chiavi rate limited e suggerisci la migliore
+ // Check for rate-limited keys first and pick the best candidate
  const suggestedKey = usageTracker.suggestNextKey(config.apiKeys);
  if (!suggestedKey) {
  throw new Error('All API keys are rate limited');
@@ -999,10 +998,10 @@ async function fetchUpstream(pathname, init) {
  try {
  const response = await performUpstreamFetch(apiKey, pathname, init);

- // Traccia l'usage (indipendentemente dal successo)
- usageTracker.recordUsage(apiKey, 0); // Tokens verranno aggiornati dopo
+ // Track usage regardless of request success
+ usageTracker.recordUsage(apiKey, 0); // Token counts are updated later

- // Controlla risposte di rate limit
+ // Handle upstream rate limit responses
  if (response.status === 429) {
  let detailsText = '';
  let reason = 'Rate limit reached';
@@ -1040,7 +1039,7 @@ async function fetchUpstream(pathname, init) {
  usageTracker.markHealthy(apiKey);
  }

- // Restituisci sia la response che la chiave usata
+ // Return both the response and the API key that was used
  return { response, apiKey };
  } catch (error) {
  keyManager.markKeyFailed(suggestedKey, error);
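The `fetchUpstream` changes above pick the best available key via `usageTracker.suggestNextKey`, treat an HTTP 429 as a signal that the key is rate-limited, and return both the response and the key that served it. A simplified, synchronous sketch of that failover loop (the fetch function here is a stub, not the package's real upstream call, and the real code is async):

```javascript
// Try each key in turn; skip keys already marked rate-limited and
// mark a key limited when the upstream answers 429.
function fetchWithFailover(keys, limited, doFetch) {
  for (const key of keys) {
    if (limited.has(key)) continue;
    const response = doFetch(key);
    if (response.status === 429) {
      limited.add(key); // remember the exhausted key
      continue;         // rotate to the next one
    }
    return { response, apiKey: key };
  }
  throw new Error("All API keys are rate limited");
}

// Stub upstream: the first key is exhausted, the second succeeds.
const limited = new Set();
const result = fetchWithFailover(
  ["sk-a", "sk-b"],
  limited,
  (key) => ({ status: key === "sk-a" ? 429 : 200 })
);
console.log(result.apiKey, result.response.status); // sk-b 200
```

Returning the key alongside the response is what lets `handleMessages` and `pipeStreamingResponse` attribute token usage to the specific key that served the request.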