@bonginkan/maria 3.1.7 → 3.1.8

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,12 +1,12 @@
- # 📊 MARIA v3.1.6
+ # 🔧 MARIA v3.1.8

- **Advanced Telemetry & Analytics** - Enterprise AI platform with ML-powered monitoring, predictive analytics, and real-time insights. Features 3 core services plus comprehensive telemetry system.
+ **Zero Hardcoding Architecture** - Enterprise AI platform with centralized configuration, GPT-5 Mini default, and simplified environment management. Features 3 core services plus advanced telemetry.

  🌐 **Homepage**: [https://bonginkan.ai/](https://bonginkan.ai/)

  [![TypeScript](https://img.shields.io/badge/TypeScript-5.0%2B-blue)](https://www.typescriptlang.org/)
  [![Node.js](https://img.shields.io/badge/Node.js-20%2B-green)](https://nodejs.org/)
- [![npm](https://img.shields.io/npm/v/@bonginkan/maria?label=npm%20v3.1.6)](https://www.npmjs.com/package/@bonginkan/maria)
+ [![npm](https://img.shields.io/npm/v/@bonginkan/maria?label=npm%20v3.1.8)](https://www.npmjs.com/package/@bonginkan/maria)
  [![Bundle Size](https://img.shields.io/badge/Bundle-192KB-brightgreen)](https://github.com/bonginkan/maria)
  [![API Surface](https://img.shields.io/badge/Public%20API-3%20Services-blue)](https://github.com/bonginkan/maria)

@@ -26,7 +26,7 @@ When you run `maria`, you'll see:
  ║ (c) 2025 Bonginkan Inc. ║
  ╚══════════════════════════════════════════════════════════╝

- MARIA CODE v3.1.6 — Ready
+ MARIA CODE v3.1.8 — Ready
  /help for commands | Providers: 8/8 OK

  Available AI Providers:
@@ -44,7 +44,7 @@ npm install -g @bonginkan/maria
  pnpm add -g @bonginkan/maria
  ```

- ## 📊 Quick Start - Telemetry (New in v3.1.6)
+ ## 📊 Quick Start - Telemetry (v3.1.6+)

  ```bash
  # Start monitoring stack (optional)
@@ -64,7 +64,7 @@ MARIA revolutionizes AI development with **3 powerful services**, **advanced tel
  - 🧩 **DualMemoryEngine** - System 1 & System 2 cognitive memory
  - 📁 **FileSystemService** - Safe, atomic file operations

- ### 🆕 Telemetry System (v3.1.6)
+ ### 🆕 Telemetry System (v3.1.6+)
  - 📈 **Prometheus/Grafana Integration** - Real-time metrics & dashboards
  - 🤖 **ML Anomaly Detection** - Isolation Forest, Autoencoder, LSTM models
  - 📊 **Predictive Analytics** - ARIMA, Prophet, Transformer forecasting
@@ -118,7 +118,7 @@ const fs = new FileSystemService({
  });
  ```

- ### 🆕 Telemetry Example (v3.1.6)
+ ### 🆕 Telemetry Example (v3.1.6+)

  ```typescript
  import { startTelemetry } from '@bonginkan/maria';
@@ -316,26 +316,31 @@ See [LICENSE](LICENSE) for full details.
  - **Documentation**: https://docs.maria-ai.dev
  - **Support**: https://github.com/bonginkan/maria/issues

- ## 🎉 What's New in v3.1.6
-
- ### 📊 Advanced Telemetry System
- - **Prometheus/Grafana Integration**: Real-time metrics visualization
- - **ML Anomaly Detection**: 3 algorithms (Isolation Forest, Autoencoder, LSTM)
- - **Predictive Analytics**: 4 forecasting models (LSTM, ARIMA, Prophet, Transformer)
- - **Custom Metrics Framework**: Flexible metric types with alerts
- - **Docker Compose Setup**: One-command monitoring deployment
-
- ### 📈 Monitoring Features
- - **9 Core Metrics**: Response times, error rates, token usage, satisfaction
- - **Pre-built Dashboards**: Beautiful Grafana visualizations
- - **Alert System**: Slack integration with severity levels
- - **Trend Analysis**: Direction, strength, and seasonality detection
- - **Export Formats**: JSON, Prometheus, CSV
-
- ### Core Features (from v3.0.0)
- - **Minimal API Surface**: 3 core services + telemetry
- - **Beautiful CLI Experience**: Professional startup screen
- - **Performance Optimized**: 192KB bundle, < 1s startup
+ ## 🎉 What's New in v3.1.8
+
+ ### 🔧 Zero Hardcoding Architecture
+ - **Centralized Configuration**: All defaults in `config/defaults.ts`
+ - **Dynamic Version Management**: Auto-pulled from package.json
+ - **Configurable AI Models**: Environment-based model lists
+ - **Simplified Environment Files**: Just `.env` and `.env.example`
+ - **GPT-5 Mini Default**: Updated from GPT-3.5-turbo
+
+ ### 🤖 Configuration Improvements
+ - **Model Lists**: `OPENAI_MODELS`, `ANTHROPIC_MODELS` env vars
+ - **Custom Endpoints**: Configurable API endpoints
+ - **Environment Priority**: Env vars > Defaults > Fallbacks
+ - **No Hardcoded Values**: Everything externalized
+
+ ### 📊 Telemetry System (v3.1.6+)
+ - **Prometheus/Grafana**: Real-time metrics
+ - **ML Anomaly Detection**: 3 algorithms
+ - **Predictive Analytics**: 4 forecasting models
+ - **Custom Metrics**: Flexible framework
+
+ ### ✨ Core Features
+ - **3 Core Services**: Router, Memory, FileSystem
+ - **8 AI Providers**: Cloud & Local support
+ - **Beautiful CLI**: Professional experience

  ### 🚨 Breaking Changes from v2.x
  - **API Simplification**: Many internal services no longer exposed
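The "Env vars > Defaults > Fallbacks" priority listed in the release notes can be sketched as a tiny resolution chain. This is illustrative only: `resolveSetting` is a hypothetical helper, not part of the package's public API.

```javascript
// Sketch of the documented resolution order: an explicit environment
// variable wins, then the centralized default from config/defaults.ts,
// then a last-resort hardcoded fallback.
function resolveSetting(envValue, configuredDefault, fallback) {
  if (envValue !== undefined && envValue !== "") return envValue; // 1. env var
  if (configuredDefault !== undefined) return configuredDefault;  // 2. default
  return fallback;                                                // 3. fallback
}

console.log(resolveSetting("gpt-4", "gpt-5-mini", "gpt-3.5-turbo"));
console.log(resolveSetting(undefined, "gpt-5-mini", "gpt-3.5-turbo"));
console.log(resolveSetting(undefined, undefined, "gpt-3.5-turbo"));
```

The shipped code expresses the same chain inline with `process.env.X || "<default>"`, as the `DEFAULT_PROVIDER_PREFS` hunk below shows.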
@@ -10879,9 +10879,34 @@ ${this.attachedFiles.map((f2) => `- ${f2}`).join("\n")}`;
  });

  // src/config/defaults.ts
- var DEFAULT_MEMORY_CONFIG, DEFAULT_ROUTER_CONFIG, DEFAULT_UI_CONFIG, DEFAULT_PATHS, DEFAULT_ENV_VARS2, DEFAULT_DUMMY_VALUES2, DEFAULT_HELP_TEXTS2, DEFAULT_COMMAND_OUTPUTS2, DEFAULT_FEATURE_FLAGS, DEFAULT_PROVIDER_PREFS, DEFAULT_STARTUP, DEFAULT_HSR_CONFIG, DEFAULT_RL_CONFIG, DEFAULT_CONFIG;
+ var import_fs, import_path, packageVersion, parseList, DEFAULT_MEMORY_CONFIG, DEFAULT_ROUTER_CONFIG, DEFAULT_UI_CONFIG, DEFAULT_PATHS, DEFAULT_ENV_VARS2, DEFAULT_DUMMY_VALUES2, DEFAULT_HELP_TEXTS2, DEFAULT_COMMAND_OUTPUTS2, DEFAULT_FEATURE_FLAGS, DEFAULT_PROVIDER_PREFS, AI_PROVIDERS_CONFIG, DEFAULT_STARTUP, DEFAULT_HSR_CONFIG, DEFAULT_RL_CONFIG, APP_VERSION, DEFAULT_CONFIG;
  var init_defaults = __esm({
  "src/config/defaults.ts"() {
+ import_fs = require("fs");
+ import_path = require("path");
+ packageVersion = "3.1.8";
+ try {
+ const possiblePaths = [
+ (0, import_path.join)(process.cwd(), "package.json"),
+ (0, import_path.join)(__dirname, "../../package.json")
+ ];
+ for (const path21 of possiblePaths) {
+ try {
+ const pkgContent = (0, import_fs.readFileSync)(path21, "utf8");
+ const pkg2 = JSON.parse(pkgContent);
+ if (pkg2.version) {
+ packageVersion = pkg2.version;
+ break;
+ }
+ } catch {
+ }
+ }
+ } catch {
+ }
+ parseList = (envVar, defaultList) => {
+ if (!envVar) return defaultList;
+ return envVar.split(",").map((s2) => s2.trim()).filter(Boolean);
+ };
  DEFAULT_MEMORY_CONFIG = {
  system1: {
  maxKnowledgeNodes: 1e3,
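The `parseList` helper added in this hunk is small enough to exercise in isolation. Reproduced standalone from the diff: it turns a comma-separated env var into a model list, trimming whitespace and dropping empty entries, and falls back to the bundled defaults when the variable is unset.

```javascript
// parseList as introduced in src/config/defaults.ts (v3.1.8), verbatim
// apart from variable naming.
const parseList = (envVar, defaultList) => {
  if (!envVar) return defaultList;
  return envVar.split(",").map((s) => s.trim()).filter(Boolean);
};

console.log(parseList("gpt-4, o1-mini,", ["gpt-5-mini"])); // trims, drops empties
console.log(parseList(undefined, ["gpt-5-mini"]));          // falls back to defaults
```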
@@ -10979,13 +11004,56 @@ var init_defaults = __esm({
  enableProactiveReporting: true
  };
  DEFAULT_PROVIDER_PREFS = {
- provider: "openai",
- model: "gpt-4",
- offline: false,
- debug: false,
+ provider: process.env.AI_PROVIDER || "openai",
+ model: process.env.OPENAI_MODEL || "gpt-5-mini-2025-08-07",
+ offline: process.env.OFFLINE_MODE === "true",
+ debug: process.env.DEBUG === "true",
  priority: "auto",
- maxTokens: 4096,
- temperature: 0.7
+ maxTokens: parseInt(process.env.OPENAI_MAX_TOKENS || "2000", 10),
+ temperature: parseFloat(process.env.OPENAI_TEMPERATURE || "0.7")
+ };
+ AI_PROVIDERS_CONFIG = {
+ openai: {
+ apiKey: process.env.OPENAI_API_KEY,
+ model: process.env.OPENAI_MODEL || "gpt-5-mini-2025-08-07",
+ availableModels: parseList(process.env.OPENAI_MODELS, [
+ "gpt-5-mini-2025-08-07",
+ "gpt-5-mini",
+ "gpt-4",
+ "gpt-4-turbo",
+ "o1-preview",
+ "o1-mini"
+ ]),
+ endpoint: process.env.OPENAI_ENDPOINT || "https://api.openai.com/v1/chat/completions",
+ maxTokens: parseInt(process.env.OPENAI_MAX_TOKENS || "2000", 10),
+ temperature: parseFloat(process.env.OPENAI_TEMPERATURE || "0.7"),
+ timeout: parseInt(process.env.OPENAI_TIMEOUT || "30000", 10),
+ maxRetries: parseInt(process.env.OPENAI_MAX_RETRIES || "3", 10)
+ },
+ anthropic: {
+ apiKey: process.env.ANTHROPIC_API_KEY,
+ model: process.env.ANTHROPIC_MODEL || "claude-3-sonnet-20240229",
+ availableModels: parseList(process.env.ANTHROPIC_MODELS, [
+ "claude-3-opus-20240229",
+ "claude-3-sonnet-20240229",
+ "claude-3-haiku-20240307"
+ ]),
+ endpoint: process.env.ANTHROPIC_ENDPOINT || "https://api.anthropic.com/v1/messages",
+ maxTokens: parseInt(process.env.ANTHROPIC_MAX_TOKENS || "2000", 10),
+ temperature: parseFloat(process.env.ANTHROPIC_TEMPERATURE || "0.7")
+ },
+ ollama: {
+ enabled: process.env.OLLAMA_ENABLED === "true",
+ apiUrl: process.env.OLLAMA_API_URL || "http://localhost:11434",
+ defaultModel: process.env.OLLAMA_DEFAULT_MODEL || "llama3.2:3b",
+ maxTokens: parseInt(process.env.OLLAMA_MAX_TOKENS || "4096", 10)
+ },
+ lmstudio: {
+ enabled: process.env.LMSTUDIO_ENABLED === "true",
+ apiUrl: process.env.LMSTUDIO_API_URL || "http://localhost:1234",
+ defaultModel: process.env.LMSTUDIO_DEFAULT_MODEL || "gpt-oss-120b",
+ maxTokens: parseInt(process.env.LMSTUDIO_MAX_TOKENS || "8192", 10)
+ }
  };
  DEFAULT_STARTUP = {
  showLogo: true,
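Every numeric setting in the hunk above follows one pattern: `parseInt`/`parseFloat` over `process.env.X || "<default>"`. A standalone sketch of that pattern (the env var name is taken from the diff; the `intFromEnv` wrapper is illustrative, not part of the package):

```javascript
// Mirrors the pattern parseInt(process.env.OPENAI_MAX_TOKENS || "2000", 10)
// used throughout AI_PROVIDERS_CONFIG.
function intFromEnv(name, fallback) {
  return parseInt(process.env[name] || String(fallback), 10);
}

process.env.OPENAI_MAX_TOKENS = "3000";
console.log(intFromEnv("OPENAI_MAX_TOKENS", 2000)); // env value wins
delete process.env.OPENAI_MAX_TOKENS;
console.log(intFromEnv("OPENAI_MAX_TOKENS", 2000)); // fallback applies

// Caveat: a malformed value like "abc" makes parseInt return NaN;
// the shipped pattern does not guard against that.
```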
@@ -11011,7 +11079,9 @@ var init_defaults = __esm({
  enablePPO: true,
  enableDPO: false
  };
+ APP_VERSION = process.env.npm_package_version || packageVersion;
  DEFAULT_CONFIG = {
+ version: APP_VERSION,
  memory: DEFAULT_MEMORY_CONFIG,
  router: DEFAULT_ROUTER_CONFIG,
  ui: DEFAULT_UI_CONFIG,
@@ -11022,6 +11092,7 @@ var init_defaults = __esm({
  command: DEFAULT_COMMAND_OUTPUTS2,
  flags: DEFAULT_FEATURE_FLAGS,
  provider: DEFAULT_PROVIDER_PREFS,
+ providers: AI_PROVIDERS_CONFIG,
  startup: DEFAULT_STARTUP,
  hsr: DEFAULT_HSR_CONFIG,
  rl: DEFAULT_RL_CONFIG
@@ -11030,10 +11101,10 @@ var init_defaults = __esm({
  });

  // src/services/conversation-persistence.ts
- var import_fs, path3, os2, ConversationPersistence;
+ var import_fs2, path3, os2, ConversationPersistence;
  var init_conversation_persistence = __esm({
  "src/services/conversation-persistence.ts"() {
- import_fs = require("fs");
+ import_fs2 = require("fs");
  path3 = __toESM(require("path"), 1);
  os2 = __toESM(require("os"), 1);
  ConversationPersistence = class {
@@ -11054,7 +11125,7 @@ var init_conversation_persistence = __esm({
  async ensureConfigDir() {
  try {
  const configDir = path3.dirname(this.sessionFile);
- await import_fs.promises.mkdir(configDir, { recursive: true });
+ await import_fs2.promises.mkdir(configDir, { recursive: true });
  } catch (error2) {
  console.warn("Failed to create config directory:", error2);
  }
@@ -11064,7 +11135,7 @@ var init_conversation_persistence = __esm({
  */
  async loadHistory() {
  try {
- const data2 = await import_fs.promises.readFile(this.sessionFile, "utf-8");
+ const data2 = await import_fs2.promises.readFile(this.sessionFile, "utf-8");
  const session = JSON.parse(data2);
  const messages2 = session.messages.map((msg2) => ({
  ...msg2,
@@ -11102,7 +11173,7 @@ var init_conversation_persistence = __esm({
  model: messages2[messages2.length - 1]?.model
  }
  };
- await import_fs.promises.writeFile(this.sessionFile, JSON.stringify(session, null, 2));
+ await import_fs2.promises.writeFile(this.sessionFile, JSON.stringify(session, null, 2));
  } catch (error2) {
  console.warn("Failed to save conversation history:", error2);
  }
@@ -11112,7 +11183,7 @@ var init_conversation_persistence = __esm({
  */
  async clearHistory() {
  try {
- await import_fs.promises.unlink(this.sessionFile);
+ await import_fs2.promises.unlink(this.sessionFile);
  } catch (error2) {
  }
  this.pendingWrites = [];
@@ -11123,7 +11194,7 @@ var init_conversation_persistence = __esm({
  async getStats() {
  try {
  const messages2 = await this.loadHistory();
- const stats2 = await import_fs.promises.stat(this.sessionFile);
+ const stats2 = await import_fs2.promises.stat(this.sessionFile);
  return {
  totalMessages: messages2.length,
  oldestMessage: messages2[0]?.timestamp,
@@ -12011,20 +12082,16 @@ var OpenAIProvider2;
  var init_openai_provider2 = __esm({
  "src/services/ai-response/providers/openai-provider.ts"() {
  init_base_provider();
+ init_defaults();
  OpenAIProvider2 = class extends BaseAIProvider2 {
  name = "OpenAI";
- apiEndpoint = "https://api.openai.com/v1/chat/completions";
+ apiEndpoint = AI_PROVIDERS_CONFIG.openai.endpoint;
  async performInitialization() {
  if (!this.config.defaultModel) {
- this.config.defaultModel = "gpt-3.5-turbo";
+ this.config.defaultModel = AI_PROVIDERS_CONFIG.openai.model;
  }
- this.availableModels = [
- "gpt-4",
- "gpt-4-turbo-preview",
- "gpt-3.5-turbo",
- "gpt-3.5-turbo-16k"
- ];
- if (!this.config.apiKey) {
+ this.availableModels = AI_PROVIDERS_CONFIG.openai.availableModels;
+ if (!this.config.apiKey && !AI_PROVIDERS_CONFIG.openai.apiKey) {
  console.warn("[OpenAI Provider] No API key provided - will use template responses");
  }
  }
@@ -12169,21 +12236,16 @@ var AnthropicProvider2;
  var init_anthropic_provider2 = __esm({
  "src/services/ai-response/providers/anthropic-provider.ts"() {
  init_base_provider();
+ init_defaults();
  AnthropicProvider2 = class extends BaseAIProvider2 {
  name = "Anthropic";
- apiEndpoint = "https://api.anthropic.com/v1/messages";
+ apiEndpoint = AI_PROVIDERS_CONFIG.anthropic.endpoint;
  async performInitialization() {
  if (!this.config.defaultModel) {
- this.config.defaultModel = "claude-3-sonnet-20240229";
+ this.config.defaultModel = AI_PROVIDERS_CONFIG.anthropic.model;
  }
- this.availableModels = [
- "claude-3-opus-20240229",
- "claude-3-sonnet-20240229",
- "claude-3-haiku-20240307",
- "claude-2.1",
- "claude-instant-1.2"
- ];
- if (!this.config.apiKey) {
+ this.availableModels = AI_PROVIDERS_CONFIG.anthropic.availableModels;
+ if (!this.config.apiKey && !AI_PROVIDERS_CONFIG.anthropic.apiKey) {
  console.warn("[Anthropic Provider] No API key provided - will use template responses");
  }
  }
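The provider hunks above replace hardcoded endpoints and model lists with lookups into `AI_PROVIDERS_CONFIG`, while per-instance config still takes precedence over the centralized defaults. A simplified sketch of that precedence, assuming a stand-in `CENTRAL` object for `AI_PROVIDERS_CONFIG.openai` (the `initProvider` function is hypothetical, not the package's actual class):

```javascript
// Stand-in for the centralized AI_PROVIDERS_CONFIG.openai entry.
const CENTRAL = {
  model: "gpt-5-mini-2025-08-07",
  endpoint: "https://api.openai.com/v1/chat/completions",
  apiKey: undefined, // would come from process.env.OPENAI_API_KEY
};

function initProvider(instanceConfig) {
  return {
    // instance config wins; centralized default fills the gap
    defaultModel: instanceConfig.defaultModel || CENTRAL.model,
    endpoint: CENTRAL.endpoint,
    // mirrors the new dual check: warn only when neither the instance
    // nor the central config supplies a key
    hasKey: Boolean(instanceConfig.apiKey || CENTRAL.apiKey),
  };
}

console.log(initProvider({}).defaultModel);                      // central default
console.log(initProvider({ defaultModel: "gpt-4" }).defaultModel); // instance wins
```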
@@ -14790,23 +14852,23 @@ var init_type_guards = __esm({
  function loadConfig() {
  let currentDir = process.cwd();
  while (currentDir !== "/") {
- const _configPath = (0, import_path.join)(currentDir, _CONFIG_FILE);
- if ((0, import_fs2.existsSync)(_configPath)) {
+ const _configPath = (0, import_path2.join)(currentDir, _CONFIG_FILE);
+ if ((0, import_fs3.existsSync)(_configPath)) {
  try {
- const _content = (0, import_fs2.readFileSync)(_configPath, "utf-8");
+ const _content = (0, import_fs3.readFileSync)(_configPath, "utf-8");
  return (0, import_toml.parse)(_content);
  } catch {
  }
  }
- const _parentDir = (0, import_path.join)(currentDir, "..");
+ const _parentDir = (0, import_path2.join)(currentDir, "..");
  if (_parentDir === currentDir) {
  break;
  }
  currentDir = _parentDir;
  }
- if ((0, import_fs2.existsSync)(_GLOBAL_CONFIG_PATH)) {
+ if ((0, import_fs3.existsSync)(_GLOBAL_CONFIG_PATH)) {
  try {
- const _content = (0, import_fs2.readFileSync)(_GLOBAL_CONFIG_PATH, "utf-8");
+ const _content = (0, import_fs3.readFileSync)(_GLOBAL_CONFIG_PATH, "utf-8");
  return (0, import_toml.parse)(_content);
  } catch {
  }
@@ -14846,7 +14908,7 @@ async function writeConfig(config2, _path2) {
  });
  }
  function saveConfig(_config, _path2) {
- const _configPath = path || (0, import_path.join)(process.cwd(), _CONFIG_FILE);
+ const _configPath = path || (0, import_path2.join)(process.cwd(), _CONFIG_FILE);
  const lines2 = [];
  if (config.user) {
  lines2.push("[user]");
@@ -15058,17 +15120,17 @@ function saveConfig(_config, _path2) {
  lines2.push(`defaultModel = "${config.defaultModel}"`);
  }
  const _content = lines2.join("\n");
- (0, import_fs2.writeFileSync)(_configPath, _content, "utf-8");
+ (0, import_fs3.writeFileSync)(_configPath, _content, "utf-8");
  }
- var import_fs2, import_path, import_toml, import_os, _CONFIG_FILE, _GLOBAL_CONFIG_PATH;
+ var import_fs3, import_path2, import_toml, import_os, _CONFIG_FILE, _GLOBAL_CONFIG_PATH;
  var init_config = __esm({
  "src/utils/config.ts"() {
- import_fs2 = require("fs");
- import_path = require("path");
+ import_fs3 = require("fs");
+ import_path2 = require("path");
  import_toml = require("toml");
  import_os = require("os");
  _CONFIG_FILE = ".maria-code.toml";
- _GLOBAL_CONFIG_PATH = (0, import_path.join)((0, import_os.homedir)(), ".maria-code", "config.toml");
+ _GLOBAL_CONFIG_PATH = (0, import_path2.join)((0, import_os.homedir)(), ".maria-code", "config.toml");
  }
  });

@@ -16229,13 +16291,13 @@ var init_alias_system = __esm({
  });

  // src/services/template-manager.ts
- var import_path2, import_os2, import_fs3, logger3, TemplateManager;
+ var import_path3, import_os2, import_fs4, logger3, TemplateManager;
  var init_template_manager = __esm({
  "src/services/template-manager.ts"() {
  init_logger();
- import_path2 = require("path");
+ import_path3 = require("path");
  import_os2 = require("os");
- import_fs3 = require("fs");
+ import_fs4 = require("fs");
  logger3 = logger;
  TemplateManager = class _TemplateManager {
  static instance;
@@ -16244,7 +16306,7 @@ var init_template_manager = __esm({
  templatesDir;
  builtInTemplates = /* @__PURE__ */ new Map();
  constructor() {
- this.templatesDir = (0, import_path2.join)((0, import_os2.homedir)(), ".maria-code", "_templates");
+ this.templatesDir = (0, import_path3.join)((0, import_os2.homedir)(), ".maria-code", "_templates");
  this.ensureTemplatesDir();
  this.initializeBuiltInTemplates();
  this.loadUserTemplates();
@@ -16259,8 +16321,8 @@ var init_template_manager = __esm({
  * Ensure _templates directory exists
  */
  ensureTemplatesDir() {
- if (!(0, import_fs3.existsSync)(this.templatesDir)) {
- (0, import_fs3.mkdirSync)(this.templatesDir, { recursive: true });
+ if (!(0, import_fs4.existsSync)(this.templatesDir)) {
+ (0, import_fs4.mkdirSync)(this.templatesDir, { recursive: true });
  }
  }
  /**
@@ -16392,11 +16454,11 @@ var init_template_manager = __esm({
  */
  loadUserTemplates() {
  try {
- const files2 = (0, import_fs3.readdirSync)(this.templatesDir);
+ const files2 = (0, import_fs4.readdirSync)(this.templatesDir);
  files2.forEach((file2) => {
  if (file2.endsWith(".json")) {
  try {
- const content2 = (0, import_fs3.readFileSync)((0, import_path2.join)(this.templatesDir, file2), "utf-8");
+ const content2 = (0, import_fs4.readFileSync)((0, import_path3.join)(this.templatesDir, file2), "utf-8");
  const t = JSON.parse(content2);
  t.createdAt = new Date(t.createdAt);
  t.updatedAt = new Date(t.updatedAt);
@@ -16415,8 +16477,8 @@ var init_template_manager = __esm({
  */
  saveTemplate(template2) {
  const _filename = `${template2.id}.json`;
- const _filepath = (0, import_path2.join)(this.templatesDir, _filename);
- (0, import_fs3.writeFileSync)(_filepath, JSON.stringify(template2, null, 2));
+ const _filepath = (0, import_path3.join)(this.templatesDir, _filename);
+ (0, import_fs4.writeFileSync)(_filepath, JSON.stringify(template2, null, 2));
  }
  /**
  * Create a new _template
@@ -16491,7 +16553,7 @@ var init_template_manager = __esm({
  this.userTemplates.delete(id2);
  try {
  const fs18 = await import("fs");
- fs18.unlinkSync((0, import_path2.join)(this.templatesDir, `${id2}.json`));
+ fs18.unlinkSync((0, import_path3.join)(this.templatesDir, `${id2}.json`));
  } catch (_error) {
  logger3.error("Failed to delete template file:", _error);
  }
@@ -17007,14 +17069,14 @@ var init_batch_execution = __esm({
  });

  // src/services/hotkey-manager.ts
- var import_chalk11, import_fs4, import_path3, import_os3, logger4, HotkeyManager;
+ var import_chalk11, import_fs5, import_path4, import_os3, logger4, HotkeyManager;
  var init_hotkey_manager = __esm({
  "src/services/hotkey-manager.ts"() {
  init_slash_command_handler();
  init_logger();
  import_chalk11 = __toESM(require("chalk"), 1);
- import_fs4 = require("fs");
- import_path3 = require("path");
+ import_fs5 = require("fs");
+ import_path4 = require("path");
  import_os3 = require("os");
  logger4 = logger;
  HotkeyManager = class _HotkeyManager {
@@ -17054,7 +17116,7 @@ var init_hotkey_manager = __esm({
  return mods ? `${mods}+${n2.key}` : n2.key;
  }
  constructor() {
- this.configPath = (0, import_path3.join)((0, import_os3.homedir)(), ".maria", "hotkeys.json");
+ this.configPath = (0, import_path4.join)((0, import_os3.homedir)(), ".maria", "hotkeys.json");
  this.loadBindings();
  this.initializeDefaultBindings();
  }
@@ -17468,8 +17530,8 @@ ${import_chalk11.default.gray("Use /_hotkey to manage hotkeys")}
  */
  loadBindings() {
  try {
- if ((0, import_fs4.existsSync)(this.configPath)) {
- const _data = (0, import_fs4.readFileSync)(this.configPath, "utf-8");
+ if ((0, import_fs5.existsSync)(this.configPath)) {
+ const _data = (0, import_fs5.readFileSync)(this.configPath, "utf-8");
  const raw = JSON.parse(_data);
  const coerced = this.coerceConfig(raw);
  if (coerced) {
@@ -17486,11 +17548,11 @@ ${import_chalk11.default.gray("Use /_hotkey to manage hotkeys")}
  saveBindings() {
  try {
  const _config = this.exportConfig();
- const _dir = (0, import_path3.join)((0, import_os3.homedir)(), ".maria");
- if (!(0, import_fs4.existsSync)(_dir)) {
- (0, import_fs4.mkdirSync)(_dir, { recursive: true });
+ const _dir = (0, import_path4.join)((0, import_os3.homedir)(), ".maria");
+ if (!(0, import_fs5.existsSync)(_dir)) {
+ (0, import_fs5.mkdirSync)(_dir, { recursive: true });
  }
- (0, import_fs4.writeFileSync)(this.configPath, JSON.stringify(_config, null, 2));
+ (0, import_fs5.writeFileSync)(this.configPath, JSON.stringify(_config, null, 2));
  } catch (_error) {
  logger4.error("Failed to save _hotkey _bindings:", _error);
  }
@@ -30412,7 +30474,7 @@ function createCLI() {
  });
  return program2;
  }
- var import_commander, import_chalk18, readline, import_node_process, import_fs5, import_path4, import_url, packageJson2, conversationPersistence, chatContext, aiResponseService, slashCommandHandler, sessionMemory, program;
+ var import_commander, import_chalk18, readline, import_node_process, packageJson2, conversationPersistence, chatContext, aiResponseService, slashCommandHandler, sessionMemory, program;
  var init_cli = __esm({
  "src/cli.ts"() {
  import_commander = require("commander");
@@ -30423,19 +30485,8 @@ var init_cli = __esm({
  init_chat_context_fixed_service();
  init_ai_response_service();
  init_slash_command_handler();
- import_fs5 = require("fs");
- import_path4 = require("path");
- import_url = require("url");
- packageJson2 = { version: "3.1.7" };
- try {
- const packagePath = (0, import_path4.join)(process.cwd(), "package.json");
- const pkgContent = (0, import_fs5.readFileSync)(packagePath, "utf8");
- const pkg2 = JSON.parse(pkgContent);
- if (pkg2.version) {
- packageJson2.version = pkg2.version;
- }
- } catch {
- }
+ init_defaults();
+ packageJson2 = { version: APP_VERSION };
  sessionMemory = [];
  program = createCLI();
  program.parse(process.argv);
@@ -40382,8 +40433,8 @@ var init_package = __esm({
  "package.json"() {
  package_default = {
  name: "@bonginkan/maria",
- version: "3.1.7",
- description: "\u{1F4CA} MARIA v3.1.7 - Advanced Telemetry & Analytics. Enterprise AI platform with ML-powered monitoring, predictive analytics, and real-time insights.",
+ version: "3.1.8",
+ description: "\u{1F527} MARIA v3.1.8 - Zero Hardcoding Architecture. Enterprise AI platform with centralized configuration, GPT-5 Mini default, and simplified environment management.",
  keywords: [
  "ai",
  "cli",