@hongymagic/q 2026.328.0 → 2026.330.0

This diff shows the content of publicly available package versions as published to a supported registry, and is provided for informational purposes only.
Files changed (3)
  1. package/README.md +69 -34
  2. package/dist/q.js +440 -123
  3. package/package.json +1 -1
package/README.md CHANGED
@@ -25,23 +25,37 @@ bun install && bun run build
 
  ## Setup
 
- 1. **Create config:**
+ `q` works without a config file.
+
+ 1. **Free local setup with Ollama**
 
  ```bash
- q config init
+ ollama pull gemma3
+ q --provider ollama --model gemma3 explain this stack trace
  ```
 
- 2. **Set your API key:**
+ 2. **Free cloud setup with Gemini**
 
  ```bash
- export ANTHROPIC_API_KEY="your-key-here"
+ export GEMINI_API_KEY="your-key-here"
+ q explain rust lifetimes
  ```
 
+ 3. **Optional: create a pinned config**
+
+ ```bash
+ q config init
+ ```
+
+ `q` auto-detects common provider keys (`GEMINI_API_KEY`, `GROQ_API_KEY`, `ANTHROPIC_API_KEY`, `OPENAI_API_KEY`) and falls back to local Ollama when available.
+
  ## Usage
 
  ```bash
  q how do I restart docker
+ GEMINI_API_KEY="your-key" q explain closures in javascript
  q --copy what is a kubernetes pod
+ q --provider ollama --model gemma3 explain this error
  ```
 
  ### Piping Content
@@ -85,47 +99,57 @@ q --mode explain how do I restart docker # Returns a detailed explanation
 
  ```bash
  q config path # Print config file path
- q config init # Create example config
- q config doctor # Diagnose config and provider issues
- q providers # List configured providers + model and credential status
+ q config init # Create optional config file
+ q config doctor # Diagnose config and setup issues
+ q providers # List available providers + model and credential status
  ```
 
  ## Configuration
 
- Config is loaded from (later overrides earlier):
+ Config is optional. `q` starts with built-in provider presets and per-provider defaults, then applies overrides (later overrides earlier):
 
- 1. `$XDG_CONFIG_HOME/q/config.toml` (or `~/.config/q/config.toml`)
- 2. `./config.toml` (project-specific)
- 3. Environment: `Q_PROVIDER`, `Q_MODEL`, `Q_COPY`
+ 1. Built-in defaults (`google`, `groq`, `anthropic`, `openai`, `ollama`, `azure`, `bedrock`)
+ 2. `$XDG_CONFIG_HOME/q/config.toml` (or `~/.config/q/config.toml`)
+ 3. `./config.toml` (project-specific)
+ 4. Environment: `Q_PROVIDER`, `Q_MODEL`, `Q_COPY`
+ 5. CLI flags: `--provider`, `--model`
 
  Each provider can specify its own default `model`, which takes precedence over `default.model` but is overridden by `Q_MODEL` or `--model`:
 
  ```toml
  [default]
- provider = "anthropic"
- model = "claude-sonnet-4-20250514" # Global default
+ provider = "google"
+ # model = "gemini-2.5-flash" # Optional global default
 
- [providers.anthropic]
- type = "anthropic"
- api_key_env = "ANTHROPIC_API_KEY"
- model = "claude-sonnet-4-20250514" # Per-provider default
+ [providers.google]
+ type = "google"
+ api_key_env = "GEMINI_API_KEY"
+ model = "gemini-2.5-flash" # Optional per-provider default
  ```
 
  See [config.example.toml](config.example.toml) for all options.
 
+ ### Free Setup Options
+
+ | Option | Cost | What you need | Notes |
+ |------|------|---------------|-------|
+ | `ollama` | Free local | Ollama installed and a local model | Best zero-key/offline option |
+ | `google` | Free tier | `GEMINI_API_KEY` (or `GOOGLE_API_KEY` / `GOOGLE_GENERATIVE_AI_API_KEY`) | Best free cloud default |
+ | `groq` | Free tier | `GROQ_API_KEY` | Fast free cloud fallback |
+
  ### Provider Types
 
- | Type | Description |
- |------|-------------|
- | `anthropic` | Anthropic Claude API |
- | `openai` | OpenAI API |
- | `openai_compatible` | Any OpenAI-compatible API |
- | `ollama` | Local Ollama instance |
- | `portkey` | Portkey AI Gateway (self-hosted or cloud) |
- | `google` | Google Gemini API |
- | `groq` | Groq (ultra-fast Llama, Mixtral inference) |
- | `azure` | Azure OpenAI deployments |
- | `bedrock` | AWS Bedrock (Claude, Titan, Llama) |
+ | Type | Built in | Description |
+ |------|----------|-------------|
+ | `google` | Yes | Google Gemini API |
+ | `groq` | Yes | Groq (ultra-fast open models) |
+ | `ollama` | Yes | Local Ollama instance |
+ | `anthropic` | Yes | Anthropic Claude API |
+ | `openai` | Yes | OpenAI API |
+ | `azure` | Yes | Azure OpenAI deployments |
+ | `bedrock` | Yes | AWS Bedrock (Claude, Titan, Llama) |
+ | `openai_compatible` | No | Any OpenAI-compatible API (needs `base_url`) |
+ | `portkey` | No | Portkey AI Gateway (needs `base_url` and `provider_slug`) |
 
  ### Portkey Gateway Setup
 
@@ -182,21 +206,32 @@ All agent PRs are created as drafts and require human approval to merge. The sys
 
  ## Troubleshooting
 
- **"Missing API key" error:**
+ **"Setup required" error:**
 
- Ensure your API key environment variable is set:
+ Use one of these quick starts:
 
  ```bash
- export ANTHROPIC_API_KEY="your-key-here"
+ export GEMINI_API_KEY="your-key-here"
+ q explain kubernetes pods
  ```
 
- **"Config file not found" error:**
+ ```bash
+ q --provider ollama --model gemma3 explain kubernetes pods
+ ```
+
+ Or create a config file with `q config init`.
+
+ **"Missing API key" error:**
+
+ Ensure your API key environment variable is set:
 
- Run `q config init` to create a default configuration file.
+ ```bash
+ export GEMINI_API_KEY="your-key-here"
+ ```
 
  **Diagnose config issues:**
 
- Run `q config doctor` to check config files, environment overrides, and provider health at a glance.
+ Run `q config doctor` to check config files, environment overrides, built-in defaults, and provider health at a glance.
 
  **Failure logs:**
 
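The override chain described in the updated README (built-in defaults, then XDG config, then CWD config, then `Q_PROVIDER`/`Q_MODEL`, then CLI flags) can be sketched as a chain of object spreads. This is an illustrative sketch, not the package's internal code; the function and parameter names here are hypothetical.

```javascript
// Later layers override earlier ones: each spread overwrites keys set by
// the spreads above it, and env/flag layers only contribute keys when set.
function resolveDefaults({ builtIn, xdg, cwd, env, flags }) {
  return {
    ...builtIn,
    ...xdg,
    ...cwd,
    ...(env.Q_PROVIDER ? { provider: env.Q_PROVIDER } : {}),
    ...(env.Q_MODEL ? { model: env.Q_MODEL } : {}),
    ...(flags.provider ? { provider: flags.provider } : {}),
    ...(flags.model ? { model: flags.model } : {}),
  };
}
```

For example, a `provider` pinned in `./config.toml` beats the built-in default, while `Q_MODEL` still wins over any file-level `model`.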
package/dist/q.js CHANGED
@@ -14712,7 +14712,7 @@ import { parseArgs } from "node:util";
  // package.json
  var package_default = {
  name: "@hongymagic/q",
- version: "2026.328.0",
+ version: "2026.330.0",
  description: "Quick AI answers from the command line",
  main: "dist/q.js",
  type: "module",
@@ -14796,17 +14796,6 @@ class QError extends Error {
  this.name = "QError";
  }
  }
-
- class ConfigNotFoundError extends QError {
- path;
- constructor(path) {
- super(`Config file not found: ${path}
- Run 'q config init' to create one.`, 2);
- this.path = path;
- this.name = "ConfigNotFoundError";
- }
- }
-
  class ConfigParseError extends QError {
  path;
  details;
@@ -14850,6 +14839,22 @@ class MissingApiKeyError extends QError {
  }
  }
 
+ class ModelNotConfiguredError extends QError {
+ providerName;
+ constructor(providerName) {
+ super(`No model configured for provider '${providerName}'. Set --model, Q_MODEL, provider.model, or default.model.`, 2);
+ this.providerName = providerName;
+ this.name = "ModelNotConfiguredError";
+ }
+ }
+
+ class SetupRequiredError extends QError {
+ constructor() {
+ super("Setup required. Install Ollama, set GEMINI_API_KEY or GROQ_API_KEY, or run 'q config init'.", 2);
+ this.name = "SetupRequiredError";
+ }
+ }
+
  class ProviderError extends QError {
  constructor(message, cause) {
  super(message, 1, cause === undefined ? undefined : { cause });
@@ -14874,9 +14879,9 @@ var HELP_TEXT = `q - Quick AI answers from the command line
  USAGE:
  q [options] <query...> Ask a question
  q config path Print config file path
- q config init Create example config file
- q config doctor Diagnose config and provider issues
- q providers List configured providers
+ q config init Create optional config file
+ q config doctor Diagnose config and setup issues
+ q providers List available providers
 
  OPTIONS:
  -p, --provider <name> Override the default provider
@@ -14894,7 +14899,7 @@ ENVIRONMENT:
  Q_COPY Override default copy behaviour (true/false)
 
  CONFIG:
- Config is loaded from (in order, later overrides earlier):
+ Config is optional. q uses built-in defaults and then loads overrides from:
  1. ~/.config/q/config.toml (or $XDG_CONFIG_HOME/q/config.toml)
  2. ./config.toml (current directory)
  3. Environment variables (Q_PROVIDER, Q_MODEL, Q_COPY)
@@ -14902,6 +14907,8 @@ CONFIG:
  EXAMPLES:
  q how do I restart docker
  q -p openai --model gpt-4o what is recursion
+ GEMINI_API_KEY=your-key q explain rust lifetimes
+ q --provider ollama --model gemma3 explain this error
  q config init
  `;
  var VERSION = package_default.version;
@@ -28788,6 +28795,273 @@ function formatPrefixedMessage(message, prefix, style) {
  return `${style(prefix)}${remainder}`;
  }
 
+ // src/provider-catalog.ts
+ var GOOGLE_API_KEY_ENV_VARS = [
+ "GEMINI_API_KEY",
+ "GOOGLE_API_KEY",
+ "GOOGLE_GENERATIVE_AI_API_KEY"
+ ];
+ var DEFAULT_MODEL_BY_PROVIDER_TYPE = {
+ anthropic: "claude-sonnet-4-20250514",
+ openai: "gpt-4o-mini",
+ google: "gemini-2.5-flash",
+ groq: "openai/gpt-oss-20b",
+ ollama: "gemma3",
+ bedrock: "us.anthropic.claude-sonnet-4-20250514-v1:0"
+ };
+ var ENV_PROVIDER_INFERENCE_ORDER = [
+ "google",
+ "groq",
+ "anthropic",
+ "openai",
+ "bedrock",
+ "azure"
+ ];
+ function dedupe(values) {
+ return [...new Set(values.filter((value) => value.length > 0))];
+ }
+ function hasAnyEnv(envVars, env3) {
+ return envVars.some((envVar) => Boolean(env3[envVar]));
+ }
+ function hasAllEnv(envVars, env3) {
+ return envVars.every((envVar) => Boolean(env3[envVar]));
+ }
+ function formatStatus(present) {
+ return present ? "set" : "missing";
+ }
+ function describeAnyOf(label, envVars, env3) {
+ const present = hasAnyEnv(envVars, env3);
+ const display = envVars.join(" or ");
+ return {
+ line: `${label}: ${display} (${formatStatus(present)})`,
+ issue: present ? undefined : `${display} is not set`
+ };
+ }
+ function describeAllOf(label, envVars, env3) {
+ const present = hasAllEnv(envVars, env3);
+ const display = envVars.join(" + ");
+ return {
+ line: `${label}: ${display} (${formatStatus(present)})`,
+ issue: present ? undefined : `${display} is not set`
+ };
+ }
+ function getGoogleApiKeyEnvVars(configuredEnvVar) {
+ if (!configuredEnvVar) {
+ return [...GOOGLE_API_KEY_ENV_VARS];
+ }
+ if (GOOGLE_API_KEY_ENV_VARS.some((envVar) => envVar === configuredEnvVar)) {
+ return dedupe([configuredEnvVar, ...GOOGLE_API_KEY_ENV_VARS]);
+ }
+ return [configuredEnvVar];
+ }
+ function getBuiltInProviderConfigs(env3 = process.env) {
+ return {
+ anthropic: {
+ type: "anthropic",
+ api_key_env: "ANTHROPIC_API_KEY"
+ },
+ openai: {
+ type: "openai",
+ api_key_env: "OPENAI_API_KEY"
+ },
+ google: {
+ type: "google",
+ api_key_env: GOOGLE_API_KEY_ENV_VARS[0]
+ },
+ groq: {
+ type: "groq",
+ api_key_env: "GROQ_API_KEY"
+ },
+ ollama: {
+ type: "ollama"
+ },
+ azure: {
+ type: "azure",
+ api_key_env: "AZURE_API_KEY",
+ ...env3.AZURE_RESOURCE_NAME ? { resource_name: env3.AZURE_RESOURCE_NAME } : {}
+ },
+ bedrock: {
+ type: "bedrock",
+ ...env3.AWS_REGION ? { region: env3.AWS_REGION } : {}
+ }
+ };
+ }
+ function getDefaultModelForProviderType(providerType) {
+ return DEFAULT_MODEL_BY_PROVIDER_TYPE[providerType];
+ }
+ function getDefaultModelForProvider(providerConfig) {
+ if (!providerConfig) {
+ return;
+ }
+ return providerConfig.model ?? getDefaultModelForProviderType(providerConfig.type);
+ }
+ function inferProviderFromEnvironment(env3 = process.env) {
+ for (const providerName of ENV_PROVIDER_INFERENCE_ORDER) {
+ switch (providerName) {
+ case "google":
+ if (hasAnyEnv(GOOGLE_API_KEY_ENV_VARS, env3)) {
+ return providerName;
+ }
+ break;
+ case "groq":
+ if (env3.GROQ_API_KEY) {
+ return providerName;
+ }
+ break;
+ case "anthropic":
+ if (env3.ANTHROPIC_API_KEY) {
+ return providerName;
+ }
+ break;
+ case "openai":
+ if (env3.OPENAI_API_KEY) {
+ return providerName;
+ }
+ break;
+ case "bedrock":
+ if (env3.AWS_ACCESS_KEY_ID && env3.AWS_SECRET_ACCESS_KEY || env3.AWS_PROFILE) {
+ return providerName;
+ }
+ break;
+ case "azure":
+ if (env3.AZURE_API_KEY && env3.AZURE_RESOURCE_NAME) {
+ return providerName;
+ }
+ break;
+ }
+ }
+ return;
+ }
+ async function detectLocalProvider() {
+ try {
+ const response = await fetch("http://127.0.0.1:11434/api/tags", {
+ signal: AbortSignal.timeout(200)
+ });
+ if (response.ok) {
+ return "ollama";
+ }
+ } catch {}
+ return;
+ }
+ function getProviderStatusSummary(providerConfig, env3 = process.env) {
+ switch (providerConfig.type) {
+ case "ollama":
+ return {
+ lines: ["Auth: none required"],
+ issues: []
+ };
+ case "google": {
+ const apiKey = describeAnyOf("Key", getGoogleApiKeyEnvVars(providerConfig.api_key_env), env3);
+ return {
+ lines: [apiKey.line],
+ issues: apiKey.issue ? [apiKey.issue] : []
+ };
+ }
+ case "azure": {
+ const lines = [];
+ const issues = [];
+ const apiKey = describeAnyOf("Key", [providerConfig.api_key_env ?? "AZURE_API_KEY"], env3);
+ lines.push(apiKey.line);
+ if (apiKey.issue) {
+ issues.push(apiKey.issue);
+ }
+ if (providerConfig.base_url) {
+ lines.push("Endpoint: base_url configured");
+ } else if (providerConfig.resource_name) {
+ lines.push(`Endpoint: resource ${providerConfig.resource_name}`);
+ } else {
+ lines.push("Endpoint: set AZURE_RESOURCE_NAME or base_url");
+ issues.push("AZURE_RESOURCE_NAME or base_url is required");
+ }
+ return { lines, issues };
+ }
+ case "bedrock": {
+ if (providerConfig.access_key_env || providerConfig.secret_key_env) {
+ const credentials = describeAllOf("AWS credentials", [
+ providerConfig.access_key_env ?? "AWS_ACCESS_KEY_ID",
+ providerConfig.secret_key_env ?? "AWS_SECRET_ACCESS_KEY"
+ ], env3);
+ return {
+ lines: [credentials.line],
+ issues: credentials.issue ? [credentials.issue] : []
+ };
+ }
+ return {
+ lines: ["Auth: AWS SDK default credential chain"],
+ issues: []
+ };
+ }
+ case "openai_compatible": {
+ const lines = [];
+ const issues = [];
+ if (providerConfig.api_key_env) {
+ const apiKey = describeAnyOf("Key", [providerConfig.api_key_env], env3);
+ lines.push(apiKey.line);
+ if (apiKey.issue) {
+ issues.push(apiKey.issue);
+ }
+ } else {
+ lines.push("Key: (none required)");
+ }
+ if (providerConfig.base_url) {
+ lines.push("Endpoint: base_url configured");
+ } else {
+ lines.push("Endpoint: missing base_url");
+ issues.push("base_url is required");
+ }
+ return { lines, issues };
+ }
+ case "portkey": {
+ const lines = [];
+ const issues = [];
+ if (providerConfig.api_key_env) {
+ const apiKey = describeAnyOf("Gateway key", [providerConfig.api_key_env], env3);
+ lines.push(apiKey.line);
+ if (apiKey.issue) {
+ issues.push(apiKey.issue);
+ }
+ } else {
+ lines.push("Gateway key: (optional)");
+ }
+ if (providerConfig.provider_api_key_env) {
+ const providerKey = describeAnyOf("Upstream key", [providerConfig.provider_api_key_env], env3);
+ lines.push(providerKey.line);
+ if (providerKey.issue) {
+ issues.push(providerKey.issue);
+ }
+ } else {
+ lines.push("Upstream key: (optional)");
+ }
+ if (!providerConfig.base_url) {
+ issues.push("base_url is required");
+ lines.push("Endpoint: missing base_url");
+ } else {
+ lines.push("Endpoint: base_url configured");
+ }
+ if (!providerConfig.provider_slug) {
+ issues.push("provider_slug is required");
+ lines.push("Provider slug: missing");
+ } else {
+ lines.push(`Provider slug: ${providerConfig.provider_slug}`);
+ }
+ return { lines, issues };
+ }
+ default: {
+ if (!providerConfig.api_key_env) {
+ return {
+ lines: ["Key: (none required)"],
+ issues: []
+ };
+ }
+ const apiKey = describeAnyOf("Key", [providerConfig.api_key_env], env3);
+ return {
+ lines: [apiKey.line],
+ issues: apiKey.issue ? [apiKey.issue] : []
+ };
+ }
+ }
+ }
+
  // src/config/index.ts
  var ProviderType = exports_external.enum([
  "openai",
@@ -28815,8 +29089,8 @@ var ProviderConfigSchema = exports_external.object({
  secret_key_env: exports_external.string().optional()
  });
  var DefaultConfigSchema = exports_external.object({
- provider: exports_external.string(),
- model: exports_external.string(),
+ provider: exports_external.string().optional(),
+ model: exports_external.string().optional(),
  copy: exports_external.boolean().optional()
  });
  var ConfigSchema = exports_external.object({
@@ -28859,18 +29133,21 @@ class Config {
  Config.tryLoadFile(getXdgConfigPath()),
  Config.tryLoadFile(getCwdConfigPath())
  ]);
- if (!xdgConfig && !cwdConfig) {
- throw new ConfigNotFoundError(getXdgConfigPath());
- }
  const mergedDefault = {
  ...xdgConfig?.default ?? {},
  ...cwdConfig?.default ?? {}
  };
  const mergedProviders = {
+ ...getBuiltInProviderConfigs(),
  ...xdgConfig?.providers ?? {},
  ...cwdConfig?.providers ?? {}
  };
+ const inferredProvider = await Config.inferDefaultProvider(mergedDefault);
+ const providerForDefaultModel = env.Q_PROVIDER ?? mergedDefault.provider ?? inferredProvider;
+ const inferredModel = providerForDefaultModel ? getDefaultModelForProvider(mergedProviders[providerForDefaultModel]) : undefined;
  const finalDefault = {
+ ...inferredProvider ? { provider: inferredProvider } : {},
+ ...inferredModel ? { model: inferredModel } : {},
  ...mergedDefault,
  ...env.Q_PROVIDER ? { provider: env.Q_PROVIDER } : {},
  ...env.Q_MODEL ? { model: env.Q_MODEL } : {},
@@ -28887,8 +29164,21 @@ class Config {
  const interpolated = Config.interpolateEnvVars(result.data);
  return new Config(interpolated);
  }
+ static async inferDefaultProvider(mergedDefault) {
+ if (mergedDefault.provider || env.Q_PROVIDER) {
+ return;
+ }
+ const envProvider = inferProviderFromEnvironment();
+ if (envProvider) {
+ return envProvider;
+ }
+ return detectLocalProvider();
+ }
  getProvider(name) {
  const providerName = name ?? this.default.provider;
+ if (!providerName) {
+ throw new ConfigValidationError("No default provider is configured. Set Q_PROVIDER, pass --provider, install Ollama, set an API key, or run 'q config init'.");
+ }
  const provider = this.providers[providerName];
  if (!provider) {
  throw new ConfigValidationError(`Provider '${providerName}' not found in config.
@@ -28952,6 +29242,8 @@ Consider using HTTPS for non-localhost connections.`);
  var ALLOWED_INTERPOLATION_VARS = new Set([
  "ANTHROPIC_API_KEY",
  "OPENAI_API_KEY",
+ "GEMINI_API_KEY",
+ "GOOGLE_API_KEY",
  "PORTKEY_API_KEY",
  "GOOGLE_GENERATIVE_AI_API_KEY",
  "GROQ_API_KEY",
@@ -28998,18 +29290,22 @@ async function runConfigDoctor() {
  xdgFile.exists(),
  cwdFile.exists()
  ]);
+ const [xdgData, cwdData] = await Promise.all([
+ xdgExists ? safeParseToml(xdgFile) : Promise.resolve(null),
+ cwdExists ? safeParseToml(cwdFile) : Promise.resolve(null)
+ ]);
  const layers = [
  {
  source: "XDG config",
  path: xdgPath,
  found: xdgExists,
- data: xdgExists ? await safeParseToml(xdgFile) : null
+ data: xdgData
  },
  {
  source: "CWD config",
  path: cwdPath,
  found: cwdExists,
- data: cwdExists ? await safeParseToml(cwdFile) : null
+ data: cwdData
  }
  ];
  const envOverrides = [
@@ -29030,33 +29326,51 @@ async function runConfigDoctor() {
  }
  ];
  const providerIssues = [];
+ const configuredProviderNames = new Set([
+ ...Object.keys((xdgData ?? {}).providers ?? {}),
+ ...Object.keys((cwdData ?? {}).providers ?? {})
+ ]);
  try {
  const config2 = await Config.load();
- for (const [name, providerConfig] of Object.entries(config2.providers)) {
- if (providerConfig.api_key_env) {
- if (!process.env[providerConfig.api_key_env]) {
- providerIssues.push({
- provider: name,
- issue: `${providerConfig.api_key_env} is not set`
- });
- }
+ if (!xdgExists && !cwdExists) {
+ if (config2.default.provider) {
+ providerIssues.push({
+ provider: "(setup)",
+ issue: `No config file found. Using built-in defaults with provider '${config2.default.provider}'.`
+ });
+ } else {
+ providerIssues.push({
+ provider: "(setup)",
+ issue: "No config file or detected provider found. Install Ollama, set GEMINI_API_KEY or GROQ_API_KEY, or run 'q config init'."
+ });
  }
- if (providerConfig.provider_api_key_env) {
- if (!process.env[providerConfig.provider_api_key_env]) {
- providerIssues.push({
- provider: name,
- issue: `${providerConfig.provider_api_key_env} is not set`
- });
- }
+ }
+ if (config2.default.provider) {
+ configuredProviderNames.add(config2.default.provider);
+ }
+ if (configuredProviderNames.size === 0 && config2.default.provider) {
+ configuredProviderNames.add(config2.default.provider);
+ }
+ for (const name of configuredProviderNames) {
+ const providerConfig = config2.providers[name];
+ if (!providerConfig) {
+ providerIssues.push({
+ provider: name,
+ issue: "Provider is not available."
+ });
+ continue;
+ }
+ const { issues } = getProviderStatusSummary(providerConfig);
+ for (const issue2 of issues) {
+ providerIssues.push({ provider: name, issue: issue2 });
  }
  }
  } catch (err) {
  const message = err instanceof Error ? err.message : String(err);
  providerIssues.push({ provider: "(config)", issue: message });
  }
- const noConfigAtAll = !xdgExists && !cwdExists;
  const configLoadFailed = providerIssues.some((i) => i.provider === "(config)");
- const hasErrors = noConfigAtAll || configLoadFailed;
+ const hasErrors = configLoadFailed;
  const hasWarnings = providerIssues.length > 0;
  return {
  layers,
@@ -29123,72 +29437,62 @@ function formatZodErrors(error48) {
  var EXAMPLE_CONFIG = `# q configuration file
  # Location: ~/.config/q/config.toml
  #
- # Config resolution order (later overrides earlier):
- # 1. This file (XDG_CONFIG_HOME/q/config.toml or ~/.config/q/config.toml)
- # 2. ./config.toml in current directory (project-specific)
- # 3. Environment variables: Q_PROVIDER, Q_MODEL, Q_COPY
+ # This file is optional. q can run with built-in defaults:
+ # - install Ollama for local usage, or
+ # - set GEMINI_API_KEY / GROQ_API_KEY / ANTHROPIC_API_KEY / OPENAI_API_KEY
+ #
+ # Add settings here only when you want to pin behaviour or customise providers.
 
  [default]
- provider = "anthropic"
- model = "claude-sonnet-4-20250514"
- # copy = true # Always copy answer to clipboard (override with --no-copy)
-
- [providers.anthropic]
- type = "anthropic"
- api_key_env = "ANTHROPIC_API_KEY"
- # model = "claude-sonnet-4-20250514" # Optional per-provider default model
-
- [providers.openai]
- type = "openai"
- api_key_env = "OPENAI_API_KEY"
- # model = "gpt-4o" # Optional per-provider default model
+ # provider = "google" # Optional: q auto-detects a provider when possible
+ # model = "gemini-2.5-flash" # Optional: q uses provider defaults when omitted
+ # copy = true # Optional: always copy answer to clipboard
 
- # Example: OpenAI-compatible provider (e.g., local LLM via LM Studio)
- # [providers.local]
- # type = "openai_compatible"
- # base_url = "http://localhost:1234/v1"
- # api_key_env = "LOCAL_API_KEY"
-
- # Example: Portkey Gateway
- # [providers.portkey_internal]
- # type = "portkey"
- # base_url = "https://your-portkey-gateway.internal/v1"
- # provider_slug = "@your-org/bedrock-provider"
- # api_key_env = "PORTKEY_API_KEY"
- # provider_api_key_env = "PROVIDER_API_KEY"
- # headers = { "x-portkey-trace-id" = "\${HOSTNAME}" } # Only allowlisted env vars
-
- # Example: Ollama (local models)
+ # Local-first example
  # [providers.ollama]
  # type = "ollama"
- # base_url = "http://localhost:11434"
+ # # base_url = "http://localhost:11434"
+ # # model = "gemma3"
 
- # Example: Google Gemini
+ # Free cloud example (Google Gemini)
  # [providers.google]
  # type = "google"
- # api_key_env = "GOOGLE_GENERATIVE_AI_API_KEY"
- # # Models: gemini-2.5-pro, gemini-2.5-flash, gemini-2.0-flash
+ # # api_key_env = "GEMINI_API_KEY"
+ # # model = "gemini-2.5-flash"
 
- # Example: Groq (ultra-fast inference)
+ # Free cloud example (Groq)
  # [providers.groq]
  # type = "groq"
- # api_key_env = "GROQ_API_KEY"
- # # Models: llama-3.3-70b-versatile, qwen-qwq-32b, deepseek-r1-distill-llama-70b
+ # # api_key_env = "GROQ_API_KEY"
+ # # model = "openai/gpt-oss-20b"
+
+ # Advanced examples
+ # [providers.work_openai]
+ # type = "openai"
+ # api_key_env = "OPENAI_API_KEY"
+ # model = "gpt-4o-mini"
+
+ # [providers.local]
+ # type = "openai_compatible"
+ # base_url = "http://localhost:1234/v1"
+ # # api_key_env = "LOCAL_API_KEY"
 
- # Example: Azure OpenAI
  # [providers.azure]
  # type = "azure"
- # resource_name = "my-azure-resource" # Or use base_url instead
+ # resource_name = "my-azure-resource"
  # api_key_env = "AZURE_API_KEY"
- # api_version = "v1" # Optional, defaults to v1
- # # Model = deployment name (e.g., "gpt-4o-deployment")
+ # # model = "gpt-4o-deployment"
 
- # Example: AWS Bedrock
  # [providers.bedrock]
  # type = "bedrock"
- # region = "us-east-1" # Optional, defaults to AWS_REGION env var
- # # Uses standard AWS env vars: AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY
- # # Models: anthropic.claude-3-5-sonnet-20241022-v2:0, us.amazon.nova-pro-v1:0
+ # region = "us-east-1"
+ # # model = "us.anthropic.claude-sonnet-4-20250514-v1:0"
+
+ # [providers.portkey]
+ # type = "portkey"
+ # base_url = "https://api.portkey.ai/v1"
+ # provider_slug = "openai"
+ # api_key_env = "PORTKEY_API_KEY"
  `;
  async function initConfig() {
  const configPath = getXdgConfigPath();
@@ -29297,7 +29601,7 @@ class QError2 extends Error {
  }
  }
 
- class ConfigNotFoundError2 extends QError2 {
+ class ConfigNotFoundError extends QError2 {
  path;
  constructor(path2) {
  super(`Config file not found: ${path2}
@@ -29350,6 +29654,22 @@ class MissingApiKeyError2 extends QError2 {
  }
  }
 
+ class ModelNotConfiguredError2 extends QError2 {
+ providerName;
+ constructor(providerName) {
+ super(`No model configured for provider '${providerName}'. Set --model, Q_MODEL, provider.model, or default.model.`, 2);
+ this.providerName = providerName;
+ this.name = "ModelNotConfiguredError";
+ }
+ }
+
+ class SetupRequiredError2 extends QError2 {
+ constructor() {
+ super("Setup required. Install Ollama, set GEMINI_API_KEY or GROQ_API_KEY, or run 'q config init'.", 2);
+ this.name = "SetupRequiredError";
+ }
+ }
+
  class ProviderError2 extends QError2 {
  constructor(message, cause) {
  super(message, 1, cause === undefined ? undefined : { cause });
@@ -29364,7 +29684,7 @@ class UsageError2 extends QError2 {
  }
  }
  function getUserErrorMessage(error48) {
- if (error48 instanceof ConfigNotFoundError2) {
+ if (error48 instanceof ConfigNotFoundError) {
  return "Config file not found. Run 'q config init'.";
  }
  if (error48 instanceof ConfigParseError2) {
@@ -29379,6 +29699,12 @@ function getUserErrorMessage(error48) {
  if (error48 instanceof MissingApiKeyError2) {
  return `Missing API key. Set ${error48.envVar}.`;
  }
+ if (error48 instanceof ModelNotConfiguredError2) {
+ return getFirstLine(error48.message);
+ }
+ if (error48 instanceof SetupRequiredError2) {
+ return error48.message;
+ }
  if (error48 instanceof ProviderError2) {
  return getProviderUserMessage(error48);
  }
@@ -53202,7 +53528,7 @@ var google = createGoogleGenerativeAI();
  // src/providers/google.ts
  function createGoogleProvider(config2, providerName) {
  return createGoogleGenerativeAI({
- apiKey: resolveApiKey(config2.api_key_env, providerName),
+ apiKey: resolveApiKeyCandidates(getGoogleApiKeyEnvVars(config2.api_key_env), providerName),
  baseURL: config2.base_url,
  headers: config2.headers
  });
@@ -62837,6 +63163,15 @@ function resolveApiKey(envVarName, providerName) {
  }
  return apiKey;
  }
+ function resolveApiKeyCandidates(envVarNames, providerName) {
+ for (const envVarName of envVarNames) {
+ const apiKey = process.env[envVarName];
+ if (apiKey) {
+ return apiKey;
+ }
+ }
+ throw new MissingApiKeyError(envVarNames.join(" or "), providerName);
+ }
 
  // src/providers/anthropic.ts
  function createAnthropicProvider(config2, providerName) {
@@ -62861,12 +63196,18 @@ function filterSensitiveFields(config2) {
  }
  }
  function resolveProvider(config2, providerOverride, modelOverride, debug = false) {
- const providerName = providerOverride ?? config2.default.provider;
+ const providerName = providerOverride ?? env.Q_PROVIDER ?? config2.default.provider;
+ if (!providerName) {
+ throw new SetupRequiredError;
+ }
  const providerConfig = config2.providers[providerName];
  if (!providerConfig) {
  throw new ProviderNotFoundError(providerName);
  }
- const modelId = modelOverride ?? env.Q_MODEL ?? providerConfig.model ?? config2.default.model;
+ const modelId = modelOverride ?? env.Q_MODEL ?? providerConfig.model ?? config2.default.model ?? getDefaultModelForProvider(providerConfig);
+ if (!modelId) {
+ throw new ModelNotConfiguredError(providerName);
+ }
  logDebug(`Provider config: ${JSON.stringify(filterSensitiveFields(providerConfig), null, 2)}`, debug);
  const model = createModel(providerConfig, providerName, modelId, debug);
  return {
@@ -62915,31 +63256,6 @@ function createModel(config2, providerName, modelId, debug = false) {
  }
  }
  }
- var CREDENTIAL_FIELDS = [
- "api_key_env",
- "provider_api_key_env",
- "access_key_env",
- "secret_key_env"
- ];
- function getCredentialStatuses(providerConfig) {
- return CREDENTIAL_FIELDS.reduce((acc, field) => {
- const envVar = providerConfig[field];
- if (envVar) {
- acc.push({ envVar, present: Boolean(process.env[envVar]) });
- }
- return acc;
- }, []);
- }
- function formatCredentialLine(statuses) {
- if (statuses.length === 0) {
- return " Key: (none required)";
- }
- return statuses.map(({ envVar, present }) => {
- const status = present ? "set" : "missing";
- return ` Key: ${envVar} (${status})`;
- }).join(`
- `);
- }
  function listProviders(config2) {
  const providers = Object.keys(config2.providers);
  const defaultProvider = config2.default.provider;
@@ -62951,18 +63267,19 @@ function listProviders(config2) {
  return ` ${name15} [unknown]`;
  const marker16 = isDefault ? " (default)" : "";
  const header = ` ${name15}${marker16} [${providerConfig.type}]`;
- const modelDisplay = providerConfig.model ?? "(default)";
+ const modelDisplay = providerConfig.model ?? getDefaultModelForProvider(providerConfig) ?? "(set with --model or config)";
  const modelLine = ` Model: ${modelDisplay}`;
- const credentialLine = formatCredentialLine(getCredentialStatuses(providerConfig));
- return [header, modelLine, credentialLine].join(`
+ const credentialLines = getProviderStatusSummary(providerConfig).lines.map((line) => ` ${line}`);
+ return [header, modelLine, ...credentialLines].join(`
  `);
  });
  const lines = [
- "Configured providers:",
+ "Available providers:",
  "",
  ...providerBlocks,
  "",
- `Default model: ${defaultModel}`
+ `Default provider: ${defaultProvider ?? "(choose with --provider or env)"}`,
+ `Default model: ${defaultModel ?? "(provider default)"}`
  ];
  return lines.join(`
  `);
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "@hongymagic/q",
- "version": "2026.328.0",
+ "version": "2026.330.0",
  "description": "Quick AI answers from the command line",
  "main": "dist/q.js",
  "type": "module",