grok-search-cli 0.1.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/.env.example ADDED
@@ -0,0 +1,2 @@
1
+ XAI_API_KEY=your_xai_api_key
2
+ XAI_MODEL=grok-4-1-fast-non-reasoning
@@ -0,0 +1,3 @@
1
+ XAI_API_KEY=your_openrouter_api_key
2
+ XAI_MODEL=x-ai/grok-4.1-fast
3
+ XAI_BASE_URL=https://openrouter.ai/api/v1
@@ -0,0 +1,4 @@
1
+ XAI_API_KEY=your_yunwu_api_key
2
+ XAI_MODEL=grok-4-fast
3
+ XAI_BASE_URL=https://yunwu.ai/v1
4
+ XAI_COMPAT_MODE=true
package/LICENSE ADDED
@@ -0,0 +1,21 @@
1
+ MIT License
2
+
3
+ Copyright (c) 2026 timzhong
4
+
5
+ Permission is hereby granted, free of charge, to any person obtaining a copy
6
+ of this software and associated documentation files (the "Software"), to deal
7
+ in the Software without restriction, including without limitation the rights
8
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9
+ copies of the Software, and to permit persons to whom the Software is
10
+ furnished to do so, subject to the following conditions:
11
+
12
+ The above copyright notice and this permission notice shall be included in all
13
+ copies or substantial portions of the Software.
14
+
15
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21
+ SOFTWARE.
package/README.md ADDED
@@ -0,0 +1,147 @@
1
+ # grok-search-cli
2
+
3
+ [中文说明](./README.zh.md)
4
+
5
+ Expose xAI Web + X search as a single CLI for agents that can run shell commands but do not integrate with xAI search APIs directly. It is built for prompts that need live web results, X discussion, and one-pass synthesis in the same call.
6
+
7
+ ## Get Started
8
+
9
+ You only need two things to get it running: install the package, then provide the required environment variables.
10
+
11
+ ```bash
12
+ pnpm add -g grok-search-cli
13
+ ```
14
+
15
+ ```env
16
+ XAI_API_KEY=your_xai_api_key
17
+ XAI_MODEL=grok-4-1-fast-non-reasoning
18
+ # XAI_BASE_URL=https://your-proxy.example.com/v1 # optional proxy base URL
19
+ # XAI_COMPAT_MODE=true # use OpenAI-compatible /chat/completions mode
20
+ ```
21
+
22
+ Run it:
23
+
24
+ ```bash
25
+ grok-search "latest xAI updates"
26
+ ```
27
+
28
+ If you do not want a global install:
29
+
30
+ ```bash
31
+ npx grok-search-cli "latest xAI updates"
32
+ ```
33
+
34
+ Using OpenRouter:
35
+
36
+ ```env
37
+ XAI_API_KEY=your_openrouter_api_key
38
+ XAI_BASE_URL=https://openrouter.ai/api/v1
39
+ XAI_MODEL=x-ai/grok-4-fast:online
40
+ ```
41
+
42
+ With OpenRouter, the CLI automatically uses the OpenRouter Responses API for web search and forwards OpenRouter-specific search fields.
43
+
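For OpenRouter endpoints, the bundled `dist/cli.js` builds a Responses API request body that carries an `openrouter:web_search` tool. A minimal sketch of that body construction, with field names taken from the bundle (the function names here are illustrative):

```javascript
// Build the `openrouter:web_search` tool entry; domain filters become
// snake_case parameters, and an empty filter set omits `parameters` entirely.
function webSearchTool(options) {
  const parameters = {};
  if (options.allowedDomains?.length) parameters.allowed_domains = options.allowedDomains;
  if (options.excludedDomains?.length) parameters.excluded_domains = options.excludedDomains;
  return Object.keys(parameters).length > 0
    ? { type: "openrouter:web_search", parameters }
    : { type: "openrouter:web_search" };
}

// Request body POSTed to `${XAI_BASE_URL}/responses`.
function requestBody(prompt, options, stream) {
  return { model: options.model, input: prompt, tools: [webSearchTool(options)], stream };
}
```

This is a sketch of what the bundle sends, not a supported extension point; the CLI assembles this body for you.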
44
+ ## Use With Agents
45
+
46
+ This repo ships with an installable skill:
47
+
48
+ ```bash
49
+ npx skills add <owner>/<repo> --skill grok-search-cli
50
+ ```
51
+
52
+ For Codex, use it as a research-agent preset: `gpt-5.4-mini` with `low` reasoning. See [agents/codex.yaml](./agents/codex.yaml).
53
+
54
+ Trigger it with:
55
+
56
+ ```text
57
+ Spawn a grok-research researcher agent with gpt-5.4-mini and low reasoning, then use grok-search for high-freshness web+X research.
58
+ ```
59
+
60
+ ## Provider Recommendations
61
+
62
+ Recommended order:
63
+
64
+ 1. Use official xAI APIs when possible. This is the most direct and stable path for `web_search` and `x_search`.
65
+ 2. OpenRouter is also a solid option. Its web plugin can use native xAI search for xAI models, including both Web Search and X Search.
66
+ 3. If you use a third-party proxy, verify search support yourself. Many proxies only expose `/chat/completions`, and search only works if the proxy provider has enabled web search on their side.
67
+
68
+ ## Two Modes
69
+
70
+ ### xAI Official Mode
71
+
72
+ Connect directly to xAI and use the Responses API with both tools enabled:
73
+
74
+ - `xai.tools.webSearch()`
75
+ - `xai.tools.xSearch()`
76
+
77
+ This is the full-featured mode.
78
+
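In official mode, the bundle maps CLI flags onto the option objects it passes to `xai.tools.webSearch()` and `xai.tools.xSearch()` from `@ai-sdk/xai`. A sketch of that mapping, mirroring `dist/cli.js` (the helper names here are illustrative):

```javascript
// Empty lists and false booleans collapse to undefined so they are
// omitted from the tool configuration rather than sent as empty filters.
function webSearchOptions(o) {
  return {
    allowedDomains: o.allowedDomains.length > 0 ? o.allowedDomains : undefined,
    excludedDomains: o.excludedDomains.length > 0 ? o.excludedDomains : undefined,
    enableImageUnderstanding: o.enableImageUnderstanding || undefined,
  };
}

function xSearchOptions(o) {
  return {
    allowedXHandles: o.allowedHandles.length > 0 ? o.allowedHandles : undefined,
    excludedXHandles: o.excludedHandles.length > 0 ? o.excludedHandles : undefined,
    fromDate: o.fromDate,
    toDate: o.toDate,
    enableImageUnderstanding: o.enableImageUnderstanding || undefined,
    enableVideoUnderstanding: o.enableVideoUnderstanding || undefined,
  };
}
```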
79
+ ### Proxy Mode
80
+
81
+ If your gateway only supports OpenAI-style `/chat/completions`, enable compatibility mode:
82
+
83
+ ```env
84
+ XAI_BASE_URL=https://your-proxy.example.com/v1
85
+ XAI_COMPAT_MODE=true
86
+ ```
87
+
88
+ In this mode the CLI switches to completion calls and does not explicitly register xAI tools.
89
+
90
+ If `XAI_BASE_URL` points to OpenRouter, the CLI uses OpenRouter Responses API web search and sends OpenRouter-specific search fields.
91
+
92
+ In compatibility mode:
93
+
94
+ - actual search behavior depends on the proxy
95
+ - tool-specific flags such as `--allowed-domains`, `--allowed-handles`, and `--from-date` are not forwarded; the CLI prints a warning if you pass them
96
+
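The mode selection above is driven by the base URL's hostname plus the `XAI_COMPAT_MODE` flag. A sketch of the decision, mirroring the logic in the bundled `dist/cli.js` (function names are illustrative):

```javascript
// Classify the endpoint by hostname: no base URL or *.x.ai means the
// official xAI API, *.openrouter.ai means OpenRouter, anything else is
// treated as a third-party gateway.
function providerKind(baseUrl) {
  if (!baseUrl) return "xai";
  try {
    const { hostname } = new URL(baseUrl);
    if (hostname === "openrouter.ai" || hostname.endsWith(".openrouter.ai")) return "openrouter";
    if (hostname === "x.ai" || hostname.endsWith(".x.ai")) return "xai";
  } catch {
    // Unparseable URLs fall through to the third-party default below.
  }
  return "third-party";
}

// OpenRouter always gets the Responses API; XAI_COMPAT_MODE is ignored
// there. Elsewhere, a truthy flag switches to /chat/completions mode.
function apiMode(baseUrl, compatModeRaw) {
  const compat =
    providerKind(baseUrl) !== "openrouter" &&
    /^(1|true|yes|on)$/i.test(String(compatModeRaw ?? ""));
  return compat ? "completion" : "responses";
}
```

`grok-search doctor` prints the result of this classification, so you can confirm which path your configuration will take.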
97
+ ## Common Commands
98
+
99
+ ```bash
100
+ grok-search "What are people saying about xAI on X right now?"
101
+ ```
102
+
103
+ ```bash
104
+ grok-search "Somebody said xAI open-sourced model X today. Is that true?"
105
+ ```
106
+
107
+ ```bash
108
+ grok-search "Find the latest arXiv papers and GitHub repos about browser-use agents"
109
+ ```
110
+
111
+ ```bash
112
+ grok-search "latest AI SDK updates" \
113
+ --allowed-domains=ai-sdk.dev,vercel.com
114
+ ```
115
+
116
+ ```bash
117
+ grok-search "latest xAI status on X" \
118
+ --allowed-handles=xai,elonmusk \
119
+ --from-date=2026-04-01
120
+ ```
121
+
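Comma-separated flag values like the ones above are split into trimmed lists with empty entries dropped. A sketch of that parsing, mirroring the bundled CLI (the function name is illustrative):

```javascript
// "a.com, b.com,," -> ["a.com", "b.com"]; a missing value yields [].
function splitList(value) {
  return value ? value.split(",").map((s) => s.trim()).filter(Boolean) : [];
}
```

Note that the bundle also rejects combining `--allowed-domains` with `--excluded-domains`, and likewise `--allowed-handles` with `--excluded-handles`.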
122
+ ```bash
123
+ grok-search "latest xAI updates" --json
124
+ ```
125
+
126
+ ```bash
127
+ grok-search skill
128
+ ```
129
+
130
+ The command prints the bundled skill to `stdout`, so it can also be redirected:
131
+
132
+ ```bash
133
+ grok-search skill > ~/.codex/skills/grok-search-cli/SKILL.md
134
+ ```
135
+
136
+ ```bash
137
+ grok-search --help
138
+ ```
139
+
140
+ ## Local Development
141
+
142
+ For local `.env` workflows, `pnpm dev` already runs through `dotenvx`:
143
+
144
+ ```bash
145
+ pnpm install
146
+ pnpm dev "latest xAI updates" --verbose
147
+ ```
package/README.zh.md ADDED
@@ -0,0 +1,147 @@
1
+ # grok-search-cli
2
+
3
+ [English](./README.md)
4
+
5
+ Expose xAI's combined Web + X search as a single general-purpose CLI for agents that can only invoke shell commands. Suited to prompts that need live web results, X discussion, and a pass of model synthesis in the same call.
6
+
7
+ ## Get Started
8
+
9
+ The minimum setup is just two things: install the package, then provide the required environment variables.
10
+
11
+ ```bash
12
+ pnpm add -g grok-search-cli
13
+ ```
14
+
15
+ ```env
16
+ XAI_API_KEY=your_xai_api_key
17
+ XAI_MODEL=grok-4-1-fast-non-reasoning
18
+ # XAI_BASE_URL=https://your-proxy.example.com/v1 # optional proxy base URL
19
+ # XAI_COMPAT_MODE=true # use OpenAI-compatible /chat/completions mode
20
+ ```
21
+
22
+ Run it:
23
+
24
+ ```bash
25
+ grok-search "latest xAI updates"
26
+ ```
27
+
28
+ If you do not want a global install:
29
+
30
+ ```bash
31
+ npx grok-search-cli "latest xAI updates"
32
+ ```
33
+
34
+ If you go through OpenRouter, configure it like this:
35
+
36
+ ```env
37
+ XAI_API_KEY=your_openrouter_api_key
38
+ XAI_BASE_URL=https://openrouter.ai/api/v1
39
+ XAI_MODEL=x-ai/grok-4-fast:online
40
+ ```
41
+
42
+ With OpenRouter, the CLI automatically uses the OpenRouter Responses API for web search and forwards OpenRouter-specific search fields.
43
+
44
+ ## Use With Agents
45
+
46
+ This repo ships with an installable skill:
47
+
48
+ ```bash
49
+ npx skills add <owner>/<repo> --skill grok-search-cli
50
+ ```
51
+
52
+ For Codex, use it as a research-agent preset: `gpt-5.4-mini` with `low` reasoning by default; see [agents/codex.yaml](./agents/codex.yaml).
53
+
54
+ Trigger it with:
55
+
56
+ ```text
57
+ Spawn a grok-research researcher agent with gpt-5.4-mini and low reasoning, then use grok-search for high-freshness web+X research.
58
+ ```
59
+
60
+ ## Provider Recommendations
61
+
62
+ Recommended order:
63
+
64
+ 1. Prefer the official xAI API. It is the most direct and stable path for `web_search` and `x_search`.
65
+ 2. OpenRouter is also solid. Its web plugin can use native xAI search for xAI models, covering both Web Search and X Search.
66
+ 3. If you use a third-party proxy, verify search support yourself. Many proxies only expose `/chat/completions`, and whether search actually works depends on the provider enabling web search on their side.
67
+
68
+ ## Two Modes
69
+
70
+ ### xAI Official Mode
71
+
72
+ Connect directly to the official xAI API, use the Responses API, and register both tools:
73
+
74
+ - `xai.tools.webSearch()`
75
+ - `xai.tools.xSearch()`
76
+
77
+ This is the full-featured mode.
78
+
79
+ ### Proxy Mode
80
+
81
+ If your API gateway only supports OpenAI-style `/chat/completions`, enable compatibility mode:
82
+
83
+ ```env
84
+ XAI_BASE_URL=https://your-proxy.example.com/v1
85
+ XAI_COMPAT_MODE=true
86
+ ```
87
+
88
+ In this mode the CLI switches to completion calls and no longer explicitly registers xAI tools.
89
+
90
+ If `XAI_BASE_URL` points to OpenRouter, the CLI uses the OpenRouter Responses API and sends parameters using OpenRouter's web-search request fields.
91
+
92
+ In compatibility mode:
93
+
94
+ - whether and how search actually happens is decided by the gateway
95
+ - tool flags such as `--allowed-domains`, `--allowed-handles`, and `--from-date` are not forwarded; the CLI prints a warning if you pass them
96
+
97
+ ## Common Commands
98
+
99
+ ```bash
100
+ grok-search "What are people saying about xAI on X right now?"
101
+ ```
102
+
103
+ ```bash
104
+ grok-search "Somebody said xAI open-sourced a model today. Is that true?"
105
+ ```
106
+
107
+ ```bash
108
+ grok-search "Find the latest arXiv papers and GitHub repos about browser-use agents"
109
+ ```
110
+
111
+ ```bash
112
+ grok-search "latest AI SDK updates" \
113
+ --allowed-domains=ai-sdk.dev,vercel.com
114
+ ```
115
+
116
+ ```bash
117
+ grok-search "latest xAI status on X" \
118
+ --allowed-handles=xai,elonmusk \
119
+ --from-date=2026-04-01
120
+ ```
121
+
122
+ ```bash
123
+ grok-search "latest xAI updates" --json
124
+ ```
125
+
126
+ ```bash
127
+ grok-search skill
128
+ ```
129
+
130
+ The command prints the bundled skill to `stdout`, so it can also be redirected and saved:
131
+
132
+ ```bash
133
+ grok-search skill > ~/.codex/skills/grok-search-cli/SKILL.md
134
+ ```
135
+
136
+ ```bash
137
+ grok-search --help
138
+ ```
139
+
140
+ ## Local Development
141
+
142
+ For local `.env` workflows, `pnpm dev` already runs through `dotenvx` by default:
143
+
144
+ ```bash
145
+ pnpm install
146
+ pnpm dev "latest xAI updates" --verbose
147
+ ```
@@ -0,0 +1,7 @@
1
+ name: grok-research
2
+ agent_type: researcher
3
+ model: gpt-5.4-mini
4
+ reasoning_effort: low
5
+ description: >
6
+ Research-focused Codex preset for high-freshness web and X search through the
7
+ grok-search CLI.
package/dist/cli.js ADDED
@@ -0,0 +1,721 @@
1
+ #!/usr/bin/env node
2
+ import { generateText as e, streamText as t } from "ai";
3
+ import { chmodSync as n, existsSync as r, mkdirSync as i, readFileSync as a, writeFileSync as o } from "node:fs";
4
+ import s from "node:os";
5
+ import c from "node:path";
6
+ import l from "conf";
7
+ import { createOpenAICompatible as ee } from "@ai-sdk/openai-compatible";
8
+ import { createXai as te, xai as u } from "@ai-sdk/xai";
9
+ import { fileURLToPath as ne } from "node:url";
10
+ //#region src/config.ts
11
+ var d = "grok-4-1-fast-non-reasoning", f = c.join(s.homedir(), ".config", "grok-search-cli");
12
+ function p(e) {
13
+ return typeof e == "string" && e.trim() || void 0;
14
+ }
15
+ function m() {
16
+ return {
17
+ XAI_API_KEY: "",
18
+ XAI_MODEL: d,
19
+ XAI_BASE_URL: "",
20
+ XAI_COMPAT_MODE: !1,
21
+ _examples: {
22
+ xai: {
23
+ XAI_API_KEY: "your_xai_api_key",
24
+ XAI_MODEL: "grok-4-1-fast-non-reasoning"
25
+ },
26
+ openrouter: {
27
+ XAI_API_KEY: "your_openrouter_api_key",
28
+ XAI_MODEL: "x-ai/grok-4.1-fast",
29
+ XAI_BASE_URL: "https://openrouter.ai/api/v1"
30
+ },
31
+ yunwu: {
32
+ XAI_API_KEY: "your_yunwu_api_key",
33
+ XAI_MODEL: "grok-4-fast",
34
+ XAI_BASE_URL: "https://yunwu.ai/v1",
35
+ XAI_COMPAT_MODE: !0
36
+ }
37
+ }
38
+ };
39
+ }
40
+ function h() {
41
+ try {
42
+ return new l({
43
+ projectName: "grok-search-cli",
44
+ configName: "config",
45
+ cwd: f,
46
+ defaults: m()
47
+ });
48
+ } catch (e) {
49
+ let t = e instanceof Error ? e.message : String(e);
50
+ throw Error(`Failed to initialize user config: ${t}`);
51
+ }
52
+ }
53
+ var g = h(), _ = g.path;
54
+ function v() {
55
+ if (i(c.dirname(_), { recursive: !0 }), !r(_)) {
56
+ o(_, `${JSON.stringify(m(), null, 2)}\n`, "utf8");
57
+ try {
58
+ n(_, 384);
59
+ } catch {}
60
+ }
61
+ return _;
62
+ }
63
+ function y() {
64
+ let e = g.store;
65
+ return {
66
+ apiKey: p(process.env.XAI_API_KEY) ?? e?.XAI_API_KEY,
67
+ model: p(process.env.XAI_MODEL) ?? e?.XAI_MODEL ?? d,
68
+ baseUrl: p(process.env.XAI_BASE_URL) ?? e?.XAI_BASE_URL,
69
+ compatModeRaw: process.env.XAI_COMPAT_MODE?.trim() ?? e?.XAI_COMPAT_MODE
70
+ };
71
+ }
72
+ function b() {
73
+ let e = g.store;
74
+ return {
75
+ apiKey: p(process.env.XAI_API_KEY) ? "env" : e?.XAI_API_KEY ? "config" : "missing",
76
+ model: p(process.env.XAI_MODEL) ? "env" : e?.XAI_MODEL ? "config" : "default",
77
+ baseUrl: p(process.env.XAI_BASE_URL) ? "env" : e?.XAI_BASE_URL ? "config" : "default",
78
+ compatMode: process.env.XAI_COMPAT_MODE?.trim() ? "env" : e?.XAI_COMPAT_MODE == null ? "default" : "config"
79
+ };
80
+ }
81
+ function x() {
82
+ return y().model;
83
+ }
84
+ function S() {
85
+ return y().apiKey;
86
+ }
87
+ function C() {
88
+ return y().baseUrl;
89
+ }
90
+ function re() {
91
+ let e = y(), t = b(), n = E(), r = T(e.baseUrl);
92
+ return {
93
+ configPath: _,
94
+ apiKeyPresent: !!e.apiKey,
95
+ model: e.model,
96
+ baseUrl: e.baseUrl,
97
+ providerKind: r,
98
+ apiMode: n,
99
+ sources: t
100
+ };
101
+ }
102
+ function w(e) {
103
+ if (!e) return !1;
104
+ try {
105
+ let { hostname: t } = new URL(e);
106
+ return t === "openrouter.ai" || t.endsWith(".openrouter.ai");
107
+ } catch {
108
+ return !1;
109
+ }
110
+ }
111
+ function T(e) {
112
+ if (!e) return "xai";
113
+ if (w(e)) return "openrouter";
114
+ try {
115
+ let { hostname: t } = new URL(e);
116
+ if (t === "x.ai" || t.endsWith(".x.ai")) return "xai";
117
+ } catch {}
118
+ return "third-party";
119
+ }
120
+ function ie() {
121
+ let { compatModeRaw: e, baseUrl: t } = y();
122
+ return T(t) !== "openrouter" && /^(1|true|yes|on)$/i.test(String(e || ""));
123
+ }
124
+ function E() {
125
+ return ie() ? "completion" : "responses";
126
+ }
127
+ function D(e) {
128
+ return T(C()) === "openrouter" && e === "responses" ? "OpenRouter Responses API Beta" : e === "responses" ? "xAI Responses API" : "OpenAI-compatible /chat/completions";
129
+ }
130
+ function O(e, t) {
131
+ if (e === "xai") {
132
+ console.error(t === "responses" ? "xAI Responses API request failed. Check XAI_API_KEY, model name, and search parameters." : "xAI-compatible completion request failed. Check XAI_API_KEY, model name, and gateway settings.");
133
+ return;
134
+ }
135
+ if (e === "openrouter") {
136
+ console.error("OpenRouter request failed. This CLI uses OpenRouter Responses API web search for openrouter.ai endpoints.");
137
+ return;
138
+ }
139
+ console.error("Third-party gateway request failed. Verify the gateway supports the selected API shape and that web search is enabled on the provider side.");
140
+ }
141
+ //#endregion
142
+ //#region src/args.ts
143
+ function k(e) {
144
+ process.stderr.write(`${e}\n`), process.exit(1);
145
+ }
146
+ function ae() {
147
+ let e = x();
148
+ process.stdout.write(`\
149
+ Usage:
150
+ grok-search "<prompt>" [options]
151
+ grok-search skill
152
+ grok-search doctor
153
+
154
+ Options:
155
+ skill Print the bundled skill to stdout
156
+ doctor Show config and credential diagnostics
157
+ --model=<id> Override model. Default: ${e}
158
+ --timeout=<seconds> Request timeout. Default: 60
159
+ --json Output JSON
160
+ --verbose Print request and token diagnostics
161
+ --allowed-domains=a.com,b.com Web Search allowed domains
162
+ --excluded-domains=a.com,b.com Web Search excluded domains
163
+ --allowed-handles=xai,elonmusk X Search allowed handles
164
+ --excluded-handles=spam1,spam2 X Search excluded handles
165
+ --from-date=YYYY-MM-DD X Search start date
166
+ --to-date=YYYY-MM-DD X Search end date
167
+ --image Enable image understanding
168
+ --video Enable video understanding for X Search
169
+ -h, --help Show this help
170
+
171
+ Environment:
172
+ XAI_API_KEY Required
173
+ `);
174
+ }
175
+ function oe(e) {
176
+ return e ? e.split(",").map((e) => e.trim()).filter(Boolean) : [];
177
+ }
178
+ function A(e, t, n) {
179
+ let r = e.indexOf("=");
180
+ return r >= 0 ? e.slice(r + 1) : t[n + 1];
181
+ }
182
+ function se() {
183
+ return {
184
+ model: x(),
185
+ timeoutMs: 6e4,
186
+ json: !1,
187
+ verbose: !1,
188
+ allowedDomains: [],
189
+ excludedDomains: [],
190
+ allowedHandles: [],
191
+ excludedHandles: [],
192
+ enableImageUnderstanding: !1,
193
+ enableVideoUnderstanding: !1
194
+ };
195
+ }
196
+ function j(e, t, n, r, i) {
197
+ let a = A(e, t, n)?.trim();
198
+ return a || k(`Missing value for ${r}`), i(a), e.includes("=") ? n : n + 1;
199
+ }
200
+ function M(e, t, n, r) {
201
+ return r(oe(A(e, t, n))), e.includes("=") ? n : n + 1;
202
+ }
203
+ function ce(e, t, n, r) {
204
+ let i = A(e, t, n)?.trim(), a = Number(i);
205
+ return (!i || !Number.isFinite(a) || a <= 0) && k("Invalid value for --timeout. Use a number of seconds > 0."), r.timeoutMs = Math.round(a * 1e3), e.includes("=") ? n : n + 1;
206
+ }
207
+ function N(e) {
208
+ let t = se(), n = [];
209
+ for (let r = 0; r < e.length; r += 1) {
210
+ let i = e[r];
211
+ if (!i.startsWith("-")) {
212
+ n.push(i);
213
+ continue;
214
+ }
215
+ if (i === "--json") {
216
+ t.json = !0;
217
+ continue;
218
+ }
219
+ if (i === "--verbose") {
220
+ t.verbose = !0;
221
+ continue;
222
+ }
223
+ if (i === "--image") {
224
+ t.enableImageUnderstanding = !0;
225
+ continue;
226
+ }
227
+ if (i === "--video") {
228
+ t.enableVideoUnderstanding = !0;
229
+ continue;
230
+ }
231
+ if (i.startsWith("--model")) {
232
+ r = j(i, e, r, "--model", (e) => {
233
+ t.model = e;
234
+ });
235
+ continue;
236
+ }
237
+ if (i.startsWith("--timeout")) {
238
+ r = ce(i, e, r, t);
239
+ continue;
240
+ }
241
+ if (i.startsWith("--allowed-domains")) {
242
+ r = M(i, e, r, (e) => {
243
+ t.allowedDomains = e;
244
+ });
245
+ continue;
246
+ }
247
+ if (i.startsWith("--excluded-domains")) {
248
+ r = M(i, e, r, (e) => {
249
+ t.excludedDomains = e;
250
+ });
251
+ continue;
252
+ }
253
+ if (i.startsWith("--allowed-handles")) {
254
+ r = M(i, e, r, (e) => {
255
+ t.allowedHandles = e;
256
+ });
257
+ continue;
258
+ }
259
+ if (i.startsWith("--excluded-handles")) {
260
+ r = M(i, e, r, (e) => {
261
+ t.excludedHandles = e;
262
+ });
263
+ continue;
264
+ }
265
+ if (i.startsWith("--from-date")) {
266
+ r = j(i, e, r, "--from-date", (e) => {
267
+ t.fromDate = e;
268
+ });
269
+ continue;
270
+ }
271
+ if (i.startsWith("--to-date")) {
272
+ r = j(i, e, r, "--to-date", (e) => {
273
+ t.toDate = e;
274
+ });
275
+ continue;
276
+ }
277
+ k(`Unknown option: ${i}`);
278
+ }
279
+ let r = n.join(" ").trim();
280
+ return r || k("Missing prompt."), t.allowedDomains.length > 0 && t.excludedDomains.length > 0 && k("Use either --allowed-domains or --excluded-domains, not both."), t.allowedHandles.length > 0 && t.excludedHandles.length > 0 && k("Use either --allowed-handles or --excluded-handles, not both."), {
281
+ command: "all",
282
+ prompt: r,
283
+ options: t
284
+ };
285
+ }
286
+ function P(e) {
287
+ if ((e.length === 0 || e.includes("--help") || e.includes("-h")) && (ae(), process.exit(0)), e[0] === "skill") {
288
+ let t = e.filter((e, t) => t > 0 && e.startsWith("-") && e !== "--help" && e !== "-h");
289
+ return t.length > 0 && k(`Unknown option: ${t[0]}`), { command: "skill" };
290
+ }
291
+ if (e[0] === "doctor") {
292
+ let t = e.filter((e, t) => t > 0 && e.startsWith("-") && e !== "--help" && e !== "-h");
293
+ return t.length > 0 && k(`Unknown option: ${t[0]}`), { command: "doctor" };
294
+ }
295
+ return N(e);
296
+ }
297
+ //#endregion
298
+ //#region src/openrouter.ts
299
+ function F(e) {
300
+ let t = {};
301
+ return e.allowedDomains.length > 0 && (t.allowed_domains = e.allowedDomains), e.excludedDomains.length > 0 && (t.excluded_domains = e.excludedDomains), I(t) ? {
302
+ type: "openrouter:web_search",
303
+ parameters: t
304
+ } : { type: "openrouter:web_search" };
305
+ }
306
+ function I(e) {
307
+ return Object.keys(e).length > 0;
308
+ }
309
+ function L(e, t, n) {
310
+ return {
311
+ model: t.model,
312
+ input: e,
313
+ tools: [F(t)],
314
+ stream: n
315
+ };
316
+ }
317
+ async function R(e) {
318
+ if (!e.ok) throw Error(await e.text());
319
+ return e;
320
+ }
321
+ function z(e) {
322
+ return (e.output?.find((e) => e.type === "message"))?.content?.find((e) => e.type === "output_text")?.text ?? "";
323
+ }
324
+ function B(e) {
325
+ return ((e.output?.find((e) => e.type === "message"))?.content?.find((e) => e.type === "output_text"))?.annotations?.filter((e) => e.type === "url_citation").map((e) => ({
326
+ sourceType: "url_citation",
327
+ url: e.url
328
+ })) ?? [];
329
+ }
330
+ function V(e) {
331
+ return {
332
+ inputTokens: e?.input_tokens,
333
+ outputTokens: e?.output_tokens,
334
+ totalTokens: e?.total_tokens,
335
+ outputTokenDetails: {
336
+ textTokens: e?.output_tokens,
337
+ reasoningTokens: void 0
338
+ }
339
+ };
340
+ }
341
+ function H(e) {
342
+ return {
343
+ text: z(e),
344
+ finishReason: e.status ?? "unknown",
345
+ usage: V(e.usage),
346
+ sources: B(e)
347
+ };
348
+ }
349
+ async function U(e) {
350
+ return R(await fetch(`${e.baseUrl}/responses`, {
351
+ method: "POST",
352
+ headers: {
353
+ Authorization: `Bearer ${e.apiKey}`,
354
+ "Content-Type": "application/json"
355
+ },
356
+ body: JSON.stringify(e.body),
357
+ signal: e.abortSignal
358
+ }));
359
+ }
360
+ function W(e) {
361
+ return (t) => ({
362
+ ...t,
363
+ tools: [F(e)]
364
+ });
365
+ }
366
+ function G(e) {
367
+ if (!e.startsWith("data: ")) return { type: "ignore" };
368
+ let t = e.slice(6);
369
+ if (t === "[DONE]") return { type: "done" };
370
+ let n = JSON.parse(t);
371
+ return n.type === "response.output_text.delta" && typeof n.delta == "string" ? {
372
+ type: "delta",
373
+ text: n.delta
374
+ } : n.type === "response.completed" ? {
375
+ type: "completed",
376
+ response: n.response
377
+ } : { type: "ignore" };
378
+ }
379
+ async function le(e) {
380
+ return H(await (await U({
381
+ baseUrl: e.baseUrl,
382
+ apiKey: e.apiKey,
383
+ body: L(e.prompt, e.options, !1),
384
+ abortSignal: e.abortSignal
385
+ })).json());
386
+ }
387
+ async function ue(e) {
388
+ let t = (await U({
389
+ baseUrl: e.baseUrl,
390
+ apiKey: e.apiKey,
391
+ body: L(e.prompt, e.options, !0),
392
+ abortSignal: e.abortSignal
393
+ })).body?.getReader();
394
+ if (!t) throw Error("OpenRouter streaming response body is missing.");
395
+ let n = new TextDecoder(), r = "", i, a, o = new Promise((e, t) => {
396
+ i = e, a = t;
397
+ }), s = !1, c = (async function* () {
398
+ try {
399
+ for (;;) {
400
+ let { done: e, value: a } = await t.read();
401
+ if (e) break;
402
+ r += n.decode(a, { stream: !0 });
403
+ let o = r.split("\n");
404
+ r = o.pop() ?? "";
405
+ for (let e of o) {
406
+ let t = G(e);
407
+ if (t.type === "delta") {
408
+ yield t.text;
409
+ continue;
410
+ }
411
+ if (t.type === "completed") {
412
+ s = !0, i?.(t.response);
413
+ continue;
414
+ }
415
+ if (t.type === "done") return;
416
+ }
417
+ }
418
+ s || a?.(/* @__PURE__ */ Error("OpenRouter stream ended before response.completed."));
419
+ } catch (e) {
420
+ throw a?.(e), e;
421
+ }
422
+ })(), l = o.then(H);
423
+ return {
424
+ textStream: c,
425
+ text: l.then((e) => e.text),
426
+ finishReason: l.then((e) => e.finishReason),
427
+ usage: l.then((e) => e.usage),
428
+ sources: l.then((e) => e.sources)
429
+ };
430
+ }
431
+ //#endregion
432
+ //#region src/output.ts
433
+ function K({ model: e, finishReason: t, requestApi: n, baseUrl: r, durationMs: i, firstTokenLatencyMs: a, usage: o }) {
434
+ let s = o ?? {}, c = s.outputTokens, l = s.outputTokenDetails?.reasoningTokens;
435
+ return {
436
+ model: e,
437
+ finishReason: t,
438
+ requestApi: n,
439
+ baseUrl: r,
440
+ durationMs: i,
441
+ firstTokenLatencyMs: a,
442
+ tokensPerSecond: typeof c == "number" && i > 0 ? Number((c * 1e3 / i).toFixed(2)) : void 0,
443
+ usage: {
444
+ inputTokens: s.inputTokens,
445
+ outputTokens: c,
446
+ reasoningTokens: l,
447
+ totalTokens: s.totalTokens
448
+ }
449
+ };
450
+ }
451
+ function de(e) {
452
+ process.stdout.write("\nVerbose:\n"), process.stdout.write(`Model: ${e.model}\n`), process.stdout.write(`Finish reason: ${e.finishReason}\n`), process.stdout.write(`Request API: ${e.requestApi}\n`), e.baseUrl && process.stdout.write(`Base URL: ${e.baseUrl}\n`);
453
+ let t = [`Duration: ${(e.durationMs / 1e3).toFixed(2)}s`];
454
+ e.firstTokenLatencyMs != null && t.push(`first token ${(e.firstTokenLatencyMs / 1e3).toFixed(2)}s`), process.stdout.write(`${t.join(", ")}\n`);
455
+ let n = [
456
+ e.usage.inputTokens == null ? void 0 : `input=${e.usage.inputTokens}`,
457
+ e.usage.outputTokens == null ? void 0 : `output=${e.usage.outputTokens}`,
458
+ e.usage.reasoningTokens == null ? void 0 : `reasoning=${e.usage.reasoningTokens}`,
459
+ e.usage.totalTokens == null ? void 0 : `total=${e.usage.totalTokens}`
460
+ ].filter(Boolean);
461
+ n.length > 0 && process.stdout.write(`Usage: ${n.join(", ")}\n`), e.tokensPerSecond != null && process.stdout.write(`TPS: ${e.tokensPerSecond}\n`);
462
+ }
463
+ function q(e) {
464
+ process.stdout.write(`${e.text}\n`), e.verbose && de(e.verbose);
465
+ }
466
+ async function fe(e) {
467
+ let t = !1, n = !1, r;
468
+ for await (let i of e.stream.textStream) r ??= Date.now() - e.startedAt, t = !0, n = i.endsWith("\n"), process.stdout.write(i);
469
+ (!t || !n) && process.stdout.write("\n");
470
+ let i = await e.stream.usage, a = await e.stream.finishReason ?? "unknown", o = {
471
+ ...e.payload,
472
+ text: await e.stream.text,
473
+ finishReason: a,
474
+ usage: i ?? null,
475
+ sources: await e.stream.sources ?? [],
476
+ verbose: e.includeVerbose ? K({
477
+ model: e.payload.model,
478
+ finishReason: a,
479
+ requestApi: e.requestApi,
480
+ baseUrl: e.baseUrl,
481
+ durationMs: Date.now() - e.startedAt,
482
+ firstTokenLatencyMs: r,
483
+ usage: i
484
+ }) : void 0
485
+ };
486
+ process.stdout.write("\n"), q(o);
487
+ }
488
+ //#endregion
489
+ //#region src/providers.ts
490
+ function J(e) {
491
+ return {
492
+ web_search: u.tools.webSearch({
493
+ allowedDomains: e.allowedDomains.length > 0 ? e.allowedDomains : void 0,
494
+ excludedDomains: e.excludedDomains.length > 0 ? e.excludedDomains : void 0,
495
+ enableImageUnderstanding: e.enableImageUnderstanding || void 0
496
+ }),
497
+ x_search: u.tools.xSearch({
498
+ allowedXHandles: e.allowedHandles.length > 0 ? e.allowedHandles : void 0,
499
+ excludedXHandles: e.excludedHandles.length > 0 ? e.excludedHandles : void 0,
500
+ fromDate: e.fromDate,
501
+ toDate: e.toDate,
502
+ enableImageUnderstanding: e.enableImageUnderstanding || void 0,
503
+ enableVideoUnderstanding: e.enableVideoUnderstanding || void 0
504
+ })
505
+ };
506
+ }
507
+ function Y() {
508
+ let e = C();
509
+ return e ? te({ baseURL: e }) : u;
510
+ }
511
+ function X(e) {
512
+ let t = C();
513
+ return ee({
514
+ name: "compat",
515
+ apiKey: S(),
516
+ baseURL: t || "https://api.x.ai/v1",
517
+ transformRequestBody: w(t) ? W(e) : void 0
518
+ });
519
+ }
520
+ function pe(e) {
521
+ return e.allowedDomains.length > 0 || e.excludedDomains.length > 0 || e.allowedHandles.length > 0 || e.excludedHandles.length > 0 || typeof e.fromDate == "string" || typeof e.toDate == "string" || e.enableImageUnderstanding || e.enableVideoUnderstanding;
522
+ }
523
+ function me(e) {
524
+ return e.allowedHandles.length > 0 || e.excludedHandles.length > 0 || typeof e.fromDate == "string" || typeof e.toDate == "string" || e.enableImageUnderstanding || e.enableVideoUnderstanding;
525
+ }
526
+ //#endregion
527
+ //#region src/skill.ts
528
+ var Z = "grok-search-cli", he = ne(import.meta.url), Q = c.dirname(he);
529
+ function ge(...e) {
530
+ let t = [c.resolve(Q, "..", "skills", Z, ...e), c.resolve(Q, "..", "..", "skills", Z, ...e)];
531
+ for (let e of t) try {
532
+ return a(e, "utf8");
533
+ } catch (e) {
534
+ if (e.code !== "ENOENT") throw e;
535
+ }
536
+ throw Error(`Bundled skill file not found for ${Z}: ${e.join("/")}`);
537
+ }
538
+ function _e() {
539
+ return ge("SKILL.md");
540
+ }
541
+ //#endregion
542
+ //#region src/cli.ts
543
+ function ve() {
544
+ let e = re();
545
+ if (process.stdout.write("Doctor:\n"), process.stdout.write(`Config path: ${e.configPath}\n`), process.stdout.write(`API key: ${e.apiKeyPresent ? "present" : "missing"} (${e.sources.apiKey})\n`), process.stdout.write(`Model: ${e.model} (${e.sources.model})\n`), process.stdout.write(`Base URL: ${e.baseUrl ?? "(xAI default)"} (${e.sources.baseUrl})\n`), process.stdout.write(`Compat mode source: ${e.sources.compatMode}\n`), process.stdout.write(`Provider: ${e.providerKind}\n`), process.stdout.write(`API mode: ${e.apiMode}\n`), process.stdout.write(`Status: ${e.apiKeyPresent ? "OK" : "NOT READY"}\n`), !e.apiKeyPresent) {
546
+ let e = v();
547
+ process.stdout.write(`Action: edit ${e}\n`);
548
+ }
549
+ }
550
+ function ye() {
551
+ let e = S();
552
+ return e || k(`Missing XAI_API_KEY.\n\nEdit ${v()} or set XAI_API_KEY in your shell environment.\nThis CLI reads credentials from process.env first, then ${_}.`), e;
553
+ }
554
+ function be(e, t) {
555
+ let n = C();
556
+ return e === "completion" && pe(t) && !w(n);
557
+ }
558
+ function xe(e, t) {
559
+ return $(e) && me(t);
560
+ }
561
+ function $(e) {
562
+ return T(C()) === "openrouter" && e === "responses";
563
+ }
564
+ function Se(e) {
565
+ return process.stdout.isTTY && !e;
566
+ }
567
+ function Ce(e) {
568
+ return {
569
+ command: e.command,
570
+ apiMode: e.apiMode,
571
+ model: e.model,
572
+ prompt: e.prompt,
573
+ text: e.result.text,
574
+ finishReason: e.result.finishReason ?? "unknown",
575
+ usage: e.result.usage ?? null,
576
+ sources: e.result.sources ?? [],
577
+ verbose: e.verbose ? K({
578
+ model: e.model,
579
+ finishReason: e.result.finishReason ?? "unknown",
580
+ requestApi: e.requestApi,
581
+ baseUrl: e.baseUrl,
582
+ durationMs: Date.now() - e.startedAt,
583
+ usage: e.result.usage ?? null
584
+ }) : void 0
585
+ };
586
+ }
587
+ function we(e) {
588
+ if (e.apiMode === "responses") {
589
+ let n = t({
590
+ model: Y().responses(e.model),
591
+ prompt: e.prompt,
592
+ tools: J(e.options),
593
+ abortSignal: e.abortSignal
594
+ });
595
+ return {
596
+ textStream: n.textStream,
597
+ text: n.text,
598
+ finishReason: n.finishReason,
599
+ usage: n.usage,
600
+ sources: n.sources
601
+ };
602
+ }
603
+ let n = t({
604
+ model: X(e.options)(e.model),
605
+ prompt: e.prompt,
606
+ abortSignal: e.abortSignal
607
+ });
608
+ return {
609
+ textStream: n.textStream,
610
+ text: n.text,
611
+ finishReason: n.finishReason,
612
+ usage: n.usage,
613
+ sources: n.sources
614
+ };
615
+ }
616
+ async function Te(t) {
617
+ if (t.apiMode === "responses") {
618
+ let n = await e({
619
+ model: Y().responses(t.model),
620
+ prompt: t.prompt,
621
+ tools: J(t.options),
622
+ abortSignal: t.abortSignal
623
+ });
624
+ return {
625
+ text: n.text,
626
+ finishReason: n.finishReason ?? "unknown",
627
+ usage: n.usage ?? null,
628
+ sources: n.sources ?? []
629
+ };
630
+ }
631
+ let n = await e({
632
+ model: X(t.options)(t.model),
633
+ prompt: t.prompt,
634
+ abortSignal: t.abortSignal
635
+ });
636
+ return {
637
+ text: n.text,
638
+ finishReason: n.finishReason ?? "unknown",
639
+ usage: n.usage ?? null,
640
+ sources: n.sources ?? []
641
+ };
642
+ }
643
+ async function Ee(e) {
644
+ await fe({
645
+ stream: $(e.apiMode) ? await ue({
646
+ prompt: e.prompt,
647
+ options: e.options,
648
+ baseUrl: e.baseUrl || "https://openrouter.ai/api/v1",
649
+ apiKey: e.apiKey,
650
+ abortSignal: e.abortSignal
651
+ }) : we(e),
652
+ payload: {
653
+ command: "all",
654
+ apiMode: e.apiMode,
655
+ model: e.model,
656
+ prompt: e.prompt
657
+ },
658
+ includeVerbose: e.options.verbose,
659
+ requestApi: e.requestApi,
660
+ baseUrl: e.baseUrl,
661
+ startedAt: e.startedAt
662
+ });
663
+ }
664
+ async function De(e) {
665
+ let t = $(e.apiMode) ? await le({
666
+ prompt: e.prompt,
667
+ options: e.options,
668
+ baseUrl: e.baseUrl || "https://openrouter.ai/api/v1",
669
+ apiKey: e.apiKey,
670
+ abortSignal: e.abortSignal
671
+ }) : await Te(e), n = Ce({
672
+ command: "all",
673
+ apiMode: e.apiMode,
674
+ model: e.model,
675
+ prompt: e.prompt,
676
+ result: t,
677
+ verbose: e.options.verbose,
678
+ requestApi: e.requestApi,
679
+ baseUrl: e.baseUrl,
680
+ startedAt: e.startedAt
681
+ });
682
+ if (e.options.json) {
683
+ process.stdout.write(`${JSON.stringify(n, null, 2)}\n`);
684
+ return;
685
+ }
686
+ q(n);
687
+ }
688
+ async function Oe() {
689
+ let e = P(process.argv.slice(2));
690
+ if (e.command === "skill") {
691
+ process.stdout.write(`${_e()}\n`);
692
+ return;
693
+ }
694
+ if (e.command === "doctor") {
695
+ ve();
696
+ return;
697
+ }
698
+ let t = ye(), n = E(), r = C(), i = D(n), a = Date.now(), o = AbortSignal.timeout(e.options.timeoutMs);
699
+ be(n, e.options) && console.error("Warning: compatibility mode is enabled, so tool-specific filters are not forwarded. Remove XAI_COMPAT_MODE to use xAI Responses API."), xe(n, e.options) && console.error("Warning: OpenRouter server-tool mode currently forwards web domain filters only. X-specific filters such as handles, dates, image, and video options are ignored.");
700
+ let s = {
701
+ prompt: e.prompt,
702
+ model: e.options.model,
703
+ apiMode: n,
704
+ options: e.options,
705
+ requestApi: i,
706
+ baseUrl: r,
707
+ startedAt: a,
708
+ abortSignal: o,
709
+ apiKey: t
710
+ };
711
+ if (Se(e.options.json)) {
712
+ await Ee(s);
713
+ return;
714
+ }
715
+ await De(s);
716
+ }
717
+ Oe().catch((e) => {
718
+ let t = e instanceof Error ? e.stack || e.message : String(e);
719
+ O(T(C()), E()), process.stderr.write(`${t}\n`), process.exit(1);
720
+ });
721
+ //#endregion
package/package.json ADDED
@@ -0,0 +1,40 @@
1
+ {
2
+ "name": "grok-search-cli",
3
+ "version": "0.1.1",
4
+ "description": "CLI for xAI webSearch and xSearch via the Vercel AI SDK provider.",
5
+ "license": "MIT",
6
+ "type": "module",
7
+ "bin": {
8
+ "grok-search": "dist/cli.js"
9
+ },
10
+ "files": [
11
+ "agents",
12
+ "dist",
13
+ "skills",
14
+ "README.md",
15
+ "README.zh.md",
16
+ ".env.example",
17
+ ".env.openrouter.example",
18
+ ".env.yunwu.example"
19
+ ],
20
+ "engines": {
21
+ "node": ">=22.0.0"
22
+ },
23
+ "dependencies": {
24
+ "@ai-sdk/openai-compatible": "^2.0.38",
25
+ "@ai-sdk/xai": "^3.0.77",
26
+ "ai": "^6.0.145",
27
+ "conf": "^14.0.0"
28
+ },
29
+ "devDependencies": {
30
+ "@dotenvx/dotenvx": "^1.51.1",
31
+ "@types/node": "^24.5.2",
32
+ "tsx": "^4.20.6",
33
+ "typescript": "^5.9.2",
34
+ "vite": "^8.0.3"
35
+ },
36
+ "scripts": {
37
+ "build": "vite build",
38
+ "dev": "dotenvx run -f .env -- tsx src/cli.ts"
39
+ }
40
+ }
@@ -0,0 +1,118 @@
1
+ ---
2
+ name: grok-search-cli
3
+ description: Use for high-freshness web+X research through a CLI, with Grok-backed synthesis plus JSON/citations output for agent workflows.
4
+ ---
5
+
6
+ # grok-search-cli
7
+
8
+ Use this skill when you need high-freshness search from a shell command and the runtime can invoke a CLI but does not natively integrate with xAI search tools.
9
+
10
+ ## Why Use It
11
+
12
+ This CLI is useful because Grok search is strong on freshness.
13
+
14
+ In xAI official mode, the CLI does not just call a plain search API and dump links back. It hands the research step to a Grok model with server-side search tools. In practice, that is close to letting a search-focused subagent go online, collect current results, and do one round of filtering and synthesis before the answer comes back.
15
+
16
+ That makes this tool a good fit when you want current information with a lower hallucination risk than answering from stale memory.
17
+
18
+ Especially consider this tool when searching:
19
+
20
+ - X discussions and real-time public sentiment
21
+ - GitHub repositories, release chatter, issue discussions, and ecosystem updates
22
+ - papers, research announcements, and technical writeups
23
+ - any current technical question where freshness matters more than raw recall
24
+
25
+ ## What It Does
26
+
27
+ This skill exposes one CLI entrypoint for:
28
+
29
+ - combined web + X search
30
+ - JSON output for downstream agent processing
31
+
32
+ ## Choose the Right Mode
33
+
34
+ ### xAI official mode
35
+
36
+ Use this when direct xAI access is available.
37
+
38
+ Required environment:
39
+
40
+ ```bash
41
+ export XAI_API_KEY=...
42
+ ```
43
+
44
+ Optional:
45
+
46
+ ```bash
47
+ export XAI_MODEL=grok-4-1-fast-non-reasoning
48
+ ```
49
+
50
+ In this mode the CLI uses xAI Responses API server-side tools:
51
+
52
+ - `xai.tools.webSearch()`
53
+ - `xai.tools.xSearch()`
54
+
55
+ ### Proxy mode
56
+
57
+ Use this when the endpoint only supports OpenAI-style `/chat/completions`.
58
+
59
+ Required environment:
60
+
61
+ ```bash
62
+ export XAI_API_KEY=...
63
+ export XAI_BASE_URL=https://your-proxy.example.com/v1
64
+ export XAI_COMPAT_MODE=true
65
+ ```
66
+
67
+ Optional:
68
+
69
+ ```bash
70
+ export XAI_MODEL=grok-4-1-fast-non-reasoning
71
+ ```
72
+
73
+ In this mode the CLI falls back to plain chat-completion requests against the configured base URL. Tool-specific filters are not forwarded because search behavior is defined by the proxy.
74
+
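The split between the two modes can be sketched as follows. This is an illustration only: the real detection logic lives in the bundled `cli.js`, and the exact precedence of `XAI_COMPAT_MODE` and `XAI_BASE_URL` shown here is an assumption based on the documentation above.

```javascript
// Illustrative sketch (not the package's actual code): how the documented
// environment variables map to the two modes described above.
function resolveMode(env) {
  // XAI_COMPAT_MODE plus a custom base URL selects the OpenAI-compatible
  // proxy path; otherwise the CLI uses xAI Responses API server-side tools.
  if (env.XAI_COMPAT_MODE === "true" && env.XAI_BASE_URL) {
    return "proxy";
  }
  return "responses";
}

console.log(resolveMode({ XAI_COMPAT_MODE: "true", XAI_BASE_URL: "https://yunwu.ai/v1" })); // proxy
console.log(resolveMode({ XAI_API_KEY: "..." })); // responses
```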
75
+ ## When To Reach For It
76
+
77
+ Prefer this CLI over answering from memory when:
78
+
79
+ - the user asks for the latest, current, or recent state of something, or about today
80
+ - the answer depends on X, GitHub, papers, release notes, or public web reporting
81
+ - you want the model to search first and synthesize second
82
+ - you want a smaller chance of hallucinating versions, timelines, or public discussion
83
+
84
+ It is still a search-and-synthesis tool, not a perfect source of truth. For important claims, inspect the returned sources.
85
+
86
+ ## Command Patterns
87
+
88
+ Default mode is combined web + X search:
89
+
90
+ ```bash
91
+ grok-search "latest xAI updates"
92
+ ```
93
+
94
+ Machine-readable output:
95
+
96
+ ```bash
97
+ grok-search "latest xAI updates" --json
98
+ ```
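A downstream consumer can parse the `--json` payload programmatically. The field names below (`command`, `apiMode`, `model`, `prompt`) appear in the bundled output builder, but the full shape of the envelope is not documented here, so inspect real output before relying on any path; the sample object is hypothetical.

```javascript
// Hypothetical sample envelope; verify the real shape with:
//   grok-search "..." --json
const raw =
  '{"command":"all","apiMode":"responses","model":"grok-4-1-fast-non-reasoning","prompt":"latest xAI updates"}';
const data = JSON.parse(raw);

// Pull out routing metadata before handing the result to another agent step.
console.log(`${data.apiMode}: ${data.model}`);
```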
99
+
100
+ Useful filters in xAI official mode:
101
+
102
+ ```bash
103
+ grok-search "latest AI SDK updates" --allowed-domains=ai-sdk.dev,vercel.com
104
+ grok-search "latest xAI status" --allowed-handles=xai,elonmusk --from-date=2026-04-01
105
+ ```
106
+
107
+ ## When to Prefer This Skill
108
+
109
+ - The agent can run shell commands but cannot call xAI APIs directly
110
+ - You want one stable CLI for combined web and X research
111
+ - You want citations or sources in stdout / JSON
112
+ - You need a fallback path for OpenAI-compatible proxy endpoints
113
+ - You want Grok to do one search/synthesis pass before the answer reaches the main agent
114
+
115
+ ## Notes
116
+
117
+ - `--json` is the safest mode for agent consumption
118
+ - In proxy mode, search is proxy-defined and tool-specific flags are ignored with a warning