consult-llm-mcp 2.7.1 → 2.7.2
- package/CHANGELOG.md +10 -0
- package/README.md +8 -34
- package/package.json +5 -5
package/CHANGELOG.md
CHANGED

@@ -1,5 +1,15 @@
 # Changelog
 
+## v2.7.1 (2026-03-09)
+
+- Monitor: show "Thinking..." spinner when thinking events are streaming
+- Monitor: auto-enable follow mode when scrolled to bottom in detail view
+- Monitor: sort servers with active consultations above idle ones
+- Monitor: show tool error messages in detail view
+- Fixed cursor-agent thinking deltas containing literal `\n` instead of newlines
+- Fixed cursor-agent crash on unknown tool types
+- Fixed monitor event flushing for real-time detail view updates
+
 ## v2.7.0 (2026-03-08)
 
 - Fixed cursor-agent tool success detection
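As context for the changelog entry about literal `\n` in thinking deltas: this class of bug happens when an upstream stream escapes newlines as the two-character sequence backslash-`n`. A minimal sketch of the kind of normalization involved (the function name and approach are illustrative assumptions, not the package's actual code):

```typescript
// Hypothetical sketch: cursor-agent streamed thinking deltas in which
// newlines arrived as the literal two characters "\" + "n". Before
// rendering, such deltas need the escape sequence turned into a real
// line break.
function normalizeThinkingDelta(delta: string): string {
  // Replace every literal backslash-n pair with an actual newline.
  return delta.replace(/\\n/g, "\n");
}
```

A delta like `"step one\\nstep two"` (backslash-n as two characters) would then render as two lines instead of showing `\n` inline.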
package/README.md
CHANGED

@@ -1,6 +1,6 @@
 # consult-llm-mcp
 
-An MCP server that lets Claude Code consult stronger AI models (GPT-5.
+An MCP server that lets Claude Code consult stronger AI models (GPT-5.4, Gemini
 3.1 Pro, DeepSeek Reasoner) when Sonnet has you running in circles and you need
 to bring in the heavy artillery. Supports multi-turn conversations.
 
@@ -23,11 +23,11 @@ to bring in the heavy artillery. Supports multi-turn conversations.
 ```
 
 [Quick start](#quick-start) · [Configuration](#configuration) ·
-[Monitor TUI](#monitor) · [Changelog](CHANGELOG.md)
+[Skills](#skills) · [Monitor TUI](#monitor) · [Changelog](CHANGELOG.md)
 
 ## Features
 
-- Query powerful AI models (GPT-5.
+- Query powerful AI models (GPT-5.4, Gemini 3.1 Pro, DeepSeek Reasoner) with
   relevant files as context
 - Direct queries with optional file context
 - Include git changes for code review and analysis
@@ -42,6 +42,7 @@ to bring in the heavy artillery. Supports multi-turn conversations.
   across requests with `thread_id`
 - [Web mode](#web-mode): Copy formatted prompts to clipboard for browser-based
   LLM services
+- [Skills](#skills): Multi-LLM debate, collaboration, and consultation workflows
 - Less is more: Single MCP tool to not clutter the context
 
 <img src="meta/monitor-screenshot.webp" alt="consult-llm-monitor screenshot" width="600">
@@ -583,9 +584,6 @@ claude mcp add consult-llm \
   -- npx -y consult-llm-mcp
 ```
 
-Alternatively, use a [slash command](#example-slash-command) with hardcoded
-model names for guaranteed model selection.
-
 ## MCP tool: consult_llm
 
 The server provides a single tool called `consult_llm` for asking powerful AI
@@ -732,22 +730,14 @@ agent's context. This allows Claude to infer when to call the MCP from natural
 language (e.g., "ask gemini about..."). Works out of the box, but you have less
 control over how the MCP is invoked.
 
-### 2.
-
-Explicitly invoke with `/consult ask gemini about X`. Guaranteed activation with
-full control over custom instructions, but requires the explicit syntax. For
-example, you can instruct Claude to always find related files and pass them as
-context via the `files` parameter. See the
-[example slash command](#example-slash-command) below.
-
-### 3. Skills
+### 2. Skills
 
 Automatically triggers when Claude detects matching intent. Like slash commands,
 supports custom instructions (e.g., always gathering relevant files), but not
-always reliably triggered. See the [
+always reliably triggered. See the [consult skill](#consult) below.
 
-**Recommendation:** Start with no custom activation. Use
-
+**Recommendation:** Start with no custom activation. Use skills if you need
+custom instructions for how the MCP is invoked.
 
 ## Installing skills
 
@@ -782,22 +772,6 @@ strictly necessary since Claude can infer from the schema that "ask gemini"
 should call this MCP, but it gives more precise control over how the agent calls
 this MCP.
 
-## Slash command
-
-Here's an example
-[Claude Code slash command](https://code.claude.com/docs/en/slash-commands) that
-uses the `consult_llm` MCP tool. See [examples/consult.md](examples/consult.md)
-for the full content.
-
-Save it as `~/.claude/commands/consult.md` and you can then use it by typing
-`/consult ask gemini about X` or `/consult ask codex about X` in Claude Code.
-
-## Multi-LLM skills
-
-Skills that orchestrate multi-turn conversations between LLMs. All use
-`thread_id` to maintain conversation context across rounds, so each LLM
-remembers the full history without resending everything.
-
 ### collab
 
 **Collaborative ideation.** Gemini and Codex independently brainstorm ideas,
package/package.json
CHANGED

@@ -1,6 +1,6 @@
 {
   "name": "consult-llm-mcp",
-  "version": "2.7.1",
+  "version": "2.7.2",
   "description": "MCP server for consulting powerful AI models",
   "repository": {
     "type": "git",
@@ -31,9 +31,9 @@
     "ai"
   ],
   "optionalDependencies": {
-    "consult-llm-mcp-darwin-arm64": "2.7.1",
-    "consult-llm-mcp-darwin-x64": "2.7.1",
-    "consult-llm-mcp-linux-x64": "2.7.1",
-    "consult-llm-mcp-linux-arm64": "2.7.1"
+    "consult-llm-mcp-darwin-arm64": "2.7.2",
+    "consult-llm-mcp-darwin-x64": "2.7.2",
+    "consult-llm-mcp-linux-x64": "2.7.2",
+    "consult-llm-mcp-linux-arm64": "2.7.2"
   }
 }
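The per-platform packages in `optionalDependencies` follow a common npm pattern: one package per platform/arch pair, and a launcher picks the matching one at runtime. A sketch of how such a name is typically derived (illustrative, not this package's actual launcher code):

```typescript
// Hypothetical helper: derive the platform-specific binary package name
// from the base name, following the naming scheme visible in
// optionalDependencies (e.g. consult-llm-mcp-darwin-arm64).
function platformPackageName(base: string): string {
  // process.platform is e.g. "darwin" or "linux";
  // process.arch is e.g. "arm64" or "x64".
  return `${base}-${process.platform}-${process.arch}`;
}
```

On an Apple Silicon Mac this would yield `consult-llm-mcp-darwin-arm64`, matching one of the listed optional dependencies.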