@aigne/example-mcp-server 0.3.89 → 0.3.90-beta.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/CHANGELOG.md CHANGED
@@ -1,5 +1,28 @@
  # Changelog
 
+ ## [0.3.90-beta.1](https://github.com/AIGNE-io/aigne-framework/compare/example-mcp-server-v0.3.90-beta...example-mcp-server-v0.3.90-beta.1) (2025-12-08)
+
+
+ ### Bug Fixes
+
+ * correct run example & doc improvements ([#707](https://github.com/AIGNE-io/aigne-framework/issues/707)) ([f98fc5d](https://github.com/AIGNE-io/aigne-framework/commit/f98fc5df28fd6ce6134128c2f0e5395c1554b740))
+
+
+ ### Dependencies
+
+ * The following workspace dependencies were updated
+   * dependencies
+     * @aigne/cli bumped to 1.58.0-beta.1
+
+ ## [0.3.90-beta](https://github.com/AIGNE-io/aigne-framework/compare/example-mcp-server-v0.3.89...example-mcp-server-v0.3.90-beta) (2025-12-07)
+
+
+ ### Dependencies
+
+ * The following workspace dependencies were updated
+   * dependencies
+     * @aigne/cli bumped to 1.58.0-beta
+
  ## [0.3.89](https://github.com/AIGNE-io/aigne-framework/compare/example-mcp-server-v0.3.89-beta...example-mcp-server-v0.3.89) (2025-12-06)
 
 
package/README.md CHANGED
@@ -21,9 +21,9 @@ The [Model Context Protocol (MCP)](https://modelcontextprotocol.io) is an open s
 
  ## Quick Start (No Installation Required)
 
- ```bash
- OPENAI_API_KEY="" # Set your OpenAI API key here
+ ### Run the Example
 
+ ```bash
  # Start the MCP server
  npx -y @aigne/example-mcp-server serve-mcp --port 3456
 
@@ -34,18 +34,46 @@ npx -y @aigne/example-mcp-server serve-mcp --port 3456
 
  This command will start the MCP server with the agents defined in this example
 
- ### Using Different Models
+ ### Connect to an AI Model
+
+ As an example, running `npx -y @aigne/example-mcp-server serve-mcp --port 3456` requires an AI model. If this is your first run, you need to connect one.
+
+ ![run example](./run-example.png)
+
+ - Connect via the official AIGNE Hub
+
+   Choose the first option and your browser will open the official AIGNE Hub page. Follow the prompts to complete the connection. If you're a new user, the system automatically grants 400,000 tokens for you to use.
+
+   ![connect to official aigne hub](../images/connect-to-aigne-hub.png)
+
+ - Connect via a self-hosted AIGNE Hub
+
+   Choose the second option, enter the URL of your self-hosted AIGNE Hub, and follow the prompts to complete the connection. If you need to set up a self-hosted AIGNE Hub, visit the Blocklet Store to install and deploy it: [Blocklet Store](https://store.blocklet.dev/blocklets/z8ia3xzq2tMq8CRHfaXj1BTYJyYnEcHbqP8cJ?utm_source=www.arcblock.io&utm_medium=blog_link&utm_campaign=default&utm_content=store.blocklet.dev#:~:text=%F0%9F%9A%80%20Get%20Started%20in%20Minutes).
+
+   ![connect to self hosted aigne hub](../images/connect-to-self-hosted-aigne-hub.png)
+
+ - Connect via a third-party model provider
+
+   Using OpenAI as an example, you can configure the provider's API key via environment variables. After configuration, run the example again:
+
+   ```bash
+   export OPENAI_API_KEY="" # Set your OpenAI API key here
+   ```
+
+   For more details on third-party model configuration (e.g., OpenAI, DeepSeek, Google Gemini), see [.env.local.example](./.env.local.example).
+
+ ### Debugging
+
+ The `aigne observe` command starts a local web server to monitor and analyze agent execution data. It provides a user-friendly interface to inspect traces, view detailed call information, and understand your agent's behavior during runtime. This tool is essential for debugging, performance tuning, and gaining insight into how your agent processes information and interacts with tools and models.
+
+ Start the observation server.
+
+ ![aigne-observe-execute](../images/aigne-observe-execute.png)
 
- You can use different AI models by setting the `MODEL` environment variable along with the corresponding API key. The framework supports multiple providers:
+ View a list of recent executions.
 
- * **OpenAI**: `MODEL="openai:gpt-4.1"` with `OPENAI_API_KEY`
- * **Anthropic**: `MODEL="anthropic:claude-3-7-sonnet-latest"` with `ANTHROPIC_API_KEY`
- * **Google Gemini**: `MODEL="gemini:gemini-2.0-flash"` with `GEMINI_API_KEY`
- * **AWS Bedrock**: `MODEL="bedrock:us.amazon.nova-premier-v1:0"` with AWS credentials
- * **DeepSeek**: `MODEL="deepseek:deepseek-chat"` with `DEEPSEEK_API_KEY`
- * **OpenRouter**: `MODEL="openrouter:openai/gpt-4o"` with `OPEN_ROUTER_API_KEY`
- * **xAI**: `MODEL="xai:grok-2-latest"` with `XAI_API_KEY`
- * **Ollama**: `MODEL="ollama:llama3.2"` with `OLLAMA_DEFAULT_BASE_URL`
+ ![aigne-observe-list](../images/aigne-observe-list.png)
 
  ## Available Agents
 
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
    "name": "@aigne/example-mcp-server",
-   "version": "0.3.89",
+   "version": "0.3.90-beta.1",
    "description": "A demonstration of using AIGNE CLI to build a MCP server",
    "author": "Arcblock <blocklet@arcblock.io> https://github.com/blocklet",
    "homepage": "https://github.com/AIGNE-io/aigne-framework/tree/main/examples/mcp-server",
@@ -12,7 +12,7 @@
    "type": "module",
    "bin": "aigne.yaml",
    "dependencies": {
-     "@aigne/cli": "^1.57.3"
+     "@aigne/cli": "^1.58.0-beta.1"
    },
    "scripts": {
      "test": "aigne test",
Binary file
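
Taken together, the quick-start steps documented in the README changes above amount to one short shell session: configure a provider, start the server, and optionally open the observability UI. A minimal sketch, assuming OpenAI as the provider; the key value and model name below are placeholders chosen for illustration (the `MODEL="provider:model"` convention comes from the package's earlier README), not part of the package itself:

```shell
# Configure a third-party provider (OpenAI here) via environment variables.
export OPENAI_API_KEY="sk-replace-me"   # placeholder: substitute your real key
export MODEL="openai:gpt-4o-mini"       # hypothetical provider:model choice

# Start the MCP server on port 3456 (long-running; run in its own terminal):
#   npx -y @aigne/example-mcp-server serve-mcp --port 3456

# In a second terminal, open the observability UI described under "Debugging":
#   aigne observe

# Sanity-check the provider prefix parsed from MODEL.
echo "provider: ${MODEL%%:*}"           # → provider: openai
```

The long-running commands are left commented out so the snippet can be pasted safely; uncomment them once the environment variables are in place.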