@aigne/example-mcp-github 1.17.4 → 1.17.5-beta.1

This diff shows the changes between two publicly released versions of this package, as published to a supported registry. It is provided for informational purposes only.
Files changed (2)
  1. package/README.md +43 -25
  2. package/package.json +7 -7
package/README.md CHANGED
@@ -81,52 +81,70 @@ AI ->> User: Here's the README content: ...

  ## Quick Start (No Installation Required)

+ ### Run the Example
+
  ```bash
- export OPENAI_API_KEY=YOUR_OPENAI_API_KEY # Set your OpenAI API key
  export GITHUB_TOKEN=YOUR_GITHUB_TOKEN # Set your GitHub token

  npx -y @aigne/example-mcp-github # Run the example
  ```

- ## Installation
+ ### Connect to an AI Model

- ### Clone the Repository
+ For example, running `npx -y @aigne/example-mcp-github` requires an AI model. If this is your first run, you need to connect one.
+
+ ![run example](./run-example.png)
+
+ - Connect via the official AIGNE Hub
+
+   Choose the first option; your browser will open the official AIGNE Hub page. Follow the prompts to complete the connection. If you're a new user, the system automatically grants you 400,000 tokens to use.
+
+   ![connect to official aigne hub](../images/connect-to-aigne-hub.png)
+
+ - Connect via a self-hosted AIGNE Hub
+
+   Choose the second option, enter the URL of your self-hosted AIGNE Hub, and follow the prompts to complete the connection. If you need to set up a self-hosted AIGNE Hub, install and deploy it from the [Blocklet Store](https://store.blocklet.dev/blocklets/z8ia3xzq2tMq8CRHfaXj1BTYJyYnEcHbqP8cJ?utm_source=www.arcblock.io&utm_medium=blog_link&utm_campaign=default&utm_content=store.blocklet.dev#:~:text=%F0%9F%9A%80%20Get%20Started%20in%20Minutes).
+
+   ![connect to self hosted aigne hub](../images/connect-to-self-hosted-aigne-hub.png)
+
+ - Connect via a third-party model provider
+
+   Using OpenAI as an example, you can configure the provider's API key via environment variables:

  ```bash
- git clone https://github.com/AIGNE-io/aigne-framework
+ export OPENAI_API_KEY="" # Set your OpenAI API key here
  ```
+ For more details on third-party model configuration (e.g., OpenAI, DeepSeek, Google Gemini), see [.env.local.example](./.env.local.example).

- ### Install Dependencies
+ After configuration, run the example again.

- ```bash
- cd aigne-framework/examples/mcp-github
+ ### Debugging

- pnpm install
- ```
+ The `aigne observe` command starts a local web server to monitor and analyze agent execution data. It provides a user-friendly interface to inspect traces, view detailed call information, and understand your agent's behavior during runtime. This tool is essential for debugging, performance tuning, and gaining insight into how your agent processes information and interacts with tools and models.
+
+ Start the observation server:

- ### Setup Environment Variables
+ ![aigne-observe-execute](../images/aigne-observe-execute.png)
+
+ View a list of recent executions:
+
+ ![aigne-observe-list](../images/aigne-observe-list.png)
+
+ ## Installation

- Setup your API keys in the `.env.local` file:
+ ### Clone the Repository

  ```bash
- OPENAI_API_KEY="" # Set your OpenAI API key here
- GITHUB_TOKEN="" # Set your GitHub Personal Access Token here
+ git clone https://github.com/AIGNE-io/aigne-framework
  ```

- #### Using Different Models
-
- You can use different AI models by setting the `MODEL` environment variable along with the corresponding API key. The framework supports multiple providers:
+ ### Install Dependencies

- * **OpenAI**: `MODEL="openai:gpt-4.1"` with `OPENAI_API_KEY`
- * **Anthropic**: `MODEL="anthropic:claude-3-7-sonnet-latest"` with `ANTHROPIC_API_KEY`
- * **Google Gemini**: `MODEL="gemini:gemini-2.0-flash"` with `GEMINI_API_KEY`
- * **AWS Bedrock**: `MODEL="bedrock:us.amazon.nova-premier-v1:0"` with AWS credentials
- * **DeepSeek**: `MODEL="deepseek:deepseek-chat"` with `DEEPSEEK_API_KEY`
- * **OpenRouter**: `MODEL="openrouter:openai/gpt-4o"` with `OPEN_ROUTER_API_KEY`
- * **xAI**: `MODEL="xai:grok-2-latest"` with `XAI_API_KEY`
- * **Ollama**: `MODEL="ollama:llama3.2"` with `OLLAMA_DEFAULT_BASE_URL`
+ ```bash
+ cd aigne-framework/examples/mcp-github

- For detailed configuration examples, please refer to the `.env.local.example` file in this directory.
+ pnpm install
+ ```

  ### Run the Example

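The third-party-provider section added above configures a model purely through environment variables such as `OPENAI_API_KEY` (with `DEEPSEEK_API_KEY` and `GEMINI_API_KEY` appearing in the removed model list and in `.env.local.example`). A minimal sketch of that pattern, assuming a simple first-match resolution order that is illustrative only, not the framework's actual logic:

```javascript
// Illustrative sketch: pick a provider from environment variables, in the
// spirit of the README's third-party section. The resolution order and the
// returned object shape are assumptions, not AIGNE's implementation.
function resolveProvider(env) {
  if (env.OPENAI_API_KEY) return { provider: "openai", apiKey: env.OPENAI_API_KEY };
  if (env.DEEPSEEK_API_KEY) return { provider: "deepseek", apiKey: env.DEEPSEEK_API_KEY };
  if (env.GEMINI_API_KEY) return { provider: "gemini", apiKey: env.GEMINI_API_KEY };
  return null; // no key set: fall back to the interactive AIGNE Hub flow
}

console.log(resolveProvider({ OPENAI_API_KEY: "sk-test" }).provider); // "openai"
console.log(resolveProvider({})); // null, so the CLI would prompt for a Hub connection
```

In a real run you would set the variable in your shell (`export OPENAI_API_KEY=...`) before `npx -y @aigne/example-mcp-github`, and the framework reads it from `process.env`.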
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
    "name": "@aigne/example-mcp-github",
-   "version": "1.17.4",
+   "version": "1.17.5-beta.1",
    "description": "A demonstration of using AIGNE Framework and GitHub MCP Server to interact with GitHub repositories",
    "author": "Arcblock <blocklet@arcblock.io> https://github.com/blocklet",
    "homepage": "https://github.com/AIGNE-io/aigne-framework/tree/main/examples/mcp-github",
@@ -16,15 +16,15 @@
      "README.md"
    ],
    "dependencies": {
-     "@aigne/agent-library": "^1.22.4",
-     "@aigne/cli": "^1.57.3",
-     "@aigne/core": "^1.70.1",
-     "@aigne/default-memory": "^1.3.4",
-     "@aigne/openai": "^0.16.14"
+     "@aigne/core": "^1.71.0-beta.1",
+     "@aigne/cli": "^1.58.0-beta.1",
+     "@aigne/agent-library": "^1.23.0-beta.1",
+     "@aigne/default-memory": "^1.3.5-beta.1",
+     "@aigne/openai": "^0.16.15-beta.1"
    },
    "devDependencies": {
      "@types/bun": "^1.2.22",
-     "@aigne/test-utils": "^0.5.68-beta.1"
+     "@aigne/test-utils": "^0.5.68-beta.1"
    },
    "scripts": {
      "start": "bun run index.ts",