@aigne/example-mcp-github 1.12.6 → 1.13.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/.env.local.example CHANGED
@@ -1,7 +1,48 @@
  # Change the name of this file to .env.local and fill in the following values

- DEBUG=aigne:mcp
+ # Uncomment the lines below to enable debug logging
+ # DEBUG="aigne:*"

- OPENAI_API_KEY="" # Your OpenAI API key
+ # Use different Models

+ # OpenAI
+ MODEL="openai:gpt-4.1"
+ OPENAI_API_KEY="YOUR_OPENAI_API_KEY"
+
+ # Anthropic claude
+ # MODEL="anthropic:claude-3-7-sonnet-latest"
+ # ANTHROPIC_API_KEY=""
+
+ # Gemini
+ # MODEL="gemini:gemini-2.0-flash"
+ # GEMINI_API_KEY=""
+
+ # Bedrock nova
+ # MODEL=bedrock:us.amazon.nova-premier-v1:0
+ # AWS_ACCESS_KEY_ID=""
+ # AWS_SECRET_ACCESS_KEY=""
+ # AWS_REGION=us-west-2
+
+ # DeepSeek
+ # MODEL="deepseek:deepseek-chat"
+ # DEEPSEEK_API_KEY=""
+
+ # OpenRouter
+ # MODEL="openrouter:openai/gpt-4o"
+ # OPEN_ROUTER_API_KEY=""
+
+ # xAI
+ # MODEL="xai:grok-2-latest"
+ # XAI_API_KEY=""
+
+ # Ollama
+ # MODEL="ollama:llama3.2"
+ # OLLAMA_DEFAULT_BASE_URL="http://localhost:11434/v1";
+
+
+ # Setup proxy if needed
+ # HTTPS_PROXY=http://localhost:7890
+
+
+ # Setup your GitHub Personal Access Token for GitHub integration
  GITHUB_TOKEN="" # Your GitHub Personal Access Token
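
The updated template above selects a provider through a single `MODEL` value in the `provider[:model]` form (everything before the first `:` names the provider, and the rest is an optional model identifier, which may itself contain `:` as in the Bedrock example). The following standalone TypeScript sketch only illustrates that convention; it is not code shipped in this package and not the AIGNE framework's own loader.

```typescript
// Standalone sketch: split a MODEL value such as "openai:gpt-4.1" or
// "bedrock:us.amazon.nova-premier-v1:0" into its provider and model parts.
// Only the text before the first ":" is the provider; the remainder (which
// may itself contain ":") is the optional model name.
interface ModelSelection {
  provider: string;
  model?: string;
}

function parseModel(value: string): ModelSelection {
  const separator = value.indexOf(":");
  if (separator === -1) {
    // e.g. MODEL="openai" keeps the provider's default model
    return { provider: value };
  }
  return {
    provider: value.slice(0, separator),
    model: value.slice(separator + 1),
  };
}

// Read the variable configured in .env.local, falling back to "openai",
// the default the README documents for --model.
const selection = parseModel(process.env.MODEL ?? "openai");
console.log(selection); // { provider: "openai", model: "gpt-4.1" } for the template value
```
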
package/README.md CHANGED
@@ -64,12 +64,12 @@ AI ->> User: Here's the README content: ...

  ## Prerequisites

- - [Node.js](https://nodejs.org) and npm installed on your machine
- - An [OpenAI API key](https://platform.openai.com/api-keys) for interacting with OpenAI's services
- - [GitHub Personal Access Token](https://github.com/settings/tokens) with appropriate permissions
- - Optional dependencies (if running the example from source code):
-   - [Bun](https://bun.sh) for running unit tests & examples
-   - [Pnpm](https://pnpm.io) for package management
+ * [Node.js](https://nodejs.org) and npm installed on your machine
+ * An [OpenAI API key](https://platform.openai.com/api-keys) for interacting with OpenAI's services
+ * [GitHub Personal Access Token](https://github.com/settings/tokens) with appropriate permissions
+ * Optional dependencies (if running the example from source code):
+   * [Bun](https://bun.sh) for running unit tests & examples
+   * [Pnpm](https://pnpm.io) for package management

  ## Quick Start (No Installation Required)

@@ -105,6 +105,21 @@ OPENAI_API_KEY="" # Set your OpenAI API key here
  GITHUB_TOKEN="" # Set your GitHub Personal Access Token here
  ```

+ #### Using Different Models
+
+ You can use different AI models by setting the `MODEL` environment variable along with the corresponding API key. The framework supports multiple providers:
+
+ * **OpenAI**: `MODEL="openai:gpt-4.1"` with `OPENAI_API_KEY`
+ * **Anthropic**: `MODEL="anthropic:claude-3-7-sonnet-latest"` with `ANTHROPIC_API_KEY`
+ * **Google Gemini**: `MODEL="gemini:gemini-2.0-flash"` with `GEMINI_API_KEY`
+ * **AWS Bedrock**: `MODEL="bedrock:us.amazon.nova-premier-v1:0"` with AWS credentials
+ * **DeepSeek**: `MODEL="deepseek:deepseek-chat"` with `DEEPSEEK_API_KEY`
+ * **OpenRouter**: `MODEL="openrouter:openai/gpt-4o"` with `OPEN_ROUTER_API_KEY`
+ * **xAI**: `MODEL="xai:grok-2-latest"` with `XAI_API_KEY`
+ * **Ollama**: `MODEL="ollama:llama3.2"` with `OLLAMA_DEFAULT_BASE_URL`
+
+ For detailed configuration examples, please refer to the `.env.local.example` file in this directory.
+
  ### Run the Example

  ```bash
@@ -124,7 +139,7 @@ The example supports the following command-line parameters:
  | Parameter | Description | Default |
  |-----------|-------------|---------|
  | `--chat` | Run in interactive chat mode | Disabled (one-shot mode) |
- | `--model <provider[:model]>` | AI model to use in format 'provider[:model]' where model is optional. Examples: 'openai' or 'openai:gpt-4o-mini' | openai |
+ | `--model <provider[:model]>` | AI model to use in format 'provider\[:model]' where model is optional. Examples: 'openai' or 'openai:gpt-4o-mini' | openai |
  | `--temperature <value>` | Temperature for model generation | Provider default |
  | `--top-p <value>` | Top-p sampling value | Provider default |
  | `--presence-penalty <value>` | Presence penalty value | Provider default |
@@ -195,7 +210,7 @@ Always provide clear, concise responses with relevant information from GitHub.
  // Example: Search for repositories
  const result = await aigne.invoke(
  agent,
- "Search for repositories related to 'modelcontextprotocol'"
+ "Search for repositories related to 'modelcontextprotocol'",
  );

  console.log(result);
@@ -216,32 +231,32 @@ The GitHub MCP server provides a wide range of operations including:

  1. **Repository Operations**:

-    - Search repositories
-    - Create repositories
-    - Get repository information
+    * Search repositories
+    * Create repositories
+    * Get repository information

  2. **File Operations**:

-    - Get file contents
-    - Create or update files
-    - Push multiple files in a single commit
+    * Get file contents
+    * Create or update files
+    * Push multiple files in a single commit

  3. **Issue and PR Operations**:

-    - Create issues
-    - Create pull requests
-    - Add comments
-    - Merge pull requests
+    * Create issues
+    * Create pull requests
+    * Add comments
+    * Merge pull requests

  4. **Search Operations**:

-    - Search code
-    - Search issues
-    - Search users
+    * Search code
+    * Search issues
+    * Search users

  5. **Commit Operations**:
-    - List commits
-    - Get commit details
+    * List commits
+    * Get commit details

  ## License
 
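Every operation listed above is driven by a natural-language prompt routed through the GitHub MCP server, using the same `aigne.invoke(agent, ...)` call shown in the README snippet earlier in this diff. The sketch below is only an illustration: it assumes `aigne` and `agent` are already constructed as in the full README (their setup is not part of this diff) and therefore takes them as loosely typed parameters, and the repository name in the second prompt is purely hypothetical.

```typescript
// Sketch: exercising a few of the listed GitHub MCP operations through the
// aigne.invoke(agent, prompt) call shown in the README snippet above.
// `aigne` and `agent` come from setup code that is not part of this diff,
// so they are passed in as loosely typed parameters rather than reconstructed.
type Invoker = { invoke: (agent: unknown, prompt: string) => Promise<unknown> };

async function runGitHubExamples(aigne: Invoker, agent: unknown): Promise<void> {
  // Repository operation: search repositories (same prompt as the README snippet)
  console.log(
    await aigne.invoke(agent, "Search for repositories related to 'modelcontextprotocol'"),
  );

  // Issue and PR operation: create an issue (repository name is purely illustrative)
  console.log(
    await aigne.invoke(
      agent,
      "Create an issue titled 'Update the README' in the repository your-org/your-repo",
    ),
  );
}
```
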
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "@aigne/example-mcp-github",
- "version": "1.12.6",
+ "version": "1.13.0",
  "description": "A demonstration of using AIGNE Framework and GitHub MCP Server to interact with GitHub repositories",
  "author": "Arcblock <blocklet@arcblock.io> https://github.com/blocklet",
  "homepage": "https://github.com/AIGNE-io/aigne-framework/tree/main/examples/mcp-github",
@@ -16,14 +16,14 @@
  "README.md"
  ],
  "dependencies": {
- "@aigne/agent-library": "^1.17.3",
- "@aigne/cli": "^1.16.0",
- "@aigne/core": "^1.26.0",
- "@aigne/openai": "^0.4.3"
+ "@aigne/agent-library": "^1.17.5",
+ "@aigne/cli": "^1.18.0",
+ "@aigne/core": "^1.28.0",
+ "@aigne/openai": "^0.6.0"
  },
  "devDependencies": {
  "@types/bun": "^1.2.9",
- "@aigne/test-utils": "^0.4.10"
+ "@aigne/test-utils": "^0.4.12"
  },
  "scripts": {
  "start": "bun run index.ts",