@aigne/example-mcp-blocklet 1.12.6 → 1.13.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -1,4 +1,47 @@
  # Change the name of this file to .env.local and fill in the following values

- OPENAI_API_KEY="" # Your OpenAI API key
+ # Uncomment the lines below to enable debug logging
+ # DEBUG="aigne:*"
+
+ # Use different Models
+
+ # OpenAI
+ MODEL="openai:gpt-4.1"
+ OPENAI_API_KEY="YOUR_OPENAI_API_KEY"
+
+ # Anthropic claude
+ # MODEL="anthropic:claude-3-7-sonnet-latest"
+ # ANTHROPIC_API_KEY=""
+
+ # Gemini
+ # MODEL="gemini:gemini-2.0-flash"
+ # GEMINI_API_KEY=""
+
+ # Bedrock nova
+ # MODEL=bedrock:us.amazon.nova-premier-v1:0
+ # AWS_ACCESS_KEY_ID=""
+ # AWS_SECRET_ACCESS_KEY=""
+ # AWS_REGION=us-west-2
+
+ # DeepSeek
+ # MODEL="deepseek:deepseek-chat"
+ # DEEPSEEK_API_KEY=""
+
+ # OpenRouter
+ # MODEL="openrouter:openai/gpt-4o"
+ # OPEN_ROUTER_API_KEY=""
+
+ # xAI
+ # MODEL="xai:grok-2-latest"
+ # XAI_API_KEY=""
+
+ # Ollama
+ # MODEL="ollama:llama3.2"
+ # OLLAMA_DEFAULT_BASE_URL="http://localhost:11434/v1";
+
+
+ # Setup proxy if needed
+ # HTTPS_PROXY=http://localhost:7890
+
+ # Setup Blocklet app URL
  BLOCKLET_APP_URL="" # Your Blocklet app URL
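The template above is plain `KEY=VALUE` lines with commented-out alternatives. As a minimal sketch of how such a file could be parsed (illustrative only — the example presumably uses a standard dotenv-style loader, and `parseEnvFile` is a hypothetical name, not part of the framework):

```typescript
// Hypothetical parser for a .env.local-style file like the template above.
function parseEnvFile(text: string): Record<string, string> {
  const env: Record<string, string> = {};
  for (const raw of text.split("\n")) {
    const line = raw.trim();
    // Skip blank lines and commented-out settings such as `# MODEL="..."`.
    if (line === "" || line.startsWith("#")) continue;
    const eq = line.indexOf("=");
    if (eq === -1) continue;
    const key = line.slice(0, eq).trim();
    let value = line.slice(eq + 1).trim();
    // Quoted values may carry a trailing comment: BLOCKLET_APP_URL="" # note
    const quoted = value.match(/^"([^"]*)"/);
    value = quoted ? quoted[1] : value.replace(/\s+#.*$/, "");
    env[key] = value;
  }
  return env;
}

const sample = [
  'MODEL="openai:gpt-4.1"',
  '# GEMINI_API_KEY=""',
  "AWS_REGION=us-west-2",
].join("\n");
const env = parseEnvFile(sample);
console.log(env.MODEL, env.AWS_REGION); // openai:gpt-4.1 us-west-2
```

Note that commented-out lines are skipped entirely, which is what lets the template ship every provider's settings while only one is active.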
package/README.md CHANGED
@@ -4,11 +4,11 @@ This demo demonstrates how to use [AIGNE Framework](https://github.com/AIGNE-io/

  ## Prerequisites

- - [Node.js](https://nodejs.org) and npm installed on your machine
- - An [OpenAI API key](https://platform.openai.com/api-keys) for interacting with OpenAI's services
- - Optional dependencies (if running the example from source code):
-   - [Bun](https://bun.sh) for running unit tests & examples
-   - [Pnpm](https://pnpm.io) for package management
+ * [Node.js](https://nodejs.org) and npm installed on your machine
+ * An [OpenAI API key](https://platform.openai.com/api-keys) for interacting with OpenAI's services
+ * Optional dependencies (if running the example from source code):
+   * [Bun](https://bun.sh) for running unit tests & examples
+   * [Pnpm](https://pnpm.io) for package management

  ## Quick Start (No Installation Required)

@@ -50,6 +50,21 @@ OPENAI_API_KEY="" # Set your OpenAI API key here
  BLOCKLET_APP_URL="" # Set your Blocklet app URL here
  ```

+ #### Using Different Models
+
+ You can use different AI models by setting the `MODEL` environment variable along with the corresponding API key. The framework supports multiple providers:
+
+ * **OpenAI**: `MODEL="openai:gpt-4.1"` with `OPENAI_API_KEY`
+ * **Anthropic**: `MODEL="anthropic:claude-3-7-sonnet-latest"` with `ANTHROPIC_API_KEY`
+ * **Google Gemini**: `MODEL="gemini:gemini-2.0-flash"` with `GEMINI_API_KEY`
+ * **AWS Bedrock**: `MODEL="bedrock:us.amazon.nova-premier-v1:0"` with AWS credentials
+ * **DeepSeek**: `MODEL="deepseek:deepseek-chat"` with `DEEPSEEK_API_KEY`
+ * **OpenRouter**: `MODEL="openrouter:openai/gpt-4o"` with `OPEN_ROUTER_API_KEY`
+ * **xAI**: `MODEL="xai:grok-2-latest"` with `XAI_API_KEY`
+ * **Ollama**: `MODEL="ollama:llama3.2"` with `OLLAMA_DEFAULT_BASE_URL`
+
+ For detailed configuration examples, please refer to the `.env.local.example` file in this directory.
+
  ### Run the Example

  ```bash
@@ -69,7 +84,7 @@ The example supports the following command-line parameters:
  | Parameter | Description | Default |
  |-----------|-------------|---------|
  | `--chat` | Run in interactive chat mode | Disabled (one-shot mode) |
- | `--model <provider[:model]>` | AI model to use in format 'provider[:model]' where model is optional. Examples: 'openai' or 'openai:gpt-4o-mini' | openai |
+ | `--model <provider[:model]>` | AI model to use in format 'provider\[:model]' where model is optional. Examples: 'openai' or 'openai:gpt-4o-mini' | openai |
  | `--temperature <value>` | Temperature for model generation | Provider default |
  | `--top-p <value>` | Top-p sampling value | Provider default |
  | `--presence-penalty <value>` | Presence penalty value | Provider default |
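The `provider[:model]` format used by both the `MODEL` variable and the `--model` flag can be illustrated with a small hypothetical parser (`parseModel` is not part of the framework's API; this is only a sketch of the format):

```typescript
// Sketch of splitting a `provider[:model]` spec as described in the README.
function parseModel(spec: string): { provider: string; model?: string } {
  // Split on the FIRST colon only: Bedrock model ids such as
  // "bedrock:us.amazon.nova-premier-v1:0" contain further colons.
  const idx = spec.indexOf(":");
  if (idx === -1) return { provider: spec }; // model part is optional
  return { provider: spec.slice(0, idx), model: spec.slice(idx + 1) };
}

const m = parseModel("bedrock:us.amazon.nova-premier-v1:0");
console.log(m.provider, m.model); // bedrock us.amazon.nova-premier-v1:0
console.log(parseModel("openai").provider); // openai
```

Splitting at the first colon rather than on every colon is what keeps provider prefixes and colon-bearing model ids from being confused.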
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
    "name": "@aigne/example-mcp-blocklet",
-   "version": "1.12.6",
+   "version": "1.13.0",
    "description": "A demonstration of using AIGNE Framework and MCP Server hosted by the Blocklet platform",
    "author": "Arcblock <blocklet@arcblock.io> https://github.com/blocklet",
    "homepage": "https://github.com/AIGNE-io/aigne-framework/tree/main/examples/mcp-blocklet",
@@ -20,13 +20,13 @@
    "jsonwebtoken": "^9.0.2",
    "open": "^10.1.2",
    "zod": "^3.24.4",
-   "@aigne/agent-library": "^1.17.3",
-   "@aigne/cli": "^1.16.0",
-   "@aigne/core": "^1.26.0"
+   "@aigne/agent-library": "^1.17.5",
+   "@aigne/core": "^1.28.0",
+   "@aigne/cli": "^1.18.0"
  },
  "devDependencies": {
    "@types/jsonwebtoken": "^9.0.9",
-   "@aigne/test-utils": "^0.4.10"
+   "@aigne/test-utils": "^0.4.12"
  },
  "scripts": {
    "start": "bun run index.ts",