@aigne/example-mcp-puppeteer 1.14.6 → 1.15.0

This diff shows the publicly released content of the two package versions as published to their respective public registries. It is provided for informational purposes only.
@@ -1,5 +1,44 @@
  # Change the name of this file to .env.local and fill in the following values
 
- DEBUG=aigne:mcp
+ # Uncomment the lines below to enable debug logging
+ # DEBUG="aigne:*"
 
- OPENAI_API_KEY="" # Your OpenAI API key
+ # Use different Models
+
+ # OpenAI
+ MODEL="openai:gpt-4.1"
+ OPENAI_API_KEY="YOUR_OPENAI_API_KEY"
+
+ # Anthropic claude
+ # MODEL="anthropic:claude-3-7-sonnet-latest"
+ # ANTHROPIC_API_KEY=""
+
+ # Gemini
+ # MODEL="gemini:gemini-2.0-flash"
+ # GEMINI_API_KEY=""
+
+ # Bedrock nova
+ # MODEL=bedrock:us.amazon.nova-premier-v1:0
+ # AWS_ACCESS_KEY_ID=""
+ # AWS_SECRET_ACCESS_KEY=""
+ # AWS_REGION=us-west-2
+
+ # DeepSeek
+ # MODEL="deepseek:deepseek-chat"
+ # DEEPSEEK_API_KEY=""
+
+ # OpenRouter
+ # MODEL="openrouter:openai/gpt-4o"
+ # OPEN_ROUTER_API_KEY=""
+
+ # xAI
+ # MODEL="xai:grok-2-latest"
+ # XAI_API_KEY=""
+
+ # Ollama
+ # MODEL="ollama:llama3.2"
+ # OLLAMA_DEFAULT_BASE_URL="http://localhost:11434/v1";
+
+
+ # Setup proxy if needed
+ # HTTPS_PROXY=http://localhost:7890
package/README.md CHANGED
@@ -34,7 +34,6 @@ class evaluate processing
 
  Following is a sequence diagram of the workflow to summarize content from a website:
 
-
  ```mermaid
  sequenceDiagram
  participant User
@@ -55,11 +54,11 @@ AI ->> User: The content is as follows: ...
 
  ## Prerequisites
 
- - [Node.js](https://nodejs.org) and npm installed on your machine
- - An [OpenAI API key](https://platform.openai.com/api-keys) for interacting with OpenAI's services
- - Optional dependencies (if running the example from source code):
- - [Bun](https://bun.sh) for running unit tests & examples
- - [Pnpm](https://pnpm.io) for package management
+ * [Node.js](https://nodejs.org) and npm installed on your machine
+ * An [OpenAI API key](https://platform.openai.com/api-keys) for interacting with OpenAI's services
+ * Optional dependencies (if running the example from source code):
+ * [Bun](https://bun.sh) for running unit tests & examples
+ * [Pnpm](https://pnpm.io) for package management
 
  ## Quick Start (No Installation Required)
 
@@ -100,6 +99,21 @@ Setup your OpenAI API key in the `.env.local` file:
  OPENAI_API_KEY="" # Set your OpenAI API key here
  ```
 
+ #### Using Different Models
+
+ You can use different AI models by setting the `MODEL` environment variable along with the corresponding API key. The framework supports multiple providers:
+
+ * **OpenAI**: `MODEL="openai:gpt-4.1"` with `OPENAI_API_KEY`
+ * **Anthropic**: `MODEL="anthropic:claude-3-7-sonnet-latest"` with `ANTHROPIC_API_KEY`
+ * **Google Gemini**: `MODEL="gemini:gemini-2.0-flash"` with `GEMINI_API_KEY`
+ * **AWS Bedrock**: `MODEL="bedrock:us.amazon.nova-premier-v1:0"` with AWS credentials
+ * **DeepSeek**: `MODEL="deepseek:deepseek-chat"` with `DEEPSEEK_API_KEY`
+ * **OpenRouter**: `MODEL="openrouter:openai/gpt-4o"` with `OPEN_ROUTER_API_KEY`
+ * **xAI**: `MODEL="xai:grok-2-latest"` with `XAI_API_KEY`
+ * **Ollama**: `MODEL="ollama:llama3.2"` with `OLLAMA_DEFAULT_BASE_URL`
+
+ For detailed configuration examples, please refer to the `.env.local.example` file in this directory.
+
  ### Run the Example
 
  ```bash
@@ -113,7 +127,7 @@ The example supports the following command-line parameters:
  | Parameter | Description | Default |
  |-----------|-------------|---------|
  | `--chat` | Run in interactive chat mode | Disabled (one-shot mode) |
- | `--model <provider[:model]>` | AI model to use in format 'provider[:model]' where model is optional. Examples: 'openai' or 'openai:gpt-4o-mini' | openai |
+ | `--model <provider[:model]>` | AI model to use in format 'provider\[:model]' where model is optional. Examples: 'openai' or 'openai:gpt-4o-mini' | openai |
  | `--temperature <value>` | Temperature for model generation | Provider default |
  | `--top-p <value>` | Top-p sampling value | Provider default |
  | `--presence-penalty <value>` | Presence penalty value | Provider default |
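The `--model` flag and the `MODEL` environment variable in this release share the same `provider[:model]` format. As a minimal sketch (the `parseModelSpec` helper below is illustrative, not part of the package), splitting on the first colon keeps the model segment intact even when it contains colons itself, as in the Bedrock id above:

```typescript
// Illustrative helper (not an @aigne API): split a "provider[:model]"
// spec into its parts. Splitting on the FIRST colon only matters for
// ids like "bedrock:us.amazon.nova-premier-v1:0", where the model
// segment itself contains a colon.
function parseModelSpec(spec: string): { provider: string; model?: string } {
  const sep = spec.indexOf(":");
  if (sep === -1) return { provider: spec }; // model omitted: provider default
  return { provider: spec.slice(0, sep), model: spec.slice(sep + 1) };
}
```

Under this split, `openai` alone selects the provider's default model, while `openai:gpt-4o-mini` pins a specific one.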
@@ -139,12 +153,10 @@ echo "extract content from https://www.arcblock.io" | pnpm start
  The following example demonstrates how to extract content from a website:
 
  ```typescript
- import assert from "node:assert";
  import { AIAgent, AIGNE, MCPAgent } from "@aigne/core";
  import { OpenAIChatModel } from "@aigne/core/models/openai-chat-model.js";
 
  const { OPENAI_API_KEY } = process.env;
- assert(OPENAI_API_KEY, "Please set the OPENAI_API_KEY environment variable");
 
  const model = new OpenAIChatModel({
  apiKey: OPENAI_API_KEY,
@@ -168,7 +180,10 @@ const agent = AIAgent.from({
  `,
  });
 
- const result = await aigne.invoke(agent, "extract content from https://www.arcblock.io");
+ const result = await aigne.invoke(
+ agent,
+ "extract content from https://www.arcblock.io",
+ );
 
  console.log(result);
  // output:
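Note that the README diff above removes the `node:assert` guard, so the destructured `OPENAI_API_KEY` is typed `string | undefined`. A minimal fail-fast replacement (a hypothetical helper shown for illustration, not code from the package) would be:

```typescript
// Hypothetical guard (not part of the package): mirrors the removed
// assert and narrows `string | undefined` to `string` before use.
function requireEnv(
  env: Record<string, string | undefined>,
  name: string,
): string {
  const value = env[name];
  if (!value) {
    throw new Error(`Please set the ${name} environment variable`);
  }
  return value;
}

// Usage sketch:
// const OPENAI_API_KEY = requireEnv(process.env, "OPENAI_API_KEY");
```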
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "@aigne/example-mcp-puppeteer",
- "version": "1.14.6",
+ "version": "1.15.0",
  "description": "A demonstration of using AIGNE Framework and Puppeteer MCP Server to extract content from websites using Puppeteer",
  "author": "Arcblock <blocklet@arcblock.io> https://github.com/blocklet",
  "homepage": "https://github.com/AIGNE-io/aigne-framework/tree/main/examples/mcp-puppeteer",
@@ -16,14 +16,14 @@
  "README.md"
  ],
  "dependencies": {
- "@aigne/agent-library": "^1.17.3",
- "@aigne/cli": "^1.16.0",
- "@aigne/openai": "^0.4.3",
- "@aigne/core": "^1.26.0"
+ "@aigne/agent-library": "^1.17.5",
+ "@aigne/cli": "^1.18.0",
+ "@aigne/core": "^1.28.0",
+ "@aigne/openai": "^0.6.0"
  },
  "devDependencies": {
  "@types/bun": "^1.2.9",
- "@aigne/test-utils": "^0.4.10"
+ "@aigne/test-utils": "^0.4.12"
  },
  "scripts": {
  "start": "bun run index.ts",