iridet-bot 0.1.1a1__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -0,0 +1,369 @@
1
+ Metadata-Version: 2.4
2
+ Name: iridet-bot
3
+ Version: 0.1.1a1
4
+ Summary: Full-stack AI agent with Python backend and Vue frontend
5
+ Requires-Python: >=3.9
6
+ Description-Content-Type: text/markdown
7
+ Requires-Dist: fastapi>=0.104.0
8
+ Requires-Dist: uvicorn>=0.24.0
9
+ Requires-Dist: python-dotenv>=1.0.0
10
+ Requires-Dist: openai>=1.0.0
11
+ Requires-Dist: pillow>=10.0.0
12
+ Requires-Dist: pydantic>=2.0.0
13
+ Requires-Dist: pydantic-settings>=2.0.0
14
+ Requires-Dist: aiofiles>=23.0.0
15
+ Requires-Dist: jinja2>=3.0.0
16
+
17
+ # IriBot - Lightweight AI Agent Chat System
18
+
19
+ A full-featured AI agent application with tool-calling capabilities and a real-time conversation experience, built on a full-stack architecture: a Python FastAPI backend and a Vue 3 frontend.
20
+
21
+ ## 📦 PyPI Release & CLI
22
+
23
+ - Install: `pip install iribot`
24
+ - Run: `iribot --host 127.0.0.1 --port 8000`
25
+ - Build: use the Makefile (`make build` builds the frontend automatically and bundles it into the backend's static assets)
26
+
27
+ ## ✨ Key Features
28
+
29
+ ### 🤖 AI Agent Conversation
30
+
31
+ - Intelligent conversation powered by OpenAI API
32
+ - Streaming response support for real-time AI replies
33
+ - Image input support (vision capabilities)
34
+ - Customizable system prompts
35
+
36
+ ### 🛠️ Tool Calling System
37
+
38
+ The agent can autonomously call the following tools to complete tasks:
39
+
40
+ - **File Operations**
41
+ - `read_file` - Read file contents
42
+ - `write_file` - Create or modify files
43
+ - `list_directory` - List directory contents
44
+
45
+ - **Command Execution**
46
+ - `shell_start` - Start an interactive shell session
47
+ - `shell_run` - Execute commands in shell
48
+ - `shell_read` - Read shell output
49
+ - `shell_write` - Write input to shell
50
+ - `shell_stop` - Stop shell session
51
+
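The tool definitions above are exposed to the model through the OpenAI function-calling ("tools") format. A minimal sketch of what those definitions might look like — the parameter schemas here are assumptions for illustration, the real ones live in `iribot/tools/`:

```python
# Sketch: describing tools like read_file to the OpenAI API in the
# function-calling format. Parameter shapes are assumed, not copied
# from the project.

def tool_spec(name: str, description: str, properties: dict, required: list) -> dict:
    """Build one entry for the `tools` parameter of a chat completion call."""
    return {
        "type": "function",
        "function": {
            "name": name,
            "description": description,
            "parameters": {
                "type": "object",
                "properties": properties,
                "required": required,
            },
        },
    }

TOOLS = [
    tool_spec("read_file", "Read file contents",
              {"path": {"type": "string", "description": "File path"}}, ["path"]),
    tool_spec("write_file", "Create or modify files",
              {"path": {"type": "string"}, "content": {"type": "string"}},
              ["path", "content"]),
]
```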
52
+ ### 💬 Session Management
53
+
54
+ - Multi-session support: create multiple independent conversations
55
+ - Persistent session history storage
56
+ - Session list management (create, switch, delete)
57
+ - Independent system prompts for each session
58
+
59
+ ### 🎨 Modern UI
60
+
61
+ - Beautiful interface based on TDesign component library
62
+ - Real-time tool call status display
63
+ - Markdown message rendering support
64
+ - Responsive design for different screen sizes
65
+
66
+ ## 🏗️ System Architecture
67
+
68
+ ```mermaid
69
+ graph TB
70
+ subgraph Frontend["Frontend Layer"]
71
+ A[ChatSidebar<br/>Session List]
72
+ B[ChatContainer<br/>Chat View]
73
+ C[ToolCallMessage<br/>Tool Call View]
74
+ FE[Vue 3 + TDesign UI + Vite]
75
+ end
76
+
77
+ subgraph Backend["Backend Layer"]
78
+ D[main.py<br/>FastAPI Server]
79
+ E[agent.py<br/>AI Agent]
80
+ F[executor.py<br/>Tool Executor]
81
+ G[session_manager.py<br/>Session State Management]
82
+ H[tools/<br/>Tool Suite]
83
+ BE[FastAPI + OpenAI SDK]
84
+ end
85
+
86
+ subgraph External["External Services"]
87
+ I[OpenAI API / Compatible LLM Service<br/>GPT-4, GPT-3.5, or Custom Models]
88
+ end
89
+
90
+ Frontend -->|HTTP/SSE<br/>Server-Sent Events| Backend
91
+ Backend -->|OpenAI API| External
92
+
93
+ D --> E
94
+ E --> F
95
+ E --> G
96
+ F --> H
97
+
98
+ style Frontend fill:#e1f5ff
99
+ style Backend fill:#fff4e1
100
+ style External fill:#f0f0f0
101
+ ```
102
+
103
+ ### Data Flow
104
+
105
+ ```mermaid
106
+ sequenceDiagram
107
+ participant User
108
+ participant Frontend
109
+ participant SessionManager
110
+ participant Agent
111
+ participant ToolExecutor
112
+ participant Tools
113
+ participant OpenAI
114
+
115
+ User->>Frontend: Input Message
116
+ Frontend->>SessionManager: POST /api/chat/stream
117
+ SessionManager->>SessionManager: Save user message
118
+ SessionManager->>Agent: Forward message
119
+ Agent->>OpenAI: Call OpenAI API
120
+
121
+ alt Text Response
122
+ OpenAI-->>Agent: Stream text content
123
+ Agent-->>Frontend: SSE stream
124
+ Frontend-->>User: Display in real-time
125
+ end
126
+
127
+ alt Tool Call Request
128
+ OpenAI-->>Agent: Return tool call request
129
+ Agent->>ToolExecutor: Execute tool
130
+ ToolExecutor->>Tools: Call specific tool
131
+
132
+ alt File Operations
133
+ Tools->>Tools: Read/Write file system
134
+ end
135
+
136
+ alt Shell Commands
137
+ Tools->>Tools: Execute shell commands
138
+ end
139
+
140
+ Tools-->>ToolExecutor: Return result
141
+ ToolExecutor-->>Agent: Tool execution result
142
+ Agent->>OpenAI: Send tool result
143
+ OpenAI-->>Agent: Continue generating response
144
+ Agent-->>Frontend: SSE stream
145
+ Frontend-->>User: Display response
146
+ end
147
+ ```
148
+
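The sequence above reduces to a loop: call the model, execute any requested tool, feed the result back, and repeat until the model returns plain text. A minimal sketch with a stubbed model — the function names and message shapes are assumptions, not the project's actual code:

```python
def run_agent(call_model, execute_tool, messages):
    """Loop until the model responds with text instead of a tool call."""
    while True:
        reply = call_model(messages)          # e.g. an OpenAI chat completion
        if reply.get("tool_call") is None:
            return reply["content"]           # final text answer
        call = reply["tool_call"]
        result = execute_tool(call["name"], call["args"])
        # Feed the tool result back so the model can continue.
        messages.append({"role": "tool", "name": call["name"], "content": result})

# Stubbed model: first requests a tool, then answers with text.
replies = iter([
    {"tool_call": {"name": "read_file", "args": {"path": "notes.txt"}}},
    {"tool_call": None, "content": "The file says: hello"},
])
answer = run_agent(lambda m: next(replies),
                   lambda name, args: "hello",
                   [{"role": "user", "content": "What does notes.txt say?"}])
```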
149
+ ## 🚀 Quick Start
150
+
151
+ ### Requirements
152
+
153
+ - Python 3.9+
154
+ - Node.js 16+
155
+ - OpenAI API Key (or compatible LLM service)
156
+
157
+ ### Installation
158
+
159
+ #### 1. Clone the Repository
160
+
161
+ ```bash
162
+ git clone <repository-url>
163
+ cd mybot
164
+ ```
165
+
166
+ #### 2. Backend Setup
167
+
168
+ ```bash
169
+ cd iribot
170
+
171
+ # Create virtual environment (recommended)
172
+ python -m venv venv
173
+ source venv/bin/activate # Windows: venv\Scripts\activate
174
+
175
+ # Install dependencies
176
+ pip install -r requirements.txt
177
+
178
+ # Configure environment variables
179
+ cp .env.example .env
180
+ # Edit .env file and add your OpenAI API Key
181
+ ```
182
+
183
+ `.env` configuration example:
184
+
185
+ ```ini
186
+ OPENAI_API_KEY=sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
187
+ OPENAI_MODEL=gpt-4-turbo-preview
188
+ # OPENAI_BASE_URL=https://api.openai.com/v1 # Optional, use custom API endpoint
189
+ DEBUG=false
190
+ ```
191
+
192
+ #### 3. Frontend Setup
193
+
194
+ ```bash
195
+ cd frontend
196
+
197
+ # Install dependencies
198
+ npm install
199
+ ```
200
+
201
+ #### 4. Start Services
202
+
203
+ ##### Using Automated Scripts (Recommended)
204
+
205
+ **Windows:**
206
+
207
+ ```bat
208
+ :: In project root directory
209
+ setup.bat
210
+ ```
211
+
212
+ **Linux/macOS:**
213
+
214
+ ```bash
215
+ # In project root directory
216
+ chmod +x setup.sh
217
+ ./setup.sh
218
+ ```
219
+
220
+ ##### Manual Start
221
+
222
+ **Backend:**
223
+
224
+ ```bash
225
+ cd iribot
226
+ uvicorn main:app --reload --port 8000
227
+ ```
228
+
229
+ **Frontend:**
230
+
231
+ ```bash
232
+ cd frontend
233
+ npm run dev
234
+ ```
235
+
236
+ ## 🔧 Configuration
237
+
238
+ ### Backend Configuration
239
+
240
+ Configure in `iribot/.env` file:
241
+
242
+ | Config Item | Description | Default |
243
+ | ----------------- | -------------------- | ---------------------- |
244
+ | `OPENAI_API_KEY` | OpenAI API key | Required |
245
+ | `OPENAI_MODEL` | Model to use | `gpt-4-vision-preview` |
246
+ | `OPENAI_BASE_URL` | Custom API endpoint | Empty (use official) |
247
+ | `DEBUG` | Debug mode | `false` |
248
+ | `BASH_PATH` | Bash executable path | `bash` |
249
+
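These variables could be loaded with pydantic-settings (which is listed in the dependencies); a stdlib-only sketch of the same idea, with defaults matching the table above — the exact variable handling is an assumption:

```python
import os

def load_settings(env=os.environ) -> dict:
    """Read backend configuration, mirroring the defaults in the table above."""
    api_key = env.get("OPENAI_API_KEY")
    if not api_key:
        raise RuntimeError("OPENAI_API_KEY is required")
    return {
        "api_key": api_key,
        "model": env.get("OPENAI_MODEL", "gpt-4-vision-preview"),
        "base_url": env.get("OPENAI_BASE_URL") or None,  # empty -> official endpoint
        "debug": env.get("DEBUG", "false").lower() == "true",
        "bash_path": env.get("BASH_PATH", "bash"),
    }
```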
250
+ ### Frontend Configuration
251
+
252
+ The frontend reaches the backend through the Vite dev-server proxy, configured in `frontend/vite.config.js`:
253
+
254
+ ```javascript
255
+ export default {
256
+ server: {
257
+ proxy: {
258
+ "/api": {
259
+ target: "http://localhost:8000",
260
+ changeOrigin: true,
261
+ },
262
+ },
263
+ },
264
+ };
265
+ ```
266
+
267
+ ## 🔌 API Endpoints
268
+
269
+ ### Session Management
270
+
271
+ - `POST /api/sessions` - Create new session
272
+ - `GET /api/sessions` - Get session list
273
+ - `GET /api/sessions/{session_id}` - Get session details
274
+ - `DELETE /api/sessions/{session_id}` - Delete session
275
+
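The CRUD endpoints above map naturally onto an in-memory store; a minimal sketch of the idea — the class and field names are assumptions, not the actual `session_manager.py`:

```python
import uuid

class SessionStore:
    """Minimal in-memory backing for the /api/sessions endpoints."""

    def __init__(self):
        self._sessions = {}

    def create(self, system_prompt: str = "") -> dict:
        # POST /api/sessions
        sid = uuid.uuid4().hex
        self._sessions[sid] = {"id": sid, "system_prompt": system_prompt, "messages": []}
        return self._sessions[sid]

    def list(self) -> list:
        # GET /api/sessions
        return list(self._sessions.values())

    def get(self, sid: str) -> dict:
        # GET /api/sessions/{session_id}
        return self._sessions[sid]

    def delete(self, sid: str) -> None:
        # DELETE /api/sessions/{session_id}
        del self._sessions[sid]
```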
276
+ ### Chat Interface
277
+
278
+ - `POST /api/chat/stream` - Send message (SSE streaming response)
279
+
280
+ ### Tool Status
281
+
282
+ - `GET /api/tools/status` - Get all tool statuses
283
+
284
+ ## 🛠️ Extension Development
285
+
286
+ ### Adding New Tools
287
+
288
+ 1. Create a new tool file in the `iribot/tools/` directory
289
+ 2. Inherit from `BaseTool` class:
290
+
291
+ ```python
292
+ from tools.base import BaseTool
293
+
294
+ class MyCustomTool(BaseTool):
295
+ @property
296
+ def name(self) -> str:
297
+ return "my_custom_tool"
298
+
299
+ @property
300
+ def description(self) -> str:
301
+ return "Tool description"
302
+
303
+ @property
304
+ def parameters(self) -> dict:
305
+ return {
306
+ "type": "object",
307
+ "properties": {
308
+ "param1": {
309
+ "type": "string",
310
+ "description": "Parameter description"
311
+ }
312
+ },
313
+ "required": ["param1"]
314
+ }
315
+
316
+ def execute(self, **kwargs) -> dict:
317
+ # Implement tool logic
318
+ return {
319
+ "success": True,
320
+ "result": "Execution result"
321
+ }
322
+ ```
323
+
324
+ 3. Register the tool in `executor.py`:
325
+
326
+ ```python
327
+ def _register_default_tools(self):
328
+ # ... other tools
329
+ self.register_tool(MyCustomTool())
330
+ ```
331
+
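Once registered, the executor can dispatch calls by tool name. A minimal sketch of what that dispatch might look like — the `ToolExecutor` internals here are assumed, not the project's actual implementation:

```python
class ToolExecutor:
    """Minimal registry that dispatches tool calls by name (assumed shape)."""

    def __init__(self):
        self._tools = {}

    def register_tool(self, tool):
        self._tools[tool.name] = tool

    def execute(self, name: str, **kwargs) -> dict:
        tool = self._tools.get(name)
        if tool is None:
            return {"success": False, "error": f"unknown tool: {name}"}
        return tool.execute(**kwargs)

class EchoTool:
    """Trivial stand-in tool used to exercise the dispatcher."""
    name = "echo"

    def execute(self, **kwargs) -> dict:
        return {"success": True, "result": kwargs.get("text", "")}

executor = ToolExecutor()
executor.register_tool(EchoTool())
```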
332
+ ### Adding New Frontend Components
333
+
334
+ Add tool call visualization components in the `frontend/src/components/tool-calls/` directory.
335
+
336
+ ## 📝 Tech Stack
337
+
338
+ ### Backend
339
+
340
+ - **FastAPI** - Modern, fast web framework
341
+ - **OpenAI SDK** - Client for OpenAI-compatible LLM APIs
342
+ - **Pydantic** - Data validation and settings management
343
+ - **Uvicorn** - ASGI server
344
+
345
+ ### Frontend
346
+
347
+ - **Vue 3** - Progressive JavaScript framework
348
+ - **TDesign** - Enterprise-level UI component library
349
+ - **Vite** - Next-generation frontend build tool
350
+ - **Marked** - Markdown parser
351
+
352
+ ## 🤝 Contributing
353
+
354
+ Issues and Pull Requests are welcome!
355
+
356
+ ## 📄 License
357
+
358
+ MIT License
359
+
360
+ ## 🔗 Related Links
361
+
362
+ - [OpenAI API Documentation](https://platform.openai.com/docs)
363
+ - [FastAPI Documentation](https://fastapi.tiangolo.com/)
364
+ - [Vue 3 Documentation](https://vuejs.org/)
365
+ - [TDesign Documentation](https://tdesign.tencent.com/vue-next/overview)
366
+
367
+ ---
368
+
369
+ **Note:** Using this project requires a valid OpenAI API Key or compatible LLM service endpoint.
@@ -0,0 +1,4 @@
1
+ OPENAI_API_KEY=your-api-key-here
2
+ OPENAI_MODEL=gpt-4-vision-preview
3
+ OPENAI_BASE_URL=
4
+ DEBUG=False