ai-agent-proxy 0.1.0__tar.gz
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- ai_agent_proxy-0.1.0/PKG-INFO +171 -0
- ai_agent_proxy-0.1.0/README.md +161 -0
- ai_agent_proxy-0.1.0/ai_agent_proxy/__init__.py +1 -0
- ai_agent_proxy-0.1.0/ai_agent_proxy/api.py +573 -0
- ai_agent_proxy-0.1.0/ai_agent_proxy/chat.py +135 -0
- ai_agent_proxy-0.1.0/ai_agent_proxy/cli.py +257 -0
- ai_agent_proxy-0.1.0/ai_agent_proxy/config.py +3 -0
- ai_agent_proxy-0.1.0/ai_agent_proxy/daemon.py +330 -0
- ai_agent_proxy-0.1.0/ai_agent_proxy/proxy.py +152 -0
- ai_agent_proxy-0.1.0/ai_agent_proxy/service.py +101 -0
- ai_agent_proxy-0.1.0/ai_agent_proxy/session.py +148 -0
- ai_agent_proxy-0.1.0/ai_agent_proxy/workspace.py +166 -0
- ai_agent_proxy-0.1.0/ai_agent_proxy.egg-info/PKG-INFO +171 -0
- ai_agent_proxy-0.1.0/ai_agent_proxy.egg-info/SOURCES.txt +18 -0
- ai_agent_proxy-0.1.0/ai_agent_proxy.egg-info/dependency_links.txt +1 -0
- ai_agent_proxy-0.1.0/ai_agent_proxy.egg-info/entry_points.txt +3 -0
- ai_agent_proxy-0.1.0/ai_agent_proxy.egg-info/requires.txt +2 -0
- ai_agent_proxy-0.1.0/ai_agent_proxy.egg-info/top_level.txt +1 -0
- ai_agent_proxy-0.1.0/pyproject.toml +22 -0
- ai_agent_proxy-0.1.0/setup.cfg +4 -0
@@ -0,0 +1,171 @@
+Metadata-Version: 2.4
+Name: ai-agent-proxy
+Version: 0.1.0
+Summary: Turn your agent CLI into an OpenAI-like service with chat and responses endpoints
+Author-email: Leo <leoustc@icloud.com>
+Requires-Python: >=3.8
+Description-Content-Type: text/markdown
+Requires-Dist: fastapi>=0.110
+Requires-Dist: uvicorn>=0.23
+
+# AI Agent Proxy
+
+`ai-agent-proxy` is a Python package that turns your agent CLI into an OpenAI-like HTTP service.
+
+It exposes both:
+
+- the traditional chat endpoint: `POST /v1/chat/completions`
+- the Responses API endpoint: `POST /v1/responses`
+
+It also includes:
+
+- the `ai-agent-proxy` CLI
+- a FastAPI server with a simple browser chat at `GET /web`
+- persistent local agent workspaces
+- automatic startup of the `manager` agent
+
+## What It Does
+
+`ai-agent-proxy` runs your local agent backend behind an API shape that OpenAI-style clients can use.
+
+You can use it to:
+
+- point chatbot clients at an OpenAI-compatible URL
+- expose a manager-backed assistant to Mattermost, Slack, or browser clients
+- keep agent state in persistent local workspaces under `~/.ai_agent_proxy`
+- run a simple manager-first service without changing your core CLI workflow
+
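Because the proxy mimics the OpenAI wire format, any generic HTTP client can target it. A minimal standard-library sketch of how a client might form a chat request (the host, port, and `YOUR_KEY` placeholder follow the curl examples in this README; the helper name is illustrative, not part of the package):

```python
import json
import urllib.request

def build_chat_request(base_url, api_key, user_text):
    """Build (but do not send) an OpenAI-style chat-completions request."""
    payload = {
        "model": "manager",  # the proxy is manager-first
        "messages": [{"role": "user", "content": user_text}],
    }
    return urllib.request.Request(
        base_url + "/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": "Bearer " + api_key,
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("http://127.0.0.1:7011", "YOUR_KEY", "hello")
print(req.full_url)  # http://127.0.0.1:7011/v1/chat/completions
```

Sending the request is then a single `urllib.request.urlopen(req)` call against a running proxy.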
+## Endpoints
+
+The service supports:
+
+- `GET /`
+- `GET /web`
+- `GET /health`
+- `POST /v1/chat/completions`
+- `POST /v1/responses`
+- `POST /agent-reply`
+
+`GET /` redirects to `/web`.
+
+## Installation
+
+From source:
+
+```bash
+pip install .
+```
+
+For local development:
+
+```bash
+make install
+```
+
+## CLI
+
+Main commands:
+
+- `ai-agent-proxy enable`
+- `ai-agent-proxy restart`
+- `ai-agent-proxy init <role>`
+- `ai-agent-proxy list`
+- `ai-agent-proxy status`
+- `ai-agent-proxy connect <agent> [working_dir]`
+- `ai-agent-proxy stop <role>`
+
+The service command is:
+
+```bash
+ai-agent-proxy daemon
+```
+
+## Start The Service
+
+Run from source:
+
+```bash
+make debug
+```
+
+Install and enable as a system service:
+
+```bash
+ai-agent-proxy enable --ip 0.0.0.0 --port 7011
+ai-agent-proxy restart
+```
+
+By default the service starts the `manager` agent automatically.
+
+## Authentication
+
+If `AI_AGENT_PROXY_API_KEY` is set, API requests must send:
+
+```http
+Authorization: Bearer <your-key>
+```
+
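Client code can mirror this check by attaching the header only when a key is configured. A small sketch (the environment-variable name comes from this README; the helper itself is illustrative):

```python
import os

def auth_headers():
    """Request headers for the proxy; adds Authorization only when a key is set."""
    headers = {"Content-Type": "application/json"}
    key = os.environ.get("AI_AGENT_PROXY_API_KEY")
    if key:
        headers["Authorization"] = "Bearer " + key
    return headers
```

With no key in the environment, requests go out unauthenticated, matching the proxy's optional-auth behavior.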
+## Traditional Endpoint
+
+OpenAI-compatible chat completions:
+
+```bash
+curl http://127.0.0.1:7011/v1/chat/completions \
+  -H 'Authorization: Bearer YOUR_KEY' \
+  -H 'Content-Type: application/json' \
+  -d '{
+    "model": "manager",
+    "messages": [
+      {"role": "user", "content": "hello"}
+    ]
+  }'
+```
+
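On success, an OpenAI-compatible server returns the assistant text under `choices[0].message.content`. A parsing sketch; this README only promises OpenAI compatibility and does not show the proxy's exact response fields, so the standard OpenAI shape is assumed here:

```python
import json

def extract_reply(raw):
    """Pull the assistant text out of an OpenAI-style chat-completions response.

    Assumes the standard shape: choices[0].message.content.
    """
    body = json.loads(raw)
    return body["choices"][0]["message"]["content"]

sample = b'{"choices": [{"message": {"role": "assistant", "content": "hi there"}}]}'
print(extract_reply(sample))  # hi there
```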
+## Responses Endpoint
+
+OpenAI Responses API style:
+
+```bash
+curl http://127.0.0.1:7011/v1/responses \
+  -H 'Authorization: Bearer YOUR_KEY' \
+  -H 'Content-Type: application/json' \
+  -d '{
+    "model": "manager",
+    "input": "hello",
+    "stream": false
+  }'
+```
+
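Note the body differs from chat completions: a single `input` string plus a `stream` flag, rather than a `messages` list. A small helper mirroring the curl payload above (illustrative, not part of the package):

```python
import json

def responses_payload(text, stream=False):
    """JSON body for POST /v1/responses, mirroring the curl example."""
    return json.dumps({"model": "manager", "input": text, "stream": stream})

print(responses_payload("hello"))
```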
+## Browser UI
+
+Open:
+
+```text
+http://127.0.0.1:7011/web
+```
+
+This UI is a simple chatbot routed to the `manager` agent.
+
+## Workspace Layout
+
+Agent state is stored under:
+
+```text
+~/.ai_agent_proxy/
+```
+
+Example:
+
+```text
+~/.ai_agent_proxy/manager/
+~/.ai_agent_proxy/engineer_alice/
+```
+
+Each workspace can include role instructions, memory, notes, control files, and logs.
+
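Since each agent gets one directory under the base path, enumerating workspaces reduces to a directory listing. A sketch (the helper is illustrative and not part of the package's API):

```python
from pathlib import Path

def list_workspaces(base=None):
    """Agent workspace names: one directory per agent under the base path."""
    base = Path(base) if base else Path.home() / ".ai_agent_proxy"
    if not base.is_dir():
        return []
    # Only directories count as workspaces; ignore stray files.
    return sorted(p.name for p in base.iterdir() if p.is_dir())
```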
+## Notes
+
+- the Python import package is `ai_agent_proxy`
+- the CLI command is `ai-agent-proxy`
+- the default systemd unit name is `ai-agent-proxy.service`
+- the API is manager-first: inbound requests are routed through the `manager` agent
@@ -0,0 +1 @@
+__version__ = "0.1.0"