agentbasis-0.1.0.tar.gz
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- agentbasis-0.1.0/CHANGELOG.md +34 -0
- agentbasis-0.1.0/CONTRIBUTING.md +128 -0
- agentbasis-0.1.0/LICENSE +21 -0
- agentbasis-0.1.0/MANIFEST.in +5 -0
- agentbasis-0.1.0/PKG-INFO +220 -0
- agentbasis-0.1.0/README.md +189 -0
- agentbasis-0.1.0/agentbasis/__init__.py +87 -0
- agentbasis-0.1.0/agentbasis/client.py +134 -0
- agentbasis-0.1.0/agentbasis/config.py +33 -0
- agentbasis-0.1.0/agentbasis/context.py +259 -0
- agentbasis-0.1.0/agentbasis/decorators.py +80 -0
- agentbasis-0.1.0/agentbasis/frameworks/langchain/__init__.py +109 -0
- agentbasis-0.1.0/agentbasis/frameworks/langchain/callback.py +373 -0
- agentbasis-0.1.0/agentbasis/frameworks/pydanticai/__init__.py +32 -0
- agentbasis-0.1.0/agentbasis/frameworks/pydanticai/instrumentation.py +233 -0
- agentbasis-0.1.0/agentbasis/llms/anthropic/__init__.py +18 -0
- agentbasis-0.1.0/agentbasis/llms/anthropic/messages.py +298 -0
- agentbasis-0.1.0/agentbasis/llms/gemini/__init__.py +18 -0
- agentbasis-0.1.0/agentbasis/llms/gemini/chat.py +326 -0
- agentbasis-0.1.0/agentbasis/llms/openai/__init__.py +18 -0
- agentbasis-0.1.0/agentbasis/llms/openai/chat.py +235 -0
- agentbasis-0.1.0/agentbasis.egg-info/PKG-INFO +220 -0
- agentbasis-0.1.0/agentbasis.egg-info/SOURCES.txt +26 -0
- agentbasis-0.1.0/agentbasis.egg-info/dependency_links.txt +1 -0
- agentbasis-0.1.0/agentbasis.egg-info/requires.txt +5 -0
- agentbasis-0.1.0/agentbasis.egg-info/top_level.txt +1 -0
- agentbasis-0.1.0/pyproject.toml +59 -0
- agentbasis-0.1.0/setup.cfg +4 -0

agentbasis-0.1.0/CHANGELOG.md
ADDED
@@ -0,0 +1,34 @@
# Changelog

All notable changes to the AgentBasis Python SDK will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [0.1.0] - 2026-01-19

### Added

- **Core SDK**: Initialize with `agentbasis.init(api_key, agent_id)`
- **OpenTelemetry Integration**: Full OTel-based tracing with OTLP HTTP exporter
- **Graceful Shutdown**: Automatic flush on exit via `atexit`

#### LLM Integrations
- **OpenAI**: Sync, async, and streaming support for Chat Completions
- **Anthropic**: Sync, async, and streaming support for Messages API
- **Gemini**: Sync, async, and streaming support for GenerativeModel

#### Framework Integrations
- **LangChain**: Callback handler with full parent-child span relationships
- **Pydantic AI**: Integration via native OpenTelemetry support

#### Context Management
- User/session/conversation tracking with `set_user()`, `set_session()`, `set_conversation()`
- Context manager pattern with `agentbasis.context()`
- Decorator support with `@agentbasis.with_context()`

#### Decorators
- `@agentbasis.trace` for tracking any function (sync and async)

### Documentation
- Full documentation available at [docs.agentbasis.co](https://docs.agentbasis.co)

agentbasis-0.1.0/CONTRIBUTING.md
ADDED
@@ -0,0 +1,128 @@
# Contributing to AgentBasis Python SDK

Thank you for your interest in contributing to the AgentBasis Python SDK!

## Development Setup

### Prerequisites

- Python 3.8 or higher
- pip

### Installation for Development

1. Clone the repository:
```bash
git clone https://github.com/AgentBasis/agentbasis-python-sdk.git
cd agentbasis-python-sdk
```

2. Install in development mode:
```bash
pip install -e .
```

3. Install development dependencies:
```bash
pip install pytest pytest-asyncio
```

## Building the Package

After making changes, rebuild the package:

```bash
python -m build
```

This updates the distribution files in `dist/`.

## Versioning & Releasing to PyPI

We use [Semantic Versioning](https://semver.org/):

| Change Type | Version Bump | Example |
|-------------|--------------|---------|
| Bug fix | Patch | `0.1.0` → `0.1.1` |
| New feature (backward compatible) | Minor | `0.1.0` → `0.2.0` |
| Breaking change | Major | `0.1.0` → `1.0.0` |

### To Release a New Version

1. Update the version in `pyproject.toml`:
```toml
version = "0.1.1"
```

2. Update `CHANGELOG.md` with the changes

3. Build the package:
```bash
python -m build
```

4. Upload to PyPI:
```bash
python -m twine upload dist/*
```

**Note:** PyPI does not allow overwriting existing versions. Once a version is published, you must increment the version number for any changes or fixes.

## Testing Locally

### Install from Local Build

```bash
pip install dist/agentbasis-0.1.0-py3-none-any.whl
```

### Install from GitHub (for testing before release)

```bash
pip install git+https://github.com/AgentBasis/agentbasis-python-sdk.git
```

### Run Unit Tests

```bash
python -m pytest tests/
```

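If you are adding a test, a minimal sketch is shown below. The file name is an example only; the behaviour it checks (`flush()` returning `False` before initialization) comes from `agentbasis/__init__.py`.

```python
# tests/test_flush.py (example file name, not part of the current test suite)
import agentbasis


def test_flush_returns_false_when_uninitialized():
    # flush() returns False when the SDK has not been initialized
    # (it catches the RuntimeError raised by AgentBasis.get_instance()).
    assert agentbasis.flush(timeout_millis=100) is False
```
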
## Project Structure

```
agentbasis/
├── __init__.py          # Main entry point, init() function
├── client.py            # AgentBasis client with OTel setup
├── config.py            # Configuration handling
├── context.py           # Context management (user, session, etc.)
├── decorators.py        # @trace decorator
├── llms/                # LLM provider integrations
│   ├── openai/
│   ├── anthropic/
│   └── gemini/
└── frameworks/          # Framework integrations
    ├── langchain/
    └── pydanticai/
```

## Code Style

- Follow PEP 8 guidelines
- Use type hints where possible
- Add docstrings to public functions
- Keep functions focused and single-purpose

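As a purely illustrative example of these guidelines (not an existing SDK function), a new public helper would look roughly like this:

```python
from typing import Optional


def normalize_agent_id(raw: Optional[str]) -> Optional[str]:
    """Return a trimmed agent ID, or None if the input is empty.

    Illustrative example only: typed signature, docstring, and a single,
    focused responsibility, per the style guidelines above.
    """
    if raw is None:
        return None
    cleaned = raw.strip()
    return cleaned or None
```
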
## Submitting Changes

1. Fork the repository
2. Create a feature branch (`git checkout -b feature/my-feature`)
3. Make your changes
4. Run tests to ensure nothing is broken
5. Commit your changes (`git commit -am 'Add new feature'`)
6. Push to the branch (`git push origin feature/my-feature`)
7. Open a Pull Request

## Questions?

If you have questions, please reach out to support@agentbasis.co.

agentbasis-0.1.0/LICENSE
ADDED
@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2026 AgentBasis

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

agentbasis-0.1.0/PKG-INFO
ADDED
@@ -0,0 +1,220 @@
Metadata-Version: 2.4
Name: agentbasis
Version: 0.1.0
Summary: Management & Observability SDK for AI Agents
Author-email: AgentBasis <support@agentbasis.co>
Maintainer-email: AgentBasis <support@agentbasis.co>
Project-URL: Homepage, https://agentbasis.co
Project-URL: Documentation, https://docs.agentbasis.co
Project-URL: Repository, https://github.com/AgentBasis/agentbasis-python-sdk
Project-URL: Bug Tracker, https://github.com/AgentBasis/agentbasis-python-sdk/issues
Keywords: ai,agents,observability,tracing,opentelemetry,llm,openai,anthropic,langchain,monitoring,telemetry
Classifier: Development Status :: 4 - Beta
Classifier: Intended Audience :: Developers
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Classifier: Topic :: System :: Monitoring
Requires-Python: >=3.8
Description-Content-Type: text/markdown
Requires-Dist: requests>=2.25.0
Requires-Dist: pydantic>=2.0.0
Requires-Dist: opentelemetry-api>=1.20.0
Requires-Dist: opentelemetry-sdk>=1.20.0
Requires-Dist: opentelemetry-exporter-otlp-proto-http>=1.20.0

# AgentBasis Python SDK

**Management & Observability SDK for AI Agents in production**

The **AgentBasis Python SDK** provides a simple, lightweight way to track the performance, traces, sessions, and behavior of AI agents. It sends data using the **OpenTelemetry (OTel)** standard, making it compatible with AgentBasis and other observability backends.

This is the **foundation SDK** that enables deep observability for coded agents built with:
- Pure Python
- LLM Providers:
  - OpenAI
  - Anthropic
  - Gemini
- Frameworks
  - LangChain
  - Pydantic AI

## Installation

```bash
pip install agentbasis
```

## Quick Start

### 1. Initialize the SDK
Start by initializing the SDK with your API key and Agent ID. This usually goes at the top of your main application file.

```python
import agentbasis

# Initialize with your API Key and Agent ID
agentbasis.init(
    api_key="your-api-key-here",
    agent_id="your-agent-id-here"
)
```

### 2. Manual Tracking (Decorators)
Use the `@trace` decorator to automatically track any function.

```python
from agentbasis import trace

@trace
def chat_with_user(query):
    # Your agent logic here
    return "Response to: " + query

# When you call this, data is automatically sent to AgentBasis
chat_with_user("Hello world")
```

### 3. OpenAI Integration
Automatically track all your OpenAI calls (models, tokens, prompts) with one line of code.

```python
from agentbasis.llms.openai import instrument

# Enable OpenAI instrumentation
instrument()

# Now just use the OpenAI client as normal
from openai import OpenAI
client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Hello"}]
)
```

### 4. Anthropic Integration
Automatically track all your Anthropic Claude calls.

```python
from agentbasis.llms.anthropic import instrument

# Enable Anthropic instrumentation
instrument()

# Now just use the Anthropic client as normal
from anthropic import Anthropic
client = Anthropic()
response = client.messages.create(
    model="claude-3-opus-20240229",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello"}]
)
```

### 5. LangChain Integration
Track chains, tools, retrievers, and LLM calls in LangChain with full parent-child span relationships.

```python
from agentbasis.frameworks.langchain import get_callback_handler

# Create a callback handler
handler = get_callback_handler()

# Pass it to your LangChain calls
from langchain_openai import ChatOpenAI
llm = ChatOpenAI(model="gpt-4")
response = llm.invoke("Hello world", config={"callbacks": [handler]})
```

For chains and agents, pass the callback handler in the config:

```python
from langchain.chains import LLMChain
from langchain.prompts import PromptTemplate
from agentbasis.frameworks.langchain import get_callback_config

# Use get_callback_config() for convenience
chain = LLMChain(llm=llm, prompt=PromptTemplate.from_template("{query}"))
result = chain.invoke({"query": "What is AI?"}, config=get_callback_config())
```

### 6. Pydantic AI Integration
Track Pydantic AI agents with built-in OpenTelemetry support.

```python
from agentbasis.frameworks.pydanticai import instrument

# Enable global instrumentation for all Pydantic AI agents
instrument()

# Your agents are now automatically traced
from pydantic_ai import Agent
agent = Agent("openai:gpt-4")
result = agent.run_sync("Hello!")
```

For per-agent control with user context:

```python
from agentbasis.frameworks.pydanticai import create_traced_agent

# Create an agent pre-configured with tracing and context
agent = create_traced_agent(
    "openai:gpt-4",
    system_prompt="You are a helpful assistant."
)

# Set user context - it will be included in traces
agentbasis.set_user("user-123")
result = agent.run_sync("Hello!")
```

### 7. Track Users & Sessions (Optional)
Associate traces with specific users and sessions to debug issues and see per-user analytics.

```python
# Set the current user (from your auth system)
agentbasis.set_user(current_user.id)

# Optionally set session and conversation IDs
agentbasis.set_session("session-abc")
agentbasis.set_conversation("conv-123")

# All subsequent LLM calls will be tagged with this context
response = client.chat.completions.create(...)
```

Or use the context manager for scoped context:

```python
from agentbasis import context

with context(user_id="user-123", session_id="session-abc"):
    # All traces in this block include the context
    response = client.chat.completions.create(...)
```

## Core Concepts

- **OpenTelemetry**: We use OTel under the hood for maximum compatibility.
- **Spans**: Every action (function call, LLM request) is recorded as a Span.
- **Transport**: Data is batched and sent asynchronously to the AgentBasis backend.

## Documentation

For full documentation, visit [docs.agentbasis.co](https://docs.agentbasis.co).

## Contributing

See [CONTRIBUTING.md](CONTRIBUTING.md) for development setup and guidelines.

## License
MIT License - see [LICENSE](LICENSE) for details.

agentbasis-0.1.0/README.md
ADDED
@@ -0,0 +1,189 @@
# AgentBasis Python SDK

**Management & Observability SDK for AI Agents in production**

The **AgentBasis Python SDK** provides a simple, lightweight way to track the performance, traces, sessions, and behavior of AI agents. It sends data using the **OpenTelemetry (OTel)** standard, making it compatible with AgentBasis and other observability backends.

This is the **foundation SDK** that enables deep observability for coded agents built with:
- Pure Python
- LLM Providers:
  - OpenAI
  - Anthropic
  - Gemini
- Frameworks
  - LangChain
  - Pydantic AI

## Installation

```bash
pip install agentbasis
```

## Quick Start

### 1. Initialize the SDK
Start by initializing the SDK with your API key and Agent ID. This usually goes at the top of your main application file.

```python
import agentbasis

# Initialize with your API Key and Agent ID
agentbasis.init(
    api_key="your-api-key-here",
    agent_id="your-agent-id-here"
)
```

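Alternatively, credentials can come from the environment: the `init()` docstring notes that `api_key` falls back to the `AGENTBASIS_API_KEY` environment variable and `agent_id` to `AGENTBASIS_AGENT_ID`. For example:

```python
import agentbasis

# Assumes AGENTBASIS_API_KEY and AGENTBASIS_AGENT_ID are already set in the
# environment (for example, exported by your deployment platform); init()
# falls back to them when no arguments are passed.
agentbasis.init()
```
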
### 2. Manual Tracking (Decorators)
Use the `@trace` decorator to automatically track any function.

```python
from agentbasis import trace

@trace
def chat_with_user(query):
    # Your agent logic here
    return "Response to: " + query

# When you call this, data is automatically sent to AgentBasis
chat_with_user("Hello world")
```

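The decorator also works on async functions (the changelog lists sync and async support). A minimal async sketch, with a placeholder body:

```python
import asyncio

from agentbasis import trace


@trace
async def answer_user(query: str) -> str:
    # Illustrative placeholder for your async agent logic
    await asyncio.sleep(0.1)
    return "Response to: " + query


# The traced coroutine is awaited like any other
print(asyncio.run(answer_user("Hello world")))
```
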
### 3. OpenAI Integration
Automatically track all your OpenAI calls (models, tokens, prompts) with one line of code.

```python
from agentbasis.llms.openai import instrument

# Enable OpenAI instrumentation
instrument()

# Now just use the OpenAI client as normal
from openai import OpenAI
client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Hello"}]
)
```

### 4. Anthropic Integration
Automatically track all your Anthropic Claude calls.

```python
from agentbasis.llms.anthropic import instrument

# Enable Anthropic instrumentation
instrument()

# Now just use the Anthropic client as normal
from anthropic import Anthropic
client = Anthropic()
response = client.messages.create(
    model="claude-3-opus-20240229",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello"}]
)
```

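Gemini is supported in the same way (see `agentbasis/llms/gemini/` in the package and the changelog's GenerativeModel entry). The sketch below assumes the module exposes an `instrument()` helper like the OpenAI and Anthropic integrations and uses the `google-generativeai` client; treat it as a sketch rather than documented API.

```python
from agentbasis.llms.gemini import instrument  # assumed to mirror the OpenAI/Anthropic helpers

# Enable Gemini instrumentation
instrument()

# Then use the Gemini client as normal
import google.generativeai as genai

genai.configure(api_key="your-gemini-api-key")
model = genai.GenerativeModel("gemini-1.5-flash")
response = model.generate_content("Hello")
```
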
### 5. LangChain Integration
Track chains, tools, retrievers, and LLM calls in LangChain with full parent-child span relationships.

```python
from agentbasis.frameworks.langchain import get_callback_handler

# Create a callback handler
handler = get_callback_handler()

# Pass it to your LangChain calls
from langchain_openai import ChatOpenAI
llm = ChatOpenAI(model="gpt-4")
response = llm.invoke("Hello world", config={"callbacks": [handler]})
```

For chains and agents, pass the callback handler in the config:

```python
from langchain.chains import LLMChain
from langchain.prompts import PromptTemplate
from agentbasis.frameworks.langchain import get_callback_config

# Use get_callback_config() for convenience
chain = LLMChain(llm=llm, prompt=PromptTemplate.from_template("{query}"))
result = chain.invoke({"query": "What is AI?"}, config=get_callback_config())
```

### 6. Pydantic AI Integration
Track Pydantic AI agents with built-in OpenTelemetry support.

```python
from agentbasis.frameworks.pydanticai import instrument

# Enable global instrumentation for all Pydantic AI agents
instrument()

# Your agents are now automatically traced
from pydantic_ai import Agent
agent = Agent("openai:gpt-4")
result = agent.run_sync("Hello!")
```

For per-agent control with user context:

```python
from agentbasis.frameworks.pydanticai import create_traced_agent

# Create an agent pre-configured with tracing and context
agent = create_traced_agent(
    "openai:gpt-4",
    system_prompt="You are a helpful assistant."
)

# Set user context - it will be included in traces
agentbasis.set_user("user-123")
result = agent.run_sync("Hello!")
```

### 7. Track Users & Sessions (Optional)
Associate traces with specific users and sessions to debug issues and see per-user analytics.

```python
# Set the current user (from your auth system)
agentbasis.set_user(current_user.id)

# Optionally set session and conversation IDs
agentbasis.set_session("session-abc")
agentbasis.set_conversation("conv-123")

# All subsequent LLM calls will be tagged with this context
response = client.chat.completions.create(...)
```

Or use the context manager for scoped context:

```python
from agentbasis import context

with context(user_id="user-123", session_id="session-abc"):
    # All traces in this block include the context
    response = client.chat.completions.create(...)
```

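The package also exports a `@with_context()` decorator (see the changelog and `agentbasis/__init__.py`). Its exact signature is not shown in this README, so the keyword arguments below are an assumption modeled on `context()`:

```python
from agentbasis import with_context


# Assumed signature: keyword arguments mirroring context(); check the
# decorator's docstring in agentbasis/context.py for the exact parameters.
@with_context(user_id="user-123", session_id="session-abc")
def handle_request(query: str) -> str:
    # Traces emitted inside this function carry the user/session context
    return "Response to: " + query


handle_request("Hello world")
```
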
## Core Concepts

- **OpenTelemetry**: We use OTel under the hood for maximum compatibility.
- **Spans**: Every action (function call, LLM request) is recorded as a Span.
- **Transport**: Data is batched and sent asynchronously to the AgentBasis backend.

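Because export happens in the background, short-lived scripts can flush explicitly before exiting; `flush()` and `shutdown()` are exported from the top-level package, and the SDK also flushes automatically via `atexit`. For example:

```python
import agentbasis

agentbasis.init(api_key="your-api-key-here", agent_id="your-agent-id-here")

# ... run your agent ...

# Block until pending spans are exported; returns False on timeout or if
# the SDK was never initialized (30 seconds is the default timeout).
agentbasis.flush(timeout_millis=30000)

# Optional: shut the SDK down early; safe to call more than once.
agentbasis.shutdown()
```
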
## Documentation

For full documentation, visit [docs.agentbasis.co](https://docs.agentbasis.co).

## Contributing

See [CONTRIBUTING.md](CONTRIBUTING.md) for development setup and guidelines.

## License
MIT License - see [LICENSE](LICENSE) for details.

agentbasis-0.1.0/agentbasis/__init__.py
ADDED
@@ -0,0 +1,87 @@
from typing import Optional
from .client import AgentBasis
from .decorators import trace
from .context import (
    context,
    set_user,
    set_session,
    set_conversation,
    set_metadata,
    with_context,
    AgentBasisContext,
)


def init(api_key: Optional[str] = None, agent_id: Optional[str] = None) -> AgentBasis:
    """
    Initialize the AgentBasis SDK.

    Args:
        api_key: Your AgentBasis API Key. If not provided, reads from AGENTBASIS_API_KEY env var.
        agent_id: The ID of the agent to track. If not provided, reads from AGENTBASIS_AGENT_ID env var.
    Returns:
        The initialized AgentBasis client instance.
    """
    return AgentBasis.initialize(api_key=api_key, agent_id=agent_id)


def flush(timeout_millis: int = 30000) -> bool:
    """
    Force flush all pending telemetry data.

    This is useful when you want to ensure all traces are sent before
    a critical operation, at specific checkpoints, or before exiting.

    Note: The SDK automatically flushes on normal Python exit via atexit.

    Args:
        timeout_millis: Maximum time to wait for flush (default 30 seconds).

    Returns:
        True if flush completed successfully, False if timed out or not initialized.

    Example:
        >>> agentbasis.init(api_key="...", agent_id="...")
        >>> # ... your agent code ...
        >>> agentbasis.flush()  # Ensure all data is sent
    """
    try:
        client = AgentBasis.get_instance()
        return client.flush(timeout_millis)
    except RuntimeError:
        # SDK not initialized
        return False


def shutdown():
    """
    Manually shut down the SDK and flush all pending data.

    This is automatically called on Python exit, but can be called
    manually if you need to shut down the SDK before the process ends.

    This method is idempotent - calling it multiple times is safe.
    """
    try:
        client = AgentBasis.get_instance()
        client.shutdown()
    except RuntimeError:
        # SDK not initialized
        pass


__all__ = [
    "init",
    "AgentBasis",
    "trace",
    "flush",
    "shutdown",
    # Context management
    "context",
    "set_user",
    "set_session",
    "set_conversation",
    "set_metadata",
    "with_context",
    "AgentBasisContext",
]