tracia-0.0.1-py3-none-any.whl → tracia-0.1.0-py3-none-any.whl

This diff shows the contents of publicly released package versions as they appear in their respective public registries. It is provided for informational purposes only.
@@ -0,0 +1,277 @@
+ Metadata-Version: 2.4
+ Name: tracia
+ Version: 0.1.0
+ Summary: LLM prompt management and tracing SDK
+ Project-URL: Homepage, https://tracia.io
+ Project-URL: Documentation, https://docs.tracia.io
+ Project-URL: Repository, https://github.com/tracia/tracia-python
+ Project-URL: Issues, https://github.com/tracia/tracia-python/issues
+ Author-email: Tracia <hello@tracia.io>
+ License-Expression: MIT
+ License-File: LICENSE
+ Keywords: ai,anthropic,chatgpt,claude,gemini,google,litellm,llm,observability,openai,prompt,tracing
+ Classifier: Development Status :: 4 - Beta
+ Classifier: Intended Audience :: Developers
+ Classifier: License :: OSI Approved :: MIT License
+ Classifier: Programming Language :: Python :: 3
+ Classifier: Programming Language :: Python :: 3.10
+ Classifier: Programming Language :: Python :: 3.11
+ Classifier: Programming Language :: Python :: 3.12
+ Classifier: Programming Language :: Python :: 3.13
+ Classifier: Topic :: Software Development :: Libraries :: Python Modules
+ Classifier: Typing :: Typed
+ Requires-Python: >=3.10
+ Requires-Dist: httpx>=0.25.0
+ Requires-Dist: litellm>=1.30.0
+ Requires-Dist: pydantic>=2.0.0
+ Requires-Dist: typing-extensions>=4.0.0
+ Provides-Extra: dev
+ Requires-Dist: mypy>=1.0.0; extra == 'dev'
+ Requires-Dist: pytest-asyncio>=0.21.0; extra == 'dev'
+ Requires-Dist: pytest-cov>=4.0.0; extra == 'dev'
+ Requires-Dist: pytest>=7.0.0; extra == 'dev'
+ Requires-Dist: respx>=0.20.0; extra == 'dev'
+ Requires-Dist: ruff>=0.1.0; extra == 'dev'
+ Description-Content-Type: text/markdown
+
+ # Tracia
+
+ **LLM prompt management and tracing SDK for Python**
+
+ [![PyPI version](https://badge.fury.io/py/tracia.svg)](https://badge.fury.io/py/tracia)
+ [![Python 3.10+](https://img.shields.io/badge/python-3.10+-blue.svg)](https://www.python.org/downloads/)
+ [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
+
+ ## What is Tracia?
+
+ Tracia is a modern LLM prompt management and tracing platform. This Python SDK provides:
+
+ - **Unified LLM Access** - Call OpenAI, Anthropic, Google, and 100+ providers through a single interface (powered by LiteLLM)
+ - **Automatic Tracing** - Every LLM call is automatically traced with latency, token usage, and cost
+ - **Prompt Management** - Store, version, and manage your prompts in the cloud
+ - **Session Linking** - Easily link related calls for multi-turn conversations
+
+ ## Installation
+
+ ```bash
+ pip install tracia
+ ```
+
+ You'll also need API keys for the LLM providers you want to use:
+ ```bash
+ export OPENAI_API_KEY="sk-..."
+ export ANTHROPIC_API_KEY="sk-ant-..."
+ export GOOGLE_API_KEY="..."
+ ```
+
+ ## Quick Start
+
+ ```python
+ from tracia import Tracia
+
+ # Initialize the client
+ client = Tracia(api_key="your_tracia_api_key")
+
+ # Run a local prompt
+ result = client.run_local(
+     model="gpt-4o",
+     messages=[{"role": "user", "content": "Hello!"}]
+ )
+ print(result.text)
+ print(f"Tokens: {result.usage.total_tokens}")
+ ```
+
+ ## Streaming
+
+ ```python
+ # Stream the response
+ stream = client.run_local(
+     model="gpt-4o",
+     messages=[{"role": "user", "content": "Tell me a story"}],
+     stream=True
+ )
+
+ for chunk in stream:
+     print(chunk, end="", flush=True)
+
+ # Get the final result (stream.result is a Future[StreamResult])
+ final = stream.result.result()
+ print(f"\nTotal tokens: {final.usage.total_tokens}")
+ ```
+
+ ## Multi-turn Conversations with Sessions
+
+ ```python
+ # Create a session for linked conversations
+ session = client.create_session()
+
+ # First message
+ r1 = session.run_local(
+     model="gpt-4o",
+     messages=[{"role": "user", "content": "My name is Alice"}]
+ )
+
+ # Follow-up - automatically linked to the same trace
+ r2 = session.run_local(
+     model="gpt-4o",
+     messages=[
+         {"role": "user", "content": "My name is Alice"},
+         {"role": "assistant", "content": r1.text},
+         {"role": "user", "content": "What's my name?"}
+     ]
+ )
+ ```
+
+ ## Function Calling
+
+ ```python
+ from tracia import ToolDefinition, ToolParameters, JsonSchemaProperty
+
+ # Define a tool
+ tools = [
+     ToolDefinition(
+         name="get_weather",
+         description="Get the current weather",
+         parameters=ToolParameters(
+             properties={
+                 "location": JsonSchemaProperty(
+                     type="string",
+                     description="City name"
+                 )
+             },
+             required=["location"]
+         )
+     )
+ ]
+
+ result = client.run_local(
+     model="gpt-4o",
+     messages=[{"role": "user", "content": "What's the weather in Paris?"}],
+     tools=tools
+ )
+
+ if result.tool_calls:
+     for call in result.tool_calls:
+         print(f"Tool: {call.name}, Args: {call.arguments}")
+ ```
+
+ ## Variable Interpolation
+
+ ```python
+ result = client.run_local(
+     model="gpt-4o",
+     messages=[
+         {"role": "system", "content": "You are a helpful assistant named {{name}}."},
+         {"role": "user", "content": "Hello!"}
+     ],
+     variables={"name": "Claude"}
+ )
+ ```
+
+ ## Prompts API
+
+ ```python
+ # List all prompts
+ prompts = client.prompts.list()
+
+ # Get a specific prompt
+ prompt = client.prompts.get("my-prompt")
+
+ # Run a prompt template
+ result = client.prompts.run(
+     "my-prompt",
+     variables={"name": "World"}
+ )
+ ```
+
+ ## Spans API
+
+ ```python
+ from tracia import Eval, EvaluateOptions
+
+ # List spans
+ spans = client.spans.list()
+
+ # Evaluate a span
+ client.spans.evaluate(
+     "sp_xxx",
+     EvaluateOptions(
+         evaluator="quality",
+         value=Eval.POSITIVE,  # or Eval.NEGATIVE
+         note="Great response!",
+     ),
+ )
+ ```
+
+ ## Async Support
+
+ All methods have async variants:
+
+ ```python
+ import asyncio
+
+ async def main():
+     async with Tracia(api_key="...") as client:
+         result = await client.arun_local(
+             model="gpt-4o",
+             messages=[{"role": "user", "content": "Hello!"}]
+         )
+         print(result.text)
+
+ asyncio.run(main())
+ ```
+
+ ## Supported Providers
+
+ Via LiteLLM, Tracia supports 100+ providers including:
+
+ - **OpenAI**: gpt-4o, gpt-4, gpt-3.5-turbo, o1, o3
+ - **Anthropic**: claude-3-opus, claude-sonnet-4, claude-3-haiku
+ - **Google**: gemini-2.0-flash, gemini-2.5-pro
+ - And many more...
+
+ ## Error Handling
+
+ ```python
+ from tracia import TraciaError, TraciaErrorCode
+
+ try:
+     result = client.run_local(...)
+ except TraciaError as e:
+     if e.code == TraciaErrorCode.MISSING_PROVIDER_API_KEY:
+         print("Please set your API key")
+     elif e.code == TraciaErrorCode.PROVIDER_ERROR:
+         print(f"LLM error: {e.message}")
+ ```
+
+ ## Configuration Options
+
+ ```python
+ client = Tracia(
+     api_key="...",
+     base_url="https://app.tracia.io",  # Custom API URL
+     on_span_error=lambda e, span_id: print(f"Span error: {e}")
+ )
+
+ result = client.run_local(
+     model="gpt-4o",
+     messages=[...],
+     temperature=0.7,
+     max_output_tokens=1000,
+     timeout_ms=30000,
+     tags=["production"],
+     user_id="user_123",
+     session_id="session_456",
+     send_trace=True,  # Set to False to disable tracing
+ )
+ ```
+
+ ## Learn More
+
+ - Website: [tracia.io](https://tracia.io)
+ - Documentation: [docs.tracia.io](https://docs.tracia.io)
+ - GitHub: [github.com/tracia](https://github.com/tracia)
+
+ ## License
+
+ MIT
@@ -0,0 +1,18 @@
+ tracia/__init__.py,sha256=i3fAcuJjd3QVuXHRGLVmNsQW4VirnHL_dNUpLWL4xPE,3538
+ tracia/_client.py,sha256=5bTui8EN_vQ9XA9ZxGnkxTWpG2fIseWwdrVGHRmlRbE,40127
+ tracia/_constants.py,sha256=vM9X3TLawroFqrFLme_SuYrUgboDu7q2NMyvW_JNQTg,821
+ tracia/_errors.py,sha256=NEmqucrx3NqGjYFtelHJw1rs2lvRJky6Xave5bNOswg,2464
+ tracia/_http.py,sha256=KE7_vFUpfCHeWewiCqrROw_vCRU9pOpN2i4ieNsOkdI,11237
+ tracia/_llm.py,sha256=kv3uU4FcVWhYd-N2m5xsp5rnFY7i-NgbCjv4yqh9RRk,31071
+ tracia/_session.py,sha256=6ET-ibGKbU7Sd0o75zPI1O4bRXEqmbD5MMExPO6DO-Y,6903
+ tracia/_streaming.py,sha256=qrTTYWY9P39n1a4dh2vKQedgohdMJA7sAHy39YtLGoE,3873
+ tracia/_types.py,sha256=XeLNvW32VVv5vVP6BTGC07h9iz21s-KcT2lFxs0yD84,15294
+ tracia/_utils.py,sha256=xB28NRFHcduv2k0N3PXIh7sYCwtDU35R1gCiPwdmvvU,3090
+ tracia/py.typed,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+ tracia/resources/__init__.py,sha256=llZvQRPlEsHDsOdpWcCLgBSPjrJALFltAFThZii3NaY,127
+ tracia/resources/prompts.py,sha256=pVNKBRnGAft7JH3HapXYFZN37eO8_9NG7kpXbau1iVA,7553
+ tracia/resources/spans.py,sha256=zpF21NeJw_FR8TPVj3nR7r0-FZIzWUycol0n5XoMuTg,6591
+ tracia-0.1.0.dist-info/METADATA,sha256=mBVXXalffBkWQMLcQ8AdKYpnyZmmssqcceBq5YRaauA,7033
+ tracia-0.1.0.dist-info/WHEEL,sha256=qtCwoSJWgHk21S1Kb4ihdzI2rlJ1ZKaIurTj_ngOhyQ,87
+ tracia-0.1.0.dist-info/licenses/LICENSE,sha256=jnoLcro_uevZVCUf3nkyE43c6mG6MGNJ_3_TShNJj5s,1063
+ tracia-0.1.0.dist-info/RECORD,,
@@ -1,52 +0,0 @@
- Metadata-Version: 2.4
- Name: tracia
- Version: 0.0.1
- Summary: LLM prompt management and tracing SDK - Coming soon
- Project-URL: Homepage, https://tracia.io
- Project-URL: Repository, https://github.com/tracia/tracia-python
- Project-URL: Issues, https://github.com/tracia/tracia-python/issues
- Author-email: Tracia <hello@tracia.io>
- License-Expression: MIT
- License-File: LICENSE
- Keywords: ai,anthropic,chatgpt,claude,gemini,google,llm,observability,openai,prompt,tracing
- Classifier: Development Status :: 1 - Planning
- Classifier: Intended Audience :: Developers
- Classifier: License :: OSI Approved :: MIT License
- Classifier: Programming Language :: Python :: 3
- Classifier: Programming Language :: Python :: 3.8
- Classifier: Programming Language :: Python :: 3.9
- Classifier: Programming Language :: Python :: 3.10
- Classifier: Programming Language :: Python :: 3.11
- Classifier: Programming Language :: Python :: 3.12
- Classifier: Topic :: Software Development :: Libraries :: Python Modules
- Requires-Python: >=3.8
- Description-Content-Type: text/markdown
-
- # Tracia
-
- **LLM prompt management and tracing SDK**
-
- 🚀 **Coming soon** - Full SDK launching in early 2026.
-
- ## What is Tracia?
-
- Tracia is a modern LLM prompt management and tracing platform with zero-config setup. Store your prompts in the cloud, trace every LLM call, and iterate faster.
-
- ## Quick Preview
-
- ```python
- from tracia import Tracia
-
- client = Tracia()
- response = client.run("my-prompt", variables={"name": "World"})
- ```
-
- ## Learn More
-
- - 🌐 Website: [tracia.io](https://tracia.io)
- - 📖 Docs: Coming soon
- - 🐙 GitHub: [github.com/tracia](https://github.com/tracia)
-
- ## License
-
- MIT
@@ -1,5 +0,0 @@
- tracia/__init__.py,sha256=LOfccReusbDjJkitNMUSvGb5X18iBZv-Zrka7ryRD5o,148
- tracia-0.0.1.dist-info/METADATA,sha256=TUNipf7P_IObyy1xDumTKu1HfS_dGfszMFZdNkzi2D8,1643
- tracia-0.0.1.dist-info/WHEEL,sha256=qtCwoSJWgHk21S1Kb4ihdzI2rlJ1ZKaIurTj_ngOhyQ,87
- tracia-0.0.1.dist-info/licenses/LICENSE,sha256=jnoLcro_uevZVCUf3nkyE43c6mG6MGNJ_3_TShNJj5s,1063
- tracia-0.0.1.dist-info/RECORD,,