chuk-ai-session-manager 0.4.1__py3-none-any.whl → 0.5__py3-none-any.whl

This diff shows the changes between two publicly released versions of the package, as they appear in the public registry. It is provided for informational purposes only.
@@ -1,4 +1,4 @@
- chuk_ai_session_manager/__init__.py,sha256=Kg8_fqwkV88b0f9H8Q45MDqaZdHHTo3Wf0Oko89DhHg,10217
+ chuk_ai_session_manager/__init__.py,sha256=no7tWD_1TG8b9Zuc52ha5fHtke5YQBraxXA_9xaAsqE,5633
  chuk_ai_session_manager/exceptions.py,sha256=WqrrUZuOAiUmz7tKnSnk0y222U_nV9a8LyaXLayn2fg,4420
  chuk_ai_session_manager/infinite_conversation.py,sha256=7j3caMnsX27M5rjj4oOkqiy_2AfcupWwsAWRflnKiSo,12092
  chuk_ai_session_manager/sample_tools.py,sha256=U-jTGveTJ95uSnA4jB30fJQJG3K-TGxN9jcOY6qVHZQ,8179
@@ -16,7 +16,7 @@ chuk_ai_session_manager/models/session_event.py,sha256=RTghC9_sDHzD8qdgEYCoclJzp
  chuk_ai_session_manager/models/session_metadata.py,sha256=KFG7lc_E0BQTP2OD9Y529elVGJXppDUMqz8vVONW0rw,1510
  chuk_ai_session_manager/models/session_run.py,sha256=uhMM4-WSrqOUsiWQPnyakInd-foZhxI-YnSHSWiZZwE,4369
  chuk_ai_session_manager/models/token_usage.py,sha256=M9Qwmeb2woILaSRwA2SIAiG-sIwC3cL_1H-y3NjW5Ik,11436
- chuk_ai_session_manager-0.4.1.dist-info/METADATA,sha256=0PvxzwOw2HG8XujmXkMInNVAlrhF149rxyLzXTUROAg,11136
- chuk_ai_session_manager-0.4.1.dist-info/WHEEL,sha256=_zCd3N1l69ArxyTb8rzEoP9TpbYXkqRFSNOD5OuxnTs,91
- chuk_ai_session_manager-0.4.1.dist-info/top_level.txt,sha256=5RinqD0v-niHuLYePUREX4gEWTlrpgtUg0RfexVRBMk,24
- chuk_ai_session_manager-0.4.1.dist-info/RECORD,,
+ chuk_ai_session_manager-0.5.dist-info/METADATA,sha256=AQtoeqt0OvMykMTJYsfxrObeOfnoSyj66RRuBrzoEoM,25537
+ chuk_ai_session_manager-0.5.dist-info/WHEEL,sha256=_zCd3N1l69ArxyTb8rzEoP9TpbYXkqRFSNOD5OuxnTs,91
+ chuk_ai_session_manager-0.5.dist-info/top_level.txt,sha256=5RinqD0v-niHuLYePUREX4gEWTlrpgtUg0RfexVRBMk,24
+ chuk_ai_session_manager-0.5.dist-info/RECORD,,
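
The hunk that follows removes the 0.4.1 METADATA (the package README, shown in full); per the RECORD entries above, 0.5 ships a larger METADATA (11136 → 25537 bytes). Each RECORD line pairs a file path with an unpadded urlsafe-base64 SHA-256 digest and a size in bytes. As a minimal sketch (not part of the package, local wheel filename and helper name are hypothetical), such an entry can be recomputed from a downloaded wheel like this:

```python
# Recompute a RECORD-style entry "path,sha256=<urlsafe-b64-nopad digest>,<size>"
# for one file inside a wheel. The wheel filename is a hypothetical local download.
import base64
import hashlib
import zipfile

def record_entry(wheel_path: str, member: str) -> str:
    # Read the member's bytes straight out of the wheel (a zip archive)
    with zipfile.ZipFile(wheel_path) as wheel:
        data = wheel.read(member)
    # RECORD uses urlsafe base64 of the SHA-256 digest with '=' padding stripped
    digest = base64.urlsafe_b64encode(hashlib.sha256(data).digest()).rstrip(b"=")
    return f"{member},sha256={digest.decode('ascii')},{len(data)}"

print(record_entry("chuk_ai_session_manager-0.5-py3-none-any.whl",
                   "chuk_ai_session_manager/__init__.py"))
```
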
@@ -1,355 +0,0 @@
- Metadata-Version: 2.4
- Name: chuk-ai-session-manager
- Version: 0.4.1
- Summary: Session manager for AI applications
- Requires-Python: >=3.11
- Description-Content-Type: text/markdown
- Requires-Dist: chuk-sessions>=0.3
- Requires-Dist: chuk-tool-processor>=0.4.1
- Requires-Dist: pydantic>=2.11.3
- Provides-Extra: tiktoken
- Requires-Dist: tiktoken>=0.9.0; extra == "tiktoken"
- Provides-Extra: redis
- Requires-Dist: redis>=4.0.0; extra == "redis"
- Provides-Extra: dev
- Requires-Dist: pytest>=7.0.0; extra == "dev"
- Requires-Dist: pytest-cov>=4.0.0; extra == "dev"
- Requires-Dist: pytest-asyncio>=0.21.0; extra == "dev"
- Requires-Dist: redis>=4.0.0; extra == "dev"
- Requires-Dist: black>=23.0.0; extra == "dev"
- Requires-Dist: isort>=5.12.0; extra == "dev"
- Requires-Dist: mypy>=1.0.0; extra == "dev"
- Provides-Extra: full
-
- # chuk-ai-session-manager
-
- [![Python 3.11+](https://img.shields.io/badge/python-3.11+-blue.svg)](https://www.python.org/downloads/)
- [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
-
- **The easiest way to add conversation tracking to any AI application.**
-
- Track conversations, monitor costs, and manage infinite context with just 3 lines of code. Built for production, designed for simplicity.
-
- ## 🚀 30-Second Start
-
- ```bash
- uv add chuk-ai-session-manager
- ```
-
- ```python
- from chuk_ai_session_manager import track_conversation
-
- # Track any AI conversation in one line
- await track_conversation("Hello!", "Hi there! How can I help?")
- ```
-
- That's it! 🎉 Your conversation is now tracked with full observability.
-
- ## ✨ Why Choose CHUK?
-
- **🔥 Stupidly Simple**: 3 lines to track any conversation
- **💰 Cost Smart**: Automatic token counting and cost tracking
- **♾️ Infinite Context**: No more "conversation too long" errors
- **🔧 Any LLM**: Works with OpenAI, Anthropic, local models, anything
- **📊 Full Observability**: See exactly what's happening in your AI app
- **🚀 Production Ready**: Used in real applications, not just demos
-
- ## 🎯 Perfect For
-
- **Building chatbots** that remember conversations
- **Tracking LLM costs** across your entire application
- **Managing long conversations** without hitting token limits
- **Debugging AI applications** with complete audit trails
- **Production AI systems** that need reliable session management
-
- ## 📱 Quick Examples
-
- ### Track Any Conversation
- ```python
- from chuk_ai_session_manager import track_conversation
-
- # Works with any LLM response
- session_id = await track_conversation(
-     user_message="What's the weather like?",
-     ai_response="It's sunny and 75°F in your area.",
-     model="gpt-4",
-     provider="openai"
- )
- ```
-
- ### Persistent Conversations
- ```python
- from chuk_ai_session_manager import SessionManager
-
- # Create a conversation that remembers context
- sm = SessionManager()
-
- await sm.user_says("My name is Alice")
- await sm.ai_responds("Nice to meet you, Alice!")
-
- await sm.user_says("What's my name?")
- await sm.ai_responds("Your name is Alice!")
-
- # Get conversation stats
- stats = await sm.get_stats()
- print(f"Cost: ${stats['estimated_cost']:.6f}")
- print(f"Tokens: {stats['total_tokens']}")
- ```
-
- ### Infinite Context (Never Run Out of Space)
- ```python
- # Automatically handles conversations of any length
- sm = SessionManager(
-     infinite_context=True, # 🔥 Magic happens here
-     token_threshold=4000 # When to create new segment
- )
-
- # Keep chatting forever - context is preserved automatically
- for i in range(100): # This would normally hit token limits
-     await sm.user_says(f"Question {i}: Tell me about AI")
-     await sm.ai_responds("AI is fascinating...")
-
- # Still works! Automatic summarization keeps context alive
- conversation = await sm.get_conversation()
- print(f"Full conversation: {len(conversation)} exchanges")
- ```
-
- ### Cost Tracking (Know What You're Spending)
- ```python
- # Automatic cost monitoring across all interactions
- sm = SessionManager()
-
- await sm.user_says("Write a long story about dragons")
- await sm.ai_responds("Once upon a time..." * 500) # Long response
-
- stats = await sm.get_stats()
- print(f"💰 That story cost: ${stats['estimated_cost']:.6f}")
- print(f"📊 Used {stats['total_tokens']} tokens")
- print(f"📈 {stats['user_messages']} user messages, {stats['ai_messages']} AI responses")
- ```
-
- ### Multi-Provider Support
- ```python
- # Works with any LLM provider
- import openai
- import anthropic
-
- sm = SessionManager()
-
- # OpenAI
- await sm.user_says("Hello!")
- openai_response = await openai.chat.completions.create(...)
- await sm.ai_responds(openai_response.choices[0].message.content, model="gpt-4", provider="openai")
-
- # Anthropic
- await sm.user_says("How are you?")
- anthropic_response = await anthropic.messages.create(...)
- await sm.ai_responds(anthropic_response.content[0].text, model="claude-3", provider="anthropic")
-
- # See costs across all providers
- stats = await sm.get_stats()
- print(f"Total cost across all providers: ${stats['estimated_cost']:.6f}")
- ```
-
- ## 🛠️ Advanced Features
-
- ### Conversation Analytics
- ```python
- # Get detailed insights into your conversations
- conversation = await sm.get_conversation()
- stats = await sm.get_stats()
-
- print(f"📊 Conversation Analytics:")
- print(f" Messages: {stats['user_messages']} user, {stats['ai_messages']} AI")
- print(f" Average response length: {stats['avg_response_length']}")
- print(f" Most expensive response: ${stats['max_response_cost']:.6f}")
- print(f" Session duration: {stats['duration_minutes']:.1f} minutes")
- ```
-
- ### Tool Integration
- ```python
- # Track tool usage alongside conversations
- await sm.tool_used(
-     tool_name="web_search",
-     arguments={"query": "latest AI news"},
-     result={"articles": ["AI breakthrough...", "New model released..."]},
-     cost=0.001
- )
-
- stats = await sm.get_stats()
- print(f"Tool calls: {stats['tool_calls']}")
- ```
-
- ### Session Export/Import
- ```python
- # Export conversations for analysis
- conversation_data = await sm.export_conversation()
- with open('conversation.json', 'w') as f:
-     json.dump(conversation_data, f)
-
- # Import previous conversations
- sm = SessionManager()
- await sm.import_conversation('conversation.json')
- ```
-
- ## 🎨 Real-World Examples
-
- ### Customer Support Bot
- ```python
- async def handle_support_ticket(user_message: str, ticket_id: str):
-     # Each ticket gets its own session
-     sm = SessionManager(session_id=ticket_id)
-
-     await sm.user_says(user_message)
-
-     # Your AI logic here
-     ai_response = await your_ai_model(user_message)
-     await sm.ai_responds(ai_response, model="gpt-4", provider="openai")
-
-     # Automatic cost tracking per ticket
-     stats = await sm.get_stats()
-     print(f"Ticket {ticket_id} cost: ${stats['estimated_cost']:.6f}")
-
-     return ai_response
- ```
-
- ### AI Assistant with Memory
- ```python
- async def ai_assistant():
-     sm = SessionManager(infinite_context=True)
-
-     while True:
-         user_input = input("You: ")
-         if user_input.lower() == 'quit':
-             break
-
-         await sm.user_says(user_input)
-
-         # Get conversation context for AI
-         conversation = await sm.get_conversation()
-         context = "\n".join([f"{turn['role']}: {turn['content']}" for turn in conversation[-5:]])
-
-         # Your AI call with context
-         ai_response = await your_ai_model(f"Context:\n{context}\n\nUser: {user_input}")
-         await sm.ai_responds(ai_response)
-
-         print(f"AI: {ai_response}")
-
-     # Show final stats
-     stats = await sm.get_stats()
-     print(f"\n💰 Total conversation cost: ${stats['estimated_cost']:.6f}")
- ```
-
- ### Multi-User Chat Application
- ```python
- class ChatApplication:
-     def __init__(self):
-         self.user_sessions = {}
-
-     async def handle_message(self, user_id: str, message: str):
-         # Each user gets their own session
-         if user_id not in self.user_sessions:
-             self.user_sessions[user_id] = SessionManager(infinite_context=True)
-
-         sm = self.user_sessions[user_id]
-         await sm.user_says(message)
-
-         # AI processes with user's personal context
-         ai_response = await self.generate_response(sm, message)
-         await sm.ai_responds(ai_response)
-
-         return ai_response
-
-     async def get_user_stats(self, user_id: str):
-         if user_id in self.user_sessions:
-             return await self.user_sessions[user_id].get_stats()
-         return None
- ```
-
- ## 📊 Monitoring Dashboard
-
- ```python
- # Get comprehensive analytics across all sessions
- from chuk_ai_session_manager import get_global_stats
-
- stats = await get_global_stats()
- print(f"""
- 🚀 AI Application Dashboard
- ==========================
- Total Sessions: {stats['total_sessions']}
- Total Messages: {stats['total_messages']}
- Total Cost: ${stats['total_cost']:.2f}
- Average Session Length: {stats['avg_session_length']:.1f} messages
- Most Active Hour: {stats['peak_hour']}
- Top Models Used: {', '.join(stats['top_models'])}
- """)
- ```
-
- ## 🔧 Installation Options
-
- ```bash
- # Basic installation
- uv add chuk-ai-session-manager
-
- # With Redis support (for production)
- uv add chuk-ai-session-manager[redis]
-
- # Full installation (all features)
- uv add chuk-ai-session-manager[full]
-
- # Or with pip
- pip install chuk-ai-session-manager
- ```
-
- ## 🌟 What Makes CHUK Special?
-
- | Feature | Other Libraries | CHUK AI Session Manager |
- |---------|----------------|------------------------|
- | **Setup Complexity** | Complex configuration | 3 lines of code |
- | **Cost Tracking** | Manual calculation | Automatic across all providers |
- | **Long Conversations** | Token limit errors | Infinite context with auto-segmentation |
- | **Multi-Provider** | Provider-specific code | Works with any LLM |
- | **Production Ready** | Requires additional work | Built for production |
- | **Learning Curve** | Steep | 5 minutes to productivity |
-
- ## 📖 More Examples
-
- Check out the `/examples` directory for complete working examples:
-
- `simple_tracking.py` - Basic conversation tracking
- `openai_integration.py` - OpenAI API integration
- `infinite_context.py` - Handling long conversations
- `cost_monitoring.py` - Cost tracking and analytics
- `multi_provider.py` - Using multiple LLM providers
- `production_app.py` - Production-ready application
-
- ## 🎯 Quick Decision Guide
-
- **Choose CHUK AI Session Manager if you want:**
- ✅ Simple conversation tracking with zero configuration
- ✅ Automatic cost monitoring across all LLM providers
- ✅ Infinite conversation length without token limit errors
- ✅ Production-ready session management out of the box
- ✅ Complete conversation analytics and observability
- ✅ Framework-agnostic solution that works with any LLM library
-
- ## 🤝 Community & Support
-
- 📖 **Documentation**: [Full docs with tutorials](link-to-docs)
- 🐛 **Issues**: Report bugs on GitHub
- 💡 **Feature Requests**: Suggest new features
- 📧 **Support**: enterprise@chuk.dev for production support
-
- ## 📝 License
-
- MIT License - build amazing AI applications with confidence!
-
- ---
-
- **🎉 Ready to build better AI applications?**
-
- ```bash
- uv add chuk-ai-session-manager
- ```
-
- **Get started in 30 seconds with one line of code!**