noesium 0.2.0__py3-none-any.whl → 0.2.1__py3-none-any.whl

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -0,0 +1,253 @@
1
+ Metadata-Version: 2.4
2
+ Name: noesium
3
+ Version: 0.2.1
4
+ Summary: Towards a cognitive agentic framework
5
+ Author-email: Xiaming Chen <chenxm35@gmail.com>
6
+ Maintainer-email: Xiaming Chen <chenxm35@gmail.com>
7
+ License-Expression: MIT
8
+ Project-URL: Homepage, https://github.com/mirasoth/noesium
9
+ Project-URL: Repository, https://github.com/mirasoth/noesium
10
+ Keywords: agents,multi-agent system,cognition,artificial intelligence
11
+ Classifier: Development Status :: 4 - Beta
12
+ Classifier: Intended Audience :: Developers
13
+ Classifier: Operating System :: OS Independent
14
+ Classifier: Programming Language :: Python :: 3
15
+ Classifier: Programming Language :: Python :: 3.11
16
+ Classifier: Programming Language :: Python :: 3.12
17
+ Classifier: Topic :: Software Development :: Libraries :: Python Modules
18
+ Requires-Python: >=3.11
19
+ Description-Content-Type: text/markdown
20
+ License-File: LICENSE
21
+ Requires-Dist: pydantic>=2.0.0
22
+ Requires-Dist: requests>=2.31.0
23
+ Requires-Dist: aiohttp>=3.12.15
24
+ Requires-Dist: python-dotenv>=1.1.1
25
+ Requires-Dist: colorlog>=6.8.0
26
+ Requires-Dist: typing-extensions>=4.8.0
27
+ Provides-Extra: google
28
+ Requires-Dist: google-genai>=1.5.0; extra == "google"
29
+ Requires-Dist: google-api-python-client>=2.174.0; extra == "google"
30
+ Requires-Dist: google-auth-oauthlib>=1.2.2; extra == "google"
31
+ Requires-Dist: google-auth>=2.40.3; extra == "google"
32
+ Provides-Extra: aliyun
33
+ Requires-Dist: aliyun-python-sdk-core<3.0.0,>=2.13.1; extra == "aliyun"
34
+ Provides-Extra: llm
35
+ Requires-Dist: litellm>=1.0.0; extra == "llm"
36
+ Requires-Dist: openai>=1.0.0; extra == "llm"
37
+ Requires-Dist: instructor>=1.10.0; extra == "llm"
38
+ Provides-Extra: local-llm
39
+ Requires-Dist: ollama>=0.5.3; extra == "local-llm"
40
+ Requires-Dist: llama-cpp-python>=0.3.16; extra == "local-llm"
41
+ Requires-Dist: huggingface-hub>=0.34.4; extra == "local-llm"
42
+ Provides-Extra: ai-providers-all
43
+ Requires-Dist: noesium[aliyun,google,llm,local-llm]; extra == "ai-providers-all"
44
+ Provides-Extra: langchain
45
+ Requires-Dist: langchain-core>=0.3.72; extra == "langchain"
46
+ Requires-Dist: langchain-text-splitters>=0.3.0; extra == "langchain"
47
+ Requires-Dist: langchain-ollama>=0.2.0; extra == "langchain"
48
+ Requires-Dist: langgraph>=0.5.4; extra == "langchain"
49
+ Provides-Extra: agents
50
+ Requires-Dist: noesium[langchain]; extra == "agents"
51
+ Requires-Dist: bubus>=1.5.6; extra == "agents"
52
+ Provides-Extra: postgres
53
+ Requires-Dist: psycopg2-binary>=2.9.0; extra == "postgres"
54
+ Requires-Dist: psycopg2>=2.9.10; extra == "postgres"
55
+ Provides-Extra: weaviate
56
+ Requires-Dist: weaviate-client<5,>=4; extra == "weaviate"
57
+ Requires-Dist: protobuf<6,>=5; extra == "weaviate"
58
+ Provides-Extra: datascience
59
+ Requires-Dist: networkx>=3.5; extra == "datascience"
60
+ Requires-Dist: matplotlib>=3.8.0; extra == "datascience"
61
+ Requires-Dist: pexpect>=4.9.0; extra == "datascience"
62
+ Requires-Dist: ipython>=8.18.0; extra == "datascience"
63
+ Requires-Dist: pandas>=2.0.0; extra == "datascience"
64
+ Provides-Extra: mcp
65
+ Requires-Dist: mcp>=1.0.0; extra == "mcp"
66
+ Provides-Extra: tools
67
+ Requires-Dist: noesium[aliyun,datascience,google,mcp]; extra == "tools"
68
+ Requires-Dist: wizsearch<2.0.0,>=1.0.1; extra == "tools"
69
+ Requires-Dist: arxiv>=2.2.0; extra == "tools"
70
+ Requires-Dist: pillow<12.0,>=10.1.0; extra == "tools"
71
+ Requires-Dist: pymupdf>=1.23.0; extra == "tools"
72
+ Requires-Dist: openpyxl>=3.1.5; extra == "tools"
73
+ Requires-Dist: wikipedia-api>=0.6.0; extra == "tools"
74
+ Requires-Dist: aiofiles>=24.1.0; extra == "tools"
75
+ Provides-Extra: all
76
+ Requires-Dist: noesium[agents,ai-providers-all,postgres,tools,weaviate]; extra == "all"
77
+ Provides-Extra: dev
78
+ Requires-Dist: pytest<9,>=8.2; extra == "dev"
79
+ Requires-Dist: pytest-cov>=4.0.0; extra == "dev"
80
+ Requires-Dist: pytest-asyncio>=1.1.0; extra == "dev"
81
+ Requires-Dist: black>=23.0.0; extra == "dev"
82
+ Requires-Dist: isort>=5.12.0; extra == "dev"
83
+ Requires-Dist: mypy>=1.10.0; extra == "dev"
84
+ Requires-Dist: autoflake>=2.3.1; extra == "dev"
85
+ Requires-Dist: flake8>=7.3.0; extra == "dev"
86
+ Dynamic: license-file
87
+
88
+ # Noesium
89
+
90
+ [![CI](https://github.com/mirasoth/noesium/actions/workflows/ci.yml/badge.svg)](https://github.com/mirasoth/noesium/actions/workflows/ci.yml)
91
+ [![PyPI version](https://img.shields.io/pypi/v/noesium.svg)](https://pypi.org/project/noesium/)
92
+ [![Ask DeepWiki](https://deepwiki.com/badge.svg)](https://deepwiki.com/mirasoth/noesium)
93
+
94
+ Project Noesium is an initiative to develop a computation-driven, cognitive agentic system. This repo contains the foundational abstractions (Agent, Memory, Tool, Goal, Orchestration, and more) along with essential modules such as LLM clients, logging, message buses, model routing, and observability. For the underlying philosophy, refer to my talk on MAS ([link](https://github.com/caesar0301/mas-talk-2508/blob/master/mas-talk-xmingc.pdf)).
95
+
96
+ ## Installation
97
+
98
+ ```bash
99
+ pip install -U noesium
100
+ ```
101
+
102
+ ## Core Modules
103
+
104
+ | Module | Description |
105
+ |--------|-------------|
106
+ | **LLM Integration** (`noesium.core.llm`) | Multi-provider support (OpenAI, OpenRouter, Ollama, LlamaCPP, LiteLLM), dynamic routing, token tracking |
107
+ | **Goal Management** (`noesium.core.goalith`) | LLM-based goal decomposition, DAG-based goal graph, dependency tracking |
108
+ | **Tool Management** (`noesium.core.toolify`) | Tool registry, MCP integration, 17+ built-in toolkits |
109
+ | **Memory** (`noesium.core.memory`) | MemU integration, embedding-based retrieval, multi-category storage |
110
+ | **Vector Store** (`noesium.core.vector_store`) | PGVector and Weaviate support, semantic search |
111
+ | **Message Bus** (`noesium.core.msgbus`) | Event-driven architecture, watchdog patterns |
112
+ | **Routing** (`noesium.core.routing`) | Dynamic complexity-based model selection |
113
+ | **Tracing** (`noesium.core.tracing`) | Token usage monitoring, Opik integration |
114
+
115
+ ## Built-in Agents
116
+
117
+ - **AskuraAgent** - Conversational agent for collecting semi-structured information via human-in-the-loop workflows
118
+ - **SearchAgent** - Web search with query polishing, multi-engine support, and optional content crawling
119
+ - **DeepResearchAgent** - Iterative research with LLM-powered reflection and citation generation
120
+ - **MemoryAgent** - Memory management with categorization, embedding search, and memory linking
121
+
122
+ ## Quick Start
123
+
124
+ ### LLM Client
125
+
126
+ ```python
127
+ from noesium.core.llm import get_llm_client
128
+
129
+ # Create client (supports openai, openrouter, ollama, llamacpp)
130
+ client = get_llm_client(provider="openai", api_key="sk-...")
131
+
132
+ # Chat completion
133
+ response = client.completion([{"role": "user", "content": "Hello!"}])
134
+
135
+ # Structured output
136
+ from pydantic import BaseModel
137
+
138
+ class Answer(BaseModel):
139
+     text: str
140
+     confidence: float
141
+
142
+ client = get_llm_client(provider="openai", structured_output=True)
143
+ messages = [{"role": "user", "content": "Hello!"}]
+ result = client.structured_completion(messages, Answer)
144
+ ```
145
+
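+ ### Token Tracking
+
+ The token tracker listed under **Tracing** in Core Modules works with any client. A minimal sketch carried over from the 0.2.0 README, assuming `get_token_tracker` keeps the same interface in 0.2.1:
+
+ ```python
+ from noesium.core.llm import get_llm_client
+ from noesium.core.tracing import get_token_tracker
+
+ client = get_llm_client(provider="openai")
+ tracker = get_token_tracker()
+ tracker.reset()
+
+ # Completions issued through the client are tracked automatically
+ client.completion([{"role": "user", "content": "Hello"}])
+
+ stats = tracker.get_stats()
+ print(f"Total tokens: {stats['total_tokens']}, calls: {stats['total_calls']}")
+ ```
+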
146
+ ### Tool Management
147
+
148
+ ```python
149
+ from noesium.core.toolify import BaseToolkit, ToolkitConfig, ToolkitRegistry, register_toolkit
150
+
151
+ @register_toolkit("calculator")
152
+ class CalculatorToolkit(BaseToolkit):
153
+     def get_tools_map(self):
154
+         return {"add": self.add, "multiply": self.multiply}
155
+
156
+     def add(self, a: float, b: float) -> float:
157
+         return a + b
158
+
159
+     def multiply(self, a: float, b: float) -> float:
160
+         return a * b
161
+
162
+ # Use toolkit
163
+ config = ToolkitConfig(name="calculator")
164
+ calc = ToolkitRegistry.create_toolkit("calculator", config)
165
+ result = calc.call_tool("add", a=5, b=3)
166
+ ```
167
+
168
+ ### Goal Decomposition
169
+
170
+ ```python
171
+ from noesium.core.goalith.goalgraph.node import GoalNode
172
+ from noesium.core.goalith.goalgraph.graph import GoalGraph
173
+ from noesium.core.goalith.decomposer import LLMDecomposer
174
+
175
+ # Create and decompose a goal
176
+ goal = GoalNode(description="Plan a product launch", priority=8.0)
177
+ graph = GoalGraph()
178
+ graph.add_node(goal)
179
+
180
+ decomposer = LLMDecomposer()
181
+ subgoals = decomposer.decompose(goal, context={"budget": "$50,000"})
182
+ ```
183
+
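+ Custom decomposers plug into the same graph. A compact sketch adapted from the 0.2.0 README's advanced example, assuming the `GoalDecomposer` base class keeps its `name`/`decompose` interface in 0.2.1; the `ChecklistDecomposer` name and its fixed step list are illustrative only:
+
+ ```python
+ from typing import Any, Dict, List, Optional
+
+ from noesium.core.goalith.decomposer.base import GoalDecomposer
+ from noesium.core.goalith.goalgraph.node import GoalNode
+
+ class ChecklistDecomposer(GoalDecomposer):
+     @property
+     def name(self) -> str:
+         return "checklist_decomposer"
+
+     def decompose(self, goal_node: GoalNode, context: Optional[Dict[str, Any]] = None) -> List[GoalNode]:
+         # Split every goal into the same fixed checklist of subtasks
+         steps = ["Research requirements", "Design solution", "Implement", "Test and deploy"]
+         return [
+             GoalNode(description=step, parent=goal_node.id, decomposer_name=self.name)
+             for step in steps
+         ]
+ ```
+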
184
+ ### Search Agent
185
+
186
+ ```python
187
+ from noesium.agents.search import SearchAgent, SearchConfig
188
+
189
+ config = SearchConfig(
190
+ polish_query=True,
191
+ search_engines=["tavily"],
192
+ max_results_per_engine=5
193
+ )
194
+ agent = SearchAgent(config=config)
195
+ results = await agent.search("latest developments in quantum computing")  # run inside an async function
196
+ ```
197
+
198
+ ### Deep Research Agent
199
+
200
+ ```python
201
+ from noesium.agents.deep_research import DeepResearchAgent, DeepResearchConfig
202
+
203
+ config = DeepResearchConfig(
204
+ number_of_initial_queries=3,
205
+ max_research_loops=3,
206
+ web_search_citation_enabled=True
207
+ )
208
+ agent = DeepResearchAgent(config=config)
209
+ result = await agent.research("What are the implications of AI on healthcare?")
210
+ ```
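+
+ ### Memory Agent
+
+ A minimal sketch of the MemU-backed memory agent, adapted from the 0.2.0 README; the `MemoryAgent` import path and its `call_function` interface are assumed unchanged in 0.2.1:
+
+ ```python
+ from noesium.core.memory.memu import MemoryAgent
+
+ memory_agent = MemoryAgent(
+     agent_id="my_agent",
+     user_id="user123",
+     memory_dir="/tmp/memory_storage",
+     enable_embeddings=True,
+ )
+
+ # Store a short conversation as activity memory
+ result = memory_agent.call_function(
+     "add_activity_memory",
+     {"character_name": "Sarah", "content": "USER: Hi, I'm Sarah and I work as a software engineer."},
+ )
+ print(result.get("success"))
+ ```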
211
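+
+ ### Vector Store
+
+ A PGVector sketch adapted from the 0.2.0 README, assuming the `PGVectorStore` constructor and `insert`/`search` signatures are unchanged in 0.2.1; it requires PostgreSQL with the pgvector extension and an embedding-capable client:
+
+ ```python
+ from noesium.core.llm import get_llm_client
+ from noesium.core.vector_store import PGVectorStore
+
+ store = PGVectorStore(
+     collection_name="my_documents",
+     embedding_model_dims=768,
+     dbname="vectordb", user="postgres", password="postgres",
+     host="localhost", port=5432,
+ )
+ embed_client = get_llm_client(provider="ollama", embed_model="nomic-embed-text")
+
+ # Embed and index a document, then run a semantic search
+ store.insert(
+     vectors=[embed_client.embed("Machine learning is a subset of AI...")],
+     payloads=[{"category": "AI"}],
+     ids=["doc1"],
+ )
+ query = "What is artificial intelligence?"
+ results = store.search(query=query, vectors=embed_client.embed(query), limit=5)
+ ```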
+
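+
+ ### Message Bus
+
+ An event/watchdog sketch adapted from the 0.2.0 README, assuming `EventBus`, `BaseEvent`, and `BaseWatchdog` keep the same API in 0.2.1:
+
+ ```python
+ from noesium.core.msgbus import BaseEvent, BaseWatchdog, EventBus
+
+ class TaskCompleted(BaseEvent):
+     def __init__(self, task_id: str, result: str):
+         super().__init__()
+         self.task_id = task_id
+         self.result = result
+
+ class TaskWatchdog(BaseWatchdog):
+     def handle_event(self, event: BaseEvent):
+         if isinstance(event, TaskCompleted):
+             print(f"Task {event.task_id} finished: {event.result}")
+
+ # Register the watchdog and publish an event
+ bus = EventBus()
+ bus.register_watchdog(TaskWatchdog())
+ bus.publish(TaskCompleted("task_1", "success"))
+ ```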
212
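+
+ ### Model Routing
+
+ A sketch of complexity-based routing, adapted from the 0.2.0 README; `ModelRouter`, the `dynamic_complexity` strategy name, and the threshold keys are assumed unchanged in 0.2.1:
+
+ ```python
+ from noesium.core.llm import get_llm_client
+ from noesium.core.routing import ModelRouter
+
+ # A small local model scores query complexity
+ lite_client = get_llm_client(provider="ollama", chat_model="llama3.2:1b")
+ router = ModelRouter(
+     strategy="dynamic_complexity",
+     lite_client=lite_client,
+     strategy_config={"complexity_threshold_low": 0.3, "complexity_threshold_high": 0.7},
+ )
+
+ result = router.route("Explain quantum computing and its applications")
+ print(result.tier, result.confidence)
+ ```
+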
+ ## Environment Variables
213
+
214
+ ```bash
215
+ # LLM Providers
216
+ export NOESIUM_LLM_PROVIDER="openai"
217
+ export OPENAI_API_KEY="sk-..."
218
+ export OPENROUTER_API_KEY="sk-..."
219
+ export OLLAMA_BASE_URL="http://localhost:11434"
220
+ export LLAMACPP_MODEL_PATH="/path/to/model.gguf"
221
+
222
+ # Vector Store (PostgreSQL)
223
+ export POSTGRES_HOST="localhost"
224
+ export POSTGRES_PORT="5432"
225
+ export POSTGRES_DB="vectordb"
226
+ export POSTGRES_USER="postgres"
227
+ export POSTGRES_PASSWORD="postgres"
228
+
229
+ # Search Tools
230
+ export SERPER_API_KEY="..."
231
+ export JINA_API_KEY="..."
232
+ ```
233
+
234
+ ## Examples
235
+
236
+ See the `examples/` directory for comprehensive usage examples:
237
+
238
+ - `examples/agents/` - Agent demos (Askura, Search, DeepResearch)
239
+ - `examples/llm/` - LLM provider examples and token tracking
240
+ - `examples/goals/` - Goal decomposition patterns
241
+ - `examples/memory/` - Memory agent operations
242
+ - `examples/tools/` - Toolkit demonstrations
243
+ - `examples/vector_store/` - PGVector and Weaviate usage
244
+
245
+ ## Documentation
246
+
247
+ - **Design Specifications**: `specs/` directory contains RFCs for system architecture
248
+ - **Agent Details**: See `AGENTS.md` for comprehensive agent and toolkit documentation
249
+ - **Toolify System**: `noesium/core/toolify/README.md`
250
+
251
+ ## License
252
+
253
+ MIT License - see [LICENSE](LICENSE) file for details.
@@ -115,8 +115,8 @@ noesium/toolkits/tabular_data_toolkit.py,sha256=5kofmB5TsLPOYnPCqCvmGrHfxSRBPcNa
115
115
  noesium/toolkits/user_interaction_toolkit.py,sha256=_FnSL75eb9f21IlV4KzVXGhda6RRJc3rvYo34Tt51nY,13650
116
116
  noesium/toolkits/video_toolkit.py,sha256=zd65drrbB3AGFZGBVieadGZC7sCxemrtYUg1uoQen7E,5858
117
117
  noesium/toolkits/wikipedia_toolkit.py,sha256=mQyIuMPGS-vttWWoaV-RmSg5b5MJ4hoLa1H1cUsmst0,15333
118
- noesium-0.2.0.dist-info/licenses/LICENSE,sha256=uxyIWSLWuSDg9cwA8ehH8LvzOWPM66UGQiOTUQuqTO0,1071
119
- noesium-0.2.0.dist-info/METADATA,sha256=yGm-sZNdz54NW8bCqOGgwkwq0fqX-wtG4XiwU7dbrKA,17475
120
- noesium-0.2.0.dist-info/WHEEL,sha256=wUyA8OaulRlbfwMtmQsvNngGrxQHAvkKcvRmdizlJi0,92
121
- noesium-0.2.0.dist-info/top_level.txt,sha256=IW36N4DMZNJLo2bt_QMlnw-NZepN5fnqmEKbTbb7Tug,8
122
- noesium-0.2.0.dist-info/RECORD,,
118
+ noesium-0.2.1.dist-info/licenses/LICENSE,sha256=uxyIWSLWuSDg9cwA8ehH8LvzOWPM66UGQiOTUQuqTO0,1071
119
+ noesium-0.2.1.dist-info/METADATA,sha256=9ezyGOwmVboUet9Vb5ICEo2v58Y-DI5bzTWIxtf6F5o,9579
120
+ noesium-0.2.1.dist-info/WHEEL,sha256=wUyA8OaulRlbfwMtmQsvNngGrxQHAvkKcvRmdizlJi0,92
121
+ noesium-0.2.1.dist-info/top_level.txt,sha256=IW36N4DMZNJLo2bt_QMlnw-NZepN5fnqmEKbTbb7Tug,8
122
+ noesium-0.2.1.dist-info/RECORD,,
@@ -1,533 +0,0 @@
1
- Metadata-Version: 2.4
2
- Name: noesium
3
- Version: 0.2.0
4
- Summary: Towards a cognitive agentic framework
5
- Author-email: Xiaming Chen <chenxm35@gmail.com>
6
- Maintainer-email: Xiaming Chen <chenxm35@gmail.com>
7
- License-Expression: MIT
8
- Project-URL: Homepage, https://github.com/mirasoth/noesium
9
- Project-URL: Repository, https://github.com/mirasoth/noesium
10
- Keywords: agents,multi-agent system,cognition,artificial intelligence
11
- Classifier: Development Status :: 4 - Beta
12
- Classifier: Intended Audience :: Developers
13
- Classifier: Operating System :: OS Independent
14
- Classifier: Programming Language :: Python :: 3
15
- Classifier: Programming Language :: Python :: 3.11
16
- Classifier: Programming Language :: Python :: 3.12
17
- Classifier: Topic :: Software Development :: Libraries :: Python Modules
18
- Requires-Python: >=3.11
19
- Description-Content-Type: text/markdown
20
- License-File: LICENSE
21
- Requires-Dist: pydantic>=2.0.0
22
- Requires-Dist: requests>=2.31.0
23
- Requires-Dist: httpx>=0.28.1
24
- Requires-Dist: aiohttp>=3.12.15
25
- Requires-Dist: aiofiles>=24.1.0
26
- Requires-Dist: anyio>=4.9.0
27
- Requires-Dist: python-dotenv>=1.1.1
28
- Requires-Dist: colorlog>=6.8.0
29
- Requires-Dist: typing-extensions>=4.8.0
30
- Requires-Dist: deprecated>=1.2.18
31
- Requires-Dist: psutil>=7.0.0
32
- Requires-Dist: networkx>=3.5
33
- Requires-Dist: bubus>=1.5.6
34
- Provides-Extra: openai
35
- Requires-Dist: openai>=1.0.0; extra == "openai"
36
- Requires-Dist: instructor>=1.10.0; extra == "openai"
37
- Provides-Extra: google
38
- Requires-Dist: google-genai>=1.5.0; extra == "google"
39
- Requires-Dist: google-api-python-client>=2.174.0; extra == "google"
40
- Requires-Dist: google-auth-oauthlib>=1.2.2; extra == "google"
41
- Requires-Dist: google-auth>=2.40.3; extra == "google"
42
- Provides-Extra: aliyun
43
- Requires-Dist: aliyun-python-sdk-core<3.0.0,>=2.13.1; extra == "aliyun"
44
- Provides-Extra: litellm
45
- Requires-Dist: litellm>=1.0.0; extra == "litellm"
46
- Provides-Extra: local-llm
47
- Requires-Dist: ollama>=0.5.3; extra == "local-llm"
48
- Requires-Dist: llama-cpp-python>=0.3.16; extra == "local-llm"
49
- Requires-Dist: huggingface-hub>=0.34.4; extra == "local-llm"
50
- Provides-Extra: langchain
51
- Requires-Dist: langchain-core>=0.3.72; extra == "langchain"
52
- Requires-Dist: langchain-text-splitters>=0.3.0; extra == "langchain"
53
- Requires-Dist: langchain-ollama>=0.2.0; extra == "langchain"
54
- Requires-Dist: langgraph>=0.5.4; extra == "langchain"
55
- Provides-Extra: postgres
56
- Requires-Dist: psycopg2-binary>=2.9.0; extra == "postgres"
57
- Requires-Dist: psycopg2>=2.9.10; extra == "postgres"
58
- Provides-Extra: weaviate
59
- Requires-Dist: weaviate-client<5,>=4; extra == "weaviate"
60
- Requires-Dist: protobuf<6,>=5; extra == "weaviate"
61
- Provides-Extra: datascience
62
- Requires-Dist: matplotlib>=3.8.0; extra == "datascience"
63
- Requires-Dist: pexpect>=4.9.0; extra == "datascience"
64
- Requires-Dist: ipython>=8.18.0; extra == "datascience"
65
- Requires-Dist: pandas>=2.0.0; extra == "datascience"
66
- Provides-Extra: mcp
67
- Requires-Dist: mcp>=1.0.0; extra == "mcp"
68
- Provides-Extra: tools
69
- Requires-Dist: noesium[aliyun,datascience,google,mcp]; extra == "tools"
70
- Requires-Dist: wizsearch<2.0.0,>=1.0.1; extra == "tools"
71
- Requires-Dist: arxiv>=2.2.0; extra == "tools"
72
- Requires-Dist: pillow<12.0,>=10.1.0; extra == "tools"
73
- Requires-Dist: pymupdf>=1.23.0; extra == "tools"
74
- Requires-Dist: openpyxl>=3.1.5; extra == "tools"
75
- Requires-Dist: wikipedia-api>=0.6.0; extra == "tools"
76
- Provides-Extra: all
77
- Requires-Dist: noesium[google,langchain,local-llm,openai,postgres,tools,weaviate]; extra == "all"
78
- Provides-Extra: dev
79
- Requires-Dist: pytest<9,>=8.2; extra == "dev"
80
- Requires-Dist: pytest-cov>=4.0.0; extra == "dev"
81
- Requires-Dist: pytest-asyncio>=1.1.0; extra == "dev"
82
- Requires-Dist: black>=23.0.0; extra == "dev"
83
- Requires-Dist: isort>=5.12.0; extra == "dev"
84
- Requires-Dist: mypy>=1.10.0; extra == "dev"
85
- Requires-Dist: autoflake>=2.3.1; extra == "dev"
86
- Requires-Dist: flake8>=7.3.0; extra == "dev"
87
- Dynamic: license-file
88
-
89
- # Noesium-core
90
-
91
- [![CI](https://github.com/mirasoth/noesium/actions/workflows/ci.yml/badge.svg)](https://github.com/mirasoth/noesium/actions/workflows/ci.yml)
92
- [![PyPI version](https://img.shields.io/pypi/v/noesium.svg)](https://pypi.org/project/noesium/)
93
- [![Ask DeepWiki](https://deepwiki.com/badge.svg)](https://deepwiki.com/mirasoth/noesium)
94
-
95
- Project Noesium is an initiative to develop a computation-driven, cognitive agentic system. This repo contains the foundational abstractions (Agent, Memory, Tool, Goal, Orchestration, and more) along with essential modules such as LLM clients, logging, message buses, model routing, and observability. For the underlying philosophy, refer to my talk on MAS ([link](https://github.com/caesar0301/mas-talk-2508/blob/master/mas-talk-xmingc.pdf)).
96
-
97
- ## Installation
98
-
99
- ```bash
100
- pip install -U noesium
101
- ```
102
-
103
- ## Core Modules
104
-
105
- Noesium offers a comprehensive set of modules for creating intelligent agent-based applications:
106
-
107
- ### LLM Integration & Management (`noesium.core.llm`)
108
- - **Multi-model support**: OpenAI, OpenRouter, Ollama, LlamaCPP, and LiteLLM
109
- - **Advanced routing**: Dynamic complexity-based and self-assessment routing strategies
110
- - **Tracing & monitoring**: Built-in token tracking and Opik tracing integration
111
- - **Extensible architecture**: Easy to add new LLM providers
112
-
113
- ### Goal Management & Planning (`noesium.core.goalith`) - *In Development*
114
- - **Goal decomposition**: LLM-based, callable, and simple goal decomposition strategies
115
- - **Graph-based structure**: DAG-based goal management with dependencies
116
- - **Node management**: Goal, subgoal, and task node creation and tracking
117
- - **Conflict detection**: Framework for automated goal conflict identification (planned)
118
- - **Replanning**: Dynamic goal replanning capabilities (planned)
119
-
120
- ### Tool Management (`noesium.core.toolify`)
121
- - **Tool registry**: Centralized tool registration and management
122
- - **MCP integration**: Model Context Protocol support for tool discovery
123
- - **Execution engine**: Robust tool execution with error handling
124
- - **Toolkit system**: Organized tool collections and configurations
125
-
126
- ### Memory Management (`noesium.core.memory`)
127
- - **MemU integration**: Advanced memory agent with categorization
128
- - **Embedding support**: Vector-based memory retrieval and linking
129
- - **Multi-category storage**: Activity, event, and profile memory types
130
- - **Memory linking**: Automatic relationship discovery between memories
131
-
132
- ### Vector Storage (`noesium.core.vector_store`)
133
- - **PGVector support**: PostgreSQL with pgvector extension
134
- - **Weaviate integration**: Cloud-native vector database
135
- - **Semantic search**: Embedding-based document retrieval
136
- - **Flexible indexing**: HNSW and DiskANN indexing strategies
137
-
138
- ### Message Bus (`noesium.core.msgbus`)
139
- - **Event-driven architecture**: Inter-component communication
140
- - **Watchdog patterns**: Monitoring and reactive behaviors
141
- - **Flexible routing**: Message filtering and delivery
142
-
143
- ### Routing & Tracing (`noesium.core.routing`, `noesium.core.tracing`)
144
- - **Smart routing**: Dynamic model selection based on complexity
145
- - **Token tracking**: Comprehensive usage monitoring
146
- - **Opik integration**: Production-ready observability
147
- - **LangGraph hooks**: Workflow tracing and debugging
148
-
149
- ## Quick Start
150
-
151
- ### 1. LLM Client Usage
152
-
153
- ```python
154
- from noesium.core.llm import get_llm_client
155
-
156
- # OpenAI/OpenRouter providers
157
- client = get_llm_client(provider="openai", api_key="sk-...")
158
- client = get_llm_client(provider="openrouter", api_key="sk-...")
159
-
160
- # Local providers
161
- client = get_llm_client(provider="ollama", base_url="http://localhost:11434")
162
- client = get_llm_client(provider="llamacpp", model_path="/path/to/model.gguf")
163
-
164
- # Basic chat completion
165
- response = client.completion([
166
- {"role": "user", "content": "Hello!"}
167
- ])
168
-
169
- # Structured output (requires structured_output=True)
170
- from pydantic import BaseModel
171
-
172
- class Response(BaseModel):
173
- answer: str
174
- confidence: float
175
-
176
- client = get_llm_client(provider="openai", structured_output=True)
177
- messages = [{"role": "user", "content": "Hello!"}]
- result = client.structured_completion(messages, Response)
178
- ```
179
-
180
- ### 2. Goal Management with Goalith
181
-
182
- **Note**: The Goalith goal management system is currently under development. The core components are available but the full service integration is not yet complete.
183
-
184
- ```python
185
- # Basic goal node creation and management
186
- from noesium.core.goalith.goalgraph.node import GoalNode, NodeStatus
187
- from noesium.core.goalith.goalgraph.graph import GoalGraph
188
- from noesium.core.goalith.decomposer import LLMDecomposer
189
-
190
- # Create a goal node
191
- goal_node = GoalNode(
192
- description="Plan and execute a product launch",
193
- priority=8.0,
194
- context={
195
- "budget": "$50,000",
196
- "timeline": "3 months",
197
- "target_audience": "young professionals"
198
- },
199
- tags=["product", "launch", "marketing"]
200
- )
201
-
202
- # Create goal graph for management
203
- graph = GoalGraph()
204
- graph.add_node(goal_node)
205
-
206
- # Use LLM decomposer directly
207
- decomposer = LLMDecomposer()
208
- subgoals = decomposer.decompose(goal_node, context={
209
- "team_size": "5 people",
210
- "experience_level": "intermediate"
211
- })
212
-
213
- print(f"Goal: {goal_node.description}")
214
- print(f"Status: {goal_node.status}")
215
- print(f"Generated {len(subgoals)} subgoals")
216
- ```
217
-
218
- ### 3. Memory Management
219
-
220
- ```python
221
- from noesium.core.memory.memu import MemoryAgent
222
-
223
- # Initialize memory agent
224
- memory_agent = MemoryAgent(
225
- agent_id="my_agent",
226
- user_id="user123",
227
- memory_dir="/tmp/memory_storage",
228
- enable_embeddings=True
229
- )
230
-
231
- # Add activity memory
232
- activity_content = """
233
- USER: Hi, I'm Sarah and I work as a software engineer.
234
- ASSISTANT: Nice to meet you Sarah! What kind of projects do you work on?
235
- USER: I mainly work on web applications using Python and React.
236
- """
237
-
238
- result = memory_agent.call_function(
239
- "add_activity_memory",
240
- {
241
- "character_name": "Sarah",
242
- "content": activity_content
243
- }
244
- )
245
-
246
- # Generate memory suggestions
247
- if result.get("success"):
248
- memory_items = result.get("memory_items", [])
249
- suggestions = memory_agent.call_function(
250
- "generate_memory_suggestions",
251
- {
252
- "character_name": "Sarah",
253
- "new_memory_items": memory_items
254
- }
255
- )
256
- ```
257
-
258
- ### 4. Vector Store Operations
259
-
260
- ```python
261
- from noesium.core.vector_store import PGVectorStore
262
- from noesium.core.llm import get_llm_client
263
-
264
- # Initialize vector store
265
- vector_store = PGVectorStore(
266
- collection_name="my_documents",
267
- embedding_model_dims=768,
268
- dbname="vectordb",
269
- user="postgres",
270
- password="postgres",
271
- host="localhost",
272
- port=5432
273
- )
274
-
275
- # Initialize embedding client
276
- embed_client = get_llm_client(provider="ollama", embed_model="nomic-embed-text")
277
-
278
- # Prepare documents
279
- documents = [
280
- {
281
- "id": "doc1",
282
- "content": "Machine learning is a subset of AI...",
283
- "metadata": {"category": "AI", "type": "definition"}
284
- }
285
- ]
286
-
287
- # Generate embeddings and store
288
- vectors = []
289
- payloads = []
290
- ids = []
291
-
292
- for doc in documents:
293
- embedding = embed_client.embed(doc["content"])
294
- vectors.append(embedding)
295
- payloads.append(doc["metadata"])
296
- ids.append(doc["id"])
297
-
298
- # Insert into vector store
299
- vector_store.insert(vectors=vectors, payloads=payloads, ids=ids)
300
-
301
- # Search
302
- query = "What is artificial intelligence?"
303
- query_embedding = embed_client.embed(query)
304
- results = vector_store.search(query=query, vectors=query_embedding, limit=5)
305
- ```
306
-
307
- ### 5. Tool Management
308
-
309
- ```python
310
- from noesium.core.toolify import BaseToolkit, ToolkitConfig, ToolkitRegistry, register_toolkit
311
- from typing import Dict, Callable
312
-
313
- # Create a custom toolkit using decorator
314
- @register_toolkit("calculator")
315
- class CalculatorToolkit(BaseToolkit):
316
- def get_tools_map(self) -> Dict[str, Callable]:
317
- return {
318
- "add": self.add,
319
- "multiply": self.multiply
320
- }
321
-
322
- def add(self, a: float, b: float) -> float:
323
- """Add two numbers."""
324
- return a + b
325
-
326
- def multiply(self, a: float, b: float) -> float:
327
- """Multiply two numbers."""
328
- return a * b
329
-
330
- # Alternative: Manual registration
331
- config = ToolkitConfig(name="calculator", description="Basic math operations")
332
- ToolkitRegistry.register("calculator", CalculatorToolkit)
333
-
334
- # Create and use toolkit
335
- calculator = ToolkitRegistry.create_toolkit("calculator", config)
336
- result = calculator.call_tool("add", a=5, b=3)
337
- print(f"5 + 3 = {result}")
338
- ```
339
-
340
- ### 6. Message Bus and Events
341
-
342
- ```python
343
- from noesium.core.msgbus import EventBus, BaseEvent, BaseWatchdog
344
-
345
- # Define custom event
346
- class TaskCompleted(BaseEvent):
347
- def __init__(self, task_id: str, result: str):
348
- super().__init__()
349
- self.task_id = task_id
350
- self.result = result
351
-
352
- # Create event bus
353
- bus = EventBus()
354
-
355
- # Define watchdog
356
- class TaskWatchdog(BaseWatchdog):
357
- def handle_event(self, event: BaseEvent):
358
- if isinstance(event, TaskCompleted):
359
- print(f"Task {event.task_id} completed with result: {event.result}")
360
-
361
- # Register watchdog and publish event
362
- watchdog = TaskWatchdog()
363
- bus.register_watchdog(watchdog)
364
- bus.publish(TaskCompleted("task_1", "success"))
365
- ```
366
-
367
- ### 7. Token Tracking and Tracing
368
-
369
- ```python
370
- from noesium.core.tracing import get_token_tracker
371
- from noesium.core.llm import get_llm_client
372
-
373
- # Initialize client and tracker
374
- client = get_llm_client(provider="openai")
375
- tracker = get_token_tracker()
376
-
377
- # Reset tracker
378
- tracker.reset()
379
-
380
- # Make LLM calls (automatically tracked)
381
- response1 = client.completion([{"role": "user", "content": "Hello"}])
382
- response2 = client.completion([{"role": "user", "content": "How are you?"}])
383
-
384
- # Get usage statistics
385
- stats = tracker.get_stats()
386
- print(f"Total tokens: {stats['total_tokens']}")
387
- print(f"Total calls: {stats['total_calls']}")
388
- print(f"Average tokens per call: {stats.get('avg_tokens_per_call', 0)}")
389
- ```
390
-
391
- ## Environment Variables
392
-
393
- Set these environment variables for different providers:
394
-
395
- ```bash
396
- # Default LLM provider
397
- export NOESIUM_LLM_PROVIDER="openai"
398
-
399
- # OpenAI
400
- export OPENAI_API_KEY="sk-..."
401
-
402
- # OpenRouter
403
- export OPENROUTER_API_KEY="sk-..."
404
-
405
- # LlamaCPP
406
- export LLAMACPP_MODEL_PATH="/path/to/model.gguf"
407
-
408
- # Ollama
409
- export OLLAMA_BASE_URL="http://localhost:11434"
410
-
411
- # PostgreSQL (for vector store)
412
- export POSTGRES_HOST="localhost"
413
- export POSTGRES_PORT="5432"
414
- export POSTGRES_DB="vectordb"
415
- export POSTGRES_USER="postgres"
416
- export POSTGRES_PASSWORD="postgres"
417
- ```
418
-
419
- ## Advanced Usage
420
-
421
- ### Custom Goal Decomposer
422
-
423
- ```python
424
- from noesium.core.goalith.decomposer.base import GoalDecomposer
425
- from noesium.core.goalith.goalgraph.node import GoalNode
426
- from typing import List, Dict, Any, Optional
427
- import copy
428
-
429
- class CustomDecomposer(GoalDecomposer):
430
- @property
431
- def name(self) -> str:
432
- return "custom_decomposer"
433
-
434
- def decompose(self, goal_node: GoalNode, context: Optional[Dict[str, Any]] = None) -> List[GoalNode]:
435
- # Custom decomposition logic
436
- subtasks = [
437
- "Research requirements",
438
- "Design solution",
439
- "Implement features",
440
- "Test and deploy"
441
- ]
442
-
443
- nodes = []
444
- for i, subtask in enumerate(subtasks):
445
- # Deep copy context to avoid shared references
446
- context_copy = copy.deepcopy(goal_node.context) if goal_node.context else {}
447
-
448
- node = GoalNode(
449
- description=subtask,
450
- parent=goal_node.id,
451
- priority=goal_node.priority - i * 0.1,
452
- context=context_copy,
453
- tags=goal_node.tags.copy() if goal_node.tags else [],
454
- decomposer_name=self.name
455
- )
456
- nodes.append(node)
457
-
458
- return nodes
459
-
460
- # Use the decomposer directly
461
- custom_decomposer = CustomDecomposer()
462
- goal_node = GoalNode(description="Build a web application")
463
- subgoals = custom_decomposer.decompose(goal_node)
464
- ```
465
-
466
- ### LLM Routing Strategies
467
-
468
- ```python
469
- from noesium.core.routing import ModelRouter, DynamicComplexityStrategy
470
- from noesium.core.llm import get_llm_client
471
-
472
- # Create a lite client for complexity assessment
473
- lite_client = get_llm_client(provider="ollama", chat_model="llama3.2:1b")
474
-
475
- # Create router with dynamic complexity strategy
476
- router = ModelRouter(
477
- strategy="dynamic_complexity",
478
- lite_client=lite_client,
479
- strategy_config={
480
- "complexity_threshold_low": 0.3,
481
- "complexity_threshold_high": 0.7
482
- }
483
- )
484
-
485
- # Route queries to get tier recommendations
486
- simple_query = "What is 2+2?"
487
- result = router.route(simple_query)
488
- print(f"Query: {simple_query}")
489
- print(f"Recommended tier: {result.tier}") # Likely ModelTier.LITE
490
- print(f"Confidence: {result.confidence}")
491
-
492
- complex_query = "Explain quantum computing and its applications"
493
- result = router.route(complex_query)
494
- print(f"Query: {complex_query}")
495
- print(f"Recommended tier: {result.tier}") # Likely ModelTier.POWER
496
- print(f"Confidence: {result.confidence}")
497
-
498
- # Get recommended model configuration
499
- routing_result, model_config = router.route_and_configure(complex_query)
500
- print(f"Recommended config: {model_config}")
501
- ```
502
-
503
- ## Examples
504
-
505
- Check the `examples/` directory for comprehensive usage examples:
506
-
507
- - **LLM Examples**: `examples/llm/` - OpenAI, Ollama, LlamaCPP, token tracking
508
- - **Goal Management**: `examples/goals/` - Goal decomposition and planning
509
- - **Memory Examples**: `examples/memory/` - Memory agent operations
510
- - **Vector Store**: `examples/vector_store/` - PGVector and Weaviate usage
511
- - **Message Bus**: `examples/msgbus/` - Event-driven patterns
512
- - **Tools**: Various toolkit implementations
513
-
514
- ## Development
515
-
516
- ```bash
517
- # Install development dependencies
518
- make install
519
-
520
- # Run tests
521
- make test
522
-
523
- # Run specific test categories
524
- make test-unit # Unit tests only
525
- make test-integration # Integration tests only
526
-
527
- # Format code
528
- make format
529
- ```
530
-
531
- ## License
532
-
533
- MIT License - see [LICENSE](LICENSE) file for details.