@nextsparkjs/plugin-langchain 0.1.0-beta.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (67)
  1. package/.env.example +41 -0
  2. package/api/observability/metrics/route.ts +110 -0
  3. package/api/observability/traces/[traceId]/route.ts +398 -0
  4. package/api/observability/traces/route.ts +205 -0
  5. package/api/sessions/route.ts +332 -0
  6. package/components/observability/CollapsibleJson.tsx +71 -0
  7. package/components/observability/CompactTimeline.tsx +75 -0
  8. package/components/observability/ConversationFlow.tsx +271 -0
  9. package/components/observability/DisabledMessage.tsx +21 -0
  10. package/components/observability/FiltersPanel.tsx +82 -0
  11. package/components/observability/ObservabilityDashboard.tsx +230 -0
  12. package/components/observability/SpansList.tsx +210 -0
  13. package/components/observability/TraceDetail.tsx +335 -0
  14. package/components/observability/TraceStatusBadge.tsx +39 -0
  15. package/components/observability/TracesTable.tsx +97 -0
  16. package/components/observability/index.ts +7 -0
  17. package/docs/01-getting-started/01-overview.md +196 -0
  18. package/docs/01-getting-started/02-installation.md +368 -0
  19. package/docs/01-getting-started/03-configuration.md +794 -0
  20. package/docs/02-core-concepts/01-architecture.md +566 -0
  21. package/docs/02-core-concepts/02-agents.md +597 -0
  22. package/docs/02-core-concepts/03-tools.md +689 -0
  23. package/docs/03-orchestration/01-graph-orchestrator.md +809 -0
  24. package/docs/03-orchestration/02-legacy-react.md +650 -0
  25. package/docs/04-advanced/01-observability.md +645 -0
  26. package/docs/04-advanced/02-token-tracking.md +469 -0
  27. package/docs/04-advanced/03-streaming.md +476 -0
  28. package/docs/04-advanced/04-guardrails.md +597 -0
  29. package/docs/05-reference/01-api-reference.md +1403 -0
  30. package/docs/05-reference/02-customization.md +646 -0
  31. package/docs/05-reference/03-examples.md +881 -0
  32. package/docs/index.md +85 -0
  33. package/hooks/observability/useMetrics.ts +31 -0
  34. package/hooks/observability/useTraceDetail.ts +48 -0
  35. package/hooks/observability/useTraces.ts +59 -0
  36. package/lib/agent-factory.ts +354 -0
  37. package/lib/agent-helpers.ts +201 -0
  38. package/lib/db-memory-store.ts +417 -0
  39. package/lib/graph/index.ts +58 -0
  40. package/lib/graph/nodes/combiner.ts +399 -0
  41. package/lib/graph/nodes/router.ts +440 -0
  42. package/lib/graph/orchestrator-graph.ts +386 -0
  43. package/lib/graph/prompts/combiner.md +131 -0
  44. package/lib/graph/prompts/router.md +193 -0
  45. package/lib/graph/types.ts +365 -0
  46. package/lib/guardrails.ts +230 -0
  47. package/lib/index.ts +44 -0
  48. package/lib/logger.ts +70 -0
  49. package/lib/memory-store.ts +168 -0
  50. package/lib/message-serializer.ts +110 -0
  51. package/lib/prompt-renderer.ts +94 -0
  52. package/lib/providers.ts +226 -0
  53. package/lib/streaming.ts +232 -0
  54. package/lib/token-tracker.ts +298 -0
  55. package/lib/tools-builder.ts +192 -0
  56. package/lib/tracer-callbacks.ts +342 -0
  57. package/lib/tracer.ts +350 -0
  58. package/migrations/001_langchain_memory.sql +83 -0
  59. package/migrations/002_token_usage.sql +127 -0
  60. package/migrations/003_observability.sql +257 -0
  61. package/package.json +28 -0
  62. package/plugin.config.ts +170 -0
  63. package/presets/lib/langchain.config.ts.preset +142 -0
  64. package/presets/templates/sector7/ai-observability/[traceId]/page.tsx +91 -0
  65. package/presets/templates/sector7/ai-observability/page.tsx +54 -0
  66. package/types/langchain.types.ts +274 -0
  67. package/types/observability.types.ts +270 -0

package/docs/01-getting-started/01-overview.md
@@ -0,0 +1,196 @@

# LangChain Plugin - Overview

## Introduction

The LangChain Plugin gives themes a comprehensive AI agent infrastructure for integrating conversational AI capabilities into their applications. Built on top of [LangChain](https://langchain.com/) and [LangGraph](https://langchain-ai.github.io/langgraph/), it provides a complete framework for creating, configuring, and deploying AI agents with:

- **Multiple LLM Provider Support**: OpenAI, Anthropic, and Ollama (local)
- **Persistent Conversation Memory**: Database-backed session management
- **Tool-based Architecture**: Agents interact with your data through well-defined tools
- **Multi-tenant Security**: Row-Level Security (RLS) for data isolation
- **Orchestration Patterns**: Route requests to specialized agents

## Core Concepts

### What is an AI Agent?

An AI agent is a system that uses a Large Language Model (LLM) to reason about problems and take actions through tools. Unlike simple chat interfaces, agents can:

1. **Reason** about the user's intent
2. **Plan** a sequence of actions
3. **Execute** tools to retrieve or modify data
4. **Respond** with accurate, data-driven answers

```
User Message → Agent (LLM + Tools) → Tool Execution → Response
```

### Plugin vs Theme Responsibility

This plugin follows a clear separation of concerns:

| Plugin Provides | Theme Implements |
|-----------------|------------------|
| Agent creation infrastructure | Agent definitions and configuration |
| Memory persistence | Entity-specific tools |
| Provider abstraction | System prompts |
| Tool building utilities | API endpoints |
| Message serialization | Orchestration logic |

The plugin is **entity-agnostic**: it doesn't know about your tasks, customers, or pages. Themes define what data agents can access and how.

## Architecture Overview

```
┌─────────────────────────────────────────────────────────────────┐
│                           THEME LAYER                           │
├─────────────────────────────────────────────────────────────────┤
│  langchain.config.ts   │    agents/*.md     │    tools/*.ts     │
│  (Agent Definitions)   │     (Prompts)      │  (Entity Tools)   │
├─────────────────────────────────────────────────────────────────┤
│                          PLUGIN LAYER                           │
├─────────────────────────────────────────────────────────────────┤
│  createAgent()  │  memoryStore   │   providers   │  buildTools  │
│    (Factory)    │ (Persistence)  │ (LLM Access)  │  (Helpers)   │
├─────────────────────────────────────────────────────────────────┤
│                       LANGCHAIN/LANGGRAPH                       │
├─────────────────────────────────────────────────────────────────┤
│  LLM Providers  │ ReAct Pattern  │ Message Types │  Tool Calls  │
└─────────────────────────────────────────────────────────────────┘
```

## Key Features

### 1. Multi-Provider Support

Switch between LLM providers without changing your agent code:

```typescript
// Use local Ollama for development
{ provider: 'ollama', model: 'llama3.2:3b' }

// Use OpenAI for production
{ provider: 'openai', model: 'gpt-4o' }

// Use Anthropic for specific agents
{ provider: 'anthropic', model: 'claude-3-5-sonnet-20241022' }
```
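
In practice these objects are the `modelConfig` passed to the plugin's agent factory, so swapping providers is a configuration change rather than a code change. A minimal sketch, assuming the `createAgent` API shown in [Installation](./02-installation.md) and an environment where the chosen provider is configured:

```typescript
import { createAgent } from '@/contents/plugins/langchain/lib/agent-factory'

// Same agent code in every environment; only the model configuration changes.
// The NODE_ENV switch is illustrative, not a plugin convention.
const modelConfig =
  process.env.NODE_ENV === 'production'
    ? { provider: 'openai' as const, model: 'gpt-4o', temperature: 0.3 }
    : { provider: 'ollama' as const, model: 'llama3.2:3b', temperature: 0.3 }

const agent = await createAgent({
  sessionId: 'demo-session',
  systemPrompt: 'You are a helpful assistant.',
  tools: [],
  modelConfig,
})

const response = await agent.chat('Hello!')
console.log(response.content)
```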

### 2. Persistent Conversations

All conversations are automatically persisted to PostgreSQL with:

- **Session Management**: Create, list, rename, pin conversations
- **Message Limits**: Configurable sliding window (default: 50 messages)
- **Conversation Limits**: Maximum 50 conversations per user
- **Multi-tenancy**: Isolated by userId + teamId

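Persistence is handled by the plugin's memory store, so themes never touch the `langchain_sessions` table directly. A minimal sketch using the `memoryStore` API demonstrated in [Installation](./02-installation.md) (the `userId`/`teamId` values are placeholders):

```typescript
import { memoryStore } from '@/contents/plugins/langchain/lib/memory-store'

const context = { userId: 'user-123', teamId: 'team-456' }

// Sessions are created per user/team; the sessionId is generated for you.
const { sessionId } = await memoryStore.createSession(context, 'Support Chat')

// Later: enumerate this user's conversations for a sidebar or history view.
const sessions = await memoryStore.listSessions(context)
console.log(sessions.length)
```
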
### 3. Tool-Based Data Access

Agents interact with your data through tools, not direct database access:

```typescript
import { z } from 'zod'

// `TasksService` stands in for the theme's own data service; `userId`
// typically comes from the tool context the theme passes to the agent.
const taskTools = [
  {
    name: 'list_tasks',
    description: 'List all tasks for the current user',
    schema: z.object({ status: z.string().optional() }),
    func: async ({ status }) => TasksService.list(userId, { status }),
  },
]
```

### 4. Orchestration Patterns

Route requests to specialized agents based on user intent:

```
User: "Show me my tasks"
  → Orchestrator analyzes intent
  → Routes to Task Agent
  → Task Agent uses task tools
  → Returns task list
```
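
Each specialized agent is just another entry in the theme's agent configuration; the routing decision itself is made by the plugin's orchestration layer (see the Orchestration docs). A sketch of what two specialized agents might look like, assuming the `AgentDefinition` shape used in [Installation](./02-installation.md) and hypothetical theme tool factories `createTaskTools`/`createCustomerTools`:

```typescript
import type { AgentDefinition, ToolContext } from '@/contents/plugins/langchain/types/langchain.types'
// Hypothetical theme tool factories; see the Tools docs for how to build them.
import { createTaskTools } from './tools/tasks'
import { createCustomerTools } from './tools/customers'

export const AGENTS: Record<string, AgentDefinition> = {
  'task-agent': {
    provider: 'ollama',
    temperature: 0.2,
    description: 'Answers questions about tasks and creates or updates them',
    systemPrompt: 'task-agent', // loads agents/task-agent.md
    createTools: (context: ToolContext) => createTaskTools(context),
  },
  'customer-agent': {
    provider: 'openai',
    temperature: 0.3,
    description: 'Looks up customers and summarizes their activity',
    systemPrompt: 'customer-agent', // loads agents/customer-agent.md
    createTools: (context: ToolContext) => createCustomerTools(context),
  },
}
```

The `description` fields matter here: an orchestrator typically uses them to decide which agent should handle a given request.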

## Use Cases

### Single Agent Architecture

Best for simple applications with unified data access:

- Personal productivity apps
- Small business tools
- Prototypes and MVPs

```
User → Single Agent (all tools) → Response
```

### Multi-Agent Orchestration

Best for complex applications with specialized domains:

- Enterprise CRM systems
- Content management platforms
- E-commerce with multiple entities

```
User → Orchestrator → [Task Agent | Customer Agent | Page Agent] → Response
```

## Getting Started

To integrate the LangChain plugin into your theme:

1. **Run the Database Migration**: Create the `langchain_sessions` table
2. **Configure Environment**: Set up LLM provider credentials
3. **Create Configuration**: Define agents in `langchain.config.ts`
4. **Build Tools**: Create tools for your entities
5. **Write Prompts**: Define agent behavior in markdown files
6. **Expose API**: Create chat endpoints for your frontend

Continue to the [Architecture](../02-core-concepts/01-architecture.md) section for detailed technical concepts, or jump to [Installation](./02-installation.md) to get started immediately.

## Documentation Structure

See the full [Documentation Index](../index.md) for navigation.

### Getting Started

| Section | Description |
|---------|-------------|
| [Overview](./01-overview.md) | Introduction and core concepts (you are here) |
| [Installation](./02-installation.md) | Setup and dependencies |
| [Configuration](./03-configuration.md) | Theme-level configuration |

### Core Concepts

| Section | Description |
|---------|-------------|
| [Architecture](../02-core-concepts/01-architecture.md) | Technical architecture and patterns |
| [Agents](../02-core-concepts/02-agents.md) | Creating and customizing agents |
| [Tools](../02-core-concepts/03-tools.md) | Building tools for agents |

### Orchestration

| Section | Description |
|---------|-------------|
| [Graph Orchestrator](../03-orchestration/01-graph-orchestrator.md) | **Recommended** - Modern state-machine orchestration |
| [Legacy ReAct](../03-orchestration/02-legacy-react.md) | Deprecated ReAct-based approach |

### Advanced Topics

| Section | Description |
|---------|-------------|
| [Observability](../04-advanced/01-observability.md) | Tracing, metrics, and debugging dashboard |
| [Token Tracking](../04-advanced/02-token-tracking.md) | Token usage and cost monitoring |
| [Streaming](../04-advanced/03-streaming.md) | Real-time SSE streaming responses |
| [Guardrails](../04-advanced/04-guardrails.md) | Security: injection detection, PII masking |

### Reference

| Section | Description |
|---------|-------------|
| [API Reference](../05-reference/01-api-reference.md) | Complete API documentation |
| [Customization](../05-reference/02-customization.md) | Advanced customization guide |
| [Examples](../05-reference/03-examples.md) | Real-world implementation examples |

package/docs/01-getting-started/02-installation.md
@@ -0,0 +1,368 @@

# Installation

This guide covers the complete setup process for integrating the LangChain plugin into your theme.

## Prerequisites

Before installing, ensure you have:

- PostgreSQL database (Supabase recommended)
- Node.js 18+ with pnpm
- At least one LLM provider configured (Ollama recommended for development)

## Step 1: Database Migration

The plugin requires a `langchain_sessions` table for conversation persistence.

### Run the Migration

```bash
# Navigate to the migrations folder
cd contents/plugins/langchain/migrations

# Apply the migration to your database
psql $DATABASE_URL -f 001_langchain_memory.sql
```

Or using your migration tool:

```bash
pnpm db:migrate
```

### Migration Contents

The migration creates:

```sql
-- Table for conversation storage
CREATE TABLE public."langchain_sessions" (
  id            TEXT PRIMARY KEY,
  "userId"      TEXT NOT NULL REFERENCES users(id),
  "teamId"      TEXT NOT NULL REFERENCES teams(id),
  "sessionId"   TEXT NOT NULL,
  name          TEXT DEFAULT NULL,
  "isPinned"    BOOLEAN DEFAULT false,
  messages      JSONB NOT NULL DEFAULT '[]',
  metadata      JSONB DEFAULT '{}',
  "maxMessages" INTEGER DEFAULT 50,
  "expiresAt"   TIMESTAMPTZ DEFAULT NULL,
  "createdAt"   TIMESTAMPTZ DEFAULT now(),
  "updatedAt"   TIMESTAMPTZ DEFAULT now(),
  UNIQUE ("userId", "teamId", "sessionId")
);

-- Performance indexes
CREATE INDEX idx_sessions_lookup ON langchain_sessions(...);
CREATE INDEX idx_sessions_pinned ON langchain_sessions(...);
CREATE INDEX idx_sessions_updated ON langchain_sessions(...);

-- Row-Level Security
ALTER TABLE langchain_sessions ENABLE ROW LEVEL SECURITY;
```

## Step 2: Environment Variables

Configure your LLM provider(s) in `.env`:

### Required Variables

```env
# Enable the plugin
LANGCHAIN_PLUGIN_ENABLED=true

# Optional: Enable debug console logging
LANGCHAIN_PLUGIN_DEBUG=false

# Enable file logging (core environment variable)
LOG_ENABLED=true  # Logs to logger/ai/
```

### Provider Configuration

#### Option A: Ollama (Recommended for Development)

Ollama runs locally and is free. Install from [ollama.com](https://ollama.com):

```bash
# Install Ollama (macOS)
brew install ollama

# Pull a model
ollama pull llama3.2:3b   # Fast, lightweight
ollama pull qwen2.5:7b    # Better quality
ollama pull mistral       # Good balance

# Start Ollama server
ollama serve
```

Configure in `.env`:

```env
LANGCHAIN_OLLAMA_BASE_URL=http://localhost:11434
LANGCHAIN_OLLAMA_MODEL=qwen2.5:7b
```

#### Option B: OpenAI

```env
OPENAI_API_KEY=sk-...
LANGCHAIN_OPENAI_MODEL=gpt-4o-mini
```

#### Option C: Anthropic

```env
ANTHROPIC_API_KEY=sk-ant-...
LANGCHAIN_ANTHROPIC_MODEL=claude-3-5-sonnet-20241022
```

#### Option D: LM Studio (OpenAI-Compatible)

LM Studio provides a local server with OpenAI API compatibility:

```env
# Point OpenAI provider to LM Studio
LANGCHAIN_OPENAI_BASE_URL=http://localhost:1234/v1
LANGCHAIN_OPENAI_MODEL=local-model
# API key not required for local
```

### Complete Example

```env
# Plugin settings
LANGCHAIN_PLUGIN_ENABLED=true
LANGCHAIN_PLUGIN_DEBUG=false

# File logging (core)
LOG_ENABLED=true  # Logs to logger/ai/

# Ollama (development)
LANGCHAIN_OLLAMA_BASE_URL=http://localhost:11434
LANGCHAIN_OLLAMA_MODEL=qwen2.5:7b

# OpenAI (production fallback)
OPENAI_API_KEY=sk-...
LANGCHAIN_OPENAI_MODEL=gpt-4o-mini

# Anthropic (optional)
ANTHROPIC_API_KEY=sk-ant-...
LANGCHAIN_ANTHROPIC_MODEL=claude-3-5-sonnet-20241022
```

## Step 3: Theme Structure

Create the following structure in your theme:

```
contents/themes/your-theme/
├── lib/
│   └── langchain/
│       ├── langchain.config.ts    # Agent configuration
│       ├── orchestrator.ts        # (optional) Orchestration logic
│       ├── agents/
│       │   ├── index.ts           # Prompt loader
│       │   └── *.md               # System prompts
│       └── tools/
│           ├── index.ts           # Tool exports
│           └── *.ts               # Entity tools
└── api/
    └── ai/
        └── chat/
            └── route.ts           # Chat API endpoint
```

### Using Presets

The plugin provides presets as starting points:

```bash
# Copy configuration preset
cp contents/plugins/langchain/presets/lib/langchain.config.ts.preset \
   contents/themes/your-theme/lib/langchain/langchain.config.ts

# Copy API endpoint preset
mkdir -p contents/themes/your-theme/api/ai/chat
cp contents/plugins/langchain/presets/api/chat/route.ts.preset \
   contents/themes/your-theme/api/ai/chat/route.ts

# Copy tools preset
mkdir -p contents/themes/your-theme/lib/langchain/tools
cp contents/plugins/langchain/presets/lib/tools/entity-tools.ts.preset \
   contents/themes/your-theme/lib/langchain/tools/example.ts

# Copy prompt preset
mkdir -p contents/themes/your-theme/lib/langchain/agents
cp contents/plugins/langchain/presets/agents/entity-assistant.md.preset \
   contents/themes/your-theme/lib/langchain/agents/assistant.md
```
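
If you prefer to write the chat endpoint yourself, here is a minimal sketch of such a route, assuming a Next.js App Router handler and the `createAgent` / `agent.chat` API demonstrated in Step 5. Authentication, session resolution, streaming, and error handling are deliberately omitted; a real endpoint needs all of them.

```typescript
// contents/themes/your-theme/api/ai/chat/route.ts (illustrative sketch)
import { NextResponse } from 'next/server'
import { createAgent } from '@/contents/plugins/langchain/lib/agent-factory'

export async function POST(request: Request) {
  const { message, sessionId } = await request.json()

  // A real endpoint would resolve the authenticated user/team here and
  // load the agent's prompt and tools from the theme configuration.
  const agent = await createAgent({
    sessionId,
    systemPrompt: 'You are a helpful assistant.',
    tools: [],
    modelConfig: { provider: 'ollama', model: 'qwen2.5:7b', temperature: 0.3 },
  })

  const response = await agent.chat(message)
  return NextResponse.json({ content: response.content })
}
```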

## Step 4: Basic Configuration

Create your agent configuration:

```typescript
// contents/themes/your-theme/lib/langchain/langchain.config.ts

import type {
  ThemeLangChainConfig,
  AgentDefinition,
  ToolContext,
} from '@/contents/plugins/langchain/types/langchain.types'
import { createAgentHelpers } from '@/contents/plugins/langchain/lib/agent-helpers'
import { createMyTools } from './tools/my-entity'

export const AGENTS: Record<string, AgentDefinition> = {
  'my-assistant': {
    provider: 'ollama',
    temperature: 0.3,
    description: 'Assistant for my entity',
    systemPrompt: 'my-assistant', // loads agents/my-assistant.md
    createTools: (context: ToolContext) => createMyTools(context),
  },
}

export const langchainConfig: ThemeLangChainConfig = {
  defaultProvider: 'ollama',
  defaultTemperature: 0.3,
  agents: AGENTS,
}

// Export helpers
const helpers = createAgentHelpers(AGENTS, {
  provider: langchainConfig.defaultProvider,
  temperature: langchainConfig.defaultTemperature,
})

export const {
  getAgentConfig,
  getAgentModelConfig,
  getAgentTools,
  getAgentPromptName,
  hasAgent,
  getAgentNames,
} = helpers
```
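
The config above imports `createMyTools` from `./tools/my-entity`, which the theme must provide. A minimal sketch of that tools factory, assuming the tool shape shown in the Overview (`name`, `description`, `schema`, `func`); `MyEntityService` and the `userId`/`teamId` fields on `ToolContext` are assumptions for illustration:

```typescript
// contents/themes/your-theme/lib/langchain/tools/my-entity.ts (illustrative sketch)
import { z } from 'zod'
import type { ToolContext } from '@/contents/plugins/langchain/types/langchain.types'

// Placeholder for the theme's own data-access layer.
declare const MyEntityService: {
  list: (userId: string, teamId: string, filters: { search?: string }) => Promise<unknown[]>
}

export function createMyTools(context: ToolContext) {
  // userId/teamId are assumed to be available on ToolContext for multi-tenant scoping.
  const { userId, teamId } = context

  return [
    {
      name: 'list_my_entities',
      description: 'List entities that belong to the current user',
      schema: z.object({ search: z.string().optional() }),
      func: async ({ search }: { search?: string }) =>
        MyEntityService.list(userId, teamId, { search }),
    },
  ]
}
```

The `systemPrompt: 'my-assistant'` entry refers to a markdown prompt file loaded by the theme's prompt loader; as a purely hypothetical example, `agents/my-assistant.md` might contain nothing more than:

```markdown
You are an assistant for managing my-entity records.
Always use the provided tools to read or modify data; never invent records.
```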

## Step 5: Verify Installation

### Test Provider Connection

```typescript
import { isProviderAvailable, getAvailableProviders } from '@/contents/plugins/langchain/plugin.config'

// Check available providers
console.log('Available:', getAvailableProviders())
// ['ollama', 'openai'] (depends on env config)

// Check specific provider
console.log('Ollama:', isProviderAvailable('ollama'))
// true/false
```

### Test Agent Creation

```typescript
import { createAgent } from '@/contents/plugins/langchain/lib/agent-factory'

const agent = await createAgent({
  sessionId: 'test-session',
  systemPrompt: 'You are a helpful assistant. Respond with "Hello, I am working!"',
  tools: [],
  modelConfig: {
    provider: 'ollama',
    model: 'llama3.2:3b',
    temperature: 0.3,
  },
})

const response = await agent.chat('Test message')
console.log(response.content)
// Should receive a response from the LLM
```

### Test Memory Persistence

```typescript
import { memoryStore } from '@/contents/plugins/langchain/lib/memory-store'

const context = { userId: 'test-user', teamId: 'test-team' }

// Create session (sessionId is auto-generated)
const { sessionId } = await memoryStore.createSession(context, 'Test Chat')
console.log('Created session:', sessionId)

// List sessions
const sessions = await memoryStore.listSessions(context)
console.log('Sessions:', sessions.length)

// Clean up
await memoryStore.clearSession(sessionId, context)
```

## Troubleshooting

### Ollama Connection Failed

```
Error: Failed to connect to Ollama at http://localhost:11434
```

**Solution:**

```bash
# Check if Ollama is running
curl http://localhost:11434/api/tags

# Start Ollama
ollama serve

# Verify model is pulled
ollama list
```

### OpenAI API Key Invalid

```
Error: Invalid API Key
```

**Solution:**

- Verify `OPENAI_API_KEY` is set correctly
- Check the key starts with `sk-`
- Ensure the key has API access (not just ChatGPT access)

### Database Connection Error

```
Error: relation "langchain_sessions" does not exist
```

**Solution:**

```bash
# Run the migration
psql $DATABASE_URL -f contents/plugins/langchain/migrations/001_langchain_memory.sql
```

### Provider Not Available

```
Error: Provider 'openai' is not available. Configure OPENAI_API_KEY
```

**Solution:**

- Set the required environment variables
- Restart the development server
- Use `getAvailableProviders()` to check what's configured

## Next Steps

Now that the plugin is installed:

1. [Configure your agents](./03-configuration.md)
2. [Create custom tools](../02-core-concepts/03-tools.md)
3. [Set up graph orchestration](../03-orchestration/01-graph-orchestrator.md) (recommended)
4. [Configure observability](../04-advanced/01-observability.md)