@elizaos/plugin-knowledge 1.0.0-beta.70

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md ADDED
@@ -0,0 +1,195 @@
# Knowledge Plugin for ElizaOS

This plugin provides Retrieval-Augmented Generation (RAG) capabilities for ElizaOS agents, allowing them to load, index, and query knowledge from various sources.

## Quick Setup

### Basic Setup (With plugin-openai)

If you already have plugin-openai configured, you don't need any additional environment variables: the Knowledge plugin will automatically reuse your OpenAI configuration.

1. Make sure you have plugin-openai configured with:

```env
OPENAI_API_KEY=your-openai-api-key
OPENAI_EMBEDDING_MODEL=text-embedding-3-small
```

2. Add the Knowledge plugin to your agent's configuration.
3. That's it! The plugin will work without any additional variables.
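As a sketch of step 2 (the exact character shape may vary across ElizaOS versions, so verify field names against your version's documentation), registering the plugin alongside plugin-openai can look like:

```typescript
// Hypothetical character definition: the `plugins` array is the common
// ElizaOS shape, but check your ElizaOS version before relying on it.
const character = {
  name: "knowledge-agent",
  // plugin-openai supplies embeddings; plugin-knowledge builds on them
  plugins: ["@elizaos/plugin-openai", "@elizaos/plugin-knowledge"],
};
```

With the environment variables above set, no knowledge-specific configuration is needed in the character itself.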

### Enabling Contextual Knowledge

If you want enhanced Knowledge capabilities with contextual embeddings, add:

```env
# Enable contextual Knowledge
CTX_KNOWLEDGE_ENABLED=true

# Required text generation settings
TEXT_PROVIDER=openrouter                # Choose your provider: openai, anthropic, openrouter, or google
TEXT_MODEL=anthropic/claude-3.5-sonnet  # Model for your chosen provider

# Provider-specific API key (based on TEXT_PROVIDER)
OPENROUTER_API_KEY=your-openrouter-api-key
# OR ANTHROPIC_API_KEY=your-anthropic-api-key
# OR GOOGLE_API_KEY=your-google-api-key
# OR use existing OPENAI_API_KEY
```

### Custom Embedding Configuration (Without plugin-openai)

If you're not using plugin-openai or want to use different embedding settings:

```env
# Required embedding settings
EMBEDDING_PROVIDER=openai               # or google
TEXT_EMBEDDING_MODEL=text-embedding-3-small

# Provider-specific API key
OPENAI_API_KEY=your-openai-api-key      # if using openai
# OR GOOGLE_API_KEY=your-google-api-key # if using google

# Optional: custom embedding dimension
EMBEDDING_DIMENSION=1536
```

## Advanced Configuration

### Recommended Configurations for Contextual Knowledge

For optimal performance with contextual Knowledge, we recommend these provider combinations:

**Option 1: OpenRouter with Claude/Gemini (best for cost efficiency)**

```env
# If used with plugin-openai, only these additions are needed:
CTX_KNOWLEDGE_ENABLED=true
TEXT_PROVIDER=openrouter
TEXT_MODEL=anthropic/claude-3.5-sonnet  # or google/gemini-2.5-flash-preview
OPENROUTER_API_KEY=your-openrouter-api-key
```

**Option 2: OpenAI for Everything**

```env
# If used with plugin-openai, only these additions are needed:
CTX_KNOWLEDGE_ENABLED=true
TEXT_PROVIDER=openai
TEXT_MODEL=gpt-4o
```

**Option 3: Google AI for Everything**

```env
EMBEDDING_PROVIDER=google
TEXT_EMBEDDING_MODEL=text-embedding-004
TEXT_PROVIDER=google
TEXT_MODEL=gemini-1.5-pro-latest
GOOGLE_API_KEY=your-google-api-key
CTX_KNOWLEDGE_ENABLED=true
```

### Advanced Rate Limiting Options

```env
# Rate limiting (optional)
MAX_CONCURRENT_REQUESTS=30  # Default: 30
REQUESTS_PER_MINUTE=60      # Default: 60
TOKENS_PER_MINUTE=150000    # Default: 150000
```
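The plugin enforces these limits internally. Purely as an illustration (this is not the plugin's actual implementation), a `REQUESTS_PER_MINUTE`-style cap can be sketched as a sliding-window limiter:

```typescript
// Illustrative sliding-window limiter; hypothetical helper, not part of
// the @elizaos/plugin-knowledge API.
class RequestLimiter {
  private timestamps: number[] = [];
  constructor(
    private maxPerMinute: number,
    private now: () => number = () => Date.now() // injectable clock for testing
  ) {}

  /** Returns true if a request may proceed right now, and records it. */
  tryAcquire(): boolean {
    const cutoff = this.now() - 60_000;
    // Drop requests that fell out of the one-minute window
    this.timestamps = this.timestamps.filter((t) => t > cutoff);
    if (this.timestamps.length >= this.maxPerMinute) return false;
    this.timestamps.push(this.now());
    return true;
  }
}

// With a cap of 2 per minute, a third request inside the window is rejected:
let fakeClock = 0;
const limiter = new RequestLimiter(2, () => fakeClock);
console.log(limiter.tryAcquire(), limiter.tryAcquire(), limiter.tryAcquire());
// true true false
fakeClock += 61_000; // a minute later the window has cleared
console.log(limiter.tryAcquire()); // true
```

Raising `REQUESTS_PER_MINUTE` beyond your provider's published limits will simply move the throttling from the plugin to the provider's 429 responses.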

### Custom API Endpoints

```env
# Only needed if using custom API endpoints
OPENAI_BASE_URL=https://your-openai-proxy.com/v1
ANTHROPIC_BASE_URL=https://your-anthropic-proxy.com
OPENROUTER_BASE_URL=https://your-openrouter-proxy.com/api/v1
GOOGLE_BASE_URL=https://your-google-proxy.com
```

### Token Limits

```env
# Advanced token handling (optional)
MAX_INPUT_TOKENS=4000   # Default: 4000
MAX_OUTPUT_TOKENS=4096  # Default: 4096
```

## Architecture

The plugin is built with a modular, clean architecture that follows SOLID principles:

```
packages/plugin-knowledge/
├── src/
│   ├── index.ts              # Main entry point and plugin definition
│   ├── service.ts            # Knowledge service implementation
│   ├── types.ts              # Type definitions
│   ├── llm.ts                # LLM interactions (text generation, embeddings)
│   ├── config.ts             # Configuration validation
│   ├── ctx-embeddings.ts     # Contextual embedding generation
│   ├── document-processor.ts # Shared document processing utilities
│   └── utils.ts              # Utility functions
├── README.md                 # This file
└── package.json              # Package definition
```

### Database-Specific Processing Paths

The Knowledge plugin adapts to the database technology being used:

1. **PostgreSQL Mode**: Uses worker threads to offload document processing from the main thread
2. **PGLite Mode**: Uses synchronous processing in the main thread due to PGLite's single-threaded nature

This allows the plugin to work optimally with both databases while maintaining the same functionality.
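A minimal sketch of this dispatch (names are hypothetical; the real implementation lives in `service.ts` and uses actual worker threads in PostgreSQL mode):

```typescript
// Illustrative dispatch between the two processing paths.
type DbMode = "postgres" | "pglite";

async function processInWorker(doc: string): Promise<string> {
  // Stand-in for offloading to a worker thread (PostgreSQL mode)
  return `worker:${doc}`;
}

async function processInline(doc: string): Promise<string> {
  // Stand-in for main-thread processing (PGLite mode)
  return `inline:${doc}`;
}

async function processDocument(mode: DbMode, doc: string): Promise<string> {
  // PGLite is single-threaded, so its documents are processed inline
  return mode === "postgres" ? processInWorker(doc) : processInline(doc);
}
```

Because both paths share the document-processor utilities, the choice of database changes where the work runs, not what it produces.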

### Processing Flow

The document processing flow follows these steps regardless of database type:

1. Extract text from the document based on content type
2. Store the main document in the database
3. Split the document into chunks
4. Generate embeddings for each chunk (with optional context enrichment)
5. Store the chunks with embeddings in the database

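The steps above can be sketched as a small pipeline. This is illustrative only: the chunk size, overlap, and embedding calls in the real plugin are configurable and provider-backed, and every helper below is a stand-in.

```typescript
// Step 3 stand-in: fixed-size chunks with overlap so context is not cut
// mid-thought. Real chunking in the plugin is more sophisticated.
function splitIntoChunks(text: string, chunkSize = 20, overlap = 5): string[] {
  const chunks: string[] = [];
  for (let start = 0; start < text.length; start += chunkSize - overlap) {
    chunks.push(text.slice(start, start + chunkSize));
  }
  return chunks;
}

// Step 4 stand-in: a real implementation calls the configured
// embedding provider and returns a high-dimensional vector.
async function fakeEmbed(chunk: string): Promise<number[]> {
  return [chunk.length, chunk.charCodeAt(0) || 0];
}

async function processDocumentText(text: string) {
  const chunks = splitIntoChunks(text);                        // Step 3
  const embeddings = await Promise.all(chunks.map(fakeEmbed)); // Step 4
  // Steps 2 and 5 would persist the document and the chunk embeddings
  return { chunks, embeddings };
}
```

The overlap between consecutive chunks is what lets a query match text that straddles a chunk boundary.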
## Component Overview

- **KnowledgeService**: Core service that manages document processing and storage
- **Document Processor**: Shared document processing utilities used by both processing paths

## Features

- Document upload and processing (PDF, text, and other formats)
- Contextual chunking and embedding generation
- Robust error handling and recovery
- Rate limiting that respects provider limitations
- Support for multiple LLM providers

## Usage

### Basic Usage

```typescript
import { KnowledgeService } from '@elizaos/plugin-knowledge';

// Obtain the service instance from your agent runtime, then add knowledge:
const result = await knowledgeService.addKnowledge({
  clientDocumentId: 'unique-id',
  content: documentContent, // Base64 string for binary files, plain text for text files
  contentType: 'application/pdf',
  originalFilename: 'document.pdf',
  worldId: 'world-id',
  roomId: 'optional-room-id', // Optional scoping
  entityId: 'optional-entity-id', // Optional scoping
});

console.log(`Document stored with ID: ${result.storedDocumentMemoryId}`);
console.log(`Created ${result.fragmentCount} searchable fragments`);
```

## License

See the ElizaOS license for details.
@@ -0,0 +1,129 @@
// src/docs-loader.ts
import { logger } from "@elizaos/core";
import * as fs from "fs";
import * as path from "path";

// Get the knowledge path from KNOWLEDGE_PATH or default to ./docs
function getKnowledgePath() {
  const envPath = process.env.KNOWLEDGE_PATH;
  if (envPath) {
    // Resolve relative paths from the current working directory
    const resolvedPath = path.resolve(envPath);
    if (!fs.existsSync(resolvedPath)) {
      logger.warn(
        `Knowledge path from environment variable does not exist: ${resolvedPath}`
      );
      logger.warn(
        "Please create the directory or update KNOWLEDGE_PATH environment variable"
      );
    }
    return resolvedPath;
  }
  // Default to the docs folder in the current working directory
  const defaultPath = path.join(process.cwd(), "docs");
  if (!fs.existsSync(defaultPath)) {
    logger.info(`Default docs folder does not exist at: ${defaultPath}`);
    logger.info("To use the knowledge plugin, either:");
    logger.info('1. Create a "docs" folder in your project root');
    logger.info(
      "2. Set KNOWLEDGE_PATH environment variable to your documents folder"
    );
  }
  return defaultPath;
}

// Load all supported documents under the knowledge path into the service
async function loadDocsFromPath(service, agentId, worldId) {
  const docsPath = getKnowledgePath();
  if (!fs.existsSync(docsPath)) {
    logger.warn(`Knowledge path does not exist: ${docsPath}`);
    return { total: 0, successful: 0, failed: 0 };
  }
  logger.info(`Loading documents from: ${docsPath}`);
  const files = getAllFiles(docsPath);
  if (files.length === 0) {
    logger.info("No files found in knowledge path");
    return { total: 0, successful: 0, failed: 0 };
  }
  logger.info(`Found ${files.length} files to process`);
  let successful = 0;
  let failed = 0;
  for (const filePath of files) {
    try {
      const fileName = path.basename(filePath);
      const fileExt = path.extname(filePath).toLowerCase();
      // Skip hidden files
      if (fileName.startsWith(".")) {
        continue;
      }
      // Skip unsupported file types
      const contentType = getContentType(fileExt);
      if (!contentType) {
        logger.debug(`Skipping unsupported file type: ${filePath}`);
        continue;
      }
      const fileBuffer = fs.readFileSync(filePath);
      // Binary formats are passed to addKnowledge base64-encoded
      const isBinary = [
        "application/pdf",
        "application/msword",
        "application/vnd.openxmlformats-officedocument.wordprocessingml.document"
      ].includes(contentType);
      const content = isBinary ? fileBuffer.toString("base64") : fileBuffer.toString("utf-8");
      const knowledgeOptions = {
        clientDocumentId: `${agentId}-docs-${Date.now()}-${fileName}`,
        contentType,
        originalFilename: fileName,
        worldId: worldId || agentId,
        content
      };
      logger.debug(`Processing document: ${fileName}`);
      const result = await service.addKnowledge(knowledgeOptions);
      logger.info(
        `Successfully processed ${fileName}: ${result.fragmentCount} fragments created`
      );
      successful++;
    } catch (error) {
      logger.error(`Failed to process file ${filePath}:`, error);
      failed++;
    }
  }
  logger.info(
    `Document loading complete: ${successful} successful, ${failed} failed out of ${files.length} total`
  );
  return {
    total: files.length,
    successful,
    failed
  };
}

// Recursively collect all files under dirPath, skipping common build/VCS dirs
function getAllFiles(dirPath, files = []) {
  try {
    const entries = fs.readdirSync(dirPath, { withFileTypes: true });
    for (const entry of entries) {
      const fullPath = path.join(dirPath, entry.name);
      if (entry.isDirectory()) {
        if (!["node_modules", ".git", ".vscode", "dist", "build"].includes(
          entry.name
        )) {
          getAllFiles(fullPath, files);
        }
      } else if (entry.isFile()) {
        files.push(fullPath);
      }
    }
  } catch (error) {
    logger.error(`Error reading directory ${dirPath}:`, error);
  }
  return files;
}

// Map a file extension to the content type expected by addKnowledge
function getContentType(extension) {
  const contentTypes = {
    ".txt": "text/plain",
    ".md": "text/plain",
    ".json": "text/plain",
    ".xml": "text/plain",
    ".csv": "text/plain",
    ".html": "text/html",
    ".pdf": "application/pdf",
    ".doc": "application/msword",
    ".docx": "application/vnd.openxmlformats-officedocument.wordprocessingml.document"
  };
  return contentTypes[extension] || null;
}

export {
  getKnowledgePath,
  loadDocsFromPath
};
//# sourceMappingURL=docs-loader-3LDO3WCY.js.map
@@ -0,0 +1,14 @@
import { Plugin } from '@elizaos/core';

/**
 * Knowledge Plugin - Main Entry Point
 *
 * This file exports all the necessary functions and types for the Knowledge plugin.
 */

/**
 * Knowledge Plugin - Provides Retrieval Augmented Generation capabilities
 */
declare const knowledgePlugin: Plugin;

export { knowledgePlugin as default, knowledgePlugin };