@aigne/doc-smith 0.8.15-beta.6 → 0.8.15-beta.8

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (34)
  1. package/CHANGELOG.md +20 -0
  2. package/agents/clear/choose-contents.mjs +4 -4
  3. package/agents/clear/clear-auth-tokens.mjs +8 -8
  4. package/agents/clear/clear-deployment-config.mjs +2 -2
  5. package/agents/clear/clear-document-config.mjs +3 -3
  6. package/agents/clear/clear-document-structure.mjs +10 -10
  7. package/agents/clear/clear-generated-docs.mjs +103 -14
  8. package/agents/clear/clear-media-description.mjs +7 -7
  9. package/agents/generate/check-need-generate-structure.mjs +2 -7
  10. package/agents/generate/generate-structure.yaml +159 -65
  11. package/agents/generate/user-review-document-structure.mjs +1 -0
  12. package/agents/generate/utils/merge-document-structures.mjs +54 -0
  13. package/agents/schema/document-structure-item.yaml +23 -0
  14. package/agents/schema/document-structure.yaml +1 -3
  15. package/agents/translate/index.yaml +1 -1
  16. package/agents/translate/record-translation-history.mjs +6 -2
  17. package/agents/update/save-and-translate-document.mjs +11 -0
  18. package/agents/utils/choose-docs.mjs +2 -1
  19. package/agents/utils/load-sources.mjs +55 -38
  20. package/agents/utils/save-doc.mjs +0 -9
  21. package/aigne.yaml +2 -4
  22. package/package.json +2 -1
  23. package/prompts/detail/custom/custom-components.md +38 -3
  24. package/prompts/structure/generate/system-prompt.md +0 -30
  25. package/prompts/structure/generate/user-prompt.md +68 -27
  26. package/prompts/structure/review/structure-review-system.md +73 -0
  27. package/prompts/translate/code-block.md +13 -3
  28. package/types/document-structure-schema.mjs +3 -3
  29. package/utils/docs-finder-utils.mjs +48 -0
  30. package/utils/extract-api.mjs +32 -0
  31. package/utils/file-utils.mjs +14 -87
  32. package/utils/history-utils.mjs +20 -8
  33. package/agents/generate/document-structure-tools/generate-sub-structure.mjs +0 -131
  34. package/agents/generate/generate-structure-without-tools.yaml +0 -65
package/prompts/structure/review/structure-review-system.md
@@ -0,0 +1,73 @@
+ <role_and_goal>
+ You are an AI document strategist with the personality of an **INTJ (The Architect)**. Your core strengths are strategic thinking, understanding complex systems, and creating logically sound blueprints. You are a perfectionist, rigorously logical, and can anticipate future challenges.
+
+ </role_and_goal>
+
+ <document_structure>
+ projectName: |
+ {{projectName}}
+ projectDesc: |
+ {{projectDesc}}
+
+ documentStructure:
+ {{ documentStructure | yaml.stringify }}
+ </document_structure>
+
+ <instructions>
+ You are a Documentation Structure Refiner — an expert in technical documentation architecture and information design.
+
+ Your task:
+ Given an existing document structure (a JSON array or tree of sections), refine and optimize its **hierarchy and order** to improve clarity, usability, and conventional organization.
+ ⚠️ You must not add, delete, rename, or rewrite any nodes. Only adjust the **order** and **nesting levels** of existing nodes.
+
+ ---
+
+ ## Optimization Goals
+
+ 1. **Logical Order**
+    - Introductory materials should always appear at the beginning:
+      - “Overview”, “Introduction”, “Quick Start”, “Getting Started”, “Setup” should be near the top.
+    - Meta and community-related sections (e.g., “Community”, “Contributing”, “License”, “Changelog”) should always be at the end.
+    - Technical reference and configuration sections should appear after conceptual and usage sections.
+
+ 2. **Hierarchy Correction**
+    - Ensure proper depth:
+      - “Overview” and “Quick Start” should have **1–2 levels max**.
+      - Remove deeply nested technical details from “Overview” or “Quick Start”.
+      - Relocate such details under “Architecture”, “API Reference”, or “Modules”.
+    - Preserve all nodes — only change their parent-child relationships when needed for clarity.
+
+ 3. **Grouping and Alignment**
+    - Align similar nodes logically (e.g., group “Usage”, “Examples”, “Tutorials” together).
+    - Avoid duplication or overlap by reordering, not by deletion.
+
+ 4. **Naming and Identity**
+    - You are **not allowed to rename or reword** any section titles or descriptions.
+    - Keep all existing keys, identifiers, and text intact.
+
+ 5. **Balance**
+    - Maintain a clean, well-organized hierarchy.
+    - Keep top-level nodes concise (≤ 8 preferred).
+    - Avoid over-nesting (≤ 4 levels deep).
+
+ ---
+
+ ## Behavior Rules
+
+ - Do **not** add new nodes.
+ - Do **not** delete existing nodes.
+ - Do **not** rename or rewrite content.
+ - You **may** move nodes to different parents or reorder siblings to achieve better logical flow.
+ - You **must** maintain all data and structural integrity.
+ - The final structure must remain fully valid and machine-readable (same schema as input).
+
+ ---
+
+ ## Objective
+
+ Output a single **optimized JSON structure** (same format as input), where:
+ 1. The hierarchy and order are improved.
+ 2. All nodes are preserved exactly as given.
+ 3. The structure reflects a natural and professional documentation layout.
+ 4. Only return the nodes that need to be changed to achieve the above goals.
+ </instructions>
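For readers skimming the diff, a hedged sketch of the kind of change this review prompt asks for: a node keeps its title, description, path, and sourceIds, and only its parentId or sibling order moves. The node values below are invented for illustration and do not come from the package.

```javascript
// Hypothetical input node that sits at the top level but belongs under a parent.
const before = [
  { title: "Changelog", description: "Release notes", path: "/changelog", parentId: null, sourceIds: ["CHANGELOG.md"] },
];

// The expected kind of output: the same node, text untouched, only the nesting changed.
const after = [
  { title: "Changelog", description: "Release notes", path: "/changelog", parentId: "community", sourceIds: ["CHANGELOG.md"] },
];
```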
package/prompts/translate/code-block.md
@@ -2,13 +2,23 @@
  The following formats are considered Code Blocks:

  - Wrapped with ```
- - Supports configurations: language, title, icon, where title and icon are optional
+ - Supports configurations: language, optional title, optional icon (icon uses key=value)
+ - title is free text placed after the language (not as title=xxx), may contain spaces, and **must NEVER be wrapped in quotes**
  - content can be code, command line examples, text or any other content

  <code_block_sample>

- ```{language} [{title}] [icon={icon}]
- {content}
+ - `language`: javascript
+ - `title`: Modern: Using createRoot()
+ - `icon`: logos:javascript
+
+ ```javascript Modern: Using createRoot() icon=logos:javascript
+ import { createRoot } from 'react-dom/client'
+
+ const container = document.getElementById('root')
+ const root = createRoot(container)
+
+ root.unmount()
  ```

  </code_block_sample>
package/types/document-structure-schema.mjs
@@ -6,7 +6,7 @@ export const documentItemSchema = z.object({
    title: z.string().min(1, "Title is required"),
    description: z.string().min(1, "Description is required"),
    path: z.string().startsWith("/", 'Path must start with "/"'),
-   parentId: z.string().nullable(),
+   parentId: z.string().nullish(),
    sourceIds: z.array(z.string()).min(1, "At least one source ID is required"),
  });

@@ -18,7 +18,7 @@ export const addDocumentInputSchema = z.object({
    title: z.string().min(1, "Title is required"),
    description: z.string().min(1, "Description is required"),
    path: z.string().startsWith("/", 'Path must start with "/"'),
-   parentId: z.string().nullable().optional(),
+   parentId: z.string().nullish(),
    sourceIds: z.array(z.string()).min(1, "At least one source ID is required"),
  });

@@ -44,7 +44,7 @@ export const deleteDocumentOutputSchema = z.object({
  // Move document schemas
  export const moveDocumentInputSchema = z.object({
    path: z.string().min(1, "Path is required"),
-   newParentId: z.string().nullable().optional(),
+   newParentId: z.string().nullish(),
  });

  export const moveDocumentOutputSchema = z.object({
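zod's .nullish() accepts a string, null, or undefined, which also lets the key be omitted entirely. A minimal sketch of how it differs from a bare .nullable() (illustrative only, not part of the package):

```javascript
import { z } from "zod";

// .nullable(): the key must be present, but its value may be null.
const nullableSchema = z.object({ parentId: z.string().nullable() });
nullableSchema.parse({ parentId: null }); // ok
// nullableSchema.parse({});              // throws: parentId is required

// .nullish(): the value may be a string, null, or undefined, so the key can be omitted.
const nullishSchema = z.object({ parentId: z.string().nullish() });
nullishSchema.parse({ parentId: null }); // ok
nullishSchema.parse({});                 // ok
```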
package/utils/docs-finder-utils.mjs
@@ -1,5 +1,6 @@
  import { access, readdir, readFile } from "node:fs/promises";
  import { join } from "node:path";
+ import { pathExists } from "./file-utils.mjs";

  /**
   * Get action-specific text based on isTranslate flag
@@ -276,3 +277,50 @@ export function addFeedbackToItems(items, feedback) {
      feedback: feedback.trim(),
    }));
  }
+
+ /**
+  * Load document execution structure from structure-plan.json
+  * @param {string} outputDir - Output directory containing structure-plan.json
+  * @returns {Promise<Array|null>} Document execution structure array or null if not found/failed
+  */
+ export async function loadDocumentStructure(outputDir) {
+   if (!outputDir) {
+     return null;
+   }
+
+   try {
+     const structurePlanPath = join(outputDir, "structure-plan.json");
+     const structureExists = await pathExists(structurePlanPath);
+
+     if (!structureExists) {
+       return null;
+     }
+
+     const structureContent = await readFile(structurePlanPath, "utf8");
+     if (!structureContent?.trim()) {
+       return null;
+     }
+
+     try {
+       // Validate that the content looks like JSON before parsing
+       const trimmedContent = structureContent.trim();
+       if (!trimmedContent.startsWith("[") && !trimmedContent.startsWith("{")) {
+         console.warn("structure-plan.json contains non-JSON content, skipping parse");
+         return null;
+       }
+
+       const parsed = JSON.parse(structureContent);
+       // Return array if it's an array, otherwise return null
+       return Array.isArray(parsed) ? parsed : null;
+     } catch (parseError) {
+       console.error(`Failed to parse structure-plan.json: ${parseError.message}`);
+       return null;
+     }
+   } catch (readError) {
+     // Only warn if it's not a "file not found" error
+     if (readError.code !== "ENOENT") {
+       console.warn(`Error reading structure-plan.json: ${readError.message}`);
+     }
+     return null;
+   }
+ }
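A rough usage sketch of the new helper; the output directory below is a placeholder, since the real value is supplied by the calling agents, and the import path assumes the package root as the working directory.

```javascript
import { loadDocumentStructure } from "./utils/docs-finder-utils.mjs";

// Placeholder directory; doc-smith passes its own configured output dir here.
const outputDir = "./.aigne/doc-smith/output";

const structure = await loadDocumentStructure(outputDir);
if (structure) {
  console.log(`Loaded ${structure.length} existing structure entries`);
} else {
  console.log("No usable structure-plan.json found");
}
```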
package/utils/extract-api.mjs
@@ -0,0 +1,32 @@
+ import { readFile } from "node:fs/promises";
+ import { transpileDeclaration } from "typescript";
+
+ export async function extractApi(path) {
+   const content = await readFile(path, "utf8");
+
+   const lang = languages.find((lang) => lang.match(path, content));
+   if (lang) {
+     return lang.extract(path, content);
+   }
+
+   return content;
+ }
+
+ const languages = [
+   {
+     match: (path) => /\.m?(js|ts)x?$/.test(path),
+     extract: extractJsApi,
+   },
+ ];
+
+ async function extractJsApi(_path, content) {
+   const res = transpileDeclaration(content, {
+     compilerOptions: {
+       declaration: true,
+       emitDeclarationOnly: true,
+       allowJs: true,
+     },
+   });
+
+   return res.outputText.trim();
+ }
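To make the effect concrete, here is a hedged sketch of what TypeScript's transpileDeclaration does with a small JavaScript module. The exact emitted text depends on the TypeScript version, so the trailing comment only indicates the general shape of the output.

```javascript
import { transpileDeclaration } from "typescript";

const source = `
export function add(a, b) {
  return a + b;
}
`;

// Declaration-only transpile: function bodies are dropped and only the
// inferred API surface (a .d.ts-style view of the module) is kept.
const { outputText } = transpileDeclaration(source, {
  compilerOptions: { declaration: true, emitDeclarationOnly: true, allowJs: true },
});

console.log(outputText.trim());
// Roughly: an `export declare function add(...)` signature with no body.
```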
package/utils/file-utils.mjs
@@ -11,8 +11,8 @@ import { gunzipSync } from "node:zlib";

  import { debug } from "./debug.mjs";
  import { isGlobPattern } from "./utils.mjs";
- import { INTELLIGENT_SUGGESTION_TOKEN_THRESHOLD } from "./constants/index.mjs";
  import { uploadFiles } from "./upload-files.mjs";
+ import { extractApi } from "./extract-api.mjs";

  /**
   * Check if a directory is inside a git repository using git command
@@ -508,7 +508,9 @@ export async function readFileContents(files, baseDir = process.cwd(), options =

        return null;
      } else {
-       const content = await readFile(file, "utf8");
+       const content = await extractApi(file);
+       if (!content) return null;
+
        const relativePath = path.relative(baseDir, file);
        return {
          sourceId: relativePath,
@@ -527,6 +529,11 @@ export async function readFileContents(files, baseDir = process.cwd(), options =
    return results.filter((result) => result !== null);
  }

+ export function calculateTokens(text) {
+   const tokens = encode(text);
+   return tokens.length;
+ }
+
  /**
   * Calculate total lines and tokens from file contents
   * @param {Array<{content: string}>} sourceFiles - Array of objects containing content property
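calculateTokens is a thin wrapper around the tokenizer's encode() already used elsewhere in file-utils.mjs. A small usage sketch; the threshold is invented for illustration and the real limits live in the calling agents.

```javascript
import { calculateTokens } from "./utils/file-utils.mjs";

const snippet = "export function add(a, b) { return a + b; }";
const tokenCount = calculateTokens(snippet);

// Hypothetical budget check, purely for illustration.
if (tokenCount > 8000) {
  console.warn(`Snippet is large (${tokenCount} tokens); consider splitting it`);
}
```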
@@ -552,97 +559,17 @@ export function calculateFileStats(sourceFiles) {
  }

  /**
-  * Build sources content string based on context size
-  * For large contexts, only include core project files to avoid token limit issues
+  * Build sources content string
   * @param {Array<{sourceId: string, content: string}>} sourceFiles - Array of source file objects
-  * @param {boolean} isLargeContext - Whether the context is large
   * @returns {string} Concatenated sources content with sourceId comments
   */
- export function buildSourcesContent(sourceFiles, isLargeContext = false) {
-   // Define core file patterns that represent project structure and key information
-   const coreFilePatterns = [
-     // Configuration files
-     /package\.json$/,
-     /tsconfig\.json$/,
-     /jsconfig\.json$/,
-     /\.env\.example$/,
-     /Cargo\.toml$/,
-     /go\.mod$/,
-     /pom\.xml$/,
-     /build\.gradle$/,
-     /Gemfile$/,
-     /requirements\.txt$/,
-     /Pipfile$/,
-     /composer\.json$/,
-     /pyproject\.toml$/,
-
-     // Documentation
-     /README\.md$/i,
-     /CHANGELOG\.md$/i,
-     /CONTRIBUTING\.md$/i,
-     /\.github\/.*\.md$/i,
-
-     // Entry points and main files
-     /index\.(js|ts|jsx|tsx|py|go|rs|java|rb|php)$/,
-     /main\.(js|ts|jsx|tsx|py|go|rs|java|rb|php)$/,
-     /app\.(js|ts|jsx|tsx|py)$/,
-     /server\.(js|ts|jsx|tsx|py)$/,
-
-     // API definitions
-     /api\/.*\.(js|ts|jsx|tsx|py|go|rs|java|rb|php)$/,
-     /routes\/.*\.(js|ts|jsx|tsx|py|go|rs|java|rb|php)$/,
-     /controllers\/.*\.(js|ts|jsx|tsx|py|go|rs|java|rb|php)$/,
-
-     // Type definitions and schemas
-     /types\.(ts|d\.ts)$/,
-     /schema\.(js|ts|jsx|tsx|py|go|rs|java|rb|php)$/,
-     /.*\.d\.ts$/,
-
-     // Core utilities
-     /utils\/.*\.(js|ts|jsx|tsx|py|go|rs|java|rb|php)$/,
-     /lib\/.*\.(js|ts|jsx|tsx|py|go|rs|java|rb|php)$/,
-     /helpers\/.*\.(js|ts|jsx|tsx|py|go|rs|java|rb|php)$/,
-   ];
-
-   // Function to check if a file is a core file
-   const isCoreFile = (filePath) => {
-     return coreFilePatterns.some((pattern) => pattern.test(filePath));
-   };
-
+ export function buildSourcesContent(sourceFiles) {
    // Build sources string
    let allSources = "";

-   if (isLargeContext) {
-     // Only include core files for large contexts
-     const coreFiles = sourceFiles.filter((source) => isCoreFile(source.sourceId));
-
-     // Determine which files to use and set appropriate message
-     const filesToInclude = coreFiles.length > 0 ? coreFiles : sourceFiles;
-     const noteMessage =
-       coreFiles.length > 0
-         ? "// Note: Context is large, showing only core project files.\n"
-         : "// Note: Context is large, showing a sample of files.\n";
-
-     allSources += noteMessage;
-     let accumulatedTokens = 0;
-
-     for (const source of filesToInclude) {
-       const fileContent = `// sourceId: ${source.sourceId}\n${source.content}\n`;
-       const fileTokens = encode(fileContent);
-
-       // Check if adding this file would exceed the token limit
-       if (accumulatedTokens + fileTokens.length > INTELLIGENT_SUGGESTION_TOKEN_THRESHOLD) {
-         break;
-       }
-
-       allSources += fileContent;
-       accumulatedTokens += fileTokens.length;
-     }
-   } else {
-     // Include all files for normal contexts
-     for (const source of sourceFiles) {
-       allSources += `// sourceId: ${source.sourceId}\n${source.content}\n`;
-     }
+   // Include all files for normal contexts
+   for (const source of sourceFiles) {
+     allSources += `\n// sourceId: ${source.sourceId}\n${source.content}\n`;
    }

    return allSources;
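An illustrative call showing the concatenated format the simplified buildSourcesContent returns; the file contents are placeholders.

```javascript
import { buildSourcesContent } from "./utils/file-utils.mjs";

const sources = buildSourcesContent([
  { sourceId: "src/index.mjs", content: "export const answer = 42;" },
  { sourceId: "README.md", content: "# Demo" },
]);

// Each file is preceded by a blank line and a `// sourceId: <path>` comment,
// so downstream prompts can attribute content back to its source file.
console.log(sources);
```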
package/utils/history-utils.mjs
@@ -99,7 +99,7 @@ function recordUpdateGit({ feedback }) {

  /**
   * Records an update in the YAML file.
- function recordUpdateYaml({ operation, feedback, documentPath = null }) {
+ function recordUpdateYaml({ operation, feedback, docPaths = null }) {
    try {
      const docSmithDir = join(process.cwd(), DOC_SMITH_DIR);
      if (!existsSync(docSmithDir)) {
@@ -125,9 +125,9 @@ function recordUpdateYaml({ operation, feedback, documentPath = null }) {
      feedback,
    };

-   // Add document path if provided
-   if (documentPath) {
-     entry.documentPath = documentPath;
+   // Add document paths if provided
+   if (Array.isArray(docPaths) && docPaths.length > 0) {
+     entry.docPaths = docPaths;
    }

    // Add to beginning (newest first)
@@ -153,14 +153,14 @@ function recordUpdateYaml({ operation, feedback, documentPath = null }) {
   * @param {Object} params
   * @param {string} params.operation - The type of operation (e.g., 'document_update', 'structure_update', 'translation_update').
   * @param {string} params.feedback - The user's feedback text.
-  * @param {string} params.documentPath - The document path.
+  * @param {string[]} [params.docPaths] - Document path list for updates.
   */
- export function recordUpdate({ operation, feedback, documentPath = null }) {
+ export function recordUpdate({ operation, feedback, docPaths = null }) {
    // Skip if no feedback
    if (!feedback?.trim()) return;

    // Always record in YAML
-   recordUpdateYaml({ operation, feedback, documentPath });
+   recordUpdateYaml({ operation, feedback, docPaths });

    // Also record in git if git is available and not in a git repository
    if (isGitAvailable() && !isInGitRepository(process.cwd())) {
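A hypothetical call site illustrating the new signature; the operation, feedback, and paths are made up for the example.

```javascript
import { recordUpdate } from "./utils/history-utils.mjs";

// recordUpdate now accepts a list of document paths instead of a single documentPath.
recordUpdate({
  operation: "document_update",
  feedback: "Tighten the API overview and link it from Getting Started",
  docPaths: ["/overview", "/api-reference"],
});
```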
@@ -183,7 +183,19 @@ export function getHistory() {

    try {
      const content = readFileSync(historyPath, "utf8");
-     return parse(content) || { entries: [] };
+     const parsed = parse(content) || { entries: [] };
+     const entries = Array.isArray(parsed.entries) ? parsed.entries : [];
+
+     const normalized = entries.map((entry) => {
+       if (!entry) return entry;
+       // Normalize legacy entries: documentPath -> docPaths: [documentPath]
+       if (!entry.docPaths && entry.documentPath) {
+         return { ...entry, docPaths: [entry.documentPath] };
+       }
+       return entry;
+     });
+
+     return { ...parsed, entries: normalized };
    } catch (error) {
      console.warn("Could not read the history:", error.message);
      return { entries: [] };
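A sketch of what that normalization does to a legacy history entry; the entry values are illustrative, not taken from a real history.yaml.

```javascript
// Legacy shape written by older versions of doc-smith.
const legacyEntry = {
  operation: "document_update",
  feedback: "Clarify the installation steps",
  documentPath: "/getting-started",
};

// getHistory() now exposes the same entry with a docPaths array.
const normalized =
  !legacyEntry.docPaths && legacyEntry.documentPath
    ? { ...legacyEntry, docPaths: [legacyEntry.documentPath] }
    : legacyEntry;

console.log(normalized.docPaths); // ["/getting-started"]
```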
package/agents/generate/document-structure-tools/generate-sub-structure.mjs
@@ -1,131 +0,0 @@
- import {
-   buildSourcesContent,
-   calculateFileStats,
-   loadFilesFromPaths,
-   readFileContents,
- } from "../../../utils/file-utils.mjs";
- import {
-   INTELLIGENT_SUGGESTION_TOKEN_THRESHOLD,
-   DEFAULT_EXCLUDE_PATTERNS,
-   DEFAULT_INCLUDE_PATTERNS,
- } from "../../../utils/constants/index.mjs";
- import { toRelativePath } from "../../../utils/utils.mjs";
-
- export default async function generateSubStructure(
-   {
-     parentDocument,
-     subSourcePaths,
-     includePatterns,
-     excludePatterns,
-     useDefaultPatterns = true,
-     ...rest
-   },
-   options,
- ) {
-   const sourcePaths = subSourcePaths?.map((item) => item.path);
-   if (!sourcePaths || sourcePaths.length === 0) {
-     return {
-       subStructure: [],
-     };
-   }
-
-   let files = await loadFilesFromPaths(sourcePaths, {
-     includePatterns,
-     excludePatterns,
-     useDefaultPatterns,
-     defaultIncludePatterns: DEFAULT_INCLUDE_PATTERNS,
-     defaultExcludePatterns: DEFAULT_EXCLUDE_PATTERNS,
-   });
-   files = [...new Set(files)];
-
-   // all files path
-   const allFilesPaths = files.map((file) => `- ${toRelativePath(file)}`).join("\n");
-
-   // Read all source files using the utility function
-   const sourceFiles = await readFileContents(files, process.cwd());
-
-   // Count tokens and lines using utility function
-   const { totalTokens } = calculateFileStats(sourceFiles);
-
-   // check if totalTokens is too large
-   let isLargeContext = false;
-   if (totalTokens > INTELLIGENT_SUGGESTION_TOKEN_THRESHOLD) {
-     isLargeContext = true;
-   }
-
-   // Build allSources string using utility function
-   const allSources = buildSourcesContent(sourceFiles, isLargeContext);
-
-   // Performance optimization:
-   // Using both structured output and Tool with Gemini model causes redundant calls
-   // Only use Tool when context is very large
-   const generateStructureAgent = isLargeContext
-     ? options.context.agents["generateStructure"]
-     : options.context.agents["generateStructureWithoutTools"];
-   const result = await options.context.invoke(generateStructureAgent, {
-     ...rest,
-     isSubStructure: true,
-     parentDocument,
-     datasources: allSources,
-     allFilesPaths,
-     isLargeContext,
-     files,
-     totalTokens,
-   });
-
-   return {
-     subStructure: result.documentStructure || [],
-     message: `Generated a sub structure for '${parentDocument.path}' successfully. Please merge all sub-structures to output the complete document structure.`,
-   };
- }
-
- generateSubStructure.description = `
- Generates a sub-structure.
- Handles large file sets by splitting them into smaller sub-document structures when the context size exceeds limits. This approach ensures more focused and complete documentation generation.
- `;
-
- generateSubStructure.inputSchema = {
-   type: "object",
-   properties: {
-     parentDocument: {
-       type: "object",
-       description: "The parent node to generate a sub structure for",
-       properties: {
-         title: { type: "string", description: "The title of the parent node" },
-         description: { type: "string", description: "The description of the parent node" },
-         path: {
-           type: "string",
-           description:
-             "The path of the parent node, Path in URL format, cannot be empty, cannot contain spaces or special characters, must start with /, no need to include language level, e.g., /zh/about should return /about ",
-         },
-         parentId: { type: "string", description: "The parent ID of the parent node" },
-         sourceIds: { type: "array", description: "The source IDs of the parent node" },
-       },
-     },
-     subSourcePaths: {
-       type: "array",
-       description: "The source paths of the sub structure",
-       items: {
-         type: "object",
-         properties: {
-           path: { type: "string", description: "The source path of the sub structure" },
-           reason: { type: "string", description: "The reason for selecting the source path" },
-         },
-         required: ["path", "reason"],
-       },
-     },
-   },
- };
-
- generateSubStructure.outputSchema = {
-   type: "object",
-   properties: {
-     subStructure: {
-       type: "array",
-       description:
-         "The sub structure of the parent node, need merge all sub-structures and output the complete document structure.",
-     },
-     message: { type: "string", description: "The message of the sub structure" },
-   },
-   required: ["subStructure"],
- };
package/agents/generate/generate-structure-without-tools.yaml
@@ -1,65 +0,0 @@
- name: generateStructureWithoutTools
- description: Generate the structure and organization of your documentation
- instructions:
-   - role: system
-     url: ../../prompts/structure/generate/system-prompt.md
-   - role: user
-     url: ../../prompts/structure/generate/user-prompt.md
- task_render_mode: collapse
- task_title: Generate the structure of the documentation
- input_schema:
-   type: object
-   properties:
-     rules:
-       type: string
-       description: Your specific requirements for documentation structure
-     locale:
-       type: string
-       description: Primary language for documentation (e.g., zh, en, ja)
-     datasources:
-       type: string
-       description: Project content and context to help generate documentation structure
-     targetAudience:
-       type: string
-       description: Target audience for the documentation
-     nodeName:
-       type: string
-       description: Specific section or page name to focus on
-     glossary:
-       type: string
-       description: Glossary for consistent terminology
-     feedback:
-       type: string
-       description: Tell us how to improve the documentation structure
-     userPreferences:
-       type: string
-       description: Your saved preferences for structure and documentation style
-     docsType:
-       type: string
-       description: "Documentation type (options: general, getting-started, reference, faq)"
-       default: general
-   required:
-     - rules
-     - datasources
- output_schema:
-   type: object
-   properties:
-     projectName:
-       type: string
-       description: Project name identified from your content sources
-     projectDesc:
-       type: string
-       description: Brief project description generated from content analysis (under 50 words)
-     documentStructure: ../schema/document-structure.yaml
-     documentStructureTree:
-       type: string
-       description: |
-         Visual tree structure showing documentation hierarchy with indented levels for easy review:
-         ```
-         - Home
-         - Getting Started
-           - Installation
-           - Requirements
-         ```
-   required:
-     - documentStructure