@databricks/appkit 0.15.0 → 0.17.0

This diff shows the publicly available contents of two package versions as published to a supported registry. It is provided for informational purposes only and reflects the packages exactly as they appear in their respective public registries.
Files changed (165)
  1. package/dist/_virtual/_rolldown/runtime.js +2 -0
  2. package/dist/app/index.d.ts.map +1 -1
  3. package/dist/appkit/package.js +1 -1
  4. package/dist/cache/index.d.ts +1 -1
  5. package/dist/cache/index.d.ts.map +1 -1
  6. package/dist/cli/commands/plugin/add-resource/add-resource.js +10 -4
  7. package/dist/cli/commands/plugin/add-resource/add-resource.js.map +1 -1
  8. package/dist/cli/commands/plugin/create/scaffold.js +10 -16
  9. package/dist/cli/commands/plugin/create/scaffold.js.map +1 -1
  10. package/dist/cli/commands/plugin/list/list.js +44 -26
  11. package/dist/cli/commands/plugin/list/list.js.map +1 -1
  12. package/dist/cli/commands/plugin/manifest-resolve.js +57 -0
  13. package/dist/cli/commands/plugin/manifest-resolve.js.map +1 -0
  14. package/dist/cli/commands/plugin/sync/sync.js +121 -71
  15. package/dist/cli/commands/plugin/sync/sync.js.map +1 -1
  16. package/dist/cli/commands/plugin/trusted-js-manifest.js +28 -0
  17. package/dist/cli/commands/plugin/trusted-js-manifest.js.map +1 -0
  18. package/dist/cli/commands/plugin/validate/validate.js +32 -14
  19. package/dist/cli/commands/plugin/validate/validate.js.map +1 -1
  20. package/dist/connectors/genie/client.d.ts +4 -0
  21. package/dist/connectors/genie/client.js +33 -16
  22. package/dist/connectors/genie/client.js.map +1 -1
  23. package/dist/connectors/genie/defaults.js +2 -1
  24. package/dist/connectors/genie/defaults.js.map +1 -1
  25. package/dist/connectors/genie/index.d.ts +3 -0
  26. package/dist/connectors/genie/types.d.ts +1 -0
  27. package/dist/connectors/genie/types.d.ts.map +1 -1
  28. package/dist/connectors/lakebase/index.d.ts +2 -3
  29. package/dist/connectors/lakebase/index.d.ts.map +1 -1
  30. package/dist/connectors/lakebase/index.js.map +1 -1
  31. package/dist/connectors/lakebase-v1/client.js +1 -1
  32. package/dist/context/execution-context.d.ts +0 -1
  33. package/dist/context/execution-context.d.ts.map +1 -1
  34. package/dist/context/index.d.ts +3 -0
  35. package/dist/context/service-context.d.ts +1 -1
  36. package/dist/context/service-context.d.ts.map +1 -1
  37. package/dist/context/user-context.d.ts +1 -2
  38. package/dist/context/user-context.d.ts.map +1 -1
  39. package/dist/core/appkit.d.ts +2 -1
  40. package/dist/core/appkit.d.ts.map +1 -1
  41. package/dist/core/index.d.ts +1 -0
  42. package/dist/errors/authentication.d.ts +0 -1
  43. package/dist/errors/authentication.d.ts.map +1 -1
  44. package/dist/errors/base.d.ts.map +1 -1
  45. package/dist/errors/configuration.d.ts +0 -1
  46. package/dist/errors/configuration.d.ts.map +1 -1
  47. package/dist/errors/connection.d.ts +0 -1
  48. package/dist/errors/connection.d.ts.map +1 -1
  49. package/dist/errors/execution.d.ts +0 -1
  50. package/dist/errors/execution.d.ts.map +1 -1
  51. package/dist/errors/initialization.d.ts +0 -1
  52. package/dist/errors/initialization.d.ts.map +1 -1
  53. package/dist/errors/server.d.ts +0 -1
  54. package/dist/errors/server.d.ts.map +1 -1
  55. package/dist/errors/tunnel.d.ts +0 -1
  56. package/dist/errors/tunnel.d.ts.map +1 -1
  57. package/dist/errors/validation.d.ts +0 -1
  58. package/dist/errors/validation.d.ts.map +1 -1
  59. package/dist/index.d.ts +7 -1
  60. package/dist/plugin/dev-reader.d.ts +5 -4
  61. package/dist/plugin/dev-reader.d.ts.map +1 -1
  62. package/dist/plugin/index.d.ts +4 -0
  63. package/dist/plugin/plugin.d.ts +3 -1
  64. package/dist/plugin/plugin.d.ts.map +1 -1
  65. package/dist/plugin/to-plugin.d.ts +1 -1
  66. package/dist/plugin/to-plugin.d.ts.map +1 -1
  67. package/dist/plugins/analytics/analytics.d.ts +5 -2
  68. package/dist/plugins/analytics/analytics.d.ts.map +1 -1
  69. package/dist/plugins/analytics/analytics.js +2 -2
  70. package/dist/plugins/analytics/analytics.js.map +1 -1
  71. package/dist/plugins/analytics/index.d.ts +2 -0
  72. package/dist/plugins/analytics/index.js +0 -1
  73. package/dist/plugins/analytics/manifest.js +30 -18
  74. package/dist/plugins/analytics/manifest.js.map +1 -1
  75. package/dist/plugins/analytics/types.d.ts +1 -0
  76. package/dist/plugins/analytics/types.d.ts.map +1 -1
  77. package/dist/plugins/genie/genie.d.ts +4 -1
  78. package/dist/plugins/genie/genie.d.ts.map +1 -1
  79. package/dist/plugins/genie/genie.js +10 -6
  80. package/dist/plugins/genie/genie.js.map +1 -1
  81. package/dist/plugins/genie/index.d.ts +4 -0
  82. package/dist/plugins/genie/index.js +0 -1
  83. package/dist/plugins/genie/manifest.js +37 -8
  84. package/dist/plugins/genie/manifest.js.map +1 -1
  85. package/dist/plugins/genie/types.d.ts +2 -0
  86. package/dist/plugins/genie/types.d.ts.map +1 -1
  87. package/dist/plugins/index.d.ts +13 -0
  88. package/dist/plugins/index.js +0 -4
  89. package/dist/plugins/lakebase/index.d.ts +2 -0
  90. package/dist/plugins/lakebase/index.js +0 -1
  91. package/dist/plugins/lakebase/lakebase.d.ts +14 -12
  92. package/dist/plugins/lakebase/lakebase.d.ts.map +1 -1
  93. package/dist/plugins/lakebase/lakebase.js +2 -2
  94. package/dist/plugins/lakebase/lakebase.js.map +1 -1
  95. package/dist/plugins/lakebase/manifest.js +14 -8
  96. package/dist/plugins/lakebase/manifest.js.map +1 -1
  97. package/dist/plugins/lakebase/types.d.ts +1 -1
  98. package/dist/plugins/lakebase/types.d.ts.map +1 -1
  99. package/dist/plugins/server/index.d.ts +7 -9
  100. package/dist/plugins/server/index.d.ts.map +1 -1
  101. package/dist/plugins/server/index.js +2 -2
  102. package/dist/plugins/server/index.js.map +1 -1
  103. package/dist/plugins/server/manifest.js +36 -18
  104. package/dist/plugins/server/manifest.js.map +1 -1
  105. package/dist/plugins/server/types.d.ts +2 -0
  106. package/dist/plugins/server/types.d.ts.map +1 -1
  107. package/dist/registry/index.d.ts +4 -0
  108. package/dist/registry/manifest-loader.d.ts +1 -1
  109. package/dist/registry/manifest-loader.d.ts.map +1 -1
  110. package/dist/registry/resource-registry.d.ts +1 -1
  111. package/dist/registry/resource-registry.d.ts.map +1 -1
  112. package/dist/registry/types.d.ts +1 -4
  113. package/dist/registry/types.d.ts.map +1 -1
  114. package/dist/registry/types.generated.d.ts +1 -1
  115. package/dist/registry/types.generated.d.ts.map +1 -1
  116. package/dist/registry/types.generated.js.map +1 -1
  117. package/dist/shared/src/cache.d.ts +1 -1
  118. package/dist/shared/src/cache.d.ts.map +1 -1
  119. package/dist/shared/src/execute.d.ts +1 -1
  120. package/dist/shared/src/execute.d.ts.map +1 -1
  121. package/dist/shared/src/genie.d.ts +6 -0
  122. package/dist/shared/src/genie.d.ts.map +1 -1
  123. package/dist/shared/src/index.d.ts +7 -0
  124. package/dist/shared/src/plugin.d.ts +2 -3
  125. package/dist/shared/src/plugin.d.ts.map +1 -1
  126. package/dist/shared/src/sql/helpers.d.ts +0 -1
  127. package/dist/shared/src/sql/helpers.d.ts.map +1 -1
  128. package/dist/shared/src/sql/types.d.ts.map +1 -1
  129. package/dist/shared/src/tunnel.d.ts +1 -1
  130. package/dist/shared/src/tunnel.d.ts.map +1 -1
  131. package/dist/stream/arrow-stream-processor.d.ts +1 -0
  132. package/dist/stream/buffers.d.ts +1 -0
  133. package/dist/stream/index.d.ts +3 -0
  134. package/dist/stream/stream-manager.d.ts +1 -0
  135. package/dist/stream/stream-manager.d.ts.map +1 -1
  136. package/dist/stream/types.d.ts +3 -0
  137. package/dist/telemetry/config.d.ts +1 -0
  138. package/dist/telemetry/index.d.ts +4 -0
  139. package/dist/telemetry/instrumentations.d.ts +1 -0
  140. package/dist/telemetry/telemetry-manager.d.ts +4 -0
  141. package/dist/telemetry/telemetry-provider.d.ts +6 -0
  142. package/dist/telemetry/types.d.ts.map +1 -1
  143. package/dist/type-generator/cache.js +10 -12
  144. package/dist/type-generator/cache.js.map +1 -1
  145. package/dist/type-generator/index.js +2 -2
  146. package/dist/type-generator/index.js.map +1 -1
  147. package/dist/type-generator/query-registry.js +165 -52
  148. package/dist/type-generator/query-registry.js.map +1 -1
  149. package/dist/type-generator/spinner.js +5 -1
  150. package/dist/type-generator/spinner.js.map +1 -1
  151. package/dist/type-generator/vite-plugin.d.ts +0 -1
  152. package/dist/type-generator/vite-plugin.d.ts.map +1 -1
  153. package/dist/type-generator/vite-plugin.js +2 -2
  154. package/dist/type-generator/vite-plugin.js.map +1 -1
  155. package/docs/api/appkit-ui/genie/GenieChatMessageList.md +7 -5
  156. package/docs/development/project-setup.md +1 -1
  157. package/docs/plugins/plugin-management.md +16 -2
  158. package/package.json +3 -1
  159. package/dist/plugins/analytics/manifest.json +0 -36
  160. package/dist/plugins/genie/manifest.json +0 -43
  161. package/dist/plugins/lakebase/manifest.json +0 -12
  162. package/dist/plugins/server/manifest.json +0 -36
  163. /package/dist/plugins/server/remote-tunnel/{denied.html → denied.html/denied.html} +0 -0
  164. /package/dist/plugins/server/remote-tunnel/{index.html → index.html/index.html} +0 -0
  165. /package/dist/plugins/server/remote-tunnel/{wait.html → wait.html/wait.html} +0 -0
@@ -1,7 +1,7 @@
  import { createLogger } from "../logging/logger.js";
  import crypto from "node:crypto";
+ import fs from "node:fs/promises";
  import path from "node:path";
- import fs from "node:fs";
 
  //#region src/type-generator/cache.ts
  const logger = createLogger("type-generator:cache");
@@ -22,16 +22,15 @@ function hashSQL(sql) {
  * If the cache is not found, run the query explain
  * @returns - the cache
  */
- function loadCache() {
+ async function loadCache() {
  const cachePath = path.join(CACHE_DIR, CACHE_FILE);
  try {
- if (!fs.existsSync(CACHE_DIR)) fs.mkdirSync(CACHE_DIR, { recursive: true });
- if (fs.existsSync(cachePath)) {
- const cache = JSON.parse(fs.readFileSync(cachePath, "utf8"));
- if (cache.version === CACHE_VERSION) return cache;
- }
- } catch {
- logger.warn("Cache file is corrupted, flushing cache completely.");
+ await fs.mkdir(CACHE_DIR, { recursive: true });
+ const raw = await fs.readFile(cachePath, "utf8");
+ const cache = JSON.parse(raw);
+ if (cache.version === CACHE_VERSION) return cache;
+ } catch (err) {
+ if (err.code !== "ENOENT") logger.warn("Cache file is corrupted, flushing cache completely.");
  }
  return {
  version: CACHE_VERSION,
@@ -40,12 +39,11 @@ function loadCache() {
  }
  /**
  * Save the cache to the file system
- * The cache is saved as a JSON file, it is used to avoid running the query explain multiple times
  * @param cache - cache object to save
  */
- function saveCache(cache) {
+ async function saveCache(cache) {
  const cachePath = path.join(CACHE_DIR, CACHE_FILE);
- fs.writeFileSync(cachePath, JSON.stringify(cache, null, 2), "utf8");
+ await fs.writeFile(cachePath, JSON.stringify(cache, null, 2), "utf8");
  }
 
  //#endregion
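The hunk above moves `loadCache`/`saveCache` from sync `node:fs` calls to `node:fs/promises`, and replaces the `existsSync` checks with an idempotent `mkdir` plus an error-code test: a missing cache file (`ENOENT`) is the normal cold-start case, while any other failure is treated as corruption. A minimal standalone sketch of that pattern (paths and the `CACHE_VERSION` value here are illustrative, not AppKit's real defaults):

```javascript
// Sketch of the ENOENT-tolerant async cache load, assuming a JSON cache
// shaped like { version, queries } as in the diff above.
import fs from "node:fs/promises";
import path from "node:path";

const CACHE_VERSION = "2";

async function loadCache(cacheDir, cacheFile) {
  const cachePath = path.join(cacheDir, cacheFile);
  try {
    // mkdir with recursive:true is idempotent, so no existsSync pre-check is needed.
    await fs.mkdir(cacheDir, { recursive: true });
    const raw = await fs.readFile(cachePath, "utf8");
    const cache = JSON.parse(raw);
    if (cache.version === CACHE_VERSION) return cache;
  } catch (err) {
    // A missing file is expected on first run; only warn on real corruption.
    if (err.code !== "ENOENT") {
      console.warn("Cache file is corrupted, flushing cache completely.");
    }
  }
  // Fall back to an empty cache on miss, version mismatch, or corruption.
  return { version: CACHE_VERSION, queries: {} };
}
```

One behavioral difference worth noting: the old code silently swallowed every error, so a first run also logged nothing; the new code keeps that quiet path only for `ENOENT`.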
@@ -1 +1 @@
- {"version":3,"file":"cache.js","names":[],"sources":["../../src/type-generator/cache.ts"],"sourcesContent":["import crypto from \"node:crypto\";\nimport fs from \"node:fs\";\nimport path from \"node:path\";\nimport { createLogger } from \"../logging/logger\";\n\nconst logger = createLogger(\"type-generator:cache\");\n\n/**\n * Cache types\n * @property hash - the hash of the SQL query\n * @property type - the type of the query\n */\ninterface CacheEntry {\n hash: string;\n type: string;\n retry: boolean;\n}\n\n/**\n * Cache interface\n * @property version - the version of the cache\n * @property queries - the queries in the cache\n */\ninterface Cache {\n version: string;\n queries: Record<string, CacheEntry>;\n}\n\nexport const CACHE_VERSION = \"2\";\nconst CACHE_FILE = \".appkit-types-cache.json\";\nconst CACHE_DIR = path.join(\n process.cwd(),\n \"node_modules\",\n \".databricks\",\n \"appkit\",\n);\n\n/**\n * Hash the SQL query\n * Uses MD5 to hash the SQL query\n * @param sql - the SQL query to hash\n * @returns - the hash of the SQL query\n */\nexport function hashSQL(sql: string): string {\n return crypto.createHash(\"md5\").update(sql).digest(\"hex\");\n}\n\n/**\n * Load the cache from the file system\n * If the cache is not found, run the query explain\n * @returns - the cache\n */\nexport function loadCache(): Cache {\n const cachePath = path.join(CACHE_DIR, CACHE_FILE);\n try {\n if (!fs.existsSync(CACHE_DIR)) {\n fs.mkdirSync(CACHE_DIR, { recursive: true });\n }\n\n if (fs.existsSync(cachePath)) {\n const cache = JSON.parse(fs.readFileSync(cachePath, \"utf8\")) as Cache;\n if (cache.version === CACHE_VERSION) {\n return cache;\n }\n }\n } catch {\n logger.warn(\"Cache file is corrupted, flushing cache completely.\");\n }\n return { version: CACHE_VERSION, queries: {} };\n}\n\n/**\n * Save the cache to the file system\n * The cache is saved as a JSON file, it is used to avoid running the query explain multiple times\n * @param cache - cache object to 
save\n */\nexport function saveCache(cache: Cache): void {\n const cachePath = path.join(CACHE_DIR, CACHE_FILE);\n fs.writeFileSync(cachePath, JSON.stringify(cache, null, 2), \"utf8\");\n}\n"],"mappings":";;;;;;AAKA,MAAM,SAAS,aAAa,uBAAuB;AAuBnD,MAAa,gBAAgB;AAC7B,MAAM,aAAa;AACnB,MAAM,YAAY,KAAK,KACrB,QAAQ,KAAK,EACb,gBACA,eACA,SACD;;;;;;;AAQD,SAAgB,QAAQ,KAAqB;AAC3C,QAAO,OAAO,WAAW,MAAM,CAAC,OAAO,IAAI,CAAC,OAAO,MAAM;;;;;;;AAQ3D,SAAgB,YAAmB;CACjC,MAAM,YAAY,KAAK,KAAK,WAAW,WAAW;AAClD,KAAI;AACF,MAAI,CAAC,GAAG,WAAW,UAAU,CAC3B,IAAG,UAAU,WAAW,EAAE,WAAW,MAAM,CAAC;AAG9C,MAAI,GAAG,WAAW,UAAU,EAAE;GAC5B,MAAM,QAAQ,KAAK,MAAM,GAAG,aAAa,WAAW,OAAO,CAAC;AAC5D,OAAI,MAAM,YAAY,cACpB,QAAO;;SAGL;AACN,SAAO,KAAK,sDAAsD;;AAEpE,QAAO;EAAE,SAAS;EAAe,SAAS,EAAE;EAAE;;;;;;;AAQhD,SAAgB,UAAU,OAAoB;CAC5C,MAAM,YAAY,KAAK,KAAK,WAAW,WAAW;AAClD,IAAG,cAAc,WAAW,KAAK,UAAU,OAAO,MAAM,EAAE,EAAE,OAAO"}
+ {"version":3,"file":"cache.js","names":[],"sources":["../../src/type-generator/cache.ts"],"sourcesContent":["import crypto from \"node:crypto\";\nimport fs from \"node:fs/promises\";\nimport path from \"node:path\";\nimport { createLogger } from \"../logging/logger\";\n\nconst logger = createLogger(\"type-generator:cache\");\n\n/**\n * Cache types\n * @property hash - the hash of the SQL query\n * @property type - the type of the query\n */\ninterface CacheEntry {\n hash: string;\n type: string;\n retry: boolean;\n}\n\n/**\n * Cache interface\n * @property version - the version of the cache\n * @property queries - the queries in the cache\n */\ninterface Cache {\n version: string;\n queries: Record<string, CacheEntry>;\n}\n\nexport const CACHE_VERSION = \"2\";\nconst CACHE_FILE = \".appkit-types-cache.json\";\nconst CACHE_DIR = path.join(\n process.cwd(),\n \"node_modules\",\n \".databricks\",\n \"appkit\",\n);\n\n/**\n * Hash the SQL query\n * Uses MD5 to hash the SQL query\n * @param sql - the SQL query to hash\n * @returns - the hash of the SQL query\n */\nexport function hashSQL(sql: string): string {\n return crypto.createHash(\"md5\").update(sql).digest(\"hex\");\n}\n\n/**\n * Load the cache from the file system\n * If the cache is not found, run the query explain\n * @returns - the cache\n */\nexport async function loadCache(): Promise<Cache> {\n const cachePath = path.join(CACHE_DIR, CACHE_FILE);\n try {\n await fs.mkdir(CACHE_DIR, { recursive: true });\n\n const raw = await fs.readFile(cachePath, \"utf8\");\n const cache = JSON.parse(raw) as Cache;\n if (cache.version === CACHE_VERSION) {\n return cache;\n }\n } catch (err) {\n if ((err as NodeJS.ErrnoException).code !== \"ENOENT\") {\n logger.warn(\"Cache file is corrupted, flushing cache completely.\");\n }\n }\n return { version: CACHE_VERSION, queries: {} };\n}\n\n/**\n * Save the cache to the file system\n * @param cache - cache object to save\n */\nexport async function saveCache(cache: Cache): 
Promise<void> {\n const cachePath = path.join(CACHE_DIR, CACHE_FILE);\n await fs.writeFile(cachePath, JSON.stringify(cache, null, 2), \"utf8\");\n}\n"],"mappings":";;;;;;AAKA,MAAM,SAAS,aAAa,uBAAuB;AAuBnD,MAAa,gBAAgB;AAC7B,MAAM,aAAa;AACnB,MAAM,YAAY,KAAK,KACrB,QAAQ,KAAK,EACb,gBACA,eACA,SACD;;;;;;;AAQD,SAAgB,QAAQ,KAAqB;AAC3C,QAAO,OAAO,WAAW,MAAM,CAAC,OAAO,IAAI,CAAC,OAAO,MAAM;;;;;;;AAQ3D,eAAsB,YAA4B;CAChD,MAAM,YAAY,KAAK,KAAK,WAAW,WAAW;AAClD,KAAI;AACF,QAAM,GAAG,MAAM,WAAW,EAAE,WAAW,MAAM,CAAC;EAE9C,MAAM,MAAM,MAAM,GAAG,SAAS,WAAW,OAAO;EAChD,MAAM,QAAQ,KAAK,MAAM,IAAI;AAC7B,MAAI,MAAM,YAAY,cACpB,QAAO;UAEF,KAAK;AACZ,MAAK,IAA8B,SAAS,SAC1C,QAAO,KAAK,sDAAsD;;AAGtE,QAAO;EAAE,SAAS;EAAe,SAAS,EAAE;EAAE;;;;;;AAOhD,eAAsB,UAAU,OAA6B;CAC3D,MAAM,YAAY,KAAK,KAAK,WAAW,WAAW;AAClD,OAAM,GAAG,UAAU,WAAW,KAAK,UAAU,OAAO,MAAM,EAAE,EAAE,OAAO"}
@@ -1,6 +1,6 @@
  import { createLogger } from "../logging/logger.js";
  import { generateQueriesFromDescribe } from "./query-registry.js";
- import fs from "node:fs";
+ import fs from "node:fs/promises";
  import dotenv from "dotenv";
 
  //#region src/type-generator/index.ts
@@ -39,7 +39,7 @@ async function generateFromEntryPoint(options) {
  let queryRegistry = [];
  if (queryFolder) queryRegistry = await generateQueriesFromDescribe(queryFolder, warehouseId, { noCache });
  const typeDeclarations = generateTypeDeclarations(queryRegistry);
- fs.writeFileSync(outFile, typeDeclarations, "utf-8");
+ await fs.writeFile(outFile, typeDeclarations, "utf-8");
  logger.debug("Type generation complete!");
  }
 
@@ -1 +1 @@
- {"version":3,"file":"index.js","names":[],"sources":["../../src/type-generator/index.ts"],"sourcesContent":["import fs from \"node:fs\";\nimport dotenv from \"dotenv\";\nimport { createLogger } from \"../logging/logger\";\nimport { generateQueriesFromDescribe } from \"./query-registry\";\nimport type { QuerySchema } from \"./types\";\n\ndotenv.config();\n\nconst logger = createLogger(\"type-generator\");\n\n/**\n * Generate type declarations for QueryRegistry\n * Create the d.ts file from the plugin routes and query schemas\n * @param querySchemas - the list of query schemas\n * @returns - the type declarations as a string\n */\nfunction generateTypeDeclarations(querySchemas: QuerySchema[] = []): string {\n const queryEntries = querySchemas\n .map(({ name, type }) => {\n const indentedType = type\n .split(\"\\n\")\n .map((line, i) => (i === 0 ? line : ` ${line}`))\n .join(\"\\n\");\n return ` ${name}: ${indentedType}`;\n })\n .join(\";\\n\");\n\n const querySection = queryEntries ? `\\n${queryEntries};\\n ` : \"\";\n\n return `// Auto-generated by AppKit - DO NOT EDIT\n// Generated by 'npx @databricks/appkit generate-types' or Vite plugin during build\nimport \"@databricks/appkit-ui/react\";\nimport type { SQLTypeMarker, SQLStringMarker, SQLNumberMarker, SQLBooleanMarker, SQLBinaryMarker, SQLDateMarker, SQLTimestampMarker } from \"@databricks/appkit-ui/js\";\n\ndeclare module \"@databricks/appkit-ui/react\" {\n interface QueryRegistry {${querySection}}\n}\n`;\n}\n\n/**\n * Entry point for generating type declarations from all imported files\n * @param options - the options for the generation\n * @param options.entryPoint - the entry point file\n * @param options.outFile - the output file\n * @param options.querySchemaFile - optional path to query schema file (e.g. 
config/queries/schema.ts)\n */\nexport async function generateFromEntryPoint(options: {\n outFile: string;\n queryFolder?: string;\n warehouseId: string;\n noCache?: boolean;\n}) {\n const { outFile, queryFolder, warehouseId, noCache } = options;\n\n logger.debug(\"Starting type generation...\");\n\n let queryRegistry: QuerySchema[] = [];\n if (queryFolder)\n queryRegistry = await generateQueriesFromDescribe(\n queryFolder,\n warehouseId,\n {\n noCache,\n },\n );\n\n const typeDeclarations = generateTypeDeclarations(queryRegistry);\n\n fs.writeFileSync(outFile, typeDeclarations, \"utf-8\");\n\n logger.debug(\"Type generation complete!\");\n}\n"],"mappings":";;;;;;AAMA,OAAO,QAAQ;AAEf,MAAM,SAAS,aAAa,iBAAiB;;;;;;;AAQ7C,SAAS,yBAAyB,eAA8B,EAAE,EAAU;CAC1E,MAAM,eAAe,aAClB,KAAK,EAAE,MAAM,WAAW;AAKvB,SAAO,OAAO,KAAK,IAJE,KAClB,MAAM,KAAK,CACX,KAAK,MAAM,MAAO,MAAM,IAAI,OAAO,OAAO,OAAQ,CAClD,KAAK,KAAK;GAEb,CACD,KAAK,MAAM;AAId,QAAO;;;;;;6BAFc,eAAe,KAAK,aAAa,SAAS,GAQvB;;;;;;;;;;;AAY1C,eAAsB,uBAAuB,SAK1C;CACD,MAAM,EAAE,SAAS,aAAa,aAAa,YAAY;AAEvD,QAAO,MAAM,8BAA8B;CAE3C,IAAI,gBAA+B,EAAE;AACrC,KAAI,YACF,iBAAgB,MAAM,4BACpB,aACA,aACA,EACE,SACD,CACF;CAEH,MAAM,mBAAmB,yBAAyB,cAAc;AAEhE,IAAG,cAAc,SAAS,kBAAkB,QAAQ;AAEpD,QAAO,MAAM,4BAA4B"}
+ {"version":3,"file":"index.js","names":[],"sources":["../../src/type-generator/index.ts"],"sourcesContent":["import fs from \"node:fs/promises\";\nimport dotenv from \"dotenv\";\nimport { createLogger } from \"../logging/logger\";\nimport { generateQueriesFromDescribe } from \"./query-registry\";\nimport type { QuerySchema } from \"./types\";\n\ndotenv.config();\n\nconst logger = createLogger(\"type-generator\");\n\n/**\n * Generate type declarations for QueryRegistry\n * Create the d.ts file from the plugin routes and query schemas\n * @param querySchemas - the list of query schemas\n * @returns - the type declarations as a string\n */\nfunction generateTypeDeclarations(querySchemas: QuerySchema[] = []): string {\n const queryEntries = querySchemas\n .map(({ name, type }) => {\n const indentedType = type\n .split(\"\\n\")\n .map((line, i) => (i === 0 ? line : ` ${line}`))\n .join(\"\\n\");\n return ` ${name}: ${indentedType}`;\n })\n .join(\";\\n\");\n\n const querySection = queryEntries ? `\\n${queryEntries};\\n ` : \"\";\n\n return `// Auto-generated by AppKit - DO NOT EDIT\n// Generated by 'npx @databricks/appkit generate-types' or Vite plugin during build\nimport \"@databricks/appkit-ui/react\";\nimport type { SQLTypeMarker, SQLStringMarker, SQLNumberMarker, SQLBooleanMarker, SQLBinaryMarker, SQLDateMarker, SQLTimestampMarker } from \"@databricks/appkit-ui/js\";\n\ndeclare module \"@databricks/appkit-ui/react\" {\n interface QueryRegistry {${querySection}}\n}\n`;\n}\n\n/**\n * Entry point for generating type declarations from all imported files\n * @param options - the options for the generation\n * @param options.entryPoint - the entry point file\n * @param options.outFile - the output file\n * @param options.querySchemaFile - optional path to query schema file (e.g. 
config/queries/schema.ts)\n */\nexport async function generateFromEntryPoint(options: {\n outFile: string;\n queryFolder?: string;\n warehouseId: string;\n noCache?: boolean;\n}) {\n const { outFile, queryFolder, warehouseId, noCache } = options;\n\n logger.debug(\"Starting type generation...\");\n\n let queryRegistry: QuerySchema[] = [];\n if (queryFolder)\n queryRegistry = await generateQueriesFromDescribe(\n queryFolder,\n warehouseId,\n {\n noCache,\n },\n );\n\n const typeDeclarations = generateTypeDeclarations(queryRegistry);\n\n await fs.writeFile(outFile, typeDeclarations, \"utf-8\");\n\n logger.debug(\"Type generation complete!\");\n}\n"],"mappings":";;;;;;AAMA,OAAO,QAAQ;AAEf,MAAM,SAAS,aAAa,iBAAiB;;;;;;;AAQ7C,SAAS,yBAAyB,eAA8B,EAAE,EAAU;CAC1E,MAAM,eAAe,aAClB,KAAK,EAAE,MAAM,WAAW;AAKvB,SAAO,OAAO,KAAK,IAJE,KAClB,MAAM,KAAK,CACX,KAAK,MAAM,MAAO,MAAM,IAAI,OAAO,OAAO,OAAQ,CAClD,KAAK,KAAK;GAEb,CACD,KAAK,MAAM;AAId,QAAO;;;;;;6BAFc,eAAe,KAAK,aAAa,SAAS,GAQvB;;;;;;;;;;;AAY1C,eAAsB,uBAAuB,SAK1C;CACD,MAAM,EAAE,SAAS,aAAa,aAAa,YAAY;AAEvD,QAAO,MAAM,8BAA8B;CAE3C,IAAI,gBAA+B,EAAE;AACrC,KAAI,YACF,iBAAgB,MAAM,4BACpB,aACA,aACA,EACE,SACD,CACF;CAEH,MAAM,mBAAmB,yBAAyB,cAAc;AAEhE,OAAM,GAAG,UAAU,SAAS,kBAAkB,QAAQ;AAEtD,QAAO,MAAM,4BAA4B"}
@@ -3,12 +3,29 @@ import { CACHE_VERSION, hashSQL, loadCache, saveCache } from "./cache.js";
3
3
  import { Spinner } from "./spinner.js";
4
4
  import { sqlTypeToHelper, sqlTypeToMarker } from "./types.js";
5
5
  import { WorkspaceClient } from "@databricks/sdk-experimental";
6
+ import fs from "node:fs/promises";
6
7
  import path from "node:path";
7
- import fs from "node:fs";
8
+ import pc from "picocolors";
8
9
 
9
10
  //#region src/type-generator/query-registry.ts
10
11
  const logger = createLogger("type-generator:query-registry");
11
12
  /**
13
+ * Parse a raw API/SDK error into a structured code + message.
14
+ * Handles Databricks-style JSON bodies embedded in the message string,
15
+ * e.g. `Response from server (Bad Request) {"error_code":"...","message":"..."}`.
16
+ */
17
+ function parseError(raw) {
18
+ const jsonMatch = raw.match(/\{[\s\S]*\}/);
19
+ if (jsonMatch) try {
20
+ const parsed = JSON.parse(jsonMatch[0]);
21
+ if (parsed.error_code || parsed.message) return {
22
+ code: parsed.error_code,
23
+ message: parsed.message || raw
24
+ };
25
+ } catch {}
26
+ return { message: raw };
27
+ }
28
+ /**
12
29
  * Extract parameters from a SQL query
13
30
  * @param sql - the SQL query to extract parameters from
14
31
  * @returns an array of parameter names
@@ -69,19 +86,6 @@ function generateUnknownResultQuery(sql, queryName) {
69
86
  result: unknown;
70
87
  }`;
71
88
  }
72
- function cacheFailedQuery(cache, querySchemas, sql, queryName, sqlHash) {
73
- const type = generateUnknownResultQuery(sql, queryName);
74
- querySchemas.push({
75
- name: queryName,
76
- type
77
- });
78
- cache.queries[queryName] = {
79
- hash: sqlHash,
80
- type,
81
- retry: true
82
- };
83
- saveCache(cache);
84
- }
85
89
  function extractParameterTypes(sql) {
86
90
  const paramTypes = {};
87
91
  const matches = sql.matchAll(/--\s*@param\s+(\w+)\s+(STRING|NUMERIC|BOOLEAN|DATE|TIMESTAMP|BINARY)/gi);
@@ -101,67 +105,176 @@ function extractParameterTypes(sql) {
101
105
  * @returns an array of query schemas
102
106
  */
103
107
  async function generateQueriesFromDescribe(queryFolder, warehouseId, options = {}) {
104
- const { noCache = false } = options;
105
- const queryFiles = fs.readdirSync(queryFolder).filter((file) => file.endsWith(".sql"));
106
- logger.debug("Found %d SQL queries", queryFiles.length);
107
- const cache = noCache ? {
108
+ const { noCache = false, concurrency: rawConcurrency = 10 } = options;
109
+ const concurrency = typeof rawConcurrency === "number" && Number.isFinite(rawConcurrency) ? Math.max(1, Math.floor(rawConcurrency)) : 10;
110
+ const [allFiles, cache] = await Promise.all([fs.readdir(queryFolder), noCache ? {
108
111
  version: CACHE_VERSION,
109
112
  queries: {}
110
- } : loadCache();
113
+ } : loadCache()]);
114
+ const queryFiles = allFiles.filter((file) => file.endsWith(".sql"));
115
+ logger.debug("Found %d SQL queries", queryFiles.length);
111
116
  const client = new WorkspaceClient({});
112
- const querySchemas = [];
113
117
  const spinner = new Spinner();
118
+ const sqlContents = await Promise.all(queryFiles.map((file) => fs.readFile(path.join(queryFolder, file), "utf8")));
119
+ const startTime = performance.now();
120
+ const cachedResults = [];
121
+ const uncachedQueries = [];
122
+ const logEntries = [];
114
123
  for (let i = 0; i < queryFiles.length; i++) {
115
124
  const file = queryFiles[i];
116
125
  const queryName = normalizeQueryName(path.basename(file, ".sql"));
117
- const sql = fs.readFileSync(path.join(queryFolder, file), "utf8");
126
+ const sql = sqlContents[i];
118
127
  const sqlHash = hashSQL(sql);
119
128
  const cached = cache.queries[queryName];
120
129
  if (cached && cached.hash === sqlHash && !cached.retry) {
121
- querySchemas.push({
122
- name: queryName,
123
- type: cached.type
130
+ cachedResults.push({
131
+ index: i,
132
+ schema: {
133
+ name: queryName,
134
+ type: cached.type
135
+ }
136
+ });
137
+ logEntries.push({
138
+ queryName,
139
+ status: "HIT"
140
+ });
141
+ } else {
142
+ const cleanedSql = sql.replace(/:([a-zA-Z_]\w*)/g, "''").trim().replace(/;\s*$/, "");
143
+ uncachedQueries.push({
144
+ index: i,
145
+ queryName,
146
+ sql,
147
+ sqlHash,
148
+ cleanedSql
124
149
  });
125
- spinner.start(`Processing ${queryName} (${i + 1}/${queryFiles.length})`);
126
- spinner.stop(`✓ ${queryName} (cached)`);
127
- continue;
128
150
  }
129
- spinner.start(`Processing ${queryName} (${i + 1}/${queryFiles.length})`);
130
- const cleanedSql = sql.replace(/:([a-zA-Z_]\w*)/g, "''").trim().replace(/;\s*$/, "");
131
- try {
151
+ }
152
+ const freshResults = [];
153
+ if (uncachedQueries.length > 0) {
154
+ let completed = 0;
155
+ const total = uncachedQueries.length;
156
+ spinner.start(`Describing ${total} ${total === 1 ? "query" : "queries"} (0/${total})`);
157
+ const describeOne = async ({ index, queryName, sql, sqlHash, cleanedSql }) => {
132
158
  const result = await client.statementExecution.executeStatement({
133
159
  statement: `DESCRIBE QUERY ${cleanedSql}`,
134
160
  warehouse_id: warehouseId
135
161
  });
162
+ completed++;
163
+ spinner.update(`Describing ${total} ${total === 1 ? "query" : "queries"} (${completed}/${total})`);
164
+ logger.debug("DESCRIBE result for %s: state=%s, rows=%d", queryName, result.status.state, result.result?.data_array?.length ?? 0);
136
165
  if (result.status.state === "FAILED") {
137
166
  const sqlError = result.status.error?.message || "Query execution failed";
138
- cacheFailedQuery(cache, querySchemas, sql, queryName, sqlHash);
139
- spinner.stop(`✗ ${queryName} - failed`);
140
- spinner.printDetail(`SQL Error: ${sqlError}`);
141
- spinner.printDetail(`Query: ${cleanedSql.slice(0, 200)}`);
142
- continue;
167
+ logger.warn("DESCRIBE failed for %s: %s", queryName, sqlError);
168
+ const type = generateUnknownResultQuery(sql, queryName);
169
+ return {
170
+ status: "fail",
171
+ index,
172
+ schema: {
173
+ name: queryName,
174
+ type
175
+ },
176
+ cacheEntry: {
177
+ hash: sqlHash,
178
+ type,
179
+ retry: true
180
+ },
181
+ error: parseError(sqlError)
182
+ };
143
183
  }
144
184
  const { type, hasResults } = convertToQueryType(result, sql, queryName);
145
- querySchemas.push({
146
- name: queryName,
147
- type
148
- });
149
- const retry = !hasResults;
150
- cache.queries[queryName] = {
151
- hash: sqlHash,
152
- type,
153
- retry
185
+ return {
186
+ status: "ok",
187
+ index,
188
+ schema: {
189
+ name: queryName,
190
+ type
191
+ },
192
+ cacheEntry: {
193
+ hash: sqlHash,
194
+ type,
195
+ retry: !hasResults
196
+ }
154
197
  };
155
- saveCache(cache);
156
- spinner.stop(`✓ ${queryName}`);
157
- } catch (error) {
158
- const errorMessage = error instanceof Error ? error.message : "Unknown error";
159
- spinner.stop(`✗ ${queryName}`);
160
- spinner.printDetail(errorMessage);
161
- cacheFailedQuery(cache, querySchemas, sql, queryName, sqlHash);
198
+ };
+ const processBatchResults = (settled, batchOffset) => {
+ for (let i = 0; i < settled.length; i++) {
+ const entry = settled[i];
+ const { queryName } = uncachedQueries[batchOffset + i];
+ if (entry.status === "fulfilled") {
+ const res = entry.value;
+ freshResults.push({
+ index: res.index,
+ schema: res.schema
+ });
+ cache.queries[queryName] = res.cacheEntry;
+ logEntries.push({
+ queryName,
+ status: "MISS",
+ failed: res.status === "fail",
+ error: res.status === "fail" ? res.error : void 0
+ });
+ } else {
+ const { sql, sqlHash, index } = uncachedQueries[batchOffset + i];
+ const reason = entry.reason instanceof Error ? entry.reason.message : String(entry.reason);
+ logger.warn("DESCRIBE rejected for %s: %s", queryName, reason);
+ const type = generateUnknownResultQuery(sql, queryName);
+ freshResults.push({
+ index,
+ schema: {
+ name: queryName,
+ type
+ }
+ });
+ cache.queries[queryName] = {
+ hash: sqlHash,
+ type,
+ retry: true
+ };
+ logEntries.push({
+ queryName,
+ status: "MISS",
+ failed: true,
+ error: parseError(reason)
+ });
+ }
+ }
+ };
+ if (uncachedQueries.length > concurrency) for (let b = 0; b < uncachedQueries.length; b += concurrency) {
+ const batch = uncachedQueries.slice(b, b + concurrency);
+ processBatchResults(await Promise.allSettled(batch.map(describeOne)), b);
+ await saveCache(cache);
+ }
+ else {
+ processBatchResults(await Promise.allSettled(uncachedQueries.map(describeOne)), 0);
+ await saveCache(cache);
+ }
+ spinner.stop("");
+ }
+ const elapsed = ((performance.now() - startTime) / 1e3).toFixed(2);
+ if (logEntries.length > 0) {
+ const maxNameLen = Math.max(...logEntries.map((e) => e.queryName.length));
+ const separator = pc.dim("─".repeat(50));
+ console.log("");
+ console.log(` ${pc.bold("Typegen Queries")} ${pc.dim(`(${logEntries.length})`)}`);
+ console.log(` ${separator}`);
+ for (const entry of logEntries) {
+ const tag = entry.failed ? pc.bold(pc.red("ERROR")) : entry.status === "HIT" ? `cache ${pc.bold(pc.green("HIT "))}` : `cache ${pc.bold(pc.yellow("MISS "))}`;
+ const rawName = entry.queryName.padEnd(maxNameLen);
+ const name = entry.failed ? pc.dim(pc.strikethrough(rawName)) : rawName;
+ const errorCode = entry.error?.message.match(/\[([^\]]+)\]/)?.[1];
+ const reason = errorCode ? ` ${pc.dim(errorCode)}` : "";
+ console.log(` ${tag} ${name}${reason}`);
  }
+ const newCount = logEntries.filter((e) => e.status === "MISS" && !e.failed).length;
+ const cacheCount = logEntries.filter((e) => e.status === "HIT" && !e.failed).length;
+ const errorCount = logEntries.filter((e) => e.failed).length;
+ console.log(` ${separator}`);
+ const parts = [`${newCount} new`, `${cacheCount} from cache`];
+ if (errorCount > 0) parts.push(`${errorCount} ${errorCount === 1 ? "error" : "errors"}`);
+ console.log(` ${parts.join(", ")}. ${pc.dim(`${elapsed}s`)}`);
+ console.log("");
  }
- return querySchemas;
+ return [...cachedResults, ...freshResults].sort((a, b) => a.index - b.index).map((r) => r.schema);
  }
  /**
  * Normalize query name by removing the .obo extension
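The hunk above replaces the old sequential per-query try/catch loop with bounded-concurrency batches of `DESCRIBE QUERY` calls, persisting the cache after each batch so successful results survive partial failures. A minimal, self-contained sketch of that batching pattern (the name `processInBatches` and the generic signature are illustrative, not part of the package):

```typescript
// Sketch of bounded-concurrency batching with Promise.allSettled.
// The real sync code additionally saves its type cache after each batch.
async function processInBatches<T, R>(
  items: readonly T[],
  concurrency: number,
  worker: (item: T) => Promise<R>,
): Promise<PromiseSettledResult<Awaited<R>>[]> {
  const results: PromiseSettledResult<Awaited<R>>[] = [];
  for (let b = 0; b < items.length; b += concurrency) {
    const batch = items.slice(b, b + concurrency);
    // allSettled never rejects, so one failed item does not abort the batch;
    // each entry reports "fulfilled" or "rejected" individually
    results.push(...(await Promise.allSettled(batch.map(worker))));
  }
  return results;
}
```

Because each settled entry carries its own status, the sync code can record a cache entry with `retry: true` for a failed query instead of aborting the whole run.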
@@ -1 +1 @@
- {"version":3,"file":"query-registry.js","names":[],"sources":["../../src/type-generator/query-registry.ts"],"sourcesContent":["import fs from \"node:fs\";\nimport path from \"node:path\";\nimport { WorkspaceClient } from \"@databricks/sdk-experimental\";\nimport { createLogger } from \"../logging/logger\";\nimport { CACHE_VERSION, hashSQL, loadCache, saveCache } from \"./cache\";\nimport { Spinner } from \"./spinner\";\nimport {\n type DatabricksStatementExecutionResponse,\n type QuerySchema,\n sqlTypeToHelper,\n sqlTypeToMarker,\n} from \"./types\";\n\nconst logger = createLogger(\"type-generator:query-registry\");\n\n/**\n * Extract parameters from a SQL query\n * @param sql - the SQL query to extract parameters from\n * @returns an array of parameter names\n */\nexport function extractParameters(sql: string): string[] {\n const matches = sql.matchAll(/:([a-zA-Z_]\\w*)/g);\n const params = new Set<string>();\n for (const match of matches) {\n params.add(match[1]);\n }\n return Array.from(params);\n}\n\n// parameters that are injected by the server\nexport const SERVER_INJECTED_PARAMS = [\"workspaceId\"];\n\n/**\n * Generates the TypeScript type literal for query parameters from SQL.\n * Shared by both the success and failure paths.\n */\nfunction formatParametersType(sql: string): string {\n const params = extractParameters(sql).filter(\n (p) => !SERVER_INJECTED_PARAMS.includes(p),\n );\n const paramTypes = extractParameterTypes(sql);\n\n return params.length > 0\n ? `{\\n ${params\n .map((p) => {\n const sqlType = paramTypes[p];\n const markerType = sqlType\n ? sqlTypeToMarker[sqlType]\n : \"SQLTypeMarker\";\n const helper = sqlType ? 
sqlTypeToHelper[sqlType] : \"sql.*()\";\n return `/** ${sqlType || \"any\"} - use ${helper} */\\n ${p}: ${markerType}`;\n })\n .join(\";\\n \")};\\n }`\n : \"Record<string, never>\";\n}\n\nexport function convertToQueryType(\n result: DatabricksStatementExecutionResponse,\n sql: string,\n queryName: string,\n): { type: string; hasResults: boolean } {\n const dataRows = result.result?.data_array || [];\n const columns = dataRows.map((row) => ({\n name: row[0] || \"\",\n type_name: row[1]?.toUpperCase() || \"STRING\",\n comment: row[2] || undefined,\n }));\n\n const paramsType = formatParametersType(sql);\n\n // generate result fields with JSDoc\n const resultFields = columns.map((column) => {\n const normalizedType = normalizeTypeName(column.type_name);\n const mappedType = typeMap[normalizedType] || \"unknown\";\n // validate column name is a valid identifier\n const name = /^[a-zA-Z_$][a-zA-Z0-9_$]*$/.test(column.name)\n ? column.name\n : `\"${column.name}\"`;\n\n // generate comment for column\n const comment = column.comment\n ? `/** ${column.comment} */\\n `\n : `/** @sqlType ${column.type_name} */\\n `;\n\n return `${comment}${name}: ${mappedType}`;\n });\n\n const hasResults = resultFields.length > 0;\n\n const type = `{\n name: \"${queryName}\";\n parameters: ${paramsType};\n result: ${\n hasResults\n ? 
`Array<{\n ${resultFields.join(\";\\n \")};\n }>`\n : \"unknown\"\n };\n }`;\n\n return { type, hasResults };\n}\n\n/**\n * Used when DESCRIBE QUERY fails so the query still appears in QueryRegistry.\n * Generates a type with unknown result from SQL alone (no warehouse call).\n */\nfunction generateUnknownResultQuery(sql: string, queryName: string): string {\n const paramsType = formatParametersType(sql);\n\n return `{\n name: \"${queryName}\";\n parameters: ${paramsType};\n result: unknown;\n }`;\n}\n\nfunction cacheFailedQuery(\n cache: ReturnType<typeof loadCache>,\n querySchemas: QuerySchema[],\n sql: string,\n queryName: string,\n sqlHash: string,\n): void {\n const type = generateUnknownResultQuery(sql, queryName);\n querySchemas.push({ name: queryName, type });\n cache.queries[queryName] = { hash: sqlHash, type, retry: true };\n saveCache(cache);\n}\n\nexport function extractParameterTypes(sql: string): Record<string, string> {\n const paramTypes: Record<string, string> = {};\n const regex =\n /--\\s*@param\\s+(\\w+)\\s+(STRING|NUMERIC|BOOLEAN|DATE|TIMESTAMP|BINARY)/gi;\n const matches = sql.matchAll(regex);\n for (const match of matches) {\n const [, paramName, paramType] = match;\n paramTypes[paramName] = paramType.toUpperCase();\n }\n\n return paramTypes;\n}\n\n/**\n * Generate query schemas from a folder of SQL files\n * It uses DESCRIBE QUERY to get the schema without executing the query\n * @param queryFolder - the folder containing the SQL files\n * @param warehouseId - the warehouse id to use for schema analysis\n * @param options - options for the query generation\n * @param options.noCache - if true, skip the cache and regenerate all types\n * @returns an array of query schemas\n */\nexport async function generateQueriesFromDescribe(\n queryFolder: string,\n warehouseId: string,\n options: { noCache?: boolean } = {},\n): Promise<QuerySchema[]> {\n const { noCache = false } = options;\n\n // read all query files in the folder\n const queryFiles = 
fs\n .readdirSync(queryFolder)\n .filter((file) => file.endsWith(\".sql\"));\n\n logger.debug(\"Found %d SQL queries\", queryFiles.length);\n\n // load cache\n const cache = noCache ? { version: CACHE_VERSION, queries: {} } : loadCache();\n\n const client = new WorkspaceClient({});\n const querySchemas: QuerySchema[] = [];\n const spinner = new Spinner();\n\n // process each query file\n for (let i = 0; i < queryFiles.length; i++) {\n const file = queryFiles[i];\n const rawName = path.basename(file, \".sql\");\n const queryName = normalizeQueryName(rawName);\n\n // read query file content\n const sql = fs.readFileSync(path.join(queryFolder, file), \"utf8\");\n const sqlHash = hashSQL(sql);\n\n // check cache (skip if marked for retry after a failed DESCRIBE)\n const cached = cache.queries[queryName];\n if (cached && cached.hash === sqlHash && !cached.retry) {\n querySchemas.push({ name: queryName, type: cached.type });\n spinner.start(`Processing ${queryName} (${i + 1}/${queryFiles.length})`);\n spinner.stop(`✓ ${queryName} (cached)`);\n continue;\n }\n\n spinner.start(`Processing ${queryName} (${i + 1}/${queryFiles.length})`);\n\n const sqlWithDefaults = sql.replace(/:([a-zA-Z_]\\w*)/g, \"''\");\n\n // strip trailing semicolon for DESCRIBE QUERY\n const cleanedSql = sqlWithDefaults.trim().replace(/;\\s*$/, \"\");\n\n // execute DESCRIBE QUERY to get schema without running the actual query\n try {\n const result = (await client.statementExecution.executeStatement({\n statement: `DESCRIBE QUERY ${cleanedSql}`,\n warehouse_id: warehouseId,\n })) as DatabricksStatementExecutionResponse;\n\n if (result.status.state === \"FAILED\") {\n const sqlError =\n result.status.error?.message || \"Query execution failed\";\n cacheFailedQuery(cache, querySchemas, sql, queryName, sqlHash);\n spinner.stop(`✗ ${queryName} - failed`);\n spinner.printDetail(`SQL Error: ${sqlError}`);\n spinner.printDetail(`Query: ${cleanedSql.slice(0, 200)}`);\n continue;\n }\n\n // convert result to 
query schema\n const { type, hasResults } = convertToQueryType(result, sql, queryName);\n querySchemas.push({ name: queryName, type });\n\n // update cache immediately so successful results survive partial failures\n // retry if DESCRIBE returned no columns (result: unknown)\n const retry = !hasResults;\n cache.queries[queryName] = { hash: sqlHash, type, retry };\n saveCache(cache);\n\n spinner.stop(`✓ ${queryName}`);\n } catch (error) {\n const errorMessage =\n error instanceof Error ? error.message : \"Unknown error\";\n spinner.stop(`✗ ${queryName}`);\n spinner.printDetail(errorMessage);\n cacheFailedQuery(cache, querySchemas, sql, queryName, sqlHash);\n }\n }\n\n return querySchemas;\n}\n\n/**\n * Normalize query name by removing the .obo extension\n * @param queryName - the query name to normalize\n * @returns the normalized query name\n */\nexport function normalizeQueryName(fileName: string): string {\n return fileName.replace(/\\.obo$/, \"\");\n}\n\n/**\n * Normalize SQL type name by removing parameters/generics\n * Examples:\n * DECIMAL(38,6) -> DECIMAL\n * ARRAY<STRING> -> ARRAY\n * MAP<STRING,INT> -> MAP\n * STRUCT<name:STRING> -> STRUCT\n * INTERVAL DAY TO SECOND -> INTERVAL\n * GEOGRAPHY(4326) -> GEOGRAPHY\n */\nexport function normalizeTypeName(typeName: string): string {\n return typeName\n .replace(/\\(.*\\)$/, \"\") // remove (p, s) eg: DECIMAL(38,6) -> DECIMAL\n .replace(/<.*>$/, \"\") // remove <T> eg: ARRAY<STRING> -> ARRAY\n .split(\" \")[0]; // take first word eg: INTERVAL DAY TO SECOND -> INTERVAL\n}\n\n/** Type Map for Databricks data types to JavaScript types */\nconst typeMap: Record<string, string> = {\n // string types\n STRING: \"string\",\n BINARY: \"string\",\n // boolean\n BOOLEAN: \"boolean\",\n // numeric types\n TINYINT: \"number\",\n SMALLINT: \"number\",\n INT: \"number\",\n BIGINT: \"number\",\n FLOAT: \"number\",\n DOUBLE: \"number\",\n DECIMAL: \"number\",\n // date/time types\n DATE: \"string\",\n TIMESTAMP: \"string\",\n 
TIMESTAMP_NTZ: \"string\",\n INTERVAL: \"string\",\n // complex types\n ARRAY: \"unknown[]\",\n MAP: \"Record<string, unknown>\",\n STRUCT: \"Record<string, unknown>\",\n OBJECT: \"Record<string, unknown>\",\n VARIANT: \"unknown\",\n // spatial types\n GEOGRAPHY: \"unknown\",\n GEOMETRY: \"unknown\",\n // null type\n VOID: \"null\",\n};\n"],"mappings":";;;;;;;;;AAaA,MAAM,SAAS,aAAa,gCAAgC;;;;;;AAO5D,SAAgB,kBAAkB,KAAuB;CACvD,MAAM,UAAU,IAAI,SAAS,mBAAmB;CAChD,MAAM,yBAAS,IAAI,KAAa;AAChC,MAAK,MAAM,SAAS,QAClB,QAAO,IAAI,MAAM,GAAG;AAEtB,QAAO,MAAM,KAAK,OAAO;;AAI3B,MAAa,yBAAyB,CAAC,cAAc;;;;;AAMrD,SAAS,qBAAqB,KAAqB;CACjD,MAAM,SAAS,kBAAkB,IAAI,CAAC,QACnC,MAAM,CAAC,uBAAuB,SAAS,EAAE,CAC3C;CACD,MAAM,aAAa,sBAAsB,IAAI;AAE7C,QAAO,OAAO,SAAS,IACnB,YAAY,OACT,KAAK,MAAM;EACV,MAAM,UAAU,WAAW;EAC3B,MAAM,aAAa,UACf,gBAAgB,WAChB;EACJ,MAAM,SAAS,UAAU,gBAAgB,WAAW;AACpD,SAAO,OAAO,WAAW,MAAM,SAAS,OAAO,aAAa,EAAE,IAAI;GAClE,CACD,KAAK,YAAY,CAAC,YACrB;;AAGN,SAAgB,mBACd,QACA,KACA,WACuC;CAEvC,MAAM,WADW,OAAO,QAAQ,cAAc,EAAE,EACvB,KAAK,SAAS;EACrC,MAAM,IAAI,MAAM;EAChB,WAAW,IAAI,IAAI,aAAa,IAAI;EACpC,SAAS,IAAI,MAAM;EACpB,EAAE;CAEH,MAAM,aAAa,qBAAqB,IAAI;CAG5C,MAAM,eAAe,QAAQ,KAAK,WAAW;EAE3C,MAAM,aAAa,QADI,kBAAkB,OAAO,UAAU,KACZ;EAE9C,MAAM,OAAO,6BAA6B,KAAK,OAAO,KAAK,GACvD,OAAO,OACP,IAAI,OAAO,KAAK;AAOpB,SAAO,GAJS,OAAO,UACnB,OAAO,OAAO,QAAQ,eACtB,gBAAgB,OAAO,UAAU,eAEjB,KAAK,IAAI;GAC7B;CAEF,MAAM,aAAa,aAAa,SAAS;AAczC,QAAO;EAAE,MAZI;aACF,UAAU;kBACL,WAAW;cAEvB,aACI;QACF,aAAa,KAAK,YAAY,CAAC;UAE7B,UACL;;EAGY;EAAY;;;;;;AAO7B,SAAS,2BAA2B,KAAa,WAA2B;AAG1E,QAAO;aACI,UAAU;kBAHF,qBAAqB,IAAI,CAIjB;;;;AAK7B,SAAS,iBACP,OACA,cACA,KACA,WACA,SACM;CACN,MAAM,OAAO,2BAA2B,KAAK,UAAU;AACvD,cAAa,KAAK;EAAE,MAAM;EAAW;EAAM,CAAC;AAC5C,OAAM,QAAQ,aAAa;EAAE,MAAM;EAAS;EAAM,OAAO;EAAM;AAC/D,WAAU,MAAM;;AAGlB,SAAgB,sBAAsB,KAAqC;CACzE,MAAM,aAAqC,EAAE;CAG7C,MAAM,UAAU,IAAI,SADlB,yEACiC;AACnC,MAAK,MAAM,SAAS,SAAS;EAC3B,MAAM,GAAG,WAAW,aAAa;AACjC,aAAW,aAAa,UAAU,aAAa;;AAGjD,QAAO;;;;;;;;;;;AAYT,eAAsB,4BACpB,aACA,aACA,UAAiC,EAAE,EACX;CACxB,MAAM,EAAE,UAAU,UAAU;CAG5B,
MAAM,aAAa,GAChB,YAAY,YAAY,CACxB,QAAQ,SAAS,KAAK,SAAS,OAAO,CAAC;AAE1C,QAAO,MAAM,wBAAwB,WAAW,OAAO;CAGvD,MAAM,QAAQ,UAAU;EAAE,SAAS;EAAe,SAAS,EAAE;EAAE,GAAG,WAAW;CAE7E,MAAM,SAAS,IAAI,gBAAgB,EAAE,CAAC;CACtC,MAAM,eAA8B,EAAE;CACtC,MAAM,UAAU,IAAI,SAAS;AAG7B,MAAK,IAAI,IAAI,GAAG,IAAI,WAAW,QAAQ,KAAK;EAC1C,MAAM,OAAO,WAAW;EAExB,MAAM,YAAY,mBADF,KAAK,SAAS,MAAM,OAAO,CACE;EAG7C,MAAM,MAAM,GAAG,aAAa,KAAK,KAAK,aAAa,KAAK,EAAE,OAAO;EACjE,MAAM,UAAU,QAAQ,IAAI;EAG5B,MAAM,SAAS,MAAM,QAAQ;AAC7B,MAAI,UAAU,OAAO,SAAS,WAAW,CAAC,OAAO,OAAO;AACtD,gBAAa,KAAK;IAAE,MAAM;IAAW,MAAM,OAAO;IAAM,CAAC;AACzD,WAAQ,MAAM,cAAc,UAAU,IAAI,IAAI,EAAE,GAAG,WAAW,OAAO,GAAG;AACxE,WAAQ,KAAK,KAAK,UAAU,WAAW;AACvC;;AAGF,UAAQ,MAAM,cAAc,UAAU,IAAI,IAAI,EAAE,GAAG,WAAW,OAAO,GAAG;EAKxE,MAAM,aAHkB,IAAI,QAAQ,oBAAoB,KAAK,CAG1B,MAAM,CAAC,QAAQ,SAAS,GAAG;AAG9D,MAAI;GACF,MAAM,SAAU,MAAM,OAAO,mBAAmB,iBAAiB;IAC/D,WAAW,kBAAkB;IAC7B,cAAc;IACf,CAAC;AAEF,OAAI,OAAO,OAAO,UAAU,UAAU;IACpC,MAAM,WACJ,OAAO,OAAO,OAAO,WAAW;AAClC,qBAAiB,OAAO,cAAc,KAAK,WAAW,QAAQ;AAC9D,YAAQ,KAAK,KAAK,UAAU,WAAW;AACvC,YAAQ,YAAY,cAAc,WAAW;AAC7C,YAAQ,YAAY,UAAU,WAAW,MAAM,GAAG,IAAI,GAAG;AACzD;;GAIF,MAAM,EAAE,MAAM,eAAe,mBAAmB,QAAQ,KAAK,UAAU;AACvE,gBAAa,KAAK;IAAE,MAAM;IAAW;IAAM,CAAC;GAI5C,MAAM,QAAQ,CAAC;AACf,SAAM,QAAQ,aAAa;IAAE,MAAM;IAAS;IAAM;IAAO;AACzD,aAAU,MAAM;AAEhB,WAAQ,KAAK,KAAK,YAAY;WACvB,OAAO;GACd,MAAM,eACJ,iBAAiB,QAAQ,MAAM,UAAU;AAC3C,WAAQ,KAAK,KAAK,YAAY;AAC9B,WAAQ,YAAY,aAAa;AACjC,oBAAiB,OAAO,cAAc,KAAK,WAAW,QAAQ;;;AAIlE,QAAO;;;;;;;AAQT,SAAgB,mBAAmB,UAA0B;AAC3D,QAAO,SAAS,QAAQ,UAAU,GAAG;;;;;;;;;;;;AAavC,SAAgB,kBAAkB,UAA0B;AAC1D,QAAO,SACJ,QAAQ,WAAW,GAAG,CACtB,QAAQ,SAAS,GAAG,CACpB,MAAM,IAAI,CAAC;;;AAIhB,MAAM,UAAkC;CAEtC,QAAQ;CACR,QAAQ;CAER,SAAS;CAET,SAAS;CACT,UAAU;CACV,KAAK;CACL,QAAQ;CACR,OAAO;CACP,QAAQ;CACR,SAAS;CAET,MAAM;CACN,WAAW;CACX,eAAe;CACf,UAAU;CAEV,OAAO;CACP,KAAK;CACL,QAAQ;CACR,QAAQ;CACR,SAAS;CAET,WAAW;CACX,UAAU;CAEV,MAAM;CACP"}
+ {"version":3,"file":"query-registry.js","names":[],"sources":["../../src/type-generator/query-registry.ts"],"sourcesContent":["import fs from \"node:fs/promises\";\nimport path from \"node:path\";\nimport { WorkspaceClient } from \"@databricks/sdk-experimental\";\nimport pc from \"picocolors\";\nimport { createLogger } from \"../logging/logger\";\nimport { CACHE_VERSION, hashSQL, loadCache, saveCache } from \"./cache\";\nimport { Spinner } from \"./spinner\";\nimport {\n type DatabricksStatementExecutionResponse,\n type QuerySchema,\n sqlTypeToHelper,\n sqlTypeToMarker,\n} from \"./types\";\n\nconst logger = createLogger(\"type-generator:query-registry\");\n\n/**\n * Parse a raw API/SDK error into a structured code + message.\n * Handles Databricks-style JSON bodies embedded in the message string,\n * e.g. `Response from server (Bad Request) {\"error_code\":\"...\",\"message\":\"...\"}`.\n */\nfunction parseError(raw: string): { code?: string; message: string } {\n const jsonMatch = raw.match(/\\{[\\s\\S]*\\}/);\n if (jsonMatch) {\n try {\n const parsed = JSON.parse(jsonMatch[0]);\n if (parsed.error_code || parsed.message) {\n return {\n code: parsed.error_code,\n message: parsed.message || raw,\n };\n }\n } catch {\n // not valid JSON, fall through\n }\n }\n return { message: raw };\n}\n\n/**\n * Extract parameters from a SQL query\n * @param sql - the SQL query to extract parameters from\n * @returns an array of parameter names\n */\nexport function extractParameters(sql: string): string[] {\n const matches = sql.matchAll(/:([a-zA-Z_]\\w*)/g);\n const params = new Set<string>();\n for (const match of matches) {\n params.add(match[1]);\n }\n return Array.from(params);\n}\n\n// parameters that are injected by the server\nexport const SERVER_INJECTED_PARAMS = [\"workspaceId\"];\n\n/**\n * Generates the TypeScript type literal for query parameters from SQL.\n * Shared by both the success and failure paths.\n */\nfunction formatParametersType(sql: string): string 
{\n const params = extractParameters(sql).filter(\n (p) => !SERVER_INJECTED_PARAMS.includes(p),\n );\n const paramTypes = extractParameterTypes(sql);\n\n return params.length > 0\n ? `{\\n ${params\n .map((p) => {\n const sqlType = paramTypes[p];\n const markerType = sqlType\n ? sqlTypeToMarker[sqlType]\n : \"SQLTypeMarker\";\n const helper = sqlType ? sqlTypeToHelper[sqlType] : \"sql.*()\";\n return `/** ${sqlType || \"any\"} - use ${helper} */\\n ${p}: ${markerType}`;\n })\n .join(\";\\n \")};\\n }`\n : \"Record<string, never>\";\n}\n\nexport function convertToQueryType(\n result: DatabricksStatementExecutionResponse,\n sql: string,\n queryName: string,\n): { type: string; hasResults: boolean } {\n const dataRows = result.result?.data_array || [];\n const columns = dataRows.map((row) => ({\n name: row[0] || \"\",\n type_name: row[1]?.toUpperCase() || \"STRING\",\n comment: row[2] || undefined,\n }));\n\n const paramsType = formatParametersType(sql);\n\n // generate result fields with JSDoc\n const resultFields = columns.map((column) => {\n const normalizedType = normalizeTypeName(column.type_name);\n const mappedType = typeMap[normalizedType] || \"unknown\";\n // validate column name is a valid identifier\n const name = /^[a-zA-Z_$][a-zA-Z0-9_$]*$/.test(column.name)\n ? column.name\n : `\"${column.name}\"`;\n\n // generate comment for column\n const comment = column.comment\n ? `/** ${column.comment} */\\n `\n : `/** @sqlType ${column.type_name} */\\n `;\n\n return `${comment}${name}: ${mappedType}`;\n });\n\n const hasResults = resultFields.length > 0;\n\n const type = `{\n name: \"${queryName}\";\n parameters: ${paramsType};\n result: ${\n hasResults\n ? 
`Array<{\n ${resultFields.join(\";\\n \")};\n }>`\n : \"unknown\"\n };\n }`;\n\n return { type, hasResults };\n}\n\n/**\n * Used when DESCRIBE QUERY fails so the query still appears in QueryRegistry.\n * Generates a type with unknown result from SQL alone (no warehouse call).\n */\nfunction generateUnknownResultQuery(sql: string, queryName: string): string {\n const paramsType = formatParametersType(sql);\n\n return `{\n name: \"${queryName}\";\n parameters: ${paramsType};\n result: unknown;\n }`;\n}\n\nexport function extractParameterTypes(sql: string): Record<string, string> {\n const paramTypes: Record<string, string> = {};\n const regex =\n /--\\s*@param\\s+(\\w+)\\s+(STRING|NUMERIC|BOOLEAN|DATE|TIMESTAMP|BINARY)/gi;\n const matches = sql.matchAll(regex);\n for (const match of matches) {\n const [, paramName, paramType] = match;\n paramTypes[paramName] = paramType.toUpperCase();\n }\n\n return paramTypes;\n}\n\n/**\n * Generate query schemas from a folder of SQL files\n * It uses DESCRIBE QUERY to get the schema without executing the query\n * @param queryFolder - the folder containing the SQL files\n * @param warehouseId - the warehouse id to use for schema analysis\n * @param options - options for the query generation\n * @param options.noCache - if true, skip the cache and regenerate all types\n * @returns an array of query schemas\n */\nexport async function generateQueriesFromDescribe(\n queryFolder: string,\n warehouseId: string,\n options: { noCache?: boolean; concurrency?: number } = {},\n): Promise<QuerySchema[]> {\n const { noCache = false, concurrency: rawConcurrency = 10 } = options;\n const concurrency =\n typeof rawConcurrency === \"number\" && Number.isFinite(rawConcurrency)\n ? Math.max(1, Math.floor(rawConcurrency))\n : 10;\n\n // read all query files and cache in parallel\n const [allFiles, cache] = await Promise.all([\n fs.readdir(queryFolder),\n noCache\n ? 
({ version: CACHE_VERSION, queries: {} } as Awaited<\n ReturnType<typeof loadCache>\n >)\n : loadCache(),\n ]);\n\n const queryFiles = allFiles.filter((file) => file.endsWith(\".sql\"));\n logger.debug(\"Found %d SQL queries\", queryFiles.length);\n\n const client = new WorkspaceClient({});\n const spinner = new Spinner();\n\n // Read all SQL files in parallel\n const sqlContents = await Promise.all(\n queryFiles.map((file) => fs.readFile(path.join(queryFolder, file), \"utf8\")),\n );\n\n const startTime = performance.now();\n\n // Phase 1: Check cache, separate cached vs uncached\n const cachedResults: Array<{ index: number; schema: QuerySchema }> = [];\n const uncachedQueries: Array<{\n index: number;\n queryName: string;\n sql: string;\n sqlHash: string;\n cleanedSql: string;\n }> = [];\n const logEntries: Array<{\n queryName: string;\n status: \"HIT\" | \"MISS\";\n failed?: boolean;\n error?: { code?: string; message: string };\n }> = [];\n\n for (let i = 0; i < queryFiles.length; i++) {\n const file = queryFiles[i];\n const rawName = path.basename(file, \".sql\");\n const queryName = normalizeQueryName(rawName);\n\n const sql = sqlContents[i];\n const sqlHash = hashSQL(sql);\n\n const cached = cache.queries[queryName];\n if (cached && cached.hash === sqlHash && !cached.retry) {\n cachedResults.push({\n index: i,\n schema: { name: queryName, type: cached.type },\n });\n logEntries.push({ queryName, status: \"HIT\" });\n } else {\n const sqlWithDefaults = sql.replace(/:([a-zA-Z_]\\w*)/g, \"''\");\n const cleanedSql = sqlWithDefaults.trim().replace(/;\\s*$/, \"\");\n uncachedQueries.push({ index: i, queryName, sql, sqlHash, cleanedSql });\n }\n }\n\n // Phase 2: Execute all uncached DESCRIBE calls in parallel\n type DescribeResult =\n | {\n status: \"ok\";\n index: number;\n schema: QuerySchema;\n cacheEntry: { hash: string; type: string; retry: boolean };\n }\n | {\n status: \"fail\";\n index: number;\n schema: QuerySchema;\n cacheEntry: { hash: string; type: 
string; retry: boolean };\n error: { code?: string; message: string };\n };\n\n const freshResults: Array<{ index: number; schema: QuerySchema }> = [];\n\n if (uncachedQueries.length > 0) {\n let completed = 0;\n const total = uncachedQueries.length;\n spinner.start(\n `Describing ${total} ${total === 1 ? \"query\" : \"queries\"} (0/${total})`,\n );\n\n const describeOne = async ({\n index,\n queryName,\n sql,\n sqlHash,\n cleanedSql,\n }: (typeof uncachedQueries)[number]): Promise<DescribeResult> => {\n const result = (await client.statementExecution.executeStatement({\n statement: `DESCRIBE QUERY ${cleanedSql}`,\n warehouse_id: warehouseId,\n })) as DatabricksStatementExecutionResponse;\n\n completed++;\n spinner.update(\n `Describing ${total} ${total === 1 ? \"query\" : \"queries\"} (${completed}/${total})`,\n );\n\n logger.debug(\n \"DESCRIBE result for %s: state=%s, rows=%d\",\n queryName,\n result.status.state,\n result.result?.data_array?.length ?? 0,\n );\n\n if (result.status.state === \"FAILED\") {\n const sqlError =\n result.status.error?.message || \"Query execution failed\";\n logger.warn(\"DESCRIBE failed for %s: %s\", queryName, sqlError);\n const type = generateUnknownResultQuery(sql, queryName);\n return {\n status: \"fail\",\n index,\n schema: { name: queryName, type },\n cacheEntry: { hash: sqlHash, type, retry: true },\n error: parseError(sqlError),\n };\n }\n\n const { type, hasResults } = convertToQueryType(result, sql, queryName);\n return {\n status: \"ok\",\n index,\n schema: { name: queryName, type },\n cacheEntry: { hash: sqlHash, type, retry: !hasResults },\n };\n };\n\n // Process in chunks, saving cache after each chunk\n const processBatchResults = (\n settled: PromiseSettledResult<DescribeResult>[],\n batchOffset: number,\n ) => {\n for (let i = 0; i < settled.length; i++) {\n const entry = settled[i];\n const { queryName } = uncachedQueries[batchOffset + i];\n\n if (entry.status === \"fulfilled\") {\n const res = entry.value;\n 
freshResults.push({ index: res.index, schema: res.schema });\n cache.queries[queryName] = res.cacheEntry;\n logEntries.push({\n queryName,\n status: \"MISS\",\n failed: res.status === \"fail\",\n error: res.status === \"fail\" ? res.error : undefined,\n });\n } else {\n const { sql, sqlHash, index } = uncachedQueries[batchOffset + i];\n const reason =\n entry.reason instanceof Error\n ? entry.reason.message\n : String(entry.reason);\n logger.warn(\"DESCRIBE rejected for %s: %s\", queryName, reason);\n const type = generateUnknownResultQuery(sql, queryName);\n freshResults.push({ index, schema: { name: queryName, type } });\n cache.queries[queryName] = { hash: sqlHash, type, retry: true };\n logEntries.push({\n queryName,\n status: \"MISS\",\n failed: true,\n error: parseError(reason),\n });\n }\n }\n };\n\n if (uncachedQueries.length > concurrency) {\n for (let b = 0; b < uncachedQueries.length; b += concurrency) {\n const batch = uncachedQueries.slice(b, b + concurrency);\n const batchResults = await Promise.allSettled(batch.map(describeOne));\n processBatchResults(batchResults, b);\n await saveCache(cache);\n }\n } else {\n const settled = await Promise.allSettled(\n uncachedQueries.map(describeOne),\n );\n processBatchResults(settled, 0);\n await saveCache(cache);\n }\n\n spinner.stop(\"\");\n }\n\n const elapsed = ((performance.now() - startTime) / 1000).toFixed(2);\n\n // Print formatted table\n if (logEntries.length > 0) {\n const maxNameLen = Math.max(...logEntries.map((e) => e.queryName.length));\n const separator = pc.dim(\"─\".repeat(50));\n console.log(\"\");\n console.log(\n ` ${pc.bold(\"Typegen Queries\")} ${pc.dim(`(${logEntries.length})`)}`,\n );\n console.log(` ${separator}`);\n for (const entry of logEntries) {\n const tag = entry.failed\n ? pc.bold(pc.red(\"ERROR\"))\n : entry.status === \"HIT\"\n ? 
`cache ${pc.bold(pc.green(\"HIT \"))}`\n : `cache ${pc.bold(pc.yellow(\"MISS \"))}`;\n const rawName = entry.queryName.padEnd(maxNameLen);\n const name = entry.failed ? pc.dim(pc.strikethrough(rawName)) : rawName;\n const errorCode = entry.error?.message.match(/\\[([^\\]]+)\\]/)?.[1];\n const reason = errorCode ? ` ${pc.dim(errorCode)}` : \"\";\n console.log(` ${tag} ${name}${reason}`);\n }\n const newCount = logEntries.filter(\n (e) => e.status === \"MISS\" && !e.failed,\n ).length;\n const cacheCount = logEntries.filter(\n (e) => e.status === \"HIT\" && !e.failed,\n ).length;\n const errorCount = logEntries.filter((e) => e.failed).length;\n console.log(` ${separator}`);\n const parts = [`${newCount} new`, `${cacheCount} from cache`];\n if (errorCount > 0)\n parts.push(`${errorCount} ${errorCount === 1 ? \"error\" : \"errors\"}`);\n console.log(` ${parts.join(\", \")}. ${pc.dim(`${elapsed}s`)}`);\n console.log(\"\");\n }\n\n // Merge and sort by original file index for deterministic output\n return [...cachedResults, ...freshResults]\n .sort((a, b) => a.index - b.index)\n .map((r) => r.schema);\n}\n\n/**\n * Normalize query name by removing the .obo extension\n * @param queryName - the query name to normalize\n * @returns the normalized query name\n */\nexport function normalizeQueryName(fileName: string): string {\n return fileName.replace(/\\.obo$/, \"\");\n}\n\n/**\n * Normalize SQL type name by removing parameters/generics\n * Examples:\n * DECIMAL(38,6) -> DECIMAL\n * ARRAY<STRING> -> ARRAY\n * MAP<STRING,INT> -> MAP\n * STRUCT<name:STRING> -> STRUCT\n * INTERVAL DAY TO SECOND -> INTERVAL\n * GEOGRAPHY(4326) -> GEOGRAPHY\n */\nexport function normalizeTypeName(typeName: string): string {\n return typeName\n .replace(/\\(.*\\)$/, \"\") // remove (p, s) eg: DECIMAL(38,6) -> DECIMAL\n .replace(/<.*>$/, \"\") // remove <T> eg: ARRAY<STRING> -> ARRAY\n .split(\" \")[0]; // take first word eg: INTERVAL DAY TO SECOND -> INTERVAL\n}\n\n/** Type Map for Databricks 
data types to JavaScript types */\nconst typeMap: Record<string, string> = {\n // string types\n STRING: \"string\",\n BINARY: \"string\",\n // boolean\n BOOLEAN: \"boolean\",\n // numeric types\n TINYINT: \"number\",\n SMALLINT: \"number\",\n INT: \"number\",\n BIGINT: \"number\",\n FLOAT: \"number\",\n DOUBLE: \"number\",\n DECIMAL: \"number\",\n // date/time types\n DATE: \"string\",\n TIMESTAMP: \"string\",\n TIMESTAMP_NTZ: \"string\",\n INTERVAL: \"string\",\n // complex types\n ARRAY: \"unknown[]\",\n MAP: \"Record<string, unknown>\",\n STRUCT: \"Record<string, unknown>\",\n OBJECT: \"Record<string, unknown>\",\n VARIANT: \"unknown\",\n // spatial types\n GEOGRAPHY: \"unknown\",\n GEOMETRY: \"unknown\",\n // null type\n VOID: \"null\",\n};\n"],"mappings":";;;;;;;;;;AAcA,MAAM,SAAS,aAAa,gCAAgC;;;;;;AAO5D,SAAS,WAAW,KAAiD;CACnE,MAAM,YAAY,IAAI,MAAM,cAAc;AAC1C,KAAI,UACF,KAAI;EACF,MAAM,SAAS,KAAK,MAAM,UAAU,GAAG;AACvC,MAAI,OAAO,cAAc,OAAO,QAC9B,QAAO;GACL,MAAM,OAAO;GACb,SAAS,OAAO,WAAW;GAC5B;SAEG;AAIV,QAAO,EAAE,SAAS,KAAK;;;;;;;AAQzB,SAAgB,kBAAkB,KAAuB;CACvD,MAAM,UAAU,IAAI,SAAS,mBAAmB;CAChD,MAAM,yBAAS,IAAI,KAAa;AAChC,MAAK,MAAM,SAAS,QAClB,QAAO,IAAI,MAAM,GAAG;AAEtB,QAAO,MAAM,KAAK,OAAO;;AAI3B,MAAa,yBAAyB,CAAC,cAAc;;;;;AAMrD,SAAS,qBAAqB,KAAqB;CACjD,MAAM,SAAS,kBAAkB,IAAI,CAAC,QACnC,MAAM,CAAC,uBAAuB,SAAS,EAAE,CAC3C;CACD,MAAM,aAAa,sBAAsB,IAAI;AAE7C,QAAO,OAAO,SAAS,IACnB,YAAY,OACT,KAAK,MAAM;EACV,MAAM,UAAU,WAAW;EAC3B,MAAM,aAAa,UACf,gBAAgB,WAChB;EACJ,MAAM,SAAS,UAAU,gBAAgB,WAAW;AACpD,SAAO,OAAO,WAAW,MAAM,SAAS,OAAO,aAAa,EAAE,IAAI;GAClE,CACD,KAAK,YAAY,CAAC,YACrB;;AAGN,SAAgB,mBACd,QACA,KACA,WACuC;CAEvC,MAAM,WADW,OAAO,QAAQ,cAAc,EAAE,EACvB,KAAK,SAAS;EACrC,MAAM,IAAI,MAAM;EAChB,WAAW,IAAI,IAAI,aAAa,IAAI;EACpC,SAAS,IAAI,MAAM;EACpB,EAAE;CAEH,MAAM,aAAa,qBAAqB,IAAI;CAG5C,MAAM,eAAe,QAAQ,KAAK,WAAW;EAE3C,MAAM,aAAa,QADI,kBAAkB,OAAO,UAAU,KACZ;EAE9C,MAAM,OAAO,6BAA6B,KAAK,OAAO,KAAK,GACvD,OAAO,OACP,IAAI,OAAO,KAAK;AAOpB,SAAO,GAJS,OAAO,UACnB,OAAO,OAAO,QAAQ,eACtB,gBAAgB,OAAO,UAAU,eAEjB,KAAK,IAAI;GAC7B;CAEF,M
AAM,aAAa,aAAa,SAAS;AAczC,QAAO;EAAE,MAZI;aACF,UAAU;kBACL,WAAW;cAEvB,aACI;QACF,aAAa,KAAK,YAAY,CAAC;UAE7B,UACL;;EAGY;EAAY;;;;;;AAO7B,SAAS,2BAA2B,KAAa,WAA2B;AAG1E,QAAO;aACI,UAAU;kBAHF,qBAAqB,IAAI,CAIjB;;;;AAK7B,SAAgB,sBAAsB,KAAqC;CACzE,MAAM,aAAqC,EAAE;CAG7C,MAAM,UAAU,IAAI,SADlB,yEACiC;AACnC,MAAK,MAAM,SAAS,SAAS;EAC3B,MAAM,GAAG,WAAW,aAAa;AACjC,aAAW,aAAa,UAAU,aAAa;;AAGjD,QAAO;;;;;;;;;;;AAYT,eAAsB,4BACpB,aACA,aACA,UAAuD,EAAE,EACjC;CACxB,MAAM,EAAE,UAAU,OAAO,aAAa,iBAAiB,OAAO;CAC9D,MAAM,cACJ,OAAO,mBAAmB,YAAY,OAAO,SAAS,eAAe,GACjE,KAAK,IAAI,GAAG,KAAK,MAAM,eAAe,CAAC,GACvC;CAGN,MAAM,CAAC,UAAU,SAAS,MAAM,QAAQ,IAAI,CAC1C,GAAG,QAAQ,YAAY,EACvB,UACK;EAAE,SAAS;EAAe,SAAS,EAAE;EAAE,GAGxC,WAAW,CAChB,CAAC;CAEF,MAAM,aAAa,SAAS,QAAQ,SAAS,KAAK,SAAS,OAAO,CAAC;AACnE,QAAO,MAAM,wBAAwB,WAAW,OAAO;CAEvD,MAAM,SAAS,IAAI,gBAAgB,EAAE,CAAC;CACtC,MAAM,UAAU,IAAI,SAAS;CAG7B,MAAM,cAAc,MAAM,QAAQ,IAChC,WAAW,KAAK,SAAS,GAAG,SAAS,KAAK,KAAK,aAAa,KAAK,EAAE,OAAO,CAAC,CAC5E;CAED,MAAM,YAAY,YAAY,KAAK;CAGnC,MAAM,gBAA+D,EAAE;CACvE,MAAM,kBAMD,EAAE;CACP,MAAM,aAKD,EAAE;AAEP,MAAK,IAAI,IAAI,GAAG,IAAI,WAAW,QAAQ,KAAK;EAC1C,MAAM,OAAO,WAAW;EAExB,MAAM,YAAY,mBADF,KAAK,SAAS,MAAM,OAAO,CACE;EAE7C,MAAM,MAAM,YAAY;EACxB,MAAM,UAAU,QAAQ,IAAI;EAE5B,MAAM,SAAS,MAAM,QAAQ;AAC7B,MAAI,UAAU,OAAO,SAAS,WAAW,CAAC,OAAO,OAAO;AACtD,iBAAc,KAAK;IACjB,OAAO;IACP,QAAQ;KAAE,MAAM;KAAW,MAAM,OAAO;KAAM;IAC/C,CAAC;AACF,cAAW,KAAK;IAAE;IAAW,QAAQ;IAAO,CAAC;SACxC;GAEL,MAAM,aADkB,IAAI,QAAQ,oBAAoB,KAAK,CAC1B,MAAM,CAAC,QAAQ,SAAS,GAAG;AAC9D,mBAAgB,KAAK;IAAE,OAAO;IAAG;IAAW;IAAK;IAAS;IAAY,CAAC;;;CAoB3E,MAAM,eAA8D,EAAE;AAEtE,KAAI,gBAAgB,SAAS,GAAG;EAC9B,IAAI,YAAY;EAChB,MAAM,QAAQ,gBAAgB;AAC9B,UAAQ,MACN,cAAc,MAAM,GAAG,UAAU,IAAI,UAAU,UAAU,MAAM,MAAM,GACtE;EAED,MAAM,cAAc,OAAO,EACzB,OACA,WACA,KACA,SACA,iBAC+D;GAC/D,MAAM,SAAU,MAAM,OAAO,mBAAmB,iBAAiB;IAC/D,WAAW,kBAAkB;IAC7B,cAAc;IACf,CAAC;AAEF;AACA,WAAQ,OACN,cAAc,MAAM,GAAG,UAAU,IAAI,UAAU,UAAU,IAAI,UAAU,GAAG,MAAM,GACjF;AAED,UAAO,MACL,6CACA,WACA,OAAO,OAAO,OACd,OAAO,QAAQ,YAAY,UAAU,EACtC;AAED,OAAI,OAAO,OAAO,UAAU,UAAU;IACpC,MAAM,WACJ,OAAO,OA
AO,OAAO,WAAW;AAClC,WAAO,KAAK,8BAA8B,WAAW,SAAS;IAC9D,MAAM,OAAO,2BAA2B,KAAK,UAAU;AACvD,WAAO;KACL,QAAQ;KACR;KACA,QAAQ;MAAE,MAAM;MAAW;MAAM;KACjC,YAAY;MAAE,MAAM;MAAS;MAAM,OAAO;MAAM;KAChD,OAAO,WAAW,SAAS;KAC5B;;GAGH,MAAM,EAAE,MAAM,eAAe,mBAAmB,QAAQ,KAAK,UAAU;AACvE,UAAO;IACL,QAAQ;IACR;IACA,QAAQ;KAAE,MAAM;KAAW;KAAM;IACjC,YAAY;KAAE,MAAM;KAAS;KAAM,OAAO,CAAC;KAAY;IACxD;;EAIH,MAAM,uBACJ,SACA,gBACG;AACH,QAAK,IAAI,IAAI,GAAG,IAAI,QAAQ,QAAQ,KAAK;IACvC,MAAM,QAAQ,QAAQ;IACtB,MAAM,EAAE,cAAc,gBAAgB,cAAc;AAEpD,QAAI,MAAM,WAAW,aAAa;KAChC,MAAM,MAAM,MAAM;AAClB,kBAAa,KAAK;MAAE,OAAO,IAAI;MAAO,QAAQ,IAAI;MAAQ,CAAC;AAC3D,WAAM,QAAQ,aAAa,IAAI;AAC/B,gBAAW,KAAK;MACd;MACA,QAAQ;MACR,QAAQ,IAAI,WAAW;MACvB,OAAO,IAAI,WAAW,SAAS,IAAI,QAAQ;MAC5C,CAAC;WACG;KACL,MAAM,EAAE,KAAK,SAAS,UAAU,gBAAgB,cAAc;KAC9D,MAAM,SACJ,MAAM,kBAAkB,QACpB,MAAM,OAAO,UACb,OAAO,MAAM,OAAO;AAC1B,YAAO,KAAK,gCAAgC,WAAW,OAAO;KAC9D,MAAM,OAAO,2BAA2B,KAAK,UAAU;AACvD,kBAAa,KAAK;MAAE;MAAO,QAAQ;OAAE,MAAM;OAAW;OAAM;MAAE,CAAC;AAC/D,WAAM,QAAQ,aAAa;MAAE,MAAM;MAAS;MAAM,OAAO;MAAM;AAC/D,gBAAW,KAAK;MACd;MACA,QAAQ;MACR,QAAQ;MACR,OAAO,WAAW,OAAO;MAC1B,CAAC;;;;AAKR,MAAI,gBAAgB,SAAS,YAC3B,MAAK,IAAI,IAAI,GAAG,IAAI,gBAAgB,QAAQ,KAAK,aAAa;GAC5D,MAAM,QAAQ,gBAAgB,MAAM,GAAG,IAAI,YAAY;AAEvD,uBADqB,MAAM,QAAQ,WAAW,MAAM,IAAI,YAAY,CAAC,EACnC,EAAE;AACpC,SAAM,UAAU,MAAM;;OAEnB;AAIL,uBAHgB,MAAM,QAAQ,WAC5B,gBAAgB,IAAI,YAAY,CACjC,EAC4B,EAAE;AAC/B,SAAM,UAAU,MAAM;;AAGxB,UAAQ,KAAK,GAAG;;CAGlB,MAAM,YAAY,YAAY,KAAK,GAAG,aAAa,KAAM,QAAQ,EAAE;AAGnE,KAAI,WAAW,SAAS,GAAG;EACzB,MAAM,aAAa,KAAK,IAAI,GAAG,WAAW,KAAK,MAAM,EAAE,UAAU,OAAO,CAAC;EACzE,MAAM,YAAY,GAAG,IAAI,IAAI,OAAO,GAAG,CAAC;AACxC,UAAQ,IAAI,GAAG;AACf,UAAQ,IACN,KAAK,GAAG,KAAK,kBAAkB,CAAC,GAAG,GAAG,IAAI,IAAI,WAAW,OAAO,GAAG,GACpE;AACD,UAAQ,IAAI,KAAK,YAAY;AAC7B,OAAK,MAAM,SAAS,YAAY;GAC9B,MAAM,MAAM,MAAM,SACd,GAAG,KAAK,GAAG,IAAI,QAAQ,CAAC,GACxB,MAAM,WAAW,QACf,SAAS,GAAG,KAAK,GAAG,MAAM,QAAQ,CAAC,KACnC,SAAS,GAAG,KAAK,GAAG,OAAO,QAAQ,CAAC;GAC1C,MAAM,UAAU,MAAM,UAAU,OAAO,WAAW;GAClD,MAAM,OAAO,MAAM,SAAS,GAAG,IAAI,GAAG,cAAc,QAAQ,CAAC,GAAG;GAChE,MAAM,YAA
Y,MAAM,OAAO,QAAQ,MAAM,eAAe,GAAG;GAC/D,MAAM,SAAS,YAAY,KAAK,GAAG,IAAI,UAAU,KAAK;AACtD,WAAQ,IAAI,KAAK,IAAI,IAAI,OAAO,SAAS;;EAE3C,MAAM,WAAW,WAAW,QACzB,MAAM,EAAE,WAAW,UAAU,CAAC,EAAE,OAClC,CAAC;EACF,MAAM,aAAa,WAAW,QAC3B,MAAM,EAAE,WAAW,SAAS,CAAC,EAAE,OACjC,CAAC;EACF,MAAM,aAAa,WAAW,QAAQ,MAAM,EAAE,OAAO,CAAC;AACtD,UAAQ,IAAI,KAAK,YAAY;EAC7B,MAAM,QAAQ,CAAC,GAAG,SAAS,OAAO,GAAG,WAAW,aAAa;AAC7D,MAAI,aAAa,EACf,OAAM,KAAK,GAAG,WAAW,GAAG,eAAe,IAAI,UAAU,WAAW;AACtE,UAAQ,IAAI,KAAK,MAAM,KAAK,KAAK,CAAC,IAAI,GAAG,IAAI,GAAG,QAAQ,GAAG,GAAG;AAC9D,UAAQ,IAAI,GAAG;;AAIjB,QAAO,CAAC,GAAG,eAAe,GAAG,aAAa,CACvC,MAAM,GAAG,MAAM,EAAE,QAAQ,EAAE,MAAM,CACjC,KAAK,MAAM,EAAE,OAAO;;;;;;;AAQzB,SAAgB,mBAAmB,UAA0B;AAC3D,QAAO,SAAS,QAAQ,UAAU,GAAG;;;;;;;;;;;;AAavC,SAAgB,kBAAkB,UAA0B;AAC1D,QAAO,SACJ,QAAQ,WAAW,GAAG,CACtB,QAAQ,SAAS,GAAG,CACpB,MAAM,IAAI,CAAC;;;AAIhB,MAAM,UAAkC;CAEtC,QAAQ;CACR,QAAQ;CAER,SAAS;CAET,SAAS;CACT,UAAU;CACV,KAAK;CACL,QAAQ;CACR,OAAO;CACP,QAAQ;CACR,SAAS;CAET,MAAM;CACN,WAAW;CACX,eAAe;CACf,UAAU;CAEV,OAAO;CACP,KAAK;CACL,QAAQ;CACR,QAAQ;CACR,SAAS;CAET,WAAW;CACX,UAAU;CAEV,MAAM;CACP"}
@@ -21,6 +21,9 @@ var Spinner = class {
21
21
  process.stdout.write(`\r ${this.text}${this.frames[this.current]}`);
22
22
  }, 300);
23
23
  }
24
+ update(text) {
25
+ this.text = text;
26
+ }
24
27
  stop(finalText) {
25
28
  if (this.interval) {
26
29
  clearInterval(this.interval);
@@ -29,7 +32,8 @@ var Spinner = class {
29
32
  process.stdout.write(`\x1b[2K\r ${finalText || this.text}\n`);
30
33
  }
31
34
  printDetail(text) {
32
- process.stdout.write(`\x1b[2m ${text}\x1b[0m\n`);
35
+ process.stdout.write(`\x1b[2K\r\x1b[2m ${text}\x1b[0m\n`);
36
+ if (this.interval) process.stdout.write(` ${this.text}${this.frames[this.current]}`);
33
37
  }
34
38
  };
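The hunk above adds `update()` and makes `printDetail` spinner-aware: clear the current line, print the dimmed detail, then redraw the in-flight spinner so it is not left erased. That control flow can be sketched independently of stdout (a simplified model writing to a buffer, not the shipped class):

```typescript
// Simplified model of the patched Spinner; writes go to a buffer so the
// clear/print/redraw sequence in printDetail is visible and testable.
class SpinnerSketch {
  private frames = ["   ", ".  ", ".. ", "..."];
  private current = 0;
  private text = "";
  private running = false;
  out: string[] = [];

  start(text: string) {
    this.text = text;
    this.running = true;
    this.out.push(` ${this.text}${this.frames[0]}`);
  }

  // New in this diff: retitle the spinner without restarting it.
  update(text: string) {
    this.text = text;
  }

  // Patched behavior: clear the line, emit the detail, then redraw the
  // spinner line if one is still running.
  printDetail(text: string) {
    this.out.push(`[clear] ${text}`);
    if (this.running) {
      this.out.push(` ${this.text}${this.frames[this.current]}`);
    }
  }

  stop(finalText?: string) {
    this.running = false;
    this.out.push(`[clear] ${finalText || this.text}`);
  }
}
```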
35
39
 
@@ -1 +1 @@
1
- {"version":3,"file":"spinner.js","names":[],"sources":["../../src/type-generator/spinner.ts"],"sourcesContent":["/**\n * Simple loading spinner for CLI\n */\nexport class Spinner {\n private frames = [\" \", \". \", \".. \", \"...\"];\n private current = 0;\n private interval: NodeJS.Timeout | null = null;\n private text = \"\";\n\n start(text: string) {\n this.text = text;\n this.current = 0;\n process.stdout.write(` ${this.text}${this.frames[0]}`);\n this.interval = setInterval(() => {\n this.current = (this.current + 1) % this.frames.length;\n process.stdout.write(`\\r ${this.text}${this.frames[this.current]}`);\n }, 300);\n }\n\n stop(finalText?: string) {\n if (this.interval) {\n clearInterval(this.interval);\n this.interval = null;\n }\n // clear the line and write the final text\n process.stdout.write(`\\x1b[2K\\r ${finalText || this.text}\\n`);\n }\n\n printDetail(text: string) {\n process.stdout.write(`\\x1b[2m ${text}\\x1b[0m\\n`);\n }\n}\n"],"mappings":";;;;AAGA,IAAa,UAAb,MAAqB;CACnB,AAAQ,SAAS;EAAC;EAAO;EAAO;EAAO;EAAM;CAC7C,AAAQ,UAAU;CAClB,AAAQ,WAAkC;CAC1C,AAAQ,OAAO;CAEf,MAAM,MAAc;AAClB,OAAK,OAAO;AACZ,OAAK,UAAU;AACf,UAAQ,OAAO,MAAM,KAAK,KAAK,OAAO,KAAK,OAAO,KAAK;AACvD,OAAK,WAAW,kBAAkB;AAChC,QAAK,WAAW,KAAK,UAAU,KAAK,KAAK,OAAO;AAChD,WAAQ,OAAO,MAAM,OAAO,KAAK,OAAO,KAAK,OAAO,KAAK,WAAW;KACnE,IAAI;;CAGT,KAAK,WAAoB;AACvB,MAAI,KAAK,UAAU;AACjB,iBAAc,KAAK,SAAS;AAC5B,QAAK,WAAW;;AAGlB,UAAQ,OAAO,MAAM,cAAc,aAAa,KAAK,KAAK,IAAI;;CAGhE,YAAY,MAAc;AACxB,UAAQ,OAAO,MAAM,cAAc,KAAK,WAAW"}
1
+ {"version":3,"file":"spinner.js","names":[],"sources":["../../src/type-generator/spinner.ts"],"sourcesContent":["/**\n * Simple loading spinner for CLI\n */\nexport class Spinner {\n private frames = [\" \", \". \", \".. \", \"...\"];\n private current = 0;\n private interval: NodeJS.Timeout | null = null;\n private text = \"\";\n\n start(text: string) {\n this.text = text;\n this.current = 0;\n process.stdout.write(` ${this.text}${this.frames[0]}`);\n this.interval = setInterval(() => {\n this.current = (this.current + 1) % this.frames.length;\n process.stdout.write(`\\r ${this.text}${this.frames[this.current]}`);\n }, 300);\n }\n\n update(text: string) {\n this.text = text;\n }\n\n stop(finalText?: string) {\n if (this.interval) {\n clearInterval(this.interval);\n this.interval = null;\n }\n // clear the line and write the final text\n process.stdout.write(`\\x1b[2K\\r ${finalText || this.text}\\n`);\n }\n\n printDetail(text: string) {\n // Clear spinner line, print detail, then redraw spinner\n process.stdout.write(`\\x1b[2K\\r\\x1b[2m ${text}\\x1b[0m\\n`);\n if (this.interval) {\n process.stdout.write(` ${this.text}${this.frames[this.current]}`);\n }\n }\n}\n"],"mappings":";;;;AAGA,IAAa,UAAb,MAAqB;CACnB,AAAQ,SAAS;EAAC;EAAO;EAAO;EAAO;EAAM;CAC7C,AAAQ,UAAU;CAClB,AAAQ,WAAkC;CAC1C,AAAQ,OAAO;CAEf,MAAM,MAAc;AAClB,OAAK,OAAO;AACZ,OAAK,UAAU;AACf,UAAQ,OAAO,MAAM,KAAK,KAAK,OAAO,KAAK,OAAO,KAAK;AACvD,OAAK,WAAW,kBAAkB;AAChC,QAAK,WAAW,KAAK,UAAU,KAAK,KAAK,OAAO;AAChD,WAAQ,OAAO,MAAM,OAAO,KAAK,OAAO,KAAK,OAAO,KAAK,WAAW;KACnE,IAAI;;CAGT,OAAO,MAAc;AACnB,OAAK,OAAO;;CAGd,KAAK,WAAoB;AACvB,MAAI,KAAK,UAAU;AACjB,iBAAc,KAAK,SAAS;AAC5B,QAAK,WAAW;;AAGlB,UAAQ,OAAO,MAAM,cAAc,aAAa,KAAK,KAAK,IAAI;;CAGhE,YAAY,MAAc;AAExB,UAAQ,OAAO,MAAM,uBAAuB,KAAK,WAAW;AAC5D,MAAI,KAAK,SACP,SAAQ,OAAO,MAAM,KAAK,KAAK,OAAO,KAAK,OAAO,KAAK,WAAW"}
@@ -1,7 +1,6 @@
1
1
  import { Plugin } from "vite";
2
2
 
3
3
  //#region src/type-generator/vite-plugin.d.ts
4
-
5
4
  /**
6
5
  * Options for the AppKit types plugin.
7
6
  */
@@ -1 +1 @@
1
- {"version":3,"file":"vite-plugin.d.ts","names":[],"sources":["../../src/type-generator/vite-plugin.ts"],"sourcesContent":[],"mappings":";;;;;;AAEmC;AAsBnC,UAbU,wBAAA,CAauB;EAAA,OAAA,CAAA,EAAA,MAAA;;cAAsC,CAAA,EAAA,MAAA,EAAA;;;;;;;;iBAAvD,iBAAA,WAA4B,2BAA2B"}
1
+ {"version":3,"file":"vite-plugin.d.ts","names":[],"sources":["../../src/type-generator/vite-plugin.ts"],"mappings":";;;;;AAEmC;UASzB,wBAAA;EAER,OAAA;EAAA;EAEA,YAAA;AAAA;;;;;;;iBASc,iBAAA,CAAkB,OAAA,GAAU,wBAAA,GAA2B,MAAA"}
@@ -1,7 +1,7 @@
1
1
  import { createLogger } from "../logging/logger.js";
2
2
  import { generateFromEntryPoint } from "./index.js";
3
3
  import path from "node:path";
4
- import fs from "node:fs";
4
+ import { existsSync } from "node:fs";
5
5
 
6
6
  //#region src/type-generator/vite-plugin.ts
7
7
  const logger = createLogger("type-generator:vite-plugin");
@@ -40,7 +40,7 @@ function appKitTypesPlugin(options) {
40
40
  logger.debug("Warehouse ID not found. Skipping type generation.");
41
41
  return false;
42
42
  }
43
- if (!fs.existsSync(path.join(process.cwd(), "config", "queries"))) return false;
43
+ if (!existsSync(path.join(process.cwd(), "config", "queries"))) return false;
44
44
  return true;
45
45
  },
46
46
  configResolved(config) {
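The change in this hunk swaps the default `fs` import for the named `existsSync` import; the `apply()` guard it feeds can be sketched as a pure predicate (an illustration of the guard logic, not the plugin source):

```typescript
import { existsSync } from "node:fs";
import path from "node:path";

// Sketch of the plugin's apply() guard: stay inactive unless a warehouse
// ID is configured and a config/queries folder exists under cwd.
function shouldApply(cwd: string, warehouseId: string): boolean {
  if (!warehouseId) return false;
  return existsSync(path.join(cwd, "config", "queries"));
}
```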
@@ -1 +1 @@
1
- {"version":3,"file":"vite-plugin.js","names":[],"sources":["../../src/type-generator/vite-plugin.ts"],"sourcesContent":["import fs from \"node:fs\";\nimport path from \"node:path\";\nimport type { Plugin } from \"vite\";\nimport { createLogger } from \"../logging/logger\";\nimport { generateFromEntryPoint } from \"./index\";\n\nconst logger = createLogger(\"type-generator:vite-plugin\");\n\n/**\n * Options for the AppKit types plugin.\n */\ninterface AppKitTypesPluginOptions {\n /* Path to the output d.ts file (relative to client folder). */\n outFile?: string;\n /** Folders to watch for changes. */\n watchFolders?: string[];\n}\n\n/**\n * Vite plugin to generate types for AppKit queries.\n * Calls generateFromEntryPoint under the hood.\n * @param options - Options to override default values.\n * @returns Vite plugin to generate types for AppKit queries.\n */\nexport function appKitTypesPlugin(options?: AppKitTypesPluginOptions): Plugin {\n let root: string;\n let outFile: string;\n let watchFolders: string[];\n\n async function generate() {\n try {\n const warehouseId = process.env.DATABRICKS_WAREHOUSE_ID || \"\";\n\n if (!warehouseId) {\n logger.debug(\"Warehouse ID not found. Skipping type generation.\");\n return;\n }\n\n await generateFromEntryPoint({\n outFile,\n queryFolder: watchFolders[0],\n warehouseId,\n noCache: false,\n });\n } catch (error) {\n // throw in production to fail the build\n if (process.env.NODE_ENV === \"production\") {\n throw error;\n }\n logger.error(\"Error generating types: %O\", error);\n }\n }\n\n return {\n name: \"appkit-types\",\n\n apply() {\n const warehouseId = process.env.DATABRICKS_WAREHOUSE_ID || \"\";\n\n if (!warehouseId) {\n logger.debug(\"Warehouse ID not found. 
Skipping type generation.\");\n return false;\n }\n\n if (!fs.existsSync(path.join(process.cwd(), \"config\", \"queries\"))) {\n return false;\n }\n\n return true;\n },\n\n configResolved(config) {\n root = config.root;\n outFile = path.resolve(root, options?.outFile ?? \"src/appKitTypes.d.ts\");\n watchFolders = options?.watchFolders ?? [\n path.join(process.cwd(), \"config\", \"queries\"),\n ];\n },\n\n buildStart() {\n generate();\n },\n\n configureServer(server) {\n server.watcher.add(watchFolders);\n\n server.watcher.on(\"change\", (changedFile) => {\n const isWatchedFile = watchFolders.some((folder) =>\n changedFile.startsWith(folder),\n );\n\n if (isWatchedFile && changedFile.endsWith(\".sql\")) {\n generate();\n }\n });\n },\n };\n}\n"],"mappings":";;;;;;AAMA,MAAM,SAAS,aAAa,6BAA6B;;;;;;;AAkBzD,SAAgB,kBAAkB,SAA4C;CAC5E,IAAI;CACJ,IAAI;CACJ,IAAI;CAEJ,eAAe,WAAW;AACxB,MAAI;GACF,MAAM,cAAc,QAAQ,IAAI,2BAA2B;AAE3D,OAAI,CAAC,aAAa;AAChB,WAAO,MAAM,oDAAoD;AACjE;;AAGF,SAAM,uBAAuB;IAC3B;IACA,aAAa,aAAa;IAC1B;IACA,SAAS;IACV,CAAC;WACK,OAAO;AAEd,OAAI,QAAQ,IAAI,aAAa,aAC3B,OAAM;AAER,UAAO,MAAM,8BAA8B,MAAM;;;AAIrD,QAAO;EACL,MAAM;EAEN,QAAQ;AAGN,OAAI,EAFgB,QAAQ,IAAI,2BAA2B,KAEzC;AAChB,WAAO,MAAM,oDAAoD;AACjE,WAAO;;AAGT,OAAI,CAAC,GAAG,WAAW,KAAK,KAAK,QAAQ,KAAK,EAAE,UAAU,UAAU,CAAC,CAC/D,QAAO;AAGT,UAAO;;EAGT,eAAe,QAAQ;AACrB,UAAO,OAAO;AACd,aAAU,KAAK,QAAQ,MAAM,SAAS,WAAW,uBAAuB;AACxE,kBAAe,SAAS,gBAAgB,CACtC,KAAK,KAAK,QAAQ,KAAK,EAAE,UAAU,UAAU,CAC9C;;EAGH,aAAa;AACX,aAAU;;EAGZ,gBAAgB,QAAQ;AACtB,UAAO,QAAQ,IAAI,aAAa;AAEhC,UAAO,QAAQ,GAAG,WAAW,gBAAgB;AAK3C,QAJsB,aAAa,MAAM,WACvC,YAAY,WAAW,OAAO,CAC/B,IAEoB,YAAY,SAAS,OAAO,CAC/C,WAAU;KAEZ;;EAEL"}
1
+ {"version":3,"file":"vite-plugin.js","names":[],"sources":["../../src/type-generator/vite-plugin.ts"],"sourcesContent":["import { existsSync } from \"node:fs\";\nimport path from \"node:path\";\nimport type { Plugin } from \"vite\";\nimport { createLogger } from \"../logging/logger\";\nimport { generateFromEntryPoint } from \"./index\";\n\nconst logger = createLogger(\"type-generator:vite-plugin\");\n\n/**\n * Options for the AppKit types plugin.\n */\ninterface AppKitTypesPluginOptions {\n /* Path to the output d.ts file (relative to client folder). */\n outFile?: string;\n /** Folders to watch for changes. */\n watchFolders?: string[];\n}\n\n/**\n * Vite plugin to generate types for AppKit queries.\n * Calls generateFromEntryPoint under the hood.\n * @param options - Options to override default values.\n * @returns Vite plugin to generate types for AppKit queries.\n */\nexport function appKitTypesPlugin(options?: AppKitTypesPluginOptions): Plugin {\n let root: string;\n let outFile: string;\n let watchFolders: string[];\n\n async function generate() {\n try {\n const warehouseId = process.env.DATABRICKS_WAREHOUSE_ID || \"\";\n\n if (!warehouseId) {\n logger.debug(\"Warehouse ID not found. Skipping type generation.\");\n return;\n }\n\n await generateFromEntryPoint({\n outFile,\n queryFolder: watchFolders[0],\n warehouseId,\n noCache: false,\n });\n } catch (error) {\n // throw in production to fail the build\n if (process.env.NODE_ENV === \"production\") {\n throw error;\n }\n logger.error(\"Error generating types: %O\", error);\n }\n }\n\n return {\n name: \"appkit-types\",\n\n apply() {\n const warehouseId = process.env.DATABRICKS_WAREHOUSE_ID || \"\";\n\n if (!warehouseId) {\n logger.debug(\"Warehouse ID not found. 
Skipping type generation.\");\n return false;\n }\n\n if (!existsSync(path.join(process.cwd(), \"config\", \"queries\"))) {\n return false;\n }\n\n return true;\n },\n\n configResolved(config) {\n root = config.root;\n outFile = path.resolve(root, options?.outFile ?? \"src/appKitTypes.d.ts\");\n watchFolders = options?.watchFolders ?? [\n path.join(process.cwd(), \"config\", \"queries\"),\n ];\n },\n\n buildStart() {\n generate();\n },\n\n configureServer(server) {\n server.watcher.add(watchFolders);\n\n server.watcher.on(\"change\", (changedFile) => {\n const isWatchedFile = watchFolders.some((folder) =>\n changedFile.startsWith(folder),\n );\n\n if (isWatchedFile && changedFile.endsWith(\".sql\")) {\n generate();\n }\n });\n },\n };\n}\n"],"mappings":";;;;;;AAMA,MAAM,SAAS,aAAa,6BAA6B;;;;;;;AAkBzD,SAAgB,kBAAkB,SAA4C;CAC5E,IAAI;CACJ,IAAI;CACJ,IAAI;CAEJ,eAAe,WAAW;AACxB,MAAI;GACF,MAAM,cAAc,QAAQ,IAAI,2BAA2B;AAE3D,OAAI,CAAC,aAAa;AAChB,WAAO,MAAM,oDAAoD;AACjE;;AAGF,SAAM,uBAAuB;IAC3B;IACA,aAAa,aAAa;IAC1B;IACA,SAAS;IACV,CAAC;WACK,OAAO;AAEd,OAAI,QAAQ,IAAI,aAAa,aAC3B,OAAM;AAER,UAAO,MAAM,8BAA8B,MAAM;;;AAIrD,QAAO;EACL,MAAM;EAEN,QAAQ;AAGN,OAAI,EAFgB,QAAQ,IAAI,2BAA2B,KAEzC;AAChB,WAAO,MAAM,oDAAoD;AACjE,WAAO;;AAGT,OAAI,CAAC,WAAW,KAAK,KAAK,QAAQ,KAAK,EAAE,UAAU,UAAU,CAAC,CAC5D,QAAO;AAGT,UAAO;;EAGT,eAAe,QAAQ;AACrB,UAAO,OAAO;AACd,aAAU,KAAK,QAAQ,MAAM,SAAS,WAAW,uBAAuB;AACxE,kBAAe,SAAS,gBAAgB,CACtC,KAAK,KAAK,QAAQ,KAAK,EAAE,UAAU,UAAU,CAC9C;;EAGH,aAAa;AACX,aAAU;;EAGZ,gBAAgB,QAAQ;AACtB,UAAO,QAAQ,IAAI,aAAa;AAEhC,UAAO,QAAQ,GAAG,WAAW,gBAAgB;AAK3C,QAJsB,aAAa,MAAM,WACvC,YAAY,WAAW,OAAO,CAC/B,IAEoB,YAAY,SAAS,OAAO,CAC/C,WAAU;KAEZ;;EAEL"}
@@ -10,11 +10,13 @@ Scrollable message list that renders Genie chat messages with auto-scroll, skele
10
10
 
11
11
  ### Props[​](#props "Direct link to Props")
12
12
 
13
- | Prop | Type | Required | Default | Description |
14
- | ----------- | -------------------- | -------- | ------- | --------------------------------------------------------------------------- |
15
- | `messages` | `GenieMessageItem[]` | ✓ | - | Array of messages to display |
16
- | `status` | `enum` | ✓ | - | Current chat status (controls loading indicators and skeleton placeholders) |
17
- | `className` | `string` | | - | Additional CSS class for the scroll area |
13
+ | Prop | Type | Required | Default | Description |
14
+ | --------------------- | -------------------- | -------- | ------- | --------------------------------------------------------------------------- |
15
+ | `messages` | `GenieMessageItem[]` | ✓ | - | Array of messages to display |
16
+ | `status` | `enum` | ✓ | - | Current chat status (controls loading indicators and skeleton placeholders) |
17
+ | `className` | `string` | | - | Additional CSS class for the scroll area |
18
+ | `hasPreviousPage` | `boolean` | | `false` | Whether a previous page of older messages exists |
19
+ | `onFetchPreviousPage` | `(() => void)` | | - | Callback to fetch the previous page of messages |
18
20
 
19
21
  ### Usage[​](#usage "Direct link to Usage")
20
22
 
@@ -61,7 +61,7 @@ The AppKit `server()` plugin automatically serves:
61
61
  "@types/react": "^19.0.0",
62
62
  "@types/react-dom": "^19.0.0",
63
63
  "@vitejs/plugin-react": "^5.1.1",
64
- "tsdown": "^0.15.7",
64
+ "tsdown": "^0.20.3",
65
65
  "tsx": "^4.19.0",
66
66
  "typescript": "~5.6.0",
67
67
  "vite": "^7.2.4"
@@ -2,6 +2,8 @@
2
2
 
3
3
  AppKit includes a CLI for managing plugins. All commands are available under `npx @databricks/appkit plugin`.
4
4
 
5
 + **Manifest convention:** `manifest.json` is the default and recommended format for CLI commands (`sync`, `list`, `validate`). Because loading a JS manifest (`manifest.js`/`manifest.cjs`) executes plugin code, JS manifests are ignored unless you pass `--allow-js-manifest`; use that flag only with trusted sources. The **add-resource** command only edits `manifest.json` in place.
6
+
5
7
  ## Create a plugin[​](#create-a-plugin "Direct link to Create a plugin")
6
8
 
7
9
  Scaffold a new plugin interactively:
@@ -18,7 +20,7 @@ The wizard walks you through:
18
20
  * **Resources**: Which Databricks resources the plugin needs (SQL Warehouse, Secret, etc.) and whether each is required or optional
19
21
  * **Optional fields**: Author, version, license
20
22
 
21
- The command generates a complete plugin scaffold with `manifest.json`, TypeScript class, and barrel exports — ready to register in your app.
23
+ The command generates a complete plugin scaffold with `manifest.json` and a TypeScript plugin class that imports the manifest directly — ready to register in your app.
22
24
 
23
25
  ## Sync plugin manifests[​](#sync-plugin-manifests "Direct link to Sync plugin manifests")
24
26
 
@@ -31,6 +33,13 @@ npx @databricks/appkit plugin sync --write
31
33
 
32
34
  This discovers plugin manifests from installed packages and local imports, then writes a consolidated manifest used by deployment tooling. Plugins referenced in your `createApp({ plugins: [...] })` call are automatically marked as required.
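The "automatically marked as required" step can be pictured as tagging discovered manifests against the set of plugins referenced in `createApp({ plugins: [...] })` (a sketch of the idea, not the actual sync implementation):

```typescript
interface DiscoveredPlugin {
  name: string;
  required?: boolean;
}

// Sketch: plugins referenced in the app get required: true in the
// consolidated manifest; unreferenced ones keep their existing flag.
function markRequired(
  discovered: DiscoveredPlugin[],
  referenced: Set<string>,
): DiscoveredPlugin[] {
  return discovered.map((p) =>
    referenced.has(p.name) ? { ...p, required: true } : p,
  );
}
```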
33
35
 
36
 + Trusted installed Databricks packages (for example `@databricks/appkit`) may load bundled JS manifests during `plugin sync`. For all other sources that intentionally rely on JS manifests, opt in explicitly:
37
+
38
+ ```bash
39
+ npx @databricks/appkit plugin sync --write --allow-js-manifest
40
+
41
+ ```
42
+
34
43
  Use the `--silent` flag in build hooks to suppress output:
35
44
 
36
45
  ```json
@@ -59,6 +68,8 @@ npx @databricks/appkit plugin validate plugins/my-plugin appkit.plugins.json
59
68
 
60
69
  The validator auto-detects whether a file is a plugin manifest or a template manifest (from `$schema`) and reports errors with humanized paths and expected values.
61
70
 
71
+ To include JS manifests in validation, pass `--allow-js-manifest`.
72
+
62
73
  ## List plugins[​](#list-plugins "Direct link to List plugins")
63
74
 
64
75
  View registered plugins from `appkit.plugins.json` or scan a directory:
@@ -70,6 +81,9 @@ npx @databricks/appkit plugin list
70
81
  # Scan a directory for plugin folders
71
82
  npx @databricks/appkit plugin list --dir plugins/
72
83
 
84
+ # Scan a directory and include JS manifests (trusted code only)
85
+ npx @databricks/appkit plugin list --dir plugins/ --allow-js-manifest
86
+
73
87
  # JSON output for scripting
74
88
  npx @databricks/appkit plugin list --json
75
89
 
@@ -77,7 +91,7 @@ npx @databricks/appkit plugin list --json
77
91
 
78
92
  ## Add a resource to a plugin[​](#add-a-resource-to-a-plugin "Direct link to Add a resource to a plugin")
79
93
 
80
- Interactively add a new resource requirement to an existing plugin manifest:
94
+ Interactively add a new resource requirement to an existing plugin manifest. **Requires `manifest.json`** in the plugin directory (the command edits it in place; it does not modify `manifest.js`):
81
95
 
82
96
  ```bash
83
97
  npx @databricks/appkit plugin add-resource