dhti-cli 1.2.0 → 1.3.1

package/README.md CHANGED
@@ -28,7 +28,11 @@ Generative AI features are built as [LangServe Apps](https://python.langchain.co
  🚀 Checkout **[Vidhi Recipes](/vidhi/README.md)** for chatbot, RAG, imaging (DICOM) and MCPX for dockerized calculators

  #### How (non‑technical / clinical)
- DHTI includes ready‑to‑use [skills](/.github/skills/) that can prompt agentic platforms (e.g., [AntiGravity](https://antigravity.google/), VSCode, or Claude) to generate the GenAI backends and UI components (elixirs and conches) you need. Test these components with synthetic data in OpenMRS or the CDS‑Hooks sandbox, then hand them off to production teams. Because DHTI follows open standards, that handoff (the “valley of death”) becomes smoother and more predictable. Try the [prompts](/.github/skills/start-dhti/examples/e2e-sample.md) in your preferred agentic platform after cloning this repo.
+ DHTI includes ready‑to‑use [skills](/.agents/skills/) that can prompt agentic platforms (e.g., [AntiGravity](https://antigravity.google/), VSCode, or Claude) to generate the GenAI backends and UI components (elixirs and conches) you need. Test these components with synthetic data in OpenMRS or the CDS‑Hooks sandbox, then hand them off to production teams. Because DHTI follows open standards, that handoff (the “valley of death”) becomes smoother and more predictable. Try the [prompts](/.agents/skills/start-dhti/examples/e2e-sample.md) in your preferred agentic platform after cloning this repo.
+
+ Other skills from the open agent skills ecosystem may be useful too! For example, use `npx skills find clinical trial` to find clinical-trial-related skills. From the results, use `npx skills add <skill-name>` to add a skill to your agentic platform (e.g. `npx skills add anthropics/healthcare@clinical-trial-protocol-skill`).
+
+ **🤖 [AI-Powered Workflow with GitHub Copilot SDK](/notes/COPILOT.md) - WIP**

  ## Try it out
  [[Cheatsheet](/notes/cheatsheet.md) | [Download PDF Cheatsheet](https://nuchange.ca/wp-content/uploads/2026/01/dhti_cheatsheet.pdf)]
@@ -113,7 +117,7 @@ You will see the new **patient context aware chatbot** in the patient summary pa

  | Why | How |
  | --- | --- |
- | I am a clinician! I have no idea how to build GenAI apps. | ✨ DHTI comes with batteries ([skills](/.github/skills/)) included! Use your preferred agentic platform (e.g., [AntiGravity](https://antigravity.google/), [VSCode with Copilot in agent mode](https://code.visualstudio.com/docs/copilot/overview), Claude, [Cursor](https://cursor.com/) and many other) to generate elixirs and conches from [problem-oriented prompts](/prompts/e2e-sample.md) (most of these platforms have a free tier). Test them using synthetic data in OpenMRS or the CDS-Hooks sandbox, then hand them off to production teams. |
+ | I am a clinician! I have no idea how to build GenAI apps. | ✨ DHTI comes with batteries ([skills](/.github/skills/)) included! Use your preferred agentic platform (e.g., [AntiGravity](https://antigravity.google/), [VSCode with Copilot in agent mode](https://code.visualstudio.com/docs/copilot/overview), Claude, [Cursor](https://cursor.com/) and many others) to generate elixirs and conches from [problem-oriented prompts](/prompts/e2e-sample.md) (most of these platforms have a free tier). Test them using synthetic data in OpenMRS or the CDS-Hooks sandbox, then hand them off to production teams. You may find useful skills in the open agent skills ecosystem: `npx skills find clinical trial` |
  | I know LangChain, but I don’t know how to build a chain/agent based on data in our EHR. | [These sample elixirs](https://github.com/dermatologist/dhti-elixir) adopt FHIR and cds-hooks as standards for data retrieval and display. The [base class](https://github.com/dermatologist/dhti-elixir-base) provides reusable artifacts |
  | I need a simple platform for experimenting. | This repository provides everything to start experimenting fast. The command-line tools help to virtualize and orchestrate your experiments using [Docker](https://www.docker.com/) |
  | I am a UI designer. I want to design helpful UI for real users. | See [these sample conches](https://github.com/dermatologist/openmrs-esm-dhti). It shows how to build interface components (conches) for [OpenMRS](https://openmrs.org/) an open-source EMR used by many. Read more about [OpenMRS UI](https://o3-docs.openmrs.org/) |
@@ -0,0 +1,54 @@
+ import { Command } from '@oclif/core';
+ /**
+  * Copilot command that uses the GitHub Copilot SDK to interact with DHTI
+  * and display results with streaming support.
+  */
+ export default class Copilot extends Command {
+     static description: string;
+     static examples: string[];
+     static flags: {
+         'clear-history': import("@oclif/core/interfaces").BooleanFlag<boolean>;
+         file: import("@oclif/core/interfaces").OptionFlag<string | undefined, import("@oclif/core/interfaces").CustomOptions>;
+         model: import("@oclif/core/interfaces").OptionFlag<string, import("@oclif/core/interfaces").CustomOptions>;
+         prompt: import("@oclif/core/interfaces").OptionFlag<string | undefined, import("@oclif/core/interfaces").CustomOptions>;
+         skill: import("@oclif/core/interfaces").OptionFlag<string, import("@oclif/core/interfaces").CustomOptions>;
+     };
+     /**
+      * Detects the appropriate skill based on the prompt content
+      * @param prompt - The user's prompt text
+      * @returns The detected skill name
+      */
+     private detectSkill;
+     /**
+      * Gets the path to the conversation history file
+      * @returns The path to the history file
+      */
+     private getHistoryFilePath;
+     /**
+      * Loads conversation history from file
+      * @returns Array of conversation turns or empty array if no history
+      */
+     private loadConversationHistory;
+     /**
+      * Saves conversation history to file
+      * @param history - Array of conversation turns to save
+      */
+     private saveConversationHistory;
+     /**
+      * Clears the conversation history
+      */
+     private clearConversationHistory;
+     /**
+      * Fetches skill content from GitHub if not available locally
+      * @param skillName - The name of the skill to fetch
+      * @returns The skill content or null if not found
+      */
+     private fetchSkillFromGitHub;
+     /**
+      * Loads skill instructions from local or remote source
+      * @param skillName - The name of the skill to load
+      * @returns The skill content or null if not found
+      */
+     private loadSkill;
+     run(): Promise<void>;
+ }
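The private history helpers declared above round-trip a JSON array of `{role, content}` turns through `~/.dhti/copilot-history.json`. A minimal sketch of that round-trip, pointed at a temp directory instead of the real `~/.dhti` so nothing on disk is touched (the temp-dir redirection is our own, not the shipped code):

```javascript
import fs from 'node:fs';
import os from 'node:os';
import path from 'node:path';

// Stand-in for getHistoryFilePath(), using a temp dir rather than ~/.dhti.
const dhtiDir = fs.mkdtempSync(path.join(os.tmpdir(), 'dhti-'));
const historyPath = path.join(dhtiDir, 'copilot-history.json');

// saveConversationHistory: pretty-printed JSON array of {role, content} turns.
const history = [
    { role: 'user', content: 'Start the DHTI stack' },
    { role: 'assistant', content: 'Starting the stack...' },
];
fs.writeFileSync(historyPath, JSON.stringify(history, null, 2), 'utf8');

// loadConversationHistory: parse the file back, or fall back to [] if absent.
const loaded = fs.existsSync(historyPath)
    ? JSON.parse(fs.readFileSync(historyPath, 'utf8'))
    : [];
console.log(loaded.length, loaded[1].role); // 2 assistant
```

Because the file is plain pretty-printed JSON, the saved history can be inspected or edited by hand between runs; `--clear-history` simply unlinks it.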
@@ -0,0 +1,308 @@
+ import { CopilotClient } from '@github/copilot-sdk';
+ import { Command, Flags } from '@oclif/core';
+ import chalk from 'chalk';
+ import fs from 'node:fs';
+ import os from 'node:os';
+ import path from 'node:path';
+ import { fileURLToPath } from 'node:url';
+ /**
+  * Copilot command that uses the GitHub Copilot SDK to interact with DHTI
+  * and display results with streaming support.
+  */
+ export default class Copilot extends Command {
+     static description = 'Interact with DHTI using GitHub Copilot SDK with streaming responses';
+     static examples = [
+         '<%= config.bin %> <%= command.id %> --prompt "Start the DHTI stack with langserve"',
+         '<%= config.bin %> <%= command.id %> --file ./my-prompt.txt --model gpt-4.1',
+         '<%= config.bin %> <%= command.id %> --prompt "Generate a new elixir for patient risk assessment" --skill elixir-generator',
+         '<%= config.bin %> <%= command.id %> --clear-history --prompt "Start fresh conversation"',
+         '<%= config.bin %> <%= command.id %> --clear-history # Clear history without starting new conversation',
+     ];
+     static flags = {
+         'clear-history': Flags.boolean({
+             default: false,
+             description: 'Clear conversation history and start a new session',
+         }),
+         file: Flags.string({
+             char: 'f',
+             description: 'Path to a file containing the prompt to send to copilot-sdk',
+             exclusive: ['prompt'],
+         }),
+         model: Flags.string({
+             char: 'm',
+             default: 'gpt-4.1',
+             description: 'Model to use for copilot-sdk interactions',
+         }),
+         prompt: Flags.string({
+             char: 'p',
+             description: 'Prompt to send to the copilot-sdk',
+             exclusive: ['file'],
+         }),
+         skill: Flags.string({
+             char: 's',
+             default: 'auto',
+             description: 'Skill to use for copilot-sdk interactions (auto, start-dhti, elixir-generator, conch-generator)',
+         }),
+     };
+     /**
+      * Detects the appropriate skill based on the prompt content
+      * @param prompt - The user's prompt text
+      * @returns The detected skill name
+      */
+     detectSkill(prompt) {
+         const lowerPrompt = prompt.toLowerCase();
+         // Check for elixir-related keywords
+         if (lowerPrompt.includes('elixir') ||
+             lowerPrompt.includes('backend') ||
+             lowerPrompt.includes('langserve') ||
+             lowerPrompt.includes('genai app')) {
+             return 'elixir-generator';
+         }
+         // Check for conch-related keywords
+         if (lowerPrompt.includes('conch') ||
+             lowerPrompt.includes('frontend') ||
+             lowerPrompt.includes('ui') ||
+             lowerPrompt.includes('openmrs')) {
+             return 'conch-generator';
+         }
+         // Use start-dhti if prompt includes 'start', 'show', or 'run'
+         if (lowerPrompt.includes('start') || lowerPrompt.includes('show') || lowerPrompt.includes('run')) {
+             return 'start-dhti';
+         }
+         // If none of the skills match, exit asking for a skill name and show available skills
+         const availableSkills = ['start-dhti', 'elixir-generator', 'conch-generator'];
+         this.error(`Could not detect the appropriate skill from the prompt.\n` +
+             `Please specify a skill name using --skill.\n` +
+             `Available skills: ${availableSkills.join(', ')}`);
+         return '';
+     }
+     /**
+      * Gets the path to the conversation history file
+      * @returns The path to the history file
+      */
+     getHistoryFilePath() {
+         const dhtiDir = path.join(os.homedir(), '.dhti');
+         if (!fs.existsSync(dhtiDir)) {
+             fs.mkdirSync(dhtiDir, { recursive: true });
+         }
+         return path.join(dhtiDir, 'copilot-history.json');
+     }
+     /**
+      * Loads conversation history from file
+      * @returns Array of conversation turns or empty array if no history
+      */
+     loadConversationHistory() {
+         try {
+             const historyPath = this.getHistoryFilePath();
+             if (fs.existsSync(historyPath)) {
+                 const historyData = fs.readFileSync(historyPath, 'utf8');
+                 return JSON.parse(historyData);
+             }
+         }
+         catch (error) {
+             this.warn(chalk.yellow(`Failed to load conversation history: ${error}`));
+         }
+         return [];
+     }
+     /**
+      * Saves conversation history to file
+      * @param history - Array of conversation turns to save
+      */
+     saveConversationHistory(history) {
+         try {
+             const historyPath = this.getHistoryFilePath();
+             fs.writeFileSync(historyPath, JSON.stringify(history, null, 2), 'utf8');
+         }
+         catch (error) {
+             this.warn(chalk.yellow(`Failed to save conversation history: ${error}`));
+         }
+     }
+     /**
+      * Clears the conversation history
+      */
+     clearConversationHistory() {
+         try {
+             const historyPath = this.getHistoryFilePath();
+             if (fs.existsSync(historyPath)) {
+                 fs.unlinkSync(historyPath);
+                 this.log(chalk.green('✓ Conversation history cleared'));
+             }
+             else {
+                 this.log(chalk.yellow('No conversation history to clear'));
+             }
+         }
+         catch (error) {
+             this.warn(chalk.yellow(`Failed to clear conversation history: ${error}`));
+         }
+     }
+     /**
+      * Fetches skill content from GitHub if not available locally
+      * @param skillName - The name of the skill to fetch
+      * @returns The skill content or null if not found
+      */
+     async fetchSkillFromGitHub(skillName) {
+         try {
+             const url = `https://raw.githubusercontent.com/dermatologist/dhti/develop/.agents/skills/${skillName}/SKILL.md`;
+             const response = await fetch(url);
+             if (!response.ok) {
+                 return null;
+             }
+             return response.text();
+         }
+         catch (error) {
+             this.warn(`Failed to fetch skill ${skillName} from GitHub: ${error}`);
+             return null;
+         }
+     }
+     /**
+      * Loads skill instructions from local or remote source
+      * @param skillName - The name of the skill to load
+      * @returns The skill content or null if not found
+      */
+     async loadSkill(skillName) {
+         // Resolve skills directory
+         const __filename = fileURLToPath(import.meta.url);
+         const __dirname = path.dirname(__filename);
+         const skillsDir = path.resolve(__dirname, '../../.agents/skills');
+         const skillPath = path.join(skillsDir, skillName, 'SKILL.md');
+         // Try to load from local directory first
+         if (fs.existsSync(skillPath)) {
+             try {
+                 return fs.readFileSync(skillPath, 'utf8');
+             }
+             catch (error) {
+                 this.warn(`Failed to read local skill file: ${error}`);
+             }
+         }
+         // If not found locally, try to fetch from GitHub
+         this.log(chalk.yellow(`Skill ${skillName} not found locally, fetching from GitHub...`));
+         return this.fetchSkillFromGitHub(skillName);
+     }
+     // eslint-disable-next-line perfectionist/sort-classes
+     async run() {
+         const { flags } = await this.parse(Copilot);
+         // Handle clear-history flag
+         if (flags['clear-history']) {
+             this.clearConversationHistory();
+             // If only clearing history, exit after clearing
+             if (!flags.prompt && !flags.file) {
+                 return;
+             }
+         }
+         // Validate that either prompt or file is provided
+         if (!flags.prompt && !flags.file) {
+             this.error('Either --prompt or --file must be provided');
+         }
+         // Get the prompt content
+         let promptContent;
+         if (flags.file) {
+             if (!fs.existsSync(flags.file)) {
+                 this.error(`File not found: ${flags.file}`);
+             }
+             try {
+                 promptContent = fs.readFileSync(flags.file, 'utf8');
+             }
+             catch (error) {
+                 this.error(`Failed to read file: ${error}`);
+             }
+         }
+         else {
+             promptContent = flags.prompt;
+         }
+         // Load conversation history
+         const conversationHistory = this.loadConversationHistory();
+         const hasHistory = conversationHistory.length > 0;
+         if (hasHistory) {
+             this.log(chalk.cyan(`📜 Loaded ${conversationHistory.length} previous message(s) from history`));
+         }
+         // Determine which skill to use
+         let skillName = flags.skill;
+         if (skillName === 'auto') {
+             skillName = this.detectSkill(promptContent);
+             this.log(chalk.cyan(`Auto-detected skill: ${skillName}`));
+         }
+         // Load the skill instructions
+         const skillContent = await this.loadSkill(skillName);
+         if (!skillContent) {
+             this.warn(chalk.yellow(`Could not load skill: ${skillName}. Proceeding without skill context.`));
+         }
+         // Build system message with skill instructions
+         let systemMessageContent = 'You are a helpful assistant that can use specific skills to generate components of the DHTI stack based on user prompts.';
+         // Add skill-specific instructions
+         if (skillContent) {
+             systemMessageContent += '\n\n' + skillContent;
+         }
+         // Add default instruction to use start-dhti skill
+         if (skillName !== 'start-dhti') {
+             systemMessageContent += '\n\nNote: If the user needs to start the DHTI stack, use the start-dhti skill workflow.';
+         }
+         // Add conversation history context
+         if (hasHistory) {
+             systemMessageContent += '\n\n## Previous Conversation\n';
+             systemMessageContent += 'Here is the conversation history for context:\n\n';
+             for (const turn of conversationHistory) {
+                 systemMessageContent += `${turn.role === 'user' ? 'User' : 'Assistant'}: ${turn.content}\n\n`;
+             }
+             systemMessageContent += 'Continue the conversation naturally based on this context.';
+         }
+         this.log(chalk.green('Initializing GitHub Copilot SDK...'));
+         let client = null;
+         let assistantResponse = '';
+         try {
+             // Create copilot client
+             client = new CopilotClient();
+             // Create a session with streaming enabled
+             const session = await client.createSession({
+                 model: flags.model,
+                 streaming: true,
+                 systemMessage: {
+                     content: systemMessageContent,
+                     mode: 'append',
+                 },
+             });
+             this.log(chalk.green(`Using model: ${flags.model}`));
+             this.log(chalk.green(`Using skill: ${skillName}`));
+             this.log(chalk.blue('\n--- Copilot Response ---\n'));
+             // Handle streaming responses
+             let responseStarted = false;
+             session.on('assistant.message_delta', (event) => {
+                 if (!responseStarted) {
+                     responseStarted = true;
+                 }
+                 const content = event.data.deltaContent;
+                 process.stdout.write(content);
+                 assistantResponse += content;
+             });
+             // Handle session idle (response complete)
+             session.on('session.idle', () => {
+                 if (responseStarted) {
+                     console.log('\n'); // Add newline after response
+                 }
+             });
+             // Send the prompt and wait for completion
+             await session.sendAndWait({ prompt: promptContent });
+             this.log(chalk.blue('\n--- End of Response ---\n'));
+             // Save conversation history
+             conversationHistory.push({ content: promptContent, role: 'user' });
+             if (assistantResponse.trim()) {
+                 conversationHistory.push({ content: assistantResponse.trim(), role: 'assistant' });
+             }
+             this.saveConversationHistory(conversationHistory);
+             this.log(chalk.dim(`💾 Conversation saved (${conversationHistory.length} messages). Use --clear-history to reset.`));
+         }
+         catch (error) {
+             const errorMessage = error instanceof Error ? error.message : String(error);
+             this.error(chalk.red(`Failed to interact with Copilot SDK: ${errorMessage}\n\n`) +
+                 chalk.yellow('Troubleshooting:\n') +
+                 chalk.yellow('1. Ensure GitHub Copilot CLI is installed: https://docs.github.com/en/copilot/using-github-copilot/using-github-copilot-in-the-command-line\n') +
+                 chalk.yellow('2. Authenticate with: copilot auth login\n') +
+                 chalk.yellow('3. Verify CLI is working: copilot --version\n'));
+         }
+         finally {
+             // Clean up
+             if (client) {
+                 await client.stop();
+             }
+         }
+     }
+ }
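The `--skill auto` path hinges on `detectSkill`'s keyword routing above: three keyword groups checked in a fixed order, with an error if nothing matches. A condensed re-statement of that routing as a plain function (the table-driven form is our own restructuring for illustration, not the shipped code):

```javascript
// Keyword routing mirroring detectSkill: groups are checked in the same
// order as the shipped implementation, and the first matching group wins.
const routes = [
    ['elixir-generator', ['elixir', 'backend', 'langserve', 'genai app']],
    ['conch-generator', ['conch', 'frontend', 'ui', 'openmrs']],
    ['start-dhti', ['start', 'show', 'run']],
];

function detectSkill(prompt) {
    const lowerPrompt = prompt.toLowerCase();
    for (const [skill, keywords] of routes) {
        if (keywords.some((k) => lowerPrompt.includes(k))) return skill;
    }
    return null; // the real command errors out here and lists the available skills
}

console.log(detectSkill('Generate a new elixir for patient risk assessment')); // elixir-generator
console.log(detectSkill('Run the demo')); // start-dhti
```

Note the checks are plain substring matches, so a word like "build" already satisfies the `'ui'` keyword, and prompts mixing domains resolve to whichever group is checked first; passing `--skill` explicitly sidesteps the heuristic.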
@@ -175,6 +175,80 @@
         "conch.js"
       ]
     },
+   "copilot": {
+     "aliases": [],
+     "args": {},
+     "description": "Interact with DHTI using GitHub Copilot SDK with streaming responses",
+     "examples": [
+       "<%= config.bin %> <%= command.id %> --prompt \"Start the DHTI stack with langserve\"",
+       "<%= config.bin %> <%= command.id %> --file ./my-prompt.txt --model gpt-4.1",
+       "<%= config.bin %> <%= command.id %> --prompt \"Generate a new elixir for patient risk assessment\" --skill elixir-generator",
+       "<%= config.bin %> <%= command.id %> --clear-history --prompt \"Start fresh conversation\"",
+       "<%= config.bin %> <%= command.id %> --clear-history # Clear history without starting new conversation"
+     ],
+     "flags": {
+       "clear-history": {
+         "description": "Clear conversation history and start a new session",
+         "name": "clear-history",
+         "allowNo": false,
+         "type": "boolean"
+       },
+       "file": {
+         "char": "f",
+         "description": "Path to a file containing the prompt to send to copilot-sdk",
+         "exclusive": [
+           "prompt"
+         ],
+         "name": "file",
+         "hasDynamicHelp": false,
+         "multiple": false,
+         "type": "option"
+       },
+       "model": {
+         "char": "m",
+         "description": "Model to use for copilot-sdk interactions",
+         "name": "model",
+         "default": "gpt-4.1",
+         "hasDynamicHelp": false,
+         "multiple": false,
+         "type": "option"
+       },
+       "prompt": {
+         "char": "p",
+         "description": "Prompt to send to the copilot-sdk",
+         "exclusive": [
+           "file"
+         ],
+         "name": "prompt",
+         "hasDynamicHelp": false,
+         "multiple": false,
+         "type": "option"
+       },
+       "skill": {
+         "char": "s",
+         "description": "Skill to use for copilot-sdk interactions (auto, start-dhti, elixir-generator, conch-generator)",
+         "name": "skill",
+         "default": "auto",
+         "hasDynamicHelp": false,
+         "multiple": false,
+         "type": "option"
+       }
+     },
+     "hasDynamicHelp": false,
+     "hiddenAliases": [],
+     "id": "copilot",
+     "pluginAlias": "dhti-cli",
+     "pluginName": "dhti-cli",
+     "pluginType": "core",
+     "strict": true,
+     "enableJsonFlag": false,
+     "isESM": true,
+     "relativePath": [
+       "dist",
+       "commands",
+       "copilot.js"
+     ]
+   },
     "docker": {
       "aliases": [],
       "args": {
@@ -808,5 +882,5 @@
       ]
     }
   },
-   "version": "1.2.0"
+   "version": "1.3.1"
  }
package/package.json CHANGED
@@ -1,13 +1,14 @@
  {
    "name": "dhti-cli",
    "description": "DHTI CLI",
-   "version": "1.2.0",
+   "version": "1.3.1",
    "author": "Bell Eapen",
    "bin": {
      "dhti-cli": "bin/run.js"
    },
    "bugs": "https://github.com/dermatologist/dhti/issues",
    "dependencies": {
+     "@github/copilot-sdk": "^0.1.23",
      "@langchain/community": "^0.3.53",
      "@langchain/ollama": "^0.2.3",
      "@oclif/core": "^4",
@@ -18,7 +19,8 @@
      "js-yaml": "^4.1.0",
      "medpromptjs": ">=0.4.3",
      "ora": "^8.0.1",
-     "request": "^2.88.2"
+     "request": "^2.88.2",
+     "vscode-jsonrpc": "^8.2.1"
    },
    "devDependencies": {
      "@oclif/prettier-config": "^0.2.1",