dhti-cli 1.1.0 → 1.3.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +18 -4
- package/dist/commands/compose.js +2 -0
- package/dist/commands/conch.d.ts +1 -1
- package/dist/commands/conch.js +43 -4
- package/dist/commands/copilot.d.ts +54 -0
- package/dist/commands/copilot.js +300 -0
- package/dist/commands/docker.js +8 -3
- package/dist/resources/docker-compose-master.yml +23 -2
- package/dist/resources/genai/app/bootstrap.py +34 -4
- package/oclif.manifest.json +80 -3
- package/package.json +4 -2
package/README.md
CHANGED
@@ -25,9 +25,15 @@ Generative AI features are built as [LangServe Apps](https://python.langchain.com/docs/langserve/)
 
 🚀 You can test the elixir using a real EMR system, [OpenMRS](https://openmrs.org/), that communicates with the elixir using **CDS-Hooks**, or use any other CDS-Hooks-compatible EMR system. You can also use the [CDS-Hooks sandbox for testing](https://github.com/dermatologist/cds-hooks-sandbox/tree/dhti-1) without an EMR.
 
+🚀 Check out **[Vidhi Recipes](/vidhi/README.md)** for chatbot, RAG, imaging (DICOM), and MCPX recipes for dockerized calculators.
+
 #### How (non‑technical / clinical)
 
 DHTI includes ready‑to‑use [skills](/.github/skills/) that can prompt agentic platforms (e.g., [AntiGravity](https://antigravity.google/), VSCode, or Claude) to generate the GenAI backends and UI components (elixirs and conches) you need. Test these components with synthetic data in OpenMRS or the CDS‑Hooks sandbox, then hand them off to production teams. Because DHTI follows open standards, that handoff (the “valley of death”) becomes smoother and more predictable. Try the [prompts](/.github/skills/start-dhti/examples/e2e-sample.md) in your preferred agentic platform after cloning this repo.
 
+Other skills from the open agent skills ecosystem may be useful too! For example, use `npx skills find clinical trial` to find clinical-trial-related skills. From the results, use `npx skills add <skill-name>` to add a skill to your agentic platform (e.g., `npx skills add anthropics/healthcare@clinical-trial-protocol-skill`).
+
+**🤖 [AI-Powered Workflow with GitHub Copilot SDK](/notes/COPILOT.md) – WIP**
+
 ## Try it out
 
 [[Cheatsheet](/notes/cheatsheet.md) | [Download PDF Cheatsheet](https://nuchange.ca/wp-content/uploads/2026/01/dhti_cheatsheet.pdf)]

@@ -52,15 +58,16 @@ npx dhti-cli docker -u # start services from compose
 ```
 
 Notes:
-
+- Install from a local directory using `-l`.
 - Stop and remove containers with `npx dhti-cli docker -d`.
 
 ✌️ Decide where to test the new elixir: OpenMRS, a full EHR system, or the CDS-Hooks sandbox for lightweight testing without an EHR.
 
 💥 Test the elixir in the CDS-Hooks sandbox:
 
-* `npx dhti-cli
-*
+* `npx dhti-cli elixir start -n dhti-elixir-schat` and navigate to the **Application URL displayed in the console** (scroll up to see this), not the base URL listed at the bottom.
+* Uses hapi.fhir.org for data.
+* In the **Rx View** tab, type in the contentString textbox and wait for the elixir to respond (it submits automatically after 5 seconds).
 
 <p align="center">
 <img src="https://github.com/dermatologist/dhti/blob/develop/notes/cds-hook-sandbox.jpg" />

@@ -88,12 +95,17 @@ You will see the new **patient context aware chatbot** in the patient summary pa
 
 * `npx dhti-cli docker -d` to stop and delete all the docker containers.
 
+## Configuration
+
+* `npx dhti-cli docker bootstrap -f bootstrap.py` will create and sync bootstrap.py, where you can configure the default model and hyperparameters for LangServe. Run this command again after changing bootstrap.py to apply the changes.
+
 ## Wiki & Documentation
 * [Wiki](https://github.com/dermatologist/dhti/wiki)
 * [Documentation](https://dermatologist.github.io/dhti/)
 * [CLI Reference](/notes/README.md)
 
 ## User contributions & examples
+* 🚀 **[Vidhi Recipes](/vidhi/README.md)** for chatbot, RAG, imaging (DICOM) and MCPX for dockerized calculators
 * [Elixirs](https://github.com/dermatologist/dhti-elixir)
 * [OpenMRS Conches / UI](https://github.com/dermatologist/openmrs-esm-dhti)
 * [CDS Hooks Sandbox for testing](https://github.com/dermatologist/cds-hooks-sandbox)

@@ -101,14 +113,16 @@ You will see the new **patient context aware chatbot** in the patient summary pa
 ## Presentations
 ⭐️ **Pitched at [Falling Walls Lab Illinois](https://falling-walls.com/falling-walls-lab-illinois) and released on 2025-09-12.**
 
-## What problems do DHTI solve?
+## 🔧 What problems do DHTI solve?
 
 | Why | How |
 | --- | --- |
+| I am a clinician! I have no idea how to build GenAI apps. | ✨ DHTI comes with batteries ([skills](/.github/skills/)) included! Use your preferred agentic platform (e.g., [AntiGravity](https://antigravity.google/), [VSCode with Copilot in agent mode](https://code.visualstudio.com/docs/copilot/overview), Claude, [Cursor](https://cursor.com/), and many others) to generate elixirs and conches from [problem-oriented prompts](/prompts/e2e-sample.md) (most of these platforms have a free tier). Test them using synthetic data in OpenMRS or the CDS-Hooks sandbox, then hand them off to production teams. You may also find useful skills in the open agent skills ecosystem: `npx skills find clinical trial` |
 | I know LangChain, but I don’t know how to build a chain/agent based on data in our EHR. | [These sample elixirs](https://github.com/dermatologist/dhti-elixir) adopt FHIR and cds-hooks as standards for data retrieval and display. The [base class](https://github.com/dermatologist/dhti-elixir-base) provides reusable artifacts. |
 | I need a simple platform for experimenting. | This repository provides everything to start experimenting fast. The command-line tools help to virtualize and orchestrate your experiments using [Docker](https://www.docker.com/). |
 | I am a UI designer. I want to design helpful UI for real users. | See [these sample conches](https://github.com/dermatologist/openmrs-esm-dhti); they show how to build interface components (conches) for [OpenMRS](https://openmrs.org/), an open-source EMR used by many. Read more about [OpenMRS UI](https://o3-docs.openmrs.org/) |
 | We use another EMR. | Your EMR may support CDS-Hooks for displaying components. In that case, you can use the [cds-hooks-sandbox for testing](https://github.com/dermatologist/cds-hooks-sandbox/tree/dhti-1) |
+| We don't use an EMR. We use a web based health information system for ------ population with no FHIR support. | You can still use DHTI as a GenAI experimentation platform. ✨ We have a [browser extension](https://github.com/dermatologist/openmrs-esm-dhti/blob/develop/packages/dhti-screen-grabber/README.md) that can read any web page! |
 | Our IT team is often unable to take my experiments to production. | Use DHTI, follow the recommended patterns, and you will make their lives easier. |
 
 
package/dist/commands/compose.js
CHANGED
@@ -80,6 +80,7 @@ export default class Compose extends Command {
         const mcpx = ['mcpx'];
         const docktor = ['mcpx'];
         const medplum = ['medplum-server', 'medplum-app', 'postgres-db', 'redis', 'mpclient'];
+        const orthanc = ['orthanc', 'cors-proxy'];
         const _modules = {
             cqlFhir,
             docktor,
@@ -95,6 +96,7 @@ export default class Compose extends Command {
             openmrs,
             redis,
             webui,
+            orthanc,
         };
         try {
             const masterData = yaml.load(fs.readFileSync(path.join(RESOURCES_DIR, 'docker-compose-master.yml'), 'utf8'));
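The diff above adds an `orthanc` module to compose.js's module map. A condensed sketch of how that map turns a module name into the compose services it enables; `servicesFor` is a hypothetical helper for illustration, not a function in the package:

```javascript
// Each module name maps to the services pulled from docker-compose-master.yml;
// 1.3.0 adds orthanc, which enables both the orthanc and cors-proxy services.
const _modules = {
  medplum: ['medplum-server', 'medplum-app', 'postgres-db', 'redis', 'mpclient'],
  orthanc: ['orthanc', 'cors-proxy'],
};

// Hypothetical helper: collect the unique services for the requested modules.
function servicesFor(names) {
  return [...new Set(names.flatMap((n) => _modules[n] ?? []))];
}

console.log(servicesFor(['orthanc']));
// → [ 'orthanc', 'cors-proxy' ]
```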
package/dist/commands/conch.d.ts
CHANGED
@@ -10,7 +10,7 @@ export default class Conch extends Command {
     'dry-run': import("@oclif/core/interfaces").BooleanFlag<boolean>;
     git: import("@oclif/core/interfaces").OptionFlag<string, import("@oclif/core/interfaces").CustomOptions>;
     local: import("@oclif/core/interfaces").OptionFlag<string | undefined, import("@oclif/core/interfaces").CustomOptions>;
-    name: import("@oclif/core/interfaces").OptionFlag<string
+    name: import("@oclif/core/interfaces").OptionFlag<string, import("@oclif/core/interfaces").CustomOptions>;
     sources: import("@oclif/core/interfaces").OptionFlag<string[] | undefined, import("@oclif/core/interfaces").CustomOptions>;
     workdir: import("@oclif/core/interfaces").OptionFlag<string, import("@oclif/core/interfaces").CustomOptions>;
 };
package/dist/commands/conch.js
CHANGED
@@ -8,10 +8,12 @@ import { promisify } from 'node:util';
 const execAsync = promisify(exec);
 export default class Conch extends Command {
     static args = {
-        op: Args.string({ description: 'Operation to perform (init, install, or start)' }),
+        op: Args.string({ description: 'Operation to perform (add, init, install, or start)' }),
     };
     static description = 'Initialize, install, or start OpenMRS frontend development';
     static examples = [
+        '<%= config.bin %> <%= command.id %> add -g my-repo/my-package -n my-package -w ~/projects',
+        '<%= config.bin %> <%= command.id %> add -g my-repo/my-package -b main -n my-package -w ~/projects',
         '<%= config.bin %> <%= command.id %> install -n my-app -w ~/projects',
         '<%= config.bin %> <%= command.id %> init -n my-app -w ~/projects',
         '<%= config.bin %> <%= command.id %> start -n my-app -w ~/projects',
@@ -29,14 +31,14 @@ export default class Conch extends Command {
         }),
         git: Flags.string({
             char: 'g',
-            default: 'dermatologist/openmrs-esm-dhti
+            default: 'dermatologist/openmrs-esm-dhti',
             description: 'GitHub repository to install (for install operation)',
         }),
         local: Flags.string({
             char: 'l',
             description: 'Local path to use instead of calculated workdir/name path (for start operation)',
         }),
-        name: Flags.string({ char: 'n', description: 'Name of the conch' }),
+        name: Flags.string({ char: 'n', default: 'esm-dhti', description: 'Name of the conch' }),
         sources: Flags.string({
             char: 's',
             description: 'Additional sources to include when starting (e.g., packages/esm-chatbot-agent, packages/esm-another-app)',
@@ -50,6 +52,43 @@ export default class Conch extends Command {
     };
     async run() {
         const { args, flags } = await this.parse(Conch);
+        if (args.op === 'add') {
+            // Validate that git and name are overridden from defaults
+            const defaultGit = 'dermatologist/openmrs-esm-dhti';
+            const defaultName = 'esm-dhti';
+            const gitOverridden = flags.git !== defaultGit;
+            const nameOverridden = flags.name !== defaultName;
+            if (!gitOverridden || !nameOverridden) {
+                console.log(chalk.yellow('Note: The "add" operation requires non-default values for both --git and --name flags.'));
+                if (!gitOverridden) {
+                    console.log(chalk.yellow('  Current --git: (default)'));
+                }
+                if (!nameOverridden) {
+                    console.log(chalk.yellow('  Current --name: (default)'));
+                }
+                console.log(chalk.yellow('\nNo changes made. Please provide custom --git and --name values.'));
+                this.exit(0);
+            }
+            if (flags['dry-run']) {
+                console.log(chalk.yellow('[DRY RUN] Would execute add operation:'));
+                const targetPath = path.join(flags.workdir, 'esm-dhti', 'packages', flags.name);
+                console.log(chalk.cyan(`  npx degit ${flags.git}#${flags.branch} ${targetPath}`));
+                return;
+            }
+            try {
+                console.log(chalk.blue(`Adding package ${flags.name} from ${flags.git}#${flags.branch}...`));
+                const targetPath = path.join(flags.workdir, 'esm-dhti', 'packages', flags.name);
+                const degitCommand = `npx degit ${flags.git}#${flags.branch} ${targetPath}`;
+                await execAsync(degitCommand);
+                console.log(chalk.green('✓ Package added successfully'));
+                console.log(chalk.green(`\n✓ Package is ready at ${targetPath}`));
+            }
+            catch (error) {
+                console.error(chalk.red('Error during add operation:'), error);
+                this.exit(1);
+            }
+            return;
+        }
         if (args.op === 'init') {
             // Validate required flags
             if (!flags.workdir) {
@@ -214,7 +253,7 @@ export default class Conch extends Command {
             return;
         }
         // If no valid operation is provided
-        console.error(chalk.red('Error: Invalid operation. Use "install", "init", or "start"'));
+        console.error(chalk.red('Error: Invalid operation. Use "add", "install", "init", or "start"'));
         this.exit(1);
     }
 }
package/dist/commands/copilot.d.ts
ADDED

@@ -0,0 +1,54 @@
+import { Command } from '@oclif/core';
+/**
+ * Copilot command that uses the GitHub Copilot SDK to interact with DHTI
+ * and display results with streaming support.
+ */
+export default class Copilot extends Command {
+    static description: string;
+    static examples: string[];
+    static flags: {
+        'clear-history': import("@oclif/core/interfaces").BooleanFlag<boolean>;
+        file: import("@oclif/core/interfaces").OptionFlag<string | undefined, import("@oclif/core/interfaces").CustomOptions>;
+        model: import("@oclif/core/interfaces").OptionFlag<string, import("@oclif/core/interfaces").CustomOptions>;
+        prompt: import("@oclif/core/interfaces").OptionFlag<string | undefined, import("@oclif/core/interfaces").CustomOptions>;
+        skill: import("@oclif/core/interfaces").OptionFlag<string, import("@oclif/core/interfaces").CustomOptions>;
+    };
+    /**
+     * Detects the appropriate skill based on the prompt content
+     * @param prompt - The user's prompt text
+     * @returns The detected skill name
+     */
+    private detectSkill;
+    /**
+     * Gets the path to the conversation history file
+     * @returns The path to the history file
+     */
+    private getHistoryFilePath;
+    /**
+     * Loads conversation history from file
+     * @returns Array of conversation turns or empty array if no history
+     */
+    private loadConversationHistory;
+    /**
+     * Saves conversation history to file
+     * @param history - Array of conversation turns to save
+     */
+    private saveConversationHistory;
+    /**
+     * Clears the conversation history
+     */
+    private clearConversationHistory;
+    /**
+     * Fetches skill content from GitHub if not available locally
+     * @param skillName - The name of the skill to fetch
+     * @returns The skill content or null if not found
+     */
+    private fetchSkillFromGitHub;
+    /**
+     * Loads skill instructions from local or remote source
+     * @param skillName - The name of the skill to load
+     * @returns The skill content or null if not found
+     */
+    private loadSkill;
+    run(): Promise<void>;
+}
package/dist/commands/copilot.js
ADDED

@@ -0,0 +1,300 @@
+import { CopilotClient } from '@github/copilot-sdk';
+import { Command, Flags } from '@oclif/core';
+import chalk from 'chalk';
+import fs from 'node:fs';
+import os from 'node:os';
+import path from 'node:path';
+import { fileURLToPath } from 'node:url';
+/**
+ * Copilot command that uses the GitHub Copilot SDK to interact with DHTI
+ * and display results with streaming support.
+ */
+export default class Copilot extends Command {
+    static description = 'Interact with DHTI using GitHub Copilot SDK with streaming responses';
+    static examples = [
+        '<%= config.bin %> <%= command.id %> --prompt "Start the DHTI stack with langserve"',
+        '<%= config.bin %> <%= command.id %> --file ./my-prompt.txt --model gpt-4.1',
+        '<%= config.bin %> <%= command.id %> --prompt "Generate a new elixir for patient risk assessment" --skill elixir-generator',
+        '<%= config.bin %> <%= command.id %> --clear-history --prompt "Start fresh conversation"',
+        '<%= config.bin %> <%= command.id %> --clear-history # Clear history without starting new conversation',
+    ];
+    static flags = {
+        'clear-history': Flags.boolean({
+            default: false,
+            description: 'Clear conversation history and start a new session',
+        }),
+        file: Flags.string({
+            char: 'f',
+            description: 'Path to a file containing the prompt to send to copilot-sdk',
+            exclusive: ['prompt'],
+        }),
+        model: Flags.string({
+            char: 'm',
+            default: 'gpt-4.1',
+            description: 'Model to use for copilot-sdk interactions',
+        }),
+        prompt: Flags.string({
+            char: 'p',
+            description: 'Prompt to send to the copilot-sdk',
+            exclusive: ['file'],
+        }),
+        skill: Flags.string({
+            char: 's',
+            default: 'auto',
+            description: 'Skill to use for copilot-sdk interactions (auto, start-dhti, elixir-generator, conch-generator)',
+        }),
+    };
+    /**
+     * Detects the appropriate skill based on the prompt content
+     * @param prompt - The user's prompt text
+     * @returns The detected skill name
+     */
+    detectSkill(prompt) {
+        const lowerPrompt = prompt.toLowerCase();
+        // Check for elixir-related keywords
+        if (lowerPrompt.includes('elixir') ||
+            lowerPrompt.includes('backend') ||
+            lowerPrompt.includes('langserve') ||
+            lowerPrompt.includes('genai app')) {
+            return 'elixir-generator';
+        }
+        // Check for conch-related keywords
+        if (lowerPrompt.includes('conch') ||
+            lowerPrompt.includes('frontend') ||
+            lowerPrompt.includes('ui') ||
+            lowerPrompt.includes('openmrs')) {
+            return 'conch-generator';
+        }
+        // Default to start-dhti for general setup/orchestration
+        return 'start-dhti';
+    }
+    /**
+     * Gets the path to the conversation history file
+     * @returns The path to the history file
+     */
+    getHistoryFilePath() {
+        const dhtiDir = path.join(os.homedir(), '.dhti');
+        if (!fs.existsSync(dhtiDir)) {
+            fs.mkdirSync(dhtiDir, { recursive: true });
+        }
+        return path.join(dhtiDir, 'copilot-history.json');
+    }
+    /**
+     * Loads conversation history from file
+     * @returns Array of conversation turns or empty array if no history
+     */
+    loadConversationHistory() {
+        try {
+            const historyPath = this.getHistoryFilePath();
+            if (fs.existsSync(historyPath)) {
+                const historyData = fs.readFileSync(historyPath, 'utf8');
+                return JSON.parse(historyData);
+            }
+        }
+        catch (error) {
+            this.warn(chalk.yellow(`Failed to load conversation history: ${error}`));
+        }
+        return [];
+    }
+    /**
+     * Saves conversation history to file
+     * @param history - Array of conversation turns to save
+     */
+    saveConversationHistory(history) {
+        try {
+            const historyPath = this.getHistoryFilePath();
+            fs.writeFileSync(historyPath, JSON.stringify(history, null, 2), 'utf8');
+        }
+        catch (error) {
+            this.warn(chalk.yellow(`Failed to save conversation history: ${error}`));
+        }
+    }
+    /**
+     * Clears the conversation history
+     */
+    clearConversationHistory() {
+        try {
+            const historyPath = this.getHistoryFilePath();
+            if (fs.existsSync(historyPath)) {
+                fs.unlinkSync(historyPath);
+                this.log(chalk.green('✓ Conversation history cleared'));
+            }
+            else {
+                this.log(chalk.yellow('No conversation history to clear'));
+            }
+        }
+        catch (error) {
+            this.warn(chalk.yellow(`Failed to clear conversation history: ${error}`));
+        }
+    }
+    /**
+     * Fetches skill content from GitHub if not available locally
+     * @param skillName - The name of the skill to fetch
+     * @returns The skill content or null if not found
+     */
+    async fetchSkillFromGitHub(skillName) {
+        try {
+            const url = `https://raw.githubusercontent.com/dermatologist/dhti/develop/.agents/skills/${skillName}/SKILL.md`;
+            const response = await fetch(url);
+            if (!response.ok) {
+                return null;
+            }
+            return response.text();
+        }
+        catch (error) {
+            this.warn(`Failed to fetch skill ${skillName} from GitHub: ${error}`);
+            return null;
+        }
+    }
+    /**
+     * Loads skill instructions from local or remote source
+     * @param skillName - The name of the skill to load
+     * @returns The skill content or null if not found
+     */
+    async loadSkill(skillName) {
+        // Resolve skills directory
+        const __filename = fileURLToPath(import.meta.url);
+        const __dirname = path.dirname(__filename);
+        const skillsDir = path.resolve(__dirname, '../../.agents/skills');
+        const skillPath = path.join(skillsDir, skillName, 'SKILL.md');
+        // Try to load from local directory first
+        if (fs.existsSync(skillPath)) {
+            try {
+                return fs.readFileSync(skillPath, 'utf8');
+            }
+            catch (error) {
+                this.warn(`Failed to read local skill file: ${error}`);
+            }
+        }
+        // If not found locally, try to fetch from GitHub
+        this.log(chalk.yellow(`Skill ${skillName} not found locally, fetching from GitHub...`));
+        return this.fetchSkillFromGitHub(skillName);
+    }
+    // eslint-disable-next-line perfectionist/sort-classes
+    async run() {
+        const { flags } = await this.parse(Copilot);
+        // Handle clear-history flag
+        if (flags['clear-history']) {
+            this.clearConversationHistory();
+            // If only clearing history, exit after clearing
+            if (!flags.prompt && !flags.file) {
+                return;
+            }
+        }
+        // Validate that either prompt or file is provided
+        if (!flags.prompt && !flags.file) {
+            this.error('Either --prompt or --file must be provided');
+        }
+        // Get the prompt content
+        let promptContent;
+        if (flags.file) {
+            if (!fs.existsSync(flags.file)) {
+                this.error(`File not found: ${flags.file}`);
+            }
+            try {
+                promptContent = fs.readFileSync(flags.file, 'utf8');
+            }
+            catch (error) {
+                this.error(`Failed to read file: ${error}`);
+            }
+        }
+        else {
+            promptContent = flags.prompt;
+        }
+        // Load conversation history
+        const conversationHistory = this.loadConversationHistory();
+        const hasHistory = conversationHistory.length > 0;
+        if (hasHistory) {
+            this.log(chalk.cyan(`📜 Loaded ${conversationHistory.length} previous message(s) from history`));
+        }
+        // Determine which skill to use
+        let skillName = flags.skill;
+        if (skillName === 'auto') {
+            skillName = this.detectSkill(promptContent);
+            this.log(chalk.cyan(`Auto-detected skill: ${skillName}`));
+        }
+        // Load the skill instructions
+        const skillContent = await this.loadSkill(skillName);
+        if (!skillContent) {
+            this.warn(chalk.yellow(`Could not load skill: ${skillName}. Proceeding without skill context.`));
+        }
+        // Build system message with skill instructions
+        let systemMessageContent = 'You are a helpful assistant that can use specific skills to generate components of the DHTI stack based on user prompts.';
+        // Add skill-specific instructions
+        if (skillContent) {
+            systemMessageContent += '\n\n' + skillContent;
+        }
+        // Add default instruction to use start-dhti skill
+        if (skillName !== 'start-dhti') {
+            systemMessageContent += '\n\nNote: If the user needs to start the DHTI stack, use the start-dhti skill workflow.';
+        }
+        // Add conversation history context
+        if (hasHistory) {
+            systemMessageContent += '\n\n## Previous Conversation\n';
+            systemMessageContent += 'Here is the conversation history for context:\n\n';
+            for (const turn of conversationHistory) {
+                systemMessageContent += `${turn.role === 'user' ? 'User' : 'Assistant'}: ${turn.content}\n\n`;
+            }
+            systemMessageContent += 'Continue the conversation naturally based on this context.';
+        }
+        this.log(chalk.green('Initializing GitHub Copilot SDK...'));
+        let client = null;
+        let assistantResponse = '';
+        try {
+            // Create copilot client
+            client = new CopilotClient();
+            // Create a session with streaming enabled
+            const session = await client.createSession({
+                model: flags.model,
+                streaming: true,
+                systemMessage: {
+                    content: systemMessageContent,
+                    mode: 'append',
+                },
+            });
+            this.log(chalk.green(`Using model: ${flags.model}`));
+            this.log(chalk.green(`Using skill: ${skillName}`));
+            this.log(chalk.blue('\n--- Copilot Response ---\n'));
+            // Handle streaming responses
+            let responseStarted = false;
+            session.on('assistant.message_delta', (event) => {
+                if (!responseStarted) {
+                    responseStarted = true;
+                }
+                const content = event.data.deltaContent;
+                process.stdout.write(content);
+                assistantResponse += content;
+            });
+            // Handle session idle (response complete)
+            session.on('session.idle', () => {
+                if (responseStarted) {
+                    console.log('\n'); // Add newline after response
+                }
+            });
+            // Send the prompt and wait for completion
+            await session.sendAndWait({ prompt: promptContent });
+            this.log(chalk.blue('\n--- End of Response ---\n'));
+            // Save conversation history
+            conversationHistory.push({ content: promptContent, role: 'user' });
+            if (assistantResponse.trim()) {
+                conversationHistory.push({ content: assistantResponse.trim(), role: 'assistant' });
+            }
+            this.saveConversationHistory(conversationHistory);
+            this.log(chalk.dim(`💾 Conversation saved (${conversationHistory.length} messages). Use --clear-history to reset.`));
+        }
+        catch (error) {
+            const errorMessage = error instanceof Error ? error.message : String(error);
+            this.error(chalk.red(`Failed to interact with Copilot SDK: ${errorMessage}\n\n`) +
+                chalk.yellow('Troubleshooting:\n') +
+                chalk.yellow('1. Ensure GitHub Copilot CLI is installed: https://docs.github.com/en/copilot/using-github-copilot/using-github-copilot-in-the-command-line\n') +
+                chalk.yellow('2. Authenticate with: copilot auth login\n') +
+                chalk.yellow('3. Verify CLI is working: copilot --version\n'));
+        }
+        finally {
+            // Clean up
+            if (client) {
+                await client.stop();
+            }
+        }
+    }
+}
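The skill auto-detection in copilot.js above keys on keywords in the prompt. A condensed, standalone sketch of the same precedence (elixir keywords win, then conch keywords, then the start-dhti fallback):

```javascript
// Same keyword precedence as copilot.js's detectSkill: elixir-related
// terms first, conch/UI terms second, start-dhti as the default.
function detectSkill(prompt) {
  const p = prompt.toLowerCase();
  if (['elixir', 'backend', 'langserve', 'genai app'].some((k) => p.includes(k))) {
    return 'elixir-generator';
  }
  if (['conch', 'frontend', 'ui', 'openmrs'].some((k) => p.includes(k))) {
    return 'conch-generator';
  }
  return 'start-dhti';
}

console.log(detectSkill('Generate a new elixir for risk assessment')); // elixir-generator
console.log(detectSkill('Create an OpenMRS frontend widget'));         // conch-generator
console.log(detectSkill('Start the stack'));                           // start-dhti
```

Note that the matching is substring-based, so a word like "build" also matches the `ui` keyword; the fallback order decides ties.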
package/dist/commands/docker.js
CHANGED
@@ -110,15 +110,20 @@ export default class Docker extends Command {
             console.log('Please provide a valid path to bootstrap.py file');
             this.exit(1);
         }
-
+        // Determine copy direction based on whether local file exists
+        const fileExists = fs.existsSync(flags.file);
+        const copyCommand = fileExists
+            ? `docker cp ${flags.file} ${flags.container}:/app/app/bootstrap.py`
+            : `docker cp ${flags.container}:/app/app/bootstrap.py ${flags.file}`;
         const restartCommand = `docker restart ${flags.container}`;
         if (flags['dry-run']) {
             console.log(chalk.yellow('[DRY RUN] Would execute:'));
-
+            const direction = fileExists ? 'to container' : 'from container';
+            console.log(chalk.cyan(`  ${copyCommand} (copy ${direction})`));
             console.log(chalk.cyan(`  ${restartCommand}`));
             return;
         }
-        // copy
+        // copy file and only restart after copy completes
         exec(copyCommand, (error, stdout, stderr) => {
             if (error) {
                 console.error(`exec error: ${error}`);
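The docker.js change above makes `bootstrap` a two-way sync: if the local bootstrap.py exists it is pushed into the container, otherwise the container's copy is pulled out so you have a file to edit. A minimal sketch of that direction choice (the container name below is a hypothetical example):

```javascript
// Pick the docker cp direction: push the local file if it exists,
// otherwise pull the container's bootstrap.py to the local path.
function buildCopyCommand(fileExists, file, container) {
  return fileExists
    ? `docker cp ${file} ${container}:/app/app/bootstrap.py`
    : `docker cp ${container}:/app/app/bootstrap.py ${file}`;
}

console.log(buildCopyCommand(true, 'bootstrap.py', 'langserve'));
// → docker cp bootstrap.py langserve:/app/app/bootstrap.py
console.log(buildCopyCommand(false, 'bootstrap.py', 'langserve'));
// → docker cp langserve:/app/app/bootstrap.py bootstrap.py
```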
package/dist/resources/docker-compose-master.yml
CHANGED
@@ -1,4 +1,3 @@
-
 services:
   gateway:
     image: beapen/dhti-gateway:latest
@@ -88,7 +87,6 @@ services:
       - OPENROUTER_API_KEY=${OPENROUTER_API_KEY:-openrouter-api-key}
       - FHIR_BASE_URL=${FHIR_BASE_URL:-http://localhost:8080/openmrs/ws/fhir2/R4}

-
   ollama:
     image: ollama/ollama:latest
     ports:
@@ -357,6 +355,28 @@ services:
     depends_on:
       - medplum-server

+  orthanc:
+    image: orthancteam/orthanc:latest
+    ports:
+      # Orthanc web interface and DICOMweb port
+      - "8042:8042"
+      # DICOM store SCU/SCP port (default 4242 in container, 104 is standard, 8104 used in some examples)
+      - "8104:4242"
+    environment:
+      # Enable the web interface and remote access
+      - ORTHANC_JSON={"RemoteAccessAllowed":true,"AuthenticationEnabled":false,"HttpHeaders":{"Access-Control-Allow-Origin":"*","Access-Control-Allow-Methods":"GET, POST, PUT, DELETE, OPTIONS","Access-Control-Allow-Headers":"Content-Type"},"HttpRequestTimeout":60}
+    volumes:
+      # Persist the data (DICOM files and database)
+      - orthanc-data:/var/lib/orthanc/
+    restart: unless-stopped
+
+  cors-proxy:
+    image: redocly/cors-anywhere:latest
+    ports:
+      - "8010:8080"
+    restart: "unless-stopped"
+
+
 volumes:
   openmrs-data: ~
   openmrs-db: ~
@@ -368,3 +388,4 @@ volumes:
   ollama-root: ~
   ollama-webui: ~
   mcpx-config: ~
+  orthanc-data: ~
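The new `orthanc` service exposes the web UI and DICOMweb on host port 8042 and maps the in-container DICOM port 4242 to 8104, while `cors-proxy` forwards host port 8010 to cors-anywhere's 8080. A small sketch of the endpoint URLs those mappings imply (assumptions: the stack runs on `localhost`, Orthanc serves DICOMweb under its default `/dicom-web` prefix, and cors-anywhere takes the target URL appended to its own):

```python
# Host ports from the compose mappings above.
ORTHANC_HOST_PORT = 8042   # web UI and DICOMweb ("8042:8042")
CORS_PROXY_PORT = 8010     # redocly/cors-anywhere ("8010:8080")

def orthanc_dicomweb_url(host: str = "localhost") -> str:
    # Orthanc's DICOMweb plugin defaults to the /dicom-web root
    return f"http://{host}:{ORTHANC_HOST_PORT}/dicom-web"

def via_cors_proxy(target_url: str, host: str = "localhost") -> str:
    # cors-anywhere proxies the URL appended after its own base
    return f"http://{host}:{CORS_PROXY_PORT}/{target_url}"

print(via_cors_proxy(orthanc_dicomweb_url()))
# http://localhost:8010/http://localhost:8042/dicom-web
```

Note the compose file also sets permissive CORS headers directly in `ORTHANC_JSON`, so the proxy is mainly useful for upstreams that do not set them.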
package/dist/resources/genai/app/bootstrap.py
CHANGED
@@ -1,19 +1,49 @@
 from kink import di
-
+import os
 from dotenv import load_dotenv
+from langchain.chat_models import init_chat_model
+from langchain_community.llms.fake import FakeListLLM
 from langchain_core.prompts import PromptTemplate
-from
+from langchain_google_genai import ChatGoogleGenerativeAI
+from langchain_openai import ChatOpenAI

 ## Override the default configuration of elixirs here if needed


 def bootstrap():
     load_dotenv()
-
+    di["fhir_access_token"] = os.environ.get(
+        "FHIR_ACCESS_TOKEN", "YWRtaW46QWRtaW4xMjM="
+    )  # admin:Admin123 in base64
+    di["fhir_base_url"] = os.environ.get(
+        "FHIR_BASE_URL", "http://backend:8080/openmrs/ws/fhir2/R4"
+    )
+    # Check if google api key is set in the environment
+    if os.environ.get("GOOGLE_API_KEY"):
+        llm = ChatGoogleGenerativeAI(model="gemini-2.5-flash")
+    # Check if openai api key is set in the environment
+    elif os.environ.get("OPENAI_API_KEY"):
+        llm = ChatOpenAI(model="gpt-4o", temperature=0)
+    else:
+        llm = FakeListLLM(responses=["I am a fake LLM", "I don't know"])
+    di["main_llm"] = llm
+
+    openrouter_api_key = os.environ.get("OPENROUTER_API_KEY")
+    if openrouter_api_key:
+        model = init_chat_model(
+            model="nvidia/nemotron-nano-9b-v2:free",
+            model_provider="openai",
+            base_url="https://openrouter.ai/api/v1",
+            api_key=openrouter_api_key,
+        )
+    else:
+        # Fallback to the main LLM if no OpenRouter API key is configured
+        model = llm
+
+    di["function_llm"] = model
     di["main_prompt"] = PromptTemplate.from_template(
         "Summarize the following in 100 words: {input}"
     )
-    di["main_llm"] = fake_llm
     di["cds_hook_discovery"] = {
         "services": [
             {
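The reworked `bootstrap()` selects the main LLM by checking environment variables in priority order (Google, then OpenAI, then a fake fallback) and wires a separate function-calling LLM through OpenRouter only when its key is present. A dependency-free sketch of that selection order (the provider and model names come from the diff; `select_providers` itself is illustrative, not the actual bootstrap):

```python
def select_providers(env: dict) -> tuple:
    """Return (main_llm, function_llm) provider labels using bootstrap()'s
    priority: Google > OpenAI > fake fallback, with OpenRouter overriding
    the function LLM when OPENROUTER_API_KEY is set."""
    if env.get("GOOGLE_API_KEY"):
        main = "google:gemini-2.5-flash"
    elif env.get("OPENAI_API_KEY"):
        main = "openai:gpt-4o"
    else:
        main = "fake"
    if env.get("OPENROUTER_API_KEY"):
        function = "openrouter:nvidia/nemotron-nano-9b-v2:free"
    else:
        function = main  # fall back to the main LLM
    return main, function

print(select_providers({"OPENAI_API_KEY": "sk-test"}))
# ('openai:gpt-4o', 'openai:gpt-4o')
```

Because the fallback chain ends in `FakeListLLM`, the stack boots and answers (with canned responses) even when no API keys are configured, which keeps local testing keyless.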
package/oclif.manifest.json
CHANGED
@@ -87,12 +87,14 @@
       "aliases": [],
       "args": {
         "op": {
-          "description": "Operation to perform (init, install, or start)",
+          "description": "Operation to perform (add, init, install, or start)",
           "name": "op"
         }
       },
       "description": "Initialize, install, or start OpenMRS frontend development",
       "examples": [
+        "<%= config.bin %> <%= command.id %> add -g my-repo/my-package -n my-package -w ~/projects",
+        "<%= config.bin %> <%= command.id %> add -g my-repo/my-package -b main -n my-package -w ~/projects",
         "<%= config.bin %> <%= command.id %> install -n my-app -w ~/projects",
         "<%= config.bin %> <%= command.id %> init -n my-app -w ~/projects",
         "<%= config.bin %> <%= command.id %> start -n my-app -w ~/projects",
@@ -118,7 +120,7 @@
           "char": "g",
           "description": "GitHub repository to install (for install operation)",
           "name": "git",
-          "default": "dermatologist/openmrs-esm-dhti
+          "default": "dermatologist/openmrs-esm-dhti",
           "hasDynamicHelp": false,
           "multiple": false,
           "type": "option"
@@ -135,6 +137,7 @@
           "char": "n",
           "description": "Name of the conch",
           "name": "name",
+          "default": "esm-dhti",
           "hasDynamicHelp": false,
           "multiple": false,
           "type": "option"
@@ -172,6 +175,80 @@
         "conch.js"
       ]
     },
+    "copilot": {
+      "aliases": [],
+      "args": {},
+      "description": "Interact with DHTI using GitHub Copilot SDK with streaming responses",
+      "examples": [
+        "<%= config.bin %> <%= command.id %> --prompt \"Start the DHTI stack with langserve\"",
+        "<%= config.bin %> <%= command.id %> --file ./my-prompt.txt --model gpt-4.1",
+        "<%= config.bin %> <%= command.id %> --prompt \"Generate a new elixir for patient risk assessment\" --skill elixir-generator",
+        "<%= config.bin %> <%= command.id %> --clear-history --prompt \"Start fresh conversation\"",
+        "<%= config.bin %> <%= command.id %> --clear-history # Clear history without starting new conversation"
+      ],
+      "flags": {
+        "clear-history": {
+          "description": "Clear conversation history and start a new session",
+          "name": "clear-history",
+          "allowNo": false,
+          "type": "boolean"
+        },
+        "file": {
+          "char": "f",
+          "description": "Path to a file containing the prompt to send to copilot-sdk",
+          "exclusive": [
+            "prompt"
+          ],
+          "name": "file",
+          "hasDynamicHelp": false,
+          "multiple": false,
+          "type": "option"
+        },
+        "model": {
+          "char": "m",
+          "description": "Model to use for copilot-sdk interactions",
+          "name": "model",
+          "default": "gpt-4.1",
+          "hasDynamicHelp": false,
+          "multiple": false,
+          "type": "option"
+        },
+        "prompt": {
+          "char": "p",
+          "description": "Prompt to send to the copilot-sdk",
+          "exclusive": [
+            "file"
+          ],
+          "name": "prompt",
+          "hasDynamicHelp": false,
+          "multiple": false,
+          "type": "option"
+        },
+        "skill": {
+          "char": "s",
+          "description": "Skill to use for copilot-sdk interactions (auto, start-dhti, elixir-generator, conch-generator)",
+          "name": "skill",
+          "default": "auto",
+          "hasDynamicHelp": false,
+          "multiple": false,
+          "type": "option"
+        }
+      },
+      "hasDynamicHelp": false,
+      "hiddenAliases": [],
+      "id": "copilot",
+      "pluginAlias": "dhti-cli",
+      "pluginName": "dhti-cli",
+      "pluginType": "core",
+      "strict": true,
+      "enableJsonFlag": false,
+      "isESM": true,
+      "relativePath": [
+        "dist",
+        "commands",
+        "copilot.js"
+      ]
+    },
     "docker": {
       "aliases": [],
       "args": {
@@ -805,5 +882,5 @@
       ]
     }
   },
-  "version": "1.1.0"
+  "version": "1.3.0"
 }
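The new `copilot` manifest entry declares `--prompt` and `--file` as mutually exclusive, with `--model` defaulting to `gpt-4.1` and `--skill` to `auto`. A rough Python mirror of that flag contract (illustrative only — oclif itself enforces the `exclusive` pairs and defaults at parse time):

```python
def copilot_flags(prompt=None, file=None, model="gpt-4.1", skill="auto",
                  clear_history=False):
    """Mirror the manifest's contract: --prompt/--file are exclusive;
    --model and --skill defaults come from the manifest entries above."""
    if prompt is not None and file is not None:
        # oclif rejects this combination via the "exclusive" lists
        raise ValueError("--prompt and --file cannot be combined")
    return {"prompt": prompt, "file": file, "model": model,
            "skill": skill, "clear_history": clear_history}

flags = copilot_flags(prompt="Start the DHTI stack with langserve")
print(flags["model"], flags["skill"])
# gpt-4.1 auto
```

Per the manifest's examples, `--clear-history` is also valid on its own, resetting the session without sending a new prompt.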
package/package.json
CHANGED
@@ -1,13 +1,14 @@
 {
   "name": "dhti-cli",
   "description": "DHTI CLI",
-  "version": "1.1.0",
+  "version": "1.3.0",
   "author": "Bell Eapen",
   "bin": {
     "dhti-cli": "bin/run.js"
   },
   "bugs": "https://github.com/dermatologist/dhti/issues",
   "dependencies": {
+    "@github/copilot-sdk": "^0.1.23",
     "@langchain/community": "^0.3.53",
     "@langchain/ollama": "^0.2.3",
     "@oclif/core": "^4",
@@ -18,7 +19,8 @@
     "js-yaml": "^4.1.0",
     "medpromptjs": ">=0.4.3",
     "ora": "^8.0.1",
-    "request": "^2.88.2"
+    "request": "^2.88.2",
+    "vscode-jsonrpc": "^8.2.1"
   },
   "devDependencies": {
     "@oclif/prettier-config": "^0.2.1",