dhti-cli 1.1.0 → 1.2.0

package/README.md CHANGED
@@ -25,6 +25,8 @@ Generative AI features are built as [LangServe Apps](https://python.langchain.co

  🚀 You can test the elixir using a real EMR system, [OpenMRS](https://openmrs.org/), that communicates with the elixir using **CDS-Hooks**, or use any other CDS-Hooks compatible EMR system. You can also use the [CDS-Hooks sandbox for testing](https://github.com/dermatologist/cds-hooks-sandbox/tree/dhti-1) without an EMR.

+ 🚀 Check out **[Vidhi Recipes](/vidhi/README.md)** for chatbot, RAG, imaging (DICOM), and MCPX for dockerized calculators.
+
  #### How (non‑technical / clinical)
  DHTI includes ready‑to‑use [skills](/.github/skills/) that can prompt agentic platforms (e.g., [AntiGravity](https://antigravity.google/), VSCode, or Claude) to generate the GenAI backends and UI components (elixirs and conches) you need. Test these components with synthetic data in OpenMRS or the CDS‑Hooks sandbox, then hand them off to production teams. Because DHTI follows open standards, that handoff (the “valley of death”) becomes smoother and more predictable. Try the [prompts](/.github/skills/start-dhti/examples/e2e-sample.md) in your preferred agentic platform after cloning this repo.

@@ -52,15 +54,16 @@ npx dhti-cli docker -u # start services from compose
  ```

  Notes:
- - Configure models and hyperparameters in `~/dhti/elixir/app/bootstrap.py` or install from a local directory using `-l`.
+ - Install from a local directory using `-l`.
  - Stop and remove containers with `npx dhti-cli docker -d`.

  ✌️ Decide where to test the new elixir: OpenMRS, a full EHR system, or the CDS-Hooks sandbox for lightweight testing without an EHR.

  💥 Test the elixir in a CDS-Hooks sandbox.

- * `npx dhti-cli conch start -n dhti-elixir-schat` and navigate to the Application URL displayed in the console. (Uses hapi.fhir.org).
- * In the **Rx View** tab, type in the contentString textbox and wait for the elixir to respond.
+ * `npx dhti-cli elixir start -n dhti-elixir-schat` and navigate to the **Application URL displayed in the console** (scroll up to see it), not the base URL listed at the bottom.
+ * Uses hapi.fhir.org for data.
+ * In the **Rx View** tab, type in the contentString textbox and wait for the elixir to respond (it submits automatically after 5 seconds).

  <p align="center">
  <img src="https://github.com/dermatologist/dhti/blob/develop/notes/cds-hook-sandbox.jpg" />
@@ -88,12 +91,17 @@ You will see the new **patient context aware chatbot** in the patient summary pa

  * `npx dhti-cli docker -d` to stop and delete all the docker containers.

+ ## Configuration
+
+ * `npx dhti-cli docker bootstrap -f bootstrap.py` creates and syncs `bootstrap.py`, where you can configure the default model and hyperparameters for LangServe. Run the command again after editing `bootstrap.py` to apply the changes.
+
  ## Wiki & Documentation
  * [![Wiki](https://img.shields.io/badge/DHTI-wiki-demo)](https://github.com/dermatologist/dhti/wiki)
  * [Documentation](https://dermatologist.github.io/dhti/)
  * [CLI Reference](/notes/README.md)

  ## User contributions & examples
+ * 🚀 **[Vidhi Recipes](/vidhi/README.md)** for chatbot, RAG, imaging (DICOM), and MCPX for dockerized calculators
  * [Elixirs](https://github.com/dermatologist/dhti-elixir)
  * [OpenMRS Conches / UI](https://github.com/dermatologist/openmrs-esm-dhti)
  * [CDS Hooks Sandbox for testing](https://github.com/dermatologist/cds-hooks-sandbox)
@@ -101,14 +109,16 @@ You will see the new **patient context aware chatbot** in the patient summary pa
  ## Presentations
  ⭐️ **Pitched at [Falling Walls Lab Illinois](https://falling-walls.com/falling-walls-lab-illinois) and released on 2025-09-12.**

- ## What problems do DHTI solve?
+ ## 🔧 What problems does DHTI solve?

  | Why | How |
  | --- | --- |
+ | I am a clinician! I have no idea how to build GenAI apps. | ✨ DHTI comes with batteries ([skills](/.github/skills/)) included! Use your preferred agentic platform (e.g., [AntiGravity](https://antigravity.google/), [VSCode with Copilot in agent mode](https://code.visualstudio.com/docs/copilot/overview), Claude, [Cursor](https://cursor.com/), and many others) to generate elixirs and conches from [problem-oriented prompts](/prompts/e2e-sample.md) (most of these platforms have a free tier). Test them using synthetic data in OpenMRS or the CDS-Hooks sandbox, then hand them off to production teams. |
  | I know LangChain, but I don’t know how to build a chain/agent based on data in our EHR. | [These sample elixirs](https://github.com/dermatologist/dhti-elixir) adopt FHIR and cds-hooks as standards for data retrieval and display. The [base class](https://github.com/dermatologist/dhti-elixir-base) provides reusable artifacts |
  | I need a simple platform for experimenting. | This repository provides everything to start experimenting fast. The command-line tools help to virtualize and orchestrate your experiments using [Docker](https://www.docker.com/)|
  | I am a UI designer. I want to design helpful UI for real users. | See [these sample conches](https://github.com/dermatologist/openmrs-esm-dhti). It shows how to build interface components (conches) for [OpenMRS](https://openmrs.org/) an open-source EMR used by many. Read more about [OpenMRS UI](https://o3-docs.openmrs.org/) |
  | We use another EMR | Your EMR may support CDS-Hook for displaying components. In that case, you can use [cds-hooks-sandbox for testing](https://github.com/dermatologist/cds-hooks-sandbox/tree/dhti-1) |
+ | We don't use an EMR. We use a web-based health information system for ------ population with no FHIR support. | You can still use DHTI as a GenAI experimentation platform. ✨ We have a [browser extension](https://github.com/dermatologist/openmrs-esm-dhti/blob/develop/packages/dhti-screen-grabber/README.md) that can read any web page! |
  | Our IT team is often unable to take my experiments to production. | Use DHTI, follow the recommended patterns, and you will make their lives easier.|


@@ -80,6 +80,7 @@ export default class Compose extends Command {
  const mcpx = ['mcpx'];
  const docktor = ['mcpx'];
  const medplum = ['medplum-server', 'medplum-app', 'postgres-db', 'redis', 'mpclient'];
+ const orthanc = ['orthanc', 'cors-proxy'];
  const _modules = {
  cqlFhir,
  docktor,
@@ -95,6 +96,7 @@ export default class Compose extends Command {
  openmrs,
  redis,
  webui,
+ orthanc,
  };
  try {
  const masterData = yaml.load(fs.readFileSync(path.join(RESOURCES_DIR, 'docker-compose-master.yml'), 'utf8'));
@@ -10,7 +10,7 @@ export default class Conch extends Command {
  'dry-run': import("@oclif/core/interfaces").BooleanFlag<boolean>;
  git: import("@oclif/core/interfaces").OptionFlag<string, import("@oclif/core/interfaces").CustomOptions>;
  local: import("@oclif/core/interfaces").OptionFlag<string | undefined, import("@oclif/core/interfaces").CustomOptions>;
- name: import("@oclif/core/interfaces").OptionFlag<string | undefined, import("@oclif/core/interfaces").CustomOptions>;
+ name: import("@oclif/core/interfaces").OptionFlag<string, import("@oclif/core/interfaces").CustomOptions>;
  sources: import("@oclif/core/interfaces").OptionFlag<string[] | undefined, import("@oclif/core/interfaces").CustomOptions>;
  workdir: import("@oclif/core/interfaces").OptionFlag<string, import("@oclif/core/interfaces").CustomOptions>;
  };
@@ -8,10 +8,12 @@ import { promisify } from 'node:util';
  const execAsync = promisify(exec);
  export default class Conch extends Command {
  static args = {
- op: Args.string({ description: 'Operation to perform (init, install, or start)' }),
+ op: Args.string({ description: 'Operation to perform (add, init, install, or start)' }),
  };
  static description = 'Initialize, install, or start OpenMRS frontend development';
  static examples = [
+ '<%= config.bin %> <%= command.id %> add -g my-repo/my-package -n my-package -w ~/projects',
+ '<%= config.bin %> <%= command.id %> add -g my-repo/my-package -b main -n my-package -w ~/projects',
  '<%= config.bin %> <%= command.id %> install -n my-app -w ~/projects',
  '<%= config.bin %> <%= command.id %> init -n my-app -w ~/projects',
  '<%= config.bin %> <%= command.id %> start -n my-app -w ~/projects',
@@ -29,14 +31,14 @@ export default class Conch extends Command {
  }),
  git: Flags.string({
  char: 'g',
- default: 'dermatologist/openmrs-esm-dhti-template',
+ default: 'dermatologist/openmrs-esm-dhti',
  description: 'GitHub repository to install (for install operation)',
  }),
  local: Flags.string({
  char: 'l',
  description: 'Local path to use instead of calculated workdir/name path (for start operation)',
  }),
- name: Flags.string({ char: 'n', description: 'Name of the conch' }),
+ name: Flags.string({ char: 'n', default: 'esm-dhti', description: 'Name of the conch' }),
  sources: Flags.string({
  char: 's',
  description: 'Additional sources to include when starting (e.g., packages/esm-chatbot-agent, packages/esm-another-app)',
@@ -50,6 +52,43 @@ export default class Conch extends Command {
  };
  async run() {
  const { args, flags } = await this.parse(Conch);
+ if (args.op === 'add') {
+ // Validate that git and name are overridden from defaults
+ const defaultGit = 'dermatologist/openmrs-esm-dhti';
+ const defaultName = 'esm-dhti';
+ const gitOverridden = flags.git !== defaultGit;
+ const nameOverridden = flags.name !== defaultName;
+ if (!gitOverridden || !nameOverridden) {
+ console.log(chalk.yellow('Note: The "add" operation requires non-default values for both --git and --name flags.'));
+ if (!gitOverridden) {
+ console.log(chalk.yellow(' Current --git: (default)'));
+ }
+ if (!nameOverridden) {
+ console.log(chalk.yellow(' Current --name: (default)'));
+ }
+ console.log(chalk.yellow('\nNo changes made. Please provide custom --git and --name values.'));
+ this.exit(0);
+ }
+ if (flags['dry-run']) {
+ console.log(chalk.yellow('[DRY RUN] Would execute add operation:'));
+ const targetPath = path.join(flags.workdir, 'esm-dhti', 'packages', flags.name);
+ console.log(chalk.cyan(` npx degit ${flags.git}#${flags.branch} ${targetPath}`));
+ return;
+ }
+ try {
+ console.log(chalk.blue(`Adding package ${flags.name} from ${flags.git}#${flags.branch}...`));
+ const targetPath = path.join(flags.workdir, 'esm-dhti', 'packages', flags.name);
+ const degitCommand = `npx degit ${flags.git}#${flags.branch} ${targetPath}`;
+ await execAsync(degitCommand);
+ console.log(chalk.green('✓ Package added successfully'));
+ console.log(chalk.green(`\n✓ Package is ready at ${targetPath}`));
+ }
+ catch (error) {
+ console.error(chalk.red('Error during add operation:'), error);
+ this.exit(1);
+ }
+ return;
+ }
  if (args.op === 'init') {
  // Validate required flags
  if (!flags.workdir) {
@@ -214,7 +253,7 @@ export default class Conch extends Command {
  return;
  }
  // If no valid operation is provided
- console.error(chalk.red('Error: Invalid operation. Use "install", "init", or "start"'));
+ console.error(chalk.red('Error: Invalid operation. Use "add", "install", "init", or "start"'));
  this.exit(1);
  }
  }
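
The `add` operation only proceeds when both `--git` and `--name` are changed from their defaults. A minimal Python sketch of that guard (an illustration, not part of the package; plain values stand in for the oclif flags, and the defaults mirror the CLI's):

```python
# Hypothetical mirror of the "add" guard in the Conch command above:
# the operation refuses to run unless BOTH --git and --name differ
# from their defaults, so the stock template is never cloned over itself.
DEFAULT_GIT = "dermatologist/openmrs-esm-dhti"
DEFAULT_NAME = "esm-dhti"

def can_add(git: str, name: str) -> bool:
    # True only when both flags were explicitly overridden.
    return git != DEFAULT_GIT and name != DEFAULT_NAME
```

When the guard fails, the CLI prints which flag is still at its default and exits without making changes.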
@@ -110,15 +110,20 @@ export default class Docker extends Command {
  console.log('Please provide a valid path to bootstrap.py file');
  this.exit(1);
  }
- const copyCommand = `docker cp ${flags.file} ${flags.container}:/app/app/bootstrap.py`;
+ // Determine copy direction based on whether local file exists
+ const fileExists = fs.existsSync(flags.file);
+ const copyCommand = fileExists
+ ? `docker cp ${flags.file} ${flags.container}:/app/app/bootstrap.py`
+ : `docker cp ${flags.container}:/app/app/bootstrap.py ${flags.file}`;
  const restartCommand = `docker restart ${flags.container}`;
  if (flags['dry-run']) {
  console.log(chalk.yellow('[DRY RUN] Would execute:'));
- console.log(chalk.cyan(` ${copyCommand}`));
+ const direction = fileExists ? 'to container' : 'from container';
+ console.log(chalk.cyan(` ${copyCommand} (copy ${direction})`));
  console.log(chalk.cyan(` ${restartCommand}`));
  return;
  }
- // copy -f to container:/app/app/ and only restart after copy completes
+ // copy file and only restart after copy completes
  exec(copyCommand, (error, stdout, stderr) => {
  if (error) {
  console.error(`exec error: ${error}`);
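
The change above makes `docker bootstrap` bidirectional: an existing local file is pushed into the container, a missing one is pulled out of it. A small Python sketch of the same direction check (an illustration only; the `/app/app/bootstrap.py` target path mirrors the CLI code above):

```python
import os

def copy_command(file: str, container: str) -> str:
    # Push the local bootstrap.py into the container when it exists locally;
    # otherwise pull the container's copy out to seed a local file to edit.
    if os.path.exists(file):
        return f"docker cp {file} {container}:/app/app/bootstrap.py"
    return f"docker cp {container}:/app/app/bootstrap.py {file}"
```

Either way, the CLI restarts the container only after the copy completes, so an edited `bootstrap.py` is applied on the next start.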
@@ -1,4 +1,3 @@
-
  services:
  gateway:
  image: beapen/dhti-gateway:latest
@@ -88,7 +87,6 @@ services:
  - OPENROUTER_API_KEY=${OPENROUTER_API_KEY:-openrouter-api-key}
  - FHIR_BASE_URL=${FHIR_BASE_URL:-http://localhost:8080/openmrs/ws/fhir2/R4}

-
  ollama:
  image: ollama/ollama:latest
  ports:
@@ -357,6 +355,28 @@ services:
  depends_on:
  - medplum-server

+ orthanc:
+ image: orthancteam/orthanc:latest
+ ports:
+ # Orthanc web interface and DICOMweb port
+ - "8042:8042"
+ # DICOM store SCU/SCP port (default 4242 in container, 104 is standard, 8104 used in some examples)
+ - "8104:4242"
+ environment:
+ # Enable the web interface and remote access
+ - ORTHANC_JSON={"RemoteAccessAllowed":true,"AuthenticationEnabled":false,"HttpHeaders":{"Access-Control-Allow-Origin":"*","Access-Control-Allow-Methods":"GET, POST, PUT, DELETE, OPTIONS","Access-Control-Allow-Headers":"Content-Type"},"HttpRequestTimeout":60}
+ volumes:
+ # Persist the data (DICOM files and database)
+ - orthanc-data:/var/lib/orthanc/
+ restart: unless-stopped
+
+ cors-proxy:
+ image: redocly/cors-anywhere:latest
+ ports:
+ - "8010:8080"
+ restart: "unless-stopped"
+
  volumes:
  openmrs-data: ~
  openmrs-db: ~
@@ -368,3 +388,4 @@ volumes:
  ollama-root: ~
  ollama-webui: ~
  mcpx-config: ~
+ orthanc-data: ~
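
The two new services pair a DICOM server (Orthanc) with a CORS proxy for browser-based clients. A small Python sketch of how a conch might address Orthanc's DICOMweb endpoint through the proxy (an assumption-laden illustration: it presumes the redocly/cors-anywhere image follows the usual cors-anywhere URL-prefix convention and that the host ports match the compose mapping above; the `/dicom-web/studies` route is the Orthanc DICOMweb plugin's QIDO-RS studies endpoint):

```python
ORTHANC = "http://localhost:8042"     # Orthanc web / DICOMweb (8042:8042 above)
CORS_PROXY = "http://localhost:8010"  # cors-anywhere (8010:8080 above)

def proxied(url: str) -> str:
    # cors-anywhere forwards a request whose path is the full target URL.
    return f"{CORS_PROXY}/{url}"

studies_url = proxied(f"{ORTHANC}/dicom-web/studies")
```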
@@ -1,19 +1,49 @@
  from kink import di
- from os import getenv
+ import os
  from dotenv import load_dotenv
+ from langchain.chat_models import init_chat_model
+ from langchain_community.llms.fake import FakeListLLM
  from langchain_core.prompts import PromptTemplate
- from langchain_core.language_models.fake import FakeListLLM
+ from langchain_google_genai import ChatGoogleGenerativeAI
+ from langchain_openai import ChatOpenAI

  ## Override the default configuration of elixirs here if needed


  def bootstrap():
  load_dotenv()
- fake_llm = FakeListLLM(responses=["Paris", "I don't know"])
+ di["fhir_access_token"] = os.environ.get(
+ "FHIR_ACCESS_TOKEN", "YWRtaW46QWRtaW4xMjM="
+ ) # admin:Admin123 in base64
+ di["fhir_base_url"] = os.environ.get(
+ "FHIR_BASE_URL", "http://backend:8080/openmrs/ws/fhir2/R4"
+ )
+ # Check if google api key is set in the environment
+ if os.environ.get("GOOGLE_API_KEY"):
+ llm = ChatGoogleGenerativeAI(model="gemini-2.5-flash")
+ # Check if openai api key is set in the environment
+ elif os.environ.get("OPENAI_API_KEY"):
+ llm = ChatOpenAI(model="gpt-4o", temperature=0)
+ else:
+ llm = FakeListLLM(responses=["I am a fake LLM", "I don't know"])
+ di["main_llm"] = llm
+
+ openrouter_api_key = os.environ.get("OPENROUTER_API_KEY")
+ if openrouter_api_key:
+ model = init_chat_model(
+ model="nvidia/nemotron-nano-9b-v2:free",
+ model_provider="openai",
+ base_url="https://openrouter.ai/api/v1",
+ api_key=openrouter_api_key,
+ )
+ else:
+ # Fallback to the main LLM if no OpenRouter API key is configured
+ model = llm
+
+ di["function_llm"] = model
  di["main_prompt"] = PromptTemplate.from_template(
  "Summarize the following in 100 words: {input}"
  )
- di["main_llm"] = fake_llm
  di["cds_hook_discovery"] = {
  "services": [
  {
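
The new `bootstrap()` registers `main_llm` with a precedence chain: Google first, then OpenAI, else an offline fake. A minimal sketch of that fallback (an illustration only; strings stand in for the LangChain model objects that `bootstrap.py` actually registers in kink's `di` container):

```python
def pick_main_llm(env: dict) -> str:
    # Mirrors the bootstrap() precedence shown above: GOOGLE_API_KEY wins,
    # then OPENAI_API_KEY, else the offline FakeListLLM stand-in.
    if env.get("GOOGLE_API_KEY"):
        return "gemini-2.5-flash"
    if env.get("OPENAI_API_KEY"):
        return "gpt-4o"
    return "fake-llm"
```

The same pattern repeats for `function_llm`, which prefers an OpenRouter-hosted model and otherwise falls back to whatever `main_llm` resolved to.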
@@ -87,12 +87,14 @@
  "aliases": [],
  "args": {
  "op": {
- "description": "Operation to perform (init, install, or start)",
+ "description": "Operation to perform (add, init, install, or start)",
  "name": "op"
  }
  },
  "description": "Initialize, install, or start OpenMRS frontend development",
  "examples": [
+ "<%= config.bin %> <%= command.id %> add -g my-repo/my-package -n my-package -w ~/projects",
+ "<%= config.bin %> <%= command.id %> add -g my-repo/my-package -b main -n my-package -w ~/projects",
  "<%= config.bin %> <%= command.id %> install -n my-app -w ~/projects",
  "<%= config.bin %> <%= command.id %> init -n my-app -w ~/projects",
  "<%= config.bin %> <%= command.id %> start -n my-app -w ~/projects",
@@ -118,7 +120,7 @@
  "char": "g",
  "description": "GitHub repository to install (for install operation)",
  "name": "git",
- "default": "dermatologist/openmrs-esm-dhti-template",
+ "default": "dermatologist/openmrs-esm-dhti",
  "hasDynamicHelp": false,
  "multiple": false,
  "type": "option"
@@ -135,6 +137,7 @@
  "char": "n",
  "description": "Name of the conch",
  "name": "name",
+ "default": "esm-dhti",
  "hasDynamicHelp": false,
  "multiple": false,
  "type": "option"
@@ -805,5 +808,5 @@
  ]
  }
  },
- "version": "1.1.0"
+ "version": "1.2.0"
  }
package/package.json CHANGED
@@ -1,7 +1,7 @@
  {
  "name": "dhti-cli",
  "description": "DHTI CLI",
- "version": "1.1.0",
+ "version": "1.2.0",
  "author": "Bell Eapen",
  "bin": {
  "dhti-cli": "bin/run.js"