dhti-cli 0.3.0 → 0.3.2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -3,27 +3,43 @@
  <img src="https://github.com/dermatologist/dhti/blob/develop/notes/dhti-logo.jpg" />
  </p>

- [![npm version](https://badge.fury.io/js/dhti.svg)](https://www.npmjs.com/package/dhti)
- [![npm](https://img.shields.io/npm/dt/dhti)](https://www.npmjs.com/package/dhti)
+ [![npm version](https://badge.fury.io/js/dhti-cli.svg)](https://www.npmjs.com/package/dhti-cli)
+ [![npm](https://img.shields.io/npm/dt/dhti-cli)](https://www.npmjs.com/package/dhti-cli)
  [![Build](https://github.com/dermatologist/dhti/workflows/CI/badge.svg)](https://nuchange.ca)
- [![Known Vulnerabilities](https://snyk.io/test/github/dermatologist/dhti/badge.svg)](https://www.npmjs.com/package/dhti)
+ [![Known Vulnerabilities](https://snyk.io/test/github/dermatologist/dhti/badge.svg)](https://www.npmjs.com/package/dhti-cli)
  [![Documentation](https://badgen.net/badge/icon/documentation?icon=libraries&label)](https://dermatologist.github.io/dhti/)
+ [![Wiki](https://img.shields.io/badge/DHTI-wiki-demo)](https://github.com/dermatologist/dhti/wiki)
+

- ## About
  - 🚀 *Dhanvantari rose out of the water with his four hands, holding a pot full of elixirs!*

- TL; DR: **dhti-cli is for quick prototyping, developing, sharing and testing of Gen AI applications, models, and UI elements within the context of an electronic health record.**
- [Paper coming soon!](https://nuchange.ca)
+ #### TL;DR: 🏥 DHTI enables rapid prototyping, sharing, and testing of GenAI applications within an Electronic Health Record (EHR), facilitating the seamless transition of your experiments to clinical practice.
+ 👉 [Try it out today!](#try-it-out) and give us a star ⭐️ if you like it!
+
+ ### About
+
+ Generative AI features are built as [LangServe Apps](https://python.langchain.com/docs/langserve/) (elixirs) that can be installed into a LangServe instance and exposed as APIs. [OpenMRS O3 esm](https://o3-docs.openmrs.org/) / [**CDS hook** clients](https://github.com/dermatologist/cds-hooks-sandbox/tree/dhti-1) (conchs) provide the interface to communicate with these APIs. All backend data exchange is done through the **FHIR API** (a base class provides all these features). dhti-cli simplifies this process with a CLI that manages a Docker Compose stack with all additional components, such as [Ollama](https://ollama.com/) for **local LLM hosting**. The LLM and hyperparameters are **injected at runtime** and can be easily swapped. In essence, dhti decouples GenAI modules from the rest of the system. 👉 [Try it out today!](#try-it-out)
+
+ ### Want to know more?

  Gen AI can transform medicine. But it needs a framework for collaborative research and practice. DHTI is a reference architecture and an implementation for such a framework that integrates an EMR ([OpenMRS](https://openmrs.org/)), :link: Gen AI application server ([LangServe](https://python.langchain.com/v0.2/docs/langserve/)), self-hosted LLMs for privacy ([Ollama](https://ollama.com/)), tools on [MCP server](https://github.com/dermatologist/fhir-mcp-server), vector store for RAG ([redis](https://redis.io/)), monitoring ([LangFuse](https://langfuse.com/)), 🔥 FHIR repository with [CQL](https://nuchange.ca/2025/06/v-llm-in-the-loop-cql-execution-with-unstructured-data-and-fhir-terminology-support.html) support ([HAPI](https://cloud.alphora.com/sandbox/r4/cqm/)) and graph utilities ([Neo4j](https://neo4j.com/)) in one docker-compose! DHTI is inspired by [Bahmni](https://www.bahmni.org/) and **aims to facilitate GenAI adoption and research in areas with low resources.** The MCP server hosts pluggable, agent-invokable tools (FHIR query, summarization, terminology lookup, custom analytics, etc.) that you can extend without modifying core services.

- The essence of DHTI is *modularity* with an emphasis on *configuration!* It is non-opinionated on LLMs, hyperparameters and pretty much everything. DHTI supports installable Gen AI routines through [LangChain templates](https://templates.langchain.com/) (which we call :curry: **elixir**) and installable UI elements through [OpenMRS O3](https://o3-docs.openmrs.org/) React container (which we call :shell: **conch**). 🔥 FHIR is used for backend and [CDS-Hooks](https://cds-hooks.org/) for frontend communication, decoupling conches from OpenMRS, making them potentially usable with any health information system. We have a [fork of the cds-hook sandbox](https://github.com/dermatologist/cds-hooks-sandbox/tree/dhti-1) for testing that uses the [order-select](https://cds-hooks.org/hooks/order-select/) hook, utilizing the contentString from the [FHIR CommunicationRequest](https://build.fhir.org/communicationrequest.html) within the [cds-hook context](https://cds-hooks.org/examples/) for user inputs (recommended).
+ The essence of DHTI is *modularity* with an emphasis on *configuration!* It is non-opinionated on LLMs, hyperparameters and pretty much everything. DHTI supports installable Gen AI routines through [LangServe Apps](https://python.langchain.com/docs/langserve/) (which we call :curry: **elixir**) and installable UI elements through [OpenMRS O3](https://o3-docs.openmrs.org/) React container (which we call :shell: **conch**). 🔥 FHIR is used for backend and [CDS-Hooks](https://cds-hooks.org/) for frontend communication, decoupling conches from OpenMRS, making them potentially usable with any health information system. We have a [fork of the cds-hook sandbox](https://github.com/dermatologist/cds-hooks-sandbox/tree/dhti-1) for testing that uses the [order-select](https://cds-hooks.org/hooks/order-select/) hook, utilizing the contentString from the [FHIR CommunicationRequest](https://build.fhir.org/communicationrequest.html) within the [cds-hook context](https://cds-hooks.org/examples/) for user inputs (recommended).
+
+
+ <p align="center">
+ <img src="https://github.com/dermatologist/openmrs-esm-dhti-template/blob/develop/notes/conch.jpg" />
+ </p>
+
+ *[OpenMRS ESM DHTI template](https://github.com/dermatologist/openmrs-esm-dhti-template) + [DHTI elixir template](https://github.com/dermatologist/dhti-elixir-template) together form a simple but functional EMR chatbot too!* 👉 [Try it out today!](#try-it-out)

  <p align="center">
  <img src="https://github.com/dermatologist/dhti/blob/develop/notes/cds-hook-sandbox.jpg" />
  </p>

- 🚀 dhti-cli is a CLI tool for quick prototyping and testing of elixirs and conches. You can create a new docker-compose with required modules, start/stop services, install Elixirs and conch, create Docker images for them, and more. 🚀 This helps to test new ideas and share them with others quickly. 🚀 Once tested, you can transition them to the production team for deployment. Adoption of standards makes this transition easier!
+ *[CDS-Hooks sandbox](https://github.com/dermatologist/cds-hooks-sandbox) for testing conchs without OpenMRS.* 👉 [Try it out today!](#try-it-out)
+
+ 🚀 dhti-cli is a CLI tool for quick prototyping and testing of elixirs and conches. You can create a new docker-compose with the required modules, start and stop services, install elixirs and conches, create Docker images for them, and more. 🚀 This helps you test new ideas and share them with others quickly. 🚀 Once tested, you can transition them to the production team for deployment. Adoption of standards makes this transition easier! 👉 [Try it out today!](#try-it-out)

  ⭐️ **Pitched at [Falling Walls Lab Illinois](https://falling-walls.com/falling-walls-lab-illinois) and released on 09/12/2025.**

@@ -43,6 +59,8 @@ The essence of DHTI is *modularity* with an emphasis on *configuration!* It is n
  <img src="https://github.com/dermatologist/dhti/blob/develop/notes/arch-1.drawio.svg" />
  </p>

+ 🔥 **Coming soon:** We are currently working on expanding the DHTI architecture to support traditional machine learning models, such as *EEG sleep stage classification* and *trichogram analysis*, exposing inference pipelines as agentic tools!
+
  ## ✨ Features
  * **Modular**: Supports installable Gen AI routines and UI elements.
  * **Quick prototyping**: CLI helps in quick prototyping and testing of Gen AI routines and UI elements.
@@ -63,13 +81,14 @@ The essence of DHTI is *modularity* with an emphasis on *configuration!* It is n

  *Developers can build elixirs and conchs for DHTI.*

- :curry: Elixirs are [LangChain templates]((https://templates.langchain.com/)) for backend GenAI functionality. By convention, Elixirs are prefixed with *dhti-elixir-* and all elixirs depend on [dhti-elixir-base](https://github.com/dermatologist/dhti-elixir-base) which provides some base classes and defines dependencies. You can use [this template](https://github.com/dermatologist/dhti-elixir-template) to build new elixirs, and license it the way you want (We :heart: open-source!).
+ :curry: Elixirs are [LangServe Apps](https://python.langchain.com/docs/langserve/) for backend GenAI functionality. By convention, elixirs are prefixed with *dhti-elixir-* and all elixirs depend on [dhti-elixir-base](https://github.com/dermatologist/dhti-elixir-base), which provides some base classes and defines dependencies. You can use [this template](https://github.com/dermatologist/dhti-elixir-template) or the [cookiecutter](https://github.com/dermatologist/cookiecutter-uv) to build new elixirs (see the sketch below), and license them the way you want (we :heart: open-source!).

- :shell: Conches are [OpenMRS O3s](https://o3-docs.openmrs.org/) and follow the standard naming convention *openmrs-esm-*. A separate OpenMRS independant container for conchs is on our roadmap for use outside OpenMRS. You can use [this template](https://github.com/dermatologist/openmrs-esm-dhti-template) to build new conches.
+ :shell: Conches are [OpenMRS O3s](https://o3-docs.openmrs.org/) and follow the standard naming convention *openmrs-esm-*. You can use [this template](https://github.com/dermatologist/openmrs-esm-dhti-template) to build new conches.

  :white_check_mark:
  * **Developer friendly**: Copy working files to running containers for testing.
  * **Dependency Injection**: Dependency injection for models and hyperparameters for configuring elixirs.
+ * 👉 [Try it out today!](#try-it-out)
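For orientation, the snippet below sketches one way to go from the template or cookiecutter to an installed elixir, using the `elixir install` and `docker` commands documented in the Try it out section. It is only a sketch: it assumes the standard `cookiecutter` CLI, the prompts and the generated project name (`dhti-elixir-myelixir` here) depend on the template, and the repository URL and image tag are placeholders.

```sh
# Scaffold a new elixir project from the cookiecutter
# (standard cookiecutter usage; the output folder name comes from the template prompts)
cookiecutter https://github.com/dermatologist/cookiecutter-uv

# ...push the generated project (e.g. dhti-elixir-myelixir) to your own Git repository...

# Install it into the LangServe app managed by dhti-cli, then build an image for it
npx dhti-cli elixir install -g https://github.com/your-org/dhti-elixir-myelixir.git -n dhti-elixir-myelixir
npx dhti-cli docker -n yourdockerhandle/dhti-elixir-myelixir:0.1.0 -t elixir
```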

  ## 🧠 For Gen AI Researchers

@@ -82,8 +101,9 @@ Tools to fine-tune language models for the stack are on our roadmap. We encourag
  :white_check_mark:
  * **Generate synthetic data**: DHTI supports generating synthetic data for testing.
  * **CQL support**: [CQL for clinical decision support](https://nuchange.ca/2025/06/v-llm-in-the-loop-cql-execution-with-unstructured-data-and-fhir-terminology-support.html).
- * **FHIR**: Data exchange with FHIR schema.
- * **EMR**: Built in EMR, OpenMRS, for patient records.
+ * **FHIR**: Data exchange with FHIR schema and **CDS-Hooks** for frontend-backend communication.
+ * **EMR**: Built-in EMR, OpenMRS, for patient records.
+ * 👉 [Try it out today!](#try-it-out)

  🌈 *Join us to make the Gen AI equitable and help doctors save lives!*

@@ -94,6 +114,7 @@ Tools to fine-tune language models for the stack are on our roadmap. We encourag
  ## :sparkles: Resources (in Beta)
  * [dhti-elixir-base](https://github.com/dermatologist/dhti-elixir-base): Base classes for dhti-elixir
  * [dhti-elixir-template](https://github.com/dermatologist/dhti-elixir-template): A template for creating new dhti-elixirs.
+ * [openmrs-esm-dhti-template](https://github.com/dermatologist/openmrs-esm-dhti-template): A conch template for OpenMRS
  * [fhir-mcp-server](https://github.com/dermatologist/fhir-mcp-server): A MCP server for hosting FHIR-compliant tools.

  ## :sparkles: Resources (in Alpha)
@@ -107,104 +128,23 @@ Tools to fine-tune language models for the stack are on our roadmap. We encourag
  * [dhti-elixir-upload](https://github.com/dermatologist/dhti-elixir-upload-file): Upload documents to the vector store for clinical knowledgebase and clinical trial matching.
  * [openmrs-esm-qa](https://github.com/dermatologist/openmrs-esm-genai): A sample conch for Q&A on patient records using the dhti-elixir-fhire elixir.

- ## 🔧 Usage
-
- ### List of plugins
- ```
- dhti-cli help
- ```
-
- ### Get help for each plugin
- * As an example, get help for compose:
-
- ```
- dhti-cli compose --help
- ```
-
- ### 🏗️ *Try it out! It takes only a few minutes to setup GenAI backed EMR in your local machine!*
-
- You only need:
- * docker
- * nodejs
-
- ## :walking: Step 1
-
- * Git clone this repository: `git clone https://github.com/dermatologist/dhti.git && cd dhti`
- * Install the required packages: `npm install`
- * Build the CLI: `npm run build`
- * Install CLI locally: `npm link`
- * Test the CLI: `dhti-cli help` *This will show the available commands.*
- * The working directory is `~/dhti` (Customizable)
-
- ### 🔧 Create a new docker-compose
- * Create a new docker-compose file: `dhti-cli compose add -m openmrs -m langserve`
-
- * The docker-compose.yml is created with the following modules:
- - OpenMRS (EMR)
- - LangServe (API for LLM models)
-
- Other available modules: `ollama, langfuse, cqlFhir, redis, neo4j and mcpFhir` (Documentation in progress)
+ ## Try it out

- You can read the newly created docker-compose by: `dhti-cli compose read`
+ * You only need [Node.js](https://nodejs.org/) and [Docker](https://www.docker.com/) installed to run this project. Optionally, you can install [Python](https://www.python.org/) if you want to develop new elixirs. We use a fake LLM script for testing purposes, so you don't need an OpenAI key to run this project. It just says "Paris" or "I don't know" to any prompt. You can replace it with any internal or external LLM service later.

+ * `npx dhti-cli help` to see all available commands.

- ### 🚀 Start the services for initial setup
- * Start the services: `dhti-cli docker -u`
+ * `npx dhti-cli compose add -m openmrs -m langserve` to add the OpenMRS and LangServe modules to your docker-compose.yml at ~/dhti. Other available modules: `ollama, langfuse, cqlFhir, redis, neo4j and mcpFhir`. You can read the newly created docker-compose by: `npx dhti-cli compose read`

- It may take a while to download the images and start the services. ([OpenMRS](https://openmrs.org/) may take upto 45 mins the first time to setup the database)
+ * `npx dhti-cli elixir install -g https://github.com/dermatologist/dhti-elixir-template.git -n dhti-elixir-template` to install a sample elixir from GitHub. Optionally, you may configure the hyperparameters in `~/dhti/elixir/app/bootstrap.py`. You can install multiple elixirs.

+ * `npx dhti-cli docker -n yourdockerhandle/genai-test:1.0 -t elixir` to build a docker image for the elixir.

- ### 🚀 Access OpenMRS and login:
- * Go to `http://localhost/openmrs/spa/home`
- * Login with the following credentials:
- - Username: admin
- - Password: Admin123
- - Choose any location and click on 'confirm'.
-
- ### 🚀 Access the LangServe API
- * Go to `localhost:8001/docs` (Empty Swagger UI)
-
- ## Congratulations! You have successfully setup DHTI! :tada:
- * Shut down the services: `dhti-cli docker -d`
-
- ## :running: STEP 2: 🛠️ *Now let us Install an Elixir (Gen AI functionalities are packaged as elixirs)*
-
- * Let's install the elixir here: https://github.com/dermatologist/dhti-elixir-template. This is just a template that uses a Mock LLM to output random text. You can use this template to build your own elixirs! (Cookiecutter to be released soon!) Later you will see how to add real LLM support.
-
- :running:
-
- `dhti-cli elixir install -g https://github.com/dermatologist/dhti-elixir-template.git -n dhti-elixir-template`.
-
- You may also install from PyPi or a wheel file!
-
- ### 🔍 Examine bootstrap.py (Optional)
- `cat ~/dhti/elixir/app/bootstrap.py`
+ * `npx dhti-cli conch install -g https://github.com/dermatologist/openmrs-esm-dhti-template.git -n openmrs-esm-dhti-template` to install a simple OpenMRS ESM module (conch) from GitHub. You can install multiple conches.

- This is where you can override defaults in the elixir for *LLM, embedding model, hyperparameters etc that are injected at runtime.* Refer to each elixir for the available options. [You may check out how to inject a real LLM using Google Gemini!](/notes/add-llm.md)
+ * `npx dhti-cli docker -n yourdockerhandle/conch-test:1.0 -t conch` to build a docker image for the conches.

- ### 🔧 Create docker container
- `dhti-cli docker -n beapen/genai-test:1.0 -t elixir`
-
- (You may replace `beapen/genai-test:1.0` with your own image name)
-
- ### 🚀 Congratulations! You installed your first elixir. We will see it in action later!
-
- While developing you can copy the app folder to a running container for testing (provided there are no changes in dependencies). Read more [here](/notes/dev-copy.md).
-
- ## STEP 3: :shell: *Now let us Install a Conch (The UI component)*
-
- * Let's install the conch here:https://github.com/dermatologist/openmrs-esm-dhti-template. This uses the elixir template that we installed in STEP 2 as the backend. You can use the template to build your own conchs.
-
- :shell: `dhti-cli conch install -g https://github.com/dermatologist/openmrs-esm-dhti-template.git -n openmrs-esm-dhti-template`
-
- We can also install from a dev folder after cloning the repository. While developing you can copy the dist folder to a running container for testing. Read more [here](/notes/dev-copy.md).
-
- ### 🔧 Create new docker container
- `dhti-cli docker -n beapen/conch-test:1.0 -t conch`
-
- ## 🚀 It is now time to start DHTI!
-
- `dhti-cli docker -u`
+ * `npx dhti-cli docker -u` to start all the services in your docker-compose.yml.

  ### :clap: Access the Conch in OpenMRS and test the integration

@@ -219,8 +159,9 @@ This is just a template, though. You can build your own conchs!
  Add some text to the text area and click on **Submit**.
  You will see the text above the textbox.

- ### Stop the services
- You can remove the services by: `dhti-cli docker -d`
+ * `npx dhti-cli docker -d` to stop and delete all the docker containers.
+
+ Read more in [notes/steps.md](/notes/steps.md). Complete documentation is in progress.
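Taken together, the steps above boil down to the following sequence (a sketch assembled from the commands listed above; the `yourdockerhandle/...` image names are placeholders):

```sh
# Assemble a docker-compose.yml at ~/dhti with the OpenMRS and LangServe modules
npx dhti-cli compose add -m openmrs -m langserve
npx dhti-cli compose read

# Install the sample elixir (backend) and conch (UI) templates and build images for them
npx dhti-cli elixir install -g https://github.com/dermatologist/dhti-elixir-template.git -n dhti-elixir-template
npx dhti-cli docker -n yourdockerhandle/genai-test:1.0 -t elixir
npx dhti-cli conch install -g https://github.com/dermatologist/openmrs-esm-dhti-template.git -n openmrs-esm-dhti-template
npx dhti-cli docker -n yourdockerhandle/conch-test:1.0 -t conch

# Start the stack, test the conch in OpenMRS, then stop and remove the containers
npx dhti-cli docker -u
npx dhti-cli docker -d
```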

  ### The demo uses a template with mock LLM. [Check out how to add real LLM support using Google Gemini.](/notes/add-llm.md)

@@ -228,6 +169,7 @@ You can remove the services by: `dhti-cli docker -d`

  ## 🚀 Advanced

+ * [Detailed steps to try it out](/notes/steps.md)
  * [Setting up Ollama](/notes/setup-ollama.md)
  * [CLI Options](/notes/cli-options.md)

@@ -238,4 +180,4 @@ If you find this project useful, give us a star. It helps others discover the pr

  ## Contributors

- * [Bell Eapen](https://nuchange.ca) | [![Twitter Follow](https://img.shields.io/twitter/follow/beapen?style=social)](https://twitter.com/beapen)
+ * [Bell Eapen](https://nuchange.ca) ([UIS](https://www.uis.edu/directory/bell-punneliparambil-eapen)) | [Contact](https://nuchange.ca/contact) | [![Twitter Follow](https://img.shields.io/twitter/follow/beapen?style=social)](https://twitter.com/beapen)
@@ -2,18 +2,26 @@ import { Args, Command, Flags } from '@oclif/core';
  import yaml from 'js-yaml';
  import fs from 'node:fs';
  import os from 'node:os';
+ import path from 'node:path';
+ import { fileURLToPath } from 'node:url';
  export default class Compose extends Command {
  static args = {
  op: Args.string({ description: 'Operation to perform (add, delete, read or reset)' }),
  };
  static description = 'Generates a docker-compose.yml file from a list of modules';
- static examples = [
- '<%= config.bin %> <%= command.id %>',
- ];
+ static examples = ['<%= config.bin %> <%= command.id %>'];
  static flags = {
- file: Flags.string({ char: 'f', default: `${os.homedir()}/dhti/docker-compose.yml`, description: 'Full path to the docker compose file to read from. Creates if it does not exist' }),
+ file: Flags.string({
+ char: 'f',
+ default: `${os.homedir()}/dhti/docker-compose.yml`,
+ description: 'Full path to the docker compose file to read from. Creates if it does not exist',
+ }),
  // flag with a value (-n, --name=VALUE)
- module: Flags.string({ char: 'm', description: 'Modules to add from ( langserve, openmrs, ollama, langfuse, cqlFhir, redis, neo4j and mcpFhir)', multiple: true }),
+ module: Flags.string({
+ char: 'm',
+ description: 'Modules to add from ( langserve, openmrs, ollama, langfuse, cqlFhir, redis, neo4j and mcpFhir)',
+ multiple: true,
+ }),
  };
  static init = () => {
  // Create ${os.homedir()}/dhti if it does not exist
@@ -27,6 +35,10 @@ export default class Compose extends Command {
  };
  async run() {
  const { args, flags } = await this.parse(Compose);
+ // Resolve resources directory for both dev (src) and packaged (dist)
+ const __filename = fileURLToPath(import.meta.url);
+ const __dirname = path.dirname(__filename);
+ const RESOURCES_DIR = path.resolve(__dirname, '../resources');
  // console.log('args', args) //args { op: 'add' }
  // console.log('flags', flags) //flags { module: [ 'default', 'langserve', 'redis' ] }
  const openmrs = ['gateway', 'frontend', 'backend', 'openmrs-db'];
@@ -51,10 +63,10 @@ export default class Compose extends Command {
  ollama,
  openmrs,
  redis,
- webui
+ webui,
  };
  try {
- const masterData = yaml.load(fs.readFileSync('src/resources/docker-compose-master.yml', 'utf8'));
+ const masterData = yaml.load(fs.readFileSync(path.join(RESOURCES_DIR, 'docker-compose-master.yml'), 'utf8'));
  let existingData = { services: {}, version: '3.8' };
  if (fs.existsSync(flags.file)) {
  existingData = yaml.load(fs.readFileSync(flags.file, 'utf8'));
@@ -2,28 +2,44 @@ import { Args, Command, Flags } from '@oclif/core';
  import { exec } from 'node:child_process';
  import fs from 'node:fs';
  import os from 'node:os';
+ import path from 'node:path';
+ import { fileURLToPath } from 'node:url';
  export default class Conch extends Command {
  static args = {
  op: Args.string({ description: 'Operation to perform (install, uninstall or dev)' }),
  };
  static description = 'Install or uninstall conchs to create a Docker image';
- static examples = [
- '<%= config.bin %> <%= command.id %>',
- ];
+ static examples = ['<%= config.bin %> <%= command.id %>'];
  static flags = {
- branch: Flags.string({ char: 'b', default: "develop", description: 'Branch to install from' }),
- container: Flags.string({ char: 'c', default: "dhti-frontend-1", description: 'Name of the container to copy the conch to while in dev mode' }),
- dev: Flags.string({ char: 'd', default: "none", description: 'Dev folder to install' }),
- git: Flags.string({ char: 'g', default: "none", description: 'Github repository to install' }),
- image: Flags.string({ char: 'i', default: "openmrs/openmrs-reference-application-3-frontend:3.0.0-beta.17", description: 'Base image to use for the conch' }),
+ branch: Flags.string({ char: 'b', default: 'develop', description: 'Branch to install from' }),
+ container: Flags.string({
+ char: 'c',
+ default: 'dhti-frontend-1',
+ description: 'Name of the container to copy the conch to while in dev mode',
+ }),
+ dev: Flags.string({ char: 'd', default: 'none', description: 'Dev folder to install' }),
+ git: Flags.string({ char: 'g', default: 'none', description: 'Github repository to install' }),
+ image: Flags.string({
+ char: 'i',
+ default: 'openmrs/openmrs-reference-application-3-frontend:3.0.0-beta.17',
+ description: 'Base image to use for the conch',
+ }),
  name: Flags.string({ char: 'n', description: 'Name of the elixir' }),
- repoVersion: Flags.string({ char: 'v', default: "1.0.0", description: 'Version of the conch' }),
- workdir: Flags.string({ char: 'w', default: `${os.homedir()}/dhti`, description: 'Working directory to install the conch' }),
+ repoVersion: Flags.string({ char: 'v', default: '1.0.0', description: 'Version of the conch' }),
+ workdir: Flags.string({
+ char: 'w',
+ default: `${os.homedir()}/dhti`,
+ description: 'Working directory to install the conch',
+ }),
  };
  async run() {
  const { args, flags } = await this.parse(Conch);
+ // Resolve resources directory for both dev (src) and packaged (dist)
+ const __filename = fileURLToPath(import.meta.url);
+ const __dirname = path.dirname(__filename);
+ const RESOURCES_DIR = path.resolve(__dirname, '../resources');
  if (!flags.name) {
- console.log("Please provide a name for the conch");
+ console.log('Please provide a name for the conch');
  this.exit(1);
  }
  // if arg is dev then copy to docker as below
@@ -50,14 +66,14 @@ export default class Conch extends Command {
  });
  }
  catch (error) {
- console.log("Error copying conch to container", error);
+ console.log('Error copying conch to container', error);
  }
  }
  // Create a directory to install the elixir
  if (!fs.existsSync(`${flags.workdir}/conch`)) {
  fs.mkdirSync(`${flags.workdir}/conch`);
  }
- fs.cpSync('src/resources/spa', `${flags.workdir}/conch`, { recursive: true });
+ fs.cpSync(path.join(RESOURCES_DIR, 'spa'), `${flags.workdir}/conch`, { recursive: true });
  // Rewrite files
  const rewrite = () => {
  flags.name = flags.name ?? 'openmrs-esm-genai';
@@ -77,16 +93,19 @@ export default class Conch extends Command {
  fs.writeFileSync(`${flags.workdir}/conch/def/spa-assemble-config.json`, JSON.stringify(spaAssembleConfig, null, 2));
  // Read and process Dockerfile
  let dockerfile = fs.readFileSync(`${flags.workdir}/conch/Dockerfile`, 'utf8');
- dockerfile = dockerfile.replaceAll('conch', flags.name).replaceAll('version', flags.repoVersion).replaceAll('server-image', flags.image);
+ dockerfile = dockerfile
+ .replaceAll('conch', flags.name)
+ .replaceAll('version', flags.repoVersion)
+ .replaceAll('server-image', flags.image);
  fs.writeFileSync(`${flags.workdir}/conch/Dockerfile`, dockerfile);
  // Read routes.json
  const routes = JSON.parse(fs.readFileSync(`${flags.workdir}/conch/${flags.name}/src/routes.json`, 'utf8'));
  // Add to routes.registry.json
  const registry = JSON.parse(fs.readFileSync(`${flags.workdir}/conch/def/routes.registry.json`, 'utf8'));
  if (args.op === 'install')
- registry[(flags.name).replace('openmrs-', '@openmrs/')] = routes;
+ registry[flags.name.replace('openmrs-', '@openmrs/')] = routes;
  if (args.op === 'uninstall')
- delete registry[(flags.name).replace('openmrs-', '@openmrs/')];
+ delete registry[flags.name.replace('openmrs-', '@openmrs/')];
  fs.writeFileSync(`${flags.workdir}/conch/def/routes.registry.json`, JSON.stringify(registry, null, 2));
  };
  if (flags.git !== 'none') {
@@ -3,6 +3,7 @@ import { exec } from 'node:child_process';
  import fs from 'node:fs';
  import os from 'node:os';
  import path from 'node:path';
+ import { fileURLToPath } from 'node:url';
  export default class Elixir extends Command {
  static args = {
  op: Args.string({ description: 'Operation to perform (install, uninstall or dev)' }),
@@ -34,6 +35,10 @@ export default class Elixir extends Command {
  };
  async run() {
  const { args, flags } = await this.parse(Elixir);
+ // Resolve resources directory for both dev (src) and packaged (dist)
+ const __filename = fileURLToPath(import.meta.url);
+ const __dirname = path.dirname(__filename);
+ const RESOURCES_DIR = path.resolve(__dirname, '../resources');
  if (!flags.name) {
  console.log('Please provide a name for the elixir');
  this.exit(1);
@@ -42,9 +47,9 @@ export default class Elixir extends Command {
  // if arg is dev then copy to docker as below
  // docker restart dhti-langserve-1
  if (args.op === 'dev') {
- console.log(`cd ${flags.dev} && docker cp ${expoName}/. ${flags.container}:/app/.venv/lib/python3.11/site-packages/${expoName}`);
+ console.log(`cd ${flags.dev} && docker cp src/${expoName}/. ${flags.container}:/app/.venv/lib/python3.12/site-packages/${expoName}`);
  try {
- exec(`cd ${flags.dev} && docker cp ${expoName}/. ${flags.container}:/app/.venv/lib/python3.11/site-packages/${expoName}`, (error, stdout, stderr) => {
+ exec(`cd ${flags.dev} && docker cp src/${expoName}/. ${flags.container}:/app/.venv/lib/python3.12/site-packages/${expoName}`, (error, stdout, stderr) => {
  if (error) {
  console.error(`exec error: ${error}`);
  return;
@@ -69,7 +74,7 @@ export default class Elixir extends Command {
  if (!fs.existsSync(`${flags.workdir}/elixir`)) {
  fs.mkdirSync(`${flags.workdir}/elixir`);
  }
- fs.cpSync('src/resources/genai', `${flags.workdir}/elixir`, { recursive: true });
+ fs.cpSync(path.join(RESOURCES_DIR, 'genai'), `${flags.workdir}/elixir`, { recursive: true });
  // if whl is not none, copy the whl file to thee whl directory
  if (flags.whl !== 'none') {
  if (!fs.existsSync(`${flags.workdir}/elixir/whl/`)) {
@@ -440,5 +440,5 @@
  ]
  }
  },
- "version": "0.3.0"
+ "version": "0.3.2"
  }
package/package.json CHANGED
@@ -1,7 +1,7 @@
  {
  "name": "dhti-cli",
  "description": "DHTI CLI",
- "version": "0.3.0",
+ "version": "0.3.2",
  "author": "Bell Eapen",
  "bin": {
  "dhti-cli": "bin/run.js"
@@ -25,7 +25,7 @@
  "@types/chai": "^4",
  "@types/js-yaml": "^4.0.9",
  "@types/mocha": "^10",
- "@types/node": "^18",
+ "@types/node": "^24",
  "@types/request": "^2.48.12",
  "@types/sinon": "^17.0.4",
  "chai": "^4",