cloud-pc-templates 1.2.3 → 1.3.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -7,6 +7,14 @@ Cloud PC Templates is a command-line tool for managing cloud PC configurations a
7
7
  ```bash
8
8
  npm install -g cloud-pc-templates
9
9
  ```
10
+ ## Video Tutorial
11
+
12
+ For a step-by-step guide on using Cloud PC Templates, watch this video tutorial:
13
+
14
+ [![Cloud PC Templates Tutorial](https://img.youtube.com/vi/XMF0K9R2rD0/0.jpg)](https://www.youtube.com/watch?v=XMF0K9R2rD0)
15
+
16
+ [Click here to watch on YouTube](https://www.youtube.com/watch?v=XMF0K9R2rD0)
17
+
10
18
 
11
19
  ## Usage
12
20
 
@@ -42,14 +50,7 @@ npx cloud-pc-templates ai login loginMode ollamacloud
42
50
  ```
43
51
 
44
52
  **What it does:**
45
- 1. Checks if you're already logged in by testing the health endpoint at `http://localhost:3004/health`
46
- 2. If already logged in, displays "Already logged in" message
47
- 3. If not logged in:
48
- - Prompts you to enter your API Key (masked input with asterisks)
49
- - Downloads the Ollama proxy script from GitHub
50
- - Runs the proxy with your API key as an argument
51
- - Validates the health endpoint
52
- - Displays "Logged in" confirmation
53
+ 1. Runs a wrapper service (health endpoint at `http://localhost:3004/health`) that acts as an interface to Ollama cloud models; it requires an Ollama API key.
53
54
 
54
55
  **Example:**
55
56
  ```bash
@@ -59,27 +60,15 @@ Enter API Key: **************************
59
60
  - Endpoint checked: http://localhost:3004/health
60
61
  ```
61
62
 
62
- **Features:**
63
- - Masked input for API key (shows `*` instead of actual characters)
64
- - Supports pasting long API keys
65
- - Backspace support for corrections
66
- - Cross-platform terminal support (TTY and non-TTY)
67
- - Real-time proxy output logging for debugging
68
-
69
63
  ##### Ollama Local Login
70
64
 
71
- Connect to Ollama Local instance:
65
+ Connect to local Ollama models:
72
66
  ```bash
73
67
  npx cloud-pc-templates ai login loginMode ollamalocal
74
68
  ```
75
69
 
76
70
  **What it does:**
77
- 1. Checks if Ollama is running on the default port (11434)
78
- 2. If Ollama is not running, displays a warning with installation instructions
79
- 3. Downloads the Ollama Offline Proxy script from GitHub
80
- 4. Runs the proxy on port 3005 (no API key required)
81
- 5. Validates the proxy health endpoint
82
- 6. Displays "Logged in" confirmation
71
+ 1. Runs a wrapper service that connects to local Ollama models
83
72
 
84
73
  **Example (with Ollama running):**
85
74
  ```bash
@@ -93,26 +82,29 @@ $ npx cloud-pc-templates ai login loginMode ollamalocal
93
82
  - Ollama running on: localhost:11434
94
83
  ```
95
84
 
96
- **Example (without Ollama running):**
97
- ```bash
98
- $ npx cloud-pc-templates ai login loginMode ollamalocal
99
- 🔍 Checking if Ollama is running...
100
- ⚠️ WARNING: Ollama is not running on localhost:11434
101
- Please install Ollama and run it before using this login mode.
102
- Download Ollama from: https://ollama.ai
103
-
104
- After installation, start Ollama with:
105
- ollama serve
106
-
107
- Continuing anyway...
108
- ```
109
-
110
85
  **Features:**
111
86
  - No API key required
112
87
  - Checks for local Ollama installation
113
88
  - Helpful warnings with installation instructions
114
89
  - Runs proxy on port 3005
115
- - Provides detailed status output
90
+
91
+ ##### Hugging Face Login
92
+
93
+ Connect to Hugging Face services:
94
+ ```bash
95
+ npx cloud-pc-templates ai login loginMode huggingface
96
+ ```
97
+
98
+ **What it does:**
99
+ 1. Runs a wrapper service over Hugging Face cloud models; it requires a Hugging Face API key
100
+
101
+ **Example:**
102
+ ```bash
103
+ $ npx cloud-pc-templates ai login loginMode huggingface
104
+ Enter Hugging Face API Key: ****************************
105
+ ✓ Logged in
106
+ - Endpoint checked: http://localhost:3006/health
107
+ ```
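Like the other login modes, the health endpoint can be probed manually; a quick sketch, assuming the endpoint shown in the example above:

```shell
# Probe the wrapper's health endpoint; prints 200 when the proxy is up
# and logged in, 000 when nothing is listening on that port.
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:3006/health || true
```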
116
108
 
117
109
  ### Command Discovery
118
110
 
@@ -146,43 +138,12 @@ cloud-pc-templates/
146
138
  ├── handlers/
147
139
  │ ├── ollamacloud.js # Ollama Cloud login functionality
148
140
  │ ├── ollamalocal.js # Ollama Local login functionality
141
+ │ ├── huggingface.js # Hugging Face login functionality
149
142
  │ └── launch.js # Website launcher
150
143
  ├── package.json # Project metadata and bin configuration
151
144
  └── README.md # This file
152
145
  ```
153
-
154
- ### Modules
155
-
156
- #### index.js
157
- - **Command Tree**: Hierarchical command structure supporting nested subcommands
158
- - **Argument Parsing**: Handles command-line arguments with `--` prefix stripping
159
- - **Request Routing**: Routes commands to appropriate handlers
160
-
161
- #### handlers/ollamacloud.js
162
- - `promptForApiKey()`: Interactive masked API key input
163
- - `checkHealthEndpoint()`: Health check for proxy server
164
- - `downloadAndRunProxy()`: Downloads and executes proxy script with API key
165
- - `checkAndLoginOllamaCloud()`: Main login orchestrator
166
-
167
- #### handlers/ollamalocal.js
168
- - `checkOllamaHealth()`: Verifies Ollama is running on port 11434
169
- - `downloadAndRunProxy()`: Downloads and executes offline proxy script
170
- - `checkProxyHealth()`: Health check for the offline proxy
171
- - `checkAndLoginOllamaLocal()`: Main login orchestrator with warning system
172
-
173
- #### handlers/launch.js
174
- - `openBrowser()`: Cross-platform browser launcher
175
- - `launchWebsite()`: Opens cloud-pc-templates.com
176
-
177
- ## Features
178
-
179
- ### Masked API Key Input
180
- When logging in to Ollama Cloud, your API key is protected:
181
- - Each character you type displays as an asterisk `*`
182
- - Works with keyboard input and pasted text
183
- - Supports backspace for corrections
184
-
185
- ### Cross-Platform Support
146
+ ### Cross-Platform Support for the `launch` Command
186
147
  The launch command works on:
187
148
  - macOS (uses `open` command)
188
149
  - Linux (uses `xdg-open` command)
@@ -216,36 +177,6 @@ Each handler is a separate module that can be:
216
177
  - Extended with new features
217
178
  - Reused in other projects
218
179
 
219
- ### Adding New Commands
220
-
221
- To add a new command:
222
-
223
- 1. Create a new handler file in `handlers/`:
224
- ```javascript
225
- // handlers/mycommand.js
226
- async function myCommandHandler() {
227
- // Implementation
228
- }
229
-
230
- module.exports = { myCommandHandler };
231
- ```
232
-
233
- 2. Import it in `index.js`:
234
- ```javascript
235
- const { myCommandHandler } = require('./handlers/mycommand');
236
- ```
237
-
238
- 3. Add it to the command tree:
239
- ```javascript
240
- const commandTree = {
241
- // ... existing commands
242
- mycommand: {
243
- description: 'My command description',
244
- handler: () => myCommandHandler()
245
- }
246
- };
247
- ```
248
-
249
180
  ## NPM Script
250
181
 
251
182
  The project is configured with a binary entrypoint in `package.json`:
@@ -260,59 +191,10 @@ The project is configured with a binary entrypoint in `package.json`:
260
191
 
261
192
  This enables the `npx cloud-pc-templates` command globally.
262
193
 
263
- ## API Key Security
264
-
265
- When entering your API key:
266
- - Input is masked with asterisks
267
- - Key is passed directly to the proxy process
268
- - Never logged or stored in plain text
269
- - Passed via command-line argument or environment variable
270
-
271
- ## Troubleshooting
272
-
273
- ### "Unknown command" error
274
- - Make sure you've spelled the command correctly
275
- - Use `npx cloud-pc-templates help` to see available commands
276
- - Commands are case-sensitive
277
-
278
- ### Ollama Cloud login fails
279
- - Check that you have a valid API key
280
- - Ensure your network connection is stable
281
- - Try checking if the health endpoint is accessible manually:
282
- ```bash
283
- curl http://localhost:3004/health
284
- ```
285
-
286
- ### Ollama Local login fails
287
- - Make sure Ollama is installed: https://ollama.ai
288
- - Start Ollama with: `ollama serve`
289
- - Check if Ollama is running on port 11434:
290
- ```bash
291
- curl http://localhost:11434/api/tags
292
- ```
293
- - Ensure port 3005 is not in use by another application
294
-
295
- ### Browser won't open with `launch`
296
- - Ensure you have a default browser configured
297
- - On Linux, make sure `xdg-open` is installed: `sudo apt-get install xdg-utils`
298
- - On Windows, ensure a browser is set as default
299
- - On Android/Termux:
300
- - For best results, install Termux API: `pkg install termux-api`
301
- - Otherwise, `xdg-open` will be used as fallback
302
- - Ensure you have a browser app installed on your device
303
-
304
- ## Video Tutorial
305
-
306
- For a step-by-step guide on using Cloud PC Templates, watch this video tutorial:
307
-
308
- [![Cloud PC Templates Tutorial](https://img.youtube.com/vi/XMF0K9R2rD0/0.jpg)](https://www.youtube.com/watch?v=XMF0K9R2rD0)
309
-
310
- [Click here to watch on YouTube](https://www.youtube.com/watch?v=XMF0K9R2rD0)
311
-
312
194
  ## License
313
195
 
314
196
  ISC
315
197
 
316
198
  ## Author
317
199
 
318
- Cloud PC Templates Contributors
200
+ Devashish Priyadarshi
package/handlers/huggingface.js ADDED
@@ -0,0 +1,169 @@
1
+ const http = require('http');
2
+ const https = require('https');
3
+ const fs = require('fs');
4
+ const path = require('path');
5
+ const { spawn } = require('child_process');
6
+ const os = require('os');
7
+
8
+ const PROXY_PORT = 3003;
9
+
10
+ // Function to get masked API key input
11
+ function promptForApiKey() {
12
+ return new Promise((resolve) => {
13
+ process.stdout.write('Enter Hugging Face API Key: ');
14
+
15
+ const stdin = process.stdin;
16
+
17
+ // Handle both TTY and non-TTY environments
18
+ if (stdin.isTTY) {
19
+ stdin.setRawMode(true);
20
+ }
21
+ stdin.resume();
22
+
23
+ let apiKey = '';
24
+
25
+ stdin.on('data', (buffer) => {
26
+ const chunk = buffer.toString();
27
+
28
+ // Process each character in the chunk (handles pasted text)
29
+ for (let i = 0; i < chunk.length; i++) {
30
+ const char = chunk[i];
31
+
32
+ if (char === '\n' || char === '\r' || char === '\u0004') {
33
+ // Enter or EOF
34
+ if (stdin.isTTY) {
35
+ stdin.setRawMode(false);
36
+ }
37
+ stdin.pause();
38
+ stdin.removeAllListeners('data');
39
+ console.log('');
40
+ resolve(apiKey);
41
+ return;
42
+ } else if (char === '\u0003') {
43
+ // Ctrl+C
44
+ if (stdin.isTTY) {
45
+ stdin.setRawMode(false);
46
+ }
47
+ process.exit();
48
+ } else if (char === '\x7f' || char === '\b') {
49
+ // Backspace
50
+ if (apiKey.length > 0) {
51
+ apiKey = apiKey.slice(0, -1);
52
+ process.stdout.write('\x1b[D\x1b[K');
53
+ }
54
+ } else if (char >= '\x20' && char <= '\x7e') {
55
+ // Printable character
56
+ apiKey += char;
57
+ process.stdout.write('*');
58
+ }
59
+ }
60
+ });
61
+ });
62
+ }
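The character loop above acts as a small state machine over each typed or pasted chunk. A pure sketch of that logic (the helper name `applyChunk` is hypothetical, not part of the package) that mirrors its Enter/EOF, backspace, and printable-character handling:

```javascript
// Hypothetical helper mirroring promptForApiKey's per-character handling:
// Enter/EOF ends input, backspace trims, printable ASCII accumulates.
function applyChunk(apiKey, chunk) {
  for (const char of chunk) {
    if (char === '\n' || char === '\r' || char === '\u0004') {
      return { apiKey, done: true };   // Enter or EOF: input complete
    } else if (char === '\x7f' || char === '\b') {
      apiKey = apiKey.slice(0, -1);    // backspace: drop the last character
    } else if (char >= '\x20' && char <= '\x7e') {
      apiKey += char;                  // printable ASCII: accumulate
    }
  }
  return { apiKey, done: false };      // chunk ended mid-input (e.g. a paste)
}

console.log(applyChunk('', 'hf_abc\b\bcd\r')); // { apiKey: 'hf_acd', done: true }
```

Processing characters rather than whole lines is what lets the prompt handle pasted keys and backspace corrections uniformly.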
63
+
64
+ // Function to check health endpoint
65
+ function checkHealthEndpoint(endpoint) {
66
+ return new Promise((resolve) => {
67
+ const url = new URL(endpoint);
68
+ const protocol = url.protocol === 'https:' ? https : http;
69
+
70
+ const request = protocol.request(url, { method: 'GET' }, (res) => {
71
+ resolve(res.statusCode === 200);
72
+ });
73
+
74
+ request.on('error', () => {
75
+ resolve(false);
76
+ });
77
+
78
+ request.end();
79
+ });
80
+ }
81
+
82
+ // Function to download and run the proxy
83
+ async function downloadAndRunProxy(endpoint) {
84
+ const url = 'https://raw.githubusercontent.com/devashish234073/cloud-pc-templates-marketplace/refs/heads/main/JS-PROXIES/hf-proxy.js';
85
+ const tempFile = path.join(os.tmpdir(), 'hf-proxy.js');
86
+
87
+ // Download the file
88
+ await new Promise((resolve, reject) => {
89
+ const file = fs.createWriteStream(tempFile);
90
+ https.get(url, (res) => {
91
+ res.pipe(file);
92
+ file.on('finish', () => {
93
+ file.close();
94
+ resolve();
95
+ });
96
+ }).on('error', reject);
97
+ });
98
+
99
+ // Get API key from user
100
+ const apiKey = await promptForApiKey();
101
+
102
+ // Run the proxy with API key passed as command-line argument
103
+ return new Promise((resolve, reject) => {
104
+ const child = spawn('node', [tempFile, apiKey]);
105
+
106
+ let serverReady = false;
107
+
108
+ // Capture stdout to detect when server is ready
109
+ child.stdout.on('data', (data) => {
110
+ const output = data.toString();
111
+ console.log(output);
112
+
113
+ // Check if server indicates it's ready
114
+ if (output.includes('listening') || output.includes('started') || output.includes('running')) {
115
+ serverReady = true;
116
+ }
117
+ });
118
+
119
+ // Capture stderr for error messages
120
+ child.stderr.on('data', (data) => {
121
+ console.error(data.toString());
122
+ });
123
+
124
+ // Wait a bit for server to start, then validate
125
+ setTimeout(async () => {
126
+ try {
127
+ const isHealthy = await checkHealthEndpoint(endpoint);
128
+ if (isHealthy) {
129
+ console.log('✓ Logged in');
130
+ console.log(` - Endpoint checked: ${endpoint}`);
131
+ } else {
132
+ console.log('✓ Proxy started');
133
+ console.log(` - Endpoint: ${endpoint}`);
134
+ }
135
+ resolve();
136
+ } catch (error) {
137
+ reject(error);
138
+ }
139
+ }, 2000);
140
+
141
+ child.on('error', reject);
142
+ });
143
+ }
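The "server is ready" detection in `downloadAndRunProxy` is a substring heuristic over the proxy's stdout. Extracted as a pure function (the name `looksReady` is hypothetical):

```javascript
// Hypothetical helper: the stdout heuristic downloadAndRunProxy uses to
// flag the spawned proxy as started.
function looksReady(output) {
  return ['listening', 'started', 'running'].some((word) => output.includes(word));
}

console.log(looksReady('Proxy listening on port 3003')); // true
console.log(looksReady('Downloading model...'));         // false
```

Because the heuristic can miss, the handler also falls back to the 2-second timer plus a health-endpoint check before reporting success.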
144
+
145
+ // Function to check and login to Hugging Face
146
+ async function checkAndLoginHuggingFace() {
147
+ const endpoint = `http://localhost:${PROXY_PORT}/health`;
148
+
149
+ try {
150
+ const isHealthy = await checkHealthEndpoint(endpoint);
151
+ if (isHealthy) {
152
+ console.log('✓ Already logged in');
153
+ console.log(` - Endpoint checked: ${endpoint}`);
154
+ return;
155
+ }
156
+
157
+ // Not healthy, download and run proxy
158
+ await downloadAndRunProxy(endpoint);
159
+ } catch (error) {
160
+ console.error('Error during login:', error.message);
161
+ }
162
+ }
163
+
164
+ module.exports = {
165
+ checkAndLoginHuggingFace,
166
+ checkHealthEndpoint,
167
+ downloadAndRunProxy,
168
+ promptForApiKey
169
+ };
package/index.js CHANGED
@@ -2,6 +2,7 @@
2
2
 
3
3
  const { checkAndLoginOllamaCloud } = require('./handlers/ollamacloud');
4
4
  const { checkAndLoginOllamaLocal } = require('./handlers/ollamalocal');
5
+ const { checkAndLoginHuggingFace } = require('./handlers/huggingface');
5
6
  const { launchWebsite } = require('./handlers/launch');
6
7
 
7
8
  // Command tree structure
@@ -27,6 +28,10 @@ const commandTree = {
27
28
  ollamalocal: {
28
29
  description: 'Connect to Ollama Local',
29
30
  handler: () => aiLogin('ollamalocal')
31
+ },
32
+ huggingface: {
33
+ description: 'Connect to Hugging Face',
34
+ handler: () => aiLogin('huggingface')
30
35
  }
31
36
  }
32
37
  }
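The new `huggingface` entry slots into the nested command tree that `index.js` walks argument by argument. A minimal, self-contained sketch of that dispatch (the tree shape and `route` helper are assumptions for illustration, not the package's exact code):

```javascript
// Assumed shape of the nested command tree: each level maps an argument
// to a node with optional subcommands and an optional handler.
const commandTree = {
  ai: {
    subcommands: {
      login: {
        subcommands: {
          loginMode: {
            subcommands: {
              ollamacloud: { handler: () => 'ollamacloud-login' },
              ollamalocal: { handler: () => 'ollamalocal-login' },
              huggingface: { handler: () => 'huggingface-login' }
            }
          }
        }
      }
    }
  }
};

// Walk the tree one argument at a time; an unknown argument, or a final
// node without a handler, is an unknown command.
function route(tree, args) {
  let node = { subcommands: tree };
  for (const arg of args) {
    if (!node.subcommands || !node.subcommands[arg]) return 'Unknown command';
    node = node.subcommands[arg];
  }
  return node.handler ? node.handler() : 'Unknown command';
}

console.log(route(commandTree, ['ai', 'login', 'loginMode', 'huggingface'])); // huggingface-login
```

With this layout, adding a new login mode is a single leaf entry plus a handler import, which is exactly what this hunk does.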
@@ -54,6 +59,7 @@ function help() {
54
59
  console.log('AI Commands:');
55
60
  console.log(' npx cloud-pc-templates ai login loginMode ollamacloud');
56
61
  console.log(' npx cloud-pc-templates ai login loginMode ollamalocal');
62
+ console.log(' npx cloud-pc-templates ai login loginMode huggingface');
57
63
  }
58
64
 
59
65
  // Default AI function
@@ -67,6 +73,8 @@ async function aiLogin(mode) {
67
73
  await checkAndLoginOllamaCloud();
68
74
  } else if (mode === 'ollamalocal') {
69
75
  await checkAndLoginOllamaLocal();
76
+ } else if (mode === 'huggingface') {
77
+ await checkAndLoginHuggingFace();
70
78
  }
71
79
  }
72
80
 
package/package.json CHANGED
@@ -1,6 +1,6 @@
1
1
  {
2
2
  "name": "cloud-pc-templates",
3
- "version": "1.2.3",
3
+ "version": "1.3.1",
4
4
  "description": "",
5
5
  "main": "index.js",
6
6
  "bin": {