@mindstudio-ai/local-model-tunnel 0.2.0 → 0.3.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +30 -115
- package/dist/chunk-PTK4SJQK.js +1768 -0
- package/dist/chunk-PTK4SJQK.js.map +1 -0
- package/dist/cli.d.ts +0 -2
- package/dist/cli.js +8 -517
- package/dist/cli.js.map +1 -1
- package/dist/index.d.ts +24 -5
- package/dist/index.js +6 -13
- package/dist/index.js.map +1 -1
- package/dist/tui-56JFPKBP.js +1561 -0
- package/dist/tui-56JFPKBP.js.map +1 -0
- package/package.json +11 -4
- package/dist/api.d.ts +0 -88
- package/dist/api.d.ts.map +0 -1
- package/dist/api.js +0 -168
- package/dist/api.js.map +0 -1
- package/dist/cli.d.ts.map +0 -1
- package/dist/config.d.ts +0 -27
- package/dist/config.d.ts.map +0 -1
- package/dist/config.js +0 -109
- package/dist/config.js.map +0 -1
- package/dist/helpers.d.ts +0 -4
- package/dist/helpers.d.ts.map +0 -1
- package/dist/helpers.js +0 -33
- package/dist/helpers.js.map +0 -1
- package/dist/index.d.ts.map +0 -1
- package/dist/ollama.d.ts +0 -11
- package/dist/ollama.d.ts.map +0 -1
- package/dist/ollama.js +0 -36
- package/dist/ollama.js.map +0 -1
- package/dist/providers/comfyui.d.ts +0 -29
- package/dist/providers/comfyui.d.ts.map +0 -1
- package/dist/providers/comfyui.js +0 -359
- package/dist/providers/comfyui.js.map +0 -1
- package/dist/providers/index.d.ts +0 -63
- package/dist/providers/index.d.ts.map +0 -1
- package/dist/providers/index.js +0 -126
- package/dist/providers/index.js.map +0 -1
- package/dist/providers/lmstudio.d.ts +0 -11
- package/dist/providers/lmstudio.d.ts.map +0 -1
- package/dist/providers/lmstudio.js +0 -106
- package/dist/providers/lmstudio.js.map +0 -1
- package/dist/providers/ollama.d.ts +0 -11
- package/dist/providers/ollama.d.ts.map +0 -1
- package/dist/providers/ollama.js +0 -59
- package/dist/providers/ollama.js.map +0 -1
- package/dist/providers/stable-diffusion.d.ts +0 -41
- package/dist/providers/stable-diffusion.d.ts.map +0 -1
- package/dist/providers/stable-diffusion.js +0 -283
- package/dist/providers/stable-diffusion.js.map +0 -1
- package/dist/providers/types.d.ts +0 -196
- package/dist/providers/types.d.ts.map +0 -1
- package/dist/providers/types.js +0 -19
- package/dist/providers/types.js.map +0 -1
- package/dist/quickstart/QuickstartScreen.d.ts +0 -5
- package/dist/quickstart/QuickstartScreen.d.ts.map +0 -1
- package/dist/quickstart/QuickstartScreen.js +0 -617
- package/dist/quickstart/QuickstartScreen.js.map +0 -1
- package/dist/quickstart/detect.d.ts +0 -22
- package/dist/quickstart/detect.d.ts.map +0 -1
- package/dist/quickstart/detect.js +0 -243
- package/dist/quickstart/detect.js.map +0 -1
- package/dist/quickstart/index.d.ts +0 -4
- package/dist/quickstart/index.d.ts.map +0 -1
- package/dist/quickstart/index.js +0 -274
- package/dist/quickstart/index.js.map +0 -1
- package/dist/quickstart/installers.d.ts +0 -109
- package/dist/quickstart/installers.d.ts.map +0 -1
- package/dist/quickstart/installers.js +0 -1296
- package/dist/quickstart/installers.js.map +0 -1
- package/dist/runner.d.ts +0 -19
- package/dist/runner.d.ts.map +0 -1
- package/dist/runner.js +0 -314
- package/dist/runner.js.map +0 -1
- package/dist/tui/App.d.ts +0 -7
- package/dist/tui/App.d.ts.map +0 -1
- package/dist/tui/App.js +0 -53
- package/dist/tui/App.js.map +0 -1
- package/dist/tui/TunnelRunner.d.ts +0 -19
- package/dist/tui/TunnelRunner.d.ts.map +0 -1
- package/dist/tui/TunnelRunner.js +0 -228
- package/dist/tui/TunnelRunner.js.map +0 -1
- package/dist/tui/components/Header.d.ts +0 -9
- package/dist/tui/components/Header.d.ts.map +0 -1
- package/dist/tui/components/Header.js +0 -21
- package/dist/tui/components/Header.js.map +0 -1
- package/dist/tui/components/ModelsPanel.d.ts +0 -7
- package/dist/tui/components/ModelsPanel.d.ts.map +0 -1
- package/dist/tui/components/ModelsPanel.js +0 -28
- package/dist/tui/components/ModelsPanel.js.map +0 -1
- package/dist/tui/components/ProvidersPanel.d.ts +0 -7
- package/dist/tui/components/ProvidersPanel.d.ts.map +0 -1
- package/dist/tui/components/ProvidersPanel.js +0 -6
- package/dist/tui/components/ProvidersPanel.js.map +0 -1
- package/dist/tui/components/RequestLog.d.ts +0 -8
- package/dist/tui/components/RequestLog.d.ts.map +0 -1
- package/dist/tui/components/RequestLog.js +0 -60
- package/dist/tui/components/RequestLog.js.map +0 -1
- package/dist/tui/components/StatusBar.d.ts +0 -10
- package/dist/tui/components/StatusBar.d.ts.map +0 -1
- package/dist/tui/components/StatusBar.js +0 -7
- package/dist/tui/components/StatusBar.js.map +0 -1
- package/dist/tui/components/index.d.ts +0 -6
- package/dist/tui/components/index.d.ts.map +0 -1
- package/dist/tui/components/index.js +0 -6
- package/dist/tui/components/index.js.map +0 -1
- package/dist/tui/events.d.ts +0 -35
- package/dist/tui/events.d.ts.map +0 -1
- package/dist/tui/events.js +0 -26
- package/dist/tui/events.js.map +0 -1
- package/dist/tui/hooks/index.d.ts +0 -5
- package/dist/tui/hooks/index.d.ts.map +0 -1
- package/dist/tui/hooks/index.js +0 -5
- package/dist/tui/hooks/index.js.map +0 -1
- package/dist/tui/hooks/useConnection.d.ts +0 -10
- package/dist/tui/hooks/useConnection.d.ts.map +0 -1
- package/dist/tui/hooks/useConnection.js +0 -42
- package/dist/tui/hooks/useConnection.js.map +0 -1
- package/dist/tui/hooks/useModels.d.ts +0 -9
- package/dist/tui/hooks/useModels.d.ts.map +0 -1
- package/dist/tui/hooks/useModels.js +0 -28
- package/dist/tui/hooks/useModels.js.map +0 -1
- package/dist/tui/hooks/useProviders.d.ts +0 -9
- package/dist/tui/hooks/useProviders.d.ts.map +0 -1
- package/dist/tui/hooks/useProviders.js +0 -30
- package/dist/tui/hooks/useProviders.js.map +0 -1
- package/dist/tui/hooks/useRequests.d.ts +0 -9
- package/dist/tui/hooks/useRequests.d.ts.map +0 -1
- package/dist/tui/hooks/useRequests.js +0 -60
- package/dist/tui/hooks/useRequests.js.map +0 -1
- package/dist/tui/index.d.ts +0 -2
- package/dist/tui/index.d.ts.map +0 -1
- package/dist/tui/index.js +0 -19
- package/dist/tui/index.js.map +0 -1
- package/dist/tui/screens/ConfigScreen.d.ts +0 -2
- package/dist/tui/screens/ConfigScreen.d.ts.map +0 -1
- package/dist/tui/screens/ConfigScreen.js +0 -18
- package/dist/tui/screens/ConfigScreen.js.map +0 -1
- package/dist/tui/screens/HomeScreen.d.ts +0 -2
- package/dist/tui/screens/HomeScreen.d.ts.map +0 -1
- package/dist/tui/screens/HomeScreen.js +0 -156
- package/dist/tui/screens/HomeScreen.js.map +0 -1
- package/dist/tui/screens/ModelsScreen.d.ts +0 -2
- package/dist/tui/screens/ModelsScreen.d.ts.map +0 -1
- package/dist/tui/screens/ModelsScreen.js +0 -59
- package/dist/tui/screens/ModelsScreen.js.map +0 -1
- package/dist/tui/screens/StatusScreen.d.ts +0 -2
- package/dist/tui/screens/StatusScreen.d.ts.map +0 -1
- package/dist/tui/screens/StatusScreen.js +0 -53
- package/dist/tui/screens/StatusScreen.js.map +0 -1
- package/dist/tui/screens/index.d.ts +0 -9
- package/dist/tui/screens/index.d.ts.map +0 -1
- package/dist/tui/screens/index.js +0 -38
- package/dist/tui/screens/index.js.map +0 -1
- package/dist/tui/types.d.ts +0 -30
- package/dist/tui/types.d.ts.map +0 -1
- package/dist/tui/types.js +0 -2
- package/dist/tui/types.js.map +0 -1
- package/dist/workflows/index.d.ts +0 -47
- package/dist/workflows/index.d.ts.map +0 -1
- package/dist/workflows/index.js +0 -95
- package/dist/workflows/index.js.map +0 -1
- package/dist/workflows/ltx-video.d.ts +0 -45
- package/dist/workflows/ltx-video.d.ts.map +0 -1
- package/dist/workflows/ltx-video.js +0 -114
- package/dist/workflows/ltx-video.js.map +0 -1
- package/dist/workflows/wan2.1.d.ts +0 -44
- package/dist/workflows/wan2.1.d.ts.map +0 -1
- package/dist/workflows/wan2.1.js +0 -119
- package/dist/workflows/wan2.1.js.map +0 -1
package/README.md
CHANGED
@@ -1,143 +1,58 @@
 # MindStudio Local Model Tunnel
 
-
+Use your own locally-running AI models in MindStudio. The tunnel connects local providers like Ollama, LM Studio, Stable Diffusion, and ComfyUI to MindStudio Cloud so you can use your own hardware for text, image, and video generation.
 
-
-
-- **Text Generation**
-
-  - [Ollama](https://ollama.ai)
-  - [LM Studio](https://lmstudio.ai/)
-
-- **Image Generation**
-  - [Stable Diffusion Forge Neo](https://github.com/Haoming02/sd-webui-forge-classic/tree/neo)
-
-## Prerequisites
-
-- Node.js 18+
+## Quick Start
 
-
+You'll need [Node.js 18+](https://nodejs.org) installed.
 
-```bash
-npm install -g @mindstudio-ai/local-model-tunnel
 ```
-
-## Quick Start
-
-```bash
-# Launch the interactive menu
+npm install -g @mindstudio-ai/local-model-tunnel
 mindstudio-local
 ```
 
-
-
-- **Setup** - Install and configure local AI providers (Ollama, LM Studio, Stable Diffusion)
-- **Authenticate** - Log in to MindStudio
-- **Register Models** - Register your local models with MindStudio
-- **Start Tunnel** - Launch the local model tunnel
-- **View Models** - See available local models
-- **Configuration** - View current settings
-
-### Manual Commands
-
-If you prefer command-line usage:
-
-```bash
-# Run the setup wizard
-mindstudio-local setup
-
-# Authenticate with MindStudio
-mindstudio-local auth
-
-# Register your local models
-mindstudio-local register
-
-# Start the tunnel
-mindstudio-local start
-```
-
-## Setup Wizard
-
-The setup wizard (`mindstudio-local setup`) helps you install and configure providers:
-
-**Ollama:**
-
-- Auto-install Ollama (Linux/macOS)
-- Start/stop Ollama server
-- Download models from [ollama.com/library](https://ollama.com/library)
+The app will walk you through connecting your MindStudio account and detecting any local providers you have running.
 
-
+## Supported Providers
 
-
-
+| Provider | Capability | Website |
+|----------|-----------|---------|
+| [Ollama](https://ollama.com) | Text generation | ollama.com |
+| [LM Studio](https://lmstudio.ai) | Text generation | lmstudio.ai |
+| [Stable Diffusion WebUI](https://github.com/AUTOMATIC1111/stable-diffusion-webui) | Image generation | github.com |
+| [ComfyUI](https://www.comfy.org) | Video generation | comfy.org |
 
-**
+Don't have any of these installed yet? No problem -- select **Manage Providers** in the app for step-by-step setup guides for each one.
 
-
-- Provides setup instructions
-- Tip: Download models from [civitai.com](https://civitai.com) (filter by "SDXL 1.0")
-
-## Provider Setup (Manual)
-
-### Ollama
-
-1. Download [Ollama](https://ollama.com/download)
-2. Pull a model: `ollama pull llama3.2` (see [all models](https://ollama.com/library))
-3. Start the server: `ollama serve`
+## How It Works
 
-
+1. You start a local provider (e.g. `ollama serve`)
+2. The tunnel detects it and discovers your models
+3. You sync your models to MindStudio Cloud
+4. When a MindStudio app uses one of your models, the request is routed to your local machine and the response is streamed back
 
-
-2. Download a model through the app
-3. Enable the [Local Server](https://lmstudio.ai/docs/developer/core/server#running-the-server)
+The tunnel stays running and handles requests as they come in. You can see live request logs and status in the dashboard.
 
-
+## Example: Getting Started with Ollama
 
-
+The fastest way to get running with text generation:
 
-```bash
-git clone --branch neo https://github.com/Haoming02/sd-webui-forge-classic.git sd-webui-forge-neo
-cd sd-webui-forge-neo
-python launch.py --api
 ```
+# Install Ollama (macOS/Linux)
+curl -fsSL https://ollama.com/install.sh | sh
 
-
+# Download a model
+ollama pull llama3.2
 
-
-
-python launch.py --api
+# Start the tunnel
+mindstudio-local
 ```
 
-
-
-| Command | Description |
-| ------------ | ----------------------------------------- |
-| _(none)_ | Open interactive home screen |
-| `setup` | Interactive setup wizard for providers |
-| `auth` | Authenticate with MindStudio via browser |
-| `register` | Register all local models with MindStudio |
-| `start` | Start the local model tunnel |
-| `models` | List available local models |
-| `status` | Check connection status |
-| `config` | Show current configuration |
-| `set-config` | Set configuration |
-| `logout` | Remove stored credentials |
-
-## Configuration Options
-
-```bash
-# Use custom provider URLs
-mindstudio-local set-config --ollama-url http://localhost:11434
-mindstudio-local set-config --lmstudio-url http://localhost:1234/v1
-mindstudio-local set-config --sd-url http://127.0.0.1:7860
-```
+Select **Sync Models** in the dashboard to register your models with MindStudio, and you're ready to go.
 
-##
+## Want a New Provider?
 
-
-2. Discovers your local models
-3. Polls MindStudio for inference requests
-4. Routes requests to local server and streams responses back
+If there's a local AI tool you'd like to use with MindStudio, [open an issue](https://github.com/mindstudio-ai/mindstudio-local-model-tunnel/issues) to request it. Or if you're feeling adventurous, add it yourself -- each provider is a self-contained directory under `src/providers/` and the `CLAUDE.md` file has a full guide for adding one. PRs welcome!
 
 ## License
 
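The flow described in the new README's "How It Works" section (detect providers, discover models, route each cloud request to the owning provider, stream the response back) can be sketched in miniature. This is an illustrative sketch only, not the package's actual API: `TunnelSketch`, `FakeOllamaProvider`, and all method names here are hypothetical, and a fake in-process provider stands in for a real local server.

```javascript
// Hypothetical stand-in for a local provider such as Ollama.
// A real provider would query the server's HTTP API for its model
// list and stream generated tokens back; here we hard-code both.
class FakeOllamaProvider {
  constructor() {
    this.name = "ollama";
  }

  async listModels() {
    return ["llama3.2"];
  }

  // Async generator: yields response chunks one at a time, the way
  // a streaming completion endpoint would.
  async *generate(model, prompt) {
    for (const token of ["Hello", " ", "world"]) {
      yield token;
    }
  }
}

// Hypothetical tunnel core: discovers models across providers and
// routes an incoming request to the provider that owns the model.
class TunnelSketch {
  constructor(providers) {
    this.providers = providers;
  }

  async discoverModels() {
    const models = [];
    for (const p of this.providers) {
      for (const m of await p.listModels()) {
        models.push({ provider: p.name, model: m });
      }
    }
    return models;
  }

  async handleRequest(req) {
    const provider = this.providers.find((p) => p.name === req.provider);
    if (!provider) {
      throw new Error(`unknown provider: ${req.provider}`);
    }
    // Concatenate streamed chunks into the final response text.
    let out = "";
    for await (const chunk of provider.generate(req.model, req.prompt)) {
      out += chunk;
    }
    return out;
  }
}

// Usage: discover models, then serve one request end to end.
(async () => {
  const tunnel = new TunnelSketch([new FakeOllamaProvider()]);
  console.log(await tunnel.discoverModels());
  const reply = await tunnel.handleRequest({
    provider: "ollama",
    model: "llama3.2",
    prompt: "hi",
  });
  console.log(reply); // "Hello world"
})();
```

The real tunnel additionally keeps a long-lived connection to MindStudio Cloud and streams chunks back as they arrive rather than buffering the whole response; the sketch collapses that into a single returned string for brevity.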