@mindstudio-ai/local-model-tunnel 0.1.9 → 0.3.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (170)
  1. package/README.md +30 -115
  2. package/dist/chunk-PTK4SJQK.js +1768 -0
  3. package/dist/chunk-PTK4SJQK.js.map +1 -0
  4. package/dist/cli.d.ts +0 -2
  5. package/dist/cli.js +8 -517
  6. package/dist/cli.js.map +1 -1
  7. package/dist/index.d.ts +24 -5
  8. package/dist/index.js +6 -13
  9. package/dist/index.js.map +1 -1
  10. package/dist/tui-56JFPKBP.js +1561 -0
  11. package/dist/tui-56JFPKBP.js.map +1 -0
  12. package/package.json +11 -4
  13. package/dist/api.d.ts +0 -88
  14. package/dist/api.d.ts.map +0 -1
  15. package/dist/api.js +0 -168
  16. package/dist/api.js.map +0 -1
  17. package/dist/cli.d.ts.map +0 -1
  18. package/dist/config.d.ts +0 -27
  19. package/dist/config.d.ts.map +0 -1
  20. package/dist/config.js +0 -109
  21. package/dist/config.js.map +0 -1
  22. package/dist/helpers.d.ts +0 -4
  23. package/dist/helpers.d.ts.map +0 -1
  24. package/dist/helpers.js +0 -33
  25. package/dist/helpers.js.map +0 -1
  26. package/dist/index.d.ts.map +0 -1
  27. package/dist/ollama.d.ts +0 -11
  28. package/dist/ollama.d.ts.map +0 -1
  29. package/dist/ollama.js +0 -36
  30. package/dist/ollama.js.map +0 -1
  31. package/dist/providers/comfyui.d.ts +0 -29
  32. package/dist/providers/comfyui.d.ts.map +0 -1
  33. package/dist/providers/comfyui.js +0 -359
  34. package/dist/providers/comfyui.js.map +0 -1
  35. package/dist/providers/index.d.ts +0 -63
  36. package/dist/providers/index.d.ts.map +0 -1
  37. package/dist/providers/index.js +0 -126
  38. package/dist/providers/index.js.map +0 -1
  39. package/dist/providers/lmstudio.d.ts +0 -11
  40. package/dist/providers/lmstudio.d.ts.map +0 -1
  41. package/dist/providers/lmstudio.js +0 -106
  42. package/dist/providers/lmstudio.js.map +0 -1
  43. package/dist/providers/ollama.d.ts +0 -11
  44. package/dist/providers/ollama.d.ts.map +0 -1
  45. package/dist/providers/ollama.js +0 -59
  46. package/dist/providers/ollama.js.map +0 -1
  47. package/dist/providers/stable-diffusion.d.ts +0 -41
  48. package/dist/providers/stable-diffusion.d.ts.map +0 -1
  49. package/dist/providers/stable-diffusion.js +0 -283
  50. package/dist/providers/stable-diffusion.js.map +0 -1
  51. package/dist/providers/types.d.ts +0 -196
  52. package/dist/providers/types.d.ts.map +0 -1
  53. package/dist/providers/types.js +0 -19
  54. package/dist/providers/types.js.map +0 -1
  55. package/dist/quickstart/QuickstartScreen.d.ts +0 -5
  56. package/dist/quickstart/QuickstartScreen.d.ts.map +0 -1
  57. package/dist/quickstart/QuickstartScreen.js +0 -616
  58. package/dist/quickstart/QuickstartScreen.js.map +0 -1
  59. package/dist/quickstart/detect.d.ts +0 -22
  60. package/dist/quickstart/detect.d.ts.map +0 -1
  61. package/dist/quickstart/detect.js +0 -243
  62. package/dist/quickstart/detect.js.map +0 -1
  63. package/dist/quickstart/index.d.ts +0 -4
  64. package/dist/quickstart/index.d.ts.map +0 -1
  65. package/dist/quickstart/index.js +0 -245
  66. package/dist/quickstart/index.js.map +0 -1
  67. package/dist/quickstart/installers.d.ts +0 -109
  68. package/dist/quickstart/installers.d.ts.map +0 -1
  69. package/dist/quickstart/installers.js +0 -1296
  70. package/dist/quickstart/installers.js.map +0 -1
  71. package/dist/runner.d.ts +0 -19
  72. package/dist/runner.d.ts.map +0 -1
  73. package/dist/runner.js +0 -314
  74. package/dist/runner.js.map +0 -1
  75. package/dist/tui/App.d.ts +0 -7
  76. package/dist/tui/App.d.ts.map +0 -1
  77. package/dist/tui/App.js +0 -53
  78. package/dist/tui/App.js.map +0 -1
  79. package/dist/tui/TunnelRunner.d.ts +0 -19
  80. package/dist/tui/TunnelRunner.d.ts.map +0 -1
  81. package/dist/tui/TunnelRunner.js +0 -228
  82. package/dist/tui/TunnelRunner.js.map +0 -1
  83. package/dist/tui/components/Header.d.ts +0 -9
  84. package/dist/tui/components/Header.d.ts.map +0 -1
  85. package/dist/tui/components/Header.js +0 -21
  86. package/dist/tui/components/Header.js.map +0 -1
  87. package/dist/tui/components/ModelsPanel.d.ts +0 -7
  88. package/dist/tui/components/ModelsPanel.d.ts.map +0 -1
  89. package/dist/tui/components/ModelsPanel.js +0 -28
  90. package/dist/tui/components/ModelsPanel.js.map +0 -1
  91. package/dist/tui/components/ProvidersPanel.d.ts +0 -7
  92. package/dist/tui/components/ProvidersPanel.d.ts.map +0 -1
  93. package/dist/tui/components/ProvidersPanel.js +0 -6
  94. package/dist/tui/components/ProvidersPanel.js.map +0 -1
  95. package/dist/tui/components/RequestLog.d.ts +0 -8
  96. package/dist/tui/components/RequestLog.d.ts.map +0 -1
  97. package/dist/tui/components/RequestLog.js +0 -60
  98. package/dist/tui/components/RequestLog.js.map +0 -1
  99. package/dist/tui/components/StatusBar.d.ts +0 -10
  100. package/dist/tui/components/StatusBar.d.ts.map +0 -1
  101. package/dist/tui/components/StatusBar.js +0 -7
  102. package/dist/tui/components/StatusBar.js.map +0 -1
  103. package/dist/tui/components/index.d.ts +0 -6
  104. package/dist/tui/components/index.d.ts.map +0 -1
  105. package/dist/tui/components/index.js +0 -6
  106. package/dist/tui/components/index.js.map +0 -1
  107. package/dist/tui/events.d.ts +0 -35
  108. package/dist/tui/events.d.ts.map +0 -1
  109. package/dist/tui/events.js +0 -26
  110. package/dist/tui/events.js.map +0 -1
  111. package/dist/tui/hooks/index.d.ts +0 -5
  112. package/dist/tui/hooks/index.d.ts.map +0 -1
  113. package/dist/tui/hooks/index.js +0 -5
  114. package/dist/tui/hooks/index.js.map +0 -1
  115. package/dist/tui/hooks/useConnection.d.ts +0 -10
  116. package/dist/tui/hooks/useConnection.d.ts.map +0 -1
  117. package/dist/tui/hooks/useConnection.js +0 -42
  118. package/dist/tui/hooks/useConnection.js.map +0 -1
  119. package/dist/tui/hooks/useModels.d.ts +0 -9
  120. package/dist/tui/hooks/useModels.d.ts.map +0 -1
  121. package/dist/tui/hooks/useModels.js +0 -28
  122. package/dist/tui/hooks/useModels.js.map +0 -1
  123. package/dist/tui/hooks/useProviders.d.ts +0 -9
  124. package/dist/tui/hooks/useProviders.d.ts.map +0 -1
  125. package/dist/tui/hooks/useProviders.js +0 -30
  126. package/dist/tui/hooks/useProviders.js.map +0 -1
  127. package/dist/tui/hooks/useRequests.d.ts +0 -9
  128. package/dist/tui/hooks/useRequests.d.ts.map +0 -1
  129. package/dist/tui/hooks/useRequests.js +0 -60
  130. package/dist/tui/hooks/useRequests.js.map +0 -1
  131. package/dist/tui/index.d.ts +0 -2
  132. package/dist/tui/index.d.ts.map +0 -1
  133. package/dist/tui/index.js +0 -19
  134. package/dist/tui/index.js.map +0 -1
  135. package/dist/tui/screens/ConfigScreen.d.ts +0 -2
  136. package/dist/tui/screens/ConfigScreen.d.ts.map +0 -1
  137. package/dist/tui/screens/ConfigScreen.js +0 -18
  138. package/dist/tui/screens/ConfigScreen.js.map +0 -1
  139. package/dist/tui/screens/HomeScreen.d.ts +0 -2
  140. package/dist/tui/screens/HomeScreen.d.ts.map +0 -1
  141. package/dist/tui/screens/HomeScreen.js +0 -156
  142. package/dist/tui/screens/HomeScreen.js.map +0 -1
  143. package/dist/tui/screens/ModelsScreen.d.ts +0 -2
  144. package/dist/tui/screens/ModelsScreen.d.ts.map +0 -1
  145. package/dist/tui/screens/ModelsScreen.js +0 -59
  146. package/dist/tui/screens/ModelsScreen.js.map +0 -1
  147. package/dist/tui/screens/StatusScreen.d.ts +0 -2
  148. package/dist/tui/screens/StatusScreen.d.ts.map +0 -1
  149. package/dist/tui/screens/StatusScreen.js +0 -53
  150. package/dist/tui/screens/StatusScreen.js.map +0 -1
  151. package/dist/tui/screens/index.d.ts +0 -9
  152. package/dist/tui/screens/index.d.ts.map +0 -1
  153. package/dist/tui/screens/index.js +0 -38
  154. package/dist/tui/screens/index.js.map +0 -1
  155. package/dist/tui/types.d.ts +0 -30
  156. package/dist/tui/types.d.ts.map +0 -1
  157. package/dist/tui/types.js +0 -2
  158. package/dist/tui/types.js.map +0 -1
  159. package/dist/workflows/index.d.ts +0 -47
  160. package/dist/workflows/index.d.ts.map +0 -1
  161. package/dist/workflows/index.js +0 -95
  162. package/dist/workflows/index.js.map +0 -1
  163. package/dist/workflows/ltx-video.d.ts +0 -45
  164. package/dist/workflows/ltx-video.d.ts.map +0 -1
  165. package/dist/workflows/ltx-video.js +0 -114
  166. package/dist/workflows/ltx-video.js.map +0 -1
  167. package/dist/workflows/wan2.1.d.ts +0 -44
  168. package/dist/workflows/wan2.1.d.ts.map +0 -1
  169. package/dist/workflows/wan2.1.js +0 -119
  170. package/dist/workflows/wan2.1.js.map +0 -1
package/README.md CHANGED
@@ -1,143 +1,58 @@
  # MindStudio Local Model Tunnel

- Run local models with MindStudio.
+ Use your own locally-running AI models in MindStudio. The tunnel connects local providers like Ollama, LM Studio, Stable Diffusion, and ComfyUI to MindStudio Cloud so you can use your own hardware for text, image, and video generation.

- Providers supported so far:
-
- - **Text Generation**
-
- - [Ollama](https://ollama.ai)
- - [LM Studio](https://lmstudio.ai/)
-
- - **Image Generation**
- - [Stable Diffusion Forge Neo](https://github.com/Haoming02/sd-webui-forge-classic/tree/neo)
-
- ## Prerequisites
-
- - Node.js 18+
+ ## Quick Start

- ## Installation
+ You'll need [Node.js 18+](https://nodejs.org) installed.

- ```bash
- npm install -g @mindstudio-ai/local-model-tunnel
  ```
-
- ## Quick Start
-
- ```bash
- # Launch the interactive menu
+ npm install -g @mindstudio-ai/local-model-tunnel
  mindstudio-local
  ```

- This opens an interactive home screen where you can:
-
- - **Setup** - Install and configure local AI providers (Ollama, LM Studio, Stable Diffusion)
- - **Authenticate** - Log in to MindStudio
- - **Register Models** - Register your local models with MindStudio
- - **Start Tunnel** - Launch the local model tunnel
- - **View Models** - See available local models
- - **Configuration** - View current settings
-
- ### Manual Commands
-
- If you prefer command-line usage:
-
- ```bash
- # Run the setup wizard
- mindstudio-local setup
-
- # Authenticate with MindStudio
- mindstudio-local auth
-
- # Register your local models
- mindstudio-local register
-
- # Start the tunnel
- mindstudio-local start
- ```
-
- ## Setup Wizard
-
- The setup wizard (`mindstudio-local setup`) helps you install and configure providers:
-
- **Ollama:**
-
- - Auto-install Ollama (Linux/macOS)
- - Start/stop Ollama server
- - Download models from [ollama.com/library](https://ollama.com/library)
+ The app will walk you through connecting your MindStudio account and detecting any local providers you have running.

- **LM Studio:**
+ ## Supported Providers

- - Opens download page in browser
- - Guides you through enabling the local server
+ | Provider | Capability | Website |
+ |----------|-----------|---------|
+ | [Ollama](https://ollama.com) | Text generation | ollama.com |
+ | [LM Studio](https://lmstudio.ai) | Text generation | lmstudio.ai |
+ | [Stable Diffusion WebUI](https://github.com/AUTOMATIC1111/stable-diffusion-webui) | Image generation | github.com |
+ | [ComfyUI](https://www.comfy.org) | Video generation | comfy.org |

- **Stable Diffusion Forge:**
+ Don't have any of these installed yet? No problem -- select **Manage Providers** in the app for step-by-step setup guides for each one.

- - Clones the repository to your chosen location
- - Provides setup instructions
- - Tip: Download models from [civitai.com](https://civitai.com) (filter by "SDXL 1.0")
-
- ## Provider Setup (Manual)
-
- ### Ollama
-
- 1. Download [Ollama](https://ollama.com/download)
- 2. Pull a model: `ollama pull llama3.2` (see [all models](https://ollama.com/library))
- 3. Start the server: `ollama serve`
+ ## How It Works

- ### LM Studio
+ 1. You start a local provider (e.g. `ollama serve`)
+ 2. The tunnel detects it and discovers your models
+ 3. You sync your models to MindStudio Cloud
+ 4. When a MindStudio app uses one of your models, the request is routed to your local machine and the response is streamed back

- 1. Download [LM Studio](https://lmstudio.ai/download)
- 2. Download a model through the app
- 3. Enable the [Local Server](https://lmstudio.ai/docs/developer/core/server#running-the-server)
+ The tunnel stays running and handles requests as they come in. You can see live request logs and status in the dashboard.

- ### Stable Diffusion (Forge Neo)
+ ## Example: Getting Started with Ollama

- **First-time setup:**
+ The fastest way to get running with text generation:

- ```bash
- git clone --branch neo https://github.com/Haoming02/sd-webui-forge-classic.git sd-webui-forge-neo
- cd sd-webui-forge-neo
- python launch.py --api
  ```
+ # Install Ollama (macOS/Linux)
+ curl -fsSL https://ollama.com/install.sh | sh

- **Subsequent runs:**
+ # Download a model
+ ollama pull llama3.2

- ```bash
- cd sd-webui-forge-neo
- python launch.py --api
+ # Start the tunnel
+ mindstudio-local
  ```

- ## Commands
-
- | Command | Description |
- | ------------ | ----------------------------------------- |
- | _(none)_ | Open interactive home screen |
- | `setup` | Interactive setup wizard for providers |
- | `auth` | Authenticate with MindStudio via browser |
- | `register` | Register all local models with MindStudio |
- | `start` | Start the local model tunnel |
- | `models` | List available local models |
- | `status` | Check connection status |
- | `config` | Show current configuration |
- | `set-config` | Set configuration |
- | `logout` | Remove stored credentials |
-
- ## Configuration Options
-
- ```bash
- # Use custom provider URLs
- mindstudio-local set-config --ollama-url http://localhost:11434
- mindstudio-local set-config --lmstudio-url http://localhost:1234/v1
- mindstudio-local set-config --sd-url http://127.0.0.1:7860
- ```
+ Select **Sync Models** in the dashboard to register your models with MindStudio, and you're ready to go.

- ## How It Works
+ ## Want a New Provider?

- 1. Authenticates with your MindStudio account
- 2. Discovers your local models
- 3. Polls MindStudio for inference requests
- 4. Routes requests to local server and streams responses back
+ If there's a local AI tool you'd like to use with MindStudio, [open an issue](https://github.com/mindstudio-ai/mindstudio-local-model-tunnel/issues) to request it. Or if you're feeling adventurous, add it yourself -- each provider is a self-contained directory under `src/providers/` and the `CLAUDE.md` file has a full guide for adding one. PRs welcome!

  ## License
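
The rewritten README's "How It Works" section describes a dispatch step: an inference request from MindStudio Cloud names a model, and the tunnel forwards it to whichever local provider reported that model during discovery. A minimal sketch of that idea in JavaScript -- `providers`, `routeRequest`, and the request shape are all hypothetical illustrations, not the package's actual API:

```javascript
// Local providers and the models each one reported during discovery.
// URLs match the defaults the old README used for Ollama and LM Studio.
const providers = [
  { name: "ollama", baseUrl: "http://localhost:11434", models: ["llama3.2"] },
  { name: "lmstudio", baseUrl: "http://localhost:1234/v1", models: ["qwen2.5-7b"] },
];

// Find the provider that serves the requested model. In the real tunnel,
// this is the point where the request would be forwarded to the provider's
// base URL and the response streamed back to MindStudio Cloud.
function routeRequest(request) {
  const provider = providers.find((p) => p.models.includes(request.model));
  if (!provider) {
    throw new Error(`No local provider serves model "${request.model}"`);
  }
  return { provider: provider.name, target: provider.baseUrl };
}

console.log(routeRequest({ model: "llama3.2" }));
```

The actual implementation lives in the bundled `dist/` files above; this sketch only captures the routing shape the README describes.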