libre-webui 0.3.1 → 0.3.2
This diff compares publicly available package versions released to a supported registry. It is provided for informational purposes only and reflects the package contents as they appear in their respective public registries.
- package/README.md +53 -21
- package/frontend/dist/assets/{index-DE426j2O.js → index-BbJ5OmoO.js} +4 -4
- package/frontend/dist/index.html +1 -1
- package/frontend/dist/js/{ArtifactContainer-Cu1sm7Li.js → ArtifactContainer-BeIYvOxW.js} +1 -1
- package/frontend/dist/js/{ArtifactDemoPage-B8QqSOBD.js → ArtifactDemoPage-DLNsXpdK.js} +1 -1
- package/frontend/dist/js/{ChatPage-CAzOtZHX.js → ChatPage-B1ySV3lW.js} +2 -2
- package/frontend/dist/js/{GalleryPage-BvGkYN_0.js → GalleryPage-BRYMP7Q3.js} +1 -1
- package/frontend/dist/js/{ModelsPage-CUo5jDV_.js → ModelsPage-7lF58OTC.js} +1 -1
- package/frontend/dist/js/{PersonasPage-Du-8hTvR.js → PersonasPage-Cc_gMLv0.js} +1 -1
- package/frontend/dist/js/{UserManagementPage-BnHnW1KA.js → UserManagementPage-DqV3zxVm.js} +1 -1
- package/package.json +1 -2
package/README.md
CHANGED
@@ -96,20 +96,67 @@ npx libre-webui
 
 That's it. Opens at `http://localhost:8080`
 
-
+### Docker
+
+| Setup | Command |
+| ----------------------------------------- | ------------------------------------------------------------ |
+| Bundled Ollama (CPU) | `docker-compose up -d` |
+| Bundled Ollama (NVIDIA GPU) | `docker-compose -f docker-compose.gpu.yml up -d` |
+| External Ollama (already running on host) | `docker-compose -f docker-compose.external-ollama.yml up -d` |
+
+Access at `http://localhost:8080`
+
+<details>
+<summary><strong>Development builds (unstable)</strong></summary>
+
+> **Warning:** Development builds are automatically generated from the `dev` branch and may contain experimental features, breaking changes, or bugs. Use at your own risk and do not use in production environments.
+
+| Setup | Command |
+| --------------------------------- | ---------------------------------------------------------------- |
+| Dev + Bundled Ollama (CPU) | `docker-compose -f docker-compose.dev.yml up -d` |
+| Dev + Bundled Ollama (NVIDIA GPU) | `docker-compose -f docker-compose.dev.gpu.yml up -d` |
+| Dev + External Ollama | `docker-compose -f docker-compose.dev.external-ollama.yml up -d` |
+
+Development builds use separate data volumes (`libre_webui_dev_data`) to prevent conflicts with stable installations.
+
+To pull the latest dev image manually:
 
 ```bash
-
-npx libre-webui --help # Show all options
+docker pull librewebui/libre-webui:dev
 ```
 
-
+</details>
+
+### Kubernetes (Helm)
 
 ```bash
-
-ANTHROPIC_API_KEY=sk-ant-... npx libre-webui
+helm install libre-webui oci://ghcr.io/libre-webui/charts/libre-webui
 ```
 
+<details>
+<summary><strong>Helm configuration options</strong></summary>
+
+```bash
+# With external Ollama
+helm install libre-webui oci://ghcr.io/libre-webui/charts/libre-webui \
+  --set ollama.bundled.enabled=false \
+  --set ollama.external.enabled=true \
+  --set ollama.external.url=http://my-ollama:11434
+
+# With NVIDIA GPU support
+helm install libre-webui oci://ghcr.io/libre-webui/charts/libre-webui \
+  --set ollama.bundled.gpu.enabled=true
+
+# With Ingress
+helm install libre-webui oci://ghcr.io/libre-webui/charts/libre-webui \
+  --set ingress.enabled=true \
+  --set ingress.hosts[0].host=chat.example.com
+```
+
+See [helm/libre-webui/values.yaml](helm/libre-webui/values.yaml) for all configuration options.
+
+</details>
+
 ### Development Setup
 
 ```bash
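The Docker table introduced above lists three compose files; as a quick sanity check, a minimal sketch of the external-Ollama path from that table, assuming Ollama is already serving on the host and that the UI comes up on the `8080` port the README states:

```bash
# Start libre-webui against an Ollama instance already running on the host
docker-compose -f docker-compose.external-ollama.yml up -d

# Confirm the UI answers on the documented port
curl -sSf http://localhost:8080 >/dev/null && echo "libre-webui is up"
```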
@@ -135,23 +182,8 @@ OLLAMA_BASE_URL=http://localhost:11434
 # Cloud AI Providers (add the ones you need)
 OPENAI_API_KEY=sk-...
 ANTHROPIC_API_KEY=sk-ant-...
-GROQ_API_KEY=gsk_...
-GEMINI_API_KEY=...
-
-# Optional: Text-to-Speech
-ELEVENLABS_API_KEY=...
-```
-
-**Or with Docker** (requires Ollama running on host):
-
-```bash
-docker-compose -f docker-compose.external-ollama.yml up -d
 ```
 
-> A bundled `docker-compose.yml` with Ollama included exists but is untested.
-
-Access at `http://localhost:5173` (dev) or `http://localhost:8080` (Docker)
-
 ### Desktop App (In Development)
 
 > **Note:** The desktop app is currently in active development. The macOS build is pending Apple notarization, which may cause security warnings or installation issues on some systems. We're working to resolve this. Feedback and bug reports are welcome!