llms-py 2.0.31__tar.gz → 2.0.33__tar.gz
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- {llms_py-2.0.31/llms_py.egg-info → llms_py-2.0.33}/PKG-INFO +6 -3
- {llms_py-2.0.31 → llms_py-2.0.33}/README.md +5 -2
- {llms_py-2.0.31 → llms_py-2.0.33}/llms/llms.json +737 -701
- {llms_py-2.0.31 → llms_py-2.0.33}/llms/main.py +1 -1
- {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/ai.mjs +1 -1
- {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/lib/servicestack-vue.mjs +9 -9
- {llms_py-2.0.31 → llms_py-2.0.33/llms_py.egg-info}/PKG-INFO +6 -3
- {llms_py-2.0.31 → llms_py-2.0.33}/pyproject.toml +1 -1
- {llms_py-2.0.31 → llms_py-2.0.33}/setup.py +1 -1
- {llms_py-2.0.31 → llms_py-2.0.33}/LICENSE +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/MANIFEST.in +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms/__init__.py +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms/__main__.py +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms/index.html +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/Analytics.mjs +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/App.mjs +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/Avatar.mjs +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/Brand.mjs +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/ChatPrompt.mjs +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/Main.mjs +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/ModelSelector.mjs +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/OAuthSignIn.mjs +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/ProviderIcon.mjs +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/ProviderStatus.mjs +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/Recents.mjs +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/SettingsDialog.mjs +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/Sidebar.mjs +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/SignIn.mjs +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/SystemPromptEditor.mjs +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/SystemPromptSelector.mjs +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/Welcome.mjs +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/app.css +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/fav.svg +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/lib/chart.js +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/lib/charts.mjs +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/lib/color.js +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/lib/highlight.min.mjs +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/lib/idb.min.mjs +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/lib/marked.min.mjs +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/lib/servicestack-client.mjs +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/lib/vue-router.min.mjs +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/lib/vue.min.mjs +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/lib/vue.mjs +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/markdown.mjs +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/tailwind.input.css +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/threadStore.mjs +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/typography.css +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/utils.mjs +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui.json +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms_py.egg-info/SOURCES.txt +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms_py.egg-info/dependency_links.txt +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms_py.egg-info/entry_points.txt +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms_py.egg-info/not-zip-safe +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms_py.egg-info/requires.txt +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/llms_py.egg-info/top_level.txt +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/requirements.txt +0 -0
- {llms_py-2.0.31 → llms_py-2.0.33}/setup.cfg +0 -0
{llms_py-2.0.31/llms_py.egg-info → llms_py-2.0.33}/PKG-INFO

```diff
@@ -1,6 +1,6 @@
 Metadata-Version: 2.4
 Name: llms-py
-Version: 2.0.31
+Version: 2.0.33
 Summary: A lightweight CLI tool and OpenAI-compatible server for querying multiple Large Language Model (LLM) providers
 Home-page: https://github.com/ServiceStack/llms
 Author: ServiceStack
@@ -54,6 +54,7 @@ Configure additional providers and models in [llms.json](llms/llms.json)
 - **Multi-Provider Support**: OpenRouter, Ollama, Anthropic, Google, OpenAI, Grok, Groq, Qwen, Z.ai, Mistral
 - **OpenAI-Compatible API**: Works with any client that supports OpenAI's chat completion API
 - **Built-in Analytics**: Built-in analytics UI to visualize costs, requests, and token usage
+- **GitHub OAuth**: Optionally Secure your web UI and restrict access to specified GitHub Users
 - **Configuration Management**: Easy provider enable/disable and configuration management
 - **CLI Interface**: Simple command-line interface for quick interactions
 - **Server Mode**: Run an OpenAI-compatible HTTP server at `http://localhost:{PORT}/v1/chat/completions`
@@ -275,7 +276,10 @@ See [GITHUB_OAUTH_SETUP.md](GITHUB_OAUTH_SETUP.md) for detailed setup instructio
 
 ## Configuration
 
-The configuration file [llms.json](llms/llms.json) is saved to `~/.llms/llms.json` and defines available providers, models, and default settings.
+The configuration file [llms.json](llms/llms.json) is saved to `~/.llms/llms.json` and defines available providers, models, and default settings. If it doesn't exist, `llms.json` is auto created with the latest
+configuration, so you can re-create it by deleting your local config (e.g. `rm -rf ~/.llms`).
+
+Key sections:
 
 ### Defaults
 - `headers`: Common HTTP headers for all requests
@@ -288,7 +292,6 @@ The configuration file [llms.json](llms/llms.json) is saved to `~/.llms/llms.jso
 - `convert`: Max image size and length limits and auto conversion settings
 
 ### Providers
-
 Each provider configuration includes:
 - `enabled`: Whether the provider is active
 - `type`: Provider class (OpenAiProvider, GoogleProvider, etc.)
```
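The "Server Mode" feature diffed above exposes an OpenAI-compatible endpoint at `http://localhost:{PORT}/v1/chat/completions`, so any OpenAI-style client can talk to it. A minimal sketch of such a request using only the standard library follows; the port and model name are illustrative assumptions, not values from this diff.

```python
import json
import urllib.request

# Assumptions for illustration: the server port and model name are not
# specified in this diff -- substitute your own running llms-py instance.
PORT = 8000

# Standard OpenAI chat-completions payload shape.
payload = {
    "model": "example-model",  # hypothetical model id
    "messages": [{"role": "user", "content": "Hello"}],
}

req = urllib.request.Request(
    f"http://localhost:{PORT}/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# With a server running, send it with:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Because the request body follows OpenAI's chat-completions schema, the same payload works against any provider llms-py proxies to.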
{llms_py-2.0.31 → llms_py-2.0.33}/README.md

```diff
@@ -14,6 +14,7 @@ Configure additional providers and models in [llms.json](llms/llms.json)
 - **Multi-Provider Support**: OpenRouter, Ollama, Anthropic, Google, OpenAI, Grok, Groq, Qwen, Z.ai, Mistral
 - **OpenAI-Compatible API**: Works with any client that supports OpenAI's chat completion API
 - **Built-in Analytics**: Built-in analytics UI to visualize costs, requests, and token usage
+- **GitHub OAuth**: Optionally Secure your web UI and restrict access to specified GitHub Users
 - **Configuration Management**: Easy provider enable/disable and configuration management
 - **CLI Interface**: Simple command-line interface for quick interactions
 - **Server Mode**: Run an OpenAI-compatible HTTP server at `http://localhost:{PORT}/v1/chat/completions`
@@ -235,7 +236,10 @@ See [GITHUB_OAUTH_SETUP.md](GITHUB_OAUTH_SETUP.md) for detailed setup instructio
 
 ## Configuration
 
-The configuration file [llms.json](llms/llms.json) is saved to `~/.llms/llms.json` and defines available providers, models, and default settings.
+The configuration file [llms.json](llms/llms.json) is saved to `~/.llms/llms.json` and defines available providers, models, and default settings. If it doesn't exist, `llms.json` is auto created with the latest
+configuration, so you can re-create it by deleting your local config (e.g. `rm -rf ~/.llms`).
+
+Key sections:
 
 ### Defaults
 - `headers`: Common HTTP headers for all requests
@@ -248,7 +252,6 @@ The configuration file [llms.json](llms/llms.json) is saved to `~/.llms/llms.jso
 - `convert`: Max image size and length limits and auto conversion settings
 
 ### Providers
-
 Each provider configuration includes:
 - `enabled`: Whether the provider is active
 - `type`: Provider class (OpenAiProvider, GoogleProvider, etc.)
```
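The README text being diffed documents per-provider keys in `llms.json` (`enabled` and `type`, with classes such as `OpenAiProvider` and `GoogleProvider`). A sketch of what one such entry might look like, expressed as a Python dict for illustration: only `enabled` and `type` come from this diff; the remaining fields are hypothetical placeholders, not the actual llms.json schema.

```python
import json

# Hypothetical provider entry. `enabled` and `type` mirror the keys the
# README documents; `base_url`, `api_key`, and `models` are assumptions
# added only to make the sketch concrete.
provider = {
    "enabled": True,
    "type": "OpenAiProvider",
    "base_url": "https://api.example.com/v1",  # assumption
    "api_key": "$EXAMPLE_API_KEY",             # assumption: env-var reference
    "models": ["example-model"],               # assumption
}

# Disabling a provider is just flipping the documented `enabled` flag
# before writing the config back out.
provider["enabled"] = False
print(json.dumps(provider, indent=2))
```

Per the diffed README text, deleting `~/.llms` regenerates the file with the latest defaults, so local experiments like this are easy to reset.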