llms-py 2.0.31__tar.gz → 2.0.33__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (57)
  1. {llms_py-2.0.31/llms_py.egg-info → llms_py-2.0.33}/PKG-INFO +6 -3
  2. {llms_py-2.0.31 → llms_py-2.0.33}/README.md +5 -2
  3. {llms_py-2.0.31 → llms_py-2.0.33}/llms/llms.json +737 -701
  4. {llms_py-2.0.31 → llms_py-2.0.33}/llms/main.py +1 -1
  5. {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/ai.mjs +1 -1
  6. {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/lib/servicestack-vue.mjs +9 -9
  7. {llms_py-2.0.31 → llms_py-2.0.33/llms_py.egg-info}/PKG-INFO +6 -3
  8. {llms_py-2.0.31 → llms_py-2.0.33}/pyproject.toml +1 -1
  9. {llms_py-2.0.31 → llms_py-2.0.33}/setup.py +1 -1
  10. {llms_py-2.0.31 → llms_py-2.0.33}/LICENSE +0 -0
  11. {llms_py-2.0.31 → llms_py-2.0.33}/MANIFEST.in +0 -0
  12. {llms_py-2.0.31 → llms_py-2.0.33}/llms/__init__.py +0 -0
  13. {llms_py-2.0.31 → llms_py-2.0.33}/llms/__main__.py +0 -0
  14. {llms_py-2.0.31 → llms_py-2.0.33}/llms/index.html +0 -0
  15. {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/Analytics.mjs +0 -0
  16. {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/App.mjs +0 -0
  17. {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/Avatar.mjs +0 -0
  18. {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/Brand.mjs +0 -0
  19. {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/ChatPrompt.mjs +0 -0
  20. {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/Main.mjs +0 -0
  21. {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/ModelSelector.mjs +0 -0
  22. {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/OAuthSignIn.mjs +0 -0
  23. {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/ProviderIcon.mjs +0 -0
  24. {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/ProviderStatus.mjs +0 -0
  25. {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/Recents.mjs +0 -0
  26. {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/SettingsDialog.mjs +0 -0
  27. {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/Sidebar.mjs +0 -0
  28. {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/SignIn.mjs +0 -0
  29. {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/SystemPromptEditor.mjs +0 -0
  30. {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/SystemPromptSelector.mjs +0 -0
  31. {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/Welcome.mjs +0 -0
  32. {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/app.css +0 -0
  33. {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/fav.svg +0 -0
  34. {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/lib/chart.js +0 -0
  35. {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/lib/charts.mjs +0 -0
  36. {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/lib/color.js +0 -0
  37. {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/lib/highlight.min.mjs +0 -0
  38. {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/lib/idb.min.mjs +0 -0
  39. {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/lib/marked.min.mjs +0 -0
  40. {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/lib/servicestack-client.mjs +0 -0
  41. {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/lib/vue-router.min.mjs +0 -0
  42. {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/lib/vue.min.mjs +0 -0
  43. {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/lib/vue.mjs +0 -0
  44. {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/markdown.mjs +0 -0
  45. {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/tailwind.input.css +0 -0
  46. {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/threadStore.mjs +0 -0
  47. {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/typography.css +0 -0
  48. {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui/utils.mjs +0 -0
  49. {llms_py-2.0.31 → llms_py-2.0.33}/llms/ui.json +0 -0
  50. {llms_py-2.0.31 → llms_py-2.0.33}/llms_py.egg-info/SOURCES.txt +0 -0
  51. {llms_py-2.0.31 → llms_py-2.0.33}/llms_py.egg-info/dependency_links.txt +0 -0
  52. {llms_py-2.0.31 → llms_py-2.0.33}/llms_py.egg-info/entry_points.txt +0 -0
  53. {llms_py-2.0.31 → llms_py-2.0.33}/llms_py.egg-info/not-zip-safe +0 -0
  54. {llms_py-2.0.31 → llms_py-2.0.33}/llms_py.egg-info/requires.txt +0 -0
  55. {llms_py-2.0.31 → llms_py-2.0.33}/llms_py.egg-info/top_level.txt +0 -0
  56. {llms_py-2.0.31 → llms_py-2.0.33}/requirements.txt +0 -0
  57. {llms_py-2.0.31 → llms_py-2.0.33}/setup.cfg +0 -0
@@ -1,6 +1,6 @@
 Metadata-Version: 2.4
 Name: llms-py
-Version: 2.0.31
+Version: 2.0.33
 Summary: A lightweight CLI tool and OpenAI-compatible server for querying multiple Large Language Model (LLM) providers
 Home-page: https://github.com/ServiceStack/llms
 Author: ServiceStack
@@ -54,6 +54,7 @@ Configure additional providers and models in [llms.json](llms/llms.json)
 - **Multi-Provider Support**: OpenRouter, Ollama, Anthropic, Google, OpenAI, Grok, Groq, Qwen, Z.ai, Mistral
 - **OpenAI-Compatible API**: Works with any client that supports OpenAI's chat completion API
 - **Built-in Analytics**: Built-in analytics UI to visualize costs, requests, and token usage
+- **GitHub OAuth**: Optionally secure your web UI and restrict access to specified GitHub users
 - **Configuration Management**: Easy provider enable/disable and configuration management
 - **CLI Interface**: Simple command-line interface for quick interactions
 - **Server Mode**: Run an OpenAI-compatible HTTP server at `http://localhost:{PORT}/v1/chat/completions`
@@ -275,7 +276,10 @@ See [GITHUB_OAUTH_SETUP.md](GITHUB_OAUTH_SETUP.md) for detailed setup instructio
 
 ## Configuration
 
-The configuration file [llms.json](llms/llms.json) is saved to `~/.llms/llms.json` and defines available providers, models, and default settings. Key sections:
+The configuration file [llms.json](llms/llms.json) is saved to `~/.llms/llms.json` and defines available providers, models, and default settings. If it doesn't exist, `llms.json` is auto-created with the latest
+configuration, so you can re-create it by deleting your local config (e.g. `rm -rf ~/.llms`).
+
+Key sections:
 
 ### Defaults
 - `headers`: Common HTTP headers for all requests
@@ -288,7 +292,6 @@ The configuration file [llms.json](llms/llms.json) is saved to `~/.llms/llms.jso
 - `convert`: Max image size and length limits and auto conversion settings
 
 ### Providers
-
 Each provider configuration includes:
 - `enabled`: Whether the provider is active
 - `type`: Provider class (OpenAiProvider, GoogleProvider, etc.)
@@ -14,6 +14,7 @@ Configure additional providers and models in [llms.json](llms/llms.json)
 - **Multi-Provider Support**: OpenRouter, Ollama, Anthropic, Google, OpenAI, Grok, Groq, Qwen, Z.ai, Mistral
 - **OpenAI-Compatible API**: Works with any client that supports OpenAI's chat completion API
 - **Built-in Analytics**: Built-in analytics UI to visualize costs, requests, and token usage
+- **GitHub OAuth**: Optionally secure your web UI and restrict access to specified GitHub users
 - **Configuration Management**: Easy provider enable/disable and configuration management
 - **CLI Interface**: Simple command-line interface for quick interactions
 - **Server Mode**: Run an OpenAI-compatible HTTP server at `http://localhost:{PORT}/v1/chat/completions`
@@ -235,7 +236,10 @@ See [GITHUB_OAUTH_SETUP.md](GITHUB_OAUTH_SETUP.md) for detailed setup instructio
 
 ## Configuration
 
-The configuration file [llms.json](llms/llms.json) is saved to `~/.llms/llms.json` and defines available providers, models, and default settings. Key sections:
+The configuration file [llms.json](llms/llms.json) is saved to `~/.llms/llms.json` and defines available providers, models, and default settings. If it doesn't exist, `llms.json` is auto-created with the latest
+configuration, so you can re-create it by deleting your local config (e.g. `rm -rf ~/.llms`).
+
+Key sections:
 
 ### Defaults
 - `headers`: Common HTTP headers for all requests
@@ -248,7 +252,6 @@ The configuration file [llms.json](llms/llms.json) is saved to `~/.llms/llms.jso
 - `convert`: Max image size and length limits and auto conversion settings
 
 ### Providers
-
 Each provider configuration includes:
 - `enabled`: Whether the provider is active
 - `type`: Provider class (OpenAiProvider, GoogleProvider, etc.)