llms-py 2.0.31__py3-none-any.whl → 2.0.32__py3-none-any.whl

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -1,6 +1,6 @@
  Metadata-Version: 2.4
  Name: llms-py
- Version: 2.0.31
+ Version: 2.0.32
  Summary: A lightweight CLI tool and OpenAI-compatible server for querying multiple Large Language Model (LLM) providers
  Home-page: https://github.com/ServiceStack/llms
  Author: ServiceStack
@@ -54,6 +54,7 @@ Configure additional providers and models in [llms.json](llms/llms.json)
  - **Multi-Provider Support**: OpenRouter, Ollama, Anthropic, Google, OpenAI, Grok, Groq, Qwen, Z.ai, Mistral
  - **OpenAI-Compatible API**: Works with any client that supports OpenAI's chat completion API
  - **Built-in Analytics**: Built-in analytics UI to visualize costs, requests, and token usage
+ - **GitHub OAuth**: Optionally Secure your web UI and restrict access to specified GitHub Users
  - **Configuration Management**: Easy provider enable/disable and configuration management
  - **CLI Interface**: Simple command-line interface for quick interactions
  - **Server Mode**: Run an OpenAI-compatible HTTP server at `http://localhost:{PORT}/v1/chat/completions`
@@ -275,7 +276,10 @@ See [GITHUB_OAUTH_SETUP.md](GITHUB_OAUTH_SETUP.md) for detailed setup instructio

  ## Configuration

- The configuration file [llms.json](llms/llms.json) is saved to `~/.llms/llms.json` and defines available providers, models, and default settings. Key sections:
+ The configuration file [llms.json](llms/llms.json) is saved to `~/.llms/llms.json` and defines available providers, models, and default settings. If it doesn't exist, `llms.json` is auto created with the latest
+ configuration, so you can re-create it by deleting your local config (e.g. `rm -rf ~/.llms`).
+
+ Key sections:

  ### Defaults
  - `headers`: Common HTTP headers for all requests
@@ -288,7 +292,6 @@ The configuration file [llms.json](llms/llms.json) is saved to `~/.llms/llms.jso
  - `convert`: Max image size and length limits and auto conversion settings

  ### Providers
-
  Each provider configuration includes:
  - `enabled`: Whether the provider is active
  - `type`: Provider class (OpenAiProvider, GoogleProvider, etc.)
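For context on the **Server Mode** feature referenced in the README diff above (this note is not part of the package diff itself): a minimal sketch of calling the OpenAI-compatible endpoint with only the Python standard library. The port (`8000`) and model id (`"gpt-4o-mini"`) are assumptions; substitute whatever your llms-py server is configured to listen on and serve.

```python
# Minimal sketch: POST a chat completion request to an OpenAI-compatible server.
# Port and model id below are assumptions, not values from the package docs.
import json
import urllib.request

url = "http://localhost:8000/v1/chat/completions"  # {PORT} assumed to be 8000
payload = {
    "model": "gpt-4o-mini",  # assumed model id; use one enabled in ~/.llms/llms.json
    "messages": [{"role": "user", "content": "Hello from llms-py"}],
}
req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    body = json.load(resp)
    # OpenAI-compatible responses put the reply under choices[0].message.content
    print(body["choices"][0]["message"]["content"])
```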
@@ -2,7 +2,7 @@ llms/__init__.py,sha256=Mk6eHi13yoUxLlzhwfZ6A1IjsfSQt9ShhOdbLXTvffU,53
  llms/__main__.py,sha256=hrBulHIt3lmPm1BCyAEVtB6DQ0Hvc3gnIddhHCmJasg,151
  llms/index.html,sha256=_pkjdzCX95HTf19LgE4gMh6tLittcnf7M_jL2hSEbbM,3250
  llms/llms.json,sha256=4y4BbQcwul4HdUU9LfZOJWyUjTt0nFzGkT_Mlyd7dVU,31945
- llms/main.py,sha256=6a1xJVL5eS8V0Ndns4o771CYxFfjj8JPByUnT8s60qg,90865
+ llms/main.py,sha256=z-lqetSCyTfhIANICBM3KrzUs_xSc427KohopzpklHA,90865
  llms/ui.json,sha256=iBOmpNeD5-o8AgUa51ymS-KemovJ7bm9J1fnL0nf8jk,134025
  llms/ui/Analytics.mjs,sha256=LfWbUlpb__0EEYtHu6e4r8AeyhsNQeAxrg44RuNSR0M,73261
  llms/ui/App.mjs,sha256=L8Zn7b7YVqR5jgVQKvo_txijSd1T7jq6QOQEt7Q0eB0,3811
@@ -21,7 +21,7 @@ llms/ui/SignIn.mjs,sha256=df3b-7L3ZIneDGbJWUk93K9RGo40gVeuR5StzT1ZH9g,2324
  llms/ui/SystemPromptEditor.mjs,sha256=PffkNPV6hGbm1QZBKPI7yvWPZSBL7qla0d-JEJ4mxYo,1466
  llms/ui/SystemPromptSelector.mjs,sha256=UgoeuscFes0B1oFkx74dFwC0JgRib37VM4Gy3-kCVDQ,3769
  llms/ui/Welcome.mjs,sha256=r9j7unF9CF3k7gEQBMRMVsa2oSjgHGNn46Oa5l5BwlY,950
- llms/ui/ai.mjs,sha256=KYxbKkcdyJZuaSfzgAPqsC81CiviAtbXZwyzmKYoIk4,4849
+ llms/ui/ai.mjs,sha256=yJTZUqDgbiAwhR_BtNTEw_5ef8b0pmN1S19HtbBG4HI,4849
  llms/ui/app.css,sha256=m6wR6XCzJWbUs0K_MDyGbcnxsWOu2Q58nGpAL646kio,111026
  llms/ui/fav.svg,sha256=_R6MFeXl6wBFT0lqcUxYQIDWgm246YH_3hSTW0oO8qw,734
  llms/ui/markdown.mjs,sha256=uWSyBZZ8a76Dkt53q6CJzxg7Gkx7uayX089td3Srv8w,6388
@@ -36,13 +36,13 @@ llms/ui/lib/highlight.min.mjs,sha256=sG7wq8bF-IKkfie7S4QSyh5DdHBRf0NqQxMOEH8-MT0
  llms/ui/lib/idb.min.mjs,sha256=CeTXyV4I_pB5vnibvJuyXdMs0iVF2ZL0Z7cdm3w_QaI,3853
  llms/ui/lib/marked.min.mjs,sha256=QRHb_VZugcBJRD2EP6gYlVFEsJw5C2fQ8ImMf_pA2_s,39488
  llms/ui/lib/servicestack-client.mjs,sha256=UVafVbzhJ_0N2lzv7rlzIbzwnWpoqXxGk3N3FSKgOOc,54534
- llms/ui/lib/servicestack-vue.mjs,sha256=EU3cnlQuTzsmPvoK50JFN98t4AO80vVNA-CS2kaS0nI,215162
+ llms/ui/lib/servicestack-vue.mjs,sha256=n1auUBv4nAdEwZoDiPBRSMizH6M9GP_5b8QSDCR-3wI,215269
  llms/ui/lib/vue-router.min.mjs,sha256=fR30GHoXI1u81zyZ26YEU105pZgbbAKSXbpnzFKIxls,30418
  llms/ui/lib/vue.min.mjs,sha256=iXh97m5hotl0eFllb3aoasQTImvp7mQoRJ_0HoxmZkw,163811
  llms/ui/lib/vue.mjs,sha256=dS8LKOG01t9CvZ04i0tbFXHqFXOO_Ha4NmM3BytjQAs,537071
- llms_py-2.0.31.dist-info/licenses/LICENSE,sha256=bus9cuAOWeYqBk2OuhSABVV1P4z7hgrEFISpyda_H5w,1532
- llms_py-2.0.31.dist-info/METADATA,sha256=gKqXFGqKGvYDYKOwgGlysr6CtNTPNudzCJOOjZ2LpMs,37813
- llms_py-2.0.31.dist-info/WHEEL,sha256=_zCd3N1l69ArxyTb8rzEoP9TpbYXkqRFSNOD5OuxnTs,91
- llms_py-2.0.31.dist-info/entry_points.txt,sha256=WswyE7PfnkZMIxboC-MS6flBD6wm-CYU7JSUnMhqMfM,40
- llms_py-2.0.31.dist-info/top_level.txt,sha256=gC7hk9BKSeog8gyg-EM_g2gxm1mKHwFRfK-10BxOsa4,5
- llms_py-2.0.31.dist-info/RECORD,,
+ llms_py-2.0.32.dist-info/licenses/LICENSE,sha256=bus9cuAOWeYqBk2OuhSABVV1P4z7hgrEFISpyda_H5w,1532
+ llms_py-2.0.32.dist-info/METADATA,sha256=kEdwuKJiSHH-J1RIHldQp9w7j42tAQhmeoeczw6YL7A,38070
+ llms_py-2.0.32.dist-info/WHEEL,sha256=_zCd3N1l69ArxyTb8rzEoP9TpbYXkqRFSNOD5OuxnTs,91
+ llms_py-2.0.32.dist-info/entry_points.txt,sha256=WswyE7PfnkZMIxboC-MS6flBD6wm-CYU7JSUnMhqMfM,40
+ llms_py-2.0.32.dist-info/top_level.txt,sha256=gC7hk9BKSeog8gyg-EM_g2gxm1mKHwFRfK-10BxOsa4,5
+ llms_py-2.0.32.dist-info/RECORD,,
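
The RECORD entries above pair each file path with a sha256 digest (urlsafe base64, padding stripped) and a size in bytes, per the standard wheel RECORD convention. A minimal sketch, assuming that convention, for reproducing the digest of a file extracted from the wheel; the example path is illustrative.

```python
# Sketch: reproduce the digest format used in wheel RECORD entries
# (sha256 digest, urlsafe base64-encoded, trailing '=' padding stripped).
import base64
import hashlib

def record_digest(path: str) -> str:
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).digest()
    return "sha256=" + base64.urlsafe_b64encode(digest).rstrip(b"=").decode("ascii")

# e.g. record_digest("llms/main.py") on the extracted 2.0.32 wheel should match
# the RECORD line above: sha256=z-lqetSCyTfhIANICBM3KrzUs_xSc427KohopzpklHA
print(record_digest("llms/main.py"))  # adjust the path to where you extracted the wheel
```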