kalavai-client: 0.6.19-py3-none-any.whl → 0.6.20-py3-none-any.whl

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -1,2 +1,2 @@
 
- __version__ = "0.6.19"
+ __version__ = "0.6.20"
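The only change in the package source itself is the version string bump above. A minimal sketch for confirming which release is installed locally, assuming nothing beyond the `__version__` attribute shown in this hunk and the standard packaging metadata:

```python
# Check the installed kalavai-client release.
# Relies only on the __version__ attribute changed in the hunk above.
from importlib.metadata import version

import kalavai_client

print(kalavai_client.__version__)   # expected "0.6.20" after the upgrade
print(version("kalavai-client"))    # same value, read from dist-info metadata
```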
@@ -154,7 +154,7 @@ releases:
  - name: replicas
    value: 1
  - name: image_tag
-   value: "v2025.07.31"
+   value: "v2025.07.33"
  - name: deployment.in_cluster
    value: "True"
  - name: deployment.kalavai_username_key
@@ -1,6 +1,6 @@
  Metadata-Version: 2.3
  Name: kalavai-client
- Version: 0.6.19
+ Version: 0.6.20
  Summary: Client app for kalavai platform
  License: Apache-2.0
  Keywords: LLM,platform
@@ -50,7 +50,7 @@ Description-Content-Type: text/markdown
  ⭐⭐⭐ **Kalavai platform is open source, and free to use in both commercial and non-commercial purposes. If you find it useful, consider supporting us by [giving a star to our GitHub project](https://github.com/kalavai-net/kalavai-client), joining our [discord channel](https://discord.gg/YN6ThTJKbM) and follow our [Substack](https://kalavainet.substack.com/).**


- # Kalavai: turn your devices into a scalable AI platform
+ # Kalavai: a platform to self-host AI on easy mode

  > AI in the cloud is not aligned with you, it's aligned with the company that owns it. Make sure you own your AI

@@ -61,15 +61,14 @@ Kalavai is an **open source** tool that turns **any devices** into a self-hosted

  ## What can Kalavai do?

- Kalavai's goal is to make using self-hosted AI (GenAI models and agents) in real applications accessible and affordable to all. It's a tool that transforms machines into a _magic box_ that **integrates all the components required to make AI useful in the age of massive computing**, from model deployment and orchestration to Agentic AI.
+ Kalavai's goal is to make using self-hosted AI (GenAI models and agents) in real applications accessible and affordable to all.

  ### Core features

- - Manage **multiple devices resources as one**. One pool of RAM, CPUs and GPUs
- - **Deploy open source models seamlessly across devices**, wherever they are (cloud, on premises, personal devices)
- - Beyond LLMs: not just for large language models, but text-to-speech, speech-to-text, image understanding, coding generation and embedding models.
- - The hybrid dream: build on your laptop, move to the cloud (any!) with zero changes
- - Auto-discovery: all **models are automatically exposed** through a single OpenAI-like API and a ChatGPT-like UI playground
+ - Manage **multiple devices resources as one**, wherever they come from (hybrid cloud, on prem, personal devices)
+ - **Deploy open source models seamlessly across devices**, with zero-cost migration
+ - Beyond LLMs: not just for large language models, but text-to-speech, speech-to-text, image generation, video understanding, coding generation and embedding models.
+ - Production-ready: **models are automatically exposed** through a single OpenAI-like API and a ChatGPT-like UI playground, with off-the-shelf monitoring and evaluation framework.
  - Compatible with [most popular model engines](#support-for-llm-engines)
  - [Easy to expand](https://github.com/kalavai-net/kube-watcher/tree/main/templates) to custom workloads

@@ -103,17 +102,24 @@ Kalavai's goal is to make using self-hosted AI (GenAI models and agents) in real

  </details>

- ### Support for LLM engines
+ ### Support for AI engines

- We currently support out of the box the following LLM engines:
+ We currently support out of the box the following AI engines:

- - [vLLM](https://docs.vllm.ai/en/latest/)
- - [llama.cpp](https://github.com/ggerganov/llama.cpp)
- - [Aphrodite Engine](https://github.com/aphrodite-engine/aphrodite-engine)
- - [Petals](https://github.com/bigscience-workshop/petals)
+ - [vLLM](https://docs.vllm.ai/en/latest/): most popular GPU-based model inference.
+ - [llama.cpp](https://github.com/ggerganov/llama.cpp): CPU-based GGUF model inference.
+ - [SGLang](https://github.com/sgl-project/sglang): Super fast GPU-based model inference.
+ - [n8n](https://n8n.io/): no-code workload automation framework.
+ - [Flowise](https://flowiseai.com/): no-code agentic AI workload framework.
+ - [Speaches](https://speaches.ai/): audio (speech-to-text and text-to-speech) model inference.
+ - [Langfuse](https://langfuse.com/): open source evaluation and monitoring GenAI framework.
+ - [OpenWebUI](https://docs.openwebui.com/): ChatGPT-like UI playground to interface with any models.

  Coming soon:

+ - [diffusers](https://huggingface.co/docs/diffusers/en/index)
+ - [Aphrodite Engine](https://github.com/aphrodite-engine/aphrodite-engine)
+ - [Petals](https://github.com/bigscience-workshop/petals)
  - [exo](https://github.com/exo-explore/exo)
  - [GPUstack](https://docs.gpustack.ai/0.4/overview/)
  - [RayServe](https://docs.ray.io/en/latest/serve/index.html)
@@ -141,12 +147,16 @@ The `kalavai-client` is the main tool to interact with the Kalavai platform, to

  <summary>Requirements</summary>

+ For seed nodes:
+ - A 64 bits x86 based Linux machine (laptop, desktop or VM)
+ - [Docker engine installed](https://docs.docker.com/engine/install/ubuntu/) with [privilege access](https://docs.docker.com/engine/containers/run/#runtime-privilege-and-linux-capabilities).
+
  For workers sharing resources with the pool:

- - A laptop, desktop or Virtual Machine
+ - A laptop, desktop or Virtual Machine (MacOS, Linux or Windows; ARM or x86)
+ - If self-hosting, workers should be on the same network as the seed node. Looking for over-the-internet connectivity? Check out our [managed seeds](https://platform.kalavai.net)
  - Docker engine installed (for [linux](https://docs.docker.com/engine/install/ubuntu/), [Windows and MacOS](https://docs.docker.com/desktop/)) with [privilege access](https://docs.docker.com/engine/containers/run/#runtime-privilege-and-linux-capabilities).

- > **Support for Windows and MacOS workers is experimental**: kalavai workers run on docker containers that require access to the host network interfaces, thus systems that do not support containers natively (Windows and MacOS) may have difficulties finding each other.

  </details>
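The updated requirements split seed nodes (x86 Linux) from workers (MacOS, Linux or Windows; ARM or x86), and both need Docker with privileged access. A hedged pre-flight sketch for a prospective node; it only shells out to the standard `docker` CLI and assumes nothing about Kalavai internals:

```python
# Rough environment check for a machine about to join a pool:
# verify the Docker CLI is on PATH and the daemon answers.
import shutil
import subprocess

if shutil.which("docker") is None:
    raise SystemExit("Docker CLI not found; install Docker engine first")

probe = subprocess.run(
    ["docker", "info", "--format", "{{.ServerVersion}}"],
    capture_output=True,
    text=True,
)
if probe.returncode != 0:
    raise SystemExit(f"Docker daemon not reachable: {probe.stderr.strip()}")
print(f"Docker daemon OK, server version {probe.stdout.strip()}")
```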
 
@@ -162,24 +172,24 @@ pip install kalavai-client

  ## Create a a local, private AI pool

- > Kalavai is **free to use, no caps, for both commercial and non-commercial purposes**. All you need to get started is one or more computers that can see each other (i.e. within the same network), and you are good to go. If you are interested in join computers in different locations / networks, [contact us](mailto:info@kalavai.net) or [book a demo](https://app.onecal.io/b/kalavai/book-a-demo) with the founders.
-
  You can create and manage your pools with the new kalavai GUI, which can be started with:

  ```bash
  kalavai gui start
  ```

- This will expose the GUI and the backend services in localhost. By default, the GUI is accessible via [http://localhost:3000](http://localhost:3000). In the UI users can create and join LLM pools, monitor devices, deploy LLMs and more.
+ This will expose the GUI and the backend services in localhost. By default, the GUI is accessible via [http://localhost:49153](http://localhost:49153). In the UI users can create and join AI pools, monitor devices, deploy LLMs and more.

  ![Kalavai logo](docs/docs/assets/images/ui_dashboard_multiple.png)

- Check out our [getting started guide](https://kalavai-net.github.io/kalavai-client/getting_started/) for next steps.
+ Check out our [getting started guide](https://kalavai-net.github.io/kalavai-client/getting_started/) for next steps on how to add more workers to your pool, or use our [managed seeds service](https://kalavai-net.github.io/kalavai-client/getting_started/#1b-managed-pools-create-a-seed) for over-the-internet AI pools.


  ## Enough already, let's run stuff!

- Check our [examples](examples/) to put your new AI pool to good use! For an end to end tour, check our [self-hosted](https://kalavai-net.github.io/kalavai-client/self_hosted_llm_pool/) and [public LLM pools](https://kalavai-net.github.io/kalavai-client/public_llm_pool/) guides.
+ For an end to end tour on building your own OpenAI-like service, check our [self-hosted](https://kalavai-net.github.io/kalavai-client/self_hosted_llm_pool/) guide.
+
+ Check our [examples](examples/) to put your new AI pool to good use!


  ## Compatibility matrix
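Note that the default GUI address in the README moves from http://localhost:3000 to http://localhost:49153. A small, hedged sketch to confirm the dashboard is up after running `kalavai gui start`; the port comes from the updated README, the HTTP probe itself is generic:

```python
# Probe the Kalavai GUI after `kalavai gui start`.
# Port 49153 is the new default documented in the README hunk above.
from urllib.error import URLError
from urllib.request import urlopen

GUI_URL = "http://localhost:49153"

try:
    with urlopen(GUI_URL, timeout=5) as response:
        print(f"GUI reachable at {GUI_URL} (HTTP {response.status})")
except URLError as exc:
    print(f"GUI not reachable at {GUI_URL}: {exc}")
```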
@@ -247,7 +257,7 @@ Anything missing here? Give us a shout in the [discussion board](https://github.

  <summary>Expand</summary>

- Python version >= 3.6.
+ Python version >= 3.10.

  ```bash
  sudo add-apt-repository ppa:deadsnakes/ppa
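The Python floor for the manual-install path moves from 3.6 to 3.10. A hedged pre-flight sketch (standard library only) that mirrors the new requirement before attempting `pip install kalavai-client`:

```python
# Fail fast if the interpreter is older than the 3.10 minimum
# stated in the updated README.
import sys

REQUIRED = (3, 10)

if sys.version_info < REQUIRED:
    raise SystemExit(
        f"kalavai-client 0.6.20 expects Python >= {REQUIRED[0]}.{REQUIRED[1]}, "
        f"found {sys.version.split()[0]}"
    )
print("Python version OK:", sys.version.split()[0])
```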
@@ -1,7 +1,7 @@
- kalavai_client/__init__.py,sha256=29RPUdF-Jn8Tqqu5Mk5Ci4E0On_MZIVUBqNZv5nyu0s,23
+ kalavai_client/__init__.py,sha256=sCg7mMwXeCTDe1p0b_ZZO5PQl1hoCmZ7zXqDv1j8bfY,23
  kalavai_client/__main__.py,sha256=WQUfxvRsBJH5gsCJg8pLz95QnZIj7Ol8psTO77m0QE0,73
  kalavai_client/assets/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
- kalavai_client/assets/apps.yaml,sha256=17JuXSv-Qj5Az6ZTRyiEaQXVbI325uTrZzKk2irts2g,6410
+ kalavai_client/assets/apps.yaml,sha256=63sO_MJcUcSap4Dt3ADZWc7dUGTiYW5eczqNEbDnMSA,6410
  kalavai_client/assets/apps_values.yaml,sha256=LeSNd3PwkIx0wkTIlEk2KNz3Yy4sXSaHALQEkopdhKE,2165
  kalavai_client/assets/docker-compose-gui.yaml,sha256=OAVO0ohaCpDB9FGeih0yAbVNwUfDtaCzssZ25uiuJyA,787
  kalavai_client/assets/docker-compose-template.yaml,sha256=KHIwJ2WWX7Y7wQKiXRr82Jqd3IKRyls5zhTyl8mSmrc,1805
@@ -18,8 +18,8 @@ kalavai_client/cluster.py,sha256=Z2PIXbZuSAv9xmw-MyZP1M41BpVMpirLzG51bqGA-zc,135
  kalavai_client/core.py,sha256=haNLna0TWzxmGx9cEhJjnV3r9YSOS3Fhtr4dt70LnwQ,35390
  kalavai_client/env.py,sha256=YsfZj7LWf6ABquDsoIFFkXCFYwenpDk8zVnGsf7qv98,2823
  kalavai_client/utils.py,sha256=5cUpMVsADF3JdDW0wbu-f38MURkhQz9pPngg0WxssJw,13460
- kalavai_client-0.6.19.dist-info/LICENSE,sha256=xx0jnfkXJvxRnG63LTGOxlggYnIysveWIZ6H3PNdCrQ,11357
- kalavai_client-0.6.19.dist-info/METADATA,sha256=CYVNusQKxd6KHa0UCx3QBDytEXGApHg2OrZd7O5LfIU,12393
- kalavai_client-0.6.19.dist-info/WHEEL,sha256=b4K_helf-jlQoXBBETfwnf4B04YC67LOev0jo4fX5m8,88
- kalavai_client-0.6.19.dist-info/entry_points.txt,sha256=9T6D45gxwzfVbglMm1r6XPdXuuZdHfy_7fCeu2jUphc,50
- kalavai_client-0.6.19.dist-info/RECORD,,
+ kalavai_client-0.6.20.dist-info/LICENSE,sha256=xx0jnfkXJvxRnG63LTGOxlggYnIysveWIZ6H3PNdCrQ,11357
+ kalavai_client-0.6.20.dist-info/METADATA,sha256=COeOSfLyxsuzCteQJZYYzT3lFp1Lxexpe84A2UNcVx4,12776
+ kalavai_client-0.6.20.dist-info/WHEEL,sha256=b4K_helf-jlQoXBBETfwnf4B04YC67LOev0jo4fX5m8,88
+ kalavai_client-0.6.20.dist-info/entry_points.txt,sha256=9T6D45gxwzfVbglMm1r6XPdXuuZdHfy_7fCeu2jUphc,50
+ kalavai_client-0.6.20.dist-info/RECORD,,
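The remaining differences are the routine dist-info renames and hash/size updates listed above. For readers who want to cross-check a downloaded wheel against this RECORD, here is a hedged sketch of the standard wheel digest convention (SHA-256, urlsafe base64, padding stripped); the file path is just one entry from the listing, not a Kalavai-specific API:

```python
# Recompute a RECORD-style entry (path,sha256=<urlsafe b64, no padding>,<size>)
# for a file unpacked from the wheel, to compare with the listing above.
import base64
import hashlib
from pathlib import Path


def record_entry(path: str) -> str:
    data = Path(path).read_bytes()
    digest = base64.urlsafe_b64encode(hashlib.sha256(data).digest()).rstrip(b"=")
    return f"{path},sha256={digest.decode()},{len(data)}"


# Example: should match the kalavai_client/__init__.py line in the new RECORD.
print(record_entry("kalavai_client/__init__.py"))
```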