kalavai-client 0.6.13.tar.gz → 0.6.16.tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (27)
  1. {kalavai_client-0.6.13 → kalavai_client-0.6.16}/PKG-INFO +23 -53
  2. {kalavai_client-0.6.13 → kalavai_client-0.6.16}/README.md +22 -52
  3. kalavai_client-0.6.16/kalavai_client/__init__.py +2 -0
  4. {kalavai_client-0.6.13 → kalavai_client-0.6.16}/kalavai_client/assets/apps.yaml +1 -1
  5. {kalavai_client-0.6.13 → kalavai_client-0.6.16}/kalavai_client/assets/docker-compose-gui.yaml +1 -0
  6. {kalavai_client-0.6.13 → kalavai_client-0.6.16}/kalavai_client/assets/docker-compose-template.yaml +6 -1
  7. kalavai_client-0.6.16/kalavai_client/bridge_api.py +513 -0
  8. kalavai_client-0.6.16/kalavai_client/bridge_models.py +51 -0
  9. {kalavai_client-0.6.13 → kalavai_client-0.6.16}/kalavai_client/cli.py +6 -3
  10. {kalavai_client-0.6.13 → kalavai_client-0.6.16}/kalavai_client/core.py +13 -3
  11. {kalavai_client-0.6.13 → kalavai_client-0.6.16}/kalavai_client/utils.py +3 -1
  12. {kalavai_client-0.6.13 → kalavai_client-0.6.16}/pyproject.toml +1 -1
  13. kalavai_client-0.6.13/kalavai_client/__init__.py +0 -2
  14. kalavai_client-0.6.13/kalavai_client/bridge_api.py +0 -276
  15. kalavai_client-0.6.13/kalavai_client/bridge_models.py +0 -53
  16. {kalavai_client-0.6.13 → kalavai_client-0.6.16}/LICENSE +0 -0
  17. {kalavai_client-0.6.13 → kalavai_client-0.6.16}/kalavai_client/__main__.py +0 -0
  18. {kalavai_client-0.6.13 → kalavai_client-0.6.16}/kalavai_client/assets/__init__.py +0 -0
  19. {kalavai_client-0.6.13 → kalavai_client-0.6.16}/kalavai_client/assets/apps_values.yaml +0 -0
  20. {kalavai_client-0.6.13 → kalavai_client-0.6.16}/kalavai_client/assets/nginx.conf +0 -0
  21. {kalavai_client-0.6.13 → kalavai_client-0.6.16}/kalavai_client/assets/pool_config_template.yaml +0 -0
  22. {kalavai_client-0.6.13 → kalavai_client-0.6.16}/kalavai_client/assets/pool_config_values.yaml +0 -0
  23. {kalavai_client-0.6.13 → kalavai_client-0.6.16}/kalavai_client/assets/user_workspace.yaml +0 -0
  24. {kalavai_client-0.6.13 → kalavai_client-0.6.16}/kalavai_client/assets/user_workspace_values.yaml +0 -0
  25. {kalavai_client-0.6.13 → kalavai_client-0.6.16}/kalavai_client/auth.py +0 -0
  26. {kalavai_client-0.6.13 → kalavai_client-0.6.16}/kalavai_client/cluster.py +0 -0
  27. {kalavai_client-0.6.13 → kalavai_client-0.6.16}/kalavai_client/env.py +0 -0
PKG-INFO
@@ -1,6 +1,6 @@
  Metadata-Version: 2.3
  Name: kalavai-client
- Version: 0.6.13
+ Version: 0.6.16
  Summary: Client app for kalavai platform
  License: Apache-2.0
  Keywords: LLM,platform
@@ -52,30 +52,28 @@ Description-Content-Type: text/markdown

  </div>

- ⭐⭐⭐ **Kalavai and our AI pools are open source, and free to use in both commercial and non-commercial purposes. If you find it useful, consider supporting us by [giving a star to our GitHub project](https://github.com/kalavai-net/kalavai-client), joining our [discord channel](https://discord.gg/YN6ThTJKbM), follow our [Substack](https://kalavainet.substack.com/) and give us a [review on Product Hunt](https://www.producthunt.com/products/kalavai/reviews/new).**
+ ⭐⭐⭐ **Kalavai platform is open source, and free to use in both commercial and non-commercial purposes. If you find it useful, consider supporting us by [giving a star to our GitHub project](https://github.com/kalavai-net/kalavai-client), joining our [discord channel](https://discord.gg/YN6ThTJKbM) and follow our [Substack](https://kalavainet.substack.com/).**


  # Kalavai: turn your devices into a scalable AI platform

- ### Taming the adoption of Large Language Models
+ > AI in the cloud is not aligned with you, it's aligned with the company that owns it. Make sure you own your AI

- > Kalavai is an **open source** tool that turns **everyday devices** into your very own LLM platform. It aggregates resources from multiple machines, including desktops and laptops, and is **compatible with most model engines** to make LLM deployment and orchestration simple and reliable.
+ ### Taming the adoption of self-hosted GenAI

- <div align="center">
-
- <a href="https://www.producthunt.com/products/kalavai/reviews?utm_source=badge-product_review&utm_medium=badge&utm_souce=badge-kalavai" target="_blank"><img src="https://api.producthunt.com/widgets/embed-image/v1/product_review.svg?product_id=720725&theme=neutral" alt="Kalavai - The&#0032;first&#0032;platform&#0032;to&#0032;crowdsource&#0032;AI&#0032;computation | Product Hunt" style="width: 250px; height: 54px;" width="250" height="54" /></a>
-
- </div>
+ Kalavai is an **open source** tool that turns **any devices** into a self-hosted AI platform. It aggregates resources from multiple machines, including cloud, on prem and personal computers, and is **compatible with most model engines** to make model deployment and orchestration simple and reliable.


  ## What can Kalavai do?

- Kalavai's goal is to make using AI (LLMs, AI agents) in real applications accessible and affordable to all. It's a _magic box_ that **integrates all the components required to make AI useful in the age of massive computing**, from model deployment and orchestration to Agentic AI.
+ Kalavai's goal is to make using self-hosted AI (GenAI models and agents) in real applications accessible and affordable to all. It's a tool that transforms machines into a _magic box_ that **integrates all the components required to make AI useful in the age of massive computing**, from model deployment and orchestration to Agentic AI.

  ### Core features

  - Manage **multiple devices resources as one**. One pool of RAM, CPUs and GPUs
- - **Deploy Large Language Models seamlessly across devices**, wherever they are (multiple clouds, on premises, personal devices)
+ - **Deploy open source models seamlessly across devices**, wherever they are (cloud, on premises, personal devices)
+ - Beyond LLMs: not just for large language models, but text-to-speech, speech-to-text, image understanding, coding generation and embedding models.
+ - The hybrid dream: build on your laptop, move to the cloud (any!) with zero changes
  - Auto-discovery: all **models are automatically exposed** through a single OpenAI-like API and a ChatGPT-like UI playground
  - Compatible with [most popular model engines](#support-for-llm-engines)
  - [Easy to expand](https://github.com/kalavai-net/kube-watcher/tree/main/templates) to custom workloads
@@ -83,19 +81,19 @@ Kalavai's goal is to make using AI (LLMs, AI agents) in real applications access

  <details>

- **<summary>Video tutorials</summary>**
+ **<summary>Powered by Kalavai</summary>**

- ### Self-hosted LLM pools
-
- https://github.com/user-attachments/assets/0d2316f3-79ea-46ac-b41e-8ef720f52672
+ - [CoGen AI](https://cogenai.kalavai.net): A community hosted alternative to OpenAI API for unlimited inference.
+ - [Create your own Free Cursor/Windsurf Clone](https://www.youtube.com/watch?v=6zHSo7oeCDQ&t=21s)


  </details>

+
  ### Latest updates

+ - 11 June 2025: Native support for Mac and Raspberry pi devices (ARM).
  - 20 February 2025: New shiny GUI interface to control LLM pools and deploy models
- - 6 February 2025: 🔥🔥🔥 Access **DeepSeek R1 model for free** when you join our [public LLM pool](https://kalavai-net.github.io/kalavai-client/public_llm_pool/)
  - 31 January 2025: `kalavai-client` is now a [PyPI package](https://pypi.org/project/kalavai-client/), easier to install than ever!
  <details>
  <summary>More news</summary>
@@ -148,8 +146,6 @@ The `kalavai-client` is the main tool to interact with the Kalavai platform, to

  <summary>Requirements</summary>

- ### Requirements
-
  For workers sharing resources with the pool:

  - A laptop, desktop or Virtual Machine
@@ -157,37 +153,8 @@ For workers sharing resources with the pool:

  > **Support for Windows and MacOS workers is experimental**: kalavai workers run on docker containers that require access to the host network interfaces, thus systems that do not support containers natively (Windows and MacOS) may have difficulties finding each other.

- Any system that runs python 3.6+ is able to run the `kalavai-client` and therefore connect and operate an LLM pool, [without sharing with the pool](). Your computer won't be adding its capacity to the pool, but it wil be able to deploy jobs and interact with models.
-
  </details>

- <details>
-
- <summary> Common issues</summary>
-
- If you see the following error:
-
- ```bash
- fatal error: Python.h: No such file or directory | #include <Python.h>
- ```
-
- Make sure you also install python3-dev package. For ubuntu distros:
-
- ```bash
- sudo apt install python3-dev
- ```
-
- If you see:
- ```bash
- AttributeError: install_layout. Did you mean: 'install_platlib'?
- [end of output]
- ```
-
- Upgrade your setuptools:
- ```bash
- pip install -U setuptools
- ```
- </details>

  ### Install the client

@@ -230,6 +197,8 @@ If your system is not currently supported, [open an issue](https://github.com/ka

  ### OS compatibility

+ Currently **seed nodes** are supported exclusively on linux machines (x86_64 platform). However Kalavai supports mix-pools, i.e. having Windows and MacOS computers as workers.
+
  Since **worker nodes** run inside docker, any machine that can run docker **should** be compatible with Kalavai. Here are instructions for [linux](https://docs.docker.com/engine/install/), [Windows](https://docs.docker.com/desktop/setup/install/windows-install/) and [MacOS](https://docs.docker.com/desktop/setup/install/mac-install/).

  The kalavai client, which controls and access pools, can be installed on any machine that has python 3.10+.
@@ -237,9 +206,10 @@ The kalavai client, which controls and access pools, can be installed on any mac

  ### Hardware compatibility:

- - `amd64` or `x86_64` CPU architecture
+ - `amd64` or `x86_64` CPU architecture for seed and worker nodes.
+ - `arm64` CPU architecture for worker nodes.
  - NVIDIA GPU
- - AMD and Intel GPUs are currently not supported ([interested in helping us test it?](https://kalavai-net.github.io/kalavai-client/compatibility/#help-testing-amd-gpus))
+ - Mac M series, AMD and Intel GPUs are currently not supported ([interested in helping us test it?](https://kalavai-net.github.io/kalavai-client/compatibility/#help-testing-amd-gpus))

  </details>

@@ -247,15 +217,15 @@ The kalavai client, which controls and access pools, can be installed on any mac

  - [x] Kalavai client on Linux
  - [x] [TEMPLATE] Distributed LLM deployment
- - [x] Kalavai client on Windows (with WSL2)
+ - [x] Kalavai client on Windows (worker only)
+ - [x] Kalavai client on Windows WSL2 (seed and worker)
  - [x] Self-hosted LLM pools
  - [x] Collaborative LLM deployment
  - [x] Ray cluster support
- - [x] Kalavai client on Mac
+ - [x] Kalavai client on Mac (worker only)
  - [x] Kalavai pools UI
- - [ ] [TEMPLATE] [GPUStack](https://github.com/gpustack/gpustack) support
- - [ ] [TEMPLATE] [exo](https://github.com/exo-explore/exo) support
  - [ ] Support for AMD GPUs
+ - [ ] Support for Mac M GPUs
  - [x] Docker install path


README.md
@@ -7,30 +7,28 @@

  </div>

- ⭐⭐⭐ **Kalavai and our AI pools are open source, and free to use in both commercial and non-commercial purposes. If you find it useful, consider supporting us by [giving a star to our GitHub project](https://github.com/kalavai-net/kalavai-client), joining our [discord channel](https://discord.gg/YN6ThTJKbM), follow our [Substack](https://kalavainet.substack.com/) and give us a [review on Product Hunt](https://www.producthunt.com/products/kalavai/reviews/new).**
+ ⭐⭐⭐ **Kalavai platform is open source, and free to use in both commercial and non-commercial purposes. If you find it useful, consider supporting us by [giving a star to our GitHub project](https://github.com/kalavai-net/kalavai-client), joining our [discord channel](https://discord.gg/YN6ThTJKbM) and follow our [Substack](https://kalavainet.substack.com/).**


  # Kalavai: turn your devices into a scalable AI platform

- ### Taming the adoption of Large Language Models
+ > AI in the cloud is not aligned with you, it's aligned with the company that owns it. Make sure you own your AI

- > Kalavai is an **open source** tool that turns **everyday devices** into your very own LLM platform. It aggregates resources from multiple machines, including desktops and laptops, and is **compatible with most model engines** to make LLM deployment and orchestration simple and reliable.
+ ### Taming the adoption of self-hosted GenAI

- <div align="center">
-
- <a href="https://www.producthunt.com/products/kalavai/reviews?utm_source=badge-product_review&utm_medium=badge&utm_souce=badge-kalavai" target="_blank"><img src="https://api.producthunt.com/widgets/embed-image/v1/product_review.svg?product_id=720725&theme=neutral" alt="Kalavai - The&#0032;first&#0032;platform&#0032;to&#0032;crowdsource&#0032;AI&#0032;computation | Product Hunt" style="width: 250px; height: 54px;" width="250" height="54" /></a>
-
- </div>
+ Kalavai is an **open source** tool that turns **any devices** into a self-hosted AI platform. It aggregates resources from multiple machines, including cloud, on prem and personal computers, and is **compatible with most model engines** to make model deployment and orchestration simple and reliable.


  ## What can Kalavai do?

- Kalavai's goal is to make using AI (LLMs, AI agents) in real applications accessible and affordable to all. It's a _magic box_ that **integrates all the components required to make AI useful in the age of massive computing**, from model deployment and orchestration to Agentic AI.
+ Kalavai's goal is to make using self-hosted AI (GenAI models and agents) in real applications accessible and affordable to all. It's a tool that transforms machines into a _magic box_ that **integrates all the components required to make AI useful in the age of massive computing**, from model deployment and orchestration to Agentic AI.

  ### Core features

  - Manage **multiple devices resources as one**. One pool of RAM, CPUs and GPUs
- - **Deploy Large Language Models seamlessly across devices**, wherever they are (multiple clouds, on premises, personal devices)
+ - **Deploy open source models seamlessly across devices**, wherever they are (cloud, on premises, personal devices)
+ - Beyond LLMs: not just for large language models, but text-to-speech, speech-to-text, image understanding, coding generation and embedding models.
+ - The hybrid dream: build on your laptop, move to the cloud (any!) with zero changes
  - Auto-discovery: all **models are automatically exposed** through a single OpenAI-like API and a ChatGPT-like UI playground
  - Compatible with [most popular model engines](#support-for-llm-engines)
  - [Easy to expand](https://github.com/kalavai-net/kube-watcher/tree/main/templates) to custom workloads
@@ -38,19 +36,19 @@ Kalavai's goal is to make using AI (LLMs, AI agents) in real applications access

  <details>

- **<summary>Video tutorials</summary>**
+ **<summary>Powered by Kalavai</summary>**

- ### Self-hosted LLM pools
-
- https://github.com/user-attachments/assets/0d2316f3-79ea-46ac-b41e-8ef720f52672
+ - [CoGen AI](https://cogenai.kalavai.net): A community hosted alternative to OpenAI API for unlimited inference.
+ - [Create your own Free Cursor/Windsurf Clone](https://www.youtube.com/watch?v=6zHSo7oeCDQ&t=21s)


  </details>

+
  ### Latest updates

+ - 11 June 2025: Native support for Mac and Raspberry pi devices (ARM).
  - 20 February 2025: New shiny GUI interface to control LLM pools and deploy models
- - 6 February 2025: 🔥🔥🔥 Access **DeepSeek R1 model for free** when you join our [public LLM pool](https://kalavai-net.github.io/kalavai-client/public_llm_pool/)
  - 31 January 2025: `kalavai-client` is now a [PyPI package](https://pypi.org/project/kalavai-client/), easier to install than ever!
  <details>
  <summary>More news</summary>
@@ -103,8 +101,6 @@ The `kalavai-client` is the main tool to interact with the Kalavai platform, to

  <summary>Requirements</summary>

- ### Requirements
-
  For workers sharing resources with the pool:

  - A laptop, desktop or Virtual Machine
@@ -112,37 +108,8 @@ For workers sharing resources with the pool:

  > **Support for Windows and MacOS workers is experimental**: kalavai workers run on docker containers that require access to the host network interfaces, thus systems that do not support containers natively (Windows and MacOS) may have difficulties finding each other.

- Any system that runs python 3.6+ is able to run the `kalavai-client` and therefore connect and operate an LLM pool, [without sharing with the pool](). Your computer won't be adding its capacity to the pool, but it wil be able to deploy jobs and interact with models.
-
  </details>

- <details>
-
- <summary> Common issues</summary>
-
- If you see the following error:
-
- ```bash
- fatal error: Python.h: No such file or directory | #include <Python.h>
- ```
-
- Make sure you also install python3-dev package. For ubuntu distros:
-
- ```bash
- sudo apt install python3-dev
- ```
-
- If you see:
- ```bash
- AttributeError: install_layout. Did you mean: 'install_platlib'?
- [end of output]
- ```
-
- Upgrade your setuptools:
- ```bash
- pip install -U setuptools
- ```
- </details>

  ### Install the client

@@ -185,6 +152,8 @@ If your system is not currently supported, [open an issue](https://github.com/ka

  ### OS compatibility

+ Currently **seed nodes** are supported exclusively on linux machines (x86_64 platform). However Kalavai supports mix-pools, i.e. having Windows and MacOS computers as workers.
+
  Since **worker nodes** run inside docker, any machine that can run docker **should** be compatible with Kalavai. Here are instructions for [linux](https://docs.docker.com/engine/install/), [Windows](https://docs.docker.com/desktop/setup/install/windows-install/) and [MacOS](https://docs.docker.com/desktop/setup/install/mac-install/).

  The kalavai client, which controls and access pools, can be installed on any machine that has python 3.10+.
@@ -192,9 +161,10 @@ The kalavai client, which controls and access pools, can be installed on any mac

  ### Hardware compatibility:

- - `amd64` or `x86_64` CPU architecture
+ - `amd64` or `x86_64` CPU architecture for seed and worker nodes.
+ - `arm64` CPU architecture for worker nodes.
  - NVIDIA GPU
- - AMD and Intel GPUs are currently not supported ([interested in helping us test it?](https://kalavai-net.github.io/kalavai-client/compatibility/#help-testing-amd-gpus))
+ - Mac M series, AMD and Intel GPUs are currently not supported ([interested in helping us test it?](https://kalavai-net.github.io/kalavai-client/compatibility/#help-testing-amd-gpus))

  </details>

@@ -202,15 +172,15 @@ The kalavai client, which controls and access pools, can be installed on any mac

  - [x] Kalavai client on Linux
  - [x] [TEMPLATE] Distributed LLM deployment
- - [x] Kalavai client on Windows (with WSL2)
+ - [x] Kalavai client on Windows (worker only)
+ - [x] Kalavai client on Windows WSL2 (seed and worker)
  - [x] Self-hosted LLM pools
  - [x] Collaborative LLM deployment
  - [x] Ray cluster support
- - [x] Kalavai client on Mac
+ - [x] Kalavai client on Mac (worker only)
  - [x] Kalavai pools UI
- - [ ] [TEMPLATE] [GPUStack](https://github.com/gpustack/gpustack) support
- - [ ] [TEMPLATE] [exo](https://github.com/exo-explore/exo) support
  - [ ] Support for AMD GPUs
+ - [ ] Support for Mac M GPUs
  - [x] Docker install path


kalavai_client/__init__.py (new file)
@@ -0,0 +1,2 @@
+
+ __version__ = "0.6.16"
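
The new `kalavai_client/__init__.py` pins the release number as a package-level attribute. A minimal sketch of reading it from Python, assuming the 0.6.16 release is installed in the current environment:

```python
# Reads the version string introduced in kalavai_client/__init__.py
# (assumes the kalavai-client package is installed).
from kalavai_client import __version__

print(__version__)  # expected: "0.6.16"
```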
kalavai_client/assets/apps.yaml
@@ -152,7 +152,7 @@ releases:
  - name: replicas
  value: 1
  - name: image_tag
- value: "v2025.05.2"
+ value: "v2025.06.7"
  - name: deployment.in_cluster
  value: "True"
  - name: deployment.kalavai_username_key
kalavai_client/assets/docker-compose-gui.yaml
@@ -2,6 +2,7 @@ services:
  kalavai_gui:
  container_name: kalavai_gui
  image: bundenth/kalavai-gui:latest
+ platform: linux/amd64
  extra_hosts:
  - "host.docker.internal:host-gateway"
  networks:
kalavai_client/assets/docker-compose-template.yaml
@@ -3,6 +3,7 @@ services:
  {{vpn_name}}:
  image: gravitl/netclient:v0.90.0
  container_name: {{vpn_name}}
+ platform: linux/amd64
  cap_add:
  - NET_ADMIN
  - SYS_MODULE
@@ -17,7 +18,8 @@ services:
  # run worker only if command is set
  {%if command %}
  {{service_name}}:
- image: docker.io/bundenth/kalavai-runner:gpu-latest
+ image: docker.io/bundenth/kalavai-runner:{{target_platform}}-latest
+ pull_policy: always
  container_name: {{service_name}}
  {% if vpn %}
  depends_on:
@@ -35,6 +37,9 @@ services:
  {% endif %}
  --node_name="{{node_name}}"
  --node_ip="{{node_ip_address}}"
+ {% if random_suffix %}
+ --random_suffix="{{random_suffix}}"
+ {% endif %}
  {% if command == "server" %}
  --port_range="30000-32767"
  {% else %}
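
The worker runner image is now selected per platform via the `{{target_platform}}` placeholder (replacing the hard-coded `gpu-latest` tag), and an optional `--random_suffix` flag is templated into the run command. The `{{ }}` / `{% %}` syntax suggests a Jinja-style renderer; a hypothetical sketch of how such a fragment could be filled in (the values below are illustrative, not defaults shipped by kalavai-client):

```python
# Illustrative only: render a fragment of the worker compose template with Jinja2.
# "arm64" and "abc123" are made-up example values, not taken from the package.
from jinja2 import Template

fragment = Template(
    'image: docker.io/bundenth/kalavai-runner:{{target_platform}}-latest\n'
    '{% if random_suffix %}--random_suffix="{{random_suffix}}"{% endif %}'
)
print(fragment.render(target_platform="arm64", random_suffix="abc123"))
# image: docker.io/bundenth/kalavai-runner:arm64-latest
# --random_suffix="abc123"
```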