devpy-cli 1.0.3__tar.gz → 1.0.4__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -1,6 +1,6 @@
  Metadata-Version: 2.4
  Name: devpy-cli
- Version: 1.0.3
+ Version: 1.0.4
  Summary: AI-powered DevOps CLI Assistant for local and remote Docker management
  Author-email: Eddy Ortega <atrox390@gmail.com>
  License: MIT
@@ -19,13 +19,15 @@ Requires-Dist: cryptography>=42.0.0
  Requires-Dist: rich>=13.7.0
  Requires-Dist: langchain>=0.1.0
  Requires-Dist: langchain-openai>=0.0.5
+ Requires-Dist: langchain-anthropic>=0.1.0
+ Requires-Dist: langchain-google-genai>=1.0.0
  Requires-Dist: langgraph>=0.0.10
  Requires-Dist: python-dotenv>=1.0.0
  Requires-Dist: psutil>=5.9.0

  # DevPy CLI

- An intelligent command-line assistant powered by LLM (DeepSeek/OpenAI) to manage Docker environments, both local and remote via SSH. Designed to simplify DevOps tasks with natural language, ensuring security and control.
+ An intelligent command-line assistant powered by multiple LLM providers (DeepSeek, OpenAI, Anthropic Claude, Google Gemini, Ollama/OpenWebUI) to manage Docker environments, both local and remote via SSH. Designed to simplify DevOps tasks with natural language, ensuring security and control.

  ## Key Features

@@ -331,13 +333,23 @@ DevPy CLI exposes a set of Docker-focused tools that the agent can call to fulfi
  - **LLM API Authentication**
  - The `.env` file created by the setup wizard stores:
  - `LLM` – which provider/adapter to use.
- - `<PROVIDER>_API_KEY` – the API key for that provider.
- - Optionally `LLM_BASE_URL` – custom base URL for compatible providers.
+ - `<PROVIDER>_API_KEY` – the API key for that provider (for example `DEEPSEEK_API_KEY`, `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, `GOOGLE_API_KEY`).
+ - Optionally `LLM_BASE_URL` – custom base URL for compatible providers (including OpenAI-compatible proxies such as Ollama or OpenWebUI).
  - You can re-run the wizard at any time with:
  ```bash
  config llm
  ```

+ - **Supported LLM Providers**
+ - **DeepSeek** – uses `LLM=deepseek` and `DEEPSEEK_API_KEY`, with optional `LLM_BASE_URL` override.
+ - **OpenAI (ChatGPT / GPT-4o family)** – uses `LLM=chatgpt` and `OPENAI_API_KEY`.
+ - **Anthropic Claude** – uses `LLM=anthropic` or `LLM=claude` with `ANTHROPIC_API_KEY`.
+ - **Google Gemini** – uses `LLM=google` or `LLM=gemini` with `GOOGLE_API_KEY`.
+ - **Ollama / OpenWebUI** – uses `LLM=ollama` or `LLM=openwebui` and talks to an OpenAI-compatible endpoint:
+ - `LLM_BASE_URL`, `OLLAMA_BASE_URL`, or `OPENWEBUI_BASE_URL` define the base URL (e.g. `http://localhost:11434` or an OpenWebUI URL).
+ - `OLLAMA_MODEL` selects the local model (for example `llama3.1`).
+ - `OPENAI_API_KEY` can be any non-empty token (often not validated by Ollama/OpenWebUI).
+
  - **SSH Key Encryption**
  - Stored SSH keys live in `ssh_keys.enc`.
  - Each key is encrypted using a passphrase-derived key (PBKDF2 + AES-256).
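
As a concrete illustration of the variables described in the hunk above, a local Ollama setup might use a `.env` along these lines (variable names come from the README text; the port and model name are examples, not defaults shipped by the package):

```
# .env — illustrative values for a local Ollama backend
LLM=ollama
OLLAMA_BASE_URL=http://localhost:11434
OLLAMA_MODEL=llama3.1
# Ollama/OpenWebUI often do not validate this token, but it must be non-empty
OPENAI_API_KEY=ollama
```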
@@ -1,6 +1,6 @@
  # DevPy CLI

- An intelligent command-line assistant powered by LLM (DeepSeek/OpenAI) to manage Docker environments, both local and remote via SSH. Designed to simplify DevOps tasks with natural language, ensuring security and control.
+ An intelligent command-line assistant powered by multiple LLM providers (DeepSeek, OpenAI, Anthropic Claude, Google Gemini, Ollama/OpenWebUI) to manage Docker environments, both local and remote via SSH. Designed to simplify DevOps tasks with natural language, ensuring security and control.

  ## Key Features

@@ -306,13 +306,23 @@ DevPy CLI exposes a set of Docker-focused tools that the agent can call to fulfi
  - **LLM API Authentication**
  - The `.env` file created by the setup wizard stores:
  - `LLM` – which provider/adapter to use.
- - `<PROVIDER>_API_KEY` – the API key for that provider.
- - Optionally `LLM_BASE_URL` – custom base URL for compatible providers.
+ - `<PROVIDER>_API_KEY` – the API key for that provider (for example `DEEPSEEK_API_KEY`, `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, `GOOGLE_API_KEY`).
+ - Optionally `LLM_BASE_URL` – custom base URL for compatible providers (including OpenAI-compatible proxies such as Ollama or OpenWebUI).
  - You can re-run the wizard at any time with:
  ```bash
  config llm
  ```

+ - **Supported LLM Providers**
+ - **DeepSeek** – uses `LLM=deepseek` and `DEEPSEEK_API_KEY`, with optional `LLM_BASE_URL` override.
+ - **OpenAI (ChatGPT / GPT-4o family)** – uses `LLM=chatgpt` and `OPENAI_API_KEY`.
+ - **Anthropic Claude** – uses `LLM=anthropic` or `LLM=claude` with `ANTHROPIC_API_KEY`.
+ - **Google Gemini** – uses `LLM=google` or `LLM=gemini` with `GOOGLE_API_KEY`.
+ - **Ollama / OpenWebUI** – uses `LLM=ollama` or `LLM=openwebui` and talks to an OpenAI-compatible endpoint:
+ - `LLM_BASE_URL`, `OLLAMA_BASE_URL`, or `OPENWEBUI_BASE_URL` define the base URL (e.g. `http://localhost:11434` or an OpenWebUI URL).
+ - `OLLAMA_MODEL` selects the local model (for example `llama3.1`).
+ - `OPENAI_API_KEY` can be any non-empty token (often not validated by Ollama/OpenWebUI).
+
  - **SSH Key Encryption**
  - Stored SSH keys live in `ssh_keys.enc`.
  - Each key is encrypted using a passphrase-derived key (PBKDF2 + AES-256).
@@ -452,8 +452,15 @@ tools = [
  ]


- if os.getenv('LLM') == 'deepseek':
+ llm_name = os.getenv('LLM')
+ if llm_name == 'deepseek':
      from llm.deepseek import llm
+ elif llm_name in ('anthropic', 'claude'):
+     from llm.claude import llm
+ elif llm_name in ('google', 'gemini'):
+     from llm.google import llm
+ elif llm_name in ('ollama', 'openwebui'):
+     from llm.ollama import llm
  else:
      from llm.chatgpt import llm
 
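The dispatch added in this hunk amounts to a name-to-module mapping with a ChatGPT fallback. The behavior can be sketched as a small runnable table lookup (the `ADAPTERS` dict and `resolve_adapter` helper are illustrative stand-ins, not code shipped by the package):

```python
# Illustrative mapping from LLM env values to the adapter modules
# imported by the dispatch above; resolve_adapter is a hypothetical helper.
ADAPTERS = {
    'deepseek': 'llm.deepseek',
    'anthropic': 'llm.claude',
    'claude': 'llm.claude',
    'google': 'llm.google',
    'gemini': 'llm.google',
    'ollama': 'llm.ollama',
    'openwebui': 'llm.ollama',
}

def resolve_adapter(llm_name):
    # Any unknown (or unset) value falls through to the ChatGPT adapter,
    # matching the `else` branch of the dispatch.
    return ADAPTERS.get(llm_name, 'llm.chatgpt')

print(resolve_adapter('gemini'))  # llm.google
print(resolve_adapter(None))      # llm.chatgpt
```

Note that `anthropic`/`claude` and `google`/`gemini` are aliases for the same module, so either spelling in `.env` selects the same adapter.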
@@ -1,6 +1,6 @@
  Metadata-Version: 2.4
  Name: devpy-cli
- Version: 1.0.3
+ Version: 1.0.4
  Summary: AI-powered DevOps CLI Assistant for local and remote Docker management
  Author-email: Eddy Ortega <atrox390@gmail.com>
  License: MIT
@@ -19,13 +19,15 @@ Requires-Dist: cryptography>=42.0.0
  Requires-Dist: rich>=13.7.0
  Requires-Dist: langchain>=0.1.0
  Requires-Dist: langchain-openai>=0.0.5
+ Requires-Dist: langchain-anthropic>=0.1.0
+ Requires-Dist: langchain-google-genai>=1.0.0
  Requires-Dist: langgraph>=0.0.10
  Requires-Dist: python-dotenv>=1.0.0
  Requires-Dist: psutil>=5.9.0

  # DevPy CLI

- An intelligent command-line assistant powered by LLM (DeepSeek/OpenAI) to manage Docker environments, both local and remote via SSH. Designed to simplify DevOps tasks with natural language, ensuring security and control.
+ An intelligent command-line assistant powered by multiple LLM providers (DeepSeek, OpenAI, Anthropic Claude, Google Gemini, Ollama/OpenWebUI) to manage Docker environments, both local and remote via SSH. Designed to simplify DevOps tasks with natural language, ensuring security and control.

  ## Key Features

@@ -331,13 +333,23 @@ DevPy CLI exposes a set of Docker-focused tools that the agent can call to fulfi
  - **LLM API Authentication**
  - The `.env` file created by the setup wizard stores:
  - `LLM` – which provider/adapter to use.
- - `<PROVIDER>_API_KEY` – the API key for that provider.
- - Optionally `LLM_BASE_URL` – custom base URL for compatible providers.
+ - `<PROVIDER>_API_KEY` – the API key for that provider (for example `DEEPSEEK_API_KEY`, `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, `GOOGLE_API_KEY`).
+ - Optionally `LLM_BASE_URL` – custom base URL for compatible providers (including OpenAI-compatible proxies such as Ollama or OpenWebUI).
  - You can re-run the wizard at any time with:
  ```bash
  config llm
  ```

+ - **Supported LLM Providers**
+ - **DeepSeek** – uses `LLM=deepseek` and `DEEPSEEK_API_KEY`, with optional `LLM_BASE_URL` override.
+ - **OpenAI (ChatGPT / GPT-4o family)** – uses `LLM=chatgpt` and `OPENAI_API_KEY`.
+ - **Anthropic Claude** – uses `LLM=anthropic` or `LLM=claude` with `ANTHROPIC_API_KEY`.
+ - **Google Gemini** – uses `LLM=google` or `LLM=gemini` with `GOOGLE_API_KEY`.
+ - **Ollama / OpenWebUI** – uses `LLM=ollama` or `LLM=openwebui` and talks to an OpenAI-compatible endpoint:
+ - `LLM_BASE_URL`, `OLLAMA_BASE_URL`, or `OPENWEBUI_BASE_URL` define the base URL (e.g. `http://localhost:11434` or an OpenWebUI URL).
+ - `OLLAMA_MODEL` selects the local model (for example `llama3.1`).
+ - `OPENAI_API_KEY` can be any non-empty token (often not validated by Ollama/OpenWebUI).
+
  - **SSH Key Encryption**
  - Stored SSH keys live in `ssh_keys.enc`.
  - Each key is encrypted using a passphrase-derived key (PBKDF2 + AES-256).
@@ -16,4 +16,7 @@ devpy_cli.egg-info/requires.txt
  devpy_cli.egg-info/top_level.txt
  llm/__init__.py
  llm/chatgpt.py
- llm/deepseek.py
+ llm/claude.py
+ llm/deepseek.py
+ llm/google.py
+ llm/ollama.py
@@ -4,6 +4,8 @@ cryptography>=42.0.0
  rich>=13.7.0
  langchain>=0.1.0
  langchain-openai>=0.0.5
+ langchain-anthropic>=0.1.0
+ langchain-google-genai>=1.0.0
  langgraph>=0.0.10
  python-dotenv>=1.0.0
  psutil>=5.9.0
@@ -0,0 +1,8 @@
+ import os
+ from langchain_anthropic import ChatAnthropic
+
+ llm = ChatAnthropic(
+     model='claude-3-5-sonnet-latest',
+     api_key=os.getenv('ANTHROPIC_API_KEY'),
+     max_tokens=1500,
+ )
@@ -0,0 +1,8 @@
+ import os
+ from langchain_google_genai import ChatGoogleGenerativeAI
+
+ llm = ChatGoogleGenerativeAI(
+     model='gemini-1.5-pro',
+     api_key=os.getenv('GOOGLE_API_KEY'),
+     max_output_tokens=1500,
+ )
@@ -0,0 +1,20 @@
+ import os
+ from langchain_openai import ChatOpenAI
+
+
+ def _get_base_url() -> str:
+     base = os.getenv('LLM_BASE_URL') or os.getenv('OLLAMA_BASE_URL') or os.getenv('OPENWEBUI_BASE_URL')
+     if base:
+         base = base.rstrip('/')
+         if not base.endswith('/v1'):
+             base = base + '/v1'
+         return base
+     return 'http://localhost:11434/v1'
+
+
+ llm = ChatOpenAI(
+     model=os.getenv('OLLAMA_MODEL', 'llama3.1:8b'),
+     api_key=os.getenv('OPENAI_API_KEY', 'ollama'),
+     base_url=_get_base_url(),
+     max_tokens=1500,
+ )
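
The `_get_base_url` helper in this new file picks the first defined base-URL variable, strips trailing slashes, and appends `/v1` when missing. Its normalization can be reproduced as a standalone, test-friendly function that takes the environment as a dict (a rewrite for illustration, not the packaged code):

```python
def get_base_url(env):
    # Same precedence order as llm/ollama.py: LLM_BASE_URL, then
    # OLLAMA_BASE_URL, then OPENWEBUI_BASE_URL.
    base = (env.get('LLM_BASE_URL')
            or env.get('OLLAMA_BASE_URL')
            or env.get('OPENWEBUI_BASE_URL'))
    if base:
        base = base.rstrip('/')           # drop trailing slashes
        if not base.endswith('/v1'):      # append the OpenAI-compatible path
            base += '/v1'
        return base
    return 'http://localhost:11434/v1'    # Ollama's default local endpoint

print(get_base_url({'OLLAMA_BASE_URL': 'http://localhost:11434/'}))
# http://localhost:11434/v1
```

This means a user can point the CLI at `http://localhost:11434` or `http://localhost:11434/v1` interchangeably, and an unset environment falls back to Ollama's conventional local port.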
@@ -4,7 +4,7 @@ build-backend = "setuptools.build_meta"

  [project]
  name = "devpy-cli"
- version = "1.0.3"
+ version = "1.0.4"
  description = "AI-powered DevOps CLI Assistant for local and remote Docker management"
  readme = "README.md"
  authors = [{ name = "Eddy Ortega", email = "atrox390@gmail.com" }]
@@ -23,6 +23,8 @@ dependencies = [
  "rich>=13.7.0",
  "langchain>=0.1.0",
  "langchain-openai>=0.0.5",
+ "langchain-anthropic>=0.1.0",
+ "langchain-google-genai>=1.0.0",
  "langgraph>=0.0.10",
  "python-dotenv>=1.0.0",
  "psutil>=5.9.0",