ehs-llm-client 0.1.2__py3-none-any.whl → 0.1.5__py3-none-any.whl

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.

ehs_llm_client-0.1.2.dist-info/METADATA → ehs_llm_client-0.1.5.dist-info/METADATA
@@ -1,9 +1,9 @@
  Metadata-Version: 2.4
  Name: ehs-llm-client
- Version: 0.1.2
+ Version: 0.1.5
  Summary: Unified LLM client. Currently supports Openai, Azure Openai and Google Gemini
  Author-email: Andersen Huang <andersen.huang@ehsanalytics.com>
- License: MIT
+ License-Expression: MIT
  Description-Content-Type: text/markdown
  Requires-Dist: openai
  Requires-Dist: google-genai
@@ -15,7 +15,7 @@ Requires-Dist: python-dateutil
  [![PyPI version](https://img.shields.io/pypi/v/llm-client)](https://pypi.org/project/llm-client/)
  [![Python Version](https://img.shields.io/pypi/pyversions/llm-client)](https://www.python.org/downloads/)

- `llm-client` is a unified **async Python client** for interacting with multiple LLM providers, including **OpenAI**, **Azure OpenAI**, and **Google Gemini**.
+ `ehs-llm-client` is a unified **async Python client** for interacting with multiple LLM providers, including **OpenAI**, **Azure OpenAI**, and **Google Gemini**.
  It supports **single calls**, **structured JSON outputs**, and **batch processing**, designed for **production-ready, reusable code** in applications, scripts, or pipelines.

  ---
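
As a rough illustration of the async, provider-agnostic usage the description above promises, here is a minimal sketch. Only the `LLM("prod", config_file_path="llmconfig.cfg")` constructor call is taken from the snippets in this diff; the `chat` method name, its signature, and its return value are assumptions, not part of the published API captured here.

```python
# Hypothetical sketch only: the LLM(...) constructor matches the usage shown later
# in this diff, but the `chat` method and its signature are assumed, not documented.
import asyncio

from ehs_llm_client import LLM


async def main() -> None:
    # Load the "prod" profile from a config file (constructor shown in the README diff).
    llm = LLM("prod", config_file_path="llmconfig.cfg")
    # A single async completion call might look like this (method name assumed).
    reply = await llm.chat("Summarize this incident report in one sentence.")
    print(reply)


asyncio.run(main())
```
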
@@ -36,7 +36,7 @@ It supports **single calls**, **structured JSON outputs**, and **batch processin

  ```bash
  # Install from PyPI
- pip install llm-client
+ pip install ehs-llm-client
  ```

  Or install editable version during development:
@@ -50,7 +50,7 @@ pip install -e .

  ## Configuration

- `llm-client` supports **three configuration modes**:
+ `ehs-llm-client` supports **three configuration modes**:

  ### 1️⃣ Config file (`.cfg`)

@@ -70,7 +70,7 @@ model_batch = gpt-4.1-mini
  Usage:

  ```python
- from llm_client import LLM
+ from ehs_llm_client import LLM

  llm = LLM("prod", config_file_path="llmconfig.cfg")
  ```
@@ -80,7 +80,7 @@ llm = LLM("prod", config_file_path="llmconfig.cfg")
  ### 2️⃣ Config dictionary (for tests / CI)

  ```python
- from llm_client import LLM
+ from ehs_llm_client import LLM

  llm = LLM(
  "default",
ehs_llm_client-0.1.2.dist-info/RECORD → ehs_llm_client-0.1.5.dist-info/RECORD
@@ -3,7 +3,7 @@ ehs_llm_client/client.py,sha256=fbyMOPTw_NayG2GCbiIQbwZHyBrU76s9Lirh5uBpv3o,1620
  ehs_llm_client/config.py,sha256=1sYOvX5biBUoLYMa22CiJ_MElRlBfBgQsatfCTs_kN0,1698
  ehs_llm_client/exceptions.py,sha256=UEihJBOZufiMPjkfStEuiE7rhJTUJXib6puGN65FykA,92
  ehs_llm_client/utils.py,sha256=zVLF8obG1cKJ-9cyHulzoTZBT0LQtGkx4cdiG2UaKuk,352
- ehs_llm_client-0.1.2.dist-info/METADATA,sha256=aPrzMJqxBhbIskSITRT3ZXNJLSKELz85HPBU5vhJFfA,3651
- ehs_llm_client-0.1.2.dist-info/WHEEL,sha256=wUyA8OaulRlbfwMtmQsvNngGrxQHAvkKcvRmdizlJi0,92
- ehs_llm_client-0.1.2.dist-info/top_level.txt,sha256=8xnm7U82x1dZYaanP_IxTIw7e4PKG-GEpmaR0jU6FSE,15
- ehs_llm_client-0.1.2.dist-info/RECORD,,
+ ehs_llm_client-0.1.5.dist-info/METADATA,sha256=FHmYTFOhSeEkz-vTwIyOCp2b0qsQ_eHfWROiaOR3YNM,3682
+ ehs_llm_client-0.1.5.dist-info/WHEEL,sha256=wUyA8OaulRlbfwMtmQsvNngGrxQHAvkKcvRmdizlJi0,92
+ ehs_llm_client-0.1.5.dist-info/top_level.txt,sha256=8xnm7U82x1dZYaanP_IxTIw7e4PKG-GEpmaR0jU6FSE,15
+ ehs_llm_client-0.1.5.dist-info/RECORD,,