gllm-inference-binary 0.4.61__cp312-cp312-win_amd64.whl → 0.4.62__cp312-cp312-win_amd64.whl

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.

Potentially problematic release: this version of gllm-inference-binary has been flagged as possibly problematic.

@@ -93,10 +93,13 @@ class LMRequestProcessorCatalog(BaseCatalog[LMRequestProcessor]):
  2.3. For the available model ID formats, see: https://gdplabs.gitbook.io/sdk/resources/supported-models
  3. `credentials` is optional. If it is filled, it can either be:
  3.1. An environment variable name containing the API key (e.g. OPENAI_API_KEY).
- 3.2. A path to a credentials JSON file, currently only supported for Google Vertex AI.
- 3.3. A dictionary of credentials, currently supported for Bedrock and LangChain.
- 4. `config` is optional. When this column is empty, the LM invoker will use the
- default configuration. If it is filled, it must be a valid JSON string.
+ 3.2. An environment variable name containing the path to a credentials JSON file
+ (e.g. GOOGLE_CREDENTIALS_FILE_PATH). Currently only supported for Google Vertex AI.
+ 3.3. A dictionary of credentials, with each value being an environment variable name corresponding to the
+ credential (e.g. {"api_key": "OPENAI_API_KEY"}). Currently supported for Bedrock and LangChain.
+ If it is empty, the LM invoker will use the default credentials loaded from the environment variables.
+ 4. `config` is optional. If filled, must be a dictionary containing the configuration for the LM invoker.
+ If it is empty, the LM invoker will use the default configuration.
  5. `output_parser_type` can either be:
  5.1. none: No output parser will be used.
  5.2. json: The JSONOutputParser will be used.
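
For readers comparing the two docstrings, the 0.4.62 wording changes how the `credentials` and `config` columns are expected to be filled. The sketch below illustrates the three credential shapes and a dictionary-valued `config` as plain Python values; the variable names and the config keys shown are hypothetical examples inferred from the docstring text above, not part of the package's documented API.

# Illustrative column values per the 0.4.62 docstring (a sketch, not the package API).

# 3.1. An environment variable name containing the API key.
credentials_api_key_env = "OPENAI_API_KEY"

# 3.2. An environment variable name containing the path to a credentials JSON file
#      (Google Vertex AI only, per the docstring).
credentials_file_env = "GOOGLE_CREDENTIALS_FILE_PATH"

# 3.3. A dictionary of credentials, each value an environment variable name
#      (Bedrock and LangChain, per the docstring).
credentials_dict = {"api_key": "OPENAI_API_KEY"}

# 4. `config` as a dictionary of LM invoker settings; the keys below are
#    hypothetical placeholders, not documented options.
config = {"temperature": 0.0, "max_tokens": 1024}
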
Binary file
@@ -1,6 +1,6 @@
  Metadata-Version: 2.1
  Name: gllm-inference-binary
- Version: 0.4.61
+ Version: 0.4.62
  Summary: A library containing components related to model inferences in Gen AI applications.
  Author: Henry Wicaksono
  Author-email: henry.wicaksono@gdplabs.id
@@ -8,7 +8,7 @@ gllm_inference/builder/model_id.pyi,sha256=99Upl0mLOQT8pA7XlhUjPuFhLAW0KCYw4C4a0
  gllm_inference/catalog/__init__.pyi,sha256=HWgPKWIzprpMHRKe_qN9BZSIQhVhrqiyjLjIXwvj1ho,291
  gllm_inference/catalog/catalog.pyi,sha256=wYRLUWsUXBcPxl__ycMBRV04sFMVY98glrPKOaAWsRs,4760
  gllm_inference/catalog/component_map.pyi,sha256=IbnAxu2DT81KuGCVCX-SXXoQ6lw6V18v0UJQeHJglic,1354
- gllm_inference/catalog/lm_request_processor_catalog.pyi,sha256=ScDmf4tHdF-At6Nvjt6hlCZZTgTHfhBXF-BXZMno4QQ,5093
+ gllm_inference/catalog/lm_request_processor_catalog.pyi,sha256=5gU62WLAq1hhTGRmaJTaDctwiChz5n3tMb8QwUDtbiM,5463
  gllm_inference/catalog/prompt_builder_catalog.pyi,sha256=HNRwdSw5-Ihw6Z1AhUhlWvednFpiTmcv68e_fUOeHn0,3902
  gllm_inference/constants.pyi,sha256=_ejABbf_Z-NWYuKLGM5ln2GTTqC1vYRY2aaltxlLo5w,283
  gllm_inference/em_invoker/__init__.pyi,sha256=LivY6D0QAoo5k4V8eibJbap0IangA1ign2zthJNq4PI,1285
@@ -108,8 +108,8 @@ gllm_inference/utils/openai_multimodal_lm_helper.pyi,sha256=oolyuXA5S9Njft6E15Th
  gllm_inference/utils/retry.pyi,sha256=gEHkFUmzX8CCkvFrXPYhFuoZ_iq0a210TBiRU88ZHbA,80
  gllm_inference/utils/utils.pyi,sha256=N1fum4TLEsIYsdnK8y6fVxDDF5WT_MnLP9FSJUsjcGQ,6159
  gllm_inference.build/.gitignore,sha256=aEiIwOuxfzdCmLZe4oB1JsBmCUxwG8x-u-HBCV9JT8E,1
- gllm_inference.cp312-win_amd64.pyd,sha256=cuT9SRHtFlE4ooYeElHsb8qIITKCKGd2eOxvQFj-A4g,3446272
+ gllm_inference.cp312-win_amd64.pyd,sha256=uGM5mwHtQuYHf3vTuZLmcrgBgLYCj43EMk0sdYzKgoU,3446272
  gllm_inference.pyi,sha256=mnf0N8LU2LmMGy8so2ZhT59GqLw5ZGuS4WtvSPO35AQ,5030
- gllm_inference_binary-0.4.61.dist-info/METADATA,sha256=-mvlvGp3-SKwDJb55Uv4w8zvDrq9lfuEUcrh-0IW0t0,4958
- gllm_inference_binary-0.4.61.dist-info/WHEEL,sha256=4N0hGcnWMI_Ty6ATf4qJqqSl-UNI-Ln828iTWGIywmU,98
- gllm_inference_binary-0.4.61.dist-info/RECORD,,
+ gllm_inference_binary-0.4.62.dist-info/METADATA,sha256=_2NkG0nqiOli_nrerx5mdrbsgFlQkewk9pS6zIB9OeY,4958
+ gllm_inference_binary-0.4.62.dist-info/WHEEL,sha256=4N0hGcnWMI_Ty6ATf4qJqqSl-UNI-Ln828iTWGIywmU,98
+ gllm_inference_binary-0.4.62.dist-info/RECORD,,