vellum-ai 0.6.4__py3-none-any.whl → 0.6.5__py3-none-any.whl

vellum/core/client_wrapper.py
@@ -18,7 +18,7 @@ class BaseClientWrapper:
          headers: typing.Dict[str, str] = {
              "X-Fern-Language": "Python",
              "X-Fern-SDK-Name": "vellum-ai",
-             "X-Fern-SDK-Version": "0.6.4",
+             "X-Fern-SDK-Version": "0.6.5",
          }
          headers["X_API_KEY"] = self.api_key
          return headers
vellum/types/deployment_read.py
@@ -47,6 +47,11 @@ class DeploymentRead(pydantic_v1.BaseModel):
      Deprecated. The Prompt execution endpoints return a `prompt_version_id` that could be used instead.
      """

+     last_deployed_history_item_id: str = pydantic_v1.Field()
+     """
+     The ID of the history item associated with this Deployment's LATEST Release Tag
+     """
+
      def json(self, **kwargs: typing.Any) -> str:
          kwargs_with_defaults: typing.Any = {"by_alias": True, "exclude_unset": True, **kwargs}
          return super().json(**kwargs_with_defaults)
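`DeploymentRead` responses now carry this new field, so code that inspects a deployment can read it directly. A minimal sketch, assuming the SDK's deployments client exposes a `retrieve` method and using a hypothetical deployment ID:

```python
from vellum.client import Vellum

client = Vellum(api_key="YOUR_API_KEY")

# "<your-deployment-id>" is a placeholder; retrieve returns a DeploymentRead model.
deployment = client.deployments.retrieve("<your-deployment-id>")

# New in 0.6.5: the history item behind this deployment's LATEST release tag.
print(deployment.last_deployed_history_item_id)
```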
vellum/types/workflow_deployment_read.py
@@ -41,6 +41,11 @@ class WorkflowDeploymentRead(pydantic_v1.BaseModel):

      created: dt.datetime
      last_deployed_on: dt.datetime
+     last_deployed_history_item_id: str = pydantic_v1.Field()
+     """
+     The ID of the history item associated with this Workflow Deployment's LATEST Release Tag
+     """
+
      input_variables: typing.List[VellumVariable] = pydantic_v1.Field()
      """
      The input variables this Workflow Deployment expects to receive values for when it is executed.
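The same field is added to `WorkflowDeploymentRead`; a corresponding sketch, again assuming a `retrieve` method on the workflow deployments client and a placeholder ID:

```python
from vellum.client import Vellum

client = Vellum(api_key="YOUR_API_KEY")

# "<your-workflow-deployment-id>" is a placeholder; retrieve returns a WorkflowDeploymentRead model.
workflow_deployment = client.workflow_deployments.retrieve("<your-workflow-deployment-id>")

# Mirrors the field added to DeploymentRead above.
print(workflow_deployment.last_deployed_history_item_id)
```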
vellum_ai-0.6.5.dist-info/METADATA (added)
@@ -0,0 +1,126 @@
+ Metadata-Version: 2.1
+ Name: vellum-ai
+ Version: 0.6.5
+ Summary:
+ Requires-Python: >=3.8,<4.0
+ Classifier: Programming Language :: Python :: 3
+ Classifier: Programming Language :: Python :: 3.8
+ Classifier: Programming Language :: Python :: 3.9
+ Classifier: Programming Language :: Python :: 3.10
+ Classifier: Programming Language :: Python :: 3.11
+ Requires-Dist: cdktf (>=0.20.5,<0.21.0)
+ Requires-Dist: httpx (>=0.21.2)
+ Requires-Dist: publication (==0.0.3)
+ Requires-Dist: pydantic (>=1.9.2)
+ Requires-Dist: typing_extensions (>=4.0.0)
+ Description-Content-Type: text/markdown
+
+ # Vellum Python Library
+
+ [![pypi](https://img.shields.io/pypi/v/vellum-ai.svg)](https://pypi.python.org/pypi/vellum-ai)
+ ![license badge](https://img.shields.io/github/license/vellum-ai/vellum-client-python)
+ [![fern shield](https://img.shields.io/badge/%F0%9F%8C%BF-SDK%20generated%20by%20Fern-brightgreen)](https://buildwithfern.com/?utm_source=vellum-ai/vellum-client-python/readme)
+
+ The Vellum Python SDK provides access to the Vellum API from Python.
+
+
+ ## API Docs
+ You can find Vellum's complete API docs at [docs.vellum.ai](https://docs.vellum.ai/api-reference/introduction/getting-started).
+
+ ## Installation
+
+ ```sh
+ pip install --upgrade vellum-ai
+ ```
+
+ ## Usage
+ Below is how you would invoke a deployed Prompt from the Vellum API. For a complete list of all APIs
+ that Vellum supports, check out our [API Reference](https://docs.vellum.ai/api-reference/introduction/getting-started).
+
+ ```python
+ from vellum import (
+     PromptDeploymentInputRequest_String,
+ )
+ from vellum.client import Vellum
+
+ client = Vellum(
+     api_key="YOUR_API_KEY",
+ )
+
+ def execute() -> str:
+     result = client.execute_prompt(
+         prompt_deployment_name="<example-deployment-name>",
+         release_tag="LATEST",
+         inputs=[
+             PromptDeploymentInputRequest_String(
+                 name="input_a",
+                 type="STRING",
+                 value="Hello, world!",
+             )
+         ],
+     )
+
+     if result.state == "REJECTED":
+         raise Exception(result.error.message)
+
+     return result.outputs[0].value
+
+ if __name__ == "__main__":
+     print(execute())
+ ```
+
+ > [!TIP]
+ > You can set the `VELLUM_API_KEY` environment variable to avoid writing your API key within your code. To do so, add `export VELLUM_API_KEY=<your-api-token>`
+ > to your `~/.zshrc` or `~/.bashrc`, open a new terminal, and any code calling `vellum.Vellum()` will then read this key.
+
+ ## Async Client
+ This SDK has an async version. Here's how to use it:
+
+
+
+ ```python
+ import asyncio
+
+ import vellum
+ from vellum.client import AsyncVellum
+
+ client = AsyncVellum(api_key="YOUR_API_KEY")
+
+ async def execute() -> str:
+     result = await client.execute_prompt(
+         prompt_deployment_name="<example-deployment-name>",
+         release_tag="LATEST",
+         inputs=[
+             vellum.PromptDeploymentInputRequest_String(
+                 name="input_a",
+                 value="Hello, world!",
+             )
+         ],
+     )
+
+     if result.state == "REJECTED":
+         raise Exception(result.error.message)
+
+     return result.outputs[0].value
+
+ if __name__ == "__main__":
+     print(asyncio.run(execute()))
+ ```
+
+ ## Contributing
+
+ While we value open-source contributions to this SDK, most of this library is generated programmatically.
+
+ Please feel free to make contributions to any of the directories or files below:
+ ```plaintext
+ examples/*
+ src/vellum/lib/*
+ tests/*
+ README.md
+ ```
+
+ Any changes made to files outside of the directories and files above would have to be moved over to our generation code
+ (found in the separate [vellum-client-generator](https://github.com/vellum-ai/vellum-client-generator) repo),
+ otherwise they would be overwritten upon the next generated release. Feel free to open a PR as a proof of concept,
+ but know that we will not be able to merge it as-is. We suggest opening an issue first to discuss with us!
+
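The README above recommends exporting `VELLUM_API_KEY` instead of hard-coding the key; a minimal sketch of that pattern, assuming the generated client falls back to the environment variable when no `api_key` argument is passed:

```python
from vellum.client import Vellum

# Assumes VELLUM_API_KEY was exported in the shell (e.g. via ~/.zshrc),
# so no api_key argument needs to appear in source code.
client = Vellum()

# The client is then used exactly as in the README examples above.
```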
{vellum_ai-0.6.4.dist-info → vellum_ai-0.6.5.dist-info}/RECORD
@@ -2,7 +2,7 @@ vellum/__init__.py,sha256=7aKsuZge9dDZncC299GYskQc1AJ1GUkJ64doEDqHmS4,44855
  vellum/client.py,sha256=FklbOzCaDTPP_EQn0HJXUq1_ZFOHuSePt6_nVQ_YLgY,97463
  vellum/core/__init__.py,sha256=1pNSKkwyQvMl_F0wohBqmoQAITptg3zlvCwsoSSzy7c,853
  vellum/core/api_error.py,sha256=RE8LELok2QCjABadECTvtDp7qejA1VmINCh6TbqPwSE,426
- vellum/core/client_wrapper.py,sha256=LAmDIndEooz_x9oVaUk8OCp9HJDoHlb9_mnFhDkh9bA,1697
+ vellum/core/client_wrapper.py,sha256=qDiIYijRBOToj6MOYVMjrOKpLovjDPs7PkLNsbJSgGM,1697
  vellum/core/datetime_utils.py,sha256=nBys2IsYrhPdszxGKCNRPSOCwa-5DWOHG95FB8G9PKo,1047
  vellum/core/file.py,sha256=sy1RUGZ3aJYuw998bZytxxo6QdgKmlnlgBaMvwEKCGg,1480
  vellum/core/http_client.py,sha256=5ok6hqgZDJhg57EHvMnr0BBaHdG50QxFPKaCZ9aVWTc,5059
@@ -104,7 +104,7 @@ vellum/types/created_enum.py,sha256=_dfKJhEenYcIUYY1uKQuq1uNS3k9HbPGCxXnW-Tu5uo,
  vellum/types/delete_enum.py,sha256=g6Rnc2pbgXkEbqhG0Bx1z-ZGr4DMkb8QK8du9dQQcpQ,118
  vellum/types/deleted_enum.py,sha256=F7VTcnxIkXrwyQr5CjGikBbCnlo6To_rP0pibWm-ioo,120
  vellum/types/deployment_provider_payload_response.py,sha256=nEw7v0EVo3NgKDVtsBMjd9XLWmFAGk59U1Z-qSs-Stc,898
- vellum/types/deployment_read.py,sha256=q3xfBEKQ8HsXc9en1c3oKSGQbyTc-xY54puIEe20okM,1938
+ vellum/types/deployment_read.py,sha256=Ob9ArdqKJb5vjRx26hX_iOnPF2MwtBYxB5xx2LVNbEk,2100
  vellum/types/deployment_release_tag_deployment_history_item.py,sha256=997C-J0NOEvOm7Y_dyyaqYvKMIEHCDj0JEpAcmOjOEQ,903
  vellum/types/deployment_release_tag_read.py,sha256=o0X8dMSqajT3-lEnLk9tRb8PRhs3l3M4iBM7CX9316c,1432
  vellum/types/document_document_to_document_index.py,sha256=kCPPJFnXu9HFZbk7PgRCtRDj5Cw2_0yEPjAStm-YC2E,1532
@@ -418,7 +418,7 @@ vellum/types/vellum_image.py,sha256=1QCMf26kEKRKP9DTxNI0qp7CNC1viWGFV9hmIxFyxoY,
  vellum/types/vellum_image_request.py,sha256=ADerbxbSJHzNYouJa1jaBIGvbHR2nSmqYAxE-cgS2Rg,921
  vellum/types/vellum_variable.py,sha256=MPxkKBtuxtg4HZud4xwsyT_sH6FG-YDGeFLpUa4NZDs,944
  vellum/types/vellum_variable_type.py,sha256=uHeBCGi7U_SksgKOxtvI4KxYffD4BD2TlddTPo_LUSM,281
- vellum/types/workflow_deployment_read.py,sha256=KsGJ4Ah4ybWbEsdnb0ixXHxCfwxRSQssBuQmlyZMpbc,2110
+ vellum/types/workflow_deployment_read.py,sha256=010Jqbj-XOXRm4evLM5yIg4RwDVFB5ayKopMIRPUH88,2281
  vellum/types/workflow_event_error.py,sha256=1f-xt3rNeCIpSm37KmAqVMc9IEbDD-3pNH4zwBYzXP0,981
  vellum/types/workflow_execution_actual_chat_history_request.py,sha256=ZBk37qbr-gXYeFLhgmkQQLKvrtlJCWVe9B6GYjq763o,2223
  vellum/types/workflow_execution_actual_json_request.py,sha256=XuiH6iE_NZiut9E3Y8VwiY5rOHx8u3sd2i_TxbfG8d8,2160
@@ -459,7 +459,7 @@ vellum/types/workflow_result_event_output_data_search_results.py,sha256=gazaUrC5
  vellum/types/workflow_result_event_output_data_string.py,sha256=aVWIIGbLj4TJJhTTj6WzhbYXQkcZatKuhhNy8UYwXbw,1482
  vellum/types/workflow_stream_event.py,sha256=KA6Bkk_XA6AIPWR-1vKnwF1A8l_Bm5y0arQCWWWRpsk,911
  vellum/version.py,sha256=neLt8HBHHUtDF9M5fsyUzHT-pKooEPvceaLDqqIGb0s,77
- vellum_ai-0.6.4.dist-info/LICENSE,sha256=CcaljEIoOBaU-wItPH4PmM_mDCGpyuUY0Er1BGu5Ti8,1073
- vellum_ai-0.6.4.dist-info/METADATA,sha256=Y5gS7YXyKOfHyJOHFZfDWmXeutOluhSu8wAo4V6F3K0,3591
- vellum_ai-0.6.4.dist-info/WHEEL,sha256=Zb28QaM1gQi8f4VCBhsUklF61CTlNYfs9YAZn-TOGFk,88
- vellum_ai-0.6.4.dist-info/RECORD,,
+ vellum_ai-0.6.5.dist-info/LICENSE,sha256=CcaljEIoOBaU-wItPH4PmM_mDCGpyuUY0Er1BGu5Ti8,1073
+ vellum_ai-0.6.5.dist-info/METADATA,sha256=LRrhuNTE8VAO6jeFWXRA5RDnsW1OtcniYmYj_nGX548,3872
+ vellum_ai-0.6.5.dist-info/WHEEL,sha256=Zb28QaM1gQi8f4VCBhsUklF61CTlNYfs9YAZn-TOGFk,88
+ vellum_ai-0.6.5.dist-info/RECORD,,
vellum_ai-0.6.4.dist-info/METADATA (removed)
@@ -1,109 +0,0 @@
- Metadata-Version: 2.1
- Name: vellum-ai
- Version: 0.6.4
- Summary:
- Requires-Python: >=3.8,<4.0
- Classifier: Programming Language :: Python :: 3
- Classifier: Programming Language :: Python :: 3.8
- Classifier: Programming Language :: Python :: 3.9
- Classifier: Programming Language :: Python :: 3.10
- Classifier: Programming Language :: Python :: 3.11
- Requires-Dist: cdktf (>=0.20.5,<0.21.0)
- Requires-Dist: httpx (>=0.21.2)
- Requires-Dist: publication (==0.0.3)
- Requires-Dist: pydantic (>=1.9.2)
- Requires-Dist: typing_extensions (>=4.0.0)
- Description-Content-Type: text/markdown
-
- # Vellum Python Library
-
- [![pypi](https://img.shields.io/pypi/v/vellum-ai.svg)](https://pypi.python.org/pypi/vellum-ai)
- [![fern shield](https://img.shields.io/badge/%F0%9F%8C%BF-SDK%20generated%20by%20Fern-brightgreen)](https://buildwithfern.com/?utm_source=vellum-ai/vellum-client-python/readme)
-
- The Vellum Python Library provides access to the Vellum API from python.
-
-
- ## API Docs
- You can find Vellum's complete API docs at [docs.vellum.ai](https://docs.vellum.ai).
-
- ## Installation
-
- ```sh
- pip install --upgrade vellum-ai
- ```
-
- ## Usage
-
- ```python
- import vellum
- from vellum.client import Vellum
-
-
- client = Vellum(api_key="YOUR_API_KEY")
-
- result = client.generate(
-     deployment_name="my-deployment",
-     requests=[
-         vellum.GenerateRequest(
-             input_values={"question": "Can I get a refund?"})]
- )
-
- print(result.text)
- ```
-
- ## Async Client
-
- ```python
- import vellum
- from vellum.client import AsyncVellum
-
- raven = AsyncVellum(api_key="YOUR_API_KEY")
-
- async def generate() -> str:
-     result = client.generate(
-         deployment_name="my-deployment",
-         requests=[
-             vellum.GenerateRequest(
-                 input_values={"question": "Can I get a refund?"})]
-     )
-
-     return result.text
- ```
-
- ## Uploading documents
-
- Documents can be uploaded to Vellum via either the UI or this API. Once uploaded and indexed, Vellum's Search allows you to perform semantic searches against them.
-
- ```python
- from vellum.client import Vellum
-
- client = Vellum(api_key="YOUR_API_KEY")
-
- with open("/path/to/your/file.txt", "rb") as file:
-     result = client.documents.upload(
-         # File to upload
-         contents=file,
-         # Document label
-         label="Human-friendly label for your document",
-         # The names of indexes that you'd like this document to be added to.
-         add_to_index_names=["<your-index-name>"],
-         # Optionally include a unique ID from your system to this document later.
-         # Useful if you want to perform updates later
-         external_id="<your-index-name>",
-         # Optionally include keywords to associate with the document that can be used in hybrid search
-         keywords=[],
-     )
-
- print(result)
- ```
-
- ## Beta status
-
- This SDK is in beta, and there may be breaking changes between versions without a major version update. Therefore, we recommend pinning the package version to a specific version in your pyproject.toml file. This way, you can install the same version each time without breaking changes unless you are intentionally looking for the latest version.
-
- ## Contributing
-
- While we value open-source contributions to this SDK, this library is generated programmatically. Additions made directly to this library would have to be moved over to our generation code, otherwise they would be overwritten upon the next generated release. Feel free to open a PR as a proof of concept, but know that we will not be able to merge it as-is. We suggest opening an issue first to discuss with us!
-
- On the other hand, contributions to the README are always very welcome!
-