metaflow-stubs 2.17.2__py2.py3-none-any.whl → 2.17.3__py2.py3-none-any.whl

This diff compares the contents of two publicly released versions of the package, as published to one of the supported registries. It is provided for informational purposes only and reflects the package versions as they appear in their respective public registries.

Potentially problematic release.


This version of metaflow-stubs might be problematic.

Files changed (166)
  1. metaflow-stubs/__init__.pyi +559 -559
  2. metaflow-stubs/cards.pyi +2 -2
  3. metaflow-stubs/cli.pyi +2 -2
  4. metaflow-stubs/cli_components/__init__.pyi +2 -2
  5. metaflow-stubs/cli_components/utils.pyi +2 -2
  6. metaflow-stubs/client/__init__.pyi +2 -2
  7. metaflow-stubs/client/core.pyi +5 -5
  8. metaflow-stubs/client/filecache.pyi +2 -2
  9. metaflow-stubs/events.pyi +2 -2
  10. metaflow-stubs/exception.pyi +2 -2
  11. metaflow-stubs/flowspec.pyi +3 -3
  12. metaflow-stubs/generated_for.txt +1 -1
  13. metaflow-stubs/includefile.pyi +3 -3
  14. metaflow-stubs/meta_files.pyi +2 -2
  15. metaflow-stubs/metadata_provider/__init__.pyi +2 -2
  16. metaflow-stubs/metadata_provider/heartbeat.pyi +2 -2
  17. metaflow-stubs/metadata_provider/metadata.pyi +2 -2
  18. metaflow-stubs/metadata_provider/util.pyi +2 -2
  19. metaflow-stubs/metaflow_config.pyi +2 -2
  20. metaflow-stubs/metaflow_current.pyi +24 -24
  21. metaflow-stubs/metaflow_git.pyi +2 -2
  22. metaflow-stubs/multicore_utils.pyi +2 -2
  23. metaflow-stubs/packaging_sys/__init__.pyi +6 -6
  24. metaflow-stubs/packaging_sys/backend.pyi +4 -4
  25. metaflow-stubs/packaging_sys/distribution_support.pyi +4 -4
  26. metaflow-stubs/packaging_sys/tar_backend.pyi +6 -6
  27. metaflow-stubs/packaging_sys/utils.pyi +2 -2
  28. metaflow-stubs/packaging_sys/v1.pyi +2 -2
  29. metaflow-stubs/parameters.pyi +3 -3
  30. metaflow-stubs/plugins/__init__.pyi +14 -14
  31. metaflow-stubs/plugins/airflow/__init__.pyi +2 -2
  32. metaflow-stubs/plugins/airflow/airflow_decorator.pyi +2 -2
  33. metaflow-stubs/plugins/airflow/airflow_utils.pyi +2 -2
  34. metaflow-stubs/plugins/airflow/exception.pyi +2 -2
  35. metaflow-stubs/plugins/airflow/sensors/__init__.pyi +2 -2
  36. metaflow-stubs/plugins/airflow/sensors/base_sensor.pyi +2 -2
  37. metaflow-stubs/plugins/airflow/sensors/external_task_sensor.pyi +2 -2
  38. metaflow-stubs/plugins/airflow/sensors/s3_sensor.pyi +2 -2
  39. metaflow-stubs/plugins/argo/__init__.pyi +2 -2
  40. metaflow-stubs/plugins/argo/argo_client.pyi +2 -2
  41. metaflow-stubs/plugins/argo/argo_events.pyi +2 -2
  42. metaflow-stubs/plugins/argo/argo_workflows.pyi +7 -3
  43. metaflow-stubs/plugins/argo/argo_workflows_decorator.pyi +3 -3
  44. metaflow-stubs/plugins/argo/argo_workflows_deployer.pyi +3 -3
  45. metaflow-stubs/plugins/argo/argo_workflows_deployer_objects.pyi +3 -3
  46. metaflow-stubs/plugins/argo/exit_hooks.pyi +2 -2
  47. metaflow-stubs/plugins/aws/__init__.pyi +3 -3
  48. metaflow-stubs/plugins/aws/aws_client.pyi +2 -2
  49. metaflow-stubs/plugins/aws/aws_utils.pyi +2 -2
  50. metaflow-stubs/plugins/aws/batch/__init__.pyi +2 -2
  51. metaflow-stubs/plugins/aws/batch/batch.pyi +2 -2
  52. metaflow-stubs/plugins/aws/batch/batch_client.pyi +2 -2
  53. metaflow-stubs/plugins/aws/batch/batch_decorator.pyi +2 -2
  54. metaflow-stubs/plugins/aws/secrets_manager/__init__.pyi +2 -2
  55. metaflow-stubs/plugins/aws/secrets_manager/aws_secrets_manager_secrets_provider.pyi +3 -3
  56. metaflow-stubs/plugins/aws/step_functions/__init__.pyi +2 -2
  57. metaflow-stubs/plugins/aws/step_functions/dynamo_db_client.pyi +2 -2
  58. metaflow-stubs/plugins/aws/step_functions/event_bridge_client.pyi +2 -2
  59. metaflow-stubs/plugins/aws/step_functions/schedule_decorator.pyi +2 -2
  60. metaflow-stubs/plugins/aws/step_functions/step_functions.pyi +2 -2
  61. metaflow-stubs/plugins/aws/step_functions/step_functions_client.pyi +2 -2
  62. metaflow-stubs/plugins/aws/step_functions/step_functions_decorator.pyi +2 -2
  63. metaflow-stubs/plugins/aws/step_functions/step_functions_deployer.pyi +3 -3
  64. metaflow-stubs/plugins/aws/step_functions/step_functions_deployer_objects.pyi +3 -3
  65. metaflow-stubs/plugins/azure/__init__.pyi +2 -2
  66. metaflow-stubs/plugins/azure/azure_credential.pyi +2 -2
  67. metaflow-stubs/plugins/azure/azure_exceptions.pyi +2 -2
  68. metaflow-stubs/plugins/azure/azure_secret_manager_secrets_provider.pyi +3 -3
  69. metaflow-stubs/plugins/azure/azure_utils.pyi +2 -2
  70. metaflow-stubs/plugins/azure/blob_service_client_factory.pyi +2 -2
  71. metaflow-stubs/plugins/azure/includefile_support.pyi +2 -2
  72. metaflow-stubs/plugins/cards/__init__.pyi +2 -2
  73. metaflow-stubs/plugins/cards/card_client.pyi +3 -3
  74. metaflow-stubs/plugins/cards/card_creator.pyi +2 -2
  75. metaflow-stubs/plugins/cards/card_datastore.pyi +2 -2
  76. metaflow-stubs/plugins/cards/card_decorator.pyi +2 -2
  77. metaflow-stubs/plugins/cards/card_modules/__init__.pyi +2 -2
  78. metaflow-stubs/plugins/cards/card_modules/basic.pyi +2 -2
  79. metaflow-stubs/plugins/cards/card_modules/card.pyi +2 -2
  80. metaflow-stubs/plugins/cards/card_modules/components.pyi +2 -2
  81. metaflow-stubs/plugins/cards/card_modules/convert_to_native_type.pyi +2 -2
  82. metaflow-stubs/plugins/cards/card_modules/renderer_tools.pyi +2 -2
  83. metaflow-stubs/plugins/cards/card_modules/test_cards.pyi +2 -2
  84. metaflow-stubs/plugins/cards/card_resolver.pyi +2 -2
  85. metaflow-stubs/plugins/cards/component_serializer.pyi +2 -2
  86. metaflow-stubs/plugins/cards/exception.pyi +2 -2
  87. metaflow-stubs/plugins/catch_decorator.pyi +3 -3
  88. metaflow-stubs/plugins/datatools/__init__.pyi +2 -2
  89. metaflow-stubs/plugins/datatools/local.pyi +2 -2
  90. metaflow-stubs/plugins/datatools/s3/__init__.pyi +2 -2
  91. metaflow-stubs/plugins/datatools/s3/s3.pyi +3 -3
  92. metaflow-stubs/plugins/datatools/s3/s3tail.pyi +2 -2
  93. metaflow-stubs/plugins/datatools/s3/s3util.pyi +2 -2
  94. metaflow-stubs/plugins/debug_logger.pyi +2 -2
  95. metaflow-stubs/plugins/debug_monitor.pyi +2 -2
  96. metaflow-stubs/plugins/environment_decorator.pyi +2 -2
  97. metaflow-stubs/plugins/events_decorator.pyi +2 -2
  98. metaflow-stubs/plugins/exit_hook/__init__.pyi +2 -2
  99. metaflow-stubs/plugins/exit_hook/exit_hook_decorator.pyi +2 -2
  100. metaflow-stubs/plugins/frameworks/__init__.pyi +2 -2
  101. metaflow-stubs/plugins/frameworks/pytorch.pyi +2 -2
  102. metaflow-stubs/plugins/gcp/__init__.pyi +2 -2
  103. metaflow-stubs/plugins/gcp/gcp_secret_manager_secrets_provider.pyi +3 -3
  104. metaflow-stubs/plugins/gcp/gs_exceptions.pyi +2 -2
  105. metaflow-stubs/plugins/gcp/gs_storage_client_factory.pyi +2 -2
  106. metaflow-stubs/plugins/gcp/gs_utils.pyi +2 -2
  107. metaflow-stubs/plugins/gcp/includefile_support.pyi +2 -2
  108. metaflow-stubs/plugins/kubernetes/__init__.pyi +2 -2
  109. metaflow-stubs/plugins/kubernetes/kube_utils.pyi +2 -2
  110. metaflow-stubs/plugins/kubernetes/kubernetes.pyi +2 -2
  111. metaflow-stubs/plugins/kubernetes/kubernetes_client.pyi +2 -2
  112. metaflow-stubs/plugins/kubernetes/kubernetes_decorator.pyi +2 -2
  113. metaflow-stubs/plugins/kubernetes/kubernetes_job.pyi +2 -2
  114. metaflow-stubs/plugins/kubernetes/kubernetes_jobsets.pyi +2 -2
  115. metaflow-stubs/plugins/kubernetes/spot_monitor_sidecar.pyi +2 -2
  116. metaflow-stubs/plugins/parallel_decorator.pyi +2 -2
  117. metaflow-stubs/plugins/project_decorator.pyi +2 -2
  118. metaflow-stubs/plugins/pypi/__init__.pyi +2 -2
  119. metaflow-stubs/plugins/pypi/conda_decorator.pyi +2 -2
  120. metaflow-stubs/plugins/pypi/conda_environment.pyi +4 -4
  121. metaflow-stubs/plugins/pypi/parsers.pyi +2 -2
  122. metaflow-stubs/plugins/pypi/pypi_decorator.pyi +2 -2
  123. metaflow-stubs/plugins/pypi/pypi_environment.pyi +2 -2
  124. metaflow-stubs/plugins/pypi/utils.pyi +2 -2
  125. metaflow-stubs/plugins/resources_decorator.pyi +2 -2
  126. metaflow-stubs/plugins/retry_decorator.pyi +2 -2
  127. metaflow-stubs/plugins/secrets/__init__.pyi +3 -3
  128. metaflow-stubs/plugins/secrets/inline_secrets_provider.pyi +3 -3
  129. metaflow-stubs/plugins/secrets/secrets_decorator.pyi +2 -2
  130. metaflow-stubs/plugins/secrets/secrets_func.pyi +2 -2
  131. metaflow-stubs/plugins/secrets/secrets_spec.pyi +2 -2
  132. metaflow-stubs/plugins/secrets/utils.pyi +2 -2
  133. metaflow-stubs/plugins/storage_executor.pyi +2 -2
  134. metaflow-stubs/plugins/test_unbounded_foreach_decorator.pyi +3 -3
  135. metaflow-stubs/plugins/timeout_decorator.pyi +3 -3
  136. metaflow-stubs/plugins/uv/__init__.pyi +2 -2
  137. metaflow-stubs/plugins/uv/uv_environment.pyi +3 -3
  138. metaflow-stubs/pylint_wrapper.pyi +2 -2
  139. metaflow-stubs/runner/__init__.pyi +2 -2
  140. metaflow-stubs/runner/deployer.pyi +4 -4
  141. metaflow-stubs/runner/deployer_impl.pyi +3 -3
  142. metaflow-stubs/runner/metaflow_runner.pyi +3 -3
  143. metaflow-stubs/runner/nbdeploy.pyi +2 -2
  144. metaflow-stubs/runner/nbrun.pyi +2 -2
  145. metaflow-stubs/runner/subprocess_manager.pyi +2 -2
  146. metaflow-stubs/runner/utils.pyi +3 -3
  147. metaflow-stubs/system/__init__.pyi +2 -2
  148. metaflow-stubs/system/system_logger.pyi +2 -2
  149. metaflow-stubs/system/system_monitor.pyi +2 -2
  150. metaflow-stubs/tagging_util.pyi +2 -2
  151. metaflow-stubs/tuple_util.pyi +2 -2
  152. metaflow-stubs/user_configs/__init__.pyi +2 -2
  153. metaflow-stubs/user_configs/config_options.pyi +2 -2
  154. metaflow-stubs/user_configs/config_parameters.pyi +5 -5
  155. metaflow-stubs/user_decorators/__init__.pyi +2 -2
  156. metaflow-stubs/user_decorators/common.pyi +2 -2
  157. metaflow-stubs/user_decorators/mutable_flow.pyi +5 -5
  158. metaflow-stubs/user_decorators/mutable_step.pyi +5 -5
  159. metaflow-stubs/user_decorators/user_flow_decorator.pyi +4 -4
  160. metaflow-stubs/user_decorators/user_step_decorator.pyi +5 -5
  161. metaflow-stubs/version.pyi +2 -2
  162. {metaflow_stubs-2.17.2.dist-info → metaflow_stubs-2.17.3.dist-info}/METADATA +2 -2
  163. metaflow_stubs-2.17.3.dist-info/RECORD +166 -0
  164. metaflow_stubs-2.17.2.dist-info/RECORD +0 -166
  165. {metaflow_stubs-2.17.2.dist-info → metaflow_stubs-2.17.3.dist-info}/WHEEL +0 -0
  166. {metaflow_stubs-2.17.2.dist-info → metaflow_stubs-2.17.3.dist-info}/top_level.txt +0 -0
@@ -1,15 +1,15 @@
  ######################################################################################################
  # Auto-generated Metaflow stub file #
- # MF version: 2.17.2 #
- # Generated on 2025-08-19T21:01:04.587987 #
+ # MF version: 2.17.3 #
+ # Generated on 2025-08-21T22:44:50.438086 #
  ######################################################################################################

  from __future__ import annotations

  import typing
  if typing.TYPE_CHECKING:
- import typing
  import datetime
+ import typing
  FlowSpecDerived = typing.TypeVar("FlowSpecDerived", bound="FlowSpec", contravariant=False, covariant=False)
  StepFlag = typing.NewType("StepFlag", bool)

@@ -40,16 +40,16 @@ from .user_decorators.user_step_decorator import StepMutator as StepMutator
  from .user_decorators.user_step_decorator import user_step_decorator as user_step_decorator
  from .user_decorators.user_flow_decorator import FlowMutator as FlowMutator
  from . import metaflow_git as metaflow_git
- from . import events as events
  from . import tuple_util as tuple_util
+ from . import events as events
  from . import runner as runner
  from . import plugins as plugins
  from .plugins.datatools.s3.s3 import S3 as S3
  from . import includefile as includefile
  from .includefile import IncludeFile as IncludeFile
  from .plugins.pypi.parsers import conda_environment_yml_parser as conda_environment_yml_parser
- from .plugins.pypi.parsers import requirements_txt_parser as requirements_txt_parser
  from .plugins.pypi.parsers import pyproject_toml_parser as pyproject_toml_parser
+ from .plugins.pypi.parsers import requirements_txt_parser as requirements_txt_parser
  from . import cards as cards
  from . import client as client
  from .client.core import namespace as namespace
@@ -153,133 +153,53 @@ def step(f: typing.Union[typing.Callable[[FlowSpecDerived], None], typing.Callab
  ...

  @typing.overload
- def conda(*, packages: typing.Dict[str, str] = {}, libraries: typing.Dict[str, str] = {}, python: typing.Optional[str] = None, disabled: bool = False) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
- """
- Specifies the Conda environment for the step.
-
- Information in this decorator will augment any
- attributes set in the `@conda_base` flow-level decorator. Hence,
- you can use `@conda_base` to set packages required by all
- steps and use `@conda` to specify step-specific overrides.
-
-
- Parameters
- ----------
- packages : Dict[str, str], default {}
- Packages to use for this step. The key is the name of the package
- and the value is the version to use.
- libraries : Dict[str, str], default {}
- Supported for backward compatibility. When used with packages, packages will take precedence.
- python : str, optional, default None
- Version of Python to use, e.g. '3.7.4'. A default value of None implies
- that the version used will correspond to the version of the Python interpreter used to start the run.
- disabled : bool, default False
- If set to True, disables @conda.
- """
- ...
-
- @typing.overload
- def conda(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
- ...
-
- @typing.overload
- def conda(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
- ...
-
- def conda(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, packages: typing.Dict[str, str] = {}, libraries: typing.Dict[str, str] = {}, python: typing.Optional[str] = None, disabled: bool = False):
+ def catch(*, var: typing.Optional[str] = None, print_exception: bool = True) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
  """
- Specifies the Conda environment for the step.
-
- Information in this decorator will augment any
- attributes set in the `@conda_base` flow-level decorator. Hence,
- you can use `@conda_base` to set packages required by all
- steps and use `@conda` to specify step-specific overrides.
-
+ Specifies that the step will success under all circumstances.

- Parameters
- ----------
- packages : Dict[str, str], default {}
- Packages to use for this step. The key is the name of the package
- and the value is the version to use.
- libraries : Dict[str, str], default {}
- Supported for backward compatibility. When used with packages, packages will take precedence.
- python : str, optional, default None
- Version of Python to use, e.g. '3.7.4'. A default value of None implies
- that the version used will correspond to the version of the Python interpreter used to start the run.
- disabled : bool, default False
- If set to True, disables @conda.
- """
- ...
-
- @typing.overload
- def secrets(*, sources: typing.List[typing.Union[str, typing.Dict[str, typing.Any]]] = [], role: typing.Optional[str] = None) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
- """
- Specifies secrets to be retrieved and injected as environment variables prior to
- the execution of a step.
+ The decorator will create an optional artifact, specified by `var`, which
+ contains the exception raised. You can use it to detect the presence
+ of errors, indicating that all happy-path artifacts produced by the step
+ are missing.


  Parameters
  ----------
- sources : List[Union[str, Dict[str, Any]]], default: []
- List of secret specs, defining how the secrets are to be retrieved
- role : str, optional, default: None
- Role to use for fetching secrets
+ var : str, optional, default None
+ Name of the artifact in which to store the caught exception.
+ If not specified, the exception is not stored.
+ print_exception : bool, default True
+ Determines whether or not the exception is printed to
+ stdout when caught.
  """
  ...

  @typing.overload
- def secrets(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
+ def catch(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
  ...

  @typing.overload
- def secrets(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
- ...
-
- def secrets(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, sources: typing.List[typing.Union[str, typing.Dict[str, typing.Any]]] = [], role: typing.Optional[str] = None):
- """
- Specifies secrets to be retrieved and injected as environment variables prior to
- the execution of a step.
-
-
- Parameters
- ----------
- sources : List[Union[str, Dict[str, Any]]], default: []
- List of secret specs, defining how the secrets are to be retrieved
- role : str, optional, default: None
- Role to use for fetching secrets
- """
+ def catch(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
  ...

- @typing.overload
- def environment(*, vars: typing.Dict[str, str] = {}) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
+ def catch(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, var: typing.Optional[str] = None, print_exception: bool = True):
  """
- Specifies environment variables to be set prior to the execution of a step.
-
+ Specifies that the step will success under all circumstances.

- Parameters
- ----------
- vars : Dict[str, str], default {}
- Dictionary of environment variables to set.
- """
- ...
-
- @typing.overload
- def environment(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
- ...
-
- @typing.overload
- def environment(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
- ...
-
- def environment(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, vars: typing.Dict[str, str] = {}):
- """
- Specifies environment variables to be set prior to the execution of a step.
+ The decorator will create an optional artifact, specified by `var`, which
+ contains the exception raised. You can use it to detect the presence
+ of errors, indicating that all happy-path artifacts produced by the step
+ are missing.


  Parameters
  ----------
- vars : Dict[str, str], default {}
- Dictionary of environment variables to set.
+ var : str, optional, default None
+ Name of the artifact in which to store the caught exception.
+ If not specified, the exception is not stored.
+ print_exception : bool, default True
+ Determines whether or not the exception is printed to
+ stdout when caught.
  """
  ...

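In the hunk above, the 2.17.3 stub replaces the `@conda` and `@secrets` overloads that previously sat at this position with the `@catch` overloads; the displaced definitions reappear further down in the file. As a rough illustration of the `@catch` decorator documented in that hunk, here is a minimal hedged sketch (not part of the package diff; the flow, step, and artifact names are invented for the example):

```python
from metaflow import FlowSpec, catch, step


class CatchDemoFlow(FlowSpec):
    """Hypothetical flow illustrating the @catch stub shown above."""

    @catch(var="compute_error", print_exception=True)
    @step
    def start(self):
        # Any exception raised here is stored in the `compute_error`
        # artifact instead of failing the run; the happy-path artifact
        # `self.result` will then be missing, as the docstring describes.
        self.result = 1 / 0
        self.next(self.end)

    @step
    def end(self):
        if getattr(self, "compute_error", None):
            print("start failed:", self.compute_error)
        else:
            print("result:", self.result)


if __name__ == "__main__":
    CatchDemoFlow()
```

Run with the usual `python catch_demo.py run`, such a flow would be expected to finish even though `start` raises, which is the behavior the `@catch` docstring describes.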
@@ -343,167 +263,90 @@ def timeout(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None],
  ...

  @typing.overload
- def card(*, type: str = 'default', id: typing.Optional[str] = None, options: typing.Dict[str, typing.Any] = {}, timeout: int = 45) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
+ def parallel(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
  """
- Creates a human-readable report, a Metaflow Card, after this step completes.
-
- Note that you may add multiple `@card` decorators in a step with different parameters.
-
-
- Parameters
- ----------
- type : str, default 'default'
- Card type.
- id : str, optional, default None
- If multiple cards are present, use this id to identify this card.
- options : Dict[str, Any], default {}
- Options passed to the card. The contents depend on the card type.
- timeout : int, default 45
- Interrupt reporting if it takes more than this many seconds.
+ Decorator prototype for all step decorators. This function gets specialized
+ and imported for all decorators types by _import_plugin_decorators().
  """
  ...

  @typing.overload
- def card(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
- ...
-
- @typing.overload
- def card(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
+ def parallel(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
  ...

- def card(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, type: str = 'default', id: typing.Optional[str] = None, options: typing.Dict[str, typing.Any] = {}, timeout: int = 45):
+ def parallel(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None):
  """
- Creates a human-readable report, a Metaflow Card, after this step completes.
-
- Note that you may add multiple `@card` decorators in a step with different parameters.
-
-
- Parameters
- ----------
- type : str, default 'default'
- Card type.
- id : str, optional, default None
- If multiple cards are present, use this id to identify this card.
- options : Dict[str, Any], default {}
- Options passed to the card. The contents depend on the card type.
- timeout : int, default 45
- Interrupt reporting if it takes more than this many seconds.
+ Decorator prototype for all step decorators. This function gets specialized
+ and imported for all decorators types by _import_plugin_decorators().
  """
  ...

  @typing.overload
- def catch(*, var: typing.Optional[str] = None, print_exception: bool = True) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
+ def conda(*, packages: typing.Dict[str, str] = {}, libraries: typing.Dict[str, str] = {}, python: typing.Optional[str] = None, disabled: bool = False) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
  """
- Specifies that the step will success under all circumstances.
+ Specifies the Conda environment for the step.

- The decorator will create an optional artifact, specified by `var`, which
- contains the exception raised. You can use it to detect the presence
- of errors, indicating that all happy-path artifacts produced by the step
- are missing.
+ Information in this decorator will augment any
+ attributes set in the `@conda_base` flow-level decorator. Hence,
+ you can use `@conda_base` to set packages required by all
+ steps and use `@conda` to specify step-specific overrides.


  Parameters
  ----------
- var : str, optional, default None
- Name of the artifact in which to store the caught exception.
- If not specified, the exception is not stored.
- print_exception : bool, default True
- Determines whether or not the exception is printed to
- stdout when caught.
+ packages : Dict[str, str], default {}
+ Packages to use for this step. The key is the name of the package
+ and the value is the version to use.
+ libraries : Dict[str, str], default {}
+ Supported for backward compatibility. When used with packages, packages will take precedence.
+ python : str, optional, default None
+ Version of Python to use, e.g. '3.7.4'. A default value of None implies
+ that the version used will correspond to the version of the Python interpreter used to start the run.
+ disabled : bool, default False
+ If set to True, disables @conda.
  """
  ...

  @typing.overload
- def catch(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
+ def conda(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
  ...

  @typing.overload
- def catch(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
+ def conda(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
  ...

- def catch(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, var: typing.Optional[str] = None, print_exception: bool = True):
+ def conda(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, packages: typing.Dict[str, str] = {}, libraries: typing.Dict[str, str] = {}, python: typing.Optional[str] = None, disabled: bool = False):
  """
- Specifies that the step will success under all circumstances.
+ Specifies the Conda environment for the step.

- The decorator will create an optional artifact, specified by `var`, which
- contains the exception raised. You can use it to detect the presence
- of errors, indicating that all happy-path artifacts produced by the step
- are missing.
+ Information in this decorator will augment any
+ attributes set in the `@conda_base` flow-level decorator. Hence,
+ you can use `@conda_base` to set packages required by all
+ steps and use `@conda` to specify step-specific overrides.


  Parameters
  ----------
- var : str, optional, default None
- Name of the artifact in which to store the caught exception.
- If not specified, the exception is not stored.
- print_exception : bool, default True
- Determines whether or not the exception is printed to
- stdout when caught.
+ packages : Dict[str, str], default {}
+ Packages to use for this step. The key is the name of the package
+ and the value is the version to use.
+ libraries : Dict[str, str], default {}
+ Supported for backward compatibility. When used with packages, packages will take precedence.
+ python : str, optional, default None
+ Version of Python to use, e.g. '3.7.4'. A default value of None implies
+ that the version used will correspond to the version of the Python interpreter used to start the run.
+ disabled : bool, default False
+ If set to True, disables @conda.
  """
  ...

  @typing.overload
- def retry(*, times: int = 3, minutes_between_retries: int = 2) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
+ def resources(*, cpu: int = 1, gpu: typing.Optional[int] = None, disk: typing.Optional[int] = None, memory: int = 4096, shared_memory: typing.Optional[int] = None) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
  """
- Specifies the number of times the task corresponding
- to a step needs to be retried.
-
- This decorator is useful for handling transient errors, such as networking issues.
- If your task contains operations that can't be retried safely, e.g. database updates,
- it is advisable to annotate it with `@retry(times=0)`.
+ Specifies the resources needed when executing this step.

- This can be used in conjunction with the `@catch` decorator. The `@catch`
- decorator will execute a no-op task after all retries have been exhausted,
- ensuring that the flow execution can continue.
-
-
- Parameters
- ----------
- times : int, default 3
- Number of times to retry this task.
- minutes_between_retries : int, default 2
- Number of minutes between retries.
- """
- ...
-
- @typing.overload
- def retry(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
- ...
-
- @typing.overload
- def retry(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
- ...
-
- def retry(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, times: int = 3, minutes_between_retries: int = 2):
- """
- Specifies the number of times the task corresponding
- to a step needs to be retried.
-
- This decorator is useful for handling transient errors, such as networking issues.
- If your task contains operations that can't be retried safely, e.g. database updates,
- it is advisable to annotate it with `@retry(times=0)`.
-
- This can be used in conjunction with the `@catch` decorator. The `@catch`
- decorator will execute a no-op task after all retries have been exhausted,
- ensuring that the flow execution can continue.
-
-
- Parameters
- ----------
- times : int, default 3
- Number of times to retry this task.
- minutes_between_retries : int, default 2
- Number of minutes between retries.
- """
- ...
-
- @typing.overload
- def resources(*, cpu: int = 1, gpu: typing.Optional[int] = None, disk: typing.Optional[int] = None, memory: int = 4096, shared_memory: typing.Optional[int] = None) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
- """
- Specifies the resources needed when executing this step.
-
- Use `@resources` to specify the resource requirements
- independently of the specific compute layer (`@batch`, `@kubernetes`).
+ Use `@resources` to specify the resource requirements
+ independently of the specific compute layer (`@batch`, `@kubernetes`).

  You can choose the compute layer on the command line by executing e.g.
  ```
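The hunk above moves the `@parallel`, `@conda`, and `@resources` stubs into this position. A hedged usage sketch combining `@conda` and `@resources` as their docstrings describe (again, not part of the diff; the package pin and resource numbers are illustrative only):

```python
from metaflow import FlowSpec, conda, resources, step


class NumpyMeanFlow(FlowSpec):
    """Hypothetical flow exercising @conda and @resources as documented above."""

    @resources(cpu=2, memory=8192)        # requested independently of @batch / @kubernetes
    @conda(packages={"numpy": "1.26.4"})  # illustrative pin; keys are package names, values are versions
    @step
    def start(self):
        import numpy as np  # available because @conda provisioned it
        self.mean = float(np.arange(10).mean())
        self.next(self.end)

    @step
    def end(self):
        print("mean =", self.mean)


if __name__ == "__main__":
    NumpyMeanFlow()
```

Activating such environments with `--environment=conda` on the command line is the usual Metaflow convention rather than something this diff itself establishes.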
@@ -576,6 +419,94 @@ def resources(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None]
  """
  ...

+ @typing.overload
+ def card(*, type: str = 'default', id: typing.Optional[str] = None, options: typing.Dict[str, typing.Any] = {}, timeout: int = 45) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
+ """
+ Creates a human-readable report, a Metaflow Card, after this step completes.
+
+ Note that you may add multiple `@card` decorators in a step with different parameters.
+
+
+ Parameters
+ ----------
+ type : str, default 'default'
+ Card type.
+ id : str, optional, default None
+ If multiple cards are present, use this id to identify this card.
+ options : Dict[str, Any], default {}
+ Options passed to the card. The contents depend on the card type.
+ timeout : int, default 45
+ Interrupt reporting if it takes more than this many seconds.
+ """
+ ...
+
+ @typing.overload
+ def card(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
+ ...
+
+ @typing.overload
+ def card(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
+ ...
+
+ def card(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, type: str = 'default', id: typing.Optional[str] = None, options: typing.Dict[str, typing.Any] = {}, timeout: int = 45):
+ """
+ Creates a human-readable report, a Metaflow Card, after this step completes.
+
+ Note that you may add multiple `@card` decorators in a step with different parameters.
+
+
+ Parameters
+ ----------
+ type : str, default 'default'
+ Card type.
+ id : str, optional, default None
+ If multiple cards are present, use this id to identify this card.
+ options : Dict[str, Any], default {}
+ Options passed to the card. The contents depend on the card type.
+ timeout : int, default 45
+ Interrupt reporting if it takes more than this many seconds.
+ """
+ ...
+
+ @typing.overload
+ def secrets(*, sources: typing.List[typing.Union[str, typing.Dict[str, typing.Any]]] = [], role: typing.Optional[str] = None) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
+ """
+ Specifies secrets to be retrieved and injected as environment variables prior to
+ the execution of a step.
+
+
+ Parameters
+ ----------
+ sources : List[Union[str, Dict[str, Any]]], default: []
+ List of secret specs, defining how the secrets are to be retrieved
+ role : str, optional, default: None
+ Role to use for fetching secrets
+ """
+ ...
+
+ @typing.overload
+ def secrets(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
+ ...
+
+ @typing.overload
+ def secrets(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
+ ...
+
+ def secrets(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, sources: typing.List[typing.Union[str, typing.Dict[str, typing.Any]]] = [], role: typing.Optional[str] = None):
+ """
+ Specifies secrets to be retrieved and injected as environment variables prior to
+ the execution of a step.
+
+
+ Parameters
+ ----------
+ sources : List[Union[str, Dict[str, Any]]], default: []
+ List of secret specs, defining how the secrets are to be retrieved
+ role : str, optional, default: None
+ Role to use for fetching secrets
+ """
+ ...
+
  def kubernetes(*, cpu: int = 1, memory: int = 4096, disk: int = 10240, image: typing.Optional[str] = None, image_pull_policy: str = 'KUBERNETES_IMAGE_PULL_POLICY', image_pull_secrets: typing.List[str] = [], service_account: str = 'METAFLOW_KUBERNETES_SERVICE_ACCOUNT', secrets: typing.Optional[typing.List[str]] = None, node_selector: typing.Union[typing.Dict[str, str], str, None] = None, namespace: str = 'METAFLOW_KUBERNETES_NAMESPACE', gpu: typing.Optional[int] = None, gpu_vendor: str = 'KUBERNETES_GPU_VENDOR', tolerations: typing.List[typing.Dict[str, str]] = [], labels: typing.Dict[str, str] = 'METAFLOW_KUBERNETES_LABELS', annotations: typing.Dict[str, str] = 'METAFLOW_KUBERNETES_ANNOTATIONS', use_tmpfs: bool = False, tmpfs_tempdir: bool = True, tmpfs_size: typing.Optional[int] = None, tmpfs_path: typing.Optional[str] = '/metaflow_temp', persistent_volume_claims: typing.Optional[typing.Dict[str, str]] = None, shared_memory: typing.Optional[int] = None, port: typing.Optional[int] = None, compute_pool: typing.Optional[str] = None, hostname_resolution_timeout: int = 600, qos: str = 'Burstable', security_context: typing.Optional[typing.Dict[str, typing.Any]] = None) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
  """
  Specifies that this step should execute on Kubernetes.
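The hunk above adds the `@card` and `@secrets` stubs ahead of the `@kubernetes` stub. A small hedged sketch of how those two step decorators are commonly combined, assuming a secret source named `example-secret-source` that exposes an `EXAMPLE_TOKEN` variable (both hypothetical; none of this is part of the diff):

```python
from metaflow import FlowSpec, card, current, secrets, step
from metaflow.cards import Markdown


class ReportFlow(FlowSpec):
    """Hypothetical flow combining the @card and @secrets stubs shown above."""

    @secrets(sources=["example-secret-source"])  # hypothetical secret spec
    @card(type="default", timeout=45)
    @step
    def start(self):
        import os
        # Secrets are injected as environment variables before the step runs.
        self.has_token = "EXAMPLE_TOKEN" in os.environ
        # Append content to the card generated for this step.
        current.card.append(Markdown(f"# Token present: {self.has_token}"))
        self.next(self.end)

    @step
    def end(self):
        pass


if __name__ == "__main__":
    ReportFlow()
```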
@@ -665,76 +596,6 @@ def kubernetes(*, cpu: int = 1, memory: int = 4096, disk: int = 10240, image: ty
  """
  ...

- @typing.overload
- def pypi(*, packages: typing.Dict[str, str] = {}, python: typing.Optional[str] = None) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
- """
- Specifies the PyPI packages for the step.
-
- Information in this decorator will augment any
- attributes set in the `@pyi_base` flow-level decorator. Hence,
- you can use `@pypi_base` to set packages required by all
- steps and use `@pypi` to specify step-specific overrides.
-
-
- Parameters
- ----------
- packages : Dict[str, str], default: {}
- Packages to use for this step. The key is the name of the package
- and the value is the version to use.
- python : str, optional, default: None
- Version of Python to use, e.g. '3.7.4'. A default value of None implies
- that the version used will correspond to the version of the Python interpreter used to start the run.
- """
- ...
-
- @typing.overload
- def pypi(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
- ...
-
- @typing.overload
- def pypi(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
- ...
-
- def pypi(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, packages: typing.Dict[str, str] = {}, python: typing.Optional[str] = None):
- """
- Specifies the PyPI packages for the step.
-
- Information in this decorator will augment any
- attributes set in the `@pyi_base` flow-level decorator. Hence,
- you can use `@pypi_base` to set packages required by all
- steps and use `@pypi` to specify step-specific overrides.
-
-
- Parameters
- ----------
- packages : Dict[str, str], default: {}
- Packages to use for this step. The key is the name of the package
- and the value is the version to use.
- python : str, optional, default: None
- Version of Python to use, e.g. '3.7.4'. A default value of None implies
- that the version used will correspond to the version of the Python interpreter used to start the run.
- """
- ...
-
- @typing.overload
- def parallel(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
- """
- Decorator prototype for all step decorators. This function gets specialized
- and imported for all decorators types by _import_plugin_decorators().
- """
- ...
-
- @typing.overload
- def parallel(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
- ...
-
- def parallel(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None):
- """
- Decorator prototype for all step decorators. This function gets specialized
- and imported for all decorators types by _import_plugin_decorators().
- """
- ...
-
  @typing.overload
  def batch(*, cpu: int = 1, gpu: int = 0, memory: int = 4096, image: typing.Optional[str] = None, queue: str = 'METAFLOW_BATCH_JOB_QUEUE', iam_role: str = 'METAFLOW_ECS_S3_ACCESS_IAM_ROLE', execution_role: str = 'METAFLOW_ECS_FARGATE_EXECUTION_ROLE', shared_memory: typing.Optional[int] = None, max_swap: typing.Optional[int] = None, swappiness: typing.Optional[int] = None, use_tmpfs: bool = False, tmpfs_tempdir: bool = True, tmpfs_size: typing.Optional[int] = None, tmpfs_path: typing.Optional[str] = None, inferentia: int = 0, trainium: int = None, efa: int = 0, ephemeral_storage: int = None, log_driver: typing.Optional[str] = None, log_options: typing.Optional[typing.List[str]] = None) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
  """
@@ -885,55 +746,373 @@ def batch(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], ty
885
746
  ...
886
747
 
887
748
  @typing.overload
888
- def trigger_on_finish(*, flow: typing.Union[typing.Dict[str, str], str, None] = None, flows: typing.List[typing.Union[str, typing.Dict[str, str]]] = [], options: typing.Dict[str, typing.Any] = {}) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
749
+ def pypi(*, packages: typing.Dict[str, str] = {}, python: typing.Optional[str] = None) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
889
750
  """
890
- Specifies the flow(s) that this flow depends on.
891
-
892
- ```
893
- @trigger_on_finish(flow='FooFlow')
894
- ```
895
- or
896
- ```
897
- @trigger_on_finish(flows=['FooFlow', 'BarFlow'])
898
- ```
899
- This decorator respects the @project decorator and triggers the flow
900
- when upstream runs within the same namespace complete successfully
901
-
902
- Additionally, you can specify project aware upstream flow dependencies
903
- by specifying the fully qualified project_flow_name.
904
- ```
905
- @trigger_on_finish(flow='my_project.branch.my_branch.FooFlow')
906
- ```
907
- or
908
- ```
909
- @trigger_on_finish(flows=['my_project.branch.my_branch.FooFlow', 'BarFlow'])
910
- ```
911
-
912
- You can also specify just the project or project branch (other values will be
913
- inferred from the current project or project branch):
914
- ```
915
- @trigger_on_finish(flow={"name": "FooFlow", "project": "my_project", "project_branch": "branch"})
916
- ```
751
+ Specifies the PyPI packages for the step.
917
752
 
918
- Note that `branch` is typically one of:
919
- - `prod`
920
- - `user.bob`
921
- - `test.my_experiment`
922
- - `prod.staging`
753
+ Information in this decorator will augment any
754
+ attributes set in the `@pyi_base` flow-level decorator. Hence,
755
+ you can use `@pypi_base` to set packages required by all
756
+ steps and use `@pypi` to specify step-specific overrides.
923
757
 
924
758
 
925
759
  Parameters
926
760
  ----------
927
- flow : Union[str, Dict[str, str]], optional, default None
928
- Upstream flow dependency for this flow.
929
- flows : List[Union[str, Dict[str, str]]], default []
930
- Upstream flow dependencies for this flow.
931
- options : Dict[str, Any], default {}
932
- Backend-specific configuration for tuning eventing behavior.
933
- """
934
- ...
935
-
936
- @typing.overload
761
+ packages : Dict[str, str], default: {}
762
+ Packages to use for this step. The key is the name of the package
763
+ and the value is the version to use.
764
+ python : str, optional, default: None
765
+ Version of Python to use, e.g. '3.7.4'. A default value of None implies
766
+ that the version used will correspond to the version of the Python interpreter used to start the run.
767
+ """
768
+ ...
769
+
770
+ @typing.overload
771
+ def pypi(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
772
+ ...
773
+
774
+ @typing.overload
775
+ def pypi(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
776
+ ...
777
+
778
+ def pypi(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, packages: typing.Dict[str, str] = {}, python: typing.Optional[str] = None):
779
+ """
780
+ Specifies the PyPI packages for the step.
781
+
782
+ Information in this decorator will augment any
783
+ attributes set in the `@pyi_base` flow-level decorator. Hence,
784
+ you can use `@pypi_base` to set packages required by all
785
+ steps and use `@pypi` to specify step-specific overrides.
786
+
787
+
788
+ Parameters
789
+ ----------
790
+ packages : Dict[str, str], default: {}
791
+ Packages to use for this step. The key is the name of the package
792
+ and the value is the version to use.
793
+ python : str, optional, default: None
794
+ Version of Python to use, e.g. '3.7.4'. A default value of None implies
795
+ that the version used will correspond to the version of the Python interpreter used to start the run.
796
+ """
797
+ ...
798
+
799
+ @typing.overload
800
+ def retry(*, times: int = 3, minutes_between_retries: int = 2) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
801
+ """
802
+ Specifies the number of times the task corresponding
803
+ to a step needs to be retried.
804
+
805
+ This decorator is useful for handling transient errors, such as networking issues.
806
+ If your task contains operations that can't be retried safely, e.g. database updates,
807
+ it is advisable to annotate it with `@retry(times=0)`.
808
+
809
+ This can be used in conjunction with the `@catch` decorator. The `@catch`
810
+ decorator will execute a no-op task after all retries have been exhausted,
811
+ ensuring that the flow execution can continue.
812
+
813
+
814
+ Parameters
815
+ ----------
816
+ times : int, default 3
817
+ Number of times to retry this task.
818
+ minutes_between_retries : int, default 2
819
+ Number of minutes between retries.
820
+ """
821
+ ...
822
+
823
+ @typing.overload
824
+ def retry(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
825
+ ...
826
+
827
+ @typing.overload
828
+ def retry(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
829
+ ...
830
+
831
+ def retry(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, times: int = 3, minutes_between_retries: int = 2):
832
+ """
833
+ Specifies the number of times the task corresponding
834
+ to a step needs to be retried.
835
+
836
+ This decorator is useful for handling transient errors, such as networking issues.
837
+ If your task contains operations that can't be retried safely, e.g. database updates,
838
+ it is advisable to annotate it with `@retry(times=0)`.
839
+
840
+ This can be used in conjunction with the `@catch` decorator. The `@catch`
841
+ decorator will execute a no-op task after all retries have been exhausted,
842
+ ensuring that the flow execution can continue.
843
+
844
+
845
+ Parameters
846
+ ----------
847
+ times : int, default 3
848
+ Number of times to retry this task.
849
+ minutes_between_retries : int, default 2
850
+ Number of minutes between retries.
851
+ """
852
+ ...
853
+
854
+ @typing.overload
855
+ def environment(*, vars: typing.Dict[str, str] = {}) -> typing.Callable[[typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]], typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]]]:
856
+ """
857
+ Specifies environment variables to be set prior to the execution of a step.
858
+
859
+
860
+ Parameters
861
+ ----------
862
+ vars : Dict[str, str], default {}
863
+ Dictionary of environment variables to set.
864
+ """
865
+ ...
866
+
867
+ @typing.overload
868
+ def environment(f: typing.Callable[[FlowSpecDerived, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, StepFlag], None]:
869
+ ...
870
+
871
+ @typing.overload
872
+ def environment(f: typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]) -> typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None]:
873
+ ...
874
+
875
+ def environment(f: typing.Union[typing.Callable[[FlowSpecDerived, StepFlag], None], typing.Callable[[FlowSpecDerived, typing.Any, StepFlag], None], None] = None, *, vars: typing.Dict[str, str] = {}):
876
+ """
877
+ Specifies environment variables to be set prior to the execution of a step.
878
+
879
+
880
+ Parameters
881
+ ----------
882
+ vars : Dict[str, str], default {}
883
+ Dictionary of environment variables to set.
884
+ """
885
+ ...
886
+
887
+ @typing.overload
888
+ def trigger(*, event: typing.Union[str, typing.Dict[str, typing.Any], None] = None, events: typing.List[typing.Union[str, typing.Dict[str, typing.Any]]] = [], options: typing.Dict[str, typing.Any] = {}) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
889
+ """
890
+ Specifies the event(s) that this flow depends on.
891
+
892
+ ```
893
+ @trigger(event='foo')
894
+ ```
895
+ or
896
+ ```
897
+ @trigger(events=['foo', 'bar'])
898
+ ```
899
+
900
+ Additionally, you can specify the parameter mappings
901
+ to map event payload to Metaflow parameters for the flow.
902
+ ```
903
+ @trigger(event={'name':'foo', 'parameters':{'flow_param': 'event_field'}})
904
+ ```
905
+ or
906
+ ```
907
+ @trigger(events=[{'name':'foo', 'parameters':{'flow_param_1': 'event_field_1'},
908
+ {'name':'bar', 'parameters':{'flow_param_2': 'event_field_2'}])
909
+ ```
910
+
911
+ 'parameters' can also be a list of strings and tuples like so:
912
+ ```
913
+ @trigger(event={'name':'foo', 'parameters':['common_name', ('flow_param', 'event_field')]})
914
+ ```
915
+ This is equivalent to:
916
+ ```
917
+ @trigger(event={'name':'foo', 'parameters':{'common_name': 'common_name', 'flow_param': 'event_field'}})
918
+ ```
919
+
920
+
921
+ Parameters
922
+ ----------
923
+ event : Union[str, Dict[str, Any]], optional, default None
924
+ Event dependency for this flow.
925
+ events : List[Union[str, Dict[str, Any]]], default []
926
+ Events dependency for this flow.
927
+ options : Dict[str, Any], default {}
928
+ Backend-specific configuration for tuning eventing behavior.
929
+ """
930
+ ...
931
+
932
+ @typing.overload
933
+ def trigger(f: typing.Type[FlowSpecDerived]) -> typing.Type[FlowSpecDerived]:
934
+ ...
935
+
936
+ def trigger(f: typing.Optional[typing.Type[FlowSpecDerived]] = None, *, event: typing.Union[str, typing.Dict[str, typing.Any], None] = None, events: typing.List[typing.Union[str, typing.Dict[str, typing.Any]]] = [], options: typing.Dict[str, typing.Any] = {}):
937
+ """
938
+ Specifies the event(s) that this flow depends on.
939
+
940
+ ```
941
+ @trigger(event='foo')
942
+ ```
943
+ or
944
+ ```
945
+ @trigger(events=['foo', 'bar'])
946
+ ```
947
+
948
+ Additionally, you can specify the parameter mappings
949
+ to map event payload to Metaflow parameters for the flow.
950
+ ```
951
+ @trigger(event={'name':'foo', 'parameters':{'flow_param': 'event_field'}})
952
+ ```
953
+ or
954
+ ```
955
+ @trigger(events=[{'name':'foo', 'parameters':{'flow_param_1': 'event_field_1'},
956
+ {'name':'bar', 'parameters':{'flow_param_2': 'event_field_2'}])
957
+ ```
958
+
959
+ 'parameters' can also be a list of strings and tuples like so:
960
+ ```
961
+ @trigger(event={'name':'foo', 'parameters':['common_name', ('flow_param', 'event_field')]})
962
+ ```
963
+ This is equivalent to:
964
+ ```
965
+ @trigger(event={'name':'foo', 'parameters':{'common_name': 'common_name', 'flow_param': 'event_field'}})
966
+ ```
967
+
968
+
969
+ Parameters
970
+ ----------
971
+ event : Union[str, Dict[str, Any]], optional, default None
972
+ Event dependency for this flow.
973
+ events : List[Union[str, Dict[str, Any]]], default []
974
+ Events dependency for this flow.
975
+ options : Dict[str, Any], default {}
976
+ Backend-specific configuration for tuning eventing behavior.
977
+ """
978
+ ...
979
+
980
+ @typing.overload
981
+ def schedule(*, hourly: bool = False, daily: bool = True, weekly: bool = False, cron: typing.Optional[str] = None, timezone: typing.Optional[str] = None) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
982
+ """
983
+ Specifies the times when the flow should be run when running on a
984
+ production scheduler.
985
+
986
+
987
+ Parameters
988
+ ----------
989
+ hourly : bool, default False
990
+ Run the workflow hourly.
991
+ daily : bool, default True
992
+ Run the workflow daily.
993
+ weekly : bool, default False
994
+ Run the workflow weekly.
995
+ cron : str, optional, default None
996
+ Run the workflow at [a custom Cron schedule](https://docs.aws.amazon.com/eventbridge/latest/userguide/scheduled-events.html#cron-expressions)
997
+ specified by this expression.
998
+ timezone : str, optional, default None
999
+ Timezone on which the schedule runs (default: None). Currently supported only for Argo workflows,
1000
+ which accepts timezones in [IANA format](https://nodatime.org/TimeZones).
1001
+ """
1002
+ ...
1003
+
1004
+ @typing.overload
1005
+ def schedule(f: typing.Type[FlowSpecDerived]) -> typing.Type[FlowSpecDerived]:
1006
+ ...
1007
+
1008
+ def schedule(f: typing.Optional[typing.Type[FlowSpecDerived]] = None, *, hourly: bool = False, daily: bool = True, weekly: bool = False, cron: typing.Optional[str] = None, timezone: typing.Optional[str] = None):
1009
+ """
1010
+ Specifies the times when the flow should be run when running on a
1011
+ production scheduler.
1012
+
1013
+
1014
+ Parameters
1015
+ ----------
1016
+ hourly : bool, default False
1017
+ Run the workflow hourly.
1018
+ daily : bool, default True
1019
+ Run the workflow daily.
1020
+ weekly : bool, default False
1021
+ Run the workflow weekly.
1022
+ cron : str, optional, default None
1023
+ Run the workflow at [a custom Cron schedule](https://docs.aws.amazon.com/eventbridge/latest/userguide/scheduled-events.html#cron-expressions)
1024
+ specified by this expression.
1025
+ timezone : str, optional, default None
1026
+ Timezone on which the schedule runs (default: None). Currently supported only for Argo workflows,
1027
+ which accepts timezones in [IANA format](https://nodatime.org/TimeZones).
1028
+ """
1029
+ ...
1030
+
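Likewise, a minimal sketch of `@schedule` as documented above. The cron expression, timezone, and flow name are illustrative; the exact cron dialect follows the backing scheduler, and per the docstring the `timezone` argument is currently honored only on Argo Workflows.

```
from metaflow import FlowSpec, schedule, step

# Illustrative schedule: run nightly at 02:30 in the given IANA timezone.
@schedule(cron='30 2 * * *', timezone='Europe/Helsinki')
class NightlyFlow(FlowSpec):

    @step
    def start(self):
        self.next(self.end)

    @step
    def end(self):
        pass


if __name__ == '__main__':
    NightlyFlow()
```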
1031
+ def project(*, name: str, branch: typing.Optional[str] = None, production: bool = False) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
1032
+ """
1033
+ Specifies what flows belong to the same project.
1034
+
1035
+ A project-specific namespace is created for all flows that
1036
+ use the same `@project(name)`.
1037
+
1038
+
1039
+ Parameters
1040
+ ----------
1041
+ name : str
1042
+ Project name. Make sure that the name is unique amongst all
1043
+ projects that use the same production scheduler. The name may
1044
+ contain only lowercase alphanumeric characters and underscores.
1045
+
1046
+ branch : Optional[str], default None
1047
+ The branch to use. If not specified, the branch is set to
1048
+ `user.<username>` unless `production` is set to `True`. This can
1049
+ also be set on the command line using `--branch` as a top-level option.
1050
+ It is an error to specify `branch` in the decorator and on the command line.
1051
+
1052
+ production : bool, default False
1053
+ Whether or not the branch is the production branch. This can also be set on the
1054
+ command line using `--production` as a top-level option. It is an error to specify
1055
+ `production` in the decorator and on the command line.
1056
+ The project branch name will be:
1057
+ - if `branch` is specified:
1058
+ - if `production` is True: `prod.<branch>`
1059
+ - if `production` is False: `test.<branch>`
1060
+ - if `branch` is not specified:
1061
+ - if `production` is True: `prod`
1062
+ - if `production` is False: `user.<username>`
1063
+ """
1064
+ ...
1065
+
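A minimal sketch of `@project` as documented above; the project name is an illustrative assumption. Deployed as-is the flow lands in the `user.<username>` branch, with `--production` the branch becomes `prod`, and `--branch my_branch --production` yields `prod.my_branch`, matching the resolution rules listed in the docstring.

```
from metaflow import FlowSpec, project, step

# Flows that share @project(name='fraud_detection') share one namespace on
# the production scheduler, so branches of the same project do not collide.
@project(name='fraud_detection')
class TrainingFlow(FlowSpec):

    @step
    def start(self):
        self.next(self.end)

    @step
    def end(self):
        pass


if __name__ == '__main__':
    TrainingFlow()
```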
1066
+ @typing.overload
1067
+ def trigger_on_finish(*, flow: typing.Union[typing.Dict[str, str], str, None] = None, flows: typing.List[typing.Union[str, typing.Dict[str, str]]] = [], options: typing.Dict[str, typing.Any] = {}) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
1068
+ """
1069
+ Specifies the flow(s) that this flow depends on.
1070
+
1071
+ ```
1072
+ @trigger_on_finish(flow='FooFlow')
1073
+ ```
1074
+ or
1075
+ ```
1076
+ @trigger_on_finish(flows=['FooFlow', 'BarFlow'])
1077
+ ```
1078
+ This decorator respects the @project decorator and triggers the flow
1079
+ when upstream runs within the same namespace complete successfully.
1080
+
1081
+ Additionally, you can declare project-aware upstream flow dependencies
1082
+ by specifying the fully qualified project_flow_name.
1083
+ ```
1084
+ @trigger_on_finish(flow='my_project.branch.my_branch.FooFlow')
1085
+ ```
1086
+ or
1087
+ ```
1088
+ @trigger_on_finish(flows=['my_project.branch.my_branch.FooFlow', 'BarFlow'])
1089
+ ```
1090
+
1091
+ You can also specify just the project or project branch (other values will be
1092
+ inferred from the current project or project branch):
1093
+ ```
1094
+ @trigger_on_finish(flow={"name": "FooFlow", "project": "my_project", "project_branch": "branch"})
1095
+ ```
1096
+
1097
+ Note that `branch` is typically one of:
1098
+ - `prod`
1099
+ - `user.bob`
1100
+ - `test.my_experiment`
1101
+ - `prod.staging`
1102
+
1103
+
1104
+ Parameters
1105
+ ----------
1106
+ flow : Union[str, Dict[str, str]], optional, default None
1107
+ Upstream flow dependency for this flow.
1108
+ flows : List[Union[str, Dict[str, str]]], default []
1109
+ Upstream flow dependencies for this flow.
1110
+ options : Dict[str, Any], default {}
1111
+ Backend-specific configuration for tuning eventing behavior.
1112
+ """
1113
+ ...
1114
+
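To ground the flow-to-flow triggering described above, a minimal sketch pairing `@trigger_on_finish` with `@project`; the flow and project names are illustrative assumptions.

```
from metaflow import FlowSpec, project, step, trigger_on_finish

# ScoringFlow runs whenever TrainingFlow in the same project branch
# (e.g. user.bob while testing, prod once deployed with --production)
# finishes successfully.
@project(name='fraud_detection')
@trigger_on_finish(flow='TrainingFlow')
class ScoringFlow(FlowSpec):

    @step
    def start(self):
        self.next(self.end)

    @step
    def end(self):
        pass


if __name__ == '__main__':
    ScoringFlow()
```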
1115
+ @typing.overload
937
1116
  def trigger_on_finish(f: typing.Type[FlowSpecDerived]) -> typing.Type[FlowSpecDerived]:
938
1117
  ...
939
1118
 
@@ -1026,89 +1205,46 @@ def pypi_base(f: typing.Optional[typing.Type[FlowSpecDerived]] = None, *, packag
1026
1205
  """
1027
1206
  ...
1028
1207
 
1029
- @typing.overload
1030
- def schedule(*, hourly: bool = False, daily: bool = True, weekly: bool = False, cron: typing.Optional[str] = None, timezone: typing.Optional[str] = None) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
1031
- """
1032
- Specifies the times when the flow should be run when running on a
1033
- production scheduler.
1034
-
1035
-
1036
- Parameters
1037
- ----------
1038
- hourly : bool, default False
1039
- Run the workflow hourly.
1040
- daily : bool, default True
1041
- Run the workflow daily.
1042
- weekly : bool, default False
1043
- Run the workflow weekly.
1044
- cron : str, optional, default None
1045
- Run the workflow at [a custom Cron schedule](https://docs.aws.amazon.com/eventbridge/latest/userguide/scheduled-events.html#cron-expressions)
1046
- specified by this expression.
1047
- timezone : str, optional, default None
1048
- Timezone on which the schedule runs (default: None). Currently supported only for Argo workflows,
1049
- which accepts timezones in [IANA format](https://nodatime.org/TimeZones).
1050
- """
1051
- ...
1052
-
1053
- @typing.overload
1054
- def schedule(f: typing.Type[FlowSpecDerived]) -> typing.Type[FlowSpecDerived]:
1055
- ...
1056
-
1057
- def schedule(f: typing.Optional[typing.Type[FlowSpecDerived]] = None, *, hourly: bool = False, daily: bool = True, weekly: bool = False, cron: typing.Optional[str] = None, timezone: typing.Optional[str] = None):
1058
- """
1059
- Specifies the times when the flow should be run when running on a
1060
- production scheduler.
1061
-
1062
-
1063
- Parameters
1064
- ----------
1065
- hourly : bool, default False
1066
- Run the workflow hourly.
1067
- daily : bool, default True
1068
- Run the workflow daily.
1069
- weekly : bool, default False
1070
- Run the workflow weekly.
1071
- cron : str, optional, default None
1072
- Run the workflow at [a custom Cron schedule](https://docs.aws.amazon.com/eventbridge/latest/userguide/scheduled-events.html#cron-expressions)
1073
- specified by this expression.
1074
- timezone : str, optional, default None
1075
- Timezone on which the schedule runs (default: None). Currently supported only for Argo workflows,
1076
- which accepts timezones in [IANA format](https://nodatime.org/TimeZones).
1077
- """
1078
- ...
1079
-
1080
- def project(*, name: str, branch: typing.Optional[str] = None, production: bool = False) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
1208
+ def airflow_s3_key_sensor(*, timeout: int, poke_interval: int, mode: str, exponential_backoff: bool, pool: str, soft_fail: bool, name: str, description: str, bucket_key: typing.Union[str, typing.List[str]], bucket_name: str, wildcard_match: bool, aws_conn_id: str, verify: bool) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
1081
1209
  """
1082
- Specifies what flows belong to the same project.
1083
-
1084
- A project-specific namespace is created for all flows that
1085
- use the same `@project(name)`.
1210
+ The `@airflow_s3_key_sensor` decorator attaches an Airflow [S3KeySensor](https://airflow.apache.org/docs/apache-airflow-providers-amazon/stable/_api/airflow/providers/amazon/aws/sensors/s3/index.html#airflow.providers.amazon.aws.sensors.s3.S3KeySensor)
1211
+ before the start step of the flow. This decorator only works when a flow is scheduled on Airflow
1212
+ and is compiled using `airflow create`. More than one `@airflow_s3_key_sensor` can be
1213
+ added as flow decorators. Adding more than one decorator will ensure that the `start` step
1214
+ starts only after all sensors finish.
1086
1215
 
1087
1216
 
1088
1217
  Parameters
1089
1218
  ----------
1219
+ timeout : int
1220
+ Time, in seconds, before the task times out and fails. (Default: 3600)
1221
+ poke_interval : int
1222
+ Time in seconds that the job should wait in between each try. (Default: 60)
1223
+ mode : str
1224
+ How the sensor operates. Options are: { poke | reschedule }. (Default: "poke")
1225
+ exponential_backoff : bool
1226
+ Allow progressively longer waits between pokes by using an exponential backoff algorithm. (Default: True)
1227
+ pool : str
1228
+ The slot pool this task should run in;
1229
+ slot pools are a way to limit concurrency for certain tasks. (Default: None)
1230
+ soft_fail : bool
1231
+ Set to true to mark the task as SKIPPED on failure. (Default: False)
1090
1232
  name : str
1091
- Project name. Make sure that the name is unique amongst all
1092
- projects that use the same production scheduler. The name may
1093
- contain only lowercase alphanumeric characters and underscores.
1094
-
1095
- branch : Optional[str], default None
1096
- The branch to use. If not specified, the branch is set to
1097
- `user.<username>` unless `production` is set to `True`. This can
1098
- also be set on the command line using `--branch` as a top-level option.
1099
- It is an error to specify `branch` in the decorator and on the command line.
1100
-
1101
- production : bool, default False
1102
- Whether or not the branch is the production branch. This can also be set on the
1103
- command line using `--production` as a top-level option. It is an error to specify
1104
- `production` in the decorator and on the command line.
1105
- The project branch name will be:
1106
- - if `branch` is specified:
1107
- - if `production` is True: `prod.<branch>`
1108
- - if `production` is False: `test.<branch>`
1109
- - if `branch` is not specified:
1110
- - if `production` is True: `prod`
1111
- - if `production` is False: `user.<username>`
1233
+ Name of the sensor on Airflow.
1234
+ description : str
1235
+ Description of the sensor in the Airflow UI.
1236
+ bucket_key : Union[str, List[str]]
1237
+ The key(s) being waited on. Supports a full s3:// style URL or a relative path from the root level.
1238
+ When specified as a full s3:// URL, leave `bucket_name` as None.
1239
+ bucket_name : str
1240
+ Name of the S3 bucket. Only needed when bucket_key is not provided as a full s3:// URL.
1241
+ When specified, all the keys passed to bucket_key refer to this bucket. (Default: None)
1242
+ wildcard_match : bool
1243
+ Whether the bucket_key should be interpreted as a Unix wildcard pattern. (Default: False)
1244
+ aws_conn_id : str
1245
+ A reference to the S3 connection on Airflow. (Default: None)
1246
+ verify : bool
1247
+ Whether or not to verify SSL certificates for the S3 connection. (Default: None)
1112
1248
  """
1113
1249
  ...
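An illustrative sketch of `@airflow_s3_key_sensor` as documented above; the bucket, key, and sensor name are placeholders, and arguments not shown rely on the defaults listed in the docstring. Per that docstring, the sensor takes effect only when the flow is compiled with `airflow create`.

```
from metaflow import FlowSpec, airflow_s3_key_sensor, step

# Gate the start step on a daily export landing in S3. Because bucket_key is
# a full s3:// URL, bucket_name is left unset, as the docstring advises.
@airflow_s3_key_sensor(bucket_key='s3://example-bucket/exports/daily.csv',
                       timeout=3600,
                       poke_interval=60,
                       name='wait_for_daily_export',
                       description='Wait for the upstream export to land')
class SensorGatedFlow(FlowSpec):

    @step
    def start(self):
        self.next(self.end)

    @step
    def end(self):
        pass


if __name__ == '__main__':
    SensorGatedFlow()
```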
1114
1250
 
@@ -1163,99 +1299,6 @@ def conda_base(f: typing.Optional[typing.Type[FlowSpecDerived]] = None, *, packa
1163
1299
  """
1164
1300
  ...
1165
1301
 
1166
- @typing.overload
1167
- def trigger(*, event: typing.Union[str, typing.Dict[str, typing.Any], None] = None, events: typing.List[typing.Union[str, typing.Dict[str, typing.Any]]] = [], options: typing.Dict[str, typing.Any] = {}) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
1168
- """
1169
- Specifies the event(s) that this flow depends on.
1170
-
1171
- ```
1172
- @trigger(event='foo')
1173
- ```
1174
- or
1175
- ```
1176
- @trigger(events=['foo', 'bar'])
1177
- ```
1178
-
1179
- Additionally, you can specify the parameter mappings
1180
- to map event payload to Metaflow parameters for the flow.
1181
- ```
1182
- @trigger(event={'name':'foo', 'parameters':{'flow_param': 'event_field'}})
1183
- ```
1184
- or
1185
- ```
1186
- @trigger(events=[{'name':'foo', 'parameters':{'flow_param_1': 'event_field_1'},
1187
- {'name':'bar', 'parameters':{'flow_param_2': 'event_field_2'}])
1188
- ```
1189
-
1190
- 'parameters' can also be a list of strings and tuples like so:
1191
- ```
1192
- @trigger(event={'name':'foo', 'parameters':['common_name', ('flow_param', 'event_field')]})
1193
- ```
1194
- This is equivalent to:
1195
- ```
1196
- @trigger(event={'name':'foo', 'parameters':{'common_name': 'common_name', 'flow_param': 'event_field'}})
1197
- ```
1198
-
1199
-
1200
- Parameters
1201
- ----------
1202
- event : Union[str, Dict[str, Any]], optional, default None
1203
- Event dependency for this flow.
1204
- events : List[Union[str, Dict[str, Any]]], default []
1205
- Events dependency for this flow.
1206
- options : Dict[str, Any], default {}
1207
- Backend-specific configuration for tuning eventing behavior.
1208
- """
1209
- ...
1210
-
1211
- @typing.overload
1212
- def trigger(f: typing.Type[FlowSpecDerived]) -> typing.Type[FlowSpecDerived]:
1213
- ...
1214
-
1215
- def trigger(f: typing.Optional[typing.Type[FlowSpecDerived]] = None, *, event: typing.Union[str, typing.Dict[str, typing.Any], None] = None, events: typing.List[typing.Union[str, typing.Dict[str, typing.Any]]] = [], options: typing.Dict[str, typing.Any] = {}):
1216
- """
1217
- Specifies the event(s) that this flow depends on.
1218
-
1219
- ```
1220
- @trigger(event='foo')
1221
- ```
1222
- or
1223
- ```
1224
- @trigger(events=['foo', 'bar'])
1225
- ```
1226
-
1227
- Additionally, you can specify the parameter mappings
1228
- to map event payload to Metaflow parameters for the flow.
1229
- ```
1230
- @trigger(event={'name':'foo', 'parameters':{'flow_param': 'event_field'}})
1231
- ```
1232
- or
1233
- ```
1234
- @trigger(events=[{'name':'foo', 'parameters':{'flow_param_1': 'event_field_1'},
1235
- {'name':'bar', 'parameters':{'flow_param_2': 'event_field_2'}])
1236
- ```
1237
-
1238
- 'parameters' can also be a list of strings and tuples like so:
1239
- ```
1240
- @trigger(event={'name':'foo', 'parameters':['common_name', ('flow_param', 'event_field')]})
1241
- ```
1242
- This is equivalent to:
1243
- ```
1244
- @trigger(event={'name':'foo', 'parameters':{'common_name': 'common_name', 'flow_param': 'event_field'}})
1245
- ```
1246
-
1247
-
1248
- Parameters
1249
- ----------
1250
- event : Union[str, Dict[str, Any]], optional, default None
1251
- Event dependency for this flow.
1252
- events : List[Union[str, Dict[str, Any]]], default []
1253
- Events dependency for this flow.
1254
- options : Dict[str, Any], default {}
1255
- Backend-specific configuration for tuning eventing behavior.
1256
- """
1257
- ...
1258
-
1259
1302
  def airflow_external_task_sensor(*, timeout: int, poke_interval: int, mode: str, exponential_backoff: bool, pool: str, soft_fail: bool, name: str, description: str, external_dag_id: str, external_task_ids: typing.List[str], allowed_states: typing.List[str], failed_states: typing.List[str], execution_delta: "datetime.timedelta", check_existence: bool) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
1260
1303
  """
1261
1304
  The `@airflow_external_task_sensor` decorator attaches an Airflow [ExternalTaskSensor](https://airflow.apache.org/docs/apache-airflow/stable/_api/airflow/sensors/external_task/index.html#airflow.sensors.external_task.ExternalTaskSensor) before the start step of the flow.
@@ -1299,46 +1342,3 @@ def airflow_external_task_sensor(*, timeout: int, poke_interval: int, mode: str,
1299
1342
  """
1300
1343
  ...
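Similarly, a minimal sketch of `@airflow_external_task_sensor` based on the signature above (its parameter documentation is elided by this hunk); the DAG id, task id, and sensor name are illustrative assumptions, and omitted arguments are left to the decorator's defaults.

```
from metaflow import FlowSpec, airflow_external_task_sensor, step

# Block the start step until the 'publish_table' task of the external
# 'upstream_etl' Airflow DAG has completed.
@airflow_external_task_sensor(external_dag_id='upstream_etl',
                              external_task_ids=['publish_table'],
                              timeout=3600,
                              name='wait_for_upstream_etl',
                              description='Wait for the upstream ETL DAG')
class DownstreamFlow(FlowSpec):

    @step
    def start(self):
        self.next(self.end)

    @step
    def end(self):
        pass


if __name__ == '__main__':
    DownstreamFlow()
```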
1301
1344
 
1302
- def airflow_s3_key_sensor(*, timeout: int, poke_interval: int, mode: str, exponential_backoff: bool, pool: str, soft_fail: bool, name: str, description: str, bucket_key: typing.Union[str, typing.List[str]], bucket_name: str, wildcard_match: bool, aws_conn_id: str, verify: bool) -> typing.Callable[[typing.Type[FlowSpecDerived]], typing.Type[FlowSpecDerived]]:
1303
- """
1304
- The `@airflow_s3_key_sensor` decorator attaches a Airflow [S3KeySensor](https://airflow.apache.org/docs/apache-airflow-providers-amazon/stable/_api/airflow/providers/amazon/aws/sensors/s3/index.html#airflow.providers.amazon.aws.sensors.s3.S3KeySensor)
1305
- before the start step of the flow. This decorator only works when a flow is scheduled on Airflow
1306
- and is compiled using `airflow create`. More than one `@airflow_s3_key_sensor` can be
1307
- added as a flow decorators. Adding more than one decorator will ensure that `start` step
1308
- starts only after all sensors finish.
1309
-
1310
-
1311
- Parameters
1312
- ----------
1313
- timeout : int
1314
- Time, in seconds before the task times out and fails. (Default: 3600)
1315
- poke_interval : int
1316
- Time in seconds that the job should wait in between each try. (Default: 60)
1317
- mode : str
1318
- How the sensor operates. Options are: { poke | reschedule }. (Default: "poke")
1319
- exponential_backoff : bool
1320
- allow progressive longer waits between pokes by using exponential backoff algorithm. (Default: True)
1321
- pool : str
1322
- the slot pool this task should run in,
1323
- slot pools are a way to limit concurrency for certain tasks. (Default:None)
1324
- soft_fail : bool
1325
- Set to true to mark the task as SKIPPED on failure. (Default: False)
1326
- name : str
1327
- Name of the sensor on Airflow
1328
- description : str
1329
- Description of sensor in the Airflow UI
1330
- bucket_key : Union[str, List[str]]
1331
- The key(s) being waited on. Supports full s3:// style url or relative path from root level.
1332
- When it's specified as a full s3:// url, please leave `bucket_name` as None
1333
- bucket_name : str
1334
- Name of the S3 bucket. Only needed when bucket_key is not provided as a full s3:// url.
1335
- When specified, all the keys passed to bucket_key refers to this bucket. (Default:None)
1336
- wildcard_match : bool
1337
- whether the bucket_key should be interpreted as a Unix wildcard pattern. (Default: False)
1338
- aws_conn_id : str
1339
- a reference to the s3 connection on Airflow. (Default: None)
1340
- verify : bool
1341
- Whether or not to verify SSL certificates for S3 connection. (Default: None)
1342
- """
1343
- ...
1344
-